Thammasat University students interested in urban studies, architecture, political science, municipal planning, sociology, anthropology, and related subjects may find it useful to participate in a free 27 November Zoom webinar on Walkability: Geo-informatic Tools for Sustainable Urban Mobility.
The event, on Wednesday, 27 November at 7pm Bangkok time, is organized by the Institute of Advanced Studies at Loughborough University in the United Kingdom.
The TU Library collection includes books about urban mobility.
Students are invited to register at this link:
https://us06web.zoom.us/webinar/register/WN_XnSUhMgESgOAQwFH08QbGg#/registration
The event announcement states:
IAS Visiting Fellow Dr. Achituv Cohen delivers a seminar on their research –
Walkability is a key element in modern urban planning, shaping cities’ environmental, social, and economic dynamics. Factors such as street design, destination proximity, connectivity, and subjective perceptions like safety and comfort influence walkability. However, measuring it effectively remains challenging, particularly when combining both objective and subjective factors.
In this seminar, Dr. Cohen will share insights from his research and other studies on walkability measurement using Geographic Information System (GIS) and geospatial data science. He will explore how innovative methodologies, like machine and deep learning, offer new ways to analyse walking behaviour. Additionally, he will introduce emerging trends, including integrating subjective perceptions, improving accessibility for vulnerable populations, and utilizing POI VizNet, a tool we developed for assessing urban visual accessibility. These approaches enhance our ability to understand walkability, and they not only deepen our knowledge of urban mobility but also contribute to the creation of more inclusive and sustainable communities.
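To make the objective side of such measurement concrete, here is a minimal Python sketch of a composite walkability index. The factor names, normalization, and weights are invented for illustration and are not taken from Dr. Cohen's research.

```python
# Minimal, hypothetical sketch of an objective walkability index.
# Factor names and weights are illustrative assumptions only.

def walkability_index(intersection_density, dest_proximity, sidewalk_coverage,
                      weights=(0.4, 0.35, 0.25)):
    """Combine normalized factors (each in [0, 1]) into a 0-100 score.

    intersection_density: street connectivity, normalized to a city-wide maximum
    dest_proximity: share of daily destinations within a 10-minute walk
    sidewalk_coverage: fraction of street length with a usable sidewalk
    """
    factors = (intersection_density, dest_proximity, sidewalk_coverage)
    if not all(0.0 <= f <= 1.0 for f in factors):
        raise ValueError("factors must be normalized to [0, 1]")
    return 100 * sum(w * f for w, f in zip(weights, factors))

# Example: a well-connected block with good sidewalks but few nearby destinations.
print(walkability_index(0.8, 0.3, 0.9))  # -> 65.0
```

Real GIS pipelines would derive such factors from spatial data layers, and the subjective perceptions the announcement mentions (safety, comfort) are precisely what a fixed formula like this cannot capture, which is where machine and deep learning methods come in.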
In 2020, Dr. Cohen coauthored an article, Route planning for blind pedestrians using OpenStreetMap.
Its abstract:
While most of us take wayfinding and orientation for granted, instinctively utilizing our visual channels to do so, millions of blind people around the world face challenges and obstacles when attempting to perform the most basic tasks, such as walking to the corner store or using public transportation. As blind pedestrians lack critical information about the space they traverse outside the familiarity of their home, they are restricted, dependent on others, and have decreased quality of life. While assistive technologies for providing specific navigation solutions do exist, research is still limited regarding customized wayfinding solutions for blind pedestrians.
This research aims at developing a wayfinding algorithm that relies on the OpenStreetMap mapping catalogue for planning accessible and safe routes specifically suited to blind pedestrians. In-depth investigations, observations, and interviews were conducted with blind people and with orientation and mobility instructors, in order to define and categorize spatial criteria relating to mobility, accessibility, and safety. Using controlled iterative experiments, weighted network graph criteria were defined, leading to the development of a route planning software that generates optimized routes for blind pedestrians.
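The weighted network graph the abstract describes can be illustrated with a small Python sketch using the networkx library. The graph, criteria, and penalty multipliers below are invented for the example; the study's actual criteria and weights came from interviews and controlled iterative experiments.

```python
# Illustrative route planning over a weighted network graph.
# Edges carry a length plus attributes relevant to blind pedestrians;
# all values here are made up for the example.
import networkx as nx

G = nx.Graph()
G.add_edge("A", "B", length=100, crossing_signal=True, shared_space=False)
G.add_edge("B", "D", length=120, crossing_signal=True, shared_space=False)
G.add_edge("A", "C", length=80,  crossing_signal=False, shared_space=True)
G.add_edge("C", "D", length=90,  crossing_signal=False, shared_space=False)

def cost(u, v, attrs):
    """Edge cost = length scaled by penalties for features that are
    unsafe or inaccessible for a blind pedestrian."""
    c = attrs["length"]
    if not attrs["crossing_signal"]:
        c *= 1.5  # unsignalled crossings are harder to negotiate safely
    if attrs["shared_space"]:
        c *= 2.0  # space shared with vehicles or bikes is penalized heavily
    return c

# With the custom cost, the geometrically shorter A-C-D (170 m) loses to
# the longer but safer A-B-D (220 m).
print(nx.shortest_path(G, "A", "D", weight=cost))  # -> ['A', 'B', 'D']
```

networkx accepts a callable as the weight argument, so safety and accessibility criteria only need to be expressed as an edge-cost function for standard Dijkstra routing to do the rest.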
The developed software was then tested on a variety of routes, with the help of blind volunteers and orientation and mobility instructors. The results show that the optimal routes generated by the software were identical or very similar to those suggested by the experienced orientation and mobility instructors. Moreover, the blind volunteers also stated that the software planned routes were indeed more accessible and safer for them to walk along compared to routes suggested by existing commercial software developed for seeing pedestrians.
The findings of this research indicate that our solution, based on OpenStreetMap and developed for the benefit of blind pedestrians, is effective and practical, and could improve the mobility, independence, and quality of life of this population, as well as increase their integration into society.
From the Introduction:
For most pedestrians, it is a matter of course to choose the shortest, most convenient route, utilizing their innate wayfinding skills to translate perceptual and cognitive structures of space into spatial information. Yet, for people who are blind, this seemingly simple task is often complex and even dangerous. In 2018, the World Health Organization estimated that 36 million people around the globe are blind.
These blind pedestrians are unable to take advantage of landmarks and road networks for orientation purposes, nor are they able to instinctively avoid obstacles, such as road cracks, sidewalk benches, and areas shared by pedestrians and vehicles or bikes, without the help of assistive technologies, guide dogs, or seeing people. As a result, their mobility and independence are often restricted, leaving them housebound and with decreased well-being (SSMR, 2009).
Although navigation tools for blind pedestrians have greatly developed in recent years, comprehensive customized wayfinding solutions are still lacking. Smartphone mobility applications, which mainly rely on the Global Navigation Satellite System (GNSS) sensor, assist blind pedestrians in conducting daily activities, such as catching a bus (e.g. Step-Hear), reading road sign text, and identifying their exact location (e.g. Sendero Group, GeorgiePhone).
Other technologies offer real-time obstacle detection and classification, based on video streams and photos (e.g. OrCam). While these technologies make important contributions, no comprehensive solution currently exists for providing blind pedestrians with safe, accessible, and convenient routes tailored to their needs and preferences.
With the aim of developing a wayfinding tool specifically suited to blind pedestrians, this study examined the use of OpenStreetMap (OSM) for geospatial data mapping and optimized walking route planning. OSM is an open-data mapping project based on Volunteered Geographic Information and participatory mapping, whereby any person in any location can add or update features on the map.
Data can also be integrated from various sources, resulting in powerful, flexible, and up-to-date mapping services. While optimal routes for seeing pedestrians are generally the shortest or fastest routes (as generated by most commercial route-planning software, such as Google Maps), this is not the case for blind pedestrians, whose optimal path must be the safest and most accessible route. To date, no solution has combined OSM map data with weighted network graphs that rely on spatial criteria specifically relevant to blind pedestrians.
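As a starting point for experimentation, the open-source osmnx library (an assumption here; it is not the software the authors built) can retrieve an OSM pedestrian network as a graph that standard routing algorithms operate on.

```python
# Sketch: fetch a walkable OSM street network and route over it with osmnx.
# The coordinates are an arbitrary example point in Bangkok.
import networkx as nx
import osmnx as ox

point = (13.7563, 100.5018)  # (latitude, longitude)
G = ox.graph_from_point(point, dist=500, network_type="walk")

# Snap origin/destination coordinates to the nearest graph nodes
# (osmnx expects X=longitude, Y=latitude).
orig = ox.distance.nearest_nodes(G, X=100.5018, Y=13.7563)
dest = ox.distance.nearest_nodes(G, X=100.5040, Y=13.7580)

# A plain shortest path by length; the paper's contribution is, in effect,
# to replace "length" with weights encoding accessibility and safety.
route = nx.shortest_path(G, orig, dest, weight="length")
print(route)  # a sequence of OSM node IDs
```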
In this study, we created a weighted network graph based on OSM data and developed a unique route-planning algorithm tailored to the wayfinding requirements of blind pedestrians. To understand which environmental features and abstract phenomena impact the wayfinding and navigating capabilities of blind pedestrians, we conducted interviews with orientation and mobility instructors (OMIs) and with blind pedestrians and observed the physical wayfinding challenges of the latter.
Based on the gathered insights, we built a criteria system of obstacles, preferences, and strategies that could then be translated, through the OSM map data, into practical, computerized solutions. Finally, a weighted network graph was created to quantify the accessibility and safety level of each road segment, resulting in the recommendation of an optimal route for blind pedestrians from the point of origin to the point of destination.
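As a hypothetical illustration of that criteria-to-weight translation, the function below scales an OSM edge's length by penalty multipliers inferred from its tags. The tags chosen and the multiplier values are assumptions for the example, not the study's calibrated criteria.

```python
# Sketch: turn qualitative criteria into numeric edge weights over OSM tags.
# Multipliers > 1 discourage a segment; multipliers < 1 encourage it.
PENALTIES = {
    "unsignalled_crossing": 1.8,
    "shared_with_bikes": 1.6,
    "tactile_paving": 0.8,
}

def edge_weight(length_m, tags):
    """Scale an edge's length by penalties inferred from its OSM tags."""
    w = length_m
    if tags.get("footway") == "crossing" and tags.get("crossing") != "traffic_signals":
        w *= PENALTIES["unsignalled_crossing"]
    if tags.get("highway") in ("cycleway", "path"):
        w *= PENALTIES["shared_with_bikes"]
    if tags.get("tactile_paving") == "yes":
        w *= PENALTIES["tactile_paving"]
    return w

# A 50 m unsignalled crossing costs as much as 90 m of ordinary sidewalk.
print(edge_weight(50, {"highway": "footway", "footway": "crossing",
                       "crossing": "uncontrolled"}))  # -> 90.0
```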
Implementing the study’s promising results and scientific insights in assistive navigation systems could enhance the independence of blind pedestrians and optimize their wayfinding and navigation in urban environments. Moreover, the knowledge gathered in this study could provide policymakers and urban planners with an applicable tool for designing smart walkable cities that are accessible to blind and seeing pedestrians alike. […]