SafeStep is a mobile application developed to improve the mobility and navigation experience for visually impaired individuals using rideshare services. The app provides real-time navigation and obstacle detection, empowering users to travel safely and independently from the rideshare drop-off point to their final destination.
Sept 2023 - May 2024
Researcher, iOS Developer
There are over 20 million visually impaired Americans. These individuals struggle with navigating unfamiliar environments and avoiding physical obstacles during the transition from the vehicle to their final destination. Additionally, rideshare companies often sidestep ADA requirements, exacerbating accessibility issues for these travelers.
SafeStep was inspired by the broader implications of autonomous vehicle technology for the visually impaired population. Recognizing the immediate needs within the existing rideshare landscape, the project shifted its focus to improving the current experience. This user-centered approach ensures that SafeStep addresses real-world problems effectively, improving both safety and independence for its users.
Difficulty booking a ride through apps with screen-reader capability and voice command options.
Difficulty locating the vehicle upon arrival.
Lack of auditory updates about the journey, current location, and ETA.
Uncertainty about the safety and accuracy of the drop-off location.
Difficulty navigating to the final destination.
Difficulty communicating their needs to drivers due to language differences.
Navigating unfamiliar environments and avoiding obstacles after exiting the vehicle.
Inadequate support forces reliance on others, compromising independence.
Striving for autonomy in navigating the city.
SafeStep aims to address the critical question of how to improve the journey from the rideshare drop-off point to the final destination for visually impaired users. By leveraging advanced technologies and a user-centered design approach, SafeStep provides a reliable and enjoyable travel experience, ensuring users' safety and independence.
Visually impaired travelers encounter significant difficulties in using rideshare services, especially when transitioning from the vehicle to their final destination. This gap is due to a lack of adequate technological support for obstacle detection and navigation.
Interviews and surveys with visually impaired users highlighted the need for real-time, reliable navigation assistance. Users expressed frustration with current solutions that do not adequately address their needs for safety and independence. They emphasized the desire for complete independence without relying on anyone to navigate the city.
Research indicates that 8% of Americans have a visual impairment, and rideshare usage among blind or visually impaired individuals is roughly double that of people with other disabilities. This underscores the necessity of solutions like SafeStep.
7.1%: rideshare usage among blind and low-vision respondents
3.5%: rideshare usage among respondents with other disabilities
Seamlessly integrates with existing platforms, ensuring a smooth, user-friendly experience.
Empowers users to travel autonomously, fostering self-reliance.
Provides real-time, easy-to-understand feedback without the need for additional devices.
Ensures the technology is affordable, eliminating the need for expensive supplementary devices.
The development of SafeStep was grounded in extensive design research and consultations with experts from institutions such as the Perkins School for the Blind, the Carroll Center for the Blind, and Harvard Digital Accessibility Services. These consultations provided invaluable insights into the needs of visually impaired users, ensuring that SafeStep's design was user-centered and addressed real-world problems effectively.
SafeStep leverages cutting-edge technologies to provide its features. The app uses a YOLOv8 machine learning model for real-time object detection, and integrated GPS navigation guides users from the drop-off point to their destination. Development relied on Google Colab for model training and Xcode for the iOS app, with the trained model integrated through Apple's Core ML framework to run efficiently on iOS devices.
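As a rough illustration of that pipeline, the sketch below shows how a YOLOv8 model converted to Core ML might be wrapped in a Vision request for on-device detection. The class name ObstacleDetector and the model class SafeStepObstacleDetector are hypothetical stand-ins for the project's actual model, and the sketch assumes the exported model includes a non-maximum-suppression stage so that Vision returns object observations directly.

```swift
import CoreML
import CoreVideo
import Vision

// Minimal sketch: wrap a Core ML-converted YOLOv8 model in a Vision request.
// "SafeStepObstacleDetector" is a hypothetical name for the Xcode-generated
// model class; the real project's class will differ.
final class ObstacleDetector {
    private let request: VNCoreMLRequest

    init() throws {
        let coreMLModel = try SafeStepObstacleDetector(configuration: MLModelConfiguration()).model
        let visionModel = try VNCoreMLModel(for: coreMLModel)
        request = VNCoreMLRequest(model: visionModel)
        request.imageCropAndScaleOption = .scaleFill
    }

    /// Runs detection on a single camera frame and returns labelled bounding boxes.
    func detect(in frame: CVPixelBuffer) throws -> [VNRecognizedObjectObservation] {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame, orientation: .right)
        try handler.perform([request])
        return request.results as? [VNRecognizedObjectObservation] ?? []
    }
}
```

In practice, a detector like this would be driven by the camera's video output, with each frame passed to detect(in:) and the resulting observations forwarded to the feedback layer.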
By integrating advanced technologies and user-centered design principles, SafeStep ensures a safe, reliable, and enjoyable journey from the rideshare drop-off point to the final destination. Below are the key features that make SafeStep an essential tool for visually impaired users.
Utilizes advanced machine learning models to detect obstacles in real-time, ensuring user safety.
Combines audio, haptic, and voice feedback to inform users of obstacles and navigation cues.
Provides accurate guidance to the destination's entrance, ensuring users reach the correct location.
Offers detailed, voice-assisted directions from the drop-off point to the final destination, tailored for visually impaired users (see the sketch after this list).
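As a sketch of how the voice-assisted guidance could be wired together on iOS (not the project's exact implementation), MapKit can compute a walking route from the drop-off coordinate to the destination, and each step's instruction can be spoken with AVSpeechSynthesizer. The WalkingGuide class name and the simplified "speak every step at once" behaviour are illustrative assumptions.

```swift
import AVFoundation
import MapKit

// Minimal sketch of voice-guided walking directions using MapKit walking routes
// and AVSpeechSynthesizer; class and method names here are illustrative.
final class WalkingGuide {
    private let synthesizer = AVSpeechSynthesizer()
    private var directions: MKDirections?

    /// Computes a walking route from the drop-off point to the destination
    /// and reads each step's instruction aloud (simplified: all steps in order).
    func guide(from dropOff: CLLocationCoordinate2D, to destination: CLLocationCoordinate2D) {
        let request = MKDirections.Request()
        request.source = MKMapItem(placemark: MKPlacemark(coordinate: dropOff))
        request.destination = MKMapItem(placemark: MKPlacemark(coordinate: destination))
        request.transportType = .walking

        directions = MKDirections(request: request)
        directions?.calculate { [weak self] response, _ in
            guard let self, let route = response?.routes.first else { return }
            for step in route.steps where !step.instructions.isEmpty {
                self.synthesizer.speak(AVSpeechUtterance(string: step.instructions))
            }
        }
    }
}
```

In the app, this kind of step-by-step narration would be paired with live location updates so that each instruction is spoken as the user actually reaches it, rather than all at once.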
Users receive a notification as they approach the drop-off point (illustrated in the sketch after these steps).
Real-time haptic feedback alerts users to nearby obstacles.
Voice navigation provides detailed directions to the final destination.
Haptic and voice feedback guide users to the correct door.
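A minimal sketch of how the arrival-and-feedback flow above might look on iOS, assuming simple region monitoring for the drop-off notification, UINotificationFeedbackGenerator for haptics, and AVSpeechSynthesizer for voice. Names such as ApproachMonitor, the 50-metre radius, and the spoken phrases are illustrative assumptions, not the app's actual values.

```swift
import AVFoundation
import CoreLocation
import UIKit

// Minimal sketch of the arrival-and-feedback flow: a geofence around the expected
// drop-off point triggers the approach announcement, and detected obstacles are
// reported with a haptic pulse plus a short spoken phrase.
final class ApproachMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let haptics = UINotificationFeedbackGenerator()
    private let voice = AVSpeechSynthesizer()

    /// Starts monitoring a circular region around the expected drop-off coordinate.
    func watchDropOff(at coordinate: CLLocationCoordinate2D) {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()   // region monitoring needs Always access
        let region = CLCircularRegion(center: coordinate, radius: 50, identifier: "drop-off")
        region.notifyOnEntry = true
        locationManager.startMonitoring(for: region)
    }

    /// Entering the drop-off region triggers the approach notification.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        announce("Approaching your drop-off point.")
    }

    /// Called by the obstacle detector whenever something appears in the walking path.
    func obstacleDetected(_ label: String) {
        haptics.notificationOccurred(.warning)   // haptic alert
        announce("\(label) ahead.")              // spoken alert
    }

    private func announce(_ text: String) {
        voice.speak(AVSpeechUtterance(string: text))
    }
}
```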
The SafeStep prototype demonstrates how the app's features work together to create a seamless and safe navigation experience for visually impaired users. The following gallery and interactive demo provide an in-depth look at the app's interface, functionality, and user experience.
SafeStep underwent extensive user testing with visually impaired individuals to ensure its effectiveness and usability. Feedback was gathered through interviews, surveys, and real-world usage scenarios, highlighting the importance of clear and concise audio feedback, intuitive navigation, and reliable obstacle detection. This feedback was crucial in refining the app’s features and improving the overall user experience.
The technical aspects of SafeStep, such as the YOLOv8 model for object detection and GPS integration for navigation, were rigorously tested in various environments. This ensured that the app performs reliably under different conditions and provides accurate, real-time information to users.
Based on user feedback, several iterations of the app were developed. Each version was tested to identify any issues or areas for improvement. Key insights from testing led to enhancements in the accuracy of object detection, the responsiveness of multimodal feedback, and the clarity of voice navigation instructions.
Iteration 1: Picture-in-Picture Mode
The picture-in-picture mode for the map was too distracting for users, drawing their focus away from crucial real-time navigation cues.
Iteration 2: Middle Third Focus Area
Highlighting the focus area in the middle third of the screen and coloring it red improved distant object detection, but the constant vibrations and sense of urgency from the red color were distracting rather than helpful.
One of the most significant learnings from the SafeStep project was the importance of deeply understanding the needs and challenges faced by visually impaired travelers. Engaging directly with users provided invaluable insights that shaped the design and functionality of the app.
The iterative design process, guided by user feedback, underscored the value of a user-centered approach. By continually refining the app based on real-world testing, SafeStep was able to deliver a solution that genuinely addresses the needs of its users.
Working on SafeStep provided an opportunity to develop technical skills, particularly in machine learning and mobile app development. Additionally, the project fostered personal growth, highlighting the impact of accessible design and the importance of empathy in creating inclusive technology.