Navigating safely and independently is challenging for people who are blind or have low vision (BLV) because it requires a detailed understanding of the surrounding environment. Our user study highlights the importance of identifying sidewalk materials and objects for effective navigation.
This project presents a pioneering study of navigational aids for BLV individuals, investigating whether auditory data, specifically the sound of a cane tip striking different sidewalk materials, can be used to identify those materials. Using machine learning and deep learning techniques, we classify sidewalk materials from these audio cues, enhancing the autonomy of BLV individuals.
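
As a rough illustration of the classification task described above, the sketch below extracts MFCC features with librosa and trains a random-forest classifier with scikit-learn. It is not the project's actual pipeline; the material labels, file paths, and hyperparameters are assumptions chosen for illustration.

```python
# Minimal sketch: classify cane-tip audio clips by sidewalk material.
# Labels, paths, and hyperparameters below are illustrative assumptions.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

MATERIALS = ["concrete", "brick", "asphalt", "metal"]  # hypothetical label set

def extract_features(path, n_mfcc=20):
    """Load one clip and summarize it as per-coefficient MFCC means and stds."""
    y, sr = librosa.load(path, sr=None, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # (n_mfcc, frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def train(clip_paths, labels):
    """clip_paths: list of wav files; labels: one material name per clip."""
    X = np.stack([extract_features(p) for p in clip_paths])
    y = np.array(labels)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    return clf

# Example usage (placeholder paths/labels; one entry per recorded clip):
# clf = train(["clips/concrete_01.wav", "clips/brick_01.wav", ...],
#             ["concrete", "brick", ...])
```

The published work goes beyond this baseline, using deep learning models and bidirectional cross-modal knowledge distillation; the sketch only shows the basic idea of mapping cane-tip sounds to material classes.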

Cane Quest: Audio Adventure: Test your listening skills and see if you can beat the AI model 🙂
Instructions: Listen to the audio clip and select the best-fitting answer by clicking one of the sidewalk materials below.
Link: https://canequest.vercel.app/
Data Collection:
- We are seeking high school and undergraduate students to help with urban accessibility data collection; volunteers receive a community service letter. Please email [email protected] to request the letter.
Publications:
- Jiawei Liu, Wayne Lam, Zhigang Zhu, Hao Tang. SMDAF: A Scalable Sidewalk Material Data Acquisition Framework with Bidirectional Cross-Modal Knowledge Distillation. Accepted to the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2025.
- Jiawei Liu, Wayne Lam, Zhigang Zhu, Hao Tang. Surveying Sidewalk Materials for and by Individuals Who Are Blind or Have Low Vision: Audio Data Collection and Classification. International Conference on SMART MULTIMEDIA, March 27-29, 2024 (Late Breaking Paper).
- Jiawei Liu. Classifying Sidewalk Materials Using Multi-Modal Data. Master's Thesis (Advisors: Professor Hao Tang and Professor Zhigang Zhu), Data Science and Engineering, The City College of New York, August 2023.