
BrightWalk

Bright Walk is a voice-operated app that assists visually impaired individuals and tourists with real-time scene recognition, navigation, multilingual translation, and itinerary suggestions.

Video

Tech Stack

Kotlin
Python
Java
XML

Description

Bright Walk is an inclusive Android app designed to empower visually impaired individuals and tourists navigating unfamiliar environments. By integrating multimodal AI (Qwen), OCR, and real-time translation, it addresses critical challenges through a 100% voice-operated interface and support for 13 Indian languages, ensuring accessibility across diverse linguistic demographics.

Key Features

For Visually Impaired Users:

Scene Exploration: Capture images via voice commands and ask iterative questions (e.g., “Describe the left side” or “Read the sign ahead”) for granular insights.

Multilingual Support: Translates text from signs, menus, or labels into 13 Indian languages (e.g., Hindi, Tamil, Bengali).

Dynamic Navigation: Provides audio-guided directions and obstacle alerts.

For Tourists:

Language Barrier Solutions: Translates text from photos into the user’s native language, including regional Indian languages.

Journey Planning: Suggests nearby hotspots (historical sites, restaurants) with cultural context.

Unified Functionality:

Voice-Driven Workflow: Fully operated via natural language commands.

Contextual AI: Retains image context for follow-up queries (e.g., “Find vegetarian restaurants nearby” after scanning a street).
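The contextual Q&A flow above can be sketched as a small session object that retains the last captured image so follow-up voice queries reuse it without re-capturing. This is a minimal illustration, not the app's actual code: the class name, fields, and the placeholder answer string are all hypothetical, and the real app would call a multimodal model (e.g. Qwen) where the stub comment indicates.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SceneSession:
    """Hypothetical sketch: keeps the last captured image so that
    follow-up questions ('Describe the left side') reuse its context."""
    image: Optional[bytes] = None
    history: list = field(default_factory=list)

    def capture(self, image: bytes) -> None:
        self.image = image
        self.history.clear()  # new scene, fresh conversation context

    def ask(self, question: str) -> str:
        if self.image is None:
            return "No scene captured yet."
        # Placeholder for a real multimodal-model call (e.g. Qwen)
        # that would receive both self.image and the question.
        answer = f"[description of scene for: {question!r}]"
        self.history.append((question, answer))
        return answer
```

Keeping the image and Q&A history on one session object is what lets a single capture "unravel through endless questions", as described below, instead of forcing a new photo per query.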

Market Differentiation

Replaces Multiple Apps: Combines OCR, translation, and navigation into one platform.

Localized for India: Supports 13 Indian languages, bridging gaps in rural and urban accessibility.

Inclusive Design: Tailored for voice interaction, eliminating reliance on visual UI.

Impact

Bright Walk bridges independence gaps for 4.95M visually impaired Indians while streamlining travel for tourists. By enabling iterative Q&A, AI-curated journey suggestions, and multilingual support, it reduces cognitive load in foreign environments.

Technologies Used:

Multimodal AI (Qwen) for scene understanding.

OCR (Tesseract) with Indian language support.

Speech-to-text (Google Speech) for voice commands.
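The three technologies above chain into one voice-driven pipeline: speech-to-text interprets the command, OCR extracts text from the captured image, and translation renders it in the user's language. The sketch below shows only that wiring; every function here is a hypothetical stub standing in for the real Google Speech, Tesseract, and translation calls, and the stub return values are invented for illustration.

```python
# Hypothetical pipeline sketch -- each stage is a stub, not a real API call.

def speech_to_text(audio: bytes) -> str:
    # Stand-in for a Google Speech-to-Text request.
    return "read the sign ahead"

def ocr_image(image: bytes) -> str:
    # Stand-in for Tesseract with an Indian-language traineddata pack.
    return "EXIT -> Platform 2"

def translate(text: str, target_lang: str) -> str:
    # Stand-in for a translation-model call; tags output with the language code.
    return f"[{target_lang}] {text}"

def handle_voice_request(audio: bytes, image: bytes, lang: str) -> str:
    """Route a voice command through the speech -> OCR -> translation chain."""
    command = speech_to_text(audio)
    if "read" in command:
        return translate(ocr_image(image), lang)
    return translate("Command not recognised.", lang)
```

In the real app each stub would be replaced by the corresponding service call, but the control flow, one spoken command fanning out to OCR and translation, is the point of the sketch.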

This app has the potential to serve 35M+ global tourists annually, fostering inclusivity and seamless exploration.

Hackathon Progress

It began with a simple idea: "What if one app could guide both a blind person through a bustling Indian market and a tourist lost in Tokyo?" We started by tackling scene recognition, training a multimodal AI (Qwen) to describe environments in real time. Next, we wove in voice commands, letting users ask, "What's ahead?" and hear instant replies.

But language barriers lingered, so we added OCR and translation, enabling the app to read a Japanese menu and recite it in Hindi. For tourists, we mapped nearby hotspots (temples, cafes, museums) and let them ask follow-ups: "Is there vegetarian food here?" Then came the magic touch: multilingual support for 13 Indian languages, ensuring a grandmother in rural Gujarat could navigate as effortlessly as a tech-savvy traveler.

Each feature stacked like bricks: navigation cues for the blind, itinerary tips for tourists, and a layered query system where a single image could unravel through endless questions. By the final hour, Bright Walk wasn't just an app; it was a bridge, turning "I can't" into "I explored."

Fundraising Status

We haven't raised funds yet, but given the vast opportunity in serving tourists and assisting visually impaired users, we are well positioned to raise capital for market deployment.
Team Leader
PPrimus
Project Link
Track
AI