Review 2012-2013 @ SRE, McGill University.

ISAS – Audio Augmented Reality App for the blind

Project Page – [http://isas.cim.mcgill.ca/]

ISAS is an eyes-free mobile system designed to give blind users a better sense of their surroundings. The goal is to use 3D spatialized audio to reveal the kind of information that visual cues such as neon signs provide to sighted users. Once users notice a point of interest, additional details are available on demand.
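
To give a flavour of the spatialization idea, here is a minimal Java sketch (not the actual ISAS code) that maps a point of interest's bearing, relative to the user's heading, to simple constant-power stereo gains. The real app uses full 3D spatialized audio, so treat the class name, method, and panning scheme below as illustrative assumptions only.

    // Minimal sketch: map a point of interest's bearing (relative to the
    // user's heading) to constant-power stereo gains for an audio cue.
    // Illustration only; the names and the panning scheme are assumptions,
    // not the ISAS implementation.
    public class SpatialCueSketch {

        /** Left/right gains in [0, 1] for a simple stereo pan. */
        public static float[] panGains(double userHeadingDeg, double bearingToPoiDeg) {
            // Relative bearing in [-180, 180): negative means the POI is to the left.
            double rel = ((bearingToPoiDeg - userHeadingDeg + 540.0) % 360.0) - 180.0;
            // Clamp to the frontal hemisphere and map to a pan value in [-1, 1].
            double pan = Math.max(-90.0, Math.min(90.0, rel)) / 90.0;
            // Constant-power panning keeps perceived loudness roughly stable.
            double angle = (pan + 1.0) * Math.PI / 4.0;   // 0 .. pi/2
            return new float[] { (float) Math.cos(angle), (float) Math.sin(angle) };
        }

        public static void main(String[] args) {
            float[] g = panGains(10.0, 100.0);  // POI 90 degrees to the user's right
            System.out.printf("left=%.2f right=%.2f%n", g[0], g[1]);
        }
    }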

My work was on the Android port of the app. I was given a bare-bones skeleton of the Android app at the beginning, and it is now nearing completion; it will be released on the Google Play Store next month. My report after the first semester can be found (here). I also had the opportunity to present a poster on this work at GRAND 2013. The poster can be found (here).

RTER – Real-Time Emergency Response

Live – [http://rter.cim.mcgill.ca/]

Project Page – [http://cim.mcgill.ca/sre/projects/rter/]

This project deals with the detection, observation, and assessment of situations requiring intervention by emergency responders, offering them access to high-quality “live” data that can be visualized effectively both by responders in situ and by remote operators in dedicated control rooms. Its components include multimodal data registration, interactive visualization, and live streaming of the integrated content.

I developed the Android app from scratch. It is still under active development, with new features being added continuously.

This is being developed as part of the Mozilla Ignite Challenge. We were declared winners, as announced on the White House blog: http://whitehouse.gov/blog/2013/06/25/mozilla-ignite-challenge-winner-announced

Our presentation: https://www.youtube.com/watch?v=B2iTtdosbV4

News: http://gcn.com/articles/2013/07/24/mozilla-ignite-challenge.aspx

Walking Straight – Part of ISAS

My report on this is (here).

The system detects crosswalks and helps the user align with the direction and position of the crossing. Crosswalk detection is based on the bipolarity and regular geometric features of the markings. Relevant pedestrian traffic lights in the environment are then detected, and on the green (go) signal, audio feedback is provided to guide the user to walk without veering. This method attempts to compensate for the error-prone phone sensors that the original Walking Straight application relied on to correct veering while walking.
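
For the curious, here is a rough Java sketch of a bipolarity score for a grayscale patch: crosswalk regions tend to have a strongly two-moded intensity histogram (dark asphalt versus bright paint), and one common way to quantify this is the ratio of between-class variance to total variance at the best threshold. This is only an illustration under that assumption; the exact measure used in the project may differ and is described in the report.

    // Sketch of a bipolarity score for a grayscale image patch: crosswalk
    // stripes give a strongly two-moded intensity histogram (dark asphalt,
    // bright paint). Here bipolarity is taken as the ratio of between-class
    // variance to total variance at the best (Otsu-style) threshold.
    // This choice of measure is an assumption, not necessarily the
    // project's exact formula.
    public class BipolaritySketch {

        /** Returns a score in [0, 1]; values near 1 indicate a bipolar patch. */
        public static double bipolarity(int[] gray) {  // pixel values in 0..255
            int n = gray.length;
            long[] hist = new long[256];
            for (int v : gray) hist[v]++;

            // Global mean and total variance of the patch.
            double mean = 0;
            for (int i = 0; i < 256; i++) mean += i * (double) hist[i];
            mean /= n;
            double totalVar = 0;
            for (int v : gray) totalVar += (v - mean) * (v - mean);
            totalVar /= n;
            if (totalVar == 0) return 0;  // flat patch, not bipolar

            // Sweep thresholds and track the maximum between-class variance.
            double bestBetween = 0;
            long w0 = 0;
            double sum0 = 0;
            double sumAll = mean * n;
            for (int t = 0; t < 256; t++) {
                w0 += hist[t];
                sum0 += t * (double) hist[t];
                long w1 = n - w0;
                if (w0 == 0 || w1 == 0) continue;
                double mu0 = sum0 / w0;
                double mu1 = (sumAll - sum0) / w1;
                double between = (double) w0 / n * (double) w1 / n * (mu0 - mu1) * (mu0 - mu1);
                bestBetween = Math.max(bestBetween, between);
            }
            return bestBetween / totalVar;
        }

        public static void main(String[] args) {
            // Synthetic stripe pattern: alternating dark and bright runs.
            int[] stripes = new int[200];
            for (int i = 0; i < 200; i++) stripes[i] = (i % 20 < 10) ? 30 : 220;
            System.out.printf("bipolarity = %.2f%n", bipolarity(stripes));  // close to 1.0
        }
    }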

This was my only purely research-based contribution; the others were more implementation-oriented. I have not yet been able to complete this work.