Smart Eyes is a mobile application that uses real-time video processing and object detection to inform visually impaired users of the objects around them, such as people, cars, bikes, cats, and dogs. The project also introduces new techniques that let the user navigate the application without the use of buttons. Aside from informing the user of nearby objects, it uses services like Twilio to text the user's location to friends and family when the user doesn't feel safe. The application announces the objects in the user's surroundings through a distinct, characterized audio file for each object. The logic also provides a 3D perspective of the audio (panned left or right, and louder or quieter) depending on where the object is located. The application additionally informs the user of events occurring on campus or in nearby areas using web scraping. The objective of this application is to redesign the instruments that visually impaired users rely on day to day, and to give them greater independence.
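The positional-audio idea described above (left/right panning plus volume by distance) could be sketched roughly as follows. This is a minimal illustrative sketch, not the project's actual code: the function name, the bounding-box convention, and the use of apparent box size as a distance proxy are all assumptions for illustration.

```python
import math

def spatial_audio_params(bbox, frame_width, frame_height):
    """Illustrative sketch: map a detected object's bounding box to stereo gains.

    bbox = (x, y, w, h) in pixels (a common detector output convention).
    Pan comes from the box centre's horizontal offset in the frame;
    overall volume from the fraction of the frame the box covers
    (closer objects appear larger, so they sound louder).
    Returns (left_gain, right_gain).
    """
    x, y, w, h = bbox
    center_x = x + w / 2
    # pan in [-1, 1]: -1 = far left of frame, +1 = far right
    pan = (center_x / frame_width) * 2 - 1
    # volume in (0, 1]: proportional to apparent size, clamped at 1
    volume = min(1.0, (w * h) / (frame_width * frame_height) * 4)
    # equal-power panning: angle 0..pi/2 across the stereo field
    angle = (pan + 1) * math.pi / 4
    left_gain = volume * math.cos(angle)
    right_gain = volume * math.sin(angle)
    return left_gain, right_gain
```

For example, an object detected near the left edge of the frame yields a much larger left gain than right gain, while a centred object produces equal gains, matching the "3D perspective" behaviour described above.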
The team is composed of Omar Said and Mohammed Safery. Omar Said is a self-taught app developer whose skills include machine learning, object detection, and building accessibility applications for users with disabilities. Omar has given talks on similar topics in New York City as well as at Wilfrid Laurier University.
Mohammed Safery is a technology enthusiast currently in his 3rd year at York University. He is passionate about solving real-world problems, using programming to automate manual tasks and reduce the amount of hands-on interaction they require. Safery has created and contributed to various projects ranging from application development to work with real-world hardware. He is currently involved in creating a universal “mobile-application” generator to help non-profit organizations use mobile platforms to spread their mission among mobile users.