Book Details


Oyooni: a deep learning-based Arabic assistant for visually impaired people / Sami Alan Abdullah ; Najdat Al-Akkad ; Hala Kassab ; Dana Al Kailani ; Abdurrahman Qusaibaty

Publication year: 2021

ISBN: CCE00042



Since the beginning of time, visually impaired people have relied on other senses to interpret their surroundings and function independently, using mobility aids such as white canes to guide them. About a decade ago, their quality of life improved markedly as advances in deep learning and computer vision achieved state-of-the-art results in assisting blind people with daily tasks. Unfortunately, this progress remains very limited for the Arabic language. Hence, this project presents “Oyooni” (Arabic for “My Eyes”), a voice-powered Arabic assistant that introduces several contributions to the Arabic language community and showcases four deep learning models trained for Arabic-speaking users, in the hope of helping visually impaired people detect Syrian banknotes, caption scenes in Arabic, detect colors, and recognize written Arabic text.
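
Of the four capabilities named in the abstract, color detection is the simplest to illustrate. The following is a minimal sketch, assuming OpenCV and a hand-made mapping from HSV hue ranges to Arabic color names; the function name, hue boundaries, and labels are illustrative assumptions, not the project's actual implementation.

# Minimal, hypothetical color-detection sketch (not the project's code).
import cv2
import numpy as np

# Assumed mapping from OpenCV hue ranges (0-179) to Arabic color names.
HUE_RANGES = [
    (0, 10, "أحمر"),      # red
    (11, 25, "برتقالي"),  # orange
    (26, 35, "أصفر"),     # yellow
    (36, 85, "أخضر"),     # green
    (86, 125, "أزرق"),    # blue
    (126, 160, "بنفسجي"), # purple
    (161, 179, "أحمر"),   # red (hue wrap-around)
]

def dominant_color_name(image_bgr: np.ndarray) -> str:
    """Return an Arabic name for the dominant hue of the central image region."""
    h, w = image_bgr.shape[:2]
    center = image_bgr[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
    hsv = cv2.cvtColor(center, cv2.COLOR_BGR2HSV)
    hue, sat, val = cv2.split(hsv)

    # Low-saturation regions are treated as white, black, or gray.
    if sat.mean() < 40:
        if val.mean() > 170:
            return "أبيض"   # white
        if val.mean() < 60:
            return "أسود"   # black
        return "رمادي"      # gray

    median_hue = int(np.median(hue))
    for lo, hi, name in HUE_RANGES:
        if lo <= median_hue <= hi:
            return name
    return "غير معروف"      # unknown

if __name__ == "__main__":
    frame = cv2.imread("sample.jpg")
    print(dominant_color_name(frame))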


Subject: Computer Science, Deep learning, Digital assistive technology, Syrian banknote detection, Arabic image caption, Text recognition, WebRTC, Web sockets, Transformers, Encoder-Decoder, Deep neural networks, YOLOv5, Convolutional neural network, Inception, ImageNet, Flutter
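
For the Syrian banknote detection subject listed above, the sketch below shows how a custom-trained YOLOv5 checkpoint might be loaded through the Ultralytics torch.hub entry point. The weight file name ("banknotes.pt"), the test image, and the class labels are illustrative assumptions, not artifacts distributed with this work.

# Hypothetical banknote-detection sketch using a custom YOLOv5 checkpoint.
import torch

# Load a custom-trained YOLOv5 model via the Ultralytics torch.hub interface.
model = torch.hub.load("ultralytics/yolov5", "custom", path="banknotes.pt")

def detect_banknotes(image_path: str, min_confidence: float = 0.5):
    """Return (class_name, confidence) pairs for banknotes found in the image."""
    results = model(image_path)
    detections = results.pandas().xyxy[0]  # one row per detected object
    keep = detections[detections["confidence"] >= min_confidence]
    return list(zip(keep["name"], keep["confidence"]))

if __name__ == "__main__":
    for name, conf in detect_banknotes("wallet_photo.jpg"):
        print(f"{name}: {conf:.2f}")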