
2018-cap1-7

SoundView is a project that visualizes sound for the hearing-impaired.

Welcome to Viewtiful's GitHub Pages

1. Project Overview

2. Team Introduction

Viewtiful, the team developing SoundView, consists of four undergraduate students.

- 고가을 (Team Leader)
- 김예린
- 류성호
- 정승우

3. Abstract

'SoundView' carries the meaning of 'eyes before eyes': a pair of smart glasses designed to detect dangerous situations around hearing-impaired users and to help them communicate how they intend to respond to the people around them more quickly and conveniently.

The device is worn like a pair of glasses. It recognizes surrounding sounds and shows the recognized content on a transparent display so the wearer can react to the sound immediately. A microphone module attached to the glasses captures the sound, a server analyzes it, and information about the sound is presented on the transparent display.
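The following is a minimal client-side sketch of that pipeline: capture a short clip from the microphone module, send it to the analysis server, and show the returned label on the display. The server URL, the JSON fields, and the `show_on_display` helper are illustrative assumptions, not the actual SoundView code.

```python
# Hypothetical sketch: record audio on the glasses, ask the server to
# classify it, and show the result on the transparent display.
import sounddevice as sd
import requests

SAMPLE_RATE = 16000          # 16 kHz mono, a common rate for sound classification
CLIP_SECONDS = 2             # length of each analysis window
SERVER_URL = "http://example-server:8000/classify"  # placeholder address

def record_clip() -> bytes:
    """Record a short clip from the glasses' microphone module."""
    frames = sd.rec(int(CLIP_SECONDS * SAMPLE_RATE),
                    samplerate=SAMPLE_RATE, channels=1, dtype="int16")
    sd.wait()
    return frames.tobytes()

def classify_on_server(raw_audio: bytes) -> dict:
    """Send raw PCM audio to the server and return its analysis result."""
    response = requests.post(SERVER_URL,
                             files={"audio": ("clip.pcm", raw_audio)},
                             timeout=5)
    response.raise_for_status()
    return response.json()   # e.g. {"label": "car horn", "direction": "left"}

def show_on_display(text: str) -> None:
    """Placeholder for driving the transparent display."""
    print(f"[DISPLAY] {text}")

if __name__ == "__main__":
    while True:
        result = classify_on_server(record_clip())
        show_on_display(f"{result['label']} ({result.get('direction', '?')})")
```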

In addition, a camera module recognizes the user's gestures, the analyzed gesture is matched against a database, and the corresponding sentence is played through a speaker. This lets the device speak the phrases the user needs, such as "Thank you" or "See you again next time."
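Below is a rough sketch of that gesture-to-speech path. A small phrase table stands in for the project's database, and gTTS plus an mpg123-style player stand in for whatever speech output the real device uses; the gesture labels and phrases are made up for illustration.

```python
# Hypothetical sketch: look up a recognized gesture label in a phrase table
# and speak the matching sentence through the speaker.
import os
from gtts import gTTS

# Phrase table keyed by gesture label; in the real system this lookup
# would be a database query based on the camera module's analysis.
GESTURE_PHRASES = {
    "thumbs_up": "Thank you",
    "wave": "See you again next time",
    "open_palm": "Please wait a moment",
}

def speak(sentence: str) -> None:
    """Convert the sentence to speech and play it through the speaker."""
    tts = gTTS(text=sentence, lang="en")
    tts.save("phrase.mp3")
    os.system("mpg123 phrase.mp3")   # assumes an mpg123-style player is installed

def handle_gesture(gesture_label: str) -> None:
    """Look up the phrase for a recognized gesture and play it, if any."""
    sentence = GESTURE_PHRASES.get(gesture_label)
    if sentence is not None:
        speak(sentence)

if __name__ == "__main__":
    handle_gesture("thumbs_up")   # plays "Thank you"
```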

4. Use Case Diagram


5. Introduction Video

6. Result Video