Mobile vision (MobiVis)

Author(s): Rahul Swaminathan, Michael Rohs, Jussi Ängeslevä

Author(s): Lambert Spaanenburg, Suleyman Malki

In the early days of photography, camera movement was a nuisance that could blur a picture. Once movement becomes measurable by micro-mechanical means, its effects can be compensated by optical, mechanical, or digital technology to enhance picture quality. Alternatively, movement can be quantified by processing image streams. This opens up new functionality as the camera and the mobile phone converge, for instance 'actively extending the hand' for remote control and interactive signage.
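The abstract describes quantifying camera movement by processing image streams. As a minimal illustrative sketch, not the authors' implementation, the following Python/OpenCV snippet estimates the per-frame translation of a video stream with phase correlation; the input file name is a placeholder.

```python
# Minimal sketch, assuming OpenCV (pip install opencv-python) and a
# hypothetical input clip "clip.mp4"; not the authors' method.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")
ok, frame = cap.read()
assert ok, "could not read the first frame"
prev = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    curr = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    # phaseCorrelate returns the estimated (dx, dy) shift between the two
    # frames in pixels, plus a peak-response confidence value.
    (dx, dy), response = cv2.phaseCorrelate(prev, curr)
    # A stabilizer would shift the frame by (-dx, -dy); a gesture interface
    # would accumulate the shifts into a pointer trajectory.
    print(f"shift dx={dx:+6.2f} dy={dy:+6.2f} (confidence {response:.2f})")
    prev = curr

cap.release()
```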


2020, Vol. 17 (9), pp. 3958–3963
Author(s): M. V. Manoj Kumar, B. S. Prashanth, Syeda Sarah, Md. Kashif, H. R. Sneha

This paper proposes a method to recommend a set of songs based on the user's facial emotion state. The emotion state is detected with the help of the Google Mobile Vision SDK. The detected state is fed to the Expression-X algorithm, which sorts the music by the keyed-in emotion value and generates a playlist that suits the user's emotion state. Since emotions are inferred from facial expressions, achieving 100% accuracy is undoubtedly hard, as everyone expresses emotions facially in their own way; with repeated testing we have achieved a 70–75% success rate in detecting the right emotion state of the user and generating a suitable set of song recommendations.
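The Expression-X algorithm itself is not published in this abstract, so the following Python sketch only illustrates the sorting step it describes: given a detected emotion value, assumed here to arrive as a score in [0, 1] (for example, the smiling probability that the Google Mobile Vision face detector reports on Android), rank a song catalogue tagged with emotion values and return the closest matches. The catalogue and function are hypothetical.

```python
# Hypothetical sketch of the playlist-sorting step, not the authors'
# Expression-X algorithm. Each song in the illustrative catalogue is
# tagged with an emotion value in [0, 1] (0 = sad, 1 = happy).
SONGS = [
    ("Blue Monday", 0.15),
    ("Here Comes the Sun", 0.90),
    ("Clair de Lune", 0.45),
    ("Happy", 0.95),
    ("Mad World", 0.10),
]

def recommend(emotion: float, k: int = 3) -> list[str]:
    """Return the k songs whose emotion tag is closest to the detected state."""
    ranked = sorted(SONGS, key=lambda song: abs(song[1] - emotion))
    return [title for title, _ in ranked[:k]]

# A mostly-happy detected face yields an upbeat playlist.
print(recommend(0.85))
```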

