Home automation with smart robot featuring live video feed and remotely controlled switches

Author(s):  
Zaki Ud Din ◽  
Wee-Gin David Lim


2011 ◽  
pp. 160-171 ◽  
Author(s):  
Yeonjoo Oh ◽  
Ken Camarata ◽  
Michael Philetus Weller ◽  
Mark D. Gross ◽  
Ellen Yi-Luen Do

People can use computationally-enhanced furniture to interact with distant friends and places without cumbersome menus or widgets. We describe computing embedded in a pair of tables and a chair that enables people to experience remote events in two ways: The TeleTables are ambient tabletop displays that connect two places by projecting shadows cast on one surface to the other. The Window Seat rocking chair through its motion controls a remote camera tied to a live video feed. Both explore using the physical space of a room and its furniture to create “bilocative” interfaces.


2012 ◽  
Author(s):  
Bartlomiej Bosek ◽  
Leszek Horwath ◽  
Grzegorz Matecki ◽  
Arkadiusz Pawlik

The project addresses the detection of an object and the subsequent tracking of that object using IoT and WSN. All operations are performed in real time: images are captured continuously by an ESP32-CAM mounted on the chassis of the robot. An ultrasonic sensor detects the object, and the robot tracks it with left/right and forward/backward movements that follow the object’s displacement. The distance between the robot and the object is kept constant with the help of the ultrasonic sensors. Tracking involves a live video feed and a manually triggered mode for detecting the object. Once the object is detected, it is reported through the WSN to the base station and, through IoT, to local and central headquarters for further analysis.
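The constant-standoff behaviour described above can be sketched as a simple dead-band controller: the ultrasonic range reading is compared against a target distance, and the robot drives forward or backward to close the gap. A minimal sketch, assuming illustrative target and tolerance values not given in the abstract:

```python
# Hypothetical sketch of the distance-keeping logic: keep a fixed
# standoff distance from the tracked object using ultrasonic range
# readings. TARGET_CM and TOLERANCE_CM are assumed values.

TARGET_CM = 50      # desired standoff distance (assumption)
TOLERANCE_CM = 5    # dead band to avoid oscillating around the target

def drive_command(distance_cm: float) -> str:
    """Map one ultrasonic range reading to a drive command."""
    if distance_cm > TARGET_CM + TOLERANCE_CM:
        return "forward"   # object moved away: close the gap
    if distance_cm < TARGET_CM - TOLERANCE_CM:
        return "backward"  # object too close: back off
    return "stop"          # within tolerance: hold position
```

A left/right steering term driven by the object's horizontal offset in the ESP32-CAM frame would complement this in the same way.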


2004 ◽  
Vol 5 (1) ◽  
pp. 75-97 ◽  
Author(s):  
Irene M. Pepperberg ◽  
Steven R. Wilkes

Grey parrots (Psittacus erithacus) do not acquire referential English labels when tutored with videotapes displayed on CRT screens if (a) socially isolated, (b) reward for attempted labels is possible, (c) trainers direct birds’ attention to the monitor, (d) a live video feed avoids habituation, or (e) one trainer repeats labels produced on video and rewards label attempts. Because birds learned referential labels from live tutor pairs in concurrent sessions, we concluded that video failed because the input lacked live social interaction and modeling (Pepperberg, 1999). Recent studies (e.g. Ikebuchi & Okanoya, 1999), however, suggest that the flicker of standard CRT monitors could instead have prevented learning. Using an LCD monitor, we found that eliminating flicker did not enable birds to learn from video under conditions of limited social interaction. The results emphasize the role of social interaction in referential label learning and may generalize to other systems (e.g. disabled children, or possibly software and robotic agents).


2019 ◽  
Vol 8 (4) ◽  
pp. 11524-11528

Today there are a few autonomous fire-fighting robots, but fully autonomous decision-making in situations that demand discrete judgment remains unresolved. Remotely operated fire-fighting robots can solve this problem to an extent. The project uses a remote power source to reduce the weight of the robot and a bio-inspired fire-hose manipulator that mimics an elephant’s trunk, with which the hose tip can be moved precisely, up to 5° in every direction. The hose can be manipulated to direct the water towards the fire using the live video feed from an on-board camera and Raspberry Pi. The movement of the robot and of the fire-hose manipulator can be remotely operated through a GUI. The robot’s response to flames of various intensities, the angular freedom of the manipulator, and the trajectory of the water flow were studied and calibrated for better performance.
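The 5° stepping of the hose tip could be modelled as incremental angle commands clamped to the manipulator’s mechanical range. A minimal sketch, where the step size, axis names, and the ±90° limit are illustrative assumptions rather than values from the paper:

```python
# Hypothetical sketch of stepping the hose-tip angle in 5-degree
# increments, clamped to an assumed mechanical limit. Names and
# limits are illustrative, not taken from the paper.

STEP_DEG = 5      # per-command step, per the stated 5-degree precision
PAN_LIMIT = 90    # assumed limit either side of centre, in degrees

def step_angle(current_deg: float, direction: int) -> float:
    """Move the hose tip one step (direction = +1 or -1), clamped."""
    new_deg = current_deg + direction * STEP_DEG
    return max(-PAN_LIMIT, min(PAN_LIMIT, new_deg))
```

A GUI button press for each of the four directions would issue one such step command per axis (pan and tilt) to the manipulator controller.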


2019 ◽  
Vol 8 (4) ◽  
pp. 8941-8944

The number of major road accidents occurring per day is on the rise, and most are attributed to driver error. According to a survey conducted in 2015, drivers are held responsible for approximately 78% of accidents. To minimize these incidents, a monitoring system is proposed that alerts the driver when they begin to fall asleep. The algorithm processes a live video feed focused on the driver’s face and tracks eye and mouth movements to detect eye-closure and yawning rates. An alarm sounds if the driver is drowsy or already asleep. Haar-cascade classifiers run in parallel on the extracted facial features to detect eye closure and yawning.
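The alerting step described above amounts to thresholding a run of consecutive eyes-closed frames. In a minimal sketch, the per-frame eye state would come from OpenCV Haar-cascade detections (e.g. `haarcascade_eye.xml` finding no open eyes in the face region); here each frame is already reduced to a boolean, and the frame-count threshold is an assumed value, not from the paper:

```python
# Illustrative sketch of the drowsiness-alert logic: sound the alarm
# once the eyes have been closed for a run of consecutive frames.
# CLOSED_FRAMES_LIMIT is an assumption (~0.7 s at 30 fps).

CLOSED_FRAMES_LIMIT = 20

def drowsiness_alerts(eye_states):
    """Yield True for each frame on which the alarm should sound.

    eye_states: iterable of booleans, True meaning eyes closed
    in that frame (as judged by the per-frame detector).
    """
    closed_run = 0
    for eyes_closed in eye_states:
        closed_run = closed_run + 1 if eyes_closed else 0
        yield closed_run >= CLOSED_FRAMES_LIMIT
```

A blink resets the counter, so only sustained closure triggers the alarm; the yawning rate could feed a second counter of the same shape.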

