Programming == Joy: A Whistle-Stop Tour of Ruby and Object Orientation

2020 ◽  
pp. 13-36
Author(s):  
Carleton DiLeo ◽  
Peter Cooper

Author(s):  
Toby J. Lloyd-Jones ◽  
Juergen Gehrke ◽  
Jason Lauder

We assessed the importance of outline contour and individual features in mediating the recognition of animals by examining response times and eye movements in an animal-object decision task (i.e., deciding whether or not an object was an animal that may be encountered in real life). Latencies were shorter for animals than for nonanimals, and performance was similar for shaded line drawings and silhouettes, suggesting that important information for recognition lies in the outline contour. The most salient information in the outline contour was around the head, followed by the lower torso and leg regions. We also observed effects of object orientation and argue that the usefulness of the head and lower torso/leg regions is consistent with a role for the object axis in recognition.


1992 ◽  
Vol 21 (1) ◽  
pp. 123-132 ◽  
Author(s):  
Stefan Conrad ◽  
Martin Gogolla

Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4515
Author(s):  
Rinku Roy ◽  
Manjunatha Mahadevappa ◽  
Kianoush Nazarpour

Humans typically fixate on an object before moving their arm to grasp it. Patients with amyotrophic lateral sclerosis (ALS) can still select an object with their intact eye movements, but are unable to move their limb because of the loss of voluntary muscle control. Although several studies have succeeded in decoding the intended grasp type from brain measurements, fine control over an object with a grasp-assistive device (orthosis, exoskeleton, or robotic arm) remains an open problem. Object orientation and object width are two important parameters for controlling the wrist angle and the grasp aperture of the assistive device so as to replicate a human-like stable grasp. Vision systems have already evolved to measure the geometrical attributes of an object and control the grasp of a prosthetic hand. However, most existing vision systems are integrated with electromyography and require some voluntary muscle movement to operate; for that reason, they are of little benefit to users of brain-controlled assistive devices. Here, we implemented a vision system that can be controlled through the human gaze. We measured the vertical and horizontal electrooculogram signals and controlled the pan and tilt of a cap-mounted webcam to keep the object of interest in focus and at the centre of the picture. A simple ‘signature’ extraction procedure was also used to reduce algorithmic complexity and storage requirements. The developed device was tested with ten healthy participants. We estimated the orientation and size of the object and determined an appropriate wrist orientation angle and grasp aperture within 22 ms, with a combined accuracy exceeding 75%. Integrating the proposed system with a brain-controlled grasp-assistive device and increasing the number of supported grasps could offer ALS patients more natural manoeuvring in grasp.
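The orientation and width estimates described above can be sketched from a segmented binary mask using standard image moments. This is only an illustrative stand-in, not the paper's actual ‘signature’ procedure: the function name, the moment-based method, and the binary-mask input are all assumptions.

```python
import numpy as np

def estimate_orientation_and_width(mask: np.ndarray) -> tuple[float, float]:
    """Estimate a segmented object's principal-axis orientation (radians,
    image coordinates) and its extent perpendicular to that axis (pixels)
    from a binary mask. Illustrative sketch only, using image moments.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("empty mask")
    # Centroid (first-order moments).
    cx, cy = xs.mean(), ys.mean()
    # Second-order central moments of the pixel cloud.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    # Principal-axis orientation from the covariance of the pixel cloud.
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    # Width: extent of pixels projected onto the axis perpendicular to theta.
    perp = (xs - cx) * -np.sin(theta) + (ys - cy) * np.cos(theta)
    width = float(perp.max() - perp.min())
    return float(theta), width
```

For an elongated vertical strip, for example, the principal axis comes out vertical (theta ≈ π/2) and the width matches the strip's horizontal extent; in a real pipeline the mask would come from segmenting the webcam frame, and theta and width would map onto the wrist orientation angle and grasp aperture.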


1994 ◽  
Vol 5 (1) ◽  
pp. 13-33
Author(s):  
Franklin Figoli
