Ultraviolet-reflective film applied to windows reduces the likelihood of collisions for two species of songbird

PeerJ ◽  
2020 ◽  
Vol 8 ◽  
pp. e9926
Author(s):  
John P. Swaddle ◽  
Lauren C. Emerson ◽  
Robin G. Thady ◽  
Timothy J. Boycott

Perhaps a billion birds die annually from colliding with residential and commercial windows. Therefore, there is a societal need to develop technologies that reduce window collisions by birds. Many current window films that are applied to the external surface of windows carry human-visible patterns that many people find esthetically undesirable. BirdShades has developed a short-wavelength (ultraviolet) reflective film that appears as a slight tint to the human eye but should be highly visible to many bird species that see in this spectral range. We performed flight tunnel tests of whether the BirdShades external window film reduced the likelihood that two species of songbird (zebra finch, Taeniopygia guttata, and brown-headed cowbird, Molothrus ater) collide with windows during daylight. We paid particular attention to simulating the lighting conditions that birds experience while flying during the day. Our results indicate a 75–90% reduction in the likelihood of collision with BirdShades-treated windows compared with control windows in forced-choice trials. In a more ecologically relevant comparison between trials in which all windows were either treated or control, the estimated reduction in the probability of collision was 30–50%. Further, both bird species slowed their flight by approximately 25% when approaching windows treated with the BirdShades film, which would reduce the force of any collisions that did occur. Therefore, we conclude that the BirdShades external window film will be effective in reducing the risk of bird-window collisions and the damage they cause to populations and property. As this ultraviolet-reflective film has no human-visible patterning, the product might be an esthetically more acceptable, low-cost solution for reducing bird-window collisions. Further, we call for testing of other mitigation technologies in lighting and ecological conditions that more closely resemble what birds experience in real human-built environments, and we make suggestions for testing standards to assess collision-reducing technologies.

Sensors ◽  
2020 ◽  
Vol 20 (10) ◽  
pp. 2984
Author(s):  
Yue Mu ◽  
Tai-Shen Chen ◽  
Seishi Ninomiya ◽  
Wei Guo

Automatic detection of intact tomatoes on the plant is highly desirable for low-cost, optimal management in tomato farming. Mature tomato detection has been widely studied, whereas immature tomato detection, which is more important for long-term yield prediction, is difficult to perform with traditional image analysis, especially when fruits are occluded by leaves. Therefore, tomato detection that generalizes well to real cultivation scenes and is robust to issues such as fruit occlusion and variable lighting conditions is highly desired. In this study, we built a tomato detection model that automatically detects intact green tomatoes regardless of occlusion or fruit growth stage using deep learning approaches. The model used a Faster Region-based Convolutional Neural Network (Faster R-CNN) with a ResNet-101 backbone and was transfer-learned from the Common Objects in Context (COCO) dataset. Detection on the test dataset achieved a high average precision of 87.83% (intersection over union ≥ 0.5) and high tomato counting accuracy (R² = 0.87). In addition, all detected boxes were merged into a single image to compile a tomato location map and estimate fruit size along one row in the greenhouse. Through tomato detection, counting, and location and size estimation, this method shows great potential for ripeness and yield prediction.
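The detection pipeline described above (a COCO-pretrained Faster R-CNN fine-tuned for a single tomato class) can be illustrated with a minimal sketch. This is not the authors' code: torchvision's ready-made ResNet-50 FPN variant stands in for the ResNet-101 backbone used in the paper, and the confidence threshold and input image are placeholders.

```python
# Minimal sketch: fine-tune a COCO-pretrained Faster R-CNN for one "tomato" class.
# Illustrative stand-in only; the paper used a ResNet-101 backbone.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_tomato_detector(num_classes=2):  # background + tomato
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    # Replace the 91-class COCO head with a 2-class head (transfer learning).
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

model = build_tomato_detector()
model.eval()
with torch.no_grad():
    dummy = [torch.rand(3, 480, 640)]        # one RGB image, values in [0, 1]
    detections = model(dummy)[0]             # dict with 'boxes', 'labels', 'scores'
    keep = detections["scores"] > 0.5        # simple confidence threshold
    print(detections["boxes"][keep])
```

Counting then reduces to the number of boxes that survive the confidence threshold, and the kept box coordinates provide the location and size estimates.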


Author(s):  
Joseph Coyne ◽  
Ciara Sibley

Eye tracking technologies are being used at increasing rates in industry and research due to the recent availability of low-cost systems. This paper presents results from a study assessing two eye tracking systems, Gazepoint GP3 and Eye Tribe, both of which are available for under $500 and provide streaming gaze and pupil size data. The emphasis of this research was on evaluating the ability of these eye trackers to identify changes in pupil size that occur as a function of variations in lighting conditions as well as those associated with workload. Ten volunteers participated in an experiment in which a digit span task was employed to manipulate workload as users fixated on a monitor whose background luminance varied (black, gray, and white). Results revealed that both systems significantly differentiated pupil size between high- and low-workload trials and detected changes due to the monitor's luminance. These findings are exceedingly promising for human factors researchers, as they open up the opportunity to augment studies with unobtrusive, streaming measures of mental workload using technologies available for as little as $100.
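As a hedged illustration of the workload comparison described above, the sketch below runs a paired test on per-participant mean pupil diameters for low- and high-workload trials. The numbers and data layout are invented for the example, not taken from the study.

```python
# Sketch: within-subject comparison of pupil size under low vs. high workload.
import numpy as np
from scipy import stats

# Hypothetical per-participant mean pupil diameters (mm), 10 participants.
low_load  = np.array([3.1, 3.4, 2.9, 3.3, 3.0, 3.2, 3.5, 3.1, 2.8, 3.3])
high_load = np.array([3.4, 3.7, 3.1, 3.6, 3.3, 3.4, 3.8, 3.4, 3.0, 3.6])

t, p = stats.ttest_rel(high_load, low_load)   # paired (within-subject) t-test
print(f"mean dilation = {np.mean(high_load - low_load):.2f} mm, t = {t:.2f}, p = {p:.4f}")
```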


Author(s):  
D. Pagliaria ◽  
L. Pinto ◽  
M. Reguzzoni ◽  
L. Rossi

Since its launch on the market, the Microsoft Kinect sensor has represented a great revolution in the field of low-cost navigation, especially for indoor robotic applications. In fact, this system is endowed with a depth camera, as well as a visual RGB camera, at a cost of about $200. The characteristics and potential of the Kinect sensor have been widely studied for indoor applications. The second generation of the sensor has been announced to be capable of acquiring data even outdoors, under direct sunlight. Navigating from an indoor to an outdoor environment (and vice versa) is very demanding, because sensors that work properly in one environment are typically unsuitable in the other. In this sense the Kinect could be an interesting device for bridging the navigation solution between outdoor and indoor environments. In this work the accuracy and the field of application of the new generation of Kinect sensor have been tested outdoors, considering different lighting conditions and the reflective properties of the emitted rays on different materials. Moreover, an integrated system with a low-cost GNSS receiver has been studied, with the aim of taking advantage of GNSS positioning when the satellite visibility conditions are good enough. A kinematic test performed outdoors using a Kinect sensor and a GNSS receiver is presented here.
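The integration strategy sketched in this abstract, leaning on GNSS when satellite visibility is good and falling back to Kinect-based relative positioning otherwise, could look roughly like the following. Everything here (the Fix structure, the satellite-count threshold, the local planar frame) is an illustrative assumption, not the authors' implementation.

```python
# Hypothetical switching fusion between absolute GNSS fixes and Kinect odometry.
from dataclasses import dataclass

@dataclass
class Fix:
    east: float      # metres in a local planar frame
    north: float
    num_sats: int    # satellites used in the GNSS solution (0 if unavailable)

def fuse(prev_position, gnss_fix, kinect_delta, min_sats=6):
    """Return the next position estimate.

    prev_position: (east, north) from the previous epoch
    gnss_fix:      latest GNSS Fix (may have few or no satellites indoors)
    kinect_delta:  (d_east, d_north) displacement from Kinect-based odometry
    """
    if gnss_fix.num_sats >= min_sats:
        # Good satellite visibility: anchor the solution to the absolute fix.
        return (gnss_fix.east, gnss_fix.north)
    # Poor visibility (e.g. indoors): dead-reckon with the Kinect displacement.
    return (prev_position[0] + kinect_delta[0],
            prev_position[1] + kinect_delta[1])

pos = (0.0, 0.0)
pos = fuse(pos, Fix(east=2.1, north=0.9, num_sats=8), kinect_delta=(0.0, 0.0))
pos = fuse(pos, Fix(east=0.0, north=0.0, num_sats=2), kinect_delta=(0.4, 0.1))
print(pos)
```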


2017 ◽  
Vol 30 (3) ◽  
pp. 403-416 ◽  
Author(s):  
Milos Petkovic ◽  
Vladimir Sibinovic ◽  
Dragisa Popovic ◽  
Vladimir Mitic ◽  
Darko Todorovic ◽  
...  

This paper presents two simple and cost-effective indoor localisation methods. The first method uses a ceiling-mounted wide-angle webcam, computer vision, and coloured circular markers placed on top of a robot. The main drawbacks of this method are lens distortion and sensitivity to lighting conditions. After solving these problems, a high localisation accuracy of ±1 cm is achieved at a sampling rate of about 5 Hz. The second method is a version of trilateration based on ultrasound time-of-flight distance measurement. An ultrasonic beacon is placed on the robot while wall-mounted detectors are strategically placed to avoid excessive occlusion. A ZigBee network is used for inter-device synchronisation and for broadcasting measured data. The robot location is determined by minimising the measurement errors. Using the Nelder-Mead algorithm and low-cost distance-measuring devices, a solid sub-5 cm localisation accuracy is achieved at 10 Hz.
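The second method's position estimate, minimising range-measurement errors with Nelder-Mead, can be sketched as follows; detector coordinates and measured ranges are made-up values for illustration, not data from the paper.

```python
# Sketch: trilateration by minimising squared range residuals with Nelder-Mead.
import numpy as np
from scipy.optimize import minimize

detectors = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 4.0], [0.0, 4.0]])  # wall detectors (m)
measured  = np.array([2.24, 3.61, 3.16, 2.83])                          # time-of-flight ranges (m)

def range_error(p):
    predicted = np.linalg.norm(detectors - p, axis=1)
    return np.sum((predicted - measured) ** 2)   # sum of squared range residuals

result = minimize(range_error, x0=np.array([2.5, 2.0]), method="Nelder-Mead")
print(result.x)   # estimated beacon (robot) position in metres
```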


2018 ◽  
Author(s):  
Joshua Harvey ◽  
Takuma Morimoto ◽  
Manuel Spitschan

At this year's European Conference on Visual Perception we debuted a novel colour science demonstration (and visual illusion) for the Un mare di illusioni exhibition. Under carefully curated lighting conditions, cycling through different illuminant spectra, certain fruits and vegetables appear to glow and dim in an unchanging environment. Encouraged by the positive reactions it received, and by the numerous and specific questions from conference delegates, we here describe what this illusion is, why we believe it may work, and how this particular low-cost setup may be assembled and demonstrated for the amazement of your friends, students, and members of the public.


Author(s):  
M. M. Nawaf ◽  
J.-M. Boï ◽  
D. Merad ◽  
J.-P. Royer ◽  
P. Drap

This paper details the hardware and software design and realization of a hand-held stereo embedded system for underwater imaging. The designed system can run most image processing techniques smoothly in real time. The developed functions provide direct visual feedback on the quality of the captured images, which helps the operator take appropriate actions regarding movement speed and lighting conditions. The proposed functionalities can easily be customized or upgraded, and new functions can easily be added thanks to the available supported libraries. Furthermore, by connecting the designed system to a more powerful computer, real-time visual odometry can run on the captured images to provide live navigation and a site coverage map. We use a visual odometry method adapted to systems with low computational resources and long autonomy. The system was tested in a real-world context and demonstrated its robustness and promise for further development.
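A generic frame-to-frame visual odometry step of the kind mentioned above might look like the sketch below, using ORB features and essential-matrix pose recovery in OpenCV. This is a standard textbook pipeline, not the authors' resource-constrained method; the camera matrix K and the grayscale image pair are assumed inputs.

```python
# Sketch of one visual odometry step: match features, recover relative pose.
import cv2
import numpy as np

def relative_pose(prev_gray, curr_gray, K):
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC, then rotation R and unit-scale translation t.
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t
```

Chaining these relative poses over successive frames yields the trajectory used for live navigation and coverage mapping.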


Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8357
Author(s):  
Akito Tohma ◽  
Maho Nishikawa ◽  
Takuya Hashimoto ◽  
Yoichi Yamazaki ◽  
Guanghao Sun

Camera-based remote photoplethysmography (rPPG) is a low-cost, convenient, non-contact heart rate measurement method suitable for telemedicine. Several factors affect the accuracy of measuring heart rate and heart rate variability (HRV) using rPPG, even though HRV is an important indicator for healthcare monitoring. This study aimed to identify an appropriate setup for precise HRV measurement using rPPG while considering the effects of possible factors including illumination, direction of the light, frame rate of the camera, and body motion. In the lighting-conditions experiment, the smallest mean absolute R–R interval (RRI) error was obtained when light greater than 500 lux was cast from the front (among the tested conditions: illuminance of 100, 300, 500, and 700 lux; directions of front, top, and front and top). In addition, the RRI and HRV were measured with sufficient accuracy at frame rates above 30 fps. The accuracy of the HRV measurement was greatly reduced when body motion was not constrained; thus, it is necessary to limit body motion, especially head motion, in an actual telemedicine situation. The results of this study can serve as guidelines for setting up the recording environment and camera settings for rPPG use in telemedicine.
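To make the RRI and HRV terminology concrete, the sketch below turns a synthetic pulse waveform sampled at an assumed 30 fps into peak-to-peak intervals and a common HRV index (RMSSD). It is illustrative only and not the processing pipeline used in the study.

```python
# Sketch: pulse peaks -> R-R intervals -> RMSSD on a synthetic signal.
import numpy as np
from scipy.signal import find_peaks

fps = 30.0                                   # assumed camera frame rate (frames/s)
t = np.arange(0, 60, 1 / fps)                # 60 s of samples
signal = np.sin(2 * np.pi * 1.2 * t)         # fake pulse wave at ~72 bpm

# Peak-to-peak intervals stand in for R-R intervals (in milliseconds).
peaks, _ = find_peaks(signal, distance=fps * 0.4)   # refractory period ~0.4 s
rri_ms = np.diff(peaks) / fps * 1000.0

rmssd = np.sqrt(np.mean(np.diff(rri_ms) ** 2))      # a common short-term HRV index
print(f"mean RRI = {rri_ms.mean():.1f} ms, RMSSD = {rmssd:.1f} ms")
```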


The Auk ◽  
1984 ◽  
Vol 101 (1) ◽  
pp. 25-37 ◽  
Author(s):  
D. M. Bryant ◽  
C. J. Hails ◽  
P. Tatner

Daily energy expenditure (DEE) during breeding was studied in Pacific Swallows (Hirundo tahitica) and Blue-throated Bee-eaters (Merops viridis) in Malaysia. DEE was measured directly by the doubly labelled water (D₂¹⁸O) technique and indirectly by TAL (time-activity-laboratory) methods during the time adults were feeding their young at the nest. DEE was 76.6 kJ/day in the Pacific Swallow and 77.4 kJ/day in the larger Blue-throated Bee-eater (D₂¹⁸O results). The relatively low DEE, compared to temperate-zone insectivores that also feed in flight, was attributed to the action of proximate factors, namely a more favorable thermal environment and shorter days (which result in less daytime activity). In bee-eaters, partial use of a low-cost foraging technique also contributed to their lower DEE. The suitability of DEE as a measure of reproductive effort is discussed.


PLoS Biology ◽  
2020 ◽  
Vol 18 (11) ◽  
pp. e3000929
Author(s):  
Sofija V. Canavan ◽  
Daniel Margoliash

Birds and mammals share specialized forms of sleep including slow wave sleep (SWS) and rapid eye movement sleep (REM), raising the question of why and how specialized sleep evolved. Extensive prior studies concluded that avian sleep lacked many features characteristic of mammalian sleep, and therefore that specialized sleep must have evolved independently in birds and mammals. This has been challenged by evidence of more complex sleep in multiple songbird species. To extend this analysis beyond songbirds, we examined a species of parrot, the sister taxon to songbirds. We implanted adult budgerigars (Melopsittacus undulatus) with electroencephalogram (EEG) and electrooculogram (EOG) electrodes to evaluate sleep architecture, and video-monitored the birds during sleep. Sleep was scored with manual and automated techniques, including automated detection of slow waves and eye movements. This can help define a new standard for how to score sleep in birds. Budgerigars exhibited consolidated sleep, a pattern also observed in songbirds and many mammalian species, including humans. We found that REM constituted 26.5% of total sleep, comparable to that of humans and an order of magnitude greater than previously reported. Although we observed no spindles, we found a clear state of intermediate sleep (IS) similar to non-REM (NREM) stage 2. Across the night, SWS decreased and REM increased, as observed in mammals and songbirds. Slow wave activity (SWA) fluctuated with a 29-min ultradian rhythm, indicating a tendency to move systematically through sleep states as observed in other species with consolidated sleep. These results are at variance with numerous older sleep studies, including those on budgerigars. Here, we demonstrated that the lighting conditions used in the prior budgerigar study (and commonly used in older bird studies) dramatically disrupted budgerigar sleep structure, explaining the prior results. Thus, it is likely that more complex sleep has been overlooked in a broad range of bird species. The similarities in sleep architecture observed in mammals, songbirds, and now budgerigars, alongside recent work in reptiles and basal birds, provide support for the hypothesis that a common amniote ancestor possessed the precursors that gave rise to REM and SWS at one or more loci in the parallel evolution of sleep in higher vertebrates. We discuss this hypothesis in terms of the common plan of forebrain organization shared by reptiles, birds, and mammals.
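As a rough illustration of one automated scoring step mentioned above, the sketch below quantifies slow wave activity as 0.5-4 Hz spectral power in a short EEG epoch. The synthetic signal, sampling rate, and band edges are assumptions for the example rather than the authors' scoring pipeline.

```python
# Sketch: slow wave activity (SWA) as delta-band power of one EEG epoch.
import numpy as np
from scipy.signal import welch

fs = 250.0                                     # assumed EEG sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)                    # one 4-s scoring epoch
eeg = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)  # fake slow wave + noise

freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
delta = (freqs >= 0.5) & (freqs <= 4.0)
swa = np.trapz(psd[delta], freqs[delta])       # integrated delta-band power
print(f"SWA = {swa:.3f} (arbitrary units for this synthetic epoch)")
```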

