A call-independent and automatic acoustic system for the individual recognition of animals: A novel model using four passerines

2010, Vol 43 (11), pp. 3846-3852
Author(s): Jinkui Cheng, Yuehua Sun, Liqiang Ji

2012, Vol 131 (4), pp. 2859-2865
Author(s): Bo Zhang, Jinkui Cheng, Yan Han, Liqiang Ji, Fuming Shi

Author(s): Trung Minh Nguyen, Thien Huu Nguyen

Previous work on event extraction has mainly focused on predicting event triggers and argument roles, treating entity mentions as provided by human annotators. This is unrealistic, as entity mentions are usually predicted by existing toolkits whose errors may propagate to event trigger and argument role recognition. A few recent studies have addressed this problem by jointly predicting entity mentions, event triggers and arguments. However, such work is limited to using discrete, hand-engineered features to represent contextual information for the individual tasks and their interactions. In this work, we propose a novel model that jointly predicts entity mentions, event triggers and arguments based on shared hidden representations learned with deep learning. The experiments demonstrate the benefits of the proposed method, which achieves state-of-the-art performance for event extraction.
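
As a rough illustration of the joint-modelling idea (a minimal sketch, not the authors' architecture; the layer sizes, tag sets and names are assumptions), a shared encoder can feed three task-specific heads so that the losses for entity mentions, triggers and argument roles are optimised together rather than in an error-propagating pipeline:

```python
# Minimal sketch (not the authors' exact model) of joint event extraction:
# a shared encoder produces hidden representations used by three heads for
# entity mentions, event triggers, and argument roles. Sizes are illustrative.
import torch
import torch.nn as nn

class JointEventModel(nn.Module):
    def __init__(self, vocab_size, n_entity_tags, n_trigger_tags, n_role_labels,
                 emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared hidden representations for all three tasks.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        enc_dim = 2 * hidden_dim
        self.entity_head = nn.Linear(enc_dim, n_entity_tags)    # BIO entity tags
        self.trigger_head = nn.Linear(enc_dim, n_trigger_tags)  # event-type tags
        # Argument roles are scored over (trigger token, entity token) pairs.
        self.role_head = nn.Linear(2 * enc_dim, n_role_labels)

    def forward(self, token_ids, trigger_idx, entity_idx):
        h, _ = self.encoder(self.embed(token_ids))               # (B, T, enc_dim)
        entity_logits = self.entity_head(h)
        trigger_logits = self.trigger_head(h)
        pair = torch.cat([h[:, trigger_idx], h[:, entity_idx]], dim=-1)
        role_logits = self.role_head(pair)
        return entity_logits, trigger_logits, role_logits

# Training would sum the three task losses, e.g.
# loss = ce(entity_logits, y_ent) + ce(trigger_logits, y_trig) + ce(role_logits, y_role)
```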


Behaviour, 1991, Vol 119 (3-4), pp. 302-316
Author(s): Patrice Robisson

In the colonial emperor penguin Aptenodytes forsteri, the broadcast distance of the mutual display call (the distance over which the individual information conveyed by the call is transmitted) was determined. The variables measured were (1) the sound amplitude, which averaged 94.8 dB SPL for birds facing toward the microphone and 85.7 dB SPL for birds facing away, (2) the sound attenuation, about 6 dB per doubling of distance, (3) the ratio of the signal to the background noise of the colony, which was 20-25 dB during the chick-rearing period, and (4) the degradation of the signal structure by distance and by the scattering medium (penguin bodies), which affected the timbre but not the two fundamental frequencies of the call that produce a beat phenomenon. The broadcast distance, 4-7 m, indicates that the call is transmitted at short to medium range and corresponds closely to the distance covered between two successive stops at which a parent searching for its chick calls. The beat phenomenon, which is not degraded by the scattering medium or by distance, is likely to serve individual recognition, assuming there is a relationship between the broadcast distance and the functional structure of the emperor penguin call.
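
As a rough numerical illustration of the reported attenuation of about 6 dB per doubling of distance (a minimal sketch; the 1 m reference distance and the colony noise level used here are assumptions, not values from the study):

```python
# Sketch of the ~6 dB-per-doubling-of-distance spreading loss reported above.
# The 1 m reference distance and 70 dB colony noise floor are illustrative
# assumptions, not figures given in the abstract.
import math

def received_level(source_db, distance_m, ref_m=1.0, db_per_doubling=6.0):
    """Sound level after a loss of `db_per_doubling` per doubling of distance."""
    return source_db - db_per_doubling * math.log2(distance_m / ref_m)

source = 94.8        # dB SPL, bird facing the microphone (from the abstract)
noise_floor = 70.0   # dB SPL, assumed colony background noise
for d in (1, 2, 4, 8, 16):
    level = received_level(source, d)
    print(f"{d:>2} m: {level:5.1f} dB SPL, SNR = {level - noise_floor:5.1f} dB")
```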


Behaviour, 2014, Vol 152 (1), pp. 57-82
Author(s): Charline Couchoux, Torben Dabelsteen

Vocal signals convey many types of information, and individually recognizable cues can benefit both signallers and receivers, as shown for birdsong used in the contexts of mating and territoriality. Bird calls are typically less complex than songs and are thus likely to convey less information. However, the rattle calls of some species serve a dual function, being emitted both as anti-predator signals and as deterrence signals, and may therefore also encode information on individual identity. We investigated these questions in the common blackbird (Turdus merula), which emits complex rattle calls in both territorial and alarm contexts. The vocalisations of free-living males were elicited and recorded by playing back songs of unknown males in the birds' territories (territorial context) and by approaching individuals (predator context). These song-like, highly structured, multi-syllabic calls typically contained three types of elements. Acoustic and statistical analyses revealed, through elevated repeatability indices, that for most of the acoustic measurements used to describe the complexity of the calls (structural, temporal and frequency parameters), much of the variation was attributable to inter-individual differences. The size of the call and the characteristics of the starting element alone could discriminate a high proportion of the individual calls. Beyond the very well-studied songs of oscines, calls therefore deserve more attention, as they also carry the potential to convey information on individual identity.
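
For readers unfamiliar with repeatability indices, the following minimal sketch computes a repeatability (intraclass correlation) from a one-way ANOVA in the manner of Lessells & Boag (1987); the toy measurements are invented for illustration and are not the study's data:

```python
# Repeatability index: does an acoustic measurement vary more among
# individuals than within them? Computed from one-way ANOVA components.
import numpy as np

def repeatability(groups):
    """groups: list of 1-D arrays, one per individual (repeated measurements)."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    grand_mean = np.concatenate(groups).mean()
    ss_among = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_among = ss_among / (k - 1)
    ms_within = ss_within / (n.sum() - k)
    n0 = (n.sum() - (n ** 2).sum() / n.sum()) / (k - 1)  # effective group size
    s2_among = (ms_among - ms_within) / n0
    return s2_among / (s2_among + ms_within)

calls = [np.array([3.1, 3.2, 3.0]),   # call duration (s), individual A (toy data)
         np.array([4.0, 4.1, 3.9]),   # individual B
         np.array([2.5, 2.6, 2.4])]   # individual C
print(f"repeatability = {repeatability(calls):.2f}")
```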


Author(s): Ihsan M. Salloum, Juan E. Mezzich

The person-centered integrative diagnosis (PID) is a model that aims to put into practice the vision of person-centered medicine, affirming the whole person of the patient in context as the center of clinical care and health promotion at the individual and community levels. The PID is a novel model for conceptualizing the process and formulation of clinical diagnosis. It represents a paradigm shift toward a broader and deeper notion of diagnosis, beyond the restricted concept of nosological diagnoses. It involves a multilevel formulation of health status (both ill health and positive aspects of health) through the interactive participation and engagement of clinicians, patients, and families, using all relevant descriptive tools (categorization, dimensions, and narratives). The current organizational schema of the PID comprises a multilevel standardized component model integrating three main domains, each addressing both ill health and positive aspects of health. The first level is the assessment of health status (ill health and positive aspects of health, or well-being). The second level covers contributors to health, both risk factors and protective factors. The third level covers health experience and values. Experience with the PID through a practical guide in Latin America supported the usefulness and adequacy of the model.


2002, Vol 205 (24), pp. 3793-3798
Author(s): Thierry Aubin, Pierre Jouventin

King penguin chicks identify their parents by an acoustic signal, the display call. This call consists of a succession of similar syllables. Each syllable has two harmonic series, strongly modulated in frequency and amplitude, with added beats of varying amplitude generated by a two-voice system. Previous work showed that only one syllable of the call is needed for the chick to identify the calling adult. Both the frequency modulation pattern of the syllable and the two-voice system play a role in call identification. The syllabic organisation of the call, the harmonic structure and the amplitude modulations of the syllables apparently do not contribute to individual recognition. Are these acoustic features useless? To answer this question, playback experiments were conducted using three categories of experimental signals: (i) signals with only the fundamental frequencies of the natural call, (ii) signals with the amplitude of each syllable kept at a constant level, and (iii) signals with only one syllable, repeated or not. The responses of chicks to these experimental signals were compared with those obtained with the calls of their natural parents. We found that these acoustic features, while not directly implicated in the individual recognition process, help the chicks to localise their parents' signal better. In addition, the redundant syllabic organisation of the call is a means of counteracting the masking effect of the background noise of the colony.
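
As a rough sketch of the "two-voice" beat phenomenon described above (a minimal illustration only; the frequencies, durations and number of repeats are assumptions, not measurements from the study):

```python
# Two closely spaced fundamental frequencies sum to a waveform whose envelope
# beats at their difference frequency; repeating the syllable mimics the call's
# redundant syllabic organisation. All values below are illustrative.
import numpy as np

fs = 22050                        # sampling rate (Hz)
t = np.arange(0, 0.4, 1.0 / fs)   # one 400 ms syllable
f1, f2 = 440.0, 470.0             # the two "voices" (Hz, assumed values)
syllable = 0.5 * (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t))
print(f"beat frequency = {abs(f2 - f1):.0f} Hz")   # envelope beats at |f1 - f2|

# Repeat the syllable with short gaps: redundancy against colony noise.
gap = np.zeros(int(0.1 * fs))
call = np.concatenate([syllable, gap] * 4)
```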


Behaviour, 2001, Vol 138 (6), pp. 709-726
Author(s): Judith Fischer, Ralf Wanker

The ability to discriminate individuals or different social classes of individuals is important for the evolution of social behaviour. In animal societies with many social relationships, selection will often favour the capacity to signal and perceive both individual identity and membership of a particular social class. Spectacled parrotlets (Forpus conspicillatus, Psittacidae, Psittaciformes) live in a complex system of social relationships throughout their lives and are able to recognize their mates and their siblings on the basis of their contact calls. Here we attempt to identify the acoustic parameters that might be used in individual recognition and in the recognition of social categories. To this end, we analysed recordings of contact calls with respect to the variation of certain acoustic parameters. There was significant inter-individual variation in the peak frequency, maximum frequency, duration, energy, bandwidth and minimum frequency of the contact calls of spectacled parrotlets. Discriminant function analysis showed that calls were specific to individuals and to social subunits, but also that individuals of different social classes share some calls. From our results we hypothesize that spectacled parrotlets could use at least six acoustic cues in their contact calls that might encode information about the individual, the age class, the pair, the pairing status and the family.
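
As a rough illustration of the discriminant-function-analysis step (a minimal sketch using scikit-learn on invented data; not the study's measurements or software):

```python
# Classify contact calls to individuals from measured acoustic parameters
# using linear discriminant analysis. The toy data are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_calls, n_birds = 20, 4
# Columns: peak freq, max freq, duration, energy, bandwidth, min freq.
centres = rng.normal(size=(n_birds, 6))
X = np.vstack([centres[b] + 0.3 * rng.normal(size=(n_calls, 6)) for b in range(n_birds)])
y = np.repeat(np.arange(n_birds), n_calls)

dfa = LinearDiscriminantAnalysis()
accuracy = dfa.fit(X, y).score(X, y)   # proportion of calls assigned to the right bird
print(f"correct classification rate: {accuracy:.2f}")
```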


2021, Vol 17 (2), pp. 155014772199262
Author(s): Shiwen Chen, Junjian Yuan, Xiaopeng Xing, Xin Qin

Research on individual identification of emitters has so far relied mainly on theoretical simulation and has lacked equipment for verification through field measurements. To address this shortcoming, an emitter individual identification system based on Automatic Dependent Surveillance–Broadcast (ADS-B) is designed. On the one hand, the system extracts individual features from the signal preamble; on the other hand, it decodes the transmitter's individual identity information and generates a labelled training data set, on which a recognition network can be trained for individual signal recognition. For the collected signals, six parameters were extracted as individual features. To reduce the feature dimensionality, a Bézier curve fitting method is applied to four of the features, and the spatial distribution of the fitted Bézier curve control points is taken as an individual feature. The processed features are classified with multiple classifiers, and the classification results are fused using an improved Dempster–Shafer evidence theory. Field measurements show that the average individual recognition accuracy of the system reaches 88.3%, which essentially meets the requirements.
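
As a rough illustration of evidence fusion with Dempster's rule of combination (a minimal sketch; the mass assignments are invented, and the paper's improved Dempster–Shafer variant is not reproduced here):

```python
# Fuse two classifiers' outputs over candidate emitter identities using
# Dempster's rule of combination. Mass values below are illustrative only.
from itertools import product

def combine(m1, m2):
    """Dempster's rule: m1, m2 map frozensets of hypotheses to belief mass."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass assigned to conflicting evidence
    return {h: w / (1.0 - conflict) for h, w in fused.items()}

theta = frozenset({"emitter1", "emitter2", "emitter3"})   # full frame of discernment
# Each classifier assigns mass to singletons and keeps some on theta (ignorance).
clf_a = {frozenset({"emitter1"}): 0.6, frozenset({"emitter2"}): 0.2, theta: 0.2}
clf_b = {frozenset({"emitter1"}): 0.5, frozenset({"emitter3"}): 0.3, theta: 0.2}
print(combine(clf_a, clf_b))
```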


2016
Author(s): Nayna Vyas-Patel, John D. Mumford

A number of image recognition systems have been developed specifically for the individual recognition of large animals. These programs are versatile and can readily be adapted for the identification of smaller individuals such as insects. The Interactive Individual Identification System (I3S Classic), initially produced for the identification of individual whale sharks, was employed to distinguish between different species of mosquitoes and bees, utilising the distinctive vein patterns present on insect wings. I3S Classic proved highly effective and accurate in identifying different species and sexes of mosquitoes and bees, with 80% to 100% accuracy for the majority of the species tested. The closely related taxa Apis mellifera and Apis mellifera carnica were both identified with 100% accuracy. Bombus terrestris terrestris and Bombus terrestris audax were also identified and separated with high accuracy (90% to 100% for the fore wings and 100% for the hind wings). When both Anopheles gambiae sensu stricto and Anopheles arabiensis were present in the database, they were identified with 94% and 100% accuracy respectively, providing a morphological, non-molecular method of sorting between these members of the sibling species complex. Flat (not folded) and entire (rather than broken) wing specimens were required for accurate identification. Only one wing image of each sex was required in the database to obtain highly accurate results for the majority of the species tested. The study describes how I3S was used to identify different insect species and draws comparisons with identification using the CO1 gene (DNA barcoding). As with CO1, I3S Classic proved to be software that can reliably aid the accurate identification of insect species. It is emphasised that image recognition for insect species should always be used in conjunction with identifying characters other than the wings, as is the norm when identifying species using traditional taxonomic keys.

