Object Tracking
Recently Published Documents


TOTAL DOCUMENTS: 6522 (FIVE YEARS: 1615)

H-INDEX: 80 (FIVE YEARS: 20)

2022 · Vol 151 · pp. 106928
Author(s): Qinghuan Xu, Jia Zhao, Chonglei Sun, Liuge Du, Baoqing Sun, ...

Sensors · 2022 · Vol 22 (2) · pp. 608
Author(s): Cameron Aume, Keith Andrews, Shantanu Pal, Alice James, Avishkar Seth, ...

Nowadays, Internet of Things (IoT) applications are growing tremendously in our everyday lives. The proliferation of smart devices, sensor technology, and the Internet makes it possible for the digital and physical worlds to communicate seamlessly, enabling distributed data collection, communication, and processing for many applications dynamically. However, monitoring and tracking objects in real time is challenging due to the distinct characteristics of IoT systems, e.g., scalability, mobility, and the resource-limited nature of the devices. In this paper, we address the significant issue of IoT object tracking in real time. We propose a system called ‘TrackInk’ to demonstrate our idea. TrackInk is capable of pointing toward and taking pictures of visible satellites in the night sky, including but not limited to the International Space Station (ISS) and the moon. Data are collected from sensors to determine the system’s geographical location along with its 3D orientation, allowing the system to be moved. Additionally, TrackInk communicates with and sends data to ThingSpeak for further cloud-based data analysis. Our proposed system is lightweight, highly scalable, and performs efficiently in a resource-limited environment. We discuss the system’s architecture in detail and show performance results using a real-world hardware-based experimental setup.
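The core pointing computation the abstract describes (deriving the azimuth and elevation of a target such as the ISS from the device's GPS fix, then forwarding the result to ThingSpeak) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the skyfield library for orbital propagation, Celestrak as the TLE source, ThingSpeak's public HTTP update endpoint, and placeholder observer coordinates and API key.

```python
# Minimal sketch (not the authors' code): compute azimuth/elevation of the ISS
# from the device's GPS position and push the pointing solution to ThingSpeak.
import requests
from skyfield.api import load, wgs84

ts = load.timescale()

# Latest ISS two-line elements from Celestrak.
stations = load.tle_file("https://celestrak.org/NORAD/elements/stations.txt")
iss = {sat.name: sat for sat in stations}["ISS (ZARYA)"]

# Observer position as reported by the system's GPS sensor (example values).
observer = wgs84.latlon(latitude_degrees=52.52, longitude_degrees=13.40,
                        elevation_m=34.0)

# Topocentric position of the ISS relative to the observer at the current time.
alt, az, distance = (iss - observer).at(ts.now()).altaz()
print(f"Point camera to azimuth {az.degrees:.1f} deg, elevation {alt.degrees:.1f} deg")

# Forward the pointing solution to a ThingSpeak channel (write key is a placeholder).
requests.get("https://api.thingspeak.com/update",
             params={"api_key": "YOUR_WRITE_KEY",
                     "field1": az.degrees,
                     "field2": alt.degrees},
             timeout=10)
```

In practice the azimuth/elevation pair would drive the pan/tilt actuators rather than a print statement; the ThingSpeak call is shown only to illustrate the cloud reporting step mentioned in the abstract.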


2022
Author(s): Jinzhen Yao, Jianlin Zhang, Zhixing Wang, Linsong Shao

Author(s): Jieming Yang, Hongwei Ge, Shuzhi Su, Guoqing Liu

Author(s): Madison Harasyn, Wayne S. Chan, Emma L. Ausen, David G. Barber

Aerial imagery surveys are commonly used in marine mammal research to determine population size, distribution and habitat use. Analysis of aerial photos involves hours of manually identifying individuals present in each image and converting raw counts into usable biological statistics. Our research proposes the use of deep learning algorithms to increase the efficiency of the marine mammal research workflow. To test the feasibility of this proposal, the existing YOLOv4 convolutional neural network model was trained to detect belugas, kayaks and motorized boats in oblique drone imagery collected from a stationary tethered system. Automated computer-based object detection achieved the following precision and recall, respectively, for each class: beluga = 74%/72%; boat = 97%/99%; and kayak = 96%/96%. We then tested the performance of computer vision tracking of belugas and manned watercraft in drone videos using the DeepSORT tracking algorithm, which achieved multiple-object tracking accuracy (MOTA) ranging from 37% to 88% and multiple-object tracking precision (MOTP) between 63% and 86%. Results from this research indicate that deep learning technology can detect and track features more consistently than human annotators, allowing larger datasets to be processed in a fraction of the time while avoiding discrepancies introduced by labeling fatigue or the use of multiple human annotators.
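For reference, the MOTA and MOTP figures quoted above combine per-frame matching results according to the standard CLEAR-MOT definitions. The sketch below is a generic illustration of those formulas, not the evaluation code used in the study; it assumes MOTP is reported as mean IoU over matched track/ground-truth pairs (some tools report it as a distance instead), and the example frame counts are invented.

```python
# Generic sketch of the CLEAR-MOT metrics reported above (not the study's code).
from dataclasses import dataclass


@dataclass
class FrameResult:
    num_gt: int            # ground-truth objects present in the frame
    false_negatives: int   # ground-truth objects left unmatched (misses)
    false_positives: int   # tracker hypotheses with no matching ground truth
    id_switches: int       # matched objects whose track identity changed
    match_ious: list       # IoU of each matched (track, ground-truth) pair


def mota(frames):
    """MOTA = 1 - (FN + FP + IDSW) / total number of ground-truth objects."""
    errors = sum(f.false_negatives + f.false_positives + f.id_switches for f in frames)
    total_gt = sum(f.num_gt for f in frames)
    return 1.0 - errors / total_gt


def motp(frames):
    """MOTP = mean overlap (IoU) over all matched pairs across all frames."""
    ious = [iou for f in frames for iou in f.match_ious]
    return sum(ious) / len(ious)


# Example: three frames with a few misses, one false positive and one identity switch.
frames = [
    FrameResult(num_gt=5, false_negatives=1, false_positives=0, id_switches=0,
                match_ious=[0.82, 0.75, 0.90, 0.68]),
    FrameResult(num_gt=5, false_negatives=0, false_positives=1, id_switches=1,
                match_ious=[0.80, 0.77, 0.85, 0.70, 0.66]),
    FrameResult(num_gt=4, false_negatives=1, false_positives=0, id_switches=0,
                match_ious=[0.79, 0.81, 0.73]),
]
print(f"MOTA = {mota(frames):.2%}, MOTP = {motp(frames):.2%}")
```

MOTA penalizes misses, false alarms and identity switches jointly, which is why it can dip well below the per-class detection precision/recall reported for YOLOv4, while MOTP only measures how tightly matched boxes align.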


2022
Author(s): Thomas Lombaerts, Kimberlee H. Shish, Gordon Keller, Vahram Stepanyan, Nick B. Cramer, ...

Author(s): Navneet Ghedia, Chandresh Vithalani, Ashish M. Kothari, Rohit M. Thanki
Keyword(s):

2022
Author(s): Qiankun Liu, Dongdong Chen, Qi Chu, Lu Yuan, Bin Liu, ...
Keyword(s):

Author(s): V. Vijeya Kaveri, V. Meenakshi, R. Meena Devi, A. Kousalya, M. Sujaritha
Keyword(s):
