Comparative Study on Artificial Intelligence Based Real Time Voice Augmentation Applications

2021, Vol 7 (1), pp. 42-48
Author(s): Sumaiya M N, Akash U S, Aravind Sharma Kala, Sreekanth B V, Dharanendra Gowda G M
Endoscopy, 2021
Author(s): Carolin Zippelius, Saleh A. Alqahtani, Jörg Schedel, Dominic Brookman-Amissah, Klaus Muehlenberg, et al.

Background and Aims: Adenoma detection rate (ADR) varies significantly between endoscopists, with adenoma miss rates (AMR) of up to 26%. Artificial intelligence (AI) systems may improve endoscopic quality and reduce the rate of interval cancer. We evaluated the efficacy of an AI system in real-time colonoscopy and its influence on AMR and ADR.
Patients and Methods: In this prospective, non-randomized comparative study, we analyzed 150 patients (age 65±14; 69 women, 81 men) undergoing diagnostic colonoscopy at a single endoscopy center in Germany from June to October 2020. Every patient was examined concurrently by an endoscopist and the AI system using two opposing screens. The AI system, GI Genius (Medtronic), was overseen by a second observer and was not visible to the endoscopist. AMR was the primary outcome. The two methods were compared with the McNemar test.
Results: There was no statistically significant or clinically relevant difference (p=0.754) in AMR between the AI system (6/197, 3.0%, 95% CI [1.1-6.5]) and routine colonoscopy (4/197, 2.0%, 95% CI [0.6-5.1]). The polyp miss rate of the AI system (14/311, 4.5%, 95% CI [2.5-7.4]) was not significantly different (p=0.720) from that of routine colonoscopy (17/311, 5.5%, 95% CI [3.2-8.6]). There was no significant difference (p=0.500) between the ADR of routine colonoscopy (78/150, 52.0%, 95% CI [43.7-60.2]) and that of the AI system (76/150, 50.7%, 95% CI [42.4-58.9]). Routine colonoscopy detected adenomas in two patients that were missed by the AI system.
Conclusion: The AI system performed comparably to experienced endoscopists during real-time colonoscopy, with a similarly high ADR (>50%).
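The paired comparison above is exactly the setting for McNemar's test on per-adenoma detection outcomes. The sketch below shows how such a test could be run in Python with statsmodels; the individual 2x2 cell counts are an assumption reconstructed from the reported marginals (AI missed 6/197, routine missed 4/197) under the simplest case that no adenoma was missed by both methods, a split that happens to reproduce the reported p=0.754.

```python
# Sketch of the paired McNemar comparison reported above. The 2x2 cell
# counts are an ASSUMPTION reconstructed from the published marginals
# (AI missed 6/197 adenomas, routine colonoscopy missed 4/197), assuming
# no adenoma was missed by both methods.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: routine colonoscopy (detected, missed)
# Columns: AI system (detected, missed)
table = [
    [187, 6],  # detected by routine; 6 of these missed by AI (assumed)
    [4,   0],  # missed by routine; assumed none missed by both
]

result = mcnemar(table, exact=True)  # exact binomial test for small counts
print(f"p-value = {result.pvalue:.3f}")  # -> 0.754
```

With exact=True the test reduces to a two-sided binomial test on the 10 discordant pairs, which is appropriate here because the discordant counts are far too small for the chi-square approximation.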


2010, Vol 36 (5), pp. 984-989
Author(s): Zhe ZHAO, Chun-Hua REN, Xiao JIANG, Lü-Ping ZHANG, Juan FENG, et al.

2020, Vol 34 (10), pp. 13849-13850
Author(s): Donghyeon Lee, Man-Je Kim, Chang Wook Ahn

In the real-time strategy (RTS) game StarCraft II, players need to anticipate the consequences of a combat decision before making it. We propose a combat outcome predictor that utilizes terrain information as well as squad information. To train the model, we generated a StarCraft II combat dataset by simulating diverse, large-scale combat situations. The overall accuracy of our model was 89.7%. Our predictor can be integrated into artificial intelligence agents for RTS games as a short-term decision-making module.
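The abstract does not reproduce the model or dataset, so the following is only a minimal sketch of the idea: a binary win/loss classifier trained on combined squad and terrain features. The feature names, model choice (gradient boosting via scikit-learn), and synthetic data are all illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of a combat outcome predictor over squad + terrain
# features. Feature names, model choice, and data are illustrative
# assumptions, not the paper's actual implementation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-combat features: [ally_dps, ally_hp, enemy_dps,
# enemy_hp, high_ground_advantage, choke_width]
n = 5000
X = rng.normal(size=(n, 6))

# Synthetic label: the side with better squad stats plus a terrain bonus
# wins, with noise added to keep the task non-trivial.
logits = (1.2 * (X[:, 0] + X[:, 1] - X[:, 2] - X[:, 3])
          + 0.6 * X[:, 4] - 0.3 * X[:, 5])
y = (logits + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```

A model of this shape, once trained on simulated combats, can be queried per candidate engagement and so slots naturally into an agent as the short-term decision-making module the abstract describes.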


Author(s):  
Petar Radanliev ◽  
David De Roure ◽  
Kevin Page ◽  
Max Van Kleek ◽  
Omar Santos ◽  
...  

Abstract: Multiple governmental agencies and private organisations have made commitments to the colonisation of Mars. Such colonisation requires complex systems and infrastructure that could be very costly to repair or replace after cyber-attacks. This paper surveys deep learning algorithms, IoT cyber security and risk models, and established mathematical formulas to identify the best approach for developing a dynamic, self-adapting system for predictive cyber risk analytics, supported by Artificial Intelligence and Machine Learning and by real-time intelligence in edge computing. The paper presents a new mathematical approach that integrates concepts from cognition engine design, edge computing, and Artificial Intelligence and Machine Learning to automate anomaly detection. The engine instigates a step change by applying Artificial Intelligence and Machine Learning embedded at the edge of IoT networks to deliver safe and functional real-time intelligence for predictive cyber risk analytics. This will enhance capacities for risk analytics and assist in creating a comprehensive and systematic understanding of the opportunities and threats that arise when edge computing nodes are deployed and when Artificial Intelligence and Machine Learning technologies are migrated to the periphery of the internet and into local IoT networks.
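The abstract names edge-embedded anomaly detection as the engine's core task but does not reproduce the mathematical approach, so the following is a loosely related sketch rather than the authors' design: a lightweight streaming detector that flags readings whose z-score against exponentially weighted running statistics exceeds a threshold, the kind of cheap constant-memory computation an IoT edge node could run in real time.

```python
# Lightweight streaming anomaly detector for an edge node: flags a reading
# whose z-score against exponentially weighted running statistics exceeds
# a threshold. Illustrative sketch only, not the paper's cognition engine.
import math

class EwmaAnomalyDetector:
    def __init__(self, alpha=0.05, threshold=4.0, warmup=3):
        self.alpha = alpha          # smoothing factor for mean/variance
        self.threshold = threshold  # z-score cutoff for flagging
        self.warmup = warmup        # samples to ingest before flagging
        self.mean = None
        self.var = 0.0
        self.n = 0

    def update(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        self.n += 1
        if self.mean is None:       # first sample initialises the state
            self.mean = x
            return False
        z = abs(x - self.mean) / math.sqrt(self.var + 1e-9)
        # Update running statistics after scoring the current sample.
        diff = x - self.mean
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return self.n > self.warmup and z > self.threshold

detector = EwmaAnomalyDetector()
readings = [20.1, 20.3, 19.9, 20.2, 35.0, 20.0]  # sensor spike at 35.0
print([detector.update(r) for r in readings])
# -> [False, False, False, False, True, False]
```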

