Artificial Intelligence Alongside Physicians in Canada: Reality and Risks

2020 ◽  
Vol 17 (01) ◽  
Author(s):  
Sumedha Sachar ◽  
Maïa Dakessian ◽  
Saina Beitari ◽  
Saishree Badrinarayanan

Artificial intelligence (AI) and machine learning (ML) have the potential to revolutionize the healthcare system through improved diagnosis, personalized treatment, and reduced physician burnout. These technologies depend heavily on large datasets to learn from and require data sharing across organizations for reliable and efficient predictive analysis. However, adoption of AI/ML technologies will require policy imperatives to address the challenges of data privacy, accountability, and bias. To form a regulatory framework, we propose that algorithms should be interpretable and that companies deploying black-box models be held accountable for the output of their ML systems. To increase accountability and reduce bias, physicians can be educated about the bias inherent in ML systems. We further discuss the potential benefits and disadvantages of existing privacy standards, the Personal Information Protection and Electronic Documents Act (PIPEDA) and the General Data Protection Regulation (GDPR), at the federal, provincial, and territorial levels. We emphasize responsible implementation of AI through ethics and skill-building, minimizing data privacy breaches while boosting innovation and increasing accessibility and interoperability across provinces.

2021 ◽  
Author(s):  
Yurong Gao ◽  
Yiping Guo ◽  
Awais Khan Jumani ◽  
Achyut Shankar

Abstract: Data security needs a comprehensive system design approach that combines legal, administrative, and technical protection. These laws generally contain complete rules and principles governing the collection, storage, and use of personal information in line with international standards on privacy and data protection. Personal data should be collected lawfully for a specified purpose and not used without authorization for unlawful monitoring or profiling by governments or third parties. In advocacy and open-data activity, increasing attention has been placed on privacy problems. To secure the protection of this data, privacy law (PL) and regulations typically set industrial and technical standards for IT systems that hold and handle personal data. Concerns about information privacy are genuine, valid, and exacerbated in the Internet of Things (IoT) and Cyber-Physical Systems (CPS). This article suggests that compliance with IoT and CPS Data Privacy (DP) should be addressed at both technical and non-technical levels. The proposed architecture is then coupled with a reference framework for the business architecture to offer a DP-IoT model focused on industry and technology and positioned to comply with the Protection of Personal Information Act (POPI). Methods are therefore necessary to protect data privacy based on both system and organizational reference designs. In the end, users should have specific rights to information about them, including the capacity and means to seek recourse to protect those rights and to access and amend incorrect details. The DP-IoT model shows a privacy ratio of 92.6%, a scalability ratio of 91.5%, a data management ratio of 94.3%, a data protection ratio of 96.7%, a customer satisfaction rate of 92.2%, an attack prevention ratio of 95.5%, and an energy consumption ratio of 25.5% compared to the existing methods.
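The purpose-limitation principle described above, under which personal data is collected for a specified reason and may not be reused without authorization, can be sketched in a few lines of code. This is a minimal illustration only, not the DP-IoT model from the article; the `ConsentRecord` class and `check_use` function are hypothetical names invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Purposes a data subject explicitly authorized at collection time
    (hypothetical model for illustration)."""
    subject_id: str
    allowed_purposes: set = field(default_factory=set)

def check_use(record: ConsentRecord, purpose: str) -> bool:
    """Purpose limitation: a use is permitted only if the subject
    authorized that exact purpose when the data was collected."""
    return purpose in record.allowed_purposes

record = ConsentRecord("patient-001", {"treatment", "billing"})
assert check_use(record, "treatment")      # authorized at collection
assert not check_use(record, "marketing")  # unauthorized reuse is denied
```

In a real system the denied branch would typically also be logged, since auditability is part of the recourse rights the abstract describes.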


2021 ◽  
Vol 11 (6) ◽  
pp. 2704
Author(s):  
Oyun Kwon ◽  
Sun K. Yoo

Medical imaging is increasingly used with artificial intelligence and big data technologies, and the data formats involved are central to that use. For medical imaging collected from different institutions and systems to serve as artificial intelligence data, interoperability is becoming a key element. While interoperability is currently supported through medical data standards, compliance with personal information protection laws, and other methods, a standard for measurement values is needed before further application as artificial intelligence data. This study therefore proposes a model for interoperability covering medical data standards, personal information protection methods, and medical imaging measurements. The model applies Health Level Seven (HL7) and Digital Imaging and Communications in Medicine (DICOM) standards to medical imaging data and increases accessibility to that data, in compliance with personal information protection laws, through de-identification methods. The study focuses on offering a standard for the measurement values of standard materials that addresses uncertainty in measurements, which pre-existing medical imaging measurement standards did not provide. It finds that the medical imaging data standards conform to pre-existing standards and also protect personal information within medical images through de-identification. Moreover, it proposes a reference model that increases interoperability by composing a process that minimizes uncertainty using standard materials. The interoperability reference model is expected to assist artificial intelligence systems using medical imaging and further enhance the resilience of future health technologies and system development.
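The de-identification step this abstract relies on can be illustrated with a short sketch. This is a hypothetical, minimal example rather than the model proposed in the paper: it drops direct identifiers from DICOM-style metadata and replaces the patient ID with a salted hash, so records from the same patient remain linkable without being identifiable. The attribute names loosely follow common DICOM attributes; the `deidentify` function and `SALT` value are assumptions for the example.

```python
import hashlib

# Direct identifiers to remove, loosely following DICOM attribute names.
DIRECT_IDENTIFIERS = {"PatientName", "PatientBirthDate", "PatientAddress"}
SALT = b"site-specific-secret"  # hypothetical per-institution salt

def deidentify(metadata: dict) -> dict:
    """Drop direct identifiers and pseudonymize PatientID with a salted hash,
    so images from the same patient stay linkable across studies."""
    clean = {k: v for k, v in metadata.items() if k not in DIRECT_IDENTIFIERS}
    if "PatientID" in clean:
        digest = hashlib.sha256(SALT + clean["PatientID"].encode()).hexdigest()
        clean["PatientID"] = digest[:16]  # truncated pseudonym
    return clean

record = {
    "PatientName": "DOE^JANE",
    "PatientID": "H123456",
    "PatientBirthDate": "19800101",
    "Modality": "CT",
}
clean = deidentify(record)
assert "PatientName" not in clean
assert clean["PatientID"] != "H123456"  # pseudonymized, not raw
assert clean["Modality"] == "CT"        # clinical content preserved
```

A salted hash is one common design choice here: unlike simple deletion, it preserves the cross-study linkability that longitudinal AI training data needs, while the salt prevents trivial dictionary reversal of known patient IDs.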


2019 ◽  
Vol 44 (2) ◽  
Author(s):  
Jonathan A. Obar

The problematic presumption that users can control the vast consent and data-management responsibilities associated with big data is referred to as the fallacy of data privacy self-management. Though untenable, this presumption remains fundamental to Canadian privacy law, exemplified in the individual access principle of the Personal Information Protection and Electronic Documents Act governing commercial data management. This article describes the fallacy, critiques the individual access principle, and introduces potential solutions relevant to Canada’s digital strategy.


2020 ◽  
Vol 10 (2) ◽  
pp. 27-35
Author(s):  
Suhyeon Kim ◽  
Sumin Kang ◽  
Jaein Yoo ◽  
Gahyeon Lee ◽  
Hyojeong Yi ◽  
...  

2016 ◽  
Author(s):  
Marc-Aurele Racicot

These days, is there a topic more significant and provocative than the protection of privacy in the private sector? The importance of this topic has been highlighted since the Canadian Parliament adopted the Personal Information Protection and Electronic Documents Act, which came into full force on 1 January 2004 and which is scheduled for review in 2006. Although it seems that everywhere we turn, the word "privacy" and its companion PIPEDA are at centre stage, many say that this attention is unwarranted, a knee-jerk reaction to an information age in which one can run but cannot hide. Like it or not, we are subject to the prying eyes of cameras in public places, the tracking and trailing of Internet activities, the selling of address lists and other such listings, and the synthesizing by marketers of frightful amounts of personal information that, when pulled together, reveal a great deal about our personal lives, our ancestry, our relationships, our interests, and our spending habits.

