National Medical Licensing Examination

1970 ◽  
Vol 10 (1) ◽  
pp. 65-69 ◽  
Author(s):  
Sam Sup Choi

2012 ◽  
Vol 35 (2) ◽  
pp. 173-173 ◽  
Author(s):  
Keh-Min Liu ◽  
Tsuen-Chiuan Tsai ◽  
Shih-Li Tsai

2020 ◽  
Vol 34 (05) ◽  
pp. 8822-8829
Author(s):  
Sheng Shen ◽  
Yaliang Li ◽  
Nan Du ◽  
Xian Wu ◽  
Yusheng Xie ◽  
...  

Question answering (QA) has achieved promising progress recently. However, answering a question in real-world scenarios such as the medical domain is still challenging, due to the requirement of external knowledge and the insufficient quantity of high-quality training data. In light of these challenges, we study the task of generating medical QA pairs in this paper. With the insight that each medical question can be considered a sample from the latent distribution of questions given answers, we propose an automated medical QA pair generation framework, consisting of an unsupervised key phrase detector that explores unstructured material for validity, and a generator that involves a multi-pass decoder to integrate structural knowledge for diversity. A series of experiments was conducted on a real-world dataset collected from the National Medical Licensing Examination of China. Both automatic evaluation and human annotation demonstrate the effectiveness of the proposed method. Further investigation shows that, by incorporating the generated QA pairs for training, a significant improvement in accuracy can be achieved for the examination QA system.
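To make the two-stage idea in this abstract concrete, the minimal Python sketch below mirrors the described pipeline: an unsupervised key phrase detector over the answer material followed by a multi-pass generation step that injects the detected phrases. The frequency-based scorer, the string-template "decoder," and all function names are illustrative assumptions, not the paper's neural models.

```python
# Toy sketch of the two-stage QA-pair generation pipeline described above:
# (1) an unsupervised key-phrase detector that scores candidate phrases in the
#     answer material, and
# (2) a multi-pass "decoder" that drafts a question and refines it with the
#     detected phrases.
# All heuristics and names are illustrative stand-ins for the paper's models.

from collections import Counter
import re


def detect_key_phrases(answer_text, top_k=3):
    """Stand-in for the unsupervised key phrase detector: rank words in the
    answer by frequency, keeping longer tokens as a crude validity signal."""
    tokens = re.findall(r"[a-z]+", answer_text.lower())
    scores = Counter(t for t in tokens if len(t) > 4)
    return [word for word, _ in scores.most_common(top_k)]


def generate_question(answer_text, key_phrases, passes=2):
    """Stand-in for the multi-pass decoder: pass 1 drafts a generic question
    stem; each later pass refines it by injecting one detected key phrase,
    making the generated question more specific."""
    question = "Which of the following statements is correct?"
    for phrase in key_phrases[: passes - 1]:
        question = question[:-1] + f" regarding '{phrase}'?"
    return question


if __name__ == "__main__":
    answer = ("Metformin is a first-line oral agent that lowers blood glucose "
              "in patients with type 2 diabetes mellitus.")
    phrases = detect_key_phrases(answer)   # crude key phrases from the answer
    print(phrases)
    print(generate_question(answer, phrases))
```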


Author(s):  
Guemin Lee

The National Health Personnel Licensing Examination Board (hereafter NHPLEB) has used 60% correct responses on the overall test and 40% correct responses on each subject area test as the criterion for granting physician licenses to qualifying candidates. The 60%-40% criterion seems reasonable to laypersons without psychometric or measurement knowledge, but it may cause several severe problems from a psychometrician's perspective. This paper pointed out several problematic cases that can be encountered when using the 60%-40% criterion, and provided several psychometric alternatives that could overcome these problems. A fairly new approach, the Bookmark standard setting method, was introduced and explained in detail as an example. The paper concluded with five considerations for when the NHPLEB decides to adopt a psychometric standard setting approach to set a cut score for a licensure test such as the medical licensing examination.
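For illustration, the sketch below contrasts the 60%-40% decision rule described in this abstract with a Bookmark-style cut score derived under a Rasch model. The RP67 response probability, the Rasch mapping, and all data structures and names are illustrative assumptions, not the NHPLEB's actual procedure or the paper's worked example.

```python
# Sketch of (a) the current 60%-40% pass rule and (b) a Bookmark-style cut
# score under a Rasch IRT model. Both are illustrative assumptions.

import math


def passes_60_40(subject_scores):
    """60%-40% rule: pass iff overall percent correct >= 60% AND percent
    correct in every subject area >= 40%.
    subject_scores maps subject -> (num_correct, num_items)."""
    total_correct = sum(c for c, n in subject_scores.values())
    total_items = sum(n for c, n in subject_scores.values())
    overall_ok = total_correct / total_items >= 0.60
    each_subject_ok = all(c / n >= 0.40 for c, n in subject_scores.values())
    return overall_ok and each_subject_ok


def bookmark_cut_theta(item_difficulties, bookmark_index, rp=2/3):
    """Bookmark method sketch: items are ordered by Rasch difficulty and a
    panelist bookmarks the last item a minimally competent candidate should
    answer correctly with probability >= rp. Under the Rasch model the
    corresponding ability cut score is b + ln(rp / (1 - rp))."""
    ordered = sorted(item_difficulties)
    b = ordered[bookmark_index]
    return b + math.log(rp / (1 - rp))


# Example: a 66% overall score still fails because one subject is below 40%.
scores = {"medicine": (80, 100), "surgery": (70, 100), "ethics": (15, 50)}
print(passes_60_40(scores))  # False: ethics is 30% even though overall is 66%

# Example: bookmark at the 30th easiest of 50 toy items (difficulty 0.65
# logits) gives a cut score of 0.65 + ln(2) ~= 1.34 on the ability scale.
difficulties = [i * 0.05 - 0.8 for i in range(50)]
print(round(bookmark_cut_theta(difficulties, 29), 2))
```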


Author(s):  
Kun Hwang

The purpose of this study was to examine the opinions of medical students and physician writers regarding the medical humanities as a subject and its inclusion in the medical school curriculum. Furthermore, we addressed whether an assessment test should be added to the National Medical Licensing Examination of Korea (KMLE). A total of 192 medical students at Inha University and 39 physician writers registered with the Korean Association of Physician Essayists and the Korean Association of Physician Poets participated in this study. They were asked to complete a series of questionnaires. Most medical students (59%) and all physician writers (100%) answered that the medical humanities should be included in the medical school curriculum to train good physicians. They thought that the KMLE did not currently include an assessment of the medical humanities (medical students 69%, physician writers 69%). Most physician writers (87%; Likert scale, 4.38 ± 0.78) felt that an assessment of the medical humanities should be included in the KMLE. Half of the medical students (51%; Likert scale, 2.51 ± 1.17) were against including it in the KMLE, which they would have to pass after several years of study. For the preferred field of assessment, medical ethics was the most commonly endorsed subject (medical students 59%, physician writers 39%). The most frequently preferred evaluation method was an interview (medical students 45%, physician writers 33%). In terms of assessing the medical humanities and adding this subject to the KMLE, an interview-based evaluation should be developed.


2008 ◽  
Vol 8 (1) ◽  
Author(s):  
Sohail Bajammal ◽  
Rania Zaini ◽  
Wesam Abuznadah ◽  
Mohammad Al-Rukban ◽  
Syed Moyn Aly ◽  
...  

Author(s):  
Yoon Hee Kim

In May 2011, the Ministry of Unification of the Republic of Korea (Korea) announced that 21,165 defectors from the Democratic People's Republic of Korea (North Korea) had settled in Korea. Since healthcare workers are counted among these defectors, it is necessary to provide them with a pathway to certification to work in Korea. This report summarizes the vetting and approval process that defectors from North Korea must pass through to be eligible to take the national medical licensing examination. Defectors must pass an oral test conducted by the National Health Personnel Licensing Examination Board to be eligible to sit for the exam. From 2002 to August 2011, 41 North Korean defectors applied for the approval process to take the exam. Twenty-nine were approved (70.7%): 23 physicians, 1 dentist, 2 oriental medical doctors, 1 nurse, and 2 pharmacists. Of the 29 approved, 11 passed the licensing examination (39.3%). This report also highlights the difficulty of assessing North Korean defectors' eligibility by oral test and suggests that adequate competency should be emphasized to recognize their unique abilities as healthcare personnel.

