rule type
Recently Published Documents

TOTAL DOCUMENTS: 29 (five years: 5)
H-INDEX: 7 (five years: 0)

2020 ◽  
Vol 19 (2) ◽  
pp. 93-106
Author(s):  
Clifford E. Hauenstein ◽  
Susan E. Embretson

The Concept Formation subtest of the Woodcock Johnson Tests of Cognitive Abilities represents a dynamic test due to continual provision of feedback from examiner to examinee. Yet, the original scoring protocol for the test largely ignores this dynamic structure. The current analysis applies a dynamic adaptation of an explanatory item response theory model to evaluate the impact of feedback on item difficulty. Additionally, several item features (rule type, number of target shapes) are considered in the item difficulty model. Results demonstrated that all forms of feedback significantly reduced item difficulty, with the exception of corrective feedback that could not be directly applied to the next item in the series. More complex and compound rule types also significantly predicted item difficulty, as did increasing the number of shapes, thereby supporting the response process aspect of validity. Implications for continued use of the Concept Formation subtest for educational programming decisions are discussed.
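As background, an explanatory item response model of the kind this abstract describes (an LLTM-style decomposition of item difficulty into feature effects) can be written as follows. The notation is illustrative, not the authors' exact specification:

```latex
% Hedged sketch: linear logistic test model (LLTM) form, where item
% difficulty is decomposed into weighted feature effects. In the study's
% dynamic adaptation, the features q_{ik} would include rule type, number
% of target shapes, and feedback condition.
P(X_{pi} = 1 \mid \theta_p)
  = \frac{\exp\!\left(\theta_p - \sum_{k} q_{ik}\,\eta_k\right)}
         {1 + \exp\!\left(\theta_p - \sum_{k} q_{ik}\,\eta_k\right)}
```

Here \(\theta_p\) is the ability of person \(p\), \(q_{ik}\) codes whether feature \(k\) applies to item \(i\), and \(\eta_k\) is that feature's contribution to item difficulty, so a significant negative \(\eta_k\) for a feedback condition corresponds to the reported reduction in difficulty.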



2019 ◽  
Vol 1 (3) ◽  
pp. 1-6
Author(s):  
Desi Laidawati ◽  
Yuhandri Yuhandri

The choice of elective subjects for the computer-based national exam must be matched to a student's interests and talents, so the selection of a major is very important for a student who will continue on to college. In practice, however, these decisions often cause problems, because majors are frequently chosen by following friends or under coercion from parents. As a result, many students feel out of line with their expectations or abilities and want to change majors. For this reason, an expert system was built that makes it easy for students to consult early in order to determine their elective subjects for the computer-based national examination. The system uses the forward chaining method to draw conclusions: it receives as input the types of problems experienced by a student and returns early guidance on the subjects that match the student's talents and interests, governed by a rule base for this type of problem. With an accuracy of 89.29%, the system can be considered good enough to implement.
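As a concrete illustration, the forward chaining inference this abstract describes can be sketched in Python. The rule base below is hypothetical (the paper does not give its actual rules for exam subjects); only the inference loop itself reflects the technique.

```python
# Minimal forward-chaining sketch: repeatedly fire any rule whose premises
# are all known facts, until no new fact can be derived.

def forward_chain(rules, facts):
    """rules: list of (premises, conclusion); facts: initial fact set."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True  # a new fact may enable further rules
    return facts

# Hypothetical rule base mapping student traits to subject recommendations.
rules = [
    ({"likes_logic", "good_at_math"}, "aptitude_quantitative"),
    ({"aptitude_quantitative"}, "recommend_mathematics"),
    ({"likes_experiments"}, "recommend_physics"),
]

result = forward_chain(rules, {"likes_logic", "good_at_math"})
# Chains two rules: traits -> aptitude_quantitative -> recommend_mathematics.
```

The loop is data-driven: consultation input (the student's reported problems) seeds the fact set, and conclusions accumulate until a recommendation is reached.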


Author(s):  
Eline van der Linden ◽  
Koen Smit ◽  
Matthijs Berkhout ◽  
Martijn Zoet ◽  
...  
Keyword(s):  

2019 ◽  
Vol 28 (3) ◽  
pp. 483-506
Author(s):  
Mislav Sudić ◽  
Pavle Valerjev ◽  
Josip Ćirić

Domain theory suggests that moral rules and conventions are perceived differently and elicit different responses. A special procedure was designed to test this hypothesis in a laboratory setting using a deontic reasoning task. The goal was to gain insight into the cognitive and metacognitive processes of deontic reasoning from simple deontic premises. In a 3×2×2 within-subjects design, we varied rule content (moral, conventional, abstract), rule type (obligation, permission), and the induced dilemma (punishment dilemma, reward dilemma). Participants (N = 78) were presented with 12 laws. After memorizing a law, participants were shown eight cases and asked to make a quick judgment of each: punishing rule violators, ignoring rule conformists, and rewarding rule supererogation. Response times (RT) and accuracy were measured for each judgment, and confidence was measured after each set of judgments. No differences were expected between rule types, except for superior performance for moral content and punishment dilemmas. RT correlated negatively with confidence levels, while accuracy correlated positively. Moral reasoning was more accurate than conventional and abstract reasoning and produced higher confidence levels. Performance was better for punishment dilemmas than for reward dilemmas, likely due to the presence of a cheater-detection module, but this difference was not found in moral reasoning. Moral reasoning was also independent of rule type, while conventional and abstract reasoning produced superior performance for obligation-type rules than for permission-type rules. A large drop-off in accuracy was detected for rules that allowed undesirable behaviour, a phenomenon we termed the "deontic blind spot". However, this blind spot was not present in moral reasoning.
Three lines of evidence indicate a qualitative difference between the moral and other deontic domains: (1) performance for moral content was independent of rule type, (2) moral content produced an equal activation of violator- and altruist-detection modules, and (3) moral content produced higher levels of confidence.


2017 ◽  
Vol 17 (1) ◽  
pp. 1 ◽  
Author(s):  
Octa Heriana ◽  
Ali Matooq Al Misbah

The heart is considered the most important organ of the body, controlling the circulation of blood throughout it. Measured heartbeat signals can be analyzed further to determine a person's health condition. The challenge of ECG signal measurement and analysis is removing the noise imposed on the signal, which comes from many different sources, such as internal noise in sensor devices, power-line interference, muscle activity, and body movements. This paper applies the wavelet transform to reduce the noise imposed on the ECG signal and recover a signal close to the actual heart signal. The ECG data used in this research are three digitized ECG recordings obtained from the MIT-BIH Arrhythmia Database. The first step generates the noisy ECG input signal by adding a 1 W white Gaussian noise (WGN) signal to the original ECG signal. The discrete wavelet transform (DWT) is then applied to decompose the noisy ECG signal. Several DWT parameters, namely the threshold selection (rule, type, rescaling), the decomposition level, and the wavelet family, are varied to obtain the best denoised output signal, and all results are recorded for comparison. Based on the results, the best DWT parameters for ECG signal denoising are obtained with the Symlet wavelet at decomposition level 3, using soft thresholding under the rigrsure threshold selection rule.
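The denoising pipeline this abstract describes (transform, threshold the detail coefficients, reconstruct) can be sketched with a one-level Haar wavelet for simplicity. This is only an assumption-laden illustration of the scheme: the paper itself uses a Symlet wavelet at level 3 with the rigrsure rule, which this sketch does not reproduce.

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal), 2)]
    return a, d

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def haar_idwt(a, d):
    """Inverse one-level Haar DWT."""
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

# Denoise: transform, threshold the detail band, reconstruct.
# Small pairwise wiggles (noise) are suppressed; the step at mid-signal
# (structure) is preserved in the approximation band.
noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]
a, d = haar_dwt(noisy)
denoised = haar_idwt(a, soft_threshold(d, 0.2))
```

In the full method, this decompose/threshold/reconstruct cycle is applied recursively to the approximation band down to the chosen level, and the threshold itself is chosen by a rule such as rigrsure rather than fixed by hand.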


Author(s):  
Chhaya Shori ◽  
Rakesh Shori ◽  
Gannaram Laxmiprasad ◽  
Ashalatha Alli

Background: Intraocular lens implantation is often the only surgical approach available in developing countries. Cataract thus constitutes the leading cause of blindness in developing countries, as many patients with cataract do not have access to hospitals and surgery. The objective was to study the clinical and ophthalmologic profile of patients undergoing cataract surgery.
Methods: A hospital-based cross-sectional study was carried out among 100 cataract patients assigned to undergo conventional extracapsular cataract extraction surgery over a period of two years in a tertiary care referral hospital. One day before surgery, these patients were admitted to the indoor wards of the department of ophthalmology. A detailed history was obtained from each patient. Visual acuity was checked with Snellen's visual acuity chart and pinhole improvement was noted.
Results: Maximum prevalence was seen in the age group of 51-60 years (37%), followed by 61-70 years (30%); cataract is thus seen more commonly above 50 years of age. Prevalence of cataract was higher in females (59%) than in males (41%). Cortical cataract constituted 86% of cases and the remaining 14% were of the nuclear type. The majority of patients (59%) had visual acuity of less than 1/60, followed by 33% with 5/60 to 1/60. Against-the-rule astigmatism was present in 58% of patients, with-the-rule astigmatism in 34%, and only 8% had the NOA type of astigmatism.
Conclusions: Increasing age and female gender were the most important risk factors in the present study. Cortical cataract was more common than nuclear cataract. Most patients had low vision.


Author(s):  
Ayaho Miyamoto

This paper describes a method for acquiring rule-type knowledge from field inspection data on highway bridges. The proposed method improves on a traditional data mining technique by applying rough set theory to the conventional decision table reduction method. The rough set approach helps in cases of exceptional and contradictory data, which the traditional decision table reduction method simply removes from the analysis. Instead of automatically removing all apparently contradictory cases, the proposed method determines whether the data really are contradictory and therefore must be removed. Field data, however, contain numerous inconsistencies, so the method also addresses the resulting problem of data loss. It has been tested with real data on bridge members, including girders and filled joints, in bridges owned and managed by a highway corporation in Japan, and it reveals some generally unrecognized decision rules in addition to generally accepted knowledge. Finally, a computer program was developed to perform the calculation routines, and field inspection data on highway bridges are used to show the applicability of the proposed method.
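The key step the abstract describes, deciding whether decision-table rows are genuinely contradictory, can be sketched as follows. The inspection attributes and values below are hypothetical; in rough-set terms, groups of rows that agree on all condition attributes but disagree on the decision attribute form the boundary region, and only those are candidates for special handling.

```python
from collections import defaultdict

def find_contradictions(table, condition_keys, decision_key):
    """Group rows by their condition-attribute values (indiscernibility
    classes); a group whose rows disagree on the decision attribute is
    genuinely contradictory (the rough-set boundary region)."""
    groups = defaultdict(set)
    for row in table:
        cond = tuple(row[k] for k in condition_keys)
        groups[cond].add(row[decision_key])
    return {cond for cond, decisions in groups.items() if len(decisions) > 1}

# Hypothetical bridge-inspection records.
table = [
    {"crack": "wide", "corrosion": "high", "action": "repair"},
    {"crack": "wide", "corrosion": "high", "action": "monitor"},  # conflicts
    {"crack": "none", "corrosion": "low",  "action": "monitor"},
]
conflicts = find_contradictions(table, ["crack", "corrosion"], "action")
# Only the ("wide", "high") condition class is contradictory; the third
# row is consistent and would be kept for rule extraction.
```

A traditional reduction method would discard every row that merely looks exceptional; the check above keeps consistent rows and isolates only the truly conflicting condition classes, which is the data-loss problem the paper targets.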


Author(s):  
Leo Pruijt ◽  
Wiebe Wiersema ◽  
Jan Martijn E. M. van der Werf ◽  
Sjaak Brinkkemper
Keyword(s):  
