DETERMINING THE CHARACTERISTICS OF THE IDEAL PROFESSOR: AN ALTERNATIVE APPROACH

1974 ◽  
Vol 11 (4) ◽  
pp. 269-276 ◽  
Author(s):  
MICHAEL J. SUBKOVIAK ◽  
JOEL R. LEVIN

2020 ◽  
Vol 14 (1) ◽  
pp. 87-105
Author(s):  
Kort C. Prince ◽  
Jeremiah W. Jaggers ◽  
Allyn Walker ◽  
Jess Shade ◽  
Erin B. Worwood

Mental Health Courts (MHCs) are problem-solving courts that have been implemented throughout the United States. One critical component of operating MHCs is determining their effectiveness and limitations. However, evaluating MHCs presents unique challenges. One major challenge, and the focus of this paper, is identifying an adequate control group. The ideal approach to determining efficacy is a controlled group design in which participants are randomized to treatment or control conditions. However, this approach is not possible when conducting a retrospective evaluation of court data. In addition, a specific set of ethical and logistical issues arises. Propensity score matching (PSM) provides an alternative approach for comparing groups when randomization is not possible. PSM works by first identifying the characteristics that make a person likely to be in treatment. We describe our attempts to use PSM in an MHC evaluation. Specific challenges with PSM are discussed, and recommendations are made for the use of PSM with MHCs.
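
For readers unfamiliar with the mechanics of PSM, the sketch below illustrates the two basic steps described above: modelling the probability of treatment from observed covariates, then pairing each treated case with the control case that has the nearest propensity score. It is a minimal, generic illustration in Python, not the authors' procedure; the column names and the one-to-one nearest-neighbour matching scheme are assumptions.

    # Minimal propensity score matching sketch (illustrative only, not the
    # authors' implementation). Column names are hypothetical placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def match_controls(df: pd.DataFrame, treatment_col: str, covariates: list) -> pd.DataFrame:
        """Return the treated cases plus one nearest-neighbour control per treated case."""
        # Step 1: estimate the propensity score, P(treatment | covariates).
        model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment_col])
        df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])
        treated = df[df[treatment_col] == 1]
        controls = df[df[treatment_col] == 0]
        # Step 2: match each treated case to the control with the closest score.
        nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
        _, idx = nn.kneighbors(treated[["pscore"]])
        return pd.concat([treated, controls.iloc[idx.ravel()]])

    # Hypothetical usage: df has an 'mhc' treatment indicator plus covariate columns.
    # matched = match_controls(df, "mhc", ["prior_arrests", "age", "diagnosis_severity"])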


2021 ◽  
Author(s):  
Alan J. Taylor

The performance of observers in auditory experiments is likely to be affected by extraneous noise from physiological or neurological sources and also by decision noise. Attempts have been made to measure the characteristics of this noise, in particular its level relative to that of the masking noise provided by the experimenter. This study investigated an alternative approach: a method of analysis that seeks to reduce the effects of extraneous noise on measures derived from experimental data. Group-Operating-Characteristic (GOC) analysis was described by Watson (1963) and investigated by Boven (1976). Boven distinguished between common and unique noise; GOC analysis seeks to reduce the effects of unique noise. In the analysis, ratings of the same stimulus on different occasions are summed. The cumulative frequency distributions of the resulting variable define a GOC curve. This curve is analogous to an ROC curve, but since the effects of unique noise tend to be averaged out during the summation, the GOC is less influenced by extraneous noise. The amount of improvement depends on the relative variance of the unique and common noise (k); higher levels of unique noise lead to greater improvement. In this study four frequency discrimination experiments were carried out with pigeons as observers, using a three-key operant procedure. In other experiments, computer-simulated observers were used. The first two pigeon experiments, and the simulations, were based on known distributions of common noise. The ROCs for the constructed distributions provided a standard with which the GOC curve could be compared. In all cases the analysis led to improvements in the measures of performance and improved the match between the experimental results and the ideal ROC. The amount of improvement, as well as reflecting the level of unique noise, depended on the number of response categories. With smaller numbers of categories, improvement was reduced and k was underestimated. Since the pigeon observers made only "yes" or "no" responses, the results of the pigeon experiments were compared with the results of simulations with known distributions in order to obtain more accurate estimates of k. The third and fourth pigeon experiments involved frequency discrimination tasks with a standard of 450 Hz and comparison frequencies of 500, 600, 700, 800 and 900 Hz, and 650 Hz, respectively. With the multiple comparison frequencies the results were very variable, owing to the small number of trials for each frequency and the small number of replications. The results obtained with one comparison frequency were more orderly but, like those of the previous experiment, were impossible to distinguish from those which would be expected if there were no common noise. A final set of experiments was based on a hardware simulation. Signals first used in the fourth pigeon experiment were processed by a system made up of a filter, a zero-axis-crossing detector and a simulated observer. The results of these experiments were compatible with the possibility that the amount of unique noise in the pigeon experiments overwhelmed any evidence of common noise.
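
As a concrete illustration of the summation step described above, the following Python sketch sums the ratings given to each trial across replications and traces the resulting GOC curve from the cumulative distributions of the summed ratings for signal and noise trials. The array shapes, rating scale, and noise parameters are assumptions chosen for illustration, not values from the thesis.

    # Illustrative GOC computation: sum ratings over replications, then sweep a
    # criterion over the summed ratings to obtain (false alarm, hit) pairs.
    import numpy as np

    def goc_curve(ratings, is_signal):
        """ratings: (n_replications, n_trials) array of ratings for the same trials.
        is_signal: (n_trials,) boolean array, True for signal trials."""
        summed = ratings.sum(axis=0)                       # summation step
        thresholds = np.sort(np.unique(summed))[::-1]      # strictest criterion first
        hits = np.array([(summed[is_signal] >= t).mean() for t in thresholds])
        false_alarms = np.array([(summed[~is_signal] >= t).mean() for t in thresholds])
        return false_alarms, hits                          # points on the GOC curve

    # Hypothetical example: 4 replications of 200 trials rated on a 6-point scale,
    # with a shared (common) component and independent (unique) noise per replication.
    rng = np.random.default_rng(0)
    is_signal = rng.random(200) < 0.5
    common = np.where(is_signal, 1.0, 0.0) + rng.normal(0, 1, 200)
    ratings = np.clip(np.round(common + rng.normal(0, 1, (4, 200)) + 3), 1, 6)
    fa, hit = goc_curve(ratings, is_signal)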


2017 ◽  
Author(s):  
David Wiens

It is conventional wisdom among political philosophers that ideal principles of justice must guide our attempts to design institutions to avert actual injustice. Call this the ideal guidance approach. I argue that this view is misguided: ideal principles of justice are not appropriate "guiding principles" that actual institutions must aim to realize, even if only approximately. Fortunately, the conventional wisdom is also avoidable. In this paper, I develop an alternative approach to institutional design, which I call institutional failure analysis. The basic intuition of this approach is that our moral assessment of institutional proposals is most effective when we proceed from a detailed understanding of the causal processes generating problematic social outcomes. Failure analysis takes the primary task of institutional design to be obviating or averting institutional failures. Consequently, failure analysis enables theorists to prescribe more effective solutions to actual injustice because it focuses on understanding the injustice, rather than specifying an ideal of justice.


2019 ◽  
Vol 7 (2) ◽  
pp. 10-20
Author(s):  
Dr. Partha Protim Borthakur

Purpose: The present paper seeks to cross-examine Sen's notion of justice and to find a middle way between the ideal and non-ideal theorizing of justice. Besides searching for a reconciliation between Rawls and Sen, the present paper also attempts to go beyond Sen while critically engaging with his idea of justice. Methodology: This study applies a qualitative method; both historical and analytical methods are employed to reach the conclusive findings of the study. As the sources of this paper are basically secondary, all necessary and relevant materials are collected from a range of related books, articles, journals, newspapers, and reports of various seminars and conferences that fall within the domain of the study area. Main Findings: While analyzing Sen's critique of Rawlsian theory, the study finds that Rawlsian theory cannot be dismissed merely as a theory of ideal justice, nor is it redundant. In revisiting Sen's notion, the study finds that there is also a possibility of reconciliation between ideal and non-ideal theorizing of justice. Application: This study will be useful in understanding the debate between ideal and non-ideal theories of justice that has lately preoccupied political philosophy. It will also be useful in searching for a reconciliation between Rawls' and Sen's paradigms of justice, and thereby in offering a conception of justice that is reasonable and true in assessing issues of justice in the present scenario. Novelty/Originality: By revisiting Sen's notion of justice and analyzing these dimensions of politics, the study will help the reader evaluate the debate between ideal and non-ideal theorizing of justice. Moreover, by searching for a possible reconciliation between Rawls and Sen, the study will contribute towards developing an alternative approach to, and understanding of, justice.


Author(s):  
M.S. Shahrabadi ◽  
T. Yamamoto

The technique of labeling macromolecules with ferritin-conjugated antibody has been used successfully for extracellular antigens by staining the specimen with the conjugate prior to fixation and embedding. However, the ideal method for determining the location of an intracellular antigen would be to carry out the antigen-antibody reaction in thin sections. This technique involves inherent problems, such as the destruction of antigenic determinants during fixation or embedding and the non-specific attachment of conjugate to the embedding media. Certain embedding media, such as polyampholytes (2) or cross-linked bovine serum albumin (3), have been introduced to overcome some of these problems.


Author(s):  
R. A. Crowther

The reconstruction of a three-dimensional image of a specimen from a set of electron micrographs reduces, under certain assumptions about the imaging process in the microscope, to the mathematical problem of reconstructing a density distribution from a set of its plane projections. In the absence of noise we can formulate a purely geometrical criterion, which, for a general object, fixes the resolution attainable from a given finite number of views in terms of the size of the object. For simplicity we take the ideal case of projections collected by a series of m equally spaced tilts about a single axis.
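
For reference, the geometrical criterion alluded to here is usually stated, in the form due to Crowther, DeRosier and Klug (1970), as m >= pi * D / d for m equally spaced single-axis views of an object of diameter D reconstructed to resolution d. The short Python helper below simply evaluates that textbook bound; it is a reminder of the standard result, not a quotation from this abstract.

    # Standard single-axis tilting criterion: d ~ pi * D / m (textbook form,
    # assumed here rather than quoted from the abstract).
    import math

    def attainable_resolution(num_views: int, diameter: float) -> float:
        """Resolution d (same units as diameter) from m equally spaced tilts."""
        return math.pi * diameter / num_views

    # Example: 60 equally spaced views of a 300 Å particle -> ~15.7 Å resolution.
    print(attainable_resolution(60, 300.0))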


Author(s):  
R. Beeuwkes ◽  
A. Saubermann ◽  
P. Echlin ◽  
S. Churchill

Fifteen years ago, Hall clearly described the advantages of the thin-section approach to biological x-ray microanalysis and the ratio method for quantitative analysis in such preparations. In this now classic paper, he also made it clear that the ideal method of sample preparation would involve only freezing and sectioning at low temperature. Subsequently, Hall and his coworkers, as well as others, have applied themselves to the task of direct x-ray microanalysis of frozen sections. To achieve this goal, different methodological approaches have been developed as different groups sought solutions to a common group of technical problems. This report describes some of these problems and indicates the specific approaches and procedures developed by our group in order to overcome them. We acknowledge that the techniques evolved by our group are quite different from earlier approaches to cryomicrotomy and sample handling, hence the title of our paper. However, such departures from tradition have been based upon our attempt to apply basic physical principles to the processes involved. We feel we have demonstrated that such a break with tradition has valuable consequences.
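
As a reminder of the ratio method mentioned above, Hall's continuum-normalization approach estimates an element's local mass fraction from the ratio of its characteristic x-ray counts to the continuum counts generated in the same analysed volume, calibrated against a standard of known composition. The snippet below is a deliberately simplified, hypothetical illustration of that proportionality, not the procedure used by this group.

    # Simplified illustration of a continuum-normalization (ratio) estimate.
    # Variable names and the single-standard calibration are assumptions.
    def mass_fraction(peak_counts, continuum_counts, standard_ratio, standard_fraction):
        """Estimate mass fraction from the specimen's peak/continuum ratio,
        scaled by a standard of known composition measured the same way."""
        specimen_ratio = peak_counts / continuum_counts
        return standard_fraction * specimen_ratio / standard_ratio

    # Hypothetical usage: a standard containing 1.0% of the element gave ratio 0.040;
    # the specimen gives 1200 peak counts over 24000 continuum counts.
    print(mass_fraction(1200, 24000, 0.040, 0.010))   # -> 0.0125, i.e. about 1.25%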


Author(s):  
G. Van Tendeloo ◽  
J. Van Landuyt ◽  
S. Amelinckx

Polytypism has been studied for a number of years, and a wide variety of stacking sequences has been detected and analysed. SiC is the prototype material in this respect (see e.g.). Electron microscopy under high-resolution conditions, when combined with x-ray measurements, is a very powerful technique for elucidating the correct stacking sequence or for studying polytype transformations and deviations from the ideal stacking sequence.


Author(s):  
N. Bonnet ◽  
M. Troyon ◽  
P. Gallion

Two main problems in high-resolution electron microscopy are, first, the existence of gaps in the transfer function and, second, the difficulty of recovering the complex amplitude of the diffracted wave from the recorded intensity. The second problem is in most cases addressed only by recording several micrographs under different conditions (defocus distance, illumination angle, complementary objective apertures…), which can lead to severe problems of contamination or radiation damage for certain specimens. Fraunhofer holography can in principle solve both problems stated above (1,2). The microscope objective is strongly defocused (far-field region) so that the two diffracted beams do not interfere. The ideal transfer function after reconstruction is then unity, and the twin image does not overlap the reconstructed one. We show some applications of the method and the results of preliminary tests. A possible application is the study of cavities: small voids (or gas-filled bubbles) created by irradiation in crystalline materials can be observed near the Scherzer focus, but it is then difficult to extract information other than the approximate size.
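
To make the first problem concrete, the conventional phase contrast transfer function of a bright-field image passes through zero at a series of spatial frequencies, and information at those frequencies is transmitted with no contrast; these zeros are the "gaps" referred to above. The Python sketch below evaluates a standard textbook form of the transfer function, CTF(k) = sin(chi(k)) with chi(k) = pi*lambda*df*k^2 - 0.5*pi*Cs*lambda^3*k^4, and locates its zero crossings. The numerical values of wavelength, defocus and spherical aberration are assumed for illustration and are not taken from this abstract.

    # Zeros ("gaps") of a conventional phase contrast transfer function.
    # All numerical values below are assumed for illustration only.
    import numpy as np

    wavelength = 2.51e-3      # nm, roughly 200 kV electrons (assumed)
    defocus = 60.0            # nm underfocus (assumed)
    cs = 1.2e6                # nm spherical aberration coefficient (assumed, 1.2 mm)

    k = np.linspace(0.0, 8.0, 4000)                     # spatial frequency, 1/nm
    chi = np.pi * wavelength * defocus * k**2 - 0.5 * np.pi * cs * wavelength**3 * k**4
    ctf = np.sin(chi)
    gaps = k[1:][np.diff(np.sign(ctf)) != 0]            # frequencies transmitted with zero contrast
    print(gaps[:5])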

