extraneous data
Recently Published Documents


TOTAL DOCUMENTS: 11 (five years: 2)

H-INDEX: 4 (five years: 1)

2021 ◽  
Author(s):  
Katharina Groskurth ◽  
Matthias Bluemke ◽  
Clemens M. Lechner

To evaluate model fit in confirmatory factor analysis, researchers compare goodness-of-fit indices (GOFs) against fixed cutoff values derived from simulation studies. However, these cutoffs may not be as broadly applicable as researchers typically assume, especially in settings not covered by the simulation scenarios from which they were derived. We therefore evaluate (1) the sensitivity of GOFs to model misspecification and (2) their susceptibility to extraneous data and analysis characteristics (i.e., estimator, number of indicators, number of response options, distribution of response options, loading magnitude, sample size, and factor correlation). Our study comprises the most comprehensive simulation on this matter to date, which enables us to uncover several previously unknown or underappreciated issues with GOFs. All widely used GOFs proved far more susceptible to extraneous influences, and in more complex ways, than is generally appreciated, and their sensitivity to misspecifications in factor loadings and factor correlations varied considerably across scenarios. For instance, the magnitude of factor loadings strongly influenced all GOFs, either as a main effect or in two-way interactions with other characteristics. This strong susceptibility of GOFs to data and analysis characteristics shows that the practice of judging model fit against fixed cutoffs is more problematic than previously assumed. These hitherto unnoticed effects imply that no general cutoff rules can be applied to evaluate model fit. We discuss alternatives for assessing model fit and develop a new approach to tailor cutoffs for GOFs to specific research settings.
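To make the interaction between sample size and fixed cutoffs concrete, here is a minimal Python sketch (not the authors' simulation design) using the standard RMSEA point-estimate formula: an identical chi-square and degrees of freedom fail the conventional .06 cutoff at one sample size and pass it at another.

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Point estimate of RMSEA from the model chi-square, df, and sample size."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# The same misfit (chi-square = 100, df = 50) judged against the fixed .06 cutoff:
for n in (200, 500):
    value = rmsea(100.0, 50, n)
    verdict = "passes" if value < 0.06 else "fails"
    print(f"N = {n}: RMSEA = {value:.3f} ({verdict} the .06 cutoff)")
# N = 200: RMSEA = 0.071 (fails); N = 500: RMSEA = 0.045 (passes)
```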


2019 ◽  
Vol 35 (4) ◽  
pp. 247-255 ◽  
Author(s):  
Patrick Ippersiel ◽  
Richard Preuss ◽  
Shawn M. Robbins

Continuous relative phase (CRP) analysis using the Hilbert transform is prone to end effects. The purpose of this study was to investigate the impact of padding techniques (reflection, spline extrapolation, extraneous data, and an unpadded approach) on end effects in Hilbert-transformed CRP calculations, using sinusoidal, nonsinusoidal, and kinematic data from a repeated sit-to-stand-to-sit task in adults with low back pain (n = 16, mean age = 30 y). CRP angles were determined using a Hilbert transform of sinusoidal and nonsinusoidal signals with set phase shifts, and for the left thigh/sacrum segments. Root mean square difference and true error were used to compare test signals with a gold standard over the start, end, and full periods for all data. Mean differences and 95% bootstrapped confidence intervals were calculated to compare padding techniques using the kinematic data. The unpadded approach showed near-negligible error for sinusoidal data across all periods. No approach was clearly superior for nonsinusoidal data. Spline extrapolation showed significantly less root mean square difference (all periods) than double reflection (full period: mean difference = 2.11; 95% confidence interval, 1.41 to 2.79) and the unpadded approach (full period: mean difference = −15.8; 95% confidence interval, −18.9 to −12.8). Padding sinusoidal data when performing CRP analyses is therefore unnecessary. When extraneous data have not been collected, we recommend padding with a spline to minimize data distortion in Hilbert-transformed CRP analyses.
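For readers unfamiliar with the procedure, the following is a minimal sketch, assuming NumPy and SciPy, of CRP computed with and without reflection padding. It illustrates the end-effect problem the authors quantify, not their exact pipeline; the signal names, padding length, and 30-degree shift are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def phase_angle(signal: np.ndarray, pad: int = 0) -> np.ndarray:
    """Unwrapped Hilbert-transform phase, optionally with reflection padding."""
    x = np.pad(signal, pad, mode="reflect") if pad else signal
    x = x - x.mean()  # remove any DC offset before the transform
    phi = np.unwrap(np.angle(hilbert(x)))
    return phi[pad:len(phi) - pad] if pad else phi

# Two sinusoids with a known 30-degree phase shift:
t = np.linspace(0, 4 * np.pi, 400)
a, b = np.sin(t), np.sin(t + np.pi / 6)

crp_unpadded = np.degrees(phase_angle(a) - phase_angle(b))
crp_padded = np.degrees(phase_angle(a, pad=100) - phase_angle(b, pad=100))

# End effects appear as deviations from the known -30 degree CRP at the edges:
print(crp_unpadded[:3], crp_padded[:3])
```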


2018 ◽  
Vol 19 (1-2) ◽  
pp. 29-39
Author(s):  
Patrick O’Byrne ◽  
Jean Daniel Jacob ◽  
Lauren Orser

The provision of HIV medications to HIV-negative persons after exposure to HIV is known as postexposure prophylaxis (PEP). Because this prevention strategy is primarily available only in emergency rooms, we piloted a nurse-led community-based PEP program in Ottawa from September 2013 through August 2015. As part of evaluating this program, we conducted qualitative interviews with persons who initiated PEP. Twelve men who had engaged in condomless anal sex with other males participated. Thematic analysis of the interview transcripts highlighted that PEP was considered unmentionable because the participants saw it as proof of past behavior that was perceived negatively. Our results thus revealed that PEP was stigmatized, which made our participants reluctant to answer health care professionals' "questions" about why they needed PEP; to answer was to be exposed to stigma. For our participants, using PEP meant balancing the risk of HIV acquisition against the risk of disclosing the unmentionable. We take these findings to mean that clinicians and health service policy workers should move PEP into community clinics (decentralizing it from hospitals and increasing the involvement of nurses); aim to provide all required PEP services in community settings (consolidating PEP provision in these clinics); and ensure PEP services are streamlined to remove extraneous data collection (meaning history and exam tools should be standardized to minimize needless questions that may impede PEP access). Together, these recommendations may increase patients' access to PEP and maximize its HIV prevention effects.


2013 ◽  
Vol 31 (31_suppl) ◽  
pp. 31-31
Author(s):  
Debra A. Patt ◽  
Jennifer Trageser ◽  
Jeffrey A. Howard ◽  
Max Rush ◽  
Cara Heiman ◽  
...  

Background: Our large network of oncology practices (PRs) launched a health IT (HIT) patient portal (PP) to improve patient (PT) access to clinical information (CI) and to serve as a platform for enhanced PR-to-PT communication (C). Methods: A team of HIT specialists and oncologists developed the PP as a platform to facilitate PR-to-PT C, satisfy meaningful use (MU) requirements, and carry brand identification (ID) for PRs. Workflow planning for implementation included identifying and educating key participants at PRs. Educational signage was posted at PRs during initiation, paired with information at clinic check-in, to inform PTs about PP benefits and registration steps. After consent was obtained, PTs were invited by email and could then access the PP to view and download their secure CI. A review of support calls from both PR personnel and PTs highlighted opportunities for enhanced PP engagement. Enrollment (E) was captured monthly. Results: From April 2012 to June 2013, more than 34,000 PTs enrolled in the PP across over 47 PRs (Table). Inclusion of the PR brand and removal of extraneous data capture during E proved critical to success. Comparing E data from April 2012 with April 2013, after increased PR brand ID and reduced PT validation steps were implemented, opened invitations increased by 13% and Es increased by 22%. Conclusions: By engaging a development team and strategically planning content dissemination and education around initiation, the PP achieved wide utilization throughout the PRs. Monitoring adoption rates and capturing PT feedback allows incremental enhancements that can positively affect PT engagement with PRs. This functional mechanism can now serve as a platform to facilitate C between PRs and PTs, fulfill MU requirements, and support future dissemination of educational content. [Table: see text]


2013 ◽  
Vol 11 (02) ◽  
pp. 1250022 ◽  
Author(s):  
HIROSHI FUJII ◽  
TATSUSHI OGATA ◽  
TAKEHIKO SHIMADA ◽  
TOMOKO ENDO ◽  
HIROYUKI IKETANI ◽  
...  

DNA markers are frequently used to analyze crop varieties, with the coded marker data summarized in a computer-generated table. Such summary tables often provide extraneous data about individual crop genotypes, needlessly complicating and prolonging DNA-based differentiation between crop varieties. At present, it is difficult to identify minimal marker sets — the smallest sets that can distinguish between all crop varieties listed in a marker-summary table — due to the absence of algorithms capable of such characterization. Here, we describe the development of just such an algorithm and MinimalMarker, its accompanying Perl-based computer program. MinimalMarker has been validated in variety identification of fruit trees using published datasets and is available for use with both dominant and co-dominant markers, regardless of the number of alleles, including SSR markers with numeric notation. We expect that this program will prove useful not only to genomics researchers but also to government agencies that use DNA markers to support a variety of food-inspection and -labeling regulations.
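The underlying problem is a minimum distinguishing-subset search over the marker-summary table. The sketch below is a naive brute-force illustration in Python, not the MinimalMarker algorithm itself (which is Perl-based, and whose internals are not described here); the marker table and scores are invented. Exhaustive search is exponential in the number of markers, which is exactly why an efficient dedicated tool is valuable.

```python
from itertools import combinations

def minimal_marker_set(table: dict[str, tuple]) -> tuple[int, ...] | None:
    """Smallest set of marker columns whose genotypes are unique per variety.

    `table` maps variety name -> tuple of coded marker scores. Brute-force:
    try every subset, smallest first, until all varieties are distinguished.
    """
    n_markers = len(next(iter(table.values())))
    for size in range(1, n_markers + 1):
        for subset in combinations(range(n_markers), size):
            profiles = [tuple(row[i] for i in subset) for row in table.values()]
            if len(set(profiles)) == len(profiles):  # all varieties distinct
                return subset
    return None

varieties = {
    "A": (1, 0, 2, 1),
    "B": (1, 1, 2, 0),
    "C": (0, 0, 2, 1),
    "D": (1, 0, 1, 1),
}
print(minimal_marker_set(varieties))  # (0, 1, 2): three of the four markers suffice
```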


Author(s):  
Ben Kei Daniel

Statistical and probabilistic inference depends on two major methods of reasoning: conventional (frequentist) and Bayesian probability. Frequentist methods are based mainly on the frequencies of numerous events, whereas Bayesian probability incorporates prior knowledge and subjective belief. Frequentist models of probability do not permit the introduction of prior knowledge into the calculations, traditionally to maintain the rigour of the scientific method and to prevent the introduction of extraneous data that might skew experimental results. However, there are times when prior knowledge would be a useful contribution to evaluating a situation. The Bayesian approach was proposed to help us reason in situations where prior knowledge is needed, especially under highly uncertain circumstances. This chapter provides an overview of the main principles underlying the Bayesian method and Bayesian belief networks. The ultimate goal is to provide the reader with the basic knowledge necessary for understanding the Bayesian belief network approach to building computational models. The chapter does not go into the more technical details of probability theory and Bayesian statistics; to make it accessible to a wide range of readers, some technical details are simplified.
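As a concrete illustration of how prior knowledge enters a Bayesian calculation, consider this minimal Beta-Binomial sketch, assuming SciPy. The prior and data are invented for illustration and are not drawn from the chapter.

```python
from scipy import stats

# Frequentist point estimate after observing 7 successes in 10 trials:
successes, trials = 7, 10
print(f"Frequentist estimate: {successes / trials:.2f}")  # 0.70

# Bayesian estimate with a Beta(2, 2) prior (mild prior belief that the
# rate is near 0.5). The Beta prior is conjugate to the binomial likelihood,
# so the posterior is simply Beta(2 + successes, 2 + failures).
posterior = stats.beta(2 + successes, 2 + (trials - successes))
lo, hi = posterior.interval(0.95)
print(f"Posterior mean: {posterior.mean():.2f}")  # 0.64, pulled toward the prior
print(f"95% credible interval: ({lo:.2f}, {hi:.2f})")
```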


2004 ◽  
Vol 18 (2) ◽  
pp. 279-287 ◽  
Author(s):  
Nigel J. C. Bailey ◽  
Matjaz Oven ◽  
Elaine Holmes ◽  
Meinhart H. Zenk ◽  
Jeremy K. Nicholson

High-field ¹H NMR spectroscopy has been employed, in conjunction with chemometric analysis, to obtain information regarding fluctuations in endogenous metabolic profiles of Crotalaria cobalticola plant cells following exposure to cobalt chloride. Such 'metabolomic' data analysis is often confounded by experimental, environmental, or genetic factors that are not correlated with the classifications of interest and serve only to complicate the extraction of meaningful answers from a dataset. This work demonstrates the application of data filtering to remove extraneous data that result from spectrometer variation rather than being correlated with the classes of interest. Samples from Crotalaria cobalticola suspension cell culture were analysed following exposure to cobalt chloride using two spectrometers. Removal of confounding data due to spectrometer variation resulted in clear separation between control and dosed classes. It was then possible to use the model to determine key changes in biochemical status caused by exposure to cobalt. Branched-chain amino acids, succinate, and the secondary metabolite precursors phenylalanine and tyrosine were all higher in the control samples, whilst choline, glutamate, alanine, and lactate were higher in the dosed samples.
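The authors' chemometric filtering method is not specified in this abstract, but the general idea of removing variation tied to the instrument rather than the biology can be sketched simply. The following assumes NumPy and uses per-spectrometer mean-centering on invented toy data as a stand-in for the actual filtering technique.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectra: 20 samples x 50 bins, a class effect in bin 10, plus a
# spectrometer offset that dwarfs the biological signal.
X = rng.normal(size=(20, 50))
classes = np.repeat([0, 1], 10)      # control vs dosed
spectrometer = np.tile([0, 1], 10)   # samples alternate between two instruments
X[classes == 1, 10] += 1.0           # true class-related signal
X[spectrometer == 1] += 3.0          # confounding instrument offset

# Filter out the confounder by centering each spectrometer's samples:
X_filtered = X.copy()
for s in np.unique(spectrometer):
    X_filtered[spectrometer == s] -= X_filtered[spectrometer == s].mean(axis=0)

# Before filtering, between-instrument variation dominates; after, it is gone
# while the class signal in bin 10 is preserved:
print(abs(X[spectrometer == 0].mean() - X[spectrometer == 1].mean()))
print(abs(X_filtered[spectrometer == 0].mean() - X_filtered[spectrometer == 1].mean()))
```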


Author(s):  
Catherine Dibble

Geographic information systems (GISs) are fairly good at handling three types of data: locational, attribute, and topological. Recent work holds promise for adding temporal data to this list as well (e.g., see Langran, 1992). Yet the unprecedentedly vast resources of geographically referenced data continue to outstrip our ability to derive meaningful information from such databases, despite dramatic improvements in computer processing power, algorithm efficiency, and parallel processing. In part this is because such research has emphasized improvements in processing efficiency rather than effectiveness. We humans are slow-minded compared with our silicon inventions; yet our analytical capabilities remain far more powerful, primarily because we have evolved elaborate cognitive infrastructures devoted to ensuring that we leverage our limited processing power by focusing our attention on the events and information most likely to be relevant. In GIS use, so far only human perception provides the requisite integration of spatial context, and human attention directs the determination of relevance and the selection of geographic features and related analyses. Understanding of spatial context and analytical purpose exists only in the minds of the humans working with the GIS or viewing the displays and maps created by such operations. We still extract information from our geographic data systems primarily through long series of relatively tedious and complex spatial operations, performed, or at least explicitly preprogrammed, by a human, in order to derive each answer. Human integration of analytical purpose with spatial and attribute contexts is perhaps the most essential yet most invisible component of any geographic analysis; it is also perhaps the most fundamental missing link in any GIS. Only humans can glance at a map of toxic waste dumps next to school yards, or oil spills upstream from fisheries, and recognize the potential threat of such proximity; human cartographers understand the importance of emphasizing either road or stream networks depending on the purpose of a map; humans understand that "near" operates at different scales for corner stores versus cities, or tropical jungle habitat versus open savannah. Given a GIS with the capability to deluge any inquiry with myriad layers of extraneous data, this natural human ability to filter data and manipulate only the meaningful elements is essential.


Nature ◽  
1990 ◽  
Vol 346 (6281) ◽  
pp. 215-215 ◽  
Author(s):  
John Maddox

1982 ◽  
Vol 23 (3) ◽  
pp. 395-412 ◽  
Author(s):  
David Henige

This essay treats the effects of acculturation on oral historical materials. Rather than addressing the issue as one of 'contamination', that is, as a question of extraneous data entering and distorting 'pristine' traditions, it is considered here as a facet of the larger question of cultural assimilation: a case of the old and familiar constantly confronting and responding to the new and strange. Seen in this way, oral data continuously adopt and adapt whatever new, relevant, and interesting materials come their way in ways not very different, though decidedly less visible, from those in which written data have always done so. This argument is illustrated by examples from various times and places, largely situations where missionaries, newly literate members, or colonial officials perceptibly influenced the historical views of societies on their way to becoming literate. In fact, this phenomenon seems widespread enough to justify advancing a model that can be tested against specific cases. For our purposes, this model begins with the first meeting of oral and literate cultures, although we can fairly assume that an infinity of similar but unrecorded meetings of oral cultures also resulted in change. After this initial impetus, the constraints of colonial rule, the exigencies of independence, and the aims of modern academic oral historiography each contributed in some measure to this process of ongoing change. As a result, historians, whether primarily interested in the reliability of oral data or in the process and effects of changes in them, must look to a wider range of sources than has been customary.

