Coping with ever larger problems, models, and data bases

1999 ◽  
Vol 39 (4) ◽  
pp. 1-11 ◽  
Author(s):  
M. B. Beck

Abstract Those who construct models, including models of the quality of the aquatic environment, are driven largely by the search for (theoretical) completeness in the products of their efforts. For if we know of something of potential relevance, and computational power is increasing, why should that something be left out? Those who use the results of such models are probably reassured by this imprimatur, of having supposedly based their decisions on the best available scientific evidence. Our models, and certainly those we would label “state-of-the-art”, seem destined always to get larger. Some observations on possible strategies for coping with this largeness, while yet making well reasoned and adequately buttressed decisions on how to manage the water environment, are the subject of this paper. Because it is so obvious, and because it has been the foundation of analytical enquiry for such a very long time, our point of departure is the classical procedure of disassembling the whole into its parts with subsequent re-assembly of the resulting part solutions into an overall solution. This continues to serve us well, at least in terms of pragmatic decision-making, but perhaps not in terms of reconciling the model with the field observations, i.e., in terms of model calibration. If the indivisible whole is to be addressed, and it is large, contemporary studies show that we shall have to shed an attachment to locating the single, best decision and be satisfied instead with having identified a multiplicity of acceptably good possibilities. If, in the face of an inevitable uncertainty, there is then a concern for reassurance regarding the robustness of a specific course of action (chosen from among the good possibilities), significant recent advances in the methods of global (as opposed to local) sensitivity analysis are indeed timely. Ultimately, however, no matter how large and seemingly complete the model, whether we trust its output is a very strong function of whether this outcome tallies with our mental image of the given system's behaviour. The paper argues that largeness must therefore be pruned through the application of appropriate methods of model simplification, through procedures aimed directly at this issue of promoting the generation, corroboration, and refutation of high-level conceptual insights and understanding. The paper closes with a brief discussion of two aspects of the role of field observations in evaluating a (large) model: quality assurance of that model in the absence of any data; and the previously somewhat under-estimated challenge of reconciling large models with high-volume data sets.
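To illustrate the contrast the abstract draws between local and global sensitivity analysis, the following is a minimal sketch of a regionalized (Monte Carlo) sensitivity analysis in the spirit of accepting "a multiplicity of acceptably good possibilities" rather than a single best fit. The toy first-order decay model, parameter ranges, and behaviour threshold are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a regionalized (global) sensitivity analysis.
# The toy first-order decay model, parameter ranges, and the
# "behavioural" threshold below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 10_000

# Sample candidate parameters over their full prior ranges (the global view).
k_decay = rng.uniform(0.05, 1.0, n_runs)      # decay rate (1/day)
c0      = rng.uniform(5.0, 15.0, n_runs)      # initial concentration (mg/L)

# Toy model: concentration remaining after 5 days of first-order decay.
t = 5.0
c_out = c0 * np.exp(-k_decay * t)

# Classify each run as behavioural if it matches a (hypothetical) observation.
behavioural = np.abs(c_out - 3.0) < 1.0

# A parameter matters globally if its behavioural and non-behavioural
# distributions separate; the Kolmogorov-Smirnov distance is one measure.
def ks_distance(a, b, grid):
    cdf_a = np.searchsorted(np.sort(a), grid) / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid) / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

for name, p in [("k_decay", k_decay), ("c0", c0)]:
    grid = np.linspace(p.min(), p.max(), 200)
    d = ks_distance(p[behavioural], p[~behavioural], grid)
    print(f"{name}: KS distance behavioural vs non-behavioural = {d:.2f}")
```

Because the whole parameter space is sampled rather than perturbed around a single calibrated point, the resulting sensitivity ranking does not depend on having first located one "best" parameterization.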

2003 ◽  
Vol 47 (2) ◽  
pp. 43-51 ◽  
Author(s):  
M.B. Beck ◽  
Z. Lin

In spite of a long history of automated instruments being deployed in the water industry, only recently has the difficulty of extracting timely insights from high-grade, high-volume data sets become an important problem. Put simply, it is now relatively easy to be “data-rich”, much less easy to become “information-rich”. Whether the availability of so many data arises from “technological push” or the “demand pull” of practical problem solving is not the subject of discussion. The paper focuses instead on two issues: first, an outline of a methodological framework, based largely on the algorithms of (on-line) recursive estimation and involving a sequence of transformations to which the data can be subjected; and second, presentation and discussion of the results of applying these transformations in a case study of a biological system of wastewater treatment. The principal conclusion is that the difficulty of transforming data into information may lie not so much in coping with the high sampling intensity enabled by automated monitoring networks, but in coming to terms with the complexity of the higher-order, multi-variable character of the data sets, i.e., in interpreting the interactions among many contemporaneously measured quantities.
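The methodological framework described here rests on on-line recursive estimation. As a generic illustration of that idea (not the specific algorithms or case-study variables used by the authors), the sketch below applies scalar recursive least squares with a forgetting factor to track a slowly drifting parameter in a stream of synthetic measurements.

```python
# Minimal recursive least squares (RLS) sketch with a forgetting factor,
# illustrating on-line recursive estimation; the regressor and the true
# drifting parameter below are synthetic assumptions, not case-study data.
import numpy as np

rng = np.random.default_rng(1)
n = 500
true_theta = 2.0 + 0.004 * np.arange(n)              # slowly drifting parameter
x = rng.uniform(0.5, 1.5, n)                         # regressor (e.g. influent load)
y = true_theta * x + 0.05 * rng.standard_normal(n)   # noisy observations

theta = 0.0          # current parameter estimate
P = 1e3              # estimate covariance (scalar case)
lam = 0.98           # forgetting factor < 1 discounts old data

estimates = np.empty(n)
for k in range(n):
    # Gain, parameter update, and covariance propagation of scalar RLS.
    g = P * x[k] / (lam + x[k] * P * x[k])
    theta = theta + g * (y[k] - x[k] * theta)
    P = (P - g * x[k] * P) / lam
    estimates[k] = theta

print(f"final estimate {estimates[-1]:.3f} vs true {true_theta[-1]:.3f}")
```

Because each new observation updates the estimate immediately, the drift of the parameter over time is itself informative, which is one way raw high-volume data can be transformed into interpretable information.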


2016 ◽  
Vol 17 (1) ◽  
pp. 99-110 ◽  
Author(s):  
Sean M. Tweedy ◽  
Emma M. Beckman ◽  
Leanne M. Johnston ◽  
Mark J. Connick

This paper investigates the premise that long-term engagement in performance-focussed sports training may lead to significantly enhanced clinical outcomes for people with neurological impairments (NI). The minimum volume of moderate-intensity activity recommended for good health is 450 MET.minutes/week, although evidence from the general population indicates that outcomes may be enhanced by completing up to five times this volume (2250 MET.minutes/week) at vigorous (rather than moderate) intensity. Most studies evaluating physical activity interventions for people with NI deliver low volumes (<450 MET.minutes/week), which may explain why evidence for some clinical outcomes is weak. Athletes (with or without NI) who aim to achieve high-level sports performance undertake an increasingly large volume of vigorous-intensity physical activity over several seasons. Evidence that people with NI may enhance clinical outcomes through performance-focussed sports training includes: evidence from studies investigating the benefits of high-intensity and/or high-volume clinical exercise; scientific evidence from elite/high-level athletes; and anecdotal evidence from Paralympic athlete testimonials. Additionally, sports participants with NI may accrue an important array of psychosocial benefits, including higher rates of employment and higher satisfaction with life and social integration. Rigorous, prospective, longitudinal clinical monitoring of people with NI undertaking performance-focussed sports training is required to evaluate its clinical utility.


1997 ◽  
Vol 3 (S2) ◽  
pp. 1109-1110
Author(s):  
D.C. McCord ◽  
S.K. Kennedy ◽  
D.G. Kritikos

Manual scanning electron microscope (SEM) analysis is historically considered to be slow and tedious, resulting in a low volume of data. This is due in large part to the mechanics of moving stage locations and recording image and spectral data. Conversely, high-volume data acquired using automated SEM analysis has been associated with the need for complex systems for data management and analysis. In addition, the proliferation of high-volume digital microscopy and its attendant “tonnage” of paper images has led to the desire for a “green” (filmless and hardcopy-reduced) operation. There are some classes of projects which are amenable to automated feature analysis: discrete features that are distinct from a background material. However, many projects require operator intervention in order to identify the region or points of interest. Yet these projects may also require that large data sets be acquired and analyzed for statistical rigor.
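The automated feature analysis referred to here, detecting discrete features distinct from a background, can be sketched in a few lines. The synthetic "micrograph", intensity threshold, and feature sizes below are illustrative assumptions only, not the authors' acquisition software.

```python
# Minimal sketch of automated feature analysis: threshold an image,
# label discrete features, and tabulate size statistics. The synthetic
# image and threshold are illustrative assumptions only.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
image = rng.normal(10.0, 2.0, (256, 256))        # background intensity
for _ in range(20):                              # add bright discrete features
    r, c = rng.integers(20, 236, 2)
    image[r - 3:r + 4, c - 3:c + 4] += 30.0

mask = image > 20.0                              # features distinct from background
labels, n_features = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))

print(f"{n_features} features found, mean area {sizes.mean():.1f} px")
```

Projects needing operator intervention differ only in how the mask or region of interest is chosen; the downstream labelling and statistics are the same, which is why large operator-assisted data sets can still be analyzed with statistical rigor.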


2018 ◽  
Author(s):  
R. Serafin ◽  
O. Gliko ◽  
S. J. Smith ◽  
F. Collman

Abstract Array tomography (AT) is a technique for acquiring high-resolution, highly multiplexed imagery from a series of ultra-thin sections arranged as an array on a rigid substrate. Utilizing AT as an imaging technique has required specialized microscope control, which is often time consuming and yields small-volume data sets. Here we present MosaicPlanner, an open-source software platform for light-level AT that streamlines the acquisition process and utilizes the general microscope control API provided by Micro-Manager, allowing AT data to be acquired on a wide variety of microscope hardware. This report provides a description of the MosaicPlanner software design and the platform improvements that were implemented to increase the acquisition speed of high-volume, multiplexed AT datasets.
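Much of what such acquisition software does reduces to planning a grid of overlapping stage positions that tiles a user-defined region of each section. The sketch below is a generic illustration of that tiling computation under assumed field-of-view sizes and overlap; it is not code from MosaicPlanner or the Micro-Manager API.

```python
# Minimal sketch of mosaic planning: compute overlapping stage positions
# (tile centres) covering a rectangular region of a section. Field-of-view
# size, overlap fraction, and region bounds are illustrative assumptions.
import numpy as np

def plan_mosaic(x0, y0, width, height, fov_x, fov_y, overlap=0.1):
    """Return an array of (x, y) stage positions whose fields of view
    cover the rectangle [x0, x0+width] x [y0, y0+height]."""
    step_x = fov_x * (1.0 - overlap)            # effective tile pitch
    step_y = fov_y * (1.0 - overlap)
    nx = int(np.ceil(width / step_x))
    ny = int(np.ceil(height / step_y))
    xs = x0 + fov_x / 2 + step_x * np.arange(nx)
    ys = y0 + fov_y / 2 + step_y * np.arange(ny)
    # Snake (boustrophedon) ordering keeps stage travel short between tiles.
    positions = []
    for j, y in enumerate(ys):
        row = xs if j % 2 == 0 else xs[::-1]
        positions.extend((x, y) for x in row)
    return np.array(positions)

# Example: a 2 x 1.5 mm region imaged with a 250 x 200 um field of view.
tiles = plan_mosaic(0.0, 0.0, 2000.0, 1500.0, 250.0, 200.0, overlap=0.1)
print(f"{len(tiles)} tiles planned")
```

Repeating such a plan across every section in an array, and handing the positions to whatever stage-control backend the microscope exposes, is what makes the acquisition hardware-agnostic.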


Cancers ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 86
Author(s):  
Mohit Kumar ◽  
Chellappagounder Thangavel ◽  
Richard C. Becker ◽  
Sakthivel Sadayappan

Immunotherapy is one of the most effective therapeutic options for cancer patients. Five specific classes of immunotherapy are in use: cell-based chimeric antigen receptor T-cells, checkpoint inhibitors, cancer vaccines, antibody-based targeted therapies, and oncolytic viruses. Immunotherapies can improve survival rates among cancer patients. At the same time, however, they can cause inflammation and promote adverse cardiac immune modulation and cardiac failure in some cancer patients as late as five to ten years after immunotherapy. In this review, we discuss cardiotoxicity associated with immunotherapy. We also propose using human-induced pluripotent stem cell-derived cardiomyocytes/cardiac-stromal progenitor cells and cardiac organoid cultures as innovative experimental model systems to (1) mimic clinical treatment, resulting in reproducible data, and (2) promote the identification of immunotherapy-induced biomarkers of both early and late cardiotoxicity. Finally, we introduce the integration of omics-derived high-volume data and cardiac biology as a pathway toward the discovery of new and efficient non-toxic immunotherapy.


Geophysics ◽  
1983 ◽  
Vol 48 (11) ◽  
pp. 1514-1524 ◽  
Author(s):  
Edip Baysal ◽  
Dan D. Kosloff ◽  
John W. C. Sherwood

Migration of stacked or zero‐offset sections is based on deriving the wave amplitude in space from wave field observations at the surface. Conventionally this calculation has been carried out through a depth extrapolation. We examine the alternative of carrying out the migration through a reverse time extrapolation. This approach may offer improvements over existing migration methods, especially in cases of steeply dipping structures with strong velocity contrasts. This migration method is tested using appropriate synthetic data sets.
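As a rough illustration of the reverse-time idea under the exploding-reflector model for zero-offset sections, the sketch below steps a 2-D acoustic finite-difference wave equation backwards in time while injecting the recorded surface section as a boundary condition; the wavefield at time zero is taken as the migrated image. The grid, velocity model, and "recorded" section are placeholder assumptions, not the authors' implementation.

```python
# Minimal sketch of reverse-time migration of a zero-offset section
# (exploding-reflector model). Velocity model, grid spacing, and the
# "recorded" surface data are placeholder assumptions for illustration.
import numpy as np

nx, nz, nt = 200, 200, 800
dx, dt = 10.0, 0.001                     # metres, seconds
v = np.full((nz, nx), 2000.0)            # constant half-velocity model (m/s)

# Placeholder recorded zero-offset section d(t, x) at the surface z = 0.
data = np.zeros((nt, nx))
data[400, 80:120] = 1.0                  # a single flat "event"

c2 = (v * dt / dx) ** 2                  # squared Courant number per cell
p_prev = np.zeros((nz, nx))              # wavefield at t + dt
p_curr = np.zeros((nz, nx))              # wavefield at t

# March backwards in time, injecting the recorded data along the surface.
for it in range(nt - 1, -1, -1):
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr)
    p_next = 2.0 * p_curr - p_prev + c2 * lap
    p_next[0, :] += data[it, :]          # boundary injection at z = 0
    p_prev, p_curr = p_next, p_next.copy() if False else (p_curr, p_next)[1], None
```

(For clarity, the last line of the loop should simply read `p_prev, p_curr = p_curr, p_next`, swapping the two time levels.) Because the full two-way wave equation is solved rather than a one-way depth extrapolator, steeply dipping reflectors and strong velocity contrasts are handled naturally, which is the advantage the abstract highlights.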


Author(s):  
B. Chandrasekaran

Abstract I was among those who proposed problem solving methods (PSMs) in the late 1970s and early 1980s as a knowledge-level description of strategies useful in building knowledge-based systems. This paper summarizes the evolution of my ideas in the last two decades. I start with a review of the original ideas. From an artificial intelligence (AI) point of view, it is not PSMs as such, which are essentially high-level design strategies for computation, that are interesting, but PSMs associated with tasks that have a relation to AI and cognition. They are also interesting with respect to cognitive architecture proposals such as Soar and ACT-R: PSMs are observed regularities in the use of knowledge that an exclusive focus on the architecture level might miss, the latter providing no vocabulary to talk about these regularities. PSMs in the original conception are closely connected to a specific view of knowledge: symbolic expressions represented in a repository and retrieved as needed. I join critics of this view, and maintain with them that most often knowledge is not retrieved from a base as much as constructed as needed. This criticism, however, raises the question of what is in memory that is not knowledge as traditionally conceived in AI, but can support the construction of knowledge in predicate–symbolic form. My recent proposal about cognition and multimodality offers a possible answer. In this view, much of memory consists of perceptual and kinesthetic images, which can be recalled during deliberation and from which internal perception can generate linguistic–symbolic knowledge. For example, from a mental image of a configuration of objects, numerous sentences can be constructed describing spatial relations between the objects. My work on diagrammatic reasoning is an implemented example of how this might work. These internal perceptions on imagistic representations are a new kind of PSM.


2018 ◽  
Vol 26 (3) ◽  
pp. 230949901880249 ◽  
Author(s):  
Kinh Luan Thanh Dang ◽  
Helen Badge ◽  
Ian A Harris

Background: Evaluating the effectiveness of total hip arthroplasty (THA) and total knee arthroplasty (TKA) often relies on accurate patient reporting of postoperative complications. Despite this, there is little research regarding the accuracy of patient reports. We aimed to determine the accuracy of patient-reported significant complications after THA and TKA. Methods: Patients were recruited prior to undergoing primary hip or knee arthroplasty at 19 high-volume hospitals. After surgery, patients were followed up via telephone interviews at 35, 90 and 365 days, recording surgical outcomes including readmission, reoperation and venous thromboembolism (VTE). Patient-reported complications were verified via medical record audits and liaison with surgeons, general practitioners or other health professionals. Surgical and demographic information and patient-reported and verified complications were entered into a database. Patient-reported and verified complications were compared for readmission, reoperation and VTE. Results: The sample included 150 of 1811 patients who reported a total of 242 significant complications. Of the 242 patient-reported complications, 224 (92.6%) were correct (true positive). The type of complication had variable levels of accuracy in patient reports. Readmission to hospital was accurately reported by 90.2% (129/143) of patients. Reoperation (including any manipulations under anaesthesia, joint washouts, reductions of dislocated joints and revisions) was accurately reported by 98.7% (75/76) of patients. VTE was accurately reported by 86.7% (20/23) of patients. Conclusion: A high level of accuracy in patient-reported complications was demonstrated following THA and TKA. Patient-reported complications may be reliably used for post-operative surveillance of joint replacement surgery.


Author(s):  
Agus Wibowo

Abstract: The implementation of guidance and counseling services should be based on the needs and problems of students, so that the services achieve maximum effectiveness. In reality, however, much of the guidance and counseling delivered in schools does not take this into account, so the same generic services are always used to address the problems students experience. Starting from this observation, this study examines the effectiveness of guidance and counseling whose implementation uses instrumentation application activities and data sets as the basis for delivering services. The method used is qualitative research, with the guidance and counseling (BK) teachers and students at SMA Negeri 1 Metro as subjects. Data were collected through interviews, observation and documentation. The results show that by utilizing instrumentation applications and data sets, counseling services achieve a high level of effectiveness. In carrying out the services, BK teachers can identify the problems and needs experienced by students, so that the assistance provided is better targeted and students' problems can be resolved optimally. Keywords: Guidance and Counseling, Instrumentation Applications, Data Sets


2020 ◽  
Author(s):  
Ying Bi ◽  
Bing Xue ◽  
Mengjie Zhang

Feature extraction is an essential process for image data dimensionality reduction and classification. However, feature extraction is very difficult and often requires human intervention. Genetic Programming (GP) can achieve automatic feature extraction and image classification, but the majority of existing methods extract low-level features from raw images without any image-related operations. Furthermore, work on the combination of image-related operators/descriptors in GP for feature extraction and image classification is limited. This paper proposes a multi-layer GP approach (MLGP) to perform automatic high-level feature extraction and classification. A new program structure, a new function set including a number of image operators/descriptors and two region detectors, and a new terminal set are designed in this approach. The performance of the proposed method is examined on six different data sets of varying difficulty and compared with five GP-based methods and 42 traditional image classification methods. Experimental results show that the proposed method achieves performance better than or comparable to these baseline methods. Further analysis of the example programs evolved by the proposed MLGP method reveals the good interpretability of MLGP and gives insight into how this method can effectively extract high-level features for image classification.
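To make the flavour of such an evolved program concrete, the sketch below hand-codes one hypothetical multi-layer program tree: a region is selected from the raw image, image-level operators are applied, and a small feature vector is returned. The specific operators, region coordinates, and parameters are illustrative assumptions and do not reproduce the MLGP function or terminal sets.

```python
# Hand-coded example of the kind of multi-layer program a GP system might
# evolve: region detection -> image operators -> feature construction.
# The operators, region, and parameters are illustrative assumptions only.
import numpy as np
from scipy import ndimage

def region(img, x, y, w, h):
    """Region-detector layer: crop a window from the raw image."""
    return img[y:y + h, x:x + w]

def gaussian(img, sigma):
    """Image-operator layer: Gaussian smoothing."""
    return ndimage.gaussian_filter(img, sigma)

def sobel_mag(img):
    """Image-operator layer: gradient magnitude via Sobel filters."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return np.hypot(gx, gy)

def hist_features(img, bins):
    """Feature-construction layer: normalised intensity histogram."""
    h, _ = np.histogram(img, bins=bins, range=(img.min(), img.max() + 1e-9))
    return h / max(h.sum(), 1)

# One possible evolved program, written out as nested calls on a test image.
rng = np.random.default_rng(3)
image = rng.random((64, 64))
features = hist_features(sobel_mag(gaussian(region(image, 8, 8, 48, 48), 1.5)), bins=8)
print("feature vector:", np.round(features, 3))
```

In an actual GP run, the structure and parameters of such a tree would be evolved against classification accuracy rather than written by hand, and the resulting tree can be read off directly, which is the interpretability the abstract refers to.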

