established technique — Recently Published Documents

TOTAL DOCUMENTS: 247 (five years: 84)
H-INDEX: 22 (five years: 3)
2022 · Vol 6 (POPL) · pp. 1-30
Author(s): Ningning Xie, Matthew Pickering, Andres Löh, Nicolas Wu, Jeremy Yallop, ...

Multi-stage programming using typed code quotation is an established technique for writing optimizing code generators with strong type-safety guarantees. Unfortunately, quotation in Haskell interacts poorly with type classes, making it difficult to write robust multi-stage programs. We study this unsound interaction and propose a resolution, staged type class constraints, which we formalize in a source calculus λ⇒ that elaborates into an explicit core calculus F. We show type soundness of both calculi, establishing that well-typed, well-staged source programs always elaborate to well-typed, well-staged core programs, and prove beta and eta rules for code quotations. Our design allows programmers to incorporate type classes into multi-stage programs with confidence. Although motivated by Haskell, it is also suitable as a foundation for other languages that support both overloading and quotation.
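
The staging structure the abstract refers to can be sketched outside Haskell. Python has no typed quotation, so the following only illustrates the two-stage idea (a generator runs now and emits specialized code to run later, here the classic staged power function); it does not capture the type-class interaction or the type-safety guarantees that are the paper's actual subject.

```python
# A minimal sketch of two-stage code generation in Python (an illustration
# of staging only; Haskell's typed quotations cannot be expressed here).

def gen_power(n: int) -> str:
    """Stage 0: emit source code computing x**n as an unrolled product."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    return f"lambda x: {body}"

# Stage 1: compile and run the generated code.
power5 = eval(gen_power(5))
assert power5(2) == 32
print(gen_power(5))   # lambda x: x * x * x * x * x
```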


2022 · Vol 22 (1)
Author(s): Anastasia Martin, Diane Nzelu, Annette Briley, Graham Tydeman, Andrew Shennan

Abstract Background: The rate of second stage caesarean section (CS) is rising, with associated increases in maternal and neonatal morbidity that may be related to impaction of the fetal head in the maternal pelvis. In the last 10 years, two devices have been developed to aid disimpaction and reduce these risks: the Fetal Pillow (FP) and the Tydeman Tube (TT). The aim of this study was to determine the distance of upward fetal head elevation achieved on a simulator for second stage CS using these two devices, compared to the established technique of per vaginum digital disimpaction by an assistant. Methods: We measured elevation of the fetal head achieved with the two devices (TT and FP), compared to digital elevation, on a second stage caesarean simulator (Desperate Debra™) set at three levels of severity. Elevation was measured both by a single operator experienced with use of the TT and FP and by multiple assistants with no previous experience of using either device. All measurements were blinded. Results: The trained user achieved greater elevation of the fetal head with the TT than with digital elevation at both moderate and high levels of severity (moderate: 30 mm vs 12.5 mm, p<0.001; most severe: 25 mm vs 10 mm, p<0.001). The FP provided elevation comparable to the digital technique at both settings (moderate: 10 mm vs 12.5 mm, p=0.149; severe: 10 mm vs 10 mm, p=0.44). With untrained users, elevation was also significantly greater with the TT than with digital elevation (20 mm vs 10 mm, p<0.01). However, digital disimpaction achieved significantly greater elevation than the FP (10 mm vs 0 mm, p<0.0001). Conclusion: On a simulator, with trained operators, the TT provided greater fetal head elevation than digital elevation and the FP. The FP achieved elevation similar to the digital technique, especially when the user was trained in the procedure.


2022 · Article ID 096228022110417
Author(s): Kian Wee Soh, Thomas Lumley, Cameron Walker, Michael O’Sullivan

In this paper, we present a new model averaging technique that can be applied in medical research. The dataset is first partitioned by the values of its categorical explanatory variables. Then for each partition, a model average is determined by minimising some form of squared errors, which could be the leave-one-out cross-validation errors. From our asymptotic optimality study and the results of simulations, we demonstrate under several high-level assumptions and modelling conditions that this model averaging procedure may outperform jackknife model averaging, which is a well-established technique. We also present an example where a cross-validation procedure does not work (that is, a zero-valued cross-validation error is obtained) when determining the weights for model averaging.
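
The jackknife model averaging baseline the abstract compares against can be sketched concretely: weights on the simplex are chosen to minimize the squared leave-one-out (LOO) cross-validation error of the averaged predictor. The sketch below shows only that baseline for linear candidate models; the paper's partition-by-categorical-variables step is omitted, and all names here are illustrative, not the authors' code.

```python
# A minimal sketch of jackknife model averaging (JMA) for linear models:
# choose simplex weights minimizing the squared LOO cross-validation error.
import numpy as np
from scipy.optimize import minimize

def loo_residuals(X, y):
    """LOO residuals of OLS via the hat-matrix shortcut e_i / (1 - h_ii)."""
    H = X @ np.linalg.pinv(X.T @ X) @ X.T
    e = y - H @ y
    return e / (1.0 - np.diag(H))

def jma_weights(designs, y):
    """Weights minimizing || sum_m w_m * e_loo_m ||^2 over the simplex."""
    E = np.column_stack([loo_residuals(X, y) for X in designs])
    M = E.shape[1]
    obj = lambda w: float(np.sum((E @ w) ** 2))
    cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    res = minimize(obj, np.full(M, 1.0 / M), bounds=[(0, 1)] * M,
                   constraints=cons, method="SLSQP")
    return res.x

# Toy usage: two nested candidate models; weight concentrates on the true one.
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)
X1 = np.column_stack([np.ones(n), x])   # intercept + slope
X2 = np.ones((n, 1))                    # intercept only
print(jma_weights([X1, X2], y))
```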


2022 · Vol 9 · Article ID 237437352110698
Author(s): Chung M Chan, Adam D. Lindsay, Andre R V Spiguel, C. Parker Gibbs, Mark T Scarborough

Rotationplasty is an established technique indicated as part of surgical reconstruction for certain patients with primary bone tumors around the knee who undergo tumor resection. There is considerable variation in surgeons' application of rotationplasty, as well as in acceptance of the procedure by patients who may be candidates for it. We qualitatively studied the decision-making process of families of patients who had undergone rotationplasty, conducting semi-structured interviews with 4 patients and their families. Thematic analysis identified the following themes as important in the decision-making process: (1) the desire for good information sources, (2) finding value in meeting with other patients who had faced a similar decision, (3) prioritizing function over cosmesis, (4) a desire to limit the need for revision surgeries, and (5) accepting that no surgical option offers a return to normalcy. Physicians and patients faced with a similar decision can benefit from a better understanding of this process and from normalization of the anxieties and concerns they may experience.


2021 · Vol 26 (4) · pp. 82
Author(s): Farrukh Jamal, Ali H. Abuzaid, Muhammad H. Tahir, Muhammad Arslan Nasir, Sadaf Khan, ...

In this article, the Burr III distribution is proposed with a significantly improved functional form. This new modification enhances the flexibility of the classical distribution, with the ability to model all shapes of the hazard rate function, including increasing, decreasing, bathtub, upside-down bathtub, and nearly constant. Some of its elementary properties, such as the rth moment, sth incomplete moment, moment generating function, skewness, kurtosis, mode, ith order statistic, and stochastic ordering, are presented in a clear and concise manner. The well-established technique of maximum likelihood is employed to estimate the model parameters. Middle-censoring, a modern and general censoring scheme, is also considered. The efficacy of the proposed model is demonstrated through three applications consisting of complete and censored samples.
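
For orientation, maximum-likelihood fitting of the *classical* Burr III family (CDF F(x) = (1 + x^-c)^-d) is a few lines in SciPy; the article's modified functional form is not reproduced here, so the sketch below fits only the base distribution that the modification extends.

```python
# A minimal sketch: MLE for the classical Burr III distribution.
# SciPy's `burr` is the Type III family with F(x) = (1 + x**-c)**-d.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = stats.burr.rvs(c=2.0, d=1.5, loc=0, scale=1, size=500,
                        random_state=rng)

# Fit the shape parameters (c, d) by MLE, holding location/scale fixed.
c_hat, d_hat, loc_hat, scale_hat = stats.burr.fit(sample, floc=0, fscale=1)
print(f"c = {c_hat:.3f}, d = {d_hat:.3f}")
```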


2021
Author(s): Vojtech Mlynsky, Michal Janecek, Petra Kuhrova, Thorben Frohlking, Michal Otyepka, ...

Atomistic molecular dynamics (MD) simulations are an established technique for investigating RNA structural dynamics. Despite continuous development, contemporary RNA simulations still suffer from suboptimal accuracy of the empirical potentials (force fields, ffs) and from sampling limitations. Development of efficient enhanced sampling techniques is important for two reasons: first, they make it possible to overcome the sampling limitations and, second, they can be used to quantify ff imbalances, provided they reach sufficient convergence. Here, we study two RNA tetraloops (TLs), namely the GAGA and UUCG motifs. We perform extensive folding simulations and calculate folding free energies (ΔGfold) with the aim of comparing different enhanced sampling techniques and testing several modifications of the nonbonded terms extending the AMBER OL3 RNA ff. We demonstrate that replica exchange solute tempering (REST2) simulations with 12-16 replicas show no sign of convergence even when extended to a time scale of 120 μs per replica. However, the combination of REST2 with well-tempered metadynamics (ST-MetaD) achieves good convergence on a time scale of 5-10 μs per replica, improving the sampling efficiency by at least two orders of magnitude. Effects of ff modifications on ΔGfold energies were initially explored by a reweighting approach and then validated by new simulations. We tested several manually prepared variants of the gHBfix potential, which improve the stability of the native state of both TLs by up to ~2 kcal/mol. This is sufficient to conveniently stabilize the folded GAGA TL, while the UUCG TL still remains under-stabilized. Appropriate adjustment of the van der Waals parameters for the C-H...O5' base-phosphate interaction is also shown to be capable of further stabilizing the native states of both TLs by ~0.6 kcal/mol.
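
The reweighting step mentioned above has a simple core: frames sampled with the original ff are reweighted by exp(-ΔU/kT), where ΔU is the energy change under the modified ff, and the folded-state population is re-estimated from the weights. The sketch below shows that idea only; it is an assumed toy implementation, not the authors' code, and all numbers in it are illustrative.

```python
# A minimal sketch of exponential (Zwanzig-type) reweighting: estimate how a
# force-field modification shifts the folding free energy from frames
# already sampled with the original potential.
import numpy as np

KT = 0.593  # kcal/mol at ~298 K

def reweighted_dg_fold(is_folded, dU, kT=KT):
    """dG_fold = -kT * ln(p_fold / p_unfold) under the modified potential.

    is_folded : boolean array, one entry per frame (folded-state indicator)
    dU        : U_modified - U_original per frame, in kcal/mol
    """
    w = np.exp(-(dU - dU.min()) / kT)    # constant shift for stability
    p_fold = w[is_folded].sum() / w.sum()
    return -kT * np.log(p_fold / (1.0 - p_fold))

# Toy usage: a modification lowering the energy of folded frames by 0.5.
rng = np.random.default_rng(2)
folded = rng.random(10_000) < 0.2        # ~20% folded before the change
dU = np.where(folded, -0.5, 0.0) + rng.normal(0, 0.1, 10_000)
print(f"dG_fold ~ {reweighted_dg_fold(folded, dU):.2f} kcal/mol")
```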


2021
Author(s): Midori Tanaka, Yuji Matsumoto, Tatsuya Imabayashi, Takuya Kawahara, Takaaki Tsuchida

Abstract Background: Cryobiopsy is an established technique that yields larger and higher-quality samples than does a forceps biopsy. However, it remains underutilised in the diagnosis of peripheral pulmonary lesions (PPLs), mainly because of difficulties in handling conventional cryoprobes. A recently introduced single-use cryoprobe, with a smaller diameter and more flexibility than conventional ones, may improve the diagnostic ability for PPLs. We conducted this prospective study to evaluate the feasibility of transbronchial cryobiopsy in the diagnosis of PPLs using a new 1.7-mm cryoprobe. Methods: The study included patients with PPLs less than 30 mm in diameter scheduled to undergo bronchoscopy. All procedures were performed using a combination of virtual bronchoscopic navigation, radial endobronchial ultrasound (R-EBUS) and X-ray fluoroscopy, and all samples were collected using the cryoprobe alone. Thereafter, we assessed the diagnostic outcomes and safety profiles. Results: A total of 50 patients were enrolled and underwent cryobiopsy. The median lesion size was 20.8 mm (range, 8.2–29.6 mm), and a negative bronchus sign was seen in 34% of lesions. The diagnostic yield was 94% (95% confidence interval, 83.5–98.8%). A positive bronchus sign had a significantly higher diagnostic yield than did a negative bronchus sign (100% vs. 82.4%; P=0.035). The yield was achieved regardless of other variables, including lesion size, location, and R-EBUS findings. The major complications were mild and moderate bleeding, in 28% and 62% of patients, respectively. Pneumothorax was identified in one patient. Conclusion: Transbronchial cryobiopsy using the new 1.7-mm cryoprobe is a feasible procedure that has the potential to increase the diagnostic accuracy for PPLs. Trial Registration: Japan Registry of Clinical Trials, jRCT1032200065. Registered 8 July 2020, https://jrct.niph.go.jp/en-latest-detail/jRCT1032200065


Water · 2021 · Vol 13 (21) · pp. 3042
Author(s): Andrew Folkard

Thermal microstructure profiling is an established technique for investigating turbulent mixing and stratification in lakes and oceans. However, it provides only quasi-instantaneous, 1-D snapshots. Other approaches to measuring these phenomena exist, but each has logistic and/or quality weaknesses. Hence, turbulent mixing and stratification processes remain greatly under-sampled. This paper contributes to addressing this problem by presenting a novel analysis of thermal microstructure profiles, focusing on their multi-scale stratification structure. Profiles taken in two small lakes using a Self-Contained Automated Micro-Profiler (SCAMP) were analysed. For each profile, the buoyancy frequency (N), Thorpe scales (LT), and coefficient of vertical turbulent diffusivity (KZ) were determined. To characterize the multi-scale stratification, profiles of d²T/dz² at a spectrum of scales were calculated and the number of turning points in them counted. Plotting these counts against scale gave pseudo-spectra, which were characterized by the index D of their power-law regression lines. Scale-dependent correlations of D with N, LT and KZ were found, suggesting that this approach may be useful for providing alternative estimates of the efficiency of turbulent mixing, and measures of KZ averaged over longer terms than current methods provide. Testing these potential uses will require comparison of field measurements of D with time-integrated KZ values and numerical simulations.
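
The pseudo-spectrum construction can be made concrete: smooth the temperature profile at each scale, take d²T/dz², count the turning points (local extrema) of the result, and regress log(count) on log(scale) to get the index D. The sketch below is an assumed implementation of that recipe, with boxcar smoothing and toy data standing in for details the abstract does not specify.

```python
# A minimal sketch of the multi-scale turning-point analysis
# (assumed implementation; smoothing choice and data are illustrative).
import numpy as np

def turning_points(p):
    """Number of local extrema: sign changes of the first difference."""
    return int(np.sum(np.diff(np.sign(np.diff(p))) != 0))

def pseudo_spectrum_index(T, dz, scales):
    counts = []
    for s in scales:
        k = max(3, int(round(s / dz)))                 # boxcar width, samples
        Ts = np.convolve(T, np.ones(k) / k, mode="valid")
        curvature = np.gradient(np.gradient(Ts, dz), dz)   # d2T/dz2
        counts.append(turning_points(curvature))
    D, _ = np.polyfit(np.log(scales), np.log(counts), 1)   # power-law index
    return D, counts

# Toy usage: a linearly stratified profile with small-scale structure.
rng = np.random.default_rng(3)
z = np.arange(0, 20, 0.01)                 # depth (m) at 1 cm resolution
T = 20 - 0.3 * z + 0.05 * rng.normal(size=z.size)
D, counts = pseudo_spectrum_index(T, dz=0.01,
                                  scales=[0.05, 0.1, 0.2, 0.5, 1.0])
print(f"power-law index D = {D:.2f}")
```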


2021
Author(s): Sunil Dahiya, Akansha Tyagi, Ankur Mandal, Thomas Pfeifer, Kamal P. Singh

Abstract: White light interferometry is a well-established technique with diverse precision applications. However, conventional interferometers such as the Michelson, Mach-Zehnder or Linnik are large in size, demand tedious alignment to obtain white light fringes, require noise isolation to achieve sub-nanometric stability and, importantly, exhibit unbalanced dispersion, causing uncertainty in the absolute zero-delay reference. Here, we demonstrate an ultrathin white light interferometer enabling picometer resolution by exploiting the wavefront division of a broadband incoherent light beam after transmission through a pair of micrometer-thin identical glass plates. Spatial overlap between the two diffracted split wavefronts readily produces high-contrast, stable white light fringes with an unambiguous reference to the absolute zero path-delay position. The colored fringes evolve as one of the ultrathin plates is rotated to tune the interferometer with picometric precision over a range of tens of µm. Our theoretical analysis validates the formation of fringes and highlights the self-calibration of the interferometer for picoscale measurements. We demonstrate measurement of the coherence lengths of several broadband incoherent sources, as small as a few micrometers, with picoscale precision. Furthermore, we propose a versatile double-pass configuration using the ultrathin interferometer, enabling a sample cavity for additional applications in probing dynamical properties of matter.
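
Why rotating one thin plate tunes the delay so finely can be seen from the textbook plane-parallel-plate formula (an assumed model for illustration, not taken from the paper): the extra optical path through a plate of thickness t and index n at incidence angle θ is t·(√(n² − sin²θ) − cos θ), which changes only quadratically near normal incidence.

```python
# A minimal sketch: differential delay from tilting a micrometer-thin plate,
# using the standard plane-parallel-plate OPD formula (assumed model).
import numpy as np

def plate_opd(t_um, n, theta_rad):
    """Optical path added by a tilted plate, relative to no plate (um)."""
    return t_um * (np.sqrt(n**2 - np.sin(theta_rad)**2) - np.cos(theta_rad))

t, n = 30.0, 1.5                        # 30 um plate, BK7-like index (toy)
for deg in (0.0, 0.1, 0.2, 1.0):
    delta = plate_opd(t, n, np.radians(deg)) - plate_opd(t, n, 0.0)
    print(f"{deg:4.1f} deg -> differential delay {delta * 1e6:8.1f} pm")
    # 0.1 deg of tilt gives only ~15 pm of extra path: picometric tuning.
```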


2021 · Vol 2021 · pp. 1-11
Author(s): Lei Lei, Yafei Song

Error-Correcting Output Codes (ECOC) has become a well-known, established technique for multiclass classification due to its simplicity and efficiency. Each binary split groups different original classes into two metasubclasses, and a base classifier is learned for each split. A classifier is noncompetent for an instance when the instance's real class does not belong to either of the metasubclasses used to learn that classifier. Reducing the error caused by noncompetent classifiers while keeping diversity large enough is an urgent problem in ECOC classification. A weighted decoding strategy can reduce the error caused by this noncompetence contradiction by relearning the weight coefficient matrix. To this end, a new weighted decoding strategy that takes classifier competence reliability into consideration is presented in this paper; it is suitable for any coding matrix. Support Vector Data Description (SVDD) is applied to compute the distance from an instance to each metasubclass. This distance reflects the competence reliability and is fused as the weight in the base classifier combination. In so doing, the effect of the competent classifiers on classification is reinforced, while the bias induced by the noncompetent ones is decreased. Because they reflect competence reliability, the weights of the classifiers change dynamically for each instance, which accords with classification practice. Statistical simulations on benchmark datasets indicate that the proposed algorithm outperforms other methods and offers a new approach to the noncompetence problem.
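
The decoding step being reweighted looks like this in miniature: each class has a codeword row in the coding matrix, each dichotomizer's output is compared against that row, and per-instance weights scale each dichotomizer's vote. The paper derives those weights from SVDD distances to the metasubclasses; in the sketch below a placeholder competence vector stands in for that step (an assumption, not the authors' method), and the coding matrix is a toy example.

```python
# A minimal sketch of ECOC weighted decoding with per-instance competence
# weights (placeholder competence values; SVDD step omitted).
import numpy as np

# Coding matrix: rows = classes, columns = binary dichotomizers.
# Entries: +1 / -1 = class belongs to a metasubclass, 0 = class not used.
M = np.array([[+1, +1,  0],
              [-1,  0, +1],
              [ 0, -1, -1]])

def weighted_decode(outputs, competence):
    """Return the class whose codeword best matches the weighted outputs.

    outputs    : dichotomizer outputs in [-1, +1] for one instance
    competence : nonnegative per-dichotomizer competence weights
    """
    w = competence / competence.sum()
    scores = (M * outputs) @ w      # high score = weighted codeword agreement
    return int(np.argmax(scores))

outputs = np.array([0.9, -0.2, 0.7])     # dichotomizer predictions
competence = np.array([0.8, 0.1, 0.9])   # e.g. 1 / (1 + SVDD distance)
print(weighted_decode(outputs, competence))   # -> 0
```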

