rigorous testing: Recently Published Documents

TOTAL DOCUMENTS: 172 (five years: 74)
H-INDEX: 20 (five years: 8)

2022 ◽  
pp. 163-182
Author(s):  
Kamalendu Pal

Agile software development methodologies are attracting attention from academics and practitioners for planning and managing software projects. As an agile methodology, eXtreme Programming (XP) challenges conventional wisdom regarding software system development processes and practices. To work efficiently in current software development practice, which is characterized by fuzzy requirements, XP moves away from document-centric operation toward people-centric management. In an XP-based software project, customers play an essential role, with multiple responsibilities such as driving the project, gathering requirements ('user stories'), and exercising quality control (acceptance testing). In addition, customers must liaise with external project stakeholders (e.g., funding authorities, end-users) while maintaining the trust of the development team and the wider business. The success of such software project management practices relies on the quality of the results of each development stage, obtained through rigorous testing. This chapter describes three characteristics of XP project management: the customer role, software testing feedback, and learning.


2021 ◽  
Vol 40 (1) ◽  
Author(s):  
Nadeem Iqbal

This study examines the anchoring effect on portfolio return volatility in the case of the KSE-30. Market anomalies such as overreaction and under-reaction are driven by a variety of psychological causes. The use of anchors, or baseline values, known as the anchoring effect, contributes to market under-reaction and overreaction. This research used nearness to the 52-week high and nearness to the historical high as proxies for under-reaction and overreaction, respectively. On the KSE-30, the findings revealed that proximity to the 52-week high positively predicts future returns, whereas proximity to the historical high negatively predicts future returns. The KSE-30 was used for rigorous testing. Three macroeconomic variables, the exchange rate, the inflation rate, and the interest rate, were included as control variables to provide a more robust model with stronger predictive capacity. The findings revealed that proximity to the 52-week high, proximity to the historical high, and the macroeconomic factors together had a forecast capacity of around 62 percent. To capture volatility clustering, the GARCH(1,1) model was used to measure the association between future and past returns. The results show a first-order autoregressive structure in the GARCH(1,1) model. The findings also show that predictive capacity decreases when the study's variables are moved from daily to annual periods.
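
As a hedged illustration of the volatility model referenced above, the sketch below computes the GARCH(1,1) conditional-variance recursion, sigma2[t] = omega + alpha * eps[t-1]^2 + beta * sigma2[t-1]. The parameter values and the simulated return series are placeholders, not estimates or data from the KSE-30 study.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) model.

    sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1],
    where eps are demeaned returns. Parameter values here are illustrative only.
    """
    eps = returns - returns.mean()
    sigma2 = np.empty_like(eps)
    sigma2[0] = eps.var()  # initialize at the unconditional variance
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Toy usage with simulated daily returns (placeholder data, not KSE-30).
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, size=250)
sigma2 = garch11_variance(r, omega=1e-6, alpha=0.08, beta=0.90)
```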


Author(s):  
Alfred A. Zinn ◽  
Mina Izadjoo ◽  
Hosan Kim ◽  
Rachel L. Brody ◽  
Robert R. Roth ◽  
...  

The continued proliferation of superbugs in hospitals and the coronavirus disease 2019 (COVID-19) pandemic have created an acute worldwide demand for sustained, broad-spectrum pathogen suppression in households, hospitals, and public spaces. In response, we have created a highly active, self-sterilizing copper configuration capable of inactivating a wide range of bacteria and viruses in 30-60 seconds. The highly active material destroys pathogens faster than any conventional copper configuration and acts as quickly as alcohol wipes and hand sanitizers. Unlike the latter, our copper material does not release volatile compounds or leave harmful chemical residues, maintains its antimicrobial efficacy over sustained use, and is shelf stable for years. We have performed rigorous testing in accordance with guidelines from U.S. regulatory agencies and believe that the material could offer broad-spectrum, non-selective defense against most microbes via integration into masks, protective equipment, and various forms of surface coatings.


2021 ◽  
Author(s):  
Matt Amos ◽  
Ushnish Sengupta ◽  
Paul Young ◽  
J. Hosking

Continuous historical datasets of vertically resolved stratospheric ozone support the case for ozone recovery, are necessary for running offline models, and increase understanding of the impacts of ozone on the wider atmospheric system. Vertically resolved ozone datasets are typically constructed from multiple satellite, sonde, and ground-based measurements that do not provide continuous coverage. As a result, several methods have been used to infill these gaps, most commonly relying on regression against observed time series. However, these existing methods provide either low-accuracy infilling (especially over polar regions), unphysical extrapolation, or an incomplete estimation of uncertainty. To address these methodological shortcomings, we used and further developed an infilling framework that fuses observations with output from an ensemble of chemistry-climate models within a Bayesian neural network. We used this deep learning framework to produce a continuous record of vertically resolved ozone with uncertainty estimates. Under rigorous testing, the infilling framework extrapolated and interpolated skillfully and maintained realistic interannual variability, owing to the inclusion of physically and chemically realistic models. This framework, and the ozone dataset it produced, enable a more thorough investigation of vertically resolved trends throughout the atmosphere.
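
To make the idea of uncertainty-aware infilling concrete, here is a minimal sketch of a network that maps ensemble chemistry-climate model values to an observed target and reports a spread via Monte Carlo dropout. This is a simple stand-in, not the authors' Bayesian neural network; the class and function names, layer sizes, and dropout approximation are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class MCDropoutInfiller(nn.Module):
    """Toy uncertainty-aware infilling network (illustrative only).

    Inputs are ozone values from an ensemble of chemistry-climate models at one
    grid point and level; the target is the observed ozone where measurements
    exist. Dropout is kept active at prediction time so repeated forward passes
    give a spread that serves as a rough uncertainty estimate.
    """
    def __init__(self, n_models, hidden=64, p_drop=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_models, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def predict_with_uncertainty(model, x, n_samples=100):
    model.train()  # keep dropout active for Monte Carlo sampling
    with torch.no_grad():
        draws = torch.stack([model(x) for _ in range(n_samples)])
    return draws.mean(dim=0), draws.std(dim=0)  # infilled value and its spread
```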


2021 ◽  
Vol 16 (12) ◽  
pp. C12006
Author(s):  
Y. Allard ◽  
G. De Lentdecker ◽  
D. Hohov ◽  
F. Robert ◽  
A. Safa ◽  
...  

Abstract To build the silicon trackers of modern and future high-luminosity collider experiments, thousands of silicon strip modules have to be produced and tested. The modules in new trackers must work reliably, typically for 5–10 years or more, under harsh irradiation conditions, as it will be impossible to replace a failing module once it is installed inside the detector. This means that reliable and rigorous testing of strip modules and their components is mandatory. To sustain the production throughput, several modules must be testable in parallel. For this reason, a fast, reliable, scalable, and cost-effective production QC test bench has to be designed and implemented. For the CV and IV measurements of sensors and modules, we are developing a low-cost (less than 500 €) integrated electronic board that will be scaled up to ten channels to measure DUTs in parallel. This work describes the design of the IV/CV board, the calibration procedure used to increase the accuracy of the current and capacitance measurements (for which a special calibration dipole board based on tight-tolerance capacitors and resistors has been designed), and future development plans.
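
For readers unfamiliar with this kind of calibration, the sketch below fits a per-channel linear gain/offset correction against tight-tolerance reference capacitors and applies it to raw readings. The reference values, raw readings, and function names are hypothetical placeholders, not data or procedures from the board described in the abstract.

```python
import numpy as np

# Hypothetical calibration step for one channel of a CV/IV board: fit a linear
# gain/offset correction against reference capacitors, then apply it to raw
# readings from a device under test.
ref_capacitance_pf = np.array([10.0, 47.0, 100.0, 220.0, 470.0])  # reference parts
raw_reading_pf     = np.array([10.6, 48.9, 103.1, 225.8, 481.0])  # board output

gain, offset = np.polyfit(raw_reading_pf, ref_capacitance_pf, deg=1)

def calibrate(raw_pf):
    """Map a raw capacitance reading to the calibrated value for this channel."""
    return gain * raw_pf + offset

print(calibrate(150.0))  # corrected capacitance for a device under test
```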


2021 ◽  
Vol 17 ◽  
Author(s):  
Rajesh Basnet ◽  
Til Bahadur Basnet ◽  
Buddha Bahadur Basnet ◽  
Sandhya Khadka ◽  
Sanjeep Sapkota

Background: The spread of the novel coronavirus identified in 2019, the causative agent of the viral pneumonia first documented in Wuhan, brought about a global public health crisis. The best solution to overcome this pandemic is developing suitable and effective vaccines and therapeutics. However, discovering and creating a new drug is a lengthy process requiring rigorous testing and validation. Objective: Although many newly discovered and repurposed COVID-19 drugs are under clinical trial, more emphasis should be given to research on nanoparticle (NP)-based COVID-19 medicines, which could improve the efficacy of antiviral drugs and reduce their side effects. The use of NPs as carriers can reduce the frequency and duration of drug ingestion, enhance the effectiveness of approved antiviral therapeutics, and overcome their limitations, such as low bioavailability. They can thus play a crucial role in fighting the COVID-19 pandemic. In this regard, nanotechnology provides new opportunities to develop strategies for preventing, diagnosing, and treating COVID-19. Conclusion: This review highlighted the importance of nanomaterial (NM)-based technical solutions for antiviral drugs, in the form of nanotherapeutics, for addressing the SARS-CoV-2 emergency.


2021 ◽  
Vol 72 ◽  
Author(s):  
Anthony Corso ◽  
Robert Moss ◽  
Mark Koren ◽  
Ritchie Lee ◽  
Mykel Kochenderfer

Autonomous cyber-physical systems (CPS) can improve safety and efficiency for safety-critical applications, but require rigorous testing before deployment. The complexity of these systems often precludes the use of formal verification, and real-world testing can be too dangerous during development. Therefore, simulation-based techniques have been developed that treat the system under test as a black box operating in a simulated environment. Safety validation tasks include finding disturbances in the environment that cause the system to fail (falsification), finding the most likely failure, and estimating the probability that the system fails. Motivated by the prevalence of safety-critical artificial intelligence, this work provides a survey of state-of-the-art safety validation techniques for CPS, with a focus on applied algorithms and their modifications for the safety validation problem. We present and discuss algorithms in the domains of optimization, path planning, reinforcement learning, and importance sampling. Problem decomposition techniques are presented to help scale algorithms to the large state spaces that are common for CPS. A brief overview of safety-critical applications is given, including autonomous vehicles and aircraft collision avoidance systems. Finally, we present a survey of existing academic and commercially available safety validation tools.
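
As a hedged illustration of the falsification task described above (not an algorithm from the survey), the sketch below treats a toy simulator as a black box and searches its disturbance space for an input that drives a robustness value below zero, i.e. a failure. The simulator, disturbance bounds, and robustness measure are invented for the example; a real tool would replace the random search with optimization, reinforcement learning, or importance sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(disturbance):
    """Hypothetical black-box simulator: returns a robustness value; < 0 is a failure.

    The 'system' keeps a braking margin that shrinks as the disturbance
    (sensor noise scale and lead-vehicle deceleration) grows.
    """
    noise_scale, decel = disturbance
    margin = 2.0 - 1.5 * decel - rng.normal(0.0, noise_scale)
    return margin

def falsify(n_trials=1000):
    """Random-search falsification over the disturbance space."""
    best = (np.inf, None)
    for _ in range(n_trials):
        d = rng.uniform([0.0, 0.0], [1.0, 1.5])  # disturbance bounds (assumed)
        rho = simulate(d)
        if rho < best[0]:
            best = (rho, d)
        if rho < 0:
            return d, rho        # counterexample found
    return best[1], best[0]      # otherwise, most critical case seen

counterexample, robustness = falsify()
```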


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Yilin Miao ◽  
Zhewei Liu ◽  
Xiangning Wu ◽  
Jie Gao

After the production of printed circuit boards (PCBs), PCB manufacturers need to remove defective boards by conducting rigorous testing, but manual inspection is time-consuming and laborious. Many PCB factories employ automatic optical inspection (AOI), but this pixel-based comparison method has a high false alarm rate, thus requiring intensive human inspection to determine whether the alarms it raises correspond to true or pseudo defects. In this paper, we propose a new cost-sensitive deep learning model, the cost-sensitive Siamese network (CSS-Net), based on a Siamese network, transfer learning, and threshold moving, to distinguish between true and pseudo PCB defects as a cost-sensitive classification problem. We use optimization algorithms such as NSGA-II to determine the optimal cost-sensitive threshold. Results show that our model improves true-defect prediction accuracy to 97.60% while maintaining relatively high pseudo-defect prediction accuracy, 61.24%, in a real-production scenario. Furthermore, our model outperforms state-of-the-art competitor models on other comprehensive cost-sensitive metrics, with, on average, a 33.32% shorter training time.
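
To illustrate the threshold-moving idea in isolation, the sketch below picks the decision threshold that minimizes total misclassification cost for a given cost ratio. It is a simplified stand-in for the CSS-Net decision stage: the costs, scores, and labels are placeholders, and the paper's NSGA-II search is replaced by a plain grid scan over candidate thresholds.

```python
import numpy as np

# Assumed cost structure: missing a truly defective board is far more expensive
# than re-inspecting a good one. Values are illustrative only.
COST_MISS_TRUE_DEFECT = 10.0
COST_FALSE_ALARM      = 1.0

def pick_threshold(scores, labels, candidates=np.linspace(0.01, 0.99, 99)):
    """Return the decision threshold with the lowest total misclassification cost."""
    best_t, best_cost = 0.5, np.inf
    for t in candidates:
        pred_defect = scores >= t
        misses       = np.sum(~pred_defect & (labels == 1))
        false_alarms = np.sum(pred_defect & (labels == 0))
        cost = COST_MISS_TRUE_DEFECT * misses + COST_FALSE_ALARM * false_alarms
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Toy usage with synthetic scores from a hypothetical defect classifier.
rng = np.random.default_rng(2)
labels = rng.integers(0, 2, size=500)
scores = np.clip(labels * 0.6 + rng.normal(0.3, 0.25, size=500), 0, 1)
threshold = pick_threshold(scores, labels)
```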


2021 ◽  
pp. SP512-2021-124
Author(s):  
Isabel Patricia Montañez

Abstract Icehouses are the less common climate state on Earth, and thus it is notable that the longest-lived (∼370 to 260 Ma) and possibly most extensive and intense of icehouse periods spanned the Carboniferous Period. Mid- to high-latitude glaciogenic deposits reveal a dynamic glaciation-deglaciation history, with ice waxing and waning from multiple ice centers and possible transcontinental ice sheets during the apex of glaciation. New high-precision U-Pb ages confirm a hypothesized west-to-east progression of glaciation through the icehouse, but reveal that its demise occurred as a series of synchronous and widespread deglaciations. The dynamic glaciation history, along with repeated perturbations to Earth System components, is archived in the low-latitude stratigraphic record, revealing similarities to the Cenozoic icehouse. Further assessing the phasing between climate, oceanographic, and biotic changes during the icehouse requires additional chronostratigraphic constraints. Astrochronology permits the deciphering of time at high resolution in the late Paleozoic record, as has been demonstrated in deep- and quiet-water deposits. Rigorous testing for astronomical forcing in low-latitude cyclothemic successions, which have a direct link to higher-latitude glaciogenic records through inferred glacioeustasy, will, however, require a comprehensive approach that integrates new techniques with further optimization and additional independent age constraints, given the challenges associated with shallow-marine to terrestrial records.
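
As a very rough illustration of the kind of spectral test used to look for astronomical forcing, the sketch below runs a Lomb-Scargle periodogram on a synthetic proxy series that mixes 405-kyr and 100-kyr eccentricity-like cycles with noise. The data, sampling, and period range are invented; real cyclothemic records require detrending, astronomical tuning, and significance testing well beyond this sketch.

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic "proxy" with 405-kyr and 100-kyr cycles plus noise (placeholder data).
t_kyr = np.linspace(0.0, 4000.0, 800)   # real records are often unevenly sampled
proxy = (np.sin(2 * np.pi * t_kyr / 405.0)
         + 0.5 * np.sin(2 * np.pi * t_kyr / 100.0)
         + 0.3 * np.random.default_rng(3).normal(size=t_kyr.size))

periods = np.linspace(20.0, 600.0, 2000)          # candidate periods in kyr
angular_freqs = 2 * np.pi / periods
power = lombscargle(t_kyr, proxy - proxy.mean(), angular_freqs, normalize=True)
print(periods[np.argmax(power)])                  # should peak near 405 kyr
```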


2021 ◽  
Vol 24 (1) ◽  
Author(s):  
Serguei S. Komissarov

Abstract A recently proposed simple approximate theory of snow machining is applied to the modelling of several basic manoeuvres of alpine skiing: fall-line side-slipping, traversing, and the hockey stop. The results agree with skiing practice and explain the abnormally high friction reported in previous field studies. They also lay the foundation for future rigorous testing of the theory, which will determine its accuracy and limits of applicability.

