processing times
Recently Published Documents


TOTAL DOCUMENTS: 1542 (FIVE YEARS: 359)

H-INDEX: 53 (FIVE YEARS: 7)

Author(s):  
Oscar Danilo Montoya ◽  
Carlos Alberto Ramírez-Vanegas ◽  
Luis Fernando Grisales-Noreña

The problem of parametric estimation in photovoltaic (PV) modules considering manufacturer information is addressed in this research from the perspective of combinatorial optimization. Using the data sheet provided by the PV manufacturer, a non-linear, non-convex optimization problem is formulated that contains information regarding the maximum power, open-circuit, and short-circuit points. To estimate the three parameters of the PV model (i.e., the diode ideality factor (a) and the parallel and series resistances (Rp and Rs)), the crow search algorithm (CSA) is employed, a metaheuristic optimization technique inspired by the behavior of crows searching for food deposits. The CSA allows the exploration and exploitation of the solution space through a simple evolution rule derived from the classical PSO method. Numerical simulations reveal the effectiveness and robustness of the CSA in estimating these parameters, with objective function values lower than 1 × 10⁻²⁸ and processing times under 2 s. All numerical simulations were developed in MATLAB 2020a and compared with the sine-cosine and vortex search algorithms recently reported in the literature.
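As a rough illustration of the evolution rule described above, here is a minimal CSA sketch; the quadratic objective, bounds, and parameter values are purely illustrative stand-ins for the paper's actual PV mismatch function:

```python
import numpy as np

def crow_search(obj, bounds, n_crows=20, iters=200, fl=2.0, ap=0.1, seed=0):
    """Minimal crow search algorithm (CSA) sketch.

    obj    : objective mapping an (n_dims,) array to a scalar (to minimize)
    bounds : (low, high) arrays defining the search box
    fl     : flight length; ap : awareness probability
    """
    rng = np.random.default_rng(seed)
    low, high = map(np.asarray, bounds)
    x = rng.uniform(low, high, size=(n_crows, low.size))  # crow positions
    mem = x.copy()                                        # each crow's best find
    fmem = np.array([obj(xi) for xi in mem])
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.integers(n_crows)          # crow i follows a random crow j
            if rng.random() >= ap:             # j unaware: move toward j's memory
                xi = x[i] + rng.random() * fl * (mem[j] - x[i])
            else:                              # j aware: jump to a random point
                xi = rng.uniform(low, high)
            xi = np.clip(xi, low, high)
            fi = obj(xi)
            x[i] = xi
            if fi < fmem[i]:                   # update memory on improvement
                mem[i], fmem[i] = xi, fi
    best = np.argmin(fmem)
    return mem[best], fmem[best]

# Toy quadratic in place of the PV mismatch objective; the "true" values of
# (a, Rp, Rs) below are hypothetical, chosen only to exercise the search.
target = np.array([1.3, 50.0, 0.2])
obj = lambda v: np.sum((v - target) ** 2)
sol, fval = crow_search(obj, (np.array([1.0, 10.0, 0.01]),
                              np.array([2.0, 100.0, 1.0])))
```

The memory update mirrors the "hiding place" metaphor: a crow only overwrites its memory when the new position improves its own best, so the population's best-so-far never degrades.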


Drones ◽  
2022 ◽  
Vol 6 (1) ◽  
pp. 24
Author(s):  
Taleatha Pell ◽  
Joan Y. Q. Li ◽  
Karen E. Joyce

With the increased availability of low-cost, off-the-shelf drone platforms, drone data have become easy to capture and are now a key component of environmental assessments and monitoring. Once the data are collected, many structure-from-motion (SfM) photogrammetry software options are available to pre-process them into digital elevation models (DEMs) and orthomosaics for further environmental analysis. However, not all software packages are created equal, nor are their outputs. Here, we evaluated the workflows and output products of four desktop SfM packages (AgiSoft Metashape, Correlator3D, Pix4Dmapper, WebODM) across five input datasets representing various ecosystems. We considered processing times, output file characteristics, colour representation of orthomosaics, geographic shift, visual artefacts, and digital surface model (DSM) elevation values. No single software package emerged as the "winner" across all metrics, but we hope our results help demystify the differences between the options, allowing users to make an informed decision about which software and parameters to select for their specific application. Our comparisons highlight some of the challenges that may arise when comparing datasets processed using different parameters and different software packages, demonstrating the need to provide metadata describing processing workflows.


2022 ◽  
Author(s):  
Giorgio D‘Ettorre ◽  
Marco Farronato ◽  
Ettore Candida ◽  
Vincenzo Quinzi ◽  
Cristina Grippaudo

ABSTRACT Objectives To compare three-dimensional facial scans obtained by stereophotogrammetry with those from two smartphone applications that use the TrueDepth system, a structured-light technology. Materials and Methods Facial scans of 40 subjects were acquired with three different systems. The 3dMDtrio Stereophotogrammetry System (3dMD, Atlanta, Ga) was compared with a smartphone (iPhone Xs; Apple, Cupertino, Calif) equipped with either the Bellus3D Face Application (version 1.6.11; Bellus3D Inc, Campbell, Calif) or Capture (version 1.2.5; Standard Cyborg Inc, San Francisco, Calif). Image acquisition and processing times were recorded. The surface-to-surface deviation and the distances between 18 landmarks from the 3dMD reference images to those acquired with Bellus3D or Capture were measured. Results Capturing and processing times with the smartphone applications were considerably longer than with the 3dMD system. The surface-to-surface deviation analysis between Bellus3D and 3dMD showed overlap percentages of 80.01% ± 5.92% and 56.62% ± 7.65% within discrepancy ranges of 1 mm and 0.5 mm, respectively. Images from Capture showed overlap percentages of 81.40% ± 9.59% and 56.45% ± 11.62% within the same ranges. Conclusions Face image acquisition with the 3dMD device is fast and accurate, but the system is bulky and expensive. The new smartphone applications combined with the TrueDepth sensor show promising results, although they demand more care from the operator and more compliance from the patient because of the longer acquisition time. Their greatest advantages are cost and portability.
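The overlap percentages reported above reduce to counting surface points whose deviation falls within a tolerance. A minimal sketch, using synthetic deviations in place of real registered-mesh distances:

```python
import numpy as np

def overlap_percentage(deviations_mm, tol_mm):
    """Share (in %) of surface points whose absolute surface-to-surface
    deviation is within +/- tol_mm. In a real pipeline the deviations come
    from comparing two registered meshes; here they are simulated."""
    d = np.abs(np.asarray(deviations_mm))
    return 100.0 * np.mean(d <= tol_mm)

# Synthetic per-vertex deviations standing in for a smartphone-vs-3dMD comparison
rng = np.random.default_rng(1)
dev = rng.normal(0.0, 0.8, size=10_000)     # signed deviations in mm
within_1mm = overlap_percentage(dev, 1.0)
within_05mm = overlap_percentage(dev, 0.5)
```

By construction the 0.5 mm overlap is always at most the 1 mm overlap, matching the ordering of the percentages in the abstract.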


Author(s):  
Julio Mar-Ortiz ◽  
Alex J. Ruiz Torres ◽  
Belarmino Adenso-Díaz

Abstract This paper explores the characteristics of solutions when scheduling jobs in a shop with parallel machines. Three classical objective functions were considered: makespan, total completion time, and total tardiness. These three criteria were combined in pairs, resulting in three bi-objective formulations, which were solved using the ε-constraint method to obtain a Pareto frontier for each pair. The objective of the research is to evaluate the Pareto set of efficient schedules in order to characterize the solution sets. The characterization is based on two performance metrics: the span of the objective functions' values for the points on the frontier and their closeness to the ideal point. Results that consider four experimental factors indicate that when the makespan is one of the objective functions, the range of processing times among jobs has a significant influence on the characteristics of the Pareto frontier, while the slack of due dates is the most relevant factor when total tardiness is considered.
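The ε-constraint idea can be illustrated on a tiny makespan vs. total-completion-time instance; the brute-force enumeration below stands in for the exact solver used in such studies and only works at toy sizes:

```python
from itertools import product

def schedule_objs(assign, p, m):
    """For a machine assignment of each job, return (makespan, total completion
    time), ordering jobs on each machine by SPT (optimal for total completion
    time and irrelevant for makespan)."""
    loads = [sorted(p[j] for j in range(len(p)) if assign[j] == k)
             for k in range(m)]
    makespan = max((sum(jobs) for jobs in loads), default=0)
    total_c = sum(sum(jobs[:i + 1]) for jobs in loads for i in range(len(jobs)))
    return makespan, total_c

def pareto_via_eps_constraint(p, m):
    """eps-constraint sketch: minimize makespan subject to total completion
    time <= eps, sweeping eps over attainable values, then prune dominated
    points. Enumeration replaces the exact solver; tiny instances only."""
    pts = {schedule_objs(a, p, m) for a in product(range(m), repeat=len(p))}
    frontier = []
    for eps in sorted({tc for _, tc in pts}):
        best = min((pt for pt in pts if pt[1] <= eps), key=lambda pt: pt[0])
        if best not in frontier:
            frontier.append(best)
    frontier = [pt for pt in frontier
                if not any(q[0] <= pt[0] and q[1] <= pt[1] and q != pt
                           for q in frontier)]
    return sorted(frontier)

# Hypothetical 5-job, 2-machine instance
front = pareto_via_eps_constraint([3, 5, 7, 4, 2], m=2)
```

On this particular instance the two objectives happen not to conflict, so the frontier collapses to a single ideal point; with wider processing-time ranges (the experimental factor highlighted above) the frontier typically spreads out.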


2022 ◽  
Vol 17 (01) ◽  
pp. C01047
Author(s):  
E. Fabbrica ◽  
M. Carminati ◽  
D. Butta ◽  
M. Uslenghi ◽  
M. Fiorini ◽  
...  

Abstract We present the design of the first prototype of MIRA (MIcro-channel plate Readout ASIC), which has been designed to read out Micro-Channel Plates (MCPs), in particular for UV spectroscopy. MIRA will be able to detect the cloud of electrons generated by each photon interacting with the MCP, sustaining high local and global count rates to fully exploit the MCP intrinsic dynamic range with low dead time. The main rationale that guided the electronics design is the reduction of the input Equivalent Noise Charge (ENC) in order to allow operation with lower MCP gain, thus improving the MCP lifetime, a crucial aspect for long missions in space. MIRA features two selectable analog processing times, 133 ns or 280 ns (i.e., fast mode or slow mode), granting a count rate per pixel of 100 kcps. Moreover, it achieves an equivalent noise charge of ENC = 17 e⁻ rms. A spatial resolution of 35 μm and operation with zero readout-induced dead time are targeted. The low-noise, high-count-rate, and high-spatial-resolution requirements are addressed while keeping a compact pixel size (35 μm × 35 μm) for a total of 32 × 32 pixels in a 2 mm × 2 mm ASIC area. In this work, the ASIC design is described.


2021 ◽  
Vol 12 (1) ◽  
pp. 197
Author(s):  
Chunxia Zhang ◽  
Xiaoli Wei ◽  
Sang-Woon Kim

This paper empirically evaluates two kinds of features, extracted with traditional statistical methods and with convolutional neural networks (CNNs), respectively, in order to improve the performance of seismic patch image classification. In the latter case, feature vectors, named "CNN-features", were extracted from one trained CNN model and then used to train existing classifiers, such as support vector machines. To train the CNN model, a transfer learning technique was applied, using synthetic seismic patch data in the source domain and real-world patch data in the target domain. The experimental results show that CNN-features lead to improvements in classification performance. By analyzing data complexity measures, the CNN-features are found to have the strongest discriminant capabilities. Furthermore, the transfer learning technique alleviates the problems of long processing times and the lack of learning data.


2021 ◽  
Vol 11 (1) ◽  
pp. 94
Author(s):  
Jiyoung Kim ◽  
Choongrak Kim ◽  
Song Yi Park

The purpose of this retrospective observational study was to identify the impact of COVID-19 on emergency medical services (EMS) processing times and transfers to the emergency department (ED) among patients with acute stroke symptoms before and during the COVID-19 pandemic in Busan, South Korea. The total number of patients using EMS for acute stroke symptoms decreased by 8.2%, from 1570 in the pre-COVID-19 period to 1441 during the COVID-19 period. The median (interquartile range) EMS processing time was 29.0 (23–37) min in the pre-COVID-19 period and 33.0 (25–41) min in the COVID-19 period (p < 0.001). There was a significant decrease in the number of patients transferred to an ED with a comprehensive stroke center (CSC) (6.37%, p < 0.001) and an increase in the number transferred to two nearby EDs (2.77%, p = 0.018; 3.22%, p < 0.001). During the COVID-19 pandemic, EMS processing time increased, and the number of patients transferred to an ED with a CSC was significantly reduced and dispersed. COVID-19 appears to have affected the stroke chain of survival by hindering entry into EDs with stroke centers, the gateway for acute stroke patients.
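The median (interquartile range) figures quoted above can be computed for any sample as follows; the numbers here are illustrative, not the study's raw data:

```python
import numpy as np

def median_iqr(minutes):
    """Median and interquartile range, reported as in the study:
    median (Q1-Q3)."""
    q1, med, q3 = np.percentile(minutes, [25, 50, 75])
    return med, (q1, q3)

# Illustrative EMS processing times in minutes (hypothetical sample)
pre = np.array([23, 25, 28, 29, 30, 33, 37])
med, (q1, q3) = median_iqr(pre)
```

NumPy's default linear interpolation between order statistics is used here; clinical papers sometimes use other quartile conventions, which can shift Q1/Q3 slightly on small samples.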


2021 ◽  
pp. 1-10
Author(s):  
Claudio Gutiérrez-Soto ◽  
Tatiana Gutiérrez-Bunster ◽  
Guillermo Fuentes

Big Data is a generic term that involves the storing and processing of a large amount of data. This large amount of data has been promoted by technologies such as mobile applications, the Internet of Things (IoT), and Geographic Information Systems (GIS). An example of a GIS is a Spatio-Temporal Database (STDB). Pattern searching on STDB is a complex problem to address in terms of processing time. Nowadays, high information processing capacity is available everywhere. Nevertheless, pattern searching on STDB using traditional Data Mining techniques is complex because the data incorporate the temporal aspect, while traditional pattern-searching techniques, such as time series, do not incorporate the spatial aspect. For this reason, traditional algorithms based on association rules must be adapted to find these patterns, and most of them take exponential processing times. In this paper, a new efficient algorithm (named Minus-F1) to look for periodic patterns on STDB is presented. Our algorithm is compared with the Apriori, Max-Subpattern, and PPA algorithms on synthetic and real STDB, and the worst-case computational complexity of each algorithm is presented. Empirical results show that Minus-F1 is not only more efficient than Apriori, Max-Subpattern, and PPA, but also exhibits polynomial behavior.
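Periodic pattern mining of the kind compared here typically starts from frequent length-1 patterns per offset within a period (the building block of Apriori- and Max-Subpattern-style miners). A minimal sketch on a toy event sequence:

```python
from collections import Counter

def periodic_1_patterns(seq, period, min_conf=0.6):
    """Frequent length-1 periodic patterns: for each offset within the period,
    return the symbols that occur there in at least min_conf of the cycles.
    This is only the first step of the miners named above, not Minus-F1 itself."""
    n_cycles = len(seq) // period
    patterns = {}
    for off in range(period):
        counts = Counter(seq[c * period + off] for c in range(n_cycles))
        freq = {s: n / n_cycles for s, n in counts.items()
                if n / n_cycles >= min_conf}
        if freq:
            patterns[off] = freq
    return patterns

# Toy event sequence with period 3: 'a' always at offset 0,
# 'b' and 'c' at offsets 1 and 2 in 4 of the 5 cycles
seq = "abcabdabcaecabc"
patterns = periodic_1_patterns(seq, period=3)
```

Longer candidate patterns are then assembled from these frequent singletons, which is exactly where naive Apriori-style enumeration can blow up exponentially in the worst case.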


Author(s):  
Nilay Noyan ◽  
Gábor Rudolf ◽  
Miguel Lejeune

We introduce a new class of distributionally robust optimization problems under decision-dependent ambiguity sets. In particular, as our ambiguity sets, we consider balls centered on a decision-dependent probability distribution. The balls are based on a class of earth mover's distances that includes both the total variation distance and the Wasserstein metrics. We discuss the main computational challenges in solving the problems of interest and provide an overview of various settings leading to tractable formulations. Some of the arising side results, such as the mathematical programming expressions for robustified risk measures in a discrete space, are also of independent interest. Finally, we rely on state-of-the-art modeling techniques from machine scheduling and humanitarian logistics to arrive at potentially practical applications, and present a numerical study for a novel risk-averse scheduling problem with controllable processing times. Summary of Contribution: In this study, we introduce a new class of optimization problems that simultaneously address distributional and decision-dependent uncertainty. We present a unified modeling framework along with a discussion on possible ways to specify the key model components, and discuss the main computational challenges in solving the complex problems of interest. Special care has been devoted to identifying the settings and problem classes where these challenges can be mitigated. In particular, we provide model reformulation results, including mathematical programming expressions for robustified risk measures, and describe how these results can be utilized to obtain tractable formulations for specific applied problems from the fields of humanitarian logistics and machine scheduling. Toward demonstrating the value of the modeling approach and investigating the performance of the proposed mixed-integer linear programming formulations, we conduct a computational study on a novel risk-averse machine scheduling problem with controllable processing times. We derive insights regarding the decision-making impact of our modeling approach and key parameter choices.
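A minimal statement of the type of problem described, with illustrative notation (the paper's exact formulation may differ):

```latex
\min_{x \in X} \; \sup_{Q \in \mathcal{B}_\varepsilon(P_x)} \mathbb{E}_{Q}\big[ f(x,\xi) \big],
\qquad
\mathcal{B}_\varepsilon(P_x) = \bigl\{\, Q : d(Q, P_x) \le \varepsilon \,\bigr\},
```

where $P_x$ is the decision-dependent center distribution, $d$ is an earth mover's distance (a family covering both the total variation distance and the Wasserstein metrics), and $\varepsilon$ is the ambiguity-ball radius. The decision dependence of $P_x$ is what distinguishes this class from standard Wasserstein DRO.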


2021 ◽  
Vol 11 (24) ◽  
pp. 12104
Author(s):  
Adela Cristina Martinez Urango ◽  
Monique Martins Strieder ◽  
Eric Keven Silva ◽  
Maria Angela A. Meireles

This study aimed to examine the impact of combining acoustic energy at nominal powers of 100, 200, 300, and 400 W with moderate heat processing at 40, 50, and 60 °C on the extraction of phytochemical compounds from Foeniculum vulgare. Thermosonication processing, based on high-intensity ultrasound combined with an external heat source, can enhance the extraction of soluble solids from plant material. However, the excessive temperature increase generated by the two energy sources during thermosonication treatment may degrade thermolabile bioactive compounds. Regardless of the temperature condition, fennel extracts obtained at 400 W presented lower total phenolic content (TPC) than those obtained at 300 W; the cavitation heat and mechanical stress at 400 W may have degraded the phenolic compounds. Thus, the best extraction condition was 300 W and 60 °C. Under this condition, the fennel extract presented the highest TPC (3670 ± 67 µg GAE/g) and antioxidant activity as determined by the DPPH and ABTS methods (1195 ± 16 µg TE/g and 2543.12 ± 0.00 µg TE/g, respectively). Thermosonication can be an innovative technique for extracting phytochemicals because it provides good results in shorter processing times, with 73% and 88% less energy consumption than the percolation and Soxhlet techniques, respectively.

