Multi-source transfer learning of time series in cyclical manufacturing

2019 ◽  
Vol 31 (3) ◽  
pp. 777-787 ◽  
Author(s):  
Werner Zellinger ◽  
Thomas Grubinger ◽  
Michael Zwick ◽  
Edwin Lughofer ◽  
Holger Schöner ◽  
...  

Abstract. This paper describes a new transfer learning method for modeling sensor time series that follow multiple different distributions, e.g. originating from multiple different tool settings. The method aims at removing distribution-specific information before the modeling of the individual time series takes place. This is done by mapping the data to a new space such that the representations of the different distributions are aligned. Domain knowledge is incorporated by means of corresponding parameters, e.g. the physical dimensions of the tool settings. Results on a real-world industrial manufacturing problem show that our method significantly improves the performance of regression models on time series following previously unseen distributions.
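A minimal sketch of the central idea, aligning the representations of different distributions before a shared model is fit. The moment-matching mapping and the variable names below are illustrative assumptions, not the authors' exact transformation:

```python
import numpy as np

def align_to_reference(X, X_ref):
    """Map a source-domain series into the space of a reference domain by
    matching first and second moments (an illustrative stand-in for the
    alignment mapping, not the paper's exact method)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0) + 1e-12
    return (X - mu) / sigma * X_ref.std(axis=0) + X_ref.mean(axis=0)

# Two tool settings producing differently distributed sensor readings:
rng = np.random.default_rng(0)
X_setting_a = rng.normal(2.0, 0.5, size=(500, 3))  # reference setting
X_setting_b = rng.normal(5.0, 1.5, size=(500, 3))  # shifted setting
X_b_aligned = align_to_reference(X_setting_b, X_setting_a)
# A single regression model can now be trained on the aligned data.
```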

2014 ◽  
Vol 14 (22) ◽  
pp. 12251-12270 ◽  
Author(s):  
V. De Bock ◽  
H. De Backer ◽  
R. Van Malderen ◽  
A. Mangold ◽  
A. Delcloo

Abstract. At Uccle, Belgium, a long time series (1991–2013) of simultaneous measurements of erythemal ultraviolet (UV) dose (Sery), global solar radiation (Sg), total ozone column (QO3) and aerosol optical depth (τaer, at 320.1 nm) is available, which allows for an extensive study of the changes in these variables over time. Linear trends were determined for the different monthly anomaly time series. Sery, Sg and QO3 increase by 7, 4 and 3% per decade, respectively. τaer shows an insignificant negative trend of −8% per decade. These trends agree with results found in the literature for sites at comparable latitudes. A change-point analysis, which determines whether there is a significant change in the mean of a time series, is applied to the monthly anomaly time series of the variables. Only for Sery and QO3 was a significant change point present, around February 1998 and March 1998, respectively. The change point in QO3 corresponds with results found in the literature, where the change in ozone levels around 1997 is attributed to the recovery of ozone. A multiple linear regression (MLR) analysis is applied to the data in order to study the influence of Sg, QO3 and τaer on Sery. Together these parameters are able to explain 94% of the variation in Sery. Most of the variation (56%) in Sery is explained by Sg. The regression model performs well, with a slight tendency to underestimate the measured Sery values and with a mean absolute bias error (MABE) of 18%. However, in winter, negative Sery values are modeled. Applying the MLR to the individual seasons solves this issue. The seasonal models have an adjusted R2 value higher than 0.8, and the correlation between modeled and measured Sery values is higher than 0.9 for each season. The summer model gives the best performance, with a mean absolute error of only 6%. However, the seasonal regression models do not always represent reality, where an increase in Sery is accompanied by an increase in QO3 and a decrease in τaer. In all seasonal models, Sg is the factor that contributes most to the variation in Sery, so there is no doubt about the necessity of including this factor in the regression models. The individual contribution of τaer to Sery is very low, and for this reason it seems unnecessary to include τaer in the MLR analysis. Including QO3, however, is justified, as it increases the adjusted R2 and decreases the MABE of the model.
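The MLR step can be illustrated with a short sketch; the synthetic anomalies, coefficients, and column names below are assumptions for illustration, not the Uccle data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic monthly anomalies standing in for the 1991-2013 Uccle series.
rng = np.random.default_rng(1)
n = 276  # 23 years of monthly values
df = pd.DataFrame({
    "S_g":  rng.normal(0, 1, n),   # global solar radiation anomaly
    "Q_O3": rng.normal(0, 1, n),   # total ozone column anomaly
    "tau":  rng.normal(0, 1, n),   # aerosol optical depth anomaly
})
df["S_ery"] = 0.8 * df["S_g"] - 0.3 * df["Q_O3"] - 0.05 * df["tau"] \
              + rng.normal(0, 0.3, n)

# Regress erythemal UV dose on the three explanatory series.
X = sm.add_constant(df[["S_g", "Q_O3", "tau"]])
model = sm.OLS(df["S_ery"], X).fit()
print(model.rsquared_adj)  # cf. the adjusted R^2 > 0.8 of seasonal models
```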


Author(s):  
T. Brent Baker ◽  
Raymond G. Deardorf

A combination of statistical approaches is used to develop near- to mid-range ridership and revenue forecasting models for Washington State Ferries for use in quarterly budget updates. Econometric regression models use historical and forecast trends in state economic and demographic variables to project systemwide ridership in six fare categories under different fare scenarios. Time series models are used to project ferry ridership at the individual route level for the same six fare categories. The sum of the time series route forecasts is then calibrated to the econometric systemwide totals to yield unconstrained ridership forecasts by route and fare category. A capacity constraint model handles cases where the demand for vehicle travel exceeds vessel capacity by generating ridership ceilings for different service scenarios. Finally, the appropriate fares are applied to the ridership projections to arrive at revenue forecasts over a 10-year horizon.
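The calibration, capacity-constraint, and fare steps can be sketched in a few lines; proportional scaling and all numbers below are illustrative assumptions, not Washington State Ferries' actual procedure:

```python
import numpy as np

def calibrate_routes(route_forecasts, systemwide_total):
    """Scale route-level forecasts so they sum to the econometric
    systemwide total (proportional calibration, assumed here)."""
    route_forecasts = np.asarray(route_forecasts, dtype=float)
    return route_forecasts * systemwide_total / route_forecasts.sum()

def apply_capacity_ceiling(ridership, vessel_capacity):
    """Cap unconstrained ridership at the vessel-capacity ceiling."""
    return np.minimum(ridership, vessel_capacity)

unconstrained = calibrate_routes([120_000, 80_000, 40_000], 250_000)
constrained = apply_capacity_ceiling(unconstrained, 110_000)
revenue = constrained * np.array([8.50, 8.50, 12.00])  # assumed fares
```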


2010 ◽  
Vol 33 (2-3) ◽  
pp. 159-160 ◽  
Author(s):  
S. Brian Hood ◽  
Benjamin J. Lovett

Abstract. Cramer et al.'s account of comorbidity comes with a substantive philosophical view concerning the nature of psychological disorders. Although the network account is responsive to problems with extant approaches, it faces several practical and conceptual challenges of its own, especially in cases where individual differences in network structures require the analysis of intra-individual time-series data.


2021 ◽  
Vol 39 (2) ◽  
pp. 202-225
Author(s):  
Roger T. Dean ◽  
David Bulger ◽  
Andrew J. Milne

The production of rhythms with non-isochronous beats has been studied for relatively few rhythms. We therefore assess the reproduction of most well-formed looped rhythms comprising K = 2–11 cues (a uniform piano tone, indicating where participants should tap) and N = 3–13 isochronous pulses (a uniform cymbal). Each rhythm had two different cue interonset intervals. We expected that many of the rhythms would be difficult to tap because of ambiguous non-isochronous beats and syncopations, and that complexity and asymmetry would predict performance. 111 participants tapped 91 rhythms, each heard over 129 pulses, starting as soon as they could. Whereas tap–cue concordance in prior studies was generally >>90%, here only 52.2% of cues received a temporally congruent tap, and only 63% of taps coincided with a cue. Mean tap asynchrony was only −2 ms (whereas for non-musicians this value is usually c. −50 ms). Performances improved as rhythms progressed and were repeated, but precision varied substantially between participants and rhythms. Performances were autoregressive, and mixed-effects cross-sectional time series analyses retaining the integrity of all the individual time series revealed that performance worsened as the complexity features K, N, and cue interonset interval entropy increased. Performance also worsened with increasing R, the long:short (L:s) cue interval ratio of each rhythm (indexing both complexity and asymmetry). Rhythm evenness and balance, and whether N was divisible by 2 or 3, were not useful predictors. Tap velocities positively predicted cue fulfilment. Our data indicate that studying a greater diversity of rhythms can broaden our impression of rhythm cognition.
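A minimal sketch of a mixed-effects model with an autoregressive (lagged) term of the kind described; the synthetic data-generating process and all variable names are assumptions for illustration, not the study's data or fitted model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic per-trial tapping performance with a participant random effect.
rng = np.random.default_rng(2)
rows = []
for p in range(20):                      # participants
    skill = rng.normal(0, 0.05)          # random participant effect
    acc = 0.5                            # accuracy, arbitrary units
    for _ in range(40):                  # trials
        K = int(rng.integers(2, 12))     # number of cues
        N = int(rng.integers(3, 14))     # number of pulses
        prev = acc
        acc = 0.3 + 0.5 * prev - 0.01 * K - 0.005 * N + skill \
              + rng.normal(0, 0.05)
        rows.append({"participant": p, "lag1": prev, "K": K, "N": N,
                     "accuracy": acc})
df = pd.DataFrame(rows)

# Random intercepts per participant; the lagged term carries the
# autoregressive component of each performance series.
fit = smf.mixedlm("accuracy ~ lag1 + K + N", df,
                  groups=df["participant"]).fit()
print(fit.summary())
```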


2020 ◽  
Author(s):  
Pathikkumar Patel ◽  
Bhargav Lad ◽  
Jinan Fiaidhi

During the last few years, RNN models have been used extensively, and they have proven well suited to sequence and text data. RNNs have achieved state-of-the-art performance in several applications such as text classification, sequence-to-sequence modelling and time series forecasting. In this article we review different machine learning and deep learning approaches for text data and look at the results obtained with these methods. This work also explores the use of transfer learning in NLP and how it affects the performance of models on a specific application: sentiment analysis.
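A minimal example of the kind of RNN text classifier such a review covers; the vocabulary size, layer widths, and training setup are arbitrary illustrative choices:

```python
import tensorflow as tf
from tensorflow.keras import layers

# A small LSTM classifier for sentiment analysis on integer-encoded text.
model = tf.keras.Sequential([
    layers.Embedding(20_000, 64),           # token ids -> dense vectors
    layers.LSTM(64),                        # recurrent sequence encoder
    layers.Dense(1, activation="sigmoid"),  # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=3) on padded, integer-encoded reviews
```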


Pathogens ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 480
Author(s):  
Rania Kousovista ◽  
Christos Athanasiou ◽  
Konstantinos Liaskonis ◽  
Olga Ivopoulou ◽  
George Ismailos ◽  
...  

Acinetobacter baumannii is one of the most difficult-to-treat pathogens worldwide, due to the resistance it has developed. The aim of this study was to evaluate the use of widely prescribed antimicrobials and the corresponding resistance rates of A. baumannii, and to explore the relationship between antimicrobial use and the emergence of A. baumannii resistance in a tertiary care hospital. Monthly data on A. baumannii susceptibility rates and antimicrobial use, between January 2014 and December 2017, were analyzed using time series analysis (autoregressive integrated moving average (ARIMA) models) and dynamic regression models. Temporal correlations between meropenem, cefepime, and ciprofloxacin use and the corresponding rates of A. baumannii resistance were documented. The ARIMA models showed a statistically significant correlation between meropenem use and the detection rate of meropenem-resistant A. baumannii with a lag of two months (p = 0.024). A positive association, with a one-month lag, was identified between cefepime use and cefepime-resistant A. baumannii (p = 0.028), as well as between ciprofloxacin use and ciprofloxacin-resistant A. baumannii (p < 0.001). The dynamic regression models explained a substantial share of the variance in the resistance rates (R2 > 0.60). The magnitude of the effect on resistance differed significantly between antimicrobial agents.
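A sketch of this kind of analysis, regressing a resistance series on lagged antimicrobial use with ARIMA errors; the synthetic series and the (1, 0, 0) order below are placeholders, not the study's data or fitted orders:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly series standing in for the Jan 2014 - Dec 2017 data;
# the 2-month lag mirrors the reported meropenem finding.
rng = np.random.default_rng(3)
use = pd.Series(rng.normal(100, 10, 48))                # antimicrobial use
resistance = 0.2 * use.shift(2) + rng.normal(0, 2, 48)  # lagged response

df = pd.DataFrame({"resistance": resistance,
                   "use_lag2": use.shift(2)}).dropna()
# ARIMA errors with lagged use as an exogenous regressor.
fit = SARIMAX(df["resistance"], exog=df[["use_lag2"]],
              order=(1, 0, 0)).fit(disp=False)
print(fit.pvalues["use_lag2"])  # significance of the lagged-use effect
```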


2019 ◽  
Vol 2 ◽  
pp. 205920431984735
Author(s):  
Roger T. Dean ◽  
Andrew J. Milne ◽  
Freya Bailes

Spectral pitch similarity (SPS) is a measure of the similarity between the spectra of any pair of sounds. It has proved powerful in predicting perceived stability and fit of notes and chords in various tonal and microtonal instrumental contexts, that is, with discrete tones whose spectra are harmonic or close to harmonic. Here we assess the possible contribution of SPS to listeners' continuous perceptions of change in music with fewer discrete events and with noisy or profoundly inharmonic sounds, such as electroacoustic music. Previous studies have shown that time series of perceived change in a range of music can be reasonably represented by time series models, whose predictors comprise autoregression together with series representing acoustic intensity and, usually, the timbral parameter spectral flatness. Here, we study possible roles for SPS in such models of continuous perceptions of change in a range of both instrumental (note-based) and sound-based music (generally containing more noise and fewer discrete events). In the first analysis, perceived change in three pieces of electroacoustic music and one piece of piano music is modeled, to assess the possible contribution of (de-noised) SPS in cooperation with acoustic intensity and spectral flatness series. In the second analysis, a broad range of nine pieces is studied in relation to the wider range of distinctive spectral predictors useful in previous perceptual work, together with intensity and SPS. The second analysis uses cross-sectional (mixed-effects) time series analysis to take advantage of all the individual response series in the dataset, and to assess the possible generality of a predictive role for SPS. SPS proves to be a useful feature, making a predictive contribution distinct from other spectral parameters. Because SPS is a psychoacoustic "bottom up" feature, it may have wide applicability across both the familiar and the unfamiliar in the music to which we are exposed.
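A crude illustration of comparing two sound spectra; cosine similarity of raw magnitude spectra is only a simplified proxy, since the published SPS measure additionally smooths spectra over a perceptual pitch scale before comparison:

```python
import numpy as np

def spectral_similarity(spec_a, spec_b):
    """Cosine similarity between two magnitude spectra -- a simplified
    proxy for SPS, not the published measure."""
    a, b = np.asarray(spec_a, float), np.asarray(spec_b, float)
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Two short frames of audio at an assumed 44.1 kHz sample rate:
t = np.arange(2048) / 44100
frame_a = np.sin(2 * np.pi * 440 * t)                        # A4 sine
frame_b = np.sin(2 * np.pi * 660 * t) + 0.1 * np.sin(7 * t)  # E5 + rumble
print(spectral_similarity(np.abs(np.fft.rfft(frame_a)),
                          np.abs(np.fft.rfft(frame_b))))
```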


Sensors ◽  
2021 ◽  
Vol 21 (5) ◽  
pp. 1590
Author(s):  
Arnak Poghosyan ◽  
Ashot Harutyunyan ◽  
Naira Grigoryan ◽  
Clement Pang ◽  
George Oganesyan ◽  
...  

The main purpose of application performance monitoring/management (APM) software is to ensure the highest availability, efficiency and security of applications. APM software accomplishes these goals through automation, measurement, analysis and diagnostics. Gartner specifies three crucial capabilities of APM software. The first is end-user experience monitoring, which reveals the interactions of users with application and infrastructure components. The second is application discovery, diagnostics and tracing. The third is machine learning (ML) and artificial intelligence (AI) powered data analytics for predictions, anomaly detection, event correlation and root cause analysis. Time series metrics, logs and traces are the three pillars of observability and a valuable source of information for IT operations. Accurate, scalable and robust time series forecasting and anomaly detection are required capabilities of such analytics. Approaches based on neural networks (NN) and deep learning are gaining popularity due to their flexibility and ability to tackle complex nonlinear problems. However, some disadvantages of NN-based models for distributed cloud applications temper expectations and require specific approaches. We demonstrate how NN models, pretrained on a global time series database, can be applied to customer-specific data using transfer learning. In general, NN models operate adequately only on stationary time series. Application to nonstationary time series requires multilayer data processing, including hypothesis testing for data categorization, category-specific transformations into stationary data, forecasting, and backward transformations. We present the mathematical background of this approach and discuss experimental results based on an implementation for Wavefront by VMware (an APM software) monitoring real customer cloud environments.
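A sketch of the transform/forecast/back-transform pipeline using differencing as the stationarizing transformation; this is one common instance of the described multilayer processing, not Wavefront's implementation:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def to_stationary(y, alpha=0.05, max_d=2):
    """Difference a series until the ADF test rejects a unit root
    (p < alpha), keeping the last value at each level so a forecast on
    the transformed scale can be integrated back afterwards."""
    lasts, d = [], 0
    while d < max_d and adfuller(y)[1] >= alpha:
        lasts.append(y[-1])
        y = np.diff(y)
        d += 1
    return y, lasts

def invert(diff_forecast, lasts):
    """Undo the differencing applied by to_stationary."""
    for last in reversed(lasts):
        diff_forecast = last + np.cumsum(diff_forecast)
    return diff_forecast

# A random walk is nonstationary by construction:
rng = np.random.default_rng(4)
y = np.cumsum(rng.normal(0, 1, 300))
y_stat, lasts = to_stationary(y)
# ...forecast y_stat with the pretrained NN model, then re-integrate:
forecast = invert(np.zeros(10), lasts)  # zero-change placeholder forecast
```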


Materials ◽  
2021 ◽  
Vol 14 (8) ◽  
pp. 1986
Author(s):  
Andreas Koenig ◽  
Julius Schmidtke ◽  
Leonie Schmohl ◽  
Sibylle Schneider-Feyrer ◽  
Martin Rosentritt ◽  
...  

The performance of dental resin-based composites (RBCs) depends heavily on the characteristic properties of the individual filler fraction. As specific information regarding the properties of the filler fraction is often missing, the current study aims to characterize the filler fractions of several contemporary computer-aided design/computer-aided manufacturing (CAD/CAM) RBCs from a materials science point of view. The filler fractions of seven commercially available CAD/CAM RBCs featuring different translucency variants were analysed using Scanning Electron Microscopy (SEM) with Energy Dispersive X-ray Spectroscopy (EDS), Micro-X-ray Computed Tomography (µXCT), Thermogravimetric Analysis (TG) and X-ray Diffractometry (XRD). All CAD/CAM RBCs investigated included midifill hybrid type filler fractions, and the size of the individual particles was clearly larger than the manufacturers' specifications. The fillers in Shofu Block HC featured a sphericity of ≈0.8, while it was <0.7 in all other RBCs. All RBCs featured only X-ray amorphous phases; however, in Lava Ultimate, zircon crystals with low crystallinity were detected. In some CAD/CAM RBCs, inhomogeneities (X-ray opaque fillers or pores) with a size <80 µm were identified, but their effects were minor relative to the total volume (<0.01 vol.%). The characteristic parameters of the filler fraction in RBCs are essential for the interpretation of the individual material's mechanical and optical properties.
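For reference, particle sphericity is commonly computed with Wadell's definition; the sketch below assumes that definition, which may differ from the image-analysis measure used in the study:

```python
import math

def sphericity(volume, surface_area):
    """Wadell sphericity: surface area of the volume-equivalent sphere
    divided by the particle's measured surface area (<= 1; 1 = sphere)."""
    return (math.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / surface_area

# A unit cube (volume 1, surface area 6) has sphericity ~0.806,
# close to the ~0.8 reported for the Shofu Block HC fillers.
print(sphericity(1.0, 6.0))
```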


2012 ◽  
Vol 8 (1) ◽  
pp. 89-115 ◽  
Author(s):  
V. K. C. Venema ◽  
O. Mestre ◽  
E. Aguilar ◽  
I. Auer ◽  
J. A. Guijarro ◽  
...  

Abstract. The COST (European Cooperation in Science and Technology) Action ES0601: Advances in Homogenization Methods of Climate Series: An Integrated Approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline, at which the details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics, including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed using both the individual station series and the network-average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
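The first two performance metrics can be written out directly; this is a straightforward reading of metrics (i) and (ii) as described above, and the scoring details of the HOME benchmark itself may differ:

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered RMSE: RMSE after removing each series' mean, so only the
    shape of the errors (not a constant offset) is penalized (metric i)."""
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return np.sqrt(np.mean((h - t) ** 2))

def trend_error(homogenized, truth):
    """Difference between fitted linear trend slopes (metric ii)."""
    x = np.arange(len(truth))
    return np.polyfit(x, homogenized, 1)[0] - np.polyfit(x, truth, 1)[0]
```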

