Improved Time-Series Photometry and Calibration Method for Non-Crowded Fields: MMT Megacam and HAT-South Experiences

2011 ◽  
Vol 7 (S285) ◽  
pp. 291-293 ◽  
Author(s):  
Seo-Won Chang ◽  
Yong-Ik Byun ◽  
Dae-Won Kim

Abstract
We present a new photometric reduction method for precise time-series photometry of non-crowded fields that does not require relatively complicated and CPU-intensive techniques such as point-spread-function (PSF) fitting or difference image analysis. The method, which combines multi-aperture index photometry with a spatio-temporal de-trending algorithm, delivers far superior performance in data recovery and light-curve precision. In practice, the aggressive filtering often applied to remove outlying data points can discard vital data, with seriously negative impacts on short-term variations such as flares. Our method utilizes nearly 100% of the available data and, for brighter stars, reduces the rms scatter to several times smaller than that of the archived light curves. We outline the details of the new method and apply it to sample data from the MMT survey of the M37 field and from the HAT-South survey.

Author(s):  
J. Gu ◽  
G. Y. Li ◽  
Z. Dong

Metamodeling techniques are increasingly used to solve computation-intensive design optimization problems. In this work, the issue of automatically identifying appropriate metamodeling techniques in global optimization is addressed. A new, generic hybrid metamodel-based global optimization method is introduced, particularly suitable for design problems involving computation-intensive, black-box analyses and simulations. The method employs three representative metamodels concurrently in the search process and adaptively selects sample data points according to the values calculated with the three metamodels, improving modeling accuracy. The global optimum is identified once the metamodels become reasonably accurate. The new method is tested on various benchmark global optimization problems and applied to a real industrial design optimization problem involving vehicle crash simulation, demonstrating the superior performance of the new algorithm over existing search methods. Present limitations of the proposed method are also discussed.
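The core idea of sampling where an ensemble of cheap surrogates disagrees can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the test function, the three surrogate choices (linear fit, cubic fit, nearest neighbour), and the candidate grid are all assumptions.

```python
import numpy as np

def expensive_blackbox(x):
    # Stand-in for a costly simulation (hypothetical test function).
    return np.sin(3 * x) + 0.5 * x

# Initial design points and their expensive evaluations.
X = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = expensive_blackbox(X)

for _ in range(5):
    # Three cheap surrogate "metamodels" fit to the same data:
    # linear, cubic polynomial, and nearest-neighbour interpolation.
    lin = np.poly1d(np.polyfit(X, y, 1))
    cub = np.poly1d(np.polyfit(X, y, 3))
    def nn(xq):
        return y[np.abs(X[:, None] - xq[None, :]).argmin(axis=0)]

    # Query the expensive function where the surrogates disagree most,
    # i.e. where the model ensemble is least certain.
    cand = np.linspace(0.0, 2.0, 201)
    preds = np.vstack([lin(cand), cub(cand), nn(cand)])
    disagreement = preds.max(axis=0) - preds.min(axis=0)
    x_new = cand[disagreement.argmax()]

    X = np.append(X, x_new)
    y = np.append(y, expensive_blackbox(np.array([x_new]))[0])

best = X[y.argmin()]  # current estimate of the global minimizer
```

The expensive function is evaluated only five extra times; all other work happens on the cheap surrogates, which is the point of metamodel-based optimization.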


2009 ◽  
Vol 66 (3) ◽  
pp. 367-381 ◽  
Author(s):  
Yong-Woo Lee ◽  
Bernard A. Megrey ◽  
S. Allen Macklin

Multiple linear regressions (MLRs), generalized additive models (GAMs), and artificial neural networks (ANNs) were compared as methods to forecast recruitment of Gulf of Alaska walleye pollock (Theragra chalcogramma). Each model, grounded in a conceptual model, was applied to a 41-year time series of recruitment, spawner biomass, and environmental covariates. A subset of the available time series, an in-sample data set consisting of 35 of the 41 data points, was used to fit an environment-dependent recruitment model. Influential covariates were identified through statistical variable selection methods to build the best explanatory recruitment model. An out-of-sample set of six data points was retained for model validation. We tested each model's ability to forecast recruitment by applying it to the out-of-sample data set. For a more robust evaluation of forecast accuracy, the models were also tested with Monte Carlo resampling trials. The ANNs outperformed the other techniques during model fitting. For forecasting, the ANNs were not statistically different from the MLRs or GAMs. The results indicated that more complex models tend to be more susceptible to overparameterization. The procedures described in this study show promise for building and testing recruitment forecasting models for other fish species.
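The in-sample/out-of-sample protocol described above can be illustrated with a minimal sketch. The synthetic 41-point series, the single covariate, and the use of a plain linear fit as a stand-in for the MLR are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 41-year "recruitment" series driven by one covariate
# (a stand-in for spawner biomass / environmental indices).
n = 41
covariate = rng.normal(size=n)
recruit = 2.0 + 1.5 * covariate + rng.normal(scale=0.5, size=n)

# In-sample: first 35 points; out-of-sample: last 6 for validation.
X_in, y_in = covariate[:35], recruit[:35]
X_out, y_out = covariate[35:], recruit[35:]

# Fit a simple linear model (stand-in for the MLR in the study)
# and score its forecasts on the held-out points.
slope, intercept = np.polyfit(X_in, y_in, 1)
forecast = slope * X_out + intercept
rmse = np.sqrt(np.mean((forecast - y_out) ** 2))

# Monte Carlo resampling: refit on random 35-point subsets to obtain
# a distribution of forecast errors rather than a single number.
errors = []
for _ in range(200):
    idx = rng.choice(n, size=35, replace=False)
    hold = np.setdiff1d(np.arange(n), idx)
    s, b = np.polyfit(covariate[idx], recruit[idx], 1)
    errors.append(np.sqrt(np.mean((s * covariate[hold] + b - recruit[hold]) ** 2)))
```

The spread of `errors` gives the robustness check the abstract refers to: a model that only looks good on one particular 35/6 split will show a wide, shifted error distribution under resampling.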


2020 ◽  
Vol 2020 (14) ◽  
pp. 306-1-306-6
Author(s):  
Florian Schiffers ◽  
Lionel Fiske ◽  
Pablo Ruiz ◽  
Aggelos K. Katsaggelos ◽  
Oliver Cossairt

Imaging through scattering media finds applications in fields as diverse as biomedicine and autonomous driving. However, interpreting the resulting images is difficult because of blur caused by the scattering of photons within the medium. Transient information, captured with fast temporal sensors, can significantly improve the quality of images acquired in scattering conditions. Photon scattering within a highly scattering medium is well modeled by the diffusion approximation of the Radiative Transport Equation (RTE). Its solution is easily derived and can be interpreted as a spatio-temporal point spread function (ST-PSF). In this paper, we first discuss the properties of the ST-PSF and then use this knowledge to simulate transient imaging through highly scattering media. We then propose a framework that inverts the forward model, which assumes Poisson noise, to recover a noise-free, unblurred image by solving an optimization problem.
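As one concrete instance of inverting a blur model under Poisson noise, the classical Richardson–Lucy iteration deconvolves a blurred signal with a known PSF. This is a standard stand-in, not the authors' optimization framework; the 1-D signal, the Gaussian-shaped PSF, and the iteration count are assumptions.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=100):
    # Multiplicative fixed-point update for deconvolution under a
    # Poisson likelihood: x <- x * K^T(y / Kx), with the PSF
    # normalized to sum to 1 (so K^T 1 ~ 1 away from the edges).
    psf_flipped = psf[::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: two point sources blurred by a Gaussian-like PSF.
truth = np.zeros(64)
truth[20], truth[40] = 100.0, 60.0
psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2)
psf /= psf.sum()
blurred = np.convolve(truth, psf, mode="same")

restored = richardson_lucy(blurred, psf)
```

After enough iterations the estimate re-concentrates the blurred energy back onto the two source positions; the multiplicative form also keeps the estimate nonnegative, which matches the physics of photon counts.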


Author(s):  
Carlos A. Severiano ◽  
Petrônio de Cândido de Lima e Silva ◽  
Miri Weiss Cohen ◽  
Frederico Gadelha Guimarães

2020 ◽  
Vol 72 (1) ◽  
Author(s):  
Masayuki Kano ◽  
Shin’ichi Miyazaki ◽  
Yoichi Ishikawa ◽  
Kazuro Hirahara

Abstract
Postseismic Global Navigation Satellite System (GNSS) time series following megathrust earthquakes can be interpreted, especially in the early phase, as the result of afterslip on the plate interface. Afterslip releases stress accumulated by adjacent coseismic slip and can be considered a recovery process for future events during earthquake cycles. The spatio-temporal evolution of afterslip often triggers subsequent earthquakes through stress perturbation. It is therefore important to quantitatively capture the spatio-temporal evolution of afterslip and the related postseismic crustal deformation, and to predict their future evolution with a physics-based simulation. We developed an adjoint data assimilation method that directly assimilates GNSS time series into a physics-based model to optimize the frictional parameters controlling slip behavior on the fault. The method was validated with synthetic data. Through the optimization of frictional parameters, the spatial distribution of afterslip could be roughly (though not in detail) reproduced even when observation noise was included. The optimization not only reproduced the postseismic displacements used in the assimilation but also improved the prediction skill for the subsequent time series. We then applied the method to the observed GNSS time series for the first 15 days following the 2003 Tokachi-oki earthquake. The frictional parameters in the afterslip regions were optimized to A–B ~ O(10 kPa), A ~ O(100 kPa), and L ~ O(10 mm). A large afterslip is inferred on the shallower side of the coseismic slip area. The optimized frictional parameters quantitatively predicted the postseismic GNSS time series for the following 15 days. These characteristics can also be detected when the simulation variables are optimized simultaneously.
The developed data assimilation method, which can be applied directly to GNSS time series following megathrust earthquakes, provides an effective quantitative means of assessing the risk of subsequent earthquakes and of monitoring the recovery process after megathrust earthquakes.
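A heavily simplified sketch of the variational idea: fit the relaxation time of a toy exponential afterslip-style displacement model to synthetic GNSS-like data by Gauss–Newton on a least-squares misfit. The model form d(t) = A(1 − exp(−t/τ)), the parameter values, and the assumption that the amplitude is known are all illustrative; the actual method optimizes frictional parameters of a physics-based fault model through an adjoint.

```python
import numpy as np

# 15 days of synthetic daily "postseismic displacement" observations
# from a hypothetical relaxation model d(t) = A * (1 - exp(-t / tau)).
t = np.arange(1.0, 16.0)          # days since the mainshock
A_true, tau_true = 50.0, 5.0      # mm and days (made-up values)
obs = A_true * (1.0 - np.exp(-t / tau_true))

A = 50.0    # amplitude assumed known; estimate tau only
tau = 2.0   # deliberately poor first guess
for _ in range(50):
    pred = A * (1.0 - np.exp(-t / tau))
    resid = pred - obs
    # Analytic sensitivity d(pred)/d(tau); for a single scalar
    # parameter the "adjoint" collapses to this hand-coded Jacobian.
    jac = -A * np.exp(-t / tau) * t / tau**2
    # Gauss-Newton step on the least-squares misfit.
    tau -= np.dot(jac, resid) / np.dot(jac, jac)
```

With noiseless data the iteration recovers τ = 5 days exactly; the real problem differs in having many spatially distributed parameters and a nonlinear friction law, which is why the full adjoint machinery is needed.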


Mathematics ◽  
2021 ◽  
Vol 9 (15) ◽  
pp. 1832
Author(s):  
Mariano Méndez-Suárez

Partial least squares structural equation modeling (PLS-SEM) uses sampling bootstrapping to calculate the significance of the model parameter estimates (e.g., path coefficients and outer loadings). However, when the data are time series, as in marketing mix modeling, sampling bootstrapping shows inconsistencies that arise because the series has an autocorrelation structure and contains seasonal events, such as Christmas or Black Friday, especially in multichannel retailing, making the significance analysis of the PLS-SEM model unreliable. The alternative proposed in this research uses maximum entropy bootstrapping (meboot), a technique specifically designed for time series, which maintains the autocorrelation structure and preserves in the bootstrapped series the timing of seasonal events and structural changes present in the original series. The results showed that meboot performed better than sampling bootstrapping in terms of both the coherence of the bootstrapped data and the quality of the significance analysis.
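The failure mode that motivates meboot — ordinary i.i.d. resampling destroying the autocorrelation structure — can be demonstrated directly. The AR(1) process and its coefficient are assumptions for illustration; meboot itself (an R package) is not reimplemented here.

```python
import numpy as np

rng = np.random.default_rng(42)

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Simulate an autocorrelated (AR(1)) series, like a marketing
# time series with persistence from one period to the next.
n, phi = 500, 0.8
e = rng.normal(size=n)
series = np.empty(n)
series[0] = e[0]
for i in range(1, n):
    series[i] = phi * series[i - 1] + e[i]

rho_original = lag1_autocorr(series)

# Ordinary (i.i.d.) sampling bootstrap: draw points independently
# with replacement. This scrambles time order, so the resampled
# series loses the autocorrelation that meboot is built to preserve.
resampled = rng.choice(series, size=n, replace=True)
rho_resampled = lag1_autocorr(resampled)
```

`rho_original` sits near the true persistence (about 0.8), while `rho_resampled` collapses toward zero, which is exactly why significance tests built on i.i.d. resamples of a time series are unreliable.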


2021 ◽  
Vol 259 ◽  
pp. 112394
Author(s):  
Huijin Yang ◽  
Bin Pan ◽  
Ning Li ◽  
Wei Wang ◽  
Jian Zhang ◽  
...  

Electronics ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 261
Author(s):  
Tianyang Liu ◽  
Zunkai Huang ◽  
Li Tian ◽  
Yongxin Zhu ◽  
Hui Wang ◽  
...  

The rapid development of wind power brings new technical challenges. Reliable and accurate wind power forecasts are of considerable significance to the electricity system's daily dispatching and production. Traditional forecast methods usually use wind speed and turbine parameters as model inputs. However, these are not sufficient to account for complex weather variability and the variety of wind turbine features in the real world. Inspired by the excellent performance of convolutional neural networks (CNNs) in computer vision, we propose a novel approach to predicting short-term wind power by converting time series into images and exploiting a CNN to analyze them. In our approach, we first propose two transformation methods that map wind speed and precipitation time series into image matrices. After integrating multi-dimensional information and extracting features, we design a novel CNN framework to forecast 24-h wind turbine power. Our method is implemented on the Keras deep learning platform and tested on 10 sets of 3-year wind turbine data from Hangzhou, China. The superior performance of the proposed method is demonstrated through comparisons with state-of-the-art techniques in wind turbine power forecasting.
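One generic way to turn a 1-D series into a 2-D matrix a CNN can consume is to stack overlapping windows. The abstract does not specify the paper's two mapping methods, so the function below, its name, the window height, and the synthetic wind-speed series are purely illustrative.

```python
import numpy as np

def series_to_image(series, height):
    # Stack overlapping windows of a 1-D series into a 2-D "image":
    # row i holds the window of length `height` starting at step i.
    # Neighbouring rows differ by one time step, so local temporal
    # patterns become local 2-D patterns a convolution can pick up.
    n = len(series) - height + 1
    return np.stack([series[i:i + height] for i in range(n)])

# Hypothetical 48 hourly wind-speed readings -> 25 x 24 image matrix.
wind_speed = np.sin(np.linspace(0.0, 6.28, 48)) * 5.0 + 8.0
image = series_to_image(wind_speed, height=24)
```

Stacking one such matrix per input channel (e.g., wind speed and precipitation) yields a multi-channel image, which is the usual input shape for a Keras `Conv2D` stack.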

