The first data release of LAMOST low-resolution single-epoch spectra

2021 ◽  
Vol 21 (10) ◽  
pp. 249
Author(s):  
Zhong-Rui Bai ◽  
Hao-Tong Zhang ◽  
Hai-Long Yuan ◽  
Dong-Wei Fan ◽  
Bo-Liang He ◽  
...  

Abstract LAMOST Data Release 5, covering ∼17 000 deg2 from −10° to 80° in declination, contains 9 million co-added low-resolution spectra of celestial objects, each combined from between two and several tens of repeat exposures taken between Oct 2011 and Jun 2017. In this paper, we present the spectra of the individual exposures for all objects in LAMOST Data Release 5. For each spectrum, the equivalent widths of 60 lines from 11 different elements are calculated with a new method that combines the actual line core with fitted line wings. For stars earlier than F type, the Balmer lines are fitted with both emission and absorption profiles when two components are detected. The radial velocity of each individual exposure is measured by minimizing χ2 between the spectrum and its best-fit template. The databases of equivalent widths of spectral lines and radial velocities of individual spectra are available online. Radial velocity uncertainties for different stellar types and signal-to-noise ratios are quantified by comparing different exposures of the same objects. We find that the radial velocity uncertainty depends on the time lag between observations. For stars observed on the same day and with a signal-to-noise ratio higher than 20, the radial velocity uncertainty is below 5 km s−1, increasing to 10 km s−1 for stars observed on different nights.
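The χ2 radial-velocity measurement described above can be sketched as follows. This is a minimal toy illustration, not the LAMOST pipeline: the single Gaussian absorption line, the noise level, and the uniform velocity grid are assumptions made for the example.

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def radial_velocity_chi2(wave, flux, err, t_wave, t_flux, v_grid):
    """Estimate RV by minimizing chi^2 between a spectrum and a
    Doppler-shifted template evaluated on a grid of trial velocities."""
    chi2 = np.empty(len(v_grid))
    for i, v in enumerate(v_grid):
        # Shift the template wavelengths by trial velocity v, resample onto
        # the observed wavelength grid, and score the residuals.
        shifted = np.interp(wave, t_wave * (1.0 + v / C_KM_S), t_flux)
        chi2[i] = np.sum(((flux - shifted) / err) ** 2)
    return v_grid[np.argmin(chi2)], chi2

# Toy data: a Gaussian absorption line near H-alpha, shifted by +30 km/s.
t_wave = np.linspace(6550.0, 6575.0, 2000)
t_flux = 1.0 - 0.5 * np.exp(-0.5 * ((t_wave - 6562.8) / 0.5) ** 2)
true_v = 30.0
obs_flux = np.interp(t_wave, t_wave * (1.0 + true_v / C_KM_S), t_flux)
err = np.full_like(obs_flux, 0.01)

v_grid = np.arange(-100.0, 100.0, 1.0)  # trial velocities in km/s
v_best, _ = radial_velocity_chi2(t_wave, obs_flux, err, t_wave, t_flux, v_grid)
```

The grid search recovers the input shift to within one grid step; a real pipeline would refine the minimum (e.g. by parabolic interpolation of the χ2 curve).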

2015 ◽  
Vol 8 (3) ◽  
pp. 2913-2955 ◽  
Author(s):  
B. Langford ◽  
W. Acton ◽  
C. Ammann ◽  
A. Valach ◽  
E. Nemitz

Abstract. All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here we apply a consistent approach based on auto- and cross-covariance functions to quantify separately the total random flux error and the random error due to instrument noise. As with previous approaches, the random error quantification assumes that the time lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining datasets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time lag eliminates these effects (provided the time lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low-frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time lag.
Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.
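The noise-induced bias from searching the cross-covariance maximum for the time lag can be illustrated with a small simulation. This sketch assumes two uncorrelated white-noise series (i.e. zero true flux), which is an idealization, not the analysers discussed in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def cross_cov(w, c, lag):
    """Cross-covariance of w and c at an integer lag (in samples)."""
    w = w - w.mean()
    c = c - c.mean()
    if lag > 0:
        return np.mean(w[:-lag] * c[lag:])
    if lag < 0:
        return np.mean(w[-lag:] * c[:lag])
    return np.mean(w * c)

# Zero true flux: wind and concentration are pure, uncorrelated noise.
n, lags, reps = 5000, np.arange(-50, 51), 100
searched, prescribed = [], []
for _ in range(reps):
    w = rng.standard_normal(n)   # vertical wind
    c = rng.standard_normal(n)   # noisy concentration, uncorrelated with w
    cov = np.array([cross_cov(w, c, k) for k in lags])
    searched.append(cov[np.argmax(np.abs(cov))])  # flux at covariance-search lag
    prescribed.append(cov[lags == 0][0])          # flux at the prescribed lag
bias_searched = np.mean(np.abs(searched))
bias_prescribed = np.mean(np.abs(prescribed))
# Picking the covariance extremum systematically inflates the apparent
# flux magnitude relative to evaluating at a fixed, known lag.
```

Even with no true flux present, selecting the extremum over a lag window returns the largest of many noise realizations, which is why the abstract recommends a prescribed time lag for noisy data.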


2003 ◽  
Vol 131 (8) ◽  
pp. 1715-1732 ◽  
Author(s):  
Matthew Newman ◽  
Prashant D. Sardeshmukh ◽  
Christopher R. Winkler ◽  
Jeffrey S. Whitaker

Abstract The predictability of weekly averaged circulation anomalies in the Northern Hemisphere, and diabatic heating anomalies in the Tropics, is investigated in a linear inverse model (LIM) derived from their observed simultaneous and time-lag correlation statistics. In both winter and summer, the model's forecast skill at week 2 (days 8–14) and week 3 (days 15–21) is comparable to that of a comprehensive global medium-range forecast (MRF) model developed at the National Centers for Environmental Prediction (NCEP). Its skill at week 3 is actually higher on average, partly due to its better ability to forecast tropical heating variations and their influence on the extratropical circulation. The geographical and temporal variations of forecast skill are also similar in the two models. This makes the much simpler LIM an attractive tool for assessing and diagnosing atmospheric predictability at these forecast ranges. The LIM assumes that the dynamics of weekly averages are linear, asymptotically stable, and stochastically forced. In a forecasting context, the predictable signal is associated with the deterministic linear dynamics, and the forecast error with the unpredictable stochastic noise. In a low-order linear model of a high-order chaotic system, this stochastic noise represents the effects of both chaotic nonlinear interactions and unresolved initial components on the evolution of the resolved components. Its statistics are assumed here to be state independent. An average signal-to-noise ratio is estimated at each grid point on the hemisphere and is then used to estimate the potential predictability of weekly variations at the point. In general, this predictability is about 50% higher in winter than summer over the Pacific and North America sectors; the situation is reversed over Eurasia and North Africa. Skill in predicting tropical heating variations is important for realizing this potential skill. 
The actual LIM forecast skill has a similar geographical structure but weaker magnitude than the potential skill. In this framework, the predictable variations of forecast skill from case to case are associated with predictable variations of signal rather than of noise. This contrasts with the traditional emphasis in studies of shorter-term predictability on flow-dependent instabilities, that is, on the predictable variations of noise. In the LIM, the predictable variations of signal are associated with variations of the initial state projection on the growing singular vectors of the LIM's propagator, which have relatively large amplitude in the Tropics. At times of strong projection on such structures, the signal-to-noise ratio is relatively high, and the Northern Hemispheric circulation is not only potentially but also actually more predictable than at other times.
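The LIM construction from simultaneous and time-lag covariance statistics can be sketched on a synthetic two-variable system: the propagator is estimated as G(τ) = C(τ)C(0)⁻¹ and used for the deterministic part of the forecast. The dynamics matrix and noise below are illustrative assumptions, not the circulation/heating state vector of the paper:

```python
import numpy as np

def fit_lim_propagator(x, lag):
    """Fit the LIM propagator G(tau) = C(tau) C(0)^{-1} from a
    (time, variables) data matrix of anomalies."""
    x = x - x.mean(axis=0)
    c0 = x.T @ x / len(x)                          # simultaneous covariance C(0)
    ctau = x[lag:].T @ x[:-lag] / (len(x) - lag)   # time-lag covariance C(tau)
    return ctau @ np.linalg.inv(c0)

# Synthetic stable, stochastically forced linear system: x_{t+1} = A x_t + noise.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.2],
              [0.0, 0.7]])
n = 20000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A @ x[t - 1] + rng.standard_normal(2)

# The fitted one-step propagator should recover A; the residual variance
# plays the role of the unpredictable stochastic noise in the forecast.
G = fit_lim_propagator(x[500:], lag=1)
```

A forecast at lead τ is then simply x̂(t+τ) = G(τ) x(t), with the forecast error supplied by the state-independent noise statistics, as described in the abstract.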


2018 ◽  
Vol 614 ◽  
pp. A133 ◽  
Author(s):  
J.-B. Delisle ◽  
D. Ségransan ◽  
X. Dumusque ◽  
R. F. Diaz ◽  
F. Bouchy ◽  
...  

We report the discovery of four super-Earth planets around HD 215152, with orbital periods of 5.76, 7.28, 10.86, and 25.2 d, and minimum masses of 1.8, 1.7, 2.8, and 2.9 M⊕, respectively. This discovery is based on 373 high-quality radial velocity measurements taken with HARPS over 13 yr. Given the low masses of the planets, the signal-to-noise ratio is not sufficient to constrain the planet eccentricities. However, a preliminary dynamical analysis suggests that the eccentricities should typically be lower than about 0.03 for the system to remain stable. With two pairs of planets with period ratios below 1.5, short orbital periods, low masses, and low eccentricities, HD 215152 is similar to the very compact multi-planet systems found by Kepler, which are very rare in radial-velocity surveys. This discovery shows that such systems can be reached with the radial-velocity technique, but characterizing them requires a large number of observations.


1981 ◽  
Vol 59 ◽  
pp. 83-86
Author(s):  
Roberto H. Méndez ◽  
Alberto D. Verga

The present observations are part of a search for spectral and radial velocity variations among central stars of planetary nebulae (Méndez 1980). The spectrograms were taken with the image-tube spectrographs of the 1-m and 4-m telescopes at the Cerro Tololo Inter-American Observatory (CTIO). The emulsion was always IIIa-J baked in “forming gas” (N2+H2). The “blue” spectrograms extend from 3600 to 5000 Å at 45 Å mm−1; the “red” ones extend from 5000 to 7000 Å, at 45 Å mm−1 (4-m plates) and 90 Å mm−1 (1-m plates). All plates were calibrated with a spot sensitometer. Seven “blue” and seven “red” spectrograms, all obtained with the 4-m telescope, were traced with the PDS microphotometer of the David Dunlap Observatory. The intensities from each plate were stored in computer memory and were later added together, in order to improve the signal-to-noise ratio. The resulting intensity tracings reveal more details than had previously been observed (Swings and Struve 1941, Aller and Wilson 1954, Andrillat 1957, Aller and Kaler 1964).
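The signal-to-noise gain from adding plate tracings together follows the usual √N law for independent noise. A minimal sketch, assuming Gaussian plate noise and a synthetic emission-line profile (the seven-plate count matches the abstract; everything else is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "true" intensity tracing: continuum plus one emission line.
n_plates, n_pix = 7, 1000
pix = np.arange(n_pix)
signal = 1.0 + 0.8 * np.exp(-0.5 * ((pix - 500) / 20.0) ** 2)

# Each plate is the same tracing plus independent Gaussian noise.
sigma = 0.2
plates = signal + sigma * rng.standard_normal((n_plates, n_pix))

single_noise = np.std(plates[0] - signal)  # noise of one tracing
coadd = plates.mean(axis=0)                # co-added (averaged) tracing
coadd_noise = np.std(coadd - signal)       # noise after co-addition

# For independent noise the improvement should be about sqrt(7) ~ 2.65.
improvement = single_noise / coadd_noise
```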


2008 ◽  
Vol 22 (6) ◽  
pp. 467-474 ◽  
Author(s):  
João Carlos Lázaro ◽  
Carlos J. de Lima ◽  
Leonardo M. Moreira ◽  
Landulfo Silveira Jr. ◽  
Nelson J. F. da Silveira ◽  
...  

The present article focuses on optimizing the optical parameters of a Raman spectrometer in order to obtain the minimum width of its spectral lines. Using the width of a fingerprint band of calcified biological tissue as a reference, a spectral line without distortion or loss of resolution was identified. This optimization is employed with the aim of improving the signal-to-noise ratio (SNR). A substantial improvement in spectral collection efficiency was obtained, which can significantly reduce the diagnosis time for target tissues such as calcified coronary tissue. The increased efficiency and precision of this new spectroscopic system favor the safety of the technique for future in vivo applications, making it a powerful tool for the clinical diagnosis of several diseases.


2020 ◽  
Vol 13 (7) ◽  
pp. 3957-3975
Author(s):  
Kukka-Maaria Kohonen ◽  
Pasi Kolari ◽  
Linda M. J. Kooijmans ◽  
Huilin Chen ◽  
Ulli Seibt ◽  
...  

Abstract. Carbonyl sulfide (COS) flux measurements with the eddy covariance (EC) technique are becoming popular for estimating gross primary productivity. To compare COS flux measurements across sites, we need standardized protocols for data processing. In this study, we analyze how various data processing steps affect the calculated COS flux and how they differ from carbon dioxide (CO2) flux processing steps, and we provide a method for gap-filling COS fluxes. Different methods for determining the time lag between the COS mixing ratio and the vertical wind velocity (w) resulted in a maximum of 15.9 % difference in the median COS flux over the whole measurement period. Due to limited COS measurement precision, small COS fluxes (below approximately 3 pmol m−2 s−1) could not be detected when the time lag was determined from maximizing the covariance between COS and w. The difference between two high-frequency spectral corrections was 2.7 % in COS flux calculations, whereas omitting the high-frequency spectral correction resulted in a 14.2 % lower median flux, and different detrending methods caused a spread of 6.2 %. Relative total uncertainty was more than 5 times higher for low COS fluxes (lower than ±3 pmol m−2 s−1) than for low CO2 fluxes (lower than ±1.5 µmol m−2 s−1), indicating a low signal-to-noise ratio of COS fluxes. Due to similarities in ecosystem COS and CO2 exchange, we recommend applying the storage change flux correction and friction velocity filtering as usual in EC flux processing, but, because of the higher signal-to-noise ratio of CO2 measurements, we recommend using CO2 data for the time lag determination and high-frequency corrections of COS fluxes.
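The recommendation to determine the time lag from the high-SNR CO2 series and apply it to COS can be sketched with synthetic data. The shared transport lag, noise levels, and flux amplitudes below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

def best_lag(w, c, max_lag):
    """Integer lag (in samples) that maximizes |cross-covariance| of w and c."""
    w = w - w.mean()
    c = c - c.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    cov = [np.mean(w[:len(w) - k] * c[k:]) if k >= 0
           else np.mean(w[-k:] * c[:k]) for k in lags]
    return lags[np.argmax(np.abs(cov))]

# Both gases are sampled through the same inlet, so they share one true lag.
n, true_lag = 20000, 10
w = rng.standard_normal(n)
co2 = np.roll(w, true_lag) + 0.1 * rng.standard_normal(n)        # high SNR
cos = 0.2 * np.roll(w, true_lag) + 2.0 * rng.standard_normal(n)  # low SNR

# Find the lag on CO2, then evaluate the COS covariance at that fixed lag
# instead of searching the noisy COS cross-covariance function.
lag_co2 = best_lag(w, co2, max_lag=50)
wp, cp = w - w.mean(), cos - cos.mean()
cos_flux = np.mean(wp[:len(w) - lag_co2] * cp[lag_co2:])
```

Searching the COS cross-covariance directly would risk the noise-induced bias discussed above, since its peak is buried in instrument noise.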


Author(s):  
Yusheng Cheng ◽  
Kai Ma ◽  
Haitao Li ◽  
Shilin Sun ◽  
Yichuan Wang

Abstract A spectral-line-extraction algorithm based on the ant-colony algorithm is proposed to address the difficulty of extracting spectral lines under low signal-to-noise ratio conditions and the tendency of the optimal-path algorithm to fall into local optima. The algorithm applies the ant-colony path-optimization strategy to detect a spectral line and constructs a corresponding mathematical model using the grid method. A new cost function is proposed to replace path length as the optimization criterion of the conventional ant-colony algorithm. At the same time, the roulette rule is used to determine the direction of the next step. The algorithm improves the traditional heuristic function, increases the attraction of the target spectral line to the route search, and improves the convergence rate. Sea-trial data show that the algorithm extracts spectral lines with a low signal-to-noise ratio better than the optimal-path algorithm does.
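The roulette rule for choosing the ant's next step can be sketched as standard roulette-wheel selection, where the probability of each candidate grid cell is proportional to its transition weight. The candidate cells and weights below are placeholders, not the paper's heuristic or pheromone values:

```python
import random

def roulette_choice(candidates, weights, rng=random):
    """Pick the next grid cell by roulette-wheel selection: each candidate
    is chosen with probability proportional to its transition weight."""
    total = sum(weights)
    r = rng.uniform(0, total)
    acc = 0.0
    for cand, wgt in zip(candidates, weights):
        acc += wgt
        if r <= acc:
            return cand
    return candidates[-1]  # guard against floating-point round-off

# Placeholder example: three candidate cells, the last strongly favored
# (e.g. by pheromone level and closeness to the target spectral line).
random.seed(0)
counts = {"up": 0, "flat": 0, "down": 0}
for _ in range(10000):
    counts[roulette_choice(["up", "flat", "down"], [1.0, 2.0, 7.0])] += 1
```

Unlike a greedy argmax step, the roulette rule keeps a nonzero probability of leaving a locally attractive track, which is what helps the search escape local optima.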


2015 ◽  
Vol 8 (10) ◽  
pp. 4197-4213 ◽  
Author(s):  
B. Langford ◽  
W. Acton ◽  
C. Ammann ◽  
A. Valach ◽  
E. Nemitz

Abstract. All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here we apply a consistent approach based on auto- and cross-covariance functions to quantify separately the total random flux error and the random error due to instrument noise. As with previous approaches, the random error quantification assumes that the time lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining data sets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time lag eliminates these effects (provided the time lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low-frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time lag.
Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.


2000 ◽  
Vol 176 ◽  
pp. 489-489
Author(s):  
A. Kanaan ◽  
A. P. Hatzes ◽  
D. Mkrtichian

We used the 2D-Coudé spectrograph of the 2.7-m telescope at McDonald Observatory at a resolution of 60,000, with an iodine cell providing a permanent wavelength reference. γ Equ was observed for a period of approximately 6 hours. The signal-to-noise ratio was variable due to the presence of clouds; a typical value for the “good” spectra is 80. The use of a large detector provides complete spectral coverage from 5,000 to 6,000 Å (the region where I2 lines are most visible and useful as a wavelength reference). This allowed us to expand on our previous work analyzing the pulsations of γ Equulei through spectroscopy.

