Optimized transit detection algorithm to search for periodic transits of small planets

2019 ◽  
Vol 623 ◽  
pp. A39 ◽  
Author(s):  
Michael Hippke ◽  
René Heller

We present a new method to detect planetary transits from time-series photometry, the transit least squares (TLS) algorithm. TLS searches for transit-like features while taking the stellar limb darkening and planetary ingress and egress into account. We have optimized TLS for both signal detection efficiency (SDE) of small planets and computational speed. TLS analyses the entire, unbinned phase-folded light curve. We compensated for the higher computational load by (i.) using algorithms such as “Mergesort” (for the trial orbital phases) and by (ii.) restricting the trial transit durations to a smaller range that encompasses all known planets, and using stellar density priors where available. A typical K2 light curve, including 80 d of observations at a cadence of 30 min, can be searched with TLS in ∼10 s real time on a standard laptop computer, as fast as the widely used box least squares (BLS) algorithm. We perform a transit injection-retrieval experiment of Earth-sized planets around sun-like stars using synthetic light curves with 110 ppm white noise per 30 min cadence, corresponding to a photometrically quiet KP = 12 star observed with Kepler. We determine the SDE thresholds for both BLS and TLS to reach a false positive rate of 1% to be SDE = 7 in both cases. The resulting true positive (or recovery) rates are ∼93% for TLS and ∼76% for BLS, implying more reliable detections with TLS. We also test TLS with the K2 light curve of the TRAPPIST-1 system and find six of seven Earth-sized planets using an iterative search for increasingly lower signal detection efficiency, the phase-folded transit of the seventh planet being affected by a stellar flare. TLS is more reliable than BLS in finding any kind of transiting planet but it is particularly suited for the detection of small planets in long time series from Kepler, TESS, and PLATO. We make our python implementation of TLS publicly available.
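As an illustration of the phase-folding step at the heart of both BLS- and TLS-style searches, here is a minimal dependency-free sketch. The function names are ours, not from the TLS package, and the box template is a crude stand-in for the limb-darkened transit shape that TLS actually fits:

```python
def phase_fold(times, period, t0=0.0):
    """Fold observation times at a trial period; returns phases in [0, 1)."""
    return [((t - t0) / period) % 1.0 for t in times]

def box_depth_at_phase(phases, fluxes, phase_lo, phase_hi):
    """Mean in-transit flux deficit for a trial box (a simplification of the
    limb-darkened template that TLS fits to the phase-folded curve)."""
    in_transit = [f for p, f in zip(phases, fluxes) if phase_lo <= p < phase_hi]
    out_transit = [f for p, f in zip(phases, fluxes) if not (phase_lo <= p < phase_hi)]
    if not in_transit or not out_transit:
        return 0.0
    return sum(out_transit) / len(out_transit) - sum(in_transit) / len(in_transit)
```

A full search would repeat this over a grid of trial periods, epochs, and (as the paper restricts) physically plausible transit durations, keeping the trial that maximizes the detection statistic.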

2020 ◽  
Author(s):  
Poomipat Boonyakitanont ◽  
Apiwat Lek-uthai ◽  
Jitkomut Songsiri

Abstract. This article aims to design an automatic detection algorithm of epileptic seizure onsets and offsets in scalp EEGs. The proposed scheme consists of two sequential steps: the detection of seizure episodes, and the determination of seizure onsets and offsets in long EEG recordings. We introduce a neural network-based model called ScoreNet as a post-processing technique to determine the seizure onsets and offsets in EEGs. A cost function called the log-dice loss, which has a meaning analogous to the F1 score, is proposed to handle the imbalanced data problem. In combination with several classifiers including random forest, CNN, and logistic regression, ScoreNet is then verified on the CHB-MIT Scalp EEG database. As a result, in seizure detection, ScoreNet significantly improves F1 to 70.15% and considerably reduces the false positive rate per hour to 0.05 on average. In addition, we propose a detection delay metric, an effective latency index computed as a summation of exponentials of the delays, which takes undetected events into account. The index can provide better insight into onset and offset detection than conventional time-based metrics.
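A common way to write a Dice-overlap loss on a log scale, which is plausibly what a "log-dice" loss looks like (the exact form in the paper may differ; this is an illustrative reconstruction):

```python
import math

def log_dice_loss(pred, target, eps=1e-7):
    """Soft Dice overlap between predicted per-epoch seizure scores and
    binary labels, penalized on a log scale. Like F1, Dice balances
    precision and recall, which is why it suits imbalanced data."""
    inter = sum(p * t for p, t in zip(pred, target))
    denom = sum(pred) + sum(target)
    dice = (2.0 * inter + eps) / (denom + eps)
    return -math.log(dice)
```

A perfect prediction gives a Dice of 1 and hence a loss of 0; the log scaling penalizes poor overlap more steeply than `1 - dice` would.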


Author(s):  
J. Doblas ◽  
A. Carneiro ◽  
Y. Shimabukuro ◽  
S. Sant’Anna ◽  
L. Aragão ◽  
...  

Abstract. In this study we analyse the factors of variability of Sentinel-1 C-band radar backscattering over tropical rainforests, and propose a method to reduce the effects of this variability on deforestation detection algorithms. To do so, we developed a random forest regression model that relates Sentinel-1 gamma nought values to local climatological data and forest structure information. The model was trained on long time series of 26 relevant variables, sampled over six undisturbed tropical forest areas. The resulting model explained 71.64% and 73.28% of the SAR signal variability for the VV and VH polarizations, respectively. Once the best model for each polarization was selected, it was used to stabilize extracted pixel-level data of forested and non-deforested areas, which resulted in a 10 to 14% reduction of time-series variability in terms of standard deviation. A statistically robust deforestation detection algorithm was then applied to the stabilized time series. The results show that the proposed method reduced the rate of false positives in both polarizations, especially in VV (from 21% to 2%, α = 0.01). Meanwhile, the omission errors increased in both polarizations (from 27% to 37% in VV and from 27% to 33% in VH, α = 0.01). The proposed method yielded slightly better results when compared with an alternative state-of-the-art approach (spatial normalization).
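The stabilization step itself is simple once a regression model has produced per-date predictions of the climate-driven backscatter: subtract the model-explained component and re-centre on the series mean. A sketch (the paper uses a random forest as the predictor; here the predictions are taken as given):

```python
def stabilize(gamma0, predicted):
    """Remove the model-explained component of backscatter variability:
    subtract the predicted value at each date and re-centre the residuals
    on the mean predicted level, leaving only unexplained changes
    (e.g. deforestation events) in the series."""
    mean = sum(predicted) / len(predicted)
    return [g - p + mean for g, p in zip(gamma0, predicted)]
```

If the model explained the series perfectly, the stabilized series would be flat; in practice the residual variability drops by the 10-14% the paper reports.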


2019 ◽  
Vol 627 ◽  
pp. A66 ◽  
Author(s):  
René Heller ◽  
Michael Hippke ◽  
Kai Rodenbeck

The extended Kepler mission (K2) has revealed more than 500 transiting planets in roughly 500 000 stellar light curves. All of these were found either with the box least-squares algorithm or by visual inspection. Here we use our new transit least-squares (TLS) algorithm to search for additional planets around all K2 stars that are currently known to host at least one planet. We discover and statistically validate 17 new planets with radii ranging from about 0.7 Earth radii (R⊕) to roughly 2.2 R⊕ and a median radius of 1.18 R⊕. EPIC 201497682.03, with a radius of 0.692+0.059−0.048 R⊕, is the second smallest planet ever discovered with K2. The transit signatures of these 17 planets are typically 200 ppm deep (ranging from 100 ppm to 2000 ppm), and their orbital periods extend from about 0.7 d to 34 d with a median value of about 4 d. Fourteen of these 17 systems only had one known planet before, and they now join the growing number of multi-planet systems. Most stars in our sample have subsolar masses and radii. The small planetary radii in our sample are a direct result of the higher signal detection efficiency that TLS has compared to box-fitting algorithms in the shallow-transit regime. Our findings help in populating the period-radius diagram with small planets. Our discovery rate of about 3.7% within the group of previously known K2 systems suggests that TLS can find over 100 additional Earth-sized planets in the data of the Kepler primary mission.


2019 ◽  
Vol 489 (2) ◽  
pp. 2117-2129 ◽  
Author(s):  
Paul J Morris ◽  
Nachiketa Chakraborty ◽  
Garret Cotter

ABSTRACT Time-series analysis allows for the determination of the Power Spectral Density (PSD) and Probability Density Function (PDF) for astrophysical sources. The former of these illustrates the distribution of power at various time-scales, typically taking a power-law form, while the latter characterizes the distribution of the underlying stochastic physical processes, with Gaussian and lognormal functional forms both physically motivated. In this paper, we use artificial time series generated using the prescription of Timmer & König to investigate connections between the PDF and PSD. PDFs calculated for these artificial light curves are less likely to be well described by a Gaussian functional form for steep (Γ ≳ 1) PSD indices due to weak non-stationarity. Using the Fermi LAT monthly light curve of the blazar PKS 2155-304 as an example, we prescribe and calculate a false positive rate that indicates how likely the PDF is to be attributed an incorrect functional form. Here, we generate large numbers of artificial light curves with intrinsically normally distributed PDFs and with statistical properties consistent with observations. These are used to evaluate the probabilities that either Gaussian or lognormal functional forms better describe the PDF. We use this prescription to show that PKS 2155-304 requires a high prior probability of having a normally distributed PDF, P(G) ≥ 0.82, for the calculated PDF to prefer a Gaussian functional form over a lognormal. We present possible choices of prior and evaluate the probability that PKS 2155-304 has a lognormally distributed PDF for each.
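The Timmer & König prescription draws independent Gaussian Fourier amplitudes scaled by the square root of the target PSD. A dependency-free sketch for a power-law PSD S(f) ∝ f^(-Γ), written as an explicit O(n²) sinusoid sum (a real implementation would use an inverse FFT; function and parameter names are ours):

```python
import math, random

def timmer_koenig(n, dt=1.0, gamma=1.5, seed=42):
    """Generate a light curve whose PSD follows S(f) ∝ f^(-gamma), after
    Timmer & Koenig (1995): each Fourier frequency gets independent
    Gaussian cosine/sine amplitudes scaled by sqrt(S(f))."""
    rng = random.Random(seed)
    freqs = [j / (n * dt) for j in range(1, n // 2 + 1)]
    amps = [(math.sqrt(f ** -gamma) * rng.gauss(0, 1),
             math.sqrt(f ** -gamma) * rng.gauss(0, 1)) for f in freqs]
    lc = []
    for i in range(n):
        t = i * dt
        lc.append(sum(a * math.cos(2 * math.pi * f * t) +
                      b * math.sin(2 * math.pi * f * t)
                      for f, (a, b) in zip(freqs, amps)))
    return lc
```

By construction the amplitudes are normally distributed, so ensembles of such curves provide the intrinsically Gaussian-PDF null hypothesis used in the false-positive-rate calculation; steeper `gamma` concentrates power at low frequencies, producing the weak non-stationarity discussed above.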


2019 ◽  
Vol 2019 ◽  
pp. 1-9 ◽  
Author(s):  
Hong Zhao ◽  
Zhaobin Chang ◽  
Guangbin Bao ◽  
Xiangyan Zeng

Malicious domain name attacks have become a serious issue for Internet security. In this study, a malicious domain name detection algorithm based on N-grams is proposed. The top 100,000 domain names in Alexa 2013 are used to build the N-gram model. Each domain name, excluding the top-level domain, is segmented into substrings according to its domain level, with lengths of 3, 4, 5, 6, and 7. The substring set of the 100,000 domain names is established, and the weight value of a substring is calculated according to its number of occurrences in the substring set. To detect a malicious attack, the domain name is likewise segmented by the N-gram method and its reputation value is calculated from the weight values of its substrings. Finally, the judgment of whether the domain name is malicious is made by thresholding. In experiments on Alexa 2017 and the Malware Domain List, the proposed detection algorithm yielded an accuracy rate of 94.04%, a false negative rate of 7.42%, and a false positive rate of 6.14%. Its time complexity is lower than that of other popular malicious domain name detection algorithms.
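The segmentation and scoring pipeline can be sketched as follows. This is a minimal reconstruction under the assumptions stated in the abstract (substring lengths 3-7, occurrence counts as weights, mean weight as the reputation value); function names and the tiny benign corpus are illustrative, not from the paper:

```python
from collections import Counter

def ngrams(label, n_range=(3, 4, 5, 6, 7)):
    """All substrings of the given lengths from one domain label."""
    return [label[i:i + n] for n in n_range for i in range(len(label) - n + 1)]

def build_weights(benign_domains):
    """Weight each substring by its occurrence count over a benign corpus
    (a stand-in for the Alexa top-100,000 list used in the paper)."""
    counts = Counter()
    for d in benign_domains:
        for label in d.split('.')[:-1]:  # drop the top-level domain
            counts.update(ngrams(label))
    return counts

def reputation(domain, weights):
    """Mean substring weight; low values suggest an algorithmically
    generated (likely malicious) name, classified by thresholding."""
    subs = [s for label in domain.split('.')[:-1] for s in ngrams(label)]
    if not subs:
        return 0.0
    return sum(weights[s] for s in subs) / len(subs)
```

A human-registered name shares many substrings with the benign corpus and scores high; a random-looking DGA name shares almost none and scores near zero.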


2020 ◽  
Vol 17 (5) ◽  
pp. 2342-2348
Author(s):  
Ashutosh Upadhyay ◽  
S. Vijayalakshmi

In the field of computer vision, face detection algorithms have achieved high accuracy, but in real-time applications it remains a challenge to balance accuracy and efficiency: gaining accuracy increases the computational cost of handling large data sets. This paper proposes a half-face detection algorithm to address the efficiency of face detection. A full-face detection algorithm considers the complete face data set for training, which incurs a higher computational cost. To reduce this cost, the proposed model captures the features of half of the face, assuming that the human face is symmetric about the vertical axis passing through the nose, and trains the system on the reduced half-face features. The proposed algorithm extracts Local Binary Pattern (LBP) features and trains the model using an AdaBoost classifier. The algorithm's performance is reported in terms of accuracy, i.e., True Positive Rate (TPR) and False Positive Rate (FPR), and face recognition time complexity.
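The two ingredients named above, halving the image under the symmetry assumption and computing LBP codes, can be sketched in a few lines (images as nested lists of grey values; this is our illustration, not the authors' code):

```python
def half_face(image):
    """Keep only the left half of each pixel row, exploiting the assumed
    left-right symmetry of a frontal face to halve the feature set."""
    return [row[:(len(row) + 1) // 2] for row in image]

def lbp_code(img, r, c):
    """Basic 8-neighbour Local Binary Pattern code for pixel (r, c):
    each neighbour at least as bright as the centre sets one bit."""
    centre = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code
```

Histograms of such codes over the half-face patches would then form the feature vectors fed to the AdaBoost classifier.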


2021 ◽  
Vol 300 ◽  
pp. 01011
Author(s):  
Jun Wu ◽  
Sheng Cheng ◽  
Shangzhi Pan ◽  
Wei Xin ◽  
Liangjun Bai ◽  
...  

Defects in components such as insulators, pins, and counterweights in high-voltage transmission lines affect the stability of the power system. Small targets such as pins occupy a small proportion of unmanned aerial vehicle (UAV) inspection images of transmission lines and have poor feature representations, which results in a low defect detection rate and a high false positive rate. This paper proposes a transmission line pin defect detection algorithm based on an improved Faster R-CNN. First, pre-training weights with a higher degree of matching are obtained through transfer learning and applied to construct the defect detection model. Then, a region proposal network is used to extract features in the model. Defect detection results are obtained by regression calculation and classification of the regional features. The experimental results show that the accuracy of transmission line pin defect detection reaches 81.25%.


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Guangyong Gao ◽  
Zhao Feng ◽  
Tingting Han

Data authentication is an important part of wireless sensor networks (WSNs). Aiming at the problems of a high false positive rate and poor robustness in the group verification of existing reversible watermarking schemes in WSNs, this paper proposes a scheme using reversible watermarking technology to achieve data integrity authentication with high detection efficiency (DAHDE). The core of DAHDE is dynamic grouping and a double verification algorithm. Under the condition of satisfying the required group length, a synchronization point is used for dynamic grouping, and the double verification ensures that the grouping will not be confused. According to the closely related characteristics of adjacent data in WSNs, a new data item prediction method is designed based on the prediction-error expansion formula, and a flag check bit is added to the watermarked data during transmission to ensure the stability of grouping, by which fake synchronization points can be accurately identified. Moreover, the embedded data can be recovered exactly through the reversible digital watermarking algorithm. Analysis and experimental results show that, compared with previously known schemes, the proposed scheme avoids false positives, reduces computation cost, and offers stronger grouping robustness.
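Prediction-error expansion, the reversible primitive the scheme builds on, embeds one bit into each prediction error and allows exact recovery of the original data. A minimal sketch using the previous (watermarked) sample as the predictor; the actual DAHDE predictor exploits the correlation of adjacent WSN readings and is more elaborate:

```python
def pee_embed(samples, bits):
    """Embed one bit per sample (after the first) via prediction-error
    expansion: e = x - prev', e' = 2e + b, x' = prev' + e'."""
    out = [samples[0]]
    for x, b in zip(samples[1:], bits):
        e = x - out[-1]
        out.append(out[-1] + 2 * e + b)
    return out

def pee_extract(marked):
    """Recover both the embedded bits and the original samples exactly:
    b is the parity of the expanded error, e its floor-halving."""
    bits, orig = [], [marked[0]]
    for prev, cur in zip(marked, marked[1:]):
        e2 = cur - prev
        bits.append(e2 % 2)
        orig.append(prev + e2 // 2)
    return bits, orig
```

Because extraction inverts embedding exactly, a verifier can check the watermark bits for integrity and then restore the sensed values bit-for-bit, which is what "reversible" means here.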


Algorithms ◽  
2021 ◽  
Vol 14 (12) ◽  
pp. 368
Author(s):  
Yajing Zhang ◽  
Kai Wang ◽  
Jinghui Zhang

Considering the contradiction between limited node resources and high detection costs in mobile multimedia networks, an adaptive and lightweight abnormal node detection algorithm based on artificial immunity and game theory is proposed in order to balance the trade-off between network security and detection overhead. The algorithm can adapt to the highly dynamic mobile multimedia networking environment with a large number of heterogeneous nodes and multi-source big data. Specifically, the heterogeneity of nodes is handled through the non-specificity of the immune algorithm. A niche strategy is used to identify dangerous areas, and antibody division generates an antibody library that can be updated online, so as to realize dynamic detection of abnormal node behavior. Moreover, the priority of recovery for abnormal nodes is decided through a game between nodes without causing excessive resource consumption for security detection. The results of comparative experiments show that the proposed algorithm has a relatively high detection rate and a low false-positive rate, can effectively reduce time consumption, and has a good level of adaptability under the condition of dynamic nodes.


2016 ◽  
Vol 12 (S328) ◽  
pp. 69-76
Author(s):  
Adriana Valio

Abstract. Magnetic activity of stars manifests itself in the form of dark spots on the stellar surface. This in turn causes variations of a few percent in the star's light curve as it rotates. When an orbiting planet eclipses its host star, it may cross in front of one of these spots. In this case, a “bump” will be detected in the transit light curve. By fitting these spot signatures with a model, it is possible to determine the spots' physical properties such as size, temperature, location, magnetic field, and lifetime. Moreover, monitoring the spot longitudes provides estimates of the stellar rotation and differential rotation. For long time series of transits spanning multiple years, magnetic cycles can also be determined. This model has been applied successfully to CoRoT-2, CoRoT-4, CoRoT-5, CoRoT-6, CoRoT-8, CoRoT-18, Kepler-17, and Kepler-63.
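The origin of the "bump" is easy to see in a toy model: while in transit the planet blocks a fixed fraction of the stellar light, except where it crosses a spot, whose cooler, dimmer surface contributes less flux, so blocking it removes less light. A deliberately simplified sketch (real spot models like the one above fit spot size, position, and contrast on a limb-darkened disc; every parameter here is illustrative):

```python
def transit_with_spot(phases, depth=0.01, spot_phase=0.3,
                      spot_width=0.05, spot_contrast=0.5):
    """Toy transit light curve: constant depth while the planet crosses
    the stellar disc (transit phase 0..1), reduced where it occults a
    dark spot, which produces the characteristic in-transit 'bump'."""
    flux = []
    for p in phases:
        if 0.0 <= p <= 1.0:  # in transit
            on_spot = abs(p - spot_phase) < spot_width / 2
            flux.append(1.0 - depth * (spot_contrast if on_spot else 1.0))
        else:                # out of transit
            flux.append(1.0)
    return flux
```

Tracking at which transit phase the bump recurs from one transit to the next is what yields the spot longitudes, and hence the rotation measurements described above.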

