Fog and Low Stratus Obstruction of Wind Lidar Observations in Germany—A Remote Sensing-Based Data Set for Wind Energy Planning

Energies ◽  
2020 ◽  
Vol 13 (15) ◽  
pp. 3859
Author(s):  
Benjamin Rösner ◽  
Sebastian Egli ◽  
Boris Thies ◽  
Tina Beyer ◽  
Doron Callies ◽  
...  

Coherent wind Doppler lidar (CWDL) is a cost-effective way to estimate wind power potential at hub height without the need to build a meteorological tower. However, fog and low stratus (FLS) can have a negative impact on the availability of lidar measurements. Advance information about such reductions in wind data availability at a prospective lidar deployment site is therefore beneficial when planning a measurement strategy. In this paper, we show that availability reductions by FLS can be estimated by comparing time series of lidar measurements, conducted with WindCubes v1 and v2, with time series of cloud base altitude (CBA) derived from satellite data. This enables us to compute average maps (2006–2017) of estimated availability, including FLS-induced data losses, for Germany, which can be used for planning purposes. These maps show that the lower mountain ranges and the Alpine regions in Germany often reach the critical data availability threshold of 80% or below. Especially during winter, special care must be taken when using lidar in southern and central regions of Germany. If only shorter lidar campaigns are planned (3–6 months), the representativeness of weather types should be considered as well, because in individual years and under persistent weather types, lowland areas might also be temporarily affected by higher rates of data loss. This is shown by different examples, e.g., radiation fog under anticyclonic weather types.
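As a rough illustration of the comparison behind these maps: given a satellite-derived CBA time series and the lidar's measurement heights, the fraction of time the cloud base lies below a range gate gives a first-order estimate of the FLS-induced availability loss. This is a minimal sketch under that assumption; the function name, variables, and the simple blocking rule are illustrative, not the paper's algorithm.

```python
import numpy as np
import pandas as pd

def estimated_availability(cba: pd.Series, range_gates: np.ndarray,
                           base_availability: float = 1.0) -> pd.Series:
    """Fraction of time each lidar range gate is unobstructed by FLS.

    cba: satellite-derived cloud base altitude [m a.g.l.], NaN = cloud-free.
    range_gates: lidar measurement heights [m a.g.l.].
    """
    availability = []
    for h in range_gates:
        # A gate is assumed blocked whenever the FLS cloud base lies below it;
        # NaN (cloud-free) compares False and thus counts as unblocked.
        blocked_fraction = (cba < h).mean()
        availability.append(base_availability * (1.0 - blocked_fraction))
    return pd.Series(availability, index=range_gates, name="availability")
```

Applied per grid cell to a 2006–2017 CBA record, the same computation would yield availability maps of the kind described above.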

2016 ◽  
Vol 2016 ◽  
pp. 1-13 ◽  
Author(s):  
Xiuguo Wu

Replication technology is commonly used to improve data availability and reduce data access latency in cloud storage systems by providing users with different replicas of the same service. Most current approaches focus largely on improving system performance and neglect management cost when deciding the number of replicas and their storage locations. This places a great financial burden on cloud users, because the cost of replica storage and consistency maintenance grows with the number of new replicas in a pay-as-you-go paradigm. In this paper, towards achieving an approximate minimum data-set management cost in a practical manner, we propose a cost-effective replica placement strategy under the premise that system performance meets requirements. Firstly, we design data-set management cost models, including storage cost and transfer cost. Secondly, we use the access frequency and the average response time to decide which data set should be replicated. Then, a method for calculating the number of replicas and their storage locations with minimum management cost is proposed, based on a location-problem graph. Both theoretical analysis and simulations show that the proposed strategy offers lower management cost with fewer replicas.
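The two-part cost model and the replication trigger can be illustrated with a short sketch. The unit prices, synchronization ratio, and decision thresholds below are assumptions for illustration; the paper derives its actual benchmark from the location-problem graph.

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    size_gb: float          # size of one replica
    access_freq: float      # requests per hour
    avg_response_ms: float  # observed mean response time

# Illustrative unit prices (assumptions, not from the paper).
STORAGE_PRICE = 0.02   # $ per GB per hour
TRANSFER_PRICE = 0.09  # $ per GB moved between nodes

def management_cost(ds: DataSet, n_replicas: int, sync_ratio: float = 0.1) -> float:
    """Storage cost for all replicas plus transfer cost of keeping them consistent."""
    storage = n_replicas * ds.size_gb * STORAGE_PRICE
    # Each update is assumed to ship sync_ratio of the data set to every other replica.
    transfer = (n_replicas - 1) * ds.size_gb * sync_ratio * TRANSFER_PRICE
    return storage + transfer

def should_replicate(ds: DataSet, freq_threshold: float, latency_sla_ms: float) -> bool:
    """Replicate only hot data sets that currently violate the response-time target."""
    return ds.access_freq > freq_threshold and ds.avg_response_ms > latency_sla_ms
```

Under such a model, adding replicas lowers response time but raises both cost terms linearly, which is why the paper searches for the placement that just satisfies the performance requirement.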


2018 ◽  
Vol 10 (3) ◽  
pp. 1451-1456 ◽  
Author(s):  
Marion Maturilli ◽  
Kerstin Ebell

Abstract. Clouds are a key factor in the Arctic amplification of global warming, but their actual appearance and distribution are still afflicted by large uncertainty. On the Arctic-wide scale, large discrepancies are found between the various reanalyses and satellite products. Although ground-based remote sensing observations are limited to point measurements, they have the advantage of providing extended time series of vertically resolved cloud properties. Here, we present a 25-year data record of cloud base height measured by ceilometer at Ny-Ålesund, Svalbard, in the Arctic. We describe the three sub-periods in which different instruments contributed to the data set and show examples of potential application areas. Linked to cyclonic activity, the cloud base height provides essential information for interpreting the surface radiation balance and contributes to the understanding of meteorological processes. Furthermore, it is a useful auxiliary component for the analysis of advanced technologies that provide insight into cloud microphysical properties, such as cloud radar. The long-term time series also allows derivation of an annual cycle of cloud occurrence frequency, revealing more frequent cloud cover in summer and the lowest cloud occurrence in April. However, as the use of different ceilometer instruments over the years potentially introduced inhomogeneities into the data record, any long-term trend analysis should be avoided. The Ny-Ålesund cloud base height data from August 1992 to July 2017 are provided at a high temporal resolution of 5 min before July 1998 and 1 min thereafter at the PANGAEA repository (https://doi.org/10.1594/PANGAEA.880300).
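For illustration, the annual cycle of cloud occurrence can be derived from such a record in a few lines. This is a minimal sketch assuming a timestamp-indexed series in which cloud-free returns are stored as NaN; it is not the authors' processing chain.

```python
import pandas as pd

def annual_cycle(cbh: pd.Series) -> pd.Series:
    """Monthly cloud occurrence frequency from a ceilometer time series.

    cbh: cloud base height [m], indexed by timestamp; NaN marks cloud-free returns.
    Returns the fraction of samples with a detected cloud base, per calendar month.
    """
    detected = cbh.notna()  # True whenever a cloud base was detected
    return detected.groupby(detected.index.month).mean()
```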


Author(s):  
Diaz Juan Navia ◽  
Bolaños Nancy Villegas ◽  
Igor Malikov ◽  
...  

Sea surface temperature anomalies (SSTA) at four coastal hydrographic stations in the Colombian Pacific Ocean were analyzed. The selected hydrographic stations were Tumaco (1°48'N-78°45'W), Gorgona Island (2°58'N-78°11'W), Solano Bay (6°13'N-77°24'W) and Malpelo Island (4°0'N-81°36'W). SSTA time series for 1960-2015 were calculated from monthly sea surface temperatures obtained from the International Comprehensive Ocean Atmosphere Data Set (ICOADS). The SSTA time series were compared with the Oceanic Niño Index (ONI), the Pacific Decadal Oscillation index (PDO), the Arctic Oscillation index (AO) and sunspot number (associated with solar activity). The absolute SSTA minimum occurred at Tumaco (-3.93°C) in March 2009, at Gorgona (-3.71°C) in October 2007, at Solano Bay (-4.23°C) in April 2014 and at Malpelo (-4.21°C) in December 2005. The absolute SSTA maximum was observed at Tumaco (3.45°C) in January 2002, at Gorgona (5.01°C) in July 1978, at Solano Bay (5.27°C) in March 1998 and at Malpelo (3.64°C) in July 2015. A high correlation between SSTA and ONI over large parts of the study period was identified, followed by a good correlation with the PDO. The AO and SSTA showed an inverse relationship in some periods, and the solar cycle appeared to modulate SSTA behavior at the selected stations. It was determined that extreme SST values are related to the analyzed large-scale oscillations.
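A minimal sketch of the anomaly computation, assuming a monthly, timestamp-indexed SST series. Subtracting each calendar month's long-term mean is the standard definition of a monthly anomaly, not necessarily the authors' exact procedure.

```python
import pandas as pd

def sst_anomalies(sst: pd.Series) -> pd.Series:
    """Monthly SST anomalies: subtract each calendar month's long-term mean.

    sst: monthly mean SST [°C], indexed by timestamp (e.g. 1960-2015).
    """
    climatology = sst.groupby(sst.index.month).transform("mean")
    return sst - climatology
```

The resulting anomaly series can then be correlated directly with climate indices such as ONI, PDO, or AO.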


2018 ◽  
Vol 32 (2) ◽  
pp. 103-119
Author(s):  
Colleen M. Boland ◽  
Chris E. Hogan ◽  
Marilyn F. Johnson

SYNOPSIS Mandatory existence disclosure rules require an organization to disclose a policy's existence, but not its content. We examine policy adoption frequencies in the year immediately after the IRS required mandatory existence disclosure by nonprofits of various governance policies. We also examine adoption frequencies in the year of the subsequent change from mandatory existence disclosure to a disclose-and-explain regime that required supplemental disclosures about the content and implementation of conflict of interest policies. Our results suggest that in areas where there is unclear regulatory authority, mandatory existence disclosure is an effective and low-cost regulatory device for encouraging the adoption of policies desired by regulators, provided those policies are cost-effective for regulated firms to implement. In addition, we find that disclose-and-explain regulatory regimes provide stronger incentives for policy adoption than do mandatory existence disclosure regimes and also discourage “check the box” behavior. Future research should examine the impact of mandatory existence disclosure rules in the year that the regulation is implemented. Data Availability: Data are available from sources cited in the text.


2020 ◽  
Vol 47 (3) ◽  
pp. 547-560 ◽  
Author(s):  
Darush Yazdanfar ◽  
Peter Öhman

Purpose: The purpose of this study is to empirically investigate determinants of financial distress among small and medium-sized enterprises (SMEs) during the global financial crisis and post-crisis periods.
Design/methodology/approach: Several statistical methods, including multiple binary logistic regression, were used to analyse a longitudinal cross-sectional panel data set of 3,865 Swedish SMEs operating in five industries over the 2008–2015 period.
Findings: The results suggest that financial distress is influenced by macroeconomic conditions (i.e. the global financial crisis) and, in particular, by various firm-specific characteristics (i.e. performance, financial leverage and financial distress in the previous year). However, firm size and industry affiliation have no significant relationship with financial distress.
Research limitations: Due to data availability, this study is limited to a sample of Swedish SMEs in five industries covering eight years. Further research could examine the generalizability of these findings by investigating other firms operating in other industries and other countries.
Originality/value: This study is the first to examine determinants of financial distress among SMEs operating in Sweden using data from a large-scale longitudinal cross-sectional database.
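As a sketch of the kind of model described, a binary logistic regression on firm-year panel data might look as follows. The file name, column names, and exact specification are assumptions for illustration, not the paper's.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per firm-year; 'distress' is a 0/1 flag (all columns are hypothetical).
df = pd.read_csv("smes_2008_2015.csv")

model = smf.logit(
    "distress ~ crisis + performance + leverage + distress_lag + size + C(industry)",
    data=df,
).fit()

# Signs and p-values of the coefficients indicate which determinants matter.
print(model.summary())
```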


2012 ◽  
Vol 197 ◽  
pp. 271-277
Author(s):  
Zhu Ping Gong

The small data set approach is used for estimating the largest Lyapunov exponent (LLE). First, the mean-period drawback of the small data set method was corrected. On this basis, the LLEs of the daily qualified-rate time series of HZ, an electronics manufacturing enterprise, were estimated; all LLEs were positive, indicating that this time series is chaotic and that the corresponding production process is a chaotic process. The variance of the LLEs reveals the struggle between the divergent nature of the quality system and the quality control effort. The LLEs showed a sharp increase as the quality level worsened, coinciding with the company shutdown. HZ's daily qualified rate, a chaotic time series, demonstrates the short-run predictability of the quality system.
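A compact sketch of the small data set (Rosenstein) method follows, including the mean-period exclusion that the correction concerns: nearest neighbours must be separated in time by more than the mean period. Parameters are illustrative, and the O(n²) distance matrix is only suitable for short series.

```python
import numpy as np

def rosenstein_lle(x: np.ndarray, dim: int = 5, tau: int = 1,
                   mean_period: int = 10, n_steps: int = 30, dt: float = 1.0) -> float:
    """Largest Lyapunov exponent via the 'small data set' method (a sketch)."""
    # Delay embedding: m points in dim-dimensional reconstructed phase space.
    m = len(x) - (dim - 1) * tau
    emb = np.array([x[i:i + m] for i in range(0, dim * tau, tau)]).T  # (m, dim)

    # Nearest neighbour of each point, excluding temporally close points
    # (the mean-period correction).
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    for i in range(m):
        lo, hi = max(0, i - mean_period), min(m, i + mean_period + 1)
        dists[i, lo:hi] = np.inf
    nn = np.argmin(dists, axis=1)

    # Average log divergence of neighbour pairs over n_steps
    # (n_steps must stay well below the series length).
    div = []
    for k in range(1, n_steps + 1):
        valid = np.where((np.arange(m) + k < m) & (nn + k < m))[0]
        d = np.linalg.norm(emb[valid + k] - emb[nn[valid] + k], axis=1)
        d = d[d > 0]
        div.append(np.mean(np.log(d)))

    # The LLE is the slope of the divergence curve versus time.
    t = np.arange(1, n_steps + 1) * dt
    slope, _ = np.polyfit(t, np.array(div), 1)
    return slope
```

A positive slope indicates exponential divergence of nearby trajectories, i.e. chaos, which is the criterion applied to the qualified-rate series above.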


2021 ◽  
Vol 10 (15) ◽  
pp. 3251
Author(s):  
Juan J. Gagliardino ◽  
Martin R. Salazar ◽  
Walter G. Espeche ◽  
Paula E. Tolosa Chapasian ◽  
Daniela Gomez Gomez Garizoain ◽  
...  

Aims: To evaluate arterial stiffness indicators in people with prediabetes (PreD) and their possible pathogenesis. Materials and methods: Pulse wave velocity (PWV) was measured in 208 people with FINDRISC ≥ 13 (57 ± 8 years old, 68.7% women), who were thereafter divided into those having either normal glucose tolerance (NGT) or PreD. Within each subgroup we also identified those with/without insulin resistance (IR), measured by the triglyceride/HDL-c ratio (normal cut-off values previously established in our population). Clinical and metabolic data were collected for all participants. PWV was compared between subgroups using an independent t-test. Results: Women and men had comparable clinical and metabolic characteristics, with obesity (BMI ≥ 30) and antihypertensive-statin treatment; almost half had either NGT or PreD. Whereas 48% of NGT people presented IR (abnormally high TG/HDL-c ratio), 52% had PreD. PWV was significantly higher only in those with a complete picture of metabolic syndrome (MS). Conclusions: Since PWV was significantly impaired in people with a complete picture of MS, clinicians must carefully search for an early diagnosis of this condition and prescribe a healthy lifestyle to prevent the development/progression of CVD. This proactive attitude would provide a cost-effective preventive strategy to avoid CVD's negative impact on patients' quality of life and on health systems due to higher care costs.


AI ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 48-70
Author(s):  
Wei Ming Tan ◽  
T. Hui Teo

Prognostic techniques attempt to predict the Remaining Useful Life (RUL) of a subsystem or a component. Such techniques often use sensor data that are periodically measured and recorded as a time series data set. These multivariate data sets exhibit complex, non-linear inter-dependencies across time steps and between sensors. Many existing prognostic algorithms have started to explore Deep Neural Networks (DNNs) and their effectiveness in the field. Although Deep Learning (DL) techniques outperform traditional prognostic algorithms, the networks are generally complex to deploy or train. This paper proposes a Multi-variable Time Series (MTS) focused approach to prognostics that implements a lightweight Convolutional Neural Network (CNN) with an attention mechanism. The convolution filters extract abstract temporal patterns from the multiple time series, while the attention mechanism reviews the information across the time axis and selects the relevant information. The results suggest that the proposed method not only produces superior RUL estimation accuracy but also trains many times faster than reported works. Deployment on a lightweight hardware platform further demonstrates the network's advantages: it is not only more compact but also more efficient in resource-restricted environments.
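A minimal PyTorch sketch of the general pattern the abstract describes: 1D convolutions extract temporal features, and a softmax attention over the time axis weights the time steps before a regression head predicts the scalar RUL. This is an illustrative architecture under those assumptions, not the paper's exact network; layer sizes and names are invented.

```python
import torch
import torch.nn as nn

class CNNAttentionRUL(nn.Module):
    """Lightweight 1D CNN with temporal attention for RUL estimation (a sketch).

    Input shape: (batch, n_sensors, n_timesteps).
    """

    def __init__(self, n_sensors: int, hidden: int = 32):
        super().__init__()
        # Convolution filters extract abstract temporal patterns per channel.
        self.conv = nn.Sequential(
            nn.Conv1d(n_sensors, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.attn = nn.Linear(hidden, 1)  # one attention score per time step
        self.head = nn.Linear(hidden, 1)  # scalar RUL output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x).transpose(1, 2)        # (batch, time, hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention over the time axis
        context = (w * h).sum(dim=1)            # weighted summary of the sequence
        return self.head(context).squeeze(-1)

# Example: 14 sensors, windows of 30 time steps.
model = CNNAttentionRUL(n_sensors=14)
rul = model(torch.randn(8, 14, 30))  # -> tensor of shape (8,)
```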


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 1859.2-1859
Author(s):  
L. Zerweck ◽  
U. Henkemeier ◽  
P. H. Nguyen ◽  
T. Rossmanith ◽  
A. Pippow ◽  
...  

Background: Psoriasis (Pso) is one of the most common chronic inflammatory skin diseases in Europe. Psoriatic arthritis (PsA) is closely associated with Pso, and the skin manifestation usually appears years before PsA-related symptoms emerge. Since up to 30% of Pso patients develop PsA, biomarkers for its early detection are of major importance. In early PsA, changes in synovial vascularisation appear first, so imaging biomarkers that detect changes in vascularisation might be useful for early detection of musculoskeletal disease. Fluorescence-optical imaging (FOI) is a new method to detect changes in microvascularisation of the hands. Each data set collected by the FOI system contains 360 images representing the time progression of the indocyanine green (ICG) distribution.
Objectives: To evaluate a reader-independent assessment method for evaluation of FOI in patients with Pso and PsA.
Methods: A prospective study including patients with dermatologically confirmed skin Pso was performed. 411 patients without a PsA diagnosis but at potential risk for its development were included from German dermatology units. Clinical examination (CE) was performed by a qualified rheumatologist. For a reader-independent evaluation of the FOI images, an objective joint-based scoring method was developed (a sketch of this scoring follows the abstract). The joint areas are defined by image segmentation and scored based on generated heatmaps. To calculate a heatmap indicating conspicuous joints from a data set of 360 images, each pixel is converted to a time series of 360 values, from which three independent features are extracted: amplitude, average value and maximal slope. Thus, each pixel is reduced to three feature values. After the three features are determined for each pixel, k-means clustering is performed on each feature, with the number of centroids (k) set to 3, 5, 7 and 9. This yields 12 heatmaps (3 features × 4 values of k) and hence 12 scores for each joint. The clusters are sorted by their centroid value and coloured according to a predefined heatmap colour palette. To score each joint, the cluster assignments of the pixels in the segmented joint area are summed and normalized by the number of pixels in the area and by k.
Results: 271 of the patients were investigated with the newly developed method and compared with the CE scoring. 6426 joints were labeled as healthy, whereas 1162 joints were labeled as swollen, tender or both. The results over all investigated patients for k = 9 are summarized in Table 1. Every average and median score for healthy joints is lower than the corresponding score for affected joints.
Table 1. Resulting scores for k = 9 for all 271 patients.
Statistic   Amplitude (healthy / affected)   Mean (healthy / affected)   Slope (healthy / affected)
Average     0.503 / 0.528                    0.486 / 0.509               0.395 / 0.414
Median      0.496 / 0.532                    0.482 / 0.505               0.389 / 0.415
Conclusion: FOI is an innovative method that detects early changes in vascularisation of the hands, so it can be useful for early detection of arthritis, especially in risk populations such as Pso patients. The results of the objective scoring method show that a clear distinction between healthy and affected joints is possible with the average scores as well as the median values. However, considering the range of the scores, the overlap between healthy and affected is not negligible. Thus, the current scoring system can be used as an indicator but not as a single classification marker.
Nevertheless, the research at hand has shown the expected outcome and motivates further development of the heatmap approach.
Disclosure of Interests: Lukas Zerweck: None declared, Ulf Henkemeier: None declared, Phuong-Ha Nguyen: None declared, Tanja Rossmanith Grant/research support from: Janssen, BMS, LEO, Pfizer, Andreas Pippow: None declared, Harald Burkhardt Grant/research support from: Pfizer, Roche, Abbvie, Consultant of: Sanofi, Pfizer, Roche, Abbvie, Boehringer Ingelheim, UCB, Eli Lilly, Chugai, Bristol Myers Squibb, Janssen, and Novartis, Speakers bureau: Sanofi, Pfizer, Roche, Abbvie, Boehringer Ingelheim, UCB, Eli Lilly, Chugai, Bristol Myers Squibb, Janssen, and Novartis, Frank Behrens Grant/research support from: Pfizer, Janssen, Chugai, Celgene, Lilly and Roche, Consultant of: Pfizer, AbbVie, Sanofi, Lilly, Novartis, Genzyme, Boehringer, Janssen, MSD, Celgene, Roche and Chugai, Michaela Köhm Grant/research support from: Pfizer, Janssen, BMS, LEO, Consultant of: BMS, Pfizer, Speakers bureau: Pfizer, BMS, Janssen, Novartis
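The per-joint scoring described in the Methods can be sketched with numpy and scikit-learn. Function names, the array layout, and the final averaging over the three features are illustrative assumptions; the study keeps 12 separate scores per joint (3 features × 4 values of k) rather than averaging them.

```python
import numpy as np
from sklearn.cluster import KMeans

def pixel_features(stack: np.ndarray) -> np.ndarray:
    """Reduce each pixel's ICG time series to three features.

    stack: (360, H, W) fluorescence image sequence.
    Returns (H*W, 3): amplitude, average value, maximal slope per pixel.
    """
    t, h, w = stack.shape
    series = stack.reshape(t, -1)                        # one column per pixel
    amplitude = series.max(axis=0) - series.min(axis=0)
    average = series.mean(axis=0)
    max_slope = np.diff(series, axis=0).max(axis=0)
    return np.stack([amplitude, average, max_slope], axis=1)

def joint_score(stack: np.ndarray, joint_mask: np.ndarray, k: int = 9) -> float:
    """Score one joint: cluster each feature with k-means, sum the sorted
    cluster ranks of the joint's pixels, normalize by pixel count and k."""
    feats = pixel_features(stack)
    score = 0.0
    for f in feats.T:                                    # one k-means per feature
        km = KMeans(n_clusters=k, n_init=10).fit(f.reshape(-1, 1))
        order = np.argsort(km.cluster_centers_.ravel())  # sort clusters by centroid
        rank = np.empty(k, dtype=int)
        rank[order] = np.arange(k)
        labels = rank[km.labels_].reshape(stack.shape[1:])
        score += labels[joint_mask].sum() / (joint_mask.sum() * k)
    return score / feats.shape[1]  # averaging over features is a simplification
```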


2021 ◽  
pp. 089443932098382
Author(s):  
Jildau Borwell ◽  
Jurjen Jansen ◽  
Wouter Stol

While criminality is digitizing, a theory-based understanding of the impact of cybercrime on victims is lacking. Therefore, this study addresses the psychological and financial impact of cybercrime on victims, applying the shattered assumptions theory (SAT) to predict that impact. A secondary analysis was performed on a representative data set of Dutch citizens (N = 33,702), exploring the psychological and financial impact for different groups of cybercrime victims. The results showed a higher negative impact on emotional well-being for victims of person-centered cybercrime, victims for whom the offender was an acquaintance, and victims whose financial loss was not compensated, and a lower negative impact on emotional well-being for victims with a higher income. The study led to novel scientific insights and showed the applicability of the SAT for developing hypotheses about the impact of cybercrime victimization. In this study, most hypotheses had to be rejected, leading to the conclusion that more work has to be done to test the applicability of the SAT in the field of cybercrime. Furthermore, policy implications were identified concerning the prioritization of and approach to specific cybercrimes, the treatment of victims, and financial loss compensation.

