Szorstkość pokrycia terenu jako źródło błędu metody SfM zastosowanej do rekonstrukcji zasięgu pokrywy śnieżnej = Terrain roughness as a source of error with the SfM method applied to the reconstruction of snow cover extent

2020 ◽  
Vol 92 (3) ◽  
pp. 377-389
Author(s):  
Damian Szafert ◽  
Bartłomiej Miziński ◽  
Tomasz Niedzielski

A comparison between errors associated with snow-cover reconstruction, performed by processing aerial imagery acquired by a visible-light camera mounted on board unmanned aerial vehicles, on the one hand, and average terrain roughness, on the other, revealed a dependent relationship between these variables. A stronger correlation is noted for two of the studied test areas (Polana Izerska and Krobica, both located in SW Poland), as opposed to the remaining site (Drożyna, SW Poland). In particular, correlations are noticeable where the analysis is performed in moving windows. It is typical for terrain where the depth of snow cover is reconstructed with severe errors to reveal a high degree of roughness caused by single trees, clumps of trees or buildings. Ambiguous results are obtained for the Drożyna research field. While the character of the dependent relationship there seems consistent with results for the remaining sites, its strength is low. The lower values of the correlation coefficient were driven by observations for which errors were found to be high while values of the Topographic Ruggedness Index were at the same time low. This effect can be explained by reference to the specific nature of the reconstructed area, which has been much transformed by human activity. It proves difficult to reconstruct the depth of snow cover on roads properly, as these are either partially cleared of snow or characterised by its loss in the course of melting. A low thickness of snow cover is thus found to be a constraint when it comes to generating accurate reconstructions of snow depth. This finding is in agreement with what has been reported by other authors.
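To make the described comparison concrete, below is a minimal Python sketch (not the authors' code) of a moving-window analysis relating snow-depth reconstruction errors to terrain roughness expressed as the Topographic Ruggedness Index; the array names, the window size and the Riley et al. (1999) TRI formulation are assumptions.

```python
# Minimal sketch: moving-window correlation between |snow-depth error| and TRI.
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.stats import pearsonr

def tri(dem):
    """Riley et al. TRI: root of summed squared differences to the 8 neighbours."""
    out = np.zeros_like(dem, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(dem, dy, axis=0), dx, axis=1)
            out += (dem - shifted) ** 2
    return np.sqrt(out)

def window_correlation(error, ruggedness, win=25):
    """Correlate mean |error| with mean TRI aggregated in win x win moving windows."""
    mean_err = uniform_filter(np.abs(error), size=win)
    mean_tri = uniform_filter(ruggedness, size=win)
    r, p = pearsonr(mean_err.ravel(), mean_tri.ravel())
    return r, p

# dem: elevation model of the test site; snow_error: UAV-based snow depth minus reference
# r, p = window_correlation(snow_error, tri(dem))
```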

2019 ◽  
Vol 37 (3) ◽  
pp. 31
Author(s):  
Raquel Fernández González ◽  
Marcos Íñigo Pérez Pérez

The return of institutions to the main research agenda has highlighted the importance of rules in economic analysis. The New Institutional Economics has allowed a better understanding of case studies across different areas of knowledge, including the management of natural resources. In this article, the institutional analysis focuses on the maritime domain, where two large civil liability regimes for pollution coexist (OPA 90 and IMO), each in a different geographical area (the United States and Europe). A comparative analysis is therefore made between the two major regimes of civil liability assignment, applying them to the Prestige catastrophe. In this way, the allocation and distribution of responsibilities in the investigation and subsequent judicial process of the Prestige is compared with an alternative scenario in which the applicable compensation instruments are governed by the provisions of the Oil Pollution Act of 1990 (OPA 90), in order to establish a rigorous analysis of the effects that different rules can have in the same scenario. In the comparison established for the Prestige case, where responsibilities were resolved very slowly in a judicial process with high transaction costs, the application of rules governed by OPA 90 would not have involved such a high degree of imperfection. This is because, by applying the preponderance-of-the-evidence standard existing in OPA 90, there would be no mitigation for the presumed culprits. In addition, the agents held liable for the sinking would not be limited to the owner; operators or shipowners would also be responsible. Moreover, the amount of compensation would increase by including in the damage assessment personal damages, uncollected taxes and ecological damage in a broad sense, items not computable under the IMO regime.


2021 ◽  
Vol 14 ◽  
pp. 194008292110147
Author(s):  
Dipto Sarkar ◽  
Colin A. Chapman

The term ‘smart forest’ is not yet common, but the proliferation of sensors, algorithms, and technocentric thinking in conservation, as in most other aspects of our lives, suggests we are at the brink of this evolution. While there has been some critical discussion about the value of using smart technology in conservation, a holistic discussion about the broader technological, social, and economic interactions involved in using big data, sensors, artificial intelligence, and global corporations is largely missing. Here, we explore the pitfalls that are useful to consider as forests are gradually converted to technological sites of data production for optimized biodiversity conservation and are consequently incorporated into the digital economy. We consider who the enablers of the technologically enhanced forests are and how the gradual operationalization of smart forests will impact the traditional stakeholders of conservation. We also look at the implications of carpeting forests with sensors and the types of questions that will be encouraged. To contextualize our arguments, we provide examples from our work in Kibale National Park, Uganda, which hosts one of the longest continuously running research field stations in Africa.


2021 ◽  
Author(s):  
Mickaël Lalande ◽  
Martin Ménégoz ◽  
Gerhard Krinner

The High Mountains of Asia (HMA) region and the Tibetan Plateau (TP), with an average altitude of 4000 m, host the third largest reservoir of glaciers and snow after the two polar ice caps, and are at the origin of strong orographic precipitation. Climate studies over HMA are related to serious challenges concerning the exposure of human infrastructures to natural hazards and the water resources for agriculture, drinking water, and hydroelectricity on which several hundred million inhabitants of the Indian subcontinent depend. However, climate variables such as temperature, precipitation, and snow cover are poorly described by global climate models because their coarse resolution is not adapted to the rugged topography of this region. Since the first CMIP exercises, a cold model bias has been identified in this region; however, its attribution is not obvious and may differ from one model to another. Our study focuses on a multi-model comparison of the CMIP6 simulations used to investigate the climate variability in this area, in order to answer the following questions: (1) Are the biases in HMA reduced in the new generation of climate models? (2) Do the model biases impact the simulated climate trends? (3) What are the links between the model biases in temperature, precipitation, and snow cover extent? (4) Which climate trajectories can be projected in this area until 2100? An analysis of 27 models over 1979-2014 still shows a cold bias in near-surface air temperature over the HMA and TP, reaching an annual value of -2.0 °C (± 3.2 °C), associated with an overestimation of the relative snow cover extent of 53 % (± 62 %) and a relative excess of precipitation of 139 % (± 38 %), noting that the precipitation biases are uncertain because of the undercatch of solid precipitation in observations. Model biases and trends do not show any clear links, suggesting that biased models should not be excluded from trend and projection analyses, although non-linear effects related to lagged snow cover feedbacks could be expected. On average over 2081-2100, with respect to 1995-2014, for the scenarios SSP126, SSP245, SSP370, and SSP585, the 9 available models show, respectively, an increase in annual temperature of 1.9 °C (± 0.5 °C), 3.4 °C (± 0.7 °C), 5.2 °C (± 1.2 °C), and 6.6 °C (± 1.5 °C); a relative decrease in the snow cover extent of 10 % (± 4.1 %), 19 % (± 5 %), 29 % (± 8 %), and 35 % (± 9 %); and an increase in total precipitation of 9 % (± 5 %), 13 % (± 7 %), 19 % (± 11 %), and 27 % (± 13 %). Further analyses will be considered to investigate potential links between the biases at the surface and those at higher tropospheric levels, as well as with the topography. The models based on high resolution do not perform better than the coarse-gridded ones, suggesting that the race to high resolution should be considered a second priority after the development of more realistic physical parameterizations.
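As an illustration of how such a multi-model near-surface temperature bias could be computed, here is a minimal Python/xarray sketch (not the study's code); the file names, variable and coordinate names, the observational reference and the HMA bounding box are assumptions.

```python
# Minimal sketch: area-weighted 1979-2014 mean tas bias over an approximate HMA/TP box.
import numpy as np
import xarray as xr

MODEL_FILES = {"ModelA": "tas_ModelA_historical.nc",   # hypothetical paths
               "ModelB": "tas_ModelB_historical.nc"}
OBS_FILE = "tas_obs_reference.nc"                      # hypothetical observational reference

def hma_mean(path, var="tas"):
    """Area-weighted mean over 1979-2014 and a rough HMA/TP box (bounds are assumptions)."""
    da = xr.open_dataset(path)[var].sel(time=slice("1979", "2014"),
                                        lat=slice(25, 45), lon=slice(60, 105))
    weights = np.cos(np.deg2rad(da.lat))
    return float(da.weighted(weights).mean(("time", "lat", "lon")))

obs = hma_mean(OBS_FILE)
biases = {name: hma_mean(f) - obs for name, f in MODEL_FILES.items()}
print("multi-model mean bias: %.1f K (± %.1f K)"
      % (np.mean(list(biases.values())), np.std(list(biases.values()))))
```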


2013 ◽  
Vol 17 (10) ◽  
pp. 3921-3936 ◽  
Author(s):  
M. Ménégoz ◽  
H. Gallée ◽  
H. W. Jacobi

Abstract. We applied a Regional Climate Model (RCM) to simulate precipitation and snow cover over the Himalaya between March 2000 and December 2002. Due to its higher resolution, our model simulates a more realistic spatial variability of wind and precipitation than the reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF) used as lateral boundaries. In this region, we found very large discrepancies between the estimates of precipitation provided by the reanalysis, rain gauge networks, satellite observations, and our RCM simulation. Our model clearly underestimates precipitation at the foothills of the Himalaya and in its eastern part. However, our simulation provides a first estimate of liquid and solid precipitation in high-altitude areas, where satellite and rain gauge networks are not very reliable. During the two years of simulation, our model reproduces the snow cover extent and duration quite accurately in these areas. Both snow accumulation and snow cover duration differ widely along the Himalaya: snowfall can occur during the whole year in the western Himalaya, due to both the summer monsoon and mid-latitude low-pressure systems bringing moisture into this region. In the central Himalaya and on the Tibetan Plateau, a much more marked dry season occurs from October to March. Snow cover does not have a pronounced seasonal cycle in these regions, since it depends both on the quite variable duration of the monsoon and on the rare but possible occurrence of snowfall during the extra-monsoon period.
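A minimal Python sketch (not the authors' workflow) of one of the derived quantities, snow cover duration counted from daily model output; the file name, variable name and detection threshold are assumptions.

```python
# Minimal sketch: count, per grid cell and year, the days on which snow water
# equivalent exceeds a threshold in daily RCM output.
import xarray as xr

swe = xr.open_dataset("rcm_daily_swe.nc")["swe"]           # hypothetical daily SWE field (mm)
snow_present = swe > 5.0                                   # assumed detection threshold
duration = snow_present.groupby("time.year").sum("time")   # snow cover days per year
duration.to_netcdf("snow_cover_duration.nc")
```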


1956 ◽  
Vol 9 (4) ◽  
pp. 545 ◽  
Author(s):  
EG Bowen

It is reasonable to suppose that observations like those of cirrus cloud in the upper air and heavy falls of snow at relatively warm latitudes correspond to the presence of a large number of freezing nuclei in the atmosphere. A 300-year record of snow covering the ground at Tokyo and a 10-year record of cirrus cloud in Western Australia are examined and compared with one year's measurements of freezing nucleus concentration. The curves show a high degree of correlation, and all three tend to maximize on certain calendar dates.
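The calendar-date comparison could, in modern terms, be sketched as follows (an illustration, not Bowen's original procedure); the data layout and column names are assumptions.

```python
# Minimal sketch: build day-of-year composites of each record and correlate the curves.
import pandas as pd

def calendar_composite(df, value_col):
    """Mean of value_col for each calendar day, averaged over all available years."""
    return df.groupby(df.index.dayofyear)[value_col].mean()

# snow_df: daily 0/1 snow-on-ground record (Tokyo); cirrus_df: daily cirrus occurrence;
# nuclei_df: daily freezing nucleus concentration -- all indexed by date (hypothetical).
# snow_c = calendar_composite(snow_df, "snow")
# cirrus_c = calendar_composite(cirrus_df, "cirrus")
# print(snow_c.corr(cirrus_c))   # correlation of the two annual-cycle curves
```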


2020 ◽  
Vol 10 (5) ◽  
pp. 6187-6190
Author(s):  
A. S. Alshammari

The keyspace of a cryptographic system must be long enough to protect it from brute-force attacks. One-Time Pad (OTP) encryption is unconditionally secure because its truly random keystream is used only once. This paper proposes a new chaotic symmetric cryptosystem approach, comparable to the OTP. The proposed system utilizes two Lorenz generators, a main and an auxiliary one, where the aim of the second is to make one of the main Lorenz generator's parameters vary continually with time in a chaotic manner. This technique was built on digitizing two Lorenz chaotic models to increase the security level. A scrambling scheme was developed, and the binary stream of the Lorenz stream cipher successfully passed the NIST randomness tests. The cryptosystem showed a high degree of security, as it had a keyspace of 2^576, and it was compared with existing symmetric key cryptography systems, such as DES, 3DES, AES, Blowfish, and OTP.
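A minimal Python sketch of the idea (not the paper's implementation): an auxiliary Lorenz generator chaotically modulates a parameter of the main generator, whose thresholded output forms a keystream XORed with the plaintext; the integration step, the modulated parameter (rho) and the digitization rule are assumptions.

```python
# Minimal sketch: two coupled Lorenz systems generating a binary keystream.
import numpy as np

def lorenz_step(state, sigma, rho, beta, dt=0.001):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dx * dt, y + dy * dt, z + dz * dt])

def keystream(n_bits, main0, aux0):
    main, aux = np.array(main0, float), np.array(aux0, float)
    bits = []
    while len(bits) < n_bits:
        aux = lorenz_step(aux, 10.0, 28.0, 8.0 / 3.0)
        rho_main = 28.0 + 0.5 * np.tanh(aux[0])        # auxiliary varies the main rho
        main = lorenz_step(main, 10.0, rho_main, 8.0 / 3.0)
        bits.append(1 if main[0] > 0 else 0)           # threshold digitization
    return np.array(bits, dtype=np.uint8)

def xor_encrypt(plain_bits, main0, aux0):
    ks = keystream(len(plain_bits), main0, aux0)       # secret initial conditions act as the key
    return plain_bits ^ ks

# cipher = xor_encrypt(np.array([1, 0, 1, 1], dtype=np.uint8), (1., 1., 1.), (.5, .5, .5))
```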


Author(s):  
Weikang Qian ◽  
John Backes ◽  
Marc D. Riedel

Emerging technologies for nanoscale computation, such as self-assembled nanowire arrays, present specific challenges for logic synthesis. On the one hand, they provide an unprecedented density of bits with a high degree of parallelism. On the other hand, they are characterized by high defect rates. They also often exhibit inherent randomness in the interconnects due to the stochastic nature of self-assembly. We describe a general method for synthesizing logic that exploits both the parallelism and the random effects. Our approach is based on stochastic computation with parallel bit streams. Circuits are synthesized through functional decomposition with symbolic data structures called multiplicative binary moment diagrams. Synthesis produces designs with randomized parallel components (AND operations and multiplexing) that are readily implemented in nanowire crossbar arrays. Synthesis results for benchmark circuits show that our technique maps circuit designs onto nanowire arrays effectively.
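As an illustration of the underlying stochastic-computation idea (not the authors' synthesis flow), the following Python sketch encodes values as the probability of a 1 in a bit stream, so that multiplication becomes a bitwise AND and scaled addition becomes multiplexing; the stream length is an assumption.

```python
# Minimal sketch: stochastic computation with Bernoulli bit streams.
import numpy as np

rng = np.random.default_rng(0)
N = 1 << 16                                   # stream length (accuracy grows with N)

def encode(p):
    """Bit stream whose mean encodes the value p in [0, 1]."""
    return (rng.random(N) < p).astype(np.uint8)

def decode(stream):
    return stream.mean()

a, b = encode(0.3), encode(0.8)
product = a & b                               # AND gate: P(a AND b) = 0.3 * 0.8
select = encode(0.5)
scaled_sum = np.where(select == 1, a, b)      # MUX: 0.5*a + 0.5*b
print(decode(product), decode(scaled_sum))    # approx 0.24 and 0.55
```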

