Image segmentation and classification of Landsat Thematic Mapper data using a sampling approach for forest cover assessment

This article is one of a selection of papers from Extending Forest Inventory and Monitoring over Space and Time.

2011 ◽  
Vol 41 (1) ◽  
pp. 35-43 ◽  
Author(s):  
Yasumasa Hirata ◽  
Tomoaki Takahashi

Remote sensing surveys for estimating forest cover may be divided into two approaches: wall-to-wall and sampling. Sampling approaches offer a practical alternative to wall-to-wall mapping, but estimates of forest cover may be affected by the sampling rate over the estimation area. This study aimed to obtain stable estimates of forest cover from satellite data using object-oriented classification at the national level. We investigated a suitable value for the scale parameter in object-oriented classification using eCognition software to identify land cover types, and we evaluated the sampling rate for estimating forest cover at the national level. We applied object-oriented classification with eight different scale parameters to Landsat data for a set of forty-six 10 km × 10 km sampling tiles centered at each degree of latitude and longitude in Japan. A scale parameter of 10 or less was found suitable for obtaining objects with areas of about 5 ha. Overall classification accuracy was greater than 75% and was greatest when the scale parameter was between 6 and 10. We then analyzed the entire land area of Japan using 10 km × 10 km tiles to evaluate the optimum sampling rate for estimating forest cover. A sampling rate greater than 20% was required to estimate forest cover in Japan stably.
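The sampling-rate question can be illustrated with a minimal sketch: given per-tile forest-cover fractions, repeated subsampling at different rates shows how the spread of the national estimate shrinks as the rate grows. All numbers and the `sampled_estimate` helper are hypothetical, not the study's data or method.

```python
import random
import statistics

# Hypothetical per-tile forest-cover fractions for a national tile grid
# (illustrative values only; the study used 10 km x 10 km Landsat tiles).
random.seed(0)
tiles = [random.betavariate(5, 2) for _ in range(1000)]  # mean cover ~0.7

def sampled_estimate(tiles, rate, rng):
    """Estimate national forest cover from a random sample of tiles."""
    n = max(1, int(len(tiles) * rate))
    sample = rng.sample(tiles, n)
    return statistics.mean(sample)

# Stability check: spread of repeated estimates at each sampling rate.
for rate in (0.05, 0.10, 0.20, 0.40):
    rng = random.Random(42)
    estimates = [sampled_estimate(tiles, rate, rng) for _ in range(200)]
    print(f"rate={rate:.2f}  mean={statistics.mean(estimates):.3f}  "
          f"sd={statistics.stdev(estimates):.4f}")
```

In such a simulation the standard deviation of the estimate falls roughly with the square root of the number of sampled tiles, which is why a minimum sampling rate (the study finds about 20%) is needed for stable national estimates.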

2021 ◽  
Vol 87 (7) ◽  
pp. 503-511
Author(s):  
Lei Zhang ◽  
Hongchao Liu ◽  
Xiaosong Li ◽  
Xinyu Qian

Image segmentation is a critical procedure in object-based identification and classification of remote sensing data. However, selecting an optimal scale parameter presents a challenge, given complex landscapes and uncertain feature changes. This study proposes a local optimal segmentation approach that considers both intersegment heterogeneity and intrasegment homogeneity, uses the standard deviation and the local Moran's index to find the optimal segment at each location across different scale parameters, and combines the optimal segments into a single layer. Segment optimality is assessed against high-spatial-resolution images. Results show that our approach outperforms, and generates less error than, the global optimal segmentation approach. Variation in land cover types and in intrasegment homogeneity leads segments to match geo-objects at different scales. Local optimal segmentation is sensitive to land cover discrepancies and performs well on cross-scale segmentation.
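The scale-selection idea can be sketched as follows: for each candidate scale parameter, intrasegment homogeneity (segment standard deviation) and intersegment heterogeneity (local Moran's I) are normalized and summed, and the scale with the lowest combined score is chosen. The per-scale statistics below are invented, and `optimal_scale` is a hypothetical helper, not the authors' implementation.

```python
def normalize(values):
    """Rescale a list of statistics to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def optimal_scale(scales, seg_sd, morans_i):
    """Pick the scale minimizing intrasegment SD + intersegment Moran's I
    (both normalized): low SD means homogeneous segments, low Moran's I
    means neighbouring segments are dissimilar."""
    score = [s + m for s, m in zip(normalize(seg_sd), normalize(morans_i))]
    return scales[score.index(min(score))]

# Hypothetical per-scale statistics for one local region.
scales   = [10, 20, 30, 40, 50]
seg_sd   = [4.1, 5.0, 6.2, 7.9, 9.5]       # grows as segments merge
morans_i = [0.62, 0.41, 0.28, 0.33, 0.45]  # high when over-/under-segmented
print(optimal_scale(scales, seg_sd, morans_i))  # → 30
```

Applying this selection per local region, rather than once globally, is what makes the approach "local": each region keeps the scale at which its segments best match the underlying geo-objects.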


2015 ◽  
Vol 7 (3) ◽  
pp. 23-38 ◽  
Author(s):  
Ippei Harada ◽  
Keitarou Hara ◽  
Mizuki Tomita ◽  
Kevin Short ◽  
Jonggeol Park

Abstract Japan, with over 75% forest cover, is one of the most heavily forested countries in the world. Various types of climax forest are distributed according to latitude and altitude. At the same time, human intervention in Japan has historically been intensive, and many forest habitats show the influence of various levels of disturbance. Furthermore, Japanese landscapes are changing rapidly, and a system of efficient monitoring is needed. The aim of this research was to identify major historical trends in Japanese landscape change and to develop a system for identifying and monitoring patterns of landscape change at the national level. To provide a base for comparison, Warmth Index (WI) climatic data were digitized and utilized to map potential climax vegetation for all of Japan. Extant Land Use Information System (LUIS) data were then modified and digitized to generate national-level Land Use/Land Cover (LU/LC) distribution maps for 1900, 1950 and 1985. In addition, MODIS data for 2001 acquired by the Tokyo University of Information Sciences were utilized for remote LU/LC classification using an unsupervised method on multi-temporal composite data. Eight classification categories were established using the ISODATA (cluster analysis) method: alpine plant communities, evergreen coniferous forest, evergreen broad-leaved forest, deciduous broad-leaved forest, mixed forest, arable land (irrigated rice paddy, non-irrigated, grassland), urban area, and river and marsh. The results of the LUIS analyses and MODIS classifications were interpreted in terms of a Landscape Transformation Sere (LTS) model, which assumes that under increasing levels of human disturbance the landscape changes through a series of stages. The results showed that overall forest cover in Japan has actually increased over the century covered by the data, from 72.1% in 1900 to 76.9% in 2001.
Comparison of the actual vegetation with the potential vegetation predicted by WI, however, indicated that in many areas the climax vegetation has been replaced by secondary forests such as conifer timber plantations. This trend was especially strong in the warm and mid temperate zones of western Japan. This research also demonstrated that classification of moderate-resolution remote sensing data, interpreted within an LTS framework, can be an effective tool for efficient and repeated monitoring of landscape changes at the national level. In the future, the authors plan to continue utilizing this approach to track rapidly occurring changes in Japanese landscapes at the national level.


Author(s):  
I.A. Arkharov ◽  
E.S. Navasardyan ◽  
N.E. Shishova

In this paper, we analyze methods for predicting the mean time to failure (MTTF) of microcryocoolers from various manufacturers based on the Weibull law. The shape and scale parameters of the Weibull distribution for rotary microcryocoolers from various foreign manufacturers, and a method for calculating these parameters, are applied to data obtained from experimental studies of microcryocooler samples. The limiting values of the shape parameter were estimated, and methods are presented for calculating the scale parameter needed to predict the MTTF of a microcryocooler, both when improving existing models and when designing new ones. The approach to predicting the MTTF of a microcryocooler described in this work allows the parameters of the Weibull distribution function to be determined for a customer-specified MTTF of the microcryocooler sample being created, which in turn allows machine components (assemblies and parts) to be selected for specific MTTF values.
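For a two-parameter Weibull distribution the mean time to failure is MTTF = η · Γ(1 + 1/β), where β is the shape and η the scale parameter. A minimal sketch of using this relation in both directions — predicting MTTF from fitted parameters, and sizing the scale parameter for a customer-specified MTTF — follows; the numbers are illustrative, not manufacturer data.

```python
import math

def weibull_mttf(shape, scale):
    """Mean time to failure of a two-parameter Weibull distribution:
    MTTF = scale * Gamma(1 + 1/shape)."""
    return scale * math.gamma(1.0 + 1.0 / shape)

def scale_for_target_mttf(shape, target_mttf):
    """Invert the relation: the scale parameter needed to hit a
    customer-specified MTTF for a given (e.g., fitted) shape parameter."""
    return target_mttf / math.gamma(1.0 + 1.0 / shape)

# Illustrative values only.
beta = 2.0                                   # hypothetical fitted shape
eta = scale_for_target_mttf(beta, 20000.0)   # scale for a 20000 h target
print(round(weibull_mttf(beta, eta)))        # recovers 20000
```

Note that for β = 1 the Weibull law reduces to the exponential distribution and the MTTF equals the scale parameter, since Γ(2) = 1.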


2019 ◽  
Vol 1 (7) ◽  
pp. 19-23
Author(s):  
S. I. Surkichin ◽  
N. V. Gryazeva ◽  
L. S. Kholupova ◽  
N. V. Bochkova

The article provides an overview of the use of photodynamic therapy for photodamage of the skin. The causes, pathogenesis, and clinical manifestations of skin photodamage are considered. The definition and principle of action of photodynamic therapy are given, including the light sources used, the classification of photosensitizers, and their main characteristics. Studies are analyzed that demonstrate the effectiveness of, and compare, various light sources and photosensitizing agents for photodynamic therapy in patients with clinical manifestations of photodamage.


Author(s):  
Elena Domínguez-Romero

The present article claims that the British public's repositioning towards inner terror after the 2017 Westminster attacks was (i) affected by the visual reframing of an original viral press photograph of the attacks targeting a Muslim passerby as an inner terrorist and (ii) linguistically expressed through the use of 'look' object-oriented visual markers of evidentiality in written digital discourse. To support this claim, British readers' commentaries on a selection of online opinion articles reframing terror as inner terror through the use of reframed press photographs are taken as the corpus of analysis. The ultimate aim of the article is to unveil the British readers' reactions to the reframed photographs of the attacks, as linguistically expressed through their use of 'look' object-oriented repositioning strategies of visual evidentiality, in order to analyse the repositioning process.


2020 ◽  
Vol 3 (152) ◽  
pp. 92-99
Author(s):  
S. M. Geiko ◽  
◽  
O. D. Lauta

The article provides a philosophical analysis of H. White's tropological theory of history. The researcher claims that history is a specific kind of literature and that a historical work combines a certain set of research and narrative operations. The first type of operation answers the question of why an event happened one way and not another. The second is social description: the narrative of events, the intellectual act of organizing the factual material. According to H. White, this is where the researcher's ideas and preferences, mainly of a literary and historical nature, begin to operate. Explanations are the main mechanism that becomes the common thread of the narrative. They are implemented through plot types (romance, satire, comedy, and tragedy) and trope systems, the main stylistic forms of text organization (metaphor, metonymy, synecdoche, irony). The latter decisively influence the results of historians' work. Historiographical style follows the tropological model, the selection of which is determined by the historian's individual language practice. Once the choice is made, the imagination is ready to create a narrative. Therefore, historical understanding, according to H. White, can only be tropological. H. White proposes a new methodology for historical research: during discourse, a language adequate for analyzing historical phenomena is created, which the philosopher defines as a prefigurative tropological movement. This is how history is revealed through the art of anthropology. Thus, H. White's tropological theory of history offers modern science a meaningful and metatheoretically significant structure of concepts, on which a classification of historiographical styles can be based and through which the predictive function of philosophy regarding historical knowledge can be refined.


2011 ◽  
Vol 8 (1) ◽  
pp. 201-210
Author(s):  
R.M. Bogdanov

The problem of determining the repair sections of a main oil pipeline is solved based on the classification of images using distance functions and the clustering principle. The criteria characterizing a cluster are determined by certain given values; a defect is assigned to a cluster by comparison with these values. Procedures are provided for redistributing defects among cluster zones, with the cluster-zone parameters being updated accordingly. Calculations demonstrate the range of defect-density variation across pipeline sections and the universal capability of cluster analysis to handle configurations of linear objects with arbitrary density.
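The redistribution of defects among cluster zones can be sketched as an iterative nearest-centre assignment along the pipeline axis. The coordinates, starting centres, and helper names below are hypothetical; the actual method uses its own distance functions and cluster criteria.

```python
def assign(defects, centers):
    """Assign each defect (position along the pipeline, km) to the nearest
    cluster centre, i.e. to a candidate repair section."""
    clusters = [[] for _ in centers]
    for x in defects:
        i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
        clusters[i].append(x)
    return clusters

def recluster(defects, centers, iters=10):
    """Redistribute defects and update cluster-zone centres until stable."""
    for _ in range(iters):
        clusters = assign(defects, centers)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, assign(defects, centers)

# Hypothetical defect coordinates along a pipeline (km).
defects = [1.0, 1.2, 1.4, 7.8, 8.1, 8.4, 15.0, 15.3]
centers, clusters = recluster(defects, centers=[0.0, 8.0, 16.0])
print([round(c, 2) for c in centers], [len(c) for c in clusters])
```

Each resulting cluster of defect positions then delimits one candidate repair section, with the defect count per cluster giving its local defect density.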


1997 ◽  
Vol 3 (S2) ◽  
pp. 341-342
Author(s):  
Sara E. Miller

Negative staining is the most frequently used procedure for preparing particulate specimens, e.g., cell organelles, macromolecules, and viruses, for electron microscopy (Figs. 1-4). The main advantage is that it is rapid, requiring only minutes of preparation time. Another is that it avoids some of the harsh chemicals, e.g., organic solvents, used in thin sectioning. Also, it does not require advanced technical skill. It is widely used in virology, both in classification of viruses as well as diagnosis of viral diseases. Notwithstanding the necessity for fairly high particle counts, virus identification by negative staining is advantageous in not requiring specific reagents such as antibodies, nucleic acid probes, or protein standards which necessitate prior knowledge of potential pathogens for selection of the proper reagent. Furthermore, it does not require viable virions as does growth in tissue culture. Another procedure that uses negative contrasting is ultrathin cryosectioning (Fig. 5). In 1954, Farrant was the first to publish negatively stained material: ferritin particles.


2021 ◽  
Vol 11 (9) ◽  
pp. 3836
Author(s):  
Valeri Gitis ◽  
Alexander Derendyaev ◽  
Konstantin Petrov ◽  
Eugene Yurkov ◽  
Sergey Pirogov ◽  
...  

Prostate cancer (PCa) is the second most frequent malignancy (after lung cancer). Preoperative staging of PCa is the basis for selecting adequate treatment tactics. In particular, an urgent problem is the classification of indolent and aggressive forms of PCa in patients in the initial stages of the tumor process. To solve this problem, we propose a new binary-classification machine-learning method. The proposed method of monotonic functions uses a model in which the form of the disease is determined by the severity of the patient's condition. It is assumed that the smaller the deviation of the indicators from the normal values found in healthy people, the milder the patient's condition. This assumption means that the severity (form) of the disease can be represented by monotonic functions of how far the patient's indicators deviate beyond the normal range. The method is used to classify patients with indolent and aggressive forms of prostate cancer from pretreatment data. The learning algorithm is nonparametric, and at the same time it allows the classification results to be explained in the form of a logical function. To do this, one indicates to the algorithm either the threshold probability of successfully classifying patients with the indolent form of PCa, or the threshold probability of misclassifying patients with the aggressive form of PCa. The examples of logical rules given in the article show that they are quite simple and can be easily interpreted in terms of preoperative indicators of the form of the disease.
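The monotonicity assumption can be sketched as a severity score that never decreases when any indicator moves further outside its normal range. The indicator names, normal ranges, weights, and threshold below are all invented for illustration; the actual method learns its classification rules nonparametrically rather than using fixed weights.

```python
def deviation(value, low, high):
    """Deviation of an indicator beyond its normal range (0 if inside),
    normalized by the range width."""
    if value < low:
        return (low - value) / (high - low)
    if value > high:
        return (value - high) / (high - low)
    return 0.0

def severity(indicators, normal_ranges, weights):
    """Monotone severity score: it never decreases when any indicator
    moves further outside its normal range."""
    return sum(w * deviation(v, *normal_ranges[k])
               for k, v, w in ((k, indicators[k], weights[k])
                               for k in indicators))

# Hypothetical indicators and normal ranges (illustrative only).
normal = {"psa": (0.0, 4.0), "volume": (20.0, 40.0)}
weights = {"psa": 0.7, "volume": 0.3}
patient = {"psa": 9.2, "volume": 52.0}
s = severity(patient, normal, weights)
print("aggressive" if s > 0.5 else "indolent")  # → aggressive
```

A threshold on such a monotone score yields a simple logical rule of the kind the article describes: a patient is classed as aggressive exactly when the weighted deviations jointly exceed the chosen level.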

