Ice ridge density signatures in high resolution SAR images

2021
Author(s):
Mikko Johannes Lensu,
Markku Henrik Similä

Abstract. The statistics of ice ridging signatures were studied using a high (1.25 m) and a medium (20 m) resolution SAR image over the Baltic sea ice cover, acquired in 2016 and 2011, respectively. Ice surface profiles measured during a 2011 Baltic campaign were used as ground truth data for both. The images did not delineate individual ridges well as linear features. This was attributed to the random, intermittent occurrence of ridge rubble block arrangements with bright SAR return. Instead, the ridging signature was approached in terms of the density of bright pixels, and its relation to the corresponding surface profile quantity, ice ridge density, was studied. In order to apply discrete statistics, these densities were quantified by counting bright pixel numbers (BPN) in pixel blocks of side length L, and by counting ridge sail numbers (RSN) in profile segments of length L. The scale L is a variable parameter of the approach. The other variable parameter is the pixel intensity threshold defining bright pixels, equivalently the bright pixel percentage (BPP), or, for the surface profiles, the ridge sail height threshold used to select ridges. Applied as a sliding image operation, the BPN count enhanced the ridging signature and improved the applicability of SAR to ice information production. A distribution model for BPN statistics was derived by considering how BPN values change as BPP changes. The model was found to apply over a wide range of values of BPP and L. The same distribution model was found to apply to RSN statistics. This reduces the problem of correspondence between the two density concepts to connections between the parameters of the respective distribution models. The correspondence was studied for the medium resolution image, for which the 2011 surface data set provides a close temporal match.
The comparison was done by estimating ridge rubble coverage in 1 km2 squares from surface profile data and, on the other hand, by assuming that the bright pixel density can be used as a proxy for ridge rubble coverage. Apart from a scaling factor, both were found to follow the presented distribution model.
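The block-wise BPN count described above lends itself to a simple implementation. The sketch below is an illustrative reconstruction, not the authors' code; the gamma-distributed image is a synthetic stand-in for real SAR backscatter, and the threshold is derived from the chosen BPP:

```python
import numpy as np

def bright_pixel_number(image, bpp_percent, block):
    """Count bright pixels (the top bpp_percent of intensities)
    inside non-overlapping block x block windows."""
    thresh = np.percentile(image, 100.0 - bpp_percent)  # BPP threshold
    bright = image > thresh
    h, w = bright.shape
    h, w = h - h % block, w - w % block                 # trim to multiples of block
    tiles = bright[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.sum(axis=(1, 3))                       # BPN per block

rng = np.random.default_rng(0)
img = rng.gamma(2.0, 1.0, size=(100, 100))  # synthetic stand-in for SAR backscatter
bpn = bright_pixel_number(img, bpp_percent=5.0, block=10)  # scale L = 10 pixels
```

A sliding (overlapping-window) variant, as used in the paper's image operation, would replace the block reshape with a moving sum over the same boolean mask.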

2020
Author(s):
Philipp Ulbrich,
Alexander Gail

Abstract. Ongoing goal-directed movements can be rapidly adjusted following new environmental information, e.g. when chasing prey or foraging. This makes movement trajectories in go-before-you-know decision-making a suitable behavioral readout of the ongoing decision process. Yet, existing methods of movement analysis are often based on statistically comparing two groups of trial-averaged trajectories and are not easily applied to three-dimensional data, preventing them from being applicable to natural free behavior. We developed and tested the cone method to estimate the point of overt commitment (POC) along a single two- or three-dimensional trajectory, i.e. the position where the movement is adjusted towards a newly selected spatial target. In Experiment 1, we established a “ground truth” data set in which the cone method successfully identified the experimentally constrained POCs across a wide range of all but the shallowest adjustment angles. In Experiment 2, we demonstrate the power of the method in a typical decision-making task with expected decision time differences known from previous findings. The POCs identified by the cone method matched these expected effects. In both experiments, we compared the cone method’s single-trial performance with a trial-averaging method and obtained comparable results. We discuss the advantages of the single-trajectory cone method over trial-averaging methods and possible applications beyond the examples presented in this study. The cone method provides a distinct addition to existing tools used to study decisions during ongoing movement behavior, which we consider particularly promising for studies of non-repetitive free behavior.
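The abstract does not spell out the algorithm, but the underlying idea, flagging the first sample at which the trajectory heading leaves a cone around the initial movement direction, can be illustrated with a toy 2D sketch. The cone half-angle and the test trajectory below are invented for illustration and this is not the published method:

```python
import numpy as np

def toy_poc(traj, half_angle_deg=15.0):
    """Return the index of the first point where the local heading
    deviates from the initial heading by more than the cone half-angle.
    Toy sketch only; the published cone method is more elaborate."""
    v0 = traj[1] - traj[0]
    v0 = v0 / np.linalg.norm(v0)                 # initial heading
    for i in range(1, len(traj) - 1):
        v = traj[i + 1] - traj[i]
        v = v / np.linalg.norm(v)                # local heading
        angle = np.degrees(np.arccos(np.clip(v @ v0, -1.0, 1.0)))
        if angle > half_angle_deg:
            return i
    return None

# straight segment followed by a 45-degree turn at sample 5
straight = np.stack([np.arange(6), np.zeros(6)], axis=1).astype(float)
turn = straight[-1] + np.stack([np.arange(1, 6), np.arange(1, 6)], axis=1)
traj = np.vstack([straight, turn])
poc = toy_poc(traj)
```

The same loop works unchanged for three-dimensional trajectories, since only dot products and norms are used.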


Author(s):  
Hassan I. Kazi,
Lyndon S. Stephens

In this paper, a method for fabricating deterministic microasperities on stainless steel by a photolithographic lift-off technique followed by electrodeposition of nickel is presented. Triangular asperities with an asperity area fraction of 0.20 are deposited onto a stainless steel disk. Chemical mechanical planarization (CMP) polishing is employed to adjust the heights of the asperities. The surface profiles of the asperities are then characterized. Results show that the surfaces of the as-deposited asperities have a curved profile with one side shorter than the other. The surface profile becomes more uniform after polishing in the CMP machine, but polishing tends to round the edges and corners of the asperities. Additionally, polishing results in the grooving of channels in the substrate between the asperities and chips the asperity walls.


2018
Vol 7 (3.27)
pp. 179
Author(s):
K M. Monica,
G Bindu,
S Sridevi

Images provide rich information. Depending on the data set, which may contain related or unrelated items, suitable sensing technologies capture a wide range of applications and their attributes step by step. On the other hand, this also creates a huge quantity of data that may be relevant, irrelevant, or redundant to the image analysis task at hand. It brings problems such as increased computational time, high image density, a wide range of data mappings, and unclear data set semantics, and it requires large amounts of labeled data to train models for a new environment. It is usually difficult and costly for users to obtain sufficient training models in many application modules. This paper addresses these problems by exploring the classical dimension reduction algorithms in depth, in support of the research community.
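As a concrete instance of the classical dimension reduction algorithms discussed, a minimal PCA via the singular value decomposition can be written in a few lines. This is an illustrative sketch on random data; the paper itself provides no code:

```python
import numpy as np

def pca(X, k):
    """Project n x d data onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                        # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # n x k reduced representation

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))                     # 200 samples, 10 features
Z = pca(X, k=2)                                    # reduced to 2 dimensions
```

Because the singular values are returned in descending order, the first reduced coordinate always carries the most variance.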


Author(s):  
Samuel Hsiao-Heng Chang,
Rachel Blagojevic,
Beryl Plimmer

Abstract. Although many approaches to digital ink recognition have been proposed, most lack the flexibility and adaptability to provide acceptable recognition rates across a variety of problem spaces. This project uses a systematic approach of data mining analysis to build a gesture recognizer for sketched diagrams. A wide range of algorithms was tested, and those with the best performance were chosen for further tuning and analysis. Our resulting recognizer, RATA.Gesture, is an ensemble of four algorithms. We evaluated it against four popular gesture recognizers with three data sets: one of our own and two from other projects. Except for matched recognizer–data set pairs (e.g., the PaleoSketch recognizer on the PaleoSketch data set), the results show that it outperforms the other recognizers. This demonstrates the potential of this approach to produce flexible and accurate recognizers.
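The abstract does not publish RATA.Gesture's internals, but combining four member algorithms by majority vote, one common way to build such an ensemble, can be sketched generically. The toy member classifiers and the tie-breaking rule below are assumptions, not details from the paper:

```python
from collections import Counter

def ensemble_predict(classifiers, sample):
    """Majority vote over member predictions; ties fall to the
    member listed first (an assumed tie-breaking rule)."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# toy stand-ins for the four tuned member algorithms
members = [
    lambda s: "arrow",
    lambda s: "arrow",
    lambda s: "circle",
    lambda s: "arrow",
]
label = ensemble_predict(members, sample=None)
```

Real ensembles often weight members by validation accuracy instead of counting each vote equally; the paper does not say which scheme RATA.Gesture uses.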


2020
pp. 1192-1198
Author(s):
M.S. Mohammad,
Tibebe Tesfaye,
Kim Ki-Seong

Ultrasonic thickness gauges are easy to operate and reliable, and can be used to measure a wide range of thicknesses and inspect all engineering materials. Supplementing the simple ultrasonic thickness gauges that present results either as a digital readout or as an A-scan with systems that correlate the measured values to their positions on the inspected surface to produce a two-dimensional (2D) thickness representation can extend their benefits and provide a cost-effective alternative to expensive advanced C-scan machines. In previous work, the authors introduced a system for the positioning and mapping of the values measured by ultrasonic thickness gauges and flaw detectors (Tesfaye et al. 2019). The system is an alternative to systems that use mechanical scanners, encoders, and sophisticated UT machines. It used a camera to record the probe’s movement and a projected laser grid obtained by a laser pattern generator to locate the probe on the inspected surface. In this paper, a novel system is proposed that can be applied to flat surfaces and overcomes the other limitations arising from the use of laser projection. The proposed system uses two video cameras, one to monitor the probe’s movement on the inspected surface and the other to capture the corresponding digital readout of the thickness gauge. The acquired images of the probe’s position and the thickness gauge readout are processed to plot the measured data in a 2D color-coded map. The system is meant to be simpler and more effective than the previous development.
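The final step, turning (position, reading) pairs into a 2D color-coded thickness map, amounts to binning the readings on a grid. A minimal sketch follows; the grid size and coordinate ranges are arbitrary, and the synthetic readings are not measurement data:

```python
import numpy as np

def thickness_grid(x, y, t, bins=20, extent=(0.0, 1.0, 0.0, 1.0)):
    """Mean thickness per grid cell; NaN where no probe reading fell."""
    x0, x1, y0, y1 = extent
    edges = (np.linspace(x0, x1, bins + 1), np.linspace(y0, y1, bins + 1))
    total, _, _ = np.histogram2d(x, y, bins=edges, weights=t)  # sum of readings
    count, _, _ = np.histogram2d(x, y, bins=edges)             # readings per cell
    with np.errstate(invalid="ignore"):
        return total / count          # cell means, ready for a color-coded plot

rng = np.random.default_rng(2)
x, y = rng.uniform(size=(2, 1000))    # probe positions from the position camera
t = 6.0 - 2.0 * x                     # synthetic wall thinning along x (mm)
grid = thickness_grid(x, y, t)
```

Passing `grid` to any image-plotting routine then yields the 2D color-coded map the paper describes.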


Author(s):  
Ngoc Anh Nguyen

The analysis of a data set of observations of Vietnamese banks over the period 2011–2015 shows how the Capital Adequacy Ratio (CAR) is influenced by selected factors: bank asset size (SIZE), loans in total assets (LOA), leverage (LEV), net interest margin (NIM), loan loss reserves (LLR), and cash and precious metals in total assets (LIQ). The results indicate that NIM and LIQ have a significant effect on CAR. On the other hand, SIZE and LEV do not appear to have a significant effect on CAR. NIM and LIQ have a positive effect on CAR, while LLR and LOA are negatively related to CAR.
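The kind of regression behind such findings can be illustrated with ordinary least squares on synthetic data. The variable names follow the abstract, but the data and coefficients below are invented for the sketch and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 150                                 # synthetic bank-year observations
NIM, LIQ, LLR, LOA = rng.uniform(size=(4, n))
# synthetic CAR with signs matching the abstract: + NIM, + LIQ, - LLR, - LOA
CAR = 0.08 + 0.5 * NIM + 0.3 * LIQ - 0.2 * LLR - 0.1 * LOA + rng.normal(0, 0.01, n)

X = np.column_stack([np.ones(n), NIM, LIQ, LLR, LOA])   # intercept + regressors
beta, *_ = np.linalg.lstsq(X, CAR, rcond=None)          # OLS estimates
```

On real data one would also report standard errors and p-values (e.g. via a statistics package) to judge which coefficients are significant, as the paper does for NIM and LIQ.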


2020
Vol 24
Author(s):
Bubun Banerjee,
Gurpreet Kaur,
Navdeep Kaur

Metal-free organocatalysts are becoming an important tool for the sustainable development of various bioactive heterocycles. On the other hand, during the last two decades, calix[n]arenes have been gaining considerable attention due to their wide range of applicability in the field of supramolecular chemistry. Recently, sulfonic acid functionalized calix[n]arenes have been employed as efficient alternative catalysts for the synthesis of various bioactive scaffolds. In this review, we summarize the catalytic efficiency of p-sulfonic acid calix[n]arenes for the synthesis of diverse biologically promising scaffolds under various reaction conditions. No review available in the literature yet covers the catalytic applicability of p-sulfonic acid calix[n]arenes; we therefore believe this review will attract researchers interested in this fascinating class of organocatalysts.


2019
Vol 16 (7)
pp. 808-817
Author(s):
Laxmi Banjare,
Sant Kumar Verma,
Akhlesh Kumar Jain,
Suresh Thareja

Background: In spite of the availability of various treatment approaches, including surgery, radiotherapy, and hormonal therapy, steroidal aromatase inhibitors (SAIs) play a significant role as chemotherapeutic agents for the treatment of estrogen-dependent breast cancer, with the benefit of a reduced risk of recurrence. However, due to the greater toxicity and side effects associated with currently available anti-breast cancer agents, there is an urgent need to develop target-specific AIs with a safer anti-breast cancer profile. Methods: It is a challenging task to design target-specific and less toxic SAIs, though molecular modeling tools, viz. molecular docking simulations and QSAR, have been used for more than two decades for the fast and efficient design of novel, selective, potent, and safe molecules against various biological targets. In order to design novel and selective SAIs, structure-guided, molecular docking assisted, alignment-dependent 3D-QSAR studies were performed on a data set comprising 22 molecules bearing a steroidal scaffold with a wide range of aromatase inhibitory activity. Results: The 3D-QSAR model developed using the molecular weighted (MW) extent alignment approach showed good statistical quality and predictive ability compared to the model developed using the moments of inertia (MI) alignment approach. Conclusion: The explored binding interactions and generated pharmacophoric features (steric and electrostatic) of the steroidal molecules could be exploited for the further design, direct synthesis, and development of new, potentially safer SAIs that can be effective in reducing the mortality and morbidity associated with breast cancer.


Author(s):  
Eun-Young Mun,
Anne E. Ray

Integrative data analysis (IDA) is a promising new approach in psychological research and has been well received in the field of alcohol research. This chapter provides a larger unifying research synthesis framework for IDA. Major advantages of IDA of individual participant-level data include better and more flexible ways to examine subgroups, model complex relationships, deal with methodological and clinical heterogeneity, and examine infrequently occurring behaviors. However, between-study heterogeneity in measures, designs, and samples and systematic study-level missing data are significant barriers to IDA and, more broadly, to large-scale research synthesis. Based on the authors’ experience working on the Project INTEGRATE data set, which combined individual participant-level data from 24 independent college brief alcohol intervention studies, it is also recognized that IDA investigations require a wide range of expertise and considerable resources and that some minimum standards for reporting IDA studies may be needed to improve transparency and quality of evidence.

