On Two Methods to Compute the Trapezoidal Characteristic of Automatics for Elimination of Asynchronous Operation Based on Equal-Step Data

The paper presents two methods for computing the trapezoidal characteristic of automatics for elimination of asynchronous operation (AEAO). The source data are hodographs of asynchronous operation (AO hodographs). Each AO hodograph is an array of active and reactive resistance values measured or calculated at equal time steps. The first method is based on stepwise enlargement of the characteristic without sacrificing the proportions of its sensitive and coarse elements (the trapezoid): the trapezoid bases are stretched as long as any AO hodograph remains unfixed by the AEAO. The second method narrows the coarse element of the characteristic without changing the initial size of the sensitive element. Both methods are iterative algorithms of homogeneous operations; these operations are needed to compute a characteristic of minimal size while keeping the AEAO constraints. The main emphasis of the computation is on satisfying the sensitivity constraints and fixing the AO hodographs. The methods have been implemented in software. The paper also describes a computational experiment based on actual data, which showed that the methods are effective for calculating AEAO settings. Experts on electrical modes can use these methods to adjust AEAO devices with a trapezoidal characteristic.
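As an illustration of the first method, the sketch below iteratively stretches the bases of a trapezoid in the resistance (R–X) plane, keeping their proportion, until every AO hodograph has at least one sample inside the characteristic. The trapezoid geometry, hodograph format, step size, and function names are simplifying assumptions for illustration, not the authors' actual formulation:

```python
def point_in_trapezoid(r, x, half_top, half_bottom, height):
    """True if the point (r, x) of an AO hodograph lies inside a
    trapezoid symmetric about the X axis, with bases parallel to
    the R axis at x = 0 and x = height (an assumed, simplified
    geometry for the characteristic)."""
    if not 0.0 <= x <= height:
        return False
    # Half-width varies linearly between the two bases.
    half = half_bottom + (half_top - half_bottom) * (x / height)
    return abs(r) <= half

def fit_characteristic(hodographs, half_top, half_bottom, height,
                       step=0.05, max_iter=100):
    """First-method sketch: stretch both bases by the same relative
    step, preserving their proportion, until every AO hodograph has
    at least one sample inside the trapezoid (is "fixed")."""
    for _ in range(max_iter):
        if all(any(point_in_trapezoid(r, x, half_top, half_bottom, height)
                   for r, x in hod) for hod in hodographs):
            return half_top, half_bottom
        half_top *= 1.0 + step
        half_bottom *= 1.0 + step
    raise RuntimeError("could not fix all AO hodographs")
```

The loop is a homogeneous operation in the sense of the abstract: the same proportional stretch is applied until the capture condition holds, so the first size that fixes all hodographs is near-minimal for the chosen step.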

2020
Author(s):
Joseph P. Verdian
Leonard S. Sklar
Clifford S. Riebe
Jeffrey R. Moore

Abstract. The detachment of rock fragments from fractured bedrock on hillslopes creates sediment with an initial size distribution that sets the upper limits on particle size for all subsequent stages in the life of sediment in landscapes. We hypothesize that the initial size distribution should depend on the size distribution of latent sediment (i.e., blocks defined by through-going fractures) and weathering of sediment before or during detachment (e.g., disintegration along crystal grain boundaries). However, the initial size distribution is difficult to measure, because the interface across which sediment is produced is often shielded from view by overlying soil. Here we overcome this limitation by comparing fracture spacings measured from exposed bedrock on cliff faces with particle size distributions in adjacent talus deposits at 15 talus-cliff pairs spanning a wide range of climates and lithologies in California. Median fracture spacing and particle size vary by more than tenfold and correlate strongly with lithology. Fracture spacing and talus size distributions are also closely correlated in central tendency, spread, and shape, with b-axis diameters showing the closest correspondence with fracture spacing at most sites. This suggests that weathering has not modified latent sediment either before or during detachment from the cliff face. In addition, talus has not undergone much weathering after deposition and is slightly coarser than the latent sizes, suggesting that it contains some fractures inherited from bedrock. We introduce a new conceptual framework for understanding the relative importance of latent size and weathering in setting initial sediment size distributions in mountain landscapes. In this framework, hillslopes exist on a spectrum defined by the ratio of two characteristic timescales: the residence time in saprolite and weathered bedrock, and the time required to detach a particle of a characteristic size. 
At one end of the spectrum, where weathering residence times are negligible, the latent size distribution can be used to predict the initial size distribution. At the other end of the spectrum, where weathering residence times are long, the latent size distribution can be erased by weathering in the critical zone.


Geophysics
1990
Vol 55 (3)
pp. 379-379
Author(s):
Rakesh Mithal
Emilio E. Vera

In his discussion, McGowan directs his attention exclusively to which method should be used to produce a plane-wave decomposition of point-source seismic data. Although the choice of method is an important point, it was not the main emphasis of our paper which, as its title indicates, was the comparison between plane-wave decomposition (cylindrical slant stacking) and simple slant stacking. We demonstrated the differences between these two processes and clearly indicated the necessity of using cylindrical slant stacking in order to get the correct plane-wave reflection response of point-source data. McGowan criticizes our method because it makes use of the standard asymptotic approximation of the Bessel function and considers only outward traveling waves. In our paper we acknowledged that these simplifications do not produce accurate results for ray parameters near zero and explicitly mentioned the method of Brysk and McGowan (1986) as a suitable alternative to deal with this problem.
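The breakdown of the asymptotic Bessel-function approximation at small ray parameters can be illustrated numerically. The sketch below (function names and tolerances are illustrative, not from the paper) compares the standard large-argument approximation of J0, which appears in cylindrical slant-stacking derivations, with J0 evaluated from its integral representation:

```python
import numpy as np

def j0_numeric(x, n=2000):
    """Bessel function J0 from its integral representation
    J0(x) = (1/pi) * integral_0^pi cos(x*sin(theta)) d(theta),
    evaluated with the midpoint rule."""
    theta = (np.arange(n) + 0.5) * np.pi / n
    return float(np.cos(x * np.sin(theta)).mean())

def j0_asymptotic(x):
    """Standard large-argument approximation,
    J0(x) ~ sqrt(2/(pi*x)) * cos(x - pi/4).
    Accurate for large x, but poor as x -> 0, which is the regime of
    ray parameters near zero discussed in the reply."""
    return float(np.sqrt(2.0 / (np.pi * x)) * np.cos(x - np.pi / 4.0))
```

At x = 10 the two agree to about three decimal places, while at x = 0.1 the approximation overshoots badly, which is why a method such as Brysk and McGowan (1986) is needed near zero ray parameter.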


2021
Vol 9 (4)
pp. 1073-1090
Author(s):
Joseph P. Verdian
Leonard S. Sklar
Clifford S. Riebe
Jeffrey R. Moore

Abstract. The detachment of rock fragments from fractured bedrock on hillslopes creates sediment with an initial size distribution that sets the upper limits on particle size for all subsequent stages in the evolution of sediment in landscapes. We hypothesize that the initial size distribution should depend on the size distribution of latent sediment (i.e., fracture-bound blocks in unweathered bedrock) and weathering of blocks both before and during detachment (e.g., disintegration along crystal grain boundaries). However, the initial size distribution is difficult to measure because the interface across which sediment is produced is often shielded from view by overlying soil. Here we overcome this limitation by comparing fracture spacings measured from exposed bedrock on cliff faces with particle size distributions in adjacent talus deposits at 15 talus–cliff pairs spanning a wide range of climates and lithologies in California. Median fracture spacing and particle size vary by more than 10-fold and correlate strongly with lithology. Fracture spacing and talus size distributions are also closely correlated in central tendency, spread, and shape, with b-axis diameters showing the closest correspondence with fracture spacing at most sites. This suggests that weathering has not modified latent sediment either before or during detachment from the cliff face. In addition, talus at our sites has not undergone much weathering after deposition and is slightly coarser than the latent sizes because it contains unexploited fractures inherited from bedrock. We introduce a new conceptual framework for understanding the relative importance of latent size and weathering in setting initial sediment size distributions in mountain landscapes. In this framework, hillslopes exist on a spectrum defined by the ratio of two characteristic timescales: the residence time in saprolite and weathered bedrock and the time required to detach a particle of a characteristic size. 
At one end of the spectrum, where weathering residence times are negligible, the latent size distribution can be used to predict the initial size distribution. At the other end of the spectrum, where weathering residence times are long, the latent size distribution can be erased by weathering in the critical zone.
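The comparison of fracture-spacing and talus-size distributions in central tendency and spread can be sketched as follows. The summary statistics (median and an 84th/16th percentile ratio) are conventional choices for log-distributed size data, and the example values are synthetic assumptions, not the authors' field measurements:

```python
import numpy as np

def distribution_summary(sizes):
    """Median (central tendency) and the 84th/16th percentile ratio
    (a geometric measure of spread) for a size distribution, the kind
    of statistics used to compare fracture spacing with talus b-axis
    diameters."""
    sizes = np.asarray(sizes, dtype=float)
    return np.median(sizes), np.percentile(sizes, 84) / np.percentile(sizes, 16)

# Synthetic example (illustrative values, not field data): latent
# fracture spacing vs. slightly coarser talus, as reported for the sites.
rng = np.random.default_rng(0)
spacing = rng.lognormal(mean=np.log(0.30), sigma=0.5, size=500)  # metres
talus = rng.lognormal(mean=np.log(0.35), sigma=0.5, size=500)    # metres
spacing_med, spacing_spread = distribution_summary(spacing)
talus_med, talus_spread = distribution_summary(talus)
```

A median ratio near one with similar spreads, as in this synthetic case, is the signature of hillslopes at the "latent" end of the spectrum, where fracture spacing predicts the initial size distribution.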


Author(s):  
C. S. Potter
C. D. Gregory
H. D. Morris
Z.-P. Liang
P. C. Lauterbur

Over the past few years, several laboratories have demonstrated that changes in local neuronal activity associated with human brain function can be detected by magnetic resonance imaging and spectroscopy. Using these methods, the effects of sensory and motor stimulation have been observed, and cognitive studies have begun. These new methods promise to make possible even more rapid and extensive studies of brain organization and responses than those now in use, such as positron emission tomography.

Human brain studies are enormously complex. Signal changes on the order of a few percent must be detected against the background of the complex 3D anatomy of the human brain. Today, most functional MR experiments are performed using several 2D slice images acquired at each time step or stimulation condition of the experimental protocol. It is generally believed that true 3D experiments must be performed for many cognitive studies, which requires that data be acquired faster and/or more efficiently to provide adequate resolution for 3D functional analysis.


2020
pp. 54-59
Author(s):
A. A. Yelizarov
A. A. Skuridin
E. A. Zakirova

A computer model and the results of a numerical experiment are presented for a sensitive element based on a planar mushroom-shaped metamaterial with cells of the "Maltese cross" type. The proposed electrodynamic structure is shown to be suitable for nondestructive testing of the geometric and electrophysical parameters of technological media, as well as for detecting inhomogeneities in them. A shift of the resonant frequency and a change in the attenuation coefficient of the structure serve as the informative parameters.
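A minimal sketch of how the two informative parameters might be extracted from a measured transmission response is shown below. The Lorentzian dip shape, frequency range, and all numbers are synthetic assumptions for illustration, not data from the reported experiment:

```python
import numpy as np

def resonance_parameters(freqs, s21_db):
    """Return the two informative parameters named in the abstract:
    the resonant frequency (deepest dip of |S21|, in dB) and the
    attenuation at that frequency."""
    i = int(np.argmin(s21_db))
    return freqs[i], s21_db[i]

def lorentzian_dip(freqs, f0, width_ghz, depth_db):
    """Synthetic transmission response: a single Lorentzian dip."""
    return depth_db / (1.0 + ((freqs - f0) / width_ghz) ** 2)

freqs = np.linspace(4.0, 6.0, 2001)                # GHz, assumed sweep
empty = lorentzian_dip(freqs, 5.0, 0.05, -30.0)    # reference medium
loaded = lorentzian_dip(freqs, 5.2, 0.05, -24.0)   # medium with inhomogeneity
f_ref, a_ref = resonance_parameters(freqs, empty)
f_new, a_new = resonance_parameters(freqs, loaded)
shift = f_new - f_ref     # resonant frequency shift
delta_a = a_new - a_ref   # attenuation change
```

Comparing the extracted pair (shift, delta_a) against calibration values is one simple way such a sensor readout could flag a change in the medium under test.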


Author(s):  
A. A. Nedbaylov

The calculations required in the project activities of engineering students are commonly performed in electronic spreadsheets. Practice has shown that such calculations can prove quite difficult for students in other fields. One cause of this situation (and, in part, of the problems observed in Java and C programming courses) is the lack of a streamlined structure for distributing both the source data and the results. A solution can be found in a shared approach to structuring information in spreadsheet and software environments, called "the Book Method", which takes into account engineering-psychology considerations of user-friendly work with electronic information. The method can be applied at different levels in academic institutions and in teacher training courses.


In the article, the author considers the problems of complex algorithmization and systematization of approaches to optimizing the work plans (calendar plans) of construction organizations using modern tools, including, for example, evolutionary algorithms for "conscious" enumeration of candidate solutions of a target function under an array of possible constraints for a given nomenclature. Typical schemes are given for modeling the distribution of labor resources between objects of the production program, taking into account the array of source data. These data include the possibility of using the material and technical supply base (delivery, storage, packaging) as a temporary container for placing a labor resource when capacity is released; the quantitative and qualification composition of the initial labor resource; and the properties of the construction organization as a counterparty in the contract system with the customer of construction and installation works. A conceptual algorithm is formed that underlies a software package for operational harmonization of the production program (work plans) with the loading of production units, the released capacities of labor resources, and other conditions stipulated by the model. The proposed algorithm is most useful for a set of objects, which makes its implementation relevant for optimization models when planning the production programs of construction organizations that contain several objects distributed over a time scale.
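A minimal evolutionary-algorithm sketch of the core subproblem, assigning objects of the production program to labor-resource crews so as to minimize the overall finish time, is given below. It deliberately ignores the supply-base and contract constraints described above; the population size, mutation scheme, and data are illustrative assumptions, not the author's software package:

```python
import random

def makespan(assignment, durations, n_crews):
    """Finish time of the busiest crew for a given assignment of
    production-program objects to crews."""
    load = [0.0] * n_crews
    for obj, crew in enumerate(assignment):
        load[crew] += durations[obj]
    return max(load)

def evolve(durations, n_crews, pop_size=40, generations=200, seed=1):
    """Evolutionary search over crew assignments: binary tournament
    selection plus single-gene mutation, keeping the best solution
    found so far (elitism)."""
    rng = random.Random(seed)
    n = len(durations)
    fitness = lambda g: makespan(g, durations, n_crews)
    pop = [[rng.randrange(n_crews) for _ in range(n)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)               # binary tournament
            child = min(a, b, key=fitness)[:]
            child[rng.randrange(n)] = rng.randrange(n_crews)  # mutate one gene
            new_pop.append(child)
        pop = new_pop
        best = min([best] + pop, key=fitness)
    return best
```

In a fuller model, the fitness function is where the additional constraints (supply-base buffering of released crews, qualification composition, contract conditions) would enter as penalty terms.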


Author(s):  
Fernando Garcia Perez
Guillermo Martinez de Pinillos Gordillo
Mariana Tome Fernandez-Ladreda
Eyvee Arturo Cuellar Lloclla
Jose Alvaro Romero Porcel
...  
