FORMALIZATION OF REGULATORY TEXTS

Author(s):  
Viktor Pimenov ◽
Mihail Voronov

Modern information technologies make text manipulation highly efficient, above all the storage, editing, and formatting of texts and their components. Having achieved significant success in developing tools for content-free computer text processing, researchers faced the problem of processing text content. Further steps in this direction are therefore associated with creating, among other things, methods for automated, purposeful manipulation of texts that takes their content into account. The paper analyzes work devoted to the formal representation of texts and its subsequent use. Despite a number of successful projects, the problem of relating the content of a text to its meaning remains relevant. Formalizing a general-purpose text while preserving its semantics does not appear feasible at this stage. However, there are types of texts that can be formalized while preserving their semantics. One of them is the regulatory text, which is essentially a verbally expressed algorithm for a sequence of targeted actions. It is distinguished by logic and accuracy (lack of allegory), coherence and integrity, clarity and understandability (due to the lack of emotional coloring and figurative means), and accessibility (due to the use of specific terminology). In other words, regulatory texts are usually written so that the mechanisms of the described actions are displayed as clearly as possible. Purpose: development of a method for formalizing a regulatory text while preserving its semantics. Methods: structural linguistics, representation of objects in the form of an ontology, constructive algorithms. The use of the method is demonstrated by describing the solution of a system of algebraic equations. Results: a method for constructing a mathematical model of a regulatory text. Practical relevance: the developed method makes it possible to build software systems for constructing libraries of individual subject areas, tools for evaluating regulatory texts for definiteness, completeness, coherence, and other characteristics, as well as simulators and self-learning tools.
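The abstract does not spell out the concrete form of the model, so the sketch below is only an illustration under assumptions: a regulatory text is treated as an ordered list of action steps with named inputs and outputs, shown on the "solve a system of algebraic equations" example the authors mention. All class and field names are hypothetical.

```python
# Minimal, hypothetical sketch: a regulatory text modeled as an ordered
# sequence of typed action steps (names and fields are illustrative only).
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str          # the prescribed action (verb)
    uses: list[str]      # entities the action operates on
    produces: str        # entity produced by the action

@dataclass
class RegulatoryText:
    title: str
    steps: list[Step] = field(default_factory=list)

    def is_connected(self) -> bool:
        """Check that every step after the first consumes an earlier result."""
        produced: set[str] = set()
        for i, s in enumerate(self.steps):
            if i > 0 and not (set(s.uses) & produced):
                return False
            produced.add(s.produces)
        return True

# Example: the instruction "solve a system of two linear equations" as steps.
doc = RegulatoryText("Solve a system of two linear equations", [
    Step("express", ["equation 1"], "x in terms of y"),
    Step("substitute", ["x in terms of y", "equation 2"], "equation in y only"),
    Step("solve", ["equation in y only"], "value of y"),
    Step("back-substitute", ["value of y", "x in terms of y"], "value of x"),
])
print(doc.is_connected())   # True: each step consumes an earlier result
```

A representation of this kind would support the checks named in the abstract (completeness, coherence, and so on) as simple predicates over the step graph.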

2021 ◽  
Vol 11 (16) ◽  
pp. 7223
Author(s):  
Dengyu Xiong ◽  
Mingliang Wu ◽  
Wei Xie ◽  
Rong Liu ◽  
Haifeng Luo

To address the problems of high damage rate, low seeding accuracy, and generally poor seeding quality in the seeding process, a general-purpose seeding device was designed in this study based on the principle of combined mechanical-pneumatic seeding. The air-blowing-type seed cleaning and unloading of the device created the conditions for precise and flexible seeding. In addition, single-factor experiments were performed on seeds with different particle sizes and shapes (e.g., soybeans, corn, and rapeseeds) to verify the general applicability of the seed metering device. A multi-factor response surface optimization experiment was performed with soybean seeds as the test object to obtain the optimal performance parameters of the seed metering device. At the optimal parameter combination of a seed-cleaning air velocity of 16.7 m/s, a seed feeding drum speed of 13.7 r/min, and a hole cone angle of 35.6°, the qualified index, the reseeding index, and the missed index reached 97.94%, 0.03%, and 2.03%, respectively. The verification test results are consistent with the optimized ones. As indicated by the results, the seed metering device exhibits general applicability, a low damage rate, high precision, and high efficiency; it is capable of meeting the general seeding requirements of different crop seeds and provides technical support for the reliability and versatility of the seeder.
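The abstract does not report the fitted model or the design matrix, so the following is only a generic sketch of a multi-factor response surface optimization: a second-order polynomial is fitted by least squares to (air velocity, drum speed, cone angle) data and its optimum is located. The data are synthetic, generated around the optimum reported above; nothing here reproduces the authors' experiment.

```python
# Illustrative response-surface fit on invented data (not the authors' results).
import numpy as np
from itertools import combinations_with_replacement

def quad_features(X):
    """Design matrix: intercept, linear terms, and all second-order terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# factors: seed-cleaning air velocity (m/s), drum speed (r/min), cone angle (deg)
rng = np.random.default_rng(0)
X = rng.uniform([14, 10, 30], [20, 18, 40], size=(30, 3))
reported_opt = np.array([16.7, 13.7, 35.6])   # optimum reported in the abstract
y = 98 - ((X - reported_opt) ** 2 / [4, 6, 20]).sum(axis=1)   # synthetic response

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# search the fitted surface on a coarse grid for its predicted optimum
grid = np.stack(np.meshgrid(np.linspace(14, 20, 41),
                            np.linspace(10, 18, 41),
                            np.linspace(30, 40, 41)), axis=-1).reshape(-1, 3)
pred = quad_features(grid) @ beta
print("predicted optimum (velocity, speed, angle):", grid[pred.argmax()].round(2))
```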


2021 ◽  
pp. 140-151
Author(s):  
Petro Putsenteilo ◽  
Andrii Dovbush

Purpose. The aim of the article is the analysis of scientific approaches to interpreting the peculiarities of the development of digital technologies of modern accounting in the digital economy. Methodology of research. The study was conducted using a dialectical approach to examining the current state of the digital economy. General and special methods were used in the course of the research, in particular: the analytical method in the review of normative sources; the method of classification to differentiate the main components of the digital economy and accounting, and the method of description to give them a detailed characterization; the monographic method in the study of literature on the digital economy and accounting; and system-analytical methods in the processing of information. Findings. It is determined that the digital economy is a communication environment of economic activity on the Internet, the result of the transformational effects of new general-purpose technologies in the field of information and communication. It is substantiated that significant technological and informatization shifts, as well as the growing potential of the digital economic information space, stimulate the modernization of accounting science, promote the development of the methodology and organization of the accounting process, and actualize the problem of positioning accounting activity. The basic principles of functioning of the digital accounting platform are revealed, which make it possible to create new electronic systems with a significant number of users. Originality. It has been established that blockchain is a promising accounting technology that displaces traditional methods of accounting documentation, processing, registration, and inventory, allowing companies to register both parties to a transaction in a shared ledger in real time instead of keeping separately reconciled records of financial transactions in private databases. Practical value. The obtained results of the study will help increase the efficiency of the formation and development of accounting in a digital economy and will be the basis for further research in this area. Key words: digital economy, information technologies, accounting, digital accounting, blockchain, digital platform, digital technologies.
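The shared-ledger point above can be pictured with a toy sketch: each transaction record is hash-linked to the previous one, so both parties append to the same tamper-evident "joint book" instead of reconciling private databases. This is a conceptual illustration only; the record fields and functions are invented and it is not a production blockchain.

```python
# Toy hash-linked ledger illustrating the shared-ledger idea (illustrative only).
import hashlib, json, time

def add_entry(ledger, payer, payee, amount):
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {"payer": payer, "payee": payee, "amount": amount,
              "time": time.time(), "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    ledger.append(record)
    return record

def verify(ledger):
    """Recompute every hash and check the chain of prev-references."""
    prev = "0" * 64
    for rec in ledger:
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["prev"] != prev or rec["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = rec["hash"]
    return True

book = []
add_entry(book, "Supplier LLC", "Buyer LLC", 1200.00)
add_entry(book, "Buyer LLC", "Supplier LLC", 300.00)
print(verify(book))   # True; altering any earlier record breaks verification
```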


2017 ◽  
Vol 7 (3) ◽  
pp. 193-200
Author(s):  
Ivan Bartenev ◽
Mikhail Lysych ◽
Mikhail Shabanov ◽
...  

Basic soil preparation is an important factor in ensuring a high survival rate and the preservation of cultivated plants. Its objective is to loosen the soil to a predetermined depth; it is also important to contain weed growth in the areas adjacent to the rows of crop strips. Depending on soil conditions, this may mean cutting single or double mouldboard furrows (drained soils) or forming microhills (temporarily waterlogged soils). For these purposes, the ploughs PKL-70, PRL-70, PL-1, PLM-1.5A, PDV-1.5, PL-2-50, PLD-1.2, etc. are currently used. Their common drawback is that each can perform only one operation. The large variety of tillage equipment in use and its low efficiency confirm the urgency of developing multifunctional tools of modular construction. This would allow basic soil preparation to be performed effectively in a variety of conditions with a single implement. The article provides a description of the design and the basic layout options of a multifunctional plough. By simple changeovers carried out on site, the plough can be adapted for basic soil preparation on clearings with drained and temporarily waterlogged soils. It is also possible to change the distance between the plough bodies in accordance with the specified rows and to rearrange them to work "in" or "out". The working process is modeled under the conditions of a non-uprooted clearing. The simulation experiment showed the high efficiency of the plough, which is ensured by the presence of recoverable safety devices. Replacing a set of ploughs with one multipurpose plough in a forestry enterprise will reduce the total specific metal consumption almost fourfold.


2021 ◽  
Vol 16 (6) ◽  
pp. 3376-3386
Author(s):  
Maira Bedebayeva ◽  
Roza Kadirbayeva ◽  
Laura Suleimenova ◽  
 Gulzhan O Zhetpisbayeva ◽  
Gulira Nurmukhanbetova

Blended cooperative learning applications, which deliver education with the opportunities offered by information technologies, have the potential to increase interaction between learners, to help learners retain information more permanently, and to develop positive attitudes towards the lesson. Along with developing technology, technological tools have been included in education. The effect of the blended learning method, a technological innovation, is very important in language teaching. The aim of this study is to determine the opinions of English teachers about blended teaching. Within the scope of this general purpose, the positive and negative aspects of technological tools, the advantages and disadvantages of blended learning methods, and the effect of this method on students were determined from the perspective of English teachers working in secondary school. A qualitative research method was used in this study. The opinions of 15 English teachers who use the blended learning method and technological tools in their classes and who participated in the research voluntarily were consulted; the teachers' use of technology was taken as the basis for sample selection. The findings were thematised and explained with the content analysis method. It was concluded that the participating teachers consider technological tools to make a positive contribution to learning. It is also among the results of the study that blended learning has important advantages such as providing instant and continuous feedback to students, taking individual differences into account, increasing interaction and communication outside the classroom, and increasing interest in the lesson. Keywords: English, language, blended learning, technology, educational environment, information technology


Author(s):  
Sheng Kang ◽  
Guofeng Chen ◽  
Chun Wang ◽  
Ruiquan Ding ◽  
Jiajun Zhang ◽  
...  

With the advent of big data and cloud computing solutions, enterprise demand for servers is increasing, with especially high growth for Intel-based x86 server platforms. Today's datacenters are in constant pursuit of high-performance, high-availability computing solutions coupled with low power consumption and low heat generation, and the ability to manage all of this through advanced telemetry data gathering. This paper showcases one such solution: an updated rack and server architecture that promises such improvements. The ability to manage server and data center power consumption and cooling more completely is critical to effectively managing datacenter costs and reducing the data center's PUE (power usage effectiveness). Traditional Intel-based 1U and 2U form factor servers have existed in the data center for decades. These general-purpose x86 server designs by the major OEMs are, for all practical purposes, very similar in their power consumption and thermal output. Power supplies and thermal designs for servers have not, in the past, been optimized for high efficiency. In addition, IT managers need more information about servers in order to optimize data center cooling and power use, and an improved server/rack design needs to be built to take advantage of more efficient power supplies or PDUs and more efficient means of cooling server compute resources than traditional internal server fans. This is the constant pursuit of corporations looking for new ways to improve efficiency and gain a competitive advantage. A new way to optimize power consumption and improve cooling is a complete redesign of the traditional server rack: extracting internal server power supplies and server fans and centralizing them within the rack. This type of design achieves an entirely new low power target by utilizing centralized, high-efficiency PDUs that power all servers within the rack. Cooling is improved by utilizing large, efficient rack-based fans that provide airflow to all servers, and by opening up the server design to allow greater airflow across server components. The centralized power supply breaks through traditional server power limits: rack-based PDUs can be operated closer to their optimal efficiency point, and combining online and offline (cold backup) modes within a single power supply allows the data center to achieve optimal power efficiency. In addition, unifying the mechanical structure and thermal definitions within the rack solution for server cooling and PSU information allows IT to collect all server power and thermal information centrally, easing analysis and processing.
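Since the goal above is stated in terms of PUE, here is a minimal sketch of that metric: PUE is total facility power divided by IT equipment power. The telemetry numbers and the split of overhead into cooling and conversion loss are invented for illustration and are not from the paper.

```python
# Minimal PUE sketch with invented numbers (PUE = total facility power / IT power).
def pue(it_power_kw: float, cooling_kw: float, power_loss_kw: float,
        other_overhead_kw: float = 0.0) -> float:
    """Lower is better; 1.0 would mean all power goes to IT equipment."""
    total = it_power_kw + cooling_kw + power_loss_kw + other_overhead_kw
    return total / it_power_kw

# Example: 250 kW of servers, 90 kW of cooling, 25 kW of power-conversion loss.
baseline = pue(250, 90, 25)
# Centralized high-efficiency PDUs and rack-level fans shrink both overhead terms.
improved = pue(250, 60, 12)
print(round(baseline, 2), round(improved, 2))   # 1.46 1.29
```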


1984 ◽  
Vol 78 ◽  
pp. 549-562 ◽  
Author(s):  
J.R.P. Angel

AbstractThe full potential of the next generation of larger telescopes will be realized only if they have well instrumented large fields of view. Scientific problems for which very large ground-based optical telescopes will be of most value often will need surveys to very deep limits with imaging and slitless spectroscopy, followed by spectroscopy of faint objects taken many at once over the field. Improved instruments and detectors for this purpose are being developed. Remotely positioned fibers allow the coupling of light from many objects in the field to the spectrograph slit. CCD arrays, operated in the TDI or drift scan mode, will make large area detectors of high efficiency that may supercede photographic plates. An ideal telescope optical design should be based on a fast parabolic primary, have a field of at least 1° with achromatic images < 0.25 arcseconds and have provision for dispersive elements to be used for slitless spectroscopy and compensation of atmospheric dispersion over the full field. A good solution for a general purpose telescope that can satisfy these needs is given by a three element refractive corrector at a fast Cassegrain focus. A specialized telescope dedicated to sky surveys, with better image quality and higher throughput than presently available, might be built as a scaled up Schmidt with very large photographic plates. Better performance in most areas should be obtained with a large CCD mosaic detector operated in the drift scan mode at a telescope with a 2-mirror reflecting corrector.
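The TDI/drift-scan operation mentioned above can be pictured with a small numerical toy: charge rows are clocked across the detector in step with the sky's drift, so each sky row is integrated for the full crossing time. The detector size, drift rate, and Poisson sky model below are assumptions for illustration only.

```python
# Toy drift-scan (TDI) readout: charge is clocked in step with the image drift.
import numpy as np

rng = np.random.default_rng(1)
n_rows, n_cols, n_steps = 64, 32, 200
# a strip of sky long enough to drift across the detector for n_steps clocks
sky_strip = rng.poisson(5.0, size=(n_rows + n_steps, n_cols)).astype(float)

ccd = np.zeros((n_rows, n_cols))
readout = []
for t in range(n_steps):
    ccd += sky_strip[t:t + n_rows]    # expose: sky row s sits at detector row s - t
    readout.append(ccd[0].copy())     # the row leaving the detector is read out
    ccd[:-1] = ccd[1:]                # clock the charge along with the image drift
    ccd[-1] = 0.0                     # an empty row enters at the trailing edge
image = np.array(readout)             # each sky row integrated up to n_rows times
print(image.shape)                    # (200, 32): a continuous strip image
```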


Professor Hartree in his paper has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing (1938) in connexion with a logical problem, which there is unfortunately no time to mention, and the construction of actual machines was begun independently in America, towards the end of the late war. A ‘universal’ machine is one which, when given suitable instructions, will carry out automatically any well-defined series of computations of certain specified kinds, say additions, subtractions, multiplications and divisions of integers or finite decimals. This is a rather doubtful definition, since it depends on what is meant by a ‘well-defined’ series of computations; and undoubtedly the best definition of this is ‘one that can be done by a machine’. However, this description is not quite so circular as it may seem; for most people have a fairly clear idea of what processes can be done by machines specially constructed for each separate purpose. There are, for example, machines for solving sets of linear algebraic equations, for finding the prime factors of large integers, for solving ordinary differential equations of certain types, and so on. A universal machine is a single machine which, when provided with suitable instructions, will perform any calculation that could be done by a specially constructed machine. No real machine can be truly universal because its size is limited—for example, no machine will work out π to 10^1000 places of decimals, because there is no room in the world for the working or the answer; but subject to this limitation of size, the machines now being made in America and in this country will be ‘universal’—if they work at all; that is, they will do every kind of job that can be done by special machines.
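To make the distinction concrete, here is a small sketch (not from the source) of the idea: one general-purpose interpreter which, given a suitable instruction list, reproduces what a special-purpose machine would compute, taking the passage's example of solving a set of linear algebraic equations. The instruction format and names are invented for illustration.

```python
# Toy 'universal machine': a single interpreter driven by an instruction list.
def run(program, memory):
    """Execute (op, dest, a, b) instructions over a dictionary of named values."""
    ops = {"add": lambda x, y: x + y, "sub": lambda x, y: x - y,
           "mul": lambda x, y: x * y, "div": lambda x, y: x / y}
    for op, dest, a, b in program:
        memory[dest] = ops[op](memory[a], memory[b])
    return memory

# A 'special-purpose machine' replaced by a programme: solve
#   a1*x + b1*y = c1,  a2*x + b2*y = c2   by Cramer's rule.
memory = {"a1": 2.0, "b1": 1.0, "c1": 5.0, "a2": 1.0, "b2": -1.0, "c2": 1.0}
cramer = [
    ("mul", "t1", "a1", "b2"), ("mul", "t2", "b1", "a2"), ("sub", "det", "t1", "t2"),
    ("mul", "t3", "c1", "b2"), ("mul", "t4", "b1", "c2"), ("sub", "dx", "t3", "t4"),
    ("mul", "t5", "a1", "c2"), ("mul", "t6", "c1", "a2"), ("sub", "dy", "t5", "t6"),
    ("div", "x", "dx", "det"), ("div", "y", "dy", "det"),
]
result = run(cramer, memory)
print(result["x"], result["y"])   # 2.0 1.0 solves 2x + y = 5, x - y = 1
```

A different instruction list would make the same interpreter behave like a different special-purpose machine, which is the point of the 'universal' definition above.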


10.12737/3880 ◽  
2014 ◽  
Vol 8 (2) ◽  
pp. 83-90
Author(s):  
Olga Tsekhanovich ◽
Viktor Byrdin

The article is a description of the 2D didactical methodology as applied to higher education settings. The 2D didactics concept, «Gzhel-1», was developed at the Gzhel State Artistically-Industrial Institute with a view to developing more refined common cultural and professional competences of undergraduate students majoring in tourism. The methodology aims at developing the individual capabilities of students and at integrating teacher-facilitated in-class, individual and joint learning, ensuring collective as well as individual content processing, the construction of knowledge, the creation of new individually meaningful knowledge, and the acquisition of socially and professionally valuable competences [1]. The author supports the general view that mobile devices (computers, smart phones, tablet PCs) are currently an important contributor to the implementation of new educational technologies. Mobile devices afford students access to education-tailored applications: the «Gzhel-1» methodology, as applied in «Computer sciences» and «Information technologies in the service sphere» classes, assumes that students receive discipline-specific tasks, as well as tasks involving the regional ecological component. From the author's perspective, the knowledge and skills students acquire through the methodology serve to decrease social tension in the Gzhel zone, as the content of the Gzhel-1-supported disciplines includes research tools and a range of mechanisms for enhancing the quality of Gzhel's eco-surroundings. The article provides a description of the major eco-surroundings challenges, such as: drinking water quality, electromagnetic fields, air pollution, historical landscape alterations, the status of unique natural monuments, clay extraction, anthropogenic load, and the radioecological situation.


Author(s):  
Anjaneyulu Lankadasu ◽  
Laurent Krumenacker ◽  
Anil Kumar ◽  
Amita Tripathi

Accurate prediction of condensation plays an important role in the development of high-efficiency turbomachines working on condensable fluids. This demands modeling the polydisperse character of the droplet number distribution function when modeling condensation. Two such models are considered in this work: the quadrature method of moments (QMOM) and the multi-fluid method (MFM). The vital difference between these two models lies in the method of discretisation of the droplet size distribution. Further, their numerical aspects, such as ease of implementation in general-purpose computational fluid dynamics solvers, accuracy, and the associated computational cost, are discussed. In order to obtain accurate thermodynamic properties, the real-gas formulations defined in IAPWS-IF97 are used. These algorithms are applied in the compressible Navier-Stokes solver of Fluidyn MP, and tests are carried out on a Laval nozzle and compared with experimental measurements.
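The stated difference between the two approaches is how the droplet size distribution is discretised: MFM carries a set of droplet size classes, while QMOM carries only its low-order moments. The short sketch below illustrates the two representations numerically on an assumed lognormal distribution; the distribution and all numbers are invented and not from the paper.

```python
# Two discretisations of an assumed droplet size distribution (illustrative only).
import numpy as np

r = np.linspace(1e-9, 2e-7, 4000)          # droplet radius grid [m]
dr = r[1] - r[0]
mu, sigma = np.log(3e-8), 0.4              # assumed lognormal parameters
n = np.exp(-(np.log(r) - mu) ** 2 / (2 * sigma ** 2)) / (r * sigma * np.sqrt(2 * np.pi))

# MFM-style representation: number of droplets in a fixed set of size classes.
edges = np.linspace(r[0], r[-1], 9)        # 8 size classes
classes = [(n[(r >= a) & (r < b)] * dr).sum() for a, b in zip(edges[:-1], edges[1:])]

# QMOM-style representation: only the low-order radial moments are carried.
moments = [(r ** k * n * dr).sum() for k in range(4)]   # m0..m3

print("droplets per size class:", np.round(classes, 3))
print("moments m0..m3:", [f"{m:.3e}" for m in moments])
print("number-mean radius m1/m0:", moments[1] / moments[0])
```

In a flow solver the MFM transports one field per size class, whereas QMOM transports only the few moment fields plus a quadrature closure, which is the accuracy/cost trade-off the abstract discusses.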

