Computer Simulation of Partial Discharges in Voids inside Epoxy Resins Using Three-Capacitance and Analytical Models

Polymers ◽  
2020 ◽  
Vol 12 (1) ◽  
pp. 77 ◽  
Author(s):  
Johnatan M. Rodríguez-Serna ◽  
Ricardo Albarracín-Sánchez ◽  
Ming Dong ◽  
Ming Ren

Epoxy resin is one of the most common polymers used in the insulation systems of key electrical assets such as power transformers and hydrogenerators. It is therefore necessary to know its main characteristics and to evaluate its condition when subjected to High Voltage (HV). A brief review of epoxy resins' applications as insulating materials is made, their main characteristics as insulating media are given, the improvements obtained with nano-fillers are summarized, and the main electrical properties required for Partial Discharge (PD) modelling are listed. In addition, the theoretical background and state of the art of the three-capacitance and analytical models for simulating PD in solid dielectrics, such as epoxy resins, are reviewed in detail. Their main advantages and disadvantages are presented, some critical arguments about the modelling procedure and its assumptions are made, and some improvements are proposed, taking into account conclusions drawn by other authors using models related to the PD development process. Finally, a case study was simulated using a modified three-capacitance model and the analytical model. The PD rate, q-φ-n diagrams and the minimum, mean and maximum PD electric charge are compared with measurements reported in the literature. Simulation results are in reasonable agreement with measured values. Capacitance models can be implemented in general-purpose electric circuit simulation packages; however, their simulation is computationally expensive. Moreover, although the modified three-capacitance model is not as accurate as finite-element or analytical models, its results also agree with real data.
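The abc (three-capacitance) model lends itself to a compact simulation sketch. The toy version below is not the authors' implementation; all parameter values, including the void voltage-division factor and the series capacitance, are invented for illustration. A discharge fires whenever the void voltage exceeds an inception threshold and collapses it to a residual value:

```python
import math

# Toy abc (three-capacitance) PD model sketch.
# All numeric values are illustrative assumptions, not taken from the paper.

def simulate_pd(u_peak=20e3, f=50.0, u_inc=3e3, u_res=500.0,
                cycles=5, steps_per_cycle=2000):
    """Return a list of (time, apparent_charge) PD events."""
    cb = 0.05e-12  # series (healthy dielectric) capacitance, F (assumed)
    k = 0.25       # fraction of the applied voltage across the void (assumed)
    events = []
    dt = 1.0 / (f * steps_per_cycle)
    offset = 0.0   # voltage step left on the void by previous discharges
    for n in range(cycles * steps_per_cycle):
        t = n * dt
        u_c = k * u_peak * math.sin(2 * math.pi * f * t) + offset
        if abs(u_c) >= u_inc:               # inception criterion
            sign = 1.0 if u_c > 0 else -1.0
            du = u_c - sign * u_res         # voltage collapse across the void
            events.append((t, cb * du))     # apparent charge q ~ Cb * du
            offset -= du                    # discharge resets the void voltage
    return events

events = simulate_pd()
print(f"{len(events)} PD events in 5 cycles")
```

Binning the events by phase angle and charge magnitude would yield the kind of q-φ-n diagram the paper compares against measurements.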

1987 ◽  
Vol 14 (3) ◽  
pp. 134-140 ◽  
Author(s):  
K.A. Clarke

Practical classes in neurophysiology reinforce and complement the theoretical background in a number of ways, including demonstration of concepts, practice in planning and performing experiments, and the production and maintenance of viable neural preparations. The balance of teaching objectives will depend on the particular group of students involved. A technique is described which allows the embedding of real compound action potentials from one of the most basic introductory neurophysiology experiments (the frog sciatic nerve) into interactive programs for student use. These retain all the elements of the "real experiment" in terms of appearance, presentation, experimental management and measurement by the student. Laboratory reports show that the experiments are performed carefully and enthusiastically and that the material is well absorbed. Three groups of students derive most benefit from their use. First, students whose future careers will not involve animal experiments do not spend time developing dissecting skills they will not use, and spend more time fulfilling the other teaching objectives. Second, relatively inexperienced students, who, while struggling to produce viable neural material and master complicated laboratory equipment, are often left with little time or motivation to take accurate readings or ponder neurophysiological concepts. Third, students in institutions where neurophysiology is taught with difficulty because of the high cost of equipment and a lack of specific expertise, who may well have access to a low-cost general-purpose microcomputer system.


1996 ◽  
Vol 2 (4) ◽  
pp. 295-302 ◽  
Author(s):  
BRUCE W. WATSON

Finite automata and various extensions of them, such as transducers, are used in areas as diverse as compilers, spelling checking, natural-language grammar checking, communication protocol design, digital circuit simulation, digital flight control, speech recognition and synthesis, genetic sequencing, and Java program verification. Unfortunately, as the number of applications has grown, so has the variety of implementations and implementation techniques. Typically, programmers are confused enough to resort to their textbooks for even the most elementary algorithms. Recently, advances have been made in taxonomizing algorithms for constructing and minimizing automata and in evaluating various implementation strategies (Watson 1995). Armed with this, a number of general-purpose toolkits have been developed at universities and companies. One of these, FIRE Lite, was developed at the Eindhoven University of Technology, while its commercial successor, FIRE Engine II, has been developed at Ribbit Software Systems Inc. Both toolkits provide implementations of all the known algorithms for constructing automata from regular expressions and all the known algorithms for minimizing deterministic finite automata. While the two toolkits have a great deal in common, we concentrate on the structure and use of the noncommercial FIRE Lite. The prototype version of FIRE Lite was designed with compilers in mind. More recently, computational linguists and communication protocol designers have become interested in using the toolkit. This has led to the development of a much more general interface to FIRE Lite, including support for both Mealy and Moore regular transducers. While such a toolkit may appear extremely complex, there are only a few choices to be made. We also give a 'recipe' for making good use of the toolkits. Lastly, we consider the future of FIRE Lite. While FIRE Engine II has obvious commercial value, we are committed to maintaining a version which is freely available for academic use.
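As a flavour of the algorithm families such toolkits package, here is the textbook subset construction (epsilon-free for brevity, and in Python rather than the toolkit's C++; this is not FIRE Lite's actual API), which converts a nondeterministic automaton into an equivalent deterministic one:

```python
# Textbook subset (powerset) construction sketch.
# nfa: dict mapping (state, symbol) -> set of successor states (no epsilons).

def subset_construction(nfa, start, accepts, alphabet):
    start_set = frozenset([start])
    dfa, dfa_accepts = {}, set()
    worklist, seen = [start_set], {start_set}
    while worklist:
        current = worklist.pop()
        if current & accepts:
            dfa_accepts.add(current)
        for sym in alphabet:
            # DFA successor = union of NFA successors over the current set
            nxt = frozenset(s for q in current for s in nfa.get((q, sym), ()))
            dfa[(current, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                worklist.append(nxt)
    return dfa, start_set, dfa_accepts

def run(dfa, s0, acc, word):
    state = s0
    for ch in word:
        state = dfa[(state, ch)]
    return state in acc

# NFA accepting strings over {a, b} whose second-to-last symbol is 'a'
nfa = {
    (0, 'a'): {0, 1}, (0, 'b'): {0},
    (1, 'a'): {2},    (1, 'b'): {2},
}
dfa, s0, acc = subset_construction(nfa, 0, {2}, 'ab')
```

For example, `run(dfa, s0, acc, "ab")` accepts while `run(dfa, s0, acc, "bb")` rejects. Minimizing the resulting DFA is the other algorithm family such toolkits cover.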


2018 ◽  
Vol 16 ◽  
pp. 01002
Author(s):  
Jitka Poměnková ◽  
Eva Klejmová ◽  
Tobiáš Malach

The paper deals with significance testing of time-series co-movement measured via wavelet analysis, namely via the wavelet cross-spectrum. This technique is popular for its better time resolution compared to other techniques, and it reveals both long-run and short-run co-movement. To improve predictive power, it is advisable to support and validate the obtained results with a testing procedure. We investigate a test of the wavelet cross-spectrum against a Gaussian white-noise background using the Bessel function. Our experiment is performed on real data, i.e. seasonally adjusted quarterly gross domestic product of the United Kingdom, Korea and the G7 countries. To validate the test results we perform a Monte Carlo simulation. We describe the advantages and disadvantages of both approaches and formulate recommendations for their use.
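The Monte Carlo validation step can be sketched as follows. This toy version (my own illustration, not the authors' code) estimates the null distribution of the wavelet cross-spectrum magnitude for Gaussian white noise at a single scale, using a direct Morlet convolution; a real analysis would sweep many scales and apply the Bessel-function-based analytical test alongside:

```python
import cmath, math, random

def morlet(t, scale, omega0=6.0):
    """Morlet wavelet (admissibility correction omitted for brevity)."""
    x = t / scale
    return (math.pi ** -0.25) * cmath.exp(1j * omega0 * x) * math.exp(-0.5 * x * x)

def cwt_at(series, scale):
    """Continuous wavelet coefficients at one scale via direct convolution."""
    n, half = len(series), int(4 * scale)
    out = []
    for i in range(n):
        acc = 0j
        for k in range(-half, half + 1):
            if 0 <= i + k < n:
                acc += series[i + k] * morlet(k, scale).conjugate()
        out.append(acc / math.sqrt(scale))
    return out

def cross_spectrum(x, y, scale):
    return [a * b.conjugate() for a, b in zip(cwt_at(x, scale), cwt_at(y, scale))]

def mc_threshold(n, scale, trials=200, q=0.95, seed=42):
    """q-quantile of |cross-spectrum| under the Gaussian white-noise null."""
    rng = random.Random(seed)
    vals = []
    for _ in range(trials):
        x = [rng.gauss(0, 1) for _ in range(n)]
        y = [rng.gauss(0, 1) for _ in range(n)]
        vals.extend(abs(c) for c in cross_spectrum(x, y, scale))
    vals.sort()
    return vals[int(q * len(vals))]

thr = mc_threshold(n=64, scale=4.0, trials=20)
```

Observed cross-spectrum magnitudes above `thr` would then be flagged as significant co-movement at that scale.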


1982 ◽  
Vol 30 (5) ◽  
pp. 719-724 ◽  
Author(s):  
T. Takada ◽  
Kiyoyuki Yokoyama ◽  
M. Ida ◽  
T. Sudo

2019 ◽  
Vol 26 (1) ◽  
pp. 39-62
Author(s):  
Stanislav O. Bezzubtsev ◽  
Vyacheslav V. Vasin ◽  
Dmitry Yu. Volkanov ◽  
Shynar R. Zhailauova ◽  
Vladislav A. Miroshnik ◽  
...  

The paper proposes the architecture and basic requirements for a network processor for OpenFlow switches in software-defined networks. An analysis of the architectures of well-known network processors is presented: NP-5 from EZchip (now Mellanox) and Tofino from Barefoot Networks. The advantages and disadvantages of two different network processor architectures are considered: a pipeline whose stages are a set of general-purpose processor cores, and a pipeline whose stages are cores specialized for specific packet-processing operations. Based on a dedicated set of the most common use-case scenarios, a new network processor unit (NPU) architecture with functionally specialized pipeline stages is proposed. The article describes a simulation model of the NPU with the proposed architecture, implemented in C++ using SystemC, an open-source C++ library. For functional testing of the NPU model, the described use-case scenarios were implemented in C. To evaluate the performance of the proposed NPU architecture, a set of software products developed by the KM211 company and the KMX32 family of microcontrollers were used. NPU performance was evaluated on the basis of the simulation model, yielding estimates of the per-packet processing time and the average throughput of the NPU model for each scenario.
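A back-of-the-envelope model of such a functionally specialized pipeline helps fix intuitions about the reported metrics. The sketch below is not the authors' SystemC model; the stage names, cycle costs and clock rate are invented. It shows why per-packet latency is the sum of stage costs while steady-state throughput is set by the slowest stage:

```python
from dataclasses import dataclass

# Toy model of a functionally specialized NPU pipeline.
# Stage names and cycle costs are illustrative assumptions.

@dataclass
class Stage:
    name: str
    cycles: int  # cost of this specialized stage, in clock cycles

def pipeline_metrics(stages, clock_hz):
    latency_cycles = sum(s.cycles for s in stages)  # one packet end-to-end
    bottleneck = max(s.cycles for s in stages)      # issue interval per packet
    return {
        "latency_s": latency_cycles / clock_hz,
        "throughput_pps": clock_hz / bottleneck,    # packets per second
    }

stages = [Stage("parse", 4), Stage("lookup", 10),
          Stage("modify", 3), Stage("egress", 2)]
m = pipeline_metrics(stages, clock_hz=1_000_000_000)
```

With these invented numbers, a 1 GHz clock and a 10-cycle lookup bottleneck cap throughput at 100 M packets/s, regardless of how fast the other stages are; this is the motivation for specializing the costliest stage.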


2021 ◽  
Vol 16 (6) ◽  
pp. 3376-3386
Author(s):  
Maira Bedebayeva ◽  
Roza Kadirbayeva ◽  
Laura Suleimenova ◽  
Gulzhan O Zhetpisbayeva ◽  
Gulira Nurmukhanbetova

Blended cooperative learning applications, which deliver education through the opportunities offered by information technologies, have the potential to increase interaction between learners, help learners retain information more permanently, and develop positive attitudes towards the lesson. Along with developing technology, technological tools have been incorporated into education, and the effect of the blended learning method, a technological innovation, is very important in language teaching. The aim of this study is to determine the opinions of English teachers about blended teaching. Within the scope of this general purpose, the positive and negative aspects of technological tools, the advantages and disadvantages of blended learning methods, and the effect of this method on students were determined by English teachers working in secondary schools. A qualitative research method was used. The opinions of 15 English teachers who use the blended learning method and technological tools in their classes and who participated in the research voluntarily were taken; in selecting the sample, the teachers' use of technology was taken as a basis. The findings were thematised and explained using content analysis. It was concluded that, according to the participating teachers, technological tools contribute positively to learning. Among the results of the study are that blended learning has important advantages such as providing instant and continuous feedback to students, taking individual differences into account, increasing interaction and communication outside the classroom, and increasing interest in the lesson. Keywords: English, language, blended learning, technology, educational environment, information technology


2020 ◽  
Vol 25 (3) ◽  
pp. 7-12
Author(s):  
Rud V.V. ◽  

This paper considers the problems of integrating independent manipulator control systems. The areas of manipulator control are: recognition of objects and obstacles, identification of objects to be grasped, determination of reliable grasp poses, planning of manipulator motion to those poses with obstacle avoidance, and detection of slipping, i.e. verification of a reliable grasp. This is a current problem primarily in industry, general-purpose robots, and experimental robots. The paper reviews current publications that address these issues. Existing algorithms and approaches have been surveyed, both for controlling individual parts of the robot manipulator and for solutions that combine several areas or integrate several existing approaches. A brief review of the current literature on these algorithms and approaches is given, and the advantages and disadvantages of the considered methods are determined. Existing solutions cover either several areas or only one of them, which does not meet the requirements of the problem. Using existing approaches, integration points between existing implementations were identified to obtain the best results. In the process, a system was developed that analyses the environment; finds obstacles, objects for interaction and grasp poses; plans the motion of the manipulator to a specific position; and ensures reliable grasping of the object. The next step was to test the system, verify its performance, and tune the parameters for the best results. The system was developed by the RT-Lions research team at Reutlingen University. The hardware includes an Intel RealSense camera, a Sawyer arm from Rethink Robotics, and an in-house grasping device.


2019 ◽  
Vol 15 (S1) ◽  
pp. 253-266 ◽  
Author(s):  
Kazi Badrul Ahsan ◽  
M. R. Alam ◽  
Doug Gordon Morel ◽  
M. A. Karim

Emergency departments (EDs) have become increasingly congested due to the combined impacts of growing demand, access block and the increased clinical capability of EDs. This congestion is known to have adverse impacts on the performance of healthcare services. Attempts to overcome this challenge have focused largely on demand management and the application of system-wide process targets, such as the "four-hour rule", intended to deal with access block. In addition, EDs have introduced various strategies such as "fast tracking" and "enhanced triage", and new models of care such as nurse practitioners, aimed at improving throughput. However, most of these practices require additional resources. Some researchers have attempted to optimise resources using various optimisation models to ensure the best utilisation of resources and improve patient flow. However, not all modelling approaches are suitable for all situations, and there is no critical review of optimisation models used in hospital EDs. The aim of this article is to review the analytical models used to optimise ED resources for improved patient flow and to highlight the benefits and limitations of these models. A range of modelling techniques has been reviewed, including agent-based modelling and simulation, discrete-event simulation, queuing models, simulation optimisation and mathematical modelling. The analysis revealed that every modelling approach and optimisation technique has advantages and disadvantages, and that their application is also guided by the objectives. The complexity, interrelationships and variability of ED-related variables make the application of standard modelling techniques difficult. However, these models can be used to identify sources of flow obstruction and areas where investments in additional resources are likely to have the most benefit.
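As one concrete instance of the queuing-model family mentioned above, an M/M/c model with the Erlang C formula gives a back-of-the-envelope estimate of patient waiting. The arrival and service rates below are invented for illustration; real ED models must cope with the variability and interdependence the review highlights:

```python
import math

# M/M/c queue sketch: Poisson arrivals, exponential service, c parallel
# "servers" (treatment bays). All rates below are invented for illustration.

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arriving patient must wait (Erlang C formula)."""
    a = arrival_rate / service_rate        # offered load in Erlangs
    rho = a / servers                      # utilisation per server
    if rho >= 1:
        return 1.0                         # unstable: queue grows without bound
    summation = sum(a ** k / math.factorial(k) for k in range(servers))
    top = (a ** servers / math.factorial(servers)) / (1 - rho)
    return top / (summation + top)

def mean_wait(arrival_rate, service_rate, servers):
    """Mean time in queue, Wq = P(wait) / (c*mu - lambda)."""
    pw = erlang_c(arrival_rate, service_rate, servers)
    return pw / (servers * service_rate - arrival_rate)

# e.g. 10 patients/hour arriving, each bay treating 1.5 patients/hour, 8 bays
w = mean_wait(10.0, 1.5, 8)
```

Re-running `mean_wait` with 9 bays instead of 8 shows the marginal effect of one extra resource, which is exactly the kind of question the reviewed optimisation models address at much higher fidelity.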

