Fuzzy Sets-Based Retranslation of Numerical Data in E-Learning

Author(s):  
Adam Niewiadomski ◽  
Bartosz Rybusiński
Author(s):  
EDGE C. YEH ◽  
SHAO HOW LU

In this paper, the characterization of hysteresis in fuzzy spaces is presented, utilizing a fuzzy learning algorithm to generate fuzzy rules automatically from numerical data. The hysteresis phenomenon is first described in order to analyze its underlying mechanism. A fuzzy learning algorithm is then presented to learn the phenomenon and is used to predict a simple hysteresis case. The results of learning are illustrated by mesh plots and input-output relation plots. Furthermore, the dependency of prediction accuracy on the number of fuzzy sets is studied. The method provides a useful tool for modeling the hysteresis phenomenon in fuzzy spaces.
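The abstract does not give the algorithm's details, but generating fuzzy rules automatically from numerical input-output data is commonly done in the Wang-Mendel style: partition each variable into overlapping fuzzy sets, assign every data point to the sets where its membership is maximal, and record one rule per distinct antecedent. A minimal sketch, assuming triangular membership functions and hypothetical sample data:

```python
# Wang-Mendel-style rule generation from numerical data
# (a sketch; the paper's exact algorithm may differ).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def partition(lo, hi, n):
    """n overlapping triangular sets evenly covering [lo, hi]."""
    step = (hi - lo) / (n - 1)
    return [(lo + (i - 1) * step, lo + i * step, lo + (i + 1) * step)
            for i in range(n)]

def best_set(x, sets):
    """Index of the fuzzy set with maximal membership at x."""
    return max(range(len(sets)), key=lambda i: tri(x, *sets[i]))

def learn_rules(samples, in_sets, out_sets):
    """One rule 'IF x is A_i THEN y is B_j' per distinct antecedent set."""
    rules = {}
    for x, y in samples:
        rules[best_set(x, in_sets)] = best_set(y, out_sets)
    return rules

in_sets = partition(0.0, 1.0, 5)   # 5 fuzzy sets on the input range
out_sets = partition(0.0, 1.0, 5)  # 5 fuzzy sets on the output range
samples = [(0.1, 0.2), (0.5, 0.9), (0.9, 0.4)]  # hypothetical (x, y) data
rules = learn_rules(samples, in_sets, out_sets)
```

Varying the number of sets passed to `partition` is the knob whose effect on prediction accuracy the abstract says is studied.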


2014 ◽  
Vol 513-517 ◽  
pp. 2186-2189
Author(s):  
Lian Hong Ding

A challenging topic for e-learning systems is finding appropriate learning assets for users when e-learning takes place in an open, dynamic environment. A good personalized e-learning environment should recommend the right learning content to each learner. To find appropriate learning materials for learners with different preferences, this paper presents a fuzzy-set-theoretic method for e-learning systems. First, e-learning resources and user interests are represented with fuzzy values. Second, an algorithm based on several fuzzy-set-theoretic similarity measures is introduced to find the e-learning content matching a learner's request. Finally, an approach to recommending learning materials based on this similarity computation is given. Compared to the baseline crisp-set-based method presented, our method shows an improvement in precision without loss of recall.
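The abstract does not name the individual similarity measures it combines; one standard fuzzy-set-theoretic choice is the min/max (Jaccard-like) measure, where intersection and union of fuzzy sets are taken pointwise as min and max of the memberships. A minimal sketch, with hypothetical keyword-membership values for a user profile and two learning resources:

```python
def fuzzy_jaccard(a, b):
    """Min/max similarity of two fuzzy sets given as {keyword: membership}."""
    keys = set(a) | set(b)
    num = sum(min(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)  # |A ∩ B|
    den = sum(max(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)  # |A ∪ B|
    return num / den if den else 0.0

# Hypothetical fuzzy user-interest profile and resource descriptions.
user = {"python": 0.9, "fuzzy logic": 0.7, "statistics": 0.2}
course_a = {"python": 0.8, "fuzzy logic": 0.4}
course_b = {"statistics": 0.9, "algebra": 0.6}

# Rank learning materials by similarity to the user profile.
ranked = sorted([("A", course_a), ("B", course_b)],
                key=lambda kv: fuzzy_jaccard(user, kv[1]), reverse=True)
```

Recommending the top-ranked materials mirrors the recommendation step the abstract describes; the real system would combine several such measures rather than this single one.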


Author(s):  
Dinesh C. S. Bisht ◽  
Shilpa Jain ◽  
Pankaj Kumar Srivastava

This study helps select the interval length for fuzzy sets in fuzzy time series prediction. To examine the effect of intervals and evaluate the efficiency of the proposed algorithm, numerical data on water recharge and discharge are used to predict water table elevation fluctuation (WTEF). Particle swarm optimization (PSO) is a powerful tool for handling the optimization of multimodal problems, whereas fuzzy logic can handle uncertainty. In this paper, adaptive inertia weights are adopted for PSO rather than static inertia weights, which further improves its efficiency; this modified PSO is termed adaptive particle swarm optimization (APSO). APSO optimizes the intervals, and these intervals are then used to generate the fuzzy sets for prediction. The results indicate that APSO performs better than PSO and genetic algorithm approaches on the same problem.
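The abstract does not specify the adaptive inertia-weight update; a common non-static scheme is an inertia weight that decays linearly over the iterations (large early for exploration, small late for exploitation). A self-contained sketch of PSO with such a decaying weight — all parameter values are illustrative, and the sphere function stands in for the actual interval-fitness objective:

```python
import random

def apso_minimize(f, dim, iters=200, n=20, w_max=0.9, w_min=0.4,
                  c1=2.0, c2=2.0, seed=0):
    """PSO whose inertia weight decays from w_max to w_min over the run
    (one simple 'adaptive' scheme; the paper's update may differ)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        # Linearly decaying (non-static) inertia weight.
        w = w_max - (w_max - w_min) * t / (iters - 1)
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = apso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

In the paper's setting, the decision vector would hold the fuzzy-interval boundaries and `f` would score the resulting forecast error.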


Author(s):  
W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on "metallography and other industrial applications". I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as being contrast due to "large strains", an error which they corrected with admirable rapidity as the theory developed. At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of "the image" to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an "atomic resolution image" can faithfully represent the variations in atomic spacing at a localised defect.
Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to obtain numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast. Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the blind faith of the users in the image produced as being a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image which is required.


Author(s):  
B. Lencova ◽  
G. Wisselink

Recent progress in computer technology enables the calculation of lens fields and focal properties on commonly available computers such as IBM ATs. If we add to this the use of graphics, we greatly increase the applicability of design programs for electron lenses. Most programs for field computation are based on the finite element method (FEM). They are written in Fortran 77, so that they are easily transferred from PCs to larger machines. The design process has recently been made significantly more user-friendly by adding input programs written in Turbo Pascal, which allows a flexible implementation of computer graphics. The input programs offer not only menu-driven input and modification of numerical data, but also graphical editing of the data; they create files which are subsequently read by the Fortran programs. From the main menu of our magnetic lens design program, further options are chosen by using function keys or numbers. Some options (lens initialization and setting, fine mesh, current densities, etc.) open other menus where computation parameters can be set or numerical data can be entered with the help of a simple line editor. The "draw lens" option enables graphical editing of the mesh - see fig. 1. The geometry of the electron lens is specified in terms of coordinates and indices of a coarse quadrilateral mesh. Within this mesh, the fine mesh with smoothly changing step size is calculated by an automeshing procedure. The options shown in fig. 1 allow modification of the number of coarse mesh lines, changes to the coordinates of mesh points or lines, and specification of lens parts. Interactive and graphical modification of the fine mesh can be called from the fine mesh menu. Finally, the lens computation can be called. Our FEM program allows up to 8000 mesh points on an AT computer.
Another menu allows the display of computed results stored in output files: graphical display of the axial flux density, the flux density in magnetic parts, and the flux lines in magnetic lenses - see fig. 2. A series of several lens excitations with user-specified or default magnetization curves can be calculated and displayed in one session.
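The automeshing step described above — a fine mesh with smoothly changing step size generated inside a coarse quadrilateral mesh — can be illustrated in one dimension by geometric grading between two coarse mesh lines. A sketch with hypothetical parameters (the actual program's grading rule is not given in the abstract):

```python
def graded_steps(x0, x1, n, ratio):
    """Place n fine-mesh intervals between coarse lines x0 and x1 whose
    widths grow geometrically by `ratio`, i.e. a smoothly changing step."""
    total = sum(ratio ** i for i in range(n))  # sum of relative widths
    h0 = (x1 - x0) / total                     # width of the first interval
    pts = [x0]
    for i in range(n):
        pts.append(pts[-1] + h0 * ratio ** i)
    return pts

# 4 intervals between coarse lines at 0.0 and 1.0, each twice the previous.
pts = graded_steps(0.0, 1.0, 4, 2.0)
```

With `ratio` near 1 the fine mesh is nearly uniform; a ratio chosen per coarse cell concentrates points where the field varies rapidly, e.g. near pole-piece gaps.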


ASHA Leader ◽  
2007 ◽  
Vol 12 (14) ◽  
pp. 24-25 ◽  
Author(s):  
Gloria D. Kellum ◽  
Sue T. Hale
