Migration of a relational database to NoSQL: The way forward

Author(s):  
Alae El Alami ◽  
Mohamed Bahaj
2019 ◽  
pp. 94-106
Author(s):  
Mikhail D. Malykh ◽  
Anton L. Sevastianov ◽  
Leonid A. Sevastianov ◽  
...  

The need to transform a database from one format to another arises periodically in organizations for various reasons. Today, the mechanisms for changing the format of relational databases are well developed, but with the advent of new types of database such as NoSQL, the problem has been exacerbated by the radical difference in how the data are organized. This article discusses a formalized, set-theoretic method for choosing the number and composition of collections for a key-value database. The input data are the properties of the objects whose information is stored in the database, together with the set of queries that are executed most frequently or whose speed should be maximized. The method can be applied not only when creating a new key-value database, but also when transforming an existing one, migrating from a relational database to NoSQL, or consolidating databases.
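As a rough illustration of the idea (a sketch, not the authors' algorithm), the following Python snippet groups object properties into key-value collections driven by a weighted query workload. The property names, weights, and the greedy merge heuristic are all assumptions invented for this example.

```python
from itertools import combinations

# Hypothetical workload: each query is the set of object properties it
# reads, weighted by execution frequency (all names invented here).
queries = {
    frozenset({"name", "email"}): 300,           # login lookup
    frozenset({"name", "address", "city"}): 45,  # shipping label
    frozenset({"sku", "price"}): 120,            # catalogue lookup
}

def choose_collections(queries):
    """Greedy heuristic: start with one candidate collection per query's
    property set, then repeatedly merge overlapping sets so that every
    query can be answered by reading a single collection."""
    collections = [set(q) for q in queries]
    merged = True
    while merged:
        merged = False
        for a, b in combinations(collections, 2):
            if a & b:  # shared properties -> merge the two collections
                collections.remove(a)
                collections.remove(b)
                collections.append(a | b)
                merged = True
                break
    return collections

print(choose_collections(queries))
# -> e.g. [{'sku', 'price'}, {'address', 'city', 'email', 'name'}]
```

Disjoint sub-workloads yield separate narrow collections; in practice the choice would also weigh collection width against the number of lookups a query needs, which is where the query weights would enter.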


2020 ◽  
Vol 164 ◽  
pp. 09014
Author(s):  
Mikhail Urubkin ◽  
Vasiliy Galushka ◽  
Vladimir Fathi ◽  
Denis Fathi ◽  
Alla Gerasimenko

The article is devoted to the problem of representing graphs in a form that is most suitable for recording them in relational databases and for subsequent efficient retrieval and processing. The article analyzes various ways to describe graphs, such as adjacency matrices, incidence matrices, and adjacency lists. Each of them is reviewed for its compliance with normal forms in order to assess its suitability when developing databases for storing graphs. It is shown that for such a task each of these methods has significant disadvantages that lead to low efficiency of both data storage and processing. The article proposes representing graphs as a relational list of edges that conforms to third normal form and eliminates the disadvantages of the other methods.
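For concreteness, here is a minimal sketch of such an edge-list design, using SQLite from Python. It is an illustration under assumptions (the table and column names node, edge, src, dst are invented), not the schema from the article.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Edge-list schema: nodes and edges are separate relations, and every
# non-key attribute depends only on its relation's key, which keeps
# the design in third normal form.
conn.executescript("""
    CREATE TABLE node (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE edge (
        src INTEGER NOT NULL REFERENCES node(id),
        dst INTEGER NOT NULL REFERENCES node(id),
        PRIMARY KEY (src, dst)
    );
""")

conn.executemany("INSERT INTO node VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])
conn.executemany("INSERT INTO edge VALUES (?, ?)",
                 [(1, 2), (2, 3), (1, 3)])

# Neighbours of node 1: a single indexed lookup on the edge relation.
for (name,) in conn.execute(
        "SELECT n.name FROM edge e JOIN node n ON n.id = e.dst "
        "WHERE e.src = ?", (1,)):
    print(name)  # -> b, c (order may vary)
```

Keeping nodes and edges in separate relations means adding or removing an edge never duplicates or disturbs node data, which is what the normal-form argument buys over matrix-style encodings.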


Author(s):  
Liliana María Favre

Reverse engineering is the process of analyzing available software artifacts such as requirements, design, architectures, code, or byte code, with the objective of extracting information and providing high-level views of the underlying system. A common idea in reverse engineering is to exploit the source code as the most reliable description both of the behavior of a software system and of the organization and its business rules. However, reverse engineering encompasses a variety of tasks related to comprehending and modifying software, such as re-documenting programs and relational databases, recovering architectures, alternative design views, and design patterns, building traceability between code and designs, modernizing interfaces, and extracting source code or high-level abstractions from byte code when the source code is not available. Reverse engineering is closely associated with the modernization of legacy systems that were developed many years ago with technology that is now obsolete. These systems include software, hardware, business processes, and organizational strategies and policies. Many of them remain in use after more than 20 years; they may be written for technology that is expensive to maintain and that may no longer be aligned with current organizational policies. Legacy systems embody key knowledge acquired over the life of an organization. Changes are motivated by multiple reasons, for instance changes in the way we do business and create value. Important business rules are embedded in the software and may not be documented elsewhere. The way in which the legacy system operates is not explicit (Brodie and Stonebraker, 1995) (Sommerville, 2004).


2018 ◽  
Vol 41 ◽  
Author(s):  
Maria Babińska ◽  
Michal Bilewicz

Abstract: The problem of extended fusion and identification can be approached from a diachronic perspective. Based on our own research, as well as findings from the fields of social, political, and clinical psychology, we argue that the way contemporary emotional events shape local fusion is similar to the way in which historical experiences shape extended fusion. We propose a reciprocal process in which historical events shape contemporary identities, whereas contemporary identities shape interpretations of past traumas.


2020 ◽  
Vol 43 ◽  
Author(s):  
Aba Szollosi ◽  
Ben R. Newell

Abstract: The purpose of human cognition depends on the problem people try to solve. Defining the purpose is difficult, because people seem capable of representing problems in an infinite number of ways. The way in which the function of cognition develops needs to be central to our theories.


1976 ◽  
Vol 32 ◽  
pp. 233-254
Author(s):  
H. M. Maitzen

Ap stars are peculiar in many aspects. During this century astronomers have been collecting data about them and have found such a confusing variety of peculiar behaviour, even from star to star, that Struve stated in 1942 that at least we know these phenomena are not supernatural. A real push to start deeper theoretical work on Ap stars was given by additional observational evidence, namely the discovery of magnetic fields on these stars by Babcock (1947). This originated the concept that magnetic fields are the cause of the spectroscopic and photometric peculiarities. Great leaps for astronomical mankind were the Oblique Rotator model by Stibbs (1950) and Deutsch (1954), which incidentally provided the mathematical tools for the later handling of pulsar geometries, and the discovery of the phase coincidence of the extrema of the magnetic field, spectrum, and photometric variations (e.g. Jarzebowski, 1960).


Author(s):  
W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on "metallography and other industrial applications". I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as being contrast due to "large strains", an error which they corrected with admirable rapidity as the theory developed.

At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of "the image" to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an "atomic resolution image" can faithfully represent the variations in atomic spacing at a localised defect. Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to get numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast.

Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the blind faith of the users in the image produced as being a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image which is required.


1979 ◽  
Vol 44 (1) ◽  
pp. 3-30 ◽  
Author(s):  
Carol A. Prutting

A rationale for the application of a stage process model for the language-disordered child is presented. The major behaviors of the communicative system (pragmatic-semantic-syntactic-phonological) are summarized and organized in stages from pre-linguistic to the adult level. The article provides clinicians with guidelines, based on complexity, for the content and sequencing of communicative behaviors to be used in planning remedial programs.

