Brain text processing model: A new approach based on conceptual dependency stories

2009 ◽  
Vol 3 ◽  
Author(s):  
Petr Kratochvil ◽  
Jana Kleckova


Author(s):  
Wolfgang Mueller ◽  
Bernd Kleinjohann

Abstract Most engineering tasks are highly data-intensive, coping with the ever-increasing complexity of systems. Gigabytes of heterogeneous engineering data have to be managed consistently by a large array of tools. This calls for sophisticated integration techniques based on a common database management system in order to reduce the amount of data that must be exchanged between these tools. In this paper we present a new approach to distributed design frameworks that integrates graphical as well as text-processing tools. Tools can share the same graphical and logical data online, synchronized by the event mechanism of an object management system. The synchronization concept rests on a tight integration with an object-oriented object management system and provides the means to keep the graphical views of multiple agents consistent. We outline these concepts using the example of an integrated EXPRESS modeling workbench.
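The event-driven synchronization described above can be illustrated with a minimal sketch. The class and method names below (ObjectStore, View, on_update) are hypothetical illustrations, not the authors' actual framework API.

```python
# Minimal sketch of an event-based synchronization mechanism of the kind the
# abstract describes: a common object management layer pushes change events
# to every registered view so that all agents stay consistent.
# All names here are hypothetical, not the authors' framework.

class View:
    """A graphical or textual client that mirrors shared objects."""
    def __init__(self, name):
        self.name = name
        self.local_copy = {}

    def on_update(self, object_id, new_state):
        # Each agent refreshes its local view when the shared object changes.
        self.local_copy[object_id] = new_state
        print(f"{self.name}: object {object_id} -> {new_state}")


class ObjectStore:
    """Common object management layer; broadcasts change events to all views."""
    def __init__(self):
        self.objects = {}
        self.views = []

    def register(self, view):
        self.views.append(view)

    def update(self, object_id, new_state):
        self.objects[object_id] = new_state
        for view in self.views:          # event mechanism: push, not poll
            view.on_update(object_id, new_state)


if __name__ == "__main__":
    store = ObjectStore()
    store.register(View("graphical editor"))
    store.register(View("text editor"))
    store.update("entity_42", {"kind": "EXPRESS entity", "attrs": 3})
```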


Author(s):  
Snezhana Sulova ◽  
Boris Bankov

The impact of social networks on our lives keeps increasing because they provide content, generated and controlled by users, that is constantly evolving. They help us spread news, statements, ideas and comments very quickly. Social platforms are currently among the richest sources of customer feedback on a variety of topics. A frequently discussed topic is resort and holiday villages and the tourist services offered there. Customer comments are valuable to both travel planners and tour operators. The accumulation of opinions in the web space is a prerequisite for applying appropriate tools for their computer processing and for extracting useful knowledge from them. When working with unstructured data such as social media messages, there is no universal text-processing algorithm, because each social network and its resources have their own characteristics. In this article, we propose a new approach to the automated analysis of a static set of historical user messages about holiday and vacation resorts published on Twitter. The approach is based on natural language processing techniques and the application of machine learning methods. The experiments are conducted using the software product RapidMiner.
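As a rough illustration of such a pipeline, here is a sketch in Python rather than RapidMiner; the file name, column names and sentiment labels are hypothetical placeholders, not the article's actual data.

```python
# Rough Python equivalent of the kind of pipeline the article describes:
# tokenization/vectorization of tweets followed by supervised classification.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import classification_report

# Static, historical set of tweets about holiday resorts (assumed CSV layout).
tweets = pd.read_csv("resort_tweets.csv")          # columns: text, sentiment

X_train, X_test, y_train, y_test = train_test_split(
    tweets["text"], tweets["sentiment"], test_size=0.2, random_state=42)

vectorizer = TfidfVectorizer(lowercase=True, stop_words="english",
                             ngram_range=(1, 2), min_df=2)
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

model = MultinomialNB()
model.fit(X_train_vec, y_train)
print(classification_report(y_test, model.predict(X_test_vec)))
```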


2020 ◽  
Vol 10 (1) ◽  
pp. 13-33
Author(s):  
Nitesh Kumar Jha ◽  
Arnab Mitra

Text summarization plays a significant role in quick knowledge acquisition from any text-based knowledge resource. To enhance the text-summarization process, a new approach to automatic text summarization is presented in this article that facilitates level-based (word importance factor) automated summarization. During input text processing with WordNet, a directed graph is built and an equivalent tree is produced from it. Detailed investigations further confirm that the execution time of the proposed automatic text summarization follows a strictly linear relationship with respect to the volume of input. Further investigation of its performance confirms its superiority over several other existing text-summarization approaches.
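A highly simplified extractive summarizer along these lines might look as follows; the article's WordNet graph/tree construction is omitted here, so this is only an approximation of the idea of word-importance-based sentence ranking, not the proposed algorithm.

```python
# Toy extractive summarizer: score words by an importance factor (here plain
# frequency), score sentences by their average word importance, keep the top
# sentences in original order. Runs in time linear in the input length.
import re
from collections import Counter

def summarize(text, n_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    importance = Counter(words)                     # word importance factor
    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(importance[t] for t in tokens) / (len(tokens) or 1)
    ranked = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    # Keep the selected sentences in their original order.
    return " ".join(s for s in sentences if s in ranked)

if __name__ == "__main__":
    sample = ("Text summarization shortens long documents. "
              "It keeps the most informative sentences. "
              "Automated methods score words and sentences to do this.")
    print(summarize(sample))
```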


Terminology ◽  
1998 ◽  
Vol 5 (2) ◽  
pp. 147-159 ◽  
Author(s):  
Johann Gamper ◽  
Oliviero Stock

The manual acquisition of terminological material from domain-specific texts is a very time-consuming task. Recent advances in text-processing research provide a basis for automating this task. Computer-assisted term acquisition improves both the quantity and the quality of terminological work. This paper gives a brief overview of this new approach to terminology acquisition. Three subtasks are distinguished: compilation of an electronic text corpus, extraction of terminological data, and management of terminological data. Each subtask is discussed in some detail by identifying the core problems as well as proposed solutions. As a concrete initiative in this emerging field, we present an ongoing research project at the European Academy Bolzano, which illustrates the importance of computer-assisted terminology acquisition and the steps that have recently been taken in this direction. The paper concludes with a summary of five selected papers presented at a workshop on corpus-based terminology in Bolzano. The full papers are published in this volume and in volume 4(2) of this journal.
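A toy sketch of automatic term-candidate extraction, assuming a simple frequency-contrast heuristic (domain corpus versus general reference corpus) rather than the project's actual pipeline:

```python
# Frequent multi-word sequences that are rare in a general reference corpus
# are proposed as term candidates. Generic illustration only.
import re
from collections import Counter

def ngrams(tokens, n):
    return zip(*(tokens[i:] for i in range(n)))

def term_candidates(domain_text, general_text, n=2, top=10):
    dom = Counter(ngrams(re.findall(r"[a-zäöüß-]+", domain_text.lower()), n))
    gen = Counter(ngrams(re.findall(r"[a-zäöüß-]+", general_text.lower()), n))
    # "Weirdness" ratio: domain frequency relative to general-language frequency.
    scored = {t: dom[t] / (1 + gen[t]) for t in dom}
    return sorted(scored, key=scored.get, reverse=True)[:top]

if __name__ == "__main__":
    domain = ("ladin terminology database ladin terminology work "
              "corpus-based terminology")
    general = "the weather is nice and the database is large"
    print(term_candidates(domain, general, n=2, top=3))
```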


1999 ◽  
Vol 173 ◽  
pp. 185-188
Author(s):  
Gy. Szabó ◽  
K. Sárneczky ◽  
L.L. Kiss

Abstract A widely used tool in studying quasi-monoperiodic processes is the O–C diagram. This paper deals with the application of this diagram in minor planet studies. The main difference between our approach and the classical O–C diagram is that we transform the epoch (=time) dependence into the geocentric longitude domain. We outline rotation modelling using this modified O–C diagram and illustrate its capabilities with a detailed error analysis. The primary assumption, that the monotonicity and the shape of this diagram are (almost) independent of the geometry of the asteroids, is discussed and tested. The monotonicity enables an unambiguous distinction between prograde and retrograde rotation, so the four-fold (or in some cases two-fold) ambiguities can be avoided. This turned out to be the main advantage of the O–C examination. As an extension of the theoretical work, we present some preliminary results on 1727 Mette based on new CCD observations.
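A schematic illustration of the modified O–C construction follows; the observation times, trial period and longitudes are made-up values for illustration only, not data from the paper.

```python
# O-C ("observed minus computed") series, with the usual epoch axis replaced
# by geocentric longitude as the paper proposes.
import numpy as np

observed_minima = np.array([0.000, 0.521, 1.043, 1.561, 2.084])     # days (fake)
geocentric_longitude = np.array([10.0, 55.0, 130.0, 210.0, 300.0])  # degrees
trial_period = 0.52                                                  # days

epoch = np.round((observed_minima - observed_minima[0]) / trial_period)
computed = observed_minima[0] + epoch * trial_period
o_minus_c = observed_minima - computed

# Plotting O-C against longitude instead of epoch: a monotonically increasing
# or decreasing trend then separates prograde from retrograde rotation.
for lon, oc in zip(geocentric_longitude, o_minus_c):
    print(f"longitude {lon:6.1f} deg   O-C {oc:+.3f} d")
```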


Author(s):  
V. Mizuhira ◽  
Y. Futaesaku

Previously we reported that tannic acid is a very effective fixative for proteins, including polypeptides. In particular, in the cross section of microtubules, thirteen subunits in the A-tubule and eleven in the B-tubule could be observed very clearly. An elastic fiber could be demonstrated very clearly as an electron-opaque, homogeneous fiber. However, tannic acid did not penetrate into the deep portion of the tissue block, so we tried catechin, which shows almost the same chemical behaviour toward proteins as tannic acid. Moreover, we expected catechin to have two active reaction sites, one being phenol and the other catechol. The catechol site should react with osmium to form osmium black, and the phenol site should react with peroxidase in the presence of perhydroxide.


Author(s):  
K. Chien ◽  
R. Van de Velde ◽  
I.P. Shintaku ◽  
A.F. Sassoon

Immunoelectron microscopy of neoplastic lymphoma cells is valuable for precise localization of surface antigens and identification of cell types. We have developed a new approach in which the immunohistochemical staining can be evaluated prior to embedding for EM, and the desired area subsequently selected for ultrathin sectioning. A freshly prepared lymphoma cell suspension is spun onto polylysine hydrobromide-coated glass slides by cytocentrifugation and immediately fixed, without air drying, in polylysine paraformaldehyde (PLP) fixative. After rinsing in PBS, the slides are stained by a three-step immunoperoxidase method. The cell monolayer is then fixed in buffered 3% glutaraldehyde prior to the DAB reaction. After the DAB reaction step, wet monolayers can be examined under LM for the presence of brown reaction product, and selected monolayers are then processed by routine methods for EM and embedded with the Chien Re-embedding Mold. After polymerization, the epoxy blocks are easily separated from the glass slides by heating on a 100°C hot plate for 20 seconds.


Author(s):  
W. A. Chiou ◽  
N. Kohyama ◽  
B. Little ◽  
P. Wagner ◽  
M. Meshii

The corrosion of copper and copper alloys in a marine environment is of great concern because of their widespread use in heat exchangers and steam condensers in which natural seawater is the coolant. It has become increasingly evident that microorganisms play an important role in the corrosion of a number of metals and alloys under a variety of environments. For the past 15 years the SEM has proven useful in studying biofilms and the spatial relationships between bacteria and localized corrosion of metals. Little information, however, has been obtained using TEM, which offers higher spatial resolution and transmission observation of interfaces. The research presented herein is the first step of this new approach to studying corrosion with biological influence in pure copper. Commercially produced copper (Cu, 99%) foils, approximately 120 μm thick, exposed to a copper-tolerant marine bacterium, Oceanospirillum, and to an abiotic culture medium were subsampled (1 cm × 1 cm) for this study along with unexposed control samples.


Author(s):  
Arthur V. Jones

With the introduction of field-emission sources and “immersion-type” objective lenses, the resolution obtainable with modern scanning electron microscopes is approaching that obtainable in STEM and TEM, but only with specific types of specimens. Bulk specimens still suffer from the restrictions imposed by internal scattering and the need to be conducting. Advances in coating techniques have largely overcome these problems, but for a sizeable body of specimens the restrictions imposed by coating are unacceptable. For such specimens, low-voltage operation, with its low beam penetration and freedom from charging artifacts, is the method of choice. Unfortunately, the technical difficulties in producing an electron beam sufficiently small and of sufficient intensity are considerably greater at low beam energies, so much so that a radical re-evaluation of conventional design concepts is needed. The probe diameter is usually given by
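The expression itself is cut off in this excerpt. A standard textbook form, given here as a hedged reconstruction and not necessarily the exact expression the author intended, adds the contributing disc diameters in quadrature:

    d_p = \sqrt{d_g^2 + d_s^2 + d_c^2 + d_d^2}, \qquad
    d_s = \tfrac{1}{2} C_s \alpha^3, \quad
    d_c = C_c \, \frac{\Delta E}{E_0} \, \alpha, \quad
    d_d = \frac{1.22\,\lambda}{\alpha},

where d_g is the demagnified (Gaussian) source image diameter, C_s and C_c are the spherical and chromatic aberration coefficients, α is the beam convergence semi-angle, ΔE/E_0 is the relative energy spread, and λ is the electron wavelength.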


1968 ◽  
Vol 32 (3) ◽  
pp. 279-282
Author(s):  
JI Mock ◽  
JW Grenfell ◽  
WA Richter
