Case Report: Preliminary Images From an Electromagnetic Portable Brain Scanner for Diagnosis and Monitoring of Acute Stroke

2021 ◽  
Vol 12 ◽  
Author(s):  
David Cook ◽  
Helen Brown ◽  
Isuravi Widanapathirana ◽  
Darshan Shah ◽  
James Walsham ◽  
...  

Introduction: Electromagnetic imaging is an emerging technology that promises a mobile, rapid neuroimaging modality for pre-hospital and bedside evaluation of stroke patients, based on the dielectric properties of tissue. It has become feasible through technological advances in materials, antenna design and manufacture, rapid portable computing power, network analysis, and the development of processing algorithms for image reconstruction. The purpose of this report is to introduce images from a novel, portable electromagnetic scanner being trialed for bedside and mobile imaging of ischaemic and haemorrhagic stroke.

Methods: A prospective convenience study enrolled patients (January 2020 to August 2020) with known stroke to undergo brain electromagnetic imaging in addition to usual imaging and medical care. Images are obtained by processing signals from encircling transceiver antennae, which emit and detect low-energy signals in the microwave frequency spectrum between 0.5 and 2.0 GHz. The purpose of the study was to refine the imaging algorithms.

Results: Examples of haemorrhagic and ischaemic stroke are presented, with comparison made against CT, perfusion imaging, and MRI T2 FLAIR sequences.

Conclusion: Given the speed of imaging, the size and mobility of the device, and its negligible environmental risks, the electromagnetic scanner is a promising additional modality for mobile and bedside neuroimaging.
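The abstract does not specify the reconstruction algorithm. As a minimal sketch only, the code below implements generic confocal delay-and-sum beamforming, one common way to form images from multistatic microwave measurements; the array geometry, tissue propagation speed, and placeholder signal data are all illustrative assumptions, not the device's actual method.

```python
import numpy as np

# Hypothetical setup: N antennas on a circle of radius R around the head.
# signals[tx, rx, t] holds time-domain scattered signals (placeholder data).
N, FS = 16, 20e9                        # number of antennas, sample rate (Hz)
C = 3e8 / np.sqrt(45.0)                 # assumed wave speed in tissue (eps_r ~ 45)
R = 0.12                                # assumed array radius (m)
ang = 2 * np.pi * np.arange(N) / N
ants = np.stack([R * np.cos(ang), R * np.sin(ang)], axis=1)

rng = np.random.default_rng(0)
signals = rng.standard_normal((N, N, 4096))   # stand-in for real measurements

def delay_and_sum(signals, ants, grid):
    """Confocal imaging: coherently sum each tx-rx signal at the
    round-trip travel time transmitter -> pixel -> receiver."""
    img = np.zeros(len(grid))
    for i, p in enumerate(grid):
        d = np.linalg.norm(ants - p, axis=1)      # antenna-to-pixel distances
        for tx in range(len(ants)):
            for rx in range(len(ants)):
                tau = (d[tx] + d[rx]) / C         # round-trip delay (s)
                k = int(round(tau * FS))          # corresponding sample index
                if k < signals.shape[2]:
                    img[i] += signals[tx, rx, k]
    return img ** 2                               # energy at each pixel

# Image a coarse 2-D grid inside the array.
xs = np.linspace(-0.09, 0.09, 31)
grid = np.array([(x, y) for x in xs for y in xs])
image = delay_and_sum(signals, ants, grid).reshape(31, 31)
```

The idea is that signals scattered from a dielectric contrast (e.g. a haemorrhage) add coherently at the true scatterer location, while clutter averages out.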

2018 ◽  
Vol 1 (4) ◽  
Author(s):  
Ali Hadizadeh ◽  
Ehsan Tanghatari

Processors are the main computational and decision-making components of a system. Today, as industry and technology increasingly demand faster and more accurate computing, the design and manufacture of parallel processing units has received considerable attention. One of the most important processor families used in a variety of devices is the MIPS family, which has long been regarded as a reasonable choice in the telecommunications and control industries. In this paper, a new architecture based on this processor, with a new parallel processing design, is proposed to allow dynamic parallel execution of instructions, ultimately increasing processor efficiency several-fold. In this architecture, new ideas for parallel instruction issue, intelligent detection of conditional jumps, and memory management are presented.
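The paper's issue logic is not reproduced here. As a generic illustration of the kind of check any dual-issue design must perform, the sketch below tests two decoded MIPS-like instructions for register hazards before allowing them to issue in the same cycle; the instruction representation and rules are simplified assumptions, not the authors' design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Instr:
    op: str                  # mnemonic, e.g. "add", "lw", "beq"
    dst: Optional[int]       # destination register, None if no writeback
    srcs: tuple              # source register numbers

def can_dual_issue(a: Instr, b: Instr) -> bool:
    """Allow b to issue alongside a only if it is independent of a.
    Simplified rules: no RAW (b reads a's result), no WAW (same
    destination), and never issue past an unresolved conditional jump."""
    if a.op in ("beq", "bne"):           # resolve the branch before issuing b
        return False
    if a.dst is not None:
        if a.dst in b.srcs:              # RAW hazard
            return False
        if a.dst == b.dst:               # WAW hazard
            return False
    return True

# Example: add $3,$1,$2 followed by sub $4,$3,$1 -> RAW, must serialize.
i1 = Instr("add", dst=3, srcs=(1, 2))
i2 = Instr("sub", dst=4, srcs=(3, 1))
print(can_dual_issue(i1, i2))            # False
```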


Author(s):  
S.J.B. Reed

Characteristic fluorescence

The theory of characteristic fluorescence corrections was first developed by Castaing. The same approach, with an improved expression for the relative primary x-ray intensities of the exciting and excited elements, was used by Reed, who also introduced some simplifications, which may be summarized as follows (with reference to K-K fluorescence, i.e. K radiation of element ‘B’ exciting K radiation of ‘A’):

1. The exciting radiation is assumed to be monochromatic, consisting of the Kα line only (neglecting the Kβ line).
2. Various parameters are lumped together in a single tabulated function J(A), which is assumed to be independent of B.
3. For calculating the absorption of the emerging fluorescent radiation, the depth distribution of the primary radiation B is represented by a simple exponential.

These approximations may no longer be justifiable given the much greater computing power now available. For example, the contribution of the Kβ line can easily be calculated separately.
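As a numerical illustration of simplification 3 only (not Reed's actual correction), if the fluorescence-generating primary radiation falls off as exp(-σz) with depth and the emerging fluorescent radiation is attenuated as exp(-χz), the escaping fraction has the closed form σ/(σ+χ). The sketch below checks this against direct numerical integration; the values of σ and χ are arbitrary assumptions.

```python
import numpy as np

# Arbitrary illustrative coefficients (not from Reed's tabulations).
sigma, chi = 4.0e3, 1.5e3                 # depth-decay and absorption (cm^-1)

z = np.linspace(0.0, 10.0 / sigma, 40001) # integrate to ~10 decay lengths
dz = z[1] - z[0]
generated = np.sum(np.exp(-sigma * z)) * dz
escaping = np.sum(np.exp(-sigma * z) * np.exp(-chi * z)) * dz

print(escaping / generated)               # numerical absorption factor
print(sigma / (sigma + chi))              # analytic value, both ~0.727
```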


Author(s):  
Stuart McKernan

For many years, quantitative diffraction contrast experiments might have consisted of determining dislocation Burgers vectors using the g·b = 0 criterion from several different two-beam images. Since the advent of the personal computer revolution, the computing power available for image-processing and image-simulation calculations has become enormous and ubiquitous. Several programs now exist to simulate diffraction contrast images using various approximations. The most common approximations are the use of only two beams or a single systematic row to calculate the image contrast, or the use of the column approximation. The growing body of literature comparing experimental and simulated images shows that very close agreement between the two can be obtained, although the choice of parameters and the assumptions made in performing the calculation must be handled carefully. The simulation of images of defects in materials has, in many cases, therefore become a tractable problem.
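None of the cited programs is specified in the abstract. As a minimal sketch of what a two-beam calculation involves, the code below integrates the Howie-Whelan coupled equations for the transmitted and diffracted amplitudes through a perfect (defect-free) foil; the extinction distance, deviation parameter, and thickness are assumed illustrative values, and absorption is neglected.

```python
import numpy as np

# Assumed two-beam parameters (illustrative only).
xi_g = 30.0          # extinction distance (nm)
s = 0.01             # deviation parameter (1/nm)
t = 120.0            # foil thickness (nm)
dz = 0.02            # integration step (nm)

phi0, phig = 1.0 + 0j, 0.0 + 0j      # transmitted / diffracted amplitudes
z = 0.0
while z < t:
    c = 1j * np.pi / xi_g
    d0 = c * phig * np.exp(2j * np.pi * s * z)
    dg = c * phi0 * np.exp(-2j * np.pi * s * z)
    phi0 += d0 * dz                   # simple Euler step; adequate for small dz
    phig += dg * dz
    z += dz

print("I_t =", abs(phi0) ** 2, " I_g =", abs(phig) ** 2)

# Analytic check for the perfect crystal (thickness fringes):
s_eff = np.sqrt(s ** 2 + 1.0 / xi_g ** 2)
print("analytic I_g =", (np.sin(np.pi * s_eff * t) / (xi_g * s_eff)) ** 2)
```

Defect contrast enters by adding a displacement-field phase term to the coupling, evaluated column by column under the column approximation.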


Author(s):  
Jose-Maria Carazo ◽  
I. Benavides ◽  
S. Marco ◽  
J.L. Carrascosa ◽  
E.L. Zapata

Obtaining the three-dimensional (3D) structure of negatively stained biological specimens at a resolution of, typically, 2-4 nm is becoming a relatively common practice in an increasing number of laboratories. A combination of new conceptual approaches, new software tools, and faster computers has made this situation possible. However, these 3D reconstruction processes are quite computer-intensive, and the medium-term future promises an even greater need for computing power. Up to now, all published 3D reconstructions in this field have been performed on conventional (sequential) computers, but new parallel computer architectures offer potential order-of-magnitude increases in computing power and should therefore be considered for the most computing-intensive tasks.

We have studied both shared-memory computer architectures, like the BBN Butterfly, and local-memory architectures, mainly hypercubes implemented on transputers, where we have used the algorithmic mapping method proposed by Zapata et al. In this work we have developed the basic software tools needed to obtain a 3D reconstruction from non-crystalline specimens (“single particles”) using the so-called Random Conical Tilt Series Method. We start from a pair of images of the same field, first tilted (by ≃55°) and then untilted. It is then assumed that we can supply the system with the image of the particle we are looking for (ideally, a 2D average from a previous study) and with a matrix describing the geometrical relationship between the tilted and untilted fields (this step is now accomplished by interactively marking a few pairs of corresponding features in the two fields). From here on, the 3D reconstruction process may be run automatically.
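The abstract does not give the form of the tilted-to-untilted transformation. As a generic sketch of how such a matrix could be recovered from a few interactively marked point pairs, the code below fits a 2D affine transform by least squares; the coordinates and the affine model are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

# Hypothetical corresponding features marked in the untilted (src)
# and tilted (dst) fields; coordinates are made up for illustration.
src = np.array([[10.0, 12.0], [85.0, 20.0], [40.0, 70.0], [90.0, 88.0]])
dst = np.array([[14.2, 11.1], [60.3, 18.9], [35.8, 68.0], [66.1, 86.2]])

def fit_affine(src, dst):
    """Least-squares 2D affine transform: dst ~ src @ A.T + b.
    Solves the linear system [x y 1] @ P = [x' y'] for P (3x2)."""
    n = len(src)
    X = np.hstack([src, np.ones((n, 1))])        # homogeneous coordinates
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P.T[:, :2], P.T[:, 2]                 # A (2x2), b (2,)

A, b = fit_affine(src, dst)
residual = src @ A.T + b - dst
print("A =", A, "\nb =", b)
print("rms error =", np.sqrt((residual ** 2).mean()))
```

A handful of well-spread pairs over-determines the six affine parameters, so the residual also gives a quick check on how carefully the features were marked.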


2016 ◽  
Vol 22 ◽  
pp. 178
Author(s):  
Sachin Jain ◽  
Anshuman Srivastava ◽  
Ramesh Aggarwal ◽  
Mahendra Rajput ◽  
Nishchint Jain

Methodology ◽  
2006 ◽  
Vol 2 (1) ◽  
pp. 24-33 ◽  
Author(s):  
Susan Shortreed ◽  
Mark S. Handcock ◽  
Peter Hoff

Recent advances in latent space and related random effects models hold much promise for representing network data. The inherent dependency between ties in a network makes modeling data of this type difficult. In this article we consider a recently developed latent space model that is particularly appropriate for the visualization of networks. We suggest a new estimator of the latent positions and perform two network analyses, comparing four alternative estimators. We demonstrate a method of checking the validity of the positional estimates. These estimators are implemented via a package in the freeware statistical language R. The package allows researchers to efficiently fit the latent space model to data and to visualize the results.
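The R package referenced in the article is not reproduced here. As a language-agnostic sketch of one standard way to obtain initial latent position estimates for such a model, the code below applies classical multidimensional scaling to the network's shortest-path distances; the toy adjacency matrix and the MDS-based estimator are illustrative assumptions, not the article's proposed estimator.

```python
import numpy as np

# Toy undirected network (adjacency matrix); illustrative data only.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

# Shortest-path (geodesic) distances via Floyd-Warshall.
n = len(A)
D = np.where(A > 0, 1.0, np.inf)
np.fill_diagonal(D, 0.0)
for k in range(n):
    D = np.minimum(D, D[:, [k]] + D[[k], :])

# Classical MDS: double-center the squared distances, then take the
# top-d eigenvectors scaled by sqrt(eigenvalue) as latent positions.
d = 2
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
idx = np.argsort(vals)[::-1][:d]
Z = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

print(Z)        # estimated 2-D latent positions, one row per node
```

Positions like these are typically used as starting values for likelihood-based fitting of the latent space model and for plotting the network.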


Author(s):  
L Feyen ◽  
H Seifarth ◽  
T Niederstadt ◽  
V Hesselmann ◽  
W Heindel ◽  
...  
