A high speed dataflow processing element and its performance compared to a von Neumann mainframe

Author(s):  
J.N. Coleman


2015 ◽
Vol 773 ◽  
pp. 366-394 ◽  
Author(s):  
Xisheng Luo ◽  
Minghu Wang ◽  
Ting Si ◽  
Zhigang Zhai

The interaction of a planar shock wave ($M\approx 1.2$) with an $\text{SF}_{6}$ polygonal inhomogeneity surrounded by air is experimentally investigated. Six polygons including a square, two types of rectangle, two types of triangle, and a diamond are generated by the soap film technique developed in our previous work, in which thin pins are used as angular vertices to avoid the pressure singularities caused by the surface tension. The evolutions of the shock-accelerated $\text{SF}_{6}$ polygons are captured by a high-speed schlieren system, from which wave systems and the interface characteristics can be clearly identified. Both regular and irregular refraction phenomena are observed outside the volume, and more complex wave patterns, including transmitted shock, refracted shock, Mach stem and the interactions between them, are found inside the volume. Two typical irregular refraction phenomena (free precursor refraction, FPR, and free precursor von Neumann refraction, FNR) are observed and analysed, and the transition from FPR to FNR is found, providing experimental evidence for the transition between different wave patterns numerically found in the literature. Combined with our previous work (Zhai et al., J. Fluid Mech., vol. 757, 2014, pp. 800–816), the reciprocal transitions between FPR and FNR are experimentally confirmed. The velocities and trajectories of the triple points are further measured, and it is found that the motions of the triple points are self-similar or pseudo-stationary. Besides the shock dynamics phenomena, the evolutions of these shocked heavy polygonal volumes, which are quite different from the light ones, are captured and found to be closely related to their initial shapes. Specifically, for square and rectangular geometries, the different width–height ratios result in different behaviours of shock–shock interaction inside the volume, and subsequently different features for the outward jet and the interface. Quantitatively, the time variations of the interface scales, such as the width and the normalized displacements of the edges, are obtained and compared with those from previous work. The comparison illustrates the superiority of the interface formation method and the significant effect of the initial interface shape on the interface features. Furthermore, the characteristics of the vortex core, including the velocity and vortex spacing, are experimentally measured, and the vortex velocity is compared with those from some circulation models to check the validity of the models. The results in the present work enrich the understanding of the shock refraction phenomenon and the database of research into Richtmyer–Meshkov instability (RMI).


2016 ◽  
Vol 55 (4S) ◽  
pp. 04EF08
Author(s):  
Zhe Chen ◽  
Jie Yang ◽  
Cong Shi ◽  
Qi Qin ◽  
Liyuan Liu ◽  
...  

Author(s):  
Subrata Dasgupta

In the ENIAC story so far, John von Neumann has had a fleeting presence. We saw that the BRL formed a high-powered scientific advisory committee at the start of World War II, well before the United States entered the war. Von Neumann was a member of this committee, and it is unlikely that anyone on the committee was as influential in the American scientific world, or, for that matter, in the corridors of power in Washington, DC, as he was. By the beginning of the 1940s, von Neumann had a massive reputation in the mathematical universe. His contributions spanned many regions of pure and applied mathematics, mathematical physics, even formal logic. He was one of the six mathematicians originally appointed as professors at the Institute for Advanced Study, Princeton, when its School of Mathematics opened in 1933—another was Einstein. In 1944, von Neumann and the economist Oskar Morgenstern (1902–1977) published a book titled Theory of Games and Economic Behavior, thus establishing for posterity the scientific discipline known as game theory. Herman Goldstine, who came to know von Neumann very well—first through their involvement with the BRL and then, after the war, at the Institute for Advanced Study, where Goldstine went to work with von Neumann on what came to be called the IAS computer project—wrote vividly about von Neumann’s intellectual persona: his ever-ready receptiveness to new ideas, his responsiveness to new intellectual challenges, his mental restlessness when between projects, and the single-mindedness with which he pursued an idea that captured his attention. Oddly enough, despite his involvement with the BRL, he was apparently unaware of the ENIAC project until a chance meeting with Goldstine at a railway station in Aberdeen, Maryland. Goldstine recalls how the entire tone and tenor of their first conversation, initially casual and relaxed, changed when von Neumann realized that Goldstine was involved with the development of a high-speed electronic computer. Thereafter, Goldstine writes, he felt as if he were being grilled in a doctoral oral examination. Thus began their association, a relationship that ended only with von Neumann’s death from cancer in 1957.


Colossus ◽  
2006 ◽  
Author(s):  
Jack Copeland

Secrecy about Colossus has bedevilled the history of computing. In the years following the Second World War, the Hungarian-born American logician and mathematician John von Neumann, through writings and charismatic public addresses, made the concept of the electronic digital computer widely known. Von Neumann knew nothing of Colossus, and he told the world that the American ENIAC—first operational at the end of 1945, two years after Colossus—was ‘the first electronic computing machine’. Others familiar with the ENIAC and unaware of Colossus peddled the same message. The myth soon became set in stone, and for the rest of the twentieth century book after book—not to mention magazines and newspaper articles—told readers that the ENIAC was the first electronic computer. In 1971, a leading computer science textbook gave this historical summary: ‘The early story has often been told, starting with Babbage and . . . up to the birth of electronic machines with ENIAC.’ The present chapter revisits the early story, setting Colossus in its proper place. In the original sense of the word, a computer was not a machine at all, but a human being—a mathematical assistant whose task was to calculate by rote, in accordance with a systematic method supplied by an overseer prior to the calculation. The computer, like a filing clerk, might have little detailed knowledge of the end to which his or her work was directed. Many thousands of human computers were employed in business, government, and research establishments, doing some of the sorts of calculating work that nowadays is performed by electronic computers (see photograph 42). The term ‘computing machine’ was used increasingly from the 1920s to refer to small calculating machines which mechanised elements of the human computer’s work. For a complex calculation, several dozen human computers might be required, each equipped with a desktop computing machine. By the 1940s, however, the scale of some calculations required by physicists and engineers had become so great that the work could not easily be done in a reasonable time by even a roomful of human computers with desktop computing machines. The need to develop high-speed large-scale computing machinery was pressing.


2020 ◽  
Vol 14 ◽  
Author(s):  
Carlo Michaelis ◽  
Andrew B. Lehr ◽  
Christian Tetzlaff

Neuromorphic hardware has several promising advantages over von Neumann architectures and is highly interesting for robot control. However, despite the high speed and energy efficiency of neuromorphic computing, algorithms utilizing this hardware in control scenarios are still rare. One problem is the transition from fast spiking activity on the hardware, which acts on a timescale of a few milliseconds, to a control-relevant timescale on the order of hundreds of milliseconds. Another problem is the execution of complex trajectories, which requires spiking activity to contain sufficient variability, while at the same time, for reliable performance, network dynamics must be adequately robust against noise. In this study we exploit a recently developed biologically inspired spiking neural network model, the so-called anisotropic network. We identified and transferred the core principles of the anisotropic network to neuromorphic hardware using Intel’s neuromorphic research chip Loihi and validated the system on trajectories from a motor-control task performed by a robot arm. We developed a network architecture including the anisotropic network and a pooling layer, which allows fast spike read-out from the chip and performs an inherent regularization. With this, we show that the anisotropic network on Loihi reliably encodes sequential patterns of neural activity, each representing a robotic action, and that the patterns allow the generation of multidimensional trajectories on control-relevant timescales. Taken together, our study presents a new algorithm that allows the generation of complex robotic movements as a building block for robotic control using state-of-the-art neuromorphic hardware.
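
The pooling read-out idea described above can be illustrated with a minimal sketch (plain NumPy with surrogate spike data, not actual Loihi output; the pool count, rates, and filter constant below are hypothetical choices, not the paper’s values): millisecond-scale spikes are averaged over neuron pools and low-pass filtered, yielding a smooth signal on the control-relevant timescale of hundreds of milliseconds.

```python
import numpy as np

# Surrogate illustration of a pooling read-out: spikes from a recurrent
# network (random data here, standing in for Loihi output) are averaged
# over disjoint neuron pools and low-pass filtered to bridge the ~ms
# spiking timescale to a ~100 ms control timescale.

rng = np.random.default_rng(0)

n_neurons = 400          # neurons in the (surrogate) spiking network
n_pools = 8              # pooled read-out channels (hypothetical)
dt_ms = 1.0              # simulation step, ~ms timescale of spiking
t_steps = 500            # 500 ms of activity
tau_ms = 100.0           # read-out filter constant, ~control timescale

# Surrogate spike trains: shape (t_steps, n_neurons), entries 0/1.
spikes = (rng.random((t_steps, n_neurons)) < 0.02).astype(float)

# Pooling: average activity over disjoint groups of neurons per step;
# averaging many noisy spike trains acts as an inherent regularizer.
pools = spikes.reshape(t_steps, n_pools, n_neurons // n_pools).mean(axis=2)

# Exponential low-pass filter smooths the pooled rates over ~tau_ms.
alpha = dt_ms / tau_ms
readout = np.zeros((t_steps, n_pools))
for t in range(1, t_steps):
    readout[t] = (1 - alpha) * readout[t - 1] + alpha * pools[t]

# 'readout' now varies smoothly and could drive, e.g., joint velocities.
print(readout.shape, readout[-1].round(4))
```

In this sketch the pool average supplies variability reduction and the filter supplies the timescale transition, which is the division of labour the abstract attributes to the pooling layer.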


2007 ◽  
Vol 18 (11) ◽  
pp. 1747-1764 ◽  
Author(s):  
X. F. PAN ◽  
AIGUO XU ◽  
GUANGCAI ZHANG ◽  
SONG JIANG

We present an improved lattice Boltzmann model for high-speed compressible flows. The model combines the discrete-velocity model of Kataoka and Tsutahara [15] with an appropriate finite-difference scheme and an additional dissipation term. With the dissipation term, the model parameters can be flexibly chosen so that the von Neumann stability condition is satisfied. The influence of the various model parameters on the numerical stability is analyzed and some reference parameter values are suggested. The new scheme works for both subsonic and supersonic flows with Mach numbers up to 30 (or higher), which is validated by well-known benchmark tests. Simulations of Riemann problems with very high pressure and density ratios (1000:1) also show good accuracy and stability. The successful recovery of regular and double Mach shock reflections shows the potential of applying the lattice Boltzmann model to fluid systems in which non-equilibrium processes are intrinsic. The new stabilization scheme can easily be extended to other lattice Boltzmann models.
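
As a minimal illustration of the stability argument (a generic one-dimensional advection scheme with an added dissipation term, not the authors’ discrete-velocity model), a von Neumann analysis inserts a Fourier mode into the scheme and checks that the modulus of the resulting amplification factor stays at or below one for all wavenumbers:

```python
import numpy as np

# Von Neumann stability analysis for a generic 1D scheme (illustrative
# only). Forward Euler in time, central differences in space, plus an
# artificial dissipation term eps * u_xx:
#   u_j^{n+1} = u_j^n - (c/2)(u_{j+1} - u_{j-1}) + d (u_{j+1} - 2 u_j + u_{j-1})
# with c = a*dt/dx (Courant number) and d = eps*dt/dx^2.
# Substituting the Fourier mode u_j^n = G^n exp(i k j dx) gives
#   G(theta) = 1 - i c sin(theta) - 4 d sin^2(theta/2),  theta = k dx,
# and stability requires |G(theta)| <= 1 for every theta.

def max_amplification(c, d, n=2001):
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    g = 1.0 - 1j * c * np.sin(theta) - 4.0 * d * np.sin(theta / 2.0) ** 2
    return np.abs(g).max()

c = 0.5
for d in [0.0, 0.05, 0.125, 0.25, 0.5]:
    print(f"d = {d:5.3f}  max|G| = {max_amplification(c, d):.4f}")

# Without dissipation (d = 0) the central scheme is unconditionally
# unstable (max|G| > 1); a suitably chosen d brings max|G| down to 1.
# Tuning free parameters of a dissipation term until this condition
# holds is the same kind of analysis the paper applies to its scheme.
```

Running the sketch shows max|G| ≈ 1.118 at d = 0 and max|G| = 1 once the dissipation parameter is large enough, which is exactly the flexibility in parameter choice that the abstract describes.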


2011 ◽  
Vol 21 (3) ◽  
pp. 823-826 ◽  
Author(s):  
Fumishige Miyaoka ◽  
Toshiki Kainuma ◽  
Yasuhiro Shimamura ◽  
Yuki Yamanashi ◽  
Nobuyuki Yoshikawa

Author(s):  
E.D. Wolf

Most microelectronic devices and circuits operate faster, consume less power, execute more functions, and cost less per circuit function when the feature sizes internal to the devices and circuits are made smaller. This is part of the stimulus for the Very High Speed Integrated Circuits (VHSIC) program. There is also a need for smaller, more sensitive sensors in a wide range of disciplines that includes electrochemistry, neurophysiology, and ultra-high-pressure solid-state research. There is often fundamental new science (and sometimes new technology) to be revealed (and used) when a basic parameter such as size is extended to new dimensions, as is evident at the two extremes of smallness and largeness: high-energy particle physics and cosmology, respectively. However, there is also a very important intermediate domain of size, spanning from the diameter of a small cluster of atoms up to nearly one micrometer, which may have just as profound an effect on society as “big” physics.

