Parallel functional programming in Eden

2005 ◽  
Vol 15 (3) ◽  
pp. 431-475 ◽  
Author(s):  
RITA LOOGEN ◽  
YOLANDA ORTEGA-MALLÉN ◽  
RICARDO PEÑA-MARÍ

Eden extends the non-strict functional language Haskell with constructs to control the parallel evaluation of processes. Although processes are defined explicitly, communication and synchronisation issues are handled in a way that is transparent to the programmer. In order to offer effective support for parallel evaluation, Eden's coordination constructs override the inherently sequential demand-driven (lazy) evaluation strategy of its computation language Haskell. Eden is a general-purpose parallel functional language suitable for developing sophisticated skeletons – which simplify parallel programming immensely – as well as for exploiting more irregular parallelism that cannot easily be captured by a predefined skeleton. The paper gives a comprehensive description of Eden, its semantics, its skeleton-based programming methodology – which is applied in three case studies – and its implementation and performance. Furthermore, it points to many additional results that have been achieved in the context of the Eden project.
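To give a flavour of the coordination constructs the abstract refers to, the following minimal Haskell sketch uses Eden's process abstraction and instantiation operator. The names (Process, Trans, process, and the # operator) follow the interface published in the Eden literature and are assumed here to be importable from Control.Parallel.Eden as in the edenmodules library; the sketch compiles only with an Eden-enabled GHC, and the naive map skeleton is purely illustrative, not taken from the paper.

```haskell
-- Minimal sketch of Eden-style coordination (assumes an Eden-enabled GHC
-- and the Control.Parallel.Eden interface; illustrative, not from the paper).
import Control.Parallel.Eden (Process, Trans, process, (#))

-- 'process f' turns the function f into a process abstraction;
-- 'p # x' instantiates p as a child process that receives the input x
-- and communicates its result back to the parent.
-- A naive map skeleton: one child process per list element.
parMapSketch :: (Trans a, Trans b) => (a -> b) -> [a] -> [b]
parMapSketch f xs = map (process f #) xs
```

Process instantiation and the eager communication of results are what override Haskell's lazy evaluation here; the skeletons discussed in the paper add demand and granularity control on top of this basic pattern.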

2001 ◽  
Vol 9 (2-3) ◽  
pp. 143-161 ◽  
Author(s):  
Insung Park ◽  
Michael J. Voss ◽  
Seon Wook Kim ◽  
Rudolf Eigenmann

We present our effort to provide a comprehensive parallel programming environment for the OpenMP parallel directive language. This environment includes a parallel programming methodology for the OpenMP programming model and a set of tools (Ursa Minor and InterPol) that support this methodology. Our toolset provides automated and interactive assistance to parallel programmers in time-consuming tasks of the proposed methodology. The features provided by our tools include performance and program structure visualization, interactive optimization, support for performance modeling, and performance advising for finding and correcting performance problems. The presented evaluation demonstrates that our environment offers significant support in general parallel tuning efforts and that the toolset facilitates many common tasks in OpenMP parallel programming in an efficient manner.


1987 ◽  
Vol 14 (3) ◽  
pp. 134-140 ◽  
Author(s):  
K.A. Clarke

Practical classes in neurophysiology reinforce and complement the theoretical background in a number of ways, including the demonstration of concepts, practice in the planning and performance of experiments, and the production and maintenance of viable neural preparations. The balance of teaching objectives will depend upon the particular group of students involved. A technique is described which allows the embedding of real compound action potentials from one of the most basic introductory neurophysiology experiments (the frog sciatic nerve) into interactive programs for student use. These retain all the elements of the “real experiment” in terms of appearance, presentation, experimental management and measurement by the student. Laboratory reports by the students show that the experiments are carefully and enthusiastically performed and the material is well absorbed. Three groups of students derive the most benefit from their use. First, students whose future careers will not involve animal experiments do not spend time developing dissecting skills they will not use, but spend more time fulfilling the other teaching objectives. Second, relatively inexperienced students, who struggle to produce viable neural material and master complicated laboratory equipment, are often left with little time or motivation to take accurate readings or ponder neurophysiological concepts. Third, students in institutions where neurophysiology is taught with difficulty because of the high cost of equipment and lack of specific expertise may well have access to a low-cost, general-purpose microcomputer system.


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Geraldine Cáceres Sepúlveda ◽  
Silvia Ochoa ◽  
Jules Thibault

Due to the highly competitive market and increasingly stringent environmental regulations, it is paramount to operate chemical processes at their optimal point. In a typical process, there are usually many process variables (decision variables) that need to be selected in order to achieve the set of objectives for which the process will be considered to operate optimally. Because some of the objectives are often contradictory, multi-objective optimization (MOO) can be used to find a suitable trade-off among all objectives that will satisfy the decision maker. The first step is to circumscribe a well-defined Pareto domain, corresponding to the portion of the solution domain comprised of a large number of non-dominated solutions. The second step is to rank all Pareto-optimal solutions based on the preferences of an expert of the process; this step is performed using visualization tools and/or a ranking algorithm. The last step is to implement the best solution to operate the process optimally. In this paper, after reviewing the main methods to solve MOO problems and to select the best Pareto-optimal solution, four simple MOO problems are solved to clearly demonstrate the wealth of information on a given process that can be obtained from MOO rather than from a single aggregate objective. The four optimization case studies are the design of a PI controller, an SO2-to-SO3 reactor, a distillation column and an acrolein reactor. Results of these optimization case studies show the benefit of generating and using the Pareto domain to gain a deeper understanding of the underlying relationships between the various process variables and performance objectives.
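As a concrete illustration of the first step (circumscribing the Pareto domain), the following Haskell sketch filters a set of candidate solutions down to its non-dominated subset, assuming every objective is to be minimized. The function names and the representation of a solution as a list of objective values are hypothetical and not taken from the paper.

```haskell
-- Illustrative sketch (not from the paper): extract the non-dominated
-- (Pareto-optimal) subset of candidate solutions, each represented by
-- its list of objective values, assuming all objectives are minimized.

-- 'a' dominates 'b' if it is no worse in every objective and strictly
-- better in at least one.
dominates :: [Double] -> [Double] -> Bool
dominates a b = and (zipWith (<=) a b) && or (zipWith (<) a b)

-- Keep only those solutions that no other candidate dominates.
-- (A solution never dominates itself, so comparing against the whole list is safe.)
paretoFront :: [[Double]] -> [[Double]]
paretoFront xs = [x | x <- xs, not (any (`dominates` x) xs)]

main :: IO ()
main = print (paretoFront [[1,5],[2,3],[3,1],[4,4],[2,2]])
-- prints [[1.0,5.0],[3.0,1.0],[2.0,2.0]]
```

Ranking the retained solutions (the second step) would then apply the expert's preferences through visualization tools and/or a ranking algorithm, as the abstract notes.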


Data ◽  
2021 ◽  
Vol 6 (6) ◽  
pp. 60
Author(s):  
Miguel A. Becerra ◽  
Catalina Tobón ◽  
Andrés Eduardo Castro-Ospina ◽  
Diego H. Peluffo-Ordóñez

This paper provides a comprehensive description of the current literature on data fusion, with an emphasis on Information Quality (IQ) and performance evaluation. This literature review highlights recent studies that reveal existing gaps, the need to find a synergy between data fusion and IQ, several research issues, and the challenges and pitfalls in this field. First, the main models, frameworks, architectures, algorithms, solutions, problems, and requirements are analyzed. Second, a general data fusion engineering process is presented to show how complex it is to design a framework for a specific application. Third, an IQ approach is addressed, along with the different methodologies and frameworks used to assess IQ in information systems; in addition, data fusion systems are presented along with their related criteria. Furthermore, the role of context in data fusion systems and its IQ assessment are discussed. Subsequently, the issue of data fusion systems' performance is reviewed. Finally, some key aspects and concluding remarks are outlined, and some future lines of work are gathered.


Author(s):  
Hatem Abou-Senna ◽  
Mohamed El-Agroudy ◽  
Mustapha Mouloua ◽  
Essam Radwan

The use of express lanes (ELs) in freeway traffic management has seen increasing popularity throughout the United States, particularly in Florida. These lanes are intended to serve as an efficient transportation system management and operations tool that provides a more reliable trip. An important component of ELs is the channelizing devices used to delineate the separation between the ELs and the general-purpose lanes. With the upcoming changes to the FHWA Manual on Uniform Traffic Control Devices, this study provided an opportunity to recommend changes affecting safety and efficiency on a nationwide level. It was important to understand the impacts on driver perception and performance in response to the color of the EL delineators. It was also valuable to understand the differences between demographic groups in responding to delineator colors under different driving conditions. A driving simulator was used to test the responses of several demographic groups to changes in marker color and driving conditions. Furthermore, participants were tested for several factors relevant to driving performance, including visual and subjective responses to the changes in colors and driving conditions. Impacts on driver perception were observed via eye-tracking technology with changes to time of day, visibility, traffic density, roadway surface type, and, crucially, color of the delineating devices. The analyses concluded that white was the optimal and most noticeable delineator color across the majority of subjective and performance measures, followed by yellow, with black being the least desirable.


2017 ◽  
Vol 16 (4) ◽  
pp. 155-160
Author(s):  
Ian Johnston

Purpose: This paper aims to show that everything a business does is fundamentally reliant on its culture. Culture determines how successful a strategy is and whether that strategy can be executed. If the culture in a business is out of alignment, it is imperative to change it. This paper examines how HR professionals can take ownership of this cultural space and help to create a growth mindset throughout the organisation.
Design/methodology/approach: The paper is based on experience gained through working with several large organisations to transform their people culture and performance by embracing a growth mindset and to help their HR leadership become the early champions of change, thus ensuring the process was successfully delivered. The paper includes case studies of two organisations where successful cultural shaping delivered improved results.
Findings: Companies with a growth mindset will outperform those with a fixed mindset. Changing mindsets is not overly complex, but it requires flawless implementation with the HR leaders at the forefront.
Originality/value: As Lou Gerstner, who turned around the computing giant IBM, said, “I finally realised that culture is not part of the game, it is the game”. By understanding how individual mindsets impact culture, HR professionals can own and drive their organisation's culture-shaping efforts.


2010 ◽  
Vol 20 (02) ◽  
pp. 103-121 ◽  
Author(s):  
MOSTAFA I. SOLIMAN ◽  
ABDULMAJID F. Al-JUNAID

Technological advances in IC manufacturing provide us with the capability to integrate more and more functionality into a single chip. Today's modern processors have nearly one billion transistors on a single chip. With the increasing complexity of today's systems, designs have to be modeled at a high level of abstraction before partitioning into hardware and software components for final implementation. This paper explains in detail the implementation and performance evaluation of a matrix processor called Mat-Core with SystemC (a system-level modeling language). Mat-Core is a research processor aimed at exploiting the increasing number of transistors per IC to improve the performance of a wide range of applications. It extends a general-purpose scalar processor with a matrix unit. To hide memory latency, the extended matrix unit is decoupled into two components: address generation and data computation, which communicate through data queues. Like vector architectures, the data computation unit is organized in parallel lanes. However, on parallel lanes, Mat-Core can execute matrix-scalar, matrix-vector, and matrix-matrix instructions in addition to vector-scalar and vector-vector instructions. For controlling the execution of vector/matrix instructions on the matrix core, this paper extends the well-known scoreboard technique. Furthermore, the performance of Mat-Core is evaluated on vector and matrix kernels. Our results show that the performance of a four-lane Mat-Core with matrix registers of size 4 × 4 (16 elements) each, a queue size of 10, a start-up time of 6 clock cycles, and a memory latency of 10 clock cycles is about 0.94, 1.3, 2.3, 1.6, 2.3, and 5.5 FLOPs per clock cycle, achieved on scalar-vector multiplication, SAXPY, Givens, rank-1 update, vector-matrix multiplication, and matrix-matrix multiplication, respectively.
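For reference, the following Haskell definitions spell out two of the benchmarked kernels (SAXPY and rank-1 update). They are purely illustrative functional descriptions of the operations behind the reported FLOPs-per-cycle figures, not Mat-Core or SystemC code.

```haskell
-- Illustrative reference definitions of two benchmarked kernels
-- (functional descriptions only; not Mat-Core/SystemC code).

-- SAXPY: y <- a*x + y, elementwise over two vectors.
saxpy :: Double -> [Double] -> [Double] -> [Double]
saxpy a = zipWith (\x y -> a * x + y)

-- Rank-1 update: A <- A + x * y^T, with the matrix given as a list of rows.
rank1Update :: [[Double]] -> [Double] -> [Double] -> [[Double]]
rank1Update rows xs ys =
  [ zipWith (\aij yj -> aij + xi * yj) row ys | (row, xi) <- zip rows xs ]

main :: IO ()
main = do
  print (saxpy 2 [1, 2, 3] [10, 20, 30])              -- [12.0,24.0,36.0]
  print (rank1Update [[1, 0], [0, 1]] [1, 2] [3, 4])  -- [[4.0,4.0],[6.0,9.0]]
```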


2017 ◽  
Vol 45 (3) ◽  
pp. 23-29
Author(s):  
John Oliver

Purpose: CEO turnover and chronic corporate underperformance are examined through the lens of Transgenerational Response.
Design/methodology/approach: The criteria for investigating Transgenerational Response in corporations consisted of identifying a Critical Corporate Incident, the number of corporate generations and the resultant corporate financial performance.
Findings: The evidence presented in the case studies illustrates how a Critical Corporate Incident has produced the consequential effect of chronically poor financial performance in the years following the incident.
Research limitations/implications: These case studies have not presented the “actual” adaptive responses, inherited attitudes and behaviours that have subsequently embedded themselves in a new corporate culture after the Critical Corporate Incident, to the detriment of the long-term health and performance of each firm.
Practical implications: Examining CEO turnover and chronic corporate underperformance through the lens of Transgenerational Response means that business leaders can identify how a historic event has affected the performance of their firm in subsequent generations. With this knowledge in hand, they will be able to examine the inherited attitudes and behaviours, organizational policies, strategy and adaptive cultural routines that have combined to consolidate the firm's chronic underperformance.
Originality/value: This is a highly original, evidence-based idea that has the potential to reshape our current understanding of CEO turnover and underperforming firms. It will help business leaders identify how a historic event has affected the performance of a firm in subsequent generations.


2013 ◽  
Vol 26 (1-2) ◽  
pp. 101-116 ◽  
Author(s):  
Luis Maia Carneiro ◽  
António Lucas Soares ◽  
Rui Patrício ◽  
Américo Lopes Azevedo ◽  
Jorge Pinho de Sousa
