Enabling Multi-Robot Cooperative Additive Manufacturing: Centralized vs. Decentralized Approaches

2021 ◽  
Author(s):  
Saivipulteja Elagandula ◽  
Laxmi Poudel ◽  
Wenchao Zhou ◽  
Zhenghui Sha

Abstract This paper presents a decentralized approach based on a simple set of rules to carry out multi-robot cooperative 3D printing. Cooperative 3D printing is a novel approach that uses multiple mobile 3D printing robots to print a large part by dividing it into chunks and assigning the chunks to robots that print in parallel (chunk-based printing). The results obtained using the decentralized approach are then compared with those obtained from a centralized approach. Two case studies were performed to evaluate the performance of both approaches using makespan as the evaluation criterion. The first case study is a small-scale problem with four printing robots and 20 chunks, whereas the second is a large-scale problem with ten printing robots and 200 chunks. The results show that the centralized approach yields a better makespan than the decentralized approach in both cases. However, the gap between the solutions shrinks as the problem scale grows. While further study is required to verify this conclusion, the narrowing gap indicates that the decentralized approach may compare favorably with the centralized approach for large-scale problems in manufacturing with multiple mobile 3D printing robots. Additionally, the runtime for the large-scale problem (Case II) increases 27-fold over the small-scale problem (Case I) for the centralized approach, whereas it increases by less than 2-fold for the decentralized approach.
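To make the centralized/decentralized contrast concrete, the following minimal Python sketch compares the makespan of an exhaustive centralized chunk assignment with a simple decentralized rule in which the least-loaded robot claims the next chunk. The chunk print times, the two robots, and both policies are illustrative assumptions rather than the authors' formulation, but the exhaustive search also hints at why the centralized runtime grows so quickly with problem size.

```python
# Hypothetical sketch: makespan of a centralized (exhaustive) chunk assignment
# versus a simple decentralized rule ("the least-loaded robot claims the next chunk").
# Chunk print times and both policies are illustrative assumptions only.
from itertools import product

chunk_times = [4, 3, 5, 2, 6, 3, 4, 5]   # assumed per-chunk print times
n_robots = 2

def makespan(assignment):
    # assignment[i] = robot index that prints chunk i
    loads = [0] * n_robots
    for chunk, robot in enumerate(assignment):
        loads[robot] += chunk_times[chunk]
    return max(loads)

# Centralized: enumerate every assignment and keep the best (optimal but costly).
central_best = min(makespan(a) for a in product(range(n_robots), repeat=len(chunk_times)))

# Decentralized rule: the robot that would finish earliest takes the next chunk.
loads = [0] * n_robots
for t in chunk_times:
    loads[loads.index(min(loads))] += t
decentral = max(loads)

print(f"centralized makespan: {central_best}, decentralized makespan: {decentral}")
```

The exhaustive centralized search scales exponentially with the number of chunks, which is consistent with the much faster runtime growth reported for the centralized approach in Case II.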

2021 ◽  
pp. 1-29
Author(s):  
Laxmi Poudel ◽  
Wenchao Zhou ◽  
Zhenghui Sha

Abstract Cooperative 3D printing (C3DP) – a representative realization of cooperative manufacturing – is a novel approach that utilizes multiple mobile 3D printing robots for additive manufacturing. Because the robots print in parallel, it yields a much shorter makespan than traditional 3D printing. In C3DP, collision-free scheduling is critical to realizing cooperation and parallel operation among mobile printers. The extant literature lacks methods to schedule multi-robot C3DP with limited resources. This study addresses that gap with two methods. The first, a dynamic dependency list algorithm (DDLA), uses constraint satisfaction to eliminate solutions that could result in collisions between robots or between robots and already-printed material. The second, a modified genetic algorithm (GA), uses chromosomes to represent chunk assignments and applies GA operators such as crossover and mutation to generate diverse print schedules while maintaining the dependencies between chunks. Three case studies, two large rectangular bars at different scales and a foldable SUV, are used to demonstrate the effectiveness and performance of the two methods. The results show that both methods can effectively generate valid print schedules with a specified number of robots while attempting to minimize the makespan. Both methods produce print schedules with equal print times in the first two case studies, where the chunks are homogeneous. In contrast, the modified GA outperforms the DDLA in the third case study, where the chunks are heterogeneous in volume and require different amounts of time to print.
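The chromosome idea in this abstract can be illustrated with a small, hypothetical sketch: a chromosome is an ordering of chunks, an ordering is valid only if each chunk follows all chunks it depends on, and a mutation is kept only when that validity is preserved. The dependencies, print times, and swap mutation below are invented for illustration and are not the paper's DDLA or GA implementation.

```python
# Hypothetical sketch of a GA chromosome for chunk scheduling: an ordering of
# chunks is valid only if every chunk appears after all chunks it depends on.
# Dependencies, print times, and the swap mutation are illustrative assumptions.
import random

print_time = {0: 5, 1: 5, 2: 8, 3: 3}          # assumed chunk print times
depends_on = {2: {0, 1}, 3: {2}}               # chunk 2 needs 0 and 1; chunk 3 needs 2

def is_valid(order):
    seen = set()
    for chunk in order:
        if not depends_on.get(chunk, set()) <= seen:
            return False                        # a dependency is printed too late
        seen.add(chunk)
    return True

def mutate(order):
    # Swap two positions; keep the child only if dependencies still hold.
    i, j = random.sample(range(len(order)), 2)
    child = list(order)
    child[i], child[j] = child[j], child[i]
    return child if is_valid(child) else list(order)

parent = [0, 1, 2, 3]
print(is_valid(parent), mutate(parent))
```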


2015 ◽  
Vol 43 (3) ◽  
pp. 7-14 ◽  
Author(s):  
Jim Moffatt

Purpose – This case example looks at how Deloitte Consulting applies the Three Rules synthesized by Michael Raynor and Mumtaz Ahmed from their large-scale research project identifying patterns in the way exceptional companies think. Design/methodology/approach – The Three Rules concept is a key piece of Deloitte Consulting’s thought leadership program. So how are the Three Rules helping the organization perform? Now that research has shown how exceptional companies think, CEO Jim Moffatt could address the question, “Does Deloitte think like an exceptional company?” Findings – Deloitte has had success with an approach that favors non-price value over price and revenue over costs. Practical implications – It is critical that all decision makers in an organization understand how decisions consistent with the Three Rules have contributed to past success, as well as how they can apply the rules to the difficult challenges they face today. Originality/value – This is the first case study written from a CEO’s perspective that looks at how the Three Rules approach of Michael Raynor and Mumtaz Ahmed can foster a firm’s growth and exceptional performance.


2017 ◽  
Vol 22 (6) ◽  
pp. 486-505 ◽  
Author(s):  
Benjamin Tukamuhabwa ◽  
Mark Stevenson ◽  
Jerry Busby

Purpose In the few prior empirical studies on supply chain resilience (SCRES), the focus has been on the developed world. Yet organisations in developing countries constitute a significant part of global supply chains and have also experienced the disastrous effects of supply chain failures. The purpose of this paper is therefore to empirically investigate SCRES in a developing country context and to show that this also provides theoretical insights into what is meant by resilience. Design/methodology/approach Using a case study approach, a supply network of 20 manufacturing firms in Uganda is analysed based on a total of 45 interviews. Findings The perceived threats to SCRES in this context are mainly small-scale, chronic disruptive events rather than the discrete, large-scale catastrophic events typically emphasised in the literature. The data reveal how threats of disruption, resilience strategies and outcomes are inter-related in complex, coupled and non-linear ways. These interrelationships are explained by the political, cultural and territorial embeddedness of the supply network in a developing country. Further, this embeddedness contributes to the phenomenon of supply chain risk migration, whereby an attempt to mitigate one threat produces another threat and/or shifts the threat to another point in the supply network. Practical implications Managers should be aware, for example, of potential risk migration from one threat to another when crafting strategies to build SCRES. Equally, the potential for risk migration across the supply network means managers should look at the supply chain holistically because actors along the chain are so interconnected. Originality/value The paper goes beyond the extant literature by highlighting how SCRES is not only about responding to specific, isolated threats but about the continuous management of risk migration. It demonstrates that resilience requires both an understanding of the interconnectedness of threats, strategies and outcomes and an understanding of the embeddedness of the supply network. Finally, this study’s focus on a developing country reveals that resilience should be equally concerned with smaller-scale, chronic disruptions and with occasional, large-scale catastrophic events.


Author(s):  
Ilda Vagge ◽  
Gioia Maddalena Gibelli ◽  
Alessio Gosetti Poli ◽  
...  

Aware that climate change alters the landscape, the authors investigate how these changes are unfolding within the metropolitan area of Tehran. Adopting a holistic method that embraces different disciplines and reasoning from the large scale to the small, they examine the main problems related to water scarcity and the loss of green spaces. They then identify the ecosystem services that are present and those that are missing, so that these can inform subsequent design choices as effectively as possible. From this analysis, the authors develop a masterplan intended to secure a specific natural capital and the provision of ecosystem services while suggesting good water-management practices. They argue that ecological accounting must be added to economic accounting, giving due weight to the natural system and the ecosystem services that derive from it.


Author(s):  
Anjan Pakhira ◽  
Peter Andras

Testing is a critical phase in the software life-cycle. While small-scale, component-wise testing is done routinely as part of the development and maintenance of large-scale software, system-level testing of the whole software is much more problematic, owing to the low coverage of potential usage scenarios by test cases and the high costs associated with wide-scale testing of large software. Here, the authors investigate the use of cloud computing to facilitate the testing of large-scale software. They discuss the aspects of cloud-based testing and provide an example application: testing the functional importance of methods of classes in the Google Chrome software. The methods under test are predicted to be functionally important with respect to a functionality of the software. The authors use network analysis applied to dynamic analysis data generated by the software to make these predictions, and they check the validity of the predictions by mutation testing of a large number of mutated variants of Google Chrome. The chapter provides details of how to set up the testing process on the cloud and discusses relevant technical issues.
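As a rough illustration of the network-analysis step, the sketch below builds a toy call graph (standing in for the dynamic-analysis data) with networkx and ranks methods by betweenness centrality as candidates for functional importance. The method names and edges are made up, and the chapter's actual prediction procedure may differ.

```python
# Hypothetical sketch: rank methods of a call graph (built from dynamic-analysis
# traces) by centrality and treat the top-ranked methods as candidates for
# functional importance. The method names and call edges below are invented.
import networkx as nx

calls = [("Browser.start", "Tab.open"), ("Tab.open", "Renderer.paint"),
         ("Tab.open", "Network.fetch"), ("Network.fetch", "Cache.lookup"),
         ("Renderer.paint", "Cache.lookup")]

g = nx.DiGraph(calls)
ranking = sorted(nx.betweenness_centrality(g).items(),
                 key=lambda kv: kv[1], reverse=True)
for method, score in ranking[:3]:
    print(f"{method}: {score:.3f}")   # candidates to target with mutation testing
```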


Author(s):  
Hans-Jörg Schmid

This chapter discusses how the Entrenchment-and-Conventionalization Model explains language change. First, it is emphasized that not only innovation and variation, but also the frequency of repetition can serve as important triggers of change. Conventionalization and entrenchment processes can interact and be influenced by numerous forces in many ways, resulting in various small-scale processes of language change, which can stop, change direction, or even become reversed. This insight serves as a basis for the systematic description of nine basic modules of change which differ in the ways in which they are triggered and controlled by processes and forces. Large-scale pathways of change such as grammaticalization, lexicalization, pragmaticalization, context-induced change, or colloquialization and standardization are all explained by reference to these modules. The system is applied in a case study on the history of do-periphrasis.


Geophysics ◽  
2008 ◽  
Vol 73 (4) ◽  
pp. A23-A26 ◽  
Author(s):  
Gilles Hennenfent ◽  
Ewout van den Berg ◽  
Michael P. Friedlander ◽  
Felix J. Herrmann

Geophysical inverse problems typically involve a trade-off between data misfit and some prior model. Pareto curves trace the optimal trade-off between these two competing aims. These curves are used commonly in problems with two-norm priors in which they are plotted on a log-log scale and are known as L-curves. For other priors, such as the sparsity-promoting one-norm prior, Pareto curves remain relatively unexplored. We show how these curves lead to new insights into one-norm regularization. First, we confirm theoretical properties of smoothness and convexity of these curves from a stylized and a geophysical example. Second, we exploit these crucial properties to approximate the Pareto curve for a large-scale problem. Third, we show how Pareto curves provide an objective criterion to gauge how different one-norm solvers advance toward the solution.
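For readers unfamiliar with the one-norm setting, a standard way to state the trade-off is as a constrained least-squares problem; the notation below is conventional background rather than taken from the paper:

\[
\phi(\tau) \;=\; \min_{x}\ \lVert Ax - b\rVert_2 \quad \text{subject to} \quad \lVert x\rVert_1 \le \tau,
\]

where \(A\) is the modeling operator, \(b\) the observed data, and \(\tau\) the one-norm budget. The Pareto curve is the graph of \(\phi(\tau)\); its convexity and smoothness are the properties the abstract confirms and then exploits to approximate the curve cheaply for a large-scale problem.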


1943 ◽  
Vol 37 (1) ◽  
pp. 30-45 ◽  
Author(s):  
Robert R. Wilson

A large-scale problem for the principal belligerents in the present war is that of the treatment of civilians of enemy nationality in their respective jurisdictions. Measured in terms of the number of human beings involved, national safety considerations, and the possibly unfortunate effect at home of mishandling it, the problem assumes far-reaching importance. There is need for clear law as well as positive action. There is need for perspective. In relation to international law, the distinctiveness of the classification of “civilian alien enemy,” past effort looking to the construction of internationally binding rules prescribing treatment, and practice in the current war, merit attention.


2018 ◽  
Vol 84 (2) ◽  
Author(s):  
E. G. Highcock ◽  
N. R. Mandell ◽  
M. Barnes ◽  
W. Dorland

The confinement of heat in the core of a magnetic fusion reactor is optimised using a multidimensional optimisation algorithm. For the first time in such a study, the loss of heat due to turbulence is modelled at every stage using first-principles nonlinear simulations which accurately capture the turbulent cascade and large-scale zonal flows. The simulations utilise a novel approach, with gyrofluid treatment of the small-scale drift waves and gyrokinetic treatment of the large-scale zonal flows. A simple near-circular equilibrium with standard parameters is chosen as the initial condition. The figure of merit, fusion power per unit volume, is calculated, and then two control parameters, the elongation and triangularity of the outer flux surface, are varied, with the algorithm seeking to optimise the chosen figure of merit. A twofold increase in the plasma power per unit volume is achieved by moving to higher elongation and strongly negative triangularity.
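The outer loop of such an optimisation can be sketched in a few lines: treat elongation and triangularity as the two control parameters and hand a figure-of-merit function to a derivative-free optimiser. In the sketch below the surrogate figure of merit is an invented toy function standing in for the first-principles turbulence simulations, and the optimiser choice (Nelder-Mead via scipy) is an assumption, not the algorithm used in the paper.

```python
# Hypothetical sketch of the outer optimisation loop: vary elongation (kappa) and
# triangularity (delta) to maximise fusion power per unit volume. The surrogate
# figure_of_merit is an invented toy function, not the physics model in the paper.
from scipy.optimize import minimize

def figure_of_merit(params):
    kappa, delta = params
    # Toy surrogate with a peak at higher elongation and negative triangularity.
    return -(kappa - 2.0) ** 2 - (delta + 0.4) ** 2

result = minimize(lambda p: -figure_of_merit(p),   # minimise the negative
                  x0=[1.0, 0.0],                   # near-circular starting equilibrium
                  method="Nelder-Mead")
print("optimal elongation, triangularity:", result.x)
```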

