Heuristic Methods for Minimum-Cost Pipeline Network Design – a Node Valency Transfer Metaheuristic

Author(s):  
Christopher Yeates ◽  
Cornelia Schmidt-Hattenberger ◽  
Wolfgang Weinzierl ◽  
David Bruhn

Abstract Designing low-cost network layouts is an essential step in planning linked infrastructure. For the case of capacitated trees, such as oil or gas pipeline networks, the cost is usually a function of both pipeline diameter (i.e. ability to carry flow, or transferred capacity) and pipeline length. Even for the case of incompressible, steady flow, minimizing cost becomes particularly difficult as network topology itself dictates local flow material balances, rendering the optimization space non-linear. The combinatorial nature of potential trees requires the use of graph optimization heuristics to achieve good solutions in reasonable time. In this work we compare known network optimization heuristics and metaheuristics from the literature for finding minimum-cost capacitated trees without Steiner nodes, and propose novel algorithms, including a metaheuristic based on transferring edges of high-valency nodes. Our metaheuristic outperforms the similar algorithms studied, especially for larger graphs, usually producing a significantly higher proportion of optimal solutions, while remaining in line with the time complexity of algorithms found in the literature. Data points for graph node positions and capacities are first randomly generated, and secondly obtained from the German emissions trading CO2 source registry. As political will grows to find applications and storage for CO2 emissions from hard-to-abate industries, efficient network design methods become relevant for new large-scale CO2 pipeline networks.
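The non-linearity described in the abstract comes from the coupling between topology and flow: each edge must carry the combined capacity of every source node routed through it toward the sink, so edge costs cannot be fixed before the tree is chosen. A minimal sketch of this cost structure in Python, with hypothetical node data and an illustrative concave cost exponent (neither is taken from the paper):

```python
import math

# Illustrative node data: position (x, y) and source capacity.
# A single sink at node 0 collects all flow (hypothetical layout).
nodes = {
    0: ((0.0, 0.0), 0.0),   # sink
    1: ((1.0, 0.0), 5.0),
    2: ((1.0, 1.0), 3.0),
    3: ((2.0, 1.0), 2.0),
}
# Candidate tree as child -> parent (edges point toward the sink).
parent = {1: 0, 2: 1, 3: 2}

def depth(tree, n):
    """Number of hops from node n to the root of the tree."""
    d = 0
    while n in tree:
        n = tree[n]
        d += 1
    return d

def edge_flow(tree, caps):
    """Flow on edge (child, parent) = total capacity routed through it."""
    flow = {child: caps[child][1] for child in tree}
    # Accumulate capacities from the leaves toward the root.
    for child in sorted(tree, key=lambda c: -depth(tree, c)):
        p = tree[child]
        if p in tree:
            flow[p] += flow[child]
    return flow

def tree_cost(tree, caps, exponent=0.6):
    """Cost = sum over edges of length * flow**exponent (illustrative form)."""
    flow = edge_flow(tree, caps)
    total = 0.0
    for child, p in tree.items():
        (x1, y1), _ = caps[child]
        (x2, y2), _ = caps[p]
        length = math.hypot(x1 - x2, y1 - y2)
        total += length * flow[child] ** exponent
    return total

print(round(tree_cost(parent, nodes), 3))  # → 8.123
```

Because flows aggregate toward the sink, moving a single edge changes the flow, and hence the cost, of every edge along the affected paths, which is what makes local-search moves such as edge transfers at high-valency nodes attractive.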

2020 ◽  
Author(s):  
Christopher Yeates ◽  
Cornelia Schmidt-Hattenberger ◽  
Wolfgang Weinzierl ◽  
David Bruhn

Designing low-cost networks is an essential step in planning linked infrastructure. For the case of capacitated trees, such as oil or gas pipeline networks, the cost is usually a function of both pipeline thickness (i.e. capacity) and pipeline length. Minimizing cost becomes particularly difficult as network topology itself dictates local flow material balances, rendering the optimization space non-linear. The combinatorial nature of potential trees requires the use of graph optimization heuristics to achieve good solutions in reasonable time. In this work we compare known network optimization heuristics and metaheuristics from the literature, and propose novel algorithms, including a metaheuristic based on transferring edges of high-valency nodes. Our metaheuristic outperforms the similar algorithms studied, especially for larger graphs, usually producing a significantly higher proportion of optimal solutions, while remaining in line with the time complexity of algorithms found in the literature. Data points for graph node positions and capacities are first randomly generated, and secondly obtained from the German emissions trading CO2 source registry. Driven by the increasing necessity to find applications and storage for industrial CO2 emissions, finding minimum-cost networks strengthens the business case for large-scale CO2 transportation pipeline infrastructure.


2018 ◽  
Vol 2018 ◽  
pp. 1-23 ◽  
Author(s):  
Hao Chen ◽  
Shu Yang ◽  
Jun Li ◽  
Ning Jing

With the development of aerospace science and technology, Earth Observation Satellite clusters, which consist of heterogeneous satellites carrying many kinds of payloads, have gradually appeared. Compared with traditional satellite systems, a satellite cluster has particular characteristics, such as large scale, heterogeneous satellite platforms, various payloads, and the capacity to perform all observation tasks. Selecting a subset of a satellite cluster that can perform all observation tasks effectively at low cost is a new challenge arising in the field of aerospace resource scheduling: the agent team formation problem for an observation task-oriented satellite cluster. A mathematical scheduling model is built, and three novel algorithms, i.e., a complete search algorithm, a heuristic search algorithm, and a swarm intelligence optimization algorithm, are proposed to solve the problem at different scales. Finally, experiments are conducted to validate the effectiveness and practicability of the algorithms.
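The abstract does not specify the heuristic search, but the subset-selection problem it describes is closely related to weighted set cover, for which a greedy best-ratio rule is the standard baseline. A hedged sketch with hypothetical satellites, task sets, and costs (none of these names or numbers are from the paper):

```python
# Hypothetical data: each satellite covers a set of observation tasks
# at some operating cost.
satellites = {
    "sat_a": ({1, 2, 3}, 4.0),
    "sat_b": ({3, 4}, 2.0),
    "sat_c": ({2, 4, 5}, 3.0),
    "sat_d": ({5}, 1.0),
}
tasks = {1, 2, 3, 4, 5}

def greedy_team(sats, tasks):
    """Pick satellites by best (newly covered tasks) / cost ratio until
    every task is covered. A classic set-cover heuristic, not the
    paper's exact algorithm."""
    uncovered = set(tasks)
    team = []
    while uncovered:
        best = max(
            (s for s in sats if s not in team),
            key=lambda s: len(sats[s][0] & uncovered) / sats[s][1],
        )
        if not sats[best][0] & uncovered:
            raise ValueError("tasks cannot all be covered")
        team.append(best)
        uncovered -= sats[best][0]
    return team

print(greedy_team(satellites, tasks))
```

The greedy rule gives a logarithmic approximation guarantee for set cover, which is why it is a common low-cost alternative when complete search becomes infeasible at larger scales.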


2006 ◽  
Vol 2006 ◽  
pp. 1-12
Author(s):  
A. Korobeinikov ◽  
P. Read ◽  
A. Parshotam ◽  
J. Lermit

It has been suggested that the large-scale use of biofuel, that is, fuel derived from biological materials, especially in combination with reforestation of large areas, can lead to a low-cost reduction of atmospheric carbon dioxide levels. In this paper, a model of three markets (fuel, wood products, and land) is considered with the aim of evaluating the impact of large-scale biofuel production and forestry on these markets, and of estimating the cost of a policy aimed at reducing carbon dioxide in the atmosphere. It is shown that the costs are lower than previously expected.


2020 ◽  
Vol 142 (4) ◽  
Author(s):  
Abdelhamid Mraoui ◽  
Abdallah Khellaf

Abstract In this work, the design of a hydrogen production system was optimized for Algiers in Algeria. The system produces hydrogen by electrolysis using a photovoltaic (PV) generator as a source of electricity. All the elements of the system have been modeled to take into account practical constraints. The cost of producing hydrogen has been minimized by varying the total power of the photovoltaic generator. An optimal ratio between the peak power of the PV array and the nominal power of the electrolyzer was determined. Photovoltaic module technology has been varied using a large database of electrical characteristics. It was noted that PV technology does not have a very significant impact on cost. The minimum cost is around $0.44/Nm3, and the power ratio in this case is 1.45. This results in a cost reduction of around 12% compared to a unit ratio. The power ratio and cost are linearly dependent. Only a small number of technologies give a relatively low cost of about $0.35/Nm3. These generators are interesting; however, we assumed an initial cost of $2.00/Wp for all technologies. In addition, it was noted that it is possible to increase hydrogen production by 10% by increasing the power of the photovoltaic generator, the extra cost in this case being only 0.1%.
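The reported figures are internally consistent: a cost reduction of about 12% at the optimal power ratio, relative to a unit ratio, implies a unit-ratio cost near $0.50/Nm3. A quick arithmetic check:

```python
# Reported optimum: hydrogen cost at a power ratio of 1.45 (PV peak power
# over electrolyzer nominal power), and the stated ~12% reduction
# relative to a unit power ratio.
optimal_cost = 0.44   # $/Nm3 at ratio 1.45
reduction = 0.12      # ~12% vs. ratio = 1.0

# Implied cost at a power ratio of 1.0
unit_ratio_cost = optimal_cost / (1 - reduction)
print(round(unit_ratio_cost, 2))  # → 0.5
```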


Author(s):  
Mohan Rao T. ◽  
K. Rajesh Kumar ◽  
G. Shyamala ◽  
R. Gobinath

With the growth of urbanization and industrialization, water bodies are getting polluted. Among various pollutants, phenol-based compounds are common water contaminants originating from wastewater discharged by processing and manufacturing industries such as petrochemical refineries, ceramic plants, textile processing, leather processing, and synthetic rubber production. These pollutants are toxic and have long-term ill effects on both humans and aquatic animals. Adsorption is a well-proven technique widely used for the removal of pollutants from aqueous environments, but the process is hindered by the cost of adsorbents, especially for large-scale continuous processes. In this regard, adsorbents derived from waste biomass can be a great asset in reducing the cost of wastewater treatment. To meet this objective, coconut shell, a biomass abundantly available in Southeast Asia, was chosen, converted into activated carbon, and used to remove phenol from wastewater. Batch adsorption experiments were performed with different initial concentrations, carbon dosages, pH values, and contact times. A lower initial feed (phenol) concentration of 50 mg/L resulted in around 90% phenol removal, whereas at higher concentrations removal reached only 64%. The experimental results are in good agreement with the Langmuir adsorption isotherm model, which showed a good fit to the data. These studies confirm that coconut shell-based activated carbon can be used to effectively adsorb phenol from aqueous solutions.
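A Langmuir fit of the kind mentioned above is commonly done via the linearized form Ce/qe = Ce/qmax + 1/(K·qmax), so that a straight-line regression of Ce/qe against Ce yields the monolayer capacity qmax and the Langmuir constant K. A sketch with hypothetical equilibrium data (generated to follow a Langmuir isotherm; these are not the paper's measurements):

```python
# Hypothetical equilibrium data: Ce in mg/L, qe in mg/g.
Ce = [10.0, 25.0, 50.0, 100.0, 200.0]
qe = [16.0, 30.8, 44.4, 57.1, 66.7]

# Linearized Langmuir: Ce/qe = Ce/q_max + 1/(K * q_max)
x = Ce
y = [c / q for c, q in zip(Ce, qe)]

# Ordinary least-squares line through (x, y).
n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
slope = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
        sum((xi - mean_x) ** 2 for xi in x)
intercept = mean_y - slope * mean_x

q_max = 1 / slope         # monolayer adsorption capacity (mg/g)
K = slope / intercept     # Langmuir constant (L/mg)
print(round(q_max, 1), round(K, 3))  # → 80.0 0.025
```

The same regression residuals can be compared against a linearized Freundlich fit to justify the statement that the Langmuir model fits the data better.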


2002 ◽  
Vol 17 (10) ◽  
pp. 2484-2488 ◽  
Author(s):  
Travis L. Brown ◽  
Srinivasan Swaminathan ◽  
Srinivasan Chandrasekar ◽  
W. Dale Compton ◽  
Alexander H. King ◽  
...  

In spite of their interesting properties, nanostructured materials have found limited uses because of the cost of preparation and the limited range of materials that can be synthesized. It has been shown that most of these limitations can be overcome by subjecting a material to large-scale deformation, as occurs during common machining operations. The chips produced during lathe machining of a variety of pure metals, steels, and other alloys are shown to be nanostructured with grain (crystal) sizes between 100 and 800 nm. The hardness of the chips is found to be significantly greater than that of the bulk material.


Author(s):  
P. K. KAPUR ◽  
ANU. G. AGGARWAL ◽  
KANICA KAPOOR ◽  
GURJEET KAUR

The demand for complex and large-scale software systems is increasing rapidly. The development of high-quality, reliable, and low-cost computer software has therefore become a critical issue in the enormous worldwide computer technology market. Such large and complex software is developed by integrating small, independent modules, which are tested independently during the module testing phase of software development. In the process, testing resources such as time and testing personnel are consumed, and these resources are not infinitely large. Consequently, it is important for the project manager to allocate these limited resources among the modules optimally during the testing process. Another major concern in software development is cost: management profits when the cost of the software is kept low while customer requirements are still met. In this paper, we investigate an optimal resource allocation problem of minimizing the cost of software testing under a limited amount of available resources, given a reliability constraint. To solve the optimization problem we present a genetic algorithm, a powerful tool for solving search and optimization problems. The key advantage of using a genetic algorithm in the field of software reliability is its capability to produce optimal results by learning from historical data. A numerical example is discussed to illustrate the applicability of the approach.
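A minimal sketch of such a genetic algorithm, using a simplified exponential-SRGM-style reliability model per module and penalty terms for the resource and reliability constraints; all module parameters are hypothetical and the encoding is illustrative, not the paper's formulation:

```python
import math
import random

random.seed(1)

# Hypothetical module parameters (not from the paper): b_i is the fault
# detection rate of module i, c_i the cost per unit of testing resource.
b = [0.05, 0.03, 0.08]
c = [1.0, 1.5, 0.8]
TOTAL = 200.0   # total available testing resources
R_MIN = 0.90    # required reliability level per module (illustrative)

def reliability(i, w):
    """Exponential-SRGM-style fault removal fraction for resource w."""
    return 1.0 - math.exp(-b[i] * w)

def fitness(alloc):
    """Testing cost plus heavy penalties for constraint violations."""
    cost = sum(ci * wi for ci, wi in zip(c, alloc))
    shortfall = sum(max(0.0, R_MIN - reliability(i, w))
                    for i, w in enumerate(alloc))
    over = max(0.0, sum(alloc) - TOTAL)
    return cost + 1e4 * (shortfall + over)

def ga(pop_size=40, gens=300):
    pop = [[random.uniform(0, TOTAL / len(b)) for _ in b]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(p1, p2)]  # uniform crossover
            j = random.randrange(len(child))
            child[j] = max(0.0, child[j] + random.gauss(0, 5))     # mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
print([round(w, 1) for w in best])
```

The penalty weights here simply make constraint violations dominate the cost term; the paper's actual fitness formulation and SRGM parameters would replace these placeholders.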


Author(s):  
G. G. Nalbandyan ◽  
S. S. Zholnerchik

The reduction in the cost of distributed generation technologies is driving increasing decentralization of power generation and large-scale development of distributed sources around the world. This trend represents a key change both in the characteristics of electricity consumption, which is becoming increasingly flexible and mobile, and in the patterns of consumer behavior in the electricity market: electricity consumers are simultaneously becoming suppliers, which requires a revision of the traditional regulatory standards of the electricity market. The purpose of the article is to assess the influence of distributed generation on the economy of both individual enterprises and the country as a whole. To identify the effects of introducing distributed generation technologies, the case study method is used. The empirical analysis was carried out on twelve Russian companies that use their own energy sources, belonging to the following industries: industrial production, housing and communal services, retail trade, construction, and agriculture. Both technological and economic effects are revealed. The technological effects include improved consumer reliability, energy security, involvement of local energy resources, optimized load management and redundancy, flexibility of smart grids (in terms of generation), and a reduced environmental load, including CO2 emissions. The economic effects include optimization of the load schedule, reduced losses in energy transmission and distribution, expansion of cogeneration, provision of electricity of a given quality to the consumer, savings on network losses, and a lower cost of energy.
The identified effects make it possible to highlight the advantages of distributed generation facilities: high efficiency and the possibility of cogeneration and trigeneration, individual maneuvering of capacity loading, high equipment reliability, low electricity transportation cost, and fuel use of by-products and main production waste. In conclusion, recommendations are formulated on a set of measures for the development of industrial distributed generation in Russia at the federal level.


2020 ◽  
Vol 2020 (1) ◽  
pp. 374-1-374-11
Author(s):  
Thanawut Ananpiriyakul ◽  
Joshua Anghel ◽  
Kristi Potter ◽  
Alark Joshi

Computational complexity is a limiting factor for visualizing large-scale scientific data. Most approaches to rendering large datasets focus on novel algorithms that leverage cutting-edge graphics hardware to provide users with an interactive experience. In this paper, we instead demonstrate foveated imaging, which allows interactive exploration on low-cost hardware by tracking the gaze of a participant to drive the rendering quality of an image. Foveated imaging exploits the fact that the spatial resolution of the human visual system decreases dramatically away from the central point of gaze, allowing computational resources to be reserved for areas of importance. We demonstrate this approach using face tracking to identify the participant's gaze point for both vector and volumetric datasets and evaluate our results against traditional techniques. In our evaluation, we found a significant increase in computational performance using the foveated imaging approach while maintaining high image quality in regions of visual attention.
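The core of foveated imaging is a quality (or sampling-rate) map that decays with eccentricity from the gaze point. A toy sketch with illustrative constants (the paper's actual falloff model is not given here):

```python
import math

def quality(px, py, gaze, full_radius=100.0, falloff=0.01):
    """Rendering quality in [0, 1] for a pixel, given the gaze point.

    Full quality inside `full_radius` pixels of the gaze, then an
    exponential falloff mimicking the eye's declining acuity with
    eccentricity. All constants are illustrative, not from the paper.
    """
    dist = math.hypot(px - gaze[0], py - gaze[1])
    if dist <= full_radius:
        return 1.0
    return math.exp(-falloff * (dist - full_radius))

gaze = (400, 300)
print(quality(410, 300, gaze))            # near the gaze: full quality
print(round(quality(700, 300, gaze), 3))  # periphery: reduced quality
```

A renderer would quantize this map into a few discrete levels (e.g. full, half, and quarter resolution tiles) so that peripheral regions are sampled more cheaply while the foveal region keeps full detail.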


2016 ◽  
Vol 2016 (S2) ◽  
pp. S1-S52 ◽  
Author(s):  
Ennis Ogawa ◽  
Aimin Xing ◽  
David F.-S. Liao ◽  
Ten V. Y. Ten ◽  
Chong Wei Neo ◽  
...  

Fanout Wafer Level Packaging (FoWLP) is a very attractive solution for microelectronics applications requiring optimized performance, smaller form factor, and low cost. With such an approach, where multiple chips are integrated into a system on a single package frame, ensuring much higher levels of process integrity, quality, and reliability becomes absolutely critical, especially if the total product volume lies in the range of tens of millions of units. A single defect type may negate the benefits of the approach, because losing one FoWLP unit means losing multiple devices. Thus, optimizing yield, quality, and reliability with such a package solution is critical for successful large-scale manufacturing. In this talk, the impact of defectivity on the quality and reliability of Wafer-Level (WL) devices is discussed with regard to Die Edge Delamination (DED) and Chip Mechanical Integrity (CMI). Through this discussion and the resulting solutions found to improve WL quality and reliability, a better understanding of how to assess the quality and reliability of a given FoWLP solution for large-scale production will be demonstrated.

