Cost-effective conservation: calculating biodiversity and logging trade-offs in Southeast Asia

2011 ◽  
Vol 4 (6) ◽  
pp. 443-450 ◽  
Author(s):  
Brendan Fisher ◽  
David P. Edwards ◽  
Trond H. Larsen ◽  
Felicity A. Ansell ◽  
Wayne W. Hsu ◽  
...  
2015 ◽  
Vol 2 (7) ◽  
pp. 140521 ◽  
Author(s):  
L. R. Little ◽  
R. Q. Grafton

Conservation management agencies face acute trade-offs when dealing with disturbance from human activities. We show how agencies can respond to permanent ecosystem disruption by managing for Pimm resilience within a conservation budget, using a model calibrated to a metapopulation of a coral reef fish species at Ningaloo Reef, Western Australia. The application is of general interest because it provides a method for managing species susceptible to negative environmental disturbances by optimizing the balance between the number and quality of migration connections in a spatially distributed metapopulation. Given ecological equivalency between the number and quality of migration connections in terms of time to recover from disturbance, our approach allows conservation managers to promote ecological function, under budgetary constraints, by offsetting permanent damage to one ecological function with investment in another.
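The underlying trade-off, recovery time as a function of the number versus the per-link strength of migration connections, can be sketched numerically. Below is a minimal Python illustration under stated assumptions: a ring metapopulation with logistic regrowth, where the dynamics, parameter values and `link_matrix` helper are hypothetical stand-ins, not the authors' calibrated Ningaloo model.

```python
# Hedged sketch: illustrative only, not the paper's calibrated model.
import numpy as np

def recovery_time(growth, migration, shocked, steps=500, tol=0.05):
    """Simulate a metapopulation after a disturbance and return the number
    of steps until all patches are back within `tol` of equilibrium
    (Pimm resilience is inversely related to this recovery time)."""
    n = migration.shape[0]
    x = np.ones(n)                  # equilibrium abundances (normalised to 1)
    x[shocked] = 0.2                # disturbance knocks some patches down 80%
    for t in range(steps):
        # local logistic regrowth toward 1, plus migration in minus out
        x = x + growth * x * (1 - x) + migration @ x - migration.sum(0) * x
        if np.all(np.abs(x - 1) < tol):
            return t
    return steps

def link_matrix(n, links, strength):
    """Ring metapopulation with `links` connections per patch, each of a
    given per-link migration strength."""
    m = np.zeros((n, n))
    for i in range(n):
        for k in range(1, links + 1):
            m[i, (i + k) % n] = m[(i + k) % n, i] = strength
    return m

# Equal budget: (number of links) x (per-link strength) held constant,
# mimicking the trade-off between number and quality of connections.
budget = 0.08
for links in (1, 2, 4):
    m = link_matrix(20, links, budget / links)
    print(links, "links/patch ->", recovery_time(0.3, m, shocked=[0, 1, 2]), "steps")
```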


2009 ◽  
Vol 10 (04) ◽  
pp. 435-457
Author(s):  
ATHANASIOS KINALIS ◽  
SOTIRIS NIKOLETSEAS

Motivated by emerging applications, we consider sensor networks where the sensors themselves (not just the sinks) are mobile. Furthermore, we focus on mobility scenarios characterized by heterogeneous, highly changing mobility roles in the network. To capture these high dynamics of diverse sensory motion we propose a novel network parameter, the mobility level, which, although simple and local, quite accurately captures both the spatial and speed characteristics of motion. We then propose adaptive data dissemination protocols that use the mobility-level estimate to optimize performance, essentially exploiting high mobility (redundant message ferrying) as a cost-effective replacement for flooding: sensors dynamically propagate less data in the presence of high mobility, while highly mobile nodes are favored for moving data around. These dissemination schemes are enhanced by a distance-sensitive probabilistic message-flooding inhibition mechanism that further reduces communication cost, especially for fast nodes of high mobility level and as the distance to the data's destination decreases. Our simulation findings demonstrate significant performance gains of our protocols compared to non-adaptive protocols: adaptation increases the success rate and reduces latency (by up to 15%) while significantly reducing energy dissipation (in most cases by up to 40%). Our adaptive schemes also achieve a significantly higher message delivery ratio and satisfactory energy-latency trade-offs compared to flooding when sensor nodes have limited message queues.
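As a rough illustration of the idea, the sketch below computes a local mobility estimate from a node's recent position history and derives a forwarding probability that drops both for highly mobile nodes and as a message nears its destination. The formulas are assumptions for illustration, not the protocol definitions from the paper.

```python
# Hedged sketch: a toy rendition of the mobility-level idea, not the
# authors' protocol. Both formulas below are invented for illustration.
import math

def mobility_level(positions, dt):
    """Local mobility estimate from a node's recent position history:
    combines average speed with net spatial displacement."""
    if len(positions) < 2:
        return 0.0
    path = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    speed = path / (dt * (len(positions) - 1))       # average speed
    spread = math.dist(positions[0], positions[-1])  # net displacement
    return speed * (1 + spread / (path + 1e-9))      # speed + spatial mix

def forward_probability(mob, dist_to_sink, max_dist, p_base=0.9):
    """Distance-sensitive probabilistic flooding inhibition: highly mobile
    nodes forward fewer copies (they ferry data instead), and inhibition
    grows as the message nears its destination."""
    p = p_base / (1 + mob)            # high mobility -> less flooding
    p *= dist_to_sink / max_dist      # closer to sink -> inhibit more
    return max(0.05, min(1.0, p))

# A fast node far from the sink vs. a slow node near the sink:
track = [(0, 0), (5, 1), (11, 3), (18, 6)]
print(forward_probability(mobility_level(track, dt=1.0), 80, 100))
print(forward_probability(mobility_level([(0, 0), (0.5, 0)], 1.0), 10, 100))
```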


Author(s):  
Cesar A. Cortes-Quiroz ◽  
Alireza Azarbadegan ◽  
Emadaldin Moeendarbary ◽  
Mehrdad Zangeneh

Numerical simulations and an optimization method are used to study the design of a planar T-micromixer with curved baffles in the mixing channel. The mixing efficiency and the pressure loss in the mixing channel have been evaluated for Reynolds numbers (Re) in the range 1 to 250. A mixing index (Mi) has been defined to quantify the mixing efficiency. Three geometric dimensions (baffle radius, baffle pitch and channel height) are taken as design parameters, whereas the mixing index at the outlet section and the pressure loss in the mixing channel are the performance parameters used to optimize the micromixer geometry. To investigate the effect of design and operating parameters on device performance, a systematic design and optimization methodology is applied that combines computational fluid dynamics (CFD) with an optimization strategy integrating design of experiments (DOE), surrogate modeling (SM) and multi-objective genetic algorithm (MOGA) techniques. The Pareto front of designs with the optimum trade-offs between mixing index and pressure loss is obtained for different values of Re. The micromixer can enhance mixing through diffusion (lower Re) and convection (higher Re), achieving mixing index values over 90%, in particular for Re on the order of 100, which was found to be the most cost-effective level for volume flow. This study applies a systematic procedure for the evaluation and optimization of a planar T-mixer with baffles in the channel that promote transversal 3-D flow as well as recirculating secondary flows that enhance mixing.
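The paper's exact Mi is not reproduced here, but a commonly used variance-based definition gives the flavour. The sketch below (an assumption, not the authors' definition) scores a set of concentration samples at a channel cross-section between 0 (fully segregated) and 1 (perfectly mixed).

```python
# Hedged sketch: a standard variance-based mixing index, assumed here
# for illustration; the paper defines its own Mi.
import numpy as np

def mixing_index(c, c_mean=0.5):
    """Mi = 1 - sigma/sigma_max over concentration samples at a channel
    cross-section: 0 = fully segregated, 1 = perfectly mixed."""
    sigma = np.sqrt(np.mean((c - c_mean) ** 2))
    sigma_max = np.sqrt(c_mean * (1 - c_mean))  # fully segregated 0/1 field
    return 1 - sigma / sigma_max

outlet_segregated = np.array([0, 0, 0, 1, 1, 1], float)
outlet_mixed = np.array([0.48, 0.52, 0.50, 0.49, 0.51, 0.50])
print(mixing_index(outlet_segregated))  # ~0.0
print(mixing_index(outlet_mixed))       # close to 1.0
```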


2018 ◽  
Vol 21 (8) ◽  
pp. 1503-1514 ◽  
Author(s):  
Anna K Farmery ◽  
Gabrielle O’Kane ◽  
Alexandra McManus ◽  
Bridget S Green

Abstract
Objective: Encouraging people to eat more seafood can offer a direct, cost-effective way of improving overall health outcomes. However, dietary recommendations to increase seafood consumption have been criticised following concern over the capacity of the seafood industry to meet increased demand while maintaining sustainable fish stocks. The current research sought to investigate Australian accredited practising dietitians' (APD) and public health nutritionists' (PHN) views on seafood sustainability and their dietary recommendations, to identify ways to better align nutrition and sustainability goals.
Design: A self-administered online questionnaire exploring the seafood consumption advice, perceptions of seafood sustainability and information sources of APD and PHN. Qualitative and quantitative data were collected via open and closed questions. Quantitative data were analysed with χ2 tests and reported using descriptive statistics. Content analysis was used for qualitative data.
Setting: Australia.
Subjects: APD and PHN were targeted to participate; the sample includes respondents from urban and regional areas throughout Australia.
Results: The results indicate confusion around the concept of seafood sustainability and where to obtain information about it, which may limit health professionals' ability to recommend the best types of seafood for maximising health and sustainability outcomes. Respondents demonstrated limited understanding of seafood sustainability, with only 7·5 % (n 6/80) satisfied with their level of understanding.
Conclusions: Nutrition and sustainability goals can be better aligned by increasing awareness of seafood that is both healthy and sustainable. For health professionals to confidently make recommendations, or identify trade-offs, more evidence-based information needs to be made accessible through forums such as dietetic organisations, industry groups and nutrition programmes.
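For readers unfamiliar with the analysis mentioned in the Design section, the sketch below shows a χ2 test of the kind described, using invented counts rather than the study's data.

```python
# Hedged sketch: the kind of chi-squared test the Design section
# describes, with made-up counts (not the study's data).
from scipy.stats import chi2_contingency

# Rows: APD vs PHN; columns: recommends eating more seafood yes/no.
table = [[45, 15],   # hypothetical APD responses
         [12, 8]]    # hypothetical PHN responses
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```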


Author(s):  
Satyakiran Munaga ◽  
Francky Catthoor

Modern cost-conscious dynamic systems incorporate knobs that allow run-time trade-offs between system metrics of interest. In these systems, regularly tuning the knobs to minimize costs while satisfying hard system constraints is an important concern. Knob tuning is a combinatorial, constrained, nonlinear dynamic optimization problem with uncertainties and time-linkage. Hiding uncertainties under worst-case bounds, reacting after the fact, optimizing only for the present, and applying static greedy heuristics are widely used simplification strategies that keep design complexity and decision overhead low, but applying any of them results in highly sub-optimal system realizations in the presence of nonlinearities. The more recently introduced System Scenarios methodology can handle only limited forms of dynamics and nonlinearity. Existing predictive optimization approaches are far from optimal because they do not fully exploit the predictability of the system at hand. To bridge this gap, the authors propose the combined strategy of dynamic bounding and proactive system conditioning for the predicted likely future. This paper describes systematic principles for designing low-overhead controllers for cost-effective hard-constraint management. When applied to the fine-grained performance-scaling mode-assignment problem in a video decoder design, the proposed concepts resulted in more than 2x energy gains compared to state-of-the-art techniques.
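To make the strategy concrete, the following toy controller sketches the flavour of dynamic bounding with proactive conditioning: it picks the cheapest knob setting whose dynamically bounded worst-case work still meets a hard deadline. The modes, cost model and `choose_mode` logic are assumptions for illustration, not the paper's controller design.

```python
# Hedged sketch of a receding-horizon knob controller for a toy frame
# decoder. Modes, costs and the bound model are invented, not the paper's.

MODES = {           # knob settings: (speed factor, energy per work unit)
    "low":  (1.0, 1.0),
    "mid":  (1.5, 1.8),
    "high": (2.2, 3.2),
}

def choose_mode(work_bound, time_left, backlog):
    """Pick the cheapest mode whose worst-case finish time (dynamically
    bounded predicted work plus current backlog) still meets the deadline;
    over-provision proactively only when the bound demands it."""
    for name, (speed, energy) in sorted(MODES.items(),
                                        key=lambda kv: kv[1][1]):
        if (backlog + work_bound) / speed <= time_left:
            return name, energy * work_bound
    return "high", MODES["high"][1] * work_bound  # constraint-critical

# As the predicted workload bound tightens at run time, the controller
# relaxes to cheaper modes instead of a static worst-case assignment.
for bound in (12.0, 8.0, 5.0):
    print(bound, "->", choose_mode(bound, time_left=6.0, backlog=1.0))
```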


2005 ◽  
Vol 883 ◽  
Author(s):  
Edward F. Stephens

Abstract
Low duty cycle, high peak power, conductively cooled laser diode arrays have been manufactured for several years by a number of different vendors. Typically these packages have been limited to duty cycles of a few percent, owing to thermal problems that develop in tight bar-pitch arrays at higher duty cycles. Traditionally these packages are made from some combination of copper and BeO, or tungsten-copper and BeO. Trade-offs between thermal conductivity and CTE matching are always made when manufacturing these devices. In addition, the manufacturability of the heat sinks plays a critical role in creating a cost-effective, high-performance solution. In this discussion we examine several exotic materials that have been manufactured and tested as heat sinks for laser diode arrays.
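The conductivity-versus-CTE trade-off can be screened with a first-pass calculation like the sketch below. The property values are textbook approximations and the figure of merit is an invented illustration, not the evaluation used in this work.

```python
# Hedged sketch: first-pass screening of heat-sink materials by thermal
# conductivity versus CTE mismatch to a GaAs laser bar. Property values
# are rough textbook approximations, not measurements from this work.

GAAS_CTE = 5.7                      # ppm/K, GaAs laser bar (approx.)

materials = {                       # (k, W/m-K; CTE, ppm/K), approximate
    "Cu":    (400, 17.0),
    "W-Cu":  (200, 7.0),
    "BeO":   (270, 7.5),
}

for name, (k, cte) in materials.items():
    mismatch = abs(cte - GAAS_CTE)
    merit = k / (1 + mismatch)      # crude figure of merit: favour high k,
                                    # penalise CTE mismatch to the bar
    print(f"{name:>5}: k={k} W/m-K, |dCTE|={mismatch:.1f} ppm/K, "
          f"merit={merit:.0f}")
```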


1984 ◽  
Vol 3 (1) ◽  
pp. 149-166 ◽  
Author(s):  
William Rudelius ◽  
Richard Weijo ◽  
Gary Dodge

Energy-conservation appeals to homeowners that stress patriotism and social responsibility have not worked. The authors believe that more precise information showing the homeowner the specific dollar costs and savings of various energy actions will stimulate meaningful, beneficial trade-offs for the individual. They further believe that broadly conceived, publicly sponsored marketing strategies can help individual consumers make more informed energy-conservation choices from among the continuous, seasonal, and one-time actions available to them. If public policymakers focus their efforts on the most cost-effective energy-saving actions for households, the community will receive the greatest energy savings for a fixed amount of public expenditure.
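The kind of "specific dollar costs and savings" information the authors call for reduces to simple payback arithmetic; the sketch below ranks hypothetical household actions by annual savings per dollar of up-front cost (all numbers invented for illustration).

```python
# Hedged sketch: ranking household energy actions by cost-effectiveness.
# All figures are invented, not data from the article.

actions = [                 # (name, up-front cost $, annual savings $)
    ("attic insulation",    900, 220),   # one-time action
    ("thermostat setback",    0,  90),   # continuous/behavioural action
    ("furnace tune-up",     120,  60),   # seasonal action
    ("storm windows",      1400, 150),   # one-time action
]

# Rank by savings per dollar spent (the +1 avoids division by zero
# for zero-cost behavioural actions).
for name, cost, saving in sorted(actions,
                                 key=lambda a: a[2] / (a[1] + 1),
                                 reverse=True):
    payback = cost / saving if saving else float("inf")
    print(f"{name:>20}: payback {payback:4.1f} yr, "
          f"${saving}/yr per ${cost} spent")
```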


2019 ◽  
Vol 20 (1) ◽  
Author(s):  
Jacob R Heldenbrand ◽  
Saurabh Baheti ◽  
Matthew A Bockol ◽  
Travis M Drucker ◽  
Steven N Hart ◽  
...  

Abstract
Background: Use of the Genome Analysis Toolkit (GATK) continues to be the standard practice in genomic variant calling in both research and the clinic. Recently the toolkit has been evolving rapidly. Significant computational performance improvements were introduced in GATK3.8 through collaboration with Intel in 2017, and the first release of GATK4 in early 2018 revealed rewrites in the code base as the stepping stone toward a Spark implementation. As the software continues to be a moving target for optimal deployment in highly productive environments, we present a detailed analysis of these improvements to help the community stay abreast of changes in performance.
Results: We re-evaluated multiple options, such as threading, parallel garbage collection, I/O options and data-level parallelization. Additionally, we considered the trade-offs of using GATK3.8 versus GATK4. We found optimized parameter values that reduce the time to execute the best-practices variant-calling procedure by 29.3% for GATK3.8 and 16.9% for GATK4. Further speedups can be accomplished by splitting data for parallel analysis, resulting in a run time of only a few hours on a whole human genome sequenced to a depth of 20X, for both versions of GATK. Nonetheless, GATK4 is already much more cost-effective than GATK3.8: thanks to significant rewrites of the algorithms, the same analysis can be run largely in a single-threaded fashion, allowing users to process multiple samples on the same CPU.
Conclusions: In time-sensitive situations, when a patient has a critical or rapidly developing condition, it is useful to minimize the time to process a single sample. In such cases we recommend using GATK3.8, splitting the sample into chunks and computing across multiple nodes. The resultant walltime will be ∼3.4 hours at a cost of $41.60 on 4 c5.18xlarge instances of Amazon Cloud. For cost-effectiveness of routine analyses or for large population studies, it is useful to maximize the number of samples processed per unit time; here we recommend GATK4, running multiple samples on one node. The total walltime will be ∼34.1 hours for 40 samples, with 1.18 samples processed per hour at a cost of $2.60 per sample on a c5.18xlarge instance of Amazon Cloud.
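The quoted costs follow from straightforward instance-pricing arithmetic; the sketch below reproduces them, assuming the c5.18xlarge on-demand rate of roughly $3.06/hour that the quoted figures imply.

```python
# Hedged sketch: back-of-envelope reproduction of the cost figures quoted
# above, assuming a c5.18xlarge on-demand rate of about $3.06/hour (the
# rate implied by the quoted costs, not a figure from the paper itself).

RATE = 3.06                                # $/hour per c5.18xlarge (assumed)

# Latency-optimised (GATK3.8, one sample split across 4 nodes):
nodes, walltime = 4, 3.4
print(f"urgent sample: ${nodes * walltime * RATE:.2f}")    # ~$41.60

# Throughput-optimised (GATK4, 40 samples on one node):
samples, walltime = 40, 34.1
print(f"{samples / walltime:.2f} samples/hour, "
      f"${walltime * RATE / samples:.2f} per sample")      # ~1.18/h, ~$2.60
```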

