A collaborative, integrated and electronic future for taxonomy

2011 ◽  
Vol 25 (5) ◽  
pp. 471 ◽  
Author(s):  
Norman F. Johnson

The Platygastroidea Planetary Biodiversity Inventory is a large-scale, multinational effort to significantly advance the taxonomy and systematics of one group of parasitoid wasps. Based on this effort, there are some clear steps that should be taken to increase the efficiency and throughput of the taxonomic process. Increased collaboration among taxonomic specialists can significantly shorten the timeline and add rigor to the development of hypotheses of characters and taxa. Species delimitations should make use of multiple data sources, thus providing more nearly independent tests of these hypotheses. Taxonomy should fully embrace electronic media and informatics tools. In particular, this step requires the development and widespread implementation of community data standards. The barriers to progress in these areas are not technological, but primarily social. The community needs to see clear evidence of the value added through these changes in procedures and insist upon their use as standard practice.

Water ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 899
Author(s):  
Djordje Mitrovic ◽  
Miguel Crespo Chacón ◽  
Aida Mérida García ◽  
Jorge García Morillo ◽  
Juan Antonio Rodríguez Diaz ◽  
...  

Studies have shown micro-hydropower (MHP) opportunities for energy recovery and CO2 reductions in the water sector. This paper conducts a large-scale assessment of this potential using a dataset amassed across six EU countries (Ireland, Northern Ireland, Scotland, Wales, Spain, and Portugal) for the drinking water, irrigation, and wastewater sectors. Extrapolating the collected data, the total annual MHP potential was estimated at between 482.3 and 821.6 GWh, depending on the assumptions, divided among Ireland (15.5–32.2 GWh), Scotland (17.8–139.7 GWh), Northern Ireland (5.9–8.2 GWh), Wales (10.2–8.1 GWh), Spain (375.3–539.9 GWh), and Portugal (57.6–93.5 GWh) and distributed across the drinking water (43–67%), irrigation (51–30%), and wastewater (6–3%) sectors. The findings demonstrated reductions in energy consumption in water networks of between 1.7 and 13.0%. Forty-five percent of the energy estimated for the analysed sites was associated with just 3% of the sites, namely those with a power output capacity >15 kW. This demonstrated that a significant proportion of the energy could be exploited at a small number of sites, with a valuable contribution to net energy efficiency gains and CO2 emission reductions. It also demonstrates cost-effective, value-added, multi-country benefits for policy makers, establishing the case to incentivise MHP in water networks and help achieve the desired CO2 emission reduction targets.
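
As a rough illustration of how such site-level potential can be aggregated (a minimal sketch only, assuming the standard hydraulic power relation P = ρ·g·Q·H·η; the site records, flows, heads, and efficiency below are hypothetical, not the paper's data):

```python
# Illustrative sketch (not the paper's dataset or method): aggregate per-site
# micro-hydropower potential with P = rho * g * Q * H * eta, then sum annual
# energy by country and sector.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2
ETA = 0.65     # assumed overall turbine/generator efficiency (illustrative)
HOURS_PER_YEAR = 8760

# Hypothetical site records: (country, sector, flow m^3/s, head m)
sites = [
    ("Spain", "irrigation", 0.40, 12.0),
    ("Ireland", "drinking water", 0.05, 35.0),
    ("Portugal", "wastewater", 0.20, 8.0),
]

def site_power_kw(flow_m3s, head_m, eta=ETA):
    """Hydraulic power recoverable at a site, in kW."""
    return RHO * G * flow_m3s * head_m * eta / 1000.0

totals = {}
for country, sector, q, h in sites:
    p_kw = site_power_kw(q, h)
    energy_gwh = p_kw * HOURS_PER_YEAR / 1e6  # kW * h -> GWh
    totals[(country, sector)] = totals.get((country, sector), 0.0) + energy_gwh

for (country, sector), gwh in sorted(totals.items()):
    print(f"{country:10s} {sector:15s} {gwh:6.3f} GWh/yr")
```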


2021 ◽  
Vol 13 (7) ◽  
pp. 1367
Author(s):  
Yuanzhi Cai ◽  
Hong Huang ◽  
Kaiyang Wang ◽  
Cheng Zhang ◽  
Lei Fan ◽  
...  

Over the last decade, 3D reconstruction techniques have been developed to capture the latest as-is information of various objects and to build city information models. Meanwhile, deep learning-based approaches are employed to add semantic information to the models. Studies have shown that the accuracy of the model can be improved by combining multiple data channels (e.g., XYZ, Intensity, D, and RGB). Nevertheless, the redundant data channels in large-scale datasets may cause high computational cost and time during data processing. Few researchers have addressed the question of which combination of channels is optimal in terms of overall accuracy (OA) and mean intersection over union (mIoU). Therefore, a framework is proposed to explore an efficient data fusion approach for semantic segmentation by selecting an optimal combination of data channels. In the framework, a total of 13 channel combinations are investigated to pre-process the data, and the encoder-to-decoder structure is utilized for network permutations. A case study is carried out to investigate the efficiency of the proposed approach by adopting a city-level benchmark dataset and applying nine networks. It is found that the combination of IRGB channels provides the best OA performance, while IRGBD channels provide the best mIoU performance.
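
To make the two reported metrics concrete, the following minimal sketch computes overall accuracy (OA) and mean intersection over union (mIoU) from a confusion matrix, which is how any channel combination (e.g., IRGB versus IRGBD) would typically be scored; it uses the standard metric definitions and is not the paper's code:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    # Count how often class t is predicted as class p.
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    for t, p in zip(y_true.ravel(), y_pred.ravel()):
        cm[t, p] += 1
    return cm

def overall_accuracy(cm):
    # Fraction of points labelled correctly.
    return np.trace(cm) / cm.sum()

def mean_iou(cm):
    # Per-class IoU = TP / (TP + FP + FN), averaged over classes present.
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp
    fn = cm.sum(axis=1) - tp
    union = tp + fp + fn
    iou = np.where(union > 0, tp / np.maximum(union, 1), np.nan)
    return np.nanmean(iou)

# Toy example with 3 classes
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])
cm = confusion_matrix(y_true, y_pred, num_classes=3)
print("OA   =", overall_accuracy(cm))   # 4/6 ≈ 0.667
print("mIoU =", mean_iou(cm))           # ≈ 0.5
```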


1997 ◽  
Vol 100 (1) ◽  
pp. 40 ◽  
Author(s):  
Bruce McCune ◽  
Jonathan P. Dey ◽  
JeriLynn E. Peck ◽  
David Cassell ◽  
Karin Heiman ◽  
...  

2017 ◽  
Vol 47 (1) ◽  
pp. 58-86 ◽  
Author(s):  
Stef Adriaenssens ◽  
Jef Hendrickx

Measuring economic output implies that underground sectors such as prostitution are taken into account. This article presents an innovative methodology to measure turnover and added value in prostitution based on a combination of observational and Internet data. The method is applied to Belgium. Turnover is broken down into the number of transactions and the price per transaction in each segment. The starting point is an observation-based measure of turnover in one visible, location-bound segment of the market: window prostitution. Fundamental differences between segments make linear generalizations from one segment invalid. Therefore, we estimate the relative size of transactions in other segments (such as brothels or escort) with Internet data. In combination with measures of the average price per transaction, a consolidated estimate of turnover in prostitution in Belgium is obtained. Estimates of nonresident production are based on data on sex workers’ country of origin. Several bootstrap replications allow for robustness checks of the delta-based standard errors.
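
A minimal sketch of this estimation logic, with purely hypothetical figures (the observed window-prostitution counts, relative segment sizes, and average prices are illustrative, not the article's data), showing how a consolidated turnover estimate and a bootstrap standard error could be computed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: observed weekly transaction counts in window
# prostitution, relative segment sizes inferred from internet data, and
# average prices per transaction by segment.
window_transactions = np.array([410, 385, 440, 402, 395])               # counts per week
relative_size = {"window": 1.0, "brothel": 1.8, "escort": 2.4}          # vs. window
avg_price = {"window": 50.0, "brothel": 70.0, "escort": 120.0}          # EUR per transaction

def annual_turnover(window_weekly):
    """Consolidated annual turnover: scale window transactions to the other
    segments, multiply by each segment's average price, and annualize."""
    weekly = {seg: window_weekly.mean() * k for seg, k in relative_size.items()}
    return 52 * sum(weekly[seg] * avg_price[seg] for seg in weekly)

point_estimate = annual_turnover(window_transactions)

# Bootstrap the observed window counts to get a standard error on the estimate.
boot = np.array([
    annual_turnover(rng.choice(window_transactions, size=len(window_transactions), replace=True))
    for _ in range(2000)
])
print(f"turnover ≈ {point_estimate/1e6:.1f} M EUR, bootstrap SE ≈ {boot.std(ddof=1)/1e6:.2f} M EUR")
```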


Author(s):  
Kanix Wang ◽  
Walid Hussain ◽  
John R. Birge ◽  
Michael D. Schreiber ◽  
Daniel Adelman

Having an interpretable, dynamic length-of-stay model can help hospital administrators and clinicians make better decisions and improve the quality of care. The widespread implementation of electronic medical record (EMR) systems has enabled hospitals to collect massive amounts of health data. However, how to integrate this deluge of data into healthcare operations remains unclear. We propose a framework grounded in established clinical knowledge to model patients’ lengths of stay. In particular, we impose expert knowledge when grouping raw clinical data into medically meaningful variables that summarize patients’ health trajectories. We use dynamic, predictive models to output patients’ remaining lengths of stay, future discharges, and census probability distributions based on their health trajectories up to the current stay. Evaluated with large-scale EMR data, the dynamic model significantly improves predictive power over the performance of any model in the previous literature while remaining medically interpretable. Summary of Contribution: The widespread implementation of electronic health systems has created opportunities and challenges in best utilizing mounting clinical data for healthcare operations. In this study, we propose a new approach that integrates clinical analysis into the generation of variables and the implementation of computational methods. This approach allows our model to remain interpretable to medical professionals while being accurate. We believe our study has broader relevance to researchers and practitioners of healthcare operations.
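
The following is an illustrative sketch only, not the authors' model: it shows the general idea of grouping raw EMR measurements into interpretable trajectory variables and converting a daily discharge hazard into a remaining length-of-stay distribution (the feature groupings and hazard values are assumptions):

```python
import numpy as np

def trajectory_features(vitals_history, labs_history):
    """Summarize a patient's history up to now into interpretable variables,
    e.g., latest value, worst value, and trend (assumed groupings)."""
    v = np.asarray(vitals_history, dtype=float)
    l = np.asarray(labs_history, dtype=float)
    return {
        "vitals_latest": v[-1],
        "vitals_worst": v.max(),
        "vitals_trend": v[-1] - v[0],
        "labs_latest": l[-1],
        "labs_worst": l.max(),
    }

def remaining_los_distribution(daily_hazard, horizon=14):
    """Discrete-time view: P(discharge on day k) given a per-day hazard."""
    probs, survive = [], 1.0
    for _ in range(horizon):
        probs.append(survive * daily_hazard)
        survive *= (1.0 - daily_hazard)
    return np.array(probs)

feats = trajectory_features([88, 92, 97], [1.2, 1.8, 1.5])
# Hypothetical mapping from features to a hazard; a fitted model would go here.
hazard = 0.35 if feats["vitals_trend"] > 0 else 0.2
dist = remaining_los_distribution(hazard)
expected_days = (np.arange(1, len(dist) + 1) * dist).sum() / dist.sum()
print("expected remaining LOS ≈", round(expected_days, 2), "days")
```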


2021 ◽  
Author(s):  
Henry C. Edeh

Achieving the Sustainable Development Goals (SDGs) of poverty and inequality reduction through redistribution has become a critical concern in many low- and middle-income countries, including Nigeria. Although redistribution results from the effect of tax revenue collections, micro household-level empirical analyses of the distributional effect of personal income tax (PIT) and value added tax (VAT) reforms in Nigeria have scarcely been carried out. This study for the first time quantitatively assessed both the equity and redistributive effects of PIT and VAT across different reform scenarios in Nigeria. Data used in this study were mainly drawn from the most recent large-scale, nationally representative Nigeria Living Standard Survey, conducted in 2018/2019. The Kakwani index was used to calculate and compare the progressivity of PIT and VAT reforms. A simple static micro-simulation model was employed to assess the redistributive effect of PIT and VAT reforms in the country. After informality was accounted for, the PIT was found to be progressive in the pre-2011 tax scheme, but turned regressive in the post-2011 tax scheme. It was also discovered that the newly introduced lump sum relief allowance in the post-2011 PIT scheme accrues more to high-income than to low-income taxpayers, confirming the regressivity of the current PIT scheme. However, the study further shows (through counterfactual simulations) that excluding the relatively high-income taxpayers from sharing in the variable part of the lump sum relief allowance makes PIT progressive in the post-2011 scheme. The VAT was found to be regressive both in the pre-2020 scheme and in the current VAT reform scheme. Further, after taking informality into account, the PIT was found to marginally reduce inequality but increase poverty in the pre-2011 scheme. The post-2011 PIT scheme reduced inequality and increased poverty, but by a smaller proportion, confirming a limited redistribution resulting mainly from the concentration of the lump sum relief allowance at the top of the distribution. However, if the variable part of the lump sum relief allowance is provided only to low-income taxpayers below a predefined income threshold, the post-2011 PIT scheme becomes largely redistributive. VAT was found to marginally increase inequality and poverty in the pre-2020 scheme. Though the current VAT scheme slightly increased inequality, it considerably increased poverty in the country. It is therefore suggested that a better tax reform, with a well-regulated relief allowance and differentiated VAT rates, would help to enhance the equity and redistributive capacity of the Nigerian tax system.
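
For reference, the Kakwani index used in the study is conventionally defined as the concentration coefficient of the tax (with households ranked by pre-tax income) minus the Gini coefficient of pre-tax income; a positive value indicates progressivity. A minimal sketch with purely illustrative income and tax figures:

```python
import numpy as np

def concentration_coefficient(amounts, rank_by):
    """Concentration coefficient of `amounts` when units are ranked by `rank_by`.
    With rank_by == amounts this is the ordinary Gini coefficient."""
    order = np.argsort(rank_by)
    x = np.asarray(amounts, dtype=float)[order]
    n = len(x)
    cum = np.cumsum(x) / x.sum()
    # Trapezoid-based area under the concentration (Lorenz) curve
    area = (np.concatenate(([0.0], cum[:-1])) + cum).sum() / (2 * n)
    return 1.0 - 2.0 * area

def kakwani(pre_tax_income, tax_paid):
    g_income = concentration_coefficient(pre_tax_income, pre_tax_income)
    c_tax = concentration_coefficient(tax_paid, pre_tax_income)
    return c_tax - g_income   # > 0: progressive, < 0: regressive

income = np.array([100, 200, 400, 800, 1600], dtype=float)
pit = 0.05 * income + np.array([0, 5, 20, 80, 250])   # hypothetical tax schedule
print("Kakwani index:", round(kakwani(income, pit), 3))
```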


E-Management ◽  
2020 ◽  
Vol 3 (1) ◽  
pp. 68-74
Author(s):  
Ya. V. Miller

In the last decade, unprecedented technological changes have taken place, resulting in the emergence of a fundamentally new economic model. With the widespread adoption of smartphones, the world has become more “connected”. The digitalization of demand and supply contributed to the creation of entirely new digital markets managed by platform enterprises based on an open business model, which enables external consumers and producers to connect and interact with each other. A more interconnected world generates vast amounts of data, allowing platform companies to invest in machine learning and artificial intelligence and ultimately improve their efficiency. Finally, a steady digitalization of business processes, markets, and global value chains is observed. In these circumstances, approaches to value addition are fundamentally changing in the context of the new dimensions of the digital economy, the analysis of which was the purpose of our study. It has been identified that, in the absence of a standardized international methodology for measuring the digital economy, such measurement is so far possible only on the basis of disparate and inconsistent national statistics. Initiatives taken at the international level to overcome differences in national approaches are still insufficient, as there is a lack of statistics and variables related to digital data. It has been revealed that the lack of quality statistics on key indicators of the digital economy makes it difficult to assess value added at the scale of the world economy and to make international comparisons. Many of the challenges of measuring value added in the digital economy, as shown in the article, are related to the principle of “scale without mass,” the intangible nature of capital, the intense growth of large-scale cross-border data flows, and the emergence of new sources of value creation.


2011 ◽  
Vol 24 (13) ◽  
pp. 3272-3293 ◽  
Author(s):  
Tara J. Troy ◽  
Justin Sheffield ◽  
Eric F. Wood

Abstract Northern Eurasia has experienced significant change in its hydrology during the past century. Much of the literature has focused on documenting and understanding the trends rather than documenting the uncertainty that exists in current estimates of the mean hydroclimatology. This study quantifies the terrestrial water budget with reanalysis, hydrologic modeling, remote sensing, and in situ observations and shows there is significant uncertainty in the estimates of precipitation, evapotranspiration, runoff, and terrestrial water storage changes. The spread among the various datasets highlights the scientific community's inability to accurately characterize the hydroclimatology of this region, which is problematic because much attention has focused on hydrologic trends using these datasets. The largest relative differences among estimates exist in the terrestrial storage change, which also is the least studied variable. Seasonally, the spread in estimates relative to the mean is largest in winter, when uncertainty in cold-season processes and measurements causes large differences in the estimates. A methodology is developed that takes advantage of multiple sources of data and observed discharge to improve estimates of precipitation, evapotranspiration, and storage changes. The method also provides a framework to evaluate the errors in datasets for variables that have no large-scale in situ measurements, such as evapotranspiration.
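
As a simplified illustration of the budget-closure idea (not the paper's exact method), the sketch below distributes the residual of dS/dt = P − ET − Q among the budget terms in proportion to assumed error variances, treating observed discharge as comparatively well known; all values and variances are hypothetical:

```python
# Close the terrestrial water budget dS/dt = P - ET - Q by spreading the
# residual over the terms in proportion to their assumed error variances.

# Monthly estimates in mm/month (hypothetical values)
P, ET, dS, Q_obs = 60.0, 35.0, 5.0, 15.0

# Assumed error variances: discharge is gauged, so small; storage change is
# the least constrained term.
var = {"P": 8.0**2, "ET": 10.0**2, "dS": 15.0**2, "Q": 1.0**2}

residual = P - ET - dS - Q_obs   # how far the raw estimates are from closure

terms = {"P": P, "ET": ET, "dS": dS, "Q": Q_obs}
signs = {"P": +1, "ET": -1, "dS": -1, "Q": -1}   # sign of each term in P - ET - dS - Q
total_var = sum(var.values())

# Each term absorbs a share of the residual proportional to its variance.
adjusted = {k: terms[k] - signs[k] * residual * var[k] / total_var for k in terms}

closure = adjusted["P"] - adjusted["ET"] - adjusted["dS"] - adjusted["Q"]
print(adjusted, "closure residual:", round(closure, 6))   # residual -> 0
```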


2016 ◽  
Vol 31 (4) ◽  
pp. 1397-1405
Author(s):  
Weihong Qian ◽  
Ning Jiang ◽  
Jun Du

Abstract Mathematical derivation, meteorological justification, and comparison with direct model precipitation forecasts are the three main concerns recently raised by Schultz and Spengler about moist divergence (MD) and moist vorticity (MV), which were introduced in earlier work by Qian et al. That previous work demonstrated that MD (MV) can in principle be derived mathematically with a value-added empirical modification. MD (MV) has a solid meteorological basis: it combines ascending motion and high moisture, the two elements necessary for rainfall. However, precipitation efficiency is not considered in MD (MV). Given the omission of an advection term in the mathematical derivation and the lack of precipitation efficiency, MD (MV) might be suitable mainly for heavy rain events with large areal coverage and long duration caused by large-scale quasi-stationary weather systems, but not for local intense heavy rain events caused by small-scale convection. In addition, MD (MV) is not capable of describing precipitation intensity. MD (MV) worked reasonably well in predicting heavy rain locations from short to medium ranges when compared with the ECMWF model precipitation forecasts. MD (MV) was generally worse than (though sometimes similar to) the model heavy rain forecast at shorter ranges (about a week) but became comparable or even better at longer ranges (around 10 days). It should be reiterated that MD (MV) is not intended to be a primary tool for predicting heavy rain areas, especially in the short range, but is a useful parameter for calibrating model heavy precipitation forecasts, as stated in the original paper.

