Science of science

Bibliosphere ◽  
2021 ◽  
pp. 25-42
Author(s):  
S. Fortunato ◽  
C. T. Bergstrom ◽  
K. Börner ◽  
J. A. Evans ◽  
D. Helbing ◽  
...  

BACKGROUND. The increasing availability of digital data on scholarly inputs and outputs – from research funding, productivity, and collaboration to paper citations and scientist mobility – offers unprecedented opportunities to explore the structure and evolution of science. The science of science (SciSci) offers a quantitative understanding of the interactions among scientific agents across diverse geographic and temporal scales: It provides insights into the conditions underlying creativity and the genesis of scientific discovery, with the ultimate goal of developing tools and policies that have the potential to accelerate science. In the past decade, SciSci has benefited from an influx of natural, computational, and social scientists who together have developed big data–based capabilities for empirical analysis and generative modeling that capture the unfolding of science, its institutions, and its workforce. The value proposition of SciSci is that with a deeper understanding of the factors that drive successful science, we can more effectively address environmental, societal, and technological problems.

ADVANCES. Science can be described as a complex, self-organizing, and evolving network of scholars, projects, papers, and ideas. This representation has unveiled patterns characterizing the emergence of new scientific fields through the study of collaboration networks and the path of impactful discoveries through the study of citation networks. Microscopic models have traced the dynamics of citation accumulation, allowing us to predict the future impact of individual papers. SciSci has revealed choices and trade-offs that scientists face as they advance both their own careers and the scientific horizon. For example, measurements indicate that scholars are risk-averse, preferring to study topics related to their current expertise, which constrains the potential of future discoveries. Those willing to break this pattern engage in riskier careers but become more likely to make major breakthroughs. Overall, the highest-impact science is grounded in conventional combinations of prior work but features unusual combinations. Last, as the locus of research shifts into teams, SciSci is increasingly focused on the impact of team research, finding that small teams tend to disrupt science and technology with new ideas drawing on older and less prevalent ones. In contrast, large teams tend to develop recent, popular ideas, obtaining high, but often short-lived, impact.

OUTLOOK. SciSci offers a deep quantitative understanding of the relational structure between scientists, institutions, and ideas because it facilitates the identification of fundamental mechanisms responsible for scientific discovery. These interdisciplinary data-driven efforts complement contributions from related fields such as scientometrics and the economics and sociology of science. Although SciSci seeks long-standing universal laws and mechanisms that apply across various fields of science, a fundamental challenge going forward is accounting for undeniable differences in culture, habits, and preferences between different fields and countries. This variation makes some cross-domain insights difficult to appreciate and associated science policies difficult to implement. The differences among the questions, data, and skills specific to each discipline suggest that further insights can be gained from domain-specific SciSci studies, which model and identify opportunities adapted to the needs of individual research fields.

Abstract. Identifying fundamental drivers of science and developing predictive models to capture its evolution are instrumental for the design of policies that can improve the scientific enterprise – for example, through enhanced career paths for scientists, better performance evaluation for organizations hosting research, discovery of novel effective funding vehicles, and even identification of promising regions along the scientific frontier. The science of science uses large-scale data on the production of science to search for universal and domain-specific patterns. Here, we review recent developments in this transdisciplinary field.
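The "microscopic models" of citation accumulation mentioned above are commonly built on preferential attachment, where a paper's chance of being cited grows with the citations it already has. The following is a minimal illustrative sketch of that mechanism, not any specific published model; the corpus size, reference count, and the additive constant c0 are all assumed values.

```python
import random
from collections import Counter

def simulate_citations(n_papers=2000, refs_per_paper=10, c0=1.0, seed=42):
    """Toy preferential-attachment model of citation accumulation.

    Each new paper distributes its references over earlier papers with
    probability proportional to (citations + c0); the constant c0 gives
    still-uncited papers a nonzero chance of being found.
    """
    rng = random.Random(seed)
    citations = [0]  # citation count per paper; paper 0 seeds the corpus
    for _ in range(1, n_papers):
        earlier = range(len(citations))
        weights = [citations[p] + c0 for p in earlier]
        k = min(refs_per_paper, len(citations))
        # Sampling with replacement keeps the sketch simple; duplicate
        # references are rare and harmless in a toy model.
        for cited in rng.choices(earlier, weights=weights, k=k):
            citations[cited] += 1
        citations.append(0)
    return citations

counts = Counter(simulate_citations())
for c in sorted(counts)[:5]:
    print(f"{counts[c]:5d} papers with {c} citations")
```

Even at this small scale, the toy model produces the heavy-tailed citation distribution that motivates impact-prediction work: most papers end up rarely cited while a few accumulate a large share of all citations.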

Cloud computing technologies and service models are attractive to scientific computing users due to on-demand access to resources and the ability to control the software environment. Scientific computing researchers and the resource providers serving these users are considering the impact of new models and technologies. SaaS solutions such as Globus Online and IaaS solutions such as Nimbus Infrastructure and OpenNebula accelerate scientific discovery by helping scientists conduct advanced and large-scale science. This chapter describes how the cloud helps researchers accelerate scientific discovery by moving manual and difficult tasks into the cloud.
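As a concrete illustration of the IaaS model described above, the snippet below provisions and releases a compute node through Apache libcloud, a provider-neutral Python library not mentioned in the chapter itself; the credentials, region, node name, and choice of backing provider are placeholders.

```python
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

# Placeholder credentials; OpenNebula, OpenStack, EC2, and other IaaS
# backends expose the same driver interface in libcloud.
cls = get_driver(Provider.EC2)
driver = cls("ACCESS_KEY", "SECRET_KEY", region="us-east-1")

sizes = driver.list_sizes()    # available instance types
images = driver.list_images()  # available machine images

# On-demand access: start a node for a computation, tear it down after.
node = driver.create_node(name="sci-worker-01",
                          image=images[0], size=sizes[0])
print(node.id, node.state)
driver.destroy_node(node)
```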


Author(s):  
Robin J. Pakeman ◽  
Debbie A. Fielding

Abstract. Many ecosystems are grazed by livestock or large, wild herbivores and exist as mosaics of different vegetation communities. Changing grazing could have an impact on heterogeneity as well as on composition. A long-term, large-scale grazing experiment that maintained existing low-intensity sheep grazing, tripled it, removed it, or partially substituted cattle grazing for sheep grazing was set up on a mosaic of upland vegetation types. The impact of changing grazing regimes was assessed in terms of changes in temporal and spatial species and functional beta diversity. Removal of grazing had the highest impact on species replacement, whilst increased grazing was closest to maintaining the original species complement. Wet heath and Molinia mire had the lowest turnover, but wet heath showed the highest changes in unidirectional abundance as it contained species capable of increasing in abundance in response to changing grazing intensity. Agrostis-Festuca and Nardus grasslands displayed the highest level of balanced species replacement, reflecting their more dynamic vegetation. In functional terms, there was no clear separation of communities based on their grazing preference; all were relatively resistant to change, but Nardus grassland was the most resistant to the removal of grazing. The increased offtake associated with increased grazing led to a degree of homogenisation, as grazing-tolerant species associated with preferred communities increased in the unpreferred ones. Decisions about grazing management of the uplands involve many trade-offs, and this study identified potential trade-offs between stability and homogenisation to add to existing ones concerning the biodiversity of different groups of species and ecosystem services.
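The "balanced species replacement" and "unidirectional abundance" components referred to above correspond to Baselga's (2013) partition of Bray-Curtis dissimilarity into balanced-variation and abundance-gradient terms. A minimal sketch of that partition for a single pair of plots follows; the cover values are invented for illustration, not taken from the study.

```python
def bray_curtis_partition(x, y):
    """Partition Bray-Curtis dissimilarity (Baselga 2013) into balanced
    variation (species replacement) and abundance gradient (unidirectional
    gain/loss) components, given two species-abundance vectors."""
    A = sum(min(a, b) for a, b in zip(x, y))      # shared abundance
    B = sum(a - min(a, b) for a, b in zip(x, y))  # abundance excess at site 1
    C = sum(b - min(a, b) for a, b in zip(x, y))  # abundance excess at site 2
    d_bc = (B + C) / (2 * A + B + C)              # total Bray-Curtis
    d_bal = min(B, C) / (A + min(B, C))           # balanced variation
    d_gra = d_bc - d_bal                          # abundance gradient
    return d_bc, d_bal, d_gra

# Hypothetical cover values for five species in grazed vs. ungrazed plots
grazed = [12, 5, 0, 3, 8]
ungrazed = [2, 9, 6, 3, 1]
print(bray_curtis_partition(grazed, ungrazed))
```

When the excesses B and C are of similar size, species are effectively swapping abundance (balanced replacement); a strongly one-sided excess signals a unidirectional gain or loss of abundance, as observed in the wet heath.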


2021 ◽  
Author(s):  
Dimitris Vordonis ◽  
Vassilis Paliouras

Detection for high-dimensional multiple-input multiple-output (MIMO) and massive MIMO (MMIMO) systems is an active field of research in wireless communications. While most works consider spatially uncorrelated channels, practical MMIMO channels are correlated. This paper investigates the impact of correlation on the Sphere Decoder (SD), for both single-user (SU) and multi-user (MU) scenarios. The complexity of SD is mainly determined by the initial radius (IR) method and the number of visited nodes during detection. This paper employs an efficient IR and proposes a new metric constraint in the tree-search algorithm, which together significantly decrease the number of visited nodes and render SD feasible for large-scale systems. In addition, the introduced hardware implementation, featuring a one-node-per-cycle architecture, minimizes the latency of the detection process. Trade-offs between bit error rate (BER) performance and computational complexity are presented. The trade-offs are achieved by either modifying the backtracking mechanism or limiting the number of radius updates. Simulation results prove that the proposed optimizations are effective for both correlated and uncorrelated channels, regardless of the level of noise. The decoding gain of SD compared to low-complexity linear detectors (LD) is higher in the presence of correlation than in the uncorrelated case. However, as expected, spatial correlation adversely affects the performance and the complexity of SD. Simulation results reported here also confirm that correlation at the side equipped with more antennas is less detrimental. Hardware implementation aspects are examined for both a Virtex-7 FPGA device and a 28-nm ASIC technology.
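For readers unfamiliar with the tree search whose visited-node count the IR choice and metric constraint aim to reduce, below is a generic textbook depth-first sphere decoder over a real-valued channel model; this is a sketch, not the authors' optimized design, and the 4x4 BPSK setting, noise level, and radius handling are assumptions.

```python
import numpy as np

def sphere_decode(H, y, symbols=(-1.0, 1.0), radius=np.inf):
    """Depth-first sphere decoder: find s minimizing ||y - H s||^2.

    QR-decompose H so the metric accumulates one antenna (tree level) at a
    time; prune any branch whose partial metric already exceeds the current
    radius, and shrink the radius whenever a full leaf improves on it.
    """
    n = H.shape[1]
    Q, R = np.linalg.qr(H)
    z = Q.T @ y
    best, best_metric = None, radius

    def search(level, s, metric):
        nonlocal best, best_metric
        if metric >= best_metric:        # prune: node lies outside the sphere
            return
        if level < 0:                    # reached a leaf: radius update
            best, best_metric = s.copy(), metric
            return
        for sym in symbols:              # expand the children at this level
            s[level] = sym
            resid = z[level] - R[level, level:] @ s[level:]
            search(level - 1, s, metric + resid**2)

    search(n - 1, np.zeros(n), 0.0)
    return best, best_metric

# Toy 4x4 real-valued MIMO system with BPSK symbols
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 4))
s_true = rng.choice([-1.0, 1.0], size=4)
y = H @ s_true + 0.1 * rng.normal(size=4)
print(sphere_decode(H, y)[0], s_true)
```

Each recursion level corresponds to one antenna, so pruning a node discards the entire subtree beneath it; this is why the initial radius and any extra metric constraint dominate the overall complexity.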


Koedoe ◽  
1999 ◽  
Vol 42 (1) ◽  
Author(s):  
H.C. Biggs ◽  
A.L.F. Potgieter

New developments in fire management policy in the Kruger National Park are sketched against the background of changing attitudes towards ecosystem management. The experimental burning plots established in the mid-1950s are discussed briefly, as is the almost forty-year era of rotational block-burning. The lightning-driven fire policy initiated in 1992 and currently aimed at by park management is discussed, with comments on its early performance. More recent revision of the management plan stressed maximisation of appropriate research benefits from the experimental burning plots, condoned the lightning approach for the present, but stressed the absolute necessity of the park not finding itself in the 1992 position again, where a major change in policy has to be made with no comparative evidence from other systems. To this end, a major landscape-scale fire management trial has been planned for implementation starting in April 2000. It is scheduled to run over a twenty-year period, and will be placed at four localities representing different major landscapes in the park. It will compare the effects of three different fire systems (lightning, patch mosaic, and range condition burning systems) on biodiversity elements crucial to the park's mission. The rationale for, layout of, and criteria for deciding on the outcome of the trial are discussed, as well as the trade-offs that were made to enable the trial to be of such a large scale and still fit into overall park planning. The impact of the trial on the park's monitoring programme is discussed.


2018 ◽  
Vol 37 (13-14) ◽  
pp. 1595-1609 ◽  
Author(s):  
Vitor Guizilini ◽  
Fabio Ramos

Real-world scenarios contain many structural patterns that, if appropriately extracted and modeled, can be used to reduce problems associated with sensor failure and occlusions while improving planning methods in such tasks as navigation and grasping. This paper devises a novel unsupervised procedure that models 3D structures from unorganized point clouds as occupancy maps. Our methodology enables the learning of unique and arbitrarily complex features using a variational Bayesian convolutional auto-encoder, which compresses local information into a latent low-dimensional representation and then decodes it back in order to reconstruct the original scene, including color information when available. This reconstructive model is trained on features obtained automatically from a wide variety of scenarios, in order to improve its generalization and interpolative powers. We show that the proposed framework is able to recover partially missing structures and reason over occlusions with high accuracy while maintaining a detailed reconstruction of observed areas. To combine localized feature estimates seamlessly into a single global structure, we employ the Hilbert maps framework, recently proposed as a robust and efficient occupancy mapping technique, and introduce a new kernel for reproducing kernel Hilbert space projection that uses estimates from the reconstructive model. Experimental tests are conducted with large-scale 2D and 3D datasets, using both laser and monocular data, and a study of the impact of various accuracy–speed trade-offs is provided to assess the limits of the proposed methodology.
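To make the reconstructive model concrete, the following PyTorch sketch shows the general shape of a convolutional variational auto-encoder for small occupancy-grid patches; the patch size, layer widths, latent dimension, and loss weighting are assumptions for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OccupancyVAE(nn.Module):
    """Convolutional VAE over 16x16 occupancy patches: the encoder compresses
    a patch into a low-dimensional latent Gaussian, and the decoder
    reconstructs occupancy probabilities from a latent sample."""

    def __init__(self, latent_dim=8):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 16 -> 8
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 8 -> 4
            nn.Flatten())
        self.fc_mu = nn.Linear(32 * 4 * 4, latent_dim)
        self.fc_logvar = nn.Linear(32 * 4 * 4, latent_dim)
        self.fc_dec = nn.Linear(latent_dim, 32 * 4 * 4)
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1))     # 4 -> 16

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparam.
        logits = self.dec(self.fc_dec(z).view(-1, 32, 4, 4))
        return logits, mu, logvar

def vae_loss(logits, x, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

model = OccupancyVAE()
patches = (torch.rand(4, 1, 16, 16) > 0.5).float()  # dummy occupancy patches
logits, mu, logvar = model(patches)
print(vae_loss(logits, patches, mu, logvar).item())
```

Once trained, decoding from the latent mean of a partially observed patch yields occupancy estimates for the missing cells, which is the role such a model plays in recovering occluded structure.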


Hydrology ◽  
2018 ◽  
Vol 5 (3) ◽  
pp. 49
Author(s):  
Salvatore Manfreda ◽  
Caterina Samela ◽  
Alberto Refice ◽  
Valerio Tramutoli ◽  
Fernando Nardi

The last decades have seen a massive advance in technologies for Earth observation (EO) and environmental monitoring, which has provided scientists and engineers with valuable spatial information for studying hydrologic processes. At the same time, the power of computers and newly developed algorithms has grown sharply. Such advances have extended the range of possibilities for hydrologists, who are trying to exploit this potential to the fullest, updating and re-inventing the way hydrologic and hydraulic analyses are carried out. A variety of research fields have progressed significantly, ranging from the evaluation of water features to the classification of land cover, the identification of river morphology, and the monitoring of extreme flood events. The description of flood processes may particularly benefit from the integrated use of recent algorithms and monitoring techniques. In fact, flood exposure and risk over large areas and in data-scarce environments have always been challenging topics due to the limited information available on river basin hydrology, basin morphology, land cover, and the resulting model uncertainty. The ability of new tools to carry out intensive analyses over huge datasets allows us to produce flood studies over large extents and with a growing level of detail. The present Special Issue aims to describe the state of the art in flood assessment, monitoring, and management using new algorithms, new measurement systems, and EO data. More specifically, we collected a number of contributions dealing with: (1) the impact of climate change on floods; (2) real-time flood forecasting systems; (3) applications of EO data for hazard, vulnerability, and risk mapping, and the post-disaster recovery phase; and (4) development of tools and platforms for the assessment and validation of hazard/risk models.


2018 ◽  
Vol 2 ◽  
pp. e28470
Author(s):  
Gil Nelson ◽  
Shari Ellis

The first two decades of the 21st century have seen a rapid rise in the creation, mobilization, research, and educational use of digital museum data, especially in the natural and biodiversity sciences. This has thrust natural history museums, and especially the biodiversity specimen collections they hold, into the forefront of biodiversity research in systematics, ecology, and conservation, underscoring their central role in the modern scientific enterprise. The advent of such digitization and data mobilization initiatives as the United States National Science Foundation’s Advancing the Digitization of Biodiversity Collections program (ADBC), Australia’s Atlas of Living Australia (ALA), Mexico’s National Commission for the Knowledge and Use of Biodiversity (CONABIO), Brazil’s Centro de Referência em Informação Ambiental (CRIA), Europe’s SYNTHESYS, and China’s National Specimen Information Infrastructure (NSII) has led to a rapid rise in regional, national, and international digital data aggregators and has precipitated an exponential increase in the availability of digital data for scientific research. The international Global Biodiversity Information Facility (GBIF) now serves about 130 million museum specimen records, and Integrated Digitized Biocollections (iDigBio), the U.S. national biodiversity portal, has amassed over 109 million records representing over 300 million specimens that are international in scope. These digital resources raise the profiles of museums, expose collections to a wider audience of systematic and conservation researchers, provide the best biodiversity data in the modern era outside of nature itself, and ensure that specimen-based research remains at the forefront of the biodiversity sciences. Here we provide a brief overview of worldwide digital data generation and mobilization, the impact of these data on biodiversity research, new data underscoring the impact of worldwide digitization initiatives on citation in scientific publications, and evidence of the roles these activities play in raising the public and scientific profiles of natural history collections.
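The aggregated records described above are programmatically accessible. As a brief sketch, the query below uses GBIF's documented public v1 occurrence API to retrieve preserved-specimen records; the species and the printed fields are chosen purely for illustration.

```python
import requests

# GBIF v1 occurrence search: preserved museum specimens for one species.
resp = requests.get(
    "https://api.gbif.org/v1/occurrence/search",
    params={
        "scientificName": "Puma concolor",
        "basisOfRecord": "PRESERVED_SPECIMEN",
        "limit": 5,
    },
    timeout=30,
)
resp.raise_for_status()
data = resp.json()

print(f"{data['count']} matching specimen records aggregated by GBIF")
for rec in data["results"]:
    print(rec.get("institutionCode"), rec.get("country"), rec.get("year"))
```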


2020 ◽  
Vol 59 (04) ◽  
pp. 294-299 ◽  
Author(s):  
Lutz S. Freudenberg ◽  
Ulf Dittmer ◽  
Ken Herrmann

Abstract Introduction Preparations of health systems to accommodate large numbers of severely ill COVID-19 patients in March/April 2020 had a significant impact on nuclear medicine departments. Materials and Methods A web-based questionnaire was designed to differentiate the impact of the pandemic on inpatient and outpatient nuclear medicine operations and on public versus private health systems, respectively. Questions addressed the following issues: impact on nuclear medicine diagnostics and therapy, use of recommendations, personal protective equipment, and organizational adaptations. The survey was available for 6 days and closed on April 20, 2020. Results 113 complete responses were recorded. Nearly all participants (97 %) report a decline in nuclear medicine diagnostic procedures. The mean reductions in the last three weeks for PET/CT and scintigraphies of bone, myocardium, lung, thyroid, and sentinel lymph node are –14.4 %, –47.2 %, –47.5 %, –40.7 %, –58.4 %, and –25.2 %, respectively. Furthermore, 76 % of the participants report a reduction in therapies, especially for benign thyroid disease (–41.8 %) and radiosynoviorthesis (–53.8 %), while tumor therapies remained mainly stable. 48 % of the participants report a shortage of personal protective equipment. Conclusions Nuclear medicine services were notably reduced 3 weeks after the SARS-CoV-2 pandemic reached Germany, Austria, and Switzerland on a large scale. We must be aware that the current crisis will also have a significant economic impact on the healthcare system. As the survey cannot adapt to daily dynamic changes in priorities, it serves as a first snapshot requiring follow-up studies and comparisons with other countries and regions.

