The Deep-time Digital Earth program: data-driven discovery in geosciences

2021
Author(s):
Chengshan Wang
Robert M Hazen
Qiuming Cheng
Michael H Stephenson
Chenghu Zhou
...  

Abstract: Current barriers hindering data-driven discoveries in deep-time Earth (DE) include: substantial volumes of DE data are not digitized; many DE databases do not adhere to FAIR principles (findable, accessible, interoperable, and reusable); we lack a systematic knowledge graph for DE; existing DE databases are geographically heterogeneous; a significant fraction of DE data is not in open-access formats; and tailored tools are needed. These challenges motivate the Deep-time Digital Earth (DDE) program, initiated by the International Union of Geological Sciences (IUGS) and developed in cooperation with national geological surveys, professional associations, academic institutions, and scientists around the world. DDE's mission is to build on previous research to develop a systematic DE knowledge graph, a FAIR data infrastructure that links existing databases and makes dark data visible, and tailored, universally accessible tools for DE data. DDE aims to harmonize DE data, share global geoscience knowledge, and facilitate data-driven discovery in the understanding of Earth's evolution.
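The abstract above frames the FAIR principles as a checklist that existing DE databases often fail. As a minimal sketch of what such a check might look like in practice, the snippet below audits a dataset record against one illustrative metadata field set per principle; the field names and the example record are assumptions for illustration, not DDE's actual schema.

```python
# Illustrative FAIR-compliance check: for each principle, list which
# required metadata fields a dataset record is missing.
# The required fields below are hypothetical, chosen for illustration.
FAIR_FIELDS = {
    "findable":      ["identifier", "title", "keywords"],
    "accessible":    ["access_url", "access_protocol"],
    "interoperable": ["format", "vocabulary"],
    "reusable":      ["license", "provenance"],
}

def fair_report(record):
    """Return, per principle, the required metadata fields that are absent or empty."""
    return {principle: [f for f in fields if not record.get(f)]
            for principle, fields in FAIR_FIELDS.items()}

# Hypothetical deep-time dataset record with one interoperability gap.
record = {
    "identifier": "doi:10.1234/example",        # illustrative DOI
    "title": "Paleogeographic maps, Triassic",
    "keywords": ["deep time", "paleogeography"],
    "access_url": "https://data.example.org/triassic",
    "access_protocol": "HTTPS",
    "format": "GeoJSON",
    "vocabulary": None,                          # no shared vocabulary declared
    "license": "CC-BY-4.0",
    "provenance": "Digitized from published maps",
}
report = fair_report(record)
```

A report like this makes "dark data" failures concrete: a record can be findable and reusable yet still fail interoperability because no shared vocabulary links it to other databases.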

2019
Vol 46 (8)
pp. 622-638
Author(s):
Joachim Schöpfel
Dominic Farace
Hélène Prost
Antonella Zane

Data papers have been defined as scholarly journal publications whose primary purpose is to describe research data. Our survey provides insights into the environment of data papers, i.e., disciplines, publishers, and business models, and into their structure, length, formats, metadata, and licensing. Data papers are a product of the emerging ecosystem of data-driven open science, and they contribute to the FAIR principles for research data management. However, the boundaries with other categories of academic publishing are partly blurred. Data papers are (or can be) generated automatically and are potentially machine-readable. They are essentially information, i.e., descriptions of data, but they also contribute in part to the generation of knowledge, and of data in their own right. As part of the new ecosystem of open and data-driven science, data papers and data journals are an interesting and relevant object for assessing and understanding the transition of the former system of academic publishing.


2021
Author(s):
Tetsuya Yamada
Shoi Shi

Comprehensive and evidence-based countermeasures against emerging infectious diseases have become increasingly important in recent years. COVID-19 and many other infectious diseases are spread by human movement and contact, but the complex transportation networks of the 21st century make it difficult to predict disease spread in rapidly changing situations. It is especially challenging to estimate the network of infection transmission in countries where traffic and human-movement data infrastructure is not yet developed. In this study, we devised a method to estimate the network of COVID-19 transmission from time-series infection data and applied it to determine the disease's spread across areas in Japan. We incorporated the effects of soft lockdowns, such as the declaration of a state of emergency, and of changes in the infection network due to government-sponsored travel promotion, and we predicted the spread of infection using the Tokyo Olympics as a model. The models used in this study are available online, and our data-driven infection network models are scalable, whether at the level of a city, town, country, or continent, and applicable anywhere in the world, as long as time-series data of infections per region are available. These estimations of effective distance and the depiction of infectious disease networks based on actual infection data are expected to be useful in devising data-driven countermeasures against emerging infectious diseases worldwide.
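The abstract's "effective distance" refers to a standard idea from network epidemiology: routes carrying a large fraction of traffic are effectively "close", so arrival times of an epidemic track shortest paths in this transformed metric. The sketch below implements the common formulation (edge length 1 − ln P_ij, with P_ij the fraction of traffic leaving region i that goes to j, then shortest paths); it is a generic illustration under that assumption, not the paper's exact estimation method, and the flux matrix is a toy example.

```python
import heapq
import math

def effective_distance(flux):
    """Pairwise effective distances over a traffic-flux matrix.

    Edge length from i to j is 1 - ln(P_ij), where P_ij is the share of
    traffic leaving i that goes to j; effective distance is then the
    shortest-path length over these edges (high-flux routes are "close").
    """
    n = len(flux)
    edges = {i: {} for i in range(n)}
    for i in range(n):
        out = sum(flux[i][j] for j in range(n) if j != i)
        for j in range(n):
            if j != i and flux[i][j] > 0:
                p = flux[i][j] / out
                edges[i][j] = 1.0 - math.log(p)  # log(p) <= 0, so length >= 1
    # Dijkstra from every source node.
    dist = [[math.inf] * n for _ in range(n)]
    for src in range(n):
        dist[src][src] = 0.0
        pq = [(0.0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist[src][u]:
                continue
            for v, w in edges[u].items():
                if d + w < dist[src][v]:
                    dist[src][v] = d + w
                    heapq.heappush(pq, (d + w, v))
    return dist

# Toy flux matrix: travellers per day between three regions (illustrative).
flux = [[0, 90, 10],
        [50, 0, 50],
        [10, 90, 0]]
d = effective_distance(flux)
```

Note the characteristic effect: region 2 is effectively closer to region 0 via the high-traffic hub 1 than via the weak direct link, which is exactly why effective distance predicts arrival order better than geographic distance.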


2019
Vol 37 (4)
pp. 244-249
Author(s):
Akshay Rajaram
Trevor Morey
Sonam Shah
Naheed Dosani
Muhammad Mamdani

Background: Considerable gains are being made in data-driven efforts to advance quality improvement in health care. However, organizations providing hospice-oriented palliative care for structurally vulnerable persons with terminal illnesses may not have the enabling data infrastructure or framework to derive such benefits. Methods: We conducted a pilot cross-sectional qualitative study involving a convenience sample of hospice organizations across North America providing palliative care services for structurally vulnerable patients. Through semistructured interviews, we surveyed organizations on the types of data collected, the information systems used, and the challenges they faced. Results: We contacted 13 organizations across North America and interviewed 9. All organizations served structurally vulnerable populations, including the homeless and vulnerably housed, socially isolated, and HIV-positive patients. Common examples of collected data included the number of referrals, the number of admissions, length of stay, and diagnosis. More than half of the organizations (n = 5) used an electronic medical record, although none of the record systems were specifically designed for palliative care. All (n = 9) the organizations used the built-in reporting capacity of their information management systems and more than half (n = 6) augmented this capacity with chart reviews. Discussion: A number of themes emerged from our discussions. Present data collection is heterogeneous, and storage of these data is highly fragmented within and across organizations. Funding appeared to be a key enabler of more robust data collection and use. Future work should address these gaps and examine opportunities for innovative ways of analysis and reporting to improve care for structurally vulnerable populations.


2001
Vol 34 (3)
pp. 1101
Author(s):  
Α. ΔΗΜΗΤΡΙΑΔΗΣ

Environmental Geology is considered to have been coined for the environmentally sensitive market. It originated in the United States in the late 1960s to attract students and save University Geology Departments from closure. After almost thirty years there are still questions about its viability as a stand-alone branch of the geological sciences, since by definition it encompasses all the specialised branches of engineering geology, economic geology, structural geology, hydrogeology, geochemistry, geophysics, etc. The environmental geologist must, therefore, be a "super geologist", which is an impossibility by present-day standards. University curricula in Environmental Geology still teach the basic geological subjects of geology degrees, since these serve as a strong foundation for courses in the environmental field. In the United States, students are required to take at least four elective courses in environmentally orientated earth science subjects during their first degree. In the United Kingdom, by contrast, a Master of Science course in environmental subjects is recommended as a follow-up to the first degree in Environmental Geology, again a misnomer for the degree in pure Geology. It is quite apparent that Universities jumped on the bandwagon of the environmental market without serious thought about what they were embarking on. They created a non-existent, market-orientated branch of the geological sciences, Environmental Geology, and subsequently realised that it is impossible to produce the "super student" and the "super geologist", for this is what is in fact demanded. It is strongly believed that specialists in the different branches of the geological sciences, because of their in-depth study of the natural geological environment and its processes, have considerable knowledge and expertise to apply to the solution of environmental problems.
This must, therefore, be advertised by both Universities and State Geological Surveys, for advertising is a more powerful way of getting the message across to the public and to policy-makers than making up new branches of science with no content.


2020
Author(s):  
Hendro Wicaksono

The presentation introduces the technologies associated with the fourth industrial revolution, which rely on the concept of artificial intelligence. Data is the basis of functioning artificial intelligence technologies. The presentation also explains how data can revolutionize business by providing global access to physical products through an Industry 4.0 ecosystem. The ecosystem contains four pillars: smart products, smart processes, smart resources (smart PPR), and data-driven services. Through these four pillars, Industry 4.0 can be implemented in different sectors. The presentation also provides some insights into the roles of linked data (knowledge graphs) for data integration, data analytics, and machine learning in the Industry 4.0 ecosystem. Project examples in the smart city, healthcare, and agriculture sectors are also described. Finally, the presentation discusses the implications of the introduced concepts for the Indonesian context.
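The linked-data role mentioned above boils down to representing facts as subject–predicate–object triples so that data from separate systems (products, processes, resources) can be joined by pattern matching. The sketch below shows that idea with a minimal in-memory triple store; all entity and predicate names are hypothetical illustrations, not taken from the presentation.

```python
# Minimal linked-data sketch: smart-PPR facts as (subject, predicate, object)
# triples, queried by pattern matching to integrate across data silos.
# Entity and predicate names are illustrative only.
triples = [
    ("sensor42", "monitors",  "press01"),   # resource -> process
    ("press01",  "partOf",    "lineA"),
    ("press01",  "produces",  "gearX"),     # process -> product
    ("gearX",    "shippedTo", "customerB"),
]

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Integration question spanning two pillars: which products come from the
# machine that sensor42 monitors?
machines = [o for _, _, o in query(s="sensor42", p="monitors")]
products = [o for m in machines for _, _, o in query(s=m, p="produces")]
```

In practice the same join would be a SPARQL query over an RDF knowledge graph, but the principle is identical: shared identifiers make cross-system questions answerable without bespoke integration code.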


2020
Vol 102
pp. 534-548
Author(s):
Zhenfeng Lei
Yuan Sun
Y.A. Nanehkaran
Shuangyuan Yang
Md. Saiful Islam
...  

2019
Vol 93 (S3)
pp. 73-75
Author(s):
Sabin ZAHIROVIC
Tristan SALLES
Dietmar MÜLLER
Michael GURNIS
Wenchao CAO
...  

2020
pp. 251484862090972
Author(s):  
Eric Nost

Conservationists around the world advocate for "data-driven" environmental governance, expecting data infrastructures to make all relevant and actionable information readily available. But how exactly is data to be infrastructured, and to what political effect? I show how putting together and maintaining environmental data for decision-making is not a straightforward technical task, but a practice shaped by and shaping politico-economic context. Drawing from the US state of Louisiana's coastal restoration planning process, I detail two ways ecosystem modelers manage fiscal and institutional "frictions" to "infrastructuring" data as a resource for decision-making. First, these experts work with the data they have. They leverage, tweak, and maintain existing datasets and tools, spending time and money to gather additional data only to the extent it fits existing goals. The assumption is that these goals will continue to be important, but building coastal data infrastructure around current research needs, plans, and austerity arguably limits what can be said in and done with the future. Second, modelers acquire the data they are made to need. Coastal communities have protested the state's primary restoration tool: diversions of sediment from the Mississippi River. Planners reacted by relaxing institutional constraints, and modelers brought together new data to highlight possible winners and losers from ecological restoration. Fishers and other coastal residents leveraged greater dissent in the planning process. Political ecologists show that technocentric environmental governance tends to foreclose dissent from hegemonic socioecological futures. I argue we can clarify the conditions in which this tends to happen by following how experts manage data frictions. As some conservationists and planners double down on driving with data in a "post-truth" world, I find that data's politicizing effects stem from what is asked of it, not whether it is "big" or "drives."

