Molecular Epidemiology and Biomarkers in Etiologic Cancer Research: The New in Light of the Old

2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.

2011 ◽  
Vol 152 (16) ◽  
pp. 633-641 ◽  
Author(s):  
Katalin Gőcze ◽  
Katalin Gombos ◽  
Gábor Pajkos ◽  
Ingrid Magda ◽  
Ágoston Ember ◽  
...  

Cancer research concerning short non-coding RNA sequences functionally linked to RNA interference (RNAi) has achieved explosive breakthroughs in the past decade. Molecular technology applies microRNAs across an extremely wide spectrum, from molecular tumor prediction and diagnostics to progression monitoring and prevention. Functional analysis of tissue miRNA and cell-free serum miRNA in post-transcriptional and translational regulation has innovated and restructured knowledge in the field. This review focuses on the molecular epidemiology and primary prevention aspects of these small non-coding RNA sequences. Orv. Hetil., 2011, 152, 633–641.


GigaScience ◽  
2020 ◽  
Vol 9 (11) ◽  
Author(s):  
Alexandra J Lee ◽  
YoSon Park ◽  
Georgia Doing ◽  
Deborah A Hogan ◽  
Casey S Greene

Abstract Motivation In the past two decades, scientists in different laboratories have assayed gene expression from millions of samples. These experiments can be combined into compendia and analyzed collectively to extract novel biological patterns. Technical variability, or "batch effects," may result from combining samples collected and processed at different times and in different settings. Such variability may distort our ability to extract true underlying biological patterns. As more integrative analysis methods arise and data collections get bigger, we must determine how technical variability affects our ability to detect desired patterns when many experiments are combined. Objective We sought to determine the extent to which an underlying signal was masked by technical variability by simulating compendia comprising data aggregated across multiple experiments. Method We developed a generative multi-layer neural network to simulate compendia of gene expression experiments from large-scale microbial and human datasets. We compared simulated compendia before and after introducing varying numbers of sources of undesired variability. Results The signal from a baseline compendium was obscured when the number of added sources of variability was small. Applying statistical correction methods rescued the underlying signal in these cases. However, as the number of sources of variability increased, it became easier to detect the original signal even without correction. In fact, statistical correction reduced our power to detect the underlying signal. Conclusion When combining a modest number of experiments, it is best to correct for experiment-specific noise. However, when many experiments are combined, statistical correction reduces our ability to extract underlying patterns.
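The batch-effect scenario described above can be sketched in a toy form (this is not the authors' generative neural-network simulator; all sample sizes, noise levels, and the mean-centering correction below are illustrative assumptions): simulate a shared biological signal, add batch-specific offsets, then remove them by centering each gene within each batch.

```python
# Toy illustration of batch effects in a gene-expression compendium
# and a simple per-gene, per-batch mean-centering correction.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples, n_batches = 50, 60, 3

# Shared biological signal: the first 10 genes are shifted up in group A
signal = np.zeros((n_genes, n_samples))
signal[:10, : n_samples // 2] += 2.0
expr = signal + rng.normal(0.0, 1.0, size=(n_genes, n_samples))

# Add technical variability: each batch gets its own per-gene offset
batch = np.repeat(np.arange(n_batches), n_samples // n_batches)
for b in range(n_batches):
    expr[:, batch == b] += rng.normal(0.0, 2.0, size=(n_genes, 1))

# Correction: center each gene within each batch
corrected = expr.copy()
for b in range(n_batches):
    cols = batch == b
    corrected[:, cols] -= corrected[:, cols].mean(axis=1, keepdims=True)
```

Note that when groups are confounded with batches (here, batch 0 is entirely group A and batch 2 entirely group B), this naive centering removes part of the biological signal along with the technical one, echoing the abstract's point that correction can reduce power.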


1987 ◽  
Vol 20 (1) ◽  
pp. 7-17 ◽  
Author(s):  
R A Furness

Pipelines are an integral part of the world's economy and literally billions of pounds worth of fluids are moved each year in pipelines of varying lengths and diameters. As the cost of some of these fluids and the price of moving them has increased, so the need to measure the flows more accurately and control and operate the line more effectively has arisen. Instrumentation and control equipment has developed steadily in the past decade but not as fast as the computers and microprocessors that are now a part of most large scale pipeline systems. It is the interfacing of the new generation of digital and sometimes ‘intelligent’ instrumentation with smaller and more powerful computers that has led to a quiet but rapid revolution in pipeline monitoring and control. This paper looks at the more significant developments from the many that have appeared in the past few years and attempts to project future trends in the industry for the next decade.


2018 ◽  
Author(s):  
Koen Van Den Berge ◽  
Katharina Hembach ◽  
Charlotte Soneson ◽  
Simone Tiberi ◽  
Lieven Clement ◽  
...  

Gene expression is the fundamental level at which the results of various genetic and regulatory programs are observable. The measurement of transcriptome-wide gene expression has convincingly switched from microarrays to sequencing in a matter of years. RNA sequencing (RNA-seq) provides a quantitative and open system for profiling transcriptional outcomes on a large scale and therefore facilitates a large diversity of applications, including basic science studies as well as agricultural and clinical applications. In the past 10 years or so, much has been learned about the characteristics of RNA-seq datasets, as well as the performance of the myriad of methods developed. In this review, we give an overall view of the developments in RNA-seq data analysis, including experimental design, with an explicit focus on the quantification of gene expression and statistical approaches for differential expression. We also highlight emerging data types, such as single-cell RNA-seq and gene expression profiling using long-read technologies.


Author(s):  
Harry Cook ◽  
Michael Newson

In 2013, the Saudi government embarked on a nationwide strategy to restructure its labor market and its policies towards the recruitment of foreign workers. These changes are in line with the implementation of Saudi Arabia’s Nitaqat system which aims to better regulate foreign labor in the country and to reduce the number of irregular workers in the Kingdom. As a result of these changes in policy and implementation, there have been large-scale deportations of irregular workers—along with their family members, in some cases—from KSA beginning in mid-2013 and continuing up to the time of writing. Yemeni workers in KSA have been particularly hard hit by these policy changes due to the largely informal nature of labor migration flows that have existed between KSA and Yemen for the past few decades. This chapter explores the possible implications of the recent labor policy changes in KSA for Yemeni and host communities in KSA, as well as for returning workers, their families, and communities of origin in Yemen. The chapter concludes with several recommendations on how to effectively address the challenges these disruptions will cause and how to build new avenues to support the transnational linkages between Yemeni migrant workers in KSA and their communities in Yemen.


2021 ◽  
Vol 9 (1) ◽  
pp. 187-199
Author(s):  
Basil Bornemann ◽  
Marius Christen

Governments and administrations at all levels play a central role in shaping sustainable development. Over the past 30 years, many have developed differentiated sustainability governance arrangements (SGAs) to incorporate sustainability into their governing practice. The 2030 Agenda for Sustainable Development, which the UN adopted in 2015, brings with it some significant conceptual shifts in sustainability thinking that, in turn, entail new governance requirements. Starting from practical calls for improved understanding of the requirements and conditions of 2030 Agenda implementation ‘on the ground,’ this article examines existing SGAs’ potential to deal with the generational shift that the 2030 Agenda implies. To this end, four ideal-typical SGAs representing an early generation of sustainability governance at the subnational level in Switzerland are related to five specific governance requirements emerging from the 2030 Agenda. The analysis highlights different possibilities and limitations of the four SGAs to meet 2030 Agenda requirements and points to the need for context-specific reforms of first-generation sustainability governance in the wake of the new Agenda.


Author(s):  
Thenille Braun Janzen ◽  
Michael H. Thaut

This chapter presents a broad panorama of the current knowledge concerning the anatomical and functional basis of music processing in the healthy brain. Neuroimaging studies developed over the past 20 years provide evidence that music processing takes place in widely distributed neural networks. Here, attention is focused on core brain networks implicated in music processing, emphasizing the anatomical and functional interactions between cortical and subcortical areas within auditory-frontal networks, auditory-motor networks, and auditory-limbic networks. Finally, the authors review recent studies investigating how brain networks organize themselves in a naturalistic music listening context. Collectively, this robust body of literature demonstrates that music processing requires timely coordination of large-scale cognitive, motor, and limbic brain networks, setting the stage for a new generation of music neuroscience research on the dynamic organization of brain networks underlying music processing.


2019 ◽  
Vol 2 (1) ◽  
pp. 139-173 ◽  
Author(s):  
Koen Van den Berge ◽  
Katharina M. Hembach ◽  
Charlotte Soneson ◽  
Simone Tiberi ◽  
Lieven Clement ◽  
...  

Gene expression is the fundamental level at which the results of various genetic and regulatory programs are observable. The measurement of transcriptome-wide gene expression has convincingly switched from microarrays to sequencing in a matter of years. RNA sequencing (RNA-seq) provides a quantitative and open system for profiling transcriptional outcomes on a large scale and therefore facilitates a large diversity of applications, including basic science studies as well as agricultural and clinical applications. In the past 10 years or so, much has been learned about the characteristics of RNA-seq data sets, as well as the performance of the myriad of methods developed. In this review, we give an overview of the developments in RNA-seq data analysis, including experimental design, with an explicit focus on the quantification of gene expression and statistical approaches for differential expression. We also highlight emerging data types, such as single-cell RNA-seq and gene expression profiling using long-read technologies.
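The review's core task, testing genes for differential expression between conditions, can be illustrated with a deliberately simplified sketch. Real RNA-seq analyses use count-based models such as those in edgeR or DESeq2; the per-gene t-test on log-transformed counts, the simulated data, and the fold-change parameters below are illustrative assumptions only.

```python
# Simplified differential-expression sketch: simulate counts, test each gene,
# then control the false discovery rate with Benjamini-Hochberg adjustment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_per_group = 200, 6

# Simulated counts: the first 20 genes are 4-fold up in the treatment group
control = rng.poisson(100, size=(n_genes, n_per_group))
treated = rng.poisson(100, size=(n_genes, n_per_group))
treated[:20] = rng.poisson(400, size=(20, n_per_group))

# Per-gene two-sample t-test on log2(count + 1)
logc, logt = np.log2(control + 1), np.log2(treated + 1)
t_stat, pvals = stats.ttest_ind(logt, logc, axis=1)

# Benjamini-Hochberg FDR adjustment
order = np.argsort(pvals)
ranked = pvals[order] * n_genes / np.arange(1, n_genes + 1)
qvals = np.empty(n_genes)
qvals[order] = np.minimum.accumulate(ranked[::-1])[::-1]

significant = qvals < 0.05
```

The multiple-testing step matters because thousands of genes are tested at once; without adjustment, a 5% per-gene error rate would flag hundreds of null genes in a real transcriptome-wide analysis.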


Industry ◽  
2021 ◽  
pp. 221-230
Author(s):  
William Robin

By the early twenty-first century, new music’s marketplace turn was complete, though Bang on a Can’s journey had only begun: in the past two decades, they have received Pulitzer Prizes and grown into a multi-faceted, multi-million-dollar organization. The three founders began writing more large-scale works, and Bang on a Can’s marathons at the World Financial Center expanded their audience and diversified their programming. With their summer institute in the Berkshires, Bang on a Can has cultivated their ethos among a new generation of entrepreneurial composers, including the prominent indie classical scene, while American new music has grown from a fringe phenomenon to a cottage industry. But in the wake of the Great Recession, younger musicians are emerging amidst a crowded and precarious market, in which opportunities proliferate but stability remains elusive.

