MultiCellDS: a standard and a community for sharing multicellular data

2016 · Author(s): Samuel H. Friedman, Alexander R. A. Anderson, David M. Bortz, Alexander G. Fletcher, Hermann B. Frieboes, ...

Cell biology is increasingly focused on cellular heterogeneity and multicellular systems. To make the fullest use of experimental, clinical, and computational efforts, we need standardized data formats, community-curated “public data libraries”, and tools to combine and analyze shared data. To address these needs, our multidisciplinary community created MultiCellDS (MultiCellular Data Standard): an extensible standard, a library of digital cell lines and tissue snapshots, and support software. With the help of experimentalists, clinicians, modelers, and data and library scientists, we can grow this seed into a community-owned ecosystem of shared data and tools, to the benefit of basic science, engineering, and human health.
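Assuming an XML serialization of a digital cell line (the element and attribute names below are illustrative placeholders, not the actual MultiCellDS schema), a minimal sketch of pulling named numeric parameters out of such a file might look like this:

```python
# Minimal sketch: reading a (hypothetical) XML digital cell line with the
# Python standard library. Element and attribute names are placeholders,
# NOT the actual MultiCellDS schema.
import xml.etree.ElementTree as ET

def load_phenotype_parameters(path):
    """Collect named numeric parameters from a digital cell line file."""
    tree = ET.parse(path)          # e.g. "digital_cell_line.xml" (hypothetical)
    root = tree.getroot()
    params = {}
    for elem in root.iter():
        name = elem.get("name")
        if name is not None and elem.text is not None:
            try:
                params[name] = float(elem.text)
            except ValueError:
                pass               # skip non-numeric entries
    return params

if __name__ == "__main__":
    print(load_phenotype_parameters("digital_cell_line.xml"))
```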

2021 · Vol 10 (3) · pp. 506 · Author(s): Hans Binder, Maria Schmidt, Henry Loeffler-Wirth, Lena Suenke Mortensen, Manfred Kunz

Cellular heterogeneity is regarded as a major factor in treatment response and resistance in a variety of malignant tumors, including malignant melanoma. Recent developments in single-cell sequencing technology have provided deeper insights into this phenomenon. Single-cell data have been used to identify prognostic subtypes of melanoma tumors, with a special emphasis on immune cells and fibroblasts in the tumor microenvironment. Moreover, resistance to checkpoint inhibitor therapy has been shown to be associated with a set of differentially expressed immune cell signatures, revealing new targetable intracellular signaling pathways. Characterization of T cell states under checkpoint inhibitor treatment showed that exhausted CD8+ T cells in melanoma lesions still have a high proliferative index. Other studies identified mechanisms of resistance to targeted treatment against the mutated BRAF serine/threonine protein kinase, including repression of the melanoma differentiation gene microphthalmia-associated transcription factor (MITF) and induction of the AXL receptor tyrosine kinase. Interestingly, resistance mechanisms included not only selection of pre-existing subclones but also transitions between different gene expression states. Taken together, single-cell technology has provided deeper insights into melanoma biology and has advanced our understanding of the role of tumor heterogeneity and transcriptional plasticity, which may inform innovative clinical trial designs and experimental approaches.


1994 · Vol 5 (3) · pp. 267-272 · Author(s): Harold Varmus

The following is an edited version of the Keynote Speech delivered at the Annual Meeting of the American Society for Cell Biology by Harold Varmus, Director of the National Institutes of Health. The address, entitled "Basic Science and the NIH," was given at the opening of the meeting in New Orleans on December 11, 1993. It was Varmus' first public policy talk as NIH Director.


2011 · pp. 96-154 · Author(s): A.R. Hurson, Y. Jiao

Advances in mobile devices and wireless communication techniques have enabled anywhere, anytime data access. The data being accessed can be categorized into three classes: private data, shared data, and public data. Private and shared data are usually accessed through on-demand approaches, while public data can be disseminated most effectively through broadcasting. In the mobile computing environment, the characteristics of mobile devices and the limitations of wireless communication technology pose challenges for both broadcast strategy and data-retrieval method design. Major research issues include indexing schemes, broadcasting over single and parallel channels, data distribution and replication strategies, conflict resolution, and data-retrieval methods. In this chapter, we investigate solutions proposed for these issues. High performance and low power consumption are the two main objectives of the proposed schemes. Comprehensive simulation results are used to demonstrate the effectiveness of each solution and to compare the different approaches.
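As a rough illustration of why indexing matters for power consumption in broadcast-based dissemination, the following toy simulation interleaves an index with data segments on a single channel, so a client can doze until the index points at its item rather than listening to the whole cycle. It is a simplified sketch of a (1, m)-style scheme, not the specific algorithms evaluated in the chapter.

```python
# Toy model of (1, m) index interleaving on a single broadcast channel.
# Not the chapter's actual algorithms; a simplified illustration only.
import random

def build_cycle(num_items, m):
    """Interleave m copies of a full index with equal-sized data segments."""
    seg_len = num_items // m
    cycle = []
    for s in range(m):
        cycle.append(("index", None))                      # index slot
        for i in range(s * seg_len, (s + 1) * seg_len):
            cycle.append(("data", i))                      # data slot
    return cycle

def tuning_slots(cycle, wanted, start):
    """Slots the client must actively listen to; it dozes between index and item."""
    n, pos, listened = len(cycle), start, 0
    while cycle[pos % n][0] != "index":                    # listen until an index slot
        pos, listened = pos + 1, listened + 1
    listened += 1                                          # read the index
    while cycle[pos % n] != ("data", wanted):              # doze until the wanted slot
        pos += 1
    return listened + 1                                    # wake up for one data slot

random.seed(0)
cycle = build_cycle(num_items=120, m=4)
avg = sum(tuning_slots(cycle, random.randrange(120), random.randrange(len(cycle)))
          for _ in range(1000)) / 1000
print(f"average slots listened with index: {avg:.1f} of {len(cycle)}")
```

Without the index, the client would listen to roughly half the cycle on average before finding its item; with it, active listening drops to a handful of slots, which is the power saving the chapter's schemes aim to maximize.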


2020 · Author(s): Meredith Richardson, Ed Kearns, Jonathan O'Neil

Through satellites, ships, radars, and weather models, the National Oceanic and Atmospheric Administration (NOAA) generates and handles tens of terabytes of data per day. Many of NOAA's key datasets have been made available to the public through partnerships with Google, Microsoft, Amazon Web Services, and others as part of the Big Data Project (BDP). This movement of data to the cloud has given researchers from all over the world access to vast amounts of NOAA data, initiating a new form of federal data management and exposing key challenges for the future of open-access data. NOAA researchers have run into challenges in providing "analysis-ready" datasets that researchers from varying fields can easily access, manipulate, and use for different purposes. This issue arises because there is no agreed-upon format or method for transforming traditional datasets for the cloud across research communities, with each scientific field or start-up expressing differing data-formatting needs (cloud-optimized, cloud-native, etc.). Some possible solutions involve changing data formats into those widely used throughout the visualization community, such as Cloud-Optimized GeoTIFF. Initial findings have led NOAA to facilitate roundtable discussions with researchers, public and private stakeholders, and other key members of the data community to encourage the development of best practices for the use of public data on commercial cloud platforms. Overall, by uploading NOAA data to the cloud, the BDP has led to the recognition and ongoing development of new best practices for data authentication and dissemination and to the identification of key areas for targeting collaboration and data use across scientific communities.
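As one concrete example of a cloud-optimized format, a Cloud-Optimized GeoTIFF can be read piecemeal over HTTP, so a user pulls only the window of interest rather than downloading the whole file. The sketch below uses the rasterio library; the URL is a placeholder, not a real NOAA dataset path.

```python
# Minimal sketch: windowed read from a Cloud-Optimized GeoTIFF over HTTP.
# Requires the rasterio package; the URL below is a placeholder, not an
# actual NOAA dataset path.
import rasterio
from rasterio.windows import Window

COG_URL = "https://example-bucket.s3.amazonaws.com/some-noaa-product.tif"  # placeholder

with rasterio.open(COG_URL) as src:
    print(src.profile)                        # metadata only; no full download
    window = Window(col_off=0, row_off=0, width=512, height=512)
    tile = src.read(1, window=window)         # fetch just one 512x512 tile
    print(tile.shape, tile.dtype)
```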


2020 · Author(s): Mathieu Turlure, Marc Schaming, Alice Fremand, Marc Grunberg, Jean Schmittbuhl

The CDGP Repository for Geothermal Data

The Data Center for Deep Geothermal Energy (CDGP – Centre de Données de Géothermie Profonde, https://cdgp.u-strasbg.fr) was launched in 2016 by the LabEx G-EAU-THERMIE PROFONDE (http://labex-geothermie.unistra.fr) to preserve, archive, and distribute data acquired on geothermal sites in Alsace. Since the beginning of the project, specific procedures have been followed to meet international requirements for data management. In particular, the FAIR recommendations are applied so that the distributed data are Findable, Accessible, Interoperable, and Reusable.

The data currently available on the CDGP mainly consist of seismological and hydraulic data acquired at the Soultz-sous-Forêts geothermal plant pilot project. Data on the website are gathered into episodes. Episodes 1994, 1995, 1996, and 2010 from Soultz-sous-Forêts have recently been added to the episodes already available on the CDGP (1988, 1991, 1993, 2000, 2003, 2004, and 2005). All data are described with metadata, and interoperability is promoted through the use of open or community-shared data formats: SEED, csv, pdf, etc. Episodes have DOIs.

To secure the Intellectual Property Rights (IPR) set by data providers, who partly come from industry, an Authentication, Authorization and Accounting Infrastructure (AAAI) grants data access according to distribution rules and the user's affiliation (e.g., academic, industrial, …).

The CDGP is also a local node of the European Plate Observing System (EPOS) Anthropogenic Hazards platform (https://tcs.ah-epos.eu). The platform provides an environment and facilities (data, services, software) for research on anthropogenic hazards, especially those related to the exploration and exploitation of geo-resources. Some episodes from Soultz-sous-Forêts are already available, and the missing ones will soon be added to the platform.

The next steps for the CDGP are, first, to complete the data from Soultz-sous-Forêts: some data are still missing and must be recovered from the industrial partners. Data from the other geothermal sites in Alsace (Rittershoffen, Illkirch, Vendenheim) then need to be collected and distributed. Finally, together with other French data centers, we are on track to apply for CoreTrustSeal certification (ANR Cedre).

The preservation of data can be very challenging and time-consuming. We had to deal with obsolete tapes and formats, and even incomplete data. Old data are frequently poorly documented, and identifying the owner is sometimes difficult. However, the hard work of retrieving and collecting old geothermal data and making them FAIR is necessary for new analyses and for the valorization of these legacy data. The re-use of the data (e.g., Cauchie et al., 2020) demonstrates the importance of the CDGP.
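To illustrate why community-shared formats such as SEED help reuse, the snippet below inspects a miniSEED waveform file with the ObsPy library. The file name and processing steps are illustrative only, not actual CDGP holdings or a recommended workflow.

```python
# Minimal sketch: inspecting a miniSEED waveform file with ObsPy.
# The file name is a placeholder, not an actual CDGP download.
from obspy import read

stream = read("soultz_episode_waveforms.mseed")   # hypothetical file name
for trace in stream:
    print(trace.id, trace.stats.starttime, trace.stats.sampling_rate)

# Basic pre-processing that any SEED-aware tool can reproduce.
stream.detrend("linear")
stream.filter("bandpass", freqmin=1.0, freqmax=20.0)
```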


2019 · Vol 5 · pp. e231 · Author(s): Sebastian Ohse, Melanie Boerries, Hauke Busch

The rise of high-throughput technologies in the domain of molecular and cell biology, as well as medicine, has generated an unprecedented amount of quantitative high-dimensional data. Public databases at present make a wealth of this data available, but appropriate normalization is critical for meaningful analyses integrating different experiments and technologies. Without such normalization, meta-analyses can be difficult to perform and the potential to address shortcomings in experimental designs, such as inadequate replicates or controls with public data, is limited. Because of a lack of quantitative standards and insufficient annotation, large scale normalization across entire databases is currently limited to approaches that demand ad hoc assumptions about noise sources and the biological signal. By leveraging detectable redundancies in public databases, such as related samples and features, we show that blind normalization without constraints on noise sources and the biological signal is possible. The inherent recovery of confounding factors is formulated in the theoretical framework of compressed sensing and employs efficient optimization on manifolds. As public databases increase in size and offer more detectable redundancies, the proposed approach is able to scale to more complex confounding factors. In addition, the approach accounts for missing values and can incorporate spike-in controls. Our work presents a systematic approach to the blind normalization of public high-throughput databases.
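The compressed-sensing formulation is beyond the scope of an abstract, but the underlying intuition, that redundancy across samples lets one separate a dominant low-rank confounder from the biological signal, can be sketched with a truncated SVD. The toy example below is not the authors' blind-normalization algorithm, only an illustration of that low-rank idea.

```python
# Toy illustration of removing a dominant low-rank "confounding" component
# from a data matrix via truncated SVD. NOT the authors' blind-normalization
# method, only a sketch of the low-rank intuition behind it.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_samples, rank = 500, 60, 2

signal = rng.normal(size=(n_features, n_samples))
confounder = rng.normal(size=(n_features, rank)) @ rng.normal(size=(rank, n_samples)) * 3
observed = signal + confounder

# Estimate the dominant low-rank component and subtract it.
U, s, Vt = np.linalg.svd(observed, full_matrices=False)
low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
corrected = observed - low_rank

print("residual confounder norm:", np.linalg.norm(confounder - low_rank))
print("correlation with true signal:",
      np.corrcoef(signal.ravel(), corrected.ravel())[0, 1])
```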


2020 · Vol 5 (1) · Author(s): Anthony Futerman

The critical point in the life cycle of a virus is gaining entry into a host cell so that the virus can replicate. Anthony Futerman describes the distinct biological features of SARS-CoV-2, including its method of entering host cells. He finally urges more support for basic science research so that future biologists will be better prepared to stem diseases before they reach pandemic proportions.


2018 · Vol 35 (3) · Author(s): George S. Stoyanov, Deyan Dzhenkov, Peter Ghenev, Bogomil Iliev, Yavor Enchev, ...

2009 · Vol 37 (1) · pp. 299-302 · Author(s): Adrienne M. Gorman, Karen M. Doyle

Neuroscience is a rapidly developing area of science that has benefitted from the blurring of interdisciplinary boundaries. This was apparent in the range of papers presented at this year's Neuroscience Ireland Conference, held in Galway in August 2008. The event was attended by academics, postdoctoral and postgraduate researchers, scientists from industry, and clinicians. The themes of this year's conference, neurodegeneration, neuroregeneration, pain, glial cell biology and psychopharmacology, were chosen to reflect areas of strength in neuroscience within Ireland. In addition to basic science, translational research also featured strongly.

