A framework for evaluating edited cell libraries created by massively parallel genome engineering

2021 ◽  
Author(s):  
Simon Cawley ◽  
Eric Abbate ◽  
Christopher G. Abraham ◽  
Steven Alvarez ◽  
Mathew Barber ◽  
...  

Genome engineering methodologies are transforming biological research and discovery. Approaches based on CRISPR technology have been broadly adopted and there is growing interest in the generation of massively parallel edited cell libraries. Comparing the libraries generated by these varying approaches is challenging and researchers lack a common framework for defining and assessing the characteristics of these libraries. Here we describe a framework for evaluating massively parallel libraries of edited genomes based on established methods for sampling complex populations. We define specific attributes and metrics that are informative for describing a complex cell library and provide examples for estimating these values. We also connect this analysis to generic phenotyping approaches, using either pooled (typically via a selection assay) or isolate (often referred to as screening) phenotyping approaches. We approach this from the context of creating massively parallel, precisely edited libraries with one edit per cell, though the approach holds for other types of modifications, including libraries containing multiple edits per cell (combinatorial editing). This framework is a critical component for evaluating and comparing new technologies as well as understanding how a massively parallel edited cell library will perform in a given phenotyping approach.
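
As a rough illustration of the sampling-based metrics such a framework might report (a sketch under assumed inputs, not the authors' implementation), the snippet below takes per-variant read counts from sequencing a library sample and estimates representation (the fraction of designed edits detected) and a simple 90/10 skew ratio for uniformity; the function name, inputs and thresholds are illustrative.

```python
from collections import Counter

def library_metrics(observed_counts, designed_variants):
    """Illustrative sampling-based metrics for an edited cell library.

    observed_counts: mapping of variant id -> read count from sequencing a sample
    designed_variants: iterable of all intended variant ids
    """
    designed = set(designed_variants)
    counts = {v: observed_counts.get(v, 0) for v in designed}
    detected = [c for c in counts.values() if c > 0]

    representation = len(detected) / len(designed)  # fraction of designs seen at all
    total_reads = sum(counts.values())

    # Skew ratio: 90th / 10th percentile of non-zero counts (a common uniformity summary).
    ranked = sorted(detected)
    if ranked:
        p10 = ranked[int(0.1 * (len(ranked) - 1))]
        p90 = ranked[int(0.9 * (len(ranked) - 1))]
        skew_ratio = p90 / max(p10, 1)
    else:
        skew_ratio = float("inf")

    return {
        "designed": len(designed),
        "detected": len(detected),
        "representation": representation,
        "total_reads": total_reads,
        "skew_ratio_90_10": skew_ratio,
    }

# Example: a toy library of five designed edits sampled with uneven coverage.
reads = Counter({"edit_A": 120, "edit_B": 95, "edit_C": 4, "edit_D": 0})
print(library_metrics(reads, ["edit_A", "edit_B", "edit_C", "edit_D", "edit_E"]))
```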

2020 ◽  
Author(s):  
James Joseph Lalonde ◽  
Richard Fox ◽  
Michael Clay ◽  
Nandini Krishnamurthy ◽  
Eric Abbate

Author(s):  
Tetiana Sovhyra

The article offers a comprehensive analysis of projects that study the interaction between AI technologies and culture. The author examines the specificity and uniqueness of artworks created with AI technologies, drawing on projects from “ThoughtWorks Arts Global Research”, the “Innovation Laboratory of New Technologies”, the “Isolation Foundation” and the “IZONE Creative Association”. The article analyzes the principles of material selection, algorithmic data analysis, the interdependence between digital data derived from the user's brain impulses and audiovisual content, and the possibility of processing data instantly while an artistic product is being created. The author explores the principles of tracking brain function and decoding human genetic data that are used to create art projects. The article assesses the potential of AI and explains the conditions necessary for implementing AI technology in culture. The study reveals that algorithmic analysis can transform digital data into a system of expressive signs of the visual and sound arts and broadcast the resulting audiovisual content. The author finds that these technologies make it possible to create interactive art forms (interactive film, installations, immersive presentations, etc.).


2018 ◽  
Vol 112 (1) ◽  
pp. 19
Author(s):  
Anja DOMADENIK

Autism spectrum disorders (ASD) are a group of highly heterogeneous neurological disorders believed to have a strong genetic component. Because functional genomics approaches can be applied only to a limited extent in human medicine, creating adequate animal models for the study of complex human diseases shows great potential. Several established mouse models of autism offer insight into single phenotypic traits, although the causes of its complex phenotype are not yet fully understood. New technologies such as CRISPR/Cas9 offer great capability for targeted genome engineering and the establishment of new animal models. This article provides an up-to-date overview of current knowledge in autism genomics and describes the potential of CRISPR/Cas9 technology for establishing new mouse models, presenting sgRNA design as one of the initial steps in planning a CRISPR/Cas9 single knock-out experiment. In addition, it offers an overview of current approaches to behavioural studies, explaining how relevant animal models could be developed.
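
As a minimal sketch of the sgRNA-design step highlighted above (an illustration under stated assumptions, not the article's workflow), the snippet below scans the forward strand of a DNA sequence for SpCas9 candidate sites, i.e. 20-nt protospacers followed by an NGG PAM, and applies a simple GC-content filter; real design tools also scan the reverse strand and score off-target risk.

```python
import re

def find_sgrna_candidates(seq, gc_min=0.40, gc_max=0.70):
    """Scan the forward strand for SpCas9 candidate sites: 20-nt protospacer + NGG PAM.

    Returns (start, protospacer, PAM, GC fraction) tuples that pass a simple GC filter.
    """
    seq = seq.upper()
    candidates = []
    # Lookahead so that overlapping candidate sites are all reported.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        protospacer, pam = m.group(1), m.group(2)
        gc = (protospacer.count("G") + protospacer.count("C")) / 20
        if gc_min <= gc <= gc_max:
            candidates.append((m.start(), protospacer, pam, round(gc, 2)))
    return candidates

# Toy example; real use would start from the exon sequence of the target gene.
example = "ATGCGTACGTTAGCCGTACGATCGGATCCGTTGACCTGAGGCTAGCTAGGCTAACGGTACG"
for site in find_sgrna_candidates(example):
    print(site)
```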


1969 ◽  
Vol 15 (2) ◽  
Author(s):  
Rupa S Iyer ◽  
William E Fitzgibbon

The field of biotechnology has become more quantitative and interdisciplinary as research continues to grow at a tremendous rate, with broader and more complex applications in medicine, agriculture, the environment and nanobiotechnology. Research in recombinant DNA technology has profoundly transformed the way biologists design, perform and analyze experiments. As biological concepts and models become more quantitative, biological research will depend increasingly on concepts and methods drawn from other scientific disciplines. Therefore, in order to prepare undergraduate life science students to be future research scientists, we need to transform undergraduate education. This will require life science majors to develop and reinforce connections between biology and other scientific disciplines so that interdisciplinary thinking and work become second nature. With the integration of new technologies into biological research, biology will continue to become more interdisciplinary, presenting a challenge for the institutions of higher education that train future biologists. This paper describes the development of a new undergraduate, interdisciplinary, research-based biotechnology degree programme offered by the University of Houston College of Technology that addresses issues and challenges in biotechnology education.


2021 ◽  
Vol 14 (8) ◽  
pp. 765
Author(s):  
Marcin Janowski ◽  
Małgorzata Milewska ◽  
Peyman Zare ◽  
Aleksandra Pękowska

Neurological disorders (NDs) comprise a heterogeneous group of conditions that affect the function of the nervous system. Often incurable, NDs have profound and detrimental consequences for the lives of affected individuals. NDs have complex etiologies but commonly feature altered gene expression and dysfunction of essential chromatin-modifying factors. Hence, compounds that target DNA and histone modification pathways, the so-called epidrugs, constitute promising tools to treat NDs. Yet targeting the entire epigenome might prove insufficient to modify the expression of a chosen gene, or even unnecessary and detrimental to patients' health. New technologies hold promise to expand the clinical toolkit in the fight against NDs. (Epi)genome engineering using designer nucleases, including CRISPR-Cas9 and TALENs, can potentially help restore correct gene expression patterns by targeting a defined gene or pathway, both genetically and epigenetically, with minimal off-target activity. Here, we review the implications of the epigenetic machinery in NDs. We outline syndromes caused by mutations in chromatin-modifying enzymes and discuss the functional consequences of mutations in regulatory DNA in NDs. We review the approaches that allow modifying the (epi)genome, including tools based on TALENs and CRISPR-Cas9 technologies, and we highlight how these new strategies could potentially change clinical practice in the treatment of NDs.


Author(s):  
Adam Treister

Flow cytometry is a result of the computer revolution. Biologists used fluorescent dyes in microscopy and medicine almost a hundred years before the first flow cytometer. Only after electronics became sophisticated enough to control individual cells and computers became fast enough to analyze the data coming out of the instrument, and to make a decision in time to deflect the stream, did cell sorting become viable. Since the 1970s, the capabilities of computers have grown exponentially. According to the famed Moore's Law, the number of transistors on a chip doubles roughly every 18 months. This rule has held for three decades so far, and new technologies continue to appear to keep that growth on track. The clock speed of chips is now measured in gigahertz (billions of cycles per second), and hard drives are now available with capacities measured in terabytes. Having computers so powerful, cheap, and ubiquitous changes the nature of scientific exploration. We are in the early steps of a long march of biotechnology breakthroughs spawned from this excess of compute power. From genomics to proteomics to high-throughput flow cytometry, the trend in biological research is toward mass-produced, high-volume experiments. Automation is the key to scaling their size and scope and to lowering their cost per test. Each step that was previously done by human hands is being delegated to a computer or a robot so that the implementation is more precise and scales efficiently. From making sort decisions in milliseconds to creating data archives that may last for centuries, computers control the information involved with cytometry, and software controls the computers. As the technology matures and the size and number of experiments increase, the emphasis of software development switches from instrument control to analysis and management. The challenge for computers is no longer running the cytometer. The more modern challenge for informatics is to analyze, aggregate, maintain, access, and exchange the huge volume of flow cytometry data. Clinical and other regulated use of cytometry necessitates more rigorous data administration techniques. These techniques introduce issues of security, integrity, and privacy into the processing of data.
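
A back-of-the-envelope check of the doubling claim above (assuming an 18-month doubling period, as stated): three decades of doubling implies roughly a million-fold increase in transistor count.

```python
def moores_law_growth(years, doubling_months=18):
    """Fold increase in transistor count after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

print(f"30 years -> ~{moores_law_growth(30):,.0f}x")  # 2**20, roughly a million-fold
```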


2018 ◽  
Vol 23 (2) ◽  
pp. 116-122 ◽  
Author(s):  
Jonathan Karnon ◽  
Laura Edney ◽  
Hossein Afzali

Health technology assessment provides a common framework for evaluating the costs and benefits of new health technologies to inform decisions on the public funding of new pharmaceuticals and other health technologies. In Australia and England, empirical analyses of the opportunity costs of government spending on new health technologies suggest that, for a non-trivial proportion of funded health technologies, more quality-adjusted life years are being forgone than gained. This essay considers the relevance of available empirical estimates of opportunity costs and explores the relationship between the public funding of health technologies and broader political and economic factors. We conclude that the benefits of a general reduction in the prices paid by governments for new technologies outweigh the costs, but evidence of informed public acceptance of reduced access to new health technologies may be required to shift the current approach to assessing the value of new health technologies.
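
A minimal sketch of the opportunity-cost logic the essay references (illustrative parameter values, not figures from the paper): the QALYs forgone by funding a technology equal its net budget impact divided by the marginal cost at which the health system could otherwise produce a QALY.

```python
def net_qaly_impact(qalys_gained, incremental_cost, opportunity_cost_per_qaly):
    """QALYs gained by funding a technology minus QALYs forgone elsewhere in the system.

    opportunity_cost_per_qaly: marginal cost at which the system produces a QALY
        (an assumed, illustrative value below, not an empirical estimate from the essay).
    """
    qalys_forgone = incremental_cost / opportunity_cost_per_qaly
    return qalys_gained - qalys_forgone

# A technology costing 30,000 per QALY gained, in a system that can generate a QALY
# elsewhere for 15,000, forgoes more health than it adds.
print(net_qaly_impact(qalys_gained=1.0, incremental_cost=30_000,
                      opportunity_cost_per_qaly=15_000))  # -> -1.0
```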


2009 ◽  
Vol 12 (04) ◽  
pp. 610-629 ◽  
Author(s):  
P. Craig Smalley ◽  
A. William Ross ◽  
Chris Brown ◽  
Timothy P. Moulds ◽  
Michael J. Smith

The Reservoir Technical Limits (RTL™) approach described herein has proved highly effective at identifying the activities and technologies required to push oilfield recovery factors toward their maximum potential. It combines classical reservoir engineering approaches with knowledge of existing and novel recovery-enhancing technologies to create a common framework for identifying specific actions to increase recovery factor. RTL is implemented in a structured workshop supported by a software toolkit. The RTL workshop involves the cross-disciplinary field team (in-depth field knowledge), external technical experts (challenge, cross-fertilization), and trained facilitation. The software toolkit encourages innovation in a structured and reproducible manner and documents the outcomes in a consistent format. The RTL conceptual framework represents recovery factor as the product of four efficiency factors: pore-scale displacement (microscopic efficiency of the recovery process); drainage (connectedness to a producer); sweep (movement of oil to producers within the drained volume); and cut-offs (losses related to end of field life/access). RTL encourages identification of new "opportunities," specific activities or projects that, if implemented, increase one or more efficiency factors and thus increase recovery relative to the current field Depletion Plan. New ideas are stimulated by comparing current efficiency values with the effects of successful prescreened activities from analogue fields. The identified opportunities are validated by benchmarking: internally, by comparing recovery factors derived from summing the opportunity volumes with recovery factors derived from the expected efficiency-factor increments; and externally, by comparing with analogue fields. The result is a prioritized list of validated opportunities and an understanding of how each activity affects the reservoir to increase recovery. The opportunities (and any required new technologies) are valued in terms of the resultant incremental barrels. The RTL approach is a significant innovation because it provides a systematic framework to: identify new recovery-increasing activities across a portfolio of fields; engender ownership of these activities by the individual field teams; and identify the technology requirements to progress the opportunities. Having now been implemented in more than 200 fields, this systematic approach has enabled opportunity descriptions, values and technology requirements to be compared consistently across all fields, thereby improving project prioritization and focusing corporate technology development and deployment on the highest-impact areas.
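
The multiplicative structure lends itself to a simple worked example (an illustration of the stated product form, not the RTL toolkit itself): recovery factor is the product of the four efficiency factors, so an opportunity can be valued by the incremental recovery it implies; the efficiency values and in-place volume below are assumptions.

```python
def recovery_factor(displacement, drainage, sweep, cutoffs):
    """RTL-style recovery factor as the product of four efficiency factors (each in 0..1)."""
    return displacement * drainage * sweep * cutoffs

# Current depletion plan vs. an opportunity that improves sweep efficiency.
current = recovery_factor(displacement=0.70, drainage=0.80, sweep=0.60, cutoffs=0.90)
with_opportunity = recovery_factor(displacement=0.70, drainage=0.80, sweep=0.70, cutoffs=0.90)

stoiip = 500e6  # barrels of oil initially in place (illustrative)
incremental_barrels = (with_opportunity - current) * stoiip
print(f"RF: {current:.2%} -> {with_opportunity:.2%}, "
      f"incremental ~{incremental_barrels / 1e6:.0f} MMbbl")
```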


