Toxicity testing: Non-animal approaches and safety science

2014, Vol 36 (3), pp. 19-25
Author(s): Fiona Reynolds, Carl Westmoreland, Julia Fentem

New informatics capabilities and computational and mathematical modelling techniques, used in combination with highly sensitive molecular biology and mechanistic chemistry approaches, are transforming the way in which we assess the safety of chemicals and products. In recent years, good progress has been made in replacing some of the animal tests required for regulatory purposes with methods using cells and tissues in vitro. Nevertheless, major scientific challenges remain in developing relevant non-animal models able to predict the effects of chemicals that are absorbed systemically. The greatest breakthroughs in non-animal approaches for chemical safety assessment will most likely result from continued multi-disciplinary research investment in predictive (integrative and systems) biology. Some of our current research in this area is described in the present article.

2010, Vol 29 (1), pp. 11-14
Author(s): Robert F Phalen

Toxicity Testing in the 21st Century: A Vision and a Strategy (NRC, 2007) presents a bold plan for chemical toxicity testing that replaces whole-animal tests with cell-culture, genetic and other in-vitro techniques, computational methods, and human monitoring. Although the proposed vision is eloquently described, and recent advances in in-vitro and in-silico methods are impressive, it is difficult to believe that replacing whole-animal testing is either practical or wise. It is not clear that the toxicity-related events that occur in whole animals can be adequately replicated using the proposed methods. Protecting public health is a serious endeavor that should not be compromised by denying access to animal testing. Toxicologists and regulators are encouraged to read the report, carefully consider its implications, and share their thoughts. The vision is far too important to ignore.


Metabolomics, 2022, Vol 18 (1)
Author(s): Julia M. Malinowska, Taina Palosaari, Jukka Sund, Donatella Carpi, Mounir Bouhifd, ...

Introduction: High-throughput screening (HTS) is emerging as an approach to support decision-making in chemical safety assessments. In parallel, in vitro metabolomics is a promising approach that can help accelerate the transition from animal models to high-throughput cell-based models in toxicity testing.
Objective: In this study we establish and evaluate a high-throughput metabolomics workflow that is compatible with a 96-well HTS platform employing 50,000 HepaRG hepatocytes per well.
Methods: Low-biomass cell samples were extracted for metabolomics analyses using a newly established semi-automated protocol, and the intracellular metabolites were analysed using a high-resolution spectral-stitching nanoelectrospray direct-infusion mass spectrometry (nESI-DIMS) method that was modified for low sample biomass.
Results: The method was assessed with respect to the sensitivity and repeatability of the entire workflow, from cell culturing and sampling to measurement of the metabolic phenotype, demonstrating sufficient sensitivity (> 3000 features in hepatocyte extracts) and intra- and inter-plate repeatability for polar nESI-DIMS assays (median relative standard deviation < 30%). The assays were employed for a proof-of-principle toxicological study with a model toxicant, cadmium chloride, revealing changes in the metabolome across five sampling times in the 48-h exposure period. To allow the option of lipidomics analyses, the solvent system was extended by establishing separate extraction methods for polar metabolites and lipids.
Conclusions: The experimental, analytical and informatics workflows reported here met pre-defined criteria in terms of sensitivity, repeatability and the ability to detect metabolome changes induced by a toxicant, and are ready for application in metabolomics-driven toxicity testing to complement HTS assays.
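
To make the acceptance criteria concrete, here is a minimal Python sketch, not the authors' code, that checks a plate of feature intensities against the two thresholds quoted above: more than 3000 detected features and a median relative standard deviation (RSD) below 30%. The array layout and function names are assumptions.

```python
# Hedged sketch of the repeatability check described in the abstract:
# compute the RSD of each spectral feature across replicate wells and
# verify the plate-level acceptance criteria.
import numpy as np

def median_rsd(intensities: np.ndarray) -> float:
    """intensities: 2-D array, rows = features, columns = replicate wells."""
    means = intensities.mean(axis=1)
    stds = intensities.std(axis=1, ddof=1)
    return float(np.median(100.0 * stds / means))

def passes_qc(intensities: np.ndarray,
              min_features: int = 3000,
              max_median_rsd: float = 30.0) -> bool:
    """Apply the two criteria reported in the study: > 3000 features
    detected and median RSD < 30%."""
    return (intensities.shape[0] > min_features
            and median_rsd(intensities) < max_median_rsd)

# Simulated example: 3500 features measured in 6 replicate wells.
rng = np.random.default_rng(0)
demo = rng.lognormal(mean=10.0, sigma=0.2, size=(3500, 6))
print(passes_qc(demo))  # True for this synthetic plate
```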


2019, Vol 20 (7), pp. 1712
Author(s): Arianna Giusti, Xuan-Bac Nguyen, Stanislav Kislyuk, Mélanie Mignot, Cecilia Ranieri, ...

Zebrafish-based platforms have recently emerged as a useful tool for toxicity testing, as they combine the advantages of in vitro and in vivo methodologies. Nevertheless, the capacity of zebrafish eleuthero embryos to metabolically convert xenobiotics is reportedly low. To circumvent this concern, a comprehensive methodology was developed wherein test compounds (i.e., parathion, malathion and chloramphenicol) were first exposed in vitro to rat liver microsomes (RLM) for 1 h at 37 °C. After adding methanol, the mixture was ultrasonicated, placed for 2 h at −20 °C, centrifuged, and the supernatant evaporated. The pellet was resuspended in water for quantification of the metabolic conversion and detection of any metabolites present, using ultra-high-performance liquid chromatography with ultraviolet and mass spectrometric detection (UHPLC-UV-MS). Next, three days post-fertilization (dpf) zebrafish eleuthero embryos were exposed to the metabolic mix diluted in Danieau's medium for 48 h at 28 °C, followed by a stereomicroscopic examination of any adverse effects induced. The novelty of our method lies in the possibility of quantifying the rate of the in vitro metabolism of the parent compound and of co-incubating three dpf larvae with the diluted metabolic mix for 48 h without inducing major toxic effects. The results for parathion show improved prediction of the compound's toxic potential.
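
As a worked illustration of the quantification step, the hedged Python sketch below computes the fraction of parent compound consumed during the 1 h RLM incubation from UHPLC-UV peak areas measured before and after incubation. The function name and peak-area values are hypothetical, not taken from the study.

```python
# Sketch: percent metabolic conversion of a parent compound estimated from
# the drop in its UHPLC-UV peak area over the in vitro RLM incubation.

def percent_conversion(area_t0: float, area_after: float) -> float:
    """Fraction of parent compound metabolised, as % of the initial area."""
    if area_t0 <= 0:
        raise ValueError("initial peak area must be positive")
    return 100.0 * (area_t0 - area_after) / area_t0

# Hypothetical peak areas for a parent compound before and after incubation:
print(f"{percent_conversion(1.84e6, 0.52e6):.1f}% converted")  # 71.7%
```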


2006, Vol 36 (1), pp. 37-68
Author(s): John E. Doe, Alan R. Boobis, Ann Blacker, Vicki Dellarco, Nancy G. Doerrer, ...

1991, Vol 19 (2), pp. 222-225
Author(s): Per Kjellstrand, Ulf Boberg

Tests performed over a ten-year period on 653 polymers intended for use in extracorporeal renal replacement therapy were evaluated. The test battery used included animal tests, in vitro tests and chemical tests. Some tests were found to have a very low sensitivity: acute systemic toxicity testing in mice, with sodium chloride, ethanol or paraffin oil as extractants, was performed on a total of 806 occasions, yet only two of these resulted in a "fail" decision. The final outcome of the tests for the majority of materials could be predicted from the results of the UV absorption, chloride, inhibition of cell growth, and tin tests. Of the materials that passed these four tests, less than 2% were not approved on the basis of the whole test battery. The experiments show that only a limited number of tests need to be performed when testing polymers intended for use in extracorporeal renal replacement therapy.
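
The screening logic these findings suggest can be sketched as a simple decision rule: run the four most predictive tests first and escalate to the full battery only when one fails. Below is a minimal Python sketch; the boolean inputs and function name are illustrative, not part of the original study.

```python
# Sketch of the tiered screening rule implied by the study: approve on the
# four predictive tests alone (with the < 2% residual failure rate the
# authors report), otherwise fall back to the complete test battery.

def screening_decision(uv_absorption_ok: bool, chloride_ok: bool,
                       cell_growth_ok: bool, tin_ok: bool) -> str:
    if all((uv_absorption_ok, chloride_ok, cell_growth_ok, tin_ok)):
        return "approve (residual failure rate < 2%)"
    return "run full battery of animal, in vitro and chemical tests"

print(screening_decision(True, True, True, True))   # approve
print(screening_decision(True, False, True, True))  # run full battery
```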


Nanomaterials, 2020, Vol 10 (4), pp. 708
Author(s): Angela Serra, Michele Fratello, Luca Cattelani, Irene Liampa, Georgia Melagraki, ...

Transcriptomics data are relevant to addressing a number of challenges in toxicogenomics (TGx). After careful planning of exposure conditions and data preprocessing, TGx data can be used in predictive toxicology, where more advanced modelling techniques are applied. The large volume of molecular profiles produced by omics-based technologies allows the development and application of artificial intelligence (AI) methods in TGx. Indeed, publicly available omics datasets are constantly increasing, together with a plethora of different methods made available to facilitate their analysis and interpretation and the generation of accurate and stable predictive models. In this review, we present the state of the art in data modelling applied to transcriptomics data in TGx. We show how benchmark dose (BMD) analysis can be applied to TGx data. We review read-across and adverse outcome pathway (AOP) modelling methodologies. We discuss how network-based approaches can be successfully employed to clarify the mechanism of action (MOA) or specific biomarkers of exposure. We also describe the main AI methodologies applied to TGx data to create predictive classification and regression models, and we address current challenges. Finally, we present a short description of deep learning (DL) and data integration methodologies applied in these contexts. Modelling of TGx data represents a valuable tool for more accurate chemical safety assessment. This review is the third part of a three-article series on transcriptomics in toxicogenomics.
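
As an illustration of the BMD analysis mentioned above, here is a simplified Python sketch, a toy example under stated assumptions rather than a full TGx pipeline such as BMDExpress: it fits a Hill model to one gene's expression-versus-dose data and solves for the dose producing a 10% change from the fitted control level. The dose series and expression values are invented.

```python
# Toy benchmark dose (BMD) estimation for a single gene: fit a Hill
# dose-response model, then invert it at a 10% benchmark response.
import numpy as np
from scipy.optimize import curve_fit, brentq

def hill(d, base, vmax, k, n):
    return base + vmax * d**n / (k**n + d**n)

# Hypothetical expression values for one gene across a dose series.
doses = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0])
expr = np.array([1.00, 1.02, 1.10, 1.35, 1.70, 1.85])

params, _ = curve_fit(hill, doses, expr, p0=[1.0, 1.0, 1.0, 1.0],
                      bounds=(0.0, np.inf))
base = params[0]
target = 1.10 * base  # 10% relative change from the fitted control level

# BMD = dose at which the fitted curve reaches the benchmark response.
bmd = brentq(lambda d: hill(d, *params) - target, 1e-6, doses.max())
print(f"BMD (10% change): {bmd:.2f}")
```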


2002, Vol 30 (6), pp. 571-578
Author(s): Ian Kimber

Many chemicals are known to be, or have been implicated as, contact allergens, and allergic contact dermatitis is an important occupational and environmental health issue. It is the responsibility of toxicologists to identify those chemicals that have the potential to induce skin sensitisation, and to assess the conditions under which there will exist a risk to human health. This article describes progress that has been made in the development of new approaches to the toxicological evaluation of skin sensitisation, and the benefits to animal welfare that such developments have already produced, and are likely to produce in the future. In this context, the local lymph node assay is described with regard to hazard identification and risk assessment, and possible strategies for the development of in vitro approaches to safety assessment are discussed.
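
One quantitative output of the local lymph node assay used in risk assessment is the EC3 value: the test concentration estimated to give a three-fold stimulation index (SI) over vehicle control, conventionally obtained by linear interpolation between the two data points bracketing SI = 3. The abstract does not detail this calculation, so the Python sketch below is a hedged illustration with invented dose-response numbers.

```python
# Sketch of the standard EC3 linear interpolation for the LLNA:
# (conc_low, si_low) lies just below SI = 3, (conc_high, si_high) just above.

def ec3(conc_low: float, si_low: float,
        conc_high: float, si_high: float) -> float:
    if not si_low < 3.0 < si_high:
        raise ValueError("data points must bracket SI = 3")
    return conc_low + (3.0 - si_low) / (si_high - si_low) * (conc_high - conc_low)

# Hypothetical LLNA dose response: SI 1.9 at 2.5% and SI 4.6 at 5.0%.
print(f"EC3 = {ec3(2.5, 1.9, 5.0, 4.6):.2f}%")  # ~3.52%
```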


1995, Vol 23 (4), pp. 474-479
Author(s): Michael Balls

The use of in vitro techniques in fundamental pharmacotoxicological research is widespread, but relatively little progress has been made in applying the knowledge and experience gained to regulatory toxicity testing. This is largely because specific tests of various kinds are required by national and international laws and regulations, before chemicals and products of many types can be manufactured, transported or marketed. These requirements have led to the publication of standardised test guidelines by various regulatory authorities. Although there is an increasing willingness to accept, in principle, the incorporation of non-animal tests and testing strategies into regulatory practice, their relevance and reliability must first be established in recognised validation studies, the outcomes of which must be assessed by independent groups of experts. The formal validation of alternative methods is not progressing satisfactorily. The reasons for this are discussed and some suggestions for improvements are made.

