The Development of New Toxicity Testing and Approval Processes for Oil Spill Treatment Products in the UK

Author(s):  
Helen E Walton
Joshua J Davison
Joanna Uzyczak
Christopher Martin
Paula Milliken
...  

ABSTRACT Current experimental protocols for the toxicity assessment of oil spill treatment products in the UK have been in place since the 1970s. To address health and safety, cost and scientific robustness issues, the UK approach for dispersant testing and approval has been reviewed and updated for implementation during 2020. To provide more robust scientific advice for the risk assessments that enable effective decision making on the use of oil remediation products in the event of a spill, there has been a focus on methods that already have internationally accepted protocols. Standardisation of dispersant testing will promote more effective cross-institute comparisons of toxicity data and will enable further harmonisation of approaches in the future. It is preferable that environmentally relevant test species are used but, as the scientific literature provides little conclusive evidence of a taxa-specific trend in sensitivity, species selection based on sensitivity alone was not justified. Eight dispersants, commonly stockpiled in the UK, were tested independently and in combination with a representative crude oil (Kuwait). Testing of dispersants in combination with oil has historically produced more variable results, so this study considered the benefits of this approach versus product-only testing. Core test species included the harpacticoid copepod Tisbe battagliai and the alga Skeletonema sp., as both have cost-effective, internationally standardised methods, are environmentally representative, and are easily cultured under laboratory conditions with no seasonality. Other candidate test species, such as oyster embryos, had limitations in applicability due to seasonal issues. Fish testing was not considered, as the absence of taxa-specific toxicity provided no ethical justification for vertebrate testing. Results showed that, if oil is excluded from the assessment, Skeletonema sp. and Tisbe battagliai can produce reliable, reproducible and interpretable results. When the T. battagliai test was run independently on multiple occasions without oil, dispersants 1, 2 and 3 produced EC50 values that were not statistically different. This suggests that product-only testing is suitable for ranking products based on toxicological hazard. The redevelopment of the UK guideline to use standardised testing and the selection of appropriate, environmentally relevant test organisms will increase the quality and reliability of the data used to underpin the UK oil spill treatment testing and approval scheme. The adoption of this approach will enable an approved list of products for use in UK waters to be maintained. However, the decision on dispersant use in any given scenario will need to be underpinned by expert advice applying a risk assessment approach, taking into account a range of incident-specific physical and environmental sensitivity information.
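The product-only EC50 comparisons described above can be illustrated with a minimal sketch: given the fraction of organisms affected at a series of test concentrations, an EC50 can be estimated by log-linear interpolation between the two concentrations that bracket the 50% effect level. The concentrations, effect fractions and interpolation method below are hypothetical illustrations only, not the standardised protocols used in the UK scheme.

```python
import math

def ec50_interpolated(concs, effects):
    """Estimate EC50 by log-linear interpolation between the two test
    concentrations whose observed effects bracket 50%.
    concs: ascending concentrations (mg/L); effects: fraction affected (0-1)."""
    for (c_lo, e_lo), (c_hi, e_hi) in zip(zip(concs, effects),
                                          zip(concs[1:], effects[1:])):
        if e_lo <= 0.5 <= e_hi:
            # interpolate linearly in effect, on a log-concentration scale
            frac = (0.5 - e_lo) / (e_hi - e_lo)
            log_ec50 = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_ec50
    raise ValueError("50% effect not bracketed by the tested concentrations")

# Hypothetical acute immobilisation data for one dispersant (illustrative only)
concs = [1.0, 3.2, 10.0, 32.0, 100.0]     # mg/L
effects = [0.02, 0.10, 0.35, 0.80, 0.98]  # fraction immobilised
print(round(ec50_interpolated(concs, effects), 1))
```

Standardised guidelines fit a full concentration–response model with confidence intervals; simple interpolation is shown here only to make the EC50 concept concrete.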

2008
Vol 2008 (1)
pp. 829-833
Author(s):  
Mark F. Kirby
Bryony Devoy
Robin J. Law

ABSTRACT Oil spill treatment products are a key option in UK response scenarios. It is recognised that their appropriate usage can significantly reduce net environmental impact. However, products are strictly regulated in the UK and all must undergo an approval process before they can be used in UK marine waters. This requires the product to pass both efficacy and toxicity assessments. The toxicity assessment comprises ‘Sea’ and ‘Rocky Shore’ tests and is primarily based on a toxicity comparison between mechanically dispersed oil (untreated) and oil under the same conditions but treated with the product. The premise is that the addition of product should not significantly increase the toxicity of the oil alone. The current toxicity and efficacy testing protocols have been in place for 30 years, and the last review of the scheme took place in 1993. The UK Department for Environment, Food and Rural Affairs (Defra) is the regulatory body and launched another review of the scheme during 2007 to establish whether the process continues to provide the most appropriate means of ensuring that safe and effective products are available to UK responders. The review will take the form of a wide-ranging public consultation. Particular issues being considered include: the continued requirement for products to pass both toxicity assessments; the need to test dispersants separately as type 2 (water-diluted) or type 3 (neat) products; the need to approve products for specific use against different oil types, especially heavier fuel and weathered oils; the need to take into account the fact that many modern dispersants are effective at lower product:oil ratios than those used in the current test process; the performance of products under different conditions of salinity and/or temperature; and the need for specific test development for other product types (e.g. surface cleaners) or to make the process more efficient (e.g. a combined efficacy and toxicity test). The UK Government wishes to ensure that responders have the option of selecting the best product for tackling each oil spill scenario, provided that environmental protection is not compromised. It is intended that the outcome of the review will facilitate this even more effectively than at present. This paper describes the background to the issues being considered in the review, explains why they may be significant, and gives a preliminary overview of initial conclusions.


2013 ◽  
Author(s):  
David Hollis
Stavroula Leka
Aditya Jain
Nicholas J. A. Andreou
Gerard Zwetsloot

TAPPI Journal
2018
Vol 17 (09)
pp. 519-532
Author(s):  
Mark Crisp
Richard Riehle

Polyaminopolyamide-epichlorohydrin (PAE) resins are the predominant commercial products used to manufacture wet-strengthened paper products for grades requiring wet-strength permanence. Since their development in the late 1950s, the first generation (G1) resins have proven to be one of the most cost-effective technologies available to provide wet strength to paper. Throughout the past three decades, regulatory directives and sustainability initiatives from various organizations have driven the development of cleaner and safer PAE resins and paper products. Early efforts in this area focused on improving worker safety and reducing the impact of PAE resins on the environment. These efforts led to the development of resins containing significantly reduced levels of 1,3-dichloro-2-propanol (1,3-DCP) and 3-monochloropropane-1,2-diol (3-MCPD), potentially carcinogenic byproducts formed during the manufacturing process of PAE resins. As the levels of these byproducts decreased, the environmental, health, and safety (EH&S) profile of PAE resins and paper products improved. Recent initiatives from major retailers are focusing on product ingredient transparency and quality, thus encouraging the development of safer product formulations while maintaining performance. PAE resin research over the past 20 years has been directed toward regulatory requirements to improve consumer safety and minimize exposure to potentially carcinogenic materials found in various paper products. Among the best-known regulatory requirements are the recommendations of the German Federal Institute for Risk Assessment (BfR), which define the levels of 1,3-DCP and 3-MCPD that can be extracted by water from various food contact grades of paper. These criteria led to the development of third generation (G3) products that contain very low levels of 1,3-DCP (typically <10 parts per million in the as-received/delivered resin).
This paper outlines the PAE resin chemical contributors to adsorbable organic halogens and 3-MCPD in paper and provides recommendations for the use of each PAE resin product generation (G1, G1.5, G2, G2.5, and G3).


Author(s):  
Tochukwu Moses
David Heesom
David Oloke
Martin Crouch

The UK Construction Industry, through its Government Construction Strategy, has recently been mandated to implement Level 2 Building Information Modelling (BIM) on public sector projects. This move, along with other initiatives, is key to driving a requirement for a 25% cost reduction (establishing the most cost-effective means). Other key deliverables within the strategy include a reduction in overall project time, early contractor involvement, improved sustainability and enhanced product quality. Collaboration and integrated project delivery are central to the Level 2 implementation strategy, yet the key protocols or standards relating to cost within BIM processes are not well defined. As offsite construction becomes more prolific within the UK construction sector, this construction approach, coupled with BIM (particularly the automated 5D quantification process) and early contractor involvement, provides significant opportunities for the sector to meet government targets. Early contractor involvement is supported by both industry and successive governments as a credible means to avoid and manage project risks, encourage innovation and added value, make cost and project time predictable, and improve outcomes. The contractor is seen as an expert in construction, and it could be counterintuitive to exclude such valuable expertise from the pre-construction phase, especially given the BIM intent of ‘build it twice’, once virtually and once physically. In particular, when offsite construction is used, the contractor's construction expertise should be leveraged for the virtual build in BIM-designed projects to ensure a fully streamlined process. Building in a layer of automated costing through 5D BIM will bring about a more robust method of quantification and can help to deliver the 25% reduction in the overall cost of a project.
Using a literature review and a case study, this paper will look into the benefits of Early Contractor Involvement (ECI) and the impact of 5D BIM on the offsite construction process.
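The automated 5D quantification idea described above can be sketched as a toy quantity takeoff priced against a cost library: model elements carry quantities, and costing is a lookup-and-multiply over a rate table. The work-item codes, quantities and rates below are invented for illustration; real 5D BIM tools extract quantities from model geometry and link them to standardised measurement rules.

```python
from dataclasses import dataclass

@dataclass
class Element:
    item: str       # work-item code (hypothetical)
    quantity: float
    unit: str

# Illustrative unit rates (GBP per unit) keyed by work-item code
RATES = {"CONC-SLAB": 145.0, "STEEL-BEAM": 2100.0, "BLOCK-WALL": 68.0}

def cost_plan(elements):
    """Aggregate quantity * rate per work item across all model elements."""
    totals = {}
    for e in elements:
        totals[e.item] = totals.get(e.item, 0.0) + e.quantity * RATES[e.item]
    return totals

model = [Element("CONC-SLAB", 120.0, "m2"),
         Element("BLOCK-WALL", 300.0, "m2"),
         Element("CONC-SLAB", 80.0, "m2")]
plan = cost_plan(model)
print(round(sum(plan.values()), 2))  # total cost-plan value (GBP)
```

Because the costing layer is just a deterministic function of the model, re-running it after each design change is what makes the 5D approach suited to the early, iterative contractor input the paper discusses.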


The Lancet
2021
Vol 397 (10271)
pp. 274
Author(s):  
Raymond M Agius
Denise Kendrick
Herb F Sewell
Marcia Stewart
John FR Robertson

2021
Vol 14 (1)
Author(s):  
Emily Joanne Nixon
Ellen Brooks-Pollock
Richard Wall

Abstract Background Ovine psoroptic mange (sheep scab) is a highly pathogenic contagious infection caused by the mite Psoroptes ovis. Following 21 years during which scab had been eradicated from the UK, it was inadvertently reintroduced in 1972 and, despite the implementation of a range of control methods, its prevalence increased steadily thereafter. Recent reports of resistance to macrocyclic lactone treatments may further exacerbate control problems. A better understanding of the factors that facilitate its transmission is required to allow improved management of this disease. Transmission of infection occurs within and between contiguous sheep farms via infected sheep-to-sheep or sheep–environment contact and through long-distance movements of infected sheep, such as through markets. Methods A stochastic metapopulation model was used to investigate the impact of different transmission routes on the spatial pattern of outbreaks. A range of model scenarios was considered following the initial infection of a cluster of highly connected contiguous farms. Results Scab spreads between clusters of neighbouring contiguous farms after introduction, but when long-distance movements are excluded, infection self-limits spatially at boundaries where farm connectivity is low. Inclusion of long-distance movements is required to generate the national patterns of disease spread observed. Conclusions Preventing the movement of scab-infested sheep through sales and markets is essential for any national management programme. If effective movement control can be implemented, regional control in geographic areas where farm densities are high would allow more focussed, cost-effective scab management.
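The kind of stochastic metapopulation model described in the Methods can be sketched minimally: farms are nodes, contiguous-contact edges transmit infection locally, and an optional long-distance movement term seeds arbitrary farms. The network, transmission probability and movement rate below are invented for illustration and are not the parameters of the published model.

```python
import random

def simulate(n_farms, edges, seed_farm, beta=0.3, move_rate=0.0, steps=100):
    """Toy stochastic SI metapopulation model. Each step, an infected farm
    infects each susceptible contiguous neighbour with probability beta, and
    with probability move_rate seeds a uniformly random farm (representing a
    long-distance movement of infected sheep). Parameters are illustrative."""
    rng = random.Random(42)  # fixed seed for reproducibility
    neighbours = {i: [] for i in range(n_farms)}
    for a, b in edges:
        neighbours[a].append(b)
        neighbours[b].append(a)
    infected = {seed_farm}
    for _ in range(steps):
        new = set()
        for farm in infected:
            for nb in neighbours[farm]:
                if nb not in infected and rng.random() < beta:
                    new.add(nb)
            if rng.random() < move_rate:
                new.add(rng.randrange(n_farms))  # long-distance seeding
        infected |= new
    return infected

# Two clusters of contiguous farms with no contact between them
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]
local_only = simulate(6, edges, seed_farm=0, move_rate=0.0)
with_moves = simulate(6, edges, seed_farm=0, move_rate=0.2)
print(sorted(local_only))  # spread self-limits at the cluster boundary
```

With `move_rate=0.0` the infection can never leave the seeded cluster, mirroring the self-limiting behaviour at low-connectivity boundaries; a non-zero `move_rate` lets outbreaks jump between otherwise disconnected clusters, mirroring spread via markets and sales.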


Author(s):  
Anmol Arora
Andrew Wright
Mark Cheng
Zahra Khwaja
Matthew Seah

Abstract Healthcare as an industry is recognised as one of the most innovative. Despite heavy regulation, there is substantial scope for new technologies and care models to not only boost patient outcomes but to do so at reduced cost to healthcare systems and consumers. Promoting innovation within national health systems such as the National Health Service (NHS) in the United Kingdom (UK) has been set as a key target for health care professionals and policy makers. However, while the UK has a world-class biomedical research industry, several reports in the last twenty years have highlighted the difficulties faced by the NHS in encouraging and adopting innovations, with the journey from idea to implementation of health technology often taking years and being very expensive, with a high failure rate. This has led to the establishment of several innovation pathways within and around the NHS, to encourage the invention, development and implementation of cost-effective technologies that improve health care delivery. These pathways span local, regional and national health infrastructure. They operate at different stages of the innovation pipeline, with their scope and work defined by location, technology area or industry sector, based on the specific problem identified when they were set up. In this introductory review, we outline each of the major innovation pathways operating at local, regional and national levels across the NHS, including their history, governance, operating procedures and areas of expertise. The extent to which innovation pathways address current challenges faced by innovators is discussed, as well as areas for improvement and future study.

