Environmental Data in Decision Making in EPA Regional Offices

Author(s):  
Stanley L. Laskowski ◽  
Frederick W. Kutz
2012 ◽  
Author(s):  
Goffredo La Loggia ◽  
Elisa Arnone ◽  
Giuseppe Ciraolo ◽  
Antonino Maltese ◽  
Leonardo Noto ◽  
...  

2014 ◽  
Vol 668-669 ◽  
pp. 1374-1377 ◽  
Author(s):  
Wei Jun Wen

ETL refers to the process of data extraction, transformation and loading, and is a critical step in ensuring the quality, specification and standardization of marine environmental data. Marine data, because of their complexity, domain diversity and huge volume, remain decentralized, polyphyletic and heterogeneous, with differing semantics, and are therefore far from providing effective data sources for decision making. ETL enables the construction of a marine environmental data warehouse through the cleaning, transformation, integration, loading and periodic updating of basic marine data. This paper presents research on rules for the cleaning, transformation and integration of marine data, based on which an original ETL system for a marine environmental data warehouse is designed and developed. The system further guarantees data quality and correctness in future analysis and decision-making based on marine environmental data.
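The cleaning–transformation–loading flow the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's system: the field names, the missing-coordinate cleaning rule, and the temperature-unit transformation rule are all invented assumptions standing in for the marine-data rules the paper develops.

```python
# Hypothetical ETL sketch: extract from heterogeneous sources, apply
# cleaning/transformation rules, load into one warehouse structure.

def clean(record):
    """Cleaning rule (assumed): drop records missing coordinates;
    transformation rule (assumed): unify temperature to Celsius."""
    if record.get("lat") is None or record.get("lon") is None:
        return None
    cleaned = dict(record)
    if cleaned.get("temp_unit") == "F":
        cleaned["temp"] = (cleaned["temp"] - 32) * 5 / 9
        cleaned["temp_unit"] = "C"
    return cleaned

def etl(sources, warehouse):
    """Extract records from each source, transform, and load the survivors."""
    for source in sources:
        for record in source:
            cleaned = clean(record)
            if cleaned is not None:
                warehouse.append(cleaned)
    return warehouse

# Two "polyphyletic" sources with different conventions.
buoys = [{"lat": 24.5, "lon": 118.1, "temp": 77.0, "temp_unit": "F"}]
ships = [{"lat": None, "lon": 120.0, "temp": 26.0, "temp_unit": "C"}]
warehouse = etl([buoys, ships], [])
```

Periodic updating, in this sketch, would amount to re-running `etl` on new source batches against the same `warehouse`.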


2018 ◽  
Vol 64 (4) ◽  
pp. 614-626
Author(s):  
Tishyarakshit Chatterjee

With India’s well-intentioned environmental laws and legal interpretations in place, there is still a perceptible weakness in the enforcement of her environmental regulations. This is ascribed to the centralised departmental structure and process of implementation, which prioritise clearance of developmental projects over monitoring and cleaning up of already polluted environments. A lack of transparency and of participation by knowledgeable stakeholders in decision-making, notable in a democratic set-up, are further process weaknesses. Establishing an Independent Environmental Regulatory Authority has been attempted repeatedly but abandoned, mainly because its effectiveness depends on the same resource support as at present, on reliable primary field-level environmental data that are not gathered regularly now, and on sustained political support. Analysing the issues technically, this article suggests a process shift towards a locally relevant, transparent, decentralised, participative and area-science–value-based approach that can strengthen environmental regulation from below.


2019 ◽  
Vol 101 (2) ◽  
pp. 615-623 ◽  
Author(s):  
France Guertin ◽  
Thomas Polzin ◽  
Martha Rogers ◽  
Betsy Witt

2014 ◽  
Vol 685 ◽  
pp. 524-527
Author(s):  
Yan Ju Zhu

The article examines the application of big data to governmental environmental decision-making. By integrating Internet, video-compression and computer-processing technologies, we propose a model of a government environmental data platform. The platform comprises an environmental data acquisition platform, an environmental decision-making platform and an environmental management platform.
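The three-layer structure (acquisition → decision-making → management) can be sketched as a simple pipeline. This is an illustrative assumption about how the layers might hand data to one another; the sensor fields, the pollutant threshold, and the class names are invented, not the paper's design.

```python
# Hedged sketch of the three platform layers; all names and the
# pollution threshold are illustrative assumptions.

class AcquisitionPlatform:
    """Acquisition layer: in practice, networked sensors and video feeds."""
    def collect(self):
        return [{"site": "river-1", "pm25": 35},
                {"site": "river-2", "pm25": 120}]

class DecisionPlatform:
    """Decision layer: flag sites breaching an (assumed) quality limit."""
    LIMIT = 75

    def evaluate(self, readings):
        return [r["site"] for r in readings if r["pm25"] > self.LIMIT]

class ManagementPlatform:
    """Management layer: turn flagged sites into administrative actions."""
    def act(self, flagged_sites):
        return [f"inspect {site}" for site in flagged_sites]

readings = AcquisitionPlatform().collect()
flagged = DecisionPlatform().evaluate(readings)
actions = ManagementPlatform().act(flagged)
```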


2020 ◽  
pp. 251484862090972
Author(s):  
Eric Nost

Conservationists around the world advocate for “data-driven” environmental governance, expecting data infrastructures to make all relevant and actionable information readily available. But how exactly is data to be infrastructured and to what political effect? I show how putting together and maintaining environmental data for decision-making is not a straightforward technical task, but a practice shaped by and shaping politico-economic context. Drawing from the US state of Louisiana’s coastal restoration planning process, I detail two ways ecosystem modelers manage fiscal and institutional “frictions” to “infrastructuring” data as a resource for decision-making. First, these experts work with the data they have. They leverage, tweak, and maintain existing datasets and tools, spending time and money to gather additional data only to the extent it fits existing goals. The assumption is that these goals will continue to be important, but building coastal data infrastructure around current research needs, plans, and austerity arguably limits what can be said in and done with the future. Second, modelers acquire the data they are made to need. Coastal communities have protested the state’s primary restoration tool: diversions of sediment from the Mississippi River. Planners reacted by relaxing institutional constraints and modelers brought together new data to highlight possible winners and losers from ecological restoration. Fishers and other coastal residents leveraged greater dissent in the planning process. Political ecologists show that technocentric environmental governance tends to foreclose dissent from hegemonic socioecological futures. I argue we can clarify the conditions in which this tends to happen by following how experts manage data frictions. As some conservationists and planners double down on driving with data in a “post-truth” world, I find that data’s politicizing effects stem from what is asked of it, not whether it is “big” or “drives.”


2018 ◽  
Vol 9 (1) ◽  
pp. 84 ◽  
Author(s):  
Muhammad Syafrudin ◽  
Norma Fitriyani ◽  
Ganjar Alfian ◽  
Jongtae Rhee

Maintaining product quality is essential for smart factories, so detecting abnormal events on the assembly line is important for timely decision-making. This study proposes an affordable, fast early warning system based on edge computing to detect abnormal events on the assembly line. The proposed model obtains environmental data from various sensors, including gyroscopes, accelerometers, and temperature, humidity, ambient-light and air-quality sensors. The fault model is installed close to the facilities, so abnormal events can be detected in a timely manner. Several performance evaluations are conducted to obtain the optimal scenario for utilizing edge devices to improve data processing and analysis speed, and the final proposed model provides the highest accuracy in detecting abnormal events compared with other classification models. The proposed model was tested over four months of operation in a Korean automobile parts factory, where it provided significant benefits in monitoring the assembly line and classifying abnormal events. The model helped improve decision-making by reducing or preventing unexpected losses due to abnormal events.
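The core idea of an edge-side early warning check, a fault model running next to the sensors and flagging readings that deviate from recent behavior, can be sketched very simply. This is not the paper's classification model: the rolling-window z-score rule and its threshold are illustrative assumptions standing in for whatever classifier the edge device actually runs.

```python
# Minimal sketch of an edge-side abnormality check, assuming the edge
# node keeps a rolling window of recent readings for one sensor.
# The z-score rule and threshold of 3.0 are illustrative assumptions.
from statistics import mean, stdev

def is_abnormal(window, new_value, threshold=3.0):
    """Flag a reading that deviates strongly from the recent window."""
    mu, sigma = mean(window), stdev(window)
    if sigma == 0:
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# Recent temperature readings held on the edge device (hypothetical).
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 20.8]
normal_flag = is_abnormal(temps, 21.1)   # within recent variation
fault_flag = is_abnormal(temps, 35.0)    # far outside recent variation
```

Running this check on the device itself, rather than shipping raw readings to a server, is what keeps detection timely in the edge-computing setup the abstract describes.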


Author(s):  
Matty Janssen ◽  
Paul Stuart

In recent years real-time data management systems have become commonplace at pulp and paper mills, and mills seek to use this important resource for improved operation of production facilities as well as for business decision-making. This paper presents a comprehensive and holistic approach to business modeling in which real-time process data, cost data, and environmental data are used in a “bottom-up” manner to exploit their potential for process decision-making. The paper describes a hypothetical case study in which the business model concept is illustrated by application to a process design problem at an integrated newsprint mill.
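The "bottom-up" combination of process, cost, and environmental data can be illustrated with a toy comparison of two design options. Every number, field name, and the carbon price below is an invented assumption for illustration; the paper's actual business model and newsprint-mill case study are far richer.

```python
# Hedged sketch: roll process data (throughput), cost data, and
# environmental data (CO2 intensity) for each design option up into
# one comparable total. All figures are invented for illustration.

options = [
    {"name": "base",    "throughput_t": 100, "cost_per_t": 50, "co2_per_t": 1.2},
    {"name": "upgrade", "throughput_t": 110, "cost_per_t": 48, "co2_per_t": 0.9},
]

CARBON_PRICE = 30  # assumed $/t CO2, pricing the environmental data in

def total_cost(option):
    """Bottom-up total: operating cost plus priced emissions."""
    operating = option["throughput_t"] * option["cost_per_t"]
    environmental = option["throughput_t"] * option["co2_per_t"] * CARBON_PRICE
    return operating + environmental

best = min(options, key=total_cost)
```

In a real mill setting, `options` would be populated from the real-time data management system rather than typed in by hand.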

