An Era of Appropriate Technology: Evolutions, Oversights and Opportunities

2015 ◽  
Vol 3 (1) ◽  
Author(s):  
Jessie Lissenden ◽  
Siri Maley ◽  
Khanjan Mehta

As we develop practical, innovative and sustainable technology solutions for resource-constrained settings, what can we learn from the Appropriate Technology (AT) movement? Based on a review of academic literature over the past 35 years, this article identifies, and chronologically maps, the defining tenets and metrics of success advocated by scholars. The literature has gradually evolved from general musings into concrete lessons learned, while the definitions of "success" have transitioned from laboratory success into practical application and long-term usefulness. Nonetheless, juxtaposing this scholastic history with actual projects reveals three major gaps in AT philosophy related to a lack of 1) bilateral knowledge exchange, 2) emphasis on venture scalability, and 3) integration of implementation strategy through the project lifecycle. This article argues that rethinking and repositioning AT with a human-centric narrative emphasizing sustainability and scalability is imperative to revitalize and accelerate the AT movement and to achieve the large-scale impact it was expected to deliver.

2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Mateusz Taszarek ◽  
John T. Allen ◽  
Mattia Marchio ◽  
Harold E. Brooks

Globally, thunderstorms are responsible for a significant fraction of rainfall and, in the mid-latitudes, often produce extreme weather, including large hail, tornadoes and damaging winds. Despite this importance, how the global frequency of thunderstorms and their accompanying hazards has changed over the past four decades remains unclear. Large-scale diagnostics applied to global climate models have suggested that the frequency and intensity of thunderstorms are likely to increase in the future. Here, we show that according to ERA5, convective available potential energy (CAPE) and convective precipitation (CP) have decreased over the tropics and subtropics, with simultaneous increases in 0–6 km wind shear (BS06). Conversely, rawinsonde observations paint a different picture across the mid-latitudes, with increasing CAPE and significant decreases in BS06. Differing trends and disagreement between ERA5 and rawinsondes over some regions suggest that results should be interpreted with caution, especially for CAPE and CP across the tropics, where uncertainty is highest and reliable long-term rawinsonde observations are missing.
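The BS06 diagnostic named in the abstract is simply the magnitude of the vector wind difference between the surface and 6 km above ground level. A minimal sketch (function name and sample winds are illustrative, not taken from the paper):

```python
import math

def bulk_shear_06km(u_sfc, v_sfc, u_6km, v_6km):
    """0-6 km bulk wind shear (BS06): magnitude of the vector wind
    difference between the surface and 6 km AGL, in m/s."""
    return math.hypot(u_6km - u_sfc, v_6km - v_sfc)

# Illustrative winds (m/s): 5 m/s westerly at the surface, (20, 15) m/s at 6 km
print(round(bulk_shear_06km(5.0, 0.0, 20.0, 15.0), 1))  # 21.2
```

The same vector-difference definition applies whether the inputs come from rawinsonde levels or reanalysis grids, which is what allows the ERA5/rawinsonde comparison in the abstract.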


Collections ◽  
2021 ◽  
pp. 155019062110527
Author(s):  
J.A. Pryse

The spread of COVID-19 has created numerous challenges in the field of archive management. Limited in-house office space, furloughs of personnel, and inconsistency have highlighted the potential for the Carl Albert Congressional Research and Studies Center Archives (Center) to develop and implement improved accessibility measures for thousands of linear feet of material. Additionally, the Center has found unique opportunities to collaborate with multiple academic institutions to propose large-scale digitization program exhibitions using the Center’s remote workflow model. One of the largest, most complex collections the Center has worked with during this time is the Political Commercial Collection (the Collection), which holds 119,000 film, audio, and videotape recordings of commercials aired between 1936 and the present. It is the largest collection of political commercials in the world. The Center has developed a working pilot digitization project that has resulted in access to 16,000 digital videos for public researchers and over 10,000 available for online streaming during the pilot phase between April 16, 2020, and December 1, 2020. This paper presents the practical application of the Center’s simplified “Linear Reciprocity Workflow Model” to provide a systematic solution for the digital and long-term preservation of complex collections. The Center has proven that limited personnel and reduced resources need not interrupt continued access to archival repositories.


Author(s):  
Jun Liu ◽  
Mohamed Alhashme ◽  
Ronggui Yang

Carbon nanotubes (CNTs) have been reported over the past two decades to have excellent thermal and mechanical properties. However, the practical application of CNT-based technologies has been limited by the inability to translate the excellent properties of single CNTs into macroscopic applications. A CNT network structure connects individual CNTs and can potentially be scaled up to macro-scale CNT-based applications. In this paper, nonequilibrium molecular dynamics is applied to investigate thermal transport across two CNTs connected longitudinally by molecular linkers. We show the effect of different types and lengths of molecular linkers on interfacial thermal conductance. We also analyze the density of vibrational normal modes to further understand the interfacial thermal conductance between different molecular linkers and CNTs. These results provide guidance for choosing molecular linkers to build large-scale CNT-based network structures.
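In nonequilibrium MD studies like this one, interfacial thermal conductance is typically extracted as the imposed steady heat current divided by the cross-sectional area and the temperature drop at the junction, G = Q / (A·ΔT). A minimal sketch of that post-processing step (the numerical values are illustrative, not results from the paper):

```python
def interfacial_conductance(heat_current_w, area_m2, delta_t_k):
    """Interfacial thermal conductance G = Q / (A * dT) in W/(m^2*K):
    a steady heat current Q (W) driven across the junction, divided by
    the cross-sectional area A (m^2) and the temperature drop dT (K)
    measured across the linker interface."""
    return heat_current_w / (area_m2 * delta_t_k)

# Illustrative values only: 1 nW across a 1 nm^2 cross-section with a 10 K drop
g = interfacial_conductance(1e-9, 1e-18, 10.0)
print(f"{g:.1e} W/(m^2 K)")  # 1.0e+08 W/(m^2 K)
```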


2006 ◽  
Vol 1 (1) ◽  
pp. 46-71 ◽  
Author(s):  
Itsuki Nakabayashi

This treatise outlines developments in disaster management, focusing on earthquake disaster measures taken by the Japanese and Tokyo Metropolitan Governments since the 1980s. The 1978 Large-Scale Earthquake Measures Special Act, on conditions for predicting the Tokai Earthquake, significantly changed the direction of earthquake disaster measures in Japan. The Tokyo Metropolitan Government undertook its own earthquake disaster measures based on lessons learned from the 1964 Niigata Earthquake. In the 1980s, it began planning urban development disaster management programs for upgrading areas with high concentrations of wooden houses - still a big problem in many urban areas of Japan - which are most vulnerable to earthquake disasters. The 1995 Great Hanshin-Awaji Earthquake in Kobe brought meaningful insights to the earthquake disaster measures of both the Japanese Government and the Tokyo Metropolitan Government, as well as other local governments nationwide. Long-term predictions of possible earthquake occurrence have been conducted throughout Japan, and new earthquake disaster measures have been adopted based on these long-term predictions. The Tokyo Metropolitan Government has further completely revised its own earthquake disaster measures. As a review of measures against foreseeable earthquake disasters, this treatise provides invaluable insights into the urban earthquake disaster prevention developed in Japan over the last 30 years that readers will find informative in their own work.


2012 ◽  
Vol 7 (4) ◽  
pp. 343-343
Author(s):  
Kenji Watanabe

Among the lessons learned from the Great East Japan Earthquake were a large number of new findings, including which preparations functioned as planned and which did not. Now that a year has elapsed since the earthquake disaster, the parties concerned need to reexamine those measures which are yet to be implemented, so that we do not see the same results after a future large-scale disaster as those we saw in the past. For this JDR Special Issue on the Business Continuity Plan (BCP), I sought papers not only from academia but also from business fields, to make the issue practical and useful for our next steps in preparing for incoming disasters. As a result, this issue contains papers from fields ranging from academia to financial businesses, taking several different approaches, including actual case studies. Many of the papers focus on the intangible aspects of business continuity activities, in contrast to traditional disaster management approaches, which have mainly focused on tangibles such as hardware reinforcement against natural disasters. Recent wide-area disasters taught us the importance of intangibles, and we should discuss them in more detail in terms of corporate value, emergency transportation and logistics, training and exercises, funding arrangements, and management systems. I hope that the discussions and insights in this issue will help our discussions and actions move forward. Finally, I sincerely thank the authors for their insightful contributions and the referees for their intensive professional advice in making this JDR Special Issue valuable to our society in preparing for incoming disasters.


1983 ◽  
Vol 13 (4) ◽  
pp. 539-547 ◽  
Author(s):  
J. R. Blais

The history of spruce budworm (Choristoneura fumiferana (Clem.)) outbreaks over the past 200 to 300 years, for nine regions in eastern Canada, indicates that outbreaks have occurred more frequently in the 20th century than previously. Regionally, 21 outbreaks took place in the past 80 years, compared with 9 in the preceding 100 years. Earlier infestations were restricted to specific regions, but in the 20th century they have coalesced and increased in size, the outbreaks of 1910, 1940, and 1970 having covered 10, 25, and 55 million ha, respectively. The increase in the frequency, extent, and severity of outbreaks appears mostly attributable to man-made changes in the forest ecosystem. Clear-cutting of pulpwood stands, fire protection, and the use of pesticides against the budworm favor fir–spruce stands, rendering the forest more prone to budworm attack. The manner and degree to which each of these practices has altered forest composition is discussed. In the future, most of these practices are expected to continue and their effects could intensify, especially in regions of recent application. Other practices, including large-scale planting of white spruce, could further increase the susceptibility of forest stands. Forest management aimed at reducing the occurrence of extensive fir–spruce stands has been advocated as a long-term solution to the budworm problem. Implementing this measure at a time when man's actions result in the proliferation of fir presents a most serious challenge to forest managers.


Author(s):  
Daniel Kinderman

This chapter focuses on how business interests and neoliberal ideas have come together in Germany during the past two decades. It is based on a detailed analysis of the INSM, a large-scale campaign founded and funded by the metal industry employers’ association Gesamtmetall in 2000 to shape public opinion. Since its founding, the INSM has launched a systematic attack on the German welfare state. As part of a business-led public relations campaign, the purpose of the INSM is to propagate market-oriented reforms and influence public opinion and policymaking rather than to develop new economic ideas. Nevertheless, a group of economists associated with the Mont Pèlerin Society have actively supported and campaigned for the INSM. The INSM exposes a serious problem with the academic literature that characterizes Germany as an exemplar of “nonliberal” capitalism: the positions of leading German business officials and economists are fundamentally and unmistakably liberal.


1991 ◽  
Vol 7 (03) ◽  
pp. 176-182
Author(s):  
John Walker Hartigan

The naval shipyards are in the process of installing a system for identifying and recording specific job-related skills in their industrial workforce. The system, called the Shipyard Skills Tracking System (SSTS), is intended initially to support middle-level management in allocating their workforce properly for critical tasks and in accurately factoring personnel availability and training requirements into the planning for upcoming work. SSTS is supported by sophisticated computer programs which are integrated into other shipyard administrative programs. Data entry, ever the bugaboo of large-scale tracking programs, is minimized by using data links to other job-related programs for most of the information. The programs have been successfully field-tested at one naval shipyard and, starting in November 1989, began undergoing phased installation at all eight government yards.

Discussion
John D. Prebula, Pearl Harbor Naval Shipyard

This paper is an excellent overview of how shipyards will track qualifications, skills, skill level, and other data needed to assign work. The SSTS is a good example of what can happen when appropriate technology is used to satisfy similar needs at a number of naval shipyards. The naval shipyards had a problem in that they knew a great deal about the training and qualifications of individuals but had poor means of retrieving information on their skill levels. Past attempts to document and retain information on skill levels and experience were generally unsuccessful because of the large amount of information and the continual changes to it. The SSTS successfully linked new microcomputer technology with the training information in the shipyard mainframe computer. This allows information to be maintained currently and easily, without the large duplication of effort that had been necessary in the past.
Pearl Harbor Naval Shipyard's supervisors are looking forward to full implementation of the SSTS and believe that, if properly implemented, it will be of benefit to the shipyard. Mr. Hartigan does an excellent job of listing and explaining the important features of the SSTS and uses the example of a new supervisor trying to provide someone for a "tiger team" effort. While such a system is definitely a benefit to new supervisors, it is also a great benefit to supervisors who have been on the job and know their people rather well. An experienced supervisor who is familiar with his people is still not likely to know such things as who has a passport, whose medical exams are current, whose inoculations are current, and the other items that must be checked before someone can be sent overseas or to a specific shipyard job. When the workforce includes a large number of temporary or more transient workers (as shipyards are being asked to become), a system to track skills becomes more important than ever. The SSTS, as the author explains so well, is not just another system for tracking qualifications. Rather, it marries together qualifications, skills, skill levels, some select training, medical qualifications, and selected attributes such as possession of a passport. This database is updated for training and qualifications whenever the shipyard's mainframe is updated for those trainings and qualifications; the SSTS is manually updated for the specific attributes and skills. It was the marrying of microcomputer technology to the shipyard's mainframe computer that allowed such a system to come into existence without the need to purchase additional computer equipment. As planned, the SSTS will be applicable to the production department workers for all ships in the shipyard and will be applied to selected engineering and inspection codes.
One significant item in the paper is the mention that not only are skills and experience reported and tracked, but the degree of expertise in each of these skills is also tracked. Mr. Hartigan uses the words "accomplished a battery replacement successfully." This allows the shipyard not only to track who has performed specific skills or tasks, but also to know at what level they are capable of performing. This is done by a tie between the timekeeping system and the SSTS: the supervisor annotates the skill level when an individual has performed a specific skill by entering the appropriate code on the timekeeping sheet. This timekeeping entry also provides direct input into the SSTS.
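The update path described above - a single timekeeping entry that both charges the job and records the demonstrated skill level - can be sketched as a simple merge of mainframe-fed and manually entered data. All names below are hypothetical illustrations, not the actual SSTS schema:

```python
from dataclasses import dataclass, field

@dataclass
class WorkerRecord:
    # Hypothetical record layout, for illustration only
    name: str
    qualifications: set = field(default_factory=set)   # fed from the mainframe
    attributes: set = field(default_factory=set)       # manually entered (e.g. "passport")
    skill_levels: dict = field(default_factory=dict)   # updated via timekeeping codes

def apply_timekeeping_entry(rec: WorkerRecord, skill: str, level_code: int) -> None:
    """A supervisor's code on the timekeeping sheet records the highest
    skill level the worker has demonstrated for that skill."""
    rec.skill_levels[skill] = max(rec.skill_levels.get(skill, 0), level_code)

rec = WorkerRecord("J. Doe", attributes={"passport"})
apply_timekeeping_entry(rec, "battery replacement", 3)
print(rec.skill_levels)  # {'battery replacement': 3}
```

The design point the discussion emphasizes survives even in this toy form: routine timekeeping doubles as the data-entry channel, so skill levels stay current without a separate reporting step.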


2021 ◽  
Author(s):  
Christopher ODell ◽  
Annmarie Eldering ◽  
Michael Gunson ◽  
David Crisp ◽  
Brendan Fisher ◽  
...  

While initial plans for measuring carbon dioxide from space hoped for 1-2 ppm levels of accuracy (bias) and precision in the CO2 column mean dry air mole fraction (XCO2), in the past few years it has become clear that accuracies better than 0.5 ppm are required for most current science applications. These include measuring continental (1000+ km) and regional-scale (100s of km) surface fluxes of CO2 at monthly-average timescales. Considering the 400+ ppm background, this translates to an accuracy of roughly 0.1%, an incredibly challenging target to hit.

Improvements in both instrument calibration and retrieval algorithms have led to significant improvements in satellite XCO2 accuracies over the past decade. The Atmospheric Carbon Observations from Space (ACOS) retrieval algorithm, including post-retrieval filtering and bias correction, has demonstrated unprecedented accuracy with our latest algorithm version as applied to the Orbiting Carbon Observatory-2 (OCO-2) satellite sensor. This presentation will discuss the performance of the v10 XCO2 product through comparisons to TCCON and models, and showcase its performance with some recent examples, from the potential to infer large-scale fluxes to its performance on individual power plants. The v10 product yields better agreement with TCCON over land and ocean, plus reduced biases over tropical oceans and desert areas compared to a median of multiple global carbon inversion models, allowing better accuracy and greater confidence in inferred regional-scale fluxes. More specifically, OCO-2 has a single-sounding precision of ~0.8 ppm over land and ~0.5 ppm over water, and RMS biases of 0.5-0.7 ppm over both land and water. Given the six-year and growing length of the OCO-2 data record, this also enables new studies of carbon interannual variability, while at the same time allowing identification of more subtle and temporally dependent errors. Finally, we will discuss the prospects of future improvements in the next planned version (v11), and the long-term prospects of greenhouse gas retrievals in the coming years.
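The roughly 0.1% figure quoted above follows directly from dividing the bias requirement by the background concentration; a one-line check (0.5 ppm and the 400+ ppm background are taken from the abstract, 410 ppm as the representative background value is an assumption):

```python
def fractional_accuracy(bias_ppm, background_ppm):
    """Relative accuracy implied by an absolute XCO2 bias requirement."""
    return bias_ppm / background_ppm

# 0.5 ppm bias requirement against a ~410 ppm background
print(f"{fractional_accuracy(0.5, 410.0):.2%}")  # 0.12%
```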


2010 ◽  
Vol 26 (1) ◽  
pp. 1-3 ◽  
Author(s):  
William F. Schillinger

Many lessons in long-term cropping systems experiments are learned from practical experience. I have conducted large-scale, long-term, multidisciplinary dryland and irrigated cropping systems experiments with numerous colleagues at university and government research stations and in farmers' fields in the USA and in developing countries for 25 years. Several practical lessons learned through the years are outlined in this short commentary. While some of these lessons may be intrinsically obvious, the results of many cropping systems experiments have not been published in scientific journals due to fatal flaws in experimental design, improper transitioning between phases of the experiment, and many other reasons. Ongoing active support by stakeholders is critical to maintain funding for long-term cropping systems studies. Problems and unexpected challenges will occur, but scientists can often parlay these into opportunities for discovery and testing of new hypotheses. Better understanding and advancement of stable, profitable and sustainable cropping systems will be critical for feeding the world's projected 10 billion people by the mid-21st century.

