Technology Translation and Synthesis: A Conceptual Framework for Critiquing Determinism in the Development Project

2021
Author(s): Rebecca Paxton

<p>This thesis constructs a theoretical framework which critiques the legitimacy of technology transfer for the purposes of development. Under the auspices of the development project, technology transfer has involved the introduction of technology into so-called developing societies in the hope of leapfrogging them toward modernity. This process embodies a deterministic definition of technology that sees it as an inherently objective and rational process, mapping the ideas of Western science. Hence, all technological and social change is expected to follow a linear progression from pre-modern to modern, and developing to developed, respectively. In contrast, philosophers of technology have argued that technology has a cultural dimension which permits multiple avenues of change. This definition incorporates a dialogue between technology and society, whereby technologies are reinterpreted and imbued with culturally specific meanings by the adopting societies. The culturally contingent nature of these meanings entails that they are not necessarily transferable between cultures. Rather, technology must be translated. Conceptually, technology translation requires that aspects of the donor and recipient cultures are intertwined, producing a novel set of hybridised meanings. I argue that this process occurs primarily through the mode of synthesis - an emergent process whose outcomes are not predictable based solely on a priori knowledge of the interacting cultures. These ideas are tested in case studies arising from Indian agriculture. Indian agriculture has a long history of external agricultural influence in the shape of European colonialism, the Green Revolution and the more recent Gene Revolution. The results support the idea that both technology transfer and synthesis have occurred in Indian agriculture following the adoption of new technologies. 
If they wish to ensure greater success in the future, development agencies must therefore revise their simplistic notion of technology and acknowledge the centrality of culture within it.</p>


2020 ◽ Vol 31 (2) ◽ pp. 90-92
Author(s): Rob Edwards

Herbicide resistance in problem weeds is now a major threat to global food production, and is particularly widespread in wild grasses affecting cereal crops. In the UK, black-grass (Alopecurus myosuroides) holds the title of number one agronomic problem in winter wheat, with the loss of production associated with herbicide resistance now estimated to cost the farming sector at least £0.5 billion p.a. Black-grass presents many of the characteristic traits of a problem weed: it is highly competitive, genetically diverse and obligately out-crossing, with a growth habit that matches winter wheat. With the UK’s limited arable crop rotations and the reliance on repeated use of a very limited range of selective herbicides, we have been continuously performing a classic Darwinian selection for resistance traits in weeds that possess great genetic diversity and plasticity in their growth habits. The result has been inevitable: the steady rise of herbicide resistance across the UK, which now affects over 2.1 million hectares of some of our best arable land. Once the resistance genie is out of the bottle, it has proven difficult to prevent its establishment and spread. With selective herbicides no longer effective, the options are to revert to cultural control: changing rotations and cover crops, manual rogueing of weeds, deep ploughing and chemical mulching with total herbicides such as glyphosate. While new precision weeding technologies are being developed, their cost and scalability in arable farming remain unproven. As an agricultural scientist who has spent a working lifetime researching selective weed control, I feel we are giving up on a technology that has been a foundation stone of the Green Revolution. For me it raises the question: are we really unable to use modern chemical and biological technology to counter resistance?
I would argue the answer to that question is patently no; solutions are around the corner if we choose to develop them.


Author(s): Chiara Treghini ◽ Alfonso Dell’Accio ◽ Franco Fusi ◽ Giovanni Romano

Abstract: Chronic lung infections are among the most widespread human infections and are often associated with multidrug-resistant bacteria. In this framework, the European project “Light4Lungs” aims at synthesizing and testing an inhalable light source to control lung infections by antimicrobial photoinactivation (aPDI), addressing only endogenous photosensitizers (porphyrins) in the representative case of S. aureus and P. aeruginosa. In the search for the best emission characteristics for the aerosolized light source, this work defines and calculates the photo-killing action spectrum for lung aPDI in the exemplary case of cystic fibrosis. This was obtained by applying semi-theoretical modelling with Monte Carlo simulations, following previously published methodology developed for stomach infections and applied here to the infected trachea, bronchi, bronchioles and alveoli. In each of these regions, both low and high oxygen concentration cases were considered to account for the variability of in vivo conditions, together with the presence of endogenous porphyrins and other relevant absorbers/diffusers inside the illuminated biofilm/mucous layer. Furthermore, an a priori method to obtain the “best illumination wavelengths” was defined, starting from maximizing porphyrin and light absorption at any depth. The obtained action spectrum peaks at 394 nm and largely follows the porphyrin extinction coefficient. This is confirmed by the results for the best illumination wavelengths, which reinforces the robustness of our approach. These results can offer important indications for the synthesis of the aerosolized light source and the definition of its most effective emission spectrum, suggesting a flexible platform to be considered in further applications.
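The core weighting idea behind such an action spectrum, combining photosensitizer absorption with wavelength-dependent light penetration, can be sketched numerically. This is a toy illustration only, not the paper's Monte Carlo model: the Gaussian Soret band, the layer thickness and the attenuation coefficients below are invented placeholder values.

```python
import numpy as np

# Placeholder porphyrin extinction: a Gaussian Soret band centred at 394 nm.
wavelengths = np.arange(350, 651)  # nm
extinction = np.exp(-((wavelengths - 394.0) ** 2) / (2 * 15.0 ** 2))

# Crude depth-resolved fluence: Beer-Lambert decay through a thin
# biofilm/mucous layer, with attenuation proportional to extinction
# plus a constant tissue background (all values hypothetical).
depths = np.linspace(0.0, 0.05, 50)          # cm
mu = 5.0 + 40.0 * extinction                 # cm^-1

# Action spectrum ~ extinction weighted by mean fluence over depth,
# normalized to its maximum.
fluence = np.exp(-np.outer(mu, depths)).mean(axis=1)
action = extinction * fluence
action /= action.max()

peak_nm = wavelengths[action.argmax()]
```

Even in this crude sketch the action spectrum stays pinned near the absorption peak, because the gain from stronger porphyrin absorption outweighs the loss from shallower penetration at those wavelengths.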


Author(s): Brian A. Weiss ◽ Linda C. Schmidt ◽ Harry A. Scott ◽ Craig I. Schlenoff

As new technologies develop and mature, it becomes critical to provide both formative and summative assessments of their performance. Performance assessment events range in form from a few simple tests of key elements of the technology to highly complex and extensive evaluation exercises targeting specific levels and capabilities of the system under scrutiny. Typically, the more advanced the system, the more often performance evaluations are warranted, and the more complex the evaluation planning becomes. Numerous evaluation frameworks have been developed to generate evaluation designs intended to characterize the performance of intelligent systems. Many of these frameworks enable the design of extensive evaluations, but each has its own focused objectives within an inherent set of known boundaries. This paper introduces the Multi-Relationship Evaluation Design (MRED) framework, whose ultimate goal is to automatically generate an evaluation design based upon multiple inputs. The MRED framework takes input goal data and outputs an evaluation blueprint complete with specific evaluation elements, including the level of technology to be tested, metric type, user type, and evaluation environment. Among MRED’s unique features is that it characterizes these relationships and manages their uncertainties, along with those associated with the evaluation input. The authors introduce MRED by first presenting the relationships between four main evaluation design elements. These elements are defined and the relationships between them are established, including the connections between evaluation personnel (not just the users), their level of knowledge, and their decision-making authority. This is further supported through the definition of key terms. An example is presented in which these terms and relationships are applied to the evaluation design of an automobile technology.
An initial validation step follows, in which MRED is applied to a speech translation technology whose evaluation design was inspired by the successful use of a pre-existing evaluation framework. It is important to note that MRED is still in its early stages of development; this paper presents numerous MRED outputs, while future publications will present the remaining outputs, the uncertain inputs, and the implementation steps that produce the detailed evaluation blueprints.
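The goal-to-blueprint mapping described in the abstract can be pictured as a small typed structure. The field names, enumeration values and decision rules below are illustrative guesses, not MRED's actual schema or logic.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical enumerations; the abstract names these element types but
# does not specify their values.
class MetricType(Enum):
    TECHNICAL = "technical"
    UTILITY = "utility"

class UserType(Enum):
    NOVICE = "novice"
    EXPERT = "expert"

@dataclass
class EvaluationGoal:
    description: str
    technology_level: int  # hypothetical maturity scale, e.g. 1 (component) to 5 (system)

@dataclass
class EvaluationBlueprint:
    technology_level: int
    metric_type: MetricType
    user_type: UserType
    environment: str

def design_evaluation(goal: EvaluationGoal) -> EvaluationBlueprint:
    """Toy stand-in for MRED's automated design step: map an input goal
    to a blueprint using simple placeholder rules."""
    mature = goal.technology_level >= 3
    return EvaluationBlueprint(
        technology_level=goal.technology_level,
        metric_type=MetricType.UTILITY if mature else MetricType.TECHNICAL,
        user_type=UserType.EXPERT if mature else UserType.NOVICE,
        environment="field" if mature else "laboratory",
    )
```

The point of the sketch is the shape of the mapping: structured goal data in, a blueprint of concrete evaluation elements out, which is the part MRED automates.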


1944 ◽ Vol 41 (6) ◽ pp. 155
Author(s): Arthur Child
Keyword(s): A Priori

2017 ◽ pp. 1-3
Author(s): J.-P. Michel

The overlap between one innovative paradigm (P4 medicine: predictive, personalized, participatory and preventive) and another (a new definition of “Healthy ageing”) is fertile ground for new technologies, such as a new mobile application (app) that could broaden our scientific knowledge of the ageing process and help us to better analyse the impact of possible interventions in slowing ageing-related decline. The novel mobile application presented here takes the form of a game whose questions and tests allow, in 10 minutes, the assessment of the following domains: robustness, flexibility (lower muscle strength), balance, mental and memory complaints, semantic memory and visual retention. The game is complemented by specific measurements that could establish precise information on functional and cognitive abilities. A global evaluation precedes advice and different types of exercises. Repeating the tests and measures will allow long-term follow-up of individual performance, which could be shared (on specific request) with family members and general practitioners.


Author(s):  
D. Egorov

Adam Smith defined economics as “the science of the nature and causes of the wealth of nations” (implicitly appealing, in the reference to “wealth”, to “value”). Neo-classical theory views it as a science “which studies human behaviour as a relationship between ends and scarce means which have alternative uses”. The main reason that the neo-classical theory (which serves as the now prevailing economic mainstream) turns into a tool for manipulation of the public consciousness is the lack of a measure (the elimination of “value”). Even though the neo-classical definition of the subject of economics does not contain an explicit rejection of objective measures, the reference to “human behavior” inevitably implies methodological subjectivism. This makes it necessary to adopt a principle of equilibrium: if one cannot objectively compare different states of the system (using a solid measure), one can only postulate the existence of an equilibrium point toward which the system tends. The neo-classical postulate of equilibrium cannot explain non-equilibrium situations. As a result, the neo-classical theory fails to match microeconomics to macroeconomics. Moreover, the denial of the category of “value” serves as a theoretical basis and an ideological prerequisite of the now flourishing manipulative financial technologies. The author proposes the following two principal definitions: (1) economics is a science that studies the economic system, i.e. a system that creates and recombines value; (2) value is a measure of the cost of an object. In our opinion, value is an informational measure of cost. It should be added that disclosure of the nature of this category is not an obligatory prerequisite of its introduction: methodologically, it is quite correct to postulate it a priori.
The author concludes that the proposed definitions open the way not only to solving the problem of measurement in economics, but also to addressing the issue of harmonizing macro- and microeconomics.


Author(s): Herb A Phelan ◽ James H Holmes IV ◽ William L Hickerson ◽ Clay J Cockerell ◽ Jeffrey W Shupp ◽ ...

Abstract Introduction: Burn experts are only 77% accurate when subjectively assessing burn depth, leaving almost a quarter of patients to undergo unnecessary surgery or, conversely, suffer a delay in treatment. To aid clinicians in burn depth assessment (BDA), new technologies are being studied with machine learning algorithms calibrated to histologic standards. Our group has iteratively created a theoretical burn biopsy algorithm (BBA) based on histologic analysis, and subsequently informed it with the largest burn wound biopsy repository in the literature. Here, we sought to report that process. Methods: This was an IRB-approved, prospective, multicenter study. A BBA was created a priori and refined in an iterative manner. Patients with burn wounds assessed by burn experts as requiring excision and autograft underwent 4 mm biopsies procured every 25 cm². Serial still photos were obtained at enrollment and intraoperatively at excision. Burn biopsies were histologically assessed for the presence/absence of epidermis, papillary dermis, and reticular dermis, and for the proportion of necrotic adnexal structures, by a dermatopathologist using H&E with whole-slide scanning. First-degree and superficial 2nd-degree burns were considered wounds likely to have healed without surgery, while deep 2nd- and 3rd-degree burns were considered unlikely to heal by 21 days. Biopsy pathology results were correlated with still photos by five burn experts to reach consensus on the final burn depth diagnosis. Results: Sixty-six subjects were enrolled, with 117 wounds and 816 biopsies. The BBA was used to categorize subjects’ wounds into 4 categories: 7% of burns were categorized as 1st degree, 13% as superficial 2nd degree, 43% as deep 2nd degree, and 37% as 3rd degree. Therefore, per the BBA, 20% of burn wounds were incorrectly judged by the clinical team as needing excision and grafting.
As H&E is unable to assess the viability of the papillary and reticular dermis, with time our team came to appreciate the greater importance of adnexal structure necrosis over dermal appearance in assessing healing potential. Conclusions: Our study demonstrates that a BBA with objective histologic criteria can be used to categorize BDA with clinical misclassification rates consistent with past literature. This study serves as the largest analysis of burn biopsies by modern-day burn experts and the first to define histologic parameters for BDA.
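The decision logic of a burn biopsy algorithm of this kind can be sketched schematically. The thresholds and rules below are hypothetical placeholders for illustration; the study's actual BBA criteria are histologically defined and were refined iteratively.

```python
from dataclasses import dataclass

@dataclass
class BiopsyFindings:
    epidermis_present: bool
    papillary_dermis_present: bool
    reticular_dermis_present: bool
    necrotic_adnexa_fraction: float  # 0.0-1.0, proportion of necrotic adnexal structures

def classify_burn_depth(b: BiopsyFindings) -> str:
    """Toy classifier mapping histologic findings to a burn-depth category.
    All thresholds are invented placeholders, not the study's criteria."""
    if b.epidermis_present:
        return "1st degree"
    if b.papillary_dermis_present and b.necrotic_adnexa_fraction < 0.25:
        return "superficial 2nd degree"
    if b.reticular_dermis_present and b.necrotic_adnexa_fraction < 0.75:
        return "deep 2nd degree"
    return "3rd degree"

def likely_to_heal_by_21_days(category: str) -> bool:
    # Per the abstract: 1st-degree and superficial 2nd-degree wounds are
    # the ones likely to heal without surgery.
    return category in ("1st degree", "superficial 2nd degree")
```

Note how the adnexal-necrosis fraction, rather than dermal appearance alone, drives the split between categories, mirroring the lesson the authors draw about H&E's limits.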


Author(s): Igor I. Kartashov ◽ Ivan I. Kartashov

For millennia, mankind has dreamed of creating an artificial creature capable of thinking and acting “like human beings”. These dreams are gradually starting to come true. The trends in the development of modern society, given its increasing level of informatization, require the use of new technologies for information processing and for assistance in decision-making. Expanding the boundaries of the use of artificial intelligence not only requires the establishment of ethical restrictions, but also gives rise to the need to promptly resolve legal problems, including criminal and procedural ones. This is primarily due to the emergence and spread of legal expert systems that predict the decision in a particular case based on a variety of parameters. Based on a comprehensive study, we formulate a definition of artificial intelligence suitable for use in law. We propose to understand artificial intelligence as systems capable of interpreting received data and making optimal decisions on that basis through self-learning (adaptation). The main directions for using artificial intelligence in criminal proceedings are: search and generalization of judicial practice; legal advice; preparation of formalized documents or statistical reports; forecasting court decisions; and predictive jurisprudence. Despite the promise of artificial intelligence, there are a number of problems associated with a low level of reliability in predicting rare events, self-excitation of the system, the opacity of the algorithms and architecture used, etc.


Author(s):  
Ewa Suknarowska-Drzewiecka

The digital revolution, also called the fourth industrial revolution, constitutes another era of change, driven by the development of computerisation and modern technologies. It is characterised by rapid technological progress, widespread digitisation and an impact on all areas of life, including the provision of work. The changes affecting this area are so significant that there are proposals to remodel the definition of the employment relationship in the Labour Code. New forms of employment, which do not fit the conventional definition of an employment relationship, are emerging and gaining importance. An example is employment via digital platforms. At the same time, there are also employment forms that do fit that definition but deviate from the conventional understanding of the terms and conditions for performing work, which have been modified by the use of new technologies. Teleworking, or working outside the employer’s premises, is an example. Employers are gaining further opportunities to organise and control work, which often raises concerns about the employee’s right to privacy and the protection of personal rights and personal data.

