Can an Algorithm Be Disturbed?

Author(s):  
James E. Dobson

This chapter positions the use of machine learning within the digital humanities as part of a wider movement that nostalgically seeks to return literary criticism to the structuralist era, to a moment characterized by belief in systems, structure, and the transparency of language. While digital methods enable one to examine radically larger archives than those assembled in the past, a transformation that Matthew Jockers characterizes as a shift from micro- to macroanalysis, the fundamental assumptions about texts and meaning implicit in these tools, and in the criticism that results from them, belong to a much earlier period of literary analysis. The author argues that the use of imported tools and procedures within literary and cultural criticism by some digital humanists today is an attempt to separate methodology from interpretation. In the process, these critics have deemphasized the degree to which methodology participates in interpretation. The chapter closes by returning to the deconstructive critique of structuralism in order to highlight the ways in which numerous interpretive decisions are suppressed in the selection, encoding, and preprocessing of digitized textual sources for text mining and machine learning analysis.

Author(s):  
Bethany Percha

Electronic health records (EHRs) are becoming a vital source of data for healthcare quality improvement, research, and operations. However, much of the most valuable information contained in EHRs remains buried in unstructured text. The field of clinical text mining has advanced rapidly in recent years, transitioning from rule-based approaches to machine learning and, more recently, deep learning. With new methods come new challenges, however, especially for those new to the field. This review provides an overview of clinical text mining for those who are encountering it for the first time (e.g. physician researchers, operational analytics teams, machine learning scientists from other domains). While not a comprehensive survey, it describes the state of the art, with a particular focus on new tasks and methods developed over the past few years. It also identifies key barriers between these remarkable technical advances and the practical realities of implementation at health systems and in industry.
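As a toy illustration of the rule-based extraction that this review contrasts with machine learning and deep learning approaches, the sketch below pulls drug-dose mentions out of free clinical text with a regular expression. The pattern, function name, and sample note are illustrative assumptions, not taken from the review itself:

```python
import re

# A hand-written rule: drug name followed by a numeric dose and unit.
# Real clinical NLP systems use far richer lexicons and context handling.
DOSE_PATTERN = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+(?P<dose>\d+(?:\.\d+)?)\s*(?P<unit>mg|mcg|g)\b",
    re.IGNORECASE,
)

def extract_medications(note: str):
    """Return (drug, dose, unit) tuples found in free-text clinical notes."""
    return [(m.group("drug"), m.group("dose"), m.group("unit"))
            for m in DOSE_PATTERN.finditer(note)]

note = "Patient started on metformin 500 mg twice daily; aspirin 81 mg continued."
print(extract_medications(note))
```

Rules like this are precise but brittle (misspellings, abbreviations, and negation all break them), which is one reason the field has moved toward learned models.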


Author(s):  
Sho Araiba

Although behavioristic work has had substantial effects on many fields, the art and entertainment world historically responded to behaviorism largely negatively or with indifference. More recently, however, as the field of Applied Behavior Analysis has grown, a new wave of behavioristic art has emerged. Given this trend, the argument can be made that it is now time to develop a fully fledged art of behaviorism, both in art making and in art theory. In this paper, I present a literary analysis of the film Ghost in the Shell (Oshii, 1995) as a form of behavioristic literary criticism based on the works of three behaviorists: B. F. Skinner, Gilbert Ryle, and Ludwig Wittgenstein. From a behaviorist point of view, the film presents a juxtaposition between Motoko's beliefs and the environmental influences on her personal identity.


Author(s):  
James E. Dobson

This chapter serves as an introduction to the problems raised by the use of computational methods in cultural and literary criticism. It does so by placing the desire for a science of reading expressed by many digital humanists within a larger genealogy of interpretive hermeneutics, turning to a series of crucial historical inflection points at which scholars and other intellectuals have raised the question of whether literary and cultural criticism could be a science or should depend upon the procedures of the sciences. While some critics (Matthew Jockers, Ted Underwood, and Andrew Goldstone, among others) have proposed that research in the digital humanities should look more like the quantitative social sciences, this chapter's reconstruction of pivotal debates in literary studies demonstrates the existence of surplus questions related to the ongoing meaning of cultural objects, textual sources, and archives that remain unaddressable and unanswerable by empirical methods. The chapter argues that what sustains the possibility of this notion of the "unaddressable" is a regular disciplinary injunction to turn a critical gaze back on scholarly method: on the ways in which evidence is found, collected, or produced; the ways in which scholars frame this evidence; the protocols by which they interpret it; and the arguments they present to their readers.


2019 ◽  
pp. 446-461
Author(s):  
Ekaterina Baydalova

Postcolonial studies have been discussed in Ukrainian historiography, social science, cultural studies, and literary criticism since the 1990s. Originating in American, European, and Australian scholarship, they have become increasingly popular in modern Ukrainian culture. Nation and nationalism, Orientalism, the multicultural and ambivalent self-presentation of the individual, the search for cultural identity, and the problem of an ambivalent attitude toward the past all belong to the paradigm of postcolonial studies. In Ukrainian intellectual discourse, the problems of national identity, the totalitarian past, relations with neighboring countries (especially Russia and Poland), and the instability of Ukrainian society are analyzed through postcolonial ideas. Postcolonial theory has lately become the main interpretative strategy of Ukrainian researchers. Nevertheless, there is no settled consensus in Ukrainian academia about postcolonial conceptions, strategies, and principles. One of the most important unresolved issues is the question of the relationship between the postcolonial and postmodern components of Ukrainian national literature. The inclusion of trauma studies and of anticolonial and posttotalitarian discourses within the framework of postcolonial studies is the most distinctive feature of postcolonial studies in Ukraine.


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs), and anti-inflammatory peptides (AIPs). Peptides are thought to be able to modulate various complex diseases that were previously intractable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small-molecule drugs, peptide-based therapy exhibits high specificity and minimal toxicity. Thus, peptides are widely used in the design and discovery of new potent drugs. Currently, large-scale screening of peptide activity with traditional approaches is costly, time-consuming, and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced for their accuracy and effectiveness in predicting peptide activity. In this review, we document recent progress in the machine learning-based prediction of peptides, which will be of great benefit to the discovery of potentially active AMPs, ACPs, and AIPs.
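As a hedged sketch of the kind of input feature that machine learning predictors of peptide activity commonly start from, the snippet below computes the amino acid composition of a sequence. The example peptide and function names are invented for illustration and are not drawn from the review:

```python
from collections import Counter

# The 20 standard amino acids, one-letter codes.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq: str) -> dict:
    """Fraction of each standard amino acid in a peptide sequence."""
    counts = Counter(seq.upper())
    total = len(seq)
    return {aa: counts.get(aa, 0) / total for aa in AMINO_ACIDS}

# A made-up cationic sequence; AMP classifiers often key on lysine/arginine
# content, which this feature vector exposes.
peptide = "KKLAKWLAKKGG"
feats = composition(peptide)
print(round(feats["K"], 3))
```

A fixed-length vector like this can feed directly into any standard classifier, which is one reason composition features remain a common baseline.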


Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 300
Author(s):  
Mark Lokanan ◽  
Susan Liu

Protecting financial consumers from investment fraud has been a recurring problem in Canada. The purpose of this paper is to predict the demographic characteristics of investors who are likely to be victims of investment fraud. Data for this paper came from the Investment Industry Regulatory Organization of Canada (IIROC) database between January 2009 and December 2019. In total, 4575 investors were coded as victims of investment fraud. The study employed a machine learning algorithm to predict the probability of fraud victimization. The model deployed in this paper predicted the typical demographic profile of fraud victims as investors who are female, have poor financial knowledge, know the advisor from the past, and are retired. Investors characterized as having limited financial literacy but a long-standing relationship with their advisor have a reduced probability of being victimized. However, male investors with low or moderate investment knowledge were more likely to be preyed upon by their investment advisors. While the result is not statistically significant, older adults in general are at greater risk of being victimized. The findings from this paper can be used by Canadian self-regulatory organizations and securities commissions to inform their investor-protection mandates.
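A minimal sketch of how a classifier of the kind described might turn a demographic profile into a victimization probability, using logistic regression scoring. The weights, bias, and feature names below are entirely hypothetical illustrations; the paper's fitted model and feature encoding are not reproduced here:

```python
import math

def sigmoid(z: float) -> float:
    """Map a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for binary (0/1) demographic features,
# loosely echoing the risk factors named in the abstract.
WEIGHTS = {
    "female": 0.8,
    "poor_financial_knowledge": 0.6,
    "knows_advisor_from_past": 0.5,
    "retired": 0.4,
}
BIAS = -2.0

def victimization_probability(investor: dict) -> float:
    """Logistic-regression score for one investor's feature dict."""
    z = BIAS + sum(w * investor.get(f, 0) for f, w in WEIGHTS.items())
    return sigmoid(z)

profile = {"female": 1, "poor_financial_knowledge": 1,
           "knows_advisor_from_past": 1, "retired": 1}
print(round(victimization_probability(profile), 3))
```

In practice such coefficients would be estimated from labeled case data, and the probability threshold for flagging at-risk investors would be a policy choice.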


2021 ◽  
Vol 14 (3) ◽  
pp. 101016 ◽  
Author(s):  
Jim Abraham ◽  
Amy B. Heimberger ◽  
John Marshall ◽  
Elisabeth Heath ◽  
Joseph Drabick ◽  
...  

2020 ◽  
Vol 114 ◽  
pp. 242-245
Author(s):  
Jootaek Lee

The term Artificial Intelligence (AI) has changed since it was first coined by John McCarthy in 1956. AI, whose conceptual roots some trace to Kurt Gödel's unprovable computational statements of 1931, is now often called deep learning or machine learning. AI is defined as a computer machine with the ability to make predictions about the future and solve complex tasks using algorithms. AI algorithms are enhanced and become effective with big data capturing the present and the past, while still necessarily reflecting human biases in models and equations. AI is also capable of making choices like humans, mirroring human reasoning. AI can help robots efficiently repeat the same labor-intensive procedures in factories and can analyze historical and present data efficiently through deep learning, natural language processing, and anomaly detection. Thus, AI covers a spectrum: augmented intelligence relating to prediction, autonomous intelligence relating to decision making, automated intelligence for labor robots, and assisted intelligence for data analysis.

