Traditional Approaches to Aesthetics

2019
pp. 410-421
Author(s):  
Patrik N. Juslin

This chapter focuses on empirical aesthetics, which can be regarded as one of the oldest subfields in psychology. The most important contribution to the domain was made by the scholar Daniel Berlyne, who launched the ‘New Empirical Aesthetics’. In accordance with the prevailing ‘Zeitgeist’ of the 1960s, Berlyne focused mainly on the notion of autonomic arousal as opposed to discrete emotions, arguing that art influences its perceivers mainly by manipulating their arousal. Berlyne further suggested that listeners' preferences are related to arousal in the form of an inverted U-shaped curve, sometimes referred to as the Wundt curve. The chapter then discusses what empirical aesthetics has contributed to the understanding of aesthetic responses to music.
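As a purely illustrative aside (not a model drawn from the chapter or from Berlyne's own work), the inverted-U relation can be pictured as a concave function of a normalized arousal potential that peaks at a moderate level; the quadratic form, the 0-1 scaling and the location of the optimum in the sketch below are assumptions chosen only to show the curve's shape.

    import numpy as np

    # Toy illustration of an inverted-U ("Wundt curve") relation between
    # arousal potential and liking. The quadratic form, the 0-1 scaling and
    # the peak at 0.5 are illustrative assumptions, not Berlyne's formulation.

    arousal = np.linspace(0.0, 1.0, 11)            # normalized arousal potential
    optimum = 0.5                                  # assumed moderate optimum
    liking = 1.0 - 4.0 * (arousal - optimum) ** 2  # rises, peaks, then falls

    for a, l in zip(arousal, liking):
        print(f"arousal={a:.1f}  liking={l:+.2f}")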

Author(s):  
Ronald J. Gilson

In the 1960s and 1970s, corporate law and finance scholars gave up on their traditional approaches. Corporate law had become “towering skyscrapers of rusted girders, internally welded together and containing nothing but wind.” In finance, the theory of the firm was recognized as an “empty box.” This essay tracks how corporate law was reborn as corporate governance through three examples of how we have usefully complicated the inquiry into corporate behavior. Part I frames the first complication, defining governance broadly as the company’s operating system, a braided framework of legal and non-legal elements. Part II adds a second complication by making the inquiry dynamic: corporate governance as a path dependent process that co-evolves with the elements of the broader capitalist regime. Part III considers unsuccessful efforts to simplify rather than complicate corporate governance analysis through static single factor models: stakeholder, team production, director primacy, and shareholder primacy. Part IV concludes by highlighting the tradeoff between a governance system’s capacity to adapt to change and its ability to support long-term investment.


Author(s):  
Kenneth De Jong

I continue to be surprised and pleased by the dramatic growth of interest in and applications of genetic algorithms (GAs) in recent years. This growth, in turn, has placed a certain amount of healthy "stress" on the field as current understanding and traditional approaches are stretched to the limit by challenging new problems and new areas of application. At the same time, other forms of evolutionary computation, such as evolution strategies [50] and evolutionary programming [22], continue to mature and provide alternative views on how the process of evolution might be captured in an efficient and useful computational framework. I don't think there can be much disagreement about the fact that Holland's initial ideas for adaptive system design have played a fundamental role in the progress we have made in the past thirty years [23, 46]. So, an occasion like this is an opportunity to reflect on where the field is now, how it got there, and where it is headed. In the following sections, I will attempt to summarize the progress that has been made, and to identify critical issues that need to be addressed for continued progress in the field.

The widespread availability of inexpensive digital computers in the 1960s gave rise to their increased use as a modeling and simulation tool by the scientific community. Several groups around the world, including Rechenberg and Schwefel at the Technical University of Berlin [49], Fogel et al. at the University of California at Los Angeles [22], and Holland at the University of Michigan in Ann Arbor [35], were captivated by the potential of taking early simulation models of evolution a step further and harnessing these evolutionary processes in computational forms that could be used for complex computer-based problem solving. In Holland's case, the motivation was the design and implementation of robust adaptive systems, capable of dealing with an uncertain and changing environment. His view emphasized the need for systems which self-adapt over time as a function of feedback obtained from interacting with the environment in which they operate. This led to an initial family of "reproductive plans" which formed the basis for what we call "simple genetic algorithms" today, as outlined in figure 1.
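As a rough sketch only (figure 1 is not reproduced here, and this is not Holland's original formulation of reproductive plans), the loop of selection, crossover and mutation behind a "simple genetic algorithm" might look as follows; the OneMax fitness function, tournament selection and all parameter values are assumptions chosen for brevity.

    import random

    # Minimal sketch of a "simple genetic algorithm" over bit strings:
    # evaluate, select, recombine, mutate, repeat. Illustrative only.

    def one_max(bits):
        """Toy fitness: the number of 1-bits (the classic OneMax problem)."""
        return sum(bits)

    def tournament_select(pop, fitnesses, k=2):
        """Return the fitter of k randomly chosen individuals."""
        contenders = random.sample(range(len(pop)), k)
        return pop[max(contenders, key=lambda i: fitnesses[i])]

    def crossover(a, b):
        """One-point crossover of two equal-length bit strings."""
        point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:]

    def mutate(bits, rate=0.01):
        """Flip each bit independently with probability `rate`."""
        return [1 - b if random.random() < rate else b for b in bits]

    def simple_ga(length=50, pop_size=100, generations=100):
        pop = [[random.randint(0, 1) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            fitnesses = [one_max(ind) for ind in pop]
            pop = [mutate(crossover(tournament_select(pop, fitnesses),
                                    tournament_select(pop, fitnesses)))
                   for _ in range(pop_size)]
        return max(pop, key=one_max)

    if __name__ == "__main__":
        best = simple_ga()
        print("best fitness:", one_max(best))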


Author(s):  
Neelam Sidhar Wright

This chapter discusses Indian film criticism, with a particular focus on traditional modes of studying Indian cinema. It first traces the history of the development of the Bombay film industry from the 1910s to the 2000s, arguing that the 1960s and 1980s are decades from which we can best study Indian cinema's most popular form of filmmaking: the masala genre. It then considers traditional approaches to Indian film and some popular themes in Indian film studies, including nationalism, diaspora, postcolonialism and cultural identity. It also examines introductory guidebooks and other literary sources that it accuses of having misled readers towards restrictive (if not outmoded and derogatory) definitions of the cinema they seek to understand. The chapter concludes with an overview of categories used to explore Bollywood's current manifestation, namely, third cinema, world cinema, Asian cinema, global contemporary Indian cinema and transnational cinema.


2015
Vol 2 (3)
Author(s):  
Kanchi Madhavi

Boredom is frequently considered inconsequential and has received relatively little research attention. We argue that boredom has important implications for human functioning, based on emotion theory and empirical evidence. Specifically, we argue that boredom motivates the pursuit of new goals when the previous goal is no longer beneficial. Exploring alternative goals and experiences allows the attainment of goals that might be missed if people fail to reengage. We propose that, like other discrete emotions, boredom has specific and unique impacts on behavior, cognition, experience and physiology. Consistent with the broader argument that boredom encourages the behavioral pursuit of alternative goals, we argue that boredom reduces attention to the current task, that the experience of boredom is negative and aversive, and that boredom increases autonomic arousal to ready the pursuit of alternatives. By motivating a desire for change from the current state, boredom increases opportunities to attain social, cognitive, emotional and experiential stimulation that could otherwise have been missed. We review the limited extant literature to support these claims, and call for more experimental boredom research.


2006
Vol 23 (7-8)
pp. 75-91
Author(s):  
Geoffrey Winthrop-Young

Focusing on Kittler’s reading of Goethe’s ‘Wanderer’s Nightsong’ and Pink Floyd’s ‘Brain Damage’, the article traces Kittler’s development from discourse analysis to media theory. Where more traditional approaches would stress notions of self-reflexivity (both the poem and the song elaborate on their effects and foreground their own construction), Kittler performs, in his own words, a kind of ‘implosion’: The words of Goethe’s poem collapse back into the discursive order they evoke, and Pink Floyd’s song performs its own technology. But it is precisely this implosion that has an intoxicating effect, which paves the way for a more political, or at least politicized, reading of Kittler’s work that highlights his indebtedness to the cultural transgressions of the 1960s.


2014
Vol 38 (01)
pp. 102-129
Author(s):  
Alberto Martín Álvarez
Eudald Cortina Orero

Using interviews with former militants and previously unpublished documents, this article traces the genesis and internal dynamics of the Ejército Revolucionario del Pueblo (People's Revolutionary Army, ERP) in El Salvador during the early years of its existence (1970–6). This period was marked by the inability of the ERP to maintain internal coherence or any consensus on revolutionary strategy, which led to a series of splits and internal fights over control of the organisation. The evidence marshalled in this case study sheds new light on the origins of the armed Salvadorean Left and thus contributes to a wider understanding of the processes of formation and internal dynamics of armed left-wing groups that emerged from the 1960s onwards in Latin America.


Author(s):  
Richard B. Mott
John J. Friel
Charles G. Waldman

X-rays are emitted from a relatively large volume in bulk samples, limiting the smallest features which are visible in X-ray maps. Beam spreading also hampers attempts to make geometric measurements of features based on their boundaries in X-ray maps. This has prompted recent interest in using low voltages, and consequently mapping L or M lines, in order to minimize the blurring of the maps.

An alternative strategy draws on the extensive work in image restoration (deblurring) developed in space science and astronomy since the 1960s. A recent example is the restoration of images from the Hubble Space Telescope prior to its new optics. Extensive literature exists on the theory of image restoration. The simplest case and its correspondence with X-ray mapping parameters is shown in Figures 1 and 2.

When pixels much smaller than the X-ray volume are used, a small object of differing composition from the matrix generates a broad, low response. This shape corresponds to the point spread function (PSF). The observed X-ray map can be modeled as an “ideal” map, with an X-ray volume of zero, convolved with the PSF. Figure 2a shows the 1-dimensional case of a line profile across a thin layer. Figure 2b shows an idealized noise-free profile which is then convolved with the PSF to give the blurred profile of Figure 2c.
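To make the blurring model concrete, here is a minimal 1-D numerical sketch, assuming a Gaussian-shaped PSF and arbitrary parameter values (neither is specified in the article): an idealized noise-free profile across a thin layer is convolved with the PSF to give the observed, blurred profile.

    import numpy as np
    from scipy.signal import fftconvolve

    # 1-D sketch of the blurring model: observed = ideal convolved with PSF.
    # The Gaussian PSF shape and all numbers are illustrative assumptions.

    x = np.arange(256)

    # Idealized, noise-free profile across a thin layer of differing composition.
    ideal = np.zeros_like(x, dtype=float)
    ideal[120:136] = 1.0

    # PSF standing in for the X-ray generation volume (assumed Gaussian here).
    sigma = 12.0
    psf = np.exp(-0.5 * ((x - x.mean()) / sigma) ** 2)
    psf /= psf.sum()  # normalize so convolution preserves total intensity

    # Observed (blurred) profile, analogous to the blurred profile of Figure 2c.
    observed = fftconvolve(ideal, psf, mode="same")

    print("peak of ideal profile:  ", ideal.max())
    print("peak of blurred profile:", round(observed.max(), 3))

Under this model, restoration amounts to deconvolving the observed map with an estimate of the PSF; standard algorithms such as Richardson-Lucy deconvolution (available, for example, as skimage.restoration.richardson_lucy in scikit-image) are one possible route, though the excerpt does not specify which restoration method the authors use.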


1980
Vol 11 (2)
pp. 85-94
Author(s):  
Jack Damico
John W. Oller

Two methods of identifying language disordered children are examined. Traditional approaches require attention to relatively superficial morphological and surface syntactic criteria, such as noun-verb agreement, tense marking, and pluralization. More recently, however, language testers and others have turned to pragmatic criteria focusing on deeper aspects of meaning and communicative effectiveness, such as general fluency, topic maintenance, and specificity of referring terms. In this study, 54 regular K-5 teachers in two Albuquerque schools serving 1212 children were assigned on a roughly matched basis to one of two groups. Group S received in-service training using traditional surface criteria for referrals, while Group P received similar in-service training with pragmatic criteria. All referrals from both groups were reevaluated by a panel of judges following the state-determined procedures for assignment to remedial programs. Teachers who were taught to use pragmatic criteria in identifying language disordered children identified significantly more children and were more often correct in their identifications than teachers taught to use syntactic criteria. Both groups identified significantly fewer children as the grade level increased.

