Det tonale rum

Author(s):  
Karl Aage Rasmussen

The article addresses the question 'what is music?'. It is argued that a conceptualization of tonal space must take its starting point in the intersection of space and time: music is suspended in time, but time, and therefore music, cannot be thought without space. In 20th-century music a linear time predominates, in which compositions develop continuously from start to finish. But with composers such as Stravinsky, Varèse, and Satie, time is dissociated and a non-linear time crystallizes in the compositions. Continuous development is then no longer central to their compositions; instead there are discontinuities, planes, fragments, and broken chords, where time is slowed down or suddenly released. However, it is argued that this does not mean that the music becomes space or may be understood as space, but rather that the connection between the different fragments of a composition is not determined until it interacts with the listener's concept of time in a space of mental experience. The space of music thus becomes a mental-tonal space.

Author(s):  
Ray Huffaker ◽  
Marco Bittelli ◽  
Rodolfo Rosa

In the process of data analysis, the investigator often faces highly volatile, random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and that, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize the governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproduce it. There is often a misconception regarding the level of mathematics needed to understand and utilize the tools of NLTS (for instance, chaos theory). However, the mathematics used in NLTS is much simpler than that of many other areas of science, such as mathematical topology, relativity, or particle physics. Largely because of this misconception, the tools of NLTS have so far been confined mostly to the fields of mathematics and physics. Yet natural phenomena investigated in many fields have revealed deterministic nonlinear structures. In this book we aim to present the theory and empirical application of NLTS to a broader audience, making this very powerful area of science available to many scientific disciplines. The book targets students and professionals in physics, engineering, biology, agriculture, economics, and the social sciences as a textbook on Nonlinear Time Series Analysis (NLTS) using the R computer language.
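The core NLTS idea sketched in the abstract — testing whether apparently random volatility conceals deterministic structure — can be illustrated with a minimal example. This is not from the book (which uses R); it is an illustrative Python sketch, assuming a standard delay-embedding and nearest-neighbour forecast test: a deterministic chaotic series is well predicted one step ahead, while a shuffled surrogate of the same data is not.

```python
import numpy as np

def logistic_map(n, r=4.0, x0=0.2):
    """Generate a chaotic but fully deterministic series (logistic map)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

def embed(x, dim=2, tau=1):
    """Time-delay (Takens) embedding: rows are reconstructed state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def prediction_skill(x, dim=2, tau=1, train_frac=0.5):
    """One-step nearest-neighbour forecast skill (correlation between
    predicted and actual values). High skill on the data but not on a
    shuffled surrogate suggests deterministic structure."""
    X = embed(x, dim, tau)
    states, targets = X[:-1], X[1:, -1]        # predict next coordinate
    split = int(train_frac * len(states))
    preds = []
    for q in states[split:]:
        d = np.linalg.norm(states[:split] - q, axis=1)
        preds.append(targets[np.argmin(d)])    # neighbour's successor
    return np.corrcoef(preds, targets[split:])[0, 1]

rng = np.random.default_rng(0)
x = logistic_map(2000)
surrogate = rng.permutation(x)                 # destroys temporal structure
print(prediction_skill(x))                     # typically near 1: deterministic
print(prediction_skill(surrogate))             # typically near 0: no structure
```

The surrogate comparison is the key step: both series have identical histograms and variance, so any difference in forecast skill must come from temporal (dynamical) structure, not from the distribution of values.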


Urban Science ◽  
2021 ◽  
Vol 5 (2) ◽  
pp. 42
Author(s):  
Dolores Brandis García

Since the late 20th century, major European cities have exhibited large projects driven by neoliberal urban planning policies whose aim is to enhance their position on the global market. By locating these projects in central city areas, they also heighten and reinforce their privileged situation within the city as a whole, thus deepening the centre–periphery rift. The starting point for this study is the significance and scope of large projects in metropolitan cities' urban planning agendas since the final decade of the 20th century. The aim of this article is to demonstrate the correlation between the opposing conservative and progressive urban policies and the projects put forward for the city of Madrid. A study of documentary sources and of the strategies deployed by public and private agents is interpreted in the light of a process during which the city has had a succession of alternating governments defending opposing urban development models. This analysis allows us to conclude that the large-scale projects proposed predominantly under conservative policies have contributed to deepening the centre–periphery rift observed in the city.


2020 ◽  
Author(s):  
E. Priyadarshini ◽  
G. Raj Gayathri ◽  
M. Vidhya ◽  
A. Govindarajan ◽  
Samuel Chakkravarthi

Proceedings ◽  
2020 ◽  
Vol 47 (1) ◽  
pp. 55
Author(s):  
Shan Zhang

By applying concepts from natural science to the study of music, we can, on the one hand, understand the structure of music macroscopically and, on the other, reflect on the history of music to a certain extent. Throughout the history of Western music, from the Classical period to the 20th century, music seems to have moved from order to disorder, yet it remains orderly when analyzed carefully. The concept of complex information systems offers, in essence, a good answer to this question.


Author(s):  
Fatemeh Jalayer ◽  
Hossein Ebrahimian ◽  
Andrea Miano

The Italian code requires spectrum compatibility with the mean spectrum for a suite of accelerograms selected for time-history analysis. Although these requirements define minimum acceptability criteria, code-based non-linear dynamic analysis is likely to be carried out based on a limited number of records. Performance-based safety-checking provides a formal basis for addressing the record-to-record variability and the epistemic uncertainties due to the limited number of records and to the estimation of the seismic hazard curve. "Cloud Analysis" is a non-linear time-history analysis procedure that employs the structural response to un-scaled ground motion records and can be directly implemented in performance-based safety-checking. This paper interprets the code-based provisions in a performance-based key and applies further restrictions to spectrum-compatible record selection with the aim of implementing Cloud Analysis. It is shown that, by applying a closed-form coefficient, the code-based safety ratio can be transformed into a simplified performance-based safety ratio. As a proof of concept, it is shown that if the partial safety factors in the code are set to unity, this coefficient is on average slightly larger than unity. The paper provides the basis for propagating, both rigorously and in a simplified manner, the epistemic uncertainties due to the limited sample size and to the seismic hazard curve into the performance-based safety ratio. If epistemic uncertainties are considered, average code-based safety checking can prove unconservative with respect to performance-based procedures when the number of records is small. However, it is shown that performance-based safety checking is possible with no extra structural analyses.
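The regression step at the heart of Cloud Analysis can be sketched briefly. This is not the paper's implementation; it is an illustrative Python sketch on synthetic data, assuming the common formulation in which demand (EDP) from un-scaled records is regressed on an intensity measure (IM) in log-log space to obtain a median demand model and a record-to-record dispersion.

```python
import numpy as np

# Synthetic "cloud": 30 un-scaled records, each giving one (IM, EDP) pair.
# Assumed model: ln(EDP) = ln(a) + b * ln(IM) + eps, eps ~ N(0, beta^2).
rng = np.random.default_rng(42)
im = rng.lognormal(mean=-1.0, sigma=0.6, size=30)          # e.g. Sa(T1) in g
true_a, true_b, true_beta = 0.02, 1.1, 0.35                # hypothetical values
edp = true_a * im**true_b * rng.lognormal(0.0, true_beta, size=30)

# Least-squares fit in log space (the "cloud" regression).
ln_im, ln_edp = np.log(im), np.log(edp)
b, ln_a = np.polyfit(ln_im, ln_edp, 1)                     # slope, intercept
resid = ln_edp - (ln_a + b * ln_im)
beta = np.sqrt(np.sum(resid**2) / (len(resid) - 2))        # dispersion estimate

def median_demand(im_val):
    """Median EDP predicted by the fitted power-law demand model."""
    return np.exp(ln_a) * im_val**b

print(b, np.exp(ln_a), beta)
```

Because the records are un-scaled, each point carries its natural record-to-record variability, and the residual dispersion `beta` is exactly the quantity the paper's performance-based safety ratio needs alongside the hazard curve.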

