Time oddity: paradoxes and the gothic

Literartes ◽  
2021 ◽  
Vol 1 (15) ◽  
pp. 243-260
Author(s):  
Vinicius Bril Rocatelli ◽  
Cido Rossi

From the post-structuralist perspective of the hybridity of genre-modes, this article explores the connection between Science Fiction and the Gothic. More specifically, starting from Brian Aldiss and David Wingrove's idea, in Trillion Year Spree (1986), that Science Fiction was created by the Gothic, we argue that Science Fiction can also create the Gothic. Specifically, we argue that, through the paradoxes of narratives built on the concept of non-linear time travel, Science Fiction creates Gothic effects and leads to Gothic results, since the very idea of the paradox is inherently a reaction against the Enlightenment's scientism and rationalism and is, therefore, Gothic in its very uncanny conception.

12 Monkeys ◽  
2019 ◽  
pp. 17-28
Author(s):  
Susanne Kord

This chapter analyses Terry Gilliam's drawing for the opening credits of 12 Monkeys, which shows monkeys following each other in a never-ending spiral. The drawing is the film's heftiest hint at the one point without which it cannot be properly understood: the film does not present time as linear. The chapter also discusses how non-linear time is now associated with time travel and futuristic scenarios, although the idea has been around for centuries. It mentions the Aztec calendar, to which Gilliam's drawing bears some resemblance and which relied on an interconnected tripartite system that assigned each day a unique combination of name, number, symbol and patron deity. Finally, it turns to the film's heated debate about linear time, which becomes the acid test for the entire film.


Author(s):  
Ray Huffaker ◽  
Marco Bittelli ◽  
Rodolfo Rosa

In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes does not necessarily represent the nature of the processes under investigation and that, when other tools are used, deterministic features emerge. Non-Linear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic non-linear behavior, and to rigorously characterize the governing dynamics. Behavioral patterns detected by NLTS, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproduce it. There is often a misconception regarding the complexity of the mathematics needed to understand and utilize the tools of NLTS (for instance, chaos theory); in fact, the mathematics used in NLTS is much simpler than that of many other subjects of science, such as mathematical topology, relativity or particle physics. Because of this misconception, the tools of NLTS have so far been confined mostly to the fields of mathematics and physics, even though natural phenomena investigated in many fields have been revealing deterministic non-linear structures. In this book we aim to present the theory and empirical applications of NLTS to a broader audience, making this very powerful area of science available to many scientific areas. The book targets students and professionals in physics, engineering, biology, agriculture, economics and the social sciences as a textbook in Non-Linear Time Series Analysis (NLTS) using the R computer language.
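To make the idea concrete, here is a minimal sketch of the kind of test NLTS performs (written in Python rather than the book's R, using a toy logistic-map series; all function names and parameters below are ours, not the book's): a nearest-neighbor forecast in delay-embedded space predicts a deterministic chaotic series far better than a shuffled surrogate with the same value distribution.

```python
import numpy as np

# Logistic map: deterministic but random-looking toy data
# (r = 4 gives fully developed chaos).
def logistic_series(n, r=4.0, x0=0.3):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def delay_embed(x, dim=2, tau=1):
    """Takens-style delay embedding: rows are (x_t, x_{t+tau}, ...)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def prediction_error(x, dim=2, tau=1):
    """Mean one-step nearest-neighbor forecast error in embedded space.
    A low error relative to a shuffled surrogate hints at determinism."""
    emb = delay_embed(x, dim, tau)
    pts, targets = emb[:-1], x[(dim - 1) * tau + 1 :]
    errs = []
    for i in range(len(pts)):
        d = np.linalg.norm(pts - pts[i], axis=1)
        d[i] = np.inf  # exclude the point itself
        j = np.argmin(d)
        errs.append(abs(targets[i] - targets[j]))
    return np.mean(errs)

rng = np.random.default_rng(0)
x = logistic_series(1000)
surrogate = rng.permutation(x)  # destroys dynamics, keeps the distribution
print("chaotic series error :", prediction_error(x))
print("shuffled surrogate   :", prediction_error(surrogate))
```

A large gap between the two errors is the signature of hidden determinism that a purely stochastic model would miss.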


2020 ◽  
Author(s):  
E. Priyadarshini ◽  
G. Raj Gayathri ◽  
M. Vidhya ◽  
A. Govindarajan ◽  
Samuel Chakkravarthi

Author(s):  
Fatemeh Jalayer ◽  
Hossein Ebrahimian ◽  
Andrea Miano

The Italian code requires spectrum compatibility with the mean spectrum for a suite of accelerograms selected for time-history analysis. Although these requirements define minimum acceptability criteria, code-based non-linear dynamic analysis is likely to be carried out with a limited number of records. Performance-based safety-checking provides a formal basis for addressing record-to-record variability and the epistemic uncertainties due to the limited number of records and to the estimation of the seismic hazard curve. "Cloud Analysis" is a non-linear time-history analysis procedure that employs the structural response to un-scaled ground-motion records and can be directly implemented in performance-based safety-checking. This paper interprets the code-based provisions in a performance-based key and applies further restrictions to spectrum-compatible record selection in order to implement Cloud Analysis. It is shown that, by multiplying by a closed-form coefficient, the code-based safety ratio can be transformed into a simplified performance-based safety ratio. As a proof of concept, it is shown that if the partial safety factors in the code are set to unity, this coefficient is on average slightly larger than unity. The paper provides the basis for propagating the epistemic uncertainties due to the limited sample size and to the seismic hazard curve to the performance-based safety ratio, both in a rigorous and in a simplified manner. If epistemic uncertainties are considered, average code-based safety checking can end up being unconservative with respect to performance-based procedures when the number of records is small. Nevertheless, it is shown that performance-based safety checking is possible with no extra structural analyses.
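As an illustration of the cloud procedure the abstract refers to, here is a minimal sketch of the standard cloud regression: a power law fitted between an intensity measure (IM) and a structural demand (EDP) obtained from non-linear runs on un-scaled records. The IM/EDP pairs below are made up, and the paper's closed-form coefficient and safety-ratio transformation are not reproduced here.

```python
import numpy as np

# Hypothetical cloud data: spectral acceleration IM (g) and peak drift
# demand EDP (% drift) from non-linear time-history runs on un-scaled records.
im  = np.array([0.12, 0.18, 0.25, 0.31, 0.42, 0.55, 0.63, 0.78, 0.91, 1.10])
edp = np.array([0.15, 0.22, 0.35, 0.38, 0.60, 0.72, 0.95, 1.10, 1.40, 1.80])

# Cloud regression in log-log space: ln(EDP) = ln(a) + b * ln(IM)
ln_im, ln_edp = np.log(im), np.log(edp)
b, ln_a = np.polyfit(ln_im, ln_edp, 1)

# Record-to-record dispersion: std. dev. of the regression residuals
resid = ln_edp - (ln_a + b * ln_im)
beta = np.sqrt(np.sum(resid**2) / (len(im) - 2))

print(f"median model: EDP = {np.exp(ln_a):.3f} * IM^{b:.3f}")
print(f"dispersion beta = {beta:.3f}")
```

The fitted median model and the dispersion beta are the two quantities a performance-based safety check needs in order to account for record-to-record variability.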


Author(s):  
Kasper Wåsjø ◽  
Terje P. Stavang ◽  
Tore H. Søreide

Experience from model tests has drawn growing attention to extreme wave slam as a critical load situation for offshore large-volume structures. Much of the problem is related to the local slam pressure, which may reach several MPa for 100-year and 10,000-year waves. The paper deals with modelling techniques for marine concrete structures under extreme slam loading from waves, where dynamic effects together with material softening play a major role in the response. Different analysis approaches for ultimate limit state (ULS) and accidental limit state (ALS) control are discussed in view of the reliability philosophy underlying the conventional design approach. The present paper is devoted to the local impact scenario and to alternative approaches for response and capacity control involving non-linear time-domain analyses. Conventional design schemes, based on linear-elastic models for response calculation together with code-specified capacity control, often come out more conservative than the non-linear approach. The paper demonstrates through case studies how softening of the structure in general reduces the response in terms of section forces. A key issue when moving from conventional linear approaches to non-linear techniques is to retain an acceptable reliability level in the capacity control: load and material factors are normally calibrated for structures with limited non-linearity, for which linear response modelling is representative. Implementing a non-linear material model in time-domain analysis poses a major challenge in limiting the sensitivity of the response and capacity calculations. The paper demonstrates how the material model of the concrete affects the section forces that go into the local capacity control, and concludes on the needed sensitivity analyses. Practical approaches to the concrete slam problem, together with the resulting utilizations from the control, are demonstrated. The fully non-linear technique, with response and capacity control in one analysis, is also handled, using average material parameters and justifying safety factors for the effect of implementing the characteristic lower strength of concrete in the capacity. The paper ends with a recommended non-linear time-domain analysis procedure for typical slam problems, and a discussion of applicable design codes with attention to non-linear analysis.
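The core point, that material softening caps the section forces a linear model would predict, can be sketched with a single-degree-of-freedom toy model under a short slam pulse. All parameters below are hypothetical and stand in for the detailed concrete FE models the paper actually uses; the elastic-perfectly-plastic restoring force is a crude stand-in for concrete softening.

```python
import numpy as np

def sdof_response(p, dt, m, c, k, fy=np.inf):
    """Explicit time-stepping of m*a + c*v + fs = p(t), with an
    elastic-perfectly-plastic restoring force capped at +/- fy."""
    u = v = fs = 0.0
    u_pk = f_pk = 0.0
    for pi in p:
        a = (pi - c * v - fs) / m
        v += a * dt
        du = v * dt
        u += du
        fs = float(np.clip(fs + k * du, -fy, fy))  # non-linear material update
        u_pk = max(u_pk, abs(u))
        f_pk = max(f_pk, abs(fs))
    return u_pk, f_pk

# Hypothetical local wall panel hit by a triangular slam pulse
m, k, zeta = 2.0e4, 5.0e8, 0.05              # mass [kg], stiffness [N/m], damping
c = 2.0 * zeta * np.sqrt(k * m)
fy = 1.5e6                                    # assumed section capacity [N]

dt = 1.0e-5
t = np.arange(0.0, 0.2, dt)
p = np.interp(t, [0.0, 0.005, 0.010], [0.0, 3.0e6, 0.0])  # 3 MN peak, 10 ms

u_el, f_el = sdof_response(p, dt, m, c, k)          # linear-elastic model
u_nl, f_nl = sdof_response(p, dt, m, c, k, fy=fy)   # softening model

print(f"elastic   : u_max = {u_el*1e3:.1f} mm, F_max = {f_el/1e6:.2f} MN")
print(f"non-linear: u_max = {u_nl*1e3:.1f} mm, F_max = {f_nl/1e6:.2f} MN")
```

The elastic run overshoots the assumed capacity, while the softening run trades the excess force for additional displacement, mirroring the response reduction the paper reports from its case studies.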

