Quantitative Approaches

Author(s):  
Andrew R. Hom

Chapter six confronts the hard case of quantitative research, which seems firmly based on “timeless” mathematical formulas and Western standard time. It thus appears most resistant to interpretation as a narrative timing project. This chapter excavates quantitative IR’s temporal assumptions and dependencies, with illustrations drawn from international conflict research. It argues that dominant statistical models and the statistical approach in general work to tame overly temporal phenomena by constructing narrative and poetic links to eternal logic. After tracing the narrative timing techniques embedded in IR’s quantitative workhorse, the general linear model, it shows how putatively “time-sensitive” techniques like time series analysis and event hazard models still treat time as a problem for sound knowledge development. The chapter closes by highlighting the distinctly temporal moves made in the recent Bayesian turn, which suggest that instead of relying on passive timing meters, quantitative research must remain in an active timing mode much closer to lived time than the scientific laboratory.

2021 ◽  
Author(s):  
Hugo Abreu Mendes ◽  
João Fausto Lorenzato Oliveira ◽  
Paulo Salgado Gomes Mattos Neto ◽  
Alex Coutinho Pereira ◽  
Eduardo Boudoux Jatoba ◽  
...  

Within the context of clean energy generation, solar radiation forecasting is applied at photovoltaic plants to increase maintainability and reliability. Statistical time series models such as ARIMA, together with machine learning techniques, help to improve the results, and hybrid statistical + ML approaches appear in all sorts of time series forecasting applications. This work presents a new way to automate SARIMAX modeling by nesting PSO and ACO optimization algorithms; unlike R's AutoARIMA, it searches for the optimal seasonality parameter and the best combination of the available exogenous variables. The work also presents two distinct hybrid models that have MLPs as their main elements, with architectures optimized by a genetic algorithm. The results were obtained with a common methodology and compared against LSTM, CLSTM, MMFF and NARNN-ARMAX topologies found in recent works. The results for the presented models are promising for use in automatic radiation forecasting systems, since they outperformed the compared models on at least two metrics.
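The search space described here can be sketched with statsmodels' SARIMAX. The snippet below is a minimal illustration, assuming a pandas series y (radiation) and an exogenous DataFrame X; it uses a plain random search as a stand-in for the nested PSO/ACO optimizers, and the function names, AIC selection criterion, and candidate seasonalities are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_sarimax(y, X, order, seasonal_order, exog_cols):
    """Fit one candidate SARIMAX configuration and return its AIC."""
    exog = X[list(exog_cols)] if exog_cols else None
    model = SARIMAX(y, exog=exog, order=order, seasonal_order=seasonal_order,
                    enforce_stationarity=False, enforce_invertibility=False)
    return model.fit(disp=False).aic

def random_search(y, X, n_iter=30, seed=0):
    """Random search over (p, d, q), seasonal (P, D, Q, s) and exogenous
    subsets -- a simple stand-in for the nested PSO/ACO search."""
    rng = np.random.default_rng(seed)
    cols = list(X.columns)
    best = (np.inf, None)
    for _ in range(n_iter):
        order = tuple(int(v) for v in rng.integers(0, 3, size=3))   # p, d, q
        seasonal = (*(int(v) for v in rng.integers(0, 2, size=3)),  # P, D, Q
                    int(rng.choice([12, 24])))                      # candidate seasonality s
        subset = tuple(c for c in cols if rng.random() < 0.5)       # exogenous combination
        try:
            aic = fit_sarimax(y, X, order, seasonal, subset)
        except (ValueError, np.linalg.LinAlgError):
            continue                                                # skip ill-posed candidates
        if aic < best[0]:
            best = (aic, (order, seasonal, subset))
    return best
```

With hourly irradiance data, y would be the radiation series, X the weather covariates, and s = 24 would capture the daily cycle; a swarm optimizer would explore the same candidate space more efficiently than the random draws used here.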


2019 ◽  
Vol 37 (1) ◽  
pp. 3-15
Author(s):  
Benjamin O Fordham

This essay examines the relationship between history and the quantitative study of international conflict. The usual distinction between these two pursuits does not hold up to close scrutiny. In fact, both research communities are in the business of using theory to explain social processes that occur within historical bounds. Making these historical bounds explicit is an appropriate response to the nature of our subject matter. Doing so also has some important advantages, including more precise theory, higher quality data, better model specification, and the potential to help explain important historical events.


2005 ◽  
Vol 50 (01) ◽  
pp. 1-8 ◽  
Author(s):  
PETER M. ROBINSON

A great deal of time series data is recorded on economic and financial variables. Statistical modeling of such data is now very well developed, and has applications in forecasting. We review a variety of statistical models from the viewpoint of "memory", or strength of dependence across time, which is a helpful discriminator between different phenomena of interest. Both linear and nonlinear models are discussed.
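As a concrete illustration of "memory" (not drawn from the review itself), the sketch below contrasts the geometrically decaying autocorrelations of a short-memory AR(1) process with the much slower, hyperbolic decay of a long-memory fractionally integrated ARFIMA(0, d, 0) series; the simulation settings and the use of statsmodels' acf are illustrative assumptions.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(1)
n, lags = 5000, 50

# Short memory: x_t = 0.5 * x_{t-1} + e_t, autocorrelations decay geometrically.
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]

# Long memory: y_t = sum_k psi_k * e_{t-k} with psi_k = Gamma(k+d) / (Gamma(d) k!),
# i.e. ARFIMA(0, d, 0) with d = 0.4; autocorrelations decay like k^(2d-1).
d = 0.4
psi = np.ones(n)
for k in range(1, n):
    psi[k] = psi[k - 1] * (k - 1 + d) / k
y = np.convolve(rng.standard_normal(n), psi)[:n]

print(acf(x, nlags=lags)[[1, 10, 50]])   # drops toward zero quickly
print(acf(y, nlags=lags)[[1, 10, 50]])   # decays much more slowly
```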


Author(s):  
Peter Hedström

This article emphasizes various ways by which the study of mechanisms can make quantitative research more useful for causal inference. It concentrates on three aspects of the role of mechanisms in causal and statistical inference: how an understanding of the mechanisms at work can improve statistical inference by guiding the specification of the statistical models to be estimated; how mechanisms can strengthen causal inferences by improving our understanding of why individuals do what they do; and how mechanism-based models can strengthen causal inferences by showing why, acting as they do, individuals bring about the social outcomes they do. There has been a surge of interest in mechanism-based explanations, in political science as well as in sociology. Most of this work has been vital and valuable in that it has sought to clarify the distinctiveness of the approach and to apply it empirically.


2010 ◽  
Vol 22 (7) ◽  
pp. 1927-1959 ◽  
Author(s):  
Ming-Jie Zhao ◽  
Herbert Jaeger

Hidden Markov models (HMMs) are one of the most popular and successful statistical models for time series. Observable operator models (OOMs) are generalizations of HMMs that exhibit several attractive advantages. In particular, a variety of highly efficient, constructive, and asymptotically correct learning algorithms are available for OOMs. However, the OOM theory suffers from the negative probability problem (NPP): a given, learned OOM may sometimes predict negative probabilities for certain events. It was recently shown that it is undecidable whether a given OOM will eventually produce such negative values. We propose a novel variant of OOMs, called norm-observable operator models (NOOMs), which avoid the NPP by design. Like OOMs, NOOMs use a set of linear operators to update system states. But differing from OOMs, they represent probabilities by the square of the norm of system states, thus precluding negative probability values. While being free of the NPP, NOOMs retain most advantages of OOMs. For example, NOOMs also capture (some) processes that cannot be modeled by HMMs. More importantly, in principle, NOOMs can be learned from data in a constructive way, and the learned models are asymptotically correct. We also prove that NOOMs capture all Markov chain (MC) describable processes. This letter presents the mathematical foundations of NOOMs, discusses the expressiveness of the model class, and explains how a NOOM can be estimated from data constructively.
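The squared-norm construction can be illustrated numerically. The sketch below is an illustrative assumption, not the paper's estimation algorithm: it builds operators tau_a whose blocks stack into a matrix with orthonormal columns, one sufficient way to obtain sum_a tau_a^T tau_a = I (a normalization of this kind is needed so that the probabilities of all sequences of a given length sum to one; the paper's exact conditions may differ), and checks that P(a_1 ... a_n) = ||tau_{a_n} ... tau_{a_1} x_0||^2 is nonnegative and normalized.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 2                                   # state dimension, alphabet size

# Stack the operators into an (m*d) x d matrix with orthonormal columns, so
# that sum_a tau_a^T tau_a = Q^T Q = I by construction.
Q, _ = np.linalg.qr(rng.standard_normal((m * d, d)))
tau = [Q[a * d:(a + 1) * d, :] for a in range(m)]

x0 = np.zeros(d)
x0[0] = 1.0                                   # unit-norm initial state

def seq_prob(seq):
    """P(a_1 ... a_n) = || tau_{a_n} ... tau_{a_1} x0 ||^2 (squared norm)."""
    x = x0
    for a in seq:
        x = tau[a] @ x
    return float(x @ x)

probs = [seq_prob(s) for s in itertools.product(range(m), repeat=3)]
print(min(probs) >= 0.0)                      # squared norms are never negative
print(np.isclose(sum(probs), 1.0))            # length-3 probabilities sum to one
```

Because every probability is a squared norm, negative values are excluded by design, which is the point of the NOOM variant; an OOM, by contrast, computes probabilities as linear functionals of the state and can therefore return negative numbers.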

