INFORMS Journal on Data Science
Latest Publications


TOTAL DOCUMENTS: 4 (five years: 4)
H-INDEX: 0 (five years: 0)

Published by: Institute for Operations Research and the Management Sciences (INFORMS)
ISSN: 2694-4022, 2694-4030

Author(s):  
Ben Shneiderman

This thoughtful review productively covers the use of visualization in three operations management research activities: theory/model development, theory/model testing, and translation/conveyance. The authors’ admirable motivation is to promote the use of visualization, but I believe (1) that a wider scope would provide a stronger foundation for their encouragement and (2) that a more positive attitude would increase their chances of succeeding.


Author(s):  
Alexander Dokumentov ◽  
Rob J. Hyndman

We propose a new method for decomposing seasonal data: seasonal-trend decomposition using regression (STR). Unlike other decomposition methods, STR allows for multiple seasonal and cyclic components, covariates, seasonal patterns with noninteger periods, and seasonality with complex topology. It can be used for time series with any regular time index, including hourly, daily, weekly, monthly, or quarterly data. It is competitive with existing methods where they apply, and it handles many decomposition problems that other methods cannot. STR is based on regularized optimization and so is somewhat related to ridge regression. Because it is based on a statistical model, we can easily compute confidence intervals for the components, something that is not possible with most existing decomposition methods (such as seasonal-trend decomposition using loess, X-12-ARIMA, or SEATS-TRAMO). Our model is implemented in the R package stR, so anyone can apply it to their own data.
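The abstract's connection between STR and ridge regression can be illustrated with a toy sketch: fitting trend and seasonal components jointly in one regularized least-squares solve. This is a minimal illustration of the general idea only, not the authors' method; the real implementation is the R package stR, and the function name, basis choices, and penalty below are invented for this example.

```python
import numpy as np

def toy_str_decompose(y, period, lam=1.0):
    """Toy decomposition of y into trend + seasonal + remainder by a
    single regularized least-squares fit (in the spirit of, but far
    simpler than, STR)."""
    n = len(y)
    t = np.arange(n)
    # Trend columns: intercept and linear slope.
    trend_cols = np.column_stack([np.ones(n), t])
    # Seasonal columns: one dummy per position within the period,
    # centred so the fitted seasonal component averages to ~zero.
    season_cols = np.zeros((n, period))
    season_cols[t, t % period] = 1.0
    season_cols -= season_cols.mean(axis=0)
    X = np.hstack([trend_cols, season_cols])
    # Ridge penalty on the seasonal coefficients only; this also makes
    # the (otherwise rank-deficient) normal equations solvable.
    P = np.diag([0.0, 0.0] + [lam] * period)
    beta = np.linalg.solve(X.T @ X + P, X.T @ y)
    trend = trend_cols @ beta[:2]
    seasonal = season_cols @ beta[2:]
    return trend, seasonal, y - trend - seasonal
```

With a small penalty, the sketch recovers an exactly linear trend plus a periodic pattern almost perfectly; the real STR additionally handles smooth nonlinear trends, multiple seasonalities, covariates, and confidence intervals.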


Author(s):  
Rahul Basole ◽  
Elliot Bendoly ◽  
Aravind Chandrasekaran ◽  
Kevin Linderman

The unprecedented availability of data, along with the growing variety of software packages for visualizing it, presents both opportunities and challenges for operations management (OM) research. OM researchers typically use data to describe conditions, predict phenomena, or make prescriptions, depending on whether they are building, testing, or translating theories to practice. Used appropriately, visualization can complement and augment the researcher’s understanding at each of these stages (theory building, testing, or translating and conveying results). Used incorrectly or without sufficient consideration, however, it can yield misleading and erroneous claims. This article formally examines the benefits of visualization as a complementary method for each stage of a broader OM research strategy, drawing on frameworks and cases from extant research in different OM contexts. Our discussion offers guidance on researchers’ use of visual data renderings, particularly on avoiding the misrepresentation that can arise from incorrect use of visualization. We close with a consideration of emerging trends and their implications for researchers and practitioners, as well as recommendations for authors and reviewers, regardless of domain, on evaluating the effectiveness of visuals at each stage of research.


Author(s):  
Spyros Makridakis ◽  
Chris Fry ◽  
Fotios Petropoulos ◽  
Evangelos Spiliotis

Forecasting competitions are the equivalent of the laboratory experimentation widely used in the physical and life sciences. They provide useful, objective information that improves the theory and practice of forecasting, advancing the field, expanding its usage, and enhancing its value to decision and policy makers. We describe 10 design attributes to consider when organizing forecasting competitions, taking into account the trade-offs between optimal choices and practical concerns such as cost and the time and effort required to participate. We then map all major past competitions with respect to these design attributes, identifying similarities, differences, and design gaps, and we suggest principles for future competitions, with particular emphasis on learning as much as possible from their implementation in order to improve forecasting accuracy and the estimation of uncertainty. We argue that the task of forecasting often presents a multitude of challenges that are difficult to capture in a single contest. To assess the caliber of a forecaster, we therefore propose that organizers of future competitions consider a multicontest approach, and we suggest the idea of a forecasting-“athlon” in which different challenges of varying characteristics take place.

