Trend Mining for Predictive Product Design

Author(s):  
Conrad S. Tucker ◽  
Harrison M. Kim

The Preference Trend Mining (PTM) algorithm that we propose in this work aims to address some fundamental challenges of current demand modeling techniques employed in the product design community. The first contribution is a multistage predictive modeling approach that captures changes in consumer preferences (as they relate to product design) over time, thereby enabling design engineers to anticipate next generation product features before they become mainstream/unimportant. Because consumer preferences may exhibit monotonically increasing or decreasing, seasonal, or unobservable trends, we propose employing a statistical trend detection technique to help detect time series attribute patterns. A time series exponential smoothing technique is then used to forecast future attribute trend patterns and generate a demand model that reflects emerging product preferences over time. The second contribution of this work is a novel classification scheme for attributes that have low predictive power and hence may be omitted from a predictive model. We propose classifying such attributes as obsolete, nonstandard, or standard, with the appropriate classification assigned based on the time series entropy values that an attribute exhibits. By modeling attribute irrelevance, design engineers can determine when to retire certain product features (deemed obsolete) or incorporate others into the actual product architecture (standard), while developing modules for those attributes exhibiting inconsistent patterns over time (nonstandard). A cell phone example containing 12 time stamped data sets (January 2009-December 2009) is used to validate the proposed Preference Trend Mining model and compare it to traditional demand modeling techniques for predictive accuracy and ease of model generation.
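As a rough illustration of the forecasting step described above, the sketch below applies Holt's linear exponential smoothing to a hypothetical monthly attribute-relevance series and projects it one step ahead. It is not the authors' implementation; the attribute, data values, and smoothing constants are assumptions.

```python
# Minimal sketch of the trend-forecasting step: Holt's (double) exponential
# smoothing applied to a hypothetical monthly attribute-relevance series.
# The data, attribute name, and smoothing constants are illustrative
# assumptions, not values from the paper.

def holt_forecast(series, alpha=0.4, beta=0.2, horizon=1):
    """Return a `horizon`-step-ahead forecast using Holt's linear trend method."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

# Hypothetical 12-month relevance scores for a "touchscreen" attribute (Jan-Dec 2009).
touchscreen_relevance = [0.21, 0.24, 0.26, 0.30, 0.33, 0.37,
                         0.41, 0.44, 0.49, 0.53, 0.58, 0.62]
print(holt_forecast(touchscreen_relevance))  # projected relevance for the next month
```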

2011 ◽  
Vol 133 (11) ◽  
Author(s):  
Conrad S. Tucker ◽  
Harrison M. Kim

The Preference Trend Mining (PTM) algorithm proposed in this work aims to address some fundamental challenges of current demand modeling techniques employed in the product design community. The first contribution is a multistage predictive modeling approach that captures changes in consumer preferences (as they relate to product design) over time, thereby enabling design engineers to anticipate next generation product features before they become mainstream/unimportant. Because consumer preferences may exhibit monotonically increasing or decreasing, seasonal, or unobservable trends, we propose employing a statistical trend detection technique to help detect time series attribute patterns. A time series exponential smoothing technique is then used to forecast future attribute trend patterns and generate a demand model that reflects emerging product preferences over time. The second contribution of this work is a novel classification scheme for attributes that have low predictive power and hence may be omitted from a predictive model. We propose classifying such attributes as standard, nonstandard, or obsolete, assigning the appropriate classification based on the time series entropy values that an attribute exhibits. By modeling attribute irrelevance, design engineers can determine when to retire certain product features (deemed obsolete) or incorporate others into the actual product architecture (standard), while developing modules for those attributes exhibiting inconsistent patterns over time (nonstandard). Several time series data sets using publicly available data are used to validate the proposed Preference Trend Mining model and compare it to traditional demand modeling techniques for predictive accuracy and ease of model generation.
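As a companion illustration of the attribute-irrelevance idea, the sketch below computes the Shannon entropy of an attribute's value distribution in each time-stamped data set and applies assumed rules to tag the attribute as standard, nonstandard, or obsolete. The thresholds and classification rules are illustrative assumptions, not the published PTM criteria.

```python
# Illustrative sketch (not the published PTM rules): classify a low-relevance
# attribute as standard, nonstandard, or obsolete from the Shannon entropy of
# its value distribution in each time-stamped data set. The thresholds and
# rules below are assumptions for demonstration only.
from collections import Counter
from math import log2

def shannon_entropy(values):
    """Entropy (bits) of the empirical distribution of attribute values."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def classify_attribute(monthly_value_lists, low=0.3, high=0.9):
    entropies = [shannon_entropy(vals) for vals in monthly_value_lists]
    if all(h <= low for h in entropies):
        return "standard"      # nearly every product shares one value throughout
    if all(h >= high for h in entropies):
        return "nonstandard"   # values stay highly mixed across products
    if entropies[0] >= high and entropies[-1] <= low:
        return "obsolete"      # diversity collapses over time (feature fading out)
    return "inconclusive"

# Hypothetical monthly observations of a binary "physical keyboard" attribute.
months = [["yes"] * 6 + ["no"] * 4, ["yes"] * 3 + ["no"] * 7, ["no"] * 10]
print(classify_attribute(months))  # -> "obsolete" for this made-up pattern
```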


2000 ◽  
Vol 32 (1) ◽  
pp. 1-9 ◽  
Author(s):  
Mark G. Brown ◽  
Jonq-Ying Lee

Abstract This study extends Barten's synthetic demand modeling approach to increase the flexibility of the uniform substitute specification of the Rotterdam demand system. Marginal propensities to consume (MPCs) vary with budget shares, and Slutsky coefficients are defined in terms of the varying MPCs. An application of the model to orange-juice products shows that the pattern of income and price elasticities over time differs markedly from that obtained when MPCs are restricted to be constant.
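For readers unfamiliar with the Rotterdam system, a minimal sketch of its standard constant-coefficient form is given below in conventional notation; the extension described above lets the marginal propensities to consume vary with budget shares rather than remain fixed.

```latex
% Sketch of the standard (constant-coefficient) Rotterdam demand system in
% conventional notation; not the authors' extended specification.
%   \bar{w}_{it} : average budget share of good i in period t
%   \theta_i     : marginal propensity to consume (MPC) on good i
%   \pi_{ij}     : Slutsky (compensated price) coefficients
\begin{align}
  \bar{w}_{it}\,\Delta\ln q_{it}
    &= \theta_i\,\Delta\ln Q_t + \sum_{j}\pi_{ij}\,\Delta\ln p_{jt} + \varepsilon_{it},\\
  \Delta\ln Q_t &= \sum_{i}\bar{w}_{it}\,\Delta\ln q_{it},
\end{align}
% with adding-up, homogeneity, and symmetry restrictions
% $\sum_i \theta_i = 1$, $\sum_j \pi_{ij} = 0$, and $\pi_{ij} = \pi_{ji}$.
% The extension above replaces the constant $\theta_i$ with share-dependent
% $\theta_{it}$ and defines the Slutsky terms in terms of the varying MPCs.
```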


2014 ◽  
Vol 136 (6) ◽  
Author(s):  
Jungmok Ma ◽  
Harrison M. Kim

Product and design analytics is emerging as a promising area for the analysis of large-scale data and the use of the extracted knowledge for the design of optimal systems. The continuous preference trend mining (CPTM) algorithm and application proposed in this study address some fundamental challenges in the context of product and design analytics. The first contribution is the development of a new predictive trend mining technique that captures a hidden trend of customer purchase patterns from accumulated transactional data. Unlike traditional, static data mining algorithms, the CPTM does not assume stationarity but dynamically extracts valuable knowledge from customers over time. By generating trend embedded future data, the CPTM algorithm not only shows higher prediction accuracy in comparison with well-known static models but also provides essential properties that could not be achieved with previously proposed models: utilizing historical data selectively, avoiding an over-fitting problem, identifying performance information of a constructed model, and allowing a numeric prediction. The second contribution is the formulation of the initial design problem, which can reveal an opportunity for multiple profit cycles. This mathematical formulation enables design engineers to optimize product design over multiple life cycles while reflecting customer preferences and technological obsolescence using the CPTM algorithm. For illustration, the developed framework is applied to an example of tablet PC design in a leasing market, and the result shows that an optimal design can be determined over multiple life cycles.
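As an illustration of the "trend embedded future data" idea, the sketch below fits a simple linear trend to each attribute level's historical purchase share, extrapolates one period ahead, and samples a synthetic future data set from the extrapolated shares. It is a sketch under assumed data and rules, not the CPTM algorithm itself.

```python
# Illustrative sketch of generating "trend embedded" future training data:
# fit a linear trend to each attribute level's purchase share over past
# periods, extrapolate one period ahead, and sample a synthetic future data
# set from the extrapolated shares. Data, levels, and sample size are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Historical share of purchases by screen size (hypothetical, 6 past periods).
history = {
    "7-inch": [0.50, 0.46, 0.43, 0.38, 0.35, 0.31],
    "10-inch": [0.50, 0.54, 0.57, 0.62, 0.65, 0.69],
}

def extrapolated_shares(history, horizon=1):
    """Least-squares linear extrapolation of each level's share, renormalized."""
    t = np.arange(len(next(iter(history.values()))))
    raw = {}
    for level, shares in history.items():
        slope, intercept = np.polyfit(t, shares, 1)
        raw[level] = max(slope * (t[-1] + horizon) + intercept, 0.0)
    total = sum(raw.values())
    return {level: v / total for level, v in raw.items()}

future_shares = extrapolated_shares(history)
levels, probs = zip(*future_shares.items())
synthetic_future = rng.choice(levels, size=1000, p=probs)  # trend-embedded sample
print(future_shares, np.unique(synthetic_future, return_counts=True))
```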


SAGE Open ◽  
2021 ◽  
Vol 11 (1) ◽  
pp. 215824402199836
Author(s):  
Tarek Ismail Mohamed

This article focuses on applying ethics to product features during students' design education. The term good/bad design is a conventional way of discussing the ethical/unethical design values of products. Different aspects of product design, such as visual information design, interface design, and appearance design, play a vital role in judging the level of ethics embodied in a product. Students of product design therefore need to engage with the notion of ethical/unethical design during their studies, because designers influence society more than they might imagine. This influence is exerted by creating an attractive, well-organized appearance and well-executed functions that convey an ethical brand image to customers. Interviews and discussions were held with product design students at several institutions, as well as with design experts and customers, to gather their opinions on the design values that achieve ethical dimensions in product design, so that students can ultimately create products that carry ethical values in their design. The article's results rank the different design values in descending order of their importance in emphasizing the ethical aspects of products, and provide a checklist of questions that can help designers become more aware of ethical considerations in product design, because ethics is a process of learning, not of obedience. The article also highlights the notion of the ethical designer, which in turn reflects on the ethics of customers and societies.


2010 ◽  
Vol 67 (6) ◽  
pp. 1185-1197 ◽  
Author(s):  
C. Fernández ◽  
S. Cerviño ◽  
N. Pérez ◽  
E. Jardim

Abstract Fernández, C., Cerviño, S., Pérez, N., and Jardim, E. 2010. Stock assessment and projections incorporating discard estimates in some years: an application to the hake stock in ICES Divisions VIIIc and IXa. – ICES Journal of Marine Science, 67: 1185–1197. A Bayesian age-structured stock assessment model is developed to take into account available information on discards and to handle gaps in the time-series of discard estimates. The model incorporates mortality attributable to discarding, and appropriate assumptions about how this mortality may change over time are made. The result is a stock assessment that accounts for information on discards while, at the same time, producing a complete time-series of discard estimates. The method is applied to the hake stock in ICES Divisions VIIIc and IXa, for which the available data indicate that some 60% of the individuals caught are discarded. The stock is fished by Spain and Portugal, and for each country, there are discard estimates for recent years only. Moreover, the years for which Portuguese estimates are available are only a subset of those with Spanish estimates. Two runs of the model are performed: one assuming zero discards and another incorporating discards. When discards are incorporated, estimated recruitment and fishing mortality for young (discarded) ages increase, resulting in lower values of the biological reference points Fmax and F0.1 and, generally, more optimistic future stock trajectories under F-reduction scenarios.
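To make the role of discard mortality concrete, the generic Baranov-type sketch below splits fishing mortality at age into landed and discarded components via an assumed discard fraction and computes the resulting catch, landings, and discards. It is not the authors' Bayesian assessment model, and all numbers are hypothetical.

```python
# Generic illustration (not the paper's Bayesian assessment model): split
# fishing mortality at age into landed and discarded components via an
# assumed discard fraction, then apply the Baranov catch equation.
# All numbers are hypothetical.
import numpy as np

N = np.array([1000., 600., 350., 200., 110., 60.])   # numbers-at-age (thousands)
M = 0.4                                              # natural mortality rate
F = np.array([0.10, 0.35, 0.55, 0.60, 0.60, 0.60])   # fishing mortality-at-age
d = np.array([0.90, 0.60, 0.20, 0.05, 0.00, 0.00])   # assumed discard fraction-at-age

Z = F + M                                            # total mortality
catch = N * (F / Z) * (1.0 - np.exp(-Z))             # Baranov catch equation
discards = d * catch                                 # individuals discarded at age
landings = (1.0 - d) * catch                         # individuals landed at age

print("total catch:", round(catch.sum(), 1))
print("share discarded:", round(discards.sum() / catch.sum(), 2))
```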


2020 ◽  
Vol 94 ◽  
Author(s):  
A.L. May-Tec ◽  
N.A. Herrera-Castillo ◽  
V.M. Vidal-Martínez ◽  
M.L. Aguirre-Macedo

Abstract We present a time series of 13 years (2003–2016) of continuous monthly data on the prevalence and mean abundance of the trematode Oligogonotylus mayae for all the hosts involved in its life cycle. We aimed to determine whether annual (or longer than annual) environmental fluctuations affect these infection parameters of O. mayae in its intermediate snail host Pyrgophorus coronatus, and its second and definitive fish host Mayaheros urophthalmus from the Celestun tropical coastal lagoon, Yucatan, Mexico. Fourier time series analysis was used to identify infection peaks over time and to assess cross-correlations between environmental forcings and infection parameters. Our results suggest that the transmission of O. mayae in all its hosts was influenced by the annual patterns of temperature, salinity and rainfall. However, there was a biannual accumulation of metacercarial stages of O. mayae in M. urophthalmus, apparently associated with the temporal range of the El Niño-Southern Oscillation (five years) and the recovery of the trematode population after a devastating hurricane. Taking O. mayae as an example of what could be happening to other trematodes, it is becoming clear that environmental forcings acting at long-term temporal scales affect the population dynamics of these parasites.
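As a rough illustration of the kind of analysis described above (not the study's exact workflow), the sketch below uses a periodogram to identify the dominant periodicity in a simulated monthly prevalence series and a lagged cross-correlation against a simulated temperature forcing; all series and parameters are assumptions.

```python
# Illustrative sketch (not the study's exact analysis): periodogram of a
# simulated monthly prevalence series plus a lagged cross-correlation against
# a simulated seasonal temperature forcing. All values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(13 * 12)                              # 13 years of monthly samples
temperature = 26 + 4 * np.sin(2 * np.pi * months / 12)   # assumed seasonal forcing
prevalence = (0.3 + 0.1 * np.sin(2 * np.pi * (months - 2) / 12)
              + 0.02 * rng.standard_normal(months.size)) # responds ~2 months later

# Periodogram: dominant period of the mean-removed prevalence series.
spectrum = np.abs(np.fft.rfft(prevalence - prevalence.mean())) ** 2
freqs = np.fft.rfftfreq(months.size, d=1.0)              # cycles per month
dominant_period = 1.0 / freqs[np.argmax(spectrum[1:]) + 1]
print("dominant period (months):", round(dominant_period, 1))

# Lagged cross-correlation: how many months does temperature lead prevalence?
def lagged_corr(x, y, lag):
    """Correlation between x at time t and y at time t + lag (lag >= 0)."""
    return np.corrcoef(x[: x.size - lag], y[lag:])[0, 1] if lag else np.corrcoef(x, y)[0, 1]

best_lag = max(range(7), key=lambda k: lagged_corr(temperature, prevalence, k))
print("temperature leads prevalence by ~", best_lag, "months")
```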


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Hitoshi Iuchi ◽  
Michiaki Hamada

Abstract Time-course experiments using parallel sequencers have the potential to uncover gradual changes in cells over time that cannot be observed in a two-point comparison. An essential step in time-series data analysis is the identification of temporal differentially expressed genes (TEGs) under two conditions (e.g. control versus case). Model-based approaches, which are typical TEG detection methods, often set one parameter (e.g. polynomial degree or degrees of freedom) per dataset. This approach risks modeling linearly increasing genes with higher-order functions, or fitting cyclic gene expression with linear functions, leading to false positives/negatives. Here, we present a Jonckheere–Terpstra–Kendall (JTK)-based non-parametric algorithm for TEG detection. Benchmarks using simulation data show that the JTK-based approach outperforms existing methods, especially in long time-series experiments. Additionally, application of JTK to time-series RNA-seq data from seven tissue types, across developmental stages in mouse and rat, suggested that the wave pattern over time, rather than the difference in expression levels, drives TEG identification by JTK. This result suggests that JTK is a suitable algorithm when focusing on expression patterns over time rather than expression levels, such as in comparisons between different species. These results show that JTK is an excellent candidate for TEG detection.
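As a rough illustration of the rank-based idea behind a JTK-style trend test (a sketch, not the authors' algorithm), the snippet below scores each gene's monotonic trend over time with Kendall's tau in each condition and flags genes whose trend strength differs between control and case; the simulated data and threshold are assumptions.

```python
# Rough sketch in the spirit of a rank-based trend test (not the authors'
# JTK algorithm): Kendall's tau between expression and time is computed per
# gene and per condition, and genes whose trend strength differs between
# control and case are flagged. Data and threshold are simulated assumptions.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(2)
timepoints = np.arange(10)

def trend_score(expression):
    """Kendall's tau of expression against time (monotonic trend strength)."""
    tau, _ = kendalltau(timepoints, expression)
    return tau

# Simulated genes: geneA increases only in the case condition; geneB is flat.
control = {"geneA": rng.normal(5, 0.3, 10), "geneB": rng.normal(5, 0.3, 10)}
case = {"geneA": 5 + 0.4 * timepoints + rng.normal(0, 0.3, 10),
        "geneB": rng.normal(5, 0.3, 10)}

for gene in control:
    diff = abs(trend_score(case[gene]) - trend_score(control[gene]))
    print(gene, "trend difference:", round(diff, 2),
          "-> candidate TEG" if diff > 0.5 else "")
```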


1973 ◽  
Vol 1 (4) ◽  
pp. 409-425 ◽  
Author(s):  
Robert E. Berney ◽  
Bernard H. Frerichs

The concept of income elasticity of tax revenues has been used in numerous studies with little concern about its theoretical foundations. Income elasticities have also been used for revenue estimation with limited concern about stability over time or about the accuracy of the forecasts. This paper explores the development of the tax elasticity measure and, using revenue data from Washington, compares year-to-year elasticity measures with those established by regression analysis. The length of the time series is varied to check on the stability of the coefficients. Finally, the elasticities are used to predict revenues for three years to check on their accuracy for revenue estimation.
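For concreteness, the two elasticity measures compared above can be written in their conventional forms (a sketch of standard definitions, not necessarily the paper's exact estimators).

```latex
% Conventional definitions (a sketch, not necessarily the paper's exact
% estimators): R_t = tax revenue, Y_t = income in year t.
\begin{align}
  \varepsilon_t &= \frac{(R_t - R_{t-1})/R_{t-1}}{(Y_t - Y_{t-1})/Y_{t-1}}
    && \text{(year-to-year elasticity)} \\
  \ln R_t &= \alpha + \varepsilon\,\ln Y_t + u_t
    && \text{(regression estimate of a constant elasticity } \varepsilon\text{)}
\end{align}
```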

