Continuous Preference Trend Mining for Optimal Product Design With Multiple Profit Cycles

2014 ◽  
Vol 136 (6) ◽  
Author(s):  
Jungmok Ma ◽  
Harrison M. Kim

Product and design analytics is emerging as a promising area for the analysis of large-scale data and the use of the extracted knowledge in the design of optimal systems. The continuous preference trend mining (CPTM) algorithm and application proposed in this study address some fundamental challenges in the context of product and design analytics. The first contribution is the development of a new predictive trend mining technique that captures hidden trends in customer purchase patterns from accumulated transactional data. Unlike traditional, static data mining algorithms, the CPTM does not assume stationarity but dynamically extracts valuable knowledge from customers over time. By generating trend-embedded future data, the CPTM algorithm not only shows higher prediction accuracy in comparison with well-known static models but also provides essential properties that could not be achieved with previously proposed models: utilizing historical data selectively, avoiding an over-fitting problem, identifying performance information of a constructed model, and allowing numeric prediction. The second contribution is the formulation of the initial design problem, which can reveal an opportunity for multiple profit cycles. This mathematical formulation enables design engineers to optimize product design over multiple life cycles while reflecting customer preferences and technological obsolescence using the CPTM algorithm. For illustration, the developed framework is applied to an example of tablet PC design in the leasing market, and the results show that an optimal design can be determined over multiple life cycles.
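The "trend-embedded future data" idea can be sketched in a few lines: fit a linear trend to each attribute level's share of transactions per time window, extrapolate one step ahead, and sample synthetic future records from the extrapolated shares. This is a minimal illustration under simplifying assumptions (a linear trend, a single categorical attribute), not the paper's actual algorithm:

```python
import random

def attribute_share_trend(windows, level):
    """Least-squares linear trend of the share of records equal to
    `level` across consecutive time windows."""
    shares = [sum(1 for r in w if r == level) / len(w) for w in windows]
    n = len(shares)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(shares) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, shares)) / \
            sum((x - x_mean) ** 2 for x in xs)
    return slope, y_mean - slope * x_mean

def generate_future_data(windows, levels, size, rng):
    """Sample a synthetic 'future' window whose level shares follow the
    extrapolated trend of each level (clipped to >= 0, renormalized)."""
    t_next = len(windows)
    raw = []
    for lv in levels:
        slope, intercept = attribute_share_trend(windows, lv)
        raw.append(max(0.0, slope * t_next + intercept))
    total = sum(raw) or 1.0
    return rng.choices(levels, weights=[p / total for p in raw], k=size)
```

A model trained on such generated data reflects where preferences are heading rather than where they have been, which is the intuition behind the CPTM's advantage over static models.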


Author(s):  
Conrad S. Tucker ◽  
Harrison M. Kim

The Preference Trend Mining (PTM) algorithm that we propose in this work aims to address some fundamental challenges of current demand modeling techniques being employed in the product design community. The first contribution is a multistage predictive modeling approach that captures changes in consumer preferences (as they relate to product design) over time, thereby enabling design engineers to anticipate next generation product features before they become mainstream/unimportant. Because consumer preferences may exhibit monotonically increasing or decreasing, seasonal, or unobservable trends, we propose employing a statistical trend detection technique to help detect time series attribute patterns. A time series exponential smoothing technique is then used to forecast future attribute trend patterns and generate a demand model that reflects emerging product preferences over time. The second contribution of this work is a novel classification scheme for attributes that have low predictive power and hence may be omitted from a predictive model. We propose classifying such attributes as either obsolete, nonstandard, or standard, with the appropriate classification given based on the time series entropy values that an attribute exhibits. By modeling attribute irrelevance, design engineers can determine when to retire certain product features (deemed obsolete) or incorporate others into the actual product architecture (standard) while developing modules for those attributes exhibiting inconsistent patterns throughout time (nonstandard). A cell phone example containing 12 time-stamped data sets (January 2009 to December 2009) is used to validate the proposed Preference Trend Mining model and compare it to traditional demand modeling techniques for predictive accuracy and ease of model generation.


2011 ◽  
Vol 133 (11) ◽  
Author(s):  
Conrad S. Tucker ◽  
Harrison M. Kim

The Preference Trend Mining (PTM) algorithm that is proposed in this work aims to address some fundamental challenges of current demand modeling techniques being employed in the product design community. The first contribution is a multistage predictive modeling approach that captures changes in consumer preferences (as they relate to product design) over time, thereby enabling design engineers to anticipate next generation product features before they become mainstream/unimportant. Because consumer preferences may exhibit monotonically increasing or decreasing, seasonal, or unobservable trends, we propose employing a statistical trend detection technique to help detect time series attribute patterns. A time series exponential smoothing technique is then used to forecast future attribute trend patterns and generate a demand model that reflects emerging product preferences over time. The second contribution of this work is a novel classification scheme for attributes that have low predictive power and hence may be omitted from a predictive model. We propose classifying such attributes as either standard, nonstandard, or obsolete, assigning the appropriate classification based on the time series entropy values that an attribute exhibits. By modeling attribute irrelevance, design engineers can determine when to retire certain product features (deemed obsolete) or incorporate others into the actual product architecture (standard) while developing modules for those attributes exhibiting inconsistent patterns throughout time (nonstandard). Several time series data sets using publicly available data are used to validate the proposed Preference Trend Mining model and compare it to traditional demand modeling techniques for predictive accuracy and ease of model generation.
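A toy rendering of the two ingredients described above — exponential smoothing for the forecast, and time-series entropy for the standard/nonstandard/obsolete split — might look like the following. The entropy threshold and the "relevant"/"irrelevant" labels are illustrative assumptions, not values from the paper:

```python
import math

def exp_smooth_forecast(series, alpha=0.5):
    """Simple exponential smoothing; returns the one-step-ahead level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def shannon_entropy(labels):
    """Entropy (bits) of an attribute's per-period relevance labels."""
    n = len(labels)
    return -sum(
        (labels.count(c) / n) * math.log2(labels.count(c) / n)
        for c in set(labels)
    )

def classify_attribute(period_labels, threshold=0.5):
    """High entropy -> relevance flips over time -> 'nonstandard' (modularize).
    Low entropy -> consistent history: 'obsolete' if it ended irrelevant
    (retire the feature), else 'standard' (fold into the architecture)."""
    if shannon_entropy(period_labels) > threshold:
        return "nonstandard"
    return "obsolete" if period_labels[-1] == "irrelevant" else "standard"
```

An attribute that was relevant in every period classifies as standard, one that ended a low-variation history irrelevant classifies as obsolete, and one that alternates classifies as nonstandard.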


Author(s):  
Jaekwan Shin ◽  
Scott Ferguson

Point estimates of part-worth values in customer preference models have been used in market-based product design under the simplifying assumption that customer preferences can be treated as deterministic. However, customer preferences are not only inherently stochastic, but are also statistical estimates that exhibit random errors in model formulation and estimation. Ignoring uncertainty in customer preferences and variability in estimates has raised concern about the reliability and robustness of an optimal product design solution. This study quantitatively defines the reliability and robustness of a product design under uncertainty when using discrete choice methods. These metrics are then integrated into a multi-objective optimization problem to search for product line solutions that are reliable and robust under uncertainty.
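One way to make this concrete: treat the estimated part-worths as normal random variables (mean plus standard error) and Monte Carlo the resulting logit choice share. The definitions below — reliability as the probability that the design wins a head-to-head share, robustness as a decreasing function of share variance — are illustrative stand-ins for the paper's formal metrics, and all parameter values are made up:

```python
import math
import random

def logit_share(utility_own, utility_comp):
    """Binary multinomial-logit choice share of the 'own' design."""
    e0, e1 = math.exp(utility_own), math.exp(utility_comp)
    return e0 / (e0 + e1)

def reliability_and_robustness(design, competitor, mean_pw, se_pw, n_draws, rng):
    """Draw part-worths from independent normals (estimate +/- standard
    error), recompute the design's choice share on each draw.
    Reliability = P(share > 0.5); robustness = 1 / (1 + std of share)."""
    shares = []
    for _ in range(n_draws):
        pw = [rng.gauss(m, s) for m, s in zip(mean_pw, se_pw)]
        u_d = sum(p * x for p, x in zip(pw, design))
        u_c = sum(p * x for p, x in zip(pw, competitor))
        shares.append(logit_share(u_d, u_c))
    mean = sum(shares) / n_draws
    std = (sum((s - mean) ** 2 for s in shares) / n_draws) ** 0.5
    reliability = sum(1 for s in shares if s > 0.5) / n_draws
    return reliability, 1.0 / (1.0 + std)
```

A deterministic formulation would evaluate the share once at the mean part-worths; the point of the paper is that two designs with the same nominal share can differ sharply in these two metrics.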


1994 ◽  
Vol 6 (3) ◽  
pp. 133-142 ◽  
Author(s):  
Steve King

Re-creating the social, economic and demographic life-cycles of ordinary people is one way in which historians might engage with the complex continuities and changes which underlay the development of early modern communities. Little, however, has been written on the ways in which historians might deploy computers, rather than card indexes, in the task of identifying such life cycles from the jumble of sources generated by local and national administration. This article suggests that multiple-source linkage is central to historical and demographic analysis, and reviews, in broad outline, some of the procedures adopted in a study which aims at large-scale life-cycle reconstruction.


Author(s):  
Bethany Juhnke ◽  
Colleen Pokorny ◽  
Linsey Griffin ◽  
Susan Sokolowski

Despite the complexity of the human hand, most large-scale anthropometric data sets for the human hand include minimal measurements. Anthropometric studies are expensive and time-consuming to conduct, and more efficient methods are needed to capture hand data and build large-scale civilian databases that can inform product design and human factors analyses. This study produced a first-of-its-kind large-scale 3D hand anthropometric database comprising 398 unique datasets. The database was created at minimal cost and time to researchers to improve accessibility to data and support the design of products for hands.


2021 ◽  
pp. 0958305X2110148
Author(s):  
Mojtaba Shivaie ◽  
Mohammad Kiani-Moghaddam ◽  
Philip D Weinsier

In this study, a new bilateral equilibrium model was developed for the optimal bidding strategy of both price-taker generation companies (GenCos) and distribution companies (DisCos) that participate in a joint day-ahead energy and reserve electricity market. This model, from a new perspective, simultaneously takes into account such techno-economic-environmental measures as market power, security constraints, and environmental and loss considerations. The mathematical formulation of this new model therefore takes the form of a nonlinear, two-level optimization problem. The upper-level problem maximizes the quadratic profit functions of the GenCos and DisCos under incomplete information and passes the obtained optimal bidding strategies to the lower-level problem, which clears the joint day-ahead energy and reserve electricity market. A locational marginal pricing mechanism was also considered for settling the electricity market. To solve the newly developed model, a competent multi-computational-stage, multi-dimensional, multiple-homogeneous enhanced melody search algorithm (MMM-EMSA), referred to as a symphony orchestra search algorithm (SOSA), was employed. Case studies using the IEEE 118-bus test system—a part of the American electrical power grid in the Midwestern U.S.—are provided in this paper to illustrate the effectiveness and capability of the model on a large-scale power grid. According to the simulation results, several conclusions can be drawn in comparison with the unilateral bidding strategy: competition among the GenCos and DisCos is facilitated; the performance of the electricity market is improved; polluting atmospheric emission levels are mitigated; and the total profits of the GenCos and DisCos increase.
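The two-level structure can be illustrated with a deliberately tiny sketch: the lower level clears a uniform-price market by merit order, and the upper level enumerates a GenCo's candidate bid prices against rival bids it is assumed to know. Real GenCos face incomplete information, and the paper's model is far richer (reserves, security constraints, locational marginal pricing); every name and number here is made up for illustration:

```python
def clear_market(bids, demand):
    """Lower level: merit-order clearing of (name, price, quantity) bids;
    the uniform price is the last accepted bid's price."""
    accepted, supplied, price = [], 0.0, 0.0
    for name, p, qty in sorted(bids, key=lambda b: b[1]):
        if supplied >= demand:
            break
        take = min(qty, demand - supplied)
        accepted.append((name, p, take))
        supplied += take
        price = p
    return price, accepted

def best_bid(cost, qty, rival_bids, demand, candidate_prices):
    """Upper level: pick the bid price maximizing this GenCo's profit,
    given the rivals' bids."""
    best_p, best_profit = None, float("-inf")
    for p in candidate_prices:
        price, accepted = clear_market(rival_bids + [("me", p, qty)], demand)
        won = sum(q for name, _, q in accepted if name == "me")
        profit = (price - cost) * won
        if profit > best_profit:
            best_p, best_profit = p, profit
    return best_p, best_profit
```

Note the classic tension the sketch exposes: bidding low wins more quantity at a lower clearing price, while bidding just under the marginal rival wins less quantity at a higher price.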


2021 ◽  
Vol 12 (3) ◽  
pp. 212-231
Author(s):  
Issam El Hammouti ◽  
Azza Lajjam ◽  
Mohamed El Merouani

The berth allocation problem is one of the main concerns of port operators at a container terminal. In this paper, the authors study the berth allocation problem at the strategic level, commonly known as the strategic berth template problem (SBTP). This problem aims to find the best berth template for a set of calling ships accepted to be served at the port; at the strategic level, the port operator can reject some ships in order to avoid congestion. Owing to the computational complexity of the mathematical formulation proposed for the SBTP, the solution approaches presented so far are limited, especially on large-scale instances. In order to find high-quality solutions within a short computational time, this work proposes a population-based memetic algorithm which combines a first-come-first-served (FCFS) technique, two genetic operators, and a simulated annealing algorithm. Various computational experiments and comparisons against the best known solutions so far are presented to show the performance and effectiveness of the proposed method.
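The FCFS-seed-plus-simulated-annealing idea scales down to a toy single-berth version: seed with arrival order, then let annealing swap service positions to cut total waiting time. The paper's memetic algorithm works on a population with genetic operators and handles ship rejection and a full berth template; this sketch keeps only the seeding and annealing ingredients, with illustrative parameters:

```python
import math
import random

def fcfs_schedule(ships):
    """Seed: serve ships in order of arrival (first-come-first-served)."""
    return sorted(range(len(ships)), key=lambda i: ships[i][0])

def total_delay(order, ships):
    """Sum of waiting times on a single berth; ships = (arrival, handling)."""
    t, delay = 0, 0
    for i in order:
        arrival, handling = ships[i]
        t = max(t, arrival)
        delay += t - arrival
        t += handling
    return delay

def anneal(ships, iters, rng, t0=10.0, cooling=0.995):
    """Simulated annealing over service orders, seeded with FCFS."""
    order = fcfs_schedule(ships)
    best, best_cost = order[:], total_delay(order, ships)
    cost, temp = best_cost, t0
    for _ in range(iters):
        i, j = rng.sample(range(len(order)), 2)
        cand = order[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = total_delay(cand, ships)
        # Accept improvements always, worsenings with Boltzmann probability.
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            order, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        temp *= cooling
    return best, best_cost
```

On a trivial instance (one long job and two short ones arriving together) the FCFS seed is visibly suboptimal, and annealing recovers the shortest-handling-first order.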


Author(s):  
Katharine McCoy

This presentation, reflecting a politics undergraduate thesis, will explore the design process behind the ballots that voters use in democratic elections around the world. Ballots are inherently political objects and, in many cases, the most direct line of communication a citizen has with the government of their country. As such, the design of the ballot affects the legitimacy of higher-level electoral and democratic institutions. This project argues that by co-opting the language of product design, a universal ballot design process would produce more effective ballots across the globe. Product design starts with a brainstorming stage that explores the user, the goal of the object, and the context of its use to create an effective design. By applying these observations to the process of designing a ballot, each electoral commission can produce a more effective ballot. Currently there is no standardization for ballot design beyond ensuring that electoral commissions try to make ballots "friendly." By examining cases of bad ballot design, it is possible to see which element of the design process was missed or misused, and to create a process that corrects for these mistakes. This project examines poorly designed ballots in Florida, Scotland, and Colombia to explore the large-scale effects these small design choices have, and how to fix them.


Author(s):  
Jairo da Costa Junior ◽  
Ana Laura Rodrigues dos Santos ◽  
Jan Carel Diehl

As our society faces large-scale wicked problems like global warming, resource depletion, poverty and humanitarian emergencies, problem solvers are required to apply new reasoning models more appropriate for dealing with these complex societal problems. Dealing with these problems poses unfamiliar challenges in contexts with poor financial and infrastructural resources. Systems Oriented Design (SOD) has been recognized in the literature as a promising approach, capable of supporting design engineers in dealing with these complex societal problems. This paper explores the application of SOD in the development of Product-Service System (PSS) concepts by student teams in a multidisciplinary master course. The course resulted in twelve concepts that were analysed using a case study approach with the support of protocol analysis. The analysis results in a description of the advantages and the context- and process-related challenges of using SOD. From an education point of view, the results demonstrate that even though SOD provides students with a broad knowledge base and the skills to deal with problems in complex societal contexts, there is still a need to introduce the appropriate scope and depth into the design engineering curricula, making the transition from traditional product design a challenging one.

