A method of constructing maximin distance designs

Biometrika ◽  
2020 ◽  
Author(s):  
Wenlong Li ◽  
Min-Qian Liu ◽  
Boxin Tang

Abstract One attractive class of space-filling designs for computer experiments is that of maximin distance designs. Algorithmic search for such designs is commonly used, but it becomes ineffective for large problems. Theoretical construction of maximin distance designs is challenging; some results have been obtained recently, often by employing highly specialized techniques. This paper presents an easy-to-use method for constructing maximin distance designs. The method is versatile, as it is applicable to any distance measure. Our basic idea is to construct large designs from small designs, and the method is effective because the quality of the large designs is guaranteed by that of the small designs, as evaluated by the maximin distance criterion.
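To make the criterion concrete: a maximin distance design maximizes the smallest pairwise distance among its points. The sketch below (a minimal numpy illustration of the criterion only, not the authors' construction) evaluates that quantity for any design matrix and any L_p distance.

```python
import numpy as np

def min_pairwise_distance(design, p=2):
    """Smallest pairwise L_p distance among the rows of a design matrix.
    A maximin distance design maximizes this value; the paper's method
    guarantees it for large designs via the small designs they are built from."""
    n = design.shape[0]
    return min(np.linalg.norm(design[i] - design[j], ord=p)
               for i in range(n) for j in range(i + 1, n))

# Toy comparison in [0, 1]^2: a random 9-point design versus a 3x3 grid.
rng = np.random.default_rng(0)
print(min_pairwise_distance(rng.random((9, 2))))                       # typically small
print(min_pairwise_distance(
    np.array([[i / 2, j / 2] for i in range(3) for j in range(3)])))  # 0.5
```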

2015 ◽  
Vol 14 (9) ◽  
pp. 6118-6128 ◽  
Author(s):  
T. Srikanth ◽  
M. Shashi

Collaborative filtering is a popular approach in recommender systems that helps users identify items they may like from a large collection of items. Finding similarity among users from the available item ratings, so as to predict ratings for items the current user has not yet seen based on the preferences of like-minded users, is a challenging problem. Traditional measures such as cosine similarity and Pearson correlation exhibit some drawbacks in similarity calculation. This paper presents a new similarity measure that improves the performance of recommender systems. Experimental results on the MovieLens dataset show that our proposed distance measure improves the quality of prediction. We also present clustering results to further validate the effectiveness of the proposed method.
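The abstract leaves the new measure unspecified, but the two baselines it critiques are standard; a minimal sketch of both, on made-up ratings over co-rated items, is given below. A known drawback this setup illustrates: cosine ignores differences in rating scale between users, while Pearson becomes unreliable when few co-rated items exist.

```python
import numpy as np

def cosine_similarity(ru, rv):
    """Cosine of the angle between two users' ratings on co-rated items."""
    return ru @ rv / (np.linalg.norm(ru) * np.linalg.norm(rv))

def pearson_similarity(ru, rv):
    """Pearson correlation: cosine similarity after mean-centering."""
    cu, cv = ru - ru.mean(), rv - rv.mean()
    return cu @ cv / (np.linalg.norm(cu) * np.linalg.norm(cv))

# Hypothetical 1-5 ratings of two users on the same five movies.
u = np.array([5.0, 4.0, 1.0, 2.0, 4.0])
v = np.array([4.0, 5.0, 2.0, 1.0, 5.0])
print(cosine_similarity(u, v), pearson_similarity(u, v))
```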


Author(s):  
Xinwei Deng ◽  
Ying Hung ◽  
C. Devon Lin

Computer experiments refer to the study of complex systems using mathematical models and computer simulations. Their use has become popular for studying complex systems in science and engineering, and the design and analysis of computer experiments have received broad attention in the past decades. In this chapter, we present several widely used statistical approaches for the design and analysis of computer experiments, including space-filling designs and Gaussian process modeling. Special emphasis is given to recently developed design and modeling techniques for computer experiments with quantitative and qualitative factors.
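As a concrete instance of a space-filling design, the sketch below generates a random Latin hypercube, one of the standard classes referred to here; this is the generic textbook construction, not code from the chapter itself.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """n points in [0, 1]^d with exactly one point in each of the n
    equal-width strata of every one-dimensional projection."""
    rng = np.random.default_rng(seed)
    perms = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (perms + rng.random((n, d))) / n   # jitter within each stratum

design = latin_hypercube(10, 3, seed=42)
print(np.sort(np.floor(design[:, 0] * 10)))   # strata 0..9 each hit exactly once
```

Such a design is then typically paired with a Gaussian process surrogate fitted to the simulation outputs, as the chapter describes.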


Author(s):  
Manuel de Maya Matallana ◽  
María López-Martínez ◽  
Prudencio José Riquelme-Perea

Abstract The present paper measures quality of life through a set of dimensions captured by the following partial indicators of objective well-being: demography, economic endowment, academic training, employment, health, cultural goods, environment, housing habitability, security and family. Additionally, and independently, subjective well-being is studied to measure the degree of happiness of the population. The result is a quality of life indicator that combines both objective and subjective components. The methodology is that of Pena Trapero's distance measure DP2, which has been widely used in empirical studies of well-being and quality of life. Among the results obtained, it is worth noting that happiness diminishes as per capita income grows, and that prosperity, understood as social welfare, can be achieved without relying exclusively on material growth. Thus, the Spanish development model should be revised, since material objectives and economic growth alone do not guarantee the happiness of the population.
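Pena Trapero's DP2 indicator has a standard closed form: for unit j, DP2_j = Σ_i (d_ij/σ_i)(1 − R²_{i|i−1,…,1}), where d_ij is the absolute distance of indicator i from a reference base and the R² factors discount information already carried by earlier indicators. The sketch below renders this directly; it is a hedged reconstruction under common conventions (reference base = worst observed value, indicators assumed pre-ordered by importance), not the authors' implementation.

```python
import numpy as np

def dp2(X, reference=None):
    """Pena's DP2 distance indicator for rows of X (units x indicators).
    Columns are assumed ordered by decreasing importance; the reference
    base defaults to each indicator's worst (minimum) observed value."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    ref = X.min(axis=0) if reference is None else np.asarray(reference, float)
    d = np.abs(X - ref) / X.std(axis=0, ddof=1)   # standardized distances
    scores = d[:, 0].copy()                        # first indicator: R^2 = 0
    for i in range(1, k):
        A = np.column_stack([np.ones(n), X[:, :i]])
        coef, *_ = np.linalg.lstsq(A, X[:, i], rcond=None)
        r2 = 1 - (X[:, i] - A @ coef).var() / X[:, i].var()
        scores += d[:, i] * (1 - r2)               # discount shared information
    return scores
```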


2012 ◽  
Vol 178-181 ◽  
pp. 2610-2614
Author(s):  
Li Hui Liu ◽  
Ying Mei Pei ◽  
Jing Sun

In a many-to-one distribution system, the greedy randomized adaptive search procedure (GRASP) was applied to solve the inventory-transportation integrated optimization (ITIO) problem. The ITIO problem in a many-to-one distribution system is hard: as the product variety, the number of suppliers or the vehicle capacity increases, the computational effort grows exponentially and an exact solution becomes very difficult to obtain. GRASP, however, can handle this problem. Analysis of the computer experiments shows that GRASP finds good solutions to the ITIO problem in less time, and that its advantage in solution quality grows as the problem size expands.
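GRASP itself is problem-agnostic: each iteration builds a solution with a randomized greedy rule drawing from a restricted candidate list (RCL), then improves it by local search, keeping the best solution found. The sketch below applies that skeleton to a toy routing instance; the instance, distance function and 2-opt move are illustrative assumptions, not the paper's ITIO model.

```python
import math, random

def grasp(points, iters=50, alpha=0.3, seed=0):
    """Generic GRASP: randomized greedy construction + 2-opt local search."""
    rng = random.Random(seed)
    dist = lambda a, b: math.dist(points[a], points[b])
    length = lambda t: sum(dist(t[k], t[k + 1]) for k in range(len(t) - 1))

    def construct():                      # greedy, but pick randomly from the RCL
        tour, rest = [0], set(range(1, len(points)))
        while rest:
            cands = sorted(rest, key=lambda c: dist(tour[-1], c))
            pick = rng.choice(cands[:max(1, int(alpha * len(cands)))])
            tour.append(pick); rest.remove(pick)
        return tour

    def two_opt(tour):                    # local search: reverse improving segments
        improved = True
        while improved:
            improved = False
            for i in range(1, len(tour) - 2):
                for j in range(i + 1, len(tour) - 1):
                    if (dist(tour[i - 1], tour[j]) + dist(tour[i], tour[j + 1]) <
                            dist(tour[i - 1], tour[i]) + dist(tour[j], tour[j + 1])):
                        tour[i:j + 1] = reversed(tour[i:j + 1]); improved = True
        return tour

    return min((two_opt(construct()) for _ in range(iters)), key=length)

pts = [(random.random(), random.random()) for _ in range(12)]
print(grasp(pts))
```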


2021 ◽  
Vol 29 (3) ◽  
pp. 91-104
Author(s):  
Sanjeev Dhawan ◽  
Kulvinder Singh ◽  
Adrian Rabaea ◽  
Amit Batra

Abstract Session-centered recommender systems have emerged as an interesting and challenging topic among researchers in the past few years. To make predictions on sequential data, prevailing approaches use either left-to-right autoregressive designs or data augmentation methods. While these approaches exploit the sequential information in user behaviour, they completely ignore the future context of a target interaction when making a prediction. In fact, we argue that during training the data after the target interaction are available and supply an indispensable signal about user preferences; exploiting them can increase recommendation quality. Incorporating future contexts into training is a subtle task, as it departs from the usual left-to-right training protocol and can lead to data leakage. To solve this problem, we propose a novel encoder-decoder model termed the space-filling centered recommender (SRec), which trains the encoder and decoder using a space-filling approach. Specifically, the encoder takes an incomplete sequence as input (a few items are absent), and the decoder predicts these absent items from the encoded representation. We instantiate the general SRec model with a convolutional neural network (CNN), with emphasis on both efficiency and accuracy. Empirical studies on two real-world datasets, covering short, medium and long sequences, show that SRec outperforms traditional sequential recommendation approaches.
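The key data transformation is simple to illustrate: hide some items in a session and train the model to fill the gaps, so that both past and future context inform each prediction (unlike left-to-right autoregression). The sketch below shows only that masking step; the MASK id, masking rate and ignore value are assumptions, and the CNN encoder-decoder itself is omitted.

```python
import numpy as np

MASK = 0  # reserved item id for a hidden position (assumed, not from the paper)

def space_fill_example(seq, mask_prob=0.3, seed=None):
    """Turn one interaction sequence into (encoder input, decoder targets):
    masked positions become gaps the decoder must reconstruct."""
    rng = np.random.default_rng(seed)
    seq = np.asarray(seq)
    hide = rng.random(len(seq)) < mask_prob
    hide[rng.integers(len(seq))] = True      # guarantee at least one gap
    inputs = np.where(hide, MASK, seq)       # incomplete sequence for the encoder
    targets = np.where(hide, seq, -1)        # -1 = position ignored in the loss
    return inputs, targets

session = [17, 42, 5, 99, 23, 8]             # item ids in click order
x, y = space_fill_example(session, seed=7)
print(x)   # e.g. [17 42  0 99  0  8]
print(y)   # e.g. [-1 -1  5 -1 23 -1]
```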


Author(s):  
Stefan Dahlström ◽  
S. Jack Hu ◽  
Rikard Söderberg

Compliant sheet metal assemblies are often used as support structures in automobiles, airplanes and appliances. These structures not only provide a metrology frame for other modules to be assembled, but also give the product its aesthetic form. For this reason, the dimensional quality of the assemblies is a very important factor to control, in order to ensure that the product functions as planned and to keep product cost low. The assembly is influenced by variation in the component parts and in the assembly processes. Tolerance analysis, as conducted in most industries today, is normally based on the assumption of rigid parts and is thus not always valid for sheet metal assemblies, owing to their compliance. This paper presents a method, based on finite element analysis (FEA) and the design of computer experiments, for identifying the influence of input variables on the final geometric variation of the assembly. The influence of, and interactions among, the input variables are analyzed with a response model constructed from the simulation results. This response model can be used to identify the important variables that must be controlled in assembly. An example application is included to demonstrate the simulation model and the construction of the response model. Analysis of the simulation results can facilitate the design of the assembly process so as to control the dimensional quality of the product.
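The response-model step generalizes well beyond this application: fit a least-squares surrogate with main effects and interactions to the simulation outputs, then read variable influence off the coefficients. The sketch below does exactly that on synthetic data; the quadratic model form and the toy response are assumptions for illustration, not the paper's FEA results.

```python
import numpy as np

def fit_response_model(X, y):
    """Least-squares fit of y ~ b0 + sum_i b_i x_i + sum_{i<=j} b_ij x_i x_j;
    large |b| terms flag the influential variables and interactions."""
    n, d = X.shape
    cols, names = [np.ones(n)], ["1"]
    for i in range(d):
        cols.append(X[:, i]); names.append(f"x{i}")
    for i in range(d):
        for j in range(i, d):
            cols.append(X[:, i] * X[:, j]); names.append(f"x{i}*x{j}")
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return dict(zip(names, coef))

# Toy stand-in for FEA output: variation driven by x0 and an x0*x1 interaction.
rng = np.random.default_rng(1)
X = rng.random((40, 3))
y = 2.0 * X[:, 0] + 1.5 * X[:, 0] * X[:, 1] + 0.05 * rng.standard_normal(40)
print({k: round(v, 2) for k, v in fit_response_model(X, y).items()})
```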

