A Comparison of Vector and Network-Based Measures for Assessing Design Similarity

Author(s):  
Ananya Nandy ◽  
Andy Dong ◽  
Kosa Goucher-Lambert

Abstract In order to retrieve analogous designs for design-by-analogy, computational systems must compute the similarity between a target design and a repository of source designs. Representing designs as functional abstractions can support designers in practicing design-by-analogy by minimizing fixation on surface-level similarities. In addition, when a design is represented by a functional model in a function-flow format, many measures are available to determine functional similarity. In most current function-based design-by-analogy systems, the functions are represented as vectors and measures such as cosine similarity are used to retrieve analogous designs. However, it is hypothesized that changing the similarity measure can significantly change the examples that are retrieved. In this paper, several similarity measures are empirically tested across a set of functional models of energy harvesting products. In addition, the paper explores representing the functional models as networks to find functionally similar designs using graph similarity measures. Surprisingly, the types of designs considered similar by the vector-based measures and by one of the graph similarity measures vary significantly. Even among a set of functional models that share known similar technology, the different measures find inconsistent degrees of similarity: some measures find the set of models to be very similar and some find them to be very dissimilar. The findings have implications for the choice of similarity metric and its effect on finding analogous designs that, in this case, have similar pairs of functions and flows in their functional models. Since the literature has shown that the types of designs presented can impact their effectiveness in aiding the design process, this work intends to spur further consideration of the impact of using different similarity measures when assessing design similarity computationally.
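The vector-based retrieval described above can be sketched in a few lines: each functional model becomes a term-count vector over a shared vocabulary of function terms, and source designs are ranked by cosine similarity to the target. The vocabulary and product vectors below are hypothetical, not taken from the paper's dataset.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length count vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical occurrence counts over a shared vocabulary of function
# terms, e.g. ["convert", "transfer", "store", "regulate"].
solar_charger = [2, 1, 1, 0]
wind_harvester = [2, 2, 0, 1]

print(round(cosine_similarity(solar_charger, wind_harvester), 3))  # 0.816
```

Swapping this measure for a graph-based one is exactly the kind of change whose effect on retrieval the paper examines.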

Author(s):  
J.S. Linsey ◽  
K.L. Wood ◽  
A.B. Markman

Abstract Design by analogy is a powerful part of the design process across the wide variety of modalities used by designers, such as linguistic descriptions, sketches, and diagrams. We need tools to support people's ability to find and use analogies. A deeper understanding of the cognitive mechanisms underlying design and analogy is a crucial step in developing these tools. This paper presents an experiment that explores the effects of representation within the modality of sketching, the effects of functional models, and the retrieval and use of analogies. We find that the level of abstraction for the representation of prior knowledge and for the representation of a current design problem both affect people's ability to retrieve and use analogous solutions. A general semantic description in memory facilitates retrieval of that prior knowledge. The ability to find and use an analogy is also facilitated by having an appropriate functional model of the problem. These studies result in a number of important implications for the development of tools to support design by analogy. Foremost among these implications is the ability to provide multiple representations of design problems across which designers may reason, where the verb construct in the English language is a preferred mode for these representations.


2021 ◽  
Author(s):  
Ananya Nandy ◽  
Kosa Goucher-Lambert

Abstract Function drives many early design considerations in product development. Therefore, finding functionally similar examples is important when searching for sources of inspiration or evaluating designs against existing technology. However, it is difficult to capture what people consider functionally similar and, therefore, whether measures that compute similarity directly from the products themselves are meaningful. In this work, we compare human evaluations of similarity to computationally determined values, shedding light on how quantitative measures align with human perceptions of functional similarity. Human perception of functional similarity is considered at two levels of abstraction: (1) the high-level purpose of a product, and (2) a detailed view of how the product works. Human evaluations of similarity are quantified by crowdsourcing 1360 triplet ratings at each functional abstraction and then compared to similarity computed between functional models. We demonstrate how different levels of abstraction, and the fuzzy line between what is considered “similar” and “similar enough,” may impact how these similarity measures are utilized, finding that different measures better align with human evaluations along each dimension. The results inform how product similarity can be leveraged by designers. Applications therefore lie in creativity support tools, such as those used for design-by-analogy, and in future computational methods in design that incorporate product function in addition to form.
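One simple way to compare crowdsourced triplet judgments against a computed measure is to count how often the measure ranks the human-chosen product closer to the anchor. The products, similarity scores, and triplets below are invented for illustration; they are not the study's data.

```python
def triplet_agreement(triplets, sim):
    """Fraction of (anchor, chosen, other) triplets in which the computed
    measure also rates `chosen` as more similar to `anchor` than `other`."""
    agree = sum(sim[(a, c)] > sim[(a, o)] for a, c, o in triplets)
    return agree / len(triplets)

# Hypothetical pairwise similarity scores between products.
sim = {
    ("fan", "blower"): 0.9, ("fan", "lamp"): 0.2,
    ("pump", "lamp"): 0.4, ("pump", "compressor"): 0.7,
}
# Each triplet: (anchor, product the crowd chose, the other product).
triplets = [("fan", "blower", "lamp"), ("pump", "lamp", "compressor")]
print(triplet_agreement(triplets, sim))  # 0.5
```

An agreement score like this can be computed separately at each level of functional abstraction to see which measure tracks which kind of human judgment.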


Author(s):  
Matt Bohm ◽  
Hannah Ingram ◽  
Dalton Reith ◽  
Robert Nagel ◽  
Julie Linsey

Abstract Understanding the differences in functional models between traditional full-time graduate students and graduate students working in industry may allow a deeper understanding of the impact of an engineer’s work on their ability to model a system in terms of its functions. To explore these differences, the researchers assigned two groups of students the task of creating a functional model of a can opener. One group consisted of traditional full-time graduate students, while the other was comprised of graduate students actively working in industry as engineers in the consumer appliance sector. This paper explores both the mechanics and the plausibility of the functional models created by the two groups of students and the impact of industry-standard parameter diagrams on the functional models of the graduate students working in industry. After an initial analysis of the data, the researchers noticed a tendency of the industry students to include information beyond the common functional model elements, which affected their models’ logical plausibility. Because this tendency appeared more often in the industry students’ functional models than in the traditional graduate students’ models, the researchers decided to evaluate both groups’ functional models with a rubric developed for parameter diagrams, a model format common to the industry in which the industry students were employed. After re-analyzing the functional models of both groups using the parameter diagram rubric, it was observed that the industry students’ functional models did indeed show stronger traces of parameter diagrams than the traditional graduate students’ models. The researchers believe this may be due to design fixation and incomplete conceptual change in practicing engineers. Implications of this finding are discussed herein.


2021 ◽  
Vol 144 (3) ◽  
Author(s):  
Ananya Nandy ◽  
Andy Dong ◽  
Kosa Goucher-Lambert

Abstract The development of example-based design support tools, such as those used for design-by-analogy, relies heavily on the computation of similarity between designs. Various vector- and graph-based similarity measures operationalize different principles to assess the similarity of designs. Despite the availability of various types of similarity measures and the widespread adoption of some, these measures have not been tested for cross-measure agreement, especially in a design context. In this paper, several vector- and graph-based similarity measures are tested across two datasets of functional models of products to explore the ways in which they find functionally similar designs. The results show that the network-based measures fundamentally operationalize functional similarity in a different way than vector-based measures. Based upon the findings, we recommend a graph-based similarity measure such as NetSimile in the early stages of design when divergence is desirable and a vector-based measure such as cosine similarity in a period of convergence, when the scope of the desired function implementation is clearer.
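NetSimile, the graph-based measure recommended above, extracts per-node structural features, aggregates each feature into a fixed-length signature per graph, and compares signatures with the Canberra distance. The sketch below uses only two node features and two aggregators (the published algorithm uses seven features and five aggregators), with invented toy functional-model graphs, so it illustrates the idea rather than reproducing the full method.

```python
import statistics

def node_features(adj):
    """Per-node features: degree and mean neighbor degree
    (a small subset of NetSimile's feature set)."""
    feats = []
    for node, nbrs in adj.items():
        deg = len(nbrs)
        mean_nbr_deg = (sum(len(adj[n]) for n in nbrs) / deg) if deg else 0.0
        feats.append((deg, mean_nbr_deg))
    return feats

def signature(adj):
    """Aggregate each feature column into (mean, population stdev)."""
    sig = []
    for col in zip(*node_features(adj)):
        sig.append(statistics.mean(col))
        sig.append(statistics.pstdev(col))
    return sig

def canberra(u, v):
    """Canberra distance, the comparison NetSimile uses on signatures."""
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(u, v) if a != 0 or b != 0)

# Two toy functional-model graphs as adjacency lists (hypothetical).
g1 = {"import": ["convert"], "convert": ["import", "store"],
      "store": ["convert"]}
g2 = {"import": ["convert"], "convert": ["import", "store"],
      "store": ["convert", "export"], "export": ["store"]}

print(round(canberra(signature(g1), signature(g2)), 3))
```

Because the signature summarizes local structure rather than term counts, two models can be close under this measure while looking dissimilar under cosine similarity, which is the divergence the paper reports.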


Author(s):  
Alexey Faizliev ◽  
Vladimir Balash ◽  
Vladimir Petrov ◽  
Alexey Grigoriev ◽  
Dmitriy Melnichuk ◽  
...  

The aim of this paper is to analyze news and financial data using their network representations. Network structures are formed from the data sources in two different ways: by building the so-called market graph, in which nodes represent financial assets (e.g., stocks) and edges stand for the correlation between the corresponding assets; and by constructing a company co-mention network, in which any two companies are connected by an edge if a news item mentioning both companies was published within a certain period of time. Topological changes of the networks over the period 2005–2010 are investigated using a six-month sliding window. We study the stability of the market graph and the company co-mention network over time and establish which of the two networks was more stable during the period. In addition, we examine the impact of the 2008 crisis on the stability of the market graph as well as the company co-mention network. The networks considered in this paper (the market graph and the company co-mention network) have a fixed set of nodes (companies) and change over time by adding or removing links between these nodes. Different graph similarity measures are used to evaluate these changes. If a network is stable over time, the value of such a measure computed for two different time windows should be close to zero; a sharp change between the graphs constructed for two adjacent periods should lead to a sharp increase in its value. This paper uses graph similarity measures that were proposed relatively recently. In addition, to estimate how the networks evolve over time, we use the Quadratic Assignment Procedure (QAP).
While a substantial body of work studies the dynamics of graphs (including via graph similarity metrics), this paper is the first to examine the dynamics of the company co-mention network both individually and in comparison with the dynamics of market graphs.
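The market-graph construction can be illustrated directly: compute pairwise correlations of return series within a window and connect assets whose correlation exceeds a threshold. The tickers, return series, and threshold below are invented for illustration.

```python
import itertools
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length return series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def market_graph(returns, threshold):
    """Edge between two assets when the correlation of their
    return series reaches the threshold."""
    edges = set()
    for a, b in itertools.combinations(returns, 2):
        if pearson(returns[a], returns[b]) >= threshold:
            edges.add(frozenset((a, b)))
    return edges

# Hypothetical daily return series for three tickers in one window.
returns = {
    "AAA": [0.010, -0.020, 0.005, 0.030],
    "BBB": [0.012, -0.018, 0.004, 0.028],   # moves with AAA
    "CCC": [-0.010, 0.020, -0.003, -0.025], # moves against AAA
}
print(market_graph(returns, threshold=0.7))
```

Rebuilding this graph for each sliding window and comparing the edge sets of adjacent windows is what the graph similarity measures in the paper quantify.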


Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4566
Author(s):  
Dominik Prochniewicz ◽  
Kinga Wezka ◽  
Joanna Kozuchowska

The stochastic model, together with the functional model, forms the mathematical model of observation that enables the estimation of the unknown parameters. In Global Navigation Satellite Systems (GNSS), the stochastic model is an especially important element, as it affects not only the accuracy of the positioning solution but also the reliability of carrier-phase ambiguity resolution (AR). In this paper, we study in detail the stochastic modeling problem for Multi-GNSS positioning models, for which the standard approach so far has been to adopt stochastic parameters from the Global Positioning System (GPS). The aim of this work is to develop an individual, empirical stochastic model for each signal and each satellite block of the GPS, GLONASS, Galileo, and BeiDou systems. The realistic stochastic model is created in the form of a fully populated variance-covariance (VC) matrix that takes into account, in addition to a Carrier-to-Noise density Ratio (C/N0)-dependent variance function, the cross- and time-correlations between the observations. Weekly measurements from a zero-length and a very short baseline are used to derive the stochastic parameters. The impact on AR and solution accuracy is analyzed for different positioning scenarios using a modified Kalman filter. Comparing the positioning results obtained with the created model against the results for the standard elevation-dependent model leads to the conclusion that the individual empirical stochastic model increases the accuracy of the positioning solution and the efficiency of AR.
The optimal solution is achieved for the four-system Multi-GNSS solution using the fully populated empirical model, individual for each satellite block, which provides a 2% increase in the effectiveness of AR (up to 100%), a 37% increase in the number of solutions with errors below 5 mm, and a 6 mm reduction in the maximum error compared to the Multi-GNSS solution using the elevation-dependent model with measurement correlations neglected.
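The contrast between the standard elevation-dependent weighting and a C/N0-dependent variance function can be sketched as follows. Both functional forms are common in the GNSS literature (sigma0/sin E for elevation weighting and an exponential C/N0 model), but the coefficients here are illustrative placeholders, not the calibrated values derived in the paper.

```python
import math

def var_elevation(elev_deg, sigma0=0.003):
    """Elevation-dependent variance: (sigma0 / sin E)^2.
    sigma0 in meters (placeholder value)."""
    return (sigma0 / math.sin(math.radians(elev_deg))) ** 2

def var_cn0(cn0_dbhz, c=1.6e-4):
    """C/N0-dependent variance: c * 10**(-C/N0 / 10).
    The coefficient c is a placeholder, not a calibrated value."""
    return c * 10 ** (-cn0_dbhz / 10)

# A high satellite with a strong signal receives a small variance (large
# weight) under both models; a low, weak satellite is down-weighted.
for elev_deg, cn0_dbhz in [(80, 50), (15, 35)]:
    print(elev_deg, var_elevation(elev_deg), var_cn0(cn0_dbhz))
```

The empirical model in the paper goes further than either scalar function: it fills the full VC matrix, so observations are also correlated with each other across signals and time.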


2021 ◽  
Author(s):  
Antonios Makris ◽  
Camila Leite da Silva ◽  
Vania Bogorny ◽  
Luis Otavio Alvares ◽  
Jose Antonio Macedo ◽  
...  

Abstract During the last few years, the volumes of data that synthesize trajectories have expanded to unparalleled quantities. This growth challenges traditional trajectory analysis approaches, and solutions are sought in other domains. In this work, we focus on data compression techniques intended to minimize the size of trajectory data while, at the same time, minimizing the impact on trajectory analysis methods. To this end, we evaluate five lossy compression algorithms: Douglas-Peucker (DP), Time Ratio (TR), Speed Based (SP), Time Ratio Speed Based (TR_SP), and Speed Based Time Ratio (SP_TR). The comparison is performed using four distinct real-world datasets against six different dynamically assigned thresholds. The effectiveness of the compression is evaluated using classification techniques and similarity measures. The results show that there is a trade-off between the compression rate and the achieved quality. There is no “best algorithm” for every case, and the choice of the proper compression algorithm is an application-dependent process.
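Of the five algorithms, Douglas-Peucker is the simplest to sketch: it keeps the point farthest from the line between the segment endpoints whenever that distance exceeds a tolerance, and recurses on the two halves. The toy track and epsilon below are illustrative, not from the evaluated datasets.

```python
def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return ((x - x1) ** 2 + (y - y1) ** 2) ** 0.5
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, epsilon):
    """Keep the point farthest from the endpoint-to-endpoint line
    whenever it exceeds epsilon; recurse on both halves."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], epsilon)
    right = douglas_peucker(points[idx:], epsilon)
    return left[:-1] + right  # drop the duplicated split point

track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(track, epsilon=1.0))
```

DP is purely geometric; the TR and SP variants instead bound the time-ratio and speed error of the dropped points, which is why the paper evaluates them separately.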


2020 ◽  
Vol 10 (1) ◽  
pp. 57-82
Author(s):  
Katarzyna Guczalska

Wolfgang Merkel’s concept of rooted democracy in the context of contemporary populism and the crisis of democracy: The article presents Wolfgang Merkel’s concept of “rooted democracy,” formulated in the context of the crisis of democracy. The German political scientist indicates what democracy is by specifying the proper functioning of the regulations of the democratic system (regimes). Speaking of the weakness or strength of democracy, we must have a well-described set of system principles that determine the degree of strength of democracy or its erosion. This set of principles of the democratic system is thoroughly discussed in the article. In particular, the functional model of civil society is analysed. The text also explores how the crisis of democracy is understood, as well as Merkel’s view of the impact of global capitalism on democratic institutions, which contributes to the transformation of democracy into an oligarchy. The topics discussed in the article also concern alternative, non-liberal forms of democracy and populism. The question is whether Merkel’s concept is useful in explaining populism and its political consequences.


2020 ◽  
Vol 11 (2) ◽  
pp. 375-386
Author(s):  
Hamed Ahmad Almahadin ◽  
Yazan Salameh Oroud

This study aims to investigate the moderating role of profitability in the relationship between capital structure and firm value in Jordan, as an example of an emerging economy. For this purpose, two functional models were formulated to capture the direct relationship as well as the interaction impact of capital structure on firm value. The robust empirical findings of panel data analysis provide strong evidence of an adverse relationship between capital structure and firm value. The findings confirm that the impact of capital structure appears to be complicated in nature and difficult to examine without controlling for the interaction of profitability as one of the major determinants. Therefore, studying the interaction effect provides ample evidence and enhances the understanding of the link between firm value and capital structure. The empirical results of the study may provide important insights and policy implications to decision-makers.
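A moderation analysis of this kind is typically specified as a regression of firm value on capital structure, profitability, and their interaction term. The sketch below fits such a model on simulated data; the variable names, coefficients, and data are invented for illustration, not the study's Jordanian sample.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical firm-level variables: leverage proxies capital structure,
# roa proxies profitability.
leverage = rng.uniform(0.1, 0.9, n)
roa = rng.uniform(-0.05, 0.25, n)
noise = rng.normal(0, 0.02, n)

# Simulated firm value with a negative direct leverage effect that is
# softened (moderated) by profitability via the interaction term.
value = 1.0 - 0.8 * leverage + 1.5 * roa + 2.0 * leverage * roa + noise

# The second of the two functional models: direct terms plus interaction.
X = np.column_stack([np.ones(n), leverage, roa, leverage * roa])
coef, *_ = np.linalg.lstsq(X, value, rcond=None)
print(coef.round(2))
```

A negative coefficient on leverage alongside a positive interaction coefficient is the pattern consistent with the study's finding that profitability moderates the adverse capital structure effect.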


2014 ◽  
Vol 14 (16) ◽  
pp. 22985-23025
Author(s):  
M. Righi ◽  
J. Hendricks ◽  
R. Sausen

Abstract. Using the EMAC global climate-chemistry model coupled to the aerosol module MADE, we simulate the impact of land transport and shipping emissions on global atmospheric aerosol and climate in 2030. Future emissions of short-lived gas and aerosol species follow the four Representative Concentration Pathways (RCPs) designed in support of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. We compare the resulting 2030 land-transport- and shipping-induced aerosol concentrations to those obtained for the year 2000 in a previous study with the same model configuration. The simulations suggest that black carbon and aerosol nitrate are the most relevant pollutants from land transport in 2000 and 2030, but their impacts show very strong regional variations over this time period. Europe and North America experience a decrease in land-transport-induced particle pollution, although in these regions this sector remains the dominant source of surface-level pollution in 2030 under all RCPs. In Southeast Asia, on the other hand, a significant increase is simulated, but in this region surface-level pollution is still controlled by sources other than land transport. Shipping-induced air pollution is mostly due to aerosol sulfate and nitrate, which show opposite trends towards 2030. Sulfate is strongly reduced as a consequence of sulfur reduction policies for ship fuels in force since 2010, while nitrate tends to increase due to the excess of ammonia following the reduction in ammonium sulfate. The aerosol-induced climate impact of both sectors is dominated by aerosol-cloud effects and is projected to decrease between 2000 and 2030, while still contributing a significant radiative forcing to the Earth's radiation budget.

