vector space representation
Recently Published Documents


TOTAL DOCUMENTS: 24 (five years: 6)

H-INDEX: 3 (five years: 1)

2020
Author(s): Bryan Loh, Tom White

Generative models capture properties and relationships of images in a generic vector space representation called a latent space. Latent spaces can be sampled to create novel images and perform semantic operations consistent with the principles inferred from the training set. Designers can use representations learned by generative models to express design intent enabling more effective design experimentation. We present the SpaceSheet, a general-purpose spreadsheet interface designed to support the experimentation and exploration of latent spaces.
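The two latent-space operations the abstract mentions, sampling novel points and performing semantic operations, reduce to simple vector arithmetic. A minimal sketch, using random vectors as stand-ins for codes produced by a real generative model (the dimension and the attribute direction are illustrative assumptions, not from the paper):

```python
import numpy as np

# Hypothetical latent codes for two images, e.g. drawn from a
# generative model's prior; 128 dimensions chosen for illustration.
rng = np.random.default_rng(0)
z_a = rng.standard_normal(128)
z_b = rng.standard_normal(128)

def interpolate(z1, z2, t):
    """Linear interpolation between two latent codes."""
    return (1.0 - t) * z1 + t * z2

# Sampling points along the path between z_a and z_b yields a smooth
# sequence of decoded images in a well-behaved latent space.
path = [interpolate(z_a, z_b, t) for t in np.linspace(0.0, 1.0, 5)]

# A semantic operation: shift a code along an "attribute direction"
# (e.g. a mean difference between two groups of training images).
attribute_direction = 0.1 * rng.standard_normal(128)
z_modified = z_a + attribute_direction
```

In a spreadsheet interface such as the SpaceSheet, each cell can hold a latent code, and formulas over cells correspond to exactly these vector operations.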



2020, Vol 10 (12), pp. 4176
Author(s): Loris Nanni, Andrea Rigo, Alessandra Lumini, Sheryl Brahnam

In this work, we combine a Siamese neural network and different clustering techniques to generate a dissimilarity space that is then used to train a support vector machine (SVM) for automated animal audio classification. The animal audio datasets used are (i) birds and (ii) cat sounds, which are freely available. We exploit different clustering methods to reduce the spectrograms in the dataset to a number of centroids that are used to generate the dissimilarity space through the Siamese network. Once computed, we use the dissimilarity space to generate a vector space representation of each pattern, which is then fed into the SVM to classify a spectrogram by its dissimilarity vector. Our study shows that the proposed approach based on the dissimilarity space performs well on both classification problems without ad hoc optimization of the clustering methods. Moreover, results show that the fusion of CNN-based approaches applied to the animal audio classification problem works better than the stand-alone CNNs.
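The pipeline described above can be sketched end to end: cluster the training patterns to obtain centroids, represent each pattern by its dissimilarities to those centroids, and train an SVM in that space. This is a simplified sketch on synthetic data; a plain Euclidean distance stands in for the learned Siamese dissimilarity, and the feature dimension, cluster count, and data are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# Toy flattened "spectrogram" features for two classes; real inputs
# would be spectrograms, and the dissimilarity would come from a
# trained Siamese network rather than Euclidean distance.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(3, 1, (50, 20))])
y = np.array([0] * 50 + [1] * 50)

# Reduce the training set to k centroids via clustering.
centroids = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X).cluster_centers_

def dissimilarity_vector(x, centroids):
    """Represent a pattern by its dissimilarities to each centroid."""
    return np.linalg.norm(centroids - x, axis=1)

# Map every pattern into the dissimilarity space and train an SVM there.
D = np.array([dissimilarity_vector(x, centroids) for x in X])
clf = SVC(kernel="rbf").fit(D, y)
```

The key design point is that the SVM never sees raw spectrograms, only each pattern's vector of dissimilarities to the centroids, so the representation's dimensionality is fixed by the number of clusters rather than by the input size.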


2019
Author(s): Masashi Sugiyama

Deep learning has recently made much progress on natural language processing problems and applications because of vector space representations of words and documents (e.g., GloVe; Pennington et al., 2014), which replace discrete, sparse bag-of-words representations, and because of deep architectures such as recurrent and recursive neural networks. Therefore, I want to explore deep learning modules for natural language generation this summer.
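The contrast drawn above is that dense word vectors support graded similarity, whereas in a sparse bag-of-words space distinct words are simply orthogonal. A minimal sketch, using random vectors as stand-ins for pre-trained GloVe embeddings (the vocabulary and dimension are illustrative assumptions):

```python
import numpy as np

# Stand-in word vectors; in practice these would be pre-trained GloVe
# embeddings (Pennington et al., 2014), not random values.
rng = np.random.default_rng(1)
vocab = ["king", "queen", "man", "woman"]
emb = {w: rng.standard_normal(50) for w in vocab}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# With real embeddings, related words (king/queen) score higher than
# unrelated ones; with one-hot bag-of-words vectors, every pair of
# distinct words would have similarity exactly 0.
sim = cosine(emb["king"], emb["queen"])
```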


Author(s): Zied Bouraoui, Steven Schockaert

Considerable attention has recently been devoted to the problem of automatically extending knowledge bases by applying some form of inductive reasoning. While the vast majority of existing work is centred around so-called knowledge graphs, in this paper we consider a setting where the input consists of a set of (existential) rules. To this end, we exploit a vector space representation of the considered concepts, which is partly induced from the rule base itself and partly from a pre-trained word embedding. Inspired by recent approaches to concept induction, we then model rule templates in this vector space embedding using Gaussian distributions. Unlike many existing approaches, we learn rules by directly exploiting regularities in the given rule base, and do not require that a database with concept and relation instances is given. As a result, our method can be applied to a wide variety of ontologies. We present experimental results that demonstrate the effectiveness of our method.
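The idea of modelling a rule template as a Gaussian over concept embeddings can be sketched as follows. This is a simplified illustration, not the authors' implementation: the concept vectors are random stand-ins for embeddings induced from a rule base and a pre-trained word embedding, and the dimensions and regularization constant are assumptions:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical embeddings of concepts known to instantiate a given rule
# template, e.g. concepts X for which a rule of the form X(a) -> Y(a)
# already appears in the rule base.
rng = np.random.default_rng(7)
known_concepts = rng.normal(loc=0.5, scale=0.2, size=(20, 10))

# Model the template as a Gaussian over those concept embeddings
# (a small ridge term keeps the covariance well-conditioned).
mu = known_concepts.mean(axis=0)
cov = np.cov(known_concepts, rowvar=False) + 1e-6 * np.eye(10)
template = multivariate_normal(mean=mu, cov=cov)

# Score a candidate concept: a high density under the template's
# Gaussian suggests it plausibly fills the same slot, yielding a new
# candidate rule by inductive generalization.
candidate = rng.normal(loc=0.5, scale=0.2, size=10)
score = template.logpdf(candidate)
```

Because the template is learned from regularities in the rule base itself, no database of concept and relation instances is needed, which is what lets the method apply to a wide variety of ontologies.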

