Person ReID: Optimization of Domain Adaptation Through Clothing Style Transfer Between Datasets

Author(s): Haijian Wang, Meng Yang, Hui Li, Linbin Ye
2020, Vol 2020, pp. 1-12
Author(s): Hanying Wang, Haitao Xiong, Yuanyuan Cai

In recent years, image style transfer has improved greatly with deep learning. However, when applied directly to clothing, current methods do not let users control the local transfer position within an image, such as isolating a specific T-shirt or pair of trousers on a figure, and they cannot preserve the clothing shape exactly. This paper therefore proposes an interactive, localized image style transfer method designed for clothes. We introduce an additional image, called the outline image, which is extracted from the content image by an interactive algorithm; the interaction consists simply of dragging a rectangle around the desired clothing. We then introduce an outline loss function based on the distance transform of the outline image, which preserves the clothing shape exactly. To smooth and denoise the boundary region, total variation regularization is employed. The proposed method constrains the new style to be generated only in the desired clothing region rather than in the whole image, including the background, so the original clothing shape is preserved in the generated images. Experimental results show convincing generated clothing images and demonstrate that the approach is well suited to clothing design.
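The two loss terms described in the abstract can be sketched numerically. The snippet below is a minimal illustration, assuming the outline image is available as a binary mask; the function names, the squared-difference weighting, and the toy data are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def outline_loss(generated, content, outline_mask):
    """Illustrative outline loss: weight deviations from the content image
    by the distance transform of the outline mask, so changes far outside
    the selected clothing region are penalized most, while changes inside
    the outlined region are free. (Not the paper's exact formulation.)"""
    dist = distance_transform_edt(1.0 - outline_mask)  # 0 inside the outline
    return float(np.mean(dist * (generated - content) ** 2))

def total_variation(img):
    """Total variation regularizer, used to smooth and denoise boundaries:
    sum of absolute differences between neighboring pixels."""
    dh = np.abs(np.diff(img, axis=0)).sum()
    dw = np.abs(np.diff(img, axis=1)).sum()
    return float(dh + dw)

# Toy example: a 4x4 image with a 2x2 "clothing" region outlined.
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0
content = np.zeros((4, 4))

stylized_inside = content.copy()
stylized_inside[1, 1] = 1.0    # edit inside the outline: zero penalty
stylized_outside = content.copy()
stylized_outside[0, 0] = 1.0   # edit outside the outline: penalized

inside = outline_loss(stylized_inside, content, mask)
outside = outline_loss(stylized_outside, content, mask)
```

Because the distance transform is zero everywhere inside the outlined region, `inside` is exactly 0 while `outside` is positive, which is the mechanism that confines the new style to the selected clothing.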


2019
Author(s): Utsav Krishnan, Akshal Sharma, Pratik Chattopadhyay

Author(s): Xide Xia, Tianfan Xue, Wei-sheng Lai, Zheng Sun, Abby Chang, ...

Author(s): Yingying Deng, Fan Tang, Weiming Dong, Wen Sun, Feiyue Huang, ...

2021, pp. 1-12
Author(s): Mukul Kumar, Nipun Katyal, Nersisson Ruban, Elena Lyakso, A. Mary Mekala, ...

Over the years, the need to differentiate emotions in oral communication has played an important role in emotion-based studies, and many algorithms have been proposed to classify emotions. However, there is no measure of the fidelity of the emotion under consideration, chiefly because most readily available annotated datasets are produced by actors rather than recorded in real-world scenarios. The predicted emotion therefore lacks an important aspect, authenticity: whether the emotion is actual or stimulated. In this work, we develop a hybrid convolutional neural network algorithm based on transfer learning and style transfer that classifies both the emotion and its authenticity. The model is trained on features extracted from a dataset containing both stimulated and actual utterances. We compare the developed algorithm with conventional machine learning and deep learning techniques using metrics such as accuracy, precision, recall, and F1 score, and it performs much better than those baselines, achieving precision, recall, and F1 values of 0.994, 0.996, and 0.995 for speech authenticity and 0.992, 0.989, and 0.99 for speech emotion classification, respectively. The research aims to probe human emotion more deeply and build a model that understands it as humans do.
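The reported precision, recall, and F1 values can all be derived from a confusion matrix. Below is a minimal sketch for a binary task such as the authenticity label; the helper function and the toy labels are illustrative assumptions, not data from the study:

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for one positive class
    from paired ground-truth and predicted labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy labels: 1 = actual (authentic) emotion, 0 = stimulated (acted).
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]
p, r, f1 = precision_recall_f1(y_true, y_pred)  # 4 TP, 1 FP, 1 FN
```

For the multi-class emotion task, the same per-class counts would typically be averaged (macro or weighted) across emotion labels to yield single precision, recall, and F1 figures like those quoted above.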

