Non-Intrusive Load Monitoring via Deep Learning Based User Model and Appliance Group Model

Energies ◽  
2020 ◽  
Vol 13 (21) ◽  
pp. 5629
Author(s):  
Ce Peng ◽  
Guoying Lin ◽  
Shaopeng Zhai ◽  
Yi Ding ◽  
Guangyu He

Non-Intrusive Load Monitoring (NILM) increases awareness of user energy usage patterns. In this paper, an efficient and highly accurate NILM method is proposed, featuring condensed representation, a super-state, and the fusion of two deep-learning-based models. The condensed representation helps the two models perform more efficiently and preserve longer-term information, while the super-state helps the model learn correlations between appliances. The first model is a deep user model that learns user appliance usage patterns to predict the next appliance usage behavior from past behaviors, capturing the dynamics of the user's behavior history and appliance usage habits. The second model is a deep appliance group model that learns the characteristics of appliances from temporal and electrical information. These two models are then fused to perform NILM. A case study based on the REFIT datasets demonstrates that the proposed NILM method outperforms two state-of-the-art benchmark methods.
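The abstract does not spell out how the super-state is encoded; a common minimal interpretation is the joint on/off status of all monitored appliances collapsed into a single class label, so that one classifier output captures inter-appliance correlations. A hypothetical sketch under that assumption (`to_super_state` and `from_super_state` are illustrative names, not from the paper):

```python
def to_super_state(states):
    # Pack the joint on/off status of N appliances into one integer label,
    # so a single classifier output can model appliance correlations.
    code = 0
    for s in states:
        code = (code << 1) | int(s)
    return code

def from_super_state(code, n):
    # Unpack the integer label back into per-appliance on/off flags.
    return [bool((code >> (n - 1 - i)) & 1) for i in range(n)]

# e.g. kettle on, fridge off, washer on -> one label
label = to_super_state([True, False, True])
states = from_super_state(label, 3)
```

Note that the label space grows as 2^N, which is one reason a condensed representation matters for larger appliance sets.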

2020 ◽  
Vol 34 (04) ◽  
pp. 6470-6477
Author(s):  
Canran Xu ◽  
Ming Wu

Learning representations of feature interactions to model user behaviors is critical for recommendation systems and click-through rate (CTR) prediction. Recent advances in this area are empowered by deep learning methods, which can learn sophisticated feature interactions and achieve state-of-the-art results in an end-to-end manner. These approaches require a large number of training parameters integrated with the low-level representations, and are thus memory- and computation-inefficient. In this paper, we propose a new model named "LorentzFM" that learns feature interactions embedded in a hyperbolic space, in which Lorentz distances may violate the triangle inequality. To this end, the learned representation benefits from the peculiar geometric properties of hyperbolic triangles, resulting in a significant reduction in the number of parameters (20% to 80%) because none of the top deep learning layers are required. With such a lightweight architecture, LorentzFM achieves comparable and even materially better results than deep learning methods such as DeepFM, xDeepFM and Deep & Cross in both recommendation and CTR prediction tasks.
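For readers unfamiliar with the hyperboloid model the abstract refers to, the basic quantities are the Lorentzian inner product and the squared Lorentz distance derived from it. This is a minimal numerical sketch of those definitions only, not the LorentzFM scoring function itself:

```python
import numpy as np

def lorentz_inner(u, v):
    # Lorentzian inner product: <u, v>_L = -u0*v0 + sum_i ui*vi
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def to_hyperboloid(x):
    # Lift a Euclidean embedding x onto the hyperboloid by setting the
    # time component x0 = sqrt(1 + ||x||^2), so that <x, x>_L = -1.
    x0 = np.sqrt(1.0 + np.dot(x, x))
    return np.concatenate(([x0], x))

u = to_hyperboloid(np.array([0.3, -0.2]))
v = to_hyperboloid(np.array([0.1, 0.4]))
# Squared Lorentz distance: d^2(u, v) = -2 - 2 <u, v>_L  (>= 0 on the hyperboloid)
d2 = -2.0 - 2.0 * lorentz_inner(u, v)
```

Because `<u, v>_L <= -1` for points on the hyperboloid, `d2` is non-negative; the triangle-inequality behavior of this distance is what the paper exploits.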


2008 ◽  
Vol 3 (2) ◽  
Author(s):  
R. Birks ◽  
S. Hills ◽  
E. Grant ◽  
B. Verrecht

Due to increasing pressure on water resources in southeast England, Thames Water are currently installing the first membrane bioreactor (MBR) plant for reuse (toilet flushing and irrigation) in the UK, at the Beddington Zero Energy Development (BedZED), a prestigious sustainable development in south London. Thames Water will operate and evaluate the system via an in-depth research programme over a 3-year period. A case study of the Solaire in New York (US), which informed the design of the BedZED Wastewater Reclamation Plant (BWRP), is presented. The BWRP process stream comprises 3 mm screens, the MBR, granular activated carbon and chlorination. Research will include process optimisation, water quality and water saving studies, post-treatment efficiency and effectiveness, energy usage, studies of biofilm regrowth potential and householder perception studies. A comprehensive metering system consisting of hardwired pulse, electromagnetic and radio meters will monitor reclaimed and potable water throughout the site. The metering data will be used to calculate water balances and water savings at various scales. Research using the radio meters (AMR) will cover areas such as customer-side leakage and usage patterns. This research will allow a holistic and complete understanding of water use and recycling in a sustainable community.


Author(s):  
Max Losch ◽  
Mario Fritz ◽  
Bernt Schiele

Today’s deep learning systems deliver high performance based on end-to-end training but are notoriously hard to inspect. We argue that there are at least two reasons making inspectability challenging: (i) representations are distributed across hundreds of channels and (ii) a unifying metric quantifying inspectability is lacking. In this paper, we address both issues by proposing Semantic Bottlenecks (SB), which can be integrated into pretrained networks to align channel outputs with individual visual concepts, and introduce the model-agnostic Area Under inspectability Curve (AUiC) metric to measure the alignment. We present a case study on semantic segmentation to demonstrate that SBs improve the AUiC up to six-fold over regular network outputs. We explore two types of SB-layers in this work: first, concept-supervised SB-layers (SSB), which offer inspectability w.r.t. predefined concepts that the model is demanded to rely on; and second, unsupervised SBs (USB), which offer equally strong AUiC improvements by restricting the distributedness of representations across channels. Importantly, for both SB types, we can recover state-of-the-art segmentation performance across two different models despite a drastic dimensionality reduction from 1000s of non-aligned channels to 10s of semantics-aligned channels on which all downstream results are based.
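The dimensionality reduction described (1000s of channels down to 10s) can be realized with a per-pixel linear map across channels, i.e. a 1x1 convolution. A minimal numpy sketch of that operation, assuming a bottleneck of 10 concept channels over a 256-channel feature map (the sizes and weights here are illustrative, not the paper's):

```python
import numpy as np

def bottleneck_1x1(features, w, b):
    # features: (C, H, W) activation map from a pretrained layer
    # w: (K, C) weights of a 1x1 convolution, b: (K,) bias
    # A 1x1 conv is a linear map applied independently at every pixel,
    # compressing C distributed channels into K concept-aligned channels.
    c, h, wd = features.shape
    flat = features.reshape(c, -1)        # (C, H*W)
    out = w @ flat + b[:, None]           # (K, H*W)
    return out.reshape(w.shape[0], h, wd)

rng = np.random.default_rng(0)
feats = rng.standard_normal((256, 8, 8))   # pretrained layer output
w = 0.01 * rng.standard_normal((10, 256))  # 10 hypothetical concept channels
b = np.zeros(10)
concepts = bottleneck_1x1(feats, w, b)     # (10, 8, 8)
```

In the supervised variant, each of the K output channels would additionally be trained against a predefined concept label; the unsupervised variant relies on the low K alone to limit distributedness.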


Energies ◽  
2021 ◽  
Vol 14 (10) ◽  
pp. 2931
Author(s):  
Hwan Kim ◽  
Sungsu Lim

Non-Intrusive Load Monitoring (NILM) techniques are effective for managing energy and for addressing imbalances between the energy demand and supply. Various studies based on deep learning have reported the classification of appliances from aggregated power signals. In this paper, we propose a novel approach called a temporal bar graph, which patternizes the operational status of the appliances and time in order to extract the inherent features from the aggregated power signals for efficient load identification. To verify the effectiveness of the proposed method, a temporal bar graph was applied to the total power and tested on three state-of-the-art deep learning techniques that previously exhibited superior performance in image classification tasks—namely, Extreme Inception (Xception), Very Deep One Dimensional CNN (VDOCNN), and Concatenate-DenseNet121. The UK Domestic Appliance-Level Electricity (UK-DALE) and Tracebase datasets were used for our experiments. The results of the five-appliance case demonstrated that the accuracy and F1-score increased by 19.55% and 21.43%, respectively, on VDOCNN, and by 33.22% and 35.71%, respectively, on Xception. A performance comparison with the state-of-the-art deep learning methods and image-based spectrogram approach was conducted.


Electronics ◽  
2021 ◽  
Vol 10 (14) ◽  
pp. 1657
Author(s):  
Mingzhi Yang ◽  
Xinchun Li ◽  
Yue Liu

Nonintrusive load monitoring (NILM) analyzes only the main-circuit load information, using an algorithm to decompose the aggregate load, which is an important way to help reduce energy usage. Recent research shows that deep learning has become popular for this problem. However, the ability of a neural network to extract load features depends on its structure, so more research is required to determine the best network architecture. This study proposes two deep neural networks based on the attention mechanism that improve the current sequence-to-point (s2p) learning model. The first model employs Bahdanau-style attention and RNN layers, and the second model replaces the RNN layer with a self-attention layer. Both models build on a time-embedding layer and can therefore be better applied to NILM. To verify the effectiveness of the algorithms, we selected two open datasets and compared the models with the original s2p model. The results show that attention mechanisms can effectively improve the models’ performance.
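The Bahdanau-style (additive) attention the first model employs scores each encoder state against a query and returns a softmax-weighted context vector. A minimal numpy sketch of that mechanism in isolation, with random illustrative weights rather than anything from the paper:

```python
import numpy as np

def bahdanau_attention(query, keys, Wq, Wk, v):
    # Additive attention: score_t = v^T tanh(Wq @ query + Wk @ key_t)
    # query: (d,) decoder state; keys: (T, d) encoder states over a window
    scores = v @ np.tanh((Wq @ query)[:, None] + Wk @ keys.T)  # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                   # softmax
    context = weights @ keys                                   # (d,)
    return weights, context

rng = np.random.default_rng(0)
d, h, T = 4, 8, 6
keys = rng.standard_normal((T, d))      # e.g. states over a power window
query = rng.standard_normal(d)          # state for the s2p target point
Wq, Wk = rng.standard_normal((h, d)), rng.standard_normal((h, d))
v = rng.standard_normal(h)
weights, context = bahdanau_attention(query, keys, Wq, Wk, v)
```

In an s2p setting the context vector would feed the layer that predicts the single appliance-power value at the window midpoint; the weights show which time steps the model attended to.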


2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call “Levenshtein augmentation”, which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation demonstrated increased performance over both non-augmented and conventionally SMILES-randomization-augmented data when used for training of baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as “attentional gain”: an enhancement in the pattern recognition capabilities of the underlying network with respect to molecular motifs.
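The sub-sequence similarity the method is named after is the classic Levenshtein edit distance, computed here between a reactant and product SMILES string. A self-contained sketch of the distance itself (the example molecules are illustrative; the paper's pair-construction procedure is not reproduced):

```python
def levenshtein(a, b):
    # Dynamic-programming edit distance: minimum number of single-character
    # insertions, deletions, and substitutions turning string a into b.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution/match
        prev = cur
    return prev[-1]

# Hypothetical reactant/product SMILES pair: ethanol -> acetic acid
reactant, product = "CCO", "CC(=O)O"
d = levenshtein(reactant, product)  # small d -> high local similarity
```

Pairs with small edit distance share long common SMILES sub-sequences, which is the signal the augmentation exploits when constructing reactant–product training pairs.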


2020 ◽  
Author(s):  
Saeed Nosratabadi ◽  
Amir Mosavi ◽  
Puhong Duan ◽  
Pedram Ghamisi ◽  
Ferdinand Filip ◽  
...  

This paper provides a state-of-the-art investigation of advances in data science in emerging economic applications. The analysis covers novel data science methods in four classes: deep learning models, hybrid deep learning models, hybrid machine learning models, and ensemble models. Application domains include a wide and diverse range of economics research, from the stock market, marketing, and e-commerce to corporate banking and cryptocurrency. The PRISMA method, a systematic literature review methodology, was used to ensure the quality of the survey. The findings reveal that the trends follow the advancement of hybrid models, which, based on the accuracy metric, outperform other learning algorithms. It is further expected that the trends will converge toward sophisticated hybrid deep learning models.

