Transformer-Based Patent Novelty Search by Training Claims to Their Own Description

2021, Vol 8 (5), pp. 37
Author(s): Michael Freunek, André Bodmer

In this paper we present a method that concatenates patent claims with their own descriptions. By applying this method, bidirectional encoder representations from transformers (BERT) learn which descriptions are suitable for given claims. A BERT model trained in this way may be able to identify novelty-relevant descriptions for patents. In addition, we introduce a new scoring scheme, the relevance or novelty score, to interpret the output of BERT. We test the method on patent applications by training BERT on the first claims of patents and the corresponding descriptions. The output is processed according to the relevance score, and the results are compared with the X documents cited in the search reports. The test shows that BERT scores some of the cited X documents as highly relevant.
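The paper does not include code, but the described pipeline can be illustrated with a minimal sketch. The example below, using the Hugging Face transformers library, frames the claim-description pairing as a BERT sentence-pair relevance model; the model name bert-base-uncased, the binary relevant/irrelevant labelling, and the helper relevance_score are illustrative assumptions rather than the authors' implementation, whose exact training objective is not specified in the abstract.

```python
# Hypothetical sketch: pair a patent claim with a candidate description
# and let a fine-tuned BERT sentence-pair model produce a relevance score.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # assumed relevant / not relevant
)

def encode_pair(claim: str, description: str):
    # Concatenate claim and description into one BERT input:
    # [CLS] claim tokens [SEP] description tokens [SEP]
    return tokenizer(claim, description, truncation=True,
                     padding="max_length", max_length=512,
                     return_tensors="pt")

def relevance_score(claim: str, description: str) -> float:
    # Softmax probability of the assumed "relevant" class (index 1)
    # is used here as the relevance/novelty score for ranking.
    model.eval()
    with torch.no_grad():
        logits = model(**encode_pair(claim, description)).logits
        return torch.softmax(logits, dim=-1)[0, 1].item()

# Usage: rank candidate prior-art descriptions for a query claim.
# candidates = ["description of document D1 ...", "description of document D2 ..."]
# ranked = sorted(candidates,
#                 key=lambda d: relevance_score(query_claim, d),
#                 reverse=True)
```

In this reading, descriptions that receive a high score for a given first claim would be surfaced as potentially novelty-destroying prior art, which is the role the cited X documents play in the authors' evaluation.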
