The Use of the Transformer Model in Opinion Summarisation
We live in an age of information, and collected data and documentation are valuable resources: the state and trajectory of a business can be estimated with clarity through statistics. A system that can analyse such information to predict likely outcomes is therefore extremely valuable for a business, and it is critical that the system provide accurate, useful knowledge of the products under assessment. Summarisation is a technique for distilling a series of sentences in a study or set of observations into a short rundown that conveys the essential content. Simple, concise summaries of a product can support prospective product research and development. In this paper, we use a deep learning framework to extract clean, relevant, brief summaries from comprehensive customer feedback. Extractive text summarisation selects the key words or phrases of a statement and reuses them in the summary; abstractive summarisation, which we adopt here, instead learns from the sample information and generates the best feasible description in new words. Using a Transformer with depth-scaled multi-head attention and GloVe word embeddings with positional encoding, we illustrate an abstractive approach to generating summaries on the Amazon Fine Food Reviews dataset. The Transformer parallelises the workload, allowing data to be processed more quickly, and the attention layers improve the model's quality and effectiveness. The BLEU score is used to quantify the model's performance.
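As a minimal NumPy sketch (not the paper's actual implementation), the two standard Transformer building blocks named above can be written as follows: sinusoidal positional encoding, which is added to the GloVe word embeddings so the model can see token order, and scaled dot-product attention, the core of each multi-head attention layer. All function names and dimensions here are illustrative assumptions.

```python
import numpy as np

def positional_encoding(max_len, d_model):
    # Standard sinusoidal encoding:
    #   PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    #   PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(max_len)[:, None]            # (max_len, 1)
    i = np.arange(d_model)[None, :]              # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])         # even dimensions
    pe[:, 1::2] = np.cos(angle[:, 1::2])         # odd dimensions
    return pe

def scaled_dot_product_attention(q, k, v):
    # softmax(Q K^T / sqrt(d_k)) V for a single head.
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)              # (seq_q, seq_k)
    scores -= scores.max(axis=-1, keepdims=True) # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights                  # output and attention map
```

In a setup like the one described, each review token's GloVe vector would be summed with the matching row of `positional_encoding(...)` before entering the attention layers; multi-head attention simply runs several such attention computations on learned projections of Q, K, and V in parallel.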
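To make the evaluation metric concrete, here is a simplified, self-contained sentence-level BLEU sketch: the geometric mean of modified n-gram precisions multiplied by a brevity penalty. It is a hedged illustration of the metric's idea, not the exact scoring script used in the paper (production work would typically use an established implementation such as NLTK's `sentence_bleu`); the function names and the `max_n=2` cutoff are assumptions for brevity.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=2):
    # Modified n-gram precision: clip each candidate n-gram count
    # by its count in the reference, then average in log space.
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # smooth zeros
    # Brevity penalty discourages overly short candidate summaries.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

For example, scoring a generated summary `"great tasty snack"` against a reference `"great tasty healthy snack"` yields a value strictly between 0 and 1, while an exact match scores 1.0.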