WorldCist'23 - 11th World Conference on Information Systems and Technologies


Identifying Valid User Stories Using BERT Pre-Trained Natural Language Models

Agile methodologies are widely adopted in today's software industry. These methods rely on natural language for requirements engineering, mostly in the form of User Stories, which makes eliciting and mapping software requirements costly. Pre-trained language models have been shown to verify the validity of requirements written as User Stories. To better understand their effectiveness, this paper compares different BERT models for validating User Stories and discusses which model is the most efficient for requirements engineering. The main contributions are an enhancement of the dataset and validation models reaching up to 98% accuracy; further analyses should consider which aspect matters more in real-world applications, high recall or high precision.
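The validation task described in the abstract can be framed as binary text classification over User Story text. The sketch below is an illustration only, not the authors' code: it shows how a BERT sequence classifier from the Hugging Face transformers library could label User Stories as valid or invalid. The model name, labels, and example stories are assumptions; meaningful predictions would require fine-tuning on a labeled User Story corpus such as the one the paper enhances.

# Minimal sketch (assumed setup, not the paper's implementation): scoring
# User Stories as valid/invalid with a BERT sequence classifier.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed baseline; the paper compares several BERT variants

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

# Hypothetical inputs: one well-formed User Story and one malformed requirement.
stories = [
    "As a customer, I want to reset my password so that I can regain access to my account.",
    "Password database system fast.",
]

inputs = tokenizer(stories, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Class indices are only meaningful after fine-tuning on labeled User Stories.
predictions = logits.argmax(dim=-1).tolist()
print(predictions)  # e.g. [1, 0] once the classifier has been trained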

Sandor Scoggin
Pontifical Catholic University of Minas Gerais - PUC Minas
Brazil

Humberto Marques-Neto
Pontifical Catholic University of Minas Gerais - PUC Minas
Brazil
