Utilizing Transformers for Interactive Chatbot Development

Authors

  • Yichen Zhang, Independent Researcher, China

Keywords:

natural language processing, NLP, chatbot, transformer model, tokenization

Abstract

Transformers, a novel architecture introduced by Vaswani et al. in "Attention is All You Need" (2017), have revolutionized natural language processing (NLP) by using attention mechanisms to process and generate human language. This paper explores the implementation of a chatbot using the transformer model, focusing on the practical aspects of tokenization, model selection, and response generation. The paper outlines the methods used, presents results from various model configurations, and analyzes the chatbot's performance. Improvements and future directions are also discussed.
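The pipeline the abstract describes (tokenize the user's input, score next tokens with a model, decode a reply) can be sketched schematically. The toy vocabulary and bigram lookup table below are illustrative stand-ins invented for this sketch, not the paper's code; a real chatbot would replace them with a pretrained transformer and its subword tokenizer.

```python
# Schematic chatbot pipeline: tokenize -> model -> greedy decode.
# VOCAB, NEXT, and all example strings are hypothetical placeholders;
# a real system would use a transformer's learned tokenizer and weights.

VOCAB = {"<eos>": 0, "hello": 1, "how": 2, "are": 3,
         "you": 4, "i": 5, "am": 6, "fine": 7}
ID2TOK = {i: t for t, i in VOCAB.items()}

# Stand-in for a transformer forward pass: a bigram table mapping the
# last token id to the single highest-scoring next token id.
NEXT = {VOCAB["you"]: VOCAB["i"], VOCAB["i"]: VOCAB["am"],
        VOCAB["am"]: VOCAB["fine"], VOCAB["fine"]: VOCAB["<eos>"]}

def tokenize(text: str) -> list[int]:
    """Map whitespace-split words to integer ids (unknown words skipped)."""
    return [VOCAB[w] for w in text.lower().split() if w in VOCAB]

def generate(prompt: str, max_new_tokens: int = 10) -> str:
    """Greedily append next-token predictions until <eos> or the budget."""
    ids = tokenize(prompt)
    prompt_len = len(ids)
    for _ in range(max_new_tokens):
        nxt = NEXT.get(ids[-1], VOCAB["<eos>"])  # greedy pick
        if nxt == VOCAB["<eos>"]:
            break
        ids.append(nxt)
    return " ".join(ID2TOK[i] for i in ids[prompt_len:])
```

Swapping `tokenize` for a subword tokenizer and `NEXT` for a transformer that scores the full context would turn this skeleton into the architecture the paper studies; the decoding loop itself stays the same shape.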

Published

05-08-2024

How to Cite

[1] "Utilizing Transformers for Interactive Chatbot Development", J. Computational Intel. & Robotics, vol. 4, no. 1, pp. 124–129, Aug. 2024. [Online]. Available: https://www.thesciencebrigade.org/jcir/article/view/288