JCSE, vol. 18, no. 2, pp. 125-133, June 2024
DOI: http://dx.doi.org/10.5626/JCSE.2024.18.2.125
Academic Query Assistant: Integrating LLM API into an Academic Assistant Using a Microservices Architecture
Pedro Fernando Alvarez and Sebastian Quevedo
Unidad Académica de Informática, Ciencias de la Computación e Innovación Tecnológica, Grupo de Investigación
Simulación, Modelado, Análisis y Accesibilidad (SMA^2), Universidad Católica de Cuenca, Cuenca, Ecuador
Abstract: Artificial intelligence (AI) has made impressive progress in recent years. One notable development has been the emergence of large language models (LLMs) capable of generating and interpreting natural language. These models have gained widespread attention for their remarkable text generation capabilities and improved user interfaces. At present, academic institutions face the challenge of accessing vast amounts of information efficiently. The problem is compounded by the growing number of academic documents, the dispersion of information across different repositories, and the time and resources required to search and filter it, all of which represent a significant workload for professors and students. To address this issue, this paper proposes an AI-powered assistant that integrates LLMs into a software system built on a microservices architecture. The assistant provides clear, contextually relevant answers that make academic information retrieval more efficient. Altogether, the article covers the integration of both the AI model and the software architecture, demonstrates the use of an intelligent assistant to manage academic information, and is intended to serve as a model for future implementations.
Keywords:
Assistant; API; LLM; Knowledge retrieval; NLP
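
As a rough illustration of the integration pattern described in the abstract (an LLM API exposed behind a microservice), the sketch below shows a minimal Python/FastAPI "query" service that forwards an academic question, together with optional repository context, to an OpenAI-style chat completion endpoint. This is a minimal sketch under stated assumptions, not the authors' implementation: the /query route, the model name, and the prompt wording are illustrative choices that do not appear in the paper.

```python
# Hypothetical sketch of one microservice in an academic query assistant.
# Assumptions (not from the paper): FastAPI for the service layer, the
# OpenAI Python client for the LLM API, a /query route, and a generic model.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI(title="Academic Query Assistant (sketch)")
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class Query(BaseModel):
    question: str      # the user's academic question
    context: str = ""  # optional excerpt retrieved from an institutional repository

@app.post("/query")
def answer_query(query: Query) -> dict:
    """Compose a prompt from the question (plus any retrieved context) and call the LLM."""
    messages = [
        {
            "role": "system",
            "content": "You are an academic assistant. Answer clearly and "
                       "rely on the provided context when it is relevant.",
        },
        {
            "role": "user",
            "content": f"Context:\n{query.context}\n\nQuestion: {query.question}",
        },
    ]
    completion = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return {"answer": completion.choices[0].message.content}
```

In a microservices deployment of this kind, such a service would typically sit behind an API gateway alongside separate services for document retrieval and user management, so the LLM integration can be scaled or replaced independently.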