|
|
TOWARDS E-LEARNING USE CASES BASED ON GENERATIVE PRE-TRAINED TRANSFORMERS APPLIED ON DATA STRUCTURE DISCIPLINES
Authors: Costea Felicia-Mirabela, Chirila Ciprian-Bogdan
Volume 2 | DOI: 10.12753/2066-026X-23-072 | Pages: 331-341
Abstract
The San Francisco-based startup OpenAI recently released ChatGPT (Generative Pre-trained Transformer), its latest huge (175-billion-parameter) language model, and it got many thinking about the fascinating (and problematic) ways artificial intelligence (AI) can transform our lives in the very near future. According to reports, the OpenAI chatbot attracted 100 million users in its first two months after launch, making it the fastest-growing consumer application in history. On the other hand, the pandemic forced the use of multiple-choice questions for evaluating students, and the question banks built during enforced online education are still in use today. The problem with these question banks is that students collect them and learn them by heart. One solution for tutors is to create new questions and answers, which can be tedious work, especially for large cohorts of students. The advent of ChatGPT has had a huge impact on all aspects of society. Even though the model is still in its early stages, it has a promising perspective, and it has been tested in several domains with debatable results. In this sense, we intend to test the generative capabilities of ChatGPT in creating questions to be integrated into LMSs. We experiment in the context of the Data Structures discipline, generating questions to be integrated into Moodle. We tested the generation on several sorting algorithms, analysed the generated content, and concluded that the generation was accurate and usable in practice. There was no misleading information in the generated content. Repeated generations produced different quiz content with insignificant similarities.
Keywords
artificial intelligence, Generative Pre-Trained Transformer, deep learning models, e-learning, data structures and algorithms
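
As a rough illustration of the workflow the abstract describes, the minimal sketch below shows one way question generation for Moodle might be scripted: it prompts an OpenAI chat model for a multiple-choice question on a sorting algorithm and saves the replies in Moodle's GIFT import format. The model name, prompt wording, and helper function are illustrative assumptions, not the authors' actual procedure.

```python
# Hypothetical sketch: generate multiple-choice questions on sorting
# algorithms with the OpenAI chat API and emit them in Moodle GIFT format.
# Model choice, prompt text, and output handling are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_gift_question(topic: str) -> str:
    """Ask the model for one GIFT-formatted multiple-choice question."""
    prompt = (
        f"Write one multiple-choice question about {topic} in Moodle GIFT "
        "format with exactly one correct answer (prefixed '=') and three "
        "distractors (prefixed '~'). Return only the GIFT text."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    # One question per sorting algorithm, concatenated into a GIFT file
    # that Moodle's question-bank import dialog can read directly.
    topics = ["bubble sort", "quicksort", "merge sort", "heap sort"]
    gift = "\n\n".join(generate_gift_question(t) for t in topics)
    with open("sorting_quiz.gift", "w", encoding="utf-8") as f:
        f.write(gift)
```

The resulting sorting_quiz.gift file can be imported through Moodle's question bank (Import, format GIFT); checking the generated distractors by hand before import would align with the manual analysis of generated content reported in the abstract.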
|
|
|