Computationally Scalable Lecture Assistance through Large Language Models and Retrieval Augmented Generation in Consumer-Grade Hardware

Show simple item record

dc.contributor.author Jiyadh, I.
dc.contributor.author Thirukumaran, S.
dc.date.accessioned 2026-03-07T07:47:56Z
dc.date.available 2026-03-07T07:47:56Z
dc.date.issued 2025
dc.identifier.uri http://drr.vau.ac.lk/handle/123456789/1949
dc.description.abstract Taking notes during lectures is cognitively demanding and often hindered by environmental factors, with students with learning disabilities (LD) facing even greater challenges. This research addresses a critical gap by developing an automated lecture assistance tool that generates detailed notes, answers students’ retrospective questions about lectures, and produces practice questions to enhance comprehension. Existing tools either rely on proprietary, resource-intensive Large Language Models (LLMs), which require paid subscriptions, or on keyword-extraction mechanisms such as TF-IDF, and none of them can answer students’ questions based on past lectures or generate theoretical questions from past lectures to improve comprehension. The primary objectives are to identify resource-efficient LLMs, fine-tune them for generating structured lecture notes, integrate an effective Retrieval Augmented Generation (RAG) system, and evaluate the end-to-end pipeline for accuracy and relevance. The methodology involved selecting a suitable Automatic Speech Recognition (ASR) system for lecture audio transcription, creating datasets of lecture transcripts and notes, fine-tuning compact LLMs using parameter-efficient techniques, and integrating a RAG pipeline. The fine-tuned models’ performance was evaluated with metrics including ROUGE, BERTScore, and perplexity to select the LLM that would serve as the base model for the tool. RAG Assessment (RAGAS) was used to evaluate the built RAG system. A comparative analysis of the LLM’s behaviour with and without RAG integration showed that RAG integration reduced LLM hallucinations by 37%.
The proposed solution demonstrates that high-quality, accessible lecture assistance tools can be built even on modest hardware, supporting students, including those with LD, by automating note-taking, highlighting key announcements, enabling retrospective query answering, and generating practice questions. To our knowledge, this is the first study to fine-tune LLMs, integrate them with RAG, and deploy them on consumer-grade hardware for lecture assistance. en_US
dc.language.iso en en_US
dc.publisher Faculty of Applied Science, University of Vavuniya, Sri Lanka en_US
dc.subject Fine-tuning LLMs en_US
dc.subject Learning disabilities en_US
dc.subject Lecture note-taking en_US
dc.subject RAG en_US
dc.subject Scalable tool en_US
dc.title Computationally Scalable Lecture Assistance through Large Language Models and Retrieval Augmented Generation in Consumer-Grade Hardware en_US
dc.type Conference abstract en_US
dc.identifier.proceedings 1st International Conference on Applied Sciences - 2025 en_US


This item appears in the following Collection(s)

  • ICAS - 2025 [59]
    International Conference on Applied Sciences - 2025
