| dc.description.abstract |
Taking notes during lectures is cognitively demanding and is often hindered by environmental factors, and students with learning disabilities (LD) face even greater challenges. This research addresses a critical gap by developing an automated lecture-assistance tool that generates detailed notes, answers students' retrospective questions about lectures, and produces practice questions to enhance comprehension.
Existing tools either rely on proprietary, resource-intensive Large Language Models (LLMs), which require subscriptions, or on keyword-extraction mechanisms such as TF-IDF, and none of them can answer students' questions based on past lectures or generate theoretical questions from those lectures to improve comprehension. The primary objectives are to identify resource-efficient LLMs, fine-tune them for generating structured lecture notes, integrate an effective Retrieval-Augmented Generation (RAG) system, and evaluate the end-to-end pipeline
for accuracy and relevance. The methodology involved selecting a suitable Automated Speech Recognition
(ASR) system for lecture audio transcription, creating datasets of lecture transcripts and notes, fine-tuning
compact LLMs using parameter-efficient techniques, and integrating the resulting model into a RAG pipeline. The fine-tuned
model's performance was evaluated with metrics including ROUGE, BERTScore, and perplexity to select the LLM that would serve as the base model for the tool. The RAG Assessment (RAGAS) framework was used to evaluate the built RAG system. A comparative analysis of the LLM's behaviour with and without RAG integration showed that RAG integration reduced LLM hallucinations by 37%. The proposed solution demonstrates that high-quality, accessible lecture-assistance
tools can be built even on modest hardware, supporting students, including those with LD, by automating note-taking, highlighting key announcements, enabling retrospective query answering, and generating practice questions. To our knowledge, this is the first study to fine-tune LLMs, integrate them with RAG, and deploy
them on consumer-grade hardware for lecture assistance. |
en_US |