Automatic generation of introductory programming exercises with large language models

Nguyen Binh Duong Ta
Hua Gia Phuc Nguyen
Swapna Gottipati

Abstract

Despite recent advances in code generation enabled by large language models (LLMs), programming remains an essential skill that computing students must master now and in the foreseeable future. When learning to program, frequent practice with exercises set at an appropriate difficulty and knowledge level is crucial for students. However, creating many good-quality exercises customized for each student is not a trivial task for instructors. Programming problems found on Internet sources such as LeetCode are mostly too challenging for novice programmers with no prior coding experience. Recent work in AI-enabled education has leveraged LLMs for adaptive feedback generation on code submitted by students, but little work has addressed generating customized exercises that give students more practice. In this work, we propose ExGen, an automatic exercise generation system that uses LLMs such as OpenAI’s GPT models to generate on-demand, customized, and ready-to-use programming exercises for individual students. ExGen is designed as a plugin for Visual Studio Code. It incorporates a set of prompting strategies for candidate exercise generation and a novel chain of automatic filtering mechanisms to select ready-to-use exercises, making it more convenient to use than chatbots such as ChatGPT. We conducted an extensive performance evaluation using more than 1,400 generated Python exercises, considering several prompting strategies with various keyword and seed exercise types, filtering techniques, difficulty levels, and LLMs with different generative performance and cost. The results demonstrate the effectiveness of ExGen’s design and implementation.

Article Details

How to Cite
Ta, N. B. D., Nguyen, H. G. P., & Gottipati, S. (2026). Automatic generation of introductory programming exercises with large language models. Research and Practice in Technology Enhanced Learning, 21, 025. https://doi.org/10.58459/rptel.2026.21025
Section
Articles