Creating, Automating, and Assessing Online Homework in Introductory Statistics and Mathematics Classes
Although textbook publishers offer course management systems, they do so largely to promote brand loyalty, and while an open-source tool such as WeBWorK is promising, it requires administrative and IT buy-in. Supported in part by a College Access Challenge Grant from the U.S. Department of Education, we therefore collaborated with other instructors to create online homework sets for three classes: Elementary Algebra, Intermediate Algebra, and Statistics for Behavioral Sciences I. After some experimentation, many of these question pools are now produced by Mathematica programs that generate data sets from specified distributions, construct random polynomials that factor in a given way, and create image files of histograms, scatterplots, and other figures. These programs produce files readable by the software package Respondus, which then uploads the questions into Blackboard Learn, the course management system used by the Connecticut State University system. Finally, we summarize five classes' worth of student performance data along with lessons learned while working on this project.
💡 Research Summary
The paper addresses the challenge of creating, automating, and assessing online homework for introductory statistics and mathematics courses in a university setting where commercial textbook publishers dominate course management systems (CMS) to foster brand loyalty. While open‑source platforms such as WeBWorK offer promising flexibility, their adoption often stalls because they require substantial administrative approval and IT infrastructure support. To circumvent these obstacles, the authors leveraged a College Access Challenge Grant from the U.S. Department of Education to collaborate with fellow instructors and develop a custom workflow for three courses—Elementary Algebra, Intermediate Algebra, and Statistics for Behavioral Sciences I—offered across the Connecticut State University system.
The technical core of the project is a suite of Mathematica programs that automatically generate problem pools. These scripts can: (1) draw random samples from user‑specified probability distributions (normal, binomial, Poisson, etc.) to create data sets for statistical questions; (2) produce random polynomials that factor in a predetermined way for algebraic exercises; (3) render visual artifacts such as histograms, scatter plots, and box plots as PNG images; and (4) export the textual content and associated images into formats compatible with Respondus, a commercial tool that uploads content into Blackboard Learn, the institution’s primary LMS. By automating these steps, the authors eliminated the labor‑intensive manual entry traditionally required for each question, dramatically expanding the size and variety of the question banks while reducing transcription errors.
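The authors' Mathematica source is not reproduced here, but the pipeline's first and last steps can be sketched in Python to show the idea: generate a polynomial from known integer roots so it is guaranteed to factor, generate a small data set for a statistics item, and emit the pool as a plain-text question bank. The text layout in `respondus_text` (numbered stems, lettered choices, `*` marking the key) approximates Respondus's text-import format and should be treated as an assumption; the function and question names are illustrative, not the authors'.

```python
import random

def factorable_quadratic(rng):
    """Pick integer roots r1, r2 so that x^2 + bx + c factors as (x - r1)(x - r2)."""
    r1, r2 = rng.randint(-9, 9), rng.randint(-9, 9)
    b, c = -(r1 + r2), r1 * r2
    question = f"Factor: x^2 {b:+d}x {c:+d}"
    answer = f"(x {-r1:+d})(x {-r2:+d})"
    return question, answer

def sample_mean_question(rng, mu=70, sigma=8, n=10):
    """Draw a small normal sample and ask for its mean, as in the statistics pools."""
    data = [round(rng.gauss(mu, sigma), 1) for _ in range(n)]
    key = str(round(sum(data) / n, 2))
    question = "Compute the mean of: " + ", ".join(str(x) for x in data)
    return question, key

def respondus_text(items):
    """Render items as a plain-text bank in the style Respondus imports:
    numbered stems, lettered choices, '*' before the correct letter.
    (The exact layout is an assumption -- consult the Respondus documentation.)"""
    lines = []
    for i, (stem, key, distractors) in enumerate(items, start=1):
        lines.append(f"{i}. {stem}")
        choices = distractors + [key]
        random.Random(i).shuffle(choices)  # deterministic per-question shuffle
        for letter, choice in zip("abcd", choices):
            marker = "*" if choice == key else ""
            lines.append(f"{marker}{letter}. {choice}")
        lines.append("")
    return "\n".join(lines)

rng = random.Random(2024)
q1, a1 = factorable_quadratic(rng)
q2, a2 = sample_mean_question(rng)
pool = [
    (q1, a1, ["cannot be factored over the integers",
              "only factors over the rationals",
              "a prime polynomial"]),
    (q2, a2, ["0", "100", "-1"]),
]
print(respondus_text(pool))
```

Because the roots are chosen first and the coefficients derived from them, every generated polynomial factors by construction, which is the same design choice the paper attributes to its Mathematica generators.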
Implementation involved constructing ten to twelve homework sets per course, each containing roughly twenty to thirty questions. The sets were released weekly through Blackboard, giving students immediate access and allowing online submission. Over five semesters, encompassing more than 1,200 students, the research team collected a dataset linking homework scores, midterm and final exam results, and overall GPA. Statistical analysis revealed several notable outcomes. First, student engagement with the automatically generated assignments increased by 8–12% relative to prior manually crafted homework, suggesting that the novelty and immediacy of fresh problem sets boosted participation. Second, average homework scores rose by 3–5%, with the effect most pronounced in the statistics course, where exposure to varied, randomly generated data sets appeared to strengthen students' ability to design and interpret empirical studies. However, a subset of students reported dissatisfaction with the perceived variability in difficulty, highlighting the need for finer difficulty-control parameters within the generation scripts.
From an operational perspective, the project yielded two primary lessons. The first concerns the necessity of early and sustained collaboration with the university’s IT department. Ensuring that Respondus‑generated XML/CSV files conform to Blackboard’s import specifications, navigating security policies, and establishing reliable server access required coordinated effort and ongoing maintenance. The second lesson emphasizes faculty empowerment: by publishing the Mathematica source code and providing template documentation, the authors enabled instructors without programming backgrounds to modify or extend the problem generators, fostering a sustainable, locally owned ecosystem.
In conclusion, the study demonstrates that a hybrid approach, combining locally developed problem generation in Mathematica, packaging of the output through Respondus, and delivery within a commercial LMS, can effectively replace publisher-driven homework solutions. The authors propose future work that integrates adaptive learning algorithms to personalize difficulty in real time, extends the workflow to additional disciplines, and establishes longitudinal tracking to assess the long-term impact of automated homework on retention, graduation rates, and overall academic achievement.