From Curriculum Guidelines to Learning Objectives: A Survey of Five Statistics Programs
The 2000 ASA Guidelines for Undergraduate Statistics Majors aimed to advise programs offering undergraduate degrees in statistics about the content and skills their majors should be learning. With new guidelines forthcoming, it is important to help programs develop an assessment cycle for ongoing evaluation: How do we know students are learning what we want them to learn? How do we improve the program over time? The first step in this process is to translate the broad Guidelines into institution-specific, measurable learning outcomes. This paper provides examples of how five programs did so for the 2000 Guidelines. We hope they serve as illustrative examples for programs moving forward with the new guidelines.
💡 Research Summary
The paper examines how five undergraduate statistics programs translated the 2000 American Statistical Association (ASA) Guidelines into concrete, measurable learning outcomes, providing a practical model for institutions preparing to adopt forthcoming revisions of those guidelines. The authors first decompose the ASA Guidelines into twelve thematic domains—data acquisition and cleaning, exploratory visualization, probability theory, statistical inference, regression and ANOVA, experimental design, computational tools, communication, ethical and legal considerations, among others. Each of the five case-study programs then maps these domains onto institution‑specific learning outcomes that are explicitly observable and assessable.
In doing so, the programs employ educational frameworks such as Bloom’s Taxonomy to differentiate cognitive levels, ensuring that objectives progress from basic recall and comprehension to application, analysis, synthesis, and evaluation. For example, one university’s outcome states that students will “clean real‑world data sets, handle missing values, and produce appropriate visualizations,” and pairs this with a capstone data‑cleaning project graded by rubric criteria (e.g., 90% accuracy in imputation, correct graph selection). Another program specifies that students must “formulate and test statistical hypotheses using appropriate models,” which is evaluated through a combination of written exams, simulation exercises, and a final report.
A central contribution of the article is its description of an assessment cycle that links learning outcomes to a diversified set of evaluation tools. Traditional exams are supplemented with authentic tasks such as coding assignments in R or Python, portfolio submissions, peer‑reviewed reports, and industry‑partnered capstone projects. These instruments generate both quantitative scores and qualitative feedback, enabling faculty to monitor achievement of each outcome over time. The feedback loop informs iterative curriculum revisions, aligning instruction with the measured performance of students on each objective.
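To make the feedback loop concrete, the aggregation step might be sketched as follows. This is a minimal illustration, not the paper's method: all outcome names, instrument names, score values, and the achievement threshold are hypothetical, chosen only to mirror the kinds of outcomes and instruments the article describes.

```python
# Hypothetical sketch: averaging per-outcome rubric scores (0-1 scale)
# across multiple assessment instruments and flagging outcomes that fall
# below a program-chosen achievement target. Names and data are invented.

from statistics import mean

# Scores keyed by learning outcome, then by assessment instrument.
scores = {
    "data_cleaning": {
        "capstone_project": [0.92, 0.85, 0.78],
        "coding_assignment": [0.88, 0.95],
    },
    "hypothesis_testing": {
        "written_exam": [0.70, 0.65],
        "simulation_exercise": [0.80, 0.75],
    },
}

TARGET = 0.80  # hypothetical achievement threshold set by the program


def outcome_report(scores, target=TARGET):
    """Average each outcome across all instruments; flag shortfalls."""
    report = {}
    for outcome, instruments in scores.items():
        all_scores = [s for values in instruments.values() for s in values]
        avg = mean(all_scores)
        report[outcome] = {"mean": round(avg, 2), "met": avg >= target}
    return report


print(outcome_report(scores))
```

A report like this, regenerated each assessment cycle, is one simple way faculty could track which outcomes meet their targets and direct curriculum revisions toward those that do not.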
The study also highlights variations among the five programs. Some institutions place heightened emphasis on computational fluency, integrating modern data‑science environments and cloud‑based platforms into their outcomes. Others foreground communication skills, requiring students to present statistical findings to non‑technical audiences. One program uniquely incorporates ethical and legal considerations as a standalone outcome, using case‑based discussions and reflective essays to assess students’ awareness of data privacy and research integrity.
By documenting these concrete translation processes, the authors provide a template that other statistics departments can adapt when aligning their curricula with new ASA standards. The paper demonstrates that systematic conversion of broad guidelines into specific, measurable learning outcomes—paired with a robust, multi‑modal assessment infrastructure—creates a sustainable mechanism for continuous program improvement. This model not only clarifies what students should know and be able to do but also offers a clear pathway for institutions to evaluate, refine, and demonstrate the effectiveness of their undergraduate statistics education.