Student Satisfaction mining in a typical core course of Computer Science


Student satisfaction plays a vital role in the success of an educational institute. Many institutes therefore continuously improve their services to create a supportive learning environment that meets student needs. To this end, institutions collect student satisfaction data to inform decisions about institutional quality, yet satisfaction remains difficult to measure because it is a complex matter influenced by a variety of student and institutional characteristics. Many studies have examined student satisfaction in terms of college services, programs, student accommodation, student–faculty interaction, consulting hours, and so on, but there is still no standard method for assessing satisfaction with a specific core course. In this research we determine the attributes that most heavily affect student satisfaction in a core course of computer science, as well as the current status of the other attributes.


💡 Research Summary

The paper investigates student satisfaction within a core Computer Science course by applying data‑mining techniques to identify the most influential factors and to assess the current status of other attributes. The authors begin by highlighting the pivotal role of satisfaction in institutional success and note that, while many studies have examined overall university services, a standardized method for evaluating satisfaction in a specific core course is lacking.

A structured questionnaire was developed based on an extensive literature review, covering 30 items that span lecture design, professor‑student interaction, assignment and exam difficulty, perceived fairness of grading, laboratory resources, and ancillary support services such as tutoring and counseling. The survey was administered at the end of the 2024 spring semester to 215 students enrolled in a mandatory Computer Science course (e.g., Data Structures or Algorithms), achieving a 93 % response rate. Responses were recorded on a five‑point Likert scale.

Data preprocessing included mean imputation of missing values, Z‑score normalization, and a multicollinearity check (all VIF < 2). Exploratory analysis revealed moderate correlations among several items, prompting the use of factor analysis to reduce dimensionality. Six latent factors—lecture design, professor‑student interaction, grading fairness, assignment difficulty, lab environment, and learning support—explained 68 % of the total variance.
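The paper does not publish its preprocessing code, but the two column-level steps it names are standard. A minimal sketch of mean imputation and Z‑score standardization on a single Likert item (the data values here are hypothetical):

```python
import math

def mean_impute(column):
    """Replace missing values (None) with the mean of the observed values."""
    observed = [x for x in column if x is not None]
    mean = sum(observed) / len(observed)
    return [mean if x is None else x for x in column]

def z_score(column):
    """Standardize a column to zero mean and unit (population) variance."""
    n = len(column)
    mean = sum(column) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in column) / n)
    return [(x - mean) / std for x in column]

# Hypothetical five-point Likert responses with one missing answer
item = [4, 5, None, 3, 4]
imputed = mean_impute(item)        # the missing response becomes 4.0
standardized = z_score(imputed)    # now mean 0, variance 1
```

The VIF check mentioned above would follow the same column-wise pattern: regress each standardized item on the others and compute VIF = 1 / (1 − R²), flagging items above the chosen threshold.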

To uncover patterns and predictive relationships, the authors employed a suite of machine‑learning methods: K‑means clustering (k = 3) identified high‑, medium‑, and low‑satisfaction groups; decision‑tree (C4.5) and random‑forest classifiers were trained to predict overall satisfaction, with the random forest achieving the highest performance (accuracy = 84 %, AUC = 0.89). Variable importance from the random forest placed professor‑student interaction (27 % importance), grading fairness (22 %), and appropriate assignment difficulty (18 %) at the top, followed by lecture clarity, lab equipment availability, and tutoring support.
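The clustering step above is ordinary k‑means with k = 3. Since the paper clusters students by satisfaction level, the idea can be shown in one dimension on hypothetical mean-satisfaction scores (the real study clustered on the full questionnaire, not a single score):

```python
def kmeans_1d(values, centroids, iters=20):
    """Minimal 1-D k-means: assign each value to its nearest centroid,
    then recompute each centroid as the mean of its assigned values."""
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            nearest = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical per-student mean satisfaction scores on the 1-5 scale
scores = [1.8, 2.1, 2.9, 3.1, 3.0, 4.5, 4.8, 4.6]
centroids, clusters = kmeans_1d(scores, centroids=[2.0, 3.0, 4.0])
# centroids converge to the low-, medium-, and high-satisfaction group means
```

The supervised step would then train the classifiers on the survey items with the cluster (or overall-satisfaction) label as the target; the reported variable importances are the random forest's standard impurity-based importances.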

Association‑rule mining (Apriori, min‑support = 0.15, min‑confidence = 0.70) produced actionable insights. Notably, the rule “high grading fairness → high overall satisfaction” had a confidence of 0.81 and lift of 1.45, while “insufficient lab equipment → low satisfaction” showed confidence 0.74 and lift 1.32. Another rule linked frequent professor feedback with perceived appropriate assignment difficulty (confidence = 0.68, lift = 1.27).
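Each reported rule is summarized by the three standard Apriori metrics. A short sketch of how support, confidence, and lift are computed for one rule, on made-up transactions (the item labels and data here are illustrative, not the paper's):

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence, and lift for the rule antecedent -> consequent,
    where each transaction is a set of item labels."""
    n = len(transactions)
    n_a = sum(1 for t in transactions if antecedent <= t)
    n_c = sum(1 for t in transactions if consequent <= t)
    n_both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = n_both / n                 # P(A and C)
    confidence = n_both / n_a            # P(C | A)
    lift = confidence / (n_c / n)        # P(C | A) / P(C)
    return support, confidence, lift

# Hypothetical discretized survey responses encoded as item sets
data = [
    {"high_fairness", "high_satisfaction"},
    {"high_fairness", "high_satisfaction"},
    {"high_fairness"},
    {"high_satisfaction"},
    {"low_lab_quality"},
]
s, c, l = rule_metrics(data, {"high_fairness"}, {"high_satisfaction"})
```

A lift above 1, as in the paper's rules (1.45 and 1.32), means the consequent is more likely when the antecedent holds than it is overall, which is what makes the rules actionable.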

The combined findings indicate that, in a core CS course, relational aspects (interaction, feedback) and procedural fairness outweigh purely content‑related factors in shaping student satisfaction. Laboratory resources and supplemental support services, although important, currently lag behind in perceived quality.

Based on these results, the authors propose concrete interventions: (1) institutionalize regular, structured feedback sessions and transparent grading rubrics to boost interaction and fairness; (2) calibrate assignment difficulty using pre‑assessment of student readiness and adopt a scaffolded difficulty progression; (3) allocate budget for modernizing lab equipment and expand tutoring/mentoring programs to strengthen support services.

In summary, the study demonstrates that a data‑driven, multi‑method approach can reliably pinpoint the drivers of satisfaction in a specific academic context, offering a replicable framework for other institutions seeking to enhance the learning experience of core courses.

