Hybrid Medical Image Classification Using Association Rule Mining with Decision Tree Algorithm

The proposed method applies image mining to the classification of brain tumors in CT scan brain images. The major steps in the system are pre-processing, feature extraction, association rule mining, and hybrid classification. Pre-processing is performed with a median filter, and edge features are extracted with the Canny edge detection technique. This paper proposes a hybrid of two image mining approaches: frequent patterns are generated from the CT scan images by the frequent pattern tree (FP-Tree) algorithm, which mines the association rules, and a decision tree method then classifies the medical images for diagnosis. The hybrid method makes the classification process more accurate and more efficient than traditional image mining methods. Experiments on a pre-diagnosed database of brain images showed 97% sensitivity and 95% accuracy. Physicians can use this decision tree classification stage to classify brain images as normal, benign, or malignant for effective medical diagnosis.


💡 Research Summary

The paper presents a hybrid framework for classifying brain‑tumor CT images that combines association‑rule mining with a decision‑tree classifier. The workflow consists of four main stages: (1) preprocessing, (2) feature extraction, (3) frequent‑pattern mining, and (4) hybrid classification. In the preprocessing stage, a median filter is applied to suppress impulse noise, followed by Canny edge detection to accentuate anatomical boundaries. These processed images serve as the basis for extracting low‑level descriptors such as edge length, orientation histograms, and basic intensity statistics; although the exact feature vector is not exhaustively described, it aligns with common texture‑and‑shape cues used in medical imaging.
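The preprocessing stage described above can be sketched in a few lines. This is a minimal illustrative version, not the paper's implementation: it uses a pure-NumPy 3×3 median filter for impulse-noise suppression and a simple gradient-magnitude threshold as a stand-in for the full Canny pipeline (which additionally performs Gaussian smoothing, non-maximum suppression, and hysteresis thresholding). The synthetic "scan" array and the threshold value are assumptions for demonstration.

```python
import numpy as np

def median_filter3x3(img):
    """3x3 median filter to suppress impulse (salt-and-pepper) noise."""
    padded = np.pad(img, 1, mode="edge")
    # Stack the 9 shifted views covering each pixel's 3x3 neighborhood.
    stack = np.stack([padded[r:r + img.shape[0], c:c + img.shape[1]]
                      for r in range(3) for c in range(3)])
    return np.median(stack, axis=0)

def edge_map(img, threshold=50.0):
    """Gradient-magnitude edge detector (simplified stand-in for Canny)."""
    gy, gx = np.gradient(img.astype(float))
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

# Tiny synthetic "scan": dark background, bright square, one noise speck.
scan = np.zeros((8, 8))
scan[2:6, 2:6] = 200.0
scan[0, 0] = 255.0          # impulse-noise pixel the median filter removes
denoised = median_filter3x3(scan)
edges = edge_map(denoised)
```

The edge map (or features derived from it, such as edge length and orientation statistics) would then feed the feature-extraction step.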

The extracted descriptors are fed into an FP‑Tree (Frequent Pattern Tree) algorithm. FP‑Tree builds a compact prefix‑tree representation of the dataset in a single pass, enabling rapid discovery of all itemsets that satisfy a user‑defined minimum support. From these frequent itemsets, association rules of the form “feature combination → class label (normal, benign, malignant)” are generated. The rules capture co‑occurrence patterns among features that are often missed by classifiers that treat each feature independently. However, a raw rule set can become excessively large and prone to over‑fitting.
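The rule-mining step can be illustrated with a toy example. The sketch below uses brute-force itemset enumeration rather than an FP-Tree — it produces the same frequent itemsets on small data, but FP-growth avoids the per-candidate rescans that make brute force impractical at scale. The discretized feature tokens (`edge:high`, `texture:coarse`, …) and the support/confidence thresholds are invented for illustration.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """All itemsets meeting min_support (brute-force stand-in for FP-growth)."""
    counts = Counter()
    for t in transactions:
        items = sorted(t)
        for size in range(1, len(items) + 1):
            for combo in combinations(items, size):
                counts[combo] += 1
    n = len(transactions)
    return {combo: c / n for combo, c in counts.items() if c / n >= min_support}

def class_rules(freq, classes, min_confidence):
    """Derive rules 'feature itemset -> class label' with their confidence."""
    rules = []
    for itemset, support in freq.items():
        labels = [i for i in itemset if i in classes]
        if len(labels) != 1 or len(itemset) < 2:
            continue  # keep only rules with exactly one class in the consequent
        antecedent = tuple(i for i in itemset if i not in classes)
        if antecedent in freq:
            conf = support / freq[antecedent]
            if conf >= min_confidence:
                rules.append((antecedent, labels[0], conf))
    return rules

# Toy transactions: discretized feature tokens plus a class label per image.
data = [
    {"edge:high", "texture:coarse", "class:malignant"},
    {"edge:high", "texture:coarse", "class:malignant"},
    {"edge:high", "texture:fine", "class:benign"},
    {"edge:low", "texture:fine", "class:normal"},
]
freq = frequent_itemsets(data, min_support=0.25)
rules = class_rules(freq, {"class:normal", "class:benign", "class:malignant"}, 0.8)
```

The confidence filter is one simple way to prune the raw rule set before it reaches the classifier; the paper does not detail its exact pruning strategy.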

To mitigate this, the authors integrate a decision‑tree classifier. The tree uses the most informative rules (or the underlying features they involve) as splitting criteria, constructing a hierarchical model that is both fast to evaluate and easy to interpret. By feeding the rule‑derived features into the tree, the hybrid system leverages the expressive power of association mining while retaining the simplicity and speed of a conventional classifier.
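One plausible realization of this hybrid step is to encode each image as binary "did rule i fire?" indicators and grow a small decision tree over them. The paper does not specify its splitting criterion, so the sketch below assumes a standard ID3-style information-gain split; the feature names (`r1`, `r2`) and the toy data are invented.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, features):
    """Tiny ID3-style tree over binary rule-indicator features."""
    if len(set(labels)) == 1:
        return labels[0]                       # pure leaf
    if not features:
        return Counter(labels).most_common(1)[0][0]  # majority leaf

    def gain(f):
        split = {}
        for row, lab in zip(rows, labels):
            split.setdefault(row[f], []).append(lab)
        return entropy(labels) - sum(
            len(part) / len(labels) * entropy(part) for part in split.values())

    best = max(features, key=gain)
    node = {"feature": best, "children": {}}
    rest = [f for f in features if f != best]
    for value in {row[best] for row in rows}:
        sub = [(r, l) for r, l in zip(rows, labels) if r[best] == value]
        node["children"][value] = build_tree([r for r, _ in sub],
                                             [l for _, l in sub], rest)
    return node

def predict(tree, row):
    while isinstance(tree, dict):
        tree = tree["children"][row[tree["feature"]]]
    return tree

# Toy rule-indicator data: r1/r2 mark which mined rules fired for an image.
X = [{"r1": 1, "r2": 0}, {"r1": 1, "r2": 1}, {"r1": 0, "r2": 0}, {"r1": 0, "r2": 1}]
y = ["malignant", "malignant", "normal", "benign"]
tree = build_tree(X, y, ["r1", "r2"])
```

Because each internal node tests a single rule indicator, the resulting model stays interpretable: a clinician can read off which mined rule drove each split.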

Experimental validation was performed on a pre‑diagnosed brain‑CT database containing three classes: normal, benign tumor, and malignant tumor. Performance was measured using sensitivity (true‑positive rate) and overall accuracy. The hybrid model achieved 97 % sensitivity and 95 % accuracy, outperforming previously reported single‑method image‑mining approaches. The high sensitivity, especially for malignant cases, is clinically valuable because it reduces the risk of missed diagnoses.
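The two reported metrics are straightforward to compute from predictions; the sketch below shows per-class sensitivity (true positives over actual positives for one class) and overall accuracy on invented toy labels, not the paper's data.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground truth."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def sensitivity(y_true, y_pred, positive):
    """True-positive rate for one class (e.g. 'malignant')."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    actual_pos = sum(t == positive for t in y_true)
    return tp / actual_pos

# Toy predictions over the three classes used in the paper.
y_true = ["malignant", "malignant", "benign", "normal", "normal"]
y_pred = ["malignant", "malignant", "normal", "normal", "normal"]
```

Here accuracy is 0.8 and malignant-class sensitivity is 1.0, illustrating why the two numbers can diverge: a model can catch every malignant case yet still misclassify other classes.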

Despite these promising results, several methodological gaps limit reproducibility and broader applicability. The size of the dataset, the exact train‑test split, and whether cross‑validation was employed are not disclosed, making it difficult to assess the robustness of the reported metrics. The paper also lacks a comparative analysis against baseline classifiers such as plain decision trees, support vector machines, or deep‑learning CNNs, which are common benchmarks in contemporary medical image analysis. Computational complexity is discussed only qualitatively; quantitative timing or memory‑usage figures for FP‑Tree construction and decision‑tree training would be essential for evaluating real‑time clinical deployment.

Future work could address these shortcomings by (i) expanding the feature set to include multi‑scale texture descriptors or deep‑learning‑derived embeddings, (ii) performing extensive cross‑institutional validation on larger, more diverse cohorts, (iii) providing a detailed ablation study to isolate the contribution of association rules versus the decision tree, and (iv) optimizing the pipeline for GPU acceleration to meet the latency requirements of bedside decision‑support systems.

In summary, the study introduces a novel hybrid architecture that fuses association‑rule mining with decision‑tree classification for brain‑CT tumor detection. The reported 97 % sensitivity and 95 % accuracy demonstrate that the combination can improve diagnostic performance over traditional single‑algorithm pipelines. While the concept is sound and the initial results are encouraging, further empirical validation, transparent reporting, and integration with modern deep‑learning techniques are needed to fully establish the method’s clinical utility.
