Towards Scalable Meta-Learning of near-optimal Interpretable Models via Synthetic Model Generations

Reading time: 1 minute

📝 Original Info

  • Title: Towards Scalable Meta-Learning of near-optimal Interpretable Models via Synthetic Model Generations
  • ArXiv ID: 2511.04000
  • Date: 2025-11-06
  • Authors: Not provided in the source data (please check the paper for the listed authors).

📝 Abstract

Decision trees are widely used in high-stakes fields like finance and healthcare due to their interpretability. This work introduces an efficient, scalable method for generating synthetic pre-training data to enable meta-learning of decision trees. The approach samples near-optimal decision trees synthetically, creating large-scale, realistic datasets. Using the MetaTree transformer architecture, the authors demonstrate that this method achieves performance comparable to pre-training on real-world data or on computationally expensive optimal decision trees. This strategy significantly reduces computational cost, increases flexibility in data generation, and paves the way for scalable and efficient meta-learning of interpretable decision tree models.
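To make the abstract's pipeline concrete, here is a minimal sketch of what a synthetic (dataset, tree) pre-training pair might look like. The function name `sample_synthetic_task`, the noisy linear labeling rule, and the use of scikit-learn's greedy CART learner as the "near-optimal" tree sampler are illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def sample_synthetic_task(n_samples=256, n_features=10, max_depth=3, seed=0):
    """Return one (X, y, tree) pre-training example.

    The tree is fit greedily (CART), standing in for the paper's
    near-optimal tree sampler; exact optimal-tree solvers would yield
    the same kind of target at far higher compute cost.
    """
    rng = np.random.default_rng(seed)

    # 1. Draw a random tabular dataset.
    X = rng.standard_normal((n_samples, n_features))

    # 2. Label it with a simple noisy rule (a stand-in generator;
    #    the paper's actual data-generating process may differ).
    w = rng.standard_normal(n_features)
    y = (X @ w + 0.1 * rng.standard_normal(n_samples) > 0).astype(int)

    # 3. Fit a shallow greedy tree as the near-optimal target model.
    tree = DecisionTreeClassifier(max_depth=max_depth, random_state=seed)
    tree.fit(X, y)
    return X, y, tree

# Scale up: a corpus of (dataset, tree) pairs for meta-training
# a MetaTree-style transformer.
corpus = [sample_synthetic_task(seed=s) for s in range(10_000)]
```

The trade-off the abstract highlights is visible here: a greedy fit is far cheaper than exact optimal-tree search, yet still yields tree targets suitable for large-scale pre-training.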

Reference

This content is AI-processed based on open access ArXiv data.
