Puppet-CNN: Continuous Parameter Dynamics for Input-Adaptive Convolutional Networks
Modern convolutional neural networks (CNNs) organize computation as a discrete stack of layers whose parameters are independently stored and learned, with the number of layers fixed as an architectural hyperparameter. In this work, we explore an alternative perspective: can network parameterization itself be modeled as a continuous dynamical system? We introduce Puppet-CNN, a framework that represents convolutional layer parameters as states evolving along a learned parameter flow governed by a neural ordinary differential equation (ODE). Under this formulation, layer parameters are generated through continuous evolution in parameter space, and the effective number of generated layers is determined by the integration horizon of the learned dynamics, which can be modulated by input complexity to enable input-adaptive computation. We validate this formulation on standard image classification benchmarks and demonstrate that continuous parameter dynamics can achieve competitive predictive performance while substantially reducing stored trainable parameters. These results suggest that viewing neural network parameterization through the lens of dynamical systems provides a structured and flexible design space for adaptive convolutional models.
💡 Research Summary
Puppet-CNN introduces a novel way of viewing convolutional neural networks: instead of storing an independent weight tensor for each layer, the entire set of convolutional kernels is treated as a continuous dynamical system that evolves over a normalized depth coordinate s ∈ [0, 1].
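The abstract describes kernels being generated by integrating a learned parameter ODE, with the integration horizon controlling how many effective layers are produced. As an illustration only, here is a minimal NumPy sketch of that idea: `generate_kernels` and the toy linear vector field are hypothetical stand-ins (the paper's dynamics would be a learned neural network, and a proper ODE solver would replace the simple Euler steps used here).

```python
import numpy as np

def generate_kernels(theta0, ode_fn, horizon, n_steps):
    """Euler-integrate a parameter ODE; each step's state is read out as one conv kernel."""
    kernels = []
    theta = theta0.copy()
    ds = horizon / n_steps   # step size along the depth coordinate s
    s = 0.0
    for _ in range(n_steps):
        theta = theta + ds * ode_fn(theta, s)  # one Euler step along the parameter flow
        s += ds
        kernels.append(theta.reshape(3, 3))    # interpret the state as a 3x3 kernel
    return kernels

# Toy dynamics: a fixed linear vector field standing in for the learned ODE network.
rng = np.random.default_rng(0)
A = 0.1 * rng.standard_normal((9, 9))
ode_fn = lambda theta, s: A @ theta

theta0 = rng.standard_normal(9)
# A shorter horizon yields fewer generated layers (cheap path for "easy" inputs);
# a longer horizon yields more (deeper effective network for "hard" inputs).
shallow = generate_kernels(theta0, ode_fn, horizon=0.5, n_steps=2)
deep = generate_kernels(theta0, ode_fn, horizon=1.0, n_steps=6)
```

Note that only `theta0` and the dynamics function are stored, regardless of how many kernels are unrolled, which is the source of the parameter savings the abstract claims.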