PALMS: Pavlovian Associative Learning Models Simulator
Simulations are an indispensable step in the cycle of theory development and refinement, helping researchers formulate precise definitions, generate models, and make accurate predictions. This paper introduces the Pavlovian Associative Learning Models Simulator (PALMS), a Python environment to simulate Pavlovian conditioning experiments. In addition to the canonical Rescorla-Wagner model, PALMS incorporates several attentional learning approaches, including Pearce-Kaye-Hall, Mackintosh Extended, Le Pelley’s Hybrid, and a novel extension of the Rescorla-Wagner model with a unified variable learning rate that integrates Mackintosh’s and Pearce and Hall’s opposing conceptualisations. The simulator’s graphical interface allows for the input of entire experimental designs in an alphanumeric format, akin to that used by experimental neuroscientists. Moreover, it uniquely enables the simulation of experiments involving hundreds of stimuli, as well as the computation of configural cues and configural-cue compounds across all models, thereby considerably expanding their predictive capabilities. PALMS operates efficiently, providing instant visualisation of results and supporting rapid, precise comparisons of various models’ predictions within a single architecture and environment. Furthermore, graphic displays can be easily saved, and simulated data can be exported to spreadsheets. To illustrate the simulator’s capabilities and functionalities, we provide a detailed description of the software and examples of use, reproducing published experiments in the associative learning literature. PALMS is licensed under the open-source GNU Lesser General Public License 3.0. The simulator source code and the latest multiplatform release build are accessible as a GitHub repository at https://github.com/cal-r/PALMS-Simulator
💡 Research Summary
The paper presents PALMS (Pavlovian Associative Learning Models Simulator), an open‑source Python‑based platform designed to simulate a broad range of Pavlovian conditioning experiments. While the Rescorla‑Wagner (RW) model remains the canonical baseline for associative learning, decades of research have produced attentional extensions, most notably the Pearce‑Hall (PH) and Mackintosh families, which diverge on whether attention is driven by prediction error (PH) or by a cue's predictive success (Mackintosh). Existing simulators either implement a single model or lack the ability to handle large stimulus sets and configural cues, limiting systematic model comparison.
PALMS integrates five models within a single graphical interface:

1. The classic RW model with configurable stimulus salience (α) and US salience (β), including separate β values for present and absent US to capture relative validity effects.
2. The Pearce‑Kaye‑Hall (PKH) model, which updates the attentional parameter α on each trial as a weighted blend (parameter γ) of the absolute prediction error and the previous α.
3. The Mackintosh Extended (ME) model, which adjusts α via separate learning‑rate parameters (θ_E, θ_I) for excitatory and inhibitory updates, again driven by the prediction‑error term ρ.
4. Le Pelley’s Hybrid model, which multiplies the Mackintosh‑style α_M and the Pearce‑Hall‑style α_H to obtain a compound learning rate, thereby reconciling the two attentional accounts.
5. The novel MLAB model, a unified attentional learning‑rate extension of RW that blends the PH and Mackintosh mechanisms into a single variable α_U, allowing it to reproduce phenomena that each original attentional model explains only partially.
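The per‑trial updates behind the RW and PKH models can be sketched in a few lines of Python. This is an illustrative sketch, not PALMS's actual code: the function names and the dictionary‑based cue representation are assumptions, and only the cues passed in `V` are treated as present on the trial.

```python
# Illustrative sketch of the RW and PKH per-trial updates (not PALMS's API).

def rw_update(V, alpha, beta, lam):
    """Rescorla-Wagner: each present cue's strength moves toward the outcome.

    V: dict cue -> associative strength (all cues in V are present this trial)
    alpha: dict cue -> stimulus salience; beta: US salience; lam: US magnitude.
    """
    total = sum(V.values())          # summed prediction of all present cues
    error = lam - total              # shared prediction error
    for cue in V:
        V[cue] += alpha[cue] * beta * error
    return V

def pkh_alpha_update(alpha, abs_error, gamma):
    """Pearce-Kaye-Hall: new alpha is a gamma-weighted blend of the
    absolute prediction error and the previous alpha."""
    return gamma * abs_error + (1 - gamma) * alpha
```

Because the error term is shared across present cues, `rw_update` already yields cue competition; the PKH rule then makes the learning rate itself track recent surprise.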
The GUI accepts experimental designs expressed in an alphanumeric string format (e.g., “A+B‑C”), automatically parses individual cues, compounds, and configural representations, and permits per‑stimulus parameter specification. Users can simulate experiments with hundreds of stimuli, randomize trial sequences, and generate configural cues for all models, a capability rarely found in prior tools. Results are rendered instantly as plots and tables, which can be exported as image files or as CSV/Excel data for downstream statistical analysis. The software runs on Windows, macOS, and Linux, leveraging pure Python 3 and PyQt5 for cross‑platform compatibility. The source code is clean, well‑documented, and released under the GNU LGPL 3.0, encouraging community contributions and extensions.
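A minimal parser for such design strings might look as follows. The grammar here (an optional repeat count, uppercase cue letters, and a `+`/`-` outcome marker, with `/` separating trial types, e.g. `10AB+/10C-`) is an assumption for illustration; PALMS's actual notation may differ in detail.

```python
import re

# One trial spec: optional repeat count, cue letters, optional +/- outcome.
TRIAL = re.compile(r"(\d*)([A-Z]+)([+-]?)")

def parse_phase(phase):
    """Parse a phase string like '10AB+/10C-' into
    (repeat_count, cue_tuple, reinforced) triples."""
    trials = []
    for part in phase.split("/"):
        m = TRIAL.fullmatch(part.strip())
        if not m:
            raise ValueError(f"unrecognised trial spec: {part!r}")
        count = int(m.group(1) or 1)
        cues = tuple(m.group(2))        # "AB" -> ("A", "B"): a compound
        reinforced = m.group(3) == "+"
        trials.append((count, cues, reinforced))
    return trials
```

Splitting compounds into cue tuples is what makes it straightforward to add configural representations later, e.g. by appending a synthetic cue for each unique compound.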
To validate PALMS, the authors reproduced several classic associative‑learning experiments from the literature, demonstrating that each model yields the expected patterns of acquisition, extinction, and blocking. The MLAB model in particular captures both the rapid learning about uncertain cues (the PH prediction) and the accelerated learning about reliable predictors (the Mackintosh prediction) within a single simulation, illustrating how it unifies the two attentional accounts. Performance benchmarks show that even with 200–300 stimuli the simulator completes a full experimental run in seconds, confirming its suitability for large‑scale designs.
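The blocking pattern mentioned above can be reproduced with a self‑contained RW sketch. The parameter values (α = 0.3, β = 1, λ = 1) and 20‑trial phases are arbitrary illustrations, not the paper's settings.

```python
# Self-contained RW sketch of blocking: pretraining A+ should block learning
# about B on subsequent AB+ trials, relative to a control trained on AB+ only.

def run(trials, alpha=0.3, beta=1.0, lam=1.0, V=None):
    """Run a list of (cue_tuple, reinforced) trials; return final strengths."""
    V = dict(V or {})                       # copy so phases can be chained
    for cues, reinforced in trials:
        total = sum(V.get(c, 0.0) for c in cues)
        error = (lam if reinforced else 0.0) - total
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * beta * error
    return V

# Blocked group: Phase 1 is A+ alone, Phase 2 is the AB+ compound.
blocked = run([(("A", "B"), True)] * 20, V=run([(("A",), True)] * 20))
# Control group: AB+ from the start.
control = run([(("A", "B"), True)] * 20)
# B gains little in the blocked group because A already predicts the US,
# leaving almost no prediction error to drive learning about B.
```

This shows why the shared error term alone, without any attentional mechanism, is sufficient for blocking in RW; the attentional models discussed above modulate how fast each cue exploits that error.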
Limitations include reliance on CPU computation, which may become a bottleneck for extremely large networks (thousands of cues), and the current need to modify the code by hand to add learning rules beyond the five provided. Outlined future work includes a plugin architecture, GPU acceleration, and automated parameter‑search utilities (e.g., Bayesian optimization) to streamline model fitting to empirical data.
In sum, PALMS offers a comprehensive, user‑friendly, and extensible environment for researchers, educators, and modelers to test, compare, and develop Pavlovian associative‑learning theories. By making sophisticated attentional models accessible and enabling rapid, reproducible simulations of complex experimental designs, PALMS stands to accelerate theoretical progress and promote open‑science practices in the fields of psychology, neuroscience, and computational cognition.