Title: Near–Real-Time Conflict-Related Fire Detection Using Unsupervised Deep Learning and Satellite Imagery
ArXiv ID: 2512.07925
Date: 2025-12-08
Authors: Kuldip Singh Atwal (George Mason University, Geography and Geoinformation Science); Dieter Pfoser (George Mason University, Geography and Geoinformation Science); Daniel Rothbart (George Mason University, The Jimmy and Rosalynn Carter School for Peace and Conflict Resolution)
📝 Abstract
Ongoing armed conflict in Sudan highlights the need for rapid monitoring of conflict-related fire damage. Recent advances in deep learning and high-frequency satellite imagery enable near-real-time assessment of active fires and burn scars in war zones. This study presents a near-real-time monitoring approach using a lightweight Variational Auto-Encoder (VAE)-based model integrated with 4-band Planet Labs imagery at 3 m spatial resolution. We demonstrate that conflict-related fire damage can be detected with minimal delay using accessible, commercially available satellite data. To achieve this, we adapt a VAE-based model, originally designed for 10-band imagery, to operate effectively on high-resolution 4-band inputs. The model is trained in an unsupervised manner to learn compact latent representations of nominal land-surface conditions and identify fire-affected areas by quantifying changes between temporally paired latent embeddings. Performance is evaluated across five case studies in Sudan and compared against a cosine-distance baseline computed between temporally paired image tiles using precision, recall, F1-score, and the area under the precision-recall curve (AUPRC). Results show that the proposed approach consistently outperforms the baseline, achieving higher recall and F1-scores while maintaining strong precision in highly imbalanced fire-detection scenarios. Experiments with 8-band imagery and temporal image sequences yield only marginal performance gains over single 4-band inputs, underscoring the effectiveness of the proposed lightweight approach for scalable, near-real-time conflict monitoring.
📄 Full Content
Near–Real-Time Conflict-Related Fire Detection Using Unsupervised Deep Learning and Satellite Imagery
Kuldip Singh Atwal1,∗, Dieter Pfoser1, Daniel Rothbart2
1Geography and Geoinformation Science, George Mason University, 4400 University Drive, Fairfax, VA 22030, United States
2The Jimmy and Rosalynn Carter School for Peace and Conflict Resolution, George Mason University, 4400 University Drive, Fairfax, VA 22030, United States
Abstract
Ongoing armed conflict in Sudan highlights the need for rapid monitoring of conflict-related fire damage. Recent advances in deep learning and high-frequency satellite imagery enable near–real-time assessment of active fires and burn scars in war zones. This study presents a near–real-time monitoring approach using a lightweight Variational Auto-Encoder (VAE)–based model integrated with 4-band Planet Labs imagery at 3 m spatial resolution. We demonstrate that conflict-related fire damage can be detected with minimal delay using accessible, commercially available satellite data. To achieve this, we adapt a VAE-based model, originally designed for 10-band imagery, to operate effectively on high-resolution 4-band inputs. The model is trained in an unsupervised manner to learn compact latent representations of nominal land-surface conditions and identify fire-affected areas by quantifying changes between temporally paired latent embeddings. Performance is evaluated across five case studies in Sudan and compared against a cosine-distance baseline computed between temporally paired image tiles using precision, recall, F1-score, and the area under the precision–recall curve (AUPRC). Results show that the proposed approach consistently outperforms the baseline, achieving higher recall and F1-scores while maintaining strong precision in highly imbalanced fire-detection scenarios. Experiments with 8-band imagery and temporal image sequences yield only marginal performance gains over single 4-band inputs, underscoring the effectiveness of the proposed lightweight approach for scalable, near–real-time conflict monitoring.
∗Corresponding author
Email addresses: katwal@gmu.edu (Kuldip Singh Atwal), dpfoser@gmu.edu (Dieter Pfoser), drothbar@gmu.edu (Daniel Rothbart)
Keywords: Conflict-related fire monitoring, Unsupervised deep learning, Variational autoencoder (VAE), Latent-space change detection, High-resolution satellite imagery, Near–real-time monitoring, Fire damage detection, Conflict monitoring
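As a concrete illustration of the latent-space change detection summarized in the abstract, the sketch below encodes temporally paired 4-band tiles with a small convolutional VAE encoder and scores change as the cosine distance between their latent means. The architecture, tile size, latent dimension, and the use of PyTorch are illustrative assumptions rather than the authors' exact model, and the unsupervised training loop (reconstruction plus KL loss) is omitted.

```python
# Minimal sketch (not the authors' exact architecture) of latent-space change
# detection: encode temporally paired 4-band tiles with a small convolutional
# VAE encoder and score change as cosine distance between the latent means.
# Layer sizes, the 32x32-pixel tile size, and the latent dimension are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyVAEEncoder(nn.Module):
    """Convolutional encoder producing the mean and log-variance of q(z|x)."""

    def __init__(self, in_bands: int = 4, latent_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_bands, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),        # 16 -> 8
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),        # 8 -> 4
        )
        self.fc_mu = nn.Linear(64 * 4 * 4, latent_dim)
        self.fc_logvar = nn.Linear(64 * 4 * 4, latent_dim)

    def forward(self, x: torch.Tensor):
        h = self.conv(x).flatten(1)
        return self.fc_mu(h), self.fc_logvar(h)


@torch.no_grad()
def latent_change_score(encoder: nn.Module,
                        pre: torch.Tensor,
                        post: torch.Tensor) -> torch.Tensor:
    """Cosine distance between latent means of pre/post tiles (higher = more change)."""
    mu_pre, _ = encoder(pre)
    mu_post, _ = encoder(post)
    return 1.0 - F.cosine_similarity(mu_pre, mu_post, dim=1)


if __name__ == "__main__":
    encoder = TinyVAEEncoder(in_bands=4, latent_dim=32).eval()
    # Two batches of hypothetical 4-band, 32x32-pixel tiles (pre- and post-event).
    pre = torch.rand(8, 4, 32, 32)
    post = torch.rand(8, 4, 32, 32)
    scores = latent_change_score(encoder, pre, post)
    print(scores.shape)  # torch.Size([8]) -- one change score per tile pair
```

In a full pipeline, the encoder would first be trained on imagery of nominal land-surface conditions, so that large latent distances between temporally paired tiles flag candidate fire-affected areas.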
1. Introduction
The armed conflict in Sudan, which began in April 2023, has resulted in widespread civilian harm, large-scale displacement, and severe destruction of infrastructure. Fighting between the Sudanese Armed Forces (SAF) and the Rapid Support Forces (RSF) has killed thousands of people and displaced over 12 million individuals (Birch et al., 2024). The conflict was initially concentrated in Khartoum state before expanding to western and southern regions, particularly Darfur, where attacks on civilians, healthcare facilities, and essential infrastructure have been extensively documented (Milton et al., 2025; Dahab et al., 2025). Eljack et al. (2023) and Alrawa et al. (2023) highlight the impact of the conflict on health.
The Sudan Conflict Observatory (SCO) reported the massacre of more than a thousand civilians in El-Geneina by June 2023 (Rothbart et al., 2025) and documented the destruction of healthcare facilities (Abubakr et al., 2024). Figure 1 shows a map of attacks in Sudan. The timeline indicates a rising number of attacks since April 2023, with a significant increase in the regions of Khartoum, El Fasher, and El Geneina.
1.1. Problem Statement and Objectives
Monitoring conflict-related fires in active war zones presents several challenges. Ground-based reporting is often delayed, incomplete, or impossible due to insecurity and access constraints (Sticher et al., 2023). As a result, satellite imagery has become a critical source of independent evidence for tracking attacks, infrastructure destruction, and potential violations of international humanitarian law (Hassan and Ahmed, 2025). Among observable conflict evidence, active fires and burn scars are particularly informative, as they frequently accompany airstrikes, shelling, looting, and the destruction of civilian structures.
Figure 1: Map of attacks in Sudan, April–June 2023, highlighting the key conflict hotspots.
Recent advances in high-resolution, high-frequency satellite imagery, combined with deep learning techniques, make near–real-time conflict monitoring feasible. Commercial satellite constellations such as Planet Labs provide near-daily revisit capabilities at meter-scale resolution, making them particularly suitable for detecting small-scale fires in dense environments. Similarly, deep learning methods provide a way to detect anomalous events without relying on ground-truth labels, which
are limited in active conflict zones.
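To make this evaluation setup concrete, the sketch below implements the kind of cosine-distance baseline between temporally paired image tiles mentioned in the abstract and scores it with precision, recall, F1, and AUPRC on synthetic data. The flattened pixel-vector formulation, the synthetic tiles, the detection threshold, and the scikit-learn calls are illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal sketch of a cosine-distance change baseline over temporally paired
# tiles, evaluated with AUPRC and precision/recall/F1 on synthetic, highly
# imbalanced labels. Shapes, threshold, and data are illustrative assumptions.
import numpy as np
from sklearn.metrics import average_precision_score, precision_recall_fscore_support


def cosine_distance_baseline(pre: np.ndarray, post: np.ndarray) -> np.ndarray:
    """Per-tile cosine distance between flattened pre/post tiles.

    pre, post: arrays of shape (n_tiles, bands, height, width).
    Returns an array of shape (n_tiles,); higher values mean more change.
    """
    a = pre.reshape(len(pre), -1).astype(np.float64)
    b = post.reshape(len(post), -1).astype(np.float64)
    num = (a * b).sum(axis=1)
    denom = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-12
    return 1.0 - num / denom


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pre = rng.random((100, 4, 32, 32))   # 100 hypothetical 4-band tiles
    post = pre.copy()
    labels = np.zeros(100, dtype=int)
    labels[:5] = 1                       # highly imbalanced: 5% "fire" tiles
    post[:5] += 0.5                      # simulate spectral change in fire tiles
    scores = cosine_distance_baseline(pre, post)
    auprc = average_precision_score(labels, scores)
    preds = (scores > np.quantile(scores, 0.95)).astype(int)
    p, r, f1, _ = precision_recall_fscore_support(labels, preds,
                                                  average="binary", zero_division=0)
    print(f"AUPRC={auprc:.3f}  precision={p:.3f}  recall={r:.3f}  F1={f1:.3f}")
```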