Benchmarking Tesla's Traffic Light and Stop Sign Control: Field Dataset and Behavior Insights
Understanding how Advanced Driver-Assistance Systems (ADAS) interact with Traffic Control Devices (TCDs) is critical for assessing their influence on traffic operations, yet this interaction has received little focused empirical study. This paper presents a field dataset and behavioral analysis of Tesla’s Traffic Light and Stop Sign Control (TLSSC), a mature ADAS that perceives traffic lights and stop signs. We design and execute experiments across varied speed limits and TCD types, collecting synchronized high-resolution vehicle trajectory data and driver-perspective video. From these data, we develop a taxonomy of TLSSC-TCD interaction behaviors (i.e., stopping, accelerating, and car following) and calibrate the Full Velocity Difference Model (FVDM) to quantitatively characterize each behavior mode. A novel empirical insight is the identification of a car-following engagement threshold of roughly 90 m. Calibration results reveal that stopping behavior is driven by strong responsiveness to both desired-speed deviation and relative speed, whereas accelerating behavior is more conservative. Intersection car-following behavior exhibits smoother dynamics and tighter headways than standard car-following behavior. The established dataset, behavior definitions, and model characterizations together provide a foundation for future simulation, safety evaluation, and design of ADAS-TCD interaction logic. Our dataset is available on GitHub.
💡 Research Summary
This paper addresses a notable gap in the literature: the empirical study of how advanced driver‑assistance systems (ADAS) interact with traffic control devices (TCDs) such as traffic lights and stop signs. While many public datasets capture autonomous‑driving (ADS) behavior at intersections, none provide detailed trajectories of ADAS‑equipped vehicles that actually perceive and react to TCDs. To fill this void, the authors develop a dedicated field dataset focused on Tesla’s Traffic Light and Stop Sign Control (TLSSC) feature, which combines forward‑facing cameras and onboard sensors to detect traffic lights and stop signs and to execute longitudinal control (deceleration, stopping, and acceleration) under driver supervision.
Data collection and processing
Experiments were conducted at twelve intersections with varying speed limits (30–50 km/h) in Wisconsin and Georgia. A Tesla Model 3/Y equipped with TLSSC recorded high‑frequency GPS (10 Hz) and forward‑facing video (30 fps). The authors synchronized the two streams, applied Kalman‑filter smoothing, performed quality checks, and manually annotated key events: traffic‑signal changes, stop‑line displays on the touchscreen, and instances of driver “permission” (the driver’s manual override to allow or deny movement). The resulting dataset, publicly released on GitHub, contains over 200 km of vehicle trajectories, each labeled with behavior segments and video clips.
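The Kalman-filter smoothing step can be illustrated with a minimal constant-velocity filter applied to one coordinate of the 10 Hz GPS track. This is a sketch, not the authors' exact pipeline: the 1-D treatment and the noise parameters (`q`, `r`) are assumptions.

```python
import numpy as np

def kalman_smooth_1d(z, dt=0.1, q=0.5, r=4.0):
    """Smooth a 1-D position track (e.g., GPS easting at 10 Hz) with a
    constant-velocity Kalman filter.
    q: process-noise intensity, r: measurement-noise variance (assumed)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # discrete white-noise accel model
    R = np.array([[r]])
    x = np.array([z[0], 0.0])               # initial state: first fix, zero speed
    P = np.eye(2) * 10.0                    # broad initial uncertainty
    out = []
    for zk in z:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new GPS fix
        y = zk - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```

In practice one would run such a filter (or a smoother) on easting and northing separately, or use a full 2-D state; the point here is only the predict/update structure of the smoothing step.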
Behavior taxonomy
Three primary TLSSC behaviors are defined:
- Stopping – includes stopping before a red/yellow light, stopping before a green light when no lead vehicle is present (waiting for driver permission), and stopping before a stop sign.
- Accelerating – includes accelerating after driver permission at a green light (both before and after a brief stop) and accelerating after permission at a stop sign.
- Car‑following – split into standard car‑following (away from intersections) and car‑following while proceeding straight through an intersection under a green light.
Human driver intervention is explicitly modeled because TLSSC is an ADAS feature that requires driver consent before proceeding through a green light without a lead vehicle.
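The taxonomy above can be sketched as a simple per-sample labeling rule. The ~90 m car-following threshold comes from the paper; all other thresholds, the field names, and the `Sample` structure are illustrative assumptions, not the authors' segmentation procedure.

```python
import math
from dataclasses import dataclass

CF_THRESHOLD_M = 90.0   # car-following engagement range reported in the paper (~90 m)
STOP_SPEED_MS = 0.5     # below this, treat the vehicle as stopped (assumed)

@dataclass
class Sample:
    speed: float         # m/s
    accel: float         # m/s^2
    gap_to_lead: float   # m; math.inf if no lead vehicle
    near_tcd: bool       # within the approach zone of a traffic light / stop sign

def classify(s: Sample) -> str:
    """Illustrative mapping of one trajectory sample to the paper's
    behavior modes; threshold values are assumptions."""
    if s.near_tcd and (s.accel < -0.2 or s.speed < STOP_SPEED_MS):
        return "stopping"
    if s.near_tcd and s.accel > 0.2 and s.speed < 10.0:
        return "accelerating"
    if s.gap_to_lead <= CF_THRESHOLD_M:
        # the paper splits car-following by whether it occurs at an intersection
        return "intersection car-following" if s.near_tcd else "car-following"
    return "free-driving"
```

A rule of this shape labels each sample independently; the paper's behavior segments would then correspond to maximal runs of identically labeled samples.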
Modeling with the Full Velocity Difference Model (FVDM)
Each behavior mode is calibrated using the Full Velocity Difference Model:

$$a_n(t) = \kappa \left[ V\big(\Delta x_n(t)\big) - v_n(t) \right] + \lambda \, \Delta v_n(t)$$

where $v_n(t)$ is the subject vehicle's speed, $\Delta x_n(t)$ is the spacing to the leader (or stop line), $\Delta v_n(t)$ is the relative speed, $V(\cdot)$ is the optimal-velocity function, and $\kappa$ and $\lambda$ are sensitivity parameters governing responsiveness to the desired-speed deviation and to the relative speed, respectively.
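As a sanity check on the FVDM's behavior, the model can be simulated for a follower approaching a stopped leader (e.g., a queue at a red light). The optimal-velocity function below uses a standard tanh form, and all parameter values are illustrative literature-style constants, not the paper's calibrated estimates.

```python
import math

def V_opt(gap, v1=6.75, v2=7.91, c1=0.13, c2=1.57):
    """Tanh-form optimal-velocity function; constants are assumptions."""
    return v1 + v2 * math.tanh(c1 * gap - c2)

def fvdm_step(gap, v, dv, kappa=0.4, lam=0.5, dt=0.1):
    """One Euler step of the FVDM: a = kappa*(V(gap) - v) + lam*dv,
    where dv = v_lead - v. Speed is clamped to be non-negative."""
    a = kappa * (V_opt(gap) - v) + lam * dv
    v_new = max(0.0, v + a * dt)
    return v_new, a

# Follower at 12 m/s, 60 m behind a stopped leader; simulate 60 s at 10 Hz.
gap, v, v_lead = 60.0, 12.0, 0.0
for _ in range(600):
    v, _ = fvdm_step(gap, v, v_lead - v)
    gap -= v * 0.1
```

With these parameters the follower brakes, approaches the leader without collision, and settles to near-zero speed at a short standstill gap, which is the qualitative stopping behavior the paper calibrates the model against.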