Privacy of Groups in Dense Street Imagery

Reading time: 2 minutes
...

📝 Original Info

  • Title: Privacy of Groups in Dense Street Imagery
  • ArXiv ID: 2505.07085
  • Date: 2025-05-11
  • Authors: Not specified in the provided data; listed here as "Anonymous". The actual paper would include author names, affiliations, and contact details.

📝 Abstract

Spatially and temporally dense street imagery (DSI) datasets have grown unbounded. In 2024, individual companies possessed around 3 trillion unique images of public streets. DSI data streams are only set to grow as companies like Lyft and Waymo use DSI to train autonomous vehicle algorithms and analyze collisions. Academic researchers leverage DSI to explore novel approaches to urban analysis. Despite good-faith efforts by DSI providers to protect individual privacy through blurring faces and license plates, these measures fail to address broader privacy concerns. In this work, we find that increased data density and advancements in artificial intelligence enable harmful group membership inferences from supposedly anonymized data. We perform a penetration test to demonstrate how easily sensitive group affiliations can be inferred from obfuscated pedestrians in 25,232,608 dashcam images taken in New York City. We develop a typology of identifiable groups within DSI and analyze privacy implications through the lens of contextual integrity. Finally, we discuss actionable recommendations for researchers working with data from DSI providers.
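To make the risk concrete, the sketch below illustrates one way a group-membership inference of this kind could work in principle. It is not the paper's penetration-test pipeline: the detection table, the re-identification `track_id`, the venue coordinates, and all thresholds are hypothetical assumptions introduced purely for illustration. The point it captures is that even with faces blurred, repeated spatio-temporal co-occurrence with a sensitive venue (say, during its regular meeting hours) can reveal a likely affiliation.

```python
# Illustrative sketch only; NOT the authors' method. Assumes a hypothetical table of
# detections (track_id, timestamp, lat, lon) extracted from dense street imagery,
# where track_id links repeated appearances of the same (face-blurred) pedestrian.
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt
from collections import Counter

@dataclass
class Detection:
    track_id: str        # hypothetical re-identification key across images
    timestamp: datetime
    lat: float
    lon: float

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6_371_000
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def likely_members(detections, venue_lat, venue_lon, meeting_hours,
                   radius_m=50.0, min_days=3):
    """Flag track_ids seen near a sensitive venue during meeting hours on several days.

    The thresholds (radius_m, min_days) are illustrative assumptions, not values
    taken from the paper.
    """
    day_hits = Counter()
    for d in detections:
        near = haversine_m(d.lat, d.lon, venue_lat, venue_lon) <= radius_m
        if near and d.timestamp.hour in meeting_hours:
            # Key on (track, date) so one long dwell counts as a single visit.
            day_hits[(d.track_id, d.timestamp.date())] += 1
    days_per_track = Counter(track for (track, _day) in day_hits)
    return {track for track, n_days in days_per_track.items() if n_days >= min_days}
```

Counting distinct days rather than raw detections keeps a single long dwell from looking like repeated attendance; the density of DSI is what makes such repeated observations of the same person plausible in the first place.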


Reference

This content is AI-processed based on open access ArXiv data.
