Measuring Human Movement Patterns and Behaviors in Public Spaces

Publication: Conference contribution › Poster › Research › Peer-reviewed


To assess human movement patterns and behaviors in public spaces, we present a method that uses thermal cameras and Computer Vision (CV) technology, combined with the analytical virtues of Geographical Information Systems (GIS), to track people in urban streets and plazas. The method records georeferenced positions of individuals in a scene 30 times per second with a spatial accuracy of about 25-50 cm. This allows behavior and attendance to be analyzed at a finer scale than other established methods for pedestrian behavior monitoring.

Thermal cameras have the advantage over conventional cameras that they operate independently of lighting conditions, and in many situations they work better with Computer Vision software because segmentation of moving objects is easier in thermal video. At the same time, privacy concerns when tracking people are alleviated, since the identity of individuals cannot be revealed in thermal images; the technique thus ensures privacy by design. Furthermore, the prices of thermal cameras continue to fall while their resolution keeps improving. This adds to the practical applicability of such sensors for pedestrian behavioral studies.

Our method builds on previous work and extends the analysis to the GIS domain by capturing georeferenced tracks. This allows the tracks to be analyzed in relation to other spatio-temporally referenced data. Environmental variables that might influence movement patterns in urban landscapes, such as sunny or shaded areas, wind speed, humidity, and rain, can be brought in, as well as a 3D model of the scene or socio-economic and statistical data for the neighborhood in which the tracking takes place.
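To illustrate how pixel positions from a fixed camera can be turned into georeferenced coordinates, the sketch below fits a plane-to-plane homography from a few surveyed ground control points and projects tracked image coordinates into map coordinates. The function names and the direct linear transform (DLT) formulation are our own illustration under the assumption of a planar ground surface, not the specific implementation used in the study:

```python
import numpy as np

def fit_homography(img_pts, map_pts):
    """Estimate the 3x3 homography mapping image pixels to map
    coordinates from >= 4 ground-control-point correspondences (DLT)."""
    A = []
    for (x, y), (X, Y) in zip(img_pts, map_pts):
        # Each correspondence contributes two linear constraints on H.
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    # The homography is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def to_map(H, pts):
    """Project pixel coordinates to georeferenced map coordinates."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    proj = homog @ H.T
    return proj[:, :2] / proj[:, 2:3]  # divide out the scale factor
```

With the homography fitted once per camera setup, every detected head point in every frame can be projected into the map coordinate system before being stored as a track in GIS.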

In 2013 we conducted a pilot study in Copenhagen in a pedestrian zone with a continuous flow of pedestrians from several directions who needed to negotiate and avoid each other. A single state-of-the-art uncooled thermal camera (Axis Q1922) with a resolution of 640x480 pixels, a lens with a focal length of 10 mm, a viewing angle of 57°, and a frame rate of 30 fps was used. Background subtraction was applied to detect people.
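Background subtraction can be implemented in many ways; a minimal sketch, assuming a per-pixel temporal median as the background model and simple intensity thresholding (the software actually used in the pilot study may differ):

```python
import numpy as np

def median_background(frames):
    """Background model: per-pixel temporal median over a frame buffer.
    In thermal video the static scene is cool and stable, so the median
    of a short buffer approximates the empty background well."""
    return np.median(np.stack(frames), axis=0)

def foreground_mask(frame, background, threshold=20.0):
    """Flag pixels deviating from the background by more than
    `threshold` intensity units as moving (warm) foreground."""
    return np.abs(frame.astype(float) - background) > threshold
```

Warm pedestrians stand out sharply against the cooler pavement, which is why this kind of segmentation is more robust in thermal video than in ordinary footage.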

To assess the quality of the trajectories generated by the CV software, a sample of Ground Truth (GT) trajectories was digitized manually for all individuals simultaneously present in the scene in parts of the recorded video. The manual digitization was done in the T-Analyst software developed at Lund University. Tracks of people walking alone or in social groups of different sizes were recorded, as well as people waiting, people having a conversation, and people dragging their bikes or pushing prams or wheelchairs. Also recorded in the scene were the tracks of 'facers' working for a charity organization, trying to stop passers-by in the street and persuade them to donate to the cause.

Our method enables the tracks of individuals in these different situations to be extracted in GIS for further analysis of the detailed movement behaviors in their specific contexts. Further research will develop advanced GIS methods for extracting behavioral parameters for different classes of tracks, which can be used to calibrate models of pedestrian movement.
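As a hypothetical example of such a behavioral parameter, walking speed can be read directly off a georeferenced track sampled at the camera frame rate; the function below is our own illustration, not part of the described toolchain:

```python
import numpy as np

def track_speeds(points, fps=30.0):
    """Instantaneous speeds (m/s) along a georeferenced track.

    `points` are (x, y) positions in metres, sampled `fps` times per
    second, matching the 30 Hz recording rate of the method."""
    pts = np.asarray(points, dtype=float)
    # Distance covered between consecutive samples...
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    # ...scaled by the sampling rate gives speed in metres per second.
    return steps * fps
```

Aggregates of such per-track parameters (mean speed, number of stops, dwell time) are the kind of quantities that could feed the calibration of pedestrian movement models.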

Our approach to tracking urban public life should be seen as a supplement to the traditional qualitative and intuitive manual approaches to data collection used in studies of urban public spaces and their qualities. We aim for this approach to contribute to the development of new digital methods in the field.
Translated title of the contribution: Måling af menneskers bevægelsesmønstre og adfærd i byrum
Publication date: 27 Aug 2014
Number of pages: 1
Status: Published - 27 Aug 2014
Event: Measuring Behavior 2014 - Hof van Wageningen, Wageningen, Netherlands
Duration: 27 Aug 2014 - 29 Aug 2014


Conference: Measuring Behavior 2014
Location: Hof van Wageningen

Bibliographic note

The poster was reused at IGN's internal PhD conference in November 2014.


ID: 128787504