At last check, the population of the world was around 7.1 billion and counting. As we all know, the sheer number of people on the planet presents a host of new challenges and exacerbates existing ones. The overarching population problem may seem daunting, but there’s still plenty we can do to make a crowded, urbanized world livable. A new study in PLOS ONE focuses on the specific issue of pedestrian traffic and how to accurately model the flow of people through their environment.
Researchers with Siemens and the Munich University of Applied Sciences examined video recordings of commuters walking through a major German train station on a weekday, during both the morning and evening peak commute times. They analyzed the videos to determine individual pedestrians’ paths and walking speeds, and used the resulting data to set the parameters for a simulation of pedestrian traffic flow. According to the authors, this kind of calibration of theoretical models against real-world data is largely missing from most pedestrian flow models, leaving them under-validated and imprecise.
The authors used a cellular automaton model as the basis of their simulation. Cellular automata are models in which cells in a grid change state in discrete steps according to specific rules. In this instance, the authors used a hexagonal grid and a few simple rules about pedestrian movement:
- Pedestrians know the shortest path to their destination and will follow it unless other pedestrians or obstacles are in the way.
- Pedestrians walk at their own individual preferred speeds, so long as the path is unobstructed.
- Individuals need personal space, which acts like a force repelling other pedestrians and objects.
- Walking speeds decrease as crowds get denser.
- Factors like age and fitness are captured implicitly by assigning each pedestrian a speed from a range of individual walking speeds.
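The rules above can be sketched as a toy cellular automaton. This is our own simplified illustration, not the authors' simulator: it uses a square grid rather than the paper's hexagonal one, and treats each pedestrian's preferred speed as the probability of moving on a given time step.

```python
import random

GRID_W, GRID_H = 20, 10
EXIT = (19, 5)  # a single destination cell (our illustrative setup)

def step_toward(pos, target, occupied):
    """Move one cell along the shortest path to target, if that cell is free."""
    x, y = pos
    tx, ty = target
    nx = x + (tx > x) - (tx < x)
    ny = y + (ty > y) - (ty < y)
    candidate = (nx, ny)
    if candidate not in occupied:   # personal space: one pedestrian per cell
        return candidate
    return pos                      # blocked: wait out this time step

def simulate(starts, speeds, max_steps=200):
    """Advance all pedestrians until everyone has reached the exit."""
    positions = dict(starts)                        # pedestrian id -> (x, y)
    for t in range(max_steps):
        for pid, pos in list(positions.items()):
            if pos == EXIT:
                del positions[pid]                  # pedestrian has left
                continue
            # individual preferred speed = chance of moving this step
            if random.random() < speeds[pid]:
                occupied = set(positions.values()) - {pos}
                positions[pid] = step_toward(pos, EXIT, occupied)
        if not positions:
            return t + 1                            # steps until grid empties
    return max_steps

random.seed(1)
starts = {i: (0, i) for i in range(8)}
# age/fitness captured as a range of individual speeds
speeds = {i: random.uniform(0.6, 1.0) for i in starts}
print(simulate(starts, speeds))
```

Denser crowds slow everyone down automatically here: the more occupied cells a pedestrian's path crosses, the more often they wait, so no explicit density rule is needed.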
This model also borrowed from electrostatics by treating people like electrons. As the authors write:
“Pedestrians are attracted by positive charges, such as exits, and repelled by negative charges, such as other pedestrians or obstacles.”
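As an illustration of this charge analogy, a walker can score each neighboring cell by an attraction toward the exit minus a repulsion from nearby pedestrians, then step to the best-scoring free cell. The particular potential functions below are our own choice for the sketch, not the ones from the paper.

```python
import math

def score(cell, exit_pos, repellers, strength=0.5):
    """Higher is better: pulled toward the exit, pushed away from repellers."""
    attraction = -math.dist(cell, exit_pos)          # closer to exit is better
    repulsion = sum(1.0 / max(math.dist(cell, r), 1e-9) for r in repellers)
    return attraction - strength * repulsion

def best_move(pos, exit_pos, repellers):
    """Pick the free neighboring cell with the highest score."""
    x, y = pos
    neighbors = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if (dx, dy) != (0, 0)]
    free = [n for n in neighbors if n not in repellers]
    return max(free, key=lambda n: score(n, exit_pos, repellers))

# A walker at (2, 2) heads for the exit at (9, 2) but sidesteps another
# pedestrian standing directly in the way at (3, 2).
print(best_move((2, 2), (9, 2), {(3, 2)}))
```

The repulsion term is what produces the sidestep: walking straight at the blocker would shorten the path to the exit, but the inverse-distance penalty makes a diagonal step around them score higher.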
Add to this rules about when and where pedestrians appear, the starting points and destinations, and the relative volume of traffic from each starting point to each destination, and you’ve got a basic model of pedestrian traffic.
Next, the authors calibrated this model by setting parameters using real-world, observational data from the train station videos: where people at each starting point were going, the distance kept from walls, the distribution of walking speeds, and so on. To test the model and its parameters, the authors validated it by running predictive simulations and comparing the results to real-world scenarios. Based on the results, the authors suggest that this kind of model, with parameters grounded in real-world observation, represents pedestrian flow more accurately than models that do not incorporate observational data.
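As a toy illustration of the calibration step, one might estimate a walking-speed distribution from speeds measured in video footage and then draw each simulated pedestrian's preferred speed from it. The observations below are invented numbers, not the paper's data.

```python
import random
import statistics

# Hypothetical walking speeds (m/s) measured from video recordings
observed_speeds = [1.21, 1.35, 1.18, 1.42, 1.30, 1.25, 1.38, 1.29]

# Calibrate a simple normal model of preferred walking speed
mu = statistics.mean(observed_speeds)
sigma = statistics.stdev(observed_speeds)
print(f"calibrated speed model: mean={mu:.2f} m/s, sd={sigma:.2f} m/s")

# Assign each simulated pedestrian a preferred speed drawn from the
# calibrated distribution, clamped to a physically plausible range
random.seed(0)
preferred = [min(max(random.gauss(mu, sigma), 0.5), 2.0) for _ in range(5)]
print(preferred)
```

The same pattern applies to the other observed quantities: the source-target traffic shares and wall distances become empirical distributions that the simulation samples from, rather than values guessed by the modeler.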
The authors also varied individual parameters to determine which had the largest impact on the simulation. The most influential parameter was the source-target distribution (the destinations of people coming from specific starting points), so the authors note that this is the one that is critical to measure accurately and precisely.
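The spirit of that sensitivity check can be sketched as a one-at-a-time parameter sweep: perturb each parameter, rerun the model, and compare how far the output shifts. The `model` function below is a made-up stand-in with an assumed (and deliberately lopsided) dependence on its two parameters, purely to demonstrate the procedure.

```python
def model(params):
    # Stand-in for a pedestrian simulation's output (e.g. a clearance-time
    # proxy). The dependence on each parameter is an assumption for this
    # sketch: strong on the source-target split, weak on wall distance.
    return 100 * params["source_target_split"] + 2 * params["wall_distance"]

baseline = {"source_target_split": 0.5, "wall_distance": 0.4}
base_out = model(baseline)

# Perturb one parameter at a time by +10% and record the output shift
for name in baseline:
    bumped = dict(baseline)
    bumped[name] *= 1.10
    delta = abs(model(bumped) - base_out)
    print(f"{name}: output shift {delta:.2f}")
```

Ranking the shifts identifies which parameters deserve the most careful measurement; in the paper's case that turned out to be the source-target distribution.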
The ability to precisely predict the flow of traffic has many clear applications, from the design of buildings and public spaces to the prediction and prevention of unsafe crowd densities during large events or emergencies.
Next research question: when it’s crowded, does pushing really not make it go faster?
Citation: Davidich M, Köster G (2013) Predicting Pedestrian Flow: A Methodology and a Proof of Concept Based on Real-Life Data. PLoS ONE 8(12): e83355. doi:10.1371/journal.pone.0083355
Images: All images come from the manuscript