Research & Innovation

Grappling with Gridlock: A New System

Stevens teams with the University of Florida to detect traffic bottlenecks and choke points more quickly during emergencies

As Hurricane Rita approached the Texas-Louisiana coast during a heat wave in September 2005, 2 to 3 million residents evacuated — creating a 100-mile-long highway gridlock that lasted two days. More than 100 deaths were later attributed to the bottlenecks.

During the hours before 2012’s Hurricane Sandy, traffic delays leaving Manhattan stretched to four hours. Key bridges and tunnels in and out of the city closed during the storm, and some flooded or remained closed for more than a week afterward, further disrupting travel.

Our systems for predicting and smoothing traffic chaos haven’t improved much since.

Now Stevens professor Mohammad Ilbeigi, working with recent doctoral graduate Mina Nouri Ph.D. ’24 and two University of Florida researchers, thinks he has found a better way.

“Unexpected gridlocks adversely affect evacuations and rescue and recovery operations, and can cost lives,” says Ilbeigi, whose work on the challenge is sponsored by a National Science Foundation grant.

“Rapid detection and warning of these sorts of disruptions is vital for successful emergency management.”

Sensing peak traffic, using Sandy as data

Existing traffic-detection technology can spot trouble in near real time, but it is far less effective at identifying, let alone predicting, catastrophic bottlenecks like those that build up or erupt suddenly during an emergency such as a major storm.

“Current methods are unable to simultaneously detect extreme traffic events and pinpoint specific roads or zones experiencing anomalous traffic patterns,” explains Nouri. “They do not fully leverage the interdependencies of traffic networks.”

The new system in development will use a technique known as a self-expressive network monitoring method to enhance its detection power. That technique bundles segments of roadway — and there are nearly 9,000 segments in Manhattan alone — into chunks or “zones,” analyzing both real-time traffic and changing conditions in adjacent zones.

This approach, the team hopes, will give the system greater accuracy.
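As a rough sketch of the self-expressive idea (not the team’s actual formulation), the snippet below models each zone’s travel-time series as a sparse combination of the other zones’ series; the random data, number of time steps and Lasso penalty are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_zones, n_steps = 46, 500              # 46 zones, as in the study; step count assumed
X = rng.random((n_zones, n_steps))      # stand-in travel-time series, one row per zone

coeffs = np.zeros((n_zones, n_zones))
for z in range(n_zones):
    others = np.delete(np.arange(n_zones), z)
    # Express zone z's series using every other zone's series; the L1
    # penalty keeps only the most informative zones (sparsity).
    model = Lasso(alpha=0.1, fit_intercept=False).fit(X[others].T, X[z])
    coeffs[z, others] = model.coef_

# How well each zone is reconstructed from the rest becomes the baseline
# that anomaly monitoring compares against later.
residuals = X - coeffs @ X
```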

“You can’t analyze every road segment individually, because that will cause false alarms,” explains Ilbeigi. “Any statistical method used to detect abnormal shifts in traffic patterns is subject to some degree of error.

“This is not a concern when monitoring a few roads. However, when dealing with a road network consisting of thousands of road segments, this approach can result in an excessive number of false alarms.”

In addition, analyzing road segments individually prevents the detection of critical bottlenecks that may disrupt significant portions of the network.

“One disabled car temporarily blocking a lane in a short segment of road might give you a signal that there’s a traffic emergency,” Ilbeigi continues, “but that isn’t really representative of the immediate area or certainly the region as a whole.”

A network-based approach, which models the interdependencies among road segments by considering whether adjacent zones are flowing smoothly or degrading, gives a better grasp of the problem.

The team developed computational models using a well-established algorithm known as ADMM (the alternating direction method of multipliers), which handles complex problems efficiently by splitting them into smaller subproblems, solving those in parallel and then assembling the results into predictions.

“These calculations express the traffic of each zone using changing traffic information from other zones,” notes Ilbeigi.
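To give a flavor of how ADMM slices a problem apart, here is a minimal NumPy sketch that solves an L1-penalized least-squares fit of the kind a self-expressive model might use to learn one zone’s coefficients; the lasso formulation and parameter values are illustrative assumptions, not the paper’s exact model.

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise shrinkage used in the z-update."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via ADMM.

    ADMM splits the problem into two easy pieces, solved alternately:
    a smooth least-squares x-update and a closed-form sparse z-update,
    with the dual variable u stitching the two copies back together.
    """
    n = A.shape[1]
    x = z = u = np.zeros(n)
    # Factor the x-update system once and reuse it every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)   # sparsity-inducing step
        u = u + x - z                          # dual update (consensus)
    return z
```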

To quickly detect abnormal traffic patterns, the team employed a novel approach based on monitoring the accuracy of a predictive model. Under normal traffic conditions, the model produces accurate predictions. A sudden drop in prediction accuracy, however, signals unusual traffic patterns, allowing the system to detect gridlock hotspots in real time.
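A minimal sketch of that residual-monitoring idea follows; the calibration window and the mean-plus-three-standard-deviations threshold are assumed choices, not the paper’s actual criterion.

```python
import numpy as np

def flag_anomalies(actual, predicted, calib_steps=168, k=3.0):
    """Flag timesteps where prediction error jumps above its normal range.

    The calibration window (one assumed week of hourly data) and the
    mean + k*std threshold are illustrative, not the paper's rule.
    """
    err = np.abs(np.asarray(actual) - np.asarray(predicted))
    base = err[:calib_steps]                   # errors under normal traffic
    threshold = base.mean() + k * base.std()
    return err > threshold                     # True where accuracy collapses
```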

To test the new model, the team turned to Stevens’ own backyard: traffic data from 700 million New York City taxi rides taken between 2010 and 2013, which gave the model a good idea of typical travel times.

They focused their statistical analysis on rides completed during the three weeks before, during and after Hurricane Sandy, dividing Manhattan into 46 equally sized zones to organize and analyze that data.
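A hypothetical pandas sketch of that kind of aggregation appears below; the file name, column names and hourly resolution are assumptions, not the study’s actual schema.

```python
import pandas as pd

# Hypothetical extract of the taxi records; columns (pickup_datetime,
# zone, trip_minutes) are assumed for illustration.
trips = pd.read_csv("taxi_trips.csv", parse_dates=["pickup_datetime"])

# One value per (zone, hour): the typical travel time the model learns from.
hourly = (
    trips.assign(hour=trips["pickup_datetime"].dt.floor("h"))
         .groupby(["zone", "hour"])["trip_minutes"]
         .mean()
         .unstack("hour")    # zones x hours matrix fed to the network model
)
```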

After removing the effect of seasonal variations from the data, their analysis uncovered critical bottlenecks during Sandy and its aftermath, including in Hell’s Kitchen — near the Lincoln Tunnel’s eastern entrance — and Midtown East, adjacent to the Queens-Midtown Tunnel’s western opening.
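One common way to strip weekly seasonality, offered here only as an assumed stand-in for the paper’s adjustment step, is to subtract the typical level for each weekday-hour slot, computed from normal weeks:

```python
import pandas as pd

def remove_weekly_seasonality(series: pd.Series, baseline: pd.Series) -> pd.Series:
    """Subtract the typical weekday-hour level from a zone's travel times.

    Both Series are indexed by timestamp; `baseline` covers normal weeks
    before the storm. A simple weekly profile is an assumed stand-in for
    the study's seasonal-adjustment step.
    """
    profile = baseline.groupby(
        [baseline.index.dayofweek, baseline.index.hour]).median()
    expected = pd.Series(
        [profile[(ts.dayofweek, ts.hour)] for ts in series.index],
        index=series.index)
    return series - expected
```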

The system revealed the slowdowns occurred in two spikes: one during the day before the main push of the storm struck the metro area, and another during the second and third days afterward.

Another interesting finding: streets on the Upper West Side of the city experienced little traffic disruption — possibly indicating residents did not evacuate in large numbers, or that street networks are already sufficiently robust to carry high volumes of traffic out of the city efficiently during true emergencies.

“Having a system like this, providing disruption information in real time, could have helped manage traffic more effectively during Sandy,” concludes Nouri, “and understanding those patterns after the event can also assist city officials in planning for future disruptive events.”

The research was reported in IEEE Transactions on Intelligent Transportation Systems [Vol. 25, No. 10] in October.