1 of 21

M4 - Video Analysis - Team 2

Video Surveillance for Road Traffic Monitoring

Laura Pérez Mayos

María Cristina Bustos

Xián López Álvarez

Gonzalo Benito

2 of 21

Project Goal

The main objective is to develop a Video Surveillance System for Traffic Monitoring with the capability of:

  • Keeping track of passing vehicles.
  • Accounting for traffic volume on the road.
  • Estimating the speed of each vehicle.

3 of 21

Pipeline of the process

Video input → Video to sequence → Video Stabilization → Foreground Segmentation → Object Tracking → Speed estimation / Vehicle counting

Video Surveillance System for Traffic Monitoring Algorithm

4 of 21

Sequence analysis

Once the input video is recorded, the sequence is split into individual frames and stored separately (a minimal extraction sketch is shown below).

The system was developed and tested on three different video sequences: two toy sequences for testing and tuning, and one sequence of our own for final testing.

highway

traffic

Parc Nova Icaria
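
A minimal OpenCV sketch of this frame-extraction step; the file names and output layout are illustrative, not the project's actual ones:

```python
import os
import cv2

def video_to_frames(video_path, out_dir):
    """Split a video into individual frames stored as numbered images."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:          # end of the sequence
            break
        cv2.imwrite(os.path.join(out_dir, "frame_%05d.png" % idx), frame)
        idx += 1
    cap.release()
    return idx              # number of frames written

# video_to_frames("highway.avi", "frames/highway")   # illustrative paths
```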

5 of 21

Video stabilization

To be able to work with videos extracted from traffic cameras, we first have to remove the effect of camera motion from the video stream.

→ improve background estimation and foreground segmentation

→ improve object tracking

Two Techniques explored:

  • Optical Flow Block Matching
  • Target Tracking

Displacement between two consecutive frames in the traffic sequence used for testing. Source: MCV 2016 team 1

6 of 21

Video stabilization: Optical Flow Approach

To perform video stabilization, optical flow is estimated with a Block Matching approach: a block is taken from one image and the most similar block is searched for in the other image within a certain search area.

For the Traffic sequence, Block Matching is computed only over one part of the image, where there is an immovable object. The mean of the optical flow is then used to stabilize the video (a sketch follows below).

Figure: image 1 and the mean of the optical flow in image 2.
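
A minimal NumPy sketch of exhaustive block matching restricted to a region containing a static object, with the mean flow used to compensate the next frame; the helper name, block/search sizes, and ROI are illustrative assumptions:

```python
import cv2
import numpy as np

def block_matching_shift(prev, curr, block=16, search=8, roi=None):
    """Estimate the mean translation between two grayscale frames by
    exhaustive block matching, optionally inside a region of interest."""
    if roi is not None:
        x0, y0, x1, y1 = roi              # region with a static object
        prev, curr = prev[y0:y1, x0:x1], curr[y0:y1, x0:x1]
    h, w = prev.shape
    shifts = []
    for y in range(0, h - block, block):
        for x in range(0, w - block, block):
            ref = prev[y:y + block, x:x + block]
            best, best_dxdy = np.inf, (0, 0)
            # exhaustive search in a (2*search + 1)^2 neighbourhood
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    cand = curr[yy:yy + block, xx:xx + block]
                    err = np.sum((ref.astype(np.float32) - cand) ** 2)
                    if err < best:
                        best, best_dxdy = err, (dx, dy)
            shifts.append(best_dxdy)
    return np.mean(shifts, axis=0)         # mean optical flow (dx, dy)

# Compensate frame 2 by shifting it back with the mean displacement:
# dx, dy = block_matching_shift(frame1_gray, frame2_gray, roi=(0, 0, 100, 100))
# M = np.float32([[1, 0, -dx], [0, 1, -dy]])
# stabilized = cv2.warpAffine(frame2, M, (frame2.shape[1], frame2.shape[0]))
```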

7 of 21

Video stabilization: Target Tracking Approach

  1. Define the target to track (a square area in the lower left corner) and establish a dynamic search region, whose position is determined by the last known target location.
  2. Search for the target only within this search region.
  3. In each subsequent video frame, determine how much the target has moved relative to the previous frame.
  4. Use this information to remove unwanted translational camera motion and generate a stabilized video (frame alignment).

Note: we added black padding to all the frames to deal with the loss of information at the borders of the frames after stabilization, and we applied the same stabilization and padding to the ground truth.

Main idea: use a block-based parametric motion model to correct translational and rotational camera motions.

source: https://es.mathworks.com/help/vision/examples/video-stabilization.html
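
A simplified, translation-only sketch of this target-tracking stabilization using OpenCV template matching (the MathWorks example uses a block-based parametric model); the target coordinates, search margin, and file name are illustrative assumptions:

```python
import cv2
import numpy as np

tx, ty, tw, th = 10, 200, 40, 40       # target patch in frame 0 (illustrative)
margin = 20                            # search-region margin in pixels

cap = cv2.VideoCapture("traffic.avi")  # illustrative file name
ok, first = cap.read()
template = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)[ty:ty + th, tx:tx + tw]
x0, y0 = tx, ty                        # last known target location

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # search for the target only inside the dynamic search region
    sx, sy = max(0, x0 - margin), max(0, y0 - margin)
    region = gray[sy:sy + th + 2 * margin, sx:sx + tw + 2 * margin]
    res = cv2.matchTemplate(region, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(res)
    x1, y1 = sx + max_loc[0], sy + max_loc[1]
    dx, dy = x1 - tx, y1 - ty          # displacement w.r.t. the first frame
    # shift the frame back to align it with the first frame
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])
    stabilized = cv2.warpAffine(frame, M, (frame.shape[1], frame.shape[0]))
    x0, y0 = x1, y1                    # update last known target location
```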

8 of 21

Background Estimation

Gaussian Model: each pixel is modelled by a Gaussian distribution (mean μ, standard deviation σ) and classified as background or foreground:

  if |I_t(x,y) − μ(x,y)| ≤ α·σ(x,y)  →  background
  else  →  foreground

Adaptive Gaussian: train the background model with an initial part of the video, and then update the Gaussian recursively, only for pixels classified as background:

  μ(x,y) ← ρ·I_t(x,y) + (1 − ρ)·μ(x,y)
  σ²(x,y) ← ρ·(I_t(x,y) − μ(x,y))² + (1 − ρ)·σ²(x,y)

μ and σ matrices for highway sequence

Source: M4 Video Analysis Lecture 2

Source: MCV 2016 team 6
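
A minimal NumPy sketch of this per-pixel model, assuming a grayscale frame stack of shape (N, H, W); the threshold α and learning rate ρ are illustrative values, not the parameters tuned in the project:

```python
import numpy as np

def adaptive_gaussian_bg(frames, n_train, alpha=2.5, rho=0.05):
    """Per-pixel Gaussian background model: train mu/sigma on the first
    n_train frames, then classify and adaptively update the background."""
    train = frames[:n_train].astype(np.float32)
    mu = train.mean(axis=0)
    sigma = train.std(axis=0) + 1e-6          # avoid zero deviations
    masks = []
    for f in frames[n_train:].astype(np.float32):
        fg = np.abs(f - mu) > alpha * sigma   # foreground mask
        bg = ~fg
        # recursive update, only on pixels classified as background
        mu[bg] = rho * f[bg] + (1 - rho) * mu[bg]
        sigma[bg] = np.sqrt(rho * (f[bg] - mu[bg]) ** 2
                            + (1 - rho) * sigma[bg] ** 2)
        masks.append(fg)
    return masks
```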

9 of 21

Background Estimation & Foreground Segmentation

Stauffer & Grimson: Based on Gaussian Mixture Model (GMM).

  • Both models (background and foreground) are mixed into a common model → statistical model of the pixel based on multiple Gaussians: Gaussian Mixture Model (GMM).
  • Incoming pixels are either classified into one of the existing Gaussians, or a new Gaussian is created for them.

Source: M4 Video Analysis Lecture 2
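
As an illustration, OpenCV ships a GMM-based background subtractor (MOG2, a descendant of the Stauffer & Grimson model) that follows the same idea; the file name and parameter values below are illustrative assumptions:

```python
import cv2

# GMM-based background subtraction; history/varThreshold are illustrative
subtractor = cv2.createBackgroundSubtractorMOG2(history=200,
                                                varThreshold=16,
                                                detectShadows=True)

cap = cv2.VideoCapture("highway.avi")   # illustrative file name
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)   # 255 = foreground, 127 = shadow
    # drop the shadow label so only confident foreground remains
    _, fg_mask = cv2.threshold(fg_mask, 200, 255, cv2.THRESH_BINARY)
```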

10 of 21

Background Estimation Improvements

In order to improve the masks generated by Stauffer & Grimson and make the tracking easier, we apply a series of morphological filters (a sketch of the pipeline follows below):

Bg mask → Closing (dilation + erosion) → Opening (erosion + dilation) → Fill holes → bwareaopen (remove small objects)

Morphological Operator Pipeline
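
A minimal OpenCV/SciPy sketch of the same pipeline; the kernel size and minimum blob area are illustrative values, and MATLAB's bwareaopen is emulated with connected components:

```python
import cv2
import numpy as np
from scipy import ndimage

def clean_mask(fg_mask, min_area=100):
    """Morphological clean-up of a binary foreground mask."""
    mask = (fg_mask > 0).astype(np.uint8)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    # closing: dilation followed by erosion (joins nearby blobs)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # opening: erosion followed by dilation (removes thin noise)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # fill interior holes of the remaining blobs
    mask = ndimage.binary_fill_holes(mask).astype(np.uint8)
    # bwareaopen equivalent: drop small connected components
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            mask[labels == i] = 0
    return mask * 255
```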

11 of 21

Tracking

In the tracking framework, we have two kinds of variables:

  • x_t: the real state of the object at a certain time step t.
  • z_t: a measurement of this state.

The idea is, first, to predict the current state of the object based only on the previous measurements:

  P(x_t | z_1, …, z_{t−1})

Then, we make an update once we have the new measurement:

  P(x_t | z_1, …, z_t)

Image from 4th lesson of M4, MCV. Professor Ramon Morros.

12 of 21

Tracking: Kalman filter

Image from 4th lesson of M4, MCV. Professor Ramon Morros.

Optimal method in the case of a Linear Dynamics Model (LDM) with Gaussian noise.

LDM equations:

  Dynamics model:     x_t = A·x_{t−1} + w_t,   w_t ~ N(0, Q)
  Measurement model:  z_t = H·x_t + v_t,       v_t ~ N(0, R)
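
A minimal NumPy sketch of a constant-velocity Kalman filter for a bounding-box centroid under these LDM equations; the state layout and noise covariances are illustrative, not the project's tuned values:

```python
import numpy as np

# State x = [px, py, vx, vy]; measurement z = [px, py].
dt = 1.0
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)      # dynamics model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)      # measurement model
Q = np.eye(4) * 1e-2                           # process noise (illustrative)
R = np.eye(2) * 1.0                            # measurement noise (illustrative)

x = np.zeros(4)          # initial state
P = np.eye(4) * 10.0     # initial state covariance

def kalman_step(x, P, z):
    """One predict + update cycle given a new centroid measurement z."""
    # prediction based only on the previous state
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # update once the new measurement is available
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# x, P = kalman_step(x, P, np.array([cx, cy]))  # cx, cy from the fg mask
```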

13 of 21

Tracking: Other Techniques

Further tracking techniques were explored (a mean-shift sketch follows below):

  • Particle Filter: this approach works quite well.
  • Mean Shift: this approach is still in need of further tuning.
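
A minimal OpenCV mean-shift sketch of the kind of tracker referred to above; the initial window, histogram bins, and file name are illustrative assumptions:

```python
import cv2

x, y, w, h = 150, 120, 60, 40                 # initial bounding box (illustrative)
cap = cv2.VideoCapture("highway.avi")         # illustrative file name
ok, frame = cap.read()

# hue histogram of the target region as its appearance model
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [16], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
window = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # shift the window towards the mode of the back-projected histogram
    _, window = cv2.meanShift(back_proj, window, term)
```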

14 of 21

Vehicle Counting

When counting vehicles, we have to discard some tracks, since they do not correspond to real vehicles.

Putting a threshold on the lifespan of tracks solved this problem in our case.

  • Results in our video:
    • 22 vehicles detected in 27 seconds.
    • Rate of 48.89 vehicles / minute.
    • 1 false positive (real rate is slightly lower).

Source: difoosion.com
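
A minimal sketch of this lifespan-based filtering and of the rate computation; the track structure and the 15-frame threshold are illustrative assumptions:

```python
MIN_LIFESPAN = 15   # frames a track must survive to count as a vehicle
FPS = 30.0

def count_vehicles(tracks, n_frames):
    """tracks: list of dicts, each with a 'frames' list of frame indices."""
    vehicles = [t for t in tracks if len(t["frames"]) >= MIN_LIFESPAN]
    duration_min = n_frames / FPS / 60.0
    rate = len(vehicles) / duration_min      # vehicles per minute
    return len(vehicles), rate
```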

15 of 21

Speed control

We measured the distance between two consecutive street lamps (24 m).

Since we have a frontal view of the road, it was enough to draw two horizontal lines crossing the lamps.

We record the frame at which the bottom left corner of a track passes each mark.

Knowing that our camera records at a rate of 30 frames per second, the speed follows immediately (a sketch is given below).
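
A minimal sketch of the speed computation from the two frame indices; the function and variable names are illustrative:

```python
LAMP_DISTANCE_M = 24.0   # measured distance between the two street lamps
FPS = 30.0               # camera frame rate

def speed_kmh(frame_at_first_line, frame_at_second_line):
    elapsed_s = (frame_at_second_line - frame_at_first_line) / FPS
    return LAMP_DISTANCE_M / elapsed_s * 3.6   # m/s -> km/h

# Example: crossing both marks 34 frames apart gives
# 24 / (34 / 30) * 3.6 ≈ 76 km/h.
```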

16 of 21

Speed control: results on toy sequences

For the toy sequences we need to make some strong assumptions in order to proceed with speed estimation:

  • We assume there is no other distortion than the one due to perspective.
  • The border of the bounding box of a track reaching a marker line is considered as the vehicle itself reaching this marker.
  • Split lines on the road fulfill the distance separation specifications (4.5 m lines separated by 7.5 m).

17 of 21

Speed control: results on toy sequences

Highway

  Vehicle    Speed (km/h)
  1          50
  2          84
  3          76

Traffic

  Vehicle    Speed (km/h)
  1          -
  2          89
  3          89
  4          86
  5          81
  6          81

18 of 21

Speed control: results on our sequence

All the cars in our sequence were under the speed limit.

  • Maximum speed: 76 km/h.
  • Mean speed: 64 km/h.

19 of 21

Video Surveillance System for Road Traffic Monitoring

  • Video Stabilization
  • Foreground Segmentation
  • Vehicle Tracking
  • Vehicle Counter
  • Speed control
    • Green box: under speed limit
    • Red box: above speed limit

20 of 21

Conclusions

  • We developed, from scratch, a traffic monitoring system capable of tracking multiple vehicles and estimating their speed.

  • The implementation offers an acceptable alternative for speed estimation using only one camera and a set of reference points in the image.

  • The system has been shown to be robust on three different sequences.

  • Parameters in background estimation play an important role in robustness against illumination changes.

  • Shadow removal needs further tuning to provide a performance increase that justifies its application.

21 of 21

Thank you! Questions?

More information available at our web: goo.gl/D4ZJnw

Gonzalo Benito

gonzabenito@gmail.com

María Cristina Bustos

mcb9216@gmail.com

Xián López

lopezalvarez.xian@gmail.com

Laura Pérez

lpmayos@gmail.com