
DVSFLOW16: DVS Optical Flow Dataset

Tobi Delbruck and Bodo Rückauer, Jan 2016

This getting-started guide accompanies the DVSFLOW16 dataset, hosted via BitTorrent Sync, which supports the Frontiers in Neuromorphic Engineering paper below. Publications using this data should cite the following:

B. Rückauer and T. Delbruck, “Evaluation of event-based algorithms for optical flow with ground-truth from inertial measurement sensor,” Front. Neurosci., vol. 10, p. 176, 2016. http://journal.frontiersin.org/article/10.3389/fnins.2016.00176/abstract 

See more Inst. of Neuroinformatics Sensors Group datasets here.

Change history

(YouTube video: DVS optical flow methods in jAER)

For questions related to the use of this dataset or the associated jAER filters, please write to both rbodo@ini.uzh.ch and tobi@ini.uzh.ch.

For an overview showing the developed algorithms acting on two of the dataset recordings, see this YouTube video DVS optical flow methods in jAER  

Table of contents

Change history

Getting the dataset

Organization of the dataset

calibration

contrast input

filter settings

middlebury dataset

real samples

Speed of translBoxes scene as function of time, measured using IMUFlow with focal length 4.5mm and IMU full scale set to 1000dps

synthesized samples

videos

Installing JDK+netbeans and opening the jAER project

Running jAER

Selecting the correct camera class

Loading the optical flow event filters

Opening a recorded (“logged”) data file

Hiding the image frames

Enabling the IMUFlow filter to see ground truth flow vectors

Showing the IMU output

Activating LocalPlanesFlow to compare with IMUFlow

Showing the time and space-averaged motion field

Visualizing the underlying data

LocalPlanesFlow

LucasKanadeFlow

Getting tooltips

Prepending a RefractoryFilter

Measuring accuracy online using jAER

Storing the IMU calibration

Measuring accuracy online

Using matlab scripts

Inspecting the Java flow algorithms code

More about camera calibration

Using SingleCameraCalibration in optical flow estimation

FAQ

Further questions

Getting the dataset

Use Resilio Sync to get the optical flow dataset, from

DAVIS dataset associated with the Frontiers paper "Evaluation of Algorithms for Normal Optical Flow from Dynamic Vision Sensors" by Bodo Rueckauer and Tobi Delbruck, 2015.

Note: for linux users see ResilioSync linux help.

Site hosting this and other data: http://sensors.ini.uzh.ch/databases.html 

Organization of the dataset

The data is organized into folders which are described below.

calibration

Contains camera calibration images and calibration data. Not used in the paper but can be used to estimate camera distortion for more precise motion vectors at periphery of image.

Note about IMU calibration: All data was recorded using an IMU gyro gain setting of 1000 dps full scale. See Showing IMU output  and Storing the IMU calibration for more information. Note that the Davis Hardware Configuration panel must be correctly set for 1000 dps full scale for the data to be properly scaled, as shown below.

For all the recordings, the lens focal length was approximately 4.5mm. This number must be entered into each flow method's GUI under the IMU category:

Further investigation has shown that the actual lens focal length should be set to 5.4mm to achieve a stabilized image using the Steadicam filter. The reason is unknown.

See below for more information about camera calibration.

contrast input

These are single frame images (png-files) used to acquire the video sequences by either moving the image in front of the camera or by moving the camera while the image is at rest (to get additional IMU data).

filter settings

This folder contains typical parameter settings that can be loaded into the jAER viewer. These only serve as a starting point; some parameters have to be fine-tuned according to the input used. In general, a spatial search distance of 3 (corresponding to a neighborhood of 49 pixels) performs best; smaller neighborhood sizes will of course speed up the processing. The temporal search distance "maxDtThreshold" should be somewhere between 100 and 500 ms. The refractory period can usually be turned off to produce a denser flow field; values between 30 and 300 ms will speed up filtering and can sometimes even increase accuracy (especially for LucasKanadeFlow). Another important parameter is "thr" in LucasKanadeFlow and "thr3" in LocalPlanesFlow. It is a confidence measure that filters out unrealistically high velocities, mostly by looking at the eigenvalues of the pseudo-inverse in the least-squares estimation. Setting the threshold higher improves accuracy and processing speed but reduces the number of events passing through the filter.

middlebury dataset

This is a download of the twelve gray-scale training sequences together with their ground truth from the Middlebury benchmark http://vision.middlebury.edu/flow/ . These frames can be used to extract events (e.g. with MATLAB) and apply event-based motion flow algorithms. The drawback is that the ground truth provided there is full 2D, while the event-based methods considered here estimate only normal flow. Thus we can evaluate their performance only by visual judgment.
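
The conversion from frames to events can be done with a very simplified DVS model: a pixel emits an ON or OFF event whenever its log intensity changes by more than a contrast threshold between consecutive frames. The sketch below (in Java rather than MATLAB, with made-up threshold and intensity values) only illustrates this idea and is not a substitute for a proper DVS simulator.

import java.util.ArrayList;
import java.util.List;

public class FrameToEventsSketch {
    public static void main(String[] args) {
        double thetaOn = 0.15, thetaOff = -0.15;          // log-intensity contrast thresholds (assumed)
        double[][] frame0 = {{0.50, 0.52}, {0.48, 0.90}}; // toy 2x2 gray-scale intensities in [0,1]
        double[][] frame1 = {{0.50, 0.65}, {0.30, 0.90}};
        List<int[]> events = new ArrayList<>();           // each event: {x, y, polarity}
        for (int y = 0; y < frame0.length; y++) {
            for (int x = 0; x < frame0[0].length; x++) {
                // change in log intensity between the two frames (small offset avoids log(0))
                double dLogI = Math.log(frame1[y][x] + 1e-3) - Math.log(frame0[y][x] + 1e-3);
                if (dLogI > thetaOn) events.add(new int[]{x, y, +1});       // ON event
                else if (dLogI < thetaOff) events.add(new int[]{x, y, -1}); // OFF event
            }
        }
        for (int[] e : events)
            System.out.printf("event at (%d,%d) polarity %+d%n", e[0], e[1], e[2]);
    }
}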

real samples

IMU_calibration.aedat: A DVS-sequence of several seconds with stationary camera and no moving objects that can be used to calibrate the IMU.

IMU_rotDisk.aedat: The rotating disk data used above as example.

IMU_translBars: Two black bars of moderate width on white background (sharp contrast edges) oscillating horizontally. The motion is solely due to camera rotation, i.e. IMU ground truth can be computed.

ball: A short DVS-sequence featuring a jumping table-tennis ball (no IMU).

IMU_APS_translSin, IMU_APS_rotDisk, IMU_APS_translBoxes: Three sequences corresponding to the real samples used for benchmarking, but here with frames from the DAVIS. This allows the database to be used with conventional frame-based methods. The translBoxes sample moves to the right at about 32 pixels/second[1]; see the measured data below.

Speed of translBoxes scene as function of time, measured using IMUFlow with focal length 4.5mm and IMU full scale set to 1000dps

synthesized samples

These are simple/idealized AER samples generated with MATLAB. (A minimal sketch of how such an event stream can be synthesized appears after this list.)

- "TranslatingBarX", "TranslatingBarY": Two bars with width one pixel shifting from left to right, respectively from up to down. The line of ON events moves at 10 pixels/sec, i.e. every 100ms there is an ON event one more pixel to the right or up.

- "OscillatingDoubleBar": A bar with width two pixels oscillating from left to right, also at 1 pixel/100ms, or 10 pixels/sec.

- "gtRotatingBar": A mat-file containing the ground truth of the artificial rotBar-sample for the duration of one period.

- "gtTranslatingSquare": A mat-file containing the ground truth of the artificial translSquare-sample for the equivalent duration of two frames.  The square moves up and to right with X and Y velocity components of 20 pixels/second, i.e. speed of 28.3 pixels/second.

videos

mp4 videos of various methods in operation on data from the dataset and other data.

Installing JDK+netbeans and opening the jAER project

First, you need jAER and probably an IDE like netbeans installed to run the algorithms. Follow the steps to install netbeans+JDK (or use eclipse if you prefer) and jAER from https://sourceforge.net/p/jaer/wiki/jAER%20Installation/ 

Open the jAER project in netbeans

Running jAER

Run the project (^F5 to run under the debugger), which launches an AEViewer window. Running under the debugger lets you set breakpoints and examine the code.

Selecting the correct camera class

Select the AEChip Davis240C; the dataset was recorded using this camera.

Click the Filters button at lower left of display or select menu item View/Filters

Loading the optical flow event filters

Click the Select Filters… button in the filters frame

Type “flow” in the Filter box and after class scanning completes, select the optical flow classes from the paper shown highlighted below from the left column and move them to the Selected classes field on the right. These jAER event filters will then be loaded by default for the AEChip Davis240C.

Now you can see these filters are loaded.

Opening a recorded (“logged”) data file

Now open a dataset file, e.g. the rotating fan data file, using “o” or File/Open logged data file

Navigate to the dataset location and open the rotating fan data file. This file has only DVS data, no frames. It should start playing.

Hiding the image frames

If you open a file that has been recorded with Davis frames as well as DVS events, you can hide the image frames to better expose the DVS events and the resulting optical flow vectors. Show the hardware configuration panel by clicking the HW Configuration button at lower left of AEViewer. The configuration panel for Davis240C will open and now you can deselect the DisplayFrames checkbox to hide the captured APS images

Enabling the IMUFlow filter to see ground truth flow vectors

Now you can close the hardware configuration panel and open the control panel for one of the optical flow methods. Click IMUFlow's Control button. The control panel for IMUFlow should appear. This button just toggles between showing the controls for this one filter and showing the list of all available filters; the enabled filters run in a pipeline. Select the enable checkbox, circled in red below. This enables the filter, and now you should see the motion vectors attached to each DVS event drawn on the AEViewer display. Since the flow arises from camera rotation, it is accurately estimated from the IMU rate gyro samples. Use the ppsScale property to set the scaling of the length of the flow vectors in pixels/second per screen pixel. Note that 1) the IMU gain must be set properly for correct computation of the flow; see Showing the IMU output below, and 2) the IMU must be calibrated to remove the offsets present even when there is no camera rotation; see Storing the IMU calibration.
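
For readers curious what IMUFlow computes conceptually: for a purely rotating camera, the flow at each pixel follows directly from the gyro rates and the focal length. The sketch below uses the standard rotational flow-field equations; the sign/axis conventions, the example gyro values, and the 18.5 um DAVIS240 pixel pitch are assumptions here and are not taken from the jAER code.

public class GyroFlowSketch {
    public static void main(String[] args) {
        double lensFocalLengthMm = 4.5;               // as entered in the IMU category of the GUI
        double pixelPitchMm = 0.0185;                 // assumed DAVIS240 pixel pitch (18.5 um)
        double f = lensFocalLengthMm / pixelPitchMm;  // focal length in pixels

        // Calibrated gyro rates in rad/s (offsets already subtracted).
        double wx = Math.toRadians(5), wy = Math.toRadians(-2), wz = Math.toRadians(1);

        // Pixel position relative to the principal point (here: the array center).
        double x = 40 - 120, y = 60 - 90;

        // Rotational flow field (translation assumed zero), in pixels/s.
        // Signs depend on the IMU mounting and coordinate conventions.
        double u = (x * y / f) * wx - (f + x * x / f) * wy + y * wz;
        double v = (f + y * y / f) * wx - (x * y / f) * wy - x * wz;
        System.out.printf("flow at (%.0f,%.0f): u=%.1f v=%.1f px/s%n", x, y, u, v);
    }
}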

Showing the IMU output

You can use the IMU control in the hardware configuration panel to show the IMU output:

Then for the rotating fan data, you should see the accelerometer output (which statically encodes gravity) rotate around as the camera rotates. It is the green vector shown here:

All datasets were recorded using an IMU gain setting of 1000 dps (degrees per second full scale).

The IMU gain must be set properly for veridical flow vector magnitudes. To set the IMU gain, use the HW Configuration button to open the IMU Config tab of the control panel. Then ensure that the Gyro full scale range (deg/s) is set to 1000 deg/s:

Activating LocalPlanesFlow to compare with IMUFlow

Pop back to the overview of all filters by hitting the Back to filters list button on IMUFlow.

Now expose the controls of LocalPlanesFlow by hitting its Controls button. Enable the filter using the checkbox at top left corner. The flow events from LocalPlanesFlow should now appear using the default (hard-coded) values of parameters such as th1, th2, th3.

Showing the time and space-averaged motion field

A new option, available for all flow algorithms, displays a time- and space-averaged motion field. The motion field is enabled by selecting the showMotionField option.

The motionFieldSubsamplingShift parameter is the bit shift for subsampling the optical flow events. For example, setting it to 4 averages flow events over 16x16 pixel blocks. For other options, see the tooltips.

Enabling the motion field display shows a color-coded representation of the average flow, as shown below for the data file IMU_rotDisk.aedat using the DirectionSelectiveFlow filter.
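
The subsampling works by pooling events into square blocks whose side length is a power of two. The sketch below is illustrative only (the array size and variable names are assumptions, and a real filter also decays old values over time); it just shows how the bit shift maps a pixel to its block and how the block average is formed.

public class MotionFieldSketch {
    public static void main(String[] args) {
        int shift = 4;                        // motionFieldSubsamplingShift = 4
        int blockSize = 1 << shift;           // -> 16x16 pixel blocks
        int sizeX = 240, sizeY = 180;
        int nx = (sizeX + blockSize - 1) >> shift, ny = (sizeY + blockSize - 1) >> shift;
        float[][] sumVx = new float[nx][ny], sumVy = new float[nx][ny];
        int[][] count = new int[nx][ny];

        // Accumulate one example flow event at pixel (37, 70) with velocity (12, -3) px/s.
        int x = 37, y = 70; float vx = 12f, vy = -3f;
        int bx = x >> shift, by = y >> shift;  // block indices
        sumVx[bx][by] += vx; sumVy[bx][by] += vy; count[bx][by]++;

        System.out.printf("block (%d,%d), %dx%d pixels: mean flow (%.1f, %.1f) px/s%n",
                bx, by, blockSize, blockSize, sumVx[bx][by] / count[bx][by], sumVy[bx][by] / count[bx][by]);
    }
}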

Visualizing the underlying data

LocalPlanesFlow

The option Display/showTimestampMap draws the surface of most-recent timestamps on top of the output. An example is shown below, where it is clear that a nice plane can be inferred from many of the events on the leading edge of a fan segment.  The option showTimestampMapMask selects the ON or OFF or both input event types. The option showTimestampMapAlpha sets the transparency of the overlay.
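
For reference, the core of the local-planes idea is a least-squares fit to that timestamp surface: fitting t = a*x + b*y + c to the most-recent timestamps in a small neighborhood gives the timestamp gradient (a, b), and the normal flow is grad(t)/|grad(t)|^2. The sketch below illustrates this idea only; it is not the jAER LocalPlanesFlow code and omits all of its robustness checks (th1, th2, th3).

public class LocalPlaneFitSketch {

    // Least-squares plane fit through points (x, y, t) via the 3x3 normal
    // equations, solved with Cramer's rule. Returns {a, b, c}.
    static double[] fitPlane(double[] xs, double[] ys, double[] ts) {
        double sxx = 0, sxy = 0, sx = 0, syy = 0, sy = 0, n = xs.length;
        double sxt = 0, syt = 0, st = 0;
        for (int i = 0; i < xs.length; i++) {
            sxx += xs[i] * xs[i]; sxy += xs[i] * ys[i]; sx += xs[i];
            syy += ys[i] * ys[i]; sy += ys[i];
            sxt += xs[i] * ts[i]; syt += ys[i] * ts[i]; st += ts[i];
        }
        double det = det3(sxx, sxy, sx, sxy, syy, sy, sx, sy, n);
        double a = det3(sxt, sxy, sx, syt, syy, sy, st, sy, n) / det;
        double b = det3(sxx, sxt, sx, sxy, syt, sy, sx, st, n) / det;
        double c = det3(sxx, sxy, sxt, sxy, syy, syt, sx, sy, st) / det;
        return new double[]{a, b, c};
    }

    static double det3(double a, double b, double c, double d, double e, double f,
                       double g, double h, double i) {
        return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g);
    }

    public static void main(String[] args) {
        // Timestamps (seconds) of an edge sweeping right at 10 px/s:
        // t increases by 0.1 s per pixel in x, independent of y.
        double[] xs = {0, 0, 1, 1, 2, 2}, ys = {0, 1, 0, 1, 0, 1};
        double[] ts = {0.0, 0.0, 0.1, 0.1, 0.2, 0.2};
        double[] p = fitPlane(xs, ys, ts);
        double a = p[0], b = p[1], g2 = a * a + b * b;
        // Normal flow: direction of the timestamp gradient, speed 1/|grad t|.
        System.out.printf("vx=%.1f vy=%.1f px/s%n", a / g2, b / g2); // prints vx=10.0 vy=0.0
    }
}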

LucasKanadeFlow

The option drawCollectedEventsHistogramEnabled draws the histogram of collected events on top of the output for the LK filter.
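
The "thr" confidence threshold mentioned in the filter settings section works on the eigenvalues of the 2x2 structure matrix of the local least-squares problem: if the smallest eigenvalue is too small, the local data are too poorly conditioned (e.g. aperture problem) and the estimate is rejected. The sketch below shows that test in isolation; the matrix entries and threshold value are made up and this is not the jAER LucasKanadeFlow code.

public class EigenvalueConfidenceSketch {

    // Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]], smallest first.
    static double[] eig2x2(double a, double b, double c) {
        double tr = a + c, det = a * c - b * b;
        double disc = Math.sqrt(Math.max(0, tr * tr - 4 * det));
        return new double[]{(tr - disc) / 2, (tr + disc) / 2};
    }

    public static void main(String[] args) {
        double thr = 1e-2; // confidence threshold (illustrative value)
        // Example structure matrix A^T A accumulated from local brightness gradients:
        double sIxIx = 2.5, sIxIy = 0.2, sIyIy = 0.02;
        double[] lambda = eig2x2(sIxIx, sIxIy, sIyIy);
        if (lambda[0] < thr) {
            System.out.println("reject: smallest eigenvalue " + lambda[0] + " below thr (aperture/noise)");
        } else {
            System.out.println("accept: matrix well conditioned, solve for flow");
        }
    }
}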

Getting tooltips

You can hover over each of the properties to see a tooltip for that property. For example, hovering over lensFocalLengthMm will show you this

All menu items, buttons, etc should have tooltips and most-used menu items have 1-key accelerators.

Prepending a RefractoryFilter

The LocalPlanesFlow filter in particular can benefit from preprocessing the DVS events with a RefractoryFilter that limits the event rate from each pixel. RefractoryFilter should be loaded by default. If not, add it before the flow filters just as you added the flow filters earlier. Make sure the RefractoryFilter is higher on the list than the flow filters, so it is processed first. Then you can limit the event rate by just enabling this RefractoryFilter. You can set the refractory period using the controls for RefractoryFilter

For the example above, the RefractoryFilter is currently disabled, but when enabled it limits the ISIs to at least 194 ms (194k microseconds).
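
Conceptually the refractory preprocessing is very simple: keep the timestamp of the last event passed at each pixel, and drop any new event that arrives within the refractory period. The sketch below shows just that logic (it is not the jAER RefractoryFilter code; array size and example values are illustrative).

import java.util.Arrays;

public class RefractorySketch {
    public static void main(String[] args) {
        final int sizeX = 240, sizeY = 180;
        final long refractoryPeriodUs = 30_000;       // e.g. 30 ms
        long[][] lastTs = new long[sizeX][sizeY];
        for (long[] row : lastTs) Arrays.fill(row, -refractoryPeriodUs); // "long ago"

        // Three events at the same pixel at t = 0, 10 ms and 40 ms: the 1st and 3rd pass.
        long[][] events = {{10, 20, 0}, {10, 20, 10_000}, {10, 20, 40_000}}; // {x, y, tUs}
        for (long[] e : events) {
            int x = (int) e[0], y = (int) e[1]; long t = e[2];
            boolean pass = t - lastTs[x][y] >= refractoryPeriodUs;
            if (pass) lastTs[x][y] = t;               // only passed events update the map
            System.out.println("event at t=" + t + " us: " + (pass ? "pass" : "drop"));
        }
    }
}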

Measuring accuracy online using jAER

If the logged data has IMU samples, then IMUFlow can be used to estimate the error of any of the other flow methods online. First the IMU offsets must be measured. This step is followed by using the built-in MotionFlowStatistics object to measure the accuracy.

Storing the IMU calibration

The recording IMU_calibration.aedat is used to measure and store the zero-movement output of the IMU, so that these readings can be subtracted from later samples.
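
What gets stored is essentially the mean IMU reading of the stationary recording, which is later subtracted from every sample. A minimal sketch of that idea (with made-up gyro values; the real calibration averages over the whole recording):

public class ImuOffsetCalibrationSketch {
    public static void main(String[] args) {
        // Gyro samples (deg/s) from a stationary camera: nonzero offsets.
        double[][] gyroDps = {{0.9, -1.6, 0.3}, {1.1, -1.4, 0.2}, {1.0, -1.5, 0.4}};
        double[] offset = new double[3];
        for (double[] s : gyroDps)
            for (int i = 0; i < 3; i++) offset[i] += s[i] / gyroDps.length;
        System.out.printf("stored offsets (deg/s): [%.2f, %.2f, %.2f]%n",
                offset[0], offset[1], offset[2]);

        // Later samples are corrected by subtracting the stored offsets.
        double[] raw = {1.0, -1.5, 0.3};
        double[] corrected = new double[3];
        for (int i = 0; i < 3; i++) corrected[i] = raw[i] - offset[i];
        System.out.printf("corrected sample (deg/s): [%.2f, %.2f, %.2f]%n",
                corrected[0], corrected[1], corrected[2]);
    }
}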

To use these offsets, play back this recording, and in any of the optical flow jAER filters, hit the StartIMUCalibration button

To verify that this calibration is working properly, try it with the IMUFlow filter. Erase the IMU calibration using the ResetIMUCalibration button. Note how each noise DVS event is labeled with a significant flow pointing down and to the left:

After hitting the StartIMUCalibration button, this offset should be removed:

Measuring accuracy online

Enabling the measureAccuracy and measureProcessingTime options in LocalPlanesFlow, for example, and hitting the PrintStatistics button results in a console output like this:

Jan 31, 2016 10:58:47 PM

 ch.unizh.ini.jaer.projects.rbodo.opticalflow.AbstractMotionFlowIMU doPrintStatistics

INFO: LocalPlanesFlow

Motion Flow Statistics Summary:

EventDensity: 92.61% - the ratio of output flow events to input DVS events

Global velocity: [-0.08, 42.66] +/- [3.25, 3.94] pixel/s, global rotation: -85.27 +/- 3.04 °/s

ProcessingTime: 2.23 +/-  2.92 us/event

AngularError: 74.36 +/- 63.82 °, Histogram of 583984 samples starting at 3.00 with step sizes of  10.00: [37101, 105565, 66477, 38961, 25098, 18650, 16444, 13583, 13052, 14084, 15810, 17027, 17783, 19990, 24626, 28138, 33666, 43113, 34816, 0], percentage above 3.000000 °: 93.65%, above 10.000000 °: 75.57%, above 30.000000 °: 57.52%

EndpointErrorAbs: 149.76 +/- 152.16 pixels/s, Histogram of 583984 samples starting at 1.00 with step sizes of  10.00: [65, 7371, 18948, 27250, 33078, 38233, 41556, 43888, 43004, 38822, 31375, 24865, 18712, 14335, 12190, 10784, 10053, 9694, 9224, 8517], percentage above 1.00 pixels/s: 75.67%, above 10.00 pixels/s: 74.41%, above 20.00 pixels/s: 71.16%

EndpointErrorRel: 218.01 +/- 248.48%, Histogram of 583984 samples starting at 2.00 with step sizes of 100.00: [89, 297024, 97980, 66885, 40591, 25847, 17518, 11867, 8152, 5432, 3807, 2546, 1722, 1233, 851, 559, 420, 324, 246, 167]

See the paper for detailed explanations of these statistics.
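
As a rough orientation (the paper's exact definitions are authoritative), the per-event error measures above compare each estimated flow vector with the IMUFlow ground-truth vector, roughly like this:

public class FlowErrorSketch {
    public static void main(String[] args) {
        double[] vEst = {2.0, 40.0};   // estimated flow, px/s
        double[] vGt  = {0.0, 43.0};   // IMU ground-truth flow, px/s

        double dx = vEst[0] - vGt[0], dy = vEst[1] - vGt[1];
        double epeAbs = Math.hypot(dx, dy);                          // endpoint error, pixels/s
        double epeRel = 100.0 * epeAbs / Math.hypot(vGt[0], vGt[1]); // relative endpoint error, percent

        // Angle between the two 2D vectors, in degrees.
        double dot = vEst[0] * vGt[0] + vEst[1] * vGt[1];
        double cos = dot / (Math.hypot(vEst[0], vEst[1]) * Math.hypot(vGt[0], vGt[1]));
        double angErrDeg = Math.toDegrees(Math.acos(Math.max(-1, Math.min(1, cos))));

        System.out.printf("EndpointErrorAbs=%.2f px/s, EndpointErrorRel=%.1f%%, AngularError=%.2f deg%n",
                epeAbs, epeRel, angErrDeg);
    }
}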

Using matlab scripts

To create or view AEDAT files, see the jAER wiki.

The jAER project has the matlab scripts associated with the paper. The scripts are located at https://github.com/SensorsINI/processAEDAT/tree/master/misc/optical%20flow. The folders have individual readmes to help guide you.

Inspecting the Java flow algorithms code

All the filters are located in the jAER package ch.unizh.ini.jaer.projects.rbodo.opticalflow. To open one of the classes, in netbeans, use ^o to open the Open Type browser: Use camel case and type “LPF” (short for LocalPlanesFlow), then select the LocalPlanesFlow class

Go to the method filterPacket, either by search or by using the Navigator on the left of netbeans:

The event packet EventPacket supplied on entry to filterPacket contains the list of events from the camera, which is processed in this filter. You can now set a breakpoint and examine the algorithm execution.

Give the other optical flow methods a try as well.

You will observe that all methods inherit from a base class AbstractMotionFlowIMU that provides common functionality.

For help on how jAER event filters work, see https://sourceforge.net/p/jaer/wiki/Adding%20a%20new%20event%20filter/ 
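
For orientation, an event filter has roughly the following shape. This skeleton follows the pattern of existing filters, but the exact base-class methods and imports may differ between jAER versions, so use the wiki template above as the authoritative reference.

import net.sf.jaer.chip.AEChip;
import net.sf.jaer.event.BasicEvent;
import net.sf.jaer.event.EventPacket;
import net.sf.jaer.eventprocessing.EventFilter2D;

public class MyFlowSketchFilter extends EventFilter2D {

    public MyFlowSketchFilter(AEChip chip) {
        super(chip);
    }

    @Override
    public EventPacket<?> filterPacket(EventPacket<?> in) {
        // The packet holds the DVS events from the camera or a recording.
        for (BasicEvent e : in) {
            // e.x, e.y and e.timestamp (us) are available here; a flow method
            // would compute a velocity for each event at this point.
        }
        return in; // pass events through unchanged in this sketch
    }

    @Override
    public void resetFilter() {
        // clear any per-pixel state (e.g. timestamp maps) here
    }

    @Override
    public void initFilter() {
    }
}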

To see the revision history of these algorithms, in netbeans, select the package, right-click, and select Subversion/Search History…

Click the Search button to pop up a log of changes to source files in the opticalflow package.

More about camera calibration

The data in the calibration dataset folder can be used for camera calibration using a variety of methods. We originally used the Matlab-based Camera Calibration toolbox.  Later, we added OpenCV camera calibration support within jAER.

To use the jAER camera calibration support, add the event filter SingleCameraCalibration.

Then, from the calibration folder play the calibration file Calibration_moving_checkerboard_DAVIS240C.aedat, enabling SingleCameraCalibration with the settings shown below. Make sure frames are displayed. Events do not need to be displayed.

The (physical) rectangle size should be set to 74x74 mm. Enable realtimePatternDetectionEnabled.  You should see the yellow crosses appear on at least some of the frames, when OpenCV’s findCorners method succeeds on the Davis image.

Hit the TakeImage button a few times to capture some images with different views of the calibration target. If the capture succeeds, you will see a confirmation on the AEViewer status line.

After one or more sets of corner points have been collected, hit the Calibrate button. If the calibration succeeds, then a summary of the calibration is shown on the display. Now you should check if the focal length makes sense.  The principal point is shown as a green cross on the display.

Hit the SaveCalibration button to save the calibration files cameraCalibration.xml and distortionCoefs.xml in a selected folder. This calibration will be loaded at startup by default.

You can check if the calibration succeeds in undistorting the images by opening the ApsFrameExtractor enclosed filter, and selecting the showAPSFrameDisplay option there and the showUndistortedFrames option in SingleCameraCalibration.

An example of un-distorted output is shown below. The curved edges of the calibration pattern have been straightened out. At the same time, the whole image has been magnified and some source pixels have been lost.

Using SingleCameraCalibration in optical flow estimation

The calibration method shown above is integrated in all optical flow methods and can be accessed from the filter settings GUI of the optical flow method you are using, e.g. LocalPlanesFlow:

Follow the steps described above to calibrate the camera, or simply load the saved calibration parameters, which are stored in the calibration folder of the BitTorrent Sync dataset. After performing or importing a calibration, the DVS events can be undistorted by checking the field undistortDVSevents in the enclosed SingleCameraCalibration. These new addresses remove the effects of lens distortion, i.e. bent edges at the periphery are straightened out as though the optics were an ideal pinhole camera. The optical flow method uses these undistorted events to estimate motion vectors. The IMUFlow algorithm also operates on the corrected event addresses to provide ground truth.
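
Undistorting an event address amounts to mapping the measured (distorted) pixel back to the pixel an ideal pinhole camera would have seen, using the calibrated intrinsics and distortion coefficients. The sketch below shows one common way to do this with a simple radial model and a fixed-point iteration; the intrinsic and distortion values are made up, and the real filter would precompute a lookup table over all addresses.

public class UndistortEventSketch {
    public static void main(String[] args) {
        double fx = 290, fy = 290, cx = 120, cy = 90;  // hypothetical intrinsics (pixels)
        double k1 = -0.35, k2 = 0.12;                  // hypothetical radial distortion coefficients

        int xEvt = 10, yEvt = 170;                     // distorted DVS event address
        // Normalize to ideal image-plane coordinates.
        double xd = (xEvt - cx) / fx, yd = (yEvt - cy) / fy;
        // Invert x_d = x_u * (1 + k1*r^2 + k2*r^4) by fixed-point iteration.
        double xu = xd, yu = yd;
        for (int i = 0; i < 10; i++) {
            double r2 = xu * xu + yu * yu;
            double factor = 1 + k1 * r2 + k2 * r2 * r2;
            xu = xd / factor;
            yu = yd / factor;
        }
        // Back to pixel coordinates (a real filter would round to integer addresses).
        System.out.printf("undistorted address: (%.1f, %.1f)%n", xu * fx + cx, yu * fy + cy);
    }
}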

FAQ

Q1. Where is the origin of the captured event coordinates? It seems that for APS events it is the lower-left corner of the screen, while for DVS events it is the lower-right corner!

A1. The origin is the lower-left corner (0,0) for both frames and event streams. In cAER we now use the upper left as the origin, as is more conventional in machine vision, but jAER started out with Cartesian coordinates and still uses them. Sorry for the inconvenience. Here's a screen shot after right-clicking on the LL corner in jAER, while playing back the rotating fan example from I:\BitTorrent Sync\BodoRuckhauerDVSOpticalFlowData\real samples\IMU_APS
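
If you need the machine-vision convention, only the y address has to be flipped; a one-line sketch:

public class OriginConversionSketch {
    public static void main(String[] args) {
        int sizeY = 180;            // DAVIS240 has 180 rows
        int yLowerLeft = 20;        // y as reported by jAER (0 at the bottom)
        int yUpperLeft = (sizeY - 1) - yLowerLeft; // 0 at the top, as in cAER/OpenCV
        System.out.println("jAER y=" + yLowerLeft + " corresponds to image row " + yUpperLeft);
    }
}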

Q2. There are some erroneous (or at least unusual) rows in the captured data, e.g. sometimes the extracted x or y coordinates of events are larger than 240 or 180. The event timestamps are not necessarily monotonically increasing. There are some APS events which have '10' values for the APS read bits. There are many duplicate rows in the captured data for APS, IMU, or DVS events. The gray-scale frames are not complete or regular! I'm wondering how to deal with these issues in the captured data.

A2. When I look at the 3 videos in jAER, I don't see any bad events or bad frames except for the translating sinusoids example. Try playing them back slowed down (using S to slow down, F to speed up) and turning off events (shift E) and IMU samples (shift I). (See the DAVIS menu for Davis shortcuts). Maybe the matlab or python script you are using is not parsing all the event types correctly.

Q3. What are the units of the temperature, acceleration, and gyro data? I have extracted 7 16-bit numbers and I don't know how to interpret them.

A3. The units of the IMU samples are buried in the code. See the class eu.seebetter.ini.chips.davis.imu.ImuSample in jAER for the IMU handling. To see how to use IMU samples, look for usages of this class, e.g. in Steadicam. Note that the IMU scale must be set in jAER the same as during the recording for proper interpretation. It is recorded in the preferences written in the header of the data file; search for IMU in these preferences. For all the dataset recordings, I believe the IMU gyro was set to the following (this is from the translatingBoxes file):

    <entry key="CPLDByte.imu0_PWR_MGMT_1" value="2"/>
    <entry key="CPLDByte.imu1_CONFIG" value="0"/>
    <entry key="CPLDByte.imu2_SMPLRT_DIV" value="0"/>
    <entry key="CPLDByte.imu3_GYRO_CONFIG" value="16"/>
    <entry key="CPLDByte.imu4_ACCEL_CONFIG" value="16"/>
    <entry key="ImuGyroScale" value="GyroFullScaleDegPerSec1000"/>
    <entry key="ImuAccelScale" value="ImuAccelScaleG8"/>
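
As a sketch of the conversion (assuming the usual InvenSense MPU-6x50 register order of accel x,y,z, temperature, gyro x,y,z and the full-scale settings above, i.e. +/-8 g and +/-1000 deg/s; the constants below come from the MPU-6050/6150 datasheet, and ImuSample in jAER remains the authoritative reference):

public class ImuSampleConversionSketch {
    public static void main(String[] args) {
        short[] raw = {410, -82, 4096, -512, 33, -16, 1640}; // the 7 raw 16-bit words (example values)

        double accelLsbPerG = 32768.0 / 8.0;     // 4096 LSB/g at +/-8 g full scale
        double gyroLsbPerDps = 32768.0 / 1000.0; // about 32.8 LSB per deg/s at +/-1000 dps full scale

        double ax = raw[0] / accelLsbPerG, ay = raw[1] / accelLsbPerG, az = raw[2] / accelLsbPerG;
        double tempC = raw[3] / 340.0 + 36.53;   // datasheet temperature formula
        double gx = raw[4] / gyroLsbPerDps, gy = raw[5] / gyroLsbPerDps, gz = raw[6] / gyroLsbPerDps;

        System.out.printf("accel [g]: %.2f %.2f %.2f, temp [C]: %.1f, gyro [deg/s]: %.1f %.1f %.1f%n",
                ax, ay, az, tempC, gx, gy, gz);
    }
}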

Q4. Regarding the ground truth provided in the *.mat files: what time or instant does this ground truth correspond to? When you calculated the errors for comparison, was it just for a particular instant of the video or for the whole video from beginning to end?

A4. The edge in RotatingBar turns with a period of 30 seconds. We measured the error for one complete 360 degree turn. In the TranslatingSquare sample, the object moves with sqrt(2)*20 ~ 28.28 pixels/second. We used the complete recording, except a few time steps at the beginning and end. This allows the object to traverse 3-5 pixels before measuring the error, so that we have a complete event neighbourhood at our disposal, removing possible border effects. With time steps of 50 ms, this amounts to 150-250 ms, and is negligible compared to the total of 5 s of uniform and noise-free motion.

The matlab scripts creating these samples (and specifying all parameters) are contained in the jAER repository: /jAER/scripts/matlab/optical flow/artificial sample creation/

Further questions

Good luck and happy insights.

Tobi Delbruck and Bodo Rückauer, March 2016


[1] Speed was erroneously listed as 25 pps here earlier. New measurements show that the speed is about 32pps during the stable part of the motion.