Date: 09 August 2012, Thursday
SLAM is the problem of estimating the robot pose along with the landmark locations at the same time. The first part is called the localization problem and the second part the mapping problem. The problem arises because of the noise present in odometry and sensors. One approach is to use an Extended Kalman Filter (EKF) to estimate the robot pose and landmark positions simultaneously using an extended state vector.
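For concreteness, in the standard formulation (sketched here for 2-D point landmarks) the extended state vector stacks the robot pose and all landmark coordinates:

$$x = \begin{bmatrix} x_r & y_r & \theta_r & m_{1,x} & m_{1,y} & \cdots & m_{N,x} & m_{N,y} \end{bmatrix}^T,$$

so with $N$ landmarks the state has $3 + 2N$ entries, and the EKF maintains the corresponding $(3+2N) \times (3+2N)$ covariance matrix, whose off-diagonal blocks capture the correlations between the robot pose and the landmark estimates.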
Download the Scilab code for EKF SLAM from this link. The folder contains all the files necessary for regenerating the results shown in the figures. The code works only on Linux, and you need to have Player/Stage installed on your system.
Basically, we generate data using the program ‘gendata.cpp’, which uses Player/Stage to simulate robot motion in an environment where the robot’s fiducial sensor detects beacons placed along the boundary of the workspace. The 32 landmark beacons are generated using the program ‘beacon_locn.cpp’ and placed in the simple.world file. To know more about fiducial sensors, visit this link.
The generated data is stored in the file ‘robotdata32.txt’. This file is used by the Scilab program ‘ekfslam_32lm.sce’ to estimate the robot pose and landmark locations using the EKF algorithm.
Just for understanding, you may look into this code, which demonstrates how one can build SLAM using a Kalman filter, assuming linear motion and measurement models.
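For reference, with linear models $x_{k+1} = A x_k + B u_k + w_k$ (with $w_k \sim \mathcal{N}(0, Q)$) and $z_k = H x_k + v_k$ (with $v_k \sim \mathcal{N}(0, R)$), the Kalman filter alternates the standard predict and update equations; these are the generic textbook equations, not a description of that particular code:

$$\hat{x}_{k|k-1} = A\,\hat{x}_{k-1} + B\,u_k, \qquad P_{k|k-1} = A\,P_{k-1}\,A^T + Q,$$

$$K_k = P_{k|k-1}\,H^T \left( H\,P_{k|k-1}\,H^T + R \right)^{-1}, \qquad \hat{x}_k = \hat{x}_{k|k-1} + K_k\,\left( z_k - H\,\hat{x}_{k|k-1} \right), \qquad P_k = (I - K_k\,H)\,P_{k|k-1}.$$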
Simulation results for 64 landmarks with known correspondence are shown below.
[Figures: Landmark Estimate | Robot Pose Estimate | Average Position Error]
Source Codes:
Resource:
Date: 23 October 2012, Tuesday
A scan-matching algorithm estimates the robot pose from a set of measurement data obtained from sensors like a laser or a sonar. Since a laser scan is more accurate than a sonar scan, it is the preferred input for most scan-matching algorithms. For point-to-point features, some of the popular scan-matching algorithms are Iterative Closest Point (ICP), Iterative Matching Range Point (IMRP), Iterative Dual Correspondence (IDC), and Polar Scan Matching (PSM) [2].
Source Code: I tried using the PSM code obtained from Albert Diosi’s website. Some test examples of this code are available here.
The result obtained from CARMEN [2] (shown below) is not as good as that of PSM. I am not sure if I have used the function correctly.
Reference:
Date: 26 October 2012, Friday
CARMEN is a robot navigation toolkit from Carnegie Mellon University. My interest lies in using its scan-matching algorithm within my own C/C++ code. This is how I go about it.
First of all, you need to obtain the CARMEN source code and build the binaries. This page will get you through most of the problems that you might encounter while building them.
Before you start writing your own program, I would advise you to go through the limited documentation available on the CARMEN site. In particular, go through the style guide; it will help you understand the various pre-defined variables and their measurement units.
The website also tells you how to make use of the GUI-based tools available with CARMEN. However, it does not tell you how to use the libraries with your own C/C++ code. This is what I explain below.
I am interested in using the scan-matching algorithm that comes with the vasco module of the package. The following code uses it to estimate the pose of the robot from its laser scans, which I read from a previously generated data file.
// vasco-test.cpp
#include <iostream>
#include <fstream>
#include <cmath>
#include <carmen/carmen.h>
#include <carmen/vascocore.h>

const int nData = 200;   // total number of records in the input file
const int LPOINTS = 181; // number of laser scan points per record

using namespace std;

int main(int argc, char *argv[])
{
    carmen_ipc_initialize(argc, argv);
    carmen_param_check_version(argv[0]);
    vascocore_init(argc, argv);

    carmen_point_t est_pose;  // estimated robot pose from the vasco scan matcher
    carmen_point_t act_pose;  // actual robot pose read from the data file
    float range[LPOINTS], angle[LPOINTS];
    double fov = M_PI;        // laser field of view: 180 degrees

    ofstream f1("pose.txt");
    ofstream f2("landmark.txt");
    ifstream fin("./data/robdata.txt");

    int first = 1;            // true only for the first scan
    for(int k = 0; k < nData; ++k)
    {
        fin >> act_pose.x >> act_pose.y >> act_pose.theta;
        for(int i = 0; i < LPOINTS; ++i)
        {
            fin >> range[i] >> angle[i];
        }

        // match the current scan, starting from the odometry pose act_pose
        est_pose = vascocore_scan_match_general(LPOINTS, range, angle, fov,
                                                act_pose, first);
        f1 << est_pose.x << "\t" << est_pose.y << endl;

        // project the scan points into the world frame using the estimated pose
        double xx, yy;
        for(int i = 0; i < LPOINTS; ++i)
        {
            if(range[i] < 8.0)  // use only readings shorter than 8 m
            {
                xx = est_pose.x + range[i] * cos(est_pose.theta + angle[i]);
                yy = est_pose.y + range[i] * sin(est_pose.theta + angle[i]);
                f2 << xx << "\t" << yy << endl;
            }
        }
        f2 << endl << endl;   // blank lines separate the scans in the output

        if(first)
            first = 0;
    } // data-file loop

    f1.close();
    f2.close();
    fin.close();
    return 0;
}
The Makefile for compiling this code looks like this:
TARGET = vasco-test
OBJECTS = vasco-test.o
CFLAGS = -g -Wall -Wextra
LFLAGS = -Wl,--start-group -lvascocore -lglobal -lipc -lparam_interface -Wl,--end-group -lz
CC = g++
INC_PATH = -I ~/TCS/Programs/Particle_Filter/carmen-0.7.4-beta/include
LIB_PATH = -L ~/TCS/Programs/Particle_Filter/carmen-0.7.4-beta/lib

$(TARGET): $(OBJECTS)
	$(CC) $(LIB_PATH) $(OBJECTS) $(LFLAGS) -o $(TARGET)

vasco-test.o: vasco-test.cpp
	$(CC) $(CFLAGS) $(INC_PATH) -c vasco-test.cpp

clean:
	rm -f $(TARGET) $(OBJECTS)
The -Wl,--start-group ... -Wl,--end-group wrapper in LFLAGS avoids the circular dependencies among the CARMEN static libraries.
Now run the following commands in different terminals.
Terminal 1:
$ <carmen-src-directory>/bin/central
central running ...
Terminal 2:
$ <carmen-src-directory>/bin/param_daemon --robot pioneer-I ../data/thickwean.map
Loading parameters for robot pioneer-I using param file ../src/carmen.ini
CARMEN - Carnegie Mellon Robot Navigation Toolkit - Version 0.7.0
Terminal 3:
$ make
g++ -g -Wall -Wextra -I ~/TCS/Programs/Particle_Filter/carmen-0.7.4-beta/include -c vasco-test.cpp
g++ -L ~/TCS/Programs/Particle_Filter/carmen-0.7.4-beta/lib vasco-test.o -Wl,--start-group -lvascocore -lglobal -lipc -lparam_interface -Wl,--end-group -lz -o vasco-test
$ ./vasco-test
The parameter server uses ‘carmen.ini’ to load the robot parameters; you may change these parameters by editing this file. Several robot configurations are already defined within it. I have selected the Pioneer robot as an example.
Date: 14 December 2012, Friday
Since I was not able to control the behaviour of the routines that I obtained from external sources, I decided to write my own. In this regard, I implemented a basic version of the Iterative Closest Point (ICP) algorithm. The idea is to estimate the robot pose given two laser scans: the reference scan and the current scan. The initial robot pose for the current scan is taken from the odometry readings. Since we assume the odometry readings to be noisy and the laser readings to be accurate, we try to find a better estimate of the robot pose from the difference between the two laser scans.
The method involves four steps [1]: (1) pair each point of the current scan with its closest point in the reference scan; (2) compute the rotation and translation that minimize the mean squared error between the paired points; (3) apply this transformation to the current scan; (4) iterate until the pose correction becomes negligible.
Please refer to [1] for the details of the algorithm; I found it extremely useful for understanding the basic ICP algorithm. Since the laser scan points have fixed bearing angles, the correspondences between two scans are not that difficult to find. I also tried using direct correspondences between the two scans for computing the error cost function. In other words, I assumed that the first point of the first scan corresponds to the first point of the second scan, which has the same bearing angle. That makes the whole algorithm of order $O(n)$, where $n$ is the number of points in the scan.
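To make the registration step concrete, here is a minimal sketch of the closed-form least-squares alignment for such direct (index-based) correspondences; the Pt type, the alignScans name, and the data layout are illustrative assumptions, not taken from the downloadable program.

// icp-direct.cpp : one least-squares alignment step with direct (index-based)
// correspondences between two 2-D scans. Hypothetical, self-contained sketch.
#include <cmath>
#include <vector>

struct Pt { double x, y; };

// Find the rotation dth and translation (dx, dy) that best map 'cur' onto
// 'ref', using the closed-form solution based on centroids and the
// cross-covariance of the paired points.
void alignScans(const std::vector<Pt>& ref, const std::vector<Pt>& cur,
                double& dx, double& dy, double& dth)
{
    const int n = (int)ref.size();
    Pt cr = {0, 0}, cc = {0, 0};               // centroids of the two scans
    for(int i = 0; i < n; ++i)
    {
        cr.x += ref[i].x / n;  cr.y += ref[i].y / n;
        cc.x += cur[i].x / n;  cc.y += cur[i].y / n;
    }
    double sxx = 0, sxy = 0, syx = 0, syy = 0; // cross-covariance terms
    for(int i = 0; i < n; ++i)                 // point i of cur pairs with point i of ref
    {
        double ax = cur[i].x - cc.x, ay = cur[i].y - cc.y;
        double bx = ref[i].x - cr.x, by = ref[i].y - cr.y;
        sxx += ax * bx;  sxy += ax * by;
        syx += ay * bx;  syy += ay * by;
    }
    dth = std::atan2(sxy - syx, sxx + syy);    // optimal rotation angle
    dx  = cr.x - (cc.x * std::cos(dth) - cc.y * std::sin(dth));
    dy  = cr.y - (cc.x * std::sin(dth) + cc.y * std::cos(dth));
}

With a fixed pairing, this single solve minimizes the squared error; the full iterative variant re-pairs the points after applying the correction and repeats this step until convergence.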
Limitations:
Source code: Click here to download the source code of the program. The code is made available under the GNU General Public License.
[Figures: Scan Matching Results for Only Translation Error | Scan Matching Results for Both Translation and Rotational Errors]
Reference:
1. J. L. Martinez, J. Gonzalez, J. Morales, A. Mandow and A. J. Garcia-Cerezo, “Genetic and ICP Laser Point Matching for 2D Mobile Robot Motion Estimation,” Journal of Field Robotics, Vol. 23, No. 1, pp. 21–34, January 2006.
Date: 11 December 2012, Tuesday
The idea is to use Mean-shift algorithm to SURF features to carry out object tracking a video recorded from a non-static camera. Mean-shift algorithm necessitates having the object histogram which is searched locally in subsequent frames. Not only it is difficult to obtain a suitable histogram for the source object, there are several other limitations of this approach. We have done some work to overcome limitations of this approach and make it workable. An initial result of our work is available at this youtube link.
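For reference, one mean-shift step of plain histogram-based tracking looks roughly like the sketch below; it is a simplified grayscale illustration with hypothetical names, and it does not include the SURF-based extensions of our work.

// meanshift-sketch.cpp : one mean-shift step for histogram-based tracking in a
// grayscale image (flat row-major array). Hypothetical, illustrative sketch.
#include <cmath>
#include <vector>

// Shift a square search window toward the region whose histogram matches the
// target model: each pixel is weighted by how over-represented its intensity
// bin is in the target histogram relative to the current window histogram.
void meanShiftStep(const std::vector<unsigned char>& img, int W, int H,
                   const std::vector<double>& target_hist, // 256 bins, sums to 1
                   int half, int& cx, int& cy)             // window center (in/out)
{
    std::vector<double> cur_hist(256, 0.0);
    int count = 0;
    for(int y = cy - half; y <= cy + half; ++y)
        for(int x = cx - half; x <= cx + half; ++x)
            if(x >= 0 && x < W && y >= 0 && y < H)
            {
                cur_hist[img[y * W + x]] += 1.0;
                ++count;
            }
    for(int b = 0; b < 256; ++b) cur_hist[b] /= count;

    double wsum = 0.0, mx = 0.0, my = 0.0;
    for(int y = cy - half; y <= cy + half; ++y)
        for(int x = cx - half; x <= cx + half; ++x)
            if(x >= 0 && x < W && y >= 0 && y < H)
            {
                int b = img[y * W + x];
                double w = (cur_hist[b] > 0.0)
                         ? std::sqrt(target_hist[b] / cur_hist[b]) : 0.0;
                wsum += w;  mx += w * x;  my += w * y;
            }
    if(wsum > 0.0) { cx = (int)(mx / wsum); cy = (int)(my / wsum); }
}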
Publications:
Date: 09 January 2013, Wednesday
The code for generating a multivariate Gaussian random vector is available here. This was originally written by Emmanuel Benazera. I have included a simple function interface for C++ users. This will be useful for generating samples for the particle filter.
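The standard recipe is to draw $z \sim \mathcal{N}(0, I)$ and set $x = \mu + L z$, where $L$ is the Cholesky factor of the covariance ($\Sigma = L L^T$). Below is a minimal, self-contained sketch of that recipe; the names and example numbers are mine, not Benazera’s interface.

// mvn-sample.cpp : multivariate Gaussian sampling via Cholesky factorization.
// Minimal illustrative sketch (x = mu + L*z with Sigma = L*L^T, z ~ N(0, I)).
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

typedef std::vector<std::vector<double> > Mat;

// Lower-triangular Cholesky factor of a symmetric positive-definite matrix.
Mat cholesky(const Mat& S)
{
    int n = (int)S.size();
    Mat L(n, std::vector<double>(n, 0.0));
    for(int i = 0; i < n; ++i)
        for(int j = 0; j <= i; ++j)
        {
            double s = S[i][j];
            for(int k = 0; k < j; ++k) s -= L[i][k] * L[j][k];
            L[i][j] = (i == j) ? std::sqrt(s) : s / L[j][j];
        }
    return L;
}

int main()
{
    std::mt19937 gen(42);
    std::normal_distribution<double> stdnorm(0.0, 1.0);

    std::vector<double> mu = {1.0, 2.0};      // example mean vector
    Mat Sigma = {{0.5, 0.2}, {0.2, 0.3}};     // example covariance matrix
    Mat L = cholesky(Sigma);

    // draw one sample: x = mu + L * z
    int n = (int)mu.size();
    std::vector<double> z(n), x(n);
    for(int i = 0; i < n; ++i) z[i] = stdnorm(gen);
    for(int i = 0; i < n; ++i)
    {
        x[i] = mu[i];
        for(int j = 0; j <= i; ++j) x[i] += L[i][j] * z[j];
    }
    printf("sample: (%f, %f)\n", x[0], x[1]);
    return 0;
}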
Date: 08 May 2013, Wednesday
Please refer to Student Dave’s tutorial to understand how a particle filter works. The corresponding C++ and GNU/Octave files can be downloaded from here.
In a particle filter: (i) particles are drawn from an initial distribution over the state (here a Gaussian); (ii) each particle is propagated through the motion model; (iii) each particle is weighted by the likelihood of the current measurement; (iv) the particles are resampled in proportion to their weights, so particles with higher weights are sampled more frequently. A minimal sketch of one such cycle is given after the figures below.
[Figures: Particles Generated with Gaussian Distribution | Particles with Higher Weights Are Sampled More Frequently during Resampling | Actual versus Estimated State Values]
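Here is a minimal, self-contained sketch of one predict-weight-resample cycle for a hypothetical 1-D state; the motion model, measurement, and all numbers are illustrative, not taken from the downloadable class.

// pf-sketch.cpp : one predict-weight-resample cycle of a particle filter for a
// hypothetical 1-D state. Illustrative sketch only.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main()
{
    const int N = 500;                                        // number of particles
    std::mt19937 gen(7);
    std::normal_distribution<double> prior(0.0, 1.0);         // initial belief
    std::normal_distribution<double> motion_noise(0.0, 0.1);  // process noise
    std::uniform_real_distribution<double> u01(0.0, 1.0);

    std::vector<double> x(N), w(N);
    for(int i = 0; i < N; ++i) x[i] = prior(gen);  // Gaussian-distributed particles

    // predict: propagate each particle through a (hypothetical) motion model
    for(int i = 0; i < N; ++i) x[i] += 0.5 + motion_noise(gen);

    // weight: likelihood of one (hypothetical) measurement z, then normalize
    const double z = 1.2, sigma_z = 0.2;
    double wsum = 0.0;
    for(int i = 0; i < N; ++i)
    {
        double d = z - x[i];
        w[i] = std::exp(-0.5 * d * d / (sigma_z * sigma_z));
        wsum += w[i];
    }
    for(int i = 0; i < N; ++i) w[i] /= wsum;

    // resample: low-variance (systematic) resampling, so higher-weight
    // particles are duplicated more often
    std::vector<double> xs(N);
    double step = 1.0 / N, r = u01(gen) * step, c = w[0];
    int j = 0;
    for(int m = 0; m < N; ++m)
    {
        double U = r + m * step;
        while(U > c && j < N - 1) c += w[++j];
        xs[m] = x[j];
    }
    x = xs;

    // estimate: mean of the resampled particles
    double mean = 0.0;
    for(int i = 0; i < N; ++i) mean += x[i] / N;
    printf("state estimate: %f\n", mean);
    return 0;
}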
Source Code: A C++ particle filter class implementation can be downloaded from here. The working of the code is shown in the plots given below:
[Figures: Tracking of a Human in an Image Sequence | Weights and Particles at Different Instants of Time]
Reference:
Source Code: Scilab Program
The underlying idea is to express a given density as a mixture of multiple Gaussian functions. Using the expectation-maximization (EM) algorithm, we then try to estimate the parameters of these individual Gaussian functions along with their contribution to the final distribution:

$$p(x) = \sum_{k=1}^{K} \pi_k \, p_k(x \mid \theta_k), \qquad \text{with} \quad \sum_{k=1}^{K} \pi_k = 1 \quad \text{and} \quad \pi_k \ge 0,$$

where $p_k(x \mid \theta_k)$ is the pdf of each model, $\pi_k$ are the mixing probabilities of each model, and $\theta_k$ are the parameters of each model.
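To make the EM iteration concrete, here is a minimal sketch for a 1-D, two-component mixture; it is written in C++ for illustration (the actual program above is in Scilab), and all names and numbers are mine.

// gmm-em.cpp : minimal EM for a 1-D, two-component Gaussian mixture.
// Hypothetical, self-contained sketch of the idea described above.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

double gauss_pdf(double x, double mu, double var)
{
    return std::exp(-0.5 * (x - mu) * (x - mu) / var) / std::sqrt(2.0 * M_PI * var);
}

int main()
{
    // simulated data drawn from two Gaussians (the "given density")
    std::mt19937 gen(3);
    std::normal_distribution<double> g1(-2.0, 0.8), g2(3.0, 1.2);
    std::vector<double> x;
    for(int i = 0; i < 300; ++i) x.push_back(g1(gen));
    for(int i = 0; i < 200; ++i) x.push_back(g2(gen));
    const int n = (int)x.size(), K = 2;

    // initial guesses for mixing probabilities, means and variances
    double pi[2] = {0.5, 0.5}, mu[2] = {-1.0, 1.0}, var[2] = {1.0, 1.0};
    std::vector<std::vector<double> > r(n, std::vector<double>(K));

    for(int it = 0; it < 100; ++it)
    {
        // E-step: responsibilities r[i][k] = pi_k p_k(x_i) / sum_j pi_j p_j(x_i)
        for(int i = 0; i < n; ++i)
        {
            double s = 0.0;
            for(int k = 0; k < K; ++k)
                s += (r[i][k] = pi[k] * gauss_pdf(x[i], mu[k], var[k]));
            for(int k = 0; k < K; ++k) r[i][k] /= s;
        }
        // M-step: update each component from its weighted sample statistics
        for(int k = 0; k < K; ++k)
        {
            double nk = 0, m = 0, v = 0;
            for(int i = 0; i < n; ++i) nk += r[i][k];
            for(int i = 0; i < n; ++i) m += r[i][k] * x[i];
            m /= nk;
            for(int i = 0; i < n; ++i) v += r[i][k] * (x[i] - m) * (x[i] - m);
            pi[k] = nk / n;  mu[k] = m;  var[k] = v / nk;
        }
    }
    for(int k = 0; k < K; ++k)
        printf("component %d: pi=%.3f mu=%.3f sigma=%.3f\n",
               k, pi[k], mu[k], std::sqrt(var[k]));
    return 0;
}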
[Figures: Simulated Data | Parameter Updates | Output]