Spatial Ecology and the Graduate School for Production Ecology & Resource Conservation are organizing:

International Summer School:

Geocomputation using free and open source software

Jump start with Bash, GRASS, Python, the GDAL/OGR libraries and the Linux operating system

Wageningen, The Netherlands,  8th-12th July 2019

The open source spatio-temporal data analysis and processing summer school is an immersive 5-day experience that opens new horizons on the vast potential of the Linux environment and the command-line approach to processing data. We will guide newcomers who have never used a command-line terminal to a stage where they can understand and use very advanced open source data processing routines. Our focus is on fostering self-learning, allowing participants to keep progressing and updating their skills in a continuously evolving technological environment.



Course requirements:

The summer school is aimed at students at the master's or doctoral level, as well as researchers and professionals with a common interest in spatio-temporal data analysis and modelling. We also accept undergraduate students. Participants should have basic computer skills and a strong desire to learn command-line tools to process data. We expect students to have a special interest in geographical data analysis; previous experience with Geographic Information Systems is helpful. Students need to bring their own laptops with a minimum of 4 GB RAM and 30 GB of free disk space.


Registration is on a first come, first served basis and will close once 25 participants have signed up, so we encourage participants to register as soon as possible. A waiting list will be established if the limit is exceeded.

Academic programme:

The summer school provides students with the opportunity to develop key skills required for advanced spatial data processing. Throughout the training, students will focus on developing independent learning skills, which are fundamental for continuously mastering advanced data processing: a progressive journey driven by the availability of ever more complex data and the ongoing technological revolution. Within the course, many different, complementary and sometimes overlapping tools will be presented to give an overview of the existing open source software available for spatial data processing. We will discuss their strengths, weaknesses and suitability for different data processing objectives (e.g. modelling, data filtering, querying, GIS analyses, graphics or reporting) and data types. In particular, we will guide students in practising with different types of software and tools, with the objective of flattening the steep learning curve that is generally experienced when starting to analyse data in a command-line programming environment. Broadly, we focus our training on helping students develop the independent learning skills needed to find online help, solutions and strategies to fix bugs and to progress independently with complex data processing problems.

The Academic Programme is divided into the following areas of study and interactions:

Lectures: (15 min to 1 h each) Students will take part in a series of lectures introducing the basic functions of the tools, theoretical aspects and background information needed for a better understanding of the deeper concepts subsequently applied in data processing.

Hands-on tutorials: Students will be guided during hands-on sessions in which trainers perform data analyses on real case-study datasets while the students follow the same procedures on their own laptops. During tutorial sessions students are supported by two trainers: one for the demonstrations and one to supervise the students' work and provide individual guidance on coding.

Hands-on exercises: In addition to tutorials and lectures, students are encouraged to take up their own independent study during the exercise sessions. Specific tasks will be set to reinforce the newly learned data processing skills presented in the lectures and practised during the tutorial sessions. These exercise sessions equip students with the confidence and skills to become independent learners and to engage effectively with the demands of advanced spatial data processing.

Depending on the number of participants and their previous programming knowledge, more or fewer topics can be addressed according to the students' needs. The exercises and examples are cross-disciplinary: forestry, landscape planning, predictive modelling and species distribution, mapping, nature conservation, computational social science and other spatially related fields of study. These case studies are template procedures, however, and can be applied to any thematic application or discipline.

Round-table discussions: these sessions are mainly focused on exchanging experiences, needs and points of view. We aim to clarify specific students' needs and challenges, focusing on how to help and how to find solutions while problem solving.

Learning objectives:

Our summer school will enable students to further develop and enhance their spatio-temporal data processing skills. Most importantly, it will allow them to start using a fully functional open source operating system and software professionally. With continuous practice during the week, students will become increasingly familiar with the command line and will focus on developing specific areas, including:

  • Developing a broad knowledge of existing tools and the ability to judge which is most appropriate for their needs.
  • Building confidence with several command-line utilities for spatial data processing and with the Linux operating system.
  • Developing data processing skills and increasing knowledge of data types, data modelling and data processing techniques.
  • Encouraging independent learning, critical thinking and effective data processing.

Summer school certification: At the end of the summer school, attendees will receive a certificate upon successful completion of the course; it is up to each participant's university to recognize this as official course credit.

Time table: (7h teaching/day)

8:45 - 9:00     Morning coffee

9:00 - 10:30    Morning session 1 (1h30min)

10:30 - 10:45   Coffee break

10:45 - 12:00   Morning session 2 (1h15min)

12:00 - 13:00   Lunch break

13:00 - 14:45   Afternoon session 1 (1h45min)

14:45 - 15:00   Coffee break

15:00 - 17:00   Afternoon session 2 (2h)

Preliminary Program

Day 1:

Knowing each other / OSGeo-live operating system (GA-LS)

Linux bash programming (GA)

AWK basic; Gnuplot plotting (GA)

Day 2:


Day 3:

Python basics (GA) and advanced Python (LS) (parallel sessions)

Working with your data

Day 4:

Grass in Bash (GA)

Grass in Python (LS)

Miscellaneous (Orfeo Toolbox, … )

Working with your data

Presentation group 1

Day 5:

Multicore processing in Bash - advanced (GA)

Multicore processing in Python - advanced (LS)

Geocomputation in HPC - very advanced (GA)

Working with your data (basic)

Presentation group 2



Session 1. Getting started: knowing each other and the Linux OS (Amatulli, Shen)

This session introduces the overall course programme and the Linux operating system. We also learn how to install and use a Linux virtual machine.

  1. Get to know each other: trainers and participants - Identifying participant expectations and needs (Round-table).
  2. Course objectives and schedule.
  3. The Linux environment: why and what to use for handling big data (Lecture) (A.)
  4. Installation and introduction to the Linux Virtual Machine (Hands-on tutorial) (A.)
  5. The platform (Lecture) (A.)
  6. Lubuntu GUI and Unix/Linux command line (Hands-on tutorial) (C.)
  7. The use of Kate as an editor (Hands-on tutorial) (C.)


Session 2. Jump start LINUX Bash programming (Amatulli)

During this session we explore and practise the basics of the Bash command line. The acquired skills will be used in all following sessions.

  1. Unix/Linux command line - Lecture.
  2. Starting with Bash
  3. Special characters and Quoting
  4. The most important commands - Unix/Linux Command Reference
  5. Meta-characters and regular expression, their use - (Hands on Tutorial).        
  6. Concatenate process (pipe) - (Hands on Tutorial).
  7. The use of variables - (Hands on Tutorial).
  8. String manipulation - (Hands on Tutorial).
  9. Iteration (for loop, while) - (Hands on Tutorial).        
  10. Command-line file management - (Exercise).
  11. Download, unzip and manipulate multiple files using the command line - (Exercise).
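As a taste of what these tutorial items cover, here is a minimal sketch combining variables, string manipulation, a pipe and a for loop (the file name and words are invented for illustration):

```shell
#!/bin/bash
# Variables and string manipulation (the file name is just an example)
FILE="climate_2019.txt"
echo "${FILE%.txt}"          # strip the suffix    -> climate_2019
echo "${FILE/climate/temp}"  # replace a substring -> temp_2019.txt

# Concatenated processes (pipe): count word occurrences
echo "rain sun rain wind sun rain" | tr ' ' '\n' | sort | uniq -c | sort -rn

# Iteration: a for loop over a generated sequence
for DAY in $(seq 1 3); do
    echo "processing day ${DAY}"
done
```

Each piece of this pipeline is practised step by step during the session.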

Session 3. Discovering the power of AWK programming language (Amatulli).

This session is fundamental for data filtering and preparation, bulk data download, text-file manipulation, descriptive statistics and basic mathematical operations on large files. Students will access, query, understand and clean up data, and perform data filtering using the Bash command line. We use AWK, an extremely versatile and powerful programming language for working on text files, performing data extraction and reporting, or slimming down data before importing it into R or other software.

  1. Welcome to the AWK world: why use the AWK command line - (Lecture).
  2. The basic commands, command syntax - (Lecture).
  3. Built-in variables - (Hands on tutorial).
  4. Import variables - (Hands on tutorial).
  5. String functions - (Hands on tutorial).
  6. Numerical functions - (Hands on tutorial).
  7. Query functions - (Hands on tutorial).
  8. Manipulate large files using AWK - (Hands on tutorial).
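For instance, the descriptive statistics and query functions above can be practised with one-liners of this kind (the sample table is invented for illustration):

```shell
# Build a tiny sample table: station id and measured value
printf "st1 10\nst2 20\nst3 30\n" > /tmp/sample.txt

# Built-in variables: NR is the record count, $2 the second column
awk '{ sum += $2 } END { print "mean:", sum/NR }' /tmp/sample.txt

# Query/filter: print the stations whose value exceeds 15
awk '$2 > 15 { print $1 }' /tmp/sample.txt
```

The same syntax scales from a three-line table to files with millions of records.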

Session 4. Getting started with GNUPLOT (Amatulli).

This session introduces the command-line-driven graphing utility Gnuplot. Even though it offers very sophisticated graphical options for final layout definition, Gnuplot is above all a very powerful tool for rapid and effective preliminary data visualization. Because it can be embedded in Bash/AWK pipelines, data filtering, random data extraction and many other operations can be combined with quick visualisation of data sets for preliminary analyses.

  1. Accessing gnuplot and the Gnuplot syntax - (Lecture).
  2. 2D & 3D Data plots - (Hands on tutorial).
  3. Combine awk and gnuplot - (Hands on tutorial).
  4. Plot 3D Lidar data with Gnuplot (exercise).
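A minimal sketch of the AWK and Gnuplot combination in point 3 (synthetic data; the plot step is guarded because Gnuplot may not yet be installed on every laptop):

```shell
# Synthetic x-y data written to a temporary file
printf "1 2\n2 4\n3 9\n4 16\n" > /tmp/xy.txt

# Filter with AWK before plotting: keep only rows where y > 3
awk '$2 > 3' /tmp/xy.txt > /tmp/xy_filtered.txt

# Quick ASCII plot straight in the terminal
if command -v gnuplot >/dev/null; then
    gnuplot -e "set terminal dumb; plot '/tmp/xy_filtered.txt' using 1:2 with points"
fi
```

Replacing `set terminal dumb` with a graphical terminal gives the interactive plot window used in the tutorial.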



Session 5. Exploring and understanding geographical data: introduction to GDAL/OGR, Pktools (Amatulli).

This session introduces data manipulation for geospatial data processing on the command line using the GDAL and OGR libraries.

  1. GDAL/OGR  & PKTOOLS for raster and vector analysis  - (Lecture).
  2. Geographic Projections database 
  3. Raster and vector data formats and data type. 
  4. GeoTIFF format
  5. Openev & QGIS for raster and vector visualization
  6. Command syntax
  7. Raster/vector data manipulation for multiple image processing using GDAL & PKTOOLS
  8. The use of .VRT for splitting and merging images
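A common command-line pattern with these tools is a "dry run": print the batch commands first, and execute them only once they look right. A sketch with hypothetical tile names (GDAL must be installed when the commands are actually executed):

```shell
# Hypothetical input tiles (empty placeholder files for the dry run)
mkdir -p /tmp/tiles && touch /tmp/tiles/a.tif /tmp/tiles/b.tif

# Dry run: print one gdal_translate command per raster tile
for TIF in /tmp/tiles/*.tif; do
    echo gdal_translate -of GTiff -co COMPRESS=DEFLATE "$TIF" "${TIF%.tif}_c.tif"
done

# Once the commands look correct, append "| bash" to the loop to run them,
# then merge the outputs into a single virtual raster (again printed first):
echo gdalbuildvrt /tmp/tiles/mosaic.vrt /tmp/tiles/*_c.tif
```

The same pattern later becomes the basis for multicore processing, by piping the printed commands to xargs instead of bash.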


Starting from day 3, we present two parallel sessions that require different programming levels. You can select one based on your programming experience or thematic focus.


Parallel Session 6a. Jump start with Python language (Amatulli) 

  1. Getting started with Python, install and run on various platforms, managing versions, programming environments and how to think the Python way (lecture and hands on tutorial)
  2. Basic Data Types : integers, strings, list, dictionaries (lecture).
  3. Flow Control Statements : if, while, for (lecture).
  4. Functions (lecture).
  5. Modules and packages (lecture).
  6. Basic Numpy (numerical python) (lecture).
  7. Basic Math operations (lecture).
  8. Basic tools : compression, web interfacing (lecture)
  9. Basic geoapplication : raster, shapefile (lecture)
  10. Hands on tutorial and exercises on real code.
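In the spirit of the course, Python can be driven straight from the Bash terminal. A minimal sketch of the basic data types, flow control and functions listed above (all values are invented):

```shell
# Run a short Python snippet from the Bash command line via a heredoc
python3 - <<'PY'
# Basic data types: a list and a dictionary
values = [1.5, 2.5, 3.5]
meta = {"sensor": "example", "n": len(values)}

# A function, a for loop and an if statement
def mean(xs):
    return sum(xs) / len(xs)

for v in values:
    if v > 2:
        print("large value:", v)

print(meta["sensor"], "mean:", mean(values))   # example mean: 2.5
PY
```

Saving the snippet to a .py file and running it with python3 gives the same result and is the usual workflow for longer scripts.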

Parallel Session 6b. Advanced Python (Shen)

  1. Overview of Python packages/modules geared towards geo-applications (lecture).
  2. Scientific computation using Python (lecture).
  3. Vector operations: geometry manipulation and data filtering (code analysis).
  4. Raster operations: projections, histograms (code analysis).
  5. Geostatistics and advanced examples (code analysis).
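As a dependency-free illustration of the kind of vector data filtering analysed in point 3 (real work would use OGR or dedicated geo-packages; the coordinates are invented), run from the terminal:

```shell
python3 - <<'PY'
import math

# Keep only the points within 5 units of the origin
points = [(1, 2), (10, 10), (3, 4), (8, 0)]
near = [p for p in points if math.hypot(*p) <= 5]
print(near)   # [(1, 2), (3, 4)]
PY
```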



Session 7. Command line GIS - Getting started with GRASS and Qgis (Amatulli).

This session will introduce the use of the GRASS geographic information system through its command-line interface for spatial data processing, and the use of QGIS as a map visualization and overlay tool. We do not expect any previous knowledge of GRASS or QGIS, but we will use the basic Bash and GDAL command-line skills acquired in the previous days. This session is interesting for people who deal with general GIS analysis, hydrological modelling, DEMs, vector/raster data integration etc., and it is fundamental for the following day's topics.

  1. Introduction to grass data structure and environment - (Lecture).
  2. GRASS and Qgis as learning tools - (Lecture).
  3. Accessing GRASS and links to Qgis - (Hands on tutorial).
  4. Command syntax and general commands of data handling - (Hands on tutorial).
  5. Grass working environment and bash working directory - (Hands on tutorial).
  6. Location and mapset - (Hands on tutorial).
  7. Region settings (Hands on tutorial).
  8. Raster and vector data import, export, display and conversion - (Hands on tutorial).
  9. Raster map calculator - (Hands on tutorial).
  10. Vector manipulation and processing - (Hands on tutorial).
  11. Production of maps and tables layout for reporting - (Hands on tutorial).

Session 8.  Advanced data processing using GRASS (Amatulli).

This parallel session assumes a good understanding and fair command of the Bash command line and full knowledge of GRASS locations and mapsets. It is mainly indicated for people who are already moving towards large data processing and are willing to explore batch-job processing and multicore data manipulation for geocomputation. (This session can be dropped if nobody has used GRASS.)

  1. GRASS 7: create a location using an ancillary layer - (Hands on tutorial).
  2. Create a Location, enter in GRASS and import data - (Exercise).

Session 9.  Using GRASS in python (Shen).

This session shows how to use GRASS from Python. It is mainly indicated for people who already work in Python and want to use GRASS functionality within it.


  1. GRASS in Python
  2. Import GRASS modules inside Python
  3. Create a GRASS location and mapset using grass-session
  4. Run r.mapcalc for a raster operation
  5. Reference ( grass-session1 , grass-session2 )

Session 10. Miscellaneous (Amatulli)

  1. Unsupervised classification (segmentation with Orfeo Toolbox)
  2. Calculation of contiguous stream-specific variables (hydrology with GRASS)
  3. Machine Learning for fresh water quality (hydrology in Python)


Session 11.  Understanding computer performance (Amatulli).

This session highlights how the RAM, hard disk and CPUs interact in different environments such as Bash, R and Python.

  1. Hard disk performance
  2. RAM performance (fill up the RAM)
  3. Mount a folder in the RAM
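A quick way to feel the difference on a standard Linux laptop: dd reports write throughput, and /dev/shm is a RAM-backed tmpfs already mounted on most distributions (sizes and paths here are just examples):

```shell
# Write 8 MB to the hard disk, forcing a real write with fsync
dd if=/dev/zero of=/tmp/disk_test bs=1M count=8 conv=fsync

# Write the same amount to RAM-backed storage, if present
if [ -d /dev/shm ]; then
    dd if=/dev/zero of=/dev/shm/ram_test bs=1M count=8
fi

# Mounting your own folder in RAM requires root, e.g.:
#   sudo mount -t tmpfs -o size=1G tmpfs /mnt/ramdisk

rm -f /tmp/disk_test /dev/shm/ram_test
```

Comparing the two reported throughputs shows why intermediate files of an analysis are sometimes kept in RAM.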

Session 12.  Multicore processing in Bash (Shen).

This session shows how to use xargs from Bash to run multicore processing.

  1. Transform a simple serial "for loop" into a multicore "for loop"
  2. Use of xargs for running R sessions simultaneously
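The transformation in point 1 can be sketched as follows: the loop body becomes a list of arguments that xargs distributes over up to four parallel processes (note that with -P the output order is no longer guaranteed):

```shell
# Serial version: one job after another
for I in 1 2 3 4; do
    echo "job $I done"
done

# Parallel version: the same jobs on up to 4 cores with xargs -P
printf "%s\n" 1 2 3 4 | xargs -P 4 -I {} echo "job {} done"
```

Replacing echo with any script or command (e.g. an R or GDAL call) parallelizes that workload the same way.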

Session 13.  Multicore processing in Python (Shen).

This session illustrates the use of the Pool object from Python's multiprocessing package to perform zonal statistical analysis.

  1. Multicore processing in python - Zonal Statistic
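A minimal sketch of the Pool pattern (the per-zone statistic is replaced by a toy squaring function; the script is saved to a file so that it also works with Python's "spawn" start method):

```shell
cat > /tmp/pool_demo.py <<'PY'
from multiprocessing import Pool

def stat(x):
    # stand-in for a real per-zone statistic
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # map distributes the inputs over 4 worker processes
        print(pool.map(stat, range(5)))   # [0, 1, 4, 9, 16]
PY
python3 /tmp/pool_demo.py
```

In the session, the toy function is replaced by a zonal-statistics computation over raster chunks.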

Session 14.  Geocomputation in HPC (Amatulli).

This session shows how to use an HPC cluster to run geocomputation analyses.

  1. Nodes and CPUs
  2. The queue system
  3. The partitions
  4. Geocomputation in HPC using Slurm
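A skeleton Slurm batch script of the kind used in point 4; the partition name, resource requests and processing command are placeholders to be adapted to the cluster and the analysis at hand:

```
#!/bin/bash
#SBATCH --job-name=geocomp        # job name shown in the queue
#SBATCH --partition=day           # hypothetical partition name
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8         # CPUs requested on the node
#SBATCH --time=01:00:00           # wall-time limit
#SBATCH --mem=16G

# placeholder processing step
gdal_translate input.tif output.tif
```

The script is submitted with sbatch, and squeue shows its position in the queue system.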

Session 15. Geospatial data processing: find the right tool for the job (all students, Amatulli, Shen)

This session summarizes and reviews the tools presented so far, explores further available open source libraries, and aims to clarify any remaining issues or questions.

  1. Exploring more spatial data libraries: pktools, Open Foris, Orfeo Toolbox - differences and complementarities with GDAL/OGR and GRASS. (Lecture 10 min)
  2. What’s best for what in choosing and using the right tool.  (Round table discussion 30 min).

Session 16. Working on the students' own data.

CONCLUSION – Focus on the students' projects needs and how to get going with the use of free and open source tools for advanced spatial data processing.

Round table, question-and-answer session on specific data processing needs, and how to follow up the summer school using open source software as a daily working toolbox.