1 of 71

Unit 4: Navigation and Guidance Components: Classification of payloads based on applications; Hyperspectral sensors; Laser Detection and Ranging (LADAR); Synthetic Aperture Radar (SAR); Thermal cameras; Ultrasonic detectors; Case study on payloads. Introduction to navigation systems and types of guidance; Mission Planning and Control.

Dr M Vamshi Krishna

ECE HoD

2 of 71

3 of 71

  • Payloads in UAVs – Classification Based on Applications
  • A payload is any equipment carried by a UAV to perform a specific mission other than basic flight.

4 of 71

Classification of Payloads

| Application Area        | Payload Type                          | Examples                    |
|-------------------------|---------------------------------------|-----------------------------|
| Surveillance & Security | Imaging Sensors                       | EO cameras, IR cameras, SAR |
| Agriculture             | Multispectral / Hyperspectral Sensors | Crop health monitoring      |
| Mapping & Surveying     | LiDAR, SAR                            | Terrain mapping             |
| Disaster Management     | Thermal Cameras                       | Search & rescue             |
| Delivery & Logistics    | Grippers / Boxes                      | Medical & parcel delivery   |
| Scientific Research     | Atmospheric Sensors                   | Gas, temperature, pressure  |

5 of 71

Hyper-Spectral Sensors

A hyperspectral sensor captures image data in hundreds of narrow, contiguous spectral bands across the electromagnetic spectrum.

▶ Working Principle

  • Each pixel contains a full spectrum.
  • Enables material identification.

▶ Applications

✔ Crop disease detection

✔ Mineral mapping

✔ Environmental monitoring

▶ Advantages

• High spectral resolution
• Accurate material classification

▶ Limitations

• High data volume
• Expensive & computationally heavy

6 of 71

7 of 71

Definition:

A Hyperspectral Sensor (HSS) is an imaging system that acquires data in hundreds of narrow, contiguous spectral bands (typically 5–10 nm bandwidth) across the electromagnetic spectrum (VIS–NIR–SWIR).

  • Key Feature:

Unlike multispectral sensors, hyperspectral sensors provide a full spectral signature per pixel.

8 of 71

Spectral Concept & Data Structure

  • Spectrum Range:
    ✔ Visible (0.4–0.7 µm)
    ✔ Near Infrared (0.7–1.0 µm)
    ✔ Short-Wave IR (1.0–2.5 µm)
  • Data Format:
    ➡ Output is a 3D data cube (x, y, λ)
    • x, y → spatial dimensions
    • λ → spectral dimension
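The (x, y, λ) data-cube structure is easy to sketch with NumPy; the dimensions below (100 × 100 pixels, 200 bands) are illustrative assumptions, not the specification of any particular sensor:

```python
import numpy as np

# Illustrative hyperspectral cube: 100 x 100 spatial pixels, 200 spectral bands.
x_dim, y_dim, n_bands = 100, 100, 200
cube = np.random.rand(x_dim, y_dim, n_bands)

# Each spatial pixel (x, y) holds a full spectrum of length n_bands.
pixel_spectrum = cube[42, 17, :]
print(pixel_spectrum.shape)  # (200,)

# A single-band image is one spectral slice of the cube.
band_image = cube[:, :, 50]
print(band_image.shape)  # (100, 100)
```

Slicing along λ gives a band image; slicing along (x, y) gives the per-pixel spectral vector that classification algorithms operate on.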

9 of 71

Working Principle

  • Incoming radiation is dispersed using a prism or diffraction grating
  • Energy is split into narrow wavelength bands
  • Sensor array captures intensity per band
  • Each pixel stores a spectral vector
    ✔ Pixel = [R₁, R₂, R₃, … Rₙ] (where Rₙ = reflectance at wavelength λₙ)

10 of 71

Spectral Signature & Material Identification

  • Every material reflects energy differently → unique spectral signature
  • Hyperspectral imaging enables:
    ✔ Endmember detection
    ✔ Spectral unmixing
    ✔ Target classification
  • Used with algorithms like:
    • SAM (Spectral Angle Mapper)
    • PCA (Principal Component Analysis)
    • SVM / CNN for classification
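Of the algorithms listed, SAM is simple enough to sketch directly: it measures the angle between two spectral vectors, so it responds to spectral shape rather than overall brightness. A minimal pure-Python version (real toolchains add band masking and calibration steps):

```python
import math

def spectral_angle(a, b):
    """Spectral Angle Mapper: angle (radians) between two spectra.
    A smaller angle means more similar spectral shape."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    # Clamp for numerical safety before arccos.
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return math.acos(cos_theta)

# A spectrum compared against a scaled copy of itself gives angle ~0:
# SAM is insensitive to overall brightness, only to spectral shape.
ref = [0.2, 0.4, 0.6, 0.8]
scaled = [0.1, 0.2, 0.3, 0.4]
print(round(spectral_angle(ref, scaled), 6))  # 0.0
```

This brightness-invariance is why SAM works well across illumination changes, a common problem in UAV imagery.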

11 of 71

UAV-Mounted Hyperspectral Payload

  • System Components:
    • Imaging spectrometer
    • CMOS/CCD sensor
    • GNSS + IMU for geo-referencing
    • Onboard storage + telemetry
  • Typical UAV Integration Issues:
    • Weight & power constraints
    • Data rate > 100 MB/s
    • Real-time processing difficulty

12 of 71

Applications

  • ✔ Precision Agriculture → Crop stress, nutrient deficiency, disease detection
  • ✔ Geology & Mining → Mineral & ore identification
  • ✔ Environmental Monitoring → Pollution, deforestation, water quality
  • ✔ Defense & Surveillance → Camouflage detection, target discrimination

13 of 71

Advantages

  • Very high spectral resolution
  • Enables sub-pixel classification
  • Accurate material & chemical identification
  • Works with AI/ML-based analytics

14 of 71

Limitations

  • Extremely large data volume
  • Requires high processing power (GPU/FPGA)
  • Sensitive to noise & atmospheric effects
  • High sensor cost and calibration complexity

15 of 71

Laser Detection and Ranging (LADAR / LiDAR)

LiDAR uses laser pulses to measure distance by calculating time-of-flight.

▶ Working

  • Laser pulse emitted
  • Hits target
  • Reflected signal received
  • Distance = (c × time)/2

▶ Applications

✔ Terrain mapping

✔ Obstacle detection

✔ Autonomous navigation

▶ Advantages

• High accuracy

• Works in low light

▶ Limitations

• Affected by fog, rain

• High cost

16 of 71

Definition:

LiDAR (Light Detection and Ranging), also called

LADAR, is an active remote sensing system that

measures distance by illuminating a target with

laser pulses and analyzing the reflected signals.

  • Key Principle:
    ✔ Distance measurement using Time-of-Flight (ToF)
    ✔ Uses near-infrared laser (typically 905 nm or 1550 nm)

17 of 71

Basic LiDAR Equation

  • Distance Calculation:
    d = (c · t) / 2
  • Where:
    • d = distance to target
    • c = speed of light (3 × 10⁸ m/s)
    • t = round-trip time of laser pulse
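The formula above in code form, a tiny sketch using the same approximate value of c:

```python
C = 3.0e8  # speed of light, m/s (approximation used in the formula above)

def lidar_distance(round_trip_time_s):
    """d = (c * t) / 2: divide by 2 because the pulse travels out and back."""
    return C * round_trip_time_s / 2.0

# A 1 microsecond round trip corresponds to a 150 m target.
print(lidar_distance(1e-6))  # 150.0
```

Note the timing precision this demands: resolving 1 cm in range requires timing the pulse to about 67 picoseconds, which is why LiDAR receivers use dedicated time-to-digital converters (TDCs).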

18 of 71

Types of LiDAR

  • Pulsed LiDAR – Time-of-flight based
  • Continuous Wave (CW) LiDAR – Phase-shift method
  • FMCW LiDAR – Frequency-modulated continuous wave
  • Scanning LiDAR – Mechanical / solid-state
  • Flash LiDAR – Full scene captured at once

19 of 71

Working Principle

  • Laser emits short pulse
  • Pulse travels to object
  • Reflects back to receiver
  • Detector senses return pulse
  • TDC measures time delay
  • Distance is computed
  • Multiple pulses → 3D point cloud
  • ✔ Output = Point Cloud (x, y, z)

20 of 71

UAV-Based LiDAR Payload

  • Components:
    • Laser transmitter
    • Receiver optics
    • IMU + GNSS
    • Data logger
    • Stabilized gimbal
  • Functions:
    ✔ Terrain mapping
    ✔ Obstacle detection
    ✔ Navigation aid

21 of 71

Figure: Structure and principle of a typical LiDAR sensor.

The laser diode (LD) emits the laser pulse, which is focused by a transmitting lens; the pulse reflects off the target object and is received by the avalanche photodiode (APD) through the receiving lens. The time-to-digital converter (TDC) measures the difference between the time the LD emits the pulse and the time the APD receives it, and converts this difference into the time of flight (ToF). Finally, the signal-processing unit, a microprocessor (MP), receives the ToF from the TDC and computes the distance between the LiDAR sensor and the target object.

22 of 71

https://www.sphengineering.com/news/the-ultimate-guide-to-lidar-drone-mapping-for-professional-pilots

23 of 71

Synthetic Aperture Radar (SAR)

SAR uses radar signals to form high-resolution images, independent of weather or lighting.

▶ Key Features

• Active sensor
• Microwave frequencies
• All-weather imaging

▶ Applications

✔ Military surveillance

✔ Disaster monitoring

✔ Land & sea mapping

▶ Advantages

• Works day/night

• Penetrates clouds, smoke

▶ Limitations

• Complex signal processing

• High power consumption

24 of 71

Introduction to SAR in Drones

  • Definition:
    Synthetic Aperture Radar (SAR) is an active microwave remote sensing system that generates high-resolution images by synthesizing a large antenna aperture using the motion of the UAV.
  • Key Features:
    ✔ Works in all weather, day & night
    ✔ Uses microwave frequencies (X, C, L bands)
  • Why SAR on Drones?
    ✔ UAV mobility + SAR = flexible, high-resolution imaging
    ✔ Overcomes limitations of optical sensors
    ✔ Suitable for:
      • Cloudy / smoky / foggy environments
      • Night operations
      • Long-range reconnaissance

25 of 71

SAR Working Principle

  • UAV transmits microwave pulses
  • Pulses hit ground/objects
  • Reflected echoes received by antenna
  • Doppler shift + phase history recorded
  • Signal processing synthesizes a large virtual aperture
  • High-resolution image is formed
  • ✔ Resolution achieved by motion of UAV, not antenna size
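Two textbook relations, not stated on the slide but standard SAR theory, make the resolution claim concrete: slant-range resolution depends on the transmitted pulse bandwidth, while ideal strip-map azimuth resolution depends only on the physical antenna length. The chirp bandwidth and antenna length below are illustrative assumptions:

```python
C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Slant-range resolution of a pulsed radar: dR = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

def sar_azimuth_resolution(antenna_length_m):
    """Ideal strip-map SAR azimuth resolution: dA = L / 2.
    Independent of range and wavelength: the payoff of the synthetic aperture."""
    return antenna_length_m / 2.0

# Assumed X-band system with a 300 MHz chirp and a 0.5 m antenna.
print(range_resolution(300e6))      # 0.5 (m)
print(sar_azimuth_resolution(0.5))  # 0.25 (m)
```

Counterintuitively, a *smaller* physical antenna gives *finer* azimuth resolution, because it illuminates the target for longer and thus synthesizes a longer aperture.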

26 of 71

27 of 71

SAR Image Formation

• SAR uses coherent processing

• Phase history is stored

• Uses FFT + matched filtering

• Produces high-resolution 2D image

✔ Output = Intensity + Phase image

SAR Frequency Bands Used in UAVs

| Band   | Frequency | Application               |
|--------|-----------|---------------------------|
| X-Band | 8–12 GHz  | High-resolution imaging   |
| C-Band | 4–8 GHz   | Vegetation, terrain       |
| L-Band | 1–2 GHz   | Penetrates foliage & soil |

28 of 71

SAR vs Optical Sensors

| Feature     | SAR         | Optical Camera        |
|-------------|-------------|-----------------------|
| Weather     | All-weather | Affected by clouds    |
| Night       | Works       | Needs light           |
| Resolution  | High        | High (clear day only) |
| Penetration | Yes         | No                    |

29 of 71

5. Thermal Cameras

Thermal cameras detect infrared radiation to form images based on temperature differences.

▶ Applications

✔ Search and rescue

✔ Fire detection

✔ Wildlife monitoring

▶ Advantages

• Detects heat signatures

• Works in darkness

▶ Limitations

• Lower resolution than EO cameras

• Costly

30 of 71

Definition:

  • A thermal camera is a passive infrared (IR) sensor that detects radiation in the Long-Wave Infrared (LWIR) band (8–14 µm) emitted by objects based on their temperature.

✔ No external illumination required

✔ Forms images using heat signatures

Thermal Radiation Basics

• All objects above 0 K emit IR radiation

• Radiation intensity ∝ T⁴ (grows with the fourth power of absolute temperature)

• Governed by Planck’s Law and Stefan–Boltzmann Law

✔ Thermal camera measures emitted, not reflected energy
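The Stefan–Boltzmann law cited above can be evaluated directly; the temperatures chosen below (a ~310 K human body against ~293 K surroundings) are illustrative:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(temp_k, emissivity=1.0):
    """Emitted power per unit area: M = eps * sigma * T^4 (Stefan-Boltzmann law)."""
    return emissivity * SIGMA * temp_k ** 4

# A body at ~310 K emits noticeably more per unit area than
# surroundings at ~293 K: that contrast is what a thermal camera images.
print(radiated_power(310))  # roughly 524 W/m^2
print(radiated_power(293))  # roughly 418 W/m^2
```

A ~17 K temperature difference yields a ~25% difference in emitted power, because of the fourth-power dependence; this is why warm targets stand out so strongly in LWIR imagery.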

31 of 71

32 of 71

  • Types of Thermal Detectors

Microbolometer (Uncooled) – Most UAV cameras

Photon Detectors (Cooled) – High sensitivity, expensive

  • ✔ UAVs mostly use uncooled microbolometer arrays

  • Working Principle
  • Target emits IR radiation
  • IR lens focuses energy on detector array
  • Detector changes resistance with temperature
  • ROIC reads pixel values
  • Signal is digitized
  • Image processor maps temperature to colors
  • ✔ Output = Thermal image (Thermogram)

33 of 71

UAV Integration of Thermal Cameras

  • Components:
    • IR camera module
    • Gimbal (stabilized)
    • GNSS + IMU
    • Data logger
    • Telemetry link
  • Functions:
    ✔ Target detection
    ✔ Night navigation
    ✔ Search & rescue

34 of 71

6. Ultrasonic Detectors

  • ▶ Definition
    Ultrasonic sensors measure distance using high-frequency sound waves (>20 kHz).
  • ▶ Working
    • Emit ultrasonic pulse
    • Receive echo
    • Calculate distance
  • ▶ Applications
    ✔ Obstacle avoidance
    ✔ Altitude hold (low height)
  • ▶ Advantages
    • Simple & low-cost
    • Good for short range
  • ▶ Limitations
    • Affected by wind
    • Limited range

35 of 71

Definition:

  • An ultrasonic detector is an active ranging sensor that measures distance using high-frequency acoustic waves (>20 kHz) by analyzing the time delay of reflected echoes.
  • ✔ Short-range, low-cost sensing
    ✔ Commonly used for altitude hold & obstacle avoidance

36 of 71

Working Principle

  • MCU triggers ultrasonic burst
  • Transducer emits sound pulse
  • Pulse travels to obstacle/ground
  • Echo reflects back
  • Receiver detects echo
  • Timer measures time delay
  • Distance is computed
  • ✔ Output = Range in meters / centimeters
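The steps above are the same time-of-flight arithmetic as LiDAR, but with the speed of sound; a minimal sketch assuming air at roughly 20 °C:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C; varies with temperature

def ultrasonic_distance(echo_delay_s):
    """Distance from echo round-trip delay: d = (v_sound * t) / 2."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# HC-SR04-style timing: a 5.83 ms echo delay is roughly a 1 m obstacle.
print(round(ultrasonic_distance(5.83e-3), 3))  # ~1.0 (m)
```

Because sound is about a million times slower than light, the timing requirements are far looser than for LiDAR, which is why a plain microcontroller timer suffices; but the speed of sound's temperature dependence (~0.6 m/s per °C) introduces range error if uncompensated.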

37 of 71

Types of Ultrasonic Sensors

  • Single-transducer type – TX/RX combined
  • Dual-transducer type – Separate TX & RX
  • Array type – Multiple sensors for wider FOV
  • ✔ UAVs commonly use dual-transducer modules

38 of 71

UAV Integration�

  • Components:
    • Ultrasonic module (e.g., HC-SR04)
    • MCU / Flight Controller (PX4 / ArduPilot)
    • Power conditioning
    • Mounting bracket (vibration isolation)
  • Functions:
    ✔ Low-altitude measurement
    ✔ Obstacle detection
    ✔ Landing assistance

39 of 71

Applications in Drones

  ✔ Obstacle avoidance
  ✔ Terrain following
  ✔ Precision landing
  ✔ Indoor navigation

40 of 71

Advantages

  • Simple hardware
  • Low power & cost
  • Fast response
  • Good for short-range sensing (0.02–4 m)

41 of 71

  • Limitations
    • Affected by wind & air turbulence
    • Limited range & angular resolution
    • Soft surfaces absorb sound
    • Not suitable for high-speed flight

42 of 71

Introduction to Navigation Systems

  • A navigation system determines and controls the position, velocity, and orientation of a vehicle or object as it moves from one point to another.
  • 🎯 Objectives of Navigation
  • Determine where you are (position)
  • Determine how fast and in which direction you’re moving (velocity)
  • Determine how you’re oriented (attitude/heading)
  • Help you reach a target or destination accurately
  • 🚗✈️🚀 Where Navigation Systems Are Used
  • Aircraft & Drones (UAVs)
  • Missiles & Rockets
  • Ships & Submarines
  • Autonomous Cars & Robots
  • Spacecraft & Satellites

43 of 71

Types of Navigation Systems

  • Inertial Navigation System (INS)
    • Uses accelerometers + gyroscopes
    • Works without external signals
    • ✔ Self-contained
    • ❌ Errors accumulate over time (drift)
  • Global Navigation Satellite System (GNSS / GPS)
    • Uses satellite signals
    • Provides accurate position & time
    • ✔ High accuracy
    • ❌ Needs clear sky view
  • Radio Navigation
    • Uses ground-based radio beacons
    • Example: VOR, DME, LORAN
  • Vision-Based Navigation
    • Uses cameras and image processing
    • Used in drones & robots
  • Hybrid Navigation
    • Combines INS + GPS + Vision
    • ✔ High reliability and accuracy
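The INS + GPS combination in hybrid navigation can be illustrated with a toy 1-D complementary filter: the INS gives smooth short-term estimates, while the GNSS fix slowly corrects its drift. The gain value is an assumption for the sketch; real avionics use Kalman filtering over full state vectors:

```python
def complementary_filter(ins_position, gnss_position, alpha=0.98):
    """Blend drift-prone but smooth INS data with noisy but unbiased GNSS data.
    alpha near 1 trusts the INS short-term; the (1 - alpha) share slowly
    pulls the estimate toward GNSS, bounding the drift. Toy 1-D sketch."""
    return alpha * ins_position + (1.0 - alpha) * gnss_position

# INS has drifted to 105 m while GNSS reads 100 m: the fused estimate
# moves part of the way back toward the GNSS fix each update.
print(complementary_filter(105.0, 100.0))  # ~104.9
```

Applied at every update cycle, the small GNSS correction keeps the INS drift bounded while preserving the INS's high update rate, which is the essential idea behind hybrid navigation.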

44 of 71

  • What is Guidance?
  • Guidance is the process of determining the desired path or trajectory to reach a target and generating commands to follow that path.
  • 👉 Navigation tells where you are,
    👉 Guidance tells where to go,
    👉 Control tells how to move.

45 of 71

  • Types of Guidance
  • 1:Command Guidance
  • External source sends commands
  • Example: Ground radar guiding a missile
  • ✔ Simple onboard system
  • ❌ Vulnerable to jamming
  • 2:Homing Guidance
  • The vehicle homes in on a target.
  • 📌 Types:
  • Active Homing – Sends signal and receives reflection (radar)
  • Passive Homing – Tracks emissions from target (IR, RF)
  • Semi-Active Homing – External source illuminates target

46 of 71

  • 3:Inertial Guidance
  • Uses onboard INS only
  • No external input
  • ✔ Jam-proof
  • ❌ Drift over long distances
  • 4:GPS / Satellite Guidance
  • Uses satellite data
  • High precision navigation
  • Used in UAVs, smart bombs, vehicles
  • 5:Proportional Navigation (PN)
  • Common in missile guidance
  • Vehicle steers to reduce line-of-sight rate
  • ✔ Very efficient & accurate
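The PN law is a single expression: commanded lateral acceleration is proportional to the line-of-sight (LOS) rotation rate. The numbers below are illustrative assumptions (a navigation constant N between 3 and 5 is typical):

```python
def pn_acceleration(nav_constant, closing_velocity, los_rate):
    """Proportional navigation law: a = N * Vc * lambda_dot,
    where lambda_dot is the line-of-sight rate (rad/s).
    Driving the LOS rate to zero puts the vehicle on a collision course."""
    return nav_constant * closing_velocity * los_rate

# N = 4, closing velocity 300 m/s, LOS rotating at 0.02 rad/s.
print(pn_acceleration(4, 300.0, 0.02))  # 24.0 m/s^2 commanded laterally
```

The intuition: if the line of sight to the target is not rotating, the geometry is already a collision course, so zero correction is commanded; any LOS rotation triggers a proportional steering correction.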

47 of 71

Summary Table

| Aspect   | Navigation                  | Guidance                       |
|----------|-----------------------------|--------------------------------|
| Function | Finds current position      | Determines desired path        |
| Input    | Sensors / GPS / INS         | Target & mission data          |
| Output   | Position, velocity, heading | Steering / trajectory commands |
| Example  | GPS in a drone              | Missile tracking a target      |

48 of 71

What is Mission Planning?

  • Mission Planner is a free, open-source, community-supported application developed by Michael Oborne for the open-source APM autopilot project. 
  • Mission Planning is the process of defining:
  • 🎯 Mission objectives
  • 📍 Waypoints and routes
  • ⏱ Flight time and schedule
  • 🔋 Battery and payload requirements
  • 🌦 Environmental considerations

49 of 71

Steps in UAV Mission Planning

  • Step 1: Define Mission Objective
  • Surveillance
  • Mapping
  • Delivery
  • Agriculture spraying
  • Search & Rescue
  • Step 2: Area Analysis
  • Study terrain
  • Check obstacles
  • Airspace restrictions (DGCA guidelines)
  • Weather conditions

50 of 71

  • Step 3: Path Planning
  • Define waypoints (Latitude, Longitude, Altitude)
  • Set flight speed
  • Set mission type:
    • Waypoint mission
    • Grid mission (mapping)
    • Orbit mission
    • Follow-me mission
  • Step 4: Resource Planning
  • Battery capacity
  • Payload weight
  • Communication range
  • Flight time estimation
  • Step 5: Risk Assessment
  • Emergency landing zones
  • Return-to-Home (RTH) configuration
  • Failsafe settings
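Steps 3 and 4 above can be combined into a first-order flight-time estimate: sum the great-circle distances between consecutive waypoints and divide by ground speed. The route coordinates and speed below are hypothetical, and the estimate ignores wind, climbs, and turns:

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two waypoints given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def mission_time_s(waypoints, speed_mps):
    """Total leg distance over ground speed: a first-order flight-time estimate."""
    legs = zip(waypoints, waypoints[1:])
    total = sum(haversine_m(a[0], a[1], b[0], b[1]) for a, b in legs)
    return total / speed_mps

# Hypothetical short survey route, (lat, lon) in degrees.
route = [(17.0000, 78.0000), (17.0050, 78.0000), (17.0050, 78.0050)]
print(mission_time_s(route, speed_mps=10.0))  # roughly 109 s
```

Comparing this estimate (plus a reserve margin) against battery endurance is the core of the resource-planning step.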

51 of 71

What is Mission Control?

  • Mission Control refers to:
  • Real-time monitoring
  • Command execution
  • Telemetry tracking
  • Emergency handling
  • It is usually performed through a Ground Control Station (GCS).

52 of 71

Components of Mission Control System

1:Ground Control Station (GCS)

  • Laptop/Tablet software
  • Displays:
    • Map
    • UAV position
    • Battery status
    • Telemetry data

2:Communication Link

  • RF communication
  • 4G/5G
  • Satellite link

3:Flight Controller

  • Executes mission commands
  • Controls motors and sensors
  • Maintains stability

53 of 71

Types of Control Modes

Manual Mode

  • Pilot controls via remote

Stabilized Mode

  • Auto stabilization, manual navigation

Autonomous Mode

  • Pre-programmed mission
  • Executes waypoint navigation automatically

54 of 71

Key Control Functions

| Function            | Description                     |
|---------------------|---------------------------------|
| Take-off Control    | Automatic or manual launch      |
| Waypoint Navigation | GPS-based route following       |
| Payload Control     | Camera trigger / spray ON/OFF   |
| Return to Home      | Automatic return on low battery |
| Emergency Landing   | Fail-safe landing               |

55 of 71

Mission control continuously monitors:

  • GPS coordinates
  • Altitude
  • Speed
  • Battery voltage
  • Signal strength
  • IMU data
  • Wind conditions

56 of 71

Path Planning Techniques

  • 1. Classical Methods
  • Straight-line waypoint navigation
  • Grid scanning for mapping
  • 2. Advanced Algorithms
  • A* Algorithm
  • Dijkstra’s Algorithm
  • RRT (Rapidly Exploring Random Tree)
  • Potential Field Method
  • Used in:
  • Obstacle avoidance
  • Dynamic mission updates
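A minimal sketch of the A* algorithm named above, on a 4-connected occupancy grid; the grid layout and coordinates are illustrative:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid; grid[r][c] == 1 is an obstacle.
    Returns path length in steps, or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start)]  # entries are (f = g + h, g, node)
    best = {start: 0}
    while open_set:
        _, g, node = heapq.heappop(open_set)
        if node == goal:
            return g
        if g > best.get(node, float("inf")):
            continue  # stale queue entry, a better path was already found
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

# 4x4 grid with obstacle walls (1s) forcing a detour.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
print(astar(grid, (0, 0), (3, 0)))  # 9 steps around the walls
```

With the heuristic set to zero this degenerates to Dijkstra's algorithm; the admissible Manhattan heuristic simply steers the search toward the goal so fewer cells are expanded.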

57 of 71

Mission Safety Features

  • Geo-fencing
  • Low battery warning
  • Signal loss protection
  • Collision avoidance sensors
  • Redundant communication links
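Of the safety features above, geo-fencing in its simplest circular form reduces to a distance check against the home point; the sketch below assumes positions already converted to local east/north offsets in metres:

```python
import math

def inside_geofence(pos, home, radius_m):
    """Circular geo-fence check on a local flat-earth approximation:
    positions are (east_m, north_m) offsets from the home point."""
    return math.hypot(pos[0] - home[0], pos[1] - home[1]) <= radius_m

print(inside_geofence((30.0, 40.0), (0.0, 0.0), 100.0))    # True  (50 m out)
print(inside_geofence((300.0, 400.0), (0.0, 0.0), 100.0))  # False (500 m out)
```

In a real flight stack the fence check runs every control cycle, and a breach triggers a configured failsafe action such as Return-to-Home.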

58 of 71

Block Diagram

Mission Planner (GCS) → Communication Link → Flight Controller → Motors & Sensors → Telemetry Feedback (back to GCS)

59 of 71

Applications of Mission Planning & Control

  • Agricultural field mapping
  • Disaster management
  • Border surveillance
  • Military operations
  • Smart city monitoring
  • Infrastructure inspection

60 of 71

Advantages

✔ Efficient mission execution

✔ Reduced human error

✔ Optimized battery usage

✔ Enhanced safety

✔ Autonomous operations

61 of 71

https://ardupilot.org/planner/docs/mission-planner-overview.html


Short Answer Questions

  • Define payload in UAV systems.
  • Classify payloads based on applications.
  • What is a hyper-spectral sensor?
  • State any two applications of hyper-spectral imaging.
  • What is LADAR?
  • How does LADAR differ from conventional radar?
  • Define Synthetic Aperture Radar (SAR).
  • Mention two advantages of SAR.
  • What is a thermal camera?
  • Write two applications of thermal cameras in drones.
  • Define ultrasonic detector.
  • What frequency range is used in ultrasonic sensors?
  • Mention two limitations of ultrasonic detectors.
  • What is navigation in UAVs?
  • Define guidance system.

70 of 71

Short Answer Questions

  • Differentiate between navigation and guidance.
  • What is inertial navigation system (INS)?
  • What is GPS-based navigation?
  • Define waypoint navigation.
  • What is mission planning?
  • List any two steps in mission planning.
  • What is mission control?
  • Define Ground Control Station (GCS).
  • What is telemetry?
  • What is Return-to-Home (RTH) feature?
  • Define geo-fencing.
  • What is path planning?
  • State any two control modes of UAV.
  • What is autonomous navigation?
  • What is obstacle avoidance?
  • What is payload integration?
  • Define communication link in mission control.
  • What is a fail-safe mechanism?
  • What is a case study in UAV payloads?
  • Write two examples of UAV payloads used in agriculture.

71 of 71

Long Answer Questions

  • Explain classification of UAV payloads based on applications with examples.
  • Describe hyper-spectral sensors: working principle, advantages, and applications in UAVs.
  • Explain LADAR system with block diagram and its role in obstacle avoidance.
  • Discuss Synthetic Aperture Radar (SAR): principle, features, and UAV applications.
  • Explain thermal imaging cameras and their use in surveillance and rescue missions.
  • Describe ultrasonic detectors: working, advantages, limitations, and UAV uses.
  • Write a detailed note on navigation systems and types of guidance used in UAVs.
  • Explain mission planning process for UAV operations with neat diagram.
  • Describe mission control system: components, functions, and safety features.
  • Write a case study on UAV payloads used in agriculture/surveillance/disaster management.