1 of 23

Application Development with Android Sensor Frameworks

DSC 291

2 of 23

Activity

  • User interaction with a mobile app is often non-deterministic: the app does not always begin at the same screen.
    • For example, opening an email app from your home screen shows a list of emails.
    • A social media app can also launch the email app, which might take the user directly to the screen for composing an email.
  • The Activity serves as the entry point for an app’s interaction with the user.
  • An activity provides the window in which the app draws its User Interface (UI). The window typically fills the entire screen of the mobile device. Generally one activity implements one screen in an app.
    • For example, one of the app’s activities may implement a Preference screen, while another activity implements a Select Photo Screen.

3 of 23

Activity


  • Most apps contain multiple screens, which means they contain multiple activities. Typically, one activity in an app is designated the main activity: the first screen to appear when the user launches the app.
  • One activity can start another activity in order to perform different actions.

4 of 23

Activity


  • For your app to be able to use activities, you must declare every activity, along with certain of its attributes, in the manifest file.
    • Declare each activity as a child of the <application> element.
    • You can also set up intent filters if you want an activity to be launched by other activities and to receive data from them.
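A minimal manifest declaration following the bullets above might look like this (the activity name `.MainActivity` is a placeholder; the intent filter shown marks the activity as the app's launcher entry point):

```xml
<application>
    <!-- Each activity is declared as a child of <application> -->
    <activity android:name=".MainActivity" android:exported="true">
        <!-- This intent filter makes the activity the app's main, launchable screen -->
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="android.intent.category.LAUNCHER" />
        </intent-filter>
    </activity>
</application>
```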

5 of 23

Activity

As a user navigates through, out of, and back to your app, each activity transitions through different states in its lifecycle. The Activity class provides a number of callback functions so that 1) the activity knows when a state change occurs, and 2) the activity knows how to behave as the user leaves and re-enters it.

For example, if you're building a streaming video player, you might pause the video and terminate the network connection when the user switches to another app. When the user returns, you can reconnect to the network and allow the user to resume the video from the same spot.

Good implementation of the lifecycle callbacks can help ensure that your app avoids:

  • Crashing if the user receives a phone call or switches to another app while using your app.
  • Consuming valuable system resources when the user is not actively using it.
  • Losing the user's progress if they leave your app and return to it at a later time.
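The streaming-player example above can be sketched with lifecycle callbacks; note that VideoPlayer and its methods are hypothetical placeholders, not a real Android API:

```java
public class PlayerActivity extends Activity {
    private VideoPlayer player; // hypothetical player wrapper, not a real API

    @Override
    protected void onPause() {
        super.onPause();
        player.pause();      // pause the video when the user switches away
        player.disconnect(); // terminate the network connection
    }

    @Override
    protected void onResume() {
        super.onResume();
        player.reconnect();  // reconnect so the user can resume playback
    }
}
```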

6 of 23

Activity lifecycle diagram. Source: Android Developer Documentation

7 of 23

Intent

  • Intents are used to start activities and to pass information between app components (activities, services, broadcast receivers, etc.).

8 of 23

Intent

  • The Intent constructor takes two parameters: a Context and a Class.
  • Intents come in two types: explicit and implicit.
  • Explicit intents explicitly name the component that the Android system should call.
  • Implicit intents specify the action to be performed and let the Android system find all components registered for that action, e.g. a Share option.
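The two intent types can be sketched as follows (SecondActivity and the shared text are illustrative; this assumes the code runs inside an Activity):

```java
// Explicit intent: the Context ("this") and the target Class are named directly.
Intent explicit = new Intent(this, SecondActivity.class);
explicit.putExtra("message", "hello"); // pass information to the next activity
startActivity(explicit);

// Implicit intent: only the action is specified; Android finds all registered
// components, which is what powers a typical "Share" option.
Intent implicit = new Intent(Intent.ACTION_SEND);
implicit.setType("text/plain");
implicit.putExtra(Intent.EXTRA_TEXT, "Check this out!");
startActivity(Intent.createChooser(implicit, "Share via"));
```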

9 of 23

Sensor Overview

  • Most Android-powered devices have built-in sensors that measure motion, orientation, and various environmental conditions.
  • The Android platform supports three broad categories of sensors:
    • Motion sensors: These sensors measure acceleration forces and rotational forces along three axes. This category includes accelerometers, gravity sensors, gyroscopes, and rotational vector sensors.
    • Environmental sensors: These sensors measure various environmental parameters, such as ambient air temperature and pressure, illumination, and humidity. This category includes barometers, photometers, and thermometers.
    • Position sensors: These sensors measure the physical position of a device. This category includes orientation sensors and magnetometers.

10 of 23

Android Sensor Framework Capabilities

  • Determine which sensors are available on a device.
  • Determine an individual sensor's capabilities, such as its maximum range, manufacturer, power requirements, and resolution.
  • Acquire raw sensor data and define the minimum rate at which you acquire sensor data.
  • Register and unregister sensor event listeners that monitor sensor changes.

11 of 23

Types of Sensor

The Android sensor framework lets you access two types of sensors.

  • Hardware-based sensors: Hardware-based sensors are physical components built into a handset or tablet device. They derive their data by directly measuring specific environmental properties, such as acceleration, geomagnetic field strength, or angular change.
  • Software-based or virtual sensors: Software-based sensors are not physical devices, although they mimic hardware-based sensors. The linear acceleration sensor and the gravity sensor are examples of software-based sensors.

12 of 23

Types of Sensor

Sensor types overview. Source: Android Developer Documentation

13 of 23

Android Sensor Framework

In a typical application you use these sensor-related APIs to perform two basic tasks:

  • Identifying sensors and sensor capabilities: Identifying sensors and sensor capabilities at runtime is useful if your application has features that rely on specific sensor types or capabilities.
  • Monitoring sensor events: Monitoring sensor events is how you acquire raw sensor data. A sensor event occurs every time a sensor detects a change in the parameters it is measuring. A sensor event provides you with four pieces of information: the name of the sensor that triggered the event, the timestamp for the event, the accuracy of the event, and the raw sensor data that triggered the event.

14 of 23

Android Sensor Framework

Android Sensor Framework allows you to determine which sensors are available on the device at run time.

private SensorManager sensorManager;
...
sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
List<Sensor> deviceSensors = sensorManager.getSensorList(Sensor.TYPE_ALL);

If you want to list all of the sensors of a given type, you can use another constant instead of TYPE_ALL, such as TYPE_GYROSCOPE, TYPE_LINEAR_ACCELERATION, or TYPE_GRAVITY.

15 of 23

Determining if a sensor exists

private SensorManager sensorManager;
...
sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
if (sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null) {
    // Success! There's a magnetometer.
} else {
    // Failure! No magnetometer.
}

16 of 23

Determining sensor attributes

In addition to listing the sensors that are on a device, you can use the public methods of the Sensor class to determine the capabilities and attributes of individual sensors. This is useful if you want your application to behave differently based on which sensors or sensor capabilities are available on a device.

  • getResolution() returns a sensor's resolution.
  • getMaximumRange() returns a sensor's maximum range of measurement.
  • getPower() returns a sensor's power requirements.
  • getMinDelay() returns the minimum time interval (in microseconds) a sensor can use to capture data. If a sensor returns zero, it is not a streaming sensor and reports data only when there is a change in the parameters it is sensing.
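As an illustration of how getMinDelay() might be interpreted (the helper class below is a sketch of my own, not part of the Android API), the microsecond interval converts to a maximum sampling rate in Hz:

```java
public class SensorRateUtil {
    // Convert a sensor's minimum delay in microseconds (the value returned
    // by Sensor.getMinDelay()) into its maximum sampling rate in Hz.
    // Zero means the sensor is not a streaming sensor, so no rate applies.
    public static double maxSamplingRateHz(int minDelayUs) {
        if (minDelayUs <= 0) {
            return 0.0; // non-streaming (on-change) sensor
        }
        return 1_000_000.0 / minDelayUs;
    }

    public static void main(String[] args) {
        // A 10,000-microsecond minimum delay corresponds to 100 Hz.
        System.out.println(maxSamplingRateHz(10_000));
    }
}
```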

17 of 23

Listening to sensor data

To monitor raw sensor data you need to implement two callback methods that are exposed through the SensorEventListener interface: onAccuracyChanged() and onSensorChanged().

public class SensorActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor mLight;

    @Override
    public final void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mLight = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
    }

    @Override
    public final void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Do something here if sensor accuracy changes.
    }

    @Override
    public final void onSensorChanged(SensorEvent event) {
        // The light sensor returns a single value.
        // Many sensors return 3 values, one for each axis.
        float lux = event.values[0];
        // Do something with this sensor value.
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this, mLight, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }
}

18 of 23

Listening to sensor data

@Override
protected void onResume() {
    super.onResume();
    sensorManager.registerListener(this, mLight, SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(this);
}

The data delay (or sampling rate) controls the interval at which sensor events are sent to your application via the onSensorChanged() callback method. The default data delay, SENSOR_DELAY_NORMAL, is 200,000 microseconds. You can specify other data delays, such as SENSOR_DELAY_GAME (20,000-microsecond delay), SENSOR_DELAY_UI (60,000-microsecond delay), or SENSOR_DELAY_FASTEST (0-microsecond delay).

Remember, this is only a suggested or requested value. The Android Sensor Framework makes no guarantee that your sensor data will be sampled at the requested rate!

As a best practice, request the largest delay your application can tolerate.

19 of 23

Dealing with Sensors: Best Practices

  • Determine the actual sampling rate with timestamps.
  • Select sensor delays carefully.
  • Don't block the onSensorChanged() method.
  • Always remember to unregister the listener when the sensor is no longer needed, to save battery!

private SensorManager sensorManager;
...
@Override
protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(this);
}
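The first bullet, determining the real sampling rate from timestamps, can be sketched in plain Java (the class below is my own sketch; it assumes you feed it SensorEvent.timestamp values, which are in nanoseconds, from onSensorChanged()):

```java
public class SamplingRateEstimator {
    private long lastTimestampNs = -1;
    private double avgIntervalNs = 0.0;
    private long count = 0;

    // Call once per sensor event with event.timestamp (nanoseconds).
    public void addTimestamp(long timestampNs) {
        if (lastTimestampNs >= 0) {
            long deltaNs = timestampNs - lastTimestampNs;
            count++;
            // Running mean of the interval between consecutive events.
            avgIntervalNs += (deltaNs - avgIntervalNs) / count;
        }
        lastTimestampNs = timestampNs;
    }

    // Average observed sampling rate in Hz; 0 until two events have arrived.
    public double rateHz() {
        return avgIntervalNs > 0 ? 1e9 / avgIntervalNs : 0.0;
    }
}
```

Comparing rateHz() against the delay you requested shows how far the framework deviated from your suggested rate.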

20 of 23

Android Studio Virtual Devices for Emulating Sensor Data

  • Android Studio facilitates developing and testing applications without physical access to phones.
  • The Emulator provides virtual devices that can simulate sensor data.
  • An Android Virtual Device (AVD) is a combination of a hardware profile, system image, storage area, skin, and other properties. This helps developers test their apps on diverse device types and hardware.
  • The options include predefined profiles based on real devices, and profiles may also be customized as required.

21 of 23

Android Studio Virtual Devices for Emulating Sensor Data

  • If the system requirements are met, AVDs can be set up and used to test your apps.
  • To simulate sensor data, there are options to simulate different types of movement or rotation, or to change the simulated values with a slider, to test the functionality of your apps.

22 of 23

Plotting Sensor Data on Your Android App

  • Plotting and visualizing sensor data is important when working with sensors.
  • There are many ways to create plots and visualizations on Android; one option is the popular MPAndroidChart library.
  • It supports many different plot types; please refer to the MPAndroidChart documentation for more details.
  • To set up this dependency, add the following to your Gradle files:

repositories {
    maven(url = "https://jitpack.io")
}

dependencies {
    implementation("com.github.PhilJay:MPAndroidChart:v<>")
}

23 of 23

Plotting Sensor Data on Your Android App with MPAndroidChart

  • To create a plot, define the UI component in your XML file and reference it in your activity code.
  • You may select different styling options.

chartData = new LineData(chartDataSet);                 // create chart data from a dataset
chart.setData(chartData);                               // set the data on the chart
chartDataSet.addEntry(new Entry(event.timestamp, mag)); // add incoming sensor data
chartData.notifyDataChanged();                          // let the data know a dataset changed
chart.notifyDataSetChanged();                           // let the chart know its data changed
chart.invalidate();                                     // redraw the chart