3 Simple Steps to Integrate Sensors in Your Android Application

Learn to use Android sensors to detect device motion, device proximity, and location — using Kotlin and the Android SDK

In most Android devices, sensors are a vital component; we use them in many applications to perform different tasks and to improve user experiences.

They are a device's way of interacting with the outside world, and with them we can measure motion, orientation, position, and various environmental conditions.

In this short article, we'll explore the different sensors we have at our disposal and how they work. Then, we'll touch on sensor fusion and see how the Android SDK's classes simplify working with a device's sensors.

Project Setup 📐

To start, open up Android Studio and create a new project or open an existing one.

No special dependencies are needed to work with sensors, just the core SDK.

Sensor Types 🧭

In Android, we have three broad categories of sensors. Some of them are hardware-based, while others are software-based. We can use both inside our apps.

Let’s take a look at the types of sensors.

Motion Sensors

As the name indicates, motion sensors track a device’s movement; they include accelerometers, gravity sensors, and gyroscopes, and they give data on forces like acceleration and rotation. Here are some specific sensors:

  • TYPE_ACCELEROMETER: Measures the acceleration (including the force of gravity) in m/s² applied on all three axes.
  • TYPE_GRAVITY: Measures the gravitational force in m/s² applied on all three axes.
  • TYPE_GYROSCOPE: Measures the rate of rotation in rad/s around each of the three axes.
  • TYPE_LINEAR_ACCELERATION: Measures the acceleration force in m/s² applied on all three axes, without the force of gravity.
  • TYPE_ROTATION_VECTOR: Measures the orientation of the device as a rotation vector.

Environmental Sensors

Environmental sensors measure properties of the device's surroundings; barometers and thermometers are familiar examples. Here's a more complete list of sensors under this category:

  • TYPE_AMBIENT_TEMPERATURE: Measures the ambient room temperature in degrees Celsius.
  • TYPE_PRESSURE: Measures the ambient air pressure in hPa or mbar.
  • TYPE_LIGHT: Measures the ambient light level in lx (lux).
  • TYPE_RELATIVE_HUMIDITY: Measures the relative humidity of the ambient air as a percentage.

Position Sensors

These sensors help determine the device's physical position:

  • TYPE_PROXIMITY: Measures the distance from the device's screen to a nearby object in centimeters.
  • TYPE_MAGNETIC_FIELD: Measures the geomagnetic field on all three axes in microtesla (μT).

Generally speaking, we can't rely on a single sensor to build something like a compass or to compute the device's orientation, because individual sensors output noisy data. To obtain more accurate readings, we can combine data from multiple sensors: this is called sensor fusion.
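To make the idea concrete, here's a minimal sketch of one common fusion technique, a complementary filter, written in plain Kotlin so the math stands on its own. It blends a gyroscope-integrated angle (smooth but prone to drift) with an accelerometer-derived angle (noisy but drift-free). The class name and the 0.98/0.02 weighting are illustrative assumptions, not part of the Android SDK:

```kotlin
// Hypothetical complementary filter: fuses a gyroscope-integrated angle
// with an accelerometer-derived angle. alpha close to 1 trusts the
// gyroscope for short-term changes while the accelerometer slowly
// corrects long-term drift.
class ComplementaryFilter(private val alpha: Float = 0.98f) {
    var angle = 0f
        private set

    // gyroRate: angular velocity in rad/s (e.g. from TYPE_GYROSCOPE)
    // accelAngle: angle estimated from TYPE_ACCELEROMETER, in radians
    // dt: time elapsed since the previous sensor event, in seconds
    fun update(gyroRate: Float, accelAngle: Float, dt: Float): Float {
        angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle
        return angle
    }
}

fun main() {
    val filter = ComplementaryFilter()
    // With no rotation and a steady accelerometer estimate of 0.5 rad,
    // the fused angle converges toward 0.5 without accumulating drift.
    repeat(300) { filter.update(gyroRate = 0f, accelAngle = 0.5f, dt = 0.02f) }
    println(filter.angle)
}
```

In a real app, you would call update() from onSensorChanged(), deriving dt from the event timestamps.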

SensorManager Setup

The first step to start actually using sensors is to obtain a SensorManager. This system service is responsible for registering sensor listeners and delivering updates from them.

We obtain this service by calling the getSystemService() method with the service we want (the SENSOR_SERVICE constant).

As mentioned earlier, the availability of sensors may vary from one device to another. Given this, it's good practice to verify which sensors are available on a given device at runtime. We can do this by calling the getSensorList() method with the parameter TYPE_ALL.

Here’s a code snippet to illustrate:

private lateinit var sensorManager: SensorManager

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    // initializing the sensor manager
    sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager

    // getting the list of all the available sensors on the current device
    val deviceSensors: List<Sensor> = sensorManager.getSensorList(Sensor.TYPE_ALL)

    deviceSensors.forEach { sensor ->
        Log.d("SENSORS", "Sensor Name: ${sensor.name}")
    }
}

Registering to Sensors

Now that we have a SensorManager, we can get a reference to the sensors that we want to track data from.

Call the getDefaultSensor() method of the SensorManager with the desired sensor constant from the Sensor class. Then, inside the also {} scope function, call the registerListener() method on the SensorManager.

Here’s a code snippet to register the gravity sensor:

sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY)?.also { gravitySensor ->
    sensorManager.registerListener(this, gravitySensor,
        SensorManager.SENSOR_DELAY_NORMAL, 0)
}

The this parameter refers to the SensorEventListener that our activity implements; gravitySensor is the sensor we received in the also block; and the final two parameters are the sampling frequency and the maximum report latency (a latency of 0 delivers events as soon as they arrive, without batching). Note that getDefaultSensor() returns null when the device lacks that sensor, which is why the safe-call ?.also is used.

The core Android SDK provides four constants that inform the system how often to report the computed events:

  • SENSOR_DELAY_NORMAL: Gets sensor data at a rate suitable for screen orientation changes.
  • SENSOR_DELAY_GAME: Gets sensor data at a rate suitable for games.
  • SENSOR_DELAY_UI: Gets sensor data at a rate suitable for working with user interfaces.
  • SENSOR_DELAY_FASTEST: Gets sensor data as soon as possible.
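Besides these four constants, registerListener() also accepts a raw sampling period in microseconds, which gives finer control over the event rate. A small helper (the function name is my own, not an SDK API) can convert a desired rate in Hz into that period:

```kotlin
// Hypothetical helper: converts a desired sampling rate in Hz into the
// microsecond period that registerListener() accepts in place of a
// SENSOR_DELAY_* constant. The system treats this as a hint, not a
// guarantee.
fun hzToSamplingPeriodUs(hz: Int): Int {
    require(hz > 0) { "Rate must be positive" }
    return 1_000_000 / hz
}

fun main() {
    // Request roughly 50 sensor events per second:
    println(hzToSamplingPeriodUs(50))  // 20000 microseconds
    // Usage sketch:
    // sensorManager.registerListener(this, sensor, hzToSamplingPeriodUs(50))
}
```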

Handling the Sensor Data

To listen to the event changes of the sensor, we need to implement the interface SensorEventListener and override the onAccuracyChanged() and onSensorChanged() methods.

class MainActivity : AppCompatActivity(), SensorEventListener {

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {
        Log.d("SENSORS", "onAccuracyChanged: $accuracy")
    }

    override fun onSensorChanged(event: SensorEvent) {
        Log.d("SENSORS", "onSensorChanged: The values are ${event.values.contentToString()}")
    }
}

Behind the scenes, the Android system calls onSensorChanged() every time there's a new sensor event, passing a SensorEvent parameter whose values array holds the readings.

Most motion sensors report three values because they measure each axis separately: event.values[0] holds the x-axis value, event.values[1] the y-axis value, and event.values[2] the z-axis value.
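The three axis values are often combined into a single magnitude. As a sketch, here's a plain-Kotlin shake check built on linear-acceleration values (gravity already removed); the 12 m/s² threshold is an illustrative assumption, not an SDK value:

```kotlin
import kotlin.math.sqrt

// Combines the three axis values of a sensor event (x, y, z) into a
// single magnitude: sqrt(x² + y² + z²).
fun magnitude(values: FloatArray): Float =
    sqrt(values[0] * values[0] + values[1] * values[1] + values[2] * values[2])

// Simple shake heuristic for TYPE_LINEAR_ACCELERATION readings; the
// threshold is an assumption you would tune for your use case.
fun isShake(values: FloatArray, threshold: Float = 12f): Boolean =
    magnitude(values) > threshold

fun main() {
    println(magnitude(floatArrayOf(3f, 4f, 0f)))   // 5.0
    println(isShake(floatArrayOf(10f, 10f, 5f)))   // true (magnitude 15 > 12)
}
```

Inside onSensorChanged(), you would pass event.values straight into these functions.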

onAccuracyChanged() is called whenever the sensor's reported accuracy changes.

From here, you can store these values in a local variable and process them as your use case and application require. When you no longer need updates (for example, in onPause()), call sensorManager.unregisterListener(this) to stop receiving events and save battery.


If you want to learn more about sensors, the official Android developer documentation is the best place to continue.


I hope that you’ve learned enough about sensors to start building apps with them in mind. Remember that you can always combine sensor data to produce more accurate results and richer user experiences.

Don’t forget to clap 👏 and follow 📌 if you enjoy what you read. Here are a few of my other articles you can read, as well!

Editor’s Note: Heartbeat is a contributor-driven online publication and community dedicated to providing premier educational resources for data science, machine learning, and deep learning practitioners. We’re committed to supporting and inspiring developers and engineers from all walks of life.

Editorially independent, Heartbeat is sponsored and published by Comet, an MLOps platform that enables data scientists & ML teams to track, compare, explain, & optimize their experiments. We pay our contributors, and we don’t sell ads.

If you’d like to contribute, head on over to our call for contributors. You can also sign up to receive our weekly newsletters (Deep Learning Weekly and the Comet Newsletter), join us on Slack, and follow Comet on Twitter and LinkedIn for resources, events, and much more that will help you build better ML models, faster.

Our team has been at the forefront of Artificial Intelligence and Machine Learning research for more than 15 years and we're using our collective intelligence to help others learn, understand and grow using these new technologies in ethical and sustainable ways.
