Overview
Android devices contain several built-in sensors that can detect motion, orientation, and environmental conditions. In this chapter you will focus on how to work with these physical sensors from your Android app. You will see how to access them through the Android SensorManager, how to register and unregister listeners, and how to interpret the raw data that sensors provide.
You will not learn about location here, because that is covered separately, but you will see how some sensors can improve location quality or enable features such as shake detection or step counting.
Types of Device Sensors
Android groups sensors into three main categories. Understanding this classification helps you choose the right sensor for a specific task and know what kind of values to expect from it.
The first category is motion sensors. These report movement or acceleration of the device. Common examples are the accelerometer, which measures acceleration applied to the device including gravity, the gyroscope, which measures the rate of rotation around the device axes, and the gravity or linear acceleration sensors, which are derived or fused from other sensors.
The second category is environmental sensors. These measure conditions around the device. Examples include the ambient light sensor, which measures light level in lux, the proximity sensor, which measures the distance from the device to a nearby object, usually in centimeters, and sometimes temperature, pressure, or humidity sensors. Not all devices have all these sensors, but light and proximity are very common.
The third category is position or orientation sensors. These report the device’s position relative to the Earth’s magnetic field and gravity. The magnetometer measures the ambient magnetic field, which is useful for compass-like features. Orientation and rotation vector sensors combine accelerometer and magnetometer data to provide an estimated device orientation as a rotation in three-dimensional space.
Not every Android device includes every sensor. Always be prepared for some sensors to be missing and handle this gracefully in your app.
Accessing Sensors with SensorManager
To use physical sensors you work with the Android SensorManager system service. This object gives you access to the list of available sensors and lets you register callbacks for sensor events.
In an Activity or Fragment, you typically obtain the SensorManager using getSystemService. In a modern Kotlin activity that extends ComponentActivity or AppCompatActivity, the code usually looks like this.
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class SensorActivity : AppCompatActivity() {

    private lateinit var sensorManager: SensorManager
    private var accelerometer: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    }
}
The call to getDefaultSensor returns the default sensor of the specified type, or null if the device does not provide that type. There can be multiple sensors of the same type, for example different accuracy levels, but the default is usually the one you want for regular app features.
If you need a list of all sensors of a type, or all sensors on the device, you can use other methods on SensorManager, such as getSensorList. Most beginner use cases only require the default sensor for a particular type.
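A sketch of how such an enumeration might look, assuming the standard SensorManager API and a sensorManager obtained as shown above:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorManager

// Logs every sensor on the device, then counts the accelerometers.
fun logAvailableSensors(sensorManager: SensorManager) {
    // All sensors of any type
    val allSensors: List<Sensor> = sensorManager.getSensorList(Sensor.TYPE_ALL)
    allSensors.forEach { sensor ->
        println("${sensor.name} (vendor: ${sensor.vendor})")
    }

    // All sensors of one specific type, in case the device has more than one
    val accelerometers = sensorManager.getSensorList(Sensor.TYPE_ACCELEROMETER)
    println("Accelerometer count: ${accelerometers.size}")
}
```

This is mainly useful for diagnostics or for apps that let the user pick between multiple sensors of the same type.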
Always treat sensors as optional. Perform a null check before using the sensor. If it is not available, you might hide the corresponding UI feature or show a simple message to the user.
Registering and Unregistering SensorEventListeners
You do not poll sensors manually. Instead, you implement a SensorEventListener that receives callbacks whenever new sensor data is available. This is similar to how you receive location updates, but the interface and patterns are specific to sensors.
A basic implementation of a sensor listener for the accelerometer can look like this.
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class SensorActivity : AppCompatActivity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private var accelerometer: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    }

    override fun onResume() {
        super.onResume()
        accelerometer?.also { sensor ->
            sensorManager.registerListener(
                this,
                sensor,
                SensorManager.SENSOR_DELAY_NORMAL
            )
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent?) {
        // Handle sensor values here
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {
        // React to accuracy changes if needed
    }
}
The important parts are the calls to registerListener and unregisterListener. You pass your SensorEventListener implementation, the Sensor instance, and a delay constant that controls how often you receive updates. Placing registration in onResume and unregistration in onPause is a common pattern. It ensures that you only receive sensor data while your activity is visible and in the foreground.
Always unregister sensor listeners when you no longer need them. Failing to do this can waste battery, keep the CPU running, and keep your app process alive longer than necessary.
The SensorManager provides several delay options, such as SENSOR_DELAY_NORMAL, SENSOR_DELAY_UI, SENSOR_DELAY_GAME, and SENSOR_DELAY_FASTEST. A faster rate means more data but higher power usage. For UI effects such as tilt-based animations, SENSOR_DELAY_UI or SENSOR_DELAY_GAME is often sufficient. For simple tasks such as detecting whether the device is flat on a table, SENSOR_DELAY_NORMAL is usually enough.
You can also create a separate class that implements SensorEventListener instead of making your activity implement it. The principle is the same. You pass that instance to registerListener and later to unregisterListener.
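A minimal sketch of that pattern; the class name and callback are illustrative:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener

// Hypothetical standalone listener; the activity only wires it up and tears it down.
class ShakeListener(private val onShake: () -> Unit) : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent?) {
        // Inspect event.values here and call onShake() when your condition is met
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}

// In the activity:
// val listener = ShakeListener { /* react to shake */ }
// sensorManager.registerListener(listener, accelerometer, SensorManager.SENSOR_DELAY_UI)
// ...and later:
// sensorManager.unregisterListener(listener)
```

Keeping the listener in its own class makes it easier to reuse and to test the logic separately from the activity lifecycle.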
Understanding SensorEvent Data
When new data arrives, the onSensorChanged method is called with a SensorEvent. This object contains several important fields. The most important one is values, a FloatArray containing the sensor readings. The meaning of each index depends on the type of sensor.
For the accelerometer, values[0], values[1], and values[2] represent acceleration along the X, Y, and Z axes respectively, in meters per second squared; the reading includes gravity. With the device held in its natural orientation, the X axis points to the right of the screen, the Y axis toward the top of the screen, and the Z axis out of the screen toward the user.
A typical handler might look like this.
override fun onSensorChanged(event: SensorEvent?) {
    if (event?.sensor?.type == Sensor.TYPE_ACCELEROMETER) {
        val ax = event.values[0]
        val ay = event.values[1]
        val az = event.values[2]
        // Use ax, ay, az for your logic
    }
}
The timestamp field represents the time of the event in nanoseconds since device boot, not wall clock time. This is useful if you need to compute time differences between events, but for basic user-facing features you often do not need it.
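If you do need time differences, converting the nanosecond timestamps to seconds is straightforward. A small sketch in plain Kotlin, independent of the Android APIs:

```kotlin
// Computes the elapsed time in seconds between two SensorEvent.timestamp
// values, which are expressed in nanoseconds since boot.
fun elapsedSeconds(previousTimestampNs: Long, currentTimestampNs: Long): Float {
    val nanosPerSecond = 1_000_000_000f
    return (currentTimestampNs - previousTimestampNs) / nanosPerSecond
}
```

Storing the previous event's timestamp and calling this on each new event gives you the sampling interval actually delivered, which can differ from the requested delay.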
Sensor values are often noisy, especially motion sensors. If you react to every small change, your UI might flicker. Simple smoothing techniques like averaging recent values, or ignoring changes below a certain threshold, can help.
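One common smoothing approach is an exponential moving average, which acts as a simple low-pass filter. A sketch in plain Kotlin, where alpha controls how strongly each new reading influences the result:

```kotlin
// Simple exponential moving average: smaller alpha means smoother but laggier output.
class LowPassFilter(private val alpha: Float) {
    private var value: Float? = null

    fun filter(input: Float): Float {
        val previous = value
        val next = if (previous == null) input else previous + alpha * (input - previous)
        value = next
        return next
    }
}
```

You would keep one filter per axis and feed it the raw values from onSensorChanged, then drive your UI from the filtered output.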
If you need a complete description of the coordinate system or conversions between coordinate frames, you can use static utility methods in SensorManager, such as getRotationMatrix and getOrientation. These are more advanced topics that are useful when you implement precise orientation or augmented reality like features.
Common Motion Sensors and Use Cases
Among the motion sensors, three are very commonly used in beginner and intermediate apps. The accelerometer is the most widespread. You can use it to detect general device movement, approximate tilt, or simple gestures. A classic example is detecting a shake to trigger an action.
A simple shake detection can look at the magnitude of the acceleration vector. The magnitude is calculated as:
$$
a = \sqrt{a_x^2 + a_y^2 + a_z^2}
$$
You compare this against a threshold. If the value exceeds the threshold for a short time, you treat it as a shake.
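In code, the magnitude check might look like this; the gravity constant and the threshold factor are illustrative choices you would tune for your app:

```kotlin
import kotlin.math.sqrt

const val GRAVITY = 9.81f                   // Approximation of SensorManager.GRAVITY_EARTH
const val SHAKE_THRESHOLD = 2.5f * GRAVITY  // Illustrative; tune experimentally

// Returns true when the overall acceleration magnitude suggests a shake.
fun isShake(ax: Float, ay: Float, az: Float): Boolean {
    val magnitude = sqrt(ax * ax + ay * ay + az * az)
    return magnitude > SHAKE_THRESHOLD
}
```

Note that a device at rest reports a magnitude close to one g because the accelerometer includes gravity, so the threshold must be noticeably above GRAVITY.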
The gyroscope measures how fast the device is rotating around the three axes, in radians per second. It is useful when you want to track rotation more precisely than what the accelerometer alone can provide, for example for games or camera stabilization.
The rotation vector sensor combines data from the accelerometer and the magnetometer, and sometimes the gyroscope. It gives you the device orientation as a unit quaternion or as an axis and angle. You do not usually work directly with quaternions as a beginner. Instead you pass the raw values to SensorManager.getRotationMatrixFromVector and then to getOrientation to obtain human-friendly angles such as azimuth, pitch, and roll.
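A sketch of that two-step conversion, assuming the standard SensorManager utilities:

```kotlin
import android.hardware.SensorEvent
import android.hardware.SensorManager

// Converts a TYPE_ROTATION_VECTOR event into azimuth, pitch, and roll, in radians.
fun orientationAngles(event: SensorEvent): FloatArray {
    val rotationMatrix = FloatArray(9)
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)

    val orientation = FloatArray(3)  // [0] azimuth, [1] pitch, [2] roll
    SensorManager.getOrientation(rotationMatrix, orientation)
    return orientation
}
```

The azimuth is the rotation around the vertical axis relative to magnetic north, which is the value a simple compass feature would display (after converting radians to degrees).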
Many devices also provide the linear acceleration and gravity sensors. These are not physical sensors but are derived in software from other sensors. The gravity sensor isolates the effect of gravity, and the linear acceleration sensor removes gravity from the accelerometer reading. This can be helpful when you want to distinguish actual movement from the constant downward pull.
Common Environmental and Position Sensors
Environmental sensors provide data about the device’s surroundings. The light sensor delivers ambient light level in lux. A typical use is to adjust UI brightness or to adapt colors based on whether the environment is bright or dim.
A basic light sensor handler might look like this.
override fun onSensorChanged(event: SensorEvent?) {
    if (event?.sensor?.type == Sensor.TYPE_LIGHT) {
        val lux = event.values[0]
        // Adjust your UI based on lux
    }
}
The proximity sensor is often placed near the device's earpiece. It reports a small value, such as 0, when something is very close, like your ear, and a larger maximum value when the path is clear. It is typically used to turn off the screen when the user is in a call and the phone is next to their head, to avoid accidental touch events. You can build similar behavior in your own app, for example dimming the UI if an object is close.
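Interpreting proximity readings usually comes down to comparing the reported distance with the sensor's maximum range (available as Sensor.maximumRange). A tiny pure-Kotlin sketch of that check:

```kotlin
// Many proximity sensors are effectively binary: they report 0 (or a small
// value) when covered, and the maximum range when the path is clear.
fun isNear(distanceCm: Float, maximumRangeCm: Float): Boolean {
    return distanceCm < maximumRangeCm
}
```

Because many devices only ever report the two extremes, treating anything below the maximum range as "near" is a common and robust interpretation.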
Position sensors include the magnetometer, which measures the ambient magnetic field in microteslas. By itself, the magnetometer is influenced by nearby metal and electrical devices. When combined with the accelerometer using SensorManager utilities, it can provide the device orientation relative to the Earth’s magnetic field, which is the basis for a compass feature.
Rotation vector and game rotation vector sensors provide orientation intended for user interaction. The game rotation vector ignores the geomagnetic field, which reduces distortion from hard and soft iron interference; it is useful for games where you care about stable, responsive orientation but not true north. The regular rotation vector respects the global frame and works better for navigation use cases.
Sensor Availability and Feature Checks
Before you rely on a sensor for an important part of your app, you need to confirm that the device actually supports that sensor. Using getDefaultSensor and checking for null is the most direct method. For example, to check if the device supports a step counter sensor, you can write:
val stepCounter: Sensor? =
    sensorManager.getDefaultSensor(Sensor.TYPE_STEP_COUNTER)

if (stepCounter == null) {
    // Hide UI related to step counting
} else {
    // Enable feature
}
For some sensors, especially those that represent major hardware capabilities, there are also manifest level feature declarations. These are added in AndroidManifest.xml using <uses-feature>. While this is part of manifest configuration, it is closely related to sensor availability.
If you declare a feature as required, devices that do not have it cannot install your app from Google Play. If it is optional, you must still check at runtime if the sensor is present before you use it.
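For example, to advertise that your app uses the accelerometer but still allow installation on devices without one, a declaration along these lines can be added to the manifest (the feature name follows the standard android.hardware.* convention):

```xml
<!-- In AndroidManifest.xml, inside the <manifest> element -->
<uses-feature
    android:name="android.hardware.sensor.accelerometer"
    android:required="false" />
```

Setting android:required="false" keeps your app installable everywhere, at the cost of having to guard the feature with a runtime null check as shown above.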
Many sensors do not require runtime permissions. They are considered safe from a privacy perspective because they do not directly identify the user or provide direct location. However, sensor data can sometimes be combined to infer sensitive information. You should still use sensors responsibly and only when needed.
Power Consumption and Best Practices
Sensors consume power, both in the sensor chip and the CPU that processes events. Some sensors are relatively cheap, such as the light sensor, while others, such as the GPS receiver, are much more expensive. Although GPS is discussed separately under location, the same idea applies to all sensors. Use them at the lowest update rate that still gives a good user experience and turn them off when you do not need them.
Placing registration in onResume and unregistration in onPause is a simple but powerful pattern. For even more fine-grained control, you can register and unregister around specific UI elements or states where you show sensor-based interaction. For example, you might only listen for accelerometer updates while a particular animation is visible.
You can also use one time or short lived readings. For example, to calibrate something using the light sensor you might register a listener, capture the first value or a short average, then immediately unregister.
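A sketch of such a short-lived reading, where the listener unregisters itself after the first value arrives (the function name and callback shape are illustrative):

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Reads a single ambient light value, then stops listening immediately.
fun readLightOnce(sensorManager: SensorManager, onReading: (Float) -> Unit) {
    val lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT) ?: return
    val listener = object : SensorEventListener {
        override fun onSensorChanged(event: SensorEvent?) {
            event?.let {
                onReading(it.values[0])
                sensorManager.unregisterListener(this)
            }
        }

        override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
    }
    sensorManager.registerListener(listener, lightSensor, SensorManager.SENSOR_DELAY_NORMAL)
}
```

Because the listener removes itself inside the callback, the sensor stays powered only for the brief moment it is actually needed.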
You should be careful with SENSOR_DELAY_FASTEST. This setting delivers data as quickly as possible, which may exceed what your app actually needs. Unless you have a specific requirement such as detailed motion tracking in a game, use a slower rate to reduce battery impact.
If you perform heavy calculations inside onSensorChanged, such as complex physics simulation or filtering, you can move those calculations to a background thread. Be careful, however, with synchronization and UI updates. Only update UI elements from the main thread.
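One way to keep sensor callbacks off the main thread is the registerListener overload that accepts a Handler. A sketch using a HandlerThread, assuming that standard overload:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Handler
import android.os.HandlerThread

// Delivers sensor events on a dedicated background thread instead of the main thread.
fun registerOnBackgroundThread(
    sensorManager: SensorManager,
    sensor: Sensor,
    listener: SensorEventListener
): HandlerThread {
    val thread = HandlerThread("SensorThread").apply { start() }
    val handler = Handler(thread.looper)
    sensorManager.registerListener(listener, sensor, SensorManager.SENSOR_DELAY_GAME, handler)
    return thread  // Remember to unregister the listener and quit() this thread when done
}
```

With this setup, onSensorChanged runs on the background thread, so any UI changes derived from the data must be posted back to the main thread.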
Simple Example: Tilt Controlled UI Element
As a concrete example, imagine an app where moving the phone left and right moves an object on the screen. You can implement this with the accelerometer.
First, you register a listener as shown earlier. In onSensorChanged, you use the X axis acceleration to infer horizontal tilt. A basic approach is to treat positive values as tilt to one side and negative as tilt to the other.
override fun onSensorChanged(event: SensorEvent?) {
    if (event?.sensor?.type == Sensor.TYPE_ACCELEROMETER) {
        val ax = event.values[0]
        val tilt = -ax / SensorManager.GRAVITY_EARTH

        // Clamp tilt to [-1, 1]
        val clampedTilt = tilt.coerceIn(-1f, 1f)

        // Map tilt to a horizontal position in your view,
        // for example by updating a custom view or calling a method
    }
}
In this snippet, dividing by SensorManager.GRAVITY_EARTH normalizes the acceleration to a fraction of gravity. A value around 1 or -1 corresponds to strong tilt to one side. This is not a full physics simulation but is enough to create a responsive, intuitive control for many casual interactions.
With more advanced techniques, such as combining sensors and using filters, you can create smoother, more accurate controls. For a beginner-friendly project, simple normalization and clamping like this often provides a satisfying result with minimal complexity.
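Mapping the normalized tilt onto the screen is then just linear interpolation across the available width. A pure-Kotlin sketch, with illustrative names:

```kotlin
// Maps a tilt in [-1, 1] to an x coordinate in [0, width]:
// -1 maps to the left edge, 0 to the center, 1 to the right edge.
fun tiltToX(tilt: Float, width: Float): Float {
    val clamped = tilt.coerceIn(-1f, 1f)
    return (clamped + 1f) / 2f * width
}
```

Feeding this the clamped tilt from the handler above yields a screen position you can assign to a view, for example via translationX.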
Testing and Debugging Sensor Features
Testing sensor features can be trickier than testing regular UI code because they depend on physical movements or environmental conditions. Most Android emulators provide some basic virtual sensor controls. For example, you can simulate accelerometer and orientation changes from the emulator extended controls window.
However, for realistic behavior and to test all available sensors on a device, you should run your app on real hardware. Different devices may have different sensor sensitivity, ranges, and noise characteristics. Test on at least one physical device for anything that relies heavily on sensors.
Adding temporary log statements inside onSensorChanged can help you understand what value ranges to expect. For example, you can log accelerometer readings while you gently tilt or shake the device to decide on appropriate thresholds for your app logic.
Finally, when testing, verify that your app behaves reasonably when sensors are missing or disabled, for example on a tablet without a gyroscope or a device without a step counter. Provide fallbacks or hide features gracefully, and avoid crashes caused by unguarded assumptions about sensor availability.