Accurate, reliable sensing of position, orientation, and movement has emerged as a vital requirement across diverse applications in segments including consumer, industrial, and mil/aero. To deliver this complex data, suitable sensor systems need to integrate results from more than a basic gyroscopic sensor or accelerometer.
Getting accurate readings from these individual sensors is already inherently difficult, yet system designers must go further and merge the output of multiple sensing elements through sensor fusion. The complexity of implementing sensor fusion, however, can easily stall system development.
To help with this, designers can turn to two orientation sensor devices from Bosch Sensortec. These devices offer a simpler alternative to custom-built sensor fusion designs, speeding development with off-the-shelf solutions to sensor-fusion functionality.
The need for sensor fusion
The ability to sense orientation and movement plays a fundamental role in applications that rely on virtual or physical movement. Smartphones rely on it for tasks as simple as switching between portrait and landscape display, all the way up to complex inertial navigation applications that can operate without GPS assistance. Beyond smartphones and many other consumer and industrial Internet of Things (IoT) products, medical and automotive applications increasingly require some degree of orientation awareness. This presents designers with an opportunity to differentiate their designs, though there is a learning curve in implementing an orientation-aware, sensor-fusion design.
At the hardware level, the underlying sensing systems use accelerometers to detect motion, gyroscopes to sense rotation, and magnetometers to identify heading. Ideally, these sensors can provide all the required information needed to determine orientation, location and heading.
In practice, however, each type of sensor exhibits serious limitations in its ability to deliver the necessary data. Accelerometers are noisy due to their high sensitivity. Gyroscopes drift over time and therefore cannot provide absolute rotational data. Magnetometers respond to any magnetic field and can generate anomalous results from magnetic materials found in a given setting.
Further, none of these sensor types are able to unambiguously measure more complex movements such as yaw, much less identify the sensor's absolute orientation with respect to the earth. These more complex results require the combination of sensor data in a process called sensor fusion.
Sensor fusion methods
Sensor fusion merges data from multiple sensors to produce results that cannot be derived, or can be derived only poorly, from any single sensor. Specialists in orientation and inertial navigation employ a broad range of sensor fusion algorithms designed for specific classes of applications. The specifics of these algorithms are beyond the scope of this article, but each seeks to optimally merge raw sensor data streams that are statically or dynamically weighted by characteristics such as sensor noise and accuracy. The result is a mathematical representation of orientation and movement using abstractions such as Euler angles or quaternions.
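As a simple illustration of this weighting idea, a complementary filter blends a gyroscope's integrated rate (accurate over short intervals, but drifting) with an accelerometer-derived angle (noisy sample to sample, but stable on average). The sketch below is a minimal illustration of the principle, not one of the production algorithms discussed here; the function name and the fixed weighting factor are assumptions for demonstration.

```c
#include <assert.h>
#include <math.h>

/* Illustrative complementary filter for one axis: blends the
 * gyro-integrated angle (trustworthy short-term) with the
 * accelerometer-derived angle (trustworthy long-term) using a
 * fixed weight alpha, typically close to 1.0. */
static double complementary_pitch(double prev_pitch_deg,
                                  double gyro_rate_dps,
                                  double accel_pitch_deg,
                                  double dt_s,
                                  double alpha)
{
    double gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt_s;
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch_deg;
}
```

Calling this once per sample period suppresses both gyro drift (the accelerometer term slowly corrects it) and accelerometer noise (heavily attenuated by the small 1 − alpha weight); production libraries apply the same principle with dynamically estimated weights, typically in quaternion form.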
Fortunately, developers need not become experts in sensor fusion algorithms to exploit their functionality. In building sensor fusion applications, developers can use available software solutions such as the NXP Semiconductors sensor fusion library. Designed to run on MCUs such as NXP's Kinetis K20, the NXP software can combine sensor data acquired using the MCU's integrated analog signal chain, which comprises programmable gain amplifiers (PGAs), comparators, and analog-to-digital converters (ADCs).
This MCU-based approach offers great flexibility in meeting specific application requirements. Developers without experience in sensor fusion theory can develop optimized systems using the available library. Experts looking to implement more specialized algorithms can substitute their own code in the target application. Even so, all developers face significant challenges in the design of the front-end sensor system itself.
Regardless of the algorithm, the accuracy of sensor fusion results depends largely on the underlying sensor design. As a basic requirement of sensor fusion, sensor measurements need to be synchronized closely enough to meet application requirements for temporal resolution. Issues such as the physical layout of the target sensors can impact synchronization, particularly for applications where a sensor is far removed from the sensor-processing device. In these cases, different timing paths between sensors and their respective signal-processing chains can introduce a systematic synchronization error. Although developers can account for these differences, an approach based on integrated sensors eliminates this problem.
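One common way to account for such timing offsets in software is to resample one sensor's stream at another sensor's sample instants. The sketch below shows the linear interpolation step, under the assumption that each sample carries a timestamp; the function name is illustrative, not from any particular library.

```c
#include <assert.h>
#include <math.h>

/* Linearly interpolate between two timestamped samples (t0, v0) and
 * (t1, v1) to estimate the value at time t. This aligns a slower
 * sensor's readings with a faster sensor's sample instants before
 * the fused measurements are combined. */
static double interp_sample(double t0, double v0,
                            double t1, double v1,
                            double t)
{
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0);
}
```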
Integrated sensor modules simplify fusion
Integrated sensor devices place each target sensor on the same module, removing any practical concerns about different timing paths. Further, by using such a device, developers can rely on the designer of the sensor module to have already minimized noise sources or other design factors that can erode sensor accuracy. In fact, Bosch Sensortec takes this approach a step further with its BMF055 9-axis orientation sensor. This system-in-package (SiP) device integrates an Atmel ATSAMD20J18A 32-bit MCU with sensors that are largely equivalent to its BMA280 accelerometer, BMG160 gyroscope, and BMM150 geomagnetic sensor (Figure 1). (Note that the BMF055's sensors differ in some performance values with their standalone equivalents.)
Figure 1: The Bosch Sensortec BMF055 combines sensors with an Atmel Cortex-M0+-based MCU to collect raw sensor data and perform sensor fusion, simplifying design of sensor systems for orientation and inertial measurement applications. (Image: Bosch Sensortec)
Based on the ARM® Cortex®-M0+ core, the built-in Atmel ATSAMD20J18A MCU integrates 32 Kbytes of SRAM and 256 Kbytes of flash memory. It serves as the local host, acquiring raw sensor data through an SPI bus and executing sensor fusion software algorithms within the module. In turn, the Atmel MCU communicates with an external host through a USART interface to transfer final sensor fusion results.
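Raw samples typically arrive over the SPI bus as separate high and low register bytes that the host MCU must reassemble into signed values before running the fusion algorithm. A minimal sketch of that common step follows; the function name is an assumption, and the exact alignment varies per sensor.

```c
#include <assert.h>
#include <stdint.h>

/* Reassemble a signed 16-bit sample from the MSB and LSB register
 * bytes read over the bus. Sensors with narrower data words (for
 * example, a 14-bit accelerometer channel) additionally left-align
 * the result, so a further arithmetic shift may be required. */
static int16_t combine_sample(uint8_t msb, uint8_t lsb)
{
    return (int16_t)(((uint16_t)msb << 8) | (uint16_t)lsb);
}
```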
Hardware design is straightforward. The BMF055 requires only an external 32 kHz crystal and capacitors to complete a sensor fusion design (Figure 2). In fact, the Bosch Sensortec BMF055 evaluation kit provides a simple, ready-to-use development platform, combining a breakout board with a small board that includes the BMF055 and all necessary components.
Figure 2: The Bosch Sensortec BMF055 requires few additional components to implement an orientation sensor system and provides serial interfaces for development and communication of sensor fusion results to a host system. (Image: Bosch Sensortec)
The BMF055 SiP removes significant barriers to hardware implementation of sensor fusion designs. Developers who need to write their own sensor fusion algorithms can use the BMF055 as an integrated replacement, rather than create their own MCU-based sensor fusion design. In fact, Bosch Sensortec provides its BSX-Lite sensor fusion library within a layered architecture based on the Atmel software framework (ASF).
The software package exposes a series of APIs at each layer for access to the BSX-Lite library, sensor drivers, and underlying ASF drivers (Figure 3). The actual runtime code resides in packaged libraries delivered with the distribution. Developers can quickly build their applications on the provided stack, substituting their own proprietary sensor fusion library as needed for specific application requirements.
Figure 3: Bosch Sensortec provides a sensor fusion software package that provides API access to the BSX-Lite sensor fusion library, sensors, and the Atmel software framework (ASF). (Image: Bosch Sensortec)
The Bosch Sensortec software package even includes sample code that demonstrates high-level calls for executing various device operations (Code Listing). The software provides classes for each sensor, so reading data from a particular sensor is as simple as calling the appropriate method for an instance of the respective class. Low-level routines in sensor support libraries perform the necessary bus reads, executed by the embedded MCU to access sensor device registers through the SiP module's internal SPI bus.
void bmf055_sensors_initialize(void)
{
    /* Initialize BMA280 */
    /* BMA settings for running BSXLite: Range = 2G, BW = 62.5Hz */

    /* Initialize BMG160 */
    /* BMG settings for running BSXLite: Range = 500dps, BW = 64Hz */

    /* Initialize BMM150 */
    /* BMM settings for running BSXLite: Preset mode = Regular, Functional state = Forced mode */
}
Code Listing. The BMF055 software package includes sample routines illustrating the use of high-level routines for operations such as initialization of the device's three integrated sensors. (Code source: Atmel/Bosch Sensortec)
With its full programmability, the BMF055 offers an effective solution for applications that require custom functionality and even specialized sensor fusion calculations. For developers looking for a quick drop-in solution, the Bosch Sensortec BNO055 integrates sensor-fusion firmware with sensors and the MCU to directly output higher-level information generated by the firmware. The BNO055 uses a register-based approach that allows a host to acquire final results including acceleration, linear acceleration, gravity vector, magnetic field strength, angular rate, temperature, and orientation in Euler angles or quaternions.
For hardware integration, the module provides I2C and UART ports for connection to a host and requires the same basic crystal and capacitor complement as the BMF055. As with the BMF055, Bosch Sensortec also provides a BNO055 development board that includes the device and all necessary components.
Because the BNO055 performs sensor fusion calculations and delivers the final results, the software interface is relatively simple. A basic device driver handles the bus reads and writes required for the external host to access the device through the I2C or UART hardware interface.
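As an illustration of that register-based access, the sketch below reads the BNO055's Euler heading registers and scales the result to degrees. The register address (0x1A for the heading LSB) and the scale factor (16 LSB per degree in the default units) follow the BNO055 datasheet, but the bus-read function here is a stand-in for a platform's actual I2C driver, returning fixed bytes so the example is self-contained.

```c
#include <assert.h>
#include <stdint.h>

#define BNO055_EUL_HEADING_LSB  0x1A  /* Euler heading data register, LSB first */
#define BNO055_EUL_LSB_PER_DEG  16.0  /* default Euler units: 16 LSB = 1 degree */

/* Stand-in for a platform I2C register read; a real driver would
 * perform the bus transaction here. This mock returns a fixed raw
 * heading of 720 LSB (0x02D0) for demonstration. */
static uint8_t i2c_read_reg(uint8_t reg)
{
    return (reg == BNO055_EUL_HEADING_LSB) ? 0xD0 : 0x02;
}

/* Read the 16-bit heading register pair and convert to degrees. */
static double bno055_read_heading_deg(void)
{
    uint8_t lsb = i2c_read_reg(BNO055_EUL_HEADING_LSB);
    uint8_t msb = i2c_read_reg(BNO055_EUL_HEADING_LSB + 1);
    int16_t raw = (int16_t)(((uint16_t)msb << 8) | lsb);
    return raw / BNO055_EUL_LSB_PER_DEG;
}
```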
Low-level software routines access the BNO055's dedicated registers to retrieve specific sensor fusion results. For example, the driver routine bno055_read_accel_xyz() reads raw acceleration data, and the bno055_convert_float_accel_xyz_msq() function converts the data to floating-point values in units of m/s².
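The conversion itself is simple scaling: in the BNO055's default unit configuration, acceleration registers report 100 LSB per m/s², so the conversion reduces to a division. A minimal sketch of the equivalent math follows; the function name here is illustrative, not the driver's own.

```c
#include <assert.h>
#include <math.h>
#include <stdint.h>

/* In the BNO055's default unit configuration, acceleration data is
 * reported at 100 LSB per m/s^2, so converting a raw signed sample
 * to floating point is a single division. */
static float accel_raw_to_msq(int16_t raw)
{
    return (float)raw / 100.0f;
}
```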
Applications such as augmented reality, drones, and smartphones depend on the ability to determine orientation and movement. Sensor fusion provides this information based on sensors that cannot individually generate that information, or cannot do so unambiguously and rapidly. For designers, creating suitable sensor solutions presents significant challenges in both hardware design and software development.
The Bosch Sensortec BMF055 and BNO055 meet this need: the BMF055 as a fully programmable platform for custom sensor-fusion designs, and the BNO055 as a drop-in solution that delivers fused results directly.