Benjamin Cabé

Zephyr Weekly Update – I’m sensing some improvements!

Zephyr Weekly Update - June 23, 2023

Zephyr 3.4 was released last week, so new features started to flow again into the main repository for the upcoming 3.5 version of Zephyr, which is due in just about 4 months. The most notable change from this week is the addition of a new sensing subsystem, which provides a higher level of abstraction to interact with sensors and orchestrate/consolidate the data they expose (sensor fusion).

Next week, the Zephyr community will gather in Prague for the Zephyr Developer Summit (the agenda is packed!), and I hope to see many of you there!

New sensing subsystem

The new Sensing Subsystem is a high-level sensor framework which you can think of as a conductor for an “orchestra” of sensors.

Oftentimes, your high-level application will not (want to) be responsible for interacting directly with the sensor hardware; it just wants the sensor data itself (plus metadata such as timestamps, units, …), which it can now get from the sensing subsystem rather than by requesting it through the underlying, lower-level Sensor API. The sensing subsystem is responsible for orchestrating efficient polling of sensor data (if two “consumers” need temperature data at a 1-second interval, surely there’s no need for both of them to read virtually the same data from the sensor), and it can also centralize/rationalize things such as power management.
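To make the consumer model a bit more concrete, here is a minimal sketch of what an application-side consumer of the sensing subsystem might look like. It reflects my reading of the API introduced in PR #55389, so treat the exact names and signatures (sensing_get_sensors(), sensing_open_sensor(), sensing_set_config(), the SENSING_SENSOR_ATTRIBUTE_INTERVAL attribute, and the callback shape) as assumptions rather than a definitive reference; the documentation page linked above is the authoritative source.

```c
#include <zephyr/kernel.h>
#include <zephyr/sensing/sensing.h>
#include <zephyr/sensing/sensing_datatypes.h>

/* Called by the sensing subsystem whenever new data is available for this
 * consumer; the subsystem owns the polling, not the application. */
static void on_accel_data(sensing_sensor_handle_t handle, const void *buf, void *ctx)
{
	const struct sensing_sensor_value_3d_q31 *sample = buf;

	/* ... consume the (fixed-point) X/Y/Z sample ... */
	ARG_UNUSED(sample);
}

static struct sensing_callback_list callbacks = {
	.on_data_event = on_accel_data,
};

int open_accel_consumer(void)
{
	const struct sensing_sensor_info *infos;
	sensing_sensor_handle_t handle;
	int num = 0;
	int ret;

	/* Discover the sensors the subsystem exposes (physical or virtual). */
	ret = sensing_get_sensors(&num, &infos);
	if (ret != 0 || num == 0) {
		return -ENODEV;
	}

	/* Open the first one as this consumer; the subsystem arbitrates
	 * between all consumers of the same underlying sensor. */
	ret = sensing_open_sensor(&infos[0], &callbacks, &handle);
	if (ret != 0) {
		return ret;
	}

	/* Ask for a sample every 100 ms; if another consumer requests the same
	 * interval, the underlying sensor is still only polled once per period. */
	struct sensing_sensor_config config = {
		.attri = SENSING_SENSOR_ATTRIBUTE_INTERVAL,
		.interval = 100 * USEC_PER_MSEC,
	};

	return sensing_set_config(handle, &config, 1);
}
```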

What’s more, it makes it super easy to come up with virtual sensors, which are not directly bound to physical hardware/drivers but rather consume and process data from other (physical) sensors to expose “virtual” sensor streams to applications. For example, a virtual pedometer sensor could take data from a gyroscope and an accelerometer, and turn it into a sensor exposing step count data that any app could use just as if it were any other physical sensor. Of course, this also enables sensor fusion scenarios, where the same gyroscope and accelerometer could be consolidated into a virtual 6-DOF IMU.

Sensor fusion is the process of combining sensor data or data derived from disparate sources such that the resulting information has less uncertainty than would be possible when these sources were used individually.

Wikipedia
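To give a flavor of the kind of processing a virtual sensor might implement, here is a tiny, framework-agnostic sketch of a complementary filter that fuses gyroscope and accelerometer readings into a single pitch-angle estimate. The function name and the filter constant are mine, purely for illustration; they are not part of any Zephyr API.

```c
#include <math.h>

/* Weight given to the gyroscope integration vs. the accelerometer's
 * absolute (but noisy) angle estimate. Illustrative value only. */
#define FUSION_ALPHA 0.98f
#define PI_F         3.14159265f

/*
 * Fuse a gyro rate (deg/s around the X axis) and an accelerometer sample
 * (in g) into a new pitch estimate (degrees).
 *
 * prev_pitch:  previous fused estimate
 * gyro_rate_x: angular rate from the gyroscope
 * acc_y, acc_z: accelerometer components
 * dt:          time elapsed since the previous sample, in seconds
 */
static float fuse_pitch(float prev_pitch, float gyro_rate_x,
			float acc_y, float acc_z, float dt)
{
	/* Short term: integrate the gyro, which is smooth but drifts. */
	float gyro_pitch = prev_pitch + gyro_rate_x * dt;

	/* Long term: derive an absolute angle from gravity, which is noisy
	 * but does not drift. */
	float accel_pitch = atan2f(acc_y, acc_z) * 180.0f / PI_F;

	/* Complementary filter: trust the gyro short term, the accel long term. */
	return FUSION_ALPHA * gyro_pitch + (1.0f - FUSION_ALPHA) * accel_pitch;
}
```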

This new subsystem is 100% optional. In particular, if your app only needs to access “simple” sensors, you should probably keep directly using the Zephyr Sensors API.
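For reference, going through the Sensor API directly looks something like the snippet below. It is a minimal sketch: I am assuming a BME280 temperature sensor declared in the devicetree, but any sensor driver follows the same fetch/get pattern.

```c
#include <zephyr/kernel.h>
#include <zephyr/device.h>
#include <zephyr/drivers/sensor.h>
#include <zephyr/sys/printk.h>

int read_temperature(void)
{
	/* Grab the (only) bosch,bme280 node from the devicetree. */
	const struct device *dev = DEVICE_DT_GET_ANY(bosch_bme280);
	struct sensor_value temp;

	if (dev == NULL || !device_is_ready(dev)) {
		return -ENODEV;
	}

	/* The application drives the sensor directly: trigger a measurement,
	 * then read the channel it cares about. */
	sensor_sample_fetch(dev);
	sensor_channel_get(dev, SENSOR_CHAN_AMBIENT_TEMP, &temp);

	printk("Temperature: %d.%06d °C\n", temp.val1, temp.val2);

	return 0;
}
```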

Architecture overview of the new Zephyr sensing subsystem.

See the main documentation page already linked above, and PR #55389, for more details. There are a couple of additional pull requests already lined up that will enrich the subsystem in the coming weeks.

SoCs

Boards & shields

Nuvoton NuMaker PFM M467

Drivers

Bosch BMM150 Magnetometer

Miscellaneous


A big thank you to the 15 individuals who had their first pull request accepted this week, 💙 🙌: @cyliangtw, @donatieng, @MSEEHenrik, @supperthomas, @reportingsjr, @Nicolas62x, @jmontgomery-cruise, @ZYNQHRONIZE, @MarkoSagadin, @p-woj, @tenllado, @msmttchr, @ghu0510, @zapol, and @trustngotech.

As always please feel free to jump in with your thoughts or questions in the comments below. See you next week!

If you enjoyed this article, don’t forget to subscribe to this blog to be notified of upcoming publications! And of course, you can also always find me on Twitter and Mastodon.

Catch up on all the editions of the Zephyr Weekly Update:
