Information

Bosch Sensortec | BME AI-Studio Documentation
Modifications reserved | Data subject to change without notice
Document number: BST-BME688-AN001-00
Please note
Be aware that including the temperature and humidity data channels needs careful attention, since the temperature
and relative humidity may not directly reflect the environmental conditions. Both values are measured inside the
metal packaging, which can be affected by self-heating of the BME688. For example, running the sensor in a
continuous cycle leads to a significant temperature increase inside the metal packaging and thus to a decrease in
the relative humidity. This introduces transient effects into the recorded data that do not originate from the actual
sample you are measuring and can therefore mislead the training of the algorithm.
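To illustrate the effect described above, the following sketch (not BME688 firmware or BME AI-Studio code) uses the Magnus approximation for saturation vapour pressure to show why a warmer sensor package reports a lower relative humidity even when the room air is unchanged. The temperatures and the 5 °C self-heating offset are illustrative assumptions.

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Saturation vapour pressure in hPa via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def rh_after_heating(rh_ambient, t_ambient, t_sensor):
    """RH seen at the (warmer) sensor, assuming constant absolute humidity."""
    vapour_pressure = rh_ambient / 100.0 * saturation_vapour_pressure(t_ambient)
    return 100.0 * vapour_pressure / saturation_vapour_pressure(t_sensor)

# Illustrative example: 25 degC room at 50 % RH, package self-heats by 5 degC.
# The RH measured inside the package drops noticeably below 50 %.
print(round(rh_after_heating(50.0, 25.0, 30.0), 1))
```

The drop is purely a consequence of the warmer measurement point, which is exactly the transient artifact that can mislead the training data.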
5.9 Neural Net
Here you can see the neural net architecture used for training the algorithm.
Choose other neural nets
In the current version of BME AI-Studio, one pre-defined neural net architecture is available. In a future release, we
plan to add the possibility of choosing other architectures.
5.10 Training Method
Choose the method used to train the algorithm. In this version of BME AI-Studio, only one optimizer function is
available (the ADAM optimizer). You can choose between different batch sizes:
ADAM optimizer, batch size 4
ADAM optimizer, batch size 16
ADAM optimizer, batch size 32
ADAM optimizer, batch size 64
The batch size determines how many data points are taken into account for one optimization step at once. This
influences the training speed and stability.
As a rule of thumb, the larger the batch size, the faster and more stable the training will be. The disadvantage of
larger batch sizes, and the reason to use smaller ones, is that the algorithm is more likely to get stuck in a local
minimum.
Please note
This is an expert setting. If you want to know more about training methods, please get in touch with us.