Is there some form of temperature/RH compensation regarding the CO2 measurement?
I just deployed an indoor and an outdoor unit today, so I am just getting familiar with the possibilities and trying to come up with meaningful workflows.
This is what I experienced:
0. I calibrated the indoor unit's CO2 measurement (left it outside for 20 minutes, clicked the calibrate button, the CO2 value went to 400), then brought the device back into the house. Outside: 7 °C.
- Inside, the device immediately showed a 2000 ppm CO2 level and still a low temperature. Within 5-10 minutes the displayed temperature converged to the actual 21 °C, and the displayed CO2 dropped to around 1150 ppm.
- I opened all the windows. CO2 very quickly dropped to 450 ppm, while the displayed temperature decreased much more slowly. As the temperature fell, the CO2 reading started to increase(!) to roughly 500 ppm, and from there it slowly decreased again.
- I closed the windows. Both temperature and CO2 rose gradually. It took about 30 minutes for the displayed temperature to reach the normal indoor 21 °C (I guess due to the combined inertia of the heating and the sensor), and CO2 climbed at a similar rate, so it was back at the original 1150 ppm just 30 minutes after thorough ventilation.
So it seems the displayed CO2 level is the same 30 minutes after ventilation as it was before.
Based on this, I suspect the CO2 reading is somehow compensated using the temperature/RH measurement, and that the different inertia of the different sensors may have caused what I observed.
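To check whether a lag mismatch could even produce such a bump, here is a toy simulation of one *possible* mechanism: an NDIR cell measures molecules per volume, and firmware converts that to ppm using its own (lagging) temperature reading. All of the numbers and the compensation formula below are my assumptions, not anything taken from the actual device:

```python
# Toy model of one possible mechanism for the post-ventilation CO2 bump.
# Time constants, the ideal-gas correction, and the 12 C cooling target are
# all assumed, not taken from the real device.

def lag(target, current, tau, dt=1.0):
    """One Euler step of a first-order sensor response (tau in minutes)."""
    return current + (target - current) * dt / tau

true_temp = 21.0      # room temperature just before the windows open
temp_reading = 21.0   # what the temperature sensor reports
co2_raw = 1150.0      # ppm seen by the (fast) CO2 cell
shown_values = []

for minute in range(30):
    true_temp = lag(12.0, true_temp, tau=20.0)            # room cools slowly
    co2_raw = lag(450.0, co2_raw, tau=1.0)                # CO2 flushed almost at once
    temp_reading = lag(true_temp, temp_reading, tau=8.0)  # temp sensor lags the room
    # Hypothetical ideal-gas-style compensation: density-derived ppm scaled by
    # the reported absolute temperature instead of the true one.
    shown = co2_raw * (temp_reading + 273.15) / (true_temp + 273.15)
    shown_values.append(shown)

peak = max(shown_values)
print(f"shown CO2: starts {shown_values[0]:.0f} ppm, peaks {peak:.0f} ppm "
      f"at minute {shown_values.index(peak)}, ends {shown_values[-1]:.0f} ppm")
```

This reproduces the shape I saw (fast drop, a rise while the temperature reading catches up, then slow decay), though the bump in this toy is much smaller than the ~50 ppm I observed, so pure density compensation alone probably isn't the whole story.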
I would like to understand this mechanism well enough to schedule the start and end of ventilation from the displayed values, if that is possible, so I can keep the indoor CO2 relatively low with the minimum amount of ventilation.
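In case it helps frame the question: what I have in mind is something like a hysteresis rule (start ventilating above an upper CO2 threshold, stop below a lower one) that also ignores readings for a settling period after a large temperature swing, so the transient above doesn't trigger spurious decisions. All thresholds and names here are my own placeholders, not device features:

```python
# Hypothetical hysteresis scheduler for window ventilation. Thresholds and the
# settling hold-off are assumptions; samples are assumed to arrive once a minute.

CO2_HIGH = 1000    # ppm: start ventilating above this
CO2_LOW = 600      # ppm: stop ventilating below this
SETTLE_MIN = 10    # minutes to distrust CO2 after a big temperature change
TEMP_JUMP = 2.0    # deg C change per sample treated as "big"

def decide(co2_ppm, temp_c, state):
    """Return (ventilate?, new_state); state = (ventilating, last_temp, hold)."""
    ventilating, last_temp, hold = state
    if last_temp is not None and abs(temp_c - last_temp) >= TEMP_JUMP:
        hold = SETTLE_MIN          # temperature swinging: CO2 readings suspect
    elif hold > 0:
        hold -= 1                  # count down one sample toward settled
    if hold == 0:                  # only act on settled readings
        if co2_ppm >= CO2_HIGH:
            ventilating = True
        elif co2_ppm <= CO2_LOW:
            ventilating = False
    return ventilating, (ventilating, temp_c, hold)

# Example run: rising CO2 latches ventilation on until CO2 falls below CO2_LOW.
decisions = []
state = (False, None, 0)
for co2, temp in [(800, 21.0), (1050, 21.0), (900, 20.5), (550, 19.5)]:
    vent, state = decide(co2, temp, state)
    decisions.append(vent)
```

The two-threshold latch avoids rapid open/close cycling around a single setpoint; whether the hold-off is actually needed depends on how the device really compensates, which is what I am asking about.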