In a measurement process, the constant voltage output of, for example, a force transducer is sometimes logged as an offset, i.e. a bias to be subtracted later from the dynamic measurement. While the sensor is at rest, this offset is sampled at a specific sampling rate for a specific duration.
Given the sampled time series of this offset, how can we calculate the uncertainty of its mean value?
That is, what is the uncertainty of the mean value of the offset due to random noise? Is it the standard deviation or the standard error? I want to be able to say, "This is the mean value of the offset, with this standard uncertainty."
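For concreteness, here is a minimal sketch of the two candidate quantities I am deciding between. The sampling rate, duration, and offset record are hypothetical (simulated noise around a nominal offset, just for illustration):

```python
import numpy as np

# Hypothetical at-rest offset record: fs = 1 kHz, 10 s duration.
# The 0.5 V offset and 2 mV noise level are made-up numbers for illustration.
rng = np.random.default_rng(0)
fs = 1000                # sampling rate in Hz (assumed)
duration = 10.0          # record length in s (assumed)
n = int(fs * duration)   # number of samples
offset = 0.5 + 0.002 * rng.standard_normal(n)  # sampled offset in volts

mean_offset = offset.mean()
std_dev = offset.std(ddof=1)      # sample standard deviation: spread of the noise
std_err = std_dev / np.sqrt(n)    # standard error of the mean

print(f"mean offset       : {mean_offset:.6f} V")
print(f"standard deviation: {std_dev:.6f} V")
print(f"standard error    : {std_err:.6f} V")
```

Which of the last two quantities is the correct standard uncertainty to attach to `mean_offset`?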