Is there a safe and simple way to estimate a standard deviation for the next subset?
I receive only a standard deviation of a value $v$ (which is, by the way, normally distributed) from a sensor every 4 minutes, but I need to provide a standard deviation $\sigma$ for each 15-minute interval. Is there a safe way to do this?
Two approaches came to mind:
1) One safe way is to take the mean and generate possible values for the 15-minute period (15*60 values) using the standard deviation of the 4-minute interval, then calculate $\sigma$ over this simulated period (a sketch of both approaches follows the list).
2) Alternatively, one can naively estimate the $\sigma$ of the next time interval from the two previous values. For example, use $\sigma_{i-1}$ and $\sigma_i$ to estimate $\sigma_{i+1}$: if the standard deviation was increasing/decreasing over the previous intervals, assume it will increase/decrease by the same absolute amount in the next time interval.
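A minimal sketch of both approaches in Python, assuming hypothetical values for the last two 4-minute standard deviations and the mean (none of these numbers come from the question), and reading method 2 as extrapolating the last difference in $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs (placeholders, not real sensor data):
sigma_4min = 2.3   # sigma reported for the most recent 4-minute interval
sigma_prev = 2.1   # sigma reported for the interval before it (method 2 only)
mean_4min = 10.0   # mean of v over the 4-minute interval (method 1 only)

# Method 1: simulate one value per second over the 15-minute period
# (15*60 values) from N(mean, sigma_4min) and take the sample std.
simulated = rng.normal(loc=mean_4min, scale=sigma_4min, size=15 * 60)
sigma_15min_simulated = simulated.std(ddof=1)

# Method 2: naive extrapolation -- assume the trend between the two previous
# sigmas continues by the same absolute amount.
sigma_15min_naive = sigma_4min + (sigma_4min - sigma_prev)

print(sigma_15min_simulated, sigma_15min_naive)
```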
The first method can be time- and computationally expensive compared to the second, though the second method may suffer in precision.
Edit 16.04: Since I'm limited in the amount of data, I would preferably use only the last standard deviation and no mean data.
Edit 23.04: There is one more approach that brings me very close to the result of the first method.
Let's say $\sigma_i$ is based on $n$ observations while $\sigma_{i+1}$ is based on $k$ observations, with $k \neq n$. Then $\sigma^2_{i+1} = \frac{(n-1) \cdot \sigma^2_i \cdot \frac{k}{n}}{k-1}$.
The benefit in this case is that you are not dealing with a mean value. I suppose this solution works well only with normally distributed values.
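A small sketch of this rescaling, assuming, for illustration only, one reading per minute so that $n = 4$ and $k = 15$ (the actual counts depend on the sensor's sampling rate):

```python
import numpy as np

def rescale_sigma(sigma_i, n, k):
    """Rescale a sample standard deviation based on n observations to an
    estimate for a window of k observations, using
    sigma_{i+1}^2 = (n - 1) * sigma_i^2 * (k / n) / (k - 1)."""
    var_next = (n - 1) * sigma_i**2 * (k / n) / (k - 1)
    return np.sqrt(var_next)

# Assumed counts: e.g. one reading per minute -> n = 4 samples per 4-minute
# window and k = 15 samples per 15-minute window (a guess, not from the question).
sigma_4min = 2.3
print(rescale_sigma(sigma_4min, n=4, k=15))
```

The idea behind the formula is that $(n-1)\,\sigma^2_i$ is the sum of squared deviations over $n$ observations; scaling it by $k/n$ gives the expected sum for $k$ observations with the same per-observation spread, and dividing by $k-1$ turns that back into an unbiased sample variance.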