How to find the average lag time between two time series, with variance and a confidence level
I have two variables recorded as time series, one driven by the other. I would like to find the average time delay before the dependent variable responds to the independent variable. Additionally, I would like to estimate the variance associated with that lag and a confidence level for the estimate. I am unsure how to go about this in a statistically valid way, but I am using Python.
Currently I have used

    np.diff(np.sign(np.diff(df)))

to isolate the relative minima and maxima of each series, so that I can measure the time gap between corresponding pairs of extrema, but that doesn't seem very valid to me -- thoughts? The output is an array like [0, -2, 0, 2, 0, 0, -2], where -2 marks a relative maximum (the slope flips from positive to negative) and 2 marks a relative minimum.
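For concreteness, here is a minimal, self-contained sketch of what I am doing; the data x and y and the helper turning_points are placeholder names made up for illustration, not my real data:

    import numpy as np

    # Placeholder data: y is constructed to lag x by roughly 5 samples.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 20, 500)
    x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
    y = np.roll(x, 5)

    def turning_points(series):
        # Second difference of the sign of the first difference:
        # -2 where the slope flips + to - (relative max),
        # +2 where it flips - to + (relative min).
        curvature = np.diff(np.sign(np.diff(series)))
        maxima = np.where(curvature == -2)[0] + 1
        minima = np.where(curvature == 2)[0] + 1
        return minima, maxima

    x_min, x_max = turning_points(x)
    y_min, y_max = turning_points(y)

    # Naive lag estimate: gap between each x peak and the nearest later y peak.
    lags = [y_max[y_max >= i][0] - i for i in x_max if np.any(y_max >= i)]
    print(np.mean(lags), np.std(lags, ddof=1))

This is roughly how I arrive at the ±2 array above; my concern is whether pairing extrema like this is a defensible way to estimate the average lag, its variance, and a confidence interval.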
Methodological pointers would be greatly appreciated.
Thank you for your time, and stay safe!
All the best, RS
Tags: numpy, variance, python, statistics
Category: Data Science