1st order Taylor Series derivative calculation for autoregressive model
I wrote a blog post where I calculated the Taylor series of an autoregressive function. It is not strictly the Taylor series, but some variant (I guess). I'm mostly concerned about whether the derivatives look okay. I noticed I made a mistake and fixed the issue. It seemed simple enough, but after finding that error, I started to doubt myself.
$$f(t+1) = w_{t+1} \cdot f(t) $$
$$y^{*}_{t+1} = f(t+1) - \frac{f'(t+1)}{1!}\big(t - (t+1)\big)$$
$$y^{*}_{t+1} = w_{t+1} f(t) + \dfrac{d}{df(t)}w_{t+1}f(t) + \dfrac{d}{dw_{t+1}}w_{t+1}f(t)$$
$$y^{*}_{t+1} = w_{t+1} f(t) + w_{t+1} + f(t)$$
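As a quick sanity check, here is a small symbolic sketch with sympy; the symbols `w` and `f` are my stand-ins for $w_{t+1}$ and $f(t)$:

```python
# Minimal symbolic check of the partial derivatives, assuming sympy.
# w and f stand in for w_{t+1} and f(t).
import sympy as sp

w, f = sp.symbols('w f')
f_next = w * f                          # f(t+1) = w_{t+1} * f(t)

d_wrt_f = sp.diff(f_next, f)            # d/df(t)  [w_{t+1} f(t)] = w_{t+1}
d_wrt_w = sp.diff(f_next, w)            # d/dw_{t+1} [w_{t+1} f(t)] = f(t)

y_star = f_next + d_wrt_f + d_wrt_w
print(y_star)                           # f*w + f + w, matching the line above
```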
The details can be found in the blog post.
EDIT 7/6/20:
The AR form:
$$y^{*}_{t+1} = c + \sum_{i=0}^{L} w_{t+1-i}\, y_{t-i} + \varepsilon_{t}$$
Here $f(t)$ is a recursive dense layer, $y$ is the predicted output, $w$ are the weights, and $L$ is the number of lag components.
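For concreteness, here is a short numpy sketch of that one-step prediction; the function name and argument layout are my own choices, not from the blog post:

```python
# A sketch of the one-step AR(L) prediction above, assuming numpy arrays.
# w[i] stands in for w_{t+1-i} and y_lags[i] for y_{t-i}, with i = 0..L.
import numpy as np

def ar_predict(c, w, y_lags, eps=0.0):
    # y*_{t+1} = c + sum_{i=0}^{L} w_{t+1-i} * y_{t-i} + eps_t
    return c + np.dot(w, y_lags) + eps

# Toy usage with L = 2 (three lag terms) and made-up numbers.
print(ar_predict(c=0.5, w=np.array([0.5, 0.3, 0.2]),
                 y_lags=np.array([3.0, 2.0, 1.0])))
```

For the simple case where the next value only depends on the previous value, I got the following result.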
$$f(t+1) = w_{t+1} \cdot f(t) $$
$$y^{*}_{t+1} = f(t+1) - \frac{f'(t+1)}{1!}\big(t - (t+1)\big)$$
$$y^{*}_{t+1} = w_{t+1} f(t) + \dfrac{d}{df(t)}w_{t+1}f(t)$$
$$y^{*}_{t+1} = w_{t+1} f(t) + w_{t+1}$$
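The same sympy check as above, now differentiating only with respect to $f(t)$ (again, `w` and `f` are my stand-ins):

```python
# Symbolic check of the revised result: only the derivative w.r.t. f(t) is kept.
import sympy as sp

w, f = sp.symbols('w f')
f_next = w * f                            # f(t+1) = w_{t+1} * f(t)
y_star = f_next + sp.diff(f_next, f)      # w_{t+1} f(t) + w_{t+1}

print(sp.simplify(y_star - (w * f + w)))  # 0, so it matches the line above
```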
EDIT 7/7/20:
The function $f(t)$ represents $y(t)$ with an error term. The error term might follow some random process, but I'm going to assume that the errors are independent.
$$f(t+1) = w_{t+1} \cdot y(t) + \epsilon_t$$
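A tiny numpy simulation under that independence assumption; the values of `w`, `sigma`, and `n_steps` are arbitrary choices of mine:

```python
# Simulate f(t+1) = w * y(t) + eps_t with i.i.d. Gaussian errors.
import numpy as np

rng = np.random.default_rng(0)
w, sigma, n_steps = 0.9, 0.1, 100       # made-up parameters

y = np.empty(n_steps)
y[0] = 1.0
for t in range(n_steps - 1):
    eps_t = rng.normal(0.0, sigma)      # independent draw at every step
    y[t + 1] = w * y[t] + eps_t
```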
EDIT 7/9/20:
Changed the subscript on the weight from $w_{t+1}$ to $w_{t}$.
$$f(t+1) = w_{t} \cdot f(t) $$
$$y^{*}_{t+1} = f(t+1) - \frac{f'(t+1)}{1!}\big(t - (t+1)\big)$$
$$y^{*}_{t+1} = w_{t} f(t) + \dfrac{d}{df(t)}w_{t}f(t)$$
$$y^{*}_{t+1} = w_{t} f(t) + w_{t}$$
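Plugging made-up numbers into the final expression just to show the arithmetic (the values of $w_t$ and $f(t)$ are mine):

```python
# Concrete evaluation of y*_{t+1} = w_t * f(t) + w_t with made-up values.
w_t, f_t = 0.8, 2.0
f_next = w_t * f_t        # f(t+1) = 0.8 * 2.0 = 1.6
y_star = f_next + w_t     # y*_{t+1} = 1.6 + 0.8 = 2.4
print(f_next, y_star)
```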