predict parameters of linear function
My question seems very trivial, but I can't quite grasp it. I am also aware this post asks for opinions and know-how, but I do not know where else to ask. I have quite a lot of experience solving even somewhat difficult machine learning problems, but I have never faced a situation where the quantities I want to predict are themselves parameters of another function.
I have a function like:
t = ax + by + cz + bias.
And a database with t, x, y, z, m, n, o, p, etc. I expect the optimal a, b, c, etc. to be functions of x, y, z, m, n, o, p. But since I do not know a, b, c, etc. and only have t as the label/target, most machine learning methods from scikit-learn do not work out of the box.
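To make that concrete, what I am really after is something along the lines of

t = a(x, y, z, m, n, o, p) * x + b(x, y, z, m, n, o, p) * y + c(x, y, z, m, n, o, p) * z + bias,

where each coefficient is itself some (unknown) function of the columns in my database.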
I thought about first running a linear regression to find a, b, c, etc., but that loses all the information differentiating the different data points. I guess a self-coded SGD would be able to solve my issue, but I do not know where to start (a rough sketch of what I have in mind is below). I also thought about dropping my final function completely, but I want to preserve some of the information stored within the function.
For example, I do know that a is dependent on different parameters than b, and I could provide such a feature list for each coefficient. My thought is that I can reduce possible overfitting by giving as many constraints as possible, especially since my dataset is not that big.
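To show what I mean, here is a rough sketch of the kind of self-coded optimization I have in mind: plain batch gradient descent on made-up placeholder data, with each coefficient assumed (for simplicity) to be a linear function of the features, and the mask just an example of the per-coefficient feature lists. I am not sure this is a sensible way to approach it:

```python
import numpy as np

# Placeholder stand-in for my database: one row per observation.
# X holds the "inner" features x, y, z of the final function;
# F holds all features the coefficients may depend on (x, y, z, m, n, o, p);
# t is the observed target.
rng = np.random.default_rng(0)
n_samples = 200
X = rng.normal(size=(n_samples, 3))      # columns: x, y, z
extra = rng.normal(size=(n_samples, 4))  # columns: m, n, o, p
F = np.hstack([X, extra])                # features the coefficients depend on
t = rng.normal(size=n_samples)           # observed target

# Example constraint: which features each coefficient may use
# (rows: x, y, z, m, n, o, p; columns: a, b, c; 1 = allowed, 0 = forced to zero).
mask = np.array([
    [0, 0, 1],
    [0, 0, 1],
    [0, 0, 1],
    [1, 0, 1],
    [1, 0, 1],
    [0, 1, 1],
    [0, 1, 1],
])

# Assumption: each coefficient is a linear function of F,
# i.e. a_i = F_i @ W[:, 0] + w0[0], and analogously for b and c.
W = rng.normal(scale=0.1, size=(F.shape[1], 3)) * mask  # one column per coefficient
w0 = np.zeros(3)    # intercepts of a, b, c
bias = 0.0          # global bias of the outer function

lr = 1e-2
for step in range(2000):
    coeffs = F @ W + w0                       # per-row predictions of a, b, c
    pred = (coeffs * X).sum(axis=1) + bias    # t = a*x + b*y + c*z + bias
    err = pred - t

    # Gradients of 0.5 * mean squared error; the mask keeps the
    # disallowed entries of W at zero.
    grad_coeffs = err[:, None] * X
    W -= lr * mask * (F.T @ grad_coeffs) / n_samples
    w0 -= lr * grad_coeffs.mean(axis=0)
    bias -= lr * err.mean()
```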
Topic parameter-estimation
Category Data Science