How often can algorithms improve on the quality of their input data, i.e. extrapolate information?
I've sometimes thought it should be possible to add data points through well-informed interpolation or random-draw procedures on top of real-world data, but I wonder how common or reasonable this is in general. The counterargument would be that the input data is bandwidth-limited, so every downstream process is necessarily confined to that resolution, and there is no way to gain genuinely new samples.
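For concreteness, here is a minimal sketch (Python/NumPy; the function name and data are my own invention) of the kind of "well-informed interpolation" I mean: drawing synthetic points on line segments between random pairs of existing samples, SMOTE-style. Note that every synthetic point is a deterministic function of the originals plus the random draw, which is exactly the crux of my question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical real-world dataset: rows are observations, columns are features.
X = rng.normal(size=(100, 3))

def interpolate_augment(X, n_new, rng):
    """Draw synthetic points on segments between random pairs of real
    samples (a SMOTE-like linear interpolation; purely illustrative)."""
    i = rng.integers(0, len(X), size=n_new)
    j = rng.integers(0, len(X), size=n_new)
    t = rng.uniform(0.0, 1.0, size=(n_new, 1))
    return X[i] + t * (X[j] - X[i])

X_aug = interpolate_augment(X, n_new=50, rng=rng)
# The augmented set has more rows, but each new point is a convex
# combination of existing ones: no information beyond the original sample.
print(X_aug.shape)  # (50, 3)
```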
Topic interpolation
Category Data Science