How can I go about building a model for a large number of outputs?
I have previously worked on small-scale feedforward neural network problems.
But I have started working on a new project where the goal is to predict air quality at 25 locations throughout the country, one day ahead. I am quite well-versed with the air quality side of things.
My question:
In a problem like this, should I develop 25 independent models (which share the same input structure), or one model with 25 outputs?
I guess what I want to know is: is there something like parallel neural networks, or is this really 25 different problems? I have mostly worked on physical models, where the physics is shared by all 25 locations and only the inputs differ.
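To make the single-model option concrete, here is a rough sketch of what I have in mind (the feature count and the data are placeholders, not my real setup; I am using scikit-learn's `MLPRegressor` only because it supports multi-output regression out of the box):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_features, n_locations = 200, 10, 25

# Placeholder data: rows are days, columns are input features
# (e.g. weather variables); targets are air quality at 25 sites.
X = rng.normal(size=(n_samples, n_features))
y = rng.normal(size=(n_samples, n_locations))

# One feedforward network whose output layer has 25 units,
# i.e. one prediction per location from a shared hidden layer.
model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X, y)

preds = model.predict(X)
print(preds.shape)  # (200, 25): one prediction per location per day
```

Is this shared-hidden-layer structure the standard way to handle it, or would 25 separate single-output networks be preferred?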
Would this be a data parallelism or a model parallelism problem?