Efficiently searching a large hyperparameter space for a subset of outputs that forms a basis of information
I have a set of inputs, call them 'I', that can be fed through a complicated group of functions to produce a wide variety of outputs, call them 'O'. I want to find a subset of outputs, 'O-prime', within 'O' that contains enough information to act as a basis for accurately reconstructing a point in the 'I'-space. In other words, I want to pick 'O-prime' such that any point in 'I' can be uniquely identified given the corresponding point in 'O-prime'. I am not sure whether such a basis exists, but I think the choice of 'O-prime' would be considered a hyperparameter.
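To state the same thing a little more formally (my own notation, assuming the outputs are real-valued): if the group of functions is $f : \mathcal{I} \to \mathbb{R}^{|O|}$ and $S \subseteq \{1, \dots, |O|\}$ indexes a candidate 'O-prime', then

$$ f_S(x) = \big(f_j(x)\big)_{j \in S}, $$

and 'O-prime' works as a basis in my sense if $f_S$ is injective on $\mathcal{I}$, i.e. there exists a reconstruction map $g$ with $g(f_S(x)) = x$ for every $x \in \mathcal{I}$. The hyperparameter I am searching over is the index set $S$.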
Naively, to search for 'O-prime' I would try a number of candidate subsets and see which gives the lowest L2 loss when reconstructing 'I' in training, on the presumption that candidates for 'O-prime' that do not carry sufficient information (do not form a basis) would perform poorly compared to those that do. The problem is that the space of outputs 'O' is rather large, so finding, or ruling out the existence of, an 'O-prime' by brute force looks intractable: assuming 'I' and 'O-prime' have the same dimensionality, the applications I am interested in involve something on the order of ${95 \choose 19}$ possible candidates.
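For concreteness, here is roughly what I mean by the naive approach, as a minimal sketch in Python with scikit-learn on a made-up toy version of my problem (the shapes, the fake output map, and sampling only 20 random candidates are all placeholders, not my real setup):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, dim_i, dim_o = 2000, 19, 95   # toy sizes matching the 95-choose-19 example

I = rng.normal(size=(n_samples, dim_i))              # points in the input space
O = np.tanh(I @ rng.normal(size=(dim_i, dim_o)))     # stand-in for the complicated group of functions

def reconstruction_loss(subset):
    """Fit a regressor from the chosen output columns back to I; return test-set MSE."""
    X_tr, X_te, y_tr, y_te = train_test_split(O[:, list(subset)], I, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    return np.mean((model.fit(X_tr, y_tr).predict(X_te) - y_te) ** 2)

# Enumerating all C(95, 19) subsets is intractable, so even the naive approach
# can only sample a handful of candidate 'O-prime' subsets at random:
candidates = [tuple(sorted(rng.choice(dim_o, size=dim_i, replace=False))) for _ in range(20)]
scores = {c: reconstruction_loss(c) for c in candidates}
best = min(scores, key=scores.get)
print("best sampled subset:", best, "loss:", scores[best])
```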
Is there a way to use machine learning to search through these options efficiently? For instance, I know that with trees/classification tasks there are ways to rank how much each variable contributes to determining the correct outcome. Is there a similar methodology for reconstructing numerical values with a neural network? If not, what are some more sophisticated search methods I could use instead of brute-force searching?
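To make the tree analogy concrete, this is the kind of variable ranking I have in mind, sketched on the same kind of toy data (again a hypothetical setup, and I don't know whether an importance ranking like this is actually a sound way to pick 'O-prime' for a reconstruction problem):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_samples, dim_i, dim_o = 2000, 19, 95
I = rng.normal(size=(n_samples, dim_i))
O = np.tanh(I @ rng.normal(size=(dim_i, dim_o)))     # stand-in for the real outputs

# Fit a multi-output forest mapping the full O back to I, rank the 95 output
# columns by impurity-based feature importance, and take the top 19 as a
# candidate 'O-prime'.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(O, I)
ranking = np.argsort(forest.feature_importances_)[::-1]
o_prime_candidate = np.sort(ranking[:dim_i])
print("candidate O-prime (column indices of O):", o_prime_candidate)
```

The idea would be to evaluate only the top-ranked candidate(s) with a reconstruction loss like the one above, rather than enumerating all ${95 \choose 19}$ subsets.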