Any books or resources on how to approach "purely synthetic" descriptions of physical phenomena?

Again and again I find that collecting empirical data is cumbersome. Yet it is usually viewed as a necessity for explaining empirical phenomena.

But then I imagine the following:

It would be so nice if I could describe a phenomenon simply by specifying a small model with variable parameters, and then generate instances from it that reproduce the empirical phenomenon.
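To make the idea concrete, here is a toy sketch of what I mean. The phenomenon (a damped oscillation) and all parameter ranges are illustrative assumptions on my part, not fitted to any data:

```python
import random
import math

def damped_oscillation(amplitude, decay, frequency, n_steps=100, dt=0.1):
    """Generate one instance of the phenomenon from explicit parameters."""
    return [amplitude * math.exp(-decay * t * dt) * math.cos(frequency * t * dt)
            for t in range(n_steps)]

def generate_instances(n, seed=0):
    """Sample the parameters to produce a whole family of instances."""
    rng = random.Random(seed)
    instances = []
    for _ in range(n):
        params = {
            "amplitude": rng.uniform(0.5, 2.0),
            "decay": rng.uniform(0.1, 1.0),
            "frequency": rng.uniform(1.0, 5.0),
        }
        instances.append((params, damped_oscillation(**params)))
    return instances

instances = generate_instances(5)
```

The point is that the whole "data set" is implied by three parameters and their ranges, rather than collected.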

But I've been puzzled in particular by the validation phase of this, since validation usually means comparison against empirical observation.

As an example, consider being able to write down the model that a well-trained CNN would end up representing, without having to train it on a large data set. In some cases this can be approximated by designing very good features by hand, after which only a small training set is needed.
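Here is a minimal sketch of that "good features" effect, under assumptions I chose for illustration: the task (horizontal vs. vertical stripes) and the fixed, training-free features (gradient energy along each axis) are toy stand-ins, not a real CNN:

```python
import numpy as np

def make_stripes(orientation, rng):
    """Synthesize a tiny noisy striped image, 'h' or 'v'."""
    img = np.zeros((8, 8))
    if orientation == "h":
        img[::2, :] = 1.0
    else:
        img[:, ::2] = 1.0
    return img + 0.05 * rng.standard_normal((8, 8))

def features(img):
    """Fixed hand-crafted features: mean absolute gradient per axis."""
    gy = np.abs(np.diff(img, axis=0)).mean()
    gx = np.abs(np.diff(img, axis=1)).mean()
    return np.array([gy, gx])

rng = np.random.default_rng(0)
# Because the features already isolate the relevant structure,
# two training samples per class are enough for a centroid classifier.
train = {c: np.mean([features(make_stripes(c, rng)) for _ in range(2)], axis=0)
         for c in ("h", "v")}

def classify(img):
    f = features(img)
    return min(train, key=lambda c: np.linalg.norm(f - train[c]))
```

With generic pixel features the same accuracy would need far more samples; the features carry the model.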

So is there literature that explains how to formulate such very good features?

As another example, suppose you wanted a model that can simulate 90% of the ways in which a tree may grow (i.e. the shapes it may form), instead of collecting, say, 1000 tree samples in order to explain the shape variation.
