What are the strategies to counter the 80/20 dilemma in Data Science projects?
Most of the time in Data Science projects is not spent on the actual analytics but rather on other tasks, such as organizing data sources, collecting samples, preparing datasets, and compiling and validating business rules in the data. This phenomenon is often described as the 80/20 dilemma in Data Science projects.
To tackle this dilemma, I would like to ask: what strategies can be used to reduce the 80% of time spent on these other stages (organizing data sources, collecting samples and preparing datasets, compiling and validating business rules in data)?
Topic: project-planning, management
Category: Data Science