Handling encoding of a dataset with more than 2,000 columns in total
Whenever we have a dataset to preprocess before feeding it to a model, we convert the categorical values to numerical values, generally using techniques such as LabelEncoding, One-Hot Encoding, etc. But all of these are applied manually, going through each column one at a time.
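For example, the manual approach I'm describing looks roughly like this (the columns are just toy examples for illustration):

```python
# The usual manual workflow: encode each categorical column one by one
import pandas as pd
from sklearn.preprocessing import LabelEncoder

df = pd.DataFrame({"color": ["red", "blue", "red"],
                   "size": ["S", "M", "L"]})

# One encoder per column, fitted and applied by hand
le_color = LabelEncoder()
df["color"] = le_color.fit_transform(df["color"])

le_size = LabelEncoder()
df["size"] = le_size.fit_transform(df["size"])
```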
But what if our dataset is huge in terms of columns (e.g., 2,000 columns)? Here it won't be possible to go through each column manually, so how do we handle encoding in such cases?
Are there any specific libraries available that deal with automatic encoding of variables? I know category_encoders provides different encoding techniques, but how do we apply it under the conditions described above?
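To make the question concrete, here is a minimal sketch of the kind of bulk, programmatic encoding I have in mind, using pandas and category_encoders (the file name is hypothetical):

```python
# Encode all categorical columns at once instead of column by column
import pandas as pd
import category_encoders as ce

df = pd.read_csv("data.csv")  # hypothetical dataset with ~2,000 columns

# Select the categorical columns programmatically instead of by hand
cat_cols = df.select_dtypes(include=["object", "category"]).columns.tolist()

# category_encoders applies the chosen scheme to every listed column at once;
# with cols=None it auto-detects object-typed columns itself
encoder = ce.OrdinalEncoder(cols=cat_cols)
df_encoded = encoder.fit_transform(df)
```

Is something along these lines the recommended approach, or is there a better-suited library or pattern for this many columns?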
Topic categorical-encoding encoding
Category Data Science