How to optimize hyperparameters in BERT?
I am using a BERT model to classify stereotypes in sentences. Is there a way to automate the optimization of hyperparameters such as the number of epochs, the batch size, or the learning rate with a function similar to scikit-learn's GridSearchCV? (I don't know whether GridSearchCV can be used with a BERT model; if it can, please let me know.) I'd like to avoid testing combinations of values by hand.
I attach the part of my code where I declare the hyperparameters, in case there is a function that automates the hyperparameter search and someone can tell me where to include it.
np.random.seed(112)

from sklearn.model_selection import train_test_split

# 80/20 train/validation split
df_train, df_val = train_test_split(df_detests, test_size=0.2)

# Hyperparameters
EPOCHS = 1
BATCH = 16
LR = 1e-6

model = BETOClassifier()
train(model, df_train, df_val, LR, EPOCHS, BATCH)

df_test = test
evaluate(model, df_test, BATCH, False)
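For context, since a custom PyTorch training loop like `train()` above does not plug into GridSearchCV directly, what I have tried so far is a manual grid search: loop over every combination of candidate values and keep the one with the best validation score. Here is a minimal, runnable sketch of that idea; `train_and_score` is a hypothetical placeholder that in my real pipeline would train `BETOClassifier` and return a validation metric:

```python
from itertools import product

# Hypothetical stand-in for a train-then-validate step: it maps a
# (lr, epochs, batch) combination to a score. In the real pipeline this
# would call train(model, df_train, df_val, lr, epochs, batch) and
# return a validation metric such as accuracy or F1.
def train_and_score(lr, epochs, batch):
    # Toy score so the loop is runnable on its own.
    return -abs(lr - 1e-5) + 0.01 * epochs - 0.001 * batch

# Candidate values for each hyperparameter.
grid = {
    "lr": [1e-6, 1e-5, 5e-5],
    "epochs": [1, 2, 3],
    "batch": [8, 16],
}

best_score, best_params = float("-inf"), None
for lr, epochs, batch in product(grid["lr"], grid["epochs"], grid["batch"]):
    score = train_and_score(lr, epochs, batch)
    if score > best_score:
        best_score, best_params = score, (lr, epochs, batch)

print(best_params)  # best (lr, epochs, batch) combination found
```

This is essentially what GridSearchCV does internally, just without the estimator wrapper, so I'm wondering if there is a cleaner, more automated alternative.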
I would also like to know whether it is possible to extract metrics such as the F1 score from the BERT model.
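For the metrics part, what I imagine (please correct me if there is a better way) is collecting the model's predicted labels on the test set and passing them, together with the true labels, to scikit-learn. A small sketch with toy labels standing in for the real test-set output:

```python
from sklearn.metrics import f1_score

# Toy true labels and predictions; in the real pipeline y_pred would come
# from the model's argmax over the logits inside evaluate().
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]

f1 = f1_score(y_true, y_pred)  # binary F1 on the positive class
print(round(f1, 2))  # 0.67
```

My doubt is whether the labels have to be gathered this way from inside `evaluate`, or whether there is a built-in mechanism for this.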
Topic bert transformer deep-learning nlp python
Category Data Science