What is the difference between batch_encode_plus() and encode_plus()
I am working on a project using the T5 Transformer and have read the documentation for the T5 model. While using T5Tokenizer, I am confused about how to tokenize my sentences.
Can someone please help me understand the difference between batch_encode_plus() and encode_plus(), and when I should use each of these methods?
Topic transformer nlg transfer-learning nlp
Category Data Science