Fine-tuning BERT for text summarization

I was trying to follow this notebook to fine-tune BERT for the text summarization task. Everything was fine until I came to this instruction in the Evaluation section to evaluate my model:

model = EncoderDecoderModel.from_pretrained("checkpoint-500")

An error appears:

OSError: checkpoint-500 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.

I really can't understand what this error means or how to solve it. I'm very much a beginner.
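
For context, this is roughly how I understand the checkpoint is supposed to be loaded, as a minimal sketch: from_pretrained should accept either a Hub model id or a path to a local folder saved by the Trainer. The "./checkpoint-500" path and the cuda device below are my assumptions (the real location depends on the output_dir in the TrainingArguments), not something stated in the notebook:

from transformers import EncoderDecoderModel

# from_pretrained accepts a Hub model id or a local folder path;
# "./checkpoint-500" assumes the Trainer saved its checkpoint into the
# current working directory (adjust to match output_dir).
model = EncoderDecoderModel.from_pretrained("./checkpoint-500")
model.to("cuda")  # optional: move the model to GPU for generation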

Also, I tried this instruction:

trainer.predict(test_data)

output:

The following columns in the test set  don't have a corresponding argument in `EncoderDecoderModel.forward` and have been ignored: article, highlights, id. If article, highlights, id are not expected by `EncoderDecoderModel.forward`,  you can safely ignore this message.

***** Running Prediction *****
  Num examples = 0
  Batch size = 4

PredictionOutput(predictions=None, label_ids=None, metrics={'test_runtime': 0.0283, 'test_samples_per_second': 0.0, 'test_steps_per_second': 0.0})
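
In case it is relevant, my understanding from the notebook is that the test split has to be tokenized into input_ids/attention_mask/labels before predict can see any examples, which might explain the "Num examples = 0" above. Here is a minimal sketch of that preprocessing; the bert-base-uncased tokenizer and the 512/128 max lengths are assumptions based on the notebook's CNN/DailyMail setup with the article and highlights columns:

from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

def preprocess(batch):
    # Tokenize the articles as encoder inputs and the highlights as labels.
    inputs = tokenizer(batch["article"], padding="max_length",
                       truncation=True, max_length=512)
    outputs = tokenizer(batch["highlights"], padding="max_length",
                        truncation=True, max_length=128)
    batch["input_ids"] = inputs.input_ids
    batch["attention_mask"] = inputs.attention_mask
    batch["labels"] = outputs.input_ids
    return batch

test_data = test_data.map(preprocess, batched=True,
                          remove_columns=["article", "highlights", "id"])
test_data.set_format(type="torch",
                     columns=["input_ids", "attention_mask", "labels"])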

Topic: huggingface, bert, finetuning

Category: Data Science
