Detecting grammatical errors with BERT

We fine-tuned a BERT model (bert-base-uncased) on the CoLA dataset for a sentence classification task. The dataset is a mix of sentences with and without grammatical errors. The fine-tuned model is then used to identify sentences with and without errors. Are there other approaches we could take with BERT, beyond building a classifier?
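For reference, here is a minimal sketch of the setup described above, assuming the Hugging Face transformers and datasets libraries (CoLA is available as the "cola" subset of the public GLUE dataset):

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # 0 = unacceptable, 1 = acceptable

dataset = load_dataset("glue", "cola")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

# Hyperparameters here are illustrative, not the ones we actually used.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="cola-bert", num_train_epochs=3,
                           per_device_train_batch_size=32),
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```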

Topic: grammar-inference, bert, nlp



Maybe you could frame your use case as a named entity recognition (token classification) problem. For that, you would first need to tag the erroneous words in your data.

Then you could train an NER model using a transformer, as sketched below.
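A hedged sketch of that token-classification idea: tag each token as ordinary ("O") or erroneous ("ERR"). The label set and the tiny hand-labelled example are hypothetical; a real setup needs a corpus with per-token error annotations.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "ERR"]  # hypothetical tag set: ordinary token vs. error token
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(labels))

# One hand-labelled example: "He go home" with "go" tagged as the error.
words = ["He", "go", "home"]
word_labels = [0, 1, 0]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level labels to sub-word tokens; special tokens get -100
# so the loss function ignores them.
aligned = [-100 if i is None else word_labels[i] for i in enc.word_ids()]
outputs = model(**enc, labels=torch.tensor([aligned]))
outputs.loss.backward()  # gradients for one training step
```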


Absolutely. Because BERT is pre-trained as a masked language model, you could use it to suggest corrected sentences without any annotated data. Here is a good repo that demonstrates this: https://github.com/sunilchomal/GECwBERT. Hope it helps.
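A minimal sketch of the masked-LM idea behind repos like GECwBERT: mask a suspect word and let the pre-trained model propose replacements, no fine-tuning required. Deciding which word to mask is assumed solved here; the repo above covers that part too.

```python
from transformers import pipeline

# fill-mask uses the pre-trained masked-LM head of bert-base-uncased as-is.
fill = pipeline("fill-mask", model="bert-base-uncased")

sentence = "He [MASK] to school every day."  # masking the suspect verb
for suggestion in fill(sentence, top_k=3):
    print(f"{suggestion['token_str']:>8}  (score={suggestion['score']:.3f})")
```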
