Guide to Natural Language Prompt Programming for Few-Shot Learning of Pretrained Language Models

I'm currently working on a project with the goal of producing AI-generated content, such as blog posts and Instagram captions. I found the in-context few-shot learning capabilities of GPT-3 quite useful, but I'm unable to generate creative content consistently: the output becomes boring and repetitive after a few iterations. I came across the concept of knowledge probing of language models and have come to the understanding that writing better prompts could actually solve my problem.
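For context, here is a rough sketch of the kind of few-shot prompt I mean, using the legacy openai-python Completion API; the demonstrations, model name, and sampling settings below are just illustrative placeholders, with a higher temperature and repetition penalties being one common way to push for more varied output:

```python
# Minimal few-shot prompting sketch with the legacy OpenAI Completion API
# (openai < 1.0). Examples, model name, and sampling values are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Few-shot prompt: a few demonstrations followed by the new task.
prompt = (
    "Write a catchy Instagram caption for the product.\n\n"
    "Product: handmade ceramic mug\n"
    "Caption: Mornings taste better in something made by hand.\n\n"
    "Product: trail running shoes\n"
    "Caption: The mountain doesn't care about your excuses. Lace up.\n\n"
    "Product: cold brew coffee kit\n"
    "Caption:"
)

response = openai.Completion.create(
    engine="davinci",        # base GPT-3 model; placeholder choice
    prompt=prompt,
    max_tokens=40,
    temperature=0.9,         # higher temperature for more varied output
    frequency_penalty=0.8,   # discourage repeated phrasing
    presence_penalty=0.6,    # encourage new wording/topics
    stop=["\n"],
)

print(response["choices"][0]["text"].strip())
```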

Can the community point me to the right set of papers, articles, or blog posts that expand on this idea, so that I can make some progress on this interesting use case? Thanks and regards!

Topic openai-gpt transformer text-generation deep-learning nlp

Category Data Science
