I've started learning about NLP and NLG and I'm fascinated! I've been blown away by the things I've seen from NLP, but I have a few questions about NLG. All my questions boil down to this: given a network or Markov chain, how does one specify what you want the system to talk about? To explain this a little: if I ask my 5-year-old nephew to tell me something, he'll talk about his toys, or what's on TV …
For example, I have a list of keywords like I, hungry => output: I am hungry, or I, author, poem => output: I am the author of this poem. Can someone please suggest the simplest way to achieve this? I am a newbie, so please tell me what knowledge I need to solve this problem.
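The simplest approach for fixed keyword patterns is template-based realization: match the keyword list to a hand-written template and fill in the slots. Below is a minimal sketch; the template strings and the keyword-count dispatch rule are illustrative assumptions for the two examples above, not a standard library's API.

```python
# Minimal template-based realization: pick a template by keyword count
# and fill its slots with the keywords in order.
TEMPLATES = {
    2: "{0} am {1}",
    3: "{0} am the {1} of this {2}",
}

def realize(keywords):
    """Map a keyword list to a sentence via a hand-written template."""
    template = TEMPLATES[len(keywords)]
    sentence = template.format(*keywords)
    return sentence[0].upper() + sentence[1:] + "."

print(realize(["I", "hungry"]))          # -> "I am hungry."
print(realize(["I", "author", "poem"]))  # -> "I am the author of this poem."
```

This scales poorly (one template per pattern), which is why neural approaches fine-tune a seq2seq model on keyword–sentence pairs instead; but for a newbie, templates are the standard first step.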
I am working on a project where I want the model to generate a job description based on Role, Industry, and Skills. I have trained on my data and got the resultant output. I am aware that the T5 model was trained on the C4 corpus in an unsupervised manner, with various techniques applied, including denoising and corrupted-span masking. But I am not able to understand how it works for a classification problem. My concern is that if I pass my input and target variables …
I am doing a project using the T5 Transformer. I have read the documentation for the T5 model. The aim of the project is to generate a job description based on Job_Role and Skills. My concern is which metric I can use for evaluating predictions. During training I use the loss as a metric, but for predictions I am not sure which metric to use. Any resources or reading material would be appreciated. Thank you.
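For free-text generation, the usual prediction-time metrics are overlap-based scores such as ROUGE or BLEU between the generated description and a reference one (in practice via packages like `rouge_score` or Hugging Face's `evaluate`). As a toy illustration of the idea, here is unigram-overlap ROUGE-1 F1 implemented from scratch; the example sentences are made up:

```python
from collections import Counter

def rouge1_f1(reference: str, prediction: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between a reference and a generated text."""
    ref = Counter(reference.lower().split())
    pred = Counter(prediction.lower().split())
    overlap = sum((ref & pred).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("design and maintain data pipelines",
                  "design and build data pipelines")
# 4 of 5 unigrams overlap -> precision = recall = F1 = 0.8
```

Production ROUGE implementations add stemming and longest-common-subsequence variants (ROUGE-L), so prefer an established package over this sketch for reported numbers.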
I am doing a project using the T5 Transformer. I have read the documentation for the T5 model. While using T5Tokenizer I am somewhat confused about tokenizing my sentences. Can someone please help me understand the difference between batch_encode_plus() and encode_plus(), and when I should use each.
I have a dataframe with the columns Role Name, Technical Skills, Soft Skills, and Average Experience. I have to use these words to generate a job description. I have read a bit about Transformers, BERT, and LSTMs, but they all take whole paragraphs or sentences as input for prediction. I would like to know: 1. How can I use pre-trained models on my data? 2. How should I approach this problem? Any references and links to resources are appreciated. Thank you.
I have implemented some basic models, like composing a poem from a dataset of poems, but the results were not that good in general. I want to make a model that could write an essay for me. One strong motivation for making this essay-writing model is that it would help me escape my tedious and useless college assignments. But before I proceed and hope to make such a human-like model that could write an essay for me, I …
I have to generate simulated data for healthcare in which people talk about their health issues. I am manually curating keywords for different health issues and generating closely related keywords to produce human-like conversations about health. Is there any readily available library to ease this work?
I am working on a project where I want to replace a template-based approach for financial reporting with an end-to-end NLG approach. The template-based approach takes as input some financial data (ESG features of the company: environmental, social, and governance, along with score features on these three pillars); this is then passed to a function (i.e., the "template" function) which returns the text that appears in the company's report. For example: score_governance = …
Why is the F-measure usually used for (supervised) classification tasks, whereas the G-measure (or Fowlkes–Mallows index) is generally used for (unsupervised) clustering tasks? The F-measure is the harmonic mean of the precision and recall. The G-measure (or Fowlkes–Mallows index) is the geometric mean of the precision and recall. Below is a plot of the different means. F1 (harmonic) $= 2\cdot\frac{precision\cdot recall}{precision + recall}$ Geometric $= \sqrt{precision\cdot recall}$ Arithmetic $= \frac{precision + recall}{2}$ The reason I ask is that I need …
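To make the comparison in the question concrete, here is a small sketch computing the three means for an imbalanced precision/recall pair; for non-negative inputs the AM–GM–HM inequality guarantees harmonic ≤ geometric ≤ arithmetic, which is why F1 penalizes imbalance hardest:

```python
import math

def f1(p, r):
    """Harmonic mean of precision and recall (F-measure)."""
    return 2 * p * r / (p + r)

def g(p, r):
    """Geometric mean of precision and recall (G-measure / Fowlkes-Mallows)."""
    return math.sqrt(p * r)

def arithmetic(p, r):
    """Arithmetic mean, for comparison."""
    return (p + r) / 2

# A heavily imbalanced pair shows how the means spread out:
p, r = 0.9, 0.1
values = (f1(p, r), g(p, r), arithmetic(p, r))
# -> approximately (0.18, 0.30, 0.50)
```

The bigger the precision/recall imbalance, the larger the gap between the three scores, so the choice of mean decides how much imbalance is punished.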
For example, for a domain-specific neural network in fashion, with the keywords light, dress, orange, cotton, it could output: This gorgeous orange summer dress is great for wearing on sunny camping days. Its cotton fabric makes it very comfortable to wear. Can someone please suggest the simplest way to achieve this?
Given a few specific words, which Natural Language Processing techniques can I use to create a meaningful sentence from those words? E.g., words: jackets, highest sale, sweaters, lowest sale; sentence: Jackets exhibited the highest sales while sweaters sold the least. If my question is too broad, let me know so I can ask more specific questions.
I am trying to implement the following idea. For a daily newsletter I would like to generate an engaging and funny intro text, such as: Good morning. Sorry if there are beer stains and buffalo sauce smeared on the Brew this morning. Yesterday was the sports equinox, a very special day in which all four major U.S. sports leagues (NBA, NHL, MLB, and NFL) were in action, plus Tiger and the Premier League. In related news...our Nats in 7 prediction? …
I am working on an automated insight-generation use case where I want to generate meaningful sentences from given aggregated data. For example, Data: Student = John, Total_Marks = 96, Class_Average = 85. NLG model-generated insights: 1. You did an excellent job, John! Your score is 96! 2. You have scored 11 marks above the class average. When I look at classic NLG, most approaches generate sentences given a starting letter or word. This might be more of a …
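This kind of insight generation is usually rule-based data-to-text rather than free generation: compute the comparisons, then verbalize them through templates. A minimal sketch for the example above (the field names and phrasings are illustrative assumptions taken from the question):

```python
# Rule-based data-to-text: derive comparisons from the aggregates, then
# render each one through a template sentence.
def generate_insights(student, total_marks, class_average):
    insights = [f"You did an excellent job, {student}! Your score is {total_marks}!"]
    diff = total_marks - class_average
    if diff > 0:
        insights.append(f"You have scored {diff} marks above the class average.")
    elif diff < 0:
        insights.append(f"You are {-diff} marks below the class average.")
    else:
        insights.append("You scored exactly the class average.")
    return insights

msgs = generate_insights("John", 96, 85)
# -> ["You did an excellent job, John! Your score is 96!",
#     "You have scored 11 marks above the class average."]
```

The "insight" logic (which comparisons matter) and the "realization" logic (how to phrase them) are deliberately separated, which is the classic NLG pipeline split between content selection and surface realization.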
I want to create a translator that can translate English, Korean, and Tamil sentences into English sentences. I tried googletrans, but is there any way to create something better than that using DL and NLP techniques?
Please let me ask if you know of any resources on imperative sentence patterns for natural language generation. There are probably some resources online, but I did not really find any. Imperative sentences, or command sentences, can be structured very simply, e.g.: 1) "Eat!" (intransitive action verb + exclamation mark) 2) "Eat rice!" (transitive action verb + noun (singular, or plural in some cases) + exclamation mark) 3) "Eat healthy!" (intransitive action verb + adverb) … but the more words per sentence, the more complex …
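The three patterns listed can be encoded directly as slot-filling templates, which is the usual starting point for grammar-based generation; the pattern names and word choices below are illustrative assumptions, not an established resource:

```python
# Imperative sentence patterns as templates, one per pattern above.
PATTERNS = {
    "intransitive": "{verb}!",            # 1) "Eat!"
    "transitive":   "{verb} {noun}!",     # 2) "Eat rice!"
    "adverbial":    "{verb} {adverb}!",   # 3) "Eat healthy!"
}

def imperative(pattern, **slots):
    """Fill the named pattern's slots and capitalize the result."""
    sentence = PATTERNS[pattern].format(**slots)
    return sentence[0].upper() + sentence[1:]

print(imperative("intransitive", verb="eat"))                 # Eat!
print(imperative("transitive", verb="eat", noun="rice"))      # Eat rice!
print(imperative("adverbial", verb="eat", adverb="healthy"))  # Eat healthy!
```

Longer imperatives (objects with modifiers, prepositional phrases) quickly outgrow flat templates, at which point a context-free grammar or a realizer library is the more systematic route.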