Pretrained models for Propositional logic

Are there any pretrained models which understand propositional logic?

For example, the T5 model can do question answering. Given a context such as "Alice is Bob's mother. Bob is Charlie's father", T5 can answer the question "Who is Charlie's father?" correctly, but it cannot answer "Who is Charlie's grandmother?", which requires a deduction step.

Is there any model that has been/can be trained to do this kind of deduction and answer the question?

Tags: question-answering, deep-learning

Category: Data Science


As far as I know, automated reasoning of this kind can be done with a logic programming language like Prolog (and this is not new). I don't know whether it has been tried, but I don't think it would make sense to train an ML model for propositional logic itself, since it's entirely symbolic (as opposed to statistical): the right answer can be found deterministically.
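To illustrate why no training is needed for the deduction step, here is a minimal sketch in Python (rather than Prolog) of the grandmother inference from the question's example. The fact encoding and rule functions are made up for illustration; the point is that the answer follows deterministically from the facts.

```python
# Facts extracted from the example context, encoded as
# (relation, person, child) triples.
facts = {
    ("mother", "Alice", "Bob"),    # Alice is Bob's mother
    ("father", "Bob", "Charlie"),  # Bob is Charlie's father
}

def parents(child):
    """Return everyone recorded as a parent of `child`."""
    return {p for (rel, p, c) in facts
            if rel in ("mother", "father") and c == child}

def grandmothers(child):
    """Rule: a grandmother of `child` is a mother of any parent of `child`."""
    return {gm for parent in parents(child)
            for (rel, gm, c) in facts
            if rel == "mother" and c == parent}

print(grandmothers("Charlie"))  # {'Alice'}
```

A Prolog version would express the same rule as `grandmother(G, C) :- mother(G, P), parent(P, C).` and let the engine do the search.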

The question of logical reasoning from text, as in the proposed example, is a bit different, because it involves a step of representing the text formally. I think it could make sense to train a model which converts a text into a formal logic proposition (and back). The logical reasoning itself should still be done with a tool meant specifically for that, imho. Note that question answering doesn't involve any logical reasoning, even if it might look that way to a user: as far as I know, a QA system learns patterns matching a type of question to its corresponding answer, and the system is completely oblivious to the meaning.
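To make the "text to formal representation" step concrete, here is a toy rule-based sketch. In practice a trained model would replace the hand-written regexes below (which only handle two sentence patterns and are purely illustrative); the output triples could then be fed to a symbolic reasoner like the one above.

```python
import re

# Hypothetical patterns for two sentence shapes; a learned model
# would generalize far beyond these.
PATTERNS = [
    (re.compile(r"(\w+) is (\w+)'s mother"), "mother"),
    (re.compile(r"(\w+) is (\w+)'s father"), "father"),
]

def text_to_facts(text):
    """Extract (relation, person, child) triples from simple sentences."""
    extracted = set()
    for sentence in text.split("."):
        for pattern, rel in PATTERNS:
            m = pattern.search(sentence)
            if m:
                extracted.add((rel, m.group(1), m.group(2)))
    return extracted

facts = text_to_facts("Alice is Bob's mother. Bob is Charlie's father")
print(facts)  # {('mother', 'Alice', 'Bob'), ('father', 'Bob', 'Charlie')}
```

The brittleness of the regexes is exactly where a learned text-to-logic model would add value, while the downstream reasoning stays deterministic.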
