DeepStochLog: Neural Stochastic Logic Programming
![AAAI22 Logo](/static/5d30e689e2214b6bd2c09786071e241c/6e60c/aaai22.png)
- Speaker: Thomas Winters
- Type: Conference talk
- Date: 2022-02-26
- Location: AAAI22: 36th AAAI Conference on Artificial Intelligence
Recent advances in neural-symbolic learning, such as DeepProbLog, extend probabilistic logic programs with neural predicates. Like graphical models, these probabilistic logic programs define a probability distribution over possible worlds, for which inference is computationally hard. We propose DeepStochLog, an alternative neural-symbolic framework based on stochastic definite clause grammars, a kind of stochastic logic program. More specifically, we introduce neural grammar rules into stochastic definite clause grammars to create a framework that can be trained end-to-end. We show that inference and learning in neural stochastic logic programming scale much better than for neural probabilistic logic programs. Furthermore, the experimental evaluation shows that DeepStochLog achieves state-of-the-art results on challenging neural-symbolic learning tasks.
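To make the idea of a neural grammar rule concrete, here is an illustrative sketch in Prolog-style definite clause grammar notation, loosely modeled on the MNIST digit-addition task commonly used in neural-symbolic work. The `nn(...)` declaration, the predicate names, and the rule layout are paraphrased assumptions for illustration and may differ from the actual DeepStochLog syntax.

```prolog
% Neural grammar rule (illustrative syntax, not verbatim DeepStochLog):
% a neural classifier scores which digit class Y a terminal image X
% represents, turning its softmax output into rule probabilities.
nn(mnist_classifier, [X], [Y], [digit]) :: digit(Y) --> [X].

% Ordinary (stochastic) grammar rule: a sum N parses as two digit
% images whose predicted classes add up to N.
addition(N) --> digit(N1), digit(N2), {N is N1 + N2}.
```

Because the neural network outputs a distribution over rule expansions rather than a distribution over possible worlds, parsing with such a grammar reduces to weighted derivation counting, which is what makes inference scale better than in neural probabilistic logic programs.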
- Video
- Slides
- Related paper
- Related projects