DeepStochLog: Neural Stochastic Logic Programming

  • Authors: Thomas Winters, Giuseppe Marra, Robin Manhaeve, Luc De Raedt
  • Publication Date: 2021-06
  • Publication Venue: arXiv preprint
  • Abstract: Recent advances in neural symbolic learning, such as DeepProbLog, extend probabilistic logic programs with neural predicates. Like graphical models, these probabilistic logic programs define a probability distribution over possible worlds, for which inference is computationally hard. We propose DeepStochLog, an alternative neural symbolic framework based on stochastic definite clause grammars, a type of stochastic logic program, which defines a probability distribution over possible derivations. More specifically, we introduce neural grammar rules into stochastic definite clause grammars to create a framework that can be trained end-to-end. We show that inference and learning in neural stochastic logic programming scale much better than for neural probabilistic logic programs. Furthermore, the experimental evaluation shows that DeepStochLog achieves state-of-the-art results on challenging neural symbolic learning tasks.
Read paper
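
As a rough illustration of the abstract's "probability distribution over possible derivations": the toy grammar, probabilities, and function names below are only a sketch, not DeepStochLog's actual syntax or implementation. A derivation's probability is the product of the probabilities of the rules it uses, and a string's probability is the sum over its complete derivations; in DeepStochLog, rule probabilities (such as the digit scores here) would be produced by neural networks instead of being hard-coded.

# Toy stochastic grammar over token strings. Probabilities of rules for the
# same non-terminal sum to 1; the digit scores stand in for neural outputs.
RULES = {
    "E": [(0.5, ["N"]), (0.5, ["N", "+", "E"])],
    "N": [(0.1, [d]) for d in "0123456789"],
}

def derivations(symbol, tokens):
    """Yield (probability, remaining_tokens) for every way `symbol`
    can derive a prefix of `tokens`."""
    if symbol not in RULES:                   # terminal symbol
        if tokens and tokens[0] == symbol:
            yield 1.0, tokens[1:]
        return
    for rule_prob, body in RULES[symbol]:     # non-terminal: try each rule
        # Expand the rule body left to right, threading the leftover tokens
        # and multiplying the probabilities of the sub-derivations.
        partial = [(rule_prob, tokens)]
        for sym in body:
            partial = [(p * q, rest2)
                       for p, rest in partial
                       for q, rest2 in derivations(sym, rest)]
        yield from partial

def string_probability(start, tokens):
    """Probability of `tokens`: sum over all complete derivations."""
    return sum(p for p, rest in derivations(start, tokens) if not rest)

# Only complete derivation of "2+3": E => N '+' E => '2' '+' '3',
# with probability 0.5 * 0.1 * 0.5 * 0.1 = 0.0025.
print(string_probability("E", list("2+3")))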

Citation

APA

Winters, T., Marra, G., Manhaeve, R., & De Raedt, L. (2021). DeepStochLog: Neural stochastic logic programming. arXiv preprint arXiv:2106.12574.

Harvard

Winters, T. et al., 2021. DeepStochLog: Neural stochastic logic programming. arXiv preprint arXiv:2106.12574.

Vancouver

1. Winters T, Marra G, Manhaeve R, De Raedt L. DeepStochLog: Neural stochastic logic programming. arXiv preprint arXiv:2106.12574. 2021.

BibTeX

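A BibTeX entry reconstructed from the citation data above (the citation key is chosen for illustration):

@article{winters2021deepstochlog,
  title   = {DeepStochLog: Neural Stochastic Logic Programming},
  author  = {Winters, Thomas and Marra, Giuseppe and Manhaeve, Robin and De Raedt, Luc},
  journal = {arXiv preprint arXiv:2106.12574},
  year    = {2021}
}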