What are the tradeoffs between Bayesian Deep Learning and Deep Gaussian Processes?

I understand the difference between Deep Gaussian Processes (DGPs) and Bayesian Deep Learning (BDL): DGPs are essentially feed-forward networks in which each node (or layer) is a Gaussian Process, whereas BDL places a prior over the parameters of an ordinary (potentially convolutional) neural network.

But what are the trade-offs and relationships between these two models?
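To make the distinction I have in mind concrete, here is a toy NumPy sketch of the two priors (my own illustration, not taken from any library): `sample_bnn` and `sample_gp_layer` are hypothetical helper names, and I assume zero-mean Gaussian weight priors and an RBF kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50)[:, None]   # toy 1-D inputs, shape (50, 1)

# Bayesian deep learning: a prior over the *weights* of a fixed architecture.
def sample_bnn(x, hidden=20, weight_scale=1.0):
    """Draw one function from a one-hidden-layer BNN with N(0, s^2) priors on all weights."""
    W1 = rng.normal(0, weight_scale, (x.shape[1], hidden))
    b1 = rng.normal(0, weight_scale, hidden)
    W2 = rng.normal(0, weight_scale, (hidden, 1))
    b2 = rng.normal(0, weight_scale, 1)
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Deep GP: each layer is itself a draw from a GP (a prior over functions), and layers compose.
def rbf_kernel(a, b, lengthscale=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def sample_gp_layer(x, jitter=1e-6):
    """Draw one function f ~ GP(0, k) evaluated at the rows of x."""
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)[:, None]

bnn_prior_draw = sample_bnn(X)                        # weight-space prior sample
dgp_prior_draw = sample_gp_layer(sample_gp_layer(X))  # two GP layers composed
```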

Deep Gaussian Processes (DGPs) are generally more complex to train than Bayesian Deep Learning (BDL). Each GP layer is nonparametric, so exact inference scales cubically with the number of data points, and practical DGPs fall back on approximate inference with inducing points, which adds extra variational parameters to estimate in every layer. That extra machinery gives DGPs a lot of flexibility per layer, but it also makes training slower and more involved than BDL, where approximate inference over the weights adds comparatively little overhead to ordinary network training.
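As a rough illustration of where the extra cost comes from, here is a toy NumPy sketch (my own example, with made-up sizes `N`, `D`, `H`) contrasting the per-step cost of a mean-field variational BNN layer with the kernel computation an exact GP layer would need.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D, H = 500, 8, 64                      # data size, input dim, hidden width
X = rng.normal(size=(N, D))

# BDL (mean-field variational layer): cost per step ~ O(batch * D * H).
# Each weight gets two variational parameters (mean, log-std); a forward pass
# samples weights with the reparameterization trick and does a single matmul.
mu, log_sigma = np.zeros((D, H)), np.full((D, H), -3.0)
eps = rng.normal(size=(D, H))
W = mu + np.exp(log_sigma) * eps          # one Monte Carlo weight sample
h = np.tanh(X @ W)                        # same cost as a deterministic layer

# DGP layer (exact GP view): cost ~ O(N^3) from the Cholesky of the kernel matrix.
# Practical DGPs use M << N inducing points, giving roughly O(N * M^2) per layer,
# but that is still more machinery than a weight-space layer.
def rbf(a, b, ls=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

K = rbf(X, X) + 1e-6 * np.eye(N)
L = np.linalg.cholesky(K)                 # the O(N^3) bottleneck
f = L @ rng.normal(size=(N, 1))           # one sampled GP-layer output
```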
