Which is better: KL-Divergence or Bhattacharyya (Hellinger) Distance?

I'm a beginner in probability and statistics. I came across the concept of comparing two probability distributions. Both KL-divergence and the Bhattacharyya (Hellinger) distance are used to compare two probability distributions, but which one is better?
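To make the comparison concrete, here is a minimal Python sketch computing all three quantities for two discrete distributions over the same support (the example arrays `p` and `q` are made-up values, not from any dataset). It also illustrates a structural difference between them: KL divergence is asymmetric and unbounded, while the Hellinger distance is symmetric, bounded in [0, 1], and a true metric; the Bhattacharyya distance is related to Hellinger via the Bhattacharyya coefficient.

```python
import numpy as np

# Hypothetical example distributions; any two discrete distributions
# over the same support would do.
p = np.array([0.36, 0.48, 0.16])
q = np.array([0.30, 0.50, 0.20])

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i).
    Asymmetric, unbounded, and undefined when q_i = 0 where p_i > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def hellinger_distance(p, q):
    """H(p, q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2.
    Symmetric, bounded in [0, 1], and a proper metric."""
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2)

def bhattacharyya_distance(p, q):
    """BD(p, q) = -ln(BC), where BC = sum_i sqrt(p_i * q_i) is the
    Bhattacharyya coefficient. Related to Hellinger by H^2 = 1 - BC."""
    bc = np.sum(np.sqrt(p * q))
    return -np.log(bc)

print("KL(p||q):      ", kl_divergence(p, q))
print("KL(q||p):      ", kl_divergence(q, p))  # differs: KL is asymmetric
print("Hellinger:     ", hellinger_distance(p, q))
print("Bhattacharyya: ", bhattacharyya_distance(p, q))
```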

Topic: vae, probability, statistics

Category: Data Science
