Estimating time to travel between two lat/longs
I'm trying to build an offline estimator of how long it would take to drive from one lat/long pair to another. Two approaches I've come across are the Haversine distance and the Manhattan distance. My idea is to compute both, take their average as the distance estimate, and then divide by some average speed to get a travel time.
Since this value will be used as an estimator for drivers in a city, a straight-line distance would grossly underestimate the distance drivers actually have to travel.
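To make the idea concrete, here is a minimal sketch of what I have in mind. The "Manhattan" distance is approximated on the sphere by travelling along a meridian and then along a parallel (one possible decomposition), and the 30 km/h average city speed is just a placeholder I would tune:

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle (straight-line) distance between two points, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def manhattan_km(lat1, lon1, lat2, lon2):
    """Grid-style distance: move along a meridian, then along a parallel."""
    north_south = haversine_km(lat1, lon1, lat2, lon1)
    east_west = haversine_km(lat2, lon1, lat2, lon2)
    return north_south + east_west

def estimate_minutes(lat1, lon1, lat2, lon2, avg_speed_kmh=30.0):
    """Average the two distances, then convert to minutes at avg_speed_kmh.

    avg_speed_kmh is an assumed placeholder for a typical city driving speed.
    """
    dist_km = (haversine_km(lat1, lon1, lat2, lon2)
               + manhattan_km(lat1, lon1, lat2, lon2)) / 2
    return dist_km / avg_speed_kmh * 60
```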
Any thoughts would be greatly appreciated.