How to specify dependency versions so that they are mutually compatible and stay within a size limit?

I am trying to deploy a web app to Heroku. The free tier is limited to 500 MB.

My resnet34 model is saved as a .pkl file.

I load the model from it using the fastai library.

This project requires torch and torchvision as dependencies.

But leaving the dependency versions unspecified downloads the latest version of torch, which alone is 750 MB and exceeds the size limit.

So, in the requirements.txt file I pin torchvision to 0.2.2 and point torch at the wheel for v1.1.0. This gives rise to other problems.

For example, I got this error message in the Heroku build log:

ERROR: torchvision 0.2.2 has requirement tqdm==4.19.9, but you'll have tqdm 4.48.0 which is incompatible.

But if I pin tqdm to 4.19.9, the spacy library becomes incompatible and the app fails. And if I install a more recent version of torchvision, the app size exceeds the permitted limit.

How do I get around this problem? How do I find a set of dependency versions that are compatible with each other and whose total size stays under 500 MB? Is there an easy way to do that?

Topic fastai pytorch torch library deep-learning

Category Data Science


What I have learned so far is that this is a trial-and-error process.

The best approach is to look for projects that have already deployed a similar app and reuse their working combination of library versions.
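One way to take some of the guesswork out of the trial and error is to measure how large each candidate package actually is once installed, before pushing anything to Heroku. The sketch below sums a package's on-disk footprint from its install record; it assumes Python 3.8+ (for the standard-library importlib.metadata) and that you run it inside the virtualenv where the candidate packages are installed.

```python
# Rough sketch: estimate the installed size of a package, so you can
# add up your dependencies and compare against Heroku's slug limit.
# Assumes Python 3.8+ and that the package is installed locally.
import os
from importlib import metadata

def dist_size_mb(name):
    """Return the installed size of one distribution in megabytes."""
    dist = metadata.distribution(name)
    total = 0
    for rel_path in dist.files or []:       # files listed in RECORD
        path = dist.locate_file(rel_path)   # absolute path on disk
        if os.path.isfile(path):
            total += os.path.getsize(path)
    return total / 1e6

# Example: check the packages you plan to pin.
for pkg in ["pip"]:
    print(f"{pkg}: {dist_size_mb(pkg):.1f} MB")
```

This does not account for shared dependencies pulled in transitively, so treat the sum as a lower bound on the final slug size.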

I used this requirements.txt file, which works fine:

gunicorn
flask
numpy
https://download.pytorch.org/whl/cpu/torch-1.1.0-cp36-cp36m-linux_x86_64.whl
torchvision==0.2.2
fastai==1.0.52
jupyter

The web app works perfectly and the permitted size is not exceeded.
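You can also catch incompatible pins like the tqdm/torchvision conflict locally, before a Heroku build fails. Running `pip check` in your virtualenv already does this; the hedged sketch below mimics the same idea in Python, assuming Python 3.8+ and the `packaging` library are available.

```python
# Sketch of a local dependency-conflict check, similar to `pip check`.
# Assumes Python 3.8+ and the `packaging` library.
from importlib import metadata
from packaging.requirements import Requirement
from packaging.utils import canonicalize_name

def find_conflicts():
    """List (package, dependency, required_spec, installed_version) tuples."""
    installed = {}
    for d in metadata.distributions():
        name = d.metadata["Name"]
        if name:
            installed[canonicalize_name(name)] = d.version

    conflicts = []
    for dist in metadata.distributions():
        name = dist.metadata["Name"]
        if not name:
            continue
        for line in dist.requires or []:
            req = Requirement(line)
            # Skip extras / platform-specific requirements that don't apply.
            if req.marker and not req.marker.evaluate({"extra": ""}):
                continue
            have = installed.get(canonicalize_name(req.name))
            if have and not req.specifier.contains(have, prereleases=True):
                conflicts.append((name, req.name, str(req.specifier), have))
    return conflicts

for pkg, dep, spec, have in find_conflicts():
    print(f"{pkg} needs {dep}{spec}, but {have} is installed")
```

With the torchvision 0.2.2 pin from the build log above, this would surface the `tqdm==4.19.9` requirement the same way Heroku's build did, only without a failed deploy.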

Although Heroku's hard size limit is 500 MB, there is also a soft limit of 300 MB. These libraries, their dependencies, and the model file together exceed that soft limit, which results in a slightly longer response time.
