Ideas on handling large quantities of "waste" data

Hello, and thank you for taking time out of your day to help me. I am currently developing a machine vision application for production monitoring. The application processes about 20,000 images every day, and I want to come up with an interesting way to handle all of this information.

Let me first explain what the processing of each image does. The application runs on a production line: when an item passes in front of my camera, a photograph is taken, and through some image processing the application decides whether the item is properly packaged and performs actions accordingly. After that, the image is stored locally, and at the end of the day every image is deleted to free space for the next day.
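To make the question concrete, here is a minimal sketch of that daily cycle in Python. Everything here is a hypothetical stand-in for your real system: `is_properly_packaged` fakes the image-processing check, and a temporary directory plays the role of your local storage.

```python
import os
import tempfile
from datetime import date

def is_properly_packaged(image_bytes: bytes) -> bool:
    # Hypothetical placeholder for the real image-processing check;
    # here we simply treat non-empty image data as "properly packaged".
    return len(image_bytes) > 0

def process_item(image_bytes: bytes, storage_dir: str, item_id: int) -> bool:
    """Classify one item's photo, act on the result, and store the image locally."""
    ok = is_properly_packaged(image_bytes)
    # (On the real line, a reject mechanism would be triggered here when not ok.)
    path = os.path.join(storage_dir, f"{date.today():%Y%m%d}_{item_id:05d}.jpg")
    with open(path, "wb") as f:
        f.write(image_bytes)
    return ok

def end_of_day_cleanup(storage_dir: str) -> int:
    """Delete every stored image to free space for the next day."""
    removed = 0
    for name in os.listdir(storage_dir):
        os.remove(os.path.join(storage_dir, name))
        removed += 1
    return removed

storage = tempfile.mkdtemp()
results = [process_item(b"\xff\xd8fake-jpeg-data", storage, i) for i in range(5)]
deleted = end_of_day_cleanup(storage)
print(results, deleted)
```

The point of the sketch is the last step: `end_of_day_cleanup` throws away the whole day's data, which is exactly the waste the question below is about.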

Now, this system works well enough, but I was thinking there might be a more creative way to handle the waste images. Do you think that building a cloud-based database from a statistically significant sample of each day's images could be useful, and if so, do you have any statistical models to suggest?
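For reference, the "statistically significant sample per day" idea can be sketched with a plain uniform random sample over the day's stored images, taken just before the cleanup step. The function name and the sample size of 500 are illustrative assumptions, not a recommendation; a stratified sample (e.g. keeping all rejected items plus a random slice of accepted ones) may suit defect analysis better.

```python
import random

def daily_sample(image_paths, sample_size=500, seed=None):
    """Uniformly sample a fixed number of the day's images for archiving.

    With roughly 20,000 images per day, keeping e.g. 500 chosen uniformly
    at random gives an unbiased daily snapshot at about 2.5% of the
    storage cost; those files could then be uploaded to cloud storage
    instead of being deleted.
    """
    rng = random.Random(seed)
    if len(image_paths) <= sample_size:
        return list(image_paths)
    return rng.sample(list(image_paths), sample_size)

paths = [f"img_{i:05d}.jpg" for i in range(20_000)]
keep = daily_sample(paths, sample_size=500, seed=42)
print(len(keep))
```

Each file in `keep` would be uploaded before `end_of_day_cleanup` runs; everything else is deleted as today.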

Topics: cloud, statsmodels, data, statistics, bigdata

Category: Data Science
