Detecting punch type with a Core ML activity classifier
I’m trying to train an activity classifier (the one made by Apple) to detect which kind of punch is thrown during boxing training.
Accelerations are taken directly from an Arduino Nano 33 over Bluetooth Low Energy. I have acceleration on 3 axes and gyroscope readings (angular velocity) on 3 axes, at a 100 Hz sample rate.
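For context, the app side receives the samples in a CoreBluetooth notification callback, roughly like the simplified sketch below. The packet layout (one sample per notification, sent as six little-endian Float32 values) is just an illustrative assumption, not my exact firmware format.

```swift
import CoreBluetooth

// Simplified sketch of the receive path. Assumed packet layout: each BLE
// notification carries one IMU sample as six Float32 values
// (accX, accY, accZ, rotX, rotY, rotZ) -> 24 bytes.
final class IMUReceiver: NSObject, CBPeripheralDelegate {
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard error == nil,
              let data = characteristic.value,
              data.count >= 24 else { return }

        // Copy into a Float array without assuming the buffer is aligned.
        var sample = [Float](repeating: 0, count: 6)
        _ = sample.withUnsafeMutableBufferPointer { data.copyBytes(to: $0) }

        // sample = [accX, accY, accZ, rotX, rotY, rotZ]; at 100 Hz each
        // sample becomes one row in the CSV file being recorded.
        print(sample)
    }
}
```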
A punch, depending on the experience of the boxer, lasts from 0.3 to 0.7 seconds.
To collect the accelerations, my application has a special configuration that produces CSV data files following a fixed program: no activity, jab, left hook, left uppercut. The application tells the user which punch to throw and creates the corresponding file.
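For reference, my training setup looks roughly like the sketch below, using CreateML's MLActivityClassifier on macOS. The column names (accX … rotZ), the file-naming scheme (e.g. "jab_03.csv" → label "jab"), and the folder path are simplified placeholders, not my exact code.

```swift
import CreateML
import Foundation

// Gather every per-punch CSV recording from the folder written by the app.
let csvFolder = URL(fileURLWithPath: "/path/to/recordings")
let csvFiles = try FileManager.default
    .contentsOfDirectory(at: csvFolder, includingPropertiesForKeys: nil)
    .filter { $0.pathExtension == "csv" }

var tables: [MLDataTable] = []
for file in csvFiles {
    var recording = try MLDataTable(contentsOf: file)
    let rowCount = recording.rows.count
    // Punch type taken from the file name: "jab_03.csv" -> "jab".
    let label = file.deletingPathExtension().lastPathComponent
        .components(separatedBy: "_").first ?? "none"
    recording.addColumn(MLDataColumn(Array(repeating: label, count: rowCount)),
                        named: "label")
    // One session id per file, so windows never straddle two recordings.
    recording.addColumn(MLDataColumn(Array(repeating: file.lastPathComponent,
                                           count: rowCount)),
                        named: "sessionId")
    tables.append(recording)
}

guard var trainingTable = tables.first else { fatalError("no CSV files found") }
for extra in tables.dropFirst() { trainingTable.append(contentsOf: extra) }

// Longest punch is 0.7 s; at 100 Hz that is 70 samples per window.
var parameters = MLActivityClassifier.ModelParameters()
parameters.predictionWindowSize = 70

let classifier = try MLActivityClassifier(
    trainingData: trainingTable,
    featureColumns: ["accX", "accY", "accZ", "rotX", "rotY", "rotZ"],
    labelColumn: "label",
    recordingColumn: "sessionId",
    parameters: parameters)
try classifier.write(to: csvFolder.appendingPathComponent("PunchClassifier.mlmodel"))
```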
So far I’ve managed to collect some data and test some results. The results are not bad, but I have some concerns about how to improve the training:
- At the moment I’m using raw acceleration values; would it be better to filter them (see the low-pass sketch after this list)?
- This is something that really drives me crazy. In a WWDC video, an engineer shows how she trained a model to recognize which kind of frisbee throw had been performed. In her brief explanation she says that each file used for training contains at least 3 throws of a specific type, and of course there are several files per type. Wouldn’t it be better to record just one throw per file and repeat that procedure across more files? Since Create ML training is based on the directory structure, it’s as if she were saying that 3 throws of that kind make up one activity, while at prediction time it’s just one throw that must be detected. Which approach is better: more files each containing a single repetition of the activity, or fewer files each containing the same activity repeated several times?
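To make the first question concrete, this is the kind of filtering I have in mind: a single-pole low-pass (exponential moving average) applied per channel before the samples are written to the CSV. The 20 Hz cutoff below is just an example value, not something I have validated.

```swift
import Foundation

// Single-pole low-pass filter (exponential moving average), one instance
// per channel. Alpha is derived from the cutoff frequency and the 100 Hz
// sample rate used in this setup.
struct LowPassFilter {
    private let alpha: Double
    private var state: Double?

    init(cutoffHz: Double, sampleRateHz: Double) {
        let rc = 1.0 / (2.0 * .pi * cutoffHz)
        let dt = 1.0 / sampleRateHz
        alpha = dt / (rc + dt)
    }

    mutating func filter(_ sample: Double) -> Double {
        let previous = state ?? sample          // seed with the first sample
        let smoothed = previous + alpha * (sample - previous)
        state = smoothed
        return smoothed
    }
}

// Example: smoothing one accelerometer axis (made-up sample values).
var accXFilter = LowPassFilter(cutoffHz: 20, sampleRateHz: 100)
let rawAccX: [Double] = [0.02, 0.10, 0.85, 1.90, 0.60, 0.05]
let smoothedAccX = rawAccX.map { accXFilter.filter($0) }
print(smoothedAccX)
```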
Topic: activity-recognition, training, classifier, machine-learning
Category: Data Science