Activity recognition with binary sensors
I have event streams coming from a set of 28 binary sensors placed around a smart home, with records like this:
Where:
- OBJECT: indicates the name of a binary sensor
- STATE: the state of the sensor at that instant (think of it as Movement/Pressure/Open = 1 and NoMovement/NoPressure/Closed = 0)
- ACTIVITY: the label I would like to predict; it indicates the activity the person was performing
- TIMESTAMP/HOUR: the date and time at which the event occurred
There are 24 activities to recognise, and some are fairly fine-grained. For example: making breakfast, making lunch, making dinner, having breakfast, having lunch, having dinner, going to the toilet, brushing teeth, dressing, watching TV, and so on.
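To make the setup concrete, here is a minimal sketch of one common baseline I have seen for this kind of data: slice the event stream into fixed time windows, count sensor activations per window as a feature vector, and label each window with its majority activity (the classifier on top could then be anything). The `(timestamp, sensor_id, state, activity)` tuple format and the sample events are hypothetical, just to illustrate the shape of the problem.

```python
from collections import Counter

def windowed_features(events, window_seconds=60, n_sensors=28):
    """Group (timestamp, sensor_id, state, activity) events into fixed
    time windows; each window yields a per-sensor activation-count
    feature vector labelled with the window's majority activity."""
    windows = {}
    for ts, sensor_id, state, activity in events:
        key = int(ts // window_seconds)
        feats, labels = windows.setdefault(key, ([0] * n_sensors, []))
        if state == 1:                       # count only ON transitions
            feats[sensor_id] += 1
        labels.append(activity)
    X, y = [], []
    for key in sorted(windows):
        feats, labels = windows[key]
        X.append(feats)
        y.append(Counter(labels).most_common(1)[0][0])  # majority label
    return X, y

# hypothetical events: (timestamp_in_seconds, sensor_id, state, activity)
events = [
    (0,  3,  1, "making breakfast"),
    (10, 3,  0, "making breakfast"),
    (15, 7,  1, "making breakfast"),
    (70, 12, 1, "watching TV"),
]
X, y = windowed_features(events)
print(y)        # → ['making breakfast', 'watching TV']
print(X[0][3])  # → 1
```

Whether fixed windows are even appropriate here (versus segmenting on sensor-event gaps, or feeding raw event sequences to a sequence model) is exactly the kind of choice I am unsure about.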
Can you suggest a technique for using this dataset for activity recognition (AR)?
Topic activity-recognition deep-learning classification
Category Data Science