Here is the OpenInterface “code” used for the demonstration:
After using OpenVibe to acquire the signals, we spent some time on the Emotiv API.
It directly provides functions for three types of behavior:
- Emotions;
- Cognition;
- Facial expressions.
This last one is the one that yields the most results in a short time, so we decided to use it (we did not have time to explore all the possibilities the Epoc offers).
It is also easy to use and, in a short time, we managed to implement it in OpenInterface.
Here is the list of expressions we can recognize (detection worked in most of our tests):
- Eye blink
- Eyebrow movement
- Left wink
- Right wink
- Looking right
- Looking left
- Teeth clenching
- Smile
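As an illustration, reacting to these expressions boils down to mapping each detected expression to an action in the application. The sketch below shows one way to structure that dispatch; the `Expression` enum and `dispatch` helper are hypothetical names of ours, not part of the Emotiv API or OpenInterface (the real SDK exposes per-expression query functions on an EmoState instead of a single event value):

```python
from enum import Enum, auto

# Hypothetical enum mirroring the list of expressions above.
class Expression(Enum):
    BLINK = auto()
    EYEBROW = auto()
    WINK_LEFT = auto()
    WINK_RIGHT = auto()
    LOOK_RIGHT = auto()
    LOOK_LEFT = auto()
    CLENCH = auto()
    SMILE = auto()

def dispatch(expression, handlers):
    """Call the handler registered for the detected expression, if any."""
    handler = handlers.get(expression)
    return handler(expression) if handler else None

# Example bindings: two expressions drive two demo actions.
handlers = {
    Expression.BLINK: lambda e: "next slide",
    Expression.SMILE: lambda e: "confirm",
}

print(dispatch(Expression.BLINK, handlers))  # -> next slide
```

Unhandled expressions simply return `None`, so the demo can bind only the few gestures it needs and ignore the rest.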