Made some adjustments to the socket.io server and the eye-tracking React component to fix some connection bugs and make sending/receiving data more efficient.
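On the efficiency side, one generic way to cut socket traffic is to throttle emits down to roughly the display rate. This is only a sketch of that general technique (the function name and interval are mine, not necessarily what this project does):

```javascript
// Wrap an emit function so gaze samples arriving faster than
// minIntervalMs apart are dropped instead of flooding the socket.
function makeThrottledEmit(emit, minIntervalMs = 16) {
  let last = 0;
  return (data) => {
    const now = Date.now();
    if (now - last < minIntervalMs) return; // drop frames arriving too fast
    last = now;
    emit(data);
  };
}
```

Dropping stale frames is fine here because only the most recent gaze position matters to the interface.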
Added some averaging to the gaze position and made the scroll speed continuous. Feels a lot more natural now.
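A sketch of what that averaging and continuous scroll mapping could look like. The window size, dead zone, speed cap, and all names here are my guesses, not the project's actual values:

```javascript
// Average the last WINDOW gaze samples to smooth out jitter.
const WINDOW = 10;
const samples = [];

function smoothGazeY(rawY) {
  samples.push(rawY);
  if (samples.length > WINDOW) samples.shift();
  return samples.reduce((a, b) => a + b, 0) / samples.length;
}

// Map smoothed y (0..1, top..bottom of viewport) to a signed scroll
// velocity in pixels per frame. Inside a central dead zone the page
// holds still; outside it the speed grows linearly toward the edges.
function scrollVelocity(y, deadZone = 0.2, maxSpeed = 30) {
  const offset = y - 0.5;                    // signed distance from center
  if (Math.abs(offset) < deadZone) return 0; // resting gaze: no scroll
  const t = (Math.abs(offset) - deadZone) / (0.5 - deadZone);
  return Math.sign(offset) * t * maxSpeed;   // negative scrolls up
}
```

Continuous velocity rather than fixed jumps is what makes it feel natural: the further your gaze drifts from center, the faster the page moves.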
Using the y coordinate of the fixation position to automatically scroll up and down the page. It's pretty jumpy right now, but it demonstrates how you should be able to control the page's position just by reading and scanning.
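The jumpy version might look something like this minimal threshold sketch (band sizes and step are illustrative, not the project's numbers):

```javascript
// If the fixation's y lands in the top or bottom band of the viewport,
// jump the page by a fixed step — which is exactly why it feels jumpy.
function gazeScrollStep(fixationY, viewportHeight, step = 120) {
  if (fixationY < viewportHeight * 0.25) return -step; // looking near the top
  if (fixationY > viewportHeight * 0.75) return step;  // looking near the bottom
  return 0;                                            // middle: hold position
}
// In the browser you'd call something like
// window.scrollBy(0, gazeScrollStep(y, window.innerHeight)) on each sample.
```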
Finally got it working. Now I have eye tracker data streaming to the browser, so I can build an interface using libraries like React for the web.
In the end, the method that worked best, with no noticeable lag, is:
1. Accessing the raw data stream using Tobii's C++ SDK (this happens through an openFrameworks program)
2. Creating a local network OSC stream and sending gaze events (same openFrameworks program)
3. Listening for the OSC stream on a nodejs server
4. Packaging up the OSC stream data as JSON and sending it to listening clients via a nodejs socket.io server
5. Listening for the most recent state updates from the server and rendering on the client
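Steps 4 and 5 boil down to a small packaging function plus socket.io wiring. The event name, message shape, and port below are my assumptions, not the project's actual schema:

```javascript
// Turn a parsed OSC message ({ address, args: [x, y] }) into the JSON
// state object that browser clients render from. (Hypothetical shape.)
function packGaze(msg) {
  const [x, y] = msg.args;
  return { source: msg.address, x, y, t: Date.now() };
}

// Server wiring, roughly (socket.io npm package):
//   const { Server } = require("socket.io");
//   const io = new Server(8080);
//   oscPort.on("message", (msg) => io.emit("gaze", packGaze(msg)));
//
// Client side (step 5), keeping only the most recent state:
//   socket.on("gaze", (state) => setGaze(state));
```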
Lots of hoops to jump through to hack this system together, but now I can do the fun interfacey stuff.
My eyes and fixation point.
I had to change the plan a bit, as the C++ socket.io implementation I was using was taking ages to get working because it's long deprecated. I dusted off a project of mine from March that sends gaze information via OSC, an interesting networking protocol originally used for audio/visual devices. I think I'll create a nodejs server that listens for OSC events and then dispatches them to the browser via socket.io.
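OSC messages have a simple binary layout: a zero-padded address string, a zero-padded type-tag string like ",ff", then big-endian arguments. A minimal stdlib-only parser sketch, with the "/gaze" address in the comments being my assumption rather than the project's:

```javascript
// Minimal OSC message parser using only Node's stdlib Buffer.
// OSC strings are null-terminated and zero-padded to 4-byte boundaries.

// Read a padded OSC string; returns [value, offsetOfNextField].
function readOscString(buf, offset) {
  const end = buf.indexOf(0, offset);
  const str = buf.toString("ascii", offset, end);
  const next = offset + Math.ceil((end - offset + 1) / 4) * 4;
  return [str, next];
}

// Parse one OSC message (e.g. "/gaze" with two floats) into { address, args }.
function parseOscMessage(buf) {
  const [address, tagOff] = readOscString(buf, 0);
  let [tags, argOff] = readOscString(buf, tagOff);
  const args = [];
  for (const tag of tags.slice(1)) {        // skip the leading ","
    if (tag === "f") { args.push(buf.readFloatBE(argOff)); argOff += 4; }
    else if (tag === "i") { args.push(buf.readInt32BE(argOff)); argOff += 4; }
  }
  return { address, args };
}

// Hooking it to the planned nodejs server, roughly:
//   const dgram = require("dgram");
//   const sock = dgram.createSocket("udp4");
//   sock.on("message", (buf) => { /* forward parseOscMessage(buf) via socket.io */ });
//   sock.bind(9000); // port is a guess
```

In practice an npm package like `osc` would handle the parsing, but the format is simple enough that the stdlib version shows what's actually on the wire.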
The backup plan is to use C# .NET, which I want to avoid at all costs. I just wanna make an interface, all this server stuff is so annoying.
After three hours of wrestling with poorly documented and out-of-date libraries, I can finally access the eye tracker's raw data stream using Tobii's C++ SDK. Next step is to hook this stream up to a C++ socket.io client, connect that to a nodejs socket.io server, then connect a browser to the same server. Boom, eye tracking hardware streaming to the browser. Hopefully.
Eyes: tracked ✓
Always the craziest sensation to be aware of where you're fixating.
Went back to take a look at one of my projects that I'll be using as the basis for the gaze interface. It allows you to traverse between Wikipedia pages down branching trail-like structures. I want to do something similar using the eye tracker, where the system recognizes which content you're spending more time reading and uses it to pull up related pages, creating contextually linked trails of information.
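One hypothetical way to detect which content you're spending more time reading is a per-section dwell-time accumulator; everything here is illustrative, not implemented yet:

```javascript
// Accumulate how long the gaze rests on each content block, so the
// most-read section can seed the lookup for related pages.
const dwell = new Map(); // section id -> accumulated ms

function recordFixation(sectionId, dtMs) {
  dwell.set(sectionId, (dwell.get(sectionId) || 0) + dtMs);
}

function mostReadSection() {
  let best = null, bestMs = 0;
  for (const [id, ms] of dwell) {
    if (ms > bestMs) { best = id; bestMs = ms; }
  }
  return best; // e.g. use this to fetch related Wikipedia pages
}
```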
Tracking device: acquired ✓
He said yes.