I've made a couple of interesting updates since the last post. The animation system now accounts for characters talking (and they know when the dialogue being displayed is theirs), I've learned a lot more about Articy and figured out how to get it to work the way I want it to, and I've been playing with some fancy new hardware.
I've implemented some interesting re-usable nodes in Articy, largely to make up for its lack of switch statements (its conditions can only evaluate to booleans). Since my systems use a lot of enumerations to dictate the flow of conversation, something switch-like is really necessary.
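To illustrate the workaround, here's a minimal sketch (in Python, standing in for Articy's condition scripting) of how a chain of boolean checks can emulate a switch over an enumeration. The `Motivation` values and branch names here are made up for the example, not taken from the actual project:

```python
from enum import Enum

class Motivation(Enum):
    # Hypothetical motivation states driving a conversation
    FRIENDLY = 0
    HOSTILE = 1
    NEUTRAL = 2

def route_dialogue(motivation):
    # Each 'if' mirrors one boolean condition node in Articy:
    # a chain of "is it X?" checks stands in for a real switch.
    if motivation == Motivation.FRIENDLY:
        return "friendly_branch"
    if motivation == Motivation.HOSTILE:
        return "hostile_branch"
    # Final fallthrough, like the default pin on a switch block
    return "neutral_branch"
```

Wrapped up as a re-usable node, the chain only has to be built once per enumeration rather than once per conversation.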
The switch blocks for motivations and emotions.
Inside the switch block for character motivations.
And I've been playing with a Tobii eye tracker. Its Unity SDK comes with a 'Gaze Aware' component which, quite literally, just knows whether or not you're looking at a given collider. I've added it to sections of the UI roughly covering the face, body, and feet (the Tobii only seems accurate to an area about the size of a 10c-20c piece on the screen, and it loses accuracy towards the edges, so I can't be too specific with it), and I've implemented a system that times how long your gaze rests on each region to work out where you tend to look on the character. Essentially, the characters can tell if you're staring and react accordingly. At the moment this just changes their emotion, but it could potentially trigger different conversation options as well.
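The gaze-timing idea can be sketched roughly like this (Python standing in for the Unity C# side; the region names, `GazeTracker` class, and the two-second threshold are all hypothetical, not the actual implementation):

```python
REGIONS = ("face", "body", "feet")

class GazeTracker:
    """Accumulates how long the player's gaze dwells on each region."""

    def __init__(self, stare_threshold=2.0):
        self.dwell = {r: 0.0 for r in REGIONS}
        self.stare_threshold = stare_threshold  # seconds; illustrative value

    def update(self, gazed_region, dt):
        # gazed_region is whichever region's collider currently reports
        # gaze focus (e.g. via the SDK's Gaze Aware component), or None
        # when the player is looking elsewhere.
        if gazed_region in self.dwell:
            self.dwell[gazed_region] += dt

    def staring_at(self):
        # Returns the region the player has fixated on long enough, if any,
        # which could then drive an emotion change on the character.
        for region, time_spent in self.dwell.items():
            if time_spent >= self.stare_threshold:
                return region
        return None
```

In the real game this would be ticked every frame from `Update()`, with the dwell times decaying or resetting once the character has reacted.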
Example of the dialogue triggering the talking mouth animation. Emotional responses still affect the eyes while a character is talking, and both the eyes and mouth when they're not. It also shows the Tobii indicator tracking my eyes, though the emotional reactions to staring either weren't triggering for some reason or weren't noticeable.
This project started as a smaller part of a bigger game I was planning, but at this stage it looks like it's turning into more of a visual novel type game in its own right. I think I'm gonna need to work on my writing skills.