Well, hello, it's me again...
I have thought about the way character interactions are implemented, and I have come to a conclusion.
For the END USER it's not flexible at all... The characters do not learn, they only do what was scripted beforehand.
I think the brain should become a real brain... Why not make use of neural networks and chatbots? Instead of scripting the characters' whole behavior, just give them basic needs and instincts. As someone spends time with a character, the character will develop.
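Just to make the idea a bit more concrete, here is a tiny Python sketch of what I mean by "needs and instincts" driving behavior instead of a script. All names and values here are my own assumptions, not anything that exists in the project today:

import random

class Character:
    def __init__(self, name):
        self.name = name
        # Basic needs that grow over time; the character picks its own
        # action to satisfy the most urgent one instead of following a script.
        self.needs = {"hunger": 0.0, "social": 0.0, "rest": 0.0}

    def tick(self):
        # Every update, each need grows a little.
        for need in self.needs:
            self.needs[need] += random.uniform(0.0, 0.1)

    def act(self):
        # Act on the most urgent need and reset it.
        urgent = max(self.needs, key=self.needs.get)
        self.needs[urgent] = 0.0
        return f"{self.name} acts to satisfy '{urgent}'"

monica = Character("Monica")
for _ in range(3):
    monica.tick()
    print(monica.act())

A learning component (neural network, chatbot backend, whatever) would then adjust how the character satisfies those needs based on time spent with the player.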
Also, make use of text-to-speech libraries to break the silence!
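For example, an offline library like pyttsx3 could already speak a character's reply. This is just one candidate library to illustrate the idea, not something the project currently uses:

import pyttsx3

engine = pyttsx3.init()      # pick the platform's default TTS backend
engine.say("This way?")      # queue the character's line
engine.runAndWait()          # play it out loud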
Make posing a part of learning; enable the characters to be posed by the player. Small example (a rough sketch of how this could be stored follows the dialogue):
Player: Push your hands up in the air.
Monica (does something weird with her hands): This way?
Player: No
Monica: Could you show me what to do?
Player: Yes
Player: (poses the character)
Player: Remember this pose as "Hands up in the Air".
Monica: OK, now I know what "Hands up in the Air" looks like.
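As referenced above, the "remember this pose" part could be as simple as a dictionary that maps a label to a snapshot of the current joint rotations. The joint names and data layout below are assumptions for illustration, not the engine's actual format:

poses = {}

def remember_pose(name, joints):
    # Store a snapshot of the current joint rotations under a label.
    poses[name] = dict(joints)

def recall_pose(name):
    # Return the stored pose, or None if the character hasn't learned it yet.
    return poses.get(name)

current_joints = {"left_arm": (0, 0, 170), "right_arm": (0, 0, 170)}
remember_pose("Hands up in the Air", current_joints)
print(recall_pose("Hands up in the Air"))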
Also, make behavior learnable. I am too lazy to explain this with an example, but I am willing to discuss this proposal.
