Add AGENT_SPEAKING to llGetAgentInfo, to detect whether avatar is/is not speaking on voice
tracked
Honey Puddles
While certain gesture triggers exist to activate when a given user is speaking on voice (/voicelevel1, /voicelevel2, /voicelevel3), there appears to be no LSL analog for detecting these states (or any voice-use state).
Adding AGENT_SPEAKING to llGetAgentInfo would provide a voice analog to the long-existing AGENT_TYPING detection.
The creative uses for such a trigger are nearly endless: scripting a multi-camera setup for an interview program, where a camera automatically repositions to the person currently speaking (useful for vlogs like Lab Gab!); debate and open-mic performance timers; recording cues for editing machinima; scripting a mesh head with custom bones or other nontraditional 'mouth' designs to animate the mouth while the wearer is speaking; or even silly things like a microphone that appears in your hand when you speak (similar to the classic magic keyboards).
This seems like a simple feature to implement that could add a wide variety of creative options.
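To illustrate the request, here is a minimal sketch of how the proposed flag might be used. Note that AGENT_SPEAKING does not exist in LSL today; it is the hypothetical constant this thread is asking for, shown here following the same pattern as the existing AGENT_TYPING flag.

```lsl
// HYPOTHETICAL: AGENT_SPEAKING is the proposed flag, not a real LSL constant.
// llGetAgentInfo() and AGENT_TYPING work today; this sketch assumes the new
// flag would be tested the same way, via the returned bitmask.
default
{
    state_entry()
    {
        llSetTimerEvent(0.5); // poll the wearer's state twice a second
    }

    timer()
    {
        integer info = llGetAgentInfo(llGetOwner());
        if (info & AGENT_SPEAKING) // proposed flag
        {
            // e.g. cut a camera, play a mouth animation, rez a microphone
            llOwnerSay("Owner is speaking on voice.");
        }
    }
}
```

Polling mirrors how scripts commonly check AGENT_TYPING; if the flag were added, the same one-line bitmask test would replace the gesture-based workarounds described below in this thread.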
Kinezis Resident
I'd also suggest adding LSL functions for specific visemes.
To my understanding, the system avatar has specific blendshapes that play when you pronounce specific sounds, producing lipsync, so a system is already in place for that. We can't currently use blendshapes on custom meshes, but playing animations for specific sounds would be a very welcome addition.
Spidey Linden
tracked
Issue tracked. We have no estimate when it may be implemented. Please see future updates here.
Blau Rascon
This'd be very nice: when making, e.g., a talkjaw or other comms-activated thing, we could just check the AGENT_SPEAKING flag instead of relying on a huge chunk of voice gestures to pass info to scripts like we do now. It would be nice not to have to include those anymore, and it would make the scripts tidier.
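For context, the current gesture-based workaround mentioned here can be sketched roughly as follows. The viewer fires gestures named /voicelevel1 through /voicelevel3 while the wearer speaks; each gesture is set up to chat a keyword on a hidden channel that the attachment listens for. The channel number and the "voice_on"/"voice_off" keywords below are arbitrary choices for illustration and must match whatever the gestures actually say.

```lsl
// Sketch of the existing workaround: gestures triggered by the viewer's
// /voicelevel1../voicelevel3 mechanism chat a keyword on CHANNEL, and this
// attachment reacts. CHANNEL and the keywords are illustrative assumptions.
integer CHANNEL = -49222; // arbitrary hidden channel; must match the gestures

default
{
    state_entry()
    {
        // Only listen to the wearer, so other avatars' gestures are ignored
        llListen(CHANNEL, "", llGetOwner(), "");
    }

    listen(integer chan, string name, key id, string msg)
    {
        if (msg == "voice_on")
        {
            llOwnerSay("Started speaking."); // e.g. open the talkjaw
        }
        else if (msg == "voice_off")
        {
            llOwnerSay("Stopped speaking.");
        }
    }
}
```

This shows why the proposal appeals: the workaround needs a full set of gestures plus a listener, whereas a bitmask flag would need neither.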
Izzatooona Enthusiast
I love this idea. It probably can be done, since the viewer already has a "voice" indicator that turns green when you talk. It even shows them over your head, if you want that on. I vote YES to this!
Coffee Pancake
Being able to know someone is using an optional communication medium has accessibility use cases.
Honey Puddles
Coffee Pancake That's a really good point. Many is the time that I had voice disabled and didn't realize someone was speaking on voice. Not just a case of "I didn't hear them"; I didn't even KNOW they were trying to say something.
There could be some interesting use cases for this idea.