Add AGENT_SPEAKING to llGetAgentInfo to detect whether an avatar is speaking on voice
tracked
Cutie Crush
While certain gesture triggers exist to activate when a given user is speaking on voice (/voicelevel1, /voicelevel2, /voicelevel3), there appears to be no LSL analog to detect these states (or any voice-use state).
Adding AGENT_SPEAKING to llGetAgentInfo would be analogous with voice, to the long-existing AGENT_TYPING detection.
The creative uses for such a trigger are nearly endless: scripting a multi-camera setup for an interview program, with a camera that automatically moves to whoever is currently speaking (useful for vlogs like Lab Gab!); debate and open mic performance timers; recording cues for editing machinima; scripting a mesh head with custom bones or other nontraditional 'mouth' designs to animate the mouth while the wearer is speaking; or even silly things like a microphone that appears in your hand when you speak (similar to the classic magic keyboards).
This seems like a simple-to-implement feature that could add a wide variety of creative options.
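For illustration, here is a minimal sketch of how the proposed flag could be used. Note that AGENT_SPEAKING does not currently exist in LSL; it is hypothetical here, shown alongside the long-existing AGENT_TYPING flag it would mirror.

```lsl
// Hypothetical sketch: AGENT_SPEAKING is the proposed (not yet existing)
// flag; AGENT_TYPING works this way today via llGetAgentInfo.
default
{
    state_entry()
    {
        // Poll the owner's agent info twice a second.
        llSetTimerEvent(0.5);
    }

    timer()
    {
        integer info = llGetAgentInfo(llGetOwner());

        if (info & AGENT_TYPING)  // existing flag
            llSetText("typing...", <1.0, 1.0, 1.0>, 1.0);
        // Proposed usage, if AGENT_SPEAKING were added:
        // else if (info & AGENT_SPEAKING)
        //     llSetText("speaking on voice", <0.0, 1.0, 0.0>, 1.0);
        else
            llSetText("", <1.0, 1.0, 1.0>, 0.0);
    }
}
```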
Bloodsong Termagant
this would be important, especially for the deaf community, and any other type of non-voice-user, as an accessibility aid. someone asked me if her in-world hearing dog could indicate if others were speaking nearby, but the answer is currently no. there's no way to detect that from a script.
Kinezis Resident
I'd also suggest adding LSL functions for specific visemes.
To my understanding, the system avatar has specific blendshapes that play when you pronounce specific sounds to produce lipsync, so there is already a system in place for that. We can't use blendshapes on custom meshes currently, but playing animations for specific sounds would be a very welcome addition.
Spidey Linden
Issue tracked. We have no estimate when it may be implemented. Please see future updates here.
Blau Rascon
This'd be very nice, since then when making, e.g., a talkjaw or other comms-activated thing, we could just check the AGENT_SPEAKING flag instead of relying on a huge chunk of voice gestures to pass info to scripts like we do now. It would be nice to not have to include those anymore, and it would make the scripts tidier.
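The gesture-based workaround described above can be sketched roughly as follows. The channel number and the "speaking"/"quiet" messages are arbitrary choices for this example; the /voicelevel gestures would need a chat step configured to say those messages on the same channel (e.g. "/57 speaking").

```lsl
// Sketch of the current workaround: voice-triggered gestures chat a
// status message on a private channel, and this script listens for it.
integer VOICE_CHANNEL = 57;  // arbitrary; must match the gestures' chat step

default
{
    state_entry()
    {
        // Only listen to the wearer's own chat on the agreed channel.
        llListen(VOICE_CHANNEL, "", llGetOwner(), "");
    }

    listen(integer channel, string name, key id, string msg)
    {
        if (msg == "speaking")
            llSetText("speaking on voice", <0.0, 1.0, 0.0>, 1.0);
        else if (msg == "quiet")
            llSetText("", <1.0, 1.0, 1.0>, 0.0);
    }
}
```

With AGENT_SPEAKING available in llGetAgentInfo, the gesture setup and the listen channel would be unnecessary.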
Izzatooona Enthusiast
I love this idea. It probably can be done, since the viewer already has a "voice" indicator that turns green when you talk. It even shows them over your head, if you want that on. I vote YES to this!
Crush Cutie
Being able to know someone is using an optional communication medium has accessibility use cases.
Cutie Crush
Crush Cutie That's a really good point. Many's the time that I had voice disabled and didn't realize someone was speaking on voice. Not just "I didn't hear them"... I didn't even KNOW they were trying to say something.
There could be some interesting use cases for this idea.