Using face motion to drive 3D facial animation
Hello,
I was wondering if there's a way to use the facial tracking data that Snapchat already uses to drive a 3D face. I have a 3D head with bones for the eyebrows, eyes, and mouth. Would there be a way to connect these bones to tracking points in Lens Studio so that our face could drive it? It would be similar to the way Animojis work, but for those of us who don't have an iPhone X.
Thank you,
Mark
Hey Mark!
Unfortunately, Lens Studio does not expose raw facial tracking data. You can, however, detect whether the mouth is open using the MouthOpenedEvent and MouthClosedEvent events, and track changes in brow state using BrowsLoweredEvent, BrowsRaisedEvent, and BrowsReturnedToNormalEvent.
Let me know if you have any specific questions about how to use these events.
Best,
Peter