Part 2: Using Oculus Lipsync, Setting Up the Graph
In this week's post I added the graph. This is the most important part, since it is what makes the mouth move to match the audio using the OVRLipSync plugin.
The OVRLipSync setup is based on this.
First I added the viseme, which puts in a name and a value for each viseme based on the audio.
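To make the name-and-value idea concrete, here is a small sketch in Python (the actual graph does this visually). The viseme names are the fifteen shapes Oculus Lipsync reports; the weights and the helper function `frame_to_weights` are my own illustration, not part of the plugin.

```python
# The 15 viseme names Oculus Lipsync outputs, in order.
OCULUS_VISEMES = [
    "sil", "PP", "FF", "TH", "DD", "kk", "CH",
    "SS", "nn", "RR", "aa", "E", "ih", "oh", "ou",
]

def frame_to_weights(frame_values):
    """Pair each viseme name with its 0.0-1.0 weight for one audio frame.

    frame_values: a list of 15 floats, one per viseme, as produced
    by the lip-sync analysis for the current frame (illustrative).
    """
    return dict(zip(OCULUS_VISEMES, frame_values))
```

So each audio frame ends up as a table of (name, value) pairs, which is what the rest of the graph works with.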
I also added variables by clicking the plus sign on the left side of the screen.
This part gets the viseme with the highest value and sets it on the character.
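The "highest value" step is just an argmax over the viseme weights: whichever mouth shape scores strongest for the current frame is the one shown. A minimal sketch of that logic, assuming a dictionary of viseme names to weights (the function name and sample numbers are mine):

```python
def strongest_viseme(weights):
    """Return the name of the viseme with the highest weight.

    weights: dict mapping viseme name -> 0.0-1.0 strength for the
    current audio frame (illustrative input).
    """
    return max(weights, key=weights.get)

# Example: the "aa" mouth shape wins this frame, so the character's
# mouth sprite would be switched to the "aa" sprite.
frame = {"sil": 0.1, "aa": 0.8, "E": 0.3}
print(strongest_viseme(frame))  # prints "aa"
```

In the graph this is done with nodes rather than code, but the idea is the same: compare all the viseme values, pick the biggest, and drive the character's mouth from that one name.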
That should be all for the graph. It didn't end up working for me, though. I have to go back over the sprites to see whether they were the problem, or whether something is wrong with the graph itself. I need to do more research on the graph and understand it a bit better.