Part 2: Using Oculus Lip-sync, Setting Up the Graph


In this week's post I added the graph. This is the most important part, as it makes the mouth move based on the audio using the OVRLipSync plugin.

The lip sync will be based on this reference:

OVRLipSync reference images


First I added the visemes node, which outputs a name and a value for each viseme based on the audio.



I added Set nodes by dragging off the return value, promoting it to a variable, and renaming it.

I also added variables by clicking the plus sign on the left side of the screen.


From my understanding, this part starts the value at zero and then works up to the greatest value. It stores it in the Set Viseme Value node and, for each of the 15 mouth shapes in the sequence, checks whether that shape's value is higher. If another value is higher, it replaces the stored one.
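The loop described above can be sketched in plain C++ (this is not actual Unreal or OVRLipSync API; the function name `findHighestViseme` and variable names are my own illustration of the Blueprint logic):

```cpp
#include <vector>
#include <cstddef>

// Mirror of the Blueprint loop: start the stored value at zero,
// walk the 15 viseme weights, and remember the index of the
// highest one. visemeValue plays the role of the Set Viseme Value node.
std::size_t findHighestViseme(const std::vector<float>& weights) {
    float visemeValue = 0.0f;       // starts at zero each update
    std::size_t highestIndex = 0;
    for (std::size_t i = 0; i < weights.size(); ++i) {
        if (weights[i] > visemeValue) {  // replace if this value is higher
            visemeValue = weights[i];
            highestIndex = i;
        }
    }
    return highestIndex;
}
```

The index that comes out is what later picks the matching mouth sprite.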



I then added another variable for my sprites and renamed it VisemeSprite. For the variable type I selected Paper Sprite, and on the right side I selected Array.


In the VisemeSprite array, I added the sprites in order, following the Oculus reference.
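For reference, the 15 visemes in OVRLipSync's standard order, which the sprite array entries need to match index-for-index, can be written out like this (plain C++, just for documentation):

```cpp
#include <array>
#include <string>

// The 15 OVRLipSync visemes in their standard order; the VisemeSprite
// array in the Blueprint must line up with this list by index.
const std::array<std::string, 15> kVisemeOrder = {
    "sil", "PP", "FF", "TH", "DD", "kk", "CH",
    "SS",  "nn", "RR", "aa", "E",  "ih", "oh", "ou"
};
```

If a sprite sits at the wrong index, the character will show the wrong mouth shape even when the weights are correct.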


This part gets the viseme with the highest value and sets the matching sprite on the character.


Adding this part sets the value back to zero, resetting everything so a different mouth sprite can be chosen on the next update.


That should be all for the graph. It didn't end up working for me, though. I need to look back at the sprites to see whether they caused the problem, or whether something is wrong with the graph itself. I need to do more research on the graph and understand it a bit better.

