Posts

Cleaning up the mouth animation in Adobe Animate

This week I worked on the final lip sync. Instead of creating separate audio clips for each phrase the main character says, we decided it would be easier to combine all the audio we are using into one clip. For the lip sync itself I used Adobe Animate, because it has an auto lip sync feature that syncs the mouth to the audio for me. As easy as that sounds, the auto lip sync isn't 100 percent accurate. I had to go over it a couple of times to make sure the mouth doesn't look too chatty, remove unnecessary mouth movements, and make sure everything aligns with the audio.

Here are some of the things I did in Adobe Animate when cleaning up the lip sync. To change a mouth expression, I clicked on the keyframe where I want the expression to change, clicked on the mouth on the stage, and then clicked the Frame Picker on the right. A panel pops up with all the mouth animation that I added i...
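The cleanup pass above can be sketched in code. This is a hypothetical example (not how Animate stores its data): each keyframe is a `(frame, mouth_shape)` pair, and any mouth change held for fewer than `min_hold` frames is dropped, which is one way to make an auto-generated track look less "chatty".

```python
# Hypothetical sketch of smoothing an auto lip sync track.
# Each keyframe is (frame_number, mouth_shape). Auto lip sync often
# produces one- or two-frame mouth changes that look too chatty, so
# this drops any change held for fewer than min_hold frames.

def smooth_keyframes(keyframes, min_hold=2):
    if not keyframes:
        return []
    smoothed = [keyframes[0]]
    for frame, shape in keyframes[1:]:
        # How long has the previous mouth shape been held?
        held = frame - smoothed[-1][0]
        if held < min_hold:
            # Too brief: skip this change so the previous shape holds longer.
            continue
        if shape == smoothed[-1][1]:
            # Same shape as before: redundant keyframe.
            continue
        smoothed.append((frame, shape))
    return smoothed

track = [(0, "M"), (1, "E"), (2, "E"), (5, "O"), (6, "M"), (10, "rest")]
print(smooth_keyframes(track))  # [(0, 'M'), (2, 'E'), (5, 'O'), (10, 'rest')]
```

The one-frame "E" and "M" changes are dropped, and the remaining keyframes hold long enough to read clearly.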

Updating the mouth

Last week I worked on the lip sync for the character. I added a new mouth for the character that looks a little cleaner and more simplistic, and I also added the new audio for the character that we recorded.

First, I drew out the different mouth expressions in Photoshop. I used Photoshop because I am more comfortable and experienced with that software than with Adobe Animate. I transferred the mouth expressions into Adobe Animate, where I ran the lip sync. From there I exported it as a PNG image sequence and imported it into Unreal. I followed the same process as before, using the media track in the sequencer to play the lip sync.

I did run into a problem where the plane had a black background instead of being transparent. Usually an exported PNG should keep its transparency. What I think went wrong is that I painted the background black in Photoshop. This was luckil...
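One way to catch the black-background problem before importing into Unreal is to check whether the exported PNGs actually carry an alpha channel. A minimal sketch using only the standard library (it reads the color type byte in the PNG's IHDR chunk; note palette PNGs can also gain transparency via a tRNS chunk, which this doesn't check):

```python
# Hypothetical sanity check for the transparency problem: inspect a PNG's
# IHDR color type to confirm the file has an alpha channel.
# Color type 4 (grayscale+alpha) and 6 (RGBA) carry transparency;
# 0, 2, and 3 normally do not.
import struct

def png_has_alpha_channel(data: bytes) -> bool:
    signature = b"\x89PNG\r\n\x1a\n"
    if not data.startswith(signature):
        raise ValueError("not a PNG file")
    # IHDR layout after the 8-byte signature:
    # 4-byte length, 4-byte "IHDR", 4-byte width, 4-byte height,
    # 1-byte bit depth, then 1-byte color type at offset 25
    color_type = data[25]
    return color_type in (4, 6)

# Build a minimal fake header for demonstration (color type 6 = RGBA)
header = b"\x89PNG\r\n\x1a\n" + struct.pack(">I", 13) + b"IHDR"
header += struct.pack(">IIBB", 64, 64, 8, 6)
print(png_has_alpha_channel(header))  # True
```

Running this over the exported frames would have flagged the opaque files before they ever reached the sequencer.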

Using Image Sequence for Video in Unreal

After the Oculus lip sync plug-in did not work, our other option was After Effects, but applying the 2D animation on top of the final edit would just take too long. We ended up trying to animate materials in the sequencer in Unreal. I was able to apply one mouth expression onto a plane, but I was left to figure out how to add the other mouth expressions. I ended up figuring out how to bring a PNG image sequence into Unreal so it can play smoothly.

In the first couple of weeks of school I used Adobe Animate's lip sync for the mouth animation. I exported the animation as a PNG image sequence and placed the frames in a folder.

In Unreal I created a folder in the Content folder and named it Movies. I dragged and dropped the folder of PNG frames into the Unreal project folder, under the Movies folder that I created.

In the Movies folder, I added an Img Media Source by right mous...
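For an image sequence to play smoothly, the exported frames need consecutive numbering with no gaps. A small hedged helper (the `mouth.####.png` naming is made up for the example, not the project's actual filenames) that reports missing frame numbers in a folder listing:

```python
# Hypothetical check before dragging a PNG sequence into Unreal:
# find gaps in the frame numbering, since a missing frame can cause
# the sequence to stutter or stop during playback.
import re

def find_missing_frames(filenames):
    numbers = sorted(
        int(m.group(1))
        for name in filenames
        if (m := re.search(r"(\d+)\.png$", name))
    )
    if not numbers:
        return []
    expected = set(range(numbers[0], numbers[-1] + 1))
    return sorted(expected - set(numbers))

frames = ["mouth.0001.png", "mouth.0002.png", "mouth.0004.png"]
print(find_missing_frames(frames))  # [3]
```

In practice you would build `filenames` from the export folder (e.g. with `os.listdir`) before copying it under Content/Movies.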

Using After Effects for character mouth animation

The Oculus lip sync plug-in did not work for me; the graph from the blueprint did not sync with the audio. I decided to take a different approach: using Adobe After Effects for the 2D mouth animation. For this I would have to take the final take and add the mouth animation on top of it.

The first thing I did in After Effects was import my video and audio by dragging and dropping them into the project panel on the left side of the screen. Then I clicked New Composition and pressed OK. I dragged the video I imported and dropped it on top of the composition, adjusting it to fit the screen size. To add the audio, I clicked and dragged it into the timeline. I added a new composition for the mouth animation by left mouse clicking on the project panel.

I made the mouth animations in Adobe Illustrator and saved them as .ai files. In After Effects I double clicked on the project panel and imported the mou...
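Lining mouth art up against the audio in the timeline is mostly frame arithmetic. A small sketch of the conversion, assuming a 24 fps composition (the real project's frame rate may differ):

```python
# Hypothetical timing helpers for placing mouth keyframes against audio:
# convert between seconds and frame numbers at the composition frame rate.
# 24 fps is an assumption for the example, not the project's real setting.

def seconds_to_frame(seconds: float, fps: float = 24.0) -> int:
    return round(seconds * fps)

def frame_to_seconds(frame: int, fps: float = 24.0) -> float:
    return frame / fps

# A word starting 1.5 seconds into the audio lands on frame 36 at 24 fps.
print(seconds_to_frame(1.5))   # 36
print(frame_to_seconds(36))    # 1.5
```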

Part 2: Using Oculus Lip-sync, setting up the graph

In this week's post I added the graph. This is the most important part, as it makes the mouth move based on the audio using the OVRLipSync plug-in. (OVRLipSync reference images.)

First I added the visemes, which assign a value to each viseme name based on the audio. I added Set nodes by clicking and dragging the return value, promoting it to a variable, and renaming it. I also added variables by clicking the plus sign on the left side of the screen.

From my understanding, the value starts at zero and then works up to the greatest value. It is stored in the VisemeValue set, and across the 15 mouth sequences the graph checks whether each value is higher or not, replacing the stored sequence whenever another value is higher. I then added another variable for my sprites and renamed it VisemeSprite. For the variable type I clicked Paper Sprite, and on the right side I clicked Array. On the Visem...
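The loop described above is essentially an argmax over the 15 viseme weights. A sketch of what I understand the Blueprint graph to be doing, in code (the viseme names follow Oculus's standard set, but the sprite names here are placeholders, not the real project assets):

```python
# Sketch of the Blueprint logic: OVRLipSync reports a weight for each of
# the 15 visemes every frame; start at zero, keep whichever viseme has the
# greatest value, and use its index to pick the matching mouth sprite.
# Sprite names are placeholders for this example.

VISEMES = ["sil", "PP", "FF", "TH", "DD", "kk", "CH", "SS",
           "nn", "RR", "aa", "E", "ih", "oh", "ou"]  # 15 visemes

def pick_mouth_sprite(weights, sprites):
    best_index, best_value = 0, 0.0
    for i, value in enumerate(weights):
        if value > best_value:
            # A higher value replaces the stored one, like the Set node.
            best_index, best_value = i, value
    return sprites[best_index]

weights = [0.05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
           0.0, 0.0, 0.8, 0.1, 0.0, 0.05, 0.0]
sprites = [f"Mouth_{name}" for name in VISEMES]
print(pick_mouth_sprite(weights, sprites))  # Mouth_aa
```

With all weights near zero the function falls back to index 0, the silence viseme, which matches the mouth resting between phrases.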

Part 1 Using Oculus lip-sync in Unreal Engine 5.3

     In this week’s blog, with the help of one of my group mates Joaquin figure d out how to install the plug-in into unreal . We couldn’t figure out exactly why the plug-in would not install but Joaquin saw another OVRlipsync folder. He dropped the folder into Unreal’s plug-in folder. In U nreal   the OVRlipsync plug-in should show up in the plug- i ns folder called oculus lipsync . Once I found the plug in, I checked on the box and a window popped up to ask to restart the software for the pl ug-in to show up into unreal to be able to use it . Once that's all done I was able to use the oculus lip sync.        Starting off in the model BP in the component I added a OVRLipsync Playback Actor and audio .        Back to the content drawer in unreal I added a new folder and renamed it audio . In the folder I dragged and dropped a wav audio file. Hovering over the audio and right mouse click , I cl...