Simple Sync Lite™ – Lip Sync for Unity

This is a demo of Simple Sync Lite™, a simple lip sync system for Unity. Just drop it on a ‘mouth’ object, assign an audio clip, and it will move, scale, or rotate the mouth parts. A more comprehensive solution including phoneme morphing is in the works, but hey, this gets you pretty far.

Check it out in the Asset Store
NOTE: If you purchased this package and it is missing any files, contact me! This seems to be a bug in Unity’s Asset Store tools and will be rectified in version 1.04 which should be up by 2/26/14. Sorry for any inconvenience!

NOTE: If your jaw is already animated, you’ll need to disable that so that this script can take over. To do that:

    For Mecanim:

In the Inspector, open the model’s Import Settings and go to the Animations tab. Expand the Transform Mask (at the bottom) to find the joints (transforms) that will be under the control of Simple Sync™, and uncheck them so they are no longer driven by the animation.

    For Legacy animation:

If your jaw/mouth parts are under control of the animation, drop this script on each transform that needs to be non-animated (that you will override with Simple Sync™).

Many thanks to Kevin Gallant for a tutorial vid

These components are designed to be modular and can work in combinations if needed.
They will work on 2D or 3D models.

Work Flow:
————–
The basic work-flow is:
Find your mouth’s ‘jaw’
Drop onto it the appropriate SimpleSync component(s)
Assign the audio clip
Determine minimum and maximum ranges (from ‘mouth closed’ to ‘mouth open’)
Hit Play!

Types of Mouth Movement
———————————
First, determine what kind of movement the mouth you are animating needs to do.

Does it just move up and down (no rotation)? If so you want to use a SimpleSyncMovement component.

If it is the type of jaw which rotates (as most jaws do), use a SimpleSyncRotation component.

If you want the mouth to resize, use a SimpleSyncScale component. This might be most appropriate for 2D cartoonish figures, but use as you see fit.

Other types of movement are certainly possible, but you may have to script that movement yourself. If you are writing your own script to drive your jaw, just use the SimpleSyncVolume component. (All other components automatically use this most basic component.)
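
To make this concrete, here is a rough sketch, in Python rather than Unity C#, of the kind of loudness computation a volume-driven component performs. The function name and the scale factor are illustrative, not the actual Simple Sync API:

```python
import math

def rms_loudness(samples):
    """Root-mean-square loudness of a window of audio samples in [-1, 1]."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A custom jaw driver would read a value like this every frame and map it
# onto whatever motion it wants (a blend-shape weight, a sprite swap, etc.).
window = [0.0, 0.5, -0.5, 0.5, -0.5]              # pretend audio window
openness = min(1.0, rms_loudness(window) * 2.0)   # 2.0 plays the 'motion scale' role
```

Whatever drives the motion, the idea is the same: one loudness number per frame, which any number of mouth parts can then consume.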

Note that if the jaw you are animating is the bone of a rigged character, you may be able to use rotation, scale, or movement to move the bones and/or lips.

Note that multiple parts (bones or whatever) can be driven by the same SimpleSyncVolume component.

Setting Up The Scene:
—————————

Please refer to the ‘Simple Sync Lite Demo’ project.

If you have a rig (armature), find the jaw. You may need to follow the hierarchy of the character rig, testing the rotation of bones as you go, until you find the jaw.

If the jaw is a simple hinge (as in most animals), add the Rotation component; it will automatically add the audio source and Volume component. Using the Editor’s Move and/or Rotate tools, move the jaw, taking note of the values at ‘closed’ and ‘open’ (that is, as far open as you ever want it to be). Set the minimum and maximum values accordingly. Generally, the minimum will be 0 and the maximum rotation will be about 10 degrees. You will also need to determine which axis of rotation the jaw uses: play with the Rotate tool to see which axis value is changing, and when you find the correct axis, set it on the SimpleSyncRotation component.

Attach your audio source (this can be set dynamically with scripts, of course).

Hit Play to see the results. If the jaw moved the correct way, you’re done. If it moved in the opposite direction from what you expected, reverse the minimum and maximum values (sometimes the maximum will be negative, indicating a negative offset from the start; that’s ok). If the jaw moves sideways instead of up and down, you need to use another axis.

You can also set the ‘intensity’ on the Volume component. This is just a convenience scale factor so you don’t have to fiddle with the ranges too much. Think of it as: 0 is silent, low numbers are ‘mumbling’, and larger values are ‘yelling’.
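
Put together, the volume-to-motion mapping amounts to a clamp followed by a linear interpolation between the minimum and maximum. A small Python sketch of the math (illustrative only, not the shipped code):

```python
def jaw_angle(volume, intensity, range_min, range_max):
    """Map a 0..1 loudness value onto a jaw rotation between
    range_min ('closed') and range_max ('fully open')."""
    t = max(0.0, min(1.0, volume * intensity))      # clamp to 0..1
    return range_min + t * (range_max - range_min)  # linear interpolation

# Silence keeps the jaw closed; full volume opens it to the maximum:
jaw_angle(0.0, 1.0, 0.0, 10.0)   # -> 0.0
jaw_angle(1.0, 1.0, 0.0, 10.0)   # -> 10.0
# Reversed minimum/maximum simply flips the direction of motion:
jaw_angle(1.0, 1.0, 0.0, -10.0)  # -> -10.0
```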

If your jaw is a floating bone, or any type of mouth part that simply moves in a linear fashion, use a SimpleSyncMovement component.

The SimpleSyncVolume component includes a ‘Use Mic’ checkbox. When checked, it will try to use the default microphone as input, so you can see your character’s jaw move to the sound of your own voice. You can set the ‘delay’ value, in seconds. Smaller numbers mean less delay, but zero is not a good value here. Please see the ‘SimpleSync Lite Skull Parrot’ project for an example.
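
Conceptually, the delay exists because microphone samples must be buffered before analysis can start; the mic path behaves like a delay line. A small Python sketch of the idea (the class name and the tiny sample rate are made up for illustration):

```python
from collections import deque

class MicDelayBuffer:
    """Sketch of why the mic path needs a delay: samples are buffered for
    `delay_seconds` before any come out, so analysis always lags slightly."""
    def __init__(self, delay_seconds, sample_rate):
        size = max(1, int(delay_seconds * sample_rate))
        self.buf = deque([0.0] * size, maxlen=size)

    def push(self, sample):
        delayed = self.buf[0]    # oldest sample falls out...
        self.buf.append(sample)  # ...as the newest is stored
        return delayed

buf = MicDelayBuffer(delay_seconds=1.0, sample_rate=4)  # tiny rate for demo
delayed = [buf.push(s) for s in [1.0, 2.0, 3.0, 4.0, 5.0]]
# output starts one 'second' (4 samples) late
```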

Credits
———
Skinned Mesh Head courtesy of Carlo Cassani

TheMonkey model Copyright 2009 Kursad Karatas
Licensed under Attribution-Share Alike 3.0 Unported. | http://creativecommons.org/licenses/by-sa/3.0/

Reference:
————-

SimpleSyncVolume

Response: Very Fast, Fast, Normal, Slow, Very Slow – the amount of ‘smoothing’ applied to the sound before it is turned into motion data. You may need to experiment with this to suit your needs.
Use Mic : Check this to use the default microphone instead of a pre-recorded audio clip.
Mic Sample Rate : If using the microphone, the sampling rate.
Delay : If using the microphone, the delay. This is actually the seconds of samples collected before lip-sync begins. Minimum is 1.
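
For what it’s worth, the Response setting is effectively a smoothing amount. Exponential smoothing is one common way such a control is implemented; this is an illustration of the concept, not necessarily how Simple Sync does it internally:

```python
def smooth(samples, response):
    """Exponential moving average: response near 1.0 reacts instantly
    ('Very Fast'); near 0.0 it changes sluggishly ('Very Slow')."""
    level, out = 0.0, []
    for s in samples:
        level += response * (s - level)  # move part-way toward the new sample
        out.append(level)
    return out

# A sudden loud sound: fast response follows it at once, slow eases in.
fast = smooth([1.0, 1.0, 1.0], response=1.0)  # -> [1.0, 1.0, 1.0]
slow = smooth([1.0, 1.0, 1.0], response=0.5)  # -> [0.5, 0.75, 0.875]
```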

SimpleSyncMovement

Mouth : Reference to the Transform which will move. Note that if this is null, the object this script is attached to is used. The transform can be a bone on a rig.
Motion Scale : A multiplication factor applied to the incoming sound data. Smaller numbers make smaller movements; larger numbers make larger movements. Normally this would be 1 if the sound covers the full dynamic range, but since that’s not common, you can use this to ‘turn up the volume’, so to speak.
Range Minimum : The lower-limit of the translational displacement for the Mouth
Range Maximum : The upper-limit of the translational displacement for the Mouth

SimpleSyncScale

Mouth : Reference to the Transform which will resize. Note that if this is null, the object this script is attached to is used. The transform can be a bone on a rig.
Motion Scale : A multiplication factor applied to the incoming sound data. Smaller numbers make smaller movements; larger numbers make larger movements. Normally this would be 1 if the sound covers the full dynamic range, but since that’s not common, you can use this to ‘turn up the volume’, so to speak.
Range Minimum : The lower-limit of the scale for the Mouth
Range Maximum : The upper-limit of the scale for the Mouth

SimpleSyncRotation

Mouth : Reference to the Transform which will rotate, like a jaw. The pivot point should be set up such that normal rotation about one axis makes a nice talking motion. Note that if this is null, the object this script is attached to is used. The transform can be a bone on a rig.
Motion Scale : A multiplication factor applied to the incoming sound data. Smaller numbers make smaller movements; larger numbers make larger movements. Normally this would be 1 if the sound covers the full dynamic range, but since that’s not common, you can use this to ‘turn up the volume’, so to speak.
Range Minimum : The lower-limit of angular displacement for the Mouth
Range Maximum : The upper-limit of the angular displacement for the Mouth
Jaw Axis: X, Y, or Z – the axis about which this jaw rotates.
Orig Rot : Original rotation values for the Mouth. These are retained at Start to have a reference from which to move.

37 thoughts on “Simple Sync Lite™ – Lip Sync for Unity”

  1. Hi All,
    I’m Patrick Vier from Post logic, a French company.
    I need to show something like that in Unity. Is there a way to get a demo version (time limited) to use your plugin in Unity and show a concept with lip sync (Lite)? Thanks for a reply from you.
    Best Regards
    Patrick

  2. Hi,
    Very interesting lip sync. This is a big issue for me, for I am a big advocate of serious games and training simulations, but they suffer from a lack of appropriate audio feedback with avatar characters, and this is becoming a bigger and bigger issue in our industry as a whole every day. If you have a legitimate plugin/extension solution for Unity to do this, there are quite a few people who would be standing in line with money in their hands, myself included. Please respond to my e-mail and let me know the current status of this project and if there is anything an Instructional Technology person can do to help or push this technology through, as opposed to a programmer, which I am not (scripter maybe, but not a programmer). Thanks for any additional info.

  3. Hey, is it possible to license Simple Sync Lite for our commercial product in development? Looking for a very simplistic solution, but I cannot find whether this one is available. Would appreciate a quick reply.

  4. Thanks all for your interest in SimpleSync. The project is currently in beta testing. Pending some feedback and fixes it should be ready in about a week.

  5. Hi,
    We are working on a product in Unity for severely disabled people and we are about to start developing a text to speech to face animation system very much like what you are describing here. I would be very interested to hear from you about the level of development you are currently in and at which level you are aiming to bring this too. We are ready to discuss all types of possible collaborations from some type of licensing to maybe some more direct collaboration.

    I am looking forward to hearing from you.

    • It’s just about ready to ship. Getting art and video together. It really is simple – just the current volume level adjusting the size, rotation, scale, or whatever on a mouth. So for text-to-speech it would work by attaching the audio source which plays the converted text to this component. I do plan to make it understand phonemes in the future for better control. I don’t currently have plans to integrate text with it, but that’s not out of the question. Reminds me of the old Amiga days. I could probably bolt something like that in.

      • Hi, Very interested in finding out the current status of your SimpleSync lip syncing product for Unity. I am in the process of defining my Dissertation project, which is going to be using Intelligent Animated Pedagogical Agents within Unity for a scenario-based training simulation, and I am looking for the easiest-to-implement solution, but one that has enough realism to be usable for training simulations; being more close up during these scenarios, the lip syncing has to be pretty realistic. Please reply to my e-mail when you have time to let me know where things currently stand, how I might acquire a copy, and if you have determined any pricing for educational use.

  6. Do you have any update on the progress of this package? Or are you still looking for beta testers? I’m looking for a potential solution for doing some quick and dirty lip syncing, and this might do the job.

    Thanks

  7. You say that it gets the current “volume level”. I am assuming that it uses AudioSource.GetOutputData, but I cannot figure out how to get the appropriate timing in the array.

    Could you give us any suggestions?

    • I’m sorry I don’t understand your question. Are you asking how I do it? My suggestion would be to buy the package, of course 😉 If you have bought it and are having problems, I’ll be glad to help.

      • I was more than happy to pay for the code. I checked the Asset Store and cannot find the package. Either way, I have figured it out. It was important that I create one that runs with Playmaker. Thanks, either way, for the quick reply.

  8. Looking to control a character live with a microphone. Is there any way to have the character talk via a live feed? Would you be willing to take on a project like that?

  9. I got the plugin for Unity and it seems to work fine, but it won’t blend over my other animations. My character has an idle animation on him, but when I apply the Simple Sync script, nothing happens, as the jaw rotation is probably being controlled by the idle animation. Is there a way to layer these things?

    • Yes I need to add this to the documentation. Go to your avatar controller setup and disable the jaw on the avatar map, so that animations will not affect it. I will post more details when I can (I am away at the moment).

  10. I have a face model that I export from Maya as FBX and import into Unity. It has no specific ‘jaw’; it is just a model of a face based on vertices. The mouth is not a separate object: the entire face is one object. In Maya I would use blend shapes to do lip animation, but since Unity doesn’t support that, I thought your solution might be handy. My question is, will it work for my model?

    Thanks!

    • Generally, there must be something to move or rotate, and generally that’s a ‘bone’. So if you could weight-paint that jaw/cheek area to at least one bone, it would work. If you have just those shapes, would you mind sending me some files to work on? I’m thinking a ‘mouth closed/no talking’ shape and one ‘mouth open/saying something’ shape, and I’ll see what I can do. Would be fine as DAE or FBX.

  11. Hi
    I’m working with 2D sprites in Unity. I have different mouths for each phoneme. Would it work for me? Can Simple Sync Lite assign each 2D mouth to a different phoneme automatically?
    Thx.

    • It’s not set up to do phonemes, that’s the ‘pro’ version I’m still working on. The ‘Lite’ version just uses current loudness to set size, position, or rotation. BUT: I could extend this to have it choose from a list of pre-set mouths, sorted from ‘closed’ to ‘open’ and it would pick or interpolate between them. I could roll this into the next ‘Lite’ version. The Pro version is a long way off unfortunately.

  12. Hello,
    How can I check if your asset works with some 3D models I own before buying it? Having a jaw bone that makes the mouth open and close is enough?

    • Hi Yiorgos,

      Yes, pretty much. You can check by finding the ‘jaw’ bone and rotating it a little in the Editor. If it can rotate a bit and look right (like the jaw is moving to open/close the mouth), that’s it, and use the Rotate option. If the bone needs to move (translate, not rotate), that’s ok too, just use the Translate (Move) option.

    • Although I have not tested it myself on iOS, it should work. That is, it should work as well as Unity can support audio properly on iOS!

  13. Hi, I just purchased the asset from the Unity store and I like it!

    I had to make some changes since I didn’t want a delay, so I pretty much combined your scripts with this script: http://u3d.as/content/alzheimer-studio/mic-control/5Rw
    so that I don’t have to deal with that 1 second delay since I am using mic input.

    I wanted to know if there is a way to slow down either the movement or rotation of a joint. I tried by multiplying the value variable with Time.deltaTime but it just makes a smaller movement like how changing motion scale would. Since there is audio constantly coming in, the mouth keeps quivering unless I have an if statement saying to move the mouth when the volume is above 1.5.

    • Hi Edgar, Thanks for the purchase! And thanks for the link, I’m always looking to improve my wares, I hope to have an update in a few weeks. Did you try the ‘Response’ setting on the ‘Simple Sync Volume’ script? That will set the rate at which the jaw responds, setting to slow will ‘even out’ the motion a bit. Let me know if that doesn’t do it for you, I’ll work on it for the next release.

      • I tried using it but realized that the response parameter wouldn’t work unless I had the “Use microphone” option checked, which I can’t because I am using the code for sound from the link that I sent you.

        I did try taking the code that deals with “response” and putting it in my own script, which worked with some modifications, but after some testing (I made sure to change the response in the correct component in the inspector), I can’t see much of a difference. I tried multiplying the “winWidth” with a higher number like 64, and it seems like it’s responding slightly slower but the mouth snaps back so quickly. Part of it is because of what I stated in my previous post, “Since there is audio constantly coming in, the mouth keeps quivering unless I have an if statement saying to move the mouth when the volume is above 1.5.”

        I think that is what is causing part or most of the snappiness when the mouth closes again but I don’t know any other way to keep the model’s mouth closed when no one is talking into the mic. Maybe some kind of filter to get rid of background noise?

        Thanks for responding back so quickly and I’m glad I helped out by providing the link!
