With single-click calibration, Studio tracks and animates any facial performance. Our neural network technology easily recognizes your face, whether from a live camera or from pre-recorded media. Interactive events, virtual production, and large-scale content creation are all made possible with Studio.
Motion Effects is a powerful system for building additional logic into your realtime data stream. Effects are used to manipulate your data to perform exactly the way you want it to, giving you unparalleled and direct control over your final animation.
Use the Realtime Setup panel to select the camera, image sequence, or pre-recorded video that you want to track. Then, use the controls in the panel to adjust options like frame rate, resolution, and rotation. Depending on your input, select the Stationary or Professional Headcam model. The Pathfinder feature alerts you as to whether your realtime setup is optimal.
With your realtime setup complete, the Tracking Viewport will begin displaying your media. In order to calibrate your tracking, your actor should be properly framed, focused, and in a neutral pose. If using pre-recorded media, use the media playback controls to find an ideal neutral pose. Then, use the Calibrate Neutral Pose button to begin tracking.
The Animation Tuning panel contains a list of each shape and its corresponding value as your face is being tracked. The moving, colored bars give you instant feedback about each shape. Click and drag the slider to increase or decrease the influence of each shape. In the Motion Effects panel, use the included effects to enhance your data, or use Python to create your own custom effects on a per-control basis.
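As a rough illustration of the kind of per-control logic a custom Motion Effect might apply, here is a minimal sketch. Studio's actual Python effect API is not documented here, so the function name and the way values arrive are assumptions; only the idea of operating on a single shape's 0-1 value per frame comes from the text above.

```python
# Hypothetical per-control effect: blend each incoming shape value
# toward the previous frame (simple smoothing) and clamp to 0..1.
# The function name and signature are illustrative, not Studio's API.

def smooth_and_clamp(value, previous, strength=0.5):
    """Blend the incoming value with the previous frame's value,
    then clamp the result to the valid 0..1 shape range."""
    blended = previous + (value - previous) * strength
    return max(0.0, min(1.0, blended))

# Example: damp a jittery stream of values for one control.
stream = [0.10, 0.90, 0.20, 0.85]
prev = 0.0
smoothed = []
for v in stream:
    prev = smooth_and_clamp(v, prev)
    smoothed.append(prev)
print(smoothed)
```

A lower `strength` gives heavier smoothing at the cost of response lag, which is the usual trade-off when filtering realtime animation data.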
Faceware Studio connects to Unreal Engine through our free Live Link plugin. Unreal is an ideal choice for virtual production, pre-visualization, and projects that require high-quality character rendering, and it provides an intuitive toolset for connecting the animation data from Studio to your character.
Capture, edit, and play back complex character animation with MotionBuilder 3D character animation software. Work in an interactive environment that's optimized for both animators and directors. Create realistic movement for your characters with one of the industry's fastest animation tools.
Faceware Studio connects to MotionBuilder through Live Client for MotionBuilder, a free plugin available through your Faceware User Portal. MotionBuilder is a common and ideal choice for traditional motion capture pipelines that need to record the facial animation data streamed from Studio.
While this is an experimental and unsupported feature, it is possible to use your own character by replacing the current preview character with an FBX of your choosing that mimics the hierarchy and naming conventions of the current character. Write in to sup...@facewaretech.com for more information!
Absolutely! The data streaming from Faceware Studio is in JSON format, streamed over TCP/IP. Connecting to the socket and parsing the data is straightforward. Contact sup...@facewaretech.com with any questions.
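To make that concrete, here is a minimal sketch of reading such a stream. The only facts from the answer above are JSON payloads over a TCP/IP socket; the field names (`animationValues`, `jaw_open`, `brows_up`) and the newline-delimited framing are assumptions for illustration. A local socketpair stands in for the real connection so the example is self-contained.

```python
import json
import socket

# A socketpair simulates the Studio connection; in practice you would
# connect a TCP client socket to Studio's configured host and port.
server, client = socket.socketpair()

# Pretend this frame came from Studio (hypothetical field names).
frame = {"animationValues": {"jaw_open": 0.42, "brows_up": 0.10}}
server.sendall((json.dumps(frame) + "\n").encode("utf-8"))
server.close()

# Read until the stream closes, then parse one JSON message per line.
buffer = b""
while True:
    chunk = client.recv(4096)
    if not chunk:
        break
    buffer += chunk
client.close()

for line in buffer.decode("utf-8").splitlines():
    data = json.loads(line)
    print(data["animationValues"])
```

In a live setup you would parse messages as chunks arrive rather than waiting for the socket to close, but the buffering-and-split pattern is the same.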
Faceware Realtime for iClone is the face tracking software provided by Faceware Technologies Inc.
Contact Faceware Technologies Inc. Support by emailing sup...@facewaretech.com only if you are a purchased user with one of the following support requests:
*To get support from Faceware, please make sure that you provide the "Ticket" for your Faceware Realtime for iClone so that they can identify your product version. The "Ticket" is the same as the Serial Number issued to you in the order email from Reallusion.
You can also create and share Pose Libraries in Retargeter Studio Plus across your animation team, effectively allowing the team to set up one library for an entire production rather than recreating it for each shot.
In addition, Studio Plus versions allow for batch/API processing capabilities that provide an infrastructure to quickly process hundreds (even thousands) of lines of dialogue for larger projects. With this, you can set up a semi-automated pipeline to quickly track large collections of data, and get each shot to the 'polish' phase in 70% of the time.
For our software licenses, it depends on the type of license you own. If you purchase Perpetual Licenses of our software, you may upgrade from Studio to Studio Plus at a later date as long as your support is active. Should you upgrade to a higher-priced version, we will credit what you have already paid toward the cost of the upgrade.
We do not allow software upgrades of our new Annual or Academic Licenses. For example, you would need to purchase a new Studio Plus Annual License if you wished to upgrade from a Studio Annual License.
Yes, you can purchase multiple years of Annual Licenses at one time. Since our Annual License rates potentially change from year-to-year, purchasing multi-year licenses now will ensure you have a license for multiple years at a reduced rate.
If you are in need of hardware or a larger bulk purchase that would require a quote, you can make your payment after the quotation has been confirmed in our system and an invoice has been issued to your email address. From there, you can pay via any of the options listed above.
Please note, however, that credit card payments incur a 4.5% processing fee. Once receipt of your full invoice payment has been confirmed, we will begin the process of issuing your software licenses and/or preparing to ship your hardware.
Our systems are largely customized and can vary depending on vendor and availability. We work to make sure we have all components available at the time an order has been placed. Once payment or terms have been received, we will work to ship out your system within 3-10 business days, based on our current order queue.
It is important to note, however, that the person or company receiving the shipment is legally obliged to pay customs duties. Customs duties and taxes are only set when your shipment is imported into the country. Delivery fees from your sales invoice are ONLY for the cost of shipping the freight.
Please contact sup...@facewaretech.com to coordinate the return of any hardware products in need of repair or replacement. Once proper maintenance on your hardware has been conducted, we will send your system back to you within two to ten business days, depending on whether the needed components are Faceware-manufactured or third-party.
For the Studio Plus versions of our software, no internet connection is required. For Studio and Trial versions, an active internet connection is required only during the initial credential check whenever the software is launched. During the actual workflow process, no internet connection is required.
For calibration/neutral pose, try this: before you make a neutral pose, scrunch your face first, like you just tasted a lemon, then let your face muscles completely relax. That should give you a good neutral pose.
Depending on what part of the face you really want the solver to hone in on, like the mouth for instance, move your camera slightly lower so that it's looking not straight ahead but just a tad lower, angled up toward your nose.
Another tip, the Pro Cam settings do not always give the best results. I find that whether I am using DSLR prerecorded footage or footage from the Mark IV HMC, sometimes I get better results from the stationary cam.
Hello Gabriella, nice idea to have a thread about Faceware Studio. It would be nice to have an example of how you managed to achieve such nice facial mocap in your last videos. I'm using a good stationary Logitech 60fps webcam, but I have not managed to get the results I would like. Besides the Animation Tuning tab, did you also use the Motion Effects tab to adjust the curves?
So the thing is, regardless of OBS, if you are using an iPhone that is capturing at 30fps, I would leave it at 30 and not mess with it. The highest you can stream into Faceware is 60. Faceware will automatically pick up the fps you are streaming in, so I always leave that alone. I have tried increasing the fps in FW but encountered other issues, such as audio sync, later on down the road.
Using a webcam to achieve good facial motion can be tricky. First, get even lighting on your face; then get your face close enough to the camera that it takes up the majority of the screen. The closer your face is, the better the solver will be able to track your expressions.
If you are not using the head/neck rotation, try not to move your head a lot. If you notice that moving your head is throwing the solver off (watch those lines around your eyes, nose, brows, and mouth after you calibrate), keeping it still will give you cleaner, smoother data as the end result.
I personally do not use the male head in Faceware as my reference. The best way to see how the data is translating onto your MetaHuman is to stream directly into Unreal and make your adjustments with the multiplier in Faceware that way. You can also try switching from Stationary Cam to Pro Cam.
My issue is a straightforward one really.
My trial of FaceWare Studio runs out tomorrow and I spent all my cash on a mic to make tutorials
and the massive vet bill for my dog.
Other than me taking on extra hours and extending my nightshift at work, do you have any idea how I can fix the problem and get another 6 months free?
So there are two motion logic blueprints: the Faceware Live Link plugin has its own motion logic blueprint, as does Glassbox. It looks like you are trying to use the Glassbox motion logic blueprint with the Faceware one. If you go to the Faceware Unreal course on here and download the zip folder, it should be in there. The motion logic BP labeled FWLL is the Faceware one, FYI.
Hi,
I checked the manual and I see you can use plenty of assets for lipsync, and they are played somehow automatically.
Is there a way to play animation files generated by Maya? I am using Faceware/Maya to analyze facial capture video. I can cut it the same as the sound, with the same names, create an animator for it, and call each file with a trigger or something like that. Just curious if there is a better way? (I've got several hundred lines.)
Thanks in advance for any info