Does anyone have any suggestions on how to batch-characterize motion capture files in MotionBuilder? I have all the naming conventions set and the files exported in FBX format from Blade, but I'm looking to characterize the skeleton that Blade creates. Thanks for any advice!
To characterize, you need to assign a bunch of models to different HumanIK nodes in the character definition, then enable characterization in order for MotionBuilder to work its magic. In the API, all of these mapping fields are implemented as properties of the character (object list properties, to be specific).
Once all of these mappings are set up, you can call SetCharacterizeOn to initialize characterization. It will return a boolean representing whether it succeeded, and if it failed, a user-friendly message will be accessible via GetCharacterizeError:
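A minimal sketch of that flow using the MotionBuilder Python API (pyfbsdk). It only runs inside MotionBuilder, and the slot-to-bone mapping below is illustrative; swap in your own naming convention:

```python
from pyfbsdk import FBCharacter, FBFindModelByLabelName

# Illustrative mapping from HumanIK slot names to skeleton node names;
# extend this with the remaining required slots for your rig.
SLOT_TO_BONE = {
    "Hips": "Hips",
    "LeftUpLeg": "LeftUpLeg",
    "LeftLeg": "LeftLeg",
    "RightUpLeg": "RightUpLeg",
    "RightLeg": "RightLeg",
    "Spine": "Spine",
}

character = FBCharacter("BatchCharacter")

for slot, bone in SLOT_TO_BONE.items():
    model = FBFindModelByLabelName(bone)
    # Each mapping field is an object list property named "<Slot>Link".
    prop = character.PropertyList.Find(slot + "Link")
    if model and prop:
        prop.append(model)

# Enable characterization (True = biped); report the error if it fails.
if not character.SetCharacterizeOn(True):
    print(character.GetCharacterizeError())
```

From there, batching is just a loop: open each FBX, run the mapping, characterize, and save.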
The library comprises two separate data sets, the first consisting of 175 individual moves spanning 17 motion types, including fighting and dancing, and 15 styles, mimicking the motion of humans, giants and apes.
The movements were captured from three professional actors, with Bandai Namco post-processing the raw data to remove noise and align proportions to a standard character skeleton, and are supplied at 30fps.
Compatible with most DCC applications and game engines
The files are provided in BVH format, supported natively by many DCC applications, including 3ds Max, Blender, Cinema 4D, Houdini and MotionBuilder, and in Maya via free import scripts.
So, our mocap result generated a bunch of actions, one action per bone. And actions take a lot of space, apparently. It increases our project file size from 2 MB to 500 MB (78 MB after compression). Is this normal for 2 minutes of mocap data with fingers? We used Motive for this.
I found the solution: after a successful retarget, there should be only one new combined action, and that is the only thing you need. Delete all the other actions. Access the actions list from Outliner -> Display Mode: Blender File -> Actions. Also delete the unused leftover armature.
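The deletion step can also be scripted with Blender's Python API (bpy, run from the Scripting tab). This sketch assumes the combined action is still assigned to your rig, so it has a user and survives the purge:

```python
import bpy

# Remove the leftover per-bone actions: anything with zero users.
# The retargeted, combined action stays because the armature still uses it.
for action in list(bpy.data.actions):
    if action.users == 0:
        bpy.data.actions.remove(action)
```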
I reworked the mocap tools file a little to work with namespaces on controls. To start, you have to have any control from your referenced rig selected.
Hope someone can check it and let me know if it works!
This might sound stupid to you, but I really don't know how things work in the domain of 3D animation and motion capture. I need to work with ASF/AMC files at the moment and I find them a little complex to understand (being a non-native English speaker does not help).
Edit: I am particularly thinking about the Acclaim ASF/AMC format, mainly because I need to use it with an implementation of an algorithm that uses joint vertex positions, and deltas from the initial position for the motion part. Having looked at how the data are represented, it makes more sense to me to represent skeletons as joint vertices and motion as deltas between the initial skeleton and the new, moved one.
For example, hold out your arm to your side. Notice that you still have one degree of freedom to twist the arm without affecting the positions of your elbow or wrist. Thus, if you had a motion capture format that stored only those coordinates, you would not be able to distinguish these poses.
To position any given bone, you need to go up the bone hierarchy to the root bone and combine all their transformation matrices together. For example, to position bone chest, the following sequence is used: root -> hips -> hips1 -> chest. Each of these bones has its own transformation (rotation, stretch) based on the motion capture data which affects the position of its children. The root bone is a special case that has no parents and represents the position and orientation of the whole model.
Motion capture data cannot contain only bone positions, because rotation about the bone's own axis could not be expressed any other way. It also makes more sense to store motion capture data as rotation and stretch, because those values interpolate more easily than positions, and they are what is used to pose the skeleton anyway.
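The matrix chain described above can be sketched in a few lines of NumPy; the bone offsets here are toy values, not real mocap data:

```python
import numpy as np

def rot_z(degrees):
    """4x4 homogeneous rotation about the Z axis."""
    t = np.radians(degrees)
    c, s = np.cos(t), np.sin(t)
    m = np.eye(4)
    m[:2, :2] = [[c, -s], [s, c]]
    return m

def translate(x, y, z):
    """4x4 homogeneous translation."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Toy chain root -> hips -> chest: each bone sits one unit above its
# parent, and the hips carry a 90-degree rotation from the mocap data.
root = translate(0, 0, 0)
hips = translate(0, 1, 0) @ rot_z(90)
chest = translate(0, 1, 0)

# World transform of the chest = product of every parent transform.
chest_world = root @ hips @ chest
chest_pos = chest_world[:3, 3]
print(np.round(chest_pos, 6))  # the hip rotation swings the chest offset onto -X
```

Rotating a parent moves every descendant, which is exactly why a single rotation value per joint is enough to drive the whole chain.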
I suggest you start learning animation. You will get more out of it than using mocap or downloading animation files. Learn Blender character animation, then watch videos of your favourite fighters and try to animate them. Afterwards, compare. The more you practice, the more accurate your results will get.
We've added over 500 awesome motion files dedicated to capturing the beauty and expressiveness of American Sign Language (ASL) and over 200 motion files for Mexican Sign Language (LSM).
We've always been on a mission to make motion capture accessible to creators everywhere. And with this addition to our library, we're taking a big step towards diversity and inclusivity and empowering creators with accurate representation in digital spaces.
Ready to dive in? Head over to mocap.market today and explore our American and Mexican Sign Language motion collection.
We have some built-in entities and templates around this sort of work, but I wanted to get some feedback from the community about how different studios have approached this challenge. Here are some specific questions to start the discussion, feel free to answer any or all of them based on your experience:
Thanks in advance for sharing any thoughts or feedback here. This will not only help new teams adopting Shotgun in the mocap space, but will also give our Product and Design teams some great insight into how we can support this critical industry segment even better in the future.
Both the Slate and Take entities are pretty empty, so you would have to create the required fields for your workflows.
I would advise standardizing these for the studio and making them Project-agnostic, so you can define one workflow and use it on every project going forward.
I assume you used the correct bones and joints for the SL avatar when you created your BVH file. If you didn't, you may want to spend time reading the SL Wiki to learn about creating an animation for SL and all the things you need to do before you import your file into SL.
You import it into SL using Upload > Animation. However, SL will only upload animations with a maximum of 30 seconds of unique frames. If your animation is longer than this, you will need to split it up, upload each segment, and then use a script to play them in order. Even if it is under 30 seconds, you will need a script to use it in-world in a pose ball.
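A tiny helper (hypothetical, the name is my own) for working out where to cut a long clip before upload, given that 30-second cap:

```python
def split_segments(total_frames, fps, max_seconds=30):
    """Return (start, end) frame ranges, each at most max_seconds long."""
    max_frames = max_seconds * fps
    segments = []
    start = 0
    while start < total_frames:
        end = min(start + max_frames, total_frames)
        segments.append((start, end))
        start = end
    return segments

# 90 seconds at 30 fps splits into three 30-second segments.
print(split_segments(2700, 30))  # [(0, 900), (900, 1800), (1800, 2700)]
```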
Of course if you are going to sell them to builders, you can just sell the animation and the builder will have the scripts needed to use it. But if you are going to sell it on the retail market you will need to script it and put it in a pose ball.
Well, if Moven allows you to import a rig, then you'd just apply the mocap to that rig. If it does not, then you'd need to do this in another program made for it. Linden Lab supplies us with a cr2 file that can be loaded in Poser or DAZ Studio. In Poser, you can use the SL cr2 and then import any human BVH file. Poser will let you map the joints onto all the proper SL joints, and then you can export that BVH. You will probably want to add a T-frame to the animation, as this is the first frame and it's the positioning frame. It should then import into SL properly. DAZ Studio is a free animation program and can basically do the same as Poser.
Amethyst Jetaime - Thank you for the SL links and good information. I'm interested in creating and selling the motion on the retail market, so that scripting link will come in handy! And thanks for the heads-up regarding the 30-second animation file limit.
To All Animators - My preferred pipeline is to take the mocap BVH motion and import it onto a simple 3D Studio Max Biped. Is there a 3ds Max Biped file with the proper SL bone count and naming convention available already? And how do I convert the motion from a 3ds Max Biped to SL?
I have 3ds Max, and other than Abu's script, I haven't found anything that allows you to export a BVH for SL out of 3ds Max properly. Abu's script will work, but I've never got it to export all the joints properly. My Max is old, though, and it's been a while since I tried. Ultimately, I'd not use 3ds Max, because it just ends up being more work than it is worth.
FBX: The main export option for bringing the mocap data into MotionBuilder, Maya, Unity 3D, etc. This format allows saving mocap data as markers and/or a skeleton (translation and rotation). FBX can also carry a mesh with image-based textures or color per vertex, as well as virtual cameras.
A great reference for testing with your pipeline or simply checking out the quality of Mocap Online offerings, this Free Demo Pack download has you covered.
Regardless of what engine or application you are working with currently, all the included files allow you to experiment with other formats and pipelines.
Full UE4 Blueprint Project contains 100% of what is in the playable demo.
NOTE: Version 2.1 doesn't change any gameplay or action seen in the Demo.
You and up to eight players can explore the huge environment for hidden power-ups, some platforming and surprises. Take a stroll down W Keystone Rd and visit Porcupine Ct, all while fragging each other with five different Main Weapons and a Pistol. It is fun to do some exploration in more remote areas, with a few surprises. Check out all the new features, weapons and animations. Have fun! Explore! Frag your friends!
A comprehensive set of Blueprints for a third-person shooter with network replication for multiplayer support. A modular system and a great starting point to create your own game, with many custom animations and audio. Our MotusMan skeletal mesh is used for NPCs and includes several demo animations from our other Mocap Online Animation Packs.
erm, not too sure about the converter, but there may be a workaround (albeit a long-winded one). Forgive me if I'm wrong - I did a mocap project at uni and it was well over a year ago, so my memory is a bit groggy, but . . .