Ipisoft Mocap 89


Kody Coste

Jun 12, 2024, 10:35:39 PM
to diskmatilab

If some of the components are already installed, they have no checkbox and are marked with an ALREADY INSTALLED label. You should not install all optional components in advance without necessity; all of them can be installed separately at a later time. The component descriptions below contain corresponding download links.




A light-colored background (light walls and light floor) is recommended for markerless motion capture. iPi Desktop Motion Capture is designed to work with real-life backgrounds. A multi-camera configuration (3 cameras and up) can handle a certain amount of background clutter. Please keep in mind that the system can be confused if your background has large objects of the same color as the actor's clothes.

For best results, your environment should have multiple light sources for uniform, ambient lighting. Typical office lighting with multiple light sources on the ceiling should be quite suitable for markerless motion capture. In a home environment, you may need to use additional light sources to achieve more uniform lighting.

The actor should be dressed in a solid-color long-sleeve shirt, solid-color trousers (or jeans) and solid-color shoes. Deep, saturated colors are preferable. Casual clothes like jeans should be fine for use with a markerless mocap system. iPi Desktop Motion Capture uses clothing color to separate the actor from the background and therefore cannot work with totally arbitrary clothing.

Recommended shirt (torso) colors are black, blue or green. Red is not recommended because it can blend with human skin color, making it difficult for the system to see hands placed over the torso. Black is useful for reducing self-shadows on the torso. If you have bright, uniform lighting you can get better results with a primary-color (blue or green) shirt.

iPi Desktop Motion Capture has an option of using a T-shirt over a long-sleeve shirt for actor clothing. Tracking quality should benefit from such clothing because the arms are better distinguished from the torso.

iPi Recorder is a stand-alone application and does not require a powerful video card. You may choose to install it on a notebook PC for portability. Since it is free, you can install it on as many computers as you need.

The maximum usable framerate for the Sony PlayStation Eye camera is 60 frames per second. Sony advertises the PlayStation Eye as capable of capturing at 120 frames per second, but framerates over 60 FPS produce too much noise in the camera sensor and are not usable for motion capture.

A quad-core CPU is recommended for recording at 640 by 480 resolution at 60 frames per second. If you have a dual-core CPU you may need to configure a lower framerate and/or lower compression quality to be able to record video at 640 by 480.

All modern computers (e.g. dual-core and better) based on Intel, AMD and Nvidia chipsets have two high-speed USB (USB 2.0) controllers on board. That should give you enough bandwidth to be able to record with 4 cameras at 640x480 (raw Bayer format) at 60 FPS, or 6 cameras at 640x480 (raw Bayer format) at 40 FPS.
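As a rough sanity check, the bandwidth implied by these figures can be computed (a sketch only; it assumes raw Bayer format is 1 byte per pixel, which is why both configurations land on the same total):

```python
# Rough bandwidth estimate for multi-camera recording over USB 2.0.
# Assumption: raw Bayer format stores 1 byte per pixel.

def camera_bandwidth_mb_s(width, height, fps, bytes_per_pixel=1):
    """Data rate of one camera in megabytes per second (1 MB = 10**6 bytes)."""
    return width * height * bytes_per_pixel * fps / 1e6

# 4 cameras at 640x480, 60 FPS
total_4cam = 4 * camera_bandwidth_mb_s(640, 480, 60)  # ~73.7 MB/s
# 6 cameras at 640x480, 40 FPS
total_6cam = 6 * camera_bandwidth_mb_s(640, 480, 40)  # ~73.7 MB/s

# Split across the two on-board USB 2.0 controllers:
print(f"4 cams @ 60 FPS: {total_4cam:.1f} MB/s total, "
      f"{total_4cam / 2:.1f} MB/s per controller")
print(f"6 cams @ 40 FPS: {total_6cam:.1f} MB/s total, "
      f"{total_6cam / 2:.1f} MB/s per controller")
```

Both setups produce the same ~74 MB/s in total, i.e. about 37 MB/s per controller, which is close to the practical throughput of a USB 2.0 host controller.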

Recording with multiple cameras is a very resource-consuming operation. If the system does not provide sufficient resources, you may experience problems such as frame drops, an unstable frame rate, or premature ending of recording. All of these result in poor-quality recordings which cannot be successfully processed in iPi Mocap Studio. Below are some recommendations to help you avoid hardware performance problems during recording.

The general rule of thumb is to place most of the cameras at 1 - 1.5 m height, and one out of every 4 at a greater height of 2 - 2.5 m. However, for specific motions other setups may be more beneficial. For instance, when an actor is lying on the floor or crawling, you can improve tracking quality by placing more of the cameras (up to half) higher, up to ceiling level.
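The rule of thumb above can be sketched as a small planning helper (illustrative only; the function and the particular heights chosen are our own, using the midpoints of the recommended ranges):

```python
def plan_camera_heights(n_cameras, low=1.25, high=2.25):
    """Assign a height to each camera: one per group of 4 goes high.

    low and high are midpoints of the recommended 1-1.5 m and
    2-2.5 m ranges; adjust for your own space.
    """
    heights = []
    for i in range(n_cameras):
        # every 4th camera is placed at the greater height
        heights.append(high if i % 4 == 3 else low)
    return heights

print(plan_camera_heights(6))  # [1.25, 1.25, 1.25, 2.25, 1.25, 1.25]
```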

You can set up 5 or more cameras in a full-circle or a half-circle configuration, depending on available space. You can improve accuracy by placing one or two cameras high above the ground (e.g. 3 meters high).

You can set up 4 cameras in a half-circle or a full-circle configuration, depending on available space. You can improve accuracy by placing one of the cameras high above the ground (e.g. 3 meters high).

Sony PlayStation Eye cameras do not have a standard tripod mounting screw, so you will have to use some kind of ad hoc solution. The simplest approach is to fix the cameras to tripods with sticky tape.

Calibration is the process of computing accurate camera positions and orientations from a video of the user waving a small glowing object, called a marker (for color and color+depth cameras). This step is essential and required for multi-camera system setup.

A Mini Maglite flashlight is recommended for calibration. This is a very common flashlight in the US and many other countries. Removing the flashlight's reflector converts it into an ideal glowing marker that is easily detectable by the motion capture software.

Ian Chisholm is a machinima director and actor and the creator of critically acclaimed Clear Skies machinima series. Below are some hints from his motion capture guide based on his experience with motion capture for Clear Skies III.

If you have a lot of capture to do, you need to strike a balance between short and long recordings. Aim for 30 seconds to 2 minutes. Too long a take is a pain to work on later due to the fiddliness of setting up takes, and too short a take means you are forever setting up T-poses.

But in real life you can have non-uniform lighting, where most light comes from a window or another bright light source. In this case colors will appear darker to cameras directed toward the light source, and lighter to the rest. In such a situation, adjusting light settings may substantially improve tracking.

Sometimes the ground height detected during calibration may differ from the actual value. This depends on the particular flashlight you use for calibration, the lighting conditions and other circumstances. Incorrect ground height may cause problems with feet tracking. The Ground Height Fine-Tuning setting allows you to manually correct the ground height.

After the primary tracking and cleanup are complete, you can optionally run the Refine pass (see Refine Forward and Refine Backward buttons). It slightly improves accuracy of pose matching, and can automatically correct minor tracking errors. However, it takes a bit more time than the primary tracking, so it is not recommended for quick-and-dirty tests.

In contrast to the primary tracking, Refine does no pose prediction; it is based only on the current pose in a frame. Essentially, running Refine is equivalent to automatically applying Refit Pose to a range of previously tracked frames.

The default skeleton in iPi Mocap Studio is optimized for markerless motion capture. It may or may not be suitable as a skeleton for your character. The default iPi skeleton in T-pose has non-zero rotations for all joints. Please note that the default iPi skeleton with zero rotations does not represent a meaningful pose and looks like a random pile of bones.

By default, iPi Mocap Studio exports a T-pose (or a reasonable default pose, for a custom rig after motion transfer) in the first frame of the animation. In cases where this is not desired, switch off the Export T-pose in first frame checkbox.

Starting with version 3.5, iPi Mocap Studio can map hips motion either to Root/Ground or to Hips/Pelvis. This is useful for game engine characters, including standard Unity 3D Engine and Unreal Engine characters.

To map a source character bone to multiple target bones, use the Add a target bone item in the Manage target bones context menu. You then set weights for splitting the source rotation.
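Conceptually, weight-splitting means each target bone receives a fraction of the source bone's rotation. A minimal sketch in terms of a single rotation angle (this is our own illustration, not iPi's internals; real retargeting operates on full 3D rotations):

```python
def split_rotation(source_angle_deg, weights):
    """Distribute a source joint rotation across target bones by weight.

    Weights are normalized, so the target bones together reproduce
    the full source rotation about the same axis.
    """
    total = sum(weights)
    return [source_angle_deg * w / total for w in weights]

# e.g. spread a 90-degree spine twist over three spine bones
print(split_rotation(90.0, [0.5, 0.25, 0.25]))  # [45.0, 22.5, 22.5]
```

The key property is that the per-bone rotations always sum back to the source rotation, whatever weights you pick.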

Recent versions of Maya (starting with Maya 2011) have a powerful biped animation subsystem called "HumanIK". Animations exported from iPi Mocap Studio in MotionBuilder-friendly format should work fine with Maya 2011 and HumanIK. Video tutorials covering this workflow can be helpful.

iPi Mocap Studio has built-in motion transfer profiles for the UE4 Unreal Mannequin (the default Unreal character) and MetaHuman characters. Select the corresponding target character, export the animation to FBX, and then import it into Unreal.

Some applications do not use the latest FBX SDK and may have problems importing FBX files of newer versions. In case of such problems, you can use Autodesk's free FBX Converter to convert your animation file to an appropriate FBX version.

iPi Mocap Studio supports the COLLADA format for import and export of animations and characters. The current version of iPi Mocap Studio exports COLLADA animations as matrices. If you encounter incompatibilities with other applications' implementations of the COLLADA format, we recommend using Autodesk's free FBX Converter to convert your data between FBX and COLLADA. FBX is known to be more universally supported across 3D graphics packages.

Then you can import your DMX model into iPi Mocap Studio. The current version of iPi Mocap Studio cannot display the character's skin, but it should display the skeleton, which is enough for motion transfer.

Anyway, if anyone uses mocap software in their animation work, please help me out here. I want to make some weapon animations for myself. I don't want to buy prefab animations; they cost way too much for ONE animation, and besides, most weapon creators use them, so we have a flood of animation spam on the marketplace.

So I'm ready to be hit with the reality mallet, so don't hold back. I need to see what I'm getting involved with here, so if you need to crush my dreams, crush them. But it's fine, I recover fast.

I also experimented with low-budget mocap for a while. The results were never "perfect at the push of a button", but the output was still very usable - mostly as basic animation to continue working with.

My first try was traditional (since I'm using 3ds Max, which included the MatchMover software): putting on black clothes with white ping-pong balls attached all over me. I got 2 cheap ($100) cameras and recorded my moves with them. After tracking in MatchMover, I finally got the animations into 3ds Max for further use.
