IC Python API:Hand Gestures Puppeteering

Main article: Various Lessons.

Demo Video

Description

The ability to customize and create new tools that facilitate character animation is one of the main draws of the iClone Python API. This example demonstrates the use of an intuitive graphical user interface to drive the hand gestures of a character. Once the script is applied, a Hand Gestures Puppeteering dialog window appears with 6 quadrant gesture controls. The user can drag the dot within the interface to blend the hand gestures among the 6 quadrants, making for a smooth transition every time.
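
The exact blending math used by the plugin is not documented here, but the idea can be illustrated with a small, self-contained sketch: place the 6 gestures as anchor points around a wheel and weight them by how close the dragged dot is to each anchor. The gesture names and the inverse-distance weighting below are assumptions for illustration only, not the plugin's actual implementation; the two_point option mirrors the 2-Point Blend mode mentioned in the usage instructions.

  import math

  # Hypothetical layout: 6 gesture anchors evenly spaced on a unit circle,
  # mirroring the 6 quadrant controls in the dialog. Gesture names are illustrative.
  GESTURE_ANCHORS = {
      name: (math.cos(math.radians(angle)), math.sin(math.radians(angle)))
      for name, angle in [("Relax", 90), ("Fist", 30), ("Point", -30),
                          ("Open", -90), ("Pinch", -150), ("Spread", 150)]
  }

  def blend_weights(cursor, two_point=False):
      """Return normalized gesture weights for a cursor position inside the wheel."""
      distances = {name: math.hypot(cursor[0] - x, cursor[1] - y)
                   for name, (x, y) in GESTURE_ANCHORS.items()}
      if two_point:
          # 2-Point mode: only the two nearest anchors contribute.
          nearest = sorted(distances, key=distances.get)[:2]
          distances = {name: distances[name] for name in nearest}
      weights = {}
      for name, d in distances.items():
          if d < 1e-6:
              return {name: 1.0}   # the dot sits exactly on an anchor
          weights[name] = 1.0 / d  # closer anchors get larger weights
      total = sum(weights.values())
      return {name: w / total for name, w in weights.items()}

  # Example: a dot near the top of the wheel favors the "Relax" anchor.
  print(blend_weights((0.0, 0.8)))

In a multi-point blend every gesture contributes a little, while a 2-point blend mixes only the two nearest gestures, which makes transitions feel sharper.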

Version 1.0 to 2.0 Change Log

Improvements have been made to the initial release version:

  • Silent initialization: removed Console Log print when loaded.
  • Added separate preview and recording for each hand; now supports both hands, the left hand only, or the right hand only.
  • Changed recording hotkey to "R" instead of the space bar.
  • Rearranged the user interface by grouping the relevant controls.
  • Removed individual Preview and Record hotkeys.
  • Added Preview and Record buttons with activation via the space bar, similar to the Motion Puppet and Direct Puppet tools.
  • Fixed 2-Point blend bug.
  • Fixed left hand capture bug.
  • Blend points can now be replaced with Right Hand or Left Hand gestures and reverted to the Default Gestures.
  • Blend point icons can now be replaced with a screenshot or an existing image file.
  • Blend point data can now be saved and loaded via preset files.
  • Blend point icons now scale as the blend weighting increases instead of highlighting in green.
  • UI disabled state is now more obvious and consistent.
  • Fixed blending errors between the gesture points.
  • Removed influence on the movement of the wrists.
  • Three completely new presets: "Standard", "Male", and "Female".
  • The UI will now store the state of the prior session.
  • The UI can now return to the factory default state at the press of a button.
  • Now supports transitioning between the end of the recorded clip and any existing clip in the timeline.
  • The end-transition period (number of transition frames) can now be adjusted.

Course Prerequisites

You should familiarize yourself with the following fundamental articles before you proceed:

Link               Purpose
Using Temp Data    Learn to read and write temporary data.
Embedding QML      A soft introduction to creating a QML UI and embedding it into iClone (see the sketch after this table).
Merge All Clips    Take advantage of the new and powerful merge-clip capability in v7.83.
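
As a rough illustration of the Embedding QML prerequisite, the sketch below creates an iClone dialog with RLPy.RUi.CreateRDialog(), wraps it as a Qt widget, and hosts a QML file in a QQuickWidget. This is a minimal sketch only: the function name, window title, and qml_path argument are placeholders, and the plugin's own main.py organizes this differently.

  import RLPy
  from PySide2 import QtCore, QtWidgets, QtQuickWidgets
  from PySide2.shiboken2 import wrapInstance

  def show_qml_dialog(qml_path, title="Hand Gestures Puppeteering"):
      """Open an iClone dialog and embed a QML user interface in it (illustrative only)."""
      # Create an iClone-managed dialog and wrap it as a Qt widget.
      rl_dialog = RLPy.RUi.CreateRDialog()
      rl_dialog.SetWindowTitle(title)
      dialog = wrapInstance(int(rl_dialog.GetWindow()), QtWidgets.QDialog)
      layout = dialog.layout() or QtWidgets.QVBoxLayout(dialog)

      # Host the QML user interface inside the dialog.
      qml_widget = QtQuickWidgets.QQuickWidget()
      qml_widget.setResizeMode(QtQuickWidgets.QQuickWidget.SizeRootObjectToView)
      qml_widget.setSource(QtCore.QUrl.fromLocalFile(qml_path))
      layout.addWidget(qml_widget)

      rl_dialog.Show()
      return rl_dialog, qml_widget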

Takeaway Lessons

  • Simulate a mocap device to control character motion.
  • Create a UI using QML.
  • Read and write temporary data without dealing with file access permissions (see the sketch below).
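
The temp-data takeaway and the "store the state of the prior session" improvement both amount to persisting small bits of UI state outside the project file. The sketch below illustrates the idea with plain Python (a JSON file in the OS temp directory); the file name and fields are hypothetical, and the plugin itself relies on the temp-data mechanism covered in the Using Temp Data prerequisite rather than this exact code.

  import json
  import os
  import tempfile

  # Hypothetical state file; the fields mirror a few of the dialog's options.
  STATE_FILE = os.path.join(tempfile.gettempdir(), "hand_gestures_puppeteering_state.json")
  DEFAULT_STATE = {"blend_mode": "Multi-Point", "hand": "Both", "transition_frames": 10}

  def save_state(state):
      """Persist the dialog state so the next session can restore it."""
      with open(STATE_FILE, "w", encoding="utf-8") as f:
          json.dump(state, f, indent=2)

  def load_state():
      """Restore the prior session's state, falling back to factory defaults."""
      try:
          with open(STATE_FILE, "r", encoding="utf-8") as f:
              return {**DEFAULT_STATE, **json.load(f)}
      except (OSError, ValueError):
          return dict(DEFAULT_STATE)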

Required Files

  • Hand Gestures Python script, included in the Python samples (main.py)
  • Hand Rig Python script (handrigger.py)
  • User interface Python script (KeyData.py)
  • Bone data required by the hand rigger script (BoneData.py)
  • User interface markup language (QML) files for Qt (resource/qml folder)

You can download this plugin from the Reallusion Marketplace. To acquire and view the source code, please visit Reallusion GitHub.

Usage Instructions

  1. Inside iClone, load a standard or non-standard character.
  2. Initiate the Hand Gestures Puppeteering script (Plugins > Python Samples > Hand Gestures Puppeteering).
  3. Press the Preview button, then the space bar, to start the preview process without recording to the timeline.
  4. Move the mouse around within the control wheel of the Hand Gestures Puppeteering window and watch as the hands begin to move and blend among the 6 different poses.
  5. Switch between 2-Point and Multi-Point Blend via the Blend Mode menu to try out the different blending options.
  6. When you are ready to start recording an animation, press the Record button and then the space bar to begin.
  7. Finish the recording by pressing the space bar again, then play the timeline to review the recorded animation (a minimal space-bar wiring sketch follows these steps).
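
The space-bar activation in steps 3, 6, and 7 mirrors the Motion Puppet and Direct Puppet tools. As an assumption-laden illustration only (not the plugin's actual key handling), a space-bar toggle can be wired to a PySide2 dialog like this, with install_space_toggle and on_toggle being hypothetical names:

  from PySide2 import QtCore, QtGui, QtWidgets

  def install_space_toggle(dialog, on_toggle):
      """Bind the space bar to a start/stop callback on the given dialog (illustrative only)."""
      shortcut = QtWidgets.QShortcut(QtGui.QKeySequence(QtCore.Qt.Key_Space), dialog)
      # Limit the shortcut to the dialog and its children so it does not
      # interfere with iClone's own hotkeys.
      shortcut.setContext(QtCore.Qt.WidgetWithChildrenShortcut)
      shortcut.activated.connect(on_toggle)
      return shortcut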

APIs Used

You can consult the following references for the APIs used in this code.

main.py

handrigger.py