IC Python API:Hand Gestures Puppeteering
- Main article: Various Lessons.
Demo Video
Description
The ability to customize and create new tools to facilitate character animation is one of the main draws of the iClone Python API. This example demonstrates the use of an intuitive graphical user interface to drive the hand gestures of a character. Once the script is applied, a Hand Gestures Puppeteering dialog window appears with 6 gesture controls arranged around a control wheel. The user can drag the dot within the interface to blend the hand gestures among the 6 poses, making for a smooth transition every time (a simplified sketch of the blending math follows the feature list below).
- Completely customizable control points with support for screen-captured icons.
- Save, load, or reset the custom control points with unlimited custom presets.
- Blend mode supports left and right separate recording or simultaneous recording for both hands.
- 2-Point Blend allows for blending of the nearest two points to the control point.
- Multi-Point Blend allows for blending of all 6 points from any position within the control wheel.
- Mimics iClone's Puppet preview/record system: button press to initialize playback mode and space-bar to begin.
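
The two blend modes above boil down to a weighted average over the six stored poses. The snippet below is a minimal sketch of that idea, not the plugin's actual handrigger.py code: the evenly spaced point layout, the inverse-distance weighting, and the function names are all illustrative assumptions.

```python
import math

# Illustrative only: six control points spaced evenly around a unit control wheel.
CONTROL_POINTS = [
    (math.cos(math.radians(60 * i)), math.sin(math.radians(60 * i)))
    for i in range(6)
]

def multi_point_weights(x, y):
    """Multi-Point Blend: every pose contributes, closer points weigh more."""
    distances = [math.hypot(px - x, py - y) for px, py in CONTROL_POINTS]
    # Snap to a single pose when the puck sits exactly on a control point.
    for i, d in enumerate(distances):
        if d < 1e-6:
            return [1.0 if j == i else 0.0 for j in range(6)]
    inv = [1.0 / d for d in distances]
    total = sum(inv)
    return [w / total for w in inv]

def two_point_weights(x, y):
    """2-Point Blend: only the two nearest poses contribute."""
    distances = [math.hypot(px - x, py - y) for px, py in CONTROL_POINTS]
    a, b = sorted(range(6), key=lambda i: distances[i])[:2]
    da, db = distances[a], distances[b]
    weights = [0.0] * 6
    if da + db < 1e-6:
        weights[a] = 1.0
    else:
        weights[a] = db / (da + db)  # the nearer point gets the larger share
        weights[b] = da / (da + db)
    return weights
```

In the plugin itself, weights like these would then be applied on every update to interpolate the stored finger-bone rotations of the six gestures.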
Course Prerequisites
You should familiarize yourself with the following fundamental articles before you proceed:
Link | Purpose
---|---
Using Temp Data | Learn to read/write temporary data.
Embedding QML | A soft introduction to creating a QML UI and embedding it into iClone.
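
As a rough illustration of the Embedding QML pattern referenced above, the snippet below creates an iClone dialog, wraps it as a Qt widget, and loads a QML file into it. The window title, the QML path, and the layout handling are assumptions made for this sketch and may differ from the lesson and the sample code.

```python
import os
import RLPy
from PySide2 import QtCore, QtWidgets
from PySide2.QtQuickWidgets import QQuickWidget
from PySide2.shiboken2 import wrapInstance

def show_qml_dialog():
    # Create an iClone-managed dialog and expose it to Qt.
    rl_dialog = RLPy.RUi.CreateRDialog()
    rl_dialog.SetWindowTitle("Hand Gestures Puppeteering")  # placeholder title
    qt_dialog = wrapInstance(int(rl_dialog.GetWindow()), QtWidgets.QDialog)

    # Load the QML interface shipped with the plugin (path is an assumption).
    qml_widget = QQuickWidget()
    qml_path = os.path.join(os.path.dirname(__file__), "resource", "qml", "main.qml")
    qml_widget.setSource(QtCore.QUrl.fromLocalFile(qml_path))
    qml_widget.setResizeMode(QQuickWidget.SizeRootObjectToView)

    # Reuse the dialog's layout if it already has one, otherwise add a simple one.
    layout = qt_dialog.layout() or QtWidgets.QVBoxLayout(qt_dialog)
    layout.addWidget(qml_widget)

    rl_dialog.Show()
    return rl_dialog  # keep a reference so the dialog is not garbage-collected
```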
Takeaway Lessons
- Simulate a mocap device to control character motion.
- Create a UI using QML.
Required Files
- Hand Gestures Python script, included in the Python samples (main.py)
- Hand Rig Python script (handrigger.py)
- User interface Python script (KeyData.py)
- Bone data required by the hand rigger script (BoneData.py)
- User interface markup language file for Qt (resource/qml folder)
You can download this plugin from the Reallusion Marketplace. To acquire and view the source code, please visit Reallusion GitHub.
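
As a hedged aside on what handrigger.py and BoneData.py need from the scene, the lines below show one way to reach the selected character's finger bones through the iClone API; the name filter is an illustrative assumption about CC-style bone naming.

```python
import RLPy

# Grab the currently selected avatar (the character loaded in usage step 1 below).
selected = RLPy.RScene.GetSelectedObjects()
avatar = selected[0] if selected else None

if avatar and avatar.GetType() == RLPy.EObjectType_Avatar:
    skeleton = avatar.GetSkeletonComponent()
    # Collect the skin bones whose names look like finger bones (assumed naming).
    finger_bones = [
        bone for bone in skeleton.GetSkinBones()
        if any(part in bone.GetName() for part in ("Thumb", "Index", "Mid", "Ring", "Pinky"))
    ]
```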
Usage Instructions
1. Inside iClone, load a standard or non-standard character.
2. Initiate the Hand Gestures Puppeteering script (Plugins > Python Samples > Hand Gestures Puppeteering).
3. Press the Preview button and the space-bar to start the preview process without recording to the timeline (a simplified sketch of the underlying preview loop follows these steps).
4. Move the mouse around within the control wheel of the Hand Gestures Puppeteering window and watch as the hands move and blend between the 6 different poses.
5. Switch between 2-Point and Multi-Point Blend via the Blend Mode menu to try out the different blending options.
6. When you are ready to record an animation, press the Record button and the space-bar to begin.
7. Finish the recording by pressing the space-bar again, then play the timeline to review the recorded animation.
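
Behind the Preview and Record buttons, the sample mimics iClone's Puppet workflow by driving the timeline forward on a timer while the control wheel is sampled on each tick. The following is a simplified sketch of that loop, not the sample's exact implementation; the 16 ms step, the class name, and the looping behavior are assumptions.

```python
import RLPy

class PreviewTimerCallback(RLPy.RPyTimerCallback):
    """Advance the iClone timeline on every timer tick during preview."""

    def __init__(self, step_ms=16):
        RLPy.RPyTimerCallback.__init__(self)
        self.step_ms = step_ms

    def Timeout(self):
        current = RLPy.RGlobal.GetTime()
        end = RLPy.RGlobal.GetEndTime()
        next_value = current.GetValue() + self.step_ms
        if next_value >= end.GetValue():
            next_value = 0  # loop the preview back to the start
        RLPy.RGlobal.SetTime(RLPy.RTime(next_value))
        # A full implementation would also sample the wheel position here
        # and push the blended hand pose onto the character.

# Hypothetical wiring: create the timer once, then start/stop it on space-bar.
preview_callback = PreviewTimerCallback()
preview_timer = RLPy.RPyTimer()
preview_timer.SetInterval(16)  # roughly 60 updates per second
preview_timer.RegisterPyTimerCallback(preview_callback)
# preview_timer.Start() to begin the preview, preview_timer.Stop() to end it.
```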
APIs Used
You can consult the following references for the APIs used in this code (a short sketch of how some of the entry-point calls fit together appears after the list).
main.py
- RLPy.RPyTimerCallback.__init__()
- RLPy.RGlobal.GetTime()
- RLPy.RGlobal.GetEndTime()
- RLPy.RGlobal.SetTime()
- RLPy.RUi.CreateRDialog()
- RLPy.RUi.GetMainWindow()
- RLPy.RUi.AddMenu()
- RLPy.RUi.AddHotKey()
- RLPy.RPyTimer()
- RLPy.REventHandler.UnregisterCallbacks()
- RLPy.RUi.RemoveHotKey()
- RLPy.RDialogCallback.__init__()
- RLPy.REventCallback.__init__()
- RLPy.REventHandler.RegisterCallback()
- RLPy.RUi.OpenFileDialog()
- RLPy.RScene.GetSelectedObjects()
- RLPy.RGlobal.IsPlaying()
- RLPy.RGlobal.Pause()
- RLPy.RGlobal.Play()
- RLPy.RUi.SaveFileDialog()
- RLPy.RUi.ShowMessageBox()
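
For orientation, the snippet below shows one common way a few of these main.py entry-point calls are combined in the Reallusion samples to register the plugin's menu item. The menu labels are taken from the Usage Instructions above, and show_qml_dialog is the placeholder launcher from the earlier QML sketch.

```python
import RLPy
from PySide2 import QtWidgets
from PySide2.shiboken2 import wrapInstance

def initialize_plugin():
    # Add (or fetch) the "Python Samples" sub-menu under iClone's Plugins menu.
    plugin_menu = RLPy.RUi.AddMenu("Python Samples", RLPy.EMenu_Plugins)
    qt_menu = wrapInstance(int(plugin_menu), QtWidgets.QMenu)

    # Hook up the dialog launcher (show_qml_dialog is a placeholder name).
    action = qt_menu.addAction("Hand Gestures Puppeteering")
    action.triggered.connect(show_qml_dialog)
```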