Difference between revisions of "IC Python API:Hand Gestures Puppeteering"

From Reallusion Wiki!
 
{{Parent|IC_Python_API#Various_Lessons|Various Lessons}}
 
  
== Demo Video ==
  
 
{{#evt:
service=youtube
|id=https://www.youtube.com/watch?v=qoc00QhhL4E&list=PLNV5zSFadPdnhNttpXa1YtstjTnFjA6qF&t=0s&index=4
|alignment=right
}}
== Description ==
  
 
The ability to customize and create new tools to facilitate character animation is one of the main draws of the iClone Python API. This example demonstrates the use of an intuitive graphical user interface to drive the hand gestures of a character. Once the script is applied, a Hand Gestures Puppeteering dialog window appears with 6 quadrant gesture controls. The user can drag the dot within the interface to blend the hand gestures among the 6 quadrants, making for a smooth transition every time.
Convenient hotkeys are provided so the mouse can stay focused on controlling the blend position:

*[P] starts and stops the preview mode. Under this mode, the user can test out various blend styles without actually recording the motion into the timeline.
*[Space] starts and stops the recording. Use this to record the motions to the timeline and preserve them for playback.
*[B] switches between the two blend modes; see below for more information.

Two blending options are available:

*2-Point Blend allows for blending of the nearest two points to the control point.
*Multi-Point Blend allows for blending of all 6 quadrants from any position within the control wheel.
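As a rough illustration of the two blend modes, the sketch below computes blend weights for a control dot on a wheel with six gesture anchors. This is a hypothetical reconstruction, not the plugin's actual code; the anchor layout and function names are assumptions.

```python
import math

# Six gesture anchors on a unit circle, one per 60-degree quadrant.
# Hypothetical layout: the plugin's real geometry may differ.
ANCHORS = [(math.cos(math.radians(60 * i)), math.sin(math.radians(60 * i)))
           for i in range(6)]

def multi_point_weights(x, y):
    """Inverse-distance weights over all 6 anchors, normalized to sum to 1."""
    dists = [math.hypot(x - ax, y - ay) for ax, ay in ANCHORS]
    for i, d in enumerate(dists):
        if d < 1e-9:  # dot sits exactly on an anchor: snap to that gesture
            return [1.0 if j == i else 0.0 for j in range(6)]
    inv = [1.0 / d for d in dists]
    total = sum(inv)
    return [w / total for w in inv]

def two_point_weights(x, y):
    """Blend only the two anchors nearest the control dot."""
    dists = [math.hypot(x - ax, y - ay) for ax, ay in ANCHORS]
    a, b = sorted(range(6), key=lambda i: dists[i])[:2]
    if dists[a] < 1e-9:  # dot sits exactly on the nearest anchor
        return [1.0 if i == a else 0.0 for i in range(6)]
    weights = [0.0] * 6
    weights[a] = dists[b] / (dists[a] + dists[b])  # closer anchor dominates
    weights[b] = 1.0 - weights[a]
    return weights
```

Either weight vector can then drive a pose blend: weight 1.0 on a single anchor reproduces that gesture exactly, while the center of the wheel mixes all six evenly in Multi-Point mode.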
== Course Prerequisites ==

You should familiarize yourself with the following fundamental articles before you proceed:
{| class="wikitable"
!Link
!Purpose
|-
|[[ IC_Python_API:Using_Temp_Data | Using Temp Data ]]
|Learn to read/write temporary data.
|-
|[[ IC_Python_API:Embedding_QML | Embedding QML ]]
|A soft introduction to creating a QML UI and embedding it into iClone.
|}
 
== Takeaway Lessons ==
  
 
*Simulate a mocap device to control character motion.
*Create a UI using QML.
  
== Required Files ==
 
*Hand Gestures Python script, included in Python samples (main.py)
*Hand Rig Python script (handrigger.py)
*User interface Python script (KeyData.py)
*Bone data required by the hand rigger script (BoneData.py)
*User interface markup language file for Qt (resource/qml folder)
  
You can download this plugin from the [https://marketplace.reallusion.com/hand-gestures-puppeteering Reallusion Marketplace]. To acquire and view the source code, please visit [https://github.com/reallusion/iClone/tree/master/HandGesturesPuppeteering Reallusion GitHub].
== Usage Instructions ==
  
 
#Inside iClone, load a standard or non-standard character.
#Initiate the Hand Gestures Puppeteering script (Plugins > Python Samples > Hand Gestures Puppeteering).
#Press [P] to start the preview process, which does not record actual animation.
#Drag the dot within the Hand Gestures Puppeteering window and watch as the hands begin to move and blend between the 6 different poses.
#Press [B] to switch between 2-Point and Multi-Point Blend and try out the different blending options without letting go of the mouse button.
#Hit the space-bar at any point to start recording the animation in the timeline.
#Finish the recording by pressing the space-bar again and play the timeline to review the recorded animation.
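The hotkey-driven workflow above can be sketched as a small state holder. The class, method, and key names below are illustrative, not taken from the plugin's main.py.

```python
# Illustrative sketch of the hotkey states used in the steps above; the class
# and method names are hypothetical, not the plugin's actual code.
class PuppeteerState:
    BLEND_MODES = ("2-Point", "Multi-Point")

    def __init__(self):
        self.previewing = False  # toggled by [P]; no keyframes are written
        self.recording = False   # toggled by [Space]; writes to the timeline
        self.blend_index = 0     # cycled by [B]

    def on_key(self, key):
        """Toggle the state matching a hotkey press."""
        if key == "P":
            self.previewing = not self.previewing
        elif key == "Space":
            self.recording = not self.recording
        elif key == "B":
            self.blend_index = (self.blend_index + 1) % len(self.BLEND_MODES)

    @property
    def blend_mode(self):
        return self.BLEND_MODES[self.blend_index]
```

Because [B] only cycles an index, the blend mode can be switched mid-drag without releasing the mouse button, as step 5 describes.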
== APIs Used ==

You can research the following references for the APIs deployed in this code.

=== main.py ===

<div style="column-count:4; -moz-column-count:4; -webkit-column-count:4">
* [[ IC_Python_API:RLPy_RPyTimerCallback#__init__ | RLPy.RPyTimerCallback.__init__() ]]
* [[ IC_Python_API:RLPy_RGlobal#GetTime | RLPy.RGlobal.GetTime() ]]
* [[ IC_Python_API:RLPy_RGlobal#GetEndTime | RLPy.RGlobal.GetEndTime() ]]
* [[ IC_Python_API:RLPy_RGlobal#SetTime | RLPy.RGlobal.SetTime() ]]
* [[ IC_Python_API:RLPy_RUi#CreateRDialog | RLPy.RUi.CreateRDialog() ]]
* [[ IC_Python_API:RLPy_RUi#GetMainWindow | RLPy.RUi.GetMainWindow() ]]
* [[ IC_Python_API:RLPy_RUi#AddMenu | RLPy.RUi.AddMenu() ]]
* [[ IC_Python_API:RLPy_RUi#AddHotKey | RLPy.RUi.AddHotKey() ]]
* [[ IC_Python_API:RLPy_RPyTimer | RLPy.RPyTimer() ]]
* [[ IC_Python_API:RLPy_REventHandler#UnregisterCallbacks | RLPy.REventHandler.UnregisterCallbacks() ]]
* [[ IC_Python_API:RLPy_RUi#RemoveHotKey | RLPy.RUi.RemoveHotKey() ]]
* [[ IC_Python_API:RLPy_RDialogCallback#__init__ | RLPy.RDialogCallback.__init__() ]]
* [[ IC_Python_API:RLPy_REventCallback#__init__ | RLPy.REventCallback.__init__() ]]
* [[ IC_Python_API:RLPy_REventHandler#RegisterCallback | RLPy.REventHandler.RegisterCallback() ]]
* [[ IC_Python_API:RLPy_RUi#OpenFileDialog | RLPy.RUi.OpenFileDialog() ]]
* [[ IC_Python_API:RLPy_RScene#GetSelectedObjects | RLPy.RScene.GetSelectedObjects() ]]
* [[ IC_Python_API:RLPy_RGlobal#IsPlaying | RLPy.RGlobal.IsPlaying() ]]
* [[ IC_Python_API:RLPy_RGlobal#Pause | RLPy.RGlobal.Pause() ]]
* [[ IC_Python_API:RLPy_RGlobal#Play | RLPy.RGlobal.Play() ]]
* [[ IC_Python_API:RLPy_RUi#SaveFileDialog | RLPy.RUi.SaveFileDialog() ]]
* [[ IC_Python_API:RLPy_RUi#ShowMessageBox | RLPy.RUi.ShowMessageBox() ]]
</div>

=== handrigger.py ===

<div style="column-count:4; -moz-column-count:4; -webkit-column-count:4">
* [[ IC_Python_API:RLPy_RGlobal#GetMocapManager | RLPy.RGlobal.GetMocapManager() ]]
* [[ IC_Python_API:RLPy_RScene#GetSelectedObjects | RLPy.RScene.GetSelectedObjects() ]]
</div>
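A hand rigger that drives bones from blended gestures presumably combines stored poses per bone. The sketch below shows one plausible pure-Python version; the pose layout and bone names are assumptions, not the actual structure of BoneData.py, and a linear blend of Euler angles is only illustrative (production rigs typically blend quaternions).

```python
# Illustrative sketch only: blending per-bone rotations from several stored
# hand poses using the gesture weights from the control wheel.
def blend_poses(poses, weights):
    """poses: list of {bone_name: (rx, ry, rz)} dicts, one per gesture.
    weights: matching list of floats summing to 1.
    Returns a single blended {bone_name: (rx, ry, rz)} pose."""
    blended = {}
    for bone in poses[0]:
        blended[bone] = tuple(
            sum(w * pose[bone][axis] for w, pose in zip(weights, poses))
            for axis in range(3)
        )
    return blended

# Hypothetical example poses; bone names are made up for illustration.
open_hand = {"index_01": (0.0, 0.0, 0.0), "thumb_01": (0.0, 0.0, 10.0)}
fist      = {"index_01": (0.0, 0.0, 90.0), "thumb_01": (0.0, 0.0, 45.0)}
half = blend_poses([open_hand, fist], [0.5, 0.5])  # halfway between the two
```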

Revision as of 21:07, 19 October 2020
