IC Python API:Advanced Level Examples


Main article: iClone Python API.

Light Remote Control

This lesson demonstrates an Internet of Things (IoT) approach to driving functions within the iClone application. It not only extends the functionality available inside iClone, but also moves the medium of control to an external device. Once the script is initiated in iClone and the Unity scene is playing, the two applications sync over the network via the TCP/IP protocol: the iClone scene sends the relevant light data to the Unity application. Upon receipt, the Unity application lists all of the light props and creates an on/off toggle button and a strength slider for each light. When these controls are manipulated in the Unity application, the data is sent back to iClone to drive the same light parameters. For all of this to work, both applications must be on the same network and the displayed IP addresses must match. Before you begin this lesson, download and install a copy of Unity 3D (https://unity3d.com/), as one of the lesson files requires it.
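The sample's exact wire format is not documented here, but the round trip can be sketched with Python's standard socket module. Everything below (the port number, the JSON payload shape, the light names) is an illustrative assumption rather than the sample's actual protocol:

  # Minimal sketch of the iClone-side sender: serve the light list as JSON
  # over TCP and read control changes back. Port and payload are assumptions.
  import json
  import socket

  HOST = ""    # listen on all interfaces; clients use this machine's LAN IP
  PORT = 9000  # hypothetical port; the sample defines its own

  lights = [
      {"name": "Spotlight_01", "active": True, "strength": 80.0},
      {"name": "PointLight_02", "active": False, "strength": 45.5},
  ]

  with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
      server.bind((HOST, PORT))
      server.listen(1)
      conn, addr = server.accept()  # wait for the Unity app to connect
      with conn:
          conn.sendall(json.dumps(lights).encode("utf-8"))  # push light list
          data = conn.recv(4096)  # receive slider/toggle changes back
          if data:
              print("Control update from Unity:", json.loads(data))

The Unity side would connect as a TCP client to the displayed IP address, build a toggle and slider per entry, and write the same JSON structure back whenever a control changes.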

You Will Learn How to

  • Use TCP/IP to connect iClone and other web apps.
  • Control light props.

Required Files

GitHub Page

  • Unity project for the remote control mobile application (Unity Project/iCloneRemoteControlLight folder)
  • Light gizmo icons and user interface markup language file for Qt (resource folder)
  • Light Remote Control Python script, included in the Python samples (main.py)

Steps to Take

  1. Open or create a scene in iClone and pepper it with light props.
  2. Initiate the Light Remote Control script (Plugins > Python Samples > Light Remote Control).
  3. Take note of the IP address shown in the subsequent dialog window and open the SampleScene in Unity (Unity Project / iCloneRemoteControlLight / Assets / Scenes / SampleScene.unity).
  4. Once the project is loaded in Unity, press the play button above the "Game" panel to start the application.
  5. Make sure the IP address matches the one shown in the Light Remote Control app.
  6. If the IP addresses do not match because you are using separate devices, make sure both devices are connected to the same network.
  7. Experiment with the sliders and toggles and watch the lights in the scene change in response; the sketch after these steps shows how such changes map to iClone's Python API.
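On the iClone side, incoming control data has to be applied through the Python API. Below is a minimal sketch of that final step, assuming RLPy's light interface; FindObjects, SetActive, and SetMultiplier appear in the published RLPy documentation, but verify the names against your API version:

  # Sketch: applying an on/off toggle and a strength value to scene lights
  # through the iClone Python API (RLPy). Method names assume the documented
  # light interface.
  import RLPy

  # Collect every light prop in the current scene
  lights = RLPy.RScene.FindObjects(RLPy.EObjectType_Light)

  def apply_control(light, active, multiplier):
      """Apply the state a remote toggle/slider pair would send."""
      light.SetActive(active)          # on/off toggle from the Unity button
      light.SetMultiplier(multiplier)  # strength value from the Unity slider

  if lights:
      apply_control(lights[0], True, 1.5)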

Hand Gestures Puppeteering

The ability to customize and create new tools that facilitate character animation is one of the main draws of the iClone Python API. This example demonstrates the use of an intuitive graphical user interface to drive the hand gestures of a character. Once the script is applied, a Hand Gestures Puppeteering dialog window will appear with 6 quadrant gesture controls. The user can drag the dot within the interface to blend the hand gestures among the 6 quadrants, making for a smooth transition every time.

Convenient hotkeys are provided so the mouse can stay occupied with controlling the blend position:

  • [P] starts and stops the preview mode. Under this mode, the user can test out various blend styles without actually recording the motion to the timeline.
  • [Space] starts and stops the recording. Use this to record the motions to the timeline and preserve them for playback.
  • [B] switches between the two blend modes; see below for more information. Wiring these shortcuts is sketched after this list.
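Since the sample builds its dialog with Qt, shortcuts like these can be registered directly on the dialog widget. Here is a minimal sketch using PySide2's QShortcut; the handlers are placeholders that only print, whereas the real script toggles preview, recording, and blend state:

  # Sketch: wiring [P], [Space], and [B] to toggle handlers on a PySide2
  # widget. Handler bodies are placeholders.
  from PySide2.QtGui import QKeySequence
  from PySide2.QtWidgets import QApplication, QShortcut, QWidget

  app = QApplication([])
  panel = QWidget()
  panel.setWindowTitle("Hand Gestures Puppeteering")

  state = {"preview": False, "recording": False, "two_point": True}

  def toggle(key, label):
      state[key] = not state[key]
      print(label, "on" if state[key] else "off")

  QShortcut(QKeySequence("P"), panel, lambda: toggle("preview", "Preview"))
  QShortcut(QKeySequence("Space"), panel, lambda: toggle("recording", "Recording"))
  QShortcut(QKeySequence("B"), panel, lambda: toggle("two_point", "2-Point Blend"))

  panel.show()
  app.exec_()

Inside iClone the host application already runs the Qt event loop, so a plugin script would register the shortcuts on its own dialog rather than creating a QApplication.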

Two blending options are available:

  • 2-Point Blend blends the two pose points nearest to the control point.
  • Multi-Point Blend blends all 6 quadrants from any position within the control wheel. The weighting idea is sketched below.
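The two modes differ only in how pose weights are computed from the dot's position. The inverse-distance weighting below is an illustrative stand-in for the sample's actual formula:

  # Sketch of quadrant blending: weight each of the 6 poses by the inverse
  # of its distance to the control dot, then normalize. Illustrative math only.
  import math

  # Hypothetical anchor positions of the 6 gesture poses on a unit wheel
  POSES = [(math.cos(math.radians(a)), math.sin(math.radians(a)))
           for a in range(0, 360, 60)]

  def blend_weights(dot, points=6):
      """Return normalized weights for each pose given the dot position."""
      distances = [math.dist(dot, p) for p in POSES]
      if min(distances) < 1e-6:  # dot sits exactly on a pose
          return [1.0 if d < 1e-6 else 0.0 for d in distances]
      inverse = [1.0 / d for d in distances]
      if points == 2:  # 2-Point Blend: keep only the two nearest poses
          cutoff = sorted(inverse, reverse=True)[1]
          inverse = [w if w >= cutoff else 0.0 for w in inverse]
      total = sum(inverse)
      return [w / total for w in inverse]

  print(blend_weights((0.3, 0.1)))     # Multi-Point Blend
  print(blend_weights((0.3, 0.1), 2))  # 2-Point Blend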

You Will Learn How to

  • Simulate a mocap device to control character motion.
  • Create a UI using QML (see the sketch below).
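The dialog itself is described in QML, and the Python script only needs to host the file. A minimal sketch with PySide2, the Qt binding the iClone samples ship with (the .qml path below is a placeholder, not the sample's actual file name):

  # Sketch: hosting a QML-based control panel from Python with PySide2.
  from PySide2.QtCore import QUrl
  from PySide2.QtWidgets import QApplication
  from PySide2.QtQuickWidgets import QQuickWidget

  app = QApplication([])

  view = QQuickWidget()
  view.setResizeMode(QQuickWidget.SizeRootObjectToView)
  view.setSource(QUrl.fromLocalFile("resource/qml/gestures.qml"))  # placeholder path
  view.setWindowTitle("Hand Gestures Puppeteering")
  view.show()

  app.exec_()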

Required Files

GitHub Page

  • Hand Gestures Python script, included in the Python samples (main.py)
  • Hand Rig Python script (handrigger.py)
  • Key data Python script (KeyData.py)
  • Bone data required by the hand rigger script (BoneData.py)
  • User interface markup language file for Qt (resource/qml folder)

Steps to Take

  1. Inside iClone, load a standard or non-standard character.
  2. Initiate the Hand Gestures Puppeteering script (Plugins > Python Samples > Hand Gestures Puppeteering).
  3. Press [P] to start the preview process, which does not record actual animation.
  4. Drag the dot within the Hand Gestures Puppeteering window and watch as the hands begin to move and blend between the 6 different poses.
  5. Press [B] to switch between 2-Point and Multi-Point Blend and try out both blending options without letting go of the mouse button.
  6. Hit the space-bar at any point to start recording the animation in the timeline.
  7. Finish the recording by pressing the space-bar again and play the timeline to review the recorded animation (scripted playback is sketched below).
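Playback in the final step can also be triggered from script. A minimal sketch using RLPy's documented playback controls (verify the names against your API version):

  # Sketch: play the timeline from the start to its end to review the
  # recorded animation.
  import RLPy

  start = RLPy.RTime(0)            # beginning of the timeline
  end = RLPy.RGlobal.GetEndTime()  # end of the current timeline
  RLPy.RGlobal.Play(start, end)    # equivalent to pressing Play in iClone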