Introduction

The ICT Virtual Human Toolkit is a collection of modules, tools, and libraries designed to support researchers and developers in creating virtual human conversational characters. The Toolkit is an evolving system fueled by basic research performed at the University of Southern California (USC) Institute for Creative Technologies (ICT) and its partners.

Designed for easy mixing and matching with a research project’s proprietary or third-party software, the Toolkit provides a common platform on which new technologies can be built. It is our hope that, as a research community, we can jointly advance virtual human research and technologies.

Request the ICT Virtual Human Toolkit.

News

Feb 5 2015 - Welcome to the new year, and welcome to a new Toolkit version. We updated Unity, made SmartBody animations native Unity objects, and added previews of a new menu and character selection in the LineUp scene. See the Release Notes for more details.

Sep 17 2014 - We have released a minor update to the Toolkit which fixes Bonebus. This allows you to run SmartBody as a stand-alone process, which is required for the free version of Unity. If you have downloaded the Toolkit before, you can use the same download information.

Aug 28 2014 - We have released a new version of the Toolkit, primarily focused on updating some of the components, in particular SmartBody and Unity. This release also contains numerous bug fixes for VHBuilder, Character Customizer, and Unity. See Release Notes for details.

See Release Notes and News Archive for details.

Modules

The ICT Virtual Human Toolkit is built on a common modular architecture that lets users run all modules as-is, couple one or more modules with proprietary components, or embed individual modules in other existing systems. Our technology emphasizes natural language interaction, nonverbal behavior, and perception. Its main modules are listed below, and a sketch of the message-based coordination follows. See Documentation for an overview of the architecture, the messaging API, and other components.
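
The modules coordinate by exchanging plain-text messages over a broker. As a rough illustration only, the sketch below publishes one such message over ActiveMQ using the third-party stomp.py library; the broker address, topic name, and message format shown are assumptions for illustration, not the definitive Toolkit API (see Documentation for the actual messaging conventions).

    # Minimal sketch: publishing a Toolkit-style message over ActiveMQ.
    # Assumptions (not the official API): the stomp.py client, a local
    # broker on the default STOMP port 61613, and the topic and message
    # names shown below.
    import stomp

    conn = stomp.Connection([('localhost', 61613)])
    conn.connect(wait=True)

    # Modules coordinate by exchanging plain-text messages on a shared
    # topic; 'vrExpress' is an assumed example of an utterance message.
    conn.send(destination='/topic/DEFAULT_SCOPE',
              body='vrExpress Brad user0 msg42 Hello there!')
    conn.disconnect()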

MultiSense

MultiSense is a multimodal sensing framework, created as a platform for integrating and fusing sensor technologies and for developing probabilistic models of human behavior recognition. MultiSense tracks and analyzes users’ facial expressions, body posture, acoustic features, linguistic patterns, and higher-level behavior descriptors (e.g., attention, fidgeting). It publishes its output using the Perception Markup Language (PML).
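
To give a flavor of this kind of fusion, here is a toy sketch (not MultiSense code) that smooths noisy per-frame gaze and posture estimates into a single higher-level attention score; the cue names, weights, and smoothing factor are all invented for illustration.

    # Toy sketch of multimodal fusion (not actual MultiSense code):
    # combine noisy per-frame gaze and posture cues into a smoothed,
    # higher-level "attention" descriptor. All values are illustrative.
    class AttentionFuser:
        def __init__(self, alpha=0.7):
            self.alpha = alpha   # smoothing factor
            self.score = 0.5     # start from a neutral attention level

        def update(self, gaze_on_screen, posture_upright):
            # Weighted combination of two binary cues for this frame,
            # exponentially smoothed against the running score.
            frame = 0.7 * gaze_on_screen + 0.3 * posture_upright
            self.score = self.alpha * self.score + (1 - self.alpha) * frame
            return self.score

    fuser = AttentionFuser()
    for gaze, posture in [(1, 1), (1, 0), (0, 0), (1, 1)]:
        print(round(fuser.update(gaze, posture), 3))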

NPCEditor

At the core of the NPCEditor is a statistical text classification algorithm that selects the character’s responses based on the user’s utterances. Using the provided authoring tool, a character designer specifies a set of responses and, for each response, a set of sample utterances that should trigger it. The NPCEditor also contains a dialogue manager that specifies how to use the classifier results.
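
The sketch below is a toy stand-in for that classifier: it maps a user utterance to the response whose sample utterances it most resembles, using TF-IDF cosine similarity. NPCEditor’s actual statistical model is more sophisticated, and the utterances, labels, and responses here are invented.

    # Toy stand-in for NPCEditor's classifier (the real system uses a
    # more sophisticated statistical model): pick the response whose
    # sample utterances best match the user's utterance.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Authoring data (invented): sample utterances with response labels.
    samples = ["hello there", "hi how are you",      # -> greet
               "what is your name", "who are you"]   # -> name
    labels = ["greet", "greet", "name", "name"]
    responses = {"greet": "Hello! Nice to meet you.",
                 "name": "My name is Brad."}

    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(samples)

    def respond(utterance):
        # Score the utterance against every sample; return the response
        # linked to the best-matching sample.
        sims = cosine_similarity(vectorizer.transform([utterance]), matrix)
        return responses[labels[sims.argmax()]]

    print(respond("hey, who might you be?"))  # -> "My name is Brad."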

Nonverbal Behavior Generator (NVBG)

The NVBG is a rule-based system that analyzes character text and functional markup to propose nonverbal behaviors. The resulting behavior schedule is expressed in Behavior Markup Language (BML).
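
As a simplified illustration of that rule-based mapping (the real NVBG rules and output schema are considerably richer), the sketch below matches keywords in the character’s text and wraps the proposed behaviors in a minimal BML-like document. The element names follow the published BML standard, but the exact markup NVBG emits may differ.

    # Simplified illustration of rule-based behavior generation (real
    # NVBG rules and output are richer): keyword rules propose behaviors,
    # which are wrapped in a minimal BML-like document.
    import re

    RULES = [
        (r'\b(yes|agree|right)\b', '<head id="h1" type="NOD"/>'),
        (r'\b(no|never|wrong)\b',  '<head id="h2" type="SHAKE"/>'),
        (r'\b(you|your)\b',        '<gesture id="g1" type="POINT" target="user"/>'),
    ]

    def propose_bml(text):
        # Collect the behavior for every rule whose pattern matches.
        behaviors = [b for pattern, b in RULES if re.search(pattern, text.lower())]
        parts = [f'<speech id="s1"><text>{text}</text></speech>'] + behaviors
        return '<bml>\n  ' + '\n  '.join(parts) + '\n</bml>'

    print(propose_bml("Yes, I think you are right."))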

SmartBody

SmartBody is a character animation library that provides synchronized locomotion, steering, object manipulation, lip syncing, gazing, and nonverbal behavior in real time. It transforms Behavior Markup Language (BML) behavior descriptions into real-time animations.
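
For example, SmartBody embeds a Python scripting interface through which BML can be sent to a character. The sketch below follows the style of the SmartBody manual, but the character name is invented and the exact calls may vary by version.

    # Sketch in SmartBody's embedded Python scripting interface (runs
    # inside SmartBody, e.g. from an init script, where 'scene' is a
    # provided global; calls may vary by version, and 'ChrBrad' is an
    # example character name).
    bml = scene.getBmlProcessor()

    # Have the character gaze at the camera, then nod while speaking.
    bml.execBML('ChrBrad', '<gaze target="Camera"/>')
    bml.execBML('ChrBrad', '<head type="NOD"/>'
                           '<speech type="text/plain">Hello there!</speech>')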

vhtoolkitUnity

The Toolkit uses Unity as its main game engine, extended with a tight SmartBody integration, a messaging protocol, debugging and authoring tools, and a graphical timeline editor for creating cut-scenes.

Research Systems

In addition to the overall set of modules and tools, the Toolkit provides sample systems, each a predefined configuration of modules aimed at supporting a specific research area.

Rapport 1.0

The Rapport system is a virtual listener, included in the Virtual Human Toolkit, built on a predecessor of GAVAM (now part of MultiSense) and a custom rule selector. Grounded in psycholinguistic theory, it was designed to create a sense of rapport between a human speaker and a virtual human listener. It has been used in many studies, which provide evidence that it increases speaker fluency and engagement. See Publications for Rapport-related papers.

Cognitive Science

Virtual humans are gaining interest as a methodological tool for studying human cognition, including their use as virtual confederates. Virtual humans not only simulate the cognitive abilities of people, but also many of the embodied and social aspects of human behavior more traditionally studied in fields outside of cognitive science. By integrating multiple cognitive capabilities to support real-time interactions with people, virtual humans create a unique and challenging environment within which to develop and validate cognitive theories. The Toolkit allows users to create character-based movies for use in (online) studies, and to generate variations by changing one or more attributes.