Anthony Evans / Digital Puppets UK
Hi, my name is Anthony Evans, and I am the co-founder of Digital Puppets, an animation studio based in the United Kingdom. I have worked as a character designer for many years, specializing in real-time animation and collaborating with studios such as Disney, Warner Bros., and the BBC. In addition to creating real-time performing avatar puppetry, I focus on designing custom characters for clients in a wide range of styles.
My team and I can replicate any existing licensed character or company/product mascot. Our studio not only creates motion capture characters for online entertainers, educators, and marketing/advertising agencies, but we also produce fully animated shows for our clients using the characters we design.
Recently, we completed a 50-episode series for the UK nationwide radio station ‘RADIO X’, where we animated the hosts and transformed their weekly podcast into engaging content for their online audience. We also work with globally recognized broadcasters such as the BBC and Cartoon Network, among others, though many projects remain undisclosed due to NDAs.
Concept & Character Creation
Last year, we submitted a clip of “Monkey Band” to the Reallusion 2024 Character Contest and won “Best Mocap Use” in the “Special Awards” category. This contest allowed us to explore new workflows and integrate our existing practices with innovative technologies and techniques. It’s also the inspiration behind this article, where I’ll be sharing the full creation process.
The decision to create the Monkey Band stemmed from our desire to explore fur creation within Character Creator. Additionally, monkey characters provided a unique opportunity to utilize the humanoid base mesh, granting access to the clothing and hair systems. This allowed us to design distinctive outfits and looks for each band member, thereby enhancing their individuality.
The concept was inspired by our long-standing interest in virtual bands that can perform in real time. To achieve this, we ensured the characters were equipped with detailed facial morphs to maximize performance during live facial capture. The band was specifically designed to enable real-time virtual avatar control for the lead character, allowing us to stream pre-recorded performances seamlessly. You can see an example of this in action here.
The characters were developed using a combination of Character Creator 4 (CC4) tools and ZBrush via the GoZ workflow. ZBrush was instrumental in adding intricate details to the base model and in sculpting facial morphs. Its capabilities in creating organic shapes allowed for rapid exploration of different looks, which greatly contributed to giving each band member a unique personality.
While designing the monkey characters, we utilized preset hair cards to create their fur. Specifically, the “Fur Collar” and “Fur Cuffs” presets from the “Stylings” pack in CC4 proved invaluable. By leveraging the Edit Mesh tool, we customized and duplicated these hair cards to achieve a full-body fur effect. For a more in-depth look at the process, you can refer to our video on customizing hair.
iClone Motion LIVE for Facial Animation
When utilizing LIVE FACE for facial capture, it’s important to exaggerate expressions. Overacting ensures that subtle details are captured, as it’s easier to tone down exaggerated expressions during post-processing than to add missing nuances. The quality of blend shapes is just as critical: take time to review them in the Facial Profile Editor and confirm they hold up, particularly for stylized characters like the monkeys. Modifications to the base mesh for a stylized appearance can sometimes distort blend shapes, so proper setup is essential for high-quality facial capture.
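To make the “tone it down in post” idea concrete, here is a minimal Python sketch of the kind of pass involved. It is a conceptual illustration, not iClone’s actual API; the ARKit-style blendshape names and the gain value are assumptions for the example.

```python
# Conceptual post-processing pass: scale an over-acted capture back
# toward neutral. Blendshape weights are assumed to be floats in 0..1,
# sampled once per frame.

def tone_down(frames, gain=0.75):
    """Scale every blendshape weight toward neutral by `gain`,
    clamping to the valid 0..1 range."""
    toned = []
    for weights in frames:                      # one dict per frame
        toned.append({
            shape: min(max(w * gain, 0.0), 1.0)
            for shape, w in weights.items()
        })
    return toned

# Example: an over-acted brow raise captured at full strength
capture = [{"browInnerUp": 1.0, "jawOpen": 0.6}]
print(tone_down(capture, gain=0.7))
# -> [{'browInnerUp': 0.7, 'jawOpen': 0.42}]
```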
Facial mocap is highly efficient and allows for a quick turnaround. For this animation, I mimed the song while recording the performance using an iPhone and LIVE FACE. Afterward, I reviewed the recording and made adjustments to refine expressions and mouth shapes. Simultaneously, I recorded the audio track and used AccuLips to generate a viseme track.
By blending the viseme track with the expression track, I achieved more natural results. The viseme track ensures accurate mouth shapes for speech, while the expression track adds emotional nuance. This workflow significantly reduced the time required compared to manual facial animation.
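As a rough illustration of the blending idea (not how iClone implements it internally), the sketch below lets the viseme track dominate the mouth channels while the expression track carries everything else. The channel list, names, and blend weight are illustrative choices.

```python
# Conceptual blend of an AccuLips-style viseme track with a captured
# expression track. Channel names follow the ARKit convention used by
# LIVE FACE; the mouth/emotion split and the bias value are assumptions.

MOUTH_CHANNELS = {"jawOpen", "mouthPucker", "mouthFunnel", "mouthClose"}

def blend_tracks(viseme, expression, mouth_bias=0.8):
    """Per frame, let the viseme track dominate mouth channels
    (accurate speech shapes) while the expression track carries
    everything else (emotional nuance)."""
    blended = []
    for vis, expr in zip(viseme, expression):
        frame = dict(expr)                  # start from the expression pass
        for shape in MOUTH_CHANNELS:
            v = vis.get(shape, 0.0)
            e = expr.get(shape, 0.0)
            frame[shape] = mouth_bias * v + (1.0 - mouth_bias) * e
        blended.append(frame)
    return blended
```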
iClone for Body Animation
For this project, we utilized the Xsens Link motion capture suit, which delivered highly accurate results with minimal cleanup. Each character’s movements were recorded in single takes, one after the other. The motion data was processed in Xsens MVN software and exported as FBX files. Importing the mocap data into iClone was seamless—simply drag and drop the FBX file onto the character and select the Xsens profile.
The primary cleanup involved aligning the characters’ hands with their musical instruments. For example, refining the hand placement for the keyboard took the most time. Using an actual keyboard during the mocap session would have been beneficial, as physical props provide clear points of contact and improve accuracy in hand positioning.
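For readers curious what that cleanup looks like in practice, here is a minimal, engine-agnostic Python sketch of a contact pin: during the frames where a hand should touch a key, the captured wrist position is eased toward a fixed contact point. The contact window, target point, and easing length are all hypothetical.

```python
# Conceptual contact cleanup: pull a captured wrist position toward a
# fixed contact point during a known contact window. Positions are
# plain (x, y, z) tuples.

def pin_contact(positions, target, start, end, blend_frames=5):
    """Pin the wrist to `target` inside [start, end], easing in and
    out over `blend_frames` so the correction doesn't pop."""
    fixed = list(positions)
    for i in range(start, end + 1):
        # t ramps 0 -> 1 -> 0 across the window: 0 keeps the capture,
        # 1 is fully pinned to the contact point.
        t = min(1.0,
                (i - start + 1) / blend_frames,
                (end - i + 1) / blend_frames)
        fixed[i] = tuple(p + t * (q - p)
                         for p, q in zip(positions[i], target))
    return fixed
```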
Here are some Dos and Don’ts when animating body motion in iClone:
DO:
- Exaggerate your movements to ensure the capture conveys dynamic motion.
- Use real-world props to add weight and realism, especially for interactions like playing instruments.
- Allow sufficient space for natural movement during capture.
DON’T:
- Mime contact with imaginary objects; it’s challenging to replicate consistent positioning without real props.
- Neglect hand and finger alignment during cleanup, especially for detailed tasks like playing instruments.
Final Render in Unreal
The workflow between iClone and Unreal is highly efficient, especially with the Auto Setup plugin. This tool makes it simple to transfer animations to Unreal while retaining the original look of the characters without the need to recreate materials. For this project, all animations were created in iClone and transferred to a pre-built stage scene in Unreal using Live Link. Once transferred, the remaining steps involved setting up cameras and adjusting the lighting directly in Unreal.
The Unreal Sequencer is intuitive and functions similarly to other timeline tools. One standout feature of iClone’s Live Link is its ability to automatically configure the Sequencer when transferring animations. In this case, transferring animations for four characters was straightforward, as Live Link set up the Sequencer automatically, saving a significant amount of time and effort.
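For a sense of what Live Link automates, here is a rough sketch of the manual equivalent using Unreal’s editor Python API: creating a Level Sequence and binding each character with an animation track. The asset path, actor naming, and frame range are hypothetical, and the exact API surface varies across engine versions.

```python
import unreal

# Create a Level Sequence asset (path and name are hypothetical)
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="MonkeyBand_Performance",
    package_path="/Game/Sequences",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Bind every band-member actor and give it a skeletal animation track
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label().startswith("Monkey"):  # naming assumed
        binding = sequence.add_possessable(actor)
        track = binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
        section = track.add_section()
        section.set_range(0, 1800)  # ~60 seconds at 30 fps
```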
Unreal was selected for its ability to handle large environments seamlessly. While iClone’s renders looked great, rendering in Unreal gave us access to an extensive stage asset complete with pre-configured particle effects and dynamic lighting. This allowed us to effortlessly integrate our animations into the scene, achieving professional-quality results with minimal setup.
Additionally, Unreal’s real-time capabilities made it ideal for a live show setup. By connecting the Xsens suit, we could perform live interactions, such as the lead singer engaging with the audience in real time. This setup also allowed us to trigger pre-animated sequences, like songs, with a single button press, before seamlessly transitioning back to the live performance. Unreal’s real-time triggers enabled one person to orchestrate an entire virtual band performance.
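In pseudocode terms, the live setup boils down to a simple control loop. The Python sketch below is tool-agnostic: mocap_stream, apply_pose, and the songs dictionary are stand-ins for whatever streaming and playback hooks the rig exposes, not any specific Unreal or Xsens API.

```python
# Conceptual control loop for a one-operator live show: mocap streams
# straight to the character until the operator presses a song button,
# which hands control to a pre-recorded sequence, then falls back to
# live input.
import queue

def run_show(mocap_stream, songs, button_presses: queue.Queue, apply_pose):
    for pose in mocap_stream:                  # live Xsens data, frame by frame
        try:
            song = button_presses.get_nowait() # did the operator hit a trigger?
        except queue.Empty:
            apply_pose(pose)                   # stay live
            continue
        for frame in songs[song]:              # play the baked performance
            apply_pose(frame)
        # loop continues: control returns seamlessly to live capture
```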
Closing Thoughts
Working on the “Monkey Band” project alongside my brother Scott has been an incredibly rewarding experience. At Digital Puppets UK, we’ve always strived to push the limits of real-time animation and virtual avatars. Using tools like iClone, Character Creator, and Unreal Engine, we’ve been able to create detailed characters and fluid animations with a remarkable level of efficiency. By blending motion capture with real-time performances, we can deliver highly engaging content, transforming simple ideas into polished, professional-quality animations that truly resonate with audiences.
Looking back at projects like animating RADIO X’s podcast and working with major broadcasters like the BBC and Cartoon Network, it’s clear how far we’ve come. Each step of our workflow—from character design to live performance capture—has been meticulously refined to ensure we’re always delivering the best. The “Monkey Band” project, in particular, showcases how we can seamlessly integrate these powerful tools to bring a virtual band to life. It’s exciting to see how these innovations are shaping the future of virtual production, and we’re proud to be part of that journey.
As we reach the conclusion of this article, I can’t think of a better way to wrap things up than by offering you some more Monkey Band for your entertainment: