Since September 2022, I've been lucky to be part of X&Immersion as an apprentice in Technical Game Design. I was recruited following a 2-month internship between June and August of the same year.
As a technical game designer, my tasks during my apprenticeship are:
Support the team throughout tool production from a design and documentation standpoint.
Implement the tools directly into the game engine (Unity and Unreal Engine).
Communicate directly with clients to bridge their design needs and the technical capabilities of the tools.
Manage small teams during the production of technical demos for different game events (GDC, Gamescom, PGC).
The Company - X&Immersion
X&Immersion is a company that builds tools that use Generative AI to help game studios with the development of their games.
The goal of the company is to offer tools that integrate directly in-engine and into the studio's production pipeline in order to help developers with their creative endeavors. The tools offered by X&Immersion allow writers, narrative designers and animators to automate part of their workflows with AI, so they have more time to focus on polishing existing content and creating more lively in-game characters.
The tools support the following:
Voice Generation
Text and dialogue generation
Facial animation and lip-sync automation
Mission: Totem's Path - Survival RPG (cancelled)
The first project I worked on as a member of X&Immersion was a survival RPG whose goal was to showcase the in-house technologies of the company: voice generation, lip-sync animation and text generation.
The aspect of this project that compelled me the most was the use of procedural narrative and dialogue.
My tasks during this project were:
Technical and Design Documentation: Together with the team, I established a GDD and kept it up to date throughout the pre-production and production phases as the project evolved.
Design of game systems: A survival RPG is a very system-heavy game, so I was tasked with designing the main rules and interactions of core game systems such as:
Quest and Dialogue System: The immersive NPCs were at the core of the game experience, as they are the force that pushes the player to explore and take on quests. Using AI to create dynamic and unique NPCs was quite challenging because the technology was still in its early stages, so a lot of R&D had to take place, especially in the game design department. Our main narrative and interactive intention for this project was a system of NPCs that delivered quests and had AI-generated dialogue that evolved with the world.
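To illustrate the idea of dialogue that evolves with the world, here is a minimal sketch (not the shipped system; all names are hypothetical): an NPC accumulates world events, and that context is folded into the prompt sent to a text-generation backend.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: an NPC's dialogue context aggregates world state
// so each generated line can react to what has happened in the game.
public class NpcDialogueContext
{
    public string NpcName;
    public string Personality;                              // short descriptor fed to the generator
    public List<string> WorldEvents = new List<string>();   // events this NPC is aware of

    // Called when something happens that this NPC should know about.
    public void Remember(string worldEvent) => WorldEvents.Add(worldEvent);

    // Builds the text prompt a text-generation backend could consume.
    public string BuildPrompt(string playerLine)
    {
        string events = string.Join("; ", WorldEvents);
        return $"{NpcName} ({Personality}). Known events: {events}. " +
               $"Player says: \"{playerLine}\". Reply in character:";
    }
}
```

In a setup like this, completing a quest or changing the world would call Remember, so later conversations naturally reference earlier events.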
Combat System: The combat system was a reflection of the game's intentions: a ruthless yet simple hack-and-slash-like combat system. The first-person camera made designing satisfying melee combat quite challenging.
Crafting System: The crafting system allowed the player to merge resources in order to create new ones. This came with a whole database of items and recipes that helped balance the quantity of resources available in the game world.
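A minimal sketch of the recipe idea described above (names and structure are illustrative, not the project's actual code): a recipe consumes a set of resources and yields a new item, and the recipe database doubles as balancing data for resource quantities.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A recipe maps required ingredient quantities to a resulting item.
public record Recipe(string Result, Dictionary<string, int> Ingredients);

public class CraftingSystem
{
    private readonly List<Recipe> recipes = new List<Recipe>();

    public void AddRecipe(Recipe r) => recipes.Add(r);

    // Returns the crafted item's name and consumes ingredients from the
    // inventory, or returns null if no recipe matches what the player holds.
    public string TryCraft(Dictionary<string, int> inventory)
    {
        foreach (var recipe in recipes)
        {
            bool canCraft = recipe.Ingredients.All(
                i => inventory.TryGetValue(i.Key, out int have) && have >= i.Value);
            if (!canCraft) continue;

            foreach (var i in recipe.Ingredients)
                inventory[i.Key] -= i.Value;
            return recipe.Result;
        }
        return null;
    }
}
```

Keeping recipes as data rather than code is what makes this kind of system easy to balance: a designer can tune ingredient counts in the database without touching the implementation.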
Implementation of the systems in Unity: I worked directly in-engine to implement multiple systems via C# and third-party packages.
Tools used for this project:
Mission: Tool Development for Unreal Engine and Unity
My tasks during the development of the various tools were:
Work on the general in-engine User Experience for the tool
Design features for the tool based on game development use-cases
Integrate the tool front-end directly in-engine (Unity / Unreal Engine)
Write technical documentation for the pre-production of the tool and for the clients
Work with the team on establishing sprints and milestones for the different features
QA the different tools in-engine and report bugs via tickets (ClickUp)
Ariel - Voice Generation Tool
One of the first tools I worked on was Ariel, a text-to-speech tool designed specifically for Unity and then ported to Unreal Engine. This tool allows users to generate audio files based on synthesized voices produced in-house.
My role in the production of this tool first revolved around adapting it to the Unity editor. This involved creating a readable user interface that is easy to grasp while still letting users discover all the features the tool has to offer. I also wrote the documentation for this tool, including explanatory notes for users.
This tool is available on both the Unity Asset Store and the Unreal Marketplace.
Geppetto - Automated Lip-Sync
Geppetto's fundamental concept is built around blendshapes, predefined deformations of a 3D character's face. This approach enables Geppetto to generate emotions procedurally while animating the characters' lips to synchronize them with audio clips.
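The blendshape principle can be sketched with a simple weight envelope (this is an illustration of the general technique, independent of Geppetto's actual implementation): a viseme's blendshape weight rises and falls over the duration of its phoneme, so the mouth shape eases in and out in sync with the audio.

```csharp
using System;

// Minimal sketch of viseme weighting for lip-sync: each phoneme drives one
// blendshape, whose weight follows a triangular envelope over the phoneme's
// time window, peaking at the midpoint.
public static class LipSyncMath
{
    // Returns a 0..100 blendshape weight for a phoneme active on
    // [startTime, endTime] at the given playback time.
    public static float VisemeWeight(float time, float startTime, float endTime)
    {
        if (time <= startTime || time >= endTime) return 0f;
        float half = (endTime - startTime) / 2f;
        float mid = startTime + half;
        return 100f * (1f - Math.Abs(time - mid) / half);
    }
}
```

In Unity, a weight like this would typically be pushed to the mesh each frame via SkinnedMeshRenderer.SetBlendShapeWeight; Unreal Engine exposes the same idea through morph targets.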
I worked with the team to validate Geppetto's effectiveness through a series of rigorous tests with Unity Asset Store, Ready Player Me and Epic Games' MetaHuman characters. Based on this extensive QA phase, we managed to identify the strengths and weaknesses of the tool and improve on them.
I also worked with the team on more specific features such as the emotion tags that allow the animator to use customizable emotions via uAssets.
This tool is available on both the Unity Asset Store and the Unreal Marketplace.
Tools used for these projects: