Every video game starts with a single idea and requires a great deal of thought and planning before anything is drawn. It takes a team of designers, developers, and artists to bring even the simplest idea to life.

The teams responsible for designing and producing a game have different skill sets and responsibilities. Game design requires a wide variety of talents that must be combined precisely to create a fun and exciting experience.

 Designers infuse their artistic vision and creativity into every aspect of game creation. But artistry alone is not enough. Teams should have clear goals, allocate all resources appropriately, and be efficient and collaborative. Project managers need to give everyone the best tools for their jobs.

 In other words, the biggest challenge facing studios is finding the perfect balance between creativity and efficiency. Having a grand vision means nothing if you don’t have a production pipeline to bring it to life.

 This article provides an overview of a typical game development pipeline and how development tools are changing the design landscape. If you’re a new studio head or producer, read on.

When you talk to a game designer or studio head about the process of making a game, they’ll usually start by explaining the composition of the team and what each member does.

Most studios are divided into departments with specific responsibilities and objectives that allow them to craft the final product without stepping on each other’s toes.

 This is what we call the production pipeline.

A pipeline can be visualized as a production chain divided into distinct steps, each focused on one part of the final product.

This can be broken down into three phases:

pre-production

 In pre-production, concept art, character design, storyboards, etc. are produced by a small group of artists and art directors. Ideally, each time a new project starts, the roles of the team should be defined.

The first role to define is the operations director. Below that position usually sit the design, animation, programming, and art departments. Depending on the scope and scale of the project, dedicated audio, lighting, and QA teams may be added.

 During pre-production, details such as the game’s story, gameplay style, setting, environment, lore, intended audience, and platform are clarified.

 These elements will be invaluable when creating the initial sketches of characters, props, locations, weapons, etc.

The main reason pre-production exists is to remove as much guesswork as possible from the later stages of development. This is where you can experiment with different approaches, add or remove details, and try variations on the color palette. Changing these things later can be time-consuming and expensive, so it’s preferable to settle everything in pre-production.

Next is the storyboarding stage, which maps out interactions between characters and the environment. Here you can rough out effects such as camera angles, transitions, and lighting. The next stage, animatics, adds basic camera effects as well as reference sounds, narration, and background music. Through this process, the game and its story take shape.

Once you’ve tried all these elements, you’ll eventually reach a consensus on the tone of the story and decide on the art style. It’s at this point that the game’s personality begins to take shape, whether that’s photorealism for added immersion, pixel art, or cel-shaded animation for a more cartoonish feel.

 Once the initial models, sketches, storyboards and game style are all approved, we move on to the next stage of the production pipeline.

production

 Production is the stage where most of the game’s assets are created. This will generally be where you need the most resources.

● Modeling
At this stage, artists begin transforming the vision into assets that can be manipulated later. Modelers use specialized software to convert sketches and ideas into 3D models. The leading 3D packages today are Maya, 3ds Max, and Blender.

When designing assets, the most common approach is to start by placing vertices at the beginning of the modeling process. The vertices are then connected with lines to create edges, which in turn form faces, polygons, and surfaces. This process takes time, and its complexity depends on how detailed you want your model to look.
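
To make that vertex-to-edge-to-face flow concrete, here is a minimal sketch using Blender’s Python API (bpy) that assembles a cube mesh from raw vertex coordinates and face index lists. The object and mesh names are placeholders, not part of any real pipeline.

```python
import bpy

# Eight corner vertices of a unit cube (x, y, z).
verts = [
    (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
    (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1),
]

# Six quad faces, each listing four vertex indices.
faces = [
    (0, 1, 2, 3), (4, 7, 6, 5),
    (0, 4, 5, 1), (1, 5, 6, 2),
    (2, 6, 7, 3), (3, 7, 4, 0),
]

mesh = bpy.data.meshes.new("CubeMesh")
mesh.from_pydata(verts, [], faces)  # supply vertices and faces...
mesh.update(calc_edges=True)        # ...and derive the edges from them

obj = bpy.data.objects.new("Cube", mesh)
bpy.context.collection.objects.link(obj)
```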

We usually start with a simple 3D model and add details and wrinkles through sculpting. The level of detail required for a model differs depending on the target platform: models with a higher level of detail have more triangles and require more processing power to render.

A big breakthrough is the use of photogrammetry to generate ultra-realistic models, from individual real-world objects to terrain scans of entire regions such as cities and racetracks, all in service of greater immersion. Games such as Forza Motorsport and Call of Duty: Modern Warfare make extensive use of photogrammetry to bring real-world objects and scenarios to life, providing players with realistic environments and props. Photogrammetry allows studios to generate more realistic visuals at a fraction of the cost of manually sculpting assets. Many studios are moving to this technology because it saves them a great deal of modeling and sculpting time, making it well worth the investment required.

As for software, ZBrush allows you to sculpt and manipulate ultra-high-resolution models of over 30 million polygons for characters and props. Models can also be retopologized down to a specific polygon count, depending on technical requirements and the target platform. For these reasons, ZBrush has become an industry standard.
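
As a rough illustration of hitting a specific polygon budget, the hedged sketch below uses Blender’s Decimate modifier as a simplified, automated stand-in for true retopology (which preserves edge flow in a way automatic decimation does not). The triangle budget is a made-up number, and the script assumes a high-poly mesh is the active object.

```python
import bpy

obj = bpy.context.active_object
target_tris = 10_000  # hypothetical triangle budget for the target platform

# An n-sided polygon triangulates into n - 2 triangles.
current_tris = sum(len(p.vertices) - 2 for p in obj.data.polygons)

# Collapse edges until roughly the desired fraction of faces remains.
mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = min(1.0, target_tris / current_tris)

bpy.ops.object.modifier_apply(modifier=mod.name)
print(f"Reduced from {current_tris} to roughly {target_tris} triangles")
```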

Many studios have been leveraging the power of Autodesk 3ds Max and Maya for over a decade because of their ease of use and pipeline-integration capabilities. Blender, meanwhile, gives solo artists the flexibility to deploy custom solutions that fit their workflow.

Room 8 Studio has chosen to build several pipelines to address specific projects and requirements.

● Rigging
 From here on, it gets a little technical. Riggers are responsible for giving the 3D model its skeleton and articulating all the parts in a way that makes sense. These bones are then tied to the surrounding geometry, allowing animators to easily move parts as needed.

In addition, riggers sometimes write scripts to automate control of a character’s movements. The same controls can then be reused for other characters and projects, making the process much easier and more efficient.
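
As a minimal sketch of that kind of automation, the script below uses Blender’s Python API to add an armature and bind a character mesh to it with automatic weights; the "Character" object name is an assumption for illustration.

```python
import bpy

character = bpy.data.objects["Character"]  # hypothetical character mesh

# Add a single-bone armature at the origin to act as the skeleton.
bpy.ops.object.armature_add(location=(0, 0, 0))
armature = bpy.context.active_object

# Parent the mesh to the armature with automatic weights, tying the
# surrounding geometry to the bones so animators can pose it directly.
bpy.ops.object.select_all(action='DESELECT')
character.select_set(True)
armature.select_set(True)
bpy.context.view_layer.objects.active = armature
bpy.ops.object.parent_set(type='ARMATURE_AUTO')
```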

● Animation
 Animators use properly rigged 3D models to create fluid movement and bring characters to life. This stage requires attention to detail, as every limb and muscle must move organically and believably.

 The animation process has changed a lot, especially since the adoption of non-linear pipelines.

Once the animation is approved, it needs to be baked into a geometry format, with every frame separated into individual poses for use in simulation and lighting.
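
One common way to do this is to export an Alembic geometry cache. The hedged sketch below uses Blender’s Alembic exporter; the frame range and file path are placeholders.

```python
import bpy

scene = bpy.context.scene
scene.frame_start = 1
scene.frame_end = 250  # hypothetical length of the approved animation

# Write every frame of the selected objects as evaluated geometry, so
# downstream simulation and lighting work from fixed per-frame poses
# rather than the live rig.
bpy.ops.wm.alembic_export(
    filepath="/tmp/shot010_anim.abc",
    selected=True,
    start=scene.frame_start,
    end=scene.frame_end,
)
```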

● Appearance
 This is where we apply textures and shading to all assets. All objects and surfaces must adhere to the color palette. Skin color, clothes, items, etc. are all painted at this stage.

Textures are also applied to objects according to the style agreed upon in pre-production.

There are various tools for physically based rendering (PBR). Adobe Substance is such a versatile tool that it’s used by big names in the animation industry such as Pixar.

 For example, Substance Designer makes it easy to create textures and materials, which can be generated procedurally according to the composition of the target mesh.
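
Substance graphs are authored visually rather than in code, so as a toy stand-in, here is a short NumPy sketch of the underlying idea of procedural texture generation: smooth value noise that could drive, say, a roughness or dirt mask. The resolution, grid density, and seed are arbitrary choices.

```python
import numpy as np

def value_noise(size=256, grid=8, seed=0):
    """Generate a smooth grayscale noise texture of shape (size, size)."""
    rng = np.random.default_rng(seed)
    lattice = rng.random((grid + 1, grid + 1))  # random values on a coarse lattice

    # Bilinearly interpolate the coarse lattice up to full resolution.
    xs = np.linspace(0, grid, size)
    x0 = xs.astype(int).clip(0, grid - 1)  # lattice cell index per pixel
    t = xs - x0                            # fractional position in the cell
    rows = lattice[x0] * (1 - t)[:, None] + lattice[x0 + 1] * t[:, None]
    return rows[:, x0] * (1 - t) + rows[:, x0 + 1] * t

texture = value_noise()
print(texture.shape, float(texture.min()), float(texture.max()))  # values in [0, 1]
```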

 Substance Painter makes it very easy to generate and apply textures to 3D objects, greatly streamlining the pipeline and allowing artists to import their work directly into their game engine. Substance Painter also has a nice baking tool, with the power to bake to a low-poly mesh without losing the properties of the high-poly mesh. This is a boon if you have a limited polygon budget.

After applying a texture to an object, it’s time to decide how it will interact with light sources, combining surface properties with the texture of your choice to complete the final look. Unreal Engine 5, one of the most powerful engines in the pipeline, features a render engine called Lumen, which provides accurate global illumination without baking lightmaps. This was huge for artists, who no longer need cumbersome steps like lightmap UVs that slowed down the lighting process.

For example, when creating a location, it used to be necessary to set up lightmaps and bake them into the scene to achieve global illumination, realistic shadows, and every other aspect of a realistic setting. This process could take an entire day just to set up the UV channels correctly.
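
As a small illustration of that old workflow, here is a sketch that adds the dedicated lightmap UV channel via Blender’s Python API; the object name is a placeholder, and with Lumen this step is no longer needed.

```python
import bpy

obj = bpy.data.objects["Environment"]  # hypothetical level geometry
mesh = obj.data

# Lightmaps need their own non-overlapping UV layout, separate from the
# texture UVs -- unwrapping and baking this is what used to eat whole days.
if "Lightmap" not in mesh.uv_layers:
    mesh.uv_layers.new(name="Lightmap")

print([layer.name for layer in mesh.uv_layers])
```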

Lumen also provides a highly optimized way to switch between its three tracing methods, creating dynamic lighting that covers both outdoor and indoor areas without excessive resource consumption.

● Simulation
Some things are too complicated to animate by hand. That’s where the simulation department comes in. The randomness of water waves and ripples, the effect of wind and movement on cloth and hair, and so on are all handled by simulation.

 With current technology and simulation algorithms, it is possible to achieve extremely realistic motion of liquids, gases, fire, clothing, and even muscle mass of characters in action.
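
As a minimal sketch of what such a simulator computes, here is a discrete 2D wave equation producing water-like ripples on a height field; the grid size, wave speed, and damping values are arbitrary.

```python
import numpy as np

N = 128                 # grid resolution
c, damping = 0.5, 0.99  # wave-speed factor and per-step energy loss

height = np.zeros((N, N))
previous = np.zeros((N, N))
height[N // 2, N // 2] = 1.0  # a single "raindrop" disturbance

for step in range(200):
    # Discrete Laplacian from the four neighboring cells.
    neighbors = (
        np.roll(height, 1, axis=0) + np.roll(height, -1, axis=0) +
        np.roll(height, 1, axis=1) + np.roll(height, -1, axis=1)
    )
    # Standard wave-equation update, damped so ripples die out over time.
    new = (2 * height - previous + c * (neighbors - 4 * height)) * damping
    previous, height = height, new

print(f"peak ripple height after 200 steps: {height.max():.4f}")
```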

● Assembly
 At this stage, all the assets are assembled to create the final product. All assets become level building blocks, like Lego pieces. Depending on the game engine you use in your pipeline (Unreal, Unity, or other custom engine), you’ll need to ensure smooth asset integration. Therefore, the assembly process requires a robust pipeline that keeps everything at hand and avoids bottlenecks that affect efficiency.

A game changer in the new UE5 engine is the Nanite feature. This unique solution dramatically changed the pipeline by allowing high-poly 3D models to be imported and rendered in real time without compromising performance.

The engine does this by transforming assets into more efficient meshes that change dynamically according to their distance from the camera: as the camera gets closer, the triangles get smaller and more numerous, and vice versa. In other words, conventional LOD fine-tuning is no longer necessary.
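
For context, here is a short sketch of the conventional distance-based LOD selection that Nanite makes unnecessary: the renderer picks a pre-authored mesh variant by camera distance. The variant names, triangle counts, and thresholds are all hypothetical.

```python
import math

# (max_distance, mesh_variant) pairs, ordered near to far.
LOD_TABLE = [
    (10.0, "statue_lod0"),   # e.g. 500k triangles
    (40.0, "statue_lod1"),   # e.g. 50k triangles
    (120.0, "statue_lod2"),  # e.g. 5k triangles
]
FALLBACK = "statue_lod3"     # lowest detail, used beyond the last threshold

def select_lod(camera_pos, object_pos):
    """Return the mesh variant to render for this camera distance."""
    distance = math.dist(camera_pos, object_pos)
    for max_distance, variant in LOD_TABLE:
        if distance <= max_distance:
            return variant
    return FALLBACK

print(select_lod((0, 0, 0), (3, 4, 0)))    # distance 5   -> statue_lod0
print(select_lod((0, 0, 0), (60, 80, 0)))  # distance 100 -> statue_lod2
```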

post-production

There are many aspects to post-production, but the most important are color correction and lighting, which set the final tone for your game or trailer. Here, you can give your game its own visual style by adding specific shades and filters to your images. The greenish tones used in the movie The Matrix were applied during post-production.

But there are other reasons why post-production is an important stage: profiling, measuring framerates and frame times, checking whether the polygon budget fits into the memory allocated on the target platform, and deciding whether lights and shadows should be baked or treated as dynamic.
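
As a hedged sketch of the frame-time side of that checklist, the loop below times a placeholder render call and reports average and worst frame times; in a real pipeline the engine’s own profiler would supply these numbers.

```python
import time

def render_frame():
    time.sleep(0.016)  # stand-in for ~16 ms of real rendering work

frame_times = []
for _ in range(60):
    start = time.perf_counter()
    render_frame()
    frame_times.append(time.perf_counter() - start)

avg_ms = sum(frame_times) / len(frame_times) * 1000
worst_ms = max(frame_times) * 1000
print(f"avg frame time:  {avg_ms:.2f} ms ({1000 / avg_ms:.1f} fps)")
print(f"worst frame:     {worst_ms:.2f} ms")
```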

Animation studios have traditionally spent most of their resources on the production stage. This is in contrast to live-action studios, where the cost of completing a product weighs more heavily in the post-production stage.

For example, film directors enjoy greater freedom and flexibility during the production phase than traditional animation studios allow. A real-time feedback loop lets directors, actors, and crew bring the studio’s vision to life in a matter of takes.

Today, tools like Unreal Engine 5 give animators the same freedom. Developers can choose assets, characters, and locations, move cameras and angles, and visualize their deliverables, manipulating assets in real time directly from the engine. This closes the gap between previs and the director, who can manipulate assets and provide instant feedback.

Furthermore, since all assets can now be aggregated in the engine, it is possible to move freely between production stages. With a non-linear pipeline, studios can spread their spending more evenly across the stages of production.

 One of the main benefits of this approach is that we’re now able to deliver production-grade assets from the pre-production stage and propagate them throughout the pipeline without issue. The final step in this process is functional and compliance testing performed by the Quality Assurance department to ensure that the product works as advertised and complies with the platform-imposed requirements.

 Today’s studios need to adopt agile pipeline systems that harness the power of the latest tools and technology trends to stay competitive and bring their vision to life. The move to non-linear pipelines and real-time engines is likely to be a game changer, allowing smaller teams to compete closely with the industry giants.

But in this industry, access to more powerful design, animation, and development tools is key. No matter how advanced the technology, the tools alone won’t get you where you want to go without a talented and experienced team that understands how to leverage them and build an efficient production chain.

Maksim Makovsk is the art director for Room 8 Studio’s 3D division. He has 10 years of experience in the video game industry and has worked on some of the biggest titles in the industry, including Call of Duty, Control, Overkill’s The Walking Dead, War Thunder, and World of Tanks.