The anatomy of a Game Engine

There are various approaches individuals or organizations can take to making games. These include:

  1. Doing it from scratch: This is the most complicated approach. It involves manually assembling all components from the ground up to make a game. Traditionally this was the norm, and since games need to be highly optimized, folks developed a lot on top of C and C++, including writing their own integrations with libraries talking to the GPU, sound hardware, input handling, and so on.
  2. Employing Libraries and Abstraction Layers: This is less complex than the first approach. Here folks make use of existing abstractions, including in languages like Lua, to deal with the very technical low-level interfaces to the GPU, input, sound, and so on. A good candidate for this is LibSDL; if you do not fancy C / C++, bindings exist on top of languages like Lua in the form of LuaSDL2. Lua has traditionally been a common language of choice for game developers looking to abstract away direct access to C / C++.
  3. Working with a Game Engine: This has quickly become a popular approach these past few years. Game engines abstract not only the low-level access libraries, but also provide tools that deal with routine operations in game development. These sophisticated tools have made game development a lot easier (though that does not mean you can create a hit in a month, under most circumstances).

So what is the anatomy of a modern, capable game engine?

I spent the last few months trying to answer this question, which also saw me playing around with C / C++ code and Lua bindings in some scenarios. For instance, I actively tried to tweak the Love2D engine and put in my own custom module to see how it works, as I did on this code repository. There are some very similar patterns in all the game engines I dived into. Here goes:

  1. At the core, most game engines modularize or compartmentalize aspects such as Audio, Input, Physics, Graphics Management, Game Object and Game Object Lifetime Management, special Utility Functions, and Deployment Management Tools.
  2. Audio deals with how Music and Sound Effects are managed within the scope of the game’s lifetime.
  3. Input modules abstract various input methods, including Keyboard, Mouse, Touch, Joypads / Gamepads, and so on.
  4. The Physics subsystem implements how objects interact to emulate real-world behavior. Many of these engines borrow Box2D or Bullet Physics and then build abstraction layers over them for their game development environments. The physics subsystem is also responsible for detecting collisions between objects and resolving those collisions in various ways.
  5. Graphics Management provides tools and SDKs that allow the manipulation of graphics elements, including abstracting Materials, Shaders, and Particles. This layer is what interacts with APIs like OpenGL and Vulkan and provides GPU-level access for your Game Objects.
  6. Game Object and Game Object Lifetime Management allows game developers to create objects, control their lifetimes, and manage how these objects interact with Graphics, Audio, Physics, Input, and so on.
  7. Special Utility Functions abstract the use of Vectors (screen coordinates), Cameras (viewport of a game), Special Mathematical Functions, Data Management, and even at times, the capability to work with functionality available on different platforms – like Mobile, Web, Desktop, Console, etc.
  8. Deployment Management: Tools to make it easier to quickly deploy your games to leading Mobile, Web, Desktop, and even sometimes Console platforms.

It's easy nowadays not to appreciate the level of effort needed to create games from scratch. But this also means the game engine industry has reached a certain level of maturity, as have many other engineering disciplines. Making games is still a complex craft, however: even as it has become easier, the state of technology today demands that games be more interactive and sophisticated. So the complexity is not gone; it has only shifted to user experience management. In that regard, we are a step ahead.

Published by Ahmed Maawy

Ahmed Mohamed Maawy is a seasoned technologist with over a decade of experience growing and leading technology products across the African continent. Currently the VP of Engineering at Streamlytics, he leads engineering product development for both B2C and B2B products. Prior to Streamlytics, his work spanned leading engineering at Griffin Kenya (an InsurTech company) as Chief Technology Officer to working for innovative and disruptive startups like EveryLayer Broadband, Ushahidi, and one of Time Magazine's 50 Most Genius Companies, BRCK, the only company in East Africa that designs its own complete hardware and software stack. Most recently he was at the Al Jazeera Media Network, working on both digital and broadcast technical integrations. He was part of the team that launched Al Jazeera's streaming service AJ+ and was heavily involved in the organization's media archive artificial intelligence projects. Ahmed is a respected leader and pioneer in the Kenyan technology community, and his work has been featured in Quartz, Fast Company, and the Huffington Post. He sits on the advisory board for CIO East Africa's leading Internet of Things and Artificial Intelligence conference, the East Africa IoT and AI Summit.
