Integrating an AR Tracking Framework into Unreal Engine

Currently, there are many AR developers out there who offer AR frameworks, and more often than not these frameworks integrate with graphics and game engines such as Unity. However, the Unreal Engine, perhaps the most notable game engine of them all, was still lacking such an integration – until now!


When the Unreal Engine’s licensing changed to a more liberal model, we decided to integrate an AR framework as a little side project. We chose the Metaio SDK, which – at least at the time – seemed to fit the bill best, since it contained examples of how to integrate it into a graphics engine. There were examples for Windows (via the library’s native C++ interface) as well as Java code for Android (which was initially our main platform).

[Image: We did it: augmented reality made with the Unreal Engine]

Metaio & Unreal Engine Module

Since the Unreal Engine, of course, uses the more performant C++ code path on Android as well, we first had to turn the Metaio SDK into a functioning C++ module. Metaio was also provided as a native C++ library for Android and could be called from Java via a JNI bridge. There were, however, no native headers for Android, and those are needed to call the library’s functions from C++ code. Since the Windows and Android libraries don’t differ significantly (at least not superficially), the headers shipped with the Windows version can be used to call the Android library from native code. Thus, after initializing the library from the Java side, the Metaio SDK could be used inside an Unreal Engine module. This made it possible to receive an image from the camera stream as well as the corresponding pose with each “tick” of the engine. So we had managed to get the data into the engine – but how could we use it?
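The post doesn’t show the module code, but the per-tick flow can be sketched roughly as follows. FMetaioBridge, FCameraFrame, and FTrackingPose are hypothetical stand-ins for the actual Metaio calls (which go through the SDK’s C++ interface once the Java side has initialized it); only the FTickableGameObject part is standard Unreal Engine (UE4) API.

```cpp
#include "CoreMinimal.h"
#include "Tickable.h"

struct FCameraFrame
{
    int32 Width = 0;
    int32 Height = 0;
    TArray<uint8> Pixels; // raw camera image, e.g. BGRA
};

struct FTrackingPose
{
    float Matrix[16]; // 4x4 transform reported by the tracker
};

// Hypothetical wrapper hiding the JNI initialization and the
// "Windows headers, Android binary" trick described above.
class FMetaioBridge
{
public:
    FCameraFrame GetNewestCameraFrame();
    FTrackingPose GetTrackingPose();
};

class FMetaioTicker : public FTickableGameObject
{
public:
    explicit FMetaioTicker(FMetaioBridge* InBridge) : Bridge(InBridge) {}

    virtual void Tick(float DeltaTime) override
    {
        // Each engine tick: fetch the newest camera image and the matching pose.
        FCameraFrame Frame = Bridge->GetNewestCameraFrame();
        FTrackingPose Pose = Bridge->GetTrackingPose();
        // Frame feeds the video background; Pose drives the virtual camera.
    }

    virtual bool IsTickable() const override { return Bridge != nullptr; }

    virtual TStatId GetStatId() const override
    {
        RETURN_QUICK_DECLARE_CYCLE_STAT(FMetaioTicker, STATGROUP_Tickables);
    }

private:
    FMetaioBridge* Bridge;
};
```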

Real-Time Camera with a Media Texture?

First, we tackled the problem of rendering the images of the camera stream in real time. Our first thought was to use the media texture, which is normally used to play videos on objects; that way, the video could simply be rendered onto an object in the scene’s background. Unfortunately, we found that the media texture didn’t perform as well on mobile devices as we would have liked, so we ruled it out for further development.
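For context: besides media textures, the usual UE4 pattern for getting per-frame pixels onto an object is a transient texture that is rewritten each frame. A minimal sketch, assuming 8-bit BGRA camera frames, using the standard UTexture2D API (this is not code from the original project):

```cpp
#include "Engine/Texture2D.h"

// Create a transient texture once, then refill it with each camera frame.
UTexture2D* CreateCameraTexture(int32 Width, int32 Height)
{
    UTexture2D* Texture = UTexture2D::CreateTransient(Width, Height, PF_B8G8R8A8);
    Texture->UpdateResource();
    return Texture;
}

void UpdateCameraTexture(UTexture2D* Texture, const uint8* Pixels, int32 NumBytes)
{
    // Copy the new frame into the texture's top mip and re-upload it to the GPU.
    void* Data = Texture->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
    FMemory::Memcpy(Data, Pixels, NumBytes);
    Texture->PlatformData->Mips[0].BulkData.Unlock();
    Texture->UpdateResource();
}
```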

The “Two-Camera” Concept

We got our next idea when we looked more closely at the Unity integration. There, two ‘cameras’ were used to draw the video background and then render the scene on top of it. Since Unreal also has the ‘camera’ concept in its editor, we planned to do the same there. That didn’t yield any success either, but the principle of rendering one thing on top of another was something we could build on. We therefore tried to solve the problem with the editor’s post-processing shaders. With the help of these little programs, it is possible to influence the rendering of the scene directly on the graphics card. We thus managed to combine the video with the scene and display them simultaneously, only to be set back again when we tested what we had accomplished on Android: the video could not be displayed, since some post-processing shaders are not yet available on mobile platforms.

The Additional Shader

Since we couldn’t make it work with the editor effects, we changed the code of the render engine to perform an additional post-processing step. In this step, any areas (pixels) not covered by the scene are used to render the video stream. To accomplish this, we ran a custom shader over the frame between rendering opaque and translucent objects. Thus, it finally became possible to display the camera stream in real time in the background of the Unreal Engine.
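The shader itself isn’t shown in the post, but the compositing rule it applies is simple. The sketch below illustrates that rule on the CPU for clarity, assuming a conventional depth buffer cleared to 1.0 at the far plane; the real implementation runs on the GPU between the opaque and translucent passes.

```cpp
#include <cstdint>
#include <vector>

// Illustration of the compositing rule: wherever the scene left a pixel
// uncovered (its depth still equals the clear value), show the camera
// video instead of the scene color.
void CompositeVideoBackground(std::vector<uint32_t>& sceneColor,
                              const std::vector<float>& sceneDepth,
                              const std::vector<uint32_t>& videoFrame,
                              float farClearDepth = 1.0f)
{
    for (size_t i = 0; i < sceneColor.size(); ++i)
    {
        // A pixel never written by any opaque object keeps the clear depth.
        if (sceneDepth[i] >= farClearDepth)
        {
            sceneColor[i] = videoFrame[i]; // fill the hole with the camera image
        }
    }
}
```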

[Image: Our custom shader]

The Home Stretch: Changing Perspectives

Now all that was missing was the correct perspective from which to render the virtual objects. We thought this would be a simple task, since we could take the transformation matrix directly from Metaio and simply feed it into the Unreal Engine. The problem that soon transpired, however, was that the matrix provided by Metaio expressed the transformation in a right-handed coordinate system, whereas the Unreal Engine uses a left-handed one. So our final challenge was to transform the transformation matrix, so to speak.
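The conversion itself isn’t spelled out in the post. A common way to handle it, sketched below, is a change of basis with a reflection matrix S (which is its own inverse), so that M_lh = S · M_rh · S. Which axis has to be flipped depends on the exact conventions on both sides, so the Y-flip here is an illustrative assumption rather than the project’s actual mapping.

```cpp
// Convert a 4x4 transform from a right-handed to a left-handed coordinate
// system via a change of basis: M_lh = S * M_rh * S, with S = diag(1,-1,1,1).
// S * M * S negates every element that has exactly one index on the flipped
// axis, so the product can be applied directly, element by element.
void ToLeftHanded(const float rh[4][4], float lh[4][4])
{
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
        {
            float sign = ((r == 1) != (c == 1)) ? -1.0f : 1.0f;
            lh[r][c] = sign * rh[r][c];
        }
}
```

In practice, the sign flips can just as well be folded directly into the matrix handed to the engine’s camera, which avoids the extra pass over the elements.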

[Image: Transforming the transformation matrix: from Metaio to Unreal]

Thus, after many a struggle, we eventually managed to create a functional AR application that uses the Unreal Engine for the graphical display.