CI/CD in Unreal with Jenkins

Tall Buildings

Continuous Integration/Continuous Deployment is a cornerstone of modern software engineering methodology. In short, the goal is to always have recent builds of the software available to test against.

These builds are automated, and after a successful build the build system runs a series of automated tests against them. This rapid feedback enables developers to identify and remedy issues in a timely fashion, and gives project management a more realistic picture of the project's state.

For me, I want it to automate the packaging process. While packaging is fast on my development machine, it's not a clean build: the development machine has uncommitted changes most of the time, especially to submodules that haven't yet been updated.

Building Unreal

I would have liked to build the Unreal Engine itself and save the artefacts, but the engine build filled the virtual machine's disk. Ultimately, I installed the early access version of the engine directly on the VM and used that to provide UnrealAutomationTool.

On that point, UnrealAutomationTool – or UAT – is the tool I'll be using to cook and package the project. It's part of the default Unreal Engine install and will do all the heavy lifting for me. The only caveat is that I do need to ensure that the build requirements are satisfied.
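
To give a feel for the shape of that before the Jenkins plumbing arrives, a bare BuildCookRun invocation looks something like this (the install path matches the one used later; the project path and flag selection here are illustrative, and the full command I use appears in the pipeline below):

"C:\Program Files\Epic Games\UE_5.0EA\Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun -project="C:\Projects\Project\Project.uproject" -platform=Win64 -clientconfig=Development -build -cook -stage -pak -archive -archivedirectory="C:\Builds"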

To build an Unreal project, or the engine itself, you need a few things. For a Windows build, that means:

  1. MSBuild and MSVC v142+
  2. Windows 10/11 SDK
  3. .NET Framework 4.6+ SDK
  4. .NET 5.0 Runtime
  5. .NET Core 3.1 Runtime
  6. .NET SDK
  7. .NET Framework 4.6+ targeting pack

Quick note: I'm using the MSVC v142 tools until the engine is updated for v143. v143 does work, but the automated engine build will fail because it looks specifically for v142.

For most user deployments, these requirements are satisfied by the Visual Studio installer. There's no reason to install a full IDE for a build server, so I went out and grabbed the VS2019 build tools from Microsoft's site. This installer looks a lot like the Visual Studio installer, but it doesn't pull in the heavier parts of the IDE, so the install is only around 6-7GiB. Go ahead and select the desktop build tools for C++ as they include MSBuild, MSVC, and their dependencies.
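
If you'd rather script the agent setup, the Build Tools bootstrapper runs unattended as well; something like the following should pull in the same C++ workload (MSBuild, MSVC, and their dependencies), with the .NET pieces still to be added separately:

vs_buildtools.exe --quiet --wait --norestart --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended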

Installing all the different parts of the .NET ecosystem is a bit more annoying. I remember .NET 3.5 being needed for part of Unreal, so I may have grabbed that too and forgotten. If anything is missing, the build will flag it.

Jenkins

This section covers the pipeline script I'm using. I'm assuming that the Windows agent is configured in Jenkins and is in a state where it can execute jobs.

For my own convenience, my build nodes are tagged with the Unreal name for the platform they build for. For example, Linux is "Linux" and Windows is "Win64". I don't bother with a 32-bit build.
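
In a scripted pipeline, that label is then all it takes to pin a job to the right platform; a minimal sketch:

// Run the packaging job on an agent labelled with Unreal's
// name for the target platform.
node('Win64') {
    // checkout, build, and archive stages go here
}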

Quick aside: I ended up moving all of my repositories to the same host and then creating a second pipeline to mirror them back to GitHub. The pipeline checkout didn't properly handle submodules that needed different credentials, and my git instance isn't configured for SSH yet.

The following code uses git to check out the project to a folder called "Project", pull the LFS objects, and initialise the submodules. Since I've moved everything to the same set of credentials, I only need to specify the credentials once for the main repository, and the parentCredentials setting will do the rest. Ideally, I'd use the same SSH key for GitHub and my Gitea instance, but I don't have SSH keys generated for Gitea yet.

checkout([
    $class: 'GitSCM',
    branches: [[name: 'origin/dev']],
    doGenerateSubmoduleConfigurations: false,
    extensions: [
        // Pull LFS objects after the checkout completes
        [$class: 'GitLFSPull'],
        // Shallow, tagless clone to keep the workspace small
        [$class: 'CloneOption', depth: 1, honorRefspec: true, noTags: true, reference: '', shallow: true],
        // Unreal expects the uproject to live in a folder of the same name
        [$class: 'RelativeTargetDirectory', relativeTargetDir: '<Project>'],
        // Shallow submodule checkout, reusing the parent repository's credentials
        [$class: 'SubmoduleOption', depth: 1, disableSubmodules: false, parentCredentials: true, recursiveSubmodules: true, reference: '', shallow: true, trackingSubmodules: true]
    ],
    userRemoteConfigs: [[credentialsId: 'JenkinsKey', url: 'https://[REDACTED]/git.git']]
])

Once the code is cloned, the next block executes the build proper. Do note that a lot of the paths begin with the <Project> directory we cloned into; that's because Unreal expects the uproject file to be contained in a directory of the same name. This may have changed, but previous versions of UAT seemed to throw a fit over it, so I ran with it.

Also note that the Python calls are something I do to version the executable. It's a simple script that edits ProjectVersion in Config/DefaultGame.ini to increment the build number. As this is just something I do to keep track of project builds, I've not included the code or the Python dependency in the requirements above.

Yet another note: versioning the executable doesn't seem to work with UE5; Windows will report it as 1.0.0.0 (or 5.0.0.0, I forget).

That said, I'd recommend something like this. While Jenkins' archiveArtifacts can fingerprint the files in the build, it's useful for a plethora of reasons to be able to read this information from the executable itself, including when creating patches.

stage("Version, Build, and Cook")
{
    if (isUnix())
    {
        // Version
        sh "python3 Project/Config/UpdateVersionInfo.py ${BUILD_NUMBER}"
        
        // Create temporary and artefact folder
        sh "mkdir \"${WORKSPACE}/Archive\""
        sh "mkdir \"${WORKSPACE_TMP}/Stage\""
        
        // Build and Cook the game
        sh "'UnrealEngine/Engine/Build/BatchFiles/RunUAT.sh' BuildCookRun -project='../Project/Project.uproject' -platform='$platformConfig' -SkipCookingEditorContent -pak -clientconfig='$buildType' -build -cook -stage -stagingdirectory='$WORKSPACE_TMP/Stage' -archive -archivedirectory='$WORKSPACE/Archive'" 
    }
    else
    {
        // Windows has Python3 as python
        bat "python Project/Config/UpdateVersionInfo.py ${BUILD_NUMBER}"
        bat "if not exist \"${WORKSPACE}/Archive\" mkdir \"${WORKSPACE}/Archive\""
        bat "if not exist \"${WORKSPACE_TMP}/Stage\" mkdir \"${WORKSPACE_TMP}/Stage\""
        bat "\"C:/Program Files/Epic Games/UE_5.0EA/Engine/Build/BatchFiles/RunUAT.bat\" BuildCookRun -project=\"${WORKSPACE}/Project/Project.uproject\" -platform=\"${platformConfig}\" -Distribution -SkipCookingEditorContent -pak -CreateReleaseVersion=\"0.0.0.${BUILD_NUMBER}\" -clientconfig=\"${buildType}\" -build -cook -stage -stagingdirectory=\"${WORKSPACE_TMP}/Stage\" -archive -archivedirectory=\"${WORKSPACE}/Archive\" -prereqs -package 
    }
}
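
With the Archive folder populated, a short follow-up stage can pull the packaged build into Jenkins; a minimal sketch, assuming the folder layout above:

stage("Archive") {
    // Store the packaged build and fingerprint each file so
    // individual builds can be traced later.
    archiveArtifacts artifacts: 'Archive/**', fingerprint: true
}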

Closing Notes

Unreal Engine, when building releases locally, creates a folder <Project>/Releases. This folder contains past released versions that patches can be built against. Currently, these objects aren't backed up, nor are archives created from them. For releases that are actually going out to consumers, you want to keep these somewhere safe.
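
If you do want them kept, archiving the folder from the pipeline is one low-effort option; a sketch, assuming the checkout layout above:

// Keep the release metadata UAT needs to build patches against.
archiveArtifacts artifacts: 'Project/Releases/**', fingerprint: true, allowEmptyArchive: true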

You could probably also tag the source version and rebuild it before a patch, but engine changes might break this unless you pair the tag with an engine commit.
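
A sketch of that tagging step, assuming the checkout above and push rights on the same credentials:

stage("Tag Release") {
    dir('Project') {
        // Record the exact source used for this build; the engine
        // commit would need capturing alongside it.
        bat "git tag release-0.0.0.${BUILD_NUMBER}"
        bat "git push origin release-0.0.0.${BUILD_NUMBER}"
    }
}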

Closing Thoughts on Virtualisation

I use a virtualised Windows environment for building Unreal Engine and the project. There are a few things to be aware of in this configuration:

vCPUs and Memory

Unreal launches build tasks on as many cores as it can, provided each task has 1.5 GiB of memory available. So if, like me, you have 4 GiB free when the build starts, you get two tasks processed concurrently. This check seems to occur after the OS's memory use is taken into account, so be aware of that. It meant my VM, which only has 8 GiB allocated, gets to run on two cores, which is less than ideal as I have more compute than memory at Unreal's target ratio.

To be fair, this is an enormous improvement over the old system that used to blindly spawn a task for every core. At the time, my desktop had a gigabyte per core, so it immediately filled all available memory and wound up increasing shader build times.

Networked Filesystems

I use NFS as a backing volume for the bulk of my VM storage. It's effective in Linux VMs, but it was a nightmare on Windows. Despite configuring everything as best I could (and working around a git fsync issue), I eventually gave up. The state I got it to was pretty stable, and plenty for most things, but not for the Unreal build task.

Windows Enterprise, at the time of setting this up in 2022, didn't support the newer NFSv4; maybe that would have fixed the straw that eventually broke the camel's back. My builds were taking upward of 45 minutes which, given the speed my desktop could build and package locally, seemed unreasonably high. A quick check of the Jenkins log revealed that the build was failing to set permissions on each file (or something to that effect, it's been a while) and then waiting for a timeout on every one.

Switching to a raw disk image on a flash disk netted a reduction from 45 minutes to 3 or 4 minutes of build time. This is a tiny project, so that's far more in line with expectations.

The Linux build system didn't have this issue when testing and would happily compile on an NFS volume, but I disabled it because building the Unreal Engine was a pain, which leads us nicely to...

Building Unreal

If you’re customising the engine, the choice is a bit more complicated, but if you are using the downloadable version, save yourself the bandwidth and install the Engine as if it were just another dependency like MSVC or clang++.

I originally configured Jenkins to have an Unreal Engine pipeline that would pull ue5-main and build it. The artefacts from this build would then be used to build the game project. There's nothing wrong with this approach, but you end up storing the entire engine in Jenkins and copying it to the build node when the game project gets built. Both the build and the copying take time, bandwidth, and storage. For smaller projects, it's just not worth the extra hassle. For me, I axed this functionality when it started eating my precious disk space on the NVMe drive.

Where it becomes useful is when you have multiple people on the project and some are working on the engine while others work on the game, but at that point, the server doing the build should probably be well enough equipped that 160G of space isn't a concern. My issue is that my hard drive array is set up for capacity and resilience, not performance, and the SSD was meant to only hold the VM images, not their data.