
Placement Year - Development Blog

Tanglewood Games Placement




At the start of my placement year, I was offered a position at Tanglewood Games as a Student Technical Designer. The studio was founded by Chris Wood and Terence Burns, two former programmers at Epic Games, and focuses on providing support and expertise for projects using Unreal Engine. They are based in northeast England, although I decided to work remotely for the duration of my placement.


I started in June 2022, just as the team was getting ready to start contract work on Hogwarts Legacy. Since the project was relatively late in development, our focus was mainly on optimization, particularly for load times and general performance on consoles.


Unfortunately, there is little to nothing I can show directly of my work in the way of images or videos, as most of it would be considered source code and is under NDA. I will still do my best to describe the software used and the process of working on the project.




The first task I was assigned on the project was to work alongside Terry to collect and analyse streaming data from the game on various platforms. This started with me developing a tool in C++ that would output the time taken for each level load to a log file.


While I had used C++ with Unreal before, there was still a lot of learning involved in this first task. I had to learn about subsystems, and how to set up console commands to be used with them.
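
Since I can't share the actual source, here's a rough sketch of what this kind of load-time logging subsystem and console command might look like. The class name, command name and log format here are made up purely for illustration, not the real tool:

```cpp
// LoadTimeLogSubsystem.h -- illustrative names only, not the project's actual code.
#pragma once

#include "CoreMinimal.h"
#include "Subsystems/GameInstanceSubsystem.h"
#include "LoadTimeLogSubsystem.generated.h"

UCLASS()
class ULoadTimeLogSubsystem : public UGameInstanceSubsystem
{
	GENERATED_BODY()

public:
	virtual void Initialize(FSubsystemCollectionBase& Collection) override
	{
		Super::Initialize(Collection);

		// Record the time just before a map starts loading...
		FCoreUObjectDelegates::PreLoadMap.AddUObject(this, &ULoadTimeLogSubsystem::OnPreLoadMap);
		// ...and log the elapsed time once the new world has finished loading.
		FCoreUObjectDelegates::PostLoadMapWithWorld.AddUObject(this, &ULoadTimeLogSubsystem::OnPostLoadMap);
	}

	// Toggled by the console command below.
	bool bLoggingEnabled = true;

private:
	void OnPreLoadMap(const FString& MapName)
	{
		LoadStartSeconds = FPlatformTime::Seconds();
		PendingMapName = MapName;
	}

	void OnPostLoadMap(UWorld* LoadedWorld)
	{
		if (!bLoggingEnabled) { return; }
		const double Elapsed = FPlatformTime::Seconds() - LoadStartSeconds;
		// These lines end up in the output log, where a script can later parse them into a CSV.
		UE_LOG(LogTemp, Log, TEXT("LoadTime: %s took %.3f seconds"), *PendingMapName, Elapsed);
	}

	double LoadStartSeconds = 0.0;
	FString PendingMapName;
};

// LoadTimeLogSubsystem.cpp -- a console command to switch the logging on and off at runtime.
static FAutoConsoleCommandWithWorld GToggleLoadTimeLog(
	TEXT("LoadTimeLog.Toggle"),
	TEXT("Toggles level load time logging."),
	FConsoleCommandWithWorldDelegate::CreateLambda([](UWorld* World)
	{
		UGameInstance* GameInstance = World ? World->GetGameInstance() : nullptr;
		if (ULoadTimeLogSubsystem* Subsystem = GameInstance ? GameInstance->GetSubsystem<ULoadTimeLogSubsystem>() : nullptr)
		{
			Subsystem->bLoggingEnabled = !Subsystem->bLoggingEnabled;
		}
	}));
```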

The second part of this task was to write a Python script to parse the outputted log file and convert the data into a more usable CSV file. Luckily, I had past experience with Python, so it wasn't too difficult to brush up on and put the script together.


I spent a few days working on this tool, and was eventually able to show the work to Terry for review. I got some feedback on improving my C++ in a few areas, but it was overall positive. We were then able to set up the tool to be run during automated tests, using other console commands to have a player character run through set paths in the game where streaming was used.





Alongside all of this, I had been learning to keep track of my work with management software like Jira. It took some getting used to, but I tried to regularly update my tickets with comments on the progress of each task. We would usually report on progress during daily standups as well.

For other tasks, I continued working with C++ and Python. In many cases, I was working with and learning about existing tools and APIs that had been built for the project, figuring out how to extend them and integrate my own scripts with them and with the editor.





A piece of software called Confluence was a great help here; it functioned as a sort of accumulated wiki covering the various tools and systems made for the project over the years. I used it to look for existing documentation for the API I was writing Python scripts for. I was also able to get help from one of my team members who had worked on the tool himself.


Some of the work I was able to do included writing scripts to automate tasks such as flagging blueprints that used tick nodes. These were then used by myself and other team members to identify and investigate blueprints that needed optimization, which was another set of tasks I then worked on.


Most of these optimization tasks involved nativizing Blueprint code into C++. One example was for two blueprints that each featured an eye that would follow the player's movements. The existing blueprint code was doing a lot of vector math and function calls on the tick event, was duplicated across both blueprints, and had a few bugs. In this case, I spent a day investigating why the math was incorrect and how to fix it while still working in Blueprint. Once I had found the issue, I started work on nativizing most of the code into a C++ function library that could be accessed by both blueprints.
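
I can't show the real code, but the general shape of that kind of function library is roughly the following. The class name, function and exact math here are just an illustration of the idea (a clamped "look at" rotation shared by both blueprints), not the project's implementation:

```cpp
// EyeLookFunctionLibrary.h -- illustrative only, not the project's actual code.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "EyeLookFunctionLibrary.generated.h"

UCLASS()
class UEyeLookFunctionLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// Returns the world rotation an eye needs to look at a target,
	// clamped to a maximum angle away from its rest direction.
	UFUNCTION(BlueprintPure, Category = "Eye Look")
	static FRotator GetClampedLookAtRotation(
		const FVector& EyeLocation,
		const FRotator& RestRotation,
		const FVector& TargetLocation,
		float MaxAngleDegrees)
	{
		const FVector RestDir = RestRotation.Vector();
		FVector ToTarget = (TargetLocation - EyeLocation).GetSafeNormal();

		// Clamp the look direction to a cone around the rest direction,
		// so the eye can't rotate further than MaxAngleDegrees.
		const float CosMax = FMath::Cos(FMath::DegreesToRadians(MaxAngleDegrees));
		if (FVector::DotProduct(RestDir, ToTarget) < CosMax)
		{
			const FVector Axis = FVector::CrossProduct(RestDir, ToTarget).GetSafeNormal();
			ToTarget = RestDir.RotateAngleAxis(MaxAngleDegrees, Axis);
		}
		return ToTarget.Rotation();
	}
};
```

Because the function is a BlueprintPure static, both eye blueprints can call it directly from their graphs without duplicating the math.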


Initially, I ported the code almost exactly into C++. However, when having my work reviewed by another team member, I was given some feedback on ways I could further optimize it, such as making better use of pointers and reducing function calls and variable assignments. This also got me to start looking into the underlying math behind a lot of the functions I was calling, and to use simpler inline versions instead. I received positive feedback on the extra optimizations I ended up making after this.





The submission process for work in general was something that took some getting used to. My experience with source control up until then had been using GitHub for small university projects. Here we were using Perforce's P4V client, and there were significantly more steps to go through to have changes approved before committing them to source. As I've mentioned before, any code first had to be reviewed by someone within the team at Tanglewood, sometimes once or twice if changes were necessary. This usually involved moving the changes to a separate changelist, so whoever was reviewing them could look them over themselves.


After this, it was also necessary to have the changes reviewed by one of the blueprint’s original owners, often from another team on the project. Finally, the changes were compiled and run through a series of automated tests to ensure the build would be stable, before being uploaded.


I often found managing these things to be more work than the tasks themselves, especially with the amount of regular communication required. It was definitely a challenge for me at times, but it was an interesting experience to see how projects of this scale are managed. I still feel like managing my work is an area I need to improve on in the future, so it will be useful to reflect on.




For some tasks, I started having to measure performance more directly. There were a few different ways this could be done, including using tools like RAD Telemetry to look at the time spent in individual functions. Most of the learning for this was again done by looking through the project's documentation on Confluence.


One last example of the work I did was the development of a separate Python application to display large amounts of recorded streaming data in more readable graph formats. This was done with the Plotly Dash library, which I had to spend a fair amount of time researching and reading through the API for. I spent around two weeks on this task, and the results seemed to be helpful in identifying problem areas in the game's streaming times.





Freelance: Project Re-Mix



The first freelance project I worked on was for a client called PrimeCut Productions, a Belfast-based theatre company. The brief for the project was to create a game for iPad devices that would be used in a series of workshops in schools, with the aim being to promote “self-expression, mutual understanding and celebration of diversity”.


Kids would work in groups and create their own unique characters, which would then become involved in a branching narrative through a series of event prompts. The client had mentioned that they had little experience with games or game development, so most of the specifics of how the game would work were left up to us to design.


On this project I was working alongside two other students from Ulster University: another Games Design student and an Animation student. After our initial brief from PrimeCut, we had a few meetings through a Discord server where we started to plan out our ideas for the game. We mainly used Miro for brainstorming and organizing the ideas and notes from these meetings.

For the first meeting at the PrimeCut office, I helped set up and organize a presentation going over our narrative ideas and plans for the structure of the game. This included making some rough mockup images to convey some of the ideas.




Character Creator


As we started production of the game, I decided to take on the task of designing and scripting the game’s character creation system. This was a fairly long process and I spent approximately two months developing it.


I started work on this by putting together a very basic prototype with blueprints, to see how feasible it would be to get this kind of system working in Unreal. I found Epic's documentation on Working with Modular Characters a very useful resource. For the prototype, I went with the approach of having each piece of the character be its own skeletal mesh component (head, arms, legs, torso etc.), and using the "Set Master Pose Component" node to apply the same pose to all of them.
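
The prototype itself was built in Blueprints, but the same idea expressed in C++ would look roughly like this (class and component names are just for illustration):

```cpp
// ModularCharacter.h -- a minimal sketch of the modular character idea in C++;
// the actual prototype was built in Blueprints.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "ModularCharacter.generated.h"

UCLASS()
class AModularCharacter : public ACharacter
{
	GENERATED_BODY()

public:
	AModularCharacter()
	{
		// Each body part is its own skeletal mesh component, attached to the
		// Character's main mesh and driven by the same animation pose.
		HeadMesh  = CreatePart(TEXT("HeadMesh"));
		TorsoMesh = CreatePart(TEXT("TorsoMesh"));
		LegsMesh  = CreatePart(TEXT("LegsMesh"));
	}

private:
	USkeletalMeshComponent* CreatePart(const FName Name)
	{
		USkeletalMeshComponent* Part = CreateDefaultSubobject<USkeletalMeshComponent>(Name);
		Part->SetupAttachment(GetMesh());
		// Equivalent of the "Set Master Pose Component" Blueprint node:
		// the part copies its pose from the main mesh every frame.
		Part->SetMasterPoseComponent(GetMesh());
		return Part;
	}

	UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* HeadMesh;
	UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* TorsoMesh;
	UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* LegsMesh;
};
```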


Something else I experimented with was generating all of the body part icons dynamically, capturing them with render targets. The idea was to save time making the icons manually, as we were aiming to have a decently large number of options available to players and would potentially be making regular changes to items throughout development. It ended up being relatively easy to implement and definitely made it easier for us to add new items later on. Overall, this prototype took me around two weeks to put together.
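
As a rough illustration of the capture approach (the actual setup was done in Blueprints, and the function name here is made up), the idea was essentially:

```cpp
// A rough sketch of capturing a body part mesh into a render target for its icon.
// Function name and setup are assumptions, not the project's actual code.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"

UTextureRenderTarget2D* CaptureItemIcon(UObject* WorldContextObject, USceneCaptureComponent2D* Capture)
{
	// Create a small render target to hold the icon.
	UTextureRenderTarget2D* IconTarget =
		UKismetRenderingLibrary::CreateRenderTarget2D(WorldContextObject, 256, 256);

	// The capture component is pointed at the item's mesh beforehand
	// (position, FOV, show-only list), then captures a single frame into the target.
	Capture->TextureTarget = IconTarget;
	Capture->bCaptureEveryFrame = false;
	Capture->CaptureScene();

	// The render target can then be used directly as the brush for a UMG image.
	return IconTarget;
}
```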




With the initial prototype working, the next task was to design a more robust and flexible version of the system, with one of the main goals being to support extra features or “accessories” on each body part. I also wanted as much of the system to be data-driven as possible, to make things easy for anyone to add more items or quickly tweak existing ones.


At first I had attempted to use Data Tables for this system, but things quickly became a lot more complex than I had anticipated. I had to go back and properly plan out how the system would work and how characters would be structured in a more object-oriented way.



I eventually settled on using Primary Data Assets to store item data, as they could make use of inheritance and contain their own functions (useful for things like generating their own icons). Meanwhile, the characters themselves would keep track of their appearance with a sort of “tree” of UObject classes that would store the data of currently instanced body parts.


Each Body Part item could now specify a list of "slots" that certain types of "Subparts" could be attached to. Body Parts with matching slots would retain previously selected Subparts when switching between them. Implementing this system took about three weeks and was fairly complicated, but I was happy with the results in the end.
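
To give an idea of the structure, the data asset side looked something along these lines (these are simplified, illustrative classes rather than the actual project code):

```cpp
// BodyPartDataAsset.h -- illustrative structure only; property names are assumptions.
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataAsset.h"
#include "BodyPartDataAsset.generated.h"

class USkeletalMesh;

// Base asset shared by all character items, so they can use inheritance
// and carry their own helper functions (e.g. icon generation).
UCLASS(Abstract, BlueprintType)
class UCharacterItemDataAsset : public UPrimaryDataAsset
{
	GENERATED_BODY()

public:
	UPROPERTY(EditDefaultsOnly, BlueprintReadOnly)
	FText DisplayName;

	UPROPERTY(EditDefaultsOnly, BlueprintReadOnly)
	USkeletalMesh* Mesh = nullptr;
};

// A "Subpart" (accessory) that attaches into a named slot on a body part.
UCLASS(BlueprintType)
class USubpartDataAsset : public UCharacterItemDataAsset
{
	GENERATED_BODY()

public:
	// The kind of slot this accessory fits into, e.g. "Hat" or "Glasses".
	UPROPERTY(EditDefaultsOnly, BlueprintReadOnly)
	FName SlotType;
};

// A body part that exposes a list of slots accessories can be attached to.
UCLASS(BlueprintType)
class UBodyPartDataAsset : public UCharacterItemDataAsset
{
	GENERATED_BODY()

public:
	// Slots available on this body part; parts with matching slot types
	// keep their selected Subparts when the player switches between them.
	UPROPERTY(EditDefaultsOnly, BlueprintReadOnly)
	TArray<FName> SlotTypes;
};
```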



Our team was able to show this progress to PrimeCut in another meeting at their office, where we also got the chance to meet the facilitators who would be running the workshops. The character creation system received positive feedback, and we were able to get a better idea of what exactly was needed from the game for the workshops; for example, that each person in a group sharing one iPad should be able to create their own character. Based on this, I added a "waiting room" screen to the character creation process, which would display a lobby where created characters could be viewed or edited before starting the game.


It was around this time I also started designing the proper UI for the character creator. I used Adobe XD to put together some low-fidelity wireframes that I could show to the rest of the team and PrimeCut themselves. My main goals were to make things as intuitive to use as possible, while making sure there weren’t too many options on display at once. Both the team and our client responded positively to these designs. Later on I also handled the visual design of the UI.



UI Development Process



As we continued work on the project, I did end up making a few changes and additions to the character creation system. To start with, since the pawn blueprint I had used for the character creator was separate from the pawn that would be used in-game (with AI logic etc.), I decided to move the logic for handling and tracking the character’s appearance data into an ActorComponent that could be used by both pawns.
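
A simplified sketch of what that shared component might look like (names and members here are illustrative):

```cpp
// AppearanceComponent.h -- a simplified sketch; names and members are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "AppearanceComponent.generated.h"

class UBodyPartDataAsset;

DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnAppearanceChanged);

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UAppearanceComponent : public UActorComponent
{
	GENERATED_BODY()

public:
	// Replaces the item selected for a body part category (head, torso, legs, ...)
	// and notifies the owning pawn so it can rebuild its meshes.
	UFUNCTION(BlueprintCallable, Category = "Appearance")
	void SetBodyPart(FName Category, UBodyPartDataAsset* Item)
	{
		SelectedParts.Add(Category, Item);
		OnAppearanceChanged.Broadcast();
	}

	// Both the character creator pawn and the in-game pawn bind to this event.
	UPROPERTY(BlueprintAssignable, Category = "Appearance")
	FOnAppearanceChanged OnAppearanceChanged;

private:
	UPROPERTY()
	TMap<FName, UBodyPartDataAsset*> SelectedParts;
};
```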



Another change was to merge all of the skeletal mesh components into a single mesh once character creation was finished, as an optimization for when large numbers of characters would be active at once. For this I actually got to make use of some of the C++ skills I had developed during my placement, as it required exposing an existing "SkeletalMeshMerge" function library to Blueprints. Due to the way the character's materials were being handled, though, I ended up having to create a modified version of the function library myself in C++.
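
For illustration, the kind of Blueprint-exposed wrapper this involved looks roughly like the sketch below, loosely based on the FSkeletalMeshMerge utility covered in Epic's Working with Modular Characters documentation. It's heavily condensed, and the material handling that required my modified version is left out:

```cpp
// MeshMergeFunctionLibrary.h -- a condensed, illustrative sketch; not the modified
// version used in the project, which also handled per-part materials.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "SkeletalMeshMerge.h"
#include "Engine/SkeletalMesh.h"
#include "MeshMergeFunctionLibrary.generated.h"

UCLASS()
class UMeshMergeFunctionLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// Merges several skeletal meshes (sharing the same skeleton) into one transient mesh.
	UFUNCTION(BlueprintCallable, Category = "Mesh Merge")
	static USkeletalMesh* MergeMeshes(const TArray<USkeletalMesh*>& MeshesToMerge)
	{
		if (MeshesToMerge.Num() == 0)
		{
			return nullptr;
		}

		// The merged result is built into a fresh, transient skeletal mesh.
		USkeletalMesh* MergedMesh = NewObject<USkeletalMesh>(GetTransientPackage(), NAME_None, RF_Transient);
		MergedMesh->Skeleton = MeshesToMerge[0]->Skeleton;

		TArray<FSkelMeshMergeSectionMapping> SectionMappings; // empty = default section mapping
		FSkeletalMeshMerge Merger(MergedMesh, MeshesToMerge, SectionMappings, /*StripTopLODs=*/0);
		return Merger.DoMerge() ? MergedMesh : nullptr;
	}
};
```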


Some extra features required a lot of reworking of how character appearance data was structured. This included things like letting players pick colour parameters for certain items, or having Subparts adjust their position depending on which Body Part they are attached to.


After a meeting where the workshop facilitators had a chance to playtest the character creator, it was also pointed out that the workshops would likely be broken up into multiple sessions across different days. Therefore, it would be necessary to have a method of saving the characters and progress for a given group, so they could return to where they left off. The game would also need to support multiple saves, as workshops would often be held in different schools on the same day.

Making sure all the data associated with each character was properly stored and could be loaded again took a lot of work and debugging. By the end of it, I felt like I'd learned a lot about working with custom data structures.
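
As a rough idea of the multi-slot save setup (simplified, with made-up field names), the approach is essentially a USaveGame object per workshop group:

```cpp
// WorkshopSaveGame.h -- a simplified sketch of the multi-slot save approach;
// struct fields and slot naming are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/SaveGame.h"
#include "Kismet/GameplayStatics.h"
#include "WorkshopSaveGame.generated.h"

// Plain struct describing a created character, so a whole group's characters
// can be serialized into one save slot.
USTRUCT(BlueprintType)
struct FSavedCharacter
{
	GENERATED_BODY()

	UPROPERTY() FString CharacterName;
	// Soft references to the chosen body part / subpart data assets.
	UPROPERTY() TArray<FSoftObjectPath> SelectedItems;
	UPROPERTY() TMap<FName, FLinearColor> ColourChoices;
};

UCLASS()
class UWorkshopSaveGame : public USaveGame
{
	GENERATED_BODY()

public:
	UPROPERTY() TArray<FSavedCharacter> Characters;
	UPROPERTY() int32 StoryProgress = 0;
};

// One save slot per workshop group, e.g. "Workshop_SchoolA_Group2".
static void SaveWorkshop(UWorkshopSaveGame* Data, const FString& SlotName)
{
	UGameplayStatics::SaveGameToSlot(Data, SlotName, /*UserIndex=*/0);
}

static UWorkshopSaveGame* LoadWorkshop(const FString& SlotName)
{
	return Cast<UWorkshopSaveGame>(UGameplayStatics::LoadGameFromSlot(SlotName, /*UserIndex=*/0));
}
```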





Narrative Event System


Towards the end of the project, the team was having some difficulties getting a system working for the game's random narrative events. The main issue was with character AI, and since I had worked in this area before and felt confident with it, I decided to take over.



With the time remaining, I managed to put together a simple, mostly text-based system. Random events are picked from a weighted list, causing a set number of characters to make their way to a specified location. On arrival, a textbox usually appears describing the event. From there, facilitators would encourage kids to discuss the event in groups and come up with their own ideas about what happened.

There is some additional complexity behind the scenes, such as events having cooldowns before they can occur again. However, I definitely would have liked more time to develop this system further.
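
The core of the weighted selection with cooldowns looked something like this (a simplified sketch rather than the shipped code, with illustrative names):

```cpp
// Weighted event picking with cooldowns -- simplified, illustrative sketch.
#pragma once

#include "CoreMinimal.h"

struct FNarrativeEvent
{
	FText Description;        // Text shown when characters arrive at the location.
	FName LocationTag;        // Where the involved characters walk to.
	int32 NumCharacters = 1;  // How many characters take part.
	float Weight = 1.f;       // Relative chance of being picked.
	float CooldownSeconds = 60.f;
	double LastTriggeredTime = -BIG_NUMBER; // Lets the event fire immediately at the start.
};

// Picks a random event from the list, weighted by Weight and skipping anything
// still on cooldown. Returns nullptr if no event is currently available.
static FNarrativeEvent* PickRandomEvent(TArray<FNarrativeEvent>& Events, double Now)
{
	float TotalWeight = 0.f;
	for (const FNarrativeEvent& Event : Events)
	{
		if (Now - Event.LastTriggeredTime >= Event.CooldownSeconds)
		{
			TotalWeight += Event.Weight;
		}
	}
	if (TotalWeight <= 0.f)
	{
		return nullptr;
	}

	float Roll = FMath::FRandRange(0.f, TotalWeight);
	for (FNarrativeEvent& Event : Events)
	{
		if (Now - Event.LastTriggeredTime < Event.CooldownSeconds)
		{
			continue; // Still on cooldown, not part of the weighted pool.
		}
		Roll -= Event.Weight;
		if (Roll <= 0.f)
		{
			Event.LastTriggeredTime = Now;
			return &Event;
		}
	}
	return nullptr;
}
```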



iOS Deployment


At the start of production, none of us on the team had any experience with building a UE4 project for iOS, so I started researching it early on. In the end I'm glad I did, since there were a lot of steps involved and problems to solve in the process. Between different build errors, crashes with the Mac I was using, and complications with Apple's provisioning profiles, it took a few days before I managed to get a functioning build launched on an iPad.



For a while I tested the game by launching it directly onto the iPad, but later realized we would need a way to deploy builds to multiple iPads at once for the workshops. Luckily, I was able to start using Apple's TestFlight service, so new builds could be uploaded and then downloaded by a specific list of testers. This way, any updates could be made without needing access to the iPads themselves.



Final Version







Freelance: Digitising Maritime Heritage


As this project is still ongoing, I have less work to talk about overall. Like the Re-Mix project, this one is also targeting mobile devices including iOS, so I've been able to make use of that previous experience. The brief was to develop an application for the Ulster Museum that would provide information about its maritime heritage exhibits, making use of Augmented Reality to allow the viewing of scanned models, etc.


Most of my work has been focused on researching the AR functionality available in Unreal, such as the ARKit and ARCore plugins. So far, I've been able to develop a prototype that allows the selection of different exhibit locations and loads in the associated models to be viewed after scanning a reference image.




I also had the chance to see the boats myself and take part in the scanning process at the Ulster Transport Museum.
