The MXTreality team spent time preparing for the imminent installation of the tracking system, bringing the Virtual Reality Innovation Centre one step closer to being the showcase for what’s possible in immersive environments. To begin with it will largely host training programmes for Highways England Traffic Officers, but the future is happening right now – and in Derby, of all places.

This was our week, commencing 29th June

Toby – Delivery Driver, Mask Provider, Hand Sanitiser Dispenser

For me, this week has been about preparation. Preparation for the Innovation Centre tracking system install next week, preparation for the live trials of our Varjo XR-1 Driving Simulator, preparation for our office move, and for the future in general – revisiting our Value Proposition and deciding where best to invest our Marketing and Communications time (and money).

Alongside that we tried our first podcast (waiting to see the results of that), we continue work on the website ready to relaunch the new design, and we’ve invested in a lovely new cine lens to have a crack at some higher quality filming of our work.

Ending the week with a call from the good people of Varjo was a high note. In many ways this is still a young market and one of the many positives is that we work hand-in-hand with hugely impressive organisations such as Varjo. They’re a terrific bunch and super-enthusiastic.

We’ve tried our best to invest in the right areas, both in time and money, during this peculiar period and we can’t wait to get going now that many of our clients are returning to work. We’re absolutely not out of the woods yet regarding COVID-19, and as a business we will do all that we possibly can to ensure the wellbeing of our beloved team.

Josh – Programme Manager

Busy. Leading across all the projects. On holiday now so no diary.

Cat Flynn – Lead Programmer

This week I have mostly been focussing on developing our ‘Animals on the Network’ (ANVR) solution. For the most part, this has involved setting up scene transitions with our new holodeck system and starting to bring some life to the animals involved with some beautiful animations from Stefano. Now I have a pet dog in VR! And a cow, and a deer, and a horse…

As we progress the project, I’ve been taking time to re-assess planned work to make sure nothing sneaks up on us. In the past I’ve found a Kanban workflow can easily slip into something less productive and start to eat up time and effort while progress slows.

This gets worse as the real state of the project diverges from its representation in the tooling used to manage it. I’ve been chipping away at it this week, and now I’m more comfortable with the tasks we have planned.

Sometimes it’s hard to hold off writing code in favour of more planning, but the end result almost always benefits from it.

Sergio – Programmer

This past week, I had a chance to jump into our internal web AR project to undertake more research on the technology and in particular, complex scene setups and events.

Towards the end of the week, I joined Cat on our ANVR project, which required a similar User Interface (UI) to the one I created for our ongoing Traffic Officer VR project.

Although the UI’s visual design was built mostly around the dialogue system component created for that project, we could still port most of its critical features over to ANVR.

One of the big differences between the projects is that this time we opted to use Unity’s Scriptable Render Pipeline (SRP), which supports node-based shaders through Shader Graph. In the process of importing the UI, I started noting down the parts that could be further decoupled and made modular to help future-proof our UI systems, which I will continue next week.
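The decoupling idea Sergio describes can be illustrated with a minimal sketch (in Python rather than the project’s actual Unity C#, and with hypothetical names – `DialogueSource`, `DialogueUI` – that are not from the real codebase): the reusable UI core depends only on an abstract source of lines, so project-specific content can be swapped in without touching the UI itself.

```python
from abc import ABC, abstractmethod
from typing import List, Optional


class DialogueSource(ABC):
    """Abstract supplier of dialogue lines; each project provides its own."""

    @abstractmethod
    def next_line(self) -> Optional[str]:
        """Return the next line, or None when the dialogue is finished."""


class DialogueUI:
    """Reusable UI core: knows how to present lines, not where they come from."""

    def __init__(self, source: DialogueSource):
        self.source = source
        self.history: List[str] = []  # lines already shown to the user

    def advance(self) -> Optional[str]:
        """Fetch and record the next line; None signals the end of dialogue."""
        line = self.source.next_line()
        if line is not None:
            self.history.append(line)
        return line


class ScriptedDialogue(DialogueSource):
    """Project-specific source: a fixed script, e.g. for an ANVR scenario."""

    def __init__(self, lines: List[str]):
        self._iter = iter(lines)

    def next_line(self) -> Optional[str]:
        return next(self._iter, None)


# The same DialogueUI works unchanged with any DialogueSource implementation.
ui = DialogueUI(ScriptedDialogue(["Hello there.", "Mind the horse!"]))
print(ui.advance())  # Hello there.
```

Ported to a new project, only the `DialogueSource` implementation changes; the UI, history handling, and presentation logic come along for free.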

Stefano – 3D Artist

This has been a week of planning and motion capture recording.

For this new scenario I have to animate two characters that interact with each other and with the user, so the workflow has been revised to meet the requirements and to ease everyone’s work.

I’m now starting post-production and clean-up of the raw mocap files, and everything seems to have worked as predicted.

I worked a lot on the sounds and voiceover audio clips, merging and cutting pieces to meet the timings for the back and forth between the characters. I also included environment sounds and created custom audio files to help the recordings come alive.

As usual, I’m creating detailed automations to handle the repetitive operations on each of these scenes.

Slava – Lead 3D Artist

This week I worked on the talking animation of two characters for our new scenario, which included preparing the characters for lip synchronisation in Unity and applying Physically Based Rendering (PBR) materials – a way of shading and rendering that delivers a more accurate representation of how light interacts with surfaces. All about the detail at MXTreality.

This current scenario is a bit different from previous ones, as we have two interacting characters, which means more animations and more attention to the structure of files.

The detail is taking a lot of time, as one of the characters is quite emotional, so animating the additional facial movements with blendshapes is challenging.

Another task was experimental and involved exporting animations of vehicle doors for an AR project. Unfortunately, there is still no direct way of exporting 3D models from 3ds Max in an AR-suitable format, so the process involves several intermediate steps.

This makes it quite tricky: the animations have to be baked into external files, while trying not to lose them, or some of their properties, during the conversion.

Kyung-Min – 3D Generalist

Thinking back to the start of the pandemic, Toby took a lead and sent us all home long before it became mandatory, which proved to be the right thing to do and a decision I wish others had taken.

With the government announcing an easing of the COVID-19 restrictions, talk amongst our team has turned to the transition back to our studio space. Many of us have already made the decision to return, but we have the choice to remain remote, thanks to the company’s foresight.

COVID-19 has changed the world, but I like to think its negative impact has been countered somewhat by the work we do at MXTreality that brings positivity and even saves lives.

As the mini-Holodeck project progressed I continued to learn more about Unity’s Universal Render Pipeline (URP) and Shader Graph.

With my work in Shader Graph advancing at a steady pace, I found myself looking into the new Visual Effect Graph, designed for URP and HDRP particle systems.

Fortunately, much of my knowledge of Unreal Engine’s node-based system transferred to the Visual Effect Graph – a node-based VFX editor inspired by leading film tools – which made the learning process a lot smoother than originally anticipated.

With the primary foundational work on the Hallelujah mountains finished, I began spending much more time in VR testing the experience. The initial paper build was constructed in 2D, but once testing began in VR I found it had to be scaled up vertically to almost three times its original height to deliver the intended impact.

With the key assets and animations to be implemented next, the barebones experience will be complete and ready for intensive testing.

Accumulating data from the feedback will allow data-driven design choices and adjustments to be made, which helps make the final outcome truly outstanding. During this accumulation phase, I will also enrich the scene, adding foliage and enhancing the backdrop to truly immerse players.

As our first Holodeck experience comes to the end of Phase One, concepts to advance it have already been outlined for Phase Two, but next week I look forward to sharing the outcome and feedback from our team.

Jordi Caballol – Programmer

This week I have been almost full-time on the driving simulator.

The most difficult part, but also the most fun, is implementing the car’s systems so they feel believable. This includes simulating the way the engine produces force, the way it revs, how that force is transmitted through the transmission, and so on.

All this detail takes a lot of time due to its complexity and the amount of fine tuning (excuse the pun) it requires. However, during the week we achieved a perfectly driveable car, and now it’s about adding layers of detail to improve the feel and make the driving a truly immersive experience.
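The chain Jordi describes – engine torque, revs, gearbox – can be sketched in a toy model (Python for illustration, not the simulator’s actual code; every number here is an assumption made up for the example): engine torque follows a curve over RPM, gets multiplied through the gear and final-drive ratios, and becomes a driving force at the wheel.

```python
import math

# Hypothetical drivetrain constants - illustrative values only.
GEAR_RATIOS = [3.6, 2.1, 1.4, 1.0, 0.8]  # gears 1-5
FINAL_DRIVE = 3.9
WHEEL_RADIUS = 0.33  # metres


def engine_torque(rpm: float) -> float:
    """Very rough torque curve (Nm): zero outside the working range,
    peaking at 300 Nm around 4000 rpm and falling off towards redline."""
    if rpm < 800.0 or rpm > 7000.0:
        return 0.0
    return 300.0 * (1.0 - ((rpm - 4000.0) / 3600.0) ** 2)


def wheel_force(rpm: float, gear: int) -> float:
    """Driving force (N) at the contact patch: engine torque multiplied
    through the gearbox and final drive, divided by the wheel radius."""
    ratio = GEAR_RATIOS[gear] * FINAL_DRIVE
    return engine_torque(rpm) * ratio / WHEEL_RADIUS


def rpm_from_speed(speed_ms: float, gear: int) -> float:
    """Engine speed implied by road speed: wheel revs/s times the total
    ratio, converted to revs/minute. Closing this loop is what makes the
    engine rev up and down with the car."""
    wheel_rps = speed_ms / (2.0 * math.pi * WHEEL_RADIUS)
    return wheel_rps * GEAR_RATIOS[gear] * FINAL_DRIVE * 60.0
```

Even this crude version shows the feel-relevant behaviour: first gear delivers far more force at the wheel than fifth at the same RPM, and shifting up drops the revs for a given road speed. A believable simulator layers much more on top – clutch slip, engine braking, load transfer – which is where the fine tuning time goes.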

Once this is done, the plan is to build a motorway section where we can drive around to showcase the simulator. But this will be in another diary, so don’t forget to check back.