Despite the easing of the lockdown, social distancing in our office remains tricky and with productivity at an all-time high, we’re reluctant to break the spell yet. Our Innovation Centre is getting back on track which is great for the future story of Immersive Environments. This was our week.

Toby – Leading From Home-ish

Although I’m gradually returning to the office, I’m also taking the opportunity, when possible, to top up my Vitamin D reserves. My partner lives in Bristol so we have ready access to some beautiful countryside and quiet roads. Such a treat in the sun.

Another interesting week work-wise. The hardware for the Innovation Centre tracking system has been delivered to site and now we’re working out how best to install it. The sound system will follow over the next couple of weeks.

As anticipated, as we start to emerge from the Covid-19 ‘lockdown’, we are being asked more and more about the use of headsets and how we address concerns about sharing them during training courses and at events and shows.

We have drawn up clean-down and quarantine protocols for the Innovation Centre, but we also have the capability to live-stream our solutions to people working from remote locations; they don’t even need to use a headset. And, of course, there’s always the use of Augmented Reality…

Josh – Programme Manager

Popped into the office this week; that courgette photo is actually real. I’d assumed it was from Google. Motion capture is underway for the disabled customer scenario in TOVR, and the rest of the development is pretty much complete, so that’s good.

I’ve spent the last couple of days looking at the feasibility of delivering some of our upcoming solutions remotely. Given that our solutions normally need quite a lot of graphical processing, and that the average mobile workstation doesn’t have a dedicated graphics card, it sounded like a bit of a stretch.

But with a little bit of work on our side and some pretty cool remoting software and cloud provision, it looks like it might just work. It won’t attempt to be as immersive as an on-premise VR solution, but it could, in theory, offer a lot of long-term flexibility in how training is delivered or administered.

I’m about to complete storyboards for the Incident Management planning I mentioned last week. We’ll take those designs and construct 3D versions, to comb through with the Highways England project team.

Apparently, due to the danger of doing this kind of thing on a live road, this training hasn’t ever been attempted before, in any form – so it’s comforting to know that however we do, it’ll be better than nothing.

Cat – Lead Programmer

Traditionally, as a team we’ve worked almost exclusively in Unity. However, it’s important for us to remain flexible in what we can do and not become dependent on any particular technology or toolkit.

With this in mind, I’ve been researching web-based Augmented Reality with Sergio. At present, Unity is unable to offer Augmented Reality as part of its WebGL platform, so we searched for alternatives.

One we found is AR.js, an Augmented Reality-specific library built on top of A-Frame, a more general XR web framework. Using this toolkit, I was able to use my phone to plant a virtual object on a real-world image, all accessible from a website running on my computer.

From my experience as a web developer, I was thoroughly impressed by how cleanly the library can be used from a plain HTML web page. Looking forward to experimenting more with this technology!
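For context, a marker-based AR.js scene of the kind described above needs only a few lines of HTML markup. This is a sketch based on the library’s published hello-world example; the script URLs and versions are illustrative and may have changed since:

```html
<!-- Minimal marker-based AR.js scene (illustrative; check current script URLs/versions). -->
<script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>

<a-scene embedded arjs>
  <!-- When the camera sees the built-in "hiro" marker, a red box is planted on it. -->
  <a-marker preset="hiro">
    <a-box position="0 0.5 0" material="color: red;"></a-box>
  </a-marker>
  <a-entity camera></a-entity>
</a-scene>
```

Opening a page like this on a phone prompts for camera access, and pointing the camera at the printed marker anchors the virtual box to it.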

Working closely with Varjo, I’ve been trying to get real-time reflections working to a level that we can use in our driving simulator. Reflections are a surprisingly difficult rendering problem, particularly in virtual reality as the stereo projection must be accounted for.

This week I was able to produce an effect that looks great in video recordings, but seems to produce a strange offset effect when viewed from inside the headset.

This will probably be sufficient for our needs, since the user won’t be moving much relative to the mirrors; applying the effect to the simulator’s mirrors is the next stage for me. Looking further forward, I’d like to see if ray-tracing can be employed as a possibly more performant implementation of mirrors.

While I don’t have much experience with the technique, reflections are one of its strengths and so I think this may be a perfect application.

In what was a busy week, I’ve also been doing final bugfixes and changes, and starting post-production on our driving game. For us, post-production means extracting the parts of the project that will be useful in future endeavours.

In the case of the driving game, that means packaging the now almost omnipresent road system we initially developed for our Eyes & Ears experience, such that it can be easily included in other projects.

I’m also updating the traffic package with the updates from this project, and shunting some useful data structures into our general utility scripts. Having a post-production process and sticking to it means that over time we develop features that can be used again.

Sergio – Programmer

A website is the window to a business and sets the tone for the first impression. From design and user experience to performance and SEO, a good website creates trust with users and demonstrates a consistent message, helping people decide whether or not they want to do business with you.

We want the MXTreality website to be easy to use, fast, secure and optimised for SEO. So, after weeks of researching and learning the right stack for the website redesign, I concluded that Gatsby, a static site generator, paired with a headless CMS offers us many advantages.

The power of Gatsby is the way it delivers pages to the browser. A website built with the typical server-side approach dynamically pulls information from a database every time a visitor hits a page, which on a content-heavy site results in delays and, consequently, bounces – a bounce being someone who visits your website, views one page only and leaves.

With Gatsby, however, the pages are compiled ahead of time, at build time, which cuts load times significantly and lets developers build fast websites. Static pages are also more secure and reliable: there is next to nothing to hack into, and the pages can be regenerated without losing any content.

Gatsby and its ecosystem let a developer future-proof a website with modern technology such as React (an open-source JavaScript library for building user interfaces) and GraphQL (an open-source data query and manipulation language for APIs), both of which have big community support and are among the standards of the web development world.
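The contrast with per-request rendering can be sketched in a few lines of plain JavaScript. This is only a toy illustration of the build-time idea; the content and page template here are hypothetical, and real Gatsby renders React components and pulls data through GraphQL rather than building strings by hand:

```javascript
// Toy sketch of static generation (hypothetical content, not real Gatsby code).

// Content fetched once, at build time, rather than on every request.
const posts = [
  { slug: "web-ar", title: "Experimenting with AR.js" },
  { slug: "varjo-mirrors", title: "Real-time reflections in VR" },
];

// Build step: render every page to a static HTML string up front.
function buildSite(posts) {
  const pages = {};
  for (const post of posts) {
    pages[`/${post.slug}/`] =
      `<html><body><h1>${post.title}</h1></body></html>`;
  }
  return pages;
}

// Serving is then just a lookup: no database hit, no per-request rendering.
const site = buildSite(posts);
console.log(site["/web-ar/"]);
```

A real static site generator does the same thing at scale, writing the prebuilt pages to disk so a CDN can serve them directly.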

Finally, what makes Gatsby powerful is the way it pulls in content, like blogs and articles about our experiences and technology, helping engage an audience with insight into our knowledge of the important industry we work in.

This is helped by improving the way we fetch data – robustly and securely, without creating problems for our content managers – which is exactly what the headless CMS makes possible. More on that next week.

Slava – Lead 3D Artist

This week I was working on the creation of a vehicle adapted for use by a disabled person. I started from a standard Chrysler 3D model and converted it, placing a motorised wheelchair where the driver’s seat would be. Both the car and the wheelchair models needed to be optimised for use in Unity.

I also added further modifications, including a blue badge inside the windscreen, two sticks connected to the pedals and a small stick on the steering wheel to allow one-handed steering – all details designed to help the viewer recognise the driver as a disabled person.

In addition to this task I created small assets, such as a traffic cone and a temporary sign, and made some modifications to the environment of the new scenario, including a new skybox, new lighting and new road markings.

Stefano – 3D Artist

This week, after two attempts that didn’t work out as I wanted, I finished the 2D version of our logo animation. I loved coming back to the render pipeline and letting the creativity flow.

It was great fun working with particles and colours in Maya, rendering them with Arnold and then compositing in After Effects. I had an initial idea, but essentially I experimented with force fields, lights and colours to create the final effect. (A couple of screenshots of the result)

This week I also worked on a variety of sounds for our projects, along with a lot of motion capture takes, which have become a habit now and something I feel perfectly comfortable with.

Kyung-Min – 3D Generalist

With the last of my work in events on standby, as we await the final motion capture, my efforts turned to refining existing elements of our scenes.

The removal of the baking process from an OpenGL project has made it hard for me as an artist to produce work when textures aren’t visible or displaying their values. Some final tweaks await, but it’s been a great learning experience to understand how we can work within such specifications.

With a new Jira workflow being implemented, time was spent adapting our old workflow; among these changes, the art pipeline we built, tested and refined during our events project is also now being brought into use.

The end of the week saw me taking all the data and notes from these trials, as well as what we learned through our errors, and compiling them into a format that can be presented to the art division for feedback.

I’m excited for these changes as I know it will mean a lot for the team’s workflow and simplify a lot of our work, saving time and reducing the risk of an error creeping in.

I have also been given authorisation to create video tutorials and workflow videos that will help artists learning the new processes – always an eye on growth and new team members here at MXTreality.

Wiki pages being as simple as they are, I’ve found that artists especially respond better to visual examples of complex workflows than to text alone.

A lot of time and consideration was required to create these new working methods and practices, designed not just for our work now, but for those MXTreality recruits yet to join.

Jordi Caballol – Programmer

This week has been all about experimenting and playing around with the new Intro screen and, especially, with how it transitions to the scenarios.

The reference we had in mind was the holodeck from Star Trek, that looks like this:

So first of all we replicated this style for our room, achieving this kind of effect:

Once we had the room built, it was time for the transition effect into the scenarios, again using the holodeck as a reference:

As you can see in this image, the walls disappear progressively. We wanted to replicate this effect but had one issue: fading the walls works nicely for objects behind them, but we had objects that needed to appear inside the holodeck.

The solution is what I’ve been working on for almost the whole week. First, the shape of each object appears using the same pattern as the holodeck; this then fades to the object’s actual colour, giving the sensation of the world materialising in front of our eyes.

It looks something like this: