Like many businesses, we have begun the return to our studio, and it will be nice to see the team in person, socially distanced of course. New projects continue to hit our inbox, and if they are any indication of how clients view the near future, immersive technology features more prominently now than it did even before the lockdown. This was our week.
Toby – Back in the office and still in charge
This week was my first full week back in The Shed (my office space). It really is a shed y’know, albeit one packed with technology.
Things have overgrown a bit during the lockdown, but I think I might leave it for now. I’m fortunate to live pretty close to the office so I can walk in each day and it’s a real bonus to be able to have that separation between home and work each day.
Busy times have been made busier by planning for the opening of the Innovation Centre, but we're also looking at a variety of 2D-type experiences for clients to use in training people who are working remotely.
Immersive technologies to one side, we're a team of artists and programmers with state-of-the-art motion capture capabilities and the animation skills to go with them, so we're happy to put all that to good use wherever it's needed. Forthcoming experiences, as you'll read from the team, include a Holodeck, 2D gamification, and Augmented Reality fun…
Josh – Programme Manager
It's been a good week, and as I write this we're quality-assurance checking the final touches to Scenario 4 of the Traffic Officer VR training solution.
I’ve continued planning our Incident Management solution and Toby and I have both shared productive calls with the trainer-led project management team at Highways England.
In addition, we have a lot of ongoing projects. We’re doing some internal work on both app and app-less AR, Sergio’s website project is continuing, we’ve almost completed our Virtual Events project and our driving game progresses well.
We are at the final stage of development for our Animals on the Network solution, which helps Traffic Officers understand the best way to deal with wild animals roaming the highways.
We’re continuing work on the Driving Simulator and we’ve started concept work for the next stage of the Holodeck project, which I’m sure others will talk about in more detail.
Cat – Lead Programmer
This week I have started work on the ‘Paper Build’ of our solution to help Traffic Officers deal with animals being in places they shouldn’t be – on roads.
A paper build is a relatively quick once-over of a solution, a framework built quickly to get an overview of all the moving parts of a project. By spending some time up-front to give structure to the whole project, we can avoid tricky integration problems later in development.
By composing a paper build of parts developed in earlier solutions, we can hit the ground running with new projects instead of starting from scratch. In this instance, we are re-using the road and traffic systems we’ve been developing this year (with minor tweaks, as there always are) so we can produce a lot, very quickly.
One unfortunate effect of the pandemic is that we can’t deliver our solutions like normal, at conventions or using shared equipment. So, we’ve been considering ways for getting our solutions onto people’s PCs remotely.
Game streaming is a hard problem: even the largest tech companies, in their attempts to build compelling services, have yet to enjoy widespread mainstream success with it.
Our requirements for streaming an application are a bit more lenient in terms of latency and quality than running modern e-sports titles. I’ve assessed Parsec and Amazon AppStream 2.0 as potential application streaming platforms and will be looking at more in the near future.
Lastly, we continue to look into Augmented Reality and spent time among the programmers, figuring out the features our research project needed to have, to address our questions about working with the technology.
We decided to develop two identical projects, one using Unity and another ‘App-less’ project using web technologies. Both projects will need us to substantially change how we work on and test projects, so my primary focus will be figuring out our internal tooling and processes for projects targeting web and mobile platforms, instead of the desktop platform we’re most familiar with.
Sergio – Programmer
Last week, I transitioned from our Events experience project to a new internal AR (Augmented Reality) research project, to understand the capabilities and benefits of both web and standalone AR applications.
Cat and I investigated AR.js and found it worked surprisingly well on mobile device browsers, given there are not many stable web-based libraries.
I started running local tests on Natural Feature Tracking (NFT), a complex image-based approach to tracking rather than a marker-based one. Tracking quality depends directly on the resolution and complexity of the image: the more distinctive features it contains, the better the tracking.
Lighting plays an important role as well. A database file is created containing all the descriptors and their positions on the original, well-lit image; then, using the camera, a real-time recognition process matches the descriptors from the live video against those in the database.
If the camera is processing a much darker scene, or the image target blends in with strongly coloured surroundings, results may be poorer, but every day is a school day here at MXTreality.
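The matching step described above can be sketched in a few lines. This is not AR.js's internal implementation, just a minimal nearest-neighbour illustration of how descriptors from a live frame are matched against a pre-built database, with a distance cut-off so poor matches (for example, from a dark frame) are rejected:

```python
import numpy as np

def match_descriptors(live, database, max_distance=0.5):
    """Match each live-frame descriptor to its nearest database descriptor.

    live, database: (N, D) and (M, D) arrays of feature descriptors.
    Returns (live_index, db_index) pairs whose distance is within max_distance.
    """
    matches = []
    for i, d in enumerate(live):
        dists = np.linalg.norm(database - d, axis=1)  # distance to every stored descriptor
        j = int(np.argmin(dists))
        if dists[j] <= max_distance:
            matches.append((i, j))
    return matches

# Toy example: three stored descriptors, two descriptors from a live frame.
db = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
frame = np.array([[0.1, 0.0], [0.9, 0.1]])
print(match_descriptors(frame, db))  # → [(0, 0), (1, 1)]
```

Real NFT descriptors are much higher-dimensional and matched with spatial-index structures for speed, but the principle is the same.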
Overall, I am intrigued to learn more about AR, but next week I'll be spending more time researching our website redesign, and I hope to write more about its features and processes in our next blog.
Stefano – 3D Artist
This week I tackled the character animations for our latest immersive experience project. Working with the better takes from the mocap recordings, I began by figuring out all the fixes and adjustments to make the character fit comfortably inside the car.
Recording the mocap on my own and in a small space has proved perfectly feasible, but it does have its limitations.
After exporting the animation files from the mocap suit software I had to ensure they were all centred in the same position, which they obviously weren’t. I applied a transformation on the main bone of the skeleton for each scene, to shift them all to the same spot.
Since there were 30 scenes, I made good use of Maya scripting to automate the process and speed things up – essential given how many projects we have in the business.
After that I had to fix the bad movements that make the limbs clip through other objects, by manually 'deviating' my own virtual acting.
At this point I wanted to export and test one of the animations, so I started adjusting positions, importing referenced files, renaming objects, baking keyframes, cleaning animation curves, and deleting bones and nodes that aren't used in Unity, all of which took around two hours.
There would have been a lot of downsides to reproducing this process manually for all 30 scenes, like taking more than a week just to export all the files and undoubtedly making a few errors when repeating long sequences of commands.
So, I decided to invest almost an entire day in scripting all those operations so they could be performed with just a few button pushes, which reduced the two-hour cleaning and exporting process to 30 seconds.
This ensures we can undertake the deep cleaning needed to make the animations very light and super-optimised, whilst also being able to make modifications to animations whenever needed, without spending 2 additional hours every time – efficiency is a key driver.
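The batch approach can be sketched as a small pipeline runner: every cleanup and export step is a function, and one loop applies them to every scene. The step names below are illustrative stand-ins; in the real script each would call into Maya:

```python
def export_all(scenes, steps):
    """Run every cleanup/export step on every scene, collecting a log.

    scenes: list of scene names; steps: functions taking a scene name.
    One scripted pass replaces hours of repeated manual menu clicks.
    """
    log = []
    for scene in scenes:
        for step in steps:
            step(scene)
            log.append((scene, step.__name__))
    return log

# Illustrative stand-ins for the real Maya operations.
def bake_keyframes(scene): pass
def clean_curves(scene): pass
def export_fbx(scene): pass

scenes = [f"scene_{i:02d}" for i in range(1, 31)]  # the 30 mocap scenes
log = export_all(scenes, [bake_keyframes, clean_curves, export_fbx])
print(len(log))  # 30 scenes x 3 steps = 90 operations, all scripted
```

Because the steps always run in the same order, the script also eliminates the "few errors when repeating long sequences of commands" mentioned above.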
Slava – Lead 3D Artist
This week I worked on several different tasks, with the facial animation of a new character, a disabled lady, central to my efforts. As always, I created new materials and specific blend-shape setups for lip synchronisation and emotions, before animating each line of dialogue.
Another interesting task was building the environment for the intro scene of project Holodeck, the famous fictional room from ‘Star Trek’, used to introduce virtual reality environments.
We will be using a similar approach to prepare customers for their VR experience and to transition between scenarios. Jordi made an amazing shader, which makes any surface gradually disappear at the beginning of each scenario and reappear at the end, with a nice transition effect.
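A dissolve effect like this is typically done by comparing each pixel's value in a noise texture against an animated threshold. The NumPy sketch below mimics that per-pixel test on the CPU purely for illustration; the actual effect is Jordi's GPU shader, whose details aren't shown here:

```python
import numpy as np

def dissolve_mask(noise, threshold):
    """Per-pixel visibility for a dissolve effect.

    Pixels whose noise value exceeds the threshold stay opaque; the rest
    have dissolved away. Animating threshold from 0 to 1 over a scenario
    transition makes the surface gradually vanish, and back again to reappear.
    """
    return (noise > threshold).astype(float)

rng = np.random.default_rng(0)
noise = rng.random((4, 4))            # stand-in for the shader's noise texture
print(dissolve_mask(noise, 0.5))      # mid-transition: a speckled mix of 0s and 1s
```

Sweeping the threshold the other way at the end of a scenario gives the matching reappear effect.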
I made a room with fictional sci-fi style doors and applied this shader to the walls. I modified all materials to create a more interesting experience inside this room and created a model car and character to help users navigate inside VR.
A big TV screen was added, showing video from multiple scenarios as if it were a CCTV recording. I also worked on multiple small tasks, like modifying the character mesh to improve the shape of the cloth when sitting, and creating a reflection probe to improve reflections on the car's surface.
Kyung-Min – 3D Generalist
This week's diary will boldly go where no man has gone before! Since the creation of the cult classic Star Trek, many of its depictions of futuristic technologies have become surprisingly accurate predictions of our future, turning science fiction into reality.
From flip phones and tablets to Bluetooth earpieces and voice-command computers, the list of eerily accurate predictions goes on. But apart from the spaceships themselves, top of many fans' wish lists for a technological transition to reality was the holodeck!
The holodeck is a room that allows for immersive virtual holographic experiences, used for training or leisure. While the show's interpretation may have seemed like complete fantasy, our project gets us ever closer to making it possible.
This week I worked up some potential concepts for our very own holodeck, with a virtual reality game along the lines of an escape room experience first on my list.
I began by researching the fundamentals of escape rooms and discovered existing VR escape rooms mostly follow traditional designs and try to recreate existing rooms.
Our concept is progressing well, with input from both the creative and programmer teams regarding the complex aspects of the project and how we might manage them.
As I have often mentioned previously, while virtual reality hardware has seen a lot of development, software is still lagging behind. Only now are we seeing triple-A titles such as Half-Life: Alyx and Boneworks show the true potential of what can be achieved.
While the full specs and details of our holodeck are strictly classified, I believe the cutting-edge technologies we are implementing truly are the first steps towards turning the dream of a holodeck into an MXTreality.
As part of my research on designing VR for new users, I ran some trials with someone who wasn't tech-proficient and was trying VR for the first time. Knowing she preferred books to mobile phones and had never played a video game, it was amazing to see how VR surprised her.
At the end of the experience she was left nearly in tears, overwhelmed by what she had just experienced with this immersive technology. I rediscovered my passion for VR, reminded of the wonder of the technology and what our work means for the future.
Jordi Caballol – Programmer
This week, a lot of my work has gone into finishing Scenario 4 of the Traffic Officer experience, setting up the last few items and fixing the last few bugs. Aside from that, I spent some time creating a storyboard for a future project about incident management.
The storyboards, so far, look like this:
And finally, I started researching how to create AR apps with Unity, to gain experience for a future project; more about this in next week's diary, when I (hopefully) will have results to share.