Highways England estimates that up to 74,000 traffic incidents each year involve deer, and there are plenty of other animals that can find themselves loose on the road network. The question is how to train a Traffic Officer to deal with them.
These animals are often in a distressed state, which is impossible to recreate in training scenarios with real animals – not least because of the obvious welfare concerns that would result.
Immersive technology, and virtual reality in particular, allows us to create scenes and scenarios like these that can’t otherwise be staged. High-quality animation, detailed modelling and accurate animal sounds have all played a part in our ‘Animals on The Network’ training solution.
All in a week’s work here and this was our week.
Week commencing 3rd August
Toby – Managing Director
We’re now into the holiday period and so I’m trying to juggle the workload to maintain the flow of projects while our various team members jet off to their respective home countries for a well-earned break.
The focus for me is the launch of some key projects over the next few weeks, which I’ll talk a bit more about in coming diaries and blogs.
Very soon we’re going to start showing off our driving simulator capabilities and, yes, finally our Test & Innovation Centre, which feels like it’s gone through the longest of incubation periods – but we really are nearly there now.
The remaining tracking headsets and computer backpacks will be delivered to the office next week and the team will start experimenting with them and getting to know their capabilities, before they then go back to the centre itself. FUN!
Josh – Programme Manager
Josh spent the week trying to get the sound installation organised in the Test & Innovation Centre but came up against power supply issues.
He’s also trying to co-ordinate a number of new projects and get them integrated into our pipeline; Augmented Reality Vehicle Checks and Augmented Reality experiences to complement real-world conferences. And he built some Ikea cabinets, with no bits left over! (Actually quite a few, according to Toby)
Cat – Lead Programmer
This week I’ve started work in earnest on the new software for the Test & Innovation Centre. I thought I’d take the opportunity to make some comparisons between the original implementation and the new one we’re developing now!
As a networked solution, it’s very important that we get reliable, stable and convincing networked clients. To achieve this, the deployed application communicates only over the local area network (LAN).
This reduces latency, as shortening the physical distance travelled by data packets decreases the time it takes to get responses from the server. It also reduces – or entirely eliminates – packet loss since information isn’t having to jostle for priority on busy public internet lines.
To develop it originally, we had a server running in the office so that we could use the local network already in place. However, now that a substantial fraction of the team is working from home we need to be a bit more flexible.
As such, while re-implementing the network layer I’m taking care to include support for running it in Photon’s (the network package we’re using) cloud. Also, for added flexibility, I’ll make sure that the whole thing can capably run without needing a server online at all. Much easier for development!
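Photon itself is a C#/Unity SDK, so the sketch below only illustrates the backend-selection logic in Python; the function and option names are hypothetical, not part of our codebase or Photon’s API.

```python
from enum import Enum

class NetworkBackend(Enum):
    LAN = "lan"          # office server on the local network: lowest latency
    CLOUD = "cloud"      # Photon's hosted cloud: works from home
    OFFLINE = "offline"  # no server at all: handy for solo development

def choose_backend(lan_server_reachable: bool, developing_solo: bool) -> NetworkBackend:
    """Prefer the low-latency LAN server, fall back to the cloud,
    and skip networking entirely when developing alone."""
    if developing_solo:
        return NetworkBackend.OFFLINE
    if lan_server_reachable:
        return NetworkBackend.LAN
    return NetworkBackend.CLOUD
```

The key point is that the choice is made at startup rather than baked into the build, so the same application can serve the office, remote workers and offline development.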
From the start, this project had complex requirements in terms of how end-users start up and run it. In deployment, there will be at least two different types of client.
The first type is the operator, who is in charge of running the scenario and communicating information to the trainees while seated at a desktop computer. They will have a bird’s-eye or CCTV view of the scenario and the trainees within it, as well as a number of controls to affect the simulation.
The second type is the Traffic Officer trainees themselves – equipped with a headset and VR backpack, they will be on the simulated road in the middle of it all, practising managing incidents on the road network.
For them, it’s important the training scenario is as immersive as possible – no user interface elements, very ‘boots on the ground’.
Additionally, for development purposes we need at least two more versions – one to use VR without the advanced StarTracker positioning system and another to run without VR at all to help us develop the networking aspects.
In the original project, we handled this by compiling four separate versions of the project. While it worked, it was hugely inefficient because the end product was mostly the same with just a couple of configuration differences.
Now that we have other projects in progress, as well as a significantly longer compile time thanks to Unity HDRP (the new project has vastly improved graphics!) we can’t afford to tie up our build server for a couple of hours every time we want to make a build.
To resolve this, I dug into PowerShell a bit and figured out a way to launch the same executable in a number of different modes, corresponding to the different configurations we need. As an added bonus, this also lessens the load on our cloud storage as we’re only storing one build, not four.
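The real launcher is a PowerShell script driving a Unity build, but the idea translates to any language: one executable reads a mode flag at startup and configures itself, instead of shipping four builds. A minimal Python sketch, with entirely hypothetical mode names and settings:

```python
import argparse

# The four run configurations described above, keyed by a --mode flag.
# The settings here are illustrative, not our actual configuration.
MODES = {
    "operator": {"vr": False, "startracker": False, "view": "cctv"},
    "trainee":  {"vr": True,  "startracker": True,  "view": "first-person"},
    "dev-vr":   {"vr": True,  "startracker": False, "view": "first-person"},
    "dev-flat": {"vr": False, "startracker": False, "view": "debug"},
}

def parse_mode(argv):
    """Return the configuration for the requested mode."""
    parser = argparse.ArgumentParser(description="Launch the training app")
    parser.add_argument("--mode", choices=MODES, default="dev-flat")
    args = parser.parse_args(argv)
    return MODES[args.mode]
```

A wrapper script (PowerShell, in our case) then only needs to pass a different flag per configuration, so the build server produces a single artifact.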
Sergio – Programmer
Website redesign: Mobile First?
During Mobile World Congress in 2010, Google put forward the idea that designers should take a mobile-first approach to product design, simply because the medium is shifting further from desktop users towards the smartphone.
According to Statista, the number of smartphone users worldwide today surpasses 3 billion and is forecast to further grow by several hundred million in the next few years.
Many developers opted to design their websites mobile-first and then adapt them for desktop screens (Facebook 2020, Twitter, Evernote, etc.). This decision came with sacrifices, among them compatibility issues with older browsers such as Internet Explorer 11, which is nevertheless still supported on Microsoft’s Windows 10.
Although targeting only mobile is tempting, at MXTreality we focus on a wide range of users, and we need to display information that is engaging and accessible.
Thanks to the flexibility of Carbon Design Language and Gatsby/React, I can create versions of our website that will work well both on mobile and desktop browsers without compromising compatibility or performance, while keeping a distinctive structure.
For the past week I have been tackling exactly that: mobile/desktop responsiveness, which can directly affect SEO (Search Engine Optimisation) – something I will discuss in more detail in future posts.
Stefano – 3D Artist
This week I spent time creating some detailed sounds for our virtual animals, to help improve the sense of presence in the immersive environments they inhabit. The first pass was to add the most obvious animal sounds, like barking, tailored to each of the existing animations.
A second step involved the sound of hooves or paws hitting the ground or sliding on it, and we are now testing the results of automating these sound effects where and when they are needed.
We want the source of each sound to be positioned automatically on the moving hoof, so that spatial coherence is maintained realistically.
It’s not finished: I also created a breathing sound for each animal, extracted from real footage, to play when the user comes really close, plus many other generic body-movement noises, such as the swish of a tail.
When training Traffic Officers how to deal with wild animals loose on the road network, detail is everything if we are to make the scenarios realistic. And a heavily breathing horse up close and personal can be quite scary if you’re not used to it!
For the rest of the week I started creating the concept drawings for our different virtual or mixed reality driving simulator solutions, based on the use of the Varjo XR-1. One of the options is based on a real vehicle.
At the same time I’ve started preparing footage for a video that will feature on our new website, highlighting the capabilities of our work here at MXTreality.
Slava – Lead 3D Artist
To improve the visual quality of our upcoming projects, we switched to the newly introduced High Definition Render Pipeline (HDRP) in the Unity engine. HDRP provides more accurate, and thus more realistic, lighting and reflections. It also provides improved effects such as fog, bloom, exposure compensation, a physical camera model, and many others.
My task was to tweak the environmental features of our ‘Animals on the Network’ project. I found that in implementing HDRP, Unity’s developers dramatically changed the user interface.
As a result, the workflow has changed as well, so I spent several days learning new features and testing their effects in the final picture.
Eventually I became more confident and was able to improve the original scene environment: I imported a new skybox, included it in the ambient lighting, baked shadows and changed the fog. While I am pleased with the results, I know there is still a lot to learn before HDRP becomes second nature.
Kyung-Min – 3D Generalist
Sorry, but I’m on holiday now and so won’t be submitting diaries for a couple of weeks, but I’ll have good stuff to say when I get back!
Jordi Caballol – Programmer
Jordi has been really busy, but forgot to send in his diary before going on holiday.