Our technology is applicable to any autonomous system that needs to interact naturally with humans or replicate a human activity, from self-driving cars to collaborative robots.
Autonomous Vehicle Development
The biggest challenge for autonomous vehicles (AVs) is interacting safely with other road users. Consumers, regulators and car makers need to know a car is safer than a human, but doing billions of miles of on-road testing to validate this isn’t feasible or safe. Instead, as in the aviation industry, simulated virtual reality environments can provide a scalable, safe solution.
We provide realistic human behaviour models for these virtual reality environments, enabling accurate simulation of AV-human interactions. This gives our customers a safe and scalable way to test their AV's decision making around human road users.
Today, a car’s safety is tested through physical crash tests and the driver is tested separately through a set of on-road challenges. This won’t work when the car is doing the driving: a new regulatory approach is needed.
We are working with leading stakeholders to establish a ground-breaking certification framework for autonomous cars, through two Innovate UK grant-funded projects. These projects will deliver a comprehensive simulation platform and certification framework to test an autonomous car's behaviour and build public trust in the safety of these vehicles on public roads. Working with stakeholders from across the automotive industry, insurance, and government will ensure that this framework meets the complex needs of all end users.
Our cutting-edge Computer Vision algorithms automatically detect and profile road users, providing advanced analytics on traffic flows and traffic behaviours with no manual labelling. Other applications for automated video analytics include evaluating sports player performance, monitoring industrial processes, profiling wildlife behaviour, or understanding crowd dynamics.
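To illustrate the kind of pipeline involved, here is a minimal sketch of motion-based detection via frame differencing. This is a deliberately simplified classical approach, not our production algorithm: frames are tiny grayscale grids, and the "detections" are connected blobs of changed pixels. A real system would run learned detectors on full video, but the principle of flagging and grouping candidate road users without manual labelling is the same.

```python
def diff_mask(prev, curr, threshold=30):
    """Binary mask of pixels whose intensity changed by more than threshold."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

def count_blobs(mask):
    """Count 4-connected components of changed pixels (candidate detections)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                blobs += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return blobs

# Two synthetic 5x8 frames: uniform background, two "objects" appear in frame2.
frame1 = [[10] * 8 for _ in range(5)]
frame2 = [row[:] for row in frame1]
frame2[1][1] = frame2[1][2] = 200  # object A (two adjacent pixels)
frame2[3][6] = 200                 # object B

mask = diff_mask(frame1, frame2)
print(count_blobs(mask))  # → 2
```

Counting blobs per frame already yields basic traffic-flow analytics; profiling behaviour would additionally track blobs over time.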
On the Imitation Learning side, our methods can be applied to learning behaviours in any environment with readily available demonstration data, video or otherwise.
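As a toy illustration of learning from demonstrations, the sketch below imitates demonstrated behaviour by returning the action of the nearest demonstrated state. The states, actions, and numbers are hypothetical; real imitation-learning systems fit a parametric policy to large demonstration datasets (video or otherwise), but the core idea is the same: learn a state-to-action mapping from examples rather than hand-coded rules.

```python
def nearest_action(demos, state):
    """Behaviour cloning via 1-nearest-neighbour lookup.

    demos: list of (state, action) pairs, where states are numeric tuples.
    Returns the action whose demonstrated state is closest to `state`.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(demos, key=lambda d: sq_dist(d[0], state))[1]

# Hypothetical demonstrations: (gap_to_pedestrian_m, speed_mps) -> action
demos = [
    ((2.0, 5.0), "brake"),
    ((30.0, 5.0), "continue"),
    ((10.0, 12.0), "slow"),
]

print(nearest_action(demos, (3.0, 6.0)))   # → brake
print(nearest_action(demos, (28.0, 4.0)))  # → continue
```

Swapping the lookup for a trained model turns this into standard behaviour cloning, the simplest form of imitation learning.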
We started by building software for telepresence robots in care homes, helping them learn socially responsive behaviour so they could fit more naturally into social environments, bringing patients closer to their loved ones or care-givers even when physically separated.
Telepresence robots are just one kind of “collaborative” robot. They all share a need to learn how humans behave, either to replicate that behaviour or to interact with humans effectively and safely. Our technology enables this learning.