Bristol-based embedded technology developer Kudan is expanding the use of its simultaneous localisation and mapping (SLAM) software from mobile phones to drones and cars – helping them to navigate more accurately.

“We can help hardware companies develop their own AR kit… and adjust our technology to their processors and camera”

SLAM came out of the robotics world and has been implemented in high-end mixed-reality headsets such as Microsoft’s HoloLens, but implementations on mobile phones have been more primitive because the hardware imposes tight constraints.

Kudan’s technology simplifies things by running on the central processing unit (CPU) alone, which means it can be used in many new applications.

As Daiu Ko (pictured right), COO of Kudan, tells us: “We have updated our technology on mobile, implementing our SLAM on platforms such as iOS to enhance the augmented reality (AR) core engines provided by Apple and Google.”

Overcoming hurdles

Commenting on current technologies, Ko says: “To implement [SLAM] into mobile phones, Google and Apple use a lot of data from the IMU sensor [which detects motion, orientation and positioning]. While this is good for robust tracking, it’s not as effective for other applications.”
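As a rough illustration of that trade-off, the sketch below (hypothetical Python, not Apple’s, Google’s or Kudan’s code) blends an IMU-predicted pose with a camera-derived one. Leaning on the IMU is robust to fast motion but drifts over time; leaning on the camera avoids drift, which is where a visual-first SLAM aims to sit.

```python
# Hypothetical 1-D complementary filter: blend a pose predicted by
# integrating IMU data with a pose measured by visual tracking.
# Illustrative only; not the actual ARKit/ARCore or Kudan pipeline.

def fuse_pose(imu_predicted: float, visual_measured: float,
              visual_weight: float = 0.3) -> float:
    """Weighted blend of the two estimates.

    A small visual_weight leans on the IMU prediction (robust to motion
    blur, but drifts over time); a large one leans on the camera
    (drift-free against landmarks, but sensitive to fast motion and
    poor lighting).
    """
    return (1.0 - visual_weight) * imu_predicted + visual_weight * visual_measured

# Example: IMU integration says 1.05 m along an axis, the camera says 1.00 m.
print(fuse_pose(1.05, 1.00))  # 1.035 with the default weighting
```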

“We are 20x faster than the most popular open source SLAM”

He adds: “As we are a third party, we can help hardware companies like Huawei, Lenovo or Samsung to develop their own AR kit with their own software and hardware, and adjust our technology to their processors and camera, which Google cannot do. The project time is typically two years, so our technology is targeting the next generation in the next 12 to 18 months.”

Even with just an ordinary processor, the new SLAM technology is significantly faster than other versions. Ko says: “We are 20x faster than the most popular open source SLAM. Running the SLAM on a benchmark dataset on a CPU alone, we can process 20x more frames per second, or we can cut power consumption by a factor of 20 compared with other algorithms at the same frame rate.”
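To unpack what a 20x CPU speed-up can buy, here is a hypothetical back-of-envelope sketch. The baseline frame rate and power figures below are assumed for illustration, not Kudan benchmark numbers; only the 20x ratio comes from the article.

```python
# Hypothetical back-of-envelope: how a 20x CPU speed-up can be spent.
# Baseline figures are illustrative, not Kudan benchmark results.

baseline_fps = 10          # assumed frame rate of a reference open-source SLAM
speedup = 20               # ratio quoted in the article
cpu_power_w = 2.0          # assumed CPU power draw while tracking (watts)

# Option 1: spend the speed-up on throughput at the same power budget.
faster_fps = baseline_fps * speedup                        # 200 frames per second

# Option 2: keep the original frame rate and spend it on energy instead.
# If each frame needs 1/20th of the CPU time, energy per frame drops ~20x.
baseline_energy_per_frame = cpu_power_w / baseline_fps     # joules per frame
efficient_energy_per_frame = baseline_energy_per_frame / speedup

print(f"throughput option: {faster_fps} fps")
print(f"energy option: {efficient_energy_per_frame * 1000:.1f} mJ/frame "
      f"vs {baseline_energy_per_frame * 1000:.1f} mJ/frame baseline")
```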

Expanding use cases

Kudan is developing versions that make use of digital signal processors (DSPs) and graphics processing units (GPUs) to lift performance a further two to four times.

There is also a design that uses mobile phone chips in drones to provide navigation. “The requirement for drones is quite close to mobile, as it is power sensitive and low specification,” he adds. “We already have a customer in the market and it will be 12 to 18 months before the product is released in the consumer market.”

“A major technology supplier in the Toyota group uses our technology for parking assist and to recognise the environment”

“We are also doing lots of work in robotics, in autonomous control for cars and drones. We have a partnership with a major technology supplier in the Toyota group and they use our technology for parking assist and to recognise the environment.”

Kudan is also using SLAM to recognise the position of the car. Ko explains: “The existing technology uses sensors to monitor the angle of the wheels, but errors can build up. With SLAM the car can use the external environment to calibrate its position.”
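To make the drift point concrete, the following is a minimal hypothetical sketch (not Kudan’s algorithm): dead-reckoned wheel odometry accumulates error without bound, while an occasional fix against a recognised external landmark keeps the estimate in check.

```python
import random

# Hypothetical 1-D illustration of why wheel odometry drifts and how an
# external (visual) reference can correct it. Not Kudan's algorithm.

true_pos = 0.0
odom_pos = 0.0          # dead-reckoned position from wheel sensors only
slam_pos = 0.0          # estimate that is occasionally corrected

random.seed(1)
for step in range(1, 101):
    move = 0.5                          # the car advances 0.5 m per step
    noise = random.gauss(0, 0.02)       # small per-step wheel-sensor error

    true_pos += move
    odom_pos += move + noise            # error accumulates without bound
    slam_pos += move + noise

    # Every 10 steps, pretend a camera recognises a mapped landmark and
    # pulls the estimate most of the way back towards the true position.
    if step % 10 == 0:
        slam_pos += 0.8 * (true_pos - slam_pos)

print(f"odometry-only error : {abs(odom_pos - true_pos):.3f} m")
print(f"landmark-corrected  : {abs(slam_pos - true_pos):.3f} m")
```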

The technology can also be used in consumer equipment. Ko says: “10cm is the minimum requirement for parking, but with a good camera specification 1cm is quite feasible.

“Our accuracy is less than 1mm with a good camera; it just depends on what hardware you choose. If you use an intermediate camera it can do AR or other robotics applications with several mm to several cm of accuracy, depending on the environment. For room-scale use with a vacuum robot, several mm is adequate.”

You can contact Kudan, which has its engineering in Bristol and its marketing in Tokyo, at www.kudan.eu