Bristol-based Kudan has developed a way of providing higher positional accuracy on iPhones, while using less power, for the latest augmented reality (AR) applications. The technology can make use of a single camera in a phone, which is a key step forward.
“Our tech is able to enhance existing Apple and Google AR solutions and move the industry forward”
The Simultaneous Localisation and Mapping (SLAM) technology uses the inertial measurement unit (IMU) in an iPhone to provide more accurate positioning than GPS satellite navigation, while consuming less power.
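For context, the inertial data involved is exposed on iOS through Apple's Core Motion framework. The sketch below is a generic illustration of reading those accelerometer and gyroscope signals, not Kudan's own code:

```swift
import CoreMotion

// Generic sketch of reading IMU data on iOS (not Kudan's implementation):
// these are the signals a visual-inertial SLAM system fuses with camera frames.
let motionManager = CMMotionManager()

func startIMUUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0  // 100 Hz, a typical rate

    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        let accel = motion.userAcceleration  // gravity-compensated acceleration, in g
        let gyro = motion.rotationRate       // angular velocity, in rad/s
        print("accel: \(accel.x), \(accel.y), \(accel.z)  gyro: \(gyro.x), \(gyro.y), \(gyro.z)")
    }
}
```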
“The progress we have made is to implement our tech on the current generation of mobile devices, with their limited field of view, rolling shutters and single camera, which is able to enhance existing Apple and Google AR solutions and move the industry forward,” says Daiu Ko, chief operating officer of Kudan, which has offices in Bristol and Tokyo.
KudanSLAM taps into the IMU via Apple’s ARKit, combining the IMU data with image data from the phone’s camera. This is used to build local maps that can be saved and loaded, allowing relocalisation in pre-mapped areas and map sharing. You can see the tech in action in the video below:
Visualisation on OSX: This is only visual tracking, with no IMU support
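Kudan has not published its integration code, but the ARKit hooks it builds on are public: each ARFrame carries a camera image alongside an IMU-fused pose, and ARWorldMap lets an app save and reload maps for relocalisation. The sketch below illustrates those hooks in a generic way, assuming nothing about KudanSLAM's own API:

```swift
import ARKit

// Generic illustration of the ARKit hooks described above (not KudanSLAM's API):
// per-frame image and pose access, plus saving/loading a world map for relocalisation.
final class TrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start(with savedMap: ARWorldMap? = nil) {
        let config = ARWorldTrackingConfiguration()
        config.initialWorldMap = savedMap   // relocalise against a pre-mapped area
        session.delegate = self
        session.run(config)
    }

    // Called for every tracked frame: the camera image and the IMU-fused pose
    // are the inputs a SLAM layer would combine for its own tracking and mapping.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let image: CVPixelBuffer = frame.capturedImage
        let pose: simd_float4x4 = frame.camera.transform
        _ = (image, pose)
    }

    // Persist the current map so it can be shared or reloaded later.
    func saveMap(completion: @escaping (ARWorldMap?) -> Void) {
        session.getCurrentWorldMap { map, _ in completion(map) }
    }
}
```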
The combined data is used to correct the drift that is present in ARKit, providing higher accuracy tracking and mapping, all with lower power consumption than GPS.
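The article does not describe how the correction is applied, but a common pattern in visual-inertial tracking is to gently pull a fast, drifting estimate back towards a drift-free reference, such as a pose recovered by relocalising against a saved map. The snippet below is a simplified, hypothetical illustration of that idea, not Kudan's actual filter:

```swift
import simd

// Simplified, hypothetical illustration of drift correction (not Kudan's algorithm):
// nudge a fast but drifting position estimate towards a drift-free reference.
func correctDrift(driftingPosition: simd_float3,
                  referencePosition: simd_float3,
                  blendFactor: Float = 0.05) -> simd_float3 {
    // A small blendFactor keeps the fast estimate smooth while bounding long-term drift.
    return driftingPosition + (referencePosition - driftingPosition) * blendFactor
}
```

Real systems apply this kind of correction to full six-degree-of-freedom poses and weight it by tracking confidence, but the underlying principle is the same.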
There is more detail on KudanSLAM at www.kudan.eu

Shona Wright