Facebook's big augmented reality announcements at its F8 developer conference in California are the clearest sign yet that the next iPhone will have built-in AR hardware.

It's true, Apple's interest in and affinity for augmented reality is not news. Ask Apple CEO Tim Cook about virtual reality and he will immediately steer the conversation to the promise of AR.

That's led many people to assume that the next iPhone, the 10th-anniversary iPhone, will feature a camera with built-in augmented reality capabilities. This seemed like a reasonable assumption, but, to be honest, I was 50-50 on the possibility. Augmented reality is only marginally less of a curiosity to consumers than virtual reality.

Facebook CEO Mark Zuckerberg's keynote presentation on the major updates coming to Facebook's Camera app changed my mind.

"We're making the camera the first mainstream augmented reality platform," said Zuckerberg, who then proceeded to show off some eye-popping AR integrations.

When blended with AI, Facebook's real-time visual understanding could identify objects, people, and places, and their positional relationships to each other.

First, Zuckerberg demonstrated how the camera could seamlessly integrate 3D text with a coffee-table scene. The 3D text wasn't floating in space. It was perfectly positioned on the table and maintained the correct perspective no matter where you moved the camera. He also added fake steam to a real coffee cup and fake flowers to a real plant that looked as if they grew in place.

Zuckerberg also showed just how grand the augmented reality vision could be, transforming a suburban home into Harry Potter's Hogwarts.

I've seen all kinds of mobile AR, usually activated by special hidden codes on cards, on signage, or on special action figures. It doesn't need special mobile hardware and works well enough. Facebook's brand of mobile AR seemed somehow better, more powerful.

Part of this is surely due to the growing power of Facebook's vision system, which Facebook CTO Mike Schroepfer illustrated in his own keynote.

However, Zuckerberg chose to highlight another key technology underpinning Facebook's camera-based augmented reality: SLAM, or Simultaneous Localization and Mapping.

SLAM is a ranging and mapping technology typically associated with robotics and self-driving cars. It often employs a variety of specialized hardware to read everything in an environment, including streets, other cars, rooms, people, and objects, building a map of that environment while simultaneously tracking the device's own position within it.
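To make that concrete, here's a toy sketch (mine, not Facebook's) of the two coupled steps that give SLAM its name: the device estimates where it is, then uses what it observes to place landmarks on a map. Every function and number here is purely illustrative; real systems also model sensor noise and uncertainty.

```python
# A minimal 2D toy of the SLAM loop, assuming noiseless range/bearing
# observations. Illustration only -- not Facebook's implementation.
import math

def predict(pose, step, turn):
    """Localization: advance the pose estimate using odometry."""
    px, py, heading = pose
    heading += turn
    return (px + step * math.cos(heading),
            py + step * math.sin(heading),
            heading)

def add_to_map(pose, measurement, world_map, landmark_id):
    """Mapping: convert a (range, bearing) sighting into world coordinates."""
    px, py, heading = pose
    rng, bearing = measurement
    world_map[landmark_id] = (px + rng * math.cos(heading + bearing),
                              py + rng * math.sin(heading + bearing))

pose, world_map = (0.0, 0.0, 0.0), {}
pose = predict(pose, step=1.0, turn=0.1)           # where am I?
add_to_map(pose, (2.0, 0.5), world_map, "table")   # what's around me?
print(pose, world_map)
```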

It was surprising and a bit odd that Zuckerberg would credit SLAM with the seamless integration of real and virtual worlds. I kept thinking there was a missing piece here.

Is Facebook implementing all of its SLAM technology as software inside Facebook's camera, or is it relying on an as-yet-unnamed and unreleased piece of mobile hardware?

A key part of SLAM is depth perception. The computer needs to know the distance from itself to other objects, and the distances between objects, to build an accurate 3D picture of the room. There are some very good and well-known technologies, like Microsoft's Kinect and Google's Project Tango, that bathe a space in infrared (IR) light to build a 3D mesh of the environment. With that, the AR engine can ensure that virtual objects properly interact with the real world.
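For a rough sense of what that depth data buys an AR engine, here's a small sketch of "unprojection," using an assumed pinhole-camera model (the fx, fy, cx, cy intrinsics below are made-up values, not anything from Kinect or Tango). One depth reading turns a flat screen pixel into a 3D anchor point where a virtual object can be pinned.

```python
# Sketch: mapping a pixel plus a sensed depth to a 3D camera-space point.
# The intrinsics below are assumed values for illustration.
FX, FY = 600.0, 600.0   # assumed focal lengths, in pixels
CX, CY = 320.0, 240.0   # assumed principal point (image center)

def unproject(u, v, depth_m):
    """Back-project an image pixel with its depth into 3D coordinates."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# If the sensor says the tabletop under pixel (400, 300) is 1.2 m away,
# the virtual steam or flowers get anchored at this 3D point:
print(unproject(400, 300, 1.2))  # (0.16, 0.12, 1.2), in meters
```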

As of now, there aren't a lot of mainstream mobile phones that include range-finding technology. The two leading mobile devices, Apple's iPhone and the Samsung Galaxy S8, do not include it.

Yet.

As I see it, Zuckerberg wouldn't use SLAM to power his brand of AR unless he knew that the masses would have access to the technology. Where better to find it than on the next iPhone?

Apple has several options here. It could integrate range-sensing technology like Intel RealSense (I've seen it primarily on laptops for Windows Hello face recognition and on a handful of smartphones and tablets). That seems an unlikely choice, though, since Apple uses its own custom silicon for its A10 Fusion mobile CPU, and Intel is unlikely to license one technology without the other.

However, Apple also owns PrimeSense, the company behind the original Kinect sensor. Apple bought the company and its 3D imaging technology in 2013 for a reported $345 million. A PrimeSense sensor could easily fit inside an iPhone X Plus.

An iPhone with real range-finding hardware would not only satisfy Tim Cook's AR appetite, it could be just the platform Zuckerberg is thinking about when he promises SLAM-powered AR.

It's not unreasonable to assume Zuckerberg and Facebook know a little bit more about what's coming on the next iPhone than the rest of us do.

On the other hand, I may be extrapolating a bit too much.

I checked in with my friend, iRobot CEO Colin Angle, to see if he had any insight on SLAM and the need for a hardware-based solution. Angle's own robot, the popular Roomba robot vacuum, uses SLAM for positioning.

"SLAM is a computational technique which can integrate multiple sensors of different types into a map. Typically, there are primary sensors like a laser or camera (which Roomba uses) and secondary sensors like ultrasonic, IR, or a downward-pointing optical flow measuring device (which Roomba uses). So, it may be that they are using the camera on the phone or it may be that they are getting something new in the phone," said Angle via email.
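To illustrate what "integrating multiple sensors into a map" can look like, here's a toy example of one common fusion step, inverse-variance weighting, with hypothetical readings. Real SLAM pipelines fold this kind of fusion into a filter or pose-graph optimizer rather than a one-liner.

```python
# Toy sensor fusion: combine position estimates by inverse-variance
# weighting. Sensor values and variances below are hypothetical.
def fuse(estimates):
    """estimates: list of (value, variance) pairs from different sensors."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused value, with reduced variance

# Primary sensor (camera): x = 2.00 m, low noise.
# Secondary sensor (optical flow): x = 2.30 m, noisier.
print(fuse([(2.00, 0.01), (2.30, 0.09)]))  # ~(2.03, 0.009)
```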

However, he noted that there's not enough information here to really know how Facebook is using SLAM.

And, I guess, it's not even clear how the social media giant is defining its brand of SLAM.

Still, I choose to take this as a sign: a fully integrated AR camera and sensor system is coming on the next iPhone. It's as solid an iPhone rumor as any other you've heard in the last six months.