Augmented reality (AR) has been steadily growing in popularity since Pokemon GO launched the technology into mainstream consciousness. However, AR has more practical uses as well, from allowing consumers to see what a piece of furniture would look like in their home to streamlining manufacturing processes to establishing virtual dressing rooms. The two most popular platforms for developing AR apps are Apple's ARKit and Google's ARCore. But which AR platform is better? When it comes to comparing Apple ARKit vs. Google ARCore, the two have similar functionalities, but they're ultimately designed for different purposes.

Apple ARKit vs. Google ARCore

How are ARKit and ARCore Similar?

Both AR software development kits (SDKs) rely on three main tenets to ensure their success:

  • Environmental Understanding: This means recognizing the surfaces and geometry of the real world. ARKit and ARCore can both distinguish between horizontal and vertical planes, depending on the camera's perspective, when adding digital content to real images.
  • Motion Tracking: With motion tracking, the SDKs keep virtual content anchored in place as a user moves their camera. Both platforms use visual inertial odometry (VIO), which combines a device's motion sensors and camera to accurately measure movement across six degrees of freedom.
  • Light Estimation: Both kits consider the lighting and shading of a room in order to obtain an accurate idea of what an object would really look like in a particular environment. The SDKs examine the light sensed by a smart device's camera and use that estimate to render virtual objects realistically within the environment.
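To make the light-estimation idea concrete, here is a toy sketch in Python. It is not either SDK's API: both ARKit and ARCore expose their own light-estimate objects, but the underlying idea of deriving an ambient intensity from the brightness of the camera frame can be illustrated with a simple luminance average. The function name and the 0-2000 output scale are assumptions chosen for illustration.

```python
import numpy as np

def estimate_ambient_intensity(frame: np.ndarray) -> float:
    """Toy light estimate: mean luminance of an RGB frame, mapped onto
    a 0-2000 scale (hypothetical; real SDKs expose richer estimates)."""
    # Rec. 601 luma weights convert RGB to perceived brightness
    luma = frame[..., 0] * 0.299 + frame[..., 1] * 0.587 + frame[..., 2] * 0.114
    # Map the mean luma (0-255) onto the 0-2000 output range
    return float(luma.mean() / 255.0 * 2000.0)

# A mid-gray frame lands near the middle of the range
gray = np.full((4, 4, 3), 128, dtype=np.uint8)
print(round(estimate_ambient_intensity(gray)))  # 1004
```

A real renderer would feed an estimate like this into its shading model so that a virtual lamp looks dim in a dark room and bright in daylight.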

Because of the complexity associated with both SDKs, it may be necessary to develop a native app for one operating system at a time. Each operating system and device has its own dimensions and limits, so your developer will have to build successfully for one before you can move on to another.

How are ARKit and ARCore Different?

One of the main differences between ARCore and ARKit is their purpose. ARKit seeks to create an AR ecosystem that is prevalent across industries in order to provide companies with technological tools to bolster their bottom lines. Essentially, Apple wants to create a series of interconnected devices that operate on AR for consumer and professional purposes. In comparison, ARCore seeks to generalize AR across all platforms, making the technology available to as many people as possible. With wider availability, Google is seeking to bring AR solutions to more businesses in the hopes of expanding their brands.

There are a few other key differences to consider as well. For example, ARCore is stronger when it comes to its mapping capabilities, as it uses larger maps. On the other hand, ARKit only stores the most recent location data and deletes old data. Because of this, ARKit can't map as much of the world as ARCore can, limiting the stability of the image after the user has moved the camera away from the scene.
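The contrast the article draws can be sketched as two toy data structures in Python: a bounded buffer that evicts old feature points (the behavior ascribed to ARKit here) versus a map that accumulates everything (the behavior ascribed to ARCore). The class names are hypothetical; neither SDK exposes its map this way.

```python
from collections import deque

class SlidingWindowMap:
    """Toy model of an SDK that keeps only recent feature points."""
    def __init__(self, capacity: int):
        self.points = deque(maxlen=capacity)  # oldest points are evicted

    def add(self, point):
        self.points.append(point)

class GrowingMap:
    """Toy model of an SDK that accumulates the entire map."""
    def __init__(self):
        self.points = []

    def add(self, point):
        self.points.append(point)

small = SlidingWindowMap(capacity=3)
big = GrowingMap()
for p in range(5):          # track five feature points over time
    small.add(p)
    big.add(p)

print(list(small.points))   # [2, 3, 4] -- the earliest points were dropped
print(big.points)           # [0, 1, 2, 3, 4] -- nothing is forgotten
```

The trade-off is memory versus stability: a larger retained map lets virtual content snap back into place when the camera returns to a previously seen area.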

Another difference is that Google’s product uses its tracking capabilities to examine more tracking points than Apple’s. Essentially, this means the mapped area expands faster in ARCore apps. However, ARKit is more accurate when creating a distinction between horizontal and vertical surfaces.

Finally, the Google product's API documentation is relatively limited. You get a fairly generic guide that breaks down every class and method, as well as a "quickstart" guide for developing with ARCore. Apple has the edge here; ARKit's support files break down every segment of developing an AR application more comprehensively.

Nevertheless, these are only small issues for the two SDKs, as both are successful at overlaying virtual objects onto the real world. ARCore has more reach, but ARKit is a bit more precise with its dimensions and lighting, so choosing one ultimately depends on what you're trying to achieve with your app.

Apple ARKit vs. Google ARCore: What’s Next?

There's a lot of buzz surrounding the two SDKs and where they may go next. Google recently announced the addition of Lens search for all Pixel and Pixel 2 phones. Lens search lets users point their camera at an object and garner information about it in real time. This capability is particularly useful for learning about cultural landmarks or obtaining more information about an address in a mapping app.

Google is also investing in Measure, an app that allows you to measure the length, width and area of objects in the real world with special dragging tools. Users can save these images for future reference, which is useful for construction or manufacturing apps looking for a way to outline the user's surroundings and their dimensions.
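Under the hood, measuring apps like this resolve the user's two taps into 3D world coordinates and compute the straight-line distance between them. A minimal sketch of that last step, assuming two hypothetical anchor positions in meters:

```python
import math

def distance(a, b):
    """Straight-line (Euclidean) distance between two 3D points, in meters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Two hypothetical anchors dropped on a table edge, 1.5 m apart on the x-axis
table_left = (0.0, 0.0, -0.5)
table_right = (1.5, 0.0, -0.5)
print(distance(table_left, table_right))  # 1.5
```

The hard part, which the SDKs handle, is turning a 2D screen tap into a reliable 3D point via hit-testing against detected planes; once the points exist, the measurement itself is just geometry.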

Apple has an even stronger product in the pipeline with ARKit 2.0, which allows users to get accurate measurements of their surroundings with improved face tracking features. The update will also have better 3D object detection, more realistic rendering and sharing features via messaging apps and social media. In addition to its utility in the professional world, ARKit 2.0 will work well for playing multiplayer games, as you’ll be able to play a digital brick-building game with friends or have a Pokemon battle from a distance.

Apple also announced the addition of eye-tracking technology in AR headsets, which would mount an infrared emitter and camera on the headset. The technology uses a dielectric "hot mirror" that reflects infrared light while allowing visible light to pass through: it bounces infrared light into the user's eye, which is then reflected back to the camera. This design keeps the tracking components from obscuring the display panel, preserving the user's view of the real world.

While Google is focusing on expanding the reach of ARCore, the company is not quite keeping up with the technological advancements of Apple's ARKit. After all, Apple has advanced mapping and infrared technology, which yields higher accuracy when measuring your environment, stronger motion tracking and light estimation capabilities, and more detailed digital content. As such, ARKit seems to be leading the industry right now. Still, both AR platforms are strong contenders, and it will be exciting to see how these SDKs continue to evolve.

Regardless of whether you favor ARKit or ARCore, you will need a skilled developer if you’re hoping to create an augmented reality app. The team at SevenTablets specializes in mobile AR app development, with a reputation for helping businesses increase their position in their industry. We’re also well-versed in other emerging technologies, including virtual reality, artificial intelligence, blockchain and natural language processing.

Although SevenTablets is headquartered in Dallas, we also work with clients in Austin, Houston, and beyond. If you’re ready to discuss your project, we invite you to contact us today.


Reach out to our team today!

Shane Long

President at SevenTablets
As President of SevenTablets, Shane Long brings experience in mobility that pre-dates the term “smartphone” and the release of the first iPhone. His work has helped revolutionize the growth of mobility by bringing to market one of the first graphics processors used in mobile phones, technology that after being acquired by Qualcomm lived well into the 4th generation of smartphones, as well as helped pioneer the first GPS implementations in the segment. With a strong engineering and business background, Shane understands how the rise of mobility and Predictive Analytics is crucial to greater business strategies geared toward attaining competitive advantage, accelerating revenue, and realizing new efficiencies. As the leader of a B2B mobility solutions provider, he partners with business leaders including marketers and product developers to leverage enterprise mobile applications, big data and analytics, and mobile strategy.

Shane earned a B.S. at Texas A&M (whoop!) and studied mathematics as a graduate student at Southern Methodist University.