ARCore Android Studio Tutorial

Develop your HelloAR app in Android Studio using ARCore and Sceneform

What is Augmented Reality

Augmented Reality is “a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view”. Essentially, AR is a technology that lets us render computer-generated 3D models in the real world and have them interact with their surroundings as if they were physically present at that location.

What is ARCore

ARCore is a platform for building Android AR experiences. It enables your phone to sense its environment, understand the world and interact with the information.

ARCore works on 3 principles:

  • Motion Tracking: It allows the phone to understand its current position relative to the real world.
  • Understanding the Environment: It allows the phone to detect the size and location of all types of surfaces: vertical, horizontal, and angled.
  • Light Estimation: It allows the phone to sense the environment’s lighting condition.

Sceneform

ARCore by itself isn’t an SDK; rather, it is an engine that helps SDKs render objects. To make this functionality easier to use, Google released the Sceneform SDK, which enables developers to build Android AR apps without having to learn OpenGL.

Sceneform comes with many nifty features such as:

  • An automatic compatibility check for ARCore enabled phones.
  • Checking for camera permissions.
  • A scene graph API to abstract all the complexities.
  • A plugin for manipulating 3D assets.

Getting Started

To get started, you first need to enable ARCore in your project. This is simple, as we will be using Android Studio and the Sceneform SDK. There are two major operations Sceneform performs automatically:

  • Checking for availability of ARCore
  • Asking for camera permission

Create a new Android Studio project and select an empty activity.

Adding Dependencies

Add the following dependency to your project level build.gradle file:
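
The dependency in question is the Sceneform Gradle plugin. A minimal sketch of the project-level build.gradle, assuming Sceneform 1.15.0 (substitute the latest release), looks roughly like this:

```gradle
// Project-level build.gradle (sketch)
buildscript {
    repositories {
        google()   // Sceneform and ARCore artifacts are hosted on Google's Maven repository
        jcenter()
    }
    dependencies {
        // Sceneform Gradle plugin, used later to import and convert 3D assets
        classpath 'com.google.ar.sceneform:plugin:1.15.0'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
    }
}
```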

Add the latest ARCore library as a dependency in your app’s build.gradle file:
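
A sketch of the app-level dependencies block, again assuming version 1.15.0 (use the latest version available):

```gradle
// App-level build.gradle (sketch)
dependencies {
    // ARCore
    implementation 'com.google.ar:core:1.15.0'
    // Sceneform UX, which provides ArFragment and the transformation controls
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}
```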

Sceneform SDK requires minSdkVersion greater than or equal to 24. So make sure that you set minSdkVersion >= 24. Also, make sure that you have included Google’s Maven repository in your project-level build.gradle.

Updating Manifest

Add the following lines in your AndroidManifest.xml file:
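
These lines typically look like the following sketch, placed alongside the other manifest entries:

```xml
<!-- AndroidManifest.xml (sketch) -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Limits Play Store visibility to ARCore-capable devices -->
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />
```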

Using ARCore requires camera permission and a camera-enabled phone. Also, add a meta-data entry to your application tag:
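
A sketch of that meta-data entry, which goes inside the <application> tag:

```xml
<!-- Declares that this app requires ARCore ("optional" is the alternative value) -->
<meta-data android:name="com.google.ar.core" android:value="required" />
```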

If your app strictly requires an ARCore-enabled device, set required = true. If AR is not a primary feature, or you have handled compatibility for non-compatible devices, you can set required = false.

Then sync your project with the Gradle files and wait for the build to finish. This installs the Sceneform SDK in the project and the Sceneform plugin in Android Studio. The plugin lets you view .sfb files, which are the 3D models rendered in your camera view, and it also helps you import, view, and build 3D assets.

Now that the Android Studio sync is complete and the Sceneform SDK is installed, we can start creating our very first ARCore app.

First, we need to add the Sceneform fragment to our layout file. This will be the scene where we place all our 3D models. It takes care of camera initialization and permission handling.

Head over to your main layout file. In my case it is activity_main.xml. Add the Sceneform fragment:
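
A sketch of the layout, assuming the fragment is given the id ux_fragment (this id is referenced again later from the Java code):

```xml
<!-- activity_main.xml (sketch): the ArFragment hosts the AR scene -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
```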

This is all that you need to do in the layout file.

Checking compatibility at runtime

We will check if the device:

  1. Is running Android API version >= 24.
  2. Supports OpenGL ES version 3.0 or higher.

The above conditions are mandatory for a device to support AR applications using ARCore and Sceneform SDK.

We intend to finish the activity if these conditions aren’t satisfied; however, you can still continue to support other features in that case. Add the method below in your class to perform the compatibility check:
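
A sketch of such a check, adapted from the standard Sceneform sample (it needs imports for android.app.Activity, android.app.ActivityManager, android.content.Context, android.os.Build, and android.widget.Toast):

```java
public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
    // Sceneform needs Android N (API 24) or later
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    // Sceneform needs OpenGL ES 3.0 or later
    String openGlVersionString =
            ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                    .getDeviceConfigurationInfo()
                    .getGlEsVersion();
    if (Double.parseDouble(openGlVersionString) < 3.0) {
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    return true;
}
```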


Adding 3D models to our application

It is now time to download and import the 3D models to be rendered into our application. In our case, we will be rendering a 3D cube in a corner of our room and moving it around.

You can download 3D models from anywhere, but Google provides an excellent repository, Poly, for downloading 3D models for your application. You can download the models in .fbx, .obj, or .gltf format. We will be downloading the .fbx file.

Open the Project view in your Android Studio project and expand the app folder. You will notice a folder named “sampledata”. If it’s not there, go ahead and create one.

After your model finishes downloading, you will need to extract the downloaded zip file into this sample data folder.

You will find a .fbx file and a PNG image of the model. We’ll import the .fbx file into our application using the Sceneform plugin.

Importing the model using Sceneform plugin

You need to apply the Sceneform plugin in your app’s build.gradle file as well.
Add the following below the dependencies block:
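
In practice this means applying the Sceneform plugin, for example:

```gradle
// App-level build.gradle (sketch)
apply plugin: 'com.google.ar.sceneform.plugin'
```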

Add the following lines at the end of the app’s build.gradle:
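
These are the asset-conversion instructions for the imported model. A sketch, assuming the source file is sampledata/model/model.fbx (adjust the paths to your own asset):

```gradle
// Converts the .fbx source into .sfa/.sfb files at build time
sceneform.asset('sampledata/model/model.fbx', // source asset path
        'default',                            // material path
        'sampledata/model/model.sfa',         // .sfa output path
        'src/main/assets/model')              // .sfb output path (packaged into the APK)
```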

The app’s build.gradle finally looks like this:
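
A rough sketch of the complete file, with assumed version numbers, application id, and asset paths:

```gradle
apply plugin: 'com.android.application'
apply plugin: 'com.google.ar.sceneform.plugin'

android {
    compileSdkVersion 28
    defaultConfig {
        applicationId "com.example.helloar"   // assumed package name
        minSdkVersion 24                      // Sceneform requires 24 or higher
        targetSdkVersion 28
        versionCode 1
        versionName "1.0"
    }
    // Sceneform requires Java 8 language features
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}

// Generated when the .fbx asset is imported with the Sceneform plugin
sceneform.asset('sampledata/model/model.fbx',
        'default',
        'sampledata/model/model.sfa',
        'src/main/assets/model')
```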

Note that Sceneform requires Java 8 or higher.

The .sfa and .sfb files are the Sceneform Asset Definition and Sceneform Binary files. The .sfb file is what you see in the 3D viewer and is shipped with the APK, while the .sfa file is used to set properties for the .sfb file.

Integrating the Model

Add the following code to your Java file:
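
A sketch of that code: the fields plus onCreate. The fragment id ux_fragment and the asset name model.sfb are assumptions carried over from the earlier steps (imports: android.net.Uri, android.widget.Toast, com.google.ar.sceneform.rendering.ModelRenderable, com.google.ar.sceneform.ux.ArFragment):

```java
private ArFragment arFragment;
private ModelRenderable modelRenderable;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    if (!checkIsSupportedDeviceOrFinish(this)) {
        return;
    }
    setContentView(R.layout.activity_main);

    // The fragment declared in activity_main.xml
    arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

    // Build the renderable from the .sfb file generated by the Sceneform plugin
    ModelRenderable.builder()
            .setSource(this, Uri.parse("model.sfb"))
            .build()
            .thenAccept(renderable -> modelRenderable = renderable)
            .exceptionally(throwable -> {
                Toast.makeText(this, "Unable to load model", Toast.LENGTH_LONG).show();
                return null;
            });
}
```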

Adding the Model to the AR Scene

Our AR fragment is the container of the scene, so we need to add a model to it whenever it is tapped. Hence, we’ll add an onTap listener to our fragment.

Using the hitResult, we can get the tapped location and create an anchor node, which is the root node of our scene (imagine an augmented reality scene as an inverted tree).

Next, we create a TransformableNode which will be our model and attach it to the anchor node. A transformable node can react to location and size changes when the user drags the object or uses pinch to zoom.
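
A sketch of that tap handling, using the names from the snippet above (it would go at the end of onCreate; Anchor, AnchorNode, and TransformableNode come from the com.google.ar.core and com.google.ar.sceneform packages):

```java
arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
    if (modelRenderable == null) {
        return;   // model not loaded yet
    }
    // Anchor the scene at the tapped point on the detected plane
    Anchor anchor = hitResult.createAnchor();
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // Attach the model as a transformable node so it can be dragged, rotated, and scaled
    TransformableNode model = new TransformableNode(arFragment.getTransformationSystem());
    model.setParent(anchorNode);
    model.setRenderable(modelRenderable);
    model.select();
});
```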

Let’s have a look at some terminologies here:

  • Scene: It’s the place where our 3D world will be rendered.
  • HitResult: It is an imaginary ray of light coming from infinity, and its first point of intersection with the real world is the tapped point.
  • Anchor: A fixed location in the real world. Used to transform local coordinates (according to user’s display) to the real-world coordinates.
  • TransformableNode: A node that can react to user’s interactions such as rotation, zoom and drag.

Here’s how your final Java file would look:
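
Here is a sketch of such a file, under the same assumptions as above (the package name, fragment id ux_fragment, and model.sfb asset name are placeholders):

```java
package com.example.helloar;

import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.widget.Toast;

import androidx.appcompat.app.AppCompatActivity;

import com.google.ar.core.Anchor;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class MainActivity extends AppCompatActivity {

    private ArFragment arFragment;
    private ModelRenderable modelRenderable;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (!checkIsSupportedDeviceOrFinish(this)) {
            return;
        }
        setContentView(R.layout.activity_main);

        arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

        // Load the .sfb model generated by the Sceneform plugin (runs asynchronously)
        ModelRenderable.builder()
                .setSource(this, Uri.parse("model.sfb"))
                .build()
                .thenAccept(renderable -> modelRenderable = renderable)
                .exceptionally(throwable -> {
                    Toast.makeText(this, "Unable to load model", Toast.LENGTH_LONG).show();
                    return null;
                });

        // Place the model wherever the user taps a detected plane
        arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
            if (modelRenderable == null) {
                return;
            }
            Anchor anchor = hitResult.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            TransformableNode model = new TransformableNode(arFragment.getTransformationSystem());
            model.setParent(anchorNode);
            model.setRenderable(modelRenderable);
            model.select();
        });
    }

    // Verifies Android N (API 24) and OpenGL ES 3.0 support; finishes the activity otherwise
    public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        String openGlVersion =
                ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                        .getDeviceConfigurationInfo()
                        .getGlEsVersion();
        if (Double.parseDouble(openGlVersion) < 3.0) {
            Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        return true;
    }
}
```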

Let’s see what’s happening here.

  1. First, we need to get the fragment from our layout file with the help of supportFragmentManager and the fragment id.
  2. Then we need to load the model into the scene. For this, we use the ModelRenderable class provided by the Sceneform SDK. With the help of ModelRenderable’s setSource() method, we can load our model by passing the name of the generated .sfb file.
  3. The model is built on a background thread; once it is loaded, it is delivered back to the main thread, which then renders it in the scene.
  4. We receive the model inside the thenAccept method. If there’s any error in building the model, an exception is thrown.

There you have it! Your own AR app in Android Studio 🙂

I hope you had fun reading and/or following along. In the next story, we will look into how to build more features and interactions into this App. Stay tuned!

If you are interested in further exploring, here are some resources I found helpful along the way:


How to build an Augmented Reality Android App with ARCore and Android Studio

This article was originally posted here

In the previous post, I explained what ARCore is and how it helps developers build awesome augmented reality apps without the need to understand OpenGL or Matrix maths.

If you haven’t checked it out yet, I highly recommend doing so before moving ahead with this article and diving into ARCore app development.

Overview

According to Wikipedia, ARCore is a software development kit developed by Google that allows for augmented reality applications to be built.


ARCore uses three key technologies to integrate virtual content with the real environment:

  1. Motion Tracking: It allows the phone to understand its position relative to the world.
  2. Environmental Understanding: It allows the phone to detect the size and location of all types of surfaces: vertical, horizontal, and angled.
  3. Light Estimation: It allows the phone to estimate the environment’s current lighting conditions.

Getting Started

To get started with ARCore app development, you first need to enable ARCore in your project. This is simple, as we will be using Android Studio and the Sceneform SDK. There are two major operations Sceneform performs automatically:

  1. Checking for availability of ARCore
  2. Asking for camera permission

You don’t need to bother with these two steps when creating an ARCore app using Sceneform SDK. But you do need to include Sceneform SDK in your project.

Create a new Android Studio project and select an empty activity.

Add the following dependency to your project level build.gradle file:
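
A sketch of the project-level file, assuming Sceneform 1.15.0 (use the latest release available):

```gradle
// Project-level build.gradle (sketch)
buildscript {
    repositories {
        google()
        jcenter()
    }
    dependencies {
        // Sceneform Gradle plugin for importing 3D assets
        classpath 'com.google.ar.sceneform:plugin:1.15.0'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
    }
}
```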

Add the following to your app level build.gradle file:
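
A sketch of the app-level additions: the Sceneform plugin, Java 8 compile options, and the Sceneform UX dependency (the version is an assumption):

```gradle
apply plugin: 'com.google.ar.sceneform.plugin'

android {
    // Sceneform needs Java 8 language features
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}
```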

Now sync the project with the Gradle files and wait for the build to finish. This installs the Sceneform SDK in the project and the Sceneform plugin in Android Studio. The plugin lets you view .sfb files, which are the 3D models rendered in your camera view, and it also helps you import, view, and build 3D assets.

Building your first ARCore app

Now with our Android Studio setup complete and Sceneform SDK installed, we can get started with writing our very first ARCore app.

First, we need to add the Sceneform fragment to our layout file. This will be the Scene where we place all our 3D models. It takes care of the camera initialization and permission handling.

Head over to your main layout file. In my case it is activity_main.xml and add the Sceneform fragment:
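
A sketch of the fragment declaration, assuming the id ux_fragment (this id is looked up again from the Java code):

```xml
<!-- activity_main.xml (sketch): the ArFragment hosts the AR scene -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
```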

I’ve set the width and height to match_parent as this will cover my entire activity. You can choose the dimensions according to your requirements.

Compatibility Check

This is all that you need to do in the layout file. Now head over to the Java file, which in my case is MainActivity.java, and add the method below in your class:
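
A sketch of the method, adapted from the standard Sceneform sample (imports: android.app.Activity, android.app.ActivityManager, android.content.Context, android.os.Build, android.widget.Toast):

```java
public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
    // Sceneform needs Android N (API 24) or later
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    // Sceneform needs OpenGL ES 3.0 or later
    String openGlVersionString =
            ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                    .getDeviceConfigurationInfo()
                    .getGlEsVersion();
    if (Double.parseDouble(openGlVersionString) < 3.0) {
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    return true;
}
```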

This method checks whether your device can support the Sceneform SDK. The SDK requires Android API level 24 or newer and OpenGL ES version 3.0 or newer. If a device does not support these, the scene will not be rendered and your application will show a blank screen.

However, you can still continue to deliver all the other features of your app that don’t require the Sceneform SDK.

Now with the device compatibility check complete, we shall build our 3D model and attach it to the scene.

Adding the assets

You will need to add the 3D models which will be rendered on your screen. Now you can build these models yourself if you are familiar with 3D model creation. Or, you can visit Poly.

There you’ll find a huge repository of 3D assets to choose from. They are free to download. Just credit the creator and you are good to go.

In Android Studio, expand the app folder in the Project pane on the left-hand side. You’ll notice a “sampledata” folder. This folder will hold all of your 3D model assets. Create a folder for your model inside the sampledata folder.

When you download the zip file from Poly, you will most probably find three files.

The most important of these is the .obj file; it is your actual model. Place all three files inside sampledata -> “your model’s folder”.

Now right-click on the .obj file. The first option should be Import Sceneform Asset. Click on it, leave the default settings unchanged, and just click Finish in the next window. Gradle will then sync to include the asset in the assets folder. Once the Gradle build finishes, you are good to go.

You’ve now imported a 3D asset for Sceneform to use in your project. Next, let’s build the asset from our code and include it in the scene.

Building the Model

Add the following code to your MainActivity.java file (or whatever it is in your case). Don’t worry, I’ll explain all the code line by line:
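
A sketch of that code: the fields plus onCreate. The asset name LampPost.sfb and the fragment id ux_fragment are assumptions (imports: android.net.Uri, android.widget.Toast, com.google.ar.sceneform.rendering.ModelRenderable, com.google.ar.sceneform.ux.ArFragment):

```java
private ArFragment arFragment;
private ModelRenderable lampPostRenderable;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    // The fragment declared in activity_main.xml
    arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

    // Build the renderable from the .sfb file generated during asset import
    ModelRenderable.builder()
            .setSource(this, Uri.parse("LampPost.sfb"))
            .build()
            .thenAccept(renderable -> lampPostRenderable = renderable)
            .exceptionally(throwable -> {
                Toast.makeText(this, "Unable to load lamp post model", Toast.LENGTH_LONG).show();
                return null;
            });
}
```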


First, we find the arFragment that we included in the layout file. This fragment is responsible for hosting the scene. You can think of it as the container of our scene.

Next, we use the ModelRenderable class to build our model. With the help of the setSource method, we load our model from the .sfb file, which was generated when we imported the asset. The thenAccept method receives the model once it is built. We set the loaded model to our lampPostRenderable.

For error handling, we have the exceptionally method. It is called in case an exception is thrown.

All this happens asynchronously, hence you don’t need to worry about multi-threading or deal with handlers XD

With the model loaded and stored in the lampPostRenderable variable, we’ll now add it to our scene.

Adding the Model to Scene

The arFragment hosts our scene and will receive the tap events. So we need to set the onTap listener to our fragment to register the tap and place an object accordingly. Add the following code to onCreate method:
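
A sketch of that listener, using the names from the snippet above (Anchor, AnchorNode, and TransformableNode come from the com.google.ar.core and com.google.ar.sceneform packages):

```java
arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
    if (lampPostRenderable == null) {
        return;   // model not loaded yet
    }
    // Create the anchor at the tapped point and attach it to the scene
    Anchor anchor = hitResult.createAnchor();
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // Attach the lamp post as a transformable (draggable, scalable, rotatable) node
    TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
    lamp.setParent(anchorNode);
    lamp.setRenderable(lampPostRenderable);
    lamp.select();
});
```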

We set the onTapArPlaneListener on our AR fragment. What follows is Java 8 lambda syntax; in case you are not familiar with it, I would recommend checking out this guide.

First, we create our anchor from the HitResult using hitResult.createAnchor() and store it in an Anchor object.

Next, we create a node out of this anchor, an AnchorNode. It is attached to the scene by calling the setParent method on it and passing in the scene from the fragment.

Now we create a TransformableNode which will be our lamppost and attach it to our anchor node. The node still doesn’t have any information about the object it has to render. We pass that object using the lamp.setRenderable method, which takes a renderable as its parameter. Finally, we call lamp.select();

Phew!! Too much terminology there, but don’t worry, I’ll explain it all.

  1. Scene: This is the place where all your 3D objects will be rendered. This scene is hosted by the AR fragment which we included in the layout. An anchor node is attached to this scene, which acts as the root of the tree, and all the other objects are rendered as its children.
  2. HitResult: This is an imaginary line (or a ray) coming from infinity which gives the point of intersection of itself with a real-world object.
  3. Anchor: An anchor is a fixed location and orientation in the real world. It can be understood as the x, y, z coordinates in 3D space. You can get an anchor’s pose information from it. Pose is the position and orientation of the object in the scene. This is used to transform the object’s local coordinate space into real-world coordinate space.
  4. AnchorNode: This is the node that automatically positions itself in the world. This is the first node that gets set when the plane is detected.
  5. TransformableNode: It is a node that can be interacted with. It can be moved around, scaled, rotated, and much more. In this example, we can scale the lamp and rotate it. Hence the name Transformable.

There is no rocket science here. It’s really simple. The entire scene can be viewed as a graph with Scene as the parent, AnchorNode as its child and then branching out different nodes/objects to be rendered on the screen.

Your final MainActivity.java must look something like this:
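
Here is a sketch of what such a file might look like, under the assumptions used above (package name, fragment id ux_fragment, and the LampPost.sfb asset name are placeholders):

```java
package com.example.arcoreapp;

import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.widget.Toast;

import androidx.appcompat.app.AppCompatActivity;

import com.google.ar.core.Anchor;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class MainActivity extends AppCompatActivity {

    private ArFragment arFragment;
    private ModelRenderable lampPostRenderable;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (!checkIsSupportedDeviceOrFinish(this)) {
            return;
        }
        setContentView(R.layout.activity_main);

        arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

        // Load the lamp post model asynchronously from the generated .sfb file
        ModelRenderable.builder()
                .setSource(this, Uri.parse("LampPost.sfb"))
                .build()
                .thenAccept(renderable -> lampPostRenderable = renderable)
                .exceptionally(throwable -> {
                    Toast.makeText(this, "Unable to load lamp post model", Toast.LENGTH_LONG).show();
                    return null;
                });

        // Place the lamp post wherever the user taps a detected plane
        arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
            if (lampPostRenderable == null) {
                return;
            }
            Anchor anchor = hitResult.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
            lamp.setParent(anchorNode);
            lamp.setRenderable(lampPostRenderable);
            lamp.select();
        });
    }

    // Verifies Android N (API 24) and OpenGL ES 3.0 support; finishes the activity otherwise
    public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        String openGlVersion =
                ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                        .getDeviceConfigurationInfo()
                        .getGlEsVersion();
        if (Double.parseDouble(openGlVersion) < 3.0) {
            Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        return true;
    }
}
```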

Congratulations!! You’ve just completed your first ARCore app. Start adding objects and see them come alive in the real world!

This was your first look at how to create a simple ARCore app from scratch with Android Studio. In the next tutorial, I’ll go deeper into ARCore and add more functionality to the app.

If you have any suggestions or any topic you would want a tutorial on, just mention in the comments section and I’ll be happy to oblige.

Like what you read? Don’t forget to share this post on Facebook, WhatsApp, and LinkedIn.

You can follow me on LinkedIn, Quora, Twitter, and Instagram, where I answer questions related to mobile development, especially Android and Flutter.

