App Development in the AR Era and Its Challenges
Introduction
Is everyone excited about developing AR apps?
AR apps can recognize real objects, make virtual objects appear, display information, and let you manipulate those virtual objects with gestures. In games, you can shoot monsters that fly into the real world, or have your movements recognized and reflected in multiplayer play.
By the way, how do you test and debug AR apps while developing them?
From our hands-on experience, we can tell you it is a daunting task.

The Current State of AR App Development
For example, to reproduce an actual demonstration environment, one team built a full-scale (1/1) model room and tested in it during development. When the lighting or other environmental conditions change, the feature points that can be extracted from the space change as well, and anchor points may no longer be recognized reliably. It is sobering to hear that the app would not have worked at the real exhibition if a model room of the same scale had not been built and tested against.
For apps that use hand gestures and body poses to manipulate virtual objects, you have to test by actually performing those gestures. Repeating the same motions every time you change the code quickly becomes tedious.
In addition, applications that share the same space across multiple devices need tests of whether the same objects are visible from different viewpoints at the same time and whether actions are synchronized. That is why you see developers with multiple iPads strapped to their arms, playing two roles at once.
Because AR/MR apps are spatial, checking whether objects placed in the space look natural (shadows, placement, and so on) cannot be done without a human in the loop.
AR Testing and CI
For conventional applications, it is easy to write test code, push it to GitHub, and have CI run automatically. For AR applications, the correctness of the software logic can still be tested this way, but because real space is involved, problems only show up at runtime: virtual objects do not appear as expected, or their display positions drift, and the experience feels wrong.
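To make that gap concrete, here is a minimal sketch (my own illustration with hypothetical names, not code from any project mentioned in this post) of the kind of unit test conventional CI can already run for an AR app. It exercises pure placement math, which passes or fails deterministically, but it says nothing about how the object will actually appear once live camera tracking is involved.

```swift
import XCTest
import simd

// Hypothetical helper under test: turns a detected plane's transform into a
// position for a virtual object. Pure math like this is easy to cover in CI.
func placementPosition(planeTransform: simd_float4x4, liftY: Float) -> SIMD3<Float> {
    let t = planeTransform.columns.3
    return SIMD3<Float>(t.x, t.y + liftY, t.z)
}

final class PlacementLogicTests: XCTestCase {
    func testObjectIsLiftedAboveThePlane() {
        // A plane at the world origin; the object should sit 5 cm above it.
        let plane = matrix_identity_float4x4
        let position = placementPosition(planeTransform: plane, liftY: 0.05)
        XCTAssertEqual(position.y, 0.05, accuracy: 0.0001)
    }

    // What this cannot cover: whether the rendered object stays anchored to the
    // real surface, drifts, or casts plausible shadows once real sensor data
    // and tracking quality come into play.
}
```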
At present, such tests are done with the developer's own eyes, so CI cannot run them automatically.
For AR to become popular and for killer apps to emerge, we need a virtuous cycle: more developers, more apps, more users, and ever-improving apps.
That means making AR applications easy to develop, lowering development costs, and commoditizing the process.
AR Testing Tools
Let’s take a look at the existing testing tools.
iOS: Reality Composer & Xcode
Reality Composer's recording and playback feature lets you record sensor and camera data on location and play it back when you later run the app from Xcode on your iOS device. You can repeat the AR experience from the recorded data without physically moving the device or returning to the location. However, playback currently works only on a physical device, so a device is still required.
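As a rough sketch of what such a replay exercises (the class and entity names below are my own, not Apple sample code), the app itself needs no replay-specific API: you select the recorded data in Xcode's run scheme and the same ARKit/RealityKit code runs against it, assuming a recent Xcode with the ARKit replay setting.

```swift
import UIKit
import RealityKit
import ARKit

// Minimal RealityKit scene. With a Reality Composer recording selected as the
// session's replay data in Xcode's run scheme (an assumption about the workflow;
// check your Xcode version), this same code runs against the recorded camera
// and sensor stream instead of the live device.
final class ARTestViewController: UIViewController {
    private var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView = ARView(frame: view.bounds)
        view.addSubview(arView)

        // Same world-tracking configuration as when the recording was made.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)

        // A simple box anchored to the first horizontal plane, so playback can
        // be checked for stable anchoring and placement.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
        arView.scene.addAnchor(anchor)
    }
}
```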
Android: Android Emulator & VirtualScene
When using the Android Emulator in Android Studio, you can set the back camera to VirtualScene and use a virtual scene camera to observe how an ARCore application behaves in a VR-like virtual space. There are also preset macros and a mechanism for testing AR interactions by moving around the virtual space.
However, this currently only supports simple tests against the prepared virtual space and preset macros. To test against a real-world environment or a specific scenario, you would need to be able to load 3D models captured from the real world and define custom macros.
Unity: Unity Test Runner & UI Test Automation
You can write unit tests to verify that events fire correctly, and UI-level integration tests to check that AR content is displayed correctly. With the UI Test Automation functionality, you can automatically check the visibility of GameObjects in the scene and inspect their property values.
Still, checking with human eyes is far too laborious. Ideally, you would push code to GitHub and have AR tests run automatically, but no such framework or tool seems to exist at this point.
If you know of other good approaches, please let me know.
Programming Paradigms in the XR Era
With the advent of XR apps, the programming paradigm may change.
Unity has released a toolset for XR called Mixed and Augmented Reality Studio (MARS). It provides a Simulation view, along with iOS and Android companion tools for easily capturing real-world information.
Its Rich Semantic Data Layer is also intended to capture real-time information from the real world, including maps, products, locations, and semantics.
Summary
Today, AR app developers test AR apps with their own eyes, and manual testing means CI cannot run automatically. In the XR era, applications are expected to be tied to the real world, so the complexity of testing and debugging will only increase. New application architectures and frameworks will be required.
See also … CI/CD in AR app development