At L4 we’re psyched about new technologies, and welcome any chance to dig into new problem domains. Augmented reality (AR) is not actually a new technology; it’s been available to most developers since around 2000 in projects like ARToolKit. AR is, however, novel in that it requires a live video feed and the ability to process that feed in real time. As you dig into experience design with AR, you quickly realize that this camera + processor + display combination needs to be mobile. You can’t do much with fixed hardware, and having this kind of horsepower in your hand is a relatively new thing.
One of the familiar paths in AR is annotating and illustrating your environment. We’ve seen prototypes of heads-up displays, but outside of Pokémon GO, nothing’s caught fire yet. Most applications in use today modify a camera’s image while simultaneously running some sort of recognition process on it in real time. A few forays into AR in the retail space have seen some scale, letting users place virtual furniture in their room (IKEA) or swap colors on clothes and shoes in-store (Converse).
These examples may be interesting for the consumer, but the first – and still the most powerful – augmented reality app this writer experienced was wayyyyyyyy back in 2010, with SkyView. Point your device at the nighttime sky, and you see a view of the stars illustrated with constellations (and not just the classical Greek ones) annotated with data. There’s a poetic economy, a ‘picture-worth-1000-words’ richness, to this type of educational experience that pulls students in and drives lessons home.
As L4 has worked through prototyping AR experiences, we’ve collected a bag of tricks we’d like to touch on here.
Annotating Your Environment
In one Proof of Concept, we needed to identify a real 3D object within the camera frame and add drawn elements dependent on the object’s position and orientation. In working through how we might accomplish the basic task of identifying objects in the camera frame, we landed quickly on Vuforia and Unity.
Unity is really the only game in town for rendering 3D models along with real-world imagery, and Vuforia’s end-to-end tooling was a welcome relief. All aspects of the AR scenario we wanted to explore were supported by Vuforia. We were able to quickly mock up several predefined targets to enable Vuforia to register the position of an object, and break the problem into manageable pieces:
- Add drawn elements to the captured image
- Track the position of the object in 3D space
- Update the annotation as the position of the object changes
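To make the last two steps concrete, here is a minimal sketch in plain Python. This is not the Unity or Vuforia API; the `Pose` type, `annotation_pose` helper, and offset value are our own illustration of the idea: each frame, the tracker reports a fresh pose for the object, and the annotation is recomputed relative to it.

```python
# Conceptual sketch (NOT Unity/Vuforia code): keep an annotation
# anchored to a tracked object as its pose updates each frame.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float   # position in meters
    y: float
    z: float
    yaw: float  # rotation about the vertical axis, in radians

def annotation_pose(target: Pose, offset_up: float = 0.1) -> Pose:
    """Place the label a fixed distance above the tracked object,
    inheriting its rotation so the label follows as it turns."""
    return Pose(target.x, target.y + offset_up, target.z, target.yaw)

# Each frame the tracker reports a new pose; we recompute the label.
frame_poses = [Pose(0.0, 0.0, 0.5, 0.0), Pose(0.1, 0.0, 0.5, 0.2)]
labels = [annotation_pose(p) for p in frame_poses]
```

In Unity terms, this is roughly what parenting the annotation’s transform to the tracked target’s transform buys you for free.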
In dealing with existing hardware, we anticipated a path forward involving targets printed on stickers, with students applying them as part of setting up the AR experience. This proved straightforward on flat surfaces, where the targets yielded fine results, but things got dicey once the targets were affixed to curved, cylindrical surfaces. And when we considered making this POC easy for children, everyone felt the stickers added friction to the experience; it felt like a deal breaker.
Continuing this line of research, we sank time into exploring 3D object recognition, which involves creating a point cloud and using it as the definition on which recognition is based. Once again, Vuforia proved invaluable: their Object Scanner application helped us generate usable point cloud data in short order. The high-contrast target images we had used were easy to recognize across a variety of lighting scenarios, but the same was not true of our physical product. We were able to do 3D tracking reasonably well with Vuforia’s Object Tracker under controlled lighting, but without some dedicated forethought about recognition during hardware design, our objects were going to be hard to track.
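Why lighting mattered is easier to see with a toy model. The sketch below is plain Python, not Vuforia’s tracker, and the threshold and point values are made up for illustration. It scores a candidate match by the mean nearest-neighbor distance from observed feature points to the reference cloud; in poor lighting you detect fewer, noisier points, which pushes that score past the acceptance threshold.

```python
# Conceptual sketch (NOT Vuforia's Object Tracker): accept a match
# when observed feature points sit close to a reference point cloud.
import math

def nn_distance(p, cloud):
    # Distance from point p to its nearest neighbor in the cloud.
    return min(math.dist(p, q) for q in cloud)

def matches(observed, reference, threshold=0.05):
    """Mean nearest-neighbor distance from observed points to the
    reference cloud; a small score means the shapes line up."""
    score = sum(nn_distance(p, reference) for p in observed) / len(observed)
    return score <= threshold

# Illustrative data: a tiny reference cloud, a clean observation,
# and a noisy one such as you might get in bad lighting.
reference = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0)]
clean = [(0.01, 0.0, 0.0), (0.1, 0.01, 0.0)]
noisy = [(0.5, 0.5, 0.5)]
```

Real trackers do far more (feature descriptors, pose estimation, temporal filtering), but the core sensitivity is the same: degrade the input points and the match score collapses.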
Keeping Your AR Tools Close
Having a few reliable tools enables product managers, designers, and developers to ideate quickly and fail fast. A small investment in learning the available tools means we can do what we love: BUILD STUFF. We’re excited to help our clients bring this approach into future product development cycles. Show, don’t tell!