Lately I’ve been running across the term “Augmented Reality”. It’s been popping up in the news, on the web, in various blog posts, and in white papers. I even saw it on a billboard as I was driving to the office. What the heck does this term, Augmented Reality (AR), mean?
Well, I’m here to tell you. Let’s break it apart.
AR, as I describe it in this post, refers to a type of software application (app) that usually runs on mobile devices. Think smartphones here. Being an iOS software engineer myself, I'll be referring primarily to Apple iOS devices in this post, although there are AR apps that run on Android, Windows Phone, and many other mobile operating systems.
By the way, for those of you who may not know, iOS is the name of Apple's mobile device operating system, so a mobile device running iOS is an iPhone, iPod Touch, or iPad. Android is, of course, Google's mobile operating system, and Windows Phone is Microsoft's.
First, let's consider the word "reality". The "reality" part of AR refers to the aspect of the app that presents the real world to the user. This usually comes in the form of the live video camera feed, although I don't see any reason a real-world feed couldn't come from the microphone instead, as long as the feed closely represents what the user is actually seeing or hearing. Most AR applications currently in existence augment the mobile device's video display.
Now let's consider the second word, "augmented". According to the Merriam-Webster dictionary, to augment something is "to make greater, more numerous, larger, or more intense". In the case of an augmented reality app, it is most likely the mobile device's video display that is made more "intense". That is, the real-time video display is augmented so that additional, computer-provided data is inserted into it. This computer-generated data usually pertains to objects and/or people that appear in the video, and it is often displayed as graphical images that overlay the video feed. An example of an AR app is one that shows the constellations when the user points the mobile device's video camera toward the sky.
Another example of an AR app is one that displays a computer-generated image of a business, such as a coffee shop or a hardware store, as the user views the urban environment through the video camera. The image appears as an overlay on the camera view and shows the direction and distance to the business. It could also show information about the business, such as opening hours and any coupons or specials that might be in effect. The CIBER Mobility Practice designs and programs these types of apps for various clients and also for proof-of-concept demonstrations.
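To make the "direction and distance" part concrete, here is a minimal sketch of how such an app might compute how far away a business is, and in which compass direction, from two GPS coordinates. Python is used purely for illustration; the function name and coordinates are hypothetical, and the math is the standard haversine formula for great-circle distance.

```python
import math

# Great-circle distance (meters) and initial compass bearing (degrees)
# from the user's GPS fix to a point of interest. Coordinates are in
# decimal degrees. All names here are illustrative, not an actual API.
def distance_and_bearing(from_lat, from_lon, to_lat, to_lon):
    R = 6_371_000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(from_lat), math.radians(to_lat)
    dphi = math.radians(to_lat - from_lat)
    dlmb = math.radians(to_lon - from_lon)

    # Haversine formula for the distance along the Earth's surface
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    meters = R * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

    # Initial bearing, normalized to 0-360 degrees (0 = north, 90 = east)
    y = math.sin(dlmb) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb))
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360

    return meters, bearing
```

In a real app the "from" coordinate would come from the device's GPS sensor and the "to" coordinate from a database of businesses; the returned distance and bearing are what the overlay would then draw on screen.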
There is a third part to the AR equation which is just as important, if not more so: how to connect the "reality" part to the "augmented" part. For instance, how does the constellation app described above know where to insert and position the graphic images over the video display? This is where the mobile device's built-in sensor hardware comes into play. Many newer mobile devices, such as the Apple iPhone and iPad, now come with a variety of built-in environment sensors: location sensors such as GPS and cell tower triangulation, position sensors such as the gyroscope and accelerometer, and direction sensors such as the magnetic compass.
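As a sketch of how one of those sensors could connect reality to the augmentation, the following illustration (again in Python; the function name, the 60-degree field of view, and the 640-pixel screen width are all assumptions, not a real device API) maps the difference between the device's compass heading and the bearing to a target onto a horizontal screen position for the overlay:

```python
# Horizontal pixel position of an AR overlay. `heading` is the compass
# direction the device camera is facing; `bearing` is the direction from
# the user to the object (both in degrees). The field of view and screen
# width are hypothetical values chosen for illustration.
def overlay_x(heading, bearing, fov=60.0, screen_width=640.0):
    # Signed angular difference, normalized into the range -180..180
    delta = (bearing - heading + 180) % 360 - 180

    # The object is off-screen if it lies outside the camera's view
    if abs(delta) > fov / 2:
        return None

    # Map -fov/2..+fov/2 linearly onto 0..screen_width
    return (delta / fov + 0.5) * screen_width
```

A full AR app would do the same kind of mapping in the vertical direction using the gyroscope and accelerometer, but the principle is the same: sensor readings tell the software where in the camera frame the computer-generated graphics belong.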
Even newer mobile devices, for instance the upcoming iPhone 5, may have Near Field Communication (NFC) sensors built in. A closely related technology, Radio Frequency Identification (RFID), is very useful for allowing a mobile device to acquire data on objects within its proximity. Data pertaining to an object is programmed into an RFID transmitter (a tag), which is then attached to the object. Any mobile device within range of the transmitting tag can automatically receive its data and thereby acquire information about the object to which the tag is attached. This technology is often used in warehouse inventory control systems.
By using the device sensor data in combination with data stored on a remote network server, AR app software engineers are able to create sophisticated AR apps such as the constellation app.
I hope this article has helped the reader better understand what augmented reality is and what the technology's future possibilities are. AR apps are still very much a new technology, and mobile devices have only recently emerged with the environment sensors and processing power needed to realize them.
As a mobile device software engineer in the CIBER Global Mobility Practice, I find AR apps to be one of the most exciting types of mobile device software to work on and I look forward to helping to realize the future potential of this new technology.