Beginning ARKit for iPhone and iPad
Augmented Reality App Development for iOS

Wallace Wang
San Diego, CA, USA

ISBN-13 (pbk): 978-1-4842-4101-1
ISBN-13 (electronic): 978-1-4842-4102-8
https://doi.org/10.1007/978-1-4842-4102-8

Library of Congress Control Number: 2018962490

Copyright © 2018 by Wallace Wang

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with every occurrence of a trademarked name, logo, or image, we use the names, logos, and images only in an editorial fashion and to the benefit of the trademark owner, with no intention of infringement of the trademark. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Aaron Black
Development Editor: James Markham
Coordinating Editor: Jessica Vakili

Cover image designed by Freepik (www.freepik.com)

Distributed to the book trade worldwide by Springer Science+Business Media New York, 233 Spring Street, 6th Floor, New York, NY 10013. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.

For information on translations, please e-mail [email protected], or visit http://www.apress.com/rights-permissions.

Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales web page at http://www.apress.com/bulk-sales.

Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub via the book's product page, located at www.apress.com/978-1-4842-4101-1. For more detailed information, please visit http://www.apress.com/source-code.

Printed on acid-free paper

This book is dedicated to everyone who has an idea for an app but didn't know what to do first or how to get started. First, believe in your idea. Second, trust that you have the intelligence to achieve your dream even if you don't know how you'll get there. Third, keep learning and improving your skills all the time. Fourth, stay focused. Success will come one day as long as you persist and never give up on yourself.
Table of Contents

About the Author  ix
About the Technical Reviewer  xi

Chapter 1: Understanding Augmented Reality and ARKit  1
    Augmented Reality on Mobile Devices  5
    Introducing ARKit  7
    System Requirements for ARKit  15
    Summary  16
Chapter 2: Getting to Know ARKit  19
    Understanding the Swift Source Code  25
    Understanding the User Interface  29
    Creating Augmented Reality with the Single View App Template  31
    Summary  48
Chapter 3: World Tracking  49
    Displaying the World Origin  50
    Resetting the World Origin  57
    Displaying Shapes at Coordinates  63
    Adding and Removing Multiple Objects  67
    Summary  78
Chapter 4: Working with Shapes  79
    Displaying Different Geometric Shapes  83
    Displaying Text  88
    Adding Textures to Shapes  93
    Changing the Transparency of Shapes  101
    Drawing Shapes  105
    Summary  112
Chapter 5: Working with Lights  113
    Using Color, Intensity, and Temperature  114
    Using a Spotlight  130
    Summary  143
Chapter 6: Positioning Objects  145
    Defining Relative Positions  145
    Combining Geometric Shapes  155
    Summary  162
Chapter 7: Rotating Objects  163
    Rotating Objects Using Euler Angles  163
    Relational Object Rotation  174
    Summary  180
Chapter 8: Drawing on the Screen  181
    Summary  202
Chapter 9: Adding Touch Gestures to Augmented Reality  205
    Recognizing Touch Gestures  210
    Identifying Touch Gestures on Virtual Objects  213
    Identifying Swipe Gestures on Virtual Objects  219
    Identifying Virtual Objects with Pan Gestures  223
    Identifying Long Press Gestures on Virtual Objects  230
    Adding Pinch and Rotation Gestures  232
    Summary  238
Chapter 10: Interacting with Augmented Reality  239
    Scaling with the Pinch Touch Gesture  244
    Rotating with the Rotation Touch Gesture  248
    Moving Virtual Objects with the Pan Gesture  254
    Summary  260
Chapter 11: Plane Detection  261
    Displaying Planes as Images  269
    Placing Virtual Objects on a Horizontal Plane  281
    Detecting Vertical Planes  290
    Summary  297
Chapter 12: Physics on Virtual Objects  299
    Applying Force on Virtual Objects  310
    Colliding with Virtual Objects  317
    Detecting Collisions  324
    Summary  331
Chapter 13: Interacting with the Real World  333
    Detecting Points in the Real World  345
    Defining a Point in the Real World  352
    Measuring Distance Between Virtual Objects  355
    Summary  362
Chapter 14: Image Detection  365
    Storing Images  369
    Detecting Multiple Images  374
    Displaying Information in Augmented Reality  379
    Summary  389
Chapter 15: Displaying Video and Virtual Models  391
    Displaying Virtual Objects in Mid-Air  393
    Displaying Video on a Plane  399
    Summary  405
Chapter 16: Image Tracking and Object Detection  407
    Detecting Objects  418
    Scanning an Object  418
    Detecting Objects in an App  426
    Summary  435
Chapter 17: Persistence  437
    Saving a World Map  444
    Loading a World Map  447
    Clearing an Augmented Reality View  451
    Summary  460
Appendix A: Converting 3D Model Files  461
    Converting COLLADA (.dae) to SceneKit (.scn)  462
    Convert 3D Models into a COLLADA (.dae) File  463
Appendix B: Creating Virtual Objects Visually  469
    Creating a SceneKit Assets Folder  470
    Creating a SceneKit (.scn) File  471
    Adding Virtual Objects to a SceneKit (.scn) File  473
    Customizing Virtual Objects  475
    Linking Virtual Objects  476
    Displaying a SceneKit (.scn) File in Augmented Reality  477
Index  479

About the Author

Wallace Wang has written dozens of computer books over the years, beginning with ancient MS-DOS programs like WordPerfect and Turbo Pascal, migrating to writing books on Windows programs like Visual Basic and Microsoft Office, and finally switching to Swift programming for Apple products like the Macintosh and iPhone. When he's not helping people discover the joys of programming, he performs stand-up comedy and appears on two radio shows on KNSJ in San Diego (http://knsj.org) called "Notes From the Underground" (with Dane Henderson, Jody Taylor, and Kristen Yoder) and "Laugh In Your Face Radio" (with Chris Clobber, Sarah Burford, and Ikaika Patria). He also writes a screenwriting blog called "The 15 Minute Movie Method" (http://15minutemoviemethod.com) and a blog about the latest cat news on the Internet called "Cat Daily News" (http://catdailynews.com).

About the Technical Reviewer

Wesley Matlock is a published author of books about iOS technologies. He has more than 20 years of development experience on several different platforms. He first started doing mobile development on the Compaq iPaq in the early 2000s. Today, Wesley enjoys developing on the iOS platform and bringing new ideas to life for Major League Baseball in the Denver Metro area.

CHAPTER 1

Understanding Augmented Reality and ARKit

You may have heard of virtual reality (VR), but a similar innovation called augmented reality (AR) is now appearing on mobile devices like the iPhone and iPad.
Although they may rely on similar technology, virtual reality and augmented reality offer vastly different uses in everyday life.

Virtual reality works by forcing users to strap a device around their head like an alien facehugger. Such VR headsets completely isolate users from their surroundings and immerse them in a completely fictional world. NASA uses virtual reality to train astronauts to explore the surface of Mars, while American football teams are experimenting with virtual reality to train quarterbacks to re-experience plays without actually going out on a field and risking physical injury. By practicing skills in a virtual reality world, users can safely make mistakes and learn from them without any physical consequences.

The huge drawback with virtual reality is that to use it, you must be in a safe place such as a home or office. Because VR headsets isolate you from your surroundings, using virtual reality essentially blindfolds you. You can't use virtual reality while driving, walking, or operating a vehicle of any kind. Because you need to wear a VR headset, you can only use virtual reality wherever you can safely stand or sit without worrying about interference from outside elements such as other people or moving vehicles. For that reason, virtual reality's uses are limited to fixed locations where users can remain safe while they immerse themselves in another world.

Augmented reality, on the other hand, is designed to interact with the world around you. Augmented reality lets you view the real world with additional information overlaid on it to help you better understand what you're looking at. For example, a measuring cup is a simple version of augmented reality.
By pouring liquid into a transparent cup with measurement units printed on the outside, you can accurately measure the amount of any liquid in the cup, as shown in Figure 1-1. Without the measurement units printed on the outside of the transparent cup, you would never know exactly how much liquid the cup contains.

Figure 1-1. A measuring cup is a simple version of augmented reality

Hunters use a similar type of augmented reality when aiming a rifle. The scope magnifies the view of whatever the hunter may be looking at, and crosshairs etched in the lens show the hunter exactly where the rifle's bullet will hit, as shown in Figure 1-2.

Figure 1-2. A hunting scope is another form of static augmented reality

Both the measuring cup and the rifle scope represent simple, but fixed, types of augmented reality. A measuring cup can only measure amounts of liquid poured into that cup, and a rifle scope can only magnify a target. Computers have helped make augmented reality more versatile so it can show information as the real world around you changes.

In the early days of aviation, pilots had to glance at an instrument panel to get information on their speed, direction, and location. Unfortunately, glancing down at the instrument panel means taking your eyes off the real world around you, even for a moment. Such brief glances away from the outside world can be dangerous because they take your eyes off any possible threats or obstacles nearby. In wartime, these obstacles could be enemy planes trying to shoot you down, while in peacetime, these obstacles could be buildings or other planes that you need to avoid. That's why modern planes offer a form of augmented reality known as a heads-up display (HUD).

A heads-up display shows flight information projected directly on the cockpit glass.
A pilot can turn off the heads-up display to get a clean view of the outside world, or turn on the heads-up display to see the real world and crucial flight information at the same time, as shown in Figure 1-3.

Figure 1-3. An airplane heads-up display offers a more sophisticated form of augmented reality

Unlike the fixed information displayed by a measuring cup or a rifle scope, an airplane's heads-up display can display constantly changing information such as altitude and speed. Because heads-up displays are simply projections on a cockpit window, a computer can display different types of information depending on the pilot's need. The ability to display dynamic, changing data and choose which type of data to display makes augmented reality far more useful and versatile than the fixed type of information displayed by crude augmented reality devices like a measuring cup or a hunter's rifle scope.

Augmented Reality on Mobile Devices

The heads-up display in airplanes made flying easier for pilots. Unfortunately, such heads-up displays were expensive and bulky. That's why only large passenger jets like the Boeing 737 or military aircraft like the F-14 were initial users of heads-up displays. As computers got smaller, lighter, and less expensive, the technology behind augmented reality became available in mobile devices like the iPhone and iPad.

Three elements have made augmented reality possible on iOS devices:

• Powerful processors
• High-resolution cameras
• High-resolution displays

The processors used in the iPhone and iPad now rival the power of desktop processors. An iPhone that you can buy today offers more processing power than a desktop computer sold just a few years ago. Even more remarkable is that the processor used in today's iPhone and iPad far surpasses the power that early mainframe and minicomputers once offered.
With each passing year, the processor used in the iPhone and iPad gets closer to matching the processing power of desktop computers. In some cases, it actually exceeds the processing power of desktop computers. Augmented reality needs fast processing power, especially when dealing with changing information.

The second element that makes augmented reality possible on mobile devices is the built-in camera available on iOS devices. In the early days, cameras on mobile phones could only capture poor-quality images. Today's cameras on the iPhone and iPad rival the dedicated digital cameras of just a few years ago. Many professional photographers and even filmmakers use the iPhone camera instead of expensive, dedicated digital or film cameras. The high resolution of today's mobile cameras has also helped make augmented reality possible.

Finally, the displays on mobile devices also offer high resolution. Not only can the iPhone and iPad screens display sharp images of the real world around you, but they can also display augmented reality data on those images as well.

The combination of fast, small processors and high-resolution cameras and displays has made augmented reality possible on mobile devices such as the iPhone and iPad. Combine these features with motion tracking, and iOS devices have all the technical capabilities necessary to display augmented reality.

One of the earliest uses for augmented reality appeared with the game Pokemon GO. Instead of limiting the game to a virtual cartoon world trapped within the confines of your screen, Pokemon GO lets players hunt for cartoon Pokemon characters in the real world. By simply holding up your iPhone or iPad, you could aim your iOS device's camera at the ground, in a tree, or on a couch to look for cartoon Pokemon characters, as shown in Figure 1-4.
Figure 1-4. Pokemon GO displays cartoon Pokemon characters overlaid on the real world

Introducing ARKit

With the technical capabilities available in the latest iOS devices, augmented reality was ready for mobile devices. The big problem was tackling the complexity of creating apps that could use augmented reality. To create an augmented reality app, you had to create your own algorithms for detecting objects in the real world and displaying virtual objects in that image. That also meant tracking camera positioning and movement of the iOS device itself. Because of this complexity, augmented reality was possible, but too difficult for most developers to use.

That's why Apple created ARKit as a software framework to make creating augmented reality apps much simpler. ARKit takes care of the complexity of making augmented reality so you can focus on the actual use of your app, such as displaying cartoon monsters on the screen like Pokemon GO or displaying data on the screen like a pilot's heads-up display.

Apple didn't invent augmented reality, nor did they create ARKit on their own. Instead, Apple has been buying augmented reality companies over the years and incorporating these other companies' technologies into a unified framework called ARKit, specifically designed to help iOS developers create augmented reality apps.

One of Apple's major augmented reality acquisitions happened in 2015 when they acquired a German augmented reality company called Metaio. To this day you can still search for "Metaio" on search engines like Google or Bing and find old videos and images showing Metaio's technology in action, much of which will continue being integrated into Apple's ARKit framework. IKEA initially used Metaio's technology to create their augmented reality app that allowed you to place furniture to see how it would look in your own home.
By aiming your camera at the floor, you can place a virtual image of furniture in your home so you can see how a piece of furniture might look before you buy it and bring it home. You can download the IKEA Place app and try it out in your house, as shown in Figure 1-5.

Figure 1-5. IKEA Place is an augmented reality app that lets you place virtual furniture in the real world

Ferrari used Metaio's augmented reality technology to let prospective buyers view a Ferrari in the showroom, but use augmented reality to display that car in different colors. By simply pointing an iPhone or iPad at a Ferrari in the showroom, you could change the color on that car to see what color you might like best, even if that particular color wasn't available to examine physically in the showroom. Since many car enthusiasts want to know what's inside a car, Ferrari's augmented reality app also let users aim an iPhone or iPad at a car and view internal features such as what the engine looks like, as shown in Figure 1-6.

Figure 1-6. Ferrari's augmented reality app lets users view the internal features of a car

The Berlin Wall Memorial created an interesting augmented reality app with Metaio's technology that let you point an iPhone or iPad at a static image, such as a window in a boarded-up building that bordered the Berlin Wall. Then the augmented reality app would show a historical video showing how people climbed out of that specific window in their attempt to escape East Berlin and make it to freedom in West Berlin. You could also use this app to view different parts of Berlin, and the app would display a video showing what that part of Berlin looked like during the time when the Berlin Wall still existed, as shown in Figure 1-7.
Such uses of augmented reality helped turn the Berlin Wall Memorial from a museum filled with static images and places into a visually dynamic display that helped make history seem to occur right before your eyes.

Figure 1-7. Augmented reality shows tourists what Berlin looked like back in the 1960s

Augmented reality will likely become common in advertising. Pepsi used augmented reality as a promotional prank by installing a camera and a screen at a popular London bus stop. While people waited for the bus, the screen displayed augmented reality showing a tiger walking down the sidewalk, a giant robot attacking the city, a meteor crashing into the ground, and UFOs floating in the sky, as shown in Figure 1-8.

Figure 1-8. Pepsi used augmented reality as a promotional gimmick

Just as air forces around the world rely on heads-up displays for their pilots, so will soldiers on the ground soon rely on similar heads-up displays to help them identify targets around them. The U.S. Army is developing Tactical Augmented Reality (TAR), where soldiers will wear smart glasses so they can see enhanced views of the world around them, including night vision and identification of possible targets, as shown in Figure 1-9.

Figure 1-9. Soldiers of the future may wear smart glasses with heads-up displays to identify possible targets

The Disney Corporation is experimenting with augmented reality to create interactive coloring books. As children color an image, they can view that image as a three-dimensional character standing on the pages right in front of them, as shown in Figure 1-10.

Figure 1-10. Augmented reality can create interactive coloring books

Games, advertising, heads-up displays, and interactive books are just some of the many possibilities that augmented reality offers. To this day, Apple continues acquiring augmented reality companies to improve its augmented reality plans, such as ARKit. In 2016, Apple acquired Flyby Media, an augmented reality company that focused on spatial recognition. Flyby Media's technology would let augmented reality devices understand distances between mobile devices and real-world objects around them. In 2017, Apple acquired SensoMotoric Instruments, a company that specialized in eye-tracking technology that could be used for virtual and augmented reality glasses. That same year, Apple acquired VRvana, a company that specialized in mixed reality headsets. In 2018, Apple acquired Akonia Holographics, a startup that advertised that it made "holographic reflective and waveguide optics for transparent display elements in smart glasses."

By tracking Apple's latest augmented reality acquisitions, you can see what new features will eventually come to ARKit on iOS devices like the iPhone and iPad, and in future devices like smart glasses or heads-up displays for cars. ARKit will continue growing in features while making augmented reality accessible to all Swift and Objective-C developers who want to add augmented reality to their own iOS apps. By learning ARKit now, you can create augmented reality apps now and in the future.

Note: Augmented reality is best suited for mobile devices with a camera, such as the iPhone and iPad. That means ARKit is designed for creating iOS apps but is not designed to work with Apple's other operating systems, such as macOS, tvOS, or watchOS.
System Requirements for ARKit

Since augmented reality requires processing power, cameras, and high-resolution displays, you can only create and run ARKit apps on modern iOS devices. That means ARKit apps can only run on the iPhone 6s/6s Plus or newer, along with the iPad Pro. Older iOS devices, such as the iPhone 5s or iPad mini, won't be able to run ARKit apps. As people abandon older iOS devices in favor of newer models, this restriction won't be much of a problem, but for now, be aware that any ARKit apps you create may not run on some people's older iOS devices.

To create apps, you need to use Apple's free Xcode compiler. When creating ordinary iOS apps, you can test them on the Simulator program that lets your Macintosh mimic different iPhone and iPad models, such as the iPhone 4s. When creating iOS apps that use ARKit, you will not be able to test your apps on the Simulator program. Instead, you'll need a physical iOS device, such as an iPhone 6s or newer or an iPad Pro, connected to your Macintosh through its USB cable. You can only test ARKit apps on a physical device because you need to use the camera in a real iOS device.

Finally, to create iOS apps that use ARKit, you can choose between Apple's two official programming languages: Swift and Objective-C. While many older apps were written in Objective-C, Swift is Apple's programming language of the future. Not only is Swift just as powerful as Objective-C, but it's also faster and far easier to learn. Although you can use Objective-C to create ARKit apps, it's far better to focus solely on Swift. Swift will only continue to grow in popularity, while Objective-C will continue decreasing in popularity over time as more developers embrace Swift. Because the future of Apple development is Swift and not Objective-C, this book focuses exclusively on Swift to create ARKit apps.
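Because some devices can't run ARKit, it's worth guarding any augmented reality feature at runtime. Here is a minimal sketch in Swift using ARKit's ARWorldTrackingConfiguration.isSupported property; the view controller name and the printed messages are illustrative assumptions, not code from this book:

```swift
import UIKit
import ARKit

class SupportCheckViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // isSupported is false on devices that lack the hardware
        // ARKit needs (and in the Simulator), so skip AR code
        // paths in those cases instead of crashing.
        if ARWorldTrackingConfiguration.isSupported {
            print("This device supports ARKit world tracking")
        } else {
            print("ARKit world tracking is not available on this device")
        }
    }
}
```

If your entire app depends on ARKit, you can instead add the arkit key under UIRequiredDeviceCapabilities in Info.plist so the App Store only offers the app to compatible devices.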
To create the augmented reality apps in this book, you'll need a Macintosh and a copy of Xcode 10 or greater. You'll also need an iOS device, such as an iPhone or iPad, that you can connect to your Macintosh through its USB cable. To take full advantage of all the latest features of ARKit, your iOS device should also be running iOS 12 or later.

Summary

The true potential of augmented reality, and ARKit in particular, is yet to be realized. Unlike virtual reality, which requires the purchase of a dedicated VR headset, augmented reality works on the ordinary iPhones and iPads that many people already own. Also unlike virtual reality, augmented reality can be used wherever you happen to be as you interact with the real world around you.

Games like Pokemon GO have helped introduce augmented reality to the public, just as video games helped introduce people to early personal computers. Beyond the entertainment value of augmented reality, more people and companies will start seeing and using augmented reality for useful applications.

One simple use for augmented reality involves directions. By viewing your surroundings through an iPhone or iPad screen, you can see streets and buildings. With augmented reality, you will soon be able to see colored pathways showing you the fastest way to walk to your destination, along with street names and business names overlaid on roads and storefronts.

When you want to use augmented reality, it's as easy as pulling out your iPhone or iPad. When you're done, just put your iPhone or iPad away. (To use virtual reality, you have to buy a dedicated virtual reality headset and strap it over your face, cutting off your view of your surroundings. When you're done with virtual reality, you still have to lug the headset around or store it somewhere, which makes virtual reality less convenient to use than augmented reality.)
Augmented reality will gradually become commonplace on every iPhone and iPad. Eventually, smart glasses will appear that display augmented reality without the need to hold an iPhone or iPad in the air. The future of augmented reality is coming faster than you think. By learning how to create augmented reality apps today using ARKit, you'll be ready for the future, whatever form it may take.

CHAPTER 2 Getting to Know ARKit

Augmented reality works by tracking the real world through a camera. By identifying solid objects in the real world, such as floors, tables, and walls, augmented reality can then accurately place virtual objects in the scene to create the illusion that they are actually there. Even if the virtual object is nothing more than a cartoon Pokémon character, augmented reality must overlay that virtual object so it doesn't get cut in half by furniture, walls, or tables.

Since creating algorithms for detecting objects in the real world can be difficult even for experienced programmers, Apple created a software framework called ARKit, which provides much of the basic needs of any augmented reality app. By using ARKit, you can create augmented reality apps by focusing on the unique features of your app rather than on the details of detecting, displaying, and tracking virtual objects in the real world. ARKit acts as a platform for you to develop your own augmented reality apps.

To help you get familiar with ARKit, Xcode provides a simple augmented reality project that you can compile and run on any compatible iOS device physically connected to your Macintosh through its USB cable. To create this ARKit sample app, follow these steps:

1. Start Xcode. (Make sure you're using Xcode 10 or greater.)

2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template, as shown in Figure 2-1.

Figure 2-1. Choosing an Xcode project template

3. Click the Augmented Reality App icon and click the Next button. Xcode asks for a product name, organization name, organization identifier, and content technology, as shown in Figure 2-2.

Figure 2-2. Defining the options for an augmented reality project

4. Click in the Product Name text field and type a descriptive name for your project, such as ARExample. (The exact name does not matter.)

Note: The organization name and identifier can be any text, such as your name or company name. The organization identifier is typically your company's website address in reverse domain order, such as com.microsoft.

5. Make sure the Content Technology popup menu displays SceneKit. SpriteKit and Metal give you more versatility at the expense of more complexity. For the purposes of this book, all augmented reality apps will rely on SceneKit.

6. Make sure the Include Unit Tests and Include UI Tests check boxes are not checked, since we won't be using these features in any of the apps created in this book.

7. Click the Next button. Xcode asks where you want to store your project.

8. Choose a folder and click the Create button. Xcode creates an augmented reality project that's ready to run.

9. Connect your iPhone or iPad to your Macintosh using its USB cable.

10. Click on the popup menu near the top of the Xcode window that displays the device to run your project on, as shown in Figure 2-3.

Figure 2-3. Defining the target to run the project on

11. Choose your iOS device, such as an iPhone or iPad.

12. Click the Run button or choose Product ➤ Run. A dialog appears, asking if you want to allow your project to access the iOS device's camera, as shown in Figure 2-4.

Figure 2-4.
Apps must always ask for permission to access an iOS device's camera

Note: Xcode may ask for your password to allow your app to access additional libraries. To grant access, type your password and click the Allow button.

13. Your project appears on your iOS device. Notice that a cartoon airplane appears, as shown in Figure 2-5. Each time you run this project, point your iOS device in a different direction. Whatever direction your iOS device's camera points at is where the cartoon airplane will appear. Move your iOS device around and you'll be able to see the cartoon airplane from different angles.

Figure 2-5. Viewing the virtual airplane through an iPhone screen

14. Click the Stop button in Xcode or choose Product ➤ Stop.

Understanding the Swift Source Code

By creating a project using the Augmented Reality App template, you can create a working augmented reality app without writing or modifying a single line of code. To better understand how to use ARKit, let's dissect the Swift code so you can understand exactly what's happening.

Your entire augmented reality project consists of several files, but the ViewController.swift file contains all the Swift code you need to add augmented reality to any project. First, notice that the ViewController.swift file imports three software frameworks: UIKit (defines the user interface), SceneKit (defines the 3D animation used to create virtual images), and ARKit (links to the ARKit augmented reality library).

import UIKit
import SceneKit
import ARKit

Note: SceneKit is Apple's framework for creating 3D animation, but two other options are SpriteKit and Metal. If you chose either of those options, your project would need to import the SpriteKit or Metal framework instead of SceneKit.
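To make the Note concrete, here is a minimal sketch (not taken from Apple's template) of what roughly the same view controller might look like if you had chosen SpriteKit as the content technology. The class name SpriteViewController and the scene file name Scene.sks are assumptions for illustration; ARSKView and ARSKViewDelegate are the SpriteKit counterparts of ARSCNView and ARSCNViewDelegate.

```swift
import UIKit
import SpriteKit
import ARKit

// A sketch of the SpriteKit variant of the AR view controller.
// SpriteKit renders 2D content, so the scene is a .sks file
// rather than a .scn SceneKit file.
class SpriteViewController: UIViewController, ARSKViewDelegate {

    // ARSKView is the SpriteKit equivalent of ARSCNView
    @IBOutlet var sceneView: ARSKView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self

        // Load and present a 2D SpriteKit scene over the camera feed
        if let scene = SKScene(fileNamed: "Scene") {
            sceneView.presentScene(scene)
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```

The session setup is identical to the SceneKit version; only the view class and the scene type change.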
Next, you must define the ViewController class to adopt the ARSCNViewDelegate protocol:

class ViewController: UIViewController, ARSCNViewDelegate {

Now you need a scene for displaying a virtual image. To do this, you need to create an IBOutlet. The name of this IBOutlet can be anything you wish, but the Augmented Reality App template names it sceneView, and it represents an ARSCNView object:

@IBOutlet var sceneView: ARSCNView!

Within the viewDidLoad method, you need to define four items. First, you must set the scene view's delegate to the ViewController class itself:

// Set the view's delegate
sceneView.delegate = self

Second, you can display statistics on the screen that show information such as frames per second (fps):

// Show statistics such as fps and timing information
sceneView.showsStatistics = true

Third, you need to define the actual image to display in the augmented reality scene. Remember, the scene is displayed by the IBOutlet named sceneView:

// Create a new scene
let scene = SCNScene(named: "art.scnassets/ship.scn")!

If you click on the art.scnassets folder in Xcode, you can see two graphic files called ship.scn and texture.png, as shown in Figure 2-6.

Figure 2-6. The contents of the art.scnassets folder

The ship.scn file represents a SceneKit image (notice the .scn file extension). Another type of graphic image you can use with ARKit is the COLLADA (COLLAborative Design Activity) file, which has the .dae file extension. Nearly all 3D authoring tools, such as SketchUp, can export files to the .dae format, which is an open standard for storing 3D images.

The ship.scn file defines the 3D shape of the plane. The texture.png graphic file defines the image that gets applied to the ship.scn model to display different colors or patterns.
In most cases, you’ll need both a 3D image 27 Chapter 2 Getting to Know ARKit (a .scn or .dae file) and a texture (a .png file) that wraps around the 3D image and provides the “skin” or outside graphics for that 3D image. If you click on the texture.png file, you can see what it looks like, as shown in Figure 2-7. Figure 2-7. The texture.png file defines the “skin” of a 3D image Fourth, after defining the 3D image with a variable name (scene), you need to put this 3D image into the actual scene view: // Set the scene to the view sceneView.scene = scene In the viewWillAppear method, you need two additional lines of Swift code. The first line turns on the iOS device’s tracking to measure the location and angle you aim the iOS device’s camera: // Create a session configuration let configuration = ARWorldTrackingConfiguration() 28 Chapter 2 Getting to Know ARKit The second line runs the actual augmented reality session: // Run the view's session sceneView.session.run(configuration) Understanding the User Interface The user interface for the Augmented Reality App Template consists of a single view in a storyboard. On that view is an ARSCNView object that fills the entire view, as shown in Figure 2-8. This ARKit SceneKit View allows SceneKit 3D images to appear on the user interface. Figure 2-8. The ARKit SceneKit View defines where to display the 3D image on the user interface Every augmented reality app must access the iOS device’s camera. However, every app must first ask for permission to use the camera. To ask for permission to access the camera, your app must define a privacy setting for the camera. You can view this camera privacy setting in the Info.plist file, as shown in Figure 2-9. 29 Chapter 2 Getting to Know ARKit Figure 2-9. 
The Info.plist file defines the privacy setting for the iOS device camera

The text that appears in the Value column of the camera privacy setting will appear in the dialog that asks the user's permission for your app to access the iOS device's camera. This text is simply an explanation of why your app needs access to the camera, such as "This application will use the camera for Augmented Reality." You can always change this text to something else if you wish.

The second line in the Info.plist file that every augmented reality app needs is the Required Device Capabilities setting. It must be set to arkit, as shown in Figure 2-10.

Figure 2-10. The Info.plist file defines the hardware requirements for an iOS device to run augmented reality apps

This setting in the Info.plist file makes sure your app will only attempt to run on an iOS device with the proper hardware capable of running ARKit, such as an iPhone 6s or higher, or an iPad Pro or higher.

Creating Augmented Reality with the Single View App Template

The Augmented Reality App template provides the basic Swift code needed to display augmented reality. Rather than use the Augmented Reality App template, you can create an augmented reality app using the simple Single View App template instead. By creating an augmented reality app through the Single View App template, you can get a better idea of what code you need to write and what user interface elements you need to give any app augmented reality capabilities.

1. Start Xcode. (Make sure you're using Xcode 10 or greater.)

2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template (see Figure 2-1).

3. Click the Single View App icon and click the Next button. Xcode asks for a product name, organization name, and organization identifier (see Figure 2-2). Make sure all check boxes are clear for additional options such as Use Core Data or Include Unit Tests.

4.
Click in the Product Name text field and type a descriptive name for your project, such as ARExample2. (The exact name does not matter, but make sure it's different from the project you created using the Augmented Reality App template.)

5. Click the Next button. Xcode asks where you want to store your project.

6. Choose a folder and click the Create button. Xcode creates a basic iOS project that's ready to run.

At this point, you have a basic iOS app with no augmented reality features. To add augmented reality to an app, we need to write Swift code, modify the user interface, and define settings in the Info.plist file to allow access to the camera and run only on ARKit-compatible iOS devices, such as the iPad Pro or iPhone 6s and higher.

First, we need to make sure our iOS app can access the ARKit framework and use the camera. To do this, we need to modify the Info.plist file.

1. Click the Info.plist file in the Navigator pane. Xcode displays a list of keys, types, and values.

2. Click the disclosure triangle to expand the Required Device Capabilities category to display Item 0.

3. Move the mouse pointer over Item 0 to display a plus (+) icon.

4. Click this plus (+) icon to display a blank Item 1.

5. Type arkit under the Value category in the Item 1 row (see Figure 2-10).

6. Move the mouse pointer over the last row to display a plus (+) icon.

7. Click on the plus (+) icon to create a new row. A popup menu appears.

8. Choose Privacy – Camera Usage Description, as shown in Figure 2-11.

Figure 2-11. The Privacy – Camera Usage Description line lets your app access an iOS device's camera

9. Type AR needs to use the camera under the Value category in the Privacy – Camera Usage Description row.

With "arkit" and "Privacy – Camera Usage Description" defined in the Info.plist file, our app can now access ARKit and use an iOS device's camera.
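With the capability keys in place, it can also help to confirm at runtime that the device actually supports world tracking before starting a session. This short sketch is an illustration rather than part of the template; the function name startSessionIfPossible is an assumption, but the isSupported class property is real, inherited by ARWorldTrackingConfiguration from ARConfiguration.

```swift
import ARKit

// A sketch of a defensive session start: only run world tracking
// if the current device's hardware supports it.
func startSessionIfPossible(on sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // On unsupported hardware, you might show an alert instead
        print("This device does not support ARKit world tracking")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
}
```

Because the Info.plist arkit capability already prevents installation on unsupported devices, this check is mostly useful when arkit is an optional rather than required capability.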
The next step is to modify the ViewController.swift file and write Swift code to display augmented reality.

1. Click on the ViewController.swift file in the Navigator pane of Xcode.

2. Add the following two lines under the import UIKit line, as follows:

import SceneKit
import ARKit

3. Modify the class ViewController line to add the ARSCNViewDelegate as follows:

class ViewController: UIViewController, ARSCNViewDelegate {

4. At the bottom of the ViewController.swift file, add the viewWillAppear and viewWillDisappear functions as follows:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    sceneView.session.pause()
}

In this code, we're referencing sceneView, which hasn't been defined yet. This sceneView variable name will represent the user interface view that displays augmented reality in our app.

The user interface object that displays augmented reality is called the ARKit SceneKit View. To add an ARKit SceneKit View object, we need to drag it to the Main.storyboard file and create an IBOutlet for it in the ViewController.swift file. To do this, follow these steps:

1. Click on the Main.storyboard file in the Navigator pane of Xcode. Xcode displays an iOS device on the storyboard screen that you can change by clicking View As at the bottom of the storyboard screen, as shown in Figure 2-12.

Figure 2-12. The View As option lets you choose different iOS devices for a storyboard

2. Click the Object Library icon to display the Object Library window, as shown in Figure 2-13.

Figure 2-13. The Object Library icon opens the Object Library window

3. Click in the search field at the top of the Object Library window and type ARKit.
The Object Library window displays all available ARKit objects, as shown in Figure 2-14.

Figure 2-14. Displaying ARKit objects in the Object Library window

4. Drag the ARKit SceneKit View from the Object Library onto the storyboard.

5. Resize the ARKit SceneKit View on the storyboard, as shown in Figure 2-15. The exact size and position of the ARKit SceneKit View isn't important, but make it large enough, because the size of the ARKit SceneKit View defines how large the image will appear when viewed through the iOS device's camera.

Figure 2-15. Resizing the ARKit SceneKit View

6. Click on the ARKit SceneKit View to select it and then choose Editor ➤ Resolve AutoLayout Issues ➤ Reset to Suggested Constraints. Xcode adds constraints to keep your ARKit SceneKit View properly aligned no matter which size or orientation of iOS device the user holds.

7. Click the Show Assistant Editor icon, as shown in Figure 2-16, or choose View ➤ Assistant Editor ➤ Use Assistant Editor. Xcode displays the ViewController.swift file side by side with the storyboard.

Figure 2-16. The Show Assistant Editor icon lets you view a storyboard and Swift controller file at the same time

8. Move the mouse over the ARKit SceneKit View, hold down the Control key, and drag the mouse underneath the class ViewController line, as shown in Figure 2-17.

Figure 2-17. Control-dragging from the ARKit SceneKit View to the ViewController.swift file

9. Release the Control key and the mouse. Xcode displays a popup menu to define a name for the IBOutlet, as shown in Figure 2-18.

Figure 2-18. Defining a name for an IBOutlet

10. Click in the Name field, type sceneView, and press Return. Xcode creates an IBOutlet in the ViewController.swift file as follows:

@IBOutlet var sceneView: ARSCNView!

11.
Click the Use Standard Editor icon or choose View ➤ Standard Editor ➤ Use Standard Editor.

12. Click the ViewController.swift file in the Navigator pane. Xcode displays the Swift code stored in the ViewController.swift file.

13. Edit the viewDidLoad function as follows:

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true
    let scene = SCNScene(named: "")!
    sceneView.scene = scene
}

These code changes essentially duplicate the Augmented Reality App template. However, we still need an object to place in our augmented reality view. When we created an augmented reality app from the augmented reality template, that template included a ship.scn file, where the .scn file extension stands for SceneKit.

What we need initially are files stored in the .dae COLLADA file format, which stands for COLLAborative Design Activity. This file format is used as a standard for sharing graphic designs among three-dimensional programs. To find .dae COLLADA files, visit your favorite search engine and look for ".dae public domain" files that you can download. (For the artistically inclined, you can create your own three-dimensional objects using graphics editors, such as the free Blender program available at www.blender.org.) Most COLLADA files consist of a .dae file that defines the shape of the object and a texture file that defines the outer design of that shape. Two sites that offer free (and paid) COLLADA files are Free3D (https://free3d.com) and TurboSquid (www.turbosquid.com).

Once you've downloaded a .dae COLLADA file along with any accompanying texture files, you must create a special scnassets folder to store those images. To create an scnassets folder, follow these steps:

1. Choose File ➤ New ➤ File. Xcode displays different file templates.

2.
Click iOS at the top of the template window and scroll down to click on the SceneKit Catalog icon under the Resource category, as shown in Figure 2-19.

Figure 2-19. Choosing the SceneKit Catalog icon

3. Click the Next button. Xcode asks where you want to store this folder.

4. Click the Create button. Xcode creates a SceneKit Asset Catalog.scnassets folder in the Navigator pane.

5. Click on the SceneKit Asset Catalog.scnassets folder and press Return. Xcode highlights the entire folder name, as shown in Figure 2-20.

Figure 2-20. Changing the name of the SceneKit Asset Catalog folder

6. Change the name of the folder to art.scnassets and press Return.

Now that we've written the bulk of the Swift code needed in the ViewController.swift file and designed the user interface to display augmented reality through the ARKit SceneKit View, the last step is to import a .dae file and its texture file into the .scnassets folder you created in the Xcode Navigator pane. To add 3D images to Xcode, follow these steps:

1. Drag and drop the .dae file and its accompanying texture image from the Finder window to the scnassets folder, as shown in Figure 2-21.

Figure 2-21. Drag and drop a .dae and texture file from the Finder window to the scnassets folder in Xcode

2. Click on the .dae file in the scnassets folder to select it.

3. Choose Editor ➤ Convert to SceneKit scene file format (.scn). A dialog appears, asking you to verify that you want to convert the .dae file to a .scn file, as shown in Figure 2-22.

Figure 2-22. Xcode asks for confirmation to convert the .dae file to an .scn file

4. Click the Convert button. Xcode converts your .dae file to an .scn file.

5. (Optional) Click on the .scn file and press Return to edit the filename to something simple and descriptive.

6. Edit the following line in the viewDidLoad function to include the name of your .scn file.
If your .scn file was named satellite.scn, the code would look like this:

let scene = SCNScene(named: "art.scnassets/satellite.scn")!

This Swift code will load the .dae file (converted to an .scn file) into your augmented reality view. However, there's still one last step. With most .dae files, there's an accompanying texture file that defines the outer appearance or "skin" of the three-dimensional object. The final step is to apply this texture or "skin" to the .scn file. To do this, follow these steps:

1. Click on the .scn file in your scnassets folder displayed in the Navigator pane. Xcode displays your image as a general shape but with no outer appearance.

2. Click the Show Scene Graph View icon near the bottom of the Xcode window, as shown in Figure 2-23. Xcode displays the Scene Graph View.

Figure 2-23. The Show Scene Graph View icon

3. Click on each item displayed in the Scene Graph View pane and then click on the Show the Material Inspector icon, as shown in Figure 2-24. Or choose View ➤ Inspectors ➤ Show Material Inspector.

Figure 2-24. The Show Material Inspector icon

4. Click on the Diffuse popup menu and choose the name of your texture file, such as texture.jpg, as shown in Figure 2-25.

Figure 2-25. The Diffuse popup menu lets you choose the texture image

If your original .dae file came with two or more texture files, you may need to include those multiple texture files in the scnassets folder and use the Diffuse popup menu to select each appropriate texture file for different parts of your three-dimensional object.

Now attach an iOS device to your Macintosh through its USB cable and click the Run icon or choose Product ➤ Run. You should now see your .scn file displayed over the image captured by your iOS device's camera.

Summary

While it's possible to create augmented reality apps on your own, it's far simpler to rely on Apple's ARKit framework.
ARKit takes care of the details of tracking the camera and the real-world objects around you to combine reality with virtual images.

The simplest way to create an augmented reality app is to start with the Augmented Reality App template when creating a new iOS project. However, you can also add augmented reality features to an existing app. First, you must import the ARKit framework along with a graphics framework such as SceneKit. Next, you must create an ARKit SceneKit View on your app's user interface to view the actual augmented reality image. Finally, you must import a three-dimensional image into Xcode and convert it to an .scn SceneKit file.

When you want an app focused on augmented reality, it's best to create a new project using the Augmented Reality App project template. When you want to add augmented reality features to an existing app, you can easily do so at any time as well. Now that you have a basic idea of how to create an augmented reality app and the various steps you need to follow, it's time to go into more detail about the specific augmented reality features available through ARKit.

CHAPTER 3 World Tracking

Augmented reality works by tracking the real world through a camera. By identifying solid objects in the real world, such as floors, tables, and walls, augmented reality can then accurately place virtual objects in the scene to create the illusion that they are actually there. Even if the virtual object is nothing more than a cartoon Pokémon character, augmented reality must overlay that virtual object so it feels like part of the real world seen through the camera.

To identify the location of both real and virtual objects, ARKit uses a coordinate system where the x-axis points left and right, the y-axis points up and down, and the z-axis points toward and away from the camera, as shown in Figure 3-1.

Figure 3-1.
Defining the x-, y-, and z-axes for the ARKit coordinate system

To place virtual objects in the real world, ARKit uses a technique called visual-inertial odometry, which combines camera images with motion-sensor data to recognize solid objects in the real world (such as walls and tabletops) and the current position of the camera (the iOS device) in relation to those objects. With this information, ARKit can place objects on real-world items such as floors or desks, or at a fixed distance from the camera's current location, such as two meters in front of you and half a meter to your left.

Identifying real-life objects seen through a camera is known as world tracking. World tracking works best in good lighting with multiple, contrasting objects that can be easily spotted, such as a chair and a table in a room. World tracking accuracy can suffer in dim or poor lighting, or when viewing objects that are not easy to identify, such as a solid wall or a road with no other contrasting objects.

Think of how you identify objects in the real world. It's easy to identify a lamp on a table because you can see both the lamp's entire outline and the table's surface and edges. If someone just showed you a close-up of a lamp or table surface, you might not know whether you're looking at a wall or a floor. As a general rule, if it's easy for a person to identify objects in an image, it's easy for ARKit to identify the shape of those objects too.

Besides identifying object boundaries, another key to accuracy is the user holding the camera steady. This gives ARKit time to accurately map out its surroundings.
If the user moves the camera around too quickly or in erratic movements, ARKit will have a harder time accurately identifying real-world objects, in the same way you might have trouble identifying objects in a video shot by someone moving a camera rapidly in all directions.

Displaying the World Origin

Every augmented reality app needs to import the ARKit framework along with a graphics framework to display virtual objects, such as SceneKit, SpriteKit, or Metal, like this:

import ARKit
import SceneKit

Once your app imports the ARKit framework and a graphics framework like SceneKit, the next step is to use the ARWorldTrackingConfiguration class like this:

let configuration = ARWorldTrackingConfiguration()

AR world tracking needs to take place inside an ARKit SceneKit View (ARSCNView), which you must add to your app's user interface. You must create an IBOutlet for this ARSCNView, such as:

@IBOutlet var sceneView: ARSCNView!

Now you need to run AR world tracking within this ARSCNView like this:

sceneView.session.run(configuration)

At this point, you would normally display a virtual object in the ARSCNView, such as a cartoon airplane or chair. For this exercise, we're going to display the world origin that ARKit uses. These world origin coordinates will let you see the x-, y-, and z-axes that define where ARKit places virtual objects. Displaying the world origin is handy for debugging your app and making sure it displays virtual objects exactly where you want them.

To see how to display the world origin in an augmented reality app, follow these steps:

1. Start Xcode. (Make sure you're using Xcode 10 or greater.)

2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template.

3. Click the iOS category.

4. Click the Single View App icon and click the Next button. Xcode asks for a product name, organization name, and organization identifier.

5.
Click in the Product Name text field and type a descriptive name for your project, such as World Tracking. (The exact name does not matter.)

6. Click the Next button. Xcode asks where you want to store your project.

7. Choose a folder and click the Create button. Xcode creates an iOS project.

First, let's modify the Info.plist file to allow access to the camera and to use ARKit by following these steps:

1. Click the Info.plist file in the Navigator pane. Xcode displays a list of keys, types, and values.

2. Click the disclosure triangle to expand the Required Device Capabilities category to display Item 0.

3. Move the mouse pointer over Item 0 to display a plus (+) icon.

4. Click this plus (+) icon to display a blank Item 1.

5. Type arkit under the Value category in the Item 1 row.

6. Move the mouse pointer over the last row to display a plus (+) icon.

7. Click on the plus (+) icon to create a new row. A popup menu appears.

8. Choose Privacy – Camera Usage Description.

9. Type AR needs to use the camera under the Value category in the Privacy – Camera Usage Description row.

Now that our app can access the camera and use ARKit, let's add an ARKit SceneKit View to the Main.storyboard file so our app can display images from the camera. To add an ARKit SceneKit View to your user interface, follow these steps:

1. Click on the Main.storyboard file in the Navigator pane of Xcode. Xcode displays an iOS device on the storyboard screen that you can change by clicking View As at the bottom of the storyboard screen.

2. Click the Object Library icon to display the Object Library window.

3. Click in the search field at the top of the Object Library window and type ARKit. The Object Library window displays all available ARKit objects.

4. Drag the ARKit SceneKit View from the Object Library onto the storyboard.

5. Resize the ARKit SceneKit View on the storyboard.
The exact size and position of the ARKit SceneKit View isn’t important, but make it large enough, because the size of the ARKit SceneKit View defines how large the image will appear when viewed through the iOS device’s camera.
6. Click on the ARKit SceneKit View to select it and then choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints. Xcode adds constraints to keep your ARKit SceneKit View properly aligned no matter what size or orientation of iOS device the user holds.
7. Click the Show Assistant Editor icon, or choose View ➤ Assistant Editor ➤ Use Assistant Editor. Xcode displays the ViewController.swift file side by side with the storyboard.
8. Move the mouse over the ARKit SceneKit View, hold down the Control key, and drag the mouse underneath the class ViewController line.
9. Release the Control key and the mouse. Xcode displays a popup menu to define a name for the IBOutlet.
10. Click in the Name field, type sceneView, and press Return. Xcode creates an IBOutlet in the ViewController.swift file as follows:

@IBOutlet var sceneView: ARSCNView!

11. Click the Use Standard Editor icon or choose View ➤ Standard Editor ➤ Use Standard Editor.
12. Click the ViewController.swift file in the Navigator pane. Xcode displays the Swift code stored in the ViewController.swift file.
13. Edit the ViewController.swift file as follows:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin]
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}

The main difference between this app and the previous augmented reality apps we’ve built is this single line:

sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin]

This line tells the ARSCNView to display the world origin coordinate system, which consists of a red line (x-axis), a green line (y-axis), and a blue line (z-axis).

To see the world origin coordinates in an augmented reality view, follow these steps:

1. Connect your iOS device to your Macintosh through its USB cable.
2. Click on the Set the Active Scheme popup menu and choose the iOS device you’ve connected to your Macintosh, as shown in Figure 3-2.

Figure 3-2. The Set the Active Scheme popup menu

3. Click on the Run button or choose Product ➤ Run. (The first time you run this app, you’ll need to grant it access to the camera.)
4. Turn and aim your iOS camera until you spot the colored world coordinates floating in midair, as shown in Figure 3-3.

Figure 3-3. Viewing the world coordinate system through an iOS device camera

5. Click the Stop button or choose Product ➤ Stop.

ARKit displays the world origin coordinate system where your iOS device appears as soon as the app runs. That’s why you may need to step back to see the world coordinate system floating before your eyes the moment your app starts running.

Resetting the World Origin

Each time you run an augmented reality app, it defines the world coordinates at the current location of the iOS device. Of course, you may not want the world coordinate system to appear only where you’re currently holding your iOS device when the app runs.
That’s why ARKit gives you the option to reset the world coordinate system, so you can move your iOS device to a new location and reset the world origin to the new position of your iOS device.

To reset the world coordinates, we’ll need a UIButton for the user to tap. Then we’ll need to write an IBAction method to reset the world coordinates to the current position of the iOS device. To create a UIButton and write an IBAction method to reset world tracking coordinates, follow these steps:

1. Click on the Main.storyboard file in the Navigator pane.
2. Resize the ARSCNView so there’s a blank space between the bottom of the ARSCNView and the bottom of the iOS device screen.
3. Click the Object Library icon to open the Object Library window.
4. Type UIButton. The Object Library window displays the UIButton, as shown in Figure 3-4.

Figure 3-4. Finding the UIButton in the Object Library

5. Drag the UIButton underneath the ARSCNView.
6. Resize the width of the UIButton.
7. Double-click on the UIButton to highlight its caption and type a new caption, such as Reset. Your user interface should look similar to Figure 3-5.

Figure 3-5. Adding a UIButton to the user interface

8. Hold down the Shift key and click on the ARSCNView object. Handles should now appear around both the ARSCNView and the UIButton.
9. Choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints under the All Views in View Controller category. Xcode adds constraints for both the ARSCNView and the UIButton.
10. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor. Xcode displays the ViewController.swift file and the storyboard side by side.
11. Move the mouse over the UIButton on the storyboard, hold down the Control key, and drag the mouse underneath the IBOutlet in the ViewController.swift file, as shown in Figure 3-6.

Figure 3-6.
Control-dragging from the UIButton to the ViewController.swift file

12. Release the Control key and the mouse. Xcode displays a popup menu.
13. Click on the Connection popup menu and choose Action, as shown in Figure 3-7.

Figure 3-7. Creating an IBAction method

14. Click in the Name field, type resetButton, and press Return.
15. Click in the Type popup menu and choose UIButton.
16. Click the Connect button. Xcode displays a blank IBAction method.
17. Click the Standard Editor icon or choose View ➤ Standard Editor ➤ Show Standard Editor. If Xcode does not display the ViewController.swift file, click on the ViewController.swift file in the Navigator pane.
18. Edit the IBAction resetButton function as follows:

@IBAction func resetButton(_ sender: UIButton) {
    sceneView.session.pause()
    sceneView.session.run(configuration, options: [.resetTracking])
}

19. Move the let configuration = ARWorldTrackingConfiguration() line underneath the IBOutlet line, as follows:

@IBOutlet var sceneView: ARSCNView!
let configuration = ARWorldTrackingConfiguration()

The entire ViewController.swift file should look like this:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()

    @IBAction func resetButton(_ sender: UIButton) {
        sceneView.session.pause()
        sceneView.session.run(configuration, options: [.resetTracking])
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin]
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(configuration)
    }
}

20. Connect an iOS device to your Macintosh with its USB cable.
21. Click the Run button or choose Product ➤ Run.
When the app runs, step back to see the x-, y-, and z-axes of the world coordinates floating in the air.
22. Move to a new location and tap the Reset button on the iOS screen.
23. Step back and you’ll see the x-, y-, and z-axes of the world coordinates at the location your iOS device occupied when you tapped the Reset button.
24. Click the Stop button or choose Product ➤ Stop.

Displaying Shapes at Coordinates

Displaying the world origin lets you see where you can define virtual objects to appear in your augmented reality view. By specifying x, y, and z coordinates, you can make virtual objects appear in relation to the current position of the user’s iOS device.

Besides elaborate virtual objects like spaceships or dinosaurs, the simplest virtual objects ARKit can display at specific coordinates are shapes like spheres, boxes, and planes. To create a shape, you must start by creating a node based on the SCNNode class like this:

let node = SCNNode()

At this point, you need to define a shape for the node. SceneKit provides boxes, planes, spheres, toruses, and other shapes, so let’s choose a sphere and define its radius as 0.05 meters like this:

node.geometry = SCNSphere(radius: 0.05)

To make the sphere visible, let’s give it a color. To do this, we need to define the node’s material, which defines its outer surface, such as:

node.geometry?.firstMaterial?.diffuse.contents = UIColor.yellow

These three lines of Swift code create a node, define the geometry of that node as a sphere, and then color the outside surface of that sphere yellow. Now the only remaining task is to place that node at a specific location based on the world origin. To do this, we need to define the node’s position like this:

node.position = SCNVector3(0,0,0)

Since the node needs x, y, and z coordinates, the position of the node must be defined by three specific values as well.
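As a point of reference for choosing those three values: ARKit uses a right-handed coordinate system in which the x-axis points right, the y-axis points up, and the z-axis points toward the viewer, so a negative z value places a node in front of the camera’s starting position. For example:

```swift
// Half a meter in front of the world origin (negative z points
// away from the viewer in ARKit's right-handed coordinate system)
node.position = SCNVector3(0, 0, -0.5)
```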
Defining the x, y, and z positions as 0 means that the node will appear at the world origin, where the x-, y-, and z-axes intersect.

After defining a geometric shape, its dimensions, its color, and its position, the final step is to add that node to the existing scene so it actually appears in the augmented reality view. To do this, you just need one final line of code as follows:

sceneView.scene.rootNode.addChildNode(node)

This line of code adds the node (the sphere) to the root node of the augmented reality scene. The root node defines the hierarchy of items displayed in an augmented reality view.

To see how this code works to display a yellow sphere at the world origin, follow these steps:

1. Modify the World Tracking project, or create a new project identical to the World Tracking project except with the name Node Placement.
2. Modify the ViewController.swift file so the code looks like this:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.delegate = self
        sceneView.showsStatistics = true
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin]
        showShape()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(configuration)
    }

    @IBAction func resetButton(_ sender: UIButton) {
        sceneView.session.pause()
        sceneView.session.run(configuration, options: [.resetTracking])
        showShape()
    }

    func showShape() {
        let node = SCNNode()
        node.geometry = SCNSphere(radius: 0.05)
        node.geometry?.firstMaterial?.diffuse.contents = UIColor.yellow
        node.position = SCNVector3(0,0,0)
        sceneView.scene.rootNode.addChildNode(node)
    }
}

3. Connect an iOS device to your Macintosh through its USB cable.
4.
Click the Run button or choose Product ➤ Run. A yellow sphere appears at the world origin, as shown in Figure 3-8.

Figure 3-8. Displaying a yellow sphere at the world origin

5. Move your iOS device to a new location and tap the Reset button. Notice that the world coordinates now appear in a different location, with a yellow sphere at the origin.
6. Click the Stop button or choose Product ➤ Stop.

Making a yellow sphere appear at the origin is fine, but you can experiment with different values for the x, y, and z coordinates of the sphere besides 0,0,0. Try modifying the following line of code with different values for the node’s position, such as:

node.position = SCNVector3(0.2, -0.4, 0.1)

Remember, these values define meters, so if you choose too large a value, such as 10 meters, the yellow sphere will appear too far away to see within the augmented reality view. Experiment with small values such as -0.4 or 0.2.

Adding and Removing Multiple Objects

In the previous app example, we displayed the world origin, which appears at the current iOS device’s location. Then we displayed a yellow sphere at a specific location. Unfortunately, the x, y, and z coordinates for the yellow sphere remain fixed in code. If we want the yellow sphere to appear in another location, or if we want to display additional yellow spheres, we can’t do that.

For more versatility, let’s put an Add button on the user interface. Each time the user taps the Add button, it will add a new yellow sphere. Of course, adding multiple yellow spheres won’t look any different if all the spheres share the same x, y, and z coordinates, so let’s also add three sliders that let us define new x, y, and z coordinates for a sphere before adding it to the augmented reality view.

To do this, we’ll need to resize the height of the ARKit SceneKit View and add three UISliders at the bottom, along with three labels to identify which axis each slider defines.
To do this, follow these steps:

1. Click the Main.storyboard file in the Navigator pane.
2. Resize the height of the ARKit SceneKit View to make more room near the bottom.
3. Add a new UIButton next to the existing Reset button and give this new UIButton a caption of Add.
4. Add three UISliders.
5. Add three labels and modify their captions to display X, Y, and Z. Your user interface should look similar to Figure 3-9.

Figure 3-9. Redesigning the user interface with three sliders

This completes the user interface changes. Let’s add constraints by choosing Edit ➤ Select All (or pressing Command+A). Then choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints. Xcode adds constraints to your labels, buttons, and sliders.

Now we need to modify this user interface in two ways. First, we need to define the minimum and maximum values for each slider, along with a default value. Second, we need to connect the three sliders to our ViewController.swift file as IBOutlets. In addition, we also need to connect the Add button as an IBAction method.

To define the minimum and maximum values for each slider, follow these steps:

1. Click on each slider.
2. Click the Show Attributes Inspector icon or choose View ➤ Inspectors ➤ Show Attributes Inspector. Xcode displays the Attributes Inspector pane.
3. Change the Value property to 0.
4. Change the Minimum property to -1.
5. Change the Maximum property to 1, as shown in Figure 3-10. This lets you choose a value from -1 meter to 1 meter for defining a coordinate for placing a sphere in the augmented reality view.

Figure 3-10. Modifying the properties of a slider

6. Make sure you change the Value, Minimum, and Maximum properties identically for all three sliders.

Now that we’ve defined the slider values, we need to connect all three sliders and the Add button to the ViewController.swift file. To do this, follow these steps:

1.
Click on the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor. Xcode displays the ViewController.swift file side by side with the Main.storyboard file.
2. Click on each slider, hold down the Control key, and drag the mouse to the ViewController.swift file under the existing IBOutlet, as shown in Figure 3-11.

Figure 3-11. Creating an IBOutlet for a slider

3. Release the Control key and the mouse. A popup menu appears.
4. Click in the Name text field and type a descriptive name such as Xslider, Yslider, or Zslider. You should create three IBOutlets that represent the x, y, and z coordinates as follows:

@IBOutlet var Xslider: UISlider!
@IBOutlet var Yslider: UISlider!
@IBOutlet var Zslider: UISlider!
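Once the three sliders are connected, the Add button’s IBAction method can read the slider values and place a new sphere at those coordinates. The following is a sketch of where this is heading (the method name addButton is an assumption here, not taken from the steps above; a UISlider’s value property is a Float, which matches the components of SCNVector3 on iOS):

```swift
// Hypothetical IBAction for the Add button: reads the three slider
// values (each between -1 and 1 meter) and adds a yellow sphere there.
@IBAction func addButton(_ sender: UIButton) {
    let node = SCNNode()
    node.geometry = SCNSphere(radius: 0.05)
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.yellow
    node.position = SCNVector3(Xslider.value, Yslider.value, Zslider.value)
    sceneView.scene.rootNode.addChildNode(node)
}
```

Because each tap creates a fresh SCNNode, tapping Add repeatedly with different slider settings places multiple spheres in the scene, unlike the fixed-coordinate showShape() version.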