This article summarizes the key developer-related features introduced in iOS 8, which runs on currently shipping iOS devices. The article also lists the documents that describe new features in more detail.
iOS 8 lets you extend select areas of the system by supplying an app extension, which is code that enables custom functionality within the context of a user task. For example, you might supply an app extension that helps users post content to your social sharing website. After users install and enable this extension, they can choose it when they tap the Share button in their current app. Your custom sharing extension provides the code that accepts, validates, and posts the user’s content. The system lists the extension in the sharing menu and instantiates it when the user chooses it.
In Xcode, you create an app extension by adding a preconfigured app extension target to an app. After a user installs an app that contains an extension, the user can enable the extension in the Settings app. When the user is running other apps, the system makes the enabled extension available in the appropriate system UI, such as the Share menu.
iOS supports app extensions for the following areas, which are known as extension points:
Share. Share content with social websites or other entities.
Action. Perform a simple task with the selected content.
Today. Provide a quick update or enable a brief task in the Today view of Notification Center.
Photo editing. Perform edits to a photo or video within the Photos app.
Storage provider. Provide a document storage location that can be accessed by other apps. Apps that use a document picker view controller can open files managed by the Storage Provider or move files into the Storage Provider.
Custom keyboard. Provide a custom keyboard that the user can choose in place of the system keyboard for all apps on the device.
Each extension point defines appropriate APIs for its purposes. When you use an app extension template to begin development, you get a default target that contains method stubs and property list settings defined by the extension point you chose.
For more information on creating extensions, see App Extension Programming Guide.
Touch ID Authentication
Your app can now use Touch ID to authenticate the user. Some apps may need to secure access to all of their content, while others might need to secure only certain pieces of information or options. In either case, you can require the user to authenticate before proceeding. Use the Local Authentication framework (LocalAuthentication.framework) to display an alert to the user with an app-specified reason for authenticating. When your app gets a reply, it can react based on whether the user authenticated successfully.
For more information, see Local Authentication Framework Reference.
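A minimal sketch of that flow might look like the following; the localized reason string and the fallback behavior are placeholders:

```swift
import LocalAuthentication

// A sketch of Touch ID authentication: check availability first,
// then ask the user to authenticate with an app-specified reason.
func unlockProtectedContent() {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        // Biometrics unavailable; fall back to your own passcode UI.
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account notes") { success, evalError in
        if success {
            // Show the protected content.
        } else {
            // Keep content locked or offer an alternative.
        }
    }
}
```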
Take better photos in your app, provide new editing capabilities to the Photos app, and create new, more efficient workflows that access the user’s photo and video assets.
The Photos framework (Photos.framework) provides new APIs for working with photo and video assets, including iCloud Photos assets, that are managed by the Photos app. This framework is a more capable alternative to the Assets Library framework. Key features include a thread-safe architecture for fetching and caching thumbnails and full-sized assets, requesting changes to assets, observing changes made by other apps, and resumable editing of asset content.
For more information, see Photos Framework Reference.
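For example, fetching assets and requesting a thumbnail might look like this sketch (it assumes the user has granted photo library access; the target size is arbitrary):

```swift
import Photos

// Fetch all accessible image assets, then request a thumbnail
// for the first one via the shared image manager.
func loadFirstThumbnail() {
    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    guard let asset = assets.firstObject else { return }

    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 160, height: 160),
                                          contentMode: .aspectFill,
                                          options: nil) { image, info in
        // The handler may run more than once as higher-quality
        // versions of the image become available.
    }
}
```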
Use the related Photos UI framework (PhotosUI.framework) to create app extensions for editing image and video assets in the Photos app. For more information, see App Extension Programming Guide.
Manual Camera Controls
The AV Foundation framework (AVFoundation.framework) makes it easier than ever to take great photos. Your app can take direct control over the camera focus, white balance, and exposure settings. In addition, your app can use bracketed exposure captures to automatically capture images with different exposure settings.
For more information, see AV Foundation Framework Reference.
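As a sketch, locking a custom exposure on a capture device might look like the following; the duration and ISO values are placeholders and must fall within the device's supported ranges:

```swift
import AVFoundation

// Lock a custom exposure duration and ISO on a capture device.
// The device must be locked for configuration before changing settings.
func lockCustomExposure(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        let duration = CMTimeMake(value: 1, timescale: 60) // 1/60 s, illustrative
        device.setExposureModeCustom(duration: duration, iso: 200,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        // The device could not be locked for configuration.
    }
}
```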
Technology improvements in iOS 8 make it easier than ever to implement your game’s graphics and audio features. Take advantage of high-level frameworks for ease-of-development, or use new low-level enhancements to harness the power of the GPU.
Metal provides extremely low-overhead access to the A7 GPU, enabling incredibly high performance for your sophisticated graphics rendering and computational tasks. Metal eliminates many performance bottlenecks—such as costly state validation—that are found in traditional graphics APIs. Metal is explicitly designed to move all expensive state translation and compilation operations out of the critical path of your most performance-sensitive rendering code. Metal provides precompiled shaders, state objects, and explicit command scheduling to ensure your application achieves the highest possible performance and efficiency for your GPU graphics and compute tasks. This design philosophy extends to the tools used to build your app. When your app is built, Xcode compiles the Metal shaders in the project into a default library, eliminating most of the runtime cost of preparing those shaders.
Graphics, compute, and blit commands are designed to be used together seamlessly and efficiently. Metal is specifically designed to exploit modern architectural considerations, such as multiprocessing and shared memory, to make it easy to parallelize the creation of GPU commands.
With Metal, you have a streamlined API, a unified graphics and compute shading language, and Xcode-based tools, so you don’t need to learn multiple frameworks, languages and tools to take full advantage of the GPU in your game or app.
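A minimal setup sketch follows; it assumes shaders were compiled into the default library by Xcode, and "basic_vertex" is a hypothetical shader function name:

```swift
import Metal

// Obtain the Metal device, a command queue for scheduling work,
// and a precompiled shader function from the default library.
func makeMetalEssentials() -> (MTLDevice, MTLCommandQueue, MTLFunction)? {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let library = device.makeDefaultLibrary(),
          let vertexFunction = library.makeFunction(name: "basic_vertex") else {
        return nil // Metal unavailable, or the shader name is wrong.
    }
    return (device, queue, vertexFunction)
}
```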
Scene Kit is an Objective-C framework for building simple games and rich app user interfaces with 3D graphics, combining a high-performance rendering engine with a high-level, descriptive API. Scene Kit has been available since OS X v10.8 and is now available in iOS for the first time. Lower-level APIs (such as OpenGL ES) require you to implement the rendering algorithms that display a scene in precise detail. By contrast, Scene Kit lets you describe your scene in terms of its content—geometry, materials, lights, and cameras—then animate it by describing changes to those objects.
Scene Kit’s 3D physics engine enlivens your app or game by simulating gravity, forces, rigid body collisions, and joints. Add high-level behaviors that make it easy to use wheeled vehicles such as cars in a scene, and add physics fields that apply radial gravity, electromagnetism, or turbulence to objects within an area of effect.
Use OpenGL ES to render additional content into a scene, or provide GLSL shaders that replace or augment Scene Kit’s rendering. You can also add shader-based post-processing techniques to Scene Kit’s rendering, such as color grading or screen space ambient occlusion.
For more information, see Scene Kit Framework Reference.
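The descriptive style can be sketched as follows: the scene is built from content objects, and animation is expressed as a change to apply, not as rendering steps:

```swift
import SceneKit

// Describe a scene by its content: one box, one light, one animation.
func makeDemoScene() -> SCNScene {
    let scene = SCNScene()

    let box = SCNNode(geometry: SCNBox(width: 1, height: 1,
                                       length: 1, chamferRadius: 0.1))
    scene.rootNode.addChildNode(box)

    let lightNode = SCNNode()
    lightNode.light = SCNLight()
    lightNode.light?.type = .omni
    lightNode.position = SCNVector3(0, 5, 5)
    scene.rootNode.addChildNode(lightNode)

    // Animate by describing the change; Scene Kit handles rendering.
    box.runAction(SCNAction.rotateBy(x: 0, y: .pi, z: 0, duration: 2))
    return scene
}
```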
The Sprite Kit framework (SpriteKit.framework) adds new features to make it easier to support advanced game effects. These features include support for custom OpenGL ES shaders and lighting, integration with Scene Kit, and advanced new physics effects and animations. For example, you can create physics fields to simulate gravity, drag, and electromagnetic forces using the SKFieldNode class. Physics bodies can now easily be created with per-pixel collision masks. And it is easier than ever to pin a physics body to its parent, even if its parent does not have a physics body of its own. These new physics features make complex simulations much easier to implement.
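A physics-field sketch might look like the following; the texture name is a placeholder:

```swift
import SpriteKit

// A radial gravity field that attracts nearby physics bodies,
// plus one sprite whose body responds to it.
func makeFieldScene() -> SKScene {
    let scene = SKScene(size: CGSize(width: 320, height: 480))

    let field = SKFieldNode.radialGravityField()
    field.strength = 3.0
    field.region = SKRegion(radius: 100) // limit the area of effect
    scene.addChild(field)

    let ball = SKSpriteNode(imageNamed: "ball") // hypothetical texture
    ball.physicsBody = SKPhysicsBody(circleOfRadius: 10)
    scene.addChild(ball)

    return scene
}
```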
Use constraints to modify the effects of physics and animations on the content of your scene—for example, you can make one node always point toward another node regardless of where the two nodes move.
Xcode 6 also incorporates new shader and scene editors that save you time as you create your game. Create a scene’s contents, specifying which nodes appear in the scene and characteristics of those nodes, including physics effects. The scene is then serialized to a file that your game can easily load.
AV Audio Engine
The AV Foundation framework (AVFoundation.framework) adds support for a broad cross-section of audio functionality at a higher level of abstraction than Core Audio. These new audio capabilities are available on both OS X and iOS and include automatic access to audio input and output hardware, audio recording and playback, and audio file parsing and conversion. You also gain access to audio units for generating special effects and filters, pitch and playback speed management, stereo and 3D audio environments, and MIDI instruments.
For more information, see AV Foundation Framework Reference.
Health Kit Framework
Health Kit (HealthKit.framework) is a new framework for managing a user’s health-related information. With the proliferation of apps and devices for tracking health and fitness information, it's difficult for users to get a clear picture of how they are doing. Health Kit makes it easy for apps to share health-related information, whether that information comes from devices connected to an iOS device or is entered manually by the user. The user’s health information is stored in a centralized and secure location. The user can then see all of that data displayed in the Health app.
When your app implements support for Health Kit, it gets access to health-related information for the user and can provide information about the user, without needing to implement support for specific fitness-tracking devices. The user decides which data should be shared with your app. Once data is shared with your app, your app can register to be notified when that data changes; you have fine-grained control over when your app is notified. For example, you could request that your app be notified whenever the user takes his or her blood pressure, or be notified only when a measurement shows that the user’s blood pressure is too high.
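Requesting access might be sketched like this; the choice of blood pressure as the data type is illustrative:

```swift
import HealthKit

// Ask for read access to one data type; the user decides what to share.
func requestBloodPressureAccess(from store: HKHealthStore) {
    guard HKHealthStore.isHealthDataAvailable(),
          let systolic = HKQuantityType.quantityType(
              forIdentifier: .bloodPressureSystolic) else {
        return // Health data is not available on this device.
    }

    store.requestAuthorization(toShare: nil, read: [systolic]) { completed, error in
        // `completed` means the request finished; to protect privacy,
        // Health Kit does not reveal whether read access was granted.
        // Query for samples and handle an empty result either way.
    }
}
```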
Home Kit Framework
Home Kit (HomeKit.framework) is a new framework for communicating with and controlling connected devices in a user’s home. New devices being introduced for the home are offering more connectivity and a better user experience. Home Kit provides a standardized way to communicate with those devices.
Your app can use Home Kit to communicate with devices that users have in their homes. Using your app, users can discover devices in their home and configure them. They can also create actions to control those devices. The user can group actions together and trigger them using Siri. Once a configuration is created, users can invite other people to share access to it. For example, a user might temporarily offer access to a house guest.
Use the Home Kit Accessory Simulator to test the communication of your Home Kit app with a device.
For more information, see Home Kit Framework Reference.
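The entry point is the home manager; a sketch of discovering the user's homes might look like this:

```swift
import HomeKit

// The home manager loads the user's Home Kit configuration
// asynchronously and reports it through its delegate.
class HomeController: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            // Each home exposes its configured accessories.
            print(home.name, home.accessories.count)
        }
    }
}
```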
iCloud includes some changes that impact the behavior of existing apps and that will affect users of those apps.
Document-Related Data Migration
The iCloud infrastructure is more robust and reliable when documents and data are transferred between user devices and the server. When a user installs iOS 8 and logs into the device with an iCloud account, the iCloud server performs a one-time migration of the documents and data in that user’s account. This migration involves copying the documents and data to a new version of the app’s container directory. This new container is accessible only to devices running iOS 8 or OS X v10.10. Devices running older operating systems will continue to have access to the original container, but changes made in that container will not appear in the new container and vice versa.
Cloud Kit (CloudKit.framework) provides a conduit for moving data between your app and iCloud. Unlike other iCloud technologies where data transfers happen transparently, Cloud Kit gives you control over when transfers occur. You can use Cloud Kit to manage all types of data.
Apps that use Cloud Kit directly can store data in a repository that is shared by all users. This public repository is tied to the app itself and is available even on devices without a registered iCloud account. As the app developer, you can manage the data in this container directly and see any changes made by users through the Cloud Kit dashboard.
For more information about the classes of this framework, see Cloud Kit Framework Reference.
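Saving a record to the public database might be sketched as follows; the "HighScore" record type and its fields are hypothetical names for this example:

```swift
import CloudKit

// Save one record to the app's public database. You control when
// this transfer happens, unlike document-based iCloud storage.
func submitScore() {
    let record = CKRecord(recordType: "HighScore") // hypothetical type
    record["player"] = "Alice" as NSString
    record["score"] = 1200 as NSNumber

    let publicDB = CKContainer.default().publicCloudDatabase
    publicDB.save(record) { savedRecord, error in
        if error != nil {
            // Handle network or quota errors; retry transient failures.
        }
    }
}
```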
The document picker view controller (UIDocumentPickerViewController) grants users access to files outside your application’s sandbox. It is a simple mechanism for sharing documents between apps. It also enables more complex workflows, because users can edit a single document with multiple apps.
The document picker lets you access files from a number of document providers. For example, the iCloud document provider grants access to documents stored inside another app’s iCloud container. Third-party developers can provide additional document providers by using the Storage Provider extension.
For more information, see the Document Picker Programming Guide.
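Presenting a picker in import mode might look like this sketch (shown here with modern Swift spellings; the PDF document type is an arbitrary choice):

```swift
import UIKit
import MobileCoreServices

// Present a document picker that imports a copy of a file
// chosen from any installed document provider.
class DocumentsViewController: UIViewController, UIDocumentPickerDelegate {
    func pickDocument() {
        let picker = UIDocumentPickerViewController(
            documentTypes: [kUTTypePDF as String], in: .import)
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        // In import mode, copies land in the app's temporary directory.
    }
}
```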
Handoff is a feature in OS X and iOS that extends the user experience of continuity across devices. Handoff enables users to begin an activity on one device, then switch to another device and resume the same activity there. For example, a user browsing a long article in Safari can move to an iOS device signed into the same Apple ID, and the same webpage automatically opens in Safari on iOS, with the same scroll position as on the original device. Handoff makes this experience as seamless as possible.
To participate in Handoff, an app adopts a small API in Foundation. Each ongoing activity in an app is represented by a user activity object that contains the data needed to resume an activity on another device. When the user chooses to resume that activity, the object is sent to the resuming device. Each user activity object has a delegate object that is invoked to refresh the activity state at opportune times, such as just before the user activity object is sent between devices.
If continuing an activity requires more data than is easily transferred by the user activity object, the resuming app has the option to open a stream to the originating app. Document-based apps automatically support activity continuation for users working with iCloud-based documents.
For more information, see Handoff Programming Guide.
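Publishing an activity might be sketched as follows; the activity type and userInfo keys are hypothetical and the type must also be listed under NSUserActivityTypes in the app's Info.plist:

```swift
import Foundation

// Describe the user's current task so another device can resume it.
func publishReadingActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.myapp.reading")
    activity.title = "Reading an article"
    // Keep userInfo small: just what's needed to resume the task.
    activity.userInfo = ["articleID": "12345", "scrollPosition": 0.42]
    activity.becomeCurrent() // advertise this activity to nearby devices
    return activity
}
```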
Unified Storyboards for Universal Apps
iOS 8 makes dealing with screen size and orientation much more versatile. It is easier than ever to create a single interface for your app that works well on both iPad and iPhone, adjusting to orientation changes and different screen sizes as needed. Design apps with a common interface and then customize them for different size classes. Adapt your user interface to the strengths of each form factor. You no longer need to create a specific iPad storyboard; instead target the appropriate size classes and tune your interface for the best experience.
There are two types of size classes in iOS 8: regular and compact. A regular size class denotes either a large amount of screen space, such as on an iPad, or a commonly adopted paradigm that provides the illusion of a large amount of screen space, such as scrolling on an iPhone. Every device is defined by a size class, both vertically and horizontally. iPad size classes shows the native size classes for the iPad. With the amount of screen space available, the iPad has a regular size class in the vertical and horizontal directions in both portrait and landscape orientations.
The iPhone has different size classes based on the orientation of the device. In portrait, the screen has a compact size class horizontally and a regular size class vertically. This corresponds to the common usage paradigm of scrolling vertically for more information. When the device is in landscape, it has a compact size class both horizontally and vertically. iPhone size classes shows the native classes for the iPhone.
Every view has a size class associated with it that you can change. This flexibility is especially useful when a smaller view is contained within a larger view. You can use the default size classes to arrange the user interface of the larger view and arrange information in the subview based on a different size class combination.
To support size classes, the following classes are new or modified:
The UITraitCollection class is used to describe a collection of traits assigned to an object. Traits specify the size class, display scale, and idiom for a particular object. Classes that support the UITraitEnvironment protocol (such as UIView) own a trait collection. You can retrieve an object’s trait collection and perform actions when those traits change.
The UIImageAsset class is used to group like images together based on their traits. Combine similar images with slightly different traits into a single asset, and then automatically retrieve the correct image for a particular trait collection from the image asset. The UIImage class has been modified to work with image assets.
Classes that support the UIAppearance protocol can customize an object’s appearance based on its trait collection.
The UIViewController class adds the ability to retrieve the trait collection for a child view controller. You can also change the size class used to lay out a child view controller through the setOverrideTraitCollection:forChildViewController: method.
Xcode 6 supports unified storyboards. Add or remove views and layout constraints based on the size class that the view controller is displayed in. Use Xcode 6 to test your app in a variety of size classes and screen sizes, making it easier than ever to design interfaces that adapt to the conditions under which they are running. Rather than maintaining two separate (but similar) storyboards, you can make a single storyboard for multiple size classes.
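Reacting to a size-class change in code might be sketched like this:

```swift
import UIKit

// Rearrange content when the horizontal size class changes:
// compact (e.g. iPhone portrait) stacks vertically, regular
// (e.g. iPad) places content side by side.
class AdaptiveViewController: UIViewController {
    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        if traitCollection.horizontalSizeClass == .compact {
            // Activate the narrow, stacked layout constraints.
        } else {
            // Activate the wide, side-by-side layout constraints.
        }
    }
}
```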
Additional Framework Changes
In addition to the major changes described above, iOS 8 includes other improvements.
Many frameworks on iOS have adopted small interface changes that take advantage of modern Objective-C syntax:
Getter and setter methods are replaced by properties in most classes. Code using the existing getter and setter methods should continue to work with this change.
Initialization methods are updated to have a return value of instancetype instead of id.
Designated initializers are declared as such where appropriate.
In most cases, these changes do not require any additional work in your own app. However, you may also want to implement these changes in your own Objective-C code. In particular, you may want to modernize your Objective-C code for the best experience when interoperating with Swift code.
For more information, see Adopting Modern Objective-C.
AV Foundation Framework
The AV Foundation framework (AVFoundation.framework) enables you to capture metadata over time while shooting video. Arbitrary types of metadata can be embedded with a video recording at various points in time. For example, you might record the current physical location in a video created by a moving camera device.
For information about the classes of this framework, see AV Foundation Framework Reference.
AV Kit Framework
The AV Kit framework (AVKit.framework), previously introduced on OS X, is now available on iOS. Use it instead of the Media Player framework when you need to display video.
Core Image Framework
The Core Image framework (CoreImage.framework) has the following changes:
You can create custom image kernels in iOS.
Core Image detectors can detect rectangles and QR codes in an image.
For information about the classes of this framework, see Core Image Reference Collection.
Core Location Framework
The Core Location framework (CoreLocation.framework) has the following changes:
You can determine which floor the device is on, if the device is in a multistory building.
The visit service provides an alternative to the significant location change service for apps that need location information about interesting places visited by the user.
For information about the classes of this framework, see Core Location Framework Reference.
The Foundation framework (Foundation.framework) includes the following enhancements:
The NSFileVersion class provides access to past versions of iCloud documents. These versions are stored in iCloud but can be downloaded on request.
The NSURL class supports storing document thumbnails as metadata.
The NSMetadataQuery class can search for external iCloud documents that your app has opened.
Game Controller Framework
The Game Controller framework (GameController.framework) has the following changes:
If the controller is attached to a device, you can now receive device motion data directly from the Game Controller framework.
If you are working with button inputs and do not care about pressure sensitivity, a new handler can call your game only when the button’s pressed state changes.
Game Kit Framework
The Game Kit framework (GameKit.framework) has the following changes:
Features that were added in iOS 7 are available on OS X v10.10, making it easier to use these features in a cross-platform game.
The GKSavedGame class makes it easy to save and restore a user’s progress. The data is stored in iCloud; Game Kit does the necessary work to synchronize the files between the device and iCloud.
Methods and properties that use player identifier strings are now deprecated. Instead, use GKPlayer objects to identify players. Replacement properties and methods that take GKPlayer objects have been added.
The iAd framework (iAd.framework) adds the following new features:
If you are using AV Kit to play a video, you can play preroll advertisements before the video is played.
You can look up more information about the effectiveness of advertisements for your app.
For information about the classes of this framework, see iAd Framework Reference.
Media Player Framework
Two Media Player framework (MediaPlayer.framework) classes are extended with new metadata information.
For information about the classes of this framework, see Media Player Framework Reference.
Sprite Kit Framework Changes
The Sprite Kit framework (SpriteKit.framework) adds many new features:
An SKShapeNode object can specify textures to be used when the shape is either stroked or filled.
The SKEffectNode and SKScene classes include support for custom rendering. Use the SKShader and SKUniform classes to compile an OpenGL ES 2.0-based fragment shader and provide input data to the shader.
SKSpriteNode objects can provide lighting information so that Sprite Kit automatically generates lighting effects and shadows. Add SKLightNode objects to the scene to specify the lights, and then customize the properties on these lights and any sprites to determine how the scene is lit.
The SKFieldNode class provides a number of physics special effects you can apply to a scene. For example, create magnetic fields, add drag effects, or even generate randomized motion. All effects are constrained to a specific region of the scene, and you can carefully tune both the effect’s strength and how quickly the effect weakens with distance. Field nodes make it easy to drop in an effect without having to search the entire list of physics bodies and apply forces to them.
The SK3DNode class is used to integrate a Scene Kit scene into your game as a sprite. Each time Sprite Kit renders your scene, it renders the 3D scene node first to generate a texture, and then uses that texture to render a sprite. Creating 3D sprites can help you avoid creating dozens of frames of animation to produce an effect.
New actions have been added, including support for inverse kinematic animations.
A new system of constraints has been added to scene processing. Constraints are applied after physics is simulated and can be used to specify a set of rules for how a node is positioned and oriented. For example, you can use a constraint to specify that a particular node in the scene always points at another node in the scene. Constraints make it easier to implement rendering rules in your game without having to manually tweak the scene in your event loop.
A scene can implement all of the run-loop stages in a delegate instead. Using a delegate often means that you can avoid needing to subclass the SKScene class. The SKView class provides more debugging information. You can also provide more performance hints to the renderer.
You can create normal map textures for use in lighting and physics calculations (or inside your own custom shaders). Use the new SKMutableTexture class when you need to create textures whose contents are dynamically updated.
You can dynamically generate texture atlases at runtime from a collection of textures.
Xcode 6 also incorporates many new Sprite Kit editors. Create or edit the contents of scenes directly, specifying the nodes that appear in the scene as well as their physics bodies and other characteristics. This scene is serialized to a file and can be loaded directly by your game. The editors save you time because often you don’t need to implement your own custom editors to create your game’s assets.
The UIKit framework (UIKit.framework) includes the following enhancements:
Apps that use local or push notifications must explicitly register the types of alerts they display to users by using a UIUserNotificationSettings object. This registration process is separate from the process for registering for remote notifications, and users must grant permission before notifications with the requested options can be delivered.
Local and push notifications can include custom actions as part of an alert. Custom actions appear as buttons in the alert. When tapped, your app is notified and asked to perform the corresponding action. Local notifications can also be triggered by interactions with Core Location regions.
Collection views support dynamically changing the size of cells. Typically, you use this support to accommodate changes to the preferred text size, but you can adapt it for other scenarios too. Collection views also support more options for invalidating different portions of the layout and thereby improving performance.
The UISearchController class replaces the UISearchDisplayController class for managing the display of search-related interfaces.
The UISplitViewController class is now supported on iPhone as well as iPad. The class adjusts its presented interface to adapt to the available space. It also changes the way it shows and hides the primary view controller, giving you more control over how to display the split view interface.
The UINavigationController class has new options for changing the size of the navigation bar or hiding it altogether by using gestures.
The UIVisualEffect class enables you to integrate custom blur effects into your view hierarchies.
The UIPresentationController class lets you separate the content of your view controllers from the chrome used to display them.
The UIPopoverPresentationController class handles the presentation of content in a popover. The existing UIPopoverController class uses the popover presentation controller to show popovers on the screen.
The UIPrinterPickerController class offers a view controller-based way to display a list of printers and to select one to use during printing. Printers are represented by instances of the new UIPrinter class.
For information about the classes of this framework, see UIKit Framework Reference.
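The notification-registration change above might be sketched like this; the chosen alert types are illustrative:

```swift
import UIKit

// Register the alert styles the app intends to use; the system asks
// the user for permission the first time this registration runs.
func registerForUserNotifications() {
    let settings = UIUserNotificationSettings(types: [.alert, .badge, .sound],
                                              categories: nil)
    UIApplication.shared.registerUserNotificationSettings(settings)
}
```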
Video Toolbox Framework
The Video Toolbox framework (VideoToolbox.framework) includes direct access to hardware video encoding and decoding.
The following APIs are deprecated:
UIApplication methods and properties for registering notifications. Use the new API instead.
UIViewController methods and properties for interface orientation. Traits and size classes replace them, as described in Unified Storyboards for Universal Apps. There are other smaller changes to the UIKit API to support size classes; older interfaces that used specific device idioms have often been replaced.
Methods and properties in Game Kit that use player identifier strings.
For a complete list of specific API deprecations, see iOS 8.0 API Diffs.