Apple Offers New Technologies For App Development

Apple unveils new technologies to make app development easier and faster.

At WWDC, Apple unveiled a range of new technologies that the company claims will make it significantly easier and faster for developers to create powerful new apps.
The company said that SwiftUI is a revolutionary development framework that makes creating powerful user interfaces easier than ever before. In the AR field, ARKit 3, RealityKit and Reality Composer are tools that make it even easier to create compelling AR experiences. Additionally, new updates to Core ML and Create ML enable more powerful and streamlined on-device machine learning apps.
Source: Apple
"SwiftUI saves developers time by providing a huge amount of automatic functionality including interface layout, Dark Mode, Accessibility, right-to-left language support and internationalization. SwiftUI apps run natively and are lightning fast. And because SwiftUI is the same API built into iOS, iPadOS, macOS, watchOS and tvOS, developers can more quickly and easily build rich, native apps across all Apple platforms," the company wrote.
According to the company, a new graphical UI design tool built into Xcode 11 makes it even easier for developers to quickly assemble a user interface with SwiftUI, without having to write any code.
As the interface is assembled in the design tool, the corresponding Swift code is generated automatically; when that code is modified, the changes instantly appear in the visual design tool. Developers can therefore see automatic, real-time previews of how the UI will look and behave as they assemble, test and refine their code. This ability to move fluidly between graphical design and writing code makes UI development more efficient, and helps software developers and UI designers collaborate more closely. These previews can run directly on connected Apple devices, including iPhone, iPad, iPod touch, Apple Watch and Apple TV, letting developers see how an app responds to Multi-Touch, or how it works with the camera and on-board sensors, live, as the interface is being built.
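To make the declarative style concrete, here is a minimal sketch of a SwiftUI view with a live Xcode preview. The view name and its contents are illustrative, not taken from Apple's examples; the `PreviewProvider` conformance is what drives the real-time canvas preview described above.

```swift
import SwiftUI

// A minimal SwiftUI view: the body declaratively describes the interface,
// and the framework handles layout, Dark Mode and accessibility automatically.
struct GreetingView: View {
    @State private var name = ""

    var body: some View {
        VStack(spacing: 12) {
            TextField("Enter a name", text: $name)
                .textFieldStyle(RoundedBorderTextFieldStyle())
            Text("Hello, \(name.isEmpty ? "world" : name)!")
                .font(.headline)
        }
        .padding()
    }
}

// PreviewProvider powers the automatic, real-time preview in Xcode's canvas;
// edits to the code above are reflected in the preview as you type.
struct GreetingView_Previews: PreviewProvider {
    static var previews: some View {
        GreetingView()
    }
}
```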
ARKit 3 lets developers integrate people’s movement into apps and enables face tracking for up to three faces, while RealityKit provides a variety of animation, camera effects and audio features.
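A hedged sketch of how an app might opt into ARKit 3's body tracking: `ARBodyTrackingConfiguration` and `ARBodyAnchor` are the ARKit 3 APIs for tracking people, but the class and session wiring here are illustrative, not a complete app.

```swift
import ARKit

// Illustrative controller that runs an ARKit 3 body-tracking session.
final class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking requires supported hardware; check before running.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    // ARKit delivers updated anchors as it tracks the scene;
    // ARBodyAnchor carries the skeleton of a tracked person.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Read a joint transform from the tracked skeleton.
            _ = body.skeleton.modelTransform(for: .root)
        }
    }
}
```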
The company said that with Core ML, apps can now use state-of-the-art models to deliver experiences that deeply understand vision, natural language and speech like never before.
Core ML also enables developers to update machine learning models on-device using model personalization. This allows developers to provide personalized features without compromising user privacy.
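On-device personalization is exposed through Core ML's `MLUpdateTask` API. The sketch below assumes an updatable compiled model and a batch of user examples; the function name, URL and data provider are placeholders for illustration.

```swift
import CoreML

// Hedged sketch: retrain an updatable Core ML model on-device with user data.
// The model URL points at a compiled, updatable .mlmodelc; trainingData holds
// user-specific examples. Nothing leaves the device, preserving privacy.
func personalize(modelURL: URL, trainingData: MLBatchProvider) throws {
    let task = try MLUpdateTask(
        forModelAt: modelURL,
        trainingData: trainingData,
        configuration: MLModelConfiguration(),
        completionHandler: { context in
            // Persist the personalized model so future predictions use it.
            try? context.model.write(to: modelURL)
        }
    )
    task.resume()
}
```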
Interestingly, Create ML, Apple's dedicated app for machine learning development, allows developers to build machine learning models without writing code. With Create ML, multiple-model training with different datasets can be used with new types of models such as object detection, activity classification and sound classification.
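Alongside the no-code app, the same training capabilities are available programmatically through the CreateML Swift framework on macOS. A hedged sketch, assuming a directory of audio files grouped into labeled subfolders (the paths here are placeholders):

```swift
import CreateML
import Foundation

// Illustrative: train one of the new model types, a sound classifier,
// from a directory whose subfolder names serve as class labels.
let trainingDir = URL(fileURLWithPath: "/path/to/SoundData")
let classifier = try MLSoundClassifier(
    trainingData: .labeledDirectories(at: trainingDir)
)

// Export the trained model for use with Core ML in an app.
try classifier.write(to: URL(fileURLWithPath: "/path/to/SoundClassifier.mlmodel"))
```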
You can now also integrate Apple’s hardware, such as the Apple Pencil, and software features, such as Siri, into your own apps.
The launch of watchOS 6 and the App Store on Apple Watch also enables developers to build and design apps for Apple Watch that work completely independently, even without an iPhone. Developers can also take advantage of the Apple Neural Engine on Apple Watch Series 4 using Core ML.