
<strong>Developers can now create apps using the Apple Vision Pro SDK.</strong>

Apple has announced the visionOS SDK to help developers build apps for the Apple Vision Pro mixed reality headset. The company plans to release its first spatial computer in the US next year, and it is giving app developers the resources they need to build software for it.

Kapish Khajuria
Apple Vision Pro SDK



The upcoming mixed reality headset, unlike Apple's existing devices, relies on three distinct means of interaction: eyes, hands, and voice. With the visionOS SDK, developers can adapt their apps to take advantage of these input methods and the device's specialized hardware.

What purpose would the SDK serve?

The SDK builds on the same foundational frameworks used across all of Apple's operating systems, and it works with familiar development tools including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight.
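To illustrate how these familiar frameworks fit together, here is a minimal sketch of a visionOS app: a standard SwiftUI window plus a volumetric window whose 3D content comes from RealityKit. The app and window names are illustrative, not from Apple's announcement.

```swift
import SwiftUI
import RealityKit

// A minimal visionOS app sketch (hypothetical names):
// one ordinary 2D window and one volumetric window
// that renders a RealityKit sphere.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        // A regular SwiftUI window, as on other Apple platforms.
        WindowGroup {
            Text("Hello, visionOS")
                .font(.extraLargeTitle)
        }

        // A volumetric window hosting 3D content via RealityView.
        WindowGroup(id: "sphere-volume") {
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
        .windowStyle(.volumetric)
    }
}
```

Because SwiftUI scenes and RealityKit entities are the same APIs used elsewhere in Apple's ecosystem, an existing iOS codebase can often reuse much of its view layer while adding spatial scenes like the volumetric window above.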


Apple's Vice President of Worldwide Developer Relations, Susan Prescott, stated, "Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users." 

According to an announcement on the Apple Developer website, the visionOS SDK is now available. Developers can download Xcode 15 beta 2, which includes the latest visionOS SDK along with Reality Composer Pro, a tool for displaying and previewing 3D content on the Apple Vision Pro.

According to the blog post, developers can use a visionOS simulator to interact with their apps during development and to test how the apps look under various lighting conditions and room layouts. With the SDK, they can adapt an existing app project for the headset or build a brand-new application.


With the introduction of Reality Composer Pro, Apple is also broadening the scope of its developer tools. This Xcode companion simplifies previewing 3D models, images, sounds, and animations on the headset, and the simulator provides a virtual environment without requiring the actual hardware. The absence of gaming experiences in the initial presentation will be addressed when Unity development tools are incorporated next month.

The announcement further emphasizes the Vision Pro's focus on enterprise applications. Stephen Prideaux-Ghee, PTC's chief technology officer for augmented and virtual reality, discussed how manufacturers can use PTC's augmented reality solutions with Apple Vision Pro to bring interactive 3D content into the real world and collaborate on critical business issues.

As a result, stakeholders across departments and locations can review content simultaneously, speeding up design and operational decisions.
