Six Tips for Porting to Apple Vision Pro E-book | Unity

We’ve entered a new era of spatial computing, where robust extended reality (XR) tools and flexible workflows enable developers to add interactions, scale graphics, prototype, and test in-Editor. The launch of devices like Apple Vision Pro has given the community an unparalleled opportunity to create all-new optimized experiences and take their games and apps to the next dimension.

When developing for a new and evolving platform, it’s important to:

  • Accept that obstacles are inevitable
  • Prepare as much as you can
  • Stay adaptable
  • Approach development deliberately, taking the time to stack the right building blocks
  • Focus on the 360-degree experience
  • Embrace technical reworks
  • Be collaborative and open
  • Turn to the community for advice

This e-book equips you with best practices and technical tricks to bring your game to the next level. It outlines the intricacies of building spatial experiences with Unity for visionOS and includes a behind-the-scenes look at how beloved titles like Job Simulator, Vacation Simulator, LEGO® Builder’s Journey, WHAT THE GOLF?, and Synth Riders came to life on Apple Vision Pro.

It covers the following six expert tips:

1. Consider the UX/UI

2. Choose the right mode for your content – mixed reality or fully immersive

3. Convert from controller-based input to spatial tap and/or hand tracking

4. Balance performance and visual fidelity

5. Prioritize iteration and testing

6. Adapt to new rendering and optimization processes

1. Consider the UX/UI

The complexity of a game’s user experience (UX) and user interface (UI) is a key consideration when bringing your game to Apple Vision Pro. The platform’s evolving nature can lead to additional challenges for games or apps with a heavy UI. When interviewing studio experts about optimizing a game’s UX/UI, their recommendations focused on implementing technical reworks to compensate for the lack of eye-tracking data, incorporating body movement into gameplay, and finding the fun.


Adapt to a lack of eye-tracking data

As the Kluge Interactive team began reworking the UI for Apple Vision Pro, they quickly realized that Apple prescribes specific ways to interact with elements and standardized button designs. Since eye-tracking data is not directly accessible to developers, they had to adapt and find a way to build a UI that worked within these constraints.

Without access to eye-tracking data, a custom gaze-highlighting solution was out of the question. To integrate with Apple Vision Pro’s built-in highlighting system instead, they recreated their entire UI in 3D.

“Using a standard canvas-based UI could work, but we experienced challenges around the layering of different transparent layers,” explains Miguel Mosquera, a developer at Kluge Interactive. “The main problem was the square shape that highlighted the canvas, the picture, and the 2D elements. It wasn’t aesthetically pleasing, and the highlight didn’t match the object.” The team moved to a 3D UI to avoid that issue, using a square mesh and an alpha texture to achieve the desired effect.

Recreating the UI in 3D ensured that eye-gaze highlights matched the shape of the buttons, but that alone wasn’t enough to make the UI feel natural on the platform. The highlight effect was always a soft white glow around the borders of an object’s silhouette, so menu designers needed to take that into account. For example, they avoided light colors and blocky borders in favor of softer gradients with darker colors.

Although this added a new workflow for the team, they embraced the new process and are pleased with the results. “It was worth it. The UI is much cleaner than it is on other devices,” says Arturo Perez, Kluge Interactive’s CEO.

Take user movement and degrees of freedom into account

For Light Brick Studio, porting LEGO Builder’s Journey to Apple Vision Pro was an easy decision. The game has minimal UI, containing at most 20 buttons for actions like loading saved games and restarting a level, so reworking it was a lower lift than it would be for many games. To ensure proper functionality while adhering to the design principles of visionOS, they redesigned the UI in Unity, attaching colliders to interactable UI elements (a component required for Apple Vision Pro’s “spatial tap” gesture to function correctly), as sketched below.
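
As a rough illustration, here is a minimal Unity C# sketch of sizing a collider to a UI element so a spatial tap can hit it. The helper and its names are hypothetical, not Light Brick Studio’s actual code:

```csharp
using UnityEngine;

// Hypothetical helper: ensure an interactable UI element has a collider so
// Apple Vision Pro's spatial tap gesture can hit it.
public static class SpatialTapColliderSetup
{
    // Adds a BoxCollider sized to the element's RectTransform if none exists.
    public static void EnsureCollider(RectTransform element, float depth = 0.01f)
    {
        if (element.TryGetComponent<Collider>(out _))
            return; // Already set up.

        var box = element.gameObject.AddComponent<BoxCollider>();
        Rect rect = element.rect;
        box.size = new Vector3(rect.width, rect.height, depth);
        box.center = rect.center; // Vector2 converts implicitly (z = 0).
    }
}
```

A helper like this could be run over every button at startup or from an editor script, so no element ships without the collider the gesture depends on.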

The team also had to consider user movement. While they always aimed to simulate 3D space on a 2D surface, adapting to a mixed reality device like Apple Vision Pro meant accounting for the player’s ability to move in any direction and view the content from all angles. Existing models, originally designed to be viewed only from specific angles, had to be reworked to ensure that no gaps were left in the model’s geometry. To optimize performance and reduce draw calls, multiple meshes were combined into single GameObjects.

“We were able to take into account that a player could walk all the way around and then do a full 360-degree optimization of the model,” says Mikkel Fredborg, the technical lead at Light Brick Studio. “We got rid of all the geometry that was inside the bricks when we imported them into the game, so we removed all the interior details, as you never see them. We were able to take the automated tools that we had and adjust them to work with the full simulation experience.”
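
The mesh merging described above can be approximated with Unity’s Mesh.CombineMeshes. The sketch below assumes the children share a single material and that interior faces have already been stripped; it illustrates the general technique rather than Light Brick Studio’s import pipeline:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch: merge all child MeshFilters into one mesh to cut draw
// calls. Assumes the children share a single material and that "root" starts
// with no MeshFilter or MeshRenderer of its own.
public static class MeshMerger
{
    public static void CombineChildren(GameObject root, Material sharedMaterial)
    {
        MeshFilter[] filters = root.GetComponentsInChildren<MeshFilter>();
        var combine = new CombineInstance[filters.Length];

        for (int i = 0; i < filters.Length; i++)
        {
            combine[i].mesh = filters[i].sharedMesh;
            // Bake each child's transform relative to the root.
            combine[i].transform =
                root.transform.worldToLocalMatrix * filters[i].transform.localToWorldMatrix;
            filters[i].gameObject.SetActive(false);
        }

        var merged = new Mesh { indexFormat = IndexFormat.UInt32 }; // Allow > 65k vertices.
        merged.CombineMeshes(combine);

        root.AddComponent<MeshFilter>().sharedMesh = merged;
        root.AddComponent<MeshRenderer>().sharedMaterial = sharedMaterial;
    }
}
```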

Find the fun in the UX and interactions

When the Triband team experimented with spatial computing for WHAT THE GOLF?, they encountered challenges with directing user attention. In spatial applications, the user is free to move the “camera” around and look where they want. This was a completely different experience for the team than developing a game for a traditional monitor.

“In our experience, the field of vision in which users notice elements is much narrower than we expected,” says Lukas Wendt, a technical lead at Triband. To maximize the fun and impact of the experience within the smaller view, the team created physical interactions with weight and springiness.

Owlchemy Labs worked hard to find the fun and ensure their user experience was playful and intuitive. To achieve this, they focused on an item’s affordances – visual indicators on an object that guide a player into simple, intuitive interactions without a manual or tutorial. They also leaned into playful, responsive interaction design that encourages users to repeat core interactions.

“In the case where we didn’t want to add more features to our experience, we threw in a joke or an acknowledgement of what the user was trying to do,” explains Andrew Eiche, the CEOwl at Owlchemy Labs. “This helped them feel validated in the virtual world.”

The team also playtested their games very early and collected feedback frequently. Project leads had to attend most playtests, as they could effect change and make the necessary adjustments. They then refined interactions and added special effects, animations, and sound effects to enhance user engagement.

2. Choose the right mode for your content – mixed reality or fully immersive

On Apple Vision Pro, a game or application can be presented in one of three ways – Windowed (2D), Immersive (extended reality/XR, which can be bounded or unbounded), or Fully Immersive (virtual reality/VR). Each has its own development pathway, and choosing the right mode for your application depends on your specific needs and gameplay or user experience. Many factors may guide, or even dictate, which option you choose, but our studio experts advise keeping depth of immersion, player attention, and graphical considerations in mind as you decide.

Think about your desired level of player attention

On Apple Vision Pro, extended reality applications can be bounded (where the content is contained within a volume in the user’s space and can be run alongside other applications for multitasking) or unbounded (where the content becomes the user’s sole focus, other applications close, and content can be placed anywhere in a user’s environment).

Synth Riders started off as a bounded app, but the team eventually released it as unbounded. While considering Fully Immersive (VR) mode, the Kluge Interactive team ran into hurdles with Game Center authentication. Ultimately, choosing between bounded and unbounded content boils down to how much focus you want to demand from your players and users.

“Do you want to have the players’ full attention or are you prepared to share it with something else? That will determine whether or not you want to be bounded or unbounded and whether you want to be fully immersive or existing in a window,” says AnnMarie Wirrel Bartholomaeus, a technical producer. “We wanted their full attention.”

Consider level design intricacy

When the Triband team began developing WHAT THE GOLF? for Apple Vision Pro, they quickly realized that level design was going to be one of their biggest hurdles. The game is made up of hundreds of levels with different mechanics and camera setups, and that quantity and variety of content was difficult to translate in a consistent manner.

“We used XR Bounded mode since it gave us the most consistent look across levels. It helped give a miniature feeling to the levels,” says Wendt, the team’s technical lead.

“Working on a new platform with a new paradigm requires learning, rethinking, and collaboration. The team at Unity had already done a lot of the heavy lifting and were extremely helpful when we faced difficulties on our way to release.”

LUKAS WENDT / TRIBAND, Technical Lead

They decided to use a virtual television to show 2D levels and levels that relied on specific camera movement to work. Most of the game is played on islands, so placing the television on an island helped with level cohesion.

To transition between levels on other platforms, the team used full screen fades and wipes, which weren’t possible in a spatial setting. Since time was of the essence, they worked quickly to land on a solution they were happy with. Wendt explains, “We decided on a transition where entire levels are moved up through the floor. To simplify the setup, we moved the volume the opposite way.”
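
A rough sketch of that kind of transition, assuming each level lives under a single root transform, might look like this (the component, duration, and easing are illustrative):

```csharp
using System.Collections;
using UnityEngine;

// Rough sketch of a "rise through the floor" transition: the outgoing level
// sinks below the floor plane while the incoming one rises into place.
public class LevelTransition : MonoBehaviour
{
    [SerializeField] float duration = 1.5f;
    [SerializeField] float depth = 2f; // How far below the floor levels start/end.

    public IEnumerator Swap(Transform outgoing, Transform incoming)
    {
        Vector3 shownPos = outgoing.position;
        Vector3 hiddenPos = shownPos + Vector3.down * depth;

        incoming.position = hiddenPos;
        incoming.gameObject.SetActive(true);

        for (float t = 0f; t < 1f; t += Time.deltaTime / duration)
        {
            float eased = Mathf.SmoothStep(0f, 1f, t);
            outgoing.position = Vector3.Lerp(shownPos, hiddenPos, eased);
            incoming.position = Vector3.Lerp(hiddenPos, shownPos, eased);
            yield return null;
        }

        outgoing.gameObject.SetActive(false);
        incoming.position = shownPos;
    }
}
```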

Work around the boundaries of bounded mode

Early on, the Light Brick Studio team decided to design in bounded mode, which meant that they had to operate within a defined volume. Any elements outside that volume were clipped and unseen. Furthermore, in bounded mode, there is no information about the player’s head, eye, or hand position/orientation until an interaction is performed. Developers only have access to information like hand position or the GameObject being interacted with once the player performs a pinch or spatial tap gesture. When the pinch ends, so does the information. In that sense, the interaction is similar to a touchscreen interface, where you only know what the player is doing while they touch the screen. Despite the initial challenges posed by these constraints, the team was able to build a cohesive experience within the game environment.

“We didn’t know what people were looking at, which was a big change, especially since our gameplay necessitates having players look at a brick to select it,” says Fredborg, the team’s technical lead. “Once we worked around all of these boundaries, it worked really well and it sits well within the system. It ended up being really nice to operate in this way since everything gets led in the same sort of user experience direction.”

3. Convert from controller-based input to spatial tap and/or hand tracking

When designing and implementing gestural or tracked interactions, it’s important to ensure that the user experience is intuitive and responsive. To do so, the experts focused on using what they had and keeping the spatial tap process simple. Studios new to hand tracking had to adjust to a new input and lack of haptics, while more experienced developers set out to extend limits and make the experience as comfortable as possible.

Keep it simple for spatial tap

Owlchemy Labs advocates for keeping interactions simple and adhering to the platform’s standards, particularly with regard to the spatial tap gesture.

“Spatial tap is a wonderful gesture that confers a binary state without a physical button,” says CEOwl Eiche. “In general, it’s a good practice to follow what Apple does. When you do deviate, be sure you are making a conscious decision on why the deviation is better since it will be something new the users will have to learn.”

Similarly, Triband tried to keep interactions as intuitive and simple as possible. They used pinch and drag gestures to facilitate their control scheme, and used Unity’s Enhanced Touch Support to implement automatic finger tracking and touch history recording.
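
Enabling Enhanced Touch Support and polling active touches looks roughly like the sketch below; the component itself is illustrative. On visionOS, PolySpatial surfaces pinch input through the Input System in a similar touch-like form:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch; // Requires the Input System package.
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Minimal sketch of reading pinch/drag input through Enhanced Touch Support,
// which also records touch history.
public class PinchDragReader : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log($"Pinch began at {touch.screenPosition}");
                    break;
                case TouchPhase.Moved:
                    // touch.delta gives per-frame movement for drag handling.
                    Debug.Log($"Dragging, delta {touch.delta}");
                    break;
                case TouchPhase.Ended:
                    Debug.Log("Pinch released");
                    break;
            }
        }
    }
}
```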

Give hand tracking a go

As the Kluge Interactive team delved into reimagining the game, they made their first real foray into hand tracking. Although they were initially hesitant about the lack of haptics, they loved the results. “Typically, when you hold a controller, your hand ends at your wrist, but with hand tracking, your fingers become important and you can gesture and interact with the music in a different way,” says Bartholomaeus, looking at the game through a technical producer lens. “It brought a new dimension to the game.”

Without controllers, they adapted to a new input. Controller haptics give players a sense of immediate feedback, so they needed a way to compensate for this missing information with non-invasive special effects. She says, “We worked on audio reactive effects, lighting, and different particles that fostered the connection with the song based on their action.”

Apple Vision Pro’s hand detection differs slightly from other platforms, so the team used platform documentation to find the right values for hand position and skeleton offsets. To track hand position, they used the wrist as their point of extrapolation. By opting for a combination of hand tracking for gameplay mechanics and spatial tap for other interactions, they avoided the need for custom gestures.
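
With Unity’s XR Hands package, anchoring a hand model to the wrist joint might look like the minimal sketch below; the component and the idea of applying per-platform offsets on top of it are illustrative, not Kluge’s implementation:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands; // Requires the XR Hands package (com.unity.xr.hands).

// Illustrative sketch: read the wrist pose and use it as the anchor from which
// the rest of the hand is extrapolated. Per-platform skeleton offsets (found in
// platform documentation, per the article) would be applied on top of this.
public class WristTracker : MonoBehaviour
{
    XRHandSubsystem hands;

    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            hands = subsystems[0];
    }

    void Update()
    {
        if (hands == null || !hands.rightHand.isTracked)
            return;

        XRHandJoint wrist = hands.rightHand.GetJoint(XRHandJointID.Wrist);
        if (wrist.TryGetPose(out Pose pose))
            transform.SetPositionAndRotation(pose.position, pose.rotation);
    }
}
```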

“We took advantage of what the operating system offered us, and it was really handy,” says Bartholomaeus. “I encourage people to use that. It was really performant.”

“With Unity, we get rapid access to so much technology from nascent hardware into a space where our developers can come to grips with unfamiliar hardware in a familiar way. This gives us the best leverage.”

ANNMARIE WIRREL BARTHOLOMAEUS / KLUGE INTERACTIVE, Technical Producer

Move from intention to reality

For the Job Simulator and Vacation Simulator ports to Apple Vision Pro, the Owlchemy Labs team went all-in on hand tracking, allowing them to explore interactions that are difficult or impossible on controllers. For example, second-order interactions – actions you take after holding an object – are more refined than with a controller. “In one of our demos,” Eiche explains, “we have a coin you can grasp, squeeze onto your thumb, and then flip. All of this happens by mimicking the gestures the player would do in the real world. On a controller, this would be a complicated set of actions to communicate.”

The team designed interactions to be “feel-first” – ensuring that simple interactions like grabbing an object feel responsive and natural before progressing to more complex interactions like throwing, pressing, and slapping objects. From there, they developed gestures to accentuate and accompany these interactions, even if they didn’t necessarily follow the rules of the physical world precisely – such as fingers stretching to reach an object.

“The best rule you can follow is that believability trumps realism,” says Eiche. “If the player thinks they should pick up an object but they are too far away and haven’t closed their hands enough, they should still pick it up. We worked hard to enable the user’s intentions to become reality.”
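
A grab-assist check in this spirit might look like the following sketch, where deliberately generous thresholds let a near-miss still count as a grab. All names and values are illustrative assumptions:

```csharp
using UnityEngine;

// Illustrative grab-assist check: favor the player's intent over strict realism.
// If the hand is close enough and mostly closed, the grab succeeds anyway.
public static class GrabAssist
{
    const float GrabRadius = 0.12f;      // Generous reach, in meters.
    const float ClosureThreshold = 0.5f; // Hand need only be half closed.

    // handClosure: 0 = fully open, 1 = fully closed (from hand tracking).
    public static bool ShouldGrab(Vector3 palmPosition, Vector3 objectPosition, float handClosure)
    {
        float distance = Vector3.Distance(palmPosition, objectPosition);
        return distance <= GrabRadius && handClosure >= ClosureThreshold;
    }
}
```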

Test the limits of hand tracking

When it comes to gestures and interactions, Eiche recommends trying and testing as much as you can: “First, we used the entire hand model, and not just the gestures provided by the operating system. Pinch was great when selecting 2D buttons on panels but it cannot replace the joy of grabbing a cube. At the simplest level, using the hand as open or closed to determine picking something up felt way better than pinching for 3D objects.”

As a team that’s well versed in hand tracking, Owlchemy Labs experimented with different ways to account for the loss of tracking and provide a smoother experience. “If you just disable and enable the hand when it isn’t tracked, it can create physics chaos. Our games made the math work to appear smooth to the user and maintain the previous grasp state,” says Eiche.

To add comfort, they used more than one finger to determine whether the hand was open or closed. When grabbing the mouse in Job Simulator, they programmed the middle and ring fingers instead of the index finger – small details that add realism and comfort to a player’s experience. They also did some approximation across the hand to determine a better-feeling grasp interaction.
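
With the XR Hands package, a multi-finger grasp heuristic along these lines might look like the sketch below. The fingertip-to-palm distance check and its threshold are assumptions, not Owlchemy Labs’ actual approximation:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands; // Requires the XR Hands package (com.unity.xr.hands).

// Sketch of a multi-finger grasp heuristic: treat the hand as closed when the
// middle and ring fingertips are near the palm.
public static class GraspHeuristic
{
    const float ClosedDistance = 0.05f; // Fingertip-to-palm distance, meters.

    public static bool IsGrasping(XRHand hand)
    {
        if (!TryGetJointPosition(hand, XRHandJointID.Palm, out var palm))
            return false;

        return IsFingerClosed(hand, XRHandJointID.MiddleTip, palm)
            && IsFingerClosed(hand, XRHandJointID.RingTip, palm);
    }

    static bool IsFingerClosed(XRHand hand, XRHandJointID tip, Vector3 palm)
    {
        return TryGetJointPosition(hand, tip, out var pos)
            && Vector3.Distance(pos, palm) < ClosedDistance;
    }

    static bool TryGetJointPosition(XRHand hand, XRHandJointID id, out Vector3 position)
    {
        position = default;
        if (!hand.GetJoint(id).TryGetPose(out Pose pose))
            return false;
        position = pose.position;
        return true;
    }
}
```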

4. Balance performance and visual fidelity

When porting to Apple Vision Pro, maintaining onscreen detail while optimizing performance is a priority. In mixed reality mode, RealityKit handles the rendering, meaning shaders must be authored in Shader Graph in order to be converted into MaterialX, which RealityKit can render. Another consideration is lighting, and how to balance dynamic, responsive lighting with performance optimization. Our studio experts dove into how they achieved these outcomes, touching on subjects like patching automatically generated shaders and using the Unity PolySpatial Lighting Node as their main output node.

Automate shader translation for RealityKit

In the earliest versions of LEGO Builder’s Journey, the Light Brick Studio team used a custom version of Unity’s Lightweight Render Pipeline (LWRP), then transitioned to the High Definition Render Pipeline (HDRP) for the Xbox and PlayStation ports. For Apple Vision Pro, they had to convert everything to render with the Universal Render Pipeline (URP) and Shader Graph in order to work with Apple’s RealityKit in mixed reality mode.

Shader Graph shaders are converted into MaterialX shaders and then rendered by RealityKit. The Light Brick Studio team only had a handful of different custom shaders. Since the models in the game were made from LEGO bricks, the shading was fairly consistent, so they were able to reuse shader code and create variations. To minimize rework, they packed most functionality into subgraphs that could be modified once, automatically updating every shader that used them. During development, the Custom Function Node became available in Shader Graph, which allowed the team to further simplify and manage their shaders.

“Only a subset of custom code features were supported in the Unity visionOS package, but they were lifesavers when it came to creating shaders that had a bunch of mathematical computations,” explains Fredborg, the studio’s technical lead. “While it was possible to create the same code from visual nodes, it was much faster to write it as code and then convert it into MaterialX.”

But the team ran into obstacles, finding that the results weren’t well optimized and the shaders needed more work than anticipated. They had to patch the generated MaterialX shaders to remove duplicate computations, writing a script that searched for specific patterns and rewired the MaterialX nodes to eliminate suboptimal code.

“We needed the world position and world normal of the pixel for a lot of our own computations, but that was computed again from the PolySpatial Lighting Node,” he says. “Our script looked for those nodes and reconnected them so that those values were only computed once. We did that for a few other computations, and that helped speed up rendering a lot.”

Find the optimal lighting node

To create shaders and lighting, the Kluge Interactive team turned to the tools and documentation for Unity PolySpatial. They discovered new nodes specifically crafted for visionOS that were helpful in the absence of lighting in the Simulator, as well as a list of nodes supported on the platform. This proved beneficial when converting tangent spaces, managing reflection probes and view directions, and handling normal mapping.

The team used the PolySpatial Lighting Node as their main output node and plugged everything into place. The base color, normal, metallic, and smoothness stayed the same.

“It acted as a two-pass shader because on the RealityKit side, the headset added reflections from our environment and some lighting based on the feedback the camera received,” explains Esteban Meneses, a 3D artist. “PolySpatial lighting managed everything that we knew the engine had inside, and there were options for the baked lighting. We debated using lightmaps and light probes, both of which were very useful.”

For the reflection probes, the team had access to simple and blended options. From there, they plugged everything that came from the PolySpatial Lighting Node into the base color and emission to be safe.

“There are passes that use the base color, and we didn’t want to have that in black,” Meneses continues. “I added an intensity modifier to crank up values and gain artistic freedom. I used Apple Vision Pro reflections’ variable and turned off the ambient occlusion in the final node, so we didn’t see the reflections of the environment.” This control over reflections allowed the team to divide their game neatly between objects that appeared to be in the room with the user and those they saw through a portal.

For LEGO Builder’s Journey, the Light Brick Studio team leaned on real-time point and spot lights to improve the game’s visuals. They used the PolySpatial Lighting Node to add Unity lights to the generated shaders and created a custom shadow solution for the general image-based lighting provided by visionOS. As the technical lead on the project, Fredborg elaborates, “It was a very low-resolution shadow map rendering the scene straight from the top. We then generated mipmaps and merged them in the RGBA channels of a single texture. That allowed us to get soft shadows with a single texture sample.”

5. Prioritize iteration and testing

No matter what hardware you’re developing for, iteration and testing are important. This is especially the case when you’re porting to a new platform where features are still evolving and there are questions that have yet to be asked or answered. When tasked with providing best practices, our studio experts emphasized the importance of speeding up iteration, making a stable game as early as possible, and finding the tools and features that work best for your particular project.

Iterate as quickly as you can

When it came to iteration, the Owlchemy Labs team built for the present, instead of what would be possible in the future. Their goal was to prototype features and gameplay as quickly as possible. As someone who has to balance both business and technical aspects of game development, Eiche explains, “We needed to validate the design before trying to catch every edge case. It’s okay to charge up some technical debt if you can build confidence in your design.”

As Kluge Interactive reimagined Synth Riders for Apple Vision Pro, they found Play to Device to be an extremely valuable tool for speeding up iteration. The feature allowed them to preview their content in the Apple Vision Pro device or the visionOS simulator straight from the Unity Editor without having to create a build. “Any changes that we made in the Unity Editor were updated in the simulator in real time. It was really quick for iteration and a big time saver for us,” says developer Mosquera.

Make sure there are enough eyes on your game

Owlchemy Labs emphasizes the importance of continuous testing and modification in spatial computing. They prioritized getting the game out of the Editor as soon as they could. They made builds frequently and used a continuous integration (CI) system.

As CEOwl, Eiche expands on their testing frequency, “We make a build for every platform we support every 15 minutes unless there hasn’t been a recent commit. We set up a consistent smoke test and run it on playtest builds twice per week.”

Every employee at Owlchemy Labs, even those who never open the engine, has a VR headset. They’re expected to test the games on a semi-frequent cadence because the studio believes it’s important for teams to work with an understanding of how things affect players.

Once the team was able to play through the game, even in a rudimentary state, they began user testing. Spatial computing requires that players use their whole bodies, so they ran extensive tests to ensure that the game’s interactions and UX would accommodate diverse shapes and sizes.

Focus on stability and a single source of truth

Owlchemy Labs’s strategy was to get the game stable first, then push hard on iteration. As long as the game passed their smoke test, they sent it out to playtesters. “We’d rather have a test fail because of a blocking bug than have testers playing something out of date,” says Eiche.

In parallel, they ran a strike team responsible for general bug fixing, in addition to the normal feature bug fixes. This helped them clear out bugs and return to stability without blocking feature development or iteration.

Leadership also made technology accessible to ease development. The games can be worked on from any Mac capable of building Apple Vision Pro applications, and the team hooked a Mac Studio into their CI system to serve as a single source of truth for builds. This helped developers deploy quickly.

“On top of that, we are huge proponents of Unity Version Control. We aggressively branch to allow feature development to keep moving forward without negatively impacting the main branch,” says Eiche.

“When you’re working on a new platform, it’s critical to communicate with the platform and the engine. Any release that is early in a platform’s life is a result of a team effort.”

ANDREW EICHE / OWLCHEMY LABS, CEOwl

6. Adapt to new rendering and optimization processes

While RealityKit’s functionality helps developers create compelling AR experiences and brings amazing opportunities for photorealistic rendering, animations, and more, it also adds new workflows and processes. When our studio experts ported their games to Apple Vision Pro, they encountered issues with platform-specific components and optimization, and, with that, new challenges in locating bottlenecks.

Manage platform-specific requirements

Navigating a new rendering pipeline was an adjustment for the Kluge Interactive team. On other platforms, their content was rendered directly in Unity, but with visionOS, there is a second step in RealityKit. Unity features with no RealityKit equivalent therefore could not be rendered without a workaround.

One of the main challenges the team overcame was the absence of a Line Renderer, a component that takes an array of two or more points in 3D space and draws straight lines between them. This component had been a huge asset to the team, powering the rail system in other versions of Synth Riders.

“When we encountered this obstacle, the Unity PolySpatial team was fantastic to brainstorm with,” says Bartholomaeus, observing as the project’s technical producer. She expands, saying, “for the rails, a feature that was not yet supported on the platform, they suggested that our technical artist recreate it with a standard mesh and vertex displacement, which worked really well.”
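
The general idea of standing in for a Line Renderer with a standard mesh can be sketched as a ribbon built from an array of points, as below. This is a simplified illustration; Kluge’s actual solution also relied on vertex displacement in the shader:

```csharp
using UnityEngine;

// Simplified sketch: build a flat ribbon mesh through a series of points,
// a stand-in for Line Renderer on platforms where it isn't available.
// Assumes at least two points.
public static class RailMeshBuilder
{
    public static Mesh Build(Vector3[] points, float width = 0.05f)
    {
        var vertices = new Vector3[points.Length * 2];
        var triangles = new int[(points.Length - 1) * 6];

        for (int i = 0; i < points.Length; i++)
        {
            // Direction along the rail, used to place a vertex on each side.
            Vector3 forward = i < points.Length - 1
                ? (points[i + 1] - points[i]).normalized
                : (points[i] - points[i - 1]).normalized;
            Vector3 side = Vector3.Cross(forward, Vector3.up).normalized * (width * 0.5f);

            vertices[i * 2] = points[i] - side;
            vertices[i * 2 + 1] = points[i] + side;
        }

        for (int i = 0; i < points.Length - 1; i++)
        {
            int v = i * 2, t = i * 6;
            triangles[t] = v; triangles[t + 1] = v + 2; triangles[t + 2] = v + 1;
            triangles[t + 3] = v + 1; triangles[t + 4] = v + 2; triangles[t + 5] = v + 3;
        }

        var mesh = new Mesh { vertices = vertices, triangles = triangles };
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
        return mesh;
    }
}
```

RecalculateNormals keeps the sketch simple; a production version might orient the ribbon toward the viewer or extrude a full tube instead.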

When it came to performance optimization, one of the main lessons they learned was to not make assumptions based on their experiences targeting other platforms. As the team went through the entire game, testing optimization paths to speed up performance piece by piece, they discovered that solutions that had reduced the impact of material instancing in URP on other platforms did not apply to Apple Vision Pro.

“Once we located this issue, we changed a couple of lines of code. It was an easy fix and the improvement was dramatic,” says Justin Dopiriak, a senior developer.

The team also debugged performance by using the Unity Profiler, Xcode Debugger, and the Play to Device feature. “At the beginning, it was hard to test and find issues,” explains Mosquera, another developer. “The Play to Device feature helped us localize problems faster, and, without it, the project would have taken longer to complete.”

Identify what’s relevant to render and limit data transfer

As Light Brick Studio pushed visuals into new dimensions, they experienced some hurdles in keeping the platform running smoothly. Since Unity handled the game logic and RealityKit handled the rendering, one of the main challenges was identifying which side a bottleneck came from. The content needed to be optimized to run smoothly at 90 frames per second, so identifying the core issue was crucial. Part of that optimization was identifying what was relevant to render – and what was not.

“In the beginning, the PolySpatial package would reflect everything we were doing in Unity over to RealityKit, so every GameObject or transform would be there, and at that time we weren’t able to turn off specific layers from being transferred or reflected. It mirrored everything from the UI rendering, canvas rendering, and animator particle system,” explains the studio’s technical lead, Fredborg. “Now, there’s a filter where you can set it to not track specific layers, but at that time, we had to figure out a way to clean everything out that we weren’t going to see on the RealityKit side.”

“The PolySpatial team was very responsive both on Slack and in the discussion forums. When we were blocked, they would get back to us with solutions quickly, which was really helpful.”

MIKKEL FREDBORG / LIGHT BRICK STUDIO, Technical Lead

The team also needed to optimize GameObject hierarchies. On other platforms, LEGO bricks are three or more linked GameObjects, but for visionOS, they set a maximum of two GameObjects per brick to limit the amount of data being sent back and forth between Unity and RealityKit. This worked well and was a big win for the team.

Staggering stop-motion animation was another effective optimization. The game contains waterfalls that animate by exchanging the meshes on bricks; they consisted of hundreds of different GameObjects, which made them quite slow. “We ended up addressing it in a staggered way instead of doing every frame for every GameObject,” says Fredborg. “We’d wait and do it every few frames between the different waterfall streams. It helped a lot.”
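
A staggered update along these lines might look like the following sketch, where each stream advances only every few frames, offset by its index. The components here are hypothetical:

```csharp
using UnityEngine;

// Sketch of staggering expensive per-object updates: instead of updating every
// animated object every frame, each stream updates once every N frames.
public class StaggeredWaterfallUpdater : MonoBehaviour
{
    [SerializeField] WaterfallStream[] streams; // Hypothetical per-stream component.
    [SerializeField] int frameInterval = 4;     // Each stream updates every 4 frames.

    void Update()
    {
        for (int i = 0; i < streams.Length; i++)
        {
            // Offset each stream by its index so the work spreads across frames.
            if ((Time.frameCount + i) % frameInterval == 0)
                streams[i].AdvanceAnimation();
        }
    }
}

// Hypothetical stream component that swaps brick meshes for the next
// animation frame when asked.
public class WaterfallStream : MonoBehaviour
{
    public void AdvanceAnimation() { /* swap meshes for the next frame */ }
}
```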

Reduce draw calls

Our studio experts agreed that the keys to a smooth optimization process are prior experience and preparation. The LEGO Builder’s Journey team had ported the game to multiple platforms and optimized it for each, so they had already undertaken steps like reducing draw calls. One of the ways they achieved this was by merging static geometry within scenes and automatically removing hidden surfaces.

Fredborg says, “Our biggest performance gains came from simplifying the GameObject hierarchies, limiting which colliders were tracked, and ensuring we did not have any materials that weren’t used.”

Since Owlchemy Labs had already optimized their games for mobile-class hardware before moving to Apple Vision Pro, they experienced a significant jump in performance when porting to the platform, which made further optimization less necessary. To optimize Job Simulator and Vacation Simulator for Quest 1 hardware, they had leveraged static and dynamic batching whenever possible and replaced animations with shaders. Instead of relying on standard Rigidbody sleeping, they swapped resting Rigidbodies out for static geometry with basic colliders wherever possible, then swapped them back when a collision was detected.
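
The Rigidbody swap can be sketched as a component that strips the Rigidbody from a resting prop and restores it when something collides with it. The details below are assumptions, not Owlchemy Labs’ code:

```csharp
using UnityEngine;

// Sketch: keep resting props as plain static colliders, then restore the
// Rigidbody when something collides with them so physics takes over again.
public class SleepAsStatic : MonoBehaviour
{
    [SerializeField] float mass = 1f;

    void Start()
    {
        // Remove the Rigidbody while the object is at rest; the collider stays.
        if (TryGetComponent<Rigidbody>(out var body))
        {
            mass = body.mass;
            Destroy(body);
        }
    }

    void OnCollisionEnter(Collision collision)
    {
        // A moving body hit us: become dynamic again.
        if (!TryGetComponent<Rigidbody>(out _))
        {
            var body = gameObject.AddComponent<Rigidbody>();
            body.mass = mass;
        }
    }
}
```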

To further optimize, they built a test rig that teleported across the game and took snapshots on hardware. They analyzed the data to find hotspots and determine whether they were CPU- or GPU-bound. Eiche says, “It was a difficult process to iron issues out one by one, but it paid off. Our games look almost identical across platforms, from PC to PSVR to Meta Quest 1.”

Previous optimizations helped the team advance quickly, and with extra time to spare, they leveraged features (such as real-time shadows) that were only available on high-performance hardware.

Conclusion

Throughout this e-book, we have covered six best practices for a smooth and accelerated port to Apple Vision Pro. When we asked our studio experts for their top tip when developing for visionOS, here’s what they had to say:

“Nothing counts until you try it in XR! Another key principle is to try to develop physical experiences whenever possible. The joy of spatial design is interacting with objects in your space.”

ANDREW EICHE / OWLCHEMY LABS, CEOwl

“Consider what technical architecture needs to be moved forward and start quickly. Even if this isn't the reason you’re doing it, do it since the closer you get to current technology, the better position you’ll be in to benefit from any opportunity.”

ANNMARIE WIRREL BARTHOLOMAEUS / KLUGE INTERACTIVE, Technical Producer

“Invest time into systems and tools that allow you to play part of your game in the Editor with a mouse and keyboard. It’s never going to give you the full picture, but some things can be tested much faster this way.”

LUKAS WENDT / TRIBAND, Technical Lead

“You have to wear the headset to understand it. We recommend doing multiple builds and using the device as much as possible to focus on touch and movement. It’s a much more physical platform.”

MIKKEL FREDBORG / LIGHT BRICK STUDIO, Technical Lead

Whether you have just begun your spatial computing journey, are deep into it, or are dipping your toes in to learn more, we hope these expert insights, best practices, and real-world experiences have inspired you to dig deeper. With the launch of Apple Vision Pro, developers have a prime opportunity to take their games, apps, and experiences to new heights.

Build for Apple Vision Pro today

Talk to our team to learn how we can help you leverage Unity’s powerful tools and workflows to build compelling spatial experiences.

Contact us