M1 / Apple Silicon compatibility

No: “tvOS. Leverage many of the same frameworks, technologies, and concepts as iOS.” Source: Apple
Obviously, nobody forces you to design/write software by leveraging the same frameworks (to use Apple’s words), but it is advisable if you want to produce portable software.

This thread isn’t about portable code though.

This thread is about extracting high(er) performance from software. And that involves understanding the architecture to make the most of it.

It is, because some people (including you) are implicitly claiming a priori that good software is not portable across different Apple devices. Although this might have been true in the past, it is not true anymore. Also, you are confusing the concept of system design/computer organization — i.e. what you call “architecture” — with the instruction set architecture. Considering that

  1. all these operating systems (xOS) belong to the same family,
  2. all these CPUs belong to the same family (RISC ARM64),
  3. all these platforms can now use the same frameworks,

there is no need to study/understand how each device was designed, even though the devices look very different (e.g. an Apple TV, an iPad, a MacBook Pro). The same goes for their respective CPUs.
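The point about shared frameworks can be sketched with Swift's conditional compilation: one source file builds for every Apple platform, and only genuine platform differences need special-casing. This is an illustrative example, not from any SDK; `platformName()` and its strings are my own.

```swift
// Hypothetical sketch: the same Swift source compiles unchanged for
// tvOS, iOS/iPadOS, and macOS. Conditional compilation handles the
// few places where the platforms actually differ.
func platformName() -> String {
    #if os(tvOS)
    return "tvOS"
    #elseif os(iOS)
    return "iOS"        // covers iPadOS as well
    #elseif os(macOS)
    return "macOS"
    #else
    return "other"      // e.g. Linux, where no Apple condition holds
    #endif
}

print("Running on \(platformName())")
```

The same binary-level portability holds one layer down: all of these targets compile to ARM64 machine code on Apple Silicon.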

This thread is about extracting high(er) performance from software. And that involves understanding the architecture to make the most of it.

No, or at least not what you mean by “architecture”. On the contrary, it involves following Apple’s guidelines, in which case you will get assembly code that can be executed by different CPUs — provided they share the instruction set as is the case here — even though the “architectures”, to use your word, look different and/or have a different name.

Nowadays, a well-designed app can run on
tvOS → iOS/iPadOS → macOS
(not necessarily the other way around)
which is remarkable anyway.

That being said, it seems that Zwift developers couldn’t care less.


I don’t think this is true. We’ve already heard that they are working on native builds for macOS, and we don’t know what else may flow from that work. My guess is that Zwift developers are also interested in this work and frustrated by the slow progress. If you need to blame someone, focus on the leadership. They set the priorities and sign up to take the heat for those choices. And they decide how much gets publicly communicated.


I never said anything remotely like that.

I’ll leave you to it though, because you seem solely concerned with code portability at the machine code level, which hasn’t been in dispute.


Glad you’re available to do Zwift’s job of communicating to their own company forum.


That’s exactly why I don’t typically post them on here. Don’t see why I should.


Which thread was this on?

Also enjoyed this one


I had access to a MacBook Air M2 so I tried it out with Zwift. No shadows.

This is from a MacBook Air M2, a week ago:


Weird, I wonder why I wasn’t getting them.

Yeah that post hasn’t aged well.

Oh wow! Did you do anything to get those shadows going? Is this specific to M2 Macs as opposed to M1/M1 Pro?

It’s not mine, just somebody who contacted me about it. The Zwiftalizer screenshot shows that the M2 GPU isn’t recognised by the game, and so it’s defaulting to Medium profile (which gets rider shadows). This is what happened with M1 devices originally, as explained further up the thread.

Got it, thanks. Just tried installing Zwift on my partner’s M2 MBA and did indeed see shadows, though limited to 1080p:


Yeah 1080p is the maximum game resolution for Medium profile.
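The fallback behaviour described above can be sketched as follows. This is a plausible illustration, not Zwift's actual code; the GPU names and profile assignments in the table are assumptions for the sake of the example.

```swift
// Hypothetical sketch of graphics-profile selection where an
// unrecognised GPU falls back to the Medium profile.
enum Profile: String {
    case basic, medium, high
}

// Assumed lookup table: only GPUs the game knows about appear here.
// "Apple M2" is deliberately absent, mirroring the case in the thread.
let knownGPUs: [String: Profile] = [
    "Apple M1": .basic,
]

func graphicsProfile(for gpu: String) -> Profile {
    // An unknown GPU string misses the table and defaults to Medium,
    // which (per the thread) enables rider shadows, capped at 1080p.
    knownGPUs[gpu] ?? .medium
}

print(graphicsProfile(for: "Apple M2").rawValue)
```

This also explains the "hasn't aged well" remark: once a device is added to the table, it stops getting the Medium fallback and loses the shadows.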

That was true for me the first time I tried Zwift. My old Windows laptop could barely manage it.

So Apple TV 4K was a definite upgrade in performance and ease of use.

Now that I use a desktop Mac it’s the other way around: it is fast enough that I don’t want to use the Apple TV anymore. Besides, the desktop also has much nicer graphics.



Are there any updates on Zwift bringing “High Detail” to the M1 and M2 chips?
Support told me about a year ago that there would soon be a solution to take advantage of the GPU power of these systems.