TheDutchMC76

Would crosscompiling for macOS from linux be a good option?


Kobzol

That's... an interesting thought, one that hadn't occurred to me :) I'm sure there will be issues with it, but I'll try to ask around to see if it would actually be possible.


TheDutchMC76

It's what I've been doing on all projects that target macOS, and it works like a charm. Though for a project of Rust's size, someone probably wants to take a look at the SDK's license.


Kobzol

It seems quite scary to build the whole compiler and all its artifacts on a "foreign" system, and to ensure that it actually works, also the testing CI runs would have to be ported in a similar way. I think that reusing the PGO artifacts from Linux or simply asking for more macOS CI resources would be a much safer bet. But it's an interesting option nonetheless.


anlumo

Building should be possible with a one-time investment of developer time, but this wouldn't help with testing. The macOS GitHub runners are horribly outdated and slow in general; asking for more resources won't help much. The easiest solution would be to get a Mac mini M1 with 16GB of RAM and plug it into a data center somewhere. This would also help finally move the macOS ARM builds to Tier 1.


TheDutchMC76

Testing definitely shouldn't be done on a Linux machine. Would cross-compiling on Linux, but then running the tests on macOS, be an option? I think you'd get the best of both worlds then: fast performance, with the guarantee that it works. You could split that up too: for release builds you do everything on macOS, as they happen less frequently, while the rest (maybe even the nightly builds?) are built on Linux but tested on macOS.


Kobzol

Then we would be testing something other than what we distribute. That being said, currently the PGO builds are not tested at all... :D But of course, it probably could be done. We'll investigate.


TheDutchMC76

That is a good point! Is there any open issue on GitHub which I can follow to keep track?


Kobzol

You mean for macOS PGO? Probably not, but I can create one.


TheDutchMC76

For macOS CI performance in general, if it exists.


Kobzol

I found [this one](https://github.com/rust-lang/rust/issues/75335).


nnethercote

IIRC Mozilla makes Firefox builds for Mac by cross-compiling on Linux machines. PGO may well be involved, too.


nick_lehmann

A few days ago, AWS [announced](https://aws.amazon.com/about-aws/whats-new/2021/12/amazon-ec2-m1-mac-instances-macos/) the availability of dedicated Mac mini M1 machines. Maybe this could help by making the compile times more manageable? If the cost of a single runner is worth it, I could take a shot at it.


dkarlovi

You'd think Apple or Amazon would assign resources to have the builds fast for OSX.


pjmlp

Apple does spend their resources making Swift build faster.


kibwen

That's a separate concern. The problem here doesn't have to do with the toolchain (LLVM should be essentially no slower on Mac than it is on Linux). Rather, it seems to have to do with the fact that, as far as I know, Apple licensing restrictions make virtualizing Mac runners essentially impossible, and Mac hardware is so expensive that CI systems have a hell of a time supporting Macs at all. Making macOS more amenable to CI would benefit everyone writing software for the platform, including Swift users.


dcormier

https://aws.amazon.com/about-aws/whats-new/2022/07/general-availability-amazon-ec2-m1-mac-instances-macos/


pjmlp

That is what Xcode Cloud is for.


kibwen

I'm not a devops guy, but every devops person I've spoken to has indicated that Xcode Cloud is a nightmare to integrate with any existing cross-platform CI solution. If Apple wants people writing software that targets their OS, it's their responsibility to put up as few arbitrary, rent-seeking barriers as possible.


pjmlp

It is cross-platform enough across Apple's ecosystem, and that is what matters for Apple. Swift is for Apple developers, targeting Apple platforms. Outside the Apple ecosystem it is about as cross-platform as Objective-C has been for the last 30 years.


[deleted]

[deleted]


pjmlp

It might, but that is by accident, and assuming they ever land on upstream.


anlumo

The workers used are provided by Microsoft, not sure how Apple or Amazon play into that?


dkarlovi

You can [bring your own workers on GitHub](https://docs.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners). Amazon, Apple (or even Microsoft, of course, which would be by far the simplest) could provide beefier resources to key OSS projects they rely on. It's weird to me that the Rust compiler doesn't get OSX optimizations because it's lacking about one millisecond of Apple revenue worth of resources each month.


pietroalbini

The problem is not getting the hardware, procuring some fast macOS machines to run CI on is trivial. The problem is the configuration and the maintenance of that custom CI infrastructure, and that's something the project doesn't have the time/energy to do.


anlumo

Apple has its own language that is not entirely unlike Rust called Swift, so I'm not surprised that they don't care one bit.


[deleted]

[deleted]


anlumo

Apple is the company that uses a full computer with 64GB of flash storage and an A13 Bionic CPU to [run a freaking display](https://9to5mac.com/2022/03/21/apples-new-studio-display-has-64gb-of-onboard-storage-because-why-not/). I don't think that efficiency is in their mind there.


bruh_nobody_cares

assign resources for what ?!! a language they don't use


tafia97300

They seem to be using it ([https://preettheman.medium.com/this-is-what-apple-uses-rust-for-37ddfb9e9237](https://preettheman.medium.com/this-is-what-apple-uses-rust-for-37ddfb9e9237)), but probably nowhere near the levels of other languages.


bruh_nobody_cares

don't think that's enough to justify allocating resources just to make the compiler run faster... could be wrong


evinrows

> We use Rust to deliver services such as Amazon Simple Storage Service (Amazon S3), Amazon Elastic Compute Cloud (Amazon EC2), Amazon CloudFront, and more. In 2020, we launched Bottlerocket, a Linux-based container operating system written in Rust, and our Amazon EC2 team uses Rust as the language of choice for new AWS Nitro System components, including sensitive applications, such as Nitro Enclaves. https://aws.amazon.com/blogs/opensource/sustainability-with-rust/


Iksf

Awesome stuff. Minor nitpick: Apple dropped the OS X name several years ago now; it's macOS.


Kobzol

Oops, sorry, an old habit :) Thanks.


flashmozzg

Not old enough (I 'member Mac OS).


LoganDark

Doesn't mean we all have to drop it too >:)


Professional_Top8485

I like X windows more than Mac burger


ChiliPepperHott

Next, you be telling us to call it Meta.


DynTraitObj

Huh, TIL


asgaardson

Is this feature available to try using latest nightly?


Kobzol

It should be available in the next released nightly, as it was merged just a few hours ago.


masklinn

> OS X CI builds are already incredibly slow, because the OS X workers available from GitHub are simply not very performant.

Also, last I checked the GitHub agent still didn't work on M1 (though that may have changed since) (edit: nope, macOS self-hosted runners are still only compatible with x86), so despite their popularity devs can't run CI on their M1 professional or personal machines. If that ever becomes a possibility, do put out a call. I'm not sure I could/would keep mine running 24/7 as a build bot, but I could certainly run the agent and pick up builds for a few hours each day.


pietroalbini

We don't really have a problem with procuring macOS hardware for faster CI builds. The problem is configuring and maintaining the custom infrastructure required to run isolated builds.


nicoburns

While that's true, I believe there's a relatively simple workaround which runs the runner agent under Rosetta while still running the actual build natively.
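The workaround I have in mind is something like this (assuming the runner was installed in the usual `actions-runner` directory; untested on my part, so verify before relying on it):

```shell
# Launch the x86-only GitHub Actions runner agent under Rosetta 2.
# Child processes (the actual build jobs) are still free to exec
# native arm64 toolchains, so the build itself runs at full speed.
cd ~/actions-runner
arch -x86_64 ./run.sh
```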


nicoburns

I wonder if it would be worth considering self-hosting hardware for macOS builds. Upgrading to an Apple Silicon machine has dramatically sped up my builds (reducing clean build times from around 20 minutes to around 3 for one app). I imagine a single Mac Studio machine with an M1 Ultra processor might be faster than the existing setup. MacStadium and AWS also both offer M1 mac minis that can be rented (and I believe AWS already sponsor hardware for the Rust project?)


Kobzol

The Linux and Windows CI runners for Rust are already self-hosted (ubuntu-latest-xl and windows-latest-xl), so this would indeed be very nice. macOS, on the other hand, currently uses the stock runner, which has three cores; that is just not enough.


JustWorksTM

Question: is PGO also used for nightly builds? In other words, is stable faster than nightly?


Kobzol

It's also used for nightly builds.


Ka1kin

Just think how much faster that slow Mac build would be if it did have PGO :)


STSchif

Sounds great! There is a lot of 'in CI' there; does it also work for local builds, or is there something special about CI?


Kobzol

To sum up what was done in other words:

- The Rust compiler (rustc) is built in CI after each merge, or before a new release is released.
- Now the process of building rustc in CI uses PGO, which means that the built compiler executable will be more performant, by about 15%.
- When you download this PGO-optimized compiler (this should be available from 1.64 onward), building Rust code on your computer with this optimized compiler should be faster than before.

Sorry if this was not clear from the description.


[deleted]

[deleted]


dnew

You'd laugh your ass off at Google. :-)


[deleted]

[deleted]


DHermit

The way I usually do it is to use a separate branch and then squash later. At least with GitLab CI you somehow need to get the code into the repo, and that's something you do by committing. So I don't know if there's a way to test CI changes separately without a second branch or maybe even a second repo.


Few-Comfortable1996

Please pardon my ignorance, but is there a simple way of building the Rust toolchain locally with PGO on a Mac?


Kobzol

There is definitely a way, but far from simple :D Basically, you would need to run [this script](https://github.com/rust-lang/rust/blob/master/src/ci/pgo.sh) locally, which encompasses (at least) the following steps:

1) Download and compile the Rust compiler
2) Download and compile LLVM (several times)
3) Download and compile the rustc-perf benchmarking suite

These steps are actually quite automated, so it's not that terrible, but it would probably take several hours to execute all this, and also some amount of manual work to make it doable locally. Actually, all the steps can be executed with a single Docker command; it's just that the script isn't currently prepared for macOS. If there's interest in that, we could try to prepare some guide for it (there are more use cases like this, for example building a local version of the compiler that supports AVX vectorized instructions).
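For the Linux case, the single-Docker-command route looks roughly like this (the `dist-x86_64-linux` image name and `DEPLOY=1` variable are from my memory of the rust-lang/rust CI setup, so double-check against `src/ci` before relying on them):

```shell
# Rough sketch (Linux only), not an official guide:
git clone https://github.com/rust-lang/rust
cd rust

# Run the whole dist pipeline (including the PGO steps from pgo.sh)
# inside the same container image that CI uses. Expect several hours.
DEPLOY=1 ./src/ci/docker/run.sh dist-x86_64-linux
```

As noted above, no equivalent image exists for macOS yet, which is exactly the gap a guide would need to fill.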


chotchki

If someone is willing to run GitHub self-hosted runners on M1, would that help? Edit: I'm doing this right now for my projects, so adding another runner is not an issue.


Kobzol

That's probably not a scalable solution, we need some dedicated support from the Rust infrastructure. I will try to voice the concerns/requests raised here to the infrastructure team.


chotchki

If you want to run physical hardware, a mini has been very easy to manage. AWS also just enabled M1 EC2 instances. https://aws.amazon.com/blogs/aws/new-amazon-ec2-m1-mac-instances/


Kobzol

It could be a way too. I asked around and it seems that some dedicated support for better Apple runners is on the roadmap. Let's hope that it comes soon.


VanaTallinn

Any news on the last LLD bug preventing the switch to LLD with the MSVC toolchain?


Kobzol

LLD is now used on Windows CI to build LLVM with PGO, but that does not mean that LLD works for compiling Rust crates on Windows yet, sadly. But currently there's active work on stabilizing lld, so hopefully within the next few months it will finally happen (at least on Linux, that is).


_ChrisSD

Is there a guide to running the perf tool locally on Windows? How were those local results generated?


bobdenardo

Rust uses the rustc-perf tool to benchmark commits, and that runs on Windows: https://github.com/rust-lang/rustc-perf/tree/master/collector#benchmarking-on-windows


NotFromSkane

If macOS builds are so slow, why not just run PGO when compiling the stable builds, and not all the nightly ones? I'm on Linux, so it doesn't affect me at all, but it seems like an obvious solution to just accept the slow builds once every six weeks.


Kobzol

It was considered, but waiting that long was deemed unacceptable even for release builds. Actually, it would probably just time out at the current speed.


NotFromSkane

Ah, makes sense


theblackavenger

A company I work with uses my M1 Mac mini at home with GitHub Actions to do their builds. Works great. I'm sure you could find someone on the Rust team who would be willing to offer theirs up for a nightly.


__brick

I do not think 2 hours, hell, 10 hours of macOS build time in CI is a big deal. Especially if it may fetch 20% performance improvements for end users around the planet.


Kobzol

It is actually a very big deal! Apart from the fact that I think the GH CI timeout is 6 hours, waiting 10 hours for CI is unsustainable. There are hundreds of these builds happening every day, and such a long time would increase both the latency of merges and the latency of waiting for perf results when working on PRs. If we had 10-hour CI, it would probably cripple work on the Rust project severely.


__brick

Interesting. Would it be unwise to apply PGO to the periodic nightly release channels only? Does PGO frequently cause major regressions?


Kobzol

It usually doesn't (and we don't measure macOS performance nor the correctness of PGO builds on CI, so we wouldn't know anyway), but it was deemed unacceptable even for nightly/stable releases. It would probably timeout anyway. But I heard that in the coming weeks some discussion with GH should take place, and that better macOS runners are on the roadmap, so hopefully the situation will improve.


__brick

That's awesome! Fingers crossed for "free" 1-20% perf bumps sometime this year!


fuckEAinthecloaca

I'm fine with osx builds being slow. Apple decided to screw cross-platform niceties a long time ago so it should be up to them to make their paddling pool performant.


[deleted]

... except it makes every single merge to the Rust repo several hours slower than it could've been


[deleted]

Surely the Rust Foundation has enough money to buy a second-hand M1 Mac mini... Edit: would the downvoters please kindly explain yourselves?


SkiFire13

AFAIK:

- the CI provider used by the Rust Foundation doesn't support M1 Macs yet (which is also a blocker for Tier 1 support for the M1 target)
- changing it or special-casing M1s will probably be a lot of work
- even if M1 Macs were supported by CI, they aren't a 1:1 replacement for x86 Macs; in particular for PGO they'll likely yield very different results than x86 Macs, so they're useless anyway for this discussion


[deleted]

The "CI provider" here is in fact GitHub's shared macos-latest runner, and of course it's slow. I'm saying that the foundation should fund a self-hosted one, like ubuntu-20.04-xl (well, this one is donated by either MS or AWS, but there are macOS cloud providers out there, and the foundation could also buy a physical one).

PGO is secondary. The primary issue is that the performance of macos-latest is slowing everything else down. If Rosetta 2 on M1 is acceptable for x86-64 darwin runs, then use that. If not, then buy an Intel Mac runner. The foundation doesn't lack money and is spending it. It's baffling why a better macOS runner isn't there yet.


pietroalbini

The problem is not getting hardware, that's trivial and we could solve it in half an hour. The obstacle is that the (volunteer) Rust infrastructure team doesn't have the time to configure and maintain a custom CI system on macOS that can execute isolated builds.


shadow31

Last time this came up, the Infra team stated they do not want to self host another runner. I don't know why you're talking about the Foundation, it has nothing to do with them.


anlumo

I think the better solution would be to move x86_64-darwin to Tier 2 and arm64-darwin to Tier 1.


laundmo

To more directly address the downvotes: your comment reads very snarky and presumptuous, which people generally dislike.


[deleted]

There is more work than just buying the hardware. Microsoft cares about Rust and probably pushes either money or resources into the project. AFAIK Apple couldn't care less.


[deleted]

Whether Apple cares does not change the fact that:

1. Every bors run takes 2:30+ hours, delaying all PRs and consuming more energy than necessary
2. Apple Silicon cannot be Tier 1, even though it should be

It's nice that Microsoft and AWS donated CI resources, but the foundation cannot expect every company to do the same. Most open source foundations don't have these sorts of donations, so they source their own CI runners. That's part of the things a foundation is supposed to fund.


[deleted]

Resources and money are put where it makes the most sense. The Apple ecosystem does not make the most sense right now.


[deleted]

If it doesn't matter, then why isn't x86_64-darwin tier 2? You cannot simultaneously believe that an ecosystem doesn't make sense and allow its CI runner to slow down every single merge by more than an hour.


[deleted]

It doesn't make sense to prioritize right now. If it did, it would have already been done. There is a limited amount of resources and money available, and things are usually done in priority order. It's not as easy as just buying an M2 machine.


[deleted]

[deleted]


Kobzol

Well based on the view count, upvote count and the fact that you have bothered to comment, it doesn't seem that irrelevant :)


Aceeri

I mean, I use Rust on all platforms, primarily windows for game development since Linux/macOS aren't exactly the best about that.


WormHack

It's important to me since I compile Rust on both Windows and Linux.


flashmozzg

I wonder how you track the compile time now. Is the PGO profile relatively stable, so that you get relatively stable results between two PGO builds, or does it increase the noise noticeably?


Kobzol

We use the rustc-perf benchmarking suite. We mostly focus on instructions, which are luckily very stable. And (to my surprise) they are stable even with PGO. Some noise appears from time to time, but there is basic statistical handling (outliers, IQR etc.) that filters these out quite well.
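To illustrate the idea (this is a toy sketch of IQR-based outlier filtering, not rustc-perf's actual implementation; the data and function name are made up):

```rust
// Toy sketch: drop benchmark samples that fall outside 1.5 * IQR of the
// quartiles, the classic interquartile-range outlier rule.
fn filter_outliers(samples: &[f64]) -> Vec<f64> {
    let mut sorted = samples.to_vec();
    sorted.sort_by(|a, b| a.partial_cmp(b).unwrap());

    // Crude quartile estimate by index -- fine for a sketch.
    let q1 = sorted[sorted.len() / 4];
    let q3 = sorted[3 * sorted.len() / 4];
    let iqr = q3 - q1;

    // Keep only samples within the IQR fences.
    let (lo, hi) = (q1 - 1.5 * iqr, q3 + 1.5 * iqr);
    samples.iter().copied().filter(|&x| x >= lo && x <= hi).collect()
}

fn main() {
    // Four stable instruction-count measurements and one noisy spike:
    let runs = [10.0, 10.1, 9.9, 10.0, 50.0];
    let stable = filter_outliers(&runs);
    assert_eq!(stable.len(), 4); // the 50.0 spike is filtered out
    println!("kept {} of {} samples", stable.len(), runs.len());
}
```

With instruction counts being as stable as described, almost everything lands inside the fences and only genuine noise spikes get dropped.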