I wanted to install the Rust toolchain on a Windows 10 machine. Since I was already using gcc and clang from MSYS2's UCRT64 environment, when I saw that Rust might need MSVC as a prerequisite I wondered whether I could just use the gcc I already had installed.
Then I found a post saying this is because Rust needs a linker. So instead of installing the x86_64-pc-windows-msvc toolchain, I installed the x86_64-pc-windows-gnu toolchain, hoping to just use the GNU linker. So far, my code seems to compile and run with no problem.
Then I found that the x86_64-pc-windows-gnu toolchain itself includes an ld.exe in its bin folder, which should be the GNU linker(?). So I tried removing gcc and clang from my environment, and my Rust code still compiled and ran normally.
So why doesn't Rust use the x86_64-pc-windows-gnu toolchain by default on Windows? After all, it seems that with the x86_64-pc-windows-gnu toolchain you don't need to install anything extra, which is simpler.
You are contradicting yourself here a little: you said above that you installed the GNU toolchain. I highly doubt Windows ships the GNU toolchain out of the box (could be wrong though). I think it is very unlikely you'll find a Windows machine with MinGW installed but not MSVC.
MinGW is a port of the GNU toolchain that targets Windows and the Win32 API, for writing Windows software. The reason MinGW and MSVC are separate targets is that MinGW does a not-that-great job of being a Windows toolchain. It's incompatible with MSVC import libraries, the import libraries it does provide are old (and in many cases flat-out wrong), and it doesn't use the same runtime libraries as MSVC code. I'm also not sure whether GCC is ABI-compatible with MSVC; I know that clang isn't.
I tried cargo build -vv, but it didn't show any linker information. When switching to the x86_64-pc-windows-msvc toolchain, cargo build fails with a "linker `link.exe` not found" error because I didn't install MSVC.
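For reference, cargo also lets you pin the linker explicitly per target in .cargo/config.toml; the path below is only an illustrative assumption for an MSYS2 UCRT64 install, not something required by default:

```toml
# .cargo/config.toml -- hypothetical example
# Point the GNU target at a specific gcc from MSYS2.
# Adjust the path to wherever your toolchain actually lives.
[target.x86_64-pc-windows-gnu]
linker = "C:\\msys64\\ucrt64\\bin\\gcc.exe"
```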
Out of interest I went down the rabbit hole and tried to find out how rustc finds the right linker. I didn't make it that far. I think rustc takes the default linker from the target specification JSON. I ran rustc +nightly -Z unstable-options --target=x86_64-pc-windows-gnu --print target-spec-json, as described in the rustc book, to see the target specification. The JSON says the linker is supposed to be x86_64-w64-mingw32-gcc. The source for the target specification is rust/x86_64_pc_windows_gnu.rs at c8e6a9e8b6251bbc8276cb78cabe1998deecbed7 in rust-lang/rust on GitHub.
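The relevant part of that output looks something like this (trimmed; the exact set of fields varies by compiler version):

```console
$ rustc +nightly -Z unstable-options --target=x86_64-pc-windows-gnu --print target-spec-json
{
  "arch": "x86_64",
  ...
  "linker": "x86_64-w64-mingw32-gcc",
  ...
}
```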
As @jofas pointed out, MinGW doesn't support import libraries created by MSVC, and many import libraries shipped with MinGW are outdated or straight-up incorrect. This means you can't use many libraries when using MinGW, but you can't easily figure out which ones. That is a much worse experience than having to install the Visual Studio Build Tools once at the start, when prompted by rustup.
Zig, for example, defaults to the GNU ABI (via its own bundled toolchain) on Windows.
Several production apps use the MinGW GNU toolchain, such as Git for Windows, Krita, Kdenlive, and most GTK apps running on Windows.
The only case I can think of where you would need x86_64-pc-windows-gnu is if you're using a crate which builds a C library, and that C library needs MSYS2 in order to build. After all, that's basically the entire purpose of MSYS2: some C libraries have incredibly complex builds, using huge Makefiles and Linux-specific tooling, which would be very difficult, if not impossible, to recreate in a way compatible with Windows. Having built many programs with MSYS/MSYS2 myself, I can say these cases are becoming exceedingly rare, perhaps due to the proliferation of build systems better than e.g. Make, which let you make fewer assumptions about your environment.
Um. From the rustup 1.25.0 release notes: "One of the biggest changes in 1.25.0 is the new offer on Windows installs to auto-install the Visual Studio 2022 compilers, which should simplify the process of getting started for people not used to developing on Windows with the MSVC-compatible toolchains."
The easiest way to obtain these targets is cross-compilation, but a native build from x86_64-pc-windows-gnu is possible with a few hacks which I don't recommend. Std support is expected to be on par with *-pc-windows-gnu.
These targets can be easily cross-compiled using the llvm-mingw toolchain or the MSYS2 CLANG* environments. Just fill in the [target.*] sections for both the build toolchain and the resulting compiler, set the installation prefix in config.toml, and then run ./x.py install. In my case I ran ./x.py install --host x86_64-pc-windows-gnullvm --target x86_64-pc-windows-gnullvm inside an MSYS2 MINGW64 shell, so x86_64-pc-windows-gnu was my build toolchain.
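Something along these lines (a hypothetical sketch: the exact keys depend on your rustc checkout, and the tool names and prefix are assumptions for an llvm-mingw / MSYS2 CLANG64 setup, not the actual file used here):

```toml
# config.toml for the rustc bootstrap -- illustrative sketch only
[build]
host = ["x86_64-pc-windows-gnullvm"]
target = ["x86_64-pc-windows-gnullvm"]

[install]
prefix = "C:/rust-gnullvm"   # where ./x.py install puts the result

# Build toolchain (the compiler doing the building, here *-windows-gnu).
[target.x86_64-pc-windows-gnu]
cc = "gcc"
cxx = "g++"

# Target of the resulting compiler, built with clang from llvm-mingw/CLANG64.
[target.x86_64-pc-windows-gnullvm]
cc = "clang"
cxx = "clang++"
linker = "clang"
ar = "llvm-ar"
```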
The created binaries work fine on Windows or Wine on native hardware. Testing AArch64 on x86_64 is problematic though and requires spending some time with QEMU. Most of the x86_64 test suite passes when cross-compiling, with the exception of rustdoc and ui-fulldeps, which fail with an error about a missing library; they do pass in native builds though. The only failing test is std's process::tests::test_proc_thread_attributes, for an unknown reason.
Last November I added a new job to our CI to cross compile our project for x86_64-pc-windows-msvc from an x86_64-unknown-linux-gnu host. I had wanted to blog about it at the time but never got around to it; after making some changes and improvements to this last month, in addition to writing a new utility, I figured now was as good a time as any to share some knowledge in this area for those who might be interested.
Before we get started with the How, I want to talk about why one might want to do this in the first place, as natively targeting Windows is a "known quantity" with the least amount of surprise. While there are reasons beyond the following, my primary reason for wanting to cross compile to Windows is the Continuous Delivery pipeline for my main project at Embark.
It's fairly common knowledge that, generally speaking, Linux is faster than Windows on equivalent hardware. From faster file I/O to better utilization of high core count machines, and faster process and thread creation, many operations done in a typical CI job such as compilation and linking tend to be faster on Linux. And since I am lazy, I'll let another blog post about cross compiling Firefox from Linux to Windows actually present some numbers in defense of this assertion.
Though we're now running a Windows VM in our on-premises data center for our normal Windows CD jobs, we actually used to run it in GCP. It was a single VM with a modest 32 CPUs, but the licensing costs (Windows Server is licensed by core) alone accounted for >20% of our total costs for this particular GCP project.
While this single VM is not a huge deal relative to the total costs of our project, it's still a budget item that provides no substantive value, and on principle I'd rather have more/better CPUs, RAM, disk, or GPUs, that provide immediate concrete value in our CI, or just for local development.
While fast CI is a high priority, it really doesn't matter how fast it is if it gives unreliable results. Since I am the (mostly) sole maintainer (which, yes, we're trying to fix) of our CD pipeline in a team of almost 40 people, my goal early on was to get it into a reliably working state that I could easily maintain with a minimal amount of my time, since I have other, more fun, things to do.
The primary way I did this was to build buildkite-jobify (we use Buildkite as our CI provider). This is just a small service that spawns Kubernetes (k8s) jobs for each of the CI jobs we run on Linux, based on configuration from the repo itself.
To be honest, I rejected this one pretty much immediately, simply because the GNU environment is not the "native" MSVC environment for Windows. Targeting x86_64-pc-windows-gnu would not be representative of the actual builds used by users, and it would differ from the local builds made by developers on Windows, which made it an unappealing option. That being said, generally speaking, Rust crates tend to support x86_64-pc-windows-gnu fairly well, which as we'll see later is a good thing due to my chosen strategy.
I briefly considered using Wine to run the various components of the MSVC compiler toolchain, as that would be the most accurate way to match native compilation for x86_64-pc-windows-msvc. However, we already use LLD when linking on Windows since it is vastly faster than the MSVC linker, so why not just replace the rest of the toolchain while we're at it? This kind of contradicts the reasons stated above for rejecting x86_64-pc-windows-gnu, since we'd be changing to a completely different compiler with different codegen, but this tradeoff is actually OK with me for a couple of reasons.
The first reason is that the driving force behind clang-cl, lld-link, and the other parts of LLVM replacing the MSVC toolchain is that Chrome can be built with LLVM for all of its target platforms. The size of the Chrome project dwarfs the amount of C/C++ code in our project by a huge margin, and (I assume) includes far more...advanced...C++ code than we depend on, so the risk of miscompilation or other issues compared to cl.exe seems reasonably low.
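For illustration, one common way to wire up the LLD part on the cargo side (not necessarily the exact configuration used here, and assuming lld-link is on PATH) is a one-line linker override:

```toml
# .cargo/config.toml -- illustrative only
[target.x86_64-pc-windows-msvc]
linker = "lld-link"
```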
Ok, now that I've laid out some reasons why you might want to consider cross compilation to Windows from Linux, let's see how we can actually do it! I'll be constructing a container image (in Dockerfile format) as we go that can be used to compile a Rust program. If you're only targeting C/C++, the broad strokes of this strategy will still be relevant; you'll just have a tougher time of it because...well, C/C++.
By default, rustup only installs the native host target of x86_64-unknown-linux-gnu, which we still need to compile build scripts and procedural macros, but since we're cross compiling we need to add the x86_64-pc-windows-msvc target as well to get the Rust std library. We could also build the standard library ourselves, but that would mean requiring nightly and taking time to compile something that we can just download instead.
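Concretely, fetching the prebuilt std for the extra target is a single rustup invocation (shown here as a plain shell command rather than the RUN line it becomes in the Dockerfile):

```sh
# Add the Windows MSVC target alongside the default Linux host toolchain
# so cargo can link against its prebuilt standard library.
rustup target add x86_64-pc-windows-msvc
```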