I’ve spent years slogging through Windows updates and ever-shifting policies, and now everyone’s touting the OS as a developer-friendly platform. But seriously: when did Windows become the unquestioned choice for development? With the introduction of things like the Windows Subsystem for Linux (WSL) and a push for integrated development environments, it feels like Microsoft is scrambling to cover every base, yet often ends up with half-baked solutions.
I’ve set up several dev environments over the years, and every time I dig into configuring compilers, package managers, or even just getting a simple build system running, I’m met with layers of abstraction that seem designed to obscure rather than simplify. Is it really best practice to rely on WSL for a seamless Linux-like experience on a fundamentally Windows system? How many of us are still dealing with weird interoperability issues, unexpected environment behavior, or the need to jump through hoops to use certain Unix tools natively?
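To put a concrete face on the “hoops” I mean: below is a rough Python sketch of the kind of path-translation shim I keep rewriting whenever a Windows-side tool needs a path from the WSL side. The names (`running_under_wsl`, `to_windows_path`) are mine, purely illustrative, and the “microsoft in the kernel release string” check is a common heuristic rather than any official API. WSL does ship a real `wslpath` utility for this, which is half my point: the fact that you need a translator at all is the interop tax.

```python
import platform
from pathlib import PurePosixPath

def running_under_wsl() -> bool:
    # Heuristic, not an official API: WSL kernels report
    # "microsoft" in their release string (e.g. "...-microsoft-standard-WSL2").
    return "microsoft" in platform.uname().release.lower()

def to_windows_path(posix_path: str) -> str:
    # Naive /mnt/<drive>/... -> <DRIVE>:\... translation for
    # Windows drives mounted inside WSL. Illustrative only;
    # the real `wslpath` tool handles more edge cases.
    parts = PurePosixPath(posix_path).parts  # ('/', 'mnt', 'c', ...)
    if len(parts) >= 3 and parts[1] == "mnt" and len(parts[2]) == 1:
        return parts[2].upper() + ":\\" + "\\".join(parts[3:])
    return posix_path  # not under a mounted Windows drive; leave as-is

if __name__ == "__main__":
    print("Under WSL:", running_under_wsl())
    print(to_windows_path("/mnt/c/Users/dev/project"))
    # -> C:\Users\dev\project
```

Trivial code, sure, but some variant of it ends up in every cross-boundary build script I write, and that’s exactly the abstraction tax I’m complaining about.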
I’m curious about others’ real-world experiences. Can anyone actually set up a truly robust, native development environment on Windows without resorting to workarounds or virtual machines? Or have we simply been sold a story that Microsoft’s “one OS for developers” is a reality, when it’s clearly just another patch on an old system?