Because it doesn’t really matter for the use cases where Python is typically popular, like data science or GIS or education or home automation or general hobbyist programming.
Python IS a popular programming language (see the Technology section of the 2025 Stack Overflow Developer Survey).
But programming languages don’t have to be compiled.
Compiling programs isn’t as important as it used to be in the 90s and prior, when computers were way slower and interpreted programs were meaningfully slower and the Web was a much less mature platform. These days, many popular languages (Python, Javascript/Typescript, PHP, Ruby, Lua, etc.) are usually interpreted at runtime, and many other languages/systems do some sort of hybrid compilation.
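Python itself is a good example of that hybrid approach: CPython quietly compiles your source to bytecode first, then interprets the bytecode. You can see the compiled form with the standard `dis` module (the exact instruction names vary a bit between Python versions):

```python
import dis

def greet(name):
    return f"Hello, {name}!"

# CPython already compiled this function to bytecode when it was defined;
# dis just disassembles that bytecode so we can read it.
dis.dis(greet)
# Prints a listing of bytecode instructions such as LOAD_FAST.
```

So "interpreted vs. compiled" is less a hard line and more a spectrum that most modern runtimes sit somewhere in the middle of.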
Compilation isn’t strictly “better”, either: it can make distribution harder (code signing and per-platform binaries, versus just shipping a script), security more problematic (direct memory access opens up whole classes of vulnerabilities), and the developer experience worse (waiting minutes or even hours for a build, instead of the near-instant feedback loop you get in Python or Javascript), etc.
The main benefit of compilation is performance, but computers and phones are so blazingly fast that the CPU is almost never the constraint anymore. For things like AI and data crunching, the real work is offloaded to the GPU (or to compiled libraries) anyway, and the Python part is just glue code that isn’t doing the actual number crunching.
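You can see the "glue code" effect without any third-party libraries at all: the built-in `sum()` runs its loop in compiled C inside the interpreter, while a handwritten loop does every addition in interpreted Python. (A toy benchmark; the exact timings depend on your machine.)

```python
import time

n = 5_000_000
data = range(n)

# Pure Python loop: every addition goes through the interpreter.
start = time.perf_counter()
total = 0
for x in data:
    total += x
loop_time = time.perf_counter() - start

# Built-in sum(): Python is just glue; the loop runs in compiled C.
start = time.perf_counter()
total_builtin = sum(data)
builtin_time = time.perf_counter() - start

print(f"interpreted loop: {loop_time:.3f}s  built-in sum(): {builtin_time:.3f}s")
```

NumPy, PyTorch, and the rest take the same idea much further: the Python you write is a thin layer over compiled (or GPU) kernels.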
Outside a few niche areas like gaming, real-time audio, and operating systems, compilation just doesn’t matter all that much anymore.
What you’re talking about is more an issue with packaging and operating-system rules on code distribution. That’s not a matter of compilation per se, but of each operating system’s balance of ease of use, security, and performance. That balance can change over time too: even in Windows’ own case, traditional Win32 EXEs were eventually displaced by somewhat interpreted/hybrid systems like .NET, distributed via ClickOnce (and the story changed even more later on). iPhones, Macs, and Androids have their own setups that have also changed over time.
Part of the reason I suggested PyScript is because the Web has become kind of a de facto evergreen distribution platform where you can write your code any way you like and some “runtime” (in this case a Python interpreter running in WebAssembly) seamlessly executes the code for you, all without your user knowing or caring and using the common UI language of HTML.
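For a sense of what that looks like in practice, a minimal PyScript page is roughly the sketch below (the release URL/version here is an assumption for illustration; check pyscript.net for the current one):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Version/URL is illustrative; grab the current release from pyscript.net -->
    <script type="module" src="https://pyscript.net/releases/2024.1.1/core.js"></script>
  </head>
  <body>
    <!-- This Python runs in the browser via a WebAssembly interpreter -->
    <script type="py">
      from pyscript import display
      display("Hello from Python, running in your browser!")
    </script>
  </body>
</html>
```

Your user just opens a URL; they never know or care that there’s a Python interpreter involved.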
If instead you compiled it, you’d have to deal with differences in UI across platforms, different human interface guidelines, different permissions systems (like whether the app is allowed to access files in other folders), different terminals/shells and hotkeys, varying levels of ANSI color support, line breaks and path formats and user home directories and all that nonsense, on top of actual operating system and kernel and hardware differences.
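To make the path/home-directory point concrete, here’s the kind of per-platform juggling the standard library has to do for you in a native app (the output differs on Windows vs. macOS vs. Linux; the `.myapp` name is just a made-up example):

```python
import os
import sys
from pathlib import Path

# Each of these differs across Windows, macOS, and Linux:
print("platform:      ", sys.platform)      # e.g. 'win32', 'darwin', 'linux'
print("path separator:", os.sep)            # '\\' on Windows, '/' elsewhere
print("line ending:   ", repr(os.linesep))  # '\r\n' on Windows, '\n' elsewhere
print("home directory:", Path.home())       # C:\Users\... vs /Users/... vs /home/...

# pathlib papers over separator differences, but not over where files live:
config = Path.home() / ".myapp" / "config.toml"  # hypothetical config location
print("config path:   ", config)
```

In a browser, none of this is your problem; it all lives behind the runtime.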
The web abstracts all of that away so you can focus on just the business logic, the part of the code you actually care about. Let the interpreters and runtimes and browsers take care of the rest for you. In fact, the desktop software world has moved so far in this direction that many apps that look like native EXEs or Mac app bundles are actually just websites packaged up in an executable outer layer to satisfy OS distribution rules, but under the hood they’re just HTML and Javascript.
The declining importance of compilation is really orthogonal to your problem, though: mostly you’re just running into Apple’s security requirements and their unwillingness to play nice with the PC world and the VM (virtual machine) world, so it’s not easy to cross-compile for macOS or iOS unless you have access to a Mac, or at least to a VM running on real Apple hardware. It’s a licensing and greed thing on Apple’s part, not Python’s fault.