But at the same time, Microsoft is actually throwing people at the task of maintaining it, while Apple is letting theirs languish. Why discourage it if there are active maintainers?
Are you saying that Linux is also doing it wrong? Python packages often come from distro maintainers and not python devs?
Yeah, Linux is doing it wrong. I hate that I can't get a new version of python on my machine because the distro maintainers haven't gotten around to it.
Docker solves that. Yeah, you then have to deal with container problems, but it's the best packaging solution we have at the moment.
Maybe one day we'll take the lessons learned and natively build them into the OS, but distros are still obsessed with distro-specific kludges that require extra packaging work.
We have great universal packaging when it comes from the community at large, but not from single organizations. Pip is a platform-independent package manager! Why don't we have that for Linux software in general? Because each distro already has its own solution, and they won't work together to build a common one.
Both of which will be dead in 10 years: the former because it's solely focused on the desktop, the latter because it's a vendor initiative, not a community one.
You just compile your own from source. Python plays nice with older versions of itself. Use GNU Stow as a secondary package manager to make it easier to remove or replace.
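For example, a sketch of the Stow approach (the version number and `/usr/local/stow` prefix here are just illustrative, assuming you've already unpacked a Python source tarball):

```shell
# Build into a Stow-managed prefix instead of straight into /usr/local.
./configure --prefix=/usr/local/stow/python-3.7.4
make
sudo make altinstall        # "altinstall" avoids claiming the bare "python3" name

cd /usr/local/stow
sudo stow python-3.7.4      # symlinks bin/, lib/, etc. into /usr/local
sudo stow -D python-3.7.4   # later: remove or replace it just as cleanly
```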
You can use, today, the same mechanisms you would force everyone to use to find/install Python (virtualenv, docker, pipenv, conda, whatever is hot this week) in a world without distro-provided Python.
And then halfway through the compilation you find out you are missing fifteen different dev packages, and the only way to figure out which is by googling obscure error messages. Oh, can't find include efglx.h? You obviously have to apt install libstdglx4-dev. (This was probably written deep on some build-environment wiki page that's impossible to find, but since you are on Ubuntu 18 instead of 14 it wouldn't have worked anyway, because now the package has changed name and it's apt install glx-dev instead. Besides, they also forgot to list half of the dependencies required on that page.)
Figuring out how to establish a build environment for any program you want to install really can't be the best way to do things.
Updating and replacing the system Python takes tremendous blood and treasure. Compiling Python from scratch, updating symlinks, resolving dependency issues and finally debugging the whole thing is no small endeavour. Been there, done that; never again.
I've done this dozens of times, and it didn't seem like that big of a deal? I think the place this might have fallen over is the "updating symlinks" part. Don't replace the system Python; install yours alongside it. They don't need to know anything about each other: aliases in developers' profiles and full paths in services/scripts.
Yup, stick it in /usr/local and place that in your PATH and Python search path before other dependencies. If you want to be fancy, put it somewhere in your home directory so other users on your same machine cannot be affected.
It's really not hard. I do this with other compilers and don't really have issues, though I tend to package it up into a container if I'm going to ship it (e.g. build environment for a team, interpreter for a production service, etc).
Or you could, just like on other platforms and with other languages, use virtual environments with a version manager such as pyenv for your development environment. Combine that with pipenv and you get pretty painless python development. To be fair, some of this stuff is more recent though.
Here is an excellent short blog post describing all this and how pyenv and pipenv click together to create a virtual Python environment that is completely independent of your OS vendor's runtime: https://gioele.io/pyenv-pipenv
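For the record, the basic flow looks something like this (the version number and package are just examples):

```shell
# Install pyenv first (see its README for the recommended installer), then:
pyenv install 3.7.4      # compiles a private interpreter under ~/.pyenv
pyenv local 3.7.4        # writes .python-version; this directory now uses 3.7.4

pip install --user pipenv
pipenv install requests  # creates a Pipfile and a virtualenv on that 3.7.4
pipenv run python myscript.py
```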
This stuff is definitely quite new, and I guess the need for it has risen as a consequence of all the great new features that have shipped in Python during the last few years.
I recently had to write a utility script in Python. I chose to use Python 3 and type hints syntax, requiring at least Python 3.6. This was a problem when I realized some of my team use Ubuntu and Debian versions that ship with Python 3.5 (not to speak of Macs with Python 2.x, but those guys use homebrew anyway). Plenty of time was spent researching virtual environments and writing deployment instructions, and I ended up using pip in pyenv (but stopped short of using pipenv, since creating private distributable packages with it seemed convoluted).
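To illustrate the kind of syntax that forces the 3.6 floor: variable annotations (PEP 526) and f-strings both appeared in Python 3.6, so a hypothetical snippet like the one below is a SyntaxError on the 3.5 that those older Ubuntu/Debian releases ship.

```python
# Runs on Python 3.6+; a SyntaxError on 3.5 and earlier.
def describe(name: str, count: int = 1) -> str:
    total: int = count * 2          # variable annotation: 3.6+ only
    return f"{name}: {total}"       # f-string: also 3.6+ only

print(describe("widgets", 3))       # widgets: 6
```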
It's hard if you're trying to update the system Python and need to upgrade every dependency for that, but if you just want to have, say, “python3.7” in your path, it's just a couple of commands:
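Something like the following, assuming a Debian/Ubuntu box with the usual build dependencies already installed (version number is just an example):

```shell
VER=3.7.4
wget "https://www.python.org/ftp/python/${VER}/Python-${VER}.tgz"
tar xf "Python-${VER}.tgz"
cd "Python-${VER}"
./configure --prefix=/usr/local
make -j"$(nproc)"
sudo make altinstall   # installs /usr/local/bin/python3.7; system "python3" is untouched
```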
> Are you saying that Linux is also doing it wrong?
I'd say yes, the coupling of operating system and distribution of software applications has always seemed very wrong to me.
Software should run on the operating system, which functions as a platform, not be coupled with the operating system. Linux package management essentially works like an app store with the additional burden of maintenance loaded on the distribution owners. I'd be very happy if flatpaks and snaps took off quickly, because distribution-agnostic, up-to-date software, maintained by the developer, should be the norm.
> I'd say yes, the coupling of operating system and distribution of software applications has always seemed very wrong to me.
I agree with this in principle, but how far down the chain do you go? Obviously, you need to have some base libraries included. You just have to accept that some of those libraries will be old, and that they won't change so rapidly that it becomes a problem.
I believe that it's the Python Software Foundation that is controlling the install and maintaining updates.
But more broadly speaking, I'm not sure that's going to be enough. Even if you're using virtualenvs, having something installed at the system level means it's shared by all the software you're using. There's bound to be some incompatibility eventually. And then you're stuck waiting for one piece of software to support the latest version of Python. So, you have to keep putting off updates.
I think a better solution is to use something to set the Python version for each program (e.g. pyenv or docker).