I don't use Python, but I've never had any issues with any distributed Python app. I have, however, had endless problems with C and C++ apps.
The C way of referring to header files and libraries on the host system invariably leads to situations where the app wants to use a specific version that your system doesn't have. And we're not necessarily talking about system libs, either. Apparently authors thought the only way to mitigate the problem was to invent Automake/Autoconf in order to sniff what your system is capable of. (The saner solution for non-system libs would be to "vendor" your dependencies inside the app's source tree.)
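Since the rest of this is about Python anyway, here's roughly what the "vendor it" idea looks like translated into Python terms. Just a sketch; the _vendor/ directory name and layout are placeholders of mine, not any standard:

    # app/__init__.py -- ship known-good copies of non-system dependencies
    # inside the source tree and prefer them over whatever the host provides.
    import os
    import sys

    _VENDOR_DIR = os.path.join(os.path.dirname(__file__), "_vendor")

    # Putting the bundled copies ahead of site-packages means the app gets
    # the versions it was developed against, regardless of the host system.
    if os.path.isdir(_VENDOR_DIR):
        sys.path.insert(0, _VENDOR_DIR)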
Python has that pretty much solved with pip. (Dependencies can still be a problem if a package uses the C way to link to things like Readline or OpenSSL or whatever.)
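To make that parenthetical concrete: you can see which C libraries a given Python build ended up linked against, and pip has no say over those versions. A minimal check, assuming a Unix-like system (the readline module isn't available everywhere, and the underscore attribute is private and may be missing on some versions):

    # Print the C-level library versions this interpreter was linked against.
    # pip manages the Python packages; these come from the host system or
    # from whoever built the interpreter/wheel.
    import ssl
    print(ssl.OPENSSL_VERSION)   # e.g. "OpenSSL 3.0.13 30 Jan 2024"

    import readline              # not present on Windows builds
    print(getattr(readline, "_READLINE_LIBRARY_VERSION", "unknown"))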
I deal with biology researchers trying to install various analysis programs, and Python has caused me some pain recently. It's not easy: some packages use Python 2.7, some use Python 3, and many use different environments (pyenv, conda, ...).
I ended up with a separate environment for each package.
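For what it's worth, the per-package environments can at least be scripted with nothing but the standard library. A rough sketch for the Python 3/pip case only; the tool names are placeholders, conda- or Python 2.7-only packages still need their own managers, and on Windows the pip path would be Scripts\pip.exe:

    # Create one isolated virtual environment per analysis tool and install it there.
    import subprocess
    import venv
    from pathlib import Path

    TOOLS = ["toolA", "toolB"]  # placeholders for the actual analysis packages

    for tool in TOOLS:
        env_dir = Path.home() / "analysis-envs" / tool
        venv.EnvBuilder(with_pip=True).create(env_dir)
        subprocess.run([str(env_dir / "bin" / "pip"), "install", tool],
                       check=True)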
To be fair, Java-based software installation isn't much better.
R is oddly a standout in ease of installing packages. (Except the one time it didn't work, but even then it wasn't much worse than anything else.)
In general, software distribution could be made much better.