There’s a whopping huge discussion on Python’s catalog-SIG mailing list about deprecating external links on PyPI (the Python Package Index).
What is the problem? Well, my colleague has this now:

    $ pip install mercurial
    Downloading/unpacking mercurial
    ^C^C^C^C
An endless wait for mercurial to download from PyPI. Bad PyPI, right? No, actually not. Pip (or buildout) looks at http://pypi.python.org/simple/Mercurial/ . This page is full of download URLs like http://mercurial.selenic.com/release/mercurial-1.6.3.tar.gz , which have been failing for a couple of hours now because the mercurial website is currently down (it will probably be up again by the time you read this). What I want to see instead is PyPI-hosted download links like http://pypi.python.org/packages/source/z/zest.releaser/zest.releaser-3.27.tar.gz#md5=f670b3b35b6a4e432fc97fc9659e95df .
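To make the mechanism concrete: pip scrapes the /simple/ page for a package and follows whatever download links it finds there. This is a simplified sketch of that link-scraping step, not pip's actual implementation; the sample HTML fragment and the dummy md5 fragment are made up for illustration.

```python
from html.parser import HTMLParser

# Links below this prefix are hosted by PyPI itself; everything else
# points at an external server that can be down independently.
PYPI_HOSTED_PREFIX = "http://pypi.python.org/packages/"


class LinkCollector(HTMLParser):
    """Collect all href attributes, roughly like pip does on a /simple/ page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)


# Made-up fragment resembling http://pypi.python.org/simple/Mercurial/
# (the md5 hash is a dummy value).
sample_index = """
<html><body>
<a href="http://pypi.python.org/packages/source/M/Mercurial/Mercurial-1.6.3.tar.gz#md5=d41d8cd98f00b204e9800998ecf8427e">internal</a>
<a href="http://mercurial.selenic.com/release/mercurial-1.6.3.tar.gz">external</a>
</body></html>
"""

parser = LinkCollector()
parser.feed(sample_index)
external = [link for link in parser.links
            if not link.startswith(PYPI_HOSTED_PREFIX)]
print(external)
```

Every URL that ends up in `external` is a server that has to be up for your install to succeed, in addition to PyPI itself.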
PyPI reliability has been really good lately. But if you have a pip requirements file or a buildout with many dependencies, two or three of those dependencies will be hosted on external non-PyPI servers, like mercurial’s. So instead of depending only on the reliable PyPI, you now depend on three extra servers that can each be down…
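The reliability math behind this complaint is simple multiplication: an install only succeeds when PyPI and every external server are all reachable at the same time. The uptime figures below are assumed for illustration, not measured numbers.

```python
# Illustrative (assumed) availability figures, not measurements.
pypi_uptime = 0.999       # PyPI itself: very reliable lately
external_uptime = 0.99    # a typical external project server

# With three dependencies hosted externally, an install only succeeds
# when PyPI *and* all three external servers are up simultaneously.
install_success = pypi_uptime * external_uptime ** 3
print(round(install_success, 3))
```

Even with generously reliable external servers, the combined chance of a clean install drops noticeably below what PyPI alone would give you, which is exactly the effect the pip-hanging-on-mercurial example shows.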
Andreas said it best: “I give a shit at the arguments pulled out every time by package maintainers using PyPI only for listing their packages.”
Some arguments might be valid, but these projects are, taken as one group, actively breaking pip and buildout regularly.
So I agree with Andreas. I don’t really care about “the arguments pulled out every time”. Effectively and actively breaking pip and buildout is bad, period.
Update: Donald Stufft mentioned https://crate.io/externally-hosted/, a list of python packages not hosted on PyPI. That’s quite a list, actually.
My name is Reinout van Rees and I work a lot with Python (programming language) and Django (website framework). I live in The Netherlands and I'm happily married to Annie van Rees-Kooiman.