I hate python package/dependency/virtualenv management so much. Even JavaScript is preferable.
I find npm very easy to wrap my head around. I do npm install ___ and it does a lookup in its repository and downloads it and its dependencies to node_modules. If I want to start fresh, I simply delete node_modules. Everything else "just works" when I invoke node myapp.js. That's it, full stop.
pipenv masquerades as the same thing, but then there is no python_modules folder to be found. Instead it downloads everything to some obscure directory halfway across my computer. If I want to start fresh, I guess I have to trust the tool's uninstall function? And it's unclear whether I need sudo or not. Also, I can no longer just run my program; I have to run it with pipenv now? Python requires too much cognitive burden for module/dependency/virtualenv management.
For me it's to the point that developing using a docker image with globally installed python modules is easier to manage and wrap my brain around than using pipenv/virtualenv/whatever.
PEP 582 is in the works to make __pypackages__ the Python equivalent of node_modules. Not sure if it will get accepted but I hope it does.
> a mechanism to automatically recognize a __pypackages__ directory and prefer importing packages installed in this location over user or global site-packages. This will avoid the steps to create, activate or deactivate "virtual environments".
That’s a great idea. I’ve just been playing with the implementation from the PEP, and it makes things really easy.
The speed of simply extracting a bunch of wheels directly into `__pypackages__/$PYVER/lib/` is a huge benefit. It behaves well if you symlink from a global tree of extracted packages too, like an even simpler version of pundler¹.
If others want to play with it without too much breakage, I massaged the patch onto 3.7². The 3.8 base was a little too big a change for my liking ;)
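For anyone who hasn't read the PEP: the layout is deliberately node_modules-like. A sketch of what a project tree would look like, based on the paths in the PEP draft (the `__pypackages__/3.7/lib` path is from PEP 582; the project and file names below are invented for illustration):

```shell
# Hypothetical project tree under PEP 582:
mkdir -p myproject/__pypackages__/3.7/lib
touch myproject/myapp.py

# A PEP 582-aware interpreter run from myproject/ would prefer
# packages under __pypackages__/3.7/lib over user or global
# site-packages -- no virtualenv to create, activate, or deactivate:
#   cd myproject && python3 myapp.py
ls myproject/__pypackages__
```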
Basically it seems you are lacking only two shell commands:

    virtualenv venv
    . venv/bin/activate

Then everything works with pip:

    pip install numpy
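Spelled out as a complete session, a minimal sketch (using the stdlib `venv` module, which behaves the same as `virtualenv` for this purpose):

```shell
python3 -m venv venv    # same effect here as `virtualenv venv`
. venv/bin/activate     # puts venv/bin first on PATH

command -v python       # now resolves to venv/bin/python
command -v pip          # ditto; `pip install numpy` would land inside venv/

deactivate              # restores the previous PATH
```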
I actually prefer the pipenv way (or the virtualenvwrapper way) of putting virtualenvs into a dedicated location, so that I can wipe them off the hard disk if I want to free some space.
The really interesting, and occasionally bothersome, part of managing your virtualenv is the resolution of dependencies and compatibility ranges (hard to solve in general).
If this works for you, good. I think I would be cautious about giving this out as advice, because you can run into problems this way with regard to different Python versions: the activate script uses the Python version that was used to set up the environment.
That's true, but I assumed GP was referring to the fact that venvs are kept in a global, e.g. $HOME/.venvs, rather than in $(pwd)/.venv like $(pwd)/node_modules.
Those shell commands don't do anything to publish a package, do they? Sorry, python is still lacking quite a bit of what e.g. js with npm has had for years.
See, I feel like python has a lot of those benefits and the package management doesn't need to be so complex.
Seriously, python package management can be fairly simple. On most of our machines at work, it's just "virtualenv .env && source .env/bin/activate". Then you install your packages and... everything is in one directory, like node_modules in javascript.
A clean reinstall is easy from there: remove the .env and just repeat.
I feel like pipenv violates KISS, and that a more traditional virtualenv/venv/pyenv setup is the way to go.
> A clean reinstall is easy from there: remove the .env and just repeat.
Repeat what, manually doing a bazillion `pip install`? Another nice thing about npm is the package.json file it creates. This allows us to simply add that to version control, and then all a new dev has to do is clone and run `npm install`, which reads package.json and installs everything listed in it. I'm sure there's a way to do it in python but, like everything else, I bet it's a non-intuitive multistep process.
For JavaScript I typed it all out from memory. For Python I had to consult StackOverflow because I couldn't remember "pip freeze". Yes it can be made simple with use of aliases and such, but like I said in my first post, out-of-the-box cognitive burden is several times greater than, say, npm or yarn.
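For the record, the Python analogue of package.json does exist, it's just a manual step: `pip freeze` writes the installed set to a file, and `pip install -r` reads it back. A minimal round-trip sketch (run in a fresh venv here, so the file comes out empty):

```shell
python3 -m venv demo && . demo/bin/activate

# Snapshot what's installed -- the manual counterpart of package.json.
# This file is what you'd commit to version control:
pip freeze > requirements.txt

# A new dev clones the repo and runs the counterpart of `npm install`:
pip install -r requirements.txt
```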
Your opinion here is totally valid. Npm has a lot more magic involved than doing stuff by hand with virtualenv.
And yes, aliases and bash scripts help a lot, but do increase initial overhead. I have the entire "delete create install" sequence in an alias as well as activate in another.
But envs are just files, you can even skip activate and just do `.env/bin/python` and it works. That's powerful in a linux shell because now I can just use that environment like a regular executable from anywhere, no global installs required.
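That trick is easy to check: the interpreter inside the env knows its own prefix, so no activation step is needed at all.

```shell
python3 -m venv .env

# No activate needed: calling the env's interpreter directly picks up
# its own site-packages, from any working directory:
.env/bin/python -c 'import sys; print(sys.prefix)'
```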
I can appreciate preferring something less manual even if I don't! Ultimately, your machine, your code.
Please do not talk about things you clearly don't know well enough (which is patently the case if you can't remember freeze and deactivate).
> Out-of-the-box cognitive burden is several times greater
You got any stats on that, or is it just your opinion? Because to me, the completely counter-intuitive --save parameter is much more painful to remember.
The venv/pip workflow is not perfect, but what you've described is not the problem.
OK sure, but starting clean is still a two-step process (deactivate, then remove).
> Please do not talk about things you clearly don't know well enough (which is patently the case if you can't remember freeze and deactivate).
Yes, I openly admit I do not know python's package management/virtualenv stuff well enough. And I don't care to learn either, because I don't use it enough for the investment to be worth it. My point still stands that for each javascript package management task that needs to be accomplished, you need 2-3x as many commands to accomplish the same thing in python.
> You got any stats on that, or is it just your opinion?
My opinion, of course, but shared by the dozens of people who upvoted my OP.
> Because to me, the completely counter-intuitive --save parameter is much more painful to remember.
In what way is --save counter-intuitive? If anything pip freeze is counter intuitive. How does it know what to save? Does it just save everything you've ever installed? What if you don't want to save everything, just a few of them?
Also, --save and --save-dev allow you to segregate developer dependencies from production dependencies. Is there a way to do that with python? Again, just a guess, but it's probably going to be an unintuitive 3-4 step process that I'll no doubt find on Stack Overflow.
> Yes, I openly admit I do not know python's package management/virtualenv stuff well enough.
And still, you are here trying to measure the length of your "commands" against others' "commands".
> My opinion, of course, but shared by the dozens of people who upvoted my OP.
Ah great, engineering by acclamation. That usually ends well. That's how we got pipenv, btw: a popular developer stood up and declared "I'll fix it!", to general acclaim from the Powers That Be... and then things broke harder, and here we are.
> for each javascript package management task that needs to be accomplished you need 2-3x as many
That's precisely the perspective that led us to the mess that is pipenv: "npm is the model, we should all be like npm." Except npm fundamentally serves only a few specific needs, and was built on the lawless prairies of an ecosystem with limited aims, no stdlib, and without 28 years of accumulated legacy practices. Python, by contrast, has been pulled in every direction for literally decades, and now has to herd all that legacy into something more coherent. It has to do so slowly (because one constituency or another will be ready to scream about breaking compatibility, as we've just had to endure for about 10 years with py3) and correctly, to avoid ending up with the periodic breakage that happens in npm because this or that package has misbehaved.
> In what way is --save counter-intuitive?
"I've already told you to install, why should I repeat the concept? Are you really so dumb a 'manager' that you would ignore what you just installed?"
And btw, Stack Overflow says --save is actually obsolete since 2013 at least [1], so it looks like you don't know npm very well either. Maybe we should just give up and build an AI that learns development from SO, and find ourselves more meaningful jobs.
> And btw, Stack Overflow says --save is actually obsolete since 2013 at least [1], so it looks like you don't know npm very well either.
"edited Sep 18 at 18:15"
`5.0.0` was introduced May 25, 2017 [1]. Most linux distributions have not picked it up yet in their repos. Ubuntu 18.04 (released this year) is still on npm 3.5. It's unreasonable to expect a developer to be familiar with the bleeding edge, especially when existing projects are locked into using older versions.
I don't know why you are being so contentious about this. I actually love Python and hate JavaScript. But in my opinion, JavaScript has the superior package management setup which is why I am rooting for PEP 582.
In python you just create a virtualenv with "virtualenv whatever". From then on everything is installed in there. If you want to start fresh, you just delete the virtualenv. How is this difficult?
I don't get the node way of doing things. Why should the availability of modules be dependent on the current working directory? Seems brittle to me that your app can't find its libraries because you ran it one directory up/down from where it should be.
As for sudo, there is no situation in which you should use sudo with pip.