I don’t understand the platform thing; is that something to do with running on Windows? Why wouldn’t you just pip install? Why bring conda etc. into the mix?

If you have conflicts then you have to reconcile those at the point of initial install - pip deals with that for you. I’ve never had a situation in 15 years of Python packaging where there wasn’t a working combination of versions.

These are genuine questions btw. I see these common complaints and wonder how I’ve not ever had issues with it.



I will try to summarize the complaints (mine, at least) in a few simple points:

1- `pip freeze` will miss packages not installed by pip (e.g., ones installed by Conda).

2- It includes every package in the environment, even ones not used in the project.

3- It just dumps all packages, their dependencies and sub-dependencies, in one flat list. Even without conflicts, if you happen to change a package it is very hard to keep track of which dependencies and sub-dependencies need to be removed (see the sample output after this list). At some point, your file will be a hot mess.

4- If you install a platform-specific package version, that information will not be tracked.
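To illustrate point 3: a freeze right after installing only `requests` already mixes the one direct dependency with its sub-dependencies in a flat list (versions shown are made up for illustration):

    $ pip install requests
    $ pip freeze
    certifi==2023.7.22
    charset-normalizer==3.3.2
    idna==3.6
    requests==2.31.0
    urllib3==2.1.0

Nothing in that output tells you that only `requests` was actually asked for.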


1/4- Ordinary `pip install` also works for binary/platform-specific wheels (e.g., numpy) and even non-Python utilities (e.g., shellcheck-py).
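On point 4 specifically: the requirements format does support PEP 508 environment markers, so platform-specific pins can be recorded in the file itself (the package names below are just examples):

    # requirements.txt
    pywin32==306; sys_platform == "win32"
    uvloop==0.19.0; sys_platform != "win32"

Note that `pip freeze` won’t write these markers for you; you have to maintain them by hand.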

2/3- You only need to track the direct dependencies _manually_; for reproducible deployments you need pinned versions of all dependencies, but those are easy to generate _automatically_ (`pip freeze`, pip-tools, pipenv/poetry/etc.).
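A minimal sketch of the pip-tools variant (the `flask` pin is only an example):

    # requirements.in -- hand-maintained, direct dependencies only
    flask>=2.0

    $ pip-compile requirements.in       # writes a fully pinned requirements.txt
    $ pip install -r requirements.txt

The generated file annotates which pins are pulled in via which direct dependency, which helps with complaint 3.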


Ok. I think that’s all handled by my workflow, but it does involve taking responsibility for requirements files.

If I want to install something, I pip install it and then add the explicit version to the base requirements file. I can then freeze the current state to requirements.txt to lock in all the sub-dependencies.
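Something like this, assuming a hand-edited base file next to the frozen lock (the file names are just my convention):

    $ pip install somepackage                              # hypothetical package
    $ echo "somepackage==1.2.3" >> requirements-base.txt   # pin the direct dep by hand
    $ pip freeze > requirements.txt                        # lock every sub-dependency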

It’s a bit manual (though it only takes a couple of CLI commands), but it’s simple and robust.


This is my workflow too. And it works fine. I think the disconnect here is that I grew up fighting dependencies when compiling other programs from source on Linux. I know how painful it can be and I’ve accepted the pain, so when I came to python/venv I thought, “This isn’t so bad!”

But if someone is coming from data science rather than dev-ops, then no matter how much we say “all you have to do is...”, the response will be: why do I have to do any of this?


I don't think manual handling of requirements.txt is a robust process in a collaborative environment; it wastes time and resources. And whatever your workflow is, it is clearly not standard, and it does not address the first and fourth points.


Haha. Ok. I think that’s where we’re just going to have to agree to disagree.


Problems 1 and 2 can be solved by using a virtualenv/venv per project.
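For example, with the standard library `venv` (no third-party tools assumed):

    $ python -m venv .venv            # one isolated environment per project
    $ source .venv/bin/activate       # Windows: .venv\Scripts\activate
    $ pip install -r requirements.txt

`pip freeze` inside the activated environment then only sees this project’s packages.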

3 is solved by the workflow of manually adding direct requirements and not including transitive dependencies. That may not work for everyone; something like pipreqs might work for many people.
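Roughly, pipreqs scans the project’s imports instead of dumping the whole environment (as I understand its CLI):

    $ pip install pipreqs
    $ pipreqs .        # writes requirements.txt from imports found in the source

so it sidesteps both the “unused packages” problem and the flat sub-dependency dump.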

I do not understand why 4 is such a problem. Can you explain further?


Can you name a package manager (any language) that handles #3 well?

How does it handle the problem?


Yes, there are more problems with Windows.



