Hacker News

Would you mind going a bit into detail? I work with npm on a daily basis and I am pretty happy so far.


The whole version range stuff has bitten me many times. I switched to fixed versions in my own package.json files, but the deps of my deps could still be dynamic, which is even worse, since they sit deeper in my dependency graph AND there are more indirect deps than direct ones (~50 direct, >200 indirect).
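For context on why those ranges float: npm's default caret prefix admits any newer minor or patch release. A minimal sketch of that matching rule, assuming plain MAJOR.MINOR.PATCH versions with no prerelease tags and major >= 1 (the function name is made up for illustration):

```javascript
// Sketch of npm's default caret (^) range semantics, simplified:
// ^1.2.3 means >=1.2.3 <2.0.0, so new minors/patches are pulled in silently.
function parse(v) {
  return v.split('.').map(Number);
}

function satisfiesCaret(version, base) {
  const [vMaj, vMin, vPat] = parse(version);
  const [bMaj, bMin, bPat] = parse(base);
  if (vMaj !== bMaj) return false;       // a new major never matches
  if (vMin !== bMin) return vMin > bMin; // any newer minor matches
  return vPat >= bPat;                   // same minor: need >= patch
}

console.log(satisfiesCaret('1.3.0', '1.2.3')); // true  - newer minor sneaks in
console.log(satisfiesCaret('2.0.0', '1.2.3')); // false - majors must match
```

So every `npm install` run can legally resolve a different tree than the last one, as long as each pick satisfies its range.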

Also, npm isn't deterministic, and it got even worse with v3. Sometimes you get a flat list of libs; if a lib is required at multiple versions, the first one encountered gets installed flat and the rest get nested in the directory of their parent lib, etc.
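To illustrate that v3 layout (package names hypothetical): if A needs C@1.x and B needs C@2.x, whichever C is encountered first gets hoisted to the top level and the other is nested:

```text
node_modules/
├── A/                  # depends on C@1.0.0
├── B/                  # depends on C@2.0.0
│   └── node_modules/
│       └── C/          # 2.0.0, nested under its parent
└── C/                  # 1.0.0, hoisted flat (first encountered)
```

Because "first encountered" depends on install order, two machines can end up with different trees from the same package.json.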

The npm-cli is basically a mess :\


The fun of kicking off a CI build after a weekend with no commits and seeing stuff randomly break because some dependency of a dependency got updated and broke things in a minor version is something I've only experienced in JS - beautiful.


In fairness to the language and tools, this seems to be more of a cultural problem than anything.

You can do the same kind of version range tricks in typical Java builds (Maven, for example), but most people hardcode the versions to keep builds as deterministic as possible.
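For comparison, a typical Maven dependency pins an exact version rather than a range (artifact chosen just as an example):

```xml
<!-- Exact version, the Maven norm; ranges like [19.0,) are possible but rare -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>19.0</version>
</dependency>
```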

For some reason, the JS community seems to prefer just trusting that new versions won't break anything. It's either very brave of them, really, or maybe just foolish.


> the JS community seems to prefer just trusting that new versions won't break anything. It's either very brave of them, really, or maybe just foolish.

Let's not pretend that we aren't all blindly tossing random libs of dubious quality and origin that we find on GitHub into our package.json and hoping for the best anyway. My company talks a mean talk about "best practices", but, my god, the dependencies we use are a true horror show.


I hate non-reproducible builds and semver-relaxed dep-of-a-dep issues, but while a broken dep fails the build for lots of people (downside), the upside is that very quickly (within hours of a new dep being published) there will be lots of angry people complaining about it on GitHub, and a faulty dep will typically be rolled back or superseded with a patch quickly. Otherwise, some bugs might sit hidden for a long time.


But can yarn fix this?

Say that I use yarn to depend upon some module X, specifying a fixed version number like the good boy scout that I am. Module X resides on npmjs and in turn depends upon the bleeding edge version of module Y. And then one day module Y goes all Galaxy Note and bursts into flames.

Can yarn shield my app from that?


Yes, it claims to provide a lockfile that locks all your deps all the way down the dependency tree.

You can do (and are supposed to do) the same with npm's own shrinkwrap, but people claim that it doesn't work as intended.
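For reference, a rough sketch of what `npm shrinkwrap` writes to npm-shrinkwrap.json: every dep in the tree gets pinned to the exact resolved version (the package, version, and URL below are illustrative):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "left-pad": {
      "version": "1.1.3",
      "from": "left-pad@^1.1.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.1.3.tgz"
    }
  }
}
```

Nested dependencies get their own `dependencies` blocks inside each entry, so the pin applies all the way down.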


It's not a cultural problem. People make mistakes. People don't know what a non-breaking change is, especially those not well versed in refactoring work.

I don't think Yarn solves any of these problems, tbh. It seems like what we really need is a package manager that tests the API of each package update to make sure nothing in it has broken expectations, in accordance with semver.


It is a cultural problem in that people on other platforms (e.g., Maven) generally don't choose floating dependency versions.


You shouldn't be kicking off CI builds based on unpinned deps (unless you're deliberately doing 'canary testing' etc), because of course that will break. The npm solution for this is 'npm shrinkwrap', and you should always have been using it at your site/project level; otherwise there was no hope it could work.

It's not that npm devs were naive enough to believe that unpinned deps would be safe for reproducible builds.

However, over the years I've heard several people claim that 'npm shrinkwrap' is buggy and isn't fully reproducible (though I've never experienced any problems personally). This is the aspect yarn claims to address, along with a supposedly more succinct lockfile format.


With or without npm-shrinkwrap.json? Not chastising, I'm sincerely asking.


Obviously without, until things broke and we looked into it :D

JS dev has been a minefield like that; the entire ecosystem reminds me of C++, except the lower barrier to entry means a lot of noise from newbies on all channels (e.g. even popular packages can be utter crap, so you can't use popularity as a meaningful metric).


I agree 100%, but the default save behavior of npm install does not help matters: it's much saner to wildcard patch versions only and lock major.minor to specific versions.

This obviously doesn't fix anything, and I think the points in this discussion stand, but I've never understood why the defaults are not more conservative in this regard.
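Concretely, that more conservative default would mean tilde ranges instead of carets in package.json (package names and versions here are hypothetical): `~4.14.0` floats the patch only, while npm's default `^4.14.0` floats the minor as well:

```json
{
  "dependencies": {
    "express": "~4.14.0",
    "lodash": "4.16.4"
  }
}
```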


Well, now you can do that in Cargo as well :-)

What we do currently is we lock everything to an explicit version - even libraries.

At least it's possible to get deterministic builds if you are willing to do a bit of work carefully / manually updating all of your dependencies at once.
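In Cargo.toml, that explicit-version policy is expressed with `=` requirements, which opt out of the default caret semantics (crate names and versions hypothetical):

```toml
[dependencies]
# "=" pins the exact version; a bare "0.3.6" would mean ^0.3.6
log = "=0.3.6"
```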


You shouldn't have that happen with Cargo, given that we have a lockfile. Even when you specify version ranges, you're locked to a single, specific version.


If you follow the best practices for using Cargo you don't have a Cargo.lock file for libraries.

This means your library tests will not be deterministic.

Using the Cargo.lock file for libraries does solve this, but then every binary package you build that references your library will have to be specifically tied to the versions in the library.

We do this internally because it's the only way to provide deterministic builds over time.

Over time I suspect it will get harder and harder to keep this rigid web of dependencies working.

One thing we might try is to enforce the rule 'no library will contain any tests'. At first glance this kinda makes my skin crawl, but maybe if we could find a way where every project that used a library could somehow share the tests for that library it could actually work.

git submodules might actually be able to provide these tests for each binary package. If only git submodules would stop corrupting our git repos... :-(


Hmmm, I feel like there's some kind of disconnect here; could you open an issue on users or the Cargo repo? Thanks!


Ideally for libraries you would try to test with different combos of transitive deps, but it's a large space.

Anyways you can put a lockfile with your library, and it shouldn't affect downstream.


npm shrinkwrap


npm is deterministic when using the same package.json and there is no existing node_modules folder.

And if you want to lock versions for your entire dependency tree, npm shrinkwrap is what you're looking for (it's essentially the same as lockfiles in other development package managers). Though for security reasons I prefer to keep things locked to a major version only (e.g. "^2.0.0"). Shrinkwrapping is useful in this instance too if you need predictable builds (and installs, as it'll use the shrinkwrap when installing dependencies too if it's a library rather than the main app) but want to ensure your dependencies stay up to date.

It's not perfect by any measure, but there are ways to make it work the way you want.


Mhm, that sounds reasonable. I haven't run into problems with versions yet, but I don't really trust npm update, because I am not always sure how it behaves.

From a build tool perspective (we use npm scripts for basically everything [and some webpack]), I am also not missing anything in particular.

Looking at other comparable solutions (from other languages) I'd say npm does a pretty good job.


Sure, it's better than pip or something.

But it's still far from perfect.


The main problem with npm is when it's used internally; that is really painful. Using it when you have an internet connection is just seamless.



