I think the practical Unix philosophy is more nuanced. There are filter programs that read from STDIN and write to STDOUT by default.

But that is not really practical for the many tools which are not filters first and foremost. So we end up with a hybrid: still read from STDIN or from the list of files given as arguments, but write to the file given by -o (or --output), which can be "-" to signify STDOUT.

I find this pattern very flexible because it allows multiple input files or STDIN, plus a single output file which can be STDOUT, so you still get the filtering behaviour if you want it.
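
A minimal sketch of that pattern in POSIX sh (the tool and its payload are hypothetical; cat stands in for the real work):

    #!/bin/sh
    # Operands (or STDIN) in, single -o file out; -o defaults to "-".
    out=-
    while getopts o: opt; do
        case $opt in
            o) out=$OPTARG ;;
            *) echo "usage: $0 [-o output] [file...]" >&2; exit 2 ;;
        esac
    done
    shift $((OPTIND - 1))

    # With no file operands, cat falls back to STDIN, so "$@" covers
    # both the pure-filter case and the multi-file case.
    if [ "$out" = - ]; then
        cat -- "$@"
    else
        cat -- "$@" > "$out"
    fi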

(For completeness' sake, there is a third, somewhat prevalent pattern. I call these copy-type programs: they treat the last argument specially, usually as some form of target, like `cp a b c target`.)
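
Handling that convention takes a little footwork in POSIX sh, which has no arrays; a rough sketch with the copy itself stubbed out:

    #!/bin/sh
    # Copy-type convention: every operand but the last is a source;
    # the last is the target, as in `cp a b c target`.
    [ $# -ge 2 ] || { echo "usage: $0 source... target" >&2; exit 2; }

    for target do :; done          # idiom: leaves $target set to the last operand

    i=1
    for src do
        [ "$i" -eq "$#" ] && break # stop before reaching the target itself
        echo "would copy: $src -> $target"   # stub; the real work goes here
        i=$((i + 1))
    done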



--output was just one example. Many tools provide formatting options: ls can sort, find can delete the files it finds, and so on.
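
For instance, with standard ls flags:

    ls -lS   # long listing, sorted by file size
    ls -lt   # long listing, sorted by modification time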


I don't believe that programs being able to sort their output goes against the UNIX philosophy. "ls" lists files and does a good job of it. find's -exec stretches it a bit (-delete came later, IIRC), but generally find adheres to the philosophy pretty well.
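
Concretely, -exec ... {} + is POSIX, while -delete is a GNU/BSD extension that avoids spawning an external process at all:

    find . -name '*.tmp' -exec rm -- {} +   # POSIX: batches paths into rm invocations
    find . -name '*.tmp' -delete            # GNU/BSD: same effect, no external rm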

From my experience of programming small utilities for myself, the UNIX philosophy's maxim that a tool should handle a single job and handle it well is a nod to the fallacy of "gluing" things together.

Because when you start to glue things together beyond simple pipes, the glue code becomes exponentially complex and takes up more and more of the program, which creates maintenance hell and hurts performance (much more profoundly on a PDP).

So simple tools doing single things are both easier to write and maintain, and they're much more performant in general.

Remember the guy who left a whole Hadoop cluster in the dust with GNU command line tools and some pipes, on a small laptop.
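
The shape of that approach, as a generic sketch rather than the exact pipeline from that write-up (the filenames are invented):

    # Stream, filter, aggregate: each stage is a separate process,
    # so the pipeline runs in parallel across cores for free.
    cat games/*.pgn | grep -F 'Result' | sort | uniq -c | sort -rn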



