Hacker News | kahlonel’s comments

Using setbuf(stdout, NULL) also disables the buffering.
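
For reference, a minimal sketch (assuming a hosted C environment; nothing here beyond the standard setbuf/setvbuf calls):

    #include <stdio.h>

    int main(void)
    {
        /* Must be called before any I/O on the stream. */
        setbuf(stdout, NULL);                /* disable buffering entirely */
        /* setvbuf(stdout, NULL, _IONBF, 0);    equivalent, explicit form  */

        printf("appears immediately, no fflush() needed\n");
        return 0;
    }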


I’m fine with it if they cut the subscription fee to a quarter of what it is now. Nobody I know has 4 TVs in their house. Charging for “4 screens” while enforcing this new policy is a clear cue to unsubscribe from Netflix for good.


There’s nothing unpopular about this. In fact this is the most popular opinion on HN.


Remind me when this company goes bankrupt with millions in investment down the drain.


All the best! There are many _huge_ projects written in pure C. Don’t let people tell you C is a bad fit at your project’s size.


What a nightmare. I would never own a Tesla even if I get it for free.


As someone who likes to keep tabs on what’s going on back home while being permanently away from it, I HATE how Twitter trends are hijacked by political parties to shit on the other parties. Their trends are pure dogshit, nothing more than memes. I get no value from the trending section; anything important is lost in the noise. I tried to switch to the worldwide trends Twitter used to have, but they got rid of those for some reason.


Finally someone said it. Terms like “realtime” and “baremetal” are used so vaguely that they have lost their original meaning.


Originally, the term was really just opposed to batch processing, in connection with systems like Whirlwind, SAGE, and the DEC PDP-1 (the first commercial real-time system), and it is tightly connected to the idea of interactive computing. (Another early real-time system outside the MIT tech path was MIDSAC, 1953.)

Compare Digital founder Ken Olsen's use of the term in his oral history, "The original computing was based on the way people had done computations before. You'd collect all the data, bring it together, process it and send the answers back. The idea of processing it, real time, took a long time to develop. In the world of commercial processing, it's just in the last few years that batch processing has started to disappear. The replacement for it is now called transactional processing, where if you make a transaction with a bank, it is instantly taken care of." (https://www.computerhistory.org/collections/catalog/10263035...)

There are many applications of this, each with its own implications. E.g., if you want a smoothly running interactive program on an early PDP computer, you want program paths of roughly equal run time. That may in turn mean opting out of no-operation paths as late as possible rather than as early as possible, thereby stabilizing the run time (e.g., we perform the calculations anyway, but apply the result only under a certain condition; see the sketch below). Or it may be about complex processes, where it means we fulfil a contract within a guaranteed span of time to facilitate cooperation of any kind (e.g., what Olsen calls transactional processing, or matching sampling rates so digital computers could replace analog lab computers). Or it may be bound to a particular domain where stale data isn't of any use (compare Whirlwind's origin in a digital flight simulator). Or it may just be about a system being able to respond to input at all.
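
A minimal sketch of that “compute anyway, apply conditionally” idea in C (the function and its names are hypothetical, not taken from any of the systems above): the arithmetic runs on every tick, and a branchless select decides whether the result takes effect, so both outcomes cost about the same.

    #include <stdint.h>

    /* Hypothetical: update state every tick, but let the result take
       effect only when 'apply' is nonzero. The work is done either way,
       which keeps the path's run time roughly constant. */
    uint32_t tick(uint32_t state, uint32_t input, int apply)
    {
        uint32_t candidate = state + 3u * input;            /* always computed */
        uint32_t mask = (uint32_t)-(uint32_t)(apply != 0);  /* 0 or all ones   */
        return (candidate & mask) | (state & ~mask);        /* branchless pick */
    }

On a machine with no caches or branch prediction, like an early PDP, this trades wasted work for predictable loop timing.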


These C-hating articles are getting out of hand on HN. I see at least one every week on the front page. If you don't like C, STFU, pick up whatever FOTM BS you like, GTFO of here, and leave us "normies" alone who have to get actual stuff done that absolutely requires C.


I don't think you read the article.


Test

