libealistand's comments | Hacker News

The irony here is that the massive water price hikes must surely have contributed to less waste of water resources.


It’s not going to plug leaks, though. The amount of water that goes straight into London’s ground is astounding.


Oh yeah, because assuming maximum values in your domain doesn't move all algorithms into constant time and render complexity theory void.


It doesn't though. How would merge sort become constant time if you assume a maximum value? It's also a joke...


The main "value" in the domain of sorting problems is the number of elements in your collection.

A subproblem also considers the elements to be integers; then they become another "value" domain. (But in general, sorting problems only need their elements to be comparable, not necessarily integers.)
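For instance (a minimal sketch of my own, not something from the linked post): once you fix a bounded integer domain [0, k], counting sort gives you O(n + k), which is exactly why the "value" domain changes the complexity picture.

    import Data.Array (accumArray, assocs)

    -- Minimal counting sort sketch: elements are drawn from a known
    -- bounded integer domain [0, k], so sorting takes O(n + k).
    countingSort :: Int -> [Int] -> [Int]
    countingSort k xs =
      concat [ replicate cnt v
             | (v, cnt) <- assocs (accumArray (+) 0 (0, k) [(x, 1) | x <- xs]) ]

    main :: IO ()
    main = print (countingSort 9 [3, 1, 4, 1, 5, 9, 2, 6])  -- [1,1,2,3,4,5,6,9]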


Sure, but in such a world asymptotic complexity has little meaning. For practical purposes it's useful to make the simplifying assumptions that are standard in complexity theory.


Some parts that are done can already be running while other parts are still compiling.

I'm seriously concerned about all the besserwissers in this thread.


I want you to know that I'm enjoying your replies as much as the linked post, if not more.


> Not only it not constant time, it's not even it's polynomial

You understand that this is part of the joke, right?

If we really want to get down to the details and kill the joke, then you don't actually need to wait in real time. Computational complexity is concerned with steps in a computational model, not with how much time passes on a clock. Sleep sort relies on OS scheduler properties, and in a virtual-time environment, time simply advances to the next scheduled event. That's what brings you back to actual polynomial complexity, if you accept this kind of thing as your computational model.
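To make the mechanics concrete, here's a rough sleep sort sketch in Haskell (my own illustration, not the code from the linked post): one thread per element, each sleeping in proportion to its value, with results collected in wake-up order.

    import Control.Concurrent (forkIO, threadDelay)
    import Control.Concurrent.MVar
    import Control.Monad (forM_, replicateM_)

    -- Rough sleep sort sketch: spawn a thread per element, sleep ~x*10ms,
    -- then collect the elements in the order the threads wake up.
    sleepSort :: [Int] -> IO [Int]
    sleepSort xs = do
      out  <- newMVar []
      done <- newEmptyMVar
      forM_ xs $ \x -> forkIO $ do
        threadDelay (x * 10000)               -- the "sorting" step
        modifyMVar_ out (pure . (x :))        -- record wake-up order
        putMVar done ()
      replicateM_ (length xs) (takeMVar done) -- wait for every thread
      reverse <$> readMVar out

    main :: IO ()
    main = sleepSort [3, 1, 4, 1, 5] >>= print  -- [1,1,3,4,5]

The delay constant and the MVar bookkeeping are arbitrary choices for the sketch; the point is that the ordering work is done by the scheduler, whether its clock is real or virtual.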

> - it's psuedo-polynomial.

If you lecture people then please at least get your spelling right.


This is really what makes the joke work IMO.

Haskell's runtime and the OS it executes on exist only as a transient implementation detail of what is, literally, a pure environment!


I still don't get it. Sleep sort still needs O(n) Haskell-steps, while, for example, sorting in bash is just two bash-steps: calling it and receiving the output.

I fail to see the joke, really. I only see false and nonsensical statements, which could still be funny or interesting, but I don't see how.


The "constant time" is wall clock time, where it will be at least the biggest value times the constant microseconds plus like 5% for overhead like the garbage collector.


Complexity analysis deals with asymptotic cases where N gets arbitrarily large. Sleepsort would take the same wall time to sort [1,2] and [1,2,1,2] - but it would presumably take somewhat longer to sort an array of 1e10 ones and twos, because it's not a constant time algorithm.

On the joke part, sleepsort is intrinsically an extremely funny concept and I think everyone here gets that. But "constant time" has a rigorous/pedantic definition, which sleepsort doesn't meet, so I think for some readers calling it that kills the joke (in the same sort of way that it would kill the joke if TFA's code snippets used lots of invalid syntax).


I'd never seen sleepsort before, so I thought it was funny. ;-)

I like the idea of (ab)using the scheduler to sort numbers.

Now I'm inspired to make my own "sorting" routine, maybe touching filenames and then sorting them with `ls -n`, or rendering triangles and then ripping them off in z-plane order, or stuffing frames into a video and then playing it back.


A wet dream for the ubiquitous data-collecting ad industry. Obviously such a service would live in the cloud, it would be "free", and that much personal data about everybody would be too good an opportunity to pass up. Forget FB or Google harvesting whatever little data they can pick up about you indirectly. This would be the holy grail.

Call me a cynic, but in a sense I'm glad we don't have this kind of thing.


Not even if it were self-hostable, open source with an MIT license?


> Something I never really noticed before is that we only use our calendars to look forward in time, never to reflect on things that happened in the past.

This is called a "diary" and is as old as humankind.

I use a calendar to track workout sessions. All in the past.


> Math is pointless from start to finish

And this attitude, my friends, is the reason why so much software out there is so bad.

We need more of a math mindset when developing software. What can we be sure about, what are the invariants, what can we prove? There is so much crap out there that somebody lacking understanding just tried to wing, and I'm constantly ashamed of it.

Computer science is applied math.


He said maths, not CS. A lot of research mathematics has no application.


Number theory had no applications for centuries. Now, cryptography is based on it and the modern internet would be unthinkable without it.

Foundational research often does not provide immediate applications. Still, if we don't do it, our understanding of the world is lacking and it hurts us later down the road.


While there certainly exists math for the sake of math, there is a trickle down effect that is quite real (there’s also a trickle up effect that is real but that’s unrelated). Someone does some math for the sake of math. Later on, someone who is slightly more applied sees a link between that math and a more applied problem they’re working on. If the idea is truly useful, it propagates down all the way to application-focused practitioners. Researchers exist on a spectrum, generally, between pure theory and pure application.

Math has no application until you find an application for it. Differential equations are just equations until you pair them with physics. Formal logic is just an abstract discussion of human reasoning until you build a circuit, etc.


One wonders if trickle-down mathematics is any more efficient than trickle-down economics. It seems like we might be better off not funding pure math, as a forcing function to coerce those minds to work on more applied problems directly, instead of relying on this random serendipity.


It seems like I might be better off picking the winning lottery numbers directly instead of relying on the random serendipity of guessing them and most of the time being wrong.


Flat? I don't think so. I'm at a coffee shop and looking around me, this is very different from just 20 years ago. People with laptops and smartphones around me. Just 20 years ago, that same place saw a few CS-major nerds with their laptops; everybody else sat with a book, some paper to write on, or a paper newspaper. Still with a latte, but tech has changed everybody's life dramatically in 20 years. 20 years from now I'm again expecting at least as big a change.

Definitely exponential.


You're agreeing with the comment you replied to. They were saying it was flat until ~1700 or so.


They wrote that it "appears flat". I tried to contradict that.

> However from inside this bubble there is a sort of tech dilatation where everything appears normal and flat and it is only the future that has exponential growth.


I don't get the drama. To me this looks like a normal process in an industry that discovered a groundbreaking technology. A new technology pops up, everybody rushes to innovate, lots of actors do nonsense, some do the right thing, but for many of them you don't know beforehand which is which. So you throw money at all of them. The rush creates some really good ideas and innovations. And a lot of BS and wrong decisions. Within a few years, the successes emerge and everybody else is flushed out of the market.

Today fiber is the network tech. Except for the last mile, everything is fiber. We're at 400G per port, soon 800G, and the market keeps growing and 1.6T is the obvious next step. Normal growth process.

So, I don't really see the takeaway here. What to learn from it? Not to invest in hype? That's not really how we grow.


> Today fiber is the network tech. Except for the last mile, everything is fiber.

(G)PON, (gigabit) passive optical network, seems to be what everyone is moving to:

* https://en.wikipedia.org/wiki/Passive_optical_network

* https://en.wikipedia.org/wiki/GPON

Even for wireless (LTE/4G, 5G), GPON seems to be used for backhaul:

* https://ieeexplore.ieee.org/document/8650520

* https://www.lightwaveonline.com/5g-mobile/article/14188094/t...

IEEE 802.3ca-2020 allows for 25 GigE per lambda, being able to combine two for 50 GigE; I'm sure there's an ITU equivalent (or will be soon).

* https://www.itu.int/hub/2021/06/new-itu-standards-to-boost-f...

* https://en.wikipedia.org/wiki/Higher_Speed_PON

The main 'semi-hold-outs' are cablecos, which are pushing fibre closer and closer to the premises but generally haven't yet decided to replace co-ax in people's homes:

* https://en.wikipedia.org/wiki/DOCSIS


Depends on the country; where I live it's pretty common to have fiber to the home. Everything runs on that: internet, cable TV and phone calls.


Indeed. I had FTTH in Spain, in Andorra 100% of the (tiny) country has FTTH, I'm in the middle of absolutely nowhere in France atm: rural/seaside (about 45 mins to an hour drive to the closest highway) and yet I've got FTTH.

And at my new place in Luxembourg I've got FTTH too.

Fiber to the home is becoming incredibly common in many countries.

My brother had fiber to the home in 2003 already... In Japan (Tokyo).


But in other areas most residential bandwidth increasingly comes from wireless techs. Even in homes with wired service, nearly everyone uses wifi rather than wires, let alone fiber to end machines. The last mile is generally not fiber, nor is it a literal mile.


> nor is it a literal mile.

The cost problem of the last mile is not for the 2 meters between the fiber router in the home and the living room. The issue with the "last mile" is that it used to be excessively expensive to lay fiber everywhere on the, literally, last mile outside people's homes.

I could plug in a network card with an SFP port and have actual "fiber to the desktop" for the last three meters at my place, but I don't. I'm not sure that these three meters where I run ethernet count as a "last mile" (calling 3 meters a mile is off by a factor of ~530) and mean I don't have FTTH or that the last mile ain't fiber.

> ... most residential bandwidth increasingly comes from wireless techs.

What good does it do laptops and phones using WiFi if the router is doing 40 Mbps max over DSL? It's once you bump that DSL link to FTTH that suddenly all these wireless devices can enjoy much faster speeds. In all the countries I've lived in, people at home used WiFi, even from their phone, instead of 4G because 4G means lots of $$$ / EUR.

I think you're underestimating the gigantic speed boost many people are enjoying thanks to fiber (and not thanks to 4G) now that they have FTTH.


I recently switched from a supposed 900 Mbps Comcast connection to FTTH, and the difference is astonishing. There's just no substitute for Fiber.


> The cost problem of the last mile is not for the 2 meters between the fiber router in the home and the living room. The issue with the "last mile" is that it used to be excessively expensive to lay fiber everywhere on the, literally, last mile outside people's homes.

And yet it was done for electricity and telephones over a century ago. And fibre probably has a longer 'shelf-life' as it does not corrode, so the initial installation is more likely to last longer.


Indeed, many people, including myself, are now in the weird backwards world where the WAN is far faster than the LAN.

I have a 3 Gbps fiber link straight to my house. The only machines in the house that can use anything close to it are the ones that hook up to my USB-C docking station, which has a 2.5 Gbps ethernet port. Everything else is WiFi speeds.

EDIT: This was a sudden change. I live rural, and for the last 10 years I only had 10-15 Mbps speeds tops, and most recently only by point-to-point wireless from a tower 5 km away.


The advantage of a fast WAN is you can put it into a capable router (2.5-5-10g) and break it out into multiple clients who can use it simultaneously even if at only 1g each.


This also applies to wireless technology. My phone generally gets a faster connection via 4G than via WiFi inside my house (the cable connection is fast, but the WiFi is slow). I think it's because the 4G connection can always see a tower through a window opening, while the WiFi has to go through multiple walls. (No interest in setting up a mesh network as the 4G is plenty fast enough.)


This. Only a few years ago Hacker News commenters were crying out for new laws and regulations to subsidize the construction of Google Fiber to every home in America. “It’s a basic human right” they would say.

On the sidelines there was a much, much smaller minority who looked at those comments in horror, with the context of knowing that wireless was a serious alternative that wasn't that far away and was being broadly ignored.


>> wireless was a serious alternative that wasn’t that far away

But it really isn't. There are hard physics limits on how much data can be transmitted within a given frequency band. Fiber's total theoretical bandwidth is essentially infinite in comparison. All the traffic of the entire internet could probably flow through a single fiber bundle perhaps less than a meter wide. For things like streaming 4k/8k/12k (real 4/8/12k) to multiple devices, wifi will never compete with fixed lines.
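A rough back-of-the-envelope (my numbers, purely illustrative): the Shannon-Hartley limit C = B * log2(1 + S/N) caps what any wireless channel can carry, and that capacity is shared by everyone in the cell, while each fiber strand has terahertz-scale optical bandwidth to itself.

    -- Back-of-the-envelope sketch of the Shannon-Hartley limit.
    shannonCapacity :: Double -> Double -> Double
    shannonCapacity bandwidthHz snrLinear = bandwidthHz * logBase 2 (1 + snrLinear)

    main :: IO ()
    main =
      -- e.g. a generous 400 MHz of spectrum at 30 dB SNR (1000x linear):
      print (shannonCapacity 400e6 1000 / 1e9)  -- ~4 Gbps, shared across the cell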


I don't see the problem? I have FTTH. I can get 10 Gbps. I don't want wireless, it's an unreliable last resort. If I have the option I'm absolutely going with fiber.

It's also the better long term tech. Fiber has enormous capacity, while wireless is a shared medium impeded by things like walls, and those problems get worse with the increased frequencies needed for more bandwidth.


The network neutrality discussion belongs in the same bucket. People thought it would be the end of the open internet if there wasn’t a dedicated regulation.


There was dedicated regulation; it was just taken up by states instead of the FCC: https://www.techdirt.com/2023/06/12/telecom-industry-ass-kis...


Interesting. I'm not sure "the entire west coast and huge swaths of the midwest and east coast have passed state-level net neutrality laws" is correct if this linked map is to be believed: https://www.naruc.org/nrri/nrri-activities/net-neutrality-tr...

Huge parts of the US apparently do not have such legislation, and it is unclear which states that "proposed" it have put it into law.


You seem to be agreeing with the author, who notably compares it with railway mania.

> Technologically, fiber communication has been a brilliant success.

> The bubble warped the financial picture. Looking back, Odlyzko compared the millennial bubble to the British railway mania of the 1840s ...

> Looking back, the most important lesson from the fiber optic mania may be that the most successful technological revolutions can be the messiest. The bigger the profit potential, the more manic investors become and the less critical judgement they use.

And the concrete takeaways depend on exactly who you are, but "selling out early" before the bubble pops seems to be the common thread among the big individual winners in this story.

