
The issue, I’d assume, is that it’s difficult to CMP (chemical-mechanically polish) diamond.


This. A quick scan of the Wikipedia page on diamond's material properties suggests you are very correct. It appears very chemically inert, with some notable exceptions: "Resistant to acids, but dissolves irreversibly in hot steel"

https://en.wikipedia.org/wiki/Material_properties_of_diamond

Also, diamond particles removed/liberated from the workpiece that failed to fully dissolve chemically into the slurry would then contribute to the abrasive in the slurry. If the slurry abrasive were not also diamond, that could lead to some serious scratching/gouging of the work surface.

Perhaps not insurmountable, but wow, that sounds like a stiff challenge, especially when accounting for cost.

I wonder if diamond would be machinable with a dry (plasma) etch instead? I am purely speculating here; this is far out of my wheelhouse. But SiO2 is also very chemically inert (though considerably softer than diamond), and manufacturers regularly dry etch it.


Yay for fracking!

Yay for natural gas!


New hardware could greatly reduce inference and training costs and solve that issue.


That's extremely hopeful and also ignores the fact that new hardware will have incredibly high upfront costs.


Great, so they just have to spend another ~$10 billion on new hardware to save how many billion in training costs? I don't see a path to profitability here, unless they massively raise their prices to consumers, and nobody really needs AI that badly.


10^11 cycles is not “practically unlimited endurance”; that’s less than two minutes of use at 1 GHz.


You can't use a single cell of RAM at GHz frequencies. By the time you read a value and write another value back, you're talking about ~200 ns, so you are capped at ~5 MHz writes (and anything you are actually trying to access that quickly will be in caches anyway, so your writes won't make it out to the DRAM unless you explicitly flush the caches).


200 ns seems a bit high. But if you do the math, you'll find it makes practically no difference: 10^11 cycles at 5 MHz is still only about 6 hours.
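For reference, a quick sketch of that wearout math (Python; the cycle count and write rates are just the numbers from this thread, not vendor specs):

    # Time to exhaust a 1e11-cycle cell at various sustained write rates.
    ENDURANCE_CYCLES = 1e11

    for label, rate_hz in [("1 GHz", 1e9), ("5 MHz", 5e6)]:
        seconds = ENDURANCE_CYCLES / rate_hz
        print(f"{label}: {seconds:,.0f} s = {seconds / 3600:.2f} h")
    # 1 GHz: 100 s = 0.03 h;  5 MHz: 20,000 s = 5.56 h (~6 hours)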

DRAM appears to be closer to 300 hours at reasonable temperatures [1], under a worst-case workload.

It would be interesting if Google released their failure rates, like they did with hard disks vs SSDs.

[1] modeled failures, page 75: https://repository.tudelft.nl/record/uuid:e36c2de7-a8d3-4dfa...


Skimming through the linked paper, I can't actually find that claim being backed up anywhere?

The abstract does indeed say "It was found that the system reliability decreases to 0.84 after 1·10^8s at a stressing temperature of 300K", but I can't find anything close to that in the sections about Bias Temperature Instability or Hot Carrier Injection.

The only thing that looks close to me is the rather acute failure in the Radiation Trapping section - but that also states that the failure mode depends more on the total dose than on time, and the total dose at which it fails is somewhere between 126 krad and 1.26 Mrad. For reference, a dose of 1 krad is universally fatal to a human.

In other words: don't put unshielded DRAM in a nuclear reactor?


I included the page number with the link to prevent this, and also noted that these were modeled failures. I had trouble finding any real-world data, which is where the Google comment came in.


>DRAM appears to be closer to 300 hours

Yikes! Things that you don't necessarily want to know. Another one is that GPUs are released crawling with bugs; only the ones without cheap driver workarounds get fixed.


Sure, but conventional DRAM endurance is 10^15 cycles or more.


> >10^3s retention, >10^11 cycles endurance

The implication is that, refreshed once per retention period, it could theoretically hold a value for 10^3 s × 10^11 cycles = 10^14 s (~3 million years).
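A quick sanity check on that figure (Python; this idealizes the part as doing exactly one refresh write per retention period):

    # 1e3 s retention x 1e11 write cycles, one refresh write per period
    total_s = 1e3 * 1e11                    # 1e14 s
    years = total_s / (365.25 * 24 * 3600)  # seconds per Julian year
    print(f"{years:.1e} years")             # 3.2e+06, i.e. ~3 million years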


Yes, but most memory workloads don't store the same value for 15+ minutes at a time. And if you're using it as long-term storage (so basically a flash alternative) that 15-minute retention time is awfully low.


Decades of research into optimizing DRAM refresh efficiency suggest that you don’t understand how the world is using DRAM.


Unfortunately, thermal storage has very poor discharge rates (a full discharge usually takes longer than a day), as well as surprisingly high costs once you factor in inefficiencies and turbine cost.


As was repeatedly explained in that other thread, thermal storage of the kind described there is inherently a long-term storage technology, and this drives the design to minimize capex, not maximize round-trip efficiency. The focus on efficiency is fundamentally misplaced there, as it becomes orders of magnitude less important compared to diurnal storage (which batteries appear to be well on their way to dominating).

Long term storage and diurnal storage are complementary technologies, sort of like the different levels of cache and main memory in a computer memory hierarchy. Combining them appropriately reduces cost vs. using just one of them.

Anyway, the technology as described would produce heat at 600 C for as little as $3/GJ, which nuclear would have a hard time competing with.


$3/GJ is $108/MWh, which any large-scale fission buildout would easily beat for thermal energy costs.


You misplaced a decimal point. A MWh is 3.6 GJ, so it's $10.8/MWh.

$3/GJ is about the current Henry Hub price for natural gas, and, as you should know, cheap natural gas like this is what killed the "nuclear renaissance" in the US.
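The conversion, for anyone checking (Python; prices as quoted in this thread):

    # $/GJ -> $/MWh, using 1 MWh = 3.6 GJ
    def usd_per_gj_to_usd_per_mwh(price_per_gj):
        return price_per_gj * 3.6

    print(usd_per_gj_to_usd_per_mwh(3.0))  # 10.8 -> $10.8/MWh, not $108/MWh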


Oh my bad, you’re right

Re: natural gas, agreed. It wasn’t solar that did it, though; storage is much more expensive.

Thermal energy still needs to drive a turbine to generate electricity


Sure. 600 C is about the temperature of steam in a coal fired power plant, so one of the use cases here is to take an old coal plant and replace the heat source. It's much higher temperature than the steam in a LWR, so the turbine can be smaller and cheaper. Also, no steam generator is needed as in a PWR.


Yes, but one of the reasons coal is being replaced by gas is the capex of the steam turbine.


Yes? That doesn't mean the capex of a steam turbine for this application would be unaffordable, or that this wouldn't have superior economics to nuclear (which also has a steam turbine, and a more expensive one).


MCF (multi-core fiber) has a bunch of challenges, e.g. no good pigtail connector, the need for rotational alignment, the inability to radix (MCF is great for point-to-point, less good if you want to fan out from one chip to multiple chips), etc.

And then even after all that, it’s still 1-2 orders of magnitude lower density than waveguides


They do have higher energy density; it’s just that the power density is low.


This is a lot of words to say “solar + battery LCOE is $70/MWh and gas is $40/MWh, but most places outside the U.S. don’t have access to tons of cheap gas, so it’s more expensive there”

And no, they are wrong, gas is still currently cheaper ($40 is less than $70)


In other places outside the US, the cost of gas electricity is not $40/MWh. In Germany, for example, the cost of natural gas power plants is 110 to 170 euros/MWh.

For solar in Germany, it is 37 to 80 euros/MWh, not including storage.


Where did you get the 110 - 170 figure from?

According to [1] (figures 5 and 6), it's at most around €80/MWh. Am I looking at the wrong stats?

[1] https://ec.europa.eu/eurostat/statistics-explained/index.php...


Not the parent, but I believe they were talking about LCOE, or total cost including building the plant and operating it. So that will be the cost of natural gas plus the rest amortized.


Yes. It would be LCOE for both solar and gas.


Yeah, OK, but without storage you are comparing apples to oranges. Even with storage it is barely comparable, since even with batteries you can't provide power 24/7.


This is factually inaccurate. At solar and storage costs today, they are cheaper than existing fossil generation in all of Europe. In the US, even at today's low fossil gas prices, they are competitive. They will become even more competitive over the next several years as the price of solar and batteries continues to decline, and the US fossil gas market is exposed to global demand via LNG exports. Renewables prices will keep going down, fossil costs will keep going up, very broadly speaking.

Base load is a myth; as long as you can orchestrate low carbon energy (nuclear + renewables + hydro), storage (hydro and batteries), transmission, and load shifting and shedding, the grid will continue to operate at expected service levels. Europe demonstrates this today with high renewables penetration in Portugal, Spain, the UK, and Germany, and nuclear in France (with robust exports to adjacent grids). "Excess" renewables that are curtailed during low demand seasons solve for near term storage as the storage manufacturing/deployment ramp curves upward and the price decline curves downward.

https://pv-magazine-usa.com/2025/07/01/solar-cost-of-electri...

https://ember-energy.org/latest-insights/solar-electricity-e...

https://ember-energy.org/countries-and-regions/european-unio...

https://ember-energy.org/latest-insights/solar-is-eus-bigges...


> Base load is a myth; as long as you can orchestrate low carbon energy (nuclear + renewables + hydro), storage (hydro and batteries), transmission, and load shifting and shedding, the grid will continue to operate at expected service levels. Europe demonstrates this today with high renewables penetration in Portugal, Spain, the UK, and Germany, and nuclear in France (with robust exports to adjacent grids).

As long as you keep the gas peaker plants operating for those few months every couple of years with overcast, low-wind weather (e.g. in 2021: https://theconversation.com/what-europes-exceptionally-low-w... ), which severely limits a lot of the renewable output.

Or you hope you don't have an unpredicted peak/drop where the whole grid falls over (cf. Iberia a few months back).

Handwaving very complex problems as "it's a myth" won't make it go away.


> Handwaving very complex problems as "it's a myth" won't make it go away.

Having evidence that the problem is tractable, while observing the continued rate of deployment of generation and storage, as well as their cost decline rates, is arguably not handwaving anything away. Simply follow along observing China as they continue to prove out the thesis ahead of developed countries.

Enough sunlight falls on Earth in ~30 minutes to power humanity for a year. Everything else is capture, transmission, and storage.

China launches world’s first grid-forming sodium-ion battery storage plant - https://www.ess-news.com/2025/06/03/china-launches-worlds-fi... - June 3rd, 2025 ("With a total investment of over CNY 460 million ($63.8 million) and occupying 34,000 square metres, the Baochi plant is designed for an installed capacity of 200MW/400MWh. Based on a dual daily charge-discharge cycle, it can regulate up to 580 GWh annually — enough to power 270,000 households, with 98 per cent of its energy sourced from renewables. The facility supports more than 30 local wind and solar power stations, alleviating the impact of intermittent supply and facilitating the integration of high shares of renewables into the grid.")

How we made it: will China be the first electrostate? - https://www.ft.com/content/e1a232c7-52a0-44dd-a13b-c4af54e74... | https://archive.today/OSFYo - May 20th, 2025


If we use power-to-gas for the gas peaker plants, they can be just as renewable. We can use excess summer electricity to generate the gas; we've already got the power plants and storage anyway.


RE ".....At solar and storage costs today, they are cheaper than existing fossil generation in all of Europe....." If so, are power prices decreasing in Europe?

  If not, there must be other costs not included, in the above statement?


Europe isn't optimizing for lower prices. It's trying to phase out coal, which might lead to higher prices if gas is used more often:

https://www.theguardian.com/business/2025/apr/20/why-the-uks...


Lots of nuance, citations below. TL;DR: fossil gas volatility from Russia and hydro volatility from climate change have caused the price levels seen. Europe has avoided €59B in fossil fuel import costs due to new wind and solar in the EU since 2019. The evidence shows renewables have been restraining electricity cost increases, and as more renewables and storage are deployed, costs should remain flat (if not decline).

https://www.iea.org/reports/renewable-energy-market-update-j...

https://ember-energy.org/data/european-electricity-prices-an...

https://gmk.center/en/infographic/electricity-prices-in-euro...

https://www.cam.ac.uk/research/news/electricity-prices-acros...


Not sure why you say this. I'm on solar + batteries 24/7, 365 days a year, and use no fossil fuels. I even had the utility company remove the power poles from my place.


Yeah, for some parts of the world that is possible. Not for most of Europe.


Probably pretty doable in Europe. For one, most people don't have air conditioning, and that's a big draw of electricity. And if you aren't using electricity for heat, then realistically your electric draw is not going to be that high.


Heating generally uses much more energy than cooling, and even more so if something is burned.

However, it's true that places with low heat pump adoption tend to have few ACs. For example, ACs are rarer in Germany than in Norway, despite Germany being warmer.


More like comparing fresh apples to preserved dried apples, in that PV is still useful even without storage until it exceeds ~100% of daytime demand; and even then, pumped hydro is like a fridge or something similar to put the fresh apples in, because you've got it anyway for unrelated reasons.


Where is your $40/MWh figure from? The claim in the podcast is that the average LCOE for new gas in the US is "around $76, $78" or "around $70".


Combined cycle can be as low as $37 during low-price periods for natural gas; combustion turbines are around or above the $70 in the podcast. Dedicated peaker plants can be much more expensive, depending on the design.


Where are your numbers from? "During low-price periods" doesn't really make sense here; we're talking about LCOE, which is the average cost over the lifetime of a plant.


LCOE is entirely fictional unless you're looking at a dead plant. Anyone who claims to know what natural gas prices will be over the lifetime of a plant should be avoided at all costs.

If you want numbers, search Google and find your preferred sources. It'll take all of 10 seconds, I promise.


> This is a lot of words to say “solar + battery LCOE is $70/MWh and gas is $40/MWh,

No, that isn't what the article says. I'll quote it for you:

    I think it was around $70 for new gas. It was a weighted US average — from memory, but I might be wrong on that.
I suspect your $40/MWh isn't LCOE; it's the marginal cost of producing an extra MWh from an existing plant. A second problem they don't mention for gas: demand is so high that the wait time for a new turbine is around 4 years. Batteries, on the other hand, can be bought with very short lead times.
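To make the LCOE vs. marginal-cost distinction concrete, here is a minimal sketch (Python; every input is an illustrative assumption for a hypothetical combined-cycle plant, not a sourced figure):

    # Simplified LCOE vs. marginal cost for a hypothetical gas plant.
    capex_per_kw = 1000.0       # $/kW overnight capital cost (assumed)
    crf = 0.08                  # capital recovery factor (assumed)
    fixed_om_per_kw_yr = 30.0   # $/kW-yr fixed O&M (assumed)
    capacity_factor = 0.60      # fraction of hours at full output (assumed)
    heat_rate = 6400.0          # Btu per kWh (assumed)
    gas_price = 3.0             # $/MMBtu, Henry-Hub-ish (assumed)
    variable_om_per_mwh = 2.0   # $/MWh (assumed)

    mwh_per_kw_yr = 8.76 * capacity_factor         # MWh per kW per year
    fuel_per_mwh = heat_rate * gas_price / 1000.0  # $/MWh of fuel
    marginal = fuel_per_mwh + variable_om_per_mwh  # cost of one *more* MWh
    lcoe = (capex_per_kw * crf + fixed_om_per_kw_yr) / mwh_per_kw_yr + marginal

    print(f"marginal ~${marginal:.0f}/MWh, LCOE ~${lcoe:.0f}/MWh")
    # With these assumptions: marginal ~$21/MWh, LCOE ~$42/MWh. A cheap
    # "$40 gas" quote is plausibly the former, not the latter.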


Lazard agrees with you: https://www.lazard.com/media/uounhon4/lazards-lcoeplus-june-...

Marginal costs of $108 for gas peaking and $31 for combined cycle. $48-$107 actual LCOE ($77.5 average).


FEC (forward error correction) latency is >> propagation delay at these distances, so that's probably the dominant factor in most cases.


Hyperspectral camera?

