Yeah, big plus one from me. I recently tried to investigate some sort of alternative encoding to/from "the prompt," and was swiftly told that it was both not possible and would work against me. As you pointed out, LLMs are trained on language, and language itself is often not terse. Trying to skirt that will cause the LLM to calculate the vectors poorly, because the relationship between the input tokens and its training data doesn't really exist.
I think there's a certain amount of novelty to this, and I find the aesthetic of the language pleasing, but I'm a little confused… Admittedly, I didn't read the entire doc and only quickly glanced at the source… But is it just transpiling Golang code to and from this syntax, or is it intended to be a whole language eventually? Are folks able to just import Golang packages, or do they have to use only the packages that are currently supported?
Additionally, I have two thoughts about it:
1. I think this might be more practical as a transparent layer, so users can write and get back Golang (or whatever the original language was). Essentially, make it something only the model reads and outputs.
2. Longer term, it seems like both NVidia and AMD, along with the companies training and running the models, are focused on driving down cost per token because it's just too damn high. I personally don't see a world where AI becomes pervasive without a huge drop in cost per token: it's not sustainable for the companies running the models, and end users really can't afford the real costs as they are today. My point being, will this even be necessary in 12-18 months?
I could totally be missing things or lacking the vision of where this could go, but I personally would worry that anything written with this has a very short shelf life.
That's not to say it isn't useful in the meantime, or that it isn't a cool project; more that if there is a longer-term vision for it, I think it would be worth calling out.
GlyphLang is intended to be a whole standalone language. It's implemented in Go, but it doesn't transpile to or from Go. It has its own lexer, parser, type checker, bytecode compiler, and stack-based VM. If it helps, the compilation pipeline currently looks like this:
.glyph -> AST -> bytecode -> VM
While the original intent was to have something tailored to AI that a human could still manage, I'm realizing (to your point) that it likely won't be necessary for much longer. I've started making GlyphLang itself significantly more token-friendly and am adding a top layer that essentially does what I think you've suggested: expand and compact commands for bidirectional conversion between symbols and keywords. That will let engineers continue developing in a more familiar syntax on a top layer (.glyphx), while LLMs generate actual .glyph code. Once completed, the pipeline will look like this:
.glyphx (optional) -> .glyph -> AST -> bytecode -> VM
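To make the expand/compact idea concrete, here's a minimal sketch of how a bidirectional symbol/keyword pass could work. It's written in Go only because that's what the compiler is implemented in; the symbols (ƒ, ⮐, ¿) and function names are hypothetical examples, not the actual GlyphLang symbol set, and a real pass would operate on the lexer's token stream rather than raw text:

    // Hypothetical sketch of a compact/expand pass: a single table maps
    // spelled-out keywords (.glyphx) to glyph symbols (.glyph), and the
    // reverse table is derived from it so the two directions stay in sync.
    // The symbols below are illustrative, not GlyphLang's real symbol set.
    package main

    import (
        "fmt"
        "strings"
    )

    var keywordToSymbol = map[string]string{
        "function": "ƒ",
        "return":   "⮐",
        "if":       "¿",
    }

    // Derived at init time so expand is always the exact inverse of compact.
    var symbolToKeyword = func() map[string]string {
        m := make(map[string]string, len(keywordToSymbol))
        for k, v := range keywordToSymbol {
            m[v] = k
        }
        return m
    }()

    // compact rewrites keyword tokens into symbol form (.glyphx -> .glyph).
    func compact(src string) string { return rewrite(src, keywordToSymbol) }

    // expand rewrites symbol tokens back into keywords (.glyph -> .glyphx).
    func expand(src string) string { return rewrite(src, symbolToKeyword) }

    // rewrite does a naive whitespace-token substitution; a real tool would
    // run over the lexer's tokens so strings and comments aren't touched.
    func rewrite(src string, table map[string]string) string {
        fields := strings.Fields(src)
        for i, f := range fields {
            if repl, ok := table[f]; ok {
                fields[i] = repl
            }
        }
        return strings.Join(fields, " ")
    }

    func main() {
        human := "function add ( a b ) { return a + b }"
        machine := compact(human)
        fmt.Println(machine)         // ƒ add ( a b ) { ⮐ a + b }
        fmt.Println(expand(machine)) // round-trips back to the keyword form
    }

Deriving one table from the other is what guarantees the .glyphx/.glyph round trip is lossless, which seems like the property engineers would care about most.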
Regarding #2, that's a great point and actually something I considered, though admittedly maybe not long enough. Regardless, I've tried to develop this with a value proposition that isn't purely about cost (though that does drive a lot of this). I'm also working on these 3 points:
1. Reduced hallucinations: symbols are unambiguous - there shouldn't be confusion between def/fn/func/function across languages (no formal benchmarks yet, but they're planned)
2. Context window efficiency: fitting more code in context allows for better reasoning about larger codebases, regardless of cost
3. Language-neutrality (someone else brought this up): symbols work the same whether the model was trained on English, Spanish, or code
I think even if tokens become free tomorrow, fitting 2x more code in a context window will still significantly improve output quality. Hopefully it will be necessary or at the very least helpful in the next 12-18 months, but who knows. I really appreciate the questions, comments, and callout!
Yeah, I was noticing that too. There is a company out there, Amprius, which has validated silicon-anode lithium-ion batteries that can discharge at 10C (or 20C pulses), come in densities ranging from ~345 Wh/kg to 450 Wh/kg, and are shipping to customers for drones and VTOLs.
I think until we have an independent lab verify the results, it's pretty much impossible to say whether their (Donut Labs') claims are true. The only thing I'm particularly suspicious of is that they claim their battery was verified but didn't say by whom or provide a whitepaper on it. Both of those seem to be the bare minimum for most battery manufacturers, and given their extraordinary claims I'd assume they'd have them front and center.
I've been using Linux as my desktop since 2020. I switched because I wanted to play games and maintain a stable development environment I was familiar with (having run Linux servers for ~15 years at that point). I had long used a Windows machine for gaming and a Mac laptop for development. My Mac was stable enough, but Windows was not: it wasn't blue screens, it was constant, unpredictable updates (sometimes running erratically when I didn't want them to). I had an SSD in the machine with Windows on it, but after installing Pop!_OS (as a happy accident) I never found a compelling reason to use Windows again.
Steam has worked perfectly: click install, hit play, no futzing with drivers or weird updates. The only games I haven't been able to play are League of Legends and some of the new AAA shooters. I'm okay with that because I don't particularly care at this point, and it's not worth maintaining a Windows install just to play them for an hour or so periodically.
Linux has been unbelievably stable. This year I fully upgraded the system and planned on reinstalling, but I didn't even need to: on first boot, my old install was picked up and mostly just worked. I've tried that before on Windows, and it was an unrelenting shit show (that resulted in having to nuke the old Windows install).
The only hitch I've had was installing conflicting NVidia drivers (open source vs. proprietary), which I was able to fix by booting into the command line, nuking both sets of drivers via apt remove, and installing the one I wanted. Took me less than five minutes and my system was working. It also wouldn't have happened if I hadn't tried to be too clever (and if Pop!_OS didn't have some quirks).
I recently set up a mini PC to game on while traveling, and this time I tried Arch. To my surprise, the install was ridiculously easy; the most recent installer makes it a breeze. My only mistake was not noticing I'd installed a few desktop environments, and since the default wasn't the one I wanted, things seemed broken. After selecting KDE from the login menu, et voilà! It worked perfectly. I'm considering switching my primary rig to Arch, but first I'll give the most recent Pop!_OS release a try to see if the newer LTS version gets me access to some new packages.
Linux is great folks. If you stick with a major distro you're likely going to love it. It's really low maintenance and just works. 11/10 would recommend to anyone.
> If you stick with a major distro you're likely going to love it.
Even the smaller ones are unironically pretty fun to work with nowadays. I'm currently rocking Gentoo on my stuff. After the painful setup, it's actually quite easy to maintain.
Yeah, I remember hearing NVidia did the same thing via the Moore's Law is Dead podcast. At this point it seems incredibly unlikely Intel will unseat TSMC anytime soon. TSMC has proven time and time again it is the only fab capable of producing leading edge nodes at the capacity and quality required by the likes of Apple, NVidia, and AMD. It also has substantially deeper pockets than Intel to continue to invest in staying number one.
I think if Intel is to stand a chance, it'll be by gaining momentum and market share via "good enough" nodes rather than cutting-edge ones, essentially taking a page out of TSMC's playbook from the late 2000s and early 2010s. It needs more capital than it can raise, and time, both of which are hard to come by.
> TSMC has proven time and time again it is the only fab capable of producing leading edge nodes at the capacity and quality [...]. It also has substantially deeper pockets than Intel to continue to invest in staying number one.
Before about 2016, you could have said the same about Intel. They were generally considered the process technology leaders. They were more than a year ahead in shipping products on their latest 14nm node, and similarly on their previous 22nm node. There had been several occasions over the previous decades where manufacturers stumbled, not as spectacularly as Intel's decade of malaise, but definitely at the level of scrapped nodes.
So, things can change quite quickly. Intel's 18A node is likely to be "better" than TSMC's current N3x nodes (it is denser and better performing on paper) and will ship before N2, putting Intel momentarily in the lead for process technology again for a quarter or so. Intel was also first with some technologies like backside power delivery (BSPD), which TSMC won't do until A16. Yields are a question, and N2 will be coming out, which probably re-takes the lead... but this is quite a turnaround from the late-2010s situation, right?
The big thing Intel needs is a working foundry pipeline, because there is so much money in high-performance silicon that's not x86 these days. It has long been thought that their CPU design teams were very close to fabrication, which was considered something of an advantage for them; it's likely that has also made their process more difficult for outsiders to use. They've tried and failed several times to get this going and land external design wins, and just never done well, even when their manufacturing was doing really well, including this latest effort (https://overclock3d.net/news/misc/intel-may-cancel-its-18a-l...). Still, it's not impossible, and I'm sure TSMC considers this one of its biggest risks if Intel can bootstrap a self-sustaining foundry business.
> Before about 2016, you could have said the same about Intel.
I'm going to guess that part of the problem is American business culture and the ceding of high-level strategic decisions not to engineers but to MBA types. It's hard to see companies like Nvidia and AMD falling the way Intel fell; both are still (from the outside looking in) very much engineering-driven.
Do you have any facts backing up that opinion? Because while I’ll agree that MBAs who ignore engineering nuance can be a problem, engineers are perfectly capable of running an org into the ground all on their own.
In this case, Intel looks like a variant of the Innovator's Dilemma. Their internal processes, systems, and culture revolve around designing and manufacturing their own chips. Moving to a customer-centric approach is a big switch in culture, and I'm not surprised it's a challenge.
What decisions were MBA decisions instead of engineering decisions? It seems like Intel has just made a lot of bad bets or failed to put its mass behind good ones.
The heights Nvidia has achieved seem incidental and have depended heavily on the transformer/LLM market materializing.
Intel's biggest problem has been management remaining in denial about their serious engineering problems, and believing that they'll have things sorted out in another few quarters. They were years late to taking meaningful action to adjust their product roadmap to account for their ongoing 10nm failures. Putting all their eggs in the 10nm basket wasn't an engineering decision, and keeping them all there after years of being unable to ship a working chip wasn't an engineering decision.
Intel's in a somewhat better place today because while they continue to insist that their new fab process is just around the corner and will put their foundry back on top, they've started shipping new chips again, using TSMC where necessary.
Stock buybacks and huge sums of capital wasted on mergers and acquisitions (that went nowhere) while not investing in the very expensive EUV fabrication equipment that TSMC had been using for years.
To be fair, Intel is finally making the major manufacturing equipment investments needed to catch up.
They previously did stock buybacks and acquisitions of other companies that went nowhere instead of investing in the EUV manufacturing equipment TSMC used. Now they have the more advanced version of EUV in production.
> Intel has reported processing over 30,000 wafers in a single quarter using High-NA EUV exposure, achieving simplified manufacturing by reducing the required steps for a particular layer from 40 to fewer than 10,
Apple likes to own the core components of the stuff they sell. How surprising would it be for Apple to buy Intel's factories and hire away some of TSMC's top scientists and engineers?
If you can make something with the quality Apple wants, they will buy you a manufacturing line in exchange for price, quality and production level guarantees.
Since Intel needs large customers for its Fab and capital investments, it would be a good deal for them even if the profit margin is much lower than they would prefer.
Very surprising. It would be an unprecedented amount of responsibility for Apple to subsume when they could let the fed nationalize it and save a few billion dollars for a rainy day.
Apple really doesn't like to own manufacturing. They want to be in a position of a favored key customer, but still have the ability to switch vendors at a moment's notice if they can get a better deal.
This is wild. I figured it was going to be something about reusing passwords, but no, it's just a treasure trove of secrets from folks formatting JSON and saving it to a public link. I can't for the life of me understand why anyone would do this; I haven't used a language that doesn't have some sort of pretty-print functionality built into the common/standard JSON library.
What’s crazier is everyone’s browser can do this with like a single line of code:
> JSON.stringify(data, null, 2)
I suppose it’s technically two lines if you assign the JSON to a variable (like ‘data’ above) first.
This is true; it's not an individual data point. When smartphones like the iPhone originally debuted, carriers had a conniption fit because they couldn't preload a ton of garbage apps to help subsidize the cost. Apple has been able to avoid this, but for your average smartphone this is absolutely how both the manufacturer and carrier are able to sell them so cheaply.
Every experience may not be as bad as the one the OP had, but it’s surely well within reality. Both carriers and handset manufacturers are glad to sell anything and everything about someone to make a quick buck. They’ve literally been doing it for 25+ years.
That sort of crash is what central banks exist to protect against. That could be something like the Fed stepping in as a buyer of last resort and picking up several trillion dollars of equity, and/or pushing rates to zero again to incentivize private capital to more or less do the same.
A global crash of financial systems is unlikely because it'd cause too much pain for everyone. It unfortunately means we plebs are likely the ones paying to bail it out.
Frankly, I would be happy if riots in the streets happen (if there is a bailout of Big Tech in the first place). What's going on recently with OpenAI publicly, and probably with most of the 'AI' players not publicly, is disgusting. It's a bubble, everyone and their mother knows it, and these guys are trying to save their asses by asking for government backing before the bubble even pops. Privatize gains, socialize losses...
It's a shame that more bankers didn't go to jail after 2008, although I find that situation was much more complicated than this one.
Oh, don't get me wrong, if and when this happens, I'll join the riots.
And I'm not even anti-AI; I genuinely believe that, as a technology, it is a major advancement that can and should be put to so many good uses. Which is emphatically not what the people in charge are doing.
Stocks are going to be better at hedging against inflation than any one specific currency, because the underlying companies are able to raise prices. This is something Buffett has talked about, which is mildly detailed here: https://www.investopedia.com/how-warren-buffetts-1-rule-can-...
How the dollar erodes, and its effect on other currencies, will be unpredictable. Corporations raising prices to offset their rising costs is predictable. It's just a matter of finding the corporations and industries that can do so without losing more sales.
I'd assume the urge to "cash up" here is driven by the idea of trying to sell the top and buy the resulting bottom. I highly doubt Thiel thinks AI isn't worth today's market cap, or several multiples of it, in the long term; any "bubble burst" will likely be a generational buying opportunity.
Yeah, big same here. It's pretty frustrating, because I pay for YouTube Premium but cannot use my preferred browser; I have to use Chromium to have it work reliably. Doubly so considering it worked fine in Firefox for YEARS… until maybe six months ago?
It feels like something the FTC should be investigating, or perhaps a European equivalent, but I doubt it will.
The FTC will not investigate anything that was not reported to them. Did you report your experience? They do care about these issues, if you care enough to take six minutes to report it.