I feel like in a few more years and 2-3 major versions C# will have all the useful features of F#.
It will also keep being much more exciting, because our benevolent corporate visionaries manage to add new gotchas with every major release, and some minor ones too.
It certainly seems absurd to think that xz was the only target Jia Tan had been pursuing for years. Surely there were parallel initiatives to exploit other projects in the security chain.
Bandwidth, and GPU real estate in terms of area. The biggest GDDR7 chips are 3GB with a 32-bit interface, and it has a 512-bit wide bus. And even this is going to moonlight as a space heater.
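A back-of-envelope sketch of what those numbers imply. The 32 Gb/s per-pin rate is an assumption (GDDR7 speed grades vary); the chip count and capacity follow directly from the figures above.

```python
# Memory subsystem arithmetic for a hypothetical 512-bit GDDR7 card.
bus_width_bits = 512
chip_width_bits = 32        # each GDDR7 chip exposes a 32-bit interface
chip_capacity_gb = 3        # GB per chip, per the comment above

chips = bus_width_bits // chip_width_bits           # chips needed to fill the bus
total_capacity_gb = chips * chip_capacity_gb        # total VRAM

pin_rate_gbps = 32                                  # Gb/s per pin -- assumed speed grade
bandwidth_gb_s = bus_width_bits * pin_rate_gbps / 8 # aggregate GB/s

print(chips, total_capacity_gb, bandwidth_gb_s)     # 16 48 2048.0
```

So a 512-bit bus caps you at 16 chips and 48GB with today's densities, which is exactly the "real estate" constraint: more capacity means a wider bus and more board area.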
From a PCB perspective, the LPDDR5(X) interface is quite different from regular DDR5. Same with DDR4 and LPDDR4. Source: I have designed a few boards with different memory interfaces.
Also, our defaults are the opposite of safe (most languages are still mutable by default, rigorous type systems are wildly unpopular, there is a straightforward way to concatenate strings inside a query, etc.), and our disaster-prevention tools and practices most often seem to target symptoms instead of causes (god forbid we rethink our collective ways and create/adopt tools that are much harder to use incorrectly). All of this keeps happening because there is no pressure for it to stop. What’s the incentive to?
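The "concatenate strings inside a query" point in a minimal sketch, using Python's stdlib sqlite3 (table and input are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"   # hostile input

# The unsafe default: string concatenation lets input rewrite the query.
unsafe = conn.execute(
    "SELECT count(*) FROM users WHERE name = '" + user_input + "'"
).fetchone()[0]

# Harder to misuse: a parameterized query treats input as data, not SQL.
safe = conn.execute(
    "SELECT count(*) FROM users WHERE name = ?", (user_input,)
).fetchone()[0]

print(unsafe, safe)  # 1 0  -- the concatenated query matched everything
```

Both paths sit right next to each other in the API, and the dangerous one is the more obvious one to write, which is exactly the "defaults are the opposite of safe" complaint.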
I don’t think there is room for a meaningful and honest discussion about individuals in these circumstances.
English isn't my first language either, and I know how confusing it can be, so let me help here.
Here is the quote from the post: "If crypto people feel that the losses are because of Fed's actions, then they should also agree that the profits are because of Fed's actions". As we know, the Fed's actions have changed: from providing "cheap", almost free money (the rate was something like 0.08%) to much more expensive money at ~4.5% now. It wouldn't be a huge leap of reason to assume that at least some of the impressive growth of crypto was driven by insanely low interest rates. But that's just my opinion; feel free to keep comparing monetary policy to homeopathy.
I understand the point, and obviously interest rates influence pretty much everything and anything. Stocks, tech salaries, housing prices, infrastructure projects, unemployment, etc.: they are all affected by central banks.
But that's not all there is to it, otherwise they'd align perfectly with interest rates, and stagflation would've been impossible, just like whatever it is we have now.
If you look closely, it's the idea that "if X caused Y, it must have also caused !Y" that I compared to homeopathy, not the idea that monetary policy exists.
This is fantastically written documentation. What is even more exciting for me is that this is almost exactly what I have been dreaming and talking non-stop about for the last few years. I never figured out the brilliant insight about the redundancy of operator precedence. On a more embarrassing note, in numerous implementation not-quite-attempts I always ended up gutting loops as a feature in favor of recursion and went with effects for capabilities (among other things). Then I was usually caught off guard by either of those, or by my other darling, partial application, interacting in unexpected ways (probably only for me) with linear types, which always punched me back to square one. I am extremely impressed with how sound and grounded in reality your choices are. Fantastic job.
Fuck, I mean… yeah. Thanks for one more insight: showing me that sending a message/calling a method is a binary operation. It seems so obvious, but I never made that connection. This day started with shelling by Russian rockets, but reading the post and your comment somehow made it… good?
I always get caught off guard by how much baby we’ve thrown out with the bathwater in modern languages, compared to some of the brilliance we had in Lisp and Smalltalk. Both are far outside my comfort zone, but I learn so much whenever I interact with them. The only thing JS and something like Java taught me is to stay away :)
> Thanks for one more insight, by showing me that sending message/calling a method is a binary operation
The other way around.
A "binary message" is specifically the message of binary operators. Smalltalk also has unary messages (named messages with no parameters), and keyword messages (non-binary messages with parameters).
Its precedence rules are unary > binary > keyword, and left to right.
So
foo bar - baz / qux quux: corge
binds as
(((foo bar) - baz) / qux) quux: corge
You can still get into sticky situations, mostly thanks to the cascading operator ";":
a b; c
sends the message following ";" to the receiver of the message preceding ";". So here it's equivalent to
a b.
a c
now consider:
foo + bar qux; quux
This is equivalent to
foo + bar qux.
foo quux
Because the message which precedes the ";" is actually "+", whose receiver is "foo".
All Smalltalk history talks mention that APL was a major influence, and it didn't have operator precedence either. In addition, the Smalltalk-72 scheme of "parse as you go" would make implementing precedence really awkward.
Smalltalk-76 introduced a fixed syntax and does have a bit of precedence: unary > binary > keyword. All binary messages have the same priority, which, given that you can define new ones, avoids a lot of complexity at the cost of a few extra parentheses.
I would have to respectfully disagree with you on the serializers part here. For me, after decades of fighting with “magic” stuff, simplicity is the key feature. So, for everything that is a simple map function in disguise, I tend to stick with functional languages and approaches.
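What "a simple map function in disguise" can look like, as a minimal sketch (the `User` record and field names are made up for illustration, not from any particular framework):

```python
from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str
    password_hash: str  # internal field that should never leave the server

# The "serializer" is just a plain function from a record to a dict...
def serialize_user(u: User) -> dict:
    return {"id": u.id, "name": u.name}

# ...mapped over a collection. No reflection, no framework magic.
users = [User(1, "ada", "x"), User(2, "lin", "y")]
payload = list(map(serialize_user, users))
print(payload)  # [{'id': 1, 'name': 'ada'}, {'id': 2, 'name': 'lin'}]
```

Everything a declarative serializer class does here is visible in one ordinary function, which is the simplicity being argued for.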
The combination of caching Postgres queries (because you are making over 30 of them per page) while still deploying everything to 6 distributed regions is just perfect, chef's kiss.