I’ll admit it has been deeply entertaining watching the crypto community speed-run 200 years of “finding out the hard way why modern financial regulations exist” and “how to recognize a blisteringly obvious Ponzi scheme” in the span of 15 years.
I say this as someone who got his first 0.05 btc for signing up for a newsletter in 2009 or 2010.
0.05? You probably misremember the amount. For reference, in 2010 the Bitcoin faucet gave away 5 BTC just for visiting a web page. A decent incentive for subscribing to a newsletter, more involved than visiting a web page, would have to have been higher than 5 BTC.
I must be misremembering the year then! I thought it was 5, but when I found the old hard drive last year and managed to recover the wallet with an ancient Bitcoin client, it was 0.05. I don’t remember the details, but I had to let it sync (downloading data?) for days before I could make a transaction to an exchange and sell.
What "chants" are you referring to specifically re differences in markets? AFAIK it has always been pushed as a decentralized alternative to sending money, but that has little bearing on the way the traded price changes?
When was that the chant?
They said more financial inclusiveness and self-sovereignty. There are some with “code is law” and more cypherpunk ideals, that is true. The early internet and open source movement had similar factions, and those people still exist, trying to improve society. Once things become mainstream, the majority of the focus becomes business and what products/services it can provide. The companies in this space that are focused on business want clear regulation, not a lack of it.
Tomorrow your bank could cut you off from your own money because you protested the government[1], and what could you possibly do?
Having the freedom to truly own your own money and investments is valuable in and of itself, no matter how much crypto "devolves" towards traditional finance in other regards.
If your government wants to cut you off from your money, they'll do it for crypto just as easily as they do it for traditional banks. With the exception of direct-crypto purchases from shady internet sites, at some point your payment rails must pass through an entity that your government has some degree of authority over—you're not going to buy your groceries using Bitcoin over an onion service.
Even if crypto becomes 100% ubiquitous, the end game isn't "now the government can't control finance"; the end game is "now the government will find a new way to control the new finance". Eventually, the government will intervene because people will be begging them to, because they don't actually want to live in a world where theft and fraud are irreversible and their entire financial life is tied to a set of cryptographic keys that they barely understand.
You're trying to push a technological solution to authoritarianism, and it's not going to work for the masses. Canada doesn't need crypto, Canada needs voters to hold the government accountable for its abuses.
Apart from some edge cases, Western democracies have quite solid property rights, upheld also by the judicial system. But yes, if you’re in Russia or China then this argument is moot; they have far weaker property rights. However, Bitcoin is also not a solution, because there is no point in owning Bitcoin if you have fallen thrice out of a Russian window, or have been sentenced to a labour camp or to the death penalty in China for "fraudulent activities". Difficult to spend your Bitcoin in both cases.
Not only Russians or Chinese; you should widen the bracket to any individual, organization, or country that the Western democracies deem bad in their view, or even merely suspect of association with them. For example, talk to the many Muslims in the UK whom no bank will grant an account (or whose accounts are closed without notice) and ask how safe they feel [1]
Not that I am a crypto supporter, but the current Western-based financial hegemony is only good until you have the wrong name, religion, country... etc.
Being a trucker in Canada is an edge case. So is being a human interacting with police in the US.
Did you mean intellectual property? Draw Mickey a year ago and an FBI helicopter would soon be overhead to kill your dog and take any cash they found in your wallet.
> People seem to forget part of civil disobedience is going to jail and paying a fine.
That's not even vaguely what happened here and it's well beyond arguing in bad faith to attempt to trivialize it to such.
The goal of freezing the accounts was to make the truckers unable to buy food or pay rent, to force a near-immediate end. If civil disobedience starts meaning eviction and starvation, protests are going to become a thing of the past.
Yes, like you can lose your treasure by forgetting where you buried it. Or you can lose your silver when an accidental fire melts your palace into slag. Or you can lose your wheat harvest to mold! Self-storage is the default way of holding wealth, not a new one. And it's a tradeoff that people will always choose to make as long as personal wealth exists.
Among the innumerable problems with cryptocurrency, this is not one of them.
Honestly, the first thing I'd do is hope that they forgot that I make house and car payments through them, and that they forgot how much damage they could do to my professional life just by intentionally sabotaging my credit. Crypto doesn't provide any realistic protection from their whims should they become bad actors.
In the past few years we've seen ample evidence that crypto is largely a complete scam, and likewise strong evidence that none of crypto's hoped for value will come to fruition (we didn't see it useful for fighting inflation, it's not being used to avoid sanctions, it certainly isn't being used as a currency, etc).
The fact that crypto still has any market value, and that companies like Coinbase not only exist but have had a stellar year, defies the imagination.
I get it a few years back, when there was still a lot of speculation/optimism, but clearly today everyone sees that it is just a con. Today even my most cynical view of markets seems naive.
Most of the ecosystem is scammy garbage, but I think there's still going to be enough demand for sound money that a currency that can't be artificially manipulated by a central bank will do well, relative to currencies that can.
Perhaps I'll be wrong, but that's why I'm not heavily leveraged and I hedge my bets.
I don't think the entire ecosystem is garbage. I do think there are a few useful ideas other than bitcoin. But I can easily support the assertion that more than 99% of the "crypto" things that exist are worthless and/or outright scams.
Takes like these are so moronic it hurts. I just paid an artist across the globe for some icons; the transaction took seconds to resolve and cost me cents to send via SOL. The closest I've come to that is Canada's eTransfer system, but it only works in Canada.
You are ignoring the possibility he got paid in SOL for some other work. Or the possibility the recipient of his SOL can buy dinner using SOL. This is all possible, even if unlikely.
I’ve used DeFi for as long as it’s been around and will continue to do so. It’s safer and can be more profitable than TradFi, even with stablecoin yield. It’s a larger area than you might realise, and there is a lot more to it than degen gambling.
Just because it’s obviously a con, doesn’t negate the possibility of get rich quick outcomes for blockchain participants. Human beings are quite well known for being easily suckered into lazy profit opportunities, and for rationalizing away complex ethical concerns when complicit in suckering others. Otherwise pyramid schemes would never work.
In my opinion, only the scammers, or people who truly believed in the promise of crypto and are still deluded enough to think it will materialize some day. Either way, it is terrible. I can't believe that a digital currency that cannot be used by most people is worth $xx,000. But maybe I am dumb.
Which is fascinating, because sulphur emissions counteract (mask might be a better term) global warming. The reduction in sulphur emissions is suspected to be one of the main culprits behind this year's sudden rise in sea-surface and land temperatures.
Wild when you see just how much sulphur is still being emitted and still presumably cooling the Earth, meaning the effects of climate change we're seeing now are likely a dampened version of the true long-term impact.
> Reduction in sulphur emissions is suspected to be one of the main culprits of this years sudden rise in Earth sea-surface/land temperature this year.
Was there an outright study of the "main culprits" part of this? As I recall there was some evidence but then the main discourse was based on a lot of extrapolation by a viral tweet.
Not to my knowledge, which is why I used the word "suspected" since I think this falls on the "makes intuitive sense, but would not surprise me in the least if it turned out to be completely incorrect" category of hypotheses.
I consider "suspected" to be the least level of evidence while still taking something into consideration as a potential cause. A suspected murderer might not even have been arrested, let alone convicted.
We do know that sulphur emissions have a global cooling effect, and we do know that sulphur emissions recently were reduced, so it's a reasonable hypothesis from first principles.
To be clear, I'm absolutely not promoting increased sulphur emissions as a solution to our climate problems. Moreso pointing out that all those emissions are potentially masking the true severity of our current predicament.
I recall there was a major push against Acid Rain in the 70s-90s. If SO2 emissions were effectively regulated in that period (easy to do because “acid” is scary) then what magnitude of impact did that have on our post-90s warming?
I know plenty of authors and none of them are subsidized by wealthy families. All of them do it part time in the evenings as a labor of love.
It is worth pointing out that there would be nothing particularly odd if writing were subsidized by wealthy families. For the vast majority of the history of writing, it was subsidized and left to monks, philosophers, or aristocrats. It's only in the relatively recent past that writing became a potential occupation for anyone interested with enough skill/talent.
In my experience, in NY, the majority of people working in contemporary literary publishing are Ivy League graduates, mostly women, and they live off of their parents. I'm not judging, just stating my observation.
Literary fiction, yes. That market's so fucked that the vast majority of literary magazines don't pay at all and you'll often get sneered at for asking about pay.
Anyone trying to make any amount of money at writing writes genre fic of one sort or another. Fantasy or maybe sci fi, and probably "juvenile fiction" (tends to sell better to adults, too). Romance (which may or may not actually be straight-up porn, basically). Airport thrillers. Not lit-fic. Never, if your goal is to make any money at all.
And yeah, the publishing side heavily favors people with money, lit-fic or not, for the reason that making a living at it requires excellent connections to get you directly into a high-paying part of it, or else years and years making less than it takes to live on in places like New York, to work your way up the ladder. Either way, that probably means family money. This phenomenon has been mentioned, directly or obliquely, in IIRC all of: Bullshit Jobs (Graeber, 2018), Fussell's Class (1983), and The Official Preppy Handbook (Birnbach et al., 1980).
> only reason for using a traditional publisher is the cash advance then?
A few really important things come to mind:
- Editing. I'm not talking about mere copy editing which you can get done reasonably cheaply, but rather having an editor that is reading through everything and giving feedback is hugely important.
- Layout and printing of the book. There's a lot that happens between writing and having a polished book in your hands. You can contract all this out, but it adds a lot of work.
- Distribution. While the burden of marketing a book has increasingly fallen upon the author these days, if you want your book to be on the shelf at your local Barnes & Noble, then you're much better off going with a traditional publisher.
- Prestige. Like it or not, the vast majority of people on Earth still look down upon self publishing. For some types of books this is less important: technical books and fantasy fiction books can go without in many cases (but if you want to use your book for credibility in something like consulting you'll still want a traditional publisher). But if you want to write on a serious topic it helps a lot to have an academic press publish your work, or if you want to really pursue writing literature you at least want some publisher that is recognized in your relevant community.
Currently I think the only really good use cases for self publishing are the fantasy fiction and niche technical book markets assuming you already have an audience. And even in those cases there are plenty of reasons to go with traditional publishers over self publishing.
I did not downvote, but just wanted to mention that the first two do not require a traditional publisher. In fact none of them do, but especially not the first two.
It is true that there are real quality issues with a lot of self-published work because you don't _need_ an editor to publish your book. Heck, you don't even need to do a self-edit pass. Write it and hit publish! But it is increasingly an expectation that you have one, because quality expectations are extremely high, especially for competitive money-making genres.
I started out self-editing and now pay for three professional edits for each release: developmental, copy, and proofread. Professional editors are not exclusive to traditional publishing houses.
I never claimed that they "require a traditional publisher", in fact I explicitly point out that you can pay for these yourself (though I can't imagine putting together a good team of editors without having prior publishing experience).
My point was that, in response to the parent claiming there's nothing traditional publishers offer, these are things that traditional publishers do in fact offer an author. If you write for a traditional publisher you mostly have to just worry about writing, and, unfortunately, marketing these days.
Yup, and if they think it'll sell then book stores can stock up more. Many book stores don't stock unless the book is distributed as returnable (in case it doesn't sell). Whether self pub or trad pub, unsold books returned by stores come back out of the author's cut. In many cases it doesn't even make sense for the author to physically reclaim returned books as the shipping and storage are more expensive, so they get destroyed.
the funny bit is, you have to set a "retail price" in every country they operate in, and if you set it too low, the bookstore has a loss on each book. So you have to keep increasing the price until the margin is positive.
just in case someone in Australia goes to a bookstore and asks for it :)
I've always found it fascinating that the geophysicist and early advocate for Bayesian methods, Sir Harold Jeffreys, didn't believe in continental drift and plate tectonics because he felt there was no known source of energy on Earth massive enough to explain the movement. [0]
He remained an opponent until death (at which point continental drift was widely accepted) which is both a testament to the literally unbelievable energy behind seismic activity and the importance of updating your Bayesian priors as you gain new information.
> Max Planck, surveying his own career in his Scientific Autobiography, sadly remarked that “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” -The Structure of Scientific Revolutions
This is actually such a great insight. That's why we also need new generations of politicians and management every so often to keep the wheel of progress rolling. And, incidentally, why I think that developing anti-aging technology is not a good idea.
I wonder if there's something physical to do with aging that makes people cognitively inflexible. Anti-aging tech would be great if you could prevent that negative aspect of aging too.
It is also suspicious because it means someone has assigned a 0% prior to their being insane or in some sort of Plato's-cave scenario - which is hard to justify.
The minimum possible Bayesian prior is a base rate of "my senses are just not picking up reality and/or my memory is catastrophically compromised and/or I cannot process logic right now due to some reason" which while low is never going to be 0%. There are too many known ways for human brains to fall over. 0% priors are unjustifiable.
I've always found it fascinating that the guy who came up with the theory (Wegener) died on the ice sheet in Greenland while attempting to resupply a research station. Having spent some time there myself at Summit Station, I can say life on the Greenland ice sheet is much more cushy nowadays...
> And now Google is unusable: using LLMs even just as a compressed form of documentation is a good idea.
Beyond all the hype, it's undeniable that LLMs are good at matching your query about a programming problem to an answer without inundating you with ads and blog spam. LLMs are, at the very least, just better at answering your questions than typing them into Google or searching Stack Overflow.
About two years ago I got so sick of how awful Google was for any serious technical questions that I started building up a collection of reference books again just because it was quickly becoming the only way to get answers about many topics I cared about. I still find these are helpful since even GPT-4 struggles with more nuanced topics, but at least I have a fantastic solution for all those mundane problems that come up.
Thinking about it, it's not surprising that Google completely dropped the ball on AI since their business model has become bad search (i.e. they derive all their profit from adding things you don't want to your search experience). At their most basic, LLMs are just really powerful search engines, it would take some cleverness to make them bad in the way Google benefits from.
> A lot to be said for not defaulting to data frames, in both r and python
I would even add especially in Python. The main issue I have found is that pandas heavy code is just not as easy to integrate into other Python tools/features/abstractions as code using mostly numpy, dictionaries and various comprehensions to do the vast majority of your work.
As a heavy pandas user of several years, I decided about a year ago not to import pandas by default and instead treat most data problems like regular Python problems. I've been genuinely surprised at how much easier it is to create useful abstractions with the code I've been writing, and also how much easier it's been to onboard non-DS devs into the code base.
There are a few obvious cases when Pandas is very helpful, and I'll pull it out in those places, but I've been able to do a tremendous amount of data work in the last year and used very little pandas. The result is that I have an actual codebase to work with now rather than a billion broken notebooks.
> The result is that I have an actual codebase to work with now rather than a billion broken notebooks.
This is the biggest part. Giving yourself permission to make real abstractions, rather than forcing yourself to go directly from data-on-disk to pandas (or whatever) makes it that much easier to test, repeat, modify, and extend whatever analysis you're working on.
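A minimal sketch of that "regular Python problems" approach: the records, names, and numbers below are entirely hypothetical, but they show how far plain dataclasses, dicts, and comprehensions get you for a group-by-sum that might otherwise reflexively reach for a DataFrame.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Order:
    customer: str
    amount: float

def total_by_customer(orders):
    """Group-by-sum using a plain dict instead of a DataFrame."""
    totals = defaultdict(float)
    for o in orders:
        totals[o.customer] += o.amount
    return dict(totals)

orders = [Order("ann", 10.0), Order("bob", 5.0), Order("ann", 2.5)]
print(total_by_customer(orders))  # {'ann': 12.5, 'bob': 5.0}
```

Functions like `total_by_customer` compose with ordinary Python tooling (type checkers, unit tests, other modules) without dragging a pandas dependency through the codebase.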
Resampling, regularizing, binning and forward/backward filling time series data is an absolute pain in the ass using only SQL and/or vanilla python. It does its thing well, there.
(Note that in general, I'm the biggest pandas hater I know)
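The resampling/forward-fill point above is fair: this is genuinely painful in SQL or vanilla Python and a one-liner in pandas. A minimal, hypothetical example (made-up timestamps and values):

```python
import pandas as pd

# Hypothetical hourly sensor readings with gaps relative to the target grid.
ts = pd.Series(
    [1.0, 2.0],
    index=pd.to_datetime(["2024-01-01 00:00", "2024-01-01 01:00"]),
)

# Regularize onto a 30-minute grid, forward-filling the gaps --
# the kind of step that takes a window-function contortion in SQL.
regular = ts.resample("30min").ffill()
print(regular.tolist())  # [1.0, 1.0, 2.0]
```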
Recent experience at a fairly young startup has shown me that open office culture has also started to breed a very different type of programmer.
People will often be pairing nearly all day long, any claim that you need a moment to focus and think about a problem is met with perplexity, every idea should be shipped to prod asap, while tests exist the idea of performing basic QA/manual testing on your own work is only used in the most extreme cases.
Contemporary startup engineering culture is best described as frenetic. It certainly feels hyper productive (if not extremely exhausting for a more traditional, introverted programmer), but I've started to notice a fairly large amount of that "productivity" is fixing mistakes a more focused programmer would have avoided.
I suspect the long-term impact of open offices may be even more deleterious than their immediate impact on the focus of individual programmers.
I was raised by the focused type of programmer, and modern startup culture is horrifying to me (and him). I have left that world and now am solo engineer in a non-profit where I'm responsible for a list of results, not a pile of Jira tickets someone made up to look busy.
That sounds like my ideal job. The longer I've been a developer, the more I've come to dislike work that doesn't address problems faced by end-users/the org.
I couldn't be happier. I am the in-house expert in my field, I replaced an agency that was far more expensive and incompetent, and I have great hours (9-4:30!) and benefits. Oh, and I'm fully remote in an org that's been remote since 2013.
> .... a fairly large amount of that "productivity" is fixing mistakes a more focused programmer would have avoided
Holy cow this hits home. I've been on a number of teams like this going back years (decades) and ... I just don't get it. Had I been 'allowed' another 30 minutes, or an hour, or a day, on problem X... we'd have avoided weeks of unraveling problems later. But... no - gotta keep pressing on, hitting those pre-defined deadlines at all costs.
Deadline Driven Development is foolish. Deadlines are good to have, but you cannot force good, functioning software when more time is needed to craft it. You get what you pay for; you reap what you sow. If you just churn out code in unrealistic time spans instead of sustained effort, you're going to get a bad product. Instead, cut things that can come later and have developers focus on polish. As a dev, and as an end-user who has seen some awful releases, I would rather have a very stable and polished MVP than a rushed dumpster fire.
As a former programmer and now EM, I agree with this. Open offices definitely felt a lot more productive since everyone was always frantically working and communicating. I think people actually have gotten more done since WFH started, though.
This is the crux. There are people who want this feeling, at all costs seemingly, despite no data backing up the assumption that returning to the office makes a materially positive difference and produces positive outcomes.
The fact that working from home means I avoid wanting to put a rifle round through my skull during a commute to the office is pretty strong data that return to office doesn’t work for me
The sweet spot is: don't force people to work a certain way. Before 2020 I was at an open-office, remote-friendly place. We mostly came in except when we needed personal time or whatever, but if I wanted to focus, I'd pop in headphones and crank out code, as would anyone else. If I wanted to pair program I could, and if anyone wanted a quick laugh, we'd just talk for about five minutes, because staring at code non-stop in an office environment can be draining too. I prefer WFH, and I can pair program with devs by calling them on Teams and screen sharing, but if I have to be in an office it won't make much difference to me, beyond the risk and wasted time of the commute.
I love WFH for this. It’s so much easier to plan my day according to what works best for me, like focus moments and my current environment. We still do all the meetings, and pair programming is so immensely better over a call with screen sharing.
All the energy I’d normally expend on “shielding” myself from the office environment can now go into focus and actual creativity.
I think the sweet spot is, let teams decide how often to meet if you're going that route. For example, last place I worked at we were mostly from various parts of the states, so we were considered remote, whilst others lived nearby and had to commute. I think managers should decide wholly how their teams work. If managers need to be onsite, that's reasonable too, though I would assume not always especially if their teams are remote.
I feel like the less technical teams might benefit more from face to face, but developers, a lot of us do our coding at home before our careers even start. It is a hacker's career path.
Not just startups.
I was once hired as a contractor for a major bank in Toronto that was desperate to ship a product, which was way past its promised delivery date. The AVP got the idea to put all of us in a conference room, huddled around a conference table, because obviously us lazy programmers were slacking off in our cubicles and the crappy, almost-daily-changing requirements were not to blame. The entire team began falling sick one by one (pre-COVID era). This was also where I learnt the hard way that it is possible to get the flu twice in the same flu season. It was a hilarious mess. Curiously enough, the AVP got promoted to VP the next year.
And the following year, the whole office switched to open office plan. I think the ability to micro manage people and the power trip for managers explains this logic.
My experience with Canadian “business man, doing business” culture supports this.
Their lives mostly consist of modeling what business is supposed to look like. Never mind that it achieves nothing.
When I go home to downtown Vancouver I’m startled at how damn good-looking everyone is in their suits and pomade hair, in great offices exuding power and dignity. But their GDP per capita is crap compared to us schlubs in Seattle.
They must have learned performative-salaryman from the British.
Those programmers just don't get that they are part of a managerial Broadway musical. They won't even talk, and if they do not talk, how can they siiing.
Chat gpt write me a musical about micro management in software in 3 acts.
I once had a manager consistently try to pressure me and a few other developers to work in a "war room" setting to complete a project that was slipping past the deadline. He wanted to be part of it too, despite being non-technical and consistently slowing us down with impossible prescriptive solutions. It took a non-trivial amount of pushback from all of us to establish that that was the least productive way to get the project completed. He was later laid off.
Inside Facebook's offices in Seattle circa 2019: "overcrowded pig sty" is an accurate description. The smell was overpowering. Two pairs of bathrooms for an entire floor of developers packed shoulder to shoulder in an open-plan hellscape.
I actually really loved the Dexter building. Yes it was all that, but I’m an extrovert and some part of my work day needs have been unmet for 4 years now.
Open plan offices took off in larger companies in the mid-2000s, and I think it's a classic example of a cargo cult.
Executive management looked at the handful of hugely successful startups who had open plan offices and thought, "It must be these open plan offices, that's their secret sauce! We just need to copy that and we'll be successful too!"
...ignoring survivor bias, because for every hugely successful startup who did open plan out of necessity, there was a big graveyard of startups who had the same practice and failed.
The awesome thing about modern programming culture is that rework due to the initial thing you shipped being rushed and shoddy actually looks really good on Tableau. Because you can assign more story points to fixing all the mistakes you made during the last 1-2 days of every sprint.
Bugs and unintentional design deficiencies get zero story points at my job. It's actually something I fought for because it's faux-progress - it's work that's actually a part of the original (likely underestimated) story someone already earned points for.
Everyone has a different take on story points, but the original idea was for them to record _effort_, not value.. more story points are actually worse. Delivered value is better. So digging a ditch and filling it in would get a bunch of story points but have zero value.
But managers want to look at the numbers they have, which is story points.
There's no such thing for us. We don't work in "sprints". We have a giant list of things to do. If you need something to do, you take something off the top of the pile and do it. When you're done, however long it takes, you take another thing off the top and work on that.
Yeah, this definitely gets at something. There's a sizable (or just noisy?) contingent of devs who prioritize activity above actual quality because quality is hard and not immediately measurable (supposedly). And it feels designed to be overtly anti-intellectual, as if the act of engineering is a mostly social act punctuated by the annoying demands made by the compiler, the runtime, and customers.
I suspect it's championed at some places because you're "leveling everyone up."
Is anyone trying to hire only introverted, spectrumy, possibly older developers for their startup? Seems like it could be a big competitive advantage if you have smart managers and don’t do foot-guns like open-plan and pairing.
It's worth pointing out that most of the best science happened before peer review was dominant.
There's an article I came across a while back, which I can't easily find now, that basically mapped out the history of our current peer review system. Peer review as we know it today was largely born in the 70s, as a response to several funding crises in academia; it was a strategy to make research appear more credible.
The most damning critique of peer review, of course, is that it completely failed to stop (and arguably aided) the reproducibility crisis. We have an academic system where the prime motivation is to secure funding through the image of credibility, which from first principles is a recipe for widespread fraud.
>It's worth pointing out that most of the best science happened before peer review was dominant.
It's worth pointing out that most of everything happened before peer review was dominant. Given how many advances we've made in the past 50 years, I'm not so sure everyone would agree with your statement. If they did, they'd probably also agree that most of the worst science happened before peer review was dominant, too.
Our advances in the last 50 years have largely been in engineering, not science. You could probably take a random physics professor from 1970 and they'd not sweat too much trying to teach physics at the graduate level today.
But a biology professor from that time period would have a lot of catching up to do, perhaps too much, especially (but not only) if any part of their work touched molecular biology or genetics.
But there is zero reason why the definition of peer review can't immediately be extended to include:
- accessing and verifying the datasets (in some tamper-proof mechanism that has an audit trail). Ditto the code. This would have detected the Francesca Gino and Dan Ariely alleged frauds, and many others. It's much easier in domains like behavioral psychology where the dataset size is spreadsheets << 1Mb instead of Gb or Tb.
- picking a selective sample of papers to check reproducibility on; you can't verify all submissions, but you sure could verify most accepted papers, also the top-1000 most cited new papers each year in each field, etc. This would prevent the worst excesses.
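The first suggestion above, a tamper-proof mechanism with an audit trail, could be sketched in a minimal (and entirely hypothetical) form as a hash chain over dataset versions: each recorded hash commits to both the data and the previous hash, so any later edit to a published version is detectable.

```python
import hashlib
import json

def chain_hash(prev_hash: str, dataset_rows: list) -> str:
    """Hash this dataset version together with the previous link in the chain."""
    payload = json.dumps(dataset_rows, sort_keys=True).encode()
    return hashlib.sha256(prev_hash.encode() + payload).hexdigest()

# Hypothetical behavioral-psychology dataset, small enough to hash trivially.
v1 = [{"subject": 1, "score": 3.2}, {"subject": 2, "score": 4.7}]
h1 = chain_hash("genesis", v1)

# A reviewer recomputes the hash from the published data; any mismatch
# means the dataset changed after the hash was recorded.
tampered = [{"subject": 1, "score": 9.9}, {"subject": 2, "score": 4.7}]
assert chain_hash("genesis", v1) == h1
assert chain_hash("genesis", tampered) != h1
```

This only works if the hashes themselves are lodged somewhere the authors can't silently rewrite (a journal, a registry), but the verification side is genuinely this cheap for sub-megabyte spreadsheets.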
PS a superb overview video [0] by Pete Judo "6 Ways Scientists Fake Their Data" (p-hacking, data peeking, variable manipulation, hypothesis-shopping and selectively choosing the sample, selective reporting, also questionable outlier treatment). Based on article [1]. Also as Judo frequently remarks, there should be much more formal incentive for publishing replication studies and negative results.
It seems kind of obvious that peer review is going to reward peer think, peer citation, and academic incremental advance. Obviously that's not how innovation works.
the system, as flawed as it is, is very effective for its purpose. see eg "success is 10% inspiration and 90% perspiration". on a darker side, the purpose is not to be fair to any particular individual, or even to be conducive to human flourishing at large.
Glad to see that it's been watered down to "just as bad as" arguments.