> but one day they could suddenly start ranting about their own political opinions or crazy beliefs.
Why is this a problem? I don't mean to be confrontational here, but by this I mean: is it about them being "crazy", or us not being able to hold complexity and ambiguity? Politics has to emerge somewhere, and it's not like we have third spaces for these rants in our modern world (save for a few die-hards at your local town-hall meeting).
Also, I think cartoon politics is something that tends to emerge out of somebody's experience. Often it is armor. I think if you learn to not take them at face value, then it can really give you a quick insight (not always accurate) about what makes somebody tick.
I don't think you're being confrontational, and I don't think it's a problem either to be honest. My point was more that, try as one might, you can't build the ultimate curated list of non-political follows because somebody will eventually write something that you consider political. It can't be avoided, which I think is what you're saying too.
I personally think that people try too hard to avoid politics and shame those who "make things political" – especially in tech. We live in an inherently political world, and our industry is increasingly political as it's co-opted by political figures and even dictators across the world. Trying to avoid talking about it is like stuffing our fingers in our ears and pretending reality isn't real, imo.
All 5G gay frogs aside, this is a power problem and not really a people problem. How many establishment institutions are left that citizens would wish to enthusiastically uphold? We have come to almost expect corruption these days.
This isn't a justification for irrational conspiracy theories (which are generally harmless, yet occasionally highly catastrophic). It's that the establishment's whack-a-mole approach is not working, and is probably exacerbating the problem.
> this is a power problem and not really a people problem
It’s an education and channel incentive problem. Our kids’ literacy is crashing [1]. And most Americans get their news through channels that are incentivised by selling ads.
I totally agree about literacy being huge, and would actually extend it to going beyond literacy.
The literacy crash is alarming, and is no doubt agitating this situation in a major way. However, I think what we are experiencing is something like a siloing of private realities. Not the pearl-clutching 'echo chamber' discourse from 2019, but an increasing lack of social competency amongst younger people that is preventing them from being present with others in the world itself.
This is why I still think it is a power problem. Government, however incompetent, still has the monopoly on control and policy. They have experts yelling at them every day about these problems. But their answer seems to be more censorship and surveillance, rather than addressing the causes of these problems. As I mentioned, this only exacerbates the problem and makes it more socially dangerous.
The fact that Flat Earthers believe the earth is flat isn't the problem. The fact that people of such low intellectual quality have so much power over the rest of us is the problem.
>Yes but a flat-earther is a useful idiot for precisely zero dangerous social movements.
Incorrect. As with most if not all conspiracy theories, flat-earthism incorporates anger at "the establishment" because "the establishment" is hiding the truth. And this is the hook. If you can be convinced that a secret cabal is manipulating all science, controlling all governments, censoring all media and filtering all information in order to keep the basic nature of reality hidden from humanity - which flat earthers do believe - then you're susceptible to someone suggesting who that cabal might be. You know who.
True. I was being hyperbolic, but I hoped it was clear that I meant flat-earthers are nowhere near the threat level of something like the QAnon or antivax movements, which are far more politically activated, willing to take matters into their own hands, and likely to incite actual physical harm through ideologically driven behavior.
I wouldn't undersell the harms of general conspiracy theories. Most of them become more harmful the more people believe in them, and the social media I see has a tendency to spread the more harmful types (medical or political misinformation, which I see all the time on the internet now).
Yes. It has become mainstream. As soon as Kirk died, people were spouting antisemitic BS conspiracies that 'it was all Israel', even though he was the biggest shill for Israel's policies!
As you say, the medical conspiracies have really evolved since covid. I'm just glad we had covid when we did, because I feel that 5 years later people are so much more ignorant and less willing to all go through something together for the greater good.
With that said, I think there's a lot of conspiracy theory that just doesn't really hurt anyone but the believer. Aliens, moon landings, Illuminati, etc. Kind of the modern-day opiate of the masses.
> With that said, I think there's a lot of conspiracy theory that just doesn't really hurt anyone but the believer. Aliens, moon landings, Illuminati, etc. Kind of the modern-day opiate of the masses.
I'm more and more convinced that these are gateway conspiracy theories. That's where my dad started innocently enough. I remember listening to Coast to Coast AM with him. Art Bell and those "innocent" conspiracies. A couple of decades later he's a completely different person with completely different beliefs, all still aligned to conspiracy theories he picked up on AM radio. But now it's about Muslims taking over the country and wanting to kill or convert us all, and the Clintons' and Obamas' plans to enable a Sharia takeover.
Are those examples at all indicative of what conspiracy theories are anymore? They seem stuck in the last century. The only one of those I've literally ever seen, even online, is aliens (and just UFO videos). On the other hand, I personally know a lot of people who consistently buy into medical and political misinformation (and social media pushes it to me at least weekly, no matter how much I try to say I'm not interested).
I know a lot of people who have had to cut off family members because they got too deep into conspiracy theories and it's pretty much always weird political ones.
Yes, I feel these types of conspiracies were structurally similar, but of a slightly different composition to political conspiracy. What changed, I think, is that political conspiracy became activated by the atomisation of culture and the lack of social consensus. As a result, these former examples (UFOs, moon landings, etc.) now feel almost quaint and cartoonish, because they are relatively inconsequential compared to the choices people make and the socially harmful actions they'll undertake under political conspiracy.
For a few reasons I personally find that the best medicine is just to nod along with these people and watch them give away their ideological hand:
a) You know where they stand and who you are dealing with.
b) They can, however rarely, be forced to actually confront the irrational logic when sharing it.
c) I think it is the compassionate thing to do, as people just often want to spout these theories as a much needed release valve. After all, people believe this stuff often because of a confusion or frustration they have with their own lives.
> wackiest conspiracy theories are probably the ones most promoted by the establishment
Not really. They're wacky because, being believed by the establishment, they have consequences. I'm not bothered by flat Earthers and vaccine deniers. I am bothered when they're in power, because now their beliefs have influence.
They're wacky because people want to live in a wacky world of cults and Satan and aliens and ciphers and wheels within wheels, where at least something is in control and there is an order to reality, however obscure and evil, as opposed to a world of chaos and mundane grasping evil where there is no purpose beyond rich bastards getting richer.
So you are saying this is more effective than foraged mushrooms in a dark dorm room - paired with a Winamp visualizer using real-time DirectX plugins and shader-based graphics? Forgive me for being a bit skeptical
I don't know if it was intentional, but for whatever reason, I find the specificity of "Winamp visualizer using real-time DirectX plugins and shader-based graphics" in this context quite funny.
More effective? No. Just a ballpark state similar to the article, which I believe is more like the initial come up where you still have the ability to snap back to reality. Quite efficiently and reliably. Can't edit my OP but the caveat should be applied that I was already experienced w/ psilocybin upon purchasing the device.
Can't claim it would produce the same results for everyone, but I provided a free, low friction option for anyone curious.
Geez, foraged mushrooms and Winamp make me think we’re of the same generation. Those were good times…
This isn’t well known because his name was (deservedly) mud at the time, but Timothy Leary did a lot of work with sound and lights. He did his light shows when he was a pop guru, but he was even doing this work before he got fired from Harvard.
At the time, he considered it following the history of altered states. In the nineties when he had mellowed out, Leary started talking about lights, sound and technology. Here’s one example:
I've had a few experiences with non-chemically induced altered states. They're psychedelic-ish, but not really comparable to a substance like psilocybin. They're definitely altered states; it's just that, while I could draw a picture to describe the mild effects of psilocybin to a non-user, I couldn't with music and light.
The sensation induced by binaural beats is based on brainwave synchronisation: basically, we get control of the brain's stick shifter, and we perceive the changes strongly because they are much faster than usual.
TLDR: you definitely feel them, and it feels a bit like getting high.
Chemically induced states of altered consciousness are of a fundamentally different nature. Keeping with the car-centric metaphor, it would be like switching the type of fuel you are giving the engine. It feels different, for different reasons.
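If it helps make the mechanism concrete, here's a rough sketch of how a binaural-beat track can be generated. The 200/210 Hz tones and the 10 Hz difference are purely illustrative numbers I picked for the example, not a claim about what the article's device or any particular app actually uses:

    # Minimal binaural-beat sketch (illustrative frequencies only):
    # each ear gets a pure tone at a slightly different pitch, and the
    # brain perceives the 10 Hz difference as a slow "beat".
    import math
    import struct
    import wave

    SAMPLE_RATE = 44100
    DURATION_S = 10
    LEFT_HZ = 200.0   # tone for the left ear
    RIGHT_HZ = 210.0  # tone for the right ear; 210 - 200 = 10 Hz beat

    frames = bytearray()
    for n in range(SAMPLE_RATE * DURATION_S):
        t = n / SAMPLE_RATE
        left = int(32767 * 0.3 * math.sin(2 * math.pi * LEFT_HZ * t))
        right = int(32767 * 0.3 * math.sin(2 * math.pi * RIGHT_HZ * t))
        frames += struct.pack("<hh", left, right)  # interleaved 16-bit stereo

    with wave.open("binaural_10hz.wav", "wb") as w:
        w.setnchannels(2)           # stereo: one tone per ear
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(bytes(frames))

Play the resulting file through headphones (it has to be headphones, since the effect depends on each ear receiving its own tone); over speakers the two tones just mix acoustically into an ordinary amplitude beat.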
Sure, but maybe you have a goal that can be achieved with either method, even if doing so feels different. If they'll both achieve the goal, foregoing the inconveniences of drugs sounds pretty great.
Work is central to identity. It may seem like it is merely toil. You may even have a meaningless corporate job or be indentured. But work is the primary social mechanism that distributes status amongst communities.
A world in which 99 percent of jobs are done by AGI (and there remain no convincing grounds for believing this tech would ever be achieved) feels ungrounded in the reality of human experience. Dignity, rank, purpose, etc. are irreducible properties of a functional society, which work currently enables.
It's far more likely that we'll hit some kind of machine intelligence threshold before we see a massive social pushback. This may even be sooner than we think.
Have you considered that perhaps tying dignity and status to work is a major flaw in our social arrangements, and AI (that would actually be good enough to replace humans) is the ultimate fix?
If AI doing everything means that we'll finally have a truly egalitarian society where everyone is equal in dignity and rank, I'd say the faster we get there, the better.
Pretend I'm a farmer in 1850 and I believe that the current proportion of jobs in agriculture (55% of jobs in 1850) will drop to 1.2% by 2022 due to automation and technological advances.
Why would hearing "work is central to identity," and "work is the primary social mechanism that distributes status amongst communities," change my mind?
My apologies if you thought I was arguing that a consequence of AGI would be a permanent reduction in the labor force. What I believe is that the Baumol effect will take over non-replaceable professions. A very tiny part of our current economy will become the majority of our future economy.
What if the girl above is crying and appears hurt because she has been mollycoddled, and this is a strategy to get attention?
Perhaps the parents had clocked on to this and were just letting the girl self-soothe so she could learn resilience. Then, on cue, in steps some member of the public with their own opinion on the child they're trying to raise. This would be kind of tiring for the fatigued parent of a toddler, and the frustration of the parent in the above scenario is justifiable, particularly as encounters like this could happen multiple times daily with a child like that.
Now they could also just be a shitty parent. There's plenty of them. But it's difficult for us to judge and make hard rules in cases like this.
I'd guess no. While they have similar training data, there is plenty of novelty and unique data entering each model due to how each user is using it. This is why ideas like model collapse are fun in theory, but don't really play out due to the irregular ways LLMs are used in the real world.
I could be wrong, but I have not heard a convincing argument for what you propose.
> Information to warez groups: Since the source is open, it should be simple to turn the demo into a fully featured version. Please let me know of any problems.
Makes sense, as their name suggests they are affiliated with one of the original warez crews, Radium, who were around in the late 90s. Or the name is an homage to Radium, who did a lot of massive releases that these developers may have grown up on.
Amnesty International should be dismantled; they have lost all of their credibility by recirculating direct Russian propaganda, and now their reporting about Israel is basically a direct Hamas PR branch.
I sometimes wonder if the true "digital divide" comes down to those who were able to develop critical thinking skills prior to these last few years.
If you had previously developed these skills through wide-ish reading and patient consideration, then tools like LLMs are like somebody handing you the keys to a wave-runner in a choppy sea of information.
But for those now having to learn to think critically, I cannot see how they would endure the struggle of contemplation without habitually reaching for an LLM. The act of holding ambiguity within yourself, which is where information is often encoded into knowledge, is instantly relieved here.
While I feel lucky to have acquired critical thinking skills prior to 2023, tools like LLMs being unconditionally handed to young people for learning fill me with a kind of dread.
I think a lot of doomerism over AI making everyone stupid is really overestimating how many people have good critical thinking skills today. I've worked in high skill engineering domains for nearly twenty years now and even well-accomplished people are often pretty unimpressive when it comes to really understanding complex concepts.
I anticipate smart people will not suffer some handicap simply because easy answers are now readily available for the vast bulk of people who were never going to think critically in the first place.
I think I am relatively smart and I can feel myself getting dumber when I over-rely on AI for certain things. I've had to carefully observe the effect and choose what I "allow" myself to use it for.
Example: when I use it for prose for fiction or my personal blog, I can feel my own imagination for constructing good sentences and my own vocabulary deteriorate much faster than I would've expected. But when I used it purely for brainstorming plot points _or_ doing a quick proofreading pass on a set of notes I already took myself that I want to post, I feel no sudden negative effects.
Feeling that sudden difference in competency is jarring - in a good way. It's very useful information. I wonder whether people growing up with these tools from the beginning won't get that benefit - they won't have a "before AI" and "after AI" brain to "jar" them into awareness and adjustment. Then again... maybe the skills I am trying to preserve will be seen as entirely irrelevant to them regardless.
There is this overly optimistic view that the average person is intelligent, which is definitely not the case. AI doesn't make people stupid; it just exposes the existing stupidity.