vertnerd's comments | Hacker News

As someone who grew up with only vinyl in the 60s and 70s, I would never choose it over a CD for audio quality.

BUT I would enjoy recreating the rituals that go with playing vinyl: obsessive cleaning of the disks, the gentle manipulation of a delicate tone arm, and the soft thud when the turntable cover drops. Playing a record was a minor event to be savored. I doubt the younger generations are getting all of that right.


Yeah, this.

Vinyl absolutely CAN sound great. With a nice amp and good speakers, modern listeners will be amazed at the fidelity possible from vinyl played back on a good turntable with a decent signal chain.

BUT.

CD is still better. CD is simpler. You don't have to faff about with cleaning them, or treat them like hothouse flowers. The platform is incredibly portable.

And yet: Vinyl is more fun.

We moved last year. Our audio room can play streaming, CD, or vinyl. It's the first and third options that get by FAR the most usage. CD comes up once in a blue moon.


I'm solo developing a spaceflight simulator on Linux (using the Godot engine), exporting binaries for both Linux and Windows. It turns out that I really didn't need to bother with the Linux export anyway, because Steam runs the Windows version on Linux without any problems.

The ONLY thing I'm still having trouble with under Linux is Steam VR on the HTC Vive. It works. Barely.


Interesting, do you think game devs in the future will just target Windows/Proton and not bother with native Linux ports?

Yes, most likely. Steam is dominant, and it's not hard to make a Windows release that works under Proton.

Though in my case, I currently offer demo/beta releases for both Windows and Linux directly from Github. If I ultimately elect to release my game under a GPL license, then supporting both Linux and Windows directly would make sense.


Ironically, Win32 is the only stable API and ABI on GNU/Linux.

I used to be an educator, and many of my students had an autism diagnosis. I would get to know them and often eventually decide that they were "just like" me, except that whatever their problems were, I had it worse.

So then I would look at these autism checklists and say, "yep, that's me," but when I actually looked at the strict diagnostic criteria, it wasn't that clear.

Looking at this article, I get it. There are other, more focused criteria that can be more appropriate. But those diagnoses don't trigger the special services, so they don't get used often enough.

What is my takeaway? People often don't conform to a model of average human behavior. Being unusual isn't necessarily a grave character flaw (which is what my mother had me believe) but merely an expression of the great variety of human intellect and behavior. It gives me license, without official diagnosis, to enjoy being who I am without shame or embarrassment.


The diagnostic criteria are written by and for neurotypical people. Autistic people are likely to dismiss them as not fitting because they read them too literally.

Also, we tend to underestimate our own symptoms. As an ADHD person, it took me a long time to understand that many of my struggles were not things everyone experienced. I still find it hard to really grasp that most people don't suffer from executive dysfunction and can just do things, even things they are not interested in.

Honestly if you relate to autistic people chances are high that you have some form of neurodivergence. It might be worth trying to get a diagnosis, even just to be sure.


agreed.

I studied philosophy for a large part of my life, and I am a convinced Wittgensteinian.


I've also been working on half a dozen crates of old family letters. ChatGPT does well with them and is especially good at summarizing the letters. Unfortunately, all the output still has to be verified because it hallucinates words and phrases and drops lines here and there. So at this point, I still transcribe them by hand, because the verification process is actually more tiresome than just typing them up in the first place. Maybe I should just have ChatGPT verify MY transcriptions instead.


It helps when you can see the confidence of each token, which downloadable weights usually give you. Then whenever you (or your software) detect a low-confidence token, run over that section multiple times to generate alternatives, and either go with the highest-confidence one or manually review the suggestions. That's easier than having to manually transcribe those parts, at least.
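
To make that concrete, here is a rough sketch of the idea using Hugging Face transformers. The model name and the 0.5 threshold are placeholders, and a real letter-transcription setup would use a vision-language model rather than plain gpt2, but the confidence-extraction pattern is the same:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL = "gpt2"  # placeholder; substitute whatever local model you actually run
    tok = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForCausalLM.from_pretrained(MODEL)

    inputs = tok("Transcription of the letter:", return_tensors="pt")
    out = model.generate(
        **inputs,
        max_new_tokens=100,
        do_sample=False,
        return_dict_in_generate=True,
        output_scores=True,  # one logit tensor per generated token
    )

    # Probability the model assigned to each token it actually emitted.
    gen_tokens = out.sequences[0, inputs["input_ids"].shape[1]:]
    for step, (tok_id, scores) in enumerate(zip(gen_tokens, out.scores)):
        p = torch.softmax(scores[0], dim=-1)[int(tok_id)].item()
        if p < 0.5:  # arbitrary threshold; tune per model
            print(f"low confidence at step {step}: {tok.decode(int(tok_id))!r} (p={p:.2f})")

Flagged spans can then be regenerated a few times (e.g. with sampling turned on) and the alternatives compared, or set aside for manual review.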


Is there any way to do this with the frontier LLMs?


Ask them to mark low confidence words.


Do they actually have access to that info "in-band"? I would guess not. OTOH it should be straightforward for the LLM program to report this -- someone else commented that you can do this when running your own LLM locally, but I guess commercial providers have incentives not to make this info available.


Naturally, their "confidence" is represented as activations in layers close to the output, so they might be able to use it. Research ([0], [1], [2], [3]) shows that the results of prompting LLMs to express their confidence correlate with their accuracy. The models tend to be overconfident, but in my anecdotal experience the latest models are passably good at judging their own confidence. (A minimal prompting sketch follows the reference list.)

[0] https://ieeexplore.ieee.org/abstract/document/10832237

[1] https://arxiv.org/abs/2412.14737

[2] https://arxiv.org/abs/2509.25532

[3] https://arxiv.org/abs/2510.10913
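
Purely as an illustration of the prompting approach (the model name, prompt wording, and image URL below are placeholders, not a recommendation), something along these lines with the OpenAI Python client:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    prompt = (
        "Transcribe the handwritten letter in the attached image exactly. "
        "Wrap any word or phrase you are not sure about in [? ?], for example "
        "'we visited [?Aunt Mabel?] on Sunday'. Do not guess silently."
    )

    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any vision-capable frontier model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/letter-page-1.jpg"}},
            ],
        }],
    )

    print(resp.choices[0].message.content)  # review the [? ?] spans by hand

Self-reported uncertainty is not as trustworthy as real logprobs, but, in line with the papers above, it does tend to correlate with where the errors actually are.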


interesting... I'll give that a shot


It used to be that the answer was logprobs, but it seems that is no longer available.


Just the other evening, as my family argued about whether some fact was or was not fake, I detached from the conversation and began fantasizing about whether it was still possible to buy a paper encyclopedia.


I admit I didn't even know I was using RCS on Android until I switched to a cheap flip-phone and could no longer post to a Wordle group chat that I had been in for years. What is the possible advantage to the user of a messaging platform that ONLY works on an Android or iOS device with an active number? Don't want.


My Camroc produced exactly one usable photograph. It was hell to load it with a single disk of unexposed film.


You have my admiration! I remember it being completely fiddly: loading the film, cocking the shutter, and then keeping it cocked.


I was captivated by the August 1980 issue of Byte magazine, which had a cover dedicated to Forth. It was supposed to be easy to implement, and I imagined I might do that with my new KIM-1 6502 board. Alas, the KIM-1 was lost when I went to college, and life forced me down different pathways for the next 45 years.

About a year ago, I finally began to work on my dream of a Forth implementation by building a Forth-based flight management computer into a spaceflight simulation game that I am working on. Now, instead of writing mostly C# or GDScript code in Godot, I am trying to figure out ways to create a useful device using this awkwardly elegant language. I'm having fun with it.

One of the interesting bits is that I have been able to make the Forth code an entirely separate project on GitHub (https://github.com/Eccentric-Anomalies/Sky-Dart-FMS), with a permissive open-source license. If anyone actually built a real spacecraft like the one in my game, they could use the FMS code in a real computer to run it.

There is one part of the linked article that really speaks to me: "Implement a Forth to understand how it works" and "But be aware of what this will not teach you". Figuring out the implementation just from reading books was a fascinating puzzle. Once I got it running, I realized I had zero experience actually writing Forth code. I am enjoying it, but it is a lot like writing in some weird, abstract assembly language.


Circa 1980, BASIC was the dominant language for micros because you could fit BASIC in a machine with 4K of RAM. Although you got 64K to play with pretty quickly (1983 or so), it was still a pain in the ass to implement compilers on many chips, especially the 6502, which had so few registers and addressing modes that you were likely to use virtual machine techniques, like Wozniak's SWEET 16 or the atrocious p-code machine that turned a generation of programmers away from Pascal.

FORTH was an alternative language for small systems. From the viewpoint of a BASIC programmer in 1981, the obvious difference between BASIC and all the other languages was that you could write your own functions to add "words" to the language. FORTH, like Lisp, lets you not only write functions but also create new control structures, based on "words" having both a compile-time and a run-time meaning.

FORTH's answer to line numbers in BASIC was that it provided direct access to blocks (usually 1024 bytes) on the disk, with a screen editor (just about a screenful on a 40x25 display). You could type your code into blocks and later load them into the interpreter. Circa 1986, I wrote a FORTH for the TRS-80 Color Computer running the OS-9 operating system; instead of using blocks, it had POSIX-style I/O functions.

FORTH was faster than BASIC and better for systems work, but BASIC was dominant. Probably the best way to use FORTH was to take advantage of its flexibility to create a DSL to write your applications in.


I had that issue, and I think I still might have it in my closet. (Weren't those Robert Tinney covers amazing?)

I always wanted to try out Forth but had no real opportunity. Maybe I should now?



I am convinced that someday, device manufacturers will realize that complicated, small touchscreen UIs are a horrible idea. They're even worse for seniors, because our fingers are losing dexterity, may be slightly swollen and stiff, and are prone to tremors. So, at a point in our lives when merely holding the phone can be a challenge, navigating some ambiguous UI while our fingers are obscuring the very thing we're trying to use is insanity.

But Apple is the worst because of its Apple ID requirement. I tried to resurrect an old iPhone of mine only to get stuck in a week-long perpetual ID recovery loop with Apple. Enter the new password wrong too many times, and you have to wait another week to try again. Want to create a new Apple ID? Nope. No duplicate IDs attached to the same phone number. I finally just recycled the phone. I'm old. I don't have time to waste on an iPhone.


Fun link. I never wrote GPU drivers, but it does remind me of writing my first Ethernet card driver back in the day. I felt like I had decoded the Rosetta Stone, and there was absolutely no one to talk to who understood how that felt.

