
Is there a way I can see which would run on a raspberry pi?

Zynthian (a RPi-based synth collection & groovebox) lists some of the most prominent ones on its website: https://zynthian.org/engines

Its install recipes directory may yield a less fancy, but probably more comprehensive list: https://github.com/zynthian/zynthian-sys/tree/oram/scripts/r...

With Zynthian OS up and running, the full list of plugins shows up in its webconf page; it's so long that most of the plugins have to be hidden from the main on-device UI.

Roughly speaking, if it's open source, it will most likely work. If it's proprietary, assume that only Pianoteq and a small number of u-he plugins will work. Most commercial products with binary-only distribution don't consider RPi devices a large enough market to justify building binaries for them, even if they otherwise offer ARM builds for Apple Silicon and Linux builds for x86.


kxstudio supports the RPi; it comes with a few DAWs and a great deal more. It's probably your best bet for this stuff on the Pi unless you want to compile things yourself.

https://kx.studio/


They will need ARM builds, sadly, so the list is likely going to be rather small.

Pretty much everything in the Linux audio world runs fine on ARM.

I used to get so much done on my BART commute. Also learned piano on a little 25 key midi keyboard until the program I was learning from started needing a 26th key.


I used to get frustrated that the train shook so much, and was so loud, that I struggled to listen to podcasts. Kudos to you.


I sometimes would bring ear-protection headphones that I'd wear over my earbuds to muffle the train noise.


That's so cool. Can you share more about this? What program and keyboard did you use? Did you proceed with more serious learning?


I used an Akai LPK25 with my iPhone (using the Camera Connection Kit and a combined usb hub/dac) and an app called Simply Piano. They make a wireless version of that now that would simplify the setup a great deal. It is a mini keyboard and the keys are quite small, but in my experience it was fine for the beginner stuff (and the keyboard is useful in general later). As I said before, I stuck with this until the app started using keys outside the range I had.

Now, as for "did I proceed with more serious learning" - I alternate through a ton of hobbies, so I moved on after that, though I still go back to it from time to time. But I also have other musical interests and it was helpful to those as well.

Also did a lot of music on the commute on my iPhone with Korg Gadget (and Caustic before that). Sometimes with a keyboard, sometimes without.


I'm amused she said they included it "in case somebody out there liked ugly bright red and yellow" and that "the 'Fluorescent' theme was also pretty ugly, but it didn't have a catchy name, so I've never heard anything about it."

Because I loved the Fluorescent theme.


Fluorescent looks so of its time; it fits right into the age of acid house, where you could go hang out in the hologram shop [1].

[1] - E Is for Ecstacy - BBC Everyman Documentary https://youtu.be/jyrhcjRc3TU?si=Qn9qG2z8wQzD-llJ&t=812


I remember playing with these or similar back then, and if you used it for a while it just became normal.


Ooh! Tell us about microscope making as a hobby!


Uh, OK. So a few decades ago a scientist I respect built his own scientific tool from parts (https://www.nature.com/articles/35073680) and I was really blown away by that idea, especially because most scientific tools are very expensive and have lots of proprietary components. I asked around at the time (~2001) and there wasn't a lot of knowledge on how to control stepper motors, assemble rigid frames, etc.

Although my day job is running compute infra, I have a background in biophysics and I figured I could probably do something similar to Joe Derisi, but lacked the knowledge, time, and money to do this either in the lab or at home. So the project was mostly on the backburner. I got lucky and joined a team at Google a decade ago that did Maker stuff. At some point we set up a CNC machine to automate some wood cutting projects and I realized that the machine could be adapted into a microscope that can scan large areas (much larger than the field of view of the objective). I took a Shapeoko, replaced the cutting tool with a microscope head (a cheap objective, cheap lens tube, and cheap camera), demonstrated it, and got some good images and lots of technical feedback.
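
In case it's useful, a minimal sketch of the kind of raster-scan loop this involves, assuming a GRBL-style controller on a serial port and an OpenCV-compatible camera (the port name, settle delay, step size, and grid extents below are illustrative, not my actual settings):

    # Raster scan with a GRBL-style CNC carrying a camera instead of a
    # cutting tool. Port, step size and grid extents are illustrative.
    import time
    import cv2
    import serial

    cnc = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)
    cnc.write(b"G90\n")          # absolute positioning
    cam = cv2.VideoCapture(0)

    def move_to(x_mm, y_mm):
        cnc.write(f"G0 X{x_mm:.3f} Y{y_mm:.3f}\n".encode())
        # Crude settle delay; a real setup waits for the controller's "ok"
        # and for vibration to die down before grabbing a frame.
        time.sleep(0.5)

    step = 2.0  # mm between tiles, chosen so adjacent fields of view overlap
    for iy in range(25):
        for ix in range(25):
            move_to(ix * step, iy * step)
            ok, frame = cam.read()
            if ok:
                cv2.imwrite(f"tile_{iy:03d}_{ix:03d}.png", frame)

The captured tiles then get stitched offline into one large image.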

As I now had more time, money, and knowledge (thanks, Google!) I thought about what I could do to make scientific grade microscopes using 3d printer parts, 3d printing and inexpensive components. There are a lot of challenges, and so I've spent the past decade slowly designing and building my scope, and using it to do "interesting" things.

At the current point, what I have is: an aluminum frame structure using inexpensive extrusion, some 3d printed junction pieces, some JLCPCB-machined aluminum parts for the 2D XY stage, inexpensive off-the-shelf lenses and an industrial vision camera, along with a few more adapter pieces and an LED illuminator. It's about $1000 in materials, plus far more time in terms of assembly and learning.

What I can do: the scope easily handles scanning large fields of view (50mm x 50mm) at 10X magnification and assembles the scans into coherent fullsize images (often 100,000x100,000 pixels). It can also integrate a computer vision model trained to identify animalcules (specifically tardigrades) and center the sample, allowing for tracking as the tardigrade moves about in a large petri dish. This is of interest to tardigrade scientists who want to build models of tardigrade behavior and turn them into model organisms.
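
The tracking part is essentially a closed loop: detect the animal in the current frame, compute its offset from the image center, and nudge the stage to re-center it. A rough sketch, where the detector, stage interface, and pixel-to-mm scale are hypothetical stand-ins rather than the real scope's code:

    # Stage-centering loop: keep a detected tardigrade near the middle of
    # the frame by moving the XY stage. detect_tardigrade(),
    # move_stage_relative() and PIXELS_PER_MM are hypothetical stand-ins.
    PIXELS_PER_MM = 1000.0   # depends on objective and camera
    DEADBAND_PX = 40         # don't chase tiny jitters

    def track(cam, detect_tardigrade, move_stage_relative):
        while True:
            ok, frame = cam.read()
            if not ok:
                break
            box = detect_tardigrade(frame)   # (x, y, w, h) in pixels, or None
            if box is None:
                continue
            x, y, w, h = box
            img_h, img_w = frame.shape[:2]
            dx_px = (x + w / 2) - img_w / 2
            dy_px = (y + h / 2) - img_h / 2
            if abs(dx_px) > DEADBAND_PX or abs(dy_px) > DEADBAND_PX:
                move_stage_relative(dx_px / PIXELS_PER_MM, dy_px / PIXELS_PER_MM)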

Right now I'm working on a sub-sub-sub-project, which is to replace the LED illuminator with a new design capable of extremely bright pulses for extremely short durations, which lets me acquire scans much faster. I am revelling in low-level electronic design and learning the tricks of the trade, much of which is "5 minutes of soldering can save $10,000".

I had hoped to make this project into my fulltime job, but the reality is that there is not much demand for stuff like this, and if it does become your job, you typically focus on getting your leadership to give you money to buy an already existing scope designed by experts and using that to make important discoveries (I work in pharma, which does not care about tardigrades).

Eventually, I hope, I will retire and move on to the more challenging nanoscale projects. It turns out that while building microscopes accurate to microns with off-the-shelf hardware is fairly straightforward, getting to nanoscale involves understanding a lot of what was learned between the 1950s and now about ultra-high precision, which is much more subtle and expensive.

Here's a sample video of tardigrade tracking; you can see the scope moving the stage to keep the "snout" centered: https://www.youtube.com/watch?v=LYaMFDjC1DQ And here's another: an empty tardigrade shell filled with eggs that are about to hatch, https://www.youtube.com/watch?v=snUQTOCHito with the first baby exiting the old shell at around 10 minutes.


Having it automatically follow tardigrades is so cool! It sounds like great fun. Did you make the ML model for tracking them?

I've wanted to build the OpenFlexure Microscope (https://openflexure.org/projects/microscope/) but it's behind a backlog of all sorts of other things.


Yes, I took an existing vision model that could run in realtime on my laptop and fine-tuned it with a few hundred manually labelled images of tardigrades.
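
As an illustration of the scale involved (using ultralytics/YOLO purely as a stand-in; the actual model, dataset layout, and settings may differ), a fine-tune like that is only a few lines:

    # Illustrative fine-tune of a small off-the-shelf detector on a few
    # hundred hand-labelled tardigrade frames; the model, dataset layout
    # and hyperparameters here are examples, not the original setup.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")   # small model, runs in realtime on a laptop
    model.train(data="tardigrades.yaml", epochs=50, imgsz=640)  # YAML lists train/val images + the class name
    results = model("frame_0001.png")   # inference on a single frame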

I don't like the openflexure design at all. I mean... obviously it works for a lot of people, but I just don't want a flexure based stage. I like real 2-axis stages based on rolling bearings, basically cloning the X and Y parts of this: https://www.asiimaging.com/products/stages/xy-inverted-stage...

UC2 is another cool project: https://openuc2.com/ but I found their approach constraining.

Frankly, I think you could just buy an inexpensive 3D printer with open firmware, replace the extruder with an objective, a tube, and a camera, and you'd have something up and running more cheaply and in less time.


Having Google Translate take that into French and then back gave me "The passengers of this jumbo jet, caught in a quagmire resembling an ocean liner, found a happy accident in this chaotic and surprising landing." I quite like both sentences.


That instant branching is super cool!


Sure is! Thank you :)


I feel like this is thinking too small. I don’t want a better textbook. I want them to be basing this off of the experience of going to the most effective private tutors.


The textbook is there so that the model has something reasonably correct to work with and doesn't make up too much wrong stuff to teach.

For tutoring, I think the approach in https://www.nature.com/articles/s41598-025-97652-6 is promising. (Prompts are included in the supplementary material on the last page: https://static-content.springer.com/esm/art%3A10.1038%2Fs415... ) They start with an existing collection of worked exercises, give the chatbot access to the full solution and then let students interact with it to get a walkthrough or just check their own solution, depending on how much help the student thinks they need.
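
The general pattern (not the paper's actual prompts, which are in the supplementary material linked above) is roughly: put the exercise and its worked solution into the system prompt and instruct the model to guide rather than reveal. A bare-bones sketch with the OpenAI Python client, where the model name and wording are just placeholders:

    # Bare-bones sketch of the "tutor with access to the worked solution"
    # pattern; not the prompts from the paper (see its supplementary
    # material for those). Model name and phrasing are placeholders.
    from openai import OpenAI

    client = OpenAI()

    def tutor_reply(exercise, solution, conversation):
        system = (
            "You are a tutor. You have the full worked solution below. "
            "Walk the student through it step by step, or check their own "
            "attempt, but never reveal the whole solution at once.\n\n"
            f"Exercise:\n{exercise}\n\nWorked solution:\n{solution}"
        )
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "system", "content": system}] + conversation,
        )
        return resp.choices[0].message.content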


> I want them to be basing this off of the experience of going to the most effective private tutors.

Good goal, but they've got to start somewhere.

Delivering an education experience even 80% as effective as the best private tutors would be a huge achievement.


Can you say more about your experience here? I've not heard people use that term before, and I'd be interested in how it was received / what you think the difference is.


Sure, just that stories about "paying down tech debt" are an implicit admission of flaws and shortcuts and "doing it again because it wasn't done right initially".

Which is all true, but the concept of making deliberate trade-offs for speed and expedience invariably gets lost.

Stories about ongoing improvement - tech enhancement - just get seen as more positive. Plus, that term covers both remediating the original shortcut choices and new engineering improvements as they arise.


Thanks!


Warning: do not look at laser with remaining eye.


When you say "almost" - could you say something about the codebases you saw where this doesn't apply? I'd love to hear about the best codebase you were SWE for and what we could learn from it.

