Mostly whichever is the earliest version PyTorch supports. While 3.9 is supported until the end of this year, torch wheels and other wheels in the ecosystem have always been troublesome on 3.9. So 3.10 it is.
3.9 would have been the preferred version if not for those issues, simply because it is the default on macOS.
Yikes, those are both very old. Python before 3.12 had some serious performance issues, and you should be aiming to run the current stable version, which will contain any number of stability and interoperability fixes. The bundled OS Python versions are often far behind and are better suited to running the system's own tooling than to every application or script you run; ideally you'd use a Python version manager and an isolated virtual environment.
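For the virtual-environment part, the standard library already covers the basics; this is just a sketch (the `./.venv` path is only an example, and a proper version manager such as pyenv or uv handles the "which Python" side better than anything in the stdlib):

```python
import sys
import venv

# See exactly which interpreter is running (often the old OS-bundled one).
print(sys.version)

# Create an isolated environment next to the project, with pip included.
# upgrade_deps (available since 3.9) refreshes pip/setuptools inside it.
venv.EnvBuilder(with_pip=True, upgrade_deps=True).create("./.venv")
```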
I still see quite a few people in the ML world using 3.10 as their default... probably just habit, but a closer look at the dependencies might answer your question better.
Ah, well, that's not as bad, I guess. I saw in their README that they recommend 3.10, which is often a bit of a red flag to me that the project in question may not be well maintained, but I agree that I still see quite a few ML repos noting the use of 3.10 to this day.
Yeah, I honestly think some of the language used around LoRA gets in the way of people understanding it. It becomes much easier to understand when you look at an actual implementation, and at how the adapters can be merged into the base weights or kept separate.
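At its core a LoRA adapter is just a low-rank update B @ A sitting next to a frozen weight. Here's a minimal sketch of that idea in PyTorch (not any particular library's API; the class name, the `r`/`alpha` defaults, and the init choices are only illustrative):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen nn.Linear plus a trainable low-rank update: y = x @ (W + (alpha/r) * B @ A)^T + b."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the original weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.scaling = alpha / r
        # A: in_features -> r, B: r -> out_features. B starts at zero so the
        # adapter is a no-op at initialization and only the A/B matrices train.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Kept separate: the low-rank path runs alongside the frozen layer,
        # so you can swap adapters on top of the same base model.
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

    @torch.no_grad()
    def merge(self) -> nn.Linear:
        # Merged: fold B @ A into the base weight so inference costs nothing
        # extra. Use the returned plain nn.Linear afterwards.
        self.base.weight += (self.lora_B @ self.lora_A) * self.scaling
        return self.base

# Quick check that the two views agree:
x = torch.randn(4, 512)
layer = LoRALinear(nn.Linear(512, 512))
layer.lora_B.data.normal_()        # pretend the adapter has been trained
y_separate = layer(x)              # base + adapter, weights kept apart
y_merged = layer.merge()(x)        # adapter folded into the base weight
assert torch.allclose(y_separate, y_merged, atol=1e-5)
```

Training only touches `lora_A`/`lora_B`, which is why the checkpoints are tiny; merging gives you an ordinary linear layer with no runtime overhead, while keeping them separate lets you stack or switch adapters over one shared base model.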