OSS models do not have to be local models, and it's not just about privacy, imo.
DeepSeek R1 hosting is out of reach for most, but the model being open is a game changer if you are building a business that needs the SoTA capabilities of such a large model: not necessarily because you will host it yourself, but because you can't be locked out of using it.
If you build your business on top of OpenAI and they decide they don't like you, they can shut you down. If you use an open model like R1, you always have the option to self-host, even if that can be costly, instead of being at the mercy of a third party that can kill your business just by cutting off access to their service.
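To make the "not locked in" point concrete: if your code already talks to an OpenAI-compatible endpoint, falling back to a self-hosted R1 is mostly a configuration change rather than a rewrite. A rough sketch in Python, assuming the openai client library and a vLLM server exposing its OpenAI-compatible API on localhost:8000; the URL, key, and model id below are placeholder assumptions, not a specific recommendation.

    # Sketch: the same OpenAI-compatible client can target either a hosted
    # provider or your own deployment. The endpoint URL, API key, and model id
    # are placeholders (e.g. DeepSeek R1 served locally with vLLM).
    from openai import OpenAI

    # Hosted provider (access could be revoked or the model deprecated).
    hosted = OpenAI(api_key="YOUR_PROVIDER_KEY")

    # Self-hosted fallback: vLLM exposes an OpenAI-compatible /v1 endpoint.
    self_hosted = OpenAI(
        base_url="http://localhost:8000/v1",  # assumption: vLLM on port 8000
        api_key="not-needed-locally",         # assumption: no auth on the local server
    )

    def ask(client: OpenAI, model: str, prompt: str) -> str:
        # Same request shape against either backend.
        resp = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

    # Switching backends is a one-line change at the call site.
    print(ask(self_hosted, "deepseek-ai/DeepSeek-R1", "Hello"))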
Another benefit is that they can be fine-tuned. And it's not only about whether OpenAI will shut you down: they can also deprecate a model (as they plan to do with GPT-4.0), swap in a different model under the same name (as happened with Sonnet 3.5), censor it, or limit its capabilities.
Suddenly, after the arrival of reasoning models, it looks like OSS models have lost their charm.