Depends what for. I find AI tools best for boilerplate or as a substitute for Stack Overflow. For complex logic, even GPT-4 sends me down the garden path more often than not.
I got Llama 3 8B running over the weekend and it's alright. I haven't plugged it into VSCode yet, but I could see it (or code-specific derivatives) handling those first two use cases fine. I'd say close enough to be useful.
I don't see a flattening. I see a lot of other groups catching up to OpenAI, and some even slightly surpassing them, like Claude 3 Opus. I'm very interested in how Llama 3 400B turns out, but my conservative prediction (backed by Meta's early evaluations) is that it will be at least as good as GPT-4. It's been a little over a year since GPT-4 was released to the public, and in that time Meta and Anthropic seem to have caught up, and Google would have too if they spent less time tying themselves up in knots. So OpenAI has a one-year lead, though they seem to have spent some of that time making inference less expensive, which is not a terrible choice. If they release 4.5 or 5 and it flops or isn't much better, then maybe you're right, but it's very premature to call the race now; maybe two years from now, with little progress from anyone.
I shouldn't have used the word asymptote; I should have said logarithmic. I don't doubt a best-case situation where we get a GPT-5, GPT-6, GPT-7, etc., each more capable than the last; just that there will be more months between each, each will be more expensive to train, and the gain in capability between each will be smaller than the previous.
Let me phrase this another way: suppose Llama 3 400B releases and it has GPT-5-level performance. Obviously, we have not seen GPT-5, so we don't have a sense of what that level of performance looks like. It might be that OpenAI simply has a one-year lead, but it might also be that all these frontier model developers are stuck in the same capability swamp, and we simply don't have the compute, virgin tokens, economic incentives, algorithms, etc. to push through it (yet). So Meta pulls ahead, but we're talking about feet, not miles.