It only challenges user agents with "Mozilla" in their name, by design: clients that send anything else are already identifiable. If Anubis pushes bots into changing their user agents, it has done its job, since that traffic can then be addressed directly.
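The gating rule amounts to a simple substring check. A minimal sketch (the function name and shape are my own, not Anubis's actual code):

```python
def should_challenge(user_agent):
    # Hypothetical Anubis-style gate: only UAs claiming to be
    # "Mozilla" get the proof-of-work challenge. Anything else
    # (curl, scripted bots, no UA at all) is already
    # self-identifying and can be handled by ordinary policy.
    return user_agent is not None and "Mozilla" in user_agent

print(should_challenge("Mozilla/5.0 (X11; Linux x86_64)"))  # True
print(should_challenge("curl/8.5.0"))                       # False
print(should_challenge(None))                               # False
```

The point of the design is the incentive: a bot that drops "Mozilla" to dodge the challenge has just labeled itself.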
It's only recently, within the last three months IIRC, that Wikipedia started requiring a UA header.
I know because, as a matter of practice, I do not send one. As with most www sites, I used Wikipedia for many years without ever sending a UA header. Never had a problem.
I read the www text-only: no graphical browser, no Javascript.
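For anyone curious what "not sending a UA" looks like at the protocol level: Python's low-level http.client sends only the headers you give it (plus Host and Accept-Encoding), unlike urllib or requests, which add a default User-Agent. A sketch that records the raw request bytes instead of opening a socket, so you can see exactly what would go on the wire:

```python
import http.client

class RecordingConnection(http.client.HTTPConnection):
    # Capture the outgoing request instead of connecting,
    # to inspect which headers the client actually emits.
    def __init__(self, host):
        super().__init__(host)
        self.sent = b""

    def connect(self):
        pass  # never open a real socket

    def send(self, data):
        self.sent += data

conn = RecordingConnection("en.wikipedia.org")
conn.request("GET", "/wiki/Main_Page")  # no User-Agent supplied
print(b"User-Agent" in conn.sent)  # False: no UA unless you add one
```

A server that rejects UA-less requests would be refusing exactly this kind of bare GET.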