I don't understand what they want to do, but I hope it won't be a stupid, useless AI chat that sends all user data to OpenAI without disclosing it to the user. Nobody needs to talk to a computer program, and "AI chats" are the worst use of otherwise important technology.
Good uses of AI would be, for example:
- high-quality translation of foreign text when you point a camera at it; could be useful for travelers
- recognizing and reading aloud the text from a camera image for people with bad eyesight
- recognizing speech for people with hearing loss
- image search, for example determining types of plants, insects, and dog breeds
At least with their previous features it's been possible to set the address to any custom service you may have running, remote or local. For example, against all the expectations I had, I've actually found the highlight -> explain tool useful when pointed at my local vLLM instance with a template I like.
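For what it's worth, vLLM exposes an OpenAI-compatible chat-completions API, so the highlight -> explain flow can be reproduced outside the browser in a few lines of Python. The endpoint URL, model name, and prompt template below are illustrative assumptions, not the browser's or the add-on's actual configuration:

```python
# Minimal sketch of sending a highlighted selection to a local vLLM server.
# VLLM_URL assumes vLLM's default port; the model name and template are
# placeholders for whatever you actually run locally.
import json
import urllib.request

VLLM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local endpoint
TEMPLATE = "Explain the following highlighted text in plain terms:\n\n{selection}"

def build_explain_request(selection: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload for a selection."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": TEMPLATE.format(selection=selection)}
        ],
        "max_tokens": 256,
    }

def explain(selection: str) -> str:
    """POST the request to the local vLLM instance and return its reply."""
    payload = json.dumps(build_explain_request(selection)).encode()
    req = urllib.request.Request(
        VLLM_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The point of the local setup is that `explain("some jargon")` never leaves your machine; only the server you run yourself sees the selection.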
Why not google/ddg/bing etc. them? That's on the context menu too, but LLMs seem uniquely suited to some problems, like acronyms that are shared across many fields with different meanings. Highlighting a sentence turns up the right acronym very fast, where search engines (which is what I used previously) would take several attempts.
Another good use of AI would be checking grammar and style in text input boxes.