I doubt that. I hear on the internet that Gemini Pro is great, but every time I have used it, it has been beyond disappointing. I'm starting to believe the "Gemini Pro is great" narrative is some paid PR push and not based on reality. The Gemma models are also probably the least useful/interesting local models I've used.
What are you using them for? Gemini (the app, not just the Google search overview) has replaced ChatGPT entirely for me these days, not least because I find Gemini simply handles web searches better (after all, that is what Google is known for). Add to that, it integrates well with other Google products like YouTube or Maps, where it can make me a nice map if I ask it for the best pizza places in a certain area. I don't even need to use Pro mode, just Fast mode, because it's free.
I still use Claude, but only in IDEs for coding; I don't ask it general questions anymore.
As a developer, I use Gemma for basic on-device LLM tasks such as structured JSON output.
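For a concrete idea of what that looks like, here's a minimal sketch assuming Ollama as the local runtime and a locally pulled Gemma tag (the model name, prompt, and helper function are placeholders for illustration, not a definitive setup):

    # Minimal sketch: getting structured JSON out of a local Gemma model via Ollama.
    # Assumes the `ollama` Python package is installed and a Gemma model has been
    # pulled locally, e.g. `ollama pull gemma2:2b`.
    import json

    import ollama

    def extract_contact(text: str) -> dict:
        """Ask the local model for a small JSON object and parse it."""
        response = ollama.chat(
            model="gemma2:2b",  # placeholder tag; use whichever Gemma build you run locally
            messages=[{
                "role": "user",
                "content": (
                    "Extract the person's name and email from the text below. "
                    'Reply with JSON only, e.g. {"name": "...", "email": "..."}.\n\n' + text
                ),
            }],
            format="json",  # asks Ollama to constrain the reply to valid JSON
        )
        return json.loads(response["message"]["content"])

    if __name__ == "__main__":
        print(extract_contact("Reach out to Jane Doe at jane@example.com about the invoice."))

Nothing fancy: the format="json" option plus strict parsing is usually enough for small extraction tasks like this, and that's the kind of basic on-device work I mean.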
That's true, but to be honest I didn't really use those features anyway; my chats are just one long stream of replies and responses. If I need to switch to a new topic, I make a new chat.
I used Gemini Pro and it was unable to comply with the simplest instructions (for image generation). Asking it to change the scene slightly by adding or removing an object or shifting the perspective yielded almost the same result, only with some changes I did not ask for.
The image quality was great, but when I ask a woodworker for a table and get a perfectly crafted chair of the highest quality, I'm still unsatisfied.
I cancelled my subscription after two days of trying to get Gemini to follow my instructions.
When was this, before or after Nano Banana Pro came out? This is a well-known bug, or rather, intended behavior to some extent: edits go through Gemini's content filters, which can be overly strict, so it doesn't edit the image as you'd expect.
You can try it on AI Studio for free, which does not have the same strict content filters, and see if it works for your use case now.