Sure, but what confidence do you have that what the "dumb" LLM says is worth its salt? It's no different from aggregating the results of a Reddit search, or perhaps even worse, because LLMs lack the intent or common-sense filter of a human. It could be combining two contradictory sources in a way that only makes sense statistically, or regurgitating joke answers without understanding the context (the infamous "you should eat at least one small rock per day").