It doesn't seem to have open weights, which is unfortunate. One of Qwen's strengths historically has been their open-weights strategy, and it would have been great to have a true open-weights competitor to 4o's autoregressive image gen. There are so many interesting research directions that are only possible if we can get access to the weights.
If Qwen is concerned about recouping its development costs, I suggest looking at BFL's Flux Kontext Dev release from the other day as a model: let researchers and individuals get the weights for free and let startups pay for a reasonably-priced license for commercial use.
It's also very clearly trained on OAI outputs, which you can tell from the orange tint to the images[0]. Did they even attempt to come up with their own data?
So it's trained on OAI outputs, as closed off as OAI, and most importantly: worse than OAI. What a bizarre strategy to gate-keep this behind an API.
There are a lot of AI images on the web these days, and the AI look may have become the single most dominant style, given that AI has created more images than any individual human artist. So they might have trained on them implicitly rather than synthetically.
Although, theory is not practice. If I were an AI company, I'd try to leverage other AI companies' APIs.
The problem with giving away weights for free while also offering a hosted API is that once the weights are out there, anyone else can also offer it as a hosted API with similar operating costs, but only the releasing company had the initial capital outlay of training the model. So everyone else is more profitable! That's not a good business strategy.
New entrants may keep releasing weights as a marketing strategy to gain name recognition, but once they have established themselves (and investors start getting antsy about ROI) making subsequent releases closed is the logical next step.
That is also how open source works in other contexts. Initially closed source is dominant, then over time other market entrants use OSS solutions to break down the incumbent advantage.
In this case I'm expecting the players with huge pools of capital (the big cloud providers) to push out open models: the weights become a commodity, and people then rent the providers' servers to multiply them together.
Even for a big cloud provider, putting out model weights and hoping that people host with them is unlikely to be as profitable as gating it behind an API that guarantees that people using the model are using their hosted version. How many people self-hosting Qwen models are doing so on Aliyun?
If you have worked or lived in China, you will know that the Chinese open-source software industry is a total shitshow.
The law in China offers little protection for open-source software. Lots of companies use open-source code in production without a proper license, and there are no consequences.
Western internet influencers hype up the Chinese open-source software industry for clicks while Chinese open-source developers are struggling.
These open-weight model series are planned as free trials from the start; there is no commitment to open source.
> Western internet influencers hype up the Chinese open-source software industry for clicks while Chinese open-source developers are struggling.
That kind of downplays the fact that Chinese open weights, together with Mistral's, are basically the only option for high-quality weights you can run yourself. It's not just influencers "hyping up Chinese open-source"; people go where the options are.
> there is no commitment to open source
Welcome to open source all around the world! Plenty of non-Chinese projects start as FOSS and then slowly move to either a fully proprietary or some hybrid model; that isn't exactly new or unexpected. The Western software industry even pioneered a new license (BSL - https://en.wikipedia.org/wiki/Business_Source_License) that tries to look as open source as possible while not actually being open source.
> I don't get why China is shutting down open source [...] now they've shut off the releases
What are you talking about? Feels like a very strong claim considering there are ongoing weight releases, wasn't there one just today or yesterday from a Chinese company?
The most pedantically correct answer is "mu", because both answers are derivable quantitatively from "How many images do you want to train on?", which is itself answered by a qualitative question that doesn't admit numbers ("How high quality do you want it to be?").
Let's say it's 100 images because you're doing a quick LoRA.
That'd be about $5.00 at medium quality (~$0.05/image) or $1 at low (~$0.01/image).
Let's say you're training a standalone image model. The OOM of input images is ~1B, so $10M at low and $50M at medium.
250 tokens / image for low, ~1000 for medium, which gets us to:
Fastest LoRA? $1-$5, 25,000 - 100,000 tokens output.
All the training data for a new image model? $10M-$50M, 250B - 1T tokens out.
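If you want to sanity-check that arithmetic, here's a minimal Python sketch using the per-image prices and tokens-per-image figures assumed above (my rough estimates, not official API pricing):

    # Back-of-envelope check of the numbers above. Prices and token counts
    # are the rough per-image estimates from this thread, not official pricing.
    PRICE_PER_IMAGE = {"low": 0.01, "medium": 0.05}    # USD per image, assumed
    TOKENS_PER_IMAGE = {"low": 250, "medium": 1000}    # output tokens per image, assumed

    def estimate(n_images: int, quality: str) -> tuple[float, int]:
        """Total USD cost and total output tokens for generating n_images."""
        return n_images * PRICE_PER_IMAGE[quality], n_images * TOKENS_PER_IMAGE[quality]

    for label, n in [("quick LoRA (100 images)", 100),
                     ("new image model (~1B images)", 1_000_000_000)]:
        for q in ("low", "medium"):
            cost, tokens = estimate(n, q)
            print(f"{label:30s} {q:7s} ${cost:>14,.2f} {tokens:>18,} tokens")

which prints $1/$5 and 25k/100k tokens for the LoRA, and $10M/$50M and 250B/1T tokens for the full training set.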
Alibaba has, from the beginning, had some series of models that are always closed-weights (*-max, *-plus, *-turbo, etc., but also QvQ). It's not a new development, nor does it prevent their open models. And the VL models are opened after 2-3 months of GA in the API.
Hunyuan Image 2.0, which is of Flux quality but has ~20 milliseconds of inference time, is being withheld.
Hunyuan 3D 2.5, which is an order of magnitude better than Hunyuan 3D 2.1, is also being withheld.
I suspect that now that they feel these models are superior to Western releases in several categories, they no longer have a need to release these weights.
> I suspect that now that they feel these models are superior to Western releases in several categories, they no longer have a need to release these weights.
Yes that I can totally believe. Standard corporation behaviour (Chinese or otherwise).
I do think DeepSeek would be an exception to this though. But they lack diversity in focus (not even multimodal yet).
What do you mean Tencent just shut off the Hunyuan releases? There was another open weights release just today: https://huggingface.co/tencent/Hunyuan-A13B-Instruct . And the latest Qwen and DeepSeek open weight releases were under 2 months ago, there hasn't been enough time for them to finish a new version since then.
DeepSeek and Alibaba published their frontier models as open weights just weeks ago. And they happen to be the leading open-weight models in the world. What are you talking about?
> One of Qwen's strengths historically has been their open-weights strategy [...] let researchers and individuals get the weights for free and let startups pay for a reasonably-priced license for commercial use.
But if you're suggesting they should do open weights, doesn't that mean people should be able to use it freely?
You're effectively suggesting "trial-weights", "shareware-weights", "academic-weights" or something like that rather than "open weights", which to me implies you can use them for whatever you want, just like "open source" software. If the label misses a large part of what makes "open source" open source, like "use it for whatever you want", then it gives the wrong idea.
I am personally in favor of true open source (e.g. the Apache 2 license), but the reality is that these models are expensive to develop and many developers are choosing not to release their model weights at all.
I think that releasing the weights openly but with this type of dual-license (hence open weights, but not true open source) is an acceptable tradeoff to get more model developers to release models openly.
> but the reality is that these models are expensive to develop and many developers are choosing not to release their model weights at all.
But isn't that true for software too? Software is expensive to develop, and lots of developers/companies choose not to make their code public for free. Does that mean you'd also feel it's OK to call software "open source" even though it doesn't allow usage for any purpose? That would then lead to more "open source" software being released, at least for individuals and researchers?
I wouldn't equate model weights with source code. You can run software on your own machine without source code, but you can't run an LLM on your own machine without model weights.
Though you could still sell model weights for local use. I'm not sure we're at the point yet where I myself could buy model weights, but if you are a very big company or a very big country, I guess most AI companies would consider selling you their weights so you can run them on your own machines.
Yes, I think the same analogy applies. Given a binary choice between a developer not releasing any code at all and releasing code under this type of dual "open-code" license, I'd always take the latter.
> Given a binary choice between a developer not releasing any code at all
I mean it wasn't binary earlier, it was "to get more model developers to release", so not a binary choice, but a gradient I suppose. Would you still make the same call for software as you do for ML models and weights?
> One of Qwen's strengths historically has been their open-weights strategy
> let researchers and individuals get the weights for free and let startups pay for a reasonably-priced license for commercial use
I'm personally doubtful companies can recoup tens of millions of dollars in investment, GPU hours, and engineering salaries from image generation fees.