No offense, but it's hard to monetize FOSS operating systems.
I've yet to see a scalable business model.
All FOSS OSes rely on either donations or consulting/support for revenue.
There's just no money to be made there, unless someone comes up with something innovative.
But ElementaryOS is in a unique position to monetize, one they don't take advantage of.
They don't allow commercial applications on their app store. I'd publish there in a heartbeat, if they did.
They are also in the position of (barely) functioning at a very low cost. They don't need a lot of cash flow to make a huge difference.
If someone has a way in with them, please please mention this. I'd hate for ElementaryOS to die, I really like it and have followed it from the beginning.
It's the only Linux distro that strikes a balance between being "normal" and caring about simplicity, consistency, and ease of use.
I forgot to mention - the user can pay $0 for any app. This is the problem that needs to be fixed. All apps are donationware, enforced by the app store.
This is exactly why I never bothered. Combine that with the fact that users can pay $0 and then come to GitHub to complain (i.e. create a support burden), and I figured it wasn't the best option for a solo developer to build a business model on.
It would be more interesting if there were more flexibility in the monetization: options for a minimum price plus pay-more-per-seat (for integration with some corporate provisioning software), subscription pricing, a one-time price plus either a support subscription or support "piecework" (like bug bounties or developer consultation), and invoicing integration.
> They don't allow commercial applications on their app store.
Which I kinda agree with, as they want to promote open-source apps first, but I'd also be okay with an option to display commercial apps to purchase or install with a disclaimer that these are not open-source apps.
It'd be nice to have the ability to install Spotify, Discord, and some other popular apps and have them automatically kept up-to-date directly from the App Store. And that wouldn't negate the ability to sideload or use a third-party repo.
Their solution is to act as a proxy between your HomeAssistant installation at home and the public internet. Supposedly the SSL certificate sits on your device and they have no access to your data. I think this is the right way to do it.
> It's documented how to set this up yourself if you don't wanna pay them to manage it for you too.
That right there is the best way to go. Everything in the open, everything can be self-hosted, but you can pay for the convenience of someone else doing it.
I'd normally have done that myself. But considering that having them handle it also funds Home Assistant development, it was actually a pretty easy decision.
Always thought that Aseprite was onto something good. Assuming I've got this right, the source code is out there and free, but you need to pay for the compiled binaries. Most people can't be bothered compiling it themselves, so paying a bit for the binaries is worthwhile.
I'd be worried about perverse incentives. The company is now incentivized to make the build process more difficult, or at least not incentivized to try to make the build process easier for end users, which goes against the point of open source.
> I'd be worried about perverse incentives. The company is now incentivized to make the build process more difficult, or at least not incentivized to try to make the build process easier for end users, which goes against the point of open source.
Open source doesn't imply anything about the code itself. The premise is that a project benefits from a wide range of people (and outsiders' views).
Open source projects that reject PRs improving the build process will see their contributions suffer in the long run, risking their acceptance in the community.
It should be self-correcting. At least that's the philosophy I subscribe to :)
No, but if we imagine the incentive taken to the extreme, where an arcane build process makes it essentially impossible for anyone not on the original team to build the software, then the software devolves into effectively a source-available project: users have no way of actually building their own versions, which also precludes making modifications even on their own forks.
I guess the major difference vs truly only source-available is that you can still copy and paste chunks of code to use in other projects?
I am struggling to think of any open source project of any size, beyond small NPM packages, that I've encountered that does not have an arcane build system. At least all of the ones I've tried have been incredibly obtuse, to the point that I've mostly just given up: anything Mozilla, Google, or Facebook writes. Actually, come to think of it, the only open source projects of non-trivial size I've ever been able to build without dedicating several weeks of time have been from Microsoft.
To be fair, these steps can go wrong when the project doesn't make clear that there are a lot of implicit dependencies for the configure step to even work (and then make can sometimes break if those implicit dependencies aren't the expected version).
But to your point, there are a lot of projects for which those steps work just fine.
Sure, and in fact it did happen with one of the programs I was trying to build, but the (first) error is usually along the lines of "cannot open include file foobarlib.h", so I can search for that library (and many package managers let you search for the header files directly), or "this struct doesn't have that field" (common with, IIRC, libjpeg, which at some point made some structs opaque). The latter is a bit more work (fixing the code), but it is also very rare.
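As a rough sketch of that first failure mode (zlib here stands in for the hypothetical foobarlib above, and apt-file is a Debian/Ubuntu assumption):

    /* ver.c - a build whose first error names the missing dependency.
     *
     * Build: cc -o ver ver.c -lz
     *
     * Without zlib's development headers installed, compilation stops at
     *   ver.c: fatal error: zlib.h: No such file or directory
     * and searching the package manager for "zlib.h" (e.g.
     * apt-file search zlib.h) points at the package that provides it. */
    #include <stdio.h>
    #include <zlib.h>

    int main(void) {
        /* Once the headers and library are present, this builds and runs. */
        printf("linked against zlib %s\n", zlibVersion());
        return 0;
    }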
Indeed, which is why I like Rust so much. The best thing they did was to make every program statically linked, so the build process succeeds 99% of the time. The only time it's really failed for me is when a project depends on OpenSSL and I don't have it configured to its liking. Those times bring me back to the "make" days, when I'd have to spend an afternoon installing some tool with all its dependencies, and even then I'd sometimes give up in frustration.
The problem with static linking is that if all you have is a binary (e.g. a closed-source program), you don't get any new fixes or features from the libraries you depend on.
For example, many early-2000s games that used SDL 1.x can be made to work on modern Linux simply by removing the SDL .so file they were bundled with and letting them use the one the system provides (the most common issue is audio, but also mode setting or full-screen support).
This isn't a thing only on Linux, btw; SDL games that ship an old DLL can have issues on Windows too (in fact a game of mine was like that :-P) but can be made to work by simply replacing the DLL with a newer one.
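A quick sketch of why that swap works (assuming SDL2's version API here; SDL 1.x has the analogous SDL_Linked_Version()): with dynamic linking, the version a binary was compiled against and the version loaded at run time are independent, so dropping a newer .so/DLL next to an old binary works as long as the ABI is compatible.

    /* sdlver.c - the SDL version compiled against vs. the one loaded at
     * run time can differ; that's what makes replacing a bundled .so/DLL
     * with a newer one possible without touching the binary.
     *
     * Build: cc -o sdlver sdlver.c $(sdl2-config --cflags --libs) */
    #include <stdio.h>
    #include <SDL2/SDL_version.h>

    int main(void) {
        SDL_version compiled, linked;
        SDL_VERSION(&compiled);   /* baked in at compile time */
        SDL_GetVersion(&linked);  /* whatever shared library is loaded now */
        printf("compiled against SDL %d.%d.%d\n",
               compiled.major, compiled.minor, compiled.patch);
        printf("running against SDL %d.%d.%d\n",
               linked.major, linked.minor, linked.patch);
        return 0;
    }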
That's surely a problem, but I try to avoid closed-source software for this very reason. Obviously there are a million and one ways closed-source software can leave you high and dry, and static linking is one of those, but really I think the operative word driving the sadness here is "closed" rather than "static".
This can be an issue with open source programs as well, from a practical perspective. For example, let's say I modified Gtk to use a sane file dialog: I'd rather replace the system-installed shared object once and have everything use it than recompile everything that uses Gtk.
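For trying out that kind of swap without overwriting the system library, LD_PRELOAD gives the same effect for a single process. A minimal sketch (interposing fopen as a stand-in, since wrapping the actual Gtk file-chooser entry points is more involved):

    /* shim.c - intercepts fopen in any dynamically linked program, with
     * no recompilation; the same mechanism that makes replacing the
     * system .so affect everything, scoped here to one process.
     *
     * Build: cc -shared -fPIC -o shim.so shim.c -ldl
     * Use:   LD_PRELOAD=./shim.so some-program
     *        (any program that calls fopen will print the trace) */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <dlfcn.h>

    FILE *fopen(const char *path, const char *mode) {
        /* Find the real fopen further down the library search order. */
        FILE *(*real_fopen)(const char *, const char *) =
            (FILE *(*)(const char *, const char *))dlsym(RTLD_NEXT, "fopen");
        fprintf(stderr, "[shim] fopen(%s, %s)\n", path, mode);
        return real_fopen(path, mode);
    }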
If I think about the software that I use on a regular basis that I've built from scratch at least once, quite a few of them are fairly straightforward and basically just the two lines I indicated. Just off the top of my head all of these were just those two lines (download build tool then run the install step):
1. Kubernetes
2. git-annex
3. The new generation of replacements for various core command-line tools (e.g. ripgrep, fd)
EDIT:
> several weeks of time
Out of curiosity, what projects were those that took several weeks? My presumption is probably very GUI heavy ones?
I don’t know about weeks, but I remember LibreOffice, Firefox, and Clang all having their own ad hoc build systems requiring some amount of custom configuration.
Debezium is also very Byzantine, or maybe it's just that I don't understand Maven.
There's a little more to it (adding deb-src entries to /etc/apt/sources.list), but it's super instructive to do it on something like 'busybox' and be able to make legitimate changes to something like 'ls'... or to do it to 'coreutils' and modify the "real" ls.
Although the "speed-bump" to being able to build packages for the first time is a bit rough, the benefit is that the documentation is outstanding and the process is pretty seamless for most/all packages, regardless of complexity.
The docs and tools are written by engineers and maintainers for people just like you... an independent consumer/programmer, sitting at their computer, trying to (re-) build a package to add a feature or fix a bug.
The other benefit is that the process is relatively consistent across literally thousands of packages, and there are a lot of docs + tools + features to handle almost any scenario that Debian (Ubuntu) supports. If you learn it for one use case, your investment pays dividends across all other packaged software.
It depends on the ecosystem (especially for tools without GUIs, I've found it usually is just two lines at the terminal: one to install the build tool for a given ecosystem and then the other to run it), but sure there are plenty of open source projects which are difficult to build from scratch. However, for most of them this is an acknowledged shortcoming that they try to fix when given the time and resources. It might be a much different world if there was a financial incentive to magnify that shortcoming.
FOSS can’t really work like this because anyone is free to compile it and stick it in a flatpak or on the distro repos. It only really works for something like iOS where there is only one store.
I've seen the pay-for-binaries model in a few places.
Not sure if it's scalable since someone can write a build script and share it with the community.
It depends -- with a community build script, you're relying on (and trusting!) a third party to maintain it and not do anything malicious. Much easier to pay a few bucks for first party binaries. At least in my eyes.
I'm not sure Aseprite is an example to follow: just as they made this license change, a fork of the last version under the previous license, LibreSprite, popped up. It lacks a lot of the features and bugfixes of Aseprite, but hey, it's free, and many open source enthusiasts view it as the morally superior version.
This simultaneously shows the weakness of the open-source development model - a non-monetizable passion project can rarely match the quality of something with full-time devs behind it, and probably isn't something that the Aseprite dev(s) are happy about - it sucks to compete against your own product sold for $0.
My impression is that RedHat is basically the only company to pull off the open source + paid support model. All other open source companies I know of either use open core or paid hosting.
Are there any other companies like RedHat successfully thriving off just paid support? If so, which ones? If not, why not?
PostgreSQL has a number of companies providing support; the db itself is free and open source. Ray is offered for free, and Anyscale provides support. Quansight offers QHub for free and provides support. Those are just a few off the top of my head. Disclaimer: I work at Quansight and contribute to Ray.
Anyscale seems to follow the paid hosting (open source product + paid in-house SaaS offering) strategy though? Similarly the majority of the companies listed at https://www.postgresql.org/support/professional_support/ either provide paid hosting or some version of open core (proprietary add-ons/tools) in addition to support, or else seem to be general DB consultancies that include Postgres as one of their supported products, although it does look like there are a few small teams that focus exclusively on Postgres consulting.
Quansight is a fascinating example! Are you allowed to share roughly what ratio of revenue comes from the support side and what ratio comes from the venture fund?
I don't really know, sorry. There are a lot of moving pieces as the company grows (we are hiring), and the interplay between the pure consulting, the open source work, and growing the venture fund is dynamic.
What's wrong with paid hosting? For the people who do pay for software (essentially commercial entities), the management and support are as important as (if not more important than) the software itself.
The code being open source is actually a great advantage because it alleviates concerns around lock-in and the vendor going under (if, e.g., AWS's RDS is for some reason no longer available to us, I can still run PostgreSQL myself, at least until I find an alternative).
Oh, nothing wrong. I'm just on a fact-finding mission to see whether any other companies have successfully followed the consulting/support-contract model vs. paid hosting or open core (because this has rather significant repercussions for what kinds of products lend themselves well to a given business model; e.g. a desktop app is not going to work well for paid hosting).
But I'm not really counting cURL because that's just one person, right? The challenges faced by what is essentially a one-person freelancer are quite different from those of a larger company. Or is cURL a whole company at this point? EDIT: I see you mean this as a separate category of just "projects."
Proxmox's projects (Proxmox VE, Proxmox Backup, Proxmox Mail Gateway) are also 100% open source, and the company gets its revenue through enterprise support. Works fine for us.
The thing is, if you do pull off the open source + paid support model, where does that leave you? Doing technical support? I suspect most of us would prefer to spend our time creating rather than answering emails and phone calls.
It feels like we have the cart before the horse: we start by deciding we want to do open source, then try to squeeze a business model into it. We would be better off starting with the business and customers, then deciding whether open source helps or hinders.
IIRC EnterpriseDB mainly makes its money from paid hosting, and I think HashiCorp makes its money from a combo of paid hosting and open core (i.e. it has other proprietary tools and add-ons). SUSE is a good example, but I'm not as sure about Canonical. Wasn't it the case that Canonical was operating at a loss and had been for a while? Maybe that's changed?
Oh, interesting. But digging into that, it looks like it's coming from adding open core, and prior to that they were losing money? (Ubuntu Pro + Ubuntu Advantage providing auxiliary proprietary? tools)
My understanding is that BusyBox and Buildroot are largely maintained by individual consultants, which is pretty similar (although there's no larger company around them).