
Australia's social security system is implemented using Model 204 – a database combined with a transactional 4GL that runs under z/OS, now sold by Rocket Software. It was quite popular in the 1980s, but now I believe there are only two customers left on the planet – the Australian federal government, and the US federal government (although possibly there is more than one customer within the US federal government – Veterans Affairs is one, but maybe there are one or two more).

A couple of years ago, Australia cancelled its multi-billion dollar project to move off it on to a Linux-based solution (Pegasystems), after having spent over US$120 million on it. The problem was that although the new system did the calculations correctly, it took minutes to process a single individual, something the mainframe could do in 2 seconds.

But, I'm 100% sure this was nothing to do with the inherent nature of the problem they were trying to solve – I think it was likely because they'd picked the wrong off-the-shelf product as a replacement and the wrong team to implement it, and if they'd gone with different vendors it could well have succeeded – but after spending years and over US$100 million to discover that, they aren't keen to try again.



They could host a competition.

Build a database the same size with fake but realistic data. Then let the competitors try to meet the same constraints.

Actually, I’d love to take part in a challenge like this.


I think a competition is a great idea in theory, but in practice it's unlikely given procurement regulations, cultural issues, bureaucratic fears, etc.

The part that still relies on the mainframe is the entitlement calculation - all the very complex rules which determine what payments each claimant is entitled to. Other aspects have already been moved to (or at least duplicated in) non-mainframe systems, e.g. SAP CRM. Those entitlement rules are written in SOUL, Model 204’s 4GL; a team of programmers in Canberra are kept busy constantly translating legislative updates into SOUL (the government can’t resist the urge to constantly tinker with the details of social security law, so almost every year brings at least a few minor changes, and every few years major ones).

Since this is basically business rules, they decided to use a Java-based low-code/no-code business rules automation platform as a replacement, and tirelessly translated all the business rules encoded in the SOUL code into it. And they succeeded functionally - the new system produced the same results as the old one - but the performance was worse by orders of magnitude. And since the bottleneck was the time to process a single record - fundamentally single-threaded work (maybe in theory you could parallelize aspects of it, but I doubt either the old or new system did) - it wasn’t a problem you could solve just by throwing more hardware at it.
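To make the latency-vs-throughput point concrete, here's a back-of-the-envelope sketch (numbers invented for illustration, not from the actual project): adding workers raises how many records you process per hour, but each individual claimant still waits the full single-threaded time.

```python
# Throwing hardware at a per-record bottleneck raises throughput,
# not per-record latency. PER_RECORD_SECONDS is a made-up figure
# standing in for "minutes to process a single individual".
PER_RECORD_SECONDS = 180.0

def throughput_per_hour(workers):
    # Each worker independently finishes one record every PER_RECORD_SECONDS.
    return workers * 3600 / PER_RECORD_SECONDS

print(throughput_per_hour(64))   # records/hour scales with worker count
print(PER_RECORD_SECONDS)        # time for any ONE record stays the same
```

So a batch run can be sped up by fanning records out across workers, but an interactive "how much is this claimant entitled to?" query still takes minutes.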

Idea I have: keep the SOUL code as-is, and build a SOUL compiler for Linux (e.g. using LLVM). Or even just transpile the SOUL code into C. Totally doable, likely to give similar performance to the original mainframe system… Of course, that wouldn’t solve the problem of “system written in obscure language almost nobody knows any more”, but at least could get it on to a mainstream platform
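The transpile-to-C idea can be sketched very roughly. To be clear, the input syntax below is invented - real SOUL (Model 204's User Language) is far richer - and the rule name, fields, and figures are all hypothetical; this just shows the shape of emitting C text from a declarative rule:

```python
# Toy sketch of a rule-to-C transpiler. The "rule" representation and
# the Claimant struct fields are invented for illustration; a real
# transpiler would parse actual SOUL source.

def transpile_rule(name, conditions, amount):
    """Emit a C function returning `amount` when all conditions hold."""
    guard = " && ".join(conditions) or "1"
    return (
        f"double {name}(const Claimant *c) {{\n"
        f"    if ({guard}) return {amount};\n"
        f"    return 0.0;\n"
        f"}}\n"
    )

# Hypothetical entitlement rule - not a real payment rate or threshold.
c_src = transpile_rule(
    "basic_rate",
    ["c->age >= 67", "c->fortnightly_income < 2500.0"],
    1000.0,
)
print(c_src)
```

Once the rules are plain C functions you get native-code performance and can compile them anywhere, which is the whole appeal - though, as noted, it leaves the "obscure source language" problem untouched.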

but… people with the skill set to do this are unlikely to be interested in a government job with a rather limited salary…

And, in most countries, government agencies are strangled by procurement rules which attract firms which are adept at negotiating those rules, even if not always so adept at successfully solving the underlying problem… meanwhile, other firms which might be highly adept at the underlying problem take one look at those rules and think “this isn’t worth it”


Part of the problem is that it's quite hard to realistically create "aged data", so running such a competition in a way that actually informs the decision is hard.

Mainframes come from a mindset where degradation in performance is something that requires a scheduled maintenance window. Not just for hardware, but also for software. Compare that to the more modern world of "oh we'll just VACUUM the database in the background". The surrounding ecosystem of software might not even tolerate a rare spurious glitch delaying a response by an extra second.

That, and all the software stacks on them are huge, complex, custom monsters. Reimplementing the whole thing from scratch on a more common SQL database is not exactly easy while maintaining data integrity and performance, and without being able to pay Silicon Valley salaries.


> Part of the problem is that it's quite hard to realistically create "aged data",

Certainly. One of the projects I’m working on is just that, and building a comprehensive dataset is A LOT of work. For some uses you can get by with a small amount of realistic data for the actual tests, and use simpler mechanisms to generate bulk data that’s there only to act as noise the program will never actually see (but the database logic will need to contend with).
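The "bulk noise" half of that split can be quite simple. A minimal sketch, with invented field names and ranges (nothing here reflects any real dataset): seed a generator so the noise is reproducible, and keep the realistic hand-crafted records separate.

```python
# Sketch: reproducible bulk "noise" records that only exist to give
# the database realistic volume. Field names/ranges are invented.
import random

def noise_record(rng, rec_id):
    return {
        "id": rec_id,
        "dob_year": rng.randint(1930, 2005),
        "fortnightly_income": round(rng.uniform(0.0, 4000.0), 2),
        "dependents": rng.randint(0, 5),
    }

def generate_noise(n, seed=0):
    # A seeded Random instance makes every run produce identical noise,
    # so test failures aren't masked by shifting data.
    rng = random.Random(seed)
    return [noise_record(rng, i) for i in range(n)]

bulk = generate_noise(100_000)
print(len(bulk))
```

Reproducibility matters here: if the noise changes between runs, you can't tell a regression from a data artifact.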



