The biggest problem with most typed languages is string handling, especially null-terminated strings. Returning a string from a function is a nightmare in almost every typed language, except Delphi/Free Pascal, which does automatic reference counting and manages it all: you never have to allocate or free memory for strings, and they can hold gigabytes.
Typed languages offer some efficiencies and, if properly used, can help prevent entire classes of footguns, but they do require a bit of planning. I've always wished I could do the same "gradual typing" thing with Pascal, BASIC, etc.: as the program runs, the types are checked and slowly baked in automatically.
> Returning a string from a function is a nightmare in almost every typed language, except Delphi/Free Pascal, which does automatic reference counting and manages it all: you never have to allocate or free memory for strings, and they can hold gigabytes.
Why mention memory allocation at all? There are plenty of typed languages that have automatic garbage collection and handle strings just fine.
Go (Golang) comes to mind as a superb example of a typed language in which string handling is a breeze.
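For instance, returning a string from a Go function is a one-liner, and the garbage collector owns the backing memory, so there is nothing to allocate or free by hand. A minimal sketch (makeGreeting is just an illustrative name):

    package main

    import (
        "fmt"
        "strings"
    )

    // makeGreeting returns a string by value; the runtime's garbage
    // collector manages the backing memory, so nothing is freed by hand.
    func makeGreeting(name string) string {
        return "hello, " + name
    }

    func main() {
        fmt.Println(makeGreeting("world"))

        // Strings can grow as large as memory allows; strings.Repeat
        // allocates the result for you.
        big := strings.Repeat("x", 1<<20) // a 1 MiB string
        fmt.Println(len(big))
    }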
I'm so confused. What do you think happens differently in C#/Python/Ruby/Rust/whatever when you add one string to another? Or when you read into a buffer without a predefined size? Or assign one string reference to two variables?
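Concretely, nothing scary: the runtime handles all three cases. A rough sketch in Go, standing in here for any garbage-collected typed language (the answer is much the same in C# or Ruby):

    package main

    import (
        "fmt"
        "io"
        "strings"
    )

    func main() {
        // Adding one string to another: the runtime allocates the result.
        a := "foo"
        b := a + "bar"

        // Reading without a predefined size: io.ReadAll grows the
        // buffer as needed.
        data, err := io.ReadAll(strings.NewReader("no size declared up front"))
        if err != nil {
            panic(err)
        }

        // Assigning one string to two variables: strings are immutable,
        // so both names can safely share the same backing memory.
        c := b
        fmt.Println(a, b, c, string(data))
    }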