It also provides a useful example of the limitations of TypeScript’s type checking. TS is an improvement on JS in this area, but sometimes people overestimate the safety guarantees it offers and forget that its type system is still unsound.
That case really is a type mismatch, so you would hope a decent static type system would catch it.
However, you don’t see the same warning with functions that can be called with variable numbers of arguments when the types of the unintentionally supplied arguments do match, because within the rules of TS, this is working as designed.
Combined with the perhaps unfortunate decision to provide a standard `map` function that passes extra arguments (the index and the whole array) to its callback, unlike most languages, there is still the potential for an unexpected change of behaviour that the type checker can’t warn you about here.
It kind of is a type error, in that the error comes from the programmer incorrectly thinking that the function has type t => r, when it actually has type (t, s) => r. It's not a type error from TypeScript's point of view, because according to TypeScript's rules, both t => r and (t, s) => r are subtypes of (t, s, u) => r. The point of type systems is to detect cases where the programmer is wrong about the types of some value they're using, so it is a limitation of TypeScript that it wasn't able to detect the error in this case (I'm not sure if there's a good way to allow it to detect this particular error while also maintaining compatibility with JavaScript, but it is a cost that's being paid for that compatibility).
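Concretely, that subtyping rule can be demonstrated with a callback type shaped like the one `Array.prototype.map` uses (the type name here is just illustrative):

```typescript
// A three-parameter callback type, shaped like map's callback.
type MapCallback = (value: string, index: number, array: string[]) => number;

const unary = (value: string) => value.length;
const binary = (value: string, index: number) => value.length + index;

// Both assignments type-check: a function taking fewer parameters
// is a subtype of one taking more, as long as the prefixes match.
const a: MapCallback = unary;
const b: MapCallback = binary;

console.log(['ab', 'cde'].map(b)); // [2, 4] — the index leaks into the result
```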
TS is taking on an impossible challenge in trying to add a robust type system on top of JS without harming compatibility. Its designers have chosen to favour compatibility where the two can’t be reconciled, and that is a reasonable, pragmatic choice. Better a new language that offers some improvements and that lots of people actually use than one that offers somewhat more improvements but that hardly anyone uses.
Unfortunately, this does mean TypeScript’s type system can’t be entirely sound. A classic situation that is also legal according to the rules of TS but “ought” to fail type checking is something like this:
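The original example isn’t quoted here, but a minimal reconstruction using the `arr_num`/`arr_opt` names referred to below would be:

```typescript
const arr_num: Array<number> = [1, 2, 3];

// TS allows this aliasing: Array<number> is treated as assignable
// to Array<number | null>, even though the alias permits writes
// that violate arr_num's declared element type.
const arr_opt: Array<number | null> = arr_num;

arr_opt[0] = null; // perfectly legal against arr_opt's type
```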
Now arr_num[0] is null, clearly violating the intended type constraint.
This problem could be fixed by making it an error to alias arr_opt to arr_num. However, that might also cause a lot of extra work for anyone trying to migrate an existing JS code base, particularly if the types involved are not of their choosing but instead determined by code written elsewhere.
For example, if you called a library function that returned an Array<number> and you passed that into another library function that required an Array<number | null> and wasn’t going to modify that array, enforcing the constraint could mean that working code was broken for no real benefit.
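For the read-only case, TS does offer a partial answer: a `readonly` array parameter makes the widening safe, since the callee can’t write back through it. A sketch (the function here is hypothetical):

```typescript
// Because xs is readonly, accepting a wider element type is safe:
// the callee cannot write a null back into the caller's array.
function countPresent(xs: readonly (number | null)[]): number {
  return xs.filter(x => x !== null).length;
}

const nums: number[] = [1, 2, 3];
console.log(countPresent(nums)); // 3 — a number[] is accepted without unsoundness
```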
Then you get into deeper questions about enforcing immutability using the type system, and finding that again you’re building on sand because you still have JS underneath. IMHO, it’s hard to blame the TS designers for not wanting to go down these kinds of rabbit holes.
In the end, isn't that the same argument for JS? The programmer is responsible for passing a correct callback that accepts the three parameters that `map()` provides, i.e. "calling the right function".
I think the author may have gotten their TypeScript example wrong.
Yes, TS is fine with passing more arguments to a callback that takes fewer. The callback cannot possibly use the additional arguments, so it doesn't matter what gets passed as it will not change the outcome.
This is very different from passing the wrong kinds of arguments to functions that do read them and do something with them, like parseInt.
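For instance, the well-known map/parseInt pitfall, which type-checks in TS:

```typescript
// map supplies (value, index, array), and parseInt reads its
// second argument as a radix, so the index silently becomes a radix.
const parsed = ['1', '7', '11'].map(parseInt);
// parseInt('1', 0) → 1, parseInt('7', 1) → NaN, parseInt('11', 2) → 3
console.log(parsed); // [1, NaN, 3]
```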
Now, if you decide to pass a function whose optional second parameter happens to match the second argument that will get passed to the callback, expecting it not to be used (because why would anyone pass additional arguments to a map callback?), then yes, you will have the problem again.
function addOneByDefault(num: number, addAmount = 1) {
return num + addAmount
}
[1, 2, 3, 4].map(addOneByDefault) // type-checks, but returns [1, 3, 5, 7]: map passes the index as addAmount
This extra example is missing in the article and might be helpful to add.
I don’t see any disagreement that some type checker could catch this unintended behaviour. Many popular languages have checkers that would. The question here appears to be whether TypeScript’s type checker could do it without other consequences that are considered unacceptable.
Read carefully. The distinction here is that the type checker must allow for intended behavior within JavaScript while checking for an error.
The type checking I am talking about is not a sum type. It is not that the function can take two different possible types; it's that the parameter function can mutate into two different types depending on the usage. It has (<arity 1 or 2>), not (<arity 1> or <arity 2>), if you catch my meaning. Or in other words, the concrete type is not evaluated when you pass the function as a parameter but only when it is called with a certain number of parameters, which is not something the type checkers I know about look for.
The fundamental problem with the example under discussion seems to be that while the behaviour might not be intended by the programmer, it is working as specified as far as the language is concerned and changing that specification to make the unwanted behaviour fail a type check could have additional and unwanted side effects.
Perhaps I’m not correctly understanding your idea around arity as part of the function types, but so far it’s not obvious to me how what I think you’re describing helps to resolve that contradiction. Are you suggesting a way the type system could be changed without causing those additional, unwanted side effects?
Do you by any chance have a more rigorous definition or even a formal semantics for your proposed arity types that you could share, so the rest of us can understand exactly what you’re proposing here?
> The fundamental problem with the example under discussion seems to be that while the behaviour might not be intended by the programmer, it is working as specified as far as the language is concerned and changing that specification to make the unwanted behaviour fail a type check could have additional and unwanted side effects.
You don't need to change the behavior of the program. You can change the type checker to flag the unwanted behavior as an error.
>Perhaps I’m not correctly understanding your idea around arity as part of the function types, but so far it’s not obvious to me how what I think you’re describing helps to resolve that contradiction. Are you suggesting a way the type system could be changed without causing those additional, unwanted side effects?
It's not formalized anywhere to my knowledge and I'm not willing to go through the rigor to do this in the comments. But it can easily be explained.
Simply put, what is the type signature of a function that can accept either two variables or one variable? I've never seen this specified in any formal language.
To fix this specific issue, you want the type signature here to accept only functions with a fixed arity.
When some external library is updated so that a function that previously had arity 1 becomes <arity 1 or 2>, that could be thought of as a type change that should trigger a type error.
Right now the type checker recognizes F(a) and F(a, b=c) (where c is a default parameter that can be optionally overridden) as functions with matching types.
F(a) == F(a, b=c)
F(a,b) == F(a, b=c) <-----(F(a,b) in this case is a function where b is NOT optional)
F(a) != F(a, b)
From the example above you can see the type checker's notion of equivalence lacks transitivity (F(a) == F(a, b=c) and F(a, b) == F(a, b=c) do not imply F(a) == F(a, b)), because the type of a function with an optional parameter is not really well defined or thought out.
This is exactly the problem the author is describing. The type checker assumes that when the library changed F(a) to F(a, b=c) that the types are still equivalent, but this breaks transitivity so it's a bad choice and will lead to strange errors because programmers assume transitivity is a given.
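Those three relations can be written out directly in TS (the type names here are just illustrative):

```typescript
type F_a = (a: number) => number;
type F_ab = (a: number, b: number) => number;

// "F(a, b=c)": the second parameter is optional with a default.
const withDefault = (a: number, b = 10) => a + b;

const asUnary: F_a = withDefault;   // OK: F(a) == F(a, b=c)
const asBinary: F_ab = withDefault; // OK: F(a, b) == F(a, b=c)

// But a strictly binary function is not assignable to F_a:
const strictlyBinary = (a: number, b: number) => a + b;
// const broken: F_a = strictlyBinary; // error: F(a) != F(a, b)

console.log(asUnary(1));    // 11 — the default fills in for b
console.log(asBinary(1, 2)); // 3
```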
You don't see this problem in other type checkers because JavaScript is weird in the sense that you can call a function of arity 1 with 5 arguments.
I don't think we can excuse it by citing the nature of source-to-source compilers. Other languages that compile to JS, such as Haxe, Elm, ReasonML, and ClojureScript, don't suffer from this.