Hickey has made the point that in real-world (not asymptotic) use, the constants and depths make a difference he cares about: a 10x factor takes you from seconds to a minute.
They're only equivalent in theory. With a branching factor of 32, they are not remotely equivalent in practice unless you have a supercomputer counting the atoms in the universe or something.
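For a rough sense of the depths involved, here's a back-of-the-envelope sketch (illustrative Clojure, my numbers rather than anything from the actual implementation):

    ;; approximate trie depth needed to index n elements at a given branching factor
    (defn depth [branching n]
      (long (Math/ceil (/ (Math/log n) (Math/log branching)))))

    (depth 32 1e9) ;=> 6   ; 32-way trie: ~6 levels for a billion elements
    (depth 2  1e9) ;=> 30  ; binary tree: ~30 levels for the same data

Six pointer chases versus thirty is exactly the kind of constant-factor difference that never shows up in the big-O but does show up on a stopwatch.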
You know this would mean that a change to a vector of under 32 elements is O(n), right? (Which it isn't quite in all cases, because the actual implementation is a bit more complex.)
But this doesn't apply to hash sets, because hash maps are sparse and stored as pairs, meaning they're effectively base 16. Also, the Clojure implementation of sets is just hash maps where the key is the same as the value.
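That last bit is easy to picture with a toy version (a sketch of the idea, not the actual clojure.lang.PersistentHashSet source; the toy-* names are made up for illustration): a set is just a map whose keys and values are the same object.

    ;; toy "set" built on a plain map, key == value
    (defn toy-conj [s x] (assoc s x x))        ; add an element
    (defn toy-disj [s x] (dissoc s x))         ; remove an element
    (defn toy-contains? [s x] (contains? s x)) ; membership test

    (def s (-> {} (toy-conj :a) (toy-conj :b)))
    (toy-contains? s :a) ;=> true
    (toy-disj s :a)      ;=> {:b :b}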
> You know this would mean that a change to a vector of under 32 elements is O(n), right?
Technically true, but only because a change to a vector of under 32 elements is O(1) by definition (as long as the algorithm is deterministic) and O(n) includes O(1). It's equally true that a change to a vector of under 32 elements is O(n!).
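Spelling out the containment being used (my notation; f(n) is the cost of the update):

    f(n) \le c \;\text{for all valid } n \implies f \in O(1) \subseteq O(n) \subseteq O(n!)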
The notation you're using is not meaningful in the context of bounded input. Big-O notation is not concerned with any behavior except the behavior at infinity.
This is not generally the case. Practically speaking, the inputs big-O notation gets applied to are almost always bounded. The Clojure vector implementation has a maximum count of 2^32, therefore the behaviour at infinity is O(SomeException).