This reminds me of the time my middle-school history teacher decided to bring in a student's parent, a financial advisor, to defend price-gouging on gas during the Hurricane Katrina evacuation and subsequent exodus.
It was an unconvincing argument then, and is an unconvincing argument now.
That’s awfully hard to do well though, especially setting up a system quickly.
Sure, you can limit the amount per customer per store. But then someone comes in with their husband and double dips, then goes back through 10 minutes later hitting different checkouts, or just goes through self-checkout, and then goes on to different stores…
All the toilet paper is still gone, which stokes fear in other people and encourages them to do the same as the couple above.
The alternative would have been “you idiots are buying all the toilet paper? Fine. It’s 5x more expensive now.”
People then see that toilet paper is still in stores, so prices can come back down as the panic subsides, and if people start getting nervous again prices can quickly rise to stamp that out.
> But then someone comes in with their husband and double dips, then goes back through 10 minutes later hitting different checkouts, or just goes through self-checkout, and then goes on to different stores…
During a panic toilet-paper shopping spree, that would still allow something like 100 other customers to also get toilet paper.
Wow cool, you just summed up something I’ve found myself doing subconsciously in the past few years. Thanks!
I used to be quite fond of short identifiers, especially ones that make the signs “line up”… until I worked with code long enough that I forgot what I did and had to read it again.
As an off-topic observation, whenever I see something like the phrase “operates between the public and the private space” I immediately think: this person definitely went to art school :P
International Art English is a well-documented, and mercilessly mocked (and deservedly so!) phenomenon, which thrusts the creator's image of self into the spotlight and questions assumptions about their ability for self-expression at the intersection of rational thought and plain language, through pervasive use of meaningless and tortured constructions, abject puffery, and run-on sentences.
lmao, I haven't had a single good interaction with cops, and I'm not a minority; most of the interactions I've had were not the result of me committing crimes, and the ones that were "crimes" were for things like "being in a park shortly after sundown". I have never once had a reason to view them as anything other than shitty power-tripping bullies.
Typical programming style in Common Lisp is procedural, not functional.
The CL spec doesn't guarantee tail-call elimination, and most CL implementations don't guarantee it either (it could be on or off depending on the current optimization level). So most people don't bother relying on it.
True. My CL code is atypical because I prefer functional style to iterative style. I know how to use declarations to turn on TCO for the implementations that support it, which is most of the big ones; ABCL is a notable exception.
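For the curious, a minimal sketch of what that looks like in practice (assuming SBCL, where self tail calls are merged under high-speed/low-debug settings; the function name is made up for illustration):

    ;; The self tail call runs in constant stack space on SBCL and
    ;; most other big implementations under these settings. The
    ;; standard doesn't require this, and ABCL won't do it.
    (defun sum-to (n &optional (acc 0))
      (declare (optimize (speed 3) (debug 0)))
      (if (zerop n)
          acc
          (sum-to (1- n) (+ n acc))))

    ;; (sum-to 10000000) => 50000005000000, no stack overflow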
Whoa, what a surprising fact! I had not considered that TCO in Lisp was "nice to have." That's a good example of something that's easy to hack together but hard to make production-ready.
Do you have any advice for understanding the difference between "relational" and "tablational"? I remember my college professor saying something about how SQL is not really relational, but we never really explored that statement.
Quite simply: A relation is a set of tuples, while a table is a list/multiset of tuples.
The Alpha/QUEL lineage chose relations, while SQL went with tables. Notably, a set has no ordering and no duplicates, which I suggest is in contrast to how the layman tends to think about the world, and is thus felt as an impediment when choosing between technology options. There are strong benefits to choosing relations over tables, as Codd wrote about at length, but they tend not to show up until you get into a bit more complexity. By the time your work reaches that point, the choice of technology is apt to already be made.
With care, SQL enables mimicking relations to a reasonable degree when needed, which arguably offers the best of all worlds. That said, virtually all of the SQL bugs I see in the real world come as a result of someone not putting in enough care in that area. When complexity grows, it becomes easy to overlook the fine details. Relational algebra and calculus would help by enforcing it. But, tradeoffs, as always.
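To make that concrete, a small sketch of the kind of care involved (the table and column names are invented for illustration):

    -- A bare SQL table is a multiset: duplicates slip in silently.
    CREATE TABLE payments_raw (
        customer_id INTEGER,
        amount      NUMERIC
    );
    INSERT INTO payments_raw VALUES (1, 9.99), (1, 9.99);  -- both rows accepted

    -- Mimicking a relation takes explicit effort: any key makes
    -- duplicate tuples impossible, and NOT NULL keeps three-valued
    -- logic out of the picture.
    CREATE TABLE payments (
        payment_id  INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount      NUMERIC NOT NULL
    );

    -- Queries need the same care: SELECT / UNION ALL preserve
    -- duplicates, while SELECT DISTINCT / UNION collapse them.
    SELECT DISTINCT customer_id FROM payments_raw;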
>SQL [...] is a database language [...] used for access to pseudo-relational databases that are managed by pseudo-relational database management systems (RDBMS).
>SQL is based on, but is not a strict implementation of, the relational model of data, making SQL “pseudo-relational” instead of truly relational.
>The relational model requires that every relation have no duplicate rows. SQL does not enforce this requirement.
>The relational model does not specify or recognize any sort of flag or other marker that represents unspecified, unknown, or otherwise missing data values. Consequently, the relational model depends only on two-valued (true/false) logic. SQL provides a “null value” that serves this purpose. In support of null values, SQL also depends on three-valued (true/false/unknown) logic.
Or, in other words, "relation" does not mean the relationships between the tables, as many assume: the tables themselves, each a set of tuples, are the relations.
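The null-value point bites in practice, too; here's the classic illustration, reusing the hypothetical table from above:

    -- amount = NULL evaluates to UNKNOWN, never TRUE, so the first
    -- query returns no rows even when amount is missing; SQL makes
    -- you opt into its special marker with IS NULL.
    SELECT * FROM payments_raw WHERE amount = NULL;   -- always empty
    SELECT * FROM payments_raw WHERE amount IS NULL;  -- the 3VL-aware form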
No, compared to not doing so many allocations that freeing them is time-consuming or expensive. Having allocations slow a program down means there are way too many of them, probably because they're too granular and sitting in a hot loop. On top of that, it means everything is a pointer, and that lack of locality will slow things down even further. The difference between allocating many millions of objects and chasing their pointers versus doing a single allocation of a vector and running through it can easily be 100x.
Probably? Locality becomes fairly important at scale. That’s why there’s a strong preference for array-based data structures in high-performance code.
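A toy illustration of the locality point (in OCaml, since it comes up below; the numbers are machine-dependent, and the 100x above is a ballpark, not a benchmark result):

    (* Summing a linked list chases one pointer per cons cell;
       summing a float array walks contiguous, unboxed memory that
       the prefetcher handles well. Same arithmetic, very different
       memory traffic. *)
    let sum_list (xs : float list) =
      List.fold_left ( +. ) 0.0 xs

    let sum_array (xs : float array) =
      let s = ref 0.0 in
      for i = 0 to Array.length xs - 1 do
        s := !s +. xs.(i)
      done;
      !s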
If I were them I'd be using OCaml to build up functional "kernels" that can run with zero allocation. Then you dispatch requests to these kernels and let the fast modern generational GC clean up the minor cost of dispatching: most of the work happens in the zero-allocation kernels.
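A minimal sketch of that shape (the names are hypothetical; the key property is that the kernel body allocates nothing, since OCaml float arrays are unboxed and the loop mutates in place):

    (* The kernel: no closures, no boxing, no allocation in the
       loop. All GC pressure stays outside it. *)
    let scale_in_place (xs : float array) (k : float) : unit =
      for i = 0 to Array.length xs - 1 do
        xs.(i) <- xs.(i) *. k
      done

    (* Hypothetical dispatch layer: parsing and routing may allocate
       freely; the minor GC cheaply reclaims those short-lived values
       while the hot path stays inside the allocation-free kernel. *)
    let handle_request (prices : float array) =
      scale_in_place prices 1.01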
I think it is, but to be clear I think (from my very limited experience, just a couple of years before leaving finance, plus the people with more experience that I've talked with) that C++ is still a lot more common than any GC'd language (typically Java, since OCaml is even rarer). So it is possible, and some firms seem to take that approach, but I'm not sure exactly how, beyond turning off the GC or doing very specific GC tuning.
Here is a JVM project I saw a few years back. I'm not sure how successful the creators are, but they seem to use it in actual production. It's super rare to get even a glimpse of HFT infra from the outside, so it's still useful.