I was on a different team, but my third-hand knowledge from 20 years ago (notably, before deep learning became mainstream, even at Google) was that Google's crawl scheduling used a bunch of heuristics to guess the update frequency of a given page. The probability that a page has changed since you last crawled it is an important factor in the expected utility of crawling it now.
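To make that concrete, here's a minimal sketch of what expected-utility crawl scheduling could look like. It assumes a Poisson change model, which is the standard textbook approach from the crawl-scheduling literature (e.g. Cho & Garcia-Molina); all names and numbers are hypothetical, not Google's actual implementation:

```python
import heapq
import math
import time

def p_changed(rate, secs_since_last_crawl):
    """P(page changed at least once since last crawl), under an
    assumed Poisson change model with the given rate (changes/sec)."""
    return 1.0 - math.exp(-rate * secs_since_last_crawl)

def crawl_priority(importance, rate, secs_since_last_crawl):
    """Expected utility of crawling now: the page's importance,
    discounted by the chance a crawl would find nothing new."""
    return importance * p_changed(rate, secs_since_last_crawl)

now = time.time()
pages = [
    # (url, importance, estimated changes/day, last crawl timestamp)
    ("news.example.com", 0.9, 24.0, now - 3600),
    ("static.example.com/about", 0.3, 0.01, now - 30 * 86400),
]

queue = []
for url, importance, changes_per_day, last_crawl in pages:
    rate = changes_per_day / 86400.0
    prio = crawl_priority(importance, rate, now - last_crawl)
    heapq.heappush(queue, (-prio, url))  # negate for a max-heap

while queue:
    neg_prio, url = heapq.heappop(queue)
    print(f"{url}: priority {-neg_prio:.3f}")
```

The frequently-changing news page crawled an hour ago still outranks the near-static page crawled a month ago, because the expected payoff of re-crawling stale-but-static content is low. The real system presumably estimated change rates from crawl history rather than taking them as given.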
As I mentioned, I expect that even more arcane heuristics, in the form of ML models, have largely or completely replaced the hand-written ones.