The common thread from all the frontier orgs is that the datasets are too big to vet, and they're spending lots of money on lobbying to ensure they don't get punished for that. In short, the current corporate stance seems to be that they have zero agency, so which is it?
Huh? Unless you are talking about DMCA, I haven't heard about that at all. Most AI companies go to great lengths to prevent exfiltration of copyrighted material.
I note that the vibe coding tools managed to keep the license headers in individual files, but the COPYING file containing the GPLv2 has not made the transition.
Same source shows 'slop' alone being used as far back as 2008: https://desuarchive.org/_/search/text/slop/order/asc/ Some of these uses are the verb form, but most are the noun form, used in pretty much the same sense it has carried for generations and still carries today.
USDA administers SNAP, which provides food aid to over four hundred thousand families in the state. That's one of the programs this announcement is talking about suspending.
1. "The search algorithm is a highly parallel Monte Carlo Graph Search (MCGS) using a large transformer as its policy and value function." ... "We use a generative policy to take progressively widened [7] samples from the large action space of Lean tactics, conditioning on the Lean proof state, proof history, and, if available, an informal proof. We use the same model and prompt (up to a task token) to compute the value function which guides the search."
See that 'large transformer' phrase? That's where the LLM is involved.
2. "A lemma-based informal reasoning system which generates informal proofs of mathematical statements, breaks these proofs down into lemmas, formalizes each lemma into Lean, and iterates this process based on formal feedback" ... "First, the actions it generates consist of informal comments in addition to Lean tactics. Second, it uses a hidden chain of thought with a dynamically set thinking budget before predicting an action."
Unless you're proposing that this team solved AGI, "chain of thought" is a specific term of art in LLMs.
3. "A geometry solver which solves plane geometry problems outside of Lean using an approach based on AlphaGeometry [45]." ... following the reference: "AlphaGeometry is a neuro-symbolic system that uses a neural language model, trained from scratch on our large-scale synthetic data, to guide a symbolic deduction engine through infinite branching points in challenging problems."
AlphaGeometry, like all of DeepMind's Alpha tools, is an LLM.
Instead of accusing people of not reading the paper, perhaps you should put some thought into what the things in the paper actually represent.
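For readers unfamiliar with the search setup quoted in point 1, here is a minimal, self-contained sketch of Monte Carlo tree search with progressive widening, where a learned policy samples actions and a learned value function scores leaves. The `sample_action` and `value` functions below are invented stand-ins for the paper's transformer policy/value model (which samples Lean tactics conditioned on the proof state); everything else in this sketch is generic MCTS machinery, not the paper's actual implementation.

```python
import math
import random

random.seed(0)

# Stand-in for the generative policy: in the real system a transformer
# samples a Lean tactic conditioned on the proof state. Here we just
# draw from a toy discrete action space.
def sample_action(state):
    return random.randint(0, 99)

# Stand-in for the learned value function that guides the search.
def value(state):
    return random.random()

class Node:
    def __init__(self, state):
        self.state = state
        self.children = {}   # action -> Node
        self.visits = 0
        self.value_sum = 0.0

def progressively_widen(node, c=1.0, alpha=0.5):
    """Allow at most c * visits**alpha distinct child actions,
    sampling new ones from the policy (bounded attempts)."""
    limit = max(1, int(c * node.visits ** alpha))
    attempts = 0
    while len(node.children) < limit and attempts < 20:
        attempts += 1
        a = sample_action(node.state)
        if a not in node.children:
            node.children[a] = Node(node.state + (a,))

def select_child(node, exploration=1.4):
    # UCT selection over the currently expanded children;
    # unvisited children are tried first.
    def uct(child):
        if child.visits == 0:
            return float('inf')
        mean = child.value_sum / child.visits
        return mean + exploration * math.sqrt(math.log(node.visits) / child.visits)
    return max(node.children.values(), key=uct)

def simulate(root, iterations=200, depth=3):
    for _ in range(iterations):
        node, path = root, [root]
        # Descend: widen the action set, then pick a child by UCT.
        for _ in range(depth):
            node.visits += 1
            progressively_widen(node)
            node = select_child(node)
            path.append(node)
        node.visits += 1
        # Score the leaf with the value function and back up the result.
        v = value(node.state)
        for n in path:
            n.value_sum += v
    return root

root = simulate(Node(state=()))
print(root.visits, len(root.children))
```

The point of progressive widening is visible in `progressively_widen`: rather than enumerating the full action space (intractable for Lean tactics), the search expands only as many sampled actions as its visit count justifies, so the policy model effectively prunes the branching factor.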
If you think "transformer" = LLM, you don't understand the basic terminology of the field. This is like calling AlphaFold an LLM because it uses a transformer.
No, it isn't. They call out ExIt as an inspiration as well as AlphaZero, and the implementation of these things (available in many of their authors' papers) is almost indistinguishable from LLMs. The architecture isn't novel, which is why this paper is about the pipeline instead of about any of the actual processing tools. Getting prickly about meaningless terminology differences is definitely your right, but for anyone who isn't trying to define a policy algorithm for a transformer network, the difference is immaterial to understanding the computation involved.
Equating LLMs and transformers is not a meaningless terminology difference at all. Aristotle is so different from the things people call LLMs, in terms of training data, loss function, and training procedure, that this is a grievous error.
Raw score is often, quite frankly, crap. But it's usually still easy to surface the negative reviews, and since people don't (at least at present) fake those, you can find out what reviewers didn't like about a product. If a given product's critics are only whining about something irrelevant, not meaningful to your use case, or acceptable to you, and the product overall appears to meet spec, you're often golden.
It was RAM a couple months ago, and it continues to be RAM. Major RAM manufacturers like SK Hynix are dismantling NAND production to increase RAM manufacturing, which is leading to sharp price increases for solid-state storage.
I think the snark comes from (and becomes merited through) an article that shows such an utter lack of empathy towards the problems that the vast majority face on a day-to-day basis.
No, not when Alice and Bob are people whose hardships are actual hardships. It's not just that this hardship is rare, or that "it's just a different hardship". I can read about genuine plight affecting some small portion of the population and empathize with those people, and they with me, even implicitly, without the article saying so. But this piece, by virtue of being written, is explicitly unempathetic, whereas "this rare cancer affects 8 people" is not. The cancer is not a problem I wish I had; this is a problem faced by someone who is well off, and to even call it a "hardship" is a stretch.
To do so during a time when tech is also dragging its reputation into the mud by generally harming the rank and file: from large corporations whose actions are not held to account under anti-trust law, to tech-bro oligarchs who wine and dine with power while the rest of us are worse off in a time of unprecedented inequality, to tech laying off hundreds of thousands of employees over the last few years, to LLMs replacing hard-working people with slop generators… is just additional insult to injury.
The article is simply, itself, shallow. "… Is a Lesson for the Rest of Us" — no; barring unforeseen and extremely unlikely circumstances, I'm literally never going to have the "problems" faced by Brin, because I have no expectation of ever retiring with "perpetual wealth" levels of money.