tacet's comments

They mean the data AI companies scrape(d) to train their models.

For example, they can't opt out of having their comments scraped off HN and used for training.


Hmm, there is no delay. IIRC the full cycle (satellites return to exactly the same spot in relation to Earth) is about 10 days, and most places get passed over in 1-3 days.

Are you looking at Sentinel-1 or Sentinel-2?


Or you could mess up your brain even more, as established by Zen teachers and the scientific literature.


Can you list some of the countries? I am curious, because in my country I can be seen within a few days, or call my doctor if I need to. There are also consulting doctors who can be reached if a GP is not available. It costs about 4 euros per visit.


> If people protest instead of working they will be fired which means they'll also lose their health insurance.

Not protesting got Americans to this spot.

Sincerely, a 28-days-per-year paid vacation enjoyer.


The math comes to about 5 megabits upload. Not great, but I could live with that.
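
For context, a quick unit-conversion sketch (Python; the only input is the 5 megabit figure above, everything else is plain arithmetic) of what that upload rate amounts to:

    # What a sustained 5 Mbit/s upload works out to, in decimal units.
    UPLOAD_MBIT_S = 5

    bytes_per_s = UPLOAD_MBIT_S * 1_000_000 / 8   # 625,000 B/s
    mb_per_s = bytes_per_s / 1_000_000            # 0.625 MB/s
    gb_per_hour = mb_per_s * 3600 / 1000          # 2.25 GB/hour
    gb_per_day = gb_per_hour * 24                 # 54 GB/day

    print(f"{mb_per_s} MB/s, {gb_per_hour} GB/h, {gb_per_day} GB/day")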


>This is ridiculous. According to their resume they lived, studied, and worked in Poland for over 10 years and don't speak Polish?

Could be Russian. :D We have people living here for 60+ years who cannot speak the local language.


>very ominous semi-continuous monotonous hums

Could be the Buzzer, perhaps: https://en.wikipedia.org/wiki/UVB-76


Too modern. But similar intent, perhaps.


Probably both. I'll admit to being a layperson in all this, but DeepSeek is pretty impressive. Even if they used more compute than they claimed, this part of the article you linked didn't age well, judging by reviews of 4.5 :D

>Many have compared V3 to GPT-4o and highlight how V3 beats the performance of 4o. That is true but GPT-4o was released in May of 2024. AI moves quickly and May of 2024 is another lifetime ago in algorithmic improvements.


If I understand your argument correctly, then I would say that it is very recognizable as science.

>People could tell scaling wasn't working well before the release of GPT 4.5

Yes, at a quick glance it seems so from OpenAI's 2020 research into scaling laws (rough sketch of the fitted power law below).

Scaling apparently didn't work well, so the theory about scaling not working well failed to be falsified. It's science.
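
A rough sketch of what that 2020 result looks like (Python; the exponent and critical parameter count are the approximate fitted values reported in Kaplan et al., "Scaling Laws for Neural Language Models", used here only for illustration):

    # Kaplan et al. (2020) model-size scaling law: loss falls as a power law
    # in the number of non-embedding parameters N. Constants are approximate.
    ALPHA_N = 0.076   # fitted power-law exponent for model size
    N_C = 8.8e13      # fitted critical parameter count

    def predicted_loss(n_params: float) -> float:
        """Predicted cross-entropy loss for a model with n_params non-embedding parameters."""
        return (N_C / n_params) ** ALPHA_N

    # Each extra order of magnitude of parameters buys a smaller drop in loss:
    for n in (1e9, 1e10, 1e11, 1e12):
        print(f"{n:.0e} params -> predicted loss ~{predicted_loss(n):.2f}")

Each additional order of magnitude of parameters shaves off a smaller absolute amount of loss, which is the diminishing-returns shape the scaling debate is about.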

