You'd get sentences full of words like: tangential, orthogonal, externalities, anecdote, anecdata, cargo cult, enshittification, grok, Hanlon's razor, Occam's razor, any other razor, Godwin's law, Murphy's law, other laws.
It was a unique period of interplanetary space travel when most projects were simple flybys - the first visit to each planet. Because the goal was just to fly by, a secondary benefit was that the trajectory would send the spacecraft out of the solar system.
Nowadays, most missions involve insertion into orbit around the target planet, so there's no secondary opportunity to send the spacecraft out of the solar system. The notable exception is New Horizons, which performed a Pluto flyby and will also eventually leave the solar system.
It has a 3.7-meter (12 ft) diameter high-gain Cassegrain antenna to send and receive radio waves via the three Deep Space Network stations on Earth. The spacecraft normally transmits data to Earth over Deep Space Network Channel 18, using a frequency of either 2.3 GHz or 8.4 GHz, while signals from Earth to Voyager are transmitted at 2.1 GHz.
Is there? Creating AGI sounds like a great way to utterly upend every assumption that our economy and governments are built on. It would be incredibly destabilizing. That's not typically good for business. There's no telling who will profit once that genie is out of the bottle, or if profit will even continue to be a meaningful concept.
I hear this comment a lot and I don't get it. Let's say AGI exists, but it costs $100/hr to operate and has the intelligence of a good PhD student. Does that suddenly mean the economy breaks down, or will the goalposts shift so that AGI has to be "economical" and PhD level isn't good enough? I still haven't heard a clear definition of AGI that makes me think it will break the world.
This is what OpenAI themselves believe the risk is:
> By "defeat," I don't mean "subtly manipulate us" or "make us less informed" or something like that - I mean a literal "defeat" in the sense that we could all be killed, enslaved or forcibly contained.
It won't break the world, but it's a safe bet that it will break the world of people doing labor and getting paid for it. And when you think about it, even being a mediocre (or even moronic) investor is practicing a form of labor, so not even capital ownership is safe in the long run. And yes, generational wealth is a thing, but there are tides that slowly shift wealth from A to B (e.g. from the USA to China). Give a machine smart enough even a sliver of motivation (intrinsic or extrinsic) to acquire some wealth for itself, and just watch what happens...
If society consisted only of the people in a given sector/industry, could it continue and flourish? If we only had engineers, how would society fare versus if we only had influencers? In this paradigm, there's no difference between fine art and pop art.