I had never priced it and expected something reasonable, and it's $426 on Amazon…
I don’t mean to start a holy war but this is why people buy consoles. I feel like the entire PC video card universe is so disconnected from reality. A PS5 or Xbox Series X is $499 and is a fully functioning device.
In the console ecosystem, you pay more for games, subscriptions, fees and such, depending on your habits.
And you don't get a fully functioning work PC out of it... Which is stupid, as both consoles would be great PCs, especially the XSX with bigger RAM ICs. They are kind of like high power Mac M1s.
I don’t know how it is in the US, but where I live no employer allows you to use your private pc for work.
I agree with your first point. However, a plus on the console side is that I can actually still own my games instead of licensing them (if you buy physical).
Increasingly, the situation for console games is the same as for PC games: you're only licensing the game, you need to be online at least to authenticate (or sometimes always, just to play), etc.
There probably still are games where you can just insert the disc into an offline console and start playing, but the trend goes in the other direction.
EDIT: I'm obviously talking about modern consoles. Get a used PS2 and some games and play to your heart's content!
I can see the trend you’re talking about, but the vast majority of games on consoles I can buy, put into my console and play without being online. Some games are obviously always online, some are just straight broken without patches, but the majority are just ok without internet.
So I find that, currently, my point stands.
As long as there are physical releases in the first place, that option will probably never vanish completely. Personally, I'm more "concerned" about the indie/AA scene: when there's no physical release in the first place, I can't even visit the shop down the road to buy it. But that's independent of the PC/console divide, so I'm just rambling now.
Absolutely. But I do find that this has actually gotten somewhat better in recent years: distributors such as Limited Run Games and the like (even if I loathe them sometimes for their FOMO tactics) have made small physical runs possible for games that would likely not have seen a release otherwise.
I mean you can still buy discs. You can buy one, play the game, and resell it on eBay if you don't like it or are just done with it. If it's a new game you can basically sell it for close to what you paid for it. I recently got out of a Diablo IV purchase I realized I didn't like with minimal damage that way.
My brothers’ steam library makes me sad when I see how much money is wasted there in games they don’t play.
Well, yeah. A console is basically a discrete GPU with a few CPU cores bolted onto the GPU die, and an SSD added.
A discrete GPU involves paying for all of the memory, all of the cooling, all of the video outputs and BOM costs, and all of the testing and validation, and then it just doesn't do the last little bit that makes it a fully functional PC. Instead you are expected to buy a second set of memory, cooling, fans, etc. and have them all shipped individually, with a total packaged shipping volume the size of a small pallet, compared to a console shipped in something the size of a breadbox. It's literally the most expensive way to build a computer, with the most redundancy in system design and the most waste in shipping and validation.
So it’s not surprising that console costs and midrange dGPU costs are convergent. They are 90% of the way to a console, just missing the last few bits!
(But that's what PC gamers get for clinging to an outdated 1980s-era standard for computer design, and the form factors it provides for expansion cards. Just ask people to buy into a new backwards-compatible power cable standard for their GPU, and provide them with a free adapter, and then watch the tantrums flow. The religious reverence for the ATX and PCIe add-in-card form factors is absurd, and people get what they deserve when the PC designs that shake out of it 40 years later are incredibly poorly fit to real-world needs. Everything has changed, GPUs dominate the system now, we deliver up to 400 W through an intricate network of 75 W/150 W "aux" connectors, and we still design cards and cases and motherboards like it's 1980 and a GPU and CPU can both be passively cooled…)
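To put rough numbers on the power-delivery point, here is a minimal sketch using the nominal PCIe power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). The example card wattages are mine, not from the comment above.

```python
# Minimal sketch: nominal power available to a graphics card from the
# PCIe slot plus "aux" connectors (per-connector limits from the PCIe spec).
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin aux connector
EIGHT_PIN_W = 150  # 8-pin aux connector

def max_board_power(six_pins: int, eight_pins: int) -> int:
    """Nominal board power budget for a given set of aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_board_power(0, 2))  # 375 W: not enough headroom for a ~400 W card
print(max_board_power(0, 3))  # 525 W: hence triple-8-pin (or 12VHPWR) designs
```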
You can build a Steam Console much cheaper, but it will also involve some sacrifices the PC community hates, like soldered (GDDR6/7) memory and no CPU/GPU upgradeability. But it’ll be 1/4 of the price, so it’ll be worth it even if you have to replace the whole unit to upgrade. That's why consoles are built that way, and not as an ATX PC.
An A770 should be ~$370, and you can buy A750s for ~$250, new of course. $426 sounds like someone's trying to fleece you or there just isn't enough floating around to sell anymore as Intel gears up for the next lineup.
> I don’t mean to start a holy war but this is why people buy consoles [...] A PS5 or Xbox Series X is $499 and is a fully functioning device.
How much do the console games cost? PC and console gamers tend to spend more on games than hardware, and PC games offer more bang for the buck when amortized over playable time. Additionally, consoles are often not backwards compatible with games for earlier platforms, so one has to re-buy versions of games they already own.
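As a toy illustration of the amortization point (all prices and hours below are hypothetical, not figures from this thread):

```python
# Hypothetical cost-per-hour comparison; every number here is made up
# purely to illustrate amortizing hardware + game spend over playtime.
def cost_per_hour(hardware_cost: float, game_spend: float, hours: float) -> float:
    return (hardware_cost + game_spend) / hours

# e.g. $500 console + ten $70 games vs. a $350 GPU upgrade + ten $40 PC games
print(round(cost_per_hour(500, 10 * 70, 600), 2))  # 2.0 $/hour
print(round(cost_per_hour(350, 10 * 40, 600), 2))  # 1.25 $/hour
```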
Consoles are appliances, and I understand the appeal of their simplicity. However, until consoles offer the flexibility of playing a game I bought 10 years ago[1] or the latest and greatest AAA title, I can't abandon PC gaming.
A console also compartmentalizes video gaming away from your compute/Internet stuff. (Both technically/security/privacy-wise, and in terms of distraction.)
(Source: Uses only a PS4 Pro for gaming, even though I have a PC with an RTX 3090 and gobs of RAM, sitting usually idle.)
I come from the other end. I can never bring myself to console games again. I am too old/busy to ever want to get good at gaming again. I play a bit now and may not touch it again for a week later. So the muscle memory etc. don't stick and I don't want that to frustrate me.
So I just play single player and use tools to give myself bullet time in every game, and edit resources to escape the grind. This is only possible on a PC. For content-tourist-style gaming, consoles are a no-go.
I do like the compartmentalization aspect of consoles though. Sooner or later Steam games will start being a security hazard. I don't want to do banking on a computer with games installed.
I don't understand what you're asking. If they don't pay enough to cover depreciation and the cost of electricity because you live somewhere with expensive electricity, just say that.
Not being able to use a keyboard and mouse (and to actually run software on it) is, by itself, reason enough for my PC's money, and the PC isn't much more expensive than a console.
They are too expensive but it's slightly better if you consider it as a marginal cost on top of a PC you already have. Upgradability is the whole point of PCs really. You can keep a system going for many years by taking advantage of this. And a console isn't really a "fully functional system" when compared to a PC.
Not necessarily premiums. The manufacturers make their money back just by taking a percentage of all the (digital) sales.
That's why even the Steam Deck can be sold so cheap, without a premium on games. They can make up the difference with just more sales because you now have a device you'll use more.
> They can play games and nothing else. So completely worthless comparison.
Except they cost less than a high end video card (not to mention the rest of the PC) and your games are guaranteed to work.
> Console and PC are very different gaming experiences though. One is no substitute for the other.
That's absolutely correct, but innovation in PC games is on the indie side and those aren't so GPU-hungry. Most AAA titles are the same on consoles and PC these days, so you might as well get them for the console.
So if you want to play both AAA and indie you buy both a PC and a console? How is that supposed to be cheaper than just adding a graphics card to your PC?
The indies I like don't need a serious graphics card. A good bunch of them work just fine on a 2018 i3 Mac mini with the integrated graphics decelerator...
Don't you need the 1000+ EUR cards to play AAAs at the same performance as a console? That is, twice the price of a console just for the video card.
No, even considering the optimization involved, consoles are about the same performance as an RX 6700 non-XT. They lean heavily on upscalers, and the upscalers are usually inferior quality, so a 3060 Ti / 3070 is hitting in the same general ballpark of performance too.
The Xbox Series X also only has 10 GB of VRAM and the Series S has 8 GB, which people usually don't realize. Microsoft used a fast-partition/slow-partition strategy (like the GTX 970), so in practice the slow segment is your system memory and you can't really cross over between them because it kills performance.
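For reference, that split matches the Series X's published memory spec as I recall it: 16 GB of 14 Gbps GDDR6, with 10 GB on the full 320-bit interface and 6 GB on a narrower effective width. A quick sanity check of the advertised bandwidths (my figures, treat as approximate):

```python
# Sanity check of the Series X's advertised fast/slow memory bandwidths.
# 14 Gbps GDDR6 on 320-bit (fast pool) and 192-bit effective (slow pool);
# these are publicly quoted specs as I remember them, not from the parent.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(320, 14))  # 560.0 GB/s for the 10 GB "GPU-optimal" pool
print(bandwidth_gb_s(192, 14))  # 336.0 GB/s for the 6 GB slower pool
```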
You can get a 3060 Ti for $275 now or a 6700 XT for $330. NVIDIA has DLSS, which is generally higher quality for a given level of upscaling (FSR2 Quality is closer to DLSS Balanced/Performance level), which offsets the raw performance difference a bit. Or AMD has more raw raster and VRAM. But that's kinda your ballpark price comparison, not a 4090. The consoles aren't a 4090 either; they're rendering games at 720p or 640p and upscaling.
But it gets into this weird space where people refuse to turn down a single setting or use even the highest-quality upscalers on PC, but PC is too expensive, so they'll buy a console where the settings are pre-turned-down for them and they'll be upscaled silently from even lower resolutions with even worse-quality upscalers, with no choice in the matter. Consoles are like the Apple products of the world, they take away the choices and that makes people happier because having too much choice is burdensome.
> But it gets into this weird space where people refuse to turn down a single setting or use even the highest-quality upscalers on PC, but PC is too expensive, so they'll buy a console where the settings are pre-turned-down for them and they'll be upscaled silently from even lower resolutions with even worse-quality upscalers, with no choice in the matter. Consoles are like the Apple products of the world, they take away the choices and that makes people happier because having too much choice is burdensome.
Yeah, I'd rather play the -ing game instead of counting the fps?
> Yeah, I'd rather play the -ing game instead of counting the fps?
Yes, but you can do that on PC too: just punch in medium settings and turn on DLSS Quality mode and away you go. You can get a $300 GPU that does the same thing as the console; you don't need to spend $700+ on a GPU to get console-tier graphics.
The problem is that people insist on making comparisons with the PC builds at max settings, native-resolution/no upscaling, while they don't have a problem with doing those things on the consoles. And when you insist on maxing out a bunch of exponentially-more-expensive settings, you need a 4090 to keep up, and gosh, that makes PC building so much more expensive than just buying a console!
But again, the console is running at 640p-960p internal resolution and upscaling it to 4K, which is like DLSS Performance or Ultra Performance mode. And if you enable those settings on PC, you can get the same thing for a pretty reasonable price. Not quite as good, but you're getting a full PC out of the deal, not a gaming appliance.
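To make the internal-resolution point concrete, here's a small sketch using the commonly cited per-axis scale factors for the DLSS modes (roughly 0.67 Quality, 0.58 Balanced, 0.5 Performance, 0.33 Ultra Performance). Exact ratios vary by title; these are my assumptions, not numbers from this thread.

```python
# Internal render resolution for a 4K output at typical upscaler scale factors.
# Per-axis ratios below are the commonly cited DLSS presets (assumed, not spec).
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "quality"))            # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))        # (1920, 1080)
print(internal_resolution(3840, 2160, "ultra_performance"))  # (1280, 720)
```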
It's always been about consoles having an Apple-style model where they lock you into a couple reasonably-optimized presets, while PC gamers hyperventilate if you take a single setting off Ultra or benchmark with DLSS turned on. And obviously in that case you're going to need a lot more horsepower than consoles offer. Which is more expensive.
Also, GeForce Experience has a settings auto-optimizer which does this with one click, or you can use settings from the PCMR Wiki or Digital Foundry etc. It does tend to target lower framerates than I'd prefer (as a 144 Hz-haver), but there's a slider and you just move it a couple notches to the left.
> The indies I like don't need a serious graphics card. A good bunch of them work just fine on a 2018 i3 Mac mini with the integrated graphics decelerator...
Yes, but your console doesn't run them, so you need a (low-end) PC + a console if you want to play both.
> Don't you need the 1000+ EUR cards to play AAAs at the same performance as a console?
No you don't. If you want to pay 1000+, that's because you ~~like to waste money~~ want to play at 4K, 144 fps, with the highest possible settings for at least the next 5 years. You can play AAA titles with the same kind of settings you have on a console with a 300-500 euro graphics card. So for the price of the console you get an equivalent upgrade for your PC, don't have to pay a premium for your games, and can play all your games on the same device.
Aren't console graphics usually not as good as the PC versions? Even if they have the same resolution, I was under the impression that effects and whatnot were lower. If that's in fact the case, then you wouldn't need a 1000 EUR card to match the console experience. Hell, the mid-range AMD card I bought new for 300 EUR has better graphics than my sister's PS4 Pro.
I actually mean better graphics, yes. I'm not sure what you mean by "numbers" (fps?), but I'm talking about draw distance, shadows, grass detail, etc. I was comparing Red Dead Redemption 2; we don't have other games in common.
My AMD card is a 5600 XT, bought in 2020 IIRC, right before prices exploded due to mining. I don't think the PS5 was out at the time. Anyway, I've never seen one, so I'd be hard-pressed to make any comparison with it.
I've also not tested this in person, but I seem to remember watching a recording of someone playing GTA V on a PS4, and the graphics didn't look as good as on my PC. But then, I don't know how the compression and whatnot affected the quality.
You may be right for Rockstar games; the last one I played on PC was San Andreas. I mean, in theory, as a game developer you can ship larger textures and whatnot in a PC game, if you want to spend money on it.
My impression generally is that, especially for the PS5 generation, there is a negligible difference unless you want those 180 fps and 8k and 16x fake frames or whatever DLSS is.
I'm not a hardcore gamer, nor do I follow these things too closely, so I may be off here, but if I compare the specs of a 5600 XT and the PS4's GPU, the former seems quite a bit faster. On the other hand, the PS4 has unified memory, whereas my PC is running PCIe 3 and DDR3 (quad channel, but still).
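For what it's worth, a back-of-the-envelope FP32 throughput comparison. The shader counts and clocks below are the commonly published figures as I remember them, so treat the results as approximate:

```python
# Rough FP32 TFLOPS estimate: shaders * clock (GHz) * 2 FMA ops per cycle.
# Shader counts and clocks are approximate published figures, not measurements.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000

print(round(tflops(1152, 0.800), 2))  # PS4:        ~1.84
print(round(tflops(2304, 0.911), 2))  # PS4 Pro:    ~4.2
print(round(tflops(2304, 1.560), 2))  # RX 5600 XT: ~7.19 (boost clock)
```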
> My impression generally is that, especially for the PS5 generation, there is a negligible difference unless you want those 180 fps and 8k and 16x fake frames or whatever DLSS is.
As someone not particularly interested in the field (I own a PC because I need to do actual PC stuff; the "gaming" GPU I bought to kill time during covid lockdowns), my impression is that when a new-generation console comes out, it's quite competitive with non-absurd PC builds. But PC GPUs tend to get noticeably better during the lifetime of the console. The PS4 came out 6-7 years before AMD released my GPU.
For 1080p and 1440p certainly not. Something like an RX 6750 XT will carry you all the way.
(The PS5 GPU was comparable to an RX 5700 XT / RTX 2070 Super at the time, although it's apples and oranges as the PS5 is an integrated system target whereas the PC is an open platform)