Hacker News

>The roughest era of AMD CPUs was the FX era.

Ahem. Bulldozer?

>Ryzen was a huge step forward in CPU design and architecture.

First gen Ryzen was kinda mediocre. Second gen (to be clear: Zen 2, not the Ryzen 2000 series, which was still Zen 1) was where the performance arrived.

Also, let's not ignore how they screwed consumers like me by dropping software support for Vega in 2023 while still selling laptops with Vega-powered APUs on the shelves all the way into 2024. Or the naming scheme that's intentionally confusing to mislead consumers: you can't tell whether that Ryzen 7000 laptop APU has Zen 2, Zen 3, Zen 3+, or Zen 4 CPU cores, whether it's 4 nm, 5 nm, 6 nm, or 7 nm, or whether it's running RDNA 2, RDNA 3, or the now-obsolete Vega in a modern system.[1] Maddening.
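For what it's worth, under AMD's published 2023 mobile numbering guide the architecture is hidden in the third digit of the model number, which is exactly why the scheme is so easy to misread. A rough sketch of the decoding (the helper function is hypothetical, not an official AMD tool, and assumes the standard four-digit-plus-suffix model format):

```python
# Third digit of a Ryzen Mobile 7000-series model number encodes the CPU
# architecture, per AMD's 2023 mobile naming guide. The first digit is the
# model year, the second the market segment, the fourth a feature tier.
ZEN_BY_THIRD_DIGIT = {
    "1": "Zen / Zen+",
    "2": "Zen 2",
    "3": "Zen 3 / Zen 3+",
    "4": "Zen 4",
}

def zen_generation(model: str) -> str:
    """Hypothetical decoder, e.g. '7520U' -> 'Zen 2', '7840HS' -> 'Zen 4'."""
    digits = "".join(ch for ch in model if ch.isdigit())
    if len(digits) != 4:
        raise ValueError(f"expected a 4-digit model number, got {model!r}")
    return ZEN_BY_THIRD_DIGIT.get(digits[2], "unknown")
```

So a "Ryzen 5 7520U" is actually Zen 2 silicon, while a "Ryzen 5 7735HS" is Zen 3+ and a "Ryzen 7 7840HS" is Zen 4, despite all three carrying 7000-series branding.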

Despite that, I'm a returning AMD customer to avoid Intel, but I'm now having my own issues with their iGPU drivers, making me regret not going Intel this time around. The grass isn't always greener on the other side of the fence; there are just different issues.

I get it, you're an AMD fan, but let's be objective and not ignore their stinkers and anti-consumer practices, of which they had plenty. They only played nice for a while to get sympathy as the struggling underdog, then didn't hesitate to milk and deceive consumers the moment they got back on top, like any other for-profit company with a moment of market dominance.

My point being, don't get attached or loyal to any large company, since you're just a dollar sign for all of them. Be an informed consumer and make purchasing decisions on objective current factors, not blind brand loyalty from the distant past.

[1] https://www.pcworld.com/article/1445760/amds-mobile-ryzen-na...

https://www.digitaltrends.com/computing/amd-confusing-naming...



>>The roughest era of AMD CPUs was the FX era.

>Ahem. Bulldozer?

Bulldozer is the same as FX.

>AMD FX is a series of high-end AMD microprocessors for personal computers which debuted in 2011, claimed as AMD's first native 8-core desktop processor.[1] The line was introduced with the Bulldozer microarchitecture at launch (codenamed "Zambezi"), and was then succeeded by its derivative Piledriver in 2012 (codenamed "Vishera").


>The roughest era of AMD CPUs was the FX era.

Or the early Athlons that would literally burn down without cooling? https://www.youtube.com/watch?v=YYQSHXNFvUk


Tom's Hardware posted a retraction over a year later, admitting the motherboard was at fault and that the test was proposed and designed by Intel (including picking the motherboard vendors) as part of their Pentium 4 promotion drive.

Same as the Pentium III of the same era: thermal throttling on Socket A was supposed to be implemented by motherboard vendors using the chip's integrated thermal diode. A Pentium III would burn up the same way if put on a motherboard with a non-working thermal cutout.


> thermal throttling on socket A was supposed to be implemented by Motherboard vendors using chip integrated thermal diode

Thunderbirds and Spitfires didn't have a die sensor; that first appeared on Palomino/Morgan.

That said I've seen P4s die due to cooler failure so it was still dumb.


This is from that Tom's article:

"Just like AMD's mobile Athlon4 processors, AthlonMP is based on AMD's new 'Palomino'-core, which will also be used in the upcoming AthlonXP processor. This core comes equipped with a thermal diode that is required for Mobile Athlon4's clock throttling abilities. Unfortunately Palomino is still lacking a proper on-die thermal protection logic. A motherboard that doesn't read the thermal diode is unable to protect the new Athlon processor from a heat death. We used a specific Palomino motherboard, Siemens' D1289 with VIA's KT266 chipset."

Intel suggested the Siemens D1289 board for the test; that board didn't have thermal protection. For the Pentium III, Intel suggested (or even delivered) a motherboard with working thermal protection.


>AMD FX is a series of high-end AMD microprocessors for personal computers which debuted in 2011

Ha, well that's wrong. This is the first time I've found a mistake, or more accurately a contradiction, in Wikipedia.

AMD's first FX CPU (the FX-51) came out in 2003 as a premium Athlon 64: an expensive, power-hungry beast, which is the one I assume the GP was talking about. Here, also from Wikipedia:

"The Athlon 64 FX is positioned as a hardware enthusiast product, marketed by AMD especially toward gamers. Unlike the standard Athlon 64, all of the Athlon 64 FX processors have their multipliers completely unlocked."

https://en.wikipedia.org/wiki/Athlon_64#Athlon_64_FX


It's not contradictory. The "FX" you're talking about is used as "Athlon FX"[1], whereas the "FX" in the article is "AMD FX"[2]. The branding might be a bit confusing, but the article isn't wrong.

[1] https://en.wikipedia.org/wiki/File:AMD_Athlon64_FX.jpg

[2] https://commons.wikimedia.org/wiki/File:AMD_FX_CPU_New_logo....


> First gen Ryzen was mediocre. Second gen was where the performance came.

Are you sure? I just looked at Ryzen 5 1600 vs 2600 benchmarks and the difference is around 5%. And I also remember the hype when the first generation was released. I think Ryzen gen 1 was by far the largest step.


Modern chip model numbers are just branding; you have to look at the benchmarks if you want value:

https://www.cpubenchmark.net/high_end_cpus.html

Yes, it is deceptive and annoying shenanigans for retail products =3


Zen 2 is Ryzen 3000.


Becoming mediocre by Intel's standard was a huge step at the time. So both of you can be right.


Almost on par with Intel in single-core performance but with twice the number of cores. A big deal if you had a use for all those cores; I did, compiling C++ code.


Both of you forget that for the longest time Intel consumer chips excluded virtualization and other features that first-gen Ryzen made available. First gen was a huge win in functionality for consumers even if it didn't hit the same performance as Intel. It wasn't AVX-512 (that wasn't supported on first gen), but there were other features I forget now, and they were also a reason I stuck with AMD.


> First gen Ryzen was kinda mediocre.

I've used both the Ryzen 3 1200 and the 7 1700, and both seemed fine for their time and price.

Honestly, I had the 1700 in my main PC up until last year, and it was still very much okay for most things I'd actually want to do. The lack of ReBAR support is what pushed me towards a Ryzen 5 4500 (got it for a low price; otherwise it's slightly better than the 1700 in performance and still good for my needs, though it runs noticeably hotter, even without a big OC).

I guess things are quite different for enthusiasts and power users, but their needs probably don't affect what would be considered bad/mediocre/good for the general population.


I'm sure you'll be happy to hear that this is a purely artificial limitation introduced by AMD for product-segmentation purposes. The very first Ryzen Zen generation fully supports ReBAR in hardware, but it's locked by the AMD BIOS.

https://www.techpowerup.com/276125/asus-enables-resizable-ba...


Yeah, there were also efforts like this, too https://github.com/xCuri0/ReBarUEFI

Given that I got an Intel Arc A580 for myself, this was pretty important! It's quite bad that it wasn't officially supported if there are no hardware issues. I would have liked to just keep using the 1700 for a few more years, but I opted to buy a new CPU so my old one would be a reasonable backup; path of least resistance in this case.

Would also like to try out the recent Intel CPUs (though surely not the variety that seems to have stability issues), but that's not in the cards for now because most of my PCs and homelab all use AM4, on which I'll stay for the foreseeable future.


I actually like both companies. Intel isn't bad, right now isn't great for them though.

We are better off for Intel and AMD coexisting. But my gamble is on AMD because I've always liked the compatibility of their hardware with a variety of technologies. You can easily get server-grade interfacing on consumer-grade parts; for the longest time that wasn't true for Intel. When AMD pulls an Intel, I'll go full Intel. There are huge wins in Intel getting new fabs built in the States, because it means a lot for security and development.


As for pulling an Intel, they kind of did with Ryzen 5000 series pricing, IIRC.



