Nvidia Launches New Mainstream GeForce GTX 560 Ti Graphics Card (hothardware.com)
17 points by MojoKid on Jan 25, 2011 | hide | past | favorite | 19 comments


If you don't need CUDA or PhysX, is there any reason to use nvidia and not ATI?


Proper Linux driver support for 3D acceleration, video games, and multi-monitor setups?


On the other hand, if you're looking for open source drivers, I've had better experiences with the ATI (radeon/hd) drivers than the ones for Nvidia (nouveau).


Brand loyalty is a big part of it too. Personally, I was soured by a laptop with a Mobility Radeon that shipped with custom drivers from the manufacturer, which I couldn't upgrade without using Omega. The card definitely didn't live up to expectations, but it could have just been a fluke.


The lovely thing about new graphics card introductions is the scramble by competitors to lower prices to compete. AMD reduced the price of the 1GB 6950 to $259 and the 6870 to $219, nicely sandwiching this new card at its $249 price point. Source: http://www.anandtech.com/show/4135/nvidias-geforce-gtx-560-t...

[edit: replace ATI with AMD. :)]


Too bad that it has no DisplayPort output.


Sadly, DisplayPort monitors are still vastly overpriced. We need mainstream LCDs to start supporting the standard.


Are there advantages of DisplayPort over HDMI? Or do you just want to use Apple-branded monitors without an adapter? My naive outsider's thought would be that, rather than adding an additional output to all video cards, it would be better if monitor manufacturers settled on a single standard.


• DisplayPort is open and royalty free.[1]

• Easily extensible (packet based protocol)[2]

• More than twice the bandwidth of HDMI[3]

• Intended to replace every display link, from internal LCD interfaces all the way to the living room

• Better electrical engineering (tricky clocking, spectrum spreading)

[1] HDMI royalties are about 1/2 the cost of H.264 decoders

[2] And yet I have to plug my USB cable in to get my laptop to see devices hanging on my display. Come on Apple! Read the spec! Implement! You are costing me 4 seconds per day in unnecessary plug fumbling time.

[3] You could hang four 1080p60 displays on a single DisplayPort
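A rough back-of-the-envelope check of footnotes [2] and [3]. The figures below are assumed DisplayPort 1.2 and HDMI 1.3/1.4 numbers, not stated anywhere in this thread, and the per-display cost assumes CVT reduced-blanking timings:

```python
# DisplayPort 1.2 (assumed): 4 lanes x 5.4 Gbit/s raw, 8b/10b encoding -> 80% payload.
dp_payload_gbps = 4 * 5.4 * 0.8        # 17.28 Gbit/s usable

# HDMI 1.3/1.4 (assumed): 340 MHz TMDS clock x 3 channels x 8 data bits per 10-bit symbol.
hdmi_payload_gbps = 0.340 * 3 * 8      # 8.16 Gbit/s usable

# One 1080p60 stream, CVT reduced blanking: ~138.5 MHz pixel clock at 24 bpp.
stream_gbps = 0.1385 * 24              # ~3.32 Gbit/s per display

print(dp_payload_gbps / hdmi_payload_gbps)   # DP carries more than twice HDMI's payload
print(4 * stream_gbps < dp_payload_gbps)     # four 1080p60 streams fit in one DP link
```

This ignores MST packetization overhead, so treat it as an order-of-magnitude sanity check rather than a link-budget calculation.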


DisplayPort is a free standard, while HDMI requires a license. Also, DisplayPort allows devices to be daisy-chained rather than connected to a single hub (theoretical at this stage, though).


Actually, I have an HP LP2475w. I bought it because it has an IPS panel, just like the Dell U2410, which also has a DisplayPort input.


Thanks. I didn't know there were other manufacturers making monitors with DisplayPort inputs. I use mostly NVidia cards and Linux, so I was being genuine about my 'naive' status. My impression was that DisplayPort was dead outside Apple. Do you think it's still up and coming, or is it another Beta/VHS problem?


If my memory serves me _right_, I saw a Dell POS (Point of Sale) system a couple of days ago that had an integrated Intel video card offering only VGA and DP outputs. If you wanted a (legacy) DVI output, you had to buy an extra ATI card with DVI.

Also, a couple of new laptops offer DP outputs; combined with the advantages others have already mentioned, I think it has a future and will replace DVI, but not HDMI.


I don't see anything to get overly excited about.


I agree; however, I just bought one of these two minutes ago.


There is nothing wrong with it, it's just not the step up that Nvidia claims it is.

I was hoping for a 1K core chip for the 5xx series and instead we get these stop-gap products.


Nvidia and Intel are both on a tick/tock cycle.

2010: shrink the die size

2011: new architecture

2012: shrink the die size

Nvidia and Intel are both going to put out really high-performance chips on the new architecture before the die shrink, but I am not going to buy them, so it's irrelevant. An i5-2500 + GTX 560 is a great price/performance match, and things are not going to change much for another year.


A $250 graphics card that can actually play any title you throw at it, at reasonably high resolution with the eye candy turned up.


For how long? I paid $500 or so for my Radeon HD 4870 X2 around two years ago, and it has done everything you described since then. In my experience, getting a high-end video card is a better choice in the long run.



