MojoKid's comments

...every single US household would have to spend all of its income buying nothing but music for over 13 years in order to arrive at what the music industry has deemed a reasonable settlement.
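A back-of-the-envelope check of that claim, assuming figures not stated in the quote itself: the roughly $75 trillion in damages reported in the LimeWire case, about 115 million US households, and a median household income of about $50,000 (rough 2011 numbers).

```python
# Sketch: how many years of total US household income the reported
# damages figure represents. All inputs are rough assumptions.
settlement = 75e12       # ~$75 trillion reportedly sought
households = 115e6       # ~115 million US households
median_income = 50_000   # ~$50k per household per year

total_annual_income = households * median_income  # ~$5.75 trillion/year
years = settlement / total_annual_income
print(f"{years:.1f} years")
```

With those assumptions the figure comes out to roughly 13 years, consistent with the quote.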


Um, no. The source is directly noted and properly linked within the text and in the article's source list.


Maybe it's been fixed, maybe it was my browser, maybe it was a heisenbug, but when I wrote my comment I clicked on the "wrote in a blog post" several times, and each time got a "Bad URL" message. I've just tried again, and this time it's sent me here:

http://googleblog.blogspot.com/2011/02/finding-more-high-qua...

which I assume is what was intended.

Fairy Nuff.

My real point is that the blog post linked to here offers nothing extra over the original from Google, and to quote http://ycombinator.com/newsguidelines.html

    In Submissions 

    ...

    Please submit the original source. If a blog post
    reports on something they found on another site,
    submit the latter. 
I could be wrong, but this submission just looks like blog spam. Can you elaborate on what you think the submission adds over the original from Google?


Battery Eater Pro taxes the CPU and GPU blocks lightly and continuously. It doesn't peg anything; it just keeps the machine working moderately rather than sitting idle. It's a little more strenuous than web browsing, but not much more.


A $250 graphics card that actually can play literally any title you throw at it, at reasonably high resolutions and with the eye candy turned up.


For how long? I paid $500 or so for my Radeon HD 4870 X2 around two years ago, and it's done everything you described since then. In my experience, buying high-end video cards is the better choice in the long run.


I don't think that was the intent of the statement. Read it again. The Hummingbird chip can't compete versus Snapdragon when it's tied down by Android 2.1... that was the point.


Indeed. I said questionable. Go to any review of the release of Android 2.2 and you can see the performance improvements that are pretty much universal across devices. This simply conflates the two and implies something that isn't true.

With 2.2 the Streak is competitive with other 2.2 devices. There is no surprise there, given that it runs essentially the same processor as the Nexus One.


The Dell Streak is actually a QSD8650, while the Nexus One runs the QSD8250. Both phones run on the Qualcomm dual-core architecture: ARM9/ARM11, with one processor dedicated to Linux "apps" and one dedicated to the AMSS, aka the modem software stack.

I have both devices sitting on my desk here, so I ran a comparison. As I suspected, they are close. Also note that the Dell Streak's resolution is much higher and I'm not sure how/if the benchmark takes that into account:

  Linpack (higher is better):
  Nexus One: 32.94 MFLOPS
  Streak 2.2: 33.373 MFLOPS

  CaffeineMark (higher is better):
  Nexus One: 5587 (rank 124)
  Streak 2.2: 5738 (rank 109)

  An3dbench (no idea how scores stack up):
  Nexus One: 4746, Fill rate = 10MP/s, Game level FPS = 24.87
  Streak 2.2: 4495, Fill rate = 17.65 MP/s, Game level FPS = 32.26
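
For what it's worth, the overall scores quoted above land within a few percent of each other either way; a quick sketch of the relative deltas (Streak versus Nexus One, using only the numbers from the comment):

```python
# Relative difference of the Streak 2.2 score versus the Nexus One,
# using the benchmark numbers quoted above.
scores = {
    "Linpack (MFLOPS)": (32.94, 33.373),
    "CaffeineMark":     (5587, 5738),
    "An3DBench":        (4746, 4495),
}
for name, (nexus_one, streak) in scores.items():
    delta = (streak - nexus_one) / nexus_one * 100
    print(f"{name}: {delta:+.1f}%")
```

That works out to roughly +1.3% (Linpack), +2.7% (CaffeineMark), and -5.3% (An3DBench) for the Streak, which supports the "they are close" conclusion.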


Great data, thanks.


Exactly. It's all in how the two companies market their architectures, that and branding. Technically speaking, AMD does more in less silicon area but also has to run at higher clock speeds to do so. The power draw is about comparable between similar price/performance points from each camp.


The Asus UL35 is in there too. Similar config with NVIDIA GPU.


Agreed, a Mac mini would have been a good comparison as another SFF PC of some sort.


Flamebait article? Hello? This piece actually takes time to look at the system configs, dollars and cents. This article is quite the antithesis of flamebait. Please read before judging.


The rumor on the street is that Apple is eyeballing these chips for next-gen MacBooks.


Surely not these, but the next batch of Fusion chips with "grown-up" CPU parts (Phenom/Athlon). Even the 11" MacBook Air's 1.4GHz Core 2 Duo ULV blows all of these chips out of the water.


True, maybe not MacBooks with Zacate, unless Apple wants something cheaper and lighter. And these parts actually aren't that far off from CULV Core 2 chips.


I think their version of cheaper and lighter will be the next step up from the A4, i.e. something ARM-based for their iOS devices. By avoiding the Core iX chips in the 11"/13" machines in favour of a faster GPU, they've already demonstrated they're not keen to sacrifice computing power below a certain level.

