Hacker News

I don't understand the problem. 2000Hz = 0.5ms/frame; 20% of this = 0.1ms. That sounds like a great result. Your target frame rate is presumably 60-100Hz, assuming it's a PC game, meaning your frame budget is 10-16ms. If your UI takes 0.1ms, you've got >99% of your budget left.
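The budget arithmetic above can be sketched quickly (a rough check using only the numbers cited in the comment):

```python
# 20% of a 0.5 ms frame (2000 Hz) = 0.1 ms of UI work per frame
ui_ms = (1000 / 2000) * 0.20

# How much of a 60 Hz or 100 Hz frame budget does that leave?
for target_hz in (60, 100):
    budget_ms = 1000 / target_hz
    remaining = 1 - ui_ms / budget_ms
    print(f"{target_hz} Hz: {budget_ms:.1f} ms budget, {remaining:.1%} remaining")
```

At 60 Hz that is roughly 99.4% of the budget left over, which matches the ">99%" figure.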

(Also: 3-4K vertices for UI was about what you could expect to budget for a PS2 or Xbox game! Max throughput for the PS2 was something like 250,000 vertices/frame at 60 Hz, and this is <2% of that. I struggle to believe this is any kind of issue on anything modern.)



I should mention that the 2000 Hz figure is with only the UI running -- the full game runs at 150 FPS. So really the CPU time is 6.66 ms * 0.2 = 1.33 ms just for the UI!

Of course, that's on my beefy machine with an octo-core overclocked CPU and a couple of 1080 Tis. But what about the players who are trying to play on their mobile CPUs with integrated graphics? That margin could be the difference between 45 and 60 FPS.


This is wrong. Just because it uses 20% CPU when rendering at 2000 Hz doesn't mean it will consume 20% CPU when running the game too. Running the game alongside the UI is what makes it drop from 2000 FPS to 150 FPS. If the UI consumes 20% of the frame time at 2000 FPS, that is indeed 0.1 ms, and this number will not magically increase when you add the game. So when you're running at 150 FPS, your frame time is 6.66 ms and you're using 0.1 ms of it for the UI. That seems pretty good to me, to be honest.
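The rebuttal's point is that the UI's absolute cost is fixed, so its share of the frame shrinks as the frame gets longer. A minimal sketch of that arithmetic, using the figures from this thread:

```python
# UI cost measured with the UI alone: 20% of a 0.5 ms (2000 FPS) frame
ui_cost_ms = (1000 / 2000) * 0.20      # 0.1 ms per frame, a fixed cost

# With the game running, the frame stretches to ~6.67 ms (150 FPS),
# but the UI still only takes its fixed 0.1 ms of that frame.
game_frame_ms = 1000 / 150
share = ui_cost_ms / game_frame_ms     # ~1.5% of the in-game frame budget

print(f"UI: {ui_cost_ms:.2f} ms of a {game_frame_ms:.2f} ms frame ({share:.1%})")
```

Under this reading, the 1.33 ms figure upthread comes from incorrectly applying the 20% share to the longer in-game frame.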



