The debate around frame rates and whether they affect performance has raged for so long that many of us stopped caring a long time ago, but new evidence from Nvidia suggests that players with better graphics cards and higher frame rates do have a higher K/D ratio.
The tech giant has been collecting data from battle royale games such as Fortnite, PUBG, Apex Legends, and Blackout, and it's discovered that those who play on a 144 Hz monitor at 144 FPS have a tangible advantage over those who play at a stable 60 FPS.
According to Nvidia’s research, those players who use RTX 2000 series cards have a 53% higher K/D ratio compared to those who use the GTX 600 series graphics cards.
While you could argue that those with better graphics cards are also more likely to be playing more, hence the higher K/D ratio, the research does go on to note that superior graphics cards help players at various levels of play, whether they put in 40 hours a week or simply drop in every few weekends.
It also found that RTX GPU owners who play on 240 Hz monitors boast an almost 90% higher K/D ratio compared to gamers who play on a GTX 1060 with a 60 Hz monitor.
Nvidia writes in a post:
What’s interesting here is that having a better graphics card helped at all levels of play time, whether you only play a few hours a week, or are a Battle Royale veteran. In fact, in the chart we see that the gap between GeForce GTX 1050/Ti users and GeForce GTX 1080/Ti users expands as hours played per week increase, which means that players with more hours played appear to benefit even more from having a better GPU. This data aligned with what we observed in our lab research — the higher the skill level, the more that players are attuned to the game and can benefit from differences in hardware.
While there are some interesting findings in here, I would also urge everyone to consider the source. Nvidia makes graphics cards, so it's hardly going to encourage you to stick to the cheaper end of its wares, is it?