The Last of Us Part 2 proves that 8 GB of VRAM can be enough, even at 4K with maximum settings, so why aren’t more games using the same clever asset-streaming trick?

by lucky

When Sony’s masterpiece The Last of Us Part 1 appeared on PC two years ago, I hoped it would be a watershed moment in the history of console ports. Well, it was, but for all the wrong reasons: it was buggy and unstable, it hated your CPU and GPU, and above all, it tried to eat more VRAM than your graphics card actually had. It’s fair to say that TLOU Part 1 did the whole ‘is 8GB of VRAM enough?’ debate no favors.

Most of these problems were eventually fixed through a series of patches, but like many big-budget, graphics-heavy games, if you fire it up at 4K on ultra settings, it will happily try to use more VRAM than you actually have. The TLOU Part 1 screenshot below, taken on a test rig using an RTX 3060 Ti with 8GB of memory, shows the built-in performance HUD. I confirmed the memory figures with other tools, and the game really is trying to use more than 10GB of VRAM.

(Image credit: Sony Interactive Entertainment)

So when I started testing The Last of Us Part 2 a few weeks ago, the first thing I monitored, after playing through a decent chunk of the game, was how much graphics memory it was trying to allocate and use. To do this, I used Microsoft PIX on Windows, a tool that lets developers analyze what’s happening under their game’s hood in terms of threads, resources, and performance.
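A PIX capture is far more detailed than anything a game would do at runtime, but the two numbers that matter here, the OS-provided video memory budget versus what a process actually has resident, are exposed to any Windows application through the public DXGI API. As a minimal sketch (my own illustration, not anything taken from the game or from PIX), here’s how an app can query those figures for its own process using IDXGIAdapter3::QueryVideoMemoryInfo:

```cpp
// Minimal sketch: querying the OS-provided VRAM budget and this process's
// current usage via DXGI on Windows. Illustrative only; PIX gathers far
// more data than this, and this is not how the game itself works.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    // First enumerated adapter is usually the primary GPU.
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    // LOCAL segment group = memory physically on the graphics card.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    // Budget: how much VRAM the OS thinks this process can use right now.
    // CurrentUsage: how much this process actually has resident.
    const double gb = 1024.0 * 1024.0 * 1024.0;
    printf("VRAM budget: %.2f GB, current usage: %.2f GB\n",
           info.Budget / gb, info.CurrentUsage / gb);
    return 0;
}
```

A budget-aware streaming system polls these two values every frame or so and starts evicting or downgrading assets as CurrentUsage approaches Budget, which is presumably the kind of discipline that lets a game stay inside an 8GB card rather than blindly overcommitting.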
