Yes, I know: more AI. Love it or hate it, that's the way things are going. First it was upscaling, then frame generation, then Multi Frame Generation, and soon, it seems, frames inferred almost entirely by AI.
At GDC today, Nvidia announced that neural shading support will come to the DirectX preview in April, which will let NVIDIA GeForce RTX GPUs unlock the power of their AI Tensor Cores.
In Nvidia's words: "NVIDIA RTX Neural Shaders" enable developers, via the SDK, to "train their game data and shader code" on an RTX AI PC and accelerate their neural representations and model weights with NVIDIA Tensor Cores at runtime.
In other words, AI won't only be used to upscale frames and generate new ones based on traditionally rendered frames; it will also help render those original frames in the first place. Another step is being added to the rendering pipeline.
The end goal may be that the game engine only needs to tell the GPU about the basic features of a scene, such as objects, movement, and so on, and the rest of the picture will be inferred.
It's hard to imagine how this can work without anything telling the GPU how to render the picture, but that's where the "game data and shader code" training comes in: developers can give the AI model a good idea of what the game should look like when rendered, and then when users actually play the game, the model can do its best to produce a facsimile of it.
As Nvidia's Blackwell whitepaper explains: "Instead of writing complex shader code to describe these [shading] functions, developers train an AI model" to infer what the shader code would have computed.
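To make the idea concrete, here is a minimal toy sketch of that workflow in plain NumPy: a tiny neural network is trained offline to approximate a simple procedural shader function, then evaluated in place of that shader at "runtime". Everything here, the one-dimensional toy shader, the network size, and the training loop, is an illustrative assumption of mine, not Nvidia's actual SDK, which targets far richer shaders and runs the network on Tensor Cores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "procedural shader": brightness as a function of a surface
# coordinate u in [0, 1]. A stand-in for the complex material
# functions Nvidia describes.
def reference_shader(u):
    return 0.5 + 0.5 * np.sin(6.0 * u)

# Tiny MLP: 1 input -> 32 tanh hidden units -> 1 output.
W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

def forward(u):
    h = np.tanh(u @ W1 + b1)   # hidden activations
    return h @ W2 + b2, h      # predicted brightness, activations

# Offline training step, mimicking "train on your shader code":
# full-batch gradient descent on mean squared error.
lr = 0.1
u = rng.uniform(0.0, 1.0, (512, 1))
y = reference_shader(u)
for step in range(5000):
    pred, h = forward(u)
    err = pred - y                          # (512, 1)
    gW2 = h.T @ err / len(u); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)        # backprop through tanh
    gW1 = u.T @ dh / len(u); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# "Runtime": the network replaces the shader evaluation entirely.
test = np.linspace(0.0, 1.0, 64).reshape(-1, 1)
approx, _ = forward(test)
mse = float(np.mean((approx - reference_shader(test)) ** 2))
print(f"MSE vs reference shader: {mse:.5f}")
```

The point of the exercise is the shape of the pipeline, not the network: the expensive or complicated function is baked into a small set of weights offline, and at runtime only a few matrix multiplies remain, which is exactly the kind of work Tensor Cores accelerate.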
It will presumably be geared toward Blackwell, given Nvidia has worked with Microsoft to produce the cooperative vectors API, though Nvidia says some developers' neural shaders will also run on previous-generation GPUs.
We already had an idea this work was underway, as back in December 2024 we saw talk of "neural rendering capabilities" for upcoming graphics cards. We'd also seen neural rendering from Nvidia before, but not in a context that could actually be implemented in games.
Nvidia's VP of developer technology John Spitzer has called it "the future of graphics," and Microsoft's Direct3D dev manager Shawn Hargreaves has agreed, saying cooperative vectors will bring DirectX and HLSL into the future of the gaming industry.
For me, it's almost a reflex to be skeptical of anything AI, but I should remember that my skepticism about frame generation has slowly faded. I remember characters' hands morphing into their hoods in-game when DLSS 3 Frame Generation launched, but now those problems are rare, and even then barely noticeable if you have a high baseline frame rate.
So I'll try to keep my mind open to at least the possibility that this could, in fact, be a step forward. In any case, we won't have to wait long to find out: it's just a few weeks until devs can start trying it.