With today’s announcement of the brand-new RTX Titan GPU, along with the relatively recent releases of the RTX 2080 Ti and the Quadro RTX 6000, it’s obvious Nvidia is betting big on its new Turing architecture and its much-touted ray tracing capabilities via NVIDIA® OptiX™.
I think this is going to affect the Media and Entertainment industry hugely. To understand why I feel it is so groundbreaking, you need to look under the hood of how this new Nvidia OptiX AI system operates. For now, its primary purpose is to provide real-time ray-traced renders, which we all know is very compute-intensive, but the miracle happens in the AI portion. The system uses deep learning to improve render times by learning from past calculations and using that data to predict outcomes faster than actually doing the calculations over and over! It reminds me of when video compression technologies first hit the industry in the ’90s: they changed the entire media industry and created some new ones. Some of the major CG rendering apps have already incorporated Nvidia OptiX AI, such as V-Ray Next, Arnold 5.2, and Redshift, to name a few. Right now the AI is only used for the denoising step, but I feel other deep learning applications will start coming out that leverage AI to significantly improve how we work on monotonous tasks like rotoscoping.
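To make the denoising idea concrete, here is a toy sketch in plain Python. This is not the OptiX API and not a neural network; the sliding-window filter simply stands in for the learned denoiser to illustrate the workflow the paragraph describes: render quickly with too few samples (noisy result), then let a denoiser recover accuracy instead of computing many more samples.

```python
import random

random.seed(42)

# Ground truth: a smooth 1-D "image" (stand-in for a fully converged render).
truth = [i / 63 for i in range(64)]

# Cheap render: the truth plus Monte Carlo noise from using too few samples.
noisy = [t + random.gauss(0, 0.1) for t in truth]

def denoise(pixels, radius=2):
    """Sliding-window average -- a toy stand-in for the learned denoiser."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def mse(a, b):
    """Mean squared error between two pixel lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

clean = denoise(noisy)
# The denoised cheap render is closer to the converged image than the raw one.
print(mse(noisy, truth) > mse(clean, truth))
```

The real OptiX denoiser replaces that averaging filter with a network trained on pairs of noisy and converged renders, which is why it can preserve edges and detail that a naive blur would destroy.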
I think this RTX Titan (or “T-Rex,” as some are calling it) will be the card to get for the power user. It has a whopping 24GB of VRAM and the same number of CUDA cores as the top-of-the-line Quadro RTX 6000/8000, but at only $2,499 vs. $6,300. For anyone whose workflow leans on GPU compute, it is hard to beat. Nvidia is touting that this card will allow for real-time 8K editing!
RED, the camera maker, has been working with Nvidia so that the most computationally demanding part of REDCODE can be processed at 8K resolution in real time on the new Nvidia Turing architecture.
We are beginning to see many facets of the content creation world tapping into the power of GPU hardware, but the really exciting changes will be the groundbreaking new developments from applications that leverage the deep learning/AI capabilities of these GPUs.