The first time I heard about the new Nvidia RTX technology was from a fellow YouTuber, believe it or not. It sounded extremely exciting, but since I tend to always come late to having the newest technology because of financial reasons, I didn’t want to get too giddy about something I was probably not going to get for a very long time, so I let it go from my mind. Imagine my surprise when the next product I was sent to review was an RTX GPU: the Nvidia Quadro RTX 4000.
It’s imperative to note that in order to really understand the individual model I’m presenting, you have to be aware of its underlying technology. We’re not simply talking about a new model in a line here: we’re talking about groundbreaking new technologies.
So the first thing I did was watch the whole Nvidia GeForce RTX presentation that Jensen Huang gave last year for the official launch of the GeForce RTX line. Even though the presentation wasn’t for the Quadro RTX line, he focuses on the new Turing architecture, and it’s very well explained, with some jaw-dropping examples and great comparisons. If you’re curious, I recommend you watch it.
The Future of Computer Graphics
Nvidia is introducing something that’s been like the holy grail forever. It truly is a game changer. I don’t want to scare you with minute, super-specialist details, but I have to give you a broad idea of what’s new in order for you to make sense of what’s inside this GPU.
The new NVIDIA Turing architecture fuses real-time ray tracing, AI, simulation, and rasterization to fundamentally change computer graphics:
So, what’s RTX? The short version is: Real Time Ray Tracing. This. IS. Huge.
Ray tracing is a rendering technique that traces the path of light rays as they intersect objects in a scene. The objects a ray encounters have real-world material properties that inform the engine, so it can calculate how the result will look, including reflection, refraction, subsurface scattering, and other optical phenomena that are very hard to simulate by other means. The results are amazing, but at a great cost: it’s very computationally intensive. This is the reason you mostly see it in high-end animation, like Hollywood movies, and not in games, which need real-time speed. Now Nvidia has created the architecture to make real-time ray tracing happen. Here you have a beautiful example:
The Turing architecture is armed with dedicated ray-tracing processors called RT Cores that accelerate the computation of how light and sound travel in 3D environments by up to 10 Giga Rays per second. Turing accelerates real-time ray tracing by 25X over the previous-generation NVIDIA Pascal and can render final frames for film effects more than 30X faster than CPUs.
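To make the idea of “tracing the path of light” more concrete, here is a toy sketch in Python (my own illustration, not Nvidia’s code, and nothing like the optimized work an RT Core does): it solves the ray-sphere intersection test that sits at the heart of every ray tracer.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit in front
    of the origin, or None if the ray misses the sphere."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    # With a unit-length direction this is a quadratic with a == 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two roots
    return t if t > 0 else None

# Trace one ray from the camera straight down the z-axis at a sphere.
hit = ray_sphere_hit(origin=(0, 0, 0), direction=(0, 0, 1),
                     center=(0, 0, 5), radius=1)
print(hit)  # → 4.0 (the ray touches the sphere one radius before its center)
```

A real renderer fires millions of such rays per frame and shades every hit, which is why dedicated hardware that tests billions of ray intersections per second matters so much.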
Turing features new Tensor Cores, processors that accelerate deep learning training and inference, providing up to 500 trillion tensor operations per second. This level of performance dramatically accelerates AI-enhanced features such as denoising, resolution scaling, and video re-timing.
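Real AI denoisers are trained neural networks, but the goal is easy to picture. This deliberately naive, non-AI sketch (my own illustration, not Nvidia’s method) smooths a noisy row of pixel samples by averaging neighborhoods, the crudest form of the job Tensor Core denoisers do far more intelligently:

```python
def box_denoise(samples, radius=1):
    """Naive denoiser: replace each value with the mean of its
    neighborhood (a 1D box filter over a row of pixel samples)."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - radius)
        hi = min(len(samples), i + radius + 1)
        window = samples[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A noisy row: the spikes get pulled toward their neighbors.
noisy = [1.0, 9.0, 1.0, 1.0, 9.0, 1.0]
print(box_denoise(noisy))
```

An AI denoiser learns *what clean images look like*, so it can recover detail a blind average like this one would blur away.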
New Streaming Multiprocessor
The Turing architecture dramatically improves raster performance over the previous-generation Pascal with an enhanced graphics pipeline and new programmable shading technologies. These technologies include variable-rate shading, texture-space shading, and multi-view rendering, which enable more fluid interactivity with large models and scenes and improved virtual reality experiences.
Turing-based GPUs feature a new streaming multiprocessor (SM) architecture that supports up to 16 trillion floating-point operations in parallel with 16 trillion integer operations per second. Developers can take advantage of up to 4,608 CUDA cores with NVIDIA CUDA 10, FleX, and PhysX software development kits (SDKs) to create complex simulations, such as particle or fluid dynamics for scientific visualization, virtual environments, and special effects.
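As a rough picture of the kind of work FleX or PhysX offloads, here is a minimal, serial particle-simulation step in plain Python (my own sketch, not SDK code). On the GPU, the per-particle math below is what gets spread across thousands of CUDA cores:

```python
def step(positions, velocities, dt=0.1, gravity=-9.8):
    """Advance every 2D particle one time step (semi-implicit Euler).
    Each particle's update is independent of the others, which is
    exactly what makes this loop easy to parallelize on a GPU."""
    new_vel = [(vx, vy + gravity * dt) for vx, vy in velocities]
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel

# Two particles dropped from a height of 10 units.
pos = [(0.0, 10.0), (1.0, 10.0)]
vel = [(0.0, 0.0), (0.0, 0.0)]
pos, vel = step(pos, vel)
print(pos)  # each particle has started to fall
```

A fluid or particle-effects simulation just runs a (much richer) version of this step for millions of particles, many times per second.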
In the Quadro RTX 4000, you get all of this in a single-slot form factor, which is great for the typical desktop most of us have. The Quadro line is built for reliability for graphics professionals and creators. Architects, designers, animators, visual effects artists, and video editors will find solutions to previous workflow challenges: they can improve review processes with cinema-quality renders that show accurate shadows, reflections, and refractions through real-time ray tracing and dynamic global illumination; instantly edit and manipulate scene elements with interactive rendering in the application viewport; and create more precise scenes with AI-powered advanced programmable shading.
Here are the features and specs for the Quadro RTX 4000:
- Three DisplayPort 1.4 Connectors
- VirtualLink Connector
- DisplayPort with Audio
- VGA Support
- 3D Stereo Support with Stereo Connector
- NVIDIA GPUDirect™ Support
- Quadro Sync II Compatibility
- NVIDIA nView® Desktop Management Software
- HDCP 2.2 Support
- NVIDIA Mosaic
Now, you might be wondering: OK, all of this sounds impressive, but how does it actually perform?
Well, the answer is a little bit more complicated than you’d like. I’ll give you what I’ve got so far.
First, I tried my usual applications, like Adobe Photoshop and Premiere Pro. Everything worked fine, as expected, but I didn’t notice much of a difference (my previous GPU was a GeForce GTX 1050 Ti). These programs didn’t help much in testing the new card, though, since I use Photoshop primarily for 2D and Premiere Pro for simple video editing, not special effects.
Then I installed Adobe Dimension. Surely, if I can find titles like “Adobe Dimension CC and NVIDIA RTX GPUs for Real-Time Rendering” on the Nvidia YouTube channel, it would be perfect to show me what I can do, right? Imagine my surprise when I couldn’t find an option in the settings to see or set my rendering preferences. I did some research until I found a post in an Adobe forum in which someone asked the same question. The answer shocked me: Adobe Dimension only uses the CPU for rendering… for now.
Then I did a lot of searching online for software that actually uses the RTX technology, and I came to a realization that ties what I said at the beginning to the current state of things: this groundbreaking technology is so new that software companies have yet to fully implement it in their code. The hardware is there, but the software has to call for it to be used. Many companies are working towards this and are very excited, like Chaos Group (V-Ray), SolidWorks, Blackmagic Design, Autodesk, Pixar, Allegorithmic, OTOY, and lots of others. The future is exciting, even for Adobe Dimension, as Adobe is working on implementing the new Nvidia technology into the program.
Something else to keep in mind is DirectX Raytracing (DXR), an extension to DirectX 12 that only runs on Windows 10. DirectX Raytracing is able to take advantage of the new Nvidia technology.
OK, so not being able to test the new technology itself, I went on to test the card for whatever capabilities current software can take advantage of. Keep in mind, though: my CPU is 10 years old, a first-generation Intel Core i7-920, which can create a bit of a bottleneck. This card would definitely perform better in a newer, faster configuration. I ran a few benchmarks that did really well, like UserBenchmark, which gave these results:
You may notice DX12 is not included in these tests, so I tried the BaseMark benchmark for DX12:
I also included a render benchmark, like LuxMark:
I would have loved to test Nvidia Iray (Nvidia’s render engine), but RTX support is still not implemented in the latest version. They’re working on it.
SolidWorks is also working towards supporting RTX, and I’ve been told support will come in the next version, but I got a preview of the unreleased version of Visualize that’s in the works:
With RTX on, rendering took 4:00 minutes. With RTX off, it took 5:58. That’s a 33% cut in render time! That will definitely change any graphics production workflow.
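For the curious, the 33% figure comes straight from the two wall-clock times:

```python
# Render times from the Visualize test: RTX on = 4:00, RTX off = 5:58.
rtx_on = 4 * 60          # 240 seconds
rtx_off = 5 * 60 + 58    # 358 seconds

# Fraction of render time saved by turning RTX on.
saving = (rtx_off - rtx_on) / rtx_off
print(f"{saving:.0%}")   # → 33%
```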
Although I’m certainly not a gamer, I was curious to see how my new configuration would perform. I installed and played two games: Planet Coaster and Warframe. Both behaved extremely well, especially a fast-paced game like Warframe (a third-person shooter with a great variety of parkour-like movements and crazy graphics-intensive abilities). It ran silky smooth, without a single glitch, so I thoroughly enjoyed being tense and nervous without any interruptions =)
I wanted to pay close attention and see if I could identify the differences this GPU made in my everyday workflow, so I took some time to “taste” what it felt like. After proper observation, I have to say: I really did notice a big difference. The first thing that stood out was its reliability. Everything worked, all the time. No graphic feat seemed to upset it, no matter what I tried. I monitored it with GPU-Z for a long time and was pleasantly surprised by the ease it showed, even with demanding tasks. Then I started noticing the differences compared to the way my previous GPU performed. And oh yes, this one was definitely and noticeably better: things like digital painting with big brushes and no lag, or moving 3D objects around in the viewports. Every little (or big) thing that uses GPU acceleration worked like a charm.
OK, it’s true that software still has to catch up with Nvidia’s new technology, but oh boy, is what’s coming exciting! I think having the hardware ready for when the next version of everything comes out is not such a bad idea. If you can, I highly recommend you go to GTC (GPU Technology Conference) 2019, from March 17-21, 2019. And if you can’t, don’t miss the keynote and sessions that will appear online later.
But Is It Worth It?
I’m not even going to talk about high-end professionals. They already know the answer. But if you’re an independent creative like me, or just a hobbyist, I can say without hesitation that investing in a piece of hardware with so much potential and capabilities is well worth the expense.
Some (a lot of) years ago, I became enamored with 3D graphics… but I eventually fell out of love because I HATED the seemingly eternal render times. Since then, technology kept getting better, but I think this new technology is a true game changer. It will surely help anyone not only work (and play) better and faster, but also fall back in love with whatever part of the creative graphics world was too annoying to keep doing. Sure, it was a bit frustrating not being able to take full advantage of the amazing new Turing architecture and RTX, but even without it I got a solid performer that made my graphic life a pleasure. I think we take for granted how much effort and resources go into making this kind of groundbreaking stuff work. I’m incredibly excited to have a front-row seat to witness this pioneering technology flourish; it will have a huge impact on the way all creative people work.
Barbara Din is a visual artist, graphic designer, painter, interior designer, crafter, musician and writer living in Argentina. Learn more about Barbara and her work at the following links:
Barbara Din Patreon page
Barbara Din YouTube Channel