Let’s Rant About GPUs

Okay, so I don't have a high end GPU or, quite honestly, the money to buy one. But let's look at these newest releases from Nvidia and AMD and discuss the yuck that's happening right now.

I'm going to cite YouTube videos from Hardware Unboxed and Gamers Nexus for this discussion. Check these dudes out if for some reason you haven't yet, because they're fair and informed about everything they present. Certainly they're more informed than me.

First, let's look at Nvidia and the newest 20xx cards

So yeah, real time ray tracing. These RTX GPUs came out over a month ago and RTX remains, even now, simply a tech demo. Battlefield V uses RTX and the visuals are pretty nice. I'll admit that freely. But those who bought their shiny new 2080 or 2070 are probably pretty upset about the fact that when they use the very feature those GPUs are made for, their frame rates drop into the sub-60s! That's something like a 75% decrease in performance to use the ray tracing that folks paid a great amount of money for.
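For a sense of scale on that kind of drop, here's the arithmetic with hypothetical frame rates (my own illustrative numbers, not benchmark figures):

```python
def percent_drop(fps_off: float, fps_on: float) -> float:
    """Performance lost when enabling a feature, as a percentage."""
    return (fps_off - fps_on) / fps_off * 100

# Hypothetical figures: a card pushing 120 FPS with RTX off that
# falls to 30 FPS with RTX on has given up three quarters of its
# performance for the feature it's named after.
print(percent_drop(120, 30))  # 75.0
```

Even a milder drop, say 144 FPS down to 60, still costs well over half the card's performance.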

Now look, I get it. New tech has to start somewhere, and without early adopters and people who are willing to gamble on the cost / performance ratio, we wouldn't have a lot of new tech at all. So I'm glad people are at least willing to try this out. You have to give it a fair shake. Ray tracing has been used in movie rendering for a long time; that's where we get those cool CG effects in films and TV. But in those cases it takes hours to render a single frame. So the fact that we can now use a version of that, no matter how simplified, at 30-plus FPS is pretty remarkable. So props to Nvidia for bringing it on.

That doesn't mean that RTX is worth it for gaming right now. In fact, in my opinion, it's not... at all. However, that doesn't mean ray tracing won't become "the future". I'm actually fairly certain that we'll see it rise to something awesome, but it's going to take a while and right now it's just not there. We have to keep in mind that the first generation of most new tech is pretty bad. And ray tracing for gaming is no exception. This is new territory and with that comes the reality that new drivers have to be coded and optimized to make this new shiny stuff useful.

Look at something like the beginning of the LCD panel market back in the early 2000s as an example. Not only were those televisions and monitors prohibitively expensive, but they often had dead pixels and the refresh rates were awful. Ghosting and image tearing were prevalent on every model. But people were willing to give it a shot, and now we have ultra high definition panels with extreme refresh rates and pixel densities that our eyes can't even fully comprehend. And we don't even have to rent forklifts to move them into our homes. It's a great thing, but it's taken time to get here.

The problem is, mostly, that Nvidia have full control of the high end GPU market, so they can charge customers whatever the bleeding eff they want and people will eat it up. And with their only competitor, AMD, not having any immediate plans for a release to slot into the high end gaming GPU market, Nvidia will continue to do so. That said, I do understand that the newer, bigger GPU dies are more expensive to produce. But they are absolutely not twice the expense that should then be passed on to consumers.

Now let's look at AMD

Before I go on, I want to say that I feel like brand loyalty is okay, but sticking with a company just because you feel like they somehow have your back or appreciate you as a customer is stupid. Neither AMD nor Nvidia give a single curly butt hair about you, me, or anyone aside from their investors. They are businesses that exist to make money, and in order to do that they have to walk a balance beam. Seeming friendly, or like they're somehow releasing a product as a favor to the consumer base, is how they walk that beam.

I have an AMD GPU and I love it. I currently use an XFX R9 390X. Performance has been great for about the last three years, though it does run like a space heater in a closet. So when I saw that there was going to be a new release in the RX 480, I started paying attention. I was interested because even though AMD have been behind in performance for a long time now, they usually have a price advantage, and that's important to me. I was disappointed that my 390X turned out to be quite a bit better than the new Polaris release. Then came the RX 580 and again I started watching. Maybe now's the time to upgrade? Nope! According to UserBenchmark, the old 390X is still 3% faster than the current RX 580. It's also important to note that the 390X's price point was only that high because of the mining boom that was decimating the GPU market at the time. Moving on...

Now the RX 590 is out, and AMD claims it's better than the RX 580 by about 12%. But it seems that's not entirely accurate either. Though technically possible, real-world results come in closer to about a 5% difference. That puts my now very outdated R9 390X at roughly a 3% to 9% disadvantage depending on the manufacturer. That's it.
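Stacking relative percentages like these gets confusing fast, so here's a quick sketch of the arithmetic. The uplift figures are assumptions taken from the claims above, not benchmarks I've run myself:

```python
# Relative performance, normalized to an RX 580 = 1.00.
# All percentages below are assumptions from the discussion above.
rx580 = 1.00
r9_390x = rx580 * 1.03         # 390X ~3% faster than an RX 580
rx590_claimed = rx580 * 1.12   # AMD's claimed ~12% uplift
rx590_real = rx580 * 1.05      # ~5% observed in real-world tests

def deficit(card: float, versus: float) -> float:
    """How far `card` trails `versus`, as a percentage."""
    return (1 - card / versus) * 100

print(round(deficit(r9_390x, rx590_real), 1))     # 1.9
print(round(deficit(r9_390x, rx590_claimed), 1))  # 8.0
```

So depending on whether you trust the claimed or the real-world uplift, the 390X lands somewhere from the low single digits to roughly 9% behind, with board-partner clocks shifting things a few points either way. That's why the upgrade math doesn't work for me.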

To be fair, the price point for the RX 590 isn't too horrible at the MSRP of $280, but it's a little high for the current market, especially for the performance. When an RX 580 can be had for around $200, it's not enough of an advantage to justify an upgrade. Maybe that's different if you're coming from a much lower end card, but that's beside the point and not necessarily true anyway. What you're getting with the RX 590 is a slightly higher clock speed on a 12 nanometer process. No faster VRAM clocks, no GDDR5X, no extra stream processors. Just a higher base clock speed on a slightly different die for close to $100 more. I feel like the RX 590 should be called the RX 580X instead and sold for an MSRP of about $240. Maybe AMD didn't call it that because the RX 580 should just be the RX 480X. Who knows...?

Right now the best AMD have are the Vega GPUs, and the performance is actually pretty good for both the Vega 56 and Vega 64. The Vega 64, in fact, is right on par with the reference 2070 and 2080 in some games. But the HBM2 memory that Vega uses is really expensive, and even though it technically runs at a lower voltage, ultimately the card is still power hungry. So what we got when Vega released was a GTX 1070 (Ti) - 1080 competitor at a price point that was (and still is in some cases) much higher. Which is too bad, because the Vega GPUs have great compute power and the HBM doesn't need to run at high clock speeds because of its design. It's just not as practical as GDDR5, or GDDR6, which has fairly similar bandwidth to HBM2 and is less expensive to produce.

What's worse is that AMD abstaining from the high end gaming GPU market (at least for the time being) only hurts consumers. It allows Nvidia to overcharge for their new releases because, hey, what other options do people have? But again, that comes back to the point that companies don't actually care about consumers. Remember? AMD have no new gaming GPUs in store until some time in the first half of next year, and that very likely won't be for the high end market. That will almost certainly come at least a year later.

Maybe that's good for AMD, to an extent anyway. Perhaps by then they'll have a foot in the real time ray tracing sector and have developed a GPU with enough ray tracing cores to handle the graphical demand. Perhaps the ray tracing drivers across the board from both manufacturers will have improved and more games will be able to utilize it better. I imagine that at least the latter will be true. That should create competition again and perhaps that will bring the high end GPU prices down to more reasonable levels.

So here's the thing

I'm not at all saying that any of these GPUs are bad. Nvidia's RTX cards are expensive, especially the Ti models, but they perform really well, even if the real time ray tracing is so new it's essentially useless as a practical factor in games. AMD did need to release a new GPU, and this is their first 12 nanometer part in that segment, one that uses less wattage than previous iterations, so that's cool beans. Both companies are testing something new on the public, and hopefully they'll continue taking steps toward new levels of technology and compete with one another again. Because even though neither actually cares about consumers as individuals, the competition between the rival tech giants is great for those individuals. Just remember that as a consumer this all comes down to personal preference, or what kind of money you're willing to spend for the performance value you're seeking. Keep your eyes open for sales and just get what suits you.

Here are a couple of videos for reference

Drink in these results from Hardware Unboxed and Gamers Nexus.

One comment

    Cody Hall says:

    Yeah, there’s clearly no point in going for RTX yet, maybe in a couple years when more games support it. And even then it would only be really helpful in things like competitive shooters and such, where that kind of accurate rendering process improves the game in any way
