Tom's Hardware Verdict
The Nvidia RTX 4060 Ti 16GB doubles down on VRAM but has the same underlying specs as its 8GB sibling. The extra memory helps in select cases, but most users would be better off opting for a GPU with a wider memory bus and 12GB instead of the 128-bit bus and 16GB, even if it costs more.
Pros
- Fine for 1080p gaming
- Good ray tracing and AI tech
- Double the VRAM, double the fun?

Cons
- Weak performance given the price
- 16GB only helps in select workloads
- Limited by the 128-bit bus
Some would say the Nvidia GeForce RTX 4060 Ti 16GB is the card that Nvidia and its partners don't want to see reviewed. No add-in board (AIB) partner would send us a card, and Nvidia didn't sample anyone... so we bought one, at retail, after searching for over a week to find one in stock. Is it one of the best graphics cards? You can probably already guess the answer to that question.
Based on the same Ada Lovelace architecture and with the same core specs as the RTX 4060 Ti Founders Edition, the sole difference is the use of two 2GB memory chips on each memory channel, doubling the capacity to 16GB. More memory should be good in certain workloads, though the 128-bit memory interface remains and will sometimes hold the GPU back.
More critically, tacking on $100 for the extra VRAM represents yet another cynical move from Nvidia. Yes, some people will be willing to pay the price, but Intel's Arc A770 comes in both 8GB and 16GB variants (albeit with a 256-bit interface), with about a $50 gap in pricing. Put bluntly, Nvidia charges as much as it feels it can get away with, and sometimes more.
There's no Founders Edition for the 4060 Ti 16GB, which makes the definition of a "reference" card somewhat nebulous. We figure anything available at the base $499 MSRP qualifies, and after looking at the options available at Newegg, Amazon, and elsewhere, we opted for the Gigabyte RTX 4060 Ti 16GB Gaming OC. Other MSRP models include the MSI Ventus 2X and the Zotac Amp (that's the Across the Spider-Verse bundle, in case that's a selling point for you).
Supposedly over 20 other 4060 Ti 16GB variants are available from other AIBs, but most are currently out of stock. We figured the triple fans on the Gigabyte card would provide a better overall cooling solution, so let's hit the speeds and feeds.
Graphics Card | Gigabyte RTX 4060 Ti 16GB | RTX 4060 Ti 16GB | RTX 4070 | RTX 4060 Ti | RTX 4060 | RTX 3060 Ti | RX 6800 XT | RX 6800 | RX 6750 XT |
---|---|---|---|---|---|---|---|---|---|
Architecture | AD106 | AD106 | AD104 | AD106 | AD107 | GA104 | Navi 21 | Navi 21 | Navi 22 |
Process Technology | TSMC 4N | TSMC 4N | TSMC 4N | TSMC 4N | TSMC 4N | Samsung 8N | TSMC N7 | TSMC N7 | TSMC N7 |
Transistors (Billion) | 22.9 | 22.9 | 32 | 22.9 | 18.9 | 17.4 | 26.8 | 26.8 | 17.2 |
Die size (mm^2) | 187.8 | 187.8 | 294.5 | 187.8 | 158.7 | 392.5 | 519 | 519 | 336 |
SMs / CUs / Xe-Cores | 34 | 34 | 46 | 34 | 24 | 38 | 72 | 60 | 40 |
GPU Cores (Shaders) | 4352 | 4352 | 5888 | 4352 | 3072 | 4864 | 4608 | 3840 | 2560 |
Tensor / AI Cores | 136 | 136 | 184 | 136 | 96 | 152 | N/A | N/A | N/A |
Ray Tracing "Cores" | 34 | 34 | 46 | 34 | 24 | 38 | 72 | 60 | 40 |
Boost Clock (MHz) | 2595 | 2535 | 2475 | 2535 | 2460 | 1665 | 2250 | 2105 | 2600 |
VRAM Speed (Gbps) | 18 | 18 | 21 | 18 | 17 | 14 | 16 | 16 | 18 |
VRAM (GB) | 16 | 16 | 12 | 8 | 8 | 8 | 16 | 16 | 12 |
VRAM Bus Width | 128 | 128 | 192 | 128 | 128 | 256 | 256 | 256 | 192 |
L2 / Infinity Cache (MB) | 32 | 32 | 36 | 32 | 24 | 4 | 128 | 128 | 96 |
ROPs | 48 | 48 | 64 | 48 | 48 | 80 | 128 | 96 | 64 |
TMUs | 136 | 136 | 184 | 136 | 96 | 152 | 288 | 240 | 160 |
TFLOPS FP32 (Boost) | 22.6 | 22.1 | 29.1 | 22.1 | 15.1 | 16.2 | 20.7 | 16.2 | 13.3 |
TFLOPS FP16 (FP8) | 181 (361) | 177 (353) | 233 (466) | 177 (353) | 121 (242) | 65 (130) | 41.4 | 32.4 | 26.6 |
Bandwidth (GBps) | 288 | 288 | 504 | 288 | 272 | 448 | 512 | 512 | 432 |
TDP (watts) | 160 | 160 | 200 | 160 | 115 | 200 | 300 | 250 | 250 |
Launch Date | Jul 2023 | Jul 2023 | Apr 2023 | May 2023 | Jul 2023 | Dec 2020 | Nov 2020 | Nov 2020 | May 2022 |
Launch MSRP | $499 | $499 | $599 | $399 | $299 | $399 | $649 | $579 | $549 |
Online Price | $500 | $500 | $590 | $374 | $300 | $335 | $520 | $440 | $350 |
The RTX 4060 Ti 16GB has the same specs as the 8GB variant, other than VRAM capacity. The Gigabyte model we're using for this review gets an extra 60 MHz for its boost clock, which in practice usually won't matter much — the 4060 Ti Founders Edition averaged just under 2.8 GHz across our test suite, while the Gigabyte card was closer to 2.75 GHz. Paper specs aren't everything, in other words.
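As a sanity check, the headline numbers in the table above fall out of two simple formulas; here's a quick sketch in Python (the formulas are the standard ones, and the card specs plugged in come straight from the table):

```python
# Back-of-the-envelope GPU spec math.
# FP32 TFLOPS = shaders * 2 FLOPs per clock * boost clock
# Bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8

def fp32_tflops(shaders, boost_mhz):
    """Peak FP32 throughput in TFLOPS (FMA counts as 2 FLOPs)."""
    return shaders * 2 * boost_mhz / 1e6

def bandwidth_gbps(mem_gbps, bus_bits):
    """Peak memory bandwidth in GB/s."""
    return mem_gbps * bus_bits / 8

# Gigabyte RTX 4060 Ti 16GB: 4352 shaders at 2595 MHz, 18 Gbps on 128 bits
print(round(fp32_tflops(4352, 2595), 1))  # 22.6 TFLOPS
print(round(bandwidth_gbps(18, 128)))     # 288 GB/s
```

Plug in the reference 2535 MHz boost clock and you get the 22.1 TFLOPS shown for the other 4060 Ti variants; the 128-bit bus is why bandwidth lands at 288 GB/s regardless of capacity.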
As you can imagine, there's quite a bit of healthy competition for the 4060 Ti 16GB. AMD's RX 6800 can now be picked up starting at $450, while the RX 6800 XT has frequently been on sale for $500 over the past couple of months — the cheapest price at the time of writing is $520. Previous generation RTX 3070 and RTX 3070 Ti cards also cost less than the 4060 Ti 16GB now — as they should, considering the overall performance. We'll also toss in some Intel Arc cards for the benchmarks, but we'll get to those in a few pages.
The bump in memory capacity will definitely help, but raw bandwidth remains a potential problem. If you're playing games that don't need or use more than 8GB of VRAM, we'd expect similar performance — with a bit of wiggle room since we're comparing a factory overclocked card to the reference models. 1440p, and especially 4K, could benefit from the extra VRAM, but Nvidia isn't marketing the RTX 4060 Ti as a 1440p or 4K gaming solution. That's probably thanks to its lack of compute and bandwidth, even though the RTX 3060 Ti and RTX 3070 both targeted 1440p.
Note also that the 16GB cards, in the same power envelope, may perform slightly worse than the 8GB models. We definitely saw that in some of our benchmarks. It's not clear precisely how much power the extra memory uses, but it's more than zero watts, and that could, in some cases, reduce the maximum boost clocks. Or perhaps it's just the Gigabyte card in particular, but the difference in favor of the 8GB Founders Edition was generally in the low single-digit percentage points and was basically within the margin of error.
Here's the block diagram for the RTX 4060 Ti, along with the full AD106 chip. Nothing changes at the GPU level for the RTX 4060 Ti 16GB: there's still one disabled NVDEC (Nvidia Decoder) block and two disabled SMs (Streaming Multiprocessors). Manufacturing is slightly more complex, however, as the 16GB card mounts GDDR6 chips on both sides of the PCB. That used to be relatively common, but in recent years double-sided memory has mostly been reserved for professional models or "prosumer" cards like the Titan series.
All the other Ada Lovelace architectural features are present, including the heavily marketed DLSS 3 Frame Generation. If you're willing to trade latency for a bit more visual smoothness, that's what it gives you, but the performance charts with DLSS 3 enabled can be rather misleading in our experience. A 50% or larger boost in frames via DLSS 3 doesn't feel 50% faster — we'd say more like 10–20 percent at best.
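To see why a frame generation boost doesn't feel proportionally faster, consider a toy model (the numbers here are illustrative assumptions, not measurements): frame generation roughly doubles presented frames, but input latency still tracks the underlying render rate, plus a small penalty for buffering a frame to interpolate between.

```python
# Toy model of frame generation (illustrative assumptions, not measurements).

def presented_fps(render_fps, framegen):
    """Frame generation inserts one generated frame per rendered frame."""
    return render_fps * 2 if framegen else render_fps

def approx_latency_ms(render_fps, framegen, queue_penalty_ms=8.0):
    """Latency follows the render rate; framegen adds a buffering penalty.
    The 8 ms penalty is a hypothetical placeholder value."""
    frame_time = 1000.0 / render_fps
    return frame_time + (queue_penalty_ms if framegen else 0.0)

# 60 FPS rendered: the counter says 120 FPS with frame generation on...
print(presented_fps(60, True))       # 120
# ...but responsiveness gets slightly worse, not twice as good.
print(approx_latency_ms(60, False))  # ~16.7 ms
print(approx_latency_ms(60, True))   # ~24.7 ms
```

The FPS counter doubles while the latency column moves the wrong way, which is why a 50% "faster" DLSS 3 chart can feel like a far smaller improvement in practice.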
Besides gaming, VRAM capacity can also be a factor in AI workloads. Many large language models (LLMs) benefit from lots of memory, and 8GB isn't enough for even "medium" sized models in many cases. I have to wonder if some of the RTX 4060 Ti 16GB scarcity at launch was from AI researchers and companies grabbing it for experimentation just because of its memory capacity. It still feels like a bit of a throwback to 2021, when GPUs simply sold out at launch, though at least now there are cards priced at MSRP.
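A rough rule of thumb shows why 8GB falls short for those "medium" models: the weights alone take parameters times bytes per weight, plus overhead for activations and KV cache. The sketch below assumes a flat 20% overhead factor, which is a simplification — real usage varies widely with context length and runtime.

```python
# Rough LLM VRAM estimate: weights = parameters * bytes per weight,
# with an assumed ~20% overhead for activations / KV cache (varies widely).

def model_vram_gb(params_billions, bytes_per_weight, overhead=1.2):
    """Very rough VRAM footprint in GB for LLM inference."""
    return params_billions * bytes_per_weight * overhead

# A 7B-parameter model in FP16 (2 bytes per weight):
print(round(model_vram_gb(7, 2), 1))  # 16.8 GB -- tight even on a 16GB card
# The same model quantized to 8-bit (1 byte per weight):
print(round(model_vram_gb(7, 1), 1))  # 8.4 GB -- fits in 16GB, not in 8GB
```

Even under generous assumptions, a quantized 7B model spills past 8GB once overhead is counted, which is exactly the gap the 16GB card closes.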
Let's go ahead and move on to the specifics of the Gigabyte RTX 4060 Ti Gaming OC.
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
JarredWaltonGPU I just want to clarify something here: The score is a result of both pricing as well as performance and features, plus I took a look (again) at our About Page and the scores breakdown. This is most definitely a "Meh" product right now. Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to "correct" my scale and thus this lands at the 2.5-star mark.
I do feel our descriptions of some of the other scores are way too close together. My previous reviews were based more on my past experience and an internal ranking that's perhaps not what the TH text would suggest. Here's how I'd break it down:
5 = Practically perfect
4.5 = Excellent
4 = Good
3.5 = Okay, possibly a bad price
3 = Okay but with serious caveats (pricing, performance, and/or other factors)
2.5 = Meh, niche use cases
...
The bottom four categories are still basically fine as described. Pretty much the TH text has everything from 3-star to 5-star as a "recommended," and that doesn't really jibe with me. 🤷♂️
This would have been great as a 3060 Ti replacement if it had 12GB and a 192-bit bus with a $399 price point. Then the 4060 Ti 8GB could have been a 3060 replacement with 8GB and a 128-bit bus at the $329 price point. And RTX 4060 would have been a 3050 replacement at $249.
Fundamentally, this is a clearly worse value and specs proposition than the RTX 4060 Ti 8GB and the RTX 4070. It's way too close to the former and not close enough to the latter to warrant the $499 price tag.
All of the RTX 40-series cards have generally been a case of "good in theory, priced too high." Everything from the 4080 down to the 4060 so far got a score of 3.5 stars from me. There's definitely wiggle room, and the text is more important than just that one final score. In retrospect, I still waffle on how the various parts actually rank.
Here's an alternate ranking, based on retrospect and the other parts that have come out:
4090: 4.5-star. It's an excellent halo part that gives you basically everything. Expensive, yes, but not really any worse than the previous gen 3090 / 3090 Ti, and it's actually justifiable.
4080: 3-star. It's fine on performance, but the generational price increase was just too much. 3080 Ti should have been a $999 (at most) part, and this should be $999 or less.
4070 Ti: 3-star. Basically the same story as the 4080: fine performance, priced way too high generationally.
4070: 3.5-star. Still higher price than I'd like, but the overall performance story is much better.
4060 Ti 16GB: 2.5-star. Clearly a problem child, and there's a reason it wasn't sampled by Nvidia or its partners. (The review would have been done a week ago but I had a scheduled vacation.) This is now on the "Jarred's adjusted ranking."
4060 Ti 8GB: 3-star. Okay, still a higher price than we'd like and the 128-bit interface is an issue.
4060: 3.5-star. This isn't an amazing GPU, but it's cheaper than the 3060 launch price and so mostly makes up for the 128-bit interface, 8GB VRAM, and 24MB L2. Generally better as an overall pick than many of the other 40-series GPUs.
AMD's RX 7000-series parts are a similar story. I think at the current prices, the 7900 XTX works as a $949 part and warrants the 4-star score. The 7900 XT has dropped to $759 and also warrants the 4-star score, maybe. The 7600 at $259 is still a 3.5-star part. So, like I said, there's wiggle room. I don't think any of the charts or text are fundamentally out of line, and a half-star adjustment is easily justifiable on almost any review I've done.
Lord_Moonub Jarred, thanks for this review. I do wonder if there is more silver lining on this card we might be missing though. Could it act as a good budget 4K card? What happens if users dial back settings slightly at 4K (e.g. no ray tracing, no bleeding edge ultra features) and then make the most of DLSS 3 and the extra 16GB VRAM? I wonder if users might get something visually close to a top line experience at a much lower price.
JarredWaltonGPU
Lord_Moonub said: "Could it act as a good budget 4K card? What happens if users dial back settings slightly at 4K and then make the most of DLSS 3 and the extra 16GB VRAM?"
If you do those things, the 4060 Ti 8GB will be just as fast. Basically, dialing back settings to make this run better means dialing back settings so that more than 8GB isn't needed.
Elusive Ruse Damn, @JarredWaltonGPU went hard! Appreciate the review and the clarification of your scoring system.
InvalidError More memory doesn't do you much good without the bandwidth to put it to use. The 4060 (Ti) needed 192 bits to strike the practically perfect balance between capacity and bandwidth. It would have brought the 4060 (Ti) launches from steaming garbage to at least being a consistent upgrade over the 3060 (Ti).
Greg7579 Jarred, I'm building with the 4090 but love reading your GPU reviews, even the ones that are far below what I would build with, because I learn something every time.
I am not a gamer but a GFX medium format photographer and have multiple TB of high-res 200MB raw files that I work with extensively in Lightroom and Photoshop. I build every 4 years and update as I go. I build the absolute top end of the PC arena, which is way overkill, but I do it anyway.
As you know, Lightroom has many amazing new AI masking and noise reduction features that are like magic, but so many users (photographers) are now grinding to a halt on their old rigs and laptops. Photographers tend to be behind the gamers on PC / laptop power. It is common knowledge on the photo and Adobe forums that these new AI capabilities eat VRAM like Skittles and extensively use the GPU for the grind. (Adobe LR & PS were always behind on using the GPU with the CPU for editing and export tasks, but now they're going at it with gusto.) When I run AI DeNoise on a big GFX 200MB file, my old rig with the 3080 (I'm building again soon with the 4090) takes about 12 seconds to grind out the task. Other rigs photographers use take several minutes or just crash. The Adobe and Lightroom forums are full of howling and gnashing of teeth about this. I tell them to start upgrading, but here is my question: I can't wait to see what the 4090 will do with these photography-related workflow tasks in LR.
Can you comment on this and tell me if this new Lightroom AI masking and DeNoise (which is a miracle for photographers) is indeed so VRAM intensive that doubling the VRAM on a card like this would really help a lot? Isn't it true that Nvidia made some decisions 3 years ago that resulted in not having enough (now far cheaper) VRAM in the 40 series? It should be double or triple what it is, right? Anything you can teach me about increased GPU power and VRAM in Adobe LR for us photographers?
hotaru251 The 4060 Ti should have been closer to a 4070.
The gap between them is huge and the cost is way too high (doubly so given that it needs DLSS 3 support to avoid being crippled by the limited bus).
atomicWAR JarredWaltonGPU said: "Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to 'correct' my scale and thus this lands at the 2.5-star mark."
Thank you for listening, Jarred. I was one of those claiming on multiple recent GPU reviews that your scores were about a half star off, though I wasn't alone in that sentiment. I was quick to defend you from trolls, though, as you clearly weren't shilling for Nvidia. This post proves my faith in you was well placed. Thank you for being a straight arrow!