RTX 2060 a good buy?

I am thinking of upgrading my Zwift PC GPU.
I currently have a 1060 3GB, which sometimes struggles at 4K Ultra. Mostly it’s fine, but it becomes jerky now and then, and that somehow bothers me… I believe the 3GB is just not enough.

I was planning on getting a 6GB RTX 2060 card from ASUS. Is that a good purchase?
My aim is 60fps with no slowdowns at all.
I checked Zwiftalizer, and the FPS looks good there.
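(In case anyone else wants to sanity-check a ride without uploading the log: Zwiftalizer essentially parses the FPS samples out of Zwift’s Log.txt. A rough Python sketch of the same idea, assuming each sample appears in the log as a line containing `FPS <number>`:)

```python
import re
import statistics
from pathlib import Path

# Rough FPS summary from a Zwift log, in the spirit of what Zwiftalizer reports.
# Assumption: FPS samples appear as lines containing "FPS <number>".
log_path = Path.home() / "Documents" / "Zwift" / "Logs" / "Log.txt"
fps_samples = [
    int(m.group(1))
    for line in log_path.read_text(errors="ignore").splitlines()
    if (m := re.search(r"\bFPS (\d+)", line))
]

if fps_samples:
    print(f"samples: {len(fps_samples)}")
    print(f"min/avg/max: {min(fps_samples)} / "
          f"{statistics.mean(fps_samples):.1f} / {max(fps_samples)}")
else:
    print("No FPS samples found - check the log path and format.")
```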

Thank you!

That’s overkill unless you are playing other games too. A 1660 is a good choice. Most FPS issues happen when it’s crowded, and you will see them no matter which graphics card you have.


That’s good feedback, thank you.
I looked at the 1660 and the price difference is not so large, so I thought a little more performance than needed would be good for future-proofing.

It is the coding of the game that is the issue; a 1660 would be fine, and even that 1060 should not be too bad. Is this in the starting areas, or when you get into larger crowds of people?


If you have an eye towards future-proofing, then a card with a minimum of 8GB of VRAM would be the way to go.

The 1060 3GB is easily strong enough; the bottleneck is the CPU, and it’s virtually impossible to avoid all stutter and frame rate drops in busy areas and group rides without spending a small fortune on something running at 5GHz. Talk of Zwift utilising 8GB of VRAM is wishful thinking at best, even in the distant future. It’s reasonable advice for other gaming/uses, of course.

If I were the OP and this were just for Zwift, I’d consider a CPU upgrade (though with Intel this is restrictive without a new motherboard) before a new GPU. I certainly wouldn’t splash out on a 2060; you’d be much better off spending that kind of money on an adaptive sync monitor instead.


Yes and no. Really, “future-proofing” in general does not work in the computer world: you spend more now to try to last 3-4 years, when you could just buy lower now, buy again in 2 years, and have newer tech that will be on par with or better than the expensive card. And the amount of memory is not the only requirement; you can buy low-end cards with 4GB of VRAM and they are horrible. Companies use the memory amount to fool people into thinking more is better, especially on lower-end and mid-range cards.

As noted, 8GB is overkill for Zwift, at least until Zwift gets better multicore optimization going (rather than preferring a fast clock rate) or offloads more work to the GPU.

Note that NVIDIA will be releasing their 3xxx series at the end of the year, so if you do not have to buy a new card now, try to hold off and see what comes around…

PS: I’ve had a 1660 6GB for well over a year and I’ve never seen Zwift use more than 4GB. So it’s not even using what’s commonly available now, let alone more.
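(For anyone who wants to check that on their own card, the simplest way is to poll nvidia-smi while Zwift is running. A minimal Python sketch, assuming an NVIDIA card with nvidia-smi on the PATH, i.e. the standard driver install:)

```python
import subprocess
import time

# Poll VRAM usage once a second while Zwift runs (Ctrl+C to stop).
# Assumes nvidia-smi is available (it ships with the NVIDIA driver).
peak_mb = 0
try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        used_mb = int(out.splitlines()[0])  # first GPU only
        peak_mb = max(peak_mb, used_mb)
        print(f"VRAM in use: {used_mb} MiB (peak so far: {peak_mb} MiB)")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak VRAM observed: {peak_mb} MiB")
```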

I would argue with you over future-proofing, since I am currently using a computer I built in 2009 that I purposely made to be useful for at least a decade. The graphics card failed last year, but it was only a 1GB, PCIe 2.0 card, so it did its job. When it comes to the CPU and PCIe 3.0, nothing has really changed since Haswell was released in 2013.

If this guy wants to use his computer for anything other than Zwift, as we enter the next-gen desktop releases later this year with PCIe 4 and AMD’s new chips, then he’ll want to have 8GB of VRAM on a video card (assuming he doesn’t want to wait and buy a whole new setup like you suggested when Nvidia releases Ampere). Even 8GB is cutting it close for high-end gaming, or stuff like video editing, or even having lots of background processes running or lots of tabs open in the browser. Some YouTube content creators who edit videos have 128GB of system RAM and definitely use Nvidia 2080 Tis for their graphics cards, which have 11GB of VRAM. I have read that 32GB of RAM is the standard now for the motherboard, and the new PCIe 4 graphics cards will be 10GB minimum for the most part.

That being said, if he just wants it to run Zwift for the next decade, then a 6GB card should be fine. If he is worried about buying a crappy graphics card, there are tons of reviews on various websites and YouTube channels done by very competent people like GamersNexus. I personally am really looking forward to building my next future-proof computer, but I am hoping it can wait until PCIe 5 comes out, not too long from now.


There are always exceptions to the rules, of course. For content creators and gamers, for example, future-proofing seldom works out; it’s why SLI and CrossFire died a slow death. They often need the latest and greatest purely for performance reasons. HardOCP, Anand and others are a go-to for me, but HardOCP being bought out and Kyle leaving stinks; no more reviews.

32GB could be the norm for higher-end users who do more than one or two things, but even today 16GB is overkill for 99% of the population, and most games do not effectively use 8GB of VRAM aside from a couple of AAA titles. That’s why NVIDIA is still releasing cards with 6GB in them, which outperform some 8GB AMD counterparts; VRAM is not everything. Now, if 8K gaming takes off… that VRAM is going to need to go up very, very quickly! But considering even a high-end 2080 Ti can struggle at 4K… it may be some time.
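(Rough back-of-the-envelope on the 8K point: the raw colour buffer alone quadruples going from 4K to 8K, and every other resolution-sized render target scales the same way, before you even count textures. A quick Python sketch of that arithmetic, assuming 4 bytes per pixel, i.e. RGBA8:)

```python
# Raw colour buffer size at common resolutions, assuming 4 bytes per pixel
# (RGBA8). Textures and other render targets come on top of this and
# usually dominate actual VRAM use.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}
for name, (w, h) in resolutions.items():
    mib = w * h * 4 / 2**20
    print(f"{name}: {mib:.0f} MiB per buffer")
# 4K is ~32 MiB per buffer, 8K is ~127 MiB, so anything sized to the
# screen (G-buffers, post-processing targets) costs roughly 4x more at 8K.
```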

My wife is using a 10-year-old Dell OptiPlex with an i5-2400; I put in an SSD and upped the RAM to 8GB, and she is golden, even now that she is working from home.

Now ask me to use that same system and I would cry tears all day long!

If you can make a system last that long, good on you. I wish I could, but I often find I am pushing my limits too far too often. My solution to that was to get a dedicated lab box, so now I can buy a less-than-high-end desktop. (Dual 10-core E5 v2 Xeons / 128GB of RAM and all the goodies, running a single-node vSAN cluster.)

I am with you, though, on wanting a new system. I want to see how Ampere does, and AMD’s Zen 3… so I’m saving some money for the end of the year, just in case, to move off this i5-8600 rig.


My work laptop has a Quadro 4000, which is not the best, but it still runs Zwift at 60fps in 4K resolution. I feel like the game isn’t too demanding compared to AAA PC titles.

I picked up a B-stock RTX 2060 for a good price and installed it this morning.
It certainly seems a lot more fluid…the old 1060 3GB was jerky at times.
I have paired the RTX 2060 with an i7-7700K, an SSD and 16GB of RAM (dual channel).
I’m happy with the result - thanks for your input.

I think the old 1060 GPU was actually adequate, but having only 3GB of VRAM created a bottleneck at 4K resolution. As someone said, Zwift seems to use a little over 4GB.
I would love a TV with G-Sync support for Zwift - I’m sure it would look amazing. Unfortunately there are only a few options on the market and everything is very expensive, like the LG OLEDs.


You don’t need G-Sync any more; FreeSync has worked with Nvidia (10xx+ cards, and DisplayPort only) for over a year now. It’s ace.

With TVs you’re still stuck though.

Yep, good point, FreeSync would be fine too. Many of the smaller TVs don’t support FreeSync; my 48" QLED does not have it, but my living room 65" does.
48" is about the limit for my pain cave.

BTW - NVIDIA GPUs only support FreeSync over DisplayPort, not over HDMI. So one would need a TV with FreeSync and DP, and I’m not aware that such a TV exists.

Interesting about the 3GB being a bottleneck, though. It wouldn’t amaze me if the game doesn’t correctly account for that card and assumes it has 6GB like the normal 1060, so at 4K you’re bouncing off the limit.


You’re stuck with monitors anyway, because the feature needs DP. I run a 27in 1440p 95Hz monitor and it smooths out all the frame rate dips and changes beautifully. :+1:

Exactly.
For my living room PC I use an AMD GPU to ensure I can use FreeSync.


It’s a shame that Zwift relies on OpenGL, because AMD graphics cards are great value and, like you say, adaptive sync is much easier to obtain. They’d be a perfect fit, but their OpenGL drivers are loads worse than Nvidia’s. :confused:
