RX 7600 and RTX 4060Ti specs and release dates leaked | 8GB and 16GB cards coming for both

AlTech

Right now, the 3070 8GB is losing to the 6800 XT 16GB (per Hardware Unboxed) purely because of VRAM, even in ray tracing. Just two years, and the tables have flipped completely for the two cards.

 

Current-gen consoles have 16GB of unified memory; that's why the latest games are exceeding the 8GB VRAM threshold even at 1080p.
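
As a rough back-of-envelope sketch of that point in Python (the OS reserve and CPU-side figures below are assumed round numbers for illustration, not published specs):

# Rough budget for a current-gen console's 16 GB unified memory pool.
# All figures below are illustrative assumptions, not official specifications.
unified_memory_gb = 16.0
os_reserve_gb = 2.5       # assumed amount held back for the system/OS
cpu_side_data_gb = 4.0    # assumed game logic, audio, streaming buffers, etc.

gpu_budget_gb = unified_memory_gb - os_reserve_gb - cpu_side_data_gb
print(f"Approximate memory left for GPU assets: {gpu_budget_gb:.1f} GB")
# ~9.5 GB, i.e. already more than the 8 GB many desktop cards ship with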

I think the bare minimum of VRAM for a new card should be whatever gets you 60 FPS in high-detail AAA games:

  • 8GB is low end: 1080p, under $200
  • 12GB is mid range: 1440p, under $400
  • 16GB is high end: 2160p, under $800
  • Halo products can cost as much as the manufacturer wants. I'm fine with a 48GB, $2,000 card.

If Nvidia wants to sell a low-end 1080p 8GB card, that's fine, but it has to be priced as a low-end card. The 4060 and 4060 Ti 8GB are low-end cards, not midrange. The 4060 Ti 16GB is still a low-end card due to its core configuration.


8 hours ago, 05032-Mendicant-Bias said:

Right now, the 3070 8GB is losing to the 6800 XT 16GB (per Hardware Unboxed) purely because of VRAM, even in ray tracing. Just two years, and the tables have flipped completely for the two cards.

Stop.
The 6800 XT is, and always was, FASTER than the 3070. Even if the 3070 had 16GB, the 6800 XT would consistently get higher frame rates.


On 5/26/2023 at 8:27 PM, leadeater said:

The marketing team at Nvidia will have no input at all into which GPU die and sub-configuration is used for a particular product. You are right in identifying that the silicon engineers design the GPU dies, and the product engineers take those designs and, together with the silicon engineers, plan out the various sub-variants of the dies to suit the products. There is no way the marketing team is going to have much involvement at this point in product design, since they have a different skillset. They can feed in what kind of product expectations to aim for at a high level, but they certainly aren't going to get into any discussions about GPU and memory configurations.

 

I seriously disagree that the Nvidia marketing team has any significant say over these areas. Their job is to market the product they have been given; while they will have some involvement in product design, it's the product design team that decides what is or is not an RTX 4060 Ti and how that product is to be created.

 

Presenting the marketing team with unnamed products that have no planned market placement simply isn't a realistic scenario for Nvidia, or for any large silicon company like Nvidia/Intel/AMD/Broadcom etc.

I'll concede this, as it's very likely that the way my company operates and the way a multi-billion-dollar company operates are very different, so my experience might not be applicable on this subject. I am also biased towards favoring the engineering team, given the impossible tasks we have to do in our lab and the fact that we are always the scapegoats when something goes wrong.

 

On 5/26/2023 at 8:27 PM, leadeater said:

That could well have happened, like I believe was the case for the RTX 4080 12GB, but I disagree with how it happens and I disagree that it's coming from the marketing team. But here's the thing: I can't ignore patterns and repetition. For a one-off like the RTX 4080 12GB, sure, something went wrong somewhere; an executive could have weighed in and pushed for the naming on the justification of "better sales" etc. But this is not a one-off, and not just for the RTX 40 series either.

 

And like all things, the executives get to carry all the credit, so they get to carry the criticism too. I'm sure the individual engineers know well what they are doing and probably weren't keen on some of the decisions of the past few years; however, I am not criticizing them when I say it's a design engineering failure, I am criticizing Nvidia the company. The Jim Kellers and the Raja Koduris get the credit for the products under their leadership even though it's large teams doing the work, and it's the people at that level I am commenting on, because that is most assuredly where the problems are coming from. Those are the people putting profits and product costs too far ahead of good product design.

 

I think you are reading it as though the team of people, the ones doing the work, did a bad job or are the source of the failure. That's not at all what I mean when I say Nvidia has made a product engineering failure. The RTX 4060 Ti was signed off by the product design team; if they seriously didn't agree with it, then they should have done a better job of objecting. The simple fact is it was approved and produced.

 

After the cringe of Computex, I am willing to just blame Jensen himself. Clearly the leather jacket was on too tight this past year.

 

On 5/28/2023 at 5:53 AM, WereCat said:

Looks like it does make a difference, since the card runs at PCIe 3.0 x8 on boards that only have PCIe 3.0 (equivalent bandwidth to PCIe 4.0 x4). It's not really a big difference in actual tasks/gaming, but it's enough to make the 4060 Ti and the 3060 Ti perform basically within margin of error of each other.

It's good to see, however, that it's not a limitation to worry about too much, if at all. It's just even more disappointing, as the performance difference between the cards was already almost non-existent, and at PCIe 3.0 there is none.

 

I think it's also worth noting that there is a trade-off between the 3060 Ti and 4060 Ti depending on what you do: the 3060 Ti seems to fare quite a bit better in productivity tasks, while the 4060 Ti has some extra features and uses less power. But the price difference of almost 100EUR between them (almost 200EUR for the 16GB) just makes no sense.


I saw this over the weekend, though I wouldn't put too much weight on the bandwidth limitation being a big issue for most games just yet. I am more interested in seeing whether this has any implications for future games, especially as we trend more towards console games being ported to PC and Resizable BAR/SAM being utilized more often in newer titles. Intel actually lists PCIe 3.0 x16 as a hard requirement for their Arc cards, and the A770 doesn't function correctly with it disabled.
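
For reference, here are the rough numbers behind the PCIe 3.0 x8 vs PCIe 4.0 x4 equivalence mentioned above; a quick Python sketch using nominal per-lane rates (real-world throughput is a bit lower due to protocol overhead):

# Approximate usable bandwidth per lane in GB/s (after 128b/130b encoding overhead)
PER_LANE_GBPS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s for a generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

print(f"PCIe 3.0 x8: {link_bandwidth('PCIe 3.0', 8):.1f} GB/s")   # ~7.9 GB/s (x8 card on a 3.0 board)
print(f"PCIe 4.0 x4: {link_bandwidth('PCIe 4.0', 4):.1f} GB/s")   # ~7.9 GB/s (same bandwidth)
print(f"PCIe 4.0 x8: {link_bandwidth('PCIe 4.0', 8):.1f} GB/s")   # ~15.8 GB/s (the card's full link)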

 

Has anyone done any PCIe bandwidth scaling tests with ReBAR/SAM-supported titles? If not, I may have to find time to test this myself, as I am genuinely curious.
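
If I do find the time, something like this hypothetical run matrix is roughly what I'd have in mind (the link configs and titles are placeholders, not a final list):

from itertools import product

# Hypothetical PCIe scaling test plan for ReBAR/SAM - placeholder values only
link_configs = ["PCIe 4.0 x8", "PCIe 3.0 x8", "PCIe 3.0 x4"]
rebar_states = ["ReBAR on", "ReBAR off"]
titles = ["Cyberpunk 2077", "Title B", "Title C"]  # example titles, to be decided

for link, rebar, title in product(link_configs, rebar_states, titles):
    # each combination would get the same benchmark pass, logging average and 1% low FPS
    print(f"{title:15} | {link:11} | {rebar}")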



4 minutes ago, MageTank said:

After the cringe of Computex, I am willing to just blame Jensen himself. Clearly the leather jacket was on too tight this past year.

LOL! I've only seen clips of it; I should watch it, since all the server stuff is applicable to me.

 

4 minutes ago, MageTank said:

I'll concede this, as it's very likely that the way my company operates and the way a multi-billion-dollar company operates are very different, so my experience might not be applicable on this subject. I am also biased towards favoring the engineering team, given the impossible tasks we have to do in our lab and the fact that we are always the scapegoats when something goes wrong.

Still, it's always good to hear how other places do things; even within huge companies, things can happen in similar ways in isolation.


9 minutes ago, MageTank said:

Has anyone done any PCIe bandwidth scaling tests with ReBAR/SAM supported titles? If not, I may have to find time to test this myself as I am genuinely curious.

I thought ReBAR was really a game-independent technology? I know the Nvidia drivers/GeForce Experience have a game whitelist for enabling it, and you can also force it on, but I think on AMD and Intel it's always on if the system supports it, so every game uses it?

 

ReBAR was a flash-in-the-pan thing; I can't remember if games actually need to be optimized for it, or if it's really just a driver thing, with the system managing how memory calls and sizes are created 🤔


8 minutes ago, leadeater said:

I thought ReBAR was really a game-independent technology? I know the Nvidia drivers/GeForce Experience have a game whitelist for enabling it, and you can also force it on, but I think on AMD and Intel it's always on if the system supports it, so every game uses it?

 

ReBAR was a flash-in-the-pan thing; I can't remember if games actually need to be optimized for it, or if it's really just a driver thing, with the system managing how memory calls and sizes are created 🤔

It's definitely game-independent: games don't need to be specifically optimized or coded to support it, but some games scale better than others with it enabled (and some scale negatively for some reason, such as Borderlands 3). I know Cyberpunk 2077 showed promising scaling with it enabled vs disabled. From what I've heard, most console-ported games show positive scaling, since ReBAR was a console feature before it was a PC feature and developers took advantage of it due to the limited hardware they had to work with. I've personally never done any in-depth testing with it beyond a few titles around its initial release.
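
One quick way to sanity-check whether ReBAR is actually in effect on an Nvidia card is to look at the BAR1 size reported by nvidia-smi -q; a small Python sketch (treating the legacy ~256 MiB BAR1 size as the "disabled" case is a heuristic, not an official rule):

import re
import subprocess

# "nvidia-smi -q" prints a "BAR1 Memory Usage" section with a Total size in MiB.
out = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout

# Look at the first Total line after the BAR1 section header (first GPU only).
bar1_section = out.split("BAR1 Memory Usage", 1)[-1]
match = re.search(r"Total\s*:\s*(\d+)\s*MiB", bar1_section)

if match:
    bar1_mib = int(match.group(1))
    # Heuristic: ~256 MiB is the legacy BAR window; a BAR1 close to the full
    # VRAM size indicates Resizable BAR is in effect.
    state = "likely enabled" if bar1_mib > 256 else "likely disabled"
    print(f"BAR1 size: {bar1_mib} MiB -> ReBAR {state}")
else:
    print("Could not find BAR1 info; is nvidia-smi available?")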

 

 

I am curious to see whether the reduced PCIe bandwidth on the RTX 4060 Ti impacts the performance of ReBAR. If it does, it might cause the card to age more like milk compared to the 3060 Ti.



1 hour ago, MageTank said:

It's definitely game-independent: games don't need to be specifically optimized or coded to support it, but some games scale better than others with it enabled (and some scale negatively for some reason, such as Borderlands 3). I know Cyberpunk 2077 showed promising scaling with it enabled vs disabled.

Yeah, as with DirectStorage, some of it works or is automatically switched to whatever is supported (there can be bugs or whatever); I'm not fully sure about ReBAR. It seems like Nvidia enables/disables it depending on game support, and it's likely disabled for older cards altogether.
Going by the support pages, you need a bit of platform support to get everything working correctly, like a lot of these features; some of it might already be in place:

Quote

a supported VBIOS, a compatible CPU, a compatible motherboard, a motherboard SBIOS update, and a driver that supports it (plus OS support/enablement)
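
For what it's worth, on a Linux box with pciutils you can at least see whether a device advertises the capability at all; a rough Python sketch (lspci usually needs root to show extended capabilities, and this only checks the device side, not the SBIOS/driver side):

import subprocess

# Scan verbose lspci output for the "Resizable BAR" extended capability.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

for block in out.split("\n\n"):
    if "Resizable BAR" in block:
        lines = block.splitlines()
        if lines:
            print("Resizable BAR capability advertised by:", lines[0])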

