
Nvidia and AIBs Refuse to Seed RTX 4060 Ti 16GB Samples to Reviewers | Snubbing Reviewers and Gamers

AlTech

Summary

 

Nvidia has refused to seed RTX 4060 Ti 16GB cards to reviewers ahead of, or even around, the product's launch.

 

Reviewers and AIBs approached about the subject have stated that Nvidia isn't seeding review samples for the 4060 Ti 16GB, likely to prevent reviewers from releasing critical reviews that undermine the card's pricing and its overall reason to exist.

 

Board partners have also confirmed they won't be seeding samples to reviewers either, which leaves consumers in the dark about this product's value proposition, or lack thereof.

 

Quotes

Quote

So the RTX 4060 Ti '16GB' is meant to be arriving this month, word on the street is the 18th. I can't confirm nor deny that, as I don't actually know, but Nvidia has already announced a July release.

Now there’s no official review program for this model, there will be no FE version and it seems that Nvidia and their partners really don’t want to know about it.

Every Nvidia partner I've spoken to so far has said they won't be providing review samples, and they're not even sure when their model will be available.

So I don’t know when you’ll be able to view our review, but I will be buying one as soon as I can.

 

Quote

None of the AIBs approached by HardwareUnboxed were willing to do so. Consequently, gamers who were expecting to read reviews on the day of release may encounter difficulties finding them, unless tech reviewers manage to obtain the card through retail channels before the embargo lifts (which has happened in the past).

 

My thoughts

Welcome to the future, where you don't know how good or bad a product is until some poor normie buys one and complains about how terrible the value proposition is.

 

The level of disrespect in refusing to seed samples to reviewers ultimately shows that both AIBs and Nvidia feel this launch is pointless, perhaps even a joke that got out of hand. And yet, at the same time, AMD has no new official cards at this price point right now; at least not until Gamescom, when their Navi 32 cards are expected to launch.

 

Until Gamescom, when AMD may bring Nvidia back to reality, gamers are stuck with Nvidia asleep at the controls of this market while a never-ending field of icebergs lies ahead; and no amount of shaking Nvidia awake to warn them about those icebergs seems to be working.

 

Sources

https://videocardz.com/newz/nvidia-not-seeding-geforce-rtx-4060-ti-16gb-for-reviews-aibs-hesitant-to-participate-as-well

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


The reviewers will still review it and slam it even harder then. 

Large channels will likely manage to put out a review on day one regardless, with a lot of work and some contacts at retailers.

 

I see it as a lose-lose situation for Nvidia.


It's a stupid move; they'll gain two days before the bombs fall, but no one is queuing up for that GPU, so they won't sell much during that scammy "respite".

And disrespecting consumers always ends up costing you eventually.

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


I mean, it's just going to be 4060 Ti performance with 16GB. There won't be any performance increase unless a game was running into a VRAM limit, and even then, with that memory bus it probably won't make THAT much of a difference.


1 minute ago, Shimejii said:

I mean, it's just going to be 4060 Ti performance with 16GB. There won't be any performance increase unless a game was running into a VRAM limit, and even then, with that memory bus it probably won't make THAT much of a difference.

Probably, yes... but for $100 more... and some partner models are rumoured to be even more expensive.

So if a 4060 Ti for $400 is already a bad deal, since you could have had that performance at that price for years, this will be even worse.


1 hour ago, AlTech said:

 

 

Until Gamescom, when AMD may bring Nvidia back to reality, gamers are stuck with Nvidia asleep at the controls of this market while a never-ending field of icebergs lies ahead; and no amount of shaking Nvidia awake to warn them about those icebergs seems to be working.

 

 

Tbh, AMD doesn’t appear willing to compete at 100% either. The RX 7600 was pretty much just a flick at Nvidia’s nose. 
 

Should probably look to Battlemage to hopefully put some fire back into the market. I don’t see anything else on the horizon otherwise. 

My eyes see the past…

My camera lens sees the present…


Adds another reason to not purchase an Nvidia GPU....

 

I'm very happy with my 7900 XT; when I was looking to upgrade, I wasn't prepared to let myself get shafted by Nvidia. Glad I picked a 7000 series card.

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15

 

 

 


20 minutes ago, Zodiark1593 said:

Tbh, AMD doesn’t appear willing to compete at 100% either. The RX 7600 was pretty much just a flick at Nvidia’s nose. 
 

Should probably look to Battlemage to hopefully put some fire back into the market. I don’t see anything else on the horizon otherwise. 

The Navi 32 cards are rumoured to be announced at Gamescom in about 6 weeks. At that point I would say there'll be some fire in the market.



So, if you're a big tech tuber who still hasn't figured it out: pretty much the best way is to say "frack the companies" and build relationships with retail, where stores really shouldn't give two cents about whether or not you completely crush the product (they provide the product, but it isn't their product), or to have a bit of a "poke, poke, wink, wink" relationship with people inside the companies who are more than happy to leak everything. Now is a good time to start building that.


No real point anyway; it's just a 4060 Ti at a worse price that won't choke on VRAM. A review would just confirm what we already know and can infer from existing review data.

 

Honestly, I think some reviewers should just release the same review video again with a few overdubbed sections and dunk on it hard for the price lol.


3 hours ago, AlTech said:

The Navi 32 cards are rumoured to be announced at Gamescom in about 6 weeks. At that point I would say there'll be some fire in the market.

Ehh, looks like this will be positioned above the 7600, so yet again, nothing for that mid-range $200 mark.



A 4060 Ti, 500 dollars, plays games like a 3060 Ti but doesn't choke on... 5 total games because of VRAM.
Cooooool

Nvidia are a bunch of cowards. Let us shit on it at launch, as is our right.

It should be a $420 card now that VRAM prices have dropped even further since the 4060 Ti came out.


8 hours ago, Zodiark1593 said:

Ehh, looks like this will be positioned above the 7600, so yet again, nothing for that mid-range $200 mark.

Below $200 GPUs are dead lol besides a few cheap A750 cards.

 

AMD isn't interested in making new GPUs below $200 again.

 

The used market, old stock of previous gen, and future APUs like Strix Halo exist for that.



People: we want more VRAM

nvidia: here's more VRAM

People: that sucks

 

I get it, it'd be the best GPU ever if it were much cheaper, but it isn't going to be anywhere near that. So it is a product for the few people who think they want more VRAM but can't afford a 4070, and/or think 12GB still isn't enough but can't afford higher-end models.

 

2 hours ago, AlTech said:

Below $200 GPUs are dead lol besides a few cheap A750 cards.

Probably deserves its own separate niche, but I'd like to see more in the sub-75W category, led by the 750 Ti, 1050 Ti, 1650, and Intel A380, and I think AMD had something too but no one cares because it was AMD.

 

A problem is there's a big gap between APUs and what could have been a $200 dGPU.

 

I have to wonder if people would be willing to spend more on an APU if it came with much better iGPU? AMD APUs as they exist are a kinda balanced value CPU + meh iGPU. I can't see iGPU performance increasing significantly unless they increase cost too. Premium APU where they can afford to put a big cache on it to help the iGPU around DDR bandwidth limits.

 

Edit: I thought I saw something before, and Strix Halo could be that "expensive APU", for laptops at least, and compete in the entry-level gaming laptop space. Not going to see that one in a socket, which is what I guess most on this forum would be more interested in.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


14 hours ago, AlTech said:

The Navi 32 cards are rumoured to be announced at Gamescom in about 6 weeks. At that point I would say there'll be some fire in the market.

AMD will probably just put the GPUs at 5~10% discount compared to the Nvidia performance equivalent, like they have been doing for quite some time at this point.

So it's probably going to just keep things as they are.


58 minutes ago, porina said:

People: we want more VRAM

nvidia: here's more VRAM

People: that sucks

The price is too high. The 4060Ti 16GB should have been $399 at most.

 

The 4060 Ti 8GB either should not have come out at all or should have been $299 at most.

58 minutes ago, porina said:

I get it, it'd be the best GPU ever if it were much cheaper, but it isn't going to be anywhere near that.

Because we can't have good GPU launches in 2023?

58 minutes ago, porina said:

So it is a product for the few people who need more VRAM but can't afford a 4070

*FTFY

 

This card being overpriced doesn't delegitimize their concerns that 8GB isn't enough for playing at resolutions above 1080p.

58 minutes ago, porina said:

and/or think 12GB still isn't enough but can't afford higher-end models.

Now you're just putting words in people's mouths.

It is Nvidia who chose to offer this GPU with 8GB and 16GB of VRAM. A single 12GB option instead of both would have been sufficient for most potential 4060 Ti buyers if it were $399.

58 minutes ago, porina said:

Probably deserves its own separate niche, but I'd like to see more in the sub-75W category, led by the 750 Ti, 1050 Ti, 1650, and Intel A380, and I think AMD had something too but no one cares because it was AMD.

AMD doesn't have anything there at the moment besides the RX 6400, nor do they intend to release anything new there.

58 minutes ago, porina said:

A problem is there's a big gap between APUs and what could have been a $200 dGPU.

The 7600 exists to stop people complaining that there are no new GPUs under $300.

 

If you want a GPU for $200 or less, AMD would say get a previous-gen card or get an APU. They're not interested in supplying sub-$200 GPUs anymore.

58 minutes ago, porina said:

I have to wonder if people would be willing to spend more on an APU if it came with much better iGPU?

They probably would, though the APU would need to make sense in AMD's Ryzen lineup.

 

I could see a 6C/12T R5 APU such as an 8600G using Strix Point at $250 to $300, and a $350 to $400 8C/16T R7 8700G APU to go along with it.

 

58 minutes ago, porina said:

AMD APUs as they exist are a kinda balanced value CPU + meh iGPU. I can't see iGPU performance increasing significantly unless they increase cost too.

Phoenix APUs exist in laptops, and AMD is working on Strix Point and Strix Halo. Yes, they'll cost more, but they'll also deliver more performance.

58 minutes ago, porina said:

Premium APU where they can afford to put a big cache on it to help the iGPU around DDR bandwidth limits.

I doubt there'll be a massive cache. Strix Point is meant to have a 128-bit DDR5 bus and Strix Halo is meant to have a 256-bit DDR5 bus.

58 minutes ago, porina said:

Edit: I thought I saw something before, and Strix Halo could be that "expensive APU", for laptops at least, and compete in the entry-level gaming laptop space. Not going to see that one in a socket, which is what I guess most on this forum would be more interested in.

It could come to desktop as an R9 8900G if AMD wanted it to. I don't know if they want to do that or not, though.



24 minutes ago, KaitouX said:

AMD will probably just put the GPUs at 5~10% discount compared to the Nvidia performance equivalent, like they have been doing for quite some time at this point.

So it's probably going to just keep things as they are.

7700 leaked price is $450 and 7800 leaked price is $550.

 

I wouldn't call that discounted Nvidia-equivalent performance, though.



1 hour ago, AlTech said:

7700 leaked price is $450 and 7800 leaked price is $550.

 

I wouldn't call that discounted Nvidia-equivalent performance, though.

7800 leaks imply roughly the same performance as a 4070 (in Time Spy), for $550, which is about 10% less than the 4070.

7700 leaks put it about 15% ahead of the 4060 Ti, for $450, which is 12.5% more than the 4060 Ti; or, put another way, about 20% behind the 4070 for 25% less.

Both fall pretty much exactly into the "Nvidia equivalent with 10% lower price" pattern they have been following. If next generation Nvidia prices the 5060 at $500, I almost guarantee that AMD will release their performance equivalent for $450.

At least on the CPU side, Intel keeps their prices somewhat stable, ignoring AMD's prices. Otherwise the Alder Lake and Raptor Lake "K" SKUs would have been released for $100 more.
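To make that relative-value arithmetic easy to check, here's a minimal sketch. The launch MSRPs ($399 for the RTX 4060 Ti 8GB, $599 for the RTX 4070) are public figures; the Navi 32 prices and performance deltas are the leaked numbers quoted above, and the 4070-vs-4060 Ti gap is an assumption consistent with the "~20% behind the 4070" figure, so treat the output as illustrative only.

```python
# Rough price/performance comparison using the leaked figures discussed above.
# MSRPs are public launch prices; RX 7700/7800 prices and relative performance
# are rumours, not confirmed specs.

cards = {
    # name: (price_usd, performance relative to the RTX 4060 Ti = 1.00)
    "RTX 4060 Ti 8GB": (399, 1.00),
    "RTX 4070":        (599, 1.40),  # assumption: ~40% ahead of the 4060 Ti
    "RX 7700 (leak)":  (450, 1.15),  # rumoured ~15% ahead of the 4060 Ti
    "RX 7800 (leak)":  (550, 1.40),  # rumoured roughly 4070-level (Time Spy)
}

base_price, base_perf = cards["RTX 4060 Ti 8GB"]

for name, (price, perf) in cards.items():
    price_delta = (price / base_price - 1) * 100
    value = (perf / base_perf) / (price / base_price)  # perf-per-dollar vs the 4060 Ti
    print(f"{name:16s} ${price}  {price_delta:+6.1f}% price  "
          f"{perf:.2f}x perf  {value:.2f}x perf/$")
```

By these numbers the leaked Navi 32 prices land almost exactly in that "Nvidia-equivalent performance for roughly 10% less money" pattern rather than meaningfully resetting value.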


1 hour ago, AlTech said:

Because we can't have good GPU launches in 2023?

It feels like none of these products would be seen as a problem if they were priced lower. I think there is more to it than corporate greed, and that the world has changed. While there is some wiggle room for pricing adjustments, I don't expect pricing to return to historic levels.

 

1 hour ago, AlTech said:

I doubt there'll be a massive cache. Strix Point is meant to have a 128-bit DDR5 bus and Strix Halo is meant to have a 256-bit DDR5 bus.

It could come to desktop as an R9 8900G if AMD wanted it to. I don't know if they want to do that or not, though.

They seem to be designed around LPDDR, which precludes the possibility of AM5 compatibility. If we assume they can also work with regular DDR5, two sticks gives the 128-bit interface. While the uplift in speed since DDR4 helps, it is still bottom-tier dGPU level. Halo's wider bus can help, but based on a previous thread on it, you're only getting around desktop 3050 performance levels. You can't put Halo on AM5 without crippling it, or doing something different from a 256-bit bus. The only alternatives I can see are either cache or on-substrate VRAM. This is why I'm thinking a more expensive APU could be really performant if you don't mind spending on it.
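To put numbers on the bandwidth point, here's a quick sketch; the memory speeds used (DDR5-5600 for the APU configurations, 14 Gbps GDDR6 on a 128-bit bus for the desktop RTX 3050) are assumed typical values rather than figures from this thread.

```python
# Peak theoretical memory bandwidth (GB/s) = bus width in bytes * transfer rate in GT/s.

def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000

configs = {
    "128-bit DDR5-5600 (two sticks / Strix Point class)": (128, 5600),
    "256-bit DDR5-5600 (rumoured Strix Halo class)":      (256, 5600),
    "128-bit 14 Gbps GDDR6 (desktop RTX 3050)":           (128, 14000),
}

for name, (width, rate) in configs.items():
    print(f"{name}: ~{bandwidth_gb_s(width, rate):.0f} GB/s")
```

That's roughly 90 GB/s vs 179 GB/s vs 224 GB/s, which is why a big cache or on-package memory is the usual workaround for a fast iGPU.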



2 hours ago, porina said:

People: we want more VRAM

nvidia: here's more VRAM

People: that sucks

What sucks is that there is an 8GB 4060 Ti in the first place, when 16GB could have been the standard and only option, and it wouldn't be priced as high in that situation. Nvidia would have had to compete with a sole 16GB variant on its performance; now they can upcharge for capacity rather than capability/performance, while still being able to point to the lesser card as offering a market-competitive option.

 

  • The RTX 4060 Ti has the same memory capacity as the RTX 3060 Ti/3070/3070 Ti with half the memory bus, so it uses twice-the-density/capacity memory modules, meaning lower cost per capacity and lower cost to implement in the die and on the graphics board. The memory configuration is significantly cheaper accounting for all factors.
  • The RTX 3060 Ti also has a GDDR6X variant, still on the same bus width, i.e. twice as wide as its generational replacement. This is more costly than the previously mentioned memory configurations.
  • AD106 is slightly more than half the die area of the GA106 it replaces. TSMC 4N could cost twice as much per unit area as Samsung 8LPU and the GPU package would still effectively cost the same.
  • TSMC 5N wafer cost is ~$17,000; 4N is a revision of 5N so roughly the same.
  • TSMC 7N wafer cost is ~$10,000; Samsung 8LPU is the competing node and supposedly a little cheaper, but information is sparse on that. Samsung 8LPU wafer cost would need to be ~$8,500 to even attempt to justify a cost increase, not factoring in the cost decrease from the memory configuration.
  • TSMC 3N wafer cost is ~$20,000.

 

Based on public information, pricing, and product specifications, I can only conclude that it is cheaper to make the RTX 4060 Ti 8GB than any RTX 3060 Ti, and the RTX 4060 Ti 16GB has decent potential to also cost (Nvidia) less.
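As a rough sanity check on that conclusion, here's a dies-per-wafer sketch using the wafer prices above. The die areas (~188 mm² for AD106, ~392 mm² for GA104 as used in the RTX 3060 Ti) are publicly reported figures, the standard die-per-wafer approximation ignores defect density and yield, and the wafer prices are themselves estimates, so this is ballpark only.

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Common approximation for gross dies per wafer, ignoring defects/yield."""
    radius = WAFER_DIAMETER_MM / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

# Die areas are publicly reported; wafer prices are the rough figures above.
dies = {
    "AD106 (RTX 4060 Ti, TSMC 4N/5N-class, ~$17,000/wafer)": (188, 17_000),
    "GA104 (RTX 3060 Ti, Samsung 8LPU, ~$8,500/wafer)":      (392, 8_500),
}

for name, (area_mm2, wafer_cost) in dies.items():
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per gross die")
```

Even at the higher wafer price, the much smaller AD106 works out to a similar or lower cost per gross die than GA104, and real-world yield favours the smaller die further, which is consistent with the conclusion above.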

 

2 hours ago, porina said:

it'd be the best GPU ever if it were much cheaper, but it isn't going to be anywhere near that.

Per the above, on any fair and objective analysis it could, should, and really must be much cheaper to get anywhere near being a good-value product for the consumer.

 

Either I have missed a major cost factor, or Nvidia's gross margin on the RTX 4060 Ti is higher than on the RTX 3060 Ti.


4 minutes ago, leadeater said:
  • TSMC 5N wafer cost is ~$17,000; 4N is a revision of 5N so roughly the same.

That is the assumption I made when looking at it previously, but 4N is an nvidia custom so in reality it probably costs a premium on top of standard N5. Also, most TSMC nodes are N_ but the nvidia one is 4N. I wondered if I made a typo when I saw that, but it is widely reported as such.

 

4 minutes ago, leadeater said:
  • TSMC 7N wafer cost is ~$10,000; Samsung 8LPU is the competing node and supposedly a little cheaper, but information is sparse on that. Samsung 8LPU wafer cost would need to be ~$8,500 to even attempt to justify a cost increase, not factoring in the cost decrease from the memory configuration.

I saw various figures for N7 but only going up to 9.3k. It was actually lower early on, but was impacted by price increases. Presumably in part as demand outstripped supply at the time.

 

I was also unable to find anything half solid on Samsung 8 costs, but I would expect it to be a fair bit lower than N7, given it is derived from 10nm generation so arguably a half-node or so behind N7.

 

4 minutes ago, leadeater said:

Based on public information, pricing, and product specifications, I can only conclude that it is cheaper to make the RTX 4060 Ti 8GB than any RTX 3060 Ti, and the RTX 4060 Ti 16GB has decent potential to also cost (Nvidia) less.

Today or historically? I see you mention the VRAM but did you have other significant factors in mind?



10 minutes ago, porina said:

It feels like none of these products would be seen as a problem if they were priced lower. I think there is more to it than corporate greed, and that the world has changed. While there is some wiggle room for pricing adjustments, I don't expect pricing to return to historic levels.

 

They seem to be designed around LPDDR, which precludes the possibility of AM5 compatibility.

They'll have DDR5 compatibility just like Phoenix and Dragon Range had DDR5 compatibility.

10 minutes ago, porina said:

If we assume they can also work with regular DDR5, two sticks gives the 128-bit interface. While the uplift in speed since DDR4 helps, it is still bottom-tier dGPU level.

Halo's wider bus can help, but based on a previous thread on it, you're only getting around desktop 3050 performance levels.

And effectively for almost nothing, if the APU costs $250-300. You're maybe paying $50 more compared to a hypothetical 8600. That's not a bad deal if you only need 3050 levels of performance.

10 minutes ago, porina said:

You can't put Halo on AM5 without crippling it, or doing something different from a 256-bit bus.

Why not?

10 minutes ago, porina said:

The only alternatives I can see are either cache or on-substrate VRAM. This is why I'm thinking a more expensive APU could be really performant if you don't mind spending on it.

Neither option is viable for AMD. On-substrate VRAM is too expensive and kind of misses the main benefit of using an APU: unified RAM.

 

Instead of people having 16GB or 32GB of RAM plus 8GB of VRAM, they can have 16GB or 32GB of shared RAM, meaning their effective VRAM is whatever the OS and games don't use as system RAM.

 

Cache uses too much die space and requires too much effort. AMD wants to design products that can be dropped into laptops, and if they want one on desktop it needs to be drop-in with minimal reconfiguration, just like Renoir and Cezanne on AM4 existed as the 4000G and 5000G series respectively.

 

1 minute ago, porina said:

That is the assumption I made when looking at it previously, but 4N is an nvidia custom

[Citation Needed]

1 minute ago, porina said:

so in reality it probably costs a premium on top of standard N5.

Also, most TSMC nodes are N_ but the nvidia one is 4N. I wondered if I made a typo when I saw that, but it is widely reported as such.

I believe it's a typo. Possibly spread by Nvidia to identify leakers.

1 minute ago, porina said:

I saw various figures for N7 but only going up to 9.3k. It was actually lower early on, but was impacted by price increases. Presumably in part as demand outstripped supply at the time.

 

I was also unable to find anything half solid on Samsung 8 costs, but I would expect it to be a fair bit lower than N7, given it is derived from 10nm generation so arguably a half-node or so behind N7.

Samsung nodes are also less technically competent for high-performance silicon. They don't scale well on clock speed and power; if you push the clock speeds, expect power draw to skyrocket.

 

They're designed for low-power silicon like smartphone SoCs, hence the suffixes Samsung gives them: LPE, LPP, etc.

 

1 minute ago, porina said:

Today or historically? I see you mention the VRAM but did you have other significant factors in mind?

VRAM is the main factor in the price increase between RTX 4060Ti 8GB and 16GB.

 

8GB of VRAM costs a lot less than $100, and Nvidia is marking up the VRAM increase.
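For a rough sense of scale: the 16GB card carries 8GB more GDDR6 than the 8GB card (four extra 16Gb modules, typically in a clamshell layout on the 128-bit bus). The $/GB figure below is an assumed mid-2023 spot-market ballpark rather than a quoted contract price, so treat this as a sketch, not a bill of materials.

```python
# Back-of-envelope cost of the extra VRAM on the RTX 4060 Ti 16GB.
# The GDDR6 $/GB value is an assumption for illustration only.

extra_vram_gb = 8                  # 16GB model vs 8GB model
assumed_gddr6_usd_per_gb = 3.5     # rough mid-2023 spot-market ballpark (assumption)
retail_price_gap_usd = 100         # $499 MSRP vs $399 MSRP

extra_memory_cost = extra_vram_gb * assumed_gddr6_usd_per_gb
print(f"Estimated extra memory cost: ${extra_memory_cost:.0f}")
print(f"Retail price gap:            ${retail_price_gap_usd}")
print(f"Implied markup on the VRAM upgrade: ~{retail_price_gap_usd / extra_memory_cost:.1f}x")
```

Even doubling the assumed memory price leaves a large gap between the added cost and the added MSRP.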

 

Overall, AD106 is expensive because Nvidia chose to make the Ada Lovelace generation expensive, the same way they've chosen to make every generation since Turing expensive: by designing expensive monolithic dies.

 

If Nvidia is charging more because AD106 is more expensive, then perhaps they should consider investing in chiplet technology instead of ramping up die sizes and spending tons of die space on features few care about, like ray tracing.



15 minutes ago, AlTech said:

I believe it's a typo. Possibly spread by Nvidia to identify leakers.

4N is not a typo; it's just called 4N, for whatever reason, by TSMC. It's weird and, yeah, specific to Nvidia, so I don't know what it really is.
Hell, I don't even know if it's based off of N4 or N5.


33 minutes ago, porina said:

That is the assumption I made when looking at it previously, but 4N is an nvidia custom so in reality it probably costs a premium on top of standard N5. Also, most TSMC nodes are N_ but the nvidia one is 4N. I wondered if I made a typo when I saw that, but it is widely reported as such.

N4 is still fundamentally a 5N-class node and I doubt Nvidia negotiated a bad price, so I'll say it's around the $17,000 mark. I cannot see it getting close to the $20,000 of an actually different node, i.e. 3N. At anything more than $18,000 I'd have strong doubts.

 

33 minutes ago, porina said:

I was also unable to find anything half solid on Samsung 8 costs, but I would expect it to be a fair bit lower than N7, given it is derived from 10nm generation so arguably a half-node or so behind N7.

 

Samsung 8LPU is not a derivative/refresh of 10nm; that would be Samsung 8LPP. Samsung 8LPU is a purpose-designed, market-competitive 7nm-class option with EUV support (TSMC N7 did not have/start with this), with a 60% power efficiency gain or 15% more performance. There are AMD GPU dies using TSMC N7 with lower MTr/mm² than Nvidia's Samsung 8LPU dies.

 

8LPP != 8LPU

8LPU ~ N7

(but not quite, but mostly with EUV bonus)

 


The more logic area a die has, the better it can do on that MTr/mm² metric; that's where Navi 10 falls down.
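To put numbers on the density comparison, MTr/mm² is just the published transistor count divided by the published die area; the figures below are the commonly reported ones, rounded, and are close enough for a rough comparison.

```python
# Transistor density (MTr/mm^2) = transistors in millions / die area in mm^2.
# Counts and areas are publicly reported figures, rounded.

dies = {
    # name: (transistors in millions, die area in mm^2)
    "Navi 10 (RX 5700 XT, TSMC N7)":  (10_300, 251),
    "GA104 (RTX 3070, Samsung 8LPU)": (17_400, 392),
    "GA102 (RTX 3090, Samsung 8LPU)": (28_300, 628),
}

for name, (mtr, area_mm2) in dies.items():
    print(f"{name}: {mtr / area_mm2:.1f} MTr/mm^2")
```

That lands the big Ampere dies on Samsung 8LPU slightly above Navi 10 on TSMC N7 in raw density, consistent with the point that a larger share of logic area flatters the MTr/mm² metric.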

 

33 minutes ago, porina said:

Today or historically? I see you mention the VRAM but did you have other significant factors in mind?

Historical pricing doesn't really matter. What it would cost today to make an RTX 4060 Ti should not be more than an RTX 3060 Ti. We can wiggle around a bit on costing and so on, but a full $100 USD increase on a product this low in cost is greater than any general allowance for that. Last I checked, a fab didn't get flooded or anything to cause such a price increase.


18 hours ago, AlTech said:

Welcome to the future, where you don't know how good or bad a product is until some poor normie buys one and complains about how terrible the value proposition is.

As @AlTech mentioned, trusted reviewers will buy one & tear it a new one.

We as consumers should then hold off on buying it for just a day or two, to give them time to fully run all their benchmarks.

When the PC is acting up haunted,

who ya gonna call?
"Monotone voice" : A local computer store.

*Terrible joke I know*

 

