
*UPDATED* AMD announces the Radeon VII - but it's $699 | Nvidia calls it "Lousy"

YoloSwag
1 minute ago, Blademaster91 said:

A single card that matches the performance Nvidia had with the previous 1080 Ti; AMD is really late with a competing card

AMD directly compared Radeon VII to a RTX 2080, not a 1080Ti. Nice attempt at controlling the narrative though. 

 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

Well, I'm really waiting to see the Sapphire Nitro version of this card blowing hot air in my desktop, and my GTX 1070 crying somewhere in the corner of the room.


15 minutes ago, DrMacintosh said:

AMD directly compared Radeon VII to a RTX 2080, not a 1080Ti. Nice attempt at controlling the narrative though. 

 

Not really any narrative, lol. The 1080 Ti and 2080 trade blows in most games. Yeah, I know the naysayers are going to say RT is useless, but take away the RT feature on the 2080 (since the Radeon lacks it) and AMD is competing with the 1080 Ti from like two years ago. IMO rather underwhelming for a $699 card with 16GB of VRAM that gamers don't need.


1 minute ago, Blademaster91 said:

and AMD is competing with the 1080 Ti from like two years ago

See right there, you did it again. Downplaying the performance....


17 minutes ago, DrMacintosh said:

AMD directly compared Radeon VII to a RTX 2080, not a 1080Ti. Nice attempt at controlling the narrative though. 

 

 

3 minutes ago, Blademaster91 said:

Not really any narrative, lol. The 1080 Ti and 2080 trade blows in most games. Yeah, I know the naysayers are going to say RT is useless, but take away the RT feature on the 2080 (since the Radeon lacks it) and AMD is competing with the 1080 Ti from like two years ago. IMO rather underwhelming for a $699 card with 16GB of VRAM that gamers don't need.

 

Just now, DrMacintosh said:

See right there, you did it again. Downplaying the performance....

Alright you two, let's compromise and say that Radeon VII is not what we were hoping for, but still isn't a bad GPU, even for its price point.

 

And like others have said, it is ONE HELL of a compute card for its price! :o 

Sorry for the mess!  My laptop just went ROG!

"THE ROGUE":  ASUS ROG Zephyrus G15 GA503QR (2021)

  • Ryzen 9 5900HS
  • RTX 3070 Laptop GPU (80W)
  • 24GB DDR4-3200 (8+16)
  • 2TB SK Hynix NVMe (boot) + 2TB Crucial P2 NVMe (games)
  • 90Wh battery + 200W power brick
  • 15.6" 1440p 165Hz IPS Pantone display
  • Logitech G603 mouse + Logitech G733 headset

"Hex": Dell G7 7588 (2018)

  • i7-8750H
  • GTX 1060 Max-Q
  • 16GB DDR4-2666
  • 1TB SK Hynix NVMe (boot) + 2TB Crucial MX500 SATA (games)
  • 56Wh battery + 180W power brick
  • 15.6" 1080p 60Hz IPS display
  • Corsair Harpoon Wireless mouse + Corsair HS70 headset

"Mishiimin": Apple iMac 5K 27" (2017)

  • i7-7700K
  • Radeon Pro 580 8GB (basically a desktop R9 390)
  • 16GB DDR4-2400
  • 2TB SSHD
  • 400W power supply (I think?)
  • 27" 5K 75Hz Retina display
  • Logitech G213 keyboard + Logitech G203 Prodigy mouse

Other tech: Apple iPhone 14 Pro Max 256GB in White, Sennheiser PXC 550-II, Razer Hammerhead earbuds, JBL Tune Flex earbuds, OontZ Angle 3 Ultra, Raspberry Pi 400, Logitech M510 mouse, Redragon S113 keyboard & mouse, Cherry MX Silent Red keyboard, Cooler Master Devastator II keyboard (not in use), Sennheiser HD4.40BT (not in use)

Retired tech: Apple iPhone XR 256GB in Product(RED), Apple iPhone SE 64GB in Space Grey (2016), iPod Nano 7th Gen in Product(RED), Logitech G533 headset, Logitech G930 headset, Apple AirPods Gen 2 and Gen 3

Trash bin (do not buy): Logitech G935 headset, Logitech G933 headset, Cooler Master Devastator II mouse, Razer Atheris mouse, Chinese off-brand earbuds, anything made by Skullcandy


1 minute ago, Techstorm970 said:

Alright you two, let's compromise and say that Radeon VII is not what we were hoping for, but still isn't a bad GPU, even for its price point.

I wasn't expecting anything, tbh; I play mostly World of Warships at 1080p @ 60fps. My RX 580 is overkill already xD 


8 minutes ago, Blademaster91 said:

IMO rather underwhelming for a $699 card with 16GB of VRAM that gamers don't need.

People who only play games don't really need 16GB of VRAM right now, but people who do other things with their PC do need it.

 

Not everyone has the money to build two machines (one for work and one for games). This (from what I've seen) looks like a very good all-rounder: good in games and good in compute.

 

Also, like @DrMacintosh said, they compared it to the 2080, so it does perform better than a 1080 Ti.
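To put a rough number on the "other things" point: here's a quick back-of-envelope VRAM calculation for a non-gaming workload. The frame counts and sizes are purely illustrative, not from any specific application.

```python
# Rough VRAM math for a compute workload, e.g. holding a batch of
# 4K frames in FP32 for video or ML processing. Illustrative numbers
# only; real applications add framework and driver overhead on top.

def tensor_gib(*shape, bytes_per_elem=4):
    """Size in GiB of a dense FP32 tensor with the given shape."""
    n = 1
    for dim in shape:
        n *= dim
    return n * bytes_per_elem / 1024**3

frame_gib = tensor_gib(3840, 2160, 4)   # one RGBA 4K frame, ~0.12 GiB
batch_gib = 64 * frame_gib              # 64 frames in flight, ~7.9 GiB

print(f"one frame: {frame_gib:.2f} GiB")
print(f"64 frames: {batch_gib:.1f} GiB")
# Already near the 8GB ceiling of most gaming cards before counting
# any intermediate buffers, so 16GB gives real headroom here.
```

So a fairly ordinary batch-processing job fills an 8GB card on its own, which is where the 16GB starts to matter.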

She/Her


3 minutes ago, Techstorm970 said:

And like others have said, it is ONE HELL of a compute card for its price!

 

1 minute ago, DrMacintosh said:

I wasn't expecting anything, tbh; I play mostly World of Warships at 1080p @ 60fps. My RX 580 is overkill already

 

As for me, I'm looking forward to their RX 480/580 replacement. Nvidia's future mid-range competitor looks like a bit of a mess based on leaked info and speculation.

 

Can we have it like before? Like when Nvidia launched the GTX 1060, which was better than a GTX 970 and traded blows with a GTX 980. I think that would be great to see again this year from both camps.

You can bark like a dog, but that won't make you a dog.

You can act like someone you're not, but that won't change who you are.

 

Finished Crysis without a discrete GPU, 15 FPS average, and a lot of heart

 

How I plan my builds -

Spoiler

For me I start with the "There's no way I'm not gonna spend $1,000 on a system."

Followed by the "Wow, I need to buy the OS for $100!?"

Then "Let's start with the 'best budget GPU' and 'best budget CPU' that actually fits what I think is my budget."

Realizing my budget is a lot less, I work my way to "I think these new games will run on a cheap ass CPU."

Then end with "The new parts launching next year are probably gonna be better and faster for the same price, so I'll just buy next year."

 


48 minutes ago, leadeater said:

Side note: why does anyone even care about older AMD reference coolers anyway? In the past, who was actually buying those cards? The only thing those reference coolers were used for was ammo to dump on AMD GPUs, even though they didn't represent what people actually buy.

From what I gather, the people who buy reference-cooler cards are usually:

  • People on a very tight budget.
  • OEMs.
  • People who don't care about the reference cooler itself; they want a cheap card for the reference PCB to use in custom watercooling (since most blocks are made for reference boards).

I never did quite understand the poo-pooing on a card for its reference cooler's poor performance... I mean, it's supposed to be like the heatsink that comes with a high-end CPU (or at least it used to be). As long as it works and isn't harming the part, then whatever. If it does happen to perform well, then cool.


21 minutes ago, Techstorm970 said:

 

 

Alright you two, let's compromise and say that Radeon VII is not what we were hoping for, but still isn't a bad GPU, even for its price point.

 

And like others have said, it is ONE HELL of a compute card for its price! :o 

That's absolutely correct. 


27 minutes ago, Blademaster91 said:

16GB of VRAM that gamers don't need.

Well, the majority of gamers probably don't need more than 4 cores either, but they keep asking for them. Maybe that amount of RAM will make the card more futureproof if they develop good drivers for it to keep up the performance (e.g., like the R9 Fury X or the RX 480 8GB).


15 minutes ago, Sypran said:

People who don't care about the reference cooler itself; they want a cheap card for the reference PCB to use in custom watercooling.

Looking forward to seeing whether the card will be compatible with the NZXT G12 bracket.


I'm curious if AMD will do anything with lower-tier SKUs, or anything at 60 CUs with different memory configurations like the 2060 cards. I think the R7 could make a bigger splash on the market if it had cheaper options with GDDR6 or GDDR5, or just less HBM2.


It should have been $599 or even $549 to make a big impact on the market :)

Corsair iCUE 4000X RGB

ASUS ROG STRIX B550-E GAMING

Ryzen 5900X

Corsair Hydro H150i Pro 360mm AIO

Ballistix 32GB (4x8GB) 3600MHz CL16 RGB

Samsung 980 PRO 1TB

Samsung 970 EVO 1TB

Gigabyte RTX 3060 Ti GAMING OC

Corsair RM850X

Predator XB273UGS QHD IPS 165 Hz

 

iPhone 13 Pro 128GB Graphite


2 hours ago, Humbug said:

I agree this GPU is a good technological achievement.

It has only 15% higher clocks than Vega 64, yet performance is 25-30% better.

That's a clear sign the architecture on the newer Vega II is much better, not just a die shrink.

It is mostly a compute card, and that's reflected in the changes to the chip: ML instructions and FP64 performance.

 

However, for the first time ever, AMD has apparently managed to exceed the 64-ROP barrier, doubling it to 128. You'd think that with double the units pushing pixels the performance increase would be bigger, especially since they've also removed the memory bottleneck by more than doubling memory bandwidth. You'd actually expect performance to be up 50% or more.

 

I'd say there must still be a bottleneck somewhere holding it back. Would love to see a deep dive on that.
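That bottleneck hunch can be sanity-checked with simple ratios. Here's a sketch using the deltas quoted above; the boost clocks and bandwidth figures are spec-sheet numbers I'm assuming, so treat them as approximations.

```python
# Vega 64 -> Radeon VII scaling ratios. Clock and bandwidth values are
# assumed spec-sheet figures; the 25-30% observed gain is the number
# quoted in the post above.

vega64     = {"boost_mhz": 1546, "rops": 64,  "bandwidth_gbs": 484}
radeon_vii = {"boost_mhz": 1800, "rops": 128, "bandwidth_gbs": 1024}

def uplift(key):
    """Fractional improvement of Radeon VII over Vega 64 for one spec."""
    return radeon_vii[key] / vega64[key] - 1

clock_uplift     = uplift("boost_mhz")      # roughly +16%
fillrate_uplift  = (radeon_vii["rops"] * radeon_vii["boost_mhz"]) / \
                   (vega64["rops"] * vega64["boost_mhz"]) - 1  # ~+133%
bandwidth_uplift = uplift("bandwidth_gbs")  # ~+112%
observed_uplift  = 0.275                    # midpoint of the quoted 25-30%

print(f"clock:     +{clock_uplift:.0%}")
print(f"fill rate: +{fillrate_uplift:.0%}")
print(f"bandwidth: +{bandwidth_uplift:.0%}")
print(f"observed:  +{observed_uplift:.0%}")
# The observed gain tracks the clock bump far more closely than the
# fill-rate or bandwidth gains, i.e. something other than ROPs or
# memory looks like the limiting factor.
```

The observed gain sitting so much closer to the clock uplift than to the doubled fill rate or bandwidth is exactly why a hidden bottleneck seems likely.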


34 minutes ago, DrMacintosh said:

See right there, you did it again. Downplaying the performance....

He's not, really. Gaming-wise this thing performs the same as a GTX 1080 Ti, a card that was released two years ago. That's just a fact. Compute, however, will probably be a completely different story; I expect this thing to be a monster in compute workloads at that price.


3 minutes ago, Nowak said:

lol @ everyone who was expecting a RTX 2080 killer for $250

That's Navi. Nobody was expecting that to be released until the second half of 2019.


Just now, C2HWarrior said:

That's Navi. Nobody was expecting that to be released until the second half of 2019.

Except AdoredTV, who insisted that Navi and Ryzen 3000 would be unveiled at CES this year.


29 minutes ago, Sypran said:

People on a very tight budget.

There were cheap AIB options that weren't big three-fan coolers and the like; reference buyers were typically those who couldn't wait for an AIB card, or whose system came with one. Though systems using AMD GPUs at all aren't that common, at least not with the better GPUs.


22 minutes ago, Settlerteo said:

Well, the majority of gamers probably don't need more than 4 cores either, but they keep asking for them. Maybe that amount of RAM will make the card more futureproof if they develop good drivers for it to keep up the performance (e.g., like the R9 Fury X or the RX 480 8GB).

A majority of gamers don't need the 16GB. As for futureproofing, who knows, unless these cards outperform the 2080 at 4K; and at that budget most would probably go for a 2080 Ti anyway.


6 minutes ago, Nowak said:

Except AdoredTV, who insisted that Navi and Ryzen 3000 would be unveiled at CES this year.

Being unveiled and being released are completely different things. Nobody expected Navi to be released until the second half of 2019. AMD could have shown a teaser for Navi if they wanted to; we know it's near completion. It seems, however, that they have a high number of defective MI60 cards they need to sell, so they released the Radeon VII. They're not gonna say, "Hey, buy these cards now, but in 7-9 months we're releasing a new, much better architecture." That's just bad business.


20 minutes ago, Trixanity said:

It is mostly a compute card, and that's reflected in the changes to the chip: ML instructions and FP64 performance.

 

However, for the first time ever, AMD has apparently managed to exceed the 64-ROP barrier, doubling it to 128. You'd think that with double the units pushing pixels the performance increase would be bigger, especially since they've also removed the memory bottleneck by more than doubling memory bandwidth. You'd actually expect performance to be up 50% or more.

 

I'd say there must still be a bottleneck somewhere holding it back. Would love to see a deep dive on that.

The other little detail is that significant chunks of Vega were busted at the silicon level. It's a brilliant mixed-load GPU, but there's a slew of functions they just didn't get working. RTG still has a bunch of development issues to work out, especially if the rumors of another Navi respin are true.


On 1/10/2019 at 2:34 AM, Nowak said:

Except AdoredTV, who insisted that Navi and Ryzen 3000 would be unveiled at CES this year.

Like or dislike Adored, but stop lying about what he said.

 

On 1/10/2019 at 2:46 AM, Blademaster91 said:

A majority of gamers don't need the 16GB. As for futureproofing, who knows, unless these cards outperform the 2080 at 4K; and at that budget most would probably go for a 2080 Ti anyway.

In hindsight, the "Nvidia supports FreeSync" move has a bit to do with the Radeon VII. The 7 is clearly a 4K-capable gaming card, and most of the 4K monitors you'd game on have FreeSync.

