
AMD employee confirms new GPU with HBM and 300W

Kowar

I, however, am not. Nvidia fanboy alert.

 

When will people learn that it's better to go by what you want in a graphics card, not by the manufacturer alone because you're too damn narrow-minded to consider other options.

It is not about the manufacturer.  It is about getting a better video card with more features.  Do you like to play games as they were intended to be played? If so, you won't be looking to AMD's next offering in video cards.  You will be looking at what you can afford from Nvidia.

Too many ****ing games!  Back log 4 life! :S


So the toaster jokes are still true, only this time it's now an oven.

 

Even if AMD GPUs are gonna be more powerful than equivalent Nvidia GPUs, I'm not touching them with a ten-foot pole unless they get their TDP and thermals under control.

60FPS Microwave

Intel Core i5-4670K | Galax GTX 970 EXOC | ASRock Z97E-ITX/ac | Team Elite 8GB 1600MHz | Gelid Black Edition | Samsung slowdown + WD Blue 1TB x2 | Cooler Master V550 | Corsair K65 + Logitech G100s | MasterCase Pro 3


It is not about the manufacturer.  It is about getting a better video card with more features.  Do you like to play games as they were intended to be played? If so, you won't be looking to AMD's next offering in video cards.  You will be looking at what you can afford from Nvidia.

You are right.

My life will never be full without playing a game that deeply supports PhysX, for example............ errm, wait, let me google it... sorry, I could not find any game that is based on physics, but let's say CoD Ghosts, without streaming it and seeing everything blowing up.

I remember when I had my 9600 GT, so much physics, man, and now with my HD 7770 whenever I play a game I am like "ohh, I miss PhysX", then it turns out the game does not support it anyway..

 

 

Nvidia has some features, but nothing game-changing.


... and no PhysX.  Reserved only for the company that makes video cards to play games the way they were meant to be played.  NVIDIA!

[attachment: nvidia.jpg]

 

[image: sheep.jpg]

 

How are games meant to be played with PhysX when games do not use it, or only use simple PhysX that the CPU (an AMD APU too) can run? How does Nvidia run Mantle? Uhm.

 


 

How does this person have over 5k posts in here?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


... and no PhysX.  Reserved only for the company that makes video cards to play games the way they were meant to be played.  NVIDIA!

 

Making PhysX exclusive to Nvidia cards actually pretty much killed its potential.

There is only a handful of titles that actually use the hardware-accelerated version (the Nvidia-exclusive part of PhysX)...

For the rest, it's just a CPU-run physics engine not much different from Havok.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


Dude, no. 4GB is fine even for 4K atm, and for 5K in 90% of games. Remember the ultrawide setup of the Autobahnhammer? That had (roughly) the same amount of pixels as a 5K display, and the only time it ran out of its 4GB buffer was when Shadow of Mordor was played with all those ultra textures and whatever ;) so be sure that 4GB is enough.

 

And from a developer's point of view now:

 

If the coders have any idea how the engine fetches stuff from RAM and VRAM, 4 gigabytes will be enough to get us through 4K comfortably :) If the optimisation is crap (like it seems to be lately), then yes, we will need crazy 8GB cards soon.

 

Shadow of Mordor ate my poor CrossFired 7970s; it cut my FPS in half if I tried to use ultra texture quality :( Damn 3GB cards. Obviously with just high texture quality it was fine, but having that one setting not maxed out made me a sad panda :(

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Making PhysX exclusive to Nvidia cards actually pretty much killed its potential.

There is only a handful of titles that actually use the hardware-accelerated version (the Nvidia-exclusive part of PhysX)...

For the rest, it's just a CPU-run physics engine not much different from Havok.

"A handful of titles" is a lie. There are many more games than a handful that use PhysX, and I think people are anxiously waiting to play the delayed The Witcher 3, and that will have PhysX.

Too many ****ing games!  Back log 4 life! :S


So the toaster jokes are still true, only this time it's now an oven.

 

Even if AMD GPUs are gonna be more powerful than equivalent Nvidia GPUs, I'm not touching them with a ten-foot pole unless they get their TDP and thermals under control.

 

You're right, that's why I'll never run anything more than my 9600 GT cards, because anything over a 95-watt TDP is regression, not progression!

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


[image: sheep.jpg]

 

How are games meant to be played with PhysX when games do not use it, or only use simple PhysX that the CPU (an AMD APU too) can run? How does Nvidia run Mantle? Uhm.

 


 

How does this person have over 5k posts in here?

Another false statement.  Many quality games use PhysX.  Are you planning to play The Witcher 3? As for the mention of Mantle... seriously? LOLOL

Too many ****ing games!  Back log 4 life! :S


So sad, my wolves in The Witcher 3 will have slightly less impressive fur. Whatever will I do with myself?

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


"A handful of titles" is a lie. There are many more games than a handful that use PhysX, and I think people are anxiously waiting to play the delayed The Witcher 3, and that will have PhysX.

 

 

56 titles.

Some of which are the ever-so-popular Crazy Machines, Gas Guzzlers, and Hot Dance Party series.

 

His point stands pretty strong: if it were an open thing that any GPU could leverage, the number of games using it would undeniably be far higher. I would call 56 titles a handful out of the mass of games that have come out in just the past 3-4 years, never mind the 10 years PhysX has existed. The first year PhysX existed, not a single game came out for it; only in 2005 did it get a massive title list of two. In 2006 it added a blazing 5 more titles! It keeps going, but the point is adoption hasn't exactly been what anyone would call super fast.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Another false statement.  Many quality games use PhysX.  Are you planning to play The Witcher 3? As for the mention of Mantle... seriously? LOLOL

 

What games will use a version of PhysX that cannot run on the CPU? Even Watch Dogs, a proprietary GameWorks title, ran better on AMD at launch with all detail activated. I only remember Borderlands 2 having some acid effect that was completely out of place.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I am not going to pull out all the videos that show a comparison of CPU vs. GPU in regards to this.  If you can't admit some top games look better with Nvidia, then get your AMD cards and be happy.  Also be happy cooking food on them; it looks like the new one should make for a really good oven.

Oh, and btw, for people saying fanboy and that sort of thing: my daughter was named after AMD, so whatever.

Too many ****ing games!  Back log 4 life! :S


So sad, my wolves in The Witcher 3 will have slightly less impressive fur. Whatever will I do with myself?

 

Play the game at more than 3 FPS? So far all open-world sandbox games using the black-boxed proprietary GameWorks have been severely delayed (e.g. Watch Dogs, Witcher 3). So far FurWorks can only be done on 2-3 wolves without rendering anything else. I doubt we will see anything impressive here.

 

56 titles.

Some of which are the ever-so-popular Crazy Machines, Gas Guzzlers, and Hot Dance Party series.

 

His point stands pretty strong: if it were an open thing that any GPU could leverage, the number of games using it would undeniably be far higher. I would call 56 titles a handful out of the mass of games that have come out in just the past 3-4 years, never mind the 10 years PhysX has existed. The first year PhysX existed, not a single game came out for it; only in 2005 did it get a massive title list of two. In 2006 it added a blazing 5 more titles! It keeps going, but the point is adoption hasn't exactly been what anyone would call super fast.

 

And how many of those can process PhysX on the CPU instead of the GPU? @ComradeHX is right: when Nvidia bought PhysX and made it closed, proprietary and exclusive to Nvidia, they really killed it off.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Another false statement.  Many quality games use PhysX.  Are you planning to play The Witcher 3? As for the mention of Mantle... seriously? LOLOL

 

The majority of the games that use it are... ah... garbage?

 

Let's list the decent ones out of the 56 it has, shall we?

1)Tom Clancy's Ghost Recon Advanced Warfighter

2) Shadowgrounds Survivor (was OK, I guess)

3) UT3

4) Mirror's Edge

5) Batman: Arkham Asylum

6) Metro 2033

7) Mafia 2

8) Alice: Madness Returns

9) Batman: Arkham City

10) Borderlands 2

11) Metro Last Light

12) The Bureau: XCOM Declassified

13) AC IV: Black Flag

14) CoD Ghosts (ah... I guess I'll count it)

15) Borderlands: The Pre-Sequel

 

So that's 15 decent (arguably) titles out of 56; a mind-boggling 27 percent of the games with it are decent.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Dude, no. 4GB is fine even for 4K atm, and for 5K in 90% of games. Remember the ultrawide setup of the Autobahnhammer? That had (roughly) the same amount of pixels as a 5K display, and the only time it ran out of its 4GB buffer was when Shadow of Mordor was played with all those ultra textures and whatever ;) so be sure that 4GB is enough.

 

And from a developer's point of view now:

 

If the coders have any idea how the engine fetches stuff from RAM and VRAM, 4 gigabytes will be enough to get us through 4K comfortably :) If the optimisation is crap (like it seems to be lately), then yes, we will need crazy 8GB cards soon.

 

My logic came from the assumption that as visuals improve in future games (looking more and more realistic), higher-res textures and more [complex] objects are implemented into each scene, which would require more VRAM. I agree with what you're saying though; the VRAM interface/architecture is getting faster and better optimized with each new generation, making the amount of VRAM available less of an issue (if the data transfer rates are fast enough and the game is optimised well enough to take advantage of that).

 

I think ultimately we'll have to wait and see exactly how things pan out. As 4K becomes more popular, 4GB of VRAM will be enough for the most part - just like 2GB of VRAM is enough at 1080p, for the most part. There will always be some games that demand that much more, and those who want the best of the best will have to shell out for the 8GB cards - just like how lots of people right now run 4GB cards and game at 1080p (myself included). ;)
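To put rough numbers on the VRAM question, here is a minimal back-of-the-envelope sketch (the 4-target G-buffer, 4 bytes per pixel, and double-buffered swap chain are illustrative assumptions, not any particular engine):

```python
# Rough estimate of render-target VRAM at various resolutions.
# Assumptions (illustrative only): a deferred-style renderer with 4
# G-buffer targets at 4 bytes/pixel, a 4-byte depth buffer, and a
# double-buffered swap chain. Texture assets are NOT included.

def render_target_mb(width, height, targets=4, bytes_per_pixel=4,
                     depth_bytes=4, swap_buffers=2):
    pixels = width * height
    gbuffer = targets * bytes_per_pixel * pixels
    depth = depth_bytes * pixels
    swap = swap_buffers * bytes_per_pixel * pixels
    return (gbuffer + depth + swap) / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160),
                     "5K":    (5120, 2880)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB in render targets")
```

Even at 5K the raw render targets only come to a few hundred megabytes; it is the texture pool (those "ultra" packs) that actually pushes past 4GB, which lines up with the Shadow of Mordor example above.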

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


Play the game at more than 3 FPS? So far all open-world sandbox games using the black-boxed proprietary GameWorks have been severely delayed (e.g. Watch Dogs, Witcher 3). So far FurWorks can only be done on 2-3 wolves without rendering anything else. I doubt we will see anything impressive here.

 

 

And how many of those can process PhysX on the CPU instead of the GPU? @ComradeHX is right: when Nvidia bought PhysX and made it closed, proprietary and exclusive to Nvidia, they really killed it off.

 

Most of the titles you can do on the CPU instead of the GPU; that said, most of the "decent" ones are all GPU-rendered.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


I am not going to pull out all the videos that show a comparison of CPU vs. GPU in regards to this.  If you can't admit some top games look better with Nvidia, then get your AMD cards and be happy.  Also be happy cooking food on them; it looks like the new one should make for a really good oven.

Oh, and btw, for people saying fanboy and that sort of thing: my daughter was named after AMD, so whatever.

Let's not get ahead of ourselves with this fanboy madness. It's always best to be open-minded; besides, mostly AMD reference cards could live up to being called "the oven".

Is your daughter called Duron? O.o 

Born to game, forced to work.  -_-


Let's not get ahead of ourselves with this fanboy madness. It's always best to be open-minded; besides, mostly AMD reference cards could live up to being called "the oven".

Is your daughter called Duron? O.o 

LOL, no, she is not called Duron.  Her initials make AMD, and it was intentional.

In regards to the 300 watts mentioned, I don't think anyone is happy about that, nor should they be.  An oven is an appropriate term to describe this next card from AMD.

Too many ****ing games!  Back log 4 life! :S


LOL, no, she is not called Duron.  Her initials make AMD, and it was intentional.

In regards to the 300 watts mentioned, I don't think anyone is happy about that, nor should they be.  An oven is an appropriate term to describe this next card from AMD.

Do you call a Ferrari an oven just because the motor gets hot? No. You call it an impressive car. Why the double standard?


 


LOL, no, she is not called Duron.  Her initials make AMD, and it was intentional.

In regards to the 300 watts mentioned, I don't think anyone is happy about that, nor should they be.  An oven is an appropriate term to describe this next card from AMD.

Jesus Christ, talk about extreme over-exaggeration, and then some.

Don't get it then, that's it. You don't have to bash it like that.


Why is everyone hating on the 300W?

I'm upgrading soon, and when I do...

Overkill is not enough

Nvidia is to Dr Dre Beets as AMD is to KFC.

One makes you broke, the other you can get more of and have a midnight snack from the fridge when hungry again. Once you go Nvidia, you go broked, turn into an Elitist, or get the incorrect amount of VRAM.


- WCCFTECH

 I was only 9 years old. I loved Fifflaren so much, I had all the NiP merchandise and matches pirated. I prayed to Fifflaren every night before bed. Thanking him for the life I have been given. Fifflaren is love I say. Fifflaren is life. My dad hears and calls me a fuckhead. I knew he was just jelly of my passion for Fifflaren. I called him a Sw@yer. He hits me and sends me to go to sleep. I'm crying now, and my face hurts. I lay in bed and it's really cold. A warmth is moving towards me. I feel someone touching me. I feel someone touching me. It's Fifflaren. I am so happy. He whispers in my ear; "this is my pyjama". He grabs me with his powerful Swedish hands and puts me on my hands and knees. I'm ready. I spread my ass cheeks for Fifflaren. He penetrates my butt-hole. It hurts so much but I do it for Fifflaren. I can feel my butt tearing as my eyes start to water. I push against his force. I want to please Fifflaren. He roars a viking roar as he fills my butt with his love. My dad walks in. Fifflaren looks straight into his eyes and says; "He is a ninja now". Fifflaren is love, Fifflaren is life 

LOL, no, she is not called Duron.  Her initials make AMD, and it was intentional.

In regards to the 300 watts mentioned, I don't think anyone is happy about that, nor should they be.  An oven is an appropriate term to describe this next card from AMD.

 

 

Right, so my 9600 GT comment was more of a joke, but at what point in time did you decide we were finally allowed to be happy with a TDP over 100 watts?

 

It's going to happen with new stuff making bigger stuff go faster. Get. Over. It.

 

If you're so damn concerned with TDP going up with new generations of cards, go back and run all your games on your 95-watt TDP 9600 GTs then. See how that goes for ya, brah.

 

EDIT: If the damn 390X comes out and has even a 5% performance gain over the 980, then you're goddamn right I'm going to be more than OK with the 300W TDP. Welcome to the high end of the GPU wars; we like sh*t that goes faster.
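For what it's worth, here is a minimal perf-per-watt sketch of that trade-off, using the 980's official 165W TDP and the speculative 5%-faster / 300W figures from the post above:

```python
# Hypothetical perf-per-watt comparison using the numbers quoted above:
# GTX 980 at its official 165 W TDP as the baseline, versus a speculative
# "390X" that is 5% faster but carries a 300 W TDP.

baseline_perf, baseline_tdp = 1.00, 165   # GTX 980 (relative perf, watts)
new_perf, new_tdp = 1.05, 300             # speculative 390X figures

ppw_980 = baseline_perf / baseline_tdp
ppw_390x = new_perf / new_tdp

print(f"GTX 980 : {ppw_980:.4f} perf/W")
print(f"'390X'  : {ppw_390x:.4f} perf/W")
print(f"The '390X' would deliver {ppw_390x / ppw_980:.0%} of the 980's efficiency")
```

At those numbers the faster card tops the charts while delivering roughly 58% of the 980's performance per watt, which is exactly the trade-off this thread is arguing about.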

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


"A handful of titles" is a lie. There are many more games than a handful that use PhysX, and I think people are anxiously waiting to play the delayed The Witcher 3, and that will have PhysX.

Well then, find me how many games released in the last year actually use hardware-accelerated PhysX?

 

 

 

 

12) The Bureau: XCOM Declassified

 

14) CoD Ghosts (ah... I guess I'll count it)

 

You really can't count those two.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


So the R9 380X would be a huge, glorified R9 290X on steroids? That was to be expected, honestly... any idea how many stream processors we are talking about here? Is it based on GCN architecture, or is it something new?

I guess it's a new AMD thing... they plan to reach the 300W mark with the 300-series cards and aim for 400W by the time the 400-series cards come out :D

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR

