AMD confirms 4GB limit for first HBM graphics card

Samfisher

I feel like GTA V is a bit of an aberration; The Witcher 3 doesn't use anywhere near as much VRAM.

I'm talking <2 GB at 1080p max settings, and that game is unquestionably better looking than GTA V

Interesting.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs

Interesting.

This is second hand of course; "joker slunt" said it in his "performance review" on YouTube.

Specs: 4790k | Asus Z-97 Pro Wifi | MX100 512GB SSD | NZXT H440 Plastidipped Black | Dark Rock 3 CPU Cooler | MSI 290x Lightning | EVGA 850 G2 | 3x Noctua Industrial NF-F12's

Bought a powermac G5, expect a mod log sometime in 2015

Corsair is overrated, and Anime is ruined by the people who watch it

Having a large (giant) all-silicon interposer for the GPU die and memory to sit on, all connected, was always going to cost money. How people could not see this coming baffles me. Also, AMD needs money, and a lot of it, before the end of 2018 and through 2019, when the $2.4 billion left of its debt comes due over just five quarters. AMD needs margins so it can survive a price war, but it has to start high enough to earn the margins it needs to live.

People saw that it would cost more money; it's just that they and I were expecting the trade-off of the HBM design to allow faster and more memory, something that isn't easily done with current methods. Otherwise it was a waste of time and a bad move.

High margins only work if it sells.

Spoiler

Corsair 400C- Intel i7 6700- Gigabyte Gaming 6- GTX 1080 Founders Ed. - Intel 530 120GB + 2xWD 1TB + Adata 610 256GB- 16GB 2400MHz G.Skill- Evga G2 650 PSU- Corsair H110- ASUS PB278Q- Dell u2412m- Logitech G710+ - Logitech g700 - Sennheiser PC350 SE/598se


Is it just me or is Grammar slowly becoming extinct on LTT? 

 

seems like the next iteration will be better  :)

This pretty much solidifies my decision to go G-Sync. A monitor rarely gets replaced at the same time as a motherboard and CPU, so with this sort of piggy-back/leap-frog upgrade cycle, it probably means I would never go AMD ever again. Toss in the Shield requirement, and it seems like there is never a point in going AMD unless it's for a second computer at this point.

 

Well played, NVIDIA. That 75/25 split will probably grow even more lopsided, which is a remarkable feat.

I honestly kinda hope there is some sort of VRAM issue like the 970's, to see if Nvidia's response is as childish as AMD's was.

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

Spoiler

sex hahaha

People saw that it would cost more money; it's just that they and I were expecting the trade-off of the HBM design to allow faster and more memory, something that isn't easily done with current methods. Otherwise it was a waste of time and a bad move.

High margins only work if it sells.

It will in time. Micron actually beat Hynix's HBM to market with HMC, a 3D-stacked form, back in 2013, but it never ended up in GPUs despite having density, bandwidth, power, and latency advantages over GDDR5.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

It will in time. Micron actually beat Hynix's HBM to market with HMC, a 3D-stacked form, back in 2013, but it never ended up in GPUs despite having density, bandwidth, power, and latency advantages over GDDR5.

I have little doubt that HBM is superior to standard methods and has a lot of potential for the future. Just presently, and of course this is just speculation, it probably won't show much benefit to game performance; but that remains to be seen.

Spoiler

Corsair 400C- Intel i7 6700- Gigabyte Gaming 6- GTX 1080 Founders Ed. - Intel 530 120GB + 2xWD 1TB + Adata 610 256GB- 16GB 2400MHz G.Skill- Evga G2 650 PSU- Corsair H110- ASUS PB278Q- Dell u2412m- Logitech G710+ - Logitech g700 - Sennheiser PC350 SE/598se


Is it just me or is Grammar slowly becoming extinct on LTT? 

 

6GB should be the new 4GB.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx

While it's understandable to be concerned about the total capacity, we can't forget that the frequency and bandwidth of this memory will make it stupidly fast and efficient at transferring data. Also, don't forget the other recent advancements/optimizations in data compression etc., which further reduce issues with VRAM usage and data transfers.

 

4GB seems a little low to me for a GPU of this class. No doubt there will be an 8GB version. Most users would probably be just fine with 4GB, especially if you're still on a single 1080p or 1440p monitor. However, those running 4K or surround displays would probably be wiser to wait and grab the 8GB version.

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 

Keen for the release and some benchmarks that's for sure!

I'm rather ignorant on this topic, although I understand the basic idea. However, my 2GB 960 (don't hurt me, I had my reasons) will play GTA V at settings using 2.5GB+ VRAM (by telling it to ignore the limits) with no problems whatsoever despite only having 2GB. Could this card do something similar? I'm probably way off, but am curious.

 

It'll just swap what's in memory more often, which could cause lag spikes. Many people seem to treat the VRAM limit as holy, but it really depends on the game. You might run into issues if you move really fast through the world, but as long as the card has enough time to load the textures it needs for the next area, it shouldn't be a problem.
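The swapping behavior described above can be sketched as a least-recently-used cache: when a texture request doesn't fit, the oldest resident textures get evicted, and each eviction is a potential stall. This is only a toy model, not how any real driver works, and all names and sizes below are made up for illustration.

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of a fixed-capacity VRAM texture cache (hypothetical)."""
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self.textures = OrderedDict()  # name -> size_mb, ordered by recency
        self.evictions = 0

    def request(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)  # hit: just mark as recently used
            return "hit"
        # Miss: evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.capacity_mb and self.textures:
            _, evicted_size = self.textures.popitem(last=False)
            self.used_mb -= evicted_size
            self.evictions += 1  # each eviction is a potential lag spike
        self.textures[name] = size_mb
        self.used_mb += size_mb
        return "miss"

cache = VramCache(capacity_mb=2048)      # a 2 GB card
cache.request("city_block_A", 1500)
cache.request("city_block_B", 1000)      # doesn't fit: A gets evicted
print(cache.evictions)                   # 1
```

The point of the sketch: moving quickly between areas forces more misses and evictions (possible stutter), while staying in one area lets the working set sit in the cache.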

"It's a taxi, it has a FARE METER."

I get near 3.5GB in GTA V at 1080p (triple monitor or single) with everything turned up, so yeah, 4K is probably going to be more than this can handle.

Turning off MSAA would help a ton.

Well, it's a first-gen technology; you can't blame them for that.

The stars died for you to be here today.

A locked bathroom in the right place can make all the difference in the world.

AMD did address concerns about the limited amount of VRAM, which I haven't seen posted in this thread yet. Basically, extracting more bandwidth from GDDR5 required adding ever more VRAM, so AMD never tried to optimize VRAM usage on past GPUs. To quote them:

 

"When I asked Macri about this issue, he expressed confidence in AMD's ability to work around this capacity constraint. In fact, he said that current GPUs aren't terribly efficient with their memory capacity simply because GDDR5's architecture required ever-larger memory capacities in order to extract more bandwidth. As a result, AMD "never bothered to put a single engineer on using frame buffer memory better," because memory capacities kept growing. Essentially, that capacity was free, while engineers were not. Macri classified the utilization of memory capacity in current Radeon operation as "exceedingly poor" and said the "amount of data that gets touched sitting in there is embarrassing."

Strong words, indeed.

With HBM, he said, "we threw a couple of engineers at that problem," which will be addressed solely via the operating system and Radeon driver software. "We're not asking anybody to change their games."

http://techreport.com/review/28294/amd-high-bandwidth-memory-explained/2

 

So it doesn't look like it will be that big of an issue, and the extra bandwidth will without a doubt help performance at high resolutions; there are also rumors that it outperforms the Titan X. From what I know, the 390X will likely be the card to get for 4K gaming if everything is true, and it will likely keep that spot until Nvidia's next generation of cards.
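The bandwidth-capacity coupling Macri describes comes down to simple arithmetic: a GDDR5 bus is built from 32-bit chips, so a wider (faster) bus forces more chips and therefore more capacity, while an HBM stack carries its own 1024-bit interface. The figures below are illustrative round numbers, not official specs.

```python
def gddr5_config(bus_width_bits, gbps_per_pin, chip_gbit):
    """GDDR5: the bus is assembled from 32-bit chips, so bandwidth and
    capacity are coupled; more bandwidth forces more chips."""
    chips = bus_width_bits // 32
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8
    capacity_gb = chips * chip_gbit / 8
    return chips, bandwidth_gbs, capacity_gb

def hbm_config(stacks, gbps_per_pin=1.0, stack_gb=1):
    """First-gen HBM: each stack brings its own 1024-bit interface,
    so bandwidth scales with stacks, not with total capacity."""
    bandwidth_gbs = stacks * 1024 * gbps_per_pin / 8
    return bandwidth_gbs, stacks * stack_gb

# 290X-style GDDR5: a 512-bit bus at 5 Gbps needs 16 chips (2 Gbit each)
print(gddr5_config(512, 5.0, 2))   # (16, 320.0, 4.0) -> 16 chips for 320 GB/s
# Fiji-style HBM: 4 stacks
print(hbm_config(4))               # (512.0, 4) -> 512 GB/s from just 4 GB
```

So with GDDR5, chasing bandwidth buys you capacity whether you want it or not, which is exactly why AMD says it never bothered optimizing capacity usage before.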

I don't know how I feel about this; some games just require more than 4GB at higher resolutions. I'm pretty sure running GTA V at 4K and max settings, as you'd expect a card like this to do, would go over the 4GB. Could it be due to the cost? I remember seeing somewhere it costs considerably more than GDDR5.

Actually, VRAM usage at 4K isn't too bad. I have 2x GTX 980s in two separate systems, both plugged into 4K displays (a 42-inch TV and a 4K monitor). The only thing that needs turning off as far as VRAM is concerned is anti-aliasing; I play with almost everything else at ultra/high, and most importantly, textures are at ultra. In my experience, the biggest hurdle at 4K for most games tends to be the power of the GPU core as opposed to the VRAM.

 

It's also worth noting that DirectX 12 will allow pooling of VRAM in CrossFire and SLI if the game developer supports it, so two cards could give you 8GB of VRAM.
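A quick back-of-envelope calculation shows why anti-aliasing is the first thing worth turning off at 4K: one uncompressed 32-bit render target at 3840x2160 is only about 32 MB, but 4x MSAA multiplies every multisampled buffer by four. This is deliberately rough; real engines keep many buffers and compress some of them.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, msaa=1):
    """Uncompressed size of one render target in MB (back-of-envelope only)."""
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

base = framebuffer_mb(3840, 2160)                # ~31.6 MB for one 4K buffer
with_msaa = framebuffer_mb(3840, 2160, msaa=4)   # 4x MSAA quadruples it
print(round(base, 1), round(with_msaa, 1))
```

Even a handful of 4K buffers is small next to multi-gigabyte texture sets, which fits the observation above that the GPU core, not capacity, is usually the 4K bottleneck once MSAA is off.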

Spoiler
  • Primary PC (Photo editing and gaming on a 4K monitor): Intel i7 4790k @4.5GHz @1.2v | 24GB Corsair Vengeance RAM @1600MHz | 1x Palit GTX 1080 JetStream | 480GB Crucial SSD | 2x WD Blue 500GB | Corsair RM1000 | Corsair 600t
  • Secondary PC (Gaming and watching films on a 42" 4K LG IPS TV @4k60): Intel i7 4790k | Corsair H60 | Asus H81l-plus ITX | 16GB Kingston HyperX Beast 1600MHz | 1x GTX 980 G1 Windforce | 1x Samsung 1TB 850 EVO SSD | Corsair CXM750W | Coolermaster Elite 130

 As a result, AMD "never bothered to put a single engineer on using frame buffer memory better," because memory capacities kept growing. Essentially, that capacity was free, while engineers were not. 

 

This gives us a glimpse of how things were being run at AMD; hopefully with Lisa at the helm this shit has changed.

This is bad, really bad. :( If the Fiji HBM card is going to be around $850 with only 4GB, many will turn away from it to the 980 Ti (probably 6GB). Unless there's an 8GB GDDR5 option, or an OEM makes an 8GB version... please no.

I fail to see your point. Why spend that money on more VRAM if you don't need it? Last gen, Nvidia had "only" 3GB in their gaming cards, and that didn't seem to bother anyone.

 

Or am I the only one who uses my high-end stuff for anything other than bragging?

CPU: i7-3960x @ 5GHz Motherboard: Rampage IV Gene ​RAM: 32GB HyperX Fury 1866 STORAGE:  WD Green 3TB + Crucial BX100 250GB + 16GB Primocache


GPU: 7990 + 7970 @ 1190c, 1515m Case: Phanteks Enthoo Mini XL PSU: AX1200i Monitor: Qnix QX2710 @ 110Hz


Cooling: D5 + X-RES 140CSQ, Supreme HF Cu, EK-FC7970/7990, 10xSP120 QE, RX240 + RX360

I fail to see your point. Why spend that money on more VRAM if you don't need it? Last gen, Nvidia had "only" 3GB in their gaming cards, and that didn't seem to bother anyone.

 

Or am I the only one who uses my high-end stuff for anything other than bragging?

https://youtu.be/Cv57qDXpEPU?t=3m52s

I fail to see your point. Why spend that money on more VRAM if you don't need it? Last gen, Nvidia had "only" 3GB in their gaming cards, and that didn't seem to bother anyone.

Or am I the only one who uses my high-end stuff for anything other than bragging?

For an $850 card, the audience is going to be running higher-resolution, more VRAM-demanding display setups, and they might opt for the 980 Ti over the R9 3xx because of this. That's not good for AMD.

- snip-
