AMD R9 290X has an UBER Mode - Benchmarks inside

It's not a word I have used, but it does look like it's going to be called UBER mode... a few sources have used this now...

I really doubt it. How many sources used "Xbox 720", which doesn't even make sense?

 

And what happens a few years from now? Not so uber then; the Uber mode becomes "was uber at the time"?

“Snorting instant coffee is the best,” said Kayla Johns, 19, of Portland.

1080p? Everything at max, 60+ Hz, for sure. A 680 can do most games at 1600p at about 60 Hz, only limited by VRAM.

Depends on your definition of 'can do most games'. I like 60 fps and over with max or at least very high details, and preferably some form of AA.

Case: Corsair 4000D Airflow; Motherboard: MSI Z490 Gaming Edge; CPU: i7 10700K @ 5.1GHz; Cooler: Noctua NH-D15S Chromax; RAM: Corsair LPX DDR4 32GB 3200MHz; Graphics Card: Asus RTX 3080 TUF; Power: EVGA SuperNova 750 G2; Storage: 2 x Seagate Barracuda 1TB; Crucial M500 240GB & MX100 512GB; Keyboard: Logitech G710+; Mouse: Logitech G502; Headphones / Amp: HiFiMan Sundara / Mayflower Objective 2; Monitor: Asus VG27AQ

I really doubt it. How many sources used "Xbox 720", which doesn't even make sense?

 

And what happens a few years from now? Not so uber then; the Uber mode becomes "was uber at the time"?

Google "R9 290X uber mode" and get back to me...

Lian Li PC-V359WRX Micro-ATX Case | Intel 5960X Extreme 3.00GHz | ASRock Fatal1ty X99M KILLER | Crucial 32 GB 2666 DDR4 | Thermaltake NiC C5 | EVGA Supernova 1200W P2 | 2x 240GB OCZ Radeon R7 | 2x 256 GB Samsung 840 Series Pro | 2 X 120GB Samsung 840 EVO | 6x NF-F12’s | Place Holder GPU R9 290X |

Links: Current 5960X | Old FX9590

have fun with games stuttering

That's very ignorant of you to say.

 

Currently DX10/11 games are completely fixed (at least my 7870 x2 setup works fine); the only problems I've heard of are with DX9 games like Skyrim.

 

Do you even own an AMD card? I own both sides, and I love both sides for what they do, but unless you have had personal experience with the problems at hand (on both sides, mind you), I wouldn't go about spouting nonsense.

Google "R9 290X uber mode" and get back to me...

I'm not saying you're wrong. I'm saying I doubt it and think it's a stupid name regardless. And until we get proper confirmation from AMD, my doubt is valid.

“Snorting instant coffee is the best,” said Kayla Johns, 19, of Portland.

LOL, look at Sleeping Dogs, the 290X destroys the 780.

 

That's just a VRAM bottleneck... talk about fair play when you take the 780, which was never designed with 4K in mind (remember that GK110 was prepared for the 2012 launch of the Kepler lineup; it was supposed to be the 680 a year ago, so this GPU is over a year old), and compare it to a new architecture.

][ CPU: Phenom II x6 1045t @3,7GHz ][ GPU: GTX 660 2GB ][ Motherboard: Gigabyte GA-MA770T-UD3P ][ RAM: 8GB @1450Mhz CL9 DDR3 ][ PSU: Chieftec 500AB A ][ Case: SilentiumPC Regnum L50 ][ CPU Cooler: CoolerMaster Hyper 212 Evo & Arctic MX4 ][
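To put some rough numbers on that VRAM point: here is a back-of-envelope sketch in Python. Every figure in it is an illustrative assumption of mine (MSAA level, G-buffer count, asset budget), not anything measured from the benchmarks above.

```python
# Back-of-envelope VRAM sketch for 4K with MSAA.
# Every number here is an illustrative assumption, not a measured figure.

WIDTH, HEIGHT = 3840, 2160      # 4K render resolution
COLOR_BYTES = 4                 # 32-bit RGBA colour target
DEPTH_BYTES = 4                 # 32-bit depth/stencil
MSAA = 4                        # assumed 4x multisampling
EXTRA_TARGETS = 4               # assumed deferred-shading G-buffer targets

pixels = WIDTH * HEIGHT
render_targets = (
    pixels * COLOR_BYTES * MSAA      # multisampled colour buffer
    + pixels * DEPTH_BYTES * MSAA    # multisampled depth buffer
    + pixels * COLOR_BYTES           # resolved back buffer
    + pixels * COLOR_BYTES * EXTRA_TARGETS
)
rt_mb = render_targets / 1024**2
print(f"Render targets alone: ~{rt_mb:.0f} MB")      # ~410 MB

# Textures, geometry and driver overhead dwarf that; assume ~2.5 GB
# for a 2013 AAA title at max settings.
total_gb = (rt_mb + 2500) / 1024
print(f"Estimated total: ~{total_gb:.1f} GB -> tight on a 3 GB card, fine on 4 GB")
```

On those made-up numbers, a few hundred MB of render targets on top of a multi-gigabyte asset budget is enough to push a 3 GB card into swapping over PCIe at 4K, which is exactly the kind of thing that tanks minimum frame rates.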

That's just a VRAM bottleneck... talk about fair play when you take the 780, which was never designed with 4K in mind (remember that GK110 was prepared for the 2012 launch of the Kepler lineup; it was supposed to be the 680 a year ago, so this GPU is over a year old), and compare it to a new architecture.

TBH, should we really consider architectures when comparing cards? We're consumers; if the cards are in the same price range and are designed for the same use, they are comparable.

“Snorting instant coffee is the best,” said Kayla Johns, 19, of Portland.

TBH, should we really consider architectures when comparing cards? We're consumers; if the cards are in the same price range and are designed for the same use, they are comparable.

 

Well, saying that 'Sleeping Dogs at 4K shows an amazing lead for the AMD R9 290X' when we know it's a VRAM bottleneck (not enough VRAM) is not fair. There is no GPU that can drive games at 4K without CrossFire, and the R9 290X is no different. Why push 4K when nobody can afford it and it's just unneeded? Meh.

 

I bet that the first GPU to actually RUN 4K without problems will be NVIDIA's Volta, with stacked DRAM for 1 TB/s of bandwidth.

][ CPU: Phenom II x6 1045t @3,7GHz ][ GPU: GTX 660 2GB ][ Motherboard: Gigabyte GA-MA770T-UD3P ][ RAM: 8GB @1450Mhz CL9 DDR3 ][ PSU: Chieftec 500AB A ][ Case: SilentiumPC Regnum L50 ][ CPU Cooler: CoolerMaster Hyper 212 Evo & Arctic MX4 ][
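For what it's worth, the bandwidth side of that bet can be sketched with some crude arithmetic. The bytes-touched-per-pixel figure below is purely my own guess; the 780 and 290X bandwidth numbers are the published specs as far as I know.

```python
# Crude memory-bandwidth estimate for 4K at 60 fps.
# BYTES_PER_PIXEL_TOUCHED is a pure assumption covering overdraw,
# G-buffer passes, post-processing and texture traffic per pixel per frame.

WIDTH, HEIGHT, FPS = 3840, 2160, 60
BYTES_PER_PIXEL_TOUCHED = 400

needed_gbs = WIDTH * HEIGHT * BYTES_PER_PIXEL_TOUCHED * FPS / 1e9
print(f"Rough pixel traffic at 4K60: ~{needed_gbs:.0f} GB/s")   # ~200 GB/s

cards = {
    "GTX 780 (384-bit GDDR5)":          288,    # GB/s, published spec
    "R9 290X (512-bit GDDR5)":          320,    # GB/s, published spec
    "Hypothetical 1 TB/s stacked DRAM": 1000,
}
for name, bw in cards.items():
    print(f"{name}: {bw} GB/s -> headroom x{bw / needed_gbs:.1f}")
```

On that crude model, today's flagships only have about 1.5x headroom over the raw pixel traffic at 4K60, which is why a stacked-DRAM part with several times the bandwidth sounds like the first comfortable single-GPU 4K card.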

This is not the original source; Chinese sites just love to remove watermarks from stolen content.

Well, saying that 'Sleeping Dogs at 4K shows an amazing lead for the AMD R9 290X' when we know it's a VRAM bottleneck (not enough VRAM) is not fair. There is no GPU that can drive games at 4K without CrossFire, and the R9 290X is no different. Why push 4K when nobody can afford it and it's just unneeded? Meh.

 

I bet that the first GPU to actually RUN 4K without problems will be NVIDIA's Volta, with stacked DRAM for 1 TB/s of bandwidth.

I just think you're looking at it the wrong way. If you're a gamer with a $650 budget for a GPU, then as far as performance for your money goes, AMD is simply winning. Consumers don't care about architecture limits and VRAM bottlenecks just because it's "fair".

 

But the 780 Ti may change a few things.

“Snorting instant coffee is the best,” said Kayla Johns, 19, of Portland.

I just think you're looking at it the wrong way. If you're a gamer with a $650 budget for a GPU, then as far as performance for your money goes, AMD is simply winning. Consumers don't care about architecture limits and VRAM bottlenecks just because it's "fair".

 

But the 780 Ti may change a few things.

 

No, AMD is not winning if you have an AMD CPU, because NVIDIA's drivers are SOOO much better (multithreaded vs. single-threaded) on slower but multi-core AMD CPUs.

 

With an AMD CPU + NVIDIA GPU, even if the NVIDIA GPU is a bit slower in the same price bracket, you will get higher frames per second in the end.

][ CPU: Phenom II x6 1045t @3,7GHz ][ GPU: GTX 660 2GB ][ Motherboard: Gigabyte GA-MA770T-UD3P ][ RAM: 8GB @1450Mhz CL9 DDR3 ][ PSU: Chieftec 500AB A ][ Case: SilentiumPC Regnum L50 ][ CPU Cooler: CoolerMaster Hyper 212 Evo & Arctic MX4 ][
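The shape of that claim is easier to see with a toy frame-time model. Every timing below is invented purely for illustration; it is not a measurement of either vendor's driver.

```python
# Toy model: a driver that spreads its submission work across several threads
# helps a slow-per-core, many-core CPU more than a fast-per-core one,
# because the fast CPU is already close to GPU-bound.
# All timings are invented for illustration.

def fps(game_ms, driver_ms, gpu_ms, per_core_speed, driver_threads):
    # Game logic stays on one thread; driver work divides across driver_threads.
    cpu_ms = game_ms / per_core_speed + (driver_ms / per_core_speed) / driver_threads
    return 1000.0 / max(cpu_ms, gpu_ms)   # the slower side sets the frame rate

cpus = {"fast per-core (Intel-like)": 1.0, "slow per-core, many cores (FX-like)": 0.6}
drivers = {"single-threaded driver": 1, "multithreaded driver": 4}

for cpu_name, speed in cpus.items():
    for drv_name, threads in drivers.items():
        rate = fps(game_ms=8.0, driver_ms=8.0, gpu_ms=14.0,
                   per_core_speed=speed, driver_threads=threads)
        print(f"{cpu_name:38s} + {drv_name:24s}: {rate:5.1f} FPS")
```

In this made-up scenario the multithreaded driver buys the fast CPU about 14% but lifts the slow CPU from the high 30s to 60 FPS, which is the shape of the argument being made here; whether real drivers behave like that is a separate question.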

No, AMD is not winning if you have an AMD CPU, because NVIDIA's drivers are SOOO much better (multithreaded vs. single-threaded) on slower but multi-core AMD CPUs.

 

With an AMD CPU + NVIDIA GPU, even if the NVIDIA GPU is a bit slower in the same price bracket, you will get higher frames per second in the end.

 

What the hell are you on about? 

| AMD FX-8350 @4.6 || Gigabyte GA-990FX-UD3 (Stay away from this MOBO) || AMD Radeon HD 7950 @1100/1400 || G.SKILL Aries 16GB @1600 |
| ADATA XPG SX900 128GB SSD (OS) || WD Caviar Black 2TB HDD || ASUS Xonar DG 5.1 Sound Card || XSPC Raystorm EX240 D5 WC Kit |
| CORSAIR HX750 750W || Modded NZXT Phantom |

What the hell are you on about? 

I wouldn't pay much attention to him; he's constantly been making "OH, NVIDIA drivers are SOOO MUCH better than AMD drivers" comments in almost every thread he posts in.

What the hell are you on about? 

Yeah, I couldn't make sense of that, so I just left it.

“Snorting instant coffee is the best,” said Kayla Johns, 19, of Portland.

Just hurts the brain reading it....

Maximums - Asus Z97-K w/ i5 4690, BCLK @ 106.9MHz x39 = 4.17GHz, 8GB of 2600MHz DDR3, Gigabyte GTX 970 G1 Gaming @ 1550MHz

 

Depends on your definition of 'can do most games'. I like 60 fps and over with max or at least very high details, and preferably some form of AA.

I get 50 fps in BF3 at ultra with max AA, so that and Crysis 3 (which I can only run on high because it's only 2GB) are the only games I've encountered so far that give me trouble. Of course, new games like Star Citizen or poorly optimized titles will do worse.

It probably would make you go deaf though...

I guess we'll have to wait until release day to see what kind of performance the 290X has under Linux. For once, AMD is ahead of NVIDIA in support for the 3.12 kernel. At least for now.

 

But as for me, I would like to see a SKU with a waterblock already attached.

Perhaps some of you guys skipped this:

 

Originally Posted by Shadow_UGZ:
 
Just came back from the AMD event in Montreal and wanted to share the info I got. The event was in a small room with 3 AMD guys, 2 identical rigs (except for the GPUs) and some refreshments.
 
I think the specs/benchmarks have already been posted so I'll just post some observations and other info I got from the AMD guys.
 
- The 290X was MUCH smoother than the GTX 780 (the GTX 780 was tearing like a mofo in BioShock Infinite, painful to watch...)
- The 290X has an "UBER-mode" switch on the side, which he couldn't tell me more about because of NDA
- The 280X is still Tahiti, but it's a "completely new ASIC"
- Never Settle will not be bundled with the R7/R9 series at launch but will be eventually. It will still be bundled with the 7970, etc.
- The 290X was surprisingly quiet and looks amazing (much better than the GTX 780 IMO, I love black)
- The 290X is HDMI 1.4 (not 2.0, so it's limited to 30Hz @ 4K over HDMI)
- NDA lifting this month "for sure"
- They are testing some beta drivers with 12K (3 x 4K) Eyefinity support, which they will be releasing soon
- He would not let me run GPU-Z...
- Very disappointed by 4K, expected a lot more. Not worth it IMO
 
I have some pics but they are very blurry, I can dump them in an imgur album if you guys really want to see them.
 
Here are the pictures: AMD 290X event Montreal - Imgur
 
Just noticed, looking at the pictures, why the GTX 780 was tearing so much: the minimum FPS.
 
BioShock Infinite Benchmark
 
290X
 
Min: 33.0 FPS
AVG: 43.3 FPS
Max: 56.6 FPS
 
GTX 780
 
Min: 9.89 FPS
Avg: 39.49 FPS
Max: 59.21 FPS
 
Originally Posted by Juub:
 
-He also said their aim is to compete with the GTX 780, not the Titan. Surprisingly, BioShock Infinite had the 290X beating the 780 by about 15%, but Tomb Raider only by about 5%. That's odd considering BioShock tends to favor NVIDIA and Tomb Raider AMD. He couldn't reveal the price, but he did tell me we'd be "extremely happy" about it. Considering it's aiming to compete with the 780, I wouldn't be surprised to see it priced lower.
 
-He didn't tell me Never Settle bundles will eventually be included with the R series; he simply told me people love bundles and they're a great incentive to buy, so they'll definitely try to have some in the future.
 
-The Sharp monitors were both 4K, 60Hz and 30''. They looked good, but not nearly good enough to justify their ridiculous price tags of $3,000+. 4K doesn't look that much better than 1600p at that screen size, but that's probably because the screen is too small to display UHD in its full glory. Still, for those who absolutely want a 4K 27'', you'll be disappointed.
 
-The 290X is designed to run at 95°C, so it'll get hot, but it won't throttle or get damaged by this. This is the target they're aiming for. The GPU will run really hot but will still be stable.
 
-Mantle will be strictly up to the developers to use or not. They apparently have some big-name players in the field who will use Mantle; DICE is the first big one, but expect to see more in the near future. Mantle is what everyone says it is: it brings a console-like development environment to the PC. I wouldn't hold my breath for 20% better performance, but I would expect much smoother and better ports.
 
-The tech demo showing Ruby with the not-so-great TressFX had around 23 different effects designed by AMD, and it was just a beta. The finished version will apparently look much better.
 
-The CPUs running in both systems were i7-3960Xs.
 
-The AMD rep said he saw the rumors about the pricing online, and the most widely accepted prices are "completely wrong".
 
-He couldn't reveal anything about the specs. No GPU-Z. Nothing. Just the cards as is.
 
- I attempted to get the price out of him, first by asking how much the monitors were worth and then by asking how much the whole system was worth. He didn't bite.

95°C? Wow. Hopefully there are waterblocks out quickly. That is crazy hot.
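On the HDMI 1.4 point in the list above, the 30 Hz cap at 4K falls out of the TMDS pixel-clock ceiling. The quick sketch below uses the standard 3840x2160 timing totals as I remember them, so treat it as an illustration rather than a spec quote.

```python
# Why HDMI 1.4 tops out at 30 Hz for 4K: the TMDS pixel-clock ceiling.
# Timing figures are the standard 3840x2160 totals as I recall them.

HDMI_1_4_MAX_MHZ = 340    # HDMI 1.4 TMDS pixel-clock limit
HDMI_2_0_MAX_MHZ = 600    # HDMI 2.0 raises the ceiling

H_TOTAL, V_TOTAL = 4400, 2250   # active 3840x2160 plus blanking

for refresh_hz in (30, 60):
    clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    verdict = ("fits HDMI 1.4" if clock_mhz <= HDMI_1_4_MAX_MHZ
               else "needs HDMI 2.0 (or DisplayPort)")
    print(f"4K @ {refresh_hz} Hz -> ~{clock_mhz:.0f} MHz pixel clock: {verdict}")
```

So over HDMI the card is stuck at 4K30 no matter how fast the GPU is; 4K60 on this generation would have to go over DisplayPort instead.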

The uber mode switch bothers me a little bit.

Like, it suggests the card consumes too much power or becomes too noisy at those clocks, so it's clocked lower by default...

 

Like when AMD released the Bulldozer CPUs: they couldn't ramp up the clock speeds because power consumption would have become ridiculous.

The uber mode switch bothers me a little bit.

Like, it suggests the card consumes too much power or becomes too noisy at those clocks, so it's clocked lower by default...

 

Like when AMD released the Bulldozer CPUs: they couldn't ramp up the clock speeds because power consumption would have become ridiculous.

I personally couldn't care less about power consumption. Honestly, my EVGA NEX SuperNova 1500W PSU needs a bit of a challenge.

What the hell are you on about?

I wouldn't pay much attention to him; he's constantly been making "OH, NVIDIA drivers are SOOO MUCH better than AMD drivers" comments in almost every thread he posts in.

Yeah, I couldn't make sense of that, so I just left it.

bathsalts like dem chinese zombies.

You missed my point.

AMD drivers are fine as long as you have fast single-threaded performance, i.e. an Intel CPU.

But NVIDIA's driver uses more threads, @Kuzma, thanks to which AMD CPUs work better with it even though they have slower single-threaded performance.

https://linustechtips.com/main/topic/66712-r9-280x-versus-gtx-760-4gb-battle-of-the-drivers-fight/

][ CPU: Phenom II x6 1045t @3,7GHz ][ GPU: GTX 660 2GB ][ Motherboard: Gigabyte GA-MA770T-UD3P ][ RAM: 8GB @1450Mhz CL9 DDR3 ][ PSU: Chieftec 500AB A ][ Case: SilentiumPC Regnum L50 ][ CPU Cooler: CoolerMaster Hyper 212 Evo & Arctic MX4 ][

You missed my point.

AMD drivers are fine as long as you have fast single-threaded performance, i.e. an Intel CPU.

But NVIDIA's driver uses more threads, @Kuzma, thanks to which AMD CPUs work better with it even though they have slower single-threaded performance.

https://linustechtips.com/main/topic/66712-r9-280x-versus-gtx-760-4gb-battle-of-the-drivers-fight/

 

IMO, they need to compare it with a one-tier-lower NVIDIA card, like the GTX 760 instead of the GTX 770.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |
