
GeForce GTX 1080 3DMark (First Benchmarks)

Whoa...

1800+ MHz

I'm gonna take this with a pinch of salt.

saved.png

The norms which determine the measure of morality of a human act are, objectively, the moral law and, subjectively, man's/woman's conscience


I could be wrong, but that's an outlandish overclock for any cooler, let alone a reference card.

 

We'll just have to wait until the card comes out and a reliable tester benchmarks the reference 980, 980 Ti, and 1080.


About 20-30% faster than a stock 980 Ti. That's about what everyone was expecting. We can also expect TDP to be 200W or lower. Looking at the boost clocks, I won't be surprised to see AIB custom cards hit 2GHz on the core. I'm hoping that, like the 980 Ti, Pascal will overclock very well and scale well at the same time.

5820K - ASUS X99-A - 16GB Corsair LPX - HD 7970 GHz - Qnix 1440p @ 96Hz - Waiting for Polaris/Pascal


The GTX 980 was only slightly faster than the 780 Ti at launch. There's a much bigger gap here, and it should get bigger over time with drivers.

Current PC: Origin Millennium- i7 5820K @4.0GHz | GTX 980Ti SLI | X99 Deluxe 

 


2 minutes ago, Face2Face said:

About 20-30% faster than a stock 980 Ti. That's about what everyone was expecting. We can also expect TDP to be 200W or lower. Looking at the boost clocks, I won't be surprised to see AIB custom cards hit 2GHz on the core. I'm hoping that, like the 980 Ti, Pascal will overclock very well and scale well at the same time.

I could be wrong, but if the results here are genuine, they don't bode well for the GTX 1080. Consider the GPU equivalent of IPC: this card's core clock is way faster than the GTX 980 Ti's, but the increase in performance is marginal.


To have the 1080 replace the 980 Ti is no small feat. 

[CPU: 4.7GHz i5 6600K] [MB: Asus Z170 Pro G] [RAM: G.Skill 2400 16GB (2x8)]

[GPU: MSI Twin Frozr GTX 970] [PSU: XFX Pro 850W] [Cooler: Hyper 212 Evo]
[Storage: 500GB WD HDD / 128GB SanDisk SSD ] [Case: DeepCool Tessaract]

[Keyboard: AZIO MGK1] [Mouse: Logitech G303] [Monitor: 2 x Acer 23" 1080p IPS]

 


4 minutes ago, Aereldor said:

I could be wrong, but if the results here are genuine, they don't bode well for the GTX 1080. Consider the GPU equivalent of IPC: this card's core clock is way faster than the GTX 980 Ti's, but the increase in performance is marginal.

Yes, the IPC is actually worse than a GTX 980 Ti's, clock for clock. Someone took an LN2 980 Ti, ran it at the same boost clocks as the 1080, and the 980 Ti actually scored higher in 3DMark Fire Strike. I think the biggest draws are lower power consumption and, hopefully, price. Also, if the 1070 is priced right, we could be looking at 980 Ti performance for $300-$400. If that's the case, the 1070 would be the real winner.
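
If anyone wants to sanity-check the clock-for-clock argument, here's a quick back-of-the-envelope sketch in Python; the scores and clocks are made-up placeholders, not the leaked numbers:

```python
# Rough "performance per clock" comparison -- the GPU analogue of IPC.
# All numbers below are made-up placeholders, NOT the leaked scores.

cards = {
    # name: (3DMark graphics score, sustained core clock in MHz)
    "GTX 980 Ti (stock)": (17000, 1200),
    "GTX 1080 (rumoured)": (21000, 1800),
}

for name, (score, clock_mhz) in cards.items():
    print(f"{name}: {score} pts @ {clock_mhz} MHz -> {score / clock_mhz:.1f} pts/MHz")

# A higher absolute score can still mean lower points-per-MHz,
# which is exactly what "worse clock for clock" means here.
```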

5820K - ASUS X99-A - 16GB Corsair LPX - HD 7970 GHz - Qnix 1440p @ 96Hz - Waiting for Polaris/Pascal


Ugh, that reference cooler is so stupid looking. Like something a 12-year-old would think looks awesome (which might be why it looks like that), or something Zotac would make.

CPU: Ryzen 7 3700x,  MOBO: ASUS TUF X570 Gaming Pro wifi, CPU cooler: Noctua U12a RAM: Gskill Ripjaws V @3600mhz,  GPU: Asus Tuf RTX OC 3080 PSU: Seasonic Focus GX850 CASE: Lian Li Lancool 2 Mesh Storage: 500 GB Inland Premium M.2,  Sandisk Ultra Plus II 256 GB & 120 GB



3 minutes ago, Face2Face said:

Yes, the IPC is actually worse than a GTX 980 Ti's, clock for clock. Someone took an LN2 980 Ti, ran it at the same boost clocks as the 1080, and the 980 Ti actually scored higher in 3DMark Fire Strike. I think the biggest draws are lower power consumption and, hopefully, price. Also, if the 1070 is priced right, we could be looking at 980 Ti performance for $300-$400. If that's the case, the 1070 would be the real winner.

Give it a month or two, and that's precisely what you'll get with the R9 Nano/Fury. They've dropped as low as $450, and are only getting lower. Furthermore, AMD clearly has the upper hand when it comes to DirectX 12, with a $300 R9 390 keeping up with or even beating a $600 GTX 980 Ti.


7 minutes ago, SpaceTurtle917 said:

How do we know this is legitimate? 

We don't, although rumors this close to launch are usually fairly accurate.


3 minutes ago, maizenblue said:

Ugh, that reference cooler is so stupid looking. Like something a 12-year-old would think looks awesome (which might be why it looks like that), or something Zotac would make.

I'd wait until the final product is released to make your final judgment. I don't think it looks bad at all, and it just sits inside your case anyway. 

5820K - ASUS X99-A - 16GB Corsair LPX - HD 7970 GHz - Qnix 1440p @ 96Hz - Waiting for Polaris/Pascal


Will we see 2GHz this gen? :D

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


3 minutes ago, Aereldor said:

Give it a month or two, and that's precisely what you'll get with the R9 Nano/Fury. They've dropped as low as $450, and are only getting lower. Furthermore, AMD clearly has the upper hand when it comes to DirectX 12, with a $300 R9 390 keeping up with or even beating a $600 GTX 980 Ti.

To be fair, isn't the 390 only winning in DX12 in the new Hitman, an AMD-sponsored title? AMD is making a good showing in DX12, but I doubt NVIDIA is sitting idle. If GP104 is anything like GP100, then we're looking at the same SMX layout as GCN, meaning NVIDIA is now designing their architecture to be more like GCN. If that's the case, utilization will be very good, and it hopefully won't suffer the same fate as Kepler, and soon Maxwell. Time will tell if async will be AMD's ace in the hole.

5820K - ASUS X99-A - 16GB Corsair LPX - HD 7970 GHz - Qnix 1440p @ 96Hz - Waiting for Polaris/Pascal


2 minutes ago, Face2Face said:

To be fair, isn't the 390 only winning in DX12 in the new Hitman, an AMD-sponsored title? AMD is making a good showing in DX12, but I doubt NVIDIA is sitting idle. If GP104 is anything like GP100, then we're looking at the same SMX layout as GCN, meaning NVIDIA is now designing their architecture to be more like GCN. If that's the case, utilization will be very good, and it hopefully won't suffer the same fate as Kepler, and soon Maxwell. Time will tell if async will be AMD's ace in the hole.

 

That might be the case, but AMD has the console space right now.

AMD has the upper hand in game development because the uarch in the consoles' GPUs is GCN.

With that in mind, I certainly hope async will bloom, not because AMD needs the money, but because it's basically free performance for the cards.

The norms which determine the measure of morality of a human act are, objectively, the moral law and, subjectively, man's/woman's conscience


14 minutes ago, zeraine00 said:

That might be the case, but AMD has the console space right now.

AMD has the upper hand in game development because the uarch in the consoles' GPUs is GCN.

With that in mind, I certainly hope async will bloom, not because AMD needs the money, but because it's basically free performance for the cards.

Correct, and that's why it looks like NVIDIA is designing their GPUs to be more like GCN, except for async. It has to do with wavefront and warp design. GCN has a ton of compute power, and it's being utilized to the fullest in newer games designed for GCN. The Kepler and Maxwell SMX layout is basically leaving performance on the table, as we're not seeing GCN-like utilization. AMD was very smart to win the console contracts, as it will help them tremendously in the PC gaming space as well. The GCN architecture is more advanced than most people think.
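
To put the "ton of compute power" bit in numbers: theoretical FP32 throughput is just shader count x 2 FLOPs per clock x core clock. A quick sketch (the shader counts and clocks are approximate reference specs, so treat them as assumptions):

```python
# Theoretical FP32 throughput = shaders * 2 FLOPs (FMA) per clock * core clock.
# Shader counts and clocks are approximate reference specs -- treat them as assumptions.

def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

gpus = {
    "R9 390 (GCN)": (2560, 1000),
    "GTX 970 (Maxwell)": (1664, 1178),
}

for name, (shaders, clock_mhz) in gpus.items():
    print(f"{name}: ~{tflops(shaders, clock_mhz):.2f} TFLOPS theoretical FP32")

# GCN's raw-compute lead only turns into frames when the API and engine keep
# those shaders busy -- which is exactly the utilization point above.
```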

5820K - ASUS X99-A - 16GB Corsair LPX - HD 7970 GHz - Qnix 1440p @ 96Hz - Waiting for Polaris/Pascal


30 minutes ago, zeraine00 said:

Whoa...

1800+ MHz

I'm gonna take this with a pinch of salt.

saved.png

If you compare Skylake clocks to older ones, it's not too far off, to be honest.

Desktop Build Log http://linustechtips.com/main/topic/486571-custom-wooden-case-with-lighting/#entry6529892

thinkpad l450, i5-5200u, 8gb ram, 1080p ips, 250gb samsung ssd, fingerprint reader, 72wh battery <3, mx master, motorola lapdock as second screen

Please quote if you want me to respond and marking as solved is always appreciated.


With such a high clock speed, it will be interesting to see what temperatures this card is hitting. I'm assuming it's a reference card and that Nvidia is maintaining its 80°C thermal throttle, in which case I'm curious how long it would be able to hold 1800MHz (if the screenshots are true, of course).

CPU: AMD 7800X3D  | GPU: Asus Dual RTX 2080 Advanced | RAM: 32GB 6000MHz CL30 | Motherboard: ASUS ROG Strix X670E-F

PSU: Corsair RX750 | OS: Windows 11 64-Bit | Mouse: Logitech G502 Proteus Spectrum | Keyboard: Corsair K70 RGB (Brown Switches) 


Considering this is kind of a double process jump, it's a little disappointing performance-wise. Not enormously so; it just tells us that 16nm (20nm + FinFET) is only really good for about 2x scaling and not much more than that. A decent replacement for the 970/980, and notably faster than what AMD intends to release based on all we know, but it's also a bigger chip.
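
Quick back-of-the-envelope on why "20nm + FinFET" only buys roughly 2x; this is just a sketch assuming 16FF keeps a 20nm-class metal pitch:

```python
# Naive area scaling implied by the node names alone (assuming a linear shrink):
naive = (28 / 16) ** 2       # ~3.06x
# TSMC's 16nm FinFET reuses a roughly 20nm-class back end of line (metal layers),
# so the practical density gain over 28nm is commonly quoted at around 2x:
practical = (28 / 20) ** 2   # ~1.96x

print(f"Name-plate 28nm -> 16nm scaling: {naive:.2f}x")
print(f"Realistic (20nm-class metal) scaling: {practical:.2f}x")
```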


"Worth noting 3DMark11 is not showing correct GPU clock"

 

Just in case anyone missed that part. Though I doubt any of this holds any merit whatsoever; I guess we'll find out tomorrow around 1 PM...

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


1 hour ago, Face2Face said:

Correct, and that's why it looks like NVIDIA is designing their GPUs to be more like GCN, except for async. It has to do with wavefront and warp design. GCN has a ton of compute power, and it's being utilized to the fullest in newer games designed for GCN. The Kepler and Maxwell SMX layout is basically leaving performance on the table, as we're not seeing GCN-like utilization. AMD was very smart to win the console contracts, as it will help them tremendously in the PC gaming space as well. The GCN architecture is more advanced than most people think.

 

In terms of just raw power, the GCN uarch is without a doubt one of the most powerful compute monsters out there.

Though Pascal is a compute powerhouse, I wonder what the effect of bringing more compute units will be, such as more power consumption or increased heat output.

The norms which determine the measure of morality of a human act are, objectively, the moral law and, subjectively, man's/woman's conscience


1 hour ago, ChrisCross said:

If you compare Skylake clocks to older ones, it's not too far off, to be honest.

With Pascal bringing more compute power, it would be amazing if Nvidia pulls it off.

 

 

The norms which determine the measure of morality of a human act are, objectively, the moral law and, subjectively, man's/woman's conscience


2 hours ago, Face2Face said:

To be fair, isn't the 390 only winning in DX12 in the new Hitman, an AMD-sponsored title? AMD is making a good showing in DX12, but I doubt NVIDIA is sitting idle. If GP104 is anything like GP100, then we're looking at the same SMX layout as GCN, meaning NVIDIA is now designing their architecture to be more like GCN. If that's the case, utilization will be very good, and it hopefully won't suffer the same fate as Kepler, and soon Maxwell. Time will tell if async will be AMD's ace in the hole.

FreeSync is an open standard, which makes FreeSync monitors cheaper. It's CrossFire vs SLI all over again.

