3DMark Time Spy results allegedly from RTX 2080 appear in database

RobbinM
18 minutes ago, ne0tic said:

And it may be worth it for those like me who are rocking a GTX 960 and want an upgrade.

Good point. I'm in a similar boat, but it's also tempting to upgrade to a 1080 instead, save hundreds of dollars (even though I might be willing to shell out for a 20 series card), and call it good.


18 minutes ago, pas008 said:

What? Even over the 10 series?

20% more CUDA cores?

Faster memory?

Pascal was a huge jump in performance, with a similar (I believe slightly bigger) jump in specs over Maxwell.

 

That is, if you're willing to pay the higher price tag. You can always buy a used 10 series card, or one of the now-discounted 10 series cards from retailers trying to clear inventory.
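For a rough sense of what "20% more CUDA" buys on paper, here is a back-of-the-envelope FP32 throughput comparison. The GTX 1080 figures are the official specs; the RTX 2080 figures (2944 cores, ~1710 MHz boost) are the rumored, unconfirmed numbers circulating at the time, so treat them as assumptions:

```python
# Peak FP32 throughput ~= 2 ops (FMA counts as two) x CUDA cores x clock.
# GTX 1080 specs are official; the RTX 2080 figures are the rumored
# 2944 cores / ~1710 MHz boost, i.e. assumptions, not confirmed specs.
def peak_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return 2 * cuda_cores * boost_ghz / 1000

gtx_1080 = peak_tflops(2560, 1.733)   # ~8.9 TFLOPS (matches NVIDIA's claim)
rtx_2080 = peak_tflops(2944, 1.710)   # ~10.1 TFLOPS on rumored specs

print(f"GTX 1080: {gtx_1080:.1f} TFLOPS")
print(f"RTX 2080 (rumored): {rtx_2080:.1f} TFLOPS")
print(f"Paper gain: {rtx_2080 / gtx_1080 - 1:.0%}")
```

On those rumored specs the paper gain is closer to 15% than 20%, so any 20-30% real-world jump would have to come from architecture and memory as well.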


30 minutes ago, pas008 said:

What? Even over the 10 series?

20% more CUDA cores?

Faster memory?

Pascal was a huge jump in performance, with a similar (I believe slightly bigger) jump in specs over Maxwell.

 

I definitely don't think it's worth it if you're coming from a Pascal card, but what I'm saying is that it may be worth it if you're coming from a Maxwell card. We will have to wait for "real" benchmarks before we know whether it's worth it, though.



1 minute ago, ne0tic said:

I definitely don't think it's worth it if you're coming from a Pascal card, but what I'm saying is that it may be worth it if you're coming from a Maxwell card. We will have to wait for "real" benchmarks before we know whether it's worth it, though.

I mean, that's really up to the person. Some might find it worth it while others won't.


36 minutes ago, Hurican7 said:

Good point. I'm in a similar boat, but it's also tempting to upgrade to a 1080 instead, save hundreds of dollars (even though I might be willing to shell out for a 20 series card), and call it good.

Yeah, I'm planning on getting an RTX 2070 or a GTX 1080, but I don't know which. I can get a GTX 1080 for about $650 and an RTX 2070 for $750, so I'm leaning towards the RTX 2070 because of ray tracing and DLSS.



22 minutes ago, ne0tic said:

Yeah, I'm planning on getting an RTX 2070 or a GTX 1080, but I don't know which. I can get a GTX 1080 for about $650 and an RTX 2070 for $750, so I'm leaning towards the RTX 2070 because of ray tracing and DLSS.

I guess you will have to wait for benchmarks. The new AA (DLSS) might give the 2070 better performance than one might expect.


1 hour ago, pas008 said:

But it looks like we are getting 20 to 30% more raw power in CUDA, with tensor cores doing work to relieve CUDA on other tasks, along with faster memory.

I'm curious to see what developers can do with the other core types. Or rather, I'm curious to see what they can do in general.


43 minutes ago, Brooksie359 said:

That is, if you're willing to pay the higher price tag. You can always buy a used 10 series card, or one of the now-discounted 10 series cards from retailers trying to clear inventory.

The Pascal release was three tiers, the Titan, 1080, and 1070, priced at $1,200, $700, and $450.

And the 2070 is probably being held back to clear 1080 Ti cards, because of a possible surplus of them.

 


1 minute ago, M.Yurizaki said:

I'm curious to see what developers can do with the other core types. Or rather, I'm curious to see what they can do in general.

This is what I was talking about in the other topic, but leadeater kept being closed-minded.

Those tensor cores could relieve CUDA in many areas, meaning less work to achieve more.

We don't know yet. But if they can do DLSS, that means they have another purpose besides helping with RT.
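The "tensor cores relieving CUDA" idea is, at bottom, an offloading argument, and Amdahl's law puts a ceiling on it. A minimal sketch; the fraction of work offloaded and the effective speed of the tensor path below are hypothetical placeholders, not measured numbers:

```python
# Amdahl-style bound: if a fraction f of the frame's shader work can be
# offloaded to tensor cores running s times faster, the overall speedup
# is 1 / ((1 - f) + f / s). f and s here are made-up illustrative values.
def offload_speedup(f: float, s: float) -> float:
    return 1.0 / ((1.0 - f) + f / s)

for f in (0.1, 0.2, 0.3):
    print(f"offload {f:.0%} of work at 4x speed -> "
          f"{offload_speedup(f, 4.0):.2f}x overall")
```

Even generous offload fractions give modest overall gains, which is why "less to achieve more" depends entirely on how much of a real frame the tensor cores can actually take over.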


13 minutes ago, pas008 said:

This is what I was talking about in the other topic, but leadeater kept being closed-minded.

Those tensor cores could relieve CUDA in many areas, meaning less work to achieve more.

We don't know yet. But if they can do DLSS, that means they have another purpose besides helping with RT.

Well, outside of that, I feel like I'm the only one who's excited for the new technology while everyone else is going "muh fps brah".


1 minute ago, M.Yurizaki said:

Well, outside of that, I feel like I'm the only one who's excited for the new technology while everyone else is going "muh fps brah".

Same here.

FPS isn't the only part of performance.

This is the first step toward a huge visual future.


34 minutes ago, M.Yurizaki said:

Well, outside of that, I feel like I'm the only one who's excited for the new technology while everyone else is going "muh fps brah".

I mean, the 2080 Ti is supposed to hit over 60 fps at 4K, based on what the tech media saw. So if you can get that and also have exciting new tech like ray tracing, then yeah, I am excited. Having more options is always good.
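For context, "over 60 fps at 4K" translates into a concrete per-frame budget; the arithmetic is simple:

```python
# 4K at 60 fps as a budget: ~16.7 ms per frame,
# roughly half a billion pixels shaded per second.
width, height, target_fps = 3840, 2160, 60

frame_time_ms = 1000 / target_fps
pixels_per_second = width * height * target_fps

print(f"Frame budget: {frame_time_ms:.1f} ms")                         # 16.7 ms
print(f"Pixel throughput: {pixels_per_second / 1e6:.0f} M pixels/s")   # ~498 M
```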


8GB of VRAM? It's gotta be the 2070.


5 hours ago, ARikozuM said:

Also, only 8GB of VRAM... Give us 12GB. 

 

4 hours ago, I-r0k said:

8GB??!! WHY!!!!????

I mean, it is GDDR6 memory, so hopefully the speed increase will help counteract that it is only 8GB. But I really haven't run into very many memory limits with my current 1080 either... So we shall see when the benchmarks come out.
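On the GDDR6 point, the bandwidth math backs up the hope that speed offsets size. The GTX 1080 figures are official; the RTX 2080 line assumes the rumored 14 Gbps GDDR6 on a 256-bit bus, which was not confirmed at the time:

```python
# Memory bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8
def bandwidth_gb_s(data_rate_gbps: float, bus_bits: int) -> float:
    return data_rate_gbps * bus_bits / 8

gtx_1080 = bandwidth_gb_s(10.0, 256)  # GDDR5X: 320 GB/s (official spec)
rtx_2080 = bandwidth_gb_s(14.0, 256)  # rumored GDDR6: 448 GB/s (assumption)

print(f"GTX 1080: {gtx_1080:.0f} GB/s")
print(f"RTX 2080 (rumored): {rtx_2080:.0f} GB/s, "
      f"+{rtx_2080 / gtx_1080 - 1:.0%}")
```

A 40% bandwidth jump at the same 8GB capacity would mean the card can move data in and out of that pool much faster, even if the pool itself isn't bigger.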



1 hour ago, M.Yurizaki said:

Well, outside of that, I feel like I'm the only one who's excited for the new technology while everyone else is going "muh fps brah".

Nah, those of us who are excited are just sick of the haters, the tinfoil-hat wearers, and the "muh fps brah"-type responses already.

 

I can guarantee you, I am more than interested in this.

DLSS alone can be a HUGE gain. It looks pretty damn good and is basically free performance if you usually turn AA on, which at 1440p I definitely do. And this is only the "I want moar FPS" fix.

RT looks amazing. And no matter how little other people appreciate it, the room demo they showed, with the table lit only by the light from the window, amazed me. That is exactly what I always hated about games: even in those that faked it well, I always thought, "oh, how would there be light right there?" Sure, this is not a big concern for others, but it always drives me nuts. With RT the room finally looks the way it is supposed to. I am BEYOND excited for that.

 

On the other hand, I understand that some people only want a faster GPU and don't want better graphics. At least I try to understand that (it is hard, though).

 

NVIDIA is moving the industry forward with the first RT implementation, and that is awesome in itself. But it is a rough start, as with every new technology.

We can clearly see the people who don't understand, don't want to understand, or just flat out misinterpret just about every piece of information we get. As long as it does not give FPS, it must be bad and/or useless.

 

We can only hope that benchmarks shut "them" up, or that RT games look as amazing as possible.

Yes, that is hoping. No, this is not fanboyism. It is just me being excited that we finally get something worthwhile beyond pure FPS numbers. (Along with FPS numbers!)
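The "basically free performance" claim about DLSS can be sketched with a toy frame-time model. This is not NVIDIA's actual pipeline, and every millisecond figure below is an invented placeholder; the sketch only shows why shading fewer pixels and paying a fixed reconstruction cost can beat native rendering plus conventional AA:

```python
# Toy model: a DLSS-style path shades fewer native pixels, then pays a
# fixed reconstruction cost; the TAA path pays a small AA cost on top of
# full-resolution shading. All millisecond figures are hypothetical.
fixed_ms       = 5.0   # per-frame work that doesn't scale with resolution (assumed)
shade_ms       = 10.0  # resolution-dependent shading at native 1440p (assumed)
taa_ms         = 1.0   # conventional AA pass (assumed)
render_scale   = 0.65  # internal resolution, ~65% per axis (assumed)
reconstruct_ms = 1.5   # tensor-core upscale/reconstruction pass (assumed)

taa_frame  = fixed_ms + shade_ms + taa_ms                              # 16.0 ms
dlss_frame = fixed_ms + shade_ms * render_scale**2 + reconstruct_ms    # ~10.7 ms

print(f"Native + TAA : {taa_frame:.1f} ms -> {1000 / taa_frame:.0f} fps")
print(f"DLSS-style   : {dlss_frame:.1f} ms -> {1000 / dlss_frame:.0f} fps")
```

Because the fixed per-frame work doesn't shrink with the render scale, real-world gains are smaller than the shading arithmetic alone would suggest.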


This is pretty cool. Synthetics do show a bit of the story, so I like where this is going.

Also, I personally don't care about price, since I'm an enthusiast with this stuff. Regardless, a 40% improvement is awesome for something that ISN'T an RTX demo.



19 minutes ago, Rattenmann said:

-Snip-

After hearing about "muh fps" for the billionth time, I decided to look back at the past: whenever some newfangled GPU technology came out, what did it actually bring to the table? Most of the time, the first generation of hardware with the shiny new feature wasn't capable of the blazing-fast performance people expect today, comparatively speaking.

 

In fact, I feel like that's all a lot of the PC gaming market demands: more FPS. Any new technology? It had better bring more FPS or it's not worth it. I remember when Mantle was all the rage; looking at it, I knew it wouldn't help machines with high-end CPUs (if your CPU is already fast enough to feed the GPU to 100% utilization, lowering the CPU load won't make it go any faster). And when the results came in, with publications testing on high-end CPUs and showing little to no difference, people started questioning Mantle's purpose (at least I think that's what happened).

 

I mean, if people are going to push FPS for competitive play, they're likely going to turn off all the fancy graphical features anyway, to the point that a 1080 Ti will probably have time to crunch a DES key within an hour while they play games.
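The Mantle point above boils down to a simple bottleneck model: frame time is set by whichever of the CPU or GPU takes longer per frame, so an API that cuts CPU overhead only helps when the CPU is the slower side. A minimal sketch with hypothetical timings:

```python
# Frame time ~= max(cpu_ms, gpu_ms): whichever side is slower sets the pace.
# Cutting CPU cost (what Mantle did) only helps when cpu_ms > gpu_ms.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: a slow CPU paired with a fast GPU...
print(f"{fps(cpu_ms=20, gpu_ms=10):.0f} fps")  # 50 fps, CPU-bound
print(f"{fps(cpu_ms=10, gpu_ms=10):.0f} fps")  # 100 fps after halving CPU work

# ...versus a high-end CPU that was never the bottleneck.
print(f"{fps(cpu_ms=8, gpu_ms=10):.0f} fps")   # 100 fps, GPU-bound
print(f"{fps(cpu_ms=4, gpu_ms=10):.0f} fps")   # still 100 fps: no gain
```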

