Zat0ichi

GTX 1680

Recommended Posts

Posted · Original PosterOP

Forgive the intrusion.

I'm wearing my tinfoil hat today and think that speculation about this seemingly inevitable product is somehow being suppressed.

Or am I mistaken that the product is inevitable?

 

Google will find nothing, at all.

 

Even searching reddit for a post I made - nothing.

 

Seems odd that when I'm searching for a weird little fix for some old tech I often find my own questions, yet this remains mysterious.

 

Why that would be done is obvious: sales. Why would people buy an RTX 2070 when a 1680 would give them the rasterising power they want at the right price point?

Try it. Make a little 1680 noise and see if you can get an echo.

 

How this "suppression" is implemented is not for discussion.

 

Treat it like astronomers using transits to detect exoplanets, or crowdsourcing mobile phone photos to create images of the Milky Way.


Was that card released?


ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


What is with people posting all this incoherent nonsense on here these days?

 

Are you asking us to talk about Nvidia releasing a 1680, so that they actually will? Or are you trying to suggest that some greater power is at work to cover up the inevitable release of a 1680?


Desktop: i7 7700K @ 4.7Ghz, 16GB DDR4, Cooler Master H80i v2, 500GB M.2 SSD, Corsair 230T case, Sapphire RX 590 8GB

Laptop: Eluktronics Mech 15 G2, i7 8750H, 1060 6GB, 16GB DDR4, 480GB Nvme SSD, 144hz panel  

Laptop: Eluktronics W650kk1, i5-7400, 8GB DDR4, GTX 1050Ti, 250GB WD SSD, 120Hz Asus Display  

Laptop: Alienware 18 (2014). I7 4930MX @ 4.1GHz, 16GB RAM, 500GB 840Evo, 1TB HDD, GTX 980m 8GB  **Broken**

Laptop: Razer Blade Stealth 2017, i7 7500u, 16GB RAM, 512GB SSD

Laptop: Razer Blade 14 mid 2016. I7 6700hq, 16GB RAM, 512GB SSD, 970m 6GB.

Laptop: Origin Eon11s. i7 3820QM, GT 650M 2GB, 120GB SSD, 16GB DDR3 RAM, 11.6in 768p display  **Dismantled**

 

54 minutes ago, Zat0ichi said:

 

They could release something like that, but there wouldn't be much point.

 

People should just buy a used 1080 Ti if they want 2080 performance at a discount.


I edit my posts a lot

1 hour ago, Zat0ichi said:

Why would people buy an RTX 2070 when a 1680 would give them the rasterising power they want at the right price point?

This is the reason the 1680 is not only not inevitable, it's actually improbable. Why would Nvidia release a card that makes one of their own products obsolete and probably kills any chance of ray tracing taking off any time soon? The only scenario I can see where Nvidia would release such a card is if Navi suddenly turns out to be everything people hoped for and nobody is buying 20(70/80/80 Ti) cards anymore.


PSU tier list // Community Standards 

 

My System:

Spoiler

Xeon E3-1231v3, Fractal Design Meshify C TG, Corsair Vengeance Pro 4x4GB @1600MHz, MSI Z97S SLI Krait Edition, Samsung 850 EVO 512GB and a 2TB WD Blue, Gigabyte RTX 2060 Windforce OC, Seasonic Focus Plus Gold 650, Corsair Hydro H75

 

Lenovo L480 (i5-8250U, 16GB RAM)


Yeah, as mentioned above... a 1680 with the same or better performance than the 2070, and cheaper? Nvidia killing its own creation (RTX) is like Apple killing its notch: maybe they will eventually, but not in the short run.

10 minutes ago, PacketMan said:

Yeah, as mentioned above... a 1680 with the same or better performance than the 2070, and cheaper? Nvidia killing its own creation (RTX) is like Apple killing its notch: maybe they will eventually, but not in the short run.

Didn't Nvidia already kinda kill the RTX 2070 with the RTX 2060 being 40% cheaper for just 9% slower?


Workstation Rig:
CPU:  Intel Core i9 9900K @4.8ghz  |~| Cooling: Noctua NH-U12P |~|  MOBO: Asus Z390M ROG Maximus XI GENE |~| RAM: 32gb 3200mhz CL16 G.Skill Trident Z RGB |~| GPU: nVidia Founders Edition RTX 2080 Ti  |~| PSU: Corsair RM850X 80Plus Gold |~| Boot: WD Black M.2 2280 500GB NVMe |~| Storage: 2X4TB HDD 7200rpm Seagate Iron Wolf + 2X2TB SSD SanDisk Ultra |~| Case: Cooler Master Case Pro 3 |~| Display: ASUS ROG Swift PG348Q 3440x1440p100hz |~| OS: Windows 10 Pro.
Personal Use Rig:
CPU: Intel Core i7 8700 @4.45ghz |~| Cooling: Cooler Master Hyper 212X |~| MOBO: Gigabyte Z370M D3H mATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: EVGA Founders Edition GTX 1080 Ti |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk SSD Plus G26 480gb |~| Case: Cooler Master Case Pro 3 |~| Display Setup: Acer X34 3440x1440p100hz |~| OS: Windows 10 Pro.
19 minutes ago, Princess Cadence said:

Didn't Nvidia already kinda kill the RTX 2070 with the RTX 2060 being 40% cheaper for just 9% slower?

No, because I keep seeing people recommend the 2070 for some reason. I'll never understand why. People keep saying "but muh total system cost", completely ignoring the fact that:

  1. Most people upgrade their graphics card at least once in a PC's life.
  2. Total system cost makes no sense as a metric, because the upgrade doesn't increase the performance of the whole system, only the GPU.
  3. Even comparing total system cost rather than GPU cost alone, in a system that suits a 2060/2070, the 2070 is STILL worse value than the 2060.


Main System: EVGA GTX 1080 SC, i7 8700, 16GB DDR4 Corsair LPX 3000mhz CL15, Asus Z370 Prime A, Noctua NH D15, EVGA GQ 650W, Fractal Design Define R5, 2TB Seagate Barracuda, 500gb Samsung 850 Evo
Secondary System: EVGA GTX 780ti SC, i5 3570k @ 4.5ghz, 16gb DDR3 1600mhz, MSI Z77 G43, Noctua NH D15, EVGA GQ 650W, Fractal Design Define R4, 3TB WD Caviar Blue, 250gb Samsung 850 Evo
 
9 hours ago, Princess Cadence said:

Didn't Nvidia already kinda kill the RTX 2070 with the RTX 2060 being 40% cheaper for just 9% slower?

I meant the ray tracing feature. But here in Spain you can find a 2060 for 400 euros and a 2070 for 500 euros, so now it's not that much of a difference (bearing in mind the 2080 is 750 euros).


I'm hoping for the 1880 to be next: a 2080 without the pointless ray tracing and other useless bells and whistles.

 

Just raw power for FPS on decent screen real estate.

Posted · Original PosterOP
12 hours ago, Theguywhobea said:

What is with people posting all this incoherent nonsense on here these days?

 

Are you asking us to talk about Nvidia releasing a 1680, so that they actually will? Or are you trying to suggest that some greater power is at work to cover up the inevitable release of a 1680?

Love you.

X

Posted · Original PosterOP
11 hours ago, PacketMan said:

Yeah, as mentioned above... a 1680 with the same or better performance than the 2070, and cheaper? Nvidia killing its own creation (RTX) is like Apple killing its notch: maybe they will eventually, but not in the short run.

I only found out recently that Nvidia was started by a splinter group from Silicon Graphics that insisted on mainstream GPUs.

 

I think Nvidia need to take a slice of humble pie and take a step back. They have a good slice of the automotive and data centre market, so they can stop squeezing home users now.

 

Their shares are now driven by AI and data centres, so it's time to fix the massive image problem they have and reassert comprehensive market dominance before Intel steps into the ring. Who knows what will happen then...

 

 


Yeah, I figured the overall span of this generation (kind of a combined generation) would be 1650, 1660, 1660 Ti, and then going up from there: RTX 2060 (a 1670 equivalent would compete here), RTX 2070 (a 1680 equivalent would compete), RTX 2080 (a 1680 Ti would compete), RTX 2080 Ti. There would be no motivation to make those high-end 16-series cards, because so many buyers would pass on RTX to save a buck and the feature would be dead in the water.


People should have figured out Nvidia's strategy by now. It's blindingly obvious: they want to lock ray tracing behind the higher price points for as long as possible, while the low end suffers. That Quadro RTX card with more RT ops than the 2080 Ti shows they are holding back and want to keep these technologies away from the low end. They can afford to do that because AMD doesn't have real-time ray tracing yet. They could pack all the GPUs with enough RT cores to do ray tracing at 4K/100 fps, but they didn't, because they know they can get more money if they starve the low end and force people into buying expensive GPUs.

 

It's simple: they want you to pay, and pay big, for ray tracing; they want to force you into it. That's why there'll be no RTX 2050 and no GTX 1680 Ti. Why would they kill off their own cards? The 2070 and 2080 Ti are bad enough as it is. The 2080 and 2060 are less bad, but they still aren't amazing value.

 

It won't last long, however. Once Intel comes out with their GPU, Nvidia is in serious bother. Intel will wipe the floor with them, ESPECIALLY in compute (those kinds of workloads have always been Intel's focus rather than gaming). If you look at the new Gen11 graphics outperforming mini-Vega, you can see that Gen11 sits in between AMD and Nvidia. That gap will be closed quickly, and Intel will pummel Nvidia. You'd therefore expect the high end from next-gen Intel to outperform Turing. And with Raja at the helm we will see some really fancy technologies. We saw Raja do HBCC; most people downplay it, but it's a really good technology and it could be taken advantage of on Intel's lower-VRAM cards. That will lower prices and, again, put Intel in between AMD and Nvidia on pricing.

Posted · Original PosterOP

Lo and behold, speculation is appearing through Google now. Not just this thread either.

 

Maybe my other posts were not on high-profile enough sites, or needed to reach a critical number of hits.

Posted · Original PosterOP
7 hours ago, MeatFeastMan said:

People should have figured out Nvidia's strategy by now. It's blindingly obvious: they want to lock ray tracing behind the higher price points for as long as possible, while the low end suffers. That Quadro RTX card with more RT ops than the 2080 Ti shows they are holding back and want to keep these technologies away from the low end. They can afford to do that because AMD doesn't have real-time ray tracing yet. They could pack all the GPUs with enough RT cores to do ray tracing at 4K/100 fps, but they didn't, because they know they can get more money if they starve the low end and force people into buying expensive GPUs.

 

It's simple: they want you to pay, and pay big, for ray tracing; they want to force you into it. That's why there'll be no RTX 2050 and no GTX 1680 Ti. Why would they kill off their own cards? The 2070 and 2080 Ti are bad enough as it is. The 2080 and 2060 are less bad, but they still aren't amazing value.

 

It won't last long, however. Once Intel comes out with their GPU, Nvidia is in serious bother. Intel will wipe the floor with them, ESPECIALLY in compute (those kinds of workloads have always been Intel's focus rather than gaming). If you look at the new Gen11 graphics outperforming mini-Vega, you can see that Gen11 sits in between AMD and Nvidia. That gap will be closed quickly, and Intel will pummel Nvidia. You'd therefore expect the high end from next-gen Intel to outperform Turing. And with Raja at the helm we will see some really fancy technologies. We saw Raja do HBCC; most people downplay it, but it's a really good technology and it could be taken advantage of on Intel's lower-VRAM cards. That will lower prices and, again, put Intel in between AMD and Nvidia on pricing.

I'm no Nvidia fanboy, although I do always end up buying team green. I've owned every GTX xx60/Ti card.

 

I've wanted to buy AMD, but they just did not have the product I wanted. The GTX 970 3.5 GB thing was the beginning of Nvidia taking the piss: GTA V maxed out at 1440p uses 4 GB of VRAM. So glad I found a reasonably priced 1070 in 2017.

AMD just hasn't been able to answer Nvidia's standard. Once you've paid the premium they are cool, quiet and trouble-free. The premium is just ridiculous now. (I am most definitely an RTX hater.)

Not sure if Raja's strategies just weren't implemented well, whether there was some deficiency at AMD, or what.

He is not necessarily a good talisman. 

 

I really want him to bring out rock-solid, stable, cost-effective GPUs that will tear through complex rasterised geometry.

 

Let Nvidia do their undersubscribed GameWorks and RTX nonsense. That can be the premium tier.

 

Just give normal people high polygon counts and high frame rates at 4K.

I think tessellation and volumetrics are no longer GameWorks exclusives. They add a LOT to immersion.

 

Yes, lighting is actually critical to good cinema, but gaming is not ready yet. True ray-traced GI is a wonderful thing, for the future.

 

RTX Witcher 3 patch, anyone?

 

 

So, Nvidia's greed will hold back the 1680 until AMD or Intel kicks them hard enough.

That's years away...

