
GTX 980 Ti or GTX 1070 - Help me decide!

14 minutes ago, Cryptonite said:

Great choice. I hope someone locks this thread though; it got derailed damn badly by people who love to argue about the dumbest crap I've ever heard.

It's a public forum. Just because you didn't participate in the conversation doesn't mean you don't deserve a middle finger for your contribution.

 

Discussion = arguing. Great.


11 hours ago, Qwweb said:

Again, I will reiterate. 1070 ~ 980 Ti in gaming (the 1070 is 3% better); 980 Ti > 1070 in workloads. The 980 Ti has features that better benefit workloads, such as higher memory bandwidth, a wider memory bus, more L2 cache for the VRAM, more shaders, more texture mapping units, a higher texture rate, more render output units, and a higher pixel fill rate. All of this affects workloads much more than game performance, but it should be said that this also results in smoother gameplay, as the 980 Ti has more VRAM cache and a better output bus than the 1070.

More of this and that doesn't magically make the 980 Ti better in CUDA-accelerated applications. If that were the case, the old GTX 680, with 3x more cores, would obliterate the GTX 580 in CUDA workloads, but it's the other way around. From what I've seen, the 1070 and 980 Ti also perform similarly in productivity applications.
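To put rough numbers on that, here's a minimal sketch (the specs are approximate reference-card figures and purely illustrative) of how paper FP32 throughput is computed, and why it doesn't map onto measured CUDA performance:

# Why "more cores" alone doesn't predict CUDA performance.
# Specs are approximate reference figures (GTX 580 shader clock, GTX 680 base clock).
def theoretical_fp32_tflops(cuda_cores, clock_mhz):
    # Peak FP32 = cores x clock x 2 FLOPs per cycle (one FMA)
    return cuda_cores * clock_mhz * 2 / 1e6

cards = {"GTX 580 (Fermi)": (512, 1544), "GTX 680 (Kepler)": (1536, 1006)}
for name, (cores, mhz) in cards.items():
    print(f"{name}: {theoretical_fp32_tflops(cores, mhz):.2f} TFLOPS on paper")

# On paper the 680 is roughly 2x the 580, yet many CUDA compute workloads ran
# as fast or faster on the 580: Kepler moved scheduling work to the compiler
# and leans on instruction-level parallelism, so real throughput depends on
# the architecture, not the raw core count.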

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


9 hours ago, JohnT said:

Weren't those the Sandy Bridge chips? There are a lot of conflicting stories, with some even calling this a bug. I see this "issue" as being caused by an unbalanced system. A lot of goons shoot for the moon on their GPU and leave their CPU lagging behind. A lot of people on this forum have wholeheartedly been recommending i5s for a long time. Cards like the 1080 change that; the 1080 can easily saturate an i5. And frankly, an i7 isn't that much more expensive. It's on sale today at Newegg for $300. A CPU for gaming can easily last three years, so that's $100/year. Yeeeeeeeeah, the 7700K is a pretty freaking good value. If we want games to keep getting better, we can't expect to play games on max settings on older-generation equipment.

Nope, these are Skylake and Kaby Lake chips we're talking about as well. In the modern age, 4C/4T simply isn't enough for a wide variety of titles.

 

As for the topic of whether or not the i7 7700K is good value compared to previous generations, not really. The difference between an i7 6700K/Z170 motherboard combo and an i7 7700K/Z270 combo is about $50. That's a significant margin for a very minimal performance boost.

 

Nobody can say that the i7 is good value, because in a market dominated by Intel there's no set definition of what "good" pricing is and what it is not.

9 hours ago, JohnT said:

I bet you can't name a single one, bro. And forget video processing... it's not performed by a majority of users. And if they do, they aren't doing it for production purposes like the YouTubers we watch daily. Most computing needs do not greatly benefit from more than 2 cores. Gaming is slowly starting to make use of more than 2 cores. Most applications used in the real world aren't. At this time, more cores allow you to do more things simultaneously. Only under rare circumstances do more cores mean more performance.

So now we're limiting the discussion to the applications that the majority of users run? That's mighty convenient for the sake of your argument.

 

I'm ignoring that first point because there is a market for eight-core CPUs. You do realise that there isn't just a "gamer" market, but also a "workstation" market? People are prepared to pay up to save time. That's why the i7 5960X and i7 6900K exist in the first place and that's why people buy them at their ridiculous price points. Here's just a handful of applications that would benefit from an octacore.

 

-Video encoding

-Rendering (for example, 3D animation)

-Motion design/compositing

-Scientific/financial modelling

-Editing (video, audio or photo)

-Virtual machines

-Compiling programs

 

If I want to go one step further, I can bring up multitasking; streaming and playing a game at the same time without hurting game performance or stream quality is a neat example.

 

Gaming has been using more than just two cores for a very, very long time. Some games don't even boot up on a dual-core Pentium. 4C/8T is where it's at. Hell, there are even games using more than eight threads. In this context, Amdahl's Law isn't the hard ceiling people make it out to be, and the prospect of games starting to utilise even more cores is very realistic. Remember when people said that dual cores were enough for gaming? Or more recently, that a quad-core i5 is as good as you're going to get? It may happen again with 4C/8T i7s.
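For reference, a minimal sketch of Amdahl's Law (the parallel fractions are illustrative assumptions, not measured game data) showing why the payoff from extra cores hinges on how much of the frame work is parallelisable:

# Amdahl's Law: speedup(N) = 1 / ((1 - p) + p / N), p = parallel fraction of the work.
def amdahl_speedup(p, cores):
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.50, 0.80, 0.95):  # illustrative parallel fractions
    row = ", ".join(f"{amdahl_speedup(p, n):.2f}x" for n in (2, 4, 8, 16))
    print(f"p = {p:.0%}: 2/4/8/16 cores -> {row}")

# At p = 50% extra cores top out near 2x, but at p = 95% eight cores already
# give ~5.9x, which is why engines that parallelise more of the frame keep
# scaling past four threads.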

 

Besides, AMD is also offering quadcores and hexacores on top of the octacore flagship. It's not just the enthusiast market they are tackling.

9 hours ago, JohnT said:

I don't know. If you're never top dog, you're never really doing anything new. AMD always markets their innovation at a lower price point, but they aren't doing anything new. They are the budget kings.

AMD could have a GPU that thrashes any current flagship NVIDIA card by double the performance, but if they sold it at a price point of ten grand it'd be DOA because almost nobody could afford it. Again, it's not just about performance but price.

9 hours ago, JohnT said:

Dude the Fury X was a joke from the start. 18 months later and I can obliterate its 4 GB memory buffer with Planet Coaster. You can buy this pop tart brand new for $300. It hasn't even retained 50% of its retail value even though it's the only card with HBM. Get real. The Fury X was never competition.

Call it a joke, but that's the card I (and many others) should be thanking, since it's why I was able to net a GTX 980Ti with the performance of a GTX Titan X Maxwell at a much lower price. It was competitive enough to make NVIDIA preemptively launch the GTX 980Ti, just like the R9 290X was competitive enough to force NVIDIA to cut the prices of their Kepler lineup and then introduce the GTX Titan Black and GTX 780Ti in response. The Fury X was competition, there's no doubt about it.

9 hours ago, JohnT said:

But how can you argue that it isn't because AMD hasn't really had a king in the CPU market for over a decade now? If AMD could provide a super high performing chip, even if its super expensive, Intel will have to compete and eventually the R&D will trickle down. As it stands, if AMD continues to develop products that the competition already provided 2.5 years ago... man... I don't see why Intel would even bother. A ten core chip is outrageous. But Intel also has to price it right so the companies buying server chips won't get too frustrated.

You can say "AMD is only providing performance that came years ago", but you cannot deny that the generational differences between Intel's CPUs are so marginal that it honestly won't matter. In fact, from the way AMD is marketing it, it's supposed to be competing against the i7 6900K, which was only released last year.

 

You say Intel won't bother, but they're already scrambling to get new CPUs out to the market. Kaby Lake-X (the i5 7640K and the i7 7740K) already looks like a flop, to be honest; they've already begun to implement some defensive marketing, and Intel engineers have just recently said that Ryzen is "clearly competitive". If the competition is saying that your product is competitive, it undoubtedly must be. Who'd have thunk it?

9 hours ago, JohnT said:

Yes! This is called competition. This will make Nvidia try harder for the next generation. Without a top dog, there is no competition to take down. Nvidia has to chase its own tail. I respect what you're saying, but yeeeeah, you're kind of getting at the same point. I would love for the RX 490 to destroy the 1080. But if it only "performs about the same as the GTX 1080", in other words a new card that performs just as well as a GPU that's been on the market for 6 to 12 months (depending on the release dates), WTF has AMD been doing since the Fury X?

No, I totally get what you're getting at. Waiting basically a year for AMD to deliver the same amount of performance? Believe me, I share the same sentiment. But again, the two big factors are price and performance. NVIDIA pushes performance, AMD lowers prices. Without Vega, we'd probably still see GTX 1070s in the $350-400 range, GTX 1080s in the $550-600 range and the Titan XP at the $1200 mark, and the GTX 1080Ti wouldn't exist.

 

Also, don't rule out the possibility of AMD outperforming most of their counterparts. Given their track record, eh, I'm inclined to believe there's a good likelihood of that not happening, but given the time they have, it could.

9 hours ago, JohnT said:

Rebranding drivers?

If their "rebranded drivers" are much better than NVIDIA's, then that says a lot about the Green Team's drivers in general, doesn't it?

9 hours ago, JohnT said:

Marketing Zen?

AMD always likes hyping things up. You and I know that.

9 hours ago, JohnT said:

A price war is not true competition. It's just a tactic for a slice of market share.

Hold up, hold up, hold up. A price war is the epitome of competition. The two have to co-exist. You know why basically everyone is clamouring for AMD to release Ryzen and Vega? Because they want competition in the form of lower prices for all. Doesn't matter if you prefer Intel or NVIDIA or AMD, because every sane person is going to want lower prices and the only way to achieve that is... you guessed it, competition. It is true competition, but you just choose to interpret competition as trying to beat the other team's product in terms of performance without any regard to price.

9 hours ago, JohnT said:

Nvidia has a stronger reputation with gamers and a shiny 490 won't change that. Only a top dog product will.

History has proved that wrong. Anytime AMD held the performance crown, they'd be the less popular brand. The one with the mindshare will always win.

 

Tell me, how many people still think AMD has bad drivers despite AMD being extremely successful over the past year or so with Crimson and Crimson ReLive, while NVIDIA has had constant problems time and time again? Or take what started this whole argument in the first place, "they run hot and loud all the time", despite cards such as the XFX RX 480 GTR, which JayzTwoCents demonstrated ran at 64C when overclocked to a whopping 1475MHz (60C when not) and consumed an average of 133W when overclocked (about 100W when not).

9 hours ago, JohnT said:

The RX 490 may get Nvidia to reduce the 1080's price by $100. But then Nvidia is going to pound AMD with the 1080ti at an even higher price point, higher specs, higher performance, and the 490 will make a few great benchmarking videos where it will trade blows with the 6 to 12 month old 1080 and get crushed by the 1080ti. At the end of the day, AMD will have accomplished nothing in terms of advancing performance.

AMD should tackle the GTX 1080Ti/Titan XP tier too; they won't leave it out.

 

They've indirectly advanced performance by forcing NVIDIA to innovate as a result of providing comparable products. Are you starting to see the pattern? NVIDIA releases product, AMD releases comparable product that has a lower price, NVIDIA releases a better product in response. It's quite literally the definition of competition.

 

And again, performance isn't everything.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.

Spoiler

CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


8 hours ago, HKZeroFive said:

As for the topic of whether or not the i7 7700K is good value compared to previous generations, not really. The difference between an i7 6700K/Z170 motherboard combo and an i7 7700K/Z270 combo is about $50. That's a significant margin for a very minimal performance boost.

This argument is questionable, at best. The 6700k and 7700k are not priced differently. Only the motherboards are. It's likely the previous generation has been marked down and that's why you see the $50 difference. It also depends on how much you look around for deals. Back in January 2015, I bought my 5820k on sale for $30 less than the going price of the 4790k at that time. The 7700k was on sale for $50 off MSRP just yesterday. Intel's top-end consumer CPU hasn't changed pricing since Haswell. I cannot disagree that the enthusiast stuff has been all over the place, but really the X99 platform is for a niche market: the newer generation of people who make a living from home (streamers, YouTubers, etc.). These needs do not make up a majority of the market.

 

8 hours ago, HKZeroFive said:

-Video encoding

-Rendering (for example, 3D animation)

-Motion design/compositing

-Scientific/financial modelling

-Editing (video, audio or photo)

-Virtual machines

-Compiling programs

What? Are you aware how insignificant the consumer market is for these uses? I didn't ignore video and rendering because it's convenient; I knew you would try to capitalize on it when it actually hurts your argument if you look at the reality behind it. The reality is, the majority of computer users spend their CPU power on Facebook, Twitter, Amazon shopping, other social media, porn/Netflix (probably a tie), email, and YouTube. Yeeeeeeeeeeah, the things you mentioned are like the top 1 to 5 percent of computer users. And realistically, the people who actually spend time performing those rigorous tasks regularly aren't doing it on a Z170/Z270 or X99. These people/companies are using workstations (as you mentioned) with Xeons and Quadros, which aren't necessarily better (different convo), but yeah, I assure you no IT guy is buying Z or X platforms for their engineers and overclocking them, especially in larger companies where there are thousands of computers. I will say it again, there is little room in the consumer market for more than 2 to 4 cores at this time. In fact, people who discuss CPUs and purchase CPUs and other separate PC components are a niche market all on their own.

 

8 hours ago, HKZeroFive said:

Gaming has been using more than just two cores for a very, very long time. Some games don't even boot up on a dual-core Pentium

Hmmmm... Quad cores have been around for consumer markets since 2008. The first game to refuse to start on a dual-core system was Far Cry 4, unless I am wrong. And I remember that was more of a "soft" restriction, as hyperthreaded CPUs functioned just fine and many were able to hack the game to run on dual cores. It's likely a couple of other games have been released since 2014 with a similar "requirement", but I can't really recall. Quad cores have been around for so long that anybody trying to run a modern AAA title should have one by now. We can't really hold progress back just because console players are moving into the PC gaming realm and don't want to spend more than the price of a PS4 on a system. Sub-$500 computers have never done well for gaming. Cards like the GTX 1050 are making it more of a possibility, but still. This is a minor inconvenience or a stupid mess-up from Ubisoft. I don't even remember this issue coming up with Primal.

 

9 hours ago, HKZeroFive said:

Besides, AMD is also offering quadcores and hexacores on top of the octacore flagship. It's not just the enthusiast market they are tackling

Then who the **** are they expecting to sell this thing to? Facebook, Amazon, Netflix, and Microsoft Word don't benefit from 8 cores!

 

9 hours ago, HKZeroFive said:

AMD could have a GPU that thrashes any current flagship NVIDIA card by double the performance, but if they sold it at a price point of ten grand it'd be DOA because almost nobody could afford it. Again, it's not just about performance but price.

I think we are going to have to agree to disagree. I completely understand your perspective, but as I previously mentioned, I don't think AMD is providing direct competition for Nvidia. The advent of higher-resolution monitors and extreme frame rates is really pushing Nvidia to come out with better products. AMD has yet to respond to the 1070 and 1080. What competitive product does Nvidia have to set in its sights? Nothing. Don't even get me started on the Titan. AMD has never even provided a product that comes close. And yet the Titan continues to sell. Albeit less now that 1080p is still very popular and the 1070 is more than up to the task.

 

9 hours ago, HKZeroFive said:

The Fury X was competition, there's no doubt about it.

It was way too little, and way too late. AMD seems to be making a habit of this as they spend a lot of R&D time rebranding driver names. Yes, Nvidia responded to the Fury, but they didn't really need to.

 

9 hours ago, HKZeroFive said:

... but you cannot deny that the generational differences between Intel's CPUs are so marginal that it honestly won't matter

Because Intel doesn't need to improve. It has nothing to compete against except lower prices and lower performance from AMD. Why improve your product when it doesn't need to be improved? Frame rates aren't being held back by CPU design. Maybe the i7 vs i5 argument... blah blah blah, buy an i7. End of problems. It's a PC. Not a console. In fact, the 6700k did increase gaming performance over Devil's Canyon and Haswell-E. Intel changed the tick-tock thing to the process-architecture-optimization thing. They told us to expect optimization. Many people are able to push their Kaby Lake to 5.0 GHz at similar power levels as previous generations. I don't know what you're expecting Intel to do when (similar to Nvidia) they have nothing to target from the "competition."

 

10 hours ago, HKZeroFive said:

You say Intel won't bother, but they're already scrambling to get new CPUs out to the market. Kaby Lake-X (the i5 7640K and the i7 7740K) already looks like a flop, to be honest; they've already begun to implement some defensive marketing, and Intel engineers have just recently said that Ryzen is "clearly competitive". If the competition is saying that your product is competitive, it undoubtedly must be. Who'd have thunk it?

I got this info way after I posted my previous response. I still don't see the fuss. It seems a little rushed, but we haven't seen anything concrete from AMD. Ryzen has only been benchmarked by AMD and we've heard leaked information. At this point, Ryzen is marketing competition and Intel needs something new to throw out. That's marketing. But it's funny you say it's a flop. Is that maybe because Intel doesn't really see Ryzen as competition in terms of performance?

 

10 hours ago, HKZeroFive said:

If their "rebranded drivers" are much better than NVIDIA's, then that says a lot about the Green Team's drivers in general, doesn't it?

Ehhhhhhhh I don't believe the Crimson drivers are really that impressive. I haven't had the best experience with it. The GeForce drivers are super simple, and the Experience program can add quite a bit. It does a better job of managing driver updates IMO than Crimson. Crimson just puts everything together... sort of.

 

10 hours ago, HKZeroFive said:

Hold up, hold up, hold up. A price war is the epitome of competition. The two have to co-exist. You know why basically everyone is clamouring for AMD to release Ryzen and Vega? Because they want competition in the form of lower prices for all. Doesn't matter if you prefer Intel or NVIDIA or AMD, because every sane person is going to want lower prices and the only way to achieve that is... you guessed it, competition. It is true competition, but you just choose to interpret competition as trying to beat the other team's product in terms of performance without any regard to price.

Yes yessssssss but Intel and Nvidia are not going to respond with cheaper prices. They are going to respond with newer products (as you have already pointed out with the KL-X). Ryzen and Vega do not target the competition's flagship and that's the problem. They target products that have already been on the market and have established their market base. People would be silly to ditch their Haswell i7+ for a Ryzen. People would be stupid to ditch their 1070 or 1080 for a 490. Competition and pricing wars work best when you actually have competing products that are both desirable to consumers. Think Apple and Samsung phones, or Nikon and Canon flagship DSLRs that always try to outdo each other with each release. That's competition! You can't have a pricing war when you haven't released your product (I think you agree with this). But I don't think AMD can be in competition when they never bring anything new to the table.

 

10 hours ago, HKZeroFive said:

Tell me, how many people still think AMD has bad drivers despite AMD being extremely successful over the past year or so with Crimson and Crimson ReLive, while NVIDIA has had constant problems time and time again? Or take what started this whole argument in the first place, "they run hot and loud all the time", despite cards such as the XFX RX 480 GTR, which JayzTwoCents demonstrated ran at 64C when overclocked to a whopping 1475MHz (60C when not) and consumed an average of 133W when overclocked (about 100W when not).

I don't know where you're getting your info from. Crimson was a ****ing nightmare at the beginning. There was so much overlap between the Catalyst interface and Crimson that it was very confusing. I had many other issues with the AMD drivers as well when I tried using my 280 in my HTPC. The issues have all gone away with my Nvidia cards. Let's not get into this. I can't stress how disappointed I was with a gaming GPU not being able to handle a consistent HDMI bitstream. I haven't had any issues with Nvidia drivers since I purchased my 1070 (and subsequently the 1050 to replace the 280 in my HTPC). Why is 1475 MHz impressive? The 1070 runs at 65ish C over 1900 MHz. Even the 460 is confusing. It uses less than 75W but still requires a 6-pin for power. The competition has better products.

 

10 hours ago, HKZeroFive said:

They've indirectly advanced performance by forcing NVIDIA to innovate as a result of providing comparable products. Are you starting to see the pattern? NVIDIA releases product, AMD releases comparable product that has a lower price, NVIDIA releases a better product in response. It's quite literally the definition of competition.

I've been dying through the process for years... just like everyone else has. Like I said before, I don't think AMD is actually pushing Nvidia to do better. I think the market has had a bigger impact on the requirements from top tier cards and only Nvidia has had time to react. AMD is still catching up. See AMD has a big green flag to set its sights on. Nvidia has NOTHING to focus on from AMD. Nvidia has 4k60fps or higher from a single GPU in sight.

 

Yeah, I'd be stoopid to say competition doesn't depend on product price. But AMD is leaving too much time between releases and driving its customers to the competition. (Not to mention my experience with AMD products has been absolutely awful.) There are plenty of websites that discuss market share. I've looked through them and it really seems AMD has a clear gain in share one quarter per year (for discrete PC GPUs). The general split is 70/30 in favor of Nvidia, and it's clear Nvidia is slowly increasing the gap. Which all means that whatever AMD is doing isn't working out too well. I would expect far greater fluctuation in market share, say around 50/50 ±10%, if AMD was actual competition. A consistent 70/30 split year over year means AMD is always second best.

 

As a consumer, I don't want my money going to a company that's always catching up. I want to support a company that actually performs better than its competition. Earn my money, and give me a good functioning product. Support it. Fucking support it. Oh man... my woes...

 

AMD ditched support for my ancient HD 4850. It was working GREAT in my gaming rig on Windows 8. I was still playing Tomb Raider at 60fps on low/medium settings. It worked. I was happy. Then Windows 8.1 arrived and AMD completely ditched support for Windows 8. No more driver updates. Okay, so I updated to Windows 8.1, and then Microsoft pushed the upgrade to Win 10. At which time AMD completely stopped supporting "legacy" products with the last few iterations of Catalyst. And poof, I was no longer able to game or use my GPU anymore. Even on my HTPC, the overscan was so bad and I had no way of adjusting it because there were no drivers. So, based on price alone, in came the 280, which constantly ran at 80C under load with heavy-breathing fans that never stopped even when idling after gaming. Then I stupidly bought another one because of that Crossfire status, and oh yeah, it was cheap. That pair would always turn off my PC during AAA games because apparently 850 watts wasn't enough, smh. **** you, AMD.

 

I can still download drivers for my brother's Geforce 9800 (released in 2008) ranging from Vista to Win10, including Win8. For a 9 year old product.

 

You know I couldn't even sell the piece-of-junk 280 GPUs for $100 a piece? They were $300 MSRP cards. I got one offer of $100 for both. Ridiculous.

 

No more AMD for me. Forget their lack of innovation and competition. Their products just suck.

 

If you really read this entire convo and rant, let me know, I'll send you cookies.


14 hours ago, JohnT said:

This argument is questionable, at best. The 6700k and 7700k are not priced differently. Only the motherboards are. It's likely the previous generation has been marked down and that's why you see the $50 difference. It also depends on how much you look around for deals. Back in January 2015, I bought my 5820k on sale for $30 less than the going price of the 4790k at that time. The 7700k was on sale for $50 off MSRP just yesterday. Intel's top-end consumer CPU hasn't changed pricing since Haswell.

The motherboards still play an integral part in the price. You can't just compare the two CPUs side by side without accounting for the other parts which are needed to have a functioning system.

 

It'd be like me saying the i5 7600 and i5 7600K don't have a large disparity in price when you compare the two of them, but then add the CPU cooler and Z motherboard on top of the more expensive i5 7600K and you have a much more significant price difference.

Quote

I cannot disagree that the enthusiast stuff has been all over the place, but really the X99 platform is for a niche market: the newer generation of people who make a living from home (streamers, YouTubers, etc.). These needs do not make up a majority of the market.

 

What? Are you aware how insignificant the consumer market is for these uses? I didn't ignore video and rendering because it's convenient; I knew you would try to capitalize on it when it actually hurts your argument if you look at the reality behind it. The reality is, the majority of computer users spend their CPU power on Facebook, Twitter, Amazon shopping, other social media, porn/Netflix (probably a tie), email, and YouTube. Yeeeeeeeeeeah, the things you mentioned are like the top 1 to 5 percent of computer users. And realistically, the people who actually spend time performing those rigorous tasks regularly aren't doing it on a Z170/Z270 or X99. These people/companies are using workstations (as you mentioned) with Xeons and Quadros, which aren't necessarily better (different convo), but yeah, I assure you no IT guy is buying Z or X platforms for their engineers and overclocking them, especially in larger companies where there are thousands of computers. I will say it again, there is little room in the consumer market for more than 2 to 4 cores at this time. In fact, people who discuss CPUs and purchase CPUs and other separate PC components are a niche market all on their own.

Then, using all that info where you say the majority of computer users just browse the internet, I can reciprocate the same argument: Intel's consumer quad-core lineup (i5 7600K/i7 7700K) is almost just as niche as a fully-blown eight-core, sixteen-thread CPU. The thing is, even in everyday usage, more threads and more cores will be beneficial... hence why I brought up the point about multitasking. Productivity is a thing nowadays... having more than one application open at the same time without the computer struggling will appeal to a significant number of consumers.

 

Now, you can say that the tasks I've mentioned above make up a minority of users and you're not wrong. But the market exists and that's what both AMD and Intel are targeting. Even so, IPC/single-threaded performance has essentially hit a hard wall... the only way to go up is in core count.

Quote

Hmmmm... Quad cores have been around for consumer markets since 2008. The first game to refuse to start on a dual-core system was Far Cry 4, unless I am wrong. And I remember that was more of a "soft" restriction, as hyperthreaded CPUs functioned just fine and many were able to hack the game to run on dual cores. It's likely a couple of other games have been released since 2014 with a similar "requirement", but I can't really recall. Quad cores have been around for so long that anybody trying to run a modern AAA title should have one by now. We can't really hold progress back just because console players are moving into the PC gaming realm and don't want to spend more than the price of a PS4 on a system. Sub-$500 computers have never done well for gaming. Cards like the GTX 1050 are making it more of a possibility, but still. This is a minor inconvenience or a stupid mess-up from Ubisoft. I don't even remember this issue coming up with Primal.

The thing with strictly dual-core Pentiums is that the amount of stuttering they produce in modern AAA titles (Digital Foundry has a video showcasing the drops in frametimes) is the main reason why most people (the informed ones at least) avoid them altogether and go straight to an i3, an Athlon X4 860K or, more recently, the hyperthreaded Pentium G4560.

 

Office PCs, fine, go with a dual-core Pentium. But for gaming (anything else aside from CS:GO, Dota or LoL)? Forget about it.

Quote

Then who the **** are they expecting to sell this thing to? Facebook, Amazon, Netflix, and Microsoft Word don't benefit from 8 cores!

You can use the same argument for i5s or even i3s. Why drop $200-ish just to browse the internet and watch some videos? Because people are running other applications as well.

 

Look at it this way. Why does Intel even have eight-core, ten-core CPUs? Because there's a market for it and they're buying those CPUs up. If it wasn't profitable, I'm sure Intel would have eliminated that lineup altogether. AMD sees the same in that they can make money from it. No matter how niche the market is, both Intel and AMD recognise that it exists and want to make a buck from it. And if AMD has their prices low enough (recent leaks have the octacore close to the price of the i7 7700K), they might shift the market towards favouring CPUs with more cores. You've forgotten that it'd likely be a marginal difference in terms of single-threaded performance whilst being a massive step up in terms of multi-threaded performance.

 

If you're that concerned with AMD not catering to the "majority" then there's their quad cores with SMT which will undoubtedly be priced competitively.

Quote

I think we are going to have to agree to disagree. I completely understand your perspective, but as I previously mentioned, I don't think AMD is providing direct competition for Nvidia. The advent of higher-resolution monitors and extreme frame rates is really pushing Nvidia to come out with better products. AMD has yet to respond to the 1070 and 1080. What competitive product does Nvidia have to set in its sights? Nothing. Don't even get me started on the Titan. AMD has never even provided a product that comes close. And yet the Titan continues to sell. Albeit less now that 1080p is still very popular and the 1070 is more than up to the task.

AMD will respond with Vega. They may be late as fuck but they will compete.

 

The Titan continues to sell because it's the best of the best and apparently there's a significant number of people who have a lot of money to burn. But guess what's going to happen when AMD releases Vega? The GTX 1080Ti will inevitably be released at a much lower cost with roughly the same amount of performance, and the Titan XP (just like any other Titan before it) will be made irrelevant.

Quote

It was way too little, and way too late. AMD seems to be making a habit of this as they spend a lot of R&D time rebranding driver names. Yes, Nvidia responded to the Fury, but they didn't really need to.

So you think that NVIDIA would let AMD rule the high-end market alone while still having their Titan priced at $1000? Not a chance.

 

As for drivers, you can definitely see it pay off. AMD has received constant praise for their driver quality whereas NVIDIA has received constant criticism.

Quote

Because Intel doesn't need to improve. It has nothing to compete against except lower prices and lower performance from AMD. Why improve your product when it doesn't need to be improved? Frame rates aren't being held back by CPU design. Maybe the i7 vs i5 argument... blah blah blah, buy an i7. End of problems. It's a PC. Not a console. In fact, the 6700k did increase gaming performance over Devil's Canyon and Haswell-E. Intel changed the tick-tock thing to the process-architecture-optimization thing. They told us to expect optimization. Many people are able to push their Kaby Lake to 5.0 GHz at similar power levels as previous generations. I don't know what you're expecting Intel to do when (similar to Nvidia) they have nothing to target from the "competition."

Again, performance isn't the only factor when it comes to competition!

 

"Why improve your product when it doesn't need to be improved?" Yes, because that's what every consumer wants to hear. You can achieve higher clocks on Kaby Lake than on Skylake! That's innovation right there whereas AMD delivering multi-core CPUs for a much lower price isn't. See the conflicting logic?

Quote

I got this info way after I posted my previous response. I still don't see the fuss. It seems a little rushed, but we haven't seen anything concrete from AMD. Ryzen has only been benchmarked by AMD and we've heard leaked information. At this point, Ryzen is marketing competition and Intel needs something new to throw out. That's marketing. But it's funny you say it's a flop. Is that maybe because Intel doesn't really see Ryzen as competition in terms of performance?

Or maybe it's because Intel is rushing to get their new CPUs out in anticipation of Ryzen? That's the more likely answer. As far as I can see, Kaby Lake-X offers nothing noteworthy except for maybe unifying the consumer and enthusiast lineups on a single X-series platform... which is awfully similar to what AMD is doing with the AM4 platform. I wonder why?

 

"Is that maybe because Intel doesn't really see Ryzen as competition in terms of performance?" Yet Intel's own engineers explicitly stated that it was "clearly competitive". Sorry, but I fail to see the logic in your statement.

Quote

Ehhhhhhhh I don't believe the Crimson drivers are really that impressive. I haven't had the best experience with it. The GeForce drivers are super simple, and the Experience program can add quite a bit. It does a better job of managing driver updates IMO than Crimson. Crimson just puts everything together... sort of.

GeForce Experience 3.0 is probably one of the most unfortunate things I've had to deal with on the NVIDIA side. The UI is intrusive, I've had FPS problems and the recommended game settings are not helpful at all... but my biggest gripe with GFE is that I have to fucking log in. So I just uninstalled it. Forgive me for saying that Crimson ReLive looks intriguing, because my brother doesn't have the same problems that I do.

Quote

Yes yessssssss but Intel and Nvidia are not going to respond with cheaper prices. They are going to respond with newer products (as you have already pointed out with the KL-X). Ryzen and Vega do not target the competition's flagship and that's the problem. They target products that have already been on the market and have established their market base. People would be silly to ditch their Haswell i7+ for a Ryzen. People would be stupid to ditch their 1070 or 1080 for a 490. Competition and pricing wars work best when you actually have competing products that are both desirable to consumers. Think Apple and Samsung phones, or Nikon and Canon flagship DSLRs that always try to outdo each other with each release. That's competition! You can't have a pricing war when you haven't released your product (I think you agree with this). But I don't think AMD can be in competition when they never bring anything new to the table.

"Yes yessssssss but Intel and Nvidia are not going to respond with cheaper prices. They are going to respond with newer products (as you have already pointed with out the KL-X)." Which is about the dumbest thing they could do in regards to responding to Ryzen... they should have just lowered prices. I've already stated why Kaby Lake-X looks like a flop because it doesn't bring anything new to the table. It also:

 

1) undermines the recent purchases of customers (especially those who just bought Kaby Lake).

2) throws the whole Intel lineup in chaos.

 

"People would be stupid to ditch their 1070 or 1080 for a 490." I can think of many reasons why they would. For instance, FreeSync. It's a game-changer in terms of your gaming experience. Believe or not, there are a good amount of people who prefer a R9 Fury with a FreeSync monitor as opposed to a GTX 1070 without an adaptive sync monitor. 

Quote

Why is 1475 MHz impressive? The 1070 runs at 65ish C over 1900 MHz. Even the 460 is confusing. It uses less than 75W but still requires a 6-pin for power. The competition has better products.

I'm surprised that you didn't know that you should never, ever compare the clocks of two completely different architectures. 1475MHz on a RX 480 is impressive in the context of RX 480 clocks. Achieving 1900MHz on a GTX 1070 is irrelevant.
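A minimal way to see it, with purely hypothetical per-clock numbers (nothing here is measured data, it only illustrates the normalisation):

# Clock speed only matters relative to how much work the architecture does per
# clock. The work_per_clock values are made-up placeholders, not real figures.
def relative_performance(clock_mhz, work_per_clock):
    return clock_mhz * work_per_clock

rx_480   = relative_performance(1475, 1.00)  # hypothetical baseline work per clock
gtx_1070 = relative_performance(1900, 1.10)  # hypothetical different work per clock
print(f"RX 480   @ 1475 MHz -> {rx_480:.0f} arbitrary units")
print(f"GTX 1070 @ 1900 MHz -> {gtx_1070:.0f} arbitrary units")
# The MHz figures alone tell you nothing until each is multiplied by its own
# architecture's per-clock throughput, which is why cross-architecture clock
# comparisons are meaningless.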

 

There are RX 460s that do not require a PCIe power connector and there are some that do (like the GTX 1050 FTW).

 

As for the competition having the better products... the RX 470 is significantly better than the GTX 1050Ti while not costing much more at all (you can find them for $130 after rebate). The RX 480 4/8GB trades blows with the GTX 1060 6GB while usually being much cheaper. The RX 460 is a bit of a bad joke if I'm going to be honest, so let's just leave that out and call the GTX 1050 the winner.

Quote

I've been dying through the process for years... just like everyone else has. Like I said before, I don't think AMD is actually pushing Nvidia to do better. I think the market has had a bigger impact on the requirements from top tier cards and only Nvidia has had time to react. AMD is still catching up. See AMD has a big green flag to set its sights on. Nvidia has NOTHING to focus on from AMD. Nvidia has 4k60fps or higher from a single GPU in sight.

Well, I honestly think AMD is doing well enough to keep NVIDIA from slacking off. The thing is, AMD is actually competent when it comes to GPUs, to the extent that NVIDIA can't just create a monopoly straight up. Their RX series has turned out to be somewhat decent. And remember, Volta is still a good way off. It's not like NVIDIA will counter AMD immediately with a brand new architecture.

Quote

 

Yeah I'd be stoopid to say competition doesn't depend on product price. But AMD is leaving too much time and alienating its customers to the competition. (Not to mention my experience with AMD products has been absolutely awful). There's plenty of websites that discuss market share. I've looked through them and it really seems AMD has a clear gain in shares one quarter per year (for discrete PC GPUs). The general split is 70/30 in favor of Nvidia, and it's clear Nvidia is slowly increasing the gap. Which all means that whatever AMD is doing isn't working out too well. I would expect far greater fluctuation in market share, say around 50/50+10%, if AMD was actual competition. A consistent 70/30 split year over year means AMD is always second best.

70/30? That actually says that AMD has a growing market share (and every other source I've read suggests exactly the same), since I remember it being more like 80/20.

 

And market share doesn't come from pure performance. It's about setting a good price, having good performance, and convincing consumers to buy your product because of those two key factors.

Quote

As a consumer, I don't want my money going to a company that's always catching up. I want to support a company that actually performs better than its competition. Earn my money, and give me a good functioning product. Support it. Fucking support it. Oh man... my woes...

Well, I don't disagree with you there. I bought the GTX 980Ti for the same reason because the Fury X was a good bit slower (but it's still competition!).

 

That should be the motto for every purchase. I don't like fanboying, if you haven't noticed by now :P

Quote

AMD ditched support for my ancient HD 4850. It was working GREAT in my gaming rig on Windows 8. I was still play Tomb Raider at 60fps on low/medium settings. It worked. I was happy. Then Windows 8.1 arrived and AMD completely ditched support for Windows 8. No more driver updates. Okay update to Windows 8.1, which then Microsoft pushed upgrades to Win 10. At which time AMD completely stopped supporting "legacy" products with the last few iterations of Catalyst. And poof I was no longer able to game or use my GPU anymore. Even on my HTPC, the overscan was so bad and I had no way of adjusting it because there were no drivers. So, based on price alone, in came the 280 that constantly ran at 80C under load with heavy breathing fans that never stopped even when idling after gaming. Then I stupidly bought another one because of that crossfire status, and oh yeah it was cheap. Which always turned off my PC during AAA games because apparently 850watts wasn't enough smh. **** you, AMD.

Quite the moving story. The HD 4850 was a fine card for its time, but it's unfortunate that AMD stopped supporting it. Although I have some serious doubts that an 850W power supply cannot run two R9 280s in Crossfire. I suspect it was something else.

Quote

I can still download drivers for my brother's Geforce 9800 (released in 2008) ranging from Vista to Win10, including Win8. For a 9 year old product.

Well, I don't disagree; AMD shouldn't have stopped supporting that HD 4850.

Quote

You know I couldn't even sell the piece of junks 280 GPUs for $100 a piece? They were $300 MSRP cards. I got one offer for $100 for both. Ridiculous.

The GTX 980Ti (the very card I own) had a $650 MSRP and now they're worth about $300-ish. The R9 290X had a $550 MSRP and now they're worth below $200. I'd hate to see the price depreciation of any Titan card. I see why you're mad but it happens to basically any card.
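Quick back-of-the-envelope arithmetic on that, using the rough street prices mentioned in this thread (so treat the percentages as ballpark only):

# Rough resale-value retention based on the approximate prices quoted above.
cards = {"GTX 980 Ti": (650, 300), "R9 290X": (550, 200), "R9 Fury X": (650, 300)}
for name, (msrp, street) in cards.items():
    print(f"{name}: ${street} used vs ${msrp} MSRP -> ~{street / msrp:.0%} retained")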

Quote

No more AMD for me. Forget their lack of innovation and competition. Their products just suck.

"That's just like your opinion, man".

Quote

If you really read this entire convo and rant, let me know, I'll send you cookies.

Please do.  :P


 


On 2/9/2017 at 1:28 AM, HKZeroFive said:

The motherboards still play an integral part in the price.

It'd be like me saying the i5 7600 and i5 7600K don't have a large disparity in price

Ehhhhh but different motherboards have different features. Z270 is pushing USB3.1 and RGB like crazy. You're right about the k-variants having an inherently higher price tag, but technically you shouldn't buy an unlocked chip if you aren't going to OC. In theory you can get somewhere between 10 to 40% more clock speed from an unlocked chip. That doesn't necessarily translate to 10 to 40% more performance, but yeah it's all a bit gimmicky for common uses. And oh yeah it voids your warranty 9_9
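A rough sketch of why the clock gain and the performance gain diverge; the scaling factors are illustrative assumptions about how clock-sensitive a workload is, not measurements:

# A clock bump rarely translates 1:1 into performance; scaling < 1.0 stands in
# for workloads that are partly memory- or GPU-bound (values are assumptions).
def scaled_performance(clock_gain, scaling):
    return 1.0 + clock_gain * scaling

for clock_gain in (0.10, 0.25, 0.40):      # 10% / 25% / 40% overclock
    for scaling in (1.0, 0.6, 0.3):        # how clock-sensitive the workload is
        gain = (scaled_performance(clock_gain, scaling) - 1.0) * 100
        print(f"+{clock_gain:.0%} clock, scaling {scaling:.1f} -> ~{gain:.0f}% faster")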

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Then, using all that info where you say the majority of computer users just browse the internet, I can reciprocate the same argument: Intel's consumer quad-core lineup (i5 7600K/i7 7700K) is almost just as niche as a fully-blown eight-core, sixteen-thread CPU.

I couldn't agree with you more. Quad-core chips and unlocked chips and hyperthreading and QPI and quad-channel memory are meaningless to a majority of the market. These features are typically sought after by gamers, home-based productivity users, home-based video editors, general nerds, etc. No grandma or typical, un-nerdy college student is going to walk into their local electronics store and demand a quad-core PC, specifically an unlocked variant. You could reciprocate the same argument... it is valid. Quad core is a niche market. It's a significantly larger market than 8-core chips, and there might very likely be some crossover between the markets.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Office PCs, fine, go with a dual-core Pentium. But for gaming (anything else aside from CS:GO, Dota or LoL)? Forget about it.

Yes, agree!

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

You can use the same argument for i5s or even i3s. Why drop $200-ish just to browse the internet and watch some videos? Because people are running other applications as well.

 

Look at it this way. Why does Intel even have eight-core, ten-core CPUs? Because there's a market for it and they're buying those CPUs up. If it wasn't profitable, I'm sure Intel would have eliminated that lineup altogether. AMD sees the same in that they can make money from it. No matter how niche the market is, both Intel and AMD recognise that it exists and want to make a buck from it. And if AMD has their prices low enough (recent leaks have the octacore close to the price of the i7 7700K), they might shift the market towards favouring CPUs with more cores. You've forgotten that it'd likely be a marginal difference in terms of single-threaded performance whilst being a massive step up in terms of multi-threaded performance.

 

If you're that concerned with AMD not catering to the "majority" then there's their quad cores with SMT which will undoubtedly be priced competitively.

There is no question that phones, tablets, and cheapo laptops are absolutely dominating the entire market. And TBH I've always questioned the need for such a large range of products from Intel. Sometimes they refresh the enthusiast grade, sometimes just the ULV stuff. Why do they need seven X99 chips spanning two generations? Is KL-X on X99 also? Add two more! Why do they need ~15 different SKUs for consumer mobile products? Only Intel and Anandtech pretend to know.

 

There is definitely a market for higher core CPUs. But it's difficult to gauge. Intel's marketing definitely creates the illusion of needing more and more cores, but I don't know if regular joes can justify more than 4 cores. It's my opinion that Intel is targeting the streamers and youtubers who are playing games, recording the game, recording their webcams, and encoding videos all at the same time. Like Tmartn... he posts like 4 to 5 videos A DAY. Most of the serious streamers I've seen are using multiple computers to do all this. I can see how a 10-core monster with virtualization can help them. I mean we are also seeing larger PC cases that can house two computers. An ATX and a mini/micro ATX. But honestly whether you go for the 10-core CPU or the $1,000 cases, the people who need this much power are being slammed with high costs. If you're occasionally encoding a video maybe once or twice a month, yeahhhh probably an 8 or 10-core is hard to justify.

 

I also don't see how AMD is going to fix or cheapen any of this. I mean, the FX9590 stacks up nicely against the 2600k. But the FX9590 came out in mid 2013 and has a 220W TDP.

 

I'm concerned that AMD is not providing exciting enough products to really appear as a threat to Intel (or Nvidia). Consider the words "efficiency" or "TDP." Yeah. You absolutely have to cross AMD off your list.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

AMD will respond with Vega. They may be late as fuck but they will compete.

Still waiting ;)

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

The Titan continues to sell because it's the best of the best and apparently there's a significant number of people who have a lot of money to burn. But guess what's going to happen when AMD releases Vega? The GTX 1080Ti will inevitably be released at a much lower cost with roughly the same amount of performance, and the Titan XP (just like any other Titan before it) will be made irrelevant.

You make two excellent points here. First point: people have money to burn. I say nay, people are likely racking up their credit cards. Only DINKs can really afford to live debt free. Second point: You're right. The 1080ti will be released (and still hold the crown over Vega). But at this rate the most recent Titan X will be 6 to 12 months old when the Vega is released. The expectation is that older technology naturally becomes less expensive over time.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

"Why improve your product when it doesn't need to be improved?" Yes, because that's what every consumer wants to hear. You can achieve higher clocks on Kaby Lake than on Skylake! That's innovation right there whereas AMD delivering multi-core CPUs for a much lower price isn't. See the conflicting logic?

But the extra cores are dead weight. Intel releases higher performing products with fewer cores and less power draw. I'm missing your point. AMD isn't providing a competitive product. It's just providing a product with meaningless specs.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Or maybe it's because Intel is rushing to get their new CPUs out in anticipation of Ryzen? That's the more likely answer. As far as I can see, Kaby Lake-X offers nothing noteworthy except for maybe unifying the consumer and enthusiast lineups on a single X-series platform... which is awfully similar to what AMD is doing with the AM4 platform. I wonder why?

 

"Is that maybe because Intel doesn't really see Ryzen as competition in terms of performance?" Yet Intel's own engineers explicitly stated that it was "clearly competitive". Sorry, but I fail to see the logic in your statement.

Or maybe there is no compelling reason to discard a 3-year-old platform because the competition isn't doing anything better. I don't care what Intel's office of public affairs pretends their engineers say. I saw the leaked Ryzen pricing. It's not quite as low as I thought it would be. I am curious to see how it stacks up against Intel's chips. AMD has had a fantastic boost strategy where they pretend their mid-range chips can really trade blows with Intel's higher-end consumer market (recall the Athlon naming convention). I'm excited to see the benchmarks. But seeing as these are AMD's "we are catching up" products, it's probably going to be less exciting than I hope.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

GeForce Experience 3.0 is probably one of the most unfortunate things I've had to deal with on the NVIDIA side. The UI is intrusive, I've had FPS problems and the recommended game settings are not helpful at all... but my biggest gripe with GFE is that I have to fucking log in. So I just uninstalled it. Forgive me for saying that Crimson ReLive looks intriguing, because my brother doesn't have the same problems that I do.

Fair points. I'm not quite as annoyed with Nvidia software just yet. Like I said Nvidia does a better job at managing updates through GFE. But I do have to agree the recommended settings are pure shit. It recommended turning RotTR to medium so I can turn up the resolution using DSR. No.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

1) undermines the recent purchases of customers (especially those who just bought Kaby Lake).

2) throws the whole Intel lineup in chaos.

 

"People would be stupid to ditch their 1070 or 1080 for a 490." I can think of many reasons why they would. For instance, FreeSync. It's a game-changer in terms of your gaming experience. Believe or not, there are a good amount of people who prefer a R9 Fury with a FreeSync monitor as opposed to a GTX 1070 without an adaptive sync monitor. 

1) and 2) I agree. It's debatable whether two more chips will throw the already chaotic lineup into more chaos. I'm under the impression Intel's naming structure was designed to make the consumers with the lower numbers feel inferior. It's maddening but I've gotten over it.

 

Yes. Okay. FreeSync is a great thing. I've never tried FreeSync. But I have sipped the G-Sync and high-refresh-rate kool-aid and it's delicious. I can never play on a regular monitor again. I have no comparison to judge the differences between FreeSync and G-Sync. This is truly the weakest point of Nvidia's GPUs. Unfortunately for your argument, Nvidia's market share says a gamer is more likely to have an Nvidia GPU and potentially a G-Sync monitor over FreeSync (given the circumstances).

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

I'm surprised that you didn't know that you should never, ever compare the clocks of two completely different architectures

I can't know everything, bro... I can't contest anything else you say here.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Well, I honestly think AMD is doing well enough to keep NVIDIA from slacking off. The thing is, AMD is actually competent when it comes to GPUs, to the extent that NVIDIA can't just create a monopoly straight up. Their RX series has turned out to be somewhat decent. And remember, Volta is still a good way off. It's not like NVIDIA will counter AMD immediately with a brand new architecture.

I'm like 50/50 on all this. Volta is surprisingly far out... It's pretty exciting that they are targeting 7nm.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

70/30? That actually says that AMD has a growing market share (and every other source I've read suggests exactly the same), since I remember it being more like 80/20.

 

And market share doesn't come from pure performance. It's about setting a good price, having good performance, and convincing consumers to buy your product because of those two key factors.

The market share is quite variable. I don't know man. Nvidia has 70 to 80% of the market, higher prices, better performance, and they can convince more people to buy their products. How is AMD competing again??

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

I don't like fanboying, if you haven't noticed by now :P

Haha I've noticed maybe a little bit? I'm not trying to fanboy Intel or Nvidia. I'm just boycotting AMD

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Although I have some serious doubts that an 850W power supply cannot run two R9 280s in Crossfire. I suspect it was something else.

Yeah, I agree. I don't think it was my PSU, because it would continue to function just fine. What throws a wrench into that theory is that the problems would go away when I disabled Crossfire. I don't know what else it would have been. I'm using the same setup with the GTX 1070 (haven't even formatted the SSD) and it's functioning without issues.

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

I see why you're mad but it happens to basically any card

Because I was able to sell my TV stand for 2/3 of what I paid for it. I didn't get one reasonable offer for my GPUs! haha sometimes I just rant

 

On 2/9/2017 at 1:28 AM, HKZeroFive said:

Please do.  :P

Shoot me your fav type and a PO box

 

 

Did you see they leaked Ryzen prices??? BENCHMARKS!!


What is up with the shit show that happens when anyone asks this question? A question with one answer.

 

The 1070 and 980ti trade blows. Get the 1070 if they're the same price, the 980ti if it's cheaper.

 

 

If anyone asks you never saw me.


On 2/16/2017 at 3:12 AM, JohnT said:

And oh yeah it voids your warranty 9_9

Shouldn't be a worry if you know what you're doing.

Quote

But honestly whether you go for the 10-core CPU or the $1,000 cases, the people who need this much power are being slammed with high costs. If you're occasionally encoding a video maybe once or twice a month, yeahhhh probably an 8 or 10-core is hard to justify.

In what sense are they being slammed with "high costs"? If it's Intel's pricing, I can totally understand that, but if AMD's Ryzen pricing turns out to be remotely close to what the rumour mill is saying and the single-core performance is decent, we're looking at a very affordable octacore. $1700 is very different from $300-500.

Quote

I also don't see how AMD is going to fix or cheapen any of this. I mean, the FX 9590 stacks up nicely against the 2600K. But the FX 9590 came out in mid-2013 and has a 220W TDP.

Eh, the FX 9590 was a joke of a CPU that could catch fire if you didn't pair it with a decent motherboard, and it launched with a ridiculous $1000 price tag. I'm going to have to disagree with you when you say it stacks up against the i7 2600K. It doesn't at all.

 

But... Ryzen (judging from slides presented by AMD themselves) is completely different from the Bulldozer architecture and is more similar to Intel's. It's a blank slate for AMD. And let's not forget, after their long absence from the high-end CPU market, it's relatively easy to bring the TDP down, especially when it was ridiculously high in the first place.

Quote

I'm concerned that AMD is not providing exciting enough products to really appear as a threat to Intel (or Nvidia). Consider the words "efficiency" or "TDP." Yeah. You absolutely have to cross AMD off your list.

In their more modern products, AMD isn't quite as efficient as their counterparts (although Polaris, which includes the RX 460, 470 and 480, is quite promising for their future endeavours). But for the average consumer (as opposed to large companies with arrays of servers), efficiency will probably be a minor selling point, as price and performance usually come first.

Quote

But the extra cores are dead weight. Intel releases higher performing products with fewer cores and less power draw. I'm missing your point. AMD isn't providing a competitive product. It's just providing a product with meaningless specs.

Higher performing products in what sense? Sure, single-threaded performance-wise Intel should be better, but I'm betting that the difference is going to be marginal at best. Multithreaded performance is probably going to see a much wider gap between the two.

 

And given all this stuff where people think futureproofing is now a thing, if I had to choose between a 4C/8T CPU and an 8C/16T CPU at the same price, I would go for the latter. And I'm willing to bet many others would as well.
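
For what it's worth, the "more cores" question really comes down to how parallel your workload is. Here's a minimal Amdahl's law sketch in Python; the parallel fractions are made-up examples for illustration, not benchmarks of any real application:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallel fraction of the work and n is the thread count.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.8, 0.95):                    # hypothetical parallel fractions
    s8, s16 = speedup(p, 8), speedup(p, 16)   # 4C/8T vs 8C/16T, roughly
    print(f"p={p:.2f}: 8 threads -> {s8:.2f}x, 16 threads -> {s16:.2f}x")

Roughly: at p = 0.5 the extra cores buy you almost nothing, while at p = 0.95 the 16-thread part is around 40% faster, which is why it matters so much whether games and applications actually scale.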

Quote

Or maybe there is no compelling reason to discard a three-year-old platform because the competition isn't doing anything better. I don't care what Intel's office of public affairs pretends their engineers say. I saw the leaked pricing for Ryzen. It's not quite as low as I thought it would be. I am curious to see how it stacks up against Intel's chips. AMD has had a fantastic boost strategy where they pretend their mid-range chips can really trade blows with Intel's higher-end consumer parts (recall the Athlon naming convention). I'm excited to see the benchmarks. But seeing as this was AMD's "we are catching up" product, it's probably going to be less exciting than I hope.

It's significantly lower than I thought it would be. Given that the supposed competitor of the top Ryzen CPU is about $1000, I would have thought a price tag of maybe $700 would make people consider getting AMD instead of Intel. Apparently, they went even lower, and now you see the internet raving about it.

 

You can say AMD is effectively "catching up", but single-threaded performance has hit a hard wall for Intel themselves. Given the lack of any real per-core improvement (although it could be argued that a lack of competition has also exacerbated this), it would be absurd to even think that AMD would blow away all expectations and destroy Intel in performance. Even Intel can't do that. That's why more cores are becoming the thing, because that's essentially the only way CPUs can "go up". There are rumours of Intel introducing mainstream hexacores in the future for a reason.

Quote

Unfortunately for your argument, Nvidia's market share says a gamer is more likely to have an Nvidia GPU and potentially a Gsync monitor over FreeSync (given the circumstances)

More likely to have a G-Sync monitor over a FreeSync one? I highly doubt that. If someone was prepared to pay an extra $200-300 or so for a G-Sync monitor over a FreeSync one just because they went with an NVIDIA card, that's the definition of insanity (or rabid fanboyism; take your pick).

 

The price difference between FreeSync and G-Sync monitors is enough to warrant a consideration of a different GPU altogether and the informed consumer knows that. That's why in every single AMD vs NVIDIA argument (especially in the RX 480 vs GTX 1060 ones), FreeSync has become such an effective point because it holds actual weight.
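
Rough napkin math on the platform-cost point, just to make it concrete. All prices below are placeholders I'm assuming for the sake of the example, not actual quotes:

# Hypothetical total platform cost: GPU + adaptive-sync monitor (all prices assumed).
rx_480_plus_freesync = 240 + 250    # e.g. RX 480 plus a 1440p FreeSync panel
gtx_1060_plus_gsync  = 260 + 450    # e.g. GTX 1060 plus a comparable G-Sync panel
difference = gtx_1060_plus_gsync - rx_480_plus_freesync
print(f"AMD platform:    ${rx_480_plus_freesync}")
print(f"NVIDIA platform: ${gtx_1060_plus_gsync}")
print(f"Difference:      ${difference}")

Even if the GPUs themselves are within $20-30 of each other, the monitor premium is where the gap opens up, which is the whole point being made above.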

Quote

The market share is quite variable. I don't know man. Nvidia has 70 to 80% of the market, higher prices, better performance, and they can convince more people to buy their products. How is AMD competing again??

By targeting the lower end portion of the GPU market where the majority of consumers reside. Hard to say if the strategy has paid off or not, but the RX 400 cards in general have been very competitive.

 

But according to multiple sources, it's basically a fact at this point that AMD is indeed gaining market share when it comes to GPUs.

Quote

Haha I've noticed maybe a little bit? I'm not trying to fanboy Intel or Nvidia. I'm just boycotting AMD

You share the same mindset as one particular forum member :P

Quote

Did you see they leaked Ryzen prices??? BENCHMARKS!!

Let's wait and see. Reviews are supposedly coming out on the 28th.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


23 minutes ago, XxXxXXx_Evan_xXXxXxX said:

-snip-

OP price

1070 = £390
980 Ti = £370

So can you modify that value graph?
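
If anyone wants to redo that value graph themselves, perf-per-pound is just relative performance divided by price. A quick Python sketch using the OP prices above; the performance index is my own assumption (the two cards trade blows, so call them roughly equal), not a benchmark result:

# Value = relative performance / price. Performance indices are assumptions, not benchmarks.
cards = {
    "GTX 1070":   {"price_gbp": 390, "perf_index": 100},
    "GTX 980 Ti": {"price_gbp": 370, "perf_index": 97},
}
for name, card in cards.items():
    value = card["perf_index"] / card["price_gbp"]
    print(f"{name}: {value:.3f} perf per GBP")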

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


2 weeks later...
On 2/18/2017 at 9:36 AM, xAcid9 said:

OP price

1070 = £390
980 Ti = £370

So can you modify that value graph?

[image: scores.png — updated price/performance chart]


36 minutes ago, XxXxXXx_Evan_xXXxXxX said:

-snip- 

Wrong price and that value/performance graph...

 


 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


When talking about a 980 Ti, why are we talking new? Get a used one; my Hybrid cost $300 US.

If anyone asks you never saw me.


Just get a 1070 man, lower TDP and newer tech with more driver support. 


2 minutes ago, GreenToxon said:

Just get a 1070 man, lower TDP and newer tech with more driver support. 

Pascal is just a shrunken Maxwell, driver support is fine. 

If anyone asks you never saw me.


Want my advice? If you're willing to look at Craigslist or /r/hardwareswap, you could get a pretty good deal on a 980 Ti. For example, I bought two Zotac reference-style cards for only $500! I hope this helped, but if you already bought the 1070 (which you might have, because this thread is 20 days old), just stick with that!


On 2/17/2017 at 5:09 AM, HKZeroFive said:

I'm going to have to disagree with you when you say it stacks up against the i7 2600K. It doesn't at all.

It just does, bro: http://cpuboss.com/cpus/Intel-Core-i7-2600K-vs-AMD-FX-9590 There is no clear winner between the 9590 and the 2600K (regardless of the website's scoring, I'm looking at the benchmark numbers).

 

On 2/17/2017 at 5:09 AM, HKZeroFive said:

efficiency will probably be a minor selling point as price and performance usually come first.

Serious??

 

On 2/17/2017 at 5:09 AM, HKZeroFive said:

The price difference between FreeSync and G-Sync monitors is enough to warrant a consideration of a different GPU altogether and the informed consumer knows that. That's why in every single AMD vs NVIDIA argument (especially in the RX 480 vs GTX 1060 ones), FreeSync has become such an effective point because it holds actual weight.

Okay. The argument holds weight like it would in a textbook... it's logical. But in reality, nothing works the way it's written in textbooks. Look at this data: http://store.steampowered.com/hwsurvey/videocard/?sort=pct The RX 480's share of Steam users doubled between October and February. The 1070 also doubled. The 1060 tripled, which is harder to do because it already had a much larger user base than the 480. I have no clue who's buying AMD cards for FreeSync. It's a great argument... but nobody believes it ;) Now, I'm not saying people are buying Nvidia cards for G-Sync, but they sure aren't buying AMD for FreeSync.
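
To be fair to both readings of that survey data, growth multiples and absolute share tell different stories. A tiny sketch with made-up numbers (not the actual Steam figures) just to show why a small base can double and still gain less ground:

# Illustrative only: hypothetical share of surveyed users, in percent.
oct_share = {"RX 480": 0.3, "GTX 1060": 1.5}
feb_share = {"RX 480": 0.6, "GTX 1060": 4.5}
for card in oct_share:
    multiple = feb_share[card] / oct_share[card]
    gain = feb_share[card] - oct_share[card]
    print(f"{card}: {multiple:.1f}x growth, +{gain:.1f} percentage points")

In this made-up case the 480 doubles but only picks up 0.3 points, while the 1060 triples off a much larger base, which is exactly the "harder to do" point above.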

 

In all honesty, I don't think gamers understand how valuable adaptive refresh rates are for gaming. I'm not sure if many are taking the technology into consideration. Not many people want to drop tons of money on a new monitor after shelling out for a video card. In this regard, I do believe AMD would have a better selling point against Nvidia. But I've never done a comparison between them so I can't comment on FreeSync directly. I did sip the Gsync kool-aid and the technology is absolutely amazing. 

 

On 2/17/2017 at 5:09 AM, HKZeroFive said:

You share the same mindset as one particular forum member :P

Is this good or bad? 

 

On 2/17/2017 at 5:09 AM, HKZeroFive said:

Let's wait and see. Reviews are supposedly coming out on the 28th.

From what I hear, today is reckoning day!!! My Youtube subscriptions are blowing up. Let's see if AMD can cause some trouble. Time to watch some videos...


I would definitely get the 1070, but if you have some extra cash, get the 1080 or the 1080 Ti when it comes out.

