
WangKe

Member
  • Posts: 109

  1. 390X it is, then. If it's still around $50-$100 cheaper, I'd take the 390X over the 980. If I had a choice between 30 fps for $400-500 or 29-point-something fps for $400-450, I'd take the 29 fps and save the money; I could buy a game or two (or three, depending on sales) with what's left over. (The rough cost-per-frame math is sketched after this list.)
  2. Am I missing something? Didn't the 390X beat it in sheer performance? Or did Nvidia release a new driver that puts the 980 back ahead of the 390X?
  3. 1.) What do you mean, three different architectures? If you mean Pascal along with Kepler and Maxwell, why neglect Kepler while Maxwell is still around? Pascal ain't even out yet. I'm not saying Fermi should be treated better, nor should it even be included in the "3 different architectures" thing, but come on... Kepler shouldn't be trailing Maxwell by that much. 2.) Again, tell me how this is possible. It looks like AMD cares even "without the budget". 3.) Maxwell's barely gonna be two years old, and it's gonna get the same treatment. Does Nvidia expect everyone to upgrade from a 980 Ti to a 1080 Ti or non-Ti? 4.) The 290X was released on October 24, 2013 and the 780 Ti on November 7, 2013, while the 290 came out two days prior to the 780 Ti. The 290X traded blows with the 780 Ti, even if the 780 Ti usually came out ahead, but again... how is this possible?
  4. Guys recommending open-backs should re-read this from the OP first. Well, the NVX XPT100 or the M40X would be a good idea. But if you don't mind going for in-ears, go for the Philips Fidelio S1 or S2. They're pretty good, and it seems like Philips can hardly do wrong with headphones/earphones, despite making underwhelming products like monitors, televisions, maybe even lightbulbs; I dunno what else Philips makes.
  5. 1.) Uhm, what? 2.) If the 200 series cards were based on a 2012 architecture, why are they keeping up with a 2014 architecture? Does that mean AMD made a superior GPU but had problems with the software for it? Or is it because AMD shows more care for its customers than the competition does? 3.) There's a fine line between making a profit and screwing your customers. 4.) Then how is a weaker version of a 2013 card keeping up with a 2014 card in 2015? I'm not saying a 480 should still be as good as a 980, but a 780 Ti shouldn't drop off so badly that a 970 can beat it. A 2012 architecture, yes, but if AMD can do it with that, why not Nvidia? I'm pretty sure Nvidia's filled to the brim with cash; they can beef up their software team, or whoever's in charge of driver updates.
  6. Okay, I am so confused... to run FurMark or not? Anyway, back to the OP: try running the built-in game benchmarks as well, like the ones in Shadow of Mordor, GTA V, Metro (which Metro was it?), and so on.
  7. But architecture shouldn't be a good enough reason to give it "faux" drivers (for lack of a better term). I mean, look at this. How is a 290, yes, a two ninety without an X, suddenly keeping up and getting real close to the 780 Ti? And Kepler is just one generation behind. If Nvidia wants us to upgrade every year, then make the cards cheap, like $200 for the top-tier card with the lower tiers below that, and slap an expiration date on them while they're at it. The way Nvidia's been doing it, the gains come more from new architectures than from drivers for each architecture. Hell, no company should abandon its previous-gen cards like that after such a short time.
  8. Well, if I could buy a 390, 390X, or Fury, I'd go AMD no questions asked and this thread wouldn't have had to be posted. But since I don't wanna buy from the other local stores because I won't get as much of a discount, I'm left with the 980 Ti and a 380 as my choices (go for what's top-tier now and get shat on once the next generation comes, or take a cheap card and wait for the top-tier card of the next generation). I'm more concerned with how long a company will keep supporting its product over a reasonable span of time. Once newer games are out, I don't want to see the same thing as the 780 Ti vs. the 970 all over again, where the high/top-tier card I waited patiently for gets eaten by a card that's one or two tiers down in the generation after it. As someone who's had a 460, I waited THAT long for a top-tier card. If anything, the 780 Ti should be able to perform evenly when stacked up against the 970. And the funny part is that AMD, a company I'd call "idiots who make an effort to improve themselves", didn't just let the 290X stay a 290X. Hell, it finally beat the card that beat the original Titan, and it's at least above the 970. Did I mention that the 290X is technically older than the 780 Ti? Sure, the 780 Ti beat the 970, but look at how small the gap between the two is. And hell, I've seen live benchmarks (saw them in person) of a 970 with a barely aggressive overclock (1400+ MHz) beating 780 Tis with overclocks similar to yours (based on your signature). I don't mind dropping the settings to maintain a certain amount of FPS, but I don't want to drop them too hard. I want my gaming experience to be great today, slightly worse a generation later, and worse than that when another generation comes in. But I don't want to see it go down so hard that a card at half its launch MSRP eats it up in a fairly short amount of time. Game optimization goes hand in hand between game developers and the software engineers at the likes of Nvidia and AMD, whether or not the two are talking to each other about it. If my gaming experience is bad due to bad FPS, one could argue it's because the game is hard to run. But if I see a card at half the price perform even slightly better, with a decent overclock, on said hard-to-run game, that's a slap in the face.
  9. Only in Skyrim can an apple potentially kill a Dragonborn.
  10. To be fair, they did drop their pre-GCN cards, which sort of, just sort of, makes sense. But the fact that they never gave up on even the likes of the 7970 (sure, it's allegedly the same architecture found in the 200 series and maybe the 300 series, don't quote me on that either) is remarkable. IMHO, if they (and by that I mean AMD or Nvidia, whichever this applies to) aren't gonna put in enough effort for at least some FPS increase... based on Linus' video, the 480 did get a pretty decent FPS boost five years later, but it pretty much capped out around 2014 at give or take 10%, and a total 15% increase over five years' worth of support wouldn't hurt. But if that's all the effort they're gonna give it, why not just drop it entirely after 3 or 4 years? Not sure if there's a market for guys who just wanna stick to a single GPU, or a single laptop, for around a decade or something. Better than giving consumers some kind of false hope that drivers mature over time. Architecture does play a role in how much the guys at AMD and Nvidia can do with driver support, but if a card can't take it because it's THAT old (like 500-series-card kind of old), why not just drop it? Still dunno if I should be offended that a 970 eats a 780 Ti or not. The 780 Ti ain't exactly that old. I'm more worried about getting ripped off hard a generation later. A 980 Ti will always beat a 380 (unless Nvidia does something about it in a negative way), but I sure as hell don't want to see it eaten alive by a 1070 (or whatever the $300-$400 Pascal GPU is gonna be called), or even losing once to a 1060. Which is why I do kind of want to go to AMD after seeing the 290X mature over time, and even the 7970.
  11. Depends on which mods and how many you put in it, I guess. Plus, yeah:
  12. Well... fuck. I guess I'm going with a 380, then. I'll have to go back to 1080p and wait for Greenland.
  13. Why do I have a feeling that if I get a 980 Ti right now, it'll be the same story as the 780 Ti? Like, at some point it gets defeated by a 960 and pretty much eaten by a 970. Sure, benchmarks online will say otherwise regarding the 780 Ti vs. the 970, but when I went to a "LAN"-like party around a month ago and ran some benchmark tests (using Tomb Raider and GTA V, because that's all we had at the time) on some fairly recent drivers (I don't remember which ones), the other dude with a 970 had to undervolt his card (while keeping the power limit at 110%), and the two were on par, with barely a 1-2 fps difference in the 970's favor. Mine was overclocked as high as it could go because it lost the silicon lottery, yet it kept beating overclocked 290Xs way back, and later on the 970, when it came out, would still beat the 290X (note: back then; now I think it's a different story altogether) and whatnot. Damn...
  14. Well, I corrected myself. Check the post again. And by drop, I made the mistake of thinking dropping also covered just giving "compatibility" support for the sake of it still working, rather than the card actually getting eaten alive by one that's a generation newer. So, yeah, it's more a lack of optimization, or a lack of effort put into optimization.
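A quick Python sketch of the cost-per-frame math from post 1 above. The prices and frame rates are only the ballpark figures quoted in that post (the range mid-points are my own assumption), not real benchmark data; it's just meant to show the "save the difference and buy a game or two" reasoning.

    # Back-of-the-envelope cost-per-frame math for the 390X vs. 980 choice.
    # Prices and FPS are the rough figures from post 1, not benchmark data;
    # mid-points of the quoted price ranges are assumed for illustration.

    def cost_per_frame(price_usd: float, avg_fps: float) -> float:
        """Dollars paid per average frame per second."""
        return price_usd / avg_fps

    gtx980_price, gtx980_fps = 480.0, 30.0    # assumed mid-point of "$400-500", ~30 fps
    r9_390x_price, r9_390x_fps = 430.0, 29.5  # assumed mid-point of "$400-450", "29-point-something" fps

    print(f"GTX 980 : ${cost_per_frame(gtx980_price, gtx980_fps):.2f} per fps")
    print(f"R9 390X : ${cost_per_frame(r9_390x_price, r9_390x_fps):.2f} per fps")
    print(f"Left over for games: ${gtx980_price - r9_390x_price:.0f}")

With those assumed numbers the 390X works out to roughly $14.60 per fps versus about $16.00 for the 980, and the ~$50 difference is the game-or-two budget.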