Metro

Posts posted by Metro

  1. I'm at CES and just saw Linus at the Intel stand in the center hall. Does anyone know if he will be giving a conference or something?

     

    If you are at CES, could you ask ASUS whether the PG348Q is 100Hz out of the box or only "up to" 100Hz, and whether the scan-line issue has been fixed? (That would be great if you could and have the time.) Thanks :)

  2. CES is not open to the public, so even though you have the money to spend, you can't get in unless you know someone. AnandTech took a pic of his food receipt a few years back during CES. The hot dog (can't remember exactly) was expensive.

     

    Yeah, I know, only industry professionals :)

  3. Why don't you ask the Luke guy who makes these boring videos without really showing anything?

     

    What we really want to see is the AMD booth with the next-gen GPU that's running the demo, and custom-built PCs from probably the Intel booth or someone...

    And we still haven't seen the Kingston PC Linus did with the 7 R9 Nanos.

     

    He would not look at the comments.

  4. What's wrong with JayZTwoCents? Is it the fact that he isn't Linus? Or the fact that he proves the fanboys wrong? Or is it that he actually replies to comments that hate on him, proving them wrong? TekSyndicate should be the one to hate on.

     

    No point watching them for any reviews; all they do is give positive comments on the product, even if it has problems. However, this was about finding someone at CES :)

  5. Sell your 980ti's and you can go

     

    I could afford to go anyway; however, I have already missed most of it. Also, I am not sure I would want to spend the money on the flight, hotel and CES tickets. If I were going to Las Vegas anyway, I expect I would pay for the All Access Pass or certain tracks.

  6. I'd like to go, but from here in the UK it would be quite expensive, and I'm already going to the US this year; I can't afford two trips until I get a better-paying job.

     

    I would go as well; however, I am from the UK.

     

    I just wanted to ask someone who is there to check something for me (if they do not mind).

  7. Yeah, they had their conference about it a few months ago; I expect they are waiting for AMD to release the Fury X2.

     

    The GPU AMD showed was a low-end GPU for mobile and power efficiency, so I expect high-end Arctic Islands will not be out this year.

  8. Hey guys :)

     

    I am currently building my new rig. I have already received most of my stuff, but I am still wondering which monitor to buy.

    I'll be running a 980ti.

    As the title says, I am looking for a 34" 3440x1440 IPS G-Sync curved monitor without QA problems and if possible, without a gamer look.

    I found the Acer Predator X34, but the panel lottery is not my thing.

    The ASUS PG348Q is very interesting, but going by the recent ROG monitor releases, I am afraid the QA will be horrible again.

    Also, I really, really, really hate that gamer aesthetic trend. I want something more minimalist-modern.

    so here you are to save my day again ^^

    I am counting on you guys ^^

     

     

    You will get about 70 FPS in Battlefront at Ultra with one GTX 980 Ti, and about 35-40 FPS in The Witcher 3 at Ultra with HairWorks with one GTX 980 Ti. If you went SLI, you would get about 120 FPS in Battlefront at Ultra and 60-70 FPS in The Witcher 3 at Ultra with HairWorks.

     

    So you will not really benefit from a 100Hz panel if you play the latest games.

  9. I would definitely go with high-end Volta SLI; however, it depends on whether they make a 5120x2160 95Hz monitor. If they do, I will get high-end Pascal, although I expect that monitor would be released in 2018-2019.

     

    I will have to look at the performance difference between the GTX 980 Ti and GTX 1080 Ti.

  10. I think we are having a fundamental disagreement over the terminology of "high end".

     

     

    On release, the 480, 580, 680, 780, 780 Ti, 980 and 980 Ti have been the high-end flagship cards for nVidia. Why ignore the Titans, you ask? Because they're $999+ cards that are exorbitantly priced and not worth considering as consumer cards. (Also, they support what I'm saying.)

     

    400 Series Launch date: March 26, 2010.

    480 Release date: March 26, 2010.

     

    500 Series Launch date: November 9, 2010

    580 Release date: November 9, 2010

     

    600 Series Launch date: March 22, 2012

    680 Release date: March 22, 2012

     

    700 Series Launch date: May 23, 2013

    780 Release date: May 23, 2013

     

    900 Series Launch date: September 18, 2014

    980 Release date: September 18, 2014

     

    Are you starting to follow what I'm conveying? nVidia debuts every generation with its flagship. That means that the GTX 1080 (or whatever they decide to name it), will be released at the launch of this coming generation, which will likely happen in mid 2016. It will be the highest performance consumer card nvidia has to offer. It will likely later be surpassed by a 1080 Ti, but that's neither here nor there. The jump from 980 Ti to 1080 will very likely be larger than 1080 to 1080 Ti (as history has shown with the jumps from the 285 to the 480 and the 580 to the 680, both of which were process shrinks).

    Can we stop with the bullshit, now?

     

    No, the GTX 1080 is a mid-range GPU with a small die size; the GTX 1080 Ti is high end with a large die size. Also, the GTX 1080 Ti will be the bigger jump. Why would GTX 980 Ti (high end) to GTX 1080 (mid-range) be a bigger jump in performance? The GTX 1080 will be about the same performance jump as the GTX 780 Ti to the GTX 980.

  11. Sorry, Metro. You're wrong again. Quote me. The biggest jump will be from the first generation of Pascal GPUs, not the second or third generation of 14/16nm FinFET.

    You're also wrong about the release date. AMD just announced Arctic Islands' release to hit mid-2016, and they displayed a working competitor to the GTX 950 with a power draw of approximately 40W. We are going to see flagships by mid to late 2016, guaranteed, and like I said before, take it to the bank and quote me: they will be the biggest increase in performance since the 680 over the 580 (40%) or the 7970 over the 6970 (60+%).

     

    I expect low-end and mid-range Pascal in Q2 2016, then high-end Pascal in Q2 2017.

     

    One is that the 16nm fab is new... and untested... and you can't go large and power-hungry on first-generation chips. Don't believe how unreliable a new fab is? Look at how long it took Nvidia to pick between Samsung and TSMC, finally siding with TSMC due to their longer working history and reliability.

    The second point is... they would kill their own sales by offering their best card right at the release of the new generation. They would not be able to give their enthusiast market any reason at all to upgrade their card within 3 years of launch. Whereas when you look at Kepler, people bought the 680, then the Titan came out, then the 780, and most people who bought the 680 eventually upgraded to those cards. And then when Maxwell launched, they started with just the 750 and 750 Ti. Then they released the 980, which some bought, and finally, after a while, the Titan X and 980 Ti, which gave people (the enthusiast market, again) a reason to upgrade.

    Nvidia likes incremental upgrades. They don't want you to buy a card now, and launch another card in a year that completely destroys the card you bought. They put out one card, then 12 months later they bring out another card that is generally just 10-20% faster, so it's better than what they offered before, but not enough to **** off anyone who bought their last gen cards. And then another 6-12 months later they put out an even better card, with 50%+ better performance than their older cards, and people start upgrading again.

    If, on day one of the Pascal launch, they come out with their absolute best card, they are going to have nothing interesting to bring to market for 2 to 3 years until Volta comes out. And that would be silly, even if it were possible with the new 16nm fab. In terms of business, you need to offer a product that is a bit better than your competition, without being too much better/too costly for you. So they just need to put out a slight performance increase, but sell the card on much lower power consumption, wait for AMD to release something else, and then launch a bigger-die version themselves, and back and forth they go. Just a quick reference:

    FERMI

    GTX 580 = 520mm2

    KEPLER

    GTX 680 = 294mm2

    GTX Titan = 551mm2

    MAXWELL

    GTX 750ti = 148mm2

    GTX 980 = 398mm2

    GTX Titan X = 601mm2

    Do you see the pattern? Small, Medium, Big, restart. Don't think about it based on die size, because it's really just about transistor count. Pascal at 16nm, even with a 294mm2 die like the GTX 680, along with HBM2, would result in performance close to the Titan X... and perhaps even higher due to lower heat/power consumption allowing higher clocks.
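    A rough back-of-envelope sketch of that last estimate: the die sizes and the Titan X's ~8 billion transistors are approximate public figures, and the 2x density gain from 28nm to 16nm is purely an assumption, not anything confirmed. Under those assumptions, a GK104-sized die lands at roughly a Titan X's transistor count:

```python
# Back-of-envelope only: what could a GK104-sized (294 mm^2) Pascal die hold
# if 16nm roughly doubled transistor density over 28nm Maxwell?
# Inputs are approximate public figures, not official specs.

TITAN_X_TRANSISTORS = 8.0e9   # GM200 (Titan X): roughly 8 billion transistors
TITAN_X_DIE_MM2 = 601.0       # GM200 die size, ~601 mm^2
PASCAL_DIE_MM2 = 294.0        # same die size as the GTX 680 (GK104)
DENSITY_SCALE = 2.0           # assumed 28nm -> 16nm density gain

maxwell_density = TITAN_X_TRANSISTORS / TITAN_X_DIE_MM2               # ~13.3M per mm^2
pascal_transistors = maxwell_density * DENSITY_SCALE * PASCAL_DIE_MM2

print(f"~{pascal_transistors / 1e9:.1f} billion transistors")         # ~7.8 billion
print(f"~{pascal_transistors / TITAN_X_TRANSISTORS:.0%} of a Titan X")  # ~98%
```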

    Hope that helps.

    Also, NVIDIA is launching the GTX 990, so it would not make sense to release high-end Pascal in Q2 2016.

    We also know HBM2 is going to have supply problems with the number of stacks available, as there was not enough HBM1 for AMD; so if NVIDIA and AMD are both using HBM2, there will not be enough (the Titan O will need 16GB of HBM2). This is why NVIDIA will have to wait for Samsung HBM2 (which starts production in Q2 2016), so it will take until 2017 for there to be enough and for NVIDIA to implement it on the GTX 1080 Ti and Titan O.

     

     

    Also, give me a link where AMD says they will release high-end Arctic Islands in Q2 2016.

  12. I doubt that. Here's why:

    • The 780 Ti to 980 improvement was the smallest improvement in the last 7 years; it was also the first step backwards in transistor count for nvidia's flagships. As there was no process shrink to go with the die shrink, the only thing carrying the 980 forward over the 780 Ti was architecture, so despite having ~35% fewer transistors, it still managed roughly 10% more performance.
    • Moving from the 40nm process to the 28nm process was approx. a 42% shrink. In practice, nvidia managed to put more than twice as many transistors per sq. mm. The shrink from 28nm to 14nm (or 16nm) approximately halves the feature size again. If this means nvidia doubles their transistors per sq. mm. again, they are in for a serious increase in power.
    • If nvidia goes with a relatively conservative die size of 400mm2 (approx. the same as the 980), they should manage to get approximately 10,348,000,000 transistors on the die. Pascal architecture SHOULD only be an improvement over Maxwell architecture, so it would be a given that even if it only equaled Maxwell performance on a transistor-for-transistor level (which it won't, it'll outdo it), it would be a 29% faster card than a 980 Ti (rough numbers sketched just after this list).
    • The smallest flagship die nvidia has released since 2008 was the 680 in 2012. If they copied that die size, it might have approximately 7.8 billion transistors, which would be just shy of a 980 Ti. If Pascal's architecture is as much of an improvement as Maxwell was over Kepler, you're still looking at a large performance increase.
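    A hedged sketch of the arithmetic in the last two bullets, assuming the commonly cited ~5.2 billion transistors on ~398mm2 for the GTX 980, ~8 billion for the 980 Ti/Titan X, and the same assumed 2x density gain; the transistor ratio it produces comes out close to the ~29% figure above:

```python
# Sketch of the projection above: assume 16nm doubles transistors per mm^2
# over 28nm Maxwell, then scale up to a "conservative" 400 mm^2 die.
# All inputs are approximate public figures, not official specs.

GTX_980_TRANSISTORS = 5.2e9    # GM204 (GTX 980): ~5.2 billion transistors
GTX_980_DIE_MM2 = 398.0        # GM204 die size, ~398 mm^2
GTX_980TI_TRANSISTORS = 8.0e9  # GM200 (980 Ti / Titan X): ~8 billion
DENSITY_SCALE = 2.0            # assumed 28nm -> 16nm density gain
PASCAL_DIE_MM2 = 400.0         # the "conservative", roughly 980-sized die above

maxwell_density = GTX_980_TRANSISTORS / GTX_980_DIE_MM2               # ~13.1M per mm^2
pascal_transistors = maxwell_density * DENSITY_SCALE * PASCAL_DIE_MM2

print(f"~{pascal_transistors / 1e9:.1f} billion transistors")         # ~10.5 billion
uplift = pascal_transistors / GTX_980TI_TRANSISTORS - 1
print(f"~{uplift:.0%} more transistors than a 980 Ti")                # ~31%, near the ~29% above
```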

    There are no signs pointing to this being an incremental upgrade for nvidia or AMD. Everything says that this will be the biggest jump we've seen since 2012.

     

    It might be the biggest jump; however, that will be spread over many different GPUs. Also, why release one big GPU and then nothing until Volta (2018 or 2019), if you can do small incremental upgrades and get more money? Also, expect a Q1 2017 release for high-end Pascal.

  13. @Drez So, I ended up keeping the X34. I'm really digging it, thus far. I have a bit of backlight bleed, but what monitor doesn't these days?! No other issues, though. :D

     

    Just a question: I guess you can get 100Hz? Try running the G-Sync Pendulum demo, then try it at low FPS; I expect you will notice lines on the screen. I would have waited, as CES is on Tuesday and I expect more information on the ASUS PG348Q.
