
yoseipt

Member
  • Posts

    31
  • Joined

  • Last visited

Reputation Activity

  1. Like
    yoseipt got a reaction from vitor_cut in Fable Legends DirectX 12 benchmark   
    Interested to see how it will perform with lower-tier GPUs.
     
    Power to the Portuguese, btw =P
  2. Like
    yoseipt reacted to vitor_cut in Fable Legends DirectX 12 benchmark   
    Another DX12 game with AMD doing better than Nvidia. It seems GCN really likes DX12, and Nvidia has a problem to solve, although I think it's an architectural problem with Maxwell that no driver will fix.
    Sources:
    http://www.extremetech.com/gaming/214834-fable-legends-amd-and-nvidia-go-head-to-head-in-latest-directx-12-benchmark
     
    http://techreport.com/review/29090/fable-legends-directx-12-performance-revealed/2
     
    http://www.pcper.com/reviews/Graphics-Cards/Fable-Legends-Benchmark-DX12-Performance-Testing-Continues/Results-1080p-Ultr
  3. Like
    yoseipt reacted to mr moose in Microsoft Edge is the first web browser to support Dolby Digital Plus   
    I don't think IE was ever as bad as people make out, but you know what happens when one or two popular people on the internet suggest something is bad or make a joke about the people using a product: all of a sudden that product gets a bad reputation and everyone is too scared to admit they don't have an issue with it. I call it the Jar Jar Binks effect. Until comedians started making jokes about Jar Jar Binks, he was a popular character.
  4. Like
    yoseipt got a reaction from Kuzma in AMD's Claim, Nvidia's Rebutle, and Intel's Intelligence   
    "G-sync is a good thing but it hinders performance so quality doesn't suffer thats all it does."
     This is false. G-Sync doesn't limit performance; all it does is adjust the refresh rate of your monitor to match the frame rate your card is rendering. Think about it: imagine you have a monitor that is 120 Hz capable. If the GPU outputs, say, 75 fps, the monitor's refresh rate goes up to 75 Hz; if the frame rate then drops to 45 fps, the refresh rate is adjusted down to 45 Hz, and so on. It's like overclocking and downclocking your monitor, or more precisely just an automatic downclock, since it can't clock higher than the monitor's spec, otherwise it could compromise the lifetime of the panel itself. So there is nothing capping the performance of your GPU: once a frame is rendered, it is displayed to you. In some cases it even improves the experience, for example when the GPU finishes rendering a frame before the monitor's next refresh, because there is no waiting between the frame being ready and the monitor refreshing, which is when you actually see it. That is how it eliminates lag and stuttering. (See the short sketch at the end of this list for the refresh-rate behaviour described here.)
  5. Like
    yoseipt got a reaction from xerox in AMD's Claim, Nvidia's Rebutle, and Intel's Intelligence   
    "G-sync is a good thing but it hinders performance so quality doesn't suffer thats all it does."
     this is false. g-sync doesnt limit performace, all it does is adjust the refresh rate of your monitor to the same fps level that your card is rendering. think about it, imagine that u have a monitor that is 120hz capable: if your fps output from the gpu is lets say 75fps it will increase the frequency of your monitor to 75Hz, then it drops to 45fps, and it will adjust the frequency of your monitor to 45Hz and so on.  its like overclocking and downclocking your monitor, to be correct its just a auto downclock of your monitor since it cant clock higher than the monitor spec. otherwise it could compromise the lifetime of the monitor itself.   so there is nothing caping the performace of your gpu, once the image is rendered its displayed to you. in some cases it even increases performance, that being for example when the gpu finishes to render the image before the refresh rate of the monitor, so that there is no waiting between when the image was rendering and the refresh of the monitor, thats when u see it. thats how it eliminates lagg and shuttering.
  6. Like
    yoseipt got a reaction from Vitalius in AMD's Claim, Nvidia's Rebutle, and Intel's Intelligence   
    "G-sync is a good thing but it hinders performance so quality doesn't suffer thats all it does."
     this is false. g-sync doesnt limit performace, all it does is adjust the refresh rate of your monitor to the same fps level that your card is rendering. think about it, imagine that u have a monitor that is 120hz capable: if your fps output from the gpu is lets say 75fps it will increase the frequency of your monitor to 75Hz, then it drops to 45fps, and it will adjust the frequency of your monitor to 45Hz and so on.  its like overclocking and downclocking your monitor, to be correct its just a auto downclock of your monitor since it cant clock higher than the monitor spec. otherwise it could compromise the lifetime of the monitor itself.   so there is nothing caping the performace of your gpu, once the image is rendered its displayed to you. in some cases it even increases performance, that being for example when the gpu finishes to render the image before the refresh rate of the monitor, so that there is no waiting between when the image was rendering and the refresh of the monitor, thats when u see it. thats how it eliminates lagg and shuttering.
  7. Like
    yoseipt got a reaction from NoobsWeStand in GK110 Was Never Meant To be Used In A GTX680-Like Product.   
    A card having compute capability doesn't mean it isn't gaming or consumer oriented.
    In that regard, the main difference between the Titan and the 780, apart from the number of CUDA cores, is double-precision capability. And both of these cards are GK110 based.
    With that said, it justifies the price of the Titan. I've said it before: even at its price point, the Titan is good value for CAD work, sitting between consumer and professional products, given that it has the double-precision feature.
    The GTX 480 had double precision and it was a good compute card, and so was the 580 (I'm not sure about double precision on that one, but it performed well in compute). So both of those cards were compute capable and gaming oriented.
    Regarding the 480, do you remember the 460? It didn't have double-precision capabilities. Maybe there is a resemblance in strategy (if we assume the 680 was meant to be a 660).
    This is obviously a strategic move by Nvidia, segmenting its offer across different market segments.
    Another thing that delayed GK110's entry into the consumer market was the big order from the US government for Tesla cards (yes, there was one). With that die size the chip is hard, or at least slow, to manufacture, and when you have a pre-order you will prioritise it. So it was a gamble to go into the consumer space with GK104 only.
    I will remind you that GK104, when it launched, performed better than the 7970 and was also cheaper, and the chip itself was cheaper to produce than GK110 (because of its size). If the chip was meant to be mid-range then it came out expensive, and that's why Nvidia launched it the way it did, knowing it performed better than the red team's part.
    Another piece of evidence is the story behind the development of the GTX 690 and Titan coolers.
    You can read it here; it's a long read but an interesting one.
     
    http://www.tomshardware.com/reviews/nvidia-history-geforce-gtx-690,3605.html
     
    I don't care about AMD or Nvidia; what worries me is that AMD is not pushing Nvidia. Nvidia's top card is the Titan at the moment, and it's based on a chip that launched around the time the 7970 did (the Tesla one, I don't recall which model).
    And now the AMD 290X seems to be competing with Titan performance (I hope it performs better than the Titan)... so, like Linus said, Nvidia is probably yawning at this.
    What I hope is that the Mantle API and AMD's monopoly in consoles become a game changer and that game ports from consoles to PC are well optimised. As a result I want Nvidia pushed really hard, because the improvements in the last year or so aren't that impressive.
  8. Like
    yoseipt got a reaction from James_AJ in GK110 Was Never Meant To be Used In A GTX680-Like Product.   
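
    Below is a minimal sketch of the refresh-rate behaviour described in the G-Sync post above (entry 4): the monitor's refresh rate tracks the GPU's frame rate but can never exceed the panel's rated maximum. This is an illustration only, not NVIDIA's actual implementation; the 120 Hz panel limit and the sample frame rates are just the hypothetical numbers from the example in that post.

    # Sketch of the variable-refresh idea from entry 4 (hypothetical numbers, not NVIDIA's implementation).
    PANEL_MAX_HZ = 120  # rated maximum refresh of the hypothetical 120 Hz monitor

    def refresh_for_fps(gpu_fps: float) -> float:
        """Refresh rate the monitor would run at for a given GPU frame rate."""
        # The panel only "downclocks": it follows the frame rate exactly while the GPU
        # stays below the rated maximum, and is capped at that maximum otherwise.
        return min(gpu_fps, PANEL_MAX_HZ)

    if __name__ == "__main__":
        for fps in (75, 45, 150):
            print(f"GPU at {fps} fps -> monitor refreshes at {refresh_for_fps(fps)} Hz")

    Running it prints 75 Hz for 75 fps and 45 Hz for 45 fps, and shows the 150 fps case capped at the panel's 120 Hz limit, matching the overclock/downclock analogy in the post.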