
qepsilonp

Member
  • Posts: 18
  • Joined
  • Last visited

Reputation Activity

  1. Informative
    qepsilonp got a reaction from Ddarlington36 in AMD Ryzen 2600 Eng Sample leak   
    Well then you are going to be let down. If you already have a Ryzen processor with the right number of cores for you, Zen+ was never going to be a worthwhile upgrade; wait for Zen 2 / Ryzen 3 like I am if you already have a Ryzen CPU, or any CPU that is good enough for its purpose. Zen+ was made as a stop gap so that AMD could stay semi-competitive over the long wait between Zen and Zen 2, as 12LP is really only an improved 14LPP process and that's the only "big" difference. There are also the tighter cache timings, which should improve AMD's performance in games at extremely high frame rates.

    That will make AMD look better when the tech press run 720p benchmarks with a Titan XP, although it will also help in more reasonable situations, like if you want to run at 144Hz; Ryzen does have a little bit of trouble getting to those kinds of frame rates due to cache latency, memory latency and fabric latency. The last two, memory latency and fabric latency, are really the same thing, because the IMC is connected to the rest of the CPU via the Infinity Fabric. But AMD also promised better RAM support, so if you can run, for example, 3600MHz memory, you improve the latency of the fabric dramatically.
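    For a rough sense of why faster RAM helps so much here, a minimal sketch, assuming the usual Zen/Zen+ behaviour where the Infinity Fabric clock runs at the memory clock, i.e. half the DDR transfer rate (the kit speeds below are just examples):

```python
# Minimal sketch (assumed numbers): on Zen/Zen+ the Infinity Fabric clock is tied
# to the memory clock, which is half the DDR transfer rate, so faster RAM
# directly shortens every fabric hop.
def fabric_clock_mhz(ddr_transfer_rate: float) -> float:
    return ddr_transfer_rate / 2  # MEMCLK = FCLK on Zen/Zen+

for kit in (2133, 2933, 3600):
    cycle_ns = 1000 / fabric_clock_mhz(kit)
    print(f"DDR4-{kit}: fabric ~{fabric_clock_mhz(kit):.0f} MHz, cycle ~{cycle_ns:.2f} ns")
# Going from DDR4-2133 to DDR4-3600 cuts the fabric cycle time by roughly 40%,
# which is where the big latency improvement comes from.
```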
     
    But I hope it is something that is properly addressed in Ryzen 3 / Zen 2, because in specific titles it's a problem even at reasonable frame rates, 90 to 120fps, where the games are more latency sensitive. This is the "optimisation issue" you hear about in reference to Zen: games over-using the L3 cache rather than the L2 cache.
     
    Although it is a weakness in the Zen architecture, so it's not so much an optimisation issue as just something Intel is better at. At the same time it isn't necessary to be so dependent on the L3 cache; it's the same issue Skylake-X had, and as Intel moves to higher and higher core counts for the mainstream they won't be able to rely on the old ring bus anymore. So I think game developers need to start moving their engines' dependence to the L2 cache rather than the L3. It's something you need to do anyway to get better core scaling, which means it's probably already being done.
     
    This is also the reason AMD holds up much better in content creation: they have 512KB of L2 cache per core vs 256KB on Intel's mainstream parts, and as said, applications that scale well across many cores need to lean on the L2 cache. AMD also has a low-latency link between all the L2 caches on a CCX, so when you run, for example, Cinebench on all cores, a 1800X will beat a 7700K by almost exactly double, even though in a single-threaded test the 1800X loses by about 15%.
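    A quick back-of-the-envelope check of that scaling claim, treating core count and the ~15% single-thread deficit as the only variables (which deliberately ignores SMT yield and all-core clocks):

```python
# Toy check of the Cinebench claim: 8 Zen cores vs 4 Kaby Lake cores,
# with Zen roughly 15% slower per core in single-threaded work.
zen_single_thread = 0.85            # 1800X relative to 7700K, from the post
core_ratio = 8 / 4
naive_all_core = core_ratio * zen_single_thread
print(f"naive all-core ratio: {naive_all_core:.2f}x")   # ~1.7x
# The observed ~2x gap would have to come from the factors this toy model leaves
# out, e.g. SMT yield and how each chip holds its all-core clocks.
```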
  2. Like
    qepsilonp got a reaction from Taf the Ghost in AMD Ryzen 2600 Eng Sample leak   
  3. Like
    qepsilonp got a reaction from MaktimS in AMD Ryzen 2600 Eng Sample leak   
  4. Funny
    qepsilonp got a reaction from kvn95 in Core i3 7350K Overclockable Dual Core Review   
    Why are you testing a CPU at a 4K resolution? You do know that an i5 750 overclocked to 3.7GHz can run most games with an overclocked GTX 1080 at little or no loss of frame rate compared to an i7 4790K overclocked to 4.5GHz at 4K.

    linky here: 


    I'm trying to keep it civil and get across how... breath... taking a vape on my e-cig... getting the thesaurus out to find less insulting words to use... inane, ill-advised, irrelevant, laughable, half-baked and nonsensical it was to use a 4K resolution to benchmark a CPU.

    The words to describe how ridiculous it was to do that literally escaped me; that is how bad it was. Well, the words didn't escape me, but I didn't think it was a good idea to use the ones that came to mind.

    Don't do that in future. High resolutions like 4K and 1440p should be used to test GPUs; lower resolutions like 1080p and 720p should be used to benchmark CPUs, because CPUs are barely affected by resolution increases while high resolutions are basically completely GPU-bound.

    And this isn't the first time you have done this either. Make the mistake once, OK, fine. Twice, OK, that was a little annoying, just leave a comment saying "why?". Three times, well, obviously you need me to spell out why what you did was wrong somewhere you will actually read it.

    Get your s*** together, LTT.

    If you pull this again for the Ryzen review I'll just unsub, because you're obviously not a good place to get reviews if your testing methodology sucks.
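    To make the resolution point concrete, here is a toy frame-time model with made-up numbers: the frame rate is set by whichever of the CPU or the GPU takes longer per frame, and only the GPU side scales with resolution.

```python
# Toy model (hypothetical numbers): fps is limited by whichever of CPU or GPU
# takes longer per frame; GPU time scales roughly with pixel count, CPU time doesn't.
def fps(cpu_ms: float, gpu_ms_1080p: float, pixel_scale: float) -> float:
    gpu_ms = gpu_ms_1080p * pixel_scale
    return 1000 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 5.0, 9.0            # per-frame CPU cost of two different CPUs
for label, scale in (("1080p", 1.0), ("4K", 4.0)):
    print(f"{label}: fast CPU {fps(fast_cpu, 4.0, scale):.0f} fps, "
          f"slow CPU {fps(slow_cpu, 4.0, scale):.0f} fps")
# 1080p: 200 fps vs 111 fps - the CPU difference is visible.
# 4K: both land on ~62 fps behind a 16 ms GPU frame, so the result tells you
# nothing about the CPU.
```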
  5. Agree
    qepsilonp got a reaction from N1ght_F1nger in Core i3 7350K Overclockable Dual Core Review   
  6. Agree
    qepsilonp got a reaction from Lewellyn in Core i3 7350K Overclockable Dual Core Review   
  7. Agree
    qepsilonp got a reaction from themaniac in Core i3 7350K Overclockable Dual Core Review   
  8. Agree
    qepsilonp got a reaction from Techicolors in Core i3 7350K Overclockable Dual Core Review   
  9. Agree
    qepsilonp got a reaction from JeroenM in Core i3 7350K Overclockable Dual Core Review   
  10. Agree
    qepsilonp got a reaction from xAcid9 in Core i3 7350K Overclockable Dual Core Review   
  11. Agree
    qepsilonp got a reaction from Nexus in Core i3 7350K Overclockable Dual Core Review   
  12. Agree
    qepsilonp got a reaction from shadowbyte in Core i3 7350K Overclockable Dual Core Review   
    Just aiming for a resolution the 1080 could draw at 60fps-ish would have been a start; on two of the tests they only got 25 and 28 fps, which for CPU purposes is useless information, since almost any CPU from the last 6 years could run those games at 25 to 28-odd fps.

    And there is a reason for using 1080p and getting ridiculous frame rates: it simulates a GPU upgrade, and the way games' CPU load increases over time.
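    As a toy illustration of the "simulates a GPU upgrade" point (all numbers hypothetical): in a simple max(CPU, GPU) frame-time model, lowering the resolution today gives you the same CPU-bound number you would see at 4K once a roughly twice-as-fast GPU arrives.

```python
# Toy model, hypothetical numbers: frame time = max(cpu_ms, gpu_ms).
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 9.0                                               # CPU-limited frame time
print("today,  4K   :", f"{fps(cpu_ms, 16.0):.0f} fps")    # GPU-bound, ~62 fps
print("today,  1080p:", f"{fps(cpu_ms, 4.0):.0f} fps")     # CPU-bound, ~111 fps
print("2x GPU, 4K   :", f"{fps(cpu_ms, 8.0):.0f} fps")     # CPU-bound, ~111 fps
# The 1080p result today already tells you what this CPU will deliver at 4K once
# the GPU stops being the bottleneck.
```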
  13. Agree
    qepsilonp got a reaction from dalekphalm in Core i3 7350K Overclockable Dual Core Review   
  14. Agree
    qepsilonp got a reaction from Pesmerga in Core i3 7350K Overclockable Dual Core Review   
  15. Agree
    qepsilonp got a reaction from MageTank in Core i3 7350K Overclockable Dual Core Review   
  16. Agree
    qepsilonp got a reaction from SteveGrabowski0 in Core i3 7350K Overclockable Dual Core Review   
  17. Agree
    qepsilonp got a reaction from shadowbyte in Core i3 7350K Overclockable Dual Core Review   
  18. Agree
    qepsilonp got a reaction from Kuuma in Core i3 7350K Overclockable Dual Core Review   
  19. Like
    qepsilonp got a reaction from BrightCandle in Ah not again   
    I seem to be making a habit of this, but on the WAN Show Linus and Luke were mistaken about something again: they said that this generation of consoles is weak, with the unsaid implication that the last generation was powerful. Not true...
     
    When the PS3 launched back in Nov 2006 you could, for $653, buy a PC with:
    7900GT for $200 (with $40 mail-in rebate)
    AMD Athlon 64 X2 4200+ $184
    1GB RAM $100
    250GB HDD $70
    MB MSI K9VGM-V Socket AM2 $38
    430W PSU $34
    Case $30
     
    "Prices used were Sourced from Waybackmachine.org and Newegg.con from 14th November 2006."
     
    A 7900GT was almost twice as powerful as the GPU in the PS3: the PS3's GPU was a G70 chip with half its ROPs disabled, clocked at 500MHz, while the 7900GT was a fully unlocked G70 chip clocked at 450MHz, so not twice as powerful but about 90% more powerful, and the rest of that hardware could drive a 7900GT just fine. On top of that, a year later the 8800GT came out, which was far more powerful than the 7900GT, and the difference between the PS3's GPU and the 8800GT is about the same as the difference between the 970 and the PS4.

    And if you're going to put a computer together with a 970, you're talking about $600 - $800 vs the PS4's $340. Yes, it will stomp the PS4, but that's the point: by November 2007 you could stomp the PS3 by a similar margin while spending only a little more than you would on a PS3.

    In other words, consoles haven't been competitive since the original Xbox, and the PS4 and Xbone are far more competitive than the PS3 and X360 were: the PS3 was outstripped by 90% on the GPU, while the PS4 can only be outstripped by about 40% by a similarly priced PC.
  20. Like
    qepsilonp got a reaction from Rohit Jackdaw in LiFi misconceptions   
    On the WAN Show, Linus and Luke seemed to be mistaken about how you would use Li-Fi and how it would work, so I thought I would explain. The biggest misconception is that you would need separate lights for lighting your home and for Li-Fi. Nope: what you could do is replace the bulbs in your current light fittings with Li-Fi lights fed by powerline adapters to boost download speed, used in conjunction with WiFi or the cellular network for uploads.
     
    This means the light would be as strong as a normal light, so you don't even need a direct line of sight: if the door is open to the room the Li-Fi light is in, you could still have a connection from the hallway, never mind if someone steps in behind you and puts you in some shade, because light bounces off walls and that would not put you in anywhere near enough darkness to cut off the connection.

    With this approach you could use one crappy WiFi access point and put Li-Fi lights in every room. While you may only get 1Mbps upload speed, your download speed, within the limits of powerline adapters, would be around 200 - 500Mbps reliably; that upload speed is enough for most things like browsing the web, and 200 - 500Mbps is easily enough to watch 8K video.
     
    You're not going to be online gaming with that, but for mobiles and tablets it's perfect, and all you need on the device is a simple, dirt-cheap light sensor and not much else; a dedicated processor to decode the light signals would be nice, but you don't need one, as the end device already has a processor. You don't need 200Gbps: as said, 200 - 500Mbps would be more than enough, and an ordinary light sensor is enough to get that done.

    Also, light pollution up to a point doesn't matter, because you're not looking at the total amount of light, you're looking for small changes on a very small time scale. Me flashing another light manually right into the sensor, for example, would not matter, because what counts is being able to detect the small changes in light. Obviously it will affect the transfer speed, but given the theoretical maximum, 200 - 500Mbps is very achievable under those circumstances, and it would probably be limited to those speeds precisely so that it isn't affected in the first place.
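    As a toy sketch of the "small changes, not total brightness" point (every parameter here is made up for illustration): encode bits as tiny, fast brightness steps on top of a large, slowly drifting ambient level, and decode from sample-to-sample differences rather than the absolute reading.

```python
import random

# Toy demo (made-up numbers): data rides on small, fast brightness changes, so a
# large but slowly drifting ambient level (light pollution) doesn't break decoding.
bits = [0] + [random.randint(0, 1) for _ in range(31)]   # leading 0 as a crude preamble

samples = []
for i, b in enumerate(bits):
    ambient = 1000 + 50 * (i / len(bits))   # big, slowly drifting background light
    signal = 5 if b else 0                  # tiny modulation on top of it
    samples.append(ambient + signal)

# Decode from differences between consecutive samples (a crude high-pass filter):
# a clear jump up means a 1 started, a clear jump down means a 0, otherwise the
# previous bit repeats. The absolute brightness never enters into it.
decoded, prev = [], 0
for i in range(1, len(samples)):
    delta = samples[i] - samples[i - 1]
    if delta > 2:
        prev = 1
    elif delta < -2:
        prev = 0
    decoded.append(prev)

print(decoded == bits[1:])   # True: recovered despite the drifting ambient level
```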
  21. Like
    qepsilonp got a reaction from ShadowCaptain in LiFi misconceptions   
  22. Like
    qepsilonp got a reaction from DigitalHermit in LiFi misconceptions   
  23. Like
    qepsilonp got a reaction from Tedster in LiFi misconceptions   