
qepsilonp

Member
  • Posts

    18
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Recent Profile Visitors

479 profile views

qepsilonp's Achievements

  1. Dude, I said upgrading from Ryzen to Ryzen 2 was never going to make sense...
  2. Well, if AMD want PCIe 4 for Zen 2 or Zen 3 on workstation and server, they might as well do it on desktop as well, as long as they go the same route as Ryzen 1 of using the same die for the consumer space as for workstation and server; the only problem then is designing the boards, which isn't terribly difficult for a PCIe change. I suppose you could argue the reason they went for 64 and 128 PCIe lanes on TR and Epyc was because they were not going to go for PCIe gen 4, but PCIe gen 4 was finalised in Oct 2017 and AMD only recently announced the design was finished, so Zen 2 with PCIe 4 is not all that unlikely. And there is a reason to use the same die for HEDT and data centres: you can use the best dies for servers and workstations and don't need to throw away ones that don't meet the spec, you can just use them in the consumer space. Having an upgrade path in data centres is even more important than it is in the consumer space, as your average consumer will sit on a CPU for 3 - 5 years, by which time you would have already changed sockets, whereas data centres usually upgrade much more regularly and completely replacing all their hardware is a lot more difficult and expensive. Why would they keep the socket for consumers and change the socket for data centres?
  3. Ehhh, I dunno, PCIe gen 3 x4 is already becoming a limitation for NVMe, although I must admit the doubling you get with PCIe 4 would ease that for a while, and the Titan V has already run into limitations with PCIe gen 3 x8. Assume the next high-end GPU adds another 70% performance, and the one after that does the same, and you are going to be limited by PCIe gen 3 x16. PCIe gen 4 is definitely something consumers will need soon after Zen 2, like another year maybe (rough bandwidth numbers in the first sketch after this list).
  4. Errr, you think they are going to implement PCIe 4 on an AM4 board, which would mean they would need to support both PCIe 3 and PCIe 4 on the CPU? Or did you mean Zen 3? Considering Zen 3 is probably not going to come out early in 2020, and depending on when in the year PCIe 5 is finalised, say it's finalised in Jan 2019, if AMD went for a late 2020 launch, which they probably will anyway, then that leaves lots of time; according to other sources it's 12 months, although that was TechQuickie, and LTT isn't where you should get your tech news on anything more than a superficial level.
  5. Erm, I think you're talking about Zen 2. Zen 2 will be Ryzen 3, as Zen+ is Ryzen 2, and it will almost certainly come in 2019, although my guess would be the earliest it will come is May 2019. On AMD's charts right now they have Zen 3 coming in 2020; the chart isn't numbered, but it ends in 2020 and it has Ryzen 3 on it. Also, given comments by AMD saying they will have new CPUs on AM4 until 2020 to 2021, late 2020 for Zen 3 makes sense; this was back when Ryzen had just launched, though, so there was less certainty than there is now. Although I believe Zen 3 will not be compatible with AM4, because I would think they would want to jump on PCIe 5 and DDR5, and yes, I am actually talking about PCIe 5. PCIe 4 was originally planned for 2013 I think, but they had a load of troubles; according to PCI-SIG, PCIe 5 is going to be finalised in 2019, and given usual development times, devices and motherboards should be available about a year after that. So if Zen 3 launches during 2020 or after, AMD will want to implement PCIe 5 and will simply skip PCIe 4. DDR5 is another thing they will want to implement which will be available by then, which would make the new CPUs incompatible with the old boards, unless AMD wanted to waste quite a bit of die space and a lot of development time making the CPU compatible with PCIe 3, PCIe 5, DDR4 and DDR5, then make a new series of boards that support PCIe 5 and DDR5; AMD is already behind Intel on their IMC, and unnecessarily increasing complexity would be dumb, as you would likely also take a performance hit. But imagine AMD gave you 20 PCIe gen 5 lanes right to the CPU... that would be nuts: you could run 10 Titan Xps off that much bandwidth, as 2 PCIe gen 5 lanes would equate to roughly 8 PCIe gen 3 lanes, and there is no measurable difference between x8 and x16 with the Titan Xp. Like 10 virtual machines, each with its own Titan Xp; assuming AMD move to 12 cores you could only have 1 core each, but still, that is a stupid amount of bandwidth. Or Threadripper with 64 PCIe gen 5 lanes, that's 32 Titan Xps or 16 Titan Vs; Epyc, 64 Titan Xps or 32 Titan Vs (rough maths in the lane-budget sketch after this list). That's nuts, I think even the data centre / deep learning guys would be like, "THAT'S TOO MUCH BANDWIDTH, WHAT AM I SUPPOSED TO DO WITH THIS?"
  6. Well, then you are going to be let down. If you already have a Ryzen processor with the right number of cores for you, Ryzen 2 was never going to be a worthwhile upgrade; wait for Zen 2 / Ryzen 3 like I am if you already have a Ryzen CPU, or one that is good enough for purpose. Zen+ was made as a stop gap so that AMD could stay semi-competitive during the long wait between Zen and Zen 2; 12LP is really only an improved 14LPP process, and that's the only "big" difference. There are the tighter cache timings, which should improve AMD's performance in games at extremely high frame rates, which will make AMD look better when the tech press run 720p benchmarks with a Titan Xp, although it will also help in more reasonable situations, like if you want to run at 144Hz, which Ryzen does have a little bit of trouble reaching due to the cache latency, memory latency and fabric latency. The last two, memory latency and fabric latency, are really the same thing, because the IMC is connected to the rest of the CPU via the Infinity Fabric; AMD also promised better RAM support, so if you can, for example, run 3600MHz memory, you improve the latency of the fabric dramatically (see the fabric clock sketch after this list). But I hope it is something that is properly addressed in Ryzen 3 / Zen 2, because in specific titles it's even a problem at what are reasonable frame rates, 90 to 120fps, because those games are more latency sensitive. This is the optimisation issue you hear about in reference to Zen: games over-using the L3 cache rather than the L2 cache. It is a weakness in the Zen architecture, so it's not so much an optimisation issue as just something Intel is better at, but at the same time it isn't necessary to be so dependent on the L3 cache; it's the same issue Skylake-X had, and as Intel move to higher and higher core counts for the mainstream they won't be able to rely on the old ring bus anymore, so I think game developers need to start moving their engines' dependence to L2 cache rather than L3. It is something you need to do anyway to get better core scaling, which means it's probably already being done. This is also the reason AMD holds up much better in content creation: they have 512KB of L2 cache vs Intel's mainstream parts which have 256KB, and as said, in applications that scale well across many cores it is necessary to rely on the L2 cache. Because AMD also have a low-latency link between all the L2 caches on a CCX, when you do, for example, a Cinebench run on all cores, a 1800X will beat a 7700K by almost exactly double, even though in a single-threaded test the 1800X will lose by about 15%.
  7. Eh, engineering sample, no need to get your knickers in a twist; they are likely testing what clock speed they can get right now and may do another respin before final silicon anyway. For the moment all we can say is that there is certainly a clock speed improvement.
  8. Ah, I know what the problem is: 504 will take the temps directly from tCTL, and for the 1700X and 1800X AMD have put an offset on tCTL for reasons, so on 504, if you're running a 1800X or 1700X, you can take 20C off the temps (quick example after this list). Hope that helped. Also, after upgrading and downgrading the BIOS because of the fan noise, I was able to change my RAM speed up to 2400MHz, even though before, with both the upgraded 504 BIOS and the 502 BIOS, I was unable to... wtf? Edit: I see why the timings were loosened; I'll see if I can get to 2600MHz, then it might be worth letting the timings fall back a little. Edit 2: Got to 2600MHz... Let's see how far I can push this, 3000MHz? Well, 2934MHz to be exact; I'll try pushing the BCLK up to 102.3 to get 3,001.482MHz if this works. Edit 3: No luck, but 2600MHz is not bad. Edit 4: Actually the timings haven't changed at all; I thought it was 15-15-15-35 before, but it was 15-17-17-35, so I really don't know what's going on there, but I suppose I shouldn't complain. Edit 5: Na, I'm just fucken with ya, not about the actual results, just the "Edit 5" bit.
  9. On the show, in reference to Game Mode, you were saying that a frame rate increase of 2 - 5% is not worth it (paraphrasing here). But for this kind of thing what you should be looking at is frame times and frame pacing, because if the OS is throwing enough interrupts at the game to affect average frame rates by 2 - 5%, you can be damn sure the frame times are suffering a hell of a lot more. An average frame rate of 60fps doesn't matter much if you're getting stalls down to 30fps on a semi-regular basis (numbers in the frame-time sketch after this list). All the other stuff you and Wendell were saying, though, was spot on.
  10. Just aiming for a resolution that the 1080 could draw at 60fps-ish would have been a start; on two of the tests they only got 25 and 28 fps, which for CPU purposes is useless information, since almost any CPU from the last 6 years could play those games at 25 and 28-odd fps. And there is a reason for using 1080p and getting ridiculous frame rates: it simulates a GPU upgrade and the load of games on the CPU increasing over time.
  11. Why are you testing a CPU at 4K resolution? You do know that an i5 750 overclocked to 3.7GHz can run most games at 4K with an overclocked GTX 1080 at little or no loss of frame rate compared to an i7 4790K overclocked to 4.5GHz, linky here: I'm trying to keep it civil and get across how... ... ... breath ... ... ... taking a vape on my e-cig... ... ... getting the thesaurus to find less insulting words to use... ... ... inane, ill-advised, irrelevant, laughable, half-baked and nonsensical it was to use a 4K resolution to benchmark a CPU. The words to describe how ridiculous it was to do that literally escaped me, that is how bad it was. Well, the words didn't escape me, but I didn't think it was a good idea to use the words that came to mind. Don't do that in future: high resolutions like 4K and 1440p should be used to test GPUs, and lower resolutions like 1080p and 720p should be used to benchmark CPUs, because CPUs are barely affected by resolution increases while high resolutions are basically completely bound by the GPU. And this isn't the first time you have done this either; make a mistake once, OK, fine, it's fine. Twice, OK, that was a little annoying, just leave a comment saying, "why?" Three times, well, obviously you need me to spell out why what you did was wrong in a place where you will actually read it. Get your s*** together, LTT. If you pull this again for the Ryzen review I'll just unsub, because you're obviously not a good place to get reviews when your testing methodology sucks.
  12. I seem to be making a habit of this, but on the WAN Show Linus and Luke were mistaken about something again: they said that this generation of consoles is weak, and the unsaid implication was that the last generation was powerful. Not true... When the PS3 launched back in Nov 2006 you could, for $653, buy a PC with: a 7900GT for $200 (with a $40 mail-in rebate), an AMD Athlon 64 X2 4200+ for $184, 1GB of RAM for $100, a 250GB HDD for $70, an MSI K9VGM-V Socket AM2 motherboard for $38, a 430W PSU for $34 and a case for $30 (prices sourced from the Wayback Machine and Newegg.com for 14th November 2006). A 7900GT was almost twice as powerful as the GPU in the PS3, as the PS3's GPU was a G70 chip with half its ROPs disabled clocked at 500MHz, while the 7900GT was a fully unlocked G70 chip with a clock speed of 450MHz; so not twice as powerful, but 90% more powerful, and you could drive a 7900GT with all that other hardware, so yeah. On top of that, 1 year later the 8800GT came out, which was far more powerful than the 7900GT, and the difference between the PS3's GPU and the 8800GT is about the same as the difference between the 970 and the PS4. And if you're going to put a computer together with a 970, you're talking about $600 - $800 vs the PS4's $340; yes, it will stomp the PS4, but that's the point: by November 2007 you could stomp the PS3 by a similar margin while spending only a little more than you would on a PS3. In other words, consoles haven't been competitive since the original Xbox, and the PS4 and Xbone are far more competitive than the PS3 and X360, as the PS3 was outstripped by 90% on the GPU by a similarly priced PC, while the PS4 can only be outstripped by 40%.
  13. Sorry, I was just trying to bash this out as fast as possible because I don't care that much; I personally use a desktop, and therefore a wired connection, almost all the time, so it doesn't affect me. Also, misconceptions among people in general don't matter as long as the people developing it understand, so it's going to come one way or the other.
  14. On the WAN Show, Linus and Luke seemed to be mistaken about how you would use Li-Fi and how it would work, so I thought I would explain. The biggest misconception is that you would need separate lights for lighting your home and for Li-Fi. Nope: what you could do is replace the lights in your current fittings with Li-Fi lights fed by powerline adapters, to boost download speed when used in conjunction with WiFi or the cellular network for uploads. This means the light will be as strong as a normal light, so not only do you not need a direct line of sight, but if the door to the room with the Li-Fi light is open, you could still have a connection out in the hallway, never mind if someone steps in behind you and puts you in some shade, since light bounces off walls and that would not put you in anywhere near enough darkness to cut off the connection. With this approach you could use one crappy WiFi access point and put Li-Fi lights in every room, and while you may only get 1Mbps upload speed, your download speed, limited by the powerline adapters, would be around 200 - 500Mbps reliably; the upload speed is enough to do most things like browsing the web, and that 200 - 500Mbps is enough to easily watch 8K video and so on. You're not going to be online gaming with that, but for use with mobiles and tablets it's perfect, and all you need on the device is a simple, dirt cheap light sensor and nothing else really; an independent processor to decode the light signals would be good, but you don't need one, as the end device already has a processor, and you don't need 200Gbps, as said, 200 - 500Mbps would be more than enough, and an ordinary light sensor is enough to get that done. Also, light pollution up to a point doesn't matter, as you're not looking at the total amount of light, you're looking for small changes on a very small time scale; so, for example, me flashing another light manually right into the sensor will not matter, because the total amount of light doesn't matter, it's being able to detect the small changes in light. Obviously it will affect the transfer speed, but given the theoretical maximum, 200 - 500Mbps is very achievable under those circumstances, and it would probably be limited to those speeds anyway so as not to be affected in the first place.
  15. If you want a new AMD system, you're going to have to wait for their Zen architecture in late 2016, so if you can wait that long, GOOD FOR YOU!!! But the only thing really coming in 2015 is Intel. And really, they should have just scrapped Broadwell unless it's going to be cheap, like really... All it is is the Haswell CPUs shrunk down; Skylake is the architectural improvement... Like seriously... they would have to be a good $70 cheaper for anyone to buy them. Unless they just increase the price of Skylake a little and decrease the price of Haswell a little.
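
A rough sketch of the link bandwidths behind post 3, assuming PCIe gen 3 runs at 8 GT/s per lane, each generation doubles that, and 128b/130b encoding applies from gen 3 onwards; real-world throughput is a little lower once protocol overhead is counted.

```python
# Approximate usable bandwidth per PCIe link (valid for gen 3 and newer only).
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    gt_per_s = 8.0 * 2 ** (gen - 3)            # gen 3 = 8 GT/s, gen 4 = 16, gen 5 = 32
    return gt_per_s * lanes * (128 / 130) / 8  # 128b/130b encoding, 8 bits per byte

for gen, lanes in [(3, 4), (4, 4), (3, 8), (3, 16), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bandwidth_gb_s(gen, lanes):.1f} GB/s")
# PCIe 3.0 x4 (~3.9 GB/s) is the NVMe ceiling mentioned in post 3; gen 4 x4 doubles it,
# and gen 3 x8 / x16 are the GPU cases (~7.9 / ~15.8 GB/s).
```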
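
The GPU counts in post 5 fall out of simple lane arithmetic. A minimal check, assuming each PCIe generation doubles per-lane bandwidth (so one gen 5 lane is worth roughly four gen 3 lanes) and that a card is happy on gen 3 x8, as the post assumes for the Titan Xp:

```python
GEN3_LANES_PER_GPU = 8                      # post 5's assumption: gen 3 x8 per card

def gpus_per_budget(gen5_lanes: int) -> int:
    gen3_equivalent = gen5_lanes * 4        # one gen 5 lane ~ four gen 3 lanes
    return gen3_equivalent // GEN3_LANES_PER_GPU

for platform, lanes in [("desktop, 20 lanes", 20),
                        ("Threadripper, 64 lanes", 64),
                        ("Epyc, 128 lanes", 128)]:
    print(f"{platform}: ~{gpus_per_budget(lanes)} GPUs at gen 3 x8 each")
# 20 -> 10, 64 -> 32, 128 -> 64, matching the Titan Xp figures in the post;
# halve them for cards you want to keep at a full gen 3 x16 (the Titan V case).
```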
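
On the RAM point in post 6: on Zen and Zen+ the Infinity Fabric clock tracks the memory clock, i.e. half the DDR4 transfer rate, so faster RAM directly shortens the fabric's cycle time. A small sketch under that assumption (the cycle time shown is just the clock period, not a full latency model):

```python
def fabric_clock_mhz(ddr4_mt_s: int) -> float:
    return ddr4_mt_s / 2                    # e.g. DDR4-3600 -> ~1800 MHz fabric clock

for speed in (2133, 2666, 3200, 3600):
    fclk = fabric_clock_mhz(speed)
    print(f"DDR4-{speed}: fabric ~{fclk:.0f} MHz, cycle time ~{1000 / fclk:.2f} ns")
# Going from DDR4-2133 to DDR4-3600 cuts the fabric cycle time from ~0.94 ns
# to ~0.56 ns, which is why memory speed matters so much for CCX-to-CCX latency.
```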
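
For the temperature point in post 8: the 1700X and 1800X report tCTL with a fixed +20 C offset, so the real die temperature is the reported value minus 20. A trivial helper, assuming that fixed offset (the plain 1700 has none):

```python
TCTL_OFFSET_C = {"1700X": 20, "1800X": 20}  # offset chips; the plain 1700 reads true

def tdie_from_tctl(cpu: str, tctl_c: float) -> float:
    return tctl_c - TCTL_OFFSET_C.get(cpu, 0)

print(tdie_from_tctl("1800X", 75))          # a reported 75 C is really ~55 C
```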
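
To put numbers on the frame-time argument in post 9: frame time is just the reciprocal of the instantaneous frame rate, so a stall from 60 fps down to 30 fps doubles how long a single frame hangs on screen even if the average barely moves.

```python
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps                     # milliseconds spent on one frame

print(f"60 fps -> {frame_time_ms(60):.1f} ms per frame")   # ~16.7 ms
print(f"30 fps -> {frame_time_ms(30):.1f} ms per frame")   # ~33.3 ms
# An average of 58-59 fps can still hide plenty of 33 ms frames,
# and those spikes are what you actually feel as stutter.
```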