leadeater

Moderator
  • Content Count

    17,027
  • Joined

  • Last visited

Awards

About leadeater

  • Title
    Fanatic
  • Birthday Sep 23, 1987

Profile Information

  • Location
    New Zealand
  • Gender
    Male
  • Occupation
    Systems Engineer | IT

System

  • CPU
    Intel i7 4930K
  • Motherboard
    Asus Rampage IV Black Edition
  • RAM
    16GB G.Skill TridentX F3-2400C10-4GTX
  • GPU
    Dual Asus R9-290X
  • Case
    LD PC-V8
  • Storage
    4 512GB Samsung 850 Pro & 2 512GB Samsung 840 Pro & 1 256GB Samsung 840 Pro
  • PSU
    EVGA Supernova NEX 1500 Classified
  • Display(s)
    Dell U3014 30"
  • Cooling
    Custom EKWB loop, 3x 480mm rads, everything cooled inc. RAM (why not?)
  • Keyboard
    Razer BlackWidow Ultimate BF4
  • Mouse
    Mad Catz R.A.T. 5
  • Sound
    Custom built speakers, home theater sound
  • Operating System
    Windows 10

Recent Profile Visitors

23,953 profile views
  1. Possibly, though it would be an odd choice if there is only 1 large GPU chip per IO chip; that would be rather bandwidth and connectivity limiting for not much gain, unless they plan on updating the GPU chips faster or having many different ones etc. Might be a high speed, low latency L4 cache, like RDNA2 but off chip, on package alongside the main GPUs.
  2. My guess is the two large chips in the middle are the GPU elements, the 8 chips around those are HBM and the two in diagonally opposite corners are....???
  3. Keep in mind that while it may have the same core configuration as the iGPUs in Intel CPU products, it will have a higher power limit so it will perform better; even if it were paired with an Intel CPU with the same iGPU cores it'll have greater performance. They also have dedicated VRAM and higher effective bandwidth, which will further increase performance. Very much looking forward to reviews of these; no matter how underwhelming the raw performance might be, it'll be damn interesting to see.
  4. Boost is always held back by the motherboard; the boost settings are a system configuration, not a CPU configuration. There is a default boost configuration loaded into the CPU microcode, but this is overwritten as soon as you put it into a motherboard that has a different default configuration, i.e. any Z series motherboard after the 200 series and possibly even that generation itself. An i7 10700 in an HP EliteDesk computer will boost to 224W for 28 seconds then drop down to 65W under any benchmark, every time (a rough sketch of this PL1/PL2/Tau behaviour follows after this list). This will also be true for Dell and Acer or any other OEM PC targeted at
  5. Well, the context that needs to be applied is that this is specifically related to motherboards and motherboard settings; Intel can only give a product specification and rating for the product itself under their test conditions and parameters. Now I know Intel allows the PL2 values to be changed, both power amount and time, and actively encourages and benefits from it for gaming vendors/motherboards, but these power draws are artifacts of those parts combinations and configurations alone. Where the i7 10700 is most used is in corporate OEM computers that use the Intel specification so do indee
  6. Moved to New Builds and Planning. @kylander Have a read of the posting guidelines for this forum section and update your original post; you'll get better advice if you provide the extra information there.
  7. When it comes to game optimization and readiness, the Studio drivers, from my understanding, just lag behind the gaming ones by months. So there may be potential for problems, but lately I've really only seen this affect performance; it's not like back in DX9 to very early DX11 where games might not run at all or would be graphically broken without the most current drivers. That used to be rather common, so it's great that it's far less so now.
  8. No idea, it has to be better than Movie Maker though, but then everything is better than that.
  9. Or Microsoft Movie Maker. Jokes aside, it might well become anything but Adobe if they go down this path. What's next? Quadros and Titans only?
  10. How about you ask me the number of times Plex has stopped working in the many years of using it before you try to raise a need for a backup method of accessing files, which I can do if I really want. There are many what-ifs that could apply to anything; the real question is, do they actually matter? Typically no. Like I said, Plex has an equal chance of breaking as SMB does, so I fail to see your point. I can access the files over SMB, but that is entirely outside of the Plex configuration on my system. If SMB goes down then Kodi goes down, if your NAS goes down Kodi goes down, if your house ca
  11. Kodi has an equal chance of going wrong as Plex does; there is zero effective difference here. It doesn't matter how dumb you think the player is, or isn't: it works extremely well, is very lightweight on the device, is widely supported across many devices, and itself supports native internet playback. SMB is just software, like Plex is; both are highly robust. If you're running Plex you do not need SMB at all, so your diagram is not actually correct. You can host your Plex library on a network share, but most people do not do that and use a local drive or volume instead. My Pl
  12. You don't have to use a GPU for transcoding either; like someone said, it's entirely optional. My Plex server is CPU only, running as a VM, and can transcode a few streams perfectly fine (a minimal CPU-only transcode sketch follows after this list), though it can depend a bit on the bitrate of the original as some of the remux copies of my Blu-rays are way on the extreme end (the newer ones). Also, at idle a GTX 1050 is about 3W so it's really not a lot; newer similar GPUs can sometimes idle a tiny bit lower than this too. To address the original question however, not every device can actually play videos from an SMB share. Most devices have some kin
  13. It's as if there is a fundamental difference between ABC and the others; I wonder whatever that could be? When two mega corporations fight, like children most often, there are never any winners. It's like a game of Whose Line Is It Anyway: the points don't matter.
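
On the boost behaviour mentioned in item 4: here is a minimal sketch of how the PL1/PL2/Tau limits shape sustained package power, using the values quoted above for an i7 10700 in an OEM box (PL1 = 65W, PL2 = 224W, Tau = 28s). This is a simplified illustration only; the real firmware tracks an exponentially weighted moving average of power rather than a hard time cutoff, and the limits themselves come from whatever defaults the motherboard loads.

```python
# Hypothetical sketch of PL1/PL2/Tau boost limits, not Intel's actual algorithm.
# Values match the i7 10700 OEM example quoted above (65W / 224W / 28s).

PL1 = 65.0    # long-term (sustained) power limit in watts
PL2 = 224.0   # short-term boost power limit in watts
TAU = 28.0    # seconds the package may run above PL1

def allowed_package_power(seconds_under_load: float) -> float:
    """Return the rough package power cap at a point in a sustained load."""
    # While the turbo budget lasts the board lets the CPU draw up to PL2;
    # once Tau expires the cap collapses to PL1 and clocks drop with it.
    return PL2 if seconds_under_load < TAU else PL1

if __name__ == "__main__":
    for t in (0, 10, 27, 28, 60, 300):
        print(f"t={t:>3}s  cap ~ {allowed_package_power(t):.0f} W")
```

Swap in different PL1/PL2/Tau values to see why the same CPU benchmarks so differently on a Z series board versus an OEM desktop.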
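
On the CPU-only transcoding mentioned in item 12: the sketch below shows roughly what a software transcode looks like, driving ffmpeg with the libx264 (CPU) encoder and no GPU involved. The file names, resolution, and bitrate targets are made up for illustration; a real Plex server invokes its own transcoder with its own arguments.

```python
# Hypothetical CPU-only transcode sketch using ffmpeg's software H.264 encoder.
import subprocess

def cpu_transcode(src: str, dst: str, height: int = 1080, video_kbps: int = 8000) -> None:
    """Transcode src to an H.264/AAC MP4 using only the CPU."""
    cmd = [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",   # downscale, keep aspect ratio
        "-c:v", "libx264",             # software (CPU) H.264 encoder
        "-preset", "veryfast",
        "-b:v", f"{video_kbps}k",
        "-c:a", "aac", "-b:a", "160k",
        "-y", dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Example file names are placeholders.
    cpu_transcode("remux_bluray.mkv", "stream_friendly.mp4")
```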