Posts posted by Egad

This is a very odd video; obviously the audio issue is at the forefront of it, which to me also made it feel very low energy.  It had periods where it just felt like watching Anthony mutter as he tinkered with an old computer.

     

The scripting and planning of this also felt really off.  Like why do you need a 2.5 gig NIC?  Why is it even mentioned?  What's the overlap of people with a second or third gen i7 as their primary rig who also have a NAS or something in their home that can do sustained output at 2.5 gig speeds, yet never thought to put a 2.5 gig card in their desktop?  I mean, I could see it if this was aimed at a college student and we were talking about how to leverage your dorm's network on a budget, but for a setup where this box sits on a desk, you run a cable from the box to your switch, and the switch in turn connects to some cable modem, that 1 gig NIC was more than enough.  The whole network thing just kind of limps along until a muttered conclusion of "So I guess higher speed NICs may not be worth it...".

     

The USB 3 and Bluetooth stuff felt kind of half baked.  I can see reasons there, but elaborate on them.  For most folks who plug a keyboard and a mouse into their computer, USB 2.0 is fine.  Adding two USB 3.0 ports on the back isn't a huge value add unless you're constantly pulling media off a camera or something over a cable.  These decisions should be driven by "Hey, I have this peripheral I want, and here is how I can expand my system to use it."

     

It feels like the two points with solid application (a GPU makes it better at games, and >1666 MHz DDR3 is often asking for trouble pre-Haswell) get lost in all the digression about other upgrades.

In addition to the below-ambient problem mentioned above, you'd need a pressure regulator to deal with the fact that the system you're plugging into can easily be at 4 or 5 bars, while most consumer water cooling gear can only handle 1 to 2 bars.  You'd always want to control the flow, since at the rates the commercial stuff moves, any leak would become a catastrophe very quickly.  Especially since the max fluid dumped into your case is based on the amount the building system holds rather than just the finite amount in your loop.

     

    I think the failure cases make this too severe to be attractive.  

There seem to be two different things here.  The first is that the Hackintosh doesn't have "stacked" radiators, in that the air doesn't go directly from Radiator A to Radiator B; it goes from Radiator A to the internal volume of the case to Radiator B.  I agree with LTT there, the air will still pull heat off the rear radiator on its way out.  In general you'd want one big radiator used on exit, so that your airflow is best, that is:

     

Ambient enters case -> picks up a minor amount of heat off the VRMs and such -> crosses large radiator and picks up CPU and GPU heat -> exits case

     

But as Linus says, you gotta work with what fits in the case, and the delta between that model and the one they used is probably nominal.  The cases just aren't that airtight and he's right about that.

     

I think Corsair is much closer to being right about the server build and the direct stacking though, especially once you put the server in the rack and surround it with a bunch of other servers kicking out heat.  Nothing LTT did in the video really addresses that, aside from the one comment about Jake trying to remove a radiator, with no clarity on whether this was just done on a table or if the server actually went back into the rack.  To properly address that point, LTT should actually 3D print the proposed baffle, put the server in the rack, and benchmark it during the workday (while the other servers are also under normal workload and raising the ambient temps of the room).

  4. Quote

     


    So first question is, Why do used GPU's fans wear out so quickly even though they're supposed to last at least 4 years?
     
GPU fans are rated to last a certain number of hours.  So when a gaming GPU claims its fan will last N years, that figure comes from the fan being good for X hours; the years it will last is X / (52 * average hours someone games per week).  Your RX570 was running all day, so 168 hours a week versus the average gaming time.  Plus, if the GPUs were all slammed into a system together, there was probably more heat and the fans spent more time at high RPMs, which also shortens life.
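As a back-of-the-envelope sketch of that math (the 30,000-hour rating below is a made-up illustration, not a figure from any fan datasheet):

```python
# Fan lifetime in years = rated hours / (52 weeks * hours used per week).
def fan_lifetime_years(rated_hours: float, hours_per_week: float) -> float:
    return rated_hours / (52 * hours_per_week)

# A ~15 h/week gamer vs. a card mining 24/7 (168 h/week):
print(round(fan_lifetime_years(30_000, 15), 1))   # 38.5
print(round(fan_lifetime_years(30_000, 168), 1))  # 3.4
```

Same fan, wildly different calendar lifetimes once the duty cycle changes.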

     

    Quote

     


    So second question is, why do some used GPUs look different from the ones on the official website of the make?
     
XFX makes multiple 570 variants.  I count 7 on their North American website; they may have more for other regions.  Different variants hit different guaranteed overclocking bins.  You also mention you flashed a custom BIOS on it and the cooler is failing, so there are several variables at play here.

     

    Quote

     


    So third and final question is, Why my RX 570 doesn't reach full clock speed at default voltage?
     
    Probably because it's throttling due to heat or power limit and undervolting reduces heat/keeps you from hitting the power limit.

Dells don't always obey Dell law though.  I agree that in theory, if in Slots 1 and 2 you have Matched Pair A and in Slots 3 and 4 you have Matched Pair B, it should work as long as neither matched pair violates the rules, even if pair A and pair B are not equal.

     

However, my personal experience with Dells is that "should" does not mean they will.  I would agree with the others: you can clean the RAM slots, reseat the CPU, and generally hope for the best.  Hopefully one of those fixes it, but if you can boot with Pair A or Pair B but not both, the unfortunate solution is probably dumping one of the pairs in favor of four of the same.  I've been burned enough that I really only do memory upgrades for Dells using the Crucial tool:

     

    https://www.crucial.com/store/advisor

     

The reality is, if Dell never shipped a config of say 2x4 + 2x2, then they never validated it.  Or maybe they did validate it, but the particular timing differences between the pairs you have don't work because of a bug in the BIOS around downclocking to the lowest common speed, etc.  At least 20% of the time I've gotten burned doing exactly what you're doing.  The fact that either pair works but both pairs together don't makes me think this is one of those cases, with an off chance that the problem is something not seated right.

At that point your options are either an R5 3600 or an R7 3700X, with exactly how many VMs you'll have running simultaneously answering the question of which one you want.  This also influences the RAM, etc., and in turn determines how much money you have left over for a GPU after doing the core of your system.

     

The reality is, if your primary purpose is building a workstation-type rig and you have a 1k budget, gaming will take a hit.  Especially if you need to throw in a monitor, a Windows license, etc.

  7. I think it's a reasonable price, you can negotiate though.  

     

The PSU, or a comparable one, is ~40 to 45 new.  Best Buy for example will sell you an EVGA 650W bronze efficiency PSU for 44.99, and I'm sure there are 500s and 550s out there for less.  A used 1070 is a bit above 200 if you go the eBay route, a bit under if you do a local deal where no shipping or third party cut is involved, although it's hard to be solid on the latter given regional variation.

     

    550 W should drive your system.

     

Personally, I'd offer no more than 180 for the graphics card alone and consider going out of pocket for a better PSU the better overall deal, even if it means spending ~220-230 total, but this isn't bad either.  Odds are the seller wants to bundle in that system integrator level PSU because no one is going to buy that thing on its own.

  8. I'd go:

     

A 5400 RPM drive.  You have a 1 TB SSD, so you'll really only have data on the spinning disk.  Get a 2 TB WD Blue or HGST Deskstar that does 5400 RPM.  5400s have a longevity edge, are quieter, and use less power.  If you actually have a use case for a 7200 RPM drive, you'd be better off with a SATA SSD as your second tier.

     

MSI X570 mobos aren't great, and in general I think a setup where you go with a pricier X570 just to get something that says X570 and has wifi is not the best move.  You're running a 3700X on a stock cooler; you can cut there and have more money for a fully modular power supply, or just more money in your pocket.  You can either step down a bit within X570, since you really aren't using any of its features, or drop to a B450.

  9. 144 Hz at 1440p isn't happening for Starcraft on that budget.  Starcraft relies heavily on a single master thread, you may have stuff spawned off, but your frame rate lives and dies off that main thread.  You'd want to be in the Intel camp on a 5 GHz chip, although even then in say late game 4 v 4 I doubt even a 9th gen Intel locked on 5 GHz is going to make it happen.

     

    The easiest path is just to aim for 60 fps.  You can get a higher refresh monitor as long as you get a good quality one whose adaptive sync works with your chosen GPU brand and has the range to handle the late game FPS drops.  

     

    Also making a ton of VMs means spending more on RAM and disk space for their virtual hard drives which at this budget is going to cost you fps in games since something has to give to cover those costs.  Why exactly are you making a 'ton' of VMs?  

  10. Used Dell Optiplex or equivalent business tower ~120 to 150 dollars.  Don't buy off Dell, Lenovo, etc directly.  They always charge ~100 over market.  Find the local guy flipping business towers.

    GTX 1650 149.99

    One additional stick of good old green Crucial RAM (assuming the used tower comes with 1x 4 GB)

     

    Spend the rest on a SSD if you want.

     

At the 400 range, you're already going to be compromising on PSU, case, etc. to come in under budget.  There's not a lot of stuff in a 400 dollar rig you'd want to carry forward to your next build.  Plus, if you're lucky, you'll get Windows for free with the used business computer.  If you spend some time surfing the local shops, you can find a Dell, Lenovo, or HP tower that uses standard power pins for the mobo.  Then you also have the option to swap in a more powerful PSU, which in turn enables a better GPU.

  11. Vega 64 no.

     

500w is the official minimum for the Vega 56 from AMD.  Note that Gigabyte has models of the V56 that recommend 650w PSUs because of aftermarket tuning, so the specific Vega 56 card and how you tune it matters (undervolting is probably good, OCing is probably a horrible idea).

     

So as long as you confine yourself to AMD's stock power profile, a conditional yes, depending on what else is in your system.  I personally wouldn't, but that all comes down to personal comfort with running little headroom on your PSU.

The only value prop on Threadripper 1 is that in the future you can buy a later gen/higher core count Threadripper.  But you're limited to Threadripper 1 and 2 due to the socket shift for Threadripper 3.  I'd honestly rather have 8 3000-series cores than 12 1000-series cores.  A 2700X almost tied a 1920X in transcoding (the TR chip only had a 3 fps edge on h264 medium quality).  I can't find any head-to-head benches of a 3700X vs a 1920X, but I'd expect the 3700X to win.  Most benches are of the 3700X vs a 1950X, with the 3700X losing by 5 to 7 fps despite having half the cores.  In GN's 1920X revisit you can see the 3700X pulling ahead of the 1920X in Premiere and such.  GN actually ends up recommending the 3600 over the 1920X for most folks.

     

    To me it is:

     

    B450 Mobo: ~120 bucks

    Ryzen 3900X: ~450 bucks (which a B450 like a Tomahawk can handle at stock clocks)

    570 out the door

     

    X399 Mobo: ~270 bucks

    TR1920X: ~200 bucks

    470 out the door

     

I think the X399 mobo destroys the value, unless you go for a refurb mobo and cut the price, and/or your long play is to get a used 2990WX in a few years.  You either pay ~100 more (the 3900X option) for something that crushes the TR1920X, or you spend ~299 bucks on a 3700X, save ~50 bucks over the TR1920X, and still have a system that I think will perform better overall for Plex duties.

  13. 4 hours ago, TaxManD1 said:

    I plan on starting from scratch and doing a full build 

    So I have a bunch of people transcoding Both Audio & Video. Im the only one Direct playing. They will all be transcoding & some even pull surround sounds as well. So the users are family & friends some are using DSL others are using FIOS Gigabit. I currently have 200/50 Internet myself and never really have an issue too much unless everyone is on at the extact same moment. I do have 4K as well but I block that library from everyone other then myself. So starting literally from scratch on a build what would you suggest ? 

    CPU heavy for sure in those situations. 

     

With your budget you're in an interesting spot.  You have lots of options with regard to a Ryzen 8 to 16 core processor (3700X, 3900X, 3950X) that won't blow your budget, will handle the load, and should last for some time.  I think the 3700X more than meets your needs, and the 3900X and 3950X are more if you want a big beefy home server that might end up doing other stuff.  Honestly even a 3600 probably works, but if the goal is "build this and don't touch it for a while", I'd go 8 cores.

     

You also have Threadripper as an option, but unless you source used parts you'd probably end up creeping over the 2k limit.  You'd be looking at say 1,400 for a 3960X, 500 on a mobo, and you'd still need the other pieces.  The other thing is Threadripper is a more niche platform, so you have to put up with the enthusiast-space teething problems.

     

Where I would probably look is:

    Ryzen 3700X ~300 USD

    B450 Motherboard

    EVGA 1660Ti Black ~250 USD

    8 to 16 GB RAM (2666 MHz green stuff is fine).  16 GB of Crucial 2666 MHz should be ~75 dollars, 8 GB should be ~40 dollars

    Tier A PSU

SATA SSD Boot Drive OR NVMe drive if you want to save the SATA port for the data drives

However many drives you need or want to spend the cash on.  Just avoid Seagate and their abnormally high failure rates.  HGSTs are really good and WD Reds are also good.

     

     

    Some specific notes:

The B450 motherboard's main purpose in life is to have SATA ports, so don't get the cheapest one out there.  Get one with more SATA ports.  I would say the MSI B450 Tomahawk with its 6 ports is about as low as I'd go, since the fewer ports you have, the more it forces you to connect only high-capacity drives.

     

Since you need a GPU anyway given the lack of an iGPU on your CPU, might as well go with the 1660Ti.  I find it a nice quality of life improvement: when you rip a Blu-ray and find it's VC-1 or whatever, you can just immediately feed it to the Turing GPU to transcode and it won't take long.  That being said, literally any video card works.  At the end of the day your GPU only needs to drive your display; your CPU is doing the transcode work.  You don't even strictly need a GPU, in that you could install a Linux distro that boots headless and remote in from a different box.

     

For RAM, 8 GB is more than enough; you just need RAM to hold your videos as they play back.  The only reason to go up to 16+ GB is if you plan to do any kind of memory caching or such.  Green Crucial 2666 MHz DDR4 is the way to go.  You don't need anything fancy.

     

Power supply you want something good, because this thing is on 24/7/365.  I'm partial to the Seasonic Prime Titanium line.  They're pricey, but they run fanless below ~40% load, and I don't mind paying for efficiency and the fact that there are no moving parts in its default state.  You can easily save a buck here by looking at something else.

     

Disk wise, it kind of comes down to the ports on your mobo and your future expansion.  If say you have 6 SATA ports, with 1 port assigned to your SATA SSD boot drive and 4 assigned to a RAID 10 pool, you can only add one more drive.  If you buy an M.2 boot drive then you'd have two SATA ports free (or in the future you could buy a PCIe card to give you more SATA ports, etc).  I'm partial to 10 TB Western Digital Red drives and 12 TB HGST drives.  I'd say pick something you're comfortable buying 4x of, then go RAID 6 or RAID 10 (plus of course external backup), and you can add more drives later.  You can also look into Unraid, which lets you pool drives that don't match.
     
  14. It all depends, for 4 to 6 streams it boils down to:

     

    How many are direct streams (that is no transcoding)?

    How many are audio only transcoding?

    How many are audio and video transcoding?

     

    If you're mostly doing direct play, the answer is that you don't need much.  Generally speaking internet speeds will be the main bottleneck.   

     

For video, NVidia with its NVenc encoder is the fastest way to transcode, but note the non-Quadro cards only support two streams (and a couple only support 1).  There are custom drivers available (third party, not nVidia) that let you do more than two streams and turn say an EVGA 2060 KO into a transcode beast, but you have to support the drivers yourself, and you can't totally ignore the CPU since audio is still offloaded to it.  You can also source some of the lower end Turing Quadros (normally second hand, since they ship in a bunch of Dell Precisions and such) and use those.  Pascal Quadros can be had for cheap, but Turing's NVenc is a really good one, so it's hard to recommend a Pascal Quadro at this point.  The other thing to call out is NVenc still produces a higher bitrate, which presents the problem of saturating both your upload and the device playing it back.  CPU based h265 still gets you the best bitrates.

     

What it kind of boils down to though is: if you're rarely over 2 video transcode streams, get a Turing card like the 1660Ti.  You can check nVidia's NVenc support matrix.  Make sure it has a TU core, has 2 concurrent sessions, and doesn't have an asterisk next to its Number of NVENC Chips (ex: the 1650) showing it's a Volta transcoder in a Turing chip.  If you want more sessions, then it's either a Quadro, third party drivers on a GeForce, or going CPU based.
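That decision tree can be sketched as a tiny helper (the labels are my informal summaries of the advice above, and the 2-session cap reflects nVidia's consumer-card limit described in the post):

```python
# Rough sketch of the encoder-choice logic above: peak concurrent video
# transcodes drive the hardware pick. Purely illustrative, not product advice.
def pick_transcoder(peak_video_streams: int, ok_with_patched_drivers: bool) -> str:
    if peak_video_streams <= 2:
        # Within the 2-session NVENC cap on consumer Turing cards
        return "Turing GeForce (e.g. 1660 Ti)"
    if ok_with_patched_drivers:
        # Third-party patches lift the session cap, but you support them yourself
        return "GeForce with third-party driver patch"
    return "Turing Quadro or CPU-based x265"
```

Audio-only transcodes stay on the CPU either way, so the stream count here is video transcodes only.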

     

For audio, it's honestly not a huge load on the CPU, but it's enough of a load that you want to avoid going too cheap.  That being said, if you're 100% confident the CPU only really needs to handle audio, even something like a Ryzen 1600AF easily gets the job done with headroom to spare.  One caveat is if you're doing lots of surround sound.  The one time I saw my server's CPU get wrecked was when I was watching a Blu-ray rip with 7.1 on the LAN and a buddy was pulling a video with 7.1 to his media PC.  The videos were direct playing, but because we both used Macs, and Macs only support a subset of the audio codecs (even via the native app), the CPU got spanked trying to convert both.  If you're just turning AC3 stereo into AAC stereo or such though, you can use almost anything.

     

Finally, if half your users are family members on DSL (my use case, for example), then CPU is king.  You can take say a 480p file and CPU transcode it to 0.9 mbps for Grandma and Grandpa to watch in their cabin on Lake Huron, because they only have 2 mbps DSL but Grandma still wants to watch the Golden Girls and Grandpa watches Victory at Sea on loop pretty much.  Turing NVenc is great, but still not that efficient.

     

For me personally, I invested a bit more in spinning disk and I keep two copies of each file: the rip straight off the Blu-ray, and a CPU transcoded one that I automatically generate via Handbrake.  I name my Blu-ray rips "Whatever - bluray.webm" and then just have a scanner that every night looks for cases where "Whatever - bluray.webm" exists and "Whatever - h265.webm" does not, then the Handbrake command line is used to generate the h265.  The files themselves sit on different disks; the blurays hang out on WD Reds in RAID 10 and are backed up.  The h265 files hang out on whatever disks (some 6 TB Blues and other stuff I've snagged on sales over the years) with no RAID and are considered disposable derivatives.
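A minimal sketch of what that nightly scanner could look like (the directory layout matches the naming scheme above; the HandBrakeCLI flags are my assumption of a plausible x265/stereo invocation, not the author's actual preset):

```python
import subprocess
from pathlib import Path

def find_missing_h265(root: Path) -> list[Path]:
    """Find 'X - bluray.webm' files with no 'X - h265.webm' sibling."""
    jobs = []
    for src in root.rglob("* - bluray.webm"):
        dst = src.with_name(src.name.replace(" - bluray", " - h265"))
        if not dst.exists():
            jobs.append(src)
    return jobs

def transcode(src: Path) -> None:
    """Generate the h265/stereo derivative next to the bluray rip."""
    dst = src.with_name(src.name.replace(" - bluray", " - h265"))
    subprocess.run(
        ["HandBrakeCLI", "-i", str(src), "-o", str(dst),
         "-e", "x265", "--aencoder", "av_aac", "--mixdown", "stereo"],
        check=True,
    )
```

Run it nightly from cron; only rips missing their derivative get queued, so completed work is never redone.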

     

The h265 derivatives also cut sound down to just stereo, which increases the odds I can direct stream sound.  This works for my case since my users all use the native client (if your users are on the web client, you want to be h264 and AAC audio) and I taught them to use the Play Version tool and go for the low bitrate one.  Eventually when say AV1 is more of a thing, I'll just replace the h265 derivatives with AV1 ones (this is also why I'm CPU biased: sure, Turing has awesome h265 support, but my 6 year old CPU can do AV1, something my 1 year old GPU can't).  I have a 1660Ti without any custom drivers on hand for when someone can't direct play or wants say 7.1 and needs to stream my direct Blu-ray rip.  My CPU is an i7-5960X, which does just fine for audio.  Probably overkill; I retire my old gaming rigs to Plex duty, so it's what I had.

  15. 59 minutes ago, The117thCon said:

    two 2080tis cost the same as a titan rtx though least with the titan you can use it all the time where as sli is mostly dead

Obviously the true baller move is dual Titans, but looking over the GamersNexus results I'd go 2x 2080Ti over 1x Titan:

     

    https://www.gamersnexus.net/guides/3419-sli-nvlink-titan-rtx-benchmark-gaming-power-consumption

     

The difference between a single 2080Ti and the RTX Titan isn't major, but you can get big gains from multiple GPUs when the game actually implements support.  It comes down to whether you want 5 to 10 more fps in most games, or want to gamble big on 70+ fps uplifts in DX12 titles that do multi-GPU well.

     

    Plus more and more titles are shipping with at the very least the ability to unload the async compute work onto a second GPU, since DX12/Vulkan make that easy and you can do it even if there isn't a high speed interconnect between the GPUs.  It's just taken time for engine makers to bake that into their engines as easily available tooling.

  16. There is normally a 2070 Super to be had at 499.  Amazon has the Zotac one in stock at that price and B&H has the Gigabyte one.  So you need another ~120 to get one.  Dump the fans and you're halfway there.  Dumping the case gets you the rest of the way there.  You can buy fans and move the computer to a new case later.  Adding CUDA cores to a GPU post purchase is not an option.  

     

Also, pushing past 3200 MHz, while it does help the Infinity Fabric speed, puts you into the land of diminishing returns.  I'd definitely want say a 2070 Super with 3200 MHz RAM over a 2070 with 3600 MHz.  That's a place to save maybe 10 bucks or so if you must, but I'd cut the case and fans first.

I'm confused, was the Air specifically required?  Because the 13" Pro starts at 1,299, that's the quad core configuration, and it's reasonably competent.  Sure, the Apple tax still applies, you won't be streaming much, and you'll still need to turn down graphics so as not to overload the Intel Iris graphics, but you're not going to run into the real thing killing you: a dual core CPU that can't push past 1.6 GHz.

     

So if it was just that OS X was required, they should have gone entry level MacBook Pro instead of buying the upgraded ultra thin.  If it's the Air specifically, I'd imagine the school goes for them because you know you won't be gaming on them in class and they can do a full school day without needing to visit a charger.

Not surprised, to be honest.  The higher frequency on the Intel chips still gives them an edge.  Even with AMD holding the IPC edge on most workloads, it only becomes relevant if the Intel chip needs two ticks of its clock to do the work.  Not many games put that much load on the CPU; both chips generally get the frame done in the same tick.  Had they thrown in some Total War with massive armies, or some late game clicking of the End Turn button in Civ, you might see better results for AMD, since the AI leaning on the CPU might force Intel to tick twice, but that's all just speculation.

     

I have an AMD system and will fully admit I bought it on the grounds that I'm not yet moving to a high refresh rate monitor, since frankly I want to see the OLED market mature a bit more.  I figure I can either toss a higher frequency Ryzen 4000 in there and resell my 3900X, or make the 3900X box my new home server and go with an Intel #-KS chip.  For anyone pulling the trigger on a monitor now who has the budget though, go with a KS.

     

The weird thing to me is the Seagate drives.  People who are building at this level but don't have a NAS are weird.  I guess you gotta use those surplus drives so you don't have to mail them back though.

  19. TV is more bang for the buck right now.  I have a projector and an in ceiling screen, with space to mount the TV on the wall.  So when the screen is down it covers the TV.  I've then rotated between projector and TV upgrades over the years.

     

I have the Samsung Q60, which is great for its response times.  The two downsides are its sound and viewing angles.  Sound isn't a problem for my setup since all my audio comes out of my media PC and goes directly into the receiver, and all my seating pretty much faces the TV (no wing seating).

     

    I just have a 1080p projector which I still like and does just fine, but I feel like I want to wait a generation before moving on a 4K projector.  

     

I'm kind of waiting for the BenQ TK850 and BenQ TK800M to merge into one uber projector in a future iteration before I go through the hassle of upgrading.  Basically the color accuracy of the TK850 and the brightness of the TK800M.  Whereas I feel like with the current 4K TV options, there exists a reasonably priced one I'll hold onto for years.  If I buy a projector now, I'll just be unbolting it from the ceiling in 18 to 24 months because something much better came along.

  20. I did CAT 6A from my router to:

     

    1. My main desktop rig's location

    2. The location of my NAS (2x to it)

    3. The media PC that plays back on my projector

    4. A spot I might someday use as my office (meaning my main desktop would now be there)

     

    My logic was that a 4K Bluray with 7.1 surround sound streams at 100 mbps, the best I've ever gotten off Steam is ~400 mbps (I have a 2 gig fiber drop for the house and live 25 miles from the Steam server for my region) so saturating a 1 gig connection to one device takes some serious effort.  Really the only things that saturate my network are direct file copies and when I'm using the feature of letting one Windows box share its updates with others on the LAN (and that happens at 4 am, so meh).
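As a quick sanity check on that math (the 90% usable-throughput factor is my own rough allowance for protocol overhead, not a measured figure):

```python
# How many concurrent full-bitrate streams fit on a link, per the figures
# above: a 4K Blu-ray remux with 7.1 audio runs ~100 Mbps.
def max_streams(link_mbps: float, stream_mbps: float, usable: float = 0.9) -> int:
    # usable: assume ~90% of line rate survives protocol overhead
    return int(link_mbps * usable // stream_mbps)

# A single gigabit drop comfortably carries several 4K remux streams:
print(max_streams(1000, 100))  # 9
```

Which is the point: one device almost never saturates a gigabit drop, so many 1-gig drops beat one 10-gig drop for a house.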

     

My theory is that someday, when I'm rocking an 8K projector or something, I might be happy I have 10 gig between my NAS and media PC, but honestly I question if even that would saturate 1 gig.  So I'm fine with sticking lots of Cat6 in the walls and, like W-L says, doing lots of drops.  It's more about supporting a bunch of devices each wanting say 200 mbps than one device wanting 10 gig.

     

    I'd also add on, think about where you want wifi spots and home automation. 

     

With regard to wifi, I wired up all my closets on the grounds that I might someday hide a wifi access point in there, or a mITX build or something.  That way all my access points do wired backhaul for sending data back to the router.  I operate on the assumption that someday we might have a faster wifi with an even shorter range, so I hit all the bedroom closets, the garage, the coat closet on the first floor, etc.  I came in just above the top shelf in each closet, so in theory I can just toss the wifi device up there and feed it via power over ethernet, and I don't need to have it in my bedroom with its lights blinking away all night.  I also have a drop up in my attic so that I can snake a cable out of a roof vent, and I have a pair of access points on my roof, one for the front yard and one for the backyard.  It's much easier to have coverage in your yard when you don't need to punch through an exterior wall to get the signal there.  I thought about doing a drop on my back deck, but I was concerned over the ease of physical access to it, so I figured Wifi 6 is good enough.

     

With regard to home automation, power over ethernet is much nicer than having the device on wifi with a battery you need to change or recharge.

     

You have some interesting claims I'd love to clarify.  Also I feel kind of bad, like I'm triggering some kind of compulsion loop here or something.

     



    Fish out of water Polaris 11 would not beat a Gt 1030 GDDR5

     

First, you are claiming that a GT 1030 beats any Polaris GPU?  So we take a computer, plug in the GT 1030, run some benchmarks, then remove the GT 1030, plug in the RX470, and run the benchmarks again.  You are stating that the GT 1030 numbers will exceed the RX470 numbers.

     

Because that's frankly total garbage.  The RX470 was typically recommended over the GTX 1050Ti given its performance edge, and the GTX 1050Ti and GT 1030 are both Pascal cards with the 1050Ti two increments ahead of the GT 1030 in nVidia's lineup.  You're arguing the sky isn't blue at this point.

     



    The jaguar cpu is not a polaris 11...Thats the closest comparison only.  To be better when paired or equal too.  Pair a lessor gpu like mine with a better cpu then the jaguar and it will be better hardware wise every time.

     

PS4 shipped with an HD7870-class GPU that had two compute units disabled due to power and cooling concerns.

PS4 Pro shipped with a downclocked RX 480 equivalent, which is Polaris 10.  So when comparing GPU abilities, for a card to be comparable to the PS4 Pro, it needs to slide in between the RX470 and RX480, the two Polaris 10 cards.

     

Sure, some other things were done to glue them into the overall Liverpool and Neo chips, but that's what the graphics components were.

     

There is a bunch of other stuff going on here.  At some level you seem to want to claim whatever lets you win the argument, like your comments about Microsoft Office to me all of a sudden and your comments about emulation to steelo.  First, I'd point you back at the fact that I agree a Haswell + GT 1030 is better for CPU heavy tasks.  However, people are responding to your claim that the GT 1030 is superior graphically.  There is no situation when playing a 3D gaming title where the GT 1030 trumps any Polaris part, be it the RX470, the RX480, or the Polaris GPU in the Pro.

     

Second, as for emulation, Pascal doesn't have a true hardware scheduler for async compute, which hurts it with regard to running Vulkan and DX12 titles.  See for example the poor showing of Maxwell and Pascal cards in the recent Doom Eternal benchmarks.  At the end of the day, when a PS4 emulator does mature, you're going to be trying to play PS4 games on something that lacks hardware level async (and 99.99% chance the emulator will be written to use Vulkan) and whose GPU only enjoys a slight edge over the PS4's cut down HD 7870 while being outclassed by the downclocked part in the PS4 Pro.  The rule is you need to bring significantly more compute to the table than the console had so as to deal with the emulation overhead; even if all the instruction sets line up, you're still running the PlayStation OS on top of Windows or Linux to some degree.

     

If it's your hobby, cool; all my builds are hobbyist and don't make perfect sense.  But misleading claims about the capabilities of a machine you've stated you want to sell to others is not cool.

Your hobbies are your hobbies man, follow your bliss.  But claiming that a GT 1030 beats a Polaris based part is just plain wrong.  A GT 1030 doesn't even beat an RX 460 in Firestrike and other graphical benchmarks.

     

While I would agree that a Haswell + (pretty much any GPU that can survive on just 75w of PCIe slot power) is a better option for games known to be heavy on the CPU relative to their GPU load (Source engine titles, certain eSports titles, grand strategy games, etc), it's a much less well rounded gaming setup than a straight PS4 Pro.
