
Maxxtraxx

Member
  • Content Count: 612
  • Joined
  • Last visited

Posts posted by Maxxtraxx


  1. If you want FreeSync, unfortunately your only real high-end option is a Vega card, but those can be scarce and, I believe, pricey. If you can find one, especially with an aftermarket cooler, you'd be doing well.

     

    An RX 580 will work, but it will definitely struggle to give you the FPS you want at each resolution in AAA games.

     

    Other than those two options, that leaves you with an Nvidia 1070, 1070 Ti, 1080 or 1080 Ti and sticking with vsync to prevent screen tearing... in which case you probably want to lean towards overkill, to keep your FPS well above your refresh rate (if you accomplish that, then FreeSync and G-Sync don't make any difference, since you're overdriving the monitor refresh anyway).

     

    FYI, I run a 1080 Ti with my 1440p screen at 75 Hz with vsync, and it works very well.


  2. 1 hour ago, Bcat00 said:

    Get the Define S instead? 

    Or you can just get a 240mm rad if you are more inclined to stay with that case.

    Personally I say change the rad because, according to Gamers Nexus, the 360 rad actually lost to the 240 under load. No idea why, but I've seen it in one of his test videos.

    I'm sticking with my recommendation: a 360 rad with only 2 of 3 fans installed should perform very similarly to a 240. It doesn't require changing any components other than leaving out 1 fan, and the performance would be unchanged. If he would prefer to go through the returns process and get something new, that is perfectly fine as well; it just requires more steps.


  3. Not a perfect solution... but you could leave out the middle fan and fit everything in...

     

    IMO, a 360mm rad is WAY more than enough for an 8600k, an 8600k can likely get decent overclocks with a 120mm radiator.

     

    Remember... a 300 watt graphics card with an AIO solution (or the 500-600? watt R9 295X2) uses only a 120mm radiator, and those see some of THE BEST temperatures of any GPU you can buy.

     

    An overclocked 8600k will pull... maybe 200 watts (double the stock TDP)?

     

    edit:

    The Strix is 298mm long, and the case accepts a 315mm card with a 25mm fan, so with a 30mm rad it can fit a 310mm card. If my specs for the card length are correct, leaving out the middle fan will allow you to fit it.
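    The clearance math above can be written out as a quick sanity check (a hypothetical sketch; the 315mm, 25mm, 30mm and 298mm figures are the specs quoted above and should be verified against the actual case and card):

```python
# Clearance math for swapping a 25mm fan for a 30mm radiator.
# All numbers are the specs quoted in the post; treat them as assumptions.
CARD_LIMIT_WITH_FAN_MM = 315  # max card length with a 25mm fan installed
FAN_THICKNESS_MM = 25
RAD_THICKNESS_MM = 30
STRIX_LENGTH_MM = 298         # quoted length of the Strix card

# The 30mm rad is 5mm thicker than the 25mm fan, so it eats 5mm of clearance.
available_mm = CARD_LIMIT_WITH_FAN_MM + FAN_THICKNESS_MM - RAD_THICKNESS_MM
print(available_mm)                      # 310
print(STRIX_LENGTH_MM <= available_mm)   # True: the card should fit
```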


  4. A quote directly from an article that addresses this topic; I find myself in direct agreement with most of it. Most of this article's content goes completely untouched by the mainstream, because to them freedom means being able to freely censor any voices that dissent from being politically correct or left-wing tech monopoly corporation approved.

     

    "Do ISPs have the potential to become the content police of the internet, absent regulation? Yes they do, and that is a legitimate concern for defenders of internet freedom. It’s not hard to imagine a scenario in which an ISP, pressured by governments, activists and the media, decides to cut off access to a loathed website (say, the Daily Stormer), and in doing so undermine the principle of the open web.

    But it’s weird for ISPs to be the primary target of such fears, when it’s online platforms and services (the ones not currently subject to Net Neutrality rules) that did precisely that. Specifically Google and GoDaddy, which cut off domain support for the Stormer, and Cloudflare, which cut off DDoS protection to the site.

    ISPs could conceivably do the same thing, but they haven’t yet, outside of authoritarian countries like China and Turkey. Moreover, they are considerably more resistant to the kind of advertiser boycotts that forced YouTube away from content neutrality, because they’re reliant on subscription rather than ad revenue. They also know that if they take any steps towards censorship, well-funded Net Neutrality activists and their allies in Congress will pounce.

    That’s why they, unlike Google, Facebook, Twitter, and other platforms that censor with impunity, have made public pledges not to act as gatekeepers."

     

    Article can be found here


  5. Overall, the overclocking difference will be affected far more by the silicon lottery than by SC2 vs FTW2.

     

    The FTW2 likely has a higher allowable power limit by a few percent (possibly 120% vs 112%)

     

    They both will be bound by Nvidia's hard voltage limit.

    They both will be kept cool by very similar coolers and will have similar temperatures.

    They both will very likely clock somewhere around 2100 MHz.

     

    IMO: if you can fit it, the EVGA Hybrid models with the Hybrid AIO cooler are my favorite for the quietness and cooling capability.


  6. 2 hours ago, Bananasplit_00 said:

    both the 480 and 580 win in most games over the 1060 6GB

    The 580 trades blows with the 1060 6GB, but the 480 is a bit behind both in general. You can't go wrong with any of them.

     

    Digital foundry test: here

     

    As far as a 1070/1080 equivalent... It has been said already, 980ti or Vega 56 and 64 would be the only options. Get what you like, what is the best value and you feel most comfortable with... They're all good cards.


  7. 13 hours ago, DarthSmartt said:

    Thanks!

    Happy to help!

     

    12 hours ago, Zmax said:

    You pay more for a Founders Edition with no real advantage, since they are spec'd by Nvidia. 

     

    Look at a card you want and see what water kit will fit it. Before you do anything. 

     

    In my opinion, if my assumption proves correct that he's getting the A240G kit from EK...

     

    EK does not make GPU blocks for any other cards that are compatible with this kit; it uses all-aluminum components to get the selling price down to half of what the kit would cost with copper and nickel-plated components.

     

    Therefore he MUST get a founders edition pcb.


  8. 25 minutes ago, DarthSmartt said:

    Can someone direct me to a 1080 Ti with the same PCB as the Founders Edition? The Founders Edition just has the stock PCB, right? Graphics cards are confusing.

    As you wish!

     

    HERE

     

    More info: the Founders Edition is the "stock PCB" for the 1080 Ti. There are no other non-Founders cards that use the Founders PCB, not even the cheap MSI Aero cards; they use a custom (possibly worse) PCB.

     

    My assumption for the kit that you found would be the: EK A240G kit found HERE

     

    FYI, remember that this kit is not currently compatible with ANY of EK's other parts, as the aluminum components are not approved for use with any of their copper or nickel-plated parts. That being said... it will still perform very well and will cost FAR less to get you started in water cooling.


  9. 6 hours ago, Max_Settings said:

    The Pixel 2 was already a deal breaker with no dual-lens camera and not a good enough bezelless design. This screen is also a deal breaker, and since the Pixel 2 also has no headphone jack, I don't feel bad at all about getting an iPhone X.

    I'm not a phone snob/trendy individual soooo.... The things that interest me most are:

     

    1: Camera: It seems that everyone agrees that the camera is crazy good, with great video stabilization.

    2: Size: I have medium-large hands, and my current Nexus 5X is as big as my hand is comfortable with for single-handed use and as big as I want for a mobile device; IMO over 5 inches is silly for on-the-go use.

    3: Screen: over-1080p resolution in a 5 inch screen is crazy overkill... 400+ PPI is bragging rights and spec wars for their own sake once we get to and go beyond that.

    4: Good speakers, stereo being a plus, for speakerphone calls, for music/podcasts in my pocket while working, and for occasional video/YouTube.

    5: I do wish it had a headphone jack... not everything everyone owns is Bluetooth capable... 2 of my 3 cars have no Bluetooth, only aux input.

     

    What i don't care about:

    Bezels: so... having no bezels, so that your fingers are in the way of the screen while holding the device, makes no sense to me. IMO it sacrifices sense for looks.

     

    6 hours ago, Max_Settings said:

    TBH they are more or less the same phone with just a different shell and a few less internal pieces than the V30.

    There is ZERO evidence available for the statement made here; in fact, Google just announced that the Pixel 2/2 XL has its own separate custom-designed camera image processor, which looks very impressive on paper and is not yet enabled on the devices. Maybe go look at the iFixit teardowns of the V30 and Pixel 2 XL?

     

    6 hours ago, Max_Settings said:

    For zoom and the effects. Plus it is just part of the design of phones now, like the bezelless design, a lot of phones are going to that, if you don't have it, it looks dated IMO.

    If you need pictures better than what the pixel 2 produces... which are phenomenal, maybe a camera is in order? Same thoughts on the bezelless trend/fashion as above.

     

    5 hours ago, Sniperfox47 said:

    The things in the Verge article are more than a little overstated.

     

    The blueshift is because of the polarizer (so you can use it with your glasses) and is waaaaaaaay less noticeable in the final product than in the preview units (review units and Verizon demos).

     

    The rainbow artifacts at obscure angles I haven't seen verified elsewhere but could also be due to the polarizer.

     

    The faded and less warm colours are because it's an HDR panel calibrated for the sRGB space. A number of reviewers, including one the Verge article mentions, have specifically said it looks pretty much like what you'd expect from a professional monitor, rather than a phone. And no, it's nowhere near as green as that picture makes it out to be >.>

     

    The hazing is an issue, but you have to specifically look for it, is hard to spot with any ambient lighting, and only happens at really low screen brightnesses. It looks to be an issue with voltage control in the panel matrix, and is due to LG's pOLED panel.

     

    Not defending them, the polarizer was a dumb idea, the calibration was dumb on a consumer device, "Vivid mode" is nowhere near intense enough (it's just a 10% bump in colour intensity), and they should have made different compromises on their panel even if pOLED gives them more flexibility. Just saying there are legitimate reasons for a lot of this, and most people would have to be explicitly looking for most of these things to even notice them.

     

    You mean like how the 6 had constant storage performance issues due to forced encryption with no hardware acceleration and pretty much locked up while doing updates?

     

    Or how the Nexus 4 couldn't handle OTG because it switched to host mode but didn't output any power?

     

    Or how the Nexus 7s (both versions) would frequently fail to charge back up if you let the battery run too low, unless you held a specific button combo while plugging it in to force the chipset to reboot?

     

    Nexus 5 was the only Nexus device that wasn't *very clearly* a developer device. Pixel has gone too far the opposite direction I fear, as being *very clearly* a mass consumer device.

    The high-quality, well-informed and thoughtful points stated above just about sum up this discussion. Thank you for a thoughtful, informative and well-constructed post.


  10. 2 hours ago, Taf the Ghost said:

    Paper Launch to get all of the reviews out before the Christmas shopping season. Though it seems the supply of 8700 (non-k) and 8400 did exist enough that people got CPUs.

     

    But, in the retail space, the 8700k should probably sell around 45-50% of Intel's market, which is part of the reason why it's a problem. 

    16 minutes ago, NumLock21 said:

    Has it really launched? I thought it was more of a paper launch than an actual launch.

    Not a paper launch... Paper launch: "the phrase is used to denote product announcements that explicitly compare the "new product" with other actually available products, despite the fact that the newly announced product is not actually available to consumers." Quoted from Ars Technica.

     

    Coffee Lake 8700Ks are available and are being sold, but it appears that Intel pushed the launch up, and as such the number of chips is limited right now compared to Intel's normal launch volumes...

     

    So, my opinion on Intel's decision: instead of stockpiling product in a warehouse for a month or more and not opening sales until next month, they released a smaller number of processors to get them into the public's hands NOW, replenishing stock as fast as production allows. This means the total number of processors available is unchanged compared to a later launch date (just available sooner); the sales are simply spread out over a longer period of time.

     

    Impatience for an expensive product because not everyone can have it NOW... a first world problem, and one that tends to leave a wake of angry, entitled customers.

     

    10 minutes ago, lvh1 said:

    Guess I was lucky, I grabbed an 8700K on Monday. Out of the 12 stores I was monitoring, only one had 8700Ks in stock, and only on Monday.

    Good to hear! I would like to hear your OC results and thoughts after you've had some time with it!


  11. Has Intel ever guaranteed backward or forward compatibility between coming generations of processors and older hardware, and vice versa?

    I'm not aware that they have, so assuming and "requiring" Intel to do so based on historical trends is very misguided. That is to say, this (see below) is spot on:

    On 10/6/2017 at 8:05 AM, leadeater said:

    If you align Z87/Z97 to Z67/Z77 and Z170/Z270 the cycle is as per normal i.e. Haswell and Broadwell on both with Z97 being cut off from Skylake support. Z270 should under all reasonable analysis have been expected to have been cut off from Coffee Lake, there has never been a precedent of 3 architectures being supported on a single chipset. Again Haswell refresh is still the same architecture and Broadwell was a no show anyway.

     

    There actually are apparent and significant changes in pin usage (for power delivery, and possibly more) in the socket design, despite it being the same socket pin layout.

    Everyone can complain all they want; Intel made a decision for reasons that are in all likelihood multiple in number and greater in scope than anyone outside their engineering, finance and leadership teams will ever know. To label them entirely anti-consumer based on a personally uninformed judgement/guess about those reasons is juvenile.

     

    Intel has market dominance for a variety of reasons, some of them being shady business practices, many of them being that they have had the best product on the market for the last decade or so, and their product quality speaks for itself. From my experience, Intel puts a great amount of hard work into their products, and, like Nvidia's, when their products launch, for the most part it all tends to just work very well right at launch.

     

    IMO, with AMD... it rarely seems to feel that way... it almost seems as though their new products are still in beta at launch. Examples: Ryzen's memory headaches at launch, motherboards with BIOS problems, NVMe RAID support arriving long after launch, and Vega launching with broken drivers and overclocking utilities and many features completely unimplemented until months after launch.

     

    These are my thoughts on the subject


  12. 1 minute ago, Vincent123 said:

    What should the temperature be for a gtx 1080 and how much should I overclock it to. 

    You can overclock it as far as you like. As for temperature, obviously the lower the better, but generally most people are comfortable with anything under 80 degrees Celsius.

     

    When it comes to overclocking you can't hurt the card by moving any of the sliders too far in MSI afterburner.

     

    Generally speaking, overclocking is trial and error, so make sure you have a suite of GPU stressing utilities like Unigine or Fire Strike. Then slowly move the core clock slider up about 50 MHz at a time until you get a crash, then start inching up the core clock and voltage sliders until you hit a frequency or temperature that you're comfortable with.

    Most 1080s will hit somewhere between 2.0 GHz and 2.2 GHz.
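    The coarse-then-fine trial-and-error loop described above can be sketched in code (a hypothetical illustration only; `stress_test` stands in for manually running Unigine or Fire Strike at a given core offset and reporting whether the card stayed stable):

```python
def find_stable_offset(stress_test, coarse_step=50, fine_step=10, limit=300):
    """Raise the core clock offset (MHz) until the stress test fails:
    first in coarse +50 MHz jumps, then inching up in fine steps."""
    offset = 0
    # Coarse pass: +50 MHz at a time while the next step stays stable.
    while offset + coarse_step <= limit and stress_test(offset + coarse_step):
        offset += coarse_step
    # Fine pass: creep up from the last stable point in small increments.
    while offset + fine_step <= limit and stress_test(offset + fine_step):
        offset += fine_step
    return offset

# Example with a fake card that is stable up to a +170 MHz offset:
print(find_stable_offset(lambda mhz: mhz <= 170))  # 170
```

    In practice each `stress_test` call is a long manual run, which is why the coarse pass uses big jumps: it keeps the number of test runs small before the fine pass narrows things down.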


  13. 8 hours ago, DanielMDA said:

    I plan on going with an 11Gbps GTX 1080, specifically the AORUS Xtreme Edition; however, I'm going "cheap" on peripherals, so I'll be using this monitor:

     

    https://www.amazon.com/dp/B01N1J0B3Q/_encoding=UTF8?coliid=I1GT73PVC41YM4&colid=ZV08283R9PPF&th=1

     

    It is a 2560x1440 (QHD) monitor, but it is capped at a 60Hz refresh rate, and the 11Gbps GTX 1080 is overkill for 60Hz at QHD, so can I cap the GPU's output to a maximum of 60Hz? I want the 11Gbps GTX 1080 because it will have a longer lifespan before becoming obsolete, since I'm aiming at 1440p gaming for 3 years.

     

    EDIT: Does this Monitor have speakers? I plan on going with the 2.1 Ch Logitech Z623

    I own a 1080 Ti and that very monitor. I love the card and the monitor; my gaming area is space constrained, so I sit a bit closer to the monitor and the size is perfect for me. I love the IPS color quality and viewing angles, and the image is noticeably crisper than the 1080p panel I had, in both color quality and smoothness. My display was able to overclock to 85Hz easily, and with vsync on it works great. 

    I'm not personally a big fan of 4K; I've tried it and was not impressed in any noticeable way. I would love an ultrawide 3440x1440 IPS monitor as my personal preference... but they're silly expensive at the moment.

     

    I am a fan of your choices; I can see no reason to call this monitor bad in any way, and it is a good match for your GTX 1080. With my old GTX 1080, games like The Witcher 3 at maxed settings would drop below 60fps at times, if I recall correctly, so I think it is a good match and more future-proof than a 1070.

     

    As far as a GTX 1080 from Gigabyte/Aorus goes, if you like it then go for it. I wouldn't feel great telling you to get the cheapest 1080, like the Asus Turbo model, but a cheaper card with a quality cooler could save you some money. Still, YOU need to get what YOU want and like, not what people you don't know say because they think you should spend your money differently than you want to.

     

    This monitor has speakers... Bad ones but they are there.


  14. Hi all, after watching/listening to buildzoid's breakdown of the differences between the power monitoring capabilities of AMD and Nvidia hardware, I was left with one prominent question that I believe it answered: why has Nvidia's GPU Boost implementation/idea not been readily adopted by AMD?

    For Polaris, AMD talked heavily about efficiency, and before launch I actually wondered if they would be able to approach Nvidia's level of FPS-per-watt performance. I assumed it was because AMD had simply not run with the idea and implemented it in their software (maybe a valid idea, seeing as they just recently enabled CrossFire on their pinnacle Vega cards weeks after release).

    My uninformed self at the time didn't question that maybe GPU Boost is more than just software, and that Nvidia actually has very good power consumption monitoring hardware built into their cards that AMD simply does not have in theirs. 

    Correct me if I am wrong here, but it seems to me that AMD would stand to benefit greatly by following the same path, which would let them build cards that potentially run much cooler at given clock speeds. 

     

    I'm interested in everyone's constructive thoughts here.

     

    Here's the link to the video for your viewing pleasure:

     


  15. 16 hours ago, metso said:

    but is 1080 ti worth it for 1080p 27 inch monitor 144hz or should i go with 1080?

    If you are doing 1080p 144hz...

     

    Noise will likely NOT be an issue, due to very low GPU utilization. With my 1080 Ti at 1440p, most games use only 50-70% of the GPU's performance; it literally just pokes along at 1400-1500 MHz (normally 2 GHz), so it produces very little heat thanks to the low voltage required at that clock speed.

     

    If you can afford the 1080Ti and you want it, go for it, it will give you future proofing/monitor upgrade headroom to work with and will likely be VERY quiet due to underutilization.


  16. 9 minutes ago, Sakkura said:

    The system AMD chose was aimed at preventing wholesalers from evading limits like that.

     

    Here is a good explanation of what seems to have been going on behind the scenes.

    What I took away from that post is that a wholesaler is forced to buy the packs to receive X number of standalone cards, to "prevent" mining co-ops from buying standalone cards in bulk and to preserve stock for end users.

     

    A question that I ask is: what is the pack-to-standalone ratio? Based upon standalone availability, it FEELS like there were more packs offered than standalone cards (though this is likely a piece of info we'll never know), and AMD's intent would be revealed by that ratio. IF the pack-to-standalone ratio is 2:1 or worse, then I feel AMD is trying to force additional margin into the product by artificially inflating the price (and in the process alienating many card buyers by ruining the price/performance ratio). IF the ratio is more favorable to the standalone card, then I would agree it was done more to deter wholesale co-ops, with the idea of seeing how gamers reacted to the packs being sold at a higher price.
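    To make the ratio argument concrete, here is a hypothetical back-of-the-envelope sketch (the $100 premium is from the discussion above; the 2:1 and 1:2 mixes are made-up examples, since the real pack-to-standalone ratio was never published):

```python
PACK_PREMIUM_USD = 100  # extra cost of a pack over a standalone card

def average_premium(packs, standalone):
    """Average extra dollars paid per card, assuming every unit sells,
    given how many packs vs standalone cards were offered."""
    return PACK_PREMIUM_USD * packs / (packs + standalone)

# A pack-heavy 2:1 mix bakes most of the premium into the average sale;
# a standalone-heavy 1:2 mix keeps the average premium down.
print(round(average_premium(2, 1), 2))  # 66.67
print(round(average_premium(1, 2), 2))  # 33.33
```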

     

    IMO, it feels (very subjective) to me like AMD is doing the former.

     

    What this ends up doing is forcing the end user to pay an extra $100 for a card that includes stuff they may not want, which prices a Vega 64 out of the competitive market and increases AMD's margin on pack sales.

     


  17. 6 hours ago, JoostinOnline said:

    Well with this they had a reason with some validity.  Supposedly, they were worried that miners would buy them all up (that's what happened with the RX 500 series), which is why they introduced bundles.

    The problem with calling bundles a good way to keep miners from buying Vega cards is that the same goal can be accomplished by limiting the number of cards purchasable per household on a vendor-by-vendor basis. It isn't a perfect solution...

     

    BUT... it doesn't FORCE a prospective Gamer who wants the card to pay $100 extra if he/she doesn't want the included games or to wait around trying to buy the individual card when it restocks for 2 hours before selling out again at a vendor.

     

    IMO, AMD has been pretty sketchy in their claims that they will "restock" the individual cards; it sounds more like they will provide LIMITED quantities of them.

     

    Looking at all the evidence, AMD's statements, Gamers Nexus's cost analysis of the card, and the limited standalone stock versus bundle stock, makes me lean towards AMD breaking even or losing money on the cards, with the bundles being a way to recoup just a bit more cash.

     

    5 hours ago, Jito463 said:

    One possible reason would be gamer market share.  If miners buy up all the cards, then Nvidia will still have a near stranglehold on the gamer market, allowing them to practically dictate the state of gaming cards moving forward.  If AMD wants to have some say in the future of gaming on PC, they need to regain that consumer market share, which can't be done if no one can buy their cards for gaming.

    What I said above still stands... but if AMD wants a future in gaming, then their future cards need to follow the RX 400/500 series and not repeat Vega, the goal being to produce a better performance-per-dollar card than Nvidia. However, what they did was produce a card that:

     

    1: has a silicon die the size of a 1080Ti that provides the gaming performance of a 1080

    2: A: uses the most expensive and production limited memory available

    2: B: carries the additional cost of a delicate interposer, which also requires the GPU package to be shipped to multiple locations/countries for full assembly (it goes from GloFo (not sure where their 14nm foundry is) to Korea for HBM and interposer assembly, then gets applied to the board)

    3: uses a board with DOUBLE (I believe) the number of power phases of a 1080 Ti (again, more parts/stuff)

    4: is 1.5 years late to the game for this level of performance, and top-level performance only carries a premium for so long

     

    As a result, they can't hit the better price-to-performance ratio that the RX 400/500 hit, because they're late to the game, have an unreasonably expensive card to build, and have very limited production capability.

     


  18. 1 hour ago, Nena360 said:

    That depends on how easy AMD can get HBM2... :P

    Eh, HBM2 wouldn't be the limiting factor for Navi, as HBM2 is already in mass production and will only get better and better yields as time goes on; the 7nm production node would be the determining factor, and possibly AMD's limited resources for chip design compared to Nvidia.

     

    I would personally love to see what the Intel 14nm++ process could do for the current crop of GPUs, as Intel's 14nm process is much better than anyone else's and equal or better to others' new "10nm" processes.


  19. 3 minutes ago, Nena360 said:

    I heard they were not worth it for mining, are you sure? o3o

    Navi will cost more, and, well, Volta is also an option, unless AMD's re-brand gets a nice price cut and its higher power consumption means the miners won't touch 'em! :D

    We MIGHT see Navi by late 2018, but I would personally guess 2019, based upon how late Vega was to the party. So... don't hold your breath for Navi.

     

    Update:

     

    "Now, GlobalFoundries is making a serious push of its own with early 7nm technology availability and volume production currently planned for the back half of 2018." via ExtremeTech.com

     

    7nm in the last 6 months of 2018 would, IMO, put Navi in early 2019.


  20. 4 minutes ago, nerdslayer1 said:

    wait for a aftermarket cooler 

    Agreed, IMO wait till the aftermarket options come out and pick the one that suits your style, performance and acoustic needs the best.

     

    However, the Vega reference board is still a very good board; it just has a small cooler that most would label inadequate to deliver maximum performance while staying both cool and quiet.

    As Gamers Nexus has pointed out, based upon the size of the silicon chip, the cost of the HBM2 and the interposer, and the quality of the board components (such as the VRM), AMD could very well be losing money, or might be breaking even, on sale price versus parts cost on the 56 model.
