
mvitkun

Member · 4,264 posts

Everything posted by mvitkun

  1. No offense, but I have to ask: if you're buying a $500+ GPU, why go with a more budget-oriented brand? For $20-$40 more you could get an Asus Strix, a Gigabyte G1 Gaming, or an EVGA SC, all of which have waterblocks available and much more robust power delivery. I'm not saying you brought this on yourself or anything, because at this price range you should be able to buy from any brand. But when you're spending this much, spending a bit more is usually a good idea, since a better brand might be the difference between a warranty covering some act of God and not covering it.
  2. You can make a logo in pretty much anything; heck, you could draw a logo and scan it in. Using a 3D modelling program gives you many options, provided you make detailed models, because you're able to render it in any view (orthographic or perspective) and you can choose whether to render as vectors, to give a nice solid color with or without outlined edges, or to render it as-is with a more traditional renderer. As an upside, 3D programs, with the exception of the sculpting functionality, don't benefit as much from a tablet as something like Photoshop would. You could also use something like Photoshop or GIMP to create a regular 2D logo if you have some drawing skills, although it'd be somewhat hard to create good logos with only a mouse. Personally, if I were working on a logo, I'd consider using both a graphics editing/compositing program and a 3D modelling program to get the best results.
  3. HDCP isn't something your CPU, GPU, and display need to support at any particular level; it's just integrated into DisplayPort, HDMI, and DVI.
  4. There are two ways we could go about finding the answer to that: A. speculate pointlessly, or B. try it without those settings enabled. Another benefit of option B is that you find out immediately. If you want to speculate, then one of the advanced settings is view distance IIRC, and having more objects on screen is more CPU-intensive.
  5. The advanced settings make SLI or Crossfire setups cry, and they do very little for visual fidelity. Just turn them off and turn MSAA up to 2x.
  6. I was hoping I could get some help from the LTT community in finding out what bitrates Netflix runs at on different OSs, and in comparing browser vs. app bitrates. It would help if somebody running Windows 8, with a fairly good connection, could try playing the video titled "Example Short 23.976" in Chrome and then in the Windows 8 Netflix app, and if somebody else could try that same video on Linux with Chrome. I know that Windows 7 locks Netflix down to 720p @ 3000kb/s, I'm fairly sure Linux is also locked to 720p @ 3000kb/s, and I think Windows 8 should still run at 720p @ 3000kb/s in the browser but at 1080p ****kb/s in the app. When you play that video, it'll gradually buffer from Netflix's base quality up to the highest quality supported, and the bitrate will show up in yellow text in the upper left-hand corner. Thanks, guys.
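For a sense of scale, the 3000 kb/s figure quoted above translates directly into data usage; a quick sketch (only the bitrate from the post is assumed, nothing else):

```python
# Rough data usage for a stream at a given bitrate.
# Bitrates are in kilobits per second; divide by 8 to get bytes.
def hourly_usage_gb(bitrate_kbps: float) -> float:
    """Approximate GB consumed per hour of streaming at a given bitrate."""
    bytes_per_second = bitrate_kbps * 1000 / 8
    return bytes_per_second * 3600 / 1e9

# Netflix's 720p stream at 3000 kb/s works out to roughly 1.35 GB per hour.
print(round(hourly_usage_gb(3000), 2))  # -> 1.35
```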
  7. I know with 100% certainty that Windows 7 locks Netflix to 720p, regardless of whether you use the Silverlight or HTML5 player, so I'd assume Linux is also locked to 720p. Maybe I'm wrong, but I think the Windows 8 app doesn't give better 1080p quality than the browser; it just unlocks 1080p, since Netflix only does 720p in browsers. I recall reading that Netflix already blocked screen capture in its HTML5 player: http://www.extremetech.com/computing/159960-netflix-switches-from-silverlight-to-html5-in-windows-8-1-reduces-cpu-usage-dramatically
  8. It would take much more sensitive technology than we'll have for the next few decades to make precise use of the brain as an input for a computer. By precise I mean the use cases in which styluses are actually useful, like sculpting in Mudbox or drawing in Photoshop; use cases in which it feels more natural to draw with a stylus than with a mouse.
  9. VID isn't the same as core voltage. VID is what the processor is requesting by default.
  10. mvitkun

    The guy agreed to ship first if I put up a bit…

    Tough crowd, damn. People are so harsh lately. Yes I mis-pressed a key that's right next to another key twice.......castrate me why don't you.
  11. mvitkun

    The guy agreed to ship first if I put up a bit…

    Yeah I know, meant 970....typed 670 randomly.
  12. mvitkun

    The guy agreed to ship first if I put up a bit…

    How about comparing the two before selling? Not sure why you'd want a 780 lightning over a 670. I mean I could see a 780 Ti, but a regular 780 performs worse than a 670 and draws considerably more power.
  13. You might be right. However, since Asus themselves decided to list Z97 boards as N/A rather than incompatible under Hyper Kit compatibility, after having said that following BIOS updates both Z97 and X99 will support M.2 and PCIe-based NVMe SSDs, I'll keep the topic as-is until there is more definitive information about Z97 not supporting the Hyper Kit.
  14. http://rog.asus.com/418662015/labels/product-news/asus-announces-all-x99-and-z97-motherboards-support-nvm-express-devices/ http://www.anandtech.com/show/9086/x99-goes-tuf-sabertooth-x99-at-cebit-2015-with-nvme-support
  15. Quote from Asus' ROG site; quote from AnandTech from when the board was first announced, before the Intel 750 SSD was a thing.
  16. When Intel launched their NVMe SSD line, the 750 series, Asus had their AIC (Add-in Card), dubbed the 'Hyper Kit', which converts M.2 to a female mini-SAS HD connector; that connects to a male mini-SAS HD cable terminating in an SFF-8639 connector, which plugs into the Intel 750 SSD in its 2.5" enclosure. Using mini-SAS HD is definitely crude, but it provides 32Gb/s versus the 16Gb/s of SATA Express, which Intel's 750 SSD has already surpassed. They had/have it bundled with only one of their motherboards, with no option to purchase it separately, despite boasting about its compatibility with all of their X99 motherboards. And in case you're wondering: yes, it is limited to THEIR X99 motherboards. PCPer attempted to use the Hyper Kit with several other vendors' X99 mobos with an M.2 slot, without success. Asus' press releases remain rather ambiguous as to the compatibility of the Hyper Kit with their Z97 lineup, leaving those boards marked as N/A under compatibility despite saying both platforms are compatible with M.2 and PCIe-based NVMe SSDs. Big whoop though, right? Asus was also the first motherboard vendor to adopt the SATA Express standard on a motherboard; it seems they're ready to jump on any standard's bandwagon in order to get past the aging SATA standard. Well, that's why this news matters: MSI also thinks this new kid on the block with a name that gets him beaten up daily, SFF-8639, might be here to stay, so they're playing it safe and announcing an AIC as well. I wouldn't expect to see a mini-SAS HD connector integrated into a motherboard anytime soon, unless of course another big SSD vendor releases an NVMe PCIe SSD in a 2.5" enclosure using the same convoluted solution to connect to the motherboard's PCIe bus. Sources: OC3D, Bit-tech, KitGuru
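The 32Gb/s vs. 16Gb/s figures in the post are simply the raw four- and two-lane PCIe 3.0 link rates; a quick sketch of the lane math (usable throughput after 128b/130b line coding comes out a bit below the marketing numbers):

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
def pcie3_usable_gbps(lanes: int) -> float:
    """Approximate usable bandwidth in Gb/s for a PCIe 3.0 link."""
    raw_gt_per_s = 8.0        # raw transfer rate per lane, GT/s
    encoding = 128 / 130      # 128b/130b line-code efficiency
    return lanes * raw_gt_per_s * encoding

print(round(pcie3_usable_gbps(2), 1))  # SATA Express, 2 lanes -> ~15.8 Gb/s
print(round(pcie3_usable_gbps(4), 1))  # M.2 / SFF-8639, 4 lanes -> ~31.5 Gb/s
```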
  17. What are you talking about? Nvidia has been saying the exact same thing about how performance with stacked memory will be so much better. Even if you disregard HBM's impact on the performance of the R9 300 cards, which you really shouldn't, it has ~30% more cores. So at the very least, if you expect there to be no improvement on the Fiji XT core vs the Hawaii XT core, it should perform 30% better than a 290X. That leaves it nipping at the heels of the Titan X and once in a blue moon even beating the Titan X. Seriously, try it out and multiply the fps rendered by the 290X by 1.3. It'll land smack dab in the middle of the 980 and Titan X. That's if you blindly and stupidly ignore improvements to the core design and performance improvements from HBM.
  18. So you haven't downloaded the patches or the specialized driver and you're already trashing the game?
  19. There are 4GB 270Xs.....and the 270X and 270 are pretty much the same card.
  20. @TheRagingGamer Did you download AMD's beta driver for GTA V?
  21. Are you maxing it out entirely? Try disabling the options in the advanced graphics tab, everybody who's done an overview of the graphics options has said that the options add very little to the visual fidelity of the game while also being extremely taxing. Also try turning MSAA down to 2x, due to the large map you're applying AA to more objects and it becomes much more taxing.
  22. Tek Syndicate makes great videos, but most of them are podcasts discussing recent news, plus the occasional show coverage, and I love watching both. Their actual hardware reviews are less common than most other tech YouTubers'. They make some great in-depth videos, but the hardware reviews on their hardware channel don't strike me as too amazing, and there aren't particularly many of them. In general, though, The Tek is a great channel for finding out some niche stuff about the PC space.
  23. You should take any and all information, no matter the source, with a grain of salt. Anybody could lie to you; nobody has any particular incentive to be honest with you. Why would somebody want you to get real information and make up your own mind about subject X when they can much more easily lie to you, make up your mind for you, and make sure the way you think is favorable to them? If you honestly care about getting real information about product X, then look at all the reviews; typically there are a few that sugarcoat the bad news about the product with prettier words, and then one or two that don't. And it isn't always about being paid; sometimes it's differing opinions, or something that somebody else didn't notice. No one source will have all the information, let alone all of it entirely correct.