About Hawick



  1. Well, that Ryzen build that @LinusTechTipsFanFromDarlo posted looks pretty solid. I know my build is shit by comparison.
  2. I'm comparing apples to apples. In that particular picture I just wanted to show you that it's indeed 40 Mbps. That file was generated yesterday because I was testing 1080p at 40 Mbps; it's just an example, to show you that the bit rate is what it seems. The video in my previous post was also 10 Mbps, as seen in the title (I always check it). Yeah, something's obviously wrong here, and that's exactly what I'm trying to figure out. Again, the problem is the heavy pixelation during motion, which should not happen at such a high bit rate (anything above 30 Mbps). As seen in the PS4 video, it's low resolution (1280x720), low fps and low bit rate, and on top of that there's a shitload of things on screen (cars, trees, mountains, effects, etc., all moving crazy fast). Pixelation normally occurs when:
     - the bit rate is too low, or there's too much motion and not enough bit rate to handle it
     - the fps is low (a 60 fps video usually shows less pixelation during motion)
     - the resolution is low
     So yeah, I don't know what the hell causes this. A bug? I've tried so many different NVIDIA drivers and GeForce Experience versions... I'll probably ask someone to record the same thing on their PC and see how the pixelation compares. This is a goddamn mystery. EDIT: And I'm not trying to fanboy the PS4, not at all. I like the PS4, but I also like PC. I know NVIDIA ShadowPlay is supposed to be much better.
  3. I checked it: 10 Mbps is the lowest ShadowPlay bit rate value. Looks much worse compared to the PS4 at 8000 Kbps. My PC:
     - CPU: i5-4460 @ 3.2 GHz
     - RAM: 32 GB
     - GPU: GTX 750 Ti 2 GB
     - Storage: NVMe PCIe SSD
  4. Lol, thanks. Looks better than mine. Is this AMD Ryzen CPU better or worse than the equivalent i7, btw?
  5. - MOBO: ASUS TUF X299 MARK 1 (socket 2066)
     - CPU: i7-7820X octa-core 3.6 GHz (LGA 2066)
     - CPU COOLER: Cooler Master Hyper 212 Black Edition, 42 CFM
     - RAM: Kingston HyperX FURY 16 GB (2x8 GB) DDR4 2666 MHz, HX426C16FB2K2/16
     - PSU: Corsair TX-M Series TX850M 850 W Gold (CP-9020130)
     - STORAGE: some cheap SSD + HDD, doesn't matter
     - CASE: undecided
     - GRAPHICS CARD: undecided
     Are the components compatible with each other? I used PCPartPicker and it said the PSU is missing some 4-pin ATX connector. Opinions?
  6. No, I mean forget the dual CPU. I mean a single CPU socket, like most gaming boards.
  7. OK, that was pretty dumb of me. What chipset is the most common? What CPU socket? I want DDR4 RAM too, plus a cheap i7 and the on-board audio.
  8. I'm looking for a motherboard that meets these requirements:
     - 2 physical CPU sockets
     - 4 to 8 RAM slots
     - the fastest possible PCIe slot (idk if that's x16?)
     - high-quality on-board audio (Sound Blaster X-Fi or anything similar)
  9. It certainly looks better than my ShadowPlay recording, and yes, I used the highest ShadowPlay settings.
  10. I've never heard of the Super edition, wtf!
  11. - CPU: i5-4460 @ 3.2 GHz
      - Current GPU: GTX 750 Ti 2 GB
      - RAM: 32 GB DDR3 1600 MHz
      - PSU: Corsair CX 430 W
      - MOBO: Gigabyte GA-Z97X Gaming 7
      - SSD boot drive: Plextor M8PeY 256 GB
      - SSD storage: Intel 545s 128 GB
      - HDD storage: WD Blue
      - Cooling: everything stock
      Goal: 1440p GTA V / BF4 at 35 fps minimum. First set of questions: should I buy a GTX 1660 OC 6 GB? Should I save up a little more for a GTX 1660 Ti, and is it worth it? Should I buy something else, maybe a GTX 1060? Is that better? Second question: is this CX 430 W power supply enough to power a GTX 1660 or 1660 Ti? Thanks for reading.
  12. "GIGABYTE GeForce GTX 1660 OC 6GB GDDR5 (GV-N1660OC-6GD)" ^ This card right here. The website I'm trying to buy it from claims the GPU clock speed is 1830 MHz. I checked some other brands, like Palit and Inno3D, and they all say the GPU clock is 1530 MHz. Now how the hell is there such a huge difference between Gigabyte and those other brands? Gigabyte offers 300 MHz more? Is that true? And if it is, how stable is it? Is it 100% stable? 300 MHz is HUGE for a video card overclock; I can't believe it's actually stable. Can anyone explain this?
  13. i5-4460 @ 3.2 GHz. So I should use x264 (CPU encoding)?
  14. I have a GTX 750 Ti. So you're basically saying the recorded video's quality also depends on my graphics card?
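The bitrate comparison in posts 2 and 3 comes down to bits per pixel per frame, not raw Mbps. A minimal sketch of that math, noting that only the PS4 clip's 1280x720 resolution is stated in the thread; the PS4's 30 fps and 8 Mbps, and the ShadowPlay clip's 1920x1080 at 60 fps, are assumptions for illustration:

```python
# Rough bits-per-pixel comparison between two recordings.
# Pixelation during motion roughly tracks how many bits each pixel
# of each frame gets from the encoder.
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average encoded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# PS4 Share clip: 1280x720 is stated in the thread; 30 fps and 8 Mbps are assumptions.
ps4 = bits_per_pixel(8_000_000, 1280, 720, 30)

# ShadowPlay clip: 10 Mbps is stated; 1920x1080 at 60 fps is an assumption.
shadowplay = bits_per_pixel(10_000_000, 1920, 1080, 60)

print(f"PS4:        {ps4:.3f} bits/pixel/frame")
print(f"ShadowPlay: {shadowplay:.3f} bits/pixel/frame")
# Despite the higher absolute bitrate, the 1080p60 stream spreads its bits
# over roughly 4.5x more pixels per second, so each frame gets fewer
# bits per pixel and motion-heavy scenes break up sooner.
```

Under these assumed numbers the 720p30 clip actually gets more than three times the bits per pixel, which is one plausible reason a lower-bitrate console recording can look cleaner during motion.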
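For the PSU question in post 11, a back-of-the-envelope power budget is the usual sanity check. The per-component wattages below are ballpark typical figures (the CPU and GPU numbers are the published TDPs; the rest are rough allowances), not measurements of this specific system:

```python
# Rough system power budget vs. the Corsair CX430's 430 W rating.
# All wattages are ballpark figures for illustration, not measurements.
components = {
    "i5-4460 (84 W TDP)": 84,
    "GTX 1660 (120 W TDP)": 120,
    "motherboard + RAM": 50,
    "SSDs + HDD": 15,
    "fans + peripherals": 15,
}

load = sum(components.values())
psu_rating = 430
headroom = psu_rating - load

print(f"Estimated load: {load} W; headroom on a {psu_rating} W PSU: {headroom} W")
# Around 284 W estimated load leaves about 146 W of headroom, so rated
# capacity alone looks fine; the usual caveats are the unit's age and
# quality, and transient spikes under combined CPU+GPU load.
```

This says nothing about connector availability (the GTX 1660 typically needs one 8-pin PCIe connector), which is worth checking separately on an older budget unit.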
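On the clock-speed confusion in post 12: the two advertised numbers are very likely different specs rather than a 300 MHz factory overclock. 1530 MHz is the GTX 1660's reference base clock, while factory-OC cards like the Gigabyte one advertise their boost clock (the reference boost is 1785 MHz). Comparing like with like, sketched with those reference figures:

```python
# The two advertised numbers are probably different specs:
# 1530 MHz is the GTX 1660's reference BASE clock, while factory-OC
# cards advertise their BOOST clock. Compare boost to boost instead.
reference_base = 1530   # MHz, NVIDIA reference base clock
reference_boost = 1785  # MHz, NVIDIA reference boost clock
gigabyte_boost = 1830   # MHz, from the GV-N1660OC-6GD listing

factory_oc = gigabyte_boost - reference_boost
pct = 100 * factory_oc / reference_boost
print(f"Factory OC over reference boost: +{factory_oc} MHz ({pct:.1f}%)")
# A +45 MHz (~2.5%) factory overclock is small and routinely stable,
# nothing like a 300 MHz jump.
```

So the apparent 300 MHz gap is mostly a base-clock-vs-boost-clock listing difference between retailers, not a real difference between the cards.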