TheAgnda

Member
  • Content Count

    349
  • Joined

  • Last visited

Awards


This user doesn't have any awards

About TheAgnda

  • Title
    Member
  • Birthday 1994-12-16

Contact Methods

  • Twitter
    https://twitter.com/TheAgnda

Profile Information

  • Gender
    Male
  • Location
    Utah
  • Interests
    Tech, games, music.
  • Occupation
    Analyst / IT

System

  • CPU
    7900X @ 4.8GHz & 6800K @ 4.2GHz
  • Motherboard
    Asus Tuf Mark 1 X299 & ASUS X99-E
  • RAM
    TridentZ DDR4 (8x8GB) & GeIL EVO DDR4 (2x16GB)
  • GPU
    NVIDIA GTX 1080 & NVIDIA GTX 1080
  • Case
    Corsair 900D & Corsair 270R
  • Storage
    1x600p PCIe M.2 1TB, 1xEX920 PCIe M.2 1TB, 2xWD Black 7200RPM 2TB & 1x600p PCIe M.2 1TB
  • PSU
    Corsair HX1000i & EVGA 850 B3
  • Display(s)
    1xAsus XG35VQ, 1xLG 65UH6030, 1xAsus VH242H, 1xHP 22cwa & 1xAsus VH242H, 1xAsus VC39H
  • Cooling
    Full Room Water Cooling (1xEKWB X3 400, 1xPhobya Balancer 250, 1xReservoir Phobya 150, 3xKoolance PMP-500, 4xEKWB SE560, 16x140mm Fans, 2xSilverStone Fan Hub, 1xEVGA 450 B3)
  • Keyboard
    Logitech G910 & Logitech K780
  • Mouse
    Logitech G903 & MX Master 2s
  • Sound
    1xRME HDSPe, 1xRME Fireface UC, 1xMackie PROFX12, 2xYamaha HS5, 1xYamaha HS8S
  • Operating System
    Windows 10
  • PCPartPicker URL

Recent Profile Visitors

569 profile views
  1. TheAgnda

    Budget/Console (for the Switch) Capture Card

    PCIe isn't going to give you any major advantages at 1080p60. I'd keep an eye on eBay or your local classifieds for an HD60 Pro or HD60 S.
  2. Assuming the adapter is 2x Molex to 6-pin, that should work fine. The 4790 has a TDP of 84W: https://ark.intel.com/products/80806/Intel-Core-i7-4790-Processor-8M-Cache-up-to-4-00-GHz- The GTX 960 has a TDP of 120W: https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications Those are their maximum power draws; it's unlikely that you'll run your GPU and CPU at 100% simultaneously. During gaming, for instance, your draw will likely be under 200W, and the same goes for most other hardware-accelerated tasks. RAM, fans, and drives take very little power in moderation.
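The power-budget arithmetic above can be sketched as follows. The two TDP figures come from the linked spec pages; the 30W "other" budget and the load factors are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope PSU sizing: sum component TDPs plus an assumed
# budget for everything else.
CPU_TDP_W = 84   # Intel Core i7-4790, per Intel ARK
GPU_TDP_W = 120  # NVIDIA GTX 960, per NVIDIA's spec page
OTHER_W = 30     # assumed: RAM, fans, drives, motherboard

def worst_case_draw() -> int:
    """Draw if the CPU and GPU both hit their TDP simultaneously."""
    return CPU_TDP_W + GPU_TDP_W + OTHER_W

def gaming_draw(cpu_load: float = 0.6, gpu_load: float = 0.9) -> float:
    """Gaming rarely pegs both chips; scale each TDP by an assumed load factor."""
    return CPU_TDP_W * cpu_load + GPU_TDP_W * gpu_load + OTHER_W

print(worst_case_draw())     # 234 W worst case
print(round(gaming_draw()))  # ~188 W, consistent with "under 200W" while gaming
```

Even the worst case lands comfortably inside a decent-quality 400W+ unit, which is why a second PSU is overkill here.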
  3. Assuming you don't have a crazy number of fans, drives, or something else going on, you're only around 230W under a synthetic load. A second PSU isn't necessary, and it's very unlikely that your PSU will "blow up" lol. The quality and age of said PSU should be taken into consideration though.
  4. I would try using a USB cable with double ferrite chokes: https://www.amazon.com/gp/product/B003MQ29B2/ref=oh_aui_search_detailpage?ie=UTF8&psc=1 If that doesn't work, I would try a USB hub with another double-ferrite-choke cable between it and the computer, and then another between the interface and the hub. Additionally, the outputs on the Scarlett 2i2 are balanced, so make sure you are using TRS cables and not TS cables. The extra ring on a TRS cable carries an inverted copy of the signal, which lets balanced inputs cancel out induced noise: https://www.amazon.com/gp/product/B000068NYG/ref=oh_aui_search_detailpage?ie=UTF8&psc=1 Lastly, as mentioned by @Domifi, the issue could be the fault of a factor outside the computer. Just last week I was getting electrical noise in my left studio monitor; after trying just about everything involving the cables, interface, and computer connected to the monitor, the noise was still present. With few options left I started trying more outlandish things like turning off my phone and other devices in proximity, and after unplugging my router (which was in the same room) the noise stopped. After readjusting the antennas to point away from the left monitor, the noise stayed gone even with the router powered on. Wireless signals can definitely still affect even newer audio equipment.
  5. This is perfect, thank you! Surprised to find that even the 2080ti doesn't have more encoding bandwidth when compared to my 1080... At least it has higher quality encoding. Not really gonna help me with the issue at hand though, and with a 250M bitrate I presume there would be no difference in quality. Thanks again.
  6. I really appreciate it. Just to make things easier copy these settings:
  7. One stream is my display and the other a camera. I actually do 4 videos, but the last 2 (Discord & a soundboard) are just black screens with audio encoded via CPU. Everything is synchronized and allows me to edit and mix all sources in post at original quality, whereas something like OBS / a single stream is destructive.
  8. I use FFmpeg to encode / record two separate 4K60 Videos simultaneously. Overwatch would work perfectly.
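A recording setup along those lines can be sketched by building the FFmpeg command programmatically. This is an assumption-laden sketch, not the author's actual pipeline: the `gdigrab` desktop source and output filename are illustrative, and only the resolution, framerate, bitrate, and NVENC encoder come from the posts:

```python
import shlex

# Sketch of one FFmpeg NVENC capture invocation. Two simultaneous 4K60
# recordings are simply two such processes running at once, which is
# where having a second NVENC chip matters.
def nvenc_cmd(source: str, outfile: str, bitrate: str = "250M") -> list[str]:
    return [
        "ffmpeg",
        "-f", "gdigrab",            # Windows screen-capture input
        "-framerate", "60",
        "-video_size", "3840x2160",
        "-i", source,               # e.g. "desktop"
        "-c:v", "h264_nvenc",       # offload H.264 encoding to the GPU
        "-b:v", bitrate,
        outfile,
    ]

print(shlex.join(nvenc_cmd("desktop", "display.mp4")))
```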
  9. Something with lots of movement typically stresses the encoder more, definitely not something 2D; sorry, I should have mentioned that. I found that Rocket League is great for testing, but anything 3D and modern should do it. At 250M it would be nigh impossible to tell the difference, especially after being crushed by YouTube compression. It's less about the quality and more about the capability: with my GTX 1080 I can encode 2x4K60 streams simultaneously, but just barely; I can't always end the recording gracefully, and it drops frames.
  10. With a lack of encoding information regarding the new RTX cards, I was hoping I could get someone to run a quick test for me. Recording conditions (OBS) - Res: 3840x2160, Frame Rate: 60, Bitrate: 250M (2500000), Color Format: NV12 (should be the default) - nothing else matters too much. While recording, just open up Task Manager, go to the Performance tab, select the GPU in question, and screenshot the window. As should be obvious, you'll need a 2080 or a 2080 Ti with a 4K display to accurately perform the test. I would really appreciate it! Screenshot example:
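For anyone running the test, it's worth a quick sanity check on what that bitrate implies for disk throughput. A rough sketch, assuming the 250 Mb/s setting is held constant (CBR); real variable-bitrate output will differ:

```python
# Storage arithmetic for a constant 250 Mb/s recording.
BITRATE_MBPS = 250  # the "250M" OBS setting above

mb_per_second = BITRATE_MBPS / 8            # 31.25 MB/s written to disk
gb_per_minute = mb_per_second * 60 / 1000   # 1.875 GB per minute
gb_per_hour = gb_per_minute * 60            # 112.5 GB per hour

print(f"{mb_per_second} MB/s, {gb_per_hour} GB/hour")
```

Two simultaneous streams double those figures, so a SATA SSD or fast HDD array is effectively a prerequisite.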
  11. TheAgnda

    tiny cooling loop for GPU

    I would consider removing the CPU AIO and running the GPU and CPU in the same loop.
  12. TheAgnda

    Trying to understand audio interface

    It also acts as an output device in Windows + most interfaces have a headphone output for direct monitoring of inputs.
  13. TheAgnda

    NVENC On RTX2080 & RTX2080ti

    No problem, not trying to patronize you or anything. Like you said, depending on the conditions one can acquire the same quality footage encoding with lower-end GPUs, however quality does not equate to performance. Unfortunately that makes it all the more confusing and is what I believe to be the reason for the spread of misinformation. To this day I often hear large content creators with hundreds of thousands and sometimes even millions of subscribers nonchalantly proclaim that if you plan on solely encoding with a GPU you should get xx50. The only person I've heard even mention the differences is Stephen Burke from Gamers Nexus, and it was very briefly in a video about the Elgato 4K60 Pro. And I'm pretty sure that the people who caught it assumed he misspoke or something, as it is widely accepted that like-series GPUs have "the same" encoding performance. Hence my concern: in the LTT video they never mentioned the encoding performance of the 2080, and after my debacle with the 1050 I'm not going to assume. Heck, for all I know maybe this time around the 2050/60/70 WILL actually have the same encoding performance as the upper end. But for now, if I can get confirmation that the 2080 has the same encoding performance and quality as the 2080 Ti, I may very well pick one up.
  14. TheAgnda

    NVENC On RTX2080 & RTX2080ti

    This isn't true. I was running a 1080 but decided to downsize due to reading in countless places that the 1050 had "the same" encoding performance as a 1080. Only to find that when encoding a 3440x1440 @ 100FPS or a 4K60 stream (both at 250M bitrate) the 1050 couldn't keep up in some instances while the 1080 didn't break a sweat with 75% headroom. When monitoring encoding usage in task manager on both cards while recording a 4K60 stream the 1050 would stick around 65% usage while the 1080 was at about 23%. Not to mention I was using over 1GB of VRAM on both cards and the 1050 only has 2GB. Not often but every once in a while the 1050 would have encoding spikes and drop frames at my desired resolution and bitrate. For most people this may seem "the same" because they plan on encoding one stream and they do it at a much lower bitrate and often resolution, but for me this wasn't going to fly as the plan was to encode 2 4K60 streams at once via FFmpeg (Or 1 3440x1440 @ 100FPS & 1 4K60). After contacting NVIDIA I was told that while most 10 series cards have 1 active NVENC chip the 1080+ has 2 and that the encoding chip was not the only factor to consider. Basically the 1080 had more than 2x the encoding performance when compared to every 10 series card below it. I can confirm that the 1050 cannot encode 2 4K60 streams simultaneously at a 250M bitrate while the 1080 can with 35% headroom. Bottom line, encoding performance is not the same for every card in a series, I learned this the hard way. Read more here: https://devtalk.nvidia.com/default/topic/987460/nvdec-cuda-nvenc-speed-comparison/ https://developer.nvidia.com/video-encode-decode-gpu-support-matrix
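The Task Manager figures above give a rough way to estimate stream capacity. This sketch assumes encoder utilization scales linearly with stream count; the post's own two-stream numbers show real scaling is worse than linear (the 1080 lands around 65%, not 2x23%), so treat these as optimistic upper bounds:

```python
# Back-of-the-envelope concurrent-stream estimate from single-stream
# encoder utilization, assuming linear scaling per stream.
usage_per_4k60 = {"GTX 1050": 65, "GTX 1080": 23}  # % encoder usage, one stream

def max_streams(usage_pct: int) -> int:
    """Upper bound on simultaneous 4K60 streams under linear scaling."""
    return 100 // usage_pct

for card, usage in usage_per_4k60.items():
    print(f"{card}: up to ~{max_streams(usage)} stream(s)")
```

Even the optimistic estimate shows why the 1050 couldn't handle two 4K60 streams while the 1080 could.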
  15. TheAgnda

    NVENC On RTX2080 & RTX2080ti

    Just watched LTT's video on the new RTX cards, and there was a short segment about how NVENC had been improved. However, it seemed like they were only comparing the 2080 Ti and 1080 Ti... Has anyone (or maybe LTT staff could jump in here real quick) come across any information regarding NVENC on the 2080? Does it have the same chip(s) / performance as the 2080 Ti? This would actually be a pretty big selling point for me; I'm running a 1080 in one of my builds exclusively for encoding.