Light-Yagami

Member
  • Content Count

    1,466
  • Joined

  • Last visited

Reputation Activity

  1. Informative
    Light-Yagami got a reaction from Levent in Comparing NVENC h264/h265 and CPU h264/265 - ULTIMATE edition   
    Hello everyone. 
     
    I decided to do some thorough research into how different codecs perform at different bitrates and quality presets.
     
    Footage used: AC Valhalla 1080p 30fps 50 Mbps game footage, recorded with an RTX 2060 Max-Q using the high-efficiency codec
    Program for testing: Handbrake 1.3.3
    Methodology: Compress the master footage at varying bitrates and quality presets, choose a single frame that on average compressed very poorly, and compare.
    All comparisons can be made against this master picture: Valhalla master file.pdf
     
    NVENC comparison:
    Presets used: fixed bitrates of 1000 kbps, 2000 kbps, 4000 kbps and 8000 kbps at high quality (behaves the same as slow/medium - no difference in result) and high performance (performs the same as fast - no difference in result).
    Valhalla result GPU.pdf
     
    CPU comparison:
    Presets used: fixed bitrates of 1000 kbps, 2000 kbps, 4000 kbps and 8000 kbps at the slow, medium, fast and faster presets. Two-pass encoding and turbo first pass were also used.
    Valhalla result CPU.pdf
     
    Conclusion: In total, 40 GPU compressions and 32 CPU compressions were done. I hadn't known that most GPU compression presets actually behave exactly the same - I only show the relevant ones in the PDF.
    In my opinion, at least 4 Mbps is needed for acceptable results, provided you're using a high quality compression preset. 8 Mbps is, I'd say, enough that most people won't notice the difference. I hope the PDF quality is good enough to see the differences.
     
    Thank you for reading. Give me a thumbs up if you liked the post, testing like this takes quite a while. 🙂 I'm open to any questions you might have regarding Handbrake presets. Cheers 
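For anyone wanting to reproduce a sweep like this, here's a minimal sketch of how it could be scripted with HandBrakeCLI (the command-line counterpart of the GUI I used). The file names are placeholders, and NVENC preset names vary between builds, so check yours first; this is an outline, not my exact test setup.

import subprocess

SOURCE = "master.mp4"                 # placeholder for the 50 Mbps master recording
BITRATES_KBPS = [1000, 2000, 4000, 8000]

# Preset names differ per encoder and build; list the valid ones with:
#   HandBrakeCLI --encoder-preset-list nvenc_h264
SWEEPS = [
    ("nvenc_h264", ["slow", "medium", "fast"]),        # GPU (NVENC) runs; names assumed
    ("x264", ["slow", "medium", "fast", "faster"]),    # CPU runs
]

for encoder, presets in SWEEPS:
    for preset in presets:
        for kbps in BITRATES_KBPS:
            out = f"{encoder}_{preset}_{kbps}kbps.mp4"
            subprocess.run([
                "HandBrakeCLI",
                "-i", SOURCE,
                "-o", out,
                "-e", encoder,              # video encoder
                "-b", str(kbps),            # average bitrate in kbps
                "--encoder-preset", preset,
            ], check=True)

If I remember the 1.3.x CLI correctly, adding --two-pass (and --turbo for a fast first pass) to the x264 runs covers the dual-pass tests.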
  2. Like
    Light-Yagami got a reaction from The...guy in Smartwatches   
    LTT has reviews of just about every watch out there, just not all in one video. And what you're looking for is some sort of Galaxy Watch, because nothing else supports Samsung Health.
  3. Like
    Light-Yagami got a reaction from Dedayog in Do NVMe SSD drives make your PC faster than SATA SSD drives?   
    Because higher numbers are "better". Because nobody really understands that just because you have 6 Gbps storage, it doesn't mean the rest of the system can take advantage of it. Because the PS5 has a fast SSD, and having a Gen4 SSD somehow makes you better. Because marketing.
     
    But most importantly - ignorance. And to be honest, the vast majority doesn't care. The few who are very loud about it make it seem like everyone does.
  4. Agree
    Light-Yagami got a reaction from WaggishOhio383 in Smartwatches   
    (same reply as quoted above)
  5. Agree
    Light-Yagami got a reaction from FakeKGB in My brand new rtx 3060 seems to be broken. Am I missing something.   
    lol - you're running an RTX 3060 with 8 GB of system memory? There's your answer: you're pushing way too many resources around. Much of your game is being paged straight off the SSD/HDD to compensate for the missing RAM, causing stutter in the process. Get another 8 GB stick and your problems will go away.
     
    Unless your card is overheating, but in that case other games would run like crap as well. I think it's definitely the lack of RAM.
  6. Like
    Light-Yagami got a reaction from ShawnStudioTech in Question With M1 Macbook Pro !!!   
    All the 8GB model will do is page more memory out to the SSD, which will make it slower in export times, bigger projects and so on. The 16GB model will perform better in high-memory operations, simply because of the amount of memory.
     
    I would buy the 16GB model.
  7. Like
    Light-Yagami got a reaction from RapidDevil in Does my Mobo support NVME SSD's?   
    The PCIe x16 slot at the top of the slot stack has dedicated lanes to the CPU. Storage lanes are also dedicated to the CPU and don't overlap with the GPU's unless stated otherwise (depends on the board).
     
    As you might know, the vast majority of consumer CPUs provide 20 PCIe lanes: 16 for the GPU, 4 for NVMe storage. If the motherboard offers more than one NVMe slot, the bandwidth is often shared, and it halves if you occupy both slots at once.
     
    You don't have to worry about bandwidth sharing, as long as the CPU has 20 available PCIe lanes.
    I believe you should see speeds between 2200-2400 MBps (I don't put much faith in theoretical speeds, 2500 MBps in this case). There should be no issue with the drive having more to give.
     
    Thank you for the praise, you're too kind. I try to provide people with useful information and treat them as though they're capable of critical thinking themselves, instead of wasting server space with pointless comments that serve no purpose. A quick sanity check of the lane maths is below.
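That sanity check, as a quick sketch (my assumption: PCIe 3.0 at roughly 985 MB/s of usable bandwidth per lane):

# Rough PCIe 3.0 lane budget for a 20-lane consumer CPU.
MBPS_PER_LANE = 985            # 8 GT/s with 128b/130b encoding, per direction

gpu_lanes, nvme_lanes = 16, 4
print(f"GPU  x{gpu_lanes}: ~{gpu_lanes * MBPS_PER_LANE} MB/s")    # ~15760 MB/s
print(f"NVMe x{nvme_lanes}: ~{nvme_lanes * MBPS_PER_LANE} MB/s")  # ~3940 MB/s

# If two M.2 slots share the same four CPU lanes, each drive drops to x2:
shared_lanes = nvme_lanes // 2
print(f"Shared: x{shared_lanes} each, ~{shared_lanes * MBPS_PER_LANE} MB/s per drive")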
  8. Like
    Light-Yagami got a reaction from Mister Woof in Does my Mobo support NVME SSD's?   
    (same reply as quoted above)
  9. Like
    Light-Yagami got a reaction from RapidDevil in Does my Mobo support NVME SSD's?   
    20 Gbps converts to exactly 2500 MBps. So no - you will not see speeds over 3000 MBps, unless the text on the box is wrong and it supports higher speeds.
     
    A PCIe x4 connection has a theoretical limit of 3940 MBps, which is just shy of 32 Gbps. If the motherboard says 20 Gbps, then it's probably not the full x4 connection speed.
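Spelled out (assuming the box's 20 Gbps figure is the usable rate):

# Gbps -> MBps: divide by 8 bits per byte.
print(20 * 1000 / 8)                      # 2500.0 MBps, matching the box

# PCIe 3.0 x4 theoretical: 4 lanes * 8 GT/s * 128/130 encoding efficiency.
mbps = 4 * 8e9 * (128 / 130) / 8 / 1e6
print(round(mbps))                        # ~3938 MBps, just shy of 32 Gbps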
  10. Informative
    Light-Yagami got a reaction from panzersharkcat in 10875H breaks 10k pts barrier in Cinebench R23   
    Hello everyone,
     
    During my testing, I discovered a way to trick the system into always running at high power, instead of complying with the built-in PL1 and PL2 limits.
    System tested: Dell XPS 9700 (modified, repasted with liquid metal, cooling pad also used)
    System characteristics: PL2: 135 W, PL1: 75 W
     
    Warning: please don't do this to your system. If you want this sort of performance, just get the new AMD 5000 series.
     
    How to break PL system:
    In XTU, max out the turbo boost power duration slider and disable PL2. Set PL1 to 135 W -> apply.
    Leave window open, you'll need it during CB run.
     
    Open Cinebench and start a run. The system will draw around 120 W, but eventually reach Tmax and settle at around 100-103 W (that's what my system can sustain). After a while (the run lasts 10 minutes), the system will try to hard-reset PL1 to 75 W. When that happens, simply reapply the XTU settings (prepare by setting the power to 135.25 W and having the cursor ready on the apply button). Repeat the process if need be. If the CPU is allowed to draw 100 W for the whole duration of the run, it will cross the 10k pts barrier. Here's the screenshot.

     
    If you leave the system to run its normal course, it'll score in the mid 9000s.
     
    That's all I have for you. Too bad there's no undervolt support; it should be possible to achieve much better results if there were a way to reduce power draw. A sketch of the reapply logic is below.
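Conceptually, the reapply trick is just a watchdog loop. I'm not aware of any supported scripting interface for XTU, so both helpers below are hypothetical placeholders; this sketch only shows the logic I was executing by hand, not working code against a real API.

import time

TARGET_PL1_W = 135.25   # the value I kept reapplying in XTU

def read_current_pl1() -> float:
    """Hypothetical: read the active PL1 limit (e.g. via MSRs or a monitoring tool)."""
    raise NotImplementedError

def reapply_profile() -> None:
    """Hypothetical: re-apply the saved power profile (I clicked Apply in XTU)."""
    raise NotImplementedError

# Watchdog: when the firmware hard-resets PL1 back to 75 W mid-run, reapply.
while True:
    if read_current_pl1() < TARGET_PL1_W:
        reapply_profile()
    time.sleep(1)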
  11. Informative
    Light-Yagami got a reaction from Legolessed in 10875H breaks 10k pts barrier in Cinebench R23   
    (same post as quoted above)
  12. Agree
    Light-Yagami got a reaction from SkilledRebuilds in Dell XPS 9700 4k vs 1080p video playback power consumption   
    Hello,
     
    Ever since I bought my XPS 9700, I've wondered how much of a power difference it makes to run the screen at 1080p instead of 4K.
     
    Plenty of very smart people on numerous forums, who haven't used the laptop, will gladly tell you that it makes no difference, because you're illuminating the same number of pixels either way, even though at 1080p you're only rendering 25% of them and duplicating the remaining 75%. The claim is that the panel itself consumes so much more than the GPU that the rendering resolution wouldn't matter.
     
    These are my findings:
    I tested power consumption with HWiNFO64 while playing an x265 file in VLC. The i7-10875H had 4 cores disabled, as well as Turbo Boost, so that CPU consumption would be as low as possible, letting me see the difference based on GPU power alone. The screen was at minimum brightness and the Windows power saver plan was used.
     
    Average System power over 10 minutes (4K, full screen playback): 13.6W
    Average System power over 10 minutes (1080p, full screen playback): 9.6W
     
    A power reduction of 4 W over 6 hours of video playback means 24 Wh saved, adding more than 2.5 hours of additional uptime. That's a big enough difference to make dropping the resolution to 1080p worthwhile.
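Spelled out, using my measured averages:

# Measured average system power during full-screen playback.
p_4k, p_1080 = 13.6, 9.6                  # watts
hours = 6

saved_wh = (p_4k - p_1080) * hours        # 4 W * 6 h = 24 Wh
extra_hours = saved_wh / p_1080           # 24 Wh / 9.6 W = 2.5 h
print(f"{saved_wh:.0f} Wh saved -> {extra_hours:.1f} extra hours at 1080p")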
     
    Furthermore, idle power consumption drops from around 9.5 W to below 7 W, which is huge! That means that at low brightness levels, XPS 9700 models with the 4K screen can be comparable to the native 1080p models. At higher brightness levels the 4K panel consumes more than the 1080p panel, up to 3 times more, but I'm not comparing that here.
     
    Conclusion: it does matter what resolution you're running at, and the UHD 630 graphics does consume significantly more at 4K.
     
    I hope someone finds this information useful.
     
    Cheers
  13. Informative
    Light-Yagami got a reaction from TacticlTwinkie in Dell XPS 9700 4k vs 1080p video playback power consumption   
    (same post as quoted above)
  14. Informative
    Light-Yagami got a reaction from Applefreak in Dell XPS 9700 4k vs 1080p video playback power consumption   
    (same post as quoted above)
  15. Like
    Light-Yagami got a reaction from Firedrops in PotPlayer Extensive Guide For Best Video Quality   
    Hello everyone!
     
    This guide focuses on optimising PotPlayer for the best video quality. For those not yet familiar with PotPlayer, it's similar to MPC-HC or VLC, but offers a simpler design and UI with powerful post-processing tools that make videos look cleaner, sharper and richer in colour. That is my personal opinion; I've only compared it against VLC, so I am biased in this regard.
     
    This guide has 3 sections. The first focuses on settings everyone should enable, regardless of how powerful their system is, since they're not very resource intensive. The second focuses on settings you can enable if you have a strong system (especially on the GPU side), and the third focuses on settings for lower-tier systems. I have a desktop with an i7-6700K @ 4.7 GHz and a GTX 1070, and a laptop with an i7-4710HQ and a GTX 860M; those are my reference points for a high-end and a low-end system. The GPU is the most important part here, since we'll be using it for video upscaling, as well as for running multiple post-processing filters. Disclaimer: the settings that work best for me will stress your system (particularly the GPU) - this is not a battery-friendly way to watch videos. With that said, let's get right into it.
     
    Section 1
     
    When you install the 64-bit PotPlayer version (which you can do here), make sure to download all addons during the installation process. After you install it, make sure to set Nvidia as its primary graphics processor if you're running it on a laptop, since it defaults to integrated graphics. (If you don't know how to do that, click here.)
     
    Open PotPlayer and click the three horizontal bars in the top left corner. After the menu opens, click on Preferences.
     
    1. On the left side, click "Playback", and under "Process priority" on the right side of the window select "High" instead of "Above normal" if you want to allocate more resources to playback. I didn't, and it's fine.
    2. Click the plus icon next to "Filter control" on the left side of the window and, as the option column expands, select "Video decoder". Click "Built-in Video Codec/DXVA Settings" at the mid-bottom of the window and check the boxes as in the picture (under the DXVA2 copy-back settings, select your discrete GPU). Press OK and apply the settings. This makes your GPU decode all video.
     

     
    3. Under "Audio Decoder" tab go to "Built-in Audio Codec/Pass-through Settings" and mimic what I've done (this is where you need those addons you downloaded during PotPlayer installation) After you're done, click okay and apply settings.
     

     
    4. Click on "Video" on the left side of the window and under "Video Renderer" select "Madashi Video Renderer" or MadVR, as I'll be referring to it (You'll have to select it again once you leave preferences by clicking on "show main menu" button top left > video > video renderer > Madashi Video Renderer). Below under "Fullscreen exclusive mode" select disable, if not already selected. Under "Deinterlacing" tab on the top select "use hardware deinterlacing" under the Deinterlacing Method. Under "Effects" tab on the top far right (you might have to click on the arrows to move the tabs) check the "Deblock" box (and leave the slider at 256) and "deband" box on the bottom. Apply settings.
     
    5. Click on "Audio" on the left side of the window and change settings as shown in this screenshot. Apply settings. (You don't have to do this, my headphones are compatible with 24bit 96khz sample rate)
     
     
     
    Section 2
     
    We're now going to change a couple of settings in madVR. I'll do the "high end" config first and the "lower end" one after that, but you can always mix them together to find what suits you best. As I go along, I'll tell you what worked for me and what performance hit it has, so you can figure out whether you want to do anything differently.
     
    Under "Video" tab, press on 3 dots right of the MadVR option and press "Edit Settings". New window will pop up. We'll first change some options that stay unchanged between the high and lower end config. Do as shown in screenshots.
     

     
    I hope these screenshots make it easy to find your way around the settings; they can be very frustrating to navigate until you're used to them. Now for the part where the configs start to differ. These are the screenshots for a config with similar power to my desktop (i7-6700K + GTX 1070).

     
    Troubleshooting for higher end config:
    If you're dropping frames, decrease the number of neurons under "chroma upscaling" to 64. If you still drop frames, decrease the number of neurons under "image upscaling" > algorithm quality to 32. If you're still dropping frames, change the upscaling method under "image upscaling" from NNEDI3 to Lanczos and configure it the same as in the screenshot for image downscaling. Anything easier to run than that already falls into the "lower end" config, imo.
     
    Section 3
     
    These settings can be used on almost any laptop with any sort of discrete GPU. I'll start by showing you the settings I have on my laptop (i7-4710HQ + GTX 860M). After that, I'll present some alternatives that are easier to run but also produce worse results.
     
     
    At this point everything should run smoothly, but if not: decrease the number of neurons under "chroma upscaling" to 32. If you still experience dropped frames, switch from the NNEDI3 pixel shader to Lanczos and configure it the same as shown in the screenshot for "image upscaling". If that lags as well, change "image upscaling" from Lanczos to DXVA2; even the weakest systems should run that smoothly. At that point the load on the system is the same as with VLC. (A toy illustration of what the upscaler choice means is below.)
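To give a feel for what the upscaler choice means, here's a toy comparison using Pillow in Python. This has nothing to do with madVR's internals (NNEDI3 isn't available here), and the file names are placeholders; it just shows cheap vs. expensive resampling of the kind the fallback ladder trades between.

# Toy upscaling comparison: bilinear (cheap) vs Lanczos (sharper, more compute).
from PIL import Image  # pip install Pillow

src = Image.open("frame_1080p.png")       # placeholder input frame
target = (3840, 2160)

cheap = src.resize(target, Image.BILINEAR)
sharp = src.resize(target, Image.LANCZOS)

cheap.save("upscaled_bilinear.png")
sharp.save("upscaled_lanczos.png")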
     
    I hope I made everything understandable. After hours of digging through this guy's site, I tried my best to pick out the important stuff and present it as simply as possible. I'm open to questions, and I can help you out if you have any problems during the optimisation process. Thank you for reading.
     
    Additional notes: I am in no way an expert on this topic, nor do I fully understand the program. I kept this simple (among other things) for my own sanity, since there are so many options to choose from, with minimal differences between them, that it's really difficult to decide what works best together and what doesn't. The end result is very good in my opinion: better colour, and sharper, cleaner video with fewer artifacts and less noise.
  16. Like
    Light-Yagami reacted to Arika S in Hi, guys my gpu might've died pls help   
    blinking white light = abnormal power delivery (it's only blinking on the left side, indicating the problem lies there)
    red light = no power (both cables unplugged, so red on both sides)
     
    Make sure the cables are connected properly. Maybe try another power supply if you have one lying around
  17. Like
    Light-Yagami got a reaction from erik3135 in Memory Speed Compatibility, CPU or Motherboard Based?   
    If the mobo supports 3466, that's it. I doubt a B450 would do any higher, or even manage 3466 reliably. Imho, a 3200 MHz kit at CL14 is the fastest you'll get away with. Or buy a new mobo.
     
    The sweet spot for the new Ryzen chips is 3600 MHz CL16, but it won't gain you any appreciable performance unless you run a very high end GPU and game at 300+ fps. So you're fine; the latency maths below shows why.
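The standard first-word latency formula is 2000 x CAS latency / data rate in MT/s, and the two kits come out nearly identical:

# First-word latency in nanoseconds.
def latency_ns(cl: int, mt_s: int) -> float:
    return 2000 * cl / mt_s

print(latency_ns(14, 3200))   # 8.75 ns
print(latency_ns(16, 3600))   # ~8.89 ns - effectively a wash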
  18. Like
    Light-Yagami reacted to bit in Help me please   
    damn that's fucked up sorry bro
  19. Agree
    Light-Yagami got a reaction from Letgomyleghoe in Is this it ??   
    It's time to let Intel go, my friend. It's okay. The pain is temporary 
  20. Agree
    Light-Yagami reacted to Dedayog in Ryzen 9 5900x or I9 10900k for gaming?   
    Either one of those for WoW... uhm, massive massive overkill.
     
    You even admit that you already get more FPS than your monitor can handle.  
     
    Sorry but for WoW... no.
  21. Agree
    Light-Yagami got a reaction from PeachGr in Switching from a gaming chair to ergonomic £1,000+ Herman Miller mesh chair or alternatives?   
    If you sit 16 hours a day, buy a Herman Miller - simple as that. I bet there are chairs you can get for cheaper, but I've been sitting in my gaming chair a lot since my classes moved online, and I tell you, I wish I had the money to buy a good chair like a Herman Miller. It's a ton of cash for a few bits of metal and plastic, but your back is more important than a hole in your wallet for a couple of months.
  22. Agree
    Light-Yagami got a reaction from kelvinhall05 in Switching from a gaming chair to ergonomic £1,000+ Herman Miller mesh chair or alternatives?   
    (same reply as quoted above)
  23. Like
    Light-Yagami got a reaction from elpiop in Help me with Google Sheets please   
    I figured out what was wrong. Google Sheets does accept the formula, but you have to write ";" instead of "," between the arguments - see the example below.
     
    That is all. It took me 2 hours to figure out; when I uploaded a working Excel document to Drive and opened it in Google Sheets, it converted the formula automatically. Thanks for the help though, I greatly appreciate it.
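For illustration (a made-up formula; the separator is what matters) - in Google Sheets locales that use a comma as the decimal mark, arguments are separated with semicolons:

=IF(A1>10, "high", "low")     <- Excel, US-style locale
=IF(A1>10; "high"; "low")     <- Google Sheets in a comma-decimal locale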
  24. Agree
    Light-Yagami reacted to shoutingsteve in Please help, Laptop Screen has faint darkish line on left corner   
    Are you able to get into the BIOS? On the BIOS screens you're outside the Windows environment, so we can at least tell whether it's hardware at that point. If the line persists in the BIOS screens, it's a hardware problem; if not, it's an issue inside Windows and might just need a reinstall of the graphics drivers.
  25. Agree
    Light-Yagami got a reaction from BiG StroOnZ in Help me pick a Video Card for my build   
    Wait for the next gen GPUs coming in a few months. It'll be worth the wait, and I'm not saying that "just because": it'll be a very big difference, in price to performance as well.