
Posts posted by SkyHound0202

  1. In the CMYK printing process, the lighter the color, the less ink/toner the printer uses.

     

    By switching to a lighter color, you save a few microliters/micrograms per page, which translates to about 0.0005 to 0.0025 EUR per page (depending on print technology and coverage). That's negligible at small print volumes, but it does add up in the long run (up to 25 EUR saved over 10k prints).

     

    Compared to switching to lighter colors to save ink, I would argue that using third-party supplies is more economical, since it saves around 0.001 to 0.005 EUR per page (up to 50 EUR per 10k prints).
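
    The math behind those figures is simple enough to sketch (the per-page costs are this post's rough estimates, not measured values):

```python
# Sanity check of the savings figures above; the EUR-per-page numbers
# are the rough estimates from this post, not measured values.
def total_savings(eur_per_page: float, pages: int) -> float:
    """Cumulative savings over a given print volume."""
    return eur_per_page * pages

lighter_colors = total_savings(0.0025, 10_000)  # best case: 25.0 EUR
third_party = total_savings(0.005, 10_000)      # best case: 50.0 EUR
print(lighter_colors, third_party)
```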

  2. I wouldn't necessarily call it a bad thing if a "starting at" price is low enough to make previously unobtainable tech (in terms of price and/or performance) available and accessible to more people, while also informing customers of whatever caveat ("catch") it may have.

     

    For some (if not most) people, the biggest concern is not performance but price. To them, solving the problem of owning a device at all takes precedence over the device's actual performance.

  3. The GPU die is the part least likely to fail on a graphics card. Fans fail over time, power delivery degrades after extensive use, solder joints break from stress, and connectors break due to user error.

     

    So if done correctly, sourcing used GPU dies elsewhere and remanufacturing them with new components on a new board is a good way to go. ("Kinology", etc.) Heck, even professionally refurbishing a used card by giving it new paste and new fans plus a thorough check is a valid option. (rest of the tested)

     

    Not only does it reduce e-waste, but it also brings prices down in general.

     

    But the problem is that Nvidia and their partners don't want this, since someone buying one of these remanufactured, unlicensed cards means one less die Nvidia sells to their partners and one less card their partners sell to you. So they have been clamping down on it pretty hard.

     

    AMD, on the other hand, doesn't have such limitations. A remanufactured RX 6600M card (a mobile Navi 23 die on a PCIe card, mostly used for mining) functions as expected on desktop, while Nvidia mobile GPUs require either extensive hacks or will eventually be blocked by newer driver updates.

  4. At some point they should just adopt the 8-pin EPS12V connector (i.e., the CPU power connector, rated for a 336-watt maximum) for consumer graphics cards as well. It's already used on server accelerators (Tesla) and professional graphics cards (Quadro) anyway.

     

    The 8-pin PCIe power connector (150 watts max) is nearly two decades old and won't cut it in the current hardware landscape.
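
    The 336 W figure falls straight out of the pin count and terminal rating (assuming the common 7 A HCS Mini-Fit Jr terminals; check your connector's actual spec sheet):

```python
# EPS12V 8-pin carries four 12 V pins; 7 A per pin is a common
# HCS terminal rating, assumed here, not a universal guarantee.
def connector_max_watts(power_pins: int, amps_per_pin: float, volts: float = 12.0) -> float:
    """Theoretical maximum power through a connector's 12 V pins."""
    return power_pins * amps_per_pin * volts

print(connector_max_watts(4, 7.0))  # 336.0 W, matching the EPS12V rating
```

    For comparison, the 8-pin PCIe connector carries only three 12 V pins and is spec-limited to a conservative 150 W, well below what the same terminals could physically deliver.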

  5. Of course you don't need a new PC, provided it's a) recent enough and b) upgradable.

     

    A Kaby Lake i5 is not even old, despite being listed as "Discontinued" on Intel ARK. True, it won't receive any more driver updates for its iGPU, but the CPU itself still sports all the modern features (AVX2, PCIe 3.0, NVMe, Optane, etc.) and four capable cores at a reasonable clock speed.

     

    And thank goodness that Acer uses standard form factors for their systems. Had it been a custom Dell or one of those 12VO boards, a power supply upgrade would be impossible.

     

    (I will skip the RAM, storage and GPU parts since they are generally swappable and readily available, unless soldered. Also skipping the OS as it's software and up to user preference, but modded Windows or Linux should be fine.)

  6. My hot take under the 9900KS video in 2019:

    Intel's foundries are their liability. Moving to the smaller-but-immature 10 nm node now means giving up current older-but-mature 14 nm capacity, which Intel clearly would not do until they had sorted out their 10 nm problems. It's not about microarchitecture, but about how their foundries (and by extension, their manufacturing business model) work.

     

    And a bunch of people immediately jumped in without even knowing what I was talking about. Someone brought up the broken Tick-Tock model and went "130 nm FTW". Then there were people who kept mixing up architecture advantage with node advantage and came up with "1x nm Zen 1/+ competes with 22 nm Haswell or 14 nm Broadwell".

     

    Anyway, I was mocked and humiliated that day merely for understanding Intel's broken business model of holding onto their fabs and letting node supersede architecture.

     

    Years later, lo and behold, Intel's 10th and 11th gen were still on 14 nm.

     

    I was right.

  7. The SSD type doesn't matter since they will all be bottlenecked by the SATA II interface on the PS4 Slim anyway, but ANY SSD is an upgrade compared to the mechanical hard drive that ships with it. You will see some improvements in boot & load times and in-game texture streaming.

     

    There's no compatibility issue. The reason few people use the 870 series for PS4 upgrades is that it's still relatively new (released two years ago, generally available at MSRP one year ago) and that the older 860 series (and other brands) can be had for cheap.

     

    The 870 QVO is a decent choice since it uses QLC (4-bit) NAND to offer large capacity on the cheap. While it has less write endurance compared to Samsung's EVO and PRO lines (the 1 TB QVO is only rated for 360 TBW, 60% of a same-size EVO), a gaming workload is less write-intensive, so it shouldn't be a problem. Do keep your saves backed up just in case.
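
    To put 360 TBW in perspective, here's a rough endurance estimate (the daily write figure is a made-up heavy-use assumption, not a measurement):

```python
# Rough endurance estimate for a 1 TB 870 QVO (360 TBW rating).
# 50 GB/day is an assumed heavy day of game installs and patches.
TBW_RATING = 360
DAILY_WRITES_GB = 50

years_to_exhaust = TBW_RATING * 1000 / DAILY_WRITES_GB / 365
print(round(years_to_exhaust, 1))  # ~19.7 years at that pace
```

    Even at a pace few consoles would sustain, the rating outlives the console itself.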

  8. Short answer: the Unlimited tests in 3DMark and the Offscreen tests in GFXBench.

     

    Long answer:

    Spoiler

    It's hard to compare graphics performance through synthetic software. The main problem is that Apple uses their proprietary Metal API on iOS (OpenGL ES has been deprecated since iOS 13) while Android uses OpenGL ES and/or Vulkan. So they are not the same from the ground up, since API overhead and driver optimization for the hardware differ between the two platforms.

     

    This is further complicated by other variables in the device using said processor, like software version and hardware design (example: the same A15 in the iPhone 13 & 14, iOS updates, benchmark software versions). What's more, even the same processor in the same device performs differently due to factors like background activity, battery health, screen brightness, temperature and so on.

     

    In a word, results from cross-platform synthetic benchmarks like 3DMark and GFXBench only tell you how well a device runs that particular test; they are not necessarily a full-on comparison between devices or brands.

     

    If you want to do a proper comparison, you can look up the result databases of said software and analyze them yourself. Otherwise you will need a lot of data on various devices from repeated runs of the unlimited/offscreen style tests (after closing as many background apps as possible), since these render at a fixed resolution, are more comparable between runs/devices, and are more representative of the chip's graphics power.
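
    A minimal sketch of how repeated offscreen runs could be aggregated (the fps numbers are made up for illustration):

```python
from statistics import mean, stdev

# Hypothetical fps results from five offscreen runs on one device.
runs = [118.2, 121.5, 119.8, 120.1, 117.9]

avg = mean(runs)
spread = stdev(runs)
print(f"{avg:.1f} fps +/- {spread:.1f}")
# Only treat a device-to-device gap as real when it is clearly
# larger than this run-to-run spread; otherwise it's likely
# thermal or background-activity noise.
```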

     

  9. Just now, Takumidesh said:

    I am deliberately avoiding hardware transcodes, as these are not done on the fly.

    Though, I do intend to pick one of those up at some point for on-the-fly transcodes.
    Anyway, the transcodes are not the only long-term load, just the one I am currently running to get this data.

    Guess I missed the point. I thought you were doing transcoding primarily & constantly.

     

    If you just want the CPU to be efficient, you can try dropping the Package Power Tracking ("PPT", in watts), Thermal Design Current ("TDC", in amps) and Electrical Design Current ("EDC", in amps) limits.

     

    The default PPT/TDC/EDC for the 5900X should be 142/95/140. You should be able to safely drop it to 128/80/125 to run the chip at a 95 W TDP, then further to 88/60/90, effectively turning your Ryzen 9 5900X (105 W TDP) into an OEM-only Ryzen 9 5900 (65 W TDP). This results in a slight performance loss but a huge efficiency gain. See this for more info. (If you want to go even lower, try a 45 W TDP at 61/45/65.)
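
    Those PPT values track an informal AM4 rule of thumb, PPT ≈ 1.35 × TDP (my observation from stock limits, not an official AMD formula):

```python
# PPT ~= 1.35 x TDP reproduces the stock AM4 limits quoted above;
# a rule of thumb from observed defaults, not an official formula.
def ppt_from_tdp(tdp_watts: float) -> int:
    return round(tdp_watts * 1.35)

for tdp in (105, 95, 65, 45):
    print(f"{tdp} W TDP -> {ppt_from_tdp(tdp)} W PPT")
# 105 -> 142, 95 -> 128, 65 -> 88, 45 -> 61
```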

  10. You need to go to Control Panel\Network and Internet\Network Connections, find your adapter (in your case it should say something like Realtek 2.5 GbE Family Controller), then right-click for Status and check the Speed field under Connection. This denotes your link speed, i.e., the speed of the physical link over the cable.

     

    For gigabit Ethernet it should say 1 Gbps. (Note: despite the controller on the motherboard being a 2.5 GbE one, the Fritz!Box only has gigabit ports.)

     

    Any test you run online through a website like speedtest.net only indicates the speed of your internet subscription package; it is not indicative of the speeds achievable in the local network (LAN).

     

    Also, I'm afraid there's no real need to get an expensive Cat 8 cable when a cheap Cat 6 cable (or even a cheaper Cat 5e cable for runs shorter than 55 meters) can do 2.5 GbE in a home environment.

  11. You can manually enable "write caching" to use some system memory (RAM) as a write cache to improve performance.

     

    It should be enabled by default. If not, go to Device Manager, select the drive under the Disk drives section, right-click and select Properties, then the Policies tab. Tick the check box "Enable write caching on the device".

     

    Keep in mind this may only improve performance slightly, as DRAM-less SSDs are inherently a bit slower than their DRAM-equipped counterparts; but in day-to-day operation there shouldn't be any noticeable difference outside extreme edge cases, and any SSD is far superior to a traditional mechanical drive.

  12. First of all, check whether the program you are going to run is listed in the Wine Application Database. If it's listed with a Bronze or higher rating (i.e., it runs), then you may go ahead and install Linux and Wine. Otherwise proceed with caution, as the program may not work under Wine.

     

    Some Linux distros ship with Wine included; these bundled versions might need to be reinstalled for better functionality. Otherwise you can install Wine with a few commands.

     

    I don't really have any suggestions for a distro; given the open nature of Linux, you can strip a bloated distro down to a lightweight one. If I were to suggest one, I would go with Lubuntu (the slimmed-down version of Ubuntu), as I use Ubuntu on my tertiary machine.

     

    Edit: forgot to add that if you are aiming for games on Steam, use Proton instead.

  13. You cannot use a hub to aggregate data through a Thunderbolt connection. A hub can only be used to connect a host to downstream devices, not the other way around.

     

    Your motherboard does not support Thunderbolt, and the onboard USB-C port does not support display output. If you want a single-cable solution, i.e., a single USB-C cable to a hub near the TV, you need to buy a USB-C PCIe add-in card with DisplayPort Alternate Mode (not Thunderbolt) like this one (refer to this guide for more possibilities), connect a DP cable to the add-in card, then connect the hub to the computer on one end (using a USB-C cable that carries both USB and DP signaling) and the TV & peripherals to the hub.

     

    There's a free alternative though: if you only want to game on your TV, you can install Steam Link from the Google Play Store on your X900H and stream games, provided you have a good connection (Ethernet or good Wi-Fi).

  14. AMD should distinguish their OEM products by adding a vendor code somewhere (maybe laser-etch it onto the IHS), and permit booting a vendor-locked Pro series processor on consumer boards with the PSB-associated features disabled. Samsung does this with their client SSDs all the time, e.g., Lenovo's Samsung PM981 (the OEM equivalent of the 970 EVO Plus) carries a 000L2 suffix. While you can use the drive in any other system, only Lenovo firmware tools can be used to update it.

  15. Using an ad blocker is not piracy. In short, piracy is when a copyrighted work is used without permission, say, someone screen-capturing a YouTube video and uploading it to Dailymotion. But since an ad-blocked video is still played from YouTube in a browser, it does not constitute piracy.

     

    But the act of using an ad blocker clearly constitutes deprivation of potential earnings. The ad blocker prevents ads from displaying, thus inhibiting the advertiser's/Google's ability to monetize the video and thereby reducing the payout to the creator, should they benefit from the program.
