crystal6tak

Member
  • Posts: 1,745
  • Joined

  • Last visited

Reputation Activity

  1. Like
    crystal6tak got a reaction from ShayOh in Netgear WNR2000v5 wireless adapter/bridge?   
    Like QueenDemetria said, DD-WRT is third-party firmware. Anyway, I checked; your router is unfortunately not supported.
     
    It seems you can set up the router as a wireless repeater, so it would be:
    Primary router --> WNR2000v5  -->  wireless pci card --> PC
     
    Or set up the router as a wireless bridge, so it would be:
    Primary router --> WNR2000v5 --> PC
     
    A few Google searches later, I can't find any solid information on the WNR2000v5 as a wireless adapter/bridge. It's entirely possible for the v2 and v3, as they have DD-WRT support, so unfortunately your v5 router might not have an adapter/bridge function.
     
    I think you should rename this thread to "Netgear WNR2000v5 wireless adapter/bridge?" to get more support. 
  2. Informative
    crystal6tak reacted to Gregorovich97 in UPDATE: War Thunder removed Steam from its Website after Review Bombing   
    As a self-proclaimed veteran of War Thunder since the release of ground battles way back when, and a general tank enthusiast and service person, I can perhaps explain why the recent reviews have switched over to negative, at least from the perspective of my fellow squadron members and myself.
     
    The game is a far cry from how it was in the ground battle beta/release when I played in 2014/15, which is kind of expected for a game approaching the 10th anniversary of adding ground battles. The grind was there, but if you put in the time you could get most of the way through a faction tree in a few months (the top tank at the time was the Maus, so pre-Cold War machines) and get the specific tank you wanted fairly quickly, in a game that was generally fun.
     
    Fast forward to 2020-ish and the game had already changed so much: new Cold War equipment started to be added, with an emphasis on buying premium vehicles to make researching new vehicles and their modules "quicker" (despite the fact that the game had slowed this progression down in the name of making the grind still take time for veteran players). The game added top-tier premium vehicles after the developers had promised that premiums would only cover the World War 2 eras and would not become pay-to-play for the Cold War era.
     
    This creeping addition meant that a beloved vehicle released early in premium form (the XM-1 or T-72 Turms, for example) would outperform the equivalent-rank competition, with all the benefits of being a premium piled on top. It was a no-brainer to use a new top premium, as they often overperformed, making the individual have more fun but causing pain for those trying to play without paying £50 for a new vehicle. Junior players would buy these vehicles, either after seeing them perform well in videos from content creators or because they wanted to get into the Cold War era and had little interest in playing through the earlier vehicles to reach the era they wanted to play; they would play one life per match and then return to the hangar for their next match. The issue with that is the game regularly gives you multiple spawns, with a minimum of 2 per match even if you didn't earn any score on the first life. This meant that, in a match with certain countries on the different teams, you could see anywhere from 1-8 people play one life in their paid-for tank and then quit, leaving the rest of their team without the lives to win the match. At the same time, these premium vehicles in the hands of experienced players would usually outperform other vehicles at the same rating: they benefited from new equipment not available to other vehicles of that level, and they usually started lower in the ratings to boost perceived performance, and therefore sales, until the next premium started to crest the horizon.
     
    Another grievance players had was the meme I'm sure many non-players would be familiar with just from hearing it said so often: Russian bias.
     
    It's the tinfoil-hat conspiracy that has been perpetuated since launch, and even in World of Tanks, both games being made by Russian companies. However, there is some truth to the claim, even if it's often hard to show unless you actually experience it. There are documented occurrences: a senior modeller for the game had to be fired for adding extra armour to his favorite Russian tanks like the T-34 and nerfing the stats of ones he didn't like, such as the Panzer III and IV. There have also been various shadow nerfs and buffs that were only discovered through datamining and comparisons with the older performance of these vehicles. The game also favored the infamous Ka-50, a premium helicopter with greatly exaggerated in-game effectiveness against both ground and air targets that took almost 4 years to finally nerf. There were even a few occurrences at the start of the Russo-Ukrainian war where British vehicles saw shadow nerfs after the shootdown of Russian helicopters by British-made Starstreak missiles donated to Ukraine. The game claimed to want to distance itself from the war, despite proof that it had sponsored a separatist Ukrainian YouTube channel that was demonstrating tanks on a range in occupied Ukraine.
     
    The final straw that broke the community, however, was not the imbalance in an "Esports ready" game or the premium-vehicle nonsense. It was their attitude towards the players: the way they gaslit us and told us that the changes they made to their game were for our benefit and that the extra grind was necessary to keep the game engaging. They had the cheek to post a poll asking which of 4 shitty options for an economy update we would rather have, and when the majority said neither, and asked them to ease up on the grindy economy and let it be a tad easier, they ignored us, or worse, tried to silence us. In the community post they made about the recent backlash and negative reviews over the new economy updates, they implied it was our fault for not understanding how a free-to-play game works, all but admitting that the game is not free to play, because how else could they possibly make money apart from the in-game market, premium accounts, battle passes, crafting events and premium vehicles? Clearly we need to pay them more for the privilege of playing their game, and those filthy F2P players are the leeches making the game worse, if you believe Gaijin's story.
     
    They asked us to take legitimate complaints about the game to their forums, not the Steam page, where new people could be deterred from playing, as that would mean no new players and the game would shut down if it had negative reviews. So people did. They made forum posts on the War Thunder website. Where are they? Oh yeah, lost to the ether, as their forum teams delete as many of the constructive critical complaints as possible to make things look like they are running just fine. Much like the Moskva, this game is on fire and sinking, and the devs are trying to save face instead of actually investing in making the gameplay better; they would rather blame the faulty playerbase than admit their own mistakes.
     
    After the squadron I'm a part of began its boycott of the game, we have come to realise that War Thunder (Gaijin) is like an abusive partner, using psychological manipulation and sneaky tactics to keep us with them without our ever asking more of them, and I think it's for the best that we take a break. If we lose this game it will probably be for the best, as it will open the market for a new team to try to do better.
     
    Final point:
    It's not a review bomb so much as a long time coming: built-up negative reviews finally being written by players, because we tried so hard to look past the game's flaws (telling the devs our issues) and support the game we loved, as there didn't seem to be anything quite like it.
     
    (WoT is good in its own ways, but comparing the two games would be like comparing Minecraft and Terraria: they have similarities and crossovers but are fundamentally different.)
  3. Like
    crystal6tak got a reaction from itseggious in [SOLVED] Overclocking Intel HD 4600 on 4790K. Questions.   
    Sending off my dGPU for a bit and will be running iGPU for a few days. Or even a week. Might as well play around with my iGPU while I'm at it.
     
    I will be trying to run... not-so-light games, like Project CARS or the recently Steam-released MechWarrior Online. So extracting as much juice as I can out of the HD 4600 is my goal.
     
    As I have never OCed an iGPU before, I don't know what to expect, and a quick Google search doesn't yield much.
     
    So, here are the questions I have in mind:
     
    1.) Should I use Intel Extreme Tuning Utility for overclocking the GPU? Using the "Processor Graphics Ratio Limit" multiplier to raise clock speed.
     
    2.) Is it safe to increase the graphics voltage? If so, how far should I go? Temperature isn't an issue: I'm running a Noctua NH-D15 on an open-air system, and it's winter. The slider goes all the way to a 2-volt increase. Obviously I won't go that high, but at what point should I stop increasing voltage? If that's a good idea in the first place.
     
    3.) Anything I need to look out for when OCing an iGPU? There's no VRM temperature and whatnot, but still, anything else I should monitor other than CPU temperature?
     
    Thank you!
     
    EDIT: Oh, yey, 1k post.
     
    SO, AFTER FUMBLING AROUND: (Unigine Valley, Low quality, no AA)
    Stock: iGPU 1300 MHz, RAM 1600 MHz
    RAM overclock only: RAM 2133 MHz @ 1.66 V (via BIOS)
    iGPU overclock only: iGPU 1700 MHz, +0.35 V (via Intel XTU)
    Full overclock: iGPU 1700 MHz +0.35 V, RAM 2133 MHz @ 1.66 V
     
    A good 27.6% boost in Valley. If that translates well into games, 43 fps will shoot right up to 60! Sweet. Again, thanks for all the help, guys! I'll be off playing some games then. Might record and post how well they run. Who knows.
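    As a sanity check on that 27.6% figure, here is the scaling arithmetic, assuming (optimistically) that the synthetic-benchmark gain carries over 1:1 into games:

    ```python
    def projected_fps(baseline_fps, boost_percent):
        """Scale a baseline frame rate by a percentage improvement."""
        return baseline_fps * (1 + boost_percent / 100)

    # Figures quoted above: 43 fps baseline, +27.6% from the full overclock.
    print(round(projected_fps(43, 27.6), 1))  # 54.9 - close to, if short of, 60
    ```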
  4. Like
    crystal6tak got a reaction from zeinahmedeee in SuperDisplay breaks Cinebench R23 (Application Error)   
    Hi! Yes, open up SuperDisplay and uncheck "Enable Wintab driver". Now Cinebench R23 runs without crashing. Once you're done, re-enable Wintab driver.

  5. Like
    crystal6tak got a reaction from Benito the Big in SuperDisplay breaks Cinebench R23 (Application Error)   

  6. Informative
    crystal6tak reacted to Sir Beregond in Rtx 4090 is a monster (Official Benchmarks)   
    The 8800 GTX would like a word: at least a 130% generational rasterization improvement over the 7800 GTX. If I recall, the 6800 Ultra was something like 110-120% over the FX 5800/5900 Ultra too. Again, generational rasterization improvement.
     
    Sure, if we are counting RT and DLSS, the 4090's improvements are huge, but raster is not 130% faster than the 3090 Ti. They made this same claim with Ampere too, and that was definitely NOT "the biggest generational leap ever". More like the biggest leap of the past decade?
     
    Not trying to cast shade on the performance, the 4090 looks like a beast. Just calling out marketing BS.
  7. Like
    crystal6tak got a reaction from danchappers in SuperDisplay breaks Cinebench R23 (Application Error)   
    Hey all!
     
    Just want to throw it out there: it appears, as of writing this, that installing SuperDisplay breaks Cinebench R23/R23.2, causing it to crash on startup with this error (oddly, R20 runs fine):

     
    I've attached the BugReport.txt in this post, but the important part I think is this bit:
     
    ExceptionNumber = 0xC0000005
    ExceptionText = "ACCESS_VIOLATION"
    Address = 0x00007FFE2A0789AD
    Thread = 0x0000000000001B14
    Last_Error = 0x00000000
     
    If anyone's experiencing this issue, just uninstall SuperDisplay and R23 will start up fine again. Also, if you guys don't mind, try installing SuperDisplay and see if you get the same error. I've emailed SuperDisplay to let them know about it. So far this occurs on both of my systems; specs below:
     
    System 1:
    i7-4790K
    GTX 1070
    20GB DDR3 1600
     
    System 2:
    i7-12700KF
    RTX 3080
    32GB DDR5 5200
     
    If you guys got any ideas on a fix too, let me know!
    _BugReport.txt
  8. Agree
    crystal6tak got a reaction from RockSolid1106 in Hacker Leaks GTA 6 Alpha- Build Test Videos on GTA Forums   
    What hype train? This footage wasn't supposed to be public.
     
    Being realistic means realizing that "open betas" occurring 1 month before release are bullshit. You're demoing the final product; devs can't do much in 1 month.
  9. Agree
    crystal6tak got a reaction from VirusDumb in Hacker Leaks GTA 6 Alpha- Build Test Videos on GTA Forums   
  10. Informative
    crystal6tak reacted to mariushm in Nuclear waste made into batteries.   
    Uhmmm... no.
     
    Betavoltaic devices are an old thing; they've been made for decades. For example, another company that makes them: https://citylabs.net/products/
     
    Also see the Wikipedia article: https://en.wikipedia.org/wiki/Betavoltaic_device
     
    The innovation of the company in the article is incorporating the radioactive material into an artificial diamond, which makes it safer.
     
    BUT keep in mind the amount of energy created is extremely small, no matter the package. The amount of radioactive material they can put in a diamond-like material can produce about as much power as you'd need for a wristwatch or a pocket calculator: basically 1-2 V at 0.1 mA. To power an iPhone and make calls you'd need at least 2-3 V at around 50 mA, so around 500-1000 times more power than one of these batteries can provide.
     
    Also keep in mind it's not infinite... yes, the battery can produce energy for 28 thousand years, but depending on the isotope you have a half-life of around 10-20 years, so in 10 years your 2 V 0.1 mA battery will produce HALF of that, i.e. 2 V at 0.05 mA, because the radioactive material decays.
    So they're not wrong, it will still produce energy 28k years from now... but at something like 1 V and 0.00000000001 mA, enough maybe to blink an LED once a day if you top up a good-quality capacitor.
     
     
    Edit: OK, City Labs uses tritium, which has a half-life of around 12 years. The company above plans to use radioactive graphite from spent rods or other materials; that material's half-life is around 5k years, so batteries with that radioactive material will last longer. And I suppose they could parallel a bunch of such diamonds and shove them into an AA or some similar package to get more current out of a battery.
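    The half-life arithmetic above can be sketched directly. A minimal example: the ~12-year tritium half-life is from the post; the ~5,700-year figure is the commonly cited carbon-14 value, close to the "around 5k years" quoted:

    ```python
    def remaining_fraction(years, half_life_years):
        """Fraction of the original beta output left after `years`:
        output halves once per half-life."""
        return 0.5 ** (years / half_life_years)

    # Tritium cell (~12-year half-life): ~56% of original output after 10 years.
    print(f"{remaining_fraction(10, 12):.2f}")

    # Spent-graphite carbon-14 (~5,700-year half-life): essentially undecayed
    # after 10 years, which is why the output per cell is so tiny to begin with.
    print(f"{remaining_fraction(10, 5700):.4f}")
    ```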
     
     
  11. Like
    crystal6tak reacted to Rem0o in FanControl, my take on a SpeedFan replacement   
    ______________________________
    Version updated date: 11/04/2022
     
    Current update version: 136
     
    https://getfancontrol.com
     
    To run at startup: Use the new "Start with Windows" option in the left hamburger menu
    ______________________________
     
    Tutorials:
    (embedded video tutorials)
    ______________________________

           TLDR
    ______________________________
     
    I built a new custom UI on top of OpenHardwareMonitor with additional features, mainly linear fan curves with custom temperature sources.
     
    ______________________________

         STORY
    ______________________________
     
    As you guys may know, SpeedFan is sadly no longer updated, so newer boards are not detected properly.
     
    The main feature I used was custom fan curves with custom temperature sources. I used it to bind my case fans' speed to the hottest component of my PC, my GPU.
    (My BIOS only supports CPU temperature as a temperature source for the PWM fans.)
     
    I searched around for alternative software with this particular feature and only found a paid option (Argus Monitor).
     
    However, I also came across these:

    https://github.com/openhardwaremonitor/openhardwaremonitor
    https://github.com/LibreHardwareMonitor/LibreHardwareMonitor
     
    The first link is the original project, which can be downloaded at https://openhardwaremonitor.org/, but just like SpeedFan, it is no longer updated.
    However, thanks to the code base being open source, there are a couple of active forks, LibreHardwareMonitor being the best one I found: it supports my MSI Z390 Edge AC board and is updated regularly.
    OpenHardwareMonitor is divided into two parts: an API to interact with your hardware (CPU, RAM, fans...) and a UI. The existing UI is a HWMonitor clone that lets you set a manual speed for any fan, but there's no temperature/speed fan curve.
    ( sigh )
     
    So I decided to make my own lightweight application with the OpenHardwareMonitorLib API, and here is what I got so far...        
     
    Current features:
      • OTA update
      • Multi-config support with quick-switch from tray icon
      • Dark/light mode + colors
      • Graph fan curves
      • Linear fan curves
      • Flat fan curves
      • Mixed fan curves
      • Sync fan curves
      • Custom name for each fan / curve / control
      • Material UI (thanks to http://materialdesigninxaml.net/)
      • Smooth fan speed transitions
      • Custom temperature source
      • Automatic or manual matching between your controls and fan speeds
      • Activation % (dead zone) for each fan
      • Saves your current configuration and reloads it on startup
      • Board support is updated whenever the API gets an update!
    Please note that this is a small personal project. It works great for my needs, but I haven't tested it on a hundred different motherboards. Take it as it is. If it works on your current setup, well, you've got your fan control situation sorted, at least until you change your motherboard!
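    To illustrate the core idea (a linear fan curve bound to a custom temperature source), here is a minimal sketch; the function names, thresholds, and the "mixed = follow the hottest source" behavior are illustrative assumptions, not FanControl's actual implementation:

    ```python
    def linear_fan_curve(temp_c, idle_temp=40.0, max_temp=80.0,
                         idle_speed=20.0, max_speed=100.0):
        """Map a temperature (degrees C) to a fan speed (% duty), clamped at both ends."""
        if temp_c <= idle_temp:
            return idle_speed
        if temp_c >= max_temp:
            return max_speed
        slope = (max_speed - idle_speed) / (max_temp - idle_temp)
        return idle_speed + slope * (temp_c - idle_temp)

    def mix_fan_curve(temps_c):
        """One way a 'mixed' curve can work: follow the hottest source."""
        return max(linear_fan_curve(t) for t in temps_c)

    # Case fans bound to whichever is hotter, CPU or GPU:
    print(linear_fan_curve(60.0))        # 60.0 (% duty at the curve midpoint)
    print(mix_fan_curve([55.0, 70.0]))   # 80.0 (follows the 70 C reading)
    ```

    The clamping at both ends gives the "dead zone" behavior described in the feature list: below the idle threshold the fan stays at its floor speed regardless of small temperature changes.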
     
    If you want to help me out a bit or give me feedback, I've included some links/buttons in the left hamburger menu, out of the way, to send me an email
    or to pay me a 🍺 ... or 🍺🍺🍺.
     
    I will also keep an eye on this thread to see how it goes.
     
    Confirmed compatibility list from members
     
    Enjoy!
     
     
     
  12. Like
    crystal6tak reacted to Rem0o in FanControl, my take on a SpeedFan replacement   
    Look for the mix fan curve; it's exactly that.
  13. Informative
    crystal6tak reacted to Mister Woof in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
    How? L3 being shared is no secret, and each E-core cluster has its own independent L2.

    Additionally, AMD's Zen 3 has shared L3 cache among all cores in a CCD.
     
    I would say this is the industry standard.
     
     
    And the Bulldozer shenanigans weren't because of cache; it was the shared FPU.
  14. Informative
    crystal6tak got a reaction from Alder Lake in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
    Hey all! I couldn't find any reviewers testing the effect of disabling efficiency cores, so I did my own small-scale testing. I hope you find these results interesting! I do, and I think they warrant larger-scale testing with more games.
     
    My system configuration:
      • i7-12700KF
      • RTX 3080
      • DDR5 5200 MHz 38-38-38-76
      • Asus Z690 Prime-A
     
    Games tested (I used these as they're what I owned that had built-in benchmark tools):
      • Arma 3 (using Yet Another Arma Benchmark)
      • Rainbow Six Siege
      • F1 2018
      • GTA V
     
    Here are my results (mirror link), all at 1080p, with each game's benchmark run 3 times. In E-Off, E-cores were disabled and AVX-512 was enabled via BIOS. Arma 3 saw the biggest improvement, with Six Siege seeing a minor but measurable increase. Disabling E-cores also did not reduce performance in any scenario (again, small sample).
     
    W11 honestly wasn't impressive, and using it was a headache: F1 crashed once with the error "D3D Device removed 0x887A0006", there was one random BSOD on restart, and GTA V's settings wouldn't save (I had to copy pc_settings.bin from the W10 install over to W11 for it to work). Meanwhile, W10 ran smoothly during all the benchmarks. I'd love to see a journalist do more testing with E-cores disabled to confirm my findings!
     
    Other games that see a benefit:
    Death Stranding - tested by CapFrameX, mentioned by aeon100500 on Reddit
     
    EDIT1:
    Added Win 11 + E-Off results. Arma 3, GTA V and F1 performed more or less the same; Six Siege improved slightly, but still not close to the Win 10 + E-Off level.
     
    EDIT2:
    P-cores were at 4.7 GHz during benchmarks with E-cores both on and off (in some games, with E-cores on, it fluctuated between 4.7 and 4.9 GHz).
    Tested E-cores off with AVX-512 off: the FPS improvements persisted and were no different.
    My motherboard currently does not have the "legacy game compatibility mode" option in the BIOS. Once that comes out, I'll be sure to test it too!
  15. Informative
    crystal6tak got a reaction from BTGbullseye in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
  16. Informative
    crystal6tak reacted to Falkentyne in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
    Disabling E-cores should always increase performance if what you're running doesn't need more than 16 threads or 8 physical cores.
    Remember that when you disable the E-cores, the P-cores gain access to the E-cores' L3 cache all to themselves. That always counts for something.
    Disabling hyperthreading on a 10900K, for example, gave the physical cores access to the extra L3 cache that the logical cores used.

    Now if you're multitasking, like trying to stream and game at the same time, you may want the E cores enabled.
     
    Also, I found that if you have C-states enabled, you may get abnormal performance when you load certain monitoring programs while the E-cores are enabled (sometimes the P-cores go to sleep for several seconds). I saw this with the Stockfish chess engine when I loaded hwinfo64's sensors while the engine was busy calculating (instead of starting the engine after hwinfo64 was running). With C-states disabled, the P-cores went back to full usage immediately.
  17. Informative
    crystal6tak got a reaction from YoungBlade in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
  18. Informative
    crystal6tak got a reaction from Mister Woof in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
  19. Informative
    crystal6tak got a reaction from Levent in Alder Lake (Intel 12th gen) disabling E-Cores may boost gaming performance   
  20. Informative
    crystal6tak reacted to CerealExperimentsLain in The "new" PS5 actually has better cooling!   
    This doesn't really jibe with PlayStation history. PlayStations have always been constantly revised beyond the major revisions consumers see, such as the "Fat" and "Slim" models. The PS1 had around 12 major motherboard revisions, the PS2 over 20, and the PS3 20 or so. While the PS4 had fewer, the basic PS4 had about 6 revisions and the PS4 Pro 2-3.

    Sony going back and optimizing the design is not a result of "rushed development"; it's literally what they've done with their hardware for the last 26 years.
  21. Agree
    crystal6tak reacted to emosun in What's the best CPU the motherboard G7B630-N supports?   
    How does the machine interface? If it's via one of the regular ports, then more than likely the machine just has to run the program it uses and have that specific port on it.
     
    In any case, you should consider duplicating the machine if it's at all important.
  22. Informative
    crystal6tak reacted to abit-sean in What's the best CPU the motherboard G7B630-N supports?   
    According to this https://manualzz.com/doc/en/52001435/dfi-g7b630-n-cpu-memory-compatibility-list-manual the X6700 should be fine. Not sure about the Q series.
  23. Informative
    crystal6tak reacted to Chronified in What's the best CPU the motherboard G7B630-N supports?   
    The board likely released near the beginning of the LGA 775 socket's lifespan, when it was just the Pentium 4, Pentium D and Core 2 Duo.
    A little bit of digging and I found this motherboard, a revision of yours with the same chipset and official support for the Core 2 Quad: https://www.bressner.co.uk/products/motherboards/mb-g7b630-nr

    The product page for yours likely doesn't have Core 2 Quad support listed because it didn't exist yet at launch, but like many Dell OptiPlex systems, I haven't come across a single LGA 775 Q965-chipset motherboard that I can't just plop a Q6600 into, lol.
  24. Informative
    crystal6tak reacted to Chronified in What's the best CPU the motherboard G7B630-N supports?   
    Sweet, that's Core 2 Duo territory.
    Best bet is a Core 2 Quad Q6600; they can be found on eBay for $8-$12, an easy drop-in.
    Search up the BSEL mod if you don't have an overclockable motherboard: you cover one pad on the CPU with tape and it bumps the FSB speed from 1066 to 1333, making the clock speed jump from 2.4 GHz to 3.0 GHz, no other tweaks needed.
    Here's a pic of the pad to cover. Easily the best bang for your buck when upgrading old Core 2 Duo-based machines!
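    The clock math behind that jump is just base clock × multiplier; LGA 775 FSB ratings are quad-pumped, so the base clock is the rating divided by 4, and the Q6600's fixed multiplier is 9:

    ```python
    def core_clock_ghz(fsb_rating_mhz, multiplier):
        """Core clock from a quad-pumped FSB rating
        (e.g. FSB-1066 -> ~266 MHz base clock)."""
        base_clock_mhz = fsb_rating_mhz / 4
        return base_clock_mhz * multiplier / 1000

    print(f"{core_clock_ghz(1066, 9):.1f}")  # 2.4 - stock Q6600
    print(f"{core_clock_ghz(1333, 9):.1f}")  # 3.0 - after the BSEL mod
    ```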
     

  25. Informative
    crystal6tak reacted to Fasauceome in How to move Windows 10 from one m.2 drive to another m.2 drive? (In a laptop)   
    It can. Although, in practical terms, there's not much benefit to going from one SSD to another, because everyday tasks will be really fast on either.