
Devs have forgotten what optimization means, they just don't care.

Unpopular opinion here, but... the state of the gaming industry is largely the fault of gamers. For a long time gamers turned a blind eye to Bethesda's buggy bullshit because "Skyrim was great!". We live in an age where games ship half-baked and the rest of the game comes in the form of microtransactions, or games are broken on release because people pre-ordered (why???).

Had gamers revolted against the large companies (EA/Bethesda/Blizzard) for the shady microtransactions and broken games they launched, the overall state of gaming would have been better. Universally, it seems, gamers hate microtransactions, so why do they keep appearing in games? Because they are still profitable... the same way releasing a bugged game is still profitable, because people still pre-order ("Bethesda made it."). It wasn't until recently that gamers started speaking with their wallets, and it happened far too late to make a difference.

There are a few good companies out there still, but the majority just push out yearly releases that still get bought by the millions. PC and console are in the same boat here. When gaming went from "Let's make a game that is so fun to play it makes you want to quit your job" to "Let's make a game so expensive to play that you need a job", that's where things went wrong.


It has *always* been like this. Some developers make good engines, some don't. Some rush games to market, some don't.

39 minutes ago, M.Yurizaki said:

Besides that, I don't even know what id Tech 7 brings that id Tech 6 can't do. All I'm getting is some vague "it can do 10 times the geometric complexity", which is something I'd need a side-by-side comparison to judge. For all I know, id Tech 6 may have set a low bar.

Id Tech 6 is a great engine, as DOOM 2016 proves. More geometric complexity can mean more detailed models, or more models on screen without tanking the game's performance. With that said, Id's not really the best example for what you were saying - pushing engine technology has always been one of their main focuses. Despite the flaws some of their engines (namely Id Tech 5) have had, they have always been revolutionary in one way or another.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Not sure what you mean, OP... my 2012 battlestation runs BF5 great. I get frame "dips" but nothing that causes me a gameplay issue. I stay above 60fps (60Hz monitor) and that's all that matters to me. I haven't tried many other AAA titles this year, but I can run all of the games in Origin Access at near max settings, plus all my Steam games. It's the games that I buy that I KNOW aren't optimized that cause me the most problems - see also: alpha games lol.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wife's Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASRock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull, 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsung 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case.

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 


Frame dips? Get a G-Sync screen and be enlightened at how far ahead a PC gamer is of any console peasant. 1440p G-Sync + a 1070 (with a 4790k) and you will spit on any console you see. You took a Vega 64, and with new tech comes problems; I had the same issues with my RX 480, and now the 580s are fine, but that's the price of taking fresh tech...

Case: Corsair 760T | PSU: EVGA 650w P2 | CPU cooler: Noctua NH-D15 | CPU: 8600k | GPU: Gigabyte 1070 G1 | RAM: 2x8GB G.Skill Trident Z 3000MHz | Mobo: Aorus GA-Z370 Gaming K3 | Storage: OCZ 120GB SATA SSD, SanDisk 480GB SSD, WD 1TB HDD | Keyboard: Corsair K95 RGB Plat. | Mouse: Razer DeathAdder Elite | Monitor: Dell S2417DG (1440p 165Hz G-Sync) & a crappy HP 24" IPS 1080p | Audio: Schiit stack + AKG K712 Pro + Blue Yeti.


40 minutes ago, Sauron said:

Id Tech 6 is a great engine, as DOOM 2016 proves. More geometric complexity can mean more detailed models, or more models on screen without tanking the game's performance. With that said, Id's not really the best example for what you were saying - pushing engine technology has always been one of their main focuses. Despite the flaws some of their engines (namely Id Tech 5) have had, they have always been revolutionary in one way or another.

I know what "more geometric complexity" implies, but the figure they throw out has me skeptical. I've seen gameplay footage of DOOM Eternal and the only thing I can pick out is that the world detail has certainly been bumped up. At the same time I'm not really impressed, because it feels like I've seen that level of detail before in other games.

 

Also, bringing up id Tech 6 was meant as a contrast against other engines, but the sentence after it mangled the context. Basically, I don't find comparing DOOM to, say, Battlefield 1 all that fair, because one is running on a newer engine with all the enhancements and optimizations that could be made up until that point, while the other was made on an engine that was already three years old, where any underlying design choices that hampered performance may not be easily improved.

 

And if id has made another iteration of their engine this quickly, that to me seems more like an evolutionary improvement than a revolutionary one, especially if the list of features that differ from the previous engine isn't that long.


5 minutes ago, M.Yurizaki said:

And if id has made another iteration of their engine this quickly, that to me seems more like an evolutionary improvement than a revolutionary one, especially if the list of features that differ from the previous engine isn't that long.

They may be transitioning to a more gradual cycle, after all the name doesn't mean all that much - they could probably have called it Id Tech 6+ just as easily. Id Tech 4 had a lot of intermediate improvements and extensions added to it for pretty much every game that used it, they just chose not to call any of those Id Tech 5 (presumably because the latter was already in development separately).

 

Then again they have a bigger team than they used to, so... who knows?

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


19 minutes ago, Sauron said:

They may be transitioning to a more gradual cycle, after all the name doesn't mean all that much - they could probably have called it Id Tech 6+ just as easily. Id Tech 4 had a lot of intermediate improvements and extensions added to it for pretty much every game that used it, they just chose not to call any of those Id Tech 5 (presumably because the latter was already in development separately).

 

Then again they have a bigger team than they used to, so... who knows?

If anything, I think naming conventions are marketing's job, because higher numbers mean better.


Why would the devs bother optimizing unnecessary things that do not even need to be optimized? Software engineering 101: only optimize a piece of code that eats up 50% or more of your CPU cycles, otherwise don't. There is no point spending over half of your development time and resources reworking the entire project architecture and subsystems, introducing loads of bugs in the process, just to optimize some shit like 5-10% more fps when rendering skies and oceans.
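To put that in code, here's a minimal C++ sketch of the "measure before you optimize" idea; the frame stages and their workloads are invented for illustration, not taken from any real engine:

#include <chrono>
#include <cstdio>

// Time a single frame stage in milliseconds.
static double time_ms(void (*stage)()) {
    auto t0 = std::chrono::steady_clock::now();
    stage();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

// Stand-ins for real frame stages (busy loops instead of real work).
static void physics()    { for (volatile int i = 0; i < 50000000; ++i) {} }
static void render_sky() { for (volatile int i = 0; i < 1000000;  ++i) {} }

int main() {
    double p = time_ms(physics);
    double s = time_ms(render_sky);
    double total = p + s;
    // Only a stage that dominates the frame budget is worth reworking;
    // shaving 10% off render_sky here would buy almost nothing.
    std::printf("physics:    %7.2f ms (%4.1f%% of frame)\n", p, 100.0 * p / total);
    std::printf("render_sky: %7.2f ms (%4.1f%% of frame)\n", s, 100.0 * s / total);
    return 0;
}

Run something like this (or a real profiler) first, and the risky sky/ocean rework rules itself out.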

Sudo make me a sandwich 


1 hour ago, TimsTips said:

I am a dev, not for games, but it’s all the same stuff. Most developers suck and most managers suck. That’s your answer. 

Devs are evil. 

Sudo make me a sandwich 


8 hours ago, handymanshandle said:

Now, more on topic: there are always well and badly optimized games. Team Fortress 2's optimization has dwindled since the Love and War update (it's been 4 and a half years since the Conga came out), but I still play it.

 

Hmm, is THAT why TF2 runs on my dad's old laptop at much lower FPS than Fortress Forever on similar settings, even though they use the same engine (Source)?


 


 

 

Also, I can hardly get CS:GO to run at all. It will launch to the main screen, and if I'm REALLY fast with clicking I can go into the graphics settings and change maybe ONE thing, but then the system RAM usage hits its 2GB capacity, the game freezes, then quits to desktop. (I was hoping to benchmark it; I thought that since it uses the same engine as TF2 it should run.)

 

 

Another thing I find really interesting... Team Fortress Classic, which uses the GoldSrc engine: in OpenGL mode I get like 4 fps or less at 640x480. Staring at a wall in respawn with my face up against it I was getting about 7 fps, and going Sniper and zooming in on the sky outside I got about 10 fps or so.

 

[Screenshots: TFC at 640x480, OpenGL mode - 4.0 fps in the 2fort ramp room and on the video settings screen]

 

 

 

However, in Software Renderer mode, I get closer to 40-50 fps at 1280x800, the laptop's native display resolution.

 

[Screenshots: TFC at 1280x800, software renderer - 40 fps in the 2fort ramp room, 27 fps on the settings screen]

 

 

Any idea why that could be?  I'm guessing the Intel GMA X3100 might have issues with OpenGL, but I wouldn't have thought software mode would be THAT huge of an improvement on performance.  Direct3D isn't an available option.

 

Also there's an option for running as low as 320x240 resolution in TFC on that laptop, but going any lower than 640x480 messes up the UI.  For example, mouse clicks / movement don't register in the right place, the options menus spill over the edge of the screen and it's hard to find them, etc.  (To fix it, I had to set the launch options to force it to run at 640x480, then go and change the in-game settings.)

 

If I remember right, the first computer I played TFC on back in 1999 or 2000 (Pentium 166 MMX, ATI All-In-Wonder (3D Rage II+), Windows 98, 16MB 72-pin SIMM, 6.4GB Seagate IDE HDD) got about 8 fps in software mode at something like 320x240, and would hardly run at all in OpenGL or Direct3D.

 

Also, in all the games tested there was a brief 25-40% fps penalty at the moment of screenshotting, so the actual FPS numbers were a bit higher than the screenshots reflect. An exception might be TFC in OpenGL: 4.0 fps was the lowest I ever saw net_graph display, pinned the way 60fps on a 60Hz monitor is with VSync on and a high-end GPU.

 

Here are a few links to 3DMark & PCMark results from my dad's laptop (the one I was testing TFC, TF2, FF & CS:GO on) from March/April of this year. (A Cinebench R15 run in May yielded a 95 multi-core score and a 49 single-core score.) I'm having issues getting it running again now and am trying to reinstall it, so these old scores will have to do for now. (A main difference is I'm now running a newer build of Windows, 1809, on a different SSD, a 240GB Crucial BX300.)

Ice Storm Extreme

Ice Storm

Cloud Gate 1.1

PCMark 8 Work Conventional 2.0

PCMark 8 Home Conventional 3.0

 


4 minutes ago, PianoPlayer88Key said:

-big ass snip-

For one, TF2's optimization has been marred heavily over the years because it has switched major Source engine revisions twice: it received Source 2010 when the Mac Update dropped, from what I remember, and received Source 2013 with the Robotic Boogaloo update, which changed the way the engine stores files. There's a lot of clutter in the game, especially post-Gun Mettle, because of the way it handles various things.

Most notable is how the post-Meet Your Match and Jungle Inferno UI is stacked onto the standard UI, which doesn't help with performance. The game also keeps a lot of items cached in RAM, which is to say, a lot of what's cached are medals given to about 20 people because they played in a competitive tournament. The recent SF2018 update really didn't help either, because TF2 now stutters more than it used to. On top of that, maps aren't as optimized as they used to be: map creators go all out on details, and while the results are sometimes visually stunning (Cursed Cove?), a lot of the newer maps don't really run all that well.

Simply put, there's many reasons why TF2 runs poorly nowadays.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


Only gonna get worse as hardware gets more powerful. The stronger the hardware, the lazier the programmers. And I say this as someone who's working on a game myself: my game runs WAY WORSE than it looks. It should be killing it on a Pentium III, but you need a Core 2 Duo to run it perfectly. I haven't put the effort into making it run on a PIII because why bother? No one uses those; everyone's got a 775 dual core or better. This mindset is getting pretty common with a lot of devs, but on a MUCH larger scale. "Sure, there were games that looked better than this and ran smooth on an 8800GT, but why bother? EVERYONE'S GOT A GTX OR SOMETHING NOWADAYS, RIGHT?" "Sure, a 750 Ti should be murdering this game, but all REAL gamers have GTX 970s, right???" (PUBG comes to mind.)

 

Can't say I blame devs for this mindset - why spend all that development time optimizing when it already runs "good enough"? But it is sad to see hardware not utilized as well as it could be.

 

Another example: I used to watch 720p YouTube on an Athlon 64 all day (2011-2012ish). You'd be lucky to get 360p running smoothly on one with a modern web browser now. But who cares? Everyone's running a Core 2 Duo or better, right?

 

Sad, but not a surprise at all.

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus p8z77-v lk - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650w 80+G G2

3x 1080p 60hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60hz to 120hz

Model M w/ Soarer's adapter - Logitech G502 - Audio-Technica M20X - Cambridge SoundWorks speakers w/ woofer

 


1 hour ago, PianoPlayer88Key said:

 

cliiiip

Heheheheh, remembering the days I ran TF2 on a 9600 Pro 128MB with acceptable-ish frame rates. :D

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus p8z77-v lk - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650w 80+G G2

3x 1080p 60hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60hz to 120hz

Model M w/ Soarer's adapter - Logitech G502 - Audio-Technica M20X - Cambridge SoundWorks speakers w/ woofer

 


Devs today also aren't nearly as good as the legendary programmers who literally invented PC gaming.

 

Publishers also don't care.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


Be thankful we still don't have BLOOM at 400-1000%

While many hate motion blur, it used to be worse; for those that do have it enabled in some games, it's much better now.

We take much for granted and expect perfection, because it really "was better" in the earlier days when you got a more finished title; even when the technology backing a game was, say, DirectX 3-9 based, so many earlier games ran better at launch.

 

I'm happy with less bloom, i.e. lighting changes and enhanced game engines, but the RUSH of pushing a coding team to an earlier-than-expected release date (with many assets still on the table to work with), prioritizing a "timed" launch over a quality, relatively finished product, really makes the product suffer.

Coming Soon..

Delayed at last minute..

Broken still...


Then come the updates/patches to usable status within 6 months, and a further 6+ months of core updates before you see the final product... by which point a new game has been announced or is already being worked on.

 

 

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


19 hours ago, Khader87 said:

devs are just producing games to maximize their profit

You literally answered everything yourself here.

I spent $2500 on building my PC and all I do with it is... play no games atm, watch anime at 1080p (finally), watch YT and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)

Spoiler

"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


 

 

Some of the games that people call unoptimized actually just have poor quality control.

With Bethesda games, some meshes could be considered the Monday-morning or Friday-afternoon variety: all the sort of things that should be picked up by quality control. Some textures and meshes are brilliant alongside others that look like they were done by unsupervised interns. This creates a lot of the unevenness in how the games look and play.

In contrast, CD Projekt Red's Witcher 3 has textures and meshes that look like they were made by the same person. That is all down to a disciplined workforce and excellent quality control.

 

The 3D part of 3D games is all I am qualified to talk about, since I did 3D for a living for a very long time. I haven't ripped objects out of games lately, because I know what I will find.

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


The stuff we PC gamers have to live with... either stay at 1080p or prepare to pay... figuring out the best way of increasing your CPU's core speed... the lengths we have gone to in learning about cooling and efficiency... MEDIUM SETTINGS... oh my gosh!!! I just figured out that I can only run NieR properly by turning anti-aliasing off!! Finally grass, trees and objects entering the visible range are drawn the right way!! Damn us... if not for the strategy segment (on PC, for instance, or TW) I would have stopped gaming a long time ago... Thank God for the grognard attitude that grants stability to my gaming life!!!!!!

Btw, it all started with blockchaining... it's all those youths not thinking straight anymore... this almost looks like an Asian-spring-awakening, CIA-orchestrated spy movie!!!


Well, another part is that old hardware and software isn't supported anymore. A developer could target, say, a Core 2, but if they have issues and can pin them down to the hardware having some quirk, Intel isn't going to go "oh, let me help you guys" for free, if at all. Likewise, if you want your game to run on Windows XP, well... good luck with any issues that arise from some quirk in the operating system, because Microsoft isn't going to help.

 

You could probably run a lot of games on hardware less powerful than the minimum requirements say you need. The only reason the requirements are there in the first place is likely that they describe the earliest hardware and software someone's willing to support.


On 12/11/2018 at 1:22 PM, Khader87 said:

why are DX12 and Vulkan still not widely supported?

Because the game engines have to be developed from the ground up to support Mantle and its variants, and the developers need to learn some new things, take on more responsibility and do some things differently...

So that's why it won't happen quickly.


The alternative also leads to the mess OpenGL was at the end: keep the old garbage from the 90s and add all the new shit on top of it...
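To give a feel for what "more responsibility" means, here's a minimal sketch against the actual Vulkan C API (link with -lvulkan); it does nothing but create and destroy an instance, which is roughly the point at which a classic OpenGL program would already be drawing:

// Minimal sketch: even starting Vulkan is explicit. This only creates
// an instance; picking a physical device, creating queues, a swapchain,
// command buffers, render passes, pipelines and synchronization objects
// is all still on the engine developer.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app = {};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "demo";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan driver available\n");
        return 1;
    }
    std::puts("Vulkan instance created");
    vkDestroyInstance(instance, nullptr);
    return 0;
}

All the work an OpenGL driver used to do behind your back has to be written into the engine, which is exactly why retrofitting an old engine for DX12/Vulkan is so slow.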

"Hell is full of good meanings, but Heaven is full of good works"


16 hours ago, campy said:

You can get 720p normal at 40fps average out of an 8800GT from 2007. The 1GB versions are ideal, but 512MB will do it. Hell, an entire 2007 system with a Q6600 and 4GB of RAM alongside that 8800GT can do it just fine.

Thing is:
It shouldn't.

With shit that old, the game should run at 10fps or not even start. Just look back 11 years prior to that, to 1996: what CPUs were available? IIRC we're talking about a 133MHz Pentium or so. If you'd come into a forum with such a machine in 2007 (yes, they existed; I was active at that time in one of the biggest German forums, and so active that it took the next guy many years to push me from my #1 posting throne of over 60k postings)...

Anyway, you'd not be taken seriously; you'd be flamed and bashed.

But things from that time wouldn't run anyway, so that's that.

21 minutes ago, M.Yurizaki said:

Well, another part is that old hardware and software isn't supported anymore.

And that's how it should be!
Nobody should care about Core 2s and Phenoms; devs should use AVX where possible, which was introduced around 7 years ago with the Sandy Bridge architecture...

 

And the error messages should be clear rather than nice, and say something like:
"Your hardware is crap, this won't run on it."

 

21 minutes ago, M.Yurizaki said:

A developer could target, say, a Core 2, but if they have issues and can pin them down to the hardware having some quirk, Intel isn't going to go "oh, let me help you guys" for free, if at all

That's not the only problem.

Another one is that those things are 12 years old at worst. If used regularly, they are dying.

And that causes problems. I myself have a couple of dead DDR2 SDRAM DIMMs lying here. Everything in that system is on its last legs and could stop working the next day, or may already have issues.

 

So if you let your game run on that, you might be bombarded with shit from Core 2 users who think your software is buggy when in reality their CPU is about to go kaput.

 

And I say that as someone who used a Core 2 system like 5 minutes ago, a Dell Vostro 1500 (with a WORKING G84 chip!!1111).

"Hell is full of good meanings, but Heaven is full of good works"

