
OMG This is so sad ㅜ_ㅜ

pipicat6489
Solved by Sarra:

You could have redirected some of your GPU budget into a better CPU, faster RAM, or both, gained more performance in Minecraft, and possibly even saved some money in the end.

 

That AMD system is hard to beat for the price, though.

Well, I have a PC I bought for 1,359 USD:

Intel Core i5-10400

MSI B460M Mortar

Samsung DDR4-2666 8x2 GB

Zotac RTX 3070 OC

Crucial MX500 500 GB SSD

Seagate 2 TB 7,200 RPM HDD

600 W PSU

stock Intel cooler

 

And my friend bought a PC for 635 USD:

Ryzen 5 3500

A320M motherboard

DDR4-2666 8x2 GB

Inno3D GTX 1660 Super

WD Green 240 GB SSD

500 W PSU

stock AMD cooler

 

Well, we did an FPS battle on Minecraft 1.8.9 Forge, with the same video settings and the same RAM allocation.

 

And you know what,

he won...

How is this possible??? I even overclocked my GPU!


Bedrock or Java?



Minecraft is CPU-bound and your CPUs are very close in performance. Vanilla Minecraft really doesn't tax GPUs; a 1660 Super and a 3070 will perform about the same with comparable CPUs.

 

Also, if one of you used Optifine or a similar mod, it'll drastically impact performance.

 

If you want to compare, your settings have to be identical... If you're rendering at 24 chunks and he's sticking to 14, he's going to get more frames lol.

 

I don't know how much of a difference the drivers will make with Minecraft.


The chunks are the same, and it's vanilla Java MC. We both used OptiFine with the same settings.


You could have redirected some of your GPU budget into a better CPU, faster RAM, or both, gained more performance in Minecraft, and possibly even saved some money in the end.

 

That AMD system is hard to beat for the price, though.



  • 1 month later...
On 12/30/2020 at 11:24 PM, FangerZero said:

Try a GPU-intensive game and the story will be different, I'm sure.

Thanks, Fortnite had a different story.


On 12/30/2020 at 2:59 AM, Sarra said:

You could have redirected some of your GPU budget into a better CPU, faster RAM, or both, gained more performance in Minecraft, and possibly even saved some money in the end.

 

That AMD system is hard to beat for the price, though.

He got caught red-handed: he used overclocked 4,000 MHz RAM.


On 12/21/2020 at 8:31 AM, Aereldor said:

Minecraft is CPU-bound and your CPUs are very close in performance. Vanilla Minecraft really doesn't tax GPUs; a 1660 Super and a 3070 will perform about the same with comparable CPUs.

 

Also, if one of you used Optifine or a similar mod, it'll drastically impact performance.

 

If you want to compare, your settings have to be identical... If you're rendering at 24 chunks and he's sticking to 14, he's going to get more frames lol.

 

I don't know how much of a difference the drivers will make with Minecraft.

Isn't the AMD CPU mentioned 6c/6t, whilst the Intel is 6c/12t?

 

I ask because I have the i5 10400f, and it is 6c 12t.

 

Does that extra 6t make no real difference, or is it not utilised in certain games?

 

P.S. Does the F in the model name (i5-10400F) mean it cannot be overclocked?


2 hours ago, Maury Sells Wigs said:

Isn't the AMD CPU mentioned 6c/6t, whilst the Intel is 6c/12t?

Yep - the 3500 is 6C/6T, while the 10400F is 6C/12T.

2 hours ago, Maury Sells Wigs said:

P.S does the F in the model name (i5 10400f) mean it cannot be over-clocked?

The F means that it has no iGPU. Only K-series Intel CPUs can be overclocked.
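As a side note, Java Edition Minecraft runs on the JVM, and the JVM reports logical processors (threads), not physical cores. A minimal sketch to see the difference on these two chips (the class name `CoreCount` is just for illustration):

```java
public class CoreCount {
    public static void main(String[] args) {
        // Reports logical processors visible to the JVM:
        // a 10400F (6C/12T) shows 12, a Ryzen 5 3500 (6C/6T) shows 6.
        int logical = Runtime.getRuntime().availableProcessors();
        System.out.println("Logical processors: " + logical);
    }
}
```

Whether the game actually uses all of them is another matter; the extra threads only help if the workload is parallel enough.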



8 hours ago, FakeKGB said:

Yep - the 3500 is 6C/6T, while the 10400F is 6C/12T.

The F means that it has no iGPU. Only K Intel CPUs can be overclocked.

Yep, you are spot on - the F series means no iGPU.

 

Bearing in mind the Intel has an extra 6 threads, would you say they are still pretty equal?

 

If so, why is the 6-thread difference not amounting to any noticeable performance advantage?

 

 

 


14 hours ago, Maury Sells Wigs said:

Isn't the AMD CPU mentioned 6c/6t, whilst the Intel is 6c/12t?

 

I ask because I have the i5 10400f, and it is 6c 12t.

 

Does that extra 6t make no real difference, or is it not utilised in certain games?

 

P.S. Does the F in the model name (i5-10400F) mean it cannot be overclocked?

 

F means it has no iGPU, just like the Ryzens. The 10400F has more threads; the Ryzen 5 3500 is far cheaper (it's only available in parts of Asia), and it has a higher all-core boost and better single-thread performance.

 

It can also be overclocked, on any motherboard, unlike the 10400F.


3 hours ago, Aereldor said:

 

F means it has no iGPU, just like the Ryzens. The 10400F has more threads; the Ryzen 5 3500 is far cheaper (it's only available in parts of Asia), and it has a higher all-core boost and better single-thread performance.

 

It can also be overclocked, on any motherboard, unlike the 10400F.

Thanks for giving me the information.

👍

