
AMD refuses to set handicap on any workload

Moonzy

Yeah... artificial limits are kinda meh in general though...

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX Speed Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


1 minute ago, HelpfulTechWizard said:

You'd be crazy not to do this, even if you got it at MSRP. At stock I think a 3070 profits $6/day (USD).

That's all I've been saying, really. These days, at these prices, buying a GPU just to be entertained a little better isn't worth it. Not when an experience as good as a console can be had with a good AMD APU. (For those reading but not in the know: there are some important differences; look up heterogeneous system architecture.) There is a reason the PS5 and Xbox used them.


44 minutes ago, HelpfulTechWizard said:

You'd be crazy not to do this, even if you got it at MSRP. At stock I think a 3070 profits $6/day (USD).

How is 6 dollars a day of profit worth having a PC run 24/7 that you can't even use? I mean, what is that per month? 180 bucks? That seems like a huge waste of energy and time, imo.


8 minutes ago, Brooksie359 said:

How is 6 dollars a day of profit worth having a PC run 24/7

That's up to you.

At $6 a day, a year is over $1,800, more than a scalper-priced 3070.

Of course it's inconsistent, but I'm just saying some is better than none.

If GPUs could earn $100 a day, their prices wouldn't stop at $1,000.
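The payback argument above can be sketched as a quick back-of-envelope calculation. The card prices and the $6/day figure are illustrative assumptions, and `payback_days` is a hypothetical helper, not anyone's actual tooling:

```python
# Rough GPU mining payback estimate; all figures are illustrative.

def payback_days(card_price: float, daily_profit: float) -> float:
    """Days of 24/7 mining needed to recoup the card's purchase price."""
    if daily_profit <= 0:
        return float("inf")  # the card never pays for itself
    return card_price / daily_profit

# A 3070 at a hypothetical $1,300 scalper price, netting $6/day:
print(round(payback_days(1300, 6)))  # ~217 days
# The same card at a $499 MSRP pays back in ~83 days:
print(round(payback_days(499, 6)))
```

Profit swings with coin price and network difficulty, so treat any such estimate as a snapshot, not a guarantee.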

 

8 minutes ago, Brooksie359 said:

having a PC run 24/7 that you can't even use?

Despite popular belief, a PC that is mining can be used as normal; I do it every day.

Some games can run in tandem without much impact, and you can always pause mining for tasks that are affected.

 

8 minutes ago, Brooksie359 said:

That seems like a huge waste of energy and time, imo.

I won't get into energy, as that's an endless debate.

But if you think it's not worth your time, then don't do it; many others clearly do.

The three 3060 Tis that I bought in December have already paid for themselves by mining.

-sigh- feeling like I'm being too negative lately


33 minutes ago, Brooksie359 said:

How is 6 dollars a day of profit worth having a PC run 24/7 that you can't even use? I mean, what is that per month? 180 bucks? That seems like a huge waste of energy and time, imo.

It's not a waste of time when you aren't using it anyway.

And with that, even using it and just mining in your downtime, you could probably make $500 within about 100 days.

 

And that's without optimising.

Undervolt (lowers temps), lower the power limit (you can get to around -50% before you lose perf, and it lowers your electricity cost), overclock mem and core, and you may be able to make closer to $6.50 or $7.00 (I'm not sure on this, but @Moonzy could tell you).
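The effect of the power-limit tweak above can be sketched numerically. The wattages and the $0.12/kWh electricity rate are assumptions of mine, not measured figures, and `net_daily` is a hypothetical helper:

```python
# Net daily mining income after electricity; all inputs are example values.

def net_daily(gross_usd: float, watts: float, usd_per_kwh: float) -> float:
    """Gross daily revenue minus 24 hours of electricity at the given draw."""
    return gross_usd - watts / 1000 * 24 * usd_per_kwh

stock = net_daily(6.00, 220, 0.12)  # hypothetical stock power limit
tuned = net_daily(6.00, 130, 0.12)  # undervolted / power-limited
print(f"stock: ${stock:.2f}/day, tuned: ${tuned:.2f}/day")
```

At these made-up numbers the tuned card keeps roughly $0.26/day more, on top of running cooler.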

 

 

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

BIOS database



Just now, HelpfulTechWizard said:

Undervolt (lowers temps), lower the power limit (you can get to around -50% before you lose perf, and it lowers your electricity cost), overclock mem and core, and you may be able to make closer to $6.50 or $7.00 (I'm not sure on this, but @Moonzy could tell you).

For the 3070, some cards can go down to 120W, but most of them require 130W.

So, around 48-55% power limit

 

Mem clock: I have 10 3070s.

The lowest-achieving two are +975, while the highest are +1400; the median would be around +1150 or so.

 

Core clock is either 0 or -502 (maximum underclock; it won't affect perf anyway).

 

Fan speed at 80% to ensure the VRAM gets adequately cooled, because there's no VRAM temperature sensor except on G6X modules. You may use 100% if you like jet engines.

 

Hash rate depends on mem clock, so around 59-62 MH/s.

 

Income is roughly $5.50-6 a day, before subtracting the electric fee.

Taking out the electric fee would leave... $5 perhaps, depending on your local electricity rate and setup.
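As a sanity check on the figures above (roughly 130 W and $5.50-6/day gross), here is the same arithmetic across a few hypothetical residential electricity rates; `net_after_power` is a name of my own invention:

```python
# Net income at ~130 W draw for a few example electricity rates (USD/kWh).

def net_after_power(gross_usd: float, draw_kw: float, usd_per_kwh: float) -> float:
    """Daily gross minus 24 hours of electricity at the given draw in kW."""
    return gross_usd - draw_kw * 24 * usd_per_kwh

for rate in (0.08, 0.12, 0.20):
    print(f"${rate:.2f}/kWh -> ${net_after_power(5.75, 0.130, rate):.2f}/day net")
```

At these example rates the net lands in the $5.10-5.50/day range, consistent with the "$5 perhaps" estimate.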



13 minutes ago, Moonzy said:

Mem clock: I have 10 3070s.

The lowest-achieving two are +975, while the highest are +1400; the median would be around +1150 or so.

 

Man, G6X really does do well.

 

For mining, my 480 gets about 8 MH/s at stock.

At the stock limit (+500 MHz mem) it's about +0.75 MH/s (about 9 MH/s).

BIOS-modded up to 2650 (+900 mem), I get +2 MH/s (10 MH/s).

It can run 2750 MHz mem, but I lose perf. (Probably need custom timings for it.)

 

I'd kill to get +1400 out of my memory (but apparently +1k is on the ridiculous end for a 480). The memclk-to-perf relation is basically a straight line on a 480.
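That "basically a straight line" claim can be checked with a least-squares fit through the three data points quoted above (stock, +500, and +900 MHz offsets); `ls_slope` is a hypothetical helper:

```python
# Least-squares slope of the memclk-offset -> hashrate relation for the 480.

def ls_slope(points: list[tuple[float, float]]) -> float:
    """Slope of the best-fit line through (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

data = [(0, 8.0), (500, 8.75), (900, 10.0)]  # MHz offset, MH/s
print(f"~{ls_slope(data) * 1000:.1f} MH/s per +1000 MHz")  # roughly +2.2
```

So at these points each +100 MHz of memory offset is worth a bit over +0.2 MH/s, which is why chasing a higher stable offset pays off.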




5 hours ago, Rym said:

Probably because it's not designed for mining.

That's true, but being well designed for gaming doesn't necessarily mean being bad for mining.

Just look at Nvidia's cards



21 minutes ago, Moonzy said:

Mem clock: I have 10 3070s.

O.O That's a lot. I want one.

✨FNIGE✨


4 hours ago, Brooksie359 said:

How is 6 dollars a day of profit worth having a PC run 24/7 that you can't even use? I mean, what is that per month? 180 bucks? That seems like a huge waste of energy and time, imo.

Simple. I bought my 5500 to play games on and to replace my 5-year-old 390, which simply died on me.

Then came work.

Now not only am I not at home during the day, I'm not at home at all 5 days out of 7.

I could leave it there collecting dust,

or I could recoup what I spent.

$6/day would be awesome, but I'll just have to make do with $3/day.

With that, I'll just use the money to upgrade my PC; eventually I'll reach the point where I can get a beefier GPU.

But being able to jump on Ryzen without putting my own work money into it is pretty awesome.

And all I had to do was leave the PC at home, on.

And it didn't even make a difference on my electricity bill. So, yay.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


5 hours ago, Moonzy said:

That's up to you.

At $6 a day, a year is over $1,800, more than a scalper-priced 3070.

Of course it's inconsistent, but I'm just saying some is better than none.

If GPUs could earn $100 a day, their prices wouldn't stop at $1,000.

Despite popular belief, a PC that is mining can be used as normal; I do it every day.

Some games can run in tandem without much impact, and you can always pause mining for tasks that are affected.

I won't get into energy, as that's an endless debate.

But if you think it's not worth your time, then don't do it; many others clearly do.

The three 3060 Tis that I bought in December have already paid for themselves by mining.

Yeah, I'm not in the habit of having a PC running 24/7 sucking up energy just to make some money. Sure, that's $1,800 in a year, but that's also leaving a PC on 24/7 for a year, which I'm not a fan of.


4 hours ago, HelpfulTechWizard said:

It's not a waste of time when you aren't using it anyway.

And with that, even using it and just mining in your downtime, you could probably make $500 within about 100 days.

And that's without optimising.

Undervolt (lowers temps), lower the power limit (you can get to around -50% before you lose perf, and it lowers your electricity cost), overclock mem and core, and you may be able to make closer to $6.50 or $7.00 (I'm not sure on this, but @Moonzy could tell you).

 

 

Maybe I just have a different sense of value, as $500 is not enough to warrant mining even part-time; it's a waste of electricity, and it also makes my PC run longer than it normally would. Not a fan of having to replace my fans sooner because they're running all the time.


6 hours ago, Brooksie359 said:

Maybe I just have a different sense of value, as $500 is not enough to warrant mining even part-time; it's a waste of electricity, and it also makes my PC run longer than it normally would. Not a fan of having to replace my fans sooner because they're running all the time.

Fine. Don't get your GPU's value back.

/s

Seriously though, it's just that, for me, $500 for a GPU is a bit steep; being able to make it back in so little time makes it worth it.



13 minutes ago, HelpfulTechWizard said:

Fine. Don't get your GPU's value back.

His hardware, his choices.

He may think that gaming more than offsets the value of his GPU, and that downtime from a system potentially broken by mining isn't worth it.

Side note:

I hate it when people force/shame others into mining.

If they don't want to do it, that's their choice; don't force your lifestyle/standards onto them.

Educate them and let them make their own decisions.



3 minutes ago, Moonzy said:

His hardware, his choices.

He may think that gaming more than offsets the value of his GPU, and that downtime from a system potentially broken by mining isn't worth it.

Side note:

I hate it when people force/shame others into mining.

If they don't want to do it, that's their choice; don't force your lifestyle/standards onto them.

Educate them and let them make their own decisions.

I meant that sarcastically; I guess it didn't come off that way.



Glad to read this; even from a purely gaming standpoint, these limiters are utter bullshit.
If they have the time and resources to develop limiters, I'd rather see them put those toward improving future products or making their drivers better.


If anyone does it as badly as NVIDIA has, then it's better to just not bother with it. Besides, this mining nonsense will crash several more times; then we'll have a short while of normal conditions until someone makes up another BS cryptocurrency and everything repeats...


17 hours ago, Uttamattamakin said:

That's all I've been saying, really. These days, at these prices, buying a GPU just to be entertained a little better isn't worth it. Not when an experience as good as a console can be had with a good AMD APU. (For those reading but not in the know: there are some important differences; look up heterogeneous system architecture.) There is a reason the PS5 and Xbox used them.

The problem with spending on a fast APU is that if someone intends to upgrade to a dGPU when prices finally come down, the iGPU portion becomes dead weight. May as well search for a PS5 if you want gaming, or fork over scalper prices if you need the productivity, and skip the PC for a while.

My eyes see the past…

My camera lens sees the present…


I'll never swap the experience of PC gaming for any console. Ever. I've had a PS2, and while it's not indicative of today's standards, game controls haven't changed. I just can't play FPS or RPG games with a gamepad. It's so unnatural and clumsy.


16 minutes ago, RejZoR said:

I'll never swap the experience of PC gaming for any console. Ever. I've had a PS2, and while it's not indicative of today's standards, game controls haven't changed. I just can't play FPS or RPG games with a gamepad. It's so unnatural and clumsy.

You can use a keyboard and mouse on consoles.

 

1 hour ago, RejZoR said:

If anyone does it as badly as NVIDIA has, then it's better to just not bother with it. Besides, this mining nonsense will crash several more times; then we'll have a short while of normal conditions until someone makes up another BS cryptocurrency and everything repeats...

I don't think crypto will die off anytime soon if people keep investing cash into it.


3 hours ago, Zodiark1593 said:

The problem with spending on a fast APU is that if someone intends to upgrade to a dGPU when prices finally come down, the iGPU portion becomes dead weight. May as well search for a PS5 if you want gaming, or fork over scalper prices if you need the productivity, and skip the PC for a while.

Not in the case of an AMD APU. The architecture uses the GPU to accelerate various productivity tasks separate from whatever a dGPU is doing. I see your point if we're talking about an Intel iGPU; that is a different proposition.


1 hour ago, Uttamattamakin said:

Not in the case of an AMD APU. The architecture uses the GPU to accelerate various productivity tasks separate from whatever a dGPU is doing. I see your point if we're talking about an Intel iGPU; that is a different proposition.

This was half the reason I upgraded my old Haswell i5 4440 to a Xeon E3 1280 v3 instead of going for a 4770. I didn't want an iGPU, since I'd never use it. The i5 had an iGPU, and I couldn't get the onboard graphics to work regardless.

The other half of the equation was that the Xeon was $100, and the cheapest used 4770 (with or without a K) was $300. Quite happy with the Xeon, even though its clocks are far more locked down than the 4770's. I mine on it 24/7, and the highest temp I've ever seen on it is 62°C.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


1 hour ago, Uttamattamakin said:

Not in the case of an AMD APU. The architecture uses the GPU to accelerate various productivity tasks separate from whatever a dGPU is doing. I see your point if we're talking about an Intel iGPU; that is a different proposition.

It doesn't. The only things the GPU built into the APU is useful for are driving extra displays and acting as an extra media encoder/decoder (and an Intel iGPU can do the same).

The whole "heterogeneous computing" thing was just a gimmick to say they had some software SDKs to make use of the GPU while taking advantage of the shared memory, and it didn't really catch on (other than on consoles). Having an APU in your desktop/laptop won't bring any benefit.
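The tradeoff being argued about here (shared memory removing the cost of copying data to the GPU) can be sketched as a toy model; every timing below is invented purely for illustration, and `offload_wins` is a hypothetical helper:

```python
# Toy model: offloading work to a GPU only wins when the speedup
# outweighs the data-transfer overhead, which HSA-style shared
# memory is meant to eliminate.

def offload_wins(cpu_s: float, gpu_s: float, copy_s: float) -> bool:
    """True if GPU compute time plus transfer overhead beats the CPU."""
    return gpu_s + copy_s < cpu_s

# Big workload on a discrete GPU: fast compute absorbs the PCIe copy.
print(offload_wins(cpu_s=1.0, gpu_s=0.1, copy_s=0.3))    # True
# Tiny kernel: the copy dominates, so the CPU wins.
print(offload_wins(cpu_s=0.05, gpu_s=0.01, copy_s=0.3))  # False
# Zero-copy shared memory (the HSA pitch): copy cost ~ 0.
print(offload_wins(cpu_s=0.05, gpu_s=0.01, copy_s=0.0))  # True
```

Whether real desktop software ever hits that third case outside consoles is exactly what the two posters disagree on.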

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


6 minutes ago, igormp said:

It doesn't. The only things the GPU built into the APU is useful for are driving extra displays and acting as an extra media encoder/decoder (and an Intel iGPU can do the same).

The whole "heterogeneous computing" thing was just a gimmick to say they had some software SDKs to make use of the GPU while taking advantage of the shared memory, and it didn't really catch on (other than on consoles). Having an APU in your desktop/laptop won't bring any benefit.

If you think so, buddy. I'm pretty sure quite a bit of software is coded to do exactly this. In fact, I'm pretty sure if you have a dGPU and use productivity software that can benefit from it, it will use it. Consider what Adobe says about Photoshop.

Photoshop graphics processor (GPU) card FAQ (adobe.com)

Just one example of a GPU being used for productivity work, not gaming. Clearly lots of video editing software uses the GPU for rendering and transcoding, and having an APU built to leverage that will work better than Intel's "good enough to boot with" graphics. Their Xe graphics are supposed to try to catch up on that, with supposedly better support for virtualization via a technology called SR-IOV, something Nvidia only supports, sort of, at the top end of their enterprise GPUs.

 


2 minutes ago, Uttamattamakin said:

If you think so, buddy. I'm pretty sure quite a bit of software is coded to do exactly this. In fact, I'm pretty sure if you have a dGPU and use productivity software that can benefit from it, it will use it. Consider what Adobe says about Photoshop.

Photoshop graphics processor (GPU) card FAQ (adobe.com)

Just one example of a GPU being used for productivity work, not gaming. Clearly lots of video editing software uses the GPU for rendering and transcoding, and having an APU built to leverage that will work better than Intel's "good enough to boot with" graphics. Their Xe graphics are supposed to try to catch up on that.

 

OK, looks like you've seen way too much marketing material.

 

What you're talking about applies to ANY GPU, whether it's the one built into an AMD CPU, an Intel CPU, or a dedicated one from Nvidia or AMD. What really matters is the software supporting the required APIs to make use of such hardware (something Nvidia excels at with CUDA).

In your example with Photoshop, if you have a dedicated GPU, Photoshop will use it and won't touch the integrated one whatsoever.

 

Here's how AMD's HSA stuff works:

[HSA architecture diagram]

(source).

 

They haven't updated their HSA SDK in years and are instead relying on public consortiums to maintain it.

 

AMD's integrated GPUs do perform better than Intel's, yes, but that's solely due to having more and better hardware; it has nothing to do with the whole HSA stuff.

 

8 minutes ago, Uttamattamakin said:

With supposedly better support for virtualization via a technology called SR-IOV, something Nvidia only supports, sort of, at the top end of their enterprise GPUs.

Intel had GVT-g first. AMD's implementation is called MxGPU and is only available on Radeon FirePro S GPUs (so not even all of their enterprise GPUs support it). Intel is the only one offering the feature in consumer hardware without requiring a license.

 

Keep in mind that hardware passthrough has nothing to do with what you're talking about. SR-IOV allows you to "share" a PCIe device between multiple guests. 

 

Feel free to ask any questions about GPUs or AMD's APUs. I actually had a laptop with their first APU arch (Llano), which was kind of okay-ish at the time but nothing really impressive, and I also work with CUDA on a daily basis (mostly ML stuff).


