
Nvidia & their AIBs are deceiving customers yet again with new GT 1030 variant

6 hours ago, AluminiumTech said:

Snip

Here's the thing: depending on how fast the GPU is, going to slower memory might not even change the card's performance. The GPU has to be fast enough to take advantage of the extra memory speed, and until we have some solid comparisons between the two models it's hard to throw stones.

 

This could just be a way for them to reduce the price on this card to make it more competitive with the 2200G and 2400G (low GPU cost + stronger CPU for the same money). They might have some old memory they're trying to get rid of, or they landed a good deal on some. Either way, until we have solid comparisons showing that performance is affected, there's literally no argument to be had here.


1 hour ago, mrthuvi said:

I'll give you my situation:

The onboard graphics on my CPU died for some reason. I followed advice and bought a used low-end graphics card. The first one died after 6 months, the next one after 3 months. Now I need a new dedicated card just for the 3-year warranty and peace of mind, and I don't care at all what kind of performance it gets.

The 2200G also has a warranty. I realize a full platform upgrade costs a lot more than a 1030, especially because of RAM pricing, but still, there's no reason you can't live without a GPU for a few more weeks to afford a 1050, which should be the bare minimum for a dedicated GPU.


16 minutes ago, AngryBeaver said:

Here's the thing: depending on how fast the GPU is, going to slower memory might not even change the card's performance. The GPU has to be fast enough to take advantage of the extra memory speed, and until we have some solid comparisons between the two models it's hard to throw stones.

The difference here is big enough that it's absolutely certain to cripple performance.


2 minutes ago, Sakkura said:

The difference here is big enough that it's absolutely certain to cripple performance.

This has been said before, and if I remember correctly, last time it correlated to a decrease in performance of 5% or less.


It should be named GT 1020 [:


4 hours ago, DildorTheDecent said:

I love these threads where everybody pretends to care about a product they aren't buying.

I care because it's a product we sell where I work. I don't care for myself, but I care for the people we'll sell them to.


32 minutes ago, AngryBeaver said:

This has been said before, and if I remember correctly, last time it correlated to a decrease in performance of 5% or less.

https://www.techpowerup.com/reviews/MSI/GTX_650_Power_Edition/27.html

 

The GTX 650 is about 50% faster than the GT 640 at 1080p, while GPU clocks can only account for a difference of 17% at most.
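To spell out where that ~17% ceiling comes from, here's a quick Python sketch. The reference core clocks (900 MHz for the GT 640 DDR3, 1058 MHz for the GTX 650) are my assumption for illustration, not figures from the reviews linked here; both cards use the same 384-core GK107 GPU, so clock speed is the only compute-side difference.

```python
# Assumed reference core clocks in MHz (not stated in the linked reviews).
gt640_clock = 900    # GT 640 DDR3
gtx650_clock = 1058  # GTX 650 GDDR5

# Both cards are 384-core GK107, so the clock ratio bounds the compute gain.
print(f"max gain from clocks alone: {gtx650_clock / gt640_clock - 1:.1%}")  # ~17.6%
```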

 

https://www.anandtech.com/show/5969/zotac-geforce-gt-640-review-/13

 

Quote

NVIDIA’s GK107 GPU may have a lot of performance potential, but its first desktop iteration as the GeForce GT 640 DDR3 does not. The decision to equip it with DDR3 clearly bottlenecks the card just as it has done to previous generation entry-level cards. So this is by no means a new problem, but it’s a recurring problem that always has the same solution: buy GDDR5.

 


2 minutes ago, Sakkura said:

https://www.techpowerup.com/reviews/MSI/GTX_650_Power_Edition/27.html

 

The GTX 650 is about 50% faster than the GT 640 at 1080p, while GPU clocks can only account for a difference of 17% at most.

 

https://www.anandtech.com/show/5969/zotac-geforce-gt-640-review-/13

 

 

The thing is, though, the memory has to be slow enough to bottleneck the card. Also, DDR3 is a much bigger step down from GDDR5 than DDR4 is. I suspect the performance loss will be very small, if there is any at all.


2 hours ago, Energycore said:

Wait, I came to this article expecting lower core counts on the DDR4 1030s. So what if they have lower clocks? You can overclock. Other than some naming confusion this seems pretty minor, and it's clear to me: if you see DDR4 it's a slow 1030, if you see GDDR5 it's a fast one. I already knew that just from looking at the memory types.

 

Compared to giving cards with different core counts the same name (GTX 1060 3GB/6GB, RX 560 896/1024 cores), this seems pretty minor imo.

Well... customers may end up buying the GP108-300 instead of the GP108-310 by accident.



Wow, you guys are acting like this is the first time this has happened. The GTS 450 did it too, and I believe some AMD cards did this as well.


35 minutes ago, AngryBeaver said:

The thing is, though, the memory has to be slow enough to bottleneck the card. Also, DDR3 is a much bigger step down from GDDR5 than DDR4 is. I suspect the performance loss will be very small, if there is any at all.

The GT 640 was 1800 MT/s vs. the 5000 MT/s of the GTX 650. 2.78x the speed.

 

The GT 1030 DDR4 is 2100 MT/s (or 2133; it might have been rounded down) vs. 6000 MT/s, which is either 2.86x or 2.81x the speed. So it's a slightly bigger difference than it was for the GT 640.
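A minimal sketch of that arithmetic, using only the transfer rates quoted above:

```python
# Memory transfer rates in MT/s, as quoted above.
print(f"GTX 650 GDDR5 vs GT 640 DDR3: {5000 / 1800:.2f}x")  # 2.78x
print(f"GT 1030 GDDR5 vs DDR4 @ 2100: {6000 / 2100:.2f}x")  # 2.86x
print(f"GT 1030 GDDR5 vs DDR4 @ 2133: {6000 / 2133:.2f}x")  # 2.81x
```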


46 minutes ago, AngryBeaver said:

The thing is, though, the memory has to be slow enough to bottleneck the card. Also, DDR3 is a much bigger step down from GDDR5 than DDR4 is. I suspect the performance loss will be very small, if there is any at all.

Well, the GT 1030 has a 64-bit memory bus, vs 128-bit on the 640/650. The memory clock of the DDR4 is also quite low, vs the relatively high clock of the DDR3 used on the 640. As a result, the 1030 just barely punches above 16 GB/sec while the 640 approaches 30 GB/sec.

 

Honestly, my phone has nearly twice the memory bandwidth of the 1030 DDR4.
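Those bandwidth figures follow from the standard peak-bandwidth formula: bus width in bytes times transfer rate. A minimal sketch, assuming the 2100 MT/s DDR4 figure quoted earlier in the thread:

```python
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mts / 1000

print(peak_bandwidth_gbs(64, 2100))   # GT 1030 DDR4:  ~16.8 GB/s
print(peak_bandwidth_gbs(128, 1800))  # GT 640 DDR3:   ~28.8 GB/s
print(peak_bandwidth_gbs(64, 6000))   # GT 1030 GDDR5: ~48.0 GB/s
```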


8 hours ago, VegetableStu said:

AMD's equally guilty of this for the RX560(? if I remember right)

That card is labeled the RX 560D.


8 minutes ago, Jahramika said:

That card is labeled the RX 560D.

HD 6670? The 5xxx series too. It's all over the place from both companies; it's really nothing new.

The GT 730 does this too, and so did the HD 5670.

Like I said, nothing new.


Y'know, as an owner of the original variant (which should not be a thing), it's sickening that Nvidia is becoming so blatant in their bullying tactics. Currying favor is part of the industry, and relabeling and repurposing parts is par for the technological course. However, when they do it like this it looks like they're just trying to save money, and the problem is that they don't tell anyone.

 

That's the difference between a slice of Velveeta versus a slice of Black Diamond in your hamburger. Only one of them is mostly oil, but you're being charged the same price. They don't pay the difference; they take it from you. That's scamming, and false advertising. Let's not fall into the trap of "it's okay because it's always been done". It's wrong and they need to be called out for it. I'm glad Linus got pissed on stream - it's unethical and bad practice.

 

Just like their stance on cryptocurrency: they don't make more money on overpriced cards, but all that PR is free, even if it's bad. Anyway, I like to be able to trust the brand I buy, and this makes me wary of Nvidia, a brand that has been dependable for this kiddo for nearly twenty years.


On 07/04/2018 at 1:09 AM, Misanthrope said:

The 2200G also has a warranty. I realize a full platform upgrade costs a lot more than a 1030, especially because of RAM pricing, but still, there's no reason you can't live without a GPU for a few more weeks to afford a 1050, which should be the bare minimum for a dedicated GPU.

I can't live without one because I need my PC to answer emails, and my PC won't work without a dedicated GPU because the GPU in my CPU died.


On the subject of necessity, that's what it comes down to. With my i7's built-in graphics being not at all comparable, having the option of a 1030... well, the card just sells itself. I actually plan to upgrade to a 1050 now that low-profile versions are being made; AMD's low-profile offerings just don't match up. All the more reason I need to be able to trust the vendor.


I have in the past purchased one of these types of cards, simply because I was repurposing an old motherboard that didn't have video output, so I got the cheapest new card I could find.

 

I do agree with this though

On 06/04/2018 at 6:28 PM, Princess Cadence said:

It should be named GT 1020 [:

 


9 minutes ago, valdyrgramr said:

Low-end cards have used DDR3 too.

And DDR2 as well. 


GDDR4 VRAM was skipped completely by Nvidia, until recently. The only manufacturer of these cards was ATI, and they were very short-lived.

 

Compared to GDDR3 VRAM:

Quote

- The clock frequencies were nearly the same
- Power consumption was higher
- Latencies were longer

 

 

The cards did manage to stay cool, but that doesn't make up for it. The platform was doomed from the start, which is why many manufacturers didn't follow along with it.

 

The reason Nvidia skipped GDDR4 is its manufacturing price. Nvidia didn't see any profit coming, so they decided not to manufacture on this process.

 

Only 7 cards were manufactured, all from ATI.

 

This video explains a lot of what I said.


50 minutes ago, tj_420 said:

GDDR4 VRAM was skipped completely by Nvidia, until recently. The only manufacturer of these cards was ATI, and they were very short-lived.

 

Compared to GDDR3 VRAM:
 

 

The cards did manage to stay cool, but that doesn't make up for it. The platform was doomed from the start, which is why many manufacturers didn't follow along with it.

 

The reason Nvidia skipped GDDR4 is its manufacturing price. Nvidia didn't see any profit coming, so they decided not to manufacture on this process.

 

Only 7 cards were manufactured, all from ATI.

 

This video explains a lot of what I said.

When and how did you get the impression GDDR4 was being used? Granted, JEDEC's naming scheme could be better, but GDDR4 is an entirely different memory technology than the DDR4 being employed in the 1030 video card.


There's no reason to buy any card below the 50 line from Nvidia; all low-end hardware is bad value. So, whatever; as stated above, this is nothing new, and the only one being scummy is Palit for not marking them differently.

 

I do find it funny that the DDR4 they used is slower than a lot of DDR3 xD, though this seems to be an extremely low-power SKU.


1 minute ago, AresKrieger said:

There's no reason to buy any card below the 50 line from Nvidia; all low-end hardware is bad value. So, whatever; as stated above, this is nothing new, and the only one being scummy is Palit for not marking them differently.

Pretty much. About the only current use I could see for one would be in a Ryzen/Threadripper system that needs the CPU power but for some reason doesn't need GPU power, and even then there are cheaper display-output cards available, not counting used ones.

