Yeah it's SUPER, super out of stock that is - new RTX card rumors.

williamcll

Super? More like Super F5 mashing ....

 

Forget it ... back to my meat grilling overheating laptop it is.

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide


10 hours ago, Sarra said:

I doubt it. The actual signalling wouldn't be there, and the controller on the die would be expecting specific capacity chips, if it saw double that capacity, you would either have half the bandwidth, half the capacity, or it would flat out reject the chips, and you'd basically have a card with no memory.

Bummer, I was afraid something like that would be the answer... 

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


3 hours ago, porina said:

Is the 3070 the 104 or 102 die? We know the 102 die certainly supports the higher density chips (3090 vs 3080)

It's GA104 :/

 

Edit: Anyway, ISN'T THAT SOMETHING LINUS WOULD COVER? 

 

"PIMP YOUR 3070, NOW WITH MORE COREZ RAMZ!" 

 


Just to reiterate

 

"SUPER" designates a product refresh, and replaces the original. Much like how the 2070S replaced the OG 2070 (although it did not replace the 2060, as that one got a price cut in response to the 5600XT). "Ti" designates a complementary product with enhanced specifications, meaning that it doesn't replace a card, but rather complements it.

 

Of course, we now apparently have "ULTRA"...

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


9 minutes ago, D13H4RD said:

Just to reiterate

 

"SUPER" designates a product refresh, and replaces the original. Much like how the 2070S replaced the OG 2070 (although it did not replace the 2060, as that one got a price cut in response to the 5600XT). "Ti" designates a complementary product with enhanced specifications, meaning that it doesn't replace a card, but rather complements it.

 

Of course, we now apparently have "ULTRA"...

Yeah, it's confusing, on purpose I'd wager, but in the end we just know new, better cards are coming soon.

 

The questions are availability and pricing...


7 hours ago, porina said:

Is the 3070 the 104 or 102 die? We know the 102 die certainly supports the higher density chips (3090 vs 3080) and I'd be surprised if the lower ones didn't also. Look at it like desktop ram, you can put two modules on a channel instead of one. Max bandwidth is same, although there are optimisations that often lead to two modules working better than one. Logically this can happen within a single higher capacity chip also (like dual rank vs single rank modules). 

 

 

 

Elsewhere:

https://www.igorslab.de/en/nvidia-geforce-rtx-3060-ultra-12-gb-schneller-als-eine-rtx-3060-ti-asus-tuf-gaming-geleakt-2/

What year is it? The "Ultra" name might be coming back with this rumoured 3060 Ultra 12GB. For those who aren't as ancient as me, Ultra used to be Nvidia's top-model GPU suffix, above GT.

When the GPU is made, the controller is programmed for the chips installed on it. This is why there are blank RAM spots with all of the solder pads in place, but if you were to pull a RAM chip off another card and solder it on, it wouldn't be recognized by the controller.

 

On top of that, Nvidia probably has the firmware locked, so it just wouldn't work.

 

However, if there's a 3080 Ti and it has more memory capacity, it could be possible, assuming the memory controller isn't totally locked down, to flash the 3080 Ti firmware to a 3080 or 3080 Super and solder higher-capacity chips onto it.

 

It just wouldn't be worth it in the end, though, due to the time and cost involved. It would simply be cheaper to buy a card that already has the capacity you want, like a 6800 XT.
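To illustrate why swapping in denser chips (even if the controller accepted them) couldn't add bandwidth: bus width is set by the chip count and pins per chip, so density only changes capacity. A rough sketch with hypothetical 3070-class numbers (8 chips, 32 bits each, 14 Gbps per pin):

```python
# Back-of-the-envelope GDDR6 arithmetic (hypothetical 3070-class numbers).
def gddr6_config(chips, gbit_per_chip, gbps_per_pin, pins_per_chip=32):
    bus_width = chips * pins_per_chip           # total bus width in bits
    bandwidth = bus_width * gbps_per_pin / 8    # GB/s
    capacity = chips * gbit_per_chip / 8        # GB
    return bus_width, bandwidth, capacity

# Same 8-chip, 256-bit bus either way; only the density changes
print(gddr6_config(8, 8, 14))    # (256, 448.0, 8.0)
print(gddr6_config(8, 16, 14))   # (256, 448.0, 16.0) -- denser chips double capacity only
```

Same bus, same 448 GB/s; only the capacity moves, which is why a hypothetical 16GB 3070 would be no faster, just roomier.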

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


1 hour ago, SteveGrabowski0 said:

Honestly WGAF? Thanks to the douchebag scalpers and the ethereum miners these will all be unobtanium anyways.

Depends.  If Nvidia makes a card that sucks for mining but still works for games, or makes a special card that is cheaper for miners to use than cards made for gaming they could make things go. 

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


7 hours ago, Bombastinator said:

Depends.  If Nvidia makes a card that sucks for mining but still works for games, or makes a special card that is cheaper for miners to use than cards made for gaming they could make things go. 

It would have to be an RTX 3060 4GB or something 😂


7 hours ago, Bombastinator said:

Depends.  If Nvidia makes a card that sucks for mining but still works for games, or makes a special card that is cheaper for miners to use than cards made for gaming they could make things go. 

The latter, not the former. If there was enough fabrication capacity, they would be making headless cards. They still might however depending on how the GPUs are binned, but that would be only if it's better than throwing them into the trash.

 

Keep in mind that CUDA cores are used for gaming. It's the same units used to mine crypto. You can't gimp one without the other.


3 hours ago, StDragon said:

The latter, not the former. If there was enough fabrication capacity, they would be making headless cards. They still might however depending on how the GPUs are binned, but that would be only if it's better than throwing them into the trash.

 

Keep in mind that CUDA cores are used for gaming. It's the same units used to mine crypto. You can't gimp one without the other.

AFAIK the most profitable algorithms/coins (Ethereum) want a relatively large amount of fast memory. AMD's Infinity Cache doesn't help much here. The 256-bit GDDR6 cards (AMD's 6800 to 6900 XT, Nvidia's 3060 Ti and 3070) all get about 60 MH/s on Ethereum.
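For what it's worth, that ~60 MH/s figure lines up with rough bandwidth arithmetic: Ethash is memory-bound, doing on the order of 64 random 128-byte DAG reads per hash, so the hashrate ceiling is roughly memory bandwidth divided by 8 KiB. A sketch (the access counts are approximate):

```python
# Rough Ethash hashrate ceiling from memory bandwidth (bandwidth-bound model).
def ethash_hashrate_mhs(bandwidth_gbs, reads_per_hash=64, bytes_per_read=128):
    bytes_per_hash = reads_per_hash * bytes_per_read   # 8 KiB of random DAG reads
    return bandwidth_gbs * 1e9 / bytes_per_hash / 1e6  # MH/s

# 256-bit GDDR6 at 14 Gbps -> 448 GB/s
print(round(ethash_hashrate_mhs(448)))  # ~55 MH/s, in the ballpark of the observed ~60
```

Which is also why Infinity Cache doesn't help: the DAG is several gigabytes, so the reads miss the cache and hit the GDDR6 bus anyway.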


4 hours ago, StDragon said:

The latter, not the former. If there was enough fabrication capacity, they would be making headless cards. They still might however depending on how the GPUs are binned, but that would be only if it's better than throwing them into the trash.

 

Keep in mind that CUDA cores are used for gaming. It's the same units used to mine crypto. You can't gimp one without the other.

Headless cards had issues IIRC; resale-value problems lowered their effective value. They need to figure out some kind of mining-optimized design to make miners want those instead of gaming cards.


4 hours ago, SteveGrabowski0 said:

It would have to be an RTX 3060 4GB or something 😂

That could be a solution.  Makes me wonder: if someone wrote an app that let consoles mine, what would happen to the white Xbox?


1 hour ago, Bombastinator said:

They need to figure out some kind of mining optimized design to make the miners want those and not gaming cards. 


Math doesn't care about intentions. You're either calculating integer, floating point, or vector.

 

There's no way to hobble mining without affecting gaming in any measurable way.

 

If you want NVidia to help crypto, they should just put in an order to make ASICs to compete against Antminer.


21 minutes ago, StDragon said:


Math doesn't care about intentions. You're either calculating integer, floating point, or vector.

 

There's no way to hobble mining without affecting gaming in any measurable way.

 

If you want NVidia to help crypto, they should just put in an order to make ASICs to compete against Antminer.

That would do it.  My understanding is the reason miners use gaming GPUs is they have a much better resale value which effectively lowers their total cost.  For a mining device to be more desirable than a gaming card there has to be a positive cost/benefit.

 

One possible solution is software based.  My memory is that as a general rule miners can’t code for beans.  It’s basically a script kiddie thing.  One possibility is to write code in such a way that it is advantageous for a given type of card. 

Edited by Bombastinator


5 minutes ago, Bombastinator said:

That would do it.  My understanding is the reason miners use gaming GPUs is they have a much better resale value which effectively lowers their total cost.  For a mining device to be more desirable than a gaming card there has to be a positive cost/benefit.

 

One possible solution is software based.  My memory is that as a general rule miners can’t code for beans.  It’s basically a script kiddie thing.  One possibility is to write code in such a way that it is advantageous for a given type of card. 


GPUs are used because ASICs can have up to a year of back-order! If they can hack a PS5 to mine, they will, and those will be priced way higher!

 

The quants have looked into leveraging anything from cloud based rental to malware sapping CPU and GPU cycles.


1 minute ago, StDragon said:


GPUs are used because ASICs can have up to a year of back-order! If they can hack a PS5 to mine, they will, and those will be priced way higher!

 

The quants have looked into leveraging anything from cloud based rental to malware sapping CPU and GPU cycles.

If there are machines that work better than gaming GPUs and the problem is that manufacturers can't react to the market fast enough, it stands to reason that there should be more manufacturers.  My understanding was that ASICs are only useful for Bitcoin specifically; they're not generalized enough.  It sort of sounds like what's needed is something more cartridge-based, something sufficiently adaptable.


1 hour ago, Bombastinator said:

My memory is that as a general rule miners can’t code for beans.  It’s basically a script kiddie thing. 

The typical miner doesn't write their own mining code for the same reason pretty much everyone here didn't write Windows: it makes no sense to do so when there is readily available software that can be used. And there are often multiple choices for a given crypto; it isn't like there's only one person doing it in the whole world. But generally people will tend towards the best-performing one.

 

1 hour ago, Bombastinator said:

If there are machines that work better than gaming GPUs and the problem is that manufacturers can't react to the market fast enough, it stands to reason that there should be more manufacturers.  My understanding was that ASICs are only useful for Bitcoin specifically; they're not generalized enough.  It sort of sounds like what's needed is something more cartridge-based, something sufficiently adaptable.

Once you talk about ASICs you're talking about getting dedicated silicon made, which is neither cheap nor fast. You have to be sure that what you're doing will stay relevant long enough to make your costs back. For something long-term stable like Bitcoin that might work; the risk/reward for other things makes it harder to do.

 

Haven't looked into it, but I do wonder if FPGAs can be, or have been, used. See them as reconfigurable hardware, somewhere between something as general as a GPU and something as fixed as an ASIC. Not cheap, but they could make up for it by reacting faster than the ASIC route.
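Whether it's GPU, FPGA, or ASIC, the "make your costs back" point reduces to simple payback arithmetic. A sketch with entirely made-up numbers:

```python
# Payback period for mining hardware, ignoring difficulty drift and resale value.
def payback_days(hardware_cost, daily_revenue, daily_power_cost):
    net = daily_revenue - daily_power_cost
    if net <= 0:
        return float("inf")  # never pays for itself
    return hardware_cost / net

# Made-up example: $2000 miner earning $20/day minus $4/day electricity
print(payback_days(2000, 20, 4))  # 125.0 days
```

The gaming-GPU advantage is that resale value effectively reduces `hardware_cost`, which dedicated silicon can't match if the coin it targets dies.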

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 hours ago, porina said:

Haven't looked into it, but I do wonder if FPGAs can be, or have been, used. See them as reconfigurable hardware, somewhere between something as general as a GPU and something as fixed as an ASIC. Not cheap, but they could make up for it by reacting faster than the ASIC route.

It's a bit of a paradox, but if everyone could mine more cheaply, crypto wouldn't have the ROI that it does. It's basic supply/demand, econ-101 stuff. In a roundabout way, if someone were to maliciously send out a virus that bricked the firmware on mining equipment, whatever remained functioning would profit more from mining, as the pie would have shrunk.

 

Look, IMHO, once crypto goes mainstream, that's the time to leave the market. Never invest at the high end. And as the gold rush proved, those who stood to profit were those making the tools, not so much those using them. Remember that next time you put on a pair of Levi Strauss jeans. 😄


3 minutes ago, StDragon said:

It's a bit of a paradox, but if everyone could mine more cheaply, crypto wouldn't have the ROI that it does. It's basic supply/demand, econ-101 stuff. In a roundabout way, if someone were to maliciously send out a virus that bricked the firmware on mining equipment, whatever remained functioning would profit more from mining, as the pie would have shrunk.

We've got enough crypto threads, directly or indirectly. The point of mentioning FPGAs was to offer a short-term alternative to miners using GPUs. What happens to crypto doesn't matter from a gaming-GPU perspective. An FPGA-based miner would still need someone to make and configure it, as that's outside the skills of most miners, so in that respect it's still closer to the ASIC-based approach.


24 minutes ago, porina said:

An FPGA-based miner would still need someone to make and configure it, as that's outside the skills of most miners, so in that respect it's still closer to the ASIC-based approach.

Technically, I agree with you. But it all boils down to ROI, and as I recall FPGAs are really expensive for the performance they deliver. It's a very niche space where you want something better than CPU performance but don't want to front millions of dollars for an entry-level ASIC solution (as in totally custom).

 

What will be really interesting is what AMD does with the IP from Xilinx. Will they incorporate elements of the FPGA stack in each chiplet? 🤔 Might make for more flexible mining, but I doubt it would be enough to overcome the raw number of CUDA cores on a GPU.

