
AMD Radeon VII Benchmark/Launch Mega Thread

Taf the Ghost
Just now, valdyrgramr said:

Well, I keep hearing in the videos that they were just using the current drivers. AMD usually tunes cards over time with driver updates to compete better. So, most likely. I'm waiting until August before buying one, and currently they're all sold out. XD

Not surprised that they're all sold out, lol.

 

I was thinking about maybe saving up some money after I get a job (not 16, ow) and getting one, but I feel like there's no need, because I have a 1080p/144 Hz monitor. I'll probably just get a 2070/1080 or V64.

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


1 minute ago, mxk. said:

I can't watch the video since I'm in class, but isn't that the incident that got Freakazoid kicked out of C9? Simple is such a cuck, but man, he's such a good player. #1 on HLTV's rankings for 2018.

Not sure, but it's the classic "y u bully me? u fakin bish" clip. ;)

 

Side note: I kinda feel bad for simple; it feels like he and electronic are the only ones doing work on his team, lol.

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


Just now, PCGuy_5960 said:

Not sure, but it's the classic "y u bully me? u fakin bish" clip. ;)

 

Side note: I kinda feel bad for simple; it feels like he and electronic are the only ones doing work on his team, lol.

What team is he on? Still Na'Vi? I'm pretty sure that's the clip where freak and simple were incredibly toxic.

 

I really haven't been paying attention to all the team swaps lately, but something that did surprise me is flusha and kioshima both being on C9, lol!

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


Just now, mxk. said:

What team is he on? Still Na'Vi? 

Yep.

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


Just now, PCGuy_5960 said:

Yep.

Wow! electronic on navi? tfff?

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


1 minute ago, mxk. said:

Wow! electronic on navi? tfff?

He's been on NaVi for more than a year at this point. xD

https://liquipedia.net/counterstrike/Electronic

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


Don't forget, we always see driver updates and boost optimization. 

 

What exactly did they mean at CES when it was mentioned that The Division 2 will support "the full set of advanced Radeon features" at launch? Lisa Su quickly hushed him up.

 

What are these advanced Radeon features?

 

 

 

 

CPU i7 4960x Ivy Bridge Extreme | 64GB Quad DDR-3 RAM | MBD Asus x79-Deluxe | RTX 2080 ti FE 11GB |
Thermaltake 850w PWS | ASUS ROG 27" IPS 1440p | | Win 7 pro x64 |


1 minute ago, PCGuy_5960 said:

He's been on NaVi for more than a year at this point. xD

https://liquipedia.net/counterstrike/Electronic

I really stopped paying attention to the pro scene after I went scout only, lol.

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


I was actually pleased to see that the card is not only competitive with the 2080 in terms of performance but finally also in terms of temps and power: the node shrink was welcome. Nvidia is still more efficient, and I'd imagine they'll pull even better numbers once they move down a node, but for the time being they weren't first, so it's a net positive for AMD if they can meet demand (which would be a shame if they can't due to HBM2 supply; still, betting on both a new node and the newest memory is a gamble they took of their own volition).

 

What I am not happy about is the apparent driver regression, especially since drivers were a bigger issue than thermals to begin with. The whole feature set is kind of meaningless to me now (especially now that there's FreeSync on Nvidia... kinda), but I'll still reserve judgement until there's a mature driver, if such a time comes before Nvidia releases a better product; so AMD has anywhere from a few months to a year to get their shit together.

 

So overall it's decent, but nobody should be beta testing drivers on an 800 USD product; no ifs or buts about it.

-------

Current Rig

-------


8 hours ago, Matsozetex said:

I know this may be unrelated, but the Seasonic Focus Plus series of PSUs had an issue with V56/64 cards; I guess not anymore. Gosh, it's so ridiculous to look at the Vega peak draw compared to its competitors.

That's a power-modded Vega card, and only the Vega 56 can be power-modded like that. An unmodded Vega 64 or 56 draws about the same as, or slightly less power than, the Radeon VII.

CPU: i7 4790k @ 4.7 GHz

GPU: XFX GTS RX580 4GB

Cooling: Corsair h100i

Mobo: Asus z97-A 

RAM: 4x8 GB 1600 MHz Corsair Vengence

PSU: Corsair HX850

Case: NZXT S340 Elite Tempered glass edition

Display: LG 29UM68-P

Keyboard: Roccat Ryos MK FX RGB

Mouse: Logitech g900 Chaos Spectrum

Headphones: Sennheiser HD6XX

OS: Windows 10 Home


24 minutes ago, THraShArD said:

Don't forget, we always see driver updates and boost optimization. 

 

What exactly did they mean at CES when it was mentioned that The Division 2 will support "the full set of advanced Radeon features" at launch? Lisa Su quickly hushed him up.

 

What are these advanced Radeon features?

 

 

 

 

Interesting, I hadn't noticed that. It really does seem like they planned on talking more about it; my guess is that it's Navi features, as Navi was supposed to come sooner.


15 minutes ago, cj09beira said:

Interesting, I hadn't noticed that. It really does seem like they planned on talking more about it; my guess is that it's Navi features, as Navi was supposed to come sooner.

GDC seems to be the big info drop. There's going to be more across-the-board DXR stuff, which might have more to do with compute-based lighting engines than ray tracing per se, if I've read the tea leaves properly.


AMD should just focus on making CrossFire really good; then they don't have to make chips as good as Nvidia's. Just make two cheap cards perform like one mid-range card and two mid-range cards perform like one high-end card, and price them lower.


1 minute ago, ToneStar said:

AMD should just focus on making CrossFire really good; then they don't have to make chips as good as Nvidia's. Just make two cheap cards perform like one mid-range card and two mid-range cards perform like one high-end card, and price them lower.

Problem is, it would be very difficult to do that at the driver level; it would end up requiring per-game profiles again, with large performance differences between games. It would be a mess.

Once games are fully ray traced, it will be quite a bit easier.


Just now, cj09beira said:

Problem is, it would be very difficult to do that at the driver level; it would end up requiring per-game profiles again, with large performance differences between games. It would be a mess.

Once games are fully ray traced, it will be quite a bit easier.

Try to make it so games don't need to be optimized for it. There have been technologies in the past that didn't require software support, such as 3dfx's version of SLI, where the cards would each render every other scan line. PowerVR and S3 also had their own hardware approaches to it.


S3, I believe, had a technology where each chip would render a different section of the screen: you could have four chips, the screen would be divided vertically, and the software would sync them together.
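For illustration, here's a minimal sketch (hypothetical code, not 3dfx's or S3's actual implementation) of how those two static partitioning schemes divide a frame without any game involvement: 3dfx-style scan-line interleaving gives each GPU every Nth row, while an S3-style split gives each chip a contiguous vertical band.

```python
# Hypothetical sketch of driver-transparent multi-GPU work partitioning.
# Illustrative only; not vendor code.

def scanline_interleave(height, num_gpus):
    """3dfx-style SLI: GPU i renders every num_gpus-th scan line."""
    return {gpu: list(range(gpu, height, num_gpus)) for gpu in range(num_gpus)}

def vertical_split(width, num_gpus):
    """S3-style split-frame: GPU i renders one contiguous band of columns."""
    band = width // num_gpus
    return {gpu: range(gpu * band, width if gpu == num_gpus - 1 else (gpu + 1) * band)
            for gpu in range(num_gpus)}

# A 1024x768 frame spread across 4 chips:
rows = scanline_interleave(768, 4)  # GPU 0 gets rows 0, 4, 8, ...
cols = vertical_split(1024, 4)      # GPU 0 gets columns 0-255, etc.
```

Both schemes are static, which is why they needed no per-game support; it's also why effects that sample neighbouring pixels (modern AA, post-processing) break them, as the replies below point out.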


3 minutes ago, ToneStar said:

AMD should just focus on making CrossFire really good; then they don't have to make chips as good as Nvidia's. Just make two cheap cards perform like one mid-range card and two mid-range cards perform like one high-end card, and price them lower.

No, they need to get their efficiency to Nvidia's level first. Vega launched in 2017, so it's mostly a 2014 design, given the way things work in silicon engineering. mGPU as we know it will always require developer support, and no one bought cards that way even when developers provided it.

Just now, cj09beira said:

Problem is, it would be very difficult to do that at the driver level; it would end up requiring per-game profiles again, with large performance differences between games. It would be a mess.

Once games are fully ray traced, it will be quite a bit easier.

AMD will have to rework the entire front-end to be able to run MCMs the way we all think they're going to. The problems will be latency and coherence. We'll see what happens when they drop their MCM MI100/120 card at some point; that'll be at least their first generation of on-PCB MCM setups.

 

I honestly think it'll be after Zen 4 and in-package chiplets before we see MCM GPUs that can handle gaming. It's going to take AMD a while to get the architecture to a state where it's seamless. CCIX becoming much more common will help them along.


5 minutes ago, ToneStar said:

Try to make it so games don't need to be optimized for it. There have been technologies in the past that didn't require software support, such as 3dfx's version of SLI, where the cards would each render every other scan line. PowerVR and S3 also had their own hardware approaches to it.

Problem is, games have become much more complex since then; things like temporal anti-aliasing, for example, are quite a pain in the butt for CrossFire.

 

4 minutes ago, ToneStar said:

S3, I believe, had a technology where each chip would render a different section of the screen: you could have four chips, the screen would be divided vertically, and the software would sync them together.

The problem with that is anti-aliasing, where the cards would have to quickly share the results for pixels near the divide, or have the anti-aliasing done on a single die.


1 minute ago, cj09beira said:

Problem is, games have become much more complex since then; things like temporal anti-aliasing, for example, are quite a pain in the butt for CrossFire.

 

The problem with that is anti-aliasing, where the cards would have to quickly share the results for pixels near the divide, or have the anti-aliasing done on a single die.

They need cache & memory coherency to work with modern games. Original SLI was great because it actually rendered alternating lines on separate GPUs, but modern graphics doesn't allow for that approach anymore, which is why we'll see a Radeon Duo with coherent on-PCB PCIe 4 connections at some point soon. As a compute card, it'll act as one GPU.

 

Getting a gaming GPU to that point is a ways off, but it's pretty obvious why AMD will go that way: they can eventually drop the need for a "Big" design and simply double up (or more) on their smaller designs.


1 minute ago, Taf the Ghost said:

They need cache & memory coherency to work with modern games. Original SLI was great because it actually rendered alternating lines on separate GPUs, but modern graphics doesn't allow for that approach anymore, which is why we'll see a Radeon Duo with coherent on-PCB PCIe 4 connections at some point soon. As a compute card, it'll act as one GPU.

 

Getting a gaming GPU to that point is a ways off, but it's pretty obvious why AMD will go that way: they can eventually drop the need for a "Big" design and simply double up (or more) on their smaller designs.

Well, they did it with their CPUs and Infinity Fabric; Threadripper and Epyc are more like 2-8 CPUs stuck together.


7 hours ago, i_build_nanosuits said:

1) "how many reviews have you watched?"

 

2) "...consumes more power than a 2080ti...in fact more than any nvidia cards..."

 

3) "...doesn't have ray tracing, no tensor core, no DLSS..."

 

4) "You'd have to be NUTS...and i mean it...completely NUTS to blow 700$+ on this crap of a card instead of an RTX 2080."

1) How many videos have you watched? Did you not hear the part about the drivers being broken? GN and Hardware Unboxed have already said they are going to redo their reviews of this card when working drivers are available. Obviously a bad launch is a major black eye and must be corrected for future launches, but AMD apparently doesn't want to sell many of these cards anyway!

 

2) Not if you believe the Hardware Unboxed review. It actually draws 51 watts less than the 2080 Ti, and has a power draw similar to the GTX 1080 Ti's. I agree with the premise that the power consumption sucks, though. I was hoping 7nm would be better, but it looks like AMD decided to clock the GPU to its limit to compete with RTX; they didn't really have any other option. Still, this is effectively a very heavily overclocked Vega card, and anyone who's done any overclocking knows that this kills any kind of reasonable power draw. The only reason it matches current Vega power numbers is the node shrink to 7nm.

[chart: Hardware Unboxed power consumption results]

 

3) How many people have bought RTX cards for ray tracing and DLSS? Three, maybe? This reminds me of PhysX. The only difference between the gimmicks is that ray tracing may actually have some appeal to the gaming market nVidia is targeting, once ray tracing performance increases by an order of magnitude. As for the current cards, no, ray tracing is dumb. That's OK, because "Just buy it!" - Tom's Hardware. Almost no one is buying a gaming GPU to utilize the tensor cores either; most people looking into AI development are looking at commercial cards, not gaming cards. You're just preaching a bunch of marketing crap.

 

4) Uhhh... no. The 1/4-rate FP64 performance (instead of the 1/16 that nVidia gaming cards have versus their commercial cards) actually makes the Radeon VII much more appealing to people who are in the market for RTX 2080 gaming performance but also do some compute work. Admittedly, that is a very small market. Again, AMD really doesn't want to sell many Radeon VIIs, so no big deal in the first place. Calm down. Not everyone wants the same thing you want, and vice versa. Just because an RTX card may be better for you at the same price point as the Radeon VII doesn't mean it's better for everyone at that price.
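To put rough numbers on point 4, here's a back-of-the-envelope comparison. The FP32 figures are approximate numbers from launch coverage and the FP64 rates are as cited above, so treat all of it as assumptions for illustration rather than official specs:

```python
# Back-of-the-envelope FP64 throughput: FP32 TFLOPS * FP64 rate.
# Approximate figures as cited in the post; not official specs.
fp32_tflops = {"Radeon VII": 13.8, "RTX 2080": 10.1}
fp64_rate   = {"Radeon VII": 1 / 4, "RTX 2080": 1 / 16}

for card, fp32 in fp32_tflops.items():
    print(f"{card}: ~{fp32 * fp64_rate[card]:.2f} TFLOPS FP64")

# Radeon VII: ~3.45 TFLOPS FP64
# RTX 2080:   ~0.63 TFLOPS FP64 -- roughly a 5x gap for double-precision work
```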

CPU: i7 4790k @ 4.7 GHz

GPU: XFX GTS RX580 4GB

Cooling: Corsair h100i

Mobo: Asus z97-A 

RAM: 4x8 GB 1600 MHz Corsair Vengence

PSU: Corsair HX850

Case: NZXT S340 Elite Tempered glass edition

Display: LG 29UM68-P

Keyboard: Roccat Ryos MK FX RGB

Mouse: Logitech g900 Chaos Spectrum

Headphones: Sennheiser HD6XX

OS: Windows 10 Home


 

8 minutes ago, ATFink said:

2) Not if you believe the Hardware Unboxed review. It actually draws 51 watts less than the 2080 Ti, and has a power draw similar to the GTX 1080 Ti's. I agree with the premise that the power consumption sucks, though. I was hoping 7nm would be better, but it looks like AMD decided to clock the GPU to its limit to compete with RTX; they didn't really have any other option. Still, this is effectively a very heavily overclocked Vega card, and anyone who's done any overclocking knows that this kills any kind of reasonable power draw. The only reason it matches current Vega power numbers is the node shrink to 7nm.

[chart: Hardware Unboxed power consumption results]

4) Uhhh... no. The 1/4-rate FP64 performance (instead of the 1/16 that nVidia gaming cards have versus their commercial cards) actually makes the Radeon VII much more appealing to people who are in the market for RTX 2080 gaming performance but also do some compute work. Admittedly, that is a very small market. Again, AMD really doesn't want to sell many Radeon VIIs, so no big deal in the first place. Calm down. Not everyone wants the same thing you want, and vice versa. Just because an RTX card may be better for you at the same price point as the Radeon VII doesn't mean it's better for everyone at that price.

Btw, AMD stopped where they did due to PCIe spec limitations; PCIe cards are limited to 300 W per card (stupid, as the limit should depend on the number of connectors, not a random value).

Looking at the card's clock speeds in some of the videos we got, it can clock quite a bit higher. The problem is that its voltage is high because they want to be able to use the worst dies here (the good ones are going to the server market), which I think is a bad strategic move that just ends up making them sell even fewer cards (the same happened with the original Vega cards, when Apple was buying the good ones).
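For context on that 300 W figure: it's the PCIe add-in-card spec's per-card ceiling, which sits below what the Radeon VII's physical connectors could deliver. A quick sketch of the arithmetic (connector ratings per the PCIe CEM spec; the card's 2x 8-pin layout is from launch coverage):

```python
# PCIe power budget arithmetic, in watts.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150    # CEM spec connector ratings

spec_card_limit = 300                     # per-card ceiling in the spec
radeon_vii_budget = SLOT + 2 * EIGHT_PIN  # 75 + 150 + 150 = 375 W of connectors

print(f"connector budget {radeon_vii_budget} W vs spec limit {spec_card_limit} W")
# The connectors could physically feed 375 W, but a spec-compliant card
# still has to stay under 300 W -- hence the complaint about the limit.
```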


1 hour ago, Lathlaer said:

Clearly not entirely, since the Radeon VII's performance is higher than the Vega 56's, and it's not due to the 8 GB of extra RAM.

Yes, it is.

To cut the VRAM you would have had to remove two HBM stacks, which equals a Vega 56 configuration.

The Radeon VII's HBM does run at a somewhat higher frequency, which would increase performance a bit.

BUT: that would negate most of the gains we've seen with the Radeon VII, as it has more than double the bandwidth of the Vega 56...

So that doesn't make much sense.

There is another possibility: ordering 2-Hi stacks. But that makes even less sense, as you'd have to make the chip specifically for the Radeon VII - which is not the plan or what they wanted.

They just took the server/workstation chip without many changes and put it onto a desktop card. Nothing more.
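A quick bandwidth check makes the point. The stack counts and per-pin data rates below are the commonly published figures for these cards, so treat them as approximate:

```python
# HBM2 bandwidth = stacks * pins per stack * data rate per pin / 8 bits per byte.
# Each HBM2 stack has a 1024-bit bus; figures are the commonly published ones.
def hbm_bandwidth_gb_s(stacks, gbps_per_pin, pins_per_stack=1024):
    return stacks * pins_per_stack * gbps_per_pin / 8

vega_56    = hbm_bandwidth_gb_s(2, 1.6)  # ~410 GB/s
radeon_vii = hbm_bandwidth_gb_s(4, 2.0)  # ~1024 GB/s, i.e. ~1 TB/s

# Dropping to two stacks, even at the faster 2.0 Gbps, would halve the bus
# width and erase most of the Radeon VII's bandwidth advantage.
```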

 

1 hour ago, Lathlaer said:

Because by the time I need 16 GB for gaming, a current top-of-the-line card with 16 GB will be lower-end mid-range.

So you're saying you're fine with planned obsolescence and would rather buy a new card than use your old one a bit longer??
That's essentially what "the other side" does...

 

If there were no AMD, we'd still be on 2 GiB VRAM cards, maybe 4 GiB at best.

But right now the default is 8 GiB in the mid-range...

 

Sorry, but no, that's just bullshit.

 

High-end cards SHOULD have more VRAM than mid-range ones...

 

So it makes more sense to demand that a HIGH-END card sold for almost double the price of a Vega 56 has double the VRAM.

Simple as that...

 

Paying €1000 for a card with more than 8 GiB of VRAM, with no cheaper alternative, is just insane.

Now we have a cheaper alternative for €700 that has 16 GiB of VRAM...

 

 

1 hour ago, Lathlaer said:

Again, I am not saying this is a bad card - it's just my take on why I think people are mostly meh about it. Rendering and compute aside, for consumer gaming it's mostly an okay-ish premiere, nothing more.

...which shouldn't be AMD's main focus, as the past has shown that it does not work.

They _HAVE_ to target compute users like rendering and other professional customers (which is why they limited the FP64 rate to "only" 1/4 and not less than that)...

 

Why market the card to someone who wouldn't buy it anyway, and who would only use it to drive down the price on "their side"??

 

1 hour ago, Lathlaer said:

Well, yeah, but unless AMD beats the fact into people's skulls, nothing will ever change.

And how should they do that?!
By telling the overly biased "testers" to take a hike, with a cease-and-desist letter??

 

Because that's how nVidia got strong: propaganda from the press, and bashing over features the other side didn't have - even if they were useless at the time (a 32-bit framebuffer was not viable with SDR SDRAM).

 

Just look at how the press treats AMD when there is a small oopsie.

And look at how the press treated the GTX 970 issue, and lately the GeForce Partner Programme...

 

Or how the press is ignoring the GameWorks shit...

 

They care about making babies in a video game.

They do not care about proprietary garbage in a video game that only benefits the one who makes it.

1 hour ago, Lathlaer said:

Your solution is what, that they stop trying because "people will buy NVIDIA anyway"?

More or less, yes. Focus on a market that actually WANTS your products.

Many gamers don't want AMD cards. They only want AMD to pressure nVidia into lowering prices.

But they would never, ever buy an AMD card if they have the choice.

 

Even if the AMD card were 20% faster and cost 25% less (or even more than that), they wouldn't bother with it.

 

1 hour ago, Lathlaer said:

That's not the way to take the market. They need to be relentless in their pursuit of market share. The alternative is to be angry, take your toys and drop the GPU segment altogether.

Wasting money on a market that doesn't want your products is less viable than looking for new markets that do.

Simple as that.

And the GPU market is going very well for AMD, even if you don't like it!

They've sold more than 200 million of their graphics chips in consoles. Even more than that...

 

The Xbox 360 had AMD graphics, and the PlayStation 4 and Xbox One do too.

The GameCube, Wii and Wii U had ATI/AMD graphics.

Add that up and you have:
Wii: ~100 million

Wii U: ~15 million

360: ~85 million

PS4: probably the same as the 360

Xbox One: around 20-50 million

Add that up and you have hundreds of millions of graphics chips...
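Summing those estimates (taking the PS4 at roughly the 360's ~85 million and the Xbox One at a ~35 million midpoint; both are the post's guesses, not official figures):

```python
# Rough total of AMD/ATI console graphics chips from the estimates above.
# PS4 and Xbox One numbers are guesses, not official shipment figures.
units_millions = {"Wii": 100, "Wii U": 15, "Xbox 360": 85, "PS4": 85, "Xbox One": 35}
print(sum(units_millions.values()), "million chips")  # ~320 million
```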

 

And they sell into the embedded market. Ever heard of Matrox?? They sell cards with AMD chips!

 

With that going for them, why should they bother with high(er)-end graphics cards, where you'd just waste money on a product that hardly anyone wants??

 

Just look at the Steam Hardware Survey; it proves my point...

Vega is almost non-existent there...

 

1 hour ago, Lathlaer said:

If they couldn't make the GPU cheaper with less RAM due to supply chains, then fine - it is what it is.

They COULDN'T!

That would not have made ANY sense!

Because it's entirely possible that Samsung and Hynix don't even make 2-Hi HBM stacks...

And using only two stacks has two problems:
a) you'd basically have a Vega 56 in 7nm

b) it wouldn't be interchangeable with the MI50...

 

1 hour ago, Lathlaer said:

But they have a recipe for a potential Ryzen-esque premiere: make the same card with less RAM but the same performance.

Why don't you want to understand that it's not as easy as you believe?!
We're not talking about memory on the PCB.

We're talking about on-package memory!

So it's not economically viable to reduce the Radeon VII's memory, for the reasons I've mentioned above.

 

And they also didn't want to do the Radeon VII in the first place.

It was only possible because of the ridiculous prices from nVidia...

 

"Hell is full of good meanings, but Heaven is full of good works"


27 minutes ago, valdyrgramr said:

So, I have watched about 3-4 more reviews of the card from professional third-party reviewers. It seems like the Radeon VII drivers are a key issue right now: Jay was running into application crashes and black screens, and Paul was noticing some unusual things as well, as were a few others. Glad I'm waiting until August, because in Paul's test the Radeon VII lost to the 2080 FE in Maya in one type of test, though that could be due to other reasons that favor Turing. But Linus did a Blender test which showed the Radeon VII winning, with lower rendering times. Might still get a 2080 Ti depending on how things play out by August. XD

This launch really feels like AMD trying to shout "We're here!" at the high-end GPU market. Even if it might not be worth it, it's impressive that AMD has started seriously competing in high-end GPUs again when they haven't for years.


47 minutes ago, ToneStar said:

AMD should just focus on making CrossFire really good

AMD should ignore that and scrap it, because hardly anybody uses it.

And they don't have the resources to serve 1% of the user base.

Yes, that is not a joke; that is how few people use multiple GPUs in their PC...

 

And CrossFire support was not that bad back in the Radeon X era, when you didn't need a special GPU: just put two (for example) X1800 XLs in a PC and call it a day.

47 minutes ago, cj09beira said:

Problem is, it would be very difficult to do that at the driver level; it would end up requiring per-game profiles again, with large performance differences between games. It would be a mess.

Once games are fully ray traced, it will be quite a bit easier.

That as well.

It's a chicken-and-egg problem at the moment:

Because nobody uses it, nobody cares about it.

Because nobody cares about it, nobody uses it.

 

45 minutes ago, ToneStar said:

S3, I believe, had a technology where each chip would render a different section of the screen: you could have four chips, the screen would be divided vertically, and the software would sync them together.

AMD used a similar technique back in the day.

Look up "Super AA", or whatever it was called back then, when you could use both cards for FSAA and improve the picture further (IIRC up to 12x).


There was also something called "Split Frame Rendering".

 

So what you're describing was already done.

It was phased out, however, because it collides with the full-screen effects of modern games...

"Hell is full of good meanings, but Heaven is full of good works"

