
Future of Radeon


Raja Koduri allowed nVidia to take the lead and AMD will be focusing heavily on multi-GPU?

Fuck, that means nVidia will raise prices further than the distance to the sun.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


23 minutes ago, CTR640 said:

Raja Koduri allowed nVidia to take the lead and AMD will be focusing heavily on multi-GPU?

Fuck, that means nVidia will raise prices further than the distance to the sun.

On the other hand, when AMD finally gets their higher-end GPUs out, they can charge more than they would have because Nvidia raised (or lowered) the price/performance bar on the high end. Of course, that all depends on performance. If Vega ends up being a turd, people will only buy it if it's priced like a turd, so there's that.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


56 minutes ago, CTR640 said:

Raja Koduri allowed nVidia to take the lead and AMD will be focusing heavily on multi-GPU?

Fuck, that means nVidia will raise prices further than the distance to the sun.

 

37 minutes ago, Briggsy said:

On the other hand, when AMD finally gets their higher-end GPUs out, they can charge more than they would have because Nvidia raised (or lowered) the price/performance bar on the high end. Of course, that all depends on performance. If Vega ends up being a turd, people will only buy it if it's priced like a turd, so there's that.

Please go learn microeconomics and how monopolistic competition and monopoly markets work. No one can just set prices in the sky. There's an optimal point where the supply and demand curves cross, and that's where maximum total profit can be achieved. All a lack of competition does is shift the demand curve.
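To put toy numbers on that (purely illustrative; the demand intercept, slope, and unit cost below are made up, not anything from AMD or Nvidia):

```python
# Linear inverse demand p(q) = a - b*q with constant unit cost c.
# Even a monopolist has a finite profit-maximizing price, because raising
# the price further loses more sales than the extra margin is worth.
a, b = 1200.0, 1.0   # hypothetical demand intercept and slope
c = 400.0            # hypothetical cost per card

def profit(q):
    return ((a - b * q) - c) * q

# For linear demand the optimum is q* = (a - c) / (2b), p* = (a + c) / 2.
q_star = (a - c) / (2 * b)
print(a - b * q_star, profit(q_star))   # price 800.0, profit 160000.0

# Weaker competition shifts the demand curve (raise a), which moves the
# optimum up but never removes it.
a = 1400.0
q_star = (a - c) / (2 * b)
print(a - b * q_star)                   # new optimal price 900.0
```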

 

And as for macroeconomics and behavioral economics, most people won't wait more than a month once the Titan P and 1080 Ti are released. If Vega is late, it won't matter what price it's at. The sales window will have closed.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


6 hours ago, Kloaked said:

These are some amazing mental gymnastics you're pulling. I rate you a bronze for effort.

 

I remember the lot of you shitting on some of us during the 970 debacle, for some of the very same reasons that you're pulling to defend AMD. I keep seeing others on this forum say "It's okay on this forum if Nvidia does it, but it's not okay when AMD does it, apparently!". Yeah, no, it's the other way around on this forum. A lot of you refuse to debate with any logic - I stated that what Nvidia did with the 970 was stupid, but the card was perfectly fine. I don't think I specifically took any stance on the 480, but I did see it potentially being an issue since it was going way over the PCIe power draw spec for a video card (please don't fucking bring up the 750 or 960 - totally different issue since the culprits weren't reference cards coming straight from Nvidia).

 

Funny you say that - it's to be expected.

Here's the deal.

I do not condone what AMD did. They SHOULD adhere to spec, and at no point did I ever say that it was "no biggie" if they went over the PCIe slot's spec. What I've been saying for nearly two years is that it's not a big deal if you go above the 6- or 8-pin connectors' "spec", because the specification is so damn conservative. 200% overprovisioned, to be exact.
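For what it's worth, a back-of-the-envelope check of that claim (the per-pin current rating below is an assumption; real Mini-Fit Jr ratings vary with pin plating and wire gauge):

```python
# PCIe spec allocates 75 W to a 6-pin and 150 W to an 8-pin auxiliary
# connector. The pins themselves can physically carry far more.
PIN_CURRENT_A = 8.0   # assumed conservative per-pin rating (Mini-Fit Jr class)
VOLTAGE = 12.0

def physical_capacity_w(num_12v_pins):
    return num_12v_pins * PIN_CURRENT_A * VOLTAGE

# 6-pin cables populate two (sometimes three) +12 V pins; 8-pin cables have three.
for name, pins, spec_w in [("6-pin", 2, 75), ("8-pin", 3, 150)]:
    cap = physical_capacity_w(pins)
    print(f"{name}: spec {spec_w} W, conductors good for ~{cap:.0f} W "
          f"({cap / spec_w:.1f}x the spec)")
# 6-pin: spec 75 W, conductors good for ~192 W (2.6x the spec)
# 8-pin: spec 150 W, conductors good for ~288 W (1.9x the spec)
```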

 

Now, the thing is the "shills" on these forums fail to understand my stance. I generally do not argue that what AMD does is fine, nor do I praise them for many of their choices. I argue that IT IS NOT DANGEROUS. And I argue against people's misconception of what a standard is. I argue against other people's opinions, because I want to correct them.

 

I read what other people post, then go in to correct them. I do not just carpet-bomb people with cherry-picked evidence and start flaming the products of others.

 

However, since a whole lot of you think I am massively "pro-AMD and hate Nvidia", you think everything I say is coming from an AMD fanboy POV.

I got a newsflash for you, and others.

 

I hate Nvidia's market practices.

I like AMD's market practices.

 

I like many Nvidia products (especially Kepler and Pascal)

I like many AMD products (APUs, GCN 2 and GCN 3)

 


I watched this entire thing (don't judge me) and the multi-GPU parts really caught my eye.... I recall Raja even mentioning something about bringing multi-GPU to consoles....

 

If AMD really wants to focus on multi-GPU, that would be a really good play for them. By making consoles multi-GPU, it would force developers to optimize for multi-GPU, which would then also make it more practical on PC. Not sure how it will turn out, but that caught my eye....
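As a rough illustration of what multi-GPU scaling efficiency means for frame rate (the percentages here are placeholders, not benchmark results):

```python
# Effective frame rate from adding GPUs at a given scaling efficiency.
# 1.0 would be perfect scaling; real multi-GPU setups land well below that
# unless the game is explicitly optimized for it.
def effective_fps(single_gpu_fps, num_gpus, efficiency):
    return single_gpu_fps * (1 + (num_gpus - 1) * efficiency)

base = 60.0
for eff in (0.50, 0.80, 0.95):
    print(f"2 GPUs at {eff:.0%} scaling: {effective_fps(base, 2, eff):.0f} fps")
# 2 GPUs at 50% scaling: 90 fps
# 2 GPUs at 80% scaling: 108 fps
# 2 GPUs at 95% scaling: 117 fps
```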

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: HyperX Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Seagate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


9 hours ago, SteveGrabowski0 said:

Just admit you fucked up buying ATi and sell RTG to Intel already, team red.

Actually, I believe it was a pretty decent buy. I think most of their revenue currently comes from their GPUs, and if they hadn't acquired ATI they would have no good APUs, and thus their CPU side of things would be even worse. Not to mention that was the sole reason they got the console deals, which is probably the only reason AMD isn't bankrupt.

 

Of course, it did add a lot of debt to AMD, but it seems to me it has already given AMD a lot of money and in the future could be even more powerful (integrating HBM onto an APU.... That would make a pretty powerful Zen-Vega APU that could potentially compete with the low end of discrete graphics, and if you Crossfire that APU with low-end graphics... you're getting something pretty powerful!).
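A rough memory-bandwidth comparison shows why HBM on an APU is an interesting idea (the HBM figure assumes a single HBM2 stack at nominal clocks, not any announced product):

```python
# Dual-channel DDR4 bandwidth that today's APU iGPUs share with the CPU,
# versus one stack of HBM2.
def ddr4_bandwidth_gbs(mt_per_s, channels=2, bus_bits=64):
    return mt_per_s * 1e6 * channels * bus_bits / 8 / 1e9

ddr4 = ddr4_bandwidth_gbs(2666)       # dual-channel DDR4-2666
hbm2 = 1024 * 2.0 / 8                 # 1024-bit stack at 2.0 Gbit/s per pin
print(f"Dual-channel DDR4-2666: {ddr4:.1f} GB/s")   # ~42.7 GB/s
print(f"One HBM2 stack:         {hbm2:.0f} GB/s")   # 256 GB/s
```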



57 minutes ago, DocSwag said:

Actually, I believe it was a pretty decent buy. I think most of their revenue currently comes from their GPUs, and if they hadn't acquired ATI they would have no good APUs, and thus their CPU side of things would be even worse. Not to mention that was the sole reason they got the console deals, which is probably the only reason AMD isn't bankrupt.

 

Of course, it did add a lot of debt to AMD, but it seems to me it has already given AMD a lot of money and in the future could be even more powerful (integrating HBM onto an APU.... That would make a pretty powerful Zen-Vega APU that could potentially compete with the low end of discrete graphics, and if you Crossfire that APU with low-end graphics... you're getting something pretty powerful!

Yes and no. ATI was buried in its own debt, some of which was hidden from AMD at the time of purchase (and for those of you who don't remember the complexities of the purchase, Intel actually financed half of it because JP Morgan refused to assume the full risk of the loan). That left AMD in a financial pit when it came to their CPU development, and in the meantime the Fusion idea was pushed too fast and not executed effectively at that. From a financial standpoint, AMD should have negotiated a much lower purchase price for ATI, something like 1 billion USD less.

 

That financial constraint also caused AMD engineers to cut corners in the Bulldozer designs, removing much of what made CMT desirable on paper (independent schedulers per core in each module, replaced in Piledriver/Vishera). This includes the scheduling, the single FPU/vector unit per module, the data-exclusive cache design, the multi-chip-module designs, etc. AMD tried to win the many-core race far too early, with core designs that didn't scale in any workload outside of asynchronous procedural or data-parallel simple integer workloads. I posit that has a lot to do with the financial constraints of managing ATI's debt and the loan debt taken on to buy it in the first place.

 

Who says APUs never would have existed? Intel made the first CPU with an iGPU anyway, and from AMD's perspective, a close collaboration with ATI could have produced the same end product while shifting around the research burden and the profit structure, without creating a monolithic financial problem. And as for consoles, the point remains that Intel is not one to do custom hardware for small margins, and IBM has left that market behind as well. It still would have come down to AMD (and ATI) vs. Nvidia, Cat cores vs. Denver/Parker, Radeon vs. GeForce.

 

Further, many forget that Ruiz actually first sought to merge with Nvidia, but he couldn't stomach letting Jen-Hsun Huang be CEO of the combined entity. ATI was the leftover choice. It was the inferior choice financially.

 

If AMD hadn't cut corners and had been able to solidly compete with Intel in CPUs from laptop to server, buying ATI even a couple of years later would have been a much healthier decision. Personally, unless Zen is a smashing success (Polaris is not, unfortunately), this puttering along by AMD will keep things at a standstill. I still stand by the idea that AMD should dissolve and have Intel buy RTG to settle the debts with JP Morgan. Let Nvidia take on the CPU division, engineers, and IP to combine with its own. At least Nvidia has the financial health to fight Intel head-on for a long time to come.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


4 minutes ago, patrickjp93 said:

-GIANT SNIP-

As for the APU point, yes, AMD probably would have still made APUs, but I believe they would have been much, much weaker than those that currently exist, since AMD wouldn't have had a foundation or experience in designing GPUs.

 

Just look at Intel for proof of this: Intel's 14nm FinFET iGPUs on Broadwell were the first iGPUs from Intel to be more powerful than AMD's on 28nm, not to mention that Intel had to integrate eDRAM onto the CPU package in order to surpass AMD in graphics performance.



3 minutes ago, DocSwag said:

As for the APU point, yes, AMD probably would have still made APUs, but I believe they would have been much, much weaker than those that currently exist, since AMD wouldn't have had a foundation or experience in designing GPUs.

 

Just look at Intel for proof of this: Intel's 14nm FinFET iGPUs on Broadwell were the first iGPUs from Intel to be more powerful than AMD's on 28nm, not to mention that Intel had to integrate eDRAM onto the CPU package in order to surpass AMD in graphics performance.

That's where collaborating with ATI and its engineers comes in. No need to have the engineers on your payroll as long as they are in close contact with your own. AMD had no real experience DESIGNING DRAM, and yet, HBM...

 

Intel has a much more fundamental problem than experience: patent barriers. Nvidia, AMD, Matrox, Imagination Technologies, and ARM have huge troves of patents Intel has to design around or buy access to. Intel doesn't tend to buy its way into technologies through patent licenses.

 

Further, Intel hadn't built an iGPU with more SPs than AMD until Skylake GT4e, at 576 SPs vs. AMD's 512. The eDRAM was a unique solution that helped feed both the CPU and GPU. It's difficult to say who actually has the superior architecture for gaming. We can look at DX12 compliance and see Intel is behind on it, but again, patents. We don't know nearly enough to say that Intel just can't out-design Nvidia and AMD. And if we remember Nvidia pulling patents during the Larrabee project and how that turned Knights Ferry into a laughingstock at the end of development, I think it's much more accurate to conclude Intel can out-engineer anyone it wants to. The problem is getting around patents in the process.
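To put rough numbers on that SP comparison (the clock speeds below are assumed round figures, not exact product specs):

```python
# Back-of-the-envelope FP32 throughput: shader cores x clock x 2 (FMA).
def gflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2

intel_gt4e = gflops(72 * 8, 1.0)   # Skylake GT4e: 72 EUs x 8 ALUs = 576 SPs, ~1.0 GHz assumed
amd_512sp  = gflops(512, 0.72)     # a 512-SP Kaveri-class iGPU, ~720 MHz assumed
print(f"GT4e ~{intel_gt4e:.0f} GFLOPS vs 512-SP APU ~{amd_512sp:.0f} GFLOPS")
# GT4e ~1152 GFLOPS vs 512-SP APU ~737 GFLOPS
```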

 

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


17 hours ago, TheRandomness said:

-snip-

I remember the best case was 95%, which is amazing.

Pixelbook Go i5 | Pixel 4 XL


11 hours ago, SteveGrabowski0 said:

Just admit you fucked up buying ATi and sell RTG to Intel already, team red.

It all depends on how profitable Radeon is. While AMD fanboys applaud the creation of RTG and think it meant AMD giving Radeon the breathing room it needed to grow unrestrained, it was really AMD isolating the discrete graphics division so that AMD can determine how profitable Radeon actually is. I doubt Raja would shed a tear if Polaris and Vega were flops and AMD sold off RTG to Intel... well, maybe a tear of joy. AMD are probably far more interested in growing their embedded and custom silicon divisions. I recall an interview where Lisa envisioned having multiple small embedded APUs everywhere in electronics as the future of CPUs and GPUs. It's probably hard for some people to believe that AMD would sell off technologies specific to discrete graphics, but down the road Radeon discrete graphics may become unprofitable, if it isn't already, and selling those technologies off while they still have market value is better than holding onto Radeon and bleeding money.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 

Link to comment
Share on other sites

Link to post
Share on other sites
