
Gamers Nexus: HBM2 Costs Estimated... Is AMD Profiting much on Vega?

DocSwag
1 minute ago, Terryv said:

I believe that's the price of regular GDDR5. I'm not sure the price of the X is known

Regular GDDR5 is around $7-9 per chip in 10,000-unit bulk. AMD and Nvidia obviously pay even less and buy FAR more. They probably pay around $4.50-5 and buy in 50k++ bulk...
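
For a rough sense of scale, here's a quick back-of-the-envelope sketch (the per-chip figures are the estimates above, not quoted prices, and the assumption of eight 1 GB / 8 Gb chips for an 8 GB card is mine):

# Rough GDDR5 memory-BOM sketch using the per-chip estimates above (assumptions, not quotes).
retail_bulk = (7.0, 9.0)    # $/chip at ~10,000-unit bulk
oem_guess   = (4.5, 5.0)    # guessed $/chip at AMD/Nvidia volumes
chips       = 8             # assumed: 8 GB card built from 1 GB (8 Gb) chips

for label, (lo, hi) in (("10k-unit bulk", retail_bulk), ("AMD/Nvidia guess", oem_guess)):
    print(f"{label}: ${lo * chips:.0f}-{hi * chips:.0f} for an 8 GB card")

At those guesses, a whole 8 GB GDDR5 loadout is on the order of $36-72, which is the baseline any HBM2 figure gets compared against.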


3 minutes ago, Coaxialgamer said:

HBM1 would have limited them to 4GB VRAM maximum. 

 

7 minutes ago, leadeater said:

Well then 8 of them is the perfect solution, I'm a genius! :P.

 

RIP AMD's wallet and any PCB space savings; that would be a monster-sized GPU package lol.

 

Jokes are better when told twice, right? ;)


10 hours ago, DrMacintosh said:

Well if you watched the Gamers Nexus video on this you would know the answer to that. 

This is a question whose answer is a matter of opinion; there is no definite answer.

 

I'm assuming you're referring to the question I raised in the last sentence.

2 hours ago, Coaxialgamer said:

There's no published power figure for GDDR5X at all. Consensus is that the 1080's memory pulls about 20 W, not counting the rest of the memory subsystem.

 

Also, it doesn't work like that. You have a power budget, and you balance it between memory and GPU. It's an integral part of power efficiency.

You also have to remember things like power losses from VRMs. Efficiency on the main VRMs used in Vega is actually really good, around 90%; the practical maximum is around 93-95%. The ones Nvidia uses are nowhere near as good, so Nvidia's VRM efficiency is considerably worse, probably in the low to mid 80s if I had to guess.

 

And TDP isn't a good metric; real-world power draw numbers would be more accurate.
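
To show why the VRM point matters when comparing wall-power numbers, here's a small sketch (the 90% and low-80s efficiencies are the guesses above; the 250 W chip power is just an illustrative number):

# Board power for the same chip power at different VRM efficiencies (illustrative numbers).
chip_power_w = 250.0                                  # hypothetical GPU + memory power at the rails
for name, vrm_eff in (("~90% VRM (Vega-class)", 0.90), ("low-80s VRM", 0.83)):
    input_w = chip_power_w / vrm_eff                  # power drawn before conversion losses
    print(f"{name}: ~{input_w:.0f} W at the connectors ({input_w - chip_power_w:.0f} W lost in the VRM)")

The same 250 W of silicon shows up as roughly 278 W vs 301 W at the plug, which is part of why measured board power and TDP tell different stories.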

2 hours ago, Trixanity said:

As your own quote says, Vega is bottlenecked by memory, so reducing bandwidth by going GDDR would have made it perform significantly worse.

 

What AMD needed was higher-clocked HBM, but suppliers failed to deliver. The only other option would have been increasing costs by going with 4 stacks instead of 2, thereby doubling bandwidth. It would probably have increased performance significantly but would (I suppose) add another $150.

What they could've done is go for a 512-bit memory bus, or go with GDDR5X (not sure if Nvidia has exclusivity on that, though).

 

Really though, the target for HBM2 clock speeds was 1 GHz, i.e. 2 Gbps per pin. Right now on the Vega 64 I think they have it at 950 MHz / 1.9 Gbps, which is pretty much at their target. Even if they had hit the target, that's not a huge increase.
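
For reference, the bandwidth math those numbers imply (assuming Vega 64/56's two 1024-bit HBM2 stacks, i.e. a 2048-bit bus):

# HBM2 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
bus_width_bits = 2048   # two 1024-bit HBM2 stacks (Vega 64/56)
for label, gbps_per_pin in (("shipping ~950 MHz / 1.9 Gbps", 1.9), ("1 GHz / 2.0 Gbps target", 2.0)):
    print(f"{label}: {bus_width_bits * gbps_per_pin / 8:.0f} GB/s")

That works out to roughly 486 GB/s shipping vs 512 GB/s at the original target, so hitting the target really would only have been a ~5% bump.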

1 hour ago, Prysin said:

Meanwhile Nvidia pays around $12-15 per 1 GB GDDR5X chip on the 1080 Ti...

Source for that?

 

And even assuming that's true, and even though that does add up to almost as much as what AMD is paying, the 1080 Ti sells for $200-300 more than the Vega 64 and 56, which makes it a lot more reasonable.



25 minutes ago, DocSwag said:

This is a question whose answer is a matter of opinion; there is no definite answer.

I'm assuming you're referring to the question I raised in the last sentence.

You also have to remember things like power losses from VRMs. Efficiency on the main VRMs used in Vega is actually really good, around 90%; the practical maximum is around 93-95%. The ones Nvidia uses are nowhere near as good, so Nvidia's VRM efficiency is considerably worse, probably in the low to mid 80s if I had to guess.

And TDP isn't a good metric; real-world power draw numbers would be more accurate.

What they could've done is go for a 512-bit memory bus, or go with GDDR5X (not sure if Nvidia has exclusivity on that, though).

Really though, the target for HBM2 clock speeds was 1 GHz, i.e. 2 Gbps per pin. Right now on the Vega 64 I think they have it at 950 MHz / 1.9 Gbps, which is pretty much at their target. Even if they had hit the target, that's not a huge increase.

Source for that?

And even assuming that's true, and even though that does add up to almost as much as what AMD is paying, the 1080 Ti sells for $200-300 more than the Vega 64 and 56, which makes it a lot more reasonable.

It's a guess; you can buy a 1k bulk lot of the EXACT model of GDDR5X for $21 apiece... obviously Nvidia buys in bigger bulk and gets FAR better discounts.

 

I don't remember where I found the info. Just dig around Google; there are links to retail sites selling GDDR5X.
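
Taking that $21-per-chip retail figure at face value, the 1080 Ti's memory adds up like this (a sketch; the assumption of eleven 1 GB / 8 Gb GDDR5X packages is mine, and Nvidia's actual contract price is unknown):

# Naive GDDR5X cost sketch for an 11 GB 1080 Ti using the $21/chip 1k-quantity retail figure.
chips = 11                # 11 GB -> eleven 1 GB (8 Gb) GDDR5X packages (assumption)
retail_per_chip = 21.0    # $ at 1k quantity from a retail listing, not Nvidia's price
print(f"At retail 1k pricing:     ~${chips * retail_per_chip:.0f}")
print(f"At the $12-15/chip guess: ~${chips * 12:.0f}-{chips * 15:.0f}")

So even the worst-case retail math puts the 1080 Ti's GDDR5X in the same ballpark as the thread's ~$175 HBM2 estimate, on a card that sells for a few hundred dollars more.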


24 minutes ago, Prysin said:

Just post Spider-Man memes... it doesn't at all piss off Godlygamer.



 


28 minutes ago, Prysin said:

It's a guess; you can buy a 1k bulk lot of the EXACT model of GDDR5X for $21 apiece... obviously Nvidia buys in bigger bulk and gets FAR better discounts.

I don't remember where I found the info. Just dig around Google; there are links to retail sites selling GDDR5X.

Holy crap I found something.

https://ru.aliexpress.com/store/product/ic-High-quality-new-and-origanl-in-stock-D9TXS-MT58K256M32JA/918207_32816922845.html


 


13 minutes ago, DocSwag said:

When will people just start to listen? :|


1 hour ago, DocSwag said:

And TDP isn't a good metric; real-world power draw numbers would be more accurate.

However, it's not like reviewers note/document how much more power CPUs consume when they're handling scheduling for NV cards. Knowing how much power a card draws is important, but hiding part of the power budget in the CPU skews results a fair bit. NV gets a pretty big free lunch there from just about every reviewer.

 

That said Vega is still one hell of a fracking pig.


3 hours ago, Terryv said:

I don't think there's a scaling issue in the arch. Like others have said, Vega 64 is likely starved for memory bandwidth. If they fix that with 4 stacks instead of 2, they instantly double memory bandwidth.

Fiji had the same issues... but it was slower. Pretty sure there are internal bottlenecks which lead to increased bandwidth requirements.



1 hour ago, DocSwag said:

What they could've done is go for a 512-bit memory bus, or go with GDDR5X (not sure if Nvidia has exclusivity on that, though).

Really though, the target for HBM2 clock speeds was 1 GHz, i.e. 2 Gbps per pin. Right now on the Vega 64 I think they have it at 950 MHz / 1.9 Gbps, which is pretty much at their target. Even if they had hit the target, that's not a huge increase.

They would need a 384-bit memory bus with GDDR5X to match it in bandwidth, while increasing power consumption, and 512-bit to surpass it. Wide buses like that complicate things; I don't think anyone wants to mess around with 512-bit buses anymore.

 

Also, who knows what the supply of GDDR5X is like? Maybe Nvidia takes all the available supply.

 

I would like to see AMD do a 48 CU Vega with a 384-bit GDDR5X memory bus, though. It would be interesting.
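
The bus-width numbers behind that work out as follows (a sketch assuming 10 Gbps GDDR5X, the speed the GTX 1080 launched with):

# GDDR5X bandwidth at different bus widths, assuming 10 Gbps parts (GTX 1080 launch speed).
gbps = 10.0
for bus_bits in (256, 384, 512):
    print(f"{bus_bits}-bit @ {gbps:.0f} Gbps: {bus_bits * gbps / 8:.0f} GB/s")
print("Vega 64's HBM2 for comparison: ~484 GB/s")

A 384-bit bus lands at 480 GB/s, roughly matching Vega 64's HBM2, while 512-bit at 640 GB/s would clearly surpass it.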


Some posts here have gotten the facts wrong. The $175 memory estimate by Gamers Nexus is accurate. They basically asked vendors how much they were selling the memory to AMD for, and it was $175 for the 8 GB of HBM2. They asked other vendors about the various parts sold to AMD, and where they couldn't get an accurate price, they assumed AMD would get it lower as a bulk buyer.
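
To put that $175 next to the question in the thread title, a very rough sketch (the $499 is Vega 64's reference launch MSRP; nothing else about AMD's costs is assumed here):

# Share of the reference card's MSRP taken up by the quoted memory estimate.
hbm2_estimate = 175.0     # Gamers Nexus figure discussed above ($, for 8 GB of HBM2)
vega64_msrp   = 499.0     # reference Vega 64 launch price ($)
print(f"Memory alone is ~{hbm2_estimate / vega64_msrp * 100:.0f}% of MSRP, "
      f"before the GPU die, board, cooler, and channel margins")

Which is exactly why the estimate raises the "is AMD profiting much?" question in the first place.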

 

Despite AMD not winning this round, the move to push HBM2 is the right thing to do. Nvidia has GDDR5X to push it forward, and AMD needs something equally good for memory performance and power consumption. Pushing it now will help bring down the price for their next-gen GPUs, and HBM2 uses less power than GDDR5X.

 

Just because Vega couldn't beat Nvidia this round doesn't make it a bad card. It is a good card and a really good choice if you plan to build from scratch, use the bundles for hardware discounts, and factor in a FreeSync monitor too. I see a lot of posts crapping on AMD just because it couldn't beat Nvidia, and part of the reason AMD was late is HBM2 supply and the mining craze.

 

Vega, with its larger memory bandwidth, is actually good for mining, but it's not exactly the same thing as Polaris. A lot of people say that Vega is just 2x Polaris, which is wrong; it's not just a 4096-shader Fiji with HBM2 either. Just like when the GTX 1080 and 1080 Ti came out and were at first bad at mining with the new GDDR5X VRAM, people figured out how to use it and got a lot of mining performance out of them. The fact that Vega is currently bad for mining is a chance for gamers to grab the GPU before someone figures out how to make the card good at it. Hashing workloads are highly parallel and largely bound by bus and memory performance, which is why GPUs are so good at them.


12 hours ago, Drak3 said:

God damn, that's a pretty penny. Hopefully, RTG learned something from the whole ordeal, and will launch GDDR and HBM variations of NAVI.

They won't go back to GDDR; it's less efficient and needs more die space on the GPU to control it.

AMD is trying to increase the volume of HBM so that it becomes cheaper.


35 minutes ago, Trixanity said:

They would need a 384-bit memory bus with GDDR5X to match it in bandwidth, while increasing power consumption, and 512-bit to surpass it. Wide buses like that complicate things; I don't think anyone wants to mess around with 512-bit buses anymore.

Also, who knows what the supply of GDDR5X is like? Maybe Nvidia takes all the available supply.

I would like to see AMD do a 48 CU Vega with a 384-bit GDDR5X memory bus, though. It would be interesting.

True true.

 

Tbh, compared to Nvidia, I guess AMD's arch just isn't as good, which is the basis for all the problems.


 


7 minutes ago, DocSwag said:

True true.

 

Tbh, compared to Nvidia, I guess AMD's arch just isn't as good, which is the basis for all the problems.

Not exactly. For example, part of the efficiency with Pascal is just taming the beast: if you disable all the power limitations on Pascal it consumes a lot more power (it's really hard to do and involves soldering). Right now AMD's problems are:

voltage at stock (rough scaling numbers below)

memory compression (to allow for lower memory bandwidth needs)

and I also think there might be something problematic coming from the fact that they used IF (Infinity Fabric) this time around.

Their arch is also more focused on compute, which is becoming more prevalent.
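
On the stock-voltage point (the rough scaling numbers promised above): dynamic power scales roughly with V^2, so even a modest undervolt at the same clock helps a lot. A sketch with hypothetical voltages, since actual Vega voltages vary per card:

# Rough dynamic-power scaling with voltage (P ~ C * V^2 * f); both voltages are hypothetical.
stock_v, undervolt_v = 1.20, 1.05
ratio = (undervolt_v / stock_v) ** 2
print(f"{undervolt_v} V vs {stock_v} V at the same clock: ~{ratio:.2f}x dynamic power ({(1 - ratio) * 100:.0f}% lower)")

That is the sense in which Vega's stock voltage, rather than the architecture alone, eats into its efficiency.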


2 hours ago, DocSwag said:

You also have to remember things like power losses from VRMs. Efficiency on the main VRMs used in Vega is actually really good, around 90%; the practical maximum is around 93-95%. The ones Nvidia uses are nowhere near as good, so Nvidia's VRM efficiency is considerably worse, probably in the low to mid 80s if I had to guess.

And TDP isn't a good metric; real-world power draw numbers would be more accurate.

What they could've done is go for a 512-bit memory bus, or go with GDDR5X (not sure if Nvidia has exclusivity on that, though).

 

Yeah, you have to count VRM and board losses, but they generally aren't huge. I was commenting on the fact that we do not have actual power consumption data for GDDR5X (but it must be lower than GDDR5, considering it runs at a lower voltage). I've seen reports that the 1080's entire memory subsystem (RAM + VRM) runs on about 25 W, which is still nowhere near as efficient as HBM considering the huge amount of bandwidth. That's even when considering VRM efficiency; 10% isn't going to make a difference.
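
Putting that 25 W report next to the bandwidth involved (a sketch; 320 GB/s is the GTX 1080's spec bandwidth, 484 GB/s is Vega 64's, and the 25 W is the report mentioned above):

# Bandwidth-per-watt sketch using the reported 25 W figure and the cards' spec bandwidths.
gtx1080_bw, gtx1080_mem_w = 320.0, 25.0    # GB/s, reported W for GDDR5X + memory VRM
vega64_bw = 484.0                          # GB/s from two HBM2 stacks
per_watt = gtx1080_bw / gtx1080_mem_w
print(f"GTX 1080 memory subsystem: ~{per_watt:.1f} GB/s per W")
print(f"Hitting Vega 64's {vega64_bw:.0f} GB/s at that rate would take ~{vega64_bw / per_watt:.0f} W")

So delivering Vega-class bandwidth with a GDDR5X setup at that efficiency would need nearly 40 W for memory alone, which is the gap HBM2 is meant to close.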

 

IMO, TDP shouldn't be used. There isn't any standard for it, and anyone can set their product at whatever they want (Intel putting the 7900X at 140 W, for example). Considering that all energy used gets turned into heat, the ideal solution would be to rate for maximum power consumption, but that's impossible when you consider process variability, power circuit efficiency, etc.

 

Also, AMD can't use GDDR5 regardless. At 512-bit, AMD would need to clock GDDR5 at 8 Gbps to achieve that bandwidth. That's 80 W for the RAM alone, which isn't feasible.
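
The arithmetic behind that scenario (the 80 W figure is the estimate above, not a datasheet number; the chip count just follows from GDDR5's 32-bit-per-chip interface):

# Sanity check on the 512-bit GDDR5 scenario above.
bus_bits, gbps = 512, 8.0
chips = bus_bits // 32                     # GDDR5 chips have a 32-bit interface
print(f"{bus_bits}-bit @ {gbps:.0f} Gbps: {bus_bits * gbps / 8:.0f} GB/s across {chips} chips")
print(f"80 W for the RAM works out to ~{80 / chips:.0f} W per chip")

So the 512-bit option would only just clear Vega 64's ~484 GB/s while burning a large chunk of the power budget on memory.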

GDDR5X is also a transitional technology between GDDR5 and GDDR6. AMD probably doesn't want to engineer a memory interface that will only be used for one generation if their bet is to use HBM for the long run anyway. That's my guess.



3 minutes ago, cj09beira said:

Not exactly. For example, part of the efficiency with Pascal is just taming the beast: if you disable all the power limitations on Pascal it consumes a lot more power (it's really hard to do and involves soldering). Right now AMD's problems are:

voltage at stock

memory compression (to allow for lower memory bandwidth needs)

and I also think there might be something problematic coming from the fact that they used IF (Infinity Fabric) this time around.

Their arch is also more focused on compute, which is becoming more prevalent.

I want AMD to run the Ryzen and Vega architectures through TSMC and see how much better the chips they get perform... I think that's part of their problem here too.



4 minutes ago, cj09beira said:

Not exactly. For example, part of the efficiency with Pascal is just taming the beast: if you disable all the power limitations on Pascal it consumes a lot more power (it's really hard to do and involves soldering). Right now AMD's problems are:

voltage at stock

memory compression (to allow for lower memory bandwidth needs)

and I also think there might be something problematic coming from the fact that they used IF (Infinity Fabric) this time around.

Their arch is also more focused on compute, which is becoming more prevalent.

Something that can't be denied, though, is that Pascal has higher perf/mm^2 and can clock higher than Vega.

 

I don't doubt that if you remove all limitations Pascal's power draw could get high, but compared to Vega it's still superior.

 

I thought that removing the voltage limit, at least, could be achieved just by soldering the vsense pin on the VRM controller to ground, so it's not that bad?... Or is there more to it if you want full voltage control?


 


Just now, XenosTech said:

I want AMD to run the Ryzen and Vega architectures through TSMC and see how much better the chips they get perform... I think that's part of their problem here too.

Imagine Ryzen done in Intel's fabs, hmm, delicious.


Just now, XenosTech said:

I want AMD to run the Ryzen and Vega architectures through TSMC and see how much better the chips they get perform... I think that's part of their problem here too.

I think it still wouldn't be as good as Pascal, but it'd be somewhat better. As demonstrated by the 1050/1050 Ti, GloFo/Samsung 14nm does not clock as high as TSMC's process; TSMC clocks about 10% higher.
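
For a sense of what that ~10% could mean (assuming the reference Vega 64 boost clock of roughly 1550 MHz; the 10% is the rough figure above, not a measured number):

# What a ~10% process clock uplift would roughly mean for Vega (illustrative only).
vega64_boost_mhz = 1550      # approximate reference Vega 64 boost clock
uplift = 0.10                # rough GloFo/Samsung -> TSMC clock difference cited above
print(f"~{vega64_boost_mhz * (1 + uplift):.0f} MHz boost, all else being equal")

Still well short of Pascal's typical 1.7-1.9 GHz boost clocks, which is why a process swap alone probably wouldn't close the gap.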


 


Just now, cj09beira said:

Imagine Ryzen done in Intel's fabs, hmm, delicious.

burp


 

