AlTech


Posts posted by AlTech

  1. On 8/21/2023 at 6:28 PM, micha_vulpes said:

    As a person who liked the idea of Thunderbird, its execution on Windows is terrible for some reason. It has not been a viable replacement for Outlook or even Windows Mail for me for quite some time.

     

    Seemingly no matter what I do, or what service I utilize, as soon as I have about 200 emails (not unread, not newly arrived; 200 in total) across multiple computer configurations, Thunderbird just falls on its face. It locks up every 15-20 seconds while composing a reply.

     

    The last version of Outlook I liked was 2007/2010, but those no longer connect to most email servers due to lack of OAuth. 2013 onward is Click-to-Run containerised web apps that I found to be crappy and that in general just waste a lot of CPU resources.

     

    If this even gets half the functionality of actual Outlook, without the mess of Click-to-Run, I'll be ecstatic.

    I use Thunderbird and have over a thousand emails in total within it. It doesn't run badly for me.

     

    Search takes a few seconds but that's normal when you have lots of emails.

  2. 1 minute ago, CameronP90 said:

    Can probably film it, edit it several times, then have it on that youtube thing about releasing it to the public while having errors and misrepresented facts etc etc and another 20 minutes of a half baked apology and still kick themselves. Is anyone buying into this clown show still?

    Side note: anyone wanna bet that BL gets a new block built, has it tested for whatever GPU, and probably never sends it to LTT? Can we do that here?

    They said they've already ordered a new one and it should be ready within the next few weeks.

  3. 10 minutes ago, mikaelus said:

    Was it baseless?

    The appearance of it means it doesn't matter whether there is any agreement in actuality or not.

     

    We will never know if Asus gives LMG big bags of money in exchange for a pass on some issues. It probably doesn't happen but we cannot 100% rule it out because of LMG's behaviour. Conflicts of interest are like this no matter the industry. It's not tech specific. LMG walked themselves right into this situation and nobody but LMG can get them out of it.

  4. 4 minutes ago, Taf the Ghost said:

    The phrase of art is "appearance of impropriety". It's not that you've done anything wrong. It's that a reasonable person will question the nature of your honesty if you aren't clear about certain "interests" involved. LMG has a couple(?) of former Asus employees, has worked with them for years, and seems to give them a bit of a pass on things. Some basic disclosures solve a lot of issues.

     

    Oddly enough, LMG doesn't have this issue with Intel, because they've roasted them enough over the years that everyone "gets" that Intel can take some heat. At least sometimes. Actually, in totality, I think LMG has shown a great amount of "appearance management" with the big 3 consumer producers (Intel, Nvidia and AMD) compared to a lot of the brands that sell their products.

    I was referring to Asus sponsoring some GPU videos and seemingly getting glowing recommendations and an easy ride as a result.

  5. 11 minutes ago, mikaelus said:

    So, as I said, no major tech reviewer can do reviews anymore. Only some nobodies recording from their basements can, since they never got sponsored by anybody.

    Now you're twisting my words.

     

    If you review restaurants and Domino's sponsors a video from you, can you review one of Domino's pizzas? Sure, just not the food that Domino's paid you to eat.

     

    If you do car reviews and VW sponsors a video about the ID.4 then you can still honestly review any Skoda car (owned by VW) and a VW car that hasn't had a sponsored video made by you for it.

  6. 13 minutes ago, mikaelus said:

    Only he didn't. First of all, LTT and Linus himself have always fully acknowledged his investment in Framework.

    Not to the satisfaction of GN, which is Steve's point. Other reviewers can't trust LMG to be honest in laptop reviews because Linus invested in Framework, and despite this LMG hasn't prevented Linus from evaluating laptops from Framework's competitors on camera for LMG channels.

    13 minutes ago, mikaelus said:

    And the idea that LMG is in bed with ASUS for some reason is completely unsubstantiated.

    LMG gave Asus an easy ride on the Asus motherboard fiasco with AMD CPUs and LMG gave Asus the kind of blanket endorsement that never comes free.

    13 minutes ago, mikaelus said:

    On the basis of Steve's line of thinking nobody who took any sponsorship from any tech company should review its products or is in a potential "conflict of interest" with it.

    They ideally shouldn't, but the reality of YouTubers funding their reviews means they need sources of revenue, and they try hard to prevent it from biasing their reviews. It's also why many reviewers have a policy of not reviewing products from companies that have sponsored them before, or products they had a hand in making.

    13 minutes ago, mikaelus said:

    It's ridiculous. This is where he broke his own rules about focusing on facts and evidence. 

    It's the appearance of a conflict of interest. HUB and GN have both said that LMG can't be trusted on laptop reviews because of this appearance of a conflict of interest.

    13 minutes ago, mikaelus said:

    You do realize it's possible for people to be right about something and wrong about something else, right?

    I'm not sure how this relates to anything GN has said or I have said.

     

    The appearance of a conflict of interest is just as bad as an actual conflict of interest. LMG has major conflicts of interest that they've done nothing about, and GN took them to task for it when other smaller channels were too afraid to say it to LMG.

  7. Just now, mikaelus said:

    And when did I say he didn't in reference to these points? I agree with those. What I do not agree with is peddling accusations that are not backed by fact - like suggesting LMG was in some conflict of interest with ASUS because it was their sponsor at LTX or because one ex-ASUS guy works for the company now. The same goes for Framework, which Linus has always been clear about.

    The appearance of a potential conflict of interest is sufficient to cause doubts about the propriety of the relationship between Linus, Framework, Asus, and LMG.

    Just now, mikaelus said:

    So, no, Steve had zero basis for these allegations. This is where he went rogue and personal.

    No, he reported conflicts of interest that LMG has either not considered or ignored.

    Just now, mikaelus said:

    I find it funny that people can be so invested in defending one side without noticing the hypocrisy of the other. Dude, these are just some Youtubers you don't even know. Take a chill pill.

    Is anything I have said wrong?

  8. 20 minutes ago, Silviecat44 said:

    Except if they are a $100m+ corporation

    Or unless what you're covering exists exclusively in public, in which case there's not a lot of point requesting comment, because A) the company will likely not respond except perhaps with a stock PR answer and B) the company may try to manipulate the situation, as Linus has done in this instance.

  9. 37 minutes ago, mikaelus said:

    These are allegations that are completely unsubstantiated. As I said, like those people in photos with Epstein, who now "may be" or "may not be" pedos.

     

    You just don't do that. Linus has been very forthcoming about his investment in Framework. Is every tech reviewer disclosing his stock market bets? How many of them own Apple or Nvidia shares? Does it affect how biased they are? Should each of them give full disclosures before every video? I mean, come on.

    Yes, because among other things they are legally required to. The fact that some of them don't means they are breaking the law.

    37 minutes ago, mikaelus said:

    And what about all other companies which have at some point sponsored LTT content (but also GN or any other channel). Intel ran a whole series of sponsored upgrades, now taken up by AMD. Does this mean there may be conflict of interest between LTT and the biggest chip makers, just because they pay for some clips? 

    Not unless LTT starts going easy on them in reviews for seemingly no reason.

    37 minutes ago, mikaelus said:

    Aren't ALL tech reviewers living mostly off this sort of income?

    No. Many live off of merch or Patrons who give them money monthly, or rely on YT ad revenue.

    37 minutes ago, mikaelus said:

    Should they then not review anything because they got money from nearly all tech companies at some point?

    Not at all the same thing. If Linus buys shares in a company that LMG reviews products of, then Linus should not be involved in that video at all. Period. The appearance of a conflict of interest is enough to cause doubt about LMG's integrity.

    37 minutes ago, mikaelus said:

    And I'm not defending Linus here - he's done a lot of bad things and has become all ego - but Steve went from talking about evidence to suggesting malicious things he can't back up with any evidence.

    Nothing Steve said wasn't backed up by evidence.

     

    LMG rushes videos: fact.

    LMG makes mistakes in virtually every video: fact.

    LMG sold a prototype that they had no right to sell: fact.

    LMG wasn't willing to spend a bit of extra time to make their videos accurate: fact.

     

    LMG made misrepresentations about the testing capabilities and performance of GN and HUB: fact.

     

    LMG made misrepresentations about Billet's prototype's capabilities: fact.

     

    Linus made misrepresentations about LMG's communication with Billet: fact.

     

    Linus made misrepresentations that LMG had material evidence which would have altered the Steve GN video, when it has been shown Linus had nothing further to add: fact.

  10. 2 hours ago, porina said:

    I've seen the rumours of no high end AMD GPU next round, but not the why. If this is the "why" it would be an example that not everything AMD works out.

     

    Just on the screenshot alone that's 10 logic pieces and 6 connectivity above substrate, which by itself is already in EPYC territory. It isn't clear how many more repetitions of this there might be in the other dimension. Even if you treat a loaded AID as a subunit, there must be quite some manufacturing (yield) risk to this complexity. The only other example like this I'm aware of might be Ponte Vecchio, at a count of 47 pieces of silicon. But that's HPC, not consumer tier offering.

     

    For lower tier Navi 4, I assume they take the more conventional monolithic route?

    Yes. Navi 44 (replacement for N24 and N33) and N43 (N32 replacement) are expected to be monolithic dies.

     

    There's no N42 or N41 because of this cancellation.

     

    Speculation is that N44 will be a 6nm die of around 200mm² and that N43 will be 3nm with slightly better than 7900XTX performance at significantly reduced power, but neither has actually been leaked or rumoured.

     

    2 hours ago, porina said:

    For indication, 400mm2 would be comparable to AD103 (4080) at 379, GA104 (3070) at 392, and there isn't a recent AMD GPU in that ball park. NAVI31 (without MCD) is smaller at 304, and NAVI22 (6700) at 335. NAVI21 (6800+) is much bigger at 520. DG2-512 (ARC A700) is also there at 406.

    AMD does have big chips but they're for datacenter and AI stuff.

     

    The eventual aim for AMD is to be able to make Radeon GPUs the way it makes Ryzen CPUs: a truly chiplet-based architecture instead of the current GPU chiplet approach.

    2 hours ago, porina said:

    It is also interesting seeing the different approaches the various companies use to get to bigger effective sizes.

     

    Apple M2 Ultra is two dies of ~155 each.

    Intel Sapphire Rapids is 4 dies of 400 each, a total of 1600. It isn't clear to me if EMIB used would count as extra silicon but there are 10 connections.

    Genoa is up to 12x 72 CCDs + 397 IOD, totalling 1261.

     

    AMD certainly is chopping things down to smaller pieces. 

    Because it is cheaper and more efficient to do so. Going bigger is more expensive with monolithic dies, and in the future it will be technically challenging as well.

     

    The reticle limit of a node isn't infinite, and it's easier to package small dies into a number of different configurations, e.g. using Zen CCDs from Ryzen all the way to Threadripper and Epyc with mostly or exactly the same core design (Zen 4c and Zen 5c being the exceptions).
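    The cost argument above can be sketched with a toy yield model. The Poisson yield formula is a standard approximation, but the wafer cost and defect density below are made-up illustrative numbers, not real foundry data:

```python
import math

# Toy model of why several small chiplets can beat one big monolithic die.
# All numbers below are illustrative assumptions, not real foundry figures.
WAFER_DIAMETER_MM = 300
WAFER_AREA_MM2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # edge loss ignored
WAFER_COST_USD = 17000  # assumed cost of a leading-edge wafer

def cost_per_good_die(die_area_mm2, defect_density_per_mm2=0.001):
    """Poisson yield model: the fraction of defect-free dies falls off
    exponentially with die area, so cost per good die grows super-linearly."""
    dies_per_wafer = WAFER_AREA_MM2 // die_area_mm2
    yield_fraction = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return WAFER_COST_USD / (dies_per_wafer * yield_fraction)

# One 600 mm^2 monolithic die vs. four 150 mm^2 chiplets giving the
# same total silicon area:
monolithic = cost_per_good_die(600)
chiplets = 4 * cost_per_good_die(150)
```

    With these assumed numbers the four chiplets come out meaningfully cheaper than the single big die, and the gap widens as defect density or die area grows.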

  11. Just now, Forbidden Wafer said:

    Yeah, but why? This is the answer you get in the video I updated in the previous post. I think that is why they are thinking on stacking. Adding 3D layers for easier routing.

    True, but it poses a thermal challenge, which is why you can't stack them directly on top of each other.


    Which is likely why the Navi 4C design called for spreading them out a bit horizontally and vertically.

  12. 2 minutes ago, Forbidden Wafer said:

    I mean, I'm all for chiplets but this doesn't seem to be worth it. Active interposers are expensive chips, the microsoldering is expensive too.
    They are probably spending more on assembly than saving with higher yields/more configurability. 

    It's not just about money. It's about performance. At a certain point you run into the reticle limit where making a bigger GPU is no longer possible.

     

    Nvidia keeps inching towards it. AMD's steering well clear of it and wants to figure out how to scale GPUs to be a lot bigger without costing a lot more or running into the laws of physics.

    2 minutes ago, Forbidden Wafer said:

    I guess they were going that route because the current layout just doesn't scale well.

    The problem right now is that the MCDs are separate but the GCD is still a really big, effectively monolithic die glued to the interposer layer with the MCDs connected to it.

     

    If AMD can break down GCDs then their GPU designs can become much more scalable and much cheaper.

  13. 2 hours ago, LinusTech said:

    Billet sent us a quote. I don't know or care how they arrived at the value. If they're good, I'm good.


    As for what steps we're taking, you're talking about an outlier issue that has happened once in 10+ years of operation. There won't be a new SOP to ensure we don't accidentally auction stuff. We just need to tighten up some documentation.

    So mediocrity is okay so long as it doesn't happen too often? The fact that LMG illegally sold something, and yet no SOPs will be changed to prevent it happening again, is negligent conduct. If this happens again, one might even be able to make a legal case for theft due to reckless disregard for who owns what property, which in and of itself is likely to be a criminal offence.

     

    I'm not even sure how it's possible for such an error to occur by accident because if you don't own an item you can't sell it without explicit permission. If the company providing a review sample wants it back, then it's not yours. If they say you can keep it then it may be yours to keep but you should still double check with them to see if they're okay with you selling it for a charity auction.

  14. 2 hours ago, LinusTech said:

    With all of that said, I still disagree that the Billet Labs video (not the situation with the return, which I've already addressed above) is an 'accuracy' issue. It's more like I just read the room wrong. We COULD have re-tested it with perfect accuracy, but to do so PROPERLY - accounting for which cases it could be installed in (none) and which radiators it would be plumbed with (again... mystery) would have been impossible... and also didn't affect the conclusion of the video... OR SO I THOUGHT...

    LMG literally used the wrong GPU on the waterblock that Billet sent you. I'm not sure how this could be anything other than an accuracy issue.

     

    The waterblock was specifically for a 3090Ti. Waterblocks are not interchangeable across different GPUs much less even different board partner models of the same GPU.

    2 hours ago, LinusTech said:

    I wanted to evaluate it as a product, and as a product, IF it could manage to compete with the temperatures of the highest end blocks on the planet, it still wouldn't make sense to buy... so from my point of view, re-testing it and finding out that yes, it did in fact run cooler made no difference to the conclusion, so it didn't really make a difference.

    It does to the people with 3090Ti cards who would have benefitted from watching an accurate and objective video about a GPU block designed for 3090Ti cards.

     

    Instead it has caused untold reputational damage to the company and has likely cost them future sales as a result of the harm your video has done.

    2 hours ago, LinusTech said:

    Adam and I were talking about this today. He advocated for re-testing it regardless of how non-viable it was as a product at the time and I think he expressed really well today why it mattered. It was like making a video about a supercar. It doesn't matter if no one watching will buy it. They just wanna see it rip. I missed that, but it wasn't because I didn't care about the consumer... it was because I was so focused on how this product impacted a potential buyer. Either way, clearly my bad, but my intention was never to harm Billet Labs. I specifically called out their incredible machining skills because I wanted to see them create something with a viable market for it and was hoping others would appreciate the fineness of the craftsmanship even if the product was impractical. I still hope they move forward building something else because they obviously have talent and I've watched countless niche water cooling vendors come and go. It's an astonishingly unforgiving market.

     

    Either way, I'm sorry I got the community's priorities mixed-up on this one, and that we didn't show the Billet in the best light. Our intention wasn't to hurt anyone. We wanted no one to buy it (because it's an egregious waste of money no matter what temps it runs at) and we wanted Billet to make something marketable (so they can, y'know, eat).

    Even if it performed as they advertised at around 20C lower than your temps?

     

    If you can't allow your conclusion to change based on the performance of the product then you're not honestly being objective or helping the community.

     

    A wise man once said words to the effect of: a delayed game is eventually good, but a rushed game is forever bad.

     

    The same applies to videos and other content where integrity matters. The rushed videos LMG has put out, riddled with pre-upload and post-upload corrections, are signs that there is a serious lack of quality control at LMG.

    2 hours ago, LinusTech said:

    With all of this in mind, it saddens me how quickly the pitchforks were raised over this. It also comes across a touch hypocritical when some basic due diligence could have helped clarify much of it.

     

     

    I have a LONG history of meeting issues head on and I've never been afraid to answer questions, which lands me in hot water regularly, but helps keep me in tune with my peers and with the community. The only reason I can think of not to ask me is because my honest response might be inconvenient. 

    I'm not sure you even believe that but if you do then you give GN and other reviewers in the space far less credit than they give you.

    2 hours ago, LinusTech said:

    We can test that... with this post. Will the "It was a mistake (a bad one, but a mistake) and they're taking care of it" reality manage to have the same reach?

     

    Let's see if anyone actually wants to know what happened. I hope so, but it's been disheartening seeing how many people were willing to jump on us here. Believe it or not, I'm a real person and so is the rest of my team. We are trying our best, and if what we were doing was easy, everyone would do it. Today sucks.

    For a scrappy startup with 12 employees, this seemingly endless stream of mistakes might be tolerable, if still extremely disappointing. For a reasonably large company with 120 employees it is simply not acceptable.

     

    Making basic mistakes in seemingly every video and mass-producing videos like chocolate bars isn't helping anybody except your bottom line. If that's the goal then great, scream it from the rooftops, but if it is you shouldn't claim to care about the community or to be doing this for consumers. Furthermore, if you can't handle criticism about the company you partially own then frankly you should not own any part of a business.

  15. Summary

    AMD is rumoured to have cancelled Navi 4C (aka Navi 41), and a diagram of Navi 4C has been leaked. Navi 4C's alleged cancellation is because of complications caused by its new architecture, with CUs inside Shader Engine Dies housed on top of Active Interposer Dies, which are themselves housed on a package substrate, and the massive scaling of chiplet usage in the architecture.

     

    AMD is believed to have diverted employees working on Navi 4C towards Navi 5 so that Navi 4 can be released in 2024 with Navi 5 launching in 2025.

     

    Image from MLID via Videocardz


     

    Quotes


    The leaked diagram showcases a large package substrate that accommodates four dies: three AIDs (Active Interposer Dies) and one MID (Multimedia and I/O die). It appears that each AID would house as many as 3 SEDs (Shader Engine Dies). This complex configuration represents the alleged RDNA4 architecture, or at least a segment of the GPU that was intended for future release. Notably, the diagram only presents one side of the design, omitting the complete picture. MLID notes that there should also be memory controller dies on each side, although their exact number remains unknown.

    The proposed Navi 4C GPU would have incorporated 13 to 20 chiplets, marking a substantial increase in complexity compared to RDNA3 multi-die designs such as Navi 31 or the upcoming Navi 32. Interestingly, a similar design was identified in a patent titled “Die stacking for modular parallel processors” discovered by a subscriber of MLID, which showcased ‘Virtual Compute Die’ interconnected through a Bridge Chip

     

    My thoughts

    Not gonna lie, looking at this insanely expanded complexity I am not surprised it got cancelled. I do hope that the rumours are correct and that AMD will be back making high end GPUs for Navi 5.
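    For what it's worth, the quoted 13-to-20 chiplet figure can be tallied directly from the diagram's description (the per-die counts come from the MLID leak above; the memory controller die count is the unknown part):

```python
# Chiplet tally for the leaked Navi 4C diagram as described by MLID:
# three AIDs, each housing three SEDs, plus one MID.
aids = 3
seds_per_aid = 3
mid = 1

known_chiplets = aids + aids * seds_per_aid + mid  # 3 AIDs + 9 SEDs + 1 MID

# Memory controller dies are said to sit on each side, count unknown;
# the rumoured 13-to-20 total implies up to 7 of them beyond the known 13.
rumoured_min, rumoured_max = 13, 20
max_memory_dies = rumoured_max - known_chiplets
```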

     

     

    Sources

    https://videocardz.com/newz/amds-canceled-radeon-rx-8000-navi-4c-gpu-diagram-has-been-partially-leaked

  16. 15 hours ago, porina said:

    In the UK there are two versions of the register, the public version, and the full version. The difference between them is you can opt out of appearing on the public list. For older people, it is similar to requesting to not be listed in the phone book (white pages). If that is all the data they got, I view it as relatively insignificant in data value. Still a serious breach, but little to no individual impact.

     

    The article mentions that credit agencies may purchase the full list, otherwise they couldn't verify who lives where for those who have opted out of the public list.

    Well it's still a breach since nobody is supposed to be able to get the full list.

     

    People who are only on the full list are there for a reason: they don't want to be on the public list. Downplaying grievances with wholesale data selling and data brokering is not helpful to this discussion.

     

    It may not be the most significant breach ever but this hack is egregious and far from harmless. The hackers should face serious repercussions for their actions if they can ever be identified.

     

    I'm not sure why OP used the word leak since it was a cyber attack/hack and clearly not a leak.

  17. 39 minutes ago, IkeaGnome said:

     

    And you also have to keep in mind that computer/parts like this sell because people don't know better/know fully what part names mean.

    https://www.amazon.com/STGAubron-Desktop-Computer-GeForce-Keyboard/dp/B0BK54BHL7/ref=sr_1_10?crid=1RO2CKMHNEW9W&keywords=i7%2Bgaming%2Bcomputer&qid=1690904226&sprefix=i7%2Bgaming%2Bcomput%2Caps%2C221&sr=8-10&ufe=app_do%3Aamzn1.fos.18630bbb-fcbb-42f8-9767-857e17e03685&th=1

    That *should be* an i7 3770...

    https://www.amazon.com/STGAubron-Desktop-i7-8700-Keybaord-Bluetooth/dp/B0BNC1PT36/ref=sr_1_18?crid=1RO2CKMHNEW9W&keywords=i7+gaming+computer&qid=1690904248&sprefix=i7+gaming+comput%2Caps%2C221&sr=8-18&ufe=app_do%3Aamzn1.fos.ac578592-0362-4e0a-958c-0f2dd61d30d4

    https://www.amazon.com/Continuum-GeForce-Windows-Desktop-Computer/dp/B0887M9HL3/ref=sr_1_8?crid=1RO2CKMHNEW9W&keywords=i7+gaming+computer&qid=1690904328&refinements=p_36%3A2421883011&rnid=2421879011&s=pc&sprefix=i7+gaming+comput%2Caps%2C221&sr=1-8&ufe=app_do%3Aamzn1.fos.ac578592-0362-4e0a-958c-0f2dd61d30d4

     

    Remember, just because it makes sense to any of the three of us, doesn't mean the average person will understand the difference. Average person will see 7900 and think they're all very similar in performance. Average person doesn't watch tech news, read this thread, watch benchmarks. Average person is higher number=better.

    Nope, but once you add in the xx50 people get confused, or you add in the Super people get confused. There have been many threads on here asking the difference, more recently between xx50 and xx00 cards. More than 2 names/products with similar names and people will get confused.

     

    Where was the 6900/6950 xtx?

    They didn't introduce it at that time.

     

    6900XT/6950XT was sufficient.

    39 minutes ago, IkeaGnome said:

    If you dig far enough into it, yes. If you go just off confusing ass names meant to confuse the AVERAGE consumer, no it's the same. 

    What do you want AMD to do?

     

    Given that there are expected to be 6 GPUs in AMD's desktop RDNA lineup, what would you call them?

     

    7900XTX -> 7900

    7900XT -> 7800

    7900 GRE -> 7700

    7800 -> 7600

    7700 -> 7500

    7600 8GB -> 7400 8GB

     

    But then what if AMD had a complete lineup? It would be an even bigger mess.

     

    7900XTX -> 7900

    7900XT -> 7800

    7900 GRE -> 7700

    7800XT -> 7600

    7800 -> 7500

    7700XT -> 7400

    7700 -> 7300

    7600 16GB -> 7200 16GB

    7600 8GB -> 7200 8GB

    7500XT -> 7100

    7500 -> 7000

     

    This is even more confusing. Notwithstanding the XT-class card being called XTX, the non-XT card being called XT, and the lower-end XLE-class card being called "GRE", AMD's RDNA3 lineup isn't confusing.

    39 minutes ago, IkeaGnome said:

    It's not going to be just a Chinese card only. It's already not. Their marketing material has USD listed for the price. It will be available here given enough stock of it and time to get it over here. Yes, it's China first. No, it's not going to be China only.

     

    Common sense in product naming says the 7900xt would be the replacement for the 6900xt...

    Well it's not.

     

    The 7900XT acts as the 6950XT replacement.

     

    7900 GRE acts as the 6900XT replacement. Those 2 cards are at different price points with different people buying them.

  18. 1 hour ago, WallacEngineering said:

     

    Ya but who cares what AMD calls it. We know better, just like how I'm aware that my 7900-XTX isn't an XTX. AMD just added the extra "X" to sound cool. It's really just the 7900-XT, and then the 7900-XT is actually the 7800-XT. So I guess then the GRE would actually be the 7800 non-XT.

    Well no, the 7900XT should have been called the RX 7900. It's not a 7800XT class product.

     

    And no: if the 7900XT had been called the RX 7900, the 7900 GRE would either A) not have launched at all, or B) have been called a 7800XT in the worst case scenario.

    1 hour ago, WallacEngineering said:

    It's just like how we are well aware that the RTX 4060s are the biggest SCAM in Nvidia's history, because we know they are charging even more than typical 60-class pricing for 50-class GPU dies and bus widths, and that's why literally not a single enthusiast I have talked to is even remotely interested in them. It's as Steve from GN said in the review: "The RTX 4060 is literally a waste of sand". Or at least they are until they fall to acceptable price points, which most of the community has agreed upon:

     

    RTX 4060 Base: $250

    RTX 4060-Ti 8GB: $300

    RTX 4060-Ti 16GB: $340

     

    These prices are as such because the stupidly narrow BUS width means these cards scale negatively at higher resolutions, meaning they should ONLY be recommended for gamers targeting 1920x1080p or UltraWide 2560x1080p resolutions with high frame rates and maybe some Ray-Tracing gameplay. Otherwise, they are completely useless. 1440p and higher gamers should be looking at the RTX 4070 and up.

    Nvidia may be scamming customers but AMD isn't.

    1 hour ago, WallacEngineering said:

     As @IkeaGnome said, both AMD and Nvidia are screwing up this gen, it's just that Nvidia is definitely worse. At least AMD's cards are reasonably priced so really all you have to worry about is deciphering the stupid names lol 🤣

     

    And yes the GRE will come down in price. Literally every single release this generation from both companies has had a starting MSRP that is too high and that drops over time, and while this particular MSRP isn't terrible, it will still drop at least a little bit after the first 10-12 weeks on the USA market. And then that 7800 you mentioned will also fall a bit from its $550 launch price, and so on. What makes you think anything changes at this point from how the 7900-XTX, 7900-XT, and 7600 launched? It's just gonna be the same, because all GPUs are overpriced right now and GPU sales are at an all-time low across the board.

    AMD GPUs aren't significantly overpriced; perhaps they are slightly overpriced, but it's nothing a minor price adjustment can't fix.

     

    The 7900XTX should be $899

    The 7900XT should be $699 to $749

    The GRE should be $599

    The 7800 should launch at $499

    The 7700 should launch at $399

     

    And a 7600 16GB, if it were to ever come out, should be between $299 and $329.

     

    These are just my opinions, not what's gonna happen.

     

    7800 is planned to be $550 and 7700 is planned to be $450.

  19. 2 hours ago, IkeaGnome said:

     

    Devils advocate here. Why are we more okay with this than we are with Nvidia giving “wrong names” to graphics cards?

    Because this isn't a bad name for the GPU.

     

    N31 dies being used exclusively in the 7900 series makes sense and is fine.

     

    AMD is segmenting the series as needed to offer different tiers of performance and value to different customers who want N31.

  20. 20 minutes ago, IkeaGnome said:

    This is a 7800xt class card with a 7900 name on it. Remember how angry people got with the 4080 (4070 ti)?

    Not anymore. Also, part of the hatred for the 4080 12GB was the confusing name, with the 4080 16GB being a totally different GPU with the same name.

     

    It was never in the cards to be called a 7800XT. It could maybe have been called a 7800 XTX, though that would have been a sillier name than 7900 GRE.

     

    The x800 class, at least for now, is becoming the fully enabled xY2 die, where x is the ISA/architecture codename and Y is the generation within the ISA.

     

    The xY1 die seems to be reserved for the x900 series now.

     

    And the xY3 die is for the x500 (if AMD decides to launch one in the future) and the x600 series.

    20 minutes ago, IkeaGnome said:

    The name is also just as misleading. You now have $400 in price difference between cards with the same numbering scheme. How is that better than Nvidia?

    AMD has different series within a generation of cards and regularly has multiple SKUs within them.

    Non XT (or in this case GRE) is cut down.

     

    XT is usually fully enabled but can sometimes be cut down. And XTX is fully enabled.

     

    For revisions or refreshes AMD uses an xx50 naming scheme to mark the refresh, compared to the xx00 scheme for launch GPUs, e.g. 6600 XT (launch model) and 6650 XT (refresh).

     

    Where both an XT and a non-XT exist, the non-XT is a cut-down XT.

    E.g. 6600 was a cut down 6600XT. 6700 was a cut down 6700XT. 6800 was a cut down 6800XT.

     

    This generation, AMD is naming products to reflect where they sit in the product stack relative to their predecessors.

     

    The 7600 is called a 7600 to reflect the nature of its performance and its goal to replace the 6600 and 6600XT GPUs.

     

    The 7900 GRE is a replacement for the discounted $650 6900XT and is intended to provide better performance, better perf/dollar, and support for newer features.
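    The suffix conventions described above can be expressed as a tiny name classifier. This is just my reading of AMD's RDNA-era scheme made concrete, not an official mapping, and the helper name is made up:

```python
import re

def classify_radeon(name: str) -> dict:
    """Split an RDNA-era Radeon name like '6650 XT' or '7900 GRE' into
    generation, performance tier, launch-vs-refresh, and variant.
    (Illustrative reading of the conventions above, not an AMD spec.)"""
    m = re.fullmatch(r"(\d)(\d)(00|50)\s*(XTX|XT|GRE)?", name.strip())
    if not m:
        raise ValueError(f"unrecognised name: {name!r}")
    gen, tier, suffix, variant = m.groups()
    return {
        "generation": int(gen),
        "tier": int(tier),
        "refresh": suffix == "50",     # xx50 marks a mid-cycle refresh
        "variant": variant or "base",  # base/GRE are cut down vs XT/XTX
    }
```

    Under this reading, "6650 XT" parses as a generation-6, tier-6 refresh, while "7900 GRE" is a generation-7, tier-9 launch card with a cut-down variant suffix.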

    20 minutes ago, IkeaGnome said:

    Disclaimer, I’m not pro one brand or another. I’m pro competition. Right now both are competing at being confusing to the average consumer. People on here are not the average consumer. A parent who saves up to splurge on their kids computer is not going to know the difference between a 4060 ti 8gb and 4060 ti 16gb any more than the difference between a 7900 gre and a 7900 xtx besides price differences. At least Nvidia’s naming there hints at the difference.

    Anybody who has bought from AMD in the past is aware of the distinction between non-XT and XT cards. The XTX is a bit different but it doesn't confuse buyers in any meaningful numbers.

     

    The 7900XTX outsells the 7900XT by a large amount owing to its better performance and being a better value than the 7900XT for most of the 7900XT's time on the market.

     

    The 7900 GRE is a value Navi 31 GPU.

    People who want a cut down top die at a cheap price have it now.

     

    20 minutes ago, IkeaGnome said:

    In other words, the same thing as GN said about the 4060. This one is just backed with price drops and sales.

    AMD knows the 7900 GRE will sell well. They don't need to discount the 7900 GRE. The only AMD cards that need discounting right now are the 7900XT to $750 and the 7900XTX to $900.

     

    20 minutes ago, IkeaGnome said:

    Why does the amount of BUS width and die size matter? You could have a full size die with a massive bus width, but have horrible performance. Would we say “at least it’s a full size die this time?”

    It's been said above by others who have replied to you but essentially 7900 series is N31.

     

    7800 series is fully enabled N32 and 7700 series will be cut down N32.

     

    N32 this generation has a substantially higher CU count than N22 and is designed to replace the 6800 and the 6700XT.
