
indrora

Member
  • Posts

    74
  • Joined

  • Last visited

Reputation Activity

  1. Like
    indrora got a reaction from dogwitch in I shouldn’t have kept the $1,000,000 PC   
    Yes hello you have my attention. Jake and Alex, the lovely bear? LTT folks, you're giving me the vapors! 
  2. Funny
    indrora got a reaction from mrtwonavels in I shouldn’t have kept the $1,000,000 PC   
    (Same post as above.)
  3. Funny
    indrora got a reaction from Mark Kaine in Helldivers 2 to require a PSN account login to play on PC   
    Continuing the fun: Setting up a PSN account in many places requires you to own a PlayStation
     
    That $40 game for PC suddenly needs a $500 console you can't even play the game on...
  4. Informative
    indrora got a reaction from Wiking22pl in Way to test students on computer use?   
    I have a few thoughts on this, tempered from my years of tutoring college students on basic IT. Not all of this will apply to an 8th grader. Use your best judgement.
     
    IMNSHO, topics like media literacy ("is this real? How do I trust it?") and core skills like file management and organization are the most important things to cover.
     
    At one point, I developed a checklist of "Essential shit you will need to know in order to be competent in the world of computers":
     
    * Identify the following ports: USB (A, B, C, Micro/Mini B), Ethernet, HDMI, VGA, SD/"removable media".
    * Discuss at some level the difference between a desktop application (e.g. Word/Excel/VLC) and a web application (Google Docs, Office Online, Gmail, Twitter, etc).
    * Identify and discuss the differences between RAM and storage, "fixed" disks vs portable storage, etc.
    * Troubleshoot "is it me" -- that is, is this website slow, are all websites slow, is my computer slow? I constantly hear people complain that their computer is slow because it takes a very long time to load a specific webapp, such as one from a university that is being inundated with every student trying to register for the same 10 classes at once.
    * The importance of updated software (security, at least) and occasionally rebooting your machine.
    * In some form or fashion, be able to store, long-term, information you don't want to remember right now ("exobrain" type work) -- I don't care how. It can be Google Keep, OneNote, Notion, a text file they keep on their desktop. Whatever. Some form of digital notebook. I will occasionally accept bookmarks.
    * Print/Save to PDF.
    * Using the Internet Archive at some level. I specifically start with "what if I wanted to see the front page of CNN on September 11, 2001?"
    * Two-place backups. Easier if they're a Mac user -- I just tell them to get a 2TB external SSD and the Apple usb-hdmi dongle, keep it plugged in when they go to bed. Windows users I generally point to "make regular backups on a flash drive." I'm not going to go into 3-2-1 rule type stuff.
    * Password managers and how they work.
    * How to touch type. This is, no shit, one of the most powerful things I have gotten people to learn. Being able to transcribe something you see into words is supremely useful.
    * How to manage files: copy/paste/move/rename/etc.
    * Saving your work.
    * Citing sources digitally (even if it's just a URL).
    That's most of it. The ends are the most important imo. 
I would hear people come to me and go "My phone is so slow" and I'd look at it and the reason wasn't that they had too many pictures, as they would claim, but instead because they had a ton of backgrounded apps. I'd ask "When's the last time you rebooted your phone" and they'd look at me like I had grown a third eyeball. 
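    The "is it me?" triage step above can even be sketched in code. A minimal Python sketch, assuming hypothetical reference URLs and an arbitrary 3-second threshold (neither is canonical):

```python
import time
import urllib.request

# Hypothetical reference sites; any set of normally-fast pages works.
REFERENCE_SITES = ["https://example.com", "https://www.wikipedia.org"]

def fetch_seconds(url, timeout=10):
    """Time one GET request; None means it failed entirely."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            pass
    except OSError:
        return None
    return time.monotonic() - start

def diagnose(site_seconds, reference_seconds, threshold=3.0):
    """'Is it me?': if the reference sites are also slow, the problem is
    probably local; if only the one site is slow, it's that site."""
    slow = lambda t: t is None or t > threshold
    if not slow(site_seconds):
        return "not slow"
    if all(slow(t) for t in reference_seconds):
        return "probably you (network or machine)"
    return "probably that site"
```

    Usage would look like `diagnose(fetch_seconds(slow_url), [fetch_seconds(u) for u in REFERENCE_SITES])`.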
     
    There's some good reading to be had about this topic, too, if that's your thing:
    * https://www.nngroup.com/articles/computer-skill-levels/ -- an older but still somewhat relevant study on how well people handle various tiers of tasks with computers.
    * https://www.sciencedirect.com/science/article/pii/S2405844023020856 -- a review of mechanisms for teaching and quantifying digital literacy (open access, even!)
    If I were to design a "thing" that covered a lot of these, it would be the following:
     
    You can get bulk packs of small size flash drives -- I don't know what your budget looks like (knowing America, bad) but if you can, you can get massive quantities of flash drives cheap now that are good enough for keeping high school stuff on. 
  5. Informative
    indrora got a reaction from starsmine in Way to test students on computer use?   
    (Same post as above.)
  6. Informative
    indrora got a reaction from Needfuldoer in Was going to recommend Starforge to a friend. Won't be.   
    A friend of mine recently asked for recommendations on a new PC build. I mentioned that I was just going through the LTT secret shopper series and I'd get them my thoughts. I was doubly impressed at the positive review from GamersNexus, who had one major complaint about some thermal pads. That's high praise. 
     
    I have a few give-or-take things for smaller shops. HP and other major OEMs are monoliths with big corporate faces. Smaller boutiques, though? I look a little into them, including their leadership. Starforge (the dumb logo is forgivable) leans heavily on its sponsorship pairing with OTK, which raised my eyebrows when I saw a clip from one of Asmongold's streams that made me say "man, I don't want to be associated with him." In the clip, Asmongold starts badmouthing some VTubers:
    Then I noticed the name. 
     
    That's one of my friends he's talking about. That's a human being who he's badmouthing. That's a streamer who has worked her ass off to build a positive community of people, responding to another streamer who has worked hard to have a supportive community. Even if that wasn't someone I know personally, I'd absolutely consider this a black mark. On top of that, it's a shitty comment, completely ignoring the point that people are trying to make (it's not actually about Palworld, but about its leadership, who are hard into the NFT/Crypto/AI-art world, which a lot of people aren't okay with supporting) just to hate on furries. (It's worth saying: furries make the internet go)
     
    Okay so... where there's smoke, there's fire. There's some really bad takes I don't agree with ("artists opinions don't matter" when it comes to things like copyright infringement, for one) but okay, let's do some background research. If I'm going to form an opinion on one group of people, I should probably consider their actions as a group/whole. What's Asmongold and the rest of OTK like?

    Asmongold has some amazingly bad takes about artists and creatives...

     
    Okay, so, Asmongold is a jerk. What about the others? It gets complicated. There are plenty of snapshots of various OTK members being vaguely sexist, laughing about poor people, being jerks on Twitter, having some skeletons in their closets, getting banned from Twitch for a few interesting reasons, and saying racist and homophobic things. I'll only briefly mention the straight-up accusations of terrible behavior, like manipulation and gaslighting, since I don't know how much context you need, but they sure add fuel to the flame.
     
    Then there's the... tasteless naming of the supplement flavors that many of the OTK crew promote, one of which was named by an OTK member.
     
    All this sums up to "The people who are in some level of power, culturally, aren't the kind of people I want to support." I guess the final straw was reading that OTK as a group are worried that being called out for shitty behavior is somehow going to ruin streaming. 
     
    Hassan had it right. I was disappointed when LTT got their time in the spotlight of unpleasant drama, but LTT took ownership and righted the ship. I dunno what OTK does, but Starforge has them as their front line big names. Puts a real bad taste in my mouth.
  7. Informative
    indrora got a reaction from Anfros in It’s Back and I’m SO Excited! - Threadripper 7000   
    As a developer, this sort of processor is really appealing for a few reasons when you get into the more complex tasks.
     
    Anyone who does OS-level work or browser development, or even just the typical Gentoo user, will appreciate the high core count and high memory throughput. Why? Because it will rip through code. Most of the time, code compilation is I/O bound, but with the sort of memory bandwidth and capacity this can handle, holding your entire codebase in memory is now possible, with enough left over for whatever compilation you need. 
     
    One of the hard parts of compiling software is a part called linking -- taking all the intermediate compilation bits and binding them together. The issue is that most linkers are single-threaded, but newer advances in linkers such as one called mold take advantage of every CPU core available. 
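    A hedged back-of-the-envelope sketch of why a single-threaded linker matters on a high-core-count part: Amdahl's law caps the whole build if one stage can't parallelize. (The 90/10 compile/link split below is an assumption for illustration, not a measured figure.)

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work scales
    across cores; the serial remainder runs on one core regardless."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetically: 90% of build time is parallel compilation,
# 10% is a single-threaded link step.
capped = amdahl_speedup(0.90, 64)  # roughly 8.8x on a 64-core chip
ideal  = amdahl_speedup(1.00, 64)  # 64x if the linker scales too
```

    That gap between ~8.8x and 64x is the headroom a parallel linker like mold is chasing.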
     
    Like Linus said: this is targeted right at the end of the "I need a desktop system" market where desktop-class Xeons aren't performant enough, but sticking an Epyc Genoa or Xeon Gold system underneath someone's desk isn't practical. For workloads that are CPU-bound but multithreaded enough to use as much compute as you can get, these are nice big hammers for a very stubborn nail. 
     
    These also make sense outside the HEDT space specifically. Certain virtualization workloads will benefit from these sorts of chips, especially ones that need a few cores but access to a lot of RAM alongside some specialty hardware like a GPU or dedicated accelerator card.
  8. Like
    indrora got a reaction from dogwitch in It’s Back and I’m SO Excited! - Threadripper 7000   
    (Same post as above.)
  9. Funny
    indrora got a reaction from PinkPanther9949 in Turning Garbage into Gaming - Mom & Pop Computer Shop Part 2   
    Gotta give it to Jordan here:
     
    * Competent choice in processor, storage, GPU, RAM. I wouldn't have chosen that display, but what the heck, I'll go for it
    * Clearly hit 100fps in Crysis. 
    * Way cuter than Linus. (+1000 points)
  10. Funny
    indrora got a reaction from Needfuldoer in Turning Garbage into Gaming - Mom & Pop Computer Shop Part 2   
    (Same post as above.)
  11. Like
    indrora got a reaction from borisattva in I Bought a Gaming PC from EPSON   
    You might find one "junk" on Yahoo Auctions Japan. You'll definitely pay a premium getting it over to the states though. 
     
    SSD prices are still "relatively" the same on the Japanese market; yen for yen, you're getting WAY better bang for your buck going with spinning disk:

    At the scale that Epson and others are selling at, that spinning rust is far cheaper than SSDs, especially when you have... complicated relationships with China. Japan would much rather have something produced in Japan (as that chassis likely was) than something produced in China, unless it's going to be seen as inferior quality anyway.
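    The cost gap is easy to make concrete. A quick sketch with made-up shelf prices (the yen figures below are hypothetical, purely for illustration):

```python
def yen_per_gb(price_yen, capacity_gb):
    """Unit cost of storage in yen per gigabyte."""
    return price_yen / capacity_gb

# Hypothetical prices:
hdd = yen_per_gb(9_000, 2_000)    # 2 TB spinning disk at ¥9,000 -> ¥4.5/GB
ssd = yen_per_gb(12_000, 1_000)   # 1 TB SATA SSD at ¥12,000    -> ¥12.0/GB
ratio = ssd / hdd                 # the SSD costs ~2.7x more per gigabyte
```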

    I'm spitballing here, but I'd bet that the included drive was also a local brand: Toshiba or HGST. A Toshiba drive would absolutely make sense given the market and the consumer cost: 

    Most of my Japanese friends and coworkers have mentioned that they would rather go for something lower cost and slightly older than something higher cost and top of the line. Many of them have picked up the Xbox One X as it's become quite cheap and can still play many of the games the Xbox Series S can play. One even congratulated me on building a sleeper PC with slightly older parts instead of very high-end parts (I had a third-gen Ryzen chip floating around, plus an RTX 2070) and went as far as to say "you should have really gone for a 1650 if you were buying parts."
     
  12. Informative
    indrora got a reaction from williamcll in Roasting Chinese Gaming Setups   
    It's worth calling out that outlets in China are a... bit of a mixed bag at best. There are no fewer than three different kinds of outlet you can encounter on a daily basis, depending on where the thing was made, who it was made for, what kind of thing it is, etc. 
     
    You can encounter:
    * Type A plugs (two-prong ungrounded, similar to the US non-polarized ones)
    * Type C plugs (two round pins, aka the Europlug)
    * Type I plugs (two angled pins plus a ground pin, also used in Australia and other similar countries)
    Type A is also used in Japan, and Type I looks eerily similar to the US NEMA 5-15 when plugged into the wall. It's a mad rush, to the point that a power strip might well look like this:
     

  13. Funny
    indrora reacted to WhitetailAni in Fastest drive death you've encountered?   
    Does it count from its purchase date or when you received it?
    Because I had a 1999 Seagate ST33232A Compaq that lasted a while, but died within 4 months of me receiving it.
    In a similar vein, my other 160GB Seagate drive from my iMac died in 5 months, though it was partially my fault - I knocked the iMac over while I was installing OS X.
  14. Like
    indrora reacted to BlueChinchillaEatingDorito in Fastest drive death you've encountered?   
    Hole-in-zero: 32GB Corsair Voyager USB 3.0 RMA replacement died within a month. 
     
    No kidding those had a bad reputation of being utterly unreliable. 
  15. Like
    indrora got a reaction from Hawx in We fixed Windows 10 - Microsoft will HATE this!   
    After working in The Biz for a while, I have some knowledge of why.
    Any organization inside a software development house will eventually function like a bureaucracy; every organization falls into this despite hopes for pseudo-anarchy. In the case of QA, testers are often left to their own devices. This "just... tell us what's wrong" approach means there's a disconnect between developers and testers.
     
    From my understanding, Microsoft started encouraging developers inside the company to test releases through some form of internal "insiders program". We know this sort of thing exists because of folks like Jen Gentleman talking about it, but also from things like BuildFeed (RIP), where we saw a lot of internal branches; it's now well documented over on Wikipedia. This also means that broader bugs in Windows are caught by developers before most users ever see them.
    What this means is that Microsoft has shifted how they test Windows. Why? Strangely, because having independent QA teams leads to a horrible pattern called Tester-Driven Development. Having a QA team that exists only for QA means development focus constantly shifts to that team's beck and call. It also means that advancement within the QA team is almost entirely driven by your ability to spit out bugs for developers to fix... even if those bugs are meaningless or really symptoms of a deeper problem. This causes a lot of surface-level problems to get patched over (we've seen some of those throughout Windows' development history) and doesn't give developers enough time to go back and find root causes.
     
    Now, that's not to say that bugs don't happen, but here's what it takes for a bug to make it out:
    * It has to make it past the development team that is actively working on it (ballpark 20-30 people).
    * It has to make it past the "Canary" channel (nightly builds, probably 200-300 people).
    * It has to make it past the "Selfhost" channel (builds that have been approved by those 200-300 people).
    * It has to make it past the "Dev" channel, the first time it's seen outside of Microsoft (likely 3-4k people).
    * It has to make it past the "Microsoft" channel, where it's rolled out across workstations inside Microsoft (50-100k at minimum).
    * It has to make it past the "Beta" channel, the first time most consumer Insiders get to see it. This is the most unstable channel that is publicly available to users (ballpark 5 million people).
    * It has to make it past the "Release Preview" channel, the last channel before it goes live to the rest of the world (ballpark 3-4 million people).
    If you're curious how many people it takes inside Microsoft to get a thing out the door, read the blog posts "How Many Microsoft Employees Does It Take to Change a Lightbulb?" by Eric Lippert and "Thinking through a feature" by Raymond Chen, one of the longest-tenured developers on Windows. Both are probably some of the best looks into what it takes to get a change into Windows; swap "feature" for "bugfix" and the logic still holds.
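    To put that funnel in perspective, here's a quick sketch summing the post's own ballpark populations (midpoints picked arbitrarily from the ranges above):

```python
# Midpoint-ish ballparks for each ring a bug must survive, per the list above.
CHANNELS = [
    ("dev team",             25),
    ("Canary",              250),
    ("Selfhost",            250),
    ("Dev",               3_500),
    ("Microsoft",        75_000),
    ("Beta",          5_000_000),
    ("Release Preview", 3_500_000),
]

def total_exposure(channels):
    """Rough count of people who could hit a bug before general release."""
    return sum(count for _, count in channels)

exposure = total_exposure(CHANNELS)  # about 8.6 million pre-release users
```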
     
    And yet bugs still exist in Windows. So how minor does a bug need to be in order to not get squashed?
    Let's assume that a developer is worth, ballpark, $100/hr and it'll take 5-6 developers to get it fixed, tested, and reviewed before it goes to Canary. That's $500-600 per hour of development time, give or take, with a typical bug fix taking between 100-200 hours to properly diagnose and triage. That's $50,000-$120,000 in developer time for a single bug. That's not including the time of the PM, and god help you if there are localization problems.
    Now, how many users does that bug affect? Let's assume there are 800 million Windows users at any given time (since that's the last number we've gotten out of Microsoft). 100,000 affected users is only about 0.0125% of that. That's such a small fraction of people that it isn't worth the time (shy of someone getting a real bug up their ass) to go fix it: it's spending six figures on a thing that affects 0.0125% of the population when that same money could be spent dealing with a bug that affects 1,000,000 users (0.125%).
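    Spelling the arithmetic out (all inputs are the ballpark figures from the post itself):

```python
RATE_PER_HOUR = 100        # $/hr per developer
DEVS = (5, 6)              # developers needed for the fix
HOURS = (100, 200)         # hours to diagnose, fix, test, review

cost_low  = RATE_PER_HOUR * DEVS[0] * HOURS[0]   # $50,000
cost_high = RATE_PER_HOUR * DEVS[1] * HOURS[1]   # $120,000

WINDOWS_USERS = 800_000_000
pct_small = 100_000   / WINDOWS_USERS * 100      # 0.0125% of users
pct_big   = 1_000_000 / WINDOWS_USERS * 100      # 0.125% of users
```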
    The other issue is that, to fix a bug, they have to figure out which of any number of categories it falls into. These are all real situations that I know have caused intermittent bugs:
    * Is it a bad version of the Nvidia graphics driver that only gets shipped to users running older Quadro cards?
    * Is it a CPU instruction that isn't being interpreted correctly, so something ends up shoved into an off-by-one error?
    * Is it a race condition caused by a high-speed network being just right, such that a coil of 100ft of Cat5 sitting under someone's desk causes two threads to lose the race at the wrong time?
    * Is it caused by a buggy USB3 controller that was only used by one vendor for one generation of machine before the controller was updated due to a bug in the actual silicon, and thus it only occurs on that one specific generation of HP EliteBooks?
    * Is it caused by lower-quality HDMI cables getting interference during display EDID reading, which makes the display do something weird because Windows trusted the EDID values a little too much?
    * Is it a configuration that was otherwise uncommon but got promoted on some blog as a "really cool trick", and now causes people to lose their data because someone tried to be clever?
    Lastly, the developers often go in blind trying to figure out what's going on. They get a vague "when I click the Start button 1,000 times, the 1,001st time it doesn't render right." There could be any NUMBER of things that happen between 1,000 clicks of the Start menu: drivers can be updated, services stopped and restarted, is this one single sitting, did the monitor turn off, is there actually a monitor or is this over RDP, etc. etc. etc.
     
    Winding back: QA inside Microsoft was likely not all that useful, and it was getting in the way of shipping actual fixes.
  16. Like
    indrora got a reaction from Nathanpete in We fixed Windows 10 - Microsoft will HATE this!   
    (Same post as above.)
  17. Informative
    indrora got a reaction from Wiking22pl in We fixed Windows 10 - Microsoft will HATE this!   
    After working in The Biz for a while, I have some knowledge of why.
    Any organization inside a software development house will function like a bureaucracy, as any organization will eventually fall into this despite hopes for pseudoanarchy. In the case of QA, they're often left to their own devices. This "just... tell us what's wrong" approach means that there's a disconnect between developers and testers.
     
    From my understanding, Microsoft started encouraging developers inside to test releases through some form of internal "insiders program". We know that this sort of thing exists because of folks like Jen Gentleman talking about, but also things like BuildFeed (RIP) where we saw a lot of internal branches; it's now well documented over on wikipedia. This also means that broader bugs in Windows are caught by the developers before more users see it.
    What this means is that Microsoft has shifted how they test Windows. Why? Strangely, because having independent QA teams leads to a horrible pattern called Tester-Driven Development. Having a QA team that exists only for QA means that development focus shifts constantly to their beck and call. This also means that advancement within the QA team is almost entirely driven by your ability to spit out bugs for developers to fix... even if those bugs are meaningless or really caused by a deeper problem. This causes a lot of surface level problems to get patched over (we've seen some of those throughout Windows' development history) and doesn't give developers enough time to go back and find root causes.
     
    Now, that's not to say that bugs don't happen, but here's what it takes for a bug to make it out:
It has to make it past the development team that is actively working on it (ballpark 20-30 people).
It has to make it past the "Canary" channel (nightly builds, probably 200-300 people).
It has to make it past the "Selfhost" channel (builds that have been approved by those 200-300 people).
It has to make it past the "Dev" channel, the first time it's seen outside of Microsoft (likely 3-4k people).
It has to make it past the "Microsoft" channel, where it's rolled out across workstations inside Microsoft (50-100k at minimum).
It has to make it past the "Beta" channel, the first time most consumer Insiders see it. This is the most unstable channel that is publicly available to users. (Ballpark 5 million people.)
It has to make it past the "Release Preview" channel, the last channel before it goes live to the rest of the world (ballpark 3-4 million people).
If you're curious how many people it takes inside Microsoft to get a thing out the door, see the blog posts How Many Microsoft Employees Does It Take to Change a Lightbulb by Eric Lippert, a former Microsoft developer, and Thinking through a feature by Raymond Chen, one of the longest-tenured developers on Windows. Both are probably some of the best looks into what it takes to get a change into Windows; change "feature" to "bugfix" and the logic still holds.
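As a rough illustration of why this gauntlet still lets rare bugs through (the ring sizes below are the same ballpark guesses as above, not official Microsoft numbers), the chance that at least one person in a ring of n people hits a bug affecting a fraction p of all users is 1 - (1 - p)^n:

```python
# Chance that at least one user in a ring of n people hits a bug
# that affects a fraction p of the overall user base: 1 - (1 - p)^n.
# Ring sizes are ballpark guesses matching the list above.
rings = {
    "dev team":           25,
    "Canary":             250,
    "Dev channel":        3_500,
    "Microsoft internal": 75_000,
    "Beta":               5_000_000,
}

p = 100_000 / 800_000_000  # a bug hitting ~100k of ~800M users

for name, n in rings.items():
    hit = 1 - (1 - p) ** n
    print(f"{name:>20}: {hit:7.2%} chance anyone sees it")
```

Even a bug affecting 100,000 users is nearly invisible until the Beta ring, which is exactly how small-population bugs make it out the door.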
     
    And yet bugs still exist in Windows. So, how bad of a bug does it need to be in order to not get squashed?
Let's assume a developer is worth, ballpark, $100/hr, and it'll take 5-6 developers to get a bug fixed, tested, and reviewed before it goes to Canary. That's between $500-600 per hour of development time, give or take, with a typical bug fix taking between 100-200 hours to properly diagnose and triage. That's roughly $50,000-$120,000 in developer time for a single bug, and that's not including the time of the PM; god help you if there are localization problems.
Now, how many users does that bug affect? Let's assume there are 800 million Windows users at any given time (since that's the last number we've gotten out of Microsoft). 100,000 people is only about 0.0125% of users. That's such a small fraction of people that it isn't worth the time (short of someone getting a real bug up their ass) to go fix it: it's spending six figures on a thing that affects 0.0125% of the population when that same money could be spent dealing with a bug that affects 1,000,000 users (0.125%, ten times as many).
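For what it's worth, running those ballpark figures through the actual arithmetic (5-6 developers at $100/hr for 100-200 hours, against 800 million users — all of them rough estimates, not real Microsoft numbers):

```python
# Ballpark cost of fixing one bug, using the estimates above.
rate = 100            # $/hr per developer (rough estimate)
devs = (5, 6)         # engineers to fix, test, and review
hours = (100, 200)    # time to diagnose, fix, and triage

cost_low = devs[0] * rate * hours[0]    # 5 * 100 * 100 = $50,000
cost_high = devs[1] * rate * hours[1]   # 6 * 100 * 200 = $120,000

total_users = 800_000_000
affected = 100_000

print(f"developer time: ${cost_low:,}-${cost_high:,}")
print(f"users affected: {affected / total_users:.4%}")  # 0.0125%
```

A bug hitting ten times as many users (0.125% of the base) wins that triage argument every time.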
The other issue is that to fix a bug, they first have to figure out which of any number of categories it falls into. These are all real situations that I know have caused intermittent bugs:
Is it a bad version of the Nvidia graphics driver that only ships to users running older Quadro cards?
Is it a CPU instruction that isn't being interpreted correctly, so something gets shoved into an off-by-one error?
Is it a race condition caused by high-speed networks being just right, so that a coil of 100 ft of Cat5 sitting under someone's desk makes two threads lose the race at the wrong time?
Is it caused by a buggy USB3 controller that was only used by one vendor for one generation of machine before the bug in the actual silicon was fixed, and thus it only occurs on that one specific generation of HP EliteBooks?
Is it caused by lower-quality HDMI cables picking up interference during display EDID reading, making the display do something weird because Windows trusted the EDID values a little too much?
Is it a configuration that was otherwise uncommon but promoted on some blog as a "really cool trick," and now people lose their data because someone tried to be clever?
Lastly, the developers often go in blind trying to figure out what's going on. They get a vague "when I click the Start button 1,000 times, the 1,001st time it doesn't render right." Any number of things can happen between 1,000 clicks of the Start menu: drivers can be updated, services stopped and restarted, is this one single sitting, did the monitor turn off, is there actually a monitor or is this over RDP, etc. etc. etc.
     
Winding back: QA inside Microsoft was likely not all that useful, and was getting in the way of shipping actual fixes.
  18. Informative
    indrora got a reaction from pythonmegapixel in We fixed Windows 10 - Microsoft will HATE this!   
  19. Informative
    indrora got a reaction from thewelshbrummie in We fixed Windows 10 - Microsoft will HATE this!   
  20. Informative
    indrora got a reaction from Radium_Angel in We fixed Windows 10 - Microsoft will HATE this!   
  21. Like
    indrora got a reaction from Eschew in Japanese made AX2 case: The uber-customizable behemoth   
    Saw this as I was reading Twitter: http://www.shop-siomi.com/shopdetail/000000000043/001/X/page1/recommend/

     
An ATX case, made in Japan, that costs over US$450 alone, plus the cost of getting it out of Japan. Tired of those weakling screw-in standoffs? How about standoffs that can hold your motherboard so rigid a 3 lb air cooler can't weigh it down. GPU flex? Not on my watch!
     

     
This absolute UNIT of a case can sport a combination of disk layouts, with up to 43 small-size SSDs shoved into it; it can hold up to nine 5.25in bays (hotswaps for DAYS) and has mounting points for radiators like you wouldn't believe.
     

    Interestingly, they decided that a reset button wasn't useful, given the stability of modern computers, so that's gone: 

    But the adjustable LED brightness is something that could be seriously cool to see elsewhere in case designs. 
     
Oh, and it can come with casters... for less than Apple's casters cost!
  22. Agree
    indrora reacted to AndreiArgeanu in We made the SLOWEST NEW PC!   
Why is it that whenever there's an AMD CPU with Vega graphics, they never seem to take it seriously or give it a chance? What was the purpose of the GPU when you already had a better one on the CPU? And it isn't this video specifically, it's all of them, like the El-Cheapo at LTX, where LowSpecGamer came to fix the system since it didn't run memory in dual channel, which really affected its performance. From my perspective, AMD's APUs, anything from the Athlon to the 3400G, are just used as laughing stocks rather than being given a chance.
  23. Like
    indrora got a reaction from Nup in Was RTX a big scam? – Performance & image quality analysis   
    First off, I'm enjoying Anthony's style. Calm, collected, and a nice flip-side to the often energetic Linus. Linus is great for telling us about the Hot New Thing, but Anthony has a fantastic, slightly slow-burn style that works well for technical overviews. @GabenJr -- Please, pitch more of these sorts of things! Find a topic that interests you and just write a script, even if you end up putting it on a personal channel!
     
I think the conclusion here isn't as clean-cut. Was it a scam? Not really, in my opinion -- as Anthony said, Nvidia came to market first, and we're seeing the ripple effect of early adopters getting the kinks out. For those of us old enough to remember when 8x CD-ROM drives were all the rage, there was occasional debate over whether you needed anything better than 4x -- you'd already saturated the controller on your sound card (yes, kids, sound cards ran your CD drives back then!) -- and plenty of dismissal of the idea that you'd ever want more. After all, you'd have to slow down for the CD audio!
     
Thomas J. Watson, late head of IBM, is often quoted as saying there would be a market for maybe five computers in the world, with others making similar statements. In the '90s, there was debate over the need for 3D accelerators in pairs, over high-resolution textures and high framerates, over whether online gaming would ever catch on, and there was even a hot-take article in Boot Magazine that said, effectively, "3D games won't be popular, 2D board games will." In that same issue they ran the truly bad-in-hindsight spicy take from Alex St. John, one of the people behind DirectX and later the founder of WildTangent, that Java would die a painful death... and look now: one of the most successful games on the planet is written in Java (Minecraft), the most successful mobile OS on the planet runs on Java (Android), and inside nearly every ATM card and SIM card is a tiny version of Java (JavaCard), used in many places in the world for everything from banking to general account-management tasks!
     
I suspect it'll take a few years before we really get the hang of ray tracing on the GPU (how many people remember "VR" being the hot thing in marketing land? Now we're seeing sub-$1000 headsets from Lenovo, HP, Acer, etc.). It'll take time, tooling changes, and developers having time to work out the kinks. When CUDA and OpenCL came along, there were only marginal gains over traditional software processing, but now we're seeing more things running on the GPU than ever before.
  24. Funny
    indrora got a reaction from Fnige in ARIN revokes 757K+ fraudulently obtained IPv4 addresses   
Sources: BleepingComputer, Team ARIN, ZDNet
     
ARIN, the American Registry for Internet Numbers, is one of the organizations responsible for doling out IP addresses on the public internet. They've discovered that 757,760 IPv4 addresses (making up 0.02044% of the internet's publicly routable IPv4 addresses) were fraudulently obtained by an individual using a few shell companies. From the ARIN press release:
    Now, digging into the ZDNet article, it seems Mr. Golestan tried to actually tell ARIN that they're just being mean and to go away after making a tidy sum of cash:
    (emphasis mine)
     
     
A suspense thriller worthy of prime-time news, truly. The sad fact of the matter is that IPv4 is still what most enterprises are using, not IPv6. We were supposed to get IPv6 rolled out permanently and completely back in 2012, on World IPv6 Launch day, but of course we all sat on our thumbs and did nothing that day. That makes IPv4 addresses about as rare as meetings that couldn't have been an email. It takes some brass ones, though, to tell the company you've defrauded not to talk to you because they're the ones being mean. That really seems to have worked out for poor Mr. Golestan, eh?
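As a sanity check on that 0.02044% figure: assuming roughly 3.7 billion publicly routable IPv4 addresses (the 2^32 space minus reserved and special-use ranges; the exact total below is my assumption, not ARIN's published figure), the math works out:

```python
revoked = 757_760
# ~3.7 billion publicly routable IPv4 addresses (2^32 minus reserved
# space); this exact total is an assumption for illustration.
routable = 3_706_452_992

print(f"{revoked / routable:.5%}")  # about 0.02044%
```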
  25. Funny
    indrora got a reaction from 8uhbbhu8 in ARIN revokes 757K+ fraudulently obtained IPv4 addresses   