indrora

Member
  • Content Count

    46

About indrora

  • Title
    Member


  1. indrora

    Building my Teenage Dream PC - Ricer PC Part 1

    Oh man. Brings back some memories. Definitely gotta give a few classic games from the mid-aughts LAN days a shot:
      • Serious Sam
      • ARMA
      • Battlefield 1942
      • Portal
      • Far Cry
      • GTA
    Just to name a few. Portal was way on the high side of that era, but oh boy, if you had a couple copies of Serious Sam kicking around, you were the go-to friend at LAN parties.
  2. indrora

    The Supercar Gaming PC - Aventum X Review

    You're very much not its target audience. This is for the person with more dollars than sense. This is the "I paid $fuckyou for my computer because I can" crowd, a few steps above the person who buys a Porsche 911 Sharkwerks to daily-drive to the grocery store because fuck you, then takes it to Motorama and puts a little "RARE ONE OF A KIND" plaque on it to show that they pay for Twinkies with a platinum card. Where the typical system integrator machine is a Mazda Miata, this is a custom-baked supercar that costs more than the GDP of a small country because my pants don't bulge enough and I need to compensate at the next Fortnite invitational.
  3. I'm not talking about the camera trick that produces funky-looking images, I'm talking about the HDR we get in games. It also doesn't inherently need multiple exposure samples (you can do a certain amount of HDR with RAW DNG and compensation techniques, a cheater HDR that some photographers use when handling moving subjects and wanting to avoid ghosts). No, I'm talking about games supporting HDR. Some of the first were in Unreal 3 and an early demo (HL2: Lost Coast), and a lot of the work was done by keeping two textures around and applying a bloom shader effect to the bright bits of the scene. It wasn't real HDR, but it was close enough. For an interesting look at how Valve cheated, they talked about it at SIGGRAPH 06, but the basic gist is that they're not really doing the raytraced HDR that you would want to do (and which RTX/DXR enables today); instead they pre-baked all their HDR stuff before rendering it fully.
     The long and short of it is thus:
      • Microsoft released DXR into the world
      • nVidia then released their RTX cards, which have hardware support for DXR
      • A few games kinda-sorta rushed DXR support in
      • nVidia announced they're kinda supporting DXR on GTX 10-series cards through CUDA cores, but that it comes with a significant performance hit
      • Unreal Engine is slowly adding baked-in support for DXR
      • People get angry that there's a performance hit on GTX cards
      • Everyone completely loses their mind and the rational folk in the room apply Hanlon's Razor
     Look where we are now. PhysX needed special physics cores that were later integrated into GPUs by nVidia and then replaced with CUDA GPGPU kernels -- in fact, it wasn't until much later that GPU support was added. Only two games supported the accelerator card anywhere near its launch. When GPU support was added (which required quite high-spec cards), only five games supported PhysX on the GPU. Was PhysX a scam too? You could later run the physics simulations on the CPU, but your framerate would tank.
     If we really apply the logic that's been going on, the Voodoo2 cards were a scam! There were a handful of games that depended on the Voodoo2 in order to provide 3D-accelerated graphics and later got software rendering support. Hell, we can even go back as far as the Intel 8087 math coprocessor, since there were games that demanded it but worked fine after patching that out... but with tanked framerates because of the integer approximations. Hmmm... It's like we've been down this path before.
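     Since the "keep a bright-pass texture around and bloom it" trick comes up a lot in these arguments, here's roughly what that cheat looks like. This is a minimal numpy sketch of a generic bright-pass bloom, not Valve's actual SIGGRAPH-described pipeline; the threshold, strength, and box blur are all stand-ins.

```python
# Toy sketch of faking HDR on an LDR frame: keep a "bright pass" copy of the
# scene, smear it out, and add it back on top. Not Valve's implementation,
# just the general shape of the cheat.
import numpy as np

def box_blur(img, radius=4):
    """Crude separable box blur standing in for the blur pass of a bloom shader."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    for axis in (0, 1):  # blur along height, then width, per channel
        img = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), axis, img)
    return img

def fake_hdr_bloom(frame, threshold=0.8, strength=0.6):
    """frame: float RGB array in [0, 1], shape (H, W, 3)."""
    bright_pass = np.clip(frame - threshold, 0.0, None)  # the "second texture"
    bloom = box_blur(bright_pass)                        # smear the bright bits
    return np.clip(frame + strength * bloom, 0.0, 1.0)   # composite back over the scene

if __name__ == "__main__":
    frame = np.random.rand(64, 64, 3).astype(np.float32)  # stand-in for a rendered frame
    print(fake_hdr_bloom(frame).shape)
```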
  4. You're, uh, new to this whole "cost of new tech" thing, aren't you? The first generation of something that hasn't really been done before (remember that this is not GPGPU -- this is not CUDA -- this is adding special silicon for things like Tensor cores and ray tracing) will be more expensive. This isn't re-purposing texture shaders for something else, this is literally dedicated, first-generation silicon. There's R&D cost that has to be recouped somewhere. That cost goes to you, the consumer. When CUDA came out, the cards cost more. Hell, dedicated compute cards like the nVidia Titan still cost a pretty penny.
     I remember when you had to make sure that the game you were buying supported your sound card. Nothing like getting home from Fry's only to realize that your fancy new game (now out of the shrink wrap) doesn't support AdLib except for a few crappy MIDI renditions of the soundtrack because it was built for the SoundBlaster's PCM and FM channels, so you go and buy a SoundBlaster only to realize that it's the same MIDI renditions, just with slightly better-sounding instruments.
     Today, there's little difference: software support is still getting there, and the first few generations of software and hardware are going to be... spotty, if not a little lacklustre. Remember when HDR was new? And how we all oohed and ahhed when it was all over our faces, only to realize it was just bloom and some subjective lighting? How about when everybody realized Doom wasn't really 3D? Or when you really, honestly needed an SLI setup to make Unreal work right?
  5. Source: ZDNet, AllAboutChromebooks (image from AllAboutChromebooks)
     Google has apparently killed their (hiding in plain sight) skunkworks project to run Windows and Linux on Chromebooks under the name "Project Campfire" or "AltOS". From ZDNet: The project would have allowed Chromebooks to officially dual boot Windows or Linux alongside ChromeOS.
     Chalk another one up for the Google Graveyard. This certainly could have been cool, given that dual booting currently requires quite a bit of shenanigans but is quite doable by the enthusiast... On the other hand, we've (narrowly) avoided a brewing OtherOS fiasco.
  6. Source: BleepingComputer, Team ARIN, ZDNet
     ARIN, the American Registry for Internet Numbers, is one of the organizations responsible for doling out IP addresses on the public internet. They've discovered that 757,760 IPv4 addresses were fraudulently allocated (making up 0.02044% of the internet's publicly routable IPv4 addresses) by an individual with a few shell companies. From the ARIN press release:
     Now, digging into the ZDNet article, it seems Mr. Golestan tried to actually tell ARIN that they're just being mean and to go away, after making a tidy sum of cash: (emphasis mine)
     A suspense thriller worthy of prime-time news, truly. The sad fact of the matter is that IPv4 is still what most enterprises are using, not IPv6. We were supposed to get IPv6 rolled out permanently and completely back in 2012, on World IPv6 Day, but of course we all sat on our thumbs and did nothing that day. This puts IPv4 addresses somewhere in rarity comparable to meetings that couldn't have been an email.
     It takes some brass ones, though, to tell the organization you've defrauded not to talk to you because they're the ones being mean. That really seems to have worked out for poor Mr. Golestan, eh?
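     For scale: the 0.02044% figure only works out if you measure against the publicly routable slice of the IPv4 space rather than all 2^32 addresses. A quick back-of-the-envelope check (the ~3.7 billion routable estimate is my assumption, not a number from the articles):

```python
# Back-of-the-envelope check on the quoted 0.02044% figure. The routable
# estimate (2^32 minus reserved ranges like 10/8, 127/8, 224/4, ...) is an
# assumption on my part, not taken from the ARIN release.
fraudulent = 757_760
total_ipv4 = 2 ** 32                  # every possible IPv4 address
routable_estimate = 3_706_452_992     # rough publicly-routable count

print(f"vs. all of IPv4:       {fraudulent / total_ipv4:.5%}")         # ~0.01764%
print(f"vs. routable estimate: {fraudulent / routable_estimate:.5%}")  # ~0.02044%
```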
  7. Source: ZDNet
     Several major American phone companies have written a joint letter to the FCC stating that they're no longer going to open new third-party location-sharing agreements and are terminating existing ones. From ZDNet:
     AT&T claims they are only sharing customer location data with E911 services and services such as ride-sharing applications, but that these, according to the article, are limited to "legal and highly controlled cases".
     So, then -- what's my take on this? If this is really happening, then I think there might be a net good that comes of it from the perspective of control of personally identifiable information and the like, especially given how cheap and easy it is to track someone today if you know their phone number. On the other hand, we're trusting the telcos not to be lying through their teeth to the FCC (and, by extension, the public), and not simultaneously handing all this information to insert-your-favorite-intelligence-agency-here. Because of course they wouldn't do that, would they? I doubt there will be any accountability for any of this, and that's my big concern here.
  8. First off, I'm enjoying Anthony's style. Calm, collected, and a nice flip side to the often energetic Linus. Linus is great for telling us about the Hot New Thing, but Anthony has a fantastic, slightly slow-burn style that works well for technical overviews. @GabenJr -- please, pitch more of these sorts of things! Find a topic that interests you and just write a script, even if you end up putting it on a personal channel!
     I think the conclusion here isn't as clean-cut. Was it a scam? Not really, in my opinion -- as Anthony said, NVidia came first to market and we're seeing the ripple effect of early adopters getting the kinks out. For those of us old enough to remember when 8x CD-ROM drives were the hot rage: I remember the occasional discussion about whether you needed anything better than 4x -- you'd already saturated the controller on your sound card (yes, kids, sound cards ran your CD drives back then!) -- and there was dismissal of whether you'd even want it. After all, you'd have to slow down for the CD audio!
     Thomas J. Watson, late head of IBM, is often quoted as saying that there would be a market for maybe 5 computers in the world, with others making similar statements. In the 90s, there was debate over the need for 3D accelerators in pairs, for high-resolution textures, for high framerates, over whether online gaming would ever catch on, and there was even a hot-take article in Boot Magazine that said, effectively, "3D games won't be popular, 2D board games will." In that same issue, they even had the truly bad-in-hindsight spicy take from Alex St. John, one of the people behind DirectX and later the founder of WildTangent, that Java would die a painful death... and look now: one of the most successful games on the planet is written in Java (Minecraft), the most successful mobile OS on the planet runs on Java (Android), and inside most every ATM card and SIM card is a tiny version of Java (JavaCard), used in many places in the world for everything from banking to general account management tasks!
     I suspect it'll take a few years before we really get the hang of ray tracing on the GPU (how many people remember "VR" being the hot thing in marketing land? And now we're seeing sub-$1000 headsets from Lenovo, HP, Acer, etc.). It'll take time, tooling changes, and developers having time to work out the kinks. When CUDA and OpenCL came along, there were only marginal gains over traditional software processing, but now we're seeing more things running on the GPU than ever before.
  9. Lots of reasons. It's useful to remember that emoji are, internally, characters just like the Latin letter a, the Greek letter ζ, the Cyrillic letter Д, the katakana ヅ, or the Arabic letter ك. There are a lot of things in Conhost (the process that hosts the shell itself) that have been broken in the past. For instance, for a long time, if your system was in English (US) and your WSL environment happened to be in English (Germany), all sorts of tools would produce garbage until you changed your Windows locale to match.
     So, the reasons to bring up full Unicode support in the terminal:
      • Lots of dev tools use emoji, for a variety of reasons. Python and Node tooling especially love emoji, since the languages have native support for any Unicode character. Swift as well, since you can name a variable with an emoji.
      • Mixed-locale software is easier to use in WSL. Tools like ssh and such work fine.
      • Japanese users don't need a butchered environment where you have to mentally translate from \ to ¥.
      • Japanese text renders correctly in mixed-width (half/fullwidth) environments.
      • Bidirectional users can have working bidirectional text (e.g. Arabic, Hebrew, etc.).
      • PowerShell cmdlets for Active Directory can now easily take the name "Martin Grüßer" without silly hacks to make it work.
     Emoji here are really a happy side effect of having full Unicode support and fallback font rendering. This means tools like posh-git can use arrows and other Unicode characters when you are ahead/behind/etc.
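     To make the "it's just characters" point concrete, here's a throwaway Python sketch (standard library only, nothing Conhost- or WSL-specific) that prints a few of those codepoints along with the East Asian width property a terminal has to respect for half/fullwidth rendering:

```python
# A taste of what "full Unicode support" actually has to handle: mixed scripts,
# emoji, and characters that render at different cell widths.
import unicodedata

samples = ["a", "ζ", "Д", "ヅ", "ك", "🐍"]

for ch in samples:
    print(
        ch,
        f"U+{ord(ch):04X}",
        unicodedata.name(ch, "<unnamed>"),
        "width:", unicodedata.east_asian_width(ch),  # 'W' = wide, 'N'/'Na' = narrow
    )

# A cmdlet or prompt that handles Unicode properly shouldn't blink at this either,
# e.g. posh-git-style ahead/behind arrows next to a name with non-ASCII letters:
print("Martin Grüßer  ↑2 ↓1")
```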
  10. Most consumer electronics with switching power supplies don't give a damn about voltage or frequency now. Look at your laptop's PSU; it'll look something like this: This object can be plugged into a local outlet anywhere in the world and it'll probably work. There's some weirdness with stuff that isn't natively able to auto-adjust between 110 and 220 V, but those objects typically have a voltage switch (e.g. power supplies meant for LED displays). This is why these travel adapters work; there are no smarts in them, they're just fit-all plugs that mostly make enough connection to work.
      Literally the best thing to take with you to another country is a C5/C6 grounded cable (or a C7 ungrounded cable) with the local plug. Your laptop PSU doesn't give a damn. If it's got the CE (Conformité Européenne) mark, it's meant to be sold in Europe. Going to Britain? Take one of these: Going elsewhere in Europe? They're like $10, plus they're a lot safer. Plus, a nice long one (I like 2-3 m) means you're never searching for an outlet, because there's always an outlet if you're brave enough to stand on chairs.
      I hacked an IEC connector onto my old fixed-plug laptop and enshrined it in clear resin to make it look "official", and folks really didn't give me a hard time.
  11. Since someone mentioned the UK plugs, I figured I'd post this: The multi-adapters are terrifying as well. The one that Linus actually showed in the video is __amazing__ because it includes a feature that most people didn't even know about: in the UK or parts of Europe, it turns into a free lamp tester! The ever-wonderful Big Clive talks about them here:
      One of the safety things that we in the US are more and more commonly seeing (and something that to this day saves lives and eliminates those nasty shocks that Linus so hates) is the RCD. They've been part of the US National Electrical Code for some time now and are super cool, and Technology Connections has a wonderful video on how they work over here:
  12. indrora

    LTT: We bought a cheap SSD from Ali Express..

    One of my machines has one of their, and this is an abomination unto itself, mini PCIe PATA IDE SSDs. You read that right: an IDE PCIe SSD. What's the machine, you might ask? A Dell Mini 9, a netbook that I just use as a note-taking laptop. 10/10, nobody messes around with a large man who carries a candy-pink netbook.
  13. indrora

    Does DBAN just lifespan?

    Don't DBAN SSDs. It's not a good idea and will quickly rip through an SSD's lifespan. DBAN works by writing random patterns to the disk, ignoring partitions and filesystems and all that jazz. On a spinning hard drive that's perfectly safe (and might even extend the life of the drive in some very limited cases, but those are rare); on an SSD it just burns write cycles. If you're just concerned about cleaning a disk up, I keep Ubuntu around so I can use ATA Secure Erase, or you can remove all partitions on the disk with Disk Management and create one partition doing a non-quick format (this writes zeros to every block that the partition spans).
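    For reference, the ATA Secure Erase dance from a Linux live environment usually looks roughly like the sketch below. The device path and throwaway password are placeholders, and you should confirm the drive isn't reported as "frozen" in the hdparm -I output first; this wipes the entire drive, so treat it as a sketch, not a script to paste blindly.

```python
# Rough sketch of ATA Secure Erase via hdparm from a Linux live environment.
# DEVICE and TEMP_PASSWORD are placeholders. Check that `hdparm -I` shows the
# security feature set as supported and "not frozen" before doing this.
# This erases the whole drive.
import subprocess

DEVICE = "/dev/sdX"        # placeholder -- point this at the right drive!
TEMP_PASSWORD = "wipeit"   # throwaway; the erase clears the password again

def run(*args):
    print("+", " ".join(args))
    subprocess.run(args, check=True)

# 1. Inspect the drive's security state.
run("hdparm", "-I", DEVICE)

# 2. Set a temporary user password to enable the security feature set.
run("hdparm", "--user-master", "u", "--security-set-pass", TEMP_PASSWORD, DEVICE)

# 3. Issue the erase (swap in --security-erase-enhanced if the drive supports it).
run("hdparm", "--user-master", "u", "--security-erase", TEMP_PASSWORD, DEVICE)
```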
  14. You literally never know what you're getting sometimes. I've bought batteries that claimed to be unbranded but turned out to have branding slathered all over them. Worse, sometimes you buy what you think are legit parts only to find out that they're not. He ends up with counterfeits all the time and has regularly talked about how he spots them.
  15. They in fact did. It's called a Universal Binary, which (confusingly) also refers to 32-bit/64-bit fat binaries. From Wikipedia:
      Mac OS X 10.0 didn't ship with Intel support, by the way; it wouldn't be until 10.4 that Intel support would be a thing. In 2006, they were seeding out $1k developer devices called the Developer Transition Platforms to select developers: PCs shoved into PowerMac G5 cases with a custom TPM. They were almost all returned within the year, per the Apple agreements. For software from the 68k era that didn't have the same fat binary format, Apple included the Mac 68k emulator in OS X up until 10.5, which was mostly known as Classic Mode. In 10.4 for Intel Macs, the emulation layer you're referring to was released, called Rosetta. Later in the PowerPC life cycle, there were articles about trimming out the PowerPC (and later 32-bit) versions of the executables, as well as removing non-native language support.
      It's not all that hard to move from one architecture to another when the underlying OS stays the same in terms of programming interfaces, for a vast majority of software. For a vast majority of software written in Objective-C on the platform, it's a matter of "recompile and done". When moving to a new platform with a standard interface in place, you don't need to rewrite things in the vast majority of cases. It's a matter of checking that you're not doing anything ultra-specific to the processor (e.g. hand-rolled assembly) and that you're not making any major assumptions about the architecture. The reason that Office isn't 64-bit native is that Word documents pre-DOCX are literally dumps of the Word memory structure with some numbers fixed up, then un-fixed when you load them later. Backwards compatibility sucks some days, doesn't it?
      The fact that something was on Windows made no difference during the Apple transition from PowerPC to Intel. What mattered was that Apple released a compiler where you could just go "change this setting in Xcode and boom, you get Intel and PowerPC support, as long as you aren't touching any funny hardware, which you shouldn't be; you should be using the OS-provided things." For stuff that was really heavily AltiVec-reliant, there was some work to rework that for Intel, or just turn off the AltiVec support and use the non-AltiVec version, or release a non-universal binary that suffered performance hits.
      Microsoft has already made it just as easy: for the vast majority of software that needs native performance, the change is just "target ARM, not x86" under the compiler options, and for everything else, you'll have to replace your silly "we wrote it in x86 assembler so it's like 3% faster" hack with something in ARM assembler, or rewrite it in C so you're letting the compiler do the work.
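      As a loose illustration of the "don't make assumptions about the architecture" point (in Python rather than C/Objective-C, so it's an analogy, not the actual Xcode or MSVC workflow): code that bakes in the host's byte order is the moral equivalent of that hand-rolled assembly hack, while pinning the byte order explicitly keeps the same source correct on x86, PowerPC, or ARM.

```python
# Toy example: native byte order is an architecture assumption; an explicit
# byte order in the format string is portable across architectures.
import platform
import struct
import sys

print("running on:", platform.machine(), "| native byte order:", sys.byteorder)

value = 0xDEADBEEF

# Fragile: "@I" packs with the host's native endianness and alignment.
native_bytes = struct.pack("@I", value)

# Portable: "<I" pins the value to little-endian no matter what CPU runs this.
portable_bytes = struct.pack("<I", value)

print("native:  ", native_bytes.hex())
print("portable:", portable_bytes.hex())
```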