Mira Yurizaki

Member
  • Content Count

    17,070
  • Joined

  • Last visited

Awards


This user doesn't have any awards

About Mira Yurizaki

  • Title
    A robot of no importance

Profile Information

  • Location
    California
  • Interests
    Primarily technology, video games, anime, guns, and motorcycles.
  • Biography
    Tinkering with PCs for 15+ years. Developing software for 5+ years.

System

  • CPU
    AMD Ryzen 7 2700X
  • Motherboard
    MSI B450M Mortar Titanium
  • RAM
    2x8GB DDR4-3200 Corsair Vengeance LED
  • GPU
    EVGA GeForce GTX 1080 ACX 3.0 AC
  • Case
    NZXT H400i
  • Storage
    250GB Samsung 970 Evo, 1TB Crucial MX500, 1TB 2.5" Seagate Barracuda Pro
  • PSU
    Corsair RM550x
  • Display(s)
    ASUS PG279Q
  • Cooling
    Corsair H100i Pro
  • Keyboard
    Corsair K70 Lux
  • Mouse
    Logitech G603
  • Sound
    Sound Blaster Z -> Logitech Z906, Sennheiser HD6XX
  • Operating System
    Windows 10 Pro

Recent Profile Visitors

49,372 profile views
  1. Mira Yurizaki

    Will wiping my drive clear everything?

    It depends on what those scripts did. A "syntax error" just means the script wasn't written correctly, and whatever runs the script refuses to run it. Imagine having a super pedantic English teacher who refuses to grade your paper if you misspell even one word. A tiny example of what that looks like is below.
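    For example, this is what that looks like in Python (the file name and script are made up for illustration): a single missing parenthesis and the interpreter refuses to run any of it, even the parts that are fine.

      # broken.py -- note the missing closing parenthesis
      print("Hello, world!"

      # Trying to run this fails immediately with something along the lines of
      # "SyntaxError: '(' was never closed", and not a single line executes.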
  2. It doesn't really matter which language you learn first or second. All that matters is that you learn the basics of programming itself and know how to use those basics to solve problems.
  3. When it comes to making a program for a computer, there are a plethora of languages and ways to build an application, but there are only two basic types of both applications and languages. This post covers what those types are.

    Programming Language Types

    There are two basic types of programming languages: low level and high level.

    Low level languages are the "hardware" languages, which typically break down into two further types:
      • Machine language, which is literally the binary values being fed into the machine. A programmer can write out the 0s and 1s or some other representation of those numerical values, typically hexadecimal. A machine language instruction consists primarily of two parts: an opcode, which maps a number to an operation (e.g., 1 is add, 2 is subtract, 3 is jump somewhere else), and an operand, which tells the operation which datum or data to operate on.
      • Assembly language, which gives the opcodes mnemonic names and allows operands to have names as well, but otherwise maps directly to machine language (a toy sketch at the end of this post illustrates this).

    Low level languages tend to be architecture specific. Even if two architectures support the same operations, their opcodes may differ, and even if two architectures use the same assembly mnemonic, they may handle it differently, especially with the operands. Because of that, it's rare to write programs by hand in a low level language; it's typically reserved for architecture specific optimizations or for when you need to get down into the software baked into the hardware. The advantage of a low level language is that this is the fastest your program can run, since it talks directly to the hardware.

    High level languages attempt to provide a human readable way of representing program code. So instead of writing mov 10, x, you can write x = 10 or x is 10. This typically comes at a cost in speed, because what the higher level language is trying to accomplish has to be translated. Also, depending on the language, it may bar you from some higher performance features that are handy if used correctly but disastrous if used incorrectly.

    One could also argue that some high level languages are really "mid-level" languages, in that they lack many modern conveniences and map readily onto assembly language. For example, C is sometimes dubbed a "mid-level" language because of how bare-bones it is and how easily it compiles into a low level language. This is in contrast to, say, Python, which comes with many more features and doesn't readily compile into a low level language.

    No matter the language type, everything except machine code has to be turned into some form of machine readable code. (Even machine code may be represented as a text file, such as Intel HEX, which has to be read by a loader to be runnable.) For those looking for specific terms: if the source code being turned into machine code is assembly language, the program that does this is called an assembler. If the source code is a high level language, there are various ways of getting it closer to machine readable code, with the three main ones being:
      • Ahead-of-Time compiling (AoT): the source code is compiled into an executable for loading and running. This generally allows for the fastest execution. Examples of normally AoT compiled languages are C and C++.
      • Just-In-Time compiling (JIT): the source code is compiled into an intermediate form, and when the program is run, this intermediate form is compiled to machine code as needed. Examples of normally JIT compiled languages are Java and the .NET family (C# and VB.NET).
      • Interpreting: the source code is read and executed line by line. To help speed things up, it may be JIT compiled instead. Examples of interpreted languages are BASIC, Python, and JavaScript.

    Computer Language Types

    You might be familiar with a lot of things that are "languages" telling a computer what to do, but these break down into various categories based on their intended role. The main ones you usually encounter are:
      • Programming languages: These describe, obviously enough, computer programs. While that's a broad description, I tend to think of a program as something that runs directly on the hardware, i.e., the instructions represent something the actual hardware is capable of doing.
      • Scripting languages: These differ from programming languages in that they are meant to be run from within a program to automate tasks or manipulate something about the program itself. Think of JavaScript on web pages: the JavaScript is run by the web browser to manipulate the web application, but not all of it represents what the machine itself does. e.g., a machine doesn't know what a button is, but a web browser does.
      • Configuration languages: These are data containers that store, well, configurations that programs use to set parameters. A popular modern example is the JSON format, because it's human readable and easy for programs to parse (see the second sketch at the end of this post).
      • Markup languages: These describe a document and how it should look. The name comes from "marking up" a paper in editing.

    For all intents and purposes, you could argue all of these are some sort of "programming" language, since you are telling a computer what to do. But if you want to be a snob about what counts as a "real" programming language: if a language is Turing complete, it can express any program conceivable. Virtually all programming and scripting languages are Turing complete. So what isn't Turing complete? Configuration and markup languages, some query based languages like SQL, and some other ways of changing the behavior of an action, such as regular expressions.

    Computer Application Types

    Computer applications break down into two main types:
      • Application programs: An application program, or just application, provides a service to the user. Examples include web browsers, document editors, and media players. As you might guess, this is where the term "app" comes from. While applications today are typically written in higher level languages, earlier ones had to be written in lower level languages.
      • System programs: A system program provides a service to applications. Examples include firmware, hardware drivers, and to a strong degree, operating systems. They typically don't have a user facing interface, relying on the user to edit configuration files to change their behavior. If a system program does have a user interface, it may be the only program running on the system (such as the UEFI/BIOS settings), or the interface may be decoupled from the core components of the program itself (such as the GUI environment running on top of the OS kernel).
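    To make the opcode/mnemonic/operand idea concrete, here's a toy sketch in Python. To be clear, the "instruction set" below is completely made up for illustration; the opcodes, mnemonics, and helper names (assemble, run) don't correspond to any real architecture or tool.

      # Made-up opcodes: 1 = LOAD (put a constant in a register),
      #                  2 = ADD  (add a constant to a register)
      MNEMONIC_TO_OPCODE = {"LOAD": 1, "ADD": 2}

      def assemble(lines):
          """Toy 'assembler': turn mnemonic lines into numeric machine code."""
          program = []
          for line in lines:
              mnemonic, register, value = line.split()
              program.append((MNEMONIC_TO_OPCODE[mnemonic], int(register), int(value)))
          return program

      def run(program, registers):
          """Toy 'CPU': execute the numeric opcodes against a register file."""
          for opcode, register, value in program:
              if opcode == 1:    # LOAD
                  registers[register] = value
              elif opcode == 2:  # ADD
                  registers[register] += value
          return registers

      # "Assembly" version of the high level statement x = 10 + 5,
      # with register 0 standing in for the variable x.
      assembly = ["LOAD 0 10", "ADD 0 5"]
      machine_code = assemble(assembly)        # [(1, 0, 10), (2, 0, 5)]
      print(run(machine_code, registers=[0]))  # [15]

      # The high level version: one readable line, no opcodes in sight.
      x = 10 + 5
      print(x)                                 # 15

    And since configuration languages came up, here's a similarly minimal sketch of a program pulling its parameters from JSON instead of hard-coding them; the keys and values are invented for the example.

      import json

      # In practice this text would come from a file the user can edit.
      config_text = '{"resolution": [1920, 1080], "vsync": true, "fov": 90}'
      config = json.loads(config_text)
      print(config["resolution"], config["vsync"], config["fov"])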
  4. @TheDankKoosh The problem is you can't just get rid of two stacks and call it a day. Getting rid of two stacks means you lose half the bandwidth, because GPU memory is set up so that one memory chip is a memory channel (see the rough math below). You'd have to build 4 stacks at half density or something, and that's not going to save you a lot of money in the manufacturing process.
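    As a rough illustration of the scaling: each HBM stack brings its own slice of the memory bus, so peak bandwidth goes up and down with the stack count. The figures below (1024-bit bus per stack, 2.0 Gbps per pin) are typical HBM2 numbers picked purely for the example, not any specific product.

      # Peak theoretical memory bandwidth: bus width (bits) * per-pin data rate (Gbps) / 8
      def peak_bandwidth_gb_s(stacks, bus_bits_per_stack=1024, gbps_per_pin=2.0):
          return stacks * bus_bits_per_stack * gbps_per_pin / 8

      print(peak_bandwidth_gb_s(4))  # 1024.0 GB/s with four stacks
      print(peak_bandwidth_gb_s(2))  # 512.0 GB/s -- halve the stacks, halve the bandwidth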
  5. Where are you getting that figure from?
  6. If DLSS is a post processing technique, then the render is already a 2D image. So no, in theory it shouldn't be any more complicated than upscaling any other image. I can see some issues with the output producing a result that could confuse the upscaler, but that's about it. The thing that gets me is that in a game there's an impractically large number of scenes and angles to choose from. It's easy for a tech demo or a benchmark to have DLSS applied, because it's nothing more than a movie where the frames are generated on the fly and it's going to be the same frames every time.
  7. NVIDIA could've asked a lot of publishers beforehand to send in some representative samples. I'm not talking about DLSS though. I'm talking about the AI that these people used to upscale 2D images without a reference point... because there is no reference point available. And while it was pointed out that the Doom one needed cleanup, the Morrowind one made no mention of this. Also, people, I'm aware of how a neural AI works. What threw me off is that NVIDIA seemed to advertise DLSS as something packaged and ready to go, rather than an entire system with a process to it.
  8. The point is that these AIs can upscale them to a satisfactory level without knowing what the upscaled image is supposed to look like, making them suitable for generic use. The only question is how long it took the AI to spit out the image.
  9. The white paper says this, though: the section in its entirety says nothing explicit about needing reference images from the game itself. Because how do you get results like this (from https://www.doomworld.com/forum/topic/99021-v-0-95-doom-neural-upscale-2x/), or this (from https://www.nexusmods.com/morrowind/mods/46221?tab=description), or this, when the AI doesn't even have a reference point to begin with other than a low resolution image? Aside from that, you're also telling me that NVIDIA has a datacenter sitting around somewhere for crunching DLSS reference images on request? If you want any reason why RTX is expensive, there's your answer. That datacenter ain't paying for itself. I mean, if this is how they're doing it, then okay, it's dumb as hell.
  10. I'm pretty sure what they meant in their whitepaper was that they trained the AI on a set of reference images that were not related to the game. It doesn't make sense to require the AI to train on images from the game to make the game look better. People who've used neural networks to enhance 2D images certainly didn't start from a higher resolution version of that image. I don't see how this can't apply to 3D renders, especially since DLSS is likely working on the final output... which is a 2D image.
  11. Because the store paid a pretty penny to have it on its shelf and they're not going to sell it at a loss unless they're desperate to get rid of it.
  12. I don't really care about talking about the history itself; it's more about making sure people aren't just spouting random "facts," especially when people seem to conveniently forget history for the sake of bolstering their argument. Nitpicking here, but as far as producing the image itself goes, the overall feature set of DX11 and DX12 is identical, with the exception of DXR. So, speculating here, the only uplift you'd get from DX11 to DX12 is the creation of multiple queues, so GPUs that can do asynchronous compute can do so. But if the developer didn't even bother with that, then yeah, that's more of a marketing move.
  13. Mira Yurizaki

    How Does LMG Name Their Servers?

    Whonnock, Fraser, and Clover are named after places/landmarks around British Columbia, Canada.
  14. DLSS should be a driver wide thing, like MFAA. I don't know what NVIDIA is doing making it a "must be explicitly supported" thing.
  15. Mira Yurizaki

    eGPU Advice

    Mini-PCIe slots on laptops have a single lane, and it's likely PCIe 2.0 at best. So most GPUs are going to be severely hampered by the restricted bandwidth. The laptop in Linus's video used an M.2 slot capable of delivering PCIe 3.0 x4. A quick back-of-the-envelope comparison of the two is below.
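    For a sense of how big that gap is, here's the per-direction bandwidth of the two links, using the standard PCIe 2.0/3.0 transfer rates and encoding overheads:

      # Per-direction PCIe bandwidth: lanes * transfer rate (GT/s) * encoding efficiency / 8
      def pcie_bandwidth_gb_s(lanes, gt_per_s, encoding_efficiency):
          return lanes * gt_per_s * encoding_efficiency / 8

      print(pcie_bandwidth_gb_s(1, 5.0, 8 / 10))     # PCIe 2.0 x1: 0.5 GB/s
      print(pcie_bandwidth_gb_s(4, 8.0, 128 / 130))  # PCIe 3.0 x4: ~3.94 GB/s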