
Raytsou

Member
  • Posts

    217
  • Joined

  • Last visited

Everything posted by Raytsou

  1. so anything your application prints, such as logs, goes to stdout (error messages conventionally go to stderr instead). If you're in the terminal, it will display in the console. You can redirect stdout to go somewhere else, such as using `>` to overwrite a file or `>>` to append to one. So if this is a short script one might do `./script.sh >> log.txt`.
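Quick illustration in Python (the script and log filenames are just placeholders):

```python
import sys

# normal output goes to stdout; errors conventionally go to stderr
print("processed 42 records")                    # -> stdout
print("failed to open config", file=sys.stderr)  # -> stderr, not captured by > or >>

# run it as:
#   python script.py >  log.txt       # overwrite log.txt with stdout
#   python script.py >> log.txt       # append stdout to log.txt
#   python script.py >> log.txt 2>&1  # capture stderr as well
```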
  2. Newegg is cheaper, but if anything goes wrong (and they have a track record of fucking up half their orders nowadays), customer service will give you the runaround and do everything in their power to blame you, up to and including calling you a liar. Stay away from them; they used to be good until they got bought out.
  3. Apple's marketing strategy is to create a walled garden ecosystem. As long as you exclusively use apple for your tech products, everything "just works". To discourage people from trying other things, apple generally makes the cross-platform user experience absolute dog shit (sometimes intentionally, but usually by just not putting any engineering resources there). The way they gain users is by partnering with schools to teach kids how to use apple products from a young age, and part of this partnership is that the schools get free ipads/MacBooks in exchange for not allowing the kids to be exposed to non-apple products. The downside of this is that it's extremely hard to get into the ecosystem if you're already used to non-apple devices. You have three options: fully commit to apple devices, never touch apple devices ever again, or just live with being frustrated all the time.
  4. XM4s are great. Also consider the Bose QC45; they have differences but basically fill the same role. If you don't care about ANC, consider the ATH-M50xBT, although they're quite lacking in bass imo.
  5. Oh, I didn't even realize that's a feature. I think you should first figure out whether it's a software issue or a hardware issue. But assuming it's software, try stopping Windows from automatically installing drivers, then go into Device Manager and uninstall all audio devices, then try installing your mobo's Realtek drivers again. Remember to re-enable Windows driver updates afterwards.
  6. Have you downloaded the audio drivers from your motherboard's support page? Also, not sure why you used DDU; it manages display drivers, and the only audio drivers it touches would be the HDMI audio output from your GPU.
  7. You should read up on Python deep copy vs shallow copy. Basically, Python variables are references, so setting list A equal to list B doesn't copy anything; A just becomes another name for the same list object. Using `.copy()` makes a new outer list, but the inner lists are still shared references. If you actually want to clone the values of an arbitrary n-dimensional list, you need to recursively copy or use `copy.deepcopy`.
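Quick demo of the difference (plain standard-library Python):

```python
import copy

b = [[1, 2], [3, 4]]

a = b                     # no copy at all: a and b are the same list object
shallow = b.copy()        # new outer list, but the inner lists are shared
deep = copy.deepcopy(b)   # fully independent clone of every level

b[0][0] = 99
print(a[0][0])        # 99 -> a is just another name for b
print(shallow[0][0])  # 99 -> the shallow copy still shares b's inner lists
print(deep[0][0])     # 1  -> deepcopy actually cloned the nested values
```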
  8. Yeah, considering what's "interesting" is very subjective, you'd have to train your own model. Yes, it is just programming, but it's machine learning engineering. What you have to do is scrape together a dataset of a few thousand videos and manually label the "interesting" parts of each one. Then you train the ML algo on this data (basically you feed the data into a bunch of linear algebra and have it shit out a function that can take a video and output the label). Once you have the model, you're ready to run it on newer videos and it should hopefully output what you want. You don't need a GPU at all if you're not doing this in real time; you can set a video to be processed and come back a few days later for the labels.
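To make the training step a bit less hand-wavy, here's a toy sketch of the supervised-learning loop using scikit-learn, assuming you've already reduced each clip to a fixed-length feature vector; every number and name below is made up, and a real video model would be much more involved:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# pretend each clip has already been reduced to a 128-number feature vector,
# and each label is 1 ("interesting") or 0 ("boring") from your manual labeling pass
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 128))   # stand-in for real clip features
labels = rng.integers(0, 2, size=5000)    # stand-in for your hand labels

X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)               # the "bunch of linear algebra" step
print("held-out accuracy:", model.score(X_test, y_test))

# later: run model.predict(new_clip_features) over a fresh video's clips
```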
  9. So you could make one script that contains the commands you need to run on each individual machine, then make a top level script that iterates over different IPs and runs that script over ssh. https://stackoverflow.com/questions/305035/how-to-use-ssh-to-run-a-local-shell-script-on-a-remote-machine
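Same idea as that stackoverflow answer, sketched in Python for illustration; the hosts, username, and script name are placeholders:

```python
import subprocess

hosts = ["192.168.1.10", "192.168.1.11", "192.168.1.12"]  # hypothetical machines
local_script = "setup.sh"                                  # hypothetical per-machine script

for ip in hosts:
    # 'bash -s' makes the remote shell read the script from stdin,
    # so nothing has to be copied to the machine first
    with open(local_script, "rb") as f:
        subprocess.run(["ssh", f"admin@{ip}", "bash -s"], stdin=f, check=True)
```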
  10. I'm only gonna discuss the M2 vs the 4090 bc using a CPU for anything ML is using the wrong tool for the task. As for GPU performance, I don't think they're comparable, and that's probably why there are no benchmarks. The M2 Ultra shouldn't even come close to a single 4090. A quick Google search shows the 4090 has ~83 TFLOPS of FP32 compute, while the M2 Ultra has ~27. While raw TFLOPS usually isn't that good an indicator of real-world performance, Nvidia having much more mature software support would only widen this gap. Actually, I found a Tom's Hardware article that shows the M2 trading blows with the 4070 Ti; the 4090 would be at least 30% faster than that. Having a lot of VRAM doesn't make the GPU significantly better at ML either. Past a certain point you've already loaded everything into memory and there's no more speed benefit from having extra empty space available. You could argue Apple has its Neural Engine going for it, a dedicated ML accelerator that reportedly does 31.6 TOPS, but the 4090 has the same kind of tech in its tensor cores, rated at 1321 TOPS (different precisions, but the scale of the gap is clear). I genuinely think that if you're serious enough to spend $6k+ on a machine just for ML, it might make more sense to spin up an EC2 instance with multiple A100s, which will be orders of magnitude faster, for a few dozen dollars per hour.
  11. I suppose a more satisfying answer would also explain a bit about anti-aliasing, i.e. the problem of representing curves within a pixel grid. I mentioned SSAA and that it's very computationally expensive. Most of the later methods try to get the same effect by trading accuracy for speed. Most popular is multisample anti-aliasing, or MSAA: the GPU only looks at the parts of the frame where you'd notice a difference (geometry edges, where the curves are) and selectively samples those parts at a higher rate. Hard to tell apart from SSAA in most scenes, but much faster. We also have FXAA, which is just as popular. It's a cheap post-processing pass (a shader running on the GPU) that smooths out the jagged edges in the finished frame. Very inaccurate, but extremely fast. Next there's TXAA, which instead of rendering at a higher resolution, reuses information from the few frames leading up to the current one. Looked okay but was computationally expensive, and was often used in conjunction with MSAA. Then we have deep learning super sampling, or DLSS. DLSS 1.0 is pure AI image upscaling; 2.0 conceptually combines 1.0 with the temporal approach of TXAA for better accuracy and speed.
  12. Image scaling is traditional scaling, also known as super sampling anti-aliasing (SSAA), which renders the frame at a higher virtual resolution, then fits it to your screen. Nothing will look better than this, but this is very computationally inefficient. DLSS is AI upscaling with temporal filtering. In other words, it has an algorithm that looks at the current frame, along with the few frames leading up to it, and guesses what a higher resolution version of it might look like. This is usually not noticeably less accurate but significantly faster.
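If it helps, here's a toy numpy sketch of the "render big, then shrink" idea behind SSAA; a real GPU does this with proper filtering in hardware, and the resolutions here are just placeholders:

```python
import numpy as np

def ssaa_downsample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average factor x factor blocks of a frame rendered at factor times the target resolution."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)   # stand-in for a frame rendered at 4K
on_screen = ssaa_downsample(hi_res, 2)   # shrunk to 1080p: shape (1080, 1920, 3)
```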
  13. I disagree, without this merger there isn't a single company large enough to compete with tencent
  14. I was thinking it sounds like he was describing fediverse, but I hear it's closer to a decentralized reddit. I'm not too familiar with the technical implementation but I suspect a decentralized twitter might actually not be possible.
  15. I believe raspbian is Linux so this should work: https://superuser.com/questions/168316/how-can-i-block-all-internet-access-on-ubuntu
  16. The limited experience I have around GPU programming is through opencv, which is a computer vision / linear algebra package. You can construct cv::cuda::GpuMat objects to create matrices in vram, and operations on these objects will occur on GPU.
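For reference, here's roughly what that looks like through the Python bindings; a minimal sketch that assumes an OpenCV build compiled with CUDA (the stock pip wheel won't have it), and the filename is just a placeholder:

```python
import cv2

img = cv2.imread("frame.png")      # hypothetical input image

gpu = cv2.cuda_GpuMat()            # a matrix that lives in VRAM
gpu.upload(img)                    # host -> device copy
resized = cv2.cuda.resize(gpu, (1280, 720))  # this operation runs on the GPU
result = resized.download()        # device -> host copy when you need the data back
```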
  17. Since this is the programming section, I'm gonna assume you can code. OpenCV has an AI upscaling function. Tutorials here: https://docs.opencv.org/4.x/d8/df8/tutorial_table_of_content_dnn_superres.html
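A minimal sketch of that tutorial's flow; it assumes opencv-contrib-python plus an EDSR model file downloaded separately, and the filenames are placeholders:

```python
import cv2
from cv2 import dnn_superres

sr = dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x4.pb")    # pretrained model file, downloaded separately
sr.setModel("edsr", 4)        # model name and upscale factor must match the file

img = cv2.imread("input.png")
upscaled = sr.upsample(img)   # 4x AI-upscaled result
cv2.imwrite("output.png", upscaled)
```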
  18. The original Unix shell, the Bourne shell, is launched via `sh`; `bash` ("Bourne-again shell") extends it. `.profile` is what `sh` looks for. Tbh I had to look this up bc I'd never even heard of `.profile`. `/etc/profile` is the system-wide profile and affects everyone. `~/.bash_profile` is the profile for just the logged-in user, and `~/.bashrc` is what gets used when `bash` is launched as a non-login shell. So when you launch `bash` as a login shell, it first runs `/etc/profile`, then your `~/.bash_profile`; if it can't find a `~/.bash_profile`, it falls back to `~/.profile`. A non-login interactive `bash` skips those and just reads `~/.bashrc`.
  19. lmfao few of these answers seem particularly helpful. I'll give it a go. So, @planetary, programming languages sit on a spectrum from low-level to high-level. The higher the level, the more layers of abstraction between you and the hardware. Your hardware is memory addresses in your CPU cache or RAM, filled in with 1's and 0's. Assembly is pretty much the lowest-level programming language. In it, you're basically giving raw instructions to the CPU to move to this address and read/write values etc. These instructions map directly to machine code (ie, the code is "assembled"). Nobody really works with assembly anymore except in specialized niches. Definitely not easy for beginners.

Then we have a slightly higher-level language called C. C was great and widely used bc you could now use a compiler, which takes C code and translates it into machine code. You no longer had to program for a specific CPU; you could write in a generic language and run it through many different compilers. Both of these languages are procedural, meaning instructions are executed line by line down the file.

In comes a new paradigm, called object-oriented programming (OOP). This revolves around having entities (objects) that bundle data together with the code that manages that data. It's still the most popular paradigm today, so def worth reading up on. Anyways, with this shift in paradigm, C was improved upon with new features and became C++, updated and widely used to this day. Worth noting that most C code will still compile with a modern C++ compiler, since C++ started as an extension of C, tho the reverse doesn't work; hence why we refer to them as C/C++. I'd say C/C++ is not a bad place to start learning, but it has a big learning curve bc it forces you to learn what is really going on in a computer. Many universities do use it to teach computer science, though.

Rust and Go are, at their core, someone looking at C++ and going, "fuck that backwards compatibility shit, we can do better" and making a brand-new language. Because of that, they sit slightly higher up in level. Both are a bit more beginner-friendly than C++ but are much less used in industry. Personally I don't know anything about Rust, but Go was basically Google saying "fuck C++, we're doing our own thing." It's mostly still Google using it, but it's picking up mainstream adoption.

Further up the spectrum we have Java. Java is a bit higher level bc it completely forces you to use OOP and also has a garbage collector, meaning it periodically pauses to clear out memory that isn't in use anymore. It was designed for large enterprise codebases with thousands of files, which isn't that great for small beginner projects. Its syntax is close enough to C/C++ that you could learn those as a beginner and comfortably switch to Java later on. There's also Scala, which runs on the same JVM as Java and is best known for big-data tooling (like Spark) for working with large datasets across a cluster of computers. Again, enterprise use, not beginner-friendly.

Now, Java came out of Sun Microsystems (later bought by Oracle), which figured it could make money if businesses adopted it, so they kept marketing it until it was mainstream. Microsoft saw that and wanted a piece of the pie, so they came up with their own (pretty much equivalent) language. For marketing purposes they figured they could piggyback on the "C" name, and that's how C# came to be.
Above that we have Python, a general-purpose scripting language that has become the go-to for scientific work. It caught on bc mathematicians, physicists, chemists, etc. often have a lot of use for computers to analyze data but don't know much about computer engineering. Python abstracts all of that away so you're dealing more with a mental mathematical model of the data. It's an interpreted (scripting) language: Python doesn't generate a program that is then run by the machine; the Python interpreter is itself a program (written in C) that reads your text file and executes the code. It's incredibly friendly to beginners, but you don't really get to learn how the underlying computer works.

We also have JavaScript, which is kinda like Python in that you don't have to understand how a computer works at all, but it's mainly used for front-end web development (front-end meaning the client-side code sent to the browser). It's used alongside CSS, which declares the styling of components on a webpage, and HTML, which declares their structure. It has nothing to do with Java and is called that for the same reason C# has C in it. There's also TypeScript, a superset of JavaScript that adds static typing. All of these are worth learning if you're interested in web dev.

I'd also like to mention Haskell, which frankly I don't know much about, but it uses a different paradigm called functional programming. Basically, OOP grew out of procedural programming, which is based on the Turing machine model of computing. Alan Turing came up with that model, but another dude, Alonzo Church, came up with a different one called lambda calculus. The Church-Turing thesis says the two models are equivalent in computational power, so naturally some clever nerd made a programming language around lambda calculus. Worth learning at some point to expand your skills, but def not beginner-friendly.
  20. so I recall hearing an old MIT alum talking about how neural nets were mentioned in Intro to AI back in the 1970s as a theoretical model of AI but "computationally infeasible". In the early 00s, with advances in computing, we saw small NNs being implemented, tho they were limited in power. Then by the early 2010s, we were able to demonstrate that if you make a really large model and use a fuckton of data, you suddenly get a much more robust and capable AI. This began a mad dash for data collection through the late 2010s, and now in the early 2020s we're seeing the results of that effort.
  21. tried on my 5900x on windows, got 8152. SSH'd into another pc running linux with 2700x, got 7240.
  22. Seconded on what brendan said. Python is really beginner-friendly but abstracts away a lot of computer science concepts; you miss out on core computer engineering knowledge that shows up in many other languages. On the other hand, it builds your algorithm skills much faster and helps you think about things from a computer science perspective. It's a favorite of scientists who need computers (classic example is my friend doing her physics PhD running Python scripts to analyze the telescope data she collected). Java has an incredible amount of overhead that leaves you wondering "why the fuck am I doing it like this" until you realize it was made for the specific goal of building large-scale business systems, and all that boilerplate keeps things clean when you have thousands of files shared between hundreds of devs. But for a beginner, that scale actually obfuscates a lot of the core skills you really need, so I don't think it's a good language to start with. Golang/Rust are essentially C++: sure they have more syntactic sugar, but at the end of the day they mostly support the same features and are used in similar circumstances. These throw you into the deep end with full control over everything your computer can do and have a steep learning curve, but learn any one of them well and you'll be able to pick up any other language within a month. As for resources, I can't help but recommend Harvard's CS50 on edX, but I also remember being lost AS FUCK when I first attempted those lectures in high school. There's this textbook, Beginner Perl, that I personally believe really equips you with the terminology, but Perl is more or less dead unless you really have a hard-on for core Linux. Tho Perl is an interesting in-between language that slots in somewhere between C++ and Python, where a lot of stuff is abstracted but to a lesser extent. Finally, as an MIT alum, I can't help but mention MITx. There's 6.0001x (or w/e it's called now, they keep renaming it) along with any other class you can find on there. Furthermore, our fundamentals of programming class has incredibly well-written assignments that guide you through what's expected of you and provide a .zip file with a full Python test suite to check if your solution is correct. Although as a random online person you won't get feedback/help/access to the more in-depth answer checker, I think it's more than worth attempting. I believe every single assignment is doable with sufficient googling, without watching any of the lectures, and they feature really interesting problems, such as implementing n-dimensional minesweeper or a tower defense game.
  23. Without knowing more about the DeepStack package it'll be hard to really help you, and I doubt the LTT sub happens to have the expertise; but for simply updating Python versions, I would look around for where the version number is set (probably grep the files for "python") and point it at a newer version of Python locally. If there isn't anything explicitly setting the Python version, it may be that your local PATH is resolving to 3.7 and you have to point it at 3.11. A quick glance at the GitHub suggests there might be a Dockerfile to set up Python 3.11 in there. OTOH it's entirely possible it's a non-trivial port and you have to mess with function names n stuff. Good luck, happy to provide more guidance.