Everything posted by dcgreen2k

  1. That's the right mindset to have. My comment about this kind of thing not being well suited for Python is mostly because Python runs in a virtual machine instead of on bare hardware. Depending on what you're trying to do with your operating system, this may or may not be important to think about.
  2. I don't think it would be very feasible to write an operating system on Replit or in Python, at least not an operating system in the way we typically think of one. One idea that would probably work in that environment is to write a basic task scheduler and task switcher, which is part of the core of an operating system. Try to make it so that you can give your "operating system" some functions to run, as well as how long to wait before running each of them again. For example, you could give your OS one function that prints "Hello World" every 10 seconds and another that prints the current time every minute. These functions shouldn't be part of the OS itself; instead, you should have a way to "register" them so that your OS knows to call them at whatever interval you specify. There's a small sketch of this idea at the end of this post.

     If you'd like to learn more about creating an actual operating system, one that can run directly on hardware, I'd first recommend checking out the C and C++ languages. They let you work much closer to the computer's hardware than Python does, which is needed to make a more advanced OS. Here are some guides worth checking out if you're really serious about learning how to create an OS:
     https://github.com/cirosantilli/x86-bare-metal-examples#getting-started
     https://wiki.osdev.org/Main_Page

     I don't want to discourage you, but writing an operating system takes a lot of hard work and programming knowledge. Trying to take it on would undoubtedly make you a better programmer, though.
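     To make the register-and-run idea concrete, here's a minimal Python sketch that only uses the standard library. The register() and run() functions are names I made up for illustration; they aren't part of any real OS or framework.

         import time

         # Each entry in "tasks" is [function, interval in seconds, next time to run].
         tasks = []

         def register(func, interval_seconds):
             # Remember the function and when it should run next.
             tasks.append([func, interval_seconds, time.time() + interval_seconds])

         def run():
             # Very simple cooperative loop: call each task whenever its interval has elapsed.
             while True:
                 now = time.time()
                 for task in tasks:
                     func, interval, next_run = task
                     if now >= next_run:
                         func()
                         task[2] = now + interval
                 time.sleep(0.1)  # don't spin at 100% CPU

         register(lambda: print("Hello World"), 10)              # every 10 seconds
         register(lambda: print(time.strftime("%H:%M:%S")), 60)  # every minute
         run()

     A real scheduler would also need to handle things like removing tasks and preempting ones that run too long, but this captures the basic register-and-dispatch structure.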
  3. What you're seeing here is an argument between well-meaning forum members and a few very childish ones. It doesn't happen often, but when it does, it doesn't turn out well. This community is pretty good aside from that.
  4. I'm 22, and I'll be attending graduate school for machine perception next year. You are correct in saying that students in these introductory courses are expected to have some basic knowledge, and in my experience that meant "Are you able to turn the computer on, use a web browser, and install a program?". You'll probably notice that this is much more basic than what you're talking about, and there's a reason for that. Topics like diagnosing hardware and software issues simply aren't touched on in these types of classes. To learn about them in a professional setting, you'd most likely need to find classes on IT certifications, which to my knowledge typically aren't offered in standard degree programs. There's also always the chance that someone is in the cybersecurity field just because it makes them money. And to be honest, the computer issue I saw you talking about earlier is a very specific one that I've never encountered before. Even with my knowledge and having grown up with computers, my first approach would be to just Google it.
  5. College student here. This is incorrect. Students in introductory classes are not expected to know anything about the course's topics before taking it. Learning how to use a computer is kind of the entire point of that type of class, whether it's computer science, computer engineering, or information technology. It's kind of like expecting a student in an introductory chemistry class to know how to synthesize verdigris from copper sulfate pentahydrate, then asking them why the hell they're taking the class when they inevitably don't know. And by the way, diagnosing hardware issues is not something usually taught in college classes. It's purely a problem-solving skill, and has to be learned on one's own.
  6. Refusing to have the decency to call someone by the name they choose is incredibly childish. The last time I remember seeing that was in second grade, and I cannot respect that. I hope that one day you will be blessed with the gift of compassion.
  7. Replying to "Please do not try using a GeForce 7000 series c…":

     Huh. I specifically remember installing Windows 10 on an old system with a GeForce 7300 LE and having it work just fine. The last time I used it was early last year, too. I can't remember which driver version it was running, but I know it was an official Nvidia driver.
  8. The exam seems very unfocused, and many of the questions are vague. What kind of CS class would this be made for? I don't know of a single class that would teach all of these things, since I've seen parts of these questions in many different classes throughout my studies. Some are introductory-level questions, some deal with applied software design, a couple are about things I learned on my own like curl, and the maze question is something I'd expect to find in a high-level college data structures and algorithms class or an AI class.

     To improve this exam, I would first include better instructions. What language is the code expected to be written in? How would the code be tested? I would also organize the questions into groups, like shells and shell commands, definition-type questions, and applied programming. Finally, it would be good to write out the actual expectations for the students' code, though this also ties into my first recommendation.

     In the programming exams I've taken for my computer engineering and computer science classes, every one has clearly laid out the functions to implement. For languages like Java, we might give a listing of every method to implement along with their arguments and return types. For languages like C++, we typically give students the entire completed header file for a class, so they must write an implementation as well as tests for their code.
  9. I agree that once products start getting 5-digit names, the names get pretty clunky. They should find some new naming scheme to fix this, but "Ultra" is not it imo. Intel has done product renaming that worked well in the past, like going from Core 2 Duo, Quad, and Extreme to Core i3, i5, and i7. That said, the Core 2 naming scheme was more descriptive, which helped people unfamiliar with Intel's product lineup. Adding "Ultra" while keeping the previous numbering doesn't help with this at all.
  10. I've had the same issue with my Logitech G303, which I've been using since it was first released. I first tried ordering new switches, but the order got cancelled, so I took the existing switches apart, cleaned the internal copper piece with soap and water, then lightly scraped the contacts with a screwdriver. That was two months ago, and my mouse has been working great since then. I don't know of any other solutions aside from getting new switches or cleaning the old ones, though. Maybe you could see if there's something you could apply to the contacts to slow down oxidation?
  11. American here, living in Virginia. I've been to Canada before and it was amazing. It's one of the few countries I've seriously considered moving to. The only people I've heard say that Canada is a dystopian hell hole were from a certain political party, and I've never heard the same thing from rational people.
  12. I recently subscribed to GN because I like their videos more than sensationalized ones. Finding things wrong with a product and figuring out how to make it better is called engineering, not complaining.
  13. Just to clarify one point, assembly instructions have a one-to-one mapping to machine code, so there is no performance difference between programming in the two formats. We only use assembly because the opcode mnemonics, register names, and immediate values are easier for humans to write and understand than raw binary (there's a loose, runnable analogy of this mapping at the end of this post). Theoretically, our programs would be fastest if they were all written in assembly. That isn't the case in the real world because of how difficult it is to write good assembly code, so we made higher-level programming languages where the more complex parts, like interacting with the hardware, are abstracted away. This way, programming becomes easier and more productive while keeping acceptable performance.

     Once you get to the actual instructions executing on the CPU, it mainly comes down to hardware. As far as I know, the highest possible throughput is one instruction executed per clock cycle, ignoring fancy things like superscalar architecture and multiple cores. The improvements we've made to CPUs here range from simple things like increasing clock speed to more complex topics like pipelined architecture and branch prediction.

     As a side note, the only fields I know of that still use some assembly code are embedded systems (like Arduino) and cybersecurity. It's sometimes needed in embedded systems because you're typically working with microcontrollers that have very low clock speeds and not much memory, although this need has decreased significantly; for reference, the Arduino Uno has a 16 MHz processor and 2 KB of RAM. In cybersecurity, assembly is used when reverse engineering malicious binaries. By looking at a program's assembly code in a tool like Ghidra, you can get a better understanding of how it works on the inside and figure out exactly what it does.
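     As a loose analogy you can actually run (CPython bytecode, not CPU assembly), Python's dis module shows the same relationship between human-readable mnemonics and the raw opcode bytes they map onto:

         import dis

         def add(a, b):
             return a + b

         dis.dis(add)                        # human-readable mnemonics (exact names vary by Python version)
         print(add.__code__.co_code.hex())   # the same instructions as raw bytes

     Bytecode runs on a virtual machine rather than directly on the CPU, but the mnemonic-to-bytes mapping is the same idea as assembly versus machine code.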
  14. @Dat Guy is correct. Boehm GC is not included in C/C++ by default, and I would not call smart pointers "garbage collection" by any means.
  15. Your best bet would be to find a university that has an electrical and computer engineering program, because CPU design is a mixture of both fields. I've been studying in my university's ECE department for 3 years now, and the classes related to CPU design have been my favorites. We've done things like building a simple 16-bit CPU from scratch on an FPGA over the course of a semester and learning how to maximize performance through pipelining and cache design. It's a really interesting field if you do decide to go into it.
  16. You got a source for that? I'm perfectly fine with some devices not coming with chargers, as long as they don't use some special plug or charging scheme that you can only get from that manufacturer. My family has a box full of USB chargers that came with our devices over the years, and we don't need any more. At a certain point it becomes instant e-waste.
  17. Those requirements are probably set that high to guarantee that anyone using Solidworks professionally won't run into performance issues. You can obviously run it just fine with everyday hardware, but it isn't guaranteed to scale well to a professional workload.
  18. I see. If you're just doing that, then you could use software like Solidworks or Fusion 360 to create the 3D model, then Ultimaker Cura to turn that 3D model into a file that tells the printer how to make it. None of that software requires top-end hardware, and it should run on any system with a dual-core CPU and a basic graphics card.
  19. When you say rendering, do you mean creating the visualization of the model you see on screen, or turning the model into code that runs on the 3D printer? The hardware has no impact on print quality, and I've run software like Ultimaker Cura on an old system with a Core 2 Quad CPU without any problems.
  20. I'm not sure if doing the chunk generation on a GPU is even possible, since GPUs have a much simpler instruction set than CPUs. If it is possible, it's unlikely that it would give a performance increase. Remember that even though GPUs have tons of cores, they typically run at a much lower clock speed than CPU cores. Pair that with the overhead required to split the work up (a workload for 1 CPU core needs to be turned into a workload for many GPU cores), and chunk generation would probably end up very slow on a GPU.
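     As a rough CPU-side analogy of that splitting overhead (worker processes standing in for GPU cores; the task and sizes are arbitrary and just for illustration):

         import time
         from multiprocessing import Pool

         def square(x):
             return x * x

         if __name__ == "__main__":
             data = list(range(100_000))

             start = time.perf_counter()
             sequential = [square(x) for x in data]
             print("sequential:", time.perf_counter() - start)

             start = time.perf_counter()
             with Pool(8) as pool:
                 parallel = pool.map(square, data)  # pays for worker startup and shipping data around
             print("8 workers: ", time.perf_counter() - start)

     On most machines the pooled version loses badly here because the work per item is tiny compared to the cost of starting the workers and moving the data, which is the same kind of trade-off as handing chunk generation to a GPU.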
  21. I agree with some parts of this. First, it's true that the recent applications of AI are scammy - I've seen many people think ChatGPT is something it isn't and believe they can trust it blindly. Second, the applications and capabilities of AI have been heavily sensationalized by media (like Terminator), and things that have existed for a long time are suddenly perceived as scary because "AI" got tacked onto them. For example, yesterday I saw a post saying that AI was scary because it could possibly infect and spread to other computers - we've had that forever, and it's called a virus. The part I disagree with is what we call AI. AI is a very broad field based on creating things that mimic human intelligence, and it's been around for almost as long as modern computing. Some examples of AI are opponents in video games, chatbots, and image recognition, because they all attempt to make decisions as a human would, no matter how badly they might be implemented.
  22. This is true, and it should be mentioned that spinning up threads has an incredibly high overhead cost, because each thread needs to get its own resources from the operating system. There are many cases where, even if a task can be fully parallelized, it's faster for a CPU to do it sequentially, since the time needed to create new threads is greater than the actual execution time. Aside from the threads themselves, the synchronization that multithreaded code needs, things like atomic operations and locking mechanisms, is also very costly in terms of CPU time and can easily degrade performance back to single-threaded levels.
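     Here's a small, illustrative way to see the thread-creation cost in Python (the task and counts are arbitrary, and CPython's GIL adds its own caveats on top):

         import threading
         import time

         def tiny_task():
             total = 0
             for i in range(1_000):
                 total += i

         N = 1_000

         start = time.perf_counter()
         for _ in range(N):
             tiny_task()
         print("direct calls:   ", time.perf_counter() - start)

         start = time.perf_counter()
         for _ in range(N):
             t = threading.Thread(target=tiny_task)  # each thread gets its own resources from the OS
             t.start()
             t.join()
         print("one thread each:", time.perf_counter() - start)

     The threaded loop is typically noticeably slower even though both versions do exactly the same work, purely because of the per-thread setup and teardown.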
  23. I agree that most of the things people think AI can do are ridiculous and pretty much entirely based on movies. AI has been around for almost as long as our modern definition of a computer has, and it's been heavily sensationalized. That always happens when people latch onto a scary buzzword though.
  24. I've been using Kdenlive for a while, and I definitely recommend it. It's simple enough to start doing basic editing very quickly, and is free and open source. It feels similar to Sony Vegas Pro and performs well too.