straight_stewie

Member
  • Content Count: 1,920

Everything posted by straight_stewie

  1. It's not a meaningless statement, although it is a little conceited. As far as I can tell, AMD does not produce compilers, participate in standards creation, build toolchains, or publish complete manuals for x86 processors. All of these are required for a processor to be useful in any meaningful way. Or, in shorter words: x86 processors, from any manufacturer, are only useful because Intel builds the tools and support systems that make them useful. Sure, AMD could do these things, but they don't and haven't, and therefore likely don't have the people to do them either.
  2. Let's walk through this. The user wants 5!:
    Number is equal to 5. That is greater than or equal to zero, so I go to the next step.
    l is equal to 1. 1 is less than or equal to 5, so number now equals 5 * 1, which is 5.
    l is now equal to 2. 2 is less than or equal to 5, so number now equals 5 * 2, which is 10.
    l is now equal to 3. 3 is less than or equal to 5, so number now equals 10 * 3, which is 30.
    l is now equal to 4. 4 is less than or equal to 5, so number now equals 30 * 4, which is 120.
    l is now equal to 5. 5 is less than or equal to 5, so number now equals 120 * 5, which is 600.
    We show the user 600. Number is now equal to 600, which is greater than or equal to zero, so I go to the next step. ... This goes on until number overflows a long and becomes negative. The final thing displayed is whatever it overflowed to. What you really want to do is this:

        long number = GetUserNumber();
        for (long i = number - 1; i > 1; i--)   // start at number - 1, or you square the input
            number *= i;
        ShowUserResult(number);
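    A minimal runnable version of that fix, as a C# sketch (the post's GetUserNumber and ShowUserResult stand-ins are replaced with plain console I/O, and a separate accumulator is used so 0! and 1! also come out as 1):

        using System;

        class Factorial
        {
            static void Main()
            {
                // Read the number whose factorial we want.
                long number = long.Parse(Console.ReadLine() ?? "0");

                // Multiply an accumulator by number, number - 1, ..., 2.
                // Using a separate result avoids mutating the loop bound.
                long result = 1;
                for (long i = number; i > 1; i--)
                    result *= i;

                // Caveat: long still overflows past 20!, so huge inputs wrap around.
                Console.WriteLine(result);
            }
        }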
  3. The F-35 is not the indicator of the effectiveness of government spending. Many analysts consider it a failed program: it is more than a trillion dollars over budget and over a decade late at this point. On a more related note, this laser is just for aircraft. Two of our Navy ships have already been retrofitted with the AN/SEQ-3 LaWS system, and there are plans to start mass retrofits with it.
  4. straight_stewie

    Python 2 or 3?

    As far as I can tell, there are very few things that work in 2 that don't work in 3. However, there are many features of 3 that are simply not available in 2, like f-strings and more advanced comprehensions. The biggest differences a "beginner" (not that you are one in the strictest sense) would notice are that division of integers now returns a float instead of performing truncated division, and that "print" has become a regular builtin function with the same syntax as every other, "print()". Another thing a beginner with Python 2 experience would notice is that there are now more string formatting options than the old C-like syntax. In regards to the first change, truncated division is now done with the "//" operator: in Python 3, 7 / 2 evaluates to 3.5, while 7 // 2 evaluates to 3. Unless you specifically want to use some outdated libraries that only work with 2.7, I would strongly recommend learning 3.7.
  5. straight_stewie

    Where to learn Assembly?

    @Mira Yurizaki has already answered your question pretty well, but I think there is an angle missing here: what is your goal in learning assembly? I ask because x86 may not be the best platform to learn on. If all you want is to learn the basics of how processors work through programming, I would argue that you would be much better served by an AVR or PIC microcontroller than by a PC, whichever approach you take on the PC. The documentation is more thorough yet comes in at roughly 1/10th the size, the instruction set is much simpler, and there is no OS you have to interface with, all of which lets you focus on what you are actually doing.
  6. In addition to all of the above, good timing of product releases and information leaks, something Apple is notoriously good at, can really increase share value.
  7. straight_stewie

    RAM speed in programming

    I guess I should have formatted my statement to say that "within any given generation of RAM, differences in RAM 'speed' have a negligible impact on program performance, in the context of small desktop applications". That statement is even supported by LMG's own tests comparing game performance across different sets of RAM. Nearly every problem is "memory bound" in the sense that the very definition of a computer involves the statement "given an infinitely long tape of cells...", or in other words, Alan Turing's definition of a computer assumes that infinite memory is available. My original point was that the amount of memory available matters far more than the speed of the memory. In other words, don't choose an 8GB kit over a 16GB kit because it has a lower CAS latency.
  8. Yet EULAs and usage restrictions still apply for every application that reaches end users...
  9. Software is copyrighted, not patented. The difference is astronomical when it comes to situations like this.
  10. There is actually big money involved in fighting this fight. John Deere and equipment purchasers have been in a decade-long fight over something called the "right to repair". In the John Deere case that includes quite a few things, but the center of the argument is that John Deere claims to retain complete ownership of the software that runs its machines. The owners/operators counter that this removes their ability to repair their machines, because you simply cannot repair modern heavy equipment without access to the software for diagnosis and then sensor re-calibration after the repair, and that by buying the machine they are implicitly purchasing the software loaded on it. Recently, in September 2018, John Deere won the fight in one state, California, where it employed lobbyists to sway legislators to pass a bill that allows equipment manufacturers to require that owners buy parts only through company-approved dealerships and vendors, among other things. The argument has not made it to the Supreme Court yet, but I am hopeful that when it does, the common man will win and it will end things like software as a service once and for all. Software as a service is only sensibly applicable to websites, where it's not feasible or safe to distribute the server-side applications to end users.
  11. That leads me to believe that he's trying to set the company up to finally be able to execute on 10 or even 7nm technology, since apparently that's going to take a bigger investment than Intel originally thought.
  12. I'm always wishy-washy about these kinds of decisions. On the one hand, the market was manipulated. On the other, it was never proven that Musk acted with the express intent of manipulating Tesla's value, and there are other reasons to "adjust" information about companies or their products (for example, most 600 cc motorcycles actually have between 580 and 598 cc engines, and John Deere is the "biggest construction equipment provider" only by total company value; they don't move nearly as many yellow-iron machines as CAT). Then again, Elon needs to be stopped from doing this, given that it wasn't a one-time issue. But on the flip side, there is now precedent allowing court-enforceable censorship of what business owners and leaders can and cannot post on their personal social media accounts, and that's not a problem to be viewed lightly. At best, a small group of investors won restitution, but they did so at the expense of some freedom for business owners across the nation.
  13. straight_stewie

    GUI programming class

    Well, I think we should break down the pros and cons of your 3 options. This is by no means an exhaustive list, but it should get you started thinking on the right path.

    Python
      Pros
        • The language is easy
        • There is a plethora of libraries available to ease GUI programming
      Cons
        • You don't know the language yet
        • Many of the available libraries don't have good documentation

    C++
      Pros
        • You know the language
        • You get to learn what the libraries in other languages are doing for you
        • It can be easier to build annoyingly silly applications. This one is by no means complete, but it's a start: https://github.com/superstewie/Dumb-Window
      Cons
        • If you've never worked with Win32/COM or X Windows, this is the most difficult option

    Java
      Pros
        • You know the language
        • Some native support for building a GUI
        • "Commercial" libraries are available
        • You'll make @wasab happy.
      Cons
        • You'll make @wasab happy.
  14. straight_stewie

    RAM speed in programming

    I have to argue that hobby programmers often do need to make those types of architectural decisions, as they don't have the benefit of an employer having hired people specifically to do the architecture for them.
  15. straight_stewie

    Processing Power Recommended for Programming

    It seems to me that on a small scale, such as hosting your own server, your internet connection speed is going to be your biggest bottleneck for a long time. Another consideration in web development is that there are usually a lot of hard drive accesses, unless you have an insane amount of RAM. Hard drive/SSD accesses are extremely slow, so minimizing them will have the greatest impact. There is a reason that webservers are built from highly parallel but relatively slow processors coupled with absolutely insane amounts of RAM: raw compute performance isn't usually necessary for most websites. What is necessary is a quick reaction time to requests and the ability to handle a large number of simultaneous requests. That means optimizing databases and sacrificing compute performance for parallelism and fast data access.
  16. straight_stewie

    Do computers get slower, or does everything else get faster?

    I refer to this as "the dark side of abstraction": as computers become faster, more abstraction layers become feasible, and so most of your performance improvements are traded away for "developer productivity". That's not necessarily a bad thing, however. The question we should be asking is "How fast do I actually need this to be?" Is the cheaper software that comes with the increased ease of development worth the performance tradeoff? I would argue that in most cases the answer is "absolutely".
  17. straight_stewie

    RAM speed in programming

    There are a few problems with the die shot you posted: that was Ivy Bridge and, if you look closely, you can see that much of the area taken up by the L3 cache is not actually memory but memory controllers. That's because L3 cache, at least in Intel chips, usually has a relatively high number of access channels (usually four per section and four sections, so 16 access channels). RAM won't dedicate quite that much space to memory controllers. This is a Kaby Lake chip. Much of what you see labelled "L3$" is actually called the "Side Cache" and consists of 64 MB of memory. By my math, the Side Cache takes roughly 14.8 mm² for 64 MB. That's about 237 mm² per gigabyte, which is roughly 1/3 as dense as modern DRAM offerings at roughly 1 gigabyte per 70 mm². So I guess the conclusion of my analysis is that, at least in terms of die usage, SRAM is competitive with DRAM, even more so when you factor in the performance improvement. However, I can't find a reasonable cost analysis, because SRAM is in very low demand so its cost is inflated. I'm not an economist, so any price adjustments I could come up with to make an apples-to-apples comparison on cost would be complete nonsense.
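    Spelled out, using the numbers above (and the binary 1 GB = 1024 MB convention):

        SRAM:  14.8 mm² per 64 MB  →  14.8 × (1024 / 64) ≈ 237 mm² per GB
        DRAM:  ≈ 70 mm² per GB
        ratio: 237 / 70 ≈ 3.4      →  SRAM is roughly 1/3 as dense as DRAM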
  18. straight_stewie

    RAM speed in programming

    RAM speed is not really very important at all. For nearly every problem, RAM speed will not be the bottleneck. By far the biggest bottleneck is the amount of RAM available, but that only comes into play when dealing with dubious amounts of data. Warning, rant in spoiler: The tactics that should be used to improve memory access performance are to decrease the number of memory accesses and, where possible, write things to be cache-oblivious. Of course, you can only determine whether there is actually a bottleneck by profiling. So, as @reniat said, "measure, measure, measure".
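    To make the "decrease the number of memory accesses" point concrete, here is a minimal, self-contained C# sketch (the array size is an arbitrary assumption) comparing two traversal orders over the same data. It demonstrates cache-friendly access order rather than a full cache-oblivious algorithm: the row-major loop walks memory sequentially and typically runs several times faster than the column-major loop, even though both perform the same number of additions.

        using System;
        using System.Diagnostics;

        class CacheDemo
        {
            const int N = 4096;   // 4096 x 4096 ints = 64 MB

            static void Main()
            {
                var data = new int[N * N];   // one flat, row-major buffer
                long sum = 0;

                var sw = Stopwatch.StartNew();
                // Row-major: consecutive iterations touch adjacent addresses,
                // so each cache line that is fetched gets fully used.
                for (int row = 0; row < N; row++)
                    for (int col = 0; col < N; col++)
                        sum += data[row * N + col];
                sw.Stop();
                Console.WriteLine($"row-major:    {sw.ElapsedMilliseconds} ms");

                sw.Restart();
                // Column-major: each iteration jumps N ints ahead, so nearly
                // every access pulls in a new cache line and wastes most of it.
                for (int col = 0; col < N; col++)
                    for (int row = 0; row < N; row++)
                        sum += data[row * N + col];
                sw.Stop();
                Console.WriteLine($"column-major: {sw.ElapsedMilliseconds} ms");

                Console.WriteLine(sum);   // keep sum live so the loops aren't optimized away
            }
        }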
  19. straight_stewie

    Operating System Creation

    Actually, that's not true here. At the bare minimum you will need (as in, "there is literally no other option") to write your own bootloader and stack setup in assembly if you plan to actually run C on bare metal. That's if you have access to a C compiler that doesn't expect an OS to be there. If you don't have such a compiler, you will also have to write your own implementation of the C standard library, and if you want that to be fast at all, you will need to write some assembly there too. We aren't talking about Arduinos here. Most systems, including the RPi, don't have easily accessible bootloaders or C compilers that don't expect an OS to be there.
  20. straight_stewie

    Operating System Creation

    While you may be able to find videos covering certain topics, there is not a good video series available that will take you from "zero to hero", as it were, in the field of OSs. Building an operating system like Linux, Windows, Android, UNIX, or macOS from scratch is legitimately regarded as one of the hardest, if not the hardest, programming tasks anyone can take on. To be successful, you will need to know how to manipulate hardware, how to abstract hardware away (if you want multitasking), how the processor actually works and how it interacts with other system components, and how to write your own libraries and compilers. And that's before adding a GUI layer. Now, I'm not saying that you shouldn't try. I'm saying that you probably shouldn't try to write a desktop OS yet. To get the experience of working with bare-metal hardware, I would recommend getting an Arduino, some breadboards, and an AVR programmer capable of programming the bootloader memory. Start with normal Arduino tasks, then take the microcontroller out of its socket, breadboard it, and start writing C or AVR assembly for it in AVR Studio. Eventually, you should write your own bootloader, and your own USB programmer for it. Then you can start working on your own compilers for whatever language, and start all over.
  21. straight_stewie

    Operating System Creation

    I've always felt that OSDev was more for people who could probably already build an OS without relying on a forum for information. Before getting started on developing a bespoke OS, one should be able to successfully complete the Linux From Scratch (LFS) build.
  22. straight_stewie

    Need a YouTube Channel that Teaches C#

    Google DevU and Bob Tabor
  23. straight_stewie

    C++ | Can't understand Polymorphism

    Polymorphism, inheritance, and casting are tightly related. Imagine that you are writing a bookkeeping program for a business. You may wish to track various kinds of people and how they relate to your business. For example, you could have: Employees, External Contractors, Customers, and Clients. But how do you create all these different types of people without constantly redoing work you've already done? One way would be to create a People type. People might have some fields to keep track of a name and other such data. Then you could derive Employee from People and add fields to keep track of payroll and the department they are in, and do similarly for Contractor, Customer, and Client.

    But there is a much better use of inheritance hidden behind the name polymorphism: all of the subtypes, Employee, Contractor, Customer, and Client, are still of the type People. Therefore, anything a People can do, any of the other types can do. This means you can treat any of the derived types as a People. And that is all polymorphism is: the ability for a type to appear as if it were another type. Inheritance is a common tool for enabling polymorphism, and casting is the tool we use to exploit inheritance: you can up-cast a Customer to a People, and in some languages, down-cast a People back to a Customer.

    In shorter words: polymorphism simply means that we can treat some type as if it were another type. Given our business example, say we want to calculate our business's total revenue. We assume that all people associated with our company either owe us money or we owe them money. Therefore, we define the People type to have an integer field that tells us the amount of money owed: negative if we owe them, positive if they owe us. We can go about our tasks normally, building lists of all sorts of Employees and Customers. But that gives us a problem: how do we calculate our revenue when our lists aren't full of People? The answer is quite simple: since they all derive from People, we can simply "trick" our program, using casting (read: polymorphism), into treating our Customers and Employees as plain People, and build one method that simply sums calls to People.MoneyOwed.
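    A minimal sketch of that idea, in C#-style syntax to match the post's People.MoneyOwed naming (the thread itself is about C++, where the same shape uses a base class and pointers or references to it; all names here are illustrative):

        using System;
        using System.Collections.Generic;

        // Base type: whatever is common to everyone tied to the business.
        class People
        {
            public string Name = "";
            // Positive: they owe us money. Negative: we owe them money.
            public int MoneyOwed;
        }

        // Derived types would add their own data but remain People.
        class Employee : People { /* payroll, department, ... */ }
        class Customer : People { /* orders, contact info, ... */ }

        class Program
        {
            static void Main()
            {
                // One list of People holds every derived type (an implicit up-cast).
                var everyone = new List<People>
                {
                    new Employee { Name = "Alice", MoneyOwed = -2000 }, // we owe wages
                    new Customer { Name = "Bob",   MoneyOwed = 500 }    // unpaid invoice
                };

                // One method covers them all: each element is treated as a People.
                int revenue = 0;
                foreach (People p in everyone)
                    revenue += p.MoneyOwed;

                Console.WriteLine(revenue); // -1500
            }
        }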
  24. Quotes don't seem to be working in blogs. Nonetheless, this is the best quote about the scientific process that I've ever seen. I'm definitely stealing this.
  25. straight_stewie

    Console.ReadLine not getting correct input (C#)

    As an aside, while

        for ( ; ; )
            // some stuff

    works, the following is highly preferable:

        while (true)
            // do stuff

    The for-loop version might indicate a mistake where the author meant to go back and write the condition but forgot to do so. In contrast, the while-loop version explicitly states that the author meant the loop to be infinite.