
Angry Badger

Member
  • Posts

    69
  • Joined

  • Last visited

Reputation Activity

  1. Agree
    Angry Badger got a reaction from homeap5 in Is this file a malware or virus?   
    Let's make a more rational analysis of the scan results instead of jumping to the usual "Omfg VIRUS" conclusions like a lot of people seem to do when presented with these results.
     
    First of all, the detection rate here is 1/60. That's not a high enough detection rate to categorize a file as malware, given how often false positives occur. Second, the file was detected by a new anti-malware product, while remaining undetected by every major vendor. Couple that with the fact that we're not actually getting a hit on any malware signatures (it looks like the result of some heuristic analysis of the software, matching some malware-like characteristics, which doesn't necessarily equal malware), and it becomes less likely that the file in question is actually malicious.
     
    Personally, I don't think the evidence points towards the file being malicious, but of course, you can never know for sure. I'd recommend that you reupload the file to VirusTotal now and then, in case of signature/heuristic engine updates. False positives sometimes disappear after the engines running on VirusTotal are updated.
  2. Agree
    Angry Badger reacted to reniat in Programming and employment   
    I think there's a difference between "We think higher education should do a better job of teaching real and practical coding standards" and "Higher education is just a gateway to the industry and is useless beyond that"
     
    The biggest thing you lose if you ignore academia is breadth of knowledge. I went to college and took a bunch of classes that are not directly applicable to my current job, but I'm still very glad I know them. Things like machine learning, statistics beyond just the basics, how the internet works, how a CPU works/writing in assembly, etc.
    (Note: these are things I know from college that *I* don't use; obviously there are plenty of jobs that do use some of these, but even then, a person in that job might have a different set of skills they learned in college that they don't use but are glad to know.)

    In addition to my actual job, I also mentor new engineers in our onboarding program (~2 months), and I absolutely wish colleges offered a few courses that were industry-focused. An "industry primer" class, if you will, taught by a person with relevant industry experience. Things like writing proper unit tests, satisfactory documentation, modularity and reusability, writing cohesive interfaces, etc.
     
    tl;dr Colleges could absolutely do better, but I still think they are very valuable, and I don't exactly blame most companies for requiring degrees. (Though the price of college in the USA is still stupid, and I'll be paying back loans for way longer than I am comfortable with.)
  3. Agree
    Angry Badger reacted to AngryBeaver in Programming and employment   
    I am rather disappointed in all of the people against higher education, seeing as degrees are only going to become more and more required as we move into the future.
     
    Also, I saw it mentioned that they put the degree requirement there to weed out candidates. In a small company this might be true, but at large Fortune 500 companies, if you do not meet those basic requirements, your resume will literally be dropped from the system. There are some tricks that help you get around this, but for the most part you won't be considered.
     
    I am not saying a degree reigns supreme, but I am saying that for entry into the field, you either need a degree or good, solid experience with a company to open doors for employment. If you wanted to code for my current company, even at entry-level bug-checking positions... you would need either 3 years of experience doing it, a 2-year technical degree in programming with 6+ months of experience, or a 4-year degree in the field.
     
    Now, once you had that entry-level position, if you wanted to have many advancement opportunities, you would need to start furthering your education. The company for the most part pays the bulk of that if you are interested, but that is because they want people to have advancement paths... and without a degree, in a large company your advancement is little to non-existent.
  4. Funny
    Angry Badger got a reaction from Dat Guy in Language to Jump in to   
    That's very true, and there's a reason why I didn't recommend C++ either. Despite Java's syntax being somewhat derived from C++, the language behaves rather differently. This is very advantageous for a few reasons:
    • You get to be productive and focus on coding, unlike in C++, where you'll spend your time debugging pointers and other difficult aspects of the language until you get the hang of them, which is generally a nightmare. Java is a very forgiving and intuitive language in comparison (see the sketch below).
    • You'll become familiar with the C++-like syntax, which is super convenient for when you later dive into either C# or C++.
    • It's kind of the best of both worlds for a beginner: you get used to a widely used syntax while also writing code that's easy to get up and running.
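
    To illustrate the first point, here's a minimal, hypothetical Java sketch (the class name is purely illustrative): objects are reclaimed by the garbage collector, so there's no manual pointer management or delete to get wrong, the way there is in C++.

        import java.util.ArrayList;
        import java.util.List;

        // Minimal sketch: Java manages memory for you.
        // In C++, every `new` must be paired with a `delete` (or wrapped in a
        // smart pointer); here, the garbage collector reclaims unused objects.
        public class MemoryDemo {
            public static void main(String[] args) {
                List<int[]> buffers = new ArrayList<>();
                for (int i = 0; i < 1000; i++) {
                    buffers.add(new int[1024]); // heap allocation, no cleanup code
                }
                buffers.clear(); // no delete/free needed; the GC takes over
                System.out.println("Done, without any manual memory management.");
            }
        }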
     
     
    This is another great reason why Java is a better starting language for beginners compared to Python. Object-oriented programming (OOP) is one of the core concepts of Java development. It's something you will get the hang of, or at least get a basic understanding of, while learning Java. OOP is generally less central in Python, despite still being important and widely used. In my opinion, it's more natural to get exposed to OOP through Java; in Python, I didn't feel nearly as paradigm-bound back when I learned it.
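
    To make that concrete, here's a small, hypothetical example of the kind of OOP structure Java pushes you towards from day one (the class and method names are purely illustrative):

        // Everything in Java lives in a class, so you meet OOP immediately.
        public class Student {
            private final String name; // encapsulated state

            public Student(String name) {
                this.name = name;
            }

            public String greet() {
                return "Hi, I'm " + name;
            }

            public static void main(String[] args) {
                Student s = new Student("Ada");
                System.out.println(s.greet()); // prints: Hi, I'm Ada
            }
        }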
     
    Java is already quite easy. That's why it's one of the most commonly taught languages in colleges and universities, in addition to Python. You're missing out on a lot if you choose to learn Python as your first language (significant differences in the type system, differences in intended paradigms, differences in "strictness"), and you'll waste a lot more time when you later want to dive into C# or C++.
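
    As a minimal sketch of that "strictness" difference: Java's compiler rejects type errors before the program ever runs, whereas Python would only complain at runtime.

        public class TypeDemo {
            public static void main(String[] args) {
                int count = 42;
                // String s = count;  // won't compile: incompatible types
                String s = Integer.toString(count); // conversion must be explicit
                System.out.println(s.length()); // prints: 2
            }
        }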
     
    I guess my main point is that they're both very beginner-friendly languages, but you'll pay for it later if you learn Python first and choose to move forward with other programming languages at a later point.
  5. Agree
    Angry Badger got a reaction from cj09beira in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  6. Agree
    Angry Badger got a reaction from cj09beira in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Not long ago, NVIDIA released an update to their end-user license agreement, and some of the additions to the agreement are rather alarming...
     
    From TechPowerUp:
     
    The recent addition can be found as the fourth point under section 2.1.3 of the agreement you have to accept before downloading GeForce drivers directly from their site.

    I find it disturbing that NVIDIA feels like they have to prohibit usage in data centers just because they fail to offer enough value to their customers through their insanely expensive Quadro and Tesla cards. Are NVIDIA admitting here that their enterprise solutions aren't good enough/don't offer enough value compared to their "mainstream" alternatives?
  7. Agree
    Angry Badger got a reaction from Castdeath97 in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  8. Agree
    Angry Badger got a reaction from bobhays in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  9. Agree
    Angry Badger got a reaction from Beskamir in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  10. Informative
    Angry Badger got a reaction from Beskamir in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Not long ago, NVIDIA released an update to their end-user license agreement, and some of the additions to the agreement are rather alarming...
     
    From TechPowerUp:
     
    The recent addition can be found as the fourth point under section 2.1.3 of the agreement you have to accept before downloading GeForce drivers directly from their site.

    I find it disturbing that NVIDIA feels like they have to prohibit usage in data centers just because they fail to offer enough value to their customers through their insanely expensive Quadro and Tesla cards. Are NVIDIA admitting here that their enterprise solutions aren't good enough/don't offer enough value compared to their "mainstream" alternatives?
  11. Agree
    Angry Badger got a reaction from matrix07012 in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  12. Agree
    Angry Badger got a reaction from PianoPlayer88Key in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  13. Agree
    Angry Badger got a reaction from 8uhbbhu8 in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Indeed, and they're not even trying to hide their agenda a little bit. They're literally telling every single enterprise customer out there that they're not allowed to buy the cheaper solutions.

    Though it's NVIDIA telling their customers what they can and can't do with their product that is the scary part, in my opinion.
  14. Agree
    Angry Badger got a reaction from InertiaSelling in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Not long ago, NVIDIA released an update to their end-user license agreement, and some of the additions to the agreement are rather alarming...
     
    From TechPowerUp:
     
    The recent addition can be found as the fourth point under section 2.1.3 of the agreement you have to accept before downloading GeForce drivers directly from their site.

    I find it disturbing that NVIDIA feels like they have to prohibit usage in data centers just because they fail to offer enough value to their customers through their insanely expensive Quadro and Tesla cards. Are NVIDIA admitting here that their enterprise solutions aren't good enough/don't offer enough value compared to their "mainstream" alternatives?
  15. Informative
    Angry Badger got a reaction from leadeater in NVIDIA forbids deployment of GeForce drivers in data centers...   
    Not long ago, NVIDIA released an update to their end-user license agreement, and some of the additions to the agreement are rather alarming...
     
    From TechPowerUp:
     
    The recent addition can be found as the fourth point under section 2.1.3 of the agreement you have to accept before downloading GeForce drivers directly from their site.

    I find it disturbing that NVIDIA feels like they have to prohibit usage in data centers just because they fail to offer enough value to their customers through their insanely expensive Quadro and Tesla cards. Are NVIDIA admitting here that their enterprise solutions aren't good enough/don't offer enough value compared to their "mainstream" alternatives?
  16. Informative
    Angry Badger got a reaction from Walt in Which code language to learn?   
    I'd start with Java. The community is big, with a lot of popular frameworks that are often used in enterprise applications and other projects.
     
    Tutorialspoint is free, but it moves forward quite fast compared to what's often comfortable for a beginner.
    https://www.tutorialspoint.com/java/index.htm
     
    A more interactive and "hands-on" way to learn Java would be Codecademy.
    https://www.codecademy.com/learn/learn-java
     
    If you've got money to spend, going for an e-book might not be the worst idea. The content is often explained in a more intuitive manner compared to a lot of free content, with more exercises and better code examples.
    https://play.google.com/store/search?q=Java&c=books&hl=en
     
    There are also some YouTube channels out there with useful tutorials.
    https://www.youtube.com/watch?v=TBWX97e1E9g&t=34s
     
  17. Informative
    Angry Badger got a reaction from Blackhole890 in 2 stupid questions from a newbie (aka me)   
    If you're going for the jobs with the highest pay, you should get an engineering/informatics degree. Classes in those areas will be relevant for many, many years to come, and what you learn in university/college will help you solve a lot of difficult problems, and also just code better in general. C++ and Java are two very common languages taught in universities and colleges due to how central they are in both consumer and enterprise applications. 
     
    If you'd rather work with front-end systems, I'd imagine you'd get paid less than the people working on the back-end, but that's just a guess based on the knowledge required to work within those two areas. It seems that JavaScript/HTML & CSS skills are easier for employers to come by (at least that's the case where I live), while people with skills more useful in back-end development (databases, web-server technologies, security, cloud computing, etc.) are often harder to find, and therefore often end up getting paid more.
     
    But in the end, you'll have to decide based on your interests and way of thinking. If you're a good problem solver, aiming for college/university and engineering or comp. sci. is a good idea. If your thinking is more design-oriented and not so much problem-oriented, there are guides and books on the web, as well as schools which offer courses on the most common front-end languages.
  18. Informative
    Angry Badger got a reaction from Walt in Learn xcode or unity   
    Xcode is an IDE, and Unity is a game development platform. Neither of them should even be considered when you're just starting to learn coding, and you may have misunderstood what those two things are.
     
    If you're serious about starting out with programming, I'd suggest that you first of all get a proper understanding of some popular programming languages. Personally, I'd recommend Java or C# if you're just starting out.
     
    https://www.tutorialspoint.com/java/
    https://beginnersbook.com/java-tutorial-for-beginners-with-examples/
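
    If you want a taste of what starting out looks like, this is roughly the simplest runnable Java program (nothing here is specific to either tutorial above); compile it with javac Hello.java and run it with java Hello:

        // The classic first program: prints a single line and exits.
        public class Hello {
            public static void main(String[] args) {
                System.out.println("Hello, world!");
            }
        }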
     
    Just be aware that learning a programming language takes time and dedication, and making games usually isn't as easy as it seems.
  19. Agree
    Angry Badger got a reaction from ARikozuM in iPhone 7 touch not working?   
    Isn't a hard reset just holding the lock button + volume down until it reboots?
     
    It would have taken you three seconds to just Google it instead of making a thread about it, though.
  20. Agree
    Angry Badger got a reaction from Leg_Licker in S7 Edge   
    The objective truth behind this question is that the Galaxy S7 is still one of the better phones on the market today, and with a lot of stores dropping prices due to the S8 and other new phones, the potential for a good value deal is big.
     
    Claiming that it isn't a good phone today would just be plain stupid. The SoC is still great, the build quality is among the best on the market, and the battery life is generally quite impressive. People with "defective" phones are the exception, and when some claim that there are "reliability issues", you should take that with a grain of salt. There aren't many known major "reliability issues" with the Galaxy S7 that can't happen to other phones, and Samsung screens are among the better ones right now. They're much sturdier and nicer than a lot of people claim they are, and just because someone that some random guy on the internet knows had some anomaly on his screen (or on his old Galaxy phone) doesn't mean that you will.
    And just like any other flagship phone, it gets hot while doing its job. The iPhone 7 I borrowed for a project got about as hot under sustained workloads (but the battery drained considerably faster). There's nothing wrong with the phone getting hot while doing demanding work, and while browsing or watching video it will usually stay cool.
     
    It's definitely still a good phone. That's the absolute answer. A more interesting question would be how much it costs compared to the newer flagships, because that's what matters here. I'd check whether or not it's a lot cheaper than the S8, and if the price difference isn't huge, I'd consider just saving up for the newer flagship.
  21. Agree
    Angry Badger got a reaction from vorticalbox in A program with a Student database   
    I didn't really want to comment on your post, but I feel like I have to.
     
    It's correct that we're aiming to solve problems, but it's not as simple as "make something that works". A solution should always satisfy the basic principles of software development if you're developing a tool for someone else to use. Is the application robust? Is it secure? Is it easy to maintain? Does it scale well? It's your fault if it breaks or is otherwise unstable, or, even worse, compromises confidential information. By just following good practices and planning well, these issues can mostly be dealt with, and you're not encouraging that by denying that the solution presented in this thread is terrible.
     
    If you ever go to college for software development classes or engineering, this is what they'll teach you from the very start. You'll learn that it's not enough to just solve problems in a crappy manner like the OP has done; you'll learn how important it is to develop good habits and knowledge of the best and most common practices when writing software that has the potential to cause problems if it fails.
     
    It's usually more expensive and troublesome to expand later if the application architecture either doesn't scale well or straight-out doesn't work at a certain scale. The ultimate answer is to write something solid that scales well in the first place. Creating something that doesn't scale well is often the source of hard-to-maintain, crappy code. If your plan is solid and your programming skills are on point, this usually isn't an issue, and if it somehow turns out to be an issue, you should go back to practicing instead of implementing poor solutions.
     
    In this specific case you're even more wrong, because a proper storage solution would drastically increase reliability, which would be very important if someone has to rely on your application in order to retrieve data.
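
    For reference, here's a minimal sketch of what "a proper storage solution" could look like in this case: a student table in SQLite accessed through JDBC. This assumes the sqlite-jdbc driver is on the classpath, and the table and column names are purely hypothetical.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class StudentStore {
            public static void main(String[] args) throws SQLException {
                // The database lives in a file, so data survives restarts and crashes.
                try (Connection conn = DriverManager.getConnection("jdbc:sqlite:students.db")) {
                    try (Statement st = conn.createStatement()) {
                        st.execute("CREATE TABLE IF NOT EXISTS students ("
                                + "id INTEGER PRIMARY KEY, name TEXT NOT NULL)");
                    }
                    // Parameterized statements keep malformed input out of the SQL.
                    try (PreparedStatement ps =
                            conn.prepareStatement("INSERT INTO students(name) VALUES (?)")) {
                        ps.setString(1, "Ada Lovelace");
                        ps.executeUpdate();
                    }
                }
            }
        }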
     
    No. This understanding is one that will leave you unemployed.
     
    We don't know that. You're making assumptions, and wrong assumptions can be expensive and troublesome. And I'll say it again: just coming up with a better solution would handle this anyway.
     
     
  22. Agree
    Angry Badger got a reaction from Principis in What Antiviruses do you guys like?   
    I don't really like that people here are talking about "common sense". I think "common sense" is the wrong way to put it, and in my opinion it sounds really cocky. It would be more correct to talk about general internet knowledge rather than common sense. It's also not something you can quantify in a binary manner.
     
    So people here saying "Windows Defender and common sense" are actually saying "Run Windows Defender and read up a lot on internet security to compensate for the lacking protection". I don't think that advice is fair to give unless we're very sure about the user's level of experience and use cases.
     
    Not a rant, but I'm seeing a lot of people putting it like that, and I don't think that's the right way to put it. This isn't to hate on Windows Defender either, despite both its performance and detection rates sucking a bit.
  23. Agree
    Angry Badger got a reaction from bomerr in Is there any reason to SLI 1080Tis now?   
    The only reason for running two GTX 1080 Ti cards in SLI is if you don't care about getting any value for your money. As SLI always has, you'll sometimes get a decent boost, and at other times get barely anything in return, depending on the game or software you're running. Driver support for SLI setups is also generally much less stable and optimized.
     
    I could see a reason for running two of 'em, but there currently doesn't exist a good one as far as I know, unless you've got some kind of specialized workload. A GTX 1080 Ti SLI setup is abysmal value even for an extremely high-end build (which generally isn't value-oriented anyway).
  24. Agree
    Angry Badger got a reaction from Dark Force in how do i stop bitdefender from blocking my applications?   
    I don't have much experience with the free version of Bitdefender, but if you can find settings for real-time protection and such, try turning them off one by one and see if it lets your application through. You can also whitelist the folder containing the files and see if that works.
     
    EDIT: I see you figured it out. Great!
     
    Bitdefender isn't trash, and I don't even have it installed right now. It has consistently performed as one of the best antivirus products on the market for quite some time now, and it's known to run one of the best anti-malware engines out there. Antivirus test results can vary a lot from year to year, but compared to what you recommend as a "proper antivirus", Bitdefender is definitely not bad if we're considering detection rates and performance ratings, and that applies to the free version too, since it runs the same, albeit slightly less feature-rich, anti-malware engine, which honestly is a masterpiece compared to what many other AV vendors have come up with.
     
    Windows Defender is only useful if you know what you're doing. It's otherwise preferred among those who like unintrusive software for a reason: it won't detect or take action before your system is a malware fiesta, a system state which is rarely the case for computer-savvy people. Its detection rates are very inconsistent, its zero-day detection and removal is lacking, and its ability to eliminate stubborn malware just isn't good enough. It's terrible, but people like you and me accept it for its unintrusive behavior.
  25. Agree
    Angry Badger got a reaction from Zando_ in what is the best free antivirus for a new pc   
    Are you paying up for a security suite, or do you just want the basic protection?
     
    If paid, then I'd go with Bitdefender. It's what I used to run on my old laptop, and it's the security suite with the least performance impact, best detection rates and least intrusive behavior I've had any experience with. 
     
    If you're going the free route, there are fewer quality options, but Avira and Avast are well-known and decent alternatives.
     
    If you're just going to play games and download them from a reputable source like Steam, you may be better off without any antivirus. If you're browsing shady sites or downloading a lot of files (mods, games from shady sources), I'd recommend the paid version of Bitdefender, which is a really pleasant experience compared to many other AV products. If you just want to browse safely, either Avira or Avast will protect you adequately.