
WTF is up with School These Days

So I thought I would post my frustration here to let you all deal with it.

 

Simple Question: WTF is up with schools these days?

 

Background: I am a full stack developer for a company that creates and supports applications for agencies under HHS (US gov). In school I was hammered with Java from 9th grade through college (I am still in college while I work full time). Well ... when I got my current job I was tasked with a dashboarding application. I of course chose to go with Java and made a Java applet that I could load in the browser for our customer. After months of work and two all-dayers at the end ... I was ****-slapped in the face with the notification that the government (at least HHS) does not support browser-based applets. So WTF?!?!?!

 

Now I have changed paths and have sharpened my C# skills. I have learned to deploy web apps into AWS GovCloud that can be consumed by government agencies and have begun making custom SharePoint deployments. But still ... wtf was I learning all of that Java syntax throughout school for?

 

I thought this rant was due because today Stanford announced that it is changing many of its computer science classes from Java to JavaScript ... good for you, Stanford!

 

Just seems like if you are not making phone apps or creating applications that run locally ... Java is not what we should be teaching our kids. Learn something universal that is common in both commercial and government environments ... JavaScript!

 

So ... tell me I am a dumb@ss, or back me up and tell me that you are just as frustrated as I am.

3 hours ago, gspark05 said:

But still ... wtf was I learning all of that Java syntax throughout school for?


You know that Java can do more than just build applets, right?

You know that a lot of companies use Java, right?

You know that no language is the best choice for every situation, right?

 

It's unfortunate that you used the wrong language, or potentially just the wrong project type, for the job, but that's hardly a reason to criticise the language or argue against it being taught in schools. With that said, many schools are changing, or have already changed, their intro courses away from Java. A lot are choosing Python, while some are going with other languages. So there are valid reasons to move away from Java, according to some people; yours just isn't one of them.

 

 


As a learner still in school, thanks for giving me this heads up. My comp sci classes at school teach Java, and other than that it's just Python and HTML (in other classes, not comp sci).

 

I've been told to start coding in C first, just to understand how code works, then to bounce to C# or C++ (and, if I'm inclined, JavaScript or Java).

  • CPU = i7 5930k
  • Cooler = Corsair H100i
  • RAM = Dominator Platinum 32GB DDR4-2666
  • Storage = Samsung 850 Pro 1TB
  • GPU = EVGA GTX 1080ti FTW3
  • Case = Corsair 750D
  • PSU = Corsair 1000W
  • Monitor = Asus PG278Q
8 hours ago, gspark05 said:

So I thought I would post my frustration here to let you all deal with it.

--SNIP--

Just seems like if you are not making phone apps or creating applications that run locally ... Java is not what we should be teaching our kids. Learn something universal that is common in both commercial and government environments ... JavaScript!

I would argue Java is a far better language to learn than JavaScript.

It is cross-platform and has a very similar syntax to C++ and C# (which most universities don't teach because it used to be locked to Windows).

Although Python would be an even better choice (it's what we use(d)) because of its clear scientific focus in terms of available modules (numpy, scipy, opencv, etc.) and its scripting nature.

JavaScript is actually a pretty suboptimal language for computer SCIENCE.

 

Building (web) apps is not really the type of job that university prepares you for.

In university you're supposed to learn a whole lot more, such as linear algebra, statistical reasoning, logic, low-level programming, computer vision, etc.

Most of these topics don't really apply to a web development job; you learn them to do more advanced things.

I don't think your university taught you Java with the primary goal of you developing Java applets for a living.

 

Finally: Java applets died years ago.

It's on you for not staying up to date; don't blame your education.

Java is still a valid language in web development, but as a server-side language only.
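To make "server-side only" concrete, here is a minimal sketch of server-side Java using nothing but the JDK's built-in com.sun.net.httpserver package (the class name and endpoint are made up for illustration; a real deployment would more likely use a framework such as Spring):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical example: a tiny JSON endpoint served by plain Java.
public class DashboardServer {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/status", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = start(0); // port 0 = pick any free port
        System.out.println("Listening on port " + server.getAddress().getPort());
        server.stop(0);
    }
}
```

The browser only ever sees the JSON (or HTML) the server returns; no plugin is involved, which is exactly why this style survived the death of applets.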

Desktop: Corsair RM550x | Ryzen R9 3900X |RTX 2080 (borrowed from work) - MSI GTX 1080 | 64GB 3600MHz CL16 memory | Asus X470-F Gaming | Samsung 970 EVO 512GB | Samsung 830 256GB | 3TB HDD | Corsair 450D | Corsair H100i (NF-F12 fans) | MG279Q

Laptop: Surface Pro 7 (i5, 16GB RAM, 256GB SSD)

Console: Playstation 4 Pro

12 hours ago, gspark05 said:

Java is not what we should be teaching our kids.

True, but ...

 

12 hours ago, gspark05 said:

Learn something universal that is common in both commercial and government environments ... JavaScript!

... ugh!

 

As constructive criticism, aside from my usual ranting about disgusting hipster languages with more design failures than actual features:

When I was young (admittedly, still notably later than some of you old Fortraners), our school had (voluntary) classes in BASIC and Pascal. From today's perspective and with all the current "everyone uses them!!1" languages - Java, JavaScript, PHP, Python - in mind, I fail to see why Pascal has become a forgotten language as of 2017.

 

There is no sane reason to prefer JavaScript or Python to a mature and well-thought-out teaching language like Pascal. (I do know that Pascal is usually not part of the list of languages I recommend, but please take this post as one tailored for schools.)

Write in C.


You raise points that are perfectly valid in my personal experience.

6 years ago I started my Computer Science major (I quit 4 years ago), and everything the university was offering was a lie, at least in relation to what they promoted.

Java, pure assembler (really?), R, PL/SQL (aka Oracle, which I WILL NEVER FREAKING USE!), pseudo-code (which is totally and utterly pointless, since it's basically the same thing as writing in C, but you use words instead of operators and functions).

tl;dr: Dropped out because what they tried to teach me wasn't marketable. Instead I now have workshop experience, market research certification and I know how to tend bar.

Remember kids, the only difference between screwing around and science is writing it down. - Adam Savage

 

PHOΞNIX Ryzen 5 1600 @ 3.75GHz | Corsair LPX 16Gb DDR4 @ 2933 | MSI B350 Tomahawk | Sapphire RX 480 Nitro+ 8Gb | Intel 535 120Gb | Western Digital WD5000AAKS x2 | Cooler Master HAF XB Evo | Corsair H80 + Corsair SP120 | Cooler Master 120mm AF | Corsair SP120 | Icy Box IB-172SK-B | OCZ CX500W | Acer GF246 24" + AOC <some model> 21.5" | Steelseries Apex 350 | Steelseries Diablo 3 | Steelseries Syberia RAW Prism | Corsair HS-1 | Akai AM-A1

D.VA coming soon™ xoxo

Sapphire Acer Aspire 1410 Celeron 743 | 3Gb DDR2-667 | 120Gb HDD | Windows 10 Home x32

Vault Tec Celeron 420 | 2Gb DDR2-667 | Storage pending | Open Media Vault

gh0st Asus K50IJ T3100 | 2Gb DDR2-667 | 40Gb HDD | Ubuntu 17.04

Diskord Apple MacBook A1181 Mid-2007 Core2Duo T7400 @2.16GHz | 4Gb DDR2-667 | 120Gb HDD | Windows 10 Pro x32

Firebird//Phoeniix FX-4320 | Gigabyte 990X-Gaming SLI | Asus GTS 450 | 16Gb DDR3-1600 | 2x Intel 535 250Gb | 4x 10Tb Western Digital Red | 600W Segotep custom refurb unit | Windows 10 Pro x64 // offisite backup and dad's PC

 

Saint Olms Apple iPhone 6 16Gb Gold

Archon Microsoft Lumia 640 LTE

Gulliver Nokia Lumia 1320

Werkfern Nokia Lumia 520

Hydromancer Acer Liquid Z220


You should have known this. Java applets have been unofficially deprecated since Java 8 (2013). Even more restrictions were placed on them in the later releases of Java 7.

 

There's actually pretty OK reasoning behind why Java is taught. It's a nice language to teach SOLID OOP with since it doesn't have the complexity of C++ or the Microsoft nuances of C#. Also, in my experience, unless it's their chosen field, most uni professors are dreadful programmers.
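For anyone wondering what "teaching SOLID OOP with Java" looks like in practice, here is a made-up classroom-style sketch of the dependency-inversion idea (all names invented): the high-level class depends on an interface, not on a concrete implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical teaching example: the "D" in SOLID (dependency inversion).
interface MessageStore {
    void save(String message);
}

// One concrete implementation; a file- or database-backed one could be
// swapped in without touching AuditLog at all.
class InMemoryStore implements MessageStore {
    private final List<String> messages = new ArrayList<>();
    public void save(String message) { messages.add(message); }
    public int count() { return messages.size(); }
}

class AuditLog {
    private final MessageStore store; // depends only on the abstraction

    AuditLog(MessageStore store) { this.store = store; }

    void record(String message) { store.save(message); }
}

public class SolidDemo {
    public static void main(String[] args) {
        InMemoryStore store = new InMemoryStore();
        AuditLog log = new AuditLog(store);
        log.record("hello");
        System.out.println(store.count()); // prints 1
    }
}
```

The same lesson can be taught in C++ or C#, but Java keeps it free of header files, pointers, and platform-specific tooling.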

 

Now, teaching JavaScript? I personally think that's a mistake.

 

If I were going to teach languages on a university curriculum, I'd definitely include some C and assembly (perhaps nothing CISC; probably ARM or similar), and then throw in something like Go.

And, if my thought-dreams could be seen,
they'd probably put my head in a guillotine.
But, it's alright, ma, it's life, and life only.


The major web browsers no longer support Java applets (good riddance to NPAPI and ActiveX plugins). Like others have said, you really should be keeping up with the times, especially for things related to your field of work. Unless you're using an older web browser version, you shouldn't even be able to run Java applets now.


The problem with schools is that they're slow to change their curriculum at the speed technology requires. Most of that problem is ensuring that you are teaching students something standardized, and something that the old farts on the board of trustees think is appropriate. But at the same time, nobody can predict the future. Back when I was in college, Node.js wasn't a thing (neither was Chrome), C# seemed to barely get any recognition, Adobe Flash was still the hottest way to deploy web apps, and everyone and their mother was using LAMP stacks.

 

I guess it depends on whether the coursework is trying to teach you the latest and greatest thing, or trying to be as broad and timeless as possible. There were classes I took that for the most part won't become obsolete, because they are the core foundations of software development.

 

On the other hand, if you want to make a career out of software development, it rather behooves you to keep up with the times and not rely on just what you were taught in school.

9 hours ago, Dat Guy said:

JavaScript or Python to a mature and well-thought teaching language like Pascal.

Yes there are. There are many.


I'm from Germany, and here (at least at my school) you don't even really learn how to program. From 9th to 10th grade we were seriously taught how to use text formatting in Word and Excel! I know a few people from other schools who actually learned how to program in Java, but they were taking extra classes for this.

Java applets have been deprecated since 2013 and have been blocked by browsers for even longer, especially because Java isn't really known for its good security.

But still, learning Java gives you a good understanding of how to build and design applications. In my experience, learning Java is relatively easy, and you get cross-platform compatibility and great OOP support. If you've learned Java once, it's quite easy to learn another language like C#, since a lot of programming languages have a pretty similar syntax. You can also use Java to build powerful cloud applications; just take a look at Google Cloud Platform (App Engine) and the Endpoints framework. They probably aren't government-ready, but you can use them for custom frontends and scale them really easily. Android applications can be built with Java as well.

On 4.5.2017 at 0:21 AM, gspark05 said:

I thought this was a due rant because today Stanford announced that it was changing many of their computer science classes from Java to Javascript ... good for you Stanford!

 

Just seems like if you are not making phone apps or creating applications that run locally ... Java is not what we should be teaching our kids. Learn something universal and that is common in both commercial and government enviornments ... Javascript!

 
 

I don't think JavaScript would be the best candidate to replace Java in education. JS doesn't have good OOP support, and it can be really painful to debug JS(/Node) applications -.- xD If you really wanted to replace Java, I'd go with Python. It's pretty easy to learn, quick to develop in, has great support and a lot of open-source libraries, and is the de-facto standard language for machine learning/AI (aside from C++...)


Guess I am behind the times in Java news. I get that learning the basics is great and that you can then choose your path after school.

 

Nonetheless, as an individual working for non-DOD government agencies, I find Java unacceptable. My company is trying to hire individuals out of school, but none of them are experts in anything except Java.

 

On 5/5/2017 at 9:14 AM, MoVo said:

I'm from Germany and here (at least at my school) you don't even really learn how to program.

--SNIP--

But still, learning Java gives you a good understanding of how you build and design applications. In my experience, learning Java is relatively easy and you're getting cross-platform compatibility and great OOP support.

--SNIP--

Java is OOP. Out of the box it is limited, but there are plenty of libraries that make it quite robust as far as OOP is concerned.

 

On 5/3/2017 at 9:42 PM, madknight3 said:

You know that Java can do more than just build applets, right?

You know that a lot of companies use Java, right?

You know that no language is the best choice for every situation, right?

 

 

I had no clue ...

I was talking about the use for the Federal Government ... specifically outside of DOD. Yes, I went a bit further than that, and I know where Java is more useful. But 8 years of Java and no Python, C#, JS or C is a bit ridiculous.

 

I didn't mean to get people butt hurt. Just ranting a bit.


I only spent a little time on Java. I spent a considerable amount of time in other languages like C++.

On 04/05/2017 at 2:05 PM, revsilverspine said:

--SNIP--

Java, pure assembler (really?), R, PL/SQL (aka Oracle, which I WILL NEVER FREAKING USE!), pseudo-code (which is totally and utterly pointless, since it's basically the same thing as writing in C, but you use words instead of operators and functions)

Pseudo-code is far from useless; it helps you design complicated functions. At least you got taught languages that are actually used. All the material in my college and uni life was Visual Basic, and not even the most recent version.
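To back up the point that pseudo-code is a design tool rather than busywork, here is a small invented example: sketch the steps in plain words first, then translate them mechanically into real code (Java here, but any language would do).

```java
// Design sketch in pseudo-code first:
//   given a list of scores and a passing mark
//   drop every score below the passing mark
//   return the average of what remains, or 0 if nothing passed
public class AverageDemo {
    static double passingAverage(int[] scores, int passMark) {
        int sum = 0, count = 0;
        for (int s : scores) {
            if (s >= passMark) { // "drop every score below the passing mark"
                sum += s;
                count++;
            }
        }
        return count == 0 ? 0 : (double) sum / count;
    }

    public static void main(String[] args) {
        System.out.println(passingAverage(new int[]{40, 70, 90}, 50)); // prints 80.0
    }
}
```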

                     ¸„»°'´¸„»°'´ Vorticalbox `'°«„¸`'°«„¸
`'°«„¸¸„»°'´¸„»°'´`'°«„¸Scientia Potentia est  ¸„»°'´`'°«„¸`'°«„¸¸„»°'´


This will also depend on the schools the person has been to.

 

I had no programming until sixth form, where we did some fairly outdated Visual Basic stuff, which was awful. But the more general "how to program" material was useful enough, including general search strategies and vaguely covering Big-O notation without ever actually naming it, just kind of explaining the ideas around it.
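For readers who haven't met the "search strategies and Big-O" material yet, here is a small illustrative sketch (not from any particular syllabus, written in Java): a linear scan inspects up to n elements, while binary search on sorted data halves the range each step, i.e. O(n) versus O(log n).

```java
// Two search strategies with different Big-O costs.
public class SearchDemo {
    // O(n): check every element until the target turns up.
    static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;
        }
        return -1;
    }

    // O(log n): halve the search range each step; a must be sorted.
    static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1; // unsigned shift avoids int overflow
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {2, 3, 5, 7, 11, 13};
        System.out.println(linearSearch(data, 7));  // prints 3
        System.out.println(binarySearch(data, 7));  // prints 3
    }
}
```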

 

Uni at least was a decent mix of C, C++, Java and Python, which to me at least gives you a decent spread of most forms of language. 

To be honest though, it doesn't especially matter what languages a person knows. As long as they understand general programming paradigms, you know they are okay. Then you just build on that with decent training when they start the job.

 

I started my job (well, a year-long placement) knowing exactly no C#, but since I was a good programmer in other languages, it was a case of picking up the general syntax, as well as some of the more specific C# stuff like LINQ. I got taught all that with various training modules before anyone expected me to actually code anything for a real project.

Really, more companies need to invest in their grads a bit more and just have a bit of training to bring the person in. It helps build up confidence (mine at least), and it can also be used to introduce all the stuff that you just don't get in a uni degree, like code reviews, sticking to code guidelines for syntax, how to interact with a customer, how to estimate your work, etc.

 

It helps everyone: the grad feels more at ease and not thrown in at the deep end, and the company gets more options for people to hire. They know that once the grads have finished the training they should all be at the same level at a minimum, which means you can trust them with work a lot more.

CPU: 6700k GPU: Zotac RTX 2070 S RAM: 16GB 3200MHz  SSD: 2x1TB M.2  Case: DAN Case A4


You're a software engineer; you adapt to what is needed. Sometimes you need to pull a Tony Stark...

 

 

 

Companies should invest more in helping students at least acquire internships over the summer, to learn and become capable programmers. It would definitely help with confidence.

 

 

[ Cruel Angel ]: Exterior - BENQ XL2420T | SteelSeries MLG Sensei | Corsair K70 RED | Corsair 900D | CPU: 4.7GHz @ 1.425v

Interior - i7 4770k | Maximus VI Formula | Corsair Vengeance Pro 16GB | ASUS GTX 980 Strix SLI x2 | 840 Pro 512GB | WD Black 2TB | RAM: 2400MHz OC @ 1.650v

Cooling - XSPC 120mm x7 total radiator space | XSPC RayStorm | PrimoChill tubing/res | GPU: 1000MHz @ 1.158v


I came through school relatively recently using Java, specifically Enterprise Java towards the end, focused on web applications (I only mention this to point out that you can do more than Android/local machine applications). It's not as prevalent in the industry where I live (compared to .NET), but there are still jobs around, and IMO the syntax taught is secondary to the rest of the education. I work as a pseudo full stack developer doing .NET MVC primarily, and I still apply most of my education to what I do. The biggest change for me was typing "using" rather than "import" :P
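To show how small that Java-to-C# jump really is, here is an invented fragment; the equivalent C# file differs in little more than the keyword at the top.

```java
// Java pulls types into scope with "import" ...
import java.util.ArrayList;
import java.util.List;

public class ImportDemo {
    // ... where the matching C# file would start with
    // "using System.Collections.Generic;" and build a List<string>.
    static List<String> roster(String... names) {
        List<String> list = new ArrayList<>();
        for (String n : names) {
            list.add(n);
        }
        return list;
    }

    public static void main(String[] args) {
        System.out.println(roster("alice", "bob").size()); // prints 2
    }
}
```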

 

However, I will say that it was explained to us, in no uncertain terms, that we should just pretend Java applets didn't exist.

 

I know this whole thing is kind of why code camps around my area don't get accredited: they want to be able to modify their curriculum. I went through school using Java, and I've not written a line of Java professionally. It really doesn't bother me. There was so much more to my education than a single language.

 

Just my 1.5 cents.

On 5/3/2017 at 3:21 PM, gspark05 said:

So I thought I would post my frustration here to let you all deal with it.

Simple Question: WTF is up with schools these days?

--SNIP--

 

After months of work and two all-dayers at the end ... I was ****-slapped in the face with the notification that the government (at least HHS) does not support browser-based applets. So WTF?!?!?!

 

--SNIP--

No one in the entire world should support browser applets anymore, since they pose a security risk. NPAPI, required for Java and other browser-based applets, is being phased out in all major browsers, so we're sort of in between times right now. It's great to see JavaScript and HTML5 being pushed, since using things built natively into the browser is better in every way possible. However, there are issues with using heavy amounts of JavaScript for websites or webapps too, so server-side languages are ultimately better.

https://en.wikipedia.org/wiki/NPAPI#Browser_support

Desktop: KRySTaLoGi-PC Build Log (i7-4790K, RTX2060) Mobile: OnePlus 5T | Bell - Unlimited Calling & Texting + 10GB Data
Laptop: Dell XPS 15 9560 (the real 15" MacBook Pro that Apple didn't make) Tablet: iPad Mini 5 | HP Touchpad | ASUS ME302C
Camera: Canon SX280 + Rebel T1i (500D) | Sony HDR-AS50R | Panasonic DMC-TS20D Music: Spotify Premium (CIRCA '08)


If a person knows how to program, they should be able to pick up any given language after maybe a week or two of study or training, though that depends on their willingness.

 

I guess I'll just get to the bragging part. I've been able to self teach:

  • VB6
  • HTML/CSS
  • C#, more or less. It was the language used in one of my school's classes and I never touched it before then, but I spent most of the class time figuring out what the lab assignment was, then got to work on my own.
  • Python
  • JavaScript

I tried to teach myself Ruby, but for some reason my mind doesn't want to accept it.

 

So does it really matter what language they know or don't know? I don't think so. I think it's important they know how to solve a problem and are willing to adapt.


First of all: as others have said, it doesn't really matter what language you learned in school; it's the fundamentals and problem solving that matter. The far bigger problem with school is that, in general (for IT), it does a piss-poor job of preparing people for the actual work environment. Fundamentals are all well and good, but most programs fail utterly at giving students a realistic idea of what they are going to be doing...

 

So what people expect is that you'll be solving cool problems and writing clean code that follows best practices, when the reality is that you're going to spend most of your time working with old hacked-together spaghetti code that doesn't make any sense because the requirements were poorly defined and changed half a dozen times during development.  There also seems to be an appalling lack of database work considering how critical data manipulation is in many software development jobs.

 

With that being said: you (and mostly your team leaders/managers) utterly failed. How did you spend MONTHS working on something without finding out what the basic requirements were (i.e. language, browser, security restrictions, etc.)? That's a question that needs to be asked DAY ONE, before a single line of code is written, and the fact that no one in the room bothered to ask what limitations, if any, there were is a major problem. Not necessarily your fault if you're a rank-and-file developer who does what you're assigned, but you can't just make assumptions like that... "don't assume anything" is rule #1 of development.

 

For that matter, Java is used by many government/military applications. The problem wasn't the language choice; the type of project (an applet) is where you went wrong. Browsers are extremely locked down when you're talking about anything related to the government/military, and you can't just use whatever you want, because the settings required would represent a major security risk.

 

Also, I don't agree that JavaScript is a good language for schools. It can be used in many different ways, but there are MANY pitfalls where you can do a lot more harm than good with respect to good programming habits. C# and Java are FAR superior for learning programming fundamentals, and as a full-stack developer who has worked in many languages and environments I would NEVER recommend JavaScript as a starter language.


I didn't really start to code until my first semester at college (last semester).

I started with Java, which was annoying, but I did learn some complex ways to solve problems. At the same time, I was learning JavaScript. For a long time, I never understood why anyone would use Java when there is JavaScript: the almighty, more powerful (and easier) language! But then I learned that, even though many languages can do the same things, they each have their place.

 

I was upset that my school didn't explain what the hell Informatics was well enough. They explained it as "Computer Science but less intense and more businessy!", which is partly right, but mostly wrong. My first time taking INF 201 was the best time of my life. I really learned how to use HTML, CSS, and raw JavaScript to make things work on a webpage.

 

I just think it's miscommunication mixed with politics within the school. They don't want to move money around and change things that "are working".

 

If you're doing web-type stuff, I strongly suggest you learn JavaScript. The syntax looks similar to Java, but it's more forgiving and dynamic.

 

 

