
NVIDIA CEO Jen-Hsun Huang: Don't learn computer science. The future is human language (AI code generation)

Summary

At the World Government Summit in Dubai, Jen-Hsun Huang, CEO of Nvidia, said that children shouldn't learn computer science. In the future, coding will be done through human language and AI will generate the program.

People should focus "on more valuable expertise like biology, education, manufacturing, [...]. This makes every person in the world a programmer".

 

Quotes

Quote

Over the last 10-15 years, almost everybody who sits on a stage like this would tell you that it is vital that your children learn computer science, everybody should learn how to program. In fact, it is almost exactly the opposite. It is our job to create computing technology such that nobody has to program, and that the programming language is human. Everybody in the world is now a programmer. This is the miracle of AI. [...] You now have a computer that will do what you tell it to do. It is vital that we upskill everyone and the upskilling process will be delightful and surprising.

 

My thoughts

Won't happen. Code is precise. The languages of choice (programming languages) are designed to be unambiguous. Human language is ambiguous and people aren't precise. Further, AI isn't intelligent: all it does is produce output based on probabilities, without any understanding of what that output means.

Because the AI is an imposter, the output looks good but is prone to error. Running analysis on such a code base is a bad idea, and turning the output into a meaningful solution requires understanding that output. In this case, knowing how to code.

 

Video of Marcus Hutchins demonstrating the pitfalls of ChatGPT generating "secure" code:

 

Asking Microsoft Copilot to read a CSV in Mathematica and plot it as a heatmap required 13 revisions to get to this point.

Notice how the x- and y-axes are swapped? Copilot just can't do it, regardless of how you ask.

[Image: Copilot-generated Mathematica heatmap with the x- and y-axes swapped]
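For what it's worth, the orientation problem Copilot kept getting wrong is a one-line transpose once you can read the code. A minimal Python sketch (the tiny inline CSV is a made-up stand-in for the real file):

```python
import csv
import io

# Tiny inline CSV standing in for the real data file (made-up values):
# each row is one y-value, each column one x-value.
data = io.StringIO("1,2,3\n4,5,6\n")
rows = [[float(v) for v in line] for line in csv.reader(data)]

# rows[y][x]: plotting this directly puts the rows on the vertical axis.
# If the heatmap comes out with x and y swapped, transpose before plotting:
transposed = [list(col) for col in zip(*rows)]  # transposed[x][y]

# With matplotlib you would hand one or the other to plt.imshow(...);
# knowing which one is correct requires reading the code.
print(rows[0][1], transposed[0][1])  # 2.0 4.0: same indices, different cells
```

Which is the whole point: the fix is trivial if you can read the code, and invisible if you can't.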

 

 

Sources

https://www.tomshardware.com/tech-industry/artificial-intelligence/jensen-huang-advises-against-learning-to-code-leave-it-up-to-ai

People never go out of business.


6 minutes ago, FlyingPotato_is_taken said:

At the World Government Summit in Dubai, Jen-Hsun Huang, CEO of Nvidia, said that children shouldn't learn computer science. In the future, coding will be done through human language and AI will generate the program.

[…]

And then who will code the AI algorithms?

Another AI 😂?

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


I mean... saying he has a vested interest in saying this would be a massive understatement 😛 so I wouldn't really take his word for it. What's next, Apple telling you to stop buying computers because iPads are the future? (oh wait)

 

There's a good reason we don't use human language for programming... and it's not because we've previously been incapable of translating, say, English to machine code. It's because human language is, by nature, ambiguous and contextual in a way that programming languages aren't. Even assuming that an LLM will one day be capable of producing correct code that is thousands of lines long with no errors, which is a bold assumption imo, people who know nothing of computer science won't be able to describe accurately what they even want. Heck, they won't even really know what they want most of the time. The purpose of studying computer science isn't to learn a programming language, most people can manage the basics of that in a couple weeks; it's about knowing what a system should do, and how it should behave to achieve that.
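As a toy illustration of that ambiguity, here is one English request with two equally defensible implementations (both functions and the sample data are made up):

```python
# The English request: "remove the duplicates from this list".
# Both functions satisfy it; they just disagree about what it meant.
def dedupe_keep_order(items):
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

def dedupe_sorted(items):
    # An equally defensible reading: a set simply has no duplicates.
    return sorted(set(items))

names = ["bob", "ann", "bob", "cat", "ann"]
print(dedupe_keep_order(names))  # ['bob', 'ann', 'cat']
print(dedupe_sorted(names))      # ['ann', 'bob', 'cat']
```

A programming language forces you to pick one meaning up front; plain English lets you not notice there was a choice.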

 

Now, some fields of computer science may well be made obsolete by modern high performance LLMs; for example, natural language processing has been a field unto itself (keep in mind, voice recognition and voice-based "assistants" like Siri or Alexa did not use an LLM at launch) and could become irrelevant if AI-based systems are able to solve the root problem. That just means those computer scientists can move on to work on other problems that LLMs have no relevance to.

18 minutes ago, PDifolco said:

And then who will code the AI algorithms?

Another AI 😂?

Well, that's what they'd have you believe. I guess in theory it's not completely impossible... but even if it were to happen and you could get a tailor made CNN for your specific problem just by asking an LLM for what you need it wouldn't mean CS is dead as a field, for the reasons I detailed above.

 

This kinda reads like a calculator manufacturer saying kids shouldn't learn math anymore because calculators have made the field obsolete.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Interesting to see who spins this which way.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 minute ago, Sauron said:

people who know nothing of computer science won't be able to describe accurately what they even want.

I assume you've had the wonderful experience of developing something that matched the customer's description to a T, only for them to come back with "What on earth is that? That's not what we wanted at all."

 

Even as a developer, I'd say the most unambiguous way to describe what you really want is to write it in code. As you said, human language isn't precise, which means you'd likely have to add so much clarifying context that it is probably more efficient to simply write the code yourself.

 

Additionally, if you don't know how to code, how do you verify that the output is correct? Sure, you can run and observe the program. You can probably have another AI generate test cases. But that still requires at least a rudimentary understanding of what you're doing.
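To illustrate why even the test-case route needs that rudimentary understanding, a small Python sketch (the median function is a hypothetical stand-in for AI-generated output):

```python
# Suppose an AI generated this (hypothetical) median function:
def median(values):
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

# Spotting the bug requires knowing what the right answers are.
print(median([1, 3, 2]))     # 2: looks fine on odd-length input
print(median([1, 2, 3, 4]))  # 3: wrong, the median of this list is 2.5
```

If you don't already know the median of an even-length list averages the two middle values, every test you (or another AI) write from the same misunderstanding will pass.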

 

24 minutes ago, PDifolco said:

And then who will code the AI algorithms?

Another AI 😂?

Fast forward a few years: software companies desperately searching for employees able to debug and fix their AI-generated code… As the leader of a company that relies heavily on both hardware and software (drivers), I'm not sure how Jen-Hsun Huang can make such a statement in good faith.

Remember to either quote or @mention others, so they are notified of your reply


Lawmakers need to pay more attention to what "AI" is going to be doing in the next few years. With bots getting better at a lot of tasks, some companies are looking to offload work to this "AI" to save on costs. More and more will hop on board, and we may see something similar to the industrial revolution, where jobs that were once prominent are replaced by automation. These people need to be moved into other jobs, or we will see a drop in the economy due to less spending, and the companies that replaced humans with bots will then simply get payouts after complaining their service isn't needed. The transition period needs to be addressed as we move from the jobs we have now to other roles and other ways of thinking about work. The same goes for schools that are teaching out-of-date information and ways of thinking/learning.

 

Depending on what the ultimate outcome of schooling is, most people simply need to know how bots work, how to use AI, and how to think creatively about using bots to supplement their work, because that is what we will see in the near future: a lot of tasks become automated, and a lot of old jobs get lumped together, since one person utilizing a bot can cover multiple roles very quickly. The other role is helping design the bots on the back end, which can itself use bots for the initial coding, but you still need people who understand the fundamentals underneath the code. That is what we should start seeing in university computer science programs. Most of what I see from grads isn't useful for this sort of work; it's very baseline and is usually just a way of saying "this person wants to learn more and is easy to teach".

 

Note this is just an opinion, and what I see as someone who researches and creates AI solutions. It's coming a lot sooner than people think.

 


There will always be a need for specialized skills, even if they're for 'legacy' systems, or just to support the new systems. But he may not be wrong in the broad sense, meaning that the need for skilled programmers may drop by orders of magnitude as AI really gets going.

 

Yes, some people on the planet will have to know how to manually write code. But does that really invalidate his point if it's only 10% of what it used to be? Even 50%? I can see that for common, simple applications that average joes and small businesses might want, they will have less of a need to hire someone to make that happen in the future.

 

 

 

 

 


I'm gonna press X to doubt on this one. Computer science was never really about writing code; that's software engineering. In CS, writing code has always been a means to an end rather than an end goal in and of itself. The true focus, since Gottfried Wilhelm Leibniz, has always been on creating algorithms and furthering our understanding of computation. Suggesting that computer scientists won't be necessary anymore because we have AI is like suggesting mathematicians won't be necessary because we have calculators, WolframAlpha, and so many formulas for so many different things already, so why would we need another formula or proof when we've already discovered so much? The point is, there's still a lot left for humanity to learn in both math and computer science, and Jensen is completely wrong if he thinks computer science can be replaced with AI.


Lol, imagine telling people software engineers are going to be replaced by software that needed software engineering to be created and updated. AI can do lots of cool stuff, but when it messes up, especially when trying to write code, it's hilariously bad.


13 hours ago, porina said:

John Carmack: "The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry."

The more I think about it, the more I think he is wrong. This will actually raise the barrier to entry. If anyone can use AI to write programs or at least automate simple processes that previously required programming, then only people with exceptional skills will be able to make a career out of it. Just as AI is taking jobs in other fields, it will take low level jobs in IT.

 

Yes, John Doe in accounting will be able to automate some of his workflow without having to involve a developer, or learn anything about software development. But in return Jane Doe finds herself without a job, because the need for an in-house developer is gone.

 

So I guess in a sense Jensen Huang is right. If you want to be a pro-athlete, you better be very good at it. If you want to be a developer, you better be very good at it. Anyone else, maybe think about moving into another field (that isn't in danger of being replaced by AI…).



I'm pretty sure back when AI images were simply blobs trying to look like something, a lot of people were laughing saying it'd never amount to anything. Yet look at the strides it's made in the past 2 years alone. 

 

To say it'll never be able to do something is ridiculous and very short-sighted. You simply sound like the line workers in a turn-of-the-century auto plant, or the people who tended countless horses during the rise of the automobile.

 

I get the shots at him saying things like this, as he does have a massive vested interest. At the same time, and for the same reason, he likely sees things many don't, and knows what's coming down the pipeline. 

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 


 

 


11 hours ago, Touch My Hamm said:

Lawmakers require more attention to what "AI" is going to be doing in the next few years.

Good luck. They still don't understand basic encryption; asking them to wrap their heads around AI is going to be harder than AI generating consistent hands.


AI that will code AI stuff. Sweet Home Alabama, but with computers. What could possibly go wrong?


1 hour ago, dizmo said:

I'm pretty sure back when AI images were simply blobs trying to look like something, a lot of people were laughing saying it'd never amount to anything. Yet look at the strides it's made in the past 2 years alone. 

It's kind of misleading to say "in the past 2 years alone" when what we're seeing now is the result of a decade of work. The past couple of years is just the moment where it got good enough to make mainstream news. Assuming it will just keep improving at the same pace is also likely wishful thinking since, to get it this good, we pretty much fed it the entirety of the internet; getting significantly more data will be really hard.

1 hour ago, dizmo said:

To say it'll never be able to do something is ridiculous and very short sighted. You simply sound like the line workers in a turn of the century auto plant, or the people who tend to countless horses during the rise of the automobile. 

Car plants still employ human workers.

1 hour ago, dizmo said:

I get the shots at him saying things like this, as he does have a massive vested interest. At the same time, and for the same reason, he likely sees things many don't, and knows what's coming down the pipeline. 

It doesn't really matter what he does or doesn't know because he'd say this either way. Even if he knew for a fact this was a bubble about to burst at any moment, he'd still have a vested interest in keeping the hype up a bit longer so he could cash out. Right now GPT isn't doing any computer science, at best it's doing code monkey work and only on small assignments. Even assuming a fast pace of improvement, currently we're not even close to CS becoming irrelevant.

 

Consider writers: ChatGPT can write some pretty convincing text that is also grammatically correct (most of the time), which is more than can be said about its programming performance. And yet writing a good novel or article is not just about writing correct and coherent English. Maybe we can be rid of the busywork involved in actually typing out the prose or code, but there's no indication a GPT will be able to autonomously devise a good story or a well-structured code base given just a vague description of what a system should do.



Lesson 5: Shopping For Milk (and Eggs)? – David Hurley In Japan

 

Pretty much illustrates why natural language alone is a bad idea for precisely laying out what you want in a concise and unambiguous way. 

 

My key takeaway is this: Don't let anyone tell you it's ok not to know something, that they've taken care of it with their product, which you should consume, by the way.

 

3 hours ago, dizmo said:

I'm pretty sure back when AI images were simply blobs trying to look like something, a lot of people were laughing saying it'd never amount to anything. Yet look at the strides it's made in the past 2 years alone. 

 

To say it'll never be able to do something is ridiculous and very short sighted. You simply sound like the line workers in a turn of the century auto plant, or the people who tend to countless horses during the rise of the automobile. 

 

I get the shots at him saying things like this, as he does have a massive vested interest. At the same time, and for the same reason, he likely sees things many don't, and knows what's coming down the pipeline. 

I think the general issue people tend to have a blind spot for is the implications of these advancements and the long-term effects. Some have a vested interest in hyping it up even at these early, fairly crappy stages because they stand to gain something from it. Others are in a panic because they see the looming threat on the horizon. See the SAG-AFTRA and Writers Guild of America strikes from the previous years.

 

The ones denying the possibility of massive upheaval are generally those who like to imagine that they're still safe and secure in their position. Just ask a lot of software developers who have been gloating for a while that they'll make you redundant eventually, so you'd better start learning to code. Now that their neck is on the line, through their own making no less, the tune sounds a lot scarier, so wishing it away is a coping mechanism.

 

The problem with AI advancements lies at the intersection between the interests of those who have capital and those who don't, as is the case with pretty much any innovation that makes a certain type of labor redundant: those who do can lower their costs, while those who don't lose their income in the process. It's a tale as old as the printing press.

 

One of the core issues I see not being addressed is what we should do about it. You can argue that AI will or won't be able to replace swathes of employees until you're blue in the face. Even if the current AI revolution isn't the one that makes everybody redundant, this stuff will keep happening in some form or another, probably with ever greater frequency and magnitude. And the previous times this happened, it at least took a while to proliferate on a global scale; that's not a luxury we have anymore. If someone creates a revolutionary tool that could massively lower HR costs, it'll gain traction everywhere almost immediately due to the competitive nature of free-market capitalism. Anyone not engaging in that is either idealistic or stupid, at least by capitalist success metrics. This means that a large number of workers could find themselves without a job simultaneously, potentially with no way to pivot to other industries, if those even have jobs available. Sure, some jobs will remain and new ones will appear to go along with these advancements, but the tricky nature of AI is that it's a kind of Ouroboros: those new jobs could also end up on the chopping block later on.

 

TL;DR: May you live in interesting times.

And now a word from our sponsor: 💩

 


9 hours ago, Eigenvektor said:

The more I think about it, the more I think he is wrong. This will actually raise the barrier to entry. If anyone can use AI to write programs or at least automate simple processes that previously required programming, then only people with exceptional skills will be able to make a career out of it. Just as AI is taking jobs in other fields, it will take low level jobs in IT.

 

Yes, John Doe in accounting will be able to automate some of his workflow without having to involve a developer, or learn anything about software development. But in return Jane Doe finds herself without a job, because the need for an in-house developer is gone.

 

So I guess in a sense Jensen Huang is right. If you want to be a pro-athlete, you better be very good at it. If you want to be a developer, you better be very good at it. Anyone else, maybe think about moving into another field (that isn't in danger of being replaced by AI…).

I would say that this probably won't be the case. The biggest issue I see is that if you don't know how something works, there can be bugs and issues that would take quite a bit of testing to catch, to ensure what the AI spits out actually works. The idea that Joe from accounting will be able to do that is just unrealistic. I think AI is likely to be a good tool that helps software developers, but it won't replace them.


3 hours ago, Brooksie359 said:

I would say that this probably won't be the case. The biggest issue I see is that if you don't know how something works, there can be bugs and issues that would take quite a bit of testing to catch, to ensure what the AI spits out actually works. The idea that Joe from accounting will be able to do that is just unrealistic. I think AI is likely to be a good tool that helps software developers, but it won't replace them.

I expressed the same doubts in my first post, so we're largely on the same page.

 

I think it will depend on how complex a task we're talking. When I say "automate some of his workflows" my mind goes towards fairly trivial tasks. Something that might require someone with basic programming knowledge for now, but could potentially be solved by an AI. I don't expect Joe to use an AI to write the next Windows any time soon.

 

How would you even go about that, write a million page technical spec that describes what you expect your OS to do? At what point does writing the spec become more work than writing the code? If the output has a bug, how do you go about fixing it? It would almost be a sort of meta-debugging where you change your wording until the AI produces code that works as intended… it'll likely be much easier to fix the code directly. Which requires someone who understands code.

 

10 hours ago, Sauron said:

Consider writers; chatGPT can write some pretty convincing text that is also grammatically correct (most of the time). This is more than can be said about its programming performance.

Just want to add my own thoughts to that.

 

ChatGPT may be able to write a coherent paragraph or two, but that is a far cry from a novel, let alone a series of novels. Could it do that while producing a consistent narrative with an interesting plot that actually goes somewhere? I assume you'd still need someone who actually has a good story in mind to create all of the prompts needed to keep it going in the right direction.

 

When it comes to programming, code doesn't just have to read fine, it has to be functional. And not just each function on its own, the functions as a whole have to work together and produce a cohesive whole. This is like a series of novels on steroids.



13 minutes ago, Eigenvektor said:

I expressed the same doubts in my first post, so we're largely on the same page.

[…]

Even if it is something simple, I would still say you want someone with basic programming knowledge to look at what it spits out, to make sure it works and there are no issues. Sure, if it is something where you don't care if it messes up, you could just trust whatever the AI spits out, run with it, and deal with any potential bugs or issues, but that wouldn't fly in my field of work; maybe it is different for other people. I think this will mostly be used as something to help programmers rather than replace them. Certainly this could make entry-level programming way faster, which might result in less demand for those types of programmers, but I don't think it will make them disappear.


3 hours ago, Eigenvektor said:

ChatGPT may be able to write a coherent paragraph or two, but that is a far cry removed from a novel, let alone a series of novels. Could it do that while producing a consistent narrative with an interesting plot that actually goes somewhere? I assume you'd still need someone who actually has a good story in mind to create all of the prompts needed to keep it going in the right direction.

Maybe a larger version capable of longer outputs could make something that's at least coherent over a hundred pages or so, that's not theoretically impossible... although it may require more computing power than would make sense to give it right now. But either way I don't think the output would be a very good book, if not directed by a human who has some idea of what a good book even is. By virtue of how these systems work you always get the most statistically probable output, which is a mediocre one by definition. You might get a serviceable dime-a-dozen young adult sci-fi novel but I doubt you'd get Dune.

3 hours ago, Eigenvektor said:

When it comes to programming, code doesn't just have to read fine, it has to be functional. And not just each function on its own, the functions as a whole have to work together and produce a cohesive whole. This is like a series of novels on steroids.

And as we discussed before, a program could also be perfectly written and extremely efficient while doing the wrong thing.

2 hours ago, Brooksie359 said:

Certainly this could make entry level programming way faster which might result in less demand for those type of programmers but I don't think it will make them disappear.

My work involves designing and programming industrial machinery, I have colleagues whose job it is to make sure the finished machine works as intended; they aren't programmers or engineers by trade, but they do need to know the basics of programming. If they could reliably get decent code snippets from chatGPT (not gonna happen, because automation languages are too obscure to be well supported by something like this, but still) it would certainly save them a lot of time, allowing them to focus on the electrical and mechanical side of the machine. That's how I would envision this being useful more than anything else.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


25 minutes ago, Sauron said:

Maybe a larger version capable of longer outputs could make something that's at least coherent over a hundred pages or so [...]

And as we discussed before, a program could also be perfectly written and extremely efficient while doing the wrong thing.

My work involves designing and programming industrial machinery, I have colleagues whose job it is to make sure the finished machine works as intended [...] it would certainly save them a lot of time, allowing them to focus on the electrical and mechanical side of the machine.

Yeah, my thoughts are similar, in the sense that people who know coding can use AI to write code, or parts of code, to speed up their process. All of that said, you still need to learn coding to use it effectively, which makes Nvidia's CEO's statements super dumb imo. I mean, I don't think AI changes how much code your colleagues have to know.
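To illustrate why you still need to know the language: here's a hypothetical snippet of the kind an AI assistant plausibly suggests. It looks idiomatic and works on the first call, but it hides a classic Python pitfall (a mutable default argument) that only someone who actually knows Python will spot in review.

```python
def add_tag(tag, tags=[]):  # mutable default: the SAME list is reused across calls
    tags.append(tag)
    return tags

first = add_tag("urgent")    # ["urgent"] - looks fine
second = add_tag("review")   # expected ["review"], actually ["urgent", "review"]
```

Both calls return the very same list object, so state leaks between unrelated calls; the conventional fix is `tags=None` plus `tags = [] if tags is None else tags` inside the function.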


On 2/27/2024 at 12:09 AM, Lunar River said:

Good luck, they still don't understand basic encryption; asking them to wrap their heads around AI is going to be harder than AI generating consistent hands.

After the Google interview questionnaire they did, any thought that a bunch of old lawmakers, who have all been purchased and paid for, will do anything went out the window. That was the nail in the coffin for me.


How often have I heard that a new tool makes programming obsolete.

 

I remember the many generations of website composers that were supposed to put website developers out of commission.

 

Many, many unfortunate attempts at making compilers with natural language.

 

Graphical programming languages where you glue together shapes.

 

No, Jensen. Even when AGI is a thing and is able to generate superhuman programs, you'll still need skilled programmers to tell it what to do and to understand what it generated.


Here's hoping for grammatically correct code.

Specs: Motherboard: Asus X470-PLUS TUF Gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 9 5900X Case: Antec P8 PSU: Corsair RM850x Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


16 hours ago, 05032-Mendicant-Bias said:

How often I heard that a new tool makes programming obsolete. [...]

No, Jensen. Even when AGI will be a thing, and is able to generate super human programs, you'll still need skilled programmers to tell it what to do, and understand what it generated.

Some are forgetting that making programs isn't like "make me a video of a young woman walking down the Tokyo street", which everyone was so amazed by from OpenAI's Sora. And even that video could be anything, not exactly what you intended. With apps, you need EXACTLY what you intended, and you can't just have AI make it based on your description.

