
White House urges developers to avoid C and C++, use 'memory-safe' programming languages

DuckDodgers

In a new report, the White House Office of the National Cyber Director (ONCD) has called on developers to use "memory-safe programming languages," a category that excludes popular languages such as C and C++.

 


"Experts have identified a few programming languages that both lack traits associated with memory safety and also have high proliferation across critical systems, such as C and C++," the report reads. "Choosing to use memory safe programming languages at the outset, as recommended by the Cybersecurity and Infrastructure Security Agency’s (CISA) Open-Source Software Security Roadmap is one example of developing software in a secure-by-design manner."

 

NSA Suggested Memory-Safe Programming Languages:

  • Rust
  • Go
  • C#
  • Java
  • Swift
  • JavaScript
  • Ruby

 

The market has been flooded with developers who have degrees or certifications with only a single semester of Java programming. It doesn't matter how "memory safe" the language is. With the incredibly low standards for programmers these days, they are all a threat. Stop blaming the language!

 

Sources:

https://www.tomshardware.com/software/security-software/white-house-urges-developers-to-avoid-c-and-c-use-memory-safe-programming-languages


54 minutes ago, DuckDodgers said:


The market has been flooded with developers who have degrees or certifications with only a single semester of Java programming. It doesn't matter how "memory safe" the language is. With the incredibly low standards for programmers these days, they are all a threat. Stop blaming the language!

Even the best programmers write exploitable memory bugs in C/C++.

It is not nearly as easy to make Rust code exploitable, and we are not really talking about Java here.

Rust is finally a viable option for low-level coding: it is doing great in Windows and Linux drivers and is slowly making its way into the kernel.
With Rust existing, the need to write in C++ is greatly reduced. For new projects you are considering writing in C/C++, switch to Rust.

Yes, there will still be exploits; that's not the point. The point is there will be FEWER exploits from overlooking some pointer.

The language is the problem with C/C++ when a safer language accomplishes the same tasks.

 


 Microsoft security engineers reported that around 70% of security vulnerabilities were caused by memory safety issues.

In other words, roughly 70% of vulnerabilities come from C/C++ (which, to be fair, is what the operating systems and Chromium are written in).


57 minutes ago, DuckDodgers said:

The market has been flooded with developers who have degrees or certifications with only a single semester of Java programming

Hey hey hey, chill with the Java hate. It is Python that deserves the fire.


Java and Python, both running on their own VMs, have limited attack vectors even if your code is trash.

It is difficult to compromise a system through a program written in those languages because you have to escape the VM (not saying you can't; looking at you, Log4j).

 


Back when I was taking programming in JC, the concept of how C++ handled memory didn't make any sense. I mean, it made sense in how it worked, but why it worked that way didn't make any sense. Who the hell wants pointers?

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

17 minutes ago, DrMacintosh said:

Back when I was taking programming in JC, the concept of how C++ handled memory didn't make any sense. I mean, it made sense in how it worked, but why it worked that way didn't make any sense. Who the hell wants pointers?

It's how computers work at a fundamental level.


2 hours ago, DuckDodgers said:

JavaScript

Sure, your application will consume an order of magnitude more resources at a fraction of the capacity and be plagued by an endless stream of runtime errors. But that's a small price to pay for the peace of mind of knowing that the 3AM outage you're on call for wasn't caused by a double-free.


24 minutes ago, starsmine said:

It's how computers work at a fundamental level.

Can confirm. When you get down to the level of C and C++, all accesses to heap memory and things like passing variables by reference require pointers. If you go down a level further to assembly, ALL memory accesses require pointers, both to the stack and the heap.

 

Languages like Java and Python still use pointers, they just do a good job of hiding it from you.

Computer engineering grad student, cybersecurity researcher, and hobbyist embedded systems developer

 

Daily Driver:

CPU: Ryzen 7 4800H | GPU: RTX 2060 | RAM: 16GB DDR4 3200MHz C16

 

Gaming PC:

CPU: Ryzen 5 5600X | GPU: EVGA RTX 2080Ti | RAM: 32GB DDR4 3200MHz C16


36 minutes ago, starsmine said:

It's how computers work at a fundamental level.

And making devs responsible for how computers work at a fundamental level is exactly why there are so many security vulnerabilities. 


2 hours ago, DrMacintosh said:

Back when I was taking programming in JC, the concept of how C++ handled memory didn't make any sense. I mean, it made sense in how it worked, but why it worked that way didn't make any sense. Who the hell wants pointers?

 

Pointers make sense, but you need the right frame of mind for what a pointer is. The games "Human Resource Machine" and "7 Billion Humans", both of which are basically programming metaphors for assembly language, make the concept easier to explain to people who can't grasp it.

 

"Why" you need pointers comes back to memory management. For example, a video buffer is an array of, say, 160x192 pixels (or 64 segments in an 8-digit 7-segment display). Your output isn't "draw a sheep"; your output is "set these bits of the video buffer to these values". If two pixels are packed into one byte (e.g. 4-bit color), or one pixel is an index into a palette of RGB values, you need to use pointers.

 

Otherwise you are writing loops changing each byte one at a time, and when you encounter things like "multiple pixels are represented by one byte" you need bit shifts, masks, and pointers, so your software logic is basically "how far into this chunk of memory to make the change" rather than treating it as an array of a fixed type.

 

People who first learned BASIC learned what pointers were from the POKE command, because that's pretty much what it was: "go to this address in memory and write this value".

 

People don't like using pointers, but they're there because you need to read or write allocated memory. Simply declaring a local array means it is lost when the function leaves scope. If you malloc() something, whoever owns it must free() it exactly once: free too early and you get a use-after-free, never free and you get a memory leak from constantly malloc'ing without free'ing.

 

That is what Rust is supposed to be better at. Interpreted languages (JavaScript, Perl, PHP, Python, etc.), by contrast, don't let you touch memory directly at all, which leads to a different set of bugs: I may want a globally scoped variable so it lives beyond the function, but be unable to stop another function from writing to it.

 

JavaScript/Node.js is one of those cases where the output is entirely unreliable: I could change the value of something that is globally scoped, yet get two different outputs derived from the same globally scoped variable.

 

You have to use the right language to make the right tool, and while C is always *a* right tool, it's rarely the best tool. C++ tends to be the worst tool to use when you need security (e.g. an MMO game client, a banking app, a database) because so many bugs come from not freeing or zeroing memory.

 

So a lot of bugs in C and C++ aren't because it's C or C++; they're because people get used to writing code a certain way (e.g. assuming memory gets freed when it goes out of scope) and assume it always does.

 

What is likely to happen is that people who want to use C will continue to use C, because C is what programming libraries speak, and thus the closest thing to a universal programming language. Programming languages generally can't consume one another's object formats directly; even two C++ programs can't talk to each other as libraries except through C interfaces.

 


1 hour ago, dcgreen2k said:

Languages like Java and Python still use pointers, they just do a good job of hiding it from you.

In fact, depending on your point of view, every non-primitive variable in Java is a pointer. It's just more commonly called a reference.

 

The one thing you don't have is explicit pointer arithmetic, hence no need for explicit pointer syntax. The only manipulation you are allowed is making it point to another Object (or null). You can't increment or decrement it like you could in C/C++.

Remember to either quote or @mention others, so they are notified of your reply


1 hour ago, Kisai said:

So a lot of bugs in C and C++ aren't because it's C or C++; they're because people get used to writing code a certain way (e.g. assuming memory gets freed when it goes out of scope) and assume it always does.

What is likely to happen is that people who want to use C will continue to use C, because C is what programming libraries speak, and thus the closest thing to a universal programming language. Programming languages generally can't consume one another's object formats directly; even two C++ programs can't talk to each other as libraries except through C interfaces.

It's not that people get used to writing code a certain way; C and C++ make it really easy to accidentally make a mistake and read memory out of bounds, write too much through a pointer, etc.

 

Take Heartbleed, for example: it really wasn't about writing code a certain way (I'm sure the author normally did), it's just something that's easy to overlook.

 

I think instead of C you will get more and more people writing Rust (it seems like the one most likely to take over), and you will eventually see a slow shift to it.

3735928559 - Beware of the dead beef


37 minutes ago, wanderingfool2 said:

It's not that people get used to writing code a certain way; C and C++ make it really easy to accidentally make a mistake and read memory out of bounds, write too much through a pointer, etc.

I agree - there are tools to help find and debug these kinds of errors, like Valgrind and UBSan, but for important applications it's better to use a language that prevents them entirely. I remember one of my cybersecurity professors mentioning how he switched most of his department's new development to Go for this reason.

 

37 minutes ago, wanderingfool2 said:

I think instead of C you will get more and more people writing Rust (it seems like the one most likely to take over), and you will eventually see a slow shift to it.

That certainly seems like what's going to happen. I know embedded systems development has an incredibly high concentration of C and C++ code due to speed and memory constraints, but there's been growing interest in Rust toolchains for the popular MCUs.



I remember, as a software intern many years ago, being given a task to write in C. It seemed to work, but I'd randomly get segmentation faults. I asked a senior person in the department for help, and they ran my code through some kind of verification tool. Suffice it to say, there were a lot of errors in pointer referencing, off-by-one or worse. It compiled and ran fine 99% of the time.

 

How useful are tools like those now? Or is code too complex to verify your way out of it?

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


19 minutes ago, porina said:

How useful are tools like those now? Or is code too complex to verify your way out of it?

Those tools are used, and that's actually how some bugs are found these days...

 

An issue is that sometimes you can't tell what's wrong without some context of what the code is doing [but yeah, there are tools that go searching for that kind of stuff].



8 minutes ago, porina said:

How useful are tools like those now? Or is code too complex to verify your way out of it?

Memory error checkers are used very often, and UBSan seems to be the most popular currently.

 

I believe the usefulness of those tools is somewhat tied to how good the project's architecture and testing practices are. If you thoroughly test a module as soon as you write it, it's easy to pinpoint errors and fix them before they propagate to later parts of the project. On the other hand, if you wait to write any tests, have sparse tests, or write spaghetti code, then going through a list of errors, fixing old code, and making sure nothing breaks is going to be a painful process.



Pretty stupid, the real issue is that Windows is just not safe when it comes to memory management...

 

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


8 hours ago, starsmine said:

java and python, both running on their own VMs have limited attack vectors even if your code is trash. 

Java has had some wildly bad security vulnerabilities over the years. Theory doesn't stand in the way of history: Java's list of security vulnerabilities is very long, so this point is completely moot.


10 hours ago, DuckDodgers said:

Java


10 hours ago, DuckDodgers said:

Doesn't matter how "memory safe" the language is.

I would say it does matter: built-in checks decrease the likelihood of obscure errors regardless of your experience level. It makes sense to recommend languages with these features when possible, especially in mission-critical applications.

 

Now, that doesn't mean that a language having these features is automatically suited for such applications either...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


I'm just here wanting good anti-cheat in games. 

Oh and proper coding so games don't run like crap.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


1 hour ago, Doobeedoo said:

I'm just here wanting good anti-cheat in games. 

Oh and proper coding so games don't run like crap.

 


1 hour ago, starsmine said:

 

Yet to check, really, but I've already seen it's about Valorant and its Vanguard anti-cheat, which is kernel-level, which is a nope.



3 minutes ago, Doobeedoo said:

Yet to check, really, but I've already seen it's about Valorant and its Vanguard anti-cheat, which is kernel-level, which is a nope.

You should check it. You asked for good anti-cheat, but you need to understand how cheats work first to make that ask.


17 minutes ago, starsmine said:

You should check it. You asked for good anti-cheat, but you need to understand how cheats work first to make that ask.

I mean, I do, but even a kernel-level one is still not untouchable, yet it has root access, which is unacceptable. I'll check it out.


