Interesting piece on the government trying to force programmers to use memory-safe languages. Having come from a megacorp, I can say the push is happening because new programmers aren't that adept at C and C++, where you directly access memory and need to make sure you're not exceeding data boundaries, yet megacorps live by hiring programmers fresh out of school to keep costs down. I left AT&T because they were using Mahindra Ltd, who were hiring new programmers fresh out of school all over the world for minimum expense, and I was spending more time managing trouble tickets than getting work done because the new software projects were so terribly implemented (perfect timing for a voluntary layoff that local management couldn't override). And we learned that Google is supposedly using AI for 25% of its code, with corporations all salivating over using AI to replace programmers. But when you factor in the expense of these large AI systems, which are more promise than substance, are they really going to save anything? It will probably cost them more for an even worse product, with whatever liabilities that entails. For now, though, let's push them toward programming languages where new recruits can't create so many security vulnerabilities.
https://www.theregister.com/2024/11/08/the_us_government_wants_developers/
Does anyone want to tell Linus Torvalds? No? I didn’t think so
Opinion I must be a glutton for punishment. Not only was my first programming language IBM 360 Assembler, but my second was C. Programming anything in them wasn't easy. Programming safely in either is much harder.
So when the US Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) announced they were doubling down on their efforts to persuade software manufacturers to abandon "memory-unsafe" programming languages such as C and C++, it came as no surprise.
The report on Product Security Bad Practices warns software manufacturers that developing "new product lines for use in service of critical infrastructure or NCFs [national critical functions] in a memory-unsafe language (eg, C or C++) where there are readily available alternative memory-safe languages that could be used is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety."
In short, don’t use C or C++. Yeah, that’s going to happen.
If this sounds familiar, it’s because CISA has been preaching on this point for years. Earlier in 2024, CISA, along with partner agencies including the FBI, Australian Signals Directorate’s Australian Cyber Security Centre, and the Canadian Centre for Cyber Security, aka the Five Eyes, published a report, Exploring Memory Safety in Critical Open Source Projects, which analyzed 172 critical open source projects. The findings revealed that over half of these projects contain code written in memory-unsafe languages, accounting for 55 percent of the total lines of code across the examined projects.
Specifically, “Memory-unsafe languages require developers to properly manage memory use and allocation. Mistakes, which inevitably occur, can result in memory-safety vulnerabilities such as buffer overflows and use after free. Successful exploitation of these types of vulnerabilities can allow adversaries to take control of software, systems, and data.”
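To make those terms concrete, here is a minimal illustrative C sketch of my own, not something from the agencies' report; the buffer size, strings, and variable names are all invented for the example. It compiles without complaint yet commits both of the classic sins CISA names: writing past a fixed-size buffer and reading memory after it has been freed.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Buffer overflow: name holds 8 bytes, but strcpy copies however many
           bytes the source contains, trampling whatever sits beyond the buffer. */
        char name[8];
        const char *untrusted = "far longer than eight bytes";
        strcpy(name, untrusted);            /* undefined behavior: writes past name[7] */

        /* Use after free: the pointer still holds the old address after free(),
           so the read compiles and often "works" until the allocator reuses the
           memory and someone else decides what it contains. */
        char *session = malloc(32);
        if (session == NULL)
            return 1;
        snprintf(session, 32, "token-12345");
        free(session);
        printf("%s\n", session);            /* undefined behavior: dangling pointer */

        return 0;
    }

Neither defect is flagged by the compiler, and both are exactly the kind of bug a bounds-checked or ownership-checked language refuses to build in the first place.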
Tell us something we didn’t know.
CISA went on to note that memory-safety vulnerabilities account for 70 percent of security vulnerabilities. To address this concern, CISA recommends that developers transition to memory-safe programming languages such as Rust, Java, C#, Go, Python, and Swift. These languages incorporate built-in protections against common memory-related errors, making them more secure from the code up.
Sounds good, doesn’t it?
If only it were that easy to snap your fingers and magically transform your code base from C to Rust. Spoiler alert: It’s not.
Take Rust in Linux, for example. Even with support from Linux’s creator, Linus Torvalds, Rust is moving into Linux at a snail’s pace.
The problem is, as Torvalds said at Open Source Summit Europe 2024, “The whole Rust versus C discussion has taken almost religious overtones” with harsh arguments that have led to one Rust in Linux maintainer throwing up his hands in disgust and walking away. You see, people who’ve spent years and sometimes decades mastering C don’t want to master the very different Rust. They don’t see the point. After all, they can write memory-safe code in C, so why can’t you?
Well, because they don’t have those years of experience, for one thing.
It’s more than just old, grumpy developers. Converting existing large codebases to memory-safe languages can be an enormous undertaking. It’s time-consuming, resource-intensive, requires careful planning to maintain functionality, and, frankly, it’s a pain in the rump.
Another problem is that memory-safe languages may introduce performance slowdowns compared to C and C++. There’s a reason we’re still using these decades-old, difficult languages; with them, developers can produce the fastest programs. Given a choice between speed and security, programmers and the companies that employ them go for the fastest code every time.
Besides the sheer migration cost, companies also face the expense of replacing existing development tools, debuggers, and testing frameworks to support the new languages. Then, of course, they’re integrating the new programs with the old code and libraries.
CISA is insisting that this be done or, at the very least, that companies come up with roadmaps for migrating their existing codebases to memory-safe languages by January 1, 2026. The agency argues that the long-term benefits in terms of reduced vulnerabilities and improved security outweigh the initial investment.
I know businesses. They’re not going to buy this argument. In the modern corporate world, it’s all about maximizing the profits for the next quarter. Spending money today to save money in 2027? It’s not going to happen.
Eventually, painfully, slowly, we'll move to memory-safe languages. It really is a good idea. Personally, though, I don't expect it to happen this decade. In the 2030s? Yes. In the 2020s? No.
Neither businesses nor programmers have sufficient reason to make the jump. Sorry, CISA, that’s just the way it is.