Reasons why I still love C in 2022
The C programming language turns 50 years old this year. I wanted to break my blogging hiatus to celebrate the occasion and explain why I think someone (maybe me?) could be writing a similar post 50 years from now. But first, let’s take a small detour…
Boring, loosely related preamble about my childhood
My deep interest in computers began at a rather young age. In those tender years, a dialogue between me and a computer-savvy person would have gone something like this:
- What is a computer?
- It’s a box that allows you to do all kinds of useful stuff.
- What can you do with it?
- Scribble things on MS Paint (I was obsessed), write letters in a word processor, exchange email over the Internet, play games…
This pushed me to test the limits of what could be done with an average consumer desktop at the time. Many perfectly working installations of Windows 98 were sacrificed for the sake of my research.
As I grew older, so did my curiosity, with my line of questioning becoming more specific:
- But how can it do that?
- Well, it has a CPU which executes these tasks, some RAM to store its thoughts on the go and some hard disks too so that it can save your precious cat pictures and favourite website bookmarks.
- What is a CPU exactly? How does it talk to the RAM banks?
- The CPU has several buses, which it uses to exchange information with RAM and the rest of the system too.
- How are these buses laid out? How is information encoded so that every component knows what’s going on at any point?
- Better take a seat pal, because things are about to get intense…
Whenever I program in a modern language like JavaScript, Ruby or Python, I can get a lot of work done. I ask for stuff and stuff indeed happens. Users log in, forms are highlighted in beautiful colours, gigabytes of data are extracted from thousands of forum threads in a matter of minutes. Then my inquisitive mind perks up again: how is this all even possible? The aforementioned languages will just shrug and reply “Dunno, magic? Who cares?”. This makes me anxious, because I care.
Are people still using C in 2022?
The C programming language as we know it today was born in 1972. Originally, it was used to write parts of the tooling for the Unix operating system running on the DEC PDP-11, including a new compiler. One year later, Unix had been entirely rewritten in C. Its creator, Dennis Ritchie, together with Brian Kernighan, published The C Programming Language (also known as the K&R book) in 1978, which became the de facto language spec until the ANSI committee ratified the first official C standard in 1989.
The popularity of the language grew by leaps and bounds from there on out, establishing itself as one of the most popular high-level, general-purpose languages in the 20th century. Half a century later, C remains one of the most widely used languages in the world by many accounts. As of the time of this writing, it ranks
- 2nd on the TIOBE Index,
- 9th on GitHub,
- 12th on the Stack Overflow Developer Survey 2021.
This popularity sample may not fully account for the vast amount of embedded devices and closed-source, proprietary applications that have been written in C up to this point. With such a strong presence in the collective mind share, it’s difficult to picture this language falling into obscurity anytime soon, as some of its contemporaries like Fortran, Pascal or COBOL did. With the exception of SQL, I cannot think of many other languages of that era that have managed to remain as strong and relevant as C has.
In spite of this, the reality of computing in the 21st century is markedly different to what it was during the latter half of the 20th century. The rise of the World Wide Web, portable computers, distributed computing, the cloud… C is having a really rough time finding its place in this fast-paced, highly interconnected ecosystem, dominated by the likes of Java, JavaScript or Python.
Does this mean legacy systems and ancient mainframe computers are the only reason C is still around today? Not really. Hugely successful software projects have been launched in the last 15 years, with C as their primary or supporting language. A few examples that come to mind are:
- Redis
- jq
- Wayland: many implementations of this next-generation Linux display server protocol, designed over the last decade, are written purely in C, such as wlroots and Sway.
- Fuchsia: small parts of this budding OS built from scratch by Google are written in C
Not to mention the elephant —nay, mammoth— in the room: Linux! With well over 27 million lines of C code written since 1991, Linux is one of the largest software projects ever produced.
What makes C so unique that it keeps attracting developers to this day? Here are some of the strengths of this language that contribute to its continued success.
C is fast
In its early years, speed wasn’t so much a defining feature of C as it was the baseline expectation for any language of its kind. Hardware was so constrained back then that using languages that made the most of the scarce resources available was a necessity, not a selling point.
As other programming languages have traded away performance in their quest for productivity through abstraction, C has remained as close to the hardware as ever, which has allowed it to stay nimble and efficient. The speed and the degree of control C puts at your disposal to make the most of your hardware are among the main reasons people are still drawn to it.
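To make that concrete, here is a tiny sketch of my own (not taken from any real project): a plain loop over an array of doubles. Nothing sits between the source and the machine, so there are no bounds checks, no boxing and no garbage collector pauses, and a modern compiler is free to turn this into a handful of tight, often vectorised instructions.

    #include <stddef.h>

    /* Sum a raw array of doubles. The memory layout is exactly what it looks
     * like: `values` points at `count` contiguous doubles, nothing more. */
    double sum(const double *values, size_t count)
    {
        double total = 0.0;
        for (size_t i = 0; i < count; i++)
            total += values[i];
        return total;
    }

The generated code is essentially the loop you wrote, which is also what makes C performance comparatively easy to reason about.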
Check out some of these benchmarks to get a better idea as to how C runtime performance fares compared with other languages.
C is stable
C is a conservative language. The standard has only seen a handful of revisions since ANSI first ratified it in 1989, and none of them have introduced substantial changes to the original specification. Take for example the two snippets below: one was written 5 years ago whereas the other is decades old. Can you guess which is which?
static inline int copy_pmd_range(struct mm_struct *dst_mm, struct mm_struct *src_mm,
        pud_t *dst_pud, pud_t *src_pud, struct vm_area_struct *vma,
        unsigned long addr, unsigned long end)
{
    pmd_t *src_pmd, *dst_pmd;
    unsigned long next;

    dst_pmd = pmd_alloc(dst_mm, dst_pud, addr);
    if (!dst_pmd)
        return -ENOMEM;
    src_pmd = pmd_offset(src_pud, addr);
    do {
        next = pmd_addr_end(addr, end);
        if (pmd_none_or_clear_bad(src_pmd))
            continue;
        if (copy_pte_range(dst_mm, src_mm, dst_pmd, src_pmd,
                           vma, addr, next))
            return -ENOMEM;
    } while (dst_pmd++, src_pmd++, addr = next, addr != end);
    return 0;
}
Compare the above with this piece:
typedef struct {
    zx_device_t* dev;
    zx_device_t* transport_dev;
    bt_hci_protocol_t hci;
} passthrough_t;

static zx_status_t bt_hci_passthrough_get_protocol(void* ctx, uint32_t proto_id, void* out_proto) {
    if (proto_id != ZX_PROTOCOL_BT_HCI) {
        return ZX_ERR_NOT_SUPPORTED;
    }

    passthrough_t* passthrough = ctx;
    bt_hci_protocol_t* hci_proto = out_proto;

    // Forward the underlying bt-transport ops.
    hci_proto->ops = passthrough->hci.ops;
    hci_proto->ctx = passthrough->hci.ctx;
    return ZX_OK;
}
The former snippet is from the Linux memory management subsystem (mm/), written in 1991, which adheres to the C89 standard. The latter is an excerpt from the Bluetooth HCI module in the Google Fuchsia OS codebase, developed 26 years later (1).
As an exercise for the reader, I would suggest locating an early snapshot of a random source file in a long-lived (or defunct) PHP, C# or C++ codebase written a couple of decades ago or more and comparing it to something written in the last 5 years.
C is portable
The introduction to the K&R book states that C “is not tied to any one operating system or machine”. And boy did they take this seriously. As of today, C has been ported to dozens of different CPU architectures, well beyond the primitive architecture of the DEC PDP-11 it was first implemented on. Here are a few major examples:
- x86/x86_64
- ARM
- PowerPC
- MIPS
- System/360
- SPARC
As long as your C program doesn’t rely on any platform-specific features, chances are you can build it for most CPU architectures and operating systems imaginable.
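As a small, hypothetical illustration of my own, the program below touches nothing but the standard library, so any hosted, standards-conforming compiler should accept the same source unchanged, whether it targets x86_64, ARM or MIPS:

    #include <stdio.h>

    int main(void)
    {
        /* sizeof(void *) reflects the pointer width of whatever platform
         * this was compiled for; the source itself stays the same. */
        printf("Hello from %lu-bit land!\n",
               (unsigned long)(sizeof(void *) * 8));
        return 0;
    }

On a typical Linux machine you could build it natively with gcc or cc, or target another architecture with a cross toolchain such as arm-linux-gnueabihf-gcc, without touching the source.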
C is simple
The last edition of the K&R book reserves 32 keywords for language use. Later revisions of the C standard have introduced a handful more, but overall the core of the language has remained tiny. This makes C an easy language to learn, but a very difficult one to master.
This stands in stark contrast to C++, its object-oriented cousin, which has introduced countless extensions and paradigms that, in my opinion, add complexity to an already intricate language.
The compactness of C doesn’t end with its specification, though; it extends to its runtime. C doesn’t carry a heavyweight runtime the way other compiled languages such as Go do, which means a C binary will usually be quite a bit smaller than an equivalent binary built from most other languages.
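To illustrate (my own example, with sizes that will of course vary by platform and toolchain): the complete program below, compiled with a typical toolchain and dynamically linked against the system’s libc, usually produces a binary in the tens of kilobytes, whereas an equivalent Go program ships its runtime inside the executable and typically weighs in at a megabyte or more.

    #include <stdio.h>

    /* A complete C program. Beyond the C library start-up code that calls
     * main(), nothing else is bundled into the binary: no garbage collector,
     * no scheduler, no reflection metadata. */
    int main(void)
    {
        puts("hello, world");
        return 0;
    }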
Programs that take up less space in memory are always desirable, and downright indispensable for manufacturers and users of heavily constrained hardware. Professional and enthusiast devices like the Arduino further cement C’s position as the dominant language in the embedded space.
Is C a perfect language?
Of course not! Far from it. Moore’s Law has ensured the steady, exponential growth of computing capacity over the years, which has unlocked endless possibilities for the modern developer. With the average smartphone being thousands of times more powerful than most of the computers around when C was invented, performance has become an afterthought for many, and for good reason. Nowadays, companies and developers value other qualities much more, such as higher developer productivity and lower cognitive load through greater abstraction.
What about the modern alternatives?
Multiple programming languages have emerged in recent years with the aim of tackling many of the same problems C was created to solve:
- Rust
- Go
- Julia
Leaving C++ aside (it is arguably a superset of C), these young languages have managed to evolve into incredibly powerful, efficient and beloved systems programming languages. The proposal for Rust to become a secondary language for the development of the Linux kernel has been received with positive feedback from the kernel maintainers. This is a significant milestone, as it reflects how much confidence in the language has grown within the systems programming community.
With C++, Go and Julia also establishing themselves as solid alternatives in many of the fields where C would have been the default choice before, it is now easier than ever to imagine C fading into the sunset at some point in the distant future.
C is truthful
All in all, C is an honest language, and this is perhaps the quality that matters most to me out of everything I’ve said. Nothing clever happens to your program beyond the optimisations the compiler performs at build time, and even those you can turn off.
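As a minimal sketch of my own of what that honesty looks like in practice, consider duplicating a string: every allocation, every copy and every failure path is written out by the programmer, and nothing is freed or moved behind your back.

    #include <stdlib.h>
    #include <string.h>

    /* Make a heap-allocated copy of a string. The caller owns the result and
     * is responsible for free()ing it; no reference counting or garbage
     * collection will do it on their behalf. */
    char *duplicate(const char *src)
    {
        size_t len = strlen(src) + 1;   /* +1 for the terminating '\0'     */
        char *copy = malloc(len);       /* you ask for the memory...       */
        if (copy == NULL)
            return NULL;                /* ...you handle the failure...    */
        memcpy(copy, src, len);
        return copy;                    /* ...and you decide when it dies  */
    }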
Now you might argue,
- isn’t C too low-level by modern standards?
- doesn’t the granularity of the language make it very hard to see the forest through the frees (2)?
- isn’t it all too easy to inadvertently write heaps of bugs and security vulnerabilities into your C programs because of C’s spartan approach to memory handling?
- is micro-managing every single aspect of an application really a sensible approach to most software engineering endeavours these days?
You might argue all of these things, and you’d be right. C is difficult and it asks a lot from you as a developer. It can also be dangerous and terribly unsafe if not used with care and intimate knowledge of the target platform. But it’s also fun and teaches you a lot about how computers really “think”. Deep down, I know everything else is not that important to me. I like C because it is the transcription of the answers to all of the questions I could ever ask about a piece of software. It doesn’t lie to you nor does it try to hide how it does what it does. Magic is not allowed to happen in the arcane world of C. If you don’t have the answer for a particular problem, your program simply won’t compile until you do! This is the ultimate form of self-documentation.
Do I really think people will still be writing about C when it reaches 100 years of age? Let’s put it this way: when the sprawling network of AI automata takes over the world and software evolves to become self-advancing, ever-knowing and self-aware, deep, deep down, something will have to keep things honest for us all.
Happy birthday, C!
(1) The examples may appear blatantly cherry-picked, but I assure you that most of the time I spent selecting them went into finding a sufficiently old file in the Linux kernel tree and a C source file in the Fuchsia OS codebase to compare it with.
(2) Get it? Trees, frees… hehe