-
Adhithya (Adhi) Ravishankar - 20 Apr, 2026
Go's Quiet Dominance: The Language That Now Runs Cloud Infrastructure
Go doesn't have evangelists the way Rust does. Nobody is writing passionate blog posts about how Go's type system changed how they think about software. The language's community is, by design, pragmatic to the point of being slightly boring. And yet Go now runs more of the internet's infrastructure than any other language that has emerged in the last fifteen years. This is not an accident, and it's worth understanding why it happened.

The Catalog

Start with the software. Kubernetes is written in Go. Docker is written in Go. Terraform is written in Go. Prometheus is written in Go. Grafana's backend is written in Go. The etcd distributed key-value store that most of the cloud-native ecosystem depends on is written in Go. CockroachDB is written in Go. Istio, the control plane for Envoy proxies, most of the CNCF project catalog — Go.

This is not a list assembled to make a rhetorical point. It is the actual technology stack of cloud-native infrastructure, and it is almost uniformly written in one language. If you're running Kubernetes clusters — and most organizations building software at scale are — you are running Go code. The question is why this happened, and the answer is both technical and cultural.

What Go Got Right

When Google open-sourced Go in 2009, the design goals were explicit: fast compilation, simple concurrency, straightforward deployment, readable code that a large team could maintain without constant style arguments. These were not ambitious goals in a research sense. They were engineering goals, chosen by people who had spent years maintaining massive codebases in C++ and Java and knew exactly what problems they wanted to avoid.

Go's goroutines and channels made concurrent network programming genuinely approachable. Before Go, writing a service that needed to handle thousands of simultaneous connections required either callbacks (Node.js, with all the callback-hell problems that implied) or threads (Java, with the overhead and complexity that implied).
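The concurrency model described above can be sketched in a few lines. This is an illustrative worker pool, not code from any of the projects named: a fixed set of goroutines consumes jobs from one channel and reports results on another.

```go
package main

import (
	"fmt"
	"sync"
)

// sumOfSquares fans the numbers 1..n out to a small pool of goroutines
// over a jobs channel and collects the squared results over a second
// channel: the basic shape of a concurrent Go service.
func sumOfSquares(n int) int {
	jobs := make(chan int)
	results := make(chan int, n) // buffered so workers never block on send

	var wg sync.WaitGroup
	for i := 0; i < 8; i++ { // goroutines are cheap; real servers run far more
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs { // loop ends once jobs is closed and drained
				results <- j * j // stand-in for real work (I/O, a request, ...)
			}
		}()
	}

	for j := 1; j <= n; j++ {
		jobs <- j
	}
	close(jobs) // signal the workers that no more work is coming
	wg.Wait()   // wait for every worker to exit
	close(results)

	sum := 0
	for r := range results {
		sum += r
	}
	return sum
}

func main() {
	fmt.Println(sumOfSquares(100)) // prints 338350
}
```

Closing the jobs channel is the only shutdown signal the workers need; the WaitGroup then confirms they have all exited before the results channel is closed and drained.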
Goroutines are lightweight — you can have hundreds of thousands of them — and the channel model for communication between them eliminated whole categories of concurrency bugs without requiring a PhD in concurrent programming.

The single-binary deployment story was underappreciated at first and became one of Go's most important practical advantages as containerization took off. A Go service compiles to a single static binary with no runtime dependencies. Containerizing it means writing a FROM scratch Dockerfile that is only a few lines long. Operational simplicity at this level compounds over time.

The toolchain was opinionated in ways that turned out to matter. gofmt enforced a single code style, permanently eliminating code review debates about formatting. The built-in testing framework meant everyone used the same testing conventions. The race detector shipped with the toolchain. These choices made large codebases — and especially codebases maintained by large, rotating teams — much easier to manage.

The Generics Question, Finally Answered

For years, Go's biggest technical complaint was the absence of generics. The workarounds — using interface{}, using code generation, duplicating implementations for different types — were genuinely painful, and there was a real argument that Go's simplicity was partly an illusion: you paid for it in boilerplate.

Generics landed in Go 1.18 in 2022, and the reception was complicated. The implementation was initially slow, the syntax was controversial, and a significant part of the community thought Go had made a mistake. That debate mostly resolved over the following two years. By Go 1.22 and 1.23, the generics implementation was performant, the standard library had been updated to use generics thoughtfully, and idiomatic generic code became possible without fighting the type system. The pattern that emerged was sensible: generics in Go are not the centerpiece of the language the way templates are in C++ or generics are in Haskell.
They're used where they clearly add value — collection utilities, generic data structures, functional helpers — and the rest of the code stays concrete and readable. Go didn't become Scala. It became a slightly more expressive version of itself, which is exactly what the language needed.

Enterprise Adoption Crossed a Threshold

The most significant development for Go in 2025 was not technical. It was the completion of a shift in where Go gets used. For most of Go's history, the main users were companies that were primarily software companies — the Googles and Discords and Cloudflares of the world, organizations whose engineering teams are large and sophisticated enough to evaluate a language and make a deliberate choice. These organizations adopted Go early and drove much of its ecosystem development.

In 2025, Go's adoption crossed into organizations where software is not the primary business. Financial institutions building internal services. Retailers building platform infrastructure. Healthcare companies building data pipelines. These organizations don't pick languages for ideological reasons — they pick them because the tooling is mature, the talent pool is sufficient, the operational story is simple, and the code is maintainable by teams that have turnover. Go satisfies all of those requirements. It's learnable by a competent engineer in weeks, not months. It has a rich standard library that covers most of what these organizations actually need. The cloud SDKs — AWS, GCP, Azure — are all first-class. And the output is deployable infrastructure that the organization can actually operate and maintain.

What the Language Looks Like in 2025

Go 1.23 shipped range-over function iterators, which cleaned up one of the remaining ergonomic rough edges in the language. The toolchain continued to improve, with faster builds and better IDE tooling through the gopls language server. The ecosystem matured in parallel.
Libraries for the major use cases — HTTP services, gRPC, database access, message queues, observability — are well-maintained and production-tested. The Go module system, which had a rocky introduction but stabilized quickly, made dependency management reliable enough that it largely stopped being a source of operational pain.

The one persistent complaint — error handling — remains unresolved. The if err != nil pattern that permeates Go code is verbose and creates repetitive code in any function that calls multiple fallible operations. Various proposals for improving this have circulated for years, and none have been accepted. The Go team's position is that explicit error handling is worth the verbosity because it makes control flow obvious. Large portions of the community disagree. This argument will apparently continue indefinitely.

Where Go Fits in 2025's Language Landscape

Go is not trying to be Rust. It is not trying to be Python. The language has a clear thesis — networked services, infrastructure tooling, systems that need to be fast and concurrent but don't need to manage memory at the level of an OS kernel — and it executes on that thesis well.

The interesting dynamic in 2025 is the convergence of Go and Rust in the same organizations. Companies building cloud infrastructure are often running Go services on top of systems components written in Rust. The AWS example — Lambda's Firecracker runtime is Rust; most of the control-plane tooling is Go — is being replicated at many other organizations. The two languages are complementary, not competitive, and knowing both is increasingly the mark of a well-rounded systems engineer.

Go's trajectory is steady in a way that is sometimes mistaken for stagnation. The language is not going to surprise you. It is going to work reliably, compile quickly, and deploy cleanly, and the code you write today will be readable by an engineer who joins your team in three years.
For organizations building and operating infrastructure, that is exactly what the language is supposed to do.
-
Adhithya (Adhi) Ravishankar - 12 Apr, 2026
Rust's Safety Guarantee Is Now a Political Argument, and It's Working
The most consequential thing that happened to Rust in 2025 had nothing to do with the language itself. It happened in Washington, in Brussels, and in the security teams of companies whose core business has nothing to do with programming languages.

Memory safety vulnerabilities — buffer overflows, use-after-free bugs, null pointer dereferences — have been responsible for roughly 70% of critical security vulnerabilities in major software projects for years. Microsoft has reported this about Windows. Google has reported it about Chrome and Android. The NSA published papers on it. This has been known for a long time. What changed in 2025 is that government agencies and regulators started treating it as a policy problem with a technical solution, and the technical solution they named was memory-safe languages.

The Regulatory Push

The US Cybersecurity and Infrastructure Security Agency (CISA) released a roadmap for memory-safe software that went further than previous guidance. Rather than recommending that organizations "consider" memory-safe languages, it set timelines and called out specific sectors — critical infrastructure, federal contractors, healthcare IT — where migration away from memory-unsafe languages should be treated as a priority, not a long-term aspiration.

The EU's Cyber Resilience Act, which came into force and began affecting product compliance requirements, embedded software security standards that made memory safety a documentable and auditable requirement for certain classes of products. Hardware vendors, automotive suppliers, and industrial equipment manufacturers selling into European markets found themselves needing to either justify the use of C/C++ with extensive tooling and process controls, or explain their migration path.

None of this mandated Rust specifically. But Rust is the only language that currently satisfies the "systems-programming performance" requirement while providing the memory safety guarantees these frameworks are asking for.
Go qualifies on safety grounds but doesn't cover the bare-metal and embedded use cases. C++ with hardened profiles and sanitizers can partially address the issue but requires extensive tooling investment and doesn't provide the same compile-time guarantees.

What Adoption Actually Looks Like

The Rust adoption story of 2025 was less about new greenfield projects and more about the seams where Rust started appearing inside existing systems.

In the Linux kernel, the trajectory shifted from experimental to structural. The initial Rust support in 6.1 was celebrated but limited. Through 2024 and into 2025, the pace of Rust additions to the kernel increased — device drivers, filesystem components, and networking code — and the kernel mailing list's tone shifted from debating whether Rust belonged to debating how to grow the pool of Rust kernel contributors. Linus Torvalds, who was skeptical early, has been publicly supportive of the direction.

In browsers, Mozilla's use of Rust in Firefox components predates the current moment, but Chrome's adoption of Rust for new security-sensitive code became substantial. The browser security argument for Rust is overwhelming — sandbox escapes and renderer vulnerabilities are disproportionately memory safety bugs, and both major browser engines have now committed resources to the language accordingly.

In cloud infrastructure, AWS, Google Cloud, and Azure all have production Rust code at various layers of their stacks. AWS's Firecracker microVM — the technology underlying Lambda and Fargate — has been written in Rust since its introduction and has become a reference example for what safe systems code looks like in production at scale.

In embedded and automotive, the regulatory pressure was most directly felt.
The Rust Embedded Working Group formalized support for a wider range of microcontroller targets, and automotive suppliers working toward ISO 26262 compliance started evaluating Rust's safety profile seriously. The toolchain for safety-critical Rust work is still maturing, but the trajectory is clear.

The Borrow Checker Is the Point

There's a recurring debate in software communities about whether Rust's compile-time safety guarantees are worth the learning curve. This debate often frames the borrow checker as an obstacle — something you fight through to get to writing real code.

The security argument reframes this entirely. The borrow checker is not an obstacle. It is a formal verification of the absence of the most common class of critical security vulnerabilities, applied at compile time, with zero runtime overhead. You are not fighting the compiler. The compiler is preventing you from writing the category of bugs that made up 70% of your CVE history.

Once this framing settles in, the learning curve doesn't feel different — it's still steep — but the motivation to climb it changes. You're not learning an idiosyncratic type system for its own sake. You're adopting the only mainstream language that can prove, before your code ships, that a specific class of catastrophic bugs isn't in it.

The Talent Gap Is Real and Getting Expensive

The main constraint on Rust adoption in 2025 was not technical or organizational — it was people. There are not enough experienced Rust engineers, and the supply is not growing as fast as demand. Senior Rust engineers — people who've shipped production Rust, understand the async ecosystem, have opinions about which async runtime to use and why — are expensive and hard to find. Teams that want to adopt Rust are frequently slowed not by the language itself but by the cost of building internal expertise from scratch.
This is being addressed on the supply side: university CS programs started incorporating Rust into systems programming courses, and corporate training programs for Rust have proliferated. But there's a multi-year lag between training investment and production-ready engineers, and in the meantime the demand side is moving fast.

If you're a C or C++ systems programmer and you haven't learned Rust yet, your situation is analogous to a Java programmer in 2010 who hadn't yet looked at the JVM alternatives. The window where that's a comfortable position is closing.

The Long Game

Rust is not going to replace C and C++ in existing codebases in the near term. The installed base of C and C++ is measured in billions of lines, and rewriting working code for its own sake is not how engineering organizations allocate resources.

What Rust is doing — and what 2025 accelerated — is becoming the default choice for new systems code. When a team needs to write something new that would previously have been written in C++, Rust is now the answer they have to argue against rather than argue for. That inversion matters enormously for where the language is in ten years.

The safety argument won. The political argument is now carrying it forward into domains where technical merit alone would have moved things slowly. The combination of those two things is why 2025 was Rust's inflection point.
-
Adhithya (Adhi) Ravishankar - 08 Apr, 2026
Go and Rust Are Winning. Here's Why 2025 Was the Tipping Point.
Ten years ago, if you told a hiring manager your primary language was Go or Rust, you would have gotten a polite nod and a question about whether you also knew Java. Today those same hiring managers are posting roles that specifically require Rust or Go experience and struggling to fill them. Something fundamental shifted, and 2025 was the year it became undeniable.

The Numbers First

Stack Overflow's developer surveys have tracked Rust as the most admired language for years running — a streak that continued through 2025. But admiration and adoption are different things, and for most of Rust's life the gap between the two was the main story. That gap closed meaningfully over the past twelve months.

Go's trajectory is different but equally significant. It quietly became the lingua franca of cloud-native infrastructure — Kubernetes, Docker, Terraform, Prometheus, and most of the CNCF project catalog are written in Go. In 2025, Go crossed a threshold where it stopped being "the language Google uses for its tools" and became the default choice for new backend services at a broad range of companies, from startups to enterprises.

What Changed for Rust

Rust's turning point was not a single moment but several things arriving at once.

The US government got involved in an unusually direct way. The NSA, CISA, and a coalition of cybersecurity agencies published guidance explicitly naming memory-unsafe languages as a systemic risk and recommending migration toward memory-safe alternatives — with Rust named repeatedly as the leading systems-programming option. When federal agencies start writing technical language recommendations, industries that sell to the government pay attention, and then everyone else starts paying attention too.

Rust's presence in the Linux kernel went from experimental to structural.
The initial Rust support merged in kernel 6.1 was cautious and limited, but by 2025 multiple subsystems had Rust drivers in mainline, and the kernel development community's conversation shifted from "should Rust be here?" to "how do we grow the Rust contributor base?" That's a significant change in posture.

Microsoft's investment became visible. The Windows team has been rewriting core system components in Rust for years, and in 2025 more of that work became public. When the company responsible for the world's most widely deployed operating system starts treating Rust as a load-bearing part of its platform strategy, the language's legitimacy in systems programming is established.

Android crossed 21% of new code being written in Rust, and Google reported a corresponding drop in memory safety vulnerabilities in those components. The security case for Rust stopped being theoretical.

What Changed for Go

Go's story in 2025 was less dramatic than Rust's but arguably more economically significant.

Generics, which landed in Go 1.18, had a rough first reception — the implementation was slow and the community was divided about whether Go needed them at all. By 2025, the generics implementation had matured enough that idiomatic generic code became possible without the performance penalties that made early adopters skeptical. That resolved the biggest open technical complaint about the language.

Go 1.22 and 1.23 shipped iterator improvements — range over integers, then range-over functions — and refined the toolchain in ways that made large codebases more manageable. The language's compile times and deployment story — a single static binary, trivial to containerize — continued to make it the pragmatic choice for teams who need to ship and operate services at scale.

The clearest signal of Go's mainstream status: it became the default language for new services at companies that are not primarily software companies.
Banks, retailers, healthcare companies building internal platforms — these organizations picked Go because it was learnable, because the toolchain was sane, and because the ecosystem of libraries for the things they actually needed (HTTP services, database access, cloud SDKs) was excellent.

They're Not Competing With Each Other

The framing of "Go vs Rust" misses the point. They occupy genuinely different spaces.

Rust is for when you need control over memory layout, when you're writing something that touches hardware or runs in constrained environments, when correctness guarantees matter more than development speed, or when you're extending an existing systems codebase (OS, browser, embedded firmware).

Go is for when you're building networked services, when team velocity matters, when you want straightforward concurrency that doesn't require understanding memory ownership semantics, and when you need something you can hire for.

The interesting question isn't which one wins — they're both winning, in different domains. The interesting question is what they're displacing, and the answer is increasingly: C++ in systems programming (Rust), and Java/Python in backend services (Go). Both of those are genuinely significant disruptions to ecosystems that have been stable for twenty years.

What It Means for Developers

If you're early in your career, learning one of these languages is no longer a speculative bet on the future — it's meeting the market where it already is. If you're a senior engineer who has avoided both because your current stack is working fine, the window for that posture is narrowing. The infrastructure being written today in Go and Rust is what you'll be maintaining and extending in five years.

Neither language is without friction. Rust's learning curve around the borrow checker remains genuinely steep, and no amount of improved tooling fully eliminates the hours you'll spend fighting the compiler until the mental model clicks.
Go's simplicity is real, but it also means you'll write more boilerplate than in more expressive languages, and the error handling story is still something people argue about. But both languages are mature, well-tooled, well-documented, and backed by organizations with serious long-term commitments to them. The risk calculus that made them speculative bets in 2015 doesn't apply in 2025. They're infrastructure now.