The Cyber Resilience Act and Open Source

Linux Professional Institute (LPI) closely follows the evolving landscape of software regulations and their impact on the open source community. The Cyber Resilience Act (CRA) is one such regulation sparking global debate—not only for European developers but also for those maintaining or contributing to open source projects outside the European Union (EU).

If a project has users in the EU, or if it’s hosted on platforms accessible from the EU, the CRA may still apply. This raises urgent questions about liability, compliance, and sustainability for developers worldwide.

In this discussion, Moreno Razzoli (AKA Morrolinux), Linux and FOSS evangelist and LPI Partner and Member, and Andrea Palumbo, an Italian lawyer specializing in technology and an LPI Solution Provider Partner, sat down with Tommaso Bonvicini, a freelance software developer passionate about C++, open source, and everything that can be tinkered with, to unpack what the CRA could mean for open source projects. This article summarizes their key insights. Buckle up: it’s going to be a ride.

The Cyber Resilience Act: What’s Happening?

The Cyber Resilience Act (CRA), a European regulation aimed at enhancing cybersecurity standards for software and hardware, officially entered into force on December 10, 2024. Its main obligations become mandatory three years later, meaning that companies, developers, and open source projects will need to adapt.

The act introduces new security obligations for software and connected devices, aiming to reduce vulnerabilities and improve response times to security threats. But as LPI highlighted more than a year ago, not everything about the CRA is smooth sailing, especially when it comes to the commercial dissemination of free and open source software.

The Good, the Bad, and the Bureaucratic

At first glance, the CRA sounds great—who wouldn’t want better security? The regulation mandates:

  • Minimum security standards for software and hardware.
  • Lifecycle management requirements, ensuring ongoing security updates.
  • Mandatory reporting of actively exploited vulnerabilities within 24 hours of becoming aware of them.

On paper, these seem like solid cybersecurity measures. However, when one looks at the real-world impact, things get messy.

One of the biggest concerns is who bears the responsibility for compliance. Open-source projects, which rely on volunteers and community contributions, might find these obligations overwhelming. If a solo developer writes a patch that introduces a bug, who is held accountable—the contributor, the maintainer, or the entire project?

Open Source: Caught in the Crossfire?

Initially, the CRA didn’t exempt open-source projects, meaning that even non-commercial software could have been subject to the same rigorous standards as enterprise solutions. Thanks to significant lobbying from the open-source community, the final version now includes exemptions such as the following:

  • Non-commercial free and open-source software (FOSS) is exempt from CRA requirements.
  • Projects that recover costs or reinvest profits into non-commercial activities remain exempt.
  • Donations to open-source projects do not trigger CRA compliance.

That’s good news for community-driven projects, but there’s a catch: what happens when an open-source tool is used in a commercial product? If a company integrates open-source components into its proprietary software, the burden of compliance shifts to that company. Among other things, manufacturers must document the software components they ship, which means large companies will need to account for their open source dependencies, with possible trickle-down effects on contributors.
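In practice, the CRA expects manufacturers to document their dependencies in a machine-readable software bill of materials (SBOM). As a rough illustration only, here is what a minimal SBOM entry could look like in CycloneDX, one commonly used JSON format (SPDX is another); the component name, version, and package URL below are hypothetical:

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "libexample",
      "version": "2.4.1",
      "licenses": [
        { "license": { "id": "GPL-3.0-or-later" } }
      ],
      "purl": "pkg:github/example/libexample@2.4.1"
    }
  ]
}
```

Tools such as Syft or the CycloneDX CLI can generate files like this from an existing build; whether any given SBOM format will satisfy the CRA’s documentation requirements depends on implementing standards that are still being worked out.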

The Privacy and Security Dilemma

One of the most controversial aspects of the CRA is its requirement to report actively exploited vulnerabilities within 24 hours of becoming aware of them to the following entities:

  • ENISA (European Union Agency for Cybersecurity)
  • National cybersecurity response teams (CSIRTs) in each EU member state

This creates two major risks, not limited to open source software:

  • Potential misuse by government agencies: As seen in the past (think NSA exploits), centralizing security vulnerabilities can lead to mass surveillance or malicious cyber operations.
  • A massive hacking target: With multiple entities storing security flaws, attackers could breach these databases and weaponize zero-day vulnerabilities.

As Andrea put it, “We’re creating a list of the most dangerous security holes in Europe and handing it out like a party invite.”

What About Unfinished Software?

Another murky area is the regulation’s definition of “unfinished software”—a term that includes alpha, beta, and testing versions. While software in these stages is allowed on the market, developers must still assess risks and apply security measures before public releases.

Here’s where it gets weird:

  • The CRA states that unfinished software must be available for a “limited time”—but doesn’t define what “limited” means. A year? A decade?
  • Software must explicitly state that it’s unfinished—but where? A disclaimer in the settings? A pop-up at launch?

For developers who use rolling releases or continuously update their software (think nightly builds), this requirement is a nightmare. Once again, the problem goes beyond open source.

How Do These Issues Affect Open-Source Contributions?

If you contribute to open-source projects, should you be worried? The answer is unclear.

  • Minor contributions are probably safe.
  • Major changes—especially forks or substantial modifications—could make you legally responsible for complying with at least some parts of the CRA.
  • If you maintain a fork that diverges significantly, you might be considered the “official producer” under CRA rules.

The bottom line: If you’re just submitting small patches, you’re likely fine. But if you create a project from the ground up, or take an existing project and develop it into something new, be prepared for extra scrutiny.

What’s Next?

The CRA isn’t all bad—it sets baseline security expectations and clarifies accountability where none existed before, or where it was too weak to be effective. But its broad and vague definitions, especially regarding free and open-source software, could create more confusion than clarity.

For now, major takeaways include:

  • Open-source software is mostly exempt, unless used commercially.
  • Companies integrating FOSS into products must ensure compliance.
  • Privacy concerns around vulnerability reporting remain unsolved.
  • Smaller developers might struggle with bureaucracy.

The good news? There’s still time to make improvements—the CRA’s main obligations won’t apply until December 2027, giving developers, companies, and communities three years to adapt (although the vulnerability-reporting duties kick in earlier, in September 2026).

Final Thoughts

As Moreno, Andrea, and Tommaso wrapped up their discussion, the consensus was clear: the CRA isn’t a disaster, but it’s far from perfect.

While it advocates for stronger security, it creates hurdles for small developers and raises concerns about centralized vulnerability reporting. The open-source community successfully lobbied for important exemptions, but questions remain about how enforcement will play out.

Want to dive deeper? Watch the full discussion on Morrolinux’s channel: (in Italian: “Cyber Resilience Act APPROVATO. Cosa cambia per l’Open Source”).

About Max Roveri:

Massimiliano "Max" Roveri is a writer, blogger, editor, and social media manager. He started writing on the internet in the late '90s and returned to digital media in 2009. Since 2014 he has lived in Ireland, and since 2015 he has been part of the LPI Italy team. He is professionally involved in cultural mediation projects, with an event management side, and in education projects, both professionally and as a volunteer. With a background in humanities and philosophy, he loves to address the ethical and social aspects of Open Source, with an approach that nods to Gregory Bateson and Robert M. Pirsig. Photo: uphostudio
