Moving to Codeberg


Over the coming weeks and months, we'll be moving all of our projects to Codeberg! Let's talk about why.


Since the release of ChatGPT in 2022, the world has seen rampant growth in generative AI tooling. As many of our users will be aware, there's no such thing as a free lunch: this tooling comes at a huge cost, both literal and otherwise.

As GitHub becomes directly controlled by Microsoft's AI division and engineers are forced to implement AI tools that don't align with their ethics, we've been thinking about how much we rely on it — and what we can do to lower that reliance.

We've banned contributions made with generative AI tools from our projects since 2023, but now it's time to take things a step further.

What Happened?

In 2021, GitHub announced the first technical preview of GitHub Copilot as an extension for Visual Studio Code. Initially powered by OpenAI's Codex model, Copilot promised to speed up software development by letting users generate code from simple, natural-language prompts.

It didn't take long for the tool to start generating controversy, when it reproduced Quake's fast inverse square root function verbatim, complete with an incorrect licence. In 2022, the Software Freedom Conservancy began calling on users to Give Up GitHub, providing detailed reasons to move away and a list of resources to help with that.

Fast-forward to 2025, and we can see how that first version of Copilot was a harbinger of what was to follow. Not only has GitHub shown that Copilot was trained on open-source GitHub projects without permission, but Copilot has become a major focus for Microsoft, which is even building the next version of Windows around it. GitHub itself has become infested with AI tools: an intrusive prompt input box on the homepage that provides a poor version of potentially useful features that should be a core part of the platform, AI generation tools for issues and PRs that projects can't opt out of, and a deluge of bots that have replaced real, human review with hallucinatory and dangerous AI-generated code reviews.

When the GitHub CEO announced that he would step down at the end of 2025 and that GitHub would be directly controlled by Microsoft's AI division, we decided that enough was enough. It was time to leave before the platform became even more abusive and our rights under copyright law were infringed upon even further.

Moving to Codeberg

In June, in the #make-yourself-heard channel of our Discord server, we asked our community what we should do and where we should potentially move our projects. After some discussion, our community members overwhelmingly voted to move our projects to Codeberg, so that's what we're doing!

As of this blog post, we've set up our own Forgejo Actions runner and finished moving our two template projects to our Codeberg organisation. These projects prove that moving to Codeberg (or another Forgejo host, including potentially our own in the future) is a viable approach for us, and we'll be working on moving our other projects there over the coming months.

We plan on maintaining GitHub Actions workflows for our template projects, as we understand that our users may wish to continue using GitHub Actions.

Kord Extensions doesn't generate any income as of this writing, and the project owner doesn't have much money, but we intend to join Codeberg e.V. in the future if this changes or if regular donations can cover the cost.

Thanks for being here!

Kord Extensions can only be what it is thanks to its users. It has been a long and winding road, but one we feel has been well worth travelling, and we intend to continue down it well into the future.

To all users, past, present, and future — thanks for sticking with us!


AI Tool Notes

For those not in the loop, here are some key points about generative AI tooling:

  • Generative AI tools do not think or feel. They can't draw logical conclusions, and they often hallucinate their output and break the rules set out by their prompts.
  • Generative AI tools rely on other people's work, and have often been called "copyright violations at scale". They can't produce any meaningful output without consuming and training on work made by real people, and AI companies often state that they must be granted copyright exceptions to be able to continue operating.
  • Generative AI tools benefit the rich by giving them a reason to fire swathes of people (and replace them with AI), and a way to leverage copyright law for themselves while working to remove its protections from regular people.
  • Generative AI tools have been shown to be incredibly bad for the environment, and companies and governments rushing to take advantage of gaps in the market have markedly lowered living standards for already disadvantaged minorities.
  • Generative AI tools rely on exploiting all forms of labour without giving back in return, amounting to what some workers have described as "modern-day slavery".