Carlos KiK

GitHub Will Train AI on Your Code Starting April 24. You Have to Opt Out.

Starting April 24, 2026, GitHub will use interaction data from Copilot Free, Pro, and Pro+ users to train AI models.

Not opt-in. Opt-out.

That means your code completions, your prompts, your coding patterns, and your problem-solving approaches will feed into future model training unless you actively go into your settings and say no.

Most people will not. That is the point.

The opt-out playbook

This is the same playbook every major platform runs. Launch a feature that collects data. Make it on by default. Bury the opt-out in settings. Announce it in a blog post that 0.1% of users will read. Wait 30 days.

By May, millions of developers will be training GitHub’s AI models without knowing they agreed to it. Legally, GitHub can point to the announcement and the opt-out option. Practically, informed consent requires that the person actually be informed, and a blog post read by almost no one is not informed consent.

What they are actually collecting

“Interaction data” is deliberately vague. It could mean the code you write. The prompts you type. The suggestions you accept or reject. The context window around your cursor. The patterns in how you solve problems.

For a solo developer, this might feel abstract. For a company whose proprietary algorithms are being written with Copilot assistance, this is a data exfiltration pipeline that was just turned on by default.

The question that keeps coming up

Every AI company faces the same tension: the models need training data, and the best training data is the real-world usage of their products. The users who generate that data are also the customers paying for the product.

You are simultaneously the customer and the raw material. The subscription fee is not the full price. The full price includes your data.

What you can do

If you want to opt out:

GitHub → Settings → Copilot → scroll to “Allow GitHub to use my data for product improvements” → uncheck.

Do it before April 24. After that date, any interactions recorded before you opt out may already be in the training pipeline.

Or do nothing, and accept that your coding patterns, your problem-solving approaches, and your prompts are now training data for a product that Microsoft will sell back to you.

Your choice. As long as you know it is a choice.




