Microsoft-owned GitHub just dropped a bombshell on millions of developers. Starting April 24, 2026, the company will use your Copilot interactions to train its AI models, and you are already enrolled unless you manually opt out. The move, announced by Chief Product Officer Mario Rodriguez, has sparked fierce backlash across the developer community and raised serious privacy questions about what “private” really means on the platform.
What GitHub Changed and Who It Affects
From April 24 onward, interaction data (specifically inputs, outputs, code snippets, and associated context) from Copilot Free, Pro, and Pro+ users will be used to train and improve GitHub's AI models unless they opt out. Copilot Business and Copilot Enterprise users are not affected by this update, and students and teachers who access Copilot are likewise exempt.
The scale of this policy shift is massive. Copilot reached roughly 20 million all-time users by mid-2025 and had 4.7 million paid subscribers. GitHub says Copilot is used by roughly 90% of the Fortune 100.
The data GitHub plans to collect goes far beyond basic usage telemetry. Collected data types include code snippets, outputs accepted or modified by users, code context surrounding the cursor position, comments and documentation, file names, repository structure and navigation patterns, interactions with Copilot features like chat and inline suggestions, and feedback on suggestions such as thumbs up or down ratings.
Data collected may also be shared with affiliates including Microsoft, but will not reach third-party AI model providers or independent service providers.
The Private Repository Problem
Here is where things get tricky for developers working on sensitive or proprietary projects.
GitHub uses the phrase "at rest" deliberately, because Copilot does process code from private repositories while you are actively using it. This interaction data is required to run the service and could be used for model training unless you opt out.
In plain terms, if you are coding inside a private repo with Copilot active, your code snippets, prompts, and suggestions from that session are fair game for training. GitHub says it still will not use private repository content at rest to train AI models, meaning code simply stored on GitHub remains off limits.
That distinction is paper thin for many developers. For freelance developers and small teams on Pro plans who handle client code, the privacy implications extend beyond their own work to code they do not fully own.
How Developers Can Opt Out Right Now
Developers with privacy concerns have until April 24 to visit github.com/settings/copilot and disable data collection under the Privacy section.
Here is the step-by-step process:
- Log in to your GitHub account
- Go to your profile and click Copilot settings
- Find the Privacy section on that page
- Locate “Allow GitHub to use my data for AI model training”
- Set the dropdown to “Disabled”
If you have multiple GitHub accounts, you will need to repeat this process for each one, as the setting applies individually to each account.
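Because the setting is stored per account, developers juggling several accounts may find a simple checklist useful. The sketch below only generates reminder lines (the account names are hypothetical, and the toggle itself can only be flipped in the web UI; GitHub has not published an API for this setting):

```python
SETTINGS_URL = "https://github.com/settings/copilot"

def opt_out_checklist(accounts):
    """One reminder line per account: the training opt-out is stored
    per GitHub account, so each account must be toggled separately."""
    return [
        f"[{account}] log in and set 'Allow GitHub to use my data for "
        f"AI model training' to Disabled at {SETTINGS_URL}"
        for account in accounts
    ]

# Hypothetical account names, for illustration only.
for line in opt_out_checklist(["personal", "freelance-clients"]):
    print(line)
```

Running this prints one line per account, which is the whole point: missing even one account leaves that account's data enrolled by default.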
Good news if you already opted out before. If you previously opted out of the setting allowing GitHub to collect this data for product improvements, your choice is preserved and your data will not be used for training unless you opt in.
One important thing to know: developers can disable data collection through the Privacy section of their Copilot account settings without losing access to any features. You keep the full Copilot experience either way.
Why GitHub Says It Needs Your Data
GitHub’s justification comes down to one argument: real-world data makes better AI models.
Chief Product Officer Mario Rodriguez justified the shift by pointing to internal testing. Models trained on Microsoft employee interaction data showed "increased acceptance rates in multiple languages" compared to those built on public code and synthetic samples alone. GitHub notes in its FAQs that Anthropic, JetBrains, and corporate parent Microsoft operate similar opt-out data use policies.
But this is not the first time GitHub has changed course on this issue. Copilot's data usage policy has evolved: initially, only Free users were subject to training, then all plans were excluded from training, and now Free, Pro, and Pro+ users are once again subject to training.
That zigzag in policy does not inspire confidence. Developers who trusted that their data was off limits for training now face a different reality, and they only got 30 days' notice before the change kicks in.
Developers Are Pushing Back Hard
The community response has been overwhelmingly negative.
In the GitHub community discussion, users offered 59 thumbs-down votes compared to just three rocket ship emojis signaling excitement. Among 39 posts commenting on the change, only Martin Woodward, GitHub VP of developer relations, endorsed the idea.
“GitHub says privacy is a human right and that trust depends on transparency, control, and privacy by design. Yet your current Copilot policy gives stronger default privacy protection to organizations than to individual people. If privacy is truly a human right, individuals should not receive weaker default protection than companies. Please make this strictly opt in for individual users, with clear notice and an obvious, verifiable control.”
That community post captures the core frustration. Business and Enterprise customers paying premium rates get automatic protection from data collection, while individual developers paying $10 or $39 per month must actively go find the off switch.
Those affected have the option to opt out in accordance with "established industry practices," meaning according to US norms as opposed to European norms, where opt-in is commonly required.
| Plan | Default Data Collection | Can Opt Out |
|---|---|---|
| Copilot Free | Yes (Enabled) | Yes |
| Copilot Pro ($10/mo) | Yes (Enabled) | Yes |
| Copilot Pro+ ($39/mo) | Yes (Enabled) | Yes |
| Copilot Business ($19/user/mo) | No (Exempt) | N/A |
| Copilot Enterprise ($39/user/mo) | No (Exempt) | N/A |
| Students & Teachers | No (Exempt) | N/A |
GitHub states it applies appropriate technical safeguards, including filters designed to detect and remove sensitive data, and de-identification techniques. But the company has not provided specific details about how anonymization works or what safeguards prevent proprietary code from leaking into model outputs.
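GitHub has not described its filters, so as a rough illustration only: snippet-level sensitive-data filtering of the kind it alludes to typically starts with pattern matching for known secret formats. The patterns below are simplified assumptions for this sketch, not GitHub's actual pipeline, which would involve far more (entropy checks, ML classifiers, human review policies):

```python
import re

# Illustrative detectors only. The first two match publicly documented
# token prefixes (GitHub personal access tokens, AWS access key IDs);
# the third catches email addresses.
SENSITIVE_PATTERNS = [
    re.compile(r"ghp_[A-Za-z0-9]{36}"),      # GitHub PAT format
    re.compile(r"AKIA[0-9A-Z]{16}"),         # AWS access key ID format
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # email addresses
]

def scrub_snippet(snippet: str) -> str:
    """Replace anything matching a sensitive pattern with a placeholder
    before the snippet would enter a training corpus."""
    for pattern in SENSITIVE_PATTERNS:
        snippet = pattern.sub("[REDACTED]", snippet)
    return snippet

print(scrub_snippet("contact dev@example.com, key = AKIAABCDEFGHIJKLMNOP"))
```

Even a sketch like this shows why developers want specifics: regexes catch well-known token formats, but proprietary logic, internal hostnames, or novel credential formats slip straight through unless GitHub's real filters go much further.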
This policy change lands at a time when GitHub faces growing competition in the AI coding space. Anthropic's Claude Code, Google's Gemini code assistants, and Amazon's Q Developer are all fighting for the same developer attention. GitHub Copilot holds 42% of the AI coding assistant market as of 2025, but rivals are gaining: Cursor reportedly reached over one million daily users and $500 million in annual recurring revenue.
The race for developer data is heating up, and GitHub just made a bet that most users will not bother to change the default setting. Whether that gamble pays off or pushes developers toward competitors who promise stronger privacy protections remains to be seen. If this policy concerns you, the clock is ticking. Head to your Copilot settings before April 24 and make your choice. Drop your thoughts in the comments below.