🤠 Weekend Edition: You Are The Training Data Now
A Saturday editorial from your friendly neighborhood IT support group 🤠
You're not being replaced by AI. You're being asked to fund it — and to train it on the way out.

GM IT pros,
Happy Saturday! Pouring the second cup of coffee and sitting down to write this one, because yesterday's roundup felt incomplete. The individual stories — Oracle, Copilot, Patch Tuesday — each got their one-liner. But there's a thread running through all of them this week that deserves more than a snarky bullet point.
So this is a Saturday editorial. One topic. No bullet list. Just a take. Feel free to hit reply and tell me I'm wrong.
-Stetson
You Are The Training Data Now
A weekend read for the people keeping the lights on
🧱 The Three Headlines That Are Actually One Headline
This week had three stories that, on the surface, look unrelated. Oracle fired roughly 30,000 people via a 6 AM form email signed "Oracle Leadership." GitHub quietly updated its privacy policy so that, starting April 24, every prompt, completion, snippet, and cursor context from Copilot Free, Pro, and Pro+ users will feed Microsoft's training pipeline unless you manually opt out. And the cumulative 2026 tech layoff counter rolled past 85,000 — with analysts telling reporters that nearly half those cuts are being explicitly attributed to AI and workflow automation.
These are not three stories. This is one story, told three times.
The story is this: a large part of the enterprise IT economy is being restructured so that the humans who used to do the work are paying for the GPUs that will try to do it instead. The Oracle layoffs, per TD Cowen's math, free up $8-10 billion in cash flow for AI data center capex. Oracle took on $58 billion in new debt in two months for the same buildout. Free cash flow hit negative $10 billion last quarter. The people who got the 6 AM email were not "cut because the company is struggling." They were cut because the company is trying to buy a lot of GPUs and the finance team needed a source of funds. That's a different thing. It's a reallocation, and "reallocation" is a clean corporate word for "the humans are the line item being reassigned."
The Copilot story is the same story from the other end of the pipeline. Microsoft doesn't just need money and land and power and silicon to build these models — it needs corpora. And the most valuable corpus in the world, by a wide margin, is the stuff being typed into IDEs every day by working engineers: real code, real comments, real fixes to real bugs, annotated with whether the human accepted the suggestion or rewrote it. That is a training signal you cannot buy at any price. So GitHub is now going to collect it by default, and the opt-out lives several clicks deep in a settings page almost nobody visits. The reason Business and Enterprise tiers are exempt isn't because Microsoft is being kind. It's because the people paying for Enterprise have lawyers who would notice. The people on Free and Pro mostly won't.
💸 "We're a Family" Was Always a Budget Line
The Oracle email is the part that should stick with IT people specifically, because it says the quiet part out loud in a way we don't usually get.
A layoff done well — the kind of layoff where HR actually calls you, your manager has a conversation with you, severance is negotiated, equity is handled, your access is wound down over a week — costs a lot of money per head. A layoff done via a pre-dawn form email signed by a noun phrase ("Oracle Leadership") is much cheaper per head, because you are spending less HR time, less legal time, less manager time, and less decency. The delta between those two numbers, multiplied by 30,000, is real money. Not GPU money. But real money.
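To make that delta concrete, here is a back-of-envelope version of the math. Both per-head figures are invented purely for illustration; nothing below is a reported number from Oracle or anyone else.

```shell
# Back-of-envelope only: both per-head costs are hypothetical,
# not reported figures. The shape of the math is the point.
well=3000    # hypothetical cost per head, layoff done with decency (USD)
email=200    # hypothetical cost per head, 6 AM form email (USD)
heads=30000
delta=$(( (well - email) * heads ))
echo "Savings from doing it the cheap way: \$$delta"   # prints $84000000
```

Even with deliberately modest made-up numbers, the difference lands in eight figures. Not GPU money, as the editorial says. But real money.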
Every IT department in the country is going to feel some version of this over the next 18 months, and the pattern to watch for is not the layoff itself. It's the way the layoff is executed. If your company is still calling people into a room, that tells you something about how the company thinks about its workers. If the company is sending 6 AM emails signed "Leadership" and revoking badge access at 6:05, that tells you something different — and it tells you that when your number comes up, the same thing will happen to you. The Oracle method isn't a mistake or a PR misstep. It's a cost-saving measure, and cost-saving measures get copied.
The "we're a family" emails are not gone, by the way. They're just now being sent to the survivors. Watch for them next week.
🔁 What This Means For The People Doing The Actual Work
Here's where it gets practical, because an editorial without something you can actually do on Monday is just a blog post.
One, on the Copilot thing: if you're on Free, Pro, or Pro+, go to github.com/settings/copilot/features right now and disable "Allow GitHub to use my data for AI model training" under the Privacy heading. It takes ten seconds. The change takes effect immediately. Your previous opt-out on "use my data for product improvements" does carry over, which is the one nice thing GitHub did in this announcement, but the new toggle is separate and it's on by default starting April 24. Do not wait. If you administer Copilot Business or Enterprise for your org, the good news is you're exempt, but tell your devs anyway, because most of them also have personal Pro accounts and those personal accounts are now in scope.
Two, on the layoff thing: this is the week to quietly update your personal documentation. Not the stuff in the company wiki — the stuff that lives on your own machine. Your own runbook, your own "how I did that weird fix in 2023" notes, your own list of cert renewal dates, your own network diagram scribbled on a Joplin page. The thing about the 6 AM email is that it ends badge access at 6:05, which means everything on a corporate device disappears from your life at the same moment you'd most like to refer to it. You don't need to exfiltrate anything — just make sure the knowledge that belongs to you as a professional lives somewhere you personally control. Think of it as a BitLocker recovery key, but for your career.
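In practice this can be as simple as a periodic snapshot. A minimal sketch, assuming your notes live in a single directory; the paths here are placeholders, not a recommendation of any particular layout:

```shell
# Minimal sketch: snapshot a personal notes directory to storage YOU control.
# NOTES_DIR and DEST are hypothetical defaults -- point them at your own setup.
NOTES_DIR="${NOTES_DIR:-$HOME/notes}"    # runbooks, cert dates, diagrams
DEST="${DEST:-$HOME/personal-backups}"   # somewhere not corporate-managed
STAMP=$(date +%Y%m%d)

mkdir -p "$NOTES_DIR" "$DEST"            # create both so a first run succeeds
tar -czf "$DEST/notes-$STAMP.tar.gz" \
  -C "$(dirname "$NOTES_DIR")" "$(basename "$NOTES_DIR")"
echo "Snapshot written: $DEST/notes-$STAMP.tar.gz"
```

Run it from cron or by hand on Fridays; the point is only that the archive exists somewhere a revoked badge can't reach.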
Three, on the AI-capex thing: this is the week to get clear-eyed about which parts of your job are actually load-bearing for the business and which parts are ceremony. The layoffs happening right now are not targeting the people who patch the Fortinet box at 11 PM when CISA drops a nine-point-one. They are targeting the people whose work can be described on a slide as "repeatable process." If your work can be described that way, the honest thing to do is to start describing it a different way — in terms of judgment, tradeoffs, weird edge cases you caught that no model would, incidents you prevented that never made it to a ticket. The Forrester prediction about two multi-day hyperscaler outages in 2026 is not a prediction about infrastructure. It's a prediction that the people who know how to debug the legacy plumbing are being cut, and the models do not yet know what those people knew. Be one of the people who knew.
🪞 The Uncomfortable Mirror
There is an uncomfortable thing I should also say, because it would be dishonest not to.
A lot of us — and I include myself — use Copilot every day. A lot of us ship faster because of it. A lot of us wrote the scripts that automated parts of the helpdesk workload that used to belong to a more junior person who is no longer employed. A lot of us, if we're being real about it, are both the labor being squeezed in this cycle and the people doing a small amount of the squeezing. You cannot cleanly be on only one side of this. The code I wrote this year to auto-triage a certain class of ticket absolutely eliminated work that a person used to do, and I am the same person who is nervously watching the Oracle email make its rounds.
That's not a reason to feel bad. It's a reason to be clear. The AI economy is not something being done to IT from outside. It's being built, largely, by us — by the people reading this email — one PowerShell script and one Copilot suggestion at a time. Which means the conversation about "how should this actually work" is a conversation we are allowed to be in, not a conversation happening over our heads. We know where the code is hidden. We know which automations actually work and which ones are held together with a single environment variable in prod. That is leverage. Use it.
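For a sense of what that kind of automation looks like in practice, here is a deliberately tiny sketch of keyword-based ticket triage. The queues, keywords, and actions are invented for illustration; a real version would map to your ticketing system's actual fields:

```shell
# Toy keyword-based auto-triage, the kind of script the editorial describes.
# Queue names and actions are hypothetical examples, not a real integration.
triage() {
  s=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  case "$s" in
    *password*|*"locked out"*)  echo "queue:identity action:send-reset-link" ;;
    *vpn*|*"cannot connect"*)   echo "queue:network  action:link-kb-article" ;;
    *disk*|*"storage full"*)    echo "queue:infra    action:page-oncall" ;;
    *)                          echo "queue:human    action:none" ;;  # ambiguous -> a person
  esac
}

triage "Password reset please"    # -> queue:identity action:send-reset-link
triage "Prod disk full on app01"  # -> queue:infra    action:page-oncall
```

Note the fallback case: anything the keywords don't cover still routes to a human. That residue of judgment-call work is exactly the part the editorial argues you should be writing down.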
🎯 What To Do This Weekend (If You Want A List)
Fine, one list, because I can't fully quit the habit.
- Turn off the Copilot training toggle on your personal GitHub account before April 24.
- Export your personal notes off any corporate-managed device.
- Take an hour to write down, in your own words, three things you did this quarter that were genuinely judgment calls, not runbook executions — you'll want these the next time someone asks what you actually do.
- Patch the Fortinet EMS box if you haven't already; CVE-2026-35616 is being actively exploited and CISA's deadline was yesterday.
- And if you run a Flowise instance you stood up "just to try it," log in and kill it, because the 10.0 CVE is real.
That's your Saturday homework. Do it before the coffee wears off.
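For the Flowise item specifically, a sketch for hunting down a forgotten instance. This assumes it was started via the `npx flowise start` quickstart; if you used Docker instead, `docker ps` and `docker stop` are the equivalents:

```shell
# Sketch: find and stop a forgotten Flowise process. Assumes the npm/npx
# quickstart launch; adapt the pattern if you deployed it another way.
stop_flowise() {
  pids=$(pgrep -f "flowise start" || true)
  if [ -n "$pids" ]; then
    kill $pids
    echo "stopped flowise: $pids"
  else
    echo "no flowise process found"
  fi
}

stop_flowise
```

If it reports nothing found, check for a Docker container too before declaring victory. An exposed experiment you forgot about is the exact shape of thing a 10.0 CVE eats first.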
🥷 A Quick Word From The Shameless Plug Department
If this editorial has made you want to get genuinely comfortable with the terminal — not "I can sudo apt update" comfortable, but "I can look at a stack trace at 2 AM and not flinch" comfortable — that's what Shell Samurai is for. It's a hands-on Linux trainer I built because every IT pro I know wishes they'd spent more time with the shell before they needed to. Safe sandbox, real commands, actual feedback. shellsamurai.com for the pitch, app.shellsamurai.com to jump straight into a lesson. No signup gates, no upsell. Built by yours truly, zero regrets.
One last thing before the sign-off: I read every sponsor pitch before I ship it. The one below made the cut. Give it ten seconds — click if it's for you, skip if it's not. I'd rather you trust me than click out of guilt.
88% resolved. 22% stayed loyal. What went wrong?
That's the AI paradox hiding in your CX stack. Tickets close. Customers leave. And most teams don't see it coming because they're measuring the wrong things.
Efficiency metrics look great on paper. Handle time down. Containment rate up. But customer loyalty? That's a different story — and it's one your current dashboards probably aren't telling you.
Gladly's 2026 Customer Expectations Report surveyed thousands of real consumers to find out exactly where AI-powered service breaks trust, and what separates the platforms that drive retention from the ones that quietly erode it.
If you're architecting the CX stack, this is the data you need to build it right. Not just fast. Not just cheap. Built to last.
That's the editorial. Back to the regular snark-filled roundup next Friday. In the meantime: go touch the settings page, patch the Fortinet box, write down what you actually do, and have a real weekend. The world will still be burning on Monday.
Stay paranoid. Stay patched. See you next Friday 🤠
-Stetson

