In early 2025, something happened that I initially dismissed as minor platform drama. The Scratch Foundation updated their Terms of Service to explicitly allow the use of community-created content - projects, code, art, music, everything kids had uploaded - for AI model training.
The Scratch community did not take it well.
Kids - actual children - started unsharing their projects in protest. Forums lit up with outrage. Parents who'd encouraged their children to share creative work on the platform suddenly realised that "share" might mean something very different to the Scratch Foundation than it did to them.
I should have paid closer attention from the start, because this isn't abstract for us. The Code Zone teaches kids to code using Scratch. It's foundational to our curriculum. Our students create Scratch projects every week. This policy change directly affects the children we teach.
What Actually Happened
Let me be precise about what changed, because the nuance matters.
Scratch has always been an open platform. Projects shared on the Scratch website are shared under a Creative Commons licence, which means anyone can view, remix, and build on them. That's been the deal from the start, and it's genuinely wonderful - it creates a global community of young creators learning from each other.
The TOS update added explicit language allowing the Scratch Foundation to use community content for "research purposes", which includes - but isn't limited to - AI model training. The argument is that this helps improve educational tools. The counter-argument is that children who shared their projects didn't consent to having their creative work used to train commercial AI systems.
Both arguments have merit. But only one of them is being made on behalf of children who may not fully understand what they're consenting to.
The Consent Problem
Here's where it gets genuinely uncomfortable. Scratch's primary audience is children aged 8-16. When a ten-year-old shares a project on Scratch, they're thinking "I want my friends to see my game." They are not thinking "I consent to my code being used as training data for large language models."
You can argue that the TOS covers this. That by using the platform, users agree to the terms. And legally, that's probably correct. But ethically? A ten-year-old clicking "I agree" on a Terms of Service document is not meaningful consent. Adults don't read TOS documents. Expecting children to understand the implications is absurd.
The Creative Commons licence that Scratch uses was designed for human sharing and remixing. The spirit of the licence is "other kids can learn from your project." Using that same licence to justify feeding children's creative output into AI training pipelines stretches the spirit well beyond breaking point.
Why This Matters to Us
The Code Zone is built on Scratch. Our youngest students start with it. They learn loops by making cats walk across the screen. They learn conditionals by making games respond to player input. They build stories, animations, games - real creative work that they're proud of.
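For readers who've never seen Scratch: the programs are built from visual blocks rather than typed code, but the logic is real. Here's a rough Python sketch of the kind of loop-and-conditional thinking a beginner's project involves - the names and numbers are ours, purely for illustration, not anything from Scratch itself.

```python
# A rough Python analogue of a beginner's Scratch project.
# Scratch is block-based; this sketch (names and numbers included)
# is purely illustrative.

cat_x = 0  # the cat sprite's horizontal position on the stage

# A loop: walk the cat across the screen, ten steps at a time.
for _ in range(20):
    cat_x += 10
    print(f"Cat is at x = {cat_x}")

# A conditional: make the game respond to player input.
key = input("Press a key (type 'space' to jump): ")
if key == "space":
    print("The cat jumps!")
else:
    print("The cat keeps walking.")
```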
When we encourage a child to share their project, we're encouraging them to be part of a community. We're saying "your work has value, and other people can benefit from seeing how you solved this problem." That's a powerful message for a young learner.
If "share your project" now implicitly means "donate your creative work to AI training datasets", that changes the message. It introduces a transactional element to what should be a purely creative and educational act. And it puts us - as educators who recommend the platform - in an awkward position.
The Broader Pattern
Scratch isn't unique here. The entire tech industry is grappling with the question of training data. Social media posts, open-source code, public forum answers, creative works - all of it is being hoovered up to train AI models, often without explicit consent from the creators.
But there's a meaningful difference when the creators are children. Adults posting on Stack Overflow or GitHub have some understanding that their content is public and may be reused. They're making an informed choice. Children sharing Scratch projects are not making the same kind of choice, because they can't - they don't have the context or maturity to understand what AI training means.
This isn't anti-AI. I use AI every day. I'm writing this blog post in a workflow that involves AI at multiple stages. I believe AI is genuinely transformative. But I also believe that training AI responsibly means being particularly careful about consent when the data creators are children.
What We Tell Parents
We've had parents ask about this. Our response is straightforward:
Scratch is still an excellent learning tool. Nothing about the TOS change affects the educational value of learning to code in Scratch. The block-based programming, the creative projects, the logical thinking - all of that is unchanged and still brilliant.
Sharing is optional. Children can use Scratch locally, or share projects privately, without contributing to the public repository. If parents are uncomfortable with the data use implications, private projects are a perfectly good alternative.
The conversation is the point. Whether or not you're comfortable with the TOS, this is an opportunity to talk to your child about digital consent, data ownership, and what it means to share creative work online. These are important conversations that go way beyond Scratch.
What We'd Like to See
We're not calling for pitchforks. The Scratch Foundation does genuinely important work making coding accessible to millions of children worldwide. But we think the AI training question deserves more transparency and more meaningful consent mechanisms.
Opt-in, not opt-out. If children's projects are going to be used for AI training, that should be an explicit, clearly explained opt-in, not a buried clause in a TOS document. Make it a visible choice with a simple explanation of what it means.
Age-appropriate explanation. If you're going to ask children for consent, explain what you're asking in language they can understand. "Your project might be used to help teach computers how to code" is understandable. Legalese is not.
Transparency about usage. Which AI systems are being trained on Scratch data? What's the commercial relationship? Is the Scratch Foundation being paid for access to this data? These are questions that deserve clear answers.
The Bigger Question
This story raises a question that's going to define the next decade of AI development: who owns the training data?
When a ten-year-old writes a Scratch program that makes a dinosaur dance, they've created something. It's simple, it's probably full of bugs, and the dinosaur probably clips through the floor. But it's theirs. They thought of it, they built it, they debugged it (with help), and they're proud of it.
Using that creation to train AI systems without meaningful consent isn't illegal. It might even produce genuinely useful educational AI tools. But it's a choice that should be made with children's interests at the centre, not as an afterthought buried in updated terms and conditions.
We teach kids to code because we believe in empowering them to create. That empowerment has to include understanding and controlling how their creations are used. Anything less isn't education - it's extraction.