Young readers have always been the heart of my audience. Writing for them is where I started my publishing journey twenty years ago, and I continue to do so in my Draconim and MacIver Kids series. My stories are written for them: their grief, their fire, their fierce sense of justice. But I’ve also always known I couldn’t, in good conscience, invite them into the same digital spaces that constantly exploit their attention, data, and trust.
From the beginning, I chose not to collect
data through my author website. No mailing lists, no subscriber pop-ups, no
personalized ads. Not because I didn’t want to grow—I did, and still do—but
because I could never guarantee that data, once gathered, would be fully
secure. And I wasn’t willing to risk the safety of the very readers I hoped to
reach. Recent headlines, like “Hackers are targeting a surprising group of people: young public school students” (OPB), “Thousands including children exposed in major data breach” (Tom's Guide), and “Children's data hacked after school software firm missed basic security step” (NBC), should raise both our
hackles and our awareness. If even large institutions can’t protect children’s
data, how can we?
It’s a quiet stance. One that looks a lot like
doing nothing. In a publishing world that rewards metrics, funnels, growth
curves, and mailing list conversions, my refusal can seem naive or
self-sabotaging, and perhaps it is. After all, I depend entirely on book
sales—and limiting the tools I can use to reach readers is counterproductive.
But it has always felt necessary.
A Loophole Culture
Social media has long been a tool used to
skirt the very laws and codes designed to protect young people. Take for
instance the “Broadcast Code for Advertising to Children” in Canada. It
includes clauses like “Children’s advertising must not directly urge children
to purchase or urge them to ask their parents to make inquiries or purchases” and
“Direct response techniques that invite the audience to purchase products or
services are prohibited.” These principles exist in many countries in some
form—yet across the board, enforcement is inconsistent, and social platforms
often act as if they exist above those rules. Laws like COPPA in the U.S. or
GDPR-K in Europe were intended to shield minors from exploitation, but their
enforcement has been limited—especially when it comes to influencer-style
content and microtargeted outreach.
Advertising to children is supposed to be
strictly regulated—on paper, at least. In practice, platforms like TikTok and
Instagram offer frictionless access to teen and youth audiences, and authors
are often encouraged to "just be authentic" as a workaround to the
advertising rules that apply more directly to traditional media.
The problem is, many of us aren’t just being
authentic. We’re building brands. We’re tracking engagement. We’re optimizing
hashtags, timing posts, and nudging readers toward buy links. And if that’s not
advertising, it’s close enough to feel uncomfortable—especially when we’re
doing it in a space where our readers are young, impressionable, and often
invisible behind anonymous handles.
It’s not that I blame authors for using these
tools. The pressure to be visible in an attention economy is enormous,
especially for indie creators without a marketing department behind them. But I
do think we need to talk more honestly about what we’re doing when we use
social media as our primary path to young readers.
Many find reassurance in using trusted
platforms to manage their mailing lists. These tools offer a layer of
protection—but is it enough? And is the platform you're using compliant with
the privacy laws of every country your readers might live in? How many of us
have downloaded our subscriber lists to a personal device, just in case—and how
secure is that laptop, really?
A New Landscape, or Just a Clearer One?
With the UK’s new child safety regulations coming into force (rules that change how platforms must handle content likely to be seen by minors), we may be entering a new phase. Not a surprising one, but a clarifying one.
These changes could make it harder for YA
authors to reach their intended audience directly. Algorithms may become less
predictable. Accounts may be flagged, content shadowbanned, or reach throttled.
And while that may feel like a setback, it might also be a long-overdue signal:
the system we’ve all been relying on was never built for this kind of outreach.
Not ethically. Not safely.
At the same time, the alternative paths we
once relied on—the slow, steady routes through schools and libraries—are
becoming less accessible too.
The Gatekeepers Are Shifting
Once, we relied on librarians, teachers, and
booksellers to act as bridges between authors and young readers. But in an era
of rising book bans, state-mandated curriculum restrictions, and moral panic
over what young people should be “allowed” to read, those bridges are burning.
The people most qualified to guide youth
toward challenging, expansive, and compassionate stories are under siege. And
in many cases, it’s indie authors—especially those writing about climate,
queerness, neurodiversity, or racial justice—who are most likely to be locked
out of institutional channels.
We are being squeezed from both ends: told not
to market to teens and youth directly, while also losing the allies who once
helped us reach them responsibly.
Do We Need More Laws, or Just Better Ones?
One could argue that what we really need is
tighter digital regulation—more protection for minors, clearer rules around
consent and data, and harsher penalties for platforms that fail to comply.
But part of me wonders: do we need new
laws, or do we just need to enforce the ones we already have? And shouldn’t
those same standards apply not just to corporations with billion-dollar ad
budgets, but to the everyday content creators who are (often unwittingly)
running the same exploitative playbook?
What would it look like to create outreach
strategies that serve young readers without exposing them? What would it
mean to design tools that indie authors could use: tools designed with care in
mind, not conversion?
What Comes
Next?
I don’t have perfect answers; really, I don’t have any answers at all, only
questions. What I do know is that I want to reach young readers without
compromising their safety or my ethics. I want to be part of a world where
stories meant for teens can find teens without relying on the same
systems that have failed them in every other way.
So I’ll end this with a question: what would
help us get there?
If you're a parent, indie author, publisher,
bookseller, librarian, coder, educator—anyone asking the same questions—I'd
love to hear your thoughts. What tools do we need? What models could we build?
And what would it look like to imagine a future where our connection to young
readers is built on trust, not surveillance?
Let’s talk. Share your thoughts and
suggestions in the comments here, or on Threads.
And if you don’t have answers yet—that’s okay.
I don’t either. This isn’t a test, it’s an invitation. A space to wonder, to
question, to imagine something better—together.