
The boy who sent me poems.

In early spring of 2023 I received messages with poems about me. It was odd; the boy who sent them had never shown any interest in writing poetry. But the poems were surprisingly good.

Looking back now, it's obvious that the poems weren't really written but prompted. I didn't know that then. At the time it was all new. The idea that you could type in crumbs of thought and end up with a finished, polished piece of writing felt very abstract.

It is still abstract when you think about it, because honestly, how many of us know how LLMs actually work? The boy who sent me those poems tried to explain it to me over and over. My best understanding is that an LLM is an oversized autocomplete, stuffed with more books, articles, and scraps of knowledge than anyone could ever consume.
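If that "autocomplete" picture still sounds abstract, here is roughly what he meant, shrunk down to a toy you could actually run. To be clear, this is not how a real LLM works inside (those use enormous neural networks over token probabilities, not word counts); it is only a sketch of the basic loop he kept describing: look at what came before, predict the most likely next piece, append it, repeat. The tiny corpus and the autocomplete function below are invented just for the illustration.

```python
# A toy "autocomplete": count which word tends to follow which,
# then keep predicting the most likely next word. Real LLMs use
# neural networks over token probabilities, but the core loop --
# predict the next piece, append it, repeat -- is the same idea.
from collections import Counter, defaultdict

corpus = "the boy wrote a poem about the girl and the girl read the poem".split()

# Count how often each word follows each other word in the corpus.
next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def autocomplete(prompt: str, length: int = 5) -> str:
    """Greedily extend the prompt with the most frequent next word."""
    words = prompt.split()
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(autocomplete("the boy"))  # -> "the boy wrote a poem about the"
```

Scale that loop up to trillions of words and billions of parameters, and you get something that can pass for a poet.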

Since that first poem, billions of people have used some form of AI. We started with text and code generation; soon after came images, video, voice, games, design, software... We use it for everything. We adjusted to this new reality so quickly, even though we seemed so scared at the beginning.

I was scared, for sure. I didn't want AI to do my creative assignments for me. I didn't want it generating art or writing books. But I adjusted, like most of us did. I discovered tools that helped me work better and more independently.

From the beginning, though, one thing was at risk: critical thinking. In my opinion we are now entering its crisis. Our attention was under siege long before any chatbot appeared: social media, infinite scroll, the easy content we kept shoveling in without restraint. That kind of overload shrinks attention spans, pokes holes in memory, and erodes the words we can reach for.

Right now it shows in homework kids generate instead of doing, and in social media posts people copy and paste that all sound as if they were written by the same marketer. Even the comments under those posts often read like chatbot output. If we stop thinking for ourselves in so many areas, isn't that a crisis? We are beginning to drown in generated content. We hand over crumbs of thought, receive polished paragraphs, scan them superficially, and copy-paste. It's all fine until you start outsourcing the things that actually matter to you and interest you, the very work your brain needs in order to stay alive.

In my darkest imagining of how this could evolve, it leads to a new kind of illiteracy, where the world is divided between those who can read, write, and reason, and those… well, who can't.

And the question behind all of this: why do we generate it in the first place? Because we're not interested? Because we want to save time? But when the work is meaningful, when it feeds our intelligence, why outsource it?

The MIT study from 2025 ("Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task") suggests that when used as a helper, AI is a great enhancement tool. Used as a replacement for thinking, it does exactly that: it replaces our thinking and leaves our minds almost empty.

Remember when we used to type different combinations of phrases into search engines, fishing for the right answer? Now, if Google doesn't give us the answer on the first try, we get frustrated. And with AI we go further, prompting it for answers in arguments, essays, blog posts, everyday decisions. Have we stopped trusting our own basic knowledge? Do we really need help deciding what we think?

Personally, I just don't find it attractive to use AI where we could clearly use our own intelligence. I would love to see AI as a tool that makes us better because it takes on the bullshit tasks that were never enriching in the first place. Let's draw that line before the slide into a new illiteracy becomes real.