In the 1980s and 1990s, many kids like me were introduced to the world of computing thanks to the legendary Commodore 64, the first home-assembled PCs, and rudimentary operating systems and environments like DOS and Windows 3.11. I still vividly remember my first Commodore 64, a gift from my parents in 1989. I was just seven years old, in the second grade of my small hometown school, and the wonder I felt when powering up that computer, with its unmistakable blue and cyan screen, was priceless. That little device opened up an entire universe before me, filled with challenges, study, dedication, and gratification.

Back then, learning to use a computer meant reading manuals, deciphering BASIC instructions, and often banging your head against problems that would seem absurd today. With early personal computers and DOS, you often had to open the MS-DOS EDIT editor and modify the autoexec.bat and config.sys configuration files, sometimes just to be able to launch a game. I also discovered that the QBasic environment bundled with MS-DOS included two games, Nibbles and Gorillas, whose source code was entirely written in BASIC and freely accessible. All that effort made every achievement, even just loading a game from a cassette or a floppy disk, feel like a rewarding victory.

Movies of that era celebrated computing: think of titles like WarGames, Tron, and The Lawnmower Man. The computer scientist was portrayed as a fascinating and adventurous figure, a “hacker” capable of mastering the most advanced technologies. I felt that same thrill years later when, while writing my thesis for the final high school exam, I discovered Debian GNU/Linux. At that time there was no ADSL yet; everything was done with an old 56k analog modem. I remember ordering the Debian Potato CDs from an Italian Linux User Group. It was an exciting experience that deeply shaped my computing journey.

Today, however, something seems to have been lost along the way. In the United States, the term coding clearly identifies “real programming.” American schools teach programming languages like Python or JavaScript at an early age, truly preparing students for a STEM career. Thanks to this mindset, the most important tech companies in the world were born in the U.S., revolutionary movements like open source emerged, and today the U.S. is pioneering the use of artificial intelligence.

In Italy, on the other hand, the concept of coding is often reduced to a simple teaching tool for “computational thinking” through graphical software like Scratch. These tools, in my view, do not simplify the fundamentals of programming—if anything, they complicate them by adding a layer of visual abstraction that can further confuse students.

Allow me a quip: in Italy we still get scandalized if someone misuses the subjunctive, yet it seems normal—and almost a badge of honor—to say: “Oh well, I don’t understand anything about computers or math,” as if it were absolutely acceptable—or even commendable—not to care about or grasp the basics of scientific subjects.

Italian-style coding was born with a noble intent: teaching computational thinking. However, it is often treated as something abstract and disconnected from real programming, almost like a philosophical exercise.

Studying programming with real languages like Python and C, on the other hand, makes the process of breaking down complex problems into smaller, manageable ones completely natural. Learning these languages means applying this methodology in practice.
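
To make this concrete, here is a minimal sketch of that decomposition at work in Python. The problem and all the names are illustrative, not drawn from any particular curriculum: one task is split into three small functions, each easy to reason about on its own.

def read_grades():
    # In a real program these might come from a file or from user input.
    return [7, 8, 6, 9, 10]

def average(values):
    # One small, testable piece of the overall problem.
    return sum(values) / len(values)

def report(values):
    # Compose the smaller pieces into the final result.
    print(f"Grades: {values}")
    print(f"Average: {average(values):.2f}")

report(read_grades())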

Perhaps I was lucky: in middle school, I had a math teacher who was passionate about computers, with whom I often discussed programming in BASIC—both for the Commodore 64 and for PCs. In high school, I had an excellent math teacher who made us apply what we learned directly in the lab, using Turbo Pascal as a teaching language. Today, that same role could be played by Python. And I cannot forget my computer science teacher, who introduced us to C. In our very first lesson—still unforgettable to me—he explained the concept of algorithms with examples such as making coffee or solving the Tower of Hanoi, telling us: “If you can solve something with pen and paper, then you can also program it.” It was when we started writing code that techniques emerged, methods were refined, and we discovered how programming languages naturally trained the mind in what we now call computational thinking.
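
Incidentally, the Tower of Hanoi he used as an example has a classic recursive solution that shows exactly this kind of decomposition. A minimal sketch in Python, which today could play the teaching role Turbo Pascal once played for us:

def hanoi(n, source, target, spare):
    # Moving n disks reduces to two smaller subproblems plus one single move.
    if n == 1:
        print(f"Move disk 1 from {source} to {target}")
        return
    hanoi(n - 1, source, spare, target)   # clear the n-1 smaller disks out of the way
    print(f"Move disk {n} from {source} to {target}")
    hanoi(n - 1, spare, target, source)   # stack them back on top of the big disk

hanoi(3, "A", "C", "B")   # solve for three disks, with pegs labeled A, B, C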

Of course, none of us knew we were learning “computational thinking”: we were simply writing code, creating small games and educational programs, and having fun. We went home and kept programming, making games, exercises, and applications—also because our teacher had very high expectations. Naturally, not everyone liked the subject, even though we had chosen the IT track, but I think that’s only natural.

It is essential, for those who love computing and want to become programmers, to study the C language: it is a true mental gym, capable of shaping a computer scientist's logical mindset and confronting them with challenges that span both hardware and software engineering.

In a rapidly evolving technological world, especially with the rise of artificial intelligence, knowing at least the basics of programming through an authentic language like Python should be considered essential from a very young age—just like learning grammar or multiplication tables. Adding further complications through visual blocks like those in Scratch often proves useless and can hinder rather than facilitate learning.

A concrete example illustrates this well: a “for” loop or an “if” statement is much easier to write directly in Python than to assemble the same logic out of graphical blocks.

Scratch:

  • Drag a “repeat” block
  • Select the number of iterations
  • Insert other blocks inside for the action to repeat, e.g., say “Hello!”

Python:

for i in range(10):
    print("Hello!")

It is clear how much more immediate, readable, and rewarding it is to use Python. Moreover, Python is widely used in the real programming world and is one of the main languages in artificial intelligence development. For educational use, it offers a vast array of libraries: for data analysis, text processing, music, and mathematics, and, not least, PyGame for game development, which lets students have fun writing code and creating games and applications. Alternatively, one could introduce game engines such as Godot (the most suitable for both educational and professional use), Unity, or Unreal Engine. I am certain that with tools like these, students would be highly engaged. Naturally, this also requires teachers to have a different, more technical preparation, but the educational and motivational payoff would more than justify the effort.
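
As a taste of how low the barrier is, here is a minimal PyGame sketch that opens a window and keeps it responsive until it is closed. It assumes the pygame package is installed; the window size, title, and background color are arbitrary choices:

import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
pygame.display.set_caption("My first game")

running = True
while running:
    # Handle window events so the program can be closed cleanly.
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 64))   # dark blue background
    pygame.display.flip()     # show the newly drawn frame

pygame.quit()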

In conclusion, Italian-style coding based on Scratch risks alienating students from the real world of programming. It is necessary to get students accustomed to complexity, to confront them with difficult problems, to encourage them to commit and make an effort to solve them. Programming is comparable to learning to play a musical instrument like guitar or piano, to studying dance, or to practicing a sport. There are no shortcuts: it requires study and sacrifice. Just as learning to play guitar requires studying theory and practicing until you build calluses on your fingers, in coding and software development, one must actively program to learn how to solve problems and break down big challenges into smaller, manageable ones. We should go back to teaching real computer science, using real, concrete, and powerful languages like Python and C, which can provide students with tangible and rewarding results—just as it was in the 1980s and 1990s.

I believe unplugged coding is a valuable teaching tool, useful for training computational thinking through interdisciplinary activities that can be applied to both STEM and humanities subjects. Coding can be applied to the study of music, by coloring notes and keys to help students become familiar with instruments; to dance, for memorizing sequences of steps and positions; and to sports, mathematics, and even languages and the humanities.

However, coding, and especially unplugged coding, remains an effective tool only in early childhood and the first years of primary school, when students have not yet acquired basic reading and writing skills. From that point onward, just as in learning a musical instrument or dance, the time for sacrifice begins: it is necessary to abandon artificially simplified tools, get your hands dirty, and start writing real code. This should be done gradually: understanding constructs, taking them apart, examining them, and writing code. Only in this way, I am convinced, can extraordinary results be achieved, both in terms of learning and personal satisfaction. And this is how we can raise a new generation of people who will not simply be technology users, but true masters of it.