
Prompt Engineering Is Communication! Period!


As engineers, we take pride in solving problems. We optimize systems, debug code, design robust architectures, and think in terms of precision and logic. But there’s one skill that quietly makes or breaks everything — a skill often underestimated in technical circles: communication.

Yes, I’m talking about the good old art of expressing ideas clearly.

This blog is a reflection on how communication – especially in the age of AI and prompt engineering – is becoming a non-negotiable skill, and how my own experience as an engineer (and a former theatre student) shaped this realization.


The Engineering Mindset: Facts First, Communication Later?

Engineering teaches us to focus on accuracy, efficiency, and function. We’re trained to get things right. But when it comes to sharing what’s on our minds — whether in design discussions, stakeholder meetings, or team emails — we often falter.

Why?

Because we tend to think logic alone should be enough.

But here’s the truth: if others can’t understand what you built, why you built it, or how it works — then it doesn’t matter how brilliant it is. It’s not usable, not understandable, and not scalable.


Theatre Helped Me See This

Interestingly, I didn’t realize the value of communication in engineering through an engineering course. I learned it from theatre.

Under the guidance of Samrat Sanyal, director of Baranagar Drishyakabyo, I was trained to observe people — their emotions, body language, silences, and unspoken tension. I learned that what’s said is only half the message. How it’s said — and how it’s received — is everything.

It made me realize: in both theatre and engineering, the goal is to deliver an experience that works. And that demands clarity, empathy, and precision in expression.


Enter Prompt Engineering: Communication With AI

With AI tools like ChatGPT, GitHub Copilot, or Google’s Gemini entering our workspace, a new type of communication is now critical — prompt engineering.

Let’s be honest: AI is powerful, but it’s not telepathic. If you give vague, incomplete, or poorly structured instructions, AI gives vague results.

For example:

  • Bad prompt: “Write about engine.”
  • Good prompt: “Write a 500-word explanation on how an internal combustion engine works, including the four-stroke cycle, in simple language for high school students.”

See the difference? The second one is not just a prompt; it’s a well-structured technical requirement.

And that is the essence of prompt engineering: treating communication like a specification.
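Here is a minimal sketch of what "communication as a specification" can look like in practice. It is plain Python, not tied to any particular AI vendor's SDK, and every name in it (build_prompt and its parameters) is purely illustrative:

```python
def build_prompt(topic: str, audience: str, word_count: int, must_cover: list[str]) -> str:
    """Assemble a prompt the way you would write a requirement:
    state the task, the audience, the length, and the required content."""
    coverage = ", ".join(must_cover)
    return (
        f"Write a {word_count}-word explanation of {topic} "
        f"in simple language for {audience}. "
        f"Make sure to cover: {coverage}."
    )

# The vague version: almost no information to work with.
bad_prompt = "Write about engine."

# The specified version: the same request, expressed as a requirement.
good_prompt = build_prompt(
    topic="how an internal combustion engine works",
    audience="high school students",
    word_count=500,
    must_cover=["the four-stroke cycle"],
)

print(good_prompt)
```

The point is not the code itself but the habit it enforces: before anything is sent, the task, the audience, the length, and the required content have all been stated explicitly.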


Lessons for Engineers

So what should we, as engineers, take away from this?

1. Communication is not a soft skill. It’s a core engineering skill.

Whether it’s talking to a teammate, writing a Jira ticket, creating documentation, or prompting an AI — clarity matters. Ambiguity costs time, money, and patience.

2. Prompt engineering is not magic. It’s structured thinking.

It forces you to ask:

  • What do I want?
  • What’s the context?
  • How should the result look?
These are the same questions we ask during design and debugging. The only difference: now we are doing it through words. The short sketch below puts them into code.
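To make those three questions concrete, here is a small, hypothetical pre-flight check in Python. Nothing in it is a real library, and the names PromptSpec and review are invented for illustration; it simply treats a prompt the way we treat a design review, flagging what is still missing before anything is sent:

```python
from dataclasses import dataclass

@dataclass
class PromptSpec:
    """Answers to the three questions, captured explicitly."""
    goal: str = ""           # What do I want?
    context: str = ""        # What's the context?
    output_format: str = ""  # How should the result look?

def review(spec: PromptSpec) -> list[str]:
    """Return a list of gaps, like a checklist in a design review."""
    gaps = []
    if not spec.goal:
        gaps.append("Goal is missing: what exactly should be produced?")
    if not spec.context:
        gaps.append("Context is missing: audience, constraints, background?")
    if not spec.output_format:
        gaps.append("Output format is missing: length, structure, tone?")
    return gaps

spec = PromptSpec(goal="Explain how an internal combustion engine works")
for gap in review(spec):
    print(gap)  # Reminds you what the prompt still leaves unsaid
```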

3. Empathy helps engineers build better systems.

Understanding how people think — be it your colleague or your AI assistant — helps you design prompts, interfaces, and systems that actually work. Empathy leads to better collaboration, better UIs, and smarter automation.


Final Thoughts

In engineering, we often say: “Garbage in, garbage out.”
Well, the same applies to communication.

You can’t expect meaningful output — from humans or machines — if your input is unclear. Prompt engineering may sound technical, but it’s built on a foundation of clean, thoughtful communication.

So the next time you struggle to explain something, don’t just blame AI or your team.
Ask: “Did I communicate clearly?”
Because at the end of the day — whether it's code, a circuit, a chatbot, or a conversation — communication is the first line of execution.

