
The Art and Science of Prompt Engineering: How to Speak AI

Prompt Engineering

Rahul Gadekar

Mentor Stanford SEED & LISA

As artificial intelligence becomes a more integrated part of our workflows—whether you’re coding, writing, researching, or designing—learning how to communicate effectively with AI is becoming a crucial skill. Enter prompt engineering, the craft of writing inputs (prompts) to get the most useful, accurate, and creative responses from AI models like ChatGPT, Claude, or Gemini.

In this blog, I’ll break down what prompt engineering is, why it matters, the core principles of prompt engineering, and how you can get better at it.

What Is Prompt Engineering?

Prompt engineering is the practice of designing and structuring input text to guide large language models (LLMs) toward generating a desired output. Think of it as giving the AI clear instructions, like a conversation with a very literal and powerful assistant.

For example:

  • A vague prompt like “Tell me about growth marketing” may result in a generic response.
  • A better-engineered prompt would be: “Explain 3 growth marketing strategies startups can use to drive sales in 2025.”

The more specific your prompt, the better the response. Prompt engineering minimizes ambiguity and increases relevance.
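
If you are calling a model programmatically, the same principle carries over: the prompt is simply the string you send. Below is a minimal sketch using the OpenAI Python SDK; the client library, the "gpt-4o" model name, and the API key setup are assumptions, and any chat-style API works the same way.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A vague prompt will likely produce a generic overview
vague = "Tell me about growth marketing"

# An engineered prompt pins down scope, audience, and timeframe
specific = "Explain 3 growth marketing strategies startups can use to drive sales in 2025."

for prompt in (vague, specific):
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
    print("-" * 40)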

Why Prompt Engineering Matters

LLMs are incredibly powerful, but without clear direction, they may underperform or misunderstand your intent.

Well-constructed prompts reduce the need for back-and-forth, speeding up research, writing, and ideation.

Principles of Prompt Engineering

1. Be Specific

Be clear and unambiguous in your instructions. The more specific the prompt, the better the AI performs.

  • ✅ Do: “List 5 growth marketing tactics for SaaS startups targeting Gen Z”
  • ❌ Don’t: “Give me some marketing ideas”

2. Provide Context

Include relevant background information such as the audience, use case, tone, or platform.

  • Example:
    “Write a professional email to a potential client explaining how our AI service reduces customer support costs by 40%.”

3. Assign a Role

Assign the AI a role to guide the tone, depth, and framing of the response.

  • Example:
    “Act as a financial advisor explaining SIPs to a 25-year-old first-time investor.”
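
When you are working through an API rather than a chat window, the cleanest place to assign a role is usually the system message. A minimal sketch, again assuming the OpenAI Python SDK and a placeholder model name:

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        # The system message sets the persona, tone, and framing for every reply
        {"role": "system", "content": "You are a financial advisor who explains concepts in plain, friendly language."},
        {"role": "user", "content": "Explain SIPs to a 25-year-old first-time investor."},
    ],
)
print(response.choices[0].message.content)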

4. Specify the Output Format

Tell the AI how to format its response: bullet points, numbered steps, a table, or paragraphs.

  • Example:
    “Explain the benefits of growth marketing in 3 bullet points.”
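
When the answer will feed another program, it can help to pin the format down even harder by asking for JSON. The sketch below assumes the OpenAI Python SDK's JSON mode (response_format={"type": "json_object"}); with other providers you would simply describe the required structure in the prompt itself.

import json

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    # JSON mode: the prompt must explicitly mention JSON for this option to be accepted
    response_format={"type": "json_object"},
    messages=[{
        "role": "user",
        "content": (
            "List the benefits of growth marketing as JSON with a single key "
            "'benefits' containing exactly 3 short strings."
        ),
    }],
)
benefits = json.loads(response.choices[0].message.content)["benefits"]
print(benefits)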

5. Encourage Step-by-Step Thinking

Encourage the model to think in steps, especially for problem-solving or logic tasks.

  • Example:
    “Explain step by step how to calculate ROI for a growth marketing campaign.”

6. Use Zero-Shot or Few-Shot Prompting

Zero-shot: Give the task directly.
“Translate this sentence to French.”

Few-shot: Include examples in the prompt.
“Email 1: … Email 2: … Now write Email 3.”
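
With a chat API, few-shot examples can be pasted into one long prompt, but it is often cleaner to supply them as prior user/assistant turns. A sketch under the same SDK assumption; the example emails are illustrative placeholders, not real copy:

from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "system", "content": "You write short, friendly outreach emails."},
    # Few-shot examples: earlier turns show the style the model should copy
    {"role": "user", "content": "Write an outreach email to a gym owner about our booking app."},
    {"role": "assistant", "content": "Hi Sam, I noticed your gym... (placeholder example email 1)"},
    {"role": "user", "content": "Write an outreach email to a cafe owner about our loyalty app."},
    {"role": "assistant", "content": "Hi Priya, your cafe... (placeholder example email 2)"},
    # The real task, phrased the same way as the examples
    {"role": "user", "content": "Write an outreach email to a bookstore owner about our events app."},
]

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=messages,
)
print(response.choices[0].message.content)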

7. Set Constraints

Limit word count, tone, or content scope to ensure a focused answer.

  • Example:
    “Summarize the book ‘Atomic Habits’ in under 100 words for a high school student.”

8. Iterate and Refine

Great prompts often emerge through testing. Adjust based on how the AI responds:

  • Start with a basic version
  • Analyze the response
  • Add clarity, examples, or constraints

9. Avoid Ambiguity

AI doesn’t guess well. Avoid vague verbs, unclear pronouns, or open-ended phrases without direction.

  • ❌ “Do it like before.”
  • ✅ “Repeat the same format as the 3-bullet summary you used earlier.”

10. Stay Aware of Model Limitations

Know that models can “hallucinate” (make up facts) and are only trained on data up to a certain cutoff date. Don’t blindly trust outputs, especially for facts, dates, or legal/medical content.

Advanced Prompting Techniques

1. Chain-of-Thought Prompting

Encourages the model to show its reasoning step-by-step rather than jumping straight to the answer. This improves accuracy, especially for logic and math problems.

Use When: You want the model to explain or reason rather than just give an answer.

Example:

“A car travels 60 km in 1.5 hours. What is its average speed? Show your reasoning step by step.”

Response:
Step 1: Distance = 60 km
Step 2: Time = 1.5 hours
Step 3: Speed = Distance ÷ Time = 60 ÷ 1.5 = 40 km/h
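
In code, chain-of-thought is just an instruction added to the prompt; if you also ask the model to finish with a fixed "Answer:" line, the final value is easy to extract. A small sketch under the same SDK assumption (the "Answer:" convention is ours, not part of any API):

from openai import OpenAI

client = OpenAI()

prompt = (
    "A car travels 60 km in 1.5 hours. What is its average speed? "
    "Show your reasoning step by step, then end with a line of the form 'Answer: <value>'."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
text = response.choices[0].message.content

# Pull out the last line that starts with "Answer:" (our own convention)
answer = next(
    (line for line in reversed(text.splitlines()) if line.strip().startswith("Answer:")),
    None,
)
print(text)
print("Parsed:", answer)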

2. Multimodal Prompting

Use a combination of text and image inputs (in supported models like GPT-4o) for tasks such as describing, analyzing, or generating from visual input.

Use When: You’re working with models that support image + text inputs.

Example:

Prompt (with image):

“Look at this image of the solar system. List the planets from closest to farthest from the sun.”
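
With a vision-capable model, the image travels alongside the text as a separate content part. A minimal sketch using the OpenAI chat completions image_url format; the image URL is a hypothetical placeholder:

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # vision-capable model, used here as a placeholder
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Look at this image of the solar system. List the planets from closest to farthest from the sun."},
            # Hypothetical image URL; a base64 data URL also works in this field
            {"type": "image_url",
             "image_url": {"url": "https://example.com/solar-system.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)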

3. Reverse Prompting

You give the model an output and ask it to guess what kind of prompt would have led to it. This is useful for refining your own prompts.

Use When: You want to reverse-engineer prompt structures.

Example:

Prompt:
“Here are three Instagram captions written for a new vegan protein brand targeting fitness enthusiasts: [paste the captions]. What was the likely prompt that produced them?”

Response:
“Write 3 short Instagram captions promoting a vegan protein brand for fitness-focused users.”

Use Cases for Prompt Engineering

  • Marketing (ad copy generation): “Write 3 short headlines for a fitness app targeting Gen Z.”
  • Education (lesson plan creation): “Create a 5-day curriculum to teach 5th graders about fractions.”
  • Software Dev (code generation & debugging): “Write a Python function to sort a list of dictionaries by date.”
  • Legal & Policy (policy summarization): “Summarize the key clauses in GDPR for non-lawyers.”

Prompt engineering isn’t just a technical trick; it’s the interface between human intention and machine intelligence. Whether you’re an entrepreneur, student, marketer, or developer, knowing how to engineer prompts helps you unlock AI’s full potential and drive better outcomes. As LLMs become more deeply integrated into everyday tools, prompt engineering will evolve from a niche skill into a fundamental digital literacy.


Rahul Gadekar

Stanford Alumnus

Mentor: Stanford Seed & Abu Dhabi SME Hub

