
What is “prompting”?

Development | Wed, Jun 25, 2025

By Jason Madar, Program Coordinator


At its core, a large language model (LLM) is a completion engine. It takes a starting sentence and tries to predict the most likely next word based on its training data. For example, if the starting sentence is “The cat sat on the,” the model might predict “mat” as the next word because that’s a common continuation in English. This process of predicting word after word creates coherent text that matches the context of the input.
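The "completion engine" idea can be sketched with a toy model. The probability table below is illustrative data, not real statistics: it maps a single preceding word to possible next words, whereas a real LLM learns such patterns over billions of parameters and much longer contexts.

```python
# A toy completion engine: pick the most probable next word given the
# last word of the input. (Hypothetical probabilities for illustration.)
NEXT_WORD_PROBS = {
    "the": {"mat": 0.6, "floor": 0.3, "roof": 0.1},
    "on": {"the": 0.9, "a": 0.1},
}

def predict_next_word(context: str) -> str:
    """Return the highest-probability continuation of the context."""
    last_word = context.lower().split()[-1]
    candidates = NEXT_WORD_PROBS.get(last_word, {})
    if not candidates:
        return "<unknown>"
    return max(candidates, key=candidates.get)

print(predict_next_word("The cat sat on the"))  # -> mat
```

Repeating this step, feeding each predicted word back in as new context, is how a model generates a whole passage one word at a time.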

Understanding Context and Model Behavior

One critical design decision when building an LLM is how long an input it can accept. This is referred to as the context window. The context window determines how much text the model can “remember” at any given time. There is a tradeoff, however: a longer context window lets the model handle longer and more complex inputs, but it also demands significantly more memory and computation, especially for larger models. Balancing the model’s size against its context window is essential to optimizing performance.
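One practical consequence of a fixed context window is that text beyond it is simply dropped. A minimal sketch, assuming a "token" is a whitespace-separated word (real models use subword tokenizers, so actual counts differ):

```python
def fit_to_context_window(text: str, max_tokens: int) -> str:
    """Keep only the most recent tokens that fit in the window.
    Older text is discarded -- the model cannot 'remember' it."""
    tokens = text.split()
    return " ".join(tokens[-max_tokens:])

history = "turn one ... turn two ... the latest user question"
print(fit_to_context_window(history, max_tokens=4))
# -> the latest user question
```

This is why, in a long conversation, a model can appear to "forget" early details: they have fallen outside its window.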

What’s surprising about these models is that, when equipped with long context windows (thousands or even hundreds of thousands of words), they seem capable of “learning” from the input. In other words, they don’t just passively predict the next word; they actively adapt to the tone, style, and information provided in the starting text. For example, if you input a math problem, the AI might follow logical patterns to solve it. Or if you provide a Shakespearean-style sentence, it will often generate text in the same style. Sentence completion, therefore, becomes a proxy for understanding, because the model must “reason” about what comes next based on patterns it has seen in its training data.


The Art and Impact of Prompting

In modern AI systems, we call this starting sentence a prompt. There are generally two types of prompts in every conversation with an AI: the system prompt and the user prompt. The system prompt sets the AI’s rules, tone, and personality, while the user prompt is the input from the user. Together, they shape the AI’s responses.
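The two prompt types are typically combined into a single request. The role/content message format below mirrors the common chat-completion convention; the field names are illustrative and not tied to any specific vendor's API.

```python
def build_conversation(system_prompt: str, user_prompt: str) -> list[dict]:
    """Combine a system prompt (rules, tone, persona) with a user prompt
    (the actual question) into one message list for the model."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_conversation(
    "You are a patient tutor. Answer in plain language.",
    "Explain what a context window is.",
)
```

From the model's point of view, both messages are just more starting text to complete; the system prompt simply comes first, so it frames everything that follows.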

It’s important to understand that the core AI engine isn’t truly “thinking.” It generates the word with the highest probability based on its training data. This means crafting effective prompts requires skill, particularly in language and communication. For example, someone with strong language skills and deep knowledge of a specific subject is far more likely to create a precise and effective prompt than a novice. A well-crafted prompt serves as a guiding framework, helping the AI provide accurate, relevant, and useful outputs.

The power of prompting lies in its ability to guide AI behavior and enhance its utility. AI doesn’t only respond to prompts; it can also learn from them. One of the clearest examples of prompting in action comes from Anthropic’s Claude model. Anthropic uses a carefully crafted system prompt to define the model’s tone, boundaries, and behavior at a foundational level, ensuring thoughtful, context-aware responses. You can read the full text in the Anthropic System Prompts Release Notes.

In short, prompting is a skill that bridges the gap between humans and AI, allowing users to unlock the potential of these powerful tools. Those with strong communication skills and subject matter expertise can significantly boost their productivity and performance by working effectively with AI. Ultimately, the success of an AI system often depends more on the user’s abilities than on the AI itself. The better the user’s prompt, the better the AI’s output.
