Ask AI is a beta feature under the Algolia Terms of Service (“Beta Services”). Use of this feature is subject to Algolia’s GenAI Addendum.


Well-crafted prompts help LLMs provide accurate, on-brand answers.

How Ask AI prompts work

Ask AI builds its responses from several components:

  • Base system prompt (hidden). Every Ask AI request starts with a built-in prompt that ensures safety, retrieves relevant content, and keeps the tone helpful.
  • Custom system prompt. This is your prompt template: it overlays the base system prompt. For example, your template might specify: “Answer like a Kubernetes site reliability engineer. Prefer concise bullet points.”
  • User questions. These are the queries users type directly into the chat box.
  • Context passages. Ask AI inserts relevant sections from your Algolia index.
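For illustration, the way these components combine can be sketched as simple template assembly. This is a hypothetical sketch, not Algolia’s actual internals: `buildPrompt` and all of its inputs are invented for the example.

```javascript
// Hypothetical sketch of how Ask AI's prompt components might combine.
// None of these names come from Algolia's implementation.
function buildPrompt({ basePrompt, customPrompt, passages, question }) {
  return [
    basePrompt,                              // hidden base system prompt
    customPrompt,                            // your custom prompt template
    'Context:\n' + passages.join('\n---\n'), // passages from your Algolia index
    'Question: ' + question,                 // the user's query
  ].join('\n\n');
}

const prompt = buildPrompt({
  basePrompt: 'Be safe, helpful, and answer only from the provided context.',
  customPrompt:
    'Answer like a Kubernetes site reliability engineer. Prefer concise bullet points.',
  passages: ['A Pod is the smallest deployable unit in Kubernetes.'],
  question: 'What is a Pod?',
});
```

Because the custom prompt overlays the base system prompt rather than replacing it, your instructions refine the built-in safety and retrieval behavior instead of overriding it.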

How to create effective prompts

| Goal | What to do | Why it helps |
| --- | --- | --- |
| Be explicit | State the assistant’s role, style, and constraints in the first sentence. | LLMs prioritize the earliest, clearest instructions. |
| Provide context | Include product names, user expertise level, or relevant terminology. | Reduces hallucinations and keeps answers on-brand. |
| Set formatting | Describe how to format responses. For example, “Answer in Markdown with H2 headings and a summary table.” | Ensures consistent rendering of answers. |
| Show, don’t tell | Provide one or two short examples of the output you want. | Concrete examples are much more effective than general descriptions. |
| Limit scope | Instruct the LLM to explicitly state when it doesn’t know an answer or when a query is outside its knowledge base. For example, “If you’re unsure, say ‘I don’t know.’” | Encourages the LLM to be accurate rather than speculate, preventing incorrect or misleading answers. |
| Iterate and test | Review feedback and adjust your prompt template as needed. | Prompt crafting is trial and error: even minor wording changes can alter responses, so continuous testing is crucial. |
| Be concise | Keep your prompts brief and to the point. | A short prompt lets the LLM focus on the relevant information from your site, producing better, faster answers. |

Example prompt

You are a cloud-native solutions architect explaining technical trade-offs to junior DevOps engineers migrating to Kubernetes. Explain concepts in plain English, avoiding jargon. For each explanation, provide a concise introduction, a clear example, and then some common pitfalls. Think step-by-step and provide your answer in well-structured bullet points, ensuring the response is easy to understand.

Common pitfalls

| Avoid | Symptoms | What to do |
| --- | --- | --- |
| Vagueness. For example, “Explain this.” | Vague, generic answers. | Specify the role, topic, and length of answers. |
| Long prompts, such as 1,000-word instructions. | Slower, less informative responses and higher LLM token costs. | Trim to the essentials. |
| Conflicting instructions. | The LLM follows one instruction and ignores the other, unpredictably. | Combine instructions or order them by priority. |

Optimize relevance and accuracy in Ask AI responses

To improve the accuracy and relevance of Ask AI responses, apply facetFilters when configuring the assistant. This narrows the search scope to the most relevant records. For example, filtering by language and version ensures users only receive results that match the appropriate context. If your index contains different types of records—such as navigation elements, metadata, or placeholders—you can further refine results by using facetFilters: ['type:content']. This restricts the search to meaningful content, resulting in more accurate and helpful answers.

docsearch({
  askAi: {
    assistantId: 'YOUR_ALGOLIA_ASSISTANT_ID',
    searchParameters: {
      facetFilters: ['language:en', 'version:1.0.0'],
    },
  },
});
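Building on the snippet above, the language and version filters can be combined with the `type:content` filter mentioned earlier, so the assistant searches only substantive records (the assistant ID is a placeholder):

```javascript
docsearch({
  askAi: {
    assistantId: 'YOUR_ALGOLIA_ASSISTANT_ID',
    searchParameters: {
      // Top-level entries in facetFilters are combined with AND:
      // a record must match the language, version, AND type filters.
      facetFilters: ['language:en', 'version:1.0.0', 'type:content'],
    },
  },
});
```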

Security and compliance guidance

Don’t paste secrets, personally identifiable information (PII), or internal URLs into prompt templates.

Don’t add company policies or other sensitive information to your prompt templates. However, do test some generated answers for policy compliance.

