This guide covers how to access Prompt Studio, create and test prompts, and use the canvas options to optimize your results.
Access Prompt Studio
- Log in and select Prompt Studio from the modules list.
- On the dashboard, view prompts in three tabs: All prompts, My prompts, Shared prompts.
- Click New prompt, enter a name, and click Proceed.
- Choose a creation method: Generate a prompt, Start from scratch, or Prompt library.
Create Prompts
Generate a Prompt
Expand a basic instruction into a detailed, comprehensive prompt using AI.
- Click Generate a prompt.
- Select a model and enter your instruction.
- Review the AI-generated prompt.
- Click Proceed to copy it to the canvas for customization.
Only OpenAI and Anthropic models support prompt generation. You can also access this feature directly on the canvas via the Generate Prompt button.
Start from Scratch
Opens a blank canvas where you can manually write prompts, add variables, select models, and generate outputs with full control.
Prompt Library
Access 65+ pre-built templates for common use cases like code generation, summarization, and Q&A.
- Click Prompt library.
- Browse My templates (your saved templates) or All templates (built-in options).
- Select a template and click Use template to copy it to the canvas.
Templates are read-only. Edit only after copying to the canvas.
Use Prompt Canvas
Add Prompts
| Field | Purpose |
|---|---|
| System prompt | Define the model’s role and behavior (optional; use the toggle to enable or disable it) |
| Prompt | Provide task instructions for the model |
| Response JSON schema | Define a structured output format (optional; if the model does not support structured output, the schema is included with the prompt) |
Supported JSON schema types: String, Boolean, Number, Integer, Object, Array, Enum, anyOf.
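If the selected model supports structured output, the schema constrains the response format. The sketch below builds a minimal schema in Python using several of the supported types; the field names are illustrative, not part of Prompt Studio.

```python
import json

# Illustrative response JSON schema using several supported types
# (String, Number, Enum, Array, Object). Field names are hypothetical.
schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "sentiment": {"type": "string", "enum": ["positive", "neutral", "negative"]},
        "confidence": {"type": "number"},
        "keywords": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["summary", "sentiment"],
}

# Paste the resulting JSON into the Response JSON schema field.
print(json.dumps(schema, indent=2))
```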
Add Variables
Variables let you test prompts with multiple values simultaneously.
Syntax: Use {{variable_name}} in your prompt.
- Add variables in the Prompt field using double curly braces (e.g., {{product}}).
- When variables are added, a Variables column appears.
- Assign values to each variable. Click Add an empty row for multiple test cases.
Variables are substituted with their values during execution, generating outputs for all rows at once.
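To make the substitution behavior concrete, the sketch below renders one prompt per row of variable values; it mirrors the behavior described above and is not the product's implementation.

```python
import re

# Prompt containing two {{variable}} placeholders.
prompt = "Write a one-line tagline for {{product}} aimed at {{audience}}."

# Each row corresponds to one test case in the Variables column.
rows = [
    {"product": "a smart thermostat", "audience": "homeowners"},
    {"product": "noise-cancelling earbuds", "audience": "commuters"},
]

def render(template: str, values: dict) -> str:
    # Replace every {{name}} placeholder with the value from the current row.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: values[m.group(1)], template)

for row in rows:
    print(render(prompt, row))
```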
Select Models
Compare up to 5 models to identify the best fit for your use case.
- Click the Select Model field.
- Choose a model and connection.
- Add additional models in columns to the right.
- Click the settings icon to adjust parameters (temperature, top k, top p, max tokens).
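The adjustable parameters are standard sampling controls. The sketch below shows a minimal settings payload using the documented defaults (temperature=1, top p=1, top k=5, max tokens=256); the actual field names in Prompt Studio may differ.

```python
# Hypothetical settings payload; field names are illustrative.
model_settings = {
    "temperature": 1,   # randomness: lower values give more deterministic output
    "top_p": 1,         # nucleus sampling: share of probability mass considered
    "top_k": 5,         # number of highest-probability tokens considered
    "max_tokens": 256,  # upper bound on generated output length
}
```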
Generate Output
- Click Run to generate outputs.
- Review results in the model columns, including response text, token count (input + output), and response time.
A maximum of 10 rows can be generated simultaneously.
Canvas Options
Model Column Options
| Icon | Feature | Description |
|---|---|---|
| ⋮⋮ | Rearrange | Drag to reorder model columns |
| ✕ | Remove Model | Remove a model from comparison |
| ★ | Bookmark | Set a model as preferred (you must bookmark one before committing if none is selected) |
| ⚙ | Model Settings | Adjust temperature, top k, top p, max tokens (defaults: top p=1, top k=5, temperature=1, max tokens=256) |
| ▶ | Play | Regenerate output for the entire column |
| ⏱ | Avg Response Time | View average generation time |
| 🔢 | Avg Tokens | View mean input/output token usage |
| 📋 | Copy | Copy output to clipboard |
|  | View JSON | View request/response in JSON format |
| ↻ | Regenerate | Regenerate output for a single cell |
Canvas Toolbar Options
| Icon | Feature | Description |
|---|---|---|
| 📚 | Prompt Library | Browse the 65+ built-in templates |
| 🔗 | Prompt API | Share prompts via version-specific API endpoints (cURL, Python, Node.js); see the sketch after this table |
| 💾 | Save to Library | Save current prompt as a reusable template |
| 📝 | Draft History | Capture and restore canvas states |
| 📤 | Export CSV | Export canvas data (inputs, outputs, metadata) for analysis |
| 👥 | Share | Share prompts for collaboration |
| 🕐 | Versions | View, compare, and restore prompt versions |
| ✓ | Commit | Save current prompt as a new version (V1, V2, etc.) |
| ▶ | Run | Execute prompts and generate outputs |
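For illustration, a hypothetical Python call to a version-specific Prompt API endpoint might look like the sketch below. The URL, headers, and payload fields are placeholders; use the exact cURL, Python, or Node.js snippet shown in the Prompt API dialog.

```python
import requests

# Placeholder endpoint and credentials; copy the real values from the Prompt API dialog.
url = "https://<your-host>/api/prompt-studio/prompts/<prompt_id>/versions/v2/run"
headers = {"Authorization": "Bearer <api_key>", "Content-Type": "application/json"}

# Variable values for this run (hypothetical field name).
payload = {"variables": {"product": "a smart thermostat"}}

response = requests.post(url, headers=headers, json=payload, timeout=30)
print(response.json())
```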
Quick Reference
| Task | Action |
|---|---|
| Create prompt quickly | Use Generate a prompt with AI expansion |
| Full control over prompt | Use Start from scratch |
| Use proven templates | Browse Prompt library |
| Test multiple scenarios | Add variables with {{variable}} syntax |
| Compare model performance | Add up to 5 models side by side |
| Fine-tune outputs | Adjust model settings (temperature, tokens) |
| Save for reuse | Commit version or save to library |
| Share with team | Use Share or Prompt API |
| Analyze results | Export as CSV |