Overview

PRPM Playground lets you test any package with leading AI models (Claude, GPT-4o, and more) directly in your browser or CLI. No installation required—just instant, interactive testing with real models.

Why Use Playground?

Before Playground, you had to:
  1. Install the package
  2. Set up your AI environment
  3. Create test scenarios
  4. Try it with your model
  5. Uninstall if it doesn’t work
Playground eliminates this friction entirely. Test instantly, compare results, and make informed decisions before installing anything.

Getting Started

Web Interface

  1. Browse packages at prpm.dev/search
  2. Click “Test in Playground” on any package page
  3. Enter your test input
  4. Get instant results from your chosen AI model

CLI Interface

# Single test
prpm playground @user/package-name "your test input"

# Interactive mode (multi-turn conversation)
prpm playground @user/package-name --interactive

# Compare with baseline (test with and without the package)
prpm playground @user/package-name "test input" --compare

# Use a specific model
prpm playground @user/package-name "test input" --model opus
prpm playground @user/package-name "test input" --model gpt-4o

Supported Models

| Model | Best For | Credit Cost |
| --- | --- | --- |
| Claude 3.5 Sonnet | Balanced performance (default) | 2 credits |
| GPT-4o Mini | Fast, simple tasks | 1 credit |
| GPT-4o | Advanced reasoning | 3 credits |
| GPT-4 Turbo | Complex tasks | 4 credits |
| Claude Opus | Most capable, complex tasks | 7 credits |

1 credit = 5,000 tokens (input + output combined)
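
As a rough illustration of the conversion (a hedged sketch; the exact rounding, and how the per-model costs above stack on top of the token conversion, aren't spelled out here):
# Hypothetical test: 3,200 input + 1,800 output tokens
tokens=$((3200 + 1800))            # 5,000 tokens total
echo $(( (tokens + 4999) / 5000 )) # -> 1 block of 5,000 tokens (rounding up is an assumption)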

Credit System

Free Trial

  • 5 free credits to get started
  • No credit card required
  • Test immediately after signup

PRPM+ Subscription

  • 100 monthly credits for $6/month
  • Unused credits roll over to the next month (up to a 200-credit balance)
  • Rolled-over credits expire after one month
  • Organization members get PRPM+ for $3/month

Credit Packs

  • Buy additional credits that never expire
  • 100 credits = $5.00
  • Stack with monthly credits
  • Perfect for heavy users
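
A quick back-of-envelope comparison of the plans above (prices taken from the bullets; shell arithmetic in whole cents):
# Price per credit, in cents
echo $(( 600 / 100 ))  # PRPM+: $6 for 100 credits -> 6 cents/credit
echo $(( 500 / 100 ))  # Credit pack: $5 for 100 credits -> 5 cents/credit
echo $(( 300 / 100 ))  # PRPM+ via an organization: $3 for 100 credits -> 3 cents/credit
Packs are slightly cheaper per credit and never expire; PRPM+ credits arrive monthly and are subject to the rollover cap.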

Advanced Features

Compare Mode

Compare the same input with and without the package prompt to see the actual value added:
prpm playground @user/code-reviewer "Review this code" --compare
This runs two tests:
  1. With package: Uses the package’s prompt
  2. Without package: Baseline model with no prompt
Perfect for evaluating whether a package actually improves results or is just prompt fluff.
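
Because compare mode runs the test twice, you may want to pair it with a cheaper model. The options reference below lists --compare and --model as independent flags, so combining them should work (an assumption; verify with --help):
prpm playground @user/code-reviewer "Review this code" --compare --model gpt-4o-mini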

Interactive Mode

Have multi-turn conversations to stress test packages:
prpm playground @user/brainstorm-assistant --interactive
  • Multiple conversation turns
  • Context maintained across messages
  • Type exit or press Ctrl+C to quit
  • Great for testing conversational packages

Model Selection

Choose the right model for your test:
# Use Claude Opus for complex analysis
prpm playground @user/complex-analyzer "analyze this" --model opus

# Use GPT-4o Mini for simple tests
prpm playground @user/simple-formatter "format this" --model gpt-4o-mini

# Use GPT-4o for advanced reasoning
prpm playground @user/problem-solver "solve this" --model gpt-4o

Sharing Results

Web Interface

  1. Complete a test in the playground
  2. Click the “Share” button
  3. Get a shareable link
  4. Share with your team or the community
Shared results include:
  • The input you tested
  • The model’s complete response
  • Token usage and credit cost
  • Package information

Community Results

Every package page shows recent community test results. See:
  • What inputs others tested
  • Which models they chose
  • Whether they found it helpful
  • Token usage and costs
This creates living documentation showing real-world usage.

For Package Authors

Analytics Dashboard

Package authors get detailed analytics showing:
  • Total test sessions
  • Popular AI models used
  • Credit usage patterns
  • Session duration averages
  • Test input trends
Access your analytics at prpm.dev/dashboard

Suggested Test Inputs

Help users test your package effectively:
# Add suggested inputs to your package
prpm suggested-inputs add @user/package-name "Review this authentication code"
prpm suggested-inputs add @user/package-name "Find security vulnerabilities"

# List current inputs
prpm suggested-inputs list @user/package-name

# Remove an input
prpm suggested-inputs remove @user/package-name <input-id>
Suggested inputs appear automatically in the playground, making it easier for users to test your package with relevant examples.
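
If you have many examples to seed, a small shell loop over the documented add command does the job (the file name inputs.txt is illustrative):
# Add each line of inputs.txt as a suggested input
while IFS= read -r line; do
  prpm suggested-inputs add @user/package-name "$line"
done < inputs.txt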

CLI Commands Reference

prpm playground

Test a package with AI models.
prpm playground <package> [input] [options]
Arguments:
  • <package> - Package identifier (e.g., @user/package-name)
  • [input] - Test input text (optional; required unless using --interactive)
Options:
  • -m, --model <model> - AI model to use (sonnet, opus, gpt-4o, gpt-4o-mini)
  • -c, --compare - Compare with and without package prompt
  • -i, --interactive - Start interactive conversation mode
  • --help - Show help information
Examples:
# Basic test
prpm playground @anthropic/code-reviewer "Review this: console.log('hi')"

# Interactive session
prpm playground @user/brainstorm --interactive

# Compare mode
prpm playground @user/optimizer "Optimize this function" --compare

# Specific model
prpm playground @user/complex-task "Analyze" --model opus

prpm subscribe

Manage your PRPM+ subscription.
prpm subscribe [command]
Commands:
  • (no command) - Subscribe to PRPM+
  • status - View subscription status and credits
  • cancel - Cancel your subscription
Examples:
# Subscribe to PRPM+
prpm subscribe

# Check your credits and status
prpm subscribe status

# Cancel subscription
prpm subscribe cancel

prpm buy-credits

Purchase additional playground credits.
prpm buy-credits
Opens an interactive prompt to select and purchase credit packs.

prpm suggested-inputs

Manage suggested test inputs for your packages.
prpm suggested-inputs <command> <package> [input]
Commands:
  • add - Add a suggested input
  • list - List all suggested inputs
  • remove - Remove a suggested input
Examples:
# Add a suggested input
prpm suggested-inputs add @user/my-package "Test with this input"

# List all inputs
prpm suggested-inputs list @user/my-package

# Remove an input
prpm suggested-inputs remove @user/my-package abc123

Best Practices

For Testing

  1. Use realistic inputs - Test with inputs you’d actually use in your workflow
  2. Try multiple models - Different models excel at different tasks
  3. Use compare mode - Verify the package actually adds value
  4. Test edge cases - Don’t just test the happy path
  5. Share useful results - Help the community learn from your tests

For Package Authors

  1. Add suggested inputs - Make it easy for users to test effectively
  2. Monitor analytics - See how users actually test your package
  3. Respond to feedback - Address issues users discover through testing
  4. Test your own packages - Verify they work as expected with different models
  5. Share best results - Feature great test results in your package README

Pricing & Limits

Credit Costs

Credits are spent based on token usage:
  • 1 credit = 5,000 tokens
  • Tokens = input tokens + output tokens
  • Cost shown after each test

Rate Limits

  • Free tier: 5 credits total
  • PRPM+: 100 credits/month + rollover (max 200)
  • Credit packs: No limits, never expire

Fair Use

Playground is for testing packages, not production AI usage:
  • ✅ Testing packages before installing
  • ✅ Comparing multiple packages
  • ✅ Sharing test results with team
  • ❌ Using as primary AI interface
  • ❌ Automating bulk requests
  • ❌ Sharing account credentials

Troubleshooting

“Insufficient credits”

Solution: Subscribe to PRPM+ or buy a credit pack:
prpm subscribe
# or
prpm buy-credits

“Package not found”

Solution: Verify the package name is correct:
prpm search "package name"

CLI playground not working

Solution: Ensure you’re logged in:
prpm login
prpm playground @user/package "test"

Test taking too long

Solution:
  • Try a faster model (GPT-4o Mini)
  • Reduce input length
  • Check your internet connection
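
For example, rerun the same test with the fastest listed model:
prpm playground @user/package-name "test input" --model gpt-4o-mini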

Support

Need help with Playground?