
What is Temperature?

Controlling randomness and creativity in LLM outputs.

Temperature is a sampling parameter that controls how random or deterministic LLM outputs are. Under the hood, the model's output logits are divided by the temperature before the softmax, so lower values sharpen the probability distribution and higher values flatten it. Lower temperature = more focused and predictable. Higher temperature = more creative and varied.
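
That logit-scaling step can be sketched in a few lines of plain Python. This is a minimal illustration, not any particular provider's implementation; treating temperature 0 as greedy (always pick the top token) is a common convention that providers generally follow.

```python
import math

def softmax_with_temperature(logits, temperature):
    # Convention: temperature 0 means greedy decoding, i.e. put
    # all probability mass on the highest-probability token.
    if temperature == 0:
        probs = [0.0] * len(logits)
        probs[max(range(len(logits)), key=lambda i: logits[i])] = 1.0
        return probs
    # Divide logits by temperature, then apply a standard softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, 0.3))  # sharp: most mass on token 0
print(softmax_with_temperature(logits, 1.5))  # flatter: mass spread out
```

Dividing by a small temperature exaggerates the gaps between logits, which is why low-temperature outputs concentrate on the most likely token.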

Temperature Settings

  • 0: Most deterministic, always picks the highest-probability token
  • 0.3-0.5: Focused but with some variation
  • 0.7-0.9: Balanced creativity
  • 1.0+: High creativity, more randomness
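
The difference between these settings shows up when you sample repeatedly. The sketch below (a toy model, not a real LLM call) draws tokens from a fixed set of logits: at temperature 0 every draw is identical, while at a higher temperature the draws vary.

```python
import math
import random

def sample_token(logits, temperature, rng):
    # Greedy decoding at temperature 0.
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token index according to the scaled distribution.
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

rng = random.Random(42)  # fixed seed so the demo is repeatable
logits = [2.0, 1.0, 0.5]
greedy = {sample_token(logits, 0, rng) for _ in range(20)}
varied = {sample_token(logits, 1.2, rng) for _ in range(20)}
print(greedy)             # always the same token
print(len(varied) > 1)    # higher temperature yields multiple tokens
```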

When to Use Each

  • Low (0-0.3): Factual Q&A, code generation, data extraction
  • Medium (0.5-0.7): General conversation, explanations
  • High (0.8-1.0): Creative writing, brainstorming
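
In application code, these guidelines often end up as a simple lookup. The task names and exact values below are illustrative choices based on the ranges above, not an official mapping.

```python
# Hypothetical task-to-temperature mapping, following the
# low / medium / high guidelines in this guide.
TASK_TEMPERATURE = {
    "data_extraction": 0.0,
    "code_generation": 0.2,
    "factual_qa": 0.3,
    "conversation": 0.6,
    "brainstorming": 0.9,
    "creative_writing": 1.0,
}

def temperature_for(task: str, default: float = 0.7) -> float:
    """Return a suggested temperature for a task, with a fallback."""
    return TASK_TEMPERATURE.get(task, default)

print(temperature_for("code_generation"))  # 0.2
print(temperature_for("unknown_task"))     # 0.7 (fallback)
```

You would then pass the returned value as the `temperature` parameter of whatever completion API you call.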

Does lower temperature reduce hallucinations?

Somewhat. Lower temperature makes outputs more consistent, but it doesn't prevent hallucinations: the model can still confidently output incorrect information, it will just do so more repeatably.
