LLM Settings
Model
Temperature
Max Tokens
System Prompt
You are a helpful assistant.
Stream Response
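The settings above correspond to the request parameters most chat-completion APIs accept. A minimal sketch, assuming an OpenAI-compatible backend; the field names and default values here are illustrative assumptions, not subcode.ai's actual schema:

```typescript
// Illustrative shape of the basic panel settings; names and defaults
// are assumptions for this sketch, not subcode.ai's configuration.
interface LlmSettings {
  model: string;         // Model
  temperature: number;   // Temperature: higher values give more varied output
  maxTokens: number;     // Max Tokens: upper bound on generated tokens
  systemPrompt: string;  // System Prompt
  stream: boolean;       // Stream Response: deliver tokens as they arrive
}

const defaults: LlmSettings = {
  model: "gpt-4o-mini",  // placeholder model name
  temperature: 0.7,
  maxTokens: 1024,
  systemPrompt: "You are a helpful assistant.",
  stream: true,
};

// Sketch of serializing the panel values into an OpenAI-compatible
// chat completion request body.
function toRequestBody(s: LlmSettings, userMessage: string) {
  return {
    model: s.model,
    temperature: s.temperature,
    max_tokens: s.maxTokens,
    stream: s.stream,
    messages: [
      { role: "system", content: s.systemPrompt },
      { role: "user", content: userMessage },
    ],
  };
}
```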
Advanced Settings
Top P
Top K
Frequency Penalty
Presence Penalty
Repetition Penalty
Reset Settings
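The advanced settings above control how the next token is chosen. Top K and Top P narrow the candidate set before sampling, while the penalty values discourage repetition; exact parameter names and support vary by backend (OpenAI-style APIs expose frequency and presence penalties, while Top K and repetition penalty are more common in local inference servers). A conceptual sketch of the Top K / Top P filtering step, as an illustration rather than the app's actual sampler:

```typescript
// Illustrative next-token filtering with Top K and Top P (nucleus sampling).
// Conceptual sketch only; not subcode.ai's implementation.
type Candidate = { token: string; p: number };

function filterCandidates(probs: Candidate[], topK: number, topP: number): Candidate[] {
  // Top K: keep only the K most likely tokens.
  const byLikelihood = [...probs].sort((a, b) => b.p - a.p).slice(0, topK);

  // Top P: keep the smallest prefix whose cumulative probability reaches topP.
  const kept: Candidate[] = [];
  let cumulative = 0;
  for (const c of byLikelihood) {
    kept.push(c);
    cumulative += c.p;
    if (cumulative >= topP) break;
  }

  // Renormalize the surviving probabilities before sampling from them.
  const total = kept.reduce((sum, c) => sum + c.p, 0);
  return kept.map((c) => ({ token: c.token, p: c.p / total }));
}
```

Frequency penalty lowers a token's score in proportion to how often it has already appeared, presence penalty applies a flat reduction once a token has appeared at all, and repetition penalty divides the scores of previously seen tokens. Temperature rescales all scores before sampling, with lower values making the most likely token dominate.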
Provided by _subcode.ai
AI Chat
Ask me anything. Type to get started.