LLM Configuration
Quick presets: Qwen (DashScope), DeepSeek, Zhipu (GLM), OpenAI
Configuration fields:
- API URL
- API Key
- Model Name
- Max Tokens
- Temperature: 0.7
- Max Questions Per Session (chat question limit for visitors)
- Custom System Prompt (optional, appended to default)
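The per-session question limit can be enforced with a simple counter keyed by visitor session. A minimal sketch, assuming an in-memory store and a limit value of 20 (the function name, store, and limit are illustrative, not taken from any specific implementation):

```python
from collections import defaultdict

MAX_QUESTIONS_PER_SESSION = 20  # assumed value; set via the admin form

# In-memory count of questions asked per visitor session (illustrative;
# a real deployment would likely persist this in a shared cache).
_session_counts: dict[str, int] = defaultdict(int)

def allow_question(session_id: str) -> bool:
    """Count the question and return True while the visitor is under the
    per-session limit; return False once the limit is reached."""
    if _session_counts[session_id] >= MAX_QUESTIONS_PER_SESSION:
        return False
    _session_counts[session_id] += 1
    return True
```

Because the count is keyed by session, one visitor hitting the limit does not affect others.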
Actions: Save Configuration, Test Connection
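The saved fields map naturally onto an OpenAI-style chat-completions request: Model Name, Max Tokens, and Temperature go into the request body, while API URL and API Key determine where the request is sent and how it is authorized. A minimal sketch of assembling that body from a saved configuration (the config key names, the default system prompt, and the example preset values are assumptions, not taken from any specific implementation):

```python
import json

def build_chat_request(config: dict, user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload from the saved
    configuration. Key names ("model_name", "max_tokens", ...) are
    illustrative."""
    # Per the form hint, the custom system prompt is appended to the default.
    system_prompt = "You are a helpful assistant."  # assumed default prompt
    if config.get("custom_system_prompt"):
        system_prompt += "\n" + config["custom_system_prompt"]
    return {
        "model": config["model_name"],
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": config.get("max_tokens", 1024),
        "temperature": config.get("temperature", 0.7),
    }

config = {
    "api_url": "https://api.deepseek.com/v1",  # hypothetical preset value
    "api_key": "sk-...",                       # never hard-code real keys
    "model_name": "deepseek-chat",
    "max_tokens": 1024,
    "temperature": 0.7,
    "custom_system_prompt": "Answer concisely.",
}
print(json.dumps(build_chat_request(config, "Hello!"), indent=2))
```

"Test Connection" would then send this payload (or a one-token probe) to `api_url` with the API key in an `Authorization: Bearer` header and report success or failure.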