Merge branch 'dev' into docs

Jay V
2025-05-21 15:01:25 -04:00
43 changed files with 2592 additions and 548 deletions

.gitignore

@@ -42,4 +42,5 @@ Thumbs.db
.env.local
.opencode/
# ignore locally built binary
opencode*

README.md

@@ -74,6 +74,8 @@ You can configure OpenCode using environment variables:
| `ANTHROPIC_API_KEY` | For Claude models |
| `OPENAI_API_KEY` | For OpenAI models |
| `GEMINI_API_KEY` | For Google Gemini models |
| `VERTEXAI_PROJECT` | For Google Cloud VertexAI (Gemini) |
| `VERTEXAI_LOCATION` | For Google Cloud VertexAI (Gemini) |
| `GROQ_API_KEY` | For Groq models |
| `AWS_ACCESS_KEY_ID` | For AWS Bedrock (Claude) |
| `AWS_SECRET_ACCESS_KEY` | For AWS Bedrock (Claude) |
@@ -81,7 +83,6 @@ You can configure OpenCode using environment variables:
| `AZURE_OPENAI_ENDPOINT` | For Azure OpenAI models |
| `AZURE_OPENAI_API_KEY` | For Azure OpenAI models (optional when using Entra ID) |
| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models |

### Configuration File Structure

```json
@@ -135,6 +136,10 @@ You can configure OpenCode using environment variables:
"command": "gopls" "command": "gopls"
} }
}, },
"shell": {
"path": "/bin/zsh",
"args": ["-l"]
},
"debug": false, "debug": false,
"debugLSP": false "debugLSP": false
} }
@@ -189,7 +194,43 @@ OpenCode supports a variety of AI models from different providers:
- O3 family (o3, o3-mini)
- O4 Mini

### Google Cloud VertexAI

- Gemini 2.5
- Gemini 2.5 Flash
## Using Bedrock Models

To use Bedrock models with OpenCode, you need three things:

1. Valid AWS credentials (the env vars `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION`)
2. Access to the corresponding model in AWS Bedrock in your region.
   a. You can request access in the AWS console on the Bedrock -> "Model access" page.
3. A correct configuration file. You don't need the `providers` key; instead, prefix each agent's model with `bedrock.` followed by a valid model ID. For now, only Claude 3.7 Sonnet is supported.
```json
{
  "agents": {
    "primary": {
      "model": "bedrock.claude-3.7-sonnet",
      "maxTokens": 5000,
      "reasoningEffort": ""
    },
    "task": {
      "model": "bedrock.claude-3.7-sonnet",
      "maxTokens": 5000,
      "reasoningEffort": ""
    },
    "title": {
      "model": "bedrock.claude-3.7-sonnet",
      "maxTokens": 80,
      "reasoningEffort": ""
    }
  }
}
```
## Interactive Mode Usage
```bash
# Start OpenCode
@@ -202,13 +243,65 @@ opencode -d
opencode -c /path/to/project
```
## Non-interactive Prompt Mode
You can run OpenCode in non-interactive mode by passing a prompt directly as a command-line argument. This is useful for scripting, automation, or when you want a quick answer without launching the full TUI.
```bash
# Run a single prompt and print the AI's response to the terminal
opencode -p "Explain the use of context in Go"
# Get response in JSON format
opencode -p "Explain the use of context in Go" -f json
# Run without showing the spinner
opencode -p "Explain the use of context in Go" -q
# Enable verbose logging to stderr
opencode -p "Explain the use of context in Go" --verbose
# Restrict the agent to only use specific tools
opencode -p "Explain the use of context in Go" --allowedTools=view,ls,glob
# Prevent the agent from using specific tools
opencode -p "Explain the use of context in Go" --excludedTools=bash,edit
```
In this mode, OpenCode will process your prompt, print the result to standard output, and then exit. All permissions are auto-approved for the session.
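Because the result is printed to standard output, prompt mode composes well with shell pipelines. As a minimal sketch (assuming the external `jq` tool is installed; it is not part of OpenCode), you can unwrap the JSON output to get just the response text:

```bash
# Ask for JSON, hide the spinner, and extract the "response" field with jq
opencode -p "Explain the use of context in Go" -q -f json | jq -r '.response'
```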
### Tool Restrictions
You can control which tools the AI assistant has access to in non-interactive mode:
- `--allowedTools`: Comma-separated list of tools that the agent is allowed to use. Only these tools will be available.
- `--excludedTools`: Comma-separated list of tools that the agent is not allowed to use. All other tools will be available.
These flags are mutually exclusive: you can use either `--allowedTools` or `--excludedTools`, but not both at the same time.
### Output Formats
OpenCode supports the following output formats in non-interactive mode:
| Format | Description |
| ------ | -------------------------------------- |
| `text` | Plain text output (default) |
| `json` | Output wrapped in a JSON object |
The output format is implemented as a strongly typed `OutputFormat` in the codebase, ensuring the format is validated before any output is produced.
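For example, a `-f json` response is a single-key object produced by `internal/format` (the answer text below is illustrative only):

```json
{
  "response": "In Go, a context carries deadlines, cancellation signals, and request-scoped values across API boundaries."
}
```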
## Command-line Flags

| Flag              | Short | Description                                          |
| ----------------- | ----- | ---------------------------------------------------- |
| `--help`          | `-h`  | Display help information                             |
| `--debug`         | `-d`  | Enable debug mode                                    |
| `--cwd`           | `-c`  | Set current working directory                        |
| `--prompt`        | `-p`  | Run a single prompt in non-interactive mode          |
| `--output-format` | `-f`  | Output format for non-interactive mode (text, json)  |
| `--quiet`         | `-q`  | Hide spinner in non-interactive mode                 |
| `--verbose`       |       | Display logs to stderr in non-interactive mode       |
| `--allowedTools`  |       | Restrict the agent to only use specified tools       |
| `--excludedTools` |       | Prevent the agent from using specified tools         |
## Keyboard Shortcuts
@@ -374,6 +467,35 @@ You can define any of the following color keys in your `customTheme`:
You don't need to define all colors. Any undefined colors will fall back to the default "opencode" theme colors.
### Shell Configuration

OpenCode lets you configure the shell used by the `bash` tool. It resolves the shell in the following order:

1. The shell specified in the config file (if provided)
2. The shell from the `$SHELL` environment variable (if available)
3. `/bin/bash` as a fallback if neither of the above is available
To configure a custom shell, add a `shell` section to your `.opencode.json` configuration file:
```json
{
"shell": {
"path": "/bin/zsh",
"args": ["-l"]
}
}
```
You can specify any shell executable and custom arguments:
```json
{
"shell": {
"path": "/usr/bin/fish",
"args": []
}
}
```
## Architecture

OpenCode is built with a modular architecture:

cmd/non_interactive_mode.go

@@ -0,0 +1,292 @@
package cmd
import (
"context"
"fmt"
"io"
"os"
"sync"
"time"
"log/slog"
charmlog "github.com/charmbracelet/log"
"github.com/sst/opencode/internal/app"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/db"
"github.com/sst/opencode/internal/format"
"github.com/sst/opencode/internal/llm/agent"
"github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/message"
"github.com/sst/opencode/internal/permission"
"github.com/sst/opencode/internal/tui/components/spinner"
"github.com/sst/opencode/internal/tui/theme"
)
// syncWriter is a thread-safe writer that prevents interleaved output
type syncWriter struct {
w io.Writer
mu sync.Mutex
}
// Write implements io.Writer
func (sw *syncWriter) Write(p []byte) (n int, err error) {
sw.mu.Lock()
defer sw.mu.Unlock()
return sw.w.Write(p)
}
// newSyncWriter creates a new synchronized writer
func newSyncWriter(w io.Writer) io.Writer {
return &syncWriter{w: w}
}
// filterTools filters the provided tools based on allowed or excluded tool names
func filterTools(allTools []tools.BaseTool, allowedTools, excludedTools []string) []tools.BaseTool {
// If neither allowed nor excluded tools are specified, return all tools
if len(allowedTools) == 0 && len(excludedTools) == 0 {
return allTools
}
// Create a map for faster lookups
allowedMap := make(map[string]bool)
for _, name := range allowedTools {
allowedMap[name] = true
}
excludedMap := make(map[string]bool)
for _, name := range excludedTools {
excludedMap[name] = true
}
var filteredTools []tools.BaseTool
for _, tool := range allTools {
toolName := tool.Info().Name
// If we have an allowed list, only include tools in that list
if len(allowedTools) > 0 {
if allowedMap[toolName] {
filteredTools = append(filteredTools, tool)
}
} else if len(excludedTools) > 0 {
// If we have an excluded list, include all tools except those in the list
if !excludedMap[toolName] {
filteredTools = append(filteredTools, tool)
}
}
}
return filteredTools
}
// handleNonInteractiveMode processes a single prompt in non-interactive mode
func handleNonInteractiveMode(ctx context.Context, prompt string, outputFormat format.OutputFormat, quiet bool, verbose bool, allowedTools, excludedTools []string) error {
// Initial log message using standard slog
slog.Info("Running in non-interactive mode", "prompt", prompt, "format", outputFormat, "quiet", quiet, "verbose", verbose,
"allowedTools", allowedTools, "excludedTools", excludedTools)
// Sanity check for mutually exclusive flags
if quiet && verbose {
return fmt.Errorf("--quiet and --verbose flags cannot be used together")
}
// Set up logging to stderr if verbose mode is enabled
if verbose {
// Create a synchronized writer to prevent interleaved output
syncWriter := newSyncWriter(os.Stderr)
// Create a charmbracelet/log logger that writes to the synchronized writer
charmLogger := charmlog.NewWithOptions(syncWriter, charmlog.Options{
Level: charmlog.DebugLevel,
ReportCaller: true,
ReportTimestamp: true,
TimeFormat: time.RFC3339,
Prefix: "OpenCode",
})
// Set the global logger for charmbracelet/log
charmlog.SetDefault(charmLogger)
// Create a slog handler that uses charmbracelet/log
// This will forward all slog logs to charmbracelet/log
slog.SetDefault(slog.New(charmLogger))
// Log a message to confirm verbose logging is enabled
charmLogger.Info("Verbose logging enabled")
}
// Start spinner if not in quiet mode
var s *spinner.Spinner
if !quiet {
// Get the current theme to style the spinner
currentTheme := theme.CurrentTheme()
// Create a themed spinner
if currentTheme != nil {
// Use the primary color from the theme
s = spinner.NewThemedSpinner("Thinking...", currentTheme.Primary())
} else {
// Fallback to default spinner if no theme is available
s = spinner.NewSpinner("Thinking...")
}
s.Start()
defer s.Stop()
}
// Connect DB, this will also run migrations
conn, err := db.Connect()
if err != nil {
return err
}
// Create a context with cancellation
ctx, cancel := context.WithCancel(ctx)
defer cancel()
// Create the app
app, err := app.New(ctx, conn)
if err != nil {
slog.Error("Failed to create app", "error", err)
return err
}
// Create a new session for this prompt
session, err := app.Sessions.Create(ctx, "Non-interactive prompt")
if err != nil {
return fmt.Errorf("failed to create session: %w", err)
}
// Set the session as current
app.CurrentSession = &session
// Auto-approve all permissions for this session
permission.AutoApproveSession(ctx, session.ID)
// Create the user message
_, err = app.Messages.Create(ctx, session.ID, message.CreateMessageParams{
Role: message.User,
Parts: []message.ContentPart{message.TextContent{Text: prompt}},
})
if err != nil {
return fmt.Errorf("failed to create message: %w", err)
}
// If tool restrictions are specified, create a new agent with filtered tools
if len(allowedTools) > 0 || len(excludedTools) > 0 {
// Initialize MCP tools synchronously to ensure they're included in filtering
mcpCtx, mcpCancel := context.WithTimeout(ctx, 10*time.Second)
agent.GetMcpTools(mcpCtx, app.Permissions)
mcpCancel()
// Get all available tools including MCP tools
allTools := agent.PrimaryAgentTools(
app.Permissions,
app.Sessions,
app.Messages,
app.History,
app.LSPClients,
)
// Filter tools based on allowed/excluded lists
filteredTools := filterTools(allTools, allowedTools, excludedTools)
// Log the filtered tools for debugging
var toolNames []string
for _, tool := range filteredTools {
toolNames = append(toolNames, tool.Info().Name)
}
slog.Debug("Using filtered tools", "count", len(filteredTools), "tools", toolNames)
// Create a new agent with the filtered tools
restrictedAgent, err := agent.NewAgent(
config.AgentPrimary,
app.Sessions,
app.Messages,
filteredTools,
)
if err != nil {
return fmt.Errorf("failed to create restricted agent: %w", err)
}
// Use the restricted agent for this request
eventCh, err := restrictedAgent.Run(ctx, session.ID, prompt)
if err != nil {
return fmt.Errorf("failed to run restricted agent: %w", err)
}
// Wait for the response
var response message.Message
for event := range eventCh {
if event.Err() != nil {
return fmt.Errorf("agent error: %w", event.Err())
}
response = event.Response()
}
// Format and print the output
content := ""
if textContent := response.Content(); textContent != nil {
content = textContent.Text
}
formattedOutput, err := format.FormatOutput(content, outputFormat)
if err != nil {
return fmt.Errorf("failed to format output: %w", err)
}
// Stop spinner before printing output
if !quiet && s != nil {
s.Stop()
}
// Print the formatted output to stdout
fmt.Println(formattedOutput)
// Shutdown the app
app.Shutdown()
return nil
}
// Run the default agent if no tool restrictions
eventCh, err := app.PrimaryAgent.Run(ctx, session.ID, prompt)
if err != nil {
return fmt.Errorf("failed to run agent: %w", err)
}
// Wait for the response
var response message.Message
for event := range eventCh {
if event.Err() != nil {
return fmt.Errorf("agent error: %w", event.Err())
}
response = event.Response()
}
// Get the text content from the response
content := ""
if textContent := response.Content(); textContent != nil {
content = textContent.Text
}
// Format the output according to the specified format
formattedOutput, err := format.FormatOutput(content, outputFormat)
if err != nil {
return fmt.Errorf("failed to format output: %w", err)
}
// Stop spinner before printing output
if !quiet && s != nil {
s.Stop()
}
// Print the formatted output to stdout
fmt.Println(formattedOutput)
// Shutdown the app
app.Shutdown()
return nil
}


@@ -15,6 +15,7 @@ import (
"github.com/sst/opencode/internal/app" "github.com/sst/opencode/internal/app"
"github.com/sst/opencode/internal/config" "github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/db" "github.com/sst/opencode/internal/db"
"github.com/sst/opencode/internal/format"
"github.com/sst/opencode/internal/llm/agent" "github.com/sst/opencode/internal/llm/agent"
"github.com/sst/opencode/internal/logging" "github.com/sst/opencode/internal/logging"
"github.com/sst/opencode/internal/lsp/discovery" "github.com/sst/opencode/internal/lsp/discovery"
@@ -88,6 +89,25 @@ to assist developers in writing, debugging, and understanding code directly from
return err
}
// Check if we're in non-interactive mode
prompt, _ := cmd.Flags().GetString("prompt")
if prompt != "" {
outputFormatStr, _ := cmd.Flags().GetString("output-format")
outputFormat := format.OutputFormat(outputFormatStr)
if !outputFormat.IsValid() {
return fmt.Errorf("invalid output format: %s", outputFormatStr)
}
quiet, _ := cmd.Flags().GetBool("quiet")
verbose, _ := cmd.Flags().GetBool("verbose")
// Get tool restriction flags
allowedTools, _ := cmd.Flags().GetStringSlice("allowedTools")
excludedTools, _ := cmd.Flags().GetStringSlice("excludedTools")
return handleNonInteractiveMode(cmd.Context(), prompt, outputFormat, quiet, verbose, allowedTools, excludedTools)
}
// Run LSP auto-discovery
if err := discovery.IntegrateLSPServers(cwd); err != nil {
slog.Warn("Failed to auto-discover LSP servers", "error", err)
@@ -296,4 +316,16 @@ func init() {
rootCmd.Flags().BoolP("version", "v", false, "Version") rootCmd.Flags().BoolP("version", "v", false, "Version")
rootCmd.Flags().BoolP("debug", "d", false, "Debug") rootCmd.Flags().BoolP("debug", "d", false, "Debug")
rootCmd.Flags().StringP("cwd", "c", "", "Current working directory") rootCmd.Flags().StringP("cwd", "c", "", "Current working directory")
rootCmd.Flags().StringP("prompt", "p", "", "Run a single prompt in non-interactive mode")
rootCmd.Flags().StringP("output-format", "f", "text", "Output format for non-interactive mode (text, json)")
rootCmd.Flags().BoolP("quiet", "q", false, "Hide spinner in non-interactive mode")
rootCmd.Flags().BoolP("verbose", "", false, "Display logs to stderr in non-interactive mode")
rootCmd.Flags().StringSlice("allowedTools", nil, "Restrict the agent to only use the specified tools in non-interactive mode (comma-separated list)")
rootCmd.Flags().StringSlice("excludedTools", nil, "Prevent the agent from using the specified tools in non-interactive mode (comma-separated list)")
// Make allowedTools and excludedTools mutually exclusive
rootCmd.MarkFlagsMutuallyExclusive("allowedTools", "excludedTools")
// Make quiet and verbose mutually exclusive
rootCmd.MarkFlagsMutuallyExclusive("quiet", "verbose")
}


@@ -227,6 +227,7 @@ func generateSchema() map[string]any {
string(models.ProviderOpenRouter),
string(models.ProviderBedrock),
string(models.ProviderAzure),
string(models.ProviderVertexAI),
}
providerSchema["additionalProperties"].(map[string]any)["properties"].(map[string]any)["provider"] = map[string]any{

go.mod

@@ -11,7 +11,7 @@ require (
github.com/aymanbagabas/go-udiff v0.2.0
github.com/bmatcuk/doublestar/v4 v4.8.1
github.com/catppuccin/go v0.3.0
github.com/charmbracelet/bubbles v0.20.0
github.com/charmbracelet/bubbles v0.21.0
github.com/charmbracelet/bubbletea v1.3.4
github.com/charmbracelet/glamour v0.9.1
github.com/charmbracelet/lipgloss v1.1.0
@@ -26,7 +26,7 @@ require (
github.com/muesli/reflow v0.3.0
github.com/muesli/termenv v0.16.0
github.com/ncruces/go-sqlite3 v0.25.0
github.com/openai/openai-go v0.1.0-beta.2
github.com/openai/openai-go v0.1.0-beta.10
github.com/pressly/goose/v3 v3.24.2
github.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3
github.com/spf13/cobra v1.9.1
@@ -34,6 +34,11 @@ require (
github.com/stretchr/testify v1.10.0
)
require (
github.com/charmbracelet/log v0.4.2 // indirect
golang.org/x/exp v0.0.0-20250305212735-054e65f0b394 // indirect
)
require (
cloud.google.com/go v0.116.0 // indirect
cloud.google.com/go/auth v0.13.0 // indirect
@@ -42,7 +47,7 @@ require (
github.com/Azure/azure-sdk-for-go/sdk/internal v1.10.0 // indirect
github.com/AzureAD/microsoft-authentication-library-for-go v1.2.2 // indirect
github.com/andybalholm/cascadia v1.3.2 // indirect
github.com/atotto/clipboard v0.1.4 // indirect
github.com/atotto/clipboard v0.1.4
github.com/aws/aws-sdk-go-v2 v1.30.3 // indirect
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.6.3 // indirect
github.com/aws/aws-sdk-go-v2/config v1.27.27 // indirect
@@ -115,7 +120,7 @@ require (
go.opentelemetry.io/otel/trace v1.35.0 // indirect
go.uber.org/multierr v1.11.0 // indirect
golang.org/x/crypto v0.37.0 // indirect
golang.org/x/image v0.26.0 // indirect
golang.org/x/image v0.26.0
golang.org/x/net v0.39.0 // indirect
golang.org/x/sync v0.13.0 // indirect
golang.org/x/sys v0.32.0 // indirect

go.sum

@@ -68,8 +68,8 @@ github.com/bmatcuk/doublestar/v4 v4.8.1 h1:54Bopc5c2cAvhLRAzqOGCYHYyhcDHsFF4wWIR
github.com/bmatcuk/doublestar/v4 v4.8.1/go.mod h1:xBQ8jztBU6kakFMg+8WGxn0c6z1fTSPVIjEY1Wr7jzc=
github.com/catppuccin/go v0.3.0 h1:d+0/YicIq+hSTo5oPuRi5kOpqkVA5tAsU6dNhvRu+aY=
github.com/catppuccin/go v0.3.0/go.mod h1:8IHJuMGaUUjQM82qBrGNBv7LFq6JI3NnQCF6MOlZjpc=
github.com/charmbracelet/bubbles v0.20.0 h1:jSZu6qD8cRQ6k9OMfR1WlM+ruM8fkPWkHvQWD9LIutE=
github.com/charmbracelet/bubbles v0.21.0 h1:9TdC97SdRVg/1aaXNVWfFH3nnLAwOXr8Fn6u6mfQdFs=
github.com/charmbracelet/bubbles v0.20.0/go.mod h1:39slydyswPy+uVOHZ5x/GjwVAFkCsV8IIVy+4MhzwwU=
github.com/charmbracelet/bubbles v0.21.0/go.mod h1:HF+v6QUR4HkEpz62dx7ym2xc71/KBHg+zKwJtMw+qtg=
github.com/charmbracelet/bubbletea v1.3.4 h1:kCg7B+jSCFPLYRA52SDZjr51kG/fMUEoPoZrkaDHyoI=
github.com/charmbracelet/bubbletea v1.3.4/go.mod h1:dtcUCyCGEX3g9tosuYiut3MXgY/Jsv9nKVdibKKRRXo=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc h1:4pZI35227imm7yK2bGPcfpFEmuY1gc2YSTShr4iJBfs=
@@ -78,12 +78,14 @@ github.com/charmbracelet/glamour v0.9.1 h1:11dEfiGP8q1BEqvGoIjivuc2rBk+5qEXdPtaQ
github.com/charmbracelet/glamour v0.9.1/go.mod h1:+SHvIS8qnwhgTpVMiXwn7OfGomSqff1cHBCI8jLOetk=
github.com/charmbracelet/lipgloss v1.1.0 h1:vYXsiLHVkK7fp74RkV7b2kq9+zDLoEU4MZoFqR/noCY=
github.com/charmbracelet/lipgloss v1.1.0/go.mod h1:/6Q8FR2o+kj8rz4Dq0zQc3vYf7X+B0binUUBwA0aL30=
github.com/charmbracelet/log v0.4.2 h1:hYt8Qj6a8yLnvR+h7MwsJv/XvmBJXiueUcI3cIxsyig=
github.com/charmbracelet/log v0.4.2/go.mod h1:qifHGX/tc7eluv2R6pWIpyHDDrrb/AG71Pf2ysQu5nw=
github.com/charmbracelet/x/ansi v0.8.0 h1:9GTq3xq9caJW8ZrBTe0LIe2fvfLR/bYXKTx2llXn7xE=
github.com/charmbracelet/x/ansi v0.8.0/go.mod h1:wdYl/ONOLHLIVmQaxbIYEC/cRKOQyjTkowiI4blgS9Q=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd h1:vy0GVL4jeHEwG5YOXDmi86oYw2yuYUGqz6a8sLwg0X8=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd/go.mod h1:xe0nKWGd3eJgtqZRaN9RjMtK7xUYchjzPr7q6kcvCCs=
github.com/charmbracelet/x/exp/golden v0.0.0-20240815200342-61de596daa2b h1:MnAMdlwSltxJyULnrYbkZpp4k58Co7Tah3ciKhSNo0Q=
github.com/charmbracelet/x/exp/golden v0.0.0-20241011142426-46044092ad91 h1:payRxjMjKgx2PaCWLZ4p3ro9y97+TVLZNaRZgJwSVDQ=
github.com/charmbracelet/x/exp/golden v0.0.0-20240815200342-61de596daa2b/go.mod h1:wDlXFlCrmJ8J+swcL/MnGUuYnqgQdW9rhSD61oNMb6U=
github.com/charmbracelet/x/exp/golden v0.0.0-20241011142426-46044092ad91/go.mod h1:wDlXFlCrmJ8J+swcL/MnGUuYnqgQdW9rhSD61oNMb6U=
github.com/charmbracelet/x/term v0.2.1 h1:AQeHeLZ1OqSXhrAWpYUtZyX1T3zVxfpZuEQMIQaGIAQ=
github.com/charmbracelet/x/term v0.2.1/go.mod h1:oQ4enTYFV7QN4m0i9mzHrViD7TQKvNEEkHUMCmsxdUg=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
@@ -177,8 +179,8 @@ github.com/ncruces/go-strftime v0.1.9 h1:bY0MQC28UADQmHmaF5dgpLmImcShSi2kHU9XLdh
github.com/ncruces/go-strftime v0.1.9/go.mod h1:Fwc5htZGVVkseilnfgOVb9mKy6w1naJmn9CehxcKcls=
github.com/ncruces/julianday v1.0.0 h1:fH0OKwa7NWvniGQtxdJRxAgkBMolni2BjDHaWTxqt7M=
github.com/ncruces/julianday v1.0.0/go.mod h1:Dusn2KvZrrovOMJuOt0TNXL6tB7U2E8kvza5fFc9G7g=
github.com/openai/openai-go v0.1.0-beta.2 h1:Ra5nCFkbEl9w+UJwAciC4kqnIBUCcJazhmMA0/YN894=
github.com/openai/openai-go v0.1.0-beta.10 h1:CknhGXe8aXQMRuqg255PFnWzgRY9nEryMxoNIBBM9tU=
github.com/openai/openai-go v0.1.0-beta.2/go.mod h1:g461MYGXEXBVdV5SaR/5tNzNbSfwTBBefwc+LlDCK0Y=
github.com/openai/openai-go v0.1.0-beta.10/go.mod h1:g461MYGXEXBVdV5SaR/5tNzNbSfwTBBefwc+LlDCK0Y=
github.com/pelletier/go-toml/v2 v2.2.3 h1:YmeHyLY8mFWbdkNWwpr+qIL2bEqT0o95WSdkNHvL12M=
github.com/pelletier/go-toml/v2 v2.2.3/go.mod h1:MfCQTFTvCcUyyvvwm1+G6H/jORL20Xlb6rzQu9GuUkc=
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c h1:+mdjkGKdHQG3305AYmdv1U2eRNDiU2ErMBj1gwrq8eQ=


@@ -40,6 +40,9 @@ type App struct {
watcherCancelFuncs []context.CancelFunc
cancelFuncsMutex sync.Mutex
watcherWG sync.WaitGroup
// UI state
filepickerOpen bool
}
func New(ctx context.Context, conn *sql.DB) (*App, error) {
@@ -128,6 +131,16 @@ func (app *App) initTheme() {
}
}
// IsFilepickerOpen returns whether the filepicker is currently open
func (app *App) IsFilepickerOpen() bool {
return app.filepickerOpen
}
// SetFilepickerOpen sets the state of the filepicker
func (app *App) SetFilepickerOpen(open bool) {
app.filepickerOpen = open
}
// Shutdown performs a clean shutdown of the application
func (app *App) Shutdown() {
// Cancel all watcher goroutines


@@ -73,6 +73,12 @@ type TUIConfig struct {
CustomTheme map[string]any `json:"customTheme,omitempty"`
}
// ShellConfig defines the configuration for the shell used by the bash tool.
type ShellConfig struct {
Path string `json:"path,omitempty"`
Args []string `json:"args,omitempty"`
}
// Config is the main configuration structure for the application.
type Config struct {
Data Data `json:"data"`
@@ -85,6 +91,7 @@ type Config struct {
DebugLSP bool `json:"debugLSP,omitempty"`
ContextPaths []string `json:"contextPaths,omitempty"`
TUI TUIConfig `json:"tui"`
Shell ShellConfig `json:"shell,omitempty"`
}
// Application constants
@@ -235,6 +242,7 @@ func setProviderDefaults() {
// 5. OpenRouter
// 6. AWS Bedrock
// 7. Azure
// 8. Google Cloud VertexAI
// Anthropic configuration
if key := viper.GetString("providers.anthropic.apiKey"); strings.TrimSpace(key) != "" {
@@ -299,6 +307,15 @@ func setProviderDefaults() {
viper.SetDefault("agents.title.model", models.AzureGPT41Mini) viper.SetDefault("agents.title.model", models.AzureGPT41Mini)
return return
} }
// Google Cloud VertexAI configuration
if hasVertexAICredentials() {
viper.SetDefault("agents.coder.model", models.VertexAIGemini25)
viper.SetDefault("agents.summarizer.model", models.VertexAIGemini25)
viper.SetDefault("agents.task.model", models.VertexAIGemini25Flash)
viper.SetDefault("agents.title.model", models.VertexAIGemini25Flash)
return
}
}
// hasAWSCredentials checks if AWS credentials are available in the environment.
@@ -327,6 +344,19 @@ func hasAWSCredentials() bool {
return false
}
// hasVertexAICredentials checks if VertexAI credentials are available in the environment.
func hasVertexAICredentials() bool {
// Check for explicit VertexAI parameters
if os.Getenv("VERTEXAI_PROJECT") != "" && os.Getenv("VERTEXAI_LOCATION") != "" {
return true
}
// Check for Google Cloud project and location
if os.Getenv("GOOGLE_CLOUD_PROJECT") != "" && (os.Getenv("GOOGLE_CLOUD_REGION") != "" || os.Getenv("GOOGLE_CLOUD_LOCATION") != "") {
return true
}
return false
}
// readConfig handles the result of reading a configuration file.
func readConfig(err error) error {
if err == nil {
@@ -549,6 +579,10 @@ func getProviderAPIKey(provider models.ModelProvider) string {
if hasAWSCredentials() {
return "aws-credentials-available"
}
case models.ProviderVertexAI:
if hasVertexAICredentials() {
return "vertex-ai-credentials-available"
}
}
return ""
}
@@ -669,6 +703,24 @@ func setDefaultModelForAgent(agent AgentName) bool {
return true
}
if hasVertexAICredentials() {
var model models.ModelID
maxTokens := int64(5000)
if agent == AgentTitle {
model = models.VertexAIGemini25Flash
maxTokens = 80
} else {
model = models.VertexAIGemini25
}
cfg.Agents[agent] = Agent{
Model: model,
MaxTokens: maxTokens,
}
return true
}
return false
}

internal/format/format.go

@@ -0,0 +1,46 @@
package format
import (
"encoding/json"
"fmt"
)
// OutputFormat represents the format for non-interactive mode output
type OutputFormat string
const (
// TextFormat is plain text output (default)
TextFormat OutputFormat = "text"
// JSONFormat is output wrapped in a JSON object
JSONFormat OutputFormat = "json"
)
// IsValid checks if the output format is valid
func (f OutputFormat) IsValid() bool {
return f == TextFormat || f == JSONFormat
}
// String returns the string representation of the output format
func (f OutputFormat) String() string {
return string(f)
}
// FormatOutput formats the given content according to the specified format
func FormatOutput(content string, format OutputFormat) (string, error) {
switch format {
case TextFormat:
return content, nil
case JSONFormat:
jsonData := map[string]string{
"response": content,
}
jsonBytes, err := json.MarshalIndent(jsonData, "", " ")
if err != nil {
return "", fmt.Errorf("failed to marshal JSON: %w", err)
}
return string(jsonBytes), nil
default:
return "", fmt.Errorf("unsupported output format: %s", format)
}
}


@@ -0,0 +1,90 @@
package format
import (
"testing"
)
func TestOutputFormat_IsValid(t *testing.T) {
t.Parallel()
tests := []struct {
name string
format OutputFormat
want bool
}{
{
name: "text format",
format: TextFormat,
want: true,
},
{
name: "json format",
format: JSONFormat,
want: true,
},
{
name: "invalid format",
format: "invalid",
want: false,
},
}
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
if got := tt.format.IsValid(); got != tt.want {
t.Errorf("OutputFormat.IsValid() = %v, want %v", got, tt.want)
}
})
}
}
func TestFormatOutput(t *testing.T) {
t.Parallel()
tests := []struct {
name string
content string
format OutputFormat
want string
wantErr bool
}{
{
name: "text format",
content: "test content",
format: TextFormat,
want: "test content",
wantErr: false,
},
{
name: "json format",
content: "test content",
format: JSONFormat,
want: "{\n \"response\": \"test content\"\n}",
wantErr: false,
},
{
name: "invalid format",
content: "test content",
format: "invalid",
want: "",
wantErr: true,
},
}
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
got, err := FormatOutput(tt.content, tt.format)
if (err != nil) != tt.wantErr {
t.Errorf("FormatOutput() error = %v, wantErr %v", err, tt.wantErr)
return
}
if got != tt.want {
t.Errorf("FormatOutput() = %v, want %v", got, tt.want)
}
})
}
}


@@ -4,12 +4,11 @@ import (
"context" "context"
"errors" "errors"
"fmt" "fmt"
"log/slog"
"strings" "strings"
"sync" "sync"
"time" "time"
"log/slog"
"github.com/sst/opencode/internal/config" "github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/llm/models" "github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/llm/prompt" "github.com/sst/opencode/internal/llm/prompt"
@@ -522,15 +521,6 @@ func (a *agent) processEvent(ctx context.Context, sessionID string, assistantMsg
assistantMsg.AddToolCall(*event.ToolCall)
_, err := a.messages.Update(ctx, *assistantMsg)
return err
// TODO: see how to handle this
// case provider.EventToolUseDelta:
// tm := time.Unix(assistantMsg.UpdatedAt, 0)
// assistantMsg.AppendToolCallInput(event.ToolCall.ID, event.ToolCall.Input)
// if time.Since(tm) > 1000*time.Millisecond {
// err := a.messages.Update(ctx, *assistantMsg)
// assistantMsg.UpdatedAt = time.Now().Unix()
// return err
// }
case provider.EventToolUseStop:
assistantMsg.FinishToolCall(event.ToolCall.ID)
_, err := a.messages.Update(ctx, *assistantMsg)


@@ -0,0 +1,25 @@
package models
const (
ProviderBedrock ModelProvider = "bedrock"
// Models
BedrockClaude37Sonnet ModelID = "bedrock.claude-3.7-sonnet"
)
var BedrockModels = map[ModelID]Model{
BedrockClaude37Sonnet: {
ID: BedrockClaude37Sonnet,
Name: "Bedrock: Claude 3.7 Sonnet",
Provider: ProviderBedrock,
APIModel: "anthropic.claude-3-7-sonnet-20250219-v1:0",
CostPer1MIn: 3.0,
CostPer1MInCached: 3.75,
CostPer1MOutCached: 0.30,
CostPer1MOut: 15.0,
ContextWindow: 200_000,
DefaultMaxTokens: 50_000,
CanReason: true,
SupportsAttachments: true,
},
}


@@ -41,6 +41,7 @@ var GroqModels = map[ModelID]Model{
CostPer1MInCached: 0,
CostPer1MOutCached: 0,
CostPer1MOut: 0.34,
DefaultMaxTokens: 8192,
ContextWindow: 128_000, // 10M when?
SupportsAttachments: true,
},
@@ -54,6 +55,7 @@ var GroqModels = map[ModelID]Model{
CostPer1MInCached: 0,
CostPer1MOutCached: 0,
CostPer1MOut: 0.20,
DefaultMaxTokens: 8192,
ContextWindow: 128_000,
SupportsAttachments: true,
},


@@ -22,14 +22,7 @@ type Model struct {
SupportsAttachments bool `json:"supports_attachments"`
}
// Model IDs
const (
// GEMINI
// Bedrock
BedrockClaude37Sonnet ModelID = "bedrock.claude-3.7-sonnet"
)
const (
ProviderBedrock ModelProvider = "bedrock"
// ForTests
ProviderMock ModelProvider = "__mock"
)
@@ -43,56 +36,19 @@ var ProviderPopularity = map[ModelProvider]int{
ProviderOpenRouter: 5,
ProviderBedrock: 6,
ProviderAzure: 7,
ProviderVertexAI: 8,
}
var SupportedModels = map[ModelID]Model{
//
// // GEMINI
// GEMINI25: {
// ID: GEMINI25,
// Name: "Gemini 2.5 Pro",
// Provider: ProviderGemini,
// APIModel: "gemini-2.5-pro-exp-03-25",
// CostPer1MIn: 0,
// CostPer1MInCached: 0,
// CostPer1MOutCached: 0,
// CostPer1MOut: 0,
// },
//
// GRMINI20Flash: {
// ID: GRMINI20Flash,
// Name: "Gemini 2.0 Flash",
// Provider: ProviderGemini,
// APIModel: "gemini-2.0-flash",
// CostPer1MIn: 0.1,
// CostPer1MInCached: 0,
// CostPer1MOutCached: 0.025,
// CostPer1MOut: 0.4,
// },
//
// // Bedrock
BedrockClaude37Sonnet: {
ID: BedrockClaude37Sonnet,
Name: "Bedrock: Claude 3.7 Sonnet",
Provider: ProviderBedrock,
APIModel: "anthropic.claude-3-7-sonnet-20250219-v1:0",
CostPer1MIn: 3.0,
CostPer1MInCached: 3.75,
CostPer1MOutCached: 0.30,
CostPer1MOut: 15.0,
ContextWindow: 200_000,
DefaultMaxTokens: 50_000,
CanReason: true,
SupportsAttachments: true,
},
}
var SupportedModels = map[ModelID]Model{}
func init() {
maps.Copy(SupportedModels, AnthropicModels)
maps.Copy(SupportedModels, BedrockModels)
maps.Copy(SupportedModels, OpenAIModels)
maps.Copy(SupportedModels, GeminiModels)
maps.Copy(SupportedModels, GroqModels)
maps.Copy(SupportedModels, AzureModels)
maps.Copy(SupportedModels, OpenRouterModels)
maps.Copy(SupportedModels, XAIModels)
maps.Copy(SupportedModels, VertexAIGeminiModels)
}


@@ -3,6 +3,7 @@ package models
const (
ProviderOpenAI ModelProvider = "openai"
CodexMini ModelID = "codex-mini"
GPT41 ModelID = "gpt-4.1"
GPT41Mini ModelID = "gpt-4.1-mini"
GPT41Nano ModelID = "gpt-4.1-nano"
@@ -18,6 +19,20 @@ const (
)
var OpenAIModels = map[ModelID]Model{
CodexMini: {
ID: CodexMini,
Name: "Codex Mini",
Provider: ProviderOpenAI,
APIModel: "codex-mini-latest",
CostPer1MIn: 1.50,
CostPer1MInCached: 0.375,
CostPer1MOutCached: 0.0,
CostPer1MOut: 6.00,
ContextWindow: 200_000,
DefaultMaxTokens: 100_000,
CanReason: true,
SupportsAttachments: true,
},
GPT41: {
ID: GPT41,
Name: "GPT 4.1",


@@ -0,0 +1,38 @@
package models
const (
ProviderVertexAI ModelProvider = "vertexai"
// Models
VertexAIGemini25Flash ModelID = "vertexai.gemini-2.5-flash"
VertexAIGemini25 ModelID = "vertexai.gemini-2.5"
)
var VertexAIGeminiModels = map[ModelID]Model{
VertexAIGemini25Flash: {
ID: VertexAIGemini25Flash,
Name: "VertexAI: Gemini 2.5 Flash",
Provider: ProviderVertexAI,
APIModel: "gemini-2.5-flash-preview-04-17",
CostPer1MIn: GeminiModels[Gemini25Flash].CostPer1MIn,
CostPer1MInCached: GeminiModels[Gemini25Flash].CostPer1MInCached,
CostPer1MOut: GeminiModels[Gemini25Flash].CostPer1MOut,
CostPer1MOutCached: GeminiModels[Gemini25Flash].CostPer1MOutCached,
ContextWindow: GeminiModels[Gemini25Flash].ContextWindow,
DefaultMaxTokens: GeminiModels[Gemini25Flash].DefaultMaxTokens,
SupportsAttachments: true,
},
VertexAIGemini25: {
ID: VertexAIGemini25,
Name: "VertexAI: Gemini 2.5 Pro",
Provider: ProviderVertexAI,
APIModel: "gemini-2.5-pro-preview-03-25",
CostPer1MIn: GeminiModels[Gemini25].CostPer1MIn,
CostPer1MInCached: GeminiModels[Gemini25].CostPer1MInCached,
CostPer1MOut: GeminiModels[Gemini25].CostPer1MOut,
CostPer1MOutCached: GeminiModels[Gemini25].CostPer1MOutCached,
ContextWindow: GeminiModels[Gemini25].ContextWindow,
DefaultMaxTokens: GeminiModels[Gemini25].DefaultMaxTokens,
SupportsAttachments: true,
},
}


@@ -176,13 +176,16 @@ func (g *geminiClient) send(ctx context.Context, messages []message.Message, too
history := geminiMessages[:len(geminiMessages)-1] // All but last message
lastMsg := geminiMessages[len(geminiMessages)-1]
chat, _ := g.client.Chats.Create(ctx, g.providerOptions.model.APIModel, &genai.GenerateContentConfig{
MaxOutputTokens: int32(g.providerOptions.maxTokens),
SystemInstruction: &genai.Content{
Parts: []*genai.Part{{Text: g.providerOptions.systemMessage}},
},
Tools: g.convertTools(tools),
}, history)
config := &genai.GenerateContentConfig{
MaxOutputTokens: int32(g.providerOptions.maxTokens),
SystemInstruction: &genai.Content{
Parts: []*genai.Part{{Text: g.providerOptions.systemMessage}},
},
}
if len(tools) > 0 {
config.Tools = g.convertTools(tools)
}
chat, _ := g.client.Chats.Create(ctx, g.providerOptions.model.APIModel, config, history)
attempts := 0
for {
@@ -262,13 +265,16 @@ func (g *geminiClient) stream(ctx context.Context, messages []message.Message, t
history := geminiMessages[:len(geminiMessages)-1] // All but last message
lastMsg := geminiMessages[len(geminiMessages)-1]
chat, _ := g.client.Chats.Create(ctx, g.providerOptions.model.APIModel, &genai.GenerateContentConfig{
MaxOutputTokens: int32(g.providerOptions.maxTokens),
SystemInstruction: &genai.Content{
Parts: []*genai.Part{{Text: g.providerOptions.systemMessage}},
},
Tools: g.convertTools(tools),
}, history)
config := &genai.GenerateContentConfig{
MaxOutputTokens: int32(g.providerOptions.maxTokens),
SystemInstruction: &genai.Content{
Parts: []*genai.Part{{Text: g.providerOptions.systemMessage}},
},
}
if len(tools) > 0 {
config.Tools = g.convertTools(tools)
}
chat, _ := g.client.Chats.Create(ctx, g.providerOptions.model.APIModel, config, history)
attempts := 0
eventChan := make(chan ProviderEvent)


@@ -2,21 +2,14 @@ package provider
import (
"context"
"encoding/json"
"errors"
"fmt"
"io"
"log/slog"
"time"
"github.com/openai/openai-go"
"github.com/openai/openai-go/option"
"github.com/openai/openai-go/shared"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/message"
"github.com/sst/opencode/internal/status"
)
type openaiOptions struct {
@@ -66,86 +59,20 @@ func newOpenAIClient(opts providerClientOptions) OpenAIClient {
}
}
func (o *openaiClient) send(ctx context.Context, messages []message.Message, tools []tools.BaseTool) (response *ProviderResponse, err error) {
if o.providerOptions.model.ID == models.OpenAIModels[models.CodexMini].ID || o.providerOptions.model.ID == models.OpenAIModels[models.O1Pro].ID {
return o.sendResponseMessages(ctx, messages, tools)
}
return o.sendChatcompletionMessage(ctx, messages, tools)
}
func (o *openaiClient) stream(ctx context.Context, messages []message.Message, tools []tools.BaseTool) <-chan ProviderEvent {
if o.providerOptions.model.ID == models.OpenAIModels[models.CodexMini].ID || o.providerOptions.model.ID == models.OpenAIModels[models.O1Pro].ID {
return o.streamResponseMessages(ctx, messages, tools)
}
return o.streamChatCompletionMessages(ctx, messages, tools)
}
func (o *openaiClient) convertMessages(messages []message.Message) (openaiMessages []openai.ChatCompletionMessageParamUnion) {
// Add system message first
openaiMessages = append(openaiMessages, openai.SystemMessage(o.providerOptions.systemMessage))
for _, msg := range messages {
switch msg.Role {
case message.User:
var content []openai.ChatCompletionContentPartUnionParam
textBlock := openai.ChatCompletionContentPartTextParam{Text: msg.Content().String()}
content = append(content, openai.ChatCompletionContentPartUnionParam{OfText: &textBlock})
for _, binaryContent := range msg.BinaryContent() {
imageURL := openai.ChatCompletionContentPartImageImageURLParam{URL: binaryContent.String(models.ProviderOpenAI)}
imageBlock := openai.ChatCompletionContentPartImageParam{ImageURL: imageURL}
content = append(content, openai.ChatCompletionContentPartUnionParam{OfImageURL: &imageBlock})
}
openaiMessages = append(openaiMessages, openai.UserMessage(content))
case message.Assistant:
assistantMsg := openai.ChatCompletionAssistantMessageParam{
Role: "assistant",
}
if msg.Content().String() != "" {
assistantMsg.Content = openai.ChatCompletionAssistantMessageParamContentUnion{
OfString: openai.String(msg.Content().String()),
}
}
if len(msg.ToolCalls()) > 0 {
assistantMsg.ToolCalls = make([]openai.ChatCompletionMessageToolCallParam, len(msg.ToolCalls()))
for i, call := range msg.ToolCalls() {
assistantMsg.ToolCalls[i] = openai.ChatCompletionMessageToolCallParam{
ID: call.ID,
Type: "function",
Function: openai.ChatCompletionMessageToolCallFunctionParam{
Name: call.Name,
Arguments: call.Input,
},
}
}
}
openaiMessages = append(openaiMessages, openai.ChatCompletionMessageParamUnion{
OfAssistant: &assistantMsg,
})
case message.Tool:
for _, result := range msg.ToolResults() {
openaiMessages = append(openaiMessages,
openai.ToolMessage(result.Content, result.ToolCallID),
)
}
}
}
return
}
func (o *openaiClient) convertTools(tools []tools.BaseTool) []openai.ChatCompletionToolParam {
openaiTools := make([]openai.ChatCompletionToolParam, len(tools))
for i, tool := range tools {
info := tool.Info()
openaiTools[i] = openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: info.Name,
Description: openai.String(info.Description),
Parameters: openai.FunctionParameters{
"type": "object",
"properties": info.Parameters,
"required": info.Required,
},
},
}
}
return openaiTools
}
func (o *openaiClient) finishReason(reason string) message.FinishReason {
switch reason {
@@ -160,190 +87,6 @@ func (o *openaiClient) finishReason(reason string) message.FinishReason {
}
}
func (o *openaiClient) preparedParams(messages []openai.ChatCompletionMessageParamUnion, tools []openai.ChatCompletionToolParam) openai.ChatCompletionNewParams {
params := openai.ChatCompletionNewParams{
Model: openai.ChatModel(o.providerOptions.model.APIModel),
Messages: messages,
Tools: tools,
}
if o.providerOptions.model.CanReason == true {
params.MaxCompletionTokens = openai.Int(o.providerOptions.maxTokens)
switch o.options.reasoningEffort {
case "low":
params.ReasoningEffort = shared.ReasoningEffortLow
case "medium":
params.ReasoningEffort = shared.ReasoningEffortMedium
case "high":
params.ReasoningEffort = shared.ReasoningEffortHigh
default:
params.ReasoningEffort = shared.ReasoningEffortMedium
}
} else {
params.MaxTokens = openai.Int(o.providerOptions.maxTokens)
}
if o.providerOptions.model.Provider == models.ProviderOpenRouter {
params.WithExtraFields(map[string]any{
"provider": map[string]any{
"require_parameters": true,
},
})
}
return params
}
func (o *openaiClient) send(ctx context.Context, messages []message.Message, tools []tools.BaseTool) (response *ProviderResponse, err error) {
params := o.preparedParams(o.convertMessages(messages), o.convertTools(tools))
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
for {
attempts++
openaiResponse, err := o.client.Chat.Completions.New(
ctx,
params,
)
// If there is an error we are going to see if we can retry the call
if err != nil {
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
return nil, retryErr
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
return nil, ctx.Err()
case <-time.After(duration):
continue
}
}
return nil, retryErr
}
content := ""
if openaiResponse.Choices[0].Message.Content != "" {
content = openaiResponse.Choices[0].Message.Content
}
toolCalls := o.toolCalls(*openaiResponse)
finishReason := o.finishReason(string(openaiResponse.Choices[0].FinishReason))
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
return &ProviderResponse{
Content: content,
ToolCalls: toolCalls,
Usage: o.usage(*openaiResponse),
FinishReason: finishReason,
}, nil
}
}
func (o *openaiClient) stream(ctx context.Context, messages []message.Message, tools []tools.BaseTool) <-chan ProviderEvent {
params := o.preparedParams(o.convertMessages(messages), o.convertTools(tools))
params.StreamOptions = openai.ChatCompletionStreamOptionsParam{
IncludeUsage: openai.Bool(true),
}
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
eventChan := make(chan ProviderEvent)
go func() {
for {
attempts++
openaiStream := o.client.Chat.Completions.NewStreaming(
ctx,
params,
)
acc := openai.ChatCompletionAccumulator{}
currentContent := ""
toolCalls := make([]message.ToolCall, 0)
for openaiStream.Next() {
chunk := openaiStream.Current()
acc.AddChunk(chunk)
for _, choice := range chunk.Choices {
if choice.Delta.Content != "" {
eventChan <- ProviderEvent{
Type: EventContentDelta,
Content: choice.Delta.Content,
}
currentContent += choice.Delta.Content
}
}
}
err := openaiStream.Err()
if err == nil || errors.Is(err, io.EOF) {
// Stream completed successfully
finishReason := o.finishReason(string(acc.ChatCompletion.Choices[0].FinishReason))
if len(acc.ChatCompletion.Choices[0].Message.ToolCalls) > 0 {
toolCalls = append(toolCalls, o.toolCalls(acc.ChatCompletion)...)
}
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
eventChan <- ProviderEvent{
Type: EventComplete,
Response: &ProviderResponse{
Content: currentContent,
ToolCalls: toolCalls,
Usage: o.usage(acc.ChatCompletion),
FinishReason: finishReason,
},
}
close(eventChan)
return
}
// If there is an error we are going to see if we can retry the call
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
eventChan <- ProviderEvent{Type: EventError, Error: retryErr}
close(eventChan)
return
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
// context cancelled
if ctx.Err() == nil {
eventChan <- ProviderEvent{Type: EventError, Error: ctx.Err()}
}
close(eventChan)
return
case <-time.After(duration):
continue
}
}
eventChan <- ProviderEvent{Type: EventError, Error: retryErr}
close(eventChan)
return
}
}()
return eventChan
}
func (o *openaiClient) shouldRetry(attempts int, err error) (bool, int64, error) {
var apierr *openai.Error
@@ -373,36 +116,6 @@ func (o *openaiClient) shouldRetry(attempts int, err error) (bool, int64, error)
return true, int64(retryMs), nil
}
func (o *openaiClient) toolCalls(completion openai.ChatCompletion) []message.ToolCall {
var toolCalls []message.ToolCall
if len(completion.Choices) > 0 && len(completion.Choices[0].Message.ToolCalls) > 0 {
for _, call := range completion.Choices[0].Message.ToolCalls {
toolCall := message.ToolCall{
ID: call.ID,
Name: call.Function.Name,
Input: call.Function.Arguments,
Type: "function",
Finished: true,
}
toolCalls = append(toolCalls, toolCall)
}
}
return toolCalls
}
func (o *openaiClient) usage(completion openai.ChatCompletion) TokenUsage {
cachedTokens := completion.Usage.PromptTokensDetails.CachedTokens
inputTokens := completion.Usage.PromptTokens - cachedTokens
return TokenUsage{
InputTokens: inputTokens,
OutputTokens: completion.Usage.CompletionTokens,
CacheCreationTokens: 0, // OpenAI doesn't provide this directly
CacheReadTokens: cachedTokens,
}
}
func WithOpenAIBaseURL(baseURL string) OpenAIOption {
return func(options *openaiOptions) {


@@ -0,0 +1,317 @@
package provider
import (
"context"
"encoding/json"
"errors"
"fmt"
"io"
"log/slog"
"time"
"github.com/openai/openai-go"
"github.com/openai/openai-go/shared"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/message"
"github.com/sst/opencode/internal/status"
)
func (o *openaiClient) convertMessagesToChatCompletionMessages(messages []message.Message) (openaiMessages []openai.ChatCompletionMessageParamUnion) {
// Add system message first
openaiMessages = append(openaiMessages, openai.SystemMessage(o.providerOptions.systemMessage))
for _, msg := range messages {
switch msg.Role {
case message.User:
var content []openai.ChatCompletionContentPartUnionParam
textBlock := openai.ChatCompletionContentPartTextParam{Text: msg.Content().String()}
content = append(content, openai.ChatCompletionContentPartUnionParam{OfText: &textBlock})
for _, binaryContent := range msg.BinaryContent() {
imageURL := openai.ChatCompletionContentPartImageImageURLParam{URL: binaryContent.String(models.ProviderOpenAI)}
imageBlock := openai.ChatCompletionContentPartImageParam{ImageURL: imageURL}
content = append(content, openai.ChatCompletionContentPartUnionParam{OfImageURL: &imageBlock})
}
openaiMessages = append(openaiMessages, openai.UserMessage(content))
case message.Assistant:
assistantMsg := openai.ChatCompletionAssistantMessageParam{
Role: "assistant",
}
if msg.Content().String() != "" {
assistantMsg.Content = openai.ChatCompletionAssistantMessageParamContentUnion{
OfString: openai.String(msg.Content().String()),
}
}
if len(msg.ToolCalls()) > 0 {
assistantMsg.ToolCalls = make([]openai.ChatCompletionMessageToolCallParam, len(msg.ToolCalls()))
for i, call := range msg.ToolCalls() {
assistantMsg.ToolCalls[i] = openai.ChatCompletionMessageToolCallParam{
ID: call.ID,
Type: "function",
Function: openai.ChatCompletionMessageToolCallFunctionParam{
Name: call.Name,
Arguments: call.Input,
},
}
}
}
openaiMessages = append(openaiMessages, openai.ChatCompletionMessageParamUnion{
OfAssistant: &assistantMsg,
})
case message.Tool:
for _, result := range msg.ToolResults() {
openaiMessages = append(openaiMessages,
openai.ToolMessage(result.Content, result.ToolCallID),
)
}
}
}
return
}
func (o *openaiClient) convertToChatCompletionTools(tools []tools.BaseTool) []openai.ChatCompletionToolParam {
openaiTools := make([]openai.ChatCompletionToolParam, len(tools))
for i, tool := range tools {
info := tool.Info()
openaiTools[i] = openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: info.Name,
Description: openai.String(info.Description),
Parameters: openai.FunctionParameters{
"type": "object",
"properties": info.Parameters,
"required": info.Required,
},
},
}
}
return openaiTools
}
func (o *openaiClient) preparedChatCompletionParams(messages []openai.ChatCompletionMessageParamUnion, tools []openai.ChatCompletionToolParam) openai.ChatCompletionNewParams {
params := openai.ChatCompletionNewParams{
Model: openai.ChatModel(o.providerOptions.model.APIModel),
Messages: messages,
Tools: tools,
}
if o.providerOptions.model.CanReason {
params.MaxCompletionTokens = openai.Int(o.providerOptions.maxTokens)
switch o.options.reasoningEffort {
case "low":
params.ReasoningEffort = shared.ReasoningEffortLow
case "medium":
params.ReasoningEffort = shared.ReasoningEffortMedium
case "high":
params.ReasoningEffort = shared.ReasoningEffortHigh
default:
params.ReasoningEffort = shared.ReasoningEffortMedium
}
} else {
params.MaxTokens = openai.Int(o.providerOptions.maxTokens)
}
if o.providerOptions.model.Provider == models.ProviderOpenRouter {
params.WithExtraFields(map[string]any{
"provider": map[string]any{
"require_parameters": true,
},
})
}
return params
}
func (o *openaiClient) sendChatcompletionMessage(ctx context.Context, messages []message.Message, tools []tools.BaseTool) (response *ProviderResponse, err error) {
params := o.preparedChatCompletionParams(o.convertMessagesToChatCompletionMessages(messages), o.convertToChatCompletionTools(tools))
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
for {
attempts++
openaiResponse, err := o.client.Chat.Completions.New(
ctx,
params,
)
// If there is an error we are going to see if we can retry the call
if err != nil {
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
return nil, retryErr
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
return nil, ctx.Err()
case <-time.After(duration):
continue
}
}
return nil, err // not retryable: retryErr is nil on this path, so surface the original error
}
content := ""
if openaiResponse.Choices[0].Message.Content != "" {
content = openaiResponse.Choices[0].Message.Content
}
toolCalls := o.chatCompletionToolCalls(*openaiResponse)
finishReason := o.finishReason(string(openaiResponse.Choices[0].FinishReason))
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
return &ProviderResponse{
Content: content,
ToolCalls: toolCalls,
Usage: o.usage(*openaiResponse),
FinishReason: finishReason,
}, nil
}
}
func (o *openaiClient) streamChatCompletionMessages(ctx context.Context, messages []message.Message, tools []tools.BaseTool) <-chan ProviderEvent {
params := o.preparedChatCompletionParams(o.convertMessagesToChatCompletionMessages(messages), o.convertToChatCompletionTools(tools))
params.StreamOptions = openai.ChatCompletionStreamOptionsParam{
IncludeUsage: openai.Bool(true),
}
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
eventChan := make(chan ProviderEvent)
go func() {
for {
attempts++
openaiStream := o.client.Chat.Completions.NewStreaming(
ctx,
params,
)
acc := openai.ChatCompletionAccumulator{}
currentContent := ""
toolCalls := make([]message.ToolCall, 0)
for openaiStream.Next() {
chunk := openaiStream.Current()
acc.AddChunk(chunk)
for _, choice := range chunk.Choices {
if choice.Delta.Content != "" {
eventChan <- ProviderEvent{
Type: EventContentDelta,
Content: choice.Delta.Content,
}
currentContent += choice.Delta.Content
}
}
}
err := openaiStream.Err()
if err == nil || errors.Is(err, io.EOF) {
// Stream completed successfully
finishReason := o.finishReason(string(acc.ChatCompletion.Choices[0].FinishReason))
if len(acc.ChatCompletion.Choices[0].Message.ToolCalls) > 0 {
toolCalls = append(toolCalls, o.chatCompletionToolCalls(acc.ChatCompletion)...)
}
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
eventChan <- ProviderEvent{
Type: EventComplete,
Response: &ProviderResponse{
Content: currentContent,
ToolCalls: toolCalls,
Usage: o.usage(acc.ChatCompletion),
FinishReason: finishReason,
},
}
close(eventChan)
return
}
// If there is an error we are going to see if we can retry the call
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
eventChan <- ProviderEvent{Type: EventError, Error: retryErr}
close(eventChan)
return
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
// context cancelled
if ctx.Err() != nil {
eventChan <- ProviderEvent{Type: EventError, Error: ctx.Err()}
}
close(eventChan)
return
case <-time.After(duration):
continue
}
}
eventChan <- ProviderEvent{Type: EventError, Error: err} // retryErr is nil here; report the original error
close(eventChan)
return
}
}()
return eventChan
}
func (o *openaiClient) chatCompletionToolCalls(completion openai.ChatCompletion) []message.ToolCall {
var toolCalls []message.ToolCall
if len(completion.Choices) > 0 && len(completion.Choices[0].Message.ToolCalls) > 0 {
for _, call := range completion.Choices[0].Message.ToolCalls {
toolCall := message.ToolCall{
ID: call.ID,
Name: call.Function.Name,
Input: call.Function.Arguments,
Type: "function",
Finished: true,
}
toolCalls = append(toolCalls, toolCall)
}
}
return toolCalls
}
func (o *openaiClient) usage(completion openai.ChatCompletion) TokenUsage {
cachedTokens := completion.Usage.PromptTokensDetails.CachedTokens
inputTokens := completion.Usage.PromptTokens - cachedTokens
return TokenUsage{
InputTokens: inputTokens,
OutputTokens: completion.Usage.CompletionTokens,
CacheCreationTokens: 0, // OpenAI doesn't provide this directly
CacheReadTokens: cachedTokens,
}
}


@@ -0,0 +1,393 @@
package provider
import (
"github.com/openai/openai-go"
"github.com/openai/openai-go/responses"
"github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/message"
"context"
"encoding/json"
"errors"
"fmt"
"io"
"time"
"log/slog"
"github.com/openai/openai-go/shared"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/status"
)
func (o *openaiClient) convertMessagesToResponseParams(messages []message.Message) responses.ResponseInputParam {
inputItems := responses.ResponseInputParam{}
inputItems = append(inputItems, responses.ResponseInputItemUnionParam{
OfMessage: &responses.EasyInputMessageParam{
Content: responses.EasyInputMessageContentUnionParam{OfString: openai.String(o.providerOptions.systemMessage)},
Role: responses.EasyInputMessageRoleSystem,
},
})
for _, msg := range messages {
switch msg.Role {
case message.User:
inputItemContentList := responses.ResponseInputMessageContentListParam{
responses.ResponseInputContentUnionParam{
OfInputText: &responses.ResponseInputTextParam{
Text: msg.Content().String(),
},
},
}
for _, binaryContent := range msg.BinaryContent() {
inputItemContentList = append(inputItemContentList, responses.ResponseInputContentUnionParam{
OfInputImage: &responses.ResponseInputImageParam{
ImageURL: openai.String(binaryContent.String(models.ProviderOpenAI)),
},
})
}
userMsg := responses.ResponseInputItemUnionParam{
OfInputMessage: &responses.ResponseInputItemMessageParam{
Content: inputItemContentList,
Role: string(responses.ResponseInputMessageItemRoleUser),
},
}
inputItems = append(inputItems, userMsg)
case message.Assistant:
if msg.Content().String() != "" {
assistantMsg := responses.ResponseInputItemUnionParam{
OfOutputMessage: &responses.ResponseOutputMessageParam{
Content: []responses.ResponseOutputMessageContentUnionParam{{
OfOutputText: &responses.ResponseOutputTextParam{
Text: msg.Content().String(),
},
}},
},
}
inputItems = append(inputItems, assistantMsg)
}
if len(msg.ToolCalls()) > 0 {
for _, call := range msg.ToolCalls() {
toolMsg := responses.ResponseInputItemUnionParam{
OfFunctionCall: &responses.ResponseFunctionToolCallParam{
CallID: call.ID,
Name: call.Name,
Arguments: call.Input,
},
}
inputItems = append(inputItems, toolMsg)
}
}
case message.Tool:
for _, result := range msg.ToolResults() {
toolMsg := responses.ResponseInputItemUnionParam{
OfFunctionCallOutput: &responses.ResponseInputItemFunctionCallOutputParam{
Output: result.Content,
CallID: result.ToolCallID,
},
}
inputItems = append(inputItems, toolMsg)
}
}
}
return inputItems
}
func (o *openaiClient) convertToResponseTools(tools []tools.BaseTool) []responses.ToolUnionParam {
outputTools := make([]responses.ToolUnionParam, len(tools))
for i, tool := range tools {
info := tool.Info()
outputTools[i] = responses.ToolUnionParam{
OfFunction: &responses.FunctionToolParam{
Name: info.Name,
Description: openai.String(info.Description),
Parameters: map[string]any{
"type": "object",
"properties": info.Parameters,
"required": info.Required,
},
},
}
}
return outputTools
}
func (o *openaiClient) preparedResponseParams(input responses.ResponseInputParam, tools []responses.ToolUnionParam) responses.ResponseNewParams {
params := responses.ResponseNewParams{
Model: shared.ResponsesModel(o.providerOptions.model.APIModel),
Input: responses.ResponseNewParamsInputUnion{OfInputItemList: input},
Tools: tools,
}
params.MaxOutputTokens = openai.Int(o.providerOptions.maxTokens)
if o.providerOptions.model.CanReason {
switch o.options.reasoningEffort {
case "low":
params.Reasoning.Effort = shared.ReasoningEffortLow
case "medium":
params.Reasoning.Effort = shared.ReasoningEffortMedium
case "high":
params.Reasoning.Effort = shared.ReasoningEffortHigh
default:
params.Reasoning.Effort = shared.ReasoningEffortMedium
}
}
if o.providerOptions.model.Provider == models.ProviderOpenRouter {
params.WithExtraFields(map[string]any{
"provider": map[string]any{
"require_parameters": true,
},
})
}
return params
}
func (o *openaiClient) sendResponseMessages(ctx context.Context, messages []message.Message, tools []tools.BaseTool) (response *ProviderResponse, err error) {
params := o.preparedResponseParams(o.convertMessagesToResponseParams(messages), o.convertToResponseTools(tools))
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
for {
attempts++
openaiResponse, err := o.client.Responses.New(
ctx,
params,
)
// If there is an error we are going to see if we can retry the call
if err != nil {
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
return nil, retryErr
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
return nil, ctx.Err()
case <-time.After(duration):
continue
}
}
return nil, err // not retryable: retryErr is nil on this path, so surface the original error
}
content := ""
if openaiResponse.OutputText() != "" {
content = openaiResponse.OutputText()
}
toolCalls := o.responseToolCalls(*openaiResponse)
finishReason := o.finishReason("stop")
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
return &ProviderResponse{
Content: content,
ToolCalls: toolCalls,
Usage: o.responseUsage(*openaiResponse),
FinishReason: finishReason,
}, nil
}
}
func (o *openaiClient) streamResponseMessages(ctx context.Context, messages []message.Message, tools []tools.BaseTool) <-chan ProviderEvent {
eventChan := make(chan ProviderEvent)
params := o.preparedResponseParams(o.convertMessagesToResponseParams(messages), o.convertToResponseTools(tools))
cfg := config.Get()
if cfg.Debug {
jsonData, _ := json.Marshal(params)
slog.Debug("Prepared messages", "messages", string(jsonData))
}
attempts := 0
go func() {
for {
attempts++
stream := o.client.Responses.NewStreaming(ctx, params)
outputText := ""
currentToolCallID := ""
for stream.Next() {
event := stream.Current()
switch event := event.AsAny().(type) {
case responses.ResponseCompletedEvent:
toolCalls := o.responseToolCalls(event.Response)
finishReason := o.finishReason("stop")
if len(toolCalls) > 0 {
finishReason = message.FinishReasonToolUse
}
eventChan <- ProviderEvent{
Type: EventComplete,
Response: &ProviderResponse{
Content: outputText,
ToolCalls: toolCalls,
Usage: o.responseUsage(event.Response),
FinishReason: finishReason,
},
}
close(eventChan)
return
case responses.ResponseTextDeltaEvent:
outputText += event.Delta
eventChan <- ProviderEvent{
Type: EventContentDelta,
Content: event.Delta,
}
case responses.ResponseTextDoneEvent:
eventChan <- ProviderEvent{
Type: EventContentStop,
Content: outputText,
}
close(eventChan)
return
case responses.ResponseOutputItemAddedEvent:
if event.Item.Type == "function_call" {
currentToolCallID = event.Item.ID
eventChan <- ProviderEvent{
Type: EventToolUseStart,
ToolCall: &message.ToolCall{
ID: event.Item.ID,
Name: event.Item.Name,
Finished: false,
},
}
}
case responses.ResponseFunctionCallArgumentsDeltaEvent:
if event.ItemID == currentToolCallID {
eventChan <- ProviderEvent{
Type: EventToolUseDelta,
ToolCall: &message.ToolCall{
ID: currentToolCallID,
Finished: false,
Input: event.Delta,
},
}
}
case responses.ResponseFunctionCallArgumentsDoneEvent:
if event.ItemID == currentToolCallID {
eventChan <- ProviderEvent{
Type: EventToolUseStop,
ToolCall: &message.ToolCall{
ID: currentToolCallID,
Input: event.Arguments,
},
}
currentToolCallID = ""
}
case responses.ResponseOutputItemDoneEvent:
if event.Item.Type == "function_call" {
eventChan <- ProviderEvent{
Type: EventToolUseStop,
ToolCall: &message.ToolCall{
ID: event.Item.ID,
Name: event.Item.Name,
Input: event.Item.Arguments,
Finished: true,
},
}
currentToolCallID = ""
}
}
}
err := stream.Err()
if err == nil || errors.Is(err, io.EOF) {
close(eventChan)
return
}
// If there is an error we are going to see if we can retry the call
retry, after, retryErr := o.shouldRetry(attempts, err)
duration := time.Duration(after) * time.Millisecond
if retryErr != nil {
eventChan <- ProviderEvent{Type: EventError, Error: retryErr}
close(eventChan)
return
}
if retry {
status.Warn(fmt.Sprintf("Retrying due to rate limit... attempt %d of %d", attempts, maxRetries), status.WithDuration(duration))
select {
case <-ctx.Done():
// context cancelled
if ctx.Err() != nil {
eventChan <- ProviderEvent{Type: EventError, Error: ctx.Err()}
}
close(eventChan)
return
case <-time.After(duration):
continue
}
}
eventChan <- ProviderEvent{Type: EventError, Error: err} // retryErr is nil here; report the original error
close(eventChan)
return
}
}()
return eventChan
}
func (o *openaiClient) responseToolCalls(response responses.Response) []message.ToolCall {
var toolCalls []message.ToolCall
for _, output := range response.Output {
if output.Type == "function_call" {
call := output.AsFunctionCall()
toolCall := message.ToolCall{
ID: call.ID,
Name: call.Name,
Input: call.Arguments,
Type: "function",
Finished: true,
}
toolCalls = append(toolCalls, toolCall)
}
}
return toolCalls
}
func (o *openaiClient) responseUsage(response responses.Response) TokenUsage {
cachedTokens := response.Usage.InputTokensDetails.CachedTokens
inputTokens := response.Usage.InputTokens - cachedTokens
return TokenUsage{
InputTokens: inputTokens,
OutputTokens: response.Usage.OutputTokens,
CacheCreationTokens: 0, // OpenAI doesn't provide this directly
CacheReadTokens: cachedTokens,
}
}
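Both the blocking and streaming paths above funnel into the same ProviderEvent contract. A minimal consumption sketch (hypothetical caller, not part of this commit; `msgs` and `toolSet` are assumed to be prepared elsewhere):

```go
for ev := range o.streamResponseMessages(ctx, msgs, toolSet) {
	switch ev.Type {
	case EventContentDelta:
		fmt.Print(ev.Content) // incremental assistant text
	case EventToolUseStop:
		slog.Info("tool call finished", "id", ev.ToolCall.ID)
	case EventError:
		slog.Error("stream failed", "error", ev.Error)
	case EventComplete:
		slog.Info("stream complete", "output_tokens", ev.Response.Usage.OutputTokens)
	}
}
```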


@@ -3,11 +3,11 @@ package provider
import (
"context"
"fmt"
"log/slog"
"github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/message"
-"log/slog"
)
type EventType string
@@ -123,6 +123,11 @@ func NewProvider(providerName models.ModelProvider, opts ...ProviderClientOption
options: clientOptions,
client: newAzureClient(clientOptions),
}, nil
case models.ProviderVertexAI:
return &baseProvider[VertexAIClient]{
options: clientOptions,
client: newVertexAIClient(clientOptions),
}, nil
case models.ProviderOpenRouter:
clientOptions.openaiOptions = append(clientOptions.openaiOptions,
WithOpenAIBaseURL("https://openrouter.ai/api/v1"),


@@ -0,0 +1,34 @@
package provider
import (
"context"
"log/slog"
"os"
"google.golang.org/genai"
)
type VertexAIClient ProviderClient
func newVertexAIClient(opts providerClientOptions) VertexAIClient {
geminiOpts := geminiOptions{}
for _, o := range opts.geminiOptions {
o(&geminiOpts)
}
client, err := genai.NewClient(context.Background(), &genai.ClientConfig{
Project: os.Getenv("VERTEXAI_PROJECT"),
Location: os.Getenv("VERTEXAI_LOCATION"),
Backend: genai.BackendVertexAI,
})
if err != nil {
slog.Error("Failed to create VertexAI client", "error", err)
return nil
}
return &geminiClient{
providerOptions: opts,
options: geminiOpts,
client: client,
}
}
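The VertexAI client above is configured entirely through two environment variables, and newVertexAIClient returns nil on failure. A small preflight check along these lines (a sketch, not part of this commit) surfaces misconfiguration early:

```go
// Warn before construction if the VertexAI backend cannot be configured.
for _, name := range []string{"VERTEXAI_PROJECT", "VERTEXAI_LOCATION"} {
	if os.Getenv(name) == "" {
		slog.Warn("VertexAI provider requires environment variable", "name", name)
	}
}
```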


@@ -196,16 +196,11 @@ func (e *editTool) createNewFile(ctx context.Context, filePath, content string)
content,
filePath,
)
-rootDir := config.WorkingDirectory()
-permissionPath := filepath.Dir(filePath)
-if strings.HasPrefix(filePath, rootDir) {
-permissionPath = rootDir
-}
p := e.permissions.Request(
ctx,
permission.CreatePermissionRequest{
SessionID: sessionID,
-Path: permissionPath,
Path: filePath,
ToolName: EditToolName,
Action: "write",
Description: fmt.Sprintf("Create file %s", filePath),
@@ -308,16 +303,11 @@ func (e *editTool) deleteContent(ctx context.Context, filePath, oldString string
filePath,
)
-rootDir := config.WorkingDirectory()
-permissionPath := filepath.Dir(filePath)
-if strings.HasPrefix(filePath, rootDir) {
-permissionPath = rootDir
-}
p := e.permissions.Request(
ctx,
permission.CreatePermissionRequest{
SessionID: sessionID,
-Path: permissionPath,
Path: filePath,
ToolName: EditToolName,
Action: "write",
Description: fmt.Sprintf("Delete content from file %s", filePath),
@@ -429,16 +419,11 @@ func (e *editTool) replaceContent(ctx context.Context, filePath, oldString, newS
newContent,
filePath,
)
-rootDir := config.WorkingDirectory()
-permissionPath := filepath.Dir(filePath)
-if strings.HasPrefix(filePath, rootDir) {
-permissionPath = rootDir
-}
p := e.permissions.Request(
ctx,
permission.CreatePermissionRequest{
SessionID: sessionID,
-Path: permissionPath,
Path: filePath,
ToolName: EditToolName,
Action: "write",
Description: fmt.Sprintf("Replace content in file %s", filePath),


@@ -12,6 +12,7 @@ import (
"syscall" "syscall"
"time" "time"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/status" "github.com/sst/opencode/internal/status"
) )
@@ -59,12 +60,27 @@ func GetPersistentShell(workingDir string) *PersistentShell {
} }
func newPersistentShell(cwd string) *PersistentShell { func newPersistentShell(cwd string) *PersistentShell {
shellPath := os.Getenv("SHELL") cfg := config.Get()
// Use shell from config if specified
shellPath := ""
shellArgs := []string{"-l"}
if cfg != nil && cfg.Shell.Path != "" {
shellPath = cfg.Shell.Path
if len(cfg.Shell.Args) > 0 {
shellArgs = cfg.Shell.Args
}
} else {
// Fall back to environment variable
shellPath = os.Getenv("SHELL")
if shellPath == "" { if shellPath == "" {
// Default to bash if neither config nor environment variable is set
shellPath = "/bin/bash" shellPath = "/bin/bash"
} }
}
cmd := exec.Command(shellPath, "-l") cmd := exec.Command(shellPath, shellArgs...)
cmd.Dir = cwd cmd.Dir = cwd
stdinPipe, err := cmd.StdinPipe() stdinPipe, err := cmd.StdinPipe()


@@ -6,7 +6,6 @@ import (
"fmt" "fmt"
"os" "os"
"path/filepath" "path/filepath"
"strings"
"time" "time"
"github.com/sst/opencode/internal/config" "github.com/sst/opencode/internal/config"
@@ -161,16 +160,11 @@ func (w *writeTool) Run(ctx context.Context, call ToolCall) (ToolResponse, error
filePath, filePath,
) )
rootDir := config.WorkingDirectory()
permissionPath := filepath.Dir(filePath)
if strings.HasPrefix(filePath, rootDir) {
permissionPath = rootDir
}
p := w.permissions.Request( p := w.permissions.Request(
ctx, ctx,
permission.CreatePermissionRequest{ permission.CreatePermissionRequest{
SessionID: sessionID, SessionID: sessionID,
Path: permissionPath, Path: filePath,
ToolName: WriteToolName, ToolName: WriteToolName,
Action: "write", Action: "write",
Description: fmt.Sprintf("Create file %s", filePath), Description: fmt.Sprintf("Create file %s", filePath),


@@ -2,6 +2,7 @@ package chat
import (
"fmt"
"log/slog"
"os"
"os/exec"
"slices"
@@ -16,6 +17,7 @@ import (
"github.com/sst/opencode/internal/message"
"github.com/sst/opencode/internal/status"
"github.com/sst/opencode/internal/tui/components/dialog"
"github.com/sst/opencode/internal/tui/image"
"github.com/sst/opencode/internal/tui/layout"
"github.com/sst/opencode/internal/tui/styles"
"github.com/sst/opencode/internal/tui/theme"
@@ -29,11 +31,17 @@ type editorCmp struct {
textarea textarea.Model
attachments []message.Attachment
deleteMode bool
history []string
historyIndex int
currentMessage string
}
type EditorKeyMaps struct {
Send key.Binding
OpenEditor key.Binding
Paste key.Binding
HistoryUp key.Binding
HistoryDown key.Binding
}
type bluredEditorKeyMaps struct {
@@ -56,6 +64,18 @@ var editorMaps = EditorKeyMaps{
key.WithKeys("ctrl+e"),
key.WithHelp("ctrl+e", "open editor"),
),
Paste: key.NewBinding(
key.WithKeys("ctrl+v"),
key.WithHelp("ctrl+v", "paste content"),
),
HistoryUp: key.NewBinding(
key.WithKeys("up"),
key.WithHelp("up", "previous message"),
),
HistoryDown: key.NewBinding(
key.WithKeys("down"),
key.WithHelp("down", "next message"),
),
}
var DeleteKeyMaps = DeleteAttachmentKeyMaps{
@@ -69,7 +89,7 @@ var DeleteKeyMaps = DeleteAttachmentKeyMaps{
),
DeleteAllAttachments: key.NewBinding(
key.WithKeys("r"),
-key.WithHelp("ctrl+r+r", "delete all attchments"),
key.WithHelp("ctrl+r+r", "delete all attachments"),
),
}
@@ -132,6 +152,15 @@ func (m *editorCmp) send() tea.Cmd {
m.textarea.Reset()
attachments := m.attachments
// Save to history if not empty and not a duplicate of the last entry
if value != "" {
if len(m.history) == 0 || m.history[len(m.history)-1] != value {
m.history = append(m.history, value)
}
m.historyIndex = len(m.history)
m.currentMessage = ""
}
m.attachments = nil
if value == "" {
return nil
@@ -200,6 +229,67 @@ func (m *editorCmp) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
m.deleteMode = false
return m, nil
}
if key.Matches(msg, editorMaps.Paste) {
imageBytes, text, err := image.GetImageFromClipboard()
if err != nil {
slog.Error(err.Error())
return m, cmd
}
if len(imageBytes) != 0 {
attachmentName := fmt.Sprintf("clipboard-image-%d", len(m.attachments))
attachment := message.Attachment{FilePath: attachmentName, FileName: attachmentName, Content: imageBytes, MimeType: "image/png"}
m.attachments = append(m.attachments, attachment)
} else {
m.textarea.SetValue(m.textarea.Value() + text)
}
return m, cmd
}
// Handle history navigation with up/down arrow keys
// Only handle history navigation if the filepicker is not open
if m.textarea.Focused() && key.Matches(msg, editorMaps.HistoryUp) && !m.app.IsFilepickerOpen() {
// Get the current line number
currentLine := m.textarea.Line()
// Only navigate history if we're at the first line
if currentLine == 0 && len(m.history) > 0 {
// Save current message if we're just starting to navigate
if m.historyIndex == len(m.history) {
m.currentMessage = m.textarea.Value()
}
// Go to previous message in history
if m.historyIndex > 0 {
m.historyIndex--
m.textarea.SetValue(m.history[m.historyIndex])
}
return m, nil
}
}
if m.textarea.Focused() && key.Matches(msg, editorMaps.HistoryDown) && !m.app.IsFilepickerOpen() {
// Get the current line number and total lines
currentLine := m.textarea.Line()
value := m.textarea.Value()
lines := strings.Split(value, "\n")
totalLines := len(lines)
// Only navigate history if we're at the last line
if currentLine == totalLines-1 {
if m.historyIndex < len(m.history)-1 {
// Go to next message in history
m.historyIndex++
m.textarea.SetValue(m.history[m.historyIndex])
} else if m.historyIndex == len(m.history)-1 {
// Return to the current message being composed
m.historyIndex = len(m.history)
m.textarea.SetValue(m.currentMessage)
}
return m, nil
}
}
// Handle Enter key
if m.textarea.Focused() && key.Matches(msg, editorMaps.Send) {
value := m.textarea.Value()
@@ -243,7 +333,6 @@ func (m *editorCmp) SetSize(width, height int) tea.Cmd {
m.height = height
m.textarea.SetWidth(width - 3) // account for the prompt and padding right
m.textarea.SetHeight(height)
-m.textarea.SetWidth(width)
return nil
}
@@ -316,5 +405,8 @@ func NewEditorCmp(app *app.App) tea.Model {
return &editorCmp{
app: app,
textarea: ta,
history: []string{},
historyIndex: 0,
currentMessage: "",
}
}


@@ -386,6 +386,8 @@ func (m *messagesCmp) help() string {
baseStyle.Foreground(t.TextMuted()).Bold(true).Render("+"), baseStyle.Foreground(t.TextMuted()).Bold(true).Render("+"),
baseStyle.Foreground(t.Text()).Bold(true).Render("enter"), baseStyle.Foreground(t.Text()).Bold(true).Render("enter"),
baseStyle.Foreground(t.TextMuted()).Bold(true).Render(" for newline,"), baseStyle.Foreground(t.TextMuted()).Bold(true).Render(" for newline,"),
baseStyle.Foreground(t.Text()).Bold(true).Render(" ↑↓"),
baseStyle.Foreground(t.TextMuted()).Bold(true).Render(" for history,"),
baseStyle.Foreground(t.Text()).Bold(true).Render(" ctrl+h"), baseStyle.Foreground(t.Text()).Bold(true).Render(" ctrl+h"),
baseStyle.Foreground(t.TextMuted()).Bold(true).Render(" to toggle tool messages"), baseStyle.Foreground(t.TextMuted()).Bold(true).Render(" to toggle tool messages"),
) )


@@ -71,8 +71,7 @@ func (m *sidebarCmp) View() string {
return baseStyle.
Width(m.width).
PaddingLeft(4).
-PaddingRight(2).
PaddingRight(1).
-Height(m.height - 1).
Render(
lipgloss.JoinVertical(
lipgloss.Top,
@@ -98,14 +97,9 @@ func (m *sidebarCmp) sessionSection() string {
sessionValue := baseStyle.
Foreground(t.Text()).
-Width(m.width - lipgloss.Width(sessionKey)).
Render(fmt.Sprintf(": %s", m.app.CurrentSession.Title))
-return lipgloss.JoinHorizontal(
-lipgloss.Left,
-sessionKey,
-sessionValue,
-)
return sessionKey + sessionValue
}
func (m *sidebarCmp) modifiedFile(filePath string, additions, removals int) string {


@@ -10,7 +10,6 @@ import (
"github.com/charmbracelet/lipgloss" "github.com/charmbracelet/lipgloss"
"github.com/sst/opencode/internal/config" "github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/llm/models" "github.com/sst/opencode/internal/llm/models"
"github.com/sst/opencode/internal/status"
"github.com/sst/opencode/internal/tui/layout" "github.com/sst/opencode/internal/tui/layout"
"github.com/sst/opencode/internal/tui/styles" "github.com/sst/opencode/internal/tui/styles"
"github.com/sst/opencode/internal/tui/theme" "github.com/sst/opencode/internal/tui/theme"
@@ -127,7 +126,6 @@ func (m *modelDialogCmp) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
m.switchProvider(1) m.switchProvider(1)
} }
case key.Matches(msg, modelKeys.Enter): case key.Matches(msg, modelKeys.Enter):
status.Info(fmt.Sprintf("selected model: %s", m.models[m.selectedIdx].Name))
return m, util.CmdHandler(ModelSelectedMsg{Model: m.models[m.selectedIdx]}) return m, util.CmdHandler(ModelSelectedMsg{Model: m.models[m.selectedIdx]})
case key.Matches(msg, modelKeys.Escape): case key.Matches(msg, modelKeys.Escape):
return m, util.CmdHandler(CloseModelDialogMsg{}) return m, util.CmdHandler(CloseModelDialogMsg{})

View File

@@ -6,6 +6,7 @@ import (
"github.com/charmbracelet/bubbles/viewport" "github.com/charmbracelet/bubbles/viewport"
tea "github.com/charmbracelet/bubbletea" tea "github.com/charmbracelet/bubbletea"
"github.com/charmbracelet/lipgloss" "github.com/charmbracelet/lipgloss"
"github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/diff" "github.com/sst/opencode/internal/diff"
"github.com/sst/opencode/internal/llm/tools" "github.com/sst/opencode/internal/llm/tools"
"github.com/sst/opencode/internal/permission" "github.com/sst/opencode/internal/permission"
@@ -13,6 +14,7 @@ import (
"github.com/sst/opencode/internal/tui/styles" "github.com/sst/opencode/internal/tui/styles"
"github.com/sst/opencode/internal/tui/theme" "github.com/sst/opencode/internal/tui/theme"
"github.com/sst/opencode/internal/tui/util" "github.com/sst/opencode/internal/tui/util"
"path/filepath"
"strings" "strings"
) )
@@ -204,10 +206,19 @@ func (p *permissionDialogCmp) renderHeader() string {
Render(fmt.Sprintf(": %s", p.permission.ToolName)) Render(fmt.Sprintf(": %s", p.permission.ToolName))
pathKey := baseStyle.Foreground(t.TextMuted()).Bold(true).Render("Path") pathKey := baseStyle.Foreground(t.TextMuted()).Bold(true).Render("Path")
// Get the current working directory to display relative path
relativePath := p.permission.Path
if filepath.IsAbs(relativePath) {
if cwd, err := filepath.Rel(config.WorkingDirectory(), relativePath); err == nil {
relativePath = cwd
}
}
pathValue := baseStyle. pathValue := baseStyle.
Foreground(t.Text()). Foreground(t.Text()).
Width(p.width - lipgloss.Width(pathKey)). Width(p.width - lipgloss.Width(pathKey)).
Render(fmt.Sprintf(": %s", p.permission.Path)) Render(fmt.Sprintf(": %s", relativePath))
headerParts := []string{ headerParts := []string{
lipgloss.JoinHorizontal( lipgloss.JoinHorizontal(


@@ -0,0 +1,178 @@
package dialog
import (
"github.com/charmbracelet/bubbles/key"
tea "github.com/charmbracelet/bubbletea"
"github.com/charmbracelet/lipgloss"
utilComponents "github.com/sst/opencode/internal/tui/components/util"
"github.com/sst/opencode/internal/tui/layout"
"github.com/sst/opencode/internal/tui/styles"
"github.com/sst/opencode/internal/tui/theme"
)
const (
maxToolsDialogWidth = 60
maxVisibleTools = 15
)
// ToolsDialog interface for the tools list dialog
type ToolsDialog interface {
tea.Model
layout.Bindings
SetTools(tools []string)
}
// ShowToolsDialogMsg is sent to show the tools dialog
type ShowToolsDialogMsg struct {
Show bool
}
// CloseToolsDialogMsg is sent when the tools dialog is closed
type CloseToolsDialogMsg struct{}
type toolItem struct {
name string
}
func (t toolItem) Render(selected bool, width int) string {
th := theme.CurrentTheme()
baseStyle := styles.BaseStyle().
Width(width).
Background(th.Background())
if selected {
baseStyle = baseStyle.
Background(th.Primary()).
Foreground(th.Background()).
Bold(true)
} else {
baseStyle = baseStyle.
Foreground(th.Text())
}
return baseStyle.Render(t.name)
}
type toolsDialogCmp struct {
tools []toolItem
width int
height int
list utilComponents.SimpleList[toolItem]
}
type toolsKeyMap struct {
Up key.Binding
Down key.Binding
Escape key.Binding
J key.Binding
K key.Binding
}
var toolsKeys = toolsKeyMap{
Up: key.NewBinding(
key.WithKeys("up"),
key.WithHelp("↑", "previous tool"),
),
Down: key.NewBinding(
key.WithKeys("down"),
key.WithHelp("↓", "next tool"),
),
Escape: key.NewBinding(
key.WithKeys("esc"),
key.WithHelp("esc", "close"),
),
J: key.NewBinding(
key.WithKeys("j"),
key.WithHelp("j", "next tool"),
),
K: key.NewBinding(
key.WithKeys("k"),
key.WithHelp("k", "previous tool"),
),
}
func (m *toolsDialogCmp) Init() tea.Cmd {
return nil
}
func (m *toolsDialogCmp) SetTools(tools []string) {
var toolItems []toolItem
for _, name := range tools {
toolItems = append(toolItems, toolItem{name: name})
}
m.tools = toolItems
m.list.SetItems(toolItems)
}
func (m *toolsDialogCmp) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
switch msg := msg.(type) {
case tea.KeyMsg:
switch {
case key.Matches(msg, toolsKeys.Escape):
return m, func() tea.Msg { return CloseToolsDialogMsg{} }
// Pass other key messages to the list component
default:
var cmd tea.Cmd
listModel, cmd := m.list.Update(msg)
m.list = listModel.(utilComponents.SimpleList[toolItem])
return m, cmd
}
case tea.WindowSizeMsg:
m.width = msg.Width
m.height = msg.Height
}
// For non-key messages
var cmd tea.Cmd
listModel, cmd := m.list.Update(msg)
m.list = listModel.(utilComponents.SimpleList[toolItem])
return m, cmd
}
func (m *toolsDialogCmp) View() string {
t := theme.CurrentTheme()
baseStyle := styles.BaseStyle().Background(t.Background())
title := baseStyle.
Foreground(t.Primary()).
Bold(true).
Width(maxToolsDialogWidth).
Padding(0, 0, 1).
Render("Available Tools")
// Calculate dialog width based on content
dialogWidth := min(maxToolsDialogWidth, m.width/2)
m.list.SetMaxWidth(dialogWidth)
content := lipgloss.JoinVertical(
lipgloss.Left,
title,
m.list.View(),
)
return baseStyle.Padding(1, 2).
Border(lipgloss.RoundedBorder()).
BorderBackground(t.Background()).
BorderForeground(t.TextMuted()).
Background(t.Background()).
Width(lipgloss.Width(content) + 4).
Render(content)
}
func (m *toolsDialogCmp) BindingKeys() []key.Binding {
return layout.KeyMapToSlice(toolsKeys)
}
func NewToolsDialogCmp() ToolsDialog {
list := utilComponents.NewSimpleList[toolItem](
[]toolItem{},
maxVisibleTools,
"No tools available",
true,
)
return &toolsDialogCmp{
list: list,
}
}


@@ -84,7 +84,7 @@ func (i *detailCmp) updateContent() {
messageStyle := lipgloss.NewStyle().Bold(true).Foreground(t.Text())
content.WriteString(messageStyle.Render("Message:"))
content.WriteString("\n")
-content.WriteString(lipgloss.NewStyle().Padding(0, 2).Render(i.currentLog.Message))
content.WriteString(lipgloss.NewStyle().Padding(0, 2).Width(i.width).Render(i.currentLog.Message))
content.WriteString("\n\n")
// Attributes section
@@ -112,7 +112,7 @@ func (i *detailCmp) updateContent() {
valueStyle.Render(value),
)
-content.WriteString(lipgloss.NewStyle().Padding(0, 2).Render(attrLine))
content.WriteString(lipgloss.NewStyle().Padding(0, 2).Width(i.width).Render(attrLine))
content.WriteString("\n")
}
}
@@ -123,7 +123,7 @@ func (i *detailCmp) updateContent() {
content.WriteString("\n")
content.WriteString(sessionStyle.Render("Session:"))
content.WriteString("\n")
-content.WriteString(lipgloss.NewStyle().Padding(0, 2).Render(i.currentLog.SessionID))
content.WriteString(lipgloss.NewStyle().Padding(0, 2).Width(i.width).Render(i.currentLog.SessionID))
}
i.viewport.SetContent(content.String())


@@ -0,0 +1,127 @@
package spinner
import (
"context"
"fmt"
"os"
"github.com/charmbracelet/bubbles/spinner"
tea "github.com/charmbracelet/bubbletea"
"github.com/charmbracelet/lipgloss"
)
// Spinner wraps the bubbles spinner for both interactive and non-interactive mode
type Spinner struct {
model spinner.Model
done chan struct{}
prog *tea.Program
ctx context.Context
cancel context.CancelFunc
}
// spinnerModel is the tea.Model for the spinner
type spinnerModel struct {
spinner spinner.Model
message string
quitting bool
}
func (m spinnerModel) Init() tea.Cmd {
return m.spinner.Tick
}
func (m spinnerModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
switch msg := msg.(type) {
case tea.KeyMsg:
m.quitting = true
return m, tea.Quit
case spinner.TickMsg:
var cmd tea.Cmd
m.spinner, cmd = m.spinner.Update(msg)
return m, cmd
case quitMsg:
m.quitting = true
return m, tea.Quit
default:
return m, nil
}
}
func (m spinnerModel) View() string {
if m.quitting {
return ""
}
return fmt.Sprintf("%s %s", m.spinner.View(), m.message)
}
// quitMsg is sent when we want to quit the spinner
type quitMsg struct{}
// NewSpinner creates a new spinner with the given message
func NewSpinner(message string) *Spinner {
s := spinner.New()
s.Spinner = spinner.Dot
s.Style = s.Style.Foreground(s.Style.GetForeground())
ctx, cancel := context.WithCancel(context.Background())
model := spinnerModel{
spinner: s,
message: message,
}
prog := tea.NewProgram(model, tea.WithOutput(os.Stderr), tea.WithoutCatchPanics())
return &Spinner{
model: s,
done: make(chan struct{}),
prog: prog,
ctx: ctx,
cancel: cancel,
}
}
// NewThemedSpinner creates a new spinner with the given message and color
func NewThemedSpinner(message string, color lipgloss.AdaptiveColor) *Spinner {
s := spinner.New()
s.Spinner = spinner.Dot
s.Style = s.Style.Foreground(color)
ctx, cancel := context.WithCancel(context.Background())
model := spinnerModel{
spinner: s,
message: message,
}
prog := tea.NewProgram(model, tea.WithOutput(os.Stderr), tea.WithoutCatchPanics())
return &Spinner{
model: s,
done: make(chan struct{}),
prog: prog,
ctx: ctx,
cancel: cancel,
}
}
// Start begins the spinner animation
func (s *Spinner) Start() {
go func() {
defer close(s.done)
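// The inner goroutine below turns context cancellation (triggered by
// Stop) into a quitMsg so the tea program shuts down cleanly.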
go func() {
<-s.ctx.Done()
s.prog.Send(quitMsg{})
}()
_, err := s.prog.Run()
if err != nil {
fmt.Fprintf(os.Stderr, "Error running spinner: %v\n", err)
}
}()
}
// Stop ends the spinner animation
func (s *Spinner) Stop() {
s.cancel()
<-s.done
}


@@ -0,0 +1,24 @@
package spinner
import (
"testing"
"time"
)
func TestSpinner(t *testing.T) {
t.Parallel()
// Create a spinner
s := NewSpinner("Test spinner")
// Start the spinner
s.Start()
// Wait a bit to let it run
time.Sleep(100 * time.Millisecond)
// Stop the spinner
s.Stop()
// If we got here without panicking, the test passes
}


@@ -0,0 +1,49 @@
//go:build !windows
package image
import (
"bytes"
"fmt"
"image"
"github.com/atotto/clipboard"
)
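// GetImageFromClipboard returns (imageBytes, "", nil) when the clipboard
// text decodes as an image, (nil, text, nil) for plain text, and a non-nil
// error only when the clipboard itself cannot be read.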
func GetImageFromClipboard() ([]byte, string, error) {
text, err := clipboard.ReadAll()
if err != nil {
return nil, "", fmt.Errorf("Error reading clipboard")
}
if text == "" {
return nil, "", nil
}
binaryData := []byte(text)
imageBytes, err := binaryToImage(binaryData)
if err != nil {
return nil, text, nil
}
return imageBytes, "", nil
}
func binaryToImage(data []byte) ([]byte, error) {
reader := bytes.NewReader(data)
img, _, err := image.Decode(reader)
if err != nil {
return nil, fmt.Errorf("Unable to covert bytes to image")
}
return ImageToBytes(img)
}
func min(a, b int) int {
if a < b {
return a
}
return b
}


@@ -0,0 +1,192 @@
//go:build windows
package image
import (
"bytes"
"fmt"
"image"
"image/color"
"log/slog"
"syscall"
"unsafe"
)
var (
user32 = syscall.NewLazyDLL("user32.dll")
kernel32 = syscall.NewLazyDLL("kernel32.dll")
openClipboard = user32.NewProc("OpenClipboard")
closeClipboard = user32.NewProc("CloseClipboard")
getClipboardData = user32.NewProc("GetClipboardData")
isClipboardFormatAvailable = user32.NewProc("IsClipboardFormatAvailable")
globalLock = kernel32.NewProc("GlobalLock")
globalUnlock = kernel32.NewProc("GlobalUnlock")
globalSize = kernel32.NewProc("GlobalSize")
)
const (
CF_TEXT = 1
CF_UNICODETEXT = 13
CF_DIB = 8
)
type BITMAPINFOHEADER struct {
BiSize uint32
BiWidth int32
BiHeight int32
BiPlanes uint16
BiBitCount uint16
BiCompression uint32
BiSizeImage uint32
BiXPelsPerMeter int32
BiYPelsPerMeter int32
BiClrUsed uint32
BiClrImportant uint32
}
func GetImageFromClipboard() ([]byte, string, error) {
ret, _, _ := openClipboard.Call(0)
if ret == 0 {
return nil, "", fmt.Errorf("failed to open clipboard")
}
defer func(closeClipboard *syscall.LazyProc, a ...uintptr) {
_, _, err := closeClipboard.Call(a...)
if err != nil {
slog.Error("close clipboard failed")
return
}
}(closeClipboard)
isTextAvailable, _, _ := isClipboardFormatAvailable.Call(uintptr(CF_TEXT))
isUnicodeTextAvailable, _, _ := isClipboardFormatAvailable.Call(uintptr(CF_UNICODETEXT))
if isTextAvailable != 0 || isUnicodeTextAvailable != 0 {
// Get text from clipboard
var formatToUse uintptr = CF_TEXT
if isUnicodeTextAvailable != 0 {
formatToUse = CF_UNICODETEXT
}
hClipboardText, _, _ := getClipboardData.Call(formatToUse)
if hClipboardText != 0 {
textPtr, _, _ := globalLock.Call(hClipboardText)
if textPtr != 0 {
defer func(globalUnlock *syscall.LazyProc, a ...uintptr) {
_, _, err := globalUnlock.Call(a...)
if err != nil {
slog.Error("Global unlock failed")
return
}
}(globalUnlock, hClipboardText)
// Get clipboard text
var clipboardText string
if formatToUse == CF_UNICODETEXT {
// Convert wide string to Go string
clipboardText = syscall.UTF16ToString((*[1 << 20]uint16)(unsafe.Pointer(textPtr))[:])
} else {
// Get size of ANSI text
size, _, _ := globalSize.Call(hClipboardText)
if size > 0 {
// Convert ANSI string to Go string
textBytes := make([]byte, size)
copy(textBytes, (*[1 << 20]byte)(unsafe.Pointer(textPtr))[:size:size])
clipboardText = bytesToString(textBytes)
}
}
// Check if the text is not empty
if clipboardText != "" {
return nil, clipboardText, nil
}
}
}
}
hClipboardData, _, _ := getClipboardData.Call(uintptr(CF_DIB))
if hClipboardData == 0 {
return nil, "", fmt.Errorf("failed to get clipboard data")
}
dataPtr, _, _ := globalLock.Call(hClipboardData)
if dataPtr == 0 {
return nil, "", fmt.Errorf("failed to lock clipboard data")
}
defer func(globalUnlock *syscall.LazyProc, a ...uintptr) {
_, _, err := globalUnlock.Call(a...)
if err != nil {
slog.Error("Global unlock failed")
return
}
}(globalUnlock, hClipboardData)
bmiHeader := (*BITMAPINFOHEADER)(unsafe.Pointer(dataPtr))
width := int(bmiHeader.BiWidth)
height := int(bmiHeader.BiHeight)
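// A negative BiHeight marks a top-down DIB; take the absolute value here
// and flip the row order via srcY in the pixel loop below.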
if height < 0 {
height = -height
}
bitsPerPixel := int(bmiHeader.BiBitCount)
img := image.NewRGBA(image.Rect(0, 0, width, height))
var bitsOffset uintptr
if bitsPerPixel <= 8 {
numColors := uint32(1) << bitsPerPixel
if bmiHeader.BiClrUsed > 0 {
numColors = bmiHeader.BiClrUsed
}
bitsOffset = unsafe.Sizeof(*bmiHeader) + uintptr(numColors*4)
} else {
bitsOffset = unsafe.Sizeof(*bmiHeader)
}
for y := range height {
for x := range width {
srcY := height - y - 1
if bmiHeader.BiHeight < 0 {
srcY = y
}
var pixelPointer unsafe.Pointer
var r, g, b, a uint8
switch bitsPerPixel {
case 24:
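// Each 24bpp scanline is padded to a 4-byte boundary:
// (width*3 + 3) &^ 3 rounds the row's byte width up to a multiple of 4.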
stride := (width*3 + 3) &^ 3
pixelPointer = unsafe.Pointer(dataPtr + bitsOffset + uintptr(srcY*stride+x*3))
b = *(*byte)(pixelPointer)
g = *(*byte)(unsafe.Add(pixelPointer, 1))
r = *(*byte)(unsafe.Add(pixelPointer, 2))
a = 255
case 32:
pixelPointer = unsafe.Pointer(dataPtr + bitsOffset + uintptr(srcY*width*4+x*4))
b = *(*byte)(pixelPointer)
g = *(*byte)(unsafe.Add(pixelPointer, 1))
r = *(*byte)(unsafe.Add(pixelPointer, 2))
a = *(*byte)(unsafe.Add(pixelPointer, 3))
if a == 0 {
a = 255
}
default:
return nil, "", fmt.Errorf("unsupported bit count: %d", bitsPerPixel)
}
img.Set(x, y, color.RGBA{R: r, G: g, B: b, A: a})
}
}
imageBytes, err := ImageToBytes(img)
if err != nil {
return nil, "", err
}
return imageBytes, "", nil
}
func bytesToString(b []byte) string {
i := bytes.IndexByte(b, 0)
if i == -1 {
return string(b)
}
return string(b[:i])
}


@@ -1,8 +1,10 @@
package image
import (
"bytes"
"fmt"
"image"
"image/png"
"os"
"strings"
@@ -71,3 +73,13 @@ func ImagePreview(width int, filename string) (string, error) {
return imageString, nil
}
func ImageToBytes(image image.Image) ([]byte, error) {
buf := new(bytes.Buffer)
err := png.Encode(buf, image)
if err != nil {
return nil, err
}
return buf.Bytes(), nil
}


@@ -11,16 +11,16 @@ type Container interface {
tea.Model
Sizeable
Bindings
-Focus() // Add focus method
Focus()
-Blur() // Add blur method
Blur()
}
type container struct {
width int
height int
content tea.Model
-// Style options
paddingTop int
paddingRight int
paddingBottom int
@@ -32,7 +32,7 @@ type container struct {
borderLeft bool
borderStyle lipgloss.Border
-focused bool // Track focus state
focused bool
}
func (c *container) Init() tea.Cmd {
@@ -152,16 +152,13 @@ func (c *container) Blur() {
type ContainerOption func(*container)
func NewContainer(content tea.Model, options ...ContainerOption) Container {
c := &container{
content: content,
borderStyle: lipgloss.NormalBorder(),
}
for _, option := range options {
option(c)
}
return c
}


@@ -202,6 +202,54 @@ func LoadCustomTheme(customTheme map[string]any) (Theme, error) {
theme.DiffAddedLineNumberBgColor = adaptiveColor
case "diffremovedlinenumberbg":
theme.DiffRemovedLineNumberBgColor = adaptiveColor
case "syntaxcomment":
theme.SyntaxCommentColor = adaptiveColor
case "syntaxkeyword":
theme.SyntaxKeywordColor = adaptiveColor
case "syntaxfunction":
theme.SyntaxFunctionColor = adaptiveColor
case "syntaxvariable":
theme.SyntaxVariableColor = adaptiveColor
case "syntaxstring":
theme.SyntaxStringColor = adaptiveColor
case "syntaxnumber":
theme.SyntaxNumberColor = adaptiveColor
case "syntaxtype":
theme.SyntaxTypeColor = adaptiveColor
case "syntaxoperator":
theme.SyntaxOperatorColor = adaptiveColor
case "syntaxpunctuation":
theme.SyntaxPunctuationColor = adaptiveColor
case "markdowntext":
theme.MarkdownTextColor = adaptiveColor
case "markdownheading":
theme.MarkdownHeadingColor = adaptiveColor
case "markdownlink":
theme.MarkdownLinkColor = adaptiveColor
case "markdownlinktext":
theme.MarkdownLinkTextColor = adaptiveColor
case "markdowncode":
theme.MarkdownCodeColor = adaptiveColor
case "markdownblockquote":
theme.MarkdownBlockQuoteColor = adaptiveColor
case "markdownemph":
theme.MarkdownEmphColor = adaptiveColor
case "markdownstrong":
theme.MarkdownStrongColor = adaptiveColor
case "markdownhorizontalrule":
theme.MarkdownHorizontalRuleColor = adaptiveColor
case "markdownlistitem":
theme.MarkdownListItemColor = adaptiveColor
case "markdownlistitemenum":
theme.MarkdownListEnumerationColor = adaptiveColor
case "markdownimage":
theme.MarkdownImageColor = adaptiveColor
case "markdownimagetext":
theme.MarkdownImageTextColor = adaptiveColor
case "markdowncodeblock":
theme.MarkdownCodeBlockColor = adaptiveColor
case "markdownlistenumeration":
theme.MarkdownListEnumerationColor = adaptiveColor
default:
slog.Warn("Unknown color key in custom theme", "key", key)
}
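For reference, a hedged sketch of a custom-theme map that exercises the new keys (keys are the lowercased names the switch above matches; values use the shapes ParseAdaptiveColor accepts, shown in the next file):

```go
customTheme := map[string]any{
	"syntaxkeyword": "#ff79c6",                                             // hex string
	"markdowncode":  float64(212),                                          // ANSI 0-255; JSON numbers decode as float64
	"syntaxcomment": map[string]any{"dark": "#6272a4", "light": "#777777"}, // per-mode map
}
theme, err := LoadCustomTheme(customTheme) // signature per the hunk header above
```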


@@ -235,7 +235,19 @@ func ParseAdaptiveColor(value any) (lipgloss.AdaptiveColor, error) {
}, nil
}
-// Case 2: Map with dark and light keys
// Case 2: Int value between 0 and 255
if numericVal, ok := value.(float64); ok {
intVal := int(numericVal)
if intVal < 0 || intVal > 255 {
return lipgloss.AdaptiveColor{}, fmt.Errorf("invalid int color value (must be between 0 and 255): %d", intVal)
}
return lipgloss.AdaptiveColor{
Dark: fmt.Sprintf("%d", intVal),
Light: fmt.Sprintf("%d", intVal),
}, nil
}
// Case 3: Map with dark and light keys
if colorMap, ok := value.(map[string]any); ok {
darkVal, darkOk := colorMap["dark"]
lightVal, lightOk := colorMap["light"]
@@ -248,7 +260,20 @@ func ParseAdaptiveColor(value any) (lipgloss.AdaptiveColor, error) {
darkHex, darkIsString := darkVal.(string)
lightHex, lightIsString := lightVal.(string)
if !darkIsString || !lightIsString {
-return lipgloss.AdaptiveColor{}, fmt.Errorf("color values must be strings")
darkVal, darkIsNumber := darkVal.(float64)
lightVal, lightIsNumber := lightVal.(float64)
if !darkIsNumber || !lightIsNumber {
return lipgloss.AdaptiveColor{}, fmt.Errorf("color map values must be strings or ints")
}
darkInt := int(darkVal)
lightInt := int(lightVal)
return lipgloss.AdaptiveColor{
Dark: fmt.Sprintf("%d", darkInt),
Light: fmt.Sprintf("%d", lightInt),
}, nil
}
if !hexColorRegex.MatchString(darkHex) || !hexColorRegex.MatchString(lightHex) {

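Taken together, ParseAdaptiveColor now accepts three value shapes. A minimal sketch (assuming only the signature shown in the hunk header above):

```go
c1, _ := ParseAdaptiveColor("#ff0000")    // Case 1: hex string
c2, _ := ParseAdaptiveColor(float64(205)) // Case 2: ANSI int 0-255 (JSON numbers arrive as float64)
c3, _ := ParseAdaptiveColor(map[string]any{ // Case 3: per-mode map
	"dark":  "#1e1e2e",
	"light": "#eff1f5",
})
```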

@@ -13,6 +13,7 @@ import (
"github.com/charmbracelet/lipgloss" "github.com/charmbracelet/lipgloss"
"github.com/sst/opencode/internal/app" "github.com/sst/opencode/internal/app"
"github.com/sst/opencode/internal/config" "github.com/sst/opencode/internal/config"
"github.com/sst/opencode/internal/llm/agent"
"github.com/sst/opencode/internal/logging" "github.com/sst/opencode/internal/logging"
"github.com/sst/opencode/internal/message" "github.com/sst/opencode/internal/message"
"github.com/sst/opencode/internal/permission" "github.com/sst/opencode/internal/permission"
@@ -38,6 +39,7 @@ type keyMap struct {
Filepicker key.Binding Filepicker key.Binding
Models key.Binding Models key.Binding
SwitchTheme key.Binding SwitchTheme key.Binding
Tools key.Binding
}
const (
@@ -81,6 +83,11 @@ var keys = keyMap{
key.WithKeys("ctrl+t"),
key.WithHelp("ctrl+t", "switch theme"),
),
Tools: key.NewBinding(
key.WithKeys("f9"),
key.WithHelp("f9", "show available tools"),
),
}
var helpEsc = key.NewBinding(
@@ -137,6 +144,9 @@ type appModel struct {
showMultiArgumentsDialog bool
multiArgumentsDialog dialog.MultiArgumentsDialogCmp
showToolsDialog bool
toolsDialog dialog.ToolsDialog
}
func (a appModel) Init() tea.Cmd {
@@ -162,6 +172,8 @@ func (a appModel) Init() tea.Cmd {
cmds = append(cmds, cmd)
cmd = a.themeDialog.Init()
cmds = append(cmds, cmd)
cmd = a.toolsDialog.Init()
cmds = append(cmds, cmd)
// Check if we should show the init dialog
cmds = append(cmds, func() tea.Msg {
@@ -288,6 +300,14 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
a.showThemeDialog = false
return a, nil
case dialog.CloseToolsDialogMsg:
a.showToolsDialog = false
return a, nil
case dialog.ShowToolsDialogMsg:
a.showToolsDialog = msg.Show
return a, nil
case dialog.ThemeChangedMsg:
a.pages[a.currentPage], cmd = a.pages[a.currentPage].Update(msg)
a.showThemeDialog = false
@@ -397,6 +417,7 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
if a.showFilepicker {
a.showFilepicker = false
a.filepicker.ToggleFilepicker(a.showFilepicker)
a.app.SetFilepickerOpen(a.showFilepicker)
}
if a.showModelDialog {
a.showModelDialog = false
@@ -404,9 +425,18 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
if a.showMultiArgumentsDialog {
a.showMultiArgumentsDialog = false
}
if a.showToolsDialog {
a.showToolsDialog = false
}
return a, nil
case key.Matches(msg, keys.SwitchSession):
if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions && !a.showCommandDialog {
// Close other dialogs
a.showToolsDialog = false
a.showThemeDialog = false
a.showModelDialog = false
a.showFilepicker = false
// Load sessions and show the dialog
sessions, err := a.app.Sessions.List(context.Background())
if err != nil {
@@ -424,6 +454,10 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return a, nil
case key.Matches(msg, keys.Commands):
if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions && !a.showSessionDialog && !a.showThemeDialog && !a.showFilepicker {
// Close other dialogs
a.showToolsDialog = false
a.showModelDialog = false
// Show commands dialog
if len(a.commands) == 0 {
status.Warn("No commands available")
@@ -440,22 +474,52 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return a, nil
}
if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions && !a.showSessionDialog && !a.showCommandDialog {
// Close other dialogs
a.showToolsDialog = false
a.showThemeDialog = false
a.showFilepicker = false
a.showModelDialog = true
return a, nil
}
return a, nil
case key.Matches(msg, keys.SwitchTheme): case key.Matches(msg, keys.SwitchTheme):
if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions && !a.showSessionDialog && !a.showCommandDialog { if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions && !a.showSessionDialog && !a.showCommandDialog {
// Close other dialogs
a.showToolsDialog = false
a.showModelDialog = false
a.showFilepicker = false
a.showThemeDialog = true
return a, a.themeDialog.Init()
}
return a, nil
case key.Matches(msg, keys.Tools):
// Check if any other dialog is open
if a.currentPage == page.ChatPage && !a.showQuit && !a.showPermissions &&
!a.showSessionDialog && !a.showCommandDialog && !a.showThemeDialog &&
!a.showFilepicker && !a.showModelDialog && !a.showInitDialog &&
!a.showMultiArgumentsDialog {
// Toggle tools dialog
a.showToolsDialog = !a.showToolsDialog
if a.showToolsDialog {
// Get tool names dynamically
toolNames := getAvailableToolNames(a.app)
a.toolsDialog.SetTools(toolNames)
}
return a, nil
}
return a, nil
case key.Matches(msg, returnKey) || key.Matches(msg):
if msg.String() == quitKey {
if a.currentPage == page.LogsPage {
return a, a.moveToPage(page.ChatPage)
}
} else if !a.filepicker.IsCWDFocused() {
if a.showToolsDialog {
a.showToolsDialog = false
return a, nil
}
if a.showQuit {
a.showQuit = !a.showQuit
return a, nil
@@ -476,6 +540,7 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
if a.showFilepicker {
a.showFilepicker = false
a.filepicker.ToggleFilepicker(a.showFilepicker)
a.app.SetFilepickerOpen(a.showFilepicker)
return a, nil return a, nil
} }
if a.currentPage == page.LogsPage { if a.currentPage == page.LogsPage {
@@ -490,6 +555,11 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return a, nil return a, nil
} }
a.showHelp = !a.showHelp a.showHelp = !a.showHelp
// Close other dialogs if opening help
if a.showHelp {
a.showToolsDialog = false
}
return a, nil
case key.Matches(msg, helpEsc):
if a.app.PrimaryAgent.IsBusy() {
@@ -500,8 +570,19 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return a, nil
}
case key.Matches(msg, keys.Filepicker):
// Toggle filepicker
a.showFilepicker = !a.showFilepicker
a.filepicker.ToggleFilepicker(a.showFilepicker)
a.app.SetFilepickerOpen(a.showFilepicker)
// Close other dialogs if opening filepicker
if a.showFilepicker {
a.showToolsDialog = false
a.showThemeDialog = false
a.showModelDialog = false
a.showCommandDialog = false
a.showSessionDialog = false
}
return a, nil
}
@@ -601,6 +682,16 @@ func (a appModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
}
}
if a.showToolsDialog {
d, toolsCmd := a.toolsDialog.Update(msg)
a.toolsDialog = d.(dialog.ToolsDialog)
cmds = append(cmds, toolsCmd)
// Only block key messages; send all other messages down
if _, ok := msg.(tea.KeyMsg); ok {
return a, tea.Batch(cmds...)
}
}
s, cmd := a.status.Update(msg)
cmds = append(cmds, cmd)
a.status = s.(core.StatusCmp)
@@ -615,6 +706,26 @@ func (a *appModel) RegisterCommand(cmd dialog.Command) {
a.commands = append(a.commands, cmd)
}
// getAvailableToolNames returns a list of all available tool names
func getAvailableToolNames(app *app.App) []string {
// Get primary agent tools (which already include MCP tools)
allTools := agent.PrimaryAgentTools(
app.Permissions,
app.Sessions,
app.Messages,
app.History,
app.LSPClients,
)
// Extract tool names
var toolNames []string
for _, tool := range allTools {
toolNames = append(toolNames, tool.Info().Name)
}
return toolNames
}
func (a *appModel) moveToPage(pageID page.PageID) tea.Cmd {
// Allow navigating to logs page even when agent is busy
if a.app.PrimaryAgent.IsBusy() && pageID != page.LogsPage {
@@ -821,6 +932,21 @@ func (a appModel) View() string {
)
}
if a.showToolsDialog {
overlay := a.toolsDialog.View()
row := lipgloss.Height(appView) / 2
row -= lipgloss.Height(overlay) / 2
col := lipgloss.Width(appView) / 2
col -= lipgloss.Width(overlay) / 2
appView = layout.PlaceOverlay(
col,
row,
overlay,
appView,
true,
)
}
return appView
}
@@ -838,6 +964,7 @@ func New(app *app.App) tea.Model {
permissions: dialog.NewPermissionDialogCmp(),
initDialog: dialog.NewInitDialogCmp(),
themeDialog: dialog.NewThemeDialogCmp(),
toolsDialog: dialog.NewToolsDialogCmp(),
app: app,
commands: []dialog.Command{},
pages: map[page.PageID]tea.Model{
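
The View() change above centers the tools dialog by subtracting half the overlay's dimensions from half the app view's dimensions before calling layout.PlaceOverlay. A minimal sketch of that arithmetic, assuming only that lipgloss's Width/Height helpers measure rendered strings the way the diff uses them; centerOverlay is a hypothetical helper, not part of the codebase:

```go
package main

import (
	"fmt"

	"github.com/charmbracelet/lipgloss"
)

// centerOverlay computes the top-left cell at which an overlay of the
// given size should be drawn so that it sits centered inside the app view.
// This mirrors the row/col arithmetic in the View() hunk above.
func centerOverlay(appView, overlay string) (row, col int) {
	row = lipgloss.Height(appView)/2 - lipgloss.Height(overlay)/2
	col = lipgloss.Width(appView)/2 - lipgloss.Width(overlay)/2
	return row, col
}

func main() {
	app := lipgloss.NewStyle().Width(80).Height(24).Render("")
	dialog := lipgloss.NewStyle().Width(40).Height(10).Render("")
	row, col := centerOverlay(app, dialog)
	fmt.Println(row, col) // 7 20: the overlay is centered in an 80x24 view
}
```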

View File

@@ -12,63 +12,74 @@
"model": {
"description": "Model ID for the agent",
"enum": [
"gpt-4.1",
"openrouter.o3",
"openrouter.gpt-4.1",
"meta-llama/llama-4-scout-17b-16e-instruct",
"openrouter.gpt-4o",
"o1-pro",
"claude-3-haiku",
"o1",
"gemini-2.5-flash",
"vertexai.gemini-2.5-flash",
"claude-3.5-haiku",
"gpt-4o-mini",
"o3-mini",
"gpt-4.5-preview",
"azure.gpt-4o",
"azure.o4-mini",
"openrouter.claude-3.5-sonnet",
"gpt-4o",
"o3",
"gpt-4.1-mini",
"llama-3.3-70b-versatile",
"azure.gpt-4o-mini",
"gpt-4.1-nano",
"o4-mini",
"qwen-qwq",
"openrouter.claude-3.5-haiku",
"openrouter.qwen-3-14b",
"vertexai.gemini-2.5",
"gemini-2.5",
"azure.gpt-4.1-nano",
"openrouter.o1-mini",
"openrouter.qwen-3-30b",
"claude-3.7-sonnet",
"claude-3.5-sonnet",
"gemini-2.0-flash",
"meta-llama/llama-4-maverick-17b-128e-instruct",
"openrouter.o3-mini",
"openrouter.o4-mini",
"openrouter.gpt-4.1-mini",
"openrouter.o1",
"o1-mini",
"azure.gpt-4.1-mini",
"openrouter.o1-pro",
"grok-3-beta",
"grok-3-mini-fast-beta",
"openrouter.claude-3.7-sonnet",
"openrouter.claude-3-opus",
"openrouter.qwen-3-235b",
"openrouter.gpt-4.1-nano",
"bedrock.claude-3.7-sonnet",
"openrouter.qwen-3-8b",
"claude-3-opus",
"azure.o1-mini",
"deepseek-r1-distill-llama-70b",
"gemini-2.0-flash-lite",
"openrouter.qwen-3-32b",
"openrouter.gpt-4.5-preview",
"grok-3-mini-beta",
"grok-3-fast-beta",
"azure.o3-mini",
"openrouter.claude-3-haiku",
"azure.gpt-4.1",
"azure.o1",
"azure.o3",
"azure.gpt-4.5-preview",
"openrouter.gemini-2.5-flash",
"openrouter.gpt-4o-mini",
"openrouter.gemini-2.5"
],
"type": "string"
},
@@ -102,63 +113,74 @@
"model": {
"description": "Model ID for the agent",
"enum": [
"gpt-4.1",
"openrouter.o3",
"openrouter.gpt-4.1",
"meta-llama/llama-4-scout-17b-16e-instruct",
"openrouter.gpt-4o",
"o1-pro",
"claude-3-haiku",
"o1",
"gemini-2.5-flash",
"vertexai.gemini-2.5-flash",
"claude-3.5-haiku",
"gpt-4o-mini",
"o3-mini",
"gpt-4.5-preview",
"azure.gpt-4o",
"azure.o4-mini",
"openrouter.claude-3.5-sonnet",
"gpt-4o",
"o3",
"gpt-4.1-mini",
"llama-3.3-70b-versatile",
"azure.gpt-4o-mini",
"gpt-4.1-nano",
"o4-mini",
"qwen-qwq",
"openrouter.claude-3.5-haiku",
"openrouter.qwen-3-14b",
"vertexai.gemini-2.5",
"gemini-2.5",
"azure.gpt-4.1-nano",
"openrouter.o1-mini",
"openrouter.qwen-3-30b",
"claude-3.7-sonnet",
"claude-3.5-sonnet",
"gemini-2.0-flash",
"meta-llama/llama-4-maverick-17b-128e-instruct",
"openrouter.o3-mini",
"openrouter.o4-mini",
"openrouter.gpt-4.1-mini",
"openrouter.o1",
"o1-mini",
"azure.gpt-4.1-mini",
"openrouter.o1-pro",
"grok-3-beta",
"grok-3-mini-fast-beta",
"openrouter.claude-3.7-sonnet",
"openrouter.claude-3-opus",
"openrouter.qwen-3-235b",
"openrouter.gpt-4.1-nano",
"bedrock.claude-3.7-sonnet",
"openrouter.qwen-3-8b",
"claude-3-opus",
"azure.o1-mini",
"deepseek-r1-distill-llama-70b",
"gemini-2.0-flash-lite",
"openrouter.qwen-3-32b",
"openrouter.gpt-4.5-preview",
"grok-3-mini-beta",
"grok-3-fast-beta",
"azure.o3-mini",
"openrouter.claude-3-haiku",
"azure.gpt-4.1",
"azure.o1",
"azure.o3",
"azure.gpt-4.5-preview",
"openrouter.gemini-2.5-flash",
"openrouter.gpt-4o-mini",
"openrouter.gemini-2.5"
],
"type": "string"
},
@@ -341,7 +363,8 @@
"groq", "groq",
"openrouter", "openrouter",
"bedrock", "bedrock",
"azure" "azure",
"vertexai"
], ],
"type": "string" "type": "string"
} }
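
With `vertexai` added to the provider enum and the VertexAI and Grok model IDs in the agent model enum, a config can now target these models directly. A minimal sketch of an agents block that this schema would accept, with hypothetical token limits:

```json
{
  "agents": {
    "primary": {
      "model": "vertexai.gemini-2.5",
      "maxTokens": 5000
    },
    "title": {
      "model": "grok-3-mini-beta",
      "maxTokens": 80
    }
  }
}
```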