Compare commits

...

15 Commits

Author SHA1 Message Date
default 72960aeb77 fix: show devcontainer Delete menu for failed/error states
The three-dot menu (containing Delete) was gated behind
showDevcontainerControls which requires both a subAgent AND a
container reference. When a devcontainer fails (e.g. exit status 1),
the container and/or subAgent may be missing, hiding the menu entirely.

Decouple the Delete menu from showDevcontainerControls so it renders
whenever the devcontainer is not in a transitioning state (starting,
stopping, deleting). The delete API only needs parentAgent.id and
devcontainer.id, so no subAgent or container is required.

Fixes #23754
2026-04-10 17:27:15 +00:00
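The gating change above reduces to a small predicate on lifecycle status. A hypothetical Go sketch (the function name `canShowDeleteMenu` and the status strings are illustrative — the actual check lives in the TypeScript frontend):

```go
package main

import "fmt"

// canShowDeleteMenu reports whether the Delete menu should render.
// It depends only on the devcontainer's lifecycle status, not on the
// presence of a subAgent or container reference, so failed/error
// devcontainers can still be deleted.
func canShowDeleteMenu(status string) bool {
	switch status {
	case "starting", "stopping", "deleting":
		// Transitioning states: hide the menu to avoid racing the operation.
		return false
	default:
		// Includes running, stopped, and failure states like "error".
		return true
	}
}

func main() {
	for _, s := range []string{"error", "running", "starting"} {
		fmt.Printf("%s -> %v\n", s, canShowDeleteMenu(s))
	}
}
```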
Mathias Fredriksson a62ead8588 fix(coderd): sort pinned chats first in GetChats pagination (#24222)
The GetChats SQL query ordered by (updated_at, id) DESC with no
pin_order awareness. A pinned chat with an old updated_at could
land on page 2+ and be invisible in the sidebar's Pinned section.

Add a 4-column ORDER BY: pinned-first flag DESC, negated pin_order
DESC, updated_at DESC, id DESC. The negation trick keeps all sort
columns DESC so the cursor tuple < comparison still works. Update
the after_id cursor clause to match the expanded sort key.

Fix the false handler comment claiming PinChatByID bumps updated_at.
2026-04-10 17:13:19 +00:00
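The four-column ordering and the negation trick above can be sketched in Go as an in-memory comparator — a hypothetical illustration (the `chat` struct and field names are invented; the real ordering happens in SQL):

```go
package main

import (
	"fmt"
	"sort"
)

type chat struct {
	id        int
	updatedAt int64 // unix seconds
	pinned    bool
	pinOrder  int // lower = earlier in the Pinned section
}

// sortKey builds the 4-column key with every component descending:
// pinned-first flag DESC, negated pin_order DESC (i.e. pin_order ASC),
// updated_at DESC, id DESC. Keeping every component DESC means a keyset
// cursor can still use a single tuple "<" comparison.
func sortKey(c chat) [4]int64 {
	pinnedFlag := int64(0)
	if c.pinned {
		pinnedFlag = 1
	}
	return [4]int64{pinnedFlag, -int64(c.pinOrder), c.updatedAt, int64(c.id)}
}

// less is a lexicographic tuple comparison with every column DESC.
func less(a, b [4]int64) bool {
	for i := range a {
		if a[i] != b[i] {
			return a[i] > b[i] // DESC: larger key component sorts first
		}
	}
	return false
}

func main() {
	chats := []chat{
		{id: 1, updatedAt: 100, pinned: false},
		{id: 2, updatedAt: 50, pinned: true, pinOrder: 2},
		{id: 3, updatedAt: 10, pinned: true, pinOrder: 1},
		{id: 4, updatedAt: 200, pinned: false},
	}
	sort.Slice(chats, func(i, j int) bool {
		return less(sortKey(chats[i]), sortKey(chats[j]))
	})
	// Pinned chats first (by pin_order), then the rest by updated_at DESC.
	for _, c := range chats {
		fmt.Println(c.id)
	}
}
```

Even a pinned chat with a stale `updatedAt` (id 3 here) sorts ahead of every unpinned chat, which is the page-boundary bug the commit fixes.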
dependabot[bot] b68c14dd04 chore: bump github.com/hashicorp/go-getter from 1.8.4 to 1.8.6 (#24247)
Bumps
[github.com/hashicorp/go-getter](https://github.com/hashicorp/go-getter)
from 1.8.4 to 1.8.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/hashicorp/go-getter/releases">github.com/hashicorp/go-getter's
releases</a>.</em></p>
<blockquote>
<h2>v1.8.6</h2>
<p>No release notes provided.</p>
<h2>v1.8.5</h2>
<h2>What's Changed</h2>
<ul>
<li>[chore] : Bump the go group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/576">hashicorp/go-getter#576</a></li>
<li>use %w to wrap error by <a
href="https://github.com/Ericwww"><code>@​Ericwww</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/475">hashicorp/go-getter#475</a></li>
<li>fix: <a
href="https://redirect.github.com/hashicorp/go-getter/issues/538">#538</a>
http file download skipped if headResp.ContentLength is 0 by <a
href="https://github.com/martijnvdp"><code>@​martijnvdp</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/539">hashicorp/go-getter#539</a></li>
<li>chore: fix error message capitalization in checksum function by <a
href="https://github.com/ssagarverma"><code>@​ssagarverma</code></a> in
<a
href="https://redirect.github.com/hashicorp/go-getter/pull/578">hashicorp/go-getter#578</a></li>
<li>[chore] : Bump the go group with 8 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/577">hashicorp/go-getter#577</a></li>
<li>Fix git url with ambiguous ref by <a
href="https://github.com/nimasamii"><code>@​nimasamii</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/382">hashicorp/go-getter#382</a></li>
<li>fix: resolve compilation errors in get_git_test.go by <a
href="https://github.com/CreatorHead"><code>@​CreatorHead</code></a> in
<a
href="https://redirect.github.com/hashicorp/go-getter/pull/579">hashicorp/go-getter#579</a></li>
<li>[chore] : Bump the actions group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/582">hashicorp/go-getter#582</a></li>
<li>[chore] : Bump the go group with 3 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/583">hashicorp/go-getter#583</a></li>
<li>test that arbitrary files cannot be checksummed by <a
href="https://github.com/schmichael"><code>@​schmichael</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/250">hashicorp/go-getter#250</a></li>
<li>[chore] : Bump google.golang.org/api from 0.260.0 to 0.262.0 in the
go group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/585">hashicorp/go-getter#585</a></li>
<li>[chore] : Bump actions/checkout from 6.0.1 to 6.0.2 in the actions
group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/586">hashicorp/go-getter#586</a></li>
<li>[chore] : Bump the go group with 3 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/588">hashicorp/go-getter#588</a></li>
<li>[chore] : Bump actions/cache from 5.0.2 to 5.0.3 in the actions
group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/589">hashicorp/go-getter#589</a></li>
<li>[chore] : Bump aws-actions/configure-aws-credentials from 5.1.1 to
6.0.0 in the actions group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/592">hashicorp/go-getter#592</a></li>
<li>[chore] : Bump google.golang.org/api from 0.264.0 to 0.265.0 in the
go group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/591">hashicorp/go-getter#591</a></li>
<li>[chore] : Bump the go group with 5 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/593">hashicorp/go-getter#593</a></li>
<li>IND-6310 - CRT Onboarding by <a
href="https://github.com/nasareeny"><code>@​nasareeny</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/584">hashicorp/go-getter#584</a></li>
<li>Fix crt build path by <a
href="https://github.com/ssagarverma"><code>@​ssagarverma</code></a> in
<a
href="https://redirect.github.com/hashicorp/go-getter/pull/594">hashicorp/go-getter#594</a></li>
<li>[chore] : Bump the go group with 3 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/596">hashicorp/go-getter#596</a></li>
<li>fix: remove checkout action from set-product-version job by <a
href="https://github.com/ssagarverma"><code>@​ssagarverma</code></a> in
<a
href="https://redirect.github.com/hashicorp/go-getter/pull/598">hashicorp/go-getter#598</a></li>
<li>[chore] : Bump the actions group with 4 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/595">hashicorp/go-getter#595</a></li>
<li>fix(deps): upgrade go.opentelemetry.io/otel/sdk to v1.40.0
(GO-2026-4394) by <a
href="https://github.com/ssagarverma"><code>@​ssagarverma</code></a> in
<a
href="https://redirect.github.com/hashicorp/go-getter/pull/599">hashicorp/go-getter#599</a></li>
<li>Prepare go-getter for v1.8.5 release by <a
href="https://github.com/nasareeny"><code>@​nasareeny</code></a> in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/597">hashicorp/go-getter#597</a></li>
<li>[chore] : Bump the actions group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/600">hashicorp/go-getter#600</a></li>
<li>sec: bump go and xrepos + redact aws tokens in url by <a
href="https://github.com/dduzgun-security"><code>@​dduzgun-security</code></a>
in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/604">hashicorp/go-getter#604</a></li>
</ul>
<p><strong>NOTES:</strong></p>
<p>Binary Distribution Update: To streamline our release process and
align with other HashiCorp tools, all release binaries will now be
published exclusively to the official HashiCorp <a
href="https://releases.hashicorp.com/go-getter/">release</a> site. We
will no longer attach release assets to GitHub Releases.</p>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/Ericwww"><code>@​Ericwww</code></a> made
their first contribution in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/475">hashicorp/go-getter#475</a></li>
<li><a
href="https://github.com/martijnvdp"><code>@​martijnvdp</code></a> made
their first contribution in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/539">hashicorp/go-getter#539</a></li>
<li><a href="https://github.com/nimasamii"><code>@​nimasamii</code></a>
made their first contribution in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/382">hashicorp/go-getter#382</a></li>
<li><a href="https://github.com/nasareeny"><code>@​nasareeny</code></a>
made their first contribution in <a
href="https://redirect.github.com/hashicorp/go-getter/pull/584">hashicorp/go-getter#584</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/hashicorp/go-getter/compare/v1.8.4...v1.8.5">https://github.com/hashicorp/go-getter/compare/v1.8.4...v1.8.5</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/hashicorp/go-getter/commit/d23bff48fb87c956bb507a03d35a63ee45470e34"><code>d23bff4</code></a>
Merge pull request <a
href="https://redirect.github.com/hashicorp/go-getter/issues/608">#608</a>
from hashicorp/dependabot/go_modules/go-security-9c51...</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/2c4aba8e5286c18bc66358236454a3e3b0aa7421"><code>2c4aba8</code></a>
Merge pull request <a
href="https://redirect.github.com/hashicorp/go-getter/issues/613">#613</a>
from hashicorp/pull/v1.8.6</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/fe61ed9454b818721d81328d7e880fc2ed2c8d15"><code>fe61ed9</code></a>
Merge pull request <a
href="https://redirect.github.com/hashicorp/go-getter/issues/611">#611</a>
from hashicorp/SECVULN-41053</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/d53365612c5250f7df8d586ba3be70fbd42e613b"><code>d533656</code></a>
Merge pull request <a
href="https://redirect.github.com/hashicorp/go-getter/issues/606">#606</a>
from hashicorp/pull/CRT</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/388f23d7d40f1f1e1a9f5b40ee5590c08154cd6d"><code>388f23d</code></a>
Additional test for local branch and head</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/b7ceaa59b11a203c14cf58e5fcaa8f169c0ced6e"><code>b7ceaa5</code></a>
harden checkout ref handling and added regression tests</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/769cc14fdb0df5ac548f4ead1193b5c40460f11e"><code>769cc14</code></a>
Release version bump up</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/6086a6a1f6347f735401c26429d9a0e14ad29444"><code>6086a6a</code></a>
Review Comments Addressed</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/e02063cd28e97bb8a23a63e72e2a4a4ab6e982cf"><code>e02063c</code></a>
Revert &quot;SECVULN Fix for git checkout argument injection enables
arbitrary fil...</li>
<li><a
href="https://github.com/hashicorp/go-getter/commit/c93084dc4306b2c49c54fe6fbfbe79c98956e5f8"><code>c93084d</code></a>
[chore] : Bump google.golang.org/grpc</li>
<li>Additional commits viewable in <a
href="https://github.com/hashicorp/go-getter/compare/v1.8.4...v1.8.6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/hashicorp/go-getter&package-manager=go_modules&previous-version=1.8.4&new-version=1.8.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts page](https://github.com/coder/coder/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-10 15:36:57 +00:00
Zach 508114d484 feat: user secret database encryption (#24218)
Add dbcrypt support for user secret values. When database encryption is
enabled, secret values are transparently encrypted on write and
decrypted on read through the existing dbcrypt store wrapper.

- Wrap `CreateUserSecret`, `GetUserSecretByUserIDAndName`,
`ListUserSecretsWithValues`, and `UpdateUserSecretByUserIDAndName` in
enterprise/dbcrypt/dbcrypt.go.
- Add rotate and decrypt support for user secrets in
enterprise/dbcrypt/cliutil.go (`server dbcrypt rotate` and `server
dbcrypt decrypt`).
- Add internal tests covering encrypt-on-create, decrypt-on-read,
re-encrypt-on-update, and plaintext passthrough when no cipher is
configured.
2026-04-10 09:34:11 -06:00
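The wrap-and-passthrough pattern described above can be sketched with a toy store — an illustrative Go sketch only (the `secretStore`/`Cipher` names and the XOR cipher are invented stand-ins, not Coder's dbcrypt API or real cryptography):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

// Cipher is a stand-in for a dbcrypt-style cipher interface.
type Cipher interface {
	Encrypt([]byte) ([]byte, error)
	Decrypt([]byte) ([]byte, error)
}

// xorCipher is a toy cipher for illustration only — NOT real crypto.
type xorCipher struct{ key byte }

func (c xorCipher) Encrypt(p []byte) ([]byte, error) {
	out := make([]byte, len(p))
	for i, b := range p {
		out[i] = b ^ c.key
	}
	return []byte("enc:" + base64.StdEncoding.EncodeToString(out)), nil
}

func (c xorCipher) Decrypt(p []byte) ([]byte, error) {
	raw, err := base64.StdEncoding.DecodeString(strings.TrimPrefix(string(p), "enc:"))
	if err != nil {
		return nil, err
	}
	for i := range raw {
		raw[i] ^= c.key
	}
	return raw, nil
}

// secretStore wraps an underlying map the way a dbcrypt store wraps the
// database: values are encrypted on write and decrypted on read. With a
// nil cipher, values pass through as plaintext.
type secretStore struct {
	cipher Cipher
	db     map[string]string
}

func (s *secretStore) CreateUserSecret(name, value string) error {
	if s.cipher == nil {
		s.db[name] = value // plaintext passthrough
		return nil
	}
	enc, err := s.cipher.Encrypt([]byte(value))
	if err != nil {
		return err
	}
	s.db[name] = string(enc)
	return nil
}

func (s *secretStore) GetUserSecret(name string) (string, error) {
	stored := s.db[name]
	if s.cipher == nil {
		return stored, nil
	}
	dec, err := s.cipher.Decrypt([]byte(stored))
	if err != nil {
		return "", err
	}
	return string(dec), nil
}

func main() {
	s := &secretStore{cipher: xorCipher{key: 0x5a}, db: map[string]string{}}
	_ = s.CreateUserSecret("api-token", "hunter2")
	fmt.Println(s.db["api-token"] != "hunter2") // value at rest differs from plaintext
	v, _ := s.GetUserSecret("api-token")
	fmt.Println(v)
}
```

Callers never see ciphertext; only the value at rest changes, which is why key rotation (`server dbcrypt rotate`) can re-encrypt rows without touching the API surface.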
Garrett Delfosse e0fbb0e4ec feat: comment on original PR after cherry-pick PR is created (#24243)
After the cherry-pick workflow creates a backport PR, it now comments on
the original PR to notify the author with a link to the new PR.

If the cherry-pick had conflicts, the comment includes a warning.

## Changes

- Capture the URL output of `gh pr create` into `NEW_PR_URL`
- Add `gh pr comment` on the original PR with the link
- Append a conflict warning to the comment when applicable

> Generated by Coder Agents
2026-04-10 11:21:13 -04:00
J. Scott Miller 7bde763b66 feat: add workspace build transition to provisioner job list (#24131)
Closes #16332

Previously `coder provisioner jobs list` showed no indication of what a workspace
build job was doing (i.e., start, stop, or delete). This adds
`workspace_build_transition` to the provisioner job metadata, exposed in
both the REST API and CLI. Template and workspace name columns were also
added, both available via `-c`.

```
$ coder provisioner jobs list -c id,type,status,"workspace build transition"
ID                                    TYPE                     STATUS     WORKSPACE BUILD TRANSITION
95f35545-a59f-4900-813d-80b8c8fd7a33  template_version_import  succeeded
0a903bbe-cef5-4e72-9e62-f7e7b4dfbb7a  workspace_build          succeeded  start
```
2026-04-10 09:50:11 -05:00
Matt Vollmer 36141fafad feat: stack insights tables vertically and paginate Pull requests table (#24198)
The "By model" and "Pull requests" tables on the PR Insights page
(`/agents/settings/insights`) were side-by-side at `lg` breakpoints, and
the Pull requests table was hard-capped at 20 rows by the backend.

- Replaced `lg:grid-cols-2` with a single-column stacked layout so both
tables span the full content width.
- Removed the `LIMIT 20` from the `GetPRInsightsRecentPRs` SQL query so
all PRs in the selected time range are returned.
- Can add this back if we need it. If we do, we should add a little
subheader above this table to indicate that we're not showing all PRs
within the selected timeframe.
- Added client-side pagination to the Pull requests table using
`PaginationWidgetBase` (page size 10), matching the existing pattern in
`ChatCostSummaryView`.
- Renamed the section heading from "Recent" to "Pull requests" since it
now shows the full set for the time range.
<img width="1481" height="1817" alt="image"
src="https://github.com/user-attachments/assets/0066c42f-4d7b-4cee-b64b-6680848edc68"
/>


> 🤖 PR generated with Coder Agents
2026-04-10 10:48:54 -04:00
Garrett Delfosse 3462c31f43 fix: update directory for terraform-managed subagents (#24220)
When a devcontainer subagent is terraform-managed, the provisioner sets
its directory to the host-side `workspace_folder` path at build time. At
runtime, the agent injection code determines the correct container-internal
path from `devcontainer read-configuration` and sends it via `CreateSubAgent`.

However, the `CreateSubAgent` handler only updated `display_apps` for
pre-existing agents, ignoring the `Directory` field. This caused SSH/terminal
sessions to land in `~` instead of the workspace folder (e.g. `/workspaces/foo`).

Add `UpdateWorkspaceAgentDirectoryByID` query and call it in the
terraform-managed subagent update path to also persist the directory.

Fixes PLAT-118

<details><summary>Root cause analysis</summary>

Two code paths set the subagent `Directory` field:

1. **Provisioner (build time):** `insertDevcontainerSubagent` in
`provisionerdserver.go`
   stores `dc.GetWorkspaceFolder()` — the **host-side** path from the
   `coder_devcontainer` Terraform resource (e.g. `/home/coder/project`).

2. **Agent injection (runtime):**
`maybeInjectSubAgentIntoContainerLocked` in
`api.go` reads the devcontainer config and gets the correct
**container-internal**
path (e.g. `/workspaces/project`), then calls `client.Create(ctx,
subAgentConfig)`.

For terraform-managed subagents (those with `req.Id != nil`), `CreateSubAgent`
in `coderd/agentapi/subagent.go` recognized the pre-existing agent and entered
the update path — but only called `UpdateWorkspaceAgentDisplayAppsByID`,
discarding the `Directory` field from the request. The agent kept the stale
host-side path, which doesn't exist inside the container, causing
`expandPathToAbs` to fall back to `~`.
</details>

> [!NOTE]
> Generated by Coder Agents
2026-04-10 10:11:22 -04:00
Ethan a0ea71b74c perf(site/src): optimistically edit chat messages (#23976)
Previously, editing a past user message in Agents chat waited for the
PATCH round-trip and cache reconciliation before the conversation
visibly settled. The edited bubble and truncated tail could briefly fall
back to older fetched state, and a failed edit did not restore the full
local editing context cleanly.

Keep history editing optimistic end-to-end: update the edited user
bubble and truncate the tail immediately, preserve that visible
conversation until the authoritative replacement message and cache catch
up, and restore the draft/editor/attachment state on failure. The route
already scopes each `agentId` to a keyed `AgentChatPage` instance with
its own store/cache-writing closures, so navigating between chats does
not need an extra post-await active-chat guard to keep one chat's edit
response out of another chat.
2026-04-10 23:40:49 +10:00
Cian Johnston 0a14bb529e refactor(site): convert OrganizationAutocomplete to fully controlled component (#24211)
Fixes https://github.com/coder/internal/issues/1440

- Convert `OrganizationAutocomplete` to a purely presentational, fully
controlled component
- Accept `value`, `onChange`, `options` from parent; remove internal
state, data fetching, and permission filtering
- Update `CreateTemplateForm` and `CreateUserForm` to own org fetching,
permission checks, auto-select, and invalid-value clearing inline
- Memoize `orgOptions` in callers for stable `useEffect` deps
- Rewrite Storybook stories for the new controlled API


> 🤖 Written by a Coder Agent. Reviewed by a human.
2026-04-10 13:56:43 +01:00
Danielle Maywood 2c32d84f12 fix: remove double bottom border on build logs table (#24000) 2026-04-10 13:50:36 +01:00
Jaayden Halko 76d89f59af fix(site): add bottom spacing for sources-only assistant messages (#24202)
Closes CODAGT-123

Assistant messages containing only source parts (no markdown or reasoning)
were missing the bottom spacer that normally fills the gap left by the hidden
action bar, causing them to sit flush against the next user bubble.

The existing fallback spacer guarded on `Boolean(parsed.reasoning)`, so it
only fired for thinking-only replies. Replace that guard with the broader
`hasRenderableContent` flag (which covers blocks, tools, and sources) and
extract a named `needsAssistantBottomSpacer` boolean so future content types
inherit consistent spacing without re-reading compound conditions.

Adds a `SourcesOnlyAssistantSpacing` Storybook story mirroring the existing
`ThinkingOnlyAssistantSpacing` pattern for regression coverage.
2026-04-10 13:09:23 +01:00
Jaayden Halko 1a3a92bd1b fix: fix 4px layout shift on streaming commit in chat (#24203)
Closes CODAGT-124

When a streaming assistant response finishes and moves from the live stream
tail into the conversation timeline, the message jumps 4px upward. This
happens because the outer layout wrapper and live-stream section both used
`gap-3` (12px), while the committed-message list used `gap-2` (8px).

Unify all three containers to `gap-2` so the gap between messages stays
at 8px regardless of whether they're streaming or committed, eliminating
the layout shift.

A Storybook story with play-function assertions locks the invariant: it
renders both committed messages and an active stream, then verifies both
the outer and inner containers report `rowGap === "8px"`.
2026-04-10 13:09:03 +01:00
Jake Howell 4018320614 fix: resolve <WorkspaceTimings /> size (#24235) 2026-04-10 21:31:43 +10:00
Susana Ferreira d9700baa8d docs(docs/ai-coder): document AI Gateway Proxy private IP restrictions (#24209)
Documents the private/reserved IP range restrictions added to AI Gateway
Proxy:

- **Restricting proxy access**: Updated to reflect that private/reserved
IP ranges are now blocked by default, with atomic IP validation to
prevent DNS rebinding. Documents the Coder access URL exemption and the
`CODER_AIBRIDGE_PROXY_ALLOWED_PRIVATE_CIDRS` option.
- **Upstream proxy**: Added a note on the DNS rebinding limitation when
an upstream proxy is configured, and that upstream proxies should
enforce their own restrictions.

> [!NOTE]
> Initially generated by Coder Agents, modified and reviewed by
@ssncferreira

Follow-up: #23109
2026-04-10 12:09:14 +01:00
78 changed files with 2542 additions and 716 deletions
+16 -7
@@ -134,10 +134,19 @@ jobs:
           exit 0
         fi
-        gh pr create \
-          --base "$RELEASE_BRANCH" \
-          --head "$BACKPORT_BRANCH" \
-          --title "$TITLE" \
-          --body "$BODY" \
-          --assignee "$SENDER" \
-          --reviewer "$SENDER"
+        NEW_PR_URL=$(
+          gh pr create \
+            --base "$RELEASE_BRANCH" \
+            --head "$BACKPORT_BRANCH" \
+            --title "$TITLE" \
+            --body "$BODY" \
+            --assignee "$SENDER" \
+            --reviewer "$SENDER"
+        )
+
+        # Comment on the original PR to notify the author.
+        COMMENT="Cherry-pick PR created: ${NEW_PR_URL}"
+        if [ "$CONFLICT" = true ]; then
+          COMMENT="${COMMENT} (⚠️ conflicts need manual resolution)"
+        fi
+        gh pr comment "$PR_NUMBER" --body "$COMMENT"
+120
@@ -2862,6 +2862,126 @@ func TestAPI(t *testing.T) {
             "rebuilt agent should include updated display apps")
     })
+
+    // Verify that when a terraform-managed subagent is injected into
+    // a devcontainer, the Directory field sent to Create reflects
+    // the container-internal workspaceFolder from devcontainer
+    // read-configuration, not the host-side workspace_folder from
+    // the terraform resource. This is the scenario described in
+    // https://linear.app/codercom/issue/PRODUCT-259:
+    // 1. Non-terraform subagent → directory = /workspaces/foo (correct)
+    // 2. Terraform subagent → directory was stuck on host path (bug)
+    t.Run("TerraformDefinedSubAgentUsesContainerInternalDirectory", func(t *testing.T) {
+        t.Parallel()
+        if runtime.GOOS == "windows" {
+            t.Skip("Dev Container tests are not supported on Windows (this test uses mocks but fails due to Windows paths)")
+        }
+
+        var (
+            ctx              = testutil.Context(t, testutil.WaitMedium)
+            logger           = slogtest.Make(t, &slogtest.Options{IgnoreErrors: true}).Leveled(slog.LevelDebug)
+            mCtrl            = gomock.NewController(t)
+            terraformAgentID = uuid.New()
+            containerID      = "test-container-id"
+
+            // Given: A container with a host-side workspace folder.
+            terraformContainer = codersdk.WorkspaceAgentContainer{
+                ID:           containerID,
+                FriendlyName: "test-container",
+                Image:        "test-image",
+                Running:      true,
+                CreatedAt:    time.Now(),
+                Labels: map[string]string{
+                    agentcontainers.DevcontainerLocalFolderLabel: "/home/coder/project",
+                    agentcontainers.DevcontainerConfigFileLabel:  "/home/coder/project/.devcontainer/devcontainer.json",
+                },
+            }
+
+            // Given: A terraform-defined devcontainer whose
+            // workspace_folder is the HOST-side path (set by provisioner).
+            terraformDevcontainer = codersdk.WorkspaceAgentDevcontainer{
+                ID:              uuid.New(),
+                Name:            "terraform-devcontainer",
+                WorkspaceFolder: "/home/coder/project",
+                ConfigPath:      "/home/coder/project/.devcontainer/devcontainer.json",
+                SubagentID:      uuid.NullUUID{UUID: terraformAgentID, Valid: true},
+            }
+
+            fCCLI = &fakeContainerCLI{
+                containers: codersdk.WorkspaceAgentListContainersResponse{
+                    Containers: []codersdk.WorkspaceAgentContainer{terraformContainer},
+                },
+                arch: runtime.GOARCH,
+            }
+
+            // Given: devcontainer read-configuration returns the
+            // CONTAINER-INTERNAL workspace folder.
+            fDCCLI = &fakeDevcontainerCLI{
+                upID: containerID,
+                readConfig: agentcontainers.DevcontainerConfig{
+                    Workspace: agentcontainers.DevcontainerWorkspace{
+                        WorkspaceFolder: "/workspaces/project",
+                    },
+                    MergedConfiguration: agentcontainers.DevcontainerMergedConfiguration{
+                        Customizations: agentcontainers.DevcontainerMergedCustomizations{
+                            Coder: []agentcontainers.CoderCustomization{{}},
+                        },
+                    },
+                },
+            }
+
+            mSAC        = acmock.NewMockSubAgentClient(mCtrl)
+            createCalls = make(chan agentcontainers.SubAgent, 1)
+            closed      bool
+        )
+
+        mSAC.EXPECT().List(gomock.Any()).Return([]agentcontainers.SubAgent{}, nil).AnyTimes()
+        mSAC.EXPECT().Create(gomock.Any(), gomock.Any()).DoAndReturn(
+            func(_ context.Context, agent agentcontainers.SubAgent) (agentcontainers.SubAgent, error) {
+                agent.AuthToken = uuid.New()
+                createCalls <- agent
+                return agent, nil
+            },
+        ).Times(1)
+        mSAC.EXPECT().Delete(gomock.Any(), gomock.Any()).DoAndReturn(func(_ context.Context, _ uuid.UUID) error {
+            assert.True(t, closed, "Delete should only be called after Close")
+            return nil
+        }).AnyTimes()
+
+        api := agentcontainers.NewAPI(logger,
+            agentcontainers.WithContainerCLI(fCCLI),
+            agentcontainers.WithDevcontainerCLI(fDCCLI),
+            agentcontainers.WithDevcontainers(
+                []codersdk.WorkspaceAgentDevcontainer{terraformDevcontainer},
+                []codersdk.WorkspaceAgentScript{{ID: terraformDevcontainer.ID, LogSourceID: uuid.New()}},
+            ),
+            agentcontainers.WithSubAgentClient(mSAC),
+            agentcontainers.WithSubAgentURL("test-subagent-url"),
+            agentcontainers.WithWatcher(watcher.NewNoop()),
+        )
+        api.Start()
+        defer func() {
+            closed = true
+            api.Close()
+        }()
+
+        // When: The devcontainer is created (triggering injection).
+        err := api.CreateDevcontainer(terraformDevcontainer.WorkspaceFolder, terraformDevcontainer.ConfigPath)
+        require.NoError(t, err)
+
+        // Then: The subagent sent to Create has the correct
+        // container-internal directory, not the host path.
+        createdAgent := testutil.RequireReceive(ctx, t, createCalls)
+        assert.Equal(t, terraformAgentID, createdAgent.ID,
+            "agent should use terraform-defined ID")
+        assert.Equal(t, "/workspaces/project", createdAgent.Directory,
+            "directory should be the container-internal path from devcontainer "+
+                "read-configuration, not the host-side workspace_folder")
+    })
+
     t.Run("Error", func(t *testing.T) {
         t.Parallel()
+1 -1
@@ -11,7 +11,7 @@ OPTIONS:
      -O, --org string, $CODER_ORGANIZATION
           Select which organization (uuid or name) to use.
 
-     -c, --column [id|created at|started at|completed at|canceled at|error|error code|status|worker id|worker name|file id|tags|queue position|queue size|organization id|initiator id|template version id|workspace build id|type|available workers|template version name|template id|template name|template display name|template icon|workspace id|workspace name|logs overflowed|organization|queue] (default: created at,id,type,template display name,status,queue,tags)
+     -c, --column [id|created at|started at|completed at|canceled at|error|error code|status|worker id|worker name|file id|tags|queue position|queue size|organization id|initiator id|template version id|workspace build id|type|available workers|template version name|template id|template name|template display name|template icon|workspace id|workspace name|workspace build transition|logs overflowed|organization|queue] (default: created at,id,type,template display name,status,queue,tags)
           Columns to display in table output.
 
      -i, --initiator string, $CODER_PROVISIONER_JOB_LIST_INITIATOR

@@ -58,7 +58,8 @@
       "template_display_name": "",
       "template_icon": "",
       "workspace_id": "===========[workspace ID]===========",
-      "workspace_name": "test-workspace"
+      "workspace_name": "test-workspace",
+      "workspace_build_transition": "start"
     },
     "logs_overflowed": false,
     "organization_name": "Coder"
+11 -1
@@ -71,7 +71,7 @@ func (a *SubAgentAPI) CreateSubAgent(ctx context.Context, req *agentproto.Create
     // An ID is only given in the request when it is a terraform-defined devcontainer
     // that has attached resources. These subagents are pre-provisioned by terraform
     // (the agent record already exists), so we update configurable fields like
-    // display_apps rather than creating a new agent.
+    // display_apps and directory rather than creating a new agent.
     if req.Id != nil {
         id, err := uuid.FromBytes(req.Id)
         if err != nil {

@@ -97,6 +97,16 @@ func (a *SubAgentAPI) CreateSubAgent(ctx context.Context, req *agentproto.Create
             return nil, xerrors.Errorf("update workspace agent display apps: %w", err)
         }
 
+        if req.Directory != "" {
+            if err := a.Database.UpdateWorkspaceAgentDirectoryByID(ctx, database.UpdateWorkspaceAgentDirectoryByIDParams{
+                ID:        id,
+                Directory: req.Directory,
+                UpdatedAt: createdAt,
+            }); err != nil {
+                return nil, xerrors.Errorf("update workspace agent directory: %w", err)
+            }
+        }
+
         return &agentproto.CreateSubAgentResponse{
             Agent: &agentproto.SubAgent{
                 Name: subAgent.Name,
+38 -2
@@ -1267,11 +1267,11 @@ func TestSubAgentAPI(t *testing.T) {
             agentID, err := uuid.FromBytes(resp.Agent.Id)
             require.NoError(t, err)
 
-            // And: The database agent's other fields are unchanged.
+            // And: The database agent's name, architecture, and OS are unchanged.
             updatedAgent, err := db.GetWorkspaceAgentByID(dbauthz.AsSystemRestricted(ctx), agentID)
             require.NoError(t, err)
             require.Equal(t, baseChildAgent.Name, updatedAgent.Name)
-            require.Equal(t, baseChildAgent.Directory, updatedAgent.Directory)
+            require.Equal(t, "/different/path", updatedAgent.Directory)
             require.Equal(t, baseChildAgent.Architecture, updatedAgent.Architecture)
             require.Equal(t, baseChildAgent.OperatingSystem, updatedAgent.OperatingSystem)
@@ -1280,6 +1280,42 @@
             require.Equal(t, database.DisplayAppWebTerminal, updatedAgent.DisplayApps[0])
         },
     },
+    {
+        name: "OK_DirectoryUpdated",
+        setup: func(t *testing.T, db database.Store, agent database.WorkspaceAgent) *proto.CreateSubAgentRequest {
+            // Given: An existing child agent with a stale host-side
+            // directory (as set by the provisioner at build time).
+            childAgent := dbgen.WorkspaceAgent(t, db, database.WorkspaceAgent{
+                ParentID:        uuid.NullUUID{Valid: true, UUID: agent.ID},
+                ResourceID:      agent.ResourceID,
+                Name:            baseChildAgent.Name,
+                Directory:       "/home/coder/project",
+                Architecture:    baseChildAgent.Architecture,
+                OperatingSystem: baseChildAgent.OperatingSystem,
+                DisplayApps:     baseChildAgent.DisplayApps,
+            })
+
+            // When: Agent injection sends the correct
+            // container-internal path.
+            return &proto.CreateSubAgentRequest{
+                Id:        childAgent.ID[:],
+                Directory: "/workspaces/project",
+                DisplayApps: []proto.CreateSubAgentRequest_DisplayApp{
+                    proto.CreateSubAgentRequest_WEB_TERMINAL,
+                },
+            }
+        },
+        check: func(t *testing.T, ctx context.Context, db database.Store, resp *proto.CreateSubAgentResponse, agent database.WorkspaceAgent) {
+            agentID, err := uuid.FromBytes(resp.Agent.Id)
+            require.NoError(t, err)
+
+            // Then: Directory is updated to the container-internal
+            // path.
+            updatedAgent, err := db.GetWorkspaceAgentByID(dbauthz.AsSystemRestricted(ctx), agentID)
+            require.NoError(t, err)
+            require.Equal(t, "/workspaces/project", updatedAgent.Directory)
+        },
+    },
     {
         name: "Error/MalformedID",
         setup: func(t *testing.T, db database.Store, agent database.WorkspaceAgent) *proto.CreateSubAgentRequest {
+3
@@ -19149,6 +19149,9 @@ const docTemplate = `{
"template_version_name": {
"type": "string"
},
"workspace_build_transition": {
"$ref": "#/definitions/codersdk.WorkspaceTransition"
},
"workspace_id": {
"type": "string",
"format": "uuid"
+3
@@ -17509,6 +17509,9 @@
"template_version_name": {
"type": "string"
},
"workspace_build_transition": {
"$ref": "#/definitions/codersdk.WorkspaceTransition"
},
"workspace_id": {
"type": "string",
"format": "uuid"
+15 -2
@@ -3401,11 +3401,11 @@ func (q *querier) GetPRInsightsPerModel(ctx context.Context, arg database.GetPRI
return q.db.GetPRInsightsPerModel(ctx, arg)
}
func (q *querier) GetPRInsightsRecentPRs(ctx context.Context, arg database.GetPRInsightsRecentPRsParams) ([]database.GetPRInsightsRecentPRsRow, error) {
func (q *querier) GetPRInsightsPullRequests(ctx context.Context, arg database.GetPRInsightsPullRequestsParams) ([]database.GetPRInsightsPullRequestsRow, error) {
if err := q.authorizeContext(ctx, policy.ActionRead, rbac.ResourceDeploymentConfig); err != nil {
return nil, err
}
return q.db.GetPRInsightsRecentPRs(ctx, arg)
return q.db.GetPRInsightsPullRequests(ctx, arg)
}
func (q *querier) GetPRInsightsSummary(ctx context.Context, arg database.GetPRInsightsSummaryParams) (database.GetPRInsightsSummaryRow, error) {
@@ -6783,6 +6783,19 @@ func (q *querier) UpdateWorkspaceAgentConnectionByID(ctx context.Context, arg da
return q.db.UpdateWorkspaceAgentConnectionByID(ctx, arg)
}
func (q *querier) UpdateWorkspaceAgentDirectoryByID(ctx context.Context, arg database.UpdateWorkspaceAgentDirectoryByIDParams) error {
workspace, err := q.db.GetWorkspaceByAgentID(ctx, arg.ID)
if err != nil {
return err
}
if err := q.authorizeContext(ctx, policy.ActionUpdateAgent, workspace); err != nil {
return err
}
return q.db.UpdateWorkspaceAgentDirectoryByID(ctx, arg)
}
func (q *querier) UpdateWorkspaceAgentDisplayAppsByID(ctx context.Context, arg database.UpdateWorkspaceAgentDisplayAppsByIDParams) error {
workspace, err := q.db.GetWorkspaceByAgentID(ctx, arg.ID)
if err != nil {
+14 -3
@@ -2261,9 +2261,9 @@ func (s *MethodTestSuite) TestTemplate() {
dbm.EXPECT().GetPRInsightsPerModel(gomock.Any(), arg).Return([]database.GetPRInsightsPerModelRow{}, nil).AnyTimes()
check.Args(arg).Asserts(rbac.ResourceDeploymentConfig, policy.ActionRead)
}))
s.Run("GetPRInsightsRecentPRs", s.Mocked(func(dbm *dbmock.MockStore, _ *gofakeit.Faker, check *expects) {
arg := database.GetPRInsightsRecentPRsParams{}
dbm.EXPECT().GetPRInsightsRecentPRs(gomock.Any(), arg).Return([]database.GetPRInsightsRecentPRsRow{}, nil).AnyTimes()
s.Run("GetPRInsightsPullRequests", s.Mocked(func(dbm *dbmock.MockStore, _ *gofakeit.Faker, check *expects) {
arg := database.GetPRInsightsPullRequestsParams{}
dbm.EXPECT().GetPRInsightsPullRequests(gomock.Any(), arg).Return([]database.GetPRInsightsPullRequestsRow{}, nil).AnyTimes()
check.Args(arg).Asserts(rbac.ResourceDeploymentConfig, policy.ActionRead)
}))
s.Run("GetTelemetryTaskEvents", s.Mocked(func(dbm *dbmock.MockStore, _ *gofakeit.Faker, check *expects) {
@@ -2935,6 +2935,17 @@ func (s *MethodTestSuite) TestWorkspace() {
dbm.EXPECT().UpdateWorkspaceAgentStartupByID(gomock.Any(), arg).Return(nil).AnyTimes()
check.Args(arg).Asserts(w, policy.ActionUpdate).Returns()
}))
s.Run("UpdateWorkspaceAgentDirectoryByID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
w := testutil.Fake(s.T(), faker, database.Workspace{})
agt := testutil.Fake(s.T(), faker, database.WorkspaceAgent{})
arg := database.UpdateWorkspaceAgentDirectoryByIDParams{
ID: agt.ID,
Directory: "/workspaces/project",
}
dbm.EXPECT().GetWorkspaceByAgentID(gomock.Any(), agt.ID).Return(w, nil).AnyTimes()
dbm.EXPECT().UpdateWorkspaceAgentDirectoryByID(gomock.Any(), arg).Return(nil).AnyTimes()
check.Args(arg).Asserts(w, policy.ActionUpdateAgent).Returns()
}))
s.Run("UpdateWorkspaceAgentDisplayAppsByID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
w := testutil.Fake(s.T(), faker, database.Workspace{})
agt := testutil.Fake(s.T(), faker, database.WorkspaceAgent{})
+12 -4
@@ -1992,11 +1992,11 @@ func (m queryMetricsStore) GetPRInsightsPerModel(ctx context.Context, arg databa
return r0, r1
}
func (m queryMetricsStore) GetPRInsightsRecentPRs(ctx context.Context, arg database.GetPRInsightsRecentPRsParams) ([]database.GetPRInsightsRecentPRsRow, error) {
func (m queryMetricsStore) GetPRInsightsPullRequests(ctx context.Context, arg database.GetPRInsightsPullRequestsParams) ([]database.GetPRInsightsPullRequestsRow, error) {
start := time.Now()
r0, r1 := m.s.GetPRInsightsRecentPRs(ctx, arg)
m.queryLatencies.WithLabelValues("GetPRInsightsRecentPRs").Observe(time.Since(start).Seconds())
m.queryCounts.WithLabelValues(httpmw.ExtractHTTPRoute(ctx), httpmw.ExtractHTTPMethod(ctx), "GetPRInsightsRecentPRs").Inc()
r0, r1 := m.s.GetPRInsightsPullRequests(ctx, arg)
m.queryLatencies.WithLabelValues("GetPRInsightsPullRequests").Observe(time.Since(start).Seconds())
m.queryCounts.WithLabelValues(httpmw.ExtractHTTPRoute(ctx), httpmw.ExtractHTTPMethod(ctx), "GetPRInsightsPullRequests").Inc()
return r0, r1
}
@@ -4840,6 +4840,14 @@ func (m queryMetricsStore) UpdateWorkspaceAgentConnectionByID(ctx context.Contex
return r0
}
func (m queryMetricsStore) UpdateWorkspaceAgentDirectoryByID(ctx context.Context, arg database.UpdateWorkspaceAgentDirectoryByIDParams) error {
start := time.Now()
r0 := m.s.UpdateWorkspaceAgentDirectoryByID(ctx, arg)
m.queryLatencies.WithLabelValues("UpdateWorkspaceAgentDirectoryByID").Observe(time.Since(start).Seconds())
m.queryCounts.WithLabelValues(httpmw.ExtractHTTPRoute(ctx), httpmw.ExtractHTTPMethod(ctx), "UpdateWorkspaceAgentDirectoryByID").Inc()
return r0
}
func (m queryMetricsStore) UpdateWorkspaceAgentDisplayAppsByID(ctx context.Context, arg database.UpdateWorkspaceAgentDisplayAppsByIDParams) error {
start := time.Now()
r0 := m.s.UpdateWorkspaceAgentDisplayAppsByID(ctx, arg)
+21 -7
@@ -3692,19 +3692,19 @@ func (mr *MockStoreMockRecorder) GetPRInsightsPerModel(ctx, arg any) *gomock.Cal
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetPRInsightsPerModel", reflect.TypeOf((*MockStore)(nil).GetPRInsightsPerModel), ctx, arg)
}
// GetPRInsightsRecentPRs mocks base method.
func (m *MockStore) GetPRInsightsRecentPRs(ctx context.Context, arg database.GetPRInsightsRecentPRsParams) ([]database.GetPRInsightsRecentPRsRow, error) {
// GetPRInsightsPullRequests mocks base method.
func (m *MockStore) GetPRInsightsPullRequests(ctx context.Context, arg database.GetPRInsightsPullRequestsParams) ([]database.GetPRInsightsPullRequestsRow, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetPRInsightsRecentPRs", ctx, arg)
ret0, _ := ret[0].([]database.GetPRInsightsRecentPRsRow)
ret := m.ctrl.Call(m, "GetPRInsightsPullRequests", ctx, arg)
ret0, _ := ret[0].([]database.GetPRInsightsPullRequestsRow)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// GetPRInsightsRecentPRs indicates an expected call of GetPRInsightsRecentPRs.
func (mr *MockStoreMockRecorder) GetPRInsightsRecentPRs(ctx, arg any) *gomock.Call {
// GetPRInsightsPullRequests indicates an expected call of GetPRInsightsPullRequests.
func (mr *MockStoreMockRecorder) GetPRInsightsPullRequests(ctx, arg any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetPRInsightsRecentPRs", reflect.TypeOf((*MockStore)(nil).GetPRInsightsRecentPRs), ctx, arg)
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetPRInsightsPullRequests", reflect.TypeOf((*MockStore)(nil).GetPRInsightsPullRequests), ctx, arg)
}
// GetPRInsightsSummary mocks base method.
@@ -9120,6 +9120,20 @@ func (mr *MockStoreMockRecorder) UpdateWorkspaceAgentConnectionByID(ctx, arg any
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "UpdateWorkspaceAgentConnectionByID", reflect.TypeOf((*MockStore)(nil).UpdateWorkspaceAgentConnectionByID), ctx, arg)
}
// UpdateWorkspaceAgentDirectoryByID mocks base method.
func (m *MockStore) UpdateWorkspaceAgentDirectoryByID(ctx context.Context, arg database.UpdateWorkspaceAgentDirectoryByIDParams) error {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "UpdateWorkspaceAgentDirectoryByID", ctx, arg)
ret0, _ := ret[0].(error)
return ret0
}
// UpdateWorkspaceAgentDirectoryByID indicates an expected call of UpdateWorkspaceAgentDirectoryByID.
func (mr *MockStoreMockRecorder) UpdateWorkspaceAgentDirectoryByID(ctx, arg any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "UpdateWorkspaceAgentDirectoryByID", reflect.TypeOf((*MockStore)(nil).UpdateWorkspaceAgentDirectoryByID), ctx, arg)
}
// UpdateWorkspaceAgentDisplayAppsByID mocks base method.
func (m *MockStore) UpdateWorkspaceAgentDisplayAppsByID(ctx context.Context, arg database.UpdateWorkspaceAgentDisplayAppsByIDParams) error {
m.ctrl.T.Helper()
-2
@@ -3791,8 +3791,6 @@ CREATE INDEX idx_chats_last_model_config_id ON chats USING btree (last_model_con
CREATE INDEX idx_chats_owner ON chats USING btree (owner_id);
CREATE INDEX idx_chats_owner_updated_id ON chats USING btree (owner_id, updated_at DESC, id DESC);
CREATE INDEX idx_chats_parent_chat_id ON chats USING btree (parent_chat_id);
CREATE INDEX idx_chats_pending ON chats USING btree (status) WHERE (status = 'pending'::chat_status);
@@ -0,0 +1 @@
CREATE INDEX idx_chats_owner_updated_id ON chats (owner_id, updated_at DESC, id DESC);
@@ -0,0 +1,5 @@
-- The GetChats ORDER BY changed from (updated_at, id) DESC to a 4-column
-- expression sort (pinned-first flag, negated pin_order, updated_at, id).
-- This index was purpose-built for the old sort and no longer provides
-- read benefit. The simpler idx_chats_owner covers the owner_id filter.
DROP INDEX IF EXISTS idx_chats_owner_updated_id;
+5 -3
@@ -418,11 +418,12 @@ type sqlcQuerier interface {
// per PR for state/additions/deletions/model (model comes from the
// most recent chat).
GetPRInsightsPerModel(ctx context.Context, arg GetPRInsightsPerModelParams) ([]GetPRInsightsPerModelRow, error)
// Returns individual PR rows with cost for the recent PRs table.
// Returns all individual PR rows with cost for the selected time range.
// Uses two CTEs: pr_costs sums cost for the PR-linked chat and its
// direct children (that lack their own PR), and deduped picks one row
// per PR for metadata.
GetPRInsightsRecentPRs(ctx context.Context, arg GetPRInsightsRecentPRsParams) ([]GetPRInsightsRecentPRsRow, error)
// per PR for metadata. A safety-cap LIMIT guards against unexpectedly
// large result sets from direct API callers.
GetPRInsightsPullRequests(ctx context.Context, arg GetPRInsightsPullRequestsParams) ([]GetPRInsightsPullRequestsRow, error)
// PR Insights queries for the /agents analytics dashboard.
// These aggregate data from chat_diff_statuses (PR metadata) joined
// with chats and chat_messages (cost) to power the PR Insights view.
@@ -1011,6 +1012,7 @@ type sqlcQuerier interface {
UpdateWorkspace(ctx context.Context, arg UpdateWorkspaceParams) (WorkspaceTable, error)
UpdateWorkspaceACLByID(ctx context.Context, arg UpdateWorkspaceACLByIDParams) error
UpdateWorkspaceAgentConnectionByID(ctx context.Context, arg UpdateWorkspaceAgentConnectionByIDParams) error
UpdateWorkspaceAgentDirectoryByID(ctx context.Context, arg UpdateWorkspaceAgentDirectoryByIDParams) error
UpdateWorkspaceAgentDisplayAppsByID(ctx context.Context, arg UpdateWorkspaceAgentDisplayAppsByIDParams) error
UpdateWorkspaceAgentLifecycleStateByID(ctx context.Context, arg UpdateWorkspaceAgentLifecycleStateByIDParams) error
UpdateWorkspaceAgentLogOverflowByID(ctx context.Context, arg UpdateWorkspaceAgentLogOverflowByIDParams) error
+34 -20
@@ -10408,11 +10408,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(1), summary.TotalPrsCreated)
assert.Equal(t, int64(8_000_000), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10442,11 +10441,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(1), summary.TotalPrsMerged)
// RecentPRs ordered by created_at DESC: chatB is newer.
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 2)
@@ -10491,11 +10489,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(1), summary.TotalPrsCreated)
assert.Equal(t, int64(1), summary.TotalPrsMerged)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10533,11 +10530,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(9_000_000), summary.TotalCostMicros)
// RecentPRs should return 1 row with the full tree cost.
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10575,11 +10571,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(2), summary.TotalPrsCreated)
assert.Equal(t, int64(8_000_000), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 2)
@@ -10621,11 +10616,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(2), summary.TotalPrsCreated)
assert.Equal(t, int64(17_000_000), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 2)
@@ -10658,11 +10652,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(2), summary.TotalPrsCreated)
assert.Equal(t, int64(10_000_000), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 2)
@@ -10695,11 +10688,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(1), summary.TotalPrsCreated)
assert.Equal(t, int64(15_000_000), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10724,11 +10716,10 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(1), summary.TotalPrsCreated)
assert.Equal(t, int64(0), summary.TotalCostMicros)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10767,11 +10758,10 @@ func TestGetPRInsights(t *testing.T) {
require.Len(t, byModel, 1)
assert.Equal(t, modelName, byModel[0].DisplayName)
recent, err := store.GetPRInsightsRecentPRs(context.Background(), database.GetPRInsightsRecentPRsParams{
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
LimitVal: 20,
})
require.NoError(t, err)
require.Len(t, recent, 1)
@@ -10803,6 +10793,30 @@ func TestGetPRInsights(t *testing.T) {
assert.Equal(t, int64(8_000_000), summary.TotalCostMicros)
assert.Equal(t, int64(5_000_000), summary.MergedCostMicros)
})
t.Run("AllPRsReturnedWithSafetyCap", func(t *testing.T) {
t.Parallel()
store, userID, mcID := setupChatInfra(t)
// Create 25 distinct PRs — more than the old LIMIT 20 — and
// verify all are returned.
const prCount = 25
for i := range prCount {
chat := createChat(t, store, userID, mcID, fmt.Sprintf("chat-%d", i))
insertCostMessage(t, store, chat.ID, userID, mcID, 1_000_000)
linkPR(t, store, chat.ID,
fmt.Sprintf("https://github.com/org/repo/pull/%d", 100+i),
"merged", fmt.Sprintf("fix: pr-%d", i), 10, 2, 1)
}
recent, err := store.GetPRInsightsPullRequests(context.Background(), database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: noOwner,
})
require.NoError(t, err)
assert.Len(t, recent, prCount, "all PRs within the date range should be returned")
})
}
func TestChatPinOrderQueries(t *testing.T) {
+72 -49
@@ -3218,7 +3218,7 @@ func (q *sqlQuerier) GetPRInsightsPerModel(ctx context.Context, arg GetPRInsight
return items, nil
}
const getPRInsightsRecentPRs = `-- name: GetPRInsightsRecentPRs :many
const getPRInsightsPullRequests = `-- name: GetPRInsightsPullRequests :many
WITH pr_costs AS (
SELECT
prc.pr_key,
@@ -3238,9 +3238,9 @@ WITH pr_costs AS (
AND cds2.pull_request_state IS NOT NULL
))
WHERE cds.pull_request_state IS NOT NULL
AND c.created_at >= $2::timestamptz
AND c.created_at < $3::timestamptz
AND ($4::uuid IS NULL OR c.owner_id = $4::uuid)
AND c.created_at >= $1::timestamptz
AND c.created_at < $2::timestamptz
AND ($3::uuid IS NULL OR c.owner_id = $3::uuid)
) prc
LEFT JOIN LATERAL (
SELECT COALESCE(SUM(cm.total_cost_micros), 0) AS cost_micros
@@ -3275,9 +3275,9 @@ deduped AS (
JOIN chats c ON c.id = cds.chat_id
LEFT JOIN chat_model_configs cmc ON cmc.id = c.last_model_config_id
WHERE cds.pull_request_state IS NOT NULL
AND c.created_at >= $2::timestamptz
AND c.created_at < $3::timestamptz
AND ($4::uuid IS NULL OR c.owner_id = $4::uuid)
AND c.created_at >= $1::timestamptz
AND c.created_at < $2::timestamptz
AND ($3::uuid IS NULL OR c.owner_id = $3::uuid)
ORDER BY COALESCE(NULLIF(cds.url, ''), c.id::text), c.created_at DESC, c.id DESC
)
SELECT chat_id, pr_title, pr_url, pr_number, state, draft, additions, deletions, changed_files, commits, approved, changes_requested, reviewer_count, author_login, author_avatar_url, base_branch, model_display_name, cost_micros, created_at FROM (
@@ -3305,17 +3305,16 @@ SELECT chat_id, pr_title, pr_url, pr_number, state, draft, additions, deletions,
JOIN pr_costs pc ON pc.pr_key = d.pr_key
) sub
ORDER BY sub.created_at DESC
LIMIT $1::int
LIMIT 500
`
type GetPRInsightsRecentPRsParams struct {
LimitVal int32 `db:"limit_val" json:"limit_val"`
type GetPRInsightsPullRequestsParams struct {
StartDate time.Time `db:"start_date" json:"start_date"`
EndDate time.Time `db:"end_date" json:"end_date"`
OwnerID uuid.NullUUID `db:"owner_id" json:"owner_id"`
}
type GetPRInsightsRecentPRsRow struct {
type GetPRInsightsPullRequestsRow struct {
ChatID uuid.UUID `db:"chat_id" json:"chat_id"`
PrTitle string `db:"pr_title" json:"pr_title"`
PrUrl sql.NullString `db:"pr_url" json:"pr_url"`
@@ -3337,24 +3336,20 @@ type GetPRInsightsRecentPRsRow struct {
CreatedAt time.Time `db:"created_at" json:"created_at"`
}
// Returns individual PR rows with cost for the recent PRs table.
// Returns all individual PR rows with cost for the selected time range.
// Uses two CTEs: pr_costs sums cost for the PR-linked chat and its
// direct children (that lack their own PR), and deduped picks one row
// per PR for metadata.
func (q *sqlQuerier) GetPRInsightsRecentPRs(ctx context.Context, arg GetPRInsightsRecentPRsParams) ([]GetPRInsightsRecentPRsRow, error) {
rows, err := q.db.QueryContext(ctx, getPRInsightsRecentPRs,
arg.LimitVal,
arg.StartDate,
arg.EndDate,
arg.OwnerID,
)
// per PR for metadata. A safety-cap LIMIT guards against unexpectedly
// large result sets from direct API callers.
func (q *sqlQuerier) GetPRInsightsPullRequests(ctx context.Context, arg GetPRInsightsPullRequestsParams) ([]GetPRInsightsPullRequestsRow, error) {
rows, err := q.db.QueryContext(ctx, getPRInsightsPullRequests, arg.StartDate, arg.EndDate, arg.OwnerID)
if err != nil {
return nil, err
}
defer rows.Close()
var items []GetPRInsightsRecentPRsRow
var items []GetPRInsightsPullRequestsRow
for rows.Next() {
var i GetPRInsightsRecentPRsRow
var i GetPRInsightsPullRequestsRow
if err := rows.Scan(
&i.ChatID,
&i.PrTitle,
@@ -5823,20 +5818,18 @@ WHERE
ELSE chats.archived = $2 :: boolean
END
AND CASE
-- This allows using the last element on a page as effectively a cursor.
-- This is an important option for scripts that need to paginate without
-- duplicating or missing data.
-- Cursor pagination: the last element on a page acts as the cursor.
-- The 4-tuple matches the ORDER BY below. All columns sort DESC
-- (pin_order is negated so lower values sort first in DESC order),
-- which lets us use a single tuple < comparison.
WHEN $3 :: uuid != '00000000-0000-0000-0000-000000000000'::uuid THEN (
-- The pagination cursor is the last ID of the previous page.
-- The query is ordered by the updated_at field, so select all
-- rows before the cursor.
(updated_at, id) < (
(CASE WHEN pin_order > 0 THEN 1 ELSE 0 END, -pin_order, updated_at, id) < (
SELECT
updated_at, id
CASE WHEN c2.pin_order > 0 THEN 1 ELSE 0 END, -c2.pin_order, c2.updated_at, c2.id
FROM
chats
chats c2
WHERE
id = $3
c2.id = $3
)
)
ELSE true
@@ -5848,9 +5841,15 @@ WHERE
-- Authorize Filter clause will be injected below in GetAuthorizedChats
-- @authorize_filter
ORDER BY
-- Deterministic and consistent ordering of all rows, even if they share
-- a timestamp. This is to ensure consistent pagination.
(updated_at, id) DESC OFFSET $5
-- Pinned chats (pin_order > 0) sort before unpinned ones. Within
-- pinned chats, lower pin_order values come first. The negation
-- trick (-pin_order) keeps all sort columns DESC so the cursor
-- tuple < comparison works with uniform direction.
CASE WHEN pin_order > 0 THEN 1 ELSE 0 END DESC,
-pin_order DESC,
updated_at DESC,
id DESC
OFFSET $5
LIMIT
-- The chat list is unbounded and expected to grow large.
-- Default to 50 to prevent accidental excessively large queries.
@@ -17519,7 +17518,8 @@ SELECT
w.id AS workspace_id,
COALESCE(w.name, '') AS workspace_name,
-- Include the name of the provisioner_daemon associated to the job
COALESCE(pd.name, '') AS worker_name
COALESCE(pd.name, '') AS worker_name,
wb.transition as workspace_build_transition
FROM
provisioner_jobs pj
LEFT JOIN
@@ -17564,7 +17564,8 @@ GROUP BY
t.icon,
w.id,
w.name,
pd.name
pd.name,
wb.transition
ORDER BY
pj.created_at DESC
LIMIT
@@ -17581,18 +17582,19 @@ type GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerPar
}
type GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerRow struct {
ProvisionerJob ProvisionerJob `db:"provisioner_job" json:"provisioner_job"`
QueuePosition int64 `db:"queue_position" json:"queue_position"`
QueueSize int64 `db:"queue_size" json:"queue_size"`
AvailableWorkers []uuid.UUID `db:"available_workers" json:"available_workers"`
TemplateVersionName string `db:"template_version_name" json:"template_version_name"`
TemplateID uuid.NullUUID `db:"template_id" json:"template_id"`
TemplateName string `db:"template_name" json:"template_name"`
TemplateDisplayName string `db:"template_display_name" json:"template_display_name"`
TemplateIcon string `db:"template_icon" json:"template_icon"`
WorkspaceID uuid.NullUUID `db:"workspace_id" json:"workspace_id"`
WorkspaceName string `db:"workspace_name" json:"workspace_name"`
WorkerName string `db:"worker_name" json:"worker_name"`
ProvisionerJob ProvisionerJob `db:"provisioner_job" json:"provisioner_job"`
QueuePosition int64 `db:"queue_position" json:"queue_position"`
QueueSize int64 `db:"queue_size" json:"queue_size"`
AvailableWorkers []uuid.UUID `db:"available_workers" json:"available_workers"`
TemplateVersionName string `db:"template_version_name" json:"template_version_name"`
TemplateID uuid.NullUUID `db:"template_id" json:"template_id"`
TemplateName string `db:"template_name" json:"template_name"`
TemplateDisplayName string `db:"template_display_name" json:"template_display_name"`
TemplateIcon string `db:"template_icon" json:"template_icon"`
WorkspaceID uuid.NullUUID `db:"workspace_id" json:"workspace_id"`
WorkspaceName string `db:"workspace_name" json:"workspace_name"`
WorkerName string `db:"worker_name" json:"worker_name"`
WorkspaceBuildTransition NullWorkspaceTransition `db:"workspace_build_transition" json:"workspace_build_transition"`
}
func (q *sqlQuerier) GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisioner(ctx context.Context, arg GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerParams) ([]GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerRow, error) {
@@ -17644,6 +17646,7 @@ func (q *sqlQuerier) GetProvisionerJobsByOrganizationAndStatusWithQueuePositionA
&i.WorkspaceID,
&i.WorkspaceName,
&i.WorkerName,
&i.WorkspaceBuildTransition,
); err != nil {
return nil, err
}
@@ -26816,6 +26819,26 @@ func (q *sqlQuerier) UpdateWorkspaceAgentConnectionByID(ctx context.Context, arg
return err
}
const updateWorkspaceAgentDirectoryByID = `-- name: UpdateWorkspaceAgentDirectoryByID :exec
UPDATE
workspace_agents
SET
directory = $2, updated_at = $3
WHERE
id = $1
`
type UpdateWorkspaceAgentDirectoryByIDParams struct {
ID uuid.UUID `db:"id" json:"id"`
Directory string `db:"directory" json:"directory"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}
func (q *sqlQuerier) UpdateWorkspaceAgentDirectoryByID(ctx context.Context, arg UpdateWorkspaceAgentDirectoryByIDParams) error {
_, err := q.db.ExecContext(ctx, updateWorkspaceAgentDirectoryByID, arg.ID, arg.Directory, arg.UpdatedAt)
return err
}
const updateWorkspaceAgentDisplayAppsByID = `-- name: UpdateWorkspaceAgentDisplayAppsByID :exec
UPDATE
workspace_agents
+5 -4
@@ -173,11 +173,12 @@ JOIN pr_costs pc ON pc.pr_key = d.pr_key
GROUP BY d.model_config_id, d.display_name, d.model, d.provider
ORDER BY total_prs DESC;
-- name: GetPRInsightsRecentPRs :many
-- Returns individual PR rows with cost for the recent PRs table.
-- name: GetPRInsightsPullRequests :many
-- Returns all individual PR rows with cost for the selected time range.
-- Uses two CTEs: pr_costs sums cost for the PR-linked chat and its
-- direct children (that lack their own PR), and deduped picks one row
-- per PR for metadata.
-- per PR for metadata. A safety-cap LIMIT guards against unexpectedly
-- large result sets from direct API callers.
WITH pr_costs AS (
SELECT
prc.pr_key,
@@ -264,4 +265,4 @@ SELECT * FROM (
JOIN pr_costs pc ON pc.pr_key = d.pr_key
) sub
ORDER BY sub.created_at DESC
LIMIT @limit_val::int;
LIMIT 500;
+17 -13
@@ -353,20 +353,18 @@ WHERE
ELSE chats.archived = sqlc.narg('archived') :: boolean
END
AND CASE
-- This allows using the last element on a page as effectively a cursor.
-- This is an important option for scripts that need to paginate without
-- duplicating or missing data.
-- Cursor pagination: the last element on a page acts as the cursor.
-- The 4-tuple matches the ORDER BY below. All columns sort DESC
-- (pin_order is negated so lower values sort first in DESC order),
-- which lets us use a single tuple < comparison.
WHEN @after_id :: uuid != '00000000-0000-0000-0000-000000000000'::uuid THEN (
-- The pagination cursor is the last ID of the previous page.
-- The query is ordered by the updated_at field, so select all
-- rows before the cursor.
(updated_at, id) < (
(CASE WHEN pin_order > 0 THEN 1 ELSE 0 END, -pin_order, updated_at, id) < (
SELECT
updated_at, id
CASE WHEN c2.pin_order > 0 THEN 1 ELSE 0 END, -c2.pin_order, c2.updated_at, c2.id
FROM
chats
chats c2
WHERE
id = @after_id
c2.id = @after_id
)
)
ELSE true
@@ -378,9 +376,15 @@ WHERE
-- Authorize Filter clause will be injected below in GetAuthorizedChats
-- @authorize_filter
ORDER BY
-- Deterministic and consistent ordering of all rows, even if they share
-- a timestamp. This is to ensure consistent pagination.
(updated_at, id) DESC OFFSET @offset_opt
-- Pinned chats (pin_order > 0) sort before unpinned ones. Within
-- pinned chats, lower pin_order values come first. The negation
-- trick (-pin_order) keeps all sort columns DESC so the cursor
-- tuple < comparison works with uniform direction.
CASE WHEN pin_order > 0 THEN 1 ELSE 0 END DESC,
-pin_order DESC,
updated_at DESC,
id DESC
OFFSET @offset_opt
LIMIT
-- The chat list is unbounded and expected to grow large.
-- Default to 50 to prevent accidental excessively large queries.
+4 -2
@@ -195,7 +195,8 @@ SELECT
w.id AS workspace_id,
COALESCE(w.name, '') AS workspace_name,
-- Include the name of the provisioner_daemon associated to the job
COALESCE(pd.name, '') AS worker_name
COALESCE(pd.name, '') AS worker_name,
wb.transition as workspace_build_transition
FROM
provisioner_jobs pj
LEFT JOIN
@@ -240,7 +241,8 @@ GROUP BY
t.icon,
w.id,
w.name,
pd.name
pd.name,
wb.transition
ORDER BY
pj.created_at DESC
LIMIT
@@ -190,6 +190,14 @@ SET
WHERE
id = $1;
-- name: UpdateWorkspaceAgentDirectoryByID :exec
UPDATE
workspace_agents
SET
directory = $2, updated_at = $3
WHERE
id = $1;
-- name: GetWorkspaceAgentLogsAfter :many
SELECT
*
+9 -10
@@ -1810,9 +1810,9 @@ func (api *API) patchChat(rw http.ResponseWriter, r *http.Request) {
// - pinOrder > 0 && already pinned: reorder (shift
// neighbors, clamp to [1, count]).
// - pinOrder > 0 && not pinned: append to end. The
// requested value is intentionally ignored because
// PinChatByID also bumps updated_at to keep the
// chat visible in the paginated sidebar.
// requested value is intentionally ignored; the
// SQL ORDER BY sorts pinned chats first so they
// appear on page 1 of the paginated sidebar.
var err error
errMsg := "Failed to pin chat."
switch {
@@ -5626,7 +5626,7 @@ func (api *API) prInsights(rw http.ResponseWriter, r *http.Request) {
previousSummary database.GetPRInsightsSummaryRow
timeSeries []database.GetPRInsightsTimeSeriesRow
byModel []database.GetPRInsightsPerModelRow
recentPRs []database.GetPRInsightsRecentPRsRow
recentPRs []database.GetPRInsightsPullRequestsRow
)
eg, egCtx := errgroup.WithContext(ctx)
@@ -5674,11 +5674,10 @@ func (api *API) prInsights(rw http.ResponseWriter, r *http.Request) {
eg.Go(func() error {
var err error
recentPRs, err = api.Database.GetPRInsightsRecentPRs(egCtx, database.GetPRInsightsRecentPRsParams{
recentPRs, err = api.Database.GetPRInsightsPullRequests(egCtx, database.GetPRInsightsPullRequestsParams{
StartDate: startDate,
EndDate: endDate,
OwnerID: ownerID,
LimitVal: 20,
})
return err
})
@@ -5788,10 +5787,10 @@ func (api *API) prInsights(rw http.ResponseWriter, r *http.Request) {
}
httpapi.Write(ctx, rw, http.StatusOK, codersdk.PRInsightsResponse{
Summary: summary,
TimeSeries: tsEntries,
ByModel: modelEntries,
RecentPRs: prEntries,
Summary: summary,
TimeSeries: tsEntries,
ByModel: modelEntries,
PullRequests: prEntries,
})
}
+180
@@ -876,6 +876,186 @@ func TestListChats(t *testing.T) {
require.NoError(t, err)
require.Len(t, allChats, totalChats)
})
// Test that a pinned chat with an old updated_at appears on page 1.
t.Run("PinnedOnFirstPage", func(t *testing.T) {
t.Parallel()
ctx := testutil.Context(t, testutil.WaitLong)
client, _ := newChatClientWithDatabase(t)
_ = coderdtest.CreateFirstUser(t, client.Client)
_ = createChatModelConfig(t, client)
// Create the chat that will later be pinned. It gets the
// earliest updated_at because it is inserted first.
pinnedChat, err := client.CreateChat(ctx, codersdk.CreateChatRequest{
Content: []codersdk.ChatInputPart{{
Type: codersdk.ChatInputPartTypeText,
Text: "pinned-chat",
}},
})
require.NoError(t, err)
// Fill page 1 with newer chats so the pinned chat would
// normally be pushed off the first page (default limit 50).
const fillerCount = 51
fillerChats := make([]codersdk.Chat, 0, fillerCount)
for i := range fillerCount {
c, createErr := client.CreateChat(ctx, codersdk.CreateChatRequest{
Content: []codersdk.ChatInputPart{{
Type: codersdk.ChatInputPartTypeText,
Text: fmt.Sprintf("filler-%d", i),
}},
})
require.NoError(t, createErr)
fillerChats = append(fillerChats, c)
}
// Wait for all chats to reach a terminal status so
// updated_at is stable before paginating. A single
// polling loop checks every chat per tick to avoid
// O(N) separate Eventually loops.
allCreated := append([]codersdk.Chat{pinnedChat}, fillerChats...)
pending := make(map[uuid.UUID]struct{}, len(allCreated))
for _, c := range allCreated {
pending[c.ID] = struct{}{}
}
testutil.Eventually(ctx, t, func(_ context.Context) bool {
all, listErr := client.ListChats(ctx, &codersdk.ListChatsOptions{
Pagination: codersdk.Pagination{Limit: fillerCount + 10},
})
if listErr != nil {
return false
}
for _, ch := range all {
if _, ok := pending[ch.ID]; ok && ch.Status != codersdk.ChatStatusPending && ch.Status != codersdk.ChatStatusRunning {
delete(pending, ch.ID)
}
}
return len(pending) == 0
}, testutil.IntervalFast)
// Pin the earliest chat.
err = client.UpdateChat(ctx, pinnedChat.ID, codersdk.UpdateChatRequest{
PinOrder: ptr.Ref(int32(1)),
})
require.NoError(t, err)
// Fetch page 1 with default limit (50).
page1, err := client.ListChats(ctx, &codersdk.ListChatsOptions{
Pagination: codersdk.Pagination{Limit: 50},
})
require.NoError(t, err)
// The pinned chat must appear on page 1.
page1IDs := make(map[uuid.UUID]struct{}, len(page1))
for _, c := range page1 {
page1IDs[c.ID] = struct{}{}
}
_, found := page1IDs[pinnedChat.ID]
require.True(t, found, "pinned chat should appear on page 1")
// The pinned chat should be the first item in the list.
require.Equal(t, pinnedChat.ID, page1[0].ID, "pinned chat should be first")
})
// Test cursor pagination with a mix of pinned and unpinned chats.
t.Run("CursorWithPins", func(t *testing.T) {
t.Parallel()
ctx := testutil.Context(t, testutil.WaitLong)
client, _ := newChatClientWithDatabase(t)
_ = coderdtest.CreateFirstUser(t, client.Client)
_ = createChatModelConfig(t, client)
// Create 5 chats: 2 will be pinned, 3 unpinned.
const totalChats = 5
createdChats := make([]codersdk.Chat, 0, totalChats)
for i := range totalChats {
c, createErr := client.CreateChat(ctx, codersdk.CreateChatRequest{
Content: []codersdk.ChatInputPart{{
Type: codersdk.ChatInputPartTypeText,
Text: fmt.Sprintf("cursor-pin-chat-%d", i),
}},
})
require.NoError(t, createErr)
createdChats = append(createdChats, c)
}
// Wait for all chats to reach terminal status.
// Check each chat by ID rather than fetching the full list.
testutil.Eventually(ctx, t, func(_ context.Context) bool {
for _, c := range createdChats {
ch, err := client.GetChat(ctx, c.ID)
require.NoError(t, err, "GetChat should succeed for just-created chat %s", c.ID)
if ch.Status == codersdk.ChatStatusPending || ch.Status == codersdk.ChatStatusRunning {
return false
}
}
return true
}, testutil.IntervalFast)
// Pin the first two chats (oldest updated_at).
err := client.UpdateChat(ctx, createdChats[0].ID, codersdk.UpdateChatRequest{
PinOrder: ptr.Ref(int32(1)),
})
require.NoError(t, err)
err = client.UpdateChat(ctx, createdChats[1].ID, codersdk.UpdateChatRequest{
PinOrder: ptr.Ref(int32(1)),
})
require.NoError(t, err)
// Paginate with limit=2 using cursor (after_id).
const pageSize = 2
maxPages := totalChats/pageSize + 2
var allPaginated []codersdk.Chat
var afterID uuid.UUID
for range maxPages {
opts := &codersdk.ListChatsOptions{
Pagination: codersdk.Pagination{Limit: pageSize},
}
if afterID != uuid.Nil {
opts.Pagination.AfterID = afterID
}
page, listErr := client.ListChats(ctx, opts)
require.NoError(t, listErr)
if len(page) == 0 {
break
}
allPaginated = append(allPaginated, page...)
afterID = page[len(page)-1].ID
}
// All chats should appear exactly once.
seenIDs := make(map[uuid.UUID]struct{}, len(allPaginated))
for _, c := range allPaginated {
_, dup := seenIDs[c.ID]
require.False(t, dup, "chat %s appeared more than once", c.ID)
seenIDs[c.ID] = struct{}{}
}
require.Len(t, seenIDs, totalChats, "all chats should appear in paginated results")
// Pinned chats should come before unpinned ones, and
// within the pinned group, lower pin_order sorts first.
pinnedSeen := false
unpinnedSeen := false
for _, c := range allPaginated {
if c.PinOrder > 0 {
require.False(t, unpinnedSeen, "pinned chat %s appeared after unpinned chat", c.ID)
pinnedSeen = true
} else {
unpinnedSeen = true
}
}
require.True(t, pinnedSeen, "at least one pinned chat should exist")
// Verify within-pinned ordering: pin_order=1 before
// pin_order=2 (the -pin_order DESC column).
require.Equal(t, createdChats[0].ID, allPaginated[0].ID,
"pin_order=1 chat should be first")
require.Equal(t, createdChats[1].ID, allPaginated[1].ID,
"pin_order=2 chat should be second")
})
}
func TestListChatModels(t *testing.T) {
+3
@@ -435,6 +435,9 @@ func convertProvisionerJobWithQueuePosition(pj database.GetProvisionerJobsByOrga
if pj.WorkspaceID.Valid {
job.Metadata.WorkspaceID = &pj.WorkspaceID.UUID
}
if pj.WorkspaceBuildTransition.Valid {
job.Metadata.WorkspaceBuildTransition = codersdk.WorkspaceTransition(pj.WorkspaceBuildTransition.WorkspaceTransition)
}
return job
}
+8 -7
@@ -97,13 +97,14 @@ func TestProvisionerJobs(t *testing.T) {
// Verify that job metadata is correct.
assert.Equal(t, job2.Metadata, codersdk.ProvisionerJobMetadata{
TemplateVersionName: version.Name,
TemplateID: template.ID,
TemplateName: template.Name,
TemplateDisplayName: template.DisplayName,
TemplateIcon: template.Icon,
WorkspaceID: &w.ID,
WorkspaceName: w.Name,
TemplateVersionName: version.Name,
TemplateID: template.ID,
TemplateName: template.Name,
TemplateDisplayName: template.DisplayName,
TemplateIcon: template.Icon,
WorkspaceID: &w.ID,
WorkspaceName: w.Name,
WorkspaceBuildTransition: codersdk.WorkspaceTransitionStart,
})
})
})
+4 -4
@@ -2411,10 +2411,10 @@ func (c *ExperimentalClient) GetChatsByWorkspace(ctx context.Context, workspaceI
// PRInsightsResponse is the response from the PR insights endpoint.
type PRInsightsResponse struct {
Summary PRInsightsSummary `json:"summary"`
TimeSeries []PRInsightsTimeSeriesEntry `json:"time_series"`
ByModel []PRInsightsModelBreakdown `json:"by_model"`
RecentPRs []PRInsightsPullRequest `json:"recent_prs"`
Summary PRInsightsSummary `json:"summary"`
TimeSeries []PRInsightsTimeSeriesEntry `json:"time_series"`
ByModel []PRInsightsModelBreakdown `json:"by_model"`
PullRequests []PRInsightsPullRequest `json:"recent_prs"`
}
// PRInsightsSummary contains aggregate PR metrics for a time period,
+8 -7
@@ -143,13 +143,14 @@ type ProvisionerJobInput struct {
// ProvisionerJobMetadata contains metadata for the job.
type ProvisionerJobMetadata struct {
TemplateVersionName string `json:"template_version_name" table:"template version name"`
TemplateID uuid.UUID `json:"template_id" format:"uuid" table:"template id"`
TemplateName string `json:"template_name" table:"template name"`
TemplateDisplayName string `json:"template_display_name" table:"template display name"`
TemplateIcon string `json:"template_icon" table:"template icon"`
WorkspaceID *uuid.UUID `json:"workspace_id,omitempty" format:"uuid" table:"workspace id"`
WorkspaceName string `json:"workspace_name,omitempty" table:"workspace name"`
TemplateVersionName string `json:"template_version_name" table:"template version name"`
TemplateID uuid.UUID `json:"template_id" format:"uuid" table:"template id"`
TemplateName string `json:"template_name" table:"template name"`
TemplateDisplayName string `json:"template_display_name" table:"template display name"`
TemplateIcon string `json:"template_icon" table:"template icon"`
WorkspaceID *uuid.UUID `json:"workspace_id,omitempty" format:"uuid" table:"workspace id"`
WorkspaceName string `json:"workspace_name,omitempty" table:"workspace name"`
WorkspaceBuildTransition WorkspaceTransition `json:"workspace_build_transition,omitempty" table:"workspace build transition"`
}
// ProvisionerJobType represents the type of job.
@@ -23,6 +23,7 @@ The following database fields are currently encrypted:
- `external_auth_links.oauth_access_token`
- `external_auth_links.oauth_refresh_token`
- `crypto_keys.secret`
- `user_secrets.value`
Additional database fields may be encrypted in the future.
@@ -80,9 +80,19 @@ See [Proxy TLS Configuration](#proxy-tls-configuration) for configuration steps.
### Restricting proxy access
Requests to non-allowlisted domains are tunneled through the proxy without restriction.
Requests to non-allowlisted domains are tunneled through the proxy, but connections to private and reserved IP ranges are blocked by default.
The IP validation and TCP connect happen atomically, preventing DNS rebinding attacks where the resolved address could change between the check and the connection.
To prevent unauthorized use, restrict network access to the proxy so that only authorized clients can connect.
In case the Coder access URL resolves to a private address, it is automatically exempt from this restriction so the proxy can always reach its own deployment.
If you need to allow access to additional internal networks via the proxy, use the Allowlist CIDRs option ([`CODER_AIBRIDGE_PROXY_ALLOWED_PRIVATE_CIDRS`](../../../reference/cli/server.md#--aibridge-proxy-allowed-private-cidrs)):
```shell
CODER_AIBRIDGE_PROXY_ALLOWED_PRIVATE_CIDRS=10.0.0.0/8,172.16.0.0/12
# or via CLI flag:
--aibridge-proxy-allowed-private-cidrs=10.0.0.0/8,172.16.0.0/12
```
## CA Certificate
AI Gateway Proxy uses a CA (Certificate Authority) certificate to perform MITM interception of HTTPS traffic.
@@ -240,6 +250,11 @@ To ensure AI Gateway also routes requests through the upstream proxy, make sure
<!-- TODO(ssncferreira): Add diagram showing how AI Gateway Proxy integrates with upstream proxies -->
> [!NOTE]
> When an upstream proxy is configured, AI Gateway Proxy validates the destination IP before forwarding the request.
> However, the upstream proxy re-resolves DNS independently, so a small DNS rebinding window exists between the validation and the actual connection.
> Ensure your upstream proxy enforces its own restrictions on private and reserved IP ranges.
### Configuration
Configure the upstream proxy URL:
+21 -14
@@ -60,6 +60,7 @@ curl -X GET http://coder-server:8080/api/v2/users/{user}/workspace/{workspacenam
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -300,6 +301,7 @@ curl -X GET http://coder-server:8080/api/v2/workspacebuilds/{workspacebuild} \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1008,6 +1010,7 @@ curl -X GET http://coder-server:8080/api/v2/workspacebuilds/{workspacebuild}/sta
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1359,6 +1362,7 @@ curl -X GET http://coder-server:8080/api/v2/workspaces/{workspace}/builds \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1577,6 +1581,7 @@ Status Code **200**
| `»»» template_id` | string(uuid) | false | | |
| `»»» template_name` | string | false | | |
| `»»» template_version_name` | string | false | | |
| `»»» workspace_build_transition` | [codersdk.WorkspaceTransition](schemas.md#codersdkworkspacetransition) | false | | |
| `»»» workspace_id` | string(uuid) | false | | |
| `»»» workspace_name` | string | false | | |
| `»» organization_id` | string(uuid) | false | | |
@@ -1710,20 +1715,21 @@ Status Code **200**
#### Enumerated Values
| Property | Value(s) |
|---------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `status` | `canceled`, `canceling`, `connected`, `connecting`, `deleted`, `deleting`, `disconnected`, `failed`, `pending`, `running`, `starting`, `stopped`, `stopping`, `succeeded`, `timeout` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
| `reason` | `autostart`, `autostop`, `initiator` |
| `health` | `disabled`, `healthy`, `initializing`, `unhealthy` |
| `open_in` | `slim-window`, `tab` |
| `sharing_level` | `authenticated`, `organization`, `owner`, `public` |
| `state` | `complete`, `failure`, `idle`, `working` |
| `lifecycle_state` | `created`, `off`, `ready`, `shutdown_error`, `shutdown_timeout`, `shutting_down`, `start_error`, `start_timeout`, `starting` |
| `startup_script_behavior` | `blocking`, `non-blocking` |
| `workspace_transition` | `delete`, `start`, `stop` |
| `transition` | `delete`, `start`, `stop` |
| Property | Value(s) |
|------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `workspace_build_transition` | `delete`, `start`, `stop` |
| `status` | `canceled`, `canceling`, `connected`, `connecting`, `deleted`, `deleting`, `disconnected`, `failed`, `pending`, `running`, `starting`, `stopped`, `stopping`, `succeeded`, `timeout` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
| `reason` | `autostart`, `autostop`, `initiator` |
| `health` | `disabled`, `healthy`, `initializing`, `unhealthy` |
| `open_in` | `slim-window`, `tab` |
| `sharing_level` | `authenticated`, `organization`, `owner`, `public` |
| `state` | `complete`, `failure`, `idle`, `working` |
| `lifecycle_state` | `created`, `off`, `ready`, `shutdown_error`, `shutdown_timeout`, `shutting_down`, `start_error`, `start_timeout`, `starting` |
| `startup_script_behavior` | `blocking`, `non-blocking` |
| `workspace_transition` | `delete`, `start`, `stop` |
| `transition` | `delete`, `start`, `stop` |
To perform this operation, you must be authenticated. [Learn more](authentication.md).
@@ -1810,6 +1816,7 @@ curl -X POST http://coder-server:8080/api/v2/workspaces/{workspace}/builds \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+44 -40
@@ -317,6 +317,7 @@ curl -X GET http://coder-server:8080/api/v2/organizations/{organization}/provisi
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -346,49 +347,51 @@ curl -X GET http://coder-server:8080/api/v2/organizations/{organization}/provisi
Status Code **200**
| Name | Type | Required | Restrictions | Description |
|----------------------------|------------------------------------------------------------------------------|----------|--------------|-------------|
| `[array item]` | array | false | | |
| `» available_workers` | array | false | | |
| `» canceled_at` | string(date-time) | false | | |
| `» completed_at` | string(date-time) | false | | |
| `» created_at` | string(date-time) | false | | |
| `» error` | string | false | | |
| `» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `» file_id` | string(uuid) | false | | |
| `» id` | string(uuid) | false | | |
| `» initiator_id` | string(uuid) | false | | |
| `» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»» error` | string | false | | |
| `»» template_version_id` | string(uuid) | false | | |
| `»» workspace_build_id` | string(uuid) | false | | |
| `» logs_overflowed` | boolean | false | | |
| `» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»» template_display_name` | string | false | | |
| `»» template_icon` | string | false | | |
| `»» template_id` | string(uuid) | false | | |
| `»» template_name` | string | false | | |
| `»» template_version_name` | string | false | | |
| `»» workspace_id` | string(uuid) | false | | |
| `»» workspace_name` | string | false | | |
| `» organization_id`        | string(uuid)                                                                 | false    |              |             |
| `» queue_position`         | integer                                                                      | false    |              |             |
| `» queue_size`             | integer                                                                      | false    |              |             |
| `» started_at`             | string(date-time)                                                            | false    |              |             |
| `» status`                 | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus)     | false    |              |             |
| `» tags`                   | object                                                                       | false    |              |             |
| `»» [any property]`        | string                                                                       | false    |              |             |
| `» type`                   | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype)         | false    |              |             |
| `» worker_id`              | string(uuid)                                                                 | false    |              |             |
| `» worker_name` | string | false | | |
| Name | Type | Required | Restrictions | Description |
|---------------------------------|------------------------------------------------------------------------------|----------|--------------|-------------|
| `[array item]` | array | false | | |
| `» available_workers` | array | false | | |
| `» canceled_at` | string(date-time) | false | | |
| `» completed_at` | string(date-time) | false | | |
| `» created_at` | string(date-time) | false | | |
| `» error` | string | false | | |
| `» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `» file_id` | string(uuid) | false | | |
| `» id` | string(uuid) | false | | |
| `» initiator_id` | string(uuid) | false | | |
| `» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»» error` | string | false | | |
| `»» template_version_id` | string(uuid) | false | | |
| `»» workspace_build_id` | string(uuid) | false | | |
| `» logs_overflowed` | boolean | false | | |
| `» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»» template_display_name` | string | false | | |
| `»» template_icon` | string | false | | |
| `»» template_id` | string(uuid) | false | | |
| `»» template_name` | string | false | | |
| `»» template_version_name` | string | false | | |
| `»» workspace_build_transition` | [codersdk.WorkspaceTransition](schemas.md#codersdkworkspacetransition) | false | | |
| `»» workspace_id` | string(uuid) | false | | |
| `»» workspace_name`             | string                                                                       | false    |              |             |
| `» organization_id`             | string(uuid)                                                                 | false    |              |             |
| `» queue_position`              | integer                                                                      | false    |              |             |
| `» queue_size`                  | integer                                                                      | false    |              |             |
| `» started_at`                  | string(date-time)                                                            | false    |              |             |
| `» status`                      | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus)     | false    |              |             |
| `» tags`                        | object                                                                       | false    |              |             |
| `»» [any property]`             | string                                                                       | false    |              |             |
| `» type`                        | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype)         | false    |              |             |
| `» worker_id` | string(uuid) | false | | |
| `» worker_name` | string | false | | |
#### Enumerated Values
| Property | Value(s) |
|--------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
| Property | Value(s) |
|------------------------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `workspace_build_transition` | `delete`, `start`, `stop` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
To perform this operation, you must be authenticated. [Learn more](authentication.md).
@@ -441,6 +444,7 @@ curl -X GET http://coder-server:8080/api/v2/organizations/{organization}/provisi
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+18 -9
@@ -7121,6 +7121,7 @@ Only certain features set these fields: - FeatureManagedAgentLimit|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -7787,6 +7788,7 @@ Only certain features set these fields: - FeatureManagedAgentLimit|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -7896,6 +7898,7 @@ Only certain features set these fields: - FeatureManagedAgentLimit|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
}
@@ -7903,15 +7906,16 @@ Only certain features set these fields: - FeatureManagedAgentLimit|
### Properties
| Name | Type | Required | Restrictions | Description |
|-------------------------|--------|----------|--------------|-------------|
| `template_display_name` | string | false | | |
| `template_icon` | string | false | | |
| `template_id` | string | false | | |
| `template_name` | string | false | | |
| `template_version_name` | string | false | | |
| `workspace_id` | string | false | | |
| `workspace_name` | string | false | | |
| Name | Type | Required | Restrictions | Description |
|------------------------------|--------------------------------------------------------------|----------|--------------|-------------|
| `template_display_name` | string | false | | |
| `template_icon` | string | false | | |
| `template_id` | string | false | | |
| `template_name` | string | false | | |
| `template_version_name` | string | false | | |
| `workspace_build_transition` | [codersdk.WorkspaceTransition](#codersdkworkspacetransition) | false | | |
| `workspace_id` | string | false | | |
| `workspace_name` | string | false | | |
## codersdk.ProvisionerJobStatus
@@ -8467,6 +8471,7 @@ Only certain features set these fields: - FeatureManagedAgentLimit|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -10014,6 +10019,7 @@ Restarts will only happen on weekdays in this list on weeks which line up with W
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -11404,6 +11410,7 @@ If the schedule is empty, the user will be updated to use the default schedule.|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -12562,6 +12569,7 @@ If the schedule is empty, the user will be updated to use the default schedule.|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -13394,6 +13402,7 @@ If the schedule is empty, the user will be updated to use the default schedule.|
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+2
@@ -425,6 +425,7 @@ curl -X POST http://coder-server:8080/api/v2/tasks/{user}/{task}/pause \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -668,6 +669,7 @@ curl -X POST http://coder-server:8080/api/v2/tasks/{user}/{task}/resume \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+135 -122
@@ -493,6 +493,7 @@ curl -X GET http://coder-server:8080/api/v2/organizations/{organization}/templat
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -595,6 +596,7 @@ curl -X GET http://coder-server:8080/api/v2/organizations/{organization}/templat
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -721,6 +723,7 @@ curl -X POST http://coder-server:8080/api/v2/organizations/{organization}/templa
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1335,6 +1338,7 @@ curl -X GET http://coder-server:8080/api/v2/templates/{template}/versions \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1379,70 +1383,72 @@ curl -X GET http://coder-server:8080/api/v2/templates/{template}/versions \
Status Code **200**
| Name | Type | Required | Restrictions | Description |
|-----------------------------|------------------------------------------------------------------------------|----------|--------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `[array item]` | array | false | | |
| `» archived` | boolean | false | | |
| `» created_at` | string(date-time) | false | | |
| `» created_by` | [codersdk.MinimalUser](schemas.md#codersdkminimaluser) | false | | |
| `»» avatar_url` | string(uri) | false | | |
| `»» id` | string(uuid) | true | | |
| `»» name` | string | false | | |
| `»» username` | string | true | | |
| `» has_external_agent` | boolean | false | | |
| `» id` | string(uuid) | false | | |
| `» job` | [codersdk.ProvisionerJob](schemas.md#codersdkprovisionerjob) | false | | |
| `»» available_workers` | array | false | | |
| `»» canceled_at` | string(date-time) | false | | |
| `»» completed_at` | string(date-time) | false | | |
| `»» created_at` | string(date-time) | false | | |
| `»» error` | string | false | | |
| `»» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `»» file_id` | string(uuid) | false | | |
| `»» id` | string(uuid) | false | | |
| `»» initiator_id` | string(uuid) | false | | |
| `»» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»»» error` | string | false | | |
| `»»» template_version_id` | string(uuid) | false | | |
| `»»» workspace_build_id` | string(uuid) | false | | |
| `»» logs_overflowed` | boolean | false | | |
| `»» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»»» template_display_name` | string | false | | |
| `»»» template_icon` | string | false | | |
| `»»» template_id` | string(uuid) | false | | |
| `»»» template_name` | string | false | | |
| `»»» template_version_name` | string | false | | |
| `»»» workspace_id` | string(uuid) | false | | |
| `»»» workspace_name` | string | false | | |
| `»» organization_id` | string(uuid) | false | | |
| `»» queue_position` | integer | false | | |
| `»» queue_size` | integer | false | | |
| `»» started_at` | string(date-time) | false | | |
| `»» status` | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus) | false | | |
| `»» tags` | object | false | | |
| `»»» [any property]` | string | false | | |
| `»» type` | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype) | false | | |
| `»» worker_id` | string(uuid) | false | | |
| `»» worker_name` | string | false | | |
| matched_provisioners` | [codersdk.MatchedProvisioners](schemas.md#codersdkmatchedprovisioners) | false | | |
| » available` | integer | false | | Available is the number of provisioner daemons that are available to take jobs. This may be less than the count if some provisioners are busy or have been stopped. |
| `»» count` | integer | false | | Count is the number of provisioner daemons that matched the given tags. If the count is 0, it means no provisioner daemons matched the requested tags. |
| `»» most_recently_seen` | string(date-time) | false | | Most recently seen is the most recently seen time of the set of matched provisioners. If no provisioners matched, this field will be null. |
| `» message` | string | false | | |
| `» name` | string | false | | |
| `» organization_id` | string(uuid) | false | | |
| `» readme` | string | false | | |
| `» template_id` | string(uuid) | false | | |
| `» updated_at` | string(date-time) | false | | |
| `» warnings` | array | false | | |
| Name | Type | Required | Restrictions | Description |
|----------------------------------|------------------------------------------------------------------------------|----------|--------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `[array item]` | array | false | | |
| `» archived` | boolean | false | | |
| `» created_at` | string(date-time) | false | | |
| `» created_by` | [codersdk.MinimalUser](schemas.md#codersdkminimaluser) | false | | |
| `»» avatar_url` | string(uri) | false | | |
| `»» id` | string(uuid) | true | | |
| `»» name` | string | false | | |
| `»» username` | string | true | | |
| `» has_external_agent` | boolean | false | | |
| `» id` | string(uuid) | false | | |
| `» job` | [codersdk.ProvisionerJob](schemas.md#codersdkprovisionerjob) | false | | |
| `»» available_workers` | array | false | | |
| `»» canceled_at` | string(date-time) | false | | |
| `»» completed_at` | string(date-time) | false | | |
| `»» created_at` | string(date-time) | false | | |
| `»» error` | string | false | | |
| `»» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `»» file_id` | string(uuid) | false | | |
| `»» id` | string(uuid) | false | | |
| `»» initiator_id` | string(uuid) | false | | |
| `»» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»»» error` | string | false | | |
| `»»» template_version_id` | string(uuid) | false | | |
| `»»» workspace_build_id` | string(uuid) | false | | |
| `»» logs_overflowed` | boolean | false | | |
| `»» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»»» template_display_name` | string | false | | |
| `»»» template_icon` | string | false | | |
| `»»» template_id` | string(uuid) | false | | |
| `»»» template_name` | string | false | | |
| `»»» template_version_name` | string | false | | |
| `»»» workspace_build_transition` | [codersdk.WorkspaceTransition](schemas.md#codersdkworkspacetransition) | false | | |
| `»»» workspace_id` | string(uuid) | false | | |
| `»»» workspace_name` | string | false | | |
| `»» organization_id` | string(uuid) | false | | |
| `»» queue_position` | integer | false | | |
| `»» queue_size` | integer | false | | |
| `»» started_at` | string(date-time) | false | | |
| `»» status` | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus) | false | | |
| `»» tags` | object | false | | |
| `»»» [any property]` | string | false | | |
| `»» type` | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype) | false | | |
| `»» worker_id` | string(uuid) | false | | |
| `»» worker_name` | string | false | | |
| `» matched_provisioners` | [codersdk.MatchedProvisioners](schemas.md#codersdkmatchedprovisioners) | false | | |
| `»» available` | integer | false | | Available is the number of provisioner daemons that are available to take jobs. This may be less than the count if some provisioners are busy or have been stopped. |
| `»» count` | integer | false | | Count is the number of provisioner daemons that matched the given tags. If the count is 0, it means no provisioner daemons matched the requested tags. |
| `»» most_recently_seen` | string(date-time) | false | | Most recently seen is the most recently seen time of the set of matched provisioners. If no provisioners matched, this field will be null. |
| `» message` | string | false | | |
| `» name` | string | false | | |
| `» organization_id` | string(uuid) | false | | |
| `» readme` | string | false | | |
| `» template_id` | string(uuid) | false | | |
| `» updated_at` | string(date-time) | false | | |
| `» warnings` | array | false | | |
#### Enumerated Values
| Property | Value(s) |
|--------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
| Property | Value(s) |
|------------------------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `workspace_build_transition` | `delete`, `start`, `stop` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
To perform this operation, you must be authenticated. [Learn more](authentication.md).
@@ -1615,6 +1621,7 @@ curl -X GET http://coder-server:8080/api/v2/templates/{template}/versions/{templ
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1659,70 +1666,72 @@ curl -X GET http://coder-server:8080/api/v2/templates/{template}/versions/{templ
Status Code **200**
| Name | Type | Required | Restrictions | Description |
|-----------------------------|------------------------------------------------------------------------------|----------|--------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `[array item]` | array | false | | |
| `» archived` | boolean | false | | |
| `» created_at` | string(date-time) | false | | |
| `» created_by` | [codersdk.MinimalUser](schemas.md#codersdkminimaluser) | false | | |
| `»» avatar_url` | string(uri) | false | | |
| `»» id` | string(uuid) | true | | |
| `»» name` | string | false | | |
| `»» username` | string | true | | |
| `» has_external_agent` | boolean | false | | |
| `» id` | string(uuid) | false | | |
| `» job` | [codersdk.ProvisionerJob](schemas.md#codersdkprovisionerjob) | false | | |
| `»» available_workers` | array | false | | |
| `»» canceled_at` | string(date-time) | false | | |
| `»» completed_at` | string(date-time) | false | | |
| `»» created_at` | string(date-time) | false | | |
| `»» error` | string | false | | |
| `»» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `»» file_id` | string(uuid) | false | | |
| `»» id` | string(uuid) | false | | |
| `»» initiator_id` | string(uuid) | false | | |
| `»» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»»» error` | string | false | | |
| `»»» template_version_id` | string(uuid) | false | | |
| `»»» workspace_build_id` | string(uuid) | false | | |
| `»» logs_overflowed` | boolean | false | | |
| `»» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»»» template_display_name` | string | false | | |
| `»»» template_icon` | string | false | | |
| `»»» template_id` | string(uuid) | false | | |
| `»»» template_name` | string | false | | |
| `»»» template_version_name` | string | false | | |
| `»»» workspace_id` | string(uuid) | false | | |
| `»»» workspace_name` | string | false | | |
| `»» organization_id` | string(uuid) | false | | |
| `»» queue_position` | integer | false | | |
| `»» queue_size` | integer | false | | |
| `»» started_at` | string(date-time) | false | | |
| `»» status` | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus) | false | | |
| `»» tags` | object | false | | |
| `»»» [any property]` | string | false | | |
| `»» type` | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype) | false | | |
| `»» worker_id` | string(uuid) | false | | |
| `»» worker_name` | string | false | | |
| `» matched_provisioners` | [codersdk.MatchedProvisioners](schemas.md#codersdkmatchedprovisioners) | false | | |
| `»» available` | integer | false | | Available is the number of provisioner daemons that are available to take jobs. This may be less than the count if some provisioners are busy or have been stopped. |
| `»» count` | integer | false | | Count is the number of provisioner daemons that matched the given tags. If the count is 0, it means no provisioner daemons matched the requested tags. |
| `»» most_recently_seen` | string(date-time) | false | | Most recently seen is the most recently seen time of the set of matched provisioners. If no provisioners matched, this field will be null. |
| `» message` | string | false | | |
| `» name` | string | false | | |
| `» organization_id` | string(uuid) | false | | |
| `» readme` | string | false | | |
| `» template_id` | string(uuid) | false | | |
| `» updated_at` | string(date-time) | false | | |
| `» warnings` | array | false | | |
| Name | Type | Required | Restrictions | Description |
|----------------------------------|------------------------------------------------------------------------------|----------|--------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `[array item]` | array | false | | |
| `» archived` | boolean | false | | |
| `» created_at` | string(date-time) | false | | |
| `» created_by` | [codersdk.MinimalUser](schemas.md#codersdkminimaluser) | false | | |
| `»» avatar_url` | string(uri) | false | | |
| `»» id` | string(uuid) | true | | |
| `»» name` | string | false | | |
| `»» username` | string | true | | |
| `» has_external_agent` | boolean | false | | |
| `» id` | string(uuid) | false | | |
| `» job` | [codersdk.ProvisionerJob](schemas.md#codersdkprovisionerjob) | false | | |
| `»» available_workers` | array | false | | |
| `»» canceled_at` | string(date-time) | false | | |
| `»» completed_at` | string(date-time) | false | | |
| `»» created_at` | string(date-time) | false | | |
| `»» error` | string | false | | |
| `»» error_code` | [codersdk.JobErrorCode](schemas.md#codersdkjoberrorcode) | false | | |
| `»» file_id` | string(uuid) | false | | |
| `»» id` | string(uuid) | false | | |
| `»» initiator_id` | string(uuid) | false | | |
| `»» input` | [codersdk.ProvisionerJobInput](schemas.md#codersdkprovisionerjobinput) | false | | |
| `»»» error` | string | false | | |
| `»»» template_version_id` | string(uuid) | false | | |
| `»»» workspace_build_id` | string(uuid) | false | | |
| `»» logs_overflowed` | boolean | false | | |
| `»» metadata` | [codersdk.ProvisionerJobMetadata](schemas.md#codersdkprovisionerjobmetadata) | false | | |
| `»»» template_display_name` | string | false | | |
| `»»» template_icon` | string | false | | |
| `»»» template_id` | string(uuid) | false | | |
| `»»» template_name` | string | false | | |
| `»»» template_version_name` | string | false | | |
| `»»» workspace_build_transition` | [codersdk.WorkspaceTransition](schemas.md#codersdkworkspacetransition) | false | | |
| `»»» workspace_id` | string(uuid) | false | | |
| `»»» workspace_name` | string | false | | |
| `»» organization_id` | string(uuid) | false | | |
| `»» queue_position` | integer | false | | |
| `»» queue_size` | integer | false | | |
| `»» started_at` | string(date-time) | false | | |
| `»» status` | [codersdk.ProvisionerJobStatus](schemas.md#codersdkprovisionerjobstatus) | false | | |
| `»» tags` | object | false | | |
| `»»» [any property]` | string | false | | |
| `»» type` | [codersdk.ProvisionerJobType](schemas.md#codersdkprovisionerjobtype) | false | | |
| `»» worker_id` | string(uuid) | false | | |
| `»» worker_name` | string | false | | |
| `» matched_provisioners` | [codersdk.MatchedProvisioners](schemas.md#codersdkmatchedprovisioners) | false | | |
| `»» available` | integer | false | | Available is the number of provisioner daemons that are available to take jobs. This may be less than the count if some provisioners are busy or have been stopped. |
| `»» count` | integer | false | | Count is the number of provisioner daemons that matched the given tags. If the count is 0, it means no provisioner daemons matched the requested tags. |
| `»» most_recently_seen` | string(date-time) | false | | Most recently seen is the most recently seen time of the set of matched provisioners. If no provisioners matched, this field will be null. |
| `» message` | string | false | | |
| `» name` | string | false | | |
| `» organization_id` | string(uuid) | false | | |
| `» readme` | string | false | | |
| `» template_id` | string(uuid) | false | | |
| `» updated_at` | string(date-time) | false | | |
| `» warnings` | array | false | | |
#### Enumerated Values
| Property | Value(s) |
|--------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
| Property | Value(s) |
|------------------------------|--------------------------------------------------------------------------|
| `error_code` | `REQUIRED_TEMPLATE_VARIABLES` |
| `workspace_build_transition` | `delete`, `start`, `stop` |
| `status` | `canceled`, `canceling`, `failed`, `pending`, `running`, `succeeded` |
| `type` | `template_version_dry_run`, `template_version_import`, `workspace_build` |
To perform this operation, you must be authenticated. [Learn more](authentication.md).
@@ -1785,6 +1794,7 @@ curl -X GET http://coder-server:8080/api/v2/templateversions/{templateversion} \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1896,6 +1906,7 @@ curl -X PATCH http://coder-server:8080/api/v2/templateversions/{templateversion}
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -2095,6 +2106,7 @@ curl -X POST http://coder-server:8080/api/v2/templateversions/{templateversion}/
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -2170,6 +2182,7 @@ curl -X GET http://coder-server:8080/api/v2/templateversions/{templateversion}/d
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+6
View File
@@ -115,6 +115,7 @@ of the template will be used.
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -478,6 +479,7 @@ curl -X GET http://coder-server:8080/api/v2/users/{user}/workspace/{workspacenam
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -808,6 +810,7 @@ of the template will be used.
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1116,6 +1119,7 @@ curl -X GET http://coder-server:8080/api/v2/workspaces \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1405,6 +1409,7 @@ curl -X GET http://coder-server:8080/api/v2/workspaces/{workspace} \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
@@ -1971,6 +1976,7 @@ curl -X PUT http://coder-server:8080/api/v2/workspaces/{workspace}/dormant \
"template_id": "c6d67e98-83ea-49f0-8812-e4abae2b68bc",
"template_name": "string",
"template_version_name": "string",
"workspace_build_transition": "start",
"workspace_id": "0967198e-ec7b-4c6b-b4d3-f71244cadbe9",
"workspace_name": "string"
},
+4 -4
View File
@@ -54,10 +54,10 @@ Select which organization (uuid or name) to use.
### -c, --column
| | |
|---------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Type | <code>[id\|created at\|started at\|completed at\|canceled at\|error\|error code\|status\|worker id\|worker name\|file id\|tags\|queue position\|queue size\|organization id\|initiator id\|template version id\|workspace build id\|type\|available workers\|template version name\|template id\|template name\|template display name\|template icon\|workspace id\|workspace name\|logs overflowed\|organization\|queue]</code> |
| Default | <code>created at,id,type,template display name,status,queue,tags</code> |
| | |
|---------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Type | <code>[id\|created at\|started at\|completed at\|canceled at\|error\|error code\|status\|worker id\|worker name\|file id\|tags\|queue position\|queue size\|organization id\|initiator id\|template version id\|workspace build id\|type\|available workers\|template version name\|template id\|template name\|template display name\|template icon\|workspace id\|workspace name\|workspace build transition\|logs overflowed\|organization\|queue]</code> |
| Default | <code>created at,id,type,template display name,status,queue,tags</code> |
Columns to display in table output.
+19
View File
@@ -197,6 +197,10 @@ func TestServerDBCrypt(t *testing.T) {
gitAuthLinks, err := db.GetExternalAuthLinksByUserID(ctx, usr.ID)
require.NoError(t, err, "failed to get git auth links for user %s", usr.ID)
require.Empty(t, gitAuthLinks)
userSecrets, err := db.ListUserSecretsWithValues(ctx, usr.ID)
require.NoError(t, err, "failed to get user secrets for user %s", usr.ID)
require.Empty(t, userSecrets)
}
// Validate that the key has been revoked in the database.
@@ -242,6 +246,14 @@ func genData(t *testing.T, db database.Store) []database.User {
OAuthRefreshToken: "refresh-" + usr.ID.String(),
})
}
_ = dbgen.UserSecret(t, db, database.UserSecret{
UserID: usr.ID,
Name: "secret-" + usr.ID.String(),
Value: "value-" + usr.ID.String(),
EnvName: "",
FilePath: "",
})
users = append(users, usr)
}
}
@@ -283,6 +295,13 @@ func requireEncryptedWithCipher(ctx context.Context, t *testing.T, db database.S
require.Equal(t, c.HexDigest(), gal.OAuthAccessTokenKeyID.String)
require.Equal(t, c.HexDigest(), gal.OAuthRefreshTokenKeyID.String)
}
userSecrets, err := db.ListUserSecretsWithValues(ctx, userID)
require.NoError(t, err, "failed to get user secrets for user %s", userID)
for _, s := range userSecrets {
requireEncryptedEquals(t, c, "value-"+userID.String(), s.Value)
require.Equal(t, c.HexDigest(), s.ValueKeyID.String)
}
}
// nullCipher is a dbcrypt.Cipher that does not encrypt or decrypt.
@@ -11,7 +11,7 @@ OPTIONS:
-O, --org string, $CODER_ORGANIZATION
Select which organization (uuid or name) to use.
-c, --column [id|created at|started at|completed at|canceled at|error|error code|status|worker id|worker name|file id|tags|queue position|queue size|organization id|initiator id|template version id|workspace build id|type|available workers|template version name|template id|template name|template display name|template icon|workspace id|workspace name|logs overflowed|organization|queue] (default: created at,id,type,template display name,status,queue,tags)
-c, --column [id|created at|started at|completed at|canceled at|error|error code|status|worker id|worker name|file id|tags|queue position|queue size|organization id|initiator id|template version id|workspace build id|type|available workers|template version name|template id|template name|template display name|template icon|workspace id|workspace name|workspace build transition|logs overflowed|organization|queue] (default: created at,id,type,template display name,status,queue,tags)
Columns to display in table output.
-i, --initiator string, $CODER_PROVISIONER_JOB_LIST_INITIATOR
+58
View File
@@ -96,6 +96,34 @@ func Rotate(ctx context.Context, log slog.Logger, sqlDB *sql.DB, ciphers []Ciphe
}
log.Debug(ctx, "encrypted user chat provider key", slog.F("user_id", uid), slog.F("chat_provider_id", userProviderKey.ChatProviderID), slog.F("current", idx+1), slog.F("cipher", ciphers[0].HexDigest()))
}
userSecrets, err := cryptTx.ListUserSecretsWithValues(ctx, uid)
if err != nil {
return xerrors.Errorf("get user secrets for user %s: %w", uid, err)
}
for _, secret := range userSecrets {
if secret.ValueKeyID.Valid && secret.ValueKeyID.String == ciphers[0].HexDigest() {
log.Debug(ctx, "skipping user secret", slog.F("user_id", uid), slog.F("secret_name", secret.Name), slog.F("current", idx+1), slog.F("cipher", ciphers[0].HexDigest()))
continue
}
if _, err := cryptTx.UpdateUserSecretByUserIDAndName(ctx, database.UpdateUserSecretByUserIDAndNameParams{
UserID: uid,
Name: secret.Name,
UpdateValue: true,
Value: secret.Value,
ValueKeyID: sql.NullString{}, // dbcrypt will re-encrypt
UpdateDescription: false,
Description: "",
UpdateEnvName: false,
EnvName: "",
UpdateFilePath: false,
FilePath: "",
}); err != nil {
return xerrors.Errorf("rotate user secret user_id=%s name=%s: %w", uid, secret.Name, err)
}
log.Debug(ctx, "rotated user secret", slog.F("user_id", uid), slog.F("secret_name", secret.Name), slog.F("current", idx+1), slog.F("cipher", ciphers[0].HexDigest()))
}
return nil
}, &database.TxOptions{
Isolation: sql.LevelRepeatableRead,
@@ -235,6 +263,34 @@ func Decrypt(ctx context.Context, log slog.Logger, sqlDB *sql.DB, ciphers []Ciph
}
log.Debug(ctx, "decrypted user chat provider key", slog.F("user_id", uid), slog.F("chat_provider_id", userProviderKey.ChatProviderID), slog.F("current", idx+1))
}
userSecrets, err := tx.ListUserSecretsWithValues(ctx, uid)
if err != nil {
return xerrors.Errorf("get user secrets for user %s: %w", uid, err)
}
for _, secret := range userSecrets {
if !secret.ValueKeyID.Valid {
log.Debug(ctx, "skipping user secret", slog.F("user_id", uid), slog.F("secret_name", secret.Name), slog.F("current", idx+1))
continue
}
if _, err := tx.UpdateUserSecretByUserIDAndName(ctx, database.UpdateUserSecretByUserIDAndNameParams{
UserID: uid,
Name: secret.Name,
UpdateValue: true,
Value: secret.Value,
ValueKeyID: sql.NullString{}, // clear the key ID
UpdateDescription: false,
Description: "",
UpdateEnvName: false,
EnvName: "",
UpdateFilePath: false,
FilePath: "",
}); err != nil {
return xerrors.Errorf("decrypt user secret user_id=%s name=%s: %w", uid, secret.Name, err)
}
log.Debug(ctx, "decrypted user secret", slog.F("user_id", uid), slog.F("secret_name", secret.Name), slog.F("current", idx+1))
}
return nil
}, &database.TxOptions{
Isolation: sql.LevelRepeatableRead,
@@ -292,6 +348,8 @@ DELETE FROM external_auth_links
OR oauth_refresh_token_key_id IS NOT NULL;
DELETE FROM user_chat_provider_keys
WHERE api_key_key_id IS NOT NULL;
DELETE FROM user_secrets
WHERE value_key_id IS NOT NULL;
UPDATE chat_providers
SET api_key = '',
api_key_key_id = NULL
+54
View File
@@ -717,6 +717,60 @@ func (db *dbCrypt) UpsertMCPServerUserToken(ctx context.Context, params database
return tok, nil
}
func (db *dbCrypt) CreateUserSecret(ctx context.Context, params database.CreateUserSecretParams) (database.UserSecret, error) {
if err := db.encryptField(&params.Value, &params.ValueKeyID); err != nil {
return database.UserSecret{}, err
}
secret, err := db.Store.CreateUserSecret(ctx, params)
if err != nil {
return database.UserSecret{}, err
}
if err := db.decryptField(&secret.Value, secret.ValueKeyID); err != nil {
return database.UserSecret{}, err
}
return secret, nil
}
func (db *dbCrypt) GetUserSecretByUserIDAndName(ctx context.Context, arg database.GetUserSecretByUserIDAndNameParams) (database.UserSecret, error) {
secret, err := db.Store.GetUserSecretByUserIDAndName(ctx, arg)
if err != nil {
return database.UserSecret{}, err
}
if err := db.decryptField(&secret.Value, secret.ValueKeyID); err != nil {
return database.UserSecret{}, err
}
return secret, nil
}
func (db *dbCrypt) ListUserSecretsWithValues(ctx context.Context, userID uuid.UUID) ([]database.UserSecret, error) {
secrets, err := db.Store.ListUserSecretsWithValues(ctx, userID)
if err != nil {
return nil, err
}
for i := range secrets {
if err := db.decryptField(&secrets[i].Value, secrets[i].ValueKeyID); err != nil {
return nil, err
}
}
return secrets, nil
}
func (db *dbCrypt) UpdateUserSecretByUserIDAndName(ctx context.Context, arg database.UpdateUserSecretByUserIDAndNameParams) (database.UserSecret, error) {
if arg.UpdateValue {
if err := db.encryptField(&arg.Value, &arg.ValueKeyID); err != nil {
return database.UserSecret{}, err
}
}
secret, err := db.Store.UpdateUserSecretByUserIDAndName(ctx, arg)
if err != nil {
return database.UserSecret{}, err
}
if err := db.decryptField(&secret.Value, secret.ValueKeyID); err != nil {
return database.UserSecret{}, err
}
return secret, nil
}
func (db *dbCrypt) encryptField(field *string, digest *sql.NullString) error {
// If no cipher is loaded, then we can't encrypt anything!
if db.ciphers == nil || db.primaryCipherDigest == "" {
+195
View File
@@ -1287,3 +1287,198 @@ func TestUserChatProviderKeys(t *testing.T) {
requireEncryptedEquals(t, ciphers[0], rawKey.APIKey, updatedAPIKey)
})
}
func TestUserSecrets(t *testing.T) {
t.Parallel()
ctx := context.Background()
const (
//nolint:gosec // test credentials
initialValue = "super-secret-value-initial"
//nolint:gosec // test credentials
updatedValue = "super-secret-value-updated"
)
insertUserSecret := func(
t *testing.T,
crypt *dbCrypt,
ciphers []Cipher,
) database.UserSecret {
t.Helper()
user := dbgen.User(t, crypt, database.User{})
secret, err := crypt.CreateUserSecret(ctx, database.CreateUserSecretParams{
ID: uuid.New(),
UserID: user.ID,
Name: "test-secret-" + uuid.NewString()[:8],
Value: initialValue,
})
require.NoError(t, err)
require.Equal(t, initialValue, secret.Value)
if len(ciphers) > 0 {
require.Equal(t, ciphers[0].HexDigest(), secret.ValueKeyID.String)
}
return secret
}
t.Run("CreateUserSecretEncryptsValue", func(t *testing.T) {
t.Parallel()
db, crypt, ciphers := setup(t)
secret := insertUserSecret(t, crypt, ciphers)
// Reading through crypt should return plaintext.
got, err := crypt.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
})
require.NoError(t, err)
require.Equal(t, initialValue, got.Value)
// Reading through raw DB should return encrypted value.
raw, err := db.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
})
require.NoError(t, err)
require.NotEqual(t, initialValue, raw.Value)
requireEncryptedEquals(t, ciphers[0], raw.Value, initialValue)
})
t.Run("ListUserSecretsWithValuesDecrypts", func(t *testing.T) {
t.Parallel()
_, crypt, ciphers := setup(t)
secret := insertUserSecret(t, crypt, ciphers)
secrets, err := crypt.ListUserSecretsWithValues(ctx, secret.UserID)
require.NoError(t, err)
require.Len(t, secrets, 1)
require.Equal(t, initialValue, secrets[0].Value)
})
t.Run("UpdateUserSecretReEncryptsValue", func(t *testing.T) {
t.Parallel()
db, crypt, ciphers := setup(t)
secret := insertUserSecret(t, crypt, ciphers)
updated, err := crypt.UpdateUserSecretByUserIDAndName(ctx, database.UpdateUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
UpdateValue: true,
Value: updatedValue,
ValueKeyID: sql.NullString{},
})
require.NoError(t, err)
require.Equal(t, updatedValue, updated.Value)
require.Equal(t, ciphers[0].HexDigest(), updated.ValueKeyID.String)
// Raw DB should have new encrypted value.
raw, err := db.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
})
require.NoError(t, err)
require.NotEqual(t, updatedValue, raw.Value)
requireEncryptedEquals(t, ciphers[0], raw.Value, updatedValue)
})
t.Run("NoCipherStoresPlaintext", func(t *testing.T) {
t.Parallel()
db, crypt := setupNoCiphers(t)
user := dbgen.User(t, crypt, database.User{})
secret, err := crypt.CreateUserSecret(ctx, database.CreateUserSecretParams{
ID: uuid.New(),
UserID: user.ID,
Name: "plaintext-secret",
Value: initialValue,
})
require.NoError(t, err)
require.Equal(t, initialValue, secret.Value)
require.False(t, secret.ValueKeyID.Valid)
// Raw DB should also have plaintext.
raw, err := db.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: user.ID,
Name: "plaintext-secret",
})
require.NoError(t, err)
require.Equal(t, initialValue, raw.Value)
require.False(t, raw.ValueKeyID.Valid)
})
t.Run("UpdateMetadataOnlySkipsEncryption", func(t *testing.T) {
t.Parallel()
db, crypt, ciphers := setup(t)
secret := insertUserSecret(t, crypt, ciphers)
// Read the raw encrypted value from the database.
rawBefore, err := db.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
})
require.NoError(t, err)
// Perform a metadata-only update (no value change).
updated, err := crypt.UpdateUserSecretByUserIDAndName(ctx, database.UpdateUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
UpdateValue: false,
Value: "",
ValueKeyID: sql.NullString{},
UpdateDescription: true,
Description: "updated description",
UpdateEnvName: false,
EnvName: "",
UpdateFilePath: false,
FilePath: "",
})
require.NoError(t, err)
require.Equal(t, "updated description", updated.Description)
require.Equal(t, initialValue, updated.Value)
// Read the raw encrypted value again.
rawAfter, err := db.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: secret.UserID,
Name: secret.Name,
})
require.NoError(t, err)
require.Equal(t, rawBefore.Value, rawAfter.Value)
require.Equal(t, rawBefore.ValueKeyID, rawAfter.ValueKeyID)
})
t.Run("GetUserSecretDecryptErr", func(t *testing.T) {
t.Parallel()
db, crypt, ciphers := setup(t)
user := dbgen.User(t, db, database.User{})
dbgen.UserSecret(t, db, database.UserSecret{
UserID: user.ID,
Name: "corrupt-secret",
Value: fakeBase64RandomData(t, 32),
ValueKeyID: sql.NullString{String: ciphers[0].HexDigest(), Valid: true},
})
_, err := crypt.GetUserSecretByUserIDAndName(ctx, database.GetUserSecretByUserIDAndNameParams{
UserID: user.ID,
Name: "corrupt-secret",
})
require.Error(t, err)
var derr *DecryptFailedError
require.ErrorAs(t, err, &derr)
})
t.Run("ListUserSecretsWithValuesDecryptErr", func(t *testing.T) {
t.Parallel()
db, crypt, ciphers := setup(t)
user := dbgen.User(t, db, database.User{})
dbgen.UserSecret(t, db, database.UserSecret{
UserID: user.ID,
Name: "corrupt-list-secret",
Value: fakeBase64RandomData(t, 32),
ValueKeyID: sql.NullString{String: ciphers[0].HexDigest(), Valid: true},
})
_, err := crypt.ListUserSecretsWithValues(ctx, user.ID)
require.Error(t, err)
var derr *DecryptFailedError
require.ErrorAs(t, err, &derr)
})
}
+3 -3
@@ -518,7 +518,7 @@ require (
cloud.google.com/go/logging v1.13.2 // indirect
cloud.google.com/go/longrunning v0.8.0 // indirect
cloud.google.com/go/monitoring v1.24.3 // indirect
cloud.google.com/go/storage v1.60.0 // indirect
cloud.google.com/go/storage v1.61.3 // indirect
git.sr.ht/~jackmordaunt/go-toast v1.1.2 // indirect
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.20.0 // indirect
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.2 // indirect
@@ -576,8 +576,8 @@ require (
github.com/goccy/go-yaml v1.19.2 // indirect
github.com/google/go-containerregistry v0.20.7 // indirect
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 // indirect
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.70 // indirect
github.com/hashicorp/go-getter v1.8.4 // indirect
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.72 // indirect
github.com/hashicorp/go-getter v1.8.6 // indirect
github.com/hexops/gotextdiff v1.0.3 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/jackmordaunt/icns/v3 v3.0.1 // indirect
+8 -8
@@ -18,8 +18,8 @@ cloud.google.com/go/longrunning v0.8.0 h1:LiKK77J3bx5gDLi4SMViHixjD2ohlkwBi+mKA7
cloud.google.com/go/longrunning v0.8.0/go.mod h1:UmErU2Onzi+fKDg2gR7dusz11Pe26aknR4kHmJJqIfk=
cloud.google.com/go/monitoring v1.24.3 h1:dde+gMNc0UhPZD1Azu6at2e79bfdztVDS5lvhOdsgaE=
cloud.google.com/go/monitoring v1.24.3/go.mod h1:nYP6W0tm3N9H/bOw8am7t62YTzZY+zUeQ+Bi6+2eonI=
cloud.google.com/go/storage v1.60.0 h1:oBfZrSOCimggVNz9Y/bXY35uUcts7OViubeddTTVzQ8=
cloud.google.com/go/storage v1.60.0/go.mod h1:q+5196hXfejkctrnx+VYU8RKQr/L3c0cBIlrjmiAKE0=
cloud.google.com/go/storage v1.61.3 h1:VS//ZfBuPGDvakfD9xyPW1RGF1Vy3BWUoVZXgW1KMOg=
cloud.google.com/go/storage v1.61.3/go.mod h1:JtqK8BBB7TWv0HVGHubtUdzYYrakOQIsMLffZ2Z/HWk=
cloud.google.com/go/trace v1.11.7 h1:kDNDX8JkaAG3R2nq1lIdkb7FCSi1rCmsEtKVsty7p+U=
cloud.google.com/go/trace v1.11.7/go.mod h1:TNn9d5V3fQVf6s4SCveVMIBS2LJUqo73GACmq/Tky0s=
dario.cat/mergo v1.0.2 h1:85+piFYR1tMbRrLcDwR18y4UKJ3aH1Tbzi24VRW1TK8=
@@ -687,8 +687,8 @@ github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 h1:HWRh5R2+9EifMyIHV7ZV+MIZqgz
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0/go.mod h1:JfhWUomR1baixubs02l85lZYYOm7LV6om4ceouMv45c=
github.com/hairyhenderson/go-codeowners v0.7.0 h1:s0W4wF8bdsBEjTWzwzSlsatSthWtTAF2xLgo4a4RwAo=
github.com/hairyhenderson/go-codeowners v0.7.0/go.mod h1:wUlNgQ3QjqC4z8DnM5nnCYVq/icpqXJyJOukKx5U8/Q=
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.70 h1:0HADrxxqaQkGycO1JoUUA+B4FnIkuo8d2bz/hSaTFFQ=
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.70/go.mod h1:fm2FdDCzJdtbXF7WKAMvBb5NEPouXPHFbGNYs9ShFns=
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.72 h1:vTCWu1wbdYo7PEZFem/rlr01+Un+wwVmI7wiegFdRLk=
github.com/hashicorp/aws-sdk-go-base/v2 v2.0.0-beta.72/go.mod h1:Vn+BBgKQHVQYdVQ4NZDICE1Brb+JfaONyDHr3q07oQc=
github.com/hashicorp/errwrap v1.0.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
github.com/hashicorp/errwrap v1.1.0 h1:OxrOeh75EUXMY8TBjag2fzXGZ40LB6IKw45YeGUDY2I=
github.com/hashicorp/errwrap v1.1.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
@@ -698,8 +698,8 @@ github.com/hashicorp/go-cleanhttp v0.5.2 h1:035FKYIWjmULyFRBKPs8TBQoi0x6d9G4xc9n
github.com/hashicorp/go-cleanhttp v0.5.2/go.mod h1:kO/YDlP8L1346E6Sodw+PrpBSV4/SoxCXGY6BqNFT48=
github.com/hashicorp/go-cty v1.5.0 h1:EkQ/v+dDNUqnuVpmS5fPqyY71NXVgT5gf32+57xY8g0=
github.com/hashicorp/go-cty v1.5.0/go.mod h1:lFUCG5kd8exDobgSfyj4ONE/dc822kiYMguVKdHGMLM=
github.com/hashicorp/go-getter v1.8.4 h1:hGEd2xsuVKgwkMtPVufq73fAmZU/x65PPcqH3cb0D9A=
github.com/hashicorp/go-getter v1.8.4/go.mod h1:x27pPGSg9kzoB147QXI8d/nDvp2IgYGcwuRjpaXE9Yg=
github.com/hashicorp/go-getter v1.8.6 h1:9sQboWULaydVphxc4S64oAI4YqpuCk7nPmvbk131ebY=
github.com/hashicorp/go-getter v1.8.6/go.mod h1:nVH12eOV2P58dIiL3rsU6Fh3wLeJEKBOJzhMmzlSWoo=
github.com/hashicorp/go-hclog v1.6.3 h1:Qr2kF+eVWjTiYmU7Y31tYlP1h0q/X3Nl3tPGdaB11/k=
github.com/hashicorp/go-hclog v1.6.3/go.mod h1:W4Qnvbt70Wk/zYJryRzDRU/4r0kIg0PVHBcfoyhpF5M=
github.com/hashicorp/go-multierror v1.1.1 h1:H5DkEtf6CXdFp0N0Em5UCwQpXMWke8IA0+lD48awMYo=
@@ -1322,8 +1322,8 @@ go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.40.0 h1:DvJDO
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.40.0/go.mod h1:EtekO9DEJb4/jRyN4v4Qjc2yA7AtfCBuz2FynRUWTXs=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.38.0 h1:aTL7F04bJHUlztTsNGJ2l+6he8c+y/b//eR0jjjemT4=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.38.0/go.mod h1:kldtb7jDTeol0l3ewcmd8SDvx3EmIE7lyvqbasU3QC4=
go.opentelemetry.io/otel/exporters/stdout/stdoutmetric v1.39.0 h1:5gn2urDL/FBnK8OkCfD1j3/ER79rUuTYmCvlXBKeYL8=
go.opentelemetry.io/otel/exporters/stdout/stdoutmetric v1.39.0/go.mod h1:0fBG6ZJxhqByfFZDwSwpZGzJU671HkwpWaNe2t4VUPI=
go.opentelemetry.io/otel/exporters/stdout/stdoutmetric v1.40.0 h1:ZrPRak/kS4xI3AVXy8F7pipuDXmDsrO8Lg+yQjBLjw0=
go.opentelemetry.io/otel/exporters/stdout/stdoutmetric v1.40.0/go.mod h1:3y6kQCWztq6hyW8Z9YxQDDm0Je9AJoFar2G0yDcmhRk=
go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.37.0 h1:SNhVp/9q4Go/XHBkQ1/d5u9P/U+L1yaGPoi0x+mStaI=
go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.37.0/go.mod h1:tx8OOlGH6R4kLV67YaYO44GFXloEjGPZuMjEkaaqIp4=
go.opentelemetry.io/otel/metric v1.42.0 h1:2jXG+3oZLNXEPfNmnpxKDeZsFI5o4J+nz6xUlaFdF/4=
@@ -0,0 +1,44 @@
import { describe, expect, it } from "vitest";
import type * as TypesGen from "#/api/typesGenerated";
import { buildOptimisticEditedMessage } from "./chatMessageEdits";
const makeUserMessage = (
content: readonly TypesGen.ChatMessagePart[] = [
{ type: "text", text: "original" },
],
): TypesGen.ChatMessage => ({
id: 1,
chat_id: "chat-1",
created_at: "2025-01-01T00:00:00.000Z",
role: "user",
content,
});
describe("buildOptimisticEditedMessage", () => {
it("preserves image MIME types for newly attached files", () => {
const message = buildOptimisticEditedMessage({
requestContent: [{ type: "file", file_id: "image-1" }],
originalMessage: makeUserMessage(),
attachmentMediaTypes: new Map([["image-1", "image/png"]]),
});
expect(message.content).toEqual([
{ type: "file", file_id: "image-1", media_type: "image/png" },
]);
});
it("reuses existing file parts before local attachment metadata", () => {
const existingFilePart: TypesGen.ChatFilePart = {
type: "file",
file_id: "existing-1",
media_type: "image/jpeg",
};
const message = buildOptimisticEditedMessage({
requestContent: [{ type: "file", file_id: "existing-1" }],
originalMessage: makeUserMessage([existingFilePart]),
attachmentMediaTypes: new Map([["existing-1", "text/plain"]]),
});
expect(message.content).toEqual([existingFilePart]);
});
});
+148
@@ -0,0 +1,148 @@
import type { InfiniteData } from "react-query";
import type * as TypesGen from "#/api/typesGenerated";
const buildOptimisticEditedContent = ({
requestContent,
originalMessage,
attachmentMediaTypes,
}: {
requestContent: readonly TypesGen.ChatInputPart[];
originalMessage: TypesGen.ChatMessage;
attachmentMediaTypes?: ReadonlyMap<string, string>;
}): readonly TypesGen.ChatMessagePart[] => {
const existingFilePartsByID = new Map<string, TypesGen.ChatFilePart>();
for (const part of originalMessage.content ?? []) {
if (part.type === "file" && part.file_id) {
existingFilePartsByID.set(part.file_id, part);
}
}
return requestContent.map((part): TypesGen.ChatMessagePart => {
if (part.type === "text") {
return { type: "text", text: part.text ?? "" };
}
if (part.type === "file-reference") {
return {
type: "file-reference",
file_name: part.file_name ?? "",
start_line: part.start_line ?? 1,
end_line: part.end_line ?? 1,
content: part.content ?? "",
};
}
const fileId = part.file_id ?? "";
return (
existingFilePartsByID.get(fileId) ?? {
type: "file",
file_id: part.file_id,
media_type:
attachmentMediaTypes?.get(fileId) ?? "application/octet-stream",
}
);
});
};
export const buildOptimisticEditedMessage = ({
requestContent,
originalMessage,
attachmentMediaTypes,
}: {
requestContent: readonly TypesGen.ChatInputPart[];
originalMessage: TypesGen.ChatMessage;
attachmentMediaTypes?: ReadonlyMap<string, string>;
}): TypesGen.ChatMessage => ({
...originalMessage,
content: buildOptimisticEditedContent({
requestContent,
originalMessage,
attachmentMediaTypes,
}),
});
const sortMessagesDescending = (
messages: readonly TypesGen.ChatMessage[],
): TypesGen.ChatMessage[] => [...messages].sort((a, b) => b.id - a.id);
const upsertFirstPageMessage = (
messages: readonly TypesGen.ChatMessage[],
message: TypesGen.ChatMessage,
): TypesGen.ChatMessage[] => {
const byID = new Map(
messages.map((existingMessage) => [existingMessage.id, existingMessage]),
);
byID.set(message.id, message);
return sortMessagesDescending(Array.from(byID.values()));
};
export const projectEditedConversationIntoCache = ({
currentData,
editedMessageId,
replacementMessage,
queuedMessages,
}: {
currentData: InfiniteData<TypesGen.ChatMessagesResponse> | undefined;
editedMessageId: number;
replacementMessage?: TypesGen.ChatMessage;
queuedMessages?: readonly TypesGen.ChatQueuedMessage[];
}): InfiniteData<TypesGen.ChatMessagesResponse> | undefined => {
if (!currentData?.pages?.length) {
return currentData;
}
const truncatedPages = currentData.pages.map((page, pageIndex) => {
const truncatedMessages = page.messages.filter(
(message) => message.id < editedMessageId,
);
const nextPage = {
...page,
...(pageIndex === 0 && queuedMessages !== undefined
? { queued_messages: queuedMessages }
: {}),
};
if (pageIndex !== 0 || !replacementMessage) {
return { ...nextPage, messages: truncatedMessages };
}
return {
...nextPage,
messages: upsertFirstPageMessage(truncatedMessages, replacementMessage),
};
});
return {
...currentData,
pages: truncatedPages,
};
};
export const reconcileEditedMessageInCache = ({
currentData,
optimisticMessageId,
responseMessage,
}: {
currentData: InfiniteData<TypesGen.ChatMessagesResponse> | undefined;
optimisticMessageId: number;
responseMessage: TypesGen.ChatMessage;
}): InfiniteData<TypesGen.ChatMessagesResponse> | undefined => {
if (!currentData?.pages?.length) {
return currentData;
}
const replacedPages = currentData.pages.map((page, pageIndex) => {
const preservedMessages = page.messages.filter(
(message) =>
message.id !== optimisticMessageId && message.id !== responseMessage.id,
);
if (pageIndex !== 0) {
return { ...page, messages: preservedMessages };
}
return {
...page,
messages: upsertFirstPageMessage(preservedMessages, responseMessage),
};
});
return {
...currentData,
pages: replacedPages,
};
};
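The edit lifecycle implemented above (optimistically truncate the conversation at the edited message, upsert a replacement onto the newest page, then reconcile with the server response) can be exercised outside React Query with plain data. A minimal self-contained sketch with simplified message and page types — the names below are illustrative stand-ins, not the real `TypesGen` shapes:

```typescript
// Simplified stand-ins for the generated types (illustrative only).
type Msg = { id: number; text: string };
type Page = { messages: Msg[]; has_more: boolean };
type Cache = { pages: Page[] };

// Optimistic step: drop the edited message and everything newer,
// then upsert the replacement onto the first (newest) page and
// restore descending-ID order, mirroring projectEditedConversationIntoCache.
const projectEdit = (cache: Cache, editedId: number, replacement: Msg): Cache => ({
  ...cache,
  pages: cache.pages.map((page, i) => {
    const kept = page.messages.filter((m) => m.id < editedId);
    if (i !== 0) {
      return { ...page, messages: kept };
    }
    const byId = new Map(kept.map((m) => [m.id, m]));
    byId.set(replacement.id, replacement);
    return {
      ...page,
      messages: [...byId.values()].sort((a, b) => b.id - a.id),
    };
  }),
});

const cache: Cache = {
  pages: [
    {
      messages: [5, 4, 3, 2, 1].map((id) => ({ id, text: `msg ${id}` })),
      has_more: false,
    },
  ],
};

// Editing message 3: IDs 5 and 4 are truncated, and the optimistic
// replacement takes message 3's slot at the top of the page.
const next = projectEdit(cache, 3, { id: 3, text: "edited" });
```

The same upsert-by-ID map is what lets a websocket-delivered newer message survive the later reconcile step: merging goes through the map keyed on ID rather than by positional splicing.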
+177 -19
@@ -2,6 +2,7 @@ import { QueryClient } from "react-query";
import { describe, expect, it, vi } from "vitest";
import { API } from "#/api/api";
import type * as TypesGen from "#/api/typesGenerated";
import { buildOptimisticEditedMessage } from "./chatMessageEdits";
import {
archiveChat,
cancelChatListRefetches,
@@ -795,14 +796,44 @@ describe("mutation invalidation scope", () => {
content: [{ type: "text" as const, text: `msg ${id}` }],
});
const makeQueuedMessage = (
chatId: string,
id: number,
): TypesGen.ChatQueuedMessage => ({
id,
chat_id: chatId,
created_at: `2025-01-01T00:10:${String(id).padStart(2, "0")}Z`,
content: [{ type: "text" as const, text: `queued ${id}` }],
});
const editReq = {
content: [{ type: "text" as const, text: "edited" }],
};
it("editChatMessage optimistically removes truncated messages from cache", async () => {
const requireMessage = (
messages: readonly TypesGen.ChatMessage[],
messageId: number,
): TypesGen.ChatMessage => {
const message = messages.find((candidate) => candidate.id === messageId);
if (!message) {
throw new Error(`missing message ${messageId}`);
}
return message;
};
const buildOptimisticMessage = (message: TypesGen.ChatMessage) =>
buildOptimisticEditedMessage({
originalMessage: message,
requestContent: editReq.content,
});
it("editChatMessage writes the optimistic replacement into cache", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 3),
);
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [{ messages, queued_messages: [], has_more: false }],
@@ -812,18 +843,58 @@ describe("mutation invalidation scope", () => {
const mutation = editChatMessage(queryClient, chatId);
const context = await mutation.onMutate({
messageId: 3,
optimisticMessage,
req: editReq,
});
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
expect(data?.pages[0]?.messages.map((m) => m.id)).toEqual([2, 1]);
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([
3, 2, 1,
]);
expect(data?.pages[0]?.messages[0]?.content).toEqual(
optimisticMessage.content,
);
expect(context?.previousData?.pages[0]?.messages).toHaveLength(5);
});
it("editChatMessage clears queued messages in cache during optimistic history edit", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 3),
);
const queuedMessages = [makeQueuedMessage(chatId, 11)];
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [
{
messages,
queued_messages: queuedMessages,
has_more: false,
},
],
pageParams: [undefined],
});
const mutation = editChatMessage(queryClient, chatId);
await mutation.onMutate({
messageId: 3,
optimisticMessage,
req: editReq,
});
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
expect(data?.pages[0]?.queued_messages).toEqual([]);
});
it("editChatMessage restores cache on error", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 3),
);
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [{ messages, queued_messages: [], has_more: false }],
@@ -833,22 +904,85 @@ describe("mutation invalidation scope", () => {
const mutation = editChatMessage(queryClient, chatId);
const context = await mutation.onMutate({
messageId: 3,
optimisticMessage,
req: editReq,
});
expect(
queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId))?.pages[0]
?.messages,
).toHaveLength(2);
).toHaveLength(3);
mutation.onError(
new Error("network failure"),
{ messageId: 3, req: editReq },
{ messageId: 3, optimisticMessage, req: editReq },
context,
);
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
expect(data?.pages[0]?.messages.map((m) => m.id)).toEqual([5, 4, 3, 2, 1]);
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([
5, 4, 3, 2, 1,
]);
});
it("editChatMessage preserves websocket-upserted newer messages on success", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 3),
);
const responseMessage = {
...makeMsg(chatId, 9),
content: [{ type: "text" as const, text: "edited authoritative" }],
};
const websocketMessage = {
...makeMsg(chatId, 10),
content: [{ type: "text" as const, text: "assistant follow-up" }],
role: "assistant" as const,
};
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [{ messages, queued_messages: [], has_more: false }],
pageParams: [undefined],
});
const mutation = editChatMessage(queryClient, chatId);
await mutation.onMutate({
messageId: 3,
optimisticMessage,
req: editReq,
});
queryClient.setQueryData<InfMessages | undefined>(
chatMessagesKey(chatId),
(current) => {
if (!current) {
return current;
}
return {
...current,
pages: [
{
...current.pages[0],
messages: [websocketMessage, ...current.pages[0].messages],
},
...current.pages.slice(1),
],
};
},
);
mutation.onSuccess(
{ message: responseMessage },
{ messageId: 3, optimisticMessage, req: editReq },
);
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([
10, 9, 2, 1,
]);
expect(data?.pages[0]?.messages[1]?.content).toEqual(
responseMessage.content,
);
});
it("editChatMessage onMutate is a no-op when cache is empty", async () => {
@@ -890,13 +1024,14 @@ describe("mutation invalidation scope", () => {
expect(data?.pages[0]?.messages.map((m) => m.id)).toEqual([3, 2, 1]);
});
it("editChatMessage onMutate filters across multiple pages", async () => {
it("editChatMessage onMutate updates the first page and preserves older pages", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
// Page 0 (newest): IDs 10-6. Page 1 (older): IDs 5-1.
const page0 = [10, 9, 8, 7, 6].map((id) => makeMsg(chatId, id));
const page1 = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(requireMessage(page0, 7));
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [
@@ -907,19 +1042,28 @@ describe("mutation invalidation scope", () => {
});
const mutation = editChatMessage(queryClient, chatId);
await mutation.onMutate({ messageId: 7, req: editReq });
await mutation.onMutate({
messageId: 7,
optimisticMessage,
req: editReq,
});
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
// Page 0: only ID 6 survives (< 7).
expect(data?.pages[0]?.messages.map((m) => m.id)).toEqual([6]);
// Page 1: all survive (all < 7).
expect(data?.pages[1]?.messages.map((m) => m.id)).toEqual([5, 4, 3, 2, 1]);
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([
7, 6,
]);
expect(data?.pages[1]?.messages.map((message) => message.id)).toEqual([
5, 4, 3, 2, 1,
]);
});
it("editChatMessage onMutate editing the first message empties all pages", async () => {
it("editChatMessage onMutate keeps the optimistic replacement when editing the first message", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 1),
);
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [{ messages, queued_messages: [], has_more: false }],
@@ -927,20 +1071,25 @@ describe("mutation invalidation scope", () => {
});
const mutation = editChatMessage(queryClient, chatId);
await mutation.onMutate({ messageId: 1, req: editReq });
await mutation.onMutate({
messageId: 1,
optimisticMessage,
req: editReq,
});
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
// All messages have id >= 1, so the page is empty.
expect(data?.pages[0]?.messages).toHaveLength(0);
// Sibling fields survive the spread.
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([1]);
expect(data?.pages[0]?.queued_messages).toEqual([]);
expect(data?.pages[0]?.has_more).toBe(false);
});
it("editChatMessage onMutate editing the latest message keeps earlier ones", async () => {
it("editChatMessage onMutate keeps earlier messages when editing the latest message", async () => {
const queryClient = createTestQueryClient();
const chatId = "chat-1";
const messages = [5, 4, 3, 2, 1].map((id) => makeMsg(chatId, id));
const optimisticMessage = buildOptimisticMessage(
requireMessage(messages, 5),
);
queryClient.setQueryData<InfMessages>(chatMessagesKey(chatId), {
pages: [{ messages, queued_messages: [], has_more: false }],
@@ -948,10 +1097,19 @@ describe("mutation invalidation scope", () => {
});
const mutation = editChatMessage(queryClient, chatId);
await mutation.onMutate({ messageId: 5, req: editReq });
await mutation.onMutate({
messageId: 5,
optimisticMessage,
req: editReq,
});
const data = queryClient.getQueryData<InfMessages>(chatMessagesKey(chatId));
expect(data?.pages[0]?.messages.map((m) => m.id)).toEqual([4, 3, 2, 1]);
expect(data?.pages[0]?.messages.map((message) => message.id)).toEqual([
5, 4, 3, 2, 1,
]);
expect(data?.pages[0]?.messages[0]?.content).toEqual(
optimisticMessage.content,
);
});
it("interruptChat does not invalidate unrelated queries", async () => {
+36 -27
@@ -6,6 +6,10 @@ import type {
import { API } from "#/api/api";
import type * as TypesGen from "#/api/typesGenerated";
import type { UsePaginatedQueryOptions } from "#/hooks/usePaginatedQuery";
import {
projectEditedConversationIntoCache,
reconcileEditedMessageInCache,
} from "./chatMessageEdits";
export const chatsKey = ["chats"] as const;
export const chatKey = (chatId: string) => ["chats", chatId] as const;
@@ -601,13 +605,21 @@ export const createChatMessage = (
type EditChatMessageMutationArgs = {
messageId: number;
optimisticMessage?: TypesGen.ChatMessage;
req: TypesGen.EditChatMessageRequest;
};
type EditChatMessageMutationContext = {
previousData?: InfiniteData<TypesGen.ChatMessagesResponse> | undefined;
};
export const editChatMessage = (queryClient: QueryClient, chatId: string) => ({
mutationFn: ({ messageId, req }: EditChatMessageMutationArgs) =>
API.experimental.editChatMessage(chatId, messageId, req),
onMutate: async ({ messageId }: EditChatMessageMutationArgs) => {
onMutate: async ({
messageId,
optimisticMessage,
}: EditChatMessageMutationArgs): Promise<EditChatMessageMutationContext> => {
// Cancel in-flight refetches so they don't overwrite the
// optimistic update before the mutation completes.
await queryClient.cancelQueries({
@@ -619,40 +631,23 @@ export const editChatMessage = (queryClient: QueryClient, chatId: string) => ({
InfiniteData<TypesGen.ChatMessagesResponse>
>(chatMessagesKey(chatId));
// Optimistically remove the edited message and everything
// after it. The server soft-deletes these and inserts a
// replacement with a new ID. Without this, the WebSocket
// handler's upsertCacheMessages adds new messages to the
// React Query cache without removing the soft-deleted ones,
// causing deleted messages to flash back into view until
// the full REST refetch resolves.
queryClient.setQueryData<
InfiniteData<TypesGen.ChatMessagesResponse> | undefined
>(chatMessagesKey(chatId), (current) => {
if (!current?.pages?.length) {
return current;
}
return {
...current,
pages: current.pages.map((page) => ({
...page,
messages: page.messages.filter((m) => m.id < messageId),
})),
};
});
>(chatMessagesKey(chatId), (current) =>
projectEditedConversationIntoCache({
currentData: current,
editedMessageId: messageId,
replacementMessage: optimisticMessage,
queuedMessages: [],
}),
);
return { previousData };
},
onError: (
_error: unknown,
_variables: EditChatMessageMutationArgs,
context:
| {
previousData?:
| InfiniteData<TypesGen.ChatMessagesResponse>
| undefined;
}
| undefined,
context: EditChatMessageMutationContext | undefined,
) => {
// Restore the cache on failure so the user sees the
// original messages again.
@@ -660,6 +655,20 @@ export const editChatMessage = (queryClient: QueryClient, chatId: string) => ({
queryClient.setQueryData(chatMessagesKey(chatId), context.previousData);
}
},
onSuccess: (
response: TypesGen.EditChatMessageResponse,
variables: EditChatMessageMutationArgs,
) => {
queryClient.setQueryData<
InfiniteData<TypesGen.ChatMessagesResponse> | undefined
>(chatMessagesKey(chatId), (current) =>
reconcileEditedMessageInCache({
currentData: current,
optimisticMessageId: variables.messageId,
responseMessage: response.message,
}),
);
},
onSettled: () => {
// Always reconcile with the server regardless of whether
// the mutation succeeded or failed. On success this picks
+119
@@ -0,0 +1,119 @@
import { describe, expect, it, vi } from "vitest";
import { API } from "#/api/api";
import type { AuthorizationCheck, Organization } from "#/api/typesGenerated";
import { permittedOrganizations } from "./organizations";
// Mock the API module
vi.mock("#/api/api", () => ({
API: {
getOrganizations: vi.fn(),
checkAuthorization: vi.fn(),
},
}));
const MockOrg1: Organization = {
id: "org-1",
name: "org-one",
display_name: "Org One",
description: "",
icon: "",
created_at: "",
updated_at: "",
is_default: true,
};
const MockOrg2: Organization = {
id: "org-2",
name: "org-two",
display_name: "Org Two",
description: "",
icon: "",
created_at: "",
updated_at: "",
is_default: false,
};
const templateCreateCheck: AuthorizationCheck = {
object: { resource_type: "template" },
action: "create",
};
describe("permittedOrganizations", () => {
it("returns query config with correct queryKey", () => {
const config = permittedOrganizations(templateCreateCheck);
expect(config.queryKey).toEqual([
"organizations",
"permitted",
templateCreateCheck,
]);
});
it("fetches orgs and filters by permission check", async () => {
const getOrgsMock = vi.mocked(API.getOrganizations);
const checkAuthMock = vi.mocked(API.checkAuthorization);
getOrgsMock.mockResolvedValue([MockOrg1, MockOrg2]);
checkAuthMock.mockResolvedValue({
"org-1": true,
"org-2": false,
});
const config = permittedOrganizations(templateCreateCheck);
const result = await config.queryFn!();
// Should only return org-1 (which passed the check)
expect(result).toEqual([MockOrg1]);
// Verify the auth check was called with per-org checks
expect(checkAuthMock).toHaveBeenCalledWith({
checks: {
"org-1": {
...templateCreateCheck,
object: {
...templateCreateCheck.object,
organization_id: "org-1",
},
},
"org-2": {
...templateCreateCheck,
object: {
...templateCreateCheck.object,
organization_id: "org-2",
},
},
},
});
});
it("returns all orgs when all pass the check", async () => {
const getOrgsMock = vi.mocked(API.getOrganizations);
const checkAuthMock = vi.mocked(API.checkAuthorization);
getOrgsMock.mockResolvedValue([MockOrg1, MockOrg2]);
checkAuthMock.mockResolvedValue({
"org-1": true,
"org-2": true,
});
const config = permittedOrganizations(templateCreateCheck);
const result = await config.queryFn!();
expect(result).toEqual([MockOrg1, MockOrg2]);
});
it("returns empty array when no orgs pass the check", async () => {
const getOrgsMock = vi.mocked(API.getOrganizations);
const checkAuthMock = vi.mocked(API.checkAuthorization);
getOrgsMock.mockResolvedValue([MockOrg1, MockOrg2]);
checkAuthMock.mockResolvedValue({
"org-1": false,
"org-2": false,
});
const config = permittedOrganizations(templateCreateCheck);
const result = await config.queryFn!();
expect(result).toEqual([]);
});
});
+27 -1
@@ -5,6 +5,7 @@ import {
type GetProvisionerJobsParams,
} from "#/api/api";
import type {
AuthorizationCheck,
CreateOrganizationRequest,
GroupSyncSettings,
Organization,
@@ -160,7 +161,7 @@ export const updateOrganizationMemberRoles = (
};
};
export const organizationsKey = ["organizations"] as const;
const organizationsKey = ["organizations"] as const;
const notAvailable = { available: false, value: undefined } as const;
@@ -295,6 +296,31 @@ export const provisionerJobs = (
};
};
/**
* Fetch organizations the current user is permitted to use for a given
* action. Fetches all organizations, runs a per-org authorization
* check, and returns only those that pass.
*/
export const permittedOrganizations = (check: AuthorizationCheck) => {
return {
queryKey: ["organizations", "permitted", check],
queryFn: async (): Promise<Organization[]> => {
const orgs = await API.getOrganizations();
const checks = Object.fromEntries(
orgs.map((org) => [
org.id,
{
...check,
object: { ...check.object, organization_id: org.id },
},
]),
);
const permissions = await API.checkAuthorization({ checks });
return orgs.filter((org) => permissions[org.id]);
},
};
};
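The fan-out above is easy to verify in isolation: stub the two API calls, run the query function, and the final filter keeps only the organizations whose check came back true. A minimal sketch with hypothetical stand-in stubs (not the real `API` client):

```typescript
// Hypothetical stand-ins for API.getOrganizations / API.checkAuthorization.
type Org = { id: string; name: string };

const getOrganizations = async (): Promise<Org[]> => [
  { id: "org-1", name: "org-one" },
  { id: "org-2", name: "org-two" },
];

const checkAuthorization = async (_req: {
  checks: Record<string, unknown>;
}): Promise<Record<string, boolean>> => ({ "org-1": true, "org-2": false });

// Same shape as the queryFn: fetch all orgs, build one check per org
// keyed by org ID, then keep only orgs whose check passed.
const permitted = async (check: { action: string }): Promise<Org[]> => {
  const orgs = await getOrganizations();
  const checks = Object.fromEntries(
    orgs.map((org) => [
      org.id,
      { ...check, object: { organization_id: org.id } },
    ]),
  );
  const permissions = await checkAuthorization({ checks });
  return orgs.filter((org) => permissions[org.id]);
};

// With the stubs above, only org-1 survives the filter.
const result = await permitted({ action: "create" });
```

Keying the batched authorization request by org ID is what makes the filter a single dictionary lookup per organization rather than N sequential round-trips.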
/**
* Fetch permissions for all provided organizations.
*
+1
@@ -5615,6 +5615,7 @@ export interface ProvisionerJobMetadata {
readonly template_icon: string;
readonly workspace_id?: string;
readonly workspace_name?: string;
readonly workspace_build_transition?: WorkspaceTransition;
}
// From codersdk/provisionerdaemons.go
@@ -1,18 +1,14 @@
import type { Meta, StoryObj } from "@storybook/react-vite";
import { action } from "storybook/actions";
import { userEvent, within } from "storybook/test";
import {
MockOrganization,
MockOrganization2,
MockUserOwner,
} from "#/testHelpers/entities";
import { expect, fn, screen, userEvent, waitFor, within } from "storybook/test";
import { MockOrganization, MockOrganization2 } from "#/testHelpers/entities";
import { OrganizationAutocomplete } from "./OrganizationAutocomplete";
const meta: Meta<typeof OrganizationAutocomplete> = {
title: "components/OrganizationAutocomplete",
component: OrganizationAutocomplete,
args: {
onChange: action("Selected organization"),
onChange: fn(),
options: [MockOrganization, MockOrganization2],
},
};
@@ -20,36 +16,51 @@ export default meta;
type Story = StoryObj<typeof OrganizationAutocomplete>;
export const ManyOrgs: Story = {
parameters: {
showOrganizations: true,
user: MockUserOwner,
features: ["multiple_organizations"],
permissions: { viewDeploymentConfig: true },
queries: [
{
key: ["organizations"],
data: [MockOrganization, MockOrganization2],
},
],
args: {
value: null,
},
play: async ({ canvasElement }) => {
const canvas = within(canvasElement);
const button = canvas.getByRole("button");
await userEvent.click(button);
await waitFor(() => {
expect(
screen.getByText(MockOrganization.display_name),
).toBeInTheDocument();
expect(
screen.getByText(MockOrganization2.display_name),
).toBeInTheDocument();
});
},
};
export const WithValue: Story = {
args: {
value: MockOrganization2,
},
play: async ({ canvasElement, args }) => {
const canvas = within(canvasElement);
await waitFor(() => {
expect(
canvas.getByText(MockOrganization2.display_name),
).toBeInTheDocument();
});
expect(args.onChange).not.toHaveBeenCalled();
},
};
export const OneOrg: Story = {
parameters: {
showOrganizations: true,
user: MockUserOwner,
features: ["multiple_organizations"],
permissions: { viewDeploymentConfig: true },
queries: [
{
key: ["organizations"],
data: [MockOrganization],
},
],
args: {
value: MockOrganization,
options: [MockOrganization],
},
play: async ({ canvasElement, args }) => {
const canvas = within(canvasElement);
await waitFor(() => {
expect(
canvas.getByText(MockOrganization.display_name),
).toBeInTheDocument();
});
expect(args.onChange).not.toHaveBeenCalled();
},
};
@@ -1,9 +1,6 @@
import { Check } from "lucide-react";
import { type FC, useEffect, useState } from "react";
import { useQuery } from "react-query";
import { checkAuthorization } from "#/api/queries/authCheck";
import { organizations } from "#/api/queries/organizations";
import type { AuthorizationCheck, Organization } from "#/api/typesGenerated";
import { type FC, useState } from "react";
import type { Organization } from "#/api/typesGenerated";
import { ChevronDownIcon } from "#/components/AnimatedIcons/ChevronDown";
import { Avatar } from "#/components/Avatar/Avatar";
import { Button } from "#/components/Button/Button";
@@ -22,62 +19,21 @@ import {
} from "#/components/Popover/Popover";
type OrganizationAutocompleteProps = {
value: Organization | null;
onChange: (organization: Organization | null) => void;
options: Organization[];
id?: string;
required?: boolean;
check?: AuthorizationCheck;
};
export const OrganizationAutocomplete: FC<OrganizationAutocompleteProps> = ({
value,
onChange,
options,
id,
required,
check,
}) => {
const [open, setOpen] = useState(false);
const [selected, setSelected] = useState<Organization | null>(null);
const organizationsQuery = useQuery(organizations());
const checks =
check &&
organizationsQuery.data &&
Object.fromEntries(
organizationsQuery.data.map((org) => [
org.id,
{
...check,
object: { ...check.object, organization_id: org.id },
},
]),
);
const permissionsQuery = useQuery({
...checkAuthorization({ checks: checks ?? {} }),
enabled: Boolean(check && organizationsQuery.data),
});
// If an authorization check was provided, filter the organizations based on
// the results of that check.
let options = organizationsQuery.data ?? [];
if (check) {
options = permissionsQuery.data
? options.filter((org) => permissionsQuery.data[org.id])
: [];
}
// Unfortunate: this useEffect sets a default org value
// if only one is available and is necessary as the autocomplete loads
// its own data. Until we refactor, proceed cautiously!
useEffect(() => {
const org = options[0];
if (options.length !== 1 || org === selected) {
return;
}
setSelected(org);
onChange(org);
}, [options, selected, onChange]);
return (
<Popover open={open} onOpenChange={setOpen}>
@@ -90,14 +46,14 @@ export const OrganizationAutocomplete: FC<OrganizationAutocompleteProps> = ({
data-testid="organization-autocomplete"
className="w-full justify-start gap-2 font-normal"
>
{selected ? (
{value ? (
<>
<Avatar
size="sm"
src={selected.icon}
fallback={selected.display_name}
src={value.icon}
fallback={value.display_name}
/>
<span className="truncate">{selected.display_name}</span>
<span className="truncate">{value.display_name}</span>
</>
) : (
<span className="text-content-secondary">
@@ -121,7 +77,6 @@ export const OrganizationAutocomplete: FC<OrganizationAutocompleteProps> = ({
key={org.id}
value={`${org.display_name} ${org.name}`}
onSelect={() => {
setSelected(org);
onChange(org);
setOpen(false);
}}
@@ -134,7 +89,7 @@ export const OrganizationAutocomplete: FC<OrganizationAutocompleteProps> = ({
<span className="truncate">
{org.display_name || org.name}
</span>
{selected?.id === org.id && (
{value?.id === org.id && (
<Check className="ml-auto size-icon-sm shrink-0" />
)}
</CommandItem>
@@ -1,5 +1,5 @@
import type { Meta, StoryObj } from "@storybook/react-vite";
import { screen, spyOn, userEvent, within } from "storybook/test";
import { expect, screen, spyOn, userEvent, within } from "storybook/test";
import { API } from "#/api/api";
import { getPreferredProxy } from "#/contexts/ProxyContext";
import { chromatic } from "#/testHelpers/chromatic";
@@ -57,6 +57,13 @@ export const HasError: Story = {
agent: undefined,
},
},
play: async ({ canvasElement }) => {
const canvas = within(canvasElement);
const moreActionsButton = canvas.getByRole("button", {
name: "Dev Container actions",
});
expect(moreActionsButton).toBeVisible();
},
};
export const NoPorts: Story = {};
@@ -123,6 +130,13 @@ export const NoContainerOrSubAgent: Story = {
},
subAgents: [],
},
play: async ({ canvasElement }) => {
const canvas = within(canvasElement);
const moreActionsButton = canvas.getByRole("button", {
name: "Dev Container actions",
});
expect(moreActionsButton).toBeVisible();
},
};
export const NoContainerOrAgentOrName: Story = {
@@ -274,7 +274,7 @@ export const AgentDevcontainerCard: FC<AgentDevcontainerCardProps> = ({
/>
)}
{showDevcontainerControls && (
{!isTransitioning && (
<AgentDevcontainerMoreActions
deleteDevContainer={deleteDevcontainerMutation.mutate}
/>
@@ -89,7 +89,7 @@ export const WorkspaceBuildLogs: FC<WorkspaceBuildLogsProps> = ({
<div
className={cn(
"logs-header",
"flex items-center border-solid border-0 border-b border-border font-sans",
"flex items-center border-solid border-0 border-b last:border-b-0 border-border font-sans",
"bg-surface-primary text-xs font-semibold leading-none",
"first-of-type:pt-4",
)}
@@ -100,22 +100,24 @@ export const WorkspaceTimings: FC<WorkspaceTimingsProps> = ({
return (
<div className="rounded-lg border-solid bg-surface-primary">
<Button
disabled={isLoading}
variant="subtle"
className="w-full flex items-center"
onClick={() => setIsOpen((o) => !o)}
>
<ChevronDownIcon open={isOpen} className="size-4 mr-4" />
<span>Build timeline</span>
<span className="ml-auto text-content-secondary">
<div className="flex items-center justify-between px-4 py-1.5 relative">
<Button
disabled={isLoading}
variant="subtle"
onClick={() => setIsOpen((o) => !o)}
className="after:content-[''] after:absolute after:inset-0"
>
<ChevronDownIcon open={isOpen} />
<span>Build timeline</span>
</Button>
<span className="ml-auto text-sm text-content-secondary pr-2">
{isLoading ? (
<Skeleton variant="text" width={40} height={16} />
) : (
displayProvisioningTime()
)}
</span>
</Button>
</div>
{!isLoading && (
<Collapse in={isOpen}>
<div
@@ -4,9 +4,12 @@ import { beforeEach, describe, expect, it, vi } from "vitest";
import {
draftInputStorageKeyPrefix,
getPersistedDraftInputValue,
restoreOptimisticRequestSnapshot,
useConversationEditingState,
} from "./AgentChatPage";
import type { ChatMessageInputRef } from "./components/AgentChatInput";
import { createChatStore } from "./components/ChatConversation/chatStore";
import type { PendingAttachment } from "./components/ChatPageContent";
type MockChatInputHandle = {
handle: ChatMessageInputRef;
@@ -84,6 +87,41 @@ describe("getPersistedDraftInputValue", () => {
});
});
describe("restoreOptimisticRequestSnapshot", () => {
it("restores queued messages, stream output, status, and stream error", () => {
const store = createChatStore();
store.setQueuedMessages([
{
id: 9,
chat_id: "chat-abc-123",
created_at: "2025-01-01T00:00:00.000Z",
content: [{ type: "text" as const, text: "queued" }],
},
]);
store.setChatStatus("running");
store.applyMessagePart({ type: "text", text: "partial response" });
store.setStreamError({ kind: "generic", message: "old error" });
const previousSnapshot = store.getSnapshot();
store.batch(() => {
store.setQueuedMessages([]);
store.setChatStatus("pending");
store.clearStreamState();
store.clearStreamError();
});
restoreOptimisticRequestSnapshot(store, previousSnapshot);
const restoredSnapshot = store.getSnapshot();
expect(restoredSnapshot.queuedMessages).toEqual(
previousSnapshot.queuedMessages,
);
expect(restoredSnapshot.chatStatus).toBe(previousSnapshot.chatStatus);
expect(restoredSnapshot.streamState).toBe(previousSnapshot.streamState);
expect(restoredSnapshot.streamError).toEqual(previousSnapshot.streamError);
});
});
describe("useConversationEditingState", () => {
const chatID = "chat-abc-123";
const expectedKey = `${draftInputStorageKeyPrefix}${chatID}`;
@@ -327,6 +365,64 @@ describe("useConversationEditingState", () => {
unmount();
});
it("forwards pending attachments through history-edit send", async () => {
const { result, onSend, unmount } = renderEditing();
const attachments: PendingAttachment[] = [
{ fileId: "file-1", mediaType: "image/png" },
];
act(() => {
result.current.handleEditUserMessage(7, "hello");
});
await act(async () => {
await result.current.handleSendFromInput("hello", attachments);
});
expect(onSend).toHaveBeenCalledWith("hello", attachments, 7);
unmount();
});
it("restores the edit draft and file-block seed when an edit submission fails", async () => {
const { result, onSend, unmount } = renderEditing();
const mockInput = createMockChatInputHandle("edited message");
const fileBlocks = [
{ type: "file", file_id: "file-1", media_type: "image/png" },
] as const;
result.current.chatInputRef.current = mockInput.handle;
onSend.mockRejectedValueOnce(new Error("boom"));
const editorState = JSON.stringify({
root: {
children: [
{
children: [{ text: "edited message" }],
type: "paragraph",
},
],
type: "root",
},
});
act(() => {
result.current.handleEditUserMessage(7, "edited message", fileBlocks);
result.current.handleContentChange("edited message", editorState, false);
});
await act(async () => {
await expect(
result.current.handleSendFromInput("edited message"),
).rejects.toThrow("boom");
});
expect(mockInput.clear).toHaveBeenCalled();
expect(result.current.inputValueRef.current).toBe("edited message");
expect(result.current.editingMessageId).toBe(7);
expect(result.current.editingFileBlocks).toEqual(fileBlocks);
expect(result.current.editorInitialValue).toBe("edited message");
expect(result.current.initialEditorState).toBe(editorState);
unmount();
});
it("clears the composer and persisted draft after a successful send", async () => {
localStorage.setItem(expectedKey, "draft to clear");
const { result, onSend, unmount } = renderEditing();
@@ -11,6 +11,7 @@ import { toast } from "sonner";
import type { UrlTransform } from "streamdown";
import { API, watchWorkspace } from "#/api/api";
import { isApiError } from "#/api/errors";
import { buildOptimisticEditedMessage } from "#/api/queries/chatMessageEdits";
import {
chat,
chatDesktopEnabled,
@@ -51,11 +52,14 @@ import {
getWorkspaceAgent,
} from "./components/ChatConversation/chatHelpers";
import {
type ChatStore,
type ChatStoreState,
selectChatStatus,
useChatSelector,
useChatStore,
} from "./components/ChatConversation/chatStore";
import { useWorkspaceCreationWatcher } from "./components/ChatConversation/useWorkspaceCreationWatcher";
import type { PendingAttachment } from "./components/ChatPageContent";
import {
getDefaultMCPSelection,
getSavedMCPSelection,
@@ -101,12 +105,47 @@ export function getPersistedDraftInputValue(
).text;
}
/** @internal Exported for testing. */
export const restoreOptimisticRequestSnapshot = (
store: Pick<
ChatStore,
| "batch"
| "setChatStatus"
| "setQueuedMessages"
| "setStreamError"
| "setStreamState"
>,
snapshot: Pick<
ChatStoreState,
"chatStatus" | "queuedMessages" | "streamError" | "streamState"
>,
): void => {
store.batch(() => {
store.setQueuedMessages(snapshot.queuedMessages);
store.setChatStatus(snapshot.chatStatus);
store.setStreamState(snapshot.streamState);
store.setStreamError(snapshot.streamError);
});
};
const buildAttachmentMediaTypes = (
attachments?: readonly PendingAttachment[],
): ReadonlyMap<string, string> | undefined => {
if (!attachments?.length) {
return undefined;
}
return new Map(
attachments.map(({ fileId, mediaType }) => [fileId, mediaType]),
);
};
/** @internal Exported for testing. */
export function useConversationEditingState(deps: {
chatID: string | undefined;
onSend: (
message: string,
fileIds?: string[],
attachments?: readonly PendingAttachment[],
editedMessageID?: number,
) => Promise<void>;
onDeleteQueuedMessage: (id: number) => Promise<void>;
@@ -130,6 +169,9 @@ export function useConversationEditingState(deps: {
};
},
);
const serializedEditorStateRef = useRef<string | undefined>(
initialEditorState,
);
// Monotonic counter to force LexicalComposer remount.
const [remountKey, setRemountKey] = useState(0);
@@ -176,6 +218,7 @@ export function useConversationEditingState(deps: {
editorInitialValue: text,
initialEditorState: undefined,
});
serializedEditorStateRef.current = undefined;
setRemountKey((k) => k + 1);
inputValueRef.current = text;
setEditingFileBlocks(fileBlocks ?? []);
@@ -188,6 +231,7 @@ export function useConversationEditingState(deps: {
editorInitialValue: savedText,
initialEditorState: savedState,
});
serializedEditorStateRef.current = savedState;
setRemountKey((k) => k + 1);
inputValueRef.current = savedText;
setEditingMessageId(null);
@@ -221,6 +265,7 @@ export function useConversationEditingState(deps: {
editorInitialValue: text,
initialEditorState: undefined,
});
serializedEditorStateRef.current = undefined;
setRemountKey((k) => k + 1);
inputValueRef.current = text;
setEditingFileBlocks(fileBlocks);
@@ -233,6 +278,7 @@ export function useConversationEditingState(deps: {
editorInitialValue: savedText,
initialEditorState: savedState,
});
serializedEditorStateRef.current = savedState;
setRemountKey((k) => k + 1);
inputValueRef.current = savedText;
setEditingQueuedMessageID(null);
@@ -240,25 +286,48 @@ export function useConversationEditingState(deps: {
setEditingFileBlocks([]);
};
// Wraps the parent onSend to clear local input/editing state
// and handle queue-edit deletion.
const handleSendFromInput = async (message: string, fileIds?: string[]) => {
const editedMessageID =
editingMessageId !== null ? editingMessageId : undefined;
const queueEditID = editingQueuedMessageID;
// Clears the composer for an in-flight history edit and
// returns a rollback function that restores the editing draft
// if the send fails.
const clearInputForHistoryEdit = (message: string) => {
const snapshot = {
editorState: serializedEditorStateRef.current,
fileBlocks: editingFileBlocks,
messageId: editingMessageId,
};
await onSend(message, fileIds, editedMessageID);
// Clear input and editing state on success.
chatInputRef.current?.clear();
inputValueRef.current = "";
setEditingMessageId(null);
return () => {
setDraftState({
editorInitialValue: message,
initialEditorState: snapshot.editorState,
});
serializedEditorStateRef.current = snapshot.editorState;
setRemountKey((k) => k + 1);
inputValueRef.current = message;
setEditingMessageId(snapshot.messageId);
setEditingFileBlocks(snapshot.fileBlocks);
};
};
// Clears all input and editing state after a successful send.
const finalizeSuccessfulSend = (
editedMessageID: number | undefined,
queueEditID: number | null,
) => {
chatInputRef.current?.clear();
if (!isMobileViewport()) {
chatInputRef.current?.focus();
}
inputValueRef.current = "";
serializedEditorStateRef.current = undefined;
if (draftStorageKey) {
localStorage.removeItem(draftStorageKey);
}
if (editingMessageId !== null) {
setEditingMessageId(null);
if (editedMessageID !== undefined) {
setDraftBeforeHistoryEdit(null);
setEditingFileBlocks([]);
}
@@ -270,12 +339,41 @@ export function useConversationEditingState(deps: {
}
};
// Wraps the parent onSend to clear local input/editing state
// and handle queue-edit deletion.
const handleSendFromInput = async (
message: string,
attachments?: readonly PendingAttachment[],
) => {
const editedMessageID =
editingMessageId !== null ? editingMessageId : undefined;
const queueEditID = editingQueuedMessageID;
const sendPromise = onSend(message, attachments, editedMessageID);
// For history edits, clear input immediately and prepare
// a rollback in case the send fails.
const rollback =
editedMessageID !== undefined
? clearInputForHistoryEdit(message)
: undefined;
try {
await sendPromise;
} catch (error) {
rollback?.();
throw error;
}
finalizeSuccessfulSend(editedMessageID, queueEditID);
};
const handleContentChange = (
content: string,
serializedEditorState: string,
hasFileReferences: boolean,
) => {
inputValueRef.current = content;
serializedEditorStateRef.current = serializedEditorState;
// Don't overwrite the persisted draft while editing a
// history or queued message — the original draft (possibly
@@ -430,9 +528,6 @@ const AgentChatPage: FC = () => {
} = useOutletContext<AgentsOutletContext>();
const queryClient = useQueryClient();
const [selectedModel, setSelectedModel] = useState("");
const [pendingEditMessageId, setPendingEditMessageId] = useState<
number | null
>(null);
const scrollToBottomRef = useRef<(() => void) | null>(null);
const chatInputRef = useRef<ChatMessageInputRef | null>(null);
const inputValueRef = useRef(
@@ -775,7 +870,7 @@ const AgentChatPage: FC = () => {
const handleSend = async (
message: string,
fileIds?: string[],
attachments?: readonly PendingAttachment[],
editedMessageID?: number,
) => {
const chatInputHandle = (
@@ -790,7 +885,9 @@ const AgentChatPage: FC = () => {
(p) => p.type === "file-reference",
);
const hasContent =
message.trim() || (fileIds && fileIds.length > 0) || hasFileReferences;
message.trim() ||
(attachments && attachments.length > 0) ||
hasFileReferences;
if (!hasContent || isSubmissionPending || !agentId || !hasModelOptions) {
return;
}
@@ -818,28 +915,41 @@ const AgentChatPage: FC = () => {
}
}
// Add pre-uploaded file references.
if (fileIds && fileIds.length > 0) {
for (const fileId of fileIds) {
// Add pre-uploaded file attachments.
if (attachments && attachments.length > 0) {
for (const { fileId } of attachments) {
content.push({ type: "file", file_id: fileId });
}
}
if (editedMessageID !== undefined) {
const request: TypesGen.EditChatMessageRequest = { content };
const originalEditedMessage = chatMessagesList?.find(
(existingMessage) => existingMessage.id === editedMessageID,
);
const optimisticMessage = originalEditedMessage
? buildOptimisticEditedMessage({
requestContent: request.content,
originalMessage: originalEditedMessage,
attachmentMediaTypes: buildAttachmentMediaTypes(attachments),
})
: undefined;
const previousSnapshot = store.getSnapshot();
clearChatErrorReason(agentId);
clearStreamError();
setPendingEditMessageId(editedMessageID);
store.batch(() => {
store.setQueuedMessages([]);
store.setChatStatus("running");
store.clearStreamState();
});
scrollToBottomRef.current?.();
try {
await editMessage({
messageId: editedMessageID,
optimisticMessage,
req: request,
});
store.clearStreamState();
store.setChatStatus("running");
setPendingEditMessageId(null);
} catch (error) {
setPendingEditMessageId(null);
restoreOptimisticRequestSnapshot(store, previousSnapshot);
handleUsageLimitError(error);
throw error;
}
@@ -918,10 +1028,8 @@ const AgentChatPage: FC = () => {
const handlePromoteQueuedMessage = async (id: number) => {
const previousSnapshot = store.getSnapshot();
const previousQueuedMessages = previousSnapshot.queuedMessages;
const previousChatStatus = previousSnapshot.chatStatus;
store.setQueuedMessages(
previousQueuedMessages.filter((message) => message.id !== id),
previousSnapshot.queuedMessages.filter((message) => message.id !== id),
);
store.clearStreamState();
if (agentId) {
@@ -937,8 +1045,7 @@ const AgentChatPage: FC = () => {
store.upsertDurableMessage(promotedMessage);
upsertCacheMessages([promotedMessage]);
} catch (error) {
store.setQueuedMessages(previousQueuedMessages);
store.setChatStatus(previousChatStatus);
restoreOptimisticRequestSnapshot(store, previousSnapshot);
handleUsageLimitError(error);
throw error;
}
@@ -1133,7 +1240,6 @@ const AgentChatPage: FC = () => {
workspaceAgent={workspaceAgent}
store={store}
editing={editing}
pendingEditMessageId={pendingEditMessageId}
effectiveSelectedModel={effectiveSelectedModel}
setSelectedModel={setSelectedModel}
modelOptions={modelOptions}
@@ -113,7 +113,6 @@ const StoryAgentChatPageView: FC<StoryProps> = ({ editing, ...overrides }) => {
parentChat: undefined as TypesGen.Chat | undefined,
isArchived: false,
store: createChatStore(),
pendingEditMessageId: null as number | null,
effectiveSelectedModel: defaultModelConfigID,
setSelectedModel: fn(),
modelOptions: defaultModelOptions,
@@ -505,22 +504,6 @@ export const EditingMessage: Story = {
),
};
/** The saving state while an edit is in progress shows the pending
* indicator on the message being saved. */
export const EditingSaving: Story = {
render: () => (
<StoryAgentChatPageView
store={buildStoreWithMessages(editingMessages)}
editing={{
editingMessageId: 3,
editorInitialValue: "Now tell me a better joke",
}}
pendingEditMessageId={3}
isSubmissionPending
/>
),
};
// ---------------------------------------------------------------------------
// AgentChatPageNotFoundView stories
// ---------------------------------------------------------------------------
@@ -36,6 +36,7 @@ import {
import type { useChatStore } from "./components/ChatConversation/chatStore";
import type { ModelSelectorOption } from "./components/ChatElements";
import { DesktopPanelContext } from "./components/ChatElements/tools/DesktopPanelContext";
import type { PendingAttachment } from "./components/ChatPageContent";
import { ChatPageInput, ChatPageTimeline } from "./components/ChatPageContent";
import { ChatScrollContainer } from "./components/ChatScrollContainer";
import { ChatTopBar } from "./components/ChatTopBar";
@@ -69,7 +70,10 @@ interface EditingState {
fileBlocks: readonly ChatMessagePart[],
) => void;
handleCancelQueueEdit: () => void;
handleSendFromInput: (message: string, fileIds?: string[]) => void;
handleSendFromInput: (
message: string,
attachments?: readonly PendingAttachment[],
) => void;
handleContentChange: (
content: string,
serializedEditorState: string,
@@ -92,7 +96,6 @@ interface AgentChatPageViewProps {
// Editing state.
editing: EditingState;
pendingEditMessageId: number | null;
// Model/input configuration.
effectiveSelectedModel: string;
@@ -179,7 +182,6 @@ export const AgentChatPageView: FC<AgentChatPageViewProps> = ({
workspace,
store,
editing,
pendingEditMessageId,
effectiveSelectedModel,
setSelectedModel,
modelOptions,
@@ -387,7 +389,6 @@ export const AgentChatPageView: FC<AgentChatPageViewProps> = ({
persistedError={persistedError}
onEditUserMessage={editing.handleEditUserMessage}
editingMessageId={editing.editingMessageId}
savingMessageId={pendingEditMessageId}
urlTransform={urlTransform}
mcpServers={mcpServers}
/>
@@ -668,9 +668,8 @@ export const AgentChatInput: FC<AgentChatInputProps> = ({
<div className="flex items-center justify-between border-b border-border-warning/50 px-3 py-1.5">
<span className="flex items-center gap-1.5 text-xs font-medium text-content-warning">
<PencilIcon className="h-3.5 w-3.5" />
{isLoading
? "Saving edit..."
: "Editing will delete all subsequent messages and restart the conversation here."}
Editing will delete all subsequent messages and restart the
conversation here.
</span>
<Button
type="button"
@@ -939,3 +939,59 @@ export const ThinkingOnlyAssistantSpacing: Story = {
expect(canvas.getByText("Any progress?")).toBeInTheDocument();
},
};
/**
* Regression: sources-only assistant messages must have consistent
* bottom spacing before the next user bubble. A spacer div fills the
* gap that would normally come from the hidden action bar.
*/
export const SourcesOnlyAssistantSpacing: Story = {
args: {
...defaultArgs,
parsedMessages: buildMessages([
{
...baseMessage,
id: 1,
role: "user",
content: [{ type: "text", text: "Can you share your sources?" }],
},
{
...baseMessage,
id: 2,
role: "assistant",
content: [
{
type: "source",
url: "https://example.com/docs",
title: "Documentation",
},
{
type: "source",
url: "https://example.com/api",
title: "API Reference",
},
],
},
{
...baseMessage,
id: 3,
role: "user",
content: [{ type: "text", text: "Thanks!" }],
},
]),
},
play: async ({ canvasElement }) => {
const canvas = within(canvasElement);
expect(canvas.getByText("Can you share your sources?")).toBeInTheDocument();
expect(canvas.getByText("Thanks!")).toBeInTheDocument();
await userEvent.click(
canvas.getByRole("button", { name: /searched 2 results/i }),
);
expect(
canvas.getByRole("link", { name: "Documentation" }),
).toBeInTheDocument();
expect(
canvas.getByRole("link", { name: "API Reference" }),
).toBeInTheDocument();
},
};
@@ -12,7 +12,6 @@ import type { UrlTransform } from "streamdown";
import type * as TypesGen from "#/api/typesGenerated";
import { Button } from "#/components/Button/Button";
import { CopyButton } from "#/components/CopyButton/CopyButton";
import { Spinner } from "#/components/Spinner/Spinner";
import {
Tooltip,
TooltipContent,
@@ -427,7 +426,6 @@ const ChatMessageItem = memo<{
fileBlocks?: readonly TypesGen.ChatMessagePart[],
) => void;
editingMessageId?: number | null;
savingMessageId?: number | null;
isAfterEditingMessage?: boolean;
hideActions?: boolean;
@@ -446,7 +444,6 @@ const ChatMessageItem = memo<{
parsed,
onEditUserMessage,
editingMessageId,
savingMessageId,
isAfterEditingMessage = false,
hideActions = false,
fadeFromBottom = false,
@@ -458,7 +455,6 @@ const ChatMessageItem = memo<{
showDesktopPreviews,
}) => {
const isUser = message.role === "user";
const isSavingMessage = savingMessageId === message.id;
const [previewImage, setPreviewImage] = useState<string | null>(null);
const [previewText, setPreviewText] = useState<string | null>(null);
if (
@@ -516,6 +512,11 @@ const ChatMessageItem = memo<{
userInlineContent.length > 0 || Boolean(parsed.markdown?.trim());
const hasFileBlocks = userFileBlocks.length > 0;
const hasCopyableContent = Boolean(parsed.markdown.trim());
const needsAssistantBottomSpacer =
!hideActions &&
!isUser &&
!hasCopyableContent &&
(Boolean(parsed.reasoning) || parsed.sources.length > 0);
const conversationItemProps: { role: "user" | "assistant" } = {
role: isUser ? "user" : "assistant",
@@ -536,7 +537,6 @@ const ChatMessageItem = memo<{
"rounded-lg border border-solid border-border-default bg-surface-secondary px-3 py-2 font-sans shadow-sm transition-shadow",
editingMessageId === message.id &&
"border-surface-secondary shadow-[0_0_0_2px_hsla(var(--border-warning),0.6)]",
isSavingMessage && "ring-2 ring-content-secondary/40",
fadeFromBottom && "relative overflow-hidden",
)}
style={
@@ -567,13 +567,6 @@ const ChatMessageItem = memo<{
: parsed.markdown || ""}
</span>
)}
{isSavingMessage && (
<Spinner
className="mt-0.5 h-3.5 w-3.5 shrink-0 text-content-secondary"
aria-label="Saving message edit"
loading
/>
)}
</div>
)}
{hasFileBlocks && (
@@ -670,12 +663,9 @@ const ChatMessageItem = memo<{
</div>
)}
{/* Spacer for assistant messages without an action bar
(e.g. thinking-only) so they have consistent bottom
padding before the next user bubble. */}
{!hideActions &&
!isUser &&
!hasCopyableContent &&
Boolean(parsed.reasoning) && <div className="min-h-6" />}
(e.g. reasoning-only or sources-only) so they have
consistent bottom padding before the next user bubble. */}
{needsAssistantBottomSpacer && <div className="min-h-6" />}
{previewImage && (
<ImageLightbox
src={previewImage}
@@ -702,7 +692,6 @@ const StickyUserMessage = memo<{
fileBlocks?: readonly TypesGen.ChatMessagePart[],
) => void;
editingMessageId?: number | null;
savingMessageId?: number | null;
isAfterEditingMessage?: boolean;
}>(
({
@@ -710,7 +699,6 @@ const StickyUserMessage = memo<{
parsed,
onEditUserMessage,
editingMessageId,
savingMessageId,
isAfterEditingMessage = false,
}) => {
const [isStuck, setIsStuck] = useState(false);
@@ -935,7 +923,6 @@ const StickyUserMessage = memo<{
parsed={parsed}
onEditUserMessage={handleEditUserMessage}
editingMessageId={editingMessageId}
savingMessageId={savingMessageId}
isAfterEditingMessage={isAfterEditingMessage}
/>
</div>
@@ -978,7 +965,6 @@ const StickyUserMessage = memo<{
parsed={parsed}
onEditUserMessage={handleEditUserMessage}
editingMessageId={editingMessageId}
savingMessageId={savingMessageId}
isAfterEditingMessage={isAfterEditingMessage}
fadeFromBottom
/>
@@ -1000,7 +986,6 @@ interface ConversationTimelineProps {
fileBlocks?: readonly TypesGen.ChatMessagePart[],
) => void;
editingMessageId?: number | null;
savingMessageId?: number | null;
urlTransform?: UrlTransform;
mcpServers?: readonly TypesGen.MCPServerConfig[];
computerUseSubagentIds?: Set<string>;
@@ -1014,7 +999,6 @@ export const ConversationTimeline = memo<ConversationTimelineProps>(
subagentTitles,
onEditUserMessage,
editingMessageId,
savingMessageId,
urlTransform,
mcpServers,
computerUseSubagentIds,
@@ -1041,7 +1025,7 @@ export const ConversationTimeline = memo<ConversationTimelineProps>(
}
return (
<div className="flex flex-col gap-2">
<div data-testid="conversation-timeline" className="flex flex-col gap-2">
{parsedMessages.map(({ message, parsed }, msgIdx) => {
if (message.role === "user") {
return (
@@ -1051,7 +1035,6 @@ export const ConversationTimeline = memo<ConversationTimelineProps>(
parsed={parsed}
onEditUserMessage={onEditUserMessage}
editingMessageId={editingMessageId}
savingMessageId={savingMessageId}
isAfterEditingMessage={afterEditingMessageIds.has(message.id)}
/>
);
@@ -1065,7 +1048,6 @@ export const ConversationTimeline = memo<ConversationTimelineProps>(
key={message.id}
message={message}
parsed={parsed}
savingMessageId={savingMessageId}
urlTransform={urlTransform}
isAfterEditingMessage={afterEditingMessageIds.has(message.id)}
hideActions={!isLastInChain}
@@ -70,7 +70,7 @@ export const LiveStreamTailContent = ({
}
return (
<div className="flex flex-col gap-3">
<div className="flex flex-col gap-2">
{shouldRenderEmptyState && (
<div className="py-12 text-center text-content-secondary">
<p className="text-sm">Start a conversation with your agent.</p>
@@ -190,6 +190,27 @@ describe("setChatStatus", () => {
});
});
// ---------------------------------------------------------------------------
// setStreamState
// ---------------------------------------------------------------------------
describe("setStreamState", () => {
it("does not notify when setting the same stream state reference", () => {
const store = createChatStore();
store.applyMessagePart({ type: "text", text: "hello" });
const streamState = store.getSnapshot().streamState;
expect(streamState).not.toBeNull();
let notified = false;
store.subscribe(() => {
notified = true;
});
store.setStreamState(streamState);
expect(notified).toBe(false);
});
});
// ---------------------------------------------------------------------------
// setStreamError / clearStreamError
// ---------------------------------------------------------------------------
@@ -4213,6 +4213,96 @@ describe("store/cache desync protection", () => {
expect(result.current.orderedMessageIDs).toEqual([1]);
});
});
it("reflects optimistic and authoritative history-edit cache updates through the normal sync effect", async () => {
immediateAnimationFrame();
const chatID = "chat-local-edit-sync";
const msg1 = makeMessage(chatID, 1, "user", "first");
const msg2 = makeMessage(chatID, 2, "assistant", "second");
const msg3 = makeMessage(chatID, 3, "user", "third");
const optimisticReplacement = {
...msg3,
content: [{ type: "text" as const, text: "edited draft" }],
};
const authoritativeReplacement = makeMessage(chatID, 9, "user", "edited");
const mockSocket = createMockSocket();
mockWatchChatReturn(mockSocket);
const queryClient = createTestQueryClient();
const wrapper: FC<PropsWithChildren> = ({ children }) => (
<QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
);
const initialOptions = {
chatID,
chatMessages: [msg1, msg2, msg3],
chatRecord: makeChat(chatID),
chatMessagesData: {
messages: [msg1, msg2, msg3],
queued_messages: [],
has_more: false,
},
chatQueuedMessages: [] as TypesGen.ChatQueuedMessage[],
setChatErrorReason: vi.fn(),
clearChatErrorReason: vi.fn(),
};
const { result, rerender } = renderHook(
(options: Parameters<typeof useChatStore>[0]) => {
const { store } = useChatStore(options);
return {
store,
messagesByID: useChatSelector(store, selectMessagesByID),
orderedMessageIDs: useChatSelector(store, selectOrderedMessageIDs),
};
},
{ initialProps: initialOptions, wrapper },
);
await waitFor(() => {
expect(result.current.orderedMessageIDs).toEqual([1, 2, 3]);
});
act(() => {
mockSocket.emitOpen();
});
rerender({
...initialOptions,
chatMessages: [msg1, msg2, optimisticReplacement],
chatMessagesData: {
messages: [msg1, msg2, optimisticReplacement],
queued_messages: [],
has_more: false,
},
});
await waitFor(() => {
expect(result.current.orderedMessageIDs).toEqual([1, 2, 3]);
expect(result.current.messagesByID.get(3)?.content).toEqual(
optimisticReplacement.content,
);
});
rerender({
...initialOptions,
chatMessages: [msg1, msg2, authoritativeReplacement],
chatMessagesData: {
messages: [msg1, msg2, authoritativeReplacement],
queued_messages: [],
has_more: false,
},
});
await waitFor(() => {
expect(result.current.orderedMessageIDs).toEqual([1, 2, 9]);
expect(result.current.messagesByID.has(3)).toBe(false);
expect(result.current.messagesByID.get(9)?.content).toEqual(
authoritativeReplacement.content,
);
});
});
});
describe("parse errors", () => {
@@ -174,6 +174,7 @@ export type ChatStore = {
queuedMessages: readonly TypesGen.ChatQueuedMessage[] | undefined,
) => void;
setChatStatus: (status: TypesGen.ChatStatus | null) => void;
setStreamState: (streamState: StreamState | null) => void;
setStreamError: (reason: ChatDetailError | null) => void;
clearStreamError: () => void;
setRetryState: (state: RetryState | null) => void;
@@ -412,6 +413,20 @@ export const createChatStore = (): ChatStore => {
chatStatus: status,
}));
},
setStreamState: (streamState) => {
if (state.streamState === streamState) {
return;
}
setState((current) => {
if (current.streamState === streamState) {
return current;
}
return {
...current,
streamState,
};
});
},
setStreamError: (reason) => {
setState((current) => {
if (chatDetailErrorsEqual(current.streamError, reason)) {
@@ -206,10 +206,9 @@ export const useChatStore = (
const fetchedIDs = new Set(chatMessages.map((m) => m.id));
// Only classify a store-held ID as stale if it was
// present in the PREVIOUS sync's fetched data. IDs
// added to the store after the last sync (by the WS
// handler or handleSend) are new, not stale, and
// must not trigger the destructive replaceMessages
// path.
// added to the store after the last sync (for example
// by the WS handler) are new, not stale, and must not
// trigger the destructive replaceMessages path.
const prevIDs = new Set(prev.map((m) => m.id));
const hasStaleEntries =
contentChanged &&
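The stale-classification rule described in the comment above can be restated as a small pure predicate. This is a hedged sketch with hypothetical names (`hasStaleStoreEntries`, `prevSyncIDs`), not code from the diff:

```typescript
// Hypothetical restatement of the stale-ID rule: a store-held ID is
// stale only when the PREVIOUS sync fetched it and the current sync
// no longer does. IDs the WS handler (or handleSend) added after the
// last sync appear in neither set and are treated as new, so they
// never trigger the destructive replaceMessages path.
const hasStaleStoreEntries = (
	storeIDs: readonly number[],
	prevSyncIDs: ReadonlySet<number>,
	currentFetchIDs: ReadonlySet<number>,
): boolean =>
	storeIDs.some((id) => prevSyncIDs.has(id) && !currentFetchIDs.has(id));
```

Under this rule, an ID present in the store but absent from both sets is new rather than stale, which is exactly the distinction the comment draws.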
@@ -21,6 +21,7 @@ import {
} from "#/components/Tooltip/Tooltip";
import { formatTokenCount } from "#/utils/analytics";
import { formatCostMicros } from "#/utils/currency";
import { paginateItems } from "#/utils/paginateItems";
interface ChatCostSummaryViewProps {
summary: TypesGen.ChatCostSummary | undefined;
@@ -95,25 +96,19 @@ export const ChatCostSummaryView: FC<ChatCostSummaryViewProps> = ({
}
const modelPageSize = 10;
const modelMaxPage = Math.max(
1,
Math.ceil(summary.by_model.length / modelPageSize),
);
const clampedModelPage = Math.min(modelPage, modelMaxPage);
const pagedModels = summary.by_model.slice(
(clampedModelPage - 1) * modelPageSize,
clampedModelPage * modelPageSize,
);
const {
pagedItems: pagedModels,
clampedPage: clampedModelPage,
hasPreviousPage: hasModelPrev,
hasNextPage: hasModelNext,
} = paginateItems(summary.by_model, modelPageSize, modelPage);
const chatPageSize = 10;
const chatMaxPage = Math.max(
1,
Math.ceil(summary.by_chat.length / chatPageSize),
);
const clampedChatPage = Math.min(chatPage, chatMaxPage);
const pagedChats = summary.by_chat.slice(
(clampedChatPage - 1) * chatPageSize,
clampedChatPage * chatPageSize,
);
const {
pagedItems: pagedChats,
clampedPage: clampedChatPage,
hasPreviousPage: hasChatPrev,
hasNextPage: hasChatNext,
} = paginateItems(summary.by_chat, chatPageSize, chatPage);
const usageLimit = summary.usage_limit;
const showUsageLimitCard = usageLimit?.is_limited === true;
@@ -333,10 +328,8 @@ export const ChatCostSummaryView: FC<ChatCostSummaryViewProps> = ({
currentPage={clampedModelPage}
pageSize={modelPageSize}
onPageChange={setModelPage}
hasPreviousPage={clampedModelPage > 1}
hasNextPage={
clampedModelPage * modelPageSize < summary.by_model.length
}
hasPreviousPage={hasModelPrev}
hasNextPage={hasModelNext}
/>
</div>
)}
@@ -403,10 +396,8 @@ export const ChatCostSummaryView: FC<ChatCostSummaryViewProps> = ({
currentPage={clampedChatPage}
pageSize={chatPageSize}
onPageChange={setChatPage}
hasPreviousPage={clampedChatPage > 1}
hasNextPage={
clampedChatPage * chatPageSize < summary.by_chat.length
}
hasPreviousPage={hasChatPrev}
hasNextPage={hasChatNext}
/>
</div>
)}
@@ -48,7 +48,6 @@ interface ChatPageTimelineProps {
fileBlocks?: readonly TypesGen.ChatMessagePart[],
) => void;
editingMessageId?: number | null;
savingMessageId?: number | null;
urlTransform?: UrlTransform;
mcpServers?: readonly TypesGen.MCPServerConfig[];
}
@@ -59,7 +58,6 @@ export const ChatPageTimeline: FC<ChatPageTimelineProps> = ({
persistedError,
onEditUserMessage,
editingMessageId,
savingMessageId,
urlTransform,
mcpServers,
}) => {
@@ -86,7 +84,10 @@ export const ChatPageTimeline: FC<ChatPageTimelineProps> = ({
return (
<Profiler id="AgentChat" onRender={onRenderProfiler}>
<div className="mx-auto flex w-full max-w-3xl flex-col gap-3 py-6">
<div
data-testid="chat-timeline-wrapper"
className="mx-auto flex w-full max-w-3xl flex-col gap-2 py-6"
>
{/* VNC sessions for completed agents may already be
terminated, so inline desktop previews are disabled
via showDesktopPreviews={false} to avoid a perpetual
@@ -97,7 +98,6 @@ export const ChatPageTimeline: FC<ChatPageTimelineProps> = ({
subagentTitles={subagentTitles}
onEditUserMessage={onEditUserMessage}
editingMessageId={editingMessageId}
savingMessageId={savingMessageId}
urlTransform={urlTransform}
mcpServers={mcpServers}
computerUseSubagentIds={computerUseSubagentIds}
@@ -118,10 +118,18 @@ export const ChatPageTimeline: FC<ChatPageTimelineProps> = ({
);
};
export type PendingAttachment = {
fileId: string;
mediaType: string;
};
interface ChatPageInputProps {
store: ChatStoreHandle;
compressionThreshold: number | undefined;
onSend: (message: string, fileIds?: string[]) => void;
onSend: (
message: string,
attachments?: readonly PendingAttachment[],
) => Promise<void> | void;
onDeleteQueuedMessage: (id: number) => Promise<void>;
onPromoteQueuedMessage: (id: number) => Promise<void>;
onInterrupt: () => void;
@@ -312,9 +320,10 @@ export const ChatPageInput: FC<ChatPageInputProps> = ({
<AgentChatInput
onSend={(message) => {
void (async () => {
// Collect file IDs from already-uploaded attachments.
// Skip files in error state (e.g. too large).
const fileIds: string[] = [];
// Collect uploaded attachment metadata for the optimistic
// transcript builder while keeping the server payload
// shape unchanged downstream.
const pendingAttachments: PendingAttachment[] = [];
let skippedErrors = 0;
for (const file of attachments) {
const state = uploadStates.get(file);
@@ -323,7 +332,10 @@ export const ChatPageInput: FC<ChatPageInputProps> = ({
continue;
}
if (state?.status === "uploaded" && state.fileId) {
fileIds.push(state.fileId);
pendingAttachments.push({
fileId: state.fileId,
mediaType: file.type || "application/octet-stream",
});
}
}
if (skippedErrors > 0) {
@@ -331,9 +343,10 @@ export const ChatPageInput: FC<ChatPageInputProps> = ({
`${skippedErrors} attachment${skippedErrors > 1 ? "s" : ""} could not be sent (upload failed)`,
);
}
const fileArg = fileIds.length > 0 ? fileIds : undefined;
const attachmentArg =
pendingAttachments.length > 0 ? pendingAttachments : undefined;
try {
await onSend(message, fileArg);
await onSend(message, attachmentArg);
} catch {
// Attachments preserved for retry on failure.
return;
@@ -1,7 +1,7 @@
import dayjs from "dayjs";
import relativeTime from "dayjs/plugin/relativeTime";
import { CodeIcon, ExternalLinkIcon } from "lucide-react";
import type { FC } from "react";
import { type FC, useState } from "react";
import { Area, AreaChart, CartesianGrid, XAxis, YAxis } from "recharts";
import type * as TypesGen from "#/api/typesGenerated";
import { Button } from "#/components/Button/Button";
@@ -11,6 +11,7 @@ import {
ChartTooltip,
ChartTooltipContent,
} from "#/components/Chart/Chart";
import { PaginationWidgetBase } from "#/components/PaginationWidget/PaginationWidgetBase";
import {
Table,
TableBody,
@@ -21,6 +22,7 @@ import {
} from "#/components/Table/Table";
import { cn } from "#/utils/cn";
import { formatCostMicros } from "#/utils/currency";
import { paginateItems } from "#/utils/paginateItems";
import { PrStateIcon } from "./GitPanel/GitPanel";
dayjs.extend(relativeTime);
@@ -286,6 +288,8 @@ const TimeRangeFilter: FC<{
// Main view
// ---------------------------------------------------------------------------
const RECENT_PRS_PAGE_SIZE = 10;
export const PRInsightsView: FC<PRInsightsViewProps> = ({
data,
timeRange,
@@ -294,6 +298,18 @@ export const PRInsightsView: FC<PRInsightsViewProps> = ({
const { summary, time_series, by_model, recent_prs } = data;
const isEmpty = summary.total_prs_created === 0;
// Client-side pagination for recent PRs table.
// Page resets to 1 on data refresh because the parent unmounts this
// component during loading. Clamping ensures the page is valid if the
// list shrinks without a full remount.
const [recentPrsPage, setRecentPrsPage] = useState(1);
const {
pagedItems: pagedRecentPrs,
clampedPage: clampedRecentPrsPage,
hasPreviousPage: hasRecentPrsPrev,
hasNextPage: hasRecentPrsNext,
} = paginateItems(recent_prs, RECENT_PRS_PAGE_SIZE, recentPrsPage);
return (
<div className="space-y-8">
{/* ── Header ── */}
@@ -354,8 +370,8 @@ export const PRInsightsView: FC<PRInsightsViewProps> = ({
</div>
</section>
{/* ── Model breakdown + Recent PRs side by side ── */}
<div className="grid grid-cols-1 gap-6 lg:grid-cols-2">
{/* ── Model breakdown + Recent PRs ── */}
<div className="space-y-6">
{/* ── Model performance (simplified) ── */}
{by_model.length > 0 && (
<section>
@@ -413,7 +429,7 @@ export const PRInsightsView: FC<PRInsightsViewProps> = ({
{recent_prs.length > 0 && (
<section>
<div className="mb-4">
<SectionTitle>Recent</SectionTitle>
<SectionTitle>Pull requests</SectionTitle>
</div>
<div className="overflow-hidden rounded-lg border border-border-default">
<Table className="table-fixed text-sm">
@@ -436,7 +452,7 @@ export const PRInsightsView: FC<PRInsightsViewProps> = ({
</TableRow>
</TableHeader>{" "}
<TableBody>
{recent_prs.map((pr) => (
{pagedRecentPrs.map((pr) => (
<TableRow
key={pr.chat_id}
className="border-t border-border-default transition-colors hover:bg-surface-secondary/50"
@@ -480,6 +496,18 @@ export const PRInsightsView: FC<PRInsightsViewProps> = ({
</TableBody>
</Table>
</div>
{recent_prs.length > RECENT_PRS_PAGE_SIZE && (
<div className="pt-4">
<PaginationWidgetBase
totalRecords={recent_prs.length}
currentPage={clampedRecentPrsPage}
pageSize={RECENT_PRS_PAGE_SIZE}
onPageChange={setRecentPrsPage}
hasPreviousPage={hasRecentPrsPrev}
hasNextPage={hasRecentPrsNext}
/>
</div>
)}
</section>
)}
</div>
@@ -1,10 +1,7 @@
import type { Meta, StoryObj } from "@storybook/react-vite";
import { action } from "storybook/actions";
import { screen, userEvent } from "storybook/test";
import {
getProvisionerDaemonsKey,
organizationsKey,
} from "#/api/queries/organizations";
import { expect, screen, userEvent, waitFor } from "storybook/test";
import { getProvisionerDaemonsKey } from "#/api/queries/organizations";
import {
MockDefaultOrganization,
MockOrganization2,
@@ -61,40 +58,20 @@ export const StarterTemplateWithOrgPicker: Story = {
},
};
const canCreateTemplate = (organizationId: string) => {
return {
[organizationId]: {
object: {
resource_type: "template",
organization_id: organizationId,
},
action: "create",
},
};
};
// Query key used by permittedOrganizations() in the form.
const permittedOrgsKey = [
"organizations",
"permitted",
{ object: { resource_type: "template" }, action: "create" },
];
export const StarterTemplateWithProvisionerWarning: Story = {
parameters: {
queries: [
{
key: organizationsKey,
key: permittedOrgsKey,
data: [MockDefaultOrganization, MockOrganization2],
},
{
key: [
"authorization",
{
checks: {
...canCreateTemplate(MockDefaultOrganization.id),
...canCreateTemplate(MockOrganization2.id),
},
},
],
data: {
[MockDefaultOrganization.id]: true,
[MockOrganization2.id]: true,
},
},
{
key: getProvisionerDaemonsKey(MockOrganization2.id),
data: [],
@@ -117,27 +94,11 @@ export const StarterTemplatePermissionsCheck: Story = {
parameters: {
queries: [
{
key: organizationsKey,
data: [MockDefaultOrganization, MockOrganization2],
},
{
key: [
"authorization",
{
checks: {
...canCreateTemplate(MockDefaultOrganization.id),
...canCreateTemplate(MockOrganization2.id),
},
},
],
data: {
[MockDefaultOrganization.id]: true,
[MockOrganization2.id]: false,
},
},
{
key: getProvisionerDaemonsKey(MockOrganization2.id),
data: [],
// Only MockDefaultOrganization passes the permission
// check; MockOrganization2 is filtered out by the
// permittedOrganizations query.
key: permittedOrgsKey,
data: [MockDefaultOrganization],
},
],
},
@@ -146,7 +107,14 @@ export const StarterTemplatePermissionsCheck: Story = {
showOrganizationPicker: true,
},
play: async () => {
// When only one org passes the permission check, it should be
// auto-selected in the picker.
const organizationPicker = screen.getByTestId("organization-autocomplete");
await waitFor(() =>
expect(organizationPicker).toHaveTextContent(
MockDefaultOrganization.display_name,
),
);
await userEvent.click(organizationPicker);
},
};
@@ -7,7 +7,10 @@ import { type FC, useState } from "react";
import { useQuery } from "react-query";
import { useSearchParams } from "react-router";
import * as Yup from "yup";
import { provisionerDaemons } from "#/api/queries/organizations";
import {
permittedOrganizations,
provisionerDaemons,
} from "#/api/queries/organizations";
import type {
CreateTemplateVersionRequest,
Organization,
@@ -191,6 +194,10 @@ type CreateTemplateFormProps = (
showOrganizationPicker?: boolean;
};
// Stable reference for empty org options to avoid re-render loops
// in the render-time state adjustment pattern.
const emptyOrgs: Organization[] = [];
export const CreateTemplateForm: FC<CreateTemplateFormProps> = (props) => {
const [searchParams] = useSearchParams();
const [selectedOrg, setSelectedOrg] = useState<Organization | null>(null);
@@ -222,6 +229,34 @@ export const CreateTemplateForm: FC<CreateTemplateFormProps> = (props) => {
});
const getFieldHelpers = getFormHelpers<CreateTemplateFormData>(form, error);
const permittedOrgsQuery = useQuery({
...permittedOrganizations({
object: { resource_type: "template" },
action: "create",
}),
enabled: Boolean(showOrganizationPicker),
});
const orgOptions = permittedOrgsQuery.data ?? emptyOrgs;
// Clear invalid selections when permission filtering removes the
// selected org. Uses the React render-time adjustment pattern.
const [prevOrgOptions, setPrevOrgOptions] = useState(orgOptions);
if (orgOptions !== prevOrgOptions) {
setPrevOrgOptions(orgOptions);
if (selectedOrg && !orgOptions.some((o) => o.id === selectedOrg.id)) {
setSelectedOrg(null);
void form.setFieldValue("organization", "");
}
}
// Auto-select when exactly one org is available and nothing is
// selected. Runs every render (not gated on options change) so it
// works when mock data is available synchronously on first render.
if (orgOptions.length === 1 && selectedOrg === null) {
setSelectedOrg(orgOptions[0]);
void form.setFieldValue("organization", orgOptions[0].name || "");
}
const { data: provisioners } = useQuery({
...provisionerDaemons(selectedOrg?.id ?? ""),
enabled: showOrganizationPicker && Boolean(selectedOrg),
@@ -263,9 +298,10 @@ export const CreateTemplateForm: FC<CreateTemplateFormProps> = (props) => {
<div className="flex flex-col gap-2">
<Label htmlFor="organization">Organization</Label>
<OrganizationAutocomplete
{...getFieldHelpers("organization")}
id="organization"
required
value={selectedOrg}
options={orgOptions}
onChange={(newValue) => {
setSelectedOrg(newValue);
void form.setFieldValue(
@@ -273,10 +309,6 @@ export const CreateTemplateForm: FC<CreateTemplateFormProps> = (props) => {
newValue?.name || "",
);
}}
check={{
object: { resource_type: "template" },
action: "create",
}}
/>
</div>
</>
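The two render-time adjustments above (clear a selection that permission filtering removed, then auto-select when exactly one org remains) can be modeled as one pure decision function. A minimal sketch with hypothetical names (`reconcileSelection`, `Org`), separate from the React state wiring in the diff:

```typescript
type Org = { id: string; name: string };

// Given the permitted org options and the current selection, decide
// the next selection, mirroring the render-time adjustment pattern:
const reconcileSelection = (
	options: readonly Org[],
	selected: Org | null,
): Org | null => {
	// Clear a selection that permission filtering removed.
	if (selected && !options.some((o) => o.id === selected.id)) {
		selected = null;
	}
	// Auto-select when exactly one org is available and none is chosen.
	if (options.length === 1 && selected === null) {
		return options[0];
	}
	return selected;
};
```

Keeping the decision pure like this makes the ordering explicit: the invalid selection is cleared first, so a filtered-out selection in a one-org list still ends up auto-selected.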
@@ -1,8 +1,6 @@
import type { Meta, StoryObj } from "@storybook/react-vite";
import { action } from "storybook/actions";
import { userEvent, within } from "storybook/test";
import { organizationsKey } from "#/api/queries/organizations";
import type { Organization } from "#/api/typesGenerated";
import {
MockOrganization,
MockOrganization2,
@@ -26,37 +24,20 @@ type Story = StoryObj<typeof CreateUserForm>;
export const Ready: Story = {};
const permissionCheckQuery = (organizations: Organization[]) => {
return {
key: [
"authorization",
{
checks: Object.fromEntries(
organizations.map((org) => [
org.id,
{
action: "create",
object: {
resource_type: "organization_member",
organization_id: org.id,
},
},
]),
),
},
],
data: Object.fromEntries(organizations.map((org) => [org.id, true])),
};
};
// Query key used by permittedOrganizations() in the form.
const permittedOrgsKey = [
"organizations",
"permitted",
{ object: { resource_type: "organization_member" }, action: "create" },
];
export const WithOrganizations: Story = {
parameters: {
queries: [
{
key: organizationsKey,
key: permittedOrgsKey,
data: [MockOrganization, MockOrganization2],
},
permissionCheckQuery([MockOrganization, MockOrganization2]),
],
},
args: {
@@ -1,9 +1,11 @@
import { useFormik } from "formik";
import { Check } from "lucide-react";
import { Select as SelectPrimitive } from "radix-ui";
import type { FC } from "react";
import { type FC, useState } from "react";
import { useQuery } from "react-query";
import * as Yup from "yup";
import { hasApiFieldErrors, isApiError } from "#/api/errors";
import { permittedOrganizations } from "#/api/queries/organizations";
import type * as TypesGen from "#/api/typesGenerated";
import { ErrorAlert } from "#/components/Alert/ErrorAlert";
import { Button } from "#/components/Button/Button";
@@ -90,6 +92,10 @@ interface CreateUserFormProps {
serviceAccountsEnabled: boolean;
}
// Stable reference for empty org options to avoid re-render loops
// in the render-time state adjustment pattern.
const emptyOrgs: TypesGen.Organization[] = [];
export const CreateUserForm: FC<CreateUserFormProps> = ({
error,
isLoading,
@@ -125,6 +131,38 @@ export const CreateUserForm: FC<CreateUserFormProps> = ({
enableReinitialize: true,
});
const [selectedOrg, setSelectedOrg] = useState<TypesGen.Organization | null>(
null,
);
const permittedOrgsQuery = useQuery({
...permittedOrganizations({
object: { resource_type: "organization_member" },
action: "create",
}),
enabled: showOrganizations,
});
const orgOptions = permittedOrgsQuery.data ?? emptyOrgs;
// Clear invalid selections when permission filtering removes the
// selected org. Uses the React render-time adjustment pattern.
const [prevOrgOptions, setPrevOrgOptions] = useState(orgOptions);
if (orgOptions !== prevOrgOptions) {
setPrevOrgOptions(orgOptions);
if (selectedOrg && !orgOptions.some((o) => o.id === selectedOrg.id)) {
setSelectedOrg(null);
void form.setFieldValue("organization", "");
}
}
// Auto-select when exactly one org is available and nothing is
// selected. Runs every render (not gated on options change) so it
// works when mock data is available synchronously on first render.
if (orgOptions.length === 1 && selectedOrg === null) {
setSelectedOrg(orgOptions[0]);
void form.setFieldValue("organization", orgOptions[0].id ?? "");
}
const getFieldHelpers = getFormHelpers(form, error);
const isServiceAccount = form.values.login_type === "none";
@@ -174,16 +212,14 @@ export const CreateUserForm: FC<CreateUserFormProps> = ({
<div className="flex flex-col gap-2">
<Label htmlFor="organization">Organization</Label>
<OrganizationAutocomplete
{...getFieldHelpers("organization")}
id="organization"
required
value={selectedOrg}
options={orgOptions}
onChange={(newValue) => {
setSelectedOrg(newValue);
void form.setFieldValue("organization", newValue?.id ?? "");
}}
check={{
object: { resource_type: "organization_member" },
action: "create",
}}
/>
</div>
)}
@@ -0,0 +1,64 @@
import { describe, expect, it } from "vitest";
import { paginateItems } from "./paginateItems";
// 25 items numbered 1–25 for readable assertions.
const items = Array.from({ length: 25 }, (_, i) => i + 1);
describe("paginateItems", () => {
it("returns the first page of items", () => {
const result = paginateItems(items, 10, 1);
expect(result.pagedItems).toEqual([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]);
expect(result.clampedPage).toBe(1);
expect(result.totalPages).toBe(3);
expect(result.hasPreviousPage).toBe(false);
expect(result.hasNextPage).toBe(true);
});
it("returns a partial last page", () => {
const result = paginateItems(items, 10, 3);
expect(result.pagedItems).toEqual([21, 22, 23, 24, 25]);
expect(result.clampedPage).toBe(3);
expect(result.totalPages).toBe(3);
expect(result.hasPreviousPage).toBe(true);
expect(result.hasNextPage).toBe(false);
});
it("clamps currentPage down when beyond total pages", () => {
const result = paginateItems(items, 10, 99);
expect(result.clampedPage).toBe(3);
expect(result.pagedItems).toEqual([21, 22, 23, 24, 25]);
expect(result.hasPreviousPage).toBe(true);
expect(result.hasNextPage).toBe(false);
});
it("clamps currentPage up when 0", () => {
const result = paginateItems(items, 10, 0);
expect(result.clampedPage).toBe(1);
expect(result.pagedItems).toEqual([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]);
expect(result.hasPreviousPage).toBe(false);
expect(result.hasNextPage).toBe(true);
});
it("clamps currentPage up when negative", () => {
const result = paginateItems(items, 10, -5);
expect(result.clampedPage).toBe(1);
expect(result.pagedItems).toEqual([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]);
expect(result.hasPreviousPage).toBe(false);
expect(result.hasNextPage).toBe(true);
});
it("returns empty pagedItems with clampedPage=1 for an empty array", () => {
const result = paginateItems([], 10, 1);
expect(result.pagedItems).toEqual([]);
expect(result.clampedPage).toBe(1);
expect(result.totalPages).toBe(1);
expect(result.hasPreviousPage).toBe(false);
expect(result.hasNextPage).toBe(false);
});
it("reports hasPreviousPage correctly for middle pages", () => {
const result = paginateItems(items, 10, 2);
expect(result.hasPreviousPage).toBe(true);
expect(result.hasNextPage).toBe(true);
});
});
@@ -0,0 +1,25 @@
export function paginateItems<T>(
items: readonly T[],
pageSize: number,
currentPage: number,
): {
pagedItems: T[];
clampedPage: number;
totalPages: number;
hasPreviousPage: boolean;
hasNextPage: boolean;
} {
const totalPages = Math.max(1, Math.ceil(items.length / pageSize));
const clampedPage = Math.max(1, Math.min(currentPage, totalPages));
const pagedItems = items.slice(
(clampedPage - 1) * pageSize,
clampedPage * pageSize,
);
return {
pagedItems,
clampedPage,
totalPages,
hasPreviousPage: clampedPage > 1,
hasNextPage: clampedPage * pageSize < items.length,
};
}
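The `hasNextPage` test above, `clampedPage * pageSize < items.length`, is equivalent to `clampedPage < totalPages` for any clamped page. A brute-force check of that equivalence (standalone sketch, restating the page math rather than importing the utility):

```typescript
// For every valid page of a list of the given length, verify that the
// count-based next-page test agrees with the page-count-based one.
const nextPageTestsAgree = (length: number, pageSize: number): boolean => {
	const totalPages = Math.max(1, Math.ceil(length / pageSize));
	for (let page = 1; page <= totalPages; page++) {
		const byCount = page * pageSize < length;
		const byPages = page < totalPages;
		if (byCount !== byPages) {
			return false;
		}
	}
	return true;
};
```

The boundary cases worth checking are an empty list (one empty page, no next) and a length that is an exact multiple of the page size (the last page is full but still last).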