Compare commits

...

170 Commits

Author SHA1 Message Date
Cian Johnston e29335e7e1 getting there 2025-10-28 15:19:50 +00:00
Cian Johnston defb8d080a add test with and without preset id 2025-10-23 16:54:29 +01:00
Cian Johnston c6151931b9 fix 2025-10-23 16:50:14 +01:00
Cian Johnston ce5e368f40 more wip 2025-10-23 11:11:36 +01:00
Cian Johnston 884d4b5d0e fix schemas 2025-10-21 21:44:56 +01:00
Cian Johnston f54fee0f72 fix 2025-10-21 21:01:48 +01:00
Cian Johnston e8b436ef61 moving things around 2025-10-21 20:59:18 +01:00
Cian Johnston 0e99594b70 linter finally not angry 2025-10-21 17:36:53 +01:00
Cian Johnston bd669b352d linter fixes 2025-10-21 15:41:00 +01:00
Cian Johnston 7437fdf535 address more linter errors 2025-10-21 13:58:48 +01:00
Cian Johnston ee8e2213b0 linter fixes 2025-10-21 13:00:12 +01:00
Cian Johnston 15175a4780 dry 2025-10-21 12:40:15 +01:00
Cian Johnston ee64d91eac s/jest/mock 2025-10-21 12:32:26 +01:00
Cian Johnston 91bf3cfc28 fix tests 2025-10-21 12:22:09 +01:00
Cian Johnston 03559ade48 replace console.log with core.debug 2025-10-20 17:44:02 +01:00
Cian Johnston 15730d40fc add lint script 2025-10-20 17:43:34 +01:00
Cian Johnston 295869bef1 fixup! wip 2025-10-20 17:17:57 +01:00
Cian Johnston 2fe1353f7a wip 2025-10-20 17:06:21 +01:00
david-fraley f3d950d917 chore: update release calendar (#20351) 2025-10-17 01:11:38 +05:00
Atif Ali ef51e7d07a chore(docs): update numbered lists to be consistent (#20350) 2025-10-16 20:11:18 +00:00
blink-so[bot] 5119db3cd2 feat: underline links in announcement banner for better visibility (#20166)
## Overview

Links in announcement banners are now underlined to make them visually
distinguishable without requiring users to hover over them.

Context:
[Slack](https://codercom.slack.com/archives/C0989BZU23T/p1759503061267819)

## Changes

- Added `text-decoration: underline` to links in the announcement banner
component

---------

Co-authored-by: blink-so[bot] <211532188+blink-so[bot]@users.noreply.github.com>
Co-authored-by: blink-so[bot] <157993532+blink-so[bot]@users.noreply.github.com>
Co-authored-by: Michael Smith <michaelsmith@coder.com>
2025-10-16 15:24:45 -04:00
Zach 9f3b2cddb1 fix(dogfood): revert unpin of containerd.io and pin docker-ce (#20349)
Previously we unpinned the containerd.io package since the dogfood
template image build was failing due to docker-ce requiring a newer
version of containerd.io. The build was fixed, but some dogfood machines
experienced docker-in-docker problems again because their OS version
didn't have the fixed containerd.io packages available.

This PR first reverts the unpinning of the containerd.io package, and
pins the docker-ce major version for compatibility with the
containerd.io package version we have pinned. Newer versions of
docker-ce have a higher minimum required version of containerd.io. While
the underlying issue that caused us to pin containerd.io is fixed in
newer versions, we can't yet remove the pin because some machines
running dogfood have not yet been updated to a Linux version that has
the fixed package available (e.g. Ubuntu 20.04).
2025-10-16 13:04:23 -06:00
Steven Masley a53a5682d5 chore: update coder/guts to v1.6.0 (#20346)
- Removes redundant generic type parameters
- Removes unexported values from generated output
2025-10-16 12:07:17 -05:00
Atif Ali 038e23b82c chore(dogfood): remove extra ENV variable for claude code auth (#20337) 2025-10-16 11:20:36 +00:00
Susana Ferreira 104aa19014 chore(docs): improve prebuild provisioners section (#20321)
## Description

Follow-up from: https://github.com/coder/coder/pull/20305 to include a
note about `coder_workspace_tags` being cumulative and a new step to
validate the status of the prebuild provisioners.
Also fixes step formatting.
2025-10-16 11:22:48 +01:00
Cian Johnston 0faee8e913 feat(coderd): notify on task completion/failure (#20327)
Adds notifications on task transitions to completed or failure state.

Authored by Claude; I reviewed it and it appears legit.
2025-10-16 10:21:08 +01:00
Susana Ferreira 3fa438f80e chore(dogfood): add prebuild coder_workspace_tags (#20308)
## Description

Add a `coder_workspace_tags` data block to the Write Coder on Coder
template that conditionally includes the tag `is_prebuild=true` when the
provisioner job is a prebuild (workspace_build) job.

Related internal [PR](https://github.com/coder/dogfood/pull/201) updated
the dogfood deployment with a dedicated pool of provisioner daemons
configured with this tag.
As a result, prebuild-related jobs are now routed exclusively to this
pool, preventing them from competing with user-initiated workspace jobs.

This has been successfully tested internally in dogfood with template
`ssncf-prebuilds-coder`
2025-10-16 10:08:34 +01:00
Cian Johnston 275602ce61 ci: include coder-provisioner-tagged-prebuilds in deploy.yaml (#20320)
Add reconciliation and rollout for coder-provisioner-tagged-prebuilds
deployment

2025-10-16 09:42:25 +01:00
Dean Sheather 5887867e9b chore: rework wsproxy mesh tests to avoid flakes (#20296)
- Attempts pings twice per replicasync callback in wsproxy
- Reworks the test setup code to be more lenient and retry proxy
registration on failure

Closes coder/internal#957
2025-10-16 18:39:06 +11:00
Asher 41de4ad91a feat: add task send and logs MCP tools (#20230)
Closes https://github.com/coder/internal/issues/776
2025-10-15 13:21:20 -08:00
Danielle Maywood 9bef5de30d chore: upgrade coder/clistat to v1.1.1 (#20322)
coder/clistat v1.1.1 contains a bug fix
2025-10-15 20:23:29 +01:00
yyefimov 1c8ee5cb88 fix(coderd): support string type for oidc response's expires_in json property (#20152)
Some versions of Azure AD return expires_in property as string. Use
json.Number to accept either integer or string and then convert to
int64.
Helpful links:

https://learn.microsoft.com/en-us/answers/questions/2337020/azure-ad-token-endpoint-returns-expires-in-as-stri

https://feedback.azure.com/d365community/idea/7772fd95-26e6-ec11-a81b-0022484ee92d
2025-10-15 17:37:37 +00:00
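The fix described in the commit above can be sketched in Go. This is a minimal illustration, not the actual coder/coder code; the type and function names are hypothetical. Declaring the field as `json.Number` works because its underlying kind is string, so the decoder accepts both a JSON number (`3600`) and a JSON string (`"3600"`, as some Azure AD versions return):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// tokenResponse is a hypothetical slice of an OIDC token payload.
// json.Number accepts both a JSON number and a JSON string containing a
// valid number, covering the inconsistent Azure AD responses.
type tokenResponse struct {
	ExpiresIn json.Number `json:"expires_in"`
}

// parseExpiresIn decodes a token response and converts expires_in to int64.
func parseExpiresIn(raw string) (int64, error) {
	var tr tokenResponse
	if err := json.Unmarshal([]byte(raw), &tr); err != nil {
		return 0, err
	}
	return tr.ExpiresIn.Int64()
}

func main() {
	for _, raw := range []string{`{"expires_in": 3600}`, `{"expires_in": "3600"}`} {
		secs, err := parseExpiresIn(raw)
		if err != nil {
			panic(err)
		}
		fmt.Println(secs) // 3600 in both cases
	}
}
```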
Bruno Quaresma 91d4f8b59b fix: use the selected version to check external auth (#20316)
Fixes https://github.com/coder/coder/issues/20315
2025-10-15 14:24:40 -03:00
Bruno Quaresma 6b990bda52 fix: minor visual fixes in the tasks table (#20317)
- Fix table size for failure and empty states
- Add skeleton for the "Delete" action in the table row

Fixes https://github.com/coder/coder/issues/20281
2025-10-15 14:23:49 -03:00
Bruno Quaresma df738abccd chore: remove MUI icons (#20318) 2025-10-15 14:22:31 -03:00
Susana Ferreira cc4127405b chore(dogfood): update claude-code module (#20191)
## Description

Update `claude_code` module `system_prompt` variable in template "Write
Coder on Coder". Claude-code module now incorporates Coder's inner
system prompt for proper integration with task reporting.

Related to PRs:
* https://github.com/coder/coder/pull/20053
* https://github.com/coder/registry/pull/443 and
https://github.com/coder/registry/pull/461

---------

Co-authored-by: Atif Ali <atif@coder.com>
2025-10-15 18:22:12 +01:00
Mathias Fredriksson 82945cfb16 fix(coderd/database): add missing columns to tasks with status (#20311)
Updates coder/internal#976
2025-10-15 16:34:33 +00:00
Mathias Fredriksson 408b09a1f2 feat(coderd): add audit resource for tasks (#20301)
Updates coder/internal#976
2025-10-15 16:13:59 +00:00
Zach 03d285db26 fix(dogfood): fix dogfood template image build (#20312)
Previously the dogfood action was failing because the image for the
dogfood template pulled a version of docker-ce that depends on a version
of containerd.io greater than the pinned version. The pinned version was
a workaround for an old sysbox issue (see #15723).

Example action I kicked off on main recently that failed:
https://github.com/coder/coder/actions/runs/18507966953
```
4.879 The following packages have unmet dependencies:
4.955  docker-ce : Depends: containerd.io (>= 1.7.27) but 1.7.23-1 is to be installed
4.963 E: Unable to correct problems, you have held broken packages.
```
2025-10-15 10:01:34 -06:00
Cian Johnston ade3fce0f6 fix(coderd): prevent task working notification for first app status (#20313)
Disclaimer: Claude did all of this, reviewed and committed by me.

I find the "task is working" notification straight after creation to be
unnecessary.
Added logic to skip the notification if the first app status is
"working".
2025-10-15 16:41:16 +01:00
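The skip rule in the commit above can be sketched as a small predicate. This is a hypothetical illustration of the described logic, not the actual coderd code; the function name and parameters are assumptions:

```go
package main

import "fmt"

// shouldNotifyTaskWorking sketches the rule described above: suppress the
// "task is working" notification when "working" is the very first app
// status after task creation, since that transition is expected and
// carries no signal.
func shouldNotifyTaskWorking(isFirstStatus bool, state string) bool {
	return !(isFirstStatus && state == "working")
}

func main() {
	fmt.Println(shouldNotifyTaskWorking(true, "working"))  // false: skip it
	fmt.Println(shouldNotifyTaskWorking(false, "working")) // true: notify
}
```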
Dean Sheather 6c99d5eca2 fix: avoid connection logging crashes in agent (#20307)
- Ignore errors when reporting a connection from the server, just log
them instead
- Translate connection log IP `localhost` to `127.0.0.1` on both the
server and the agent

Note that the temporary fix for converting invalid IPs to localhost is
not required in main since the database no longer forbids NULL for the
IP column since https://github.com/coder/coder/pull/19788

Relates to #20194
2025-10-16 01:56:43 +11:00
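The IP translation described in the commit above can be sketched in Go. This is a hypothetical illustration under assumed names (the real agent code differs): the literal hostname "localhost" is not a valid IP, so it is mapped to "127.0.0.1" before parsing to keep connection log entries parseable:

```go
package main

import (
	"fmt"
	"net/netip"
)

// normalizeConnectionIP maps the hostname "localhost" to a loopback
// address before parsing, so a connection log entry always carries a
// valid IP instead of failing to parse.
func normalizeConnectionIP(raw string) (netip.Addr, error) {
	if raw == "localhost" {
		raw = "127.0.0.1"
	}
	return netip.ParseAddr(raw)
}

func main() {
	addr, err := normalizeConnectionIP("localhost")
	if err != nil {
		panic(err)
	}
	fmt.Println(addr) // 127.0.0.1
}
```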
Susana Ferreira 09e2daf282 chore(docs): add external provisioner configuration for prebuilds (#20305)
## Description

Update the Prebuilds troubleshooting page to include a new section,
“Preventing prebuild queue contention (recommended)”, outlining a
best-practice configuration to prevent prebuild jobs from overwhelming
the provisioner queue.

This setup introduces a dedicated prebuild provisioner pool and has been
successfully tested internally in dogfood:
https://github.com/coder/dogfood/pull/201

Closes: https://github.com/coder/coder/issues/20241
2025-10-15 15:34:21 +01:00
Bruno Quaresma 9861931df1 fix: select the correct version when template changes (#20293)
Fix https://github.com/coder/internal/issues/1062
2025-10-15 10:04:17 -03:00
Bruno Quaresma 24dddd56c5 fix: keep button in loading state after start request is made (#20294)
After requesting a workspace start, it may take a while to become ready.
Show a loading state in the meantime.

Fixes https://github.com/coder/coder/issues/20233
2025-10-15 10:03:34 -03:00
Spike Curtis 05b037bdea fix: avoid deadlock race writing to a disconnected mapper (#20303)
fixes https://github.com/coder/internal/issues/1045

Fixes a race condition in our PG Coordinator when a peer disconnects. We issue database queries to find the peer mappings (node structures for each peer connected via a tunnel), and then send these to the "mapper" that generates diffs and eventually writes the update to the websocket.

Before this change we erroneously used the querier's context for this update, which has the same lifetime as the coordinator itself. If the peer has disconnected, the mapper might not be reading from its channel, and this causes a deadlock in a querier worker. This also prevents us from doing any more work on the peer.

I also added some more debug logging that would have been helpful when tracking this down.
2025-10-15 15:56:07 +04:00
Jaayden Halko 3699ff6b48 refactor: migrate TermsOfServiceLink from MUI to shadcn/ui (#20260)
## Summary

Migrate the `TermsOfServiceLink` component from MUI to shadcn/ui

## Changes

- **Replaced** `@mui/material/Link` with the custom `Link` component
from `components/Link/Link` (shadcn/ui)
- **Migrated** Emotion `css` prop to Tailwind utility classes
- **Preserved** external link icon functionality (automatically provided
by shadcn Link component)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-15 12:16:18 +01:00
Rowan Smith e0b1536075 chore: clarify autostop behaviour for existing workspaces (#20295)
[We have tests to ensure this
behaviour](https://github.com/coder/coder/blob/152103bf788dc5a4c06586302a255a212e0fe77d/coderd/workspaces_test.go#L2957-L3004),
but it is not clearly documented. This PR adds a note clarifying that
autostop must be enabled on the template before a workspace is created
for it to apply to that workspace; i.e., it is not re-read when a
workspace is restarted or stopped and started. This behaviour was raised
by a customer.

https://coder.com/docs/user-guides/workspace-scheduling#autostop
2025-10-15 08:46:27 +05:00
Marcin Tojek 8ac54534f4 feat: skip deprecated:false in templates search bar (#20274)
Fixes: https://github.com/coder/coder/issues/19746
2025-10-14 20:58:13 +02:00
Bruno Quaresma 3dd3859ebf chore: add task feedback dialog component (#20252)
Related to https://github.com/coder/coder/issues/20214

This PR aims to add only the FE component for capturing user feedback
from tasks. Once the BE work is completed (in a separate PR), this
component will be triggered after a task is deleted.

The goal is to develop this feature in parallel.

**Screenshot:**
<img width="1206" height="727" alt="Screenshot 2025-10-09 at 14 31 53"
src="https://github.com/user-attachments/assets/1f92026c-8f05-4535-bbd6-85c4b107c037"
/>
2025-10-14 13:47:02 -03:00
Cian Johnston 9f229370e7 feat(coderd/database): add ListTasks query (#20282)
Relates to https://github.com/coder/internal/issues/981

Adds a `ListTasks` query that allows filtering by OwnerID and OrganizationID.
2025-10-14 17:33:30 +01:00
Sas Swart 06db58771f docs: add troubleshooting steps for prebuilt workspaces (#20231)
This PR adds troubleshooting steps to guide Coder operators when they
suspect that prebuilds might have overwhelmed their deployments.

Closes https://github.com/coder/coder/issues/19490

---------

Co-authored-by: Susana Ferreira <susana@coder.com>
2025-10-14 13:20:43 +02:00
Cian Johnston 9c6be5bfe7 ci(.github/workflows/traiage.yaml): explicitly fetch comments in gh cli invocation (#20288)
Closes https://github.com/coder/internal/issues/1053

Updated the `gh` command to fetch issue description with explicit JSON
fields and formatting.

Validated with manual workflow run:
https://github.com/coder/coder/actions/runs/18492815891/job/52690197479

Agent picked up additional instruction from comments:
https://github.com/coder/coder/compare/cian/traiage-gh-20085-18492815891

<img width="1339" height="406" alt="Screenshot 2025-10-14 at 11 02 36"
src="https://github.com/user-attachments/assets/c3daf103-3d88-4f11-b5ee-d3596912aaa8"
/>
2025-10-14 11:25:18 +01:00
Paweł Banaszewski 152103bf78 fix: add default value for RevokeURL property in external auth config for GitHub (#20272)
This PR sets a default value for the `RevokeURL` property of the
external auth config for GitHub.
2025-10-14 09:28:10 +02:00
Danielle Maywood 9158e46bda chore: upgrade coder/clistat to v1.1.0 (#20280)
Upgrades coder/clistat to v1.1.0. This version contains a significant
refactor to the cgroupv2 implementation that improves how we calculate
memory and cpu limits.
2025-10-13 15:11:34 +01:00
dependabot[bot] 5c8176b7ff chore: bump google.golang.org/api from 0.251.0 to 0.252.0 (#20266)
Bumps
[google.golang.org/api](https://github.com/googleapis/google-api-go-client)
from 0.251.0 to 0.252.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-go-client/releases">google.golang.org/api's
releases</a>.</em></p>
<blockquote>
<h2>v0.252.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-go-client/compare/v0.251.0...v0.252.0">0.252.0</a>
(2025-10-07)</h2>
<h3>Features</h3>
<ul>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3326">#3326</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/c9925454087ae745d966061b1e5e091236a23b73">c992545</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3328">#3328</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/b9b915b1d871407fd86912eaa5073dec1dd28cd0">b9b915b</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3330">#3330</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/fcb7723a725ee8eb73e0fb511c1d5b33c25ebd50">fcb7723</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3331">#3331</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/a67f44d16fb8ee506127e2d411b84e708d4fa4df">a67f44d</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3332">#3332</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/b9890fcd362e2b7fe6f2ec4b37257408d5f09edb">b9890fc</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3334">#3334</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/163216d91d6e20831b2cda5400aee0e587cfec18">163216d</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3336">#3336</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/2c4dd78944ebf9f04258ccea6800f57a7cdd8dae">2c4dd78</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/88c288c3544fb39b71e48e545d4284f8ffe0c6a1"><code>88c288c</code></a>
chore(main): release 0.252.0 (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3327">#3327</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/2c4dd78944ebf9f04258ccea6800f57a7cdd8dae"><code>2c4dd78</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3336">#3336</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/ceff1a1fca588d5f1e24d6ca53b8888d0c9175b2"><code>ceff1a1</code></a>
chore(all): update all (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3333">#3333</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/163216d91d6e20831b2cda5400aee0e587cfec18"><code>163216d</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3334">#3334</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/b9890fcd362e2b7fe6f2ec4b37257408d5f09edb"><code>b9890fc</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3332">#3332</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/a67f44d16fb8ee506127e2d411b84e708d4fa4df"><code>a67f44d</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3331">#3331</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/4312741a471f572fe1ee6c0d00c9311265fc3e14"><code>4312741</code></a>
chore: update CODEOWNERS, remove blunderbuss (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3329">#3329</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/fcb7723a725ee8eb73e0fb511c1d5b33c25ebd50"><code>fcb7723</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3330">#3330</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/b9b915b1d871407fd86912eaa5073dec1dd28cd0"><code>b9b915b</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3328">#3328</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/c9925454087ae745d966061b1e5e091236a23b73"><code>c992545</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3326">#3326</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/google-api-go-client/compare/v0.251.0...v0.252.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=google.golang.org/api&package-manager=go_modules&previous-version=0.251.0&new-version=0.252.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 13:30:16 +00:00
dependabot[bot] 3d779c2093 chore: bump google.golang.org/grpc from 1.75.1 to 1.76.0 (#20277)
Bumps [google.golang.org/grpc](https://github.com/grpc/grpc-go) from
1.75.1 to 1.76.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/grpc/grpc-go/releases">google.golang.org/grpc's
releases</a>.</em></p>
<blockquote>
<h2>Release 1.76.0</h2>
<h1>Dependencies</h1>
<ul>
<li>Minimum supported Go version is now 1.24 (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8509">#8509</a>)
<ul>
<li>Special Thanks: <a
href="https://github.com/kevinGC"><code>@​kevinGC</code></a></li>
</ul>
</li>
</ul>
<h1>Bug Fixes</h1>
<ul>
<li>client: Return status <code>INTERNAL</code> when a server sends zero
response messages for a unary or client-streaming RPC. (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8523">#8523</a>)</li>
<li>client: Fail RPCs with status <code>INTERNAL</code> instead of
<code>UNKNOWN</code> upon receiving http headers with status 1xx and
<code>END_STREAM</code> flag set. (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8518">#8518</a>)
<ul>
<li>Special Thanks: <a
href="https://github.com/vinothkumarr227"><code>@​vinothkumarr227</code></a></li>
</ul>
</li>
<li>pick_first: Fix race condition that could cause pick_first to get
stuck in <code>IDLE</code> state on backend address change. (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8615">#8615</a>)</li>
</ul>
<h1>New Features</h1>
<ul>
<li>credentials: Add <code>credentials/jwt</code> package providing
file-based JWT PerRPCCredentials (A97). (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8431">#8431</a>)
<ul>
<li>Special Thanks: <a
href="https://github.com/dimpavloff"><code>@​dimpavloff</code></a></li>
</ul>
</li>
</ul>
<h1>Performance Improvements</h1>
<ul>
<li>client: Improve HTTP/2 header size estimate to reduce
re-allocations. (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8547">#8547</a>)</li>
<li>encoding/proto: Avoid redundant message size calculation when
marshaling. (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8569">#8569</a>)
<ul>
<li>Special Thanks: <a
href="https://github.com/rs-unity"><code>@​rs-unity</code></a></li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/grpc/grpc-go/commit/d96c2ef4f3339142d20a47797d8a5a4fae948607"><code>d96c2ef</code></a>
Change version to 1.76.0 (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8584">#8584</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/79c553c64de01994d8b9dc0dcac6ed765ac7de50"><code>79c553c</code></a>
Cherry pick <a
href="https://redirect.github.com/grpc/grpc-go/issues/8610">#8610</a>,
<a href="https://redirect.github.com/grpc/grpc-go/issues/8615">#8615</a>
to v1.76.x (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8621">#8621</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/0513350812453ffc1fe7fd329817a16fb40a8cfe"><code>0513350</code></a>
client: minor improvements to log messages (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8564">#8564</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/ebaf486eab0fdf28996baf269064f83224538150"><code>ebaf486</code></a>
credentials: implement file-based JWT Call Credentials (part 1 for A97)
(<a
href="https://redirect.github.com/grpc/grpc-go/issues/8431">#8431</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/ca78c904b12dd41257291d6b9ba3309a18f0b277"><code>ca78c90</code></a>
xds/resolver_test: fix flaky test
ResolverBadServiceUpdate_NACKedWithoutCache...</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/83bead40c01c8c5b8407e4573203ab34dec76c78"><code>83bead4</code></a>
internal/buffer: set closed flag when closing channel in the Load method
(<a
href="https://redirect.github.com/grpc/grpc-go/issues/8575">#8575</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/0f45079e3826e866ff0d2034a8732c0a482e3170"><code>0f45079</code></a>
encoding/proto: enable use cached size option (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8569">#8569</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/8420f3ff9ce4617369e054cedb51fda6d45c3340"><code>8420f3f</code></a>
transport: avoid slice reallocation during header creation (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8547">#8547</a>)</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/b36320ef9aa22c1b3eedd607fec388fc61cc6583"><code>b36320e</code></a>
Revert &quot;stats/opentelemetry: record retry attempts from
clientStream (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8342">#8342</a>)&quot;...</li>
<li><a
href="https://github.com/grpc/grpc-go/commit/c1222501e9eeb118d6f0df19fa9387fcb6e5a6a1"><code>c122250</code></a>
stats/opentelemetry: record retry attempts from clientStream (<a
href="https://redirect.github.com/grpc/grpc-go/issues/8342">#8342</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/grpc/grpc-go/compare/v1.75.1...v1.76.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=google.golang.org/grpc&package-manager=go_modules&previous-version=1.75.1&new-version=1.76.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 13:14:01 +00:00
dependabot[bot] a10ca93c42 chore: bump alpine from 3.22.1 to 3.22.2 in /scripts (#20278)
Bumps alpine from 3.22.1 to 3.22.2.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=alpine&package-manager=docker&previous-version=3.22.1&new-version=3.22.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 13:12:57 +00:00
dependabot[bot] 4a373ee6af chore: bump github.com/valyala/fasthttp from 1.66.0 to 1.67.0 (#20273)
Bumps [github.com/valyala/fasthttp](https://github.com/valyala/fasthttp)
from 1.66.0 to 1.67.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/valyala/fasthttp/releases">github.com/valyala/fasthttp's
releases</a>.</em></p>
<blockquote>
<h2>v1.67.0</h2>
<p>Special thanks to the following security researchers who reported the
issues fixed in this release:</p>
<ul>
<li><a href="https://github.com/zer0yu"><code>@​zer0yu</code></a> (Enze
Wang)</li>
<li><a href="https://github.com/P3ngu1nW"><code>@​P3ngu1nW</code></a>
(Jingcheng Yang)</li>
<li><a href="https://github.com/9vvert"><code>@​9vvert</code></a> (Zehui
Miao)</li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>Add DNS cache management methods for TCPDialer by <a
href="https://github.com/aabishkaryal"><code>@​aabishkaryal</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2072">valyala/fasthttp#2072</a></li>
<li>Fix username:password@ validation in urls by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2080">valyala/fasthttp#2080</a></li>
<li>Validate IPv6 addresses in urls by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2079">valyala/fasthttp#2079</a></li>
<li>Validate schemes by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2078">valyala/fasthttp#2078</a></li>
<li>Reject invalid hosts with multiple port delimiters by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2077">valyala/fasthttp#2077</a></li>
<li>Reject backslash absolute URIs and cache parse errors by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2075">valyala/fasthttp#2075</a></li>
<li>Reject bad ipv6 hostnames by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2076">valyala/fasthttp#2076</a></li>
<li>Reimplement flushing support for fasthttpadaptor by <a
href="https://github.com/erikdubbelboer"><code>@​erikdubbelboer</code></a>
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2081">valyala/fasthttp#2081</a></li>
<li>chore(deps): bump securego/gosec from 2.22.8 to 2.22.9 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2073">valyala/fasthttp#2073</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/aabishkaryal"><code>@​aabishkaryal</code></a>
made their first contribution in <a
href="https://redirect.github.com/valyala/fasthttp/pull/2072">valyala/fasthttp#2072</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/valyala/fasthttp/compare/v1.66.0...v1.67.0">https://github.com/valyala/fasthttp/compare/v1.66.0...v1.67.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/valyala/fasthttp/commit/b26ff4866918a81247d8ce1cbfe07c7da63b2940"><code>b26ff48</code></a>
chore(deps): bump golang.org/x/net from 0.44.0 to 0.45.0 (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2084">#2084</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/19624506292b2f05ed08774d1f6c814340efb409"><code>1962450</code></a>
Fix copyTrailer</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/2272d532e154f55d6a9cfe316109d2be850a6331"><code>2272d53</code></a>
Reimplement flushing support for fasthttpadaptor (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2081">#2081</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/a17ec74999fe7a1f9a46d09d644584e73e97f908"><code>a17ec74</code></a>
Reject bad ipv6 hostnames (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2076">#2076</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/f18eb9ef0c366b9ac212e41f4bce3378b215dbf2"><code>f18eb9e</code></a>
Reject backslash absolute URIs and cache parse errors (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2075">#2075</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/bed90bcf091dc56b3098fb57cbbc1ca931d5ea34"><code>bed90bc</code></a>
Reject invalid hosts with multiple port delimiters (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2077">#2077</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/d3fc68239107eb85a36b3baf0b9ca873cdf9b95f"><code>d3fc682</code></a>
Validate schemes (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2078">#2078</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/af41f54adbd2b47323ae0b1caf76e79cb3c2e824"><code>af41f54</code></a>
Validate IPv6 addresses in urls (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2079">#2079</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/75dcdb8bba3eac0eb091a68a5b401cf9755be0df"><code>75dcdb8</code></a>
Fix username:password@ validation in urls (<a
href="https://redirect.github.com/valyala/fasthttp/issues/2080">#2080</a>)</li>
<li><a
href="https://github.com/valyala/fasthttp/commit/ede09fad738b7c16784e37064f49f89c960ecbb0"><code>ede09fa</code></a>
Limit FuzzTestHeaderScanner body size</li>
<li>Additional commits viewable in <a
href="https://github.com/valyala/fasthttp/compare/v1.66.0...v1.67.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/valyala/fasthttp&package-manager=go_modules&previous-version=1.66.0&new-version=1.67.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 12:16:05 +00:00
Paweł Banaszewski 847058c56c fix: set default values for RevokeURL property in external auth configs (#20270)
This PR adds logic that sets default values for `RevokeURL` in external
auth configs.
2025-10-13 14:04:08 +02:00
dependabot[bot] 5ab72cce63 chore: bump github.com/gofrs/flock from 0.12.1 to 0.13.0 (#20269)
Bumps [github.com/gofrs/flock](https://github.com/gofrs/flock) from
0.12.1 to 0.13.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gofrs/flock/releases">github.com/gofrs/flock's
releases</a>.</em></p>
<blockquote>
<h2>v0.13.0</h2>
<h2>What's Changed</h2>
<p>Minimum Go version 1.24</p>
<ul>
<li>feat: add Stat method by <a
href="https://github.com/ferhatelmas"><code>@​ferhatelmas</code></a> in
<a
href="https://redirect.github.com/gofrs/flock/pull/127">gofrs/flock#127</a></li>
<li>chore(deps): bump golang.org/x/sys from 0.22.0 to 0.37.0</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gofrs/flock/compare/v0.12.1...v0.13.0">https://github.com/gofrs/flock/compare/v0.12.1...v0.13.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/gofrs/flock/commit/bfec60bb026171031050af631b6dec974f14e9fa"><code>bfec60b</code></a>
chore(deps): bump golang.org/x/sys from 0.36.0 to 0.37.0 in the gomod
group (...</li>
<li><a
href="https://github.com/gofrs/flock/commit/7094284415ad11369be4662a7c12be25963b4ea5"><code>7094284</code></a>
chore: update linter</li>
<li><a
href="https://github.com/gofrs/flock/commit/8111aec69ca1501f26bb5198ed02673e87806e65"><code>8111aec</code></a>
feat: add Stat method (<a
href="https://redirect.github.com/gofrs/flock/issues/127">#127</a>)</li>
<li><a
href="https://github.com/gofrs/flock/commit/6f0f0ed4e14d546b238ae500710aba38b924e135"><code>6f0f0ed</code></a>
chore(deps): bump the github-actions group with 4 updates (<a
href="https://redirect.github.com/gofrs/flock/issues/126">#126</a>)</li>
<li><a
href="https://github.com/gofrs/flock/commit/fe44231e563ec57fda028bc2484140fb1f24a6d1"><code>fe44231</code></a>
chore(deps): bump golang.org/x/sys from 0.35.0 to 0.36.0 in the gomod
group (...</li>
<li><a
href="https://github.com/gofrs/flock/commit/f74f0fb0332646c6b3730bfe9cce6fc0badc52c6"><code>f74f0fb</code></a>
chore(deps): bump github.com/stretchr/testify from 1.10.0 to 1.11.1 in
the go...</li>
<li><a
href="https://github.com/gofrs/flock/commit/c1f6d161c8e3b29a4d612e34ff17b37d00d4cd2f"><code>c1f6d16</code></a>
chore(deps): bump golang.org/x/sys from 0.34.0 to 0.35.0 in the gomod
group (...</li>
<li><a
href="https://github.com/gofrs/flock/commit/c542c57ff5f6af1d62b6864144170b612731796a"><code>c542c57</code></a>
chore(deps): bump github/codeql-action from 3.29.2 to 3.29.5 in the
github-ac...</li>
<li><a
href="https://github.com/gofrs/flock/commit/425570ba9b698b04bb9506c4906f137fb34ac7e0"><code>425570b</code></a>
chore(deps): bump golang.org/x/sys from 0.33.0 to 0.34.0 in the gomod
group (...</li>
<li><a
href="https://github.com/gofrs/flock/commit/12753ea298e1aeb97f0881ff3fc07eabcb2b86e5"><code>12753ea</code></a>
chore(deps): bump github/codeql-action from 3.28.18 to 3.29.2 in the
github-a...</li>
<li>Additional commits viewable in <a
href="https://github.com/gofrs/flock/compare/v0.12.1...v0.13.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/gofrs/flock&package-manager=go_modules&previous-version=0.12.1&new-version=0.13.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 12:02:14 +00:00
dependabot[bot] fb86841071 chore: bump github.com/brianvoe/gofakeit/v7 from 7.7.1 to 7.8.0 (#20265)
Bumps
[github.com/brianvoe/gofakeit/v7](https://github.com/brianvoe/gofakeit)
from 7.7.1 to 7.8.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/6af983166ac57e4b39f11c665273d6e61339637d"><code>6af9831</code></a>
emoji - cleanup, organization and new function additions</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/1399ba0ab2177c19f83432e20a619c26be517a35"><code>1399ba0</code></a>
generate - improve generate performance</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/3840eaf8c61407e57a8e14382c0e81dfbde15402"><code>3840eaf</code></a>
internet - added url safe slug</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/1332b941476e911422b8647d8d6b331de703513f"><code>1332b94</code></a>
airline - added airline functions</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/ae44694fb1f7b11f8687ec0b292961e462ceb847"><code>ae44694</code></a>
readme - updated to reflect text updates</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/e5a4d526a4e615eaa77b8361ceae4d0ba7ce71b1"><code>e5a4d52</code></a>
lorem ipsum - simplified usage and directly use sentence gen</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/3890a67dc9728b7f8a0c0e390eab8e91e455eec1"><code>3890a67</code></a>
sentence - updated usage across repo</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/ab13d69433d91d6521002017f30ee1c5b17893a0"><code>ab13d69</code></a>
hipster - updated to uppercase first character</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/7b7ddf7db30dcdd4c562fdcb1c59049b29bb6641"><code>7b7ddf7</code></a>
hipster - updated to uppercase first character</li>
<li><a
href="https://github.com/brianvoe/gofakeit/commit/5e08f261402527fdba32e9f74187809cb3a5987b"><code>5e08f26</code></a>
template - update test usage</li>
<li>Additional commits viewable in <a
href="https://github.com/brianvoe/gofakeit/compare/v7.7.1...v7.8.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/brianvoe/gofakeit/v7&package-manager=go_modules&previous-version=7.7.1&new-version=7.8.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 11:57:31 +00:00
dependabot[bot] 51b0767429 chore: bump rust from 1219c0b to e4ae8ab in /dogfood/coder (#20267)
Bumps rust from `1219c0b` to `e4ae8ab`.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=rust&package-manager=docker&previous-version=slim&new-version=slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 11:54:09 +00:00
Mathias Fredriksson a8f87c2625 feat(coderd): implement task to app linking (#20237)
This change adds workspace build/agent/app linking to tasks and wires it
into `wsbuilder` and `provisionerdserver`.

Closes coder/internal#948
Closes coder/coder#20212
Closes coder/coder#19773
2025-10-13 12:57:06 +03:00
Mathias Fredriksson 5dc57da6b4 fix(coderd/database): ensure task name uniqueness (#20236)
This change ensures task names are unique per user, the same way we do
for workspaces. Without this, we could create tasks that are impossible
to start because another task with the same name creates a workspace
name conflict.

Updates coder/internal#948
Supersedes coder/coder#20212
2025-10-13 12:42:38 +03:00
Mathias Fredriksson 952c69f412 feat(coderd/database): add task status and status view (#20235)
This change updates the `task_workspace_apps` table structure for
improved linking to workspace builds and adds queries to manage tasks
and a view to expose task status.

Updates coder/internal#948
Supersedes coder/coder#20212
Supersedes coder/coder#19773
2025-10-13 12:25:58 +03:00
Mathias Fredriksson 299a54a99b feat(coderd): add tasks rbac object (#20234)
This change adds RBAC for tasks.

Updates coder/internal#948
Supersedes coder/coder#20212
2025-10-13 12:02:22 +03:00
dependabot[bot] d9f95f2285 chore: bump coder/git-clone/coder from 1.1.1 to 1.1.2 in /dogfood/coder (#20264)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coder/git-clone/coder&package-manager=terraform&previous-version=1.1.1&new-version=1.1.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 00:31:27 +00:00
dependabot[bot] dd85d1fdd8 chore: bump coder/claude-code/coder from 3.0.1 to 3.1.0 in /dogfood/coder (#20263)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coder/claude-code/coder&package-manager=terraform&previous-version=3.0.1&new-version=3.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 00:31:19 +00:00
Spike Curtis 2ded9d6a73 test: remove external compilation of db cleaner entirely (#20240)
fixes https://github.com/coder/internal/issues/1026

Through a (perhaps too-)clever hack of `init()` functions, I've managed to remove the need to separately compile the cleaner binary. This should fix the flakes we are seeing where the binary compilation takes tens of seconds on macOS. The cleaner is incorporated directly into the test binary and we self-exec as the subprocess.
2025-10-10 10:02:03 +04:00
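The self-exec trick described in this commit is a common Go testing pattern. A minimal sketch (names and messages are hypothetical, not the actual coder/coder implementation): an `init()` hook checks an environment variable and, when set, runs the cleaner logic and exits before any tests start, so the test binary itself can be re-executed as the cleaner subprocess.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

const cleanerEnv = "DBTEST_RUN_CLEANER"

func init() {
	// When this binary is re-executed with the variable set, behave as
	// the cleaner subprocess and exit before anything else runs.
	if os.Getenv(cleanerEnv) == "1" {
		fmt.Println("cleaning up test databases")
		os.Exit(0)
	}
}

// spawnCleaner re-executes the current binary as the cleaner subprocess,
// avoiding a separate compilation step entirely.
func spawnCleaner() error {
	cmd := exec.Command(os.Args[0])
	cmd.Env = append(os.Environ(), cleanerEnv+"=1")
	cmd.Stdout = os.Stdout
	return cmd.Run()
}

func main() {
	if err := spawnCleaner(); err != nil {
		panic(err)
	}
	fmt.Println("parent: cleaner subprocess finished")
}
```

Because `init()` runs before `main()` (and before any test in a `_test.go` file), the child process never re-enters the test suite.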
ケイラ 97cb19eeb5 chore: replace nexus-repository icon with an actual svg (#20251) 2025-10-09 13:55:55 -06:00
ケイラ caeff49aba chore: refactor roles to support multiple permission sets scoped by org id (#20186)
In preparation for adding the "member" permission level, which will also
be grouped by org ID, do a bit of a refactor to make room for it and the
existing "org" level to live in the same `map`
2025-10-09 11:08:34 -06:00
Hugo Dutka 6213b30f10 fix: replace http.DefaultClient in telemetry with dedicated client (#20247)
Replaces a call to the http default client in telemetry.go. Looks like
it was missed in https://github.com/coder/coder/pull/20128.

Related to https://github.com/coder/internal/issues/1020 and
https://github.com/coder/internal/issues/645.
2025-10-09 18:35:09 +02:00
Danielle Maywood 5d66f1f027 fix(provisioner): use sidebar app id instead of app id (#20246)
In a previous PR we set SidebarApp.Id equal to the appID instead of the
sidebarAppID.
2025-10-09 16:41:40 +01:00
Bruno Quaresma 8f2394c256 chore: replace MUI tables (#20201)
I tried to break this work into smaller pieces, but since there are a
lot of dependent components, I decided to handle it in one larger chunk
and rely on Storybook to catch any bugs.

That said, let me know if you’d prefer a different approach!
2025-10-09 10:09:37 -03:00
Thomas Kosiewski ed90ecf00e feat: add allow_list to resource-scoped API tokens (#19964)
# Add API key allow_list for resource-scoped tokens

This PR adds support for API key allow lists, enabling tokens to be scoped to specific resources. The implementation:

1. Adds a new `allow_list` field to the `CreateTokenRequest` struct, allowing clients to specify resource-specific scopes when creating API tokens
2. Implements `APIAllowListTarget` type to represent resource targets in the format `<type>:<id>` with support for wildcards
3. Adds validation and normalization logic for allow lists to handle wildcards and deduplication
4. Integrates with RBAC by creating an `APIKeyEffectiveScope` that merges API key scopes with allow list restrictions
5. Updates API documentation and TypeScript types to reflect the new functionality

This feature enables creating tokens that are limited to specific resources (like workspaces or templates) by ID, making it possible to create more granular API tokens with limited access.
2025-10-09 14:53:08 +02:00
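The `<type>:<id>` normalization described in point 3 can be sketched as follows (a hypothetical illustration of the format, not the actual `APIAllowListTarget` implementation):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// normalizeAllowList validates entries of the form "<type>:<id>",
// removes duplicates, and collapses the list to a single full wildcard
// when "*:*" is present, since it subsumes every other entry.
func normalizeAllowList(entries []string) ([]string, error) {
	seen := map[string]bool{}
	var out []string
	for _, e := range entries {
		parts := strings.SplitN(e, ":", 2)
		if len(parts) != 2 || parts[0] == "" || parts[1] == "" {
			return nil, fmt.Errorf("invalid allow-list entry %q", e)
		}
		if e == "*:*" {
			return []string{"*:*"}, nil
		}
		if !seen[e] {
			seen[e] = true
			out = append(out, e)
		}
	}
	sort.Strings(out)
	return out, nil
}

func main() {
	got, err := normalizeAllowList([]string{"workspace:123", "workspace:123", "template:*"})
	if err != nil {
		panic(err)
	}
	fmt.Println(got) // [template:* workspace:123]
}
```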
Danielle Maywood f31e6e09ba chore(provisioner): support updated coder_ai_task resource (#20160)
Closes https://github.com/coder/internal/issues/978

- Introduce `CODER_TASK_ID` and `CODER_TASK_PROMPT` to the provisioner
environment
- Make use of new `app_id` field in provider, with a fallback to
`sidebar_app.id` for backwards compatibility

**For now** I've left the `taskPrompt` and `taskID` as a TODO as we do
not yet create these values.
2025-10-09 10:42:01 +01:00
Dean Sheather 8942b50872 fix: allow unhanger to unhang dormant workspaces (#20229) 2025-10-09 17:33:57 +11:00
Rishi Mondal c84ab728a8 chore: add nexus-repository icon (#20227) 2025-10-08 16:24:13 -06:00
Jaayden Halko 9935da86c6 chore: migrate appearanceform to Tailwind and shadcn (#20204) 2025-10-08 21:13:04 +01:00
Bruno Quaresma 4b8d73994f chore: migrate MUI icons to lucide-react (#20193) 2025-10-08 15:01:01 -03:00
DevCats e5d7f43c51 chore: add auto-dev-server-icon and set to monochromatic (#20219) 2025-10-08 12:58:56 -05:00
Bruno Quaresma 55be304e53 chore: replace remaining MUI buttons (#20200) 2025-10-08 13:39:47 -03:00
Jiachen Jiang 79736154db docs: add documentation for upcoming Agent Boundary feature (#20099)
## PR Description
tbd @jcjiang 

See a preview at:
https://coder.com/docs/@boundaries-docs/ai-coder/agent-boundary

---------

Co-authored-by: David Fraley <davidiii@fraley.us>
Co-authored-by: david-fraley <67079030+david-fraley@users.noreply.github.com>
2025-10-08 09:14:14 -07:00
david-fraley 6c5b741bed docs: list tasks CLI docs in manifest.json (#20220) 2025-10-08 17:32:21 +02:00
Cian Johnston fa82f841c7 chore(docs): add documentation for exp CLI commands (#20019)
Updates task documentation with experimental CLI.
~Generated by Claude using `--help` output.~

Should be merged alongside https://github.com/coder/coder/pull/20020
2025-10-08 13:43:44 +01:00
Spike Curtis 5807fe01e4 test: prevent TestAgent_ReconnectingPTY connection reporting check from interfering (#20210)
When we added support for connection tracking in the Workspace agent, we modified the ReconnectingPTY tests to add an initial connection that we immediately hang up and check that connections are logged.

In the case of `screen`-based pty handling, hanging up the initial connection can race with the initial attachment to the `screen` process and cause that process to exit early. This causes subsequent connections to the same session ID to fail.

In this PR we just use different pty session IDs so that the initial connections we do to verify logging don't interfere with the rest of the test.

_Arguably_ it's a bug in our Reconnecting PTY code that hanging up immediately can leave the system in a weird state, but we do eventually recover and error out, so I don't think it's worth trying to fix.
2025-10-08 16:23:46 +04:00
Spike Curtis e2076beb9f test: change to explicitly compiling dbcleaner on posix (#20206)
relates to: https://github.com/coder/internal/issues/1026

On POSIX systems (macOS and Linux) we compile the dbcleaner binary into a temp directory. This allows us to explicitly separate the compilation step and report the time it takes. We suspect this might be a contributing factor in the above-linked flakes we see on macOS.

This doesn't work on Windows because Go tests clean up the temp directory at the end of the test and the dbcleaner binary will still be executing. On Windows you cannot delete a file being executed nor the directory. However, we are not seeing any flakes on Windows so the old behavior seems to be OK.
2025-10-08 16:18:02 +04:00
Spike Curtis f1d3f31c10 test: increase timeout in TestRun (scaletest/workspaceupdates) (#20211)
fixes https://github.com/coder/internal/issues/1050

Extends the timeout on the affected test. It uses a postgres database, and a 15s timeout is not enough to avoid flaking with our test infra these days.
2025-10-08 16:17:41 +04:00
Mathias Fredriksson 46e242e444 test(coderd/database/dbtestutil): add debug output to pg_dump failure (#20209) 2025-10-08 14:22:35 +03:00
Sas Swart 544f15523c fix: adjust workspace claims to be initiated by users (#20179)
The prebuilds user never initiates a workspace claim autonomously. A
claim can only happen when a user attempts to create a workspace. When
listing prebuild provisioner jobs, it would not make sense to see jobs
related to users who are creating workspaces and have gotten a prebuilt
workspace. When cleaning up an overwhelmed provisioner queue, we should
not delete claims as they have humans waiting for them and are not part
of the thundering herd.

Therefore, this PR ensures that provisioner jobs that claim workspaces
are considered to be initiated by the user, not the prebuilds system.
2025-10-08 10:40:54 +02:00
Atif Ali 037e6f06f5 docs: fix link to Grafana dashboard example for AI Bridge (#20205) 2025-10-08 08:25:18 +00:00
Mathias Fredriksson 8274251fe1 fix(.devcontainer): check if ssh folder exists in post_create (#19987) 2025-10-08 07:24:13 +00:00
Cian Johnston 63631b5b2b chore(coderd): aitasks: add internal-only api doc comments (#20020)
Adds api doc comments calling out experimental status.

Should be merged alongside https://github.com/coder/coder/pull/20019
2025-10-08 08:20:20 +01:00
Stephen Kirby d0f434b672 feat(docs): add bridge documentation for early access (#20188) 2025-10-07 22:36:05 -05:00
david-fraley be22c38161 docs: update release versions in docs (#20196) 2025-10-07 20:46:47 +00:00
Spike Curtis 65335bc7d4 feat: add cli command scaletest dynamic-parameters (#20034)
part of https://github.com/coder/internal/issues/912

Adds CLI command `coder exp scaletest dynamic-parameters`

I've left out the configuration of tracing and timeouts for now. I think I want to do some refactoring of the scaletest CLI to make handling those flags require less boilerplate.

I will add tracing and timeout flags in a follow up PR.
2025-10-07 21:53:59 +04:00
Dean Sheather 0e0f0925e4 fix: use raw SVG for logo on static error page (#20189)
Relates to #20185, #20029, #18878
2025-10-08 00:48:17 +11:00
Mathias Fredriksson 057d7dacdc chore(coderd/database/queries): remove trailing whitespace (#20192) 2025-10-07 13:10:38 +00:00
Susana Ferreira 6b72ef8b18 chore(docs): update notifications documentation to include task events (#20190)
## Description

Update notifications documentation to include Task Events introduced in
PR: https://github.com/coder/coder/pull/19965
2025-10-07 11:32:44 +01:00
Kacper Sawicki 0c2eca94f5 feat(cli): add notifications scaletest command (#20092)
Closes https://github.com/coder/internal/issues/984
2025-10-07 10:33:52 +02:00
Kacper Sawicki 05f8f67ced feat(scaletest): add runner for notifications delivery (#20091)
Relates to https://github.com/coder/internal/issues/910

This PR adds a scaletest runner that simulates users receiving notifications through WebSocket connections.

An instance of this notification runner does the following:

1. Creates a user (optionally with specific roles like owner).
2. Connects to /api/v2/notifications/inbox/watch via WebSocket to receive notifications in real-time.
3. Waits for all other concurrently executing runners (per the DialBarrier WaitGroup) to also connect their websockets.
4. For receiving users: Watches the WebSocket for expected notifications and records delivery latency for each notification type.
5. For regular users: Maintains WebSocket connections to simulate concurrent load while receiving users wait for notifications.
6. Waits on the ReceivingWatchBarrier to coordinate between receiving and regular users.
7. Cleans up the created user after the test completes.


Exposes three prometheus metrics:

1. notification_delivery_latency_seconds - HistogramVec. Labels = {username, notification_type}
2. notification_delivery_errors_total - CounterVec. Labels = {username, action}
3. notification_delivery_missed_total - CounterVec. Labels = {username}

The runner measures end-to-end notification latency from when a notification-triggering event occurs (e.g., user creation/deletion) to when the notification is received by a WebSocket client.
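The DialBarrier coordination in steps 3 and 4 above can be sketched with a `sync.WaitGroup` used as a one-shot barrier, as the commit describes. This is a minimal stdlib-only illustration, not the actual scaletest runner code:

```go
package main

import (
	"fmt"
	"sync"
)

// runAll launches n runners that each "connect" their websocket and then
// wait on a shared DialBarrier until every other runner has connected,
// before starting to watch for notifications. It returns how many runners
// made it past the barrier.
func runAll(n int) int {
	var dialBarrier sync.WaitGroup
	dialBarrier.Add(n)
	done := make(chan struct{}, n)
	for i := 0; i < n; i++ {
		go func() {
			// ... dial /api/v2/notifications/inbox/watch here ...
			dialBarrier.Done() // signal: this runner's websocket is connected
			dialBarrier.Wait() // block until every runner is connected
			done <- struct{}{}
		}()
	}
	passed := 0
	for i := 0; i < n; i++ {
		<-done
		passed++
	}
	return passed
}

func main() {
	fmt.Println(runAll(4)) // 4 — all runners pass the barrier together
}
```

Only once every runner has dialed does any runner proceed, so latency measurements are taken under the full intended concurrent load.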
2025-10-07 09:59:15 +02:00
Michael Smith 156f985fb0 fix(site): update useClipboard to work better with effect logic (#20183)
## Changes made
- Updated `useClipboard` API to require passing the text in via the
`copyToClipboard` function, rather than requiring that the text gets
specified in render logic
- Ensured that the `copyToClipboard` function always stays stable across
all React lifecycles
- Updated all existing uses to use the new function signatures
- Updated all tests and added new cases
2025-10-06 15:41:57 -04:00
ケイラ 09f69ef74f chore: update logo on error page (#20185) 2025-10-06 12:14:35 -06:00
Bruno Quaresma 83c4611293 chore: revert "feat: include latest build id in task responses" (#20184)
Reverts coder/coder#20181

I realized we don’t need this in the task response. When loading a task,
we already need much more workspace information, so it makes more sense
to fetch the workspace data separately instead of trying to embed all
its details into the response.

I think we can keep the task response clean and focused on the essential
information needed to list tasks. For more specific details, we can
fetch the related resources as needed. So, I’m reverting this PR.
2025-10-06 14:44:11 -03:00
Bruno Quaresma 23a44d10ac feat: include latest build id in task responses (#20181)
Adding the latest build ID is necessary to locate the logs associated
with the tasks in the UI.
2025-10-06 13:12:56 -03:00
Bruno Quaresma 1783ee13ab chore: fix biome error when running make lint (#20182)
Fixes a Biome lint error when running `make lint`.

```
> biome check --error-on-warnings --fix .

biome.jsonc:6:13 deserialize ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

  ℹ The configuration schema version does not match the CLI version 2.2.4
  
    4 │                 "includes": ["!e2e/**/*Generated.ts"]
    5 │         },
  > 6 │         "$schema": "https://biomejs.dev/schemas/2.2.0/schema.json"
      │                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    7 │ }
    8 │ 
  
  ℹ   Expected:                     2.2.4
      Found:                        2.2.0
  
  
  ℹ Run the command biome migrate to migrate the configuration file.
  

Checked 1165 files in 796ms. No fixes applied.
```
2025-10-06 11:51:56 -03:00
Bruno Quaresma 33fbb174d5 chore: set biome lsp bin path for vscode (#20180) 2025-10-06 11:51:44 -03:00
Bruno Quaresma 840afb225b feat: select template version for tasks (#20146)
Allows users with `updateTemplates` permission to select a specific
template version when creating a task.

One of the biggest changes was moving `TaskPrompt` into `modules/task`
and adding a Storybook entry for it. I also moved some stories from
`TasksPage` to simplify its story.

<img width="1208" height="197" alt="Screenshot 2025-10-02 at 12 09 06"
src="https://github.com/user-attachments/assets/b85d2723-bb52-442b-b8eb-36721944a653"
/>

Closes https://github.com/coder/coder/issues/19986
2025-10-06 10:23:53 -03:00
dependabot[bot] 3f49e28308 chore: bump github.com/coreos/go-oidc/v3 from 3.15.0 to 3.16.0 (#20178)
Bumps [github.com/coreos/go-oidc/v3](https://github.com/coreos/go-oidc)
from 3.15.0 to 3.16.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/coreos/go-oidc/releases">github.com/coreos/go-oidc/v3's
releases</a>.</em></p>
<blockquote>
<h2>v3.16.0</h2>
<h2>What's Changed</h2>
<ul>
<li>refactor: Remove unused time injection from RemoteKeySet by <a
href="https://github.com/ponimas"><code>@​ponimas</code></a> in <a
href="https://redirect.github.com/coreos/go-oidc/pull/466">coreos/go-oidc#466</a></li>
<li>bump go to 1.24, remove 1.23 support, bump go-jose dependency,
remove x/net dependency by <a
href="https://github.com/wardviaene"><code>@​wardviaene</code></a> in <a
href="https://redirect.github.com/coreos/go-oidc/pull/467">coreos/go-oidc#467</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/wardviaene"><code>@​wardviaene</code></a> made
their first contribution in <a
href="https://redirect.github.com/coreos/go-oidc/pull/467">coreos/go-oidc#467</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/coreos/go-oidc/compare/v3.15.0...v3.16.0">https://github.com/coreos/go-oidc/compare/v3.15.0...v3.16.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/coreos/go-oidc/commit/e9584733f8bb6c4683d1e98b4fb22eee121f7dff"><code>e958473</code></a>
bump go to 1.24, remove 1.23 support, bump go-jose dependency, remove
x/net d...</li>
<li><a
href="https://github.com/coreos/go-oidc/commit/69b167061fdb7270ef965f150ea6aabe11678728"><code>69b1670</code></a>
refactor: Remove unused time injection from RemoteKeySet</li>
<li>See full diff in <a
href="https://github.com/coreos/go-oidc/compare/v3.15.0...v3.16.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/coreos/go-oidc/v3&package-manager=go_modules&previous-version=3.15.0&new-version=3.16.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 11:59:26 +00:00
dependabot[bot] 5fdc4185c7 chore: bump github.com/moby/moby from 28.4.0+incompatible to 28.5.0+incompatible (#20177)
Bumps [github.com/moby/moby](https://github.com/moby/moby) from
28.4.0+incompatible to 28.5.0+incompatible.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/moby/moby/releases">github.com/moby/moby's
releases</a>.</em></p>
<blockquote>
<h2>v28.5.0</h2>
<h1>28.5.0</h1>
<p>For a full list of pull requests and changes in this release, refer
to the relevant GitHub milestones:</p>
<ul>
<li><a
href="https://github.com/docker/cli/issues?q=is%3Aclosed+milestone%3A28.5.0">docker/cli,
28.5.0 milestone</a></li>
<li><a
href="https://github.com/moby/moby/issues?q=is%3Aclosed+milestone%3A28.5.0">moby/moby,
28.5.0 milestone</a></li>
<li>Deprecated and removed features, see <a
href="https://github.com/docker/cli/blob/v28.5.0/docs/deprecated.md">Deprecated
Features</a>.</li>
<li>Changes to the Engine API, see <a
href="https://github.com/moby/moby/blob/v28.5.0/docs/api/version-history.md">API
version history</a>.</li>
</ul>
<h3>Bug fixes and enhancements</h3>
<ul>
<li>Don't print warnings in <code>docker info</code> for broken symlinks
in CLI-plugin directories. <a
href="https://redirect.github.com/docker/cli/pull/6476">docker/cli#6476</a></li>
<li>Fix a panic during <code>stats</code> on empty event
<code>Actor.ID</code>. <a
href="https://redirect.github.com/docker/cli/pull/6471">docker/cli#6471</a></li>
</ul>
<h3>Packaging updates</h3>
<ul>
<li>Remove support for legacy CBC cipher suites. <a
href="https://redirect.github.com/docker/cli/pull/6474">docker/cli#6474</a></li>
<li>Update Buildkit to <a
href="https://github.com/moby/buildkit/releases/tag/v0.25.0">v0.25.0</a>.
<a
href="https://redirect.github.com/moby/moby/pull/51075">moby/moby#51075</a></li>
<li>Update Dockerfile syntax to <a
href="https://github.com/moby/buildkit/releases/tag/dockerfile%2F1.19.0">v1.19.0</a>.
<a
href="https://redirect.github.com/moby/moby/pull/51075">moby/moby#51075</a></li>
</ul>
<h3>Networking</h3>
<ul>
<li>Eliminated harmless warning about deletion of
<code>endpoint_count</code> from the data store. <a
href="https://redirect.github.com/moby/moby/pull/51064">moby/moby#51064</a></li>
<li>Fix a bug causing IPAM plugins to not be loaded on Windows. <a
href="https://redirect.github.com/moby/moby/pull/51035">moby/moby#51035</a></li>
</ul>
<h3>API</h3>
<ul>
<li>Deprecate support for kernel memory TCP accounting
(<code>KernelMemoryTCP</code>). <a
href="https://redirect.github.com/moby/moby/pull/51067">moby/moby#51067</a></li>
<li>Fix <code>GET containers/{name}/checkpoints</code> returning
<code>null</code> instead of empty JSON array when there are no
checkpoints. <a
href="https://redirect.github.com/moby/moby/pull/51052">moby/moby#51052</a></li>
</ul>
<h3>Go SDK</h3>
<ul>
<li>cli-plugins/plugin: Run: allow customizing the CLI. <a
href="https://redirect.github.com/docker/cli/pull/6481">docker/cli#6481</a></li>
<li>cli/command: add <code>WithUserAgent</code> option. <a
href="https://redirect.github.com/docker/cli/pull/6477">docker/cli#6477</a></li>
</ul>
<h3>Deprecations</h3>
<ul>
<li>Go-SDK: cli/command: deprecate <code>DockerCli.Apply</code>. This
method is no longer used and will be removed in the next release if
there are no remaining uses. <a
href="https://redirect.github.com/docker/cli/pull/6497">docker/cli#6497</a></li>
<li>Go-SDK: cli/command: deprecate
<code>DockerCli.ContentTrustEnabled</code>. This method is no longer
used and will be removed in the next release. <a
href="https://redirect.github.com/docker/cli/pull/6495">docker/cli#6495</a></li>
<li>Go-SDK: cli/command: deprecate
<code>DockerCli.DefaultVersion</code>. This method is no longer used and
will be removed in the next release. <a
href="https://redirect.github.com/docker/cli/pull/6491">docker/cli#6491</a></li>
<li>Go-SDK: cli/command: deprecate <code>ResolveDefaultContext</code>
utility. <a
href="https://redirect.github.com/docker/cli/pull/6529">docker/cli#6529</a></li>
<li>Go-SDK: cli/command: deprecate <code>WithContentTrustFromEnv</code>,
<code>WithContentTrust</code> options. These options were used
internally, and will be removed in the next release.. <a
href="https://redirect.github.com/docker/cli/pull/6489">docker/cli#6489</a></li>
<li>Go-SDK: cli/manifest/store: deprecate <code>IsNotFound()</code>. <a
href="https://redirect.github.com/docker/cli/pull/6514">docker/cli#6514</a></li>
<li>Go-SDK: templates: deprecate NewParse() function. <a
href="https://redirect.github.com/docker/cli/pull/6469">docker/cli#6469</a></li>
</ul>
<h2>v28.5.0-rc.1</h2>
<h2>28.5.0-rc.1</h2>
<p>For a full list of pull requests and changes in this release, refer
to the relevant GitHub milestones:</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/moby/moby/commit/cd048300a48700294339c9c91d2dcc691cb8f63b"><code>cd04830</code></a>
Merge pull request <a
href="https://redirect.github.com/moby/moby/issues/51075">#51075</a>
from vvoland/51074-28.x</li>
<li><a
href="https://github.com/moby/moby/commit/e29d6be7a5126c4677a974cac3b9610e5157c780"><code>e29d6be</code></a>
vendor: github.com/moby/buildkit v0.25.0</li>
<li><a
href="https://github.com/moby/moby/commit/9b4369035bed7dfdc3799484159ca4ed09e04e8e"><code>9b43690</code></a>
Merge pull request <a
href="https://redirect.github.com/moby/moby/issues/51069">#51069</a>
from thaJeztah/28.x_backport_docs_rm_deprecated_vir...</li>
<li><a
href="https://github.com/moby/moby/commit/4f3572596b4db81e89aa69cbee8741591d77ad47"><code>4f35725</code></a>
api: swagger: remove VirtualSize fields for API &gt; v1.43</li>
<li><a
href="https://github.com/moby/moby/commit/79f310d4bc0272183fdd9fed87b0a6881c185ca2"><code>79f310d</code></a>
Merge pull request <a
href="https://redirect.github.com/moby/moby/issues/51067">#51067</a>
from austinvazquez/cherry-pick-deprecate-kernel-mem...</li>
<li><a
href="https://github.com/moby/moby/commit/deb4bbbfe09dfea5289d6a410d3c73d8a49a13ac"><code>deb4bbb</code></a>
api: deprecate <code>KernelMemoryTCP</code> support</li>
<li><a
href="https://github.com/moby/moby/commit/423a7fd6af43765bb80d5ab94fcd60c7daa64033"><code>423a7fd</code></a>
Merge pull request <a
href="https://redirect.github.com/moby/moby/issues/51064">#51064</a>
from thaJeztah/28.x_backport_fix_epcnt_warning</li>
<li><a
href="https://github.com/moby/moby/commit/fbf2fe8b7dc2cc532ef7de136f95016f7f56b267"><code>fbf2fe8</code></a>
Eliminate warning about endpoint count store delete</li>
<li><a
href="https://github.com/moby/moby/commit/252a1ebe7effbc4e7e2272154332111d1fc916b2"><code>252a1eb</code></a>
Merge pull request <a
href="https://redirect.github.com/moby/moby/issues/51061">#51061</a>
from thaJeztah/28.x_backport_rm_email_example</li>
<li><a
href="https://github.com/moby/moby/commit/2c15eb6617efb30cb828ce1a7133df6aa1689314"><code>2c15eb6</code></a>
api/docs: remove email field from example auth</li>
<li>Additional commits viewable in <a
href="https://github.com/moby/moby/compare/v28.4.0...v28.5.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/moby/moby&package-manager=go_modules&previous-version=28.4.0+incompatible&new-version=28.5.0+incompatible)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 11:58:57 +00:00
Sas Swart a7c9e623fe chore: fix api param name (#20172)
In https://github.com/coder/coder/pull/20137, we added a new flag to
`coder provisioner jobs list`, namely `--initiator`.

To make some follow-up worth it, I need to rename an API param used in
the process before it becomes part of our released and tagged API.

Instead of only accepting UUIDs, we accept an arbitrary string.
We still validate it as a UUID now, but we will expand its validation to
allow any string and then resolve that string the same way that we
resolve the user parameter elsewhere in the API.
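The current behavior described above — a string-typed param that is, for now, only accepted when it is a UUID — can be sketched with a stdlib-only check. The function name and regex-based validation here are illustrative assumptions, not the actual coder/coder implementation:

```go
package main

import (
	"fmt"
	"regexp"
)

// uuidRE matches the canonical 8-4-4-4-12 hex UUID form.
var uuidRE = regexp.MustCompile(
	`^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$`)

// validInitiator reflects the current behavior: the param is typed as a
// string, but only UUIDs pass validation for now. Later it can be relaxed
// to resolve arbitrary strings the way the user param is resolved.
func validInitiator(s string) bool {
	return uuidRE.MatchString(s)
}

func main() {
	fmt.Println(validInitiator("b3b5c1de-0000-4000-8000-000000000000")) // true
	fmt.Println(validInitiator("prebuilds"))                            // false
}
```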
2025-10-06 13:58:37 +02:00
dependabot[bot] dd99d40ad6 chore: bump github.com/go-playground/validator/v10 from 10.27.0 to 10.28.0 (#20176)
Bumps
[github.com/go-playground/validator/v10](https://github.com/go-playground/validator)
from 10.27.0 to 10.28.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/go-playground/validator/releases">github.com/go-playground/validator/v10's
releases</a>.</em></p>
<blockquote>
<h2>Release 10.28.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Update workflow.yml to support 2 most recent major versions by <a
href="https://github.com/nodivbyzero"><code>@​nodivbyzero</code></a> in
<a
href="https://redirect.github.com/go-playground/validator/pull/1417">go-playground/validator#1417</a></li>
<li>Bump actions/checkout from 4 to 5 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/go-playground/validator/pull/1456">go-playground/validator#1456</a></li>
<li>Go 1.25 support by <a
href="https://github.com/nodivbyzero"><code>@​nodivbyzero</code></a> in
<a
href="https://redirect.github.com/go-playground/validator/pull/1459">go-playground/validator#1459</a></li>
<li>Bump github.com/gabriel-vasile/mimetype from 1.4.8 to 1.4.10 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/go-playground/validator/pull/1463">go-playground/validator#1463</a></li>
<li>Bump golang.org/x/text from 0.22.0 to 0.29.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/go-playground/validator/pull/1464">go-playground/validator#1464</a></li>
<li>Bump actions/setup-go from 5 to 6 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/go-playground/validator/pull/1465">go-playground/validator#1465</a></li>
<li>Bump golang.org/x/crypto from 0.33.0 to 0.42.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/go-playground/validator/pull/1467">go-playground/validator#1467</a></li>
<li>fix: should panic when define duplicate field param in
<code>required_if</code> by <a
href="https://github.com/duyquang6"><code>@​duyquang6</code></a> in <a
href="https://redirect.github.com/go-playground/validator/pull/1468">go-playground/validator#1468</a></li>
<li>Fixed missing keys from returned errors in map validation by <a
href="https://github.com/gelozr"><code>@​gelozr</code></a> in <a
href="https://redirect.github.com/go-playground/validator/pull/1284">go-playground/validator#1284</a></li>
<li>Added https_url tag by <a
href="https://github.com/ahmedkamalio"><code>@​ahmedkamalio</code></a>
in <a
href="https://redirect.github.com/go-playground/validator/pull/1461">go-playground/validator#1461</a></li>
<li>docs: add description for 'port' validator by <a
href="https://github.com/nodivbyzero"><code>@​nodivbyzero</code></a> in
<a
href="https://redirect.github.com/go-playground/validator/pull/1435">go-playground/validator#1435</a></li>
<li>Add alphaspace validator by <a
href="https://github.com/takaaa220"><code>@​takaaa220</code></a> in <a
href="https://redirect.github.com/go-playground/validator/pull/1343">go-playground/validator#1343</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/duyquang6"><code>@​duyquang6</code></a>
made their first contribution in <a
href="https://redirect.github.com/go-playground/validator/pull/1468">go-playground/validator#1468</a></li>
<li><a href="https://github.com/gelozr"><code>@​gelozr</code></a> made
their first contribution in <a
href="https://redirect.github.com/go-playground/validator/pull/1284">go-playground/validator#1284</a></li>
<li><a
href="https://github.com/ahmedkamalio"><code>@​ahmedkamalio</code></a>
made their first contribution in <a
href="https://redirect.github.com/go-playground/validator/pull/1461">go-playground/validator#1461</a></li>
<li><a href="https://github.com/takaaa220"><code>@​takaaa220</code></a>
made their first contribution in <a
href="https://redirect.github.com/go-playground/validator/pull/1343">go-playground/validator#1343</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/go-playground/validator/compare/v10.27.0...v10.28.0">https://github.com/go-playground/validator/compare/v10.27.0...v10.28.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/go-playground/validator/commit/bdc3a7d31bd8269baa07904324e8fb22fb2f9372"><code>bdc3a7d</code></a>
Add alphaspace validator (<a
href="https://redirect.github.com/go-playground/validator/issues/1343">#1343</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/63594a0fbee5293ac7bc6c80286c3277cd2ddb45"><code>63594a0</code></a>
docs: add description for 'port' validator (<a
href="https://redirect.github.com/go-playground/validator/issues/1435">#1435</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/45f3a8e09a4d0b995829fe0613289db4fb69b850"><code>45f3a8e</code></a>
Added https_url tag (<a
href="https://redirect.github.com/go-playground/validator/issues/1461">#1461</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/7a23bca81c8bc72c578252b524665ea173b489da"><code>7a23bca</code></a>
Remove Go version 1.23 from CI workflow</li>
<li><a
href="https://github.com/go-playground/validator/commit/13130d2e78838bec6b9e4dbce756c183844590e7"><code>13130d2</code></a>
Fixed missing keys from returned errors in map validation (<a
href="https://redirect.github.com/go-playground/validator/issues/1284">#1284</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/94e89f0942028ebfb17426b8d3c2005a46557621"><code>94e89f0</code></a>
fix: should panic when define duplicate field param in
<code>required_if</code> (<a
href="https://redirect.github.com/go-playground/validator/issues/1468">#1468</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/6905468e4559e0937518de42c92cb6b5c023342d"><code>6905468</code></a>
Bump golang.org/x/crypto from 0.33.0 to 0.42.0 (<a
href="https://redirect.github.com/go-playground/validator/issues/1467">#1467</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/77ef70e2d1287e50641c096ca1c3d09497e7197c"><code>77ef70e</code></a>
Bump actions/setup-go from 5 to 6 (<a
href="https://redirect.github.com/go-playground/validator/issues/1465">#1465</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/78d05ae554de682d121e78a6a62d7de01be8e863"><code>78d05ae</code></a>
Bump golang.org/x/text from 0.22.0 to 0.29.0 (<a
href="https://redirect.github.com/go-playground/validator/issues/1464">#1464</a>)</li>
<li><a
href="https://github.com/go-playground/validator/commit/34aea1f62ca76abe2999265d25c8dcf8507cb3ed"><code>34aea1f</code></a>
Bump github.com/gabriel-vasile/mimetype from 1.4.8 to 1.4.10 (<a
href="https://redirect.github.com/go-playground/validator/issues/1463">#1463</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/go-playground/validator/compare/v10.27.0...v10.28.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/go-playground/validator/v10&package-manager=go_modules&previous-version=10.27.0&new-version=10.28.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 11:57:20 +00:00
Danielle Maywood 815e58e872 chore: upgrade clistat to v1.0.2 (#20173)
clistat@v1.0.2 contains a bug fix for when the `cpu.max` cgroup file is
missing
2025-10-06 12:21:21 +01:00
Thomas Kosiewski b60ae0a0c4 refactor: add wildcard scope entries for API key scopes (#20032)
# Add API Key Scope Wildcards

This PR adds wildcard API key scopes (`resource:*`) for all RBAC resources to ensure every resource has a matching wildcard value. It also adds all individual `resource:action` scopes to the API documentation and TypeScript definitions.

The changes include:

- Adding a new database migration (000377) that adds wildcard API key scopes
- Updating the API documentation to include all available scopes
- Enhancing the scope generation scripts to include all resource wildcards
- Updating the TypeScript definitions to match the expanded scope list

These changes make it easier to create API keys with comprehensive permissions for specific resource types.
2025-10-06 12:08:17 +02:00
Sas Swart d17dd5d787 feat: add filtering by initiator to provisioner job listing in the CLI (#20137)
Relates to https://github.com/coder/internal/issues/934

This PR provides a mechanism to filter provisioner jobs according to who
initiated the job.
This will be used to find pending prebuild jobs when prebuilds have
overwhelmed the provisioner job queue. They can then be canceled.

If prebuilds are overwhelming provisioners, the following steps will be
taken:

```bash
# pause prebuild reconciliation to limit provisioner queue pollution:
coder prebuilds pause 
# cancel pending provisioner jobs to clear the queue
coder provisioner jobs list --initiator="prebuilds" --status="pending" | jq ... | xargs -n1 -I{} coder provisioner jobs cancel {}
# push a fixed template and wait for the import to complete
coder templates push ... # push a fixed template
# resume prebuild reconciliation
coder prebuilds resume
```

This interface differs somewhat from what was specified in the issue,
but still addresses it. The original proposal was mine, and this simpler
implementation makes sense. I might add a `--search` parameter in a
follow-up if there is appetite for it.

Potential follow ups:
* Support for this usage: `coder provisioner jobs list --search
"initiator:prebuilds status:pending"`
* Adding the same parameters to `coder provisioner jobs cancel` as a
convenience feature so that operators don't have to pipe through `jq`
and `xargs`
2025-10-06 08:56:43 +00:00
Atif Ali c1357d4e27 chore(dogfood): dogfood latest version of agentAPI (#20165) 2025-10-04 15:58:17 +05:00
ケイラ 2ec9be2203 fix: only show error once when failing to update org member roles (#20155) 2025-10-03 14:11:08 -06:00
Zach b48e367c58 chore: remove dead randtz package (#20156)
This package was added in 65db7a71 a couple years back and was seemingly
never used (backed up by `git log -S randtz`).
2025-10-03 09:42:34 -06:00
Rafael Rodriguez bb5884467d feat(cli): prompt for missing required template variables on template creation (#19973)
## Summary

This pull request adds CLI support for prompting the user for any
missing required template variables during `coder templates push`, then
automatically retrying the template build once the user has provided the
missing values.

Closes: https://github.com/coder/coder/issues/19782

### Demo

In the following recording I created a simple template Terraform file
that used different variable types (string, number, boolean, and
sensitive); the CLI prompted me to enter a value for each variable.

<details>
<summary>See example template terraform file</summary>

```tf
...

# Required variables for testing interactive prompting
variable "docker_image" {
  description = "Docker image to use for the workspace"
  type        = string
}

variable "workspace_name" {
  description = "Name of the workspace"
  type        = string
}

variable "cpu_limit" {
  description = "CPU limit for the container (number of cores)"
  type        = number
}

variable "enable_gpu" {
  description = "Enable GPU access for the container"
  type        = bool
}

variable "api_key" {
  description = "API key for external services (sensitive)"
  type        = string
  sensitive   = true
}

# Optional variable with default
variable "docker_socket" {
  default     = "/var/run/docker.sock"
  description = "Docker socket path"
  type        = string
}

...
```

</details>

Once the user entered a valid value for each variable, the template
build would be retried.


https://github.com/user-attachments/assets/770cf954-3cbc-4464-925e-2be4e32a97de

<details>
<summary>See output from recording</summary>

```shell
$ ./scripts/coder-dev.sh templates push test-required-params -d examples/templates/test-required-params/
INFO : Overriding codersdk.SessionTokenCookie as we are developing inside a Coder workspace.
/home/coder/coder/build/coder-slim_2.26.0-devel+a68122ca3_linux_amd64
Provisioner tags:  <none>
WARN: No .terraform.lock.hcl file found
  | When provisioning, Coder will be unable to cache providers without a lockfile and must download them from the internet each time.
  | Create one by running  terraform init  in your template directory.
> Upload "examples/templates/test-required-params"? (yes/no) yes
=== ✔ Queued [0ms]
==> ⧗ Running
==> ⧗ Running
=== ✔ Running [4ms]
==> ⧗ Setting up
=== ✔ Setting up [0ms]
==> ⧗ Parsing template parameters
=== ✔ Parsing template parameters [8ms]
==> ⧗ Cleaning Up
=== ✘ Cleaning Up [4ms]
=== ✘ Cleaning Up [8ms]
Found 5 missing required variables:
  - docker_image (string): Docker image to use for the workspace
  - workspace_name (string): Name of the workspace
  - cpu_limit (number): CPU limit for the container (number of cores)
  - enable_gpu (bool): Enable GPU access for the container
  - api_key (string): API key for external services (sensitive)

The template requires values for the following variables:
var.docker_image (required)
  Description: Docker image to use for the workspace
  Type: string
  Current value: <empty>
> Enter value: image-name
var.workspace_name (required)
  Description: Name of the workspace
  Type: string
  Current value: <empty>
> Enter value: workspace-name
var.cpu_limit (required)
  Description: CPU limit for the container (number of cores)
  Type: number
  Current value: <empty>
> Enter value: 1
var.enable_gpu (required)
  Description: Enable GPU access for the container
  Type: bool
  Current value: <empty>
? Select value: false
var.api_key (required), sensitive
  Description: API key for external services (sensitive)
  Type: string
  Current value: <empty>
> Enter value: (*redacted*) ******

Retrying template build with provided variables...
=== ✔ Queued [0ms]
==> ⧗ Running
==> ⧗ Running
=== ✔ Running [2ms]
==> ⧗ Setting up
=== ✔ Setting up [0ms]
==> ⧗ Parsing template parameters
=== ✔ Parsing template parameters [7ms]
==> ⧗ Detecting persistent resources
2025-09-25 22:34:14.731Z Terraform 1.13.0
2025-09-25 22:34:15.140Z data.coder_provisioner.me: Refreshing...
2025-09-25 22:34:15.140Z data.coder_workspace.me: Refreshing...
2025-09-25 22:34:15.140Z data.coder_workspace_owner.me: Refreshing...
2025-09-25 22:34:15.141Z data.coder_provisioner.me: Refresh complete after 0s [id=2bd73098-d127-4362-b3a5-628e5bce6998]
2025-09-25 22:34:15.141Z data.coder_workspace_owner.me: Refresh complete after 0s [id=c2006933-4f3e-4c45-9e04-79612c3a5eca]
2025-09-25 22:34:15.141Z data.coder_workspace.me: Refresh complete after 0s [id=36f2dc6f-0bf2-43bd-bc4d-b29768334e02]
2025-09-25 22:34:15.186Z coder_agent.main: Plan to create
2025-09-25 22:34:15.186Z module.code-server[0].coder_app.code-server: Plan to create
2025-09-25 22:34:15.186Z docker_volume.home_volume: Plan to create
2025-09-25 22:34:15.186Z module.code-server[0].coder_script.code-server: Plan to create
2025-09-25 22:34:15.187Z docker_container.workspace[0]: Plan to create
2025-09-25 22:34:15.187Z Plan: 5 to add, 0 to change, 0 to destroy.
=== ✔ Detecting persistent resources [3104ms]
==> ⧗ Detecting ephemeral resources
2025-09-25 22:34:16.033Z Terraform 1.13.0
2025-09-25 22:34:16.428Z data.coder_workspace.me: Refreshing...
2025-09-25 22:34:16.428Z data.coder_provisioner.me: Refreshing...
2025-09-25 22:34:16.429Z data.coder_workspace_owner.me: Refreshing...
2025-09-25 22:34:16.429Z data.coder_provisioner.me: Refresh complete after 0s [id=2d2f7083-88e6-425c-9df3-856a3bf4cc73]
2025-09-25 22:34:16.429Z data.coder_workspace.me: Refresh complete after 0s [id=c723575e-c7d3-43d7-bf54-0e34d0959dc3]
2025-09-25 22:34:16.431Z data.coder_workspace_owner.me: Refresh complete after 0s [id=d43470c2-236e-4ae9-a977-6b53688c2cb1]
2025-09-25 22:34:16.453Z coder_agent.main: Plan to create
2025-09-25 22:34:16.453Z docker_volume.home_volume: Plan to create
2025-09-25 22:34:16.454Z Plan: 2 to add, 0 to change, 0 to destroy.
=== ✔ Detecting ephemeral resources [1278ms]
==> ⧗ Cleaning Up
=== ✔ Cleaning Up [6ms]
┌──────────────────────────────────┐
│ Template Preview                 │
├──────────────────────────────────┤
│ RESOURCE                         │
├──────────────────────────────────┤
│ docker_container.workspace       │
│ └─ main (linux, amd64)           │
├──────────────────────────────────┤
│ docker_volume.home_volume        │
└──────────────────────────────────┘

The test-required-params template has been created at Sep 25
22:34:16! Developers can provision a workspace with this template using:

Updated version at Sep 25 22:34:16!
```

</details>

### Changes

- Added a new function to check if the provisioner failed due to a
template missing required variables
- Added a handler function that is called when a provisioner fails due
to the "missing required variables" error. The handler function will:
- Check for provided template variables and identify any missing
variables
- Prompt the user for any missing variables (prompt is adapted based on
the variable type)
  - Validate user input for missing variables
- Retry the template build when all variables have been provided by the
user
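The type-aware validation described above can be sketched as follows. This is an illustrative sketch, not the actual CLI helpers; the function name `validateVariable` and the type strings are assumptions based on the Terraform variable types shown in the example template.

```go
package main

import (
	"fmt"
	"strconv"
)

// validateVariable checks a raw user-entered value against the declared
// template variable type, mirroring the prompt-side validation the PR
// describes. Unknown types fall through to the string case.
func validateVariable(varType, raw string) error {
	switch varType {
	case "number":
		if _, err := strconv.ParseFloat(raw, 64); err != nil {
			return fmt.Errorf("%q is not a valid number", raw)
		}
	case "bool":
		if _, err := strconv.ParseBool(raw); err != nil {
			return fmt.Errorf("%q is not a valid bool", raw)
		}
	}
	// Strings (including sensitive values) accept any input.
	return nil
}

func main() {
	fmt.Println(validateVariable("number", "1"))
	fmt.Println(validateVariable("bool", "maybe"))
}
```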

### Testing

Added tests for the following scenarios:

- Ensure validation based on variable type
- Ensure users are not prompted for variables with a default value
- Ensure variables provided via a variables file (`--variables-file`)
or a variable flag (`--variable`) take precedence over a template
2025-10-03 10:20:06 -05:00
Cian Johnston a360785199 ci(.github/workflows/traiage.yaml): check instead for push access to repo (#20163) 2025-10-03 15:04:20 +01:00
Ethan 2b4485575c feat(scaletest): add autostart scaletest command (#20161)
Closes https://github.com/coder/internal/issues/911
2025-10-03 20:50:21 +10:00
Cian Johnston e1619daacc chore(coderd): update aitasks.go to leverage agentapi-sdk-go (#20159)
I only recently became aware of the existence of `agentapi-sdk-go`.
Updates `aitasks.go` to make use of it.
2025-10-03 10:17:43 +01:00
Cian Johnston 2880582cc9 ci(.github/workflows/traiage.yaml): remove extraneous cleanup steps (#20154) 2025-10-03 10:01:08 +01:00
Cian Johnston 091b5c88d6 ci(.github/workflows/traiage.yaml): adjust prompt for legibility when… (#20144)
… truncated, just like this
2025-10-03 09:24:55 +01:00
yyefimov 039fa89481 fix(coderd): correct the name of the unmarshall error variable (#20150)
The incorrect error variable is used when reporting issues during
unmarshal operations, which makes it hard to understand the underlying
reason for an OIDC failure: use `unmarshalError` instead of `err`.
2025-10-03 12:08:15 +10:00
Cian Johnston ffcb7a1693 fix(coderd): truncate task prompt to 160 characters in notifications (#20147)
Truncates the task prompt used in notifications to a maximum of 160
characters. The length of 160 characters was chosen arbitrarily.
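Truncating by rune rather than by byte avoids splitting a multi-byte UTF-8 character at the cut point. A minimal sketch of such a truncation (illustrative only; not the actual implementation from the PR):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// truncatePrompt limits s to max runes, appending an ellipsis when
// truncation occurs. Counting runes (not bytes) keeps multi-byte
// characters intact.
func truncatePrompt(s string, max int) string {
	if utf8.RuneCountInString(s) <= max {
		return s
	}
	runes := []rune(s)
	return string(runes[:max]) + "…"
}

func main() {
	fmt.Println(truncatePrompt("build me a web app", 160))
}
```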
2025-10-02 19:54:07 +01:00
Zach 4d1003eace fix: remove initial global HTTP client usage (#20128)
This PR takes the initial steps toward removing usage of the global Go
HTTP client, which was seen to impact test flakiness in
https://github.com/coder/internal/issues/1020. The first commit removes
uses from tests, with the exception of one test that is tightly coupled
to the default client. The second commit makes easy/low-risk removals
from application code. This should help reduce test flakiness.
2025-10-02 11:43:13 -06:00
Danielle Maywood 0d2ccacacd chore: update to coder/clistat v1.0.1 (#20143)
Update https://github.com/coder/clistat to v1.0.1. This release has two
bug fixes.
2025-10-02 15:39:11 +01:00
Rafael Rodriguez c517aabe43 fix: add heartbeat to keep dynamic params websocket open (#20026)
## Summary

In this pull request we're adding a heartbeat to the
`handleParameterWebsocket` function to ensure that the connection stays
open until the 30-minute timeout is reached.

Closes: https://github.com/coder/coder/issues/19805

### Testing

- Reproduced the problem mentioned in the issue (websocket connection
closes after ~10 minutes of inactivity on the create workspace page)

<img width="1870" height="357" alt="Screenshot 2025-09-26 at 15 58 51"
src="https://github.com/user-attachments/assets/a9e2e89e-87c5-4afa-9644-afe246a15f79"
/>

- Confirmed that adding the heartbeat kept the connection open until the
30 min timeout was reached

<img width="1636" height="387" alt="Screenshot 2025-09-29 at 15 51 43"
src="https://github.com/user-attachments/assets/0a8c5cda-29a6-493d-a6c0-4a2629da8838"
/>
2025-10-02 09:32:40 -05:00
Cian Johnston 1f71c2c129 ci(.github/workflows/traiage.yaml): fix task URL in github comment (#20142) 2025-10-02 14:59:22 +01:00
Marcin Tojek d93629bcde fix: check permission to update username (#20139) 2025-10-02 15:07:49 +02:00
Danny Kopping d63bb2ce2f chore: add Audit Log purge advice (#20052)
Audit Log entries can be deleted safely (with appropriate caveats), but
we don't specifically call this out in the docs.

---------

Signed-off-by: Danny Kopping <danny@coder.com>
2025-10-02 11:10:51 +02:00
Dean Sheather d2c286d9b0 chore: fix missing variable in deploy workflow (#20136)
already in release branch and tested working
2025-10-02 16:08:29 +10:00
Dean Sheather e5c8c9bdaf chore: pin dogfood to release branch during release freeze (#20028)
Relates to https://github.com/coder/dogfood/pull/189
Closes https://github.com/coder/internal/issues/1021

- Adds new script `scripts/should_deploy.sh` which implements the
algorithm in the linked issue
- Changes the `ci.yaml` workflow to run on release branches
- Moves the deployment steps out of `ci.yaml` into a new workflow
`deploy.yaml` for concurrency limiting purposes
- Changes the behavior of image tag pushing slightly:
    - Versioned tags will no longer have a `main-` prefix
    - `main` branch will still push the `main` and `latest` tags
    - `release/x.y` branches will now push `release-x.y` tags
- The deploy job will exit early if `should_deploy.sh` returns false
- The deploy job will now retag whatever image it's about to deploy as
`dogfood`
2025-10-02 12:12:29 +10:00
Ethan 76d6e13185 ci: make changes required (#20131)
Earlier today, a dependabot PR was merged despite CI not having been run. https://github.com/coder/coder/pull/20068

This was because the `changes` job failed due to a github ratelimit, which caused all the tests to be skipped, which caused `required` to pass as it passes if tests are skipped.

<img width="725" height="872" alt="image" src="https://github.com/user-attachments/assets/3a13d19b-819a-48df-ad8c-e9ff903a0496" />

This PR makes `changes` required so this can't happen again (assuming we keep the dependabot automerge around, it otherwise might).
2025-10-02 11:22:05 +10:00
Ethan f84a789b40 feat(scaletest): add runner for thundering herd autostart (#19998)
Relates to https://github.com/coder/internal/issues/911

This PR adds a scaletest runner that simulates a user with a workspace configured to autostart at a specific time.

An instance of this autostart runner does the following:
1. Creates a user.
2. Creates a workspace.
3. Listens on `/api/v2/workspaces/<ws-id>/watch` to wait for build completions throughout the run.
4. Stops the workspace, waits for the build to finish.
5. Waits for all other concurrently executing runners (per the `SetupBarrier` `WaitGroup`) to also have a stopped workspace.
6. Sets the workspace autostart time to `time.Now().Add(cfg.AutoStartDelay)`.
7. Waits for the autostart workspace build to begin, enter the `pending` status, and then `running`.
8. Waits for the autostart workspace build to end.

Exposes four prometheus metrics:
- `autostart_job_creation_latency_seconds` - `HistogramVec`. Labels = `{username, workspace_name}`.
- `autostart_job_acquired_latency_seconds` - `HistogramVec`. Labels = `{username, workspace_name}`.
- `autostart_total_latency_seconds` - `HistogramVec`. Labels = `{username, workspace_name}`.
- `autostart_errors_total` - `CounterVec`. Labels = `{username, action}`.
2025-10-02 10:45:52 +10:00
dependabot[bot] 41420aea3c chore: bump ts-proto from 1.164.0 to 1.181.2 in /site (#20080)
Bumps [ts-proto](https://github.com/stephenh/ts-proto) from 1.164.0 to
1.181.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/stephenh/ts-proto/releases">ts-proto's
releases</a>.</em></p>
<blockquote>
<h2>v1.181.2</h2>
<h2><a
href="https://github.com/stephenh/ts-proto/compare/v1.181.1...v1.181.2">1.181.2</a>
(2024-08-15)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>toJSON Function with <code>removeEnumPrefix=true</code> and
<code>unrecognizedEnumValue=0</code> Options (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1089">#1089</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/24014908f814e2720b9c2b9bd2ae1773be880a16">2401490</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1086">#1086</a>
<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1086">#1086</a></li>
</ul>
<h2>v1.181.1</h2>
<h2><a
href="https://github.com/stephenh/ts-proto/compare/v1.181.0...v1.181.1">1.181.1</a>
(2024-07-13)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Incorrect message names in the generated code for repeated fields
(<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1073">#1073</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/8a95d8e0983a38e604b6990461e726db566ff311">8a95d8e</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1072">#1072</a></li>
</ul>
<h2>v1.181.0</h2>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.180.0...v1.181.0">1.181.0</a>
(2024-07-01)</h1>
<h3>Features</h3>
<ul>
<li>added the &quot;typePrefix&quot; and &quot;typeSuffix&quot; options.
(<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1069">#1069</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/ab515cda322baeb94c7588117e4bb5bee6281874">ab515cd</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1033">#1033</a></li>
</ul>
<h2>v1.180.0</h2>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.179.0...v1.180.0">1.180.0</a>
(2024-06-15)</h1>
<h3>Features</h3>
<ul>
<li>oneof=unions-value to use the same field name for oneof cases (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1062">#1062</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/74930908cc8e5292577a793b7ae06c3721225ac3">7493090</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1060">#1060</a></li>
</ul>
<h2>v1.179.0</h2>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.178.0...v1.179.0">1.179.0</a>
(2024-06-15)</h1>
<h3>Features</h3>
<ul>
<li>bigIntLiteral option for using BigInt literals (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1063">#1063</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/b89fbcb1f99ccfcd1f06551286c2459e44a3bac2">b89fbcb</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/928">#928</a>
<a
href="https://redirect.github.com/stephenh/ts-proto/issues/932">#932</a></li>
</ul>
<h2>v1.178.0</h2>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.177.0...v1.178.0">1.178.0</a>
(2024-06-07)</h1>
<h3>Features</h3>
<ul>
<li><code>no-file-descriptor</code> setting for outputSchema option (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1047">#1047</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/c54f06c4a7dd766abf3f91932b1e4cdf38b7f346">c54f06c</a>)</li>
</ul>
<h2>v1.177.0</h2>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.176.3...v1.177.0">1.177.0</a>
(2024-06-07)</h1>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/stephenh/ts-proto/blob/main/CHANGELOG.md">ts-proto's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/stephenh/ts-proto/compare/v1.181.1...v1.181.2">1.181.2</a>
(2024-08-15)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>toJSON Function with <code>removeEnumPrefix=true</code> and
<code>unrecognizedEnumValue=0</code> Options (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1089">#1089</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/24014908f814e2720b9c2b9bd2ae1773be880a16">2401490</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1086">#1086</a>
<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1086">#1086</a></li>
</ul>
<h2><a
href="https://github.com/stephenh/ts-proto/compare/v1.181.0...v1.181.1">1.181.1</a>
(2024-07-13)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Incorrect message names in the generated code for repeated fields
(<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1073">#1073</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/8a95d8e0983a38e604b6990461e726db566ff311">8a95d8e</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1072">#1072</a></li>
</ul>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.180.0...v1.181.0">1.181.0</a>
(2024-07-01)</h1>
<h3>Features</h3>
<ul>
<li>added the &quot;typePrefix&quot; and &quot;typeSuffix&quot; options.
(<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1069">#1069</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/ab515cda322baeb94c7588117e4bb5bee6281874">ab515cd</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1033">#1033</a></li>
</ul>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.179.0...v1.180.0">1.180.0</a>
(2024-06-15)</h1>
<h3>Features</h3>
<ul>
<li>oneof=unions-value to use the same field name for oneof cases (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1062">#1062</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/74930908cc8e5292577a793b7ae06c3721225ac3">7493090</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/1060">#1060</a></li>
</ul>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.178.0...v1.179.0">1.179.0</a>
(2024-06-15)</h1>
<h3>Features</h3>
<ul>
<li>bigIntLiteral option for using BigInt literals (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1063">#1063</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/b89fbcb1f99ccfcd1f06551286c2459e44a3bac2">b89fbcb</a>),
closes <a
href="https://redirect.github.com/stephenh/ts-proto/issues/928">#928</a>
<a
href="https://redirect.github.com/stephenh/ts-proto/issues/932">#932</a></li>
</ul>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.177.0...v1.178.0">1.178.0</a>
(2024-06-07)</h1>
<h3>Features</h3>
<ul>
<li><code>no-file-descriptor</code> setting for outputSchema option (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1047">#1047</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/c54f06c4a7dd766abf3f91932b1e4cdf38b7f346">c54f06c</a>)</li>
</ul>
<h1><a
href="https://github.com/stephenh/ts-proto/compare/v1.176.3...v1.177.0">1.177.0</a>
(2024-06-07)</h1>
<h3>Features</h3>
<ul>
<li>add option <code>noDefaultsForOptionals</code> (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1051">#1051</a>)
(<a
href="https://github.com/stephenh/ts-proto/commit/41d10205bf68468c37cf69e58dc4c4fdbfffcf5b">41d1020</a>)</li>
</ul>
<h2><a
href="https://github.com/stephenh/ts-proto/compare/v1.176.2...v1.176.3">1.176.3</a>
(2024-06-07)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/stephenh/ts-proto/commit/290d0c51e7bea305f4759c138c43aef8a22af19b"><code>290d0c5</code></a>
chore(release): 1.181.2 [skip ci]</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/24014908f814e2720b9c2b9bd2ae1773be880a16"><code>2401490</code></a>
fix: toJSON Function with <code>removeEnumPrefix=true</code> and
`unrecognizedEnumValue=...</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/76243a860bdcadb16086379abebb60ff9cc3d363"><code>76243a8</code></a>
chore(release): 1.181.1 [skip ci]</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/8a95d8e0983a38e604b6990461e726db566ff311"><code>8a95d8e</code></a>
fix: Incorrect message names in the generated code for repeated fields
(<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1073">#1073</a>)</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/2078d1a905ad7961e168971c51b852deae930b6d"><code>2078d1a</code></a>
Remove .DS_Store (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1071">#1071</a>)</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/c5546fba41ed248ffa750172aae3544e70ffddb8"><code>c5546fb</code></a>
chore(release): 1.181.0 [skip ci]</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/ab515cda322baeb94c7588117e4bb5bee6281874"><code>ab515cd</code></a>
feat: added the &quot;typePrefix&quot; and &quot;typeSuffix&quot;
options. (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1069">#1069</a>)</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/14e7e0f6a94a9309d16fb8d62467c95b3d001529"><code>14e7e0f</code></a>
docs: Describe <code>oneof=unions-value</code> output as Algebraic Data
Type (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1065">#1065</a>)</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/84ccaf2badc2def72b03f11d6faac645d1677f97"><code>84ccaf2</code></a>
chore(release): 1.180.0 [skip ci]</li>
<li><a
href="https://github.com/stephenh/ts-proto/commit/74930908cc8e5292577a793b7ae06c3721225ac3"><code>7493090</code></a>
feat: oneof=unions-value to use the same field name for oneof cases (<a
href="https://redirect.github.com/stephenh/ts-proto/issues/1062">#1062</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/stephenh/ts-proto/compare/v1.164.0...v1.181.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ts-proto&package-manager=npm_and_yarn&previous-version=1.164.0&new-version=1.181.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: ケイラ <kayla@tree.camp>
2025-10-01 23:11:11 +00:00
dependabot[bot] 29b4cfc55b chore: bump remark-gfm from 4.0.0 to 4.0.1 in /site (#20081) 2025-10-01 17:08:26 -06:00
dependabot[bot] 79f5cb8b9a chore: bump react-virtualized-auto-sizer and @types/react-virtualized-auto-sizer (#20078)
Bumps
[react-virtualized-auto-sizer](https://github.com/bvaughn/react-virtualized-auto-sizer)
and
[@types/react-virtualized-auto-sizer](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react-virtualized-auto-sizer).
These dependencies needed to be updated together.
Updates `react-virtualized-auto-sizer` from 1.0.24 to 1.0.26
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/releases">react-virtualized-auto-sizer's
releases</a>.</em></p>
<blockquote>
<h2>1.0.26</h2>
<ul>
<li>Changed <code>width</code> and <code>height</code> values to be
based on <code>getBoundingClientRect</code> rather than
<code>offsetWidth</code> and <code>offsetHeight</code> (<a
href="https://developer.mozilla.org/en-US/docs/Web/API/HTMLElement/offsetWidth#value">which
are integers</a> and can cause rounding/flickering problems in some
cases).</li>
</ul>
<h2>1.0.25</h2>
<ul>
<li>Dependencies updated to include React 19</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/blob/master/CHANGELOG.md">react-virtualized-auto-sizer's
changelog</a>.</em></p>
<blockquote>
<h2>1.0.26</h2>
<ul>
<li>Changed <code>width</code> and <code>height</code> values to be
based on <code>getBoundingClientRect</code> rather than
<code>offsetWidth</code> and <code>offsetHeight</code> (<a
href="https://developer.mozilla.org/en-US/docs/Web/API/HTMLElement/offsetWidth#value">which
are integers</a> and can cause rounding/flickering problems in some
cases).</li>
</ul>
<h2>1.0.25</h2>
<ul>
<li>Dependencies updated to include React 19</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/45b1270b631829c29746de23ad8d60b9a19f0960"><code>45b1270</code></a>
Add deprecation warning for scaledWidth/scaledHeight accessors</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/e9c8ef003f09b13d4afa1a2dcb13f93d910dd937"><code>e9c8ef0</code></a>
Merge pull request <a
href="https://redirect.github.com/bvaughn/react-virtualized-auto-sizer/issues/97">#97</a>
from bvaughn/issues/96</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/fdf25d4075f3bb6ecc1062d2ad119459c8877f57"><code>fdf25d4</code></a>
Change width and height to come from getBoundingClientRect rather than
offset...</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/1898b073c15d4c62587cd2cab186a1b69f1131e4"><code>1898b07</code></a>
Tweaked README format</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/2dc8808e2654e4b2753a6f330dc87bb72c2958ee"><code>2dc8808</code></a>
1.0.24 -&gt; 1.0.25</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/801cc5239defabf2b0d53cfe236ea4cc447774e1"><code>801cc52</code></a>
Merge pull request <a
href="https://redirect.github.com/bvaughn/react-virtualized-auto-sizer/issues/93">#93</a>
from olafbuitelaar/patch-1</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/0b8b1811ffe82658f0330a5ae2726a105e71f9ce"><code>0b8b181</code></a>
updated deps to react 19</li>
<li><a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/commit/9f970c99fd53d7cb0908ad7d4de9982fdf63bd6e"><code>9f970c9</code></a>
Update README</li>
<li>See full diff in <a
href="https://github.com/bvaughn/react-virtualized-auto-sizer/compare/1.0.24...1.0.26">compare
view</a></li>
</ul>
</details>
<br />

Updates `@types/react-virtualized-auto-sizer` from 1.0.4 to 1.0.8
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react-virtualized-auto-sizer">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 22:46:31 +00:00
dependabot[bot] 4e53d8b9c2 chore: bump @biomejs/biome from 2.2.0 to 2.2.4 in /site (#20118)
Bumps
[@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome)
from 2.2.0 to 2.2.4.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/biomejs/biome/releases"><code>@​biomejs/biome</code>'s
releases</a>.</em></p>
<blockquote>
<h2>Biome CLI v2.2.4</h2>
<h2>2.2.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7453">#7453</a> <a
href="https://github.com/biomejs/biome/commit/aa8cea31af675699e18988fe79242ae5d5215af1">
<code>aa8cea3</code></a> Thanks <a
href="https://github.com/arendjr"><code>@​arendjr</code></a>! - Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7242">#7242</a>:
Aliases specified in
<code>package.json</code>'s <code>imports</code> section now support
having multiple targets as part of an array.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7454">#7454</a> <a
href="https://github.com/biomejs/biome/commit/ac171839a31600225e3b759470eaa026746e9cf4">
<code>ac17183</code></a> Thanks <a
href="https://github.com/arendjr"><code>@​arendjr</code></a>! - Greatly
improved performance of
<code>noImportCycles</code> by eliminating allocations.</p>
<p>In one repository, the total runtime of Biome with only
<code>noImportCycles</code> enabled went from ~23s down to ~4s.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7447">#7447</a> <a
href="https://github.com/biomejs/biome/commit/7139aad75b6e8045be6eb09425fb82eb035fb704">
<code>7139aad</code></a> Thanks <a
href="https://github.com/rriski"><code>@​rriski</code></a>! - Fixes <a
href="https://redirect.github.com/biomejs/biome/issues/7446">#7446</a>.
The GritQL
<code>$...</code> spread metavariable now correctly matches members in
object literals, aligning its behavior with arrays and function
calls.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/6710">#6710</a> <a
href="https://github.com/biomejs/biome/commit/98cf9af0a4e02434983899ce49d92209a6abab02">
<code>98cf9af</code></a> Thanks <a
href="https://github.com/arendjr"><code>@​arendjr</code></a>! - Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7423">#7423</a>:
Type inference now recognises
<em>index signatures</em> and their accesses when they are being indexed
as a string.</p>
<h4>Example</h4>
<pre lang="ts"><code>type BagOfPromises = {
  // This is an index signature definition. It declares that instances of
  // type `BagOfPromises` can be indexed using arbitrary strings.
  [property: string]: Promise&lt;void&gt;;
};

let bag: BagOfPromises = {};
// Because `bag.iAmAPromise` is equivalent to `bag[&quot;iAmAPromise&quot;]`, this is
// considered an access to the string index, and a Promise is expected.
bag.iAmAPromise;
</code></pre>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7415">#7415</a> <a
href="https://github.com/biomejs/biome/commit/d042f18f556edfd4fecff562c8f197dbec81a5e7">
<code>d042f18</code></a> Thanks <a
href="https://github.com/qraqras"><code>@​qraqras</code></a>! - Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7212">#7212</a>,
now the <a href="https://biomejs.dev/linter/rules/use-optional-chain/">
<code>useOptionalChain</code></a> rule recognizes optional chaining
using
<code>typeof</code> (e.g., <code>typeof foo !== 'undefined' &amp;&amp;
foo.bar</code>).</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7419">#7419</a> <a
href="https://github.com/biomejs/biome/commit/576baf4faf568e8b6a295f457f70894235ffdb59">
<code>576baf4</code></a> Thanks <a
href="https://github.com/Conaclos"><code>@​Conaclos</code></a>! - Fixed
<a
href="https://redirect.github.com/biomejs/biome/issues/7323">#7323</a>.
<a
href="https://biomejs.dev/linter/rules/no-unused-private-class-members/">
<code>noUnusedPrivateClassMembers</code></a> no longer reports as unused
TypeScript
<code>private</code> members if the rule encounters a computed access on
<code>this</code>.</p>
<p>In the following example, <code>member</code> was previously reported
as unused. It is no longer reported.</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md"><code>@​biomejs/biome</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>2.2.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7453">#7453</a> <a
href="https://github.com/biomejs/biome/commit/aa8cea31af675699e18988fe79242ae5d5215af1"><code>aa8cea3</code></a>
Thanks <a href="https://github.com/arendjr"><code>@​arendjr</code></a>!
- Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7242">#7242</a>:
Aliases specified in
<code>package.json</code>'s <code>imports</code> section now support
having multiple targets as part of an array.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7454">#7454</a> <a
href="https://github.com/biomejs/biome/commit/ac171839a31600225e3b759470eaa026746e9cf4"><code>ac17183</code></a>
Thanks <a href="https://github.com/arendjr"><code>@​arendjr</code></a>!
- Greatly improved performance of
<code>noImportCycles</code> by eliminating allocations.</p>
<p>In one repository, the total runtime of Biome with only
<code>noImportCycles</code> enabled went from ~23s down to ~4s.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7447">#7447</a> <a
href="https://github.com/biomejs/biome/commit/7139aad75b6e8045be6eb09425fb82eb035fb704"><code>7139aad</code></a>
Thanks <a href="https://github.com/rriski"><code>@​rriski</code></a>! -
Fixes <a
href="https://redirect.github.com/biomejs/biome/issues/7446">#7446</a>.
The GritQL
<code>$...</code> spread metavariable now correctly matches members in
object literals, aligning its behavior with arrays and function
calls.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/6710">#6710</a> <a
href="https://github.com/biomejs/biome/commit/98cf9af0a4e02434983899ce49d92209a6abab02"><code>98cf9af</code></a>
Thanks <a href="https://github.com/arendjr"><code>@​arendjr</code></a>!
- Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7423">#7423</a>:
Type inference now recognises
<em>index signatures</em> and their accesses when they are being indexed
as a string.</p>
<h4>Example</h4>
<pre lang="ts"><code>type BagOfPromises = {
  // This is an index signature definition. It declares that instances of
  // type `BagOfPromises` can be indexed using arbitrary strings.
  [property: string]: Promise&lt;void&gt;;
};

let bag: BagOfPromises = {};
// Because `bag.iAmAPromise` is equivalent to `bag[&quot;iAmAPromise&quot;]`, this is
// considered an access to the string index, and a Promise is expected.
bag.iAmAPromise;
</code></pre>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7415">#7415</a> <a
href="https://github.com/biomejs/biome/commit/d042f18f556edfd4fecff562c8f197dbec81a5e7"><code>d042f18</code></a>
Thanks <a href="https://github.com/qraqras"><code>@​qraqras</code></a>!
- Fixed <a
href="https://redirect.github.com/biomejs/biome/issues/7212">#7212</a>,
now the <a
href="https://biomejs.dev/linter/rules/use-optional-chain/"><code>useOptionalChain</code></a>
rule recognizes optional chaining using
<code>typeof</code> (e.g., <code>typeof foo !== 'undefined' &amp;&amp;
foo.bar</code>).</p>
</li>
<li>
<p><a
href="https://redirect.github.com/biomejs/biome/pull/7419">#7419</a> <a
href="https://github.com/biomejs/biome/commit/576baf4faf568e8b6a295f457f70894235ffdb59"><code>576baf4</code></a>
Thanks <a
href="https://github.com/Conaclos"><code>@​Conaclos</code></a>! - Fixed
<a
href="https://redirect.github.com/biomejs/biome/issues/7323">#7323</a>.
<a
href="https://biomejs.dev/linter/rules/no-unused-private-class-members/"><code>noUnusedPrivateClassMembers</code></a>
no longer reports as unused TypeScript
<code>private</code> members if the rule encounters a computed access on
<code>this</code>.</p>
<p>In the following example, <code>member</code> was previously reported
as unused. It is no longer reported.</p>
<pre lang="ts"><code>class TsBioo {
  private member: number;

  set_with_name(name: string, value: number) {
    this[name] = value;
  }
}
</code></pre>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/biomejs/biome/commit/5d212c5ab940ba83cc72d4aa9936ebbb1964ae7a"><code>5d212c5</code></a>
ci: release (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7450">#7450</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/351bccdfe49a6173cb1446ef2a8a9171c8d78c26"><code>351bccd</code></a>
chore: restore release files</li>
<li><a
href="https://github.com/biomejs/biome/commit/32dbfa156b3d097813ff96e53a65b4004adb3591"><code>32dbfa1</code></a>
ci: release (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7413">#7413</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/75b6a0d12f3aa30647f743d607b0d60c0470fff3"><code>75b6a0d</code></a>
feat(linter): add rule <code>noJsxLiterals</code> (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7248">#7248</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/53ff5ae34428f042bb5b80c19862c9cf69fc6359"><code>53ff5ae</code></a>
feat(analyse/json): add <code>noDuplicateDependencies</code> rule (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7142">#7142</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/daa4a66e7971800a5e15024f5b5535d072087ac9"><code>daa4a66</code></a>
ci: release (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7306">#7306</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/0f38ea689acf6c64b0e749d34da48a03a9708067"><code>0f38ea6</code></a>
chore: add new bronze sponsor (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7397">#7397</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/7f532745900936039e77cc0b4254562ec9a7376d"><code>7f53274</code></a>
docs: safety of useSortedKeys (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/6112">#6112</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/fad34b9db9778fe964ff7dbc489de0bfad2d3ece"><code>fad34b9</code></a>
feat(biome_js_analyze): add UseConsistentArrowReturn rule (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7245">#7245</a>)</li>
<li><a
href="https://github.com/biomejs/biome/commit/4416573f4d709047a28407d99381810b7bc7dcc7"><code>4416573</code></a>
feat(lint/vue): implement <code>useVueMultiWordComponentNames</code> (<a
href="https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome/issues/7373">#7373</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.4/packages/@biomejs/biome">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@biomejs/biome&package-manager=npm_and_yarn&previous-version=2.2.0&new-version=2.2.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 22:12:40 +00:00
dependabot[bot] 230df3eadf chore: bump react-confetti from 6.2.2 to 6.4.0 in /site (#20126)
Bumps [react-confetti](https://github.com/alampros/react-confetti) from
6.2.2 to 6.4.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/alampros/react-confetti/releases">react-confetti's
releases</a>.</em></p>
<blockquote>
<h2>v6.4.0</h2>
<h1><a
href="https://github.com/alampros/react-confetti/compare/v6.3.0...v6.4.0">6.4.0</a>
(2025-03-04)</h1>
<h3>Bug Fixes</h3>
<ul>
<li>clamp tweenProgress between 0 and tweenDuration (<a
href="https://github.com/alampros/react-confetti/commit/f988305151150522111c7b846303ba2263fa85e9">f988305</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>adding tweenFrom property to allow smooth transition when parameters
change (<a
href="https://github.com/alampros/react-confetti/commit/dde31e0e28da164f7b9f3909e73baa6ffe24cf68">dde31e0</a>)</li>
</ul>
<h2>v6.3.0</h2>
<h1><a
href="https://github.com/alampros/react-confetti/compare/v6.2.3...v6.3.0">6.3.0</a>
(2025-03-01)</h1>
<h3>Bug Fixes</h3>
<ul>
<li>prevent particle flicker on removal (<a
href="https://github.com/alampros/react-confetti/commit/5cb5bd87032705b90a8d85588d931b985a4127e5">5cb5bd8</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>using elapsed time in physics updates (<a
href="https://github.com/alampros/react-confetti/commit/d1626dcd16758f627749a30794075663c5fce15f">d1626dc</a>)</li>
</ul>
<h2>v6.2.3</h2>
<h2><a
href="https://github.com/alampros/react-confetti/compare/v6.2.2...v6.2.3">6.2.3</a>
(2025-02-22)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>export IConfettiOptions (<a
href="https://github.com/alampros/react-confetti/commit/73dffe8f87cb87ed7aed42effdc056ac6fe8ffde">73dffe8</a>),
closes <a
href="https://redirect.github.com/alampros/react-confetti/issues/165">#165</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/alampros/react-confetti/blob/develop/CHANGELOG.md">react-confetti's
changelog</a>.</em></p>
<blockquote>
<h1><a
href="https://github.com/alampros/react-confetti/compare/v6.3.0...v6.4.0">6.4.0</a>
(2025-03-04)</h1>
<h3>Bug Fixes</h3>
<ul>
<li>clamp tweenProgress between 0 and tweenDuration (<a
href="https://github.com/alampros/react-confetti/commit/f988305151150522111c7b846303ba2263fa85e9">f988305</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>adding tweenFrom property to allow smooth transition when parameters
change (<a
href="https://github.com/alampros/react-confetti/commit/dde31e0e28da164f7b9f3909e73baa6ffe24cf68">dde31e0</a>)</li>
</ul>
<h1><a
href="https://github.com/alampros/react-confetti/compare/v6.2.3...v6.3.0">6.3.0</a>
(2025-03-01)</h1>
<h3>Bug Fixes</h3>
<ul>
<li>prevent particle flicker on removal (<a
href="https://github.com/alampros/react-confetti/commit/5cb5bd87032705b90a8d85588d931b985a4127e5">5cb5bd8</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>using elapsed time in physics updates (<a
href="https://github.com/alampros/react-confetti/commit/d1626dcd16758f627749a30794075663c5fce15f">d1626dc</a>)</li>
</ul>
<h2><a
href="https://github.com/alampros/react-confetti/compare/v6.2.2...v6.2.3">6.2.3</a>
(2025-02-22)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>export IConfettiOptions (<a
href="https://github.com/alampros/react-confetti/commit/73dffe8f87cb87ed7aed42effdc056ac6fe8ffde">73dffe8</a>),
closes <a
href="https://redirect.github.com/alampros/react-confetti/issues/165">#165</a></li>
</ul>
</blockquote>
</details>
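The clamping fix mentioned above can be sketched in a few lines. The function name and parameters follow the changelog wording and are assumptions, not react-confetti's actual source: keeping `tweenProgress` inside `[0, tweenDuration]` stops the easing function from extrapolating past its endpoints when parameters change mid-animation.

```typescript
// Hedged sketch of the fix "clamp tweenProgress between 0 and tweenDuration".
function clampTweenProgress(progress: number, tweenDuration: number): number {
  // Clamp to the valid easing range [0, tweenDuration].
  return Math.min(Math.max(progress, 0), tweenDuration);
}
```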
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/alampros/react-confetti/commit/0d535377bc449b9515e19975315620a13ca20dba"><code>0d53537</code></a>
chore(release): 6.4.0 [skip ci]</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/641b3519007b53699b876c4b7d6e32473633366c"><code>641b351</code></a>
Merge branch 'develop'</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/6ac61ed2d39743845efe23f2c25783b1f763c165"><code>6ac61ed</code></a>
lint fix</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/90f5a594883b95cc6f6ee4e291a8878e2222a77e"><code>90f5a59</code></a>
Merge pull request <a
href="https://redirect.github.com/alampros/react-confetti/issues/172">#172</a>
from AlexJDG/bugfix/fix-tweening</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/f988305151150522111c7b846303ba2263fa85e9"><code>f988305</code></a>
fix: clamp tweenProgress between 0 and tweenDuration</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/dde31e0e28da164f7b9f3909e73baa6ffe24cf68"><code>dde31e0</code></a>
feat: adding tweenFrom property to allow smooth transition when
parameters ch...</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/5dd9f7b4ea29c9460b0fd7ffb1864d178b1aa2ac"><code>5dd9f7b</code></a>
chore: renaming property for clarity</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/31eb46d24be1398cb81d501670dec96aba518cb6"><code>31eb46d</code></a>
chore(release): 6.3.0 [skip ci]</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/e44738908b515237443cbb865cd1e61fdcc2a938"><code>e447389</code></a>
Merge branch 'develop'</li>
<li><a
href="https://github.com/alampros/react-confetti/commit/b1779918cf790752a4095df4ecb794a97b025994"><code>b177991</code></a>
fix lint action name</li>
<li>Additional commits viewable in <a
href="https://github.com/alampros/react-confetti/compare/v6.2.2...v6.4.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=react-confetti&package-manager=npm_and_yarn&previous-version=6.2.2&new-version=6.4.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 22:11:37 +00:00
Asher 94c76b97bd feat: add list_apps MCP tool (#19952) 2025-10-01 13:57:11 -08:00
Asher ebcfae27a2 feat: add task create, list, status, and delete MCP tools (#19901) 2025-10-01 13:39:45 -08:00
dependabot[bot] 0993dcfef7 chore: bump monaco-editor from 0.52.2 to 0.53.0 in /site (#20120)
Bumps [monaco-editor](https://github.com/microsoft/monaco-editor) from
0.52.2 to 0.53.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/microsoft/monaco-editor/releases">monaco-editor's
releases</a>.</em></p>
<blockquote>
<h2>v0.53.0</h2>
<h2>Changes:</h2>
<ul>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4980">#4980</a>:
sets node version</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4979">#4979</a>:
v0.53.0</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4978">#4978</a>:
updates changelog</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4975">#4975</a>:
Fixes worker sandbox problems</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4969">#4969</a>:
Implements language selection</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4967">#4967</a>:
adds amd and firefox/webkit to smoke tests</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4966">#4966</a>:
Fixes AMD web worker loading</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4965">#4965</a>:
Updates monaco-editor-core dependency &amp; fixes basic-languages amd
file</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4964">#4964</a>:
Run npm run playwright-install</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4963">#4963</a>:
Hediet/b/successful mosquito</li>
</ul>
<!-- raw HTML omitted -->
<ul>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4962">#4962</a>:
Fixes node version</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4950">#4950</a>:
ESM Progress</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4913">#4913</a>:
Bump webpack-dev-server from 4.10.0 to 5.2.1 in /samples</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4915">#4915</a>:
Add snowflake sql keywords</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4895">#4895</a>:
Fixes <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4799">microsoft/monaco-editor#4799</a></li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4742">#4742</a>:
Update webpack plugin to support module workers</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4824">#4824</a>:
Fix CI and website workflows</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4747">#4747</a>:
Engineering - add dependsOn field</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4734">#4734</a>:
Bump express from 4.19.2 to 4.21.1 in /website</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4616">#4616</a>:
Bump requirejs from 2.3.6 to 2.3.7</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4668">#4668</a>:
Bump micromatch from 4.0.5 to 4.0.8 in /website</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4726">#4726</a>:
Bump http-proxy-middleware from 2.0.6 to 2.0.7 in /website</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4595">#4595</a>:
Bump ws from 8.8.1 to 8.18.0 in /samples</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4697">#4697</a>:
Bump body-parser and express in /samples</li>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4696">#4696</a>:
Bump rollup from 2.79.1 to 2.79.2</li>
</ul>
<p>This list of changes was <a
href="https://dev.azure.com/monacotools/Monaco/_build/results?buildId=356772&amp;view=logs">auto
generated</a>.<!-- raw HTML omitted --></p>
<h2>v0.53.0-rc2</h2>
<p>No release notes provided.</p>
<h2>v0.53.0-dev-20250908</h2>
<p>No release notes provided.</p>
<h2>v0.53.0-dev-20250907</h2>
<p>No release notes provided.</p>
<h2>v0.53.0-dev-20250906</h2>
<h2>Changes:</h2>
<ul>
<li><a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4978">#4978</a>:
updates changelog</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/microsoft/monaco-editor/blob/main/CHANGELOG.md">monaco-editor's
changelog</a>.</em></p>
<blockquote>
<h2>[0.53.0]</h2>
<ul>
<li>⚠️ This release deprecates the AMD build and ships with
significant changes to it. The AMD build will still be
shipped for a while, but we don't offer support for it anymore. Please
migrate to the ESM build.</li>
</ul>
<h3>New Features</h3>
<ul>
<li>Next Edit Suggestion support.</li>
<li>Scroll On Middle Click</li>
<li>Edit Context Support</li>
</ul>
<h3>Breaking Changes</h3>
<ul>
<li>Internal AMD modules are no longer accessible. Accessing internal
AMD modules was never supported. While this is still possible in the ESM
build, we don't encourage this usage pattern.</li>
<li>The <a
href="https://github.com/microsoft/monaco-editor/blob/a4d7907bd439b06b24e334bdf2ab597bcae658b5/samples/browser-script-editor/index.html">browser-script-editor
scenario</a> for unbundled synchronous script import and editor creation
no longer works. Instead, the ESM build should be used with a bundler,
such as vite or webpack.</li>
<li>Custom AMD workers don't work anymore out of the box.</li>
</ul>
<h2>[0.52.0]</h2>
<ul>
<li>Comment added inside of <code>IModelContentChangedEvent</code></li>
</ul>
<h2>[0.51.0]</h2>
<ul>
<li>New fields <code>IEditorOptions.placeholder</code> and
<code>IEditorOptions.compactMode</code></li>
<li>New fields <code>IGotoLocationOptions.multipleTests</code> and
<code>IGotoLocationOptions.alternativeTestsCommand</code></li>
<li>New field <code>IInlineEditOptions.backgroundColoring</code></li>
<li>New experimental field
<code>IEditorOptions.experimental.useTrueInlineView</code></li>
<li>New options <code>CommentThreadRevealOptions</code> for
comments</li>
</ul>
<p>Contributions to <code>monaco-editor</code>:</p>
<ul>
<li><a href="https://github.com/ScottCarda-MS"><code>@​ScottCarda-MS
(Scott Carda)</code></a>: Update Q# Keywords [PR <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4586">#4586</a>](<a
href="https://redirect.github.com/microsoft/monaco-editor/pull/4586">microsoft/monaco-editor#4586</a>)</li>
</ul>
<h2>[0.50.0]</h2>
<ul>
<li>New field
<code>IEditorMinimapOptions.sectionHeaderLetterSpacing</code></li>
<li>New field <code>IOverlayWidgetPosition.stackOridinal</code></li>
<li>New field <code>EmitOutput.diagnostics</code></li>
<li>New event <code>IOverlayWidget.onDidLayout</code></li>
<li>New events <code>ICodeEditor.onBeginUpdate</code> and
<code>ICodeEditor.onEndUpdate</code></li>
<li><code>HoverVerbosityRequest.action</code> -&gt;
<code>HoverVerbosityRequest.verbosityDelta</code></li>
<li><code>MultiDocumentHighlightProvider.selector</code> changed from
<code>LanguageFilter</code> to <code>LanguageSelector</code></li>
<li>New optional parameters in <code>getEmitOutput</code>:
<code>emitOnlyDtsFiles</code> and <code>forceDtsEmit</code></li>
</ul>
<p>Contributions to <code>monaco-editor</code>:</p>
<ul>
<li><a href="https://github.com/htcfreek"><code>@​htcfreek
(Heiko)</code></a>: Add extension to <code>csp.contribution.ts</code>
[PR <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4504">#4504</a>](<a
href="https://redirect.github.com/microsoft/monaco-editor/pull/4504">microsoft/monaco-editor#4504</a>)</li>
<li><a href="https://github.com/jakebailey"><code>@​jakebailey (Jake
Bailey)</code></a>: Call clearFiles on internal EmitOutput diagnostics,
pass args down [PR <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4482">#4482</a>](<a
href="https://redirect.github.com/microsoft/monaco-editor/pull/4482">microsoft/monaco-editor#4482</a>)</li>
<li><a href="https://github.com/johnyanarella"><code>@​johnyanarella
(John Yanarella)</code></a>: Update TypeScript to TS 5.4.5 in all
projects, vendored files [PR <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4305">#4305</a>](<a
href="https://redirect.github.com/microsoft/monaco-editor/pull/4305">microsoft/monaco-editor#4305</a>)</li>
<li><a
href="https://github.com/samstrohkorbatt"><code>@​samstrohkorbatt</code></a>:
Adding Python f-string syntax support [PR <a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4401">#4401</a>](<a
href="https://redirect.github.com/microsoft/monaco-editor/pull/4401">microsoft/monaco-editor#4401</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/4e45ba0c5ff45fc61c0ccac61c0987369df04a6e"><code>4e45ba0</code></a>
sets node version (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4980">#4980</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/c7f027ed9e925e7704f24eb93d99defa867cda7c"><code>c7f027e</code></a>
v0.53.0 (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4979">#4979</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/c41951a02b7e9497484ed259a4af7197c4b0c123"><code>c41951a</code></a>
updates changelog (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4978">#4978</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/759c442daedd192c1b61abacb624b68665dbd1b6"><code>759c442</code></a>
Fixes css and amd output</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/6f3fbe8c3a8e14adff806a554cebe97aa8f4a2a3"><code>6f3fbe8</code></a>
Fixes worker sandbox problems (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4975">#4975</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/a4d7907bd439b06b24e334bdf2ab597bcae658b5"><code>a4d7907</code></a>
Implements language selection (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4969">#4969</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/9e4368a8e92e9035d9bdc719a90f908a01067c41"><code>9e4368a</code></a>
adds amd and firefox/webkit to smoke tests (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4967">#4967</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/d2c20a1ad77b3aa406aad09f9576e91619fa5dcb"><code>d2c20a1</code></a>
Fixes AMD web worker loading (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4966">#4966</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/3bfde9adce4057c11fcb56ffa39fa5ab9a463875"><code>3bfde9a</code></a>
Updates monaco-editor-core dependency &amp; fixes basic-languages amd
file (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4965">#4965</a>)</li>
<li><a
href="https://github.com/microsoft/monaco-editor/commit/15e0a937777d0b7639d4c987e00ad5fcf4d1d47e"><code>15e0a93</code></a>
Run npm run playwright-install (<a
href="https://redirect.github.com/microsoft/monaco-editor/issues/4964">#4964</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/microsoft/monaco-editor/compare/v0.52.2...v0.53.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=monaco-editor&package-manager=npm_and_yarn&previous-version=0.52.2&new-version=0.53.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:57:55 +00:00
dependabot[bot] f05ccfe525 chore: bump postcss from 8.5.1 to 8.5.6 in /site (#20113)
Bumps [postcss](https://github.com/postcss/postcss) from 8.5.1 to 8.5.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/postcss/releases">postcss's
releases</a>.</em></p>
<blockquote>
<h2>8.5.6</h2>
<ul>
<li>Fixed <code>ContainerWithChildren</code> type discriminating (by <a
href="https://github.com/Goodwine"><code>@​Goodwine</code></a>).</li>
</ul>
<h2>8.5.5</h2>
<ul>
<li>Fixed <code>package.json</code>→<code>exports</code> compatibility
with some tools (by <a
href="https://github.com/JounQin"><code>@​JounQin</code></a>).</li>
</ul>
<h2>8.5.4</h2>
<ul>
<li>Fixed Parcel compatibility issue (by <a
href="https://github.com/git-sumitchaudhary"><code>@​git-sumitchaudhary</code></a>).</li>
</ul>
<h2>8.5.3</h2>
<ul>
<li>Added more details to <code>Unknown word</code> error (by <a
href="https://github.com/hiepxanh"><code>@​hiepxanh</code></a>).</li>
<li>Fixed types (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
<li>Fixed docs (by <a
href="https://github.com/catnipan"><code>@​catnipan</code></a>).</li>
</ul>
<h2>8.5.2</h2>
<ul>
<li>Fixed end position of rules with semicolon (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/postcss/blob/main/CHANGELOG.md">postcss's
changelog</a>.</em></p>
<blockquote>
<h2>8.5.6</h2>
<ul>
<li>Fixed <code>ContainerWithChildren</code> type discriminating (by <a
href="https://github.com/Goodwine"><code>@​Goodwine</code></a>).</li>
</ul>
<h2>8.5.5</h2>
<ul>
<li>Fixed <code>package.json</code>→<code>exports</code> compatibility
with some tools (by <a
href="https://github.com/JounQin"><code>@​JounQin</code></a>).</li>
</ul>
<h2>8.5.4</h2>
<ul>
<li>Fixed Parcel compatibility issue (by <a
href="https://github.com/git-sumitchaudhary"><code>@​git-sumitchaudhary</code></a>).</li>
</ul>
<h2>8.5.3</h2>
<ul>
<li>Added more details to <code>Unknown word</code> error (by <a
href="https://github.com/hiepxanh"><code>@​hiepxanh</code></a>).</li>
<li>Fixed types (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
<li>Fixed docs (by <a
href="https://github.com/catnipan"><code>@​catnipan</code></a>).</li>
</ul>
<h2>8.5.2</h2>
<ul>
<li>Fixed end position of rules with semicolon (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/postcss/postcss/commit/91d6eb5c3d1ca8acb4e8e3926005acf2b066c211"><code>91d6eb5</code></a>
Release 8.5.6 version</li>
<li><a
href="https://github.com/postcss/postcss/commit/65ffc55117bf4289b1f977986ed76fad402641b1"><code>65ffc55</code></a>
Update dependencies</li>
<li><a
href="https://github.com/postcss/postcss/commit/ecd20eb7f9587d63e3f3348b768aec0e9fb000d3"><code>ecd20eb</code></a>
Fix ContainerWithChildren to allow discriminating the node type by
comparing ...</li>
<li><a
href="https://github.com/postcss/postcss/commit/c18159719e4a6d65ad7085edf1dc42e07814f683"><code>c181597</code></a>
Release 8.5.5 version</li>
<li><a
href="https://github.com/postcss/postcss/commit/c5523fbec5f32622e77103c643e1258007c2609d"><code>c5523fb</code></a>
Update dependencies</li>
<li><a
href="https://github.com/postcss/postcss/commit/2e3450c55f41e378e086f4f189e5243a573c3390"><code>2e3450c</code></a>
refactor: <code>import</code> should be listed before
<code>require</code> (<a
href="https://redirect.github.com/postcss/postcss/issues/2052">#2052</a>)</li>
<li><a
href="https://github.com/postcss/postcss/commit/4d720bd01adec2e8645bf91e725825bebb712e1b"><code>4d720bd</code></a>
Update EM text</li>
<li><a
href="https://github.com/postcss/postcss/commit/6cb4a6673fb6d8b23eb1ebe66a22b6267ab141de"><code>6cb4a66</code></a>
Release 8.5.4 version</li>
<li><a
href="https://github.com/postcss/postcss/commit/ec5c1e031083664bed1cf91eaac72f8c61068110"><code>ec5c1e0</code></a>
Update dependencies</li>
<li><a
href="https://github.com/postcss/postcss/commit/e85e9385c87499bc7e274c6ce332cf59e3988994"><code>e85e938</code></a>
Fix code format</li>
<li>Additional commits viewable in <a
href="https://github.com/postcss/postcss/compare/8.5.1...8.5.6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=postcss&package-manager=npm_and_yarn&previous-version=8.5.1&new-version=8.5.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:57:46 +00:00
dependabot[bot] b68c648476 chore: bump react-resizable-panels from 3.0.3 to 3.0.6 in /site (#20125)
Bumps
[react-resizable-panels](https://github.com/bvaughn/react-resizable-panels)
from 3.0.3 to 3.0.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/bvaughn/react-resizable-panels/releases">react-resizable-panels's
releases</a>.</em></p>
<blockquote>
<h2>3.0.6</h2>
<ul>
<li><a
href="https://redirect.github.com/bvaughn/react-resizable-panels/pull/517">#517</a>:
Fixed Firefox bug that caused resizing to be interrupted unexpectedly</li>
</ul>
<h2>3.0.5</h2>
<ul>
<li><a
href="https://redirect.github.com/bvaughn/react-resizable-panels/pull/512">#512</a>:
Fixed size precision regression from 2.0.17</li>
</ul>
<h2>3.0.4</h2>
<ul>
<li><a
href="https://redirect.github.com/bvaughn/react-resizable-panels/pull/503">#503</a>:
Support custom cursors</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/43a071df6bb96d54e8b81a6be0e2d453af45cc4b"><code>43a071d</code></a>
3.0.5 -&gt; 3.0.6</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/2bb5e969431581f57f6507726ca26998f5d25663"><code>2bb5e96</code></a>
Fix Firefox drag bug (<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/517">#517</a>)</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/f65a4815c73d43ae884c0465d59da76991f4d14e"><code>f65a481</code></a>
3.0.4 -&gt; 3.0.5</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/0bfe50b08535970513c1b4d91d23f3d223fb24f0"><code>0bfe50b</code></a>
Prettier</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/25c91ac9d77db5c6e00f3d19690cd3780e431289"><code>25c91ac</code></a>
reintroduce toFixed changes from 2.0.16 removed in 2.0.17 (<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/512">#512</a>)</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/97b6d482f5d83da1f561874619eff95026efee6f"><code>97b6d48</code></a>
Update Code Sandbox link</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/45300ec4463da75caa2147243e16de19c4183d33"><code>45300ec</code></a>
Remove duplication in Panel component's Imperative API docs. (<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/511">#511</a>)</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/8d426e7fbddf2a6fe1c5e9e6c91f23ea30107ce0"><code>8d426e7</code></a>
Fix typo in README.md (<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/509">#509</a>)</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/5af2d8d030a034e9295bc31d8229f70004dd2e25"><code>5af2d8d</code></a>
3.0.3 -&gt; 3.0.4</li>
<li><a
href="https://github.com/bvaughn/react-resizable-panels/commit/9b69a44a1f32657a9c6e2d41b73153814909e9a7"><code>9b69a44</code></a>
Support custom cursors (<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/503">#503</a>)
(<a
href="https://redirect.github.com/bvaughn/react-resizable-panels/issues/504">#504</a>)</li>
<li>See full diff in <a
href="https://github.com/bvaughn/react-resizable-panels/compare/3.0.3...3.0.6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=react-resizable-panels&package-manager=npm_and_yarn&previous-version=3.0.3&new-version=3.0.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:51:57 +00:00
dependabot[bot] e89c7cdcef chore: bump @fontsource/ibm-plex-mono from 5.1.1 to 5.2.7 in /site (#20127)
Bumps
[@fontsource/ibm-plex-mono](https://github.com/fontsource/font-files/tree/HEAD/fonts/google/ibm-plex-mono)
from 5.1.1 to 5.2.7.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/fontsource/font-files/commits/HEAD/fonts/google/ibm-plex-mono">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@fontsource/ibm-plex-mono&package-manager=npm_and_yarn&previous-version=5.1.1&new-version=5.2.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:51:43 +00:00
dependabot[bot] b92a0f428c chore: bump autoprefixer from 10.4.20 to 10.4.21 in /site (#20124)
Bumps [autoprefixer](https://github.com/postcss/autoprefixer) from
10.4.20 to 10.4.21.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/autoprefixer/releases">autoprefixer's
releases</a>.</em></p>
<blockquote>
<h2>10.4.21</h2>
<ul>
<li>Fixed old <code>-moz-</code> prefix for
<code>:placeholder-shown</code> (by <a
href="https://github.com/Marukome0743"><code>@​Marukome0743</code></a>).</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/autoprefixer/blob/main/CHANGELOG.md">autoprefixer's
changelog</a>.</em></p>
<blockquote>
<h2>10.4.21</h2>
<ul>
<li>Fixed old <code>-moz-</code> prefix for
<code>:placeholder-shown</code> (by <a
href="https://github.com/Marukome0743"><code>@​Marukome0743</code></a>).</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/postcss/autoprefixer/commit/541295c0e6dd348db2d3f52772b59cd403c59d29"><code>541295c</code></a>
Release 10.4.21 version</li>
<li><a
href="https://github.com/postcss/autoprefixer/commit/8d555f7e5e665d6a70e1d08db6e0bc9c4262db66"><code>8d555f7</code></a>
Update dependencies and sort imports</li>
<li><a
href="https://github.com/postcss/autoprefixer/commit/5c2421e82af45ee085d0806110fcef66bbebe59b"><code>5c2421e</code></a>
Update Node.js and pnpm on CI</li>
<li><a
href="https://github.com/postcss/autoprefixer/commit/af9cb5f365f66bf5169f1f42e08036651453b1a6"><code>af9cb5f</code></a>
fix: replace <code>:-moz-placeholder-shown</code> with
<code>:-moz-placeholder</code> (<a
href="https://redirect.github.com/postcss/autoprefixer/issues/1532">#1532</a>)</li>
<li>See full diff in <a
href="https://github.com/postcss/autoprefixer/compare/10.4.20...10.4.21">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=autoprefixer&package-manager=npm_and_yarn&previous-version=10.4.20&new-version=10.4.21)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:51:24 +00:00
dependabot[bot] cf8d7665e3 chore: bump chroma-js from 2.4.2 to 2.6.0 in /site (#20123)
Bumps [chroma-js](https://github.com/gka/chroma.js) from 2.4.2 to 2.6.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gka/chroma.js/releases">chroma-js's
releases</a>.</em></p>
<blockquote>
<h2>v2.6.0</h2>
<h2>What's Changed</h2>
<ul>
<li>🎉 NEW: color.tint + color.shade (thanks to <a
href="https://github.com/tremby"><code>@​tremby</code></a> in <a
href="https://redirect.github.com/gka/chroma.js/pull/246">gka/chroma.js#246</a>)</li>
<li>fix: remove false w3c color cornflower (thanks to <a
href="https://github.com/friedPotat0"><code>@​friedPotat0</code></a> in
<a
href="https://redirect.github.com/gka/chroma.js/pull/298">gka/chroma.js#298</a>)</li>
<li>docs: replace website with archive by <a
href="https://github.com/Artoria2e5"><code>@​Artoria2e5</code></a> in <a
href="https://redirect.github.com/gka/chroma.js/pull/264">gka/chroma.js#264</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gka/chroma.js/compare/v2.5.0...v2.6.0">https://github.com/gka/chroma.js/compare/v2.5.0...v2.6.0</a></p>
<h2>v2.5.0</h2>
<h2>What's Changed</h2>
<ul>
<li>🎉 Big code refactoring to ES modules plus dependency and build setup
updates by <a href="https://github.com/gka"><code>@​gka</code></a> &amp;
<a href="https://github.com/zyyv"><code>@​zyyv</code></a> in <a
href="https://redirect.github.com/gka/chroma.js/pull/336">gka/chroma.js#336</a></li>
<li>Update 404 links in readme.md by <a
href="https://github.com/rdela"><code>@​rdela</code></a> in <a
href="https://redirect.github.com/gka/chroma.js/pull/326">gka/chroma.js#326</a></li>
<li>Update readme.md by <a
href="https://github.com/JiatLn"><code>@​JiatLn</code></a> in <a
href="https://redirect.github.com/gka/chroma.js/pull/307">gka/chroma.js#307</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/zyyv"><code>@​zyyv</code></a> made their
first contribution in <a
href="https://redirect.github.com/gka/chroma.js/pull/336">gka/chroma.js#336</a></li>
<li><a href="https://github.com/rdela"><code>@​rdela</code></a> made
their first contribution in <a
href="https://redirect.github.com/gka/chroma.js/pull/326">gka/chroma.js#326</a></li>
<li><a href="https://github.com/JiatLn"><code>@​JiatLn</code></a> made
their first contribution in <a
href="https://redirect.github.com/gka/chroma.js/pull/307">gka/chroma.js#307</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gka/chroma.js/compare/v2.3.0...v2.5.0">https://github.com/gka/chroma.js/compare/v2.3.0...v2.5.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/gka/chroma.js/blob/main/CHANGELOG.md">chroma-js's
changelog</a>.</em></p>
<blockquote>
<h3>2.6.0</h3>
<ul>
<li>🎉 NEW: add <code>color.shade()</code> and
<code>color.tint()</code>.</li>
<li>fix: remove false w3c color cornflower</li>
</ul>
<h3>2.5.0</h3>
<ul>
<li>refactored code base to ES6 modules</li>
</ul>
<h3>2.4.0</h3>
<ul>
<li>add support for Oklab and Oklch color spaces</li>
</ul>
<h3>2.3.0</h3>
<ul>
<li>use binom of degree n in chroma.bezier</li>
</ul>
<h3>2.2.0</h3>
<ul>
<li>use Delta E2000 for chroma.deltaE <a
href="https://redirect.github.com/gka/chroma.js/issues/269">#269</a></li>
</ul>
<h3>2.0.3</h3>
<ul>
<li>hsl2rgb will, like other x2rgb conversions, now set the default alpha
to 1</li>
</ul>
<h3>2.0.2</h3>
<ul>
<li>use a more mangle-safe check for Color class constructor to fix
issues with uglifyjs and terser</li>
</ul>
<h3>2.0.1</h3>
<ul>
<li>added <code>chroma.valid()</code> for checking if a color can be
parsed by chroma.js</li>
</ul>
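As a rough illustration of what such a validity check does, here is a hex-only toy stand-in (the hex-only scope is a simplification; the real <code>chroma.valid()</code> also accepts named colors, <code>rgb()</code> strings, and more):

```javascript
// Toy stand-in for chroma.valid(): accepts only 3/6/8-digit hex strings.
// Illustrative only — not the chroma.js implementation.
function valid(color) {
  return (
    typeof color === "string" &&
    /^#(?:[0-9a-f]{3}|[0-9a-f]{6}|[0-9a-f]{8})$/i.test(color)
  );
}
```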
<h3>2.0.0</h3>
<ul>
<li>chroma.js has been ported from CoffeeScript to ES6! This means you
can now import parts of chroma in your projects!</li>
<li>changed HCG input space from [0..360,0..100,0..100] to
[0..360,0..1,0..1] (to be in line with HSL)</li>
<li>added new object unpacking (e.g. <code>hsl2rgb({h,s,l})</code>)</li>
<li>changed default interpolation to <code>lrgb</code> in
mix/interpolate and average.</li>
<li>if colors can't be parsed correctly, chroma will now throw Errors
instead of silently failing with console.errors</li>
</ul>
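The switch to <code>lrgb</code> interpolation can be sketched as follows — mix channel values in linear-RGB space by averaging their squares and taking the square root, using the simple gamma-2 approximation (the name <code>lrgbMix</code> and the fixed gamma are assumptions for illustration, not the chroma.js API):

```javascript
// Linear-RGB channel mix via the gamma-2 approximation:
// result = sqrt((1-t)*a^2 + t*b^2) per channel, rounded to an integer.
function lrgbMix(c1, c2, t = 0.5) {
  return c1.map((v, i) =>
    Math.round(Math.sqrt((1 - t) * v * v + t * c2[i] * c2[i]))
  );
}
```

Mixing pure red and blue this way gives <code>[180, 0, 180]</code> rather than the visibly darker <code>[128, 0, 128]</code> of a naive sRGB average, which is the usual motivation for defaulting to <code>lrgb</code>.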
<h3>1.4.1</h3>
<ul>
<li>chroma.scale() now interprets <code>null</code> as NaN and returns
the fallback color. Before it had interpreted <code>null</code> as
<code>0</code></li>
<li>added <code>scale.nodata()</code> to allow customizing the
previously hard-coded fallback (aka &quot;no data&quot;) color
#cccccc</li>
</ul>
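A minimal sketch of this fallback behavior, assuming a toy two-color linear scale (<code>makeScale</code> and its signature are illustrative, not the chroma.js API):

```javascript
// Toy two-color linear scale: null and NaN inputs return a configurable
// fallback ("no data") color instead of being coerced to 0.
function makeScale([c1, c2], nodata = "#cccccc") {
  const byte = (v) => Math.round(v).toString(16).padStart(2, "0");
  return (t) => {
    if (t === null || Number.isNaN(t)) return nodata;
    const k = Math.max(0, Math.min(1, t));
    return "#" + c1.map((v, i) => byte(v + (c2[i] - v) * k)).join("");
  };
}
```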
<h3>1.4.0</h3>
<ul>
<li>color.hex() now automatically sets the mode to 'rgba' if the color's
alpha channel is &lt; 1. so
<code>chroma('rgba(255,0,0,.5)').hex()</code> will now return
<code>&quot;#ff000080&quot;</code> instead of
<code>&quot;#ff0000&quot;</code>. if this is not what you want, you must
explicitly set the mode to <code>rgb</code> using
<code>.hex(&quot;rgb&quot;)</code>.</li>
<li>bugfix in chroma.average in LRGB mode (<a
href="https://redirect.github.com/gka/chroma.js/issues/187">#187</a>)</li>
<li>chroma.scale now also works with just one color (<a
href="https://redirect.github.com/gka/chroma.js/issues/180">#180</a>)</li>
</ul>
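The hex-with-alpha rule described above can be sketched like this (the function <code>toHex</code> and its parameters are assumptions for illustration, not the chroma.js API):

```javascript
// Format RGB(A) as hex: when alpha < 1, append an alpha byte
// automatically — unless the mode is explicitly forced to "rgb".
function toHex(r, g, b, a = 1, mode = "auto") {
  const byte = (v) => Math.round(v).toString(16).padStart(2, "0");
  const s = "#" + byte(r) + byte(g) + byte(b);
  return mode !== "rgb" && a < 1 ? s + byte(a * 255) : s;
}
```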
<h3>1.3.5</h3>
<ul>
<li>added LRGB interpolation</li>
</ul>
<h3>1.3.4</h3>
<ul>
<li>passing <em>null</em> as mode in scale.colors will return chroma
objects</li>
</ul>
<h3>1.3.3</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/gka/chroma.js/commits/v2.6.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=chroma-js&package-manager=npm_and_yarn&previous-version=2.4.2&new-version=2.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:50:53 +00:00
dependabot[bot] 4cabb9528e chore: bump @types/lodash from 4.17.15 to 4.17.20 in /site (#20122)
Bumps
[@types/lodash](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/lodash)
from 4.17.15 to 4.17.20.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/lodash">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@types/lodash&package-manager=npm_and_yarn&previous-version=4.17.15&new-version=4.17.20)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:46:34 +00:00
dependabot[bot] 29aa1cc572 chore: bump tailwindcss from 3.4.17 to 3.4.18 in /site (#20117)

Bumps
[tailwindcss](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/tailwindcss)
from 3.4.17 to 3.4.18.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/releases">tailwindcss's
releases</a>.</em></p>
<blockquote>
<h2>v3.4.18</h2>
<h3>Fixed</h3>
<ul>
<li>Improve support for raw <code>supports-[…]</code> queries in
arbitrary values (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/13605">#13605</a>)</li>
<li>Fix <code>require.cache</code> error when loaded through a
TypeScript file in Node 22.18+ (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18665">#18665</a>)</li>
<li>Support <code>import.meta.resolve(…)</code> in configs for new
enough Node.js versions (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18938">#18938</a>)</li>
<li>Allow using newer versions of <code>postcss-load-config</code> for
better ESM and TypeScript PostCSS config support with the CLI (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18938">#18938</a>)</li>
<li>Remove irrelevant utility rules when matching important classes (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19030">#19030</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md">tailwindcss's
changelog</a>.</em></p>
<blockquote>
<h2>[3.4.18] - 2024-10-01</h2>
<h3>Fixed</h3>
<ul>
<li>Improve support for raw <code>supports-[…]</code> queries in
arbitrary values (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/13605">#13605</a>)</li>
<li>Fix <code>require.cache</code> error when loaded through a
TypeScript file in Node 22.18+ (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18665">#18665</a>)</li>
<li>Support <code>import.meta.resolve(…)</code> in configs for new
enough Node.js versions (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18938">#18938</a>)</li>
<li>Allow using newer versions of <code>postcss-load-config</code> for
better ESM and TypeScript PostCSS config support with the CLI (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/18938">#18938</a>)</li>
<li>Remove irrelevant utility rules when matching important classes (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19030">#19030</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/tailwindlabs/tailwindcss/commits/v3.4.18/packages/tailwindcss">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=tailwindcss&package-manager=npm_and_yarn&previous-version=3.4.17&new-version=3.4.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:42:53 +00:00
dependabot[bot] 30f81edbce chore: bump google.golang.org/api from 0.250.0 to 0.251.0 (#20110)
Bumps
[google.golang.org/api](https://github.com/googleapis/google-api-go-client)
from 0.250.0 to 0.251.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-go-client/releases">google.golang.org/api's
releases</a>.</em></p>
<blockquote>
<h2>v0.251.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-go-client/compare/v0.250.0...v0.251.0">0.251.0</a>
(2025-09-30)</h2>
<h3>Features</h3>
<ul>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3319">#3319</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/7ef0f9bc31e15c6998e4b26b511bf1d5d50a6970">7ef0f9b</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3321">#3321</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/2cb519b1a2d45c30fe1dfde10b47a83424d30231">2cb519b</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3322">#3322</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/3e4bc6062699a1710d50f68b694bbe3a50132e82">3e4bc60</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3324">#3324</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/b41b5a5c9ef21d7499b6f206c214c4a16933e3b7">b41b5a5</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3325">#3325</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/8c5ef06788b235fcfb78e7226cf9905e88c96628">8c5ef06</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-go-client/blob/main/CHANGES.md">google.golang.org/api's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/google-api-go-client/compare/v0.250.0...v0.251.0">0.251.0</a>
(2025-09-30)</h2>
<h3>Features</h3>
<ul>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3319">#3319</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/7ef0f9bc31e15c6998e4b26b511bf1d5d50a6970">7ef0f9b</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3321">#3321</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/2cb519b1a2d45c30fe1dfde10b47a83424d30231">2cb519b</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3322">#3322</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/3e4bc6062699a1710d50f68b694bbe3a50132e82">3e4bc60</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3324">#3324</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/b41b5a5c9ef21d7499b6f206c214c4a16933e3b7">b41b5a5</a>)</li>
<li><strong>all:</strong> Auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3325">#3325</a>)
(<a
href="https://github.com/googleapis/google-api-go-client/commit/8c5ef06788b235fcfb78e7226cf9905e88c96628">8c5ef06</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/bce8b63a2760d6bb50b56bc120bc79c5a5a4f53a"><code>bce8b63</code></a>
chore(main): release 0.251.0 (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3320">#3320</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/65e7830d404c48623fbd05f4bebdb3c8cc8b8ea9"><code>65e7830</code></a>
chore(all): update all (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3323">#3323</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/8c5ef06788b235fcfb78e7226cf9905e88c96628"><code>8c5ef06</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3325">#3325</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/b41b5a5c9ef21d7499b6f206c214c4a16933e3b7"><code>b41b5a5</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3324">#3324</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/3e4bc6062699a1710d50f68b694bbe3a50132e82"><code>3e4bc60</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3322">#3322</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/2cb519b1a2d45c30fe1dfde10b47a83424d30231"><code>2cb519b</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3321">#3321</a>)</li>
<li><a
href="https://github.com/googleapis/google-api-go-client/commit/7ef0f9bc31e15c6998e4b26b511bf1d5d50a6970"><code>7ef0f9b</code></a>
feat(all): auto-regenerate discovery clients (<a
href="https://redirect.github.com/googleapis/google-api-go-client/issues/3319">#3319</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/google-api-go-client/compare/v0.250.0...v0.251.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=google.golang.org/api&package-manager=go_modules&previous-version=0.250.0&new-version=0.251.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:42:34 +00:00
dependabot[bot] a191ff9aa2 chore: bump dayjs from 1.11.13 to 1.11.18 in /site (#20115)
Bumps [dayjs](https://github.com/iamkun/dayjs) from 1.11.13 to 1.11.18.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/iamkun/dayjs/releases">dayjs's
releases</a>.</em></p>
<blockquote>
<h2>v1.11.18</h2>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.17...v1.11.18">1.11.18</a>
(2025-08-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>error semantic-release dependency (<a
href="https://github.com/iamkun/dayjs/commit/8cfb31386d840d31e9655870f4d8c01592eb753a">8cfb313</a>)</li>
</ul>
<h2>v1.11.17</h2>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.16...v1.11.17">1.11.17</a>
(2025-08-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>[en-AU] locale use the same ordinal as moment (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2878">#2878</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/1b95ecd21d4feafe7ab113a2d48d7d8d93bb95c9">1b95ecd</a>)</li>
</ul>
<h2>v1.11.16</h2>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.15...v1.11.16">1.11.16</a>
(2025-08-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>test release workflow (no code changes) (<a
href="https://github.com/iamkun/dayjs/commit/c38c428a78c344699eff373adfc8c007bb3a514f">c38c428</a>)</li>
</ul>
<h2>v1.11.15</h2>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.14...v1.11.15">1.11.15</a>
(2025-08-28)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix misspellings in Irish or Irish Gaelic [ga] (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2861">#2861</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/9c14a4245a8e764ee3260ff17a7ff48dfd09d279">9c14a42</a>)</li>
</ul>
<h2>v1.11.14</h2>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.13...v1.11.14">1.11.14</a>
(2025-08-27)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>.utcOffset(0, true) result and its clone are different bug (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2505">#2505</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/fefdcd4b6b807786f65139b6dd801e0014d7dc6f">fefdcd4</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/iamkun/dayjs/blob/v1.11.18/CHANGELOG.md">dayjs's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.17...v1.11.18">1.11.18</a>
(2025-08-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>error semantic-release dependency (<a
href="https://github.com/iamkun/dayjs/commit/8cfb31386d840d31e9655870f4d8c01592eb753a">8cfb313</a>)</li>
</ul>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.16...v1.11.17">1.11.17</a>
(2025-08-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>[en-AU] locale use the same ordinal as moment (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2878">#2878</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/1b95ecd21d4feafe7ab113a2d48d7d8d93bb95c9">1b95ecd</a>)</li>
</ul>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.15...v1.11.16">1.11.16</a>
(2025-08-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>test release workflow (no code changes) (<a
href="https://github.com/iamkun/dayjs/commit/c38c428a78c344699eff373adfc8c007bb3a514f">c38c428</a>)</li>
</ul>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.14...v1.11.15">1.11.15</a>
(2025-08-28)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix misspellings in Irish or Irish Gaelic [ga] (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2861">#2861</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/9c14a4245a8e764ee3260ff17a7ff48dfd09d279">9c14a42</a>)</li>
</ul>
<h2><a
href="https://github.com/iamkun/dayjs/compare/v1.11.13...v1.11.14">1.11.14</a>
(2025-08-27)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>.utcOffset(0, true) result and its clone are different bug (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2505">#2505</a>)
(<a
href="https://github.com/iamkun/dayjs/commit/fefdcd4b6b807786f65139b6dd801e0014d7dc6f">fefdcd4</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/iamkun/dayjs/commit/9beb3f3ea595c6046cbe863ee6d15d63e6fb2b07"><code>9beb3f3</code></a>
chore(release): 1.11.18 [skip ci]</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/d72d0cfb58595259e5b634da54f66b355e306f29"><code>d72d0cf</code></a>
Merge pull request <a
href="https://redirect.github.com/iamkun/dayjs/issues/2925">#2925</a>
from iamkun/dev</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/9be50d5b504afd619efe970b3a40b11d95b98a8b"><code>9be50d5</code></a>
chore: update workflow</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/8cfb31386d840d31e9655870f4d8c01592eb753a"><code>8cfb313</code></a>
fix: error semantic-release dependency</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/b9815f988447425dfb6cd63c42cf0487953ddb5e"><code>b9815f9</code></a>
chore: update workflow</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/7fcf939982437e50c08d33b6e1e53ab46f53c1f4"><code>7fcf939</code></a>
chore(release): 1.11.17 [skip ci]</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/b832bab7b95f9756bba31a1c12d0f0c1678828d0"><code>b832bab</code></a>
d2m (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2922">#2922</a>)</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/1b95ecd21d4feafe7ab113a2d48d7d8d93bb95c9"><code>1b95ecd</code></a>
fix: [en-AU] locale use the same ordinal as moment (<a
href="https://redirect.github.com/iamkun/dayjs/issues/2878">#2878</a>)</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/5465380fbc16ca1bd36cfd2f3e2295d2f447602c"><code>5465380</code></a>
chore: update .npmignore</li>
<li><a
href="https://github.com/iamkun/dayjs/commit/fcdbc82d6fa299a4ddb2040e1ed20c3917c1e615"><code>fcdbc82</code></a>
chore: update workflow debug <code>@​semantic-release/github</code></li>
<li>Additional commits viewable in <a
href="https://github.com/iamkun/dayjs/compare/v1.11.13...v1.11.18">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=dayjs&package-manager=npm_and_yarn&previous-version=1.11.13&new-version=1.11.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:42:26 +00:00
dependabot[bot] f68495aa6e chore: bump ua-parser-js from 1.0.40 to 1.0.41 in /site (#20116)
Bumps [ua-parser-js](https://github.com/faisalman/ua-parser-js) from
1.0.40 to 1.0.41.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faisalman/ua-parser-js/releases">ua-parser-js's
releases</a>.</em></p>
<blockquote>
<h2>v1.0.41</h2>
<h2>Version 0.7.41 / 1.0.41</h2>
<ul>
<li>Add new browser: Daum, Ladybird</li>
<li>Add new device vendor: HMD</li>
<li>Add new engine: LibWeb</li>
<li>Add new os: Windows IoT, Ubuntu Touch</li>
<li>Improve cpu detection: ARM, x86</li>
<li>Improve device vendor detection: Apple, Archos, Generic, Google,
Honor, Huawei, Infinix, Nvidia, Lenovo, Nokia, OnePlus, Xiaomi</li>
<li>Improve device type detection: smarttv, wearables</li>
<li>Improve os detection: Linux, Symbian</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faisalman/ua-parser-js/compare/1.0.40...1.0.41">https://github.com/faisalman/ua-parser-js/compare/1.0.40...1.0.41</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/faisalman/ua-parser-js/blob/master/CHANGELOG.md">ua-parser-js's
changelog</a>.</em></p>
<blockquote>
<h2>Version 0.7.41 / 1.0.41</h2>
<ul>
<li>Add new browser: Daum, Ladybird</li>
<li>Add new device vendor: HMD</li>
<li>Add new engine: LibWeb</li>
<li>Add new os: Windows IoT, Ubuntu Touch</li>
<li>Improve cpu detection: ARM, x86</li>
<li>Improve device vendor detection: Apple, Archos, Generic, Google,
Honor, Huawei, Infinix, Nvidia, Lenovo, Nokia, OnePlus, Xiaomi</li>
<li>Improve device type detection: smarttv, wearables</li>
<li>Improve os detection: Linux, Symbian</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/90017c98d341788570435b7587ce1f0725022c7a"><code>90017c9</code></a>
Bump version <code>1.0.41</code> (mirror of <code>0.7.41</code>)</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/af825ff5574d63122dec5a3261f7933044001a1c"><code>af825ff</code></a>
Bump version <code>0.7.41</code></li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/592595445144f06efc32325399c575d9cef964ab"><code>5925954</code></a>
Backport - Improve detection for Nokia device &amp; Symbian OS</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/fc668ef0c09b5a4663077757b8fe73bc850ff8ea"><code>fc668ef</code></a>
Backport - Improve device detection for Generic device: capture its
device mo...</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/0543fb2e95e55be28fc14fa68936bf790407b057"><code>0543fb2</code></a>
Backport - Improve CPU detection: ARM</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/98f1c00fd35cbb321228318fcf5e2f7671d75dfa"><code>98f1c00</code></a>
Backport - Improve device detection for unidentified SmartTV
vendors</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/d66c971090281558706a48118c4db570ebf0ed73"><code>d66c971</code></a>
Backport - Improve detection for Nvidia devices</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/cbe60388ea25b60c10321d964296cb8b53e74bbb"><code>cbe6038</code></a>
Backport - Add Daum app user agent (<a
href="https://redirect.github.com/faisalman/ua-parser-js/issues/773">#773</a>)</li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/e665bd56bef61ee147c359f93c7896f17332db36"><code>e665bd5</code></a>
Backport - Add new OS: <code>Ubuntu Touch</code></li>
<li><a
href="https://github.com/faisalman/ua-parser-js/commit/20c30407207b30638c0f10c4884541115a41c56f"><code>20c3040</code></a>
Backport - Add new device: Apple HomePod</li>
<li>Additional commits viewable in <a
href="https://github.com/faisalman/ua-parser-js/compare/1.0.40...1.0.41">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ua-parser-js&package-manager=npm_and_yarn&previous-version=1.0.40&new-version=1.0.41)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:42:17 +00:00
dependabot[bot] 0765970d49 chore: bump jest-fixed-jsdom from 0.0.9 to 0.0.10 in /site (#20114)
Bumps [jest-fixed-jsdom](https://github.com/mswjs/jest-fixed-jsdom) from
0.0.9 to 0.0.10.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/mswjs/jest-fixed-jsdom/releases">jest-fixed-jsdom's
releases</a>.</em></p>
<blockquote>
<h2>v0.0.10 (2025-08-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>using node's global AbortController and AbortSignal (<a
href="https://redirect.github.com/mswjs/jest-fixed-jsdom/issues/35">#35</a>)
(1e63cde866d5575f42ec5fc4520ebb9c487101e2) <a
href="https://github.com/stevematney"><code>@​stevematney</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/mswjs/jest-fixed-jsdom/commit/d31545fc0c56c1ccb0c6a54644c37f54ed25605e"><code>d31545f</code></a>
chore(release): v0.0.10</li>
<li><a
href="https://github.com/mswjs/jest-fixed-jsdom/commit/1e63cde866d5575f42ec5fc4520ebb9c487101e2"><code>1e63cde</code></a>
fix: using node's global AbortController and AbortSignal (<a
href="https://redirect.github.com/mswjs/jest-fixed-jsdom/issues/35">#35</a>)</li>
<li>See full diff in <a
href="https://github.com/mswjs/jest-fixed-jsdom/compare/v0.0.9...v0.0.10">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jest-fixed-jsdom&package-manager=npm_and_yarn&previous-version=0.0.9&new-version=0.0.10)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:42:04 +00:00
dependabot[bot] f7388f3dfc chore: bump github.com/anthropics/anthropic-sdk-go from 1.12.0 to 1.13.0 (#20109)
Bumps
[github.com/anthropics/anthropic-sdk-go](https://github.com/anthropics/anthropic-sdk-go)
from 1.12.0 to 1.13.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-go/releases">github.com/anthropics/anthropic-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v1.13.0</h2>
<h2>1.13.0 (2025-09-29)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-go/compare/v1.12.0...v1.13.0">v1.12.0...v1.13.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> adds support for Claude Sonnet 4.5 and context
management features (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/3d5d51ad6ee64b34c7cc361a9dfd6f45966987dd">3d5d51a</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>bugfix for setting JSON keys with special characters (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/c868b921190f8d371cc93d12e019daf5a7463306">c868b92</a>)</li>
<li><strong>internal:</strong> unmarshal correctly when there are
multiple discriminators (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/ecc3ce31a9ed98b8f2b66b5e1489fce510528f77">ecc3ce3</a>)</li>
<li>use slices.Concat instead of sometimes modifying r.Options (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/88e7186cad944290498a3381c829df36d26a1cce">88e7186</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>bump minimum go version to 1.22 (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/87af8f397ae68ce72a76a07a735d21495aad8799">87af8f3</a>)</li>
<li>do not install brew dependencies in ./scripts/bootstrap by default
(<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/c689348cc4b5ec7ab3512261e4e3cc50d208a02c">c689348</a>)</li>
<li><strong>internal:</strong> fix tests (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/bfc6eafeff58664f0d6f155f96286f3993e60f89">bfc6eaf</a>)</li>
<li>update more docs for 1.22 (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/d67c50d49082b4b28bdabc44943853431cd5205c">d67c50d</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-go/blob/main/CHANGELOG.md">github.com/anthropics/anthropic-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>1.13.0 (2025-09-29)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-go/compare/v1.12.0...v1.13.0">v1.12.0...v1.13.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> adds support for Claude Sonnet 4.5 and context
management features (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/3d5d51ad6ee64b34c7cc361a9dfd6f45966987dd">3d5d51a</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>bugfix for setting JSON keys with special characters (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/c868b921190f8d371cc93d12e019daf5a7463306">c868b92</a>)</li>
<li><strong>internal:</strong> unmarshal correctly when there are
multiple discriminators (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/ecc3ce31a9ed98b8f2b66b5e1489fce510528f77">ecc3ce3</a>)</li>
<li>use slices.Concat instead of sometimes modifying r.Options (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/88e7186cad944290498a3381c829df36d26a1cce">88e7186</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>bump minimum go version to 1.22 (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/87af8f397ae68ce72a76a07a735d21495aad8799">87af8f3</a>)</li>
<li>do not install brew dependencies in ./scripts/bootstrap by default
(<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/c689348cc4b5ec7ab3512261e4e3cc50d208a02c">c689348</a>)</li>
<li><strong>internal:</strong> fix tests (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/bfc6eafeff58664f0d6f155f96286f3993e60f89">bfc6eaf</a>)</li>
<li>update more docs for 1.22 (<a
href="https://github.com/anthropics/anthropic-sdk-go/commit/d67c50d49082b4b28bdabc44943853431cd5205c">d67c50d</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/e8befdc7fdceba33c9000b0b50061b8a42cb6c04"><code>e8befdc</code></a>
release: 1.13.0</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/1de6d4717bf8d4e6ff30e64bbdb9974dc1f36dae"><code>1de6d47</code></a>
feat(api): adds support for Claude Sonnet 4.5 and context management
features</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/a9aab31ae626c91a70c7893e0ff583d522930435"><code>a9aab31</code></a>
fix: bugfix for setting JSON keys with special characters</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/41a745429cf31fa0ec05d9a3638e92fdb42e4f3b"><code>41a7454</code></a>
codegen metadata</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/31633bdc1a055d8e85a09c76f1cfe40f5dbaab8a"><code>31633bd</code></a>
chore: do not install brew dependencies in ./scripts/bootstrap by
default</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/2deaed6d70652c9b51966235a283ec02102e72b5"><code>2deaed6</code></a>
fix: use slices.Concat instead of sometimes modifying r.Options</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/9f35b6866932b7a21547b73a41adb531f7f30fd7"><code>9f35b68</code></a>
chore: update more docs for 1.22</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/287a399aa194714f394d1b65a726573eb34d769a"><code>287a399</code></a>
chore: bump minimum go version to 1.22</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/aa4540a9c37d7a3584cc158e0a24fa5377979629"><code>aa4540a</code></a>
chore(internal): fix tests</li>
<li><a
href="https://github.com/anthropics/anthropic-sdk-go/commit/73e5532c81d4cdfa082e0b7b4ae3244d54ef2469"><code>73e5532</code></a>
codegen metadata</li>
<li>Additional commits viewable in <a
href="https://github.com/anthropics/anthropic-sdk-go/compare/v1.12.0...v1.13.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/anthropics/anthropic-sdk-go&package-manager=go_modules&previous-version=1.12.0&new-version=1.13.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:41:20 +00:00
dependabot[bot] 7328fa5c08 chore: bump typescript from 5.7.3 to 5.9.3 in /offlinedocs (#20112)
Bumps [typescript](https://github.com/microsoft/TypeScript) from 5.7.3
to 5.9.3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/microsoft/TypeScript/releases">typescript's
releases</a>.</em></p>
<blockquote>
<h2>TypeScript 5.9.3</h2>
<p>Note: this tag was recreated to point at the correct commit. The npm
package contained the correct content.</p>
<p>For release notes, check out the <a
href="https://devblogs.microsoft.com/typescript/announcing-typescript-5-9/">release
announcement</a></p>
<ul>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.0%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.0 (Beta)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.1%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.1 (RC)</a>.</li>
<li><em>No specific changes for TypeScript 5.9.2 (Stable)</em></li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.3%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.3 (Stable)</a>.</li>
</ul>
<p>Downloads are available on:</p>
<ul>
<li><a href="https://www.npmjs.com/package/typescript">npm</a></li>
</ul>
<h2>TypeScript 5.9</h2>
<p>For release notes, check out the <a
href="https://devblogs.microsoft.com/typescript/announcing-typescript-5-9/">release
announcement</a></p>
<ul>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.0%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.0 (Beta)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.1%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.1 (RC)</a>.</li>
<li><em>No specific changes for TypeScript 5.9.2 (Stable)</em></li>
</ul>
<p>Downloads are available on:</p>
<ul>
<li><a href="https://www.npmjs.com/package/typescript">npm</a></li>
</ul>
<h2>TypeScript 5.9 RC</h2>
<p>For release notes, check out the <a
href="https://devblogs.microsoft.com/typescript/announcing-typescript-5-9-rc/">release
announcement</a></p>
<ul>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.0%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.0 (Beta)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.1%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.1 (RC)</a>.</li>
</ul>
<p>Downloads are available on:</p>
<ul>
<li><a href="https://www.npmjs.com/package/typescript">npm</a></li>
</ul>
<h2>TypeScript 5.9 Beta</h2>
<p>For release notes, check out the <a
href="https://devblogs.microsoft.com/typescript/announcing-typescript-5-9-beta/">release
announcement</a>.</p>
<ul>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.9.0%22+is%3Aclosed+">fixed
issues query for Typescript 5.9.0 (Beta)</a>.</li>
</ul>
<p>Downloads are available on:</p>
<ul>
<li><a href="https://www.npmjs.com/package/typescript">npm</a></li>
</ul>
<h2>TypeScript 5.8.3</h2>
<p>For release notes, check out the <a
href="https://devblogs.microsoft.com/typescript/announcing-typescript-5-8/">release
announcement</a>.</p>
<ul>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.8.0%22+is%3Aclosed+">fixed
issues query for Typescript 5.8.0 (Beta)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.8.1%22+is%3Aclosed+">fixed
issues query for Typescript 5.8.1 (RC)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.8.2%22+is%3Aclosed+">fixed
issues query for Typescript 5.8.2 (Stable)</a>.</li>
<li><a
href="https://github.com/Microsoft/TypeScript/issues?utf8=%E2%9C%93&amp;q=milestone%3A%22TypeScript+5.8.3%22+is%3Aclosed+">fixed
issues query for Typescript 5.8.3 (Stable)</a>.</li>
</ul>
<p>Downloads are available on:</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/microsoft/TypeScript/commit/c63de15a992d37f0d6cec03ac7631872838602cb"><code>c63de15</code></a>
Bump version to 5.9.3 and LKG</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/8428ca4cc8a7ecc9ac18dd0258016228814f5eaf"><code>8428ca4</code></a>
🤖 Pick PR <a
href="https://redirect.github.com/microsoft/TypeScript/issues/62438">#62438</a>
(Fix incorrectly ignored dts file fr...) into release-5.9 (#...</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/a131cac6831aa6532ea963d0cb3131b957cad980"><code>a131cac</code></a>
🤖 Pick PR <a
href="https://redirect.github.com/microsoft/TypeScript/issues/62351">#62351</a>
(Add missing Float16Array constructo...) into release-5.9 (#...</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/04243333584a5bfaeb3434c0982c6280fe87b8d5"><code>0424333</code></a>
🤖 Pick PR <a
href="https://redirect.github.com/microsoft/TypeScript/issues/62423">#62423</a>
(Revert PR 61928) into release-5.9 (<a
href="https://redirect.github.com/microsoft/TypeScript/issues/62425">#62425</a>)</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/bdb641a4347af822916fb8cdb9894c9c2d2421dd"><code>bdb641a</code></a>
🤖 Pick PR <a
href="https://redirect.github.com/microsoft/TypeScript/issues/62311">#62311</a>
(Fix parenthesizer rules for manuall...) into release-5.9 (#...</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/0d9b9b92e2aca2f75c979a801abbc21bff473748"><code>0d9b9b9</code></a>
🤖 Pick PR <a
href="https://redirect.github.com/microsoft/TypeScript/issues/61978">#61978</a>
(Restructure CI to prepare for requi...) into release-5.9 (#...</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/2dce0c58af51cf9a9068365dc2f756c61b82b597"><code>2dce0c5</code></a>
Intentionally regress one buggy declaration output to an older version
(<a
href="https://redirect.github.com/microsoft/TypeScript/issues/62163">#62163</a>)</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/5be33469d551655d878876faa9e30aa3b49f8ee9"><code>5be3346</code></a>
Bump version to 5.9.2 and LKG</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/ad825f2bee3362886d642c48cb97c82df82b3ddb"><code>ad825f2</code></a>
Bump version to 5.9.1-rc and LKG</li>
<li><a
href="https://github.com/microsoft/TypeScript/commit/463a5bf92c3597dc14f252517c10a1bef7ac2f4c"><code>463a5bf</code></a>
Update LKG</li>
<li>Additional commits viewable in <a
href="https://github.com/microsoft/TypeScript/compare/v5.7.3...v5.9.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typescript&package-manager=npm_and_yarn&previous-version=5.7.3&new-version=5.9.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:37:23 +00:00
dependabot[bot] c6c9fa1e39 chore: bump alpine from 3.21.3 to 3.22.1 in /scripts (#20107)
Bumps alpine from 3.21.3 to 3.22.1.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=alpine&package-manager=docker&previous-version=3.21.3&new-version=3.22.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:34:28 +00:00
dependabot[bot] 3eb6e1cc97 chore: bump coder/claude-code/coder from 3.0.0 to 3.0.1 in /dogfood/coder (#20104)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coder/claude-code/coder&package-manager=terraform&previous-version=3.0.0&new-version=3.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:34:11 +00:00
dependabot[bot] b05033897a chore: bump rust from 3f391b0 to 1219c0b in /dogfood/coder (#20108)
Bumps rust from `3f391b0` to `1219c0b`.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=rust&package-manager=docker&previous-version=slim&new-version=slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:33:54 +00:00
dependabot[bot] 7c3e24f3be chore: bump ubuntu from 0e5e4a5 to 4e0171b in /dogfood/coder (#20105)
Bumps ubuntu from `0e5e4a5` to `4e0171b`.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ubuntu&package-manager=docker&previous-version=jammy&new-version=jammy)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 19:33:42 +00:00
ケイラ 3afab824f5 fix: revert playwright update and update dependabot config (#20103) 2025-10-01 13:26:49 -06:00
Cian Johnston b3f1492f14 ci(.github/workflows/traiage.yaml): set default inputs on trigger by label assignment (#20100)
When not triggering via `workflow_dispatch`, it looks like the default
values are simply empty.
This PR creates an intermediate step to conditionally set defaults based
on `github.event_name`.

I'm also adding a commented-out step for installing `gh` that's required
for local testing via `nektos/act`. It's not required in a 'real'
runner.
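
The intermediate step can be sketched roughly as follows (the step id and output names here are illustrative, not taken from the actual workflow):

```yaml
- id: resolve-inputs
  name: Resolve effective inputs
  run: |
    # On workflow_dispatch, pass the user-supplied inputs through;
    # otherwise fall back to values derived from the triggering event.
    if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
      echo "issue-url=${{ inputs.issue-url }}" >> "$GITHUB_OUTPUT"
    else
      echo "issue-url=${{ github.event.issue.html_url }}" >> "$GITHUB_OUTPUT"
    fi
```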
2025-10-01 19:53:58 +01:00
Bruno Quaresma 6b61c8a32a feat: add workspace status on tasks (#20037)
<img width="1206" height="722" alt="Screenshot 2025-09-30 at 11 29 31"
src="https://github.com/user-attachments/assets/f109552d-af5e-41e1-a0e8-fdfcb3f973b7"
/>


Closes https://github.com/coder/coder/issues/19988
2025-10-01 15:49:52 -03:00
dependabot[bot] 718f712c18 chore: bump the react group across 1 directory with 2 updates (#20102)
Bumps the react group with 2 updates in the /site directory:
[@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react)
and
[@types/react-dom](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react-dom).

Updates `@types/react` from 19.1.13 to 19.1.17
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react">compare
view</a></li>
</ul>
</details>
<br />

Updates `@types/react-dom` from 19.1.9 to 19.1.11
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react-dom">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-01 18:38:25 +00:00
Bruno Quaresma f23a6a1140 feat: add remove task button into the tasks list (#20036)
**Demo:**


https://github.com/user-attachments/assets/eca91a46-41fb-412c-b476-0cf91c0b69b8

Closes https://github.com/coder/coder/issues/19525
2025-10-01 15:37:11 -03:00
Bruno Quaresma 0fbe21e574 chore: downgrade msw and @radix-ui/dialog (#20098)
The upgrade caused the following error:

```
node: ../deps/uv/src/unix/stream.c:456: uv__stream_destroy: Assertion `!uv__io_active(&stream->io_watcher, POLLIN | POLLOUT)' failed.
```

After downgrading `msw`, a new error appeared only in
`WorkspacesPage.test.tsx`:

```
<--- Last few GCs --->

[2799:0x292c2000]    16790 ms: Scavenge 336.1 (443.3) -> 322.8 (443.3) MB, pooled: 32 MB, 6.45 / 0.00 ms  (average mu = 0.997, current mu = 0.996) allocation failure; 
[2799:0x292c2000]    16883 ms: Scavenge 336.7 (443.3) -> 326.8 (443.3) MB, pooled: 32 MB, 8.29 / 0.00 ms  (average mu = 0.997, current mu = 0.996) allocation failure; 
[2799:0x292c2000]    16989 ms: Scavenge 339.6 (443.3) -> 329.1 (443.3) MB, pooled: 32 MB, 9.87 / 0.00 ms  (average mu = 0.997, current mu = 0.996) allocation failure; 
```

After some debugging, I traced it to `@radix-ui/dialog`. I didn’t find
any open issues about memory leaks there, so my guess is it’s just using
more memory than our default allocation. Jest has an option to increase
the memory limit, but we should be fine for now.

Related issue:
[https://github.com/mswjs/msw/issues/2537](https://github.com/mswjs/msw/issues/2537)
2025-10-01 15:24:20 -03:00
Steven Masley 3a56ea56a7 test: fix rbac benchmark to test performance instead of cache (#20097)
The benchmark should be testing the performance of `authorize`, not a
cache lookup
2025-10-01 13:23:51 -05:00
Cian Johnston 257fb76882 ci: automatically determine issue URL when invoked via issue label assignment (#20089)
Silly me forgot that `inputs.*` will likely be empty when invoked
outside of `workflow_dispatch`.

Sample run:
https://github.com/coder/coder/actions/runs/18165531528/job/51706661391
2025-10-01 15:50:40 +01:00
Cian Johnston 98262d8fb2 ci: allow dispatching workflow triage via label (#20042)
Allows creating a task for an issue if a label 'traiage' is set.
Requires membership of the `coder` org to run.

Manual workflow_dispatch:
https://github.com/coder/coder/actions/runs/18158719999/job/51684512634

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-01 13:38:25 +01:00
389 changed files with 21554 additions and 6882 deletions
+6 -2
View File
@@ -10,8 +10,12 @@ install_devcontainer_cli() {
install_ssh_config() {
echo "🔑 Installing SSH configuration..."
-rsync -a /mnt/home/coder/.ssh/ ~/.ssh/
-chmod 0700 ~/.ssh
+if [ -d /mnt/home/coder/.ssh ]; then
+rsync -a /mnt/home/coder/.ssh/ ~/.ssh/
+chmod 0700 ~/.ssh
+else
+echo "⚠️ SSH directory not found."
+fi
}
install_git_config() {
+1
View File
@@ -26,5 +26,6 @@ ignorePatterns:
- pattern: "claude.ai"
- pattern: "splunk.com"
- pattern: "stackoverflow.com/questions"
- pattern: "developer.hashicorp.com/terraform/language"
aliveStatusCodes:
- 200
+3
View File
@@ -0,0 +1,3 @@
node_modules/
*.log
.DS_Store
+86
View File
@@ -0,0 +1,86 @@
name: "Create Coder Task"
description: "Create a Coder task for a GitHub user, with support for issue commenting"
inputs:
# Required: Coder configuration
coder-url:
description: "Coder deployment URL"
required: true
coder-token:
description: "Coder session token for authentication"
required: true
# Required: Task configuration
template-name:
description: "Coder template to use for workspace"
required: true
task-prompt:
description: "Prompt/instructions to send to the task"
required: true
# Optional: User identification
github-user-id:
description: "GitHub user ID (defaults to event sender)"
required: false
github-username:
description: "GitHub username (defaults to event sender)"
required: false
# Optional: Task configuration
template-preset:
description: "Template preset to use"
required: false
default: "Default"
task-name-prefix:
description: "Prefix for task name"
required: false
default: "task"
task-name:
description: "Full task name (overrides auto-generation)"
required: false
organization:
description: "Coder organization name"
required: false
default: "coder"
# Optional: Issue integration
issue-url:
description: "GitHub issue URL to comment on"
required: false
comment-on-issue:
description: "Whether to comment on the issue"
required: false
default: "true"
coder-web-url:
description: "Coder web UI URL for task links (defaults to coder-url)"
required: false
# GitHub token for API operations
github-token:
description: "GitHub token for commenting on issues"
required: true
outputs:
coder-username:
description: "The Coder username resolved from GitHub user"
task-name:
description: "The full task name (username/task-name)"
task-url:
description: "The URL to view the task in Coder"
task-exists:
description: "Whether the task already existed (true/false)"
runs:
using: "node20"
main: "dist/index.js"
+158
View File
@@ -0,0 +1,158 @@
{
"lockfileVersion": 1,
"workspaces": {
"": {
"name": "coder-task-action",
"dependencies": {
"@actions/core": "^1.10.1",
"@actions/github": "^6.0.0",
"@octokit/rest": "^21.1.1",
"zod": "^3.24.2",
},
"devDependencies": {
"@biomejs/biome": "2.2.4",
"@types/bun": "latest",
"typescript": "^5.0.0",
},
},
},
"packages": {
"@actions/core": ["@actions/core@1.11.1", "", { "dependencies": { "@actions/exec": "^1.1.1", "@actions/http-client": "^2.0.1" } }, "sha512-hXJCSrkwfA46Vd9Z3q4cpEpHB1rL5NG04+/rbqW9d3+CSvtB1tYe8UTpAlixa1vj0m/ULglfEK2UKxMGxCxv5A=="],
"@actions/exec": ["@actions/exec@1.1.1", "", { "dependencies": { "@actions/io": "^1.0.1" } }, "sha512-+sCcHHbVdk93a0XT19ECtO/gIXoxvdsgQLzb2fE2/5sIZmWQuluYyjPQtrtTHdU1YzTZ7bAPN4sITq2xi1679w=="],
"@actions/github": ["@actions/github@6.0.1", "", { "dependencies": { "@actions/http-client": "^2.2.0", "@octokit/core": "^5.0.1", "@octokit/plugin-paginate-rest": "^9.2.2", "@octokit/plugin-rest-endpoint-methods": "^10.4.0", "@octokit/request": "^8.4.1", "@octokit/request-error": "^5.1.1", "undici": "^5.28.5" } }, "sha512-xbZVcaqD4XnQAe35qSQqskb3SqIAfRyLBrHMd/8TuL7hJSz2QtbDwnNM8zWx4zO5l2fnGtseNE3MbEvD7BxVMw=="],
"@actions/http-client": ["@actions/http-client@2.2.3", "", { "dependencies": { "tunnel": "^0.0.6", "undici": "^5.25.4" } }, "sha512-mx8hyJi/hjFvbPokCg4uRd4ZX78t+YyRPtnKWwIl+RzNaVuFpQHfmlGVfsKEJN8LwTCvL+DfVgAM04XaHkm6bA=="],
"@actions/io": ["@actions/io@1.1.3", "", {}, "sha512-wi9JjgKLYS7U/z8PPbco+PvTb/nRWjeoFlJ1Qer83k/3C5PHQi28hiVdeE2kHXmIL99mQFawx8qt/JPjZilJ8Q=="],
"@biomejs/biome": ["@biomejs/biome@2.2.4", "", { "optionalDependencies": { "@biomejs/cli-darwin-arm64": "2.2.4", "@biomejs/cli-darwin-x64": "2.2.4", "@biomejs/cli-linux-arm64": "2.2.4", "@biomejs/cli-linux-arm64-musl": "2.2.4", "@biomejs/cli-linux-x64": "2.2.4", "@biomejs/cli-linux-x64-musl": "2.2.4", "@biomejs/cli-win32-arm64": "2.2.4", "@biomejs/cli-win32-x64": "2.2.4" }, "bin": { "biome": "bin/biome" } }, "sha512-TBHU5bUy/Ok6m8c0y3pZiuO/BZoY/OcGxoLlrfQof5s8ISVwbVBdFINPQZyFfKwil8XibYWb7JMwnT8wT4WVPg=="],
"@biomejs/cli-darwin-arm64": ["@biomejs/cli-darwin-arm64@2.2.4", "", { "os": "darwin", "cpu": "arm64" }, "sha512-RJe2uiyaloN4hne4d2+qVj3d3gFJFbmrr5PYtkkjei1O9c+BjGXgpUPVbi8Pl8syumhzJjFsSIYkcLt2VlVLMA=="],
"@biomejs/cli-darwin-x64": ["@biomejs/cli-darwin-x64@2.2.4", "", { "os": "darwin", "cpu": "x64" }, "sha512-cFsdB4ePanVWfTnPVaUX+yr8qV8ifxjBKMkZwN7gKb20qXPxd/PmwqUH8mY5wnM9+U0QwM76CxFyBRJhC9tQwg=="],
"@biomejs/cli-linux-arm64": ["@biomejs/cli-linux-arm64@2.2.4", "", { "os": "linux", "cpu": "arm64" }, "sha512-M/Iz48p4NAzMXOuH+tsn5BvG/Jb07KOMTdSVwJpicmhN309BeEyRyQX+n1XDF0JVSlu28+hiTQ2L4rZPvu7nMw=="],
"@biomejs/cli-linux-arm64-musl": ["@biomejs/cli-linux-arm64-musl@2.2.4", "", { "os": "linux", "cpu": "arm64" }, "sha512-7TNPkMQEWfjvJDaZRSkDCPT/2r5ESFPKx+TEev+I2BXDGIjfCZk2+b88FOhnJNHtksbOZv8ZWnxrA5gyTYhSsQ=="],
"@biomejs/cli-linux-x64": ["@biomejs/cli-linux-x64@2.2.4", "", { "os": "linux", "cpu": "x64" }, "sha512-orr3nnf2Dpb2ssl6aihQtvcKtLySLta4E2UcXdp7+RTa7mfJjBgIsbS0B9GC8gVu0hjOu021aU8b3/I1tn+pVQ=="],
"@biomejs/cli-linux-x64-musl": ["@biomejs/cli-linux-x64-musl@2.2.4", "", { "os": "linux", "cpu": "x64" }, "sha512-m41nFDS0ksXK2gwXL6W6yZTYPMH0LughqbsxInSKetoH6morVj43szqKx79Iudkp8WRT5SxSh7qVb8KCUiewGg=="],
"@biomejs/cli-win32-arm64": ["@biomejs/cli-win32-arm64@2.2.4", "", { "os": "win32", "cpu": "arm64" }, "sha512-NXnfTeKHDFUWfxAefa57DiGmu9VyKi0cDqFpdI+1hJWQjGJhJutHPX0b5m+eXvTKOaf+brU+P0JrQAZMb5yYaQ=="],
"@biomejs/cli-win32-x64": ["@biomejs/cli-win32-x64@2.2.4", "", { "os": "win32", "cpu": "x64" }, "sha512-3Y4V4zVRarVh/B/eSHczR4LYoSVyv3Dfuvm3cWs5w/HScccS0+Wt/lHOcDTRYeHjQmMYVC3rIRWqyN2EI52+zg=="],
"@fastify/busboy": ["@fastify/busboy@2.1.1", "", {}, "sha512-vBZP4NlzfOlerQTnba4aqZoMhE/a9HY7HRqoOPaETQcSQuWEIyZMHGfVu6w9wGtGK5fED5qRs2DteVCjOH60sA=="],
"@octokit/auth-token": ["@octokit/auth-token@4.0.0", "", {}, "sha512-tY/msAuJo6ARbK6SPIxZrPBms3xPbfwBrulZe0Wtr/DIY9lje2HeV1uoebShn6mx7SjCHif6EjMvoREj+gZ+SA=="],
"@octokit/core": ["@octokit/core@5.2.2", "", { "dependencies": { "@octokit/auth-token": "^4.0.0", "@octokit/graphql": "^7.1.0", "@octokit/request": "^8.4.1", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.0.0", "before-after-hook": "^2.2.0", "universal-user-agent": "^6.0.0" } }, "sha512-/g2d4sW9nUDJOMz3mabVQvOGhVa4e/BN/Um7yca9Bb2XTzPPnfTWHWQg+IsEYO7M3Vx+EXvaM/I2pJWIMun1bg=="],
"@octokit/endpoint": ["@octokit/endpoint@9.0.6", "", { "dependencies": { "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-H1fNTMA57HbkFESSt3Y9+FBICv+0jFceJFPWDePYlR/iMGrwM5ph+Dd4XRQs+8X+PUFURLQgX9ChPfhJ/1uNQw=="],
"@octokit/graphql": ["@octokit/graphql@7.1.1", "", { "dependencies": { "@octokit/request": "^8.4.1", "@octokit/types": "^13.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-3mkDltSfcDUoa176nlGoA32RGjeWjl3K7F/BwHwRMJUW/IteSa4bnSV8p2ThNkcIcZU2umkZWxwETSSCJf2Q7g=="],
"@octokit/openapi-types": ["@octokit/openapi-types@24.2.0", "", {}, "sha512-9sIH3nSUttelJSXUrmGzl7QUBFul0/mB8HRYl3fOlgHbIWG+WnYDXU3v/2zMtAvuzZ/ed00Ei6on975FhBfzrg=="],
"@octokit/plugin-paginate-rest": ["@octokit/plugin-paginate-rest@9.2.2", "", { "dependencies": { "@octokit/types": "^12.6.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-u3KYkGF7GcZnSD/3UP0S7K5XUFT2FkOQdcfXZGZQPGv3lm4F2Xbf71lvjldr8c1H3nNbF+33cLEkWYbokGWqiQ=="],
"@octokit/plugin-request-log": ["@octokit/plugin-request-log@5.3.1", "", { "peerDependencies": { "@octokit/core": ">=6" } }, "sha512-n/lNeCtq+9ofhC15xzmJCNKP2BWTv8Ih2TTy+jatNCCq/gQP/V7rK3fjIfuz0pDWDALO/o/4QY4hyOF6TQQFUw=="],
"@octokit/plugin-rest-endpoint-methods": ["@octokit/plugin-rest-endpoint-methods@10.4.1", "", { "dependencies": { "@octokit/types": "^12.6.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-xV1b+ceKV9KytQe3zCVqjg+8GTGfDYwaT1ATU5isiUyVtlVAO3HNdzpS4sr4GBx4hxQ46s7ITtZrAsxG22+rVg=="],
"@octokit/request": ["@octokit/request@8.4.1", "", { "dependencies": { "@octokit/endpoint": "^9.0.6", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-qnB2+SY3hkCmBxZsR/MPCybNmbJe4KAlfWErXq+rBKkQJlbjdJeS85VI9r8UqeLYLvnAenU8Q1okM/0MBsAGXw=="],
"@octokit/request-error": ["@octokit/request-error@5.1.1", "", { "dependencies": { "@octokit/types": "^13.1.0", "deprecation": "^2.0.0", "once": "^1.4.0" } }, "sha512-v9iyEQJH6ZntoENr9/yXxjuezh4My67CBSu9r6Ve/05Iu5gNgnisNWOsoJHTP6k0Rr0+HQIpnH+kyammu90q/g=="],
"@octokit/rest": ["@octokit/rest@21.1.1", "", { "dependencies": { "@octokit/core": "^6.1.4", "@octokit/plugin-paginate-rest": "^11.4.2", "@octokit/plugin-request-log": "^5.3.1", "@octokit/plugin-rest-endpoint-methods": "^13.3.0" } }, "sha512-sTQV7va0IUVZcntzy1q3QqPm/r8rWtDCqpRAmb8eXXnKkjoQEtFe3Nt5GTVsHft+R6jJoHeSiVLcgcvhtue/rg=="],
"@octokit/types": ["@octokit/types@13.10.0", "", { "dependencies": { "@octokit/openapi-types": "^24.2.0" } }, "sha512-ifLaO34EbbPj0Xgro4G5lP5asESjwHracYJvVaPIyXMuiuXLlhic3S47cBdTb+jfODkTE5YtGCLt3Ay3+J97sA=="],
"@types/bun": ["@types/bun@1.3.0", "", { "dependencies": { "bun-types": "1.3.0" } }, "sha512-+lAGCYjXjip2qY375xX/scJeVRmZ5cY0wyHYyCYxNcdEXrQ4AOe3gACgd4iQ8ksOslJtW4VNxBJ8llUwc3a6AA=="],
"@types/node": ["@types/node@24.8.1", "", { "dependencies": { "undici-types": "~7.14.0" } }, "sha512-alv65KGRadQVfVcG69MuB4IzdYVpRwMG/mq8KWOaoOdyY617P5ivaDiMCGOFDWD2sAn5Q0mR3mRtUOgm99hL9Q=="],
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
"before-after-hook": ["before-after-hook@2.2.3", "", {}, "sha512-NzUnlZexiaH/46WDhANlyR2bXRopNg4F/zuSA3OpZnllCUgRaOF2znDioDWrmbNVsuZk6l9pMquQB38cfBZwkQ=="],
"bun-types": ["bun-types@1.3.0", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-u8X0thhx+yJ0KmkxuEo9HAtdfgCBaM/aI9K90VQcQioAmkVp3SG3FkwWGibUFz3WdXAdcsqOcbU40lK7tbHdkQ=="],
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
"deprecation": ["deprecation@2.3.1", "", {}, "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="],
"fast-content-type-parse": ["fast-content-type-parse@2.0.1", "", {}, "sha512-nGqtvLrj5w0naR6tDPfB4cUmYCqouzyQiz6C5y/LtcDllJdrcc6WaWW6iXyIIOErTa/XRybj28aasdn4LkVk6Q=="],
"once": ["once@1.4.0", "", { "dependencies": { "wrappy": "1" } }, "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w=="],
"tunnel": ["tunnel@0.0.6", "", {}, "sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg=="],
"typescript": ["typescript@5.9.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw=="],
"undici": ["undici@5.29.0", "", { "dependencies": { "@fastify/busboy": "^2.0.0" } }, "sha512-raqeBD6NQK4SkWhQzeYKd1KmIG6dllBOTt55Rmkt4HtI9mwdWtJljnrXjAFUBLTSN67HWrOIZ3EPF4kjUw80Bg=="],
"undici-types": ["undici-types@7.14.0", "", {}, "sha512-QQiYxHuyZ9gQUIrmPo3IA+hUl4KYk8uSA7cHrcKd/l3p1OTpZcM0Tbp9x7FAtXdAYhlasd60ncPpgu6ihG6TOA=="],
"universal-user-agent": ["universal-user-agent@6.0.1", "", {}, "sha512-yCzhz6FN2wU1NiiQRogkTQszlQSlpWaw8SvVegAc+bDxbzHgh1vX8uIe8OYyMH6DwH+sdTJsgMl36+mSMdRJIQ=="],
"wrappy": ["wrappy@1.0.2", "", {}, "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="],
"zod": ["zod@3.25.76", "", {}, "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ=="],
"@octokit/plugin-paginate-rest/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],
"@octokit/plugin-request-log/@octokit/core": ["@octokit/core@6.1.6", "", { "dependencies": { "@octokit/auth-token": "^5.0.0", "@octokit/graphql": "^8.2.2", "@octokit/request": "^9.2.3", "@octokit/request-error": "^6.1.8", "@octokit/types": "^14.0.0", "before-after-hook": "^3.0.2", "universal-user-agent": "^7.0.0" } }, "sha512-kIU8SLQkYWGp3pVKiYzA5OSaNF5EE03P/R8zEmmrG6XwOg5oBjXyQVVIauQ0dgau4zYhpZEhJrvIYt6oM+zZZA=="],
"@octokit/plugin-rest-endpoint-methods/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],
"@octokit/rest/@octokit/core": ["@octokit/core@6.1.6", "", { "dependencies": { "@octokit/auth-token": "^5.0.0", "@octokit/graphql": "^8.2.2", "@octokit/request": "^9.2.3", "@octokit/request-error": "^6.1.8", "@octokit/types": "^14.0.0", "before-after-hook": "^3.0.2", "universal-user-agent": "^7.0.0" } }, "sha512-kIU8SLQkYWGp3pVKiYzA5OSaNF5EE03P/R8zEmmrG6XwOg5oBjXyQVVIauQ0dgau4zYhpZEhJrvIYt6oM+zZZA=="],
"@octokit/rest/@octokit/plugin-paginate-rest": ["@octokit/plugin-paginate-rest@11.6.0", "", { "dependencies": { "@octokit/types": "^13.10.0" }, "peerDependencies": { "@octokit/core": ">=6" } }, "sha512-n5KPteiF7pWKgBIBJSk8qzoZWcUkza2O6A0za97pMGVrGfPdltxrfmfF5GucHYvHGZD8BdaZmmHGz5cX/3gdpw=="],
"@octokit/rest/@octokit/plugin-rest-endpoint-methods": ["@octokit/plugin-rest-endpoint-methods@13.5.0", "", { "dependencies": { "@octokit/types": "^13.10.0" }, "peerDependencies": { "@octokit/core": ">=6" } }, "sha512-9Pas60Iv9ejO3WlAX3maE1+38c5nqbJXV5GrncEfkndIpZrJ/WPMRd2xYDcPPEt5yzpxcjw9fWNoPhsSGzqKqw=="],
"@octokit/plugin-paginate-rest/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/auth-token": ["@octokit/auth-token@5.1.2", "", {}, "sha512-JcQDsBdg49Yky2w2ld20IHAlwr8d/d8N6NiOXbtuoPCqzbsiJgF633mVUw3x4mo0H5ypataQIX7SFu3yy44Mpw=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/graphql": ["@octokit/graphql@8.2.2", "", { "dependencies": { "@octokit/request": "^9.2.3", "@octokit/types": "^14.0.0", "universal-user-agent": "^7.0.0" } }, "sha512-Yi8hcoqsrXGdt0yObxbebHXFOiUA+2v3n53epuOg1QUgOB6c4XzvisBNVXJSl8RYA5KrDuSL2yq9Qmqe5N0ryA=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/request": ["@octokit/request@9.2.4", "", { "dependencies": { "@octokit/endpoint": "^10.1.4", "@octokit/request-error": "^6.1.8", "@octokit/types": "^14.0.0", "fast-content-type-parse": "^2.0.0", "universal-user-agent": "^7.0.2" } }, "sha512-q8ybdytBmxa6KogWlNa818r0k1wlqzNC+yNkcQDECHvQo8Vmstrg18JwqJHdJdUiHD2sjlwBgSm9kHkOKe2iyA=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/request-error": ["@octokit/request-error@6.1.8", "", { "dependencies": { "@octokit/types": "^14.0.0" } }, "sha512-WEi/R0Jmq+IJKydWlKDmryPcmdYSVjL3ekaiEL1L9eo1sUnqMJ+grqmC9cjk7CA7+b2/T397tO5d8YLOH3qYpQ=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/types": ["@octokit/types@14.1.0", "", { "dependencies": { "@octokit/openapi-types": "^25.1.0" } }, "sha512-1y6DgTy8Jomcpu33N+p5w58l6xyt55Ar2I91RPiIA0xCJBXyUAhXCcmZaDWSANiha7R9a6qJJ2CRomGPZ6f46g=="],
"@octokit/plugin-request-log/@octokit/core/before-after-hook": ["before-after-hook@3.0.2", "", {}, "sha512-Nik3Sc0ncrMK4UUdXQmAnRtzmNQTAAXmXIopizwZ1W1t8QmfJj+zL4OA2I7XPTPW5z5TDqv4hRo/JzouDJnX3A=="],
"@octokit/plugin-request-log/@octokit/core/universal-user-agent": ["universal-user-agent@7.0.3", "", {}, "sha512-TmnEAEAsBJVZM/AADELsK76llnwcf9vMKuPz8JflO1frO8Lchitr0fNaN9d+Ap0BjKtqWqd/J17qeDnXh8CL2A=="],
"@octokit/plugin-rest-endpoint-methods/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
"@octokit/rest/@octokit/core/@octokit/auth-token": ["@octokit/auth-token@5.1.2", "", {}, "sha512-JcQDsBdg49Yky2w2ld20IHAlwr8d/d8N6NiOXbtuoPCqzbsiJgF633mVUw3x4mo0H5ypataQIX7SFu3yy44Mpw=="],
"@octokit/rest/@octokit/core/@octokit/graphql": ["@octokit/graphql@8.2.2", "", { "dependencies": { "@octokit/request": "^9.2.3", "@octokit/types": "^14.0.0", "universal-user-agent": "^7.0.0" } }, "sha512-Yi8hcoqsrXGdt0yObxbebHXFOiUA+2v3n53epuOg1QUgOB6c4XzvisBNVXJSl8RYA5KrDuSL2yq9Qmqe5N0ryA=="],
"@octokit/rest/@octokit/core/@octokit/request": ["@octokit/request@9.2.4", "", { "dependencies": { "@octokit/endpoint": "^10.1.4", "@octokit/request-error": "^6.1.8", "@octokit/types": "^14.0.0", "fast-content-type-parse": "^2.0.0", "universal-user-agent": "^7.0.2" } }, "sha512-q8ybdytBmxa6KogWlNa818r0k1wlqzNC+yNkcQDECHvQo8Vmstrg18JwqJHdJdUiHD2sjlwBgSm9kHkOKe2iyA=="],
"@octokit/rest/@octokit/core/@octokit/request-error": ["@octokit/request-error@6.1.8", "", { "dependencies": { "@octokit/types": "^14.0.0" } }, "sha512-WEi/R0Jmq+IJKydWlKDmryPcmdYSVjL3ekaiEL1L9eo1sUnqMJ+grqmC9cjk7CA7+b2/T397tO5d8YLOH3qYpQ=="],
"@octokit/rest/@octokit/core/@octokit/types": ["@octokit/types@14.1.0", "", { "dependencies": { "@octokit/openapi-types": "^25.1.0" } }, "sha512-1y6DgTy8Jomcpu33N+p5w58l6xyt55Ar2I91RPiIA0xCJBXyUAhXCcmZaDWSANiha7R9a6qJJ2CRomGPZ6f46g=="],
"@octokit/rest/@octokit/core/before-after-hook": ["before-after-hook@3.0.2", "", {}, "sha512-Nik3Sc0ncrMK4UUdXQmAnRtzmNQTAAXmXIopizwZ1W1t8QmfJj+zL4OA2I7XPTPW5z5TDqv4hRo/JzouDJnX3A=="],
"@octokit/rest/@octokit/core/universal-user-agent": ["universal-user-agent@7.0.3", "", {}, "sha512-TmnEAEAsBJVZM/AADELsK76llnwcf9vMKuPz8JflO1frO8Lchitr0fNaN9d+Ap0BjKtqWqd/J17qeDnXh8CL2A=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/request/@octokit/endpoint": ["@octokit/endpoint@10.1.4", "", { "dependencies": { "@octokit/types": "^14.0.0", "universal-user-agent": "^7.0.2" } }, "sha512-OlYOlZIsfEVZm5HCSR8aSg02T2lbUWOsCQoPKfTXJwDzcHQBrVBGdGXb89dv2Kw2ToZaRtudp8O3ZIYoaOjKlA=="],
"@octokit/plugin-request-log/@octokit/core/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@25.1.0", "", {}, "sha512-idsIggNXUKkk0+BExUn1dQ92sfysJrje03Q0bv0e+KPLrvyqZF8MnBpFz8UNfYDwB3Ie7Z0TByjWfzxt7vseaA=="],
"@octokit/rest/@octokit/core/@octokit/request/@octokit/endpoint": ["@octokit/endpoint@10.1.4", "", { "dependencies": { "@octokit/types": "^14.0.0", "universal-user-agent": "^7.0.2" } }, "sha512-OlYOlZIsfEVZm5HCSR8aSg02T2lbUWOsCQoPKfTXJwDzcHQBrVBGdGXb89dv2Kw2ToZaRtudp8O3ZIYoaOjKlA=="],
"@octokit/rest/@octokit/core/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@25.1.0", "", {}, "sha512-idsIggNXUKkk0+BExUn1dQ92sfysJrje03Q0bv0e+KPLrvyqZF8MnBpFz8UNfYDwB3Ie7Z0TByjWfzxt7vseaA=="],
}
}
+25
View File
@@ -0,0 +1,25 @@
{
"name": "coder-task-action",
"version": "1.0.0",
"description": "GitHub Action to create and manage Coder tasks",
"main": "dist/index.js",
"scripts": {
"build": "bun build src/index.ts --outfile dist/index.js --target node",
"dev": "bun run --watch src/index.ts",
"format": "biome format --write .",
"format:check": "biome format .",
"lint": "biome lint --error-on-warnings .",
"typecheck": "tsc --noEmit"
},
"dependencies": {
"@actions/core": "^1.10.1",
"@actions/github": "^6.0.0",
"@octokit/rest": "^21.1.1",
"zod": "^3.24.2"
},
"devDependencies": {
"@biomejs/biome": "2.2.4",
"@types/bun": "latest",
"typescript": "^5.0.0"
}
}
@@ -0,0 +1,682 @@
import { describe, expect, test, beforeEach } from "bun:test";
import { CoderTaskAction } from "./action";
import type { Octokit } from "./action";
import {
MockCoderClient,
createMockOctokit,
createMockInputs,
mockUser,
mockTask,
mockTemplate,
} from "./test-helpers";
describe("CoderTaskAction", () => {
let coderClient: MockCoderClient;
let octokit: ReturnType<typeof createMockOctokit>;
beforeEach(() => {
coderClient = new MockCoderClient();
octokit = createMockOctokit();
});
describe("parseGithubIssueUrl", () => {
test("parses valid GitHub issue URL", () => {
const inputs = createMockInputs({
githubIssueURL: "https://github.com/owner/repo/issues/123",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
const result = (
action as unknown as CoderTaskAction
).parseGithubIssueURL();
expect(result).toEqual({
githubOrg: "owner",
githubRepo: "repo",
githubIssueNumber: 123,
});
});
test("throws when no issue URL provided", () => {
const inputs = createMockInputs({ githubIssueURL: undefined });
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Wrap the call in a function so the matcher can catch the throw.
expect(() =>
(action as unknown as CoderTaskAction).parseGithubIssueURL(),
).toThrowError("Missing issue URL");
});
test("throws for invalid URL format", () => {
const inputs = createMockInputs({ githubIssueURL: "not-a-url" });
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
expect(() =>
(action as unknown as CoderTaskAction).parseGithubIssueURL(),
).toThrowError("Invalid issue URL: not-a-url");
});
test("handles non-github.com URL", () => {
const inputs = createMockInputs({
githubIssueURL: "https://code.acme.com/owner/repo/issues/123",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
const result = (
action as unknown as CoderTaskAction
).parseGithubIssueURL();
expect(result).toEqual({
githubOrg: "owner",
githubRepo: "repo",
githubIssueNumber: 123,
});
});
test("handles URL with trailing junk", () => {
const inputs = createMockInputs({
githubIssueURL:
"https://github.com/owner/repo/issues/123/?param=value#anchor",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
const result = (
action as unknown as CoderTaskAction
).parseGithubIssueURL();
// Should still parse correctly
expect(result).toEqual({
githubOrg: "owner",
githubRepo: "repo",
githubIssueNumber: 123,
});
});
});
describe("generateTaskUrl", () => {
test("generates correct task URL", () => {
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
const result = (action as unknown as CoderTaskAction).generateTaskUrl(
"testuser",
"task-123",
);
expect(result).toBe("https://coder.test/tasks/testuser/task-123");
});
test("handles URL with trailing junk", () => {
const inputs = createMockInputs({
coderURL: "https://coder.test/?param=value#anchor",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
const result = (action as unknown as CoderTaskAction).generateTaskUrl(
"testuser",
"task-123",
);
// The trailing slash in coderURL currently produces a double slash
expect(result).toBe("https://coder.test//tasks/testuser/task-123");
});
});
describe("commentOnIssue", () => {
describe("Success Cases", () => {
test("creates new comment when none exists", async () => {
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.createComment>,
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await (action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"owner",
"repo",
123,
);
expect(octokit.rest.issues.createComment).toHaveBeenCalledWith({
owner: "owner",
repo: "repo",
issue_number: 123,
body: "Task created: https://coder.test/tasks/testuser/task-123",
});
});
test("updates existing Task created comment", async () => {
octokit.rest.issues.listComments.mockResolvedValue({
data: [
{ id: 1, body: "Task created: old-url" },
{ id: 2, body: "Other comment" },
{ id: 3, body: "Task created: another-old-url" },
],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.updateComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.updateComment>,
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await (action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"owner",
"repo",
123,
);
// Should update the last "Task created:" comment
expect(octokit.rest.issues.updateComment).toHaveBeenCalledWith({
owner: "owner",
repo: "repo",
comment_id: 3,
body: "Task created: https://coder.test/tasks/testuser/task-123",
});
});
test("passes owner/repo/issue through to createComment", async () => {
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.createComment>,
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await (action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"different-owner",
"different-repo",
456,
);
expect(octokit.rest.issues.createComment).toHaveBeenCalledWith({
owner: "different-owner",
repo: "different-repo",
issue_number: 456,
body: "Task created: https://coder.test/tasks/testuser/task-123",
});
});
});
describe("Error Cases", () => {
test("warns but doesn't fail on GitHub API error", async () => {
octokit.rest.issues.listComments.mockRejectedValue(
new Error("API Error"),
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Should not throw
await expect(
(action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"owner",
"repo",
123,
),
).resolves.toBeUndefined();
});
test("warns but doesn't fail on permission error", async () => {
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockRejectedValue(
new Error("Permission denied"),
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Should not throw
await expect(
(action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"owner",
"repo",
123,
),
).resolves.toBeUndefined();
});
});
describe("Edge Cases", () => {
test("updates last comment when multiple Task created comments exist", async () => {
octokit.rest.issues.listComments.mockResolvedValue({
data: [
{ id: 1, body: "Task created: url1" },
{ id: 2, body: "Other comment" },
{ id: 3, body: "Task created: url2" },
{ id: 4, body: "Another comment" },
{ id: 5, body: "Task created: url3" },
],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.updateComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.updateComment>,
);
const inputs = createMockInputs();
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await (action as unknown as CoderTaskAction).commentOnIssue(
"https://coder.test/tasks/testuser/task-123",
"owner",
"repo",
123,
);
// Should update comment 5 (last Task created comment)
expect(octokit.rest.issues.updateComment).toHaveBeenCalledWith(
expect.objectContaining({
comment_id: 5,
}),
);
});
});
});
test("creates new task successfully", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
const inputs = createMockInputs({
githubUserID: 12345,
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Execute
const result = await action.run();
// Verify
expect(coderClient.mockGetCoderUserByGithubID).toHaveBeenCalledWith(12345);
expect(coderClient.mockGetTask).toHaveBeenCalledWith(
mockUser.username,
mockTask.name,
);
expect(coderClient.mockCreateTask).toHaveBeenCalledWith(mockUser.username, {
name: mockTask.name,
template_version_id: mockTemplate.active_version_id,
template_version_preset_id: undefined,
input: "test prompt",
});
expect(result.coderUsername).toBe("testuser");
expect(result.taskCreated).toBe(true);
expect(result.taskUrl).toContain("/tasks/testuser/");
});
test("sends prompt to existing task", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockGetTask.mockResolvedValue(mockTask);
coderClient.mockSendTaskInput.mockResolvedValue(undefined);
const inputs = createMockInputs({
githubUserID: 12345,
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Execute
const result = await action.run();
// Verify
expect(coderClient.mockGetTask).toHaveBeenCalledWith(
mockUser.username,
mockTask.name,
);
expect(coderClient.mockSendTaskInput).toHaveBeenCalledWith(
mockUser.username,
mockTask.name,
"test prompt",
);
expect(coderClient.mockCreateTask).not.toHaveBeenCalled();
expect(result.taskCreated).toBe(false);
});
test("errors without issue URL", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
const inputs = createMockInputs({
githubUserID: 12345,
githubIssueURL: undefined,
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Execute
await expect(action.run()).rejects.toThrowError("Missing issue URL");
});
test("comments on issue", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.createComment>,
);
const inputs = createMockInputs({
githubUserID: 12345,
githubIssueURL: "https://github.com/owner/repo/issues/123",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Execute
await action.run();
// Verify
expect(octokit.rest.issues.createComment).toHaveBeenCalledWith(
expect.objectContaining({
owner: "owner",
repo: "repo",
issue_number: 123,
body: "Task created: https://coder.test/tasks/testuser/task-123",
}),
);
});
test("updates existing comment on issue", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
octokit.rest.issues.listComments.mockResolvedValue({
data: [
{
id: 23455,
body: "An unrelated comment",
},
{
id: 23456,
body: "Task created:",
},
{
id: 23457,
body: "Another unrelated comment",
},
],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.updateComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.updateComment>,
);
const inputs = createMockInputs({
githubUserID: 12345,
githubIssueURL: "https://github.com/owner/repo/issues/123",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
// Execute
await action.run();
// Verify
expect(octokit.rest.issues.updateComment).toHaveBeenCalledWith(
expect.objectContaining({
owner: "owner",
repo: "repo",
comment_id: 23456,
body: "Task created: https://coder.test/tasks/testuser/task-123",
}),
);
});
test("handles error when comment on issue fails", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockRejectedValue(
new Error("Failed to comment on issue"),
);
const inputs = createMockInputs({
githubUserID: 12345,
githubIssueURL: "https://github.com/owner/repo/issues/123",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await action.run();
expect(octokit.rest.issues.createComment).toHaveBeenCalledWith(
expect.objectContaining({
owner: "owner",
repo: "repo",
issue_number: 123,
}),
);
});
describe("run - Error Scenarios", () => {
test("throws error when Coder user not found", async () => {
coderClient.mockGetCoderUserByGithubID.mockRejectedValue(
new Error("No Coder user found with GitHub user ID 12345"),
);
const inputs = createMockInputs({ githubUserID: 12345 });
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await expect(action.run()).rejects.toThrow(
"No Coder user found with GitHub user ID 12345",
);
});
test("throws error when template not found", async () => {
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockRejectedValue(
new Error("Template not found"),
);
coderClient.mockCreateTask.mockRejectedValue(
new Error("Template not found: nonexistent"),
);
const inputs = createMockInputs({
githubUserID: 12345,
coderTemplateName: "nonexistent",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await expect(action.run()).rejects.toThrow("Template not found");
});
test("throws error when task creation fails", async () => {
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockRejectedValue(
new Error("Failed to create task"),
);
const inputs = createMockInputs({ githubUserID: 12345 });
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await expect(action.run()).rejects.toThrow("Failed to create task");
});
test("throws error on permission denied", async () => {
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockRejectedValue(
new Error("Permission denied"),
);
const inputs = createMockInputs({ githubUserID: 12345 });
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await expect(action.run()).rejects.toThrow("Permission denied");
});
});
// NOTE: this may or may not work in the real world depending on the permissions of the user
test("handles cross-repository issue", async () => {
// Setup
coderClient.mockGetCoderUserByGithubID.mockResolvedValue(mockUser);
coderClient.mockGetTask.mockResolvedValue(null);
coderClient.mockGetTemplateByOrganizationAndName.mockResolvedValue(
mockTemplate,
);
coderClient.mockGetTemplateVersionPresets.mockResolvedValue([]);
coderClient.mockCreateTask.mockResolvedValue(mockTask);
octokit.rest.issues.listComments.mockResolvedValue({
data: [],
} as ReturnType<typeof octokit.rest.issues.listComments>);
octokit.rest.issues.createComment.mockResolvedValue(
{} as ReturnType<typeof octokit.rest.issues.createComment>,
);
const inputs = createMockInputs({
githubIssueURL:
"https://github.com/different-owner/different-repo/issues/456",
});
const action = new CoderTaskAction(
coderClient,
octokit as unknown as Octokit,
inputs,
);
await action.run();
expect(octokit.rest.issues.createComment).toHaveBeenCalledWith(
expect.objectContaining({
owner: "different-owner",
repo: "different-repo",
issue_number: 456,
}),
);
});
});
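The comment-upsert behavior exercised in these tests boils down to a "find the last `Task created:` comment" scan. A minimal standalone sketch (the `findLastTaskComment` helper and the sample data are illustrative, not the action's real fixtures):

```typescript
// Standalone sketch of the "update the last 'Task created:' comment" selection.
interface IssueComment {
  id: number;
  body?: string;
}

function findLastTaskComment(
  comments: IssueComment[],
): IssueComment | undefined {
  // Copy before reversing so the caller's array is not mutated.
  return [...comments]
    .reverse()
    .find((comment) => comment.body?.startsWith("Task created:"));
}

const comments: IssueComment[] = [
  { id: 1, body: "Task created: url1" },
  { id: 2, body: "Other comment" },
  { id: 5, body: "Task created: url3" },
];

console.log(findLastTaskComment(comments)?.id); // 5
```

If no matching comment exists the helper returns `undefined`, which is the branch where a new comment gets created instead.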
@@ -0,0 +1,198 @@
import * as core from "@actions/core";
import {
type ExperimentalCoderSDKCreateTaskRequest,
type CoderClient,
} from "./coder-client";
import type { ActionInputs, ActionOutputs } from "./schemas";
import type { getOctokit } from "@actions/github";
export type Octokit = ReturnType<typeof getOctokit>;
export class CoderTaskAction {
constructor(
private readonly coder: CoderClient,
private readonly octokit: Octokit,
private readonly inputs: ActionInputs,
) {}
/**
* Parse owner and repo from issue URL
*/
parseGithubIssueURL(): {
githubOrg: string;
githubRepo: string;
githubIssueNumber: number;
} {
if (!this.inputs.githubIssueURL) {
throw new Error(`Missing issue URL`);
}
// Parse: https://github.com/owner/repo/issues/123
const match = this.inputs.githubIssueURL.match(
/([^/]+)\/([^/]+)\/issues\/(\d+)/,
);
if (!match) {
throw new Error(`Invalid issue URL: ${this.inputs.githubIssueURL}`);
}
return {
githubOrg: match[1],
githubRepo: match[2],
githubIssueNumber: parseInt(match[3], 10),
};
}
/**
* Generate task URL
*/
generateTaskUrl(coderUsername: string, taskName: string): string {
return `${this.inputs.coderURL}/tasks/${coderUsername}/${taskName}`;
}
/**
* Comment on GitHub issue with task link
*/
async commentOnIssue(
taskUrl: string,
owner: string,
repo: string,
issueNumber: number,
): Promise<void> {
const body = `Task created: ${taskUrl}`;
try {
// Try to find existing comment from bot
const { data: comments } = await this.octokit.rest.issues.listComments({
owner,
repo,
issue_number: issueNumber,
});
// Find the last comment that starts with "Task created:"
// (copy before reversing so we don't mutate the fetched list)
const existingComment = [...comments]
.reverse()
.find((comment: { body?: string }) =>
comment.body?.startsWith("Task created:"),
);
if (existingComment) {
// Update existing comment
await this.octokit.rest.issues.updateComment({
owner,
repo,
comment_id: existingComment.id,
body,
});
} else {
// Create new comment
await this.octokit.rest.issues.createComment({
owner,
repo,
issue_number: issueNumber,
body,
});
}
} catch (error) {
core.warning(`Failed to comment on issue: ${error}`);
}
}
/**
* Main action execution
*/
async run(): Promise<ActionOutputs> {
core.debug(`GitHub user ID: ${this.inputs.githubUserID}`);
const coderUser = await this.coder.getCoderUserByGitHubId(
this.inputs.githubUserID,
);
const { githubOrg, githubRepo, githubIssueNumber } =
this.parseGithubIssueURL();
core.debug(`GitHub owner: ${githubOrg}`);
core.debug(`GitHub repo: ${githubRepo}`);
core.debug(`GitHub issue number: ${githubIssueNumber}`);
core.debug(`Coder username: ${coderUser.username}`);
if (!this.inputs.coderTaskNamePrefix || !this.inputs.githubIssueURL) {
throw new Error(
"both coderTaskNamePrefix and githubIssueURL must be provided",
);
}
const taskName = `${this.inputs.coderTaskNamePrefix}-${githubIssueNumber}`;
core.debug(`Coder Task name: ${taskName}`);
const template = await this.coder.getTemplateByOrganizationAndName(
this.inputs.coderOrganization,
this.inputs.coderTemplateName,
);
core.debug(
`Coder Template: ${template.name} (id:${template.id}, active_version_id:${template.active_version_id})`,
);
const templateVersionPresets = await this.coder.getTemplateVersionPresets(
template.active_version_id,
);
let presetID: string | undefined;
if (this.inputs.coderTemplatePreset) {
// Look up the preset requested by the user
const preset = templateVersionPresets.find(
(p) => p.Name === this.inputs.coderTemplatePreset,
);
if (!preset) {
// User requested a preset that does not exist
throw new Error(`Preset ${this.inputs.coderTemplatePreset} not found`);
}
presetID = preset.ID;
core.debug(`Coder Template Preset ID: ${presetID}`);
} else {
// No preset specified: fall back to the template's default preset, if any
presetID = templateVersionPresets.find((p) => p.Default)?.ID;
}
const existingTask = await this.coder.getTask(coderUser.username, taskName);
if (existingTask) {
core.debug(`Task already exists: ${existingTask.id}`);
core.debug("Sending prompt to existing task...");
// Send prompt to existing task
await this.coder.sendTaskInput(
coderUser.username,
taskName,
this.inputs.coderTaskPrompt,
);
core.debug("Prompt sent successfully");
return {
coderUsername: coderUser.username,
taskName: existingTask.name,
taskUrl: this.generateTaskUrl(coderUser.username, taskName),
taskCreated: false,
};
}
core.debug("Creating Coder task...");
const req: ExperimentalCoderSDKCreateTaskRequest = {
name: taskName,
template_version_id: template.active_version_id,
template_version_preset_id: presetID,
input: this.inputs.coderTaskPrompt,
};
// Create new task
const createdTask = await this.coder.createTask(coderUser.username, req);
core.debug("Task created successfully");
// Generate task URL
const taskUrl = this.generateTaskUrl(coderUser.username, createdTask.name);
core.debug(`Task URL: ${taskUrl}`);
// Comment on the issue with a link to the task
core.debug(
`Commenting on issue ${githubOrg}/${githubRepo}#${githubIssueNumber}`,
);
await this.commentOnIssue(
taskUrl,
githubOrg,
githubRepo,
githubIssueNumber,
);
core.debug(`Comment posted successfully`);
return {
coderUsername: coderUser.username,
taskName: taskName,
taskUrl,
taskCreated: true,
};
}
}
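The URL handling in `CoderTaskAction` can be sketched in isolation. The free-function names below are hypothetical, but the regex and template string mirror `parseGithubIssueURL` and `generateTaskUrl` above:

```typescript
// Hypothetical standalone versions of the action's URL helpers.
function parseIssueURL(url: string): {
  githubOrg: string;
  githubRepo: string;
  githubIssueNumber: number;
} {
  // Parse: https://github.com/owner/repo/issues/123
  const match = url.match(/([^/]+)\/([^/]+)\/issues\/(\d+)/);
  if (!match) {
    throw new Error(`Invalid issue URL: ${url}`);
  }
  return {
    githubOrg: match[1],
    githubRepo: match[2],
    githubIssueNumber: parseInt(match[3], 10),
  };
}

function generateTaskUrl(
  coderURL: string,
  username: string,
  taskName: string,
): string {
  return `${coderURL}/tasks/${username}/${taskName}`;
}

const parsed = parseIssueURL("https://github.com/owner/repo/issues/123");
console.log(parsed.githubOrg, parsed.githubRepo, parsed.githubIssueNumber); // owner repo 123
console.log(generateTaskUrl("https://coder.test", "testuser", "task-123"));
// https://coder.test/tasks/testuser/task-123
```

Note that plain string concatenation means a trailing slash on the base URL yields a double slash in the task URL, which is the behavior the URL tests pin down.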
@@ -0,0 +1,326 @@
import { describe, expect, test, beforeEach, mock } from "bun:test";
import {
RealCoderClient,
CoderAPIError,
ExperimentalCoderSDKCreateTaskRequestSchema,
ExperimentalCoderSDKCreateTaskRequest,
} from "./coder-client";
import {
mockUser,
mockUserList,
mockUserListEmpty,
mockUserListDuplicate,
mockTemplate,
mockTemplateVersionPresets,
mockTask,
mockTaskList,
mockTaskListEmpty,
createMockInputs,
createMockResponse,
mockTemplateVersionPreset,
} from "./test-helpers";
describe("CoderClient", () => {
let client: RealCoderClient;
let mockFetch: ReturnType<typeof mock>;
beforeEach(() => {
const mockInputs = createMockInputs();
client = new RealCoderClient(mockInputs.coderURL, mockInputs.coderToken);
mockFetch = mock(() => Promise.resolve(createMockResponse([])));
global.fetch = mockFetch as unknown as typeof fetch;
});
describe("getCoderUserByGitHubId", () => {
test("returns the user when found", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockUserList));
const result = await client.getCoderUserByGitHubId(
mockUser.github_com_user_id,
);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/users?q=github_com_user_id%3A${mockUser.github_com_user_id!.toString()}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
expect(result.id).toBe(mockUser.id);
expect(result.username).toBe(mockUser.username);
expect(result.github_com_user_id).toBe(mockUser.github_com_user_id);
});
test("throws an error if multiple Coder users are found with the same GitHub ID", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockUserListDuplicate));
await expect(
client.getCoderUserByGitHubId(mockUser.github_com_user_id!),
).rejects.toThrow(CoderAPIError);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/users?q=github_com_user_id%3A${mockUser.github_com_user_id!.toString()}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
});
test("throws an error if no Coder user is found with the given GitHub ID", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockUserListEmpty));
await expect(
client.getCoderUserByGitHubId(mockUser.github_com_user_id!),
).rejects.toThrow(CoderAPIError);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/users?q=github_com_user_id%3A${mockUser.github_com_user_id}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
});
test("throws error on 401 unauthorized", async () => {
mockFetch.mockResolvedValue(
createMockResponse(
{ error: "Unauthorized" },
{ ok: false, status: 401, statusText: "Unauthorized" },
),
);
await expect(
client.getCoderUserByGitHubId(mockUser.github_com_user_id!),
).rejects.toThrow(CoderAPIError);
});
test("throws error on 500 server error", async () => {
mockFetch.mockResolvedValue(
createMockResponse(
{ error: "Internal Server Error" },
{ ok: false, status: 500, statusText: "Internal Server Error" },
),
);
await expect(
client.getCoderUserByGitHubId(mockUser.github_com_user_id!),
).rejects.toThrow(CoderAPIError);
});
test("throws an error when GitHub user ID is 0", async () => {
mockFetch.mockResolvedValue(createMockResponse([mockUser]));
await expect(client.getCoderUserByGitHubId(0)).rejects.toThrow(
"GitHub user ID cannot be 0",
);
});
});
describe("getTemplateByOrganizationAndName", () => {
test("the given template is returned successfully if it exists", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockTemplate));
const mockInputs = createMockInputs();
const result = await client.getTemplateByOrganizationAndName(
mockInputs.coderOrganization,
mockTemplate.name,
);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/organizations/${mockInputs.coderOrganization}/templates/${mockTemplate.name}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
expect(result.id).toBe(mockTemplate.id);
expect(result.name).toBe(mockTemplate.name);
expect(result.active_version_id).toBe(mockTemplate.active_version_id);
});
test("throws an error when the given template is not found", async () => {
mockFetch.mockResolvedValue(
createMockResponse(
{ error: "Not found" },
{ ok: false, status: 404, statusText: "Not Found" },
),
);
const mockInputs = createMockInputs();
await expect(
client.getTemplateByOrganizationAndName(
mockInputs.coderOrganization,
"nonexistent",
),
).rejects.toThrow(CoderAPIError);
});
});
describe("getTemplateVersionPresets", () => {
test("returns template version presets", async () => {
mockFetch.mockResolvedValue(
createMockResponse(mockTemplateVersionPresets),
);
const result = await client.getTemplateVersionPresets(
mockTemplate.active_version_id,
);
expect(result).not.toBeNull();
expect(result).toHaveLength(mockTemplateVersionPresets.length);
for (let idx = 0; idx < result.length; idx++) {
expect(result[idx].ID).toBe(mockTemplateVersionPresets[idx].ID);
expect(result[idx].Name).toBe(mockTemplateVersionPresets[idx].Name);
}
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/templateversions/${mockTemplate.active_version_id}/presets`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
});
});
describe("getTask", () => {
test("returns task when task exists", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockTaskList));
const result = await client.getTask(mockUser.username, mockTask.name);
expect(result).not.toBeNull();
expect(result?.id).toBe(mockTask.id);
expect(result?.name).toBe(mockTask.name);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/experimental/tasks?q=owner%3A${mockUser.username}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
});
test("returns null when task doesn't exist (empty list)", async () => {
mockFetch.mockResolvedValue(createMockResponse(mockTaskListEmpty));
const result = await client.getTask(mockUser.username, mockTask.name);
expect(result).toBeNull();
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/experimental/tasks?q=owner%3A${mockUser.username}`,
expect.objectContaining({
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
}),
);
});
});
describe("createTask", () => {
test("creates task successfully given valid input", async () => {
mockFetch.mockResolvedValueOnce(createMockResponse(mockTask));
const mockInputs = createMockInputs();
const result = await client.createTask(mockUser.username, {
name: mockTask.name,
template_version_id: mockTemplate.active_version_id,
input: mockInputs.coderTaskPrompt,
});
expect(result.id).toBe(mockTask.id);
expect(result.name).toBe(mockTask.name);
expect(mockFetch).toHaveBeenNthCalledWith(
1,
`https://coder.test/api/experimental/tasks/${mockUser.username}`,
expect.objectContaining({
method: "POST",
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
body: JSON.stringify({
name: mockTask.name,
template_version_id: mockTemplate.active_version_id,
input: mockInputs.coderTaskPrompt,
}),
}),
);
});
test("creates task successfully with a given preset", async () => {
mockFetch.mockResolvedValueOnce(createMockResponse(mockTask));
const mockInputs = createMockInputs();
const result = await client.createTask(mockUser.username, {
name: mockTask.name,
template_version_id: mockTemplate.active_version_id,
template_version_preset_id: mockTemplateVersionPreset.ID,
input: mockInputs.coderTaskPrompt,
});
expect(result.id).toBe(mockTask.id);
expect(result.name).toBe(mockTask.name);
expect(mockFetch).toHaveBeenNthCalledWith(
1,
`https://coder.test/api/experimental/tasks/${mockUser.username}`,
expect.objectContaining({
method: "POST",
headers: expect.objectContaining({
"Coder-Session-Token": "test-token",
}),
body: JSON.stringify({
name: mockTask.name,
template_version_id: mockTemplate.active_version_id,
template_version_preset_id: mockTemplateVersionPreset.ID,
input: mockInputs.coderTaskPrompt,
}),
}),
);
});
});
describe("sendTaskInput", () => {
test("sends input successfully", async () => {
mockFetch.mockResolvedValue(createMockResponse({}));
const testInput = "Test input";
await client.sendTaskInput(mockUser.username, mockTask.name, testInput);
expect(mockFetch).toHaveBeenCalledWith(
`https://coder.test/api/v2/users/${mockUser.username}/tasks/${mockTask.name}/send`,
expect.objectContaining({
method: "POST",
body: expect.stringContaining(testInput),
}),
);
});
test("request body contains input field", async () => {
mockFetch.mockResolvedValue(createMockResponse({}));
const testInput = "Test input";
await client.sendTaskInput(mockUser.username, mockTask.name, testInput);
const call = mockFetch.mock.calls[0];
const body = JSON.parse(call[1].body);
expect(body.input).toBe(testInput);
});
test("throws error when task not found (404)", async () => {
mockFetch.mockResolvedValue(
createMockResponse(
{ error: "Not Found" },
{ ok: false, status: 404, statusText: "Not Found" },
),
);
const testInput = "Test input";
await expect(
client.sendTaskInput(mockUser.username, mockTask.name, testInput),
).rejects.toThrow(CoderAPIError);
});
test("throws error when task not running (400)", async () => {
mockFetch.mockResolvedValue(
createMockResponse(
{ error: "Bad Request" },
{ ok: false, status: 400, statusText: "Bad Request" },
),
);
const testInput = "Test input";
await expect(
client.sendTaskInput(mockUser.username, mockTask.name, testInput),
).rejects.toThrow(CoderAPIError);
});
});
});
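The exact request URLs asserted above come from running `encodeURIComponent` over the Coder search filter before appending it as the `q` query parameter. A small sketch (the `usersByGithubIdEndpoint` helper is hypothetical; the client builds the string inline):

```typescript
// How the client builds the filtered users query; ':' is percent-encoded to %3A.
function usersByGithubIdEndpoint(githubUserId: number): string {
  return `/api/v2/users?q=${encodeURIComponent(`github_com_user_id:${githubUserId}`)}`;
}

console.log(usersByGithubIdEndpoint(12345));
// /api/v2/users?q=github_com_user_id%3A12345
```

Encoding the whole `key:value` filter keeps the query safe even if a value contains reserved URL characters.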
@@ -0,0 +1,271 @@
import { z } from "zod";
export interface CoderClient {
getCoderUserByGitHubId(
githubUserId: number | undefined,
): Promise<CoderSDKUser>;
getTemplateByOrganizationAndName(
organizationName: string,
templateName: string,
): Promise<CoderSDKTemplate>;
getTemplateVersionPresets(
templateVersionId: string,
): Promise<CoderSDKTemplateVersionPreset[]>;
getTask(
owner: string,
taskName: string,
): Promise<ExperimentalCoderSDKTask | null>;
createTask(
owner: string,
params: ExperimentalCoderSDKCreateTaskRequest,
): Promise<ExperimentalCoderSDKTask>;
sendTaskInput(owner: string, taskName: string, input: string): Promise<void>;
}
// CoderClient provides a minimal set of methods for interacting with the Coder API.
export class RealCoderClient implements CoderClient {
private readonly headers: Record<string, string>;
constructor(
private readonly serverURL: string,
apiToken: string,
) {
this.headers = {
"Coder-Session-Token": apiToken,
"Content-Type": "application/json",
};
}
private async request<T>(
endpoint: string,
options?: RequestInit,
): Promise<T> {
const url = `${this.serverURL}${endpoint}`;
const response = await fetch(url, {
...options,
headers: { ...this.headers, ...options?.headers },
});
if (!response.ok) {
const body = await response.text().catch(() => "");
throw new CoderAPIError(
`Coder API error: ${response.statusText}`,
response.status,
body,
);
}
return response.json() as Promise<T>;
}
/**
* getCoderUserByGitHubId retrieves an existing Coder user with the given GitHub user ID using Coder's stable API.
* Throws an error if more than one user exists with the same GitHub user ID or if a GitHub user ID of 0 is provided.
*/
async getCoderUserByGitHubId(
githubUserId: number | undefined,
): Promise<CoderSDKUser> {
if (githubUserId === undefined) {
throw new CoderAPIError("GitHub user ID cannot be undefined", 400);
}
if (githubUserId === 0) {
throw new CoderAPIError("GitHub user ID cannot be 0", 400);
}
const endpoint = `/api/v2/users?q=${encodeURIComponent(`github_com_user_id:${githubUserId}`)}`;
const response = await this.request<unknown[]>(endpoint);
const userList = CoderSDKGetUsersResponseSchema.parse(response);
if (userList.users.length === 0) {
throw new CoderAPIError(
`No Coder user found with GitHub user ID ${githubUserId}`,
404,
);
}
if (userList.users.length > 1) {
throw new CoderAPIError(
`Multiple Coder users found with GitHub user ID ${githubUserId}`,
409,
);
}
return CoderSDKUserSchema.parse(userList.users[0]);
}
/**
* getTemplateByOrganizationAndName retrieves a template via Coder's stable API.
*/
async getTemplateByOrganizationAndName(
organizationName: string,
templateName: string,
): Promise<CoderSDKTemplate> {
const endpoint = `/api/v2/organizations/${encodeURIComponent(organizationName)}/templates/${encodeURIComponent(templateName)}`;
const response = await this.request<unknown>(endpoint);
return CoderSDKTemplateSchema.parse(response);
}
/**
* getTemplateVersionPresets retrieves the presets for a given template version (UUID).
*/
async getTemplateVersionPresets(
templateVersionId: string,
): Promise<CoderSDKTemplateVersionPresetsResponse> {
const endpoint = `/api/v2/templateversions/${encodeURIComponent(templateVersionId)}/presets`;
const response =
await this.request<CoderSDKTemplateVersionPresetsResponse>(endpoint);
return CoderSDKTemplateVersionPresetsResponseSchema.parse(response);
}
/**
* getTask retrieves an existing task via Coder's experimental Tasks API.
* Returns null if the task does not exist.
*/
async getTask(
owner: string,
taskName: string,
): Promise<ExperimentalCoderSDKTask | null> {
// TODO: needs taskByOwnerAndName endpoint, fake it for now with the list endpoint.
try {
const allTasksResponse = await this.request<unknown>(
`/api/experimental/tasks?q=${encodeURIComponent(`owner:${owner}`)}`,
);
const allTasks =
ExperimentalCoderSDKTaskListResponseSchema.parse(allTasksResponse);
const task = allTasks.tasks.find((t) => t.name === taskName);
if (!task) {
return null;
}
return task;
} catch (error) {
if (error instanceof CoderAPIError && error.statusCode === 404) {
return null;
}
throw error;
}
}
/**
* createTask creates a new task with the given parameters using Coder's experimental Tasks API.
*/
async createTask(
owner: string,
params: ExperimentalCoderSDKCreateTaskRequest,
): Promise<ExperimentalCoderSDKTask> {
const endpoint = `/api/experimental/tasks/${encodeURIComponent(owner)}`;
const response = await this.request<unknown>(endpoint, {
method: "POST",
body: JSON.stringify(params),
});
return ExperimentalCoderSDKTaskSchema.parse(response);
}
/**
* sendTaskInput sends the given input to an existing task via Coder's experimental Tasks API.
*/
async sendTaskInput(
ownerUsername: string,
taskName: string,
input: string,
): Promise<void> {
const endpoint = `/api/v2/users/${encodeURIComponent(ownerUsername)}/tasks/${encodeURIComponent(taskName)}/send`;
await this.request<unknown>(endpoint, {
method: "POST",
body: JSON.stringify({ input }),
});
}
}
// CoderSDKUserSchema is the schema for codersdk.User.
export const CoderSDKUserSchema = z.object({
id: z.string().uuid(),
username: z.string(),
email: z.string().email(),
organization_ids: z.array(z.string().uuid()),
github_com_user_id: z.number().optional(),
});
export type CoderSDKUser = z.infer<typeof CoderSDKUserSchema>;
// CoderSDKGetUsersResponseSchema is the schema for codersdk.GetUsersResponse.
export const CoderSDKGetUsersResponseSchema = z.object({
users: z.array(CoderSDKUserSchema),
});
export type CoderSDKGetUsersResponse = z.infer<
typeof CoderSDKGetUsersResponseSchema
>;
// CoderSDKTemplateSchema is the schema for codersdk.Template.
export const CoderSDKTemplateSchema = z.object({
id: z.string().uuid(),
name: z.string(),
description: z.string().optional(),
organization_id: z.string().uuid(),
active_version_id: z.string().uuid(),
});
export type CoderSDKTemplate = z.infer<typeof CoderSDKTemplateSchema>;
// CoderSDKTemplateVersionPresetSchema is the schema for codersdk.Preset.
export const CoderSDKTemplateVersionPresetSchema = z.object({
ID: z.string().uuid(),
Name: z.string(),
Default: z.boolean(),
});
export type CoderSDKTemplateVersionPreset = z.infer<
typeof CoderSDKTemplateVersionPresetSchema
>;
// CoderSDKTemplateVersionPresetsResponseSchema is the schema for []codersdk.Preset which is returned by the API.
export const CoderSDKTemplateVersionPresetsResponseSchema = z.array(
CoderSDKTemplateVersionPresetSchema,
);
export type CoderSDKTemplateVersionPresetsResponse = z.infer<
typeof CoderSDKTemplateVersionPresetsResponseSchema
>;
// ExperimentalCoderSDKCreateTaskRequestSchema is the schema for experimental codersdk.CreateTaskRequest.
export const ExperimentalCoderSDKCreateTaskRequestSchema = z.object({
name: z.string().min(1),
template_version_id: z.string().min(1),
template_version_preset_id: z.string().min(1).optional(),
input: z.string().min(1),
});
export type ExperimentalCoderSDKCreateTaskRequest = z.infer<
typeof ExperimentalCoderSDKCreateTaskRequestSchema
>;
// ExperimentalCoderSDKTaskSchema is the schema for experimental codersdk.Task.
export const ExperimentalCoderSDKTaskSchema = z.object({
id: z.string().uuid(),
name: z.string(),
owner_id: z.string().uuid(),
template_id: z.string().uuid(),
created_at: z.string(),
updated_at: z.string(),
status: z.string(),
});
export type ExperimentalCoderSDKTask = z.infer<
typeof ExperimentalCoderSDKTaskSchema
>;
// ExperimentalCoderSDKTaskListResponseSchema is the schema for Coder's GET /api/experimental/tasks endpoint.
// At the time of writing, this type is not exported by github.com/coder/coder/v2/codersdk.
export const ExperimentalCoderSDKTaskListResponseSchema = z.object({
tasks: z.array(ExperimentalCoderSDKTaskSchema),
});
export type ExperimentalCoderSDKTaskListResponse = z.infer<
typeof ExperimentalCoderSDKTaskListResponseSchema
>;
// CoderAPIError is a custom error class for Coder API errors.
export class CoderAPIError extends Error {
constructor(
message: string,
public readonly statusCode: number,
public readonly response?: unknown,
) {
super(message);
this.name = "CoderAPIError";
}
}
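getTask above converts a 404 from the API into a null result while rethrowing everything else. A minimal runnable sketch of that pattern (the class is restated so the snippet stands alone; nullOn404 is a hypothetical name):

```typescript
// Restated from the file above so this sketch is self-contained.
class CoderAPIError extends Error {
  constructor(
    message: string,
    public readonly statusCode: number,
    public readonly response?: unknown,
  ) {
    super(message);
    this.name = "CoderAPIError";
  }
}

// Convert a 404 into a null result; rethrow anything else.
function nullOn404<T>(error: unknown): T | null {
  if (error instanceof CoderAPIError && error.statusCode === 404) {
    return null;
  }
  throw error;
}
```

Callers then treat "not found" as an ordinary value rather than an exception, which keeps the create-or-reuse flow in the action straightforward.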
@@ -0,0 +1,67 @@
import * as core from "@actions/core";
import * as github from "@actions/github";
import { CoderTaskAction } from "./action";
import { RealCoderClient } from "./coder-client";
import { ActionInputsSchema } from "./schemas";
async function main() {
try {
// Parse and validate inputs
const inputs = ActionInputsSchema.parse({
coderURL: core.getInput("coder-url", { required: true }),
coderToken: core.getInput("coder-token", { required: true }),
coderTemplateName: core.getInput("template-name", { required: true }),
coderTaskPrompt: core.getInput("task-prompt", { required: true }),
githubUserID: Number.parseInt(
core.getInput("github-user-id", { required: true }),
10,
),
coderTemplatePreset: core.getInput("template-preset") || "Default",
coderTaskNamePrefix: core.getInput("task-name-prefix") || "task",
coderOrganization: core.getInput("organization") || "coder",
githubIssueURL: core.getInput("issue-url", { required: true }),
githubToken: core.getInput("github-token", { required: true }),
});
core.debug("Inputs validated successfully");
core.debug(`Coder URL: ${inputs.coderURL}`);
core.debug(`Template: ${inputs.coderTemplateName}`);
core.debug(`Organization: ${inputs.coderOrganization}`);
// Initialize clients
const coder = new RealCoderClient(inputs.coderURL, inputs.coderToken);
const octokit = github.getOctokit(inputs.githubToken);
core.debug("Clients initialized");
// Execute action
const action = new CoderTaskAction(coder, octokit, inputs);
const outputs = await action.run();
// Set outputs
core.setOutput("coder-username", outputs.coderUsername);
core.setOutput("task-name", outputs.taskName);
core.setOutput("task-url", outputs.taskUrl);
core.setOutput("task-exists", outputs.taskCreated.toString());
core.debug("Action completed successfully");
core.debug(`Outputs: ${JSON.stringify(outputs, null, 2)}`);
} catch (error) {
if (error instanceof Error) {
core.setFailed(error.message);
console.error("Action failed:", error);
if (error.stack) {
console.error("Stack trace:", error.stack);
}
} else {
core.setFailed("Unknown error occurred");
console.error("Unknown error:", error);
}
process.exit(1);
}
}
main();
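The input parsing in main() relies on a common coercion pattern: @actions/core returns an empty string for unset inputs, so empty strings are normalized to a default (or undefined) before Zod validation. A standalone sketch of that pattern (helper names are hypothetical):

```typescript
// Hypothetical helpers illustrating the `core.getInput(...) || fallback`
// coercion used in main(): "" means "input not provided".
function orDefault(value: string, fallback: string): string {
  return value === "" ? fallback : value;
}

function orUndefined(value: string): string | undefined {
  return value === "" ? undefined : value;
}
```

The `|| fallback` shorthand in main() behaves the same way because an empty string is falsy; these helpers just make the intent explicit.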
@@ -0,0 +1,96 @@
import { describe, expect, test } from "bun:test";
import { ActionInputs, ActionInputsSchema } from "./schemas";
const actionInputValid: ActionInputs = {
coderURL: "https://coder.test",
coderToken: "test-token",
coderOrganization: "my-org",
coderTaskNamePrefix: "gh",
coderTaskPrompt: "test prompt",
coderTemplateName: "test-template",
githubIssueURL: "https://github.com/owner/repo/issues/123",
githubToken: "github-token",
githubUserID: 12345,
coderTemplatePreset: "",
};
describe("ActionInputsSchema", () => {
describe("Valid Input Cases", () => {
test("accepts minimal required inputs and sets default values correctly", () => {
const result = ActionInputsSchema.parse(actionInputValid);
expect(result.coderURL).toBe(actionInputValid.coderURL);
expect(result.coderToken).toBe(actionInputValid.coderToken);
expect(result.coderOrganization).toBe(actionInputValid.coderOrganization);
expect(result.coderTaskNamePrefix).toBe(
actionInputValid.coderTaskNamePrefix,
);
expect(result.coderTaskPrompt).toBe(actionInputValid.coderTaskPrompt);
expect(result.coderTemplateName).toBe(actionInputValid.coderTemplateName);
expect(result.githubIssueURL).toBe(actionInputValid.githubIssueURL);
expect(result.githubToken).toBe(actionInputValid.githubToken);
expect(result.githubUserID).toBe(actionInputValid.githubUserID);
expect(result.coderTemplatePreset).toBeEmpty();
});
test("accepts all optional inputs", () => {
const input: ActionInputs = {
...actionInputValid,
coderTemplatePreset: "custom",
};
const result = ActionInputsSchema.parse(input);
expect(result.coderTemplatePreset).toBe(input.coderTemplatePreset);
});
test("accepts valid URL formats", () => {
const validUrls = [
"https://coder.test",
"https://coder.example.com:8080",
"http://12.34.56.78",
"https://12.34.56.78:9000",
"http://localhost:3000",
"http://127.0.0.1:3000",
"http://[::1]:3000",
];
for (const url of validUrls) {
const input: ActionInputs = {
...actionInputValid,
coderURL: url,
};
const result = ActionInputsSchema.parse(input);
expect(result.coderURL).toBe(url);
}
});
});
describe("Invalid Input Cases", () => {
test("rejects missing required fields", () => {
const input = {} as ActionInputs;
expect(() => ActionInputsSchema.parse(input)).toThrow();
});
test("rejects invalid URL format for coderUrl", () => {
const input: ActionInputs = {
...actionInputValid,
coderURL: "not-a-url",
};
expect(() => ActionInputsSchema.parse(input)).toThrow();
});
test("rejects invalid URL format for issueUrl", () => {
const input: ActionInputs = {
...actionInputValid,
githubIssueURL: "not-a-url",
};
expect(() => ActionInputsSchema.parse(input)).toThrow();
});
test("rejects empty strings for required fields", () => {
const input: ActionInputs = {
...actionInputValid,
coderToken: "",
};
expect(() => ActionInputsSchema.parse(input)).toThrow();
});
});
});
@@ -0,0 +1,27 @@
import { z } from "zod";
export type ActionInputs = z.infer<typeof ActionInputsSchema>;
export const ActionInputsSchema = z.object({
// Required
coderTaskPrompt: z.string().min(1),
coderToken: z.string().min(1),
coderURL: z.string().url(),
coderOrganization: z.string().min(1),
coderTaskNamePrefix: z.string().min(1),
coderTemplateName: z.string().min(1),
githubIssueURL: z.string().url(),
githubToken: z.string(),
githubUserID: z.number().min(1),
// Optional
coderTemplatePreset: z.string().optional(),
});
export const ActionOutputsSchema = z.object({
coderUsername: z.string(),
taskName: z.string(),
taskUrl: z.string().url(),
taskCreated: z.boolean(),
});
export type ActionOutputs = z.infer<typeof ActionOutputsSchema>;
@@ -0,0 +1,207 @@
import { mock } from "bun:test";
import { CoderClient } from "./coder-client";
import type {
CoderSDKUser,
CoderSDKGetUsersResponse,
CoderSDKTemplate,
CoderSDKTemplateVersionPreset,
ExperimentalCoderSDKTask,
ExperimentalCoderSDKTaskListResponse,
ExperimentalCoderSDKCreateTaskRequest,
} from "./coder-client";
import type { ActionInputs } from "./schemas";
/**
* Mock data for tests
*/
export const mockUser: CoderSDKUser = {
id: "550e8400-e29b-41d4-a716-446655440000",
username: "testuser",
email: "test@example.com",
organization_ids: ["660e8400-e29b-41d4-a716-446655440000"],
github_com_user_id: 12345,
};
export const mockUserList: CoderSDKGetUsersResponse = {
users: [mockUser],
};
export const mockUserListEmpty: CoderSDKGetUsersResponse = {
users: [],
};
export const mockUserListDuplicate: CoderSDKGetUsersResponse = {
users: [
mockUser,
{
...mockUser,
id: "660e8400-e29b-41d4-a716-446655440001",
username: "testuser2",
},
],
};
export const mockTemplate: CoderSDKTemplate = {
id: "770e8400-e29b-41d4-a716-446655440000",
name: "my-template",
description: "AI triage template",
organization_id: "660e8400-e29b-41d4-a716-446655440000",
active_version_id: "880e8400-e29b-41d4-a716-446655440000",
};
export const mockTemplateVersionPreset: CoderSDKTemplateVersionPreset = {
ID: "880e8400-e29b-41d4-a716-446655440000",
Name: "default-preset",
Default: true,
};
export const mockTemplateVersionPreset2: CoderSDKTemplateVersionPreset = {
ID: "990e8400-e29b-41d4-a716-446655440000",
Name: "another-preset",
Default: false,
};
export const mockTemplateVersionPresets = [
mockTemplateVersionPreset,
mockTemplateVersionPreset2,
];
export const mockTask: ExperimentalCoderSDKTask = {
id: "990e8400-e29b-41d4-a716-446655440000",
name: "task-123",
owner_id: "550e8400-e29b-41d4-a716-446655440000",
template_id: "770e8400-e29b-41d4-a716-446655440000",
created_at: "2024-01-01T00:00:00Z",
updated_at: "2024-01-01T00:00:00Z",
status: "running",
};
export const mockTaskList: ExperimentalCoderSDKTaskListResponse = {
tasks: [mockTask],
};
export const mockTaskListEmpty: ExperimentalCoderSDKTaskListResponse = {
tasks: [],
};
/**
* Create mock ActionInputs with defaults
*/
export function createMockInputs(
overrides?: Partial<ActionInputs>,
): ActionInputs {
return {
coderTaskPrompt: "Test prompt",
coderToken: "test-token",
coderURL: "https://coder.test",
coderOrganization: "coder",
coderTaskNamePrefix: "task",
coderTemplateName: "my-template",
githubToken: "github-token",
githubIssueURL: "https://github.com/test-org/test-repo/issues/12345",
githubUserID: 12345,
...overrides,
};
}
/**
* Mock CoderClient for testing
*/
export class MockCoderClient implements CoderClient {
private readonly headers: Record<string, string>;
public mockGetCoderUserByGithubID = mock();
public mockGetTemplateByOrganizationAndName = mock();
public mockGetTemplateVersionPresets = mock();
public mockGetTask = mock();
public mockCreateTask = mock();
public mockSendTaskInput = mock();
// The real client takes a server URL and API token; the mock needs
// neither, so those constructor parameters are omitted.
constructor() {
this.headers = {};
}
async getCoderUserByGitHubId(githubUserId: number): Promise<CoderSDKUser> {
return this.mockGetCoderUserByGithubID(githubUserId);
}
async getTemplateByOrganizationAndName(
organization: string,
templateName: string,
): Promise<CoderSDKTemplate> {
return this.mockGetTemplateByOrganizationAndName(
organization,
templateName,
);
}
async getTemplateVersionPresets(
templateVersionId: string,
): Promise<CoderSDKTemplateVersionPreset[]> {
return this.mockGetTemplateVersionPresets(templateVersionId);
}
async getTask(
username: string,
taskName: string,
): Promise<ExperimentalCoderSDKTask | null> {
return this.mockGetTask(username, taskName);
}
async createTask(
username: string,
params: ExperimentalCoderSDKCreateTaskRequest,
): Promise<ExperimentalCoderSDKTask> {
return this.mockCreateTask(username, params);
}
async sendTaskInput(
username: string,
taskName: string,
input: string,
): Promise<void> {
return this.mockSendTaskInput(username, taskName, input);
}
}
/**
* Mock Octokit for testing
*/
export function createMockOctokit() {
return {
rest: {
users: {
getByUsername: mock(),
},
issues: {
listComments: mock(),
createComment: mock(),
updateComment: mock(),
},
},
};
}
/**
* Mock fetch for testing
*/
export function createMockFetch() {
return mock();
}
/**
* Create mock fetch response
*/
export function createMockResponse(
body: unknown,
options: { ok?: boolean; status?: number; statusText?: string } = {},
) {
return {
ok: options.ok ?? true,
status: options.status ?? 200,
statusText: options.statusText ?? "OK",
json: async () => body,
text: async () => JSON.stringify(body),
};
}
@@ -0,0 +1,17 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "ESNext",
"moduleResolution": "bundler",
"lib": ["ES2022"],
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}
@@ -80,6 +80,9 @@ updates:
mui:
patterns:
- "@mui*"
radix:
patterns:
- "@radix-ui/*"
react:
patterns:
- "react"
@@ -104,6 +107,7 @@ updates:
- dependency-name: "*"
update-types:
- version-update:semver-major
- dependency-name: "@playwright/test"
open-pull-requests-limit: 15
- package-ecosystem: "terraform"
@@ -4,6 +4,7 @@ on:
push:
branches:
- main
- release/*
pull_request:
workflow_dispatch:
@@ -919,6 +920,7 @@ jobs:
required:
runs-on: ubuntu-latest
needs:
- changes
- fmt
- lint
- gen
@@ -942,6 +944,7 @@ jobs:
- name: Ensure required checks
run: | # zizmor: ignore[template-injection] We're just reading needs.x.result here, no risk of injection
echo "Checking required checks"
echo "- changes: ${{ needs.changes.result }}"
echo "- fmt: ${{ needs.fmt.result }}"
echo "- lint: ${{ needs.lint.result }}"
echo "- gen: ${{ needs.gen.result }}"
@@ -967,7 +970,7 @@ jobs:
needs: changes
# We always build the dylibs on Go changes to verify we're not merging unbuildable code,
# but they need only be signed and uploaded on coder/coder main.
if: needs.changes.outputs.go == 'true' || needs.changes.outputs.ci == 'true' || github.ref == 'refs/heads/main'
if: needs.changes.outputs.go == 'true' || needs.changes.outputs.ci == 'true' || github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')
runs-on: ${{ github.repository_owner == 'coder' && 'depot-macos-latest' || 'macos-latest' }}
steps:
# Harden Runner doesn't work on macOS
@@ -995,7 +998,7 @@ jobs:
uses: ./.github/actions/setup-go
- name: Install rcodesign
if: ${{ github.repository_owner == 'coder' && github.ref == 'refs/heads/main' }}
if: ${{ github.repository_owner == 'coder' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) }}
run: |
set -euo pipefail
wget -O /tmp/rcodesign.tar.gz https://github.com/indygreg/apple-platform-rs/releases/download/apple-codesign%2F0.22.0/apple-codesign-0.22.0-macos-universal.tar.gz
@@ -1006,7 +1009,7 @@ jobs:
rm /tmp/rcodesign.tar.gz
- name: Setup Apple Developer certificate and API key
if: ${{ github.repository_owner == 'coder' && github.ref == 'refs/heads/main' }}
if: ${{ github.repository_owner == 'coder' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) }}
run: |
set -euo pipefail
touch /tmp/{apple_cert.p12,apple_cert_password.txt,apple_apikey.p8}
@@ -1027,12 +1030,12 @@ jobs:
make gen/mark-fresh
make build/coder-dylib
env:
CODER_SIGN_DARWIN: ${{ github.ref == 'refs/heads/main' && '1' || '0' }}
CODER_SIGN_DARWIN: ${{ (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) && '1' || '0' }}
AC_CERTIFICATE_FILE: /tmp/apple_cert.p12
AC_CERTIFICATE_PASSWORD_FILE: /tmp/apple_cert_password.txt
- name: Upload build artifacts
if: ${{ github.repository_owner == 'coder' && github.ref == 'refs/heads/main' }}
if: ${{ github.repository_owner == 'coder' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: dylibs
@@ -1042,7 +1045,7 @@ jobs:
retention-days: 7
- name: Delete Apple Developer certificate and API key
if: ${{ github.repository_owner == 'coder' && github.ref == 'refs/heads/main' }}
if: ${{ github.repository_owner == 'coder' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) }}
run: rm -f /tmp/{apple_cert.p12,apple_cert_password.txt,apple_apikey.p8}
check-build:
@@ -1092,7 +1095,7 @@ jobs:
needs:
- changes
- build-dylib
if: github.ref == 'refs/heads/main' && needs.changes.outputs.docs-only == 'false' && !github.event.pull_request.head.repo.fork
if: (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/')) && needs.changes.outputs.docs-only == 'false' && !github.event.pull_request.head.repo.fork
runs-on: ${{ github.repository_owner == 'coder' && 'depot-ubuntu-22.04-8' || 'ubuntu-22.04' }}
permissions:
# Necessary to push docker images to ghcr.io.
@@ -1245,40 +1248,45 @@ jobs:
id: build-docker
env:
CODER_IMAGE_BASE: ghcr.io/coder/coder-preview
CODER_IMAGE_TAG_PREFIX: main
DOCKER_CLI_EXPERIMENTAL: "enabled"
run: |
set -euxo pipefail
# build Docker images for each architecture
version="$(./scripts/version.sh)"
tag="main-${version//+/-}"
tag="${version//+/-}"
echo "tag=$tag" >> "$GITHUB_OUTPUT"
# build images for each architecture
# note: omitting the -j argument to avoid race conditions when pushing
make build/coder_"$version"_linux_{amd64,arm64,armv7}.tag
# only push if we are on main branch
if [ "${GITHUB_REF}" == "refs/heads/main" ]; then
# only push if we are on main branch or release branch
if [[ "${GITHUB_REF}" == "refs/heads/main" || "${GITHUB_REF}" == refs/heads/release/* ]]; then
# build and push multi-arch manifest, this depends on the other images
# being pushed so will automatically push them
# note: omitting the -j argument to avoid race conditions when pushing
make push/build/coder_"$version"_linux_{amd64,arm64,armv7}.tag
# Define specific tags
tags=("$tag" "main" "latest")
tags=("$tag")
if [ "${GITHUB_REF}" == "refs/heads/main" ]; then
tags+=("main" "latest")
elif [[ "${GITHUB_REF}" == refs/heads/release/* ]]; then
tags+=("release-${GITHUB_REF#refs/heads/release/}")
fi
# Create and push a multi-arch manifest for each tag
# we are adding `latest` tag and keeping `main` for backward
# compatibility
for t in "${tags[@]}"; do
# shellcheck disable=SC2046
./scripts/build_docker_multiarch.sh \
--push \
--target "ghcr.io/coder/coder-preview:$t" \
--version "$version" \
$(cat build/coder_"$version"_linux_{amd64,arm64,armv7}.tag)
echo "Pushing multi-arch manifest for tag: $t"
# shellcheck disable=SC2046
./scripts/build_docker_multiarch.sh \
--push \
--target "ghcr.io/coder/coder-preview:$t" \
--version "$version" \
$(cat build/coder_"$version"_linux_{amd64,arm64,armv7}.tag)
done
fi
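The tag-selection logic in the hunk above can be distilled into a small function (hypothetical helper, shown only to make the branching explicit): the version tag is always built, main additionally gets the moving `main` and `latest` tags, and release branches get a `release-<name>` tag.

```shell
compute_tags() {
  local ref="$1" version="$2"
  # "+" is not a valid Docker tag character, so version.sh output like
  # "2.16.0+dev" becomes "2.16.0-dev".
  local tag="${version//+/-}"
  local tags=("$tag")
  if [ "$ref" = "refs/heads/main" ]; then
    tags+=("main" "latest")
  elif [[ "$ref" == refs/heads/release/* ]]; then
    tags+=("release-${ref#refs/heads/release/}")
  fi
  printf '%s\n' "${tags[@]}"
}
```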
@@ -1469,112 +1477,28 @@ jobs:
./build/*.deb
retention-days: 7
# Deploy is handled in deploy.yaml so we can apply concurrency limits.
deploy:
name: "deploy"
runs-on: ubuntu-latest
timeout-minutes: 30
needs:
- changes
- build
if: |
github.ref == 'refs/heads/main' && !github.event.pull_request.head.repo.fork
(github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/'))
&& needs.changes.outputs.docs-only == 'false'
&& !github.event.pull_request.head.repo.fork
uses: ./.github/workflows/deploy.yaml
with:
image: ${{ needs.build.outputs.IMAGE }}
permissions:
contents: read
id-token: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Authenticate to Google Cloud
uses: google-github-actions/auth@7c6bc770dae815cd3e89ee6cdf493a5fab2cc093 # v3.0.0
with:
workload_identity_provider: ${{ vars.GCP_WORKLOAD_ID_PROVIDER }}
service_account: ${{ vars.GCP_SERVICE_ACCOUNT }}
- name: Set up Google Cloud SDK
uses: google-github-actions/setup-gcloud@aa5489c8933f4cc7a4f7d45035b3b1440c9c10db # v3.0.1
- name: Set up Flux CLI
uses: fluxcd/flux2/action@6bf37f6a560fd84982d67f853162e4b3c2235edb # v2.6.4
with:
# Keep this and the github action up to date with the version of flux installed in dogfood cluster
version: "2.5.1"
- name: Get Cluster Credentials
uses: google-github-actions/get-gke-credentials@3da1e46a907576cefaa90c484278bb5b259dd395 # v3.0.0
with:
cluster_name: dogfood-v2
location: us-central1-a
project_id: coder-dogfood-v2
- name: Reconcile Flux
run: |
set -euxo pipefail
flux --namespace flux-system reconcile source git flux-system
flux --namespace flux-system reconcile source git coder-main
flux --namespace flux-system reconcile kustomization flux-system
flux --namespace flux-system reconcile kustomization coder
flux --namespace flux-system reconcile source chart coder-coder
flux --namespace flux-system reconcile source chart coder-coder-provisioner
flux --namespace coder reconcile helmrelease coder
flux --namespace coder reconcile helmrelease coder-provisioner
# Just updating Flux is usually not enough. The Helm release may get
# redeployed, but unless something causes the Deployment to update the
# pods won't be recreated. It's important that the pods get recreated,
# since we use `imagePullPolicy: Always` to ensure we're running the
# latest image.
- name: Rollout Deployment
run: |
set -euxo pipefail
kubectl --namespace coder rollout restart deployment/coder
kubectl --namespace coder rollout status deployment/coder
kubectl --namespace coder rollout restart deployment/coder-provisioner
kubectl --namespace coder rollout status deployment/coder-provisioner
kubectl --namespace coder rollout restart deployment/coder-provisioner-tagged
kubectl --namespace coder rollout status deployment/coder-provisioner-tagged
deploy-wsproxies:
runs-on: ubuntu-latest
needs: build
if: github.ref == 'refs/heads/main' && !github.event.pull_request.head.repo.fork
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Setup flyctl
uses: superfly/flyctl-actions/setup-flyctl@fc53c09e1bc3be6f54706524e3b82c4f462f77be # v1.5
- name: Deploy workspace proxies
run: |
flyctl deploy --image "$IMAGE" --app paris-coder --config ./.github/fly-wsproxies/paris-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_PARIS" --yes
flyctl deploy --image "$IMAGE" --app sydney-coder --config ./.github/fly-wsproxies/sydney-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_SYDNEY" --yes
flyctl deploy --image "$IMAGE" --app sao-paulo-coder --config ./.github/fly-wsproxies/sao-paulo-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_SAO_PAULO" --yes
flyctl deploy --image "$IMAGE" --app jnb-coder --config ./.github/fly-wsproxies/jnb-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_JNB" --yes
env:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
IMAGE: ${{ needs.build.outputs.IMAGE }}
TOKEN_PARIS: ${{ secrets.FLY_PARIS_CODER_PROXY_SESSION_TOKEN }}
TOKEN_SYDNEY: ${{ secrets.FLY_SYDNEY_CODER_PROXY_SESSION_TOKEN }}
TOKEN_SAO_PAULO: ${{ secrets.FLY_SAO_PAULO_CODER_PROXY_SESSION_TOKEN }}
TOKEN_JNB: ${{ secrets.FLY_JNB_CODER_PROXY_SESSION_TOKEN }}
packages: write # to retag image as dogfood
secrets:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
FLY_PARIS_CODER_PROXY_SESSION_TOKEN: ${{ secrets.FLY_PARIS_CODER_PROXY_SESSION_TOKEN }}
FLY_SYDNEY_CODER_PROXY_SESSION_TOKEN: ${{ secrets.FLY_SYDNEY_CODER_PROXY_SESSION_TOKEN }}
FLY_SAO_PAULO_CODER_PROXY_SESSION_TOKEN: ${{ secrets.FLY_SAO_PAULO_CODER_PROXY_SESSION_TOKEN }}
FLY_JNB_CODER_PROXY_SESSION_TOKEN: ${{ secrets.FLY_JNB_CODER_PROXY_SESSION_TOKEN }}
# sqlc-vet runs a postgres docker container, runs Coder migrations, and then
# runs sqlc-vet to ensure all queries are valid. This catches any mistakes
@@ -0,0 +1,174 @@
name: deploy
on:
# Via workflow_call, called from ci.yaml
workflow_call:
inputs:
image:
description: "Image and tag to potentially deploy. Current branch will be validated against should-deploy check."
required: true
type: string
secrets:
FLY_API_TOKEN:
required: true
FLY_PARIS_CODER_PROXY_SESSION_TOKEN:
required: true
FLY_SYDNEY_CODER_PROXY_SESSION_TOKEN:
required: true
FLY_SAO_PAULO_CODER_PROXY_SESSION_TOKEN:
required: true
FLY_JNB_CODER_PROXY_SESSION_TOKEN:
required: true
permissions:
contents: read
concurrency:
group: ${{ github.workflow }} # no per-branch concurrency
cancel-in-progress: false
jobs:
# Determines if the given branch should be deployed to dogfood.
should-deploy:
name: should-deploy
runs-on: ubuntu-latest
outputs:
verdict: ${{ steps.check.outputs.verdict }} # DEPLOY or NOOP
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Check if deploy is enabled
id: check
run: |
set -euo pipefail
verdict="$(./scripts/should_deploy.sh)"
echo "verdict=$verdict" >> "$GITHUB_OUTPUT"
deploy:
name: "deploy"
runs-on: ubuntu-latest
timeout-minutes: 30
needs: should-deploy
if: needs.should-deploy.outputs.verdict == 'DEPLOY'
permissions:
contents: read
id-token: write
packages: write # to retag image as dogfood
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: GHCR Login
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Authenticate to Google Cloud
uses: google-github-actions/auth@7c6bc770dae815cd3e89ee6cdf493a5fab2cc093 # v3.0.0
with:
workload_identity_provider: ${{ vars.GCP_WORKLOAD_ID_PROVIDER }}
service_account: ${{ vars.GCP_SERVICE_ACCOUNT }}
- name: Set up Google Cloud SDK
uses: google-github-actions/setup-gcloud@aa5489c8933f4cc7a4f7d45035b3b1440c9c10db # v3.0.1
- name: Set up Flux CLI
uses: fluxcd/flux2/action@6bf37f6a560fd84982d67f853162e4b3c2235edb # v2.6.4
with:
# Keep this and the github action up to date with the version of flux installed in dogfood cluster
version: "2.7.0"
- name: Get Cluster Credentials
uses: google-github-actions/get-gke-credentials@3da1e46a907576cefaa90c484278bb5b259dd395 # v3.0.0
with:
cluster_name: dogfood-v2
location: us-central1-a
project_id: coder-dogfood-v2
# Retag image as dogfood while maintaining the multi-arch manifest
- name: Tag image as dogfood
run: docker buildx imagetools create --tag "ghcr.io/coder/coder-preview:dogfood" "$IMAGE"
env:
IMAGE: ${{ inputs.image }}
- name: Reconcile Flux
run: |
set -euxo pipefail
flux --namespace flux-system reconcile source git flux-system
flux --namespace flux-system reconcile source git coder-main
flux --namespace flux-system reconcile kustomization flux-system
flux --namespace flux-system reconcile kustomization coder
flux --namespace flux-system reconcile source chart coder-coder
flux --namespace flux-system reconcile source chart coder-coder-provisioner
flux --namespace coder reconcile helmrelease coder
flux --namespace coder reconcile helmrelease coder-provisioner
flux --namespace coder reconcile helmrelease coder-provisioner-tagged
flux --namespace coder reconcile helmrelease coder-provisioner-tagged-prebuilds
# Just updating Flux is usually not enough. The Helm release may get
# redeployed, but unless something causes the Deployment to update the
# pods won't be recreated. It's important that the pods get recreated,
# since we use `imagePullPolicy: Always` to ensure we're running the
# latest image.
- name: Rollout Deployment
run: |
set -euxo pipefail
kubectl --namespace coder rollout restart deployment/coder
kubectl --namespace coder rollout status deployment/coder
kubectl --namespace coder rollout restart deployment/coder-provisioner
kubectl --namespace coder rollout status deployment/coder-provisioner
kubectl --namespace coder rollout restart deployment/coder-provisioner-tagged
kubectl --namespace coder rollout status deployment/coder-provisioner-tagged
kubectl --namespace coder rollout restart deployment/coder-provisioner-tagged-prebuilds
kubectl --namespace coder rollout status deployment/coder-provisioner-tagged-prebuilds
deploy-wsproxies:
runs-on: ubuntu-latest
needs: deploy
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Setup flyctl
uses: superfly/flyctl-actions/setup-flyctl@fc53c09e1bc3be6f54706524e3b82c4f462f77be # v1.5
- name: Deploy workspace proxies
run: |
flyctl deploy --image "$IMAGE" --app paris-coder --config ./.github/fly-wsproxies/paris-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_PARIS" --yes
flyctl deploy --image "$IMAGE" --app sydney-coder --config ./.github/fly-wsproxies/sydney-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_SYDNEY" --yes
flyctl deploy --image "$IMAGE" --app sao-paulo-coder --config ./.github/fly-wsproxies/sao-paulo-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_SAO_PAULO" --yes
flyctl deploy --image "$IMAGE" --app jnb-coder --config ./.github/fly-wsproxies/jnb-coder.toml --env "CODER_PROXY_SESSION_TOKEN=$TOKEN_JNB" --yes
env:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
IMAGE: ${{ inputs.image }}
TOKEN_PARIS: ${{ secrets.FLY_PARIS_CODER_PROXY_SESSION_TOKEN }}
TOKEN_SYDNEY: ${{ secrets.FLY_SYDNEY_CODER_PROXY_SESSION_TOKEN }}
TOKEN_SAO_PAULO: ${{ secrets.FLY_SAO_PAULO_CODER_PROXY_SESSION_TOKEN }}
TOKEN_JNB: ${{ secrets.FLY_JNB_CODER_PROXY_SESSION_TOKEN }}
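The should-deploy job above consumes the verdict printed by `./scripts/should_deploy.sh`, which is not part of this diff. Its contract, however, is visible: print DEPLOY or NOOP. A hypothetical sketch of a script satisfying that contract under the branch rules this PR introduces (main and release/* deploy):

```shell
should_deploy() {
  local ref="$1"
  if [[ "$ref" == "refs/heads/main" || "$ref" == refs/heads/release/* ]]; then
    echo "DEPLOY"
  else
    echo "NOOP"
  fi
}
```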
@@ -1,6 +1,9 @@
name: AI Triage Automation
on:
issues:
types:
- labeled
workflow_dispatch:
inputs:
issue_url:
@@ -22,164 +25,108 @@ on:
required: false
default: "traiage"
type: string
cleanup:
description: "Cleanup workspace after triage."
required: false
default: false
type: boolean
jobs:
traiage:
name: Triage GitHub Issue with Claude Code
runs-on: ubuntu-latest
if: github.event.label.name == 'traiage' || github.event_name == 'workflow_dispatch'
timeout-minutes: 30
env:
CODER_URL: ${{ secrets.TRAIAGE_CODER_URL }}
CODER_SESSION_TOKEN: ${{ secrets.TRAIAGE_CODER_SESSION_TOKEN }}
TEMPLATE_NAME: ${{ inputs.template_name }}
permissions:
contents: read
issues: write
actions: write
steps:
- name: Determine Inputs
id: determine-inputs
env:
GITHUB_EVENT_ISSUE_HTML_URL: ${{ github.event.issue.html_url }}
GITHUB_EVENT_NAME: ${{ github.event_name }}
INPUTS_ISSUE_URL: ${{ inputs.issue_url }}
INPUTS_TEMPLATE_NAME: ${{ inputs.template_name || 'traiage' }}
INPUTS_TEMPLATE_PRESET: ${{ inputs.template_preset || 'Default' }}
INPUTS_PREFIX: ${{ inputs.prefix || 'traiage' }}
run: |
echo "template_name=${INPUTS_TEMPLATE_NAME}" >> "${GITHUB_OUTPUT}"
echo "template_preset=${INPUTS_TEMPLATE_PRESET}" >> "${GITHUB_OUTPUT}"
echo "prefix=${INPUTS_PREFIX}" >> "${GITHUB_OUTPUT}"
# Determine issue URL based on event type
if [[ "${GITHUB_EVENT_NAME}" == "workflow_dispatch" ]]; then
echo "issue_url=${INPUTS_ISSUE_URL}" >> "${GITHUB_OUTPUT}"
elif [[ "${GITHUB_EVENT_NAME}" == "issues" ]]; then
echo "issue_url=${GITHUB_EVENT_ISSUE_HTML_URL}" >> "${GITHUB_OUTPUT}"
else
echo "::error::Unsupported event type: ${GITHUB_EVENT_NAME}"
exit 1
fi
- name: Verify push access
env:
GITHUB_REPOSITORY: ${{ github.repository }}
GITHUB_ACTOR: ${{ github.actor }}
GITHUB_EVENT_USER_LOGIN: ${{ github.event.sender.login }}
GITHUB_EVENT_NAME: ${{ github.event_name }}
GH_TOKEN: ${{ github.token }}
run: |
# Determine username based on event type
if [[ "${GITHUB_EVENT_NAME}" == "workflow_dispatch" ]]; then
USERNAME="${GITHUB_ACTOR}"
else
USERNAME="${GITHUB_EVENT_USER_LOGIN}"
fi
# Query the user's permission on this repo
can_push="$(gh api "/repos/${GITHUB_REPOSITORY}/collaborators/${USERNAME}/permission" --jq '.user.permissions.push')"
if [[ "${can_push}" != "true" ]]; then
echo "::error title=Access Denied::${USERNAME} does not have push access to ${GITHUB_REPOSITORY}"
exit 1
fi
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
fetch-depth: 0
- name: Extract context key from issue
id: extract-context
- name: Fetch issue description
id: fetch-issue
env:
ISSUE_URL: ${{ inputs.issue_url }}
GITHUB_TOKEN: ${{ github.token }}
run: |
issue_number="$(gh issue view "${ISSUE_URL}" --json number --jq '.number')"
context_key="gh-${issue_number}"
echo "context_key=${context_key}" >> "${GITHUB_OUTPUT}"
echo "CONTEXT_KEY=${context_key}" >> "${GITHUB_ENV}"
- name: Download and install Coder binary
shell: bash
env:
CODER_URL: ${{ secrets.TRAIAGE_CODER_URL }}
run: |
if [ "${{ runner.arch }}" == "ARM64" ]; then
ARCH="arm64"
else
ARCH="amd64"
fi
mkdir -p "${HOME}/.local/bin"
curl -fsSL --compressed "$CODER_URL/bin/coder-linux-${ARCH}" -o "${HOME}/.local/bin/coder"
chmod +x "${HOME}/.local/bin/coder"
export PATH="$HOME/.local/bin:$PATH"
coder version
coder whoami
echo "$HOME/.local/bin" >> "${GITHUB_PATH}"
- name: Get Coder username from GitHub actor
id: get-coder-username
env:
CODER_SESSION_TOKEN: ${{ secrets.TRAIAGE_CODER_SESSION_TOKEN }}
GITHUB_USER_ID: ${{
(github.event_name == 'workflow_dispatch' && github.actor_id)
}}
run: |
[[ -z "${GITHUB_USER_ID}" || "${GITHUB_USER_ID}" == "null" ]] && echo "No GitHub actor ID found" && exit 1
user_json=$(
coder users list --github-user-id="${GITHUB_USER_ID}" --output=json
)
coder_username=$(jq -r 'first | .username' <<< "$user_json")
[[ -z "${coder_username}" || "${coder_username}" == "null" ]] && echo "No Coder user with GitHub user ID ${GITHUB_USER_ID} found" && exit 1
echo "coder_username=${coder_username}" >> "${GITHUB_OUTPUT}"
# TODO(Cian): this is a good use-case for 'recipes'
- name: Create Coder task
id: create-task
env:
CODER_USERNAME: ${{ steps.get-coder-username.outputs.coder_username }}
CONTEXT_KEY: ${{ steps.extract-context.outputs.context_key }}
GITHUB_TOKEN: ${{ github.token }}
ISSUE_URL: ${{ inputs.issue_url }}
PREFIX: ${{ inputs.prefix }}
RUN_ID: ${{ github.run_id }}
TEMPLATE_PARAMETERS: ${{ secrets.TRAIAGE_TEMPLATE_PARAMETERS }}
TEMPLATE_PRESET: ${{ inputs.template_preset }}
ISSUE_URL: ${{ steps.determine-inputs.outputs.issue_url }}
GH_TOKEN: ${{ github.token }}
run: |
# Fetch issue description using `gh` CLI
issue_description=$(gh issue view "${ISSUE_URL}")
#shellcheck disable=SC2016 # The template string should not be subject to shell expansion
issue_description=$(gh issue view "${ISSUE_URL}" \
--json 'title,body,comments' \
--template '{{printf "%s\n\n%s\n\nComments:\n" .title .body}}{{range $k, $v := .comments}} - {{index $v.author "login"}}: {{printf "%s\n" $v.body}}{{end}}')
# Write a prompt to PROMPT_FILE
PROMPT=$(cat <<EOF
Analyze the GitHub issue description below, understand the root cause, and make the appropriate changes to resolve the issue.
ISSUE URL: ${ISSUE_URL}
ISSUE DESCRIPTION BELOW:
${issue_description}
EOF
)
export PROMPT
export TASK_NAME="${PREFIX}-${CONTEXT_KEY}-${RUN_ID}"
echo "Creating task: $TASK_NAME"
./scripts/traiage.sh create
coder exp task status "${CODER_USERNAME}/$TASK_NAME" --watch
echo "TASK_NAME=${CODER_USERNAME}/${TASK_NAME}" >> "${GITHUB_OUTPUT}"
echo "TASK_NAME=${CODER_USERNAME}/${TASK_NAME}" >> "${GITHUB_ENV}"
- name: Create and upload archive
id: create-archive
if: inputs.cleanup
env:
BUCKET_PREFIX: "gs://coder-traiage-outputs/traiage"
run: |
echo "Creating archive for workspace: $TASK_NAME"
./scripts/traiage.sh archive
echo "archive_url=${BUCKET_PREFIX%%/}/$TASK_NAME.tar.gz" >> "${GITHUB_OUTPUT}"
- name: Generate a summary of the changes and post a comment on GitHub.
id: generate-summary
if: inputs.cleanup
env:
ARCHIVE_URL: ${{ steps.create-archive.outputs.archive_url }}
BUCKET_PREFIX: "gs://coder-traiage-outputs/traiage"
CONTEXT_KEY: ${{ steps.extract-context.outputs.context_key }}
GITHUB_TOKEN: ${{ github.token }}
GITHUB_REPOSITORY: ${{ github.repository }}
ISSUE_URL: ${{ inputs.issue_url }}
TASK_NAME: ${{ steps.create-task.outputs.TASK_NAME }}
run: |
SUMMARY_FILE=$(mktemp)
trap 'rm -f "${SUMMARY_FILE}"' EXIT
AUTO_SUMMARY=$(./scripts/traiage.sh summary)
# Create prompt for the task
{
echo "## TrAIage Results"
echo "- **Issue URL:** ${ISSUE_URL}"
echo "- **Context Key:** ${CONTEXT_KEY}"
echo "- **Workspace:** ${TASK_NAME}"
echo "- **Archive URL:** ${ARCHIVE_URL}"
echo
echo "${AUTO_SUMMARY}"
echo
echo "To fetch the output to your own workspace:"
echo
echo '```bash'
echo "BUCKET_PREFIX=${BUCKET_PREFIX} TASK_NAME=${TASK_NAME} ./scripts/traiage.sh resume"
echo '```'
echo
} >> "${SUMMARY_FILE}"
echo "prompt<<EOF"
cat <<PROMPT
Fix ${ISSUE_URL}
if [[ "${ISSUE_URL}" == "https://github.com/${GITHUB_REPOSITORY}"* ]]; then
gh issue comment "${ISSUE_URL}" --body-file "${SUMMARY_FILE}" --create-if-none --edit-last
else
echo "Skipping comment on other repo."
fi
cat "${SUMMARY_FILE}" >> "${GITHUB_STEP_SUMMARY}"
Analyze the GitHub issue description below, understand the root cause, and make the appropriate changes to resolve the issue.
---
${issue_description}
PROMPT
echo "EOF"
} >> "${GITHUB_OUTPUT}"
- name: Cleanup task
if: inputs.cleanup && steps.create-task.outputs.TASK_NAME != '' && steps.create-archive.outputs.archive_url != ''
run: |
echo "Cleaning up task: $TASK_NAME"
./scripts/traiage.sh delete || true
- name: Create Coder Task
uses: ./.github/actions/coder-task
with:
coder-url: ${{ secrets.TRAIAGE_CODER_URL }}
coder-token: ${{ secrets.TRAIAGE_CODER_SESSION_TOKEN }}
template-name: ${{ steps.determine-inputs.outputs.template_name }}
template-preset: ${{ steps.determine-inputs.outputs.template_preset }}
task-name-prefix: ${{ steps.determine-inputs.outputs.prefix }}
task-prompt: ${{ steps.fetch-issue.outputs.prompt }}
issue-url: ${{ steps.determine-inputs.outputs.issue_url }}
coder-web-url: "https://dev.coder.com"
github-token: ${{ github.token }}
+4
@@ -0,0 +1,4 @@
rules:
cache-poisoning:
ignore:
- "ci.yaml:184"
+2 -1
@@ -61,5 +61,6 @@
"typos.config": ".github/workflows/typos.toml",
"[markdown]": {
"editor.defaultFormatter": "DavidAnson.vscode-markdownlint"
}
},
"biome.lsp.bin": "site/node_modules/.bin/biome"
}
+2 -10
@@ -1020,19 +1020,11 @@ endif
TEST_PACKAGES ?= ./...
warm-go-cache-db-cleaner:
# Ensure Go's build cache for cleanercmd is fresh so that tests don't have to build it from scratch. Building
# from scratch could take some time and counts against the test's timeout, which can lead to flakes.
# c.f. https://github.com/coder/internal/issues/1026
mkdir -p build
$(GIT_FLAGS) go build -o ./build/cleaner github.com/coder/coder/v2/coderd/database/dbtestutil/cleanercmd
.PHONY: warm-go-cache-db-cleaner
test: warm-go-cache-db-cleaner
test:
$(GIT_FLAGS) gotestsum --format standard-quiet $(GOTESTSUM_RETRY_FLAGS) --packages="$(TEST_PACKAGES)" -- $(GOTEST_FLAGS)
.PHONY: test
test-cli: warm-go-cache-db-cleaner
test-cli:
$(MAKE) test TEST_PACKAGES="./cli..."
.PHONY: test-cli
+14 -3
@@ -781,11 +781,15 @@ func (a *agent) reportConnectionsLoop(ctx context.Context, aAPI proto.DRPCAgentC
logger.Debug(ctx, "reporting connection")
_, err := aAPI.ReportConnection(ctx, payload)
if err != nil {
return xerrors.Errorf("failed to report connection: %w", err)
// Do not fail the loop if we fail to report a connection, just
// log a warning.
// Related to https://github.com/coder/coder/issues/20194
logger.Warn(ctx, "failed to report connection to server", slog.Error(err))
// keep going, we still need to remove it from the slice
} else {
logger.Debug(ctx, "successfully reported connection")
}
logger.Debug(ctx, "successfully reported connection")
// Remove the payload we sent.
a.reportConnectionsMu.Lock()
a.reportConnections[0] = nil // Release the pointer from the underlying array.
@@ -816,6 +820,13 @@ func (a *agent) reportConnection(id uuid.UUID, connectionType proto.Connection_T
ip = host
}
// If the IP is "localhost" (which it can be in some cases), set it to
// 127.0.0.1 instead.
// Related to https://github.com/coder/coder/issues/20194
if ip == "localhost" {
ip = "127.0.0.1"
}
a.reportConnectionsMu.Lock()
defer a.reportConnectionsMu.Unlock()
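The agent change above splits an optional port off the remote address and then maps the literal `localhost` to `127.0.0.1` so the server always receives a parseable IP. A minimal sketch of that normalization (`normalizeIP` is an illustrative name, not from the codebase):

```go
package main

import (
	"fmt"
	"net"
)

// normalizeIP splits an optional port off a remote address and maps the
// literal "localhost" (which net.SplitHostPort leaves untouched when there
// is no port) to "127.0.0.1".
func normalizeIP(remoteAddr string) string {
	ip := remoteAddr
	if host, _, err := net.SplitHostPort(remoteAddr); err == nil {
		ip = host
	}
	if ip == "localhost" {
		ip = "127.0.0.1"
	}
	return ip
}

func main() {
	fmt.Println(normalizeIP("localhost:22"))      // 127.0.0.1
	fmt.Println(normalizeIP("192.168.0.7:51000")) // 192.168.0.7
	fmt.Println(normalizeIP("localhost"))         // 127.0.0.1
}
```

Note that `net.SplitHostPort("localhost")` returns an error (missing port), so the bare-hostname case relies on the explicit `localhost` check, matching the diff above.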
+4 -2
@@ -1807,11 +1807,12 @@ func TestAgent_ReconnectingPTY(t *testing.T) {
//nolint:dogsled
conn, agentClient, _, _, _ := setupAgent(t, agentsdk.Manifest{}, 0)
idConnectionReport := uuid.New()
id := uuid.New()
// Test that the connection is reported. This must be tested in the
// first connection because we care about verifying all of these.
netConn0, err := conn.ReconnectingPTY(ctx, id, 80, 80, "bash --norc")
netConn0, err := conn.ReconnectingPTY(ctx, idConnectionReport, 80, 80, "bash --norc")
require.NoError(t, err)
_ = netConn0.Close()
assertConnectionReport(t, agentClient, proto.Connection_RECONNECTING_PTY, 0, "")
@@ -2027,7 +2028,8 @@ func runSubAgentMain() int {
ctx, cancel := context.WithTimeout(context.Background(), testutil.WaitLong)
defer cancel()
req = req.WithContext(ctx)
resp, err := http.DefaultClient.Do(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
_, _ = fmt.Fprintf(os.Stderr, "agent connection failed: %v\n", err)
return 11
+2 -1
@@ -63,6 +63,7 @@ func NewAppHealthReporterWithClock(
// run a ticker for each app health check.
var mu sync.RWMutex
failures := make(map[uuid.UUID]int, 0)
client := &http.Client{}
for _, nextApp := range apps {
if !shouldStartTicker(nextApp) {
continue
@@ -91,7 +92,7 @@ func NewAppHealthReporterWithClock(
if err != nil {
return err
}
res, err := http.DefaultClient.Do(req)
res, err := client.Do(req)
if err != nil {
return err
}
+5
@@ -25,6 +25,7 @@ import (
// screenReconnectingPTY provides a reconnectable PTY via `screen`.
type screenReconnectingPTY struct {
logger slog.Logger
execer agentexec.Execer
command *pty.Cmd
@@ -62,6 +63,7 @@ type screenReconnectingPTY struct {
// own which causes it to spawn with the specified size.
func newScreen(ctx context.Context, logger slog.Logger, execer agentexec.Execer, cmd *pty.Cmd, options *Options) *screenReconnectingPTY {
rpty := &screenReconnectingPTY{
logger: logger,
execer: execer,
command: cmd,
metrics: options.Metrics,
@@ -173,6 +175,7 @@ func (rpty *screenReconnectingPTY) Attach(ctx context.Context, _ string, conn ne
ptty, process, err := rpty.doAttach(ctx, conn, height, width, logger)
if err != nil {
logger.Debug(ctx, "unable to attach to screen reconnecting pty", slog.Error(err))
if errors.Is(err, context.Canceled) {
// Likely the process was too short-lived and canceled the version command.
// TODO: Is it worth distinguishing between that and a cancel from the
@@ -182,6 +185,7 @@ func (rpty *screenReconnectingPTY) Attach(ctx context.Context, _ string, conn ne
}
return err
}
logger.Debug(ctx, "attached to screen reconnecting pty")
defer func() {
// Log only for debugging since the process might have already exited on its
@@ -403,6 +407,7 @@ func (rpty *screenReconnectingPTY) Wait() {
}
func (rpty *screenReconnectingPTY) Close(err error) {
rpty.logger.Debug(context.Background(), "closing screen reconnecting pty", slog.Error(err))
// The closing state change will be handled by the lifecycle.
rpty.state.setState(StateClosing, err)
}
+5 -11
@@ -6,10 +6,7 @@
"defaultBranch": "main"
},
"files": {
"includes": [
"**",
"!**/pnpm-lock.yaml"
],
"includes": ["**", "!**/pnpm-lock.yaml"],
"ignoreUnknown": true
},
"linter": {
@@ -48,13 +45,14 @@
"options": {
"paths": {
"@mui/material": "Use @mui/material/<name> instead. See: https://material-ui.com/guides/minimizing-bundle-size/.",
"@mui/icons-material": "Use @mui/icons-material/<name> instead. See: https://material-ui.com/guides/minimizing-bundle-size/.",
"@mui/material/Avatar": "Use components/Avatar/Avatar instead.",
"@mui/material/Alert": "Use components/Alert/Alert instead.",
"@mui/material/Popover": "Use components/Popover/Popover instead.",
"@mui/material/Typography": "Use native HTML elements instead. Eg: <span>, <p>, <h1>, etc.",
"@mui/material/Box": "Use a <div> instead.",
"@mui/material/Button": "Use a components/Button/Button instead.",
"@mui/material/styles": "Import from @emotion/react instead.",
"@mui/material/Table*": "Import from components/Table/Table instead.",
"lodash": "Use lodash/<name> instead."
}
}
@@ -69,11 +67,7 @@
"noConsole": {
"level": "error",
"options": {
"allow": [
"error",
"info",
"warn"
]
"allow": ["error", "info", "warn"]
}
}
},
@@ -82,5 +76,5 @@
}
}
},
"$schema": "https://biomejs.dev/schemas/2.2.0/schema.json"
"$schema": "./node_modules/@biomejs/biome/configuration_schema.json"
}
+559
@@ -29,15 +29,18 @@ import (
"github.com/coder/coder/v2/cli/cliui"
"github.com/coder/coder/v2/coderd/httpapi"
notificationsLib "github.com/coder/coder/v2/coderd/notifications"
"github.com/coder/coder/v2/coderd/tracing"
"github.com/coder/coder/v2/codersdk"
"github.com/coder/coder/v2/codersdk/workspacesdk"
"github.com/coder/coder/v2/scaletest/agentconn"
"github.com/coder/coder/v2/scaletest/autostart"
"github.com/coder/coder/v2/scaletest/createusers"
"github.com/coder/coder/v2/scaletest/createworkspaces"
"github.com/coder/coder/v2/scaletest/dashboard"
"github.com/coder/coder/v2/scaletest/harness"
"github.com/coder/coder/v2/scaletest/loadtestutil"
"github.com/coder/coder/v2/scaletest/notifications"
"github.com/coder/coder/v2/scaletest/reconnectingpty"
"github.com/coder/coder/v2/scaletest/workspacebuild"
"github.com/coder/coder/v2/scaletest/workspacetraffic"
@@ -57,9 +60,12 @@ func (r *RootCmd) scaletestCmd() *serpent.Command {
Children: []*serpent.Command{
r.scaletestCleanup(),
r.scaletestDashboard(),
r.scaletestDynamicParameters(),
r.scaletestCreateWorkspaces(),
r.scaletestWorkspaceUpdates(),
r.scaletestWorkspaceTraffic(),
r.scaletestAutostart(),
r.scaletestNotifications(),
},
}
@@ -1682,6 +1688,492 @@ func (r *RootCmd) scaletestDashboard() *serpent.Command {
return cmd
}
const (
autostartTestName = "autostart"
)
func (r *RootCmd) scaletestAutostart() *serpent.Command {
var (
workspaceCount int64
workspaceJobTimeout time.Duration
autostartDelay time.Duration
autostartTimeout time.Duration
template string
noCleanup bool
parameterFlags workspaceParameterFlags
tracingFlags = &scaletestTracingFlags{}
timeoutStrategy = &timeoutFlags{}
cleanupStrategy = newScaletestCleanupStrategy()
output = &scaletestOutputFlags{}
prometheusFlags = &scaletestPrometheusFlags{}
)
cmd := &serpent.Command{
Use: "autostart",
Short: "Replicate a thundering herd of autostarting workspaces",
Handler: func(inv *serpent.Invocation) error {
ctx := inv.Context()
client, err := r.InitClient(inv)
if err != nil {
return err
}
notifyCtx, stop := signal.NotifyContext(ctx, StopSignals...) // Checked later.
defer stop()
ctx = notifyCtx
me, err := requireAdmin(ctx, client)
if err != nil {
return err
}
client.HTTPClient = &http.Client{
Transport: &codersdk.HeaderTransport{
Transport: http.DefaultTransport,
Header: map[string][]string{
codersdk.BypassRatelimitHeader: {"true"},
},
},
}
if workspaceCount <= 0 {
return xerrors.Errorf("--workspace-count must be greater than zero")
}
outputs, err := output.parse()
if err != nil {
return xerrors.Errorf("could not parse --output flags")
}
tpl, err := parseTemplate(ctx, client, me.OrganizationIDs, template)
if err != nil {
return xerrors.Errorf("parse template: %w", err)
}
cliRichParameters, err := asWorkspaceBuildParameters(parameterFlags.richParameters)
if err != nil {
return xerrors.Errorf("can't parse given parameter values: %w", err)
}
richParameters, err := prepWorkspaceBuild(inv, client, prepWorkspaceBuildArgs{
Action: WorkspaceCreate,
TemplateVersionID: tpl.ActiveVersionID,
RichParameterFile: parameterFlags.richParameterFile,
RichParameters: cliRichParameters,
})
if err != nil {
return xerrors.Errorf("prepare build: %w", err)
}
tracerProvider, closeTracing, tracingEnabled, err := tracingFlags.provider(ctx)
if err != nil {
return xerrors.Errorf("create tracer provider: %w", err)
}
tracer := tracerProvider.Tracer(scaletestTracerName)
reg := prometheus.NewRegistry()
metrics := autostart.NewMetrics(reg)
setupBarrier := new(sync.WaitGroup)
setupBarrier.Add(int(workspaceCount))
th := harness.NewTestHarness(timeoutStrategy.wrapStrategy(harness.ConcurrentExecutionStrategy{}), cleanupStrategy.toStrategy())
for i := range workspaceCount {
id := strconv.Itoa(int(i))
config := autostart.Config{
User: createusers.Config{
OrganizationID: me.OrganizationIDs[0],
},
Workspace: workspacebuild.Config{
OrganizationID: me.OrganizationIDs[0],
Request: codersdk.CreateWorkspaceRequest{
TemplateID: tpl.ID,
RichParameterValues: richParameters,
},
},
WorkspaceJobTimeout: workspaceJobTimeout,
AutostartDelay: autostartDelay,
AutostartTimeout: autostartTimeout,
Metrics: metrics,
SetupBarrier: setupBarrier,
}
if err := config.Validate(); err != nil {
return xerrors.Errorf("validate config: %w", err)
}
var runner harness.Runnable = autostart.NewRunner(client, config)
if tracingEnabled {
runner = &runnableTraceWrapper{
tracer: tracer,
spanName: fmt.Sprintf("%s/%s", autostartTestName, id),
runner: runner,
}
}
th.AddRun(autostartTestName, id, runner)
}
logger := inv.Logger
prometheusSrvClose := ServeHandler(ctx, logger, promhttp.HandlerFor(reg, promhttp.HandlerOpts{}), prometheusFlags.Address, "prometheus")
defer prometheusSrvClose()
defer func() {
_, _ = fmt.Fprintln(inv.Stderr, "\nUploading traces...")
if err := closeTracing(ctx); err != nil {
_, _ = fmt.Fprintf(inv.Stderr, "\nError uploading traces: %+v\n", err)
}
// Wait for prometheus metrics to be scraped
_, _ = fmt.Fprintf(inv.Stderr, "Waiting %s for prometheus metrics to be scraped\n", prometheusFlags.Wait)
<-time.After(prometheusFlags.Wait)
}()
_, _ = fmt.Fprintln(inv.Stderr, "Running autostart load test...")
testCtx, testCancel := timeoutStrategy.toContext(ctx)
defer testCancel()
err = th.Run(testCtx)
if err != nil {
return xerrors.Errorf("run test harness (harness failure, not a test failure): %w", err)
}
// If the command was interrupted, skip stats.
if notifyCtx.Err() != nil {
return notifyCtx.Err()
}
res := th.Results()
for _, o := range outputs {
err = o.write(res, inv.Stdout)
if err != nil {
return xerrors.Errorf("write output %q to %q: %w", o.format, o.path, err)
}
}
if !noCleanup {
_, _ = fmt.Fprintln(inv.Stderr, "\nCleaning up...")
cleanupCtx, cleanupCancel := cleanupStrategy.toContext(ctx)
defer cleanupCancel()
err = th.Cleanup(cleanupCtx)
if err != nil {
return xerrors.Errorf("cleanup tests: %w", err)
}
}
if res.TotalFail > 0 {
return xerrors.New("load test failed, see above for more details")
}
return nil
},
}
cmd.Options = serpent.OptionSet{
{
Flag: "workspace-count",
FlagShorthand: "c",
Env: "CODER_SCALETEST_WORKSPACE_COUNT",
Description: "Required: Total number of workspaces to create.",
Value: serpent.Int64Of(&workspaceCount),
Required: true,
},
{
Flag: "workspace-job-timeout",
Env: "CODER_SCALETEST_WORKSPACE_JOB_TIMEOUT",
Default: "5m",
Description: "Timeout for workspace jobs (e.g. build, start).",
Value: serpent.DurationOf(&workspaceJobTimeout),
},
{
Flag: "autostart-delay",
Env: "CODER_SCALETEST_AUTOSTART_DELAY",
Default: "2m",
Description: "How long to wait after all workspaces have been stopped before scheduling them to start again.",
Value: serpent.DurationOf(&autostartDelay),
},
{
Flag: "autostart-timeout",
Env: "CODER_SCALETEST_AUTOSTART_TIMEOUT",
Default: "5m",
Description: "Timeout for the autostart build to be initiated after the scheduled start time.",
Value: serpent.DurationOf(&autostartTimeout),
},
{
Flag: "template",
FlagShorthand: "t",
Env: "CODER_SCALETEST_TEMPLATE",
Description: "Required: Name or ID of the template to use for workspaces.",
Value: serpent.StringOf(&template),
Required: true,
},
{
Flag: "no-cleanup",
Env: "CODER_SCALETEST_NO_CLEANUP",
Description: "Do not clean up resources after the test completes.",
Value: serpent.BoolOf(&noCleanup),
},
}
cmd.Options = append(cmd.Options, parameterFlags.cliParameters()...)
tracingFlags.attach(&cmd.Options)
timeoutStrategy.attach(&cmd.Options)
cleanupStrategy.attach(&cmd.Options)
output.attach(&cmd.Options)
prometheusFlags.attach(&cmd.Options)
return cmd
}
func (r *RootCmd) scaletestNotifications() *serpent.Command {
var (
userCount int64
ownerUserPercentage float64
notificationTimeout time.Duration
dialTimeout time.Duration
noCleanup bool
tracingFlags = &scaletestTracingFlags{}
// This test requires unlimited concurrency.
timeoutStrategy = &timeoutFlags{}
cleanupStrategy = newScaletestCleanupStrategy()
output = &scaletestOutputFlags{}
prometheusFlags = &scaletestPrometheusFlags{}
)
cmd := &serpent.Command{
Use: "notifications",
Short: "Simulate notification delivery by creating many users listening to notifications.",
Handler: func(inv *serpent.Invocation) error {
ctx := inv.Context()
client, err := r.InitClient(inv)
if err != nil {
return err
}
notifyCtx, stop := signal.NotifyContext(ctx, StopSignals...)
defer stop()
ctx = notifyCtx
me, err := requireAdmin(ctx, client)
if err != nil {
return err
}
client.HTTPClient = &http.Client{
Transport: &codersdk.HeaderTransport{
Transport: http.DefaultTransport,
Header: map[string][]string{
codersdk.BypassRatelimitHeader: {"true"},
},
},
}
if userCount <= 0 {
return xerrors.Errorf("--user-count must be greater than 0")
}
if ownerUserPercentage < 0 || ownerUserPercentage > 100 {
return xerrors.Errorf("--owner-user-percentage must be between 0 and 100")
}
ownerUserCount := int64(float64(userCount) * ownerUserPercentage / 100)
if ownerUserCount == 0 && ownerUserPercentage > 0 {
ownerUserCount = 1
}
regularUserCount := userCount - ownerUserCount
_, _ = fmt.Fprintf(inv.Stderr, "Distribution plan:\n")
_, _ = fmt.Fprintf(inv.Stderr, " Total users: %d\n", userCount)
_, _ = fmt.Fprintf(inv.Stderr, " Owner users: %d (%.1f%%)\n", ownerUserCount, ownerUserPercentage)
_, _ = fmt.Fprintf(inv.Stderr, " Regular users: %d (%.1f%%)\n", regularUserCount, 100.0-ownerUserPercentage)
outputs, err := output.parse()
if err != nil {
return xerrors.Errorf("could not parse --output flags")
}
tracerProvider, closeTracing, tracingEnabled, err := tracingFlags.provider(ctx)
if err != nil {
return xerrors.Errorf("create tracer provider: %w", err)
}
tracer := tracerProvider.Tracer(scaletestTracerName)
reg := prometheus.NewRegistry()
metrics := notifications.NewMetrics(reg)
logger := inv.Logger
prometheusSrvClose := ServeHandler(ctx, logger, promhttp.HandlerFor(reg, promhttp.HandlerOpts{}), prometheusFlags.Address, "prometheus")
defer prometheusSrvClose()
defer func() {
_, _ = fmt.Fprintln(inv.Stderr, "\nUploading traces...")
if err := closeTracing(ctx); err != nil {
_, _ = fmt.Fprintf(inv.Stderr, "\nError uploading traces: %+v\n", err)
}
// Wait for prometheus metrics to be scraped
_, _ = fmt.Fprintf(inv.Stderr, "Waiting %s for prometheus metrics to be scraped\n", prometheusFlags.Wait)
<-time.After(prometheusFlags.Wait)
}()
_, _ = fmt.Fprintln(inv.Stderr, "Creating users...")
dialBarrier := &sync.WaitGroup{}
ownerWatchBarrier := &sync.WaitGroup{}
dialBarrier.Add(int(userCount))
ownerWatchBarrier.Add(int(ownerUserCount))
expectedNotifications := map[uuid.UUID]chan time.Time{
notificationsLib.TemplateUserAccountCreated: make(chan time.Time, 1),
notificationsLib.TemplateUserAccountDeleted: make(chan time.Time, 1),
}
configs := make([]notifications.Config, 0, userCount)
for range ownerUserCount {
config := notifications.Config{
User: createusers.Config{
OrganizationID: me.OrganizationIDs[0],
},
Roles: []string{codersdk.RoleOwner},
NotificationTimeout: notificationTimeout,
DialTimeout: dialTimeout,
DialBarrier: dialBarrier,
ReceivingWatchBarrier: ownerWatchBarrier,
ExpectedNotifications: expectedNotifications,
Metrics: metrics,
}
if err := config.Validate(); err != nil {
return xerrors.Errorf("validate config: %w", err)
}
configs = append(configs, config)
}
for range regularUserCount {
config := notifications.Config{
User: createusers.Config{
OrganizationID: me.OrganizationIDs[0],
},
Roles: []string{},
NotificationTimeout: notificationTimeout,
DialTimeout: dialTimeout,
DialBarrier: dialBarrier,
ReceivingWatchBarrier: ownerWatchBarrier,
Metrics: metrics,
}
if err := config.Validate(); err != nil {
return xerrors.Errorf("validate config: %w", err)
}
configs = append(configs, config)
}
go triggerUserNotifications(
ctx,
logger,
client,
me.OrganizationIDs[0],
dialBarrier,
dialTimeout,
expectedNotifications,
)
th := harness.NewTestHarness(timeoutStrategy.wrapStrategy(harness.ConcurrentExecutionStrategy{}), cleanupStrategy.toStrategy())
for i, config := range configs {
id := strconv.Itoa(i)
name := fmt.Sprintf("notifications-%s", id)
var runner harness.Runnable = notifications.NewRunner(client, config)
if tracingEnabled {
runner = &runnableTraceWrapper{
tracer: tracer,
spanName: name,
runner: runner,
}
}
th.AddRun(name, id, runner)
}
_, _ = fmt.Fprintln(inv.Stderr, "Running notification delivery scaletest...")
testCtx, testCancel := timeoutStrategy.toContext(ctx)
defer testCancel()
err = th.Run(testCtx)
if err != nil {
return xerrors.Errorf("run test harness (harness failure, not a test failure): %w", err)
}
// If the command was interrupted, skip stats.
if notifyCtx.Err() != nil {
return notifyCtx.Err()
}
res := th.Results()
for _, o := range outputs {
err = o.write(res, inv.Stdout)
if err != nil {
return xerrors.Errorf("write output %q to %q: %w", o.format, o.path, err)
}
}
if !noCleanup {
_, _ = fmt.Fprintln(inv.Stderr, "\nCleaning up...")
cleanupCtx, cleanupCancel := cleanupStrategy.toContext(ctx)
defer cleanupCancel()
err = th.Cleanup(cleanupCtx)
if err != nil {
return xerrors.Errorf("cleanup tests: %w", err)
}
}
if res.TotalFail > 0 {
return xerrors.New("load test failed, see above for more details")
}
return nil
},
}
cmd.Options = serpent.OptionSet{
{
Flag: "user-count",
FlagShorthand: "c",
Env: "CODER_SCALETEST_NOTIFICATION_USER_COUNT",
Description: "Required: Total number of users to create.",
Value: serpent.Int64Of(&userCount),
Required: true,
},
{
Flag: "owner-user-percentage",
Env: "CODER_SCALETEST_NOTIFICATION_OWNER_USER_PERCENTAGE",
Default: "20.0",
Description: "Percentage of users to assign Owner role to (0-100).",
Value: serpent.Float64Of(&ownerUserPercentage),
},
{
Flag: "notification-timeout",
Env: "CODER_SCALETEST_NOTIFICATION_TIMEOUT",
Default: "5m",
Description: "How long to wait for notifications after triggering.",
Value: serpent.DurationOf(&notificationTimeout),
},
{
Flag: "dial-timeout",
Env: "CODER_SCALETEST_DIAL_TIMEOUT",
Default: "2m",
Description: "Timeout for dialing the notification websocket endpoint.",
Value: serpent.DurationOf(&dialTimeout),
},
{
Flag: "no-cleanup",
Env: "CODER_SCALETEST_NO_CLEANUP",
Description: "Do not clean up resources after the test completes.",
Value: serpent.BoolOf(&noCleanup),
},
}
tracingFlags.attach(&cmd.Options)
timeoutStrategy.attach(&cmd.Options)
cleanupStrategy.attach(&cmd.Options)
output.attach(&cmd.Options)
prometheusFlags.attach(&cmd.Options)
return cmd
}
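The owner/regular distribution above rounds the percentage down but guarantees at least one owner whenever the percentage is non-zero. A minimal sketch of that arithmetic (`splitUsers` is an illustrative name):

```go
package main

import "fmt"

// splitUsers computes how many of total users get the Owner role from a
// percentage, truncating toward zero but keeping a minimum of one owner
// whenever ownerPct is greater than zero.
func splitUsers(total int64, ownerPct float64) (owners, regular int64) {
	owners = int64(float64(total) * ownerPct / 100)
	if owners == 0 && ownerPct > 0 {
		owners = 1
	}
	return owners, total - owners
}

func main() {
	o, r := splitUsers(10, 20.0)
	fmt.Println(o, r) // 2 8
	o, r = splitUsers(3, 1.0)
	fmt.Println(o, r) // 1 2: rounded up to at least one owner
}
```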
type runnableTraceWrapper struct {
tracer trace.Tracer
spanName string
@@ -1882,6 +2374,73 @@ func parseTargetRange(name, targets string) (start, end int, err error) {
return start, end, nil
}
// triggerUserNotifications waits for all test users to connect,
// then creates and deletes a test user to trigger notification events for testing.
func triggerUserNotifications(
ctx context.Context,
logger slog.Logger,
client *codersdk.Client,
orgID uuid.UUID,
dialBarrier *sync.WaitGroup,
dialTimeout time.Duration,
expectedNotifications map[uuid.UUID]chan time.Time,
) {
logger.Info(ctx, "waiting for all users to connect")
// Wait for all users to connect
waitCtx, cancel := context.WithTimeout(ctx, dialTimeout+30*time.Second)
defer cancel()
done := make(chan struct{})
go func() {
dialBarrier.Wait()
close(done)
}()
select {
case <-done:
logger.Info(ctx, "all users connected")
case <-waitCtx.Done():
if waitCtx.Err() == context.DeadlineExceeded {
logger.Error(ctx, "timeout waiting for users to connect")
} else {
logger.Info(ctx, "context canceled while waiting for users")
}
return
}
const (
triggerUsername = "scaletest-trigger-user"
triggerEmail = "scaletest-trigger@example.com"
)
logger.Info(ctx, "creating test user to test notifications",
slog.F("username", triggerUsername),
slog.F("email", triggerEmail),
slog.F("org_id", orgID))
testUser, err := client.CreateUserWithOrgs(ctx, codersdk.CreateUserRequestWithOrgs{
OrganizationIDs: []uuid.UUID{orgID},
Username: triggerUsername,
Email: triggerEmail,
Password: "test-password-123",
})
if err != nil {
logger.Error(ctx, "create test user", slog.Error(err))
return
}
expectedNotifications[notificationsLib.TemplateUserAccountCreated] <- time.Now()
err = client.DeleteUser(ctx, testUser.ID)
if err != nil {
logger.Error(ctx, "delete test user", slog.Error(err))
return
}
expectedNotifications[notificationsLib.TemplateUserAccountDeleted] <- time.Now()
close(expectedNotifications[notificationsLib.TemplateUserAccountCreated])
close(expectedNotifications[notificationsLib.TemplateUserAccountDeleted])
}
func createWorkspaceAppConfig(client *codersdk.Client, appHost, app string, workspace codersdk.Workspace, agent codersdk.WorkspaceAgent) (workspacetraffic.AppConfig, error) {
if app == "" {
return workspacetraffic.AppConfig{}, nil
+110
@@ -0,0 +1,110 @@
//go:build !slim
package cli
import (
"fmt"
"github.com/prometheus/client_golang/prometheus"
"golang.org/x/xerrors"
"cdr.dev/slog"
"cdr.dev/slog/sloggers/sloghuman"
"github.com/coder/coder/v2/scaletest/dynamicparameters"
"github.com/coder/coder/v2/scaletest/harness"
"github.com/coder/serpent"
)
const (
dynamicParametersTestName = "dynamic-parameters"
)
func (r *RootCmd) scaletestDynamicParameters() *serpent.Command {
var templateName string
var numEvals int64
orgContext := NewOrganizationContext()
output := &scaletestOutputFlags{}
cmd := &serpent.Command{
Use: "dynamic-parameters",
Short: "Generates load on the Coder server by evaluating dynamic parameters",
Long: `It is recommended that all rate limits are disabled on the server before running this scaletest. This test generates many login events which will be rate limited against the (most likely single) IP.`,
Handler: func(inv *serpent.Invocation) error {
ctx := inv.Context()
outputs, err := output.parse()
if err != nil {
return xerrors.Errorf("could not parse --output flags")
}
client, err := r.InitClient(inv)
if err != nil {
return err
}
if templateName == "" {
return xerrors.Errorf("template cannot be empty")
}
org, err := orgContext.Selected(inv, client)
if err != nil {
return err
}
logger := slog.Make(sloghuman.Sink(inv.Stdout)).Leveled(slog.LevelDebug)
partitions, err := dynamicparameters.SetupPartitions(ctx, client, org.ID, templateName, numEvals, logger)
if err != nil {
return xerrors.Errorf("setup dynamic parameters partitions: %w", err)
}
th := harness.NewTestHarness(harness.ConcurrentExecutionStrategy{}, harness.ConcurrentExecutionStrategy{})
reg := prometheus.NewRegistry()
metrics := dynamicparameters.NewMetrics(reg, "concurrent_evaluations")
for i, part := range partitions {
for j := range part.ConcurrentEvaluations {
cfg := dynamicparameters.Config{
TemplateVersion: part.TemplateVersion.ID,
Metrics: metrics,
MetricLabelValues: []string{fmt.Sprintf("%d", part.ConcurrentEvaluations)},
}
runner := dynamicparameters.NewRunner(client, cfg)
th.AddRun(dynamicParametersTestName, fmt.Sprintf("%d/%d", j, i), runner)
}
}
err = th.Run(ctx)
if err != nil {
return xerrors.Errorf("run test harness: %w", err)
}
res := th.Results()
for _, o := range outputs {
err = o.write(res, inv.Stdout)
if err != nil {
return xerrors.Errorf("write output %q to %q: %w", o.format, o.path, err)
}
}
return nil
},
}
cmd.Options = serpent.OptionSet{
{
Flag: "template",
Description: "Name of the template to use. If it does not exist, it will be created.",
Default: "scaletest-dynamic-parameters",
Value: serpent.StringOf(&templateName),
},
{
Flag: "concurrent-evaluations",
Description: "Number of concurrent dynamic parameter evaluations to perform.",
Default: "100",
Value: serpent.Int64Of(&numEvals),
},
}
orgContext.AttachOptions(cmd)
output.attach(&cmd.Options)
return cmd
}
@@ -29,6 +29,28 @@ func (r *RootCmd) taskCreate() *serpent.Command {
cmd := &serpent.Command{
Use: "create [input]",
Short: "Create an experimental task",
Long: FormatExamples(
Example{
Description: "Create a task with direct input",
Command: "coder exp task create \"Add authentication to the user service\"",
},
Example{
Description: "Create a task with stdin input",
Command: "echo \"Add authentication to the user service\" | coder exp task create",
},
Example{
Description: "Create a task with a specific name",
Command: "coder exp task create --name task1 \"Add authentication to the user service\"",
},
Example{
Description: "Create a task from a specific template / preset",
Command: "coder exp task create --template backend-dev --preset \"My Preset\" \"Add authentication to the user service\"",
},
Example{
Description: "Create a task for another user (requires appropriate permissions)",
Command: "coder exp task create --owner user@example.com \"Add authentication to the user service\"",
},
),
Middleware: serpent.Chain(
serpent.RequireRangeArgs(0, 1),
),
@@ -19,6 +19,20 @@ func (r *RootCmd) taskDelete() *serpent.Command {
cmd := &serpent.Command{
Use: "delete <task> [<task> ...]",
Short: "Delete experimental tasks",
Long: FormatExamples(
Example{
Description: "Delete a single task.",
Command: "coder exp task delete task1",
},
Example{
Description: "Delete multiple tasks.",
Command: "coder exp task delete task1 task2 task3",
},
Example{
Description: "Delete a task without confirmation.",
Command: "coder exp task delete task4 --yes",
},
),
Middleware: serpent.Chain(
serpent.RequireRangeArgs(1, -1),
),
@@ -67,8 +67,30 @@ func (r *RootCmd) taskList() *serpent.Command {
)
cmd := &serpent.Command{
Use: "list",
Short: "List experimental tasks",
Long: FormatExamples(
Example{
Description: "List tasks for the current user.",
Command: "coder exp task list",
},
Example{
Description: "List tasks for a specific user.",
Command: "coder exp task list --user someone-else",
},
Example{
Description: "List all tasks you can view.",
Command: "coder exp task list --all",
},
Example{
Description: "List all your running tasks.",
Command: "coder exp task list --status running",
},
Example{
Description: "As above, but only show IDs.",
Command: "coder exp task list --status running --quiet",
},
),
Aliases: []string{"ls"},
Middleware: serpent.Chain(
serpent.RequireNArgs(0),
@@ -26,6 +26,11 @@ func (r *RootCmd) taskLogs() *serpent.Command {
cmd := &serpent.Command{
Use: "logs <task>",
Short: "Show a task's logs",
Long: FormatExamples(
Example{
Description: "Show logs for a given task.",
Command: "coder exp task logs task1",
}),
Middleware: serpent.Chain(
serpent.RequireNArgs(1),
),
@@ -14,8 +14,15 @@ func (r *RootCmd) taskSend() *serpent.Command {
var stdin bool
cmd := &serpent.Command{
Use: "send <task> [<input> | --stdin]",
Short: "Send input to a task",
Long: FormatExamples(Example{
Description: "Send direct input to a task.",
Command: "coder exp task send task1 \"Please also add unit tests\"",
}, Example{
Description: "Send input from stdin to a task.",
Command: "echo \"Please also add unit tests\" | coder exp task send task1 --stdin",
}),
Middleware: serpent.RequireRangeArgs(1, 2),
Options: serpent.OptionSet{
{
@@ -44,7 +44,17 @@ func (r *RootCmd) taskStatus() *serpent.Command {
watchIntervalArg time.Duration
)
cmd := &serpent.Command{
Short: "Show the status of a task.",
Long: FormatExamples(
Example{
Description: "Show the status of a given task.",
Command: "coder exp task status task1",
},
Example{
Description: "Watch the status of a given task until it completes (idle or stopped).",
Command: "coder exp task status task1 --watch",
},
),
Use: "status",
Aliases: []string{"stat"},
Options: serpent.OptionSet{
@@ -193,6 +193,7 @@ STATE CHANGED STATUS HEALTHY STATE MESSAGE
"workspace_agent_id": null,
"workspace_agent_lifecycle": null,
"workspace_agent_health": null,
"workspace_app_id": null,
"initial_prompt": "",
"status": "running",
"current_state": {
@@ -43,8 +43,9 @@ func (r *RootCmd) provisionerJobsList() *serpent.Command {
cliui.TableFormat([]provisionerJobRow{}, []string{"created at", "id", "type", "template display name", "status", "queue", "tags"}),
cliui.JSONFormat(),
)
status []string
limit int64
initiator string
)
cmd := &serpent.Command{
@@ -65,9 +66,18 @@ func (r *RootCmd) provisionerJobsList() *serpent.Command {
return xerrors.Errorf("current organization: %w", err)
}
if initiator != "" {
user, err := client.User(ctx, initiator)
if err != nil {
return xerrors.Errorf("initiator not found: %s: %w", initiator, err)
}
initiator = user.ID.String()
}
jobs, err := client.OrganizationProvisionerJobs(ctx, org.ID, &codersdk.OrganizationProvisionerJobsOptions{
Status: slice.StringEnums[codersdk.ProvisionerJobStatus](status),
Limit: int(limit),
Initiator: initiator,
})
if err != nil {
return xerrors.Errorf("list provisioner jobs: %w", err)
@@ -122,6 +132,13 @@ func (r *RootCmd) provisionerJobsList() *serpent.Command {
Default: "50",
Value: serpent.Int64Of(&limit),
},
{
Flag: "initiator",
FlagShorthand: "i",
Env: "CODER_PROVISIONER_JOB_LIST_INITIATOR",
Description: "Filter by initiator (user ID or username).",
Value: serpent.StringOf(&initiator),
},
}...)
orgContext.AttachOptions(cmd)
@@ -5,6 +5,7 @@ import (
"database/sql"
"encoding/json"
"fmt"
"strings"
"testing"
"time"
@@ -26,33 +27,32 @@ import (
func TestProvisionerJobs(t *testing.T) {
t.Parallel()
t.Run("Cancel", func(t *testing.T) {
t.Parallel()
db, ps := dbtestutil.NewDB(t)
client, _, coderdAPI := coderdtest.NewWithAPI(t, &coderdtest.Options{
IncludeProvisionerDaemon: false,
Database: db,
Pubsub: ps,
})
owner := coderdtest.CreateFirstUser(t, client)
templateAdminClient, templateAdmin := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.ScopedRoleOrgTemplateAdmin(owner.OrganizationID))
memberClient, member := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID)
// These CLI tests are related to provisioner job CRUD operations and as such
// do not require the overhead of starting a provisioner. Other provisioner job
// functionalities (acquisition etc.) are tested elsewhere.
template := dbgen.Template(t, db, database.Template{
OrganizationID: owner.OrganizationID,
CreatedBy: owner.UserID,
AllowUserCancelWorkspaceJobs: true,
})
version := dbgen.TemplateVersion(t, db, database.TemplateVersion{
OrganizationID: owner.OrganizationID,
CreatedBy: owner.UserID,
TemplateID: uuid.NullUUID{UUID: template.ID, Valid: true},
})
// Test helper to create a provisioner job of a given type with a given input.
prepareJob := func(t *testing.T, jobType database.ProvisionerJobType, input json.RawMessage) database.ProvisionerJob {
t.Helper()
@@ -178,4 +178,148 @@ func TestProvisionerJobs(t *testing.T) {
})
}
})
t.Run("List", func(t *testing.T) {
t.Parallel()
db, ps := dbtestutil.NewDB(t)
client, _, coderdAPI := coderdtest.NewWithAPI(t, &coderdtest.Options{
IncludeProvisionerDaemon: false,
Database: db,
Pubsub: ps,
})
owner := coderdtest.CreateFirstUser(t, client)
_, member := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID)
// These CLI tests are related to provisioner job CRUD operations and as such
// do not require the overhead of starting a provisioner. Other provisioner job
// functionalities (acquisition etc.) are tested elsewhere.
template := dbgen.Template(t, db, database.Template{
OrganizationID: owner.OrganizationID,
CreatedBy: owner.UserID,
AllowUserCancelWorkspaceJobs: true,
})
version := dbgen.TemplateVersion(t, db, database.TemplateVersion{
OrganizationID: owner.OrganizationID,
CreatedBy: owner.UserID,
TemplateID: uuid.NullUUID{UUID: template.ID, Valid: true},
})
// Create some test jobs
job1 := dbgen.ProvisionerJob(t, db, coderdAPI.Pubsub, database.ProvisionerJob{
OrganizationID: owner.OrganizationID,
InitiatorID: owner.UserID,
Type: database.ProvisionerJobTypeTemplateVersionImport,
Input: []byte(`{"template_version_id":"` + version.ID.String() + `"}`),
Tags: database.StringMap{provisionersdk.TagScope: provisionersdk.ScopeOrganization},
})
job2 := dbgen.ProvisionerJob(t, db, coderdAPI.Pubsub, database.ProvisionerJob{
OrganizationID: owner.OrganizationID,
InitiatorID: member.ID,
Type: database.ProvisionerJobTypeWorkspaceBuild,
Input: []byte(`{"workspace_build_id":"` + uuid.New().String() + `"}`),
Tags: database.StringMap{provisionersdk.TagScope: provisionersdk.ScopeOrganization},
})
// Test basic list command
t.Run("Basic", func(t *testing.T) {
t.Parallel()
inv, root := clitest.New(t, "provisioner", "jobs", "list")
clitest.SetupConfig(t, client, root)
var buf bytes.Buffer
inv.Stdout = &buf
err := inv.Run()
require.NoError(t, err)
// Should contain both jobs
output := buf.String()
assert.Contains(t, output, job1.ID.String())
assert.Contains(t, output, job2.ID.String())
})
// Test list with JSON output
t.Run("JSON", func(t *testing.T) {
t.Parallel()
inv, root := clitest.New(t, "provisioner", "jobs", "list", "--output", "json")
clitest.SetupConfig(t, client, root)
var buf bytes.Buffer
inv.Stdout = &buf
err := inv.Run()
require.NoError(t, err)
// Parse JSON output
var jobs []codersdk.ProvisionerJob
err = json.Unmarshal(buf.Bytes(), &jobs)
require.NoError(t, err)
// Should contain both jobs
jobIDs := make([]uuid.UUID, len(jobs))
for i, job := range jobs {
jobIDs[i] = job.ID
}
assert.Contains(t, jobIDs, job1.ID)
assert.Contains(t, jobIDs, job2.ID)
})
// Test list with limit
t.Run("Limit", func(t *testing.T) {
t.Parallel()
inv, root := clitest.New(t, "provisioner", "jobs", "list", "--limit", "1")
clitest.SetupConfig(t, client, root)
var buf bytes.Buffer
inv.Stdout = &buf
err := inv.Run()
require.NoError(t, err)
// Should contain at most 1 job
output := buf.String()
jobCount := 0
if strings.Contains(output, job1.ID.String()) {
jobCount++
}
if strings.Contains(output, job2.ID.String()) {
jobCount++
}
assert.LessOrEqual(t, jobCount, 1)
})
// Test list with initiator filter
t.Run("InitiatorFilter", func(t *testing.T) {
t.Parallel()
// Get owner user details to access username
ctx := testutil.Context(t, testutil.WaitShort)
ownerUser, err := client.User(ctx, owner.UserID.String())
require.NoError(t, err)
// Test filtering by initiator (using username)
inv, root := clitest.New(t, "provisioner", "jobs", "list", "--initiator", ownerUser.Username)
clitest.SetupConfig(t, client, root)
var buf bytes.Buffer
inv.Stdout = &buf
err = inv.Run()
require.NoError(t, err)
// Should only contain job1 (initiated by owner)
output := buf.String()
assert.Contains(t, output, job1.ID.String())
assert.NotContains(t, output, job2.ID.String())
})
// Test list with invalid user
t.Run("InvalidUser", func(t *testing.T) {
t.Parallel()
// Test with non-existent user
inv, root := clitest.New(t, "provisioner", "jobs", "list", "--initiator", "nonexistent-user")
clitest.SetupConfig(t, client, root)
var buf bytes.Buffer
inv.Stdout = &buf
err := inv.Run()
require.Error(t, err)
assert.Contains(t, err.Error(), "initiator not found: nonexistent-user")
})
})
}
@@ -1254,8 +1254,9 @@ func TestServer(t *testing.T) {
t.Logf("error creating request: %s", err.Error())
return false
}
client := &http.Client{}
// nolint:bodyclose
res, err := client.Do(req)
if err != nil {
t.Logf("error hitting prometheus endpoint: %s", err.Error())
return false
@@ -1316,8 +1317,9 @@ func TestServer(t *testing.T) {
t.Logf("error creating request: %s", err.Error())
return false
}
client := &http.Client{}
// nolint:bodyclose
res, err := client.Do(req)
if err != nil {
t.Logf("error hitting prometheus endpoint: %s", err.Error())
return false
@@ -1242,7 +1242,8 @@ func TestSSH(t *testing.T) {
// true exits the loop.
return true
}
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
t.Logf("HTTP GET http://localhost:8222/ %s", err)
return false
@@ -9,6 +9,7 @@ import (
"os"
"path/filepath"
"slices"
"strconv"
"strings"
"time"
@@ -461,10 +462,14 @@ func createValidTemplateVersion(inv *serpent.Invocation, args createValidTemplat
})
if err != nil {
var jobErr *cliui.ProvisionerJobError
if errors.As(err, &jobErr) {
if codersdk.JobIsMissingRequiredTemplateVariableErrorCode(jobErr.Code) {
return handleMissingTemplateVariables(inv, args, version.ID)
}
if !codersdk.JobIsMissingParameterErrorCode(jobErr.Code) {
return nil, err
}
}
return nil, err
}
version, err = client.TemplateVersion(inv.Context(), version.ID)
@@ -528,3 +533,153 @@ func prettyDirectoryPath(dir string) string {
}
return prettyDir
}
func handleMissingTemplateVariables(inv *serpent.Invocation, args createValidTemplateVersionArgs, failedVersionID uuid.UUID) (*codersdk.TemplateVersion, error) {
client := args.Client
templateVariables, err := client.TemplateVersionVariables(inv.Context(), failedVersionID)
if err != nil {
return nil, xerrors.Errorf("fetch template variables: %w", err)
}
existingValues := make(map[string]string)
for _, v := range args.UserVariableValues {
existingValues[v.Name] = v.Value
}
var missingVariables []codersdk.TemplateVersionVariable
for _, variable := range templateVariables {
if !variable.Required {
continue
}
if existingValue, exists := existingValues[variable.Name]; exists && existingValue != "" {
continue
}
// Only prompt for variables that don't have a default value or have a redacted default
// Sensitive variables have a default value of "*redacted*"
// See: https://github.com/coder/coder/blob/a78790c632974e04babfef6de0e2ddf044787a7a/coderd/provisionerdserver/provisionerdserver.go#L3206
if variable.DefaultValue == "" || (variable.Sensitive && variable.DefaultValue == "*redacted*") {
missingVariables = append(missingVariables, variable)
}
}
if len(missingVariables) == 0 {
return nil, xerrors.New("no missing required variables found")
}
_, _ = fmt.Fprintf(inv.Stderr, "Found %d missing required variables:\n", len(missingVariables))
for _, v := range missingVariables {
_, _ = fmt.Fprintf(inv.Stderr, " - %s (%s): %s\n", v.Name, v.Type, v.Description)
}
_, _ = fmt.Fprintln(inv.Stderr, "\nThe template requires values for the following variables:")
var promptedValues []codersdk.VariableValue
for _, variable := range missingVariables {
value, err := promptForTemplateVariable(inv, variable)
if err != nil {
return nil, xerrors.Errorf("prompt for variable %q: %w", variable.Name, err)
}
promptedValues = append(promptedValues, codersdk.VariableValue{
Name: variable.Name,
Value: value,
})
}
combinedValues := codersdk.CombineVariableValues(args.UserVariableValues, promptedValues)
_, _ = fmt.Fprintln(inv.Stderr, "\nRetrying template build with provided variables...")
retryArgs := args
retryArgs.UserVariableValues = combinedValues
return createValidTemplateVersion(inv, retryArgs)
}
func promptForTemplateVariable(inv *serpent.Invocation, variable codersdk.TemplateVersionVariable) (string, error) {
displayVariableInfo(inv, variable)
switch variable.Type {
case "bool":
return promptForBoolVariable(inv, variable)
case "number":
return promptForNumberVariable(inv, variable)
default:
return promptForStringVariable(inv, variable)
}
}
func displayVariableInfo(inv *serpent.Invocation, variable codersdk.TemplateVersionVariable) {
_, _ = fmt.Fprintf(inv.Stderr, "var.%s", cliui.Bold(variable.Name))
if variable.Required {
_, _ = fmt.Fprint(inv.Stderr, pretty.Sprint(cliui.DefaultStyles.Error, " (required)"))
}
if variable.Sensitive {
_, _ = fmt.Fprint(inv.Stderr, pretty.Sprint(cliui.DefaultStyles.Warn, ", sensitive"))
}
_, _ = fmt.Fprintln(inv.Stderr, "")
if variable.Description != "" {
_, _ = fmt.Fprintf(inv.Stderr, " Description: %s\n", variable.Description)
}
_, _ = fmt.Fprintf(inv.Stderr, " Type: %s\n", variable.Type)
_, _ = fmt.Fprintf(inv.Stderr, " Current value: %s\n", pretty.Sprint(cliui.DefaultStyles.Placeholder, "<empty>"))
}
func promptForBoolVariable(inv *serpent.Invocation, variable codersdk.TemplateVersionVariable) (string, error) {
defaultValue := variable.DefaultValue
if defaultValue == "" {
defaultValue = "false"
}
return cliui.Select(inv, cliui.SelectOptions{
Options: []string{"true", "false"},
Default: defaultValue,
Message: "Select value:",
})
}
func promptForNumberVariable(inv *serpent.Invocation, variable codersdk.TemplateVersionVariable) (string, error) {
prompt := "Enter value:"
if !variable.Required && variable.DefaultValue != "" {
prompt = fmt.Sprintf("Enter value (default: %q):", variable.DefaultValue)
}
return cliui.Prompt(inv, cliui.PromptOptions{
Text: prompt,
Default: variable.DefaultValue,
Validate: createVariableValidator(variable),
})
}
func promptForStringVariable(inv *serpent.Invocation, variable codersdk.TemplateVersionVariable) (string, error) {
prompt := "Enter value:"
if !variable.Sensitive {
if !variable.Required && variable.DefaultValue != "" {
prompt = fmt.Sprintf("Enter value (default: %q):", variable.DefaultValue)
}
}
return cliui.Prompt(inv, cliui.PromptOptions{
Text: prompt,
Default: variable.DefaultValue,
Secret: variable.Sensitive,
Validate: createVariableValidator(variable),
})
}
func createVariableValidator(variable codersdk.TemplateVersionVariable) func(string) error {
return func(s string) error {
if variable.Required && s == "" && variable.DefaultValue == "" {
return xerrors.New("value is required")
}
if variable.Type == "number" && s != "" {
if _, err := strconv.ParseFloat(s, 64); err != nil {
return xerrors.Errorf("must be a valid number, got: %q", s)
}
}
return nil
}
}
@@ -852,54 +852,6 @@ func TestTemplatePush(t *testing.T) {
require.Equal(t, "foobar", templateVariables[1].Value)
})
t.Run("VariableIsRequiredButNotProvided", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
owner := coderdtest.CreateFirstUser(t, client)
templateAdmin, _ := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.RoleTemplateAdmin())
templateVersion := coderdtest.CreateTemplateVersion(t, client, owner.OrganizationID, createEchoResponsesWithTemplateVariables(initialTemplateVariables))
_ = coderdtest.AwaitTemplateVersionJobCompleted(t, client, templateVersion.ID)
template := coderdtest.CreateTemplate(t, client, owner.OrganizationID, templateVersion.ID)
// Test the cli command.
//nolint:gocritic
modifiedTemplateVariables := append(initialTemplateVariables,
&proto.TemplateVariable{
Name: "second_variable",
Description: "This is the second variable.",
Type: "string",
Required: true,
},
)
source := clitest.CreateTemplateVersionSource(t, createEchoResponsesWithTemplateVariables(modifiedTemplateVariables))
inv, root := clitest.New(t, "templates", "push", template.Name, "--directory", source, "--test.provisioner", string(database.ProvisionerTypeEcho), "--name", "example")
clitest.SetupConfig(t, templateAdmin, root)
pty := ptytest.New(t)
inv.Stdin = pty.Input()
inv.Stdout = pty.Output()
execDone := make(chan error)
go func() {
execDone <- inv.Run()
}()
matches := []struct {
match string
write string
}{
{match: "Upload", write: "yes"},
}
for _, m := range matches {
pty.ExpectMatch(m.match)
pty.WriteLine(m.write)
}
wantErr := <-execDone
require.Error(t, wantErr)
require.Contains(t, wantErr.Error(), "required template variables need values")
})
t.Run("VariableIsOptionalButNotProvided", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
@@ -1115,6 +1067,240 @@ func TestTemplatePush(t *testing.T) {
require.Len(t, templateVersions, 2)
require.Equal(t, "example", templateVersions[1].Name)
})
t.Run("PromptForDifferentRequiredTypes", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
owner := coderdtest.CreateFirstUser(t, client)
templateAdmin, _ := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.RoleTemplateAdmin())
templateVariables := []*proto.TemplateVariable{
{
Name: "string_var",
Description: "A string variable",
Type: "string",
Required: true,
},
{
Name: "number_var",
Description: "A number variable",
Type: "number",
Required: true,
},
{
Name: "bool_var",
Description: "A boolean variable",
Type: "bool",
Required: true,
},
{
Name: "sensitive_var",
Description: "A sensitive variable",
Type: "string",
Required: true,
Sensitive: true,
},
}
source := clitest.CreateTemplateVersionSource(t, createEchoResponsesWithTemplateVariables(templateVariables))
inv, root := clitest.New(t, "templates", "push", "test-template", "--directory", source, "--test.provisioner", string(database.ProvisionerTypeEcho))
clitest.SetupConfig(t, templateAdmin, root)
pty := ptytest.New(t).Attach(inv)
execDone := make(chan error)
go func() {
execDone <- inv.Run()
}()
// Select "Yes" for the "Upload <template_path>" prompt
pty.ExpectMatch("Upload")
pty.WriteLine("yes")
pty.ExpectMatch("var.string_var")
pty.ExpectMatch("Enter value:")
pty.WriteLine("test-string")
pty.ExpectMatch("var.number_var")
pty.ExpectMatch("Enter value:")
pty.WriteLine("42")
// Boolean variable automatically selects the first option ("true")
pty.ExpectMatch("var.bool_var")
pty.ExpectMatch("var.sensitive_var")
pty.ExpectMatch("Enter value:")
pty.WriteLine("secret-value")
require.NoError(t, <-execDone)
})
t.Run("ValidateNumberInput", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
owner := coderdtest.CreateFirstUser(t, client)
templateAdmin, _ := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.RoleTemplateAdmin())
templateVariables := []*proto.TemplateVariable{
{
Name: "number_var",
Description: "A number that requires validation",
Type: "number",
Required: true,
},
}
source := clitest.CreateTemplateVersionSource(t, createEchoResponsesWithTemplateVariables(templateVariables))
inv, root := clitest.New(t, "templates", "push", "test-template", "--directory", source, "--test.provisioner", string(database.ProvisionerTypeEcho))
clitest.SetupConfig(t, templateAdmin, root)
pty := ptytest.New(t).Attach(inv)
execDone := make(chan error)
go func() {
execDone <- inv.Run()
}()
// Select "Yes" for the "Upload <template_path>" prompt
pty.ExpectMatch("Upload")
pty.WriteLine("yes")
pty.ExpectMatch("var.number_var")
pty.WriteLine("not-a-number")
pty.ExpectMatch("must be a valid number")
pty.WriteLine("123.45")
require.NoError(t, <-execDone)
})
t.Run("DontPromptForDefaultValues", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
owner := coderdtest.CreateFirstUser(t, client)
templateAdmin, _ := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.RoleTemplateAdmin())
templateVariables := []*proto.TemplateVariable{
{
Name: "with_default",
Type: "string",
Required: true,
DefaultValue: "default-value",
},
{
Name: "without_default",
Type: "string",
Required: true,
},
}
source := clitest.CreateTemplateVersionSource(t, createEchoResponsesWithTemplateVariables(templateVariables))
inv, root := clitest.New(t, "templates", "push", "test-template", "--directory", source, "--test.provisioner", string(database.ProvisionerTypeEcho))
clitest.SetupConfig(t, templateAdmin, root)
pty := ptytest.New(t).Attach(inv)
execDone := make(chan error)
go func() {
execDone <- inv.Run()
}()
// Select "Yes" for the "Upload <template_path>" prompt
pty.ExpectMatch("Upload")
pty.WriteLine("yes")
pty.ExpectMatch("var.without_default")
pty.WriteLine("test-value")
require.NoError(t, <-execDone)
})
t.Run("VariableSourcesPriority", func(t *testing.T) {
t.Parallel()
client := coderdtest.New(t, &coderdtest.Options{IncludeProvisionerDaemon: true})
owner := coderdtest.CreateFirstUser(t, client)
templateAdmin, _ := coderdtest.CreateAnotherUser(t, client, owner.OrganizationID, rbac.RoleTemplateAdmin())
templateVariables := []*proto.TemplateVariable{
{
Name: "cli_flag_var",
Description: "Variable provided via CLI flag",
Type: "string",
Required: true,
},
{
Name: "file_var",
Description: "Variable provided via file",
Type: "string",
Required: true,
},
{
Name: "prompt_var",
Description: "Variable provided via prompt",
Type: "string",
Required: true,
},
{
Name: "cli_overrides_file_var",
Description: "Variable in both CLI and file",
Type: "string",
Required: true,
},
}
source := clitest.CreateTemplateVersionSource(t, createEchoResponsesWithTemplateVariables(templateVariables))
// Create a temporary variables file.
tempDir := t.TempDir()
removeTmpDirUntilSuccessAfterTest(t, tempDir)
variablesFile, err := os.CreateTemp(tempDir, "variables*.yaml")
require.NoError(t, err)
_, err = variablesFile.WriteString(`file_var: from-file
cli_overrides_file_var: from-file`)
require.NoError(t, err)
require.NoError(t, variablesFile.Close())
inv, root := clitest.New(t, "templates", "push", "test-template",
"--directory", source,
"--test.provisioner", string(database.ProvisionerTypeEcho),
"--variables-file", variablesFile.Name(),
"--variable", "cli_flag_var=from-cli-flag",
"--variable", "cli_overrides_file_var=from-cli-override",
)
clitest.SetupConfig(t, templateAdmin, root)
pty := ptytest.New(t).Attach(inv)
execDone := make(chan error)
go func() {
execDone <- inv.Run()
}()
// Select "Yes" for the "Upload <template_path>" prompt
pty.ExpectMatch("Upload")
pty.WriteLine("yes")
// Only check for prompt_var, other variables should not prompt
pty.ExpectMatch("var.prompt_var")
pty.ExpectMatch("Enter value:")
pty.WriteLine("from-prompt")
require.NoError(t, <-execDone)
template, err := client.TemplateByName(context.Background(), owner.OrganizationID, "test-template")
require.NoError(t, err)
templateVersionVars, err := client.TemplateVersionVariables(context.Background(), template.ActiveVersionID)
require.NoError(t, err)
require.Len(t, templateVersionVars, 4)
varMap := make(map[string]string)
for _, tv := range templateVersionVars {
varMap[tv.Name] = tv.Value
}
require.Equal(t, "from-cli-flag", varMap["cli_flag_var"])
require.Equal(t, "from-file", varMap["file_var"])
require.Equal(t, "from-prompt", varMap["prompt_var"])
require.Equal(t, "from-cli-override", varMap["cli_overrides_file_var"])
})
})
}
@@ -45,6 +45,7 @@
"queue_position": 0,
"queue_size": 0,
"organization_id": "===========[first org ID]===========",
"initiator_id": "==========[first user ID]===========",
"input": {
"workspace_build_id": "========[workspace build ID]========"
},
@@ -11,9 +11,12 @@ OPTIONS:
-O, --org string, $CODER_ORGANIZATION
Select which organization (uuid or name) to use.
-c, --column [id|created at|started at|completed at|canceled at|error|error code|status|worker id|worker name|file id|tags|queue position|queue size|organization id|initiator id|template version id|workspace build id|type|available workers|template version name|template id|template name|template display name|template icon|workspace id|workspace name|logs overflowed|organization|queue] (default: created at,id,type,template display name,status,queue,tags)
Columns to display in table output.
-i, --initiator string, $CODER_PROVISIONER_JOB_LIST_INITIATOR
Filter by initiator (user ID or username).
-l, --limit int, $CODER_PROVISIONER_JOB_LIST_LIMIT (default: 50)
Limit the number of jobs returned.
@@ -15,6 +15,7 @@
"queue_position": 0,
"queue_size": 0,
"organization_id": "===========[first org ID]===========",
"initiator_id": "==========[first user ID]===========",
"input": {
"template_version_id": "============[version ID]============"
},
@@ -45,6 +46,7 @@
"queue_position": 0,
"queue_size": 0,
"organization_id": "===========[first org ID]===========",
"initiator_id": "==========[first user ID]===========",
"input": {
"workspace_build_id": "========[workspace build ID]========"
},
@@ -7,7 +7,7 @@
"last_seen_at": "====[timestamp]=====",
"name": "test-daemon",
"version": "v0.0.0-devel",
"api_version": "1.11",
"provisioners": [
"echo"
],
@@ -61,6 +61,14 @@ func (a *ConnLogAPI) ReportConnection(ctx context.Context, req *agentproto.Repor
return nil, xerrors.Errorf("get workspace by agent id: %w", err)
}
// Some older clients may incorrectly report "localhost" as the IP address.
// Related to https://github.com/coder/coder/issues/20194
logIPRaw := req.GetConnection().GetIp()
if logIPRaw == "localhost" {
logIPRaw = "127.0.0.1"
}
logIP := database.ParseIP(logIPRaw) // will return null if invalid
reason := req.GetConnection().GetReason()
connLogger := *a.ConnectionLogger.Load()
err = connLogger.Upsert(ctx, database.UpsertConnectionLogParams{
@@ -73,7 +81,7 @@ func (a *ConnLogAPI) ReportConnection(ctx context.Context, req *agentproto.Repor
AgentName: workspaceAgent.Name,
Type: connectionType,
Code: code,
Ip: logIP,
ConnectionID: uuid.NullUUID{
UUID: connectionID,
Valid: true,
@@ -3,13 +3,11 @@ package agentapi_test
import (
"context"
"database/sql"
"net"
"sync/atomic"
"testing"
"time"
"github.com/google/uuid"
"github.com/sqlc-dev/pqtype"
"github.com/stretchr/testify/require"
"go.uber.org/mock/gomock"
"google.golang.org/protobuf/types/known/timestamppb"
@@ -75,6 +73,9 @@ func TestConnectionLog(t *testing.T) {
action: agentproto.Connection_CONNECT.Enum(),
typ: agentproto.Connection_JETBRAINS.Enum(),
time: dbtime.Now(),
// Sometimes, JetBrains clients report as localhost, see
// https://github.com/coder/coder/issues/20194
ip: "localhost",
},
{
name: "Reconnecting PTY Connect",
@@ -129,6 +130,12 @@ func TestConnectionLog(t *testing.T) {
},
})
expectedIPRaw := tt.ip
if expectedIPRaw == "localhost" {
expectedIPRaw = "127.0.0.1"
}
expectedIP := database.ParseIP(expectedIPRaw)
require.True(t, connLogger.Contains(t, database.UpsertConnectionLogParams{
Time: dbtime.Time(tt.time).In(time.UTC),
OrganizationID: workspace.OrganizationID,
@@ -146,7 +153,7 @@ func TestConnectionLog(t *testing.T) {
Int32: tt.status,
Valid: *tt.action == agentproto.Connection_DISCONNECT,
},
Ip: pqtype.Inet{Valid: true, IPNet: net.IPNet{IP: net.ParseIP(tt.ip), Mask: net.CIDRMask(32, 32)}},
Ip: expectedIP,
Type: agentProtoConnectionTypeToConnectionLog(t, *tt.typ),
DisconnectReason: sql.NullString{
String: tt.reason,
+119 -153
@@ -1,17 +1,13 @@
package coderd
import (
"bytes"
"context"
"database/sql"
"encoding/json"
"errors"
"fmt"
"io"
"net"
"net/http"
"net/url"
"path"
"slices"
"strings"
"time"
@@ -31,6 +27,8 @@ import (
"github.com/coder/coder/v2/coderd/taskname"
"github.com/coder/coder/v2/coderd/util/slice"
"github.com/coder/coder/v2/codersdk"
aiagentapi "github.com/coder/agentapi-sdk-go"
)
// This endpoint is experimental and not guaranteed to be stable, so we're not
@@ -84,8 +82,18 @@ func (api *API) aiTasksPrompts(rw http.ResponseWriter, r *http.Request) {
})
}
// This endpoint is experimental and not guaranteed to be stable, so we're not
// generating public-facing documentation for it.
// @Summary Create a new AI task
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID create-task
// @Security CoderSessionToken
// @Tags Experimental
// @Param user path string true "Username, user ID, or 'me' for the authenticated user"
// @Param request body codersdk.CreateTaskRequest true "Create task request"
// @Success 201 {object} codersdk.Task
// @Router /api/experimental/tasks/{user} [post]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// This endpoint creates a new task for the given user.
func (api *API) tasksCreate(rw http.ResponseWriter, r *http.Request) {
var (
ctx = r.Context()
@@ -260,6 +268,14 @@ func taskFromWorkspace(ws codersdk.Workspace, initialPrompt string) codersdk.Tas
}
}
var appID uuid.NullUUID
if ws.LatestBuild.AITaskSidebarAppID != nil {
appID = uuid.NullUUID{
Valid: true,
UUID: *ws.LatestBuild.AITaskSidebarAppID,
}
}
return codersdk.Task{
ID: ws.ID,
OrganizationID: ws.OrganizationID,
@@ -271,9 +287,11 @@ func taskFromWorkspace(ws codersdk.Workspace, initialPrompt string) codersdk.Tas
TemplateDisplayName: ws.TemplateDisplayName,
TemplateIcon: ws.TemplateIcon,
WorkspaceID: uuid.NullUUID{Valid: true, UUID: ws.ID},
WorkspaceBuildNumber: ws.LatestBuild.BuildNumber,
WorkspaceAgentID: taskAgentID,
WorkspaceAgentLifecycle: taskAgentLifecycle,
WorkspaceAgentHealth: taskAgentHealth,
WorkspaceAppID: appID,
CreatedAt: ws.CreatedAt,
UpdatedAt: ws.UpdatedAt,
InitialPrompt: initialPrompt,
@@ -318,6 +336,19 @@ type tasksListResponse struct {
Count int `json:"count"`
}
// @Summary List AI tasks
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID list-tasks
// @Security CoderSessionToken
// @Tags Experimental
// @Param q query string false "Search query for filtering tasks"
// @Param after_id query string false "Return tasks after this ID for pagination"
// @Param limit query int false "Maximum number of tasks to return" minimum(1) maximum(100) default(25)
// @Param offset query int false "Offset for pagination" minimum(0) default(0)
// @Success 200 {object} coderd.tasksListResponse
// @Router /api/experimental/tasks [get]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// tasksList is an experimental endpoint to list AI tasks by mapping
// workspaces to a task-shaped response.
func (api *API) tasksList(rw http.ResponseWriter, r *http.Request) {
@@ -421,6 +452,17 @@ func (api *API) tasksList(rw http.ResponseWriter, r *http.Request) {
})
}
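The query parameters documented in the annotations above can be assembled client-side with `net/url`; a sketch under the assumption of a placeholder host (`buildTasksListURL` is illustrative, not a codersdk helper):

```go
package main

import (
	"fmt"
	"net/url"
)

// buildTasksListURL assembles the experimental list-tasks request URL
// with the pagination parameters documented above (limit defaults to
// 25 and is capped at 100 server-side; offset defaults to 0).
func buildTasksListURL(base string, limit, offset int) string {
	q := url.Values{}
	q.Set("limit", fmt.Sprint(limit))
	q.Set("offset", fmt.Sprint(offset))
	return base + "/api/experimental/tasks?" + q.Encode()
}

func main() {
	// The host is a placeholder, not taken from the source.
	fmt.Println(buildTasksListURL("https://coder.example.com", 25, 0))
}
```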
// @Summary Get AI task by ID
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID get-task
// @Security CoderSessionToken
// @Tags Experimental
// @Param user path string true "Username, user ID, or 'me' for the authenticated user"
// @Param id path string true "Task ID" format(uuid)
// @Success 200 {object} codersdk.Task
// @Router /api/experimental/tasks/{user}/{id} [get]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// taskGet is an experimental endpoint to fetch a single AI task by ID
// (workspace ID). It returns a synthesized task response including
// prompt and status.
@@ -527,6 +569,17 @@ func (api *API) taskGet(rw http.ResponseWriter, r *http.Request) {
httpapi.Write(ctx, rw, http.StatusOK, tasks[0])
}
// @Summary Delete AI task by ID
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID delete-task
// @Security CoderSessionToken
// @Tags Experimental
// @Param user path string true "Username, user ID, or 'me' for the authenticated user"
// @Param id path string true "Task ID" format(uuid)
// @Success 202 "Task deletion initiated"
// @Router /api/experimental/tasks/{user}/{id} [delete]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// taskDelete is an experimental endpoint to delete a task by ID (workspace ID).
// It creates a delete workspace build and returns 202 Accepted if the build was
// created.
@@ -602,6 +655,18 @@ func (api *API) taskDelete(rw http.ResponseWriter, r *http.Request) {
rw.WriteHeader(http.StatusAccepted)
}
// @Summary Send input to AI task
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID send-task-input
// @Security CoderSessionToken
// @Tags Experimental
// @Param user path string true "Username, user ID, or 'me' for the authenticated user"
// @Param id path string true "Task ID" format(uuid)
// @Param request body codersdk.TaskSendRequest true "Task input request"
// @Success 204 "Input sent successfully"
// @Router /api/experimental/tasks/{user}/{id}/send [post]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// taskSend submits task input to the task's sidebar app by dialing the agent
// directly over the tailnet. We enforce ApplicationConnect RBAC on the
// workspace and validate the sidebar app health.
@@ -629,64 +694,40 @@ func (api *API) taskSend(rw http.ResponseWriter, r *http.Request) {
}
if err = api.authAndDoWithTaskSidebarAppClient(r, taskID, func(ctx context.Context, client *http.Client, appURL *url.URL) error {
status, err := agentapiDoStatusRequest(ctx, client, appURL)
agentAPIClient, err := aiagentapi.NewClient(appURL.String(), aiagentapi.WithHTTPClient(client))
if err != nil {
return err
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to create agentapi client.",
Detail: err.Error(),
})
}
if status != "stable" {
statusResp, err := agentAPIClient.GetStatus(ctx)
if err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to get status from task app.",
Detail: err.Error(),
})
}
if statusResp.Status != aiagentapi.StatusStable {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Task app is not ready to accept input.",
Detail: fmt.Sprintf("Status: %s", status),
Detail: fmt.Sprintf("Status: %s", statusResp.Status),
})
}
var reqBody struct {
Content string `json:"content"`
Type string `json:"type"`
}
reqBody.Content = req.Input
reqBody.Type = "user"
req, err := agentapiNewRequest(ctx, http.MethodPost, appURL, "message", reqBody)
if err != nil {
return err
}
resp, err := client.Do(req)
_, err = agentAPIClient.PostMessage(ctx, aiagentapi.PostMessageParams{
Content: req.Input,
Type: aiagentapi.MessageTypeUser,
})
if err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to reach task app endpoint.",
Detail: err.Error(),
})
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(io.LimitReader(resp.Body, 128))
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Task app rejected the message.",
Detail: fmt.Sprintf("Upstream status: %d; Body: %s", resp.StatusCode, body),
})
}
// {"$schema":"http://localhost:3284/schemas/MessageResponseBody.json","ok":true}
// {"$schema":"http://localhost:3284/schemas/ErrorModel.json","title":"Unprocessable Entity","status":422,"detail":"validation failed","errors":[{"location":"body.type","value":"oof"}]}
var respBody map[string]any
if err := json.NewDecoder(resp.Body).Decode(&respBody); err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to decode task app response body.",
Detail: err.Error(),
})
}
if v, ok := respBody["ok"].(bool); !ok || !v {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Task app rejected the message.",
Detail: fmt.Sprintf("Upstream response: %v", respBody),
})
}
return nil
}); err != nil {
httperror.WriteResponseError(ctx, rw, err)
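The send path above gates input on the sidebar app reporting a stable status before posting the message; the control flow can be sketched independently of the SDK (the `statusClient` interface and all names here are illustrative, not from agentapi-sdk-go):

```go
package main

import "fmt"

// statusClient abstracts the two agentapi calls made above; the real
// handler uses the agentapi-sdk-go client.
type statusClient interface {
	Status() (string, error)
	PostMessage(content string) error
}

// sendInput mirrors the gate above: input is refused unless the task
// app reports a "stable" status.
func sendInput(c statusClient, input string) error {
	status, err := c.Status()
	if err != nil {
		return err
	}
	if status != "stable" {
		return fmt.Errorf("task app is not ready to accept input: status %q", status)
	}
	return c.PostMessage(input)
}

// fakeApp is a stand-in for the sidebar app, used only for illustration.
type fakeApp struct{ status string }

func (f fakeApp) Status() (string, error)          { return f.status, nil }
func (f fakeApp) PostMessage(content string) error { return nil }

func main() {
	fmt.Println(sendInput(fakeApp{status: "running"}, "hello"))
	fmt.Println(sendInput(fakeApp{status: "stable"}, "hello"))
}
```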
@@ -696,6 +737,19 @@ func (api *API) taskSend(rw http.ResponseWriter, r *http.Request) {
rw.WriteHeader(http.StatusNoContent)
}
// @Summary Get AI task logs
// @Description: EXPERIMENTAL: this endpoint is experimental and not guaranteed to be stable.
// @ID get-task-logs
// @Security CoderSessionToken
// @Tags Experimental
// @Param user path string true "Username, user ID, or 'me' for the authenticated user"
// @Param id path string true "Task ID" format(uuid)
// @Success 200 {object} codersdk.TaskLogsResponse
// @Router /api/experimental/tasks/{user}/{id}/logs [get]
//
// EXPERIMENTAL: This endpoint is experimental and not guaranteed to be stable.
// taskLogs reads task output by dialing the agent directly over the tailnet.
// We enforce ApplicationConnect RBAC on the workspace and validate the sidebar app health.
func (api *API) taskLogs(rw http.ResponseWriter, r *http.Request) {
ctx := r.Context()
@@ -710,51 +764,29 @@ func (api *API) taskLogs(rw http.ResponseWriter, r *http.Request) {
var out codersdk.TaskLogsResponse
if err := api.authAndDoWithTaskSidebarAppClient(r, taskID, func(ctx context.Context, client *http.Client, appURL *url.URL) error {
req, err := agentapiNewRequest(ctx, http.MethodGet, appURL, "messages", nil)
if err != nil {
return err
}
resp, err := client.Do(req)
agentAPIClient, err := aiagentapi.NewClient(appURL.String(), aiagentapi.WithHTTPClient(client))
if err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to reach task app endpoint.",
Detail: err.Error(),
})
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(io.LimitReader(resp.Body, 128))
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Task app rejected the request.",
Detail: fmt.Sprintf("Upstream status: %d; Body: %s", resp.StatusCode, body),
})
}
// {"$schema":"http://localhost:3284/schemas/MessagesResponseBody.json","messages":[]}
var respBody struct {
Messages []struct {
ID int `json:"id"`
Content string `json:"content"`
Role string `json:"role"`
Time time.Time `json:"time"`
} `json:"messages"`
}
if err := json.NewDecoder(resp.Body).Decode(&respBody); err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to decode task app response body.",
Message: "Failed to create agentapi client.",
Detail: err.Error(),
})
}
logs := make([]codersdk.TaskLogEntry, 0, len(respBody.Messages))
for _, m := range respBody.Messages {
messagesResp, err := agentAPIClient.GetMessages(ctx)
if err != nil {
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to get messages from task app.",
Detail: err.Error(),
})
}
logs := make([]codersdk.TaskLogEntry, 0, len(messagesResp.Messages))
for _, m := range messagesResp.Messages {
var typ codersdk.TaskLogType
switch strings.ToLower(m.Role) {
case "user":
switch m.Role {
case aiagentapi.RoleUser:
typ = codersdk.TaskLogTypeInput
case "agent":
case aiagentapi.RoleAgent:
typ = codersdk.TaskLogTypeOutput
default:
return httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
@@ -763,7 +795,7 @@ func (api *API) taskLogs(rw http.ResponseWriter, r *http.Request) {
})
}
logs = append(logs, codersdk.TaskLogEntry{
ID: m.ID,
ID: int(m.Id),
Content: m.Content,
Type: typ,
Time: m.Time,
@@ -903,69 +935,3 @@ func (api *API) authAndDoWithTaskSidebarAppClient(
}
return do(ctx, client, parsedURL)
}
func agentapiNewRequest(ctx context.Context, method string, appURL *url.URL, appURLPath string, body any) (*http.Request, error) {
u := *appURL
u.Path = path.Join(appURL.Path, appURLPath)
var bodyReader io.Reader
if body != nil {
b, err := json.Marshal(body)
if err != nil {
return nil, httperror.NewResponseError(http.StatusBadRequest, codersdk.Response{
Message: "Failed to marshal task app request body.",
Detail: err.Error(),
})
}
bodyReader = bytes.NewReader(b)
}
req, err := http.NewRequestWithContext(ctx, method, u.String(), bodyReader)
if err != nil {
return nil, httperror.NewResponseError(http.StatusBadRequest, codersdk.Response{
Message: "Failed to create task app request.",
Detail: err.Error(),
})
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("Accept", "application/json")
return req, nil
}
func agentapiDoStatusRequest(ctx context.Context, client *http.Client, appURL *url.URL) (string, error) {
req, err := agentapiNewRequest(ctx, http.MethodGet, appURL, "status", nil)
if err != nil {
return "", err
}
resp, err := client.Do(req)
if err != nil {
return "", httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to reach task app endpoint.",
Detail: err.Error(),
})
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return "", httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Task app status returned an error.",
Detail: fmt.Sprintf("Status code: %d", resp.StatusCode),
})
}
// {"$schema":"http://localhost:3284/schemas/StatusResponseBody.json","status":"stable"}
var respBody struct {
Status string `json:"status"`
}
if err := json.NewDecoder(resp.Body).Decode(&respBody); err != nil {
return "", httperror.NewResponseError(http.StatusBadGateway, codersdk.Response{
Message: "Failed to decode task app status response body.",
Detail: err.Error(),
})
}
return respBody.Status, nil
}
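The removed `agentapiDoStatusRequest` helper decoded a body of the shape shown in its inline example; that decoding step in isolation (`decodeStatus` is an illustrative name):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// decodeStatus extracts the "status" field from a response body of the
// shape in the removed helper's comment; unknown fields such as
// "$schema" are ignored by encoding/json.
func decodeStatus(body []byte) (string, error) {
	var respBody struct {
		Status string `json:"status"`
	}
	if err := json.Unmarshal(body, &respBody); err != nil {
		return "", err
	}
	return respBody.Status, nil
}

func main() {
	body := []byte(`{"$schema":"http://localhost:3284/schemas/StatusResponseBody.json","status":"stable"}`)
	status, err := decodeStatus(body)
	fmt.Println(status, err)
}
```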
+97 -4
@@ -6,8 +6,10 @@ import (
"io"
"net/http"
"net/http/httptest"
"strings"
"testing"
"time"
"unicode/utf8"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
@@ -977,6 +979,7 @@ func TestTasksNotification(t *testing.T) {
isAITask bool
isNotificationSent bool
notificationTemplate uuid.UUID
taskPrompt string
}{
// Should not send a notification when the agent app is not an AI task.
{
@@ -985,6 +988,7 @@ func TestTasksNotification(t *testing.T) {
newAppStatus: codersdk.WorkspaceAppStatusStateWorking,
isAITask: false,
isNotificationSent: false,
taskPrompt: "NoAITask",
},
// Should not send a notification when the new app status is neither 'Working' nor 'Idle'.
{
@@ -993,6 +997,7 @@ func TestTasksNotification(t *testing.T) {
newAppStatus: codersdk.WorkspaceAppStatusStateComplete,
isAITask: true,
isNotificationSent: false,
taskPrompt: "NonNotifiedState",
},
// Should not send a notification when the new app status equals the latest status (Working).
{
@@ -1001,15 +1006,27 @@ func TestTasksNotification(t *testing.T) {
newAppStatus: codersdk.WorkspaceAppStatusStateWorking,
isAITask: true,
isNotificationSent: false,
taskPrompt: "NonNotifiedTransition",
},
// Should send TemplateTaskWorking when the AI task transitions to 'Working'.
// Should NOT send TemplateTaskWorking when the AI task's FIRST status is 'Working' (obvious state).
{
name: "TemplateTaskWorking",
latestAppStatuses: nil,
newAppStatus: codersdk.WorkspaceAppStatusStateWorking,
isAITask: true,
isNotificationSent: true,
isNotificationSent: false,
notificationTemplate: notifications.TemplateTaskWorking,
taskPrompt: "TemplateTaskWorking",
},
// Should send TemplateTaskIdle when the AI task's FIRST status is 'Idle' (task completed immediately).
{
name: "InitialTemplateTaskIdle",
latestAppStatuses: nil,
newAppStatus: codersdk.WorkspaceAppStatusStateIdle,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskIdle,
taskPrompt: "InitialTemplateTaskIdle",
},
// Should send TemplateTaskWorking when the AI task transitions to 'Working' from 'Idle'.
{
@@ -1022,6 +1039,7 @@ func TestTasksNotification(t *testing.T) {
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskWorking,
taskPrompt: "TemplateTaskWorkingFromIdle",
},
// Should send TemplateTaskIdle when the AI task transitions to 'Idle'.
{
@@ -1031,6 +1049,75 @@ func TestTasksNotification(t *testing.T) {
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskIdle,
taskPrompt: "TemplateTaskIdle",
},
// Long task prompts should be truncated to 160 characters.
{
name: "LongTaskPrompt",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateWorking},
newAppStatus: codersdk.WorkspaceAppStatusStateIdle,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskIdle,
taskPrompt: "This is a very long task prompt that should be truncated to 160 characters. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.",
},
// Should send TemplateTaskCompleted when the AI task transitions to 'Complete'.
{
name: "TemplateTaskCompleted",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateWorking},
newAppStatus: codersdk.WorkspaceAppStatusStateComplete,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskCompleted,
taskPrompt: "TemplateTaskCompleted",
},
// Should send TemplateTaskFailed when the AI task transitions to 'Failure'.
{
name: "TemplateTaskFailed",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateWorking},
newAppStatus: codersdk.WorkspaceAppStatusStateFailure,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskFailed,
taskPrompt: "TemplateTaskFailed",
},
// Should send TemplateTaskCompleted when the AI task transitions from 'Idle' to 'Complete'.
{
name: "TemplateTaskCompletedFromIdle",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateIdle},
newAppStatus: codersdk.WorkspaceAppStatusStateComplete,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskCompleted,
taskPrompt: "TemplateTaskCompletedFromIdle",
},
// Should send TemplateTaskFailed when the AI task transitions from 'Idle' to 'Failure'.
{
name: "TemplateTaskFailedFromIdle",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateIdle},
newAppStatus: codersdk.WorkspaceAppStatusStateFailure,
isAITask: true,
isNotificationSent: true,
notificationTemplate: notifications.TemplateTaskFailed,
taskPrompt: "TemplateTaskFailedFromIdle",
},
// Should NOT send notification when transitioning from 'Complete' to 'Complete' (no change).
{
name: "NoNotificationCompleteToComplete",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateComplete},
newAppStatus: codersdk.WorkspaceAppStatusStateComplete,
isAITask: true,
isNotificationSent: false,
taskPrompt: "NoNotificationCompleteToComplete",
},
// Should NOT send notification when transitioning from 'Failure' to 'Failure' (no change).
{
name: "NoNotificationFailureToFailure",
latestAppStatuses: []codersdk.WorkspaceAppStatusState{codersdk.WorkspaceAppStatusStateFailure},
newAppStatus: codersdk.WorkspaceAppStatusStateFailure,
isAITask: true,
isNotificationSent: false,
taskPrompt: "NoNotificationFailureToFailure",
},
} {
t.Run(tc.name, func(t *testing.T) {
@@ -1067,7 +1154,7 @@ func TestTasksNotification(t *testing.T) {
}).Seed(workspaceBuildSeed).Params(database.WorkspaceBuildParameter{
WorkspaceBuildID: workspaceBuildID,
Name: codersdk.AITaskPromptParameterName,
Value: "task prompt",
Value: tc.taskPrompt,
}).WithAgent(func(agent []*proto.Agent) []*proto.Agent {
agent[0].Apps = []*proto.App{{
Id: workspaceAgentAppID.String(),
@@ -1115,7 +1202,13 @@ func TestTasksNotification(t *testing.T) {
require.Len(t, sent, 1)
require.Equal(t, memberUser.ID, sent[0].UserID)
require.Len(t, sent[0].Labels, 2)
require.Equal(t, "task prompt", sent[0].Labels["task"])
// NOTE: len(string) is the number of bytes in the string, not the number of runes.
require.LessOrEqual(t, utf8.RuneCountInString(sent[0].Labels["task"]), 160)
if len(tc.taskPrompt) > 160 {
require.Contains(t, tc.taskPrompt, strings.TrimSuffix(sent[0].Labels["task"], "…"))
} else {
require.Equal(t, tc.taskPrompt, sent[0].Labels["task"])
}
require.Equal(t, workspace.Name, sent[0].Labels["workspace"])
} else {
// Then: No notification is sent
+808 -4
@@ -130,6 +130,256 @@ const docTemplate = `{
}
}
},
"/api/experimental/tasks": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "List AI tasks",
"operationId": "list-tasks",
"parameters": [
{
"type": "string",
"description": "Search query for filtering tasks",
"name": "q",
"in": "query"
},
{
"type": "string",
"description": "Return tasks after this ID for pagination",
"name": "after_id",
"in": "query"
},
{
"maximum": 100,
"minimum": 1,
"type": "integer",
"default": 25,
"description": "Maximum number of tasks to return",
"name": "limit",
"in": "query"
},
{
"minimum": 0,
"type": "integer",
"default": 0,
"description": "Offset for pagination",
"name": "offset",
"in": "query"
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/coderd.tasksListResponse"
}
}
}
}
},
"/api/experimental/tasks/{user}": {
"post": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "Create a new AI task",
"operationId": "create-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"description": "Create task request",
"name": "request",
"in": "body",
"required": true,
"schema": {
"$ref": "#/definitions/codersdk.CreateTaskRequest"
}
}
],
"responses": {
"201": {
"description": "Created",
"schema": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
}
},
"/api/experimental/tasks/{user}/{id}": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "Get AI task by ID",
"operationId": "get-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
},
"delete": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "Delete AI task by ID",
"operationId": "delete-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"202": {
"description": "Task deletion initiated"
}
}
}
},
"/api/experimental/tasks/{user}/{id}/logs": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "Get AI task logs",
"operationId": "get-task-logs",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/codersdk.TaskLogsResponse"
}
}
}
}
},
"/api/experimental/tasks/{user}/{id}/send": {
"post": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": [
"Experimental"
],
"summary": "Send input to AI task",
"operationId": "send-task-input",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
},
{
"description": "Task input request",
"name": "request",
"in": "body",
"required": true,
"schema": {
"$ref": "#/definitions/codersdk.TaskSendRequest"
}
}
],
"responses": {
"204": {
"description": "Input sent successfully"
}
}
}
},
"/appearance": {
"get": {
"security": [
@@ -3744,6 +3994,13 @@ const docTemplate = `{
"description": "Provisioner tags to filter by (JSON of the form {'tag1':'value1','tag2':'value2'})",
"name": "tags",
"in": "query"
},
{
"type": "string",
"format": "uuid",
"description": "Filter results by initiator",
"name": "initiator",
"in": "query"
}
],
"responses": {
@@ -11229,6 +11486,20 @@ const docTemplate = `{
}
}
},
"coderd.tasksListResponse": {
"type": "object",
"properties": {
"count": {
"type": "integer"
},
"tasks": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
},
"codersdk.ACLAvailable": {
"type": "object",
"properties": {
@@ -11442,6 +11713,17 @@ const docTemplate = `{
}
}
},
"codersdk.APIAllowListTarget": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"type": {
"$ref": "#/definitions/codersdk.RBACResource"
}
}
},
"codersdk.APIKey": {
"type": "object",
"required": [
@@ -11523,11 +11805,29 @@ const docTemplate = `{
"enum": [
"all",
"application_connect",
"aibridge_interception:*",
"aibridge_interception:create",
"aibridge_interception:read",
"aibridge_interception:update",
"api_key:*",
"api_key:create",
"api_key:delete",
"api_key:read",
"api_key:update",
"assign_org_role:*",
"assign_org_role:assign",
"assign_org_role:create",
"assign_org_role:delete",
"assign_org_role:read",
"assign_org_role:unassign",
"assign_org_role:update",
"assign_role:*",
"assign_role:assign",
"assign_role:read",
"assign_role:unassign",
"audit_log:*",
"audit_log:create",
"audit_log:read",
"coder:all",
"coder:apikeys.manage_self",
"coder:application_connect",
@@ -11537,40 +11837,193 @@ const docTemplate = `{
"coder:workspaces.create",
"coder:workspaces.delete",
"coder:workspaces.operate",
"connection_log:*",
"connection_log:read",
"connection_log:update",
"crypto_key:*",
"crypto_key:create",
"crypto_key:delete",
"crypto_key:read",
"crypto_key:update",
"debug_info:*",
"debug_info:read",
"deployment_config:*",
"deployment_config:read",
"deployment_config:update",
"deployment_stats:*",
"deployment_stats:read",
"file:*",
"file:create",
"file:read",
"group:*",
"group:create",
"group:delete",
"group:read",
"group:update",
"group_member:*",
"group_member:read",
"idpsync_settings:*",
"idpsync_settings:read",
"idpsync_settings:update",
"inbox_notification:*",
"inbox_notification:create",
"inbox_notification:read",
"inbox_notification:update",
"license:*",
"license:create",
"license:delete",
"license:read",
"notification_message:*",
"notification_message:create",
"notification_message:delete",
"notification_message:read",
"notification_message:update",
"notification_preference:*",
"notification_preference:read",
"notification_preference:update",
"notification_template:*",
"notification_template:read",
"notification_template:update",
"oauth2_app:*",
"oauth2_app:create",
"oauth2_app:delete",
"oauth2_app:read",
"oauth2_app:update",
"oauth2_app_code_token:*",
"oauth2_app_code_token:create",
"oauth2_app_code_token:delete",
"oauth2_app_code_token:read",
"oauth2_app_secret:*",
"oauth2_app_secret:create",
"oauth2_app_secret:delete",
"oauth2_app_secret:read",
"oauth2_app_secret:update",
"organization:*",
"organization:create",
"organization:delete",
"organization:read",
"organization:update",
"organization_member:*",
"organization_member:create",
"organization_member:delete",
"organization_member:read",
"organization_member:update",
"prebuilt_workspace:*",
"prebuilt_workspace:delete",
"prebuilt_workspace:update",
"provisioner_daemon:*",
"provisioner_daemon:create",
"provisioner_daemon:delete",
"provisioner_daemon:read",
"provisioner_daemon:update",
"provisioner_jobs:*",
"provisioner_jobs:create",
"provisioner_jobs:read",
"provisioner_jobs:update",
"replicas:*",
"replicas:read",
"system:*",
"system:create",
"system:delete",
"system:read",
"system:update",
"tailnet_coordinator:*",
"tailnet_coordinator:create",
"tailnet_coordinator:delete",
"tailnet_coordinator:read",
"tailnet_coordinator:update",
"task:*",
"task:create",
"task:delete",
"task:read",
"task:update",
"template:*",
"template:create",
"template:delete",
"template:read",
"template:update",
"template:use",
"template:view_insights",
"usage_event:*",
"usage_event:create",
"usage_event:read",
"usage_event:update",
"user:*",
"user:create",
"user:delete",
"user:read",
"user:read_personal",
"user:update",
"user:update_personal",
"user_secret:*",
"user_secret:create",
"user_secret:delete",
"user_secret:read",
"user_secret:update",
"webpush_subscription:*",
"webpush_subscription:create",
"webpush_subscription:delete",
"webpush_subscription:read",
"workspace:*",
"workspace:application_connect",
"workspace:create",
"workspace:create_agent",
"workspace:delete",
"workspace:delete_agent",
"workspace:read",
"workspace:ssh",
"workspace:start",
"workspace:stop",
"workspace:update"
"workspace:update",
"workspace_agent_devcontainers:*",
"workspace_agent_devcontainers:create",
"workspace_agent_resource_monitor:*",
"workspace_agent_resource_monitor:create",
"workspace_agent_resource_monitor:read",
"workspace_agent_resource_monitor:update",
"workspace_dormant:*",
"workspace_dormant:application_connect",
"workspace_dormant:create",
"workspace_dormant:create_agent",
"workspace_dormant:delete",
"workspace_dormant:delete_agent",
"workspace_dormant:read",
"workspace_dormant:ssh",
"workspace_dormant:start",
"workspace_dormant:stop",
"workspace_dormant:update",
"workspace_proxy:*",
"workspace_proxy:create",
"workspace_proxy:delete",
"workspace_proxy:read",
"workspace_proxy:update"
],
"x-enum-varnames": [
"APIKeyScopeAll",
"APIKeyScopeApplicationConnect",
"APIKeyScopeAibridgeInterceptionAll",
"APIKeyScopeAibridgeInterceptionCreate",
"APIKeyScopeAibridgeInterceptionRead",
"APIKeyScopeAibridgeInterceptionUpdate",
"APIKeyScopeApiKeyAll",
"APIKeyScopeApiKeyCreate",
"APIKeyScopeApiKeyDelete",
"APIKeyScopeApiKeyRead",
"APIKeyScopeApiKeyUpdate",
"APIKeyScopeAssignOrgRoleAll",
"APIKeyScopeAssignOrgRoleAssign",
"APIKeyScopeAssignOrgRoleCreate",
"APIKeyScopeAssignOrgRoleDelete",
"APIKeyScopeAssignOrgRoleRead",
"APIKeyScopeAssignOrgRoleUnassign",
"APIKeyScopeAssignOrgRoleUpdate",
"APIKeyScopeAssignRoleAll",
"APIKeyScopeAssignRoleAssign",
"APIKeyScopeAssignRoleRead",
"APIKeyScopeAssignRoleUnassign",
"APIKeyScopeAuditLogAll",
"APIKeyScopeAuditLogCreate",
"APIKeyScopeAuditLogRead",
"APIKeyScopeCoderAll",
"APIKeyScopeCoderApikeysManageSelf",
"APIKeyScopeCoderApplicationConnect",
@@ -11580,31 +12033,166 @@ const docTemplate = `{
"APIKeyScopeCoderWorkspacesCreate",
"APIKeyScopeCoderWorkspacesDelete",
"APIKeyScopeCoderWorkspacesOperate",
"APIKeyScopeConnectionLogAll",
"APIKeyScopeConnectionLogRead",
"APIKeyScopeConnectionLogUpdate",
"APIKeyScopeCryptoKeyAll",
"APIKeyScopeCryptoKeyCreate",
"APIKeyScopeCryptoKeyDelete",
"APIKeyScopeCryptoKeyRead",
"APIKeyScopeCryptoKeyUpdate",
"APIKeyScopeDebugInfoAll",
"APIKeyScopeDebugInfoRead",
"APIKeyScopeDeploymentConfigAll",
"APIKeyScopeDeploymentConfigRead",
"APIKeyScopeDeploymentConfigUpdate",
"APIKeyScopeDeploymentStatsAll",
"APIKeyScopeDeploymentStatsRead",
"APIKeyScopeFileAll",
"APIKeyScopeFileCreate",
"APIKeyScopeFileRead",
"APIKeyScopeGroupAll",
"APIKeyScopeGroupCreate",
"APIKeyScopeGroupDelete",
"APIKeyScopeGroupRead",
"APIKeyScopeGroupUpdate",
"APIKeyScopeGroupMemberAll",
"APIKeyScopeGroupMemberRead",
"APIKeyScopeIdpsyncSettingsAll",
"APIKeyScopeIdpsyncSettingsRead",
"APIKeyScopeIdpsyncSettingsUpdate",
"APIKeyScopeInboxNotificationAll",
"APIKeyScopeInboxNotificationCreate",
"APIKeyScopeInboxNotificationRead",
"APIKeyScopeInboxNotificationUpdate",
"APIKeyScopeLicenseAll",
"APIKeyScopeLicenseCreate",
"APIKeyScopeLicenseDelete",
"APIKeyScopeLicenseRead",
"APIKeyScopeNotificationMessageAll",
"APIKeyScopeNotificationMessageCreate",
"APIKeyScopeNotificationMessageDelete",
"APIKeyScopeNotificationMessageRead",
"APIKeyScopeNotificationMessageUpdate",
"APIKeyScopeNotificationPreferenceAll",
"APIKeyScopeNotificationPreferenceRead",
"APIKeyScopeNotificationPreferenceUpdate",
"APIKeyScopeNotificationTemplateAll",
"APIKeyScopeNotificationTemplateRead",
"APIKeyScopeNotificationTemplateUpdate",
"APIKeyScopeOauth2AppAll",
"APIKeyScopeOauth2AppCreate",
"APIKeyScopeOauth2AppDelete",
"APIKeyScopeOauth2AppRead",
"APIKeyScopeOauth2AppUpdate",
"APIKeyScopeOauth2AppCodeTokenAll",
"APIKeyScopeOauth2AppCodeTokenCreate",
"APIKeyScopeOauth2AppCodeTokenDelete",
"APIKeyScopeOauth2AppCodeTokenRead",
"APIKeyScopeOauth2AppSecretAll",
"APIKeyScopeOauth2AppSecretCreate",
"APIKeyScopeOauth2AppSecretDelete",
"APIKeyScopeOauth2AppSecretRead",
"APIKeyScopeOauth2AppSecretUpdate",
"APIKeyScopeOrganizationAll",
"APIKeyScopeOrganizationCreate",
"APIKeyScopeOrganizationDelete",
"APIKeyScopeOrganizationRead",
"APIKeyScopeOrganizationUpdate",
"APIKeyScopeOrganizationMemberAll",
"APIKeyScopeOrganizationMemberCreate",
"APIKeyScopeOrganizationMemberDelete",
"APIKeyScopeOrganizationMemberRead",
"APIKeyScopeOrganizationMemberUpdate",
"APIKeyScopePrebuiltWorkspaceAll",
"APIKeyScopePrebuiltWorkspaceDelete",
"APIKeyScopePrebuiltWorkspaceUpdate",
"APIKeyScopeProvisionerDaemonAll",
"APIKeyScopeProvisionerDaemonCreate",
"APIKeyScopeProvisionerDaemonDelete",
"APIKeyScopeProvisionerDaemonRead",
"APIKeyScopeProvisionerDaemonUpdate",
"APIKeyScopeProvisionerJobsAll",
"APIKeyScopeProvisionerJobsCreate",
"APIKeyScopeProvisionerJobsRead",
"APIKeyScopeProvisionerJobsUpdate",
"APIKeyScopeReplicasAll",
"APIKeyScopeReplicasRead",
"APIKeyScopeSystemAll",
"APIKeyScopeSystemCreate",
"APIKeyScopeSystemDelete",
"APIKeyScopeSystemRead",
"APIKeyScopeSystemUpdate",
"APIKeyScopeTailnetCoordinatorAll",
"APIKeyScopeTailnetCoordinatorCreate",
"APIKeyScopeTailnetCoordinatorDelete",
"APIKeyScopeTailnetCoordinatorRead",
"APIKeyScopeTailnetCoordinatorUpdate",
"APIKeyScopeTaskAll",
"APIKeyScopeTaskCreate",
"APIKeyScopeTaskDelete",
"APIKeyScopeTaskRead",
"APIKeyScopeTaskUpdate",
"APIKeyScopeTemplateAll",
"APIKeyScopeTemplateCreate",
"APIKeyScopeTemplateDelete",
"APIKeyScopeTemplateRead",
"APIKeyScopeTemplateUpdate",
"APIKeyScopeTemplateUse",
"APIKeyScopeTemplateViewInsights",
"APIKeyScopeUsageEventAll",
"APIKeyScopeUsageEventCreate",
"APIKeyScopeUsageEventRead",
"APIKeyScopeUsageEventUpdate",
"APIKeyScopeUserAll",
"APIKeyScopeUserCreate",
"APIKeyScopeUserDelete",
"APIKeyScopeUserRead",
"APIKeyScopeUserReadPersonal",
"APIKeyScopeUserUpdate",
"APIKeyScopeUserUpdatePersonal",
"APIKeyScopeUserSecretAll",
"APIKeyScopeUserSecretCreate",
"APIKeyScopeUserSecretDelete",
"APIKeyScopeUserSecretRead",
"APIKeyScopeUserSecretUpdate",
"APIKeyScopeWebpushSubscriptionAll",
"APIKeyScopeWebpushSubscriptionCreate",
"APIKeyScopeWebpushSubscriptionDelete",
"APIKeyScopeWebpushSubscriptionRead",
"APIKeyScopeWorkspaceAll",
"APIKeyScopeWorkspaceApplicationConnect",
"APIKeyScopeWorkspaceCreate",
"APIKeyScopeWorkspaceCreateAgent",
"APIKeyScopeWorkspaceDelete",
"APIKeyScopeWorkspaceDeleteAgent",
"APIKeyScopeWorkspaceRead",
"APIKeyScopeWorkspaceSsh",
"APIKeyScopeWorkspaceStart",
"APIKeyScopeWorkspaceStop",
"APIKeyScopeWorkspaceUpdate"
"APIKeyScopeWorkspaceUpdate",
"APIKeyScopeWorkspaceAgentDevcontainersAll",
"APIKeyScopeWorkspaceAgentDevcontainersCreate",
"APIKeyScopeWorkspaceAgentResourceMonitorAll",
"APIKeyScopeWorkspaceAgentResourceMonitorCreate",
"APIKeyScopeWorkspaceAgentResourceMonitorRead",
"APIKeyScopeWorkspaceAgentResourceMonitorUpdate",
"APIKeyScopeWorkspaceDormantAll",
"APIKeyScopeWorkspaceDormantApplicationConnect",
"APIKeyScopeWorkspaceDormantCreate",
"APIKeyScopeWorkspaceDormantCreateAgent",
"APIKeyScopeWorkspaceDormantDelete",
"APIKeyScopeWorkspaceDormantDeleteAgent",
"APIKeyScopeWorkspaceDormantRead",
"APIKeyScopeWorkspaceDormantSsh",
"APIKeyScopeWorkspaceDormantStart",
"APIKeyScopeWorkspaceDormantStop",
"APIKeyScopeWorkspaceDormantUpdate",
"APIKeyScopeWorkspaceProxyAll",
"APIKeyScopeWorkspaceProxyCreate",
"APIKeyScopeWorkspaceProxyDelete",
"APIKeyScopeWorkspaceProxyRead",
"APIKeyScopeWorkspaceProxyUpdate"
]
},
"codersdk.AddLicenseRequest": {
@@ -12416,6 +13004,25 @@ const docTemplate = `{
}
}
},
"codersdk.CreateTaskRequest": {
"type": "object",
"properties": {
"input": {
"type": "string"
},
"name": {
"type": "string"
},
"template_version_id": {
"type": "string",
"format": "uuid"
},
"template_version_preset_id": {
"type": "string",
"format": "uuid"
}
}
},
"codersdk.CreateTemplateRequest": {
"type": "object",
"required": [
@@ -12670,6 +13277,12 @@ const docTemplate = `{
"codersdk.CreateTokenRequest": {
"type": "object",
"properties": {
"allow_list": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.APIAllowListTarget"
}
},
"lifetime": {
"type": "integer"
},
@@ -15974,6 +16587,10 @@ const docTemplate = `{
"type": "string",
"format": "uuid"
},
"initiator_id": {
"type": "string",
"format": "uuid"
},
"input": {
"$ref": "#/definitions/codersdk.ProvisionerJobInput"
},
@@ -16370,6 +16987,7 @@ const docTemplate = `{
"replicas",
"system",
"tailnet_coordinator",
"task",
"template",
"usage_event",
"user",
@@ -16413,6 +17031,7 @@ const docTemplate = `{
"ResourceReplicas",
"ResourceSystem",
"ResourceTailnetCoordinator",
"ResourceTask",
"ResourceTemplate",
"ResourceUsageEvent",
"ResourceUser",
@@ -16628,7 +17247,8 @@ const docTemplate = `{
"idp_sync_settings_group",
"idp_sync_settings_role",
"workspace_agent",
"workspace_app"
"workspace_app",
"task"
],
"x-enum-varnames": [
"ResourceTypeTemplate",
@@ -16655,7 +17275,8 @@ const docTemplate = `{
"ResourceTypeIdpSyncSettingsGroup",
"ResourceTypeIdpSyncSettingsRole",
"ResourceTypeWorkspaceAgent",
"ResourceTypeWorkspaceApp"
"ResourceTypeWorkspaceApp",
"ResourceTypeTask"
]
},
"codersdk.Response": {
@@ -16911,6 +17532,189 @@ const docTemplate = `{
}
}
},
"codersdk.Task": {
"type": "object",
"properties": {
"created_at": {
"type": "string",
"format": "date-time"
},
"current_state": {
"$ref": "#/definitions/codersdk.TaskStateEntry"
},
"id": {
"type": "string",
"format": "uuid"
},
"initial_prompt": {
"type": "string"
},
"name": {
"type": "string"
},
"organization_id": {
"type": "string",
"format": "uuid"
},
"owner_id": {
"type": "string",
"format": "uuid"
},
"owner_name": {
"type": "string"
},
"status": {
"enum": [
"pending",
"starting",
"running",
"stopping",
"stopped",
"failed",
"canceling",
"canceled",
"deleting",
"deleted"
],
"allOf": [
{
"$ref": "#/definitions/codersdk.WorkspaceStatus"
}
]
},
"template_display_name": {
"type": "string"
},
"template_icon": {
"type": "string"
},
"template_id": {
"type": "string",
"format": "uuid"
},
"template_name": {
"type": "string"
},
"updated_at": {
"type": "string",
"format": "date-time"
},
"workspace_agent_health": {
"$ref": "#/definitions/codersdk.WorkspaceAgentHealth"
},
"workspace_agent_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
},
"workspace_agent_lifecycle": {
"$ref": "#/definitions/codersdk.WorkspaceAgentLifecycle"
},
"workspace_app_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
},
"workspace_build_number": {
"type": "integer"
},
"workspace_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
}
}
},
"codersdk.TaskLogEntry": {
"type": "object",
"properties": {
"content": {
"type": "string"
},
"id": {
"type": "integer"
},
"time": {
"type": "string",
"format": "date-time"
},
"type": {
"$ref": "#/definitions/codersdk.TaskLogType"
}
}
},
"codersdk.TaskLogType": {
"type": "string",
"enum": [
"input",
"output"
],
"x-enum-varnames": [
"TaskLogTypeInput",
"TaskLogTypeOutput"
]
},
"codersdk.TaskLogsResponse": {
"type": "object",
"properties": {
"logs": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.TaskLogEntry"
}
}
}
},
"codersdk.TaskSendRequest": {
"type": "object",
"properties": {
"input": {
"type": "string"
}
}
},
"codersdk.TaskState": {
"type": "string",
"enum": [
"working",
"idle",
"complete",
"failed"
],
"x-enum-varnames": [
"TaskStateWorking",
"TaskStateIdle",
"TaskStateComplete",
"TaskStateFailed"
]
},
"codersdk.TaskStateEntry": {
"type": "object",
"properties": {
"message": {
"type": "string"
},
"state": {
"$ref": "#/definitions/codersdk.TaskState"
},
"timestamp": {
"type": "string",
"format": "date-time"
},
"uri": {
"type": "string"
}
}
},
"codersdk.TelemetryConfig": {
"type": "object",
"properties": {
@@ -106,6 +106,244 @@
}
}
},
"/api/experimental/tasks": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "List AI tasks",
"operationId": "list-tasks",
"parameters": [
{
"type": "string",
"description": "Search query for filtering tasks",
"name": "q",
"in": "query"
},
{
"type": "string",
"description": "Return tasks after this ID for pagination",
"name": "after_id",
"in": "query"
},
{
"maximum": 100,
"minimum": 1,
"type": "integer",
"default": 25,
"description": "Maximum number of tasks to return",
"name": "limit",
"in": "query"
},
{
"minimum": 0,
"type": "integer",
"default": 0,
"description": "Offset for pagination",
"name": "offset",
"in": "query"
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/coderd.tasksListResponse"
}
}
}
}
},
"/api/experimental/tasks/{user}": {
"post": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "Create a new AI task",
"operationId": "create-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"description": "Create task request",
"name": "request",
"in": "body",
"required": true,
"schema": {
"$ref": "#/definitions/codersdk.CreateTaskRequest"
}
}
],
"responses": {
"201": {
"description": "Created",
"schema": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
}
},
"/api/experimental/tasks/{user}/{id}": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "Get AI task by ID",
"operationId": "get-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
},
"delete": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "Delete AI task by ID",
"operationId": "delete-task",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"202": {
"description": "Task deletion initiated"
}
}
}
},
"/api/experimental/tasks/{user}/{id}/logs": {
"get": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "Get AI task logs",
"operationId": "get-task-logs",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/codersdk.TaskLogsResponse"
}
}
}
}
},
"/api/experimental/tasks/{user}/{id}/send": {
"post": {
"security": [
{
"CoderSessionToken": []
}
],
"tags": ["Experimental"],
"summary": "Send input to AI task",
"operationId": "send-task-input",
"parameters": [
{
"type": "string",
"description": "Username, user ID, or 'me' for the authenticated user",
"name": "user",
"in": "path",
"required": true
},
{
"type": "string",
"format": "uuid",
"description": "Task ID",
"name": "id",
"in": "path",
"required": true
},
{
"description": "Task input request",
"name": "request",
"in": "body",
"required": true,
"schema": {
"$ref": "#/definitions/codersdk.TaskSendRequest"
}
}
],
"responses": {
"204": {
"description": "Input sent successfully"
}
}
}
},
"/appearance": {
"get": {
"security": [
@@ -3299,6 +3537,13 @@
"description": "Provisioner tags to filter by (JSON of the form {'tag1':'value1','tag2':'value2'})",
"name": "tags",
"in": "query"
},
{
"type": "string",
"format": "uuid",
"description": "Filter results by initiator",
"name": "initiator",
"in": "query"
}
],
"responses": {
@@ -9953,6 +10198,20 @@
}
}
},
"coderd.tasksListResponse": {
"type": "object",
"properties": {
"count": {
"type": "integer"
},
"tasks": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.Task"
}
}
}
},
"codersdk.ACLAvailable": {
"type": "object",
"properties": {
@@ -10166,6 +10425,17 @@
}
}
},
"codersdk.APIAllowListTarget": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"type": {
"$ref": "#/definitions/codersdk.RBACResource"
}
}
},
"codersdk.APIKey": {
"type": "object",
"required": [
@@ -10239,11 +10509,29 @@
"enum": [
"all",
"application_connect",
"aibridge_interception:*",
"aibridge_interception:create",
"aibridge_interception:read",
"aibridge_interception:update",
"api_key:*",
"api_key:create",
"api_key:delete",
"api_key:read",
"api_key:update",
"assign_org_role:*",
"assign_org_role:assign",
"assign_org_role:create",
"assign_org_role:delete",
"assign_org_role:read",
"assign_org_role:unassign",
"assign_org_role:update",
"assign_role:*",
"assign_role:assign",
"assign_role:read",
"assign_role:unassign",
"audit_log:*",
"audit_log:create",
"audit_log:read",
"coder:all",
"coder:apikeys.manage_self",
"coder:application_connect",
@@ -10253,40 +10541,193 @@
"coder:workspaces.create",
"coder:workspaces.delete",
"coder:workspaces.operate",
"connection_log:*",
"connection_log:read",
"connection_log:update",
"crypto_key:*",
"crypto_key:create",
"crypto_key:delete",
"crypto_key:read",
"crypto_key:update",
"debug_info:*",
"debug_info:read",
"deployment_config:*",
"deployment_config:read",
"deployment_config:update",
"deployment_stats:*",
"deployment_stats:read",
"file:*",
"file:create",
"file:read",
"group:*",
"group:create",
"group:delete",
"group:read",
"group:update",
"group_member:*",
"group_member:read",
"idpsync_settings:*",
"idpsync_settings:read",
"idpsync_settings:update",
"inbox_notification:*",
"inbox_notification:create",
"inbox_notification:read",
"inbox_notification:update",
"license:*",
"license:create",
"license:delete",
"license:read",
"notification_message:*",
"notification_message:create",
"notification_message:delete",
"notification_message:read",
"notification_message:update",
"notification_preference:*",
"notification_preference:read",
"notification_preference:update",
"notification_template:*",
"notification_template:read",
"notification_template:update",
"oauth2_app:*",
"oauth2_app:create",
"oauth2_app:delete",
"oauth2_app:read",
"oauth2_app:update",
"oauth2_app_code_token:*",
"oauth2_app_code_token:create",
"oauth2_app_code_token:delete",
"oauth2_app_code_token:read",
"oauth2_app_secret:*",
"oauth2_app_secret:create",
"oauth2_app_secret:delete",
"oauth2_app_secret:read",
"oauth2_app_secret:update",
"organization:*",
"organization:create",
"organization:delete",
"organization:read",
"organization:update",
"organization_member:*",
"organization_member:create",
"organization_member:delete",
"organization_member:read",
"organization_member:update",
"prebuilt_workspace:*",
"prebuilt_workspace:delete",
"prebuilt_workspace:update",
"provisioner_daemon:*",
"provisioner_daemon:create",
"provisioner_daemon:delete",
"provisioner_daemon:read",
"provisioner_daemon:update",
"provisioner_jobs:*",
"provisioner_jobs:create",
"provisioner_jobs:read",
"provisioner_jobs:update",
"replicas:*",
"replicas:read",
"system:*",
"system:create",
"system:delete",
"system:read",
"system:update",
"tailnet_coordinator:*",
"tailnet_coordinator:create",
"tailnet_coordinator:delete",
"tailnet_coordinator:read",
"tailnet_coordinator:update",
"task:*",
"task:create",
"task:delete",
"task:read",
"task:update",
"template:*",
"template:create",
"template:delete",
"template:read",
"template:update",
"template:use",
"template:view_insights",
"usage_event:*",
"usage_event:create",
"usage_event:read",
"usage_event:update",
"user:*",
"user:create",
"user:delete",
"user:read",
"user:read_personal",
"user:update",
"user:update_personal",
"user_secret:*",
"user_secret:create",
"user_secret:delete",
"user_secret:read",
"user_secret:update",
"webpush_subscription:*",
"webpush_subscription:create",
"webpush_subscription:delete",
"webpush_subscription:read",
"workspace:*",
"workspace:application_connect",
"workspace:create",
"workspace:create_agent",
"workspace:delete",
"workspace:delete_agent",
"workspace:read",
"workspace:ssh",
"workspace:start",
"workspace:stop",
"workspace:update"
"workspace:update",
"workspace_agent_devcontainers:*",
"workspace_agent_devcontainers:create",
"workspace_agent_resource_monitor:*",
"workspace_agent_resource_monitor:create",
"workspace_agent_resource_monitor:read",
"workspace_agent_resource_monitor:update",
"workspace_dormant:*",
"workspace_dormant:application_connect",
"workspace_dormant:create",
"workspace_dormant:create_agent",
"workspace_dormant:delete",
"workspace_dormant:delete_agent",
"workspace_dormant:read",
"workspace_dormant:ssh",
"workspace_dormant:start",
"workspace_dormant:stop",
"workspace_dormant:update",
"workspace_proxy:*",
"workspace_proxy:create",
"workspace_proxy:delete",
"workspace_proxy:read",
"workspace_proxy:update"
],
"x-enum-varnames": [
"APIKeyScopeAll",
"APIKeyScopeApplicationConnect",
"APIKeyScopeAibridgeInterceptionAll",
"APIKeyScopeAibridgeInterceptionCreate",
"APIKeyScopeAibridgeInterceptionRead",
"APIKeyScopeAibridgeInterceptionUpdate",
"APIKeyScopeApiKeyAll",
"APIKeyScopeApiKeyCreate",
"APIKeyScopeApiKeyDelete",
"APIKeyScopeApiKeyRead",
"APIKeyScopeApiKeyUpdate",
"APIKeyScopeAssignOrgRoleAll",
"APIKeyScopeAssignOrgRoleAssign",
"APIKeyScopeAssignOrgRoleCreate",
"APIKeyScopeAssignOrgRoleDelete",
"APIKeyScopeAssignOrgRoleRead",
"APIKeyScopeAssignOrgRoleUnassign",
"APIKeyScopeAssignOrgRoleUpdate",
"APIKeyScopeAssignRoleAll",
"APIKeyScopeAssignRoleAssign",
"APIKeyScopeAssignRoleRead",
"APIKeyScopeAssignRoleUnassign",
"APIKeyScopeAuditLogAll",
"APIKeyScopeAuditLogCreate",
"APIKeyScopeAuditLogRead",
"APIKeyScopeCoderAll",
"APIKeyScopeCoderApikeysManageSelf",
"APIKeyScopeCoderApplicationConnect",
@@ -10296,31 +10737,166 @@
"APIKeyScopeCoderWorkspacesCreate",
"APIKeyScopeCoderWorkspacesDelete",
"APIKeyScopeCoderWorkspacesOperate",
"APIKeyScopeConnectionLogAll",
"APIKeyScopeConnectionLogRead",
"APIKeyScopeConnectionLogUpdate",
"APIKeyScopeCryptoKeyAll",
"APIKeyScopeCryptoKeyCreate",
"APIKeyScopeCryptoKeyDelete",
"APIKeyScopeCryptoKeyRead",
"APIKeyScopeCryptoKeyUpdate",
"APIKeyScopeDebugInfoAll",
"APIKeyScopeDebugInfoRead",
"APIKeyScopeDeploymentConfigAll",
"APIKeyScopeDeploymentConfigRead",
"APIKeyScopeDeploymentConfigUpdate",
"APIKeyScopeDeploymentStatsAll",
"APIKeyScopeDeploymentStatsRead",
"APIKeyScopeFileAll",
"APIKeyScopeFileCreate",
"APIKeyScopeFileRead",
"APIKeyScopeGroupAll",
"APIKeyScopeGroupCreate",
"APIKeyScopeGroupDelete",
"APIKeyScopeGroupRead",
"APIKeyScopeGroupUpdate",
"APIKeyScopeGroupMemberAll",
"APIKeyScopeGroupMemberRead",
"APIKeyScopeIdpsyncSettingsAll",
"APIKeyScopeIdpsyncSettingsRead",
"APIKeyScopeIdpsyncSettingsUpdate",
"APIKeyScopeInboxNotificationAll",
"APIKeyScopeInboxNotificationCreate",
"APIKeyScopeInboxNotificationRead",
"APIKeyScopeInboxNotificationUpdate",
"APIKeyScopeLicenseAll",
"APIKeyScopeLicenseCreate",
"APIKeyScopeLicenseDelete",
"APIKeyScopeLicenseRead",
"APIKeyScopeNotificationMessageAll",
"APIKeyScopeNotificationMessageCreate",
"APIKeyScopeNotificationMessageDelete",
"APIKeyScopeNotificationMessageRead",
"APIKeyScopeNotificationMessageUpdate",
"APIKeyScopeNotificationPreferenceAll",
"APIKeyScopeNotificationPreferenceRead",
"APIKeyScopeNotificationPreferenceUpdate",
"APIKeyScopeNotificationTemplateAll",
"APIKeyScopeNotificationTemplateRead",
"APIKeyScopeNotificationTemplateUpdate",
"APIKeyScopeOauth2AppAll",
"APIKeyScopeOauth2AppCreate",
"APIKeyScopeOauth2AppDelete",
"APIKeyScopeOauth2AppRead",
"APIKeyScopeOauth2AppUpdate",
"APIKeyScopeOauth2AppCodeTokenAll",
"APIKeyScopeOauth2AppCodeTokenCreate",
"APIKeyScopeOauth2AppCodeTokenDelete",
"APIKeyScopeOauth2AppCodeTokenRead",
"APIKeyScopeOauth2AppSecretAll",
"APIKeyScopeOauth2AppSecretCreate",
"APIKeyScopeOauth2AppSecretDelete",
"APIKeyScopeOauth2AppSecretRead",
"APIKeyScopeOauth2AppSecretUpdate",
"APIKeyScopeOrganizationAll",
"APIKeyScopeOrganizationCreate",
"APIKeyScopeOrganizationDelete",
"APIKeyScopeOrganizationRead",
"APIKeyScopeOrganizationUpdate",
"APIKeyScopeOrganizationMemberAll",
"APIKeyScopeOrganizationMemberCreate",
"APIKeyScopeOrganizationMemberDelete",
"APIKeyScopeOrganizationMemberRead",
"APIKeyScopeOrganizationMemberUpdate",
"APIKeyScopePrebuiltWorkspaceAll",
"APIKeyScopePrebuiltWorkspaceDelete",
"APIKeyScopePrebuiltWorkspaceUpdate",
"APIKeyScopeProvisionerDaemonAll",
"APIKeyScopeProvisionerDaemonCreate",
"APIKeyScopeProvisionerDaemonDelete",
"APIKeyScopeProvisionerDaemonRead",
"APIKeyScopeProvisionerDaemonUpdate",
"APIKeyScopeProvisionerJobsAll",
"APIKeyScopeProvisionerJobsCreate",
"APIKeyScopeProvisionerJobsRead",
"APIKeyScopeProvisionerJobsUpdate",
"APIKeyScopeReplicasAll",
"APIKeyScopeReplicasRead",
"APIKeyScopeSystemAll",
"APIKeyScopeSystemCreate",
"APIKeyScopeSystemDelete",
"APIKeyScopeSystemRead",
"APIKeyScopeSystemUpdate",
"APIKeyScopeTailnetCoordinatorAll",
"APIKeyScopeTailnetCoordinatorCreate",
"APIKeyScopeTailnetCoordinatorDelete",
"APIKeyScopeTailnetCoordinatorRead",
"APIKeyScopeTailnetCoordinatorUpdate",
"APIKeyScopeTaskAll",
"APIKeyScopeTaskCreate",
"APIKeyScopeTaskDelete",
"APIKeyScopeTaskRead",
"APIKeyScopeTaskUpdate",
"APIKeyScopeTemplateAll",
"APIKeyScopeTemplateCreate",
"APIKeyScopeTemplateDelete",
"APIKeyScopeTemplateRead",
"APIKeyScopeTemplateUpdate",
"APIKeyScopeTemplateUse",
"APIKeyScopeTemplateViewInsights",
"APIKeyScopeUsageEventAll",
"APIKeyScopeUsageEventCreate",
"APIKeyScopeUsageEventRead",
"APIKeyScopeUsageEventUpdate",
"APIKeyScopeUserAll",
"APIKeyScopeUserCreate",
"APIKeyScopeUserDelete",
"APIKeyScopeUserRead",
"APIKeyScopeUserReadPersonal",
"APIKeyScopeUserUpdate",
"APIKeyScopeUserUpdatePersonal",
"APIKeyScopeUserSecretAll",
"APIKeyScopeUserSecretCreate",
"APIKeyScopeUserSecretDelete",
"APIKeyScopeUserSecretRead",
"APIKeyScopeUserSecretUpdate",
"APIKeyScopeWebpushSubscriptionAll",
"APIKeyScopeWebpushSubscriptionCreate",
"APIKeyScopeWebpushSubscriptionDelete",
"APIKeyScopeWebpushSubscriptionRead",
"APIKeyScopeWorkspaceAll",
"APIKeyScopeWorkspaceApplicationConnect",
"APIKeyScopeWorkspaceCreate",
"APIKeyScopeWorkspaceCreateAgent",
"APIKeyScopeWorkspaceDelete",
"APIKeyScopeWorkspaceDeleteAgent",
"APIKeyScopeWorkspaceRead",
"APIKeyScopeWorkspaceSsh",
"APIKeyScopeWorkspaceStart",
"APIKeyScopeWorkspaceStop",
"APIKeyScopeWorkspaceUpdate"
"APIKeyScopeWorkspaceUpdate",
"APIKeyScopeWorkspaceAgentDevcontainersAll",
"APIKeyScopeWorkspaceAgentDevcontainersCreate",
"APIKeyScopeWorkspaceAgentResourceMonitorAll",
"APIKeyScopeWorkspaceAgentResourceMonitorCreate",
"APIKeyScopeWorkspaceAgentResourceMonitorRead",
"APIKeyScopeWorkspaceAgentResourceMonitorUpdate",
"APIKeyScopeWorkspaceDormantAll",
"APIKeyScopeWorkspaceDormantApplicationConnect",
"APIKeyScopeWorkspaceDormantCreate",
"APIKeyScopeWorkspaceDormantCreateAgent",
"APIKeyScopeWorkspaceDormantDelete",
"APIKeyScopeWorkspaceDormantDeleteAgent",
"APIKeyScopeWorkspaceDormantRead",
"APIKeyScopeWorkspaceDormantSsh",
"APIKeyScopeWorkspaceDormantStart",
"APIKeyScopeWorkspaceDormantStop",
"APIKeyScopeWorkspaceDormantUpdate",
"APIKeyScopeWorkspaceProxyAll",
"APIKeyScopeWorkspaceProxyCreate",
"APIKeyScopeWorkspaceProxyDelete",
"APIKeyScopeWorkspaceProxyRead",
"APIKeyScopeWorkspaceProxyUpdate"
]
},
"codersdk.AddLicenseRequest": {
@@ -11094,6 +11670,25 @@
}
}
},
"codersdk.CreateTaskRequest": {
"type": "object",
"properties": {
"input": {
"type": "string"
},
"name": {
"type": "string"
},
"template_version_id": {
"type": "string",
"format": "uuid"
},
"template_version_preset_id": {
"type": "string",
"format": "uuid"
}
}
},
"codersdk.CreateTemplateRequest": {
"type": "object",
"required": ["name", "template_version_id"],
@@ -11327,6 +11922,12 @@
"codersdk.CreateTokenRequest": {
"type": "object",
"properties": {
"allow_list": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.APIAllowListTarget"
}
},
"lifetime": {
"type": "integer"
},
@@ -14532,6 +15133,10 @@
"type": "string",
"format": "uuid"
},
"initiator_id": {
"type": "string",
"format": "uuid"
},
"input": {
"$ref": "#/definitions/codersdk.ProvisionerJobInput"
},
@@ -14904,6 +15509,7 @@
"replicas",
"system",
"tailnet_coordinator",
"task",
"template",
"usage_event",
"user",
@@ -14947,6 +15553,7 @@
"ResourceReplicas",
"ResourceSystem",
"ResourceTailnetCoordinator",
"ResourceTask",
"ResourceTemplate",
"ResourceUsageEvent",
"ResourceUser",
@@ -15152,7 +15759,8 @@
"idp_sync_settings_group",
"idp_sync_settings_role",
"workspace_agent",
"workspace_app"
"workspace_app",
"task"
],
"x-enum-varnames": [
"ResourceTypeTemplate",
@@ -15179,7 +15787,8 @@
"ResourceTypeIdpSyncSettingsGroup",
"ResourceTypeIdpSyncSettingsRole",
"ResourceTypeWorkspaceAgent",
"ResourceTypeWorkspaceApp"
"ResourceTypeWorkspaceApp",
"ResourceTypeTask"
]
},
"codersdk.Response": {
@@ -15431,6 +16040,178 @@
}
}
},
"codersdk.Task": {
"type": "object",
"properties": {
"created_at": {
"type": "string",
"format": "date-time"
},
"current_state": {
"$ref": "#/definitions/codersdk.TaskStateEntry"
},
"id": {
"type": "string",
"format": "uuid"
},
"initial_prompt": {
"type": "string"
},
"name": {
"type": "string"
},
"organization_id": {
"type": "string",
"format": "uuid"
},
"owner_id": {
"type": "string",
"format": "uuid"
},
"owner_name": {
"type": "string"
},
"status": {
"enum": [
"pending",
"starting",
"running",
"stopping",
"stopped",
"failed",
"canceling",
"canceled",
"deleting",
"deleted"
],
"allOf": [
{
"$ref": "#/definitions/codersdk.WorkspaceStatus"
}
]
},
"template_display_name": {
"type": "string"
},
"template_icon": {
"type": "string"
},
"template_id": {
"type": "string",
"format": "uuid"
},
"template_name": {
"type": "string"
},
"updated_at": {
"type": "string",
"format": "date-time"
},
"workspace_agent_health": {
"$ref": "#/definitions/codersdk.WorkspaceAgentHealth"
},
"workspace_agent_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
},
"workspace_agent_lifecycle": {
"$ref": "#/definitions/codersdk.WorkspaceAgentLifecycle"
},
"workspace_app_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
},
"workspace_build_number": {
"type": "integer"
},
"workspace_id": {
"format": "uuid",
"allOf": [
{
"$ref": "#/definitions/uuid.NullUUID"
}
]
}
}
},
"codersdk.TaskLogEntry": {
"type": "object",
"properties": {
"content": {
"type": "string"
},
"id": {
"type": "integer"
},
"time": {
"type": "string",
"format": "date-time"
},
"type": {
"$ref": "#/definitions/codersdk.TaskLogType"
}
}
},
"codersdk.TaskLogType": {
"type": "string",
"enum": ["input", "output"],
"x-enum-varnames": ["TaskLogTypeInput", "TaskLogTypeOutput"]
},
"codersdk.TaskLogsResponse": {
"type": "object",
"properties": {
"logs": {
"type": "array",
"items": {
"$ref": "#/definitions/codersdk.TaskLogEntry"
}
}
}
},
"codersdk.TaskSendRequest": {
"type": "object",
"properties": {
"input": {
"type": "string"
}
}
},
"codersdk.TaskState": {
"type": "string",
"enum": ["working", "idle", "complete", "failed"],
"x-enum-varnames": [
"TaskStateWorking",
"TaskStateIdle",
"TaskStateComplete",
"TaskStateFailed"
]
},
"codersdk.TaskStateEntry": {
"type": "object",
"properties": {
"message": {
"type": "string"
},
"state": {
"$ref": "#/definitions/codersdk.TaskState"
},
"timestamp": {
"type": "string",
"format": "date-time"
},
"uri": {
"type": "string"
}
}
},
"codersdk.TelemetryConfig": {
"type": "object",
"properties": {
@@ -116,6 +116,37 @@ func (api *API) postToken(rw http.ResponseWriter, r *http.Request) {
TokenName: tokenName,
}
if len(createToken.AllowList) > 0 {
rbacAllowListElements := make([]rbac.AllowListElement, 0, len(createToken.AllowList))
for _, t := range createToken.AllowList {
entry, err := rbac.NewAllowListElement(string(t.Type), t.ID)
if err != nil {
httpapi.Write(ctx, rw, http.StatusBadRequest, codersdk.Response{
Message: "Failed to create API key.",
Detail: err.Error(),
})
return
}
rbacAllowListElements = append(rbacAllowListElements, entry)
}
rbacAllowList, err := rbac.NormalizeAllowList(rbacAllowListElements)
if err != nil {
httpapi.Write(ctx, rw, http.StatusBadRequest, codersdk.Response{
Message: "Failed to create API key.",
Detail: err.Error(),
})
return
}
dbAllowList := make(database.AllowList, 0, len(rbacAllowList))
for _, e := range rbacAllowList {
dbAllowList = append(dbAllowList, rbac.AllowListElement{Type: e.Type, ID: e.ID})
}
params.AllowList = dbAllowList
}
if createToken.Lifetime != 0 {
err := api.validateAPIKeyLifetime(ctx, user.ID, createToken.Lifetime)
if err != nil {
@@ -12,6 +12,7 @@ import (
"github.com/coder/coder/v2/coderd/database"
"github.com/coder/coder/v2/coderd/database/dbtime"
"github.com/coder/coder/v2/coderd/rbac/policy"
"github.com/coder/coder/v2/cryptorand"
)
@@ -34,6 +35,9 @@ type CreateParams struct {
Scopes database.APIKeyScopes
TokenName string
RemoteAddr string
// AllowList is an optional, normalized allow-list
// of resource type and uuid entries. If empty, defaults to wildcard.
AllowList database.AllowList
}
// Generate generates an API key, returning the key as a string as well as the
@@ -61,6 +65,10 @@ func Generate(params CreateParams) (database.InsertAPIKeyParams, string, error)
params.LifetimeSeconds = int64(time.Until(params.ExpiresAt).Seconds())
}
if len(params.AllowList) == 0 {
params.AllowList = database.AllowList{{Type: policy.WildcardSymbol, ID: policy.WildcardSymbol}}
}
ip := net.ParseIP(params.RemoteAddr)
if ip == nil {
ip = net.IPv4(0, 0, 0, 0)
@@ -115,7 +123,7 @@ func Generate(params CreateParams) (database.InsertAPIKeyParams, string, error)
HashedSecret: hashed[:],
LoginType: params.LoginType,
Scopes: scopes,
AllowList: database.AllowList{database.AllowListWildcard()},
AllowList: params.AllowList,
TokenName: params.TokenName,
}, token, nil
}
@@ -420,6 +420,14 @@ func (api *API) auditLogIsResourceDeleted(ctx context.Context, alog database.Get
api.Logger.Error(ctx, "unable to fetch oauth2 app secret", slog.Error(err))
}
return false
case database.ResourceTypeTask:
task, err := api.Database.GetTaskByID(ctx, alog.AuditLog.ResourceID)
if xerrors.Is(err, sql.ErrNoRows) {
return true
} else if err != nil {
api.Logger.Error(ctx, "unable to fetch task", slog.Error(err))
}
return task.DeletedAt.Valid && task.DeletedAt.Time.Before(time.Now())
default:
return false
}
@@ -496,6 +504,17 @@ func (api *API) auditLogResourceLink(ctx context.Context, alog database.GetAudit
}
return fmt.Sprintf("/deployment/oauth2-provider/apps/%s", secret.AppID)
case database.ResourceTypeTask:
task, err := api.Database.GetTaskByID(ctx, alog.AuditLog.ResourceID)
if err != nil {
return ""
}
workspace, err := api.Database.GetWorkspaceByID(ctx, task.WorkspaceID.UUID)
if err != nil {
return ""
}
return fmt.Sprintf("/tasks/%s/%s", workspace.OwnerName, task.Name)
default:
return ""
}
+2 -1
@@ -31,7 +31,8 @@ type Auditable interface {
database.NotificationTemplate |
idpsync.OrganizationSyncSettings |
idpsync.GroupSyncSettings |
idpsync.RoleSyncSettings
idpsync.RoleSyncSettings |
database.TaskTable
}
// Map is a map of changed fields in an audited resource. It maps field names to
+8
@@ -131,6 +131,8 @@ func ResourceTarget[T Auditable](tgt T) string {
return "Organization Group Sync"
case idpsync.RoleSyncSettings:
return "Organization Role Sync"
case database.TaskTable:
return typed.Name
default:
panic(fmt.Sprintf("unknown resource %T for ResourceTarget", tgt))
}
@@ -193,6 +195,8 @@ func ResourceID[T Auditable](tgt T) uuid.UUID {
return noID // Org field on audit log has org id
case idpsync.RoleSyncSettings:
return noID // Org field on audit log has org id
case database.TaskTable:
return typed.ID
default:
panic(fmt.Sprintf("unknown resource %T for ResourceID", tgt))
}
@@ -246,6 +250,8 @@ func ResourceType[T Auditable](tgt T) database.ResourceType {
return database.ResourceTypeIdpSyncSettingsRole
case idpsync.GroupSyncSettings:
return database.ResourceTypeIdpSyncSettingsGroup
case database.TaskTable:
return database.ResourceTypeTask
default:
panic(fmt.Sprintf("unknown resource %T for ResourceType", typed))
}
@@ -302,6 +308,8 @@ func ResourceRequiresOrgID[T Auditable]() bool {
return true
case idpsync.RoleSyncSettings:
return true
case database.TaskTable:
return true
default:
panic(fmt.Sprintf("unknown resource %T for ResourceRequiresOrgID", tgt))
}
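The audit helpers above extend a generics-plus-type-switch pattern: each type in the Auditable union is mapped to a target string, an ID, and a resource type in parallel switches. A condensed, self-contained sketch of that idiom (the `taskTable` and `user` types here are hypothetical stand-ins for the real `database` types):

```go
package main

import "fmt"

// Hypothetical auditable types standing in for database.TaskTable etc.
type taskTable struct {
	ID   string
	Name string
}

type user struct {
	ID       string
	Username string
}

// auditable constrains which types the helpers accept,
// mirroring the Auditable interface union in the diff.
type auditable interface {
	taskTable | user
}

// resourceTarget returns the human-readable target for an audit log
// entry, using the same type-switch-over-generic-value shape.
func resourceTarget[T auditable](tgt T) string {
	switch typed := any(tgt).(type) {
	case taskTable:
		return typed.Name
	case user:
		return typed.Username
	default:
		panic(fmt.Sprintf("unknown resource %T for resourceTarget", tgt))
	}
}

func main() {
	fmt.Println(resourceTarget(taskTable{ID: "1", Name: "build-docs"}))
	fmt.Println(resourceTarget(user{ID: "2", Username: "alice"}))
}
```

The compile-time union keeps the set of auditable types closed, while the runtime switch carries the per-type mapping; adding a type (as the diff does for TaskTable) means extending both.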
+1 -1
@@ -68,7 +68,7 @@ func AssertRBAC(t *testing.T, api *coderd.API, client *codersdk.Client) RBACAsse
ID: key.UserID.String(),
Roles: rbac.RoleIdentifiers(roleNames),
Groups: roles.Groups,
Scope: key.Scopes,
Scope: key.ScopeSet(),
},
Recorder: recorder,
}
+12 -4
@@ -62,10 +62,6 @@ func (m *FakeConnectionLogger) Contains(t testing.TB, expected database.UpsertCo
t.Logf("connection log %d: expected ID %s, got %s", idx+1, expected.ID, cl.ID)
continue
}
if !expected.Time.IsZero() && expected.Time != cl.Time {
t.Logf("connection log %d: expected Time %s, got %s", idx+1, expected.Time, cl.Time)
continue
}
if expected.OrganizationID != uuid.Nil && cl.OrganizationID != expected.OrganizationID {
t.Logf("connection log %d: expected OrganizationID %s, got %s", idx+1, expected.OrganizationID, cl.OrganizationID)
continue
@@ -114,6 +110,18 @@ func (m *FakeConnectionLogger) Contains(t testing.TB, expected database.UpsertCo
t.Logf("connection log %d: expected ConnectionID %s, got %s", idx+1, expected.ConnectionID.UUID, cl.ConnectionID.UUID)
continue
}
if expected.DisconnectReason.Valid && cl.DisconnectReason.String != expected.DisconnectReason.String {
t.Logf("connection log %d: expected DisconnectReason %s, got %s", idx+1, expected.DisconnectReason.String, cl.DisconnectReason.String)
continue
}
if !expected.Time.IsZero() && expected.Time != cl.Time {
t.Logf("connection log %d: expected Time %s, got %s", idx+1, expected.Time, cl.Time)
continue
}
if expected.ConnectionStatus != "" && expected.ConnectionStatus != cl.ConnectionStatus {
t.Logf("connection log %d: expected ConnectionStatus %s, got %s", idx+1, expected.ConnectionStatus, cl.ConnectionStatus)
continue
}
return true
}
+2 -2
@@ -9,10 +9,10 @@ const (
CheckOneTimePasscodeSet CheckConstraint = "one_time_passcode_set" // users
CheckUsersUsernameMinLength CheckConstraint = "users_username_min_length" // users
CheckMaxProvisionerLogsLength CheckConstraint = "max_provisioner_logs_length" // provisioner_jobs
CheckValidationMonotonicOrder CheckConstraint = "validation_monotonic_order" // template_version_parameters
CheckUsageEventTypeCheck CheckConstraint = "usage_event_type_check" // usage_events
CheckMaxLogsLength CheckConstraint = "max_logs_length" // workspace_agents
CheckSubsystemsNotNone CheckConstraint = "subsystems_not_none" // workspace_agents
CheckWorkspaceBuildsAiTaskSidebarAppIDRequired CheckConstraint = "workspace_builds_ai_task_sidebar_app_id_required" // workspace_builds
CheckWorkspaceBuildsDeadlineBelowMaxDeadline CheckConstraint = "workspace_builds_deadline_below_max_deadline" // workspace_builds
CheckValidationMonotonicOrder CheckConstraint = "validation_monotonic_order" // template_version_parameters
CheckUsageEventTypeCheck CheckConstraint = "usage_event_type_check" // usage_events
)
+2 -2
@@ -693,13 +693,13 @@ func SlimRoleFromName(name string) codersdk.SlimRole {
func RBACRole(role rbac.Role) codersdk.Role {
slim := SlimRole(role)
orgPerms := role.Org[slim.OrganizationID]
orgPerms := role.ByOrgID[slim.OrganizationID]
return codersdk.Role{
Name: slim.Name,
OrganizationID: slim.OrganizationID,
DisplayName: slim.DisplayName,
SitePermissions: List(role.Site, RBACPermission),
OrganizationPermissions: List(orgPerms, RBACPermission),
OrganizationPermissions: List(orgPerms.Org, RBACPermission),
UserPermissions: List(role.User, RBACPermission),
}
}
+77 -38
@@ -219,7 +219,9 @@ var (
rbac.ResourceUser.Type: {policy.ActionRead, policy.ActionReadPersonal, policy.ActionUpdatePersonal},
rbac.ResourceWorkspaceDormant.Type: {policy.ActionDelete, policy.ActionRead, policy.ActionUpdate, policy.ActionWorkspaceStop},
rbac.ResourceWorkspace.Type: {policy.ActionDelete, policy.ActionRead, policy.ActionUpdate, policy.ActionWorkspaceStart, policy.ActionWorkspaceStop, policy.ActionCreateAgent},
rbac.ResourceApiKey.Type: {policy.WildcardSymbol},
// Provisionerd needs to read and update tasks associated with workspaces.
rbac.ResourceTask.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceApiKey.Type: {policy.WildcardSymbol},
// When org scoped provisioner credentials are implemented,
// this can be reduced to read a specific org.
rbac.ResourceOrganization.Type: {policy.ActionRead},
@@ -232,8 +234,8 @@ var (
// Provisionerd creates usage events
rbac.ResourceUsageEvent.Type: {policy.ActionCreate},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -257,8 +259,8 @@ var (
rbac.ResourceWorkspace.Type: {policy.ActionDelete, policy.ActionRead, policy.ActionUpdate, policy.ActionWorkspaceStart, policy.ActionWorkspaceStop},
rbac.ResourceWorkspaceDormant.Type: {policy.ActionDelete, policy.ActionRead, policy.ActionUpdate, policy.ActionWorkspaceStop},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -274,13 +276,14 @@ var (
Identifier: rbac.RoleIdentifier{Name: "jobreaper"},
DisplayName: "Job Reaper Daemon",
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceSystem.Type: {policy.WildcardSymbol},
rbac.ResourceTemplate.Type: {policy.ActionRead},
rbac.ResourceWorkspace.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceProvisionerJobs.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceSystem.Type: {policy.WildcardSymbol},
rbac.ResourceTemplate.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceWorkspace.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceWorkspaceDormant.Type: {policy.ActionRead, policy.ActionUpdate},
rbac.ResourceProvisionerJobs.Type: {policy.ActionRead, policy.ActionUpdate},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -298,8 +301,8 @@ var (
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceCryptoKey.Type: {policy.WildcardSymbol},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -317,8 +320,8 @@ var (
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceCryptoKey.Type: {policy.WildcardSymbol},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -335,8 +338,8 @@ var (
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceConnectionLog.Type: {policy.ActionUpdate, policy.ActionRead},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -356,8 +359,8 @@ var (
rbac.ResourceWebpushSubscription.Type: {policy.ActionCreate, policy.ActionRead, policy.ActionUpdate, policy.ActionDelete},
rbac.ResourceDeploymentConfig.Type: {policy.ActionRead, policy.ActionUpdate}, // To read and upsert VAPID keys
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -375,8 +378,8 @@ var (
// The workspace monitor needs to be able to update monitors
rbac.ResourceWorkspaceAgentResourceMonitor.Type: {policy.ActionUpdate},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -392,12 +395,12 @@ var (
Identifier: rbac.RoleIdentifier{Name: "subagentapi"},
DisplayName: "Sub Agent API",
Site: []rbac.Permission{},
Org: map[string][]rbac.Permission{
orgID.String(): {},
},
User: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceWorkspace.Type: {policy.ActionRead, policy.ActionUpdate, policy.ActionCreateAgent, policy.ActionDeleteAgent},
}),
ByOrgID: map[string]rbac.OrgPermissions{
orgID.String(): {},
},
},
}),
Scope: rbac.ScopeAll,
@@ -436,8 +439,8 @@ var (
rbac.ResourceOauth2App.Type: {policy.ActionCreate, policy.ActionRead, policy.ActionUpdate, policy.ActionDelete},
rbac.ResourceOauth2AppSecret.Type: {policy.ActionCreate, policy.ActionRead, policy.ActionUpdate, policy.ActionDelete},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -454,8 +457,8 @@ var (
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceProvisionerDaemon.Type: {policy.ActionRead},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -531,8 +534,8 @@ var (
Site: rbac.Permissions(map[string][]policy.Action{
rbac.ResourceFile.Type: {policy.ActionRead},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -552,8 +555,8 @@ var (
// reads/processes them.
rbac.ResourceUsageEvent.Type: {policy.ActionRead, policy.ActionUpdate},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -576,8 +579,8 @@ var (
rbac.ResourceApiKey.Type: {policy.ActionRead}, // Validate API keys.
rbac.ResourceAibridgeInterception.Type: {policy.ActionCreate, policy.ActionRead, policy.ActionUpdate},
}),
Org: map[string][]rbac.Permission{},
User: []rbac.Permission{},
User: []rbac.Permission{},
ByOrgID: map[string]rbac.OrgPermissions{},
},
}),
Scope: rbac.ScopeAll,
@@ -1253,13 +1256,13 @@ func (q *querier) customRoleCheck(ctx context.Context, role database.CustomRole)
return xerrors.Errorf("invalid role: %w", err)
}
if len(rbacRole.Org) > 0 && len(rbacRole.Site) > 0 {
if len(rbacRole.ByOrgID) > 0 && len(rbacRole.Site) > 0 {
// This is a choice to keep roles simple. If we allow mixing site and org scoped perms, then knowing who can
// do what gets more complicated.
return xerrors.Errorf("invalid custom role, cannot assign both org and site permissions at the same time")
}
if len(rbacRole.Org) > 1 {
if len(rbacRole.ByOrgID) > 1 {
// Again to avoid more complexity in our roles
return xerrors.Errorf("invalid custom role, cannot assign permissions to more than 1 org at a time")
}
@@ -1272,8 +1275,8 @@ func (q *querier) customRoleCheck(ctx context.Context, role database.CustomRole)
}
}
for orgID, perms := range rbacRole.Org {
for _, orgPerm := range perms {
for orgID, perms := range rbacRole.ByOrgID {
for _, orgPerm := range perms.Org {
err := q.customRoleEscalationCheck(ctx, act, orgPerm, rbac.Object{OrgID: orgID, Type: orgPerm.ResourceType})
if err != nil {
return xerrors.Errorf("org=%q: %w", orgID, err)
@@ -2882,6 +2885,14 @@ func (q *querier) GetTailnetTunnelPeerIDs(ctx context.Context, srcID uuid.UUID)
return q.db.GetTailnetTunnelPeerIDs(ctx, srcID)
}
func (q *querier) GetTaskByID(ctx context.Context, id uuid.UUID) (database.Task, error) {
return fetch(q.log, q.auth, q.db.GetTaskByID)(ctx, id)
}
func (q *querier) GetTaskByWorkspaceID(ctx context.Context, workspaceID uuid.UUID) (database.Task, error) {
return fetch(q.log, q.auth, q.db.GetTaskByWorkspaceID)(ctx, workspaceID)
}
func (q *querier) GetTelemetryItem(ctx context.Context, key string) (database.TelemetryItem, error) {
if err := q.authorizeContext(ctx, policy.ActionRead, rbac.ResourceSystem); err != nil {
return database.TelemetryItem{}, err
@@ -4107,6 +4118,17 @@ func (q *querier) InsertReplica(ctx context.Context, arg database.InsertReplicaP
return q.db.InsertReplica(ctx, arg)
}
func (q *querier) InsertTask(ctx context.Context, arg database.InsertTaskParams) (database.TaskTable, error) {
// Ensure the actor can access the specified template version (and thus its template).
if _, err := q.GetTemplateVersionByID(ctx, arg.TemplateVersionID); err != nil {
return database.TaskTable{}, err
}
obj := rbac.ResourceTask.WithOwner(arg.OwnerID.String()).InOrg(arg.OrganizationID)
return insert(q.log, q.auth, obj, q.db.InsertTask)(ctx, arg)
}
func (q *querier) InsertTelemetryItemIfNotExists(ctx context.Context, arg database.InsertTelemetryItemIfNotExistsParams) error {
if err := q.authorizeContext(ctx, policy.ActionCreate, rbac.ResourceSystem); err != nil {
return err
@@ -4463,6 +4485,11 @@ func (q *querier) ListProvisionerKeysByOrganizationExcludeReserved(ctx context.C
return fetchWithPostFilter(q.auth, policy.ActionRead, q.db.ListProvisionerKeysByOrganizationExcludeReserved)(ctx, organizationID)
}
func (q *querier) ListTasks(ctx context.Context, arg database.ListTasksParams) ([]database.Task, error) {
// TODO(Cian): replace this with a sql filter for improved performance. https://github.com/coder/internal/issues/1061
return fetchWithPostFilter(q.auth, policy.ActionRead, q.db.ListTasks)(ctx, arg)
}
func (q *querier) ListUserSecrets(ctx context.Context, userID uuid.UUID) ([]database.UserSecret, error) {
obj := rbac.ResourceUserSecret.WithOwner(userID.String())
if err := q.authorizeContext(ctx, policy.ActionRead, obj); err != nil {
@@ -5665,6 +5692,18 @@ func (q *querier) UpsertTailnetTunnel(ctx context.Context, arg database.UpsertTa
return q.db.UpsertTailnetTunnel(ctx, arg)
}
func (q *querier) UpsertTaskWorkspaceApp(ctx context.Context, arg database.UpsertTaskWorkspaceAppParams) (database.TaskWorkspaceApp, error) {
// Fetch the task to derive the RBAC object and authorize update on it.
task, err := q.db.GetTaskByID(ctx, arg.TaskID)
if err != nil {
return database.TaskWorkspaceApp{}, err
}
if err := q.authorizeContext(ctx, policy.ActionUpdate, task); err != nil {
return database.TaskWorkspaceApp{}, err
}
return q.db.UpsertTaskWorkspaceApp(ctx, arg)
}
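UpsertTaskWorkspaceApp above follows the querier's usual fetch-authorize-delegate shape: load the parent object, check the action against it, then call through to the underlying store. A minimal sketch of that shape with a toy owner-only authorizer (all names here are hypothetical, not the real rbac API):

```go
package main

import (
	"errors"
	"fmt"
)

type task struct {
	ID      string
	OwnerID string
}

// authorize is a toy policy check: only the owner may update.
func authorize(actorID, action string, t task) error {
	if action == "update" && actorID != t.OwnerID {
		return errors.New("unauthorized")
	}
	return nil
}

type store struct {
	tasks map[string]task
}

// upsertTaskApp mirrors the querier pattern: fetch the task to derive
// the authorization object, check the action, then delegate.
func (s *store) upsertTaskApp(actorID, taskID string) error {
	t, ok := s.tasks[taskID]
	if !ok {
		return errors.New("not found")
	}
	if err := authorize(actorID, "update", t); err != nil {
		return err
	}
	// The real code delegates to q.db.UpsertTaskWorkspaceApp here.
	return nil
}

func main() {
	s := &store{tasks: map[string]task{"t1": {ID: "t1", OwnerID: "alice"}}}
	fmt.Println(s.upsertTaskApp("alice", "t1")) // owner: allowed
	fmt.Println(s.upsertTaskApp("bob", "t1"))   // non-owner: denied
}
```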
func (q *querier) UpsertTelemetryItem(ctx context.Context, arg database.UpsertTelemetryItemParams) error {
if err := q.authorizeContext(ctx, policy.ActionUpdate, rbac.ResourceSystem); err != nil {
return err
+95 -1
@@ -1639,10 +1639,43 @@ func (s *MethodTestSuite) TestUser() {
}
func (s *MethodTestSuite) TestWorkspace() {
// The Workspace object's RBAC type differs based on whether it is dormant
// or not, which is why we have two tests for it. To ensure we are actually
// testing the correct RBAC objects, we also explicitly create the expected
// object here rather than passing in the model.
s.Run("GetWorkspaceByID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
ws := testutil.Fake(s.T(), faker, database.Workspace{})
ws.DormantAt = sql.NullTime{
Time: time.Time{},
Valid: false,
}
// Ensure the RBAC is not the dormant type.
require.Equal(s.T(), rbac.ResourceWorkspace.Type, ws.RBACObject().Type)
dbm.EXPECT().GetWorkspaceByID(gomock.Any(), ws.ID).Return(ws, nil).AnyTimes()
check.Args(ws.ID).Asserts(ws, policy.ActionRead).Returns(ws)
// Explicitly create the expected object.
expected := rbac.ResourceWorkspace.WithID(ws.ID).
InOrg(ws.OrganizationID).
WithOwner(ws.OwnerID.String()).
WithGroupACL(ws.GroupACL.RBACACL()).
WithACLUserList(ws.UserACL.RBACACL())
check.Args(ws.ID).Asserts(expected, policy.ActionRead).Returns(ws)
}))
s.Run("DormantWorkspace/GetWorkspaceByID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
ws := testutil.Fake(s.T(), faker, database.Workspace{
DormantAt: sql.NullTime{
Time: time.Now().Add(-time.Hour),
Valid: true,
},
})
// Ensure the RBAC changed automatically.
require.Equal(s.T(), rbac.ResourceWorkspaceDormant.Type, ws.RBACObject().Type)
dbm.EXPECT().GetWorkspaceByID(gomock.Any(), ws.ID).Return(ws, nil).AnyTimes()
// Explicitly create the expected object.
expected := rbac.ResourceWorkspaceDormant.
WithID(ws.ID).
InOrg(ws.OrganizationID).
WithOwner(ws.OwnerID.String())
check.Args(ws.ID).Asserts(expected, policy.ActionRead).Returns(ws)
}))
s.Run("GetWorkspaceByResourceID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
ws := testutil.Fake(s.T(), faker, database.Workspace{})
@@ -2314,6 +2347,65 @@ func (s *MethodTestSuite) TestWorkspacePortSharing() {
}))
}
func (s *MethodTestSuite) TestTasks() {
s.Run("GetTaskByID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
task := testutil.Fake(s.T(), faker, database.Task{})
dbm.EXPECT().GetTaskByID(gomock.Any(), task.ID).Return(task, nil).AnyTimes()
check.Args(task.ID).Asserts(task, policy.ActionRead).Returns(task)
}))
s.Run("InsertTask", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
tpl := testutil.Fake(s.T(), faker, database.Template{})
tv := testutil.Fake(s.T(), faker, database.TemplateVersion{
TemplateID: uuid.NullUUID{UUID: tpl.ID, Valid: true},
OrganizationID: tpl.OrganizationID,
})
arg := testutil.Fake(s.T(), faker, database.InsertTaskParams{
OrganizationID: tpl.OrganizationID,
TemplateVersionID: tv.ID,
})
dbm.EXPECT().GetTemplateVersionByID(gomock.Any(), tv.ID).Return(tv, nil).AnyTimes()
dbm.EXPECT().GetTemplateByID(gomock.Any(), tpl.ID).Return(tpl, nil).AnyTimes()
dbm.EXPECT().InsertTask(gomock.Any(), arg).Return(database.TaskTable{}, nil).AnyTimes()
check.Args(arg).Asserts(
tpl, policy.ActionRead,
rbac.ResourceTask.InOrg(arg.OrganizationID).WithOwner(arg.OwnerID.String()), policy.ActionCreate,
).Returns(database.TaskTable{})
}))
s.Run("UpsertTaskWorkspaceApp", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
task := testutil.Fake(s.T(), faker, database.Task{})
arg := database.UpsertTaskWorkspaceAppParams{
TaskID: task.ID,
WorkspaceBuildNumber: 1,
}
dbm.EXPECT().GetTaskByID(gomock.Any(), task.ID).Return(task, nil).AnyTimes()
dbm.EXPECT().UpsertTaskWorkspaceApp(gomock.Any(), arg).Return(database.TaskWorkspaceApp{}, nil).AnyTimes()
check.Args(arg).Asserts(task, policy.ActionUpdate).Returns(database.TaskWorkspaceApp{})
}))
s.Run("GetTaskByWorkspaceID", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
task := testutil.Fake(s.T(), faker, database.Task{})
task.WorkspaceID = uuid.NullUUID{UUID: uuid.New(), Valid: true}
dbm.EXPECT().GetTaskByWorkspaceID(gomock.Any(), task.WorkspaceID.UUID).Return(task, nil).AnyTimes()
check.Args(task.WorkspaceID.UUID).Asserts(task, policy.ActionRead).Returns(task)
}))
s.Run("ListTasks", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
u1 := testutil.Fake(s.T(), faker, database.User{})
u2 := testutil.Fake(s.T(), faker, database.User{})
org1 := testutil.Fake(s.T(), faker, database.Organization{})
org2 := testutil.Fake(s.T(), faker, database.Organization{})
_ = testutil.Fake(s.T(), faker, database.OrganizationMember{UserID: u1.ID, OrganizationID: org1.ID})
_ = testutil.Fake(s.T(), faker, database.OrganizationMember{UserID: u2.ID, OrganizationID: org2.ID})
t1 := testutil.Fake(s.T(), faker, database.Task{OwnerID: u1.ID})
t2 := testutil.Fake(s.T(), faker, database.Task{OwnerID: u2.ID})
dbm.EXPECT().ListTasks(gomock.Any(), gomock.Any()).Return([]database.Task{t1, t2}, nil).AnyTimes()
check.Args(database.ListTasksParams{}).Asserts(t1, policy.ActionRead, t2, policy.ActionRead).Returns([]database.Task{t1, t2})
}))
}
func (s *MethodTestSuite) TestProvisionerKeys() {
s.Run("InsertProvisionerKey", s.Mocked(func(dbm *dbmock.MockStore, faker *gofakeit.Faker, check *expects) {
org := testutil.Fake(s.T(), faker, database.Organization{})
@@ -2484,10 +2576,12 @@ func (s *MethodTestSuite) TestExtraMethods() {
ds, err := db.GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisioner(context.Background(), database.GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerParams{
OrganizationID: org.ID,
InitiatorID: uuid.Nil,
})
s.NoError(err, "get provisioner jobs by org")
check.Args(database.GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerParams{
OrganizationID: org.ID,
InitiatorID: uuid.Nil,
}).Asserts(j1, policy.ActionRead, j2, policy.ActionRead).Returns(ds)
}))
}
+9 -5
@@ -225,6 +225,10 @@ func (s *MethodTestSuite) SubtestWithDB(db database.Store, testCaseF func(db dat
if testCase.outputs != nil {
// Assert the required outputs
s.Equal(len(testCase.outputs), len(outputs), "method %q returned unexpected number of outputs", methodName)
cmpOptions := []cmp.Option{
// Equate nil and empty slices.
cmpopts.EquateEmpty(),
}
for i := range outputs {
a, b := testCase.outputs[i].Interface(), outputs[i].Interface()
@@ -232,10 +236,9 @@ func (s *MethodTestSuite) SubtestWithDB(db database.Store, testCaseF func(db dat
// first check if the values are equal with regard to order.
// If not, re-check disregarding order and show a nice diff
// output of the two values.
if !cmp.Equal(a, b, cmpopts.EquateEmpty()) {
if diff := cmp.Diff(a, b,
// Equate nil and empty slices.
cmpopts.EquateEmpty(),
if !cmp.Equal(a, b, cmpOptions...) {
diffOpts := append(
append([]cmp.Option{}, cmpOptions...),
// Allow slice order to be ignored.
cmpopts.SortSlices(func(a, b any) bool {
var ab, bb strings.Builder
@@ -247,7 +250,8 @@ func (s *MethodTestSuite) SubtestWithDB(db database.Store, testCaseF func(db dat
// https://github.com/google/go-cmp/issues/67
return ab.String() < bb.String()
}),
); diff != "" {
)
if diff := cmp.Diff(a, b, diffOpts...); diff != "" {
s.Failf("compare outputs failed", "method %q returned unexpected output %d (-want +got):\n%s", methodName, i, diff)
}
}
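The refactor above shares one cmpOptions slice between the ordered check and the order-insensitive diff: compare as-is first, and only fall back to a sorted comparison when that fails. The underlying idea can be sketched with the stdlib alone (go-cmp handles the real case via cmpopts.SortSlices with a string-key less function):

```go
package main

import (
	"fmt"
	"slices"
)

// equalIgnoringOrder reports whether a and b hold the same elements.
// It first checks order-preserving equality, then falls back to
// comparing sorted copies, like the two-phase check in the test suite.
func equalIgnoringOrder(a, b []string) bool {
	if slices.Equal(a, b) {
		return true // equal with order preserved
	}
	as := slices.Clone(a)
	bs := slices.Clone(b)
	slices.Sort(as)
	slices.Sort(bs)
	return slices.Equal(as, bs)
}

func main() {
	fmt.Println(equalIgnoringOrder([]string{"x", "y"}, []string{"y", "x"}))
	fmt.Println(equalIgnoringOrder([]string{"x"}, []string{"y"}))
}
```

Sorting by a rendered string key (as the suite does via fmt into a strings.Builder) sidesteps defining a natural ordering for arbitrary structs, at the cost of a stable-but-arbitrary sort order.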
+31
@@ -24,6 +24,7 @@ import (
"github.com/coder/coder/v2/coderd/rbac"
"github.com/coder/coder/v2/coderd/telemetry"
"github.com/coder/coder/v2/coderd/wspubsub"
"github.com/coder/coder/v2/codersdk"
"github.com/coder/coder/v2/provisionersdk"
sdkproto "github.com/coder/coder/v2/provisionersdk/proto"
)
@@ -55,6 +56,7 @@ type WorkspaceBuildBuilder struct {
params []database.WorkspaceBuildParameter
agentToken string
dispo workspaceBuildDisposition
taskAppID uuid.UUID
}
type workspaceBuildDisposition struct {
@@ -117,6 +119,27 @@ func (b WorkspaceBuildBuilder) WithAgent(mutations ...func([]*sdkproto.Agent) []
return b
}
func (b WorkspaceBuildBuilder) WithTask(seed *sdkproto.App) WorkspaceBuildBuilder {
//nolint: revive // returns modified struct
b.taskAppID = uuid.New()
if seed == nil {
seed = &sdkproto.App{}
}
return b.Params(database.WorkspaceBuildParameter{
Name: codersdk.AITaskPromptParameterName,
Value: "list me",
}).WithAgent(func(a []*sdkproto.Agent) []*sdkproto.Agent {
a[0].Apps = []*sdkproto.App{
{
Id: takeFirst(seed.Id, b.taskAppID.String()),
Slug: takeFirst(seed.Slug, "vcode"),
Url: takeFirst(seed.Url, ""),
},
}
return a
})
}
func (b WorkspaceBuildBuilder) Starting() WorkspaceBuildBuilder {
//nolint: revive // returns modified struct
b.dispo.starting = true
@@ -134,6 +157,14 @@ func (b WorkspaceBuildBuilder) Do() WorkspaceResponse {
b.seed.ID = uuid.New()
b.seed.JobID = jobID
if b.taskAppID != uuid.Nil {
b.seed.HasAITask = sql.NullBool{
Bool: true,
Valid: true,
}
b.seed.AITaskSidebarAppID = uuid.NullUUID{UUID: b.taskAppID, Valid: true}
}
resp := WorkspaceResponse{
AgentToken: b.agentToken,
}
+48 -1
@@ -27,6 +27,8 @@ import (
"github.com/coder/coder/v2/coderd/database/provisionerjobs"
"github.com/coder/coder/v2/coderd/database/pubsub"
"github.com/coder/coder/v2/coderd/rbac"
"github.com/coder/coder/v2/coderd/rbac/policy"
"github.com/coder/coder/v2/coderd/taskname"
"github.com/coder/coder/v2/codersdk"
"github.com/coder/coder/v2/cryptorand"
"github.com/coder/coder/v2/provisionerd/proto"
@@ -186,7 +188,7 @@ func APIKey(t testing.TB, db database.Store, seed database.APIKey, munge ...func
UpdatedAt: takeFirst(seed.UpdatedAt, dbtime.Now()),
LoginType: takeFirst(seed.LoginType, database.LoginTypePassword),
Scopes: takeFirstSlice([]database.APIKeyScope(seed.Scopes), []database.APIKeyScope{database.ApiKeyScopeCoderAll}),
AllowList: takeFirstSlice(seed.AllowList, database.AllowList{database.AllowListWildcard()}),
AllowList: takeFirstSlice(seed.AllowList, database.AllowList{{Type: policy.WildcardSymbol, ID: policy.WildcardSymbol}}),
TokenName: takeFirst(seed.TokenName),
}
for _, fn := range munge {
@@ -420,6 +422,14 @@ func Workspace(t testing.TB, db database.Store, orig database.WorkspaceTable) da
require.NoError(t, err, "set workspace as deleted")
workspace.Deleted = true
}
if orig.DormantAt.Valid {
_, err = db.UpdateWorkspaceDormantDeletingAt(genCtx, database.UpdateWorkspaceDormantDeletingAtParams{
ID: workspace.ID,
DormantAt: orig.DormantAt,
})
require.NoError(t, err, "set workspace as dormant")
workspace.DormantAt = orig.DormantAt
}
return workspace
}
@@ -1551,6 +1561,43 @@ func AIBridgeToolUsage(t testing.TB, db database.Store, seed database.InsertAIBr
return toolUsage
}
func Task(t testing.TB, db database.Store, orig database.TaskTable) database.TaskTable {
t.Helper()
parameters := orig.TemplateParameters
if parameters == nil {
parameters = json.RawMessage([]byte("{}"))
}
task, err := db.InsertTask(genCtx, database.InsertTaskParams{
OrganizationID: orig.OrganizationID,
OwnerID: orig.OwnerID,
Name: takeFirst(orig.Name, taskname.GenerateFallback()),
WorkspaceID: orig.WorkspaceID,
TemplateVersionID: orig.TemplateVersionID,
TemplateParameters: parameters,
Prompt: orig.Prompt,
CreatedAt: takeFirst(orig.CreatedAt, dbtime.Now()),
})
require.NoError(t, err, "failed to insert task")
return task
}
func TaskWorkspaceApp(t testing.TB, db database.Store, orig database.TaskWorkspaceApp) database.TaskWorkspaceApp {
t.Helper()
app, err := db.UpsertTaskWorkspaceApp(genCtx, database.UpsertTaskWorkspaceAppParams{
TaskID: orig.TaskID,
WorkspaceBuildNumber: orig.WorkspaceBuildNumber,
WorkspaceAgentID: orig.WorkspaceAgentID,
WorkspaceAppID: orig.WorkspaceAppID,
})
require.NoError(t, err, "failed to upsert task workspace app")
return app
}
func provisionerJobTiming(t testing.TB, db database.Store, seed database.ProvisionerJobTiming) database.ProvisionerJobTiming {
timing, err := db.InsertProvisionerJobTimings(genCtx, database.InsertProvisionerJobTimingsParams{
JobID: takeFirst(seed.JobID, uuid.New()),
+35
@@ -1482,6 +1482,20 @@ func (m queryMetricsStore) GetTailnetTunnelPeerIDs(ctx context.Context, srcID uu
return r0, r1
}
func (m queryMetricsStore) GetTaskByID(ctx context.Context, id uuid.UUID) (database.Task, error) {
start := time.Now()
r0, r1 := m.s.GetTaskByID(ctx, id)
m.queryLatencies.WithLabelValues("GetTaskByID").Observe(time.Since(start).Seconds())
return r0, r1
}
func (m queryMetricsStore) GetTaskByWorkspaceID(ctx context.Context, workspaceID uuid.UUID) (database.Task, error) {
start := time.Now()
r0, r1 := m.s.GetTaskByWorkspaceID(ctx, workspaceID)
m.queryLatencies.WithLabelValues("GetTaskByWorkspaceID").Observe(time.Since(start).Seconds())
return r0, r1
}
func (m queryMetricsStore) GetTelemetryItem(ctx context.Context, key string) (database.TelemetryItem, error) {
start := time.Now()
r0, r1 := m.s.GetTelemetryItem(ctx, key)
@@ -2455,6 +2469,13 @@ func (m queryMetricsStore) InsertReplica(ctx context.Context, arg database.Inser
return replica, err
}
func (m queryMetricsStore) InsertTask(ctx context.Context, arg database.InsertTaskParams) (database.TaskTable, error) {
start := time.Now()
r0, r1 := m.s.InsertTask(ctx, arg)
m.queryLatencies.WithLabelValues("InsertTask").Observe(time.Since(start).Seconds())
return r0, r1
}
func (m queryMetricsStore) InsertTelemetryItemIfNotExists(ctx context.Context, arg database.InsertTelemetryItemIfNotExistsParams) error {
start := time.Now()
r0 := m.s.InsertTelemetryItemIfNotExists(ctx, arg)
@@ -2714,6 +2735,13 @@ func (m queryMetricsStore) ListProvisionerKeysByOrganizationExcludeReserved(ctx
return r0, r1
}
func (m queryMetricsStore) ListTasks(ctx context.Context, arg database.ListTasksParams) ([]database.Task, error) {
start := time.Now()
r0, r1 := m.s.ListTasks(ctx, arg)
m.queryLatencies.WithLabelValues("ListTasks").Observe(time.Since(start).Seconds())
return r0, r1
}
func (m queryMetricsStore) ListUserSecrets(ctx context.Context, userID uuid.UUID) ([]database.UserSecret, error) {
start := time.Now()
r0, r1 := m.s.ListUserSecrets(ctx, userID)
@@ -3533,6 +3561,13 @@ func (m queryMetricsStore) UpsertTailnetTunnel(ctx context.Context, arg database
return r0, r1
}
func (m queryMetricsStore) UpsertTaskWorkspaceApp(ctx context.Context, arg database.UpsertTaskWorkspaceAppParams) (database.TaskWorkspaceApp, error) {
start := time.Now()
r0, r1 := m.s.UpsertTaskWorkspaceApp(ctx, arg)
m.queryLatencies.WithLabelValues("UpsertTaskWorkspaceApp").Observe(time.Since(start).Seconds())
return r0, r1
}
func (m queryMetricsStore) UpsertTelemetryItem(ctx context.Context, arg database.UpsertTelemetryItemParams) error {
start := time.Now()
r0 := m.s.UpsertTelemetryItem(ctx, arg)
+75
@@ -3119,6 +3119,36 @@ func (mr *MockStoreMockRecorder) GetTailnetTunnelPeerIDs(ctx, srcID any) *gomock
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetTailnetTunnelPeerIDs", reflect.TypeOf((*MockStore)(nil).GetTailnetTunnelPeerIDs), ctx, srcID)
}
// GetTaskByID mocks base method.
func (m *MockStore) GetTaskByID(ctx context.Context, id uuid.UUID) (database.Task, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetTaskByID", ctx, id)
ret0, _ := ret[0].(database.Task)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// GetTaskByID indicates an expected call of GetTaskByID.
func (mr *MockStoreMockRecorder) GetTaskByID(ctx, id any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetTaskByID", reflect.TypeOf((*MockStore)(nil).GetTaskByID), ctx, id)
}
// GetTaskByWorkspaceID mocks base method.
func (m *MockStore) GetTaskByWorkspaceID(ctx context.Context, workspaceID uuid.UUID) (database.Task, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetTaskByWorkspaceID", ctx, workspaceID)
ret0, _ := ret[0].(database.Task)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// GetTaskByWorkspaceID indicates an expected call of GetTaskByWorkspaceID.
func (mr *MockStoreMockRecorder) GetTaskByWorkspaceID(ctx, workspaceID any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetTaskByWorkspaceID", reflect.TypeOf((*MockStore)(nil).GetTaskByWorkspaceID), ctx, workspaceID)
}
// GetTelemetryItem mocks base method.
func (m *MockStore) GetTelemetryItem(ctx context.Context, key string) (database.TelemetryItem, error) {
m.ctrl.T.Helper()
@@ -5244,6 +5274,21 @@ func (mr *MockStoreMockRecorder) InsertReplica(ctx, arg any) *gomock.Call {
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "InsertReplica", reflect.TypeOf((*MockStore)(nil).InsertReplica), ctx, arg)
}
// InsertTask mocks base method.
func (m *MockStore) InsertTask(ctx context.Context, arg database.InsertTaskParams) (database.TaskTable, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "InsertTask", ctx, arg)
ret0, _ := ret[0].(database.TaskTable)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// InsertTask indicates an expected call of InsertTask.
func (mr *MockStoreMockRecorder) InsertTask(ctx, arg any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "InsertTask", reflect.TypeOf((*MockStore)(nil).InsertTask), ctx, arg)
}
// InsertTelemetryItemIfNotExists mocks base method.
func (m *MockStore) InsertTelemetryItemIfNotExists(ctx context.Context, arg database.InsertTelemetryItemIfNotExistsParams) error {
m.ctrl.T.Helper()
@@ -5803,6 +5848,21 @@ func (mr *MockStoreMockRecorder) ListProvisionerKeysByOrganizationExcludeReserve
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "ListProvisionerKeysByOrganizationExcludeReserved", reflect.TypeOf((*MockStore)(nil).ListProvisionerKeysByOrganizationExcludeReserved), ctx, organizationID)
}
// ListTasks mocks base method.
func (m *MockStore) ListTasks(ctx context.Context, arg database.ListTasksParams) ([]database.Task, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "ListTasks", ctx, arg)
ret0, _ := ret[0].([]database.Task)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// ListTasks indicates an expected call of ListTasks.
func (mr *MockStoreMockRecorder) ListTasks(ctx, arg any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "ListTasks", reflect.TypeOf((*MockStore)(nil).ListTasks), ctx, arg)
}
// ListUserSecrets mocks base method.
func (m *MockStore) ListUserSecrets(ctx context.Context, userID uuid.UUID) ([]database.UserSecret, error) {
m.ctrl.T.Helper()
@@ -7517,6 +7577,21 @@ func (mr *MockStoreMockRecorder) UpsertTailnetTunnel(ctx, arg any) *gomock.Call
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "UpsertTailnetTunnel", reflect.TypeOf((*MockStore)(nil).UpsertTailnetTunnel), ctx, arg)
}
// UpsertTaskWorkspaceApp mocks base method.
func (m *MockStore) UpsertTaskWorkspaceApp(ctx context.Context, arg database.UpsertTaskWorkspaceAppParams) (database.TaskWorkspaceApp, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "UpsertTaskWorkspaceApp", ctx, arg)
ret0, _ := ret[0].(database.TaskWorkspaceApp)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// UpsertTaskWorkspaceApp indicates an expected call of UpsertTaskWorkspaceApp.
func (mr *MockStoreMockRecorder) UpsertTaskWorkspaceApp(ctx, arg any) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "UpsertTaskWorkspaceApp", reflect.TypeOf((*MockStore)(nil).UpsertTaskWorkspaceApp), ctx, arg)
}
// UpsertTelemetryItem mocks base method.
func (m *MockStore) UpsertTelemetryItem(ctx context.Context, arg database.UpsertTelemetryItemParams) error {
m.ctrl.T.Helper()
@@ -150,7 +150,7 @@ func (b *Broker) init(t TBSubset) error {
b.uuid = uuid.New()
ctx, cancel := context.WithTimeout(context.Background(), 20*time.Second)
defer cancel()
b.cleanerFD, err = startCleaner(ctx, b.uuid, coderTestingParams.DSN())
b.cleanerFD, err = startCleaner(ctx, t, b.uuid, coderTestingParams.DSN())
if err != nil {
return xerrors.Errorf("start test db cleaner: %w", err)
}
@@ -22,36 +22,43 @@ const (
cleanerRespOK = "OK"
envCleanerParentUUID = "DB_CLEANER_PARENT_UUID"
envCleanerDSN = "DB_CLEANER_DSN"
)
var (
originalWorkingDir string
errGettingWorkingDir error
envCleanerMagic = "DB_CLEANER_MAGIC"
envCleanerMagicValue = "XEHdJqWehWek8AaWwopy" // 20 random characters to make this collision resistant
)
func init() {
// We expect our tests to run from somewhere in the project tree where `go run` below in `startCleaner` will
// be able to resolve the command package. However, some of the tests modify the working directory during the run.
// So, we grab the working directory during package init, before tests are run, and then set that work dir on the
// subcommand process before it starts.
originalWorkingDir, errGettingWorkingDir = os.Getwd()
// We are hijacking the init() function here to do something very non-standard.
//
// We want to be able to run the cleaner as a subprocess of the test process so that it can outlive the test binary
// and still clean up, even if the test process times out or is killed. So, in startCleaner() below, which is
// called in the parent process, we exec our own binary and set a collision-resistant environment variable.
// Then here in the init(), which will run before main() and therefore before executing tests, we check for the
// environment variable, and if present we know this is the child process and we exec the cleaner. Instead of
// returning normally from init() we call os.Exit(). This prevents tests from being re-run in the child process (and
// recursion).
//
// If the magic value is not present, we know we are the parent process and init() returns normally.
magicValue := os.Getenv(envCleanerMagic)
if magicValue == envCleanerMagicValue {
RunCleaner()
os.Exit(0)
}
}
// startCleaner starts the cleaner in a subprocess. holdThis is an opaque reference that needs to be kept from being
// garbage collected until we are done with all test databases (e.g. the end of the process).
func startCleaner(ctx context.Context, parentUUID uuid.UUID, dsn string) (holdThis any, err error) {
cmd := exec.Command("go", "run", "github.com/coder/coder/v2/coderd/database/dbtestutil/cleanercmd")
func startCleaner(ctx context.Context, _ TBSubset, parentUUID uuid.UUID, dsn string) (holdThis any, err error) {
bin, err := os.Executable()
if err != nil {
return nil, xerrors.Errorf("could not get executable path: %w", err)
}
cmd := exec.Command(bin)
cmd.Env = append(os.Environ(),
fmt.Sprintf("%s=%s", envCleanerParentUUID, parentUUID.String()),
fmt.Sprintf("%s=%s", envCleanerDSN, dsn),
fmt.Sprintf("%s=%s", envCleanerMagic, envCleanerMagicValue),
)
// c.f. comment on `func init()` in this file.
if errGettingWorkingDir != nil {
return nil, xerrors.Errorf("failed to get working directory during init: %w", errGettingWorkingDir)
}
cmd.Dir = originalWorkingDir
// Here we don't actually use the reference to the stdin pipe, because we never write anything to it. When this
// process exits, the pipe is closed by the OS and this triggers the cleaner to do its cleaning work. But, we do
// need to hang on to a reference to it so that it doesn't get garbage collected and trigger cleanup early.
@@ -178,8 +185,7 @@ func (c *cleaner) waitAndClean() {
}
// RunCleaner runs the test database cleaning process. It takes no arguments but uses stdio and environment variables
// for its operation. It is designed to be launched as the only task of a `main()` process, but is included in this
// package to share constants with the parent code that launches it above.
// for its operation.
//
// The cleaner is designed to run in a separate process from the main test suite, connected over stdio. If the main test
// process ends (panics, times out, or is killed) without explicitly discarding the databases it clones, the cleaner
@@ -1,7 +0,0 @@
package main
import "github.com/coder/coder/v2/coderd/database/dbtestutil"
func main() {
dbtestutil.RunCleaner()
}

@@ -242,10 +242,11 @@ func PGDump(dbURL string) ([]byte, error) {
"PGCLIENTENCODING=UTF8",
"PGDATABASE=", // we should always specify the database name in the connection string
}
var stdout bytes.Buffer
var stdout, stderr bytes.Buffer
cmd.Stdout = &stdout
cmd.Stderr = &stderr
if err := cmd.Run(); err != nil {
return nil, xerrors.Errorf("exec pg_dump: %w", err)
return nil, xerrors.Errorf("exec pg_dump: %w\n%s", err, stderr.String())
}
return stdout.Bytes(), nil
}
@@ -166,6 +166,7 @@ type TBSubset interface {
Cleanup(func())
Helper()
Logf(format string, args ...any)
TempDir() string
}
// Open creates a new PostgreSQL database instance.
File diff suppressed because it is too large
@@ -157,7 +157,52 @@ CREATE TYPE api_key_scope AS ENUM (
'coder:workspaces.access',
'coder:templates.build',
'coder:templates.author',
'coder:apikeys.manage_self'
'coder:apikeys.manage_self',
'aibridge_interception:*',
'api_key:*',
'assign_org_role:*',
'assign_role:*',
'audit_log:*',
'connection_log:*',
'crypto_key:*',
'debug_info:*',
'deployment_config:*',
'deployment_stats:*',
'file:*',
'group:*',
'group_member:*',
'idpsync_settings:*',
'inbox_notification:*',
'license:*',
'notification_message:*',
'notification_preference:*',
'notification_template:*',
'oauth2_app:*',
'oauth2_app_code_token:*',
'oauth2_app_secret:*',
'organization:*',
'organization_member:*',
'prebuilt_workspace:*',
'provisioner_daemon:*',
'provisioner_jobs:*',
'replicas:*',
'system:*',
'tailnet_coordinator:*',
'template:*',
'usage_event:*',
'user:*',
'user_secret:*',
'webpush_subscription:*',
'workspace:*',
'workspace_agent_devcontainers:*',
'workspace_agent_resource_monitor:*',
'workspace_dormant:*',
'workspace_proxy:*',
'task:create',
'task:read',
'task:update',
'task:delete',
'task:*'
);
CREATE TYPE app_sharing_level AS ENUM (
@@ -415,7 +460,8 @@ CREATE TYPE resource_type AS ENUM (
'idp_sync_settings_role',
'workspace_agent',
'workspace_app',
'prebuilds_settings'
'prebuilds_settings',
'task'
);
CREATE TYPE startup_script_behavior AS ENUM (
@@ -432,6 +478,15 @@ CREATE TYPE tailnet_status AS ENUM (
'lost'
);
CREATE TYPE task_status AS ENUM (
'pending',
'initializing',
'active',
'paused',
'unknown',
'error'
);
CREATE TYPE user_status AS ENUM (
'active',
'suspended',
@@ -1751,9 +1806,9 @@ CREATE TABLE tailnet_tunnels (
CREATE TABLE task_workspace_apps (
task_id uuid NOT NULL,
workspace_build_id uuid NOT NULL,
workspace_agent_id uuid NOT NULL,
workspace_app_id uuid NOT NULL
workspace_agent_id uuid,
workspace_app_id uuid,
workspace_build_number integer NOT NULL
);
CREATE TABLE tasks (
@@ -1769,6 +1824,183 @@ CREATE TABLE tasks (
deleted_at timestamp with time zone
);
CREATE TABLE workspace_agents (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
updated_at timestamp with time zone NOT NULL,
name character varying(64) NOT NULL,
first_connected_at timestamp with time zone,
last_connected_at timestamp with time zone,
disconnected_at timestamp with time zone,
resource_id uuid NOT NULL,
auth_token uuid NOT NULL,
auth_instance_id character varying,
architecture character varying(64) NOT NULL,
environment_variables jsonb,
operating_system character varying(64) NOT NULL,
instance_metadata jsonb,
resource_metadata jsonb,
directory character varying(4096) DEFAULT ''::character varying NOT NULL,
version text DEFAULT ''::text NOT NULL,
last_connected_replica_id uuid,
connection_timeout_seconds integer DEFAULT 0 NOT NULL,
troubleshooting_url text DEFAULT ''::text NOT NULL,
motd_file text DEFAULT ''::text NOT NULL,
lifecycle_state workspace_agent_lifecycle_state DEFAULT 'created'::workspace_agent_lifecycle_state NOT NULL,
expanded_directory character varying(4096) DEFAULT ''::character varying NOT NULL,
logs_length integer DEFAULT 0 NOT NULL,
logs_overflowed boolean DEFAULT false NOT NULL,
started_at timestamp with time zone,
ready_at timestamp with time zone,
subsystems workspace_agent_subsystem[] DEFAULT '{}'::workspace_agent_subsystem[],
display_apps display_app[] DEFAULT '{vscode,vscode_insiders,web_terminal,ssh_helper,port_forwarding_helper}'::display_app[],
api_version text DEFAULT ''::text NOT NULL,
display_order integer DEFAULT 0 NOT NULL,
parent_id uuid,
api_key_scope agent_key_scope_enum DEFAULT 'all'::agent_key_scope_enum NOT NULL,
deleted boolean DEFAULT false NOT NULL,
CONSTRAINT max_logs_length CHECK ((logs_length <= 1048576)),
CONSTRAINT subsystems_not_none CHECK ((NOT ('none'::workspace_agent_subsystem = ANY (subsystems))))
);
COMMENT ON COLUMN workspace_agents.version IS 'Version tracks the version of the currently running workspace agent. Workspace agents register their version upon start.';
COMMENT ON COLUMN workspace_agents.connection_timeout_seconds IS 'Connection timeout in seconds, 0 means disabled.';
COMMENT ON COLUMN workspace_agents.troubleshooting_url IS 'URL for troubleshooting the agent.';
COMMENT ON COLUMN workspace_agents.motd_file IS 'Path to file inside workspace containing the message of the day (MOTD) to show to the user when logging in via SSH.';
COMMENT ON COLUMN workspace_agents.lifecycle_state IS 'The current lifecycle state reported by the workspace agent.';
COMMENT ON COLUMN workspace_agents.expanded_directory IS 'The resolved path of a user-specified directory. e.g. ~/coder -> /home/coder/coder';
COMMENT ON COLUMN workspace_agents.logs_length IS 'Total length of startup logs';
COMMENT ON COLUMN workspace_agents.logs_overflowed IS 'Whether the startup logs overflowed in length';
COMMENT ON COLUMN workspace_agents.started_at IS 'The time the agent entered the starting lifecycle state';
COMMENT ON COLUMN workspace_agents.ready_at IS 'The time the agent entered the ready or start_error lifecycle state';
COMMENT ON COLUMN workspace_agents.display_order IS 'Specifies the order in which to display agents in user interfaces.';
COMMENT ON COLUMN workspace_agents.api_key_scope IS 'Defines the scope of the API key associated with the agent. ''all'' allows access to everything, ''no_user_data'' restricts it to exclude user data.';
COMMENT ON COLUMN workspace_agents.deleted IS 'Indicates whether or not the agent has been deleted. This is currently only applicable to sub agents.';
CREATE TABLE workspace_apps (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
agent_id uuid NOT NULL,
display_name character varying(64) NOT NULL,
icon character varying(256) NOT NULL,
command character varying(65534),
url character varying(65534),
healthcheck_url text DEFAULT ''::text NOT NULL,
healthcheck_interval integer DEFAULT 0 NOT NULL,
healthcheck_threshold integer DEFAULT 0 NOT NULL,
health workspace_app_health DEFAULT 'disabled'::workspace_app_health NOT NULL,
subdomain boolean DEFAULT false NOT NULL,
sharing_level app_sharing_level DEFAULT 'owner'::app_sharing_level NOT NULL,
slug text NOT NULL,
external boolean DEFAULT false NOT NULL,
display_order integer DEFAULT 0 NOT NULL,
hidden boolean DEFAULT false NOT NULL,
open_in workspace_app_open_in DEFAULT 'slim-window'::workspace_app_open_in NOT NULL,
display_group text,
tooltip character varying(2048) DEFAULT ''::character varying NOT NULL
);
COMMENT ON COLUMN workspace_apps.display_order IS 'Specifies the order in which to display agent app in user interfaces.';
COMMENT ON COLUMN workspace_apps.hidden IS 'Determines if the app is not shown in user interfaces.';
COMMENT ON COLUMN workspace_apps.tooltip IS 'Markdown text that is displayed when hovering over workspace apps.';
CREATE TABLE workspace_builds (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
updated_at timestamp with time zone NOT NULL,
workspace_id uuid NOT NULL,
template_version_id uuid NOT NULL,
build_number integer NOT NULL,
transition workspace_transition NOT NULL,
initiator_id uuid NOT NULL,
provisioner_state bytea,
job_id uuid NOT NULL,
deadline timestamp with time zone DEFAULT '0001-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
reason build_reason DEFAULT 'initiator'::build_reason NOT NULL,
daily_cost integer DEFAULT 0 NOT NULL,
max_deadline timestamp with time zone DEFAULT '0001-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
template_version_preset_id uuid,
has_ai_task boolean,
ai_task_sidebar_app_id uuid,
has_external_agent boolean,
CONSTRAINT workspace_builds_ai_task_sidebar_app_id_required CHECK (((((has_ai_task IS NULL) OR (has_ai_task = false)) AND (ai_task_sidebar_app_id IS NULL)) OR ((has_ai_task = true) AND (ai_task_sidebar_app_id IS NOT NULL)))),
CONSTRAINT workspace_builds_deadline_below_max_deadline CHECK ((((deadline <> '0001-01-01 00:00:00+00'::timestamp with time zone) AND (deadline <= max_deadline)) OR (max_deadline = '0001-01-01 00:00:00+00'::timestamp with time zone)))
);
CREATE VIEW tasks_with_status AS
SELECT tasks.id,
tasks.organization_id,
tasks.owner_id,
tasks.name,
tasks.workspace_id,
tasks.template_version_id,
tasks.template_parameters,
tasks.prompt,
tasks.created_at,
tasks.deleted_at,
CASE
WHEN ((tasks.workspace_id IS NULL) OR (latest_build.job_status IS NULL)) THEN 'pending'::task_status
WHEN (latest_build.job_status = 'failed'::provisioner_job_status) THEN 'error'::task_status
WHEN ((latest_build.transition = ANY (ARRAY['stop'::workspace_transition, 'delete'::workspace_transition])) AND (latest_build.job_status = 'succeeded'::provisioner_job_status)) THEN 'paused'::task_status
WHEN ((latest_build.transition = 'start'::workspace_transition) AND (latest_build.job_status = 'pending'::provisioner_job_status)) THEN 'initializing'::task_status
WHEN ((latest_build.transition = 'start'::workspace_transition) AND (latest_build.job_status = ANY (ARRAY['running'::provisioner_job_status, 'succeeded'::provisioner_job_status]))) THEN
CASE
WHEN agent_status."none" THEN 'initializing'::task_status
WHEN agent_status.connecting THEN 'initializing'::task_status
WHEN agent_status.connected THEN
CASE
WHEN app_status.any_unhealthy THEN 'error'::task_status
WHEN app_status.any_initializing THEN 'initializing'::task_status
WHEN app_status.all_healthy_or_disabled THEN 'active'::task_status
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END AS status,
task_app.workspace_build_number,
task_app.workspace_agent_id,
task_app.workspace_app_id
FROM ((((tasks
LEFT JOIN LATERAL ( SELECT task_app_1.workspace_build_number,
task_app_1.workspace_agent_id,
task_app_1.workspace_app_id
FROM task_workspace_apps task_app_1
WHERE (task_app_1.task_id = tasks.id)
ORDER BY task_app_1.workspace_build_number DESC
LIMIT 1) task_app ON (true))
LEFT JOIN LATERAL ( SELECT workspace_build.transition,
provisioner_job.job_status,
workspace_build.job_id
FROM (workspace_builds workspace_build
JOIN provisioner_jobs provisioner_job ON ((provisioner_job.id = workspace_build.job_id)))
WHERE ((workspace_build.workspace_id = tasks.workspace_id) AND (workspace_build.build_number = task_app.workspace_build_number))) latest_build ON (true))
CROSS JOIN LATERAL ( SELECT (count(*) = 0) AS "none",
bool_or((workspace_agent.lifecycle_state = ANY (ARRAY['created'::workspace_agent_lifecycle_state, 'starting'::workspace_agent_lifecycle_state]))) AS connecting,
bool_and((workspace_agent.lifecycle_state = 'ready'::workspace_agent_lifecycle_state)) AS connected
FROM workspace_agents workspace_agent
WHERE (workspace_agent.id = task_app.workspace_agent_id)) agent_status)
CROSS JOIN LATERAL ( SELECT bool_or((workspace_app.health = 'unhealthy'::workspace_app_health)) AS any_unhealthy,
bool_or((workspace_app.health = 'initializing'::workspace_app_health)) AS any_initializing,
bool_and((workspace_app.health = ANY (ARRAY['healthy'::workspace_app_health, 'disabled'::workspace_app_health]))) AS all_healthy_or_disabled
FROM workspace_apps workspace_app
WHERE (workspace_app.id = task_app.workspace_app_id)) app_status)
WHERE (tasks.deleted_at IS NULL);
CREATE TABLE telemetry_items (
key text NOT NULL,
value text NOT NULL,
@@ -2332,71 +2564,6 @@ CREATE TABLE workspace_agent_volume_resource_monitors (
debounced_until timestamp with time zone DEFAULT '0001-01-01 00:00:00+00'::timestamp with time zone NOT NULL
);
CREATE TABLE workspace_agents (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
updated_at timestamp with time zone NOT NULL,
name character varying(64) NOT NULL,
first_connected_at timestamp with time zone,
last_connected_at timestamp with time zone,
disconnected_at timestamp with time zone,
resource_id uuid NOT NULL,
auth_token uuid NOT NULL,
auth_instance_id character varying,
architecture character varying(64) NOT NULL,
environment_variables jsonb,
operating_system character varying(64) NOT NULL,
instance_metadata jsonb,
resource_metadata jsonb,
directory character varying(4096) DEFAULT ''::character varying NOT NULL,
version text DEFAULT ''::text NOT NULL,
last_connected_replica_id uuid,
connection_timeout_seconds integer DEFAULT 0 NOT NULL,
troubleshooting_url text DEFAULT ''::text NOT NULL,
motd_file text DEFAULT ''::text NOT NULL,
lifecycle_state workspace_agent_lifecycle_state DEFAULT 'created'::workspace_agent_lifecycle_state NOT NULL,
expanded_directory character varying(4096) DEFAULT ''::character varying NOT NULL,
logs_length integer DEFAULT 0 NOT NULL,
logs_overflowed boolean DEFAULT false NOT NULL,
started_at timestamp with time zone,
ready_at timestamp with time zone,
subsystems workspace_agent_subsystem[] DEFAULT '{}'::workspace_agent_subsystem[],
display_apps display_app[] DEFAULT '{vscode,vscode_insiders,web_terminal,ssh_helper,port_forwarding_helper}'::display_app[],
api_version text DEFAULT ''::text NOT NULL,
display_order integer DEFAULT 0 NOT NULL,
parent_id uuid,
api_key_scope agent_key_scope_enum DEFAULT 'all'::agent_key_scope_enum NOT NULL,
deleted boolean DEFAULT false NOT NULL,
CONSTRAINT max_logs_length CHECK ((logs_length <= 1048576)),
CONSTRAINT subsystems_not_none CHECK ((NOT ('none'::workspace_agent_subsystem = ANY (subsystems))))
);
COMMENT ON COLUMN workspace_agents.version IS 'Version tracks the version of the currently running workspace agent. Workspace agents register their version upon start.';
COMMENT ON COLUMN workspace_agents.connection_timeout_seconds IS 'Connection timeout in seconds, 0 means disabled.';
COMMENT ON COLUMN workspace_agents.troubleshooting_url IS 'URL for troubleshooting the agent.';
COMMENT ON COLUMN workspace_agents.motd_file IS 'Path to file inside workspace containing the message of the day (MOTD) to show to the user when logging in via SSH.';
COMMENT ON COLUMN workspace_agents.lifecycle_state IS 'The current lifecycle state reported by the workspace agent.';
COMMENT ON COLUMN workspace_agents.expanded_directory IS 'The resolved path of a user-specified directory. e.g. ~/coder -> /home/coder/coder';
COMMENT ON COLUMN workspace_agents.logs_length IS 'Total length of startup logs';
COMMENT ON COLUMN workspace_agents.logs_overflowed IS 'Whether the startup logs overflowed in length';
COMMENT ON COLUMN workspace_agents.started_at IS 'The time the agent entered the starting lifecycle state';
COMMENT ON COLUMN workspace_agents.ready_at IS 'The time the agent entered the ready or start_error lifecycle state';
COMMENT ON COLUMN workspace_agents.display_order IS 'Specifies the order in which to display agents in user interfaces.';
COMMENT ON COLUMN workspace_agents.api_key_scope IS 'Defines the scope of the API key associated with the agent. ''all'' allows access to everything, ''no_user_data'' restricts it to exclude user data.';
COMMENT ON COLUMN workspace_agents.deleted IS 'Indicates whether or not the agent has been deleted. This is currently only applicable to sub agents.';
CREATE UNLOGGED TABLE workspace_app_audit_sessions (
agent_id uuid NOT NULL,
app_id uuid NOT NULL,
@@ -2485,35 +2652,6 @@ CREATE TABLE workspace_app_statuses (
uri text
);
CREATE TABLE workspace_apps (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
agent_id uuid NOT NULL,
display_name character varying(64) NOT NULL,
icon character varying(256) NOT NULL,
command character varying(65534),
url character varying(65534),
healthcheck_url text DEFAULT ''::text NOT NULL,
healthcheck_interval integer DEFAULT 0 NOT NULL,
healthcheck_threshold integer DEFAULT 0 NOT NULL,
health workspace_app_health DEFAULT 'disabled'::workspace_app_health NOT NULL,
subdomain boolean DEFAULT false NOT NULL,
sharing_level app_sharing_level DEFAULT 'owner'::app_sharing_level NOT NULL,
slug text NOT NULL,
external boolean DEFAULT false NOT NULL,
display_order integer DEFAULT 0 NOT NULL,
hidden boolean DEFAULT false NOT NULL,
open_in workspace_app_open_in DEFAULT 'slim-window'::workspace_app_open_in NOT NULL,
display_group text,
tooltip character varying(2048) DEFAULT ''::character varying NOT NULL
);
COMMENT ON COLUMN workspace_apps.display_order IS 'Specifies the order in which to display agent app in user interfaces.';
COMMENT ON COLUMN workspace_apps.hidden IS 'Determines if the app is not shown in user interfaces.';
COMMENT ON COLUMN workspace_apps.tooltip IS 'Markdown text that is displayed when hovering over workspace apps.';
CREATE TABLE workspace_build_parameters (
workspace_build_id uuid NOT NULL,
name text NOT NULL,
@@ -2524,29 +2662,6 @@ COMMENT ON COLUMN workspace_build_parameters.name IS 'Parameter name';
COMMENT ON COLUMN workspace_build_parameters.value IS 'Parameter value';
CREATE TABLE workspace_builds (
id uuid NOT NULL,
created_at timestamp with time zone NOT NULL,
updated_at timestamp with time zone NOT NULL,
workspace_id uuid NOT NULL,
template_version_id uuid NOT NULL,
build_number integer NOT NULL,
transition workspace_transition NOT NULL,
initiator_id uuid NOT NULL,
provisioner_state bytea,
job_id uuid NOT NULL,
deadline timestamp with time zone DEFAULT '0001-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
reason build_reason DEFAULT 'initiator'::build_reason NOT NULL,
daily_cost integer DEFAULT 0 NOT NULL,
max_deadline timestamp with time zone DEFAULT '0001-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
template_version_preset_id uuid,
has_ai_task boolean,
ai_task_sidebar_app_id uuid,
has_external_agent boolean,
CONSTRAINT workspace_builds_ai_task_sidebar_app_id_required CHECK (((((has_ai_task IS NULL) OR (has_ai_task = false)) AND (ai_task_sidebar_app_id IS NULL)) OR ((has_ai_task = true) AND (ai_task_sidebar_app_id IS NOT NULL)))),
CONSTRAINT workspace_builds_deadline_below_max_deadline CHECK ((((deadline <> '0001-01-01 00:00:00+00'::timestamp with time zone) AND (deadline <= max_deadline)) OR (max_deadline = '0001-01-01 00:00:00+00'::timestamp with time zone)))
);
CREATE VIEW workspace_build_with_user AS
SELECT workspace_builds.id,
workspace_builds.created_at,
@@ -2962,6 +3077,9 @@ ALTER TABLE ONLY tailnet_peers
ALTER TABLE ONLY tailnet_tunnels
ADD CONSTRAINT tailnet_tunnels_pkey PRIMARY KEY (coordinator_id, src_id, dst_id);
ALTER TABLE ONLY task_workspace_apps
ADD CONSTRAINT task_workspace_apps_pkey PRIMARY KEY (task_id, workspace_build_number);
ALTER TABLE ONLY tasks
ADD CONSTRAINT tasks_pkey PRIMARY KEY (id);
@@ -3227,6 +3345,16 @@ COMMENT ON INDEX provisioner_jobs_worker_id_organization_id_completed_at_idx IS
CREATE UNIQUE INDEX provisioner_keys_organization_id_name_idx ON provisioner_keys USING btree (organization_id, lower((name)::text));
CREATE INDEX tasks_organization_id_idx ON tasks USING btree (organization_id);
CREATE INDEX tasks_owner_id_idx ON tasks USING btree (owner_id);
CREATE UNIQUE INDEX tasks_owner_id_name_unique_idx ON tasks USING btree (owner_id, lower(name)) WHERE (deleted_at IS NULL);
COMMENT ON INDEX tasks_owner_id_name_unique_idx IS 'Index to ensure uniqueness for task owner/name';
CREATE INDEX tasks_workspace_id_idx ON tasks USING btree (workspace_id);
CREATE INDEX template_usage_stats_start_time_idx ON template_usage_stats USING btree (start_time DESC);
COMMENT ON INDEX template_usage_stats_start_time_idx IS 'Index for querying MAX(start_time).';
@@ -3507,9 +3635,6 @@ ALTER TABLE ONLY task_workspace_apps
ALTER TABLE ONLY task_workspace_apps
ADD CONSTRAINT task_workspace_apps_workspace_app_id_fkey FOREIGN KEY (workspace_app_id) REFERENCES workspace_apps(id) ON DELETE CASCADE;
ALTER TABLE ONLY task_workspace_apps
ADD CONSTRAINT task_workspace_apps_workspace_build_id_fkey FOREIGN KEY (workspace_build_id) REFERENCES workspace_builds(id) ON DELETE CASCADE;
ALTER TABLE ONLY tasks
ADD CONSTRAINT tasks_organization_id_fkey FOREIGN KEY (organization_id) REFERENCES organizations(id) ON DELETE CASCADE;
@@ -48,7 +48,6 @@ const (
ForeignKeyTaskWorkspaceAppsTaskID ForeignKeyConstraint = "task_workspace_apps_task_id_fkey" // ALTER TABLE ONLY task_workspace_apps ADD CONSTRAINT task_workspace_apps_task_id_fkey FOREIGN KEY (task_id) REFERENCES tasks(id) ON DELETE CASCADE;
ForeignKeyTaskWorkspaceAppsWorkspaceAgentID ForeignKeyConstraint = "task_workspace_apps_workspace_agent_id_fkey" // ALTER TABLE ONLY task_workspace_apps ADD CONSTRAINT task_workspace_apps_workspace_agent_id_fkey FOREIGN KEY (workspace_agent_id) REFERENCES workspace_agents(id) ON DELETE CASCADE;
ForeignKeyTaskWorkspaceAppsWorkspaceAppID ForeignKeyConstraint = "task_workspace_apps_workspace_app_id_fkey" // ALTER TABLE ONLY task_workspace_apps ADD CONSTRAINT task_workspace_apps_workspace_app_id_fkey FOREIGN KEY (workspace_app_id) REFERENCES workspace_apps(id) ON DELETE CASCADE;
ForeignKeyTaskWorkspaceAppsWorkspaceBuildID ForeignKeyConstraint = "task_workspace_apps_workspace_build_id_fkey" // ALTER TABLE ONLY task_workspace_apps ADD CONSTRAINT task_workspace_apps_workspace_build_id_fkey FOREIGN KEY (workspace_build_id) REFERENCES workspace_builds(id) ON DELETE CASCADE;
ForeignKeyTasksOrganizationID ForeignKeyConstraint = "tasks_organization_id_fkey" // ALTER TABLE ONLY tasks ADD CONSTRAINT tasks_organization_id_fkey FOREIGN KEY (organization_id) REFERENCES organizations(id) ON DELETE CASCADE;
ForeignKeyTasksOwnerID ForeignKeyConstraint = "tasks_owner_id_fkey" // ALTER TABLE ONLY tasks ADD CONSTRAINT tasks_owner_id_fkey FOREIGN KEY (owner_id) REFERENCES users(id) ON DELETE CASCADE;
ForeignKeyTasksTemplateVersionID ForeignKeyConstraint = "tasks_template_version_id_fkey" // ALTER TABLE ONLY tasks ADD CONSTRAINT tasks_template_version_id_fkey FOREIGN KEY (template_version_id) REFERENCES template_versions(id) ON DELETE CASCADE;
@@ -35,6 +35,10 @@ func (*mockTB) Logf(format string, args ...any) {
_, _ = fmt.Printf(format, args...)
}
func (*mockTB) TempDir() string {
panic("not implemented")
}
func main() {
t := &mockTB{}
defer func() {
@@ -0,0 +1,2 @@
-- No-op: enum values remain to avoid churn. Removing enum values requires
-- doing a create/cast/drop cycle which is intentionally omitted here.
@@ -0,0 +1,42 @@
-- Add wildcard api_key_scope entries so every RBAC resource has a matching resource:* value.
-- Generated via: CGO_ENABLED=0 go run ./scripts/generate_api_key_scope_enum
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'aibridge_interception:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'api_key:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'assign_org_role:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'assign_role:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'audit_log:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'connection_log:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'crypto_key:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'debug_info:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'deployment_config:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'deployment_stats:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'file:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'group:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'group_member:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'idpsync_settings:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'inbox_notification:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'license:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'notification_message:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'notification_preference:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'notification_template:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'oauth2_app:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'oauth2_app_code_token:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'oauth2_app_secret:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'organization:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'organization_member:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'prebuilt_workspace:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'provisioner_daemon:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'provisioner_jobs:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'replicas:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'system:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'tailnet_coordinator:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'template:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'usage_event:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'user:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'user_secret:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'webpush_subscription:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'workspace:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'workspace_agent_devcontainers:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'workspace_agent_resource_monitor:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'workspace_dormant:*';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'workspace_proxy:*';
@@ -0,0 +1,3 @@
-- Revert Tasks RBAC.
-- No-op: enum values remain to avoid churn. Removing enum values requires
-- doing a create/cast/drop cycle which is intentionally omitted here.
@@ -0,0 +1,6 @@
-- Tasks RBAC.
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'task:create';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'task:read';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'task:update';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'task:delete';
ALTER TYPE api_key_scope ADD VALUE IF NOT EXISTS 'task:*';
@@ -0,0 +1,33 @@
DROP VIEW IF EXISTS tasks_with_status;
DROP TYPE IF EXISTS task_status;
DROP INDEX IF EXISTS tasks_organization_id_idx;
DROP INDEX IF EXISTS tasks_owner_id_idx;
DROP INDEX IF EXISTS tasks_workspace_id_idx;
ALTER TABLE task_workspace_apps
DROP CONSTRAINT IF EXISTS task_workspace_apps_pkey;
-- Add back workspace_build_id column.
ALTER TABLE task_workspace_apps
ADD COLUMN workspace_build_id UUID;
-- Try to populate workspace_build_id from workspace_builds.
UPDATE task_workspace_apps
SET workspace_build_id = workspace_builds.id
FROM workspace_builds
WHERE workspace_builds.build_number = task_workspace_apps.workspace_build_number
AND workspace_builds.workspace_id IN (
SELECT workspace_id FROM tasks WHERE tasks.id = task_workspace_apps.task_id
);
-- Remove rows that couldn't be restored.
DELETE FROM task_workspace_apps
WHERE workspace_build_id IS NULL;
-- Restore original schema.
ALTER TABLE task_workspace_apps
DROP COLUMN workspace_build_number,
ALTER COLUMN workspace_build_id SET NOT NULL,
ALTER COLUMN workspace_agent_id SET NOT NULL,
ALTER COLUMN workspace_app_id SET NOT NULL;
@@ -0,0 +1,104 @@
-- Replace workspace_build_id with workspace_build_number.
ALTER TABLE task_workspace_apps
ADD COLUMN workspace_build_number INTEGER;
-- Try to populate workspace_build_number from workspace_builds.
UPDATE task_workspace_apps
SET workspace_build_number = workspace_builds.build_number
FROM workspace_builds
WHERE workspace_builds.id = task_workspace_apps.workspace_build_id;
-- Remove rows that couldn't be migrated.
DELETE FROM task_workspace_apps
WHERE workspace_build_number IS NULL;
ALTER TABLE task_workspace_apps
DROP COLUMN workspace_build_id,
ALTER COLUMN workspace_build_number SET NOT NULL,
ALTER COLUMN workspace_agent_id DROP NOT NULL,
ALTER COLUMN workspace_app_id DROP NOT NULL,
ADD CONSTRAINT task_workspace_apps_pkey PRIMARY KEY (task_id, workspace_build_number);
-- Add indexes for common joins or filters.
CREATE INDEX IF NOT EXISTS tasks_workspace_id_idx ON tasks (workspace_id);
CREATE INDEX IF NOT EXISTS tasks_owner_id_idx ON tasks (owner_id);
CREATE INDEX IF NOT EXISTS tasks_organization_id_idx ON tasks (organization_id);
CREATE TYPE task_status AS ENUM (
'pending',
'initializing',
'active',
'paused',
'unknown',
'error'
);
CREATE VIEW
tasks_with_status
AS
SELECT
tasks.*,
CASE
WHEN tasks.workspace_id IS NULL OR latest_build.job_status IS NULL THEN 'pending'::task_status
WHEN latest_build.job_status = 'failed' THEN 'error'::task_status
WHEN latest_build.transition IN ('stop', 'delete')
AND latest_build.job_status = 'succeeded' THEN 'paused'::task_status
WHEN latest_build.transition = 'start'
AND latest_build.job_status = 'pending' THEN 'initializing'::task_status
WHEN latest_build.transition = 'start' AND latest_build.job_status IN ('running', 'succeeded') THEN
CASE
WHEN agent_status.none THEN 'initializing'::task_status
WHEN agent_status.connecting THEN 'initializing'::task_status
WHEN agent_status.connected THEN
CASE
WHEN app_status.any_unhealthy THEN 'error'::task_status
WHEN app_status.any_initializing THEN 'initializing'::task_status
WHEN app_status.all_healthy_or_disabled THEN 'active'::task_status
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END AS status
FROM
tasks
LEFT JOIN LATERAL (
SELECT workspace_build_number, workspace_agent_id, workspace_app_id
FROM task_workspace_apps task_app
WHERE task_id = tasks.id
ORDER BY workspace_build_number DESC
LIMIT 1
) task_app ON TRUE
LEFT JOIN LATERAL (
SELECT
workspace_build.transition,
provisioner_job.job_status,
workspace_build.job_id
FROM workspace_builds workspace_build
JOIN provisioner_jobs provisioner_job ON provisioner_job.id = workspace_build.job_id
WHERE workspace_build.workspace_id = tasks.workspace_id
AND workspace_build.build_number = task_app.workspace_build_number
) latest_build ON TRUE
CROSS JOIN LATERAL (
SELECT
COUNT(*) = 0 AS none,
bool_or(workspace_agent.lifecycle_state IN ('created', 'starting')) AS connecting,
bool_and(workspace_agent.lifecycle_state = 'ready') AS connected
FROM workspace_agents workspace_agent
WHERE workspace_agent.id = task_app.workspace_agent_id
) agent_status
CROSS JOIN LATERAL (
SELECT
bool_or(workspace_app.health = 'unhealthy') AS any_unhealthy,
bool_or(workspace_app.health = 'initializing') AS any_initializing,
bool_and(workspace_app.health IN ('healthy', 'disabled')) AS all_healthy_or_disabled
FROM workspace_apps workspace_app
WHERE workspace_app.id = task_app.workspace_app_id
) app_status
WHERE
tasks.deleted_at IS NULL;
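The nested CASE expression in the view above is a small state machine: build state first, then agent state, then app health. A hedged, hypothetical Go mirror of that logic (the struct and field names here are illustrative stand-ins, not the production types) can make the precedence easier to reason about:

```go
package main

import "fmt"

// Hypothetical mirror of the tasks_with_status CASE logic. These structs are
// illustrative only; the real data comes from the lateral joins in the view.
type buildInfo struct {
	hasWorkspace bool
	jobStatus    string // "", "pending", "running", "succeeded", "failed"
	transition   string // "start", "stop", "delete"
}

type agentInfo struct{ none, connecting, connected bool }

type appInfo struct{ anyUnhealthy, anyInitializing, allHealthyOrDisabled bool }

func taskStatus(b buildInfo, ag agentInfo, app appInfo) string {
	switch {
	case !b.hasWorkspace || b.jobStatus == "":
		return "pending"
	case b.jobStatus == "failed":
		return "error"
	case (b.transition == "stop" || b.transition == "delete") && b.jobStatus == "succeeded":
		return "paused"
	case b.transition == "start" && b.jobStatus == "pending":
		return "initializing"
	case b.transition == "start" && (b.jobStatus == "running" || b.jobStatus == "succeeded"):
		switch {
		case ag.none, ag.connecting:
			return "initializing"
		case ag.connected:
			switch {
			case app.anyUnhealthy:
				return "error"
			case app.anyInitializing:
				return "initializing"
			case app.allHealthyOrDisabled:
				return "active"
			}
		}
	}
	// Any combination not matched above falls through, like the SQL ELSE arms.
	return "unknown"
}

func main() {
	fmt.Println(taskStatus(buildInfo{}, agentInfo{}, appInfo{}))
	fmt.Println(taskStatus(buildInfo{hasWorkspace: true, jobStatus: "failed"}, agentInfo{}, appInfo{}))
	fmt.Println(taskStatus(
		buildInfo{hasWorkspace: true, transition: "start", jobStatus: "running"},
		agentInfo{connected: true},
		appInfo{allHealthyOrDisabled: true}))
}
```

Note that earlier arms win: a failed job reports `error` regardless of agent or app state, matching the CASE ordering in the view.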
@@ -0,0 +1 @@
DROP INDEX IF EXISTS tasks_owner_id_name_unique_idx;
@@ -0,0 +1,2 @@
CREATE UNIQUE INDEX IF NOT EXISTS tasks_owner_id_name_unique_idx ON tasks (owner_id, LOWER(name)) WHERE deleted_at IS NULL;
COMMENT ON INDEX tasks_owner_id_name_unique_idx IS 'Index to ensure uniqueness for task owner/name';
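The partial unique index above enforces case-insensitive name uniqueness per owner, but only among tasks that are not soft-deleted. A minimal in-memory sketch of that rule (the `task` struct and `violates` helper are hypothetical, for illustration only):

```go
package main

import (
	"fmt"
	"strings"
)

// Hypothetical illustration of the rule enforced by
// tasks_owner_id_name_unique_idx: (owner_id, LOWER(name)) must be unique
// among rows WHERE deleted_at IS NULL.
type task struct {
	ownerID string
	name    string
	deleted bool
}

func violates(existing []task, candidate task) bool {
	for _, t := range existing {
		if t.deleted {
			// Soft-deleted rows are excluded by the partial index predicate.
			continue
		}
		if t.ownerID == candidate.ownerID && strings.EqualFold(t.name, candidate.name) {
			return true
		}
	}
	return false
}

func main() {
	existing := []task{{ownerID: "u1", name: "My-Task"}}
	fmt.Println(violates(existing, task{ownerID: "u1", name: "my-task"})) // same owner, same name modulo case
	fmt.Println(violates(existing, task{ownerID: "u2", name: "my-task"})) // different owner is allowed
}
```

Because the index is partial, deleting a task frees its name for reuse by the same owner.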
@@ -0,0 +1 @@
-- Nothing to do
@@ -0,0 +1 @@
ALTER TYPE resource_type ADD VALUE IF NOT EXISTS 'task';
@@ -0,0 +1,72 @@
DROP VIEW IF EXISTS tasks_with_status;
-- Restore from 00037_add_columns_to_tasks_with_status.up.sql.
CREATE VIEW
tasks_with_status
AS
SELECT
tasks.*,
CASE
WHEN tasks.workspace_id IS NULL OR latest_build.job_status IS NULL THEN 'pending'::task_status
WHEN latest_build.job_status = 'failed' THEN 'error'::task_status
WHEN latest_build.transition IN ('stop', 'delete')
AND latest_build.job_status = 'succeeded' THEN 'paused'::task_status
WHEN latest_build.transition = 'start'
AND latest_build.job_status = 'pending' THEN 'initializing'::task_status
WHEN latest_build.transition = 'start' AND latest_build.job_status IN ('running', 'succeeded') THEN
CASE
WHEN agent_status.none THEN 'initializing'::task_status
WHEN agent_status.connecting THEN 'initializing'::task_status
WHEN agent_status.connected THEN
CASE
WHEN app_status.any_unhealthy THEN 'error'::task_status
WHEN app_status.any_initializing THEN 'initializing'::task_status
WHEN app_status.all_healthy_or_disabled THEN 'active'::task_status
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END AS status
FROM
tasks
LEFT JOIN LATERAL (
SELECT workspace_build_number, workspace_agent_id, workspace_app_id
FROM task_workspace_apps task_app
WHERE task_id = tasks.id
ORDER BY workspace_build_number DESC
LIMIT 1
) task_app ON TRUE
LEFT JOIN LATERAL (
SELECT
workspace_build.transition,
provisioner_job.job_status,
workspace_build.job_id
FROM workspace_builds workspace_build
JOIN provisioner_jobs provisioner_job ON provisioner_job.id = workspace_build.job_id
WHERE workspace_build.workspace_id = tasks.workspace_id
AND workspace_build.build_number = task_app.workspace_build_number
) latest_build ON TRUE
CROSS JOIN LATERAL (
SELECT
COUNT(*) = 0 AS none,
bool_or(workspace_agent.lifecycle_state IN ('created', 'starting')) AS connecting,
bool_and(workspace_agent.lifecycle_state = 'ready') AS connected
FROM workspace_agents workspace_agent
WHERE workspace_agent.id = task_app.workspace_agent_id
) agent_status
CROSS JOIN LATERAL (
SELECT
bool_or(workspace_app.health = 'unhealthy') AS any_unhealthy,
bool_or(workspace_app.health = 'initializing') AS any_initializing,
bool_and(workspace_app.health IN ('healthy', 'disabled')) AS all_healthy_or_disabled
FROM workspace_apps workspace_app
WHERE workspace_app.id = task_app.workspace_app_id
) app_status
WHERE
tasks.deleted_at IS NULL;
@@ -0,0 +1,74 @@
-- Drop view from 00037_add_columns_to_tasks_with_status.up.sql.
DROP VIEW IF EXISTS tasks_with_status;
-- Add task_app columns.
CREATE VIEW
tasks_with_status
AS
SELECT
tasks.*,
CASE
WHEN tasks.workspace_id IS NULL OR latest_build.job_status IS NULL THEN 'pending'::task_status
WHEN latest_build.job_status = 'failed' THEN 'error'::task_status
WHEN latest_build.transition IN ('stop', 'delete')
AND latest_build.job_status = 'succeeded' THEN 'paused'::task_status
WHEN latest_build.transition = 'start'
AND latest_build.job_status = 'pending' THEN 'initializing'::task_status
WHEN latest_build.transition = 'start' AND latest_build.job_status IN ('running', 'succeeded') THEN
CASE
WHEN agent_status.none THEN 'initializing'::task_status
WHEN agent_status.connecting THEN 'initializing'::task_status
WHEN agent_status.connected THEN
CASE
WHEN app_status.any_unhealthy THEN 'error'::task_status
WHEN app_status.any_initializing THEN 'initializing'::task_status
WHEN app_status.all_healthy_or_disabled THEN 'active'::task_status
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END
ELSE 'unknown'::task_status
END AS status,
task_app.*
FROM
tasks
LEFT JOIN LATERAL (
SELECT workspace_build_number, workspace_agent_id, workspace_app_id
FROM task_workspace_apps task_app
WHERE task_id = tasks.id
ORDER BY workspace_build_number DESC
LIMIT 1
) task_app ON TRUE
LEFT JOIN LATERAL (
SELECT
workspace_build.transition,
provisioner_job.job_status,
workspace_build.job_id
FROM workspace_builds workspace_build
JOIN provisioner_jobs provisioner_job ON provisioner_job.id = workspace_build.job_id
WHERE workspace_build.workspace_id = tasks.workspace_id
AND workspace_build.build_number = task_app.workspace_build_number
) latest_build ON TRUE
CROSS JOIN LATERAL (
SELECT
COUNT(*) = 0 AS none,
bool_or(workspace_agent.lifecycle_state IN ('created', 'starting')) AS connecting,
bool_and(workspace_agent.lifecycle_state = 'ready') AS connected
FROM workspace_agents workspace_agent
WHERE workspace_agent.id = task_app.workspace_agent_id
) agent_status
CROSS JOIN LATERAL (
SELECT
bool_or(workspace_app.health = 'unhealthy') AS any_unhealthy,
bool_or(workspace_app.health = 'initializing') AS any_initializing,
bool_and(workspace_app.health IN ('healthy', 'disabled')) AS all_healthy_or_disabled
FROM workspace_apps workspace_app
WHERE workspace_app.id = task_app.workspace_app_id
) app_status
WHERE
tasks.deleted_at IS NULL;
@@ -0,0 +1,5 @@
-- Remove Task 'completed' transition template notification
DELETE FROM notification_templates WHERE id = '8c5a4d12-9f7e-4b3a-a1c8-6e4f2d9b5a7c';
-- Remove Task 'failed' transition template notification
DELETE FROM notification_templates WHERE id = '3b7e8f1a-4c2d-49a6-b5e9-7f3a1c8d6b4e';
@@ -0,0 +1,63 @@
-- Task transition to 'completed' status
INSERT INTO notification_templates (
id,
name,
title_template,
body_template,
actions,
"group",
method,
kind,
enabled_by_default
) VALUES (
'8c5a4d12-9f7e-4b3a-a1c8-6e4f2d9b5a7c',
'Task Completed',
E'Task ''{{.Labels.workspace}}'' completed',
E'The task ''{{.Labels.task}}'' has completed successfully.',
'[
{
"label": "View task",
"url": "{{base_url}}/tasks/{{.UserUsername}}/{{.Labels.workspace}}"
},
{
"label": "View workspace",
"url": "{{base_url}}/@{{.UserUsername}}/{{.Labels.workspace}}"
}
]'::jsonb,
'Task Events',
NULL,
'system'::notification_template_kind,
true
);
-- Task transition to 'failed' status
INSERT INTO notification_templates (
id,
name,
title_template,
body_template,
actions,
"group",
method,
kind,
enabled_by_default
) VALUES (
'3b7e8f1a-4c2d-49a6-b5e9-7f3a1c8d6b4e',
'Task Failed',
E'Task ''{{.Labels.workspace}}'' failed',
E'The task ''{{.Labels.task}}'' has failed. Check the logs for more details.',
'[
{
"label": "View task",
"url": "{{base_url}}/tasks/{{.UserUsername}}/{{.Labels.workspace}}"
},
{
"label": "View workspace",
"url": "{{base_url}}/@{{.UserUsername}}/{{.Labels.workspace}}"
}
]'::jsonb,
'Task Events',
NULL,
'system'::notification_template_kind,
true
);
@@ -0,0 +1,6 @@
INSERT INTO public.task_workspace_apps VALUES (
'f5a1c3e4-8b2d-4f6a-9d7e-2a8b5c9e1f3d', -- task_id
NULL, -- workspace_agent_id
NULL, -- workspace_app_id
99 -- workspace_build_number
) ON CONFLICT DO NOTHING;
@@ -132,6 +132,20 @@ func (w ConnectionLog) RBACObject() rbac.Object {
return obj
}
func (t Task) RBACObject() rbac.Object {
return rbac.ResourceTask.
WithID(t.ID).
WithOwner(t.OwnerID.String()).
InOrg(t.OrganizationID)
}
func (t TaskTable) RBACObject() rbac.Object {
return rbac.ResourceTask.
WithID(t.ID).
WithOwner(t.OwnerID.String()).
InOrg(t.OrganizationID)
}
func (s APIKeyScope) ToRBAC() rbac.ScopeName {
switch s {
case ApiKeyScopeCoderAll:
@@ -145,24 +159,30 @@ func (s APIKeyScope) ToRBAC() rbac.ScopeName {
}
}
// APIKeyScopes allows expanding multiple API key scopes into a single
// RBAC scope for authorization. This implements rbac.ExpandableScope so
// callers can pass the list directly without deriving a single scope.
// APIKeyScopes represents a collection of individual API key scope names as
// stored in the database. Helper methods on this type are used to derive the
// RBAC scope that should be authorized for the key.
type APIKeyScopes []APIKeyScope
var _ rbac.ExpandableScope = APIKeyScopes{}
// WithAllowList wraps the scopes with a database allow list, producing an
// ExpandableScope that always enforces the allow list overlay when expanded.
func (s APIKeyScopes) WithAllowList(list AllowList) APIKeyScopeSet {
return APIKeyScopeSet{Scopes: s, AllowList: list}
}
// Has returns true if the slice contains the provided scope.
func (s APIKeyScopes) Has(target APIKeyScope) bool {
return slices.Contains(s, target)
}
// Expand merges the permissions of all scopes in the list into a single scope.
// If the list is empty, it defaults to rbac.ScopeAll.
func (s APIKeyScopes) Expand() (rbac.Scope, error) {
// expandRBACScope merges the permissions of all scopes in the list into a
// single RBAC scope. If the list is empty, it defaults to rbac.ScopeAll for
// backward compatibility. This method is internal; use ScopeSet() to combine
// scopes with the API key's allow list for authorization.
func (s APIKeyScopes) expandRBACScope() (rbac.Scope, error) {
// Reject empty scope lists; callers must provide at least one scope.
if len(s) == 0 {
return rbac.ScopeAll.Expand()
return rbac.Scope{}, xerrors.New("no scopes provided")
}
var merged rbac.Scope
@@ -170,13 +190,12 @@ func (s APIKeyScopes) Expand() (rbac.Scope, error) {
// Identifier is informational; not used in policy evaluation.
Identifier: rbac.RoleIdentifier{Name: "Scope_Multiple"},
Site: nil,
Org: map[string][]rbac.Permission{},
User: nil,
ByOrgID: map[string]rbac.OrgPermissions{},
}
// Track allow list union, collapsing to wildcard if any child is wildcard.
allowAll := false
allowSet := make(map[string]rbac.AllowListElement)
// Collect allow lists for a union after expanding all scopes.
allowLists := make([][]rbac.AllowListElement, 0, len(s))
for _, s := range s {
expanded, err := s.ToRBAC().Expand()
@@ -186,39 +205,30 @@ func (s APIKeyScopes) Expand() (rbac.Scope, error) {
// Merge role permissions: union by simple concatenation.
merged.Site = append(merged.Site, expanded.Site...)
for orgID, perms := range expanded.Org {
merged.Org[orgID] = append(merged.Org[orgID], perms...)
for orgID, perms := range expanded.ByOrgID {
orgPerms := merged.ByOrgID[orgID]
orgPerms.Org = append(orgPerms.Org, perms.Org...)
merged.ByOrgID[orgID] = orgPerms
}
merged.User = append(merged.User, expanded.User...)
// Merge allow lists.
for _, e := range expanded.AllowIDList {
if e.ID == policy.WildcardSymbol && e.Type == policy.WildcardSymbol {
allowAll = true
// No need to track other entries once wildcard is present.
continue
}
key := e.String()
allowSet[key] = e
}
allowLists = append(allowLists, expanded.AllowIDList)
}
// De-duplicate permissions across Site/Org/User
merged.Site = rbac.DeduplicatePermissions(merged.Site)
for orgID, perms := range merged.Org {
merged.Org[orgID] = rbac.DeduplicatePermissions(perms)
}
merged.User = rbac.DeduplicatePermissions(merged.User)
if allowAll || len(allowSet) == 0 {
merged.AllowIDList = []rbac.AllowListElement{rbac.AllowListAll()}
} else {
merged.AllowIDList = make([]rbac.AllowListElement, 0, len(allowSet))
for _, v := range allowSet {
merged.AllowIDList = append(merged.AllowIDList, v)
}
for orgID, perms := range merged.ByOrgID {
perms.Org = rbac.DeduplicatePermissions(perms.Org)
merged.ByOrgID[orgID] = perms
}
union, err := rbac.UnionAllowLists(allowLists...)
if err != nil {
return rbac.Scope{}, err
}
merged.AllowIDList = union
return merged, nil
}
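`expandRBACScope` delegates the allow-list merge to `rbac.UnionAllowLists`. A hedged sketch of what such a union plausibly does, assuming the semantics implied by the surrounding code (wildcard absorbs everything, duplicates collapse, and an empty result defaults to allow-all); the `Element` type and `union` function here are illustrative stand-ins, not the real API:

```go
package main

import "fmt"

// Illustrative stand-in for rbac.AllowListElement; the real signature of
// rbac.UnionAllowLists may differ.
type Element struct{ Type, ID string }

var wildcard = Element{Type: "*", ID: "*"}

func union(lists ...[]Element) []Element {
	seen := map[Element]bool{}
	var out []Element
	for _, list := range lists {
		for _, e := range list {
			if e == wildcard {
				// A wildcard in any child scope absorbs all other entries.
				return []Element{wildcard}
			}
			if !seen[e] {
				seen[e] = true
				out = append(out, e)
			}
		}
	}
	if len(out) == 0 {
		// No scope constrained the allow list, so default to allow-all.
		return []Element{wildcard}
	}
	return out
}

func main() {
	a := []Element{{"workspace", "w1"}}
	b := []Element{{"workspace", "w1"}, {"template", "t1"}}
	fmt.Println(union(a, b))                   // duplicates collapse
	fmt.Println(union(a, []Element{wildcard})) // wildcard absorbs
	fmt.Println(union())                       // empty defaults to allow-all
}
```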
@@ -235,6 +245,37 @@ func (s APIKeyScopes) Name() rbac.RoleIdentifier {
return rbac.RoleIdentifier{Name: "scopes[" + strings.Join(names, "+") + "]"}
}
// APIKeyScopeSet merges expanded scopes with the API key's DB allow_list. If
// the DB allow_list is a wildcard or empty, the merged scope's allow list is
// unchanged. Otherwise, the DB allow_list is intersected with the merged
// AllowIDList to enforce the token's resource scoping consistently across all
// permissions.
type APIKeyScopeSet struct {
Scopes APIKeyScopes
AllowList AllowList
}
var _ rbac.ExpandableScope = APIKeyScopeSet{}
func (s APIKeyScopeSet) Name() rbac.RoleIdentifier { return s.Scopes.Name() }
func (s APIKeyScopeSet) Expand() (rbac.Scope, error) {
merged, err := s.Scopes.expandRBACScope()
if err != nil {
return rbac.Scope{}, err
}
merged.AllowIDList = rbac.IntersectAllowLists(merged.AllowIDList, s.AllowList)
return merged, nil
}
// ScopeSet returns the scopes combined with the database allow list. It is the
// canonical way to expose an API key's effective scope for authorization.
func (k APIKey) ScopeSet() APIKeyScopeSet {
return APIKeyScopeSet{
Scopes: k.Scopes,
AllowList: k.AllowList,
}
}
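`APIKeyScopeSet.Expand` narrows the merged allow list via `rbac.IntersectAllowLists`. A hedged sketch of plausible intersection semantics, inferred from the surrounding comments (a wildcard on either side leaves the other side in effect); `Element` and `intersect` are illustrative names, not the real API:

```go
package main

import "fmt"

// Illustrative stand-in for rbac.AllowListElement; the real
// rbac.IntersectAllowLists signature may differ.
type Element struct{ Type, ID string }

func isWildcard(l []Element) bool {
	return len(l) == 1 && l[0].Type == "*" && l[0].ID == "*"
}

func intersect(a, b []Element) []Element {
	// A wildcard side imposes no constraint, so the other side wins.
	if isWildcard(a) {
		return b
	}
	if isWildcard(b) {
		return a
	}
	seen := make(map[Element]bool, len(a))
	for _, e := range a {
		seen[e] = true
	}
	out := []Element{}
	for _, e := range b {
		if seen[e] {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	wild := []Element{{"*", "*"}}
	db := []Element{{"workspace", "abc"}}
	fmt.Println(intersect(wild, db)) // the DB allow list narrows a wildcard scope
	fmt.Println(intersect(db, wild)) // and is preserved against a wildcard DB list
}
```

This is why a token whose scope expands to allow-all is still confined to the specific resources stored in its `allow_list` column.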
func (k APIKey) RBACObject() rbac.Object {
return rbac.ResourceApiKey.WithIDString(k.ID).
WithOwner(k.UserID.String())
@@ -3,6 +3,7 @@ package database
import (
"testing"
"github.com/google/uuid"
"github.com/stretchr/testify/require"
"github.com/coder/coder/v2/coderd/rbac"
@@ -38,7 +39,7 @@ func TestAPIKeyScopesExpand(t *testing.T) {
for _, tc := range cases {
t.Run(tc.name, func(t *testing.T) {
t.Parallel()
s, err := tc.scopes.Expand()
s, err := tc.scopes.expandRBACScope()
require.NoError(t, err)
tc.want(t, s)
})
@@ -59,7 +60,7 @@ func TestAPIKeyScopesExpand(t *testing.T) {
for _, tc := range cases {
t.Run(tc.name, func(t *testing.T) {
t.Parallel()
s, err := tc.scopes.Expand()
s, err := tc.scopes.expandRBACScope()
require.NoError(t, err)
requirePermission(t, s, tc.res, tc.act)
requireAllowAll(t, s)
@@ -70,7 +71,7 @@ func TestAPIKeyScopesExpand(t *testing.T) {
t.Run("merge", func(t *testing.T) {
t.Parallel()
scopes := APIKeyScopes{ApiKeyScopeCoderApplicationConnect, ApiKeyScopeCoderAll, ApiKeyScopeWorkspaceRead}
s, err := scopes.Expand()
s, err := scopes.expandRBACScope()
require.NoError(t, err)
requirePermission(t, s, rbac.ResourceWildcard.Type, policy.Action(policy.WildcardSymbol))
requirePermission(t, s, rbac.ResourceWorkspace.Type, policy.ActionApplicationConnect)
@@ -78,13 +79,68 @@ func TestAPIKeyScopesExpand(t *testing.T) {
requireAllowAll(t, s)
})
t.Run("empty_defaults_to_all", func(t *testing.T) {
t.Run("effective_scope_keep_types", func(t *testing.T) {
t.Parallel()
s, err := (APIKeyScopes{}).Expand()
workspaceID := uuid.New()
effective := APIKeyScopeSet{
Scopes: APIKeyScopes{ApiKeyScopeWorkspaceRead},
AllowList: AllowList{
{Type: rbac.ResourceWorkspace.Type, ID: workspaceID.String()},
},
}
expanded, err := effective.Expand()
require.NoError(t, err)
requirePermission(t, s, rbac.ResourceWildcard.Type, policy.Action(policy.WildcardSymbol))
require.Len(t, expanded.AllowIDList, 1)
require.Equal(t, "workspace", expanded.AllowIDList[0].Type)
require.Equal(t, workspaceID.String(), expanded.AllowIDList[0].ID)
})
t.Run("empty_rejected", func(t *testing.T) {
t.Parallel()
_, err := (APIKeyScopes{}).expandRBACScope()
require.Error(t, err)
require.ErrorContains(t, err, "no scopes provided")
})
t.Run("allow_list_overrides", func(t *testing.T) {
t.Parallel()
allowID := uuid.NewString()
set := APIKeyScopes{ApiKeyScopeWorkspaceRead}.WithAllowList(AllowList{
{Type: rbac.ResourceWorkspace.Type, ID: allowID},
})
s, err := set.Expand()
require.NoError(t, err)
require.Len(t, s.AllowIDList, 1)
require.Equal(t, rbac.AllowListElement{Type: rbac.ResourceWorkspace.Type, ID: allowID}, s.AllowIDList[0])
})
t.Run("allow_list_wildcard_keeps_merged", func(t *testing.T) {
t.Parallel()
set := APIKeyScopes{ApiKeyScopeWorkspaceRead}.WithAllowList(AllowList{
{Type: policy.WildcardSymbol, ID: policy.WildcardSymbol},
})
s, err := set.Expand()
require.NoError(t, err)
requirePermission(t, s, rbac.ResourceWorkspace.Type, policy.ActionRead)
requireAllowAll(t, s)
})
t.Run("scope_set_helper", func(t *testing.T) {
t.Parallel()
allowID := uuid.NewString()
key := APIKey{
Scopes: APIKeyScopes{ApiKeyScopeWorkspaceRead},
AllowList: AllowList{
{Type: rbac.ResourceWorkspace.Type, ID: allowID},
},
}
s, err := key.ScopeSet().Expand()
require.NoError(t, err)
require.Len(t, s.AllowIDList, 1)
require.Equal(t, rbac.AllowListElement{Type: rbac.ResourceWorkspace.Type, ID: allowID}, s.AllowIDList[0])
})
}
// Helpers
@@ -166,6 +166,51 @@ const (
ApiKeyScopeCoderTemplatesbuild APIKeyScope = "coder:templates.build"
ApiKeyScopeCoderTemplatesauthor APIKeyScope = "coder:templates.author"
ApiKeyScopeCoderApikeysmanageSelf APIKeyScope = "coder:apikeys.manage_self"
ApiKeyScopeAibridgeInterception APIKeyScope = "aibridge_interception:*"
ApiKeyScopeApiKey APIKeyScope = "api_key:*"
ApiKeyScopeAssignOrgRole APIKeyScope = "assign_org_role:*"
ApiKeyScopeAssignRole APIKeyScope = "assign_role:*"
ApiKeyScopeAuditLog APIKeyScope = "audit_log:*"
ApiKeyScopeConnectionLog APIKeyScope = "connection_log:*"
ApiKeyScopeCryptoKey APIKeyScope = "crypto_key:*"
ApiKeyScopeDebugInfo APIKeyScope = "debug_info:*"
ApiKeyScopeDeploymentConfig APIKeyScope = "deployment_config:*"
ApiKeyScopeDeploymentStats APIKeyScope = "deployment_stats:*"
ApiKeyScopeFile APIKeyScope = "file:*"
ApiKeyScopeGroup APIKeyScope = "group:*"
ApiKeyScopeGroupMember APIKeyScope = "group_member:*"
ApiKeyScopeIdpsyncSettings APIKeyScope = "idpsync_settings:*"
ApiKeyScopeInboxNotification APIKeyScope = "inbox_notification:*"
ApiKeyScopeLicense APIKeyScope = "license:*"
ApiKeyScopeNotificationMessage APIKeyScope = "notification_message:*"
ApiKeyScopeNotificationPreference APIKeyScope = "notification_preference:*"
ApiKeyScopeNotificationTemplate APIKeyScope = "notification_template:*"
ApiKeyScopeOauth2App APIKeyScope = "oauth2_app:*"
ApiKeyScopeOauth2AppCodeToken APIKeyScope = "oauth2_app_code_token:*"
ApiKeyScopeOauth2AppSecret APIKeyScope = "oauth2_app_secret:*"
ApiKeyScopeOrganization APIKeyScope = "organization:*"
ApiKeyScopeOrganizationMember APIKeyScope = "organization_member:*"
ApiKeyScopePrebuiltWorkspace APIKeyScope = "prebuilt_workspace:*"
ApiKeyScopeProvisionerDaemon APIKeyScope = "provisioner_daemon:*"
ApiKeyScopeProvisionerJobs APIKeyScope = "provisioner_jobs:*"
ApiKeyScopeReplicas APIKeyScope = "replicas:*"
ApiKeyScopeSystem APIKeyScope = "system:*"
ApiKeyScopeTailnetCoordinator APIKeyScope = "tailnet_coordinator:*"
ApiKeyScopeTemplate APIKeyScope = "template:*"
ApiKeyScopeUsageEvent APIKeyScope = "usage_event:*"
ApiKeyScopeUser APIKeyScope = "user:*"
ApiKeyScopeUserSecret APIKeyScope = "user_secret:*"
ApiKeyScopeWebpushSubscription APIKeyScope = "webpush_subscription:*"
ApiKeyScopeWorkspace APIKeyScope = "workspace:*"
ApiKeyScopeWorkspaceAgentDevcontainers APIKeyScope = "workspace_agent_devcontainers:*"
ApiKeyScopeWorkspaceAgentResourceMonitor APIKeyScope = "workspace_agent_resource_monitor:*"
ApiKeyScopeWorkspaceDormant APIKeyScope = "workspace_dormant:*"
ApiKeyScopeWorkspaceProxy APIKeyScope = "workspace_proxy:*"
ApiKeyScopeTaskCreate APIKeyScope = "task:create"
ApiKeyScopeTaskRead APIKeyScope = "task:read"
ApiKeyScopeTaskUpdate APIKeyScope = "task:update"
ApiKeyScopeTaskDelete APIKeyScope = "task:delete"
ApiKeyScopeTask APIKeyScope = "task:*"
)
func (e *APIKeyScope) Scan(src interface{}) error {
@@ -351,7 +396,52 @@ func (e APIKeyScope) Valid() bool {
ApiKeyScopeCoderWorkspacesaccess,
ApiKeyScopeCoderTemplatesbuild,
ApiKeyScopeCoderTemplatesauthor,
ApiKeyScopeCoderApikeysmanageSelf:
ApiKeyScopeCoderApikeysmanageSelf,
ApiKeyScopeAibridgeInterception,
ApiKeyScopeApiKey,
ApiKeyScopeAssignOrgRole,
ApiKeyScopeAssignRole,
ApiKeyScopeAuditLog,
ApiKeyScopeConnectionLog,
ApiKeyScopeCryptoKey,
ApiKeyScopeDebugInfo,
ApiKeyScopeDeploymentConfig,
ApiKeyScopeDeploymentStats,
ApiKeyScopeFile,
ApiKeyScopeGroup,
ApiKeyScopeGroupMember,
ApiKeyScopeIdpsyncSettings,
ApiKeyScopeInboxNotification,
ApiKeyScopeLicense,
ApiKeyScopeNotificationMessage,
ApiKeyScopeNotificationPreference,
ApiKeyScopeNotificationTemplate,
ApiKeyScopeOauth2App,
ApiKeyScopeOauth2AppCodeToken,
ApiKeyScopeOauth2AppSecret,
ApiKeyScopeOrganization,
ApiKeyScopeOrganizationMember,
ApiKeyScopePrebuiltWorkspace,
ApiKeyScopeProvisionerDaemon,
ApiKeyScopeProvisionerJobs,
ApiKeyScopeReplicas,
ApiKeyScopeSystem,
ApiKeyScopeTailnetCoordinator,
ApiKeyScopeTemplate,
ApiKeyScopeUsageEvent,
ApiKeyScopeUser,
ApiKeyScopeUserSecret,
ApiKeyScopeWebpushSubscription,
ApiKeyScopeWorkspace,
ApiKeyScopeWorkspaceAgentDevcontainers,
ApiKeyScopeWorkspaceAgentResourceMonitor,
ApiKeyScopeWorkspaceDormant,
ApiKeyScopeWorkspaceProxy,
ApiKeyScopeTaskCreate,
ApiKeyScopeTaskRead,
ApiKeyScopeTaskUpdate,
ApiKeyScopeTaskDelete,
ApiKeyScopeTask:
return true
}
return false
@@ -506,6 +596,51 @@ func AllAPIKeyScopeValues() []APIKeyScope {
ApiKeyScopeCoderTemplatesbuild,
ApiKeyScopeCoderTemplatesauthor,
ApiKeyScopeCoderApikeysmanageSelf,
ApiKeyScopeAibridgeInterception,
ApiKeyScopeApiKey,
ApiKeyScopeAssignOrgRole,
ApiKeyScopeAssignRole,
ApiKeyScopeAuditLog,
ApiKeyScopeConnectionLog,
ApiKeyScopeCryptoKey,
ApiKeyScopeDebugInfo,
ApiKeyScopeDeploymentConfig,
ApiKeyScopeDeploymentStats,
ApiKeyScopeFile,
ApiKeyScopeGroup,
ApiKeyScopeGroupMember,
ApiKeyScopeIdpsyncSettings,
ApiKeyScopeInboxNotification,
ApiKeyScopeLicense,
ApiKeyScopeNotificationMessage,
ApiKeyScopeNotificationPreference,
ApiKeyScopeNotificationTemplate,
ApiKeyScopeOauth2App,
ApiKeyScopeOauth2AppCodeToken,
ApiKeyScopeOauth2AppSecret,
ApiKeyScopeOrganization,
ApiKeyScopeOrganizationMember,
ApiKeyScopePrebuiltWorkspace,
ApiKeyScopeProvisionerDaemon,
ApiKeyScopeProvisionerJobs,
ApiKeyScopeReplicas,
ApiKeyScopeSystem,
ApiKeyScopeTailnetCoordinator,
ApiKeyScopeTemplate,
ApiKeyScopeUsageEvent,
ApiKeyScopeUser,
ApiKeyScopeUserSecret,
ApiKeyScopeWebpushSubscription,
ApiKeyScopeWorkspace,
ApiKeyScopeWorkspaceAgentDevcontainers,
ApiKeyScopeWorkspaceAgentResourceMonitor,
ApiKeyScopeWorkspaceDormant,
ApiKeyScopeWorkspaceProxy,
ApiKeyScopeTaskCreate,
ApiKeyScopeTaskRead,
ApiKeyScopeTaskUpdate,
ApiKeyScopeTaskDelete,
ApiKeyScopeTask,
}
}
@@ -2535,6 +2670,7 @@ const (
ResourceTypeWorkspaceAgent ResourceType = "workspace_agent"
ResourceTypeWorkspaceApp ResourceType = "workspace_app"
ResourceTypePrebuildsSettings ResourceType = "prebuilds_settings"
ResourceTypeTask ResourceType = "task"
)
func (e *ResourceType) Scan(src interface{}) error {
@@ -2598,7 +2734,8 @@ func (e ResourceType) Valid() bool {
ResourceTypeIdpSyncSettingsRole,
ResourceTypeWorkspaceAgent,
ResourceTypeWorkspaceApp,
ResourceTypePrebuildsSettings:
ResourceTypePrebuildsSettings,
ResourceTypeTask:
return true
}
return false
@@ -2631,6 +2768,7 @@ func AllResourceTypeValues() []ResourceType {
ResourceTypeWorkspaceAgent,
ResourceTypeWorkspaceApp,
ResourceTypePrebuildsSettings,
ResourceTypeTask,
}
}
@@ -2750,6 +2888,76 @@ func AllTailnetStatusValues() []TailnetStatus {
}
}
type TaskStatus string
const (
TaskStatusPending TaskStatus = "pending"
TaskStatusInitializing TaskStatus = "initializing"
TaskStatusActive TaskStatus = "active"
TaskStatusPaused TaskStatus = "paused"
TaskStatusUnknown TaskStatus = "unknown"
TaskStatusError TaskStatus = "error"
)
func (e *TaskStatus) Scan(src interface{}) error {
switch s := src.(type) {
case []byte:
*e = TaskStatus(s)
case string:
*e = TaskStatus(s)
default:
return fmt.Errorf("unsupported scan type for TaskStatus: %T", src)
}
return nil
}
type NullTaskStatus struct {
TaskStatus TaskStatus `json:"task_status"`
Valid bool `json:"valid"` // Valid is true if TaskStatus is not NULL
}
// Scan implements the Scanner interface.
func (ns *NullTaskStatus) Scan(value interface{}) error {
if value == nil {
ns.TaskStatus, ns.Valid = "", false
return nil
}
ns.Valid = true
return ns.TaskStatus.Scan(value)
}
// Value implements the driver Valuer interface.
func (ns NullTaskStatus) Value() (driver.Value, error) {
if !ns.Valid {
return nil, nil
}
return string(ns.TaskStatus), nil
}
func (e TaskStatus) Valid() bool {
switch e {
case TaskStatusPending,
TaskStatusInitializing,
TaskStatusActive,
TaskStatusPaused,
TaskStatusUnknown,
TaskStatusError:
return true
}
return false
}
func AllTaskStatusValues() []TaskStatus {
return []TaskStatus{
TaskStatusPending,
TaskStatusInitializing,
TaskStatusActive,
TaskStatusPaused,
TaskStatusUnknown,
TaskStatusError,
}
}
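The `NullTaskStatus` Scan/Value pair above follows the standard `database/sql` pattern for a nullable enum column. A minimal, self-contained reproduction showing the round-trip (names copied from the generated code above, logic simplified):

```go
package main

import (
	"database/sql/driver"
	"fmt"
)

type TaskStatus string

// NullTaskStatus mirrors the generated wrapper: Valid is false when the
// database column is NULL.
type NullTaskStatus struct {
	TaskStatus TaskStatus
	Valid      bool
}

// Scan implements sql.Scanner: NULL maps to Valid=false, otherwise the raw
// bytes/string become the enum value.
func (ns *NullTaskStatus) Scan(value interface{}) error {
	if value == nil {
		ns.TaskStatus, ns.Valid = "", false
		return nil
	}
	ns.Valid = true
	switch s := value.(type) {
	case []byte:
		ns.TaskStatus = TaskStatus(s)
	case string:
		ns.TaskStatus = TaskStatus(s)
	default:
		return fmt.Errorf("unsupported scan type for TaskStatus: %T", value)
	}
	return nil
}

// Value implements driver.Valuer: invalid values round-trip back to NULL.
func (ns NullTaskStatus) Value() (driver.Value, error) {
	if !ns.Valid {
		return nil, nil
	}
	return string(ns.TaskStatus), nil
}

func main() {
	var ns NullTaskStatus
	_ = ns.Scan("active")
	v, _ := ns.Value()
	fmt.Println(ns.Valid, v)

	_ = ns.Scan(nil)
	fmt.Println(ns.Valid)
}
```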
// Defines the user's status: active, dormant, or suspended.
type UserStatus string
@@ -3992,6 +4200,23 @@ type TailnetTunnel struct {
}
type Task struct {
ID uuid.UUID `db:"id" json:"id"`
OrganizationID uuid.UUID `db:"organization_id" json:"organization_id"`
OwnerID uuid.UUID `db:"owner_id" json:"owner_id"`
Name string `db:"name" json:"name"`
WorkspaceID uuid.NullUUID `db:"workspace_id" json:"workspace_id"`
TemplateVersionID uuid.UUID `db:"template_version_id" json:"template_version_id"`
TemplateParameters json.RawMessage `db:"template_parameters" json:"template_parameters"`
Prompt string `db:"prompt" json:"prompt"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
DeletedAt sql.NullTime `db:"deleted_at" json:"deleted_at"`
Status TaskStatus `db:"status" json:"status"`
WorkspaceBuildNumber sql.NullInt32 `db:"workspace_build_number" json:"workspace_build_number"`
WorkspaceAgentID uuid.NullUUID `db:"workspace_agent_id" json:"workspace_agent_id"`
WorkspaceAppID uuid.NullUUID `db:"workspace_app_id" json:"workspace_app_id"`
}
type TaskTable struct {
ID uuid.UUID `db:"id" json:"id"`
OrganizationID uuid.UUID `db:"organization_id" json:"organization_id"`
OwnerID uuid.UUID `db:"owner_id" json:"owner_id"`
@@ -4005,10 +4230,10 @@ type Task struct {
}
type TaskWorkspaceApp struct {
TaskID uuid.UUID `db:"task_id" json:"task_id"`
WorkspaceAgentID uuid.NullUUID `db:"workspace_agent_id" json:"workspace_agent_id"`
WorkspaceAppID uuid.NullUUID `db:"workspace_app_id" json:"workspace_app_id"`
WorkspaceBuildNumber int32 `db:"workspace_build_number" json:"workspace_build_number"`
}
type TelemetryItem struct {
@@ -331,6 +331,8 @@ type sqlcQuerier interface {
GetTailnetPeers(ctx context.Context, id uuid.UUID) ([]TailnetPeer, error)
GetTailnetTunnelPeerBindings(ctx context.Context, srcID uuid.UUID) ([]GetTailnetTunnelPeerBindingsRow, error)
GetTailnetTunnelPeerIDs(ctx context.Context, srcID uuid.UUID) ([]GetTailnetTunnelPeerIDsRow, error)
GetTaskByID(ctx context.Context, id uuid.UUID) (Task, error)
GetTaskByWorkspaceID(ctx context.Context, workspaceID uuid.UUID) (Task, error)
GetTelemetryItem(ctx context.Context, key string) (TelemetryItem, error)
GetTelemetryItems(ctx context.Context) ([]TelemetryItem, error)
// GetTemplateAppInsights returns the aggregate usage of each app in a given
@@ -550,6 +552,7 @@ type sqlcQuerier interface {
InsertProvisionerJobTimings(ctx context.Context, arg InsertProvisionerJobTimingsParams) ([]ProvisionerJobTiming, error)
InsertProvisionerKey(ctx context.Context, arg InsertProvisionerKeyParams) (ProvisionerKey, error)
InsertReplica(ctx context.Context, arg InsertReplicaParams) (Replica, error)
InsertTask(ctx context.Context, arg InsertTaskParams) (TaskTable, error)
InsertTelemetryItemIfNotExists(ctx context.Context, arg InsertTelemetryItemIfNotExistsParams) error
InsertTemplate(ctx context.Context, arg InsertTemplateParams) error
InsertTemplateVersion(ctx context.Context, arg InsertTemplateVersionParams) error
@@ -592,6 +595,7 @@ type sqlcQuerier interface {
ListAIBridgeUserPromptsByInterceptionIDs(ctx context.Context, interceptionIds []uuid.UUID) ([]AIBridgeUserPrompt, error)
ListProvisionerKeysByOrganization(ctx context.Context, organizationID uuid.UUID) ([]ProvisionerKey, error)
ListProvisionerKeysByOrganizationExcludeReserved(ctx context.Context, organizationID uuid.UUID) ([]ProvisionerKey, error)
ListTasks(ctx context.Context, arg ListTasksParams) ([]Task, error)
ListUserSecrets(ctx context.Context, userID uuid.UUID) ([]UserSecret, error)
ListWorkspaceAgentPortShares(ctx context.Context, workspaceID uuid.UUID) ([]WorkspaceAgentPortShare, error)
MarkAllInboxNotificationsAsRead(ctx context.Context, arg MarkAllInboxNotificationsAsReadParams) error
@@ -729,6 +733,7 @@ type sqlcQuerier interface {
UpsertTailnetCoordinator(ctx context.Context, id uuid.UUID) (TailnetCoordinator, error)
UpsertTailnetPeer(ctx context.Context, arg UpsertTailnetPeerParams) (TailnetPeer, error)
UpsertTailnetTunnel(ctx context.Context, arg UpsertTailnetTunnelParams) (TailnetTunnel, error)
UpsertTaskWorkspaceApp(ctx context.Context, arg UpsertTaskWorkspaceAppParams) (TaskWorkspaceApp, error)
UpsertTelemetryItem(ctx context.Context, arg UpsertTelemetryItemParams) error
// This query aggregates the workspace_agent_stats and workspace_app_stats data
// into a single table for efficient storage and querying. Half-hour buckets are
@@ -6653,6 +6653,649 @@ func TestGetLatestWorkspaceBuildsByWorkspaceIDs(t *testing.T) {
}
}
func TestTasksWithStatusView(t *testing.T) {
t.Parallel()
createProvisionerJob := func(t *testing.T, db database.Store, org database.Organization, user database.User, buildStatus database.ProvisionerJobStatus) database.ProvisionerJob {
t.Helper()
var jobParams database.ProvisionerJob
switch buildStatus {
case database.ProvisionerJobStatusPending:
jobParams = database.ProvisionerJob{
OrganizationID: org.ID,
Type: database.ProvisionerJobTypeWorkspaceBuild,
InitiatorID: user.ID,
}
case database.ProvisionerJobStatusRunning:
jobParams = database.ProvisionerJob{
OrganizationID: org.ID,
Type: database.ProvisionerJobTypeWorkspaceBuild,
InitiatorID: user.ID,
StartedAt: sql.NullTime{Valid: true, Time: dbtime.Now()},
}
case database.ProvisionerJobStatusFailed:
jobParams = database.ProvisionerJob{
OrganizationID: org.ID,
Type: database.ProvisionerJobTypeWorkspaceBuild,
InitiatorID: user.ID,
StartedAt: sql.NullTime{Valid: true, Time: dbtime.Now()},
CompletedAt: sql.NullTime{Valid: true, Time: dbtime.Now()},
Error: sql.NullString{Valid: true, String: "job failed"},
}
case database.ProvisionerJobStatusSucceeded:
jobParams = database.ProvisionerJob{
OrganizationID: org.ID,
Type: database.ProvisionerJobTypeWorkspaceBuild,
InitiatorID: user.ID,
StartedAt: sql.NullTime{Valid: true, Time: dbtime.Now()},
CompletedAt: sql.NullTime{Valid: true, Time: dbtime.Now()},
}
default:
t.Errorf("invalid build status: %v", buildStatus)
}
return dbgen.ProvisionerJob(t, db, nil, jobParams)
}
createTask := func(
ctx context.Context,
t *testing.T,
db database.Store,
org database.Organization,
user database.User,
buildStatus database.ProvisionerJobStatus,
buildTransition database.WorkspaceTransition,
agentState database.WorkspaceAgentLifecycleState,
appHealths []database.WorkspaceAppHealth,
) database.TaskTable {
t.Helper()
template := dbgen.Template(t, db, database.Template{
OrganizationID: org.ID,
CreatedBy: user.ID,
})
templateVersion := dbgen.TemplateVersion(t, db, database.TemplateVersion{
TemplateID: uuid.NullUUID{UUID: template.ID, Valid: true},
OrganizationID: org.ID,
CreatedBy: user.ID,
})
if buildStatus == "" {
return dbgen.Task(t, db, database.TaskTable{
OrganizationID: org.ID,
OwnerID: user.ID,
Name: "test-task",
TemplateVersionID: templateVersion.ID,
Prompt: "Test prompt",
})
}
job := createProvisionerJob(t, db, org, user, buildStatus)
workspace := dbgen.Workspace(t, db, database.WorkspaceTable{
OrganizationID: org.ID,
TemplateID: template.ID,
OwnerID: user.ID,
})
workspaceID := uuid.NullUUID{Valid: true, UUID: workspace.ID}
task := dbgen.Task(t, db, database.TaskTable{
OrganizationID: org.ID,
OwnerID: user.ID,
Name: "test-task",
WorkspaceID: workspaceID,
TemplateVersionID: templateVersion.ID,
Prompt: "Test prompt",
})
workspaceBuild := dbgen.WorkspaceBuild(t, db, database.WorkspaceBuild{
WorkspaceID: workspace.ID,
TemplateVersionID: templateVersion.ID,
BuildNumber: 1,
Transition: buildTransition,
InitiatorID: user.ID,
JobID: job.ID,
})
workspaceBuildNumber := workspaceBuild.BuildNumber
_, err := db.UpsertTaskWorkspaceApp(ctx, database.UpsertTaskWorkspaceAppParams{
TaskID: task.ID,
WorkspaceBuildNumber: workspaceBuildNumber,
})
require.NoError(t, err)
resource := dbgen.WorkspaceResource(t, db, database.WorkspaceResource{
JobID: job.ID,
})
if agentState != "" {
agent := dbgen.WorkspaceAgent(t, db, database.WorkspaceAgent{
ResourceID: resource.ID,
})
workspaceAgentID := agent.ID
_, err := db.UpsertTaskWorkspaceApp(ctx, database.UpsertTaskWorkspaceAppParams{
TaskID: task.ID,
WorkspaceBuildNumber: workspaceBuildNumber,
WorkspaceAgentID: uuid.NullUUID{UUID: workspaceAgentID, Valid: true},
})
require.NoError(t, err)
err = db.UpdateWorkspaceAgentLifecycleStateByID(ctx, database.UpdateWorkspaceAgentLifecycleStateByIDParams{
ID: agent.ID,
LifecycleState: agentState,
})
require.NoError(t, err)
for i, health := range appHealths {
app := dbgen.WorkspaceApp(t, db, database.WorkspaceApp{
AgentID: workspaceAgentID,
Slug: fmt.Sprintf("test-app-%d", i),
DisplayName: fmt.Sprintf("Test App %d", i+1),
Health: health,
})
if i == 0 {
// Assume the first app is the tasks app.
_, err := db.UpsertTaskWorkspaceApp(ctx, database.UpsertTaskWorkspaceAppParams{
TaskID: task.ID,
WorkspaceBuildNumber: workspaceBuildNumber,
WorkspaceAgentID: uuid.NullUUID{UUID: workspaceAgentID, Valid: true},
WorkspaceAppID: uuid.NullUUID{UUID: app.ID, Valid: true},
})
require.NoError(t, err)
}
}
}
return task
}
tests := []struct {
name string
buildStatus database.ProvisionerJobStatus
buildTransition database.WorkspaceTransition
agentState database.WorkspaceAgentLifecycleState
appHealths []database.WorkspaceAppHealth
expectedStatus database.TaskStatus
description string
expectBuildNumberValid bool
expectBuildNumber int32
expectWorkspaceAgentValid bool
expectWorkspaceAppValid bool
}{
{
name: "NoWorkspace",
expectedStatus: database.TaskStatusPending,
description: "Task with no workspace assigned",
expectBuildNumberValid: false,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "FailedBuild",
buildStatus: database.ProvisionerJobStatusFailed,
buildTransition: database.WorkspaceTransitionStart,
expectedStatus: database.TaskStatusError,
description: "Latest workspace build failed",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "StoppedWorkspace",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStop,
expectedStatus: database.TaskStatusPaused,
description: "Workspace is stopped",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "DeletedWorkspace",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionDelete,
expectedStatus: database.TaskStatusPaused,
description: "Workspace is deleted",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "PendingStart",
buildStatus: database.ProvisionerJobStatusPending,
buildTransition: database.WorkspaceTransitionStart,
expectedStatus: database.TaskStatusInitializing,
description: "Workspace build is starting (pending)",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "RunningStart",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
expectedStatus: database.TaskStatusInitializing,
description: "Workspace build is starting (running)",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: false,
expectWorkspaceAppValid: false,
},
{
name: "StartingAgent",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateStarting,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthInitializing},
expectedStatus: database.TaskStatusInitializing,
description: "Workspace is running but agent is starting",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "CreatedAgent",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateCreated,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthInitializing},
expectedStatus: database.TaskStatusInitializing,
description: "Workspace is running but agent is created",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "ReadyAgentInitializingApp",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthInitializing},
expectedStatus: database.TaskStatusInitializing,
description: "Agent is ready but app is initializing",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "ReadyAgentHealthyApp",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthHealthy},
expectedStatus: database.TaskStatusActive,
description: "Agent is ready and app is healthy",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "ReadyAgentDisabledApp",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthDisabled},
expectedStatus: database.TaskStatusActive,
description: "Agent is ready and app health checking is disabled",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "ReadyAgentUnhealthyApp",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthUnhealthy},
expectedStatus: database.TaskStatusError,
description: "Agent is ready but app is unhealthy",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "AgentStartTimeout",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateStartTimeout,
expectedStatus: database.TaskStatusUnknown,
description: "Agent start timed out",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: false,
},
{
name: "AgentStartError",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateStartError,
expectedStatus: database.TaskStatusUnknown,
description: "Agent failed to start",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: false,
},
{
name: "AgentShuttingDown",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateShuttingDown,
expectedStatus: database.TaskStatusUnknown,
description: "Agent is shutting down",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: false,
},
{
name: "AgentOff",
buildStatus: database.ProvisionerJobStatusSucceeded,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateOff,
expectedStatus: database.TaskStatusUnknown,
description: "Agent is off",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: false,
},
{
name: "RunningJobReadyAgentHealthyApp",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthHealthy},
expectedStatus: database.TaskStatusActive,
description: "Running job with ready agent and healthy app should be active",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "RunningJobReadyAgentInitializingApp",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthInitializing},
expectedStatus: database.TaskStatusInitializing,
description: "Running job with ready agent but initializing app should be initializing",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "RunningJobReadyAgentUnhealthyApp",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthUnhealthy},
expectedStatus: database.TaskStatusError,
description: "Running job with ready agent but unhealthy app should be error",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "RunningJobConnectingAgent",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateStarting,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthInitializing},
expectedStatus: database.TaskStatusInitializing,
description: "Running job with connecting agent should be initializing",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "RunningJobReadyAgentDisabledApp",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthDisabled},
expectedStatus: database.TaskStatusActive,
description: "Running job with ready agent and disabled app health checking should be active",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
{
name: "RunningJobReadyAgentHealthyTaskAppUnhealthyOtherAppIsOK",
buildStatus: database.ProvisionerJobStatusRunning,
buildTransition: database.WorkspaceTransitionStart,
agentState: database.WorkspaceAgentLifecycleStateReady,
appHealths: []database.WorkspaceAppHealth{database.WorkspaceAppHealthHealthy, database.WorkspaceAppHealthUnhealthy},
expectedStatus: database.TaskStatusActive,
description: "Running job where the task app is healthy should be active even if another app is unhealthy",
expectBuildNumberValid: true,
expectBuildNumber: 1,
expectWorkspaceAgentValid: true,
expectWorkspaceAppValid: true,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
db, _ := dbtestutil.NewDB(t)
ctx := testutil.Context(t, testutil.WaitLong)
org := dbgen.Organization(t, db, database.Organization{})
user := dbgen.User(t, db, database.User{})
task := createTask(ctx, t, db, org, user, tt.buildStatus, tt.buildTransition, tt.agentState, tt.appHealths)
got, err := db.GetTaskByID(ctx, task.ID)
require.NoError(t, err)
require.Equal(t, tt.expectedStatus, got.Status)
require.Equal(t, tt.expectBuildNumberValid, got.WorkspaceBuildNumber.Valid)
if tt.expectBuildNumberValid {
require.Equal(t, tt.expectBuildNumber, got.WorkspaceBuildNumber.Int32)
}
require.Equal(t, tt.expectWorkspaceAgentValid, got.WorkspaceAgentID.Valid)
if tt.expectWorkspaceAgentValid {
require.NotEqual(t, uuid.Nil, got.WorkspaceAgentID.UUID)
}
require.Equal(t, tt.expectWorkspaceAppValid, got.WorkspaceAppID.Valid)
if tt.expectWorkspaceAppValid {
require.NotEqual(t, uuid.Nil, got.WorkspaceAppID.UUID)
}
})
}
}
func TestGetTaskByWorkspaceID(t *testing.T) {
t.Parallel()
tests := []struct {
name string
setupTask func(t *testing.T, db database.Store, org database.Organization, user database.User, templateVersion database.TemplateVersion, workspace database.WorkspaceTable)
wantErr bool
}{
{
name: "task doesn't exist",
wantErr: true,
},
{
name: "task with no workspace id",
setupTask: func(t *testing.T, db database.Store, org database.Organization, user database.User, templateVersion database.TemplateVersion, workspace database.WorkspaceTable) {
dbgen.Task(t, db, database.TaskTable{
OrganizationID: org.ID,
OwnerID: user.ID,
Name: "test-task",
TemplateVersionID: templateVersion.ID,
Prompt: "Test prompt",
})
},
wantErr: true,
},
{
name: "task with workspace id",
setupTask: func(t *testing.T, db database.Store, org database.Organization, user database.User, templateVersion database.TemplateVersion, workspace database.WorkspaceTable) {
workspaceID := uuid.NullUUID{Valid: true, UUID: workspace.ID}
dbgen.Task(t, db, database.TaskTable{
OrganizationID: org.ID,
OwnerID: user.ID,
Name: "test-task",
WorkspaceID: workspaceID,
TemplateVersionID: templateVersion.ID,
Prompt: "Test prompt",
})
},
wantErr: false,
},
}
db, _ := dbtestutil.NewDB(t)
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
org := dbgen.Organization(t, db, database.Organization{})
user := dbgen.User(t, db, database.User{})
template := dbgen.Template(t, db, database.Template{
OrganizationID: org.ID,
CreatedBy: user.ID,
})
templateVersion := dbgen.TemplateVersion(t, db, database.TemplateVersion{
OrganizationID: org.ID,
TemplateID: uuid.NullUUID{Valid: true, UUID: template.ID},
CreatedBy: user.ID,
})
workspace := dbgen.Workspace(t, db, database.WorkspaceTable{
OrganizationID: org.ID,
OwnerID: user.ID,
TemplateID: template.ID,
})
if tt.setupTask != nil {
tt.setupTask(t, db, org, user, templateVersion, workspace)
}
ctx := testutil.Context(t, testutil.WaitLong)
task, err := db.GetTaskByWorkspaceID(ctx, workspace.ID)
if tt.wantErr {
require.Error(t, err)
} else {
require.NoError(t, err)
require.False(t, task.WorkspaceBuildNumber.Valid)
require.False(t, task.WorkspaceAgentID.Valid)
require.False(t, task.WorkspaceAppID.Valid)
}
})
}
}
func TestTaskNameUniqueness(t *testing.T) {
t.Parallel()
db, _ := dbtestutil.NewDB(t)
org := dbgen.Organization(t, db, database.Organization{})
user1 := dbgen.User(t, db, database.User{})
user2 := dbgen.User(t, db, database.User{})
template := dbgen.Template(t, db, database.Template{
OrganizationID: org.ID,
CreatedBy: user1.ID,
})
tv := dbgen.TemplateVersion(t, db, database.TemplateVersion{
TemplateID: uuid.NullUUID{UUID: template.ID, Valid: true},
OrganizationID: org.ID,
CreatedBy: user1.ID,
})
taskName := "my-task"
// Create initial task for user1.
task1 := dbgen.Task(t, db, database.TaskTable{
OrganizationID: org.ID,
OwnerID: user1.ID,
Name: taskName,
TemplateVersionID: tv.ID,
Prompt: "Test prompt",
})
require.NotEqual(t, uuid.Nil, task1.ID)
tests := []struct {
name string
ownerID uuid.UUID
taskName string
wantErr bool
}{
{
name: "duplicate task name same user",
ownerID: user1.ID,
taskName: taskName,
wantErr: true,
},
{
name: "duplicate task name different case same user",
ownerID: user1.ID,
taskName: "MY-TASK",
wantErr: true,
},
{
name: "same task name different user",
ownerID: user2.ID,
taskName: taskName,
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
ctx := testutil.Context(t, testutil.WaitShort)
task, err := db.InsertTask(ctx, database.InsertTaskParams{
OrganizationID: org.ID,
OwnerID: tt.ownerID,
Name: tt.taskName,
TemplateVersionID: tv.ID,
TemplateParameters: json.RawMessage("{}"),
Prompt: "Test prompt",
CreatedAt: dbtime.Now(),
})
if tt.wantErr {
require.Error(t, err)
} else {
require.NoError(t, err)
require.NotEqual(t, uuid.Nil, task.ID)
require.NotEqual(t, task1.ID, task.ID)
}
})
}
}
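The "different case" cases in `TestTaskNameUniqueness` above imply the uniqueness constraint compares names case-insensitively per owner (presumably a unique index over something like `lower(name)`; the exact index definition is not shown in this diff). A minimal in-memory sketch of that rule:

```go
package main

import (
	"fmt"
	"strings"
)

// taskNameIndex enforces the per-owner, case-insensitive name rule the
// test above exercises. It is a hypothetical stand-in for the assumed
// unique index on (owner_id, lower(name)).
type taskNameIndex map[string]bool

func key(ownerID, name string) string {
	return ownerID + "/" + strings.ToLower(name)
}

// insert reports whether the name was accepted for that owner; false
// plays the role of the unique-violation error the test expects.
func (idx taskNameIndex) insert(ownerID, name string) bool {
	k := key(ownerID, name)
	if idx[k] {
		return false
	}
	idx[k] = true
	return true
}

func main() {
	idx := taskNameIndex{}
	fmt.Println(idx.insert("user1", "my-task")) // accepted
	fmt.Println(idx.insert("user1", "MY-TASK")) // rejected: same name, different case
	fmt.Println(idx.insert("user2", "my-task")) // accepted: different owner
}
```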
func TestUsageEventsTrigger(t *testing.T) {
t.Parallel()
@@ -6780,3 +7423,148 @@ func TestUsageEventsTrigger(t *testing.T) {
require.Len(t, rows, 0)
})
}
func TestListTasks(t *testing.T) {
t.Parallel()
db, ps := dbtestutil.NewDB(t)
// Given: two organizations and two users, one of which is a member of both
org1 := dbgen.Organization(t, db, database.Organization{})
org2 := dbgen.Organization(t, db, database.Organization{})
user1 := dbgen.User(t, db, database.User{})
user2 := dbgen.User(t, db, database.User{})
_ = dbgen.OrganizationMember(t, db, database.OrganizationMember{
OrganizationID: org1.ID,
UserID: user1.ID,
})
_ = dbgen.OrganizationMember(t, db, database.OrganizationMember{
OrganizationID: org2.ID,
UserID: user2.ID,
})
// Given: a template with an active version
tv := dbgen.TemplateVersion(t, db, database.TemplateVersion{
CreatedBy: user1.ID,
OrganizationID: org1.ID,
})
tpl := dbgen.Template(t, db, database.Template{
CreatedBy: user1.ID,
OrganizationID: org1.ID,
ActiveVersionID: tv.ID,
})
// Helper function to create a task
createTask := func(orgID, ownerID uuid.UUID) database.TaskTable {
ws := dbgen.Workspace(t, db, database.WorkspaceTable{
OrganizationID: orgID,
OwnerID: ownerID,
TemplateID: tpl.ID,
})
pj := dbgen.ProvisionerJob(t, db, ps, database.ProvisionerJob{})
sidebarAppID := uuid.New()
wb := dbgen.WorkspaceBuild(t, db, database.WorkspaceBuild{
JobID: pj.ID,
TemplateVersionID: tv.ID,
WorkspaceID: ws.ID,
})
wr := dbgen.WorkspaceResource(t, db, database.WorkspaceResource{
JobID: pj.ID,
})
agt := dbgen.WorkspaceAgent(t, db, database.WorkspaceAgent{
ResourceID: wr.ID,
})
wa := dbgen.WorkspaceApp(t, db, database.WorkspaceApp{
ID: sidebarAppID,
AgentID: agt.ID,
})
tsk := dbgen.Task(t, db, database.TaskTable{
OrganizationID: orgID,
OwnerID: ownerID,
Prompt: testutil.GetRandomName(t),
TemplateVersionID: tv.ID,
WorkspaceID: uuid.NullUUID{UUID: ws.ID, Valid: true},
})
_ = dbgen.TaskWorkspaceApp(t, db, database.TaskWorkspaceApp{
TaskID: tsk.ID,
WorkspaceBuildNumber: wb.BuildNumber,
WorkspaceAgentID: uuid.NullUUID{Valid: true, UUID: agt.ID},
WorkspaceAppID: uuid.NullUUID{Valid: true, UUID: wa.ID},
})
t.Logf("task_id:%s owner_id:%s org_id:%s", tsk.ID, ownerID, orgID)
return tsk
}
// Given: user1 has one task and user2 has two tasks (one in each org)
task1 := createTask(org1.ID, user1.ID)
task2 := createTask(org1.ID, user2.ID)
task3 := createTask(org2.ID, user2.ID)
// Then: run various filters and assert expected results
for _, tc := range []struct {
name string
filter database.ListTasksParams
expectIDs []uuid.UUID
}{
{
name: "no filter",
filter: database.ListTasksParams{
OwnerID: uuid.Nil,
OrganizationID: uuid.Nil,
},
expectIDs: []uuid.UUID{task3.ID, task2.ID, task1.ID},
},
{
name: "filter by user ID",
filter: database.ListTasksParams{
OwnerID: user1.ID,
OrganizationID: uuid.Nil,
},
expectIDs: []uuid.UUID{task1.ID},
},
{
name: "filter by organization ID",
filter: database.ListTasksParams{
OwnerID: uuid.Nil,
OrganizationID: org1.ID,
},
expectIDs: []uuid.UUID{task2.ID, task1.ID},
},
{
name: "filter by user and organization ID",
filter: database.ListTasksParams{
OwnerID: user2.ID,
OrganizationID: org2.ID,
},
expectIDs: []uuid.UUID{task3.ID},
},
{
name: "no results",
filter: database.ListTasksParams{
OwnerID: user1.ID,
OrganizationID: org2.ID,
},
expectIDs: nil,
},
} {
t.Run(tc.name, func(t *testing.T) {
t.Parallel()
ctx := testutil.Context(t, testutil.WaitShort)
tasks, err := db.ListTasks(ctx, tc.filter)
require.NoError(t, err)
require.Len(t, tasks, len(tc.expectIDs))
for idx, eid := range tc.expectIDs {
task := tasks[idx]
assert.Equal(t, eid, task.ID, "task ID mismatch at index %d", idx)
require.True(t, task.WorkspaceBuildNumber.Valid)
require.Greater(t, task.WorkspaceBuildNumber.Int32, int32(0))
require.True(t, task.WorkspaceAgentID.Valid)
require.NotEqual(t, uuid.Nil, task.WorkspaceAgentID.UUID)
require.True(t, task.WorkspaceAppID.Valid)
require.NotEqual(t, uuid.Nil, task.WorkspaceAppID.UUID)
}
})
}
}
@@ -8966,7 +8966,7 @@ WHERE
-- Filter by max age if provided
AND (
$7::bigint IS NULL
OR pd.last_seen_at IS NULL
OR pd.last_seen_at >= (NOW() - ($7::bigint || ' ms')::interval)
)
AND (
@@ -9291,11 +9291,11 @@ func (q *sqlQuerier) InsertProvisionerJobLogs(ctx context.Context, arg InsertPro
}
const updateProvisionerJobLogsLength = `-- name: UpdateProvisionerJobLogsLength :exec
UPDATE
provisioner_jobs
SET
logs_length = logs_length + $2
WHERE
id = $1
`
@@ -9310,11 +9310,11 @@ func (q *sqlQuerier) UpdateProvisionerJobLogsLength(ctx context.Context, arg Upd
}
const updateProvisionerJobLogsOverflowed = `-- name: UpdateProvisionerJobLogsOverflowed :exec
UPDATE
provisioner_jobs
SET
logs_overflowed = $2
WHERE
id = $1
`
@@ -9834,6 +9834,7 @@ WHERE
AND (COALESCE(array_length($2::uuid[], 1), 0) = 0 OR pj.id = ANY($2::uuid[]))
AND (COALESCE(array_length($3::provisioner_job_status[], 1), 0) = 0 OR pj.job_status = ANY($3::provisioner_job_status[]))
AND ($4::tagset = 'null'::tagset OR provisioner_tagset_contains(pj.tags::tagset, $4::tagset))
AND ($5::uuid = '00000000-0000-0000-0000-000000000000'::uuid OR pj.initiator_id = $5::uuid)
GROUP BY
pj.id,
qp.queue_position,
@@ -9849,7 +9850,7 @@ GROUP BY
ORDER BY
pj.created_at DESC
LIMIT
$6::int
`
type GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerParams struct {
@@ -9857,6 +9858,7 @@ type GetProvisionerJobsByOrganizationAndStatusWithQueuePositionAndProvisionerPar
IDs []uuid.UUID `db:"ids" json:"ids"`
Status []ProvisionerJobStatus `db:"status" json:"status"`
Tags StringMap `db:"tags" json:"tags"`
InitiatorID uuid.UUID `db:"initiator_id" json:"initiator_id"`
Limit sql.NullInt32 `db:"limit" json:"limit"`
}
@@ -9881,6 +9883,7 @@ func (q *sqlQuerier) GetProvisionerJobsByOrganizationAndStatusWithQueuePositionA
pq.Array(arg.IDs),
pq.Array(arg.Status),
arg.Tags,
arg.InitiatorID,
arg.Limit,
)
if err != nil {
@@ -10373,7 +10376,7 @@ FROM
provisioner_keys
WHERE
organization_id = $1
AND
lower(name) = lower($2)
`
@@ -10489,10 +10492,10 @@ WHERE
AND
-- exclude reserved built-in key
id != '00000000-0000-0000-0000-000000000001'::uuid
AND
-- exclude reserved user-auth key
id != '00000000-0000-0000-0000-000000000002'::uuid
AND
-- exclude reserved psk key
id != '00000000-0000-0000-0000-000000000003'::uuid
`
@@ -12504,6 +12507,191 @@ func (q *sqlQuerier) UpsertTailnetTunnel(ctx context.Context, arg UpsertTailnetT
return i, err
}
const getTaskByID = `-- name: GetTaskByID :one
SELECT id, organization_id, owner_id, name, workspace_id, template_version_id, template_parameters, prompt, created_at, deleted_at, status, workspace_build_number, workspace_agent_id, workspace_app_id FROM tasks_with_status WHERE id = $1::uuid
`
func (q *sqlQuerier) GetTaskByID(ctx context.Context, id uuid.UUID) (Task, error) {
row := q.db.QueryRowContext(ctx, getTaskByID, id)
var i Task
err := row.Scan(
&i.ID,
&i.OrganizationID,
&i.OwnerID,
&i.Name,
&i.WorkspaceID,
&i.TemplateVersionID,
&i.TemplateParameters,
&i.Prompt,
&i.CreatedAt,
&i.DeletedAt,
&i.Status,
&i.WorkspaceBuildNumber,
&i.WorkspaceAgentID,
&i.WorkspaceAppID,
)
return i, err
}
const getTaskByWorkspaceID = `-- name: GetTaskByWorkspaceID :one
SELECT id, organization_id, owner_id, name, workspace_id, template_version_id, template_parameters, prompt, created_at, deleted_at, status, workspace_build_number, workspace_agent_id, workspace_app_id FROM tasks_with_status WHERE workspace_id = $1::uuid
`
func (q *sqlQuerier) GetTaskByWorkspaceID(ctx context.Context, workspaceID uuid.UUID) (Task, error) {
row := q.db.QueryRowContext(ctx, getTaskByWorkspaceID, workspaceID)
var i Task
err := row.Scan(
&i.ID,
&i.OrganizationID,
&i.OwnerID,
&i.Name,
&i.WorkspaceID,
&i.TemplateVersionID,
&i.TemplateParameters,
&i.Prompt,
&i.CreatedAt,
&i.DeletedAt,
&i.Status,
&i.WorkspaceBuildNumber,
&i.WorkspaceAgentID,
&i.WorkspaceAppID,
)
return i, err
}
const insertTask = `-- name: InsertTask :one
INSERT INTO tasks
(id, organization_id, owner_id, name, workspace_id, template_version_id, template_parameters, prompt, created_at)
VALUES
(gen_random_uuid(), $1, $2, $3, $4, $5, $6, $7, $8)
RETURNING id, organization_id, owner_id, name, workspace_id, template_version_id, template_parameters, prompt, created_at, deleted_at
`
type InsertTaskParams struct {
OrganizationID uuid.UUID `db:"organization_id" json:"organization_id"`
OwnerID uuid.UUID `db:"owner_id" json:"owner_id"`
Name string `db:"name" json:"name"`
WorkspaceID uuid.NullUUID `db:"workspace_id" json:"workspace_id"`
TemplateVersionID uuid.UUID `db:"template_version_id" json:"template_version_id"`
TemplateParameters json.RawMessage `db:"template_parameters" json:"template_parameters"`
Prompt string `db:"prompt" json:"prompt"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
}
func (q *sqlQuerier) InsertTask(ctx context.Context, arg InsertTaskParams) (TaskTable, error) {
row := q.db.QueryRowContext(ctx, insertTask,
arg.OrganizationID,
arg.OwnerID,
arg.Name,
arg.WorkspaceID,
arg.TemplateVersionID,
arg.TemplateParameters,
arg.Prompt,
arg.CreatedAt,
)
var i TaskTable
err := row.Scan(
&i.ID,
&i.OrganizationID,
&i.OwnerID,
&i.Name,
&i.WorkspaceID,
&i.TemplateVersionID,
&i.TemplateParameters,
&i.Prompt,
&i.CreatedAt,
&i.DeletedAt,
)
return i, err
}
const listTasks = `-- name: ListTasks :many
SELECT id, organization_id, owner_id, name, workspace_id, template_version_id, template_parameters, prompt, created_at, deleted_at, status, workspace_build_number, workspace_agent_id, workspace_app_id FROM tasks_with_status tws
WHERE tws.deleted_at IS NULL
AND CASE WHEN $1::UUID != '00000000-0000-0000-0000-000000000000' THEN tws.owner_id = $1::UUID ELSE TRUE END
AND CASE WHEN $2::UUID != '00000000-0000-0000-0000-000000000000' THEN tws.organization_id = $2::UUID ELSE TRUE END
ORDER BY tws.created_at DESC
`
type ListTasksParams struct {
OwnerID uuid.UUID `db:"owner_id" json:"owner_id"`
OrganizationID uuid.UUID `db:"organization_id" json:"organization_id"`
}
func (q *sqlQuerier) ListTasks(ctx context.Context, arg ListTasksParams) ([]Task, error) {
rows, err := q.db.QueryContext(ctx, listTasks, arg.OwnerID, arg.OrganizationID)
if err != nil {
return nil, err
}
defer rows.Close()
var items []Task
for rows.Next() {
var i Task
if err := rows.Scan(
&i.ID,
&i.OrganizationID,
&i.OwnerID,
&i.Name,
&i.WorkspaceID,
&i.TemplateVersionID,
&i.TemplateParameters,
&i.Prompt,
&i.CreatedAt,
&i.DeletedAt,
&i.Status,
&i.WorkspaceBuildNumber,
&i.WorkspaceAgentID,
&i.WorkspaceAppID,
); err != nil {
return nil, err
}
items = append(items, i)
}
if err := rows.Close(); err != nil {
return nil, err
}
if err := rows.Err(); err != nil {
return nil, err
}
return items, nil
}
const upsertTaskWorkspaceApp = `-- name: UpsertTaskWorkspaceApp :one
INSERT INTO task_workspace_apps
(task_id, workspace_build_number, workspace_agent_id, workspace_app_id)
VALUES
($1, $2, $3, $4)
ON CONFLICT (task_id, workspace_build_number)
DO UPDATE SET
workspace_agent_id = EXCLUDED.workspace_agent_id,
workspace_app_id = EXCLUDED.workspace_app_id
RETURNING task_id, workspace_agent_id, workspace_app_id, workspace_build_number
`
type UpsertTaskWorkspaceAppParams struct {
TaskID uuid.UUID `db:"task_id" json:"task_id"`
WorkspaceBuildNumber int32 `db:"workspace_build_number" json:"workspace_build_number"`
WorkspaceAgentID uuid.NullUUID `db:"workspace_agent_id" json:"workspace_agent_id"`
WorkspaceAppID uuid.NullUUID `db:"workspace_app_id" json:"workspace_app_id"`
}
func (q *sqlQuerier) UpsertTaskWorkspaceApp(ctx context.Context, arg UpsertTaskWorkspaceAppParams) (TaskWorkspaceApp, error) {
row := q.db.QueryRowContext(ctx, upsertTaskWorkspaceApp,
arg.TaskID,
arg.WorkspaceBuildNumber,
arg.WorkspaceAgentID,
arg.WorkspaceAppID,
)
var i TaskWorkspaceApp
err := row.Scan(
&i.TaskID,
&i.WorkspaceAgentID,
&i.WorkspaceAppID,
&i.WorkspaceBuildNumber,
)
return i, err
}
const getTelemetryItem = `-- name: GetTelemetryItem :one
SELECT key, value, created_at, updated_at FROM telemetry_items WHERE key = $1
`
@@ -14286,14 +14474,14 @@ DO $$
DECLARE
table_record record;
BEGIN
FOR table_record IN
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
AND table_type = 'BASE TABLE'
LOOP
EXECUTE format('ALTER TABLE %I.%I DISABLE TRIGGER ALL',
table_record.table_schema,
table_record.table_name);
END LOOP;
END;
@@ -18278,7 +18466,7 @@ WITH agent_stats AS (
coalesce((PERCENTILE_CONT(0.95) WITHIN GROUP (ORDER BY connection_median_latency_ms)), -1)::FLOAT AS workspace_connection_latency_95
FROM workspace_agent_stats
-- The greater than 0 is to support legacy agents that don't report connection_median_latency_ms.
WHERE workspace_agent_stats.created_at > $1 AND connection_median_latency_ms > 0
GROUP BY user_id, agent_id, workspace_id, template_id
), latest_agent_stats AS (
SELECT
@@ -113,7 +113,7 @@ WHERE
-- Filter by max age if provided
AND (
sqlc.narg('max_age_ms')::bigint IS NULL
OR pd.last_seen_at IS NULL
OR pd.last_seen_at >= (NOW() - (sqlc.narg('max_age_ms')::bigint || ' ms')::interval)
)
AND (
@@ -19,19 +19,19 @@ SELECT
unnest(@level :: log_level [ ]) AS LEVEL,
unnest(@stage :: VARCHAR(128) [ ]) AS stage,
unnest(@output :: VARCHAR(1024) [ ]) AS output RETURNING *;
-- name: UpdateProvisionerJobLogsOverflowed :exec
UPDATE
provisioner_jobs
SET
logs_overflowed = $2
WHERE
id = $1;
-- name: UpdateProvisionerJobLogsLength :exec
UPDATE
provisioner_jobs
SET
logs_length = logs_length + $2
WHERE
id = $1;
