You can now transfer Marketplace resources between teams directly from the Vercel dashboard without relying on the API. This simplifies resource management during team or project changes. Both owner and member roles on the source and destination teams can initiate transfers.
The destination team must have the corresponding integration installed before receiving a resource. The feature currently supports transferring databases from Prisma, Neon, and Supabase, with additional providers and product support coming soon.
Start from your database settings in the dashboard, or learn more in the documentation.
The axios npm package was compromised in an active supply chain attack discovered on March 31, 2026. Vercel investigated this issue and implemented remediation actions to protect the platform. No Vercel systems were affected.
The npm registry removed the compromised package versions, and the latest tag now points to the safe axios@1.14.0 release.
We’ve blocked outgoing access from our build infrastructure to the Command & Control hostname sfrclak.com.
The malicious version of the package has been blocked and unpublished from npm.
Vercel’s own infrastructure and applications have been unaffected. We recommend checking your supply chain for exposure.
New Vercel projects will honor Cache-Control headers by default when proxying requests to external origins, starting April 6th.
Previously, responses served through rewrites to external origins were uncached by default, and enabling caching required the x-vercel-enable-rewrite-caching header in vercel.json. Now, Vercel's CDN automatically respects your origin's caching headers.
What's changing:
For new projects, Vercel will cache responses from external origins according to upstream Cache-Control, CDN-Cache-Control and Vercel-CDN-Cache-Control headers by default.
You can use Cache Tags (Vercel-Cache-Tag header) from your origins to purge cached content.
Existing projects can opt in to the new caching behavior from the project dashboard.
You can opt out of caching for specific request paths by setting the x-vercel-enable-rewrite-caching header to 0.
If you create a new project that proxies to external origins and expect responses to remain uncached, review your upstream cache headers before April 6th to ensure they reflect your intended caching strategy.
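As an illustration, the header names above can be sketched in a small helper. This is a hypothetical example, not part of any Vercel SDK: the `buildCacheHeaders` function is invented for illustration, and the specific directive values are one possible caching strategy among many.

```typescript
// Sketch: response headers an external origin might send so that Vercel's
// CDN caches the proxied response. Header names come from the changelog
// above; buildCacheHeaders itself is a hypothetical helper.
function buildCacheHeaders(
  maxAgeSeconds: number,
  tags: string[]
): Record<string, string> {
  return {
    // Browser caching: always revalidate with the CDN
    "Cache-Control": "public, max-age=0, must-revalidate",
    // CDN caching: kept for maxAgeSeconds at the edge
    "Vercel-CDN-Cache-Control": `max-age=${maxAgeSeconds}`,
    // Cache Tags: lets you purge this cached content selectively later
    "Vercel-Cache-Tag": tags.join(","),
  };
}

const headers = buildCacheHeaders(3600, ["products", "catalog"]);
console.log(headers["Vercel-CDN-Cache-Control"]); // "max-age=3600"
```

Splitting browser and CDN directives this way lets the edge hold content for an hour while browsers still revalidate on every request.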
The Vercel plugin now supports OpenAI Codex and the Codex CLI.
With the plugin, teams can access over 39 platform skills, three specialist agents, and real-time code validation to assist with their development workflow.
Install it in the Codex app or from the Codex CLI:
Vercel Sandboxes can now automatically save their filesystem state when stopped and restore it when resumed. This removes the need for manual snapshots, making it easier to run long-running, durable sandboxes that continue where you left off.
A sandbox is now the durable identity: a name, its filesystem state, and configuration options. A session is the compute tied to that state, created as needed.
Automatic persistence introduces orchestration that separates storage from compute, reducing the need for manual snapshotting, so:
when you stop a sandbox, the session shuts down but the filesystem is automatically snapshotted.
when you resume, a new session boots from that snapshot. This state storage is not charged, so you only pay while a session is running.
Persistence is enabled by default in the beta SDK and can be disabled between sessions with persistent: false. When disabled, the sandbox still exists after being stopped and can be resumed by its name, but each session starts with a clean filesystem.
If a sandbox is stopped and you run a command, the SDK will transparently create a new session, so you don't need to check state or restart manually.
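The sandbox/session split can be sketched with a simplified model. To be clear, this is not the @vercel/sandbox API; the `Sandbox` class below is invented purely to illustrate how a named sandbox (identity plus snapshotted state) outlives the sessions (compute) that run against it.

```typescript
// Simplified model of the sandbox/session separation described above.
// NOT the real @vercel/sandbox API — an illustration only.
type Snapshot = Map<string, string>; // filename -> contents

class Sandbox {
  private snapshot: Snapshot = new Map();
  private session: { files: Snapshot } | null = null;

  constructor(readonly name: string, readonly persistent = true) {}

  // Running a command transparently boots a session if none is active.
  run(write: [filename: string, contents: string]) {
    if (!this.session) {
      // Resume: the new session starts from the last snapshot,
      // or from a clean filesystem when persistence is disabled.
      this.session = { files: new Map(this.persistent ? this.snapshot : []) };
    }
    this.session.files.set(write[0], write[1]);
  }

  // Stopping shuts down the session; the filesystem is snapshotted
  // automatically (storage is kept, compute goes away).
  stop() {
    if (this.session && this.persistent) {
      this.snapshot = new Map(this.session.files);
    }
    this.session = null;
  }

  read(file: string): string | undefined {
    return this.session?.files.get(file);
  }
}

const sb = new Sandbox("user-alice");
sb.run(["notes.txt", "hello"]);
sb.stop();                      // session gone, state snapshotted
sb.run(["other.txt", "world"]); // new session boots from the snapshot
console.log(sb.read("notes.txt")); // "hello" — state survived the stop
```

The point of the model: callers never manage sessions directly; they address the sandbox by name and the orchestration layer decides when compute exists.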
The beta CLI adds configuration management and session inspection:
# Spin up a sandbox for a user
sandbox create --name user-alice
# User runs commands — if the sandbox timed out, it resumes automatically
sandbox run --name user-alice -- npm test
# Check what happened across sessions
sandbox sessions list user-alice
# Tune resources without recreating
sandbox config vcpus user-alice 4
sandbox config timeout user-alice 5h
This feature is in beta and requires upgrading to the beta SDK and CLI packages.
Install the beta packages to try persistent sandboxes today: pnpm install @vercel/sandbox@beta for the SDK, and pnpm install -g sandbox@beta for the CLI.
Persistent sandboxes are available in beta on all plans.
Vercel Sandboxes created with the latest beta package will now have a unique, customizable name within your project, replacing the previous ID-based identification. Names make sandboxes easy to find and reference:
The beta CLI adds configuration management and session inspection:
# Spin up a sandbox for a user
sandbox create --name user-alice
# User runs commands — if the sandbox timed out, it resumes automatically
sandbox run --name user-alice -- npm test
# Check what happened across sessions
sandbox sessions list user-alice
Named sandboxes are the mechanism behind automatic persistence: the name identifies a sandbox both when it is created and when it is resumed.
Install the beta packages to try named sandboxes today: pnpm install @vercel/sandbox@beta for the SDK, and pnpm install -g sandbox@beta for the CLI.
Chat SDK now lets you control what happens when a new message arrives before a previous one finishes processing, with the new concurrency option for the Chat class.
const bot = new Chat({
  concurrency: {
    strategy: "queue",
    maxQueueSize: 20,
    onQueueFull: "drop-oldest",
    queueEntryTtlMs: 60_000,
  },
  // ...
});
Multiple options are supported to customize your concurrency strategy.
Four strategies are available:
drop (default): discards messages that arrive while a previous one is still processing
queue: holds incoming messages and processes them after the current handler finishes
debounce: waits for a pause in conversation, processes only the final message
concurrent: processes every message immediately, no locking
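To make the difference between the strategies concrete, here is a minimal sketch of a concurrency gate, independent of the Chat SDK. The `ConcurrencyGate` class and its methods are hypothetical, invented only to show what "drop" and "queue" mean for messages that arrive while a handler is busy.

```typescript
// Sketch of "drop" vs "queue" semantics. A handler processes one message
// at a time; the strategy decides what happens to messages that arrive
// while it is busy. NOT the Chat SDK's implementation.
type Strategy = "drop" | "queue";

class ConcurrencyGate {
  private busy = false;
  private queue: string[] = [];
  readonly processed: string[] = [];

  constructor(private strategy: Strategy, private maxQueueSize = 20) {}

  receive(msg: string) {
    if (!this.busy) {
      this.process(msg);
    } else if (this.strategy === "queue" && this.queue.length < this.maxQueueSize) {
      this.queue.push(msg); // held until the current handler finishes
    }
    // "drop": the message is silently discarded while busy
  }

  private process(msg: string) {
    this.busy = true;
    this.processed.push(msg); // stand-in for the real message handler
  }

  // Called when the current handler completes.
  finish() {
    this.busy = false;
    const next = this.queue.shift();
    if (next !== undefined) this.process(next);
  }
}

const gate = new ConcurrencyGate("drop");
gate.receive("first");  // starts processing
gate.receive("second"); // dropped: handler still busy
gate.finish();
console.log(gate.processed); // ["first"]
```

With `"queue"` instead of `"drop"`, the second message would be held and processed after `finish()`, which is why the queue strategy pairs with options like maxQueueSize and onQueueFull.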