Fix client-side crash when vLLM server is unavailable #16
+4,946 −7,231
The application crashes with "a client-side exception has occurred" when the vLLM backend is unreachable. Three issues cause this:
Changes
- `src/lib/token-counter.ts`: `getTokenLimit` now returns a default value (4096) on error instead of throwing unhandled exceptions
- `src/components/chat/chat-topbar.tsx`: verify `data.data` exists before calling `.map()` on API error responses
- `toast.error(error as string)` → `toast.error(error instanceof Error ? error.message : String(error))` — Error objects aren't renderable by React
- `src/components/chat/chat-bottombar.tsx`: add a `.catch()` handler for the `getTokenLimit` promise
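The defensive patterns behind these changes can be sketched as below. This is an illustrative sketch, not the actual diff: `getTokenLimit`'s parameterization, the `DEFAULT_TOKEN_LIMIT` constant, and the `ModelsResponse` shape are assumptions; only the names `getTokenLimit`, `data.data`, and the `toast.error` normalization come from the PR.

```typescript
// Assumed fallback constant; the PR only states the default is 4096.
const DEFAULT_TOKEN_LIMIT = 4096;

// token-counter.ts pattern: swallow backend errors and fall back to a
// default instead of letting the exception crash the client.
// (fetchLimit is a hypothetical injected fetcher, not the real signature.)
async function getTokenLimit(
  fetchLimit: () => Promise<number>
): Promise<number> {
  try {
    return await fetchLimit();
  } catch {
    return DEFAULT_TOKEN_LIMIT;
  }
}

// chat-topbar.tsx pattern: on an error response the API payload may lack
// `data`, so guard before calling `.map()`.
interface ModelsResponse {
  data?: { id: string }[];
}
function modelIds(res: ModelsResponse): string[] {
  return Array.isArray(res.data) ? res.data.map((m) => m.id) : [];
}

// Toast pattern: always pass a string, never a raw Error object,
// since React cannot render an Error directly.
function errorText(error: unknown): string {
  return error instanceof Error ? error.message : String(error);
}
```

With these in place, a caller such as the bottom bar can chain `.catch()` (or rely on the internal `try/catch`) so an unreachable vLLM server degrades to the 4096 default rather than an unhandled rejection.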