A companion example project demonstrating best practices for building agentic AI systems using Typhoon 2.5 and LangGraph. This project showcases patterns for creating intelligent IT support assistants with real-time streaming, tool usage, and multi-turn conversations.
Please refer to this companion blog post for more details: 🇬🇧 Mastering Agentic Workflows: 20 Principles That Work | 🇹🇭 20 Principles for Writing Agentic Workflows Effectively (Thai version)
Note: This is an educational reference implementation designed to teach best practices. It is not intended for production use without significant customization and hardening.
- Agentic Workflows: Implement Think → Act → Observe patterns with LangGraph
- Typhoon Integration: Use Thai-English bilingual AI (Typhoon 2.5) for IT support
- Tool Orchestration: Document search, ticket management, and utility tools
- Event System: Real-time streaming with Server-Sent Events (SSE); see the sketch after this list
- Memory Management: Session-based conversations with persistent state
- Production Patterns: Best practices for error handling, testing, and configuration
- Demo Mode: Simulated logged-in experience with pre-configured Thai employee profile
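To make the Event System item above concrete, here is a minimal sketch of token streaming over SSE with FastAPI. The endpoint path, payload shape, and the fake token generator are illustrative assumptions, not the project's actual API.

```python
# Hypothetical sketch: streaming tokens to the browser as Server-Sent Events.
# The endpoint path, event payloads, and fake_token_stream are illustrative only.
import asyncio
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


async def fake_token_stream(message: str):
    """Stand-in for the agent's token stream."""
    for token in ["Hello", " from", " the", " agent"]:
        await asyncio.sleep(0.05)  # simulate generation latency
        yield token


@app.post("/chat/stream")
async def chat_stream(payload: dict):
    async def event_source():
        async for token in fake_token_stream(payload.get("message", "")):
            # Each SSE event is a "data: ..." line followed by a blank line.
            yield f"data: {json.dumps({'type': 'token', 'content': token})}\n\n"
        yield "data: " + json.dumps({"type": "done"}) + "\n\n"

    return StreamingResponse(event_source(), media_type="text/event-stream")
```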
This companion project demonstrates best practices for building agentic AI systems:
- ✅ Real-world Architecture: Not just a toy example; it shows proper state management, error handling, and testing patterns
- ✅ Bilingual Support: Demonstrates Thai-English handling with culturally appropriate responses
- ✅ Streaming First: Modern UX with token-by-token streaming, not blocking requests
- ✅ Tool Integration: Shows how to give agents real capabilities (search, tickets, etc.)
- ✅ Production Patterns: Demonstrates checkpointing, session management, monitoring, and testing approaches
- ✅ Extensible Design: Easy to understand, modify, and adapt to your domain
Perfect for developers learning about:
- Multi-agent systems with LangGraph
- Thai language AI applications
- Building IT support automation
- Streaming chat interfaces
- Agentic workflow patterns
The system follows a modern three-tier architecture with event-driven streaming:
```
┌───────────────────┐       ┌───────────────────┐       ┌───────────────────┐
│    Next.js UI     │ ────► │  FastAPI Server   │ ────► │     LangGraph     │
│    (Port 3000)    │ HTTP  │   (Port 8000)     │       │  Agent Workflow   │
│  - Streaming      │       │  - Streaming SSE  │       │  - Typhoon 2.5    │
│  - Markdown       │       │  - Session Mgmt   │       │  - Tools          │
│  - User Context   │       │  - Event System   │       │  - State Mgmt     │
└───────────────────┘       └───────────────────┘       └───────────────────┘
```
Flow: User input → Backend API → LangGraph workflow → Agent (Think) → Tools (Act) → Response streaming → Frontend display
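The Think → Act → Observe loop above maps naturally onto a LangGraph `StateGraph`. The sketch below is a simplified stand-in, not the project's exact graph: the node names, iteration guard, and routing function are assumptions for illustration.

```python
# Minimal sketch of a Think → Act → Observe loop in LangGraph.
# Node names and the routing logic are illustrative, not the project's exact graph.
from typing import Annotated, TypedDict

from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages


class AgentState(TypedDict):
    messages: Annotated[list, add_messages]  # conversation history
    iterations: int                          # loop guard


def think(state: AgentState) -> dict:
    # Normally: call Typhoon 2.5 with tools bound and append its response.
    return {"iterations": state["iterations"] + 1}


def act(state: AgentState) -> dict:
    # Normally: execute the tool calls the model requested.
    return {}


def should_act(state: AgentState) -> str:
    # Route to tools while the model keeps requesting them, up to a limit.
    return "act" if state["iterations"] < 10 else END


builder = StateGraph(AgentState)
builder.add_node("think", think)
builder.add_node("act", act)
builder.add_edge(START, "think")
builder.add_conditional_edges("think", should_act, {"act": "act", END: END})
builder.add_edge("act", "think")  # observe the tool result, then think again
graph = builder.compile()
```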
- Python 3.12+
- Node.js 18+
- pnpm
- Typhoon API key from opentyphoon.ai
- Visit https://opentyphoon.ai
- Sign up and generate your API key
- Keep it for the next step
Create a `.env` file in the `agentic-workflow` directory:

```bash
cd agentic-workflow
```

Add the following content to `.env`:

```
TYPHOON_API_KEY=your_actual_key_here
TYPHOON_BASE_URL=https://api.opentyphoon.ai/v1
TYPHOON_MODEL=typhoon-v2.5-30b-a3b-instruct
TEMPERATURE=0.7
MAX_TOKENS=1024
MAX_ITERATIONS=10
```

Backend:
```bash
cd agentic-workflow
uv pip install -e ".[dev]"
```

Frontend:

```bash
cd ../frontend
pnpm install
```

The document search requires a vector store. Initialize it with:

```bash
cd ../agentic-workflow
python scripts/init_vector_store.py
```

This will load IT policy documents and create FAISS embeddings.
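For reference, a vector-store initialization script along these lines could look like the sketch below. The document path, embedding model, and use of LangChain's FAISS wrapper are assumptions; check scripts/init_vector_store.py for what the project actually does.

```python
# Hypothetical sketch of what a vector-store init script could do.
# Document path, embedding model, and the LangChain FAISS wrapper are assumptions,
# not necessarily what scripts/init_vector_store.py actually uses.
from langchain_community.document_loaders import TextLoader
from langchain_community.vectorstores import FAISS
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

docs = TextLoader("data/it_policies.txt", encoding="utf-8").load()  # hypothetical path
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

embeddings = HuggingFaceEmbeddings(model_name="intfloat/multilingual-e5-small")  # assumed model
store = FAISS.from_documents(chunks, embeddings)
store.save_local("vector_store")  # the directory removed/rebuilt in the troubleshooting section
```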
Terminal 1 - Backend:
```bash
./start-backend.sh
# Wait for: "Uvicorn running on http://0.0.0.0:8000"
```

Terminal 2 - Frontend:

```bash
./start-frontend.sh
# Wait for: "Ready in 2-3s"
```

Go to: http://localhost:3000
Demo Mode: The application simulates a logged-in employee experience. You'll be automatically logged in as:
- Name: Somchai Phimsawat
- Position: Digital Marketing Specialist
- Company: BlueWave Technology Co., Ltd.
- Department: Marketing & Communications
See the profile information in the top-right corner!
Try asking (or send a prompt programmatically, as sketched after this list):
- "I need to use Canva Pro for a campaign" (originally a Thai prompt; requesting marketing tools)
- "Please help fix my slow computer" (originally a Thai prompt; a technical issue)
- "I need help resetting my password"
- "Create a ticket to request new equipment" (originally a Thai prompt)
- Getting Started - Detailed setup and configuration
- Architecture - System design and components
- Extending - How to add tools and features
- Best Practices - Agentic system principles
- API Reference - REST endpoints and usage
- Think: Agent decides what to do next
- Act: Execute tools (search docs, manage tickets)
- Observe: Process results and respond
- Token-by-token streaming responses
- Server-Sent Events (SSE) for real-time updates
- Document search using FAISS vector store
- Ticket management (create, update, search, comment)
- Utility functions (time, formatting)
- Session-based conversations
- LangGraph checkpointers for context (see the sketch after this list)
- Multi-turn dialogue support
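A minimal sketch of how session-based memory works with a LangGraph checkpointer (the "memory" option). The node and state below are stand-ins; the key idea is that reusing the same thread_id carries context across turns.

```python
# Sketch of session-based memory with a LangGraph checkpointer.
# The node and state are minimal stand-ins; thread_id acts as the chat session id.
from typing import Annotated, TypedDict

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages


class ChatState(TypedDict):
    messages: Annotated[list, add_messages]


def respond(state: ChatState) -> dict:
    # Stand-in for the real agent turn; the real node would call Typhoon here.
    return {"messages": [("assistant", f"Got {len(state['messages'])} message(s) so far")]}


builder = StateGraph(ChatState)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)
graph = builder.compile(checkpointer=MemorySaver())

config = {"configurable": {"thread_id": "session-123"}}  # one thread per chat session
graph.invoke({"messages": [("user", "My laptop is slow")]}, config)
out = graph.invoke({"messages": [("user", "It started yesterday")]}, config)
# Both turns share the thread_id, so the second call sees four messages, not two.
print(len(out["messages"]))
```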
```
typhoon-it-support/
├── agentic-workflow/              # Backend - LangGraph + FastAPI
│   ├── src/typhoon_it_support/
│   │   ├── agents/                # Agent nodes (think, act, observe)
│   │   ├── api/                   # FastAPI server with streaming
│   │   ├── config/                # Configuration management
│   │   ├── graph/                 # LangGraph workflow
│   │   ├── tools/                 # Agent tools
│   │   └── prompts/               # System prompts
│   ├── tests/                     # Comprehensive tests
│   └── docs/                      # Technical documentation
│
├── frontend/                      # Next.js + React
│   └── app/components/
│       └── Chat.tsx               # Streaming chat UI
│
├── docs/                          # Developer documentation
└── README.md                      # This file
```
Run the comprehensive test suite:

```bash
cd agentic-workflow

# Run all tests
uv run pytest -v

# With coverage report
uv run pytest --cov=src --cov-report=html

# Test specific module
uv run pytest tests/test_api.py -v

# Test with verbose output
uv run pytest -vv

# Run fast tests only (exclude slow integration tests)
uv run pytest -m "not slow"
```
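As an illustration of the kind of test the suite contains, here is a hypothetical API test using FastAPI's TestClient. The import path and endpoints are assumptions; adapt them to the real modules under tests/.

```python
# Hypothetical example of an API test in the style the suite might use.
# The import path and the /health and /chat endpoints are assumptions.
from fastapi.testclient import TestClient

from typhoon_it_support.api.server import app  # assumed module path


def test_health_endpoint_returns_ok():
    client = TestClient(app)
    response = client.get("/health")
    assert response.status_code == 200


def test_chat_requires_message():
    client = TestClient(app)
    response = client.post("/chat", json={})  # missing "message" should be rejected
    assert response.status_code in (400, 422)
```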
View coverage report:

```bash
open htmlcov/index.html      # macOS
# or
xdg-open htmlcov/index.html  # Linux
```

Edit `agentic-workflow/.env`:
```
# Required
TYPHOON_API_KEY=your_api_key_here

# API Configuration
TYPHOON_BASE_URL=https://api.opentyphoon.ai/v1
TYPHOON_MODEL=typhoon-v2.5-30b-a3b-instruct

# Agent Settings
TEMPERATURE=0.7        # Response creativity (0.0-1.0)
MAX_TOKENS=1024        # Maximum response length
MAX_ITERATIONS=10      # Max workflow iterations

# Checkpointer (memory management)
CHECKPOINTER_TYPE=memory                   # "memory" or "sqlite"
SQLITE_CHECKPOINT_PATH=./checkpoints.db
```

- Temperature: Controls response creativity (0.0 = deterministic, 1.0 = creative)
- Max Tokens: Maximum length of generated responses
- Max Iterations: Maximum workflow loops before stopping
- Checkpointer: Use "memory" for development, "sqlite" for production
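One common way to load these settings is pydantic-settings; the sketch below mirrors the variables above, but it is an assumption and the project's config/ module may be organized differently.

```python
# Sketch of loading these settings from .env, assuming pydantic-settings.
# Field names mirror the variables above; the project's config/ module may differ.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    typhoon_api_key: str
    typhoon_base_url: str = "https://api.opentyphoon.ai/v1"
    typhoon_model: str = "typhoon-v2.5-30b-a3b-instruct"
    temperature: float = 0.7
    max_tokens: int = 1024
    max_iterations: int = 10
    checkpointer_type: str = "memory"  # "memory" or "sqlite"
    sqlite_checkpoint_path: str = "./checkpoints.db"


settings = Settings()  # reads .env, raises if TYPHOON_API_KEY is missing
```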
To add a new tool, define a function with a clear docstring:

```python
# In src/typhoon_it_support/tools/your_tool.py
def your_tool(param: str) -> str:
    """Brief description of what the tool does.

    Args:
        param: Description of the parameter.

    Returns:
        Description of the return value.
    """
    result = f"Processed: {param}"  # replace with the tool's real logic
    return result
```

See docs/EXTENDING.md for complete examples.
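Once defined, a tool has to be exposed to the agent. A typical pattern with LangChain is the @tool decorator plus bind_tools on an OpenAI-compatible client pointed at Typhoon; this is a hedged sketch, and the project's actual registration mechanism is described in docs/EXTENDING.md.

```python
# Hypothetical sketch of wiring a new tool into the agent.
# Assumes LangChain's @tool decorator and an OpenAI-compatible client for Typhoon;
# the project's actual registration may differ (see docs/EXTENDING.md).
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def your_tool(param: str) -> str:
    """Brief description of what the tool does."""
    return f"Processed: {param}"


llm = ChatOpenAI(
    model="typhoon-v2.5-30b-a3b-instruct",
    base_url="https://api.opentyphoon.ai/v1",
    api_key="your_api_key_here",
)
llm_with_tools = llm.bind_tools([your_tool])  # the Think step can now request your_tool
```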
| Component | Technology |
|---|---|
| AI Model | Typhoon 2.5 (30B) - Thai LLM |
| Workflow Engine | LangGraph - Agentic workflow orchestration |
| Backend Framework | FastAPI - Modern async Python web framework |
| Frontend | Next.js 16 (App Router) + React 19 |
| Language | Python 3.12+ / TypeScript |
| Styling | Tailwind CSS 4 - Utility-first CSS |
| Vector Store | FAISS - Semantic document search |
| Testing | pytest with coverage |
| Package Management | uv (Python), pnpm (Node.js) |
This is an educational reference implementation for learning and experimentation. You're encouraged to:
- Fork and Learn: Study the code to understand agentic workflow patterns
- Customize: Adapt this project as a starting point for your own applications
- Experiment: Try adding new tools, agents, or capabilities
- Share: Contribute improvements, examples, or documentation
Important: This project demonstrates best practices but requires additional security, scalability, and reliability work before being deployed in production environments.
Port 8000 already in use:
```bash
lsof -ti:8000 | xargs kill -9
```

API Key not working:
- Verify your API key at https://opentyphoon.ai
- Check that `.env` has `TYPHOON_API_KEY=your_key`
- Ensure no extra spaces or quotes around the key
Vector store errors:
```bash
cd agentic-workflow
rm -rf vector_store/
python scripts/init_vector_store.py
```

Port 3000 already in use:
```bash
lsof -ti:3000 | xargs kill -9
```

Connection refused errors:
- Ensure backend is running on port 8000
- Check `NEXT_PUBLIC_API_URL` in the frontend environment
- Verify CORS settings in the backend (a typical setup is sketched below)
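For reference, a typical FastAPI CORS setup looks like the sketch below; the actual origins and options in the project's api module may differ.

```python
# What the backend's CORS setup typically looks like in FastAPI; the actual
# origins and options in the project's api module may differ.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],  # the Next.js dev server
    allow_methods=["*"],
    allow_headers=["*"],
)
```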
Module not found errors:
```bash
cd frontend
rm -rf node_modules .next
pnpm install
```

Dependencies not installing:
- Update uv: `pip install -U uv`
- Update pnpm: `pnpm add -g pnpm`
- Check Python version: `python --version` (need 3.12+)
- Check Node version: `node --version` (need 18+)
Logs location:
- Backend logs: `logs/backend.log`
- Frontend logs: `logs/frontend.log`
- Check these for detailed error messages
For more help, see the troubleshooting section in docs/GETTING_STARTED.md.
This project is designed as a hands-on companion for learning agentic AI development:
- Project Documentation: docs/ - Detailed guides and architecture explanations
- Typhoon AI: https://opentyphoon.ai - Thai-English LLM platform
- LangGraph: https://langchain-ai.github.io/langgraph/ - Agent workflow framework
- Issues: Report bugs or questions via GitHub Issues
- Discussions: Share your learnings and implementations
- Fork: Use this as a starting point for your own projects
🌪️ Powered by Typhoon 2.5 - Thai Language AI by SCB 10X
Built with ❤️ using LangGraph, FastAPI, and Next.js