
AI Chat Template

AI chat application with Vertex AI (Gemini), streaming responses, conversation history, and Vue 3 frontend.

```sh
# Create project
stacksolo init --template ai-chat

# Install dependencies
cd my-chat
npm install

# Start development
stacksolo dev
```
Frontend:

  • Firebase SDK for authentication (email/password, Google)
  • Pinia store for chat state with streaming support
  • Vue Router with protected routes
  • Tailwind CSS styling
  • Real-time streaming responses via SSE
  • Conversation sidebar with history
  • Markdown rendering in messages
  • Auto-scrolling chat interface

Backend:

  • Express API on Cloud Functions
  • Vertex AI Gemini integration
  • Server-Sent Events (SSE) for streaming
  • Firestore for conversation history
  • Message history context for multi-turn conversations
```
├── apps/web/                        # Vue 3 frontend
│   └── src/
│       ├── components/
│       │   ├── ChatMessage.vue      # Message bubble with markdown
│       │   ├── ChatInput.vue        # Input with send button
│       │   └── ConversationList.vue # Sidebar
│       ├── stores/
│       │   ├── auth.ts              # Firebase auth store
│       │   └── chat.ts              # Chat state with streaming
│       ├── pages/
│       │   ├── Chat.vue             # Main chat interface
│       │   └── Login.vue
│       ├── router/index.ts
│       └── lib/
│           ├── firebase.ts
│           └── api.ts               # SSE streaming client
├── functions/api/                   # Express API
│   └── src/
│       ├── services/
│       │   ├── gemini.service.ts    # Vertex AI streaming
│       │   └── firestore.service.ts # Conversation storage
│       ├── routes/
│       │   ├── chat.ts              # SSE streaming endpoint
│       │   └── conversations.ts     # History CRUD
│       └── index.ts
└── stacksolo.config.json
```
| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| GET | /api/health | No | Health check |
| POST | /api/chat | Yes | Stream chat response (SSE) |
| GET | /api/conversations | Yes | List conversations |
| GET | /api/conversations/:id | Yes | Get conversation with messages |
| DELETE | /api/conversations/:id | Yes | Delete conversation |

The chat uses Server-Sent Events (SSE) for real-time streaming:

```typescript
router.post('/chat', kernel.authMiddleware(), async (req, res) => {
  // SSE headers: keep the connection open and disable response buffering
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  // `messages` is the new message plus stored history; `conversationId` comes from req.body
  for await (const chunk of geminiService.streamChat(messages)) {
    res.write(`data: ${JSON.stringify({ type: 'chunk', content: chunk })}\n\n`);
  }
  res.write(`data: ${JSON.stringify({ type: 'done', conversationId })}\n\n`);
  res.end();
});
```
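Each event above is written in the SSE wire format: a `data:` line terminated by a blank line. A tiny helper (hypothetical, not part of the template) makes the framing explicit:

```typescript
// Serialize one event as a Server-Sent Events data frame.
// SSE frames are `data: <payload>` followed by a blank line ("\n\n").
export function sseFrame(payload: unknown): string {
  return `data: ${JSON.stringify(payload)}\n\n`;
}
```

The server could then call `res.write(sseFrame({ type: 'chunk', content: chunk }))` instead of building the string inline.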
The client-side streaming reader in lib/api.ts consumes these frames:

```typescript
export async function* streamChat(message: string, conversationId?: string) {
  const response = await fetch('/api/chat', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ message, conversationId }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    // Stream-decode and buffer: an SSE frame can be split across network chunks
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep the (possibly partial) last line for the next read

    for (const line of lines) {
      if (line.startsWith('data: ')) {
        yield JSON.parse(line.slice(6));
      }
    }
  }
}
```
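In the Pinia chat store, the stream can be folded into state one event at a time. A minimal sketch, assuming the event shapes emitted by the endpoint above (`applyEvent` and `ChatState` are hypothetical names, kept pure so they are testable without a server):

```typescript
// Events as emitted by the /api/chat SSE endpoint.
type StreamEvent =
  | { type: 'chunk'; content: string }
  | { type: 'done'; conversationId: string };

// Minimal slice of the chat store's state (hypothetical shape).
interface ChatState {
  reply: string;
  conversationId?: string;
  streaming: boolean;
}

// Pure reducer: append chunks as they arrive, finalize on 'done'.
export function applyEvent(state: ChatState, evt: StreamEvent): ChatState {
  switch (evt.type) {
    case 'chunk':
      return { ...state, reply: state.reply + evt.content };
    case 'done':
      return { ...state, conversationId: evt.conversationId, streaming: false };
  }
}

// Usage in a store action (sketch):
// let state: ChatState = { reply: '', streaming: true };
// for await (const evt of streamChat(text, state.conversationId)) {
//   state = applyEvent(state, evt);
// }
```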

Update services/gemini.service.ts:

```typescript
// In the Vertex AI Node SDK, systemInstruction is passed to getGenerativeModel,
// not to startChat.
const model = vertexAI.getGenerativeModel({
  model: 'gemini-1.5-pro', // or gemini-1.5-flash for faster responses
  systemInstruction: {
    role: 'system',
    parts: [{ text: 'You are a helpful assistant specialized in...' }],
  },
});

const chat = model.startChat({ history });
```
  1. Add vector store (pgvector or Vertex AI Vector Search)
  2. Before sending to Gemini, retrieve relevant documents
  3. Include context in the system prompt
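The steps above can be sketched as follows. The vector-store call is an assumption (neither pgvector nor Vertex AI Vector Search wiring is part of the template); the testable part is the pure helper that folds retrieved passages into the system prompt (step 3):

```typescript
// Hypothetical shape of a retrieval result from the vector store.
interface RetrievedDoc {
  text: string;
  score: number;
}

// Pure helper: merge the top-scoring passages into the base system prompt.
export function buildSystemPrompt(
  base: string,
  docs: RetrievedDoc[],
  maxDocs = 3
): string {
  const context = [...docs]
    .sort((a, b) => b.score - a.score) // highest relevance first
    .slice(0, maxDocs)
    .map((d, i) => `[${i + 1}] ${d.text}`)
    .join('\n');
  return context
    ? `${base}\n\nUse the following context when relevant:\n${context}`
    : base;
}

// In gemini.service.ts (sketch; `vectorStore.search` is a placeholder API):
// const docs = await vectorStore.search(queryEmbedding, 5);
// const systemText = buildSystemPrompt(basePrompt, docs);
```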

For local development, create .env.local:

```sh
VITE_FIREBASE_API_KEY=your-api-key
VITE_FIREBASE_AUTH_DOMAIN=your-project.firebaseapp.com
VITE_FIREBASE_PROJECT_ID=your-project-id
```
```sh
stacksolo deploy
```

This creates:

  • Cloud Functions API with Vertex AI access
  • Firestore database for conversations
  • Cloud Storage for frontend
  • Load balancer with SSL