Most Telegram bots feel robotic. They respond to commands, spit out formatted text, and forget you exist between messages. Here is how to build one that feels different.
The Key Ingredients
1. Persistent Memory
The single biggest difference between a chatbot and a companion is memory. Your bot needs to remember what the user told it yesterday.
The simplest approach: a markdown file per user that the AI reads and writes.
/data/users/{userId}/workspace/MEMORY.md
The AI reads this file at the start of every conversation and updates it when it learns something new. No vector database needed.
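The read/update cycle can be sketched in a few lines of Node. This is a minimal sketch, not the bot's actual code: the `loadMemory`/`saveMemory` helper names are mine, and the optional `root` parameter exists only so the path is configurable.

```typescript
// Per-user markdown memory, following the /data/users/{userId}/workspace/MEMORY.md
// layout from the article. Helper names are illustrative.
import * as fs from "node:fs";
import * as path from "node:path";

const DATA_ROOT = "/data/users";

function memoryPath(userId: string, root: string = DATA_ROOT): string {
  return path.join(root, userId, "workspace", "MEMORY.md");
}

// Read the user's memory file; an empty string means a brand-new user.
function loadMemory(userId: string, root: string = DATA_ROOT): string {
  const file = memoryPath(userId, root);
  return fs.existsSync(file) ? fs.readFileSync(file, "utf8") : "";
}

// Overwrite the memory file with whatever the model decided to keep.
function saveMemory(userId: string, contents: string, root: string = DATA_ROOT): void {
  const file = memoryPath(userId, root);
  fs.mkdirSync(path.dirname(file), { recursive: true });
  fs.writeFileSync(file, contents, "utf8");
}
```

Prepend the file's contents to the system prompt on every request, and give the model a tool (or a post-turn step) that calls `saveMemory` when it learns something worth keeping.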
2. Proactive Outreach
Real friends text you first sometimes. Set up a heartbeat system:
Every 60 minutes:
  For each active user:
    Ask the AI: "Should you check in on this person?"
    If yes: send a contextual message
    If no: do nothing
The AI decides based on memory. If someone mentioned an exam tomorrow, the AI might check in the next day.
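The loop above might look like this in practice. It's a sketch under assumptions: `askModel`, `loadMemory`, and the send function are hypothetical stand-ins for your LLM call, memory store, and Telegram client, passed in as parameters so the loop stays testable.

```typescript
// One heartbeat pass. The model is asked to decide, per user, whether a
// check-in makes sense right now, based only on that user's memory file.
type ModelCall = (prompt: string) => Promise<string>;

interface CheckInDecision {
  shouldSend: boolean;
  message?: string;
}

// Ask the model for a structured verdict. Prompt wording is illustrative.
async function decideCheckIn(memory: string, askModel: ModelCall): Promise<CheckInDecision> {
  const verdict = await askModel(
    `Here is what you remember about this person:\n${memory}\n\n` +
      `Should you check in on them right now? Reply with JSON only: ` +
      `{"shouldSend": true|false, "message": "..."}`
  );
  return JSON.parse(verdict) as CheckInDecision;
}

// Walk every active user; returns how many check-ins were sent.
async function heartbeat(
  users: string[],
  loadMemory: (id: string) => string,
  askModel: ModelCall,
  send: (id: string, text: string) => Promise<void>
): Promise<number> {
  let sent = 0;
  for (const userId of users) {
    const decision = await decideCheckIn(loadMemory(userId), askModel);
    if (decision.shouldSend && decision.message) {
      await send(userId, decision.message);
      sent++;
    }
  }
  return sent;
}

// Wire it to a timer: setInterval(() => heartbeat(...), 60 * 60 * 1000);
```

Keeping the decision in the model (rather than hardcoded rules) is what makes the check-ins feel contextual instead of scheduled.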
3. Natural Conversation Style
Your system prompt matters more than your model choice. Key rules:
- Multiple short messages, not walls of text
- No bullet points or numbered lists in chat
- Match the user's energy level
- Have opinions, do not just agree with everything
- Never admit to being an AI
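Encoded as a system prompt, those rules might look something like this. The wording here is illustrative, not the bot's exact prompt:

```typescript
// A system prompt capturing the style rules above. Tune the wording to taste;
// the constraints matter more than the phrasing.
const SYSTEM_PROMPT = `
You are texting with a friend on Telegram.

Style rules:
- Send several short messages instead of one long one.
- Never use bullet points or numbered lists.
- Mirror the other person's energy: be brief when they are brief.
- Have real opinions; push back when you disagree.
- Stay in character at all times.
`.trim();
```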
4. Per-User Isolation
If you serve multiple users, each one needs their own context. The cleanest approach: one Docker container per user.
User message -> Gateway -> User container -> AI response -> Gateway -> Telegram
Stopped containers use zero CPU. Only active conversations run.
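The gateway's routing step can be a thin wrapper around the Docker CLI. A minimal sketch, assuming containers follow a hypothetical `adola-<userId>` naming convention and an `agent-image` built from your agent framework:

```typescript
// Wake (or create) the container that owns this user's conversation.
// Stopped containers cost nothing; only users who are talking get a process.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

function containerName(userId: string): string {
  return `adola-${userId}`;
}

async function ensureContainer(userId: string): Promise<string> {
  const name = containerName(userId);
  try {
    // No-op if the container is already running.
    await run("docker", ["start", name]);
  } catch {
    // First message from this user: create the container, mounting their
    // workspace so MEMORY.md survives restarts.
    await run("docker", ["run", "-d", "--name", name,
      "-v", `/data/users/${userId}:/workspace`, "agent-image"]);
  }
  return name;
}
```

On each webhook, the gateway calls `ensureContainer(userId)`, forwards the message to that container, and relays the response back to Telegram.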
Try It Live
I built exactly this: t.me/adola2048_bot
It is free, needs no signup, and you can start chatting immediately. The AI will remember you across sessions, check in on you, and actually hold a conversation that does not feel like talking to a help desk.
Tech Stack
- Gateway: Node.js/Fastify handling webhooks and routing
- Per-user containers: OpenClaw agent framework
- Model: Gemini 2.5 Flash (fast and cheap)
- Database: PostgreSQL for user routing
- TLS: Caddy with self-signed cert
- Hosting: Single GCP e2-medium ($35/month)
The whole thing serves 8 users comfortably on that single instance. Most users are idle most of the time, so per-user containers make economic sense.