Simple Chatbot Example¶
A basic conversational bot demonstrating core dataknobs-bots functionality.
Overview¶
This example shows how to:
- Create a basic DynaBot from configuration
- Use Ollama as the LLM provider
- Set up in-memory conversation storage
- Have a simple conversation with the bot
Prerequisites¶
# Install Ollama: https://ollama.ai/
# Pull the required model
ollama pull gemma3:1b
# Install dataknobs-bots
pip install dataknobs-bots
Key Concepts¶
Configuration¶
The bot is configured entirely through a dictionary:
config = {
    "llm": {
        "provider": "ollama",      # Use Ollama for local inference
        "model": "gemma3:1b",      # Lightweight model
        "temperature": 0.7,
        "max_tokens": 500
    },
    "conversation_storage": {
        "backend": "memory"        # Store conversations in memory
    },
    "prompts": {
        "friendly_assistant": "You are a friendly and helpful AI assistant."
    },
    "system_prompt": {
        "name": "friendly_assistant"
    }
}
Bot Context¶
Each conversation needs a context that identifies:
- conversation_id - Unique ID for this conversation
- client_id - Tenant/application identifier
- user_id - User identifier (optional)
context = BotContext(
    conversation_id="simple-chat-001",
    client_id="example-client",
    user_id="demo-user"
)
Chatting¶
The chat() method sends a user message to the bot and returns the bot's reply.
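A minimal fragment, using the bot and context objects created above (it must run inside an async function, with a local Ollama server available):

```python
# Send one message and print the reply
response = await bot.chat(
    message="Hello! What can you help me with?",
    context=context,
)
print(f"Bot: {response}")
```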
Complete Code¶
01_simple_chatbot.py
"""Simple chatbot example.

This example demonstrates:
- Basic DynaBot configuration
- Using Ollama as the LLM provider
- In-memory conversation storage
- Simple message exchange

Required Ollama model:
    ollama pull gemma3:1b
"""

import asyncio

from dataknobs_bots import BotContext, DynaBot


async def main():
    """Run a simple chatbot conversation."""
    print("=" * 60)
    print("Simple Chatbot Example")
    print("=" * 60)
    print()
    print("This example shows a basic chatbot with no memory.")
    print("Required: ollama pull gemma3:1b")
    print()

    # Configuration for a simple chatbot
    config = {
        "llm": {
            "provider": "ollama",
            "model": "gemma3:1b",
            "temperature": 0.7,
            "max_tokens": 500,
        },
        "conversation_storage": {
            "backend": "memory",
        },
        "prompts": {
            "friendly_assistant": (
                "You are a friendly and helpful AI assistant. "
                "Keep your responses concise and clear."
            )
        },
        "system_prompt": {
            "name": "friendly_assistant",
        },
    }

    print("Creating bot from configuration...")
    bot = await DynaBot.from_config(config)
    print("✓ Bot created successfully")
    print()

    # Create context for this conversation
    context = BotContext(
        conversation_id="simple-chat-001",
        client_id="example-client",
        user_id="demo-user",
    )

    # Example conversation
    messages = [
        "Hello! What can you help me with?",
        "Tell me a fun fact about Python programming.",
        "That's interesting! What makes Python so popular?",
    ]

    for i, user_message in enumerate(messages, 1):
        print(f"User: {user_message}")
        response = await bot.chat(
            message=user_message,
            context=context,
        )
        print(f"Bot: {response}")
        print()

        # Add a small delay between messages
        if i < len(messages):
            await asyncio.sleep(1)

    print("=" * 60)
    print("Conversation complete!")
    print()
    print("Note: This bot has no memory between conversations.")
    print("Each new conversation starts fresh.")


if __name__ == "__main__":
    asyncio.run(main())
Running the Example¶
# Navigate to the bots package
cd packages/bots
# Run the example
python examples/01_simple_chatbot.py
Expected Output¶
============================================================
Simple Chatbot Example
============================================================
This example shows a basic chatbot with no memory.
Required: ollama pull gemma3:1b
Creating bot from configuration...
✓ Bot created successfully
User: Hello! What can you help me with?
Bot: Hi! I'm here to assist you with various tasks...
User: Tell me a fun fact about Python programming.
Bot: Python was named after Monty Python's Flying Circus...
User: That's interesting! What makes Python so popular?
Bot: Python's popularity comes from its simplicity...
============================================================
Conversation complete!
Note: This bot has no memory between conversations.
Each new conversation starts fresh.
What's Next?¶
This example has no memory - the bot doesn't remember previous messages in the conversation.
To add memory, see the Memory Chatbot Example.
Key Takeaways¶
- ✅ Configuration-First - Bot behavior defined entirely through configuration
- ✅ Async/Await - All operations are asynchronous
- ✅ Context Isolation - Each conversation has its own context
- ✅ Local LLMs - No API keys needed with Ollama
- ⚠️ No Memory - Bot doesn't remember conversation history
Customization¶
Change the Model¶
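Any model you have pulled with Ollama can be substituted in the llm block. For example (the model name below is illustrative; use whatever you have pulled locally):

```python
"llm": {
    "provider": "ollama",
    "model": "llama3.1:8b",  # assumption: pulled via `ollama pull llama3.1:8b`
    "temperature": 0.7,
    "max_tokens": 500
}
```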
Adjust Response Length¶
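Raise or lower max_tokens to control how long replies can be, and temperature to control how variable they are (values below are just a sketch):

```python
"llm": {
    "provider": "ollama",
    "model": "gemma3:1b",
    "temperature": 0.3,   # lower = more focused, deterministic replies
    "max_tokens": 1000    # allow longer responses
}
```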
Change System Prompt¶
Using a template name:
"prompts": {
    "custom_assistant": "You are an expert in Python programming."
},
"system_prompt": {
    "name": "custom_assistant"
}
Using inline content directly:
# Multi-line prompts can be specified directly without a prompts library
"system_prompt": """You are an expert Python programming assistant.

Key responsibilities:
- Help users write clean, idiomatic Python code
- Explain Python concepts clearly
- Suggest best practices and design patterns
"""
Or as a dict with content:
"system_prompt": {
    "content": "You are an expert in Python programming. Help users write clean code."
}
Related Examples¶
- Memory Chatbot - Add conversation memory
- RAG Chatbot - Add knowledge base
- ReAct Agent - Add tools and reasoning