🧠 Ollama AI Daily Toolkit — Python + Streamlit GUI Project
This tutorial shows how to build a Streamlit web app powered by local Ollama models.
Each AI use case runs as a separate module inside the ai_utils package (the folder is named ai_utils to avoid import conflicts with other packages).
💡 Project Structure
ai_daily_toolkit/
├── app.py
├── ai_utils/
│ ├── __init__.py
│ ├── ollama_client.py
│ ├── planner.py
│ ├── optimizer.py
│ ├── translator.py
│ ├── budgeter.py
│ └── studycoach.py
└── requirements.txt
📦 requirements.txt
streamlit
ollama
pandas
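Install the dependencies from the project root (standard pip usage, assuming a working Python environment):
pip install -r requirements.txt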
⚙️ ai_utils/ollama_client.py
A small utility that sends a prompt to an Ollama model and returns the response text.
import ollama


def query_ollama(model: str, prompt: str) -> str:
    """
    Send the prompt to the specified Ollama model and return the response text.
    """
    try:
        response = ollama.chat(model=model, messages=[{"role": "user", "content": prompt}])
        return response['message']['content']
    except Exception as e:
        return f"⚠️ Ollama error: {e}"
app.py — Streamlit Main App
import streamlit as st
from ai_utils import planner, optimizer, translator, budgeter, studycoach

st.set_page_config(page_title="Ollama AI Daily Toolkit", page_icon="🧠", layout="wide")
st.title("🧠 Ollama AI Daily Toolkit")

st.sidebar.title("Select an AI Tool")
tools = ["Smart Planner", "Home Optimizer", "Travel Buddy", "Budget Assistant", "Study Coach"]
choice = st.sidebar.radio("Choose an AI tool:", tools)

if choice == "Smart Planner":
    planner.run_planner()
elif choice == "Home Optimizer":
    optimizer.run_optimizer()
elif choice == "Travel Buddy":
    translator.run_translator()
elif choice == "Budget Assistant":
    budgeter.run_budgeter()
else:
    studycoach.run_studycoach()
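The if/elif chain above works fine; an equally valid option is a dispatch dictionary, which keeps the sidebar labels and their handlers in one place. The sketch below is an optional refactor of the selection block in app.py, with the same behavior:
# Alternative dispatch inside app.py: map each tool label to its run function.
TOOLS = {
    "Smart Planner": planner.run_planner,
    "Home Optimizer": optimizer.run_optimizer,
    "Travel Buddy": translator.run_translator,
    "Budget Assistant": budgeter.run_budgeter,
    "Study Coach": studycoach.run_studycoach,
}

choice = st.sidebar.radio("Choose an AI tool:", list(TOOLS))
TOOLS[choice]()  # call the selected tool's run function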
ai_utils/planner.py — Smart Daily Planner
import streamlit as st
from .ollama_client import query_ollama


def run_planner():
    st.subheader("Smart Personal Assistant & Daily Planner")
    user_prompt = st.text_area("Enter your daily goals or tasks:")
    if st.button("Generate Plan"):
        full_prompt = f"Plan a productive day with the following info:\n{user_prompt}"
        output = query_ollama("mistral", full_prompt)
        st.markdown("### AI Suggested Plan")
        st.write(output)
⚡ ai_utils/optimizer.py — Home Energy Optimizer
import streamlit as st
from .ollama_client import query_ollama


def run_optimizer():
    st.subheader("Home Energy Optimizer")
    usage_data = st.text_area("Paste last 7 days electricity usage:")
    if st.button("Optimize Power Usage"):
        prompt = f"Analyze this usage data and suggest energy saving schedule:\n{usage_data}"
        output = query_ollama("phi3", prompt)
        st.markdown("### ⚙️ Optimization Plan")
        st.write(output)
ai_utils/translator.py — Multilingual Travel Buddy
import streamlit as st
from .ollama_client import query_ollama


def run_translator():
    st.subheader("Multilingual Translator & Travel Buddy")
    phrase = st.text_input("Enter sentence to translate:")
    lang = st.selectbox("Target language:", ["Japanese", "Spanish", "French"])
    if st.button("Translate"):
        prompt = f"Translate to {lang}: '{phrase}'. Also explain if it's polite."
        output = query_ollama("mistral", prompt)
        st.markdown("### 💬 Translation")
        st.write(output)
💰 ai_utils/budgeter.py — Smart Budget Assistant
import streamlit as st
from .ollama_client import query_ollama


def run_budgeter():
    st.subheader("💰 Smart Budget & Shopping Assistant")
    expenses = st.text_area("Enter recent expenses (item + amount):")
    if st.button("Analyze & Suggest Savings"):
        prompt = f"Here are my expenses:\n{expenses}\nSuggest ways to save 10%."
        output = query_ollama("phi3", prompt)
        st.markdown("### 💡 AI Suggestions")
        st.write(output)
ai_utils/studycoach.py — Local Study Coach
import streamlit as st
from .ollama_client import query_ollama


def run_studycoach():
    st.subheader("Local Study & Skill Coach")
    topic = st.text_input("Enter topic (e.g., Python basics):")
    if st.button("Generate Lesson"):
        prompt = f"Teach me {topic} for beginners. Give one example and 2 quiz questions."
        output = query_ollama("llama3", prompt)
        st.markdown("### 🧩 Lesson")
        st.write(output)
▶️ Run the App
python -m streamlit run app.py
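The app expects the Ollama server to be running and the three models to be available locally. The standard Ollama CLI commands to set that up:
ollama serve      # start the local server if it is not already running
ollama pull mistral
ollama pull phi3
ollama pull llama3
Streamlit then serves the app in your browser, at http://localhost:8501 by default.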
🧩 Notes
- Folder renamed to ai_utils to avoid import conflicts.
- All prompts use ollama.chat() locally; no external API is required.
- Models: mistral, phi3, and llama3 (replace as needed; one way to centralize the model names is sketched below).
- Use the sidebar to switch between AI tools.
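If you expect to swap models often, one option is to keep the model names in a single settings module instead of hard-coding them in each tool. This is only a sketch; ai_utils/settings.py is a hypothetical addition, not part of the structure above.
# ai_utils/settings.py — hypothetical: one place to change which model each tool uses.
MODELS = {
    "planner": "mistral",
    "optimizer": "phi3",
    "translator": "mistral",
    "budgeter": "phi3",
    "studycoach": "llama3",
}
Each module would then call query_ollama(MODELS["planner"], ...) instead of using a hard-coded model name.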
Sample Response for the llama3.2 Model
Prerequisite: pull and run the model with ollama run llama3.2, then confirm it is installed with ollama list:
C:\Users\AURMC>ollama list
NAME ID SIZE MODIFIED
llama3.2:latest a80c4f17acd5 2.0 GB 13 days ago