We ❤️ Open Source
A community education resource
Build your own private AI assistant with Bookshelf
Turn your notes, PDFs, and research into a searchable knowledge base with open source tools.
Tired of losing track of your notes or searching through folders to find old PDFs? In his lightning talk at All Things Open, Ash Tewari from Applied Information Sciences shares how he built Bookshelf, a personal knowledge base powered by generative AI. It’s a practical example of using retrieval-augmented generation (RAG) to organize and search private content without massive infrastructure.
Ash starts by outlining the real problem: developers and knowledge workers collect tons of valuable information, but traditional tools make it hard to access when you need it. With Bookshelf, he shows how you can index your own content, such as articles, snippets, and notes, then use a large language model (LLM) to answer questions in context. The assistant even shows its sources so you can verify the answers or dig deeper.
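The pattern behind answering with sources can be sketched in a few lines. This is a toy illustration, not Bookshelf's actual code: a real build would rank chunks by vector-embedding similarity, while the word-overlap scoring, note titles, and note text below are made up purely to show the shape of retrieval with source attribution.

```python
import re

def tokens(s: str) -> set[str]:
    """Lowercase a string and split it into a set of words."""
    return set(re.findall(r"[a-z]+", s.lower()))

def score(query: str, text: str) -> int:
    """Toy relevance score: how many query words the text shares."""
    return len(tokens(query) & tokens(text))

def answer_with_sources(query: str, notes: dict[str, str], top_k: int = 2):
    """Return the top_k most relevant notes as (title, text) pairs,
    so an answer can cite which note it came from."""
    ranked = sorted(notes.items(), key=lambda kv: score(query, kv[1]), reverse=True)
    return ranked[:top_k]

# Hypothetical personal notes, indexed by filename (the "source").
notes = {
    "docker-notes.md": "Use multi-stage builds to keep Docker images small.",
    "rag-article.md": "Retrieval augmented generation grounds LLM answers in your own documents.",
    "meetings.md": "Sprint planning moved to Tuesdays.",
}

for title, text in answer_with_sources("how does retrieval augmented generation work", notes):
    print(f"[{title}] {text}")
```

In a real RAG app, the retrieved (source, chunk) pairs are passed to the LLM as context, and the titles are shown alongside the generated answer so the user can verify it.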
The project is open source and built with commonly available tools: LangChain for orchestration, ChromaDB for local vector storage, and Streamlit for a simple interface. Ash walks through how each part fits together, from chunking the data and embedding it with OpenAI's models to running queries locally. It is designed to run on your own machine, keeping your data private and fast to access.
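The chunking step mentioned above can be sketched as a simple fixed-size splitter with overlap, so context isn't lost at chunk boundaries. This is a minimal stand-in for illustration, not the talk's actual code; real pipelines (including LangChain's text splitters) typically split by tokens or sentences rather than raw characters, and the sizes below are arbitrary.

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into chunks of `size` characters, with each chunk
    repeating the last `overlap` characters of the previous one."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # step forward, keeping an overlapping tail
    return chunks

# Illustrative document: repeated filler text standing in for a note.
doc = "Bookshelf indexes articles, snippets, and notes. " * 20
pieces = chunk_text(doc, size=120, overlap=30)
print(len(pieces), "chunks")
print("end of chunk 1:  ", pieces[0][-30:])
print("start of chunk 2:", pieces[1][:30])
```

Each chunk would then be embedded and stored in the vector database, so a query can be matched against small, focused pieces of a document rather than the whole file.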
For developers looking to build their own RAG app, Ash shares actionable tips. Start small with a clear use case, use local content you already trust, and avoid over-engineering. Bookshelf is lightweight, useful, and easy to adapt for other needs, from research to customer support to project docs.
Read more: How to build a multiagent RAG system with Granite
Key takeaways
- RAG can work at a personal scale. You do not need enterprise infrastructure to build useful knowledge assistants.
- Open source tools are enough. LangChain, ChromaDB, and Streamlit provide the building blocks for your own AI-powered app.
- Privacy matters. Running the app locally means your data stays on your machine, not in the cloud.
Conclusion
Bookshelf shows how developers can use open source and generative AI to make their own content more accessible and useful. Ash Tewari’s demo is a reminder that you do not need a team of machine learning engineers to build something helpful. You just need the right tools, a clear goal, and your own notes to get started.
More from We Love Open Source
- Getting started with Ollama
- Why AI won’t replace developers
- Are we coding through a revolution or an evolution?
- How to build a multiagent RAG system with Granite
- Build a local AI co-pilot using IBM Granite Code, Ollama, and Continue
The opinions expressed on this website are those of each author, not of the author's employer or All Things Open/We Love Open Source.