We ❤️ Open Source

A community education resource

Build your own private AI assistant with Bookshelf

Turn your notes, PDFs, and research into a searchable knowledge base with open source tools.

Tired of losing track of your notes or searching through folders to find old PDFs? In his lightning talk at All Things Open, Ash Tewari from Applied Information Sciences shares how he built Bookshelf, a personal knowledge base powered by generative AI. It's a practical example of using retrieval-augmented generation (RAG) to organize and search private content without needing massive infrastructure.

Subscribe to our All Things Open YouTube channel to get notifications when new videos are available.

Ash starts by outlining the real problem: developers and knowledge workers collect tons of valuable information, but traditional tools make it hard to access when you need it. With Bookshelf, he shows how you can index your own content, such as articles, snippets, and notes, then use a large language model (LLM) to answer questions in context. The assistant even shows its sources so you can verify the answers or dig deeper.
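The retrieval step behind this — score your indexed chunks against a question and hand the best matches, with their sources, to the LLM — can be sketched without any framework. The tiny vectors and document names below are toy stand-ins (in a real app like Bookshelf, the embeddings would come from a model and a vector store would do the search):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, index, top_k=2):
    """Return the top_k chunks most similar to the query,
    keeping each chunk's source so answers can be verified."""
    ranked = sorted(index,
                    key=lambda doc: cosine(query_vec, doc["embedding"]),
                    reverse=True)
    return ranked[:top_k]

# Toy index: three chunks with hand-made 3-dimensional "embeddings".
index = [
    {"source": "notes/rag.md",     "embedding": [0.9, 0.1, 0.0],
     "text": "RAG combines retrieval with generation."},
    {"source": "articles/llm.md",  "embedding": [0.1, 0.9, 0.0],
     "text": "LLMs predict the next token."},
    {"source": "notes/vectors.md", "embedding": [0.8, 0.2, 0.1],
     "text": "Vector stores index embeddings."},
]

hits = retrieve([1.0, 0.0, 0.0], index, top_k=2)
context = "\n".join(f"[{h['source']}] {h['text']}" for h in hits)
# `context` plus the user's question becomes the LLM prompt; the
# [source] tags are what let the assistant cite where answers came from.
```

Prepending the retrieved chunks (with their source labels) to the prompt is the whole trick: the LLM answers from your content rather than from memory, and the labels make its claims checkable.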

The project is open source and built with commonly available tools: LangChain for orchestration, ChromaDB for local vector storage, and Streamlit for a simple interface. Ash walks through how each part fits together: chunking data, embedding it with OpenAI, and running local queries. It is designed to run on your own machine, keeping your data private and fast to access.
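The chunking step — splitting each document into overlapping windows before embedding — is easy to picture in plain Python. This is a simplified stand-in for what LangChain's text splitters do; the sizes and overlap below are arbitrary illustrative choices, not Bookshelf's actual settings:

```python
def chunk_text(text, chunk_size=500, overlap=50):
    """Split text into fixed-size chunks that overlap slightly,
    so content cut at a boundary still appears whole in one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back so adjacent chunks share a margin
    return chunks

# Each chunk would then be embedded and stored in the vector database
# alongside its source document, ready for similarity search.
```

Chunks are embedded once at index time; at query time only the question needs a fresh embedding, which is part of what keeps local lookups fast.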

For developers looking to build their own RAG app, Ash shares actionable tips. Start small with a clear use case, use local content you already trust, and avoid over-engineering. Bookshelf is lightweight, useful, and easy to adapt for other needs, from research to customer support to project docs.

Read more: How to build a multiagent RAG system with Granite

Key takeaways

  • RAG can work at a personal scale. You do not need enterprise infrastructure to build useful knowledge assistants.
  • Open source tools are enough. LangChain, ChromaDB, and Streamlit provide the building blocks for your own AI-powered app.
  • Privacy matters. Running the app locally means your data stays on your machine, not in the cloud.

Conclusion

Bookshelf shows how developers can use open source and generative AI to make their own content more accessible and useful. Ash Tewari’s demo is a reminder that you do not need a team of machine learning engineers to build something helpful. You just need the right tools, a clear goal, and your own notes to get started.

About the Author

The ATO Team is a small but skilled group of professionals, bringing you the best open source content possible.

The opinions expressed on this website are those of each author, not of the author's employer or All Things Open/We Love Open Source.
