A Stupidly-Simple App to Chat with Your Documents Using a Local LLM

Cory Slater
3 min read · Aug 13, 2024


Like many, many people, I’ve been using large language models (LLMs) to interact with documents of all kinds for a while now.

However, there’s still a class of documents (proprietary code files, sensitive business documents, personal notes) that, like many people, I’m cautious about uploading to third-party services. The privacy concerns are real, and for certain data, I’d much rather use a solution that keeps everything local.

Like I said, it’s stupidly-simple…

The Problem: Finding a Simple, Secure Solution

I expected to find numerous apps designed for chatting with documents using a local LLM. Surprisingly, I couldn’t find one that met my criteria for simplicity and security. Many existing tools added unnecessary complexity, lacked transparency about data handling, or demanded involved setup.

For instance, some open-source solutions required intricate Docker configuration, while others needed a powerful GPU to run effectively. Commercial solutions often lacked clarity about where exactly your data was being processed.

I wanted something stupidly simple — an application I could run locally on my M1 MacBook Air that would let me load documents, interact with them, and manage conversations without any data leaving my machine.

The Solution: Chat-with-Notes

To address this need, I built Chat-with-Notes, an open-source application that prioritizes simplicity and privacy.

It uses Flask for the backend and HTML, CSS, and JavaScript for the frontend. For the LLM, it uses the Llama 3.1 8B model via Ollama (though it’s model-agnostic and will work with whichever model Ollama is running), enabling local interaction with document content. And, of course, it’s designed to be minimalist and easy to use, with no steep learning curve or complex setup, while running entirely on your local machine.
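To make the architecture concrete, here’s a minimal sketch of the core idea: a Flask route that sends an uploaded document and a user question to the local Ollama server. This is an illustration rather than the app’s actual code; the /chat route name, the llama3.1 model tag, and the request shape are my assumptions, though the endpoint URL is Ollama’s default.

    # Minimal sketch, not Chat-with-Notes' actual code.
    # Assumes Flask and requests are installed and Ollama is running locally.
    import requests
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

    @app.route("/chat", methods=["POST"])  # hypothetical route name
    def chat():
        data = request.get_json()
        messages = [
            {"role": "system", "content": f"Answer using this document:\n{data['document']}"},
            {"role": "user", "content": data["question"]},
        ]
        # Everything stays on localhost; no document text leaves the machine.
        resp = requests.post(
            OLLAMA_URL,
            json={"model": "llama3.1", "messages": messages, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return jsonify({"answer": resp.json()["message"]["content"]})

Because the app only talks to Ollama over localhost, swapping models is just a matter of changing the model tag to whatever you have pulled.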

You can find and download the code here to run it yourself:

How It Works: Stupidly-Simple…

It’s pretty straightforward:

  • Upload Files: You can upload text files, PDFs, or markdown files, which the app then parses and displays.
  • Chat Interface: Once the file is uploaded, you can ask questions or interact with the content through a (boringly) simple chat interface. The conversation history is maintained in-browser, and you can choose to clear or keep it when uploading new files.
  • Local Processing: Everything happens locally, ensuring that sensitive data stays on your machine. The app handles text extraction from files and passes the content to the LLM for interaction (a rough sketch of the extraction step follows this list).
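Here’s roughly what that extraction step could look like. This is a hedged sketch assuming pypdf for PDF parsing; the actual app may use a different library, but the point stands: every byte is parsed on your own machine.

    # Rough sketch of local text extraction; pypdf is an assumption,
    # not necessarily what Chat-with-Notes uses under the hood.
    from pathlib import Path
    from pypdf import PdfReader

    def extract_text(path: str) -> str:
        p = Path(path)
        if p.suffix.lower() == ".pdf":
            reader = PdfReader(p)
            # Concatenate the text of every page, all parsed locally.
            return "\n".join(page.extract_text() or "" for page in reader.pages)
        # .txt and .md files need no parsing beyond decoding.
        return p.read_text(encoding="utf-8")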

What’s Next?

We’ll see where this goes… but I’d like to connect it to my notes repositories, such as those stored in Obsidian, to create a more personalized assistant with access to my local notes.
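Since an Obsidian vault is just a folder of markdown files, the starting point could be as simple as the sketch below. This is purely hypothetical: the vault path is made up, and none of it is implemented in the app yet.

    # Hypothetical future feature: walk an Obsidian vault (a folder of
    # .md files) and collect notes to feed to the local LLM.
    from pathlib import Path

    def load_vault(vault_dir: str) -> dict[str, str]:
        vault = Path(vault_dir).expanduser()
        # Obsidian stores each note as a plain .md file, so a recursive
        # glob is enough to gather everything locally.
        return {str(md.relative_to(vault)): md.read_text(encoding="utf-8")
                for md in vault.rglob("*.md")}

    notes = load_vault("~/Documents/ObsidianVault")  # made-up path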

Anyway, I’m sure there are a million other options out there, but if you’re looking for something simple, this might be worth a try. If you do try it out, I’d love to know what you think.
