Automate Genius
2026-04-06 · 5 min read

Auto-Sync GitHub to NotebookLM with GitHub Actions — Setup Guide

A GitHub Action that keeps your NotebookLM notebook current whenever you push markdown files — no manual uploads, no stale AI answers.

TL;DR: This GitHub Action auto-syncs your markdown files to NotebookLM on every push. It groups files by folder, deletes stale sources, and uploads fresh bundles — no manual uploads. Works with free NotebookLM accounts. Setup takes 3 steps: authenticate once with setup.py, add the workflow file, push any .md file. Normal syncs run in 31 seconds.
Get the files on GitHub


NotebookLM gave me a confident, well-sourced answer about my own project.

It was two weeks out of date.

The AI wasn't hallucinating — it had good sources, good structure, and a well-reasoned conclusion. The problem was that I'd changed the architecture the week before and hadn't re-uploaded the documentation. I'd asked a question and gotten a perfect answer about a version of the project that no longer existed.

This is the "docs rot" problem: the moment your documentation drifts from your actual project, you're no longer querying AI about reality. You're querying it about history.

The fix was to stop treating uploads as a manual task.

What I Built

A GitHub Action that watches for .md file pushes and syncs changed files to NotebookLM automatically. The result: you push your repo, the notebook updates, and the AI answers from today's documentation. Normal syncs run in 31 seconds.

The core tool that made this possible is an open-source Python CLI called notebooklm-mcp-cli — it talks to the NotebookLM API and handles source management (add, delete, list). I wrote a setup.py authentication script on top of it and wired the whole thing into a GitHub Actions workflow.

What the system does on each push:

  1. Detects which .md files changed
  2. Groups them by top-level folder
  3. For each changed folder: deletes the existing NotebookLM source, uploads a fresh bundle
  4. Reports sync status in the Actions log
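The grouping in step 2 is the heart of it. Here's a minimal Python sketch of that logic — the Action itself does this in bash, but the idea is the same (Root_Files is the bundle name the workflow gives root-level files):

```python
from collections import defaultdict

def group_by_top_folder(changed_paths):
    """Group changed .md paths by top-level folder; root-level
    files go into a single "Root_Files" bundle, as in the workflow."""
    bundles = defaultdict(list)
    for path in changed_paths:
        if not path.endswith(".md"):
            continue  # only markdown files are synced
        head, sep, _ = path.partition("/")
        bundles[head if sep else "Root_Files"].append(path)
    return dict(bundles)

print(group_by_top_folder(
    ["README.md", "docs/setup.md", "docs/api.md", "img/logo.png"]))
# {'Root_Files': ['README.md'], 'docs': ['docs/setup.md', 'docs/api.md']}
```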

System overview

How This Compares to Other Approaches

| Approach | Target User | Limitation |
|---|---|---|
| storage-notebooklm-sync | Enterprise users with GCS | Requires NotebookLM Enterprise + Google Cloud setup |
| notebooklm-mcp-cli | Developers using Claude/Cursor | General-purpose CLI, not a GitHub Action — no CI/CD |
| Manual upload | Everyone | Breaks the moment you forget. Doesn't scale. |
| This tool | Any developer with free NotebookLM | Cookie expiration requires periodic re-auth (~5 min) |

How It Works

1. Bundles by folder, not file

Rather than uploading each .md file as a separate source, the Action concatenates all files in each top-level folder into one bundle. My docs/ folder becomes one NotebookLM source called docs.md. My community/ folder becomes community.md.

This keeps the notebook clean — you always have exactly one source per folder, regardless of how many files are inside it. NotebookLM works better with fewer, larger sources than with many small fragments.
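For illustration, here's the bundle layout as a small Python sketch — the same structure the workflow's sync_bundle function writes (a title header, then a `--- SOURCE: ---` separator before each file), simplified to bare file names where the workflow keeps relative paths. The file names and contents below are invented for the demo:

```python
import tempfile
from pathlib import Path

def build_bundle(title, md_files):
    """Concatenate markdown files into one bundle string, mirroring
    the separator layout the workflow writes before each file."""
    parts = [f"# {title}", ""]
    for f in md_files:
        parts += [f"--- SOURCE: {f.name} ---", "", f.read_text(), ""]
    return "\n".join(parts)

# Demo with two throwaway files
with tempfile.TemporaryDirectory() as tmp:
    setup = Path(tmp, "setup.md"); setup.write_text("# Setup\n")
    api = Path(tmp, "api.md"); api.write_text("# API\n")
    bundle = build_bundle("docs", [setup, api])

assert bundle.startswith("# docs")
assert "--- SOURCE: api.md ---" in bundle
```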

2. Smart diff — only syncs what changed

On a normal push, the Action uses git diff to identify which folders contain changed files and re-syncs only those. Everything else is left alone.

Here's the actual log from pushing one change to community/README.md:

>>> Creating Bundle: community
Deleting source ID: 291d4b83-9159-4d4b-85e7-5680088115eb (title: community.md)
✓ Added source: community.md (ready)
--- Sync Process Complete ---

One file changed. One folder synced. Total run: 31 seconds. All other sources untouched.

3. Delete before upload — no accumulation

This was the hardest part to get right.

The notebooklm-mcp-cli only accepts UUIDs for source deletion — it doesn't support delete-by-title. So a naive "just upload" approach would create a new source on every sync while the old one stays, building up duplicates indefinitely.

The solution: before uploading a fresh bundle, the Action queries the source list, finds the source matching the folder name, extracts its UUID, then deletes it. After that lookup-and-delete step is in place, accumulation stops entirely.
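In Python terms, the lookup is roughly this — a sketch of the workflow's jq filter, which tolerates both response shapes (a bare array, or an object with a sources key) and both id field names. The JSON below is a made-up example, not real CLI output:

```python
import json

def ids_for_title(source_list_json, title):
    """Return the UUIDs of every source whose title matches,
    handling both JSON shapes the workflow's jq filter accepts."""
    data = json.loads(source_list_json)
    sources = data if isinstance(data, list) else data.get("sources", [])
    return [s.get("id") or s.get("source_id")
            for s in sources
            if s.get("title") == title and (s.get("id") or s.get("source_id"))]

raw = ('{"sources": [{"id": "uuid-1", "title": "community.md"},'
       ' {"id": "uuid-2", "title": "docs.md"}]}')
print(ids_for_title(raw, "community.md"))  # ['uuid-1']
```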

Before I had this working: 28 duplicate sources across 7 folders. After initial cleanup using force sync: 7 clean sources, no duplicates.

4. Force sync option

There's a manual trigger with a "Force sync" checkbox in the GitHub Actions UI. This re-syncs all folders regardless of what changed — useful for initial setup, major restructures, or cleaning up accumulated duplicates.

Initial cleanup run: 7 folders synced, 28 accumulated duplicates deleted, 7 fresh sources uploaded. Total time: 2 minutes 12 seconds.

Setup

Prerequisites

  • A GitHub repository with .md files
  • A Google account with access to NotebookLM
  • Python 3.8+ (for running setup.py locally)
  • Optional: GitHub CLI (speeds up secrets setup)

Step 1 — Authenticate (one time)

pip install notebooklm-mcp-cli
python setup.py

The script opens your browser → you log in with your Google account → credentials are saved locally → if you have GitHub CLI installed, it sets NOTEBOOKLM_COOKIES and NOTEBOOKLM_METADATA as repository secrets directly. Otherwise it prints the values to copy into GitHub Settings → Secrets manually.

You'll need your NOTEBOOK_ID: go to notebooklm.google.com, open your notebook, and copy the ID from the URL — everything after /notebook/.
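If you'd rather not eyeball the URL, a couple of lines of Python will pull the ID out (the URL below is a made-up example):

```python
from urllib.parse import urlparse

def notebook_id_from_url(url):
    """Return the path segment immediately after /notebook/."""
    return urlparse(url).path.split("/notebook/", 1)[1].split("/")[0]

print(notebook_id_from_url(
    "https://notebooklm.google.com/notebook/1a2b3c4d-0000-4000-8000-abcdef123456"))
# 1a2b3c4d-0000-4000-8000-abcdef123456
```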

Step 2 — Add the workflow file

Create .github/workflows/sync_docs.yml in your repository:

name: Sync Repo to NotebookLM
 
on:
  push:
    paths:
      - '**.md'
  workflow_dispatch:
    inputs:
      force_sync:
        description: 'Force sync all bundles'
        required: false
        type: boolean
        default: false
 
jobs:
  sync:
    runs-on: ubuntu-latest
 
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
 
      - name: Install NotebookLM CLI
        run: pip install notebooklm-mcp-cli
 
      - name: Setup Authentication Profile
        env:
          NLM_COOKIES: ${{ secrets.NOTEBOOKLM_COOKIES }}
          NLM_METADATA: ${{ secrets.NOTEBOOKLM_METADATA }}
        run: |
          set -euo pipefail
          CONFIG_DIR="$HOME/.notebooklm-mcp-cli/profiles/default"
          mkdir -p "$CONFIG_DIR"
          printf '%s' "$NLM_COOKIES" > "$CONFIG_DIR/cookies.json"
          printf '%s' "$NLM_METADATA" > "$CONFIG_DIR/metadata.json"
 
      - name: Verify Authentication
        run: |
          if ! nlm login --check 2>&1; then
            echo "::error::NotebookLM auth expired. Re-run setup.py locally and update GitHub secrets."
            exit 1
          fi
 
      - name: Smart Sync Bundles
        env:
          NB_ID: ${{ secrets.NOTEBOOK_ID }}
          FORCE_SYNC: ${{ inputs.force_sync }}
        run: |
          set -euo pipefail
 
          OUTDIR="${RUNNER_TEMP}/nlm-bundles"
          rm -rf "$OUTDIR" && mkdir -p "$OUTDIR"
 
          # Identify changed files
          if [ "${FORCE_SYNC:-false}" = "true" ]; then
            CHANGED_FILES=$(find . -type f -name "*.md" -not -path "*/.*" | sed 's|^\./||')
          else
            BASE_SHA="${{ github.event.before }}"
            if [ -n "$BASE_SHA" ] && [ "$BASE_SHA" != "0000000000000000000000000000000000000000" ]; then
              CHANGED_FILES=$(git diff --name-only "$BASE_SHA" "$GITHUB_SHA" -- '*.md')
            else
              CHANGED_FILES=$(find . -type f -name "*.md" -not -path "*/.*" | sed 's|^\./||')
            fi
          fi
 
          # Delete all sources matching a title
          delete_sources_by_title() {
            local nb_id="$1" title="$2"
            local -a ids
            mapfile -t ids < <(nlm source list "$nb_id" --json 2>/dev/null \
              | jq -r --arg t "$title" \
                '(if type=="array" then . else (.sources // []) end)
                 | .[] | select(.title == $t) | (.id // .source_id // "")
                 | select(. != "" and . != "null")' 2>/dev/null)
            for sid in "${ids[@]+"${ids[@]}"}"; do
              echo "Deleting source ID: $sid (title: $title)"
              nlm source delete "$sid" --confirm || true
            done
          }
 
          # Bundle a folder and upload
          sync_bundle() {
            local title="$1"; shift
            local bundle="$OUTDIR/${title}.md"
            echo ">>> Creating Bundle: $title"
            { echo "# ${title}"; echo
              for f in "$@"; do echo "--- SOURCE: ${f#./} ---"; echo; cat "$f"; echo; done
            } > "$bundle"
            delete_sources_by_title "$NB_ID" "${title}.md"
            # Title must match what delete_sources_by_title looks for
            nlm source add "$NB_ID" --title "${title}.md" --file "$bundle" --wait
          }
 
          # Root-level .md files → "Root_Files" bundle
          if echo "$CHANGED_FILES" | grep -qE '^[^/]+\.md$'; then
            mapfile -d '' ROOT_MD < <(find . -maxdepth 1 -type f -name '*.md' -not -path '*/.*' -print0)
            [ ${#ROOT_MD[@]} -gt 0 ] && sync_bundle "Root_Files" "${ROOT_MD[@]}"
          fi
 
          # Top-level folder bundles
          while IFS= read -r -d '' DIR; do
            DIR_NAME="$(basename "$DIR")"
            case "$DIR_NAME" in .git|.github|node_modules|venv) continue ;; esac
            if printf '%s\n' "$CHANGED_FILES" | grep -F -q "${DIR_NAME}/"; then
              mapfile -d '' MD_FILES < <(find "$DIR" -type f -name '*.md' -not -path '*/.*' -print0)
              if [ ${#MD_FILES[@]} -gt 0 ]; then
                sync_bundle "$DIR_NAME" "${MD_FILES[@]}"
              else
                echo ">>> No .md files in $DIR_NAME. Pruning source."
                delete_sources_by_title "$NB_ID" "${DIR_NAME}.md"
              fi
            fi
          done < <(find . -maxdepth 1 -type d -not -path '.' -not -path '*/.*' -print0)
 
          echo "--- Sync Process Complete ---"

Step 3 — Push any .md file

That's it. The Action runs automatically on every .md push from this point forward.

One Thing to Know

Google session cookies expire periodically. When the Action starts failing with a 401 or auth error, re-run python setup.py locally and update NOTEBOOKLM_COOKIES and NOTEBOOKLM_METADATA in GitHub Settings → Secrets with the fresh values.

Takes about 5 minutes. Small price for never thinking about NotebookLM uploads again.

Results

| Scenario | Time |
|---|---|
| Single file change (1 folder synced) | 31 seconds |
| Force sync — 7 folders, 28 duplicates cleared | 2 minutes 12 seconds |
| Authentication refresh (periodic) | ~5 minutes |

What to Feed It

The system works for any repository with markdown files. A few setups that work well:

  • Project documentation — specs, architecture decisions, changelogs
  • Research notes — anything you're accumulating in Obsidian-style repos
  • SOPs and runbooks — operational docs that change as your process evolves
  • Knowledge bases — anything you'd otherwise copy-paste into an AI chat

The key constraint: NotebookLM works best when sources are clean and well-structured. The folder-bundling approach helps with this — instead of uploading 40 individual files, you get 7 organized sources that reflect your actual project structure.

If your repo documents n8n workflows, the Automate Genius Template Explorer already indexes 9,118 workflows — worth syncing alongside your own specs to keep AI answers current on both sides.

Both files are available on GitHub: patrick-creates/notebooklm-github-sync. Drop them into any repository that has markdown files.

Built on the same automation-first thinking behind the n8n Workflow Intelligence Index — if you're building with n8n, the index has data worth keeping current.

Questions about the setup or the delete logic? Happy to answer in the comments.

Automate Genius Team
Founder, Automate Genius

Builder and analyst behind the n8n Workflow Intelligence Index. Tracking automation demand signals, use case trends, and workflow complexity patterns across the n8n template library — updated weekly.
