Initial commit: Astro 6 static blog site

- German (default) and English i18n support
- Categories and tags
- Blog posts with hero images
- Dark/light theme switcher
- View Transitions removed to fix reload ghost images
- Webmentions integration
- RSS feeds per locale

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Adrian Altner 2026-04-22 10:55:29 +02:00
commit 5bb63bacf5
95 changed files with 12199 additions and 0 deletions

scripts/VISION.md Normal file

@@ -0,0 +1,76 @@
# Vision Script
Generates metadata sidecars (JSON) for photo collections using EXIF data and a vision AI (Anthropic or OpenAI).
## Prerequisites
- `exiftool` installed (`brew install exiftool`)
- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` set in `.env.local` (depending on the provider)
## Usage
```bash
pnpm run vision [options] [directory]
```
If no directory is given, the default `content/fotos` is used.
## Options
| Option | Description |
|---|---|
| `--provider=anthropic\|openai` | Vision API provider (default: `anthropic`). Anthropic uses `claude-opus-4-6`, OpenAI uses `gpt-4o-mini`. |
| `--refresh` | Regenerate all sidecars (EXIF + AI). Overwrites existing files. |
| `--exif-only` | Only update the EXIF data in existing sidecars. AI fields (title, alt, tags) are preserved. |
| `--concurrency=N` | Number of parallel vision API requests (default: 2) |
| `--retries=N` | Maximum number of retries on rate limits (default: 8) |
| `--backoff-ms=N` | Base wait time in ms for exponential backoff (default: 1500) |
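With `--retries` and `--backoff-ms`, the wait before retry N is presumably base-wait × 2^N; the exact schedule (and whether jitter is added) is an assumption, not confirmed by this README. A sketch of the waits implied by the defaults:

```bash
# Illustrative only: exponential backoff waits for the defaults
# --backoff-ms=1500 and --retries=8 (doubling per attempt, no jitter).
base_ms=1500
retries=8
for (( attempt = 0; attempt < retries; attempt++ )); do
  wait_ms=$(( base_ms * (1 << attempt) ))
  echo "retry $(( attempt + 1 )): wait ${wait_ms} ms"
done
```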
## Environment variables
As an alternative to the CLI options, these values can also be set via environment variables:
| Variable | Equivalent |
|---|---|
| `VISION_PROVIDER` | `--provider` |
| `VISION_CONCURRENCY` | `--concurrency` |
| `VISION_MAX_RETRIES` | `--retries` |
| `VISION_BASE_BACKOFF_MS` | `--backoff-ms` |
CLI options take precedence over environment variables.
## Examples
```bash
# Process new images without a sidecar (Anthropic)
pnpm run vision
# Use OpenAI instead of Anthropic
pnpm run vision --provider=openai
# Regenerate all sidecars in a specific folder
pnpm run vision --refresh --provider=openai content/fotos/kollektionen/reisen/asien/thailand
# Only update EXIF data (e.g. after re-exporting from Lightroom)
pnpm run vision --exif-only
# With higher concurrency
pnpm run vision --refresh --concurrency=4
```
## Output
For each image, a JSON sidecar file with the following content is created:
- `title` - 5 title suggestions (German, via AI)
- `alt` - image description / alt text (German, via AI)
- `tags` - 5 thematic tags (German, via AI)
- `date` - capture date (from EXIF)
- `location` - GPS coordinates (from EXIF, if present)
- `locationName` - resolved place name via Nominatim (if GPS data is present)
- `exif` - camera, lens, aperture, ISO, focal length, shutter speed
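A hypothetical sidecar following the field list above (every value here is invented for illustration; the concrete shape of each field is not verified against the script):

```json
{
  "title": ["Langboote bei Ebbe", "Abend am Strand", "Thailand im November", "Boote im Sand", "Stille Bucht"],
  "alt": "Zwei Langboote liegen bei Ebbe auf dem Sand einer kleinen Bucht.",
  "tags": ["strand", "boote", "thailand", "reise", "meer"],
  "date": "2024-11-03",
  "location": "7.7420, 98.7766",
  "locationName": "Ko Phi Phi, Krabi, Thailand",
  "exif": {
    "camera": "Sony ILCE-7M4",
    "lens": "FE 24-70mm F2.8 GM",
    "aperture": "8",
    "iso": "100",
    "focal_length": "35",
    "shutter_speed": "1/250"
  }
}
```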
## Fix
If OpenAI rate-limits you on a low usage tier, reduce the concurrency:
```bash
pnpm run vision --refresh --provider=openai --concurrency=1 content/fotos/kollektionen/reisen/asien/malaysia
```

scripts/copy-sw.js Normal file

@@ -0,0 +1,17 @@
// Copies sw.js + workbox-*.js from dist/server/ to dist/client/ after build.
// @astrojs/node standalone only serves static files from dist/client/, but
// @vite-pwa/astro generates the service worker into dist/server/ during the
// SSR Vite build pass.
import { copyFile, readdir } from "node:fs/promises";
import { join } from "node:path";
const serverDir = "dist/server";
const clientDir = "dist/client";
const files = await readdir(serverDir).catch(() => []);
for (const file of files) {
if (file === "sw.js" || file.startsWith("workbox-")) {
await copyFile(join(serverDir, file), join(clientDir, file));
console.log(`[copy-sw] ${file} → dist/client/`);
}
}

scripts/deploy.sh Executable file

@@ -0,0 +1,49 @@
#!/usr/bin/env bash
set -euo pipefail
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/adrian-altner.de'
REMOTE_URL='ssh://git@git.altner.cloud:2222/adrian/adrian-altner.de.git'
GIT_HOST='git.altner.cloud'
GIT_PORT='2222'
# --- 1. Pull latest from repo ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
mkdir -p ~/.ssh && touch ~/.ssh/known_hosts && chmod 600 ~/.ssh/known_hosts
ssh-keygen -F '[$GIT_HOST]:$GIT_PORT' >/dev/null || ssh-keyscan -p '$GIT_PORT' '$GIT_HOST' >> ~/.ssh/known_hosts
git remote set-url origin '$REMOTE_URL'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
"
# --- 2. Build + deploy ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
sudo podman build -t localhost/adrian-altner.de:latest .
sudo systemctl restart podman-compose@adrian-altner.de.service
sudo podman container prune -f 2>/dev/null || true
sudo podman image prune --external -f 2>/dev/null || true
sudo podman image prune -f 2>/dev/null || true
sudo podman builder prune -af 2>/dev/null || true
"
echo "Deploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions via webmention.app..."
for feed in rss.xml en/rss.xml; do
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/${feed}&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
done
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi


@@ -0,0 +1,74 @@
/**
* Fetches favicons for all linked domains and saves them locally to public/favicons/.
* Run before astro build so favicons are served statically instead of via Google S2.
* Idempotent: skips domains that already have a cached favicon.
*/
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";
import { fileURLToPath } from "node:url";
const __dirname = fileURLToPath(new URL(".", import.meta.url));
const root = join(__dirname, "..");
const linksPath = join(root, "src/content/links/links.json");
const outDir = join(root, "public/favicons");
function getDomain(url) {
try {
return new URL(url).hostname.replace(/^www\./, "");
} catch {
return null;
}
}
function faviconPath(domain) {
return join(outDir, `${domain}.png`);
}
async function fetchFavicon(domain) {
const url = `https://www.google.com/s2/favicons?domain=${domain}&sz=32`;
const res = await fetch(url);
if (!res.ok) throw new Error(`HTTP ${res.status}`);
const buf = await res.arrayBuffer();
return Buffer.from(buf);
}
async function main() {
const links = JSON.parse(readFileSync(linksPath, "utf-8"));
const domains = [
...new Set(links.map((l) => getDomain(l.url)).filter(Boolean)),
];
if (!existsSync(outDir)) mkdirSync(outDir, { recursive: true });
let fetched = 0;
let skipped = 0;
await Promise.all(
domains.map(async (domain) => {
const dest = faviconPath(domain);
if (existsSync(dest)) {
skipped++;
return;
}
try {
const data = await fetchFavicon(domain);
writeFileSync(dest, data);
fetched++;
console.log(`${domain}`);
} catch (err) {
console.warn(`${domain}: ${err.message}`);
}
}),
);
console.log(
`Favicons: ${fetched} fetched, ${skipped} cached, ${domains.length} total`,
);
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

scripts/metadata.ts Normal file

@@ -0,0 +1,424 @@
#!/usr/bin/env -S node --experimental-strip-types
import { writeFile } from "node:fs/promises";
import { basename, relative, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { consola } from "consola";
import sharp from "sharp";
import {
getImagesMissingMetadata,
getMetadataPathForImage,
getPhotoAbsolutePath,
getPhotoDirectories,
PHOTOS_DIRECTORY,
} from "../src/lib/photo-albums.ts";
const PHOTOS_DIR = PHOTOS_DIRECTORY;
// ─── IPTC parser ────────────────────────────────────────────────────────────
interface IptcFields {
title?: string;
caption?: string;
keywords?: string[];
dateCreated?: string;
timeCreated?: string;
}
function parseIptc(buf: Buffer): IptcFields {
const fields: IptcFields = {};
let i = 0;
while (i < buf.length - 4) {
if (buf[i] !== 0x1c) {
i++;
continue;
}
const record = buf[i + 1];
const dataset = buf[i + 2];
const len = buf.readUInt16BE(i + 3);
const value = buf.subarray(i + 5, i + 5 + len).toString("utf8");
i += 5 + len;
if (record !== 2) continue;
switch (dataset) {
case 5:
fields.title = value;
break;
case 25:
fields.keywords ??= [];
fields.keywords.push(value);
break;
case 55:
fields.dateCreated = value;
break;
case 60:
fields.timeCreated = value;
break;
case 120:
fields.caption = value;
break;
}
}
return fields;
}
// ─── XMP parser ─────────────────────────────────────────────────────────────
interface XmpFields {
title?: string;
description?: string;
keywords?: string[];
lens?: string | undefined;
createDate?: string | undefined;
}
function extractRdfLiValues(xml: string, tagName: string): string[] {
const re = new RegExp(`<${tagName}[^>]*>[\\s\\S]*?<\\/${tagName}>`, "i");
const match = xml.match(re);
if (!match) return [];
const liRe = /<rdf:li[^>]*>([^<]*)<\/rdf:li>/gi;
const values: string[] = [];
for (let m = liRe.exec(match[0]); m !== null; m = liRe.exec(match[0])) {
if (m[1]?.trim()) values.push(m[1].trim());
}
return values;
}
function extractXmpAttr(xml: string, attr: string): string | undefined {
const re = new RegExp(`${attr}="([^"]*)"`, "i");
return xml.match(re)?.[1] ?? undefined;
}
function parseXmp(buf: Buffer): XmpFields {
const xml = buf.toString("utf8");
const fields: XmpFields = {};
const titles = extractRdfLiValues(xml, "dc:title");
if (titles[0]) fields.title = titles[0];
const descriptions = extractRdfLiValues(xml, "dc:description");
if (descriptions[0]) fields.description = descriptions[0];
const subjects = extractRdfLiValues(xml, "dc:subject");
if (subjects.length > 0) fields.keywords = subjects;
fields.lens = extractXmpAttr(xml, "aux:Lens");
fields.createDate = extractXmpAttr(xml, "xmp:CreateDate");
return fields;
}
// ─── EXIF parser (minimal TIFF IFD0 + SubIFD) ──────────────────────────────
interface ExifFields {
model?: string;
lensModel?: string;
fNumber?: number;
focalLength?: string;
exposureTime?: string;
iso?: number;
dateTimeOriginal?: string;
gpsLatitude?: string;
gpsLongitude?: string;
gpsLatitudeRef?: string;
gpsLongitudeRef?: string;
}
function parseExifBuffer(buf: Buffer): ExifFields {
// Skip "Exif\0\0" header if present
let offset = 0;
if (
buf[0] === 0x45 &&
buf[1] === 0x78 &&
buf[2] === 0x69 &&
buf[3] === 0x66
) {
offset = 6;
}
const isLE = buf[offset] === 0x49; // "II" = little endian
const read16 = isLE
? (o: number) => buf.readUInt16LE(offset + o)
: (o: number) => buf.readUInt16BE(offset + o);
const read32 = isLE
? (o: number) => buf.readUInt32LE(offset + o)
: (o: number) => buf.readUInt32BE(offset + o);
const readRational = (o: number): number => {
const num = read32(o);
const den = read32(o + 4);
return den === 0 ? 0 : num / den;
};
const readString = (o: number, len: number): string => {
return buf
.subarray(offset + o, offset + o + len)
.toString("ascii")
.replace(/\0+$/, "");
};
const fields: ExifFields = {};
const parseIfd = (ifdOffset: number, parseGps = false) => {
if (ifdOffset + 2 > buf.length - offset) return;
const count = read16(ifdOffset);
for (let i = 0; i < count; i++) {
const entryOffset = ifdOffset + 2 + i * 12;
if (entryOffset + 12 > buf.length - offset) break;
const tag = read16(entryOffset);
const type = read16(entryOffset + 2);
const numValues = read32(entryOffset + 4);
const valueOffset = read32(entryOffset + 8);
// For values that fit in 4 bytes, data is inline at entryOffset+8
const dataOffset =
type === 2 && numValues <= 4
? entryOffset + 8
: type === 5 || numValues > 4
? valueOffset
: entryOffset + 8;
if (parseGps) {
switch (tag) {
case 1: // GPSLatitudeRef
fields.gpsLatitudeRef = readString(entryOffset + 8, 2);
break;
case 2: // GPSLatitude
if (dataOffset + 24 <= buf.length - offset) {
const d = readRational(dataOffset);
const m = readRational(dataOffset + 8);
const s = readRational(dataOffset + 16);
fields.gpsLatitude = `${d} deg ${Math.floor(m)}' ${s.toFixed(2)}"`;
}
break;
case 3: // GPSLongitudeRef
fields.gpsLongitudeRef = readString(entryOffset + 8, 2);
break;
case 4: // GPSLongitude
if (dataOffset + 24 <= buf.length - offset) {
const d = readRational(dataOffset);
const m = readRational(dataOffset + 8);
const s = readRational(dataOffset + 16);
fields.gpsLongitude = `${d} deg ${Math.floor(m)}' ${s.toFixed(2)}"`;
}
break;
}
continue;
}
switch (tag) {
case 0x0110: // Model
if (dataOffset + numValues <= buf.length - offset) {
fields.model = readString(dataOffset, numValues);
}
break;
case 0x8769: // ExifIFD pointer
parseIfd(valueOffset);
break;
case 0x8825: // GPS IFD pointer
parseIfd(valueOffset, true);
break;
case 0x829a: // ExposureTime
if (dataOffset + 8 <= buf.length - offset) {
const num = read32(dataOffset);
const den = read32(dataOffset + 4);
fields.exposureTime =
den > num ? `1/${Math.round(den / num)}` : `${num / den}`;
}
break;
case 0x829d: // FNumber
if (dataOffset + 8 <= buf.length - offset) {
fields.fNumber = readRational(dataOffset);
}
break;
case 0x8827: // ISO
fields.iso = type === 3 ? read16(entryOffset + 8) : valueOffset;
break;
case 0x9003: // DateTimeOriginal
if (dataOffset + numValues <= buf.length - offset) {
fields.dateTimeOriginal = readString(dataOffset, numValues);
}
break;
case 0x920a: // FocalLength
if (dataOffset + 8 <= buf.length - offset) {
const fl = readRational(dataOffset);
fields.focalLength = fl.toFixed(1).replace(/\.0$/, "");
}
break;
case 0xa434: // LensModel
if (dataOffset + numValues <= buf.length - offset) {
fields.lensModel = readString(dataOffset, numValues);
}
break;
}
}
};
const ifdOffset = read32(4);
parseIfd(ifdOffset);
return fields;
}
// ─── Merged metadata ────────────────────────────────────────────────────────
interface ImageMetadata {
id: string;
title: string[];
image: string;
alt: string;
location: string;
date: string;
tags: string[];
exif: {
camera: string;
lens: string;
aperture: string;
iso: string;
focal_length: string;
shutter_speed: string;
};
}
function formatGpsLocation(exif: ExifFields): string {
if (!exif.gpsLatitude || !exif.gpsLongitude) return "";
const latRef = exif.gpsLatitudeRef ?? "N";
const lonRef = exif.gpsLongitudeRef ?? "E";
return `${exif.gpsLatitude} ${latRef}, ${exif.gpsLongitude} ${lonRef}`;
}
function formatDate(raw: string | undefined): string {
if (!raw) return "";
// Handle "YYYY:MM:DD HH:MM:SS" or "YYYYMMDD" or "YYYY-MM-DDTHH:MM:SS"
if (/^\d{8}$/.test(raw)) {
return `${raw.slice(0, 4)}-${raw.slice(4, 6)}-${raw.slice(6, 8)}`;
}
const [datePart] = raw.split(/[T ]/);
if (!datePart) return "";
return datePart.replaceAll(":", "-");
}
async function extractMetadata(imagePath: string): Promise<ImageMetadata> {
const meta = await sharp(imagePath).metadata();
const fileName = basename(imagePath);
const iptc = meta.iptc ? parseIptc(meta.iptc) : ({} as IptcFields);
const xmp = meta.xmp ? parseXmp(meta.xmp) : ({} as XmpFields);
const exif = meta.exif ? parseExifBuffer(meta.exif) : ({} as ExifFields);
const title = iptc.title || xmp.title || "";
const caption = iptc.caption || xmp.description || "";
const keywords = iptc.keywords ?? xmp.keywords ?? [];
const date = formatDate(
exif.dateTimeOriginal ?? xmp.createDate ?? iptc.dateCreated,
);
if (!title && !caption) {
consola.warn(`No title or caption found in ${fileName}`);
}
return {
id: fileName.replace(/\.jpg$/i, ""),
title: title ? [title] : [],
image: `./${fileName}`,
alt: caption,
location: formatGpsLocation(exif),
date,
tags: keywords,
exif: {
camera: exif.model ?? "",
lens: exif.lensModel || xmp.lens || "",
aperture: exif.fNumber?.toString() ?? "",
iso: exif.iso?.toString() ?? "",
focal_length: exif.focalLength?.replace(/ mm$/, "") ?? "",
shutter_speed: exif.exposureTime ?? "",
},
};
}
// ─── CLI ────────────────────────────────────────────────────────────────────
interface CliOptions {
refresh: boolean;
photosDirectory: string;
}
function parseCliOptions(argv: string[]): CliOptions {
const nonFlagArgs = argv.filter((arg) => !arg.startsWith("--"));
return {
refresh: argv.includes("--refresh"),
photosDirectory: resolve(nonFlagArgs[0] ?? PHOTOS_DIR),
};
}
async function getImagesToProcess(
photosDirectory: string,
options: Pick<CliOptions, "refresh">,
): Promise<string[]> {
const relativeImagePaths = options.refresh
? (await getPhotoDirectories(photosDirectory)).flatMap((d) => d.imagePaths)
: await getImagesMissingMetadata(photosDirectory);
consola.info(
options.refresh
? `Refreshing ${relativeImagePaths.length} image(s)`
: `Found ${relativeImagePaths.length} image(s) without metadata`,
);
return relativeImagePaths.map((p) =>
getPhotoAbsolutePath(p, photosDirectory),
);
}
async function main() {
consola.start("Checking for images to process...");
const opts = parseCliOptions(process.argv.slice(2));
const images = await getImagesToProcess(opts.photosDirectory, opts);
if (images.length === 0) {
consola.success(
opts.refresh
? "No images found to refresh."
: "No images require metadata.",
);
return;
}
for (let i = 0; i < images.length; i++) {
const imagePath = images[i] as string;
const rel = relative(process.cwd(), imagePath);
consola.info(`Processing ${i + 1}/${images.length}: ${rel}`);
const metadata = await extractMetadata(imagePath);
const relativeImagePath = relative(opts.photosDirectory, imagePath);
const jsonPath = getMetadataPathForImage(
relativeImagePath,
opts.photosDirectory,
);
await writeFile(jsonPath, JSON.stringify(metadata, null, 2));
consola.info(`Wrote ${relative(process.cwd(), jsonPath)}`);
}
consola.success(`Processed ${images.length} image(s).`);
}
if (process.argv[1] && fileURLToPath(import.meta.url) === process.argv[1]) {
try {
await main();
} catch (error) {
consola.error(error);
process.exit(1);
}
}

scripts/new-note-mdx-prompt.sh Executable file

@@ -0,0 +1,40 @@
#!/usr/bin/env bash
# Standalone wrapper for the Obsidian Script Runner: create a new note with a cover image (MDX)
set -euo pipefail
VAULT='/Users/adrian/Obsidian/Web/adrian-altner-com'
TITLE=$(osascript \
-e 'Tell application "System Events" to display dialog "Note title (with cover):" default answer ""' \
-e 'text returned of result' 2>/dev/null) || exit 0
if [[ -z "$TITLE" ]]; then exit 0; fi
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
DATE_FOLDER=$(date +%Y/%m/%d)
PUBLISH_DATE=$(date +%Y-%m-%d)
DIR="$VAULT/content/notes/$DATE_FOLDER"
FILE="$DIR/$SLUG.mdx"
mkdir -p "$DIR"
if [[ -f "$FILE" ]]; then
osascript -e "display notification \"File already exists: $SLUG.mdx\" with title \"New Note\"" 2>/dev/null || true
exit 1
fi
cat > "$FILE" << EOF
---
title: "$TITLE"
publishDate: $PUBLISH_DATE
description: ""
cover: "./$SLUG.jpg"
coverAlt: ""
tags:
-
draft: false
syndication:
---
EOF
echo "Created: $FILE"

scripts/new-note-prompt.sh Executable file

@@ -0,0 +1,38 @@
#!/usr/bin/env bash
# Standalone wrapper for the Obsidian Script Runner: create a new note
set -euo pipefail
VAULT='/Users/adrian/Obsidian/Web/adrian-altner-com'
TITLE=$(osascript \
-e 'Tell application "System Events" to display dialog "Note title:" default answer ""' \
-e 'text returned of result' 2>/dev/null) || exit 0
if [[ -z "$TITLE" ]]; then exit 0; fi
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
DATE_FOLDER=$(date +%Y/%m/%d)
PUBLISH_DATE=$(date +%Y-%m-%d)
DIR="$VAULT/content/notes/$DATE_FOLDER"
FILE="$DIR/$SLUG.md"
mkdir -p "$DIR"
if [[ -f "$FILE" ]]; then
osascript -e "display notification \"File already exists: $SLUG.md\" with title \"New Note\"" 2>/dev/null || true
exit 1
fi
cat > "$FILE" << EOF
---
title: "$TITLE"
publishDate: $PUBLISH_DATE
description: ""
tags:
-
draft: false
syndication:
---
EOF
echo "Created: $FILE"

scripts/new-note.sh Executable file

@@ -0,0 +1,68 @@
#!/usr/bin/env bash
# Usage: new-note.sh "Note Title" [--mdx]
# Creates a new note in the Obsidian vault with correct frontmatter and folder structure.
# Use --mdx for notes that need a cover image or custom components (creates .mdx file).
set -euo pipefail
VAULT='/Users/adrian/Obsidian/Web/adrian-altner-com'
if [[ -z "${1:-}" ]]; then
echo "Usage: new-note.sh \"Note Title\" [--mdx]" >&2
exit 1
fi
TITLE="$1"
MDX=false
if [[ "${2:-}" == "--mdx" ]]; then
MDX=true
fi
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
DATE_FOLDER=$(date +%Y/%m/%d)
PUBLISH_DATE=$(date +%Y-%m-%d)
DIR="$VAULT/content/notes/$DATE_FOLDER"
if $MDX; then
EXT="mdx"
else
EXT="md"
fi
FILE="$DIR/$SLUG.$EXT"
mkdir -p "$DIR"
if [[ -f "$FILE" ]]; then
echo "File already exists: $FILE" >&2
exit 1
fi
if $MDX; then
cat > "$FILE" << EOF
---
title: "$TITLE"
publishDate: $PUBLISH_DATE
description: ""
cover: "./$SLUG.jpg"
coverAlt: ""
tags:
-
draft: false
syndication:
---
EOF
else
cat > "$FILE" << EOF
---
title: "$TITLE"
publishDate: $PUBLISH_DATE
description: ""
tags:
-
draft: false
syndication:
---
EOF
fi
echo "Created: $FILE"
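The slug derivation these scripts share can be checked in isolation. The pipeline below is an equivalent portable form of the title-to-slug transformation (`sed -E`, since BSD sed on macOS does not support `\+` in basic regular expressions):

```bash
# Lowercase, strip everything but letters/digits/spaces/hyphens,
# collapse runs of spaces to single hyphens, trim leading/trailing hyphens.
TITLE='Hello, World! 1st Note'
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' \
  | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
echo "$SLUG"   # → hello-world-1st-note
```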

scripts/new-post-prompt.sh Executable file

@@ -0,0 +1,38 @@
#!/usr/bin/env bash
# Standalone wrapper for the Obsidian Script Runner: create a new blog post
set -euo pipefail
VAULT='/Users/adrian/Obsidian/Web/adrian-altner-com'
TITLE=$(osascript \
-e 'Tell application "System Events" to display dialog "Post title:" default answer ""' \
-e 'text returned of result' 2>/dev/null) || exit 0
if [[ -z "$TITLE" ]]; then exit 0; fi
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
DATE_FOLDER=$(date +%Y/%m/%d)
PUBLISH_DATE=$(date +%Y-%m-%dT%H:%M:%S%z)
DIR="$VAULT/content/blog/posts/$DATE_FOLDER"
FILE="$DIR/$SLUG.md"
mkdir -p "$DIR"
if [[ -f "$FILE" ]]; then
osascript -e "display notification \"File already exists: $SLUG.md\" with title \"New Post\"" 2>/dev/null || true
exit 1
fi
cat > "$FILE" << EOF
---
title: "$TITLE"
description: ""
publishDate: $PUBLISH_DATE
tags:
-
draft: true
syndication:
---
EOF
echo "Created: $FILE"

scripts/new-post.sh Executable file

@@ -0,0 +1,39 @@
#!/usr/bin/env bash
# Usage: new-post.sh "Post Title"
# Creates a new blog post in the Obsidian vault with correct frontmatter and folder structure.
set -euo pipefail
VAULT='/Users/adrian/Obsidian/Web/adrian-altner-com'
if [[ -z "${1:-}" ]]; then
echo "Usage: new-post.sh \"Post Title\"" >&2
exit 1
fi
TITLE="$1"
SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9 -]//g; s/ +/-/g; s/^-+//; s/-+$//')
DATE_FOLDER=$(date +%Y/%m/%d)
PUBLISH_DATE=$(date +%Y-%m-%dT%H:%M:%S%z)
DIR="$VAULT/content/blog/posts/$DATE_FOLDER"
FILE="$DIR/$SLUG.md"
mkdir -p "$DIR"
if [[ -f "$FILE" ]]; then
echo "File already exists: $FILE" >&2
exit 1
fi
cat > "$FILE" << EOF
---
title: "$TITLE"
description: ""
publishDate: $PUBLISH_DATE
tags:
-
draft: true
syndication:
---
EOF
echo "Created: $FILE"

scripts/publish-all.sh Executable file

@@ -0,0 +1,62 @@
#!/usr/bin/env bash
set -euo pipefail
VAULT_CONTENT='/Users/adrian/Library/Mobile Documents/iCloud~md~obsidian/Documents/03 Bereiche/Webseite/adrian-altner-de/content'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/adrian-altner.de'
REMOTE_CONTENT="${REMOTE_BASE}/src/content"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_CONTENT'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--include='*.jpg' \
--include='*.jpeg' \
--include='*.png' \
--include='*.PNG' \
--include='*.JPG' \
--include='*.JPEG' \
--include='*.json' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_CONTENT/" "$VPS:$REMOTE_CONTENT/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
sudo podman build -t localhost/adrian-altner.de:latest .
sudo systemctl restart podman-compose@adrian-altner.de.service
sudo podman container prune -f 2>/dev/null || true
sudo podman image prune --external -f 2>/dev/null || true
sudo podman image prune -f 2>/dev/null || true
sudo podman builder prune -af 2>/dev/null || true
"
echo "Redeploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions via webmention.app..."
for feed in rss/blog.xml rss/fotos.xml; do
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/${feed}&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
done
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi

scripts/publish-blog.sh Executable file

@@ -0,0 +1,62 @@
#!/usr/bin/env bash
# Usage: publish-blog.sh [vps-host] [branch]
# Can be called from any directory — no dependency on the repo being the working dir.
set -euo pipefail
VAULT_BLOG='/Users/adrian/Obsidian/Web/adrian-altner-com/content/blog'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/www.adrian-altner.de'
REMOTE_BLOG="${REMOTE_BASE}/src/content/blog"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_BLOG'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--include='*.jpg' \
--include='*.jpeg' \
--include='*.JPG' \
--include='*.JPEG' \
--include='*.png' \
--include='*.PNG' \
--include='*.webp' \
--include='*.gif' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_BLOG/" "$VPS:$REMOTE_BLOG/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
podman-compose -f compose.yml up --build -d --force-recreate
podman image prune -af
podman builder prune -af || true
"
echo "Blog deploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions..."
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/rss/blog.xml&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi

scripts/publish-links.sh Executable file

@@ -0,0 +1,52 @@
#!/usr/bin/env bash
# Usage: publish-links.sh [vps-host] [branch]
# Can be called from any directory — no dependency on the repo being the working dir.
set -euo pipefail
VAULT_LINKS='/Users/adrian/Obsidian/Web/adrian-altner-com/content/links'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/www.adrian-altner.de'
REMOTE_LINKS="${REMOTE_BASE}/src/content/links"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_LINKS'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_LINKS/" "$VPS:$REMOTE_LINKS/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
podman-compose -f compose.yml up --build -d --force-recreate
podman image prune -af
podman builder prune -af
"
echo "Links deploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions..."
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/rss/links.xml&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi

scripts/publish-notes.sh Executable file

@@ -0,0 +1,58 @@
#!/usr/bin/env bash
# Usage: publish-notes.sh [vps-host] [branch]
# Can be called from any directory — no dependency on the repo being the working dir.
set -euo pipefail
VAULT_NOTES='/Users/adrian/Obsidian/Web/adrian-altner-com/content/notes'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/www.adrian-altner.de'
REMOTE_NOTES="${REMOTE_BASE}/src/content/notes"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_NOTES'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--include='*.jpg' \
--include='*.jpeg' \
--include='*.JPG' \
--include='*.JPEG' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_NOTES/" "$VPS:$REMOTE_NOTES/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
podman-compose -f compose.yml up --build -d --force-recreate
podman image prune -af
podman builder prune -af
"
echo "Notes deploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions..."
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/rss/notes.xml&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi

scripts/publish-photos.sh Executable file

@@ -0,0 +1,57 @@
#!/usr/bin/env bash
# Usage: publish-photos.sh [vps-host] [branch]
# Can be called from any directory — no dependency on the repo being the working dir.
set -euo pipefail
VAULT_PHOTOS='/Users/adrian/Obsidian/Web/adrian-altner-com/content/photos'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/www.adrian-altner.de'
REMOTE_PHOTOS="${REMOTE_BASE}/src/content/photos"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_PHOTOS'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--include='*.jpg' \
--include='*.jpeg' \
--include='*.JPG' \
--include='*.JPEG' \
--include='*.json' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_PHOTOS/" "$VPS:$REMOTE_PHOTOS/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
podman-compose -f compose.yml up --build -d --force-recreate
podman image prune -af
podman builder prune -af
"
echo "Photos deploy done via $VPS (branch: $REMOTE_BRANCH)."
# --- 3. Webmentions ---
WEBMENTION_APP_TOKEN="$(ssh "$VPS" "grep '^WEBMENTION_APP_TOKEN=' '$REMOTE_BASE/.env.production' | cut -d= -f2-" 2>/dev/null || true)"
if [[ -n "$WEBMENTION_APP_TOKEN" ]]; then
echo "Sending webmentions..."
curl -s -X POST "https://webmention.app/check?url=https://adrian-altner.de/rss/photos.xml&token=${WEBMENTION_APP_TOKEN}" \
| grep -o '"status":"[^"]*"' || true
echo "Webmentions triggered."
else
echo "No WEBMENTION_APP_TOKEN in .env.production — skipping webmentions."
fi

scripts/publish-projects.sh Executable file

@@ -0,0 +1,49 @@
#!/usr/bin/env bash
# Usage: publish-projects.sh [vps-host] [branch]
# Can be called from any directory — no dependency on the repo being the working dir.
set -euo pipefail
VAULT_PROJECTS='/Users/adrian/Obsidian/Web/adrian-altner-com/content/projects'
VPS="${1:-hetzner}"
REMOTE_BRANCH="${2:-main}"
REMOTE_BASE='/opt/websites/www.adrian-altner.de'
REMOTE_PROJECTS="${REMOTE_BASE}/src/content/projects"
# --- 1. Sync vault to VPS ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
git fetch --prune origin '$REMOTE_BRANCH'
git checkout '$REMOTE_BRANCH'
git reset --hard 'origin/$REMOTE_BRANCH'
git clean -fd -e .env -e .env.production
mkdir -p '$REMOTE_PROJECTS'
"
rsync -az --delete \
--include='*/' \
--include='*.md' \
--include='*.mdx' \
--include='*.jpg' \
--include='*.jpeg' \
--include='*.JPG' \
--include='*.JPEG' \
--include='*.png' \
--include='*.PNG' \
--include='*.webp' \
--include='*.gif' \
--exclude='.DS_Store' \
--exclude='*' \
"$VAULT_PROJECTS/" "$VPS:$REMOTE_PROJECTS/"
# --- 2. Build + cleanup ---
ssh "$VPS" "
set -euo pipefail
cd '$REMOTE_BASE'
podman-compose -f compose.yml up --build -d --force-recreate
podman image prune -af
podman builder prune -af
"
echo "Projects deploy done via $VPS (branch: $REMOTE_BRANCH)."

scripts/squash-history.sh Executable file

@@ -0,0 +1,23 @@
#!/usr/bin/env bash
# squash-history.sh — Replaces entire git history with a single "init" commit.
# WARNING: Destructive and irreversible. Force-pushes to remote.
set -euo pipefail
COMMIT_MSG="${1:-init}"
REMOTE="${2:-origin}"
BRANCH="main"
TEMP="temp-squash-$$"
echo "⚠️ This will destroy all git history and force-push to $REMOTE/$BRANCH."
read -r -p "Continue? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { echo "Aborted."; exit 0; }
git checkout --orphan "$TEMP"
git add -A
git commit -m "$COMMIT_MSG"
git branch -D "$BRANCH"
git branch -m "$TEMP" "$BRANCH"
git push --force "$REMOTE" "$BRANCH"
echo "Done. $(git log --oneline)"

scripts/vision.spec.ts Normal file

@@ -0,0 +1,91 @@
import assert from "node:assert/strict";
import { mkdir, mkdtemp, rm, writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";
import type {
ExifMetadata,
ImageMetadataSuggestion,
VisionAIResult,
} from "./vision.ts";
import { getImagesToProcess, mergeMetaAndVisionData } from "./vision.ts";
const FINAL: ImageMetadataSuggestion = {
id: "2R9A2805",
title: [
"Blossom and Buzz",
"Spring's Gentle Awakening",
"Cherry Blossom Haven",
"Nature's Delicate Balance",
"A Bee's Spring Feast",
],
image: "./2R9A2805.jpg",
alt: "Close-up of vibrant pink cherry blossoms on a branch with a honeybee collecting nectar. The bee's wings are slightly blurred, capturing its motion as it works. The background is a soft, dreamy pink hue, complementing the sharp details of the blossoms and the bee.",
location: "48 deg 8' 37.56\" N, 11 deg 34' 13.32\" E",
date: "2024-03-17",
tags: ["nature", "cherryblossom", "bee", "spring", "floral"],
exif: {
camera: "Canon EOS R6m2",
lens: "RF70-200mm F2.8 L IS USM",
aperture: "2.8",
iso: "125",
focal_length: "200.0",
shutter_speed: "1/1000",
},
};
const VISION_DATA: VisionAIResult = {
title_ideas: [
"Blossom and Buzz",
"Spring's Gentle Awakening",
"Cherry Blossom Haven",
"Nature's Delicate Balance",
"A Bee's Spring Feast",
],
description:
"Close-up of vibrant pink cherry blossoms on a branch with a honeybee collecting nectar. The bee's wings are slightly blurred, capturing its motion as it works. The background is a soft, dreamy pink hue, complementing the sharp details of the blossoms and the bee.",
tags: ["nature", "cherryblossom", "bee", "spring", "floral"],
};
const EXIF_DATA: ExifMetadata = {
SourceFile: "/Users/flori/Sites/flori-dev/src/content/grid/2R9A2805.jpg",
FileName: "2R9A2805.jpg",
Model: "Canon EOS R6m2",
ExposureTime: "1/1000",
FNumber: 2.8,
ISO: 125,
DateTimeOriginal: "2024:03:17 15:06:16",
FocalLength: "200.0 mm",
LensModel: "RF70-200mm F2.8 L IS USM",
GPSPosition: "48 deg 8' 37.56\" N, 11 deg 34' 13.32\" E",
};
async function main() {
const tempRoot = await mkdtemp(join(tmpdir(), "vision-photos-"));
try {
assert.deepEqual(mergeMetaAndVisionData(EXIF_DATA, VISION_DATA), FINAL);
const albumDirectory = join(tempRoot, "chiang-mai");
const missingImage = join(albumDirectory, "2025-10-06-121017.jpg");
const completeImage = join(albumDirectory, "2025-10-06-121212.jpg");
await mkdir(albumDirectory, { recursive: true });
await writeFile(missingImage, "");
await writeFile(completeImage, "");
await writeFile(join(albumDirectory, "2025-10-06-121212.json"), "{}");
assert.deepEqual(await getImagesToProcess(tempRoot), [missingImage]);
assert.deepEqual(
await getImagesToProcess(tempRoot, { refresh: true, exifOnly: false }),
[missingImage, completeImage],
);
assert.deepEqual(
await getImagesToProcess(tempRoot, { refresh: false, exifOnly: true }),
[completeImage],
);
} finally {
await rm(tempRoot, { recursive: true, force: true });
}
}
await main();

scripts/vision.ts Normal file

@@ -0,0 +1,820 @@
#!/usr/bin/env -S node --experimental-strip-types
import { execFile } from "node:child_process";
import { readFile, writeFile } from "node:fs/promises";
import { relative, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { promisify } from "node:util";
import Anthropic from "@anthropic-ai/sdk";
import { consola } from "consola";
import OpenAI from "openai";
import sharp from "sharp";
import {
getImagesMissingMetadata,
getImagesWithExistingMetadata,
getMetadataPathForImage,
getPhotoAbsolutePath,
getPhotoDirectories,
PHOTOS_DIRECTORY,
} from "../src/lib/photo-albums.ts";
const execFileAsync = promisify(execFile);
/**
* Define the directory where the images are located.
*/
const PHOTOS_DIR = PHOTOS_DIRECTORY;
/**
* Lazily instantiated API clients, one per supported provider.
*/
type VisionProvider = "anthropic" | "openai";
let anthropic: Anthropic | undefined;
let openai: OpenAI | undefined;
function getAnthropicClient(): Anthropic {
anthropic ??= new Anthropic({ maxRetries: 0 });
return anthropic;
}
function getOpenAIClient(): OpenAI {
openai ??= new OpenAI({ maxRetries: 0 });
return openai;
}
function assertRequiredEnvironment(provider: VisionProvider): void {
if (provider === "anthropic" && !process.env.ANTHROPIC_API_KEY) {
throw new Error(
"Missing ANTHROPIC_API_KEY. `pnpm run vision` loads `.env.local` automatically. If you run the script directly, use `node --env-file=.env.local --experimental-strip-types scripts/vision.ts`.",
);
}
if (provider === "openai" && !process.env.OPENAI_API_KEY) {
throw new Error("Missing OPENAI_API_KEY. Set it in `.env.local`.");
}
}
/**
* Represents the metadata of an image in the Exif format.
*/
export interface ExifMetadata {
SourceFile: string;
FileName: string;
Model: string;
FNumber: number;
FocalLength: string;
ExposureTime: string;
ISO: number;
DateTimeOriginal: string;
LensModel: string;
GPSPosition?: string;
GPSLatitude?: string;
GPSLongitude?: string;
Keywords?: string | string[];
Subject?: string | string[];
Title?: string;
"Caption-Abstract"?: string;
}
/**
* Represents the result of the AI analysis.
*/
export interface VisionAIResult {
title_ideas: string[];
description: string;
tags: string[];
}
/**
* Represents the final metadata suggestion for an image.
*/
export interface ImageMetadataSuggestion {
id: string;
title: string[];
image: string;
alt: string;
location: string;
locationName?: string;
date: string;
tags: string[];
exif: {
camera: string;
lens: string;
aperture: string;
iso: string;
focal_length: string;
shutter_speed: string;
};
}
interface VisionCliOptions {
refresh: boolean;
exifOnly: boolean;
photosDirectory?: string;
visionConcurrency: number;
visionMaxRetries: number;
visionBaseBackoffMs: number;
provider: VisionProvider;
}
function parseCliOptions(argv: string[]): VisionCliOptions {
const getNumericOption = (name: string, fallback: number): number => {
const prefix = `--${name}=`;
const rawValue = argv
.find((arg) => arg.startsWith(prefix))
?.slice(prefix.length);
const parsed = Number.parseInt(rawValue ?? "", 10);
return Number.isFinite(parsed) && parsed > 0 ? parsed : fallback;
};
const envConcurrency = Number.parseInt(
process.env.VISION_CONCURRENCY ?? "",
10,
);
const envMaxRetries = Number.parseInt(
process.env.VISION_MAX_RETRIES ?? "",
10,
);
const envBaseBackoffMs = Number.parseInt(
process.env.VISION_BASE_BACKOFF_MS ?? "",
10,
);
const nonFlagArgs = argv.filter((arg) => !arg.startsWith("--"));
const providerArg = argv
.find((arg) => arg.startsWith("--provider="))
?.slice("--provider=".length);
const envProvider = process.env.VISION_PROVIDER;
const rawProvider = providerArg ?? envProvider ?? "anthropic";
const provider: VisionProvider =
rawProvider === "openai" ? "openai" : "anthropic";
return {
refresh: argv.includes("--refresh"),
exifOnly: argv.includes("--exif-only"),
provider,
photosDirectory: resolve(nonFlagArgs[0] ?? PHOTOS_DIR),
visionConcurrency: getNumericOption(
"concurrency",
Number.isFinite(envConcurrency) && envConcurrency > 0
? envConcurrency
: 2,
),
visionMaxRetries: getNumericOption(
"retries",
Number.isFinite(envMaxRetries) && envMaxRetries > 0 ? envMaxRetries : 8,
),
visionBaseBackoffMs: getNumericOption(
"backoff-ms",
Number.isFinite(envBaseBackoffMs) && envBaseBackoffMs > 0
? envBaseBackoffMs
: 1500,
),
};
}
function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
}
function isRateLimitError(error: unknown): boolean {
return error instanceof Anthropic.RateLimitError;
}
function extractRetryAfterMs(error: unknown): number | null {
if (!(error instanceof Anthropic.RateLimitError)) {
return null;
}
const retryAfter = error.headers?.get("retry-after");
if (retryAfter) {
const seconds = Number.parseFloat(retryAfter);
if (Number.isFinite(seconds) && seconds > 0) {
return Math.ceil(seconds * 1000);
}
}
return null;
}
async function mapWithConcurrency<T, R>(
values: T[],
concurrency: number,
mapper: (value: T, index: number) => Promise<R>,
): Promise<R[]> {
if (values.length === 0) {
return [];
}
const results: R[] = new Array(values.length);
const workerCount = Math.max(1, Math.min(concurrency, values.length));
let cursor = 0;
const workers = Array.from({ length: workerCount }, async () => {
while (true) {
const currentIndex = cursor;
cursor += 1;
if (currentIndex >= values.length) {
return;
}
const value = values[currentIndex];
if (typeof value === "undefined") {
continue;
}
results[currentIndex] = await mapper(value, currentIndex);
}
});
await Promise.all(workers);
return results;
}
/**
* Resolves the list of images to process, depending on mode: images without
* a JSON sidecar (default), all images (`--refresh`), or only images that
* already have a sidecar (`--exif-only`).
*/
export async function getImagesToProcess(
photosDirectory = PHOTOS_DIR,
options: Pick<VisionCliOptions, "refresh" | "exifOnly"> = {
refresh: false,
exifOnly: false,
},
): Promise<string[]> {
let relativeImagePaths: string[];
let label: string;
if (options.exifOnly) {
relativeImagePaths = await getImagesWithExistingMetadata(photosDirectory);
label = `Found ${relativeImagePaths.length} ${relativeImagePaths.length === 1 ? "image" : "images"} with existing metadata (EXIF-only update)`;
} else if (options.refresh) {
relativeImagePaths = (await getPhotoDirectories(photosDirectory)).flatMap(
(directory) => directory.imagePaths,
);
label = `Refreshing ${relativeImagePaths.length} ${relativeImagePaths.length === 1 ? "image" : "images"} with metadata sidecars`;
} else {
relativeImagePaths = await getImagesMissingMetadata(photosDirectory);
label = `Found ${relativeImagePaths.length} ${relativeImagePaths.length === 1 ? "image" : "images"} without metadata`;
}
consola.info(label);
return relativeImagePaths.map((imagePath) =>
getPhotoAbsolutePath(imagePath, photosDirectory),
);
}
/**
* Reads an existing JSON sidecar for an image, preserving all fields.
*/
async function readExistingJsonSidecar(
imagePath: string,
photosDirectory: string,
): Promise<Record<string, unknown>> {
const relativeImagePath = relative(photosDirectory, imagePath);
const jsonPath = getMetadataPathForImage(relativeImagePath, photosDirectory);
const content = await readFile(jsonPath, "utf-8");
return JSON.parse(content) as Record<string, unknown>;
}
/**
* Updates only the EXIF-derived fields in an existing metadata object,
* preserving all other fields (title, alt, tags, flickrId, etc.).
*/
export function mergeExifIntoExisting(
exifData: ExifMetadata,
existing: Record<string, unknown>,
locationName?: string | null,
): Record<string, unknown> {
const [date] = exifData.DateTimeOriginal.split(" ");
if (!date) {
throw new Error(`Missing original date for ${exifData.SourceFile}.`);
}
const result: Record<string, unknown> = {
...existing,
location: getLocationFromExif(exifData),
date: date.replaceAll(":", "-"),
exif: {
camera: exifData.Model,
lens: exifData.LensModel,
aperture: exifData.FNumber.toString(),
iso: exifData.ISO.toString(),
focal_length: exifData.FocalLength.replace(" mm", ""),
shutter_speed: exifData.ExposureTime,
},
};
if (locationName) {
result.locationName = locationName;
}
return result;
}
/**
* Extracts the EXIF metadata from an image file.
* @param imagePath - The path to the image file.
*
* @returns A promise that resolves to the extracted EXIF metadata.
*/
export async function extractExifMetadata(
imagePath: string,
): Promise<ExifMetadata> {
/// Check if `exiftool` is installed.
try {
await execFileAsync("exiftool", ["-ver"]);
} catch (_error) {
consola.error(
"exiftool is not installed. Please run `brew install exiftool`.",
);
process.exit(1);
}
/// Extract the metadata
const { stdout } = await execFileAsync("exiftool", ["-j", imagePath]);
const output = JSON.parse(stdout) as ExifMetadata[];
if (!output[0]) {
throw new Error(`No EXIF metadata found for ${imagePath}.`);
}
return output[0];
}
/**
* The Vision API internally downscales to max 1568px on the longest side.
* Anything larger wastes tokens without improving results.
*/
const VISION_MAX_DIMENSION = 1568;
/**
* Downscales an image and encodes it to base64.
* @param imagePath - The path to the image file.
* @returns A Promise that resolves to the base64-encoded JPEG.
*/
async function base64EncodeImage(imagePath: string): Promise<string> {
const resized = await sharp(imagePath)
.resize({
width: VISION_MAX_DIMENSION,
height: VISION_MAX_DIMENSION,
fit: "inside",
withoutEnlargement: true,
})
.jpeg({ quality: 80 })
.toBuffer();
return resized.toString("base64");
}
const VISION_TOOL = {
name: "vision_response",
description: "Return the vision analysis of the image.",
input_schema: {
type: "object" as const,
additionalProperties: false,
properties: {
title_ideas: { type: "array", items: { type: "string" } },
description: { type: "string" },
tags: { type: "array", items: { type: "string" } },
},
required: ["title_ideas", "description", "tags"],
},
};
/**
* Builds the German vision prompt, enriching it with location and
* photographer-supplied keyword context from the EXIF data.
*/
function buildVisionPrompt(
metadata: ExifMetadata,
locationName: string | null,
): string {
const locationContext = locationName
? ` Das Foto wurde aufgenommen in: ${locationName}. Verwende diesen Ort konkret in der Beschreibung.`
: "";
const rawKeywords = metadata.Keywords ?? metadata.Subject ?? [];
const keywords = Array.isArray(rawKeywords) ? rawKeywords : [rawKeywords];
const keywordContext =
keywords.length > 0
? ` Folgende Tags sind vom Fotografen vergeben worden: ${keywords.join(", ")}. Beruecksichtige diese Informationen in der Beschreibung.`
: "";
return `Erstelle eine präzise und detaillierte Beschreibung dieses Bildes, die auch als Alt-Text funktioniert. Der Alt-Text soll keine Wörter wie Bild, Foto, Fotografie, Illustration oder Ähnliches enthalten. Beschreibe die Szene so, wie sie ist.${locationContext}${keywordContext} Erstelle außerdem 5 Titelvorschläge für dieses Bild. Schlage zuletzt 5 Tags vor, die zur Bildbeschreibung passen. Diese Tags sollen einzelne Wörter sein. Identifiziere das Hauptmotiv oder Thema und stelle den entsprechenden Tag an die erste Stelle. Gib die Beschreibung, die Titelvorschläge und die Tags zurück.`;
}
async function callAnthropicVision(
encodedImage: string,
prompt: string,
sourceFile: string,
): Promise<VisionAIResult> {
const response = await getAnthropicClient().messages.create({
model: "claude-opus-4-6",
max_tokens: 2048,
tools: [VISION_TOOL],
tool_choice: { type: "tool", name: "vision_response" },
messages: [
{
role: "user",
content: [
{
type: "image",
source: {
type: "base64",
media_type: "image/jpeg",
data: encodedImage,
},
},
{ type: "text", text: prompt },
],
},
],
});
const toolUseBlock = response.content.find((b) => b.type === "tool_use");
if (!toolUseBlock || toolUseBlock.type !== "tool_use") {
throw new Error(`No tool use response from AI for ${sourceFile}.`);
}
return toolUseBlock.input as VisionAIResult;
}
async function callOpenAIVision(
encodedImage: string,
prompt: string,
sourceFile: string,
): Promise<VisionAIResult> {
const jsonPrompt = `${prompt}\n\nWICHTIG: Antworte komplett auf Deutsch. Alle Titel, Beschreibungen und Tags muessen auf Deutsch sein.\n\nAntworte ausschliesslich mit einem JSON-Objekt im folgenden Format:\n{"title_ideas": ["...", "...", "...", "...", "..."], "description": "...", "tags": ["...", "...", "...", "...", "..."]}`;
const response = await getOpenAIClient().chat.completions.create({
model: "gpt-4o-mini",
max_tokens: 2048,
response_format: { type: "json_object" },
messages: [
{
role: "user",
content: [
{
type: "image_url",
image_url: { url: `data:image/jpeg;base64,${encodedImage}` },
},
{ type: "text", text: jsonPrompt },
],
},
],
});
const content = response.choices[0]?.message?.content;
if (!content) {
throw new Error(`No response from OpenAI for ${sourceFile}.`);
}
return JSON.parse(content) as VisionAIResult;
}
function isRetryableError(error: unknown, provider: VisionProvider): boolean {
if (provider === "anthropic") {
return isRateLimitError(error);
}
if (error instanceof OpenAI.RateLimitError) {
return true;
}
return false;
}
/**
* Generates the image description, title suggestions and tags using the
* configured vision provider, retrying with exponential backoff on rate limits.
*/
async function generateImageDescriptionTitleSuggestionsAndTags(
metadata: ExifMetadata,
locationName: string | null,
options: Pick<
VisionCliOptions,
"visionMaxRetries" | "visionBaseBackoffMs" | "provider"
>,
): Promise<VisionAIResult> {
const encodedImage = await base64EncodeImage(metadata.SourceFile);
const prompt = buildVisionPrompt(metadata, locationName);
let lastError: unknown;
for (let attempt = 0; attempt <= options.visionMaxRetries; attempt += 1) {
try {
const result =
options.provider === "openai"
? await callOpenAIVision(encodedImage, prompt, metadata.SourceFile)
: await callAnthropicVision(
encodedImage,
prompt,
metadata.SourceFile,
);
if (
result.title_ideas.length === 0 ||
result.description.length === 0 ||
result.tags.length === 0
) {
throw new Error(
`Incomplete vision response for ${metadata.SourceFile}.`,
);
}
return result;
} catch (error) {
lastError = error;
if (
!isRetryableError(error, options.provider) ||
attempt >= options.visionMaxRetries
) {
break;
}
const retryAfterMs = extractRetryAfterMs(error);
const exponentialBackoffMs = options.visionBaseBackoffMs * 2 ** attempt;
const jitterMs = Math.floor(Math.random() * 350);
const waitMs =
Math.max(retryAfterMs ?? 0, exponentialBackoffMs) + jitterMs;
const relativeSourcePath = relative(process.cwd(), metadata.SourceFile);
const nextAttempt = attempt + 1;
consola.warn(
`Rate limit for ${relativeSourcePath}. Retry ${nextAttempt}/${options.visionMaxRetries} in ${Math.ceil(waitMs / 1000)}s...`,
);
await sleep(waitMs);
}
}
throw lastError;
}
function ensureVisionCanRun(
imagesToProcess: string[],
provider: VisionProvider,
): void {
if (imagesToProcess.length === 0) {
return;
}
assertRequiredEnvironment(provider);
}
/**
* Parses an EXIF DMS string like `7 deg 49' 12.00" N` into a decimal number.
*/
function parseDms(dms: string): number {
const match = dms.match(/(\d+)\s*deg\s*(\d+)'\s*([\d.]+)"\s*([NSEW])/i);
if (!match) {
return Number.NaN;
}
const [, deg, min, sec, dir] = match;
let decimal = Number(deg) + Number(min) / 60 + Number(sec) / 3600;
if (dir === "S" || dir === "W") {
decimal *= -1;
}
return decimal;
}
/**
* Resolves GPS coordinates to a human-readable location via Nominatim.
*/
async function reverseGeocode(
lat: number,
lon: number,
): Promise<string | null> {
const url = `https://nominatim.openstreetmap.org/reverse?lat=${lat}&lon=${lon}&format=json&accept-language=de&zoom=14`;
try {
const response = await fetch(url, {
headers: { "User-Agent": "adrian-altner.de/vision-script" },
});
if (!response.ok) return null;
const data = (await response.json()) as { display_name?: string };
return data.display_name ?? null;
} catch {
return null;
}
}
/**
* Resolves EXIF GPS data to a readable location name. Returns null if no GPS data.
*/
async function resolveLocationName(
exifData: ExifMetadata,
): Promise<string | null> {
const latStr = exifData.GPSLatitude;
const lonStr = exifData.GPSLongitude;
if (!latStr || !lonStr) return null;
const lat = parseDms(latStr);
const lon = parseDms(lonStr);
if (Number.isNaN(lat) || Number.isNaN(lon)) return null;
return await reverseGeocode(lat, lon);
}
function getLocationFromExif(exifData: ExifMetadata): string {
if (exifData.GPSPosition) {
return exifData.GPSPosition;
}
if (exifData.GPSLatitude && exifData.GPSLongitude) {
return `${exifData.GPSLatitude}, ${exifData.GPSLongitude}`;
}
return "";
}
/**
* Merges the metadata from EXIF data and vision data to create an ImageMetadataSuggestion object.
* @param exifData - The EXIF metadata of the image.
* @param visionData - The vision AI result data of the image.
* @returns The merged ImageMetadataSuggestion object.
*/
export function mergeMetaAndVisionData(
exifData: ExifMetadata,
visionData: VisionAIResult,
locationName?: string | null,
): ImageMetadataSuggestion {
const [date] = exifData.DateTimeOriginal.split(" ");
if (!date) {
throw new Error(`Missing original date for ${exifData.SourceFile}.`);
}
const result: ImageMetadataSuggestion = {
id: exifData.FileName.replace(/\.jpe?g$/i, ""),
title: visionData.title_ideas,
image: `./${exifData.FileName}`,
alt: visionData.description,
location: getLocationFromExif(exifData),
date: date.replaceAll(":", "-"),
tags: visionData.tags,
exif: {
camera: exifData.Model,
lens: exifData.LensModel,
aperture: exifData.FNumber.toString(),
iso: exifData.ISO.toString(),
focal_length: exifData.FocalLength.replace(" mm", ""),
shutter_speed: exifData.ExposureTime,
},
};
if (locationName) {
result.locationName = locationName;
}
return result;
}
/**
* Writes the given image metadata to a JSON file.
* @param imageMetadata - The image metadata to be written.
* @returns A Promise that resolves when the JSON file is written successfully.
*/
async function writeToJsonFile(
imageMetadata: ImageMetadataSuggestion,
imagePath: string,
photosDirectory: string,
): Promise<void> {
const relativeImagePath = relative(photosDirectory, imagePath);
const jsonPath = getMetadataPathForImage(relativeImagePath, photosDirectory);
const json = JSON.stringify(imageMetadata, null, 2);
await writeFile(jsonPath, json);
}
/**
* Main.
*/
async function main() {
consola.start("Checking for images to process...");
const cliOptions = parseCliOptions(process.argv.slice(2));
const photosDirectory = cliOptions.photosDirectory ?? PHOTOS_DIR;
/// Load all images that don't have a JSON file.
const images = await getImagesToProcess(photosDirectory, cliOptions);
if (images.length === 0) {
consola.success(
cliOptions.exifOnly
? "No images with existing metadata found."
: cliOptions.refresh
? "No images found to refresh."
: "No images require metadata.",
);
return;
}
/// Extract EXIF metadata from these images.
const exifData = await mapWithConcurrency(
images,
8,
async (imagePath, index) => {
consola.info(`Extracting EXIF ${index + 1}/${images.length}...`);
return await extractExifMetadata(imagePath);
},
);
/// Resolve location names via Nominatim (sequential, 1 req/s limit).
consola.info("Resolving location names via Nominatim...");
const locationNames: (string | null)[] = [];
for (const exifEntry of exifData) {
const name = await resolveLocationName(exifEntry);
locationNames.push(name);
if (name) {
await sleep(1000);
}
}
const resolvedCount = locationNames.filter(Boolean).length;
consola.info(`Resolved ${resolvedCount}/${exifData.length} location names.`);
if (cliOptions.exifOnly) {
/// EXIF-only mode: read existing JSON, merge EXIF fields, write back.
const existingData = await mapWithConcurrency(
images,
8,
async (imagePath) => readExistingJsonSidecar(imagePath, photosDirectory),
);
await mapWithConcurrency(exifData, 8, async (exifEntry, index) => {
const existing = existingData[index];
if (!existing) {
throw new Error(
`Missing existing metadata for ${exifEntry.SourceFile}.`,
);
}
const updated = mergeExifIntoExisting(
exifEntry,
existing,
locationNames[index],
);
const relativeImagePath = relative(photosDirectory, exifEntry.SourceFile);
const jsonPath = getMetadataPathForImage(
relativeImagePath,
photosDirectory,
);
await writeFile(jsonPath, JSON.stringify(updated, null, 2));
consola.info(
`Updated EXIF ${index + 1}/${exifData.length}: ${relativeImagePath}`,
);
});
consola.success("All EXIF data updated successfully.");
return;
}
consola.info(
`Vision settings: provider=${cliOptions.provider}, concurrency=${cliOptions.visionConcurrency}, retries=${cliOptions.visionMaxRetries}, backoff=${cliOptions.visionBaseBackoffMs}ms`,
);
ensureVisionCanRun(images, cliOptions.provider);
/// Determine the image description, title suggestions and tags for each image with AI.
const visionData = await mapWithConcurrency(
exifData,
cliOptions.visionConcurrency,
async (exifEntry, index) => {
consola.info(`Generating AI metadata ${index + 1}/${exifData.length}...`);
return await generateImageDescriptionTitleSuggestionsAndTags(
exifEntry,
locationNames[index] ?? null,
cliOptions,
);
},
);
/// Merge the EXIF and Vision data to create the final metadata suggestion.
const imageData = exifData.map((e, i) => {
const currentVisionData = visionData[i];
if (!currentVisionData) {
throw new Error(`Missing vision data for ${e.SourceFile}.`);
}
return mergeMetaAndVisionData(e, currentVisionData, locationNames[i]);
});
/// Write the metadata to JSON files.
await mapWithConcurrency(imageData, 8, async (imageMetadata, index) => {
const sourceFile = exifData[index]?.SourceFile;
if (!sourceFile) {
throw new Error(`Missing source file for ${imageMetadata.id}.`);
}
await writeToJsonFile(imageMetadata, sourceFile, photosDirectory);
consola.info(`Wrote metadata ${index + 1}/${imageData.length}.`);
});
consola.success("All images processed successfully.");
}
if (process.argv[1] && fileURLToPath(import.meta.url) === process.argv[1]) {
try {
await main();
} catch (error) {
consola.error(error);
process.exit(1);
}
}