MUSEUM AI PLATFORM
A concept and working technical proof of concept for an AI-powered digital archive platform — making 114,000+ records from legacy museum databases searchable via natural language, with results enriched by vision models, linked to editorial content, and connected to a community participation layer. Eduardo led the technical direction and architecture, built the PoC end-to-end, and contributed to the UX and platform concept alongside the design team.

OVERVIEW _
The brief was to test whether a new kind of digital platform was possible: one that could make a major institution’s historical archive genuinely searchable — not just by keyword, but by meaning — while connecting it to editorial content, community participation, and a Denmark-wide network of maritime organisations.
Eduardo took on the technical direction and architecture, and built a working proof of concept to validate the approach before committing to a full build. The PoC is not a mock-up — it runs real archival data through the full proposed technical stack, and the results are served through a live web interface.
The data alone was a challenge. Seven source systems. Over 114,000 records across FotoWeb (photographs and metadata), SARA (vessel registry), and ▆▆▆▆▆ databases (ships, maps, references, vocabulary). Each had different formats — REST APIs, XML exports, MySQL exports — and varying levels of structure and completeness. Everything was extracted, cleaned, normalised, and stored in MongoDB; images were downloaded, converted, and processed through OpenAI’s vision models to generate text descriptions. All records were then embedded and indexed in Meilisearch for hybrid full-text and vector search.
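The normalisation step described above can be sketched as follows. This is an illustrative example, not the actual pipeline code: the field names, source labels, and schema are assumptions made for the sake of the sketch — the real source systems each expose their own formats.

```python
# Hypothetical sketch of the normalisation step: each source system
# exposed a different shape (REST API, XML export, MySQL export), so
# every record is mapped into one common schema before embedding and
# indexing. All field names here are illustrative assumptions.

def normalise_record(raw: dict, source: str) -> dict:
    """Map a raw export row into the common archive schema."""
    return {
        # prefix ids with the source system to keep them unique
        "id": f"{source}-{raw.get('id') or raw.get('uuid')}",
        "source": source,  # e.g. "fotoweb", "sara"
        "title": (raw.get("title") or raw.get("name") or "").strip(),
        "description": raw.get("description", ""),
        # keep coordinates only when both are present, so results
        # can later be placed on a map
        "location": (
            {"lat": raw["lat"], "lng": raw["lng"]}
            if raw.get("lat") is not None and raw.get("lng") is not None
            else None
        ),
    }

record = normalise_record(
    {"id": 4711, "name": "Fregatten Jylland ", "lat": 56.3, "lng": 10.9},
    source="sara",
)
```

In a pipeline like this, the normalised documents would then be stored in MongoDB, run through the vision and embedding models, and pushed to the search index in batches.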
The resulting search interface does things that keyword search can’t: it understands what you’re asking, generates a natural-language overview of what it found, and surfaces related content across record types and collections. It can interpret the content of scanned documents, photographs, and historical drawings, and can place results on a map when coordinates are present.
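A hybrid query of the kind described here could look like the sketch below, using Meilisearch’s hybrid search API, where `semanticRatio` blends full-text relevance (0.0) with vector relevance (1.0). The index name and embedder name are assumptions for illustration, not taken from the actual build.

```python
# Hypothetical sketch of a hybrid full-text + vector query payload for
# Meilisearch. The "hybrid" object is part of Meilisearch's search API;
# the index name and embedder name ("default") are assumptions.

def build_search_payload(query: str, semantic_ratio: float = 0.7) -> dict:
    """Build a Meilisearch hybrid-search request body."""
    if not 0.0 <= semantic_ratio <= 1.0:
        raise ValueError("semantic_ratio must be within [0, 1]")
    return {
        "q": query,
        "hybrid": {
            "semanticRatio": semantic_ratio,  # 0.0 = keyword only, 1.0 = vector only
            "embedder": "default",
        },
        "limit": 20,
    }

# This payload would be POSTed to the index's /search endpoint, e.g.
# requests.post(f"{MEILI_URL}/indexes/archive/search",
#               json=build_search_payload("three-masted schooner"))
payload = build_search_payload("photographs of three-masted schooners")
```

Leaning toward the semantic side (here 0.7) lets queries phrased by meaning still match records whose metadata never uses the query’s words, while the remaining keyword weight keeps exact names and vessel registrations rankable.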
The platform concept has three pillars. Search is the foundation: AI-powered, cross-database, visual. Perspective adds editorial depth — long-form articles that connect contemporary events to historical context, online exhibitions, and an AI writing assistant trained on the institution’s own voice. Community completes the picture: an interactive map of maritime Denmark, volunteer-run wiki pages around ships and places, and a simple form that lets anyone contribute a personal story to the collective record.
TECHNOLOGY _
MongoDB, MinIO, Meilisearch, OpenAI (text embeddings + vision), Next.js, Shadcn UI, Tailwind CSS, Docker, GitHub Actions CI/CD, Prometheus + Grafana
114,000+ records across FotoWeb (66k metadata records + 792 images), SARA API (48k records), and ▆▆▆▆▆ databases (ships, maps, references, vocabulary) — extracted from REST APIs, XML exports, and MySQL exports, cleaned and normalised, OpenAI-embedded, and indexed in Meilisearch.
IMPACT & RECOGNITION _
Concept validated with 5 stakeholders across the Danish maritime community — strong interest from all
Responsible AI framework defined and applied throughout
Three formal deliverables: clickable prototype, motion video, and coded PoC running real institutional data
RELATED _

Global Platform
Headless microservice platform across 40+ markets — 57% uplift in users, one-click market launch via environment replication.

The MindArt Experience
AI-generated personalised artworks for pharmaceutical conference booths — thousands of unique pieces, activated across Budapest, Barcelona, Paris, Seoul.

Headless Platform
Centralised headless CMS replacing a fragmented multi-CMS landscape. Underpins Carlsberg, Tuborg, Brooklyn, Grimbergen, Jacobsen, Holsten, and more.