lepetal.com
A mobile-first plant identification app — snap a photo, get a rich identification card with etymology, origin, and growing conditions, then watch it transform into a Victorian botanical illustration. Public profiles, follow feeds, and a curated personal garden.
Plant identification apps exist, but they stop at the name. You get a label — Rosa gallica — and nothing else. No etymology, no origin story, no growing context, no sense of why this plant has shown up in gardens for 3,000 years. And they're solitary: identify a plant, close the app, forget you found it. There's no place to collect what you've found, follow other gardeners, or discover what's growing in your community.
Rebuild the identification result as a rich card — not just a name, but etymology, country of origin, climate range, historical fact, confidence score, and two named alternatives with reasons. Pair that with async generative art: the same photo that feeds the Claude vision call also feeds a Flux image-to-image model that transforms it into a Victorian botanical illustration. The social layer — public profiles, follows, a discover feed — gives the collection somewhere to live. Ships as a PWA so it installs to the home screen without an App Store.
- Photo identification — client-side image compression before upload, SHA-256 hash for cache keying, Claude Sonnet vision call returning a validated 11-field JSON schema
- Hash-before-compression caching — the original file is hashed before compression so the same plant re-uploaded matches the cache regardless of encoding; zero API spend on repeat identifications
- Confidence-aware result card — confidence score drives a visual badge (high / medium / low); low-confidence triggers a callout; two alternatives power a one-tap re-identification flow
- Victorian botanical illustrations — async fal.ai Flux image-to-image job; result page polls for completion and swaps the photo for the illustration on arrival; URL persisted so every return visit is instant
- Private bucket + signed URL pattern — plant photos live in a private storage bucket; a short-lived signed URL is generated immediately before fal.ai ingestion, never exposing the bucket
- Dual-path auth — API routes check Bearer token first, then fall back to session cookie; handles both SSR and PWA home screen install contexts where cookie propagation isn't guaranteed
- Social layer — public profiles at lepetal.com/@handle, follow other gardeners, personal plant collection, community discover feed
- PWA — installs to iOS and Android home screen in standalone mode with portrait lock and notch support; no App Store required
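The hash-before-compression idea above can be sketched in a few lines of TypeScript. This is a minimal sketch, not the app's code: it assumes the Web Crypto API (available in browsers and in Node 19+ as `globalThis.crypto`), and `cacheKey` is a hypothetical helper name:

```typescript
// Hash the ORIGINAL file bytes, before any client-side compression runs,
// so the cache key is stable no matter how the upload gets re-encoded.
async function cacheKey(original: BufferSource): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", original);
  // Hex-encode the 32-byte digest for use as a cache / storage key.
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```

On upload the key would be computed from the raw `File` bytes (e.g. `await file.arrayBuffer()`), checked against the identification cache, and the vision API called only on a miss.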
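The confidence-aware badge is, at its core, a tiny mapping from score to tier. A sketch, with illustrative thresholds — the 0.8 / 0.5 cutoffs here are assumptions, not the app's actual values:

```typescript
type Badge = "high" | "medium" | "low";

// Illustrative thresholds -- the real cutoffs are a product decision.
function confidenceBadge(score: number): Badge {
  if (score >= 0.8) return "high";
  if (score >= 0.5) return "medium";
  return "low"; // "low" is what triggers the callout + alternatives flow
}
```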
Photo in, rich identification card out — with a Victorian botanical illustration generating in the background and persisted on arrival, every return visit instant.
Next.js 16 · App Router · React 19 · Vercel
Tailwind v4 · CSS custom properties (OKLCH color tokens)
Supabase — Postgres, RLS, Storage, Google OAuth
Claude Sonnet — plant identification, structured JSON output, confidence scoring
fal.ai Flux — image-to-image botanical illustration; async queue + polling
Cormorant Garamond (display) · DM Sans (UI)
shadcn (base-nova) + custom Petal components
manifest.json · standalone display · portrait lock · viewportFit cover
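The async queue + polling pattern in the stack above can be sketched as a generic helper. This is a sketch under stated assumptions: `fetchStatus` stands in for whatever status endpoint the queued job exposes (the actual fal.ai queue API is not reproduced here), and the interval/timeout values are illustrative:

```typescript
type JobStatus = { done: boolean; imageUrl?: string };

// Poll a status function until the job completes or the deadline passes.
// Covers the 30-60s inference window without a queue service of our own.
async function pollForIllustration(
  fetchStatus: () => Promise<JobStatus>,
  intervalMs = 2000,
  timeoutMs = 90_000,
): Promise<string> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await fetchStatus();
    if (status.done && status.imageUrl) return status.imageUrl;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("illustration timed out");
}
```

On resolution the URL would be persisted alongside the identification record, so every later visit renders the illustration without polling again.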
Four problems that required actual design work.
- Stable cache keys across compressed images — hashing the original file before compression means the cache key is stable regardless of how the image was re-encoded; same plant, same hash, zero API spend.
- Async illustration without a queue service — fal.ai's own async queue handles the 30–60 second inference window; the result page polls for completion, persists the URL on arrival, and renders instantly on every subsequent visit; no service worker, no background fetch needed.
- Private storage + third-party AI — fal.ai requires a public URL, but plant photos live in a private bucket; a short-lived signed URL is generated immediately before job submission so the image is accessible during inference without permanently exposing the bucket.
- Dual-path auth — PWA home screen installs don't always propagate cookies reliably, so every API route checks Bearer token first and falls back to cookie; applied consistently across all routes, not patched in one place.
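The signed-URL handoff can be sketched against a minimal structural type so it stands alone. With Supabase the underlying call would be `storage.from(bucket).createSignedUrl(path, expiresIn)`; the 60-second TTL and the `urlForInference` name are assumptions for illustration:

```typescript
type SignedUrlResult = { data: { signedUrl: string } | null; error: Error | null };
type PrivateBucket = {
  createSignedUrl(path: string, expiresIn: number): Promise<SignedUrlResult>;
};

// Mint a short-lived signed URL immediately before submitting the fal.ai
// job: the photo is readable during inference, the bucket stays private.
async function urlForInference(bucket: PrivateBucket, path: string): Promise<string> {
  const { data, error } = await bucket.createSignedUrl(path, 60); // 60s TTL: assumed
  if (error || !data) throw error ?? new Error("no signed URL returned");
  return data.signedUrl;
}
```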
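The dual-path auth check reduces to one token-extraction step shared by every route. A sketch against the standard `Request` type; the `sb-access-token` cookie name is an assumption, not necessarily the one the app uses:

```typescript
// Extract credentials: Authorization header first (PWA standalone contexts
// where cookies may not propagate), then fall back to the session cookie.
function extractToken(req: Request, cookieName = "sb-access-token"): string | null {
  const auth = req.headers.get("authorization");
  if (auth?.startsWith("Bearer ")) return auth.slice("Bearer ".length);

  const cookies = req.headers.get("cookie") ?? "";
  for (const pair of cookies.split("; ")) {
    const eq = pair.indexOf("=");
    if (eq > 0 && pair.slice(0, eq) === cookieName) {
      return decodeURIComponent(pair.slice(eq + 1));
    }
  }
  return null;
}
```

Because the fallback lives in one helper rather than being patched per route, every API route gets the same behavior for free.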
Petal is the proof that "AI wrapper" is a design failure, not a category. The identification result exists in a hundred apps. The prompt designed backward from the UI — what fields does the card need, what does the re-identification flow require, what does low confidence look like to a user — is what makes the AI output fit the product instead of the product fitting the API. The Victorian illustration isn't a feature. It's the reason you come back.