Selfhosted and postgres

2026-04-02 11:39:57 +02:00
parent b1c99893a6
commit 08483c7075
215 changed files with 4584 additions and 5190 deletions

README.md

@@ -1,92 +1,187 @@
# GreenLens
Expo app for plant scanning, care tracking, lexicon browsing, and profile settings.
## Run locally
1. Install dependencies:
- `npm install`
2. Start Expo:
- `npm run start`
## iOS TestFlight (EAS)
Use these three commands in order:
1. Set iOS build number:
- `npx eas-cli build:version:set -p ios`
2. Create production iOS build:
- `npx eas-cli build -p ios --profile production`
3. Submit latest iOS build to TestFlight:
- `npx eas-cli submit -p ios --latest`
## Lexicon SQLite maintenance
The server now uses a persistent SQLite database (`server/data/greenlns.sqlite`) and supports validated rebuilds.
1. Install server dependencies:
- `cd server && npm install`
2. Run the server:
- `npm run start`
3. Rebuild plants from the local lexicon batch constants:
- `npm run rebuild:batches`
4. Check duplicates and import audits:
- `npm run diagnostics`
For protected rebuild endpoints, set `PLANT_IMPORT_ADMIN_KEY` and send `x-admin-key` in requests.
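A minimal sketch of how such an `x-admin-key` guard could look as Express middleware (the actual middleware in `server/index.js` may differ; function and field names here are illustrative):

```javascript
// Reject requests to protected rebuild endpoints unless the x-admin-key
// header matches PLANT_IMPORT_ADMIN_KEY. If the env var is unset, deny
// everything rather than comparing against undefined.
function requireAdminKey(req, res, next) {
  const expected = process.env.PLANT_IMPORT_ADMIN_KEY;
  if (!expected || req.headers['x-admin-key'] !== expected) {
    return res.status(403).json({ error: 'forbidden' });
  }
  next();
}
```

Mount it in front of the rebuild routes, e.g. `app.post('/rebuild', requireAdminKey, handler)`.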
### Local plant images
The lexicon now supports storing plant image paths in SQLite as local public paths instead of external URLs.
Recommended structure:
- Database field: `imageUri`
- Value example: `/plants/monstera-deliciosa.webp`
- File location on disk: `server/public/plants/monstera-deliciosa.webp`
Notes:
- The Express server serves `server/public/plants` at `/plants/*`.
- Remote `https://...` image URLs still work, so migration can be incremental.
- Keep the database focused on metadata and store only the image path, not binary blobs.
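Because remote URLs and local paths coexist during migration, the server (or client) needs to resolve a stored `imageUri` to a usable URL. A small sketch of that logic (the helper name and base-URL handling are assumptions, not the repo's actual code):

```javascript
// Resolve a stored imageUri to a full URL.
// - Remote http(s) URLs pass through unchanged (incremental migration).
// - Local public paths like "/plants/monstera-deliciosa.webp" are joined
//   onto the API base URL, where express.static serves server/public/plants.
function resolveImageUrl(imageUri, baseUrl) {
  if (/^https?:\/\//i.test(imageUri)) return imageUri;
  return baseUrl.replace(/\/$/, '') + imageUri;
}
```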
## Billing and backend simulation
The app now uses a backend API contract for paid AI features:
- Scan AI (`/v1/scan`)
- Semantic AI search (`/v1/search/semantic`)
- Billing summary (`/v1/billing/summary`)
- Health check AI (`/v1/health-check`)
The Node server in `server/index.js` now implements these `/v1` routes directly and uses:
- `server/lib/openai.js` for OpenAI calls
- `server/lib/billing.js` for credit/billing/idempotency state
If `EXPO_PUBLIC_BACKEND_URL` is not set, the app uses an in-app mock backend simulation for `/v1/*` API calls.
`EXPO_PUBLIC_PAYMENT_SERVER_URL` is used only for Stripe PaymentSheet calls (`/api/payment-sheet`).
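The backend-selection rule above can be sketched as a tiny helper (hypothetical name; the app's actual switch may live elsewhere): a set `EXPO_PUBLIC_BACKEND_URL` means real backend, an unset one means the in-app mock.

```javascript
// Pick the base URL for /v1/* calls. Returns null when no backend URL is
// configured, which signals the app to use the in-app mock backend instead.
function pickApiBase(env) {
  const url = env.EXPO_PUBLIC_BACKEND_URL;
  return url ? url.replace(/\/$/, '') : null;
}
```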
The in-app mock backend provides:
- Server-side style credit enforcement
- Atomic `consumeCredit()` behavior
- Idempotency-key handling
- Free and Pro monthly credit buckets
- Top-up purchase simulation
- RevenueCat/Stripe webhook simulation
This makes it possible to build the UI and flows now, then replace the mock endpoints with a real backend later.
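The core of the mock's credit handling, atomic consumption plus idempotency-key replay, can be sketched like this (names and shapes are illustrative, not the mock's actual API):

```javascript
// In-memory credit store: one credit per successful call, and a replayed
// idempotency key returns the original result without consuming again.
function createCreditStore(initialCredits) {
  let credits = initialCredits;
  const seen = new Map(); // idempotencyKey -> cached successful result

  return {
    consumeCredit(idempotencyKey) {
      if (seen.has(idempotencyKey)) return seen.get(idempotencyKey);
      if (credits <= 0) return { ok: false, remaining: credits };
      credits -= 1;
      const result = { ok: true, remaining: credits };
      seen.set(idempotencyKey, result);
      return result;
    },
  };
}
```

Denied calls are deliberately not cached here, so a retry after a top-up can succeed.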
## Production integration notes
- Keep OpenAI keys only on the backend.
- Use app-store billing via RevenueCat or StoreKit/Play Billing.
- Forward entitlement updates to backend webhooks.
- Enforce credits on the backend only; the app should only display the quota for UX purposes.
- Recommended backend env vars:
- `OPENAI_API_KEY`
- `OPENAI_SCAN_MODEL` (for example `gpt-5`)
- `OPENAI_HEALTH_MODEL` (for example `gpt-5`)
- `STRIPE_SECRET_KEY`
- `STRIPE_PUBLISHABLE_KEY`
# GreenLens
Expo app for plant scanning, care tracking, billing, and profile management, backed by an Express API.
## App development
```bash
npm install
npm run start
```
## Backend development
The backend now targets PostgreSQL instead of SQLite.
```bash
cd server
npm install
npm run start
```
Required backend environment:
- `DATABASE_URL` or `POSTGRES_HOST` + `POSTGRES_PORT` + `POSTGRES_DB` + `POSTGRES_USER` + `POSTGRES_PASSWORD`
- `JWT_SECRET`
Optional integrations:
- `OPENAI_API_KEY`
- `STRIPE_SECRET_KEY`
- `STRIPE_PUBLISHABLE_KEY`
- `STRIPE_WEBHOOK_SECRET`
- `REVENUECAT_WEBHOOK_SECRET`
- `PLANT_IMPORT_ADMIN_KEY`
- `MINIO_ENDPOINT`
- `MINIO_ACCESS_KEY`
- `MINIO_SECRET_KEY`
- `MINIO_BUCKET`
- `MINIO_PUBLIC_URL`
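The either/or Postgres configuration above can be sketched as a helper that builds a `pg`-style connection config from the environment (a sketch under the assumption that the server uses node-postgres conventions; the actual config code may differ):

```javascript
// Prefer DATABASE_URL; otherwise assemble a config from the discrete
// POSTGRES_* variables, defaulting the port to 5432.
function pgConfigFromEnv(env) {
  if (env.DATABASE_URL) return { connectionString: env.DATABASE_URL };
  return {
    host: env.POSTGRES_HOST,
    port: Number(env.POSTGRES_PORT || 5432),
    database: env.POSTGRES_DB,
    user: env.POSTGRES_USER,
    password: env.POSTGRES_PASSWORD,
  };
}
```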
## Docker Compose
For backend-only local infrastructure, use [docker-compose.yml](docker-compose.yml).
For the production-style self-hosted stack with landing page, Caddy, API, PostgreSQL, and MinIO, use [greenlns-landing/docker-compose.yml](greenlns-landing/docker-compose.yml).
## Server deployment
Run the commands in this section from the repo root on your server:
```bash
cd /path/to/GreenLns
```
Example:
```bash
cd /var/www/GreenLns
```
### 1. Prepare environment
```bash
cp .env.example .env
```
Then fill at least:
- `SITE_DOMAIN`
- `SITE_URL`
- `POSTGRES_PASSWORD`
- `JWT_SECRET`
- `MINIO_SECRET_KEY`
- optional: `OPENAI_API_KEY`, `STRIPE_*`, `REVENUECAT_*`
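A filled-in `.env` might look like the fragment below; all values here are placeholders except the domain, which matches the health checks later in this README. Use your own strong secrets.

```bash
SITE_DOMAIN=greenlenspro.com
SITE_URL=https://greenlenspro.com
POSTGRES_PASSWORD=change-me
JWT_SECRET=change-me-to-a-long-random-string
MINIO_SECRET_KEY=change-me
```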
### 2. Start the full production stack
```bash
docker compose up --build -d
```
When you run this from the repo root, Docker Compose uses the root [docker-compose.yml](docker-compose.yml).
What gets built:
- `landing` is built from `./greenlns-landing/Dockerfile`
- `api` is built from `./server/Dockerfile`
What is not built locally, but pulled as ready-made images:
- `postgres` uses `postgres:16-alpine`
- `minio` uses `minio/minio:latest`
- `caddy` uses `caddy:2.8-alpine`
In short: `docker compose up --build -d` builds the landing page container and the API container, and it starts PostgreSQL as a container. PostgreSQL is not "built" from your code; it is started from the official Postgres image.
This starts:
- `caddy`
- `landing`
- `api`
- `postgres`
- `minio`
### 3. Useful server commands
Check running containers:
```bash
docker compose ps
```
Follow all logs:
```bash
docker compose logs -f
```
Follow only API logs:
```bash
docker compose logs -f api
```
Follow only landing logs:
```bash
docker compose logs -f landing
```
Restart one service:
```bash
docker compose restart api
docker compose restart landing
```
Rebuild and restart after code changes:
```bash
docker compose up --build -d
```
Stop the stack:
```bash
docker compose down
```
Stop the stack and remove volumes:
```bash
docker compose down -v
```
### 4. Health checks after deploy
```bash
curl https://greenlenspro.com/health
curl https://greenlenspro.com/
curl https://greenlenspro.com/sitemap.xml
```
### 5. Production compose file location
If you want to run the same stack from inside the landing directory instead:
```bash
cd greenlns-landing
docker compose up --build -d
```
In that case Docker Compose uses [greenlns-landing/docker-compose.yml](greenlns-landing/docker-compose.yml).
There, too:
- `landing` is built from `greenlns-landing/Dockerfile`
- `api` is built from `../server/Dockerfile`
- `postgres`, `minio`, and `caddy` are started from official images
## iOS TestFlight
```bash
npx eas-cli build:version:set -p ios
npx eas-cli build -p ios --profile production
npx eas-cli submit -p ios --latest
```