Compare commits

...

23 Commits

Author SHA1 Message Date
Nico
ee94853084 fix(list): restore added-by attribution with display name fallback 2026-02-21 00:07:22 -08:00
Nico
3dd58f51e8 fix(ui): use bounded member dropdown in assign-item modal 2026-02-21 00:07:17 -08:00
Nico
beb9cdcec7 fix(invites): lock invite row without outer join update error 2026-02-21 00:07:11 -08:00
Nico
9fa48e6eb3 feat: support assigning grocery items to other household members 2026-02-20 23:33:22 -08:00
Nico
a1beb486cb changed dev frontend port 2026-02-18 14:53:29 -08:00
Nico
d62564fd0d refactor: streamline navbar and settings tab cues 2026-02-18 14:52:41 -08:00
Nico
c1259f0bf5 fix: recover when sessions table is missing 2026-02-18 14:52:35 -08:00
Nico
c3c0c33339 fix: harden auth inputs, throttling, and debug exposure 2026-02-18 12:24:15 -08:00
Nico
3469284e98 docs: add project state audit and execution plan 2026-02-16 01:49:44 -08:00
Nico
aa9488755f feat: enable cookie auth flow and database url runtime config 2026-02-16 01:49:03 -08:00
Nico
119994b602 feat: add db-backed session cookie auth compatibility 2026-02-16 01:43:27 -08:00
Nico
0f9d349fa5 feat: add db migration for session storage 2026-02-16 01:40:18 -08:00
Nico
9cb0ac19e5 refactor: use safe request-scoped backend error logging 2026-02-16 01:36:39 -08:00
Nico
e2e9ec9eb4 fix: redact invite codes in logs using last4 policy 2026-02-16 01:34:09 -08:00
Nico
05ad576206 refactor: return json from health endpoints for request ids 2026-02-16 01:28:11 -08:00
Nico
16e60dcf63 refactor: align legacy list controller with sendError 2026-02-16 01:27:35 -08:00
Nico
2a9389532f fix: assign default user role on registration 2026-02-16 01:26:52 -08:00
Nico
9a73cea27d refactor: adopt sendError helper across core controllers 2026-02-16 01:26:18 -08:00
Nico
fec9f1ab25 feat: include request id in all json responses 2026-02-16 01:23:42 -08:00
Nico
a5f99ba475 fix: normalize frontend api errors and remove sensitive debug logs 2026-02-16 01:20:45 -08:00
Nico
ac92bed8a1 feat: standardize error envelope and request id propagation 2026-02-16 01:18:51 -08:00
Nico
b3f607d8f8 feat: add request id middleware for api responses 2026-02-16 01:10:26 -08:00
Nico
7fb28e659f chore: establish governance baseline and migration workflow 2026-02-16 01:09:13 -08:00
73 changed files with 4083 additions and 1325 deletions

View File

@@ -1,324 +1,21 @@
The new file (21 lines):
# Copilot Compatibility Instructions
## Precedence
- Source of truth: `PROJECT_INSTRUCTIONS.md` (repo root).
- Agent workflow constraints: `AGENTS.md` (repo root).
- Bugfix protocol: `DEBUGGING_INSTRUCTIONS.md` (repo root).
If any guidance in this file conflicts with the root instruction files, follow the root instruction files.
## Current stack note
This repository is currently:
- Backend: Express (`backend/`)
- Frontend: React + Vite (`frontend/`)
Apply architecture intent from `PROJECT_INSTRUCTIONS.md` using the current stack mapping in:
- `docs/AGENTIC_CONTRACT_MAP.md`
## Safety reminders
- External DB only (`DATABASE_URL`), no DB container assumptions.
- No cron/worker additions unless explicitly approved.
- Never log secrets, receipt bytes, or full invite codes.
The removed file (324 lines):
# Costco Grocery List - AI Agent Instructions
## Architecture Overview
This is a full-stack grocery list management app with **role-based access control (RBAC)**:
- **Backend**: Node.js + Express + PostgreSQL (port 5000)
- **Frontend**: React 19 + TypeScript + Vite (port 3000/5173)
- **Deployment**: Docker Compose with separate dev/prod configurations
## Mobile-First Design Principles
**CRITICAL**: All UI components MUST be designed for both mobile and desktop from the start.
**Responsive Design Requirements**:
- Use relative units (`rem`, `em`, `%`, `vh/vw`) over fixed pixels where possible
- Implement mobile breakpoints: `480px`, `768px`, `1024px`
- Test layouts at: 320px (small phone), 375px (phone), 768px (tablet), 1024px+ (desktop)
- Avoid horizontal scrolling on mobile devices
- Touch targets minimum 44x44px for mobile usability
- Use `max-width` with `margin: 0 auto` for content containers
- Stack elements vertically on mobile, use flexbox/grid for larger screens
- Hide/collapse navigation into hamburger menus on mobile
- Ensure modals/dropdowns work well on small screens
**Common Patterns**:
```css
/* Mobile-first approach */
.container {
padding: 1rem;
max-width: 100%;
}
@media (min-width: 768px) {
.container {
padding: 2rem;
max-width: 800px;
margin: 0 auto;
}
}
```
### Key Design Patterns
**Dual RBAC System** - Two separate role hierarchies:
**1. System Roles** (users.role column):
- `system_admin`: Access to Admin Panel for system-wide management (stores, users)
- `user`: Regular system user (default for new registrations)
- Defined in [backend/models/user.model.js](backend/models/user.model.js)
- Used for Admin Panel access control
**2. Household Roles** (household_members.role column):
- `admin`: Can manage household members, change roles, delete household
- `user`: Can add/edit items, mark as bought (standard member permissions)
- Defined per household membership
- Used for household-level permissions (item management, member management)
**Important**: Always distinguish between system role and household role:
- **System role**: From `AuthContext` or `req.user.role` - controls Admin Panel access
- **Household role**: From `activeHousehold.role` or `household_members.role` - controls household operations
**Middleware chain pattern** for protected routes:
```javascript
// System-level protection
router.get("/stores", auth, requireRole("system_admin"), controller.getAllStores);
// Household-level checks done in controller
router.post("/lists/:householdId/items", auth, controller.addItem);
```
- `auth` middleware extracts JWT from `Authorization: Bearer <token>` header
- `requireRole` checks system role only
- Household role checks happen in controllers using `household.model.js` methods
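The `requireRole` factory referenced above lives in the backend middleware but is not shown in this diff. As a hedged sketch (function shape and messages are assumptions, not taken from the repo), it can be a factory that closes over the allowed system roles and runs after `auth` has populated `req.user`:

```javascript
// Hypothetical sketch of a requireRole factory; the real middleware may differ.
// It assumes the auth middleware has already set req.user from the JWT.
function requireRole(...allowedRoles) {
  return function (req, res, next) {
    // auth must run first so req.user exists
    if (!req.user) {
      return res.status(401).json({ message: "Authentication required" });
    }
    if (!allowedRoles.includes(req.user.role)) {
      return res.status(403).json({ message: "Insufficient role" });
    }
    return next();
  };
}

// Minimal demonstration with stubbed req/res objects (no Express needed)
const guard = requireRole("system_admin");
const res = {
  statusCode: null,
  status(code) { this.statusCode = code; return this; },
  json(body) { this.body = body; return this; },
};
let reached = false;
guard({ user: { id: 1, role: "user" } }, res, () => { reached = true; });
console.log(res.statusCode, reached); // → 403 false
```

Because the factory returns a plain `(req, res, next)` function, it drops into the existing route chains unchanged.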
**Frontend route protection**:
- `<PrivateRoute>`: Requires authentication, redirects to `/login` if no token
- `<RoleGuard allowed={[ROLES.SYSTEM_ADMIN]}>`: Requires system_admin role for Admin Panel
- Household permissions: Check `activeHousehold.role` in components (not route-level)
- Example in [frontend/src/App.jsx](frontend/src/App.jsx)
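The guard components above are JSX, but the decision logic they encode can be sketched as a plain function (the helper name and return shape here are hypothetical, for illustration only):

```javascript
// Hypothetical helper expressing the route-guard decisions as data; the real
// <PrivateRoute> / <RoleGuard> components in frontend/src render <Navigate> instead.
function resolveRouteAccess({ token, role, allowedRoles }) {
  // <PrivateRoute>: no token → bounce to /login
  if (!token) return { allowed: false, redirectTo: "/login" };
  // <RoleGuard allowed={[...]}>: system role must be in the whitelist
  if (allowedRoles && !allowedRoles.includes(role)) {
    return { allowed: false, redirectTo: "/" };
  }
  return { allowed: true, redirectTo: null };
}

console.log(resolveRouteAccess({ token: null }).redirectTo); // → /login
console.log(
  resolveRouteAccess({ token: "t", role: "user", allowedRoles: ["system_admin"] }).allowed
); // → false
```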
**Multi-Household Architecture**:
- Users can belong to multiple households
- Each household has its own grocery lists, stores, and item classifications
- `HouseholdContext` manages active household selection
- All list operations are scoped to the active household
## Database Schema
**PostgreSQL server runs externally** - not in Docker Compose. Connection configured in [backend/.env](backend/.env) via standard environment variables.
**Core Tables**:
**users** - System users
- `id` (PK), `username`, `password` (bcrypt), `name`, `display_name`
- `role`: `system_admin` | `user` (default: `viewer` - legacy)
- System-level authentication and authorization
**households** - Household entities
- `id` (PK), `name`, `invite_code`, `created_by`, `created_at`
- Each household is independent with own lists and members
**household_members** - Junction table (users ↔ households)
- `id` (PK), `household_id` (FK), `user_id` (FK), `role`, `joined_at`
- `role`: `admin` | `user` (household-level permissions)
- One user can belong to multiple households with different roles
**items** - Master item catalog
- `id` (PK), `name`, `default_image`, `default_image_mime_type`, `usage_count`
- Shared across all households, case-insensitive unique names
**stores** - Store definitions (system-wide)
- `id` (PK), `name`, `default_zones` (JSONB array)
- Managed by system_admin in Admin Panel
**household_stores** - Stores available to each household
- `id` (PK), `household_id` (FK), `store_id` (FK), `is_default`
- Links households to stores they use
**household_lists** - Grocery list items per household
- `id` (PK), `household_id` (FK), `store_id` (FK), `item_id` (FK)
- `quantity`, `bought`, `custom_image`, `custom_image_mime_type`
- `added_by`, `modified_on`
- Scoped to household + store combination
**household_list_history** - Tracks quantity contributions
- `id` (PK), `household_list_id` (FK), `quantity`, `added_by`, `added_on`
- Multi-contributor tracking (who added how much)
**household_item_classifications** - Item classifications per household/store
- `id` (PK), `household_id`, `store_id`, `item_id`
- `item_type`, `item_group`, `zone`, `confidence`, `source`
- Household-specific overrides of global classifications
**item_classification** - Global item classifications
- `id` (PK), `item_type`, `item_group`, `zone`, `confidence`, `source`
- System-wide defaults for item categorization
**Legacy Tables** (deprecated, may still exist):
- `grocery_list`, `grocery_history` - Old single-household implementation
**Important patterns**:
- No formal migration system - schema changes are manual SQL
- Items use case-insensitive matching (`ILIKE`) to prevent duplicates
- JOINs with `ARRAY_AGG` for multi-contributor queries (see [backend/models/list.model.v2.js](backend/models/list.model.v2.js))
- All list operations require `household_id` parameter for scoping
- Image storage: `bytea` columns for images with separate MIME type columns
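The `ARRAY_AGG` multi-contributor JOIN mentioned above can be sketched roughly as follows. Table and column names follow the schema described in this section; the exact query in `list.model.v2.js` may differ:

```sql
-- Hedged sketch: fold per-contributor history rows into one array per list row.
SELECT hl.id,
       i.name,
       hl.quantity,
       hl.bought,
       ARRAY_AGG(u.display_name ORDER BY h.added_on) AS contributors
FROM household_lists hl
JOIN items i ON i.id = hl.item_id
LEFT JOIN household_list_history h ON h.household_list_id = hl.id
LEFT JOIN users u ON u.id = h.added_by
WHERE hl.household_id = $1 AND hl.store_id = $2
GROUP BY hl.id, i.name, hl.quantity, hl.bought;
```

The `LEFT JOIN`s keep list rows visible even when no history rows exist yet; the `$1`/`$2` placeholders follow the parameterized-query convention used throughout the backend.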
## Development Workflow
### Local Development
```bash
# Start all services with hot-reload against LOCAL database
docker-compose -f docker-compose.dev.yml up
# Backend runs nodemon (watches backend/*.js)
# Frontend runs Vite dev server with HMR on port 3000
```
**Key dev setup details**:
- Volume mounts preserve `node_modules` in containers while syncing source code
- Backend uses `Dockerfile` (standard) with `npm run dev` override
- Frontend uses `Dockerfile.dev` with `CHOKIDAR_USEPOLLING=true` for file watching
- Both connect to **external PostgreSQL server** (configured in `backend/.env`)
- No database container in compose - DB is managed separately
### Production Build
```bash
# Local production build (for testing)
docker-compose -f docker-compose.prod.yml up --build
# Actual production uses pre-built images
docker-compose up # Pulls from private registry
```
### CI/CD Pipeline (Gitea Actions)
See [.gitea/workflows/deploy.yml](.gitea/workflows/deploy.yml) for full workflow:
**Build stage** (on push to `main`):
1. Run backend tests (`npm test --if-present`)
2. Build backend image with tags: `:latest` and `:<commit-sha>`
3. Build frontend image with tags: `:latest` and `:<commit-sha>`
4. Push both images to private registry
**Deploy stage**:
1. SSH to production server
2. Upload `docker-compose.yml` to deployment directory
3. Pull latest images and restart containers with `docker compose up -d`
4. Prune old images
**Notify stage**:
- Sends deployment status via webhook
**Required secrets**:
- `REGISTRY_USER`, `REGISTRY_PASS`: Docker registry credentials
- `DEPLOY_HOST`, `DEPLOY_USER`, `DEPLOY_KEY`: SSH deployment credentials
### Backend Scripts
- `npm run dev`: Start with nodemon
- `npm run build`: esbuild compilation + copy public assets to `dist/`
- `npm test`: Run Jest tests (currently no tests exist)
### Frontend Scripts
- `npm run dev`: Vite dev server (port 5173)
- `npm run build`: TypeScript compilation + Vite production build
### Docker Configurations
**docker-compose.yml** (production):
- Pulls pre-built images from private registry
- Backend on port 5000, frontend on port 3000 (nginx serves on port 80)
- Requires `backend.env` and `frontend.env` files
**docker-compose.dev.yml** (local development):
- Builds images locally from Dockerfile/Dockerfile.dev
- Volume mounts for hot-reload: `./backend:/app` and `./frontend:/app`
- Named volumes preserve `node_modules` between rebuilds
- Backend uses `backend/.env` directly
- Frontend uses `Dockerfile.dev` with polling enabled for cross-platform compatibility
**docker-compose.prod.yml** (local production testing):
- Builds images locally using production Dockerfiles
- Backend: Standard Node.js server
- Frontend: Multi-stage build with nginx serving static files
## Configuration & Environment
**Backend** ([backend/.env](backend/.env)):
- Database connection variables (host, user, password, database name)
- `JWT_SECRET`: Token signing key
- `ALLOWED_ORIGINS`: Comma-separated CORS whitelist (supports static origins + `192.168.*.*` IP ranges)
- `PORT`: Server port (default 5000)
**Frontend** (environment variables):
- `VITE_API_URL`: Backend base URL
**Config accessed via**:
- Backend: `process.env.VAR_NAME`
- Frontend: `import.meta.env.VITE_VAR_NAME` (see [frontend/src/config.ts](frontend/src/config.ts))
## Authentication Flow
1. User logs in → backend returns `{token, userId, role, username}` ([backend/controllers/auth.controller.js](backend/controllers/auth.controller.js))
- `role` is the **system role** (`system_admin` or `user`)
2. Frontend stores in `localStorage` and `AuthContext` ([frontend/src/context/AuthContext.jsx](frontend/src/context/AuthContext.jsx))
3. `HouseholdContext` loads user's households and sets active household
- Active household includes `household.role` (the **household role**)
4. Axios interceptor auto-attaches `Authorization: Bearer <token>` header ([frontend/src/api/axios.js](frontend/src/api/axios.js))
5. Backend validates JWT on protected routes ([backend/middleware/auth.js](backend/middleware/auth.js))
- Sets `req.user = { id, role, username }` with **system role**
6. Controllers check household membership/role using [backend/models/household.model.js](backend/models/household.model.js)
7. On 401 "Invalid or expired token" response, frontend clears storage and redirects to login
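The interceptor in step 4 is registered via `api.interceptors.request.use(...)`; its core logic can be sketched as a plain function (the storage key name is an assumption):

```javascript
// Hypothetical sketch of the request-interceptor logic from
// frontend/src/api/axios.js; the real implementation may differ.
function attachAuthHeader(config, storage) {
  const token = storage.getItem("token"); // key name assumed
  if (token) {
    config.headers = { ...config.headers, Authorization: `Bearer ${token}` };
  }
  return config;
}

// localStorage stand-in so the sketch runs outside a browser
const fakeStorage = { getItem: (k) => (k === "token" ? "abc123" : null) };
const cfg = attachAuthHeader({ headers: {} }, fakeStorage);
console.log(cfg.headers.Authorization); // → Bearer abc123
```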
## Critical Conventions
### Security Practices
- **Never expose credentials**: Do not hardcode or document actual values for `JWT_SECRET`, database passwords, API keys, or any sensitive configuration
- **No infrastructure details**: Avoid documenting specific IP addresses, domain names, deployment paths, or server locations in code or documentation
- **Environment variables**: Reference `.env` files conceptually - never include actual contents
- **Secrets in CI/CD**: Document that secrets are required, not their values
- **Code review**: Scan all changes for accidentally committed credentials before pushing
### Backend
- **No SQL injection**: Always use parameterized queries (`$1`, `$2`, etc.) with [backend/db/pool.js](backend/db/pool.js)
- **Password hashing**: Use `bcryptjs` for hashing (see [backend/controllers/auth.controller.js](backend/controllers/auth.controller.js))
- **CORS**: Dynamic origin validation in [backend/app.js](backend/app.js) allows configured origins + local IPs
- **Error responses**: Return JSON with `{message: "..."}` structure
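The parameterized-query rule above can be made concrete with a small sketch. `buildFindUserQuery` is a hypothetical helper; in the real code the `{text, values}` pair (or equivalent arguments) is handed to `pool.query` from `backend/db/pool.js`:

```javascript
// Sketch of the parameterized-query pattern; helper name is hypothetical.
function buildFindUserQuery(username) {
  return {
    text: "SELECT id, username, role FROM users WHERE username = $1",
    values: [username], // user input travels as a bound value, never via string concat
  };
}

const q = buildFindUserQuery("alice'; DROP TABLE users; --");
// The malicious string stays inert inside values[0]; the SQL text is untouched.
console.log(q.text.includes("DROP")); // → false
```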
### Frontend
- **Mixed JSX/TSX**: Some components are `.jsx` (JavaScript), others `.tsx` (TypeScript) - maintain existing file extensions
- **API calls**: Use centralized `api` instance from [frontend/src/api/axios.js](frontend/src/api/axios.js), not raw axios
- **Role checks**: Access role from `AuthContext`, compare with constants from [frontend/src/constants/roles.js](frontend/src/constants/roles.js)
- **Navigation**: Use React Router's `<Navigate>` for redirects, not `window.location` (except in interceptor)
## Common Tasks
**Add a new protected route**:
1. Backend: Add route with `auth` middleware (+ `requireRole(...)` if system role check needed)
2. Frontend: Add route in [frontend/src/App.jsx](frontend/src/App.jsx) wrapped in `<PrivateRoute>` (and `<RoleGuard>` for Admin Panel)
**Access user info in backend controller**:
```javascript
const { id, role } = req.user; // Set by auth middleware (system role)
const userId = req.user.id;
```
**Check household permissions in backend controller**:
```javascript
const householdRole = await household.getUserRole(householdId, userId);
if (!householdRole) return res.status(403).json({ message: "Not a member of this household" });
if (householdRole !== 'admin') return res.status(403).json({ message: "Household admin required" });
```
**Check household permissions in frontend**:
```javascript
const { activeHousehold } = useContext(HouseholdContext);
const householdRole = activeHousehold?.role; // 'admin' or 'user'
// Household roles are only 'admin' and 'user', so any member can manage items
const canManageItems = Boolean(householdRole);
// Admin-only actions
const canManageMembers = householdRole === 'admin';
```
**Query grocery items with contributors**:
Use the JOIN pattern in [backend/models/list.model.v2.js](backend/models/list.model.v2.js) - aggregates user names via `household_list_history` table.
## Testing
**Backend**:
- Jest configured at root level ([package.json](package.json))
- Currently **no test files exist** - testing infrastructure needs development
- CI/CD runs `npm test --if-present` but will pass if no tests found
- Focus area: API endpoint testing (use `supertest` with Express)
**Frontend**:
- ESLint only (see [frontend/eslint.config.js](frontend/eslint.config.js))
- No test runner configured
- Manual testing workflow in use
**To add backend tests**:
1. Create `backend/__tests__/` directory
2. Use Jest + Supertest pattern for API tests
3. Mock database calls or use test database

53
AGENTS.md Normal file
View File

@@ -0,0 +1,53 @@
# AGENTS.md - Fiddy (External DB)
## Authority
- Source of truth: `PROJECT_INSTRUCTIONS.md` (repo root). If conflict, follow it.
- Bugfix protocol: `DEBUGGING_INSTRUCTIONS.md` (repo root).
- Do not implement features unless required to fix the bug.
## Non-negotiables
- External DB: `DATABASE_URL` points to on-prem Postgres (NOT a container).
- Dev/Prod share schema via migrations in `packages/db/migrations`.
- No cron/worker jobs. Fixes must work without background tasks.
- Server-side RBAC only. Client checks are UX only.
## Security / logging (hard rules)
- Never log secrets (passwords/tokens/cookies).
- Never log receipt bytes.
- Never log full invite codes; logs/audit store last4 only.
## Non-regression contracts
- Sessions are DB-backed (`sessions` table) and cookies are HttpOnly.
- Receipt images stored in `receipts` (`bytea`).
- Entries list endpoints must NEVER return receipt bytes.
- API responses must include `request_id`; audit logs must include `request_id`.
## Architecture boundaries (follow existing patterns; do not invent)
1) API routes: `app/api/**/route.ts`
- Thin: parse/validate + call service, return JSON.
2) Server services: `lib/server/*`
- Own DB + authz. Must include `import "server-only";`.
3) Client wrappers: `lib/client/*`
- Typed fetch + error normalization; always send credentials.
4) Hooks: `hooks/use-*.ts`
- Primary UI-facing API layer; components avoid raw `fetch()`.
## Next.js dynamic route params (required)
- In `app/api/**/[param]/route.ts`, treat `context.params` as async:
- `const { id } = await context.params;`
## Working style
- Scan repo first; do not guess file names or patterns.
- Make the smallest change that resolves the issue.
- Keep touched files free of TS warnings and lint errors.
- Add/update tests when API behavior changes (include negative cases).
- Keep text encoding clean (no mojibake).
## Response icon legend
Use the same status icons defined in `PROJECT_INSTRUCTIONS.md` section "Agent Response Legend (required)":
- `🔄` in progress
- `✅` completed
- `🧪` verification/test result
- `⚠️` risk/blocker/manual action
- `❌` failure
- `🧭` recommendation/next step

48
DEBUGGING_INSTRUCTIONS.md Normal file
View File

@@ -0,0 +1,48 @@
# Debugging Instructions - Fiddy
## Scope and authority
- This file is required for bugfix work.
- `PROJECT_INSTRUCTIONS.md` remains the source of truth for global project rules.
- For debugging tasks, ship the smallest safe fix that resolves the verified issue.
## Required bugfix workflow
1. Reproduce:
- Capture exact route/page, inputs, actor role, and expected vs actual behavior.
- Record a concrete repro sequence before changing code.
2. Localize:
- Identify the failing boundary (route/controller/model/service/client wrapper/hook/ui).
- Confirm whether failure is validation, authorization, data, or rendering.
3. Fix minimally:
- Modify only the layers needed to resolve the bug.
- Do not introduce parallel mechanisms for the same state flow.
4. Verify:
- Re-run repro.
- Run lint/tests for touched areas.
- Confirm no regression against contracts in `PROJECT_INSTRUCTIONS.md`.
## Guardrails while debugging
- External DB only:
- Use `DATABASE_URL`.
- Never add a DB container for a fix.
- No background jobs:
- Do not add cron, workers, or polling daemons.
- Security:
- Never log secrets, receipt bytes, or full invite codes.
- Invite logs/audit may include only last4.
- Authorization:
- Enforce RBAC server-side; client checks are UX only.
## Contract-specific debug checks
- Auth:
- Sessions must remain DB-backed and cookie-based (HttpOnly).
- Receipts:
- List endpoints must never include receipt bytes.
- Byte retrieval must be through dedicated endpoint only.
- Request IDs/audit:
- Ensure `request_id` appears in responses and audit trail for affected paths.
## Evidence to include with every bugfix
- Root cause summary (one short paragraph).
- Changed files list with rationale.
- Verification steps performed and outcome.
- Any residual risk, fallback, or operator action.

201
PROJECT_INSTRUCTIONS.md Normal file
View File

@@ -0,0 +1,201 @@
# Project Instructions - Fiddy (External DB)
## 1) Core expectation
This project connects to an **external Postgres instance (on-prem server)**. Dev and Prod must share the **same schema** through **migrations**.
## 2) Authority & doc order
1) **PROJECT_INSTRUCTIONS.md** (this file) is the source of truth.
2) **DEBUGGING_INSTRUCTIONS.md** (repo root) is required for bugfix work.
3) Other instruction files (e.g. `.github/copilot-instructions.md`) must not conflict with this doc.
If anything conflicts, follow **this** doc.
---
## 3) Non-negotiables (hard rules)
### External DB + migrations
- `DATABASE_URL` points to **on-prem Postgres** (**NOT** a container).
- Dev/Prod share schema via migrations in: `packages/db/migrations`.
- Active migration runbook: `docs/DB_MIGRATION_WORKFLOW.md` (active set + status commands).
### No background jobs
- **No cron/worker jobs**. Any fix must work without background tasks.
### Security / logging
- **Never log secrets** (passwords, tokens, session cookies).
- **Never log receipt bytes**.
- **Never log full invite codes** - logs/audit store **last4 only**.
### Server-side authorization only
- **Server-side RBAC only.** Client checks are UX only and must not be trusted.
---
## 4) Non-regression contracts (do not break)
### Auth
- Custom email/password auth.
- Sessions are **DB-backed** and stored in table `sessions`.
- Session cookies are **HttpOnly**.
### Receipts
- Receipt images are stored in Postgres `bytea` table `receipts`.
- **Entries list endpoints must never return receipt image bytes.**
- Receipt bytes are fetched only via a **separate endpoint** when inspecting a single item.
### Request IDs + audit
- API must generate a **`request_id`** and return it in responses.
- Audit logs must include `request_id`.
- Audit logs must never store full invite codes (store **last4 only**).
---
## 5) Architecture contract (Backend <-> Client <-> Hooks <-> UI)
### No-assumptions rule (required)
Before making structural changes, first scan the repo and identify:
- where `app/`, `components/`, `features/`, `hooks/`, `lib/` live
- existing API routes and helpers
- patterns already in use
Do not invent files/endpoints/conventions. If something is missing, add it **minimally** and **consistently**.
### Single mechanism rule (required)
For any cross-component state propagation concern, keep **one** canonical mechanism only:
- Context **OR** custom events **OR** cache invalidation
Do not keep old and new mechanisms in parallel. Remove superseded utilities/imports/files in the same PR.
### Layering (hard boundaries)
For every domain (auth, groups, entries, receipts, etc.) follow this flow:
1) **API Route Handlers** - `app/api/.../route.ts`
- Thin: parse/validate input, call a server service, return JSON.
- No direct DB queries in route files unless there is no existing server service.
2) **Server Services (DB + authorization)** - `lib/server/*`
- Own all DB access and authorization helpers.
- Server-only modules must include: `import "server-only";`
- Prefer small domain modules: `lib/server/auth.ts`, `lib/server/groups.ts`, `lib/server/entries.ts`, `lib/server/receipts.ts`, `lib/server/session.ts`.
3) **Client API Wrappers** - `lib/client/*`
- Typed fetch helpers only (no React state).
- Centralize fetch + error normalization.
- Always send credentials (cookies) and never trust client-side RBAC.
4) **Hooks (UI-facing API layer)** - `hooks/use-*.ts`
- Hooks are the primary interface for components/pages to call APIs.
- Components should not call `fetch()` directly unless there is a strong reason.
### API conventions
- Prefer consistent JSON error shape:
- `{ error: { code: string, message: string }, request_id?: string }`
- Validate inputs at the route boundary (shape/type), authorize in server services.
- Mirror existing REST style used in the project.
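The error shape above can be captured in one small builder. This is a hypothetical helper for illustration; real route handlers may assemble the envelope inline:

```javascript
// Hypothetical builder for the JSON error envelope:
// { error: { code, message }, request_id? }
function errorEnvelope(code, message, requestId) {
  const body = { error: { code, message } };
  if (requestId) body.request_id = requestId;
  return body;
}

console.log(JSON.stringify(errorEnvelope("NOT_A_MEMBER", "Not a member of this group", "req-123")));
// → {"error":{"code":"NOT_A_MEMBER","message":"Not a member of this group"},"request_id":"req-123"}
```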
### Next.js route params checklist (required)
For `app/api/**/[param]/route.ts`:
- Treat `context.params` as **async** and `await` it before reading properties.
- Example: `const { id } = await context.params;`
### Frontend structure preference
- Prefer domain-first structure: `features/<domain>/...` + `shared/...`.
- Use `components/*` only for compatibility shims during migrations (remove them after imports are migrated).
### Maintainability thresholds (refactor triggers)
- Component files > **400 lines** should be split into container/presentational parts.
- Hook files > **150 lines** should extract helper functions/services.
- Functions with more than **3 nested branches** should be extracted.
---
## 6) Decisions / constraints (Group Settings)
- Add `GROUP_OWNER` role to group roles; migrate existing groups so the first admin becomes owner.
- Join policy default is `NOT_ACCEPTING`. Policies: `NOT_ACCEPTING`, `AUTO_ACCEPT`, `APPROVAL_REQUIRED`.
- Both owner and admins can approve join requests and manage invite links.
- Invite links:
- TTL limited to 1-7 days.
- Settings are immutable after creation (policy, single-use, etc.).
- Single-use does not override approval-required.
- Expired links are retained and can be revived.
- Single-use links are deleted after successful use.
- Revive resets `used_at` and `revoked_at`, refreshes `expires_at`, and creates a new audit event.
- No cron/worker jobs for now (auto ownership transfer and invite rotation are paused).
- Group role icons must be consistent: owner, admin, member.
---
## 7) Do first (vertical slice)
1) DB migrate command + schema
2) Register/Login/Logout (custom sessions)
3) Protected dashboard page
4) Group create/join + group switcher (approval-based joins + optional join disable)
5) Entries CRUD (no receipt bytes in list)
6) Receipt upload/download endpoints
7) Settings + Reports
---
## 8) Definition of done
- Works via `docker-compose.dev.yml` with external DB
- Migrations applied via `npm run db:migrate`
- Tests + lint pass
- RBAC enforced server-side
- No large files
- No TypeScript warnings or lint errors in touched files
- No new cron/worker dependencies unless explicitly approved
- No orphaned utilities/hooks/contexts after refactors
- No duplicate mechanisms for the same state flow
- Text encoding remains clean in user-facing strings/docs
---
## 9) Desktop + mobile UX checklist (required)
- Touch: long-press affordance for item-level actions when no visible button.
- Mouse: hover affordance on interactive rows/cards.
- Tap targets remain >= 40px on mobile.
- Modal overlays must close on outside click/tap.
- Use bubble notifications for main actions (create/update/delete/join).
- Add Playwright UI tests for new UI features and critical flows.
---
## 10) Tests (required)
- Add/update tests for API behavior changes (auth, groups, entries, receipts).
- Include negative cases where applicable:
- unauthorized
- not-a-member
- invalid input
---
## 11) Agent Response Legend (required)
Use emoji/icons in agent progress and final responses so status is obvious at a glance.
Legend:
- `🔄` in progress
- `✅` completed
- `🧪` test/lint/verification result
- `📄` documentation update
- `🗄️` database or migration change
- `🚀` deploy/release step
- `⚠️` risk, blocker, or manual operator action needed
- `❌` failed command or unsuccessful attempt
- `ℹ️` informational context
- `🧭` recommendation or next-step option
Usage rules:
- Include at least one status icon in each substantive agent response.
- Use one icon per bullet/line; avoid icon spam.
- Keep icon meaning consistent with this legend.
---
## 12) Commit Discipline (required)
- Commit in small, logical slices (no broad mixed-purpose commits).
- Each commit must:
- follow Conventional Commits style (`feat:`, `fix:`, `docs:`, `refactor:`, `test:`, `chore:`)
- include only related files for that slice
- exclude secrets, credentials, and generated noise
- Run verification before commit when applicable (lint/tests/build or targeted checks for touched areas).
- Prefer frequent checkpoint commits during agentic work rather than one large end-state commit.
- If a rule or contract changes, commit docs first (or in the same atomic slice as enforcing code).

11
backend/.env.example Normal file
View File

@@ -0,0 +1,11 @@
DATABASE_URL=postgres://username:password@db-host:5432/database_name
DB_USER=
DB_PASS=
DB_HOST=
DB_PORT=5432
DB_NAME=
PORT=5000
JWT_SECRET=change-me
ALLOWED_ORIGINS=http://localhost:3000
SESSION_COOKIE_NAME=sid
SESSION_TTL_DAYS=30
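The example file carries both `DATABASE_URL` and discrete `DB_*` variables. A plausible precedence rule — prefer the URL, fall back to the discrete fields — can be sketched as follows; how `backend/db/pool.js` actually resolves this is an assumption:

```javascript
// Hypothetical config resolution: DATABASE_URL wins, discrete DB_* vars are
// the fallback. The real pool setup may differ.
function buildPoolConfig(env) {
  if (env.DATABASE_URL) return { connectionString: env.DATABASE_URL };
  return {
    host: env.DB_HOST,
    port: Number(env.DB_PORT || 5432),
    user: env.DB_USER,
    password: env.DB_PASS,
    database: env.DB_NAME,
  };
}

console.log(buildPoolConfig({ DATABASE_URL: "postgres://u:p@h:5432/d" }).connectionString);
// → postgres://u:p@h:5432/d
console.log(buildPoolConfig({ DB_HOST: "h", DB_NAME: "d" }).port); // → 5432
```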

View File

@ -2,15 +2,22 @@ const express = require("express");
 const cors = require("cors");
 const path = require("path");
 const User = require("./models/user.model");
+const requestIdMiddleware = require("./middleware/request-id");
+const { sendError } = require("./utils/http");
 const app = express();
+app.use(requestIdMiddleware);
 app.use(express.json());
-// Serve static files from public directory
-app.use('/test', express.static(path.join(__dirname, 'public')));
+// Expose manual API test pages in non-production environments only.
+if (process.env.NODE_ENV !== "production") {
+  app.use("/test", express.static(path.join(__dirname, "public")));
+}
-const allowedOrigins = process.env.ALLOWED_ORIGINS.split(",").map(origin => origin.trim());
-console.log("Allowed Origins:", allowedOrigins);
+const allowedOrigins = (process.env.ALLOWED_ORIGINS || "")
+  .split(",")
+  .map((origin) => origin.trim())
+  .filter(Boolean);
 app.use(
   cors({
     origin: function (origin, callback) {
@@ -18,18 +25,20 @@ app.use(
       if (allowedOrigins.includes(origin)) return callback(null, true);
       if (/^http:\/\/192\.168\.\d+\.\d+/.test(origin)) return callback(null, true);
       if (/^https:\/\/192\.168\.\d+\.\d+/.test(origin)) return callback(null, true);
-      console.error(`🚫 CORS blocked origin: ${origin}`);
+      console.error(`CORS blocked origin: ${origin}`);
       callback(new Error(`CORS blocked: ${origin}. Add this origin to ALLOWED_ORIGINS environment variable.`));
     },
     methods: ["GET", "POST", "PUT", "DELETE", "PATCH"],
+    credentials: true,
+    exposedHeaders: ["X-Request-Id"],
   })
 );
 app.get('/', async (req, res) => {
-  resText = `Grocery List API is running.\n` +
-    `Roles available: ${Object.values(User.ROLES).join(', ')}`
-  res.status(200).type("text/plain").send(resText);
+  res.status(200).json({
+    message: "Grocery List API is running.",
+    roles: Object.values(User.ROLES),
+  });
 });
@@ -54,4 +63,16 @@ app.use("/households", householdsRoutes);
 const storesRoutes = require("./routes/stores.routes");
 app.use("/stores", storesRoutes);
+app.use((err, req, res, next) => {
+  if (res.headersSent) {
+    return next(err);
+  }
+  const statusCode = err.status || err.statusCode || 500;
+  const message =
+    statusCode >= 500 ? "Internal server error" : err.message || "Request failed";
+  return sendError(res, statusCode, message);
+});
+
 module.exports = app;
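The origin callback above combines an explicit allowlist with LAN-address patterns. As a standalone sketch of that decision logic (the function name and sample origins are illustrative, not part of the codebase; the two `http`/`https` regexes are folded into one equivalent `https?` pattern):

```javascript
// Hypothetical pure-function version of the CORS origin check in the diff above.
// Mirrors the allowlist lookup plus the 192.168.x.x LAN-address exception.
function isOriginAllowed(origin, allowedOrigins) {
  if (allowedOrigins.includes(origin)) return true;
  // Any private 192.168.x.x address is accepted, matching the diff's regexes.
  return /^https?:\/\/192\.168\.\d+\.\d+/.test(origin);
}

const allowed = ["https://groceries.example.com"];
console.log(isOriginAllowed("https://groceries.example.com", allowed)); // true
console.log(isOriginAllowed("http://192.168.1.50:3000", allowed));      // true
console.log(isOriginAllowed("https://evil.example.net", allowed));      // false
```

Factoring the check out this way also makes the blocked-origin branch trivially unit-testable without spinning up the CORS middleware.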


@@ -1,44 +1,101 @@
 const bcrypt = require("bcryptjs");
 const jwt = require("jsonwebtoken");
 const User = require("../models/user.model");
+const { sendError } = require("../utils/http");
+const Session = require("../models/session.model");
+const { parseCookieHeader } = require("../utils/cookies");
+const { setSessionCookie, clearSessionCookie, cookieName } = require("../utils/session-cookie");
+const { logError } = require("../utils/logger");
 exports.register = async (req, res) => {
   let { username, password, name } = req.body;
+  if (
+    !username ||
+    !password ||
+    !name ||
+    typeof username !== "string" ||
+    typeof password !== "string" ||
+    typeof name !== "string"
+  ) {
+    return sendError(res, 400, "Username, password, and name are required");
+  }
   username = username.toLowerCase();
-  console.log(`🆕 Registration attempt for ${name} => username:${username}, password:${password}`);
+  if (password.length < 8) {
+    return sendError(res, 400, "Password must be at least 8 characters");
+  }
   try {
     const hash = await bcrypt.hash(password, 10);
     const user = await User.createUser(username, hash, name);
-    console.log(`✅ User registered: ${username}`);
     res.json({ message: "User registered", user });
   } catch (err) {
-    res.status(400).json({ message: "Registration failed", error: err });
+    logError(req, "auth.register", err);
+    sendError(res, 400, "Registration failed");
   }
 };
 exports.login = async (req, res) => {
   let { username, password } = req.body;
+  if (
+    !username ||
+    !password ||
+    typeof username !== "string" ||
+    typeof password !== "string"
+  ) {
+    return sendError(res, 400, "Username and password are required");
+  }
   username = username.toLowerCase();
   const user = await User.findByUsername(username);
   if (!user) {
-    console.log(`⚠️ Login attempt -> No user found: ${username}`);
-    return res.status(401).json({ message: "User not found" });
+    return sendError(res, 401, "Invalid credentials");
   }
   const valid = await bcrypt.compare(password, user.password);
   if (!valid) {
-    console.log(`⛔ Login attempt for user ${username} with password ${password}`);
-    return res.status(401).json({ message: "Invalid credentials" });
+    return sendError(res, 401, "Invalid credentials");
   }
+  const jwtSecret = process.env.JWT_SECRET;
+  if (!jwtSecret) {
+    logError(req, "auth.login.jwtSecretMissing", new Error("JWT_SECRET is not configured"));
+    return sendError(res, 500, "Authentication is unavailable");
+  }
   const token = jwt.sign(
     { id: user.id, role: user.role },
-    process.env.JWT_SECRET,
+    jwtSecret,
     { expiresIn: "1 year" }
   );
+  try {
+    const session = await Session.createSession(user.id, req.headers["user-agent"] || null);
+    setSessionCookie(res, session.id);
+  } catch (err) {
+    logError(req, "auth.login.createSession", err);
+    return sendError(res, 500, "Failed to create session");
+  }
   res.json({ token, userId: user.id, username, role: user.role });
 };
+exports.logout = async (req, res) => {
+  try {
+    const cookies = parseCookieHeader(req.headers.cookie);
+    const sid = cookies[cookieName()];
+    if (sid) {
+      await Session.deleteSession(sid);
+    }
+    clearSessionCookie(res);
+    res.json({ message: "Logged out" });
+  } catch (err) {
+    logError(req, "auth.logout", err);
+    sendError(res, 500, "Failed to logout");
+  }
+};
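The new logout handler relies on `parseCookieHeader` from `../utils/cookies`, whose implementation is not part of this diff. A minimal sketch of what such a helper typically does, purely for illustration (the project's actual version may handle quoting or encoding differently):

```javascript
// Illustrative stand-in for the parseCookieHeader helper imported above:
// split a raw Cookie header into a name -> value map.
function parseCookieHeader(header) {
  const out = {};
  if (!header) return out; // no Cookie header on the request
  for (const part of header.split(";")) {
    const idx = part.indexOf("=");
    if (idx === -1) continue; // skip malformed fragments
    const name = part.slice(0, idx).trim();
    const value = part.slice(idx + 1).trim();
    if (name) out[name] = decodeURIComponent(value);
  }
  return out;
}

console.log(parseCookieHeader("sid=abc123; theme=dark").sid); // "abc123"
```

Logout then only needs to look up the session-cookie name in that map and delete the matching row, which is exactly the shape of the handler above.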


@@ -1,4 +1,7 @@
 const householdModel = require("../models/household.model");
+const { sendError } = require("../utils/http");
+const { inviteCodeLast4 } = require("../utils/redaction");
+const { logError } = require("../utils/logger");
 // Get all households user belongs to
 exports.getUserHouseholds = async (req, res) => {
@@ -6,8 +9,8 @@ exports.getUserHouseholds = async (req, res) => {
     const households = await householdModel.getUserHouseholds(req.user.id);
     res.json(households);
   } catch (error) {
-    console.error("Get user households error:", error);
-    res.status(500).json({ error: "Failed to fetch households" });
+    logError(req, "households.getUserHouseholds", error);
+    sendError(res, 500, "Failed to fetch households");
   }
 };
@@ -20,13 +23,13 @@ exports.getHousehold = async (req, res) => {
     );
     if (!household) {
-      return res.status(404).json({ error: "Household not found" });
+      return sendError(res, 404, "Household not found");
     }
     res.json(household);
   } catch (error) {
-    console.error("Get household error:", error);
-    res.status(500).json({ error: "Failed to fetch household" });
+    logError(req, "households.getHousehold", error);
+    sendError(res, 500, "Failed to fetch household");
   }
 };
@@ -36,11 +39,11 @@ exports.createHousehold = async (req, res) => {
     const { name } = req.body;
     if (!name || name.trim().length === 0) {
-      return res.status(400).json({ error: "Household name is required" });
+      return sendError(res, 400, "Household name is required");
     }
     if (name.length > 100) {
-      return res.status(400).json({ error: "Household name must be 100 characters or less" });
+      return sendError(res, 400, "Household name must be 100 characters or less");
     }
     const household = await householdModel.createHousehold(
@@ -53,8 +56,8 @@ exports.createHousehold = async (req, res) => {
       household
     });
   } catch (error) {
-    console.error("Create household error:", error);
-    res.status(500).json({ error: "Failed to create household" });
+    logError(req, "households.createHousehold", error);
+    sendError(res, 500, "Failed to create household");
   }
 };
@@ -64,11 +67,11 @@ exports.updateHousehold = async (req, res) => {
     const { name } = req.body;
     if (!name || name.trim().length === 0) {
-      return res.status(400).json({ error: "Household name is required" });
+      return sendError(res, 400, "Household name is required");
     }
     if (name.length > 100) {
-      return res.status(400).json({ error: "Household name must be 100 characters or less" });
+      return sendError(res, 400, "Household name must be 100 characters or less");
     }
     const household = await householdModel.updateHousehold(
@@ -81,8 +84,8 @@ exports.updateHousehold = async (req, res) => {
       household
     });
   } catch (error) {
-    console.error("Update household error:", error);
-    res.status(500).json({ error: "Failed to update household" });
+    logError(req, "households.updateHousehold", error);
+    sendError(res, 500, "Failed to update household");
   }
 };
@@ -92,8 +95,8 @@ exports.deleteHousehold = async (req, res) => {
     await householdModel.deleteHousehold(req.params.householdId);
     res.json({ message: "Household deleted successfully" });
   } catch (error) {
-    console.error("Delete household error:", error);
-    res.status(500).json({ error: "Failed to delete household" });
+    logError(req, "households.deleteHousehold", error);
+    sendError(res, 500, "Failed to delete household");
   }
 };
@@ -106,23 +109,26 @@ exports.refreshInviteCode = async (req, res) => {
       household
     });
   } catch (error) {
-    console.error("Refresh invite code error:", error);
-    res.status(500).json({ error: "Failed to refresh invite code" });
+    logError(req, "households.refreshInviteCode", error, {
+      invite_last4: inviteCodeLast4(req.body?.inviteCode),
+    });
+    sendError(res, 500, "Failed to refresh invite code");
   }
 };
 // Join household via invite code
 exports.joinHousehold = async (req, res) => {
+  const inviteLast4 = inviteCodeLast4(req.params.inviteCode);
   try {
     const { inviteCode } = req.params;
-    if (!inviteCode) return res.status(400).json({ error: "Invite code is required" });
+    if (!inviteCode) return sendError(res, 400, "Invite code is required");
     const result = await householdModel.joinHousehold(
       inviteCode.toUpperCase(),
       req.user.id
     );
-    if (!result) return res.status(404).json({ error: "Invalid or expired invite code" });
+    if (!result) return sendError(res, 404, "Invalid or expired invite code");
     if (result.alreadyMember) {
@@ -137,8 +143,8 @@ exports.joinHousehold = async (req, res) => {
       household: { id: result.id, name: result.name }
     });
   } catch (error) {
-    console.error("Join household error:", error);
-    res.status(500).json({ error: "Failed to join household" });
+    logError(req, "households.joinHousehold", error, { invite_last4: inviteLast4 });
+    sendError(res, 500, "Failed to join household");
   }
 };
@@ -148,8 +154,8 @@ exports.getMembers = async (req, res) => {
     const members = await householdModel.getHouseholdMembers(req.params.householdId);
     res.json(members);
   } catch (error) {
-    console.error("Get members error:", error);
-    res.status(500).json({ error: "Failed to fetch members" });
+    logError(req, "households.getMembers", error);
+    sendError(res, 500, "Failed to fetch members");
   }
 };
@@ -160,12 +166,12 @@ exports.updateMemberRole = async (req, res) => {
     const { role } = req.body;
     if (!role || !['admin', 'user'].includes(role)) {
-      return res.status(400).json({ error: "Invalid role. Must be 'admin' or 'user'" });
+      return sendError(res, 400, "Invalid role. Must be 'admin' or 'user'");
     }
     // Can't change own role
     if (parseInt(userId) === req.user.id) {
-      return res.status(400).json({ error: "Cannot change your own role" });
+      return sendError(res, 400, "Cannot change your own role");
     }
     const updated = await householdModel.updateMemberRole(
@@ -179,8 +185,8 @@ exports.updateMemberRole = async (req, res) => {
       member: updated
     });
   } catch (error) {
-    console.error("Update member role error:", error);
-    res.status(500).json({ error: "Failed to update member role" });
+    logError(req, "households.updateMemberRole", error);
+    sendError(res, 500, "Failed to update member role");
   }
 };
@@ -192,16 +198,14 @@ exports.removeMember = async (req, res) => {
     // Allow users to remove themselves, or admins to remove others
     if (targetUserId !== req.user.id && req.household.role !== 'admin') {
-      return res.status(403).json({
-        error: "Only admins can remove other members"
-      });
+      return sendError(res, 403, "Only admins can remove other members");
     }
     await householdModel.removeMember(req.params.householdId, userId);
     res.json({ message: "Member removed successfully" });
   } catch (error) {
-    console.error("Remove member error:", error);
-    res.status(500).json({ error: "Failed to remove member" });
+    logError(req, "households.removeMember", error);
+    sendError(res, 500, "Failed to remove member");
   }
 };
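`inviteCodeLast4` (from `../utils/redaction`) is referenced throughout this file but its body is not shown in the diff. A plausible sketch of the "last4" policy named in the commit message, purely for illustration (the project's implementation may differ):

```javascript
// Illustrative sketch of an invite-code redaction helper like inviteCodeLast4:
// log at most the last four characters, never the full code.
function inviteCodeLast4(code) {
  if (typeof code !== "string" || code.length === 0) return null;
  return code.length <= 4 ? code : code.slice(-4);
}

console.log(inviteCodeLast4("AB12CD34")); // "CD34"
console.log(inviteCodeLast4(undefined));  // null
```

Capturing the redacted value before the `try` block (as `joinHousehold` does with `inviteLast4`) keeps it available in the `catch` even if destructuring or the model call throws.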


@@ -1,5 +1,7 @@
 const List = require("../models/list.model");
 const { isValidItemType, isValidItemGroup, isValidZone } = require("../constants/classifications");
+const { sendError } = require("../utils/http");
+const { logError } = require("../utils/logger");
 exports.getList = async (req, res) => {
@@ -58,7 +60,7 @@ exports.updateItemImage = async (req, res) => {
     const mimeType = req.processedImage?.mimeType || null;
     if (!imageBuffer) {
-      return res.status(400).json({ message: "No image provided" });
+      return sendError(res, 400, "No image provided");
     }
     // Update the item with new image
@@ -90,15 +92,15 @@ exports.updateItemWithClassification = async (req, res) => {
     // Validate classification data
     if (item_type && !isValidItemType(item_type)) {
-      return res.status(400).json({ message: "Invalid item_type" });
+      return sendError(res, 400, "Invalid item_type");
     }
     if (item_group && !isValidItemGroup(item_type, item_group)) {
-      return res.status(400).json({ message: "Invalid item_group for selected item_type" });
+      return sendError(res, 400, "Invalid item_group for selected item_type");
     }
     if (zone && !isValidZone(zone)) {
-      return res.status(400).json({ message: "Invalid zone" });
+      return sendError(res, 400, "Invalid zone");
     }
     // Upsert classification with confidence=1.0 and source='user'
@@ -113,7 +115,7 @@ exports.updateItemWithClassification = async (req, res) => {
     res.json({ message: "Item updated successfully" });
   } catch (error) {
-    console.error("Error updating item with classification:", error);
-    res.status(500).json({ message: "Failed to update item" });
+    logError(req, "listsLegacy.updateItemWithClassification", error);
+    sendError(res, 500, "Failed to update item");
   }
 };


@@ -1,5 +1,8 @@
 const List = require("../models/list.model.v2");
+const householdModel = require("../models/household.model");
 const { isValidItemType, isValidItemGroup, isValidZone } = require("../constants/classifications");
+const { sendError } = require("../utils/http");
+const { logError } = require("../utils/logger");
 /**
  * Get list items for household and store
@@ -11,8 +14,8 @@ exports.getList = async (req, res) => {
     const items = await List.getHouseholdStoreList(householdId, storeId);
     res.json({ items });
   } catch (error) {
-    console.error("Error getting list:", error);
-    res.status(500).json({ message: "Failed to get list" });
+    logError(req, "listsV2.getList", error);
+    sendError(res, 500, "Failed to get list");
   }
 };
@@ -26,18 +29,18 @@ exports.getItemByName = async (req, res) => {
     const { item_name } = req.query;
     if (!item_name) {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
     const item = await List.getItemByName(householdId, storeId, item_name);
     if (!item) {
-      return res.status(404).json({ message: "Item not found" });
+      return sendError(res, 404, "Item not found");
     }
     res.json(item);
   } catch (error) {
-    console.error("Error getting item:", error);
-    res.status(500).json({ message: "Failed to get item" });
+    logError(req, "listsV2.getItemByName", error);
+    sendError(res, 500, "Failed to get item");
   }
 };
@@ -48,11 +51,27 @@ exports.getItemByName = async (req, res) => {
 exports.addItem = async (req, res) => {
   try {
     const { householdId, storeId } = req.params;
-    const { item_name, quantity, notes } = req.body;
+    const { item_name, quantity, notes, added_for_user_id } = req.body;
     const userId = req.user.id;
+    let historyUserId = userId;
     if (!item_name || item_name.trim() === "") {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
+    if (added_for_user_id !== undefined && added_for_user_id !== null && String(added_for_user_id).trim() !== "") {
+      const parsedUserId = Number.parseInt(String(added_for_user_id), 10);
+      if (!Number.isInteger(parsedUserId) || parsedUserId <= 0) {
+        return sendError(res, 400, "Added-for user ID must be a positive integer");
+      }
+      const isMember = await householdModel.isHouseholdMember(householdId, parsedUserId);
+      if (!isMember) {
+        return sendError(res, 400, "Selected user is not a member of this household");
+      }
+      historyUserId = parsedUserId;
+    }
     // Get processed image if uploaded
@@ -71,7 +90,7 @@ exports.addItem = async (req, res) => {
     );
     // Add history record
-    await List.addHistoryRecord(result.listId, quantity || "1", userId);
+    await List.addHistoryRecord(result.listId, quantity || "1", historyUserId);
     res.json({
       message: result.isNew ? "Item added" : "Item updated",
@@ -83,8 +102,8 @@ exports.addItem = async (req, res) => {
       }
     });
   } catch (error) {
-    console.error("Error adding item:", error);
-    res.status(500).json({ message: "Failed to add item" });
+    logError(req, "listsV2.addItem", error);
+    sendError(res, 500, "Failed to add item");
   }
 };
@@ -97,11 +116,10 @@ exports.markBought = async (req, res) => {
     const { householdId, storeId } = req.params;
     const { item_name, bought, quantity_bought } = req.body;
-    if (!item_name) return res.status(400).json({ message: "Item name is required" });
+    if (!item_name) return sendError(res, 400, "Item name is required");
     const item = await List.getItemByName(householdId, storeId, item_name);
-    console.log('requesting mark ', { item, householdId, storeId, item_name, bought, quantity_bought });
-    if (!item) return res.status(404).json({ message: "Item not found" });
+    if (!item) return sendError(res, 404, "Item not found");
     // Update bought status (with optional partial purchase)
@@ -109,8 +127,8 @@ exports.markBought = async (req, res) => {
     res.json({ message: bought ? "Item marked as bought" : "Item unmarked" });
   } catch (error) {
-    console.error("Error marking bought:", error);
-    res.status(500).json({ message: "Failed to update item" });
+    logError(req, "listsV2.markBought", error);
+    sendError(res, 500, "Failed to update item");
   }
 };
@@ -124,13 +142,13 @@ exports.updateItem = async (req, res) => {
     const { item_name, quantity, notes } = req.body;
     if (!item_name) {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
     // Get the list item
     const item = await List.getItemByName(householdId, storeId, item_name);
     if (!item) {
-      return res.status(404).json({ message: "Item not found" });
+      return sendError(res, 404, "Item not found");
     }
     // Update item
@@ -146,8 +164,8 @@ exports.updateItem = async (req, res) => {
       }
     });
   } catch (error) {
-    console.error("Error updating item:", error);
-    res.status(500).json({ message: "Failed to update item" });
+    logError(req, "listsV2.updateItem", error);
+    sendError(res, 500, "Failed to update item");
   }
 };
@@ -161,21 +179,21 @@ exports.deleteItem = async (req, res) => {
     const { item_name } = req.body;
     if (!item_name) {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
     // Get the list item
     const item = await List.getItemByName(householdId, storeId, item_name);
     if (!item) {
-      return res.status(404).json({ message: "Item not found" });
+      return sendError(res, 404, "Item not found");
     }
     await List.deleteItem(item.id);
     res.json({ message: "Item deleted" });
   } catch (error) {
-    console.error("Error deleting item:", error);
-    res.status(500).json({ message: "Failed to delete item" });
+    logError(req, "listsV2.deleteItem", error);
+    sendError(res, 500, "Failed to delete item");
   }
 };
@@ -191,8 +209,8 @@ exports.getSuggestions = async (req, res) => {
     const suggestions = await List.getSuggestions(query || "", householdId, storeId);
     res.json(suggestions);
   } catch (error) {
-    console.error("Error getting suggestions:", error);
-    res.status(500).json({ message: "Failed to get suggestions" });
+    logError(req, "listsV2.getSuggestions", error);
+    sendError(res, 500, "Failed to get suggestions");
   }
 };
@@ -206,8 +224,8 @@ exports.getRecentlyBought = async (req, res) => {
     const items = await List.getRecentlyBoughtItems(householdId, storeId);
     res.json(items);
   } catch (error) {
-    console.error("Error getting recent items:", error);
-    res.status(500).json({ message: "Failed to get recent items" });
+    logError(req, "listsV2.getRecentlyBought", error);
+    sendError(res, 500, "Failed to get recent items");
   }
 };
@@ -221,7 +239,7 @@ exports.getClassification = async (req, res) => {
     const { item_name } = req.query;
     if (!item_name) {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
     // Get item ID from name
@@ -233,8 +251,8 @@ exports.getClassification = async (req, res) => {
     const classification = await List.getClassification(householdId, item.item_id);
     res.json({ classification });
   } catch (error) {
-    console.error("Error getting classification:", error);
-    res.status(500).json({ message: "Failed to get classification" });
+    logError(req, "listsV2.getClassification", error);
+    sendError(res, 500, "Failed to get classification");
   }
 };
@@ -248,17 +266,17 @@ exports.setClassification = async (req, res) => {
     const { item_name, classification } = req.body;
     if (!item_name) {
-      return res.status(400).json({ message: "Item name is required" });
+      return sendError(res, 400, "Item name is required");
     }
     if (!classification) {
-      return res.status(400).json({ message: "Classification is required" });
+      return sendError(res, 400, "Classification is required");
     }
     // Validate classification
     const validClassifications = ['produce', 'dairy', 'meat', 'bakery', 'frozen', 'pantry', 'snacks', 'beverages', 'household', 'other'];
     if (!validClassifications.includes(classification)) {
-      return res.status(400).json({ message: "Invalid classification value" });
+      return sendError(res, 400, "Invalid classification value");
     }
     // Get item - add to master items if not exists
@@ -290,8 +308,8 @@ exports.setClassification = async (req, res) => {
     res.json({ message: "Classification set", classification });
   } catch (error) {
-    console.error("Error setting classification:", error);
-    res.status(500).json({ message: "Failed to set classification" });
+    logError(req, "listsV2.setClassification", error);
+    sendError(res, 500, "Failed to set classification");
   }
 };
@@ -310,7 +328,7 @@ exports.updateItemImage = async (req, res) => {
     const mimeType = req.processedImage?.mimeType || null;
     if (!imageBuffer) {
-      return res.status(400).json({ message: "No image provided" });
+      return sendError(res, 400, "No image provided");
     }
     // Update the item with new image
@@ -318,7 +336,7 @@ exports.updateItemImage = async (req, res) => {
     res.json({ message: "Image updated successfully" });
   } catch (error) {
-    console.error("Error updating image:", error);
-    res.status(500).json({ message: "Failed to update image" });
+    logError(req, "listsV2.updateItemImage", error);
+    sendError(res, 500, "Failed to update image");
  }
 };
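The `added_for_user_id` handling in `addItem` is a small parsing step followed by a household-membership check. The parsing step can be restated as a pure function (names here are illustrative; the async membership check stays with the controller, since it needs the model):

```javascript
// Restatement of the added_for_user_id validation from addItem above.
// Returns null when the field is absent (caller falls back to the requesting
// user), the parsed positive integer otherwise, and throws on bad input.
function parseAddedForUserId(raw) {
  if (raw === undefined || raw === null || String(raw).trim() === "") {
    return null; // not provided: history attributes to the requesting user
  }
  const parsed = Number.parseInt(String(raw), 10);
  if (!Number.isInteger(parsed) || parsed <= 0) {
    throw new RangeError("Added-for user ID must be a positive integer");
  }
  return parsed;
}

console.log(parseAddedForUserId("7")); // 7
console.log(parseAddedForUserId(""));  // null
```

Accepting both numbers and numeric strings matters here because the value arrives from a form `<select>`, where everything is a string.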


@@ -1,4 +1,6 @@
 const storeModel = require("../models/store.model");
+const { sendError } = require("../utils/http");
+const { logError } = require("../utils/logger");
 // Get all available stores
 exports.getAllStores = async (req, res) => {
@@ -6,8 +8,8 @@ exports.getAllStores = async (req, res) => {
     const stores = await storeModel.getAllStores();
     res.json(stores);
   } catch (error) {
-    console.error("Get all stores error:", error);
-    res.status(500).json({ error: "Failed to fetch stores" });
+    logError(req, "stores.getAllStores", error);
+    sendError(res, 500, "Failed to fetch stores");
   }
 };
@@ -17,8 +19,8 @@ exports.getHouseholdStores = async (req, res) => {
     const stores = await storeModel.getHouseholdStores(req.params.householdId);
     res.json(stores);
   } catch (error) {
-    console.error("Get household stores error:", error);
-    res.status(500).json({ error: "Failed to fetch household stores" });
+    logError(req, "stores.getHouseholdStores", error);
+    sendError(res, 500, "Failed to fetch household stores");
   }
 };
@@ -28,11 +30,11 @@ exports.addStoreToHousehold = async (req, res) => {
     const { storeId, isDefault } = req.body;
     // console.log("Adding store to household:", { householdId: req.params.householdId, storeId, isDefault });
     if (!storeId) {
-      return res.status(400).json({ error: "Store ID is required" });
+      return sendError(res, 400, "Store ID is required");
     }
     const store = await storeModel.getStoreById(storeId);
-    if (!store) return res.status(404).json({ error: "Store not found" });
+    if (!store) return sendError(res, 404, "Store not found");
     const foundStores = await storeModel.getHouseholdStores(req.params.householdId);
     // if (foundStores.length == 0) isDefault = 'true';
@@ -47,8 +49,8 @@ exports.addStoreToHousehold = async (req, res) => {
       store
     });
   } catch (error) {
-    console.error("Add store to household error:", error);
-    res.status(500).json({ error: "Failed to add store to household" });
+    logError(req, "stores.addStoreToHousehold", error);
+    sendError(res, 500, "Failed to add store to household");
   }
 };
@@ -62,8 +64,8 @@ exports.removeStoreFromHousehold = async (req, res) => {
     res.json({ message: "Store removed from household successfully" });
   } catch (error) {
-    console.error("Remove store from household error:", error);
-    res.status(500).json({ error: "Failed to remove store from household" });
+    logError(req, "stores.removeStoreFromHousehold", error);
+    sendError(res, 500, "Failed to remove store from household");
   }
 };
@@ -77,8 +79,8 @@ exports.setDefaultStore = async (req, res) => {
     res.json({ message: "Default store updated successfully" });
   } catch (error) {
-    console.error("Set default store error:", error);
-    res.status(500).json({ error: "Failed to set default store" });
+    logError(req, "stores.setDefaultStore", error);
+    sendError(res, 500, "Failed to set default store");
   }
 };
@@ -88,7 +90,7 @@ exports.createStore = async (req, res) => {
     const { name, default_zones } = req.body;
     if (!name || name.trim().length === 0) {
-      return res.status(400).json({ error: "Store name is required" });
+      return sendError(res, 400, "Store name is required");
     }
     const store = await storeModel.createStore(name.trim(), default_zones || null);
@@ -98,11 +100,11 @@ exports.createStore = async (req, res) => {
      store
     });
   } catch (error) {
-    console.error("Create store error:", error);
+    logError(req, "stores.createStore", error);
     if (error.code === '23505') { // Unique violation
-      return res.status(400).json({ error: "Store with this name already exists" });
+      return sendError(res, 400, "Store with this name already exists");
     }
-    res.status(500).json({ error: "Failed to create store" });
+    sendError(res, 500, "Failed to create store");
   }
 };
@@ -117,7 +119,7 @@ exports.updateStore = async (req, res) => {
     });
     if (!store) {
-      return res.status(404).json({ error: "Store not found" });
+      return sendError(res, 404, "Store not found");
     }
     res.json({
@@ -125,8 +127,8 @@ exports.updateStore = async (req, res) => {
       store
     });
   } catch (error) {
-    console.error("Update store error:", error);
-    res.status(500).json({ error: "Failed to update store" });
+    logError(req, "stores.updateStore", error);
+    sendError(res, 500, "Failed to update store");
   }
 };
@@ -136,10 +138,10 @@ exports.deleteStore = async (req, res) => {
     await storeModel.deleteStore(req.params.storeId);
     res.json({ message: "Store deleted successfully" });
   } catch (error) {
-    console.error("Delete store error:", error);
+    logError(req, "stores.deleteStore", error);
     if (error.message.includes('in use')) {
-      return res.status(400).json({ error: error.message });
+      return sendError(res, 400, error.message);
     }
-    res.status(500).json({ error: "Failed to delete store" });
+    sendError(res, 500, "Failed to delete store");
   }
 };

View File

@@ -1,8 +1,9 @@
 const User = require("../models/user.model");
 const bcrypt = require("bcryptjs");
+const { sendError } = require("../utils/http");
+const { logError } = require("../utils/logger");
 exports.test = async (req, res) => {
-  console.log("User route is working");
   res.json({ message: "User route is working" });
 };
@@ -15,18 +16,17 @@ exports.getAllUsers = async (req, res) => {
 exports.updateUserRole = async (req, res) => {
   try {
     const { id, role } = req.body;
-    console.log(`Updating user ${id} to role ${role}`);
     if (!Object.values(User.ROLES).includes(role))
-      return res.status(400).json({ error: "Invalid role" });
+      return sendError(res, 400, "Invalid role");
     const updated = await User.updateUserRole(id, role);
     if (!updated)
-      return res.status(404).json({ error: "User not found" });
+      return sendError(res, 404, "User not found");
     res.json({ message: "Role updated", id, role });
   } catch (err) {
-    res.status(500).json({ error: "Failed to update role" });
+    logError(req, "users.updateUserRole", err);
+    sendError(res, 500, "Failed to update role");
   }
 };
@@ -36,12 +36,13 @@ exports.deleteUser = async (req, res) => {
     const deleted = await User.deleteUser(id);
     if (!deleted)
-      return res.status(404).json({ error: "User not found" });
+      return sendError(res, 404, "User not found");
     res.json({ message: "User deleted", id });
   } catch (err) {
-    res.status(500).json({ error: "Failed to delete user" });
+    logError(req, "users.deleteUser", err);
+    sendError(res, 500, "Failed to delete user");
   }
 };
@@ -57,13 +58,13 @@ exports.getCurrentUser = async (req, res) => {
     const user = await User.getUserById(userId);
     if (!user) {
-      return res.status(404).json({ error: "User not found" });
+      return sendError(res, 404, "User not found");
     }
     res.json(user);
   } catch (err) {
-    console.error("Error getting current user:", err);
-    res.status(500).json({ error: "Failed to get user profile" });
+    logError(req, "users.getCurrentUser", err);
+    sendError(res, 500, "Failed to get user profile");
   }
 };
@@ -73,23 +74,23 @@ exports.updateCurrentUser = async (req, res) => {
     const { display_name } = req.body;
     if (!display_name || display_name.trim().length === 0) {
-      return res.status(400).json({ error: "Display name is required" });
+      return sendError(res, 400, "Display name is required");
     }
     if (display_name.length > 100) {
-      return res.status(400).json({ error: "Display name must be 100 characters or less" });
+      return sendError(res, 400, "Display name must be 100 characters or less");
     }
     const updated = await User.updateUserProfile(userId, { display_name: display_name.trim() });
     if (!updated) {
-      return res.status(404).json({ error: "User not found" });
+      return sendError(res, 404, "User not found");
     }
     res.json({ message: "Profile updated successfully", user: updated });
   } catch (err) {
-    console.error("Error updating user profile:", err);
-    res.status(500).json({ error: "Failed to update profile" });
+    logError(req, "users.updateCurrentUser", err);
+    sendError(res, 500, "Failed to update profile");
   }
 };
@@ -100,25 +101,25 @@ exports.changePassword = async (req, res) => {
     // Validation
     if (!current_password || !new_password) {
-      return res.status(400).json({ error: "Current password and new password are required" });
+      return sendError(res, 400, "Current password and new password are required");
     }
     if (new_password.length < 6) {
-      return res.status(400).json({ error: "New password must be at least 6 characters" });
+      return sendError(res, 400, "New password must be at least 6 characters");
     }
     // Get current password hash
     const currentHash = await User.getUserPasswordHash(userId);
     if (!currentHash) {
-      return res.status(404).json({ error: "User not found" });
+      return sendError(res, 404, "User not found");
     }
     // Verify current password
     const isValidPassword = await bcrypt.compare(current_password, currentHash);
     if (!isValidPassword) {
-      return res.status(401).json({ error: "Current password is incorrect" });
+      return sendError(res, 401, "Current password is incorrect");
     }
     // Hash new password
@@ -130,7 +131,7 @@ exports.changePassword = async (req, res) => {
     res.json({ message: "Password changed successfully" });
   } catch (err) {
-    console.error("Error changing password:", err);
-    res.status(500).json({ error: "Failed to change password" });
+    logError(req, "users.changePassword", err);
+    sendError(res, 500, "Failed to change password");
   }
 };

View File

@@ -1,11 +1,21 @@
 const { Pool } = require("pg");
-const pool = new Pool({
-  user: process.env.DB_USER,
-  password: process.env.DB_PASS,
-  host: process.env.DB_HOST,
-  database: process.env.DB_NAME,
-  port: 5432,
-});
+function buildPoolConfig() {
+  if (process.env.DATABASE_URL) {
+    return {
+      connectionString: process.env.DATABASE_URL,
+    };
+  }
+  return {
+    user: process.env.DB_USER,
+    password: process.env.DB_PASS,
+    host: process.env.DB_HOST,
+    database: process.env.DB_NAME,
+    port: Number(process.env.DB_PORT || 5432),
+  };
+}
+const pool = new Pool(buildPoolConfig());
 module.exports = pool;
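The precedence in this change — a full `DATABASE_URL` wins, otherwise discrete `DB_*` variables with a port fallback — can be sketched as a standalone function (the `env` parameter here is a stand-in for `process.env`, added for testability):

```javascript
// Sketch of the config precedence above: DATABASE_URL takes priority
// over discrete DB_* variables, and DB_PORT falls back to 5432.
function buildPoolConfig(env) {
  if (env.DATABASE_URL) {
    return { connectionString: env.DATABASE_URL };
  }
  return {
    user: env.DB_USER,
    password: env.DB_PASS,
    host: env.DB_HOST,
    database: env.DB_NAME,
    port: Number(env.DB_PORT || 5432),
  };
}

// Full URL passes through untouched; discrete vars get the port default.
const withUrl = buildPoolConfig({ DATABASE_URL: "postgres://app:secret@db:5432/grocery" });
const discrete = buildPoolConfig({ DB_USER: "app", DB_HOST: "db", DB_NAME: "grocery" });
console.log(withUrl.connectionString); // postgres://app:secret@db:5432/grocery
console.log(discrete.port);           // 5432
```

Keeping the URL branch first means a single deployment variable can override the whole discrete-variable set at runtime.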

View File

@@ -1,18 +1,54 @@
 const jwt = require("jsonwebtoken");
+const { sendError } = require("../utils/http");
+const Session = require("../models/session.model");
+const { parseCookieHeader } = require("../utils/cookies");
+const { cookieName } = require("../utils/session-cookie");
+const { logError } = require("../utils/logger");
-function auth(req, res, next) {
-  const header = req.headers.authorization;
-  if (!header) return res.status(401).json({ message: "Missing token" });
-  const token = header.split(" ")[1];
-  if (!token) return res.status(401).json({ message: "Invalid token format" });
+async function auth(req, res, next) {
+  const header = req.headers.authorization || "";
+  const token = header.startsWith("Bearer ") ? header.slice(7).trim() : null;
+  if (token) {
+    const jwtSecret = process.env.JWT_SECRET;
+    if (!jwtSecret) {
+      logError(req, "middleware.auth.jwtSecretMissing", new Error("JWT_SECRET is not configured"));
+      return sendError(res, 500, "Authentication is unavailable");
+    }
+    try {
+      const decoded = jwt.verify(token, jwtSecret);
+      req.user = decoded; // id + role
+      return next();
+    } catch (err) {
+      return sendError(res, 401, "Invalid or expired token");
+    }
+  }
   try {
-    const decoded = jwt.verify(token, process.env.JWT_SECRET);
-    req.user = decoded; // id + role
-    next();
+    const cookies = parseCookieHeader(req.headers.cookie);
+    const sid = cookies[cookieName()];
+    if (!sid) {
+      return sendError(res, 401, "Missing authentication");
+    }
+    const session = await Session.getActiveSessionWithUser(sid);
+    if (!session) {
+      return sendError(res, 401, "Invalid or expired session");
+    }
+    req.user = {
+      id: session.user_id,
+      role: session.role,
+      username: session.username,
+    };
+    req.session_id = session.id;
+    return next();
   } catch (err) {
-    logError(req, "middleware.auth", err);
-    return sendError(res, 500, "Authentication check failed");
   }
 }
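The middleware above only takes the JWT path for a well-formed `Bearer <token>` header; anything else falls through to the cookie-session lookup. That extraction rule in isolation:

```javascript
// Token extraction as in the middleware above: only a well-formed
// "Bearer <token>" Authorization header yields a token; everything
// else returns null, which routes the request to the session cookie path.
function extractBearer(header) {
  const h = header || "";
  return h.startsWith("Bearer ") ? h.slice(7).trim() : null;
}

console.log(extractBearer("Bearer abc.def.ghi")); // "abc.def.ghi"
console.log(extractBearer("Basic dXNlcjpwdw=="));  // null → cookie session lookup
console.log(extractBearer(undefined));             // null → cookie session lookup
```

This ordering keeps existing JWT clients working while letting browsers authenticate purely via the session cookie.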

View File

@@ -1,4 +1,6 @@
 const householdModel = require("../models/household.model");
+const { sendError } = require("../utils/http");
+const { logError } = require("../utils/logger");
 // Middleware to check if user belongs to household
 exports.householdAccess = async (req, res, next) => {
@@ -7,16 +9,14 @@ exports.householdAccess = async (req, res, next) => {
     const userId = req.user.id;
     if (!householdId) {
-      return res.status(400).json({ error: "Household ID required" });
+      return sendError(res, 400, "Household ID required");
     }
     // Check if user is member of household
     const isMember = await householdModel.isHouseholdMember(householdId, userId);
     if (!isMember) {
-      return res.status(403).json({
-        error: "Access denied. You are not a member of this household."
-      });
+      return sendError(res, 403, "Access denied. You are not a member of this household.");
     }
     // Get user's role in household
@@ -30,8 +30,8 @@ exports.householdAccess = async (req, res, next) => {
     next();
   } catch (error) {
-    console.error("Household access check error:", error);
-    res.status(500).json({ error: "Server error checking household access" });
+    logError(req, "middleware.householdAccess", error);
+    sendError(res, 500, "Server error checking household access");
   }
 };
@@ -39,15 +39,15 @@ exports.householdAccess = async (req, res, next) => {
 exports.requireHouseholdRole = (...allowedRoles) => {
   return (req, res, next) => {
     if (!req.household) {
-      return res.status(500).json({
-        error: "Household context not set. Use householdAccess middleware first."
-      });
+      return sendError(res, 500, "Household context not set. Use householdAccess middleware first.");
     }
     if (!allowedRoles.includes(req.household.role)) {
-      return res.status(403).json({
-        error: `Access denied. Required role: ${allowedRoles.join(" or ")}. Your role: ${req.household.role}`
-      });
+      return sendError(
+        res,
+        403,
+        `Access denied. Required role: ${allowedRoles.join(" or ")}. Your role: ${req.household.role}`
+      );
     }
     next();
@@ -63,13 +63,11 @@ exports.storeAccess = async (req, res, next) => {
     const storeId = parseInt(req.params.storeId || req.params.sId);
     if (!storeId) {
-      return res.status(400).json({ error: "Store ID required" });
+      return sendError(res, 400, "Store ID required");
     }
     if (!req.household) {
-      return res.status(500).json({
-        error: "Household context not set. Use householdAccess middleware first."
-      });
+      return sendError(res, 500, "Household context not set. Use householdAccess middleware first.");
     }
     // Check if household has access to this store
@@ -77,9 +75,7 @@ exports.storeAccess = async (req, res, next) => {
     const hasStore = await storeModel.householdHasStore(req.household.id, storeId);
     if (!hasStore) {
-      return res.status(403).json({
-        error: "This household does not have access to this store."
-      });
+      return sendError(res, 403, "This household does not have access to this store.");
     }
     // Attach store info to request
@@ -89,21 +85,19 @@ exports.storeAccess = async (req, res, next) => {
     next();
   } catch (error) {
-    console.error("Store access check error:", error);
-    res.status(500).json({ error: "Server error checking store access" });
+    logError(req, "middleware.storeAccess", error);
+    sendError(res, 500, "Server error checking store access");
   }
 };
 // Middleware to require system admin role
 exports.requireSystemAdmin = (req, res, next) => {
   if (!req.user) {
-    return res.status(401).json({ error: "Authentication required" });
+    return sendError(res, 401, "Authentication required");
   }
   if (req.user.role !== 'system_admin') {
-    return res.status(403).json({
-      error: "Access denied. System administrator privileges required."
-    });
+    return sendError(res, 403, "Access denied. System administrator privileges required.");
   }
   next();

View File

@@ -1,6 +1,7 @@
 const multer = require("multer");
 const sharp = require("sharp");
 const { MAX_FILE_SIZE_BYTES, MAX_IMAGE_DIMENSION, IMAGE_QUALITY } = require("../config/constants");
+const { sendError } = require("../utils/http");
 // Configure multer for memory storage (we'll process before saving to DB)
 const upload = multer({
@@ -42,7 +43,7 @@ const processImage = async (req, res, next) => {
     next();
   } catch (error) {
-    res.status(400).json({ message: "Error processing image: " + error.message });
+    sendError(res, 400, `Error processing image: ${error.message}`);
   }
 };

View File

@@ -0,0 +1,58 @@
const { sendError } = require("../utils/http");
const buckets = new Map();
function pruneExpired(now) {
for (const [key, value] of buckets.entries()) {
if (value.resetAt <= now) {
buckets.delete(key);
}
}
}
function getClientIp(req) {
const forwardedFor = req.headers["x-forwarded-for"];
if (typeof forwardedFor === "string" && forwardedFor.trim()) {
return forwardedFor.split(",")[0].trim();
}
return req.ip || req.socket?.remoteAddress || "unknown";
}
function createRateLimit({ keyPrefix, windowMs, max, message }) {
return (req, res, next) => {
const now = Date.now();
if (buckets.size > 5000) {
pruneExpired(now);
}
const key = `${keyPrefix}:${getClientIp(req)}`;
const existing = buckets.get(key);
const bucket =
!existing || existing.resetAt <= now
? { count: 0, resetAt: now + windowMs }
: existing;
bucket.count += 1;
buckets.set(key, bucket);
if (bucket.count > max) {
const retryAfterSeconds = Math.max(
1,
Math.ceil((bucket.resetAt - now) / 1000)
);
res.setHeader("Retry-After", String(retryAfterSeconds));
return sendError(
res,
429,
message || "Too many requests. Please try again later."
);
}
return next();
};
}
module.exports = {
createRateLimit,
};
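The rate limiter added in this file is a fixed-window counter keyed by client IP. Stripped of Express and `sendError`, the bucket logic reduces to a sketch like this (`hit` is an illustrative name, not part of the module):

```javascript
// Express-free sketch of the fixed-window bucket logic above:
// a key gets `max` hits per window; the window resets `windowMs`
// after the first hit that created the bucket.
const buckets = new Map();

function hit(key, windowMs, max, now = Date.now()) {
  const existing = buckets.get(key);
  const bucket =
    !existing || existing.resetAt <= now
      ? { count: 0, resetAt: now + windowMs }
      : existing;
  bucket.count += 1;
  buckets.set(key, bucket);
  return bucket.count <= max; // false → caller should answer 429 + Retry-After
}

const t0 = 1_000_000;
console.log(hit("auth:login:1.2.3.4", 60_000, 3, t0)); // true
console.log(hit("auth:login:1.2.3.4", 60_000, 3, t0)); // true
console.log(hit("auth:login:1.2.3.4", 60_000, 3, t0)); // true
console.log(hit("auth:login:1.2.3.4", 60_000, 3, t0)); // false (limit reached)
console.log(hit("auth:login:1.2.3.4", 60_000, 3, t0 + 61_000)); // true (window reset)
```

Because state lives in an in-process `Map`, limits are per backend instance; a shared store would be needed behind multiple replicas.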

View File

@@ -1,8 +1,10 @@
+const { sendError } = require("../utils/http");
 function requireRole(...allowedRoles) {
   return (req, res, next) => {
-    if (!req.user) return res.status(401).json({ message: "Authentication required" });
+    if (!req.user) return sendError(res, 401, "Authentication required");
     if (!allowedRoles.includes(req.user.role))
-      return res.status(403).json({ message: "Forbidden" });
+      return sendError(res, 403, "Forbidden");
     next();
   };

View File

@@ -0,0 +1,47 @@
const crypto = require("crypto");
const { normalizeErrorPayload } = require("../utils/http");
function generateRequestId() {
if (typeof crypto.randomUUID === "function") {
return crypto.randomUUID();
}
return crypto.randomBytes(16).toString("hex");
}
function isPlainObject(value) {
return (
value !== null &&
typeof value === "object" &&
!Array.isArray(value) &&
Object.prototype.toString.call(value) === "[object Object]"
);
}
function requestIdMiddleware(req, res, next) {
const requestId = generateRequestId();
req.request_id = requestId;
res.locals.request_id = requestId;
res.setHeader("X-Request-Id", requestId);
const originalJson = res.json.bind(res);
res.json = (payload) => {
const normalizedPayload = normalizeErrorPayload(payload, res.statusCode);
if (isPlainObject(normalizedPayload)) {
if (normalizedPayload.request_id === undefined) {
return originalJson({ ...normalizedPayload, request_id: requestId });
}
return originalJson(normalizedPayload);
}
return originalJson({
data: normalizedPayload,
request_id: requestId,
});
};
next();
}
module.exports = requestIdMiddleware;
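The `res.json` wrapper above has two behaviors: plain objects get `request_id` merged in (unless already set), while arrays and primitives get wrapped in `{ data, request_id }`. A reduced sketch with `normalizeErrorPayload` treated as the identity function (an assumption; the real util may reshape error bodies):

```javascript
// Sketch of the res.json wrapping above, with normalizeErrorPayload
// assumed to be the identity. Objects get request_id merged in;
// non-objects get wrapped as { data, request_id }.
function wrapJson(res, requestId) {
  const originalJson = res.json.bind(res);
  res.json = (payload) => {
    const isPlainObject =
      payload !== null && typeof payload === "object" && !Array.isArray(payload);
    if (isPlainObject) {
      if (payload.request_id === undefined) {
        return originalJson({ ...payload, request_id: requestId });
      }
      return originalJson(payload);
    }
    return originalJson({ data: payload, request_id: requestId });
  };
}

const sent = [];
const fakeRes = { json: (p) => sent.push(p) }; // minimal stand-in for an Express response
wrapJson(fakeRes, "req-123");
fakeRes.json({ ok: true }); // object → request_id merged in
fakeRes.json([1, 2, 3]);    // array → wrapped as { data, request_id }
console.log(sent[0].request_id);  // "req-123"
console.log(sent[1].data.length); // 3
```

Wrapping at the `res.json` layer means every handler gets a correlation ID in its response body without touching controller code.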

View File

@@ -0,0 +1,392 @@
const pool = require("../db/pool");
function getExecutor(client) {
return client || pool;
}
async function withTransaction(handler) {
const client = await pool.connect();
try {
await client.query("BEGIN");
const result = await handler(client);
await client.query("COMMIT");
return result;
} catch (error) {
await client.query("ROLLBACK");
throw error;
} finally {
client.release();
}
}
async function getManageableGroupsForUser(userId, client) {
const result = await getExecutor(client).query(
`SELECT household_id AS group_id
FROM household_members
WHERE user_id = $1
AND role IN ('owner', 'admin')`,
[userId]
);
return result.rows;
}
async function getUserGroupRole(groupId, userId, client) {
const result = await getExecutor(client).query(
`SELECT role
FROM household_members
WHERE household_id = $1
AND user_id = $2`,
[groupId, userId]
);
return result.rows[0]?.role || null;
}
async function getGroupById(groupId, client) {
const result = await getExecutor(client).query(
`SELECT id, name
FROM households
WHERE id = $1`,
[groupId]
);
return result.rows[0] || null;
}
async function listInviteLinks(groupId, client) {
const result = await getExecutor(client).query(
`SELECT
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at
FROM group_invite_links
WHERE group_id = $1
ORDER BY created_at DESC`,
[groupId]
);
return result.rows;
}
async function createInviteLink(
{ groupId, createdBy, token, policy, singleUse, expiresAt },
client
) {
const result = await getExecutor(client).query(
`INSERT INTO group_invite_links (
group_id,
created_by,
token,
policy,
single_use,
expires_at
) VALUES ($1, $2, $3, $4, $5, $6)
RETURNING
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at`,
[groupId, createdBy, token, policy, singleUse, expiresAt]
);
return result.rows[0];
}
async function getInviteLinkById(groupId, linkId, client) {
const result = await getExecutor(client).query(
`SELECT
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at
FROM group_invite_links
WHERE group_id = $1
AND id = $2`,
[groupId, linkId]
);
return result.rows[0] || null;
}
async function revokeInviteLink(groupId, linkId, client) {
const result = await getExecutor(client).query(
`UPDATE group_invite_links
SET revoked_at = NOW()
WHERE group_id = $1
AND id = $2
RETURNING
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at`,
[groupId, linkId]
);
return result.rows[0] || null;
}
async function reviveInviteLink(groupId, linkId, expiresAt, client) {
const result = await getExecutor(client).query(
`UPDATE group_invite_links
SET used_at = NULL,
revoked_at = NULL,
expires_at = $3
WHERE group_id = $1
AND id = $2
RETURNING
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at`,
[groupId, linkId, expiresAt]
);
return result.rows[0] || null;
}
async function deleteInviteLink(groupId, linkId, client) {
const result = await getExecutor(client).query(
`DELETE FROM group_invite_links
WHERE group_id = $1
AND id = $2
RETURNING
id,
group_id,
created_by,
token,
policy,
single_use,
expires_at,
used_at,
revoked_at,
created_at`,
[groupId, linkId]
);
return result.rows[0] || null;
}
async function getInviteLinkSummaryByToken(token, client, forUpdate = false) {
const result = await getExecutor(client).query(
`SELECT
gil.id,
gil.group_id,
gil.created_by,
gil.token,
gil.policy,
gil.single_use,
gil.expires_at,
gil.used_at,
gil.revoked_at,
gil.created_at,
h.name AS group_name,
gs.join_policy AS current_join_policy
FROM group_invite_links gil
JOIN households h ON h.id = gil.group_id
LEFT JOIN group_settings gs ON gs.group_id = gil.group_id
WHERE gil.token = $1
${forUpdate ? "FOR UPDATE OF gil" : ""}`,
[token]
);
return result.rows[0] || null;
}
async function isGroupMember(groupId, userId, client) {
const result = await getExecutor(client).query(
`SELECT 1
FROM household_members
WHERE household_id = $1
AND user_id = $2`,
[groupId, userId]
);
return result.rows.length > 0;
}
async function getPendingJoinRequest(groupId, userId, client) {
const result = await getExecutor(client).query(
`SELECT id, group_id, user_id, status, created_at, updated_at
FROM group_join_requests
WHERE group_id = $1
AND user_id = $2
AND status = 'PENDING'`,
[groupId, userId]
);
return result.rows[0] || null;
}
async function createOrTouchPendingJoinRequest(groupId, userId, client) {
const executor = getExecutor(client);
const existing = await executor.query(
`UPDATE group_join_requests
SET updated_at = NOW()
WHERE group_id = $1
AND user_id = $2
AND status = 'PENDING'
RETURNING id, group_id, user_id, status, created_at, updated_at`,
[groupId, userId]
);
if (existing.rows[0]) {
return existing.rows[0];
}
try {
const inserted = await executor.query(
`INSERT INTO group_join_requests (group_id, user_id, status)
VALUES ($1, $2, 'PENDING')
RETURNING id, group_id, user_id, status, created_at, updated_at`,
[groupId, userId]
);
return inserted.rows[0];
} catch (error) {
if (error.code !== "23505") {
throw error;
}
const fallback = await executor.query(
`SELECT id, group_id, user_id, status, created_at, updated_at
FROM group_join_requests
WHERE group_id = $1
AND user_id = $2
AND status = 'PENDING'
LIMIT 1`,
[groupId, userId]
);
return fallback.rows[0] || null;
}
}
async function addGroupMember(groupId, userId, role = "member", client) {
const result = await getExecutor(client).query(
`INSERT INTO household_members (household_id, user_id, role)
VALUES ($1, $2, $3)
ON CONFLICT (household_id, user_id) DO NOTHING
RETURNING id`,
[groupId, userId, role]
);
return result.rows.length > 0;
}
async function consumeSingleUseInvite(linkId, client) {
const result = await getExecutor(client).query(
`UPDATE group_invite_links
SET used_at = NOW(),
revoked_at = NOW()
WHERE id = $1
RETURNING id`,
[linkId]
);
return result.rows.length > 0;
}
async function getGroupSettings(groupId, client) {
const result = await getExecutor(client).query(
`SELECT group_id, join_policy
FROM group_settings
WHERE group_id = $1`,
[groupId]
);
return result.rows[0] || null;
}
async function upsertGroupSettings(groupId, joinPolicy, client) {
const result = await getExecutor(client).query(
`INSERT INTO group_settings (group_id, join_policy)
VALUES ($1, $2)
ON CONFLICT (group_id)
DO UPDATE SET
join_policy = EXCLUDED.join_policy,
updated_at = NOW()
RETURNING group_id, join_policy`,
[groupId, joinPolicy]
);
return result.rows[0];
}
async function createGroupAuditLog(
{
groupId,
actorUserId,
actorRole,
eventType,
requestId,
ip,
userAgent,
success = true,
errorCode = null,
metadata = {},
},
client
) {
const result = await getExecutor(client).query(
`INSERT INTO group_audit_log (
group_id,
actor_user_id,
actor_role,
event_type,
request_id,
ip,
user_agent,
success,
error_code,
metadata
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10::jsonb)
RETURNING id`,
[
groupId,
actorUserId,
actorRole,
eventType,
requestId,
ip,
userAgent,
success,
errorCode,
JSON.stringify(metadata || {}),
]
);
return result.rows[0];
}
module.exports = {
addGroupMember,
createGroupAuditLog,
createInviteLink,
createOrTouchPendingJoinRequest,
consumeSingleUseInvite,
deleteInviteLink,
getGroupById,
getGroupSettings,
getInviteLinkById,
getInviteLinkSummaryByToken,
getManageableGroupsForUser,
getPendingJoinRequest,
getUserGroupRole,
isGroupMember,
listInviteLinks,
revokeInviteLink,
reviveInviteLink,
upsertGroupSettings,
withTransaction,
};
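The columns this model selects (`revoked_at`, `used_at`, `single_use`, `expires_at`) together determine whether an invite link can still be redeemed. A hypothetical helper summarizing that check — not part of the model itself, just the implied rules in one place:

```javascript
// Hypothetical helper (illustration only): an invite row is redeemable
// unless it is revoked, already consumed (when single-use), or expired.
function isInviteRedeemable(link, now = new Date()) {
  if (!link) return false;
  if (link.revoked_at) return false;
  if (link.single_use && link.used_at) return false;
  if (link.expires_at && new Date(link.expires_at) <= now) return false;
  return true;
}

const now = new Date("2026-02-21T00:00:00Z");
console.log(isInviteRedeemable(
  { revoked_at: null, used_at: null, single_use: true, expires_at: null }, now)); // true
console.log(isInviteRedeemable(
  { revoked_at: null, used_at: "2026-02-20", single_use: true, expires_at: null }, now)); // false
```

Note that `consumeSingleUseInvite` sets both `used_at` and `revoked_at`, so a consumed link fails this check on two counts, and `reviveInviteLink` clears both to make the same row redeemable again.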

View File

@@ -18,14 +18,14 @@ exports.getHouseholdStoreList = async (householdId, storeId, includeHistory = tr
       hl.custom_image_mime_type as image_mime_type,
       ${includeHistory ? `
       (
-        SELECT ARRAY_AGG(DISTINCT u.name)
+        SELECT ARRAY_AGG(added_by_labels.user_label ORDER BY added_by_labels.user_label)
         FROM (
-          SELECT DISTINCT hlh.added_by
+          SELECT DISTINCT
+            COALESCE(NULLIF(TRIM(u.display_name), ''), NULLIF(TRIM(u.name), ''), u.username) AS user_label
           FROM household_list_history hlh
+          JOIN users u ON hlh.added_by = u.id
           WHERE hlh.household_list_id = hl.id
-          ORDER BY hlh.added_by
-        ) hlh
-        JOIN users u ON hlh.added_by = u.id
+        ) added_by_labels
       ) as added_by_users,
       ` : 'NULL as added_by_users,'}
       hl.modified_on as last_added_on,
@@ -74,14 +74,14 @@ exports.getItemByName = async (householdId, storeId, itemName) => {
       ENCODE(hl.custom_image, 'base64') as item_image,
       hl.custom_image_mime_type as image_mime_type,
       (
-        SELECT ARRAY_AGG(DISTINCT u.name)
+        SELECT ARRAY_AGG(added_by_labels.user_label ORDER BY added_by_labels.user_label)
         FROM (
-          SELECT DISTINCT hlh.added_by
+          SELECT DISTINCT
+            COALESCE(NULLIF(TRIM(u.display_name), ''), NULLIF(TRIM(u.name), ''), u.username) AS user_label
           FROM household_list_history hlh
+          JOIN users u ON hlh.added_by = u.id
           WHERE hlh.household_list_id = hl.id
-          ORDER BY hlh.added_by
-        ) hlh
-        JOIN users u ON hlh.added_by = u.id
+        ) added_by_labels
       ) as added_by_users,
       hl.modified_on as last_added_on,
       hic.item_type,
@@ -97,7 +97,6 @@ exports.getItemByName = async (householdId, storeId, itemName) => {
       AND hl.item_id = $3`,
     [householdId, storeId, itemId]
   );
-  console.log(result.rows);
   return result.rows[0] || null;
 };
@@ -290,14 +289,14 @@ exports.getRecentlyBoughtItems = async (householdId, storeId) => {
       ENCODE(hl.custom_image, 'base64') as item_image,
       hl.custom_image_mime_type as image_mime_type,
       (
-        SELECT ARRAY_AGG(DISTINCT u.name)
+        SELECT ARRAY_AGG(added_by_labels.user_label ORDER BY added_by_labels.user_label)
         FROM (
-          SELECT DISTINCT hlh.added_by
+          SELECT DISTINCT
+            COALESCE(NULLIF(TRIM(u.display_name), ''), NULLIF(TRIM(u.name), ''), u.username) AS user_label
           FROM household_list_history hlh
+          JOIN users u ON hlh.added_by = u.id
           WHERE hlh.household_list_id = hl.id
-          ORDER BY hlh.added_by
-        ) hlh
-        JOIN users u ON hlh.added_by = u.id
+        ) added_by_labels
       ) as added_by_users,
       hl.modified_on as last_added_on
     FROM household_lists hl
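The `COALESCE(NULLIF(TRIM(...)))` chain in the SQL above picks the first non-blank label per user: display name, then legacy name, then username. Its JavaScript equivalent makes the fallback order easy to verify:

```javascript
// JS equivalent of the COALESCE/NULLIF/TRIM fallback chain in the SQL above:
// first non-blank of display_name → name → username.
function userLabel(u) {
  const nonBlank = (s) => (typeof s === "string" && s.trim() ? s.trim() : null);
  return nonBlank(u.display_name) || nonBlank(u.name) || u.username;
}

console.log(userLabel({ display_name: "  ", name: "Nico", username: "nico1" })); // "Nico"
console.log(userLabel({ display_name: null, name: "", username: "nico1" }));     // "nico1"
console.log(userLabel({ display_name: "Nico D.", name: "Nico", username: "nico1" })); // "Nico D."
```

The `NULLIF(TRIM(...), '')` step is what keeps whitespace-only display names from shadowing the fallbacks, matching the first snippet above.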

View File

@@ -0,0 +1,123 @@
const crypto = require("crypto");
const pool = require("../db/pool");
const { SESSION_TTL_DAYS } = require("../utils/session-cookie");
const INSERT_SESSION_SQL = `INSERT INTO sessions (id, user_id, expires_at, user_agent)
VALUES ($1, $2, NOW() + ($3 || ' days')::interval, $4)
RETURNING id, user_id, created_at, expires_at`;
const SELECT_ACTIVE_SESSION_SQL = `SELECT
s.id,
s.user_id,
s.expires_at,
u.username,
u.role
FROM sessions s
JOIN users u ON u.id = s.user_id
WHERE s.id = $1
AND s.expires_at > NOW()`;
let ensureSessionsTablePromise = null;
function generateSessionId() {
if (typeof crypto.randomUUID === "function") {
return crypto.randomUUID().replace(/-/g, "") + crypto.randomBytes(8).toString("hex");
}
return crypto.randomBytes(32).toString("hex");
}
function isUndefinedTableError(error) {
return error && error.code === "42P01";
}
async function ensureSessionsTable() {
if (!ensureSessionsTablePromise) {
ensureSessionsTablePromise = (async () => {
await pool.query(`CREATE TABLE IF NOT EXISTS sessions (
id VARCHAR(128) PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
expires_at TIMESTAMPTZ NOT NULL,
last_seen_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
user_agent TEXT
);`);
await pool.query(
"CREATE INDEX IF NOT EXISTS idx_sessions_user_id ON sessions(user_id);"
);
await pool.query(
"CREATE INDEX IF NOT EXISTS idx_sessions_expires_at ON sessions(expires_at);"
);
})().catch((error) => {
ensureSessionsTablePromise = null;
throw error;
});
}
await ensureSessionsTablePromise;
}
async function insertSession(id, userId, userAgent) {
const result = await pool.query(INSERT_SESSION_SQL, [
id,
userId,
String(SESSION_TTL_DAYS),
userAgent,
]);
return result.rows[0];
}
exports.createSession = async (userId, userAgent = null) => {
const id = generateSessionId();
try {
return await insertSession(id, userId, userAgent);
} catch (error) {
if (!isUndefinedTableError(error)) {
throw error;
}
await ensureSessionsTable();
return insertSession(id, userId, userAgent);
}
};
exports.getActiveSessionWithUser = async (sessionId) => {
let result;
try {
result = await pool.query(SELECT_ACTIVE_SESSION_SQL, [sessionId]);
} catch (error) {
if (isUndefinedTableError(error)) {
return null;
}
throw error;
}
const session = result.rows[0] || null;
if (!session) return null;
try {
await pool.query(
`UPDATE sessions
SET last_seen_at = NOW()
WHERE id = $1`,
[sessionId]
);
} catch (error) {
if (!isUndefinedTableError(error)) {
throw error;
}
}
return session;
};
exports.deleteSession = async (sessionId) => {
try {
await pool.query(
`DELETE FROM sessions WHERE id = $1`,
[sessionId]
);
} catch (error) {
if (!isUndefinedTableError(error)) {
throw error;
}
}
};

View File

@@ -6,17 +6,16 @@ exports.ROLES = {
 }
 exports.findByUsername = async (username) => {
-  query = `SELECT * FROM users WHERE username = ${username}`;
   const result = await pool.query("SELECT * FROM users WHERE username = $1", [username]);
-  console.log(query);
   return result.rows[0];
 };
 exports.createUser = async (username, hashedPassword, name) => {
   const result = await pool.query(
     `INSERT INTO users (username, password, name, role)
-     VALUES ($1, $2, $3, $4)`,
-    [username, hashedPassword, name, this.ROLES.VIEWER]
+     VALUES ($1, $2, $3, $4)
+     RETURNING id, username, name, role`,
+    [username, hashedPassword, name, exports.ROLES.USER]
   );
   return result.rows[0];
 };


@@ -1,13 +1,30 @@
 const router = require("express").Router();
 const controller = require("../controllers/auth.controller");
 const User = require("../models/user.model");
+const { createRateLimit } = require("../middleware/rate-limit");

-router.post("/register", controller.register);
-router.post("/login", controller.login);
+const loginRateLimit = createRateLimit({
+  keyPrefix: "auth:login",
+  windowMs: 15 * 60 * 1000,
+  max: 25,
+  message: "Too many login attempts. Please try again later.",
+});
+
+const registerRateLimit = createRateLimit({
+  keyPrefix: "auth:register",
+  windowMs: 15 * 60 * 1000,
+  max: 10,
+  message: "Too many registration attempts. Please try again later.",
+});
+
+router.post("/register", registerRateLimit, controller.register);
+router.post("/login", loginRateLimit, controller.login);
+router.post("/logout", controller.logout);

 router.post("/", async (req, res) => {
-  resText = `Grocery List API is running.\n` +
-    `Roles available: ${Object.values(User.ROLES).join(', ')}`
-  res.status(200).type("text/plain").send(resText);
+  res.status(200).json({
+    message: "Auth API is running.",
+    roles: Object.values(User.ROLES),
+  });
 });

 module.exports = router;


@@ -3,9 +3,19 @@ const auth = require("../middleware/auth");
 const requireRole = require("../middleware/rbac");
 const usersController = require("../controllers/users.controller");
 const { ROLES } = require("../models/user.model");
+const { createRateLimit } = require("../middleware/rate-limit");

-router.get("/exists", usersController.checkIfUserExists);
-router.get("/test", usersController.test);
+const userExistsRateLimit = createRateLimit({
+  keyPrefix: "users:exists",
+  windowMs: 15 * 60 * 1000,
+  max: 60,
+  message: "Too many availability checks. Please try again later.",
+});
+
+router.get("/exists", userExistsRateLimit, usersController.checkIfUserExists);
+
+if (process.env.NODE_ENV !== "production") {
+  router.get("/test", usersController.test);
+}

 // Current user profile routes (authenticated)
 router.get("/me", auth, usersController.getCurrentUser);


@@ -0,0 +1,101 @@
jest.mock("../models/list.model.v2", () => ({
addHistoryRecord: jest.fn(),
addOrUpdateItem: jest.fn(),
}));
jest.mock("../models/household.model", () => ({
isHouseholdMember: jest.fn(),
}));
jest.mock("../utils/logger", () => ({
logError: jest.fn(),
}));
const List = require("../models/list.model.v2");
const householdModel = require("../models/household.model");
const controller = require("../controllers/lists.controller.v2");
function createResponse() {
const res = {};
res.status = jest.fn().mockReturnValue(res);
res.json = jest.fn().mockReturnValue(res);
return res;
}
describe("lists.controller.v2 addItem", () => {
beforeEach(() => {
List.addOrUpdateItem.mockResolvedValue({
listId: 42,
itemName: "milk",
isNew: true,
});
List.addHistoryRecord.mockResolvedValue(undefined);
householdModel.isHouseholdMember.mockResolvedValue(true);
});
test("records history for selected added_for_user_id when member is valid", async () => {
const req = {
params: { householdId: "1", storeId: "2" },
body: { item_name: "milk", quantity: "1", added_for_user_id: "9" },
user: { id: 7 },
processedImage: null,
};
const res = createResponse();
await controller.addItem(req, res);
expect(householdModel.isHouseholdMember).toHaveBeenCalledWith("1", 9);
expect(List.addOrUpdateItem).toHaveBeenCalled();
expect(List.addHistoryRecord).toHaveBeenCalledWith(42, "1", 9);
expect(res.status).not.toHaveBeenCalledWith(400);
});
test("rejects invalid added_for_user_id", async () => {
const req = {
params: { householdId: "1", storeId: "2" },
body: { item_name: "milk", quantity: "1", added_for_user_id: "abc" },
user: { id: 7 },
processedImage: null,
};
const res = createResponse();
await controller.addItem(req, res);
expect(List.addOrUpdateItem).not.toHaveBeenCalled();
expect(List.addHistoryRecord).not.toHaveBeenCalled();
expect(res.status).toHaveBeenCalledWith(400);
expect(res.json).toHaveBeenCalledWith(
expect.objectContaining({
error: expect.objectContaining({
message: "Added-for user ID must be a positive integer",
}),
})
);
});
test("rejects added_for_user_id when target user is not household member", async () => {
householdModel.isHouseholdMember.mockResolvedValue(false);
const req = {
params: { householdId: "1", storeId: "2" },
body: { item_name: "milk", quantity: "1", added_for_user_id: "11" },
user: { id: 7 },
processedImage: null,
};
const res = createResponse();
await controller.addItem(req, res);
expect(householdModel.isHouseholdMember).toHaveBeenCalledWith("1", 11);
expect(List.addOrUpdateItem).not.toHaveBeenCalled();
expect(List.addHistoryRecord).not.toHaveBeenCalled();
expect(res.status).toHaveBeenCalledWith(400);
expect(res.json).toHaveBeenCalledWith(
expect.objectContaining({
error: expect.objectContaining({
message: "Selected user is not a member of this household",
}),
})
);
});
});

backend/utils/cookies.js Normal file

@@ -0,0 +1,25 @@
function parseCookieHeader(cookieHeader) {
const cookies = {};
if (!cookieHeader || typeof cookieHeader !== "string") return cookies;
const segments = cookieHeader.split(";");
for (const segment of segments) {
const index = segment.indexOf("=");
if (index === -1) continue;
const key = segment.slice(0, index).trim();
const value = segment.slice(index + 1).trim();
if (!key) continue;
try {
cookies[key] = decodeURIComponent(value);
} catch (_) {
// Ignore malformed cookie values instead of throwing.
continue;
}
}
return cookies;
}
module.exports = {
parseCookieHeader,
};
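For reference, the parser above can be exercised standalone; the cookie names and values below are made-up examples:

```javascript
// parseCookieHeader as defined above, restated so the example runs on its own.
function parseCookieHeader(cookieHeader) {
  const cookies = {};
  if (!cookieHeader || typeof cookieHeader !== "string") return cookies;
  const segments = cookieHeader.split(";");
  for (const segment of segments) {
    const index = segment.indexOf("=");
    if (index === -1) continue;
    const key = segment.slice(0, index).trim();
    const value = segment.slice(index + 1).trim();
    if (!key) continue;
    try {
      cookies[key] = decodeURIComponent(value);
    } catch (_) {
      // Malformed percent-encoding: skip the value instead of throwing.
      continue;
    }
  }
  return cookies;
}

console.log(parseCookieHeader("sid=abc123; theme=dark"));
// → { sid: 'abc123', theme: 'dark' }
console.log(parseCookieHeader("sid=%E2%9C%93; bad=%ZZ"));
// → { sid: '✓' } (the malformed %ZZ value is dropped)
```

Note that `decodeURIComponent` throws `URIError` on invalid sequences, which is why the try/catch is load-bearing: one bad cookie from a client does not break parsing of the rest.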

backend/utils/http.js Normal file

@@ -0,0 +1,116 @@
function isPlainObject(value) {
return (
value !== null &&
typeof value === "object" &&
!Array.isArray(value) &&
Object.prototype.toString.call(value) === "[object Object]"
);
}
function errorCodeFromStatus(statusCode) {
switch (statusCode) {
case 400:
return "bad_request";
case 401:
return "unauthorized";
case 403:
return "forbidden";
case 404:
return "not_found";
case 409:
return "conflict";
case 413:
return "payload_too_large";
case 415:
return "unsupported_media_type";
case 422:
return "unprocessable_entity";
case 429:
return "rate_limited";
case 500:
return "internal_error";
default:
return statusCode >= 500 ? "internal_error" : "request_error";
}
}
function normalizeErrorPayload(payload, statusCode) {
if (statusCode < 400) return payload;
if (typeof payload === "string") {
return {
error: {
code: errorCodeFromStatus(statusCode),
message: payload,
},
};
}
if (!isPlainObject(payload)) {
return {
error: {
code: errorCodeFromStatus(statusCode),
message: "Request failed",
},
};
}
if (isPlainObject(payload.error)) {
const code = payload.error.code || errorCodeFromStatus(statusCode);
const message = payload.error.message || "Request failed";
return {
...payload,
error: {
...payload.error,
code,
message,
},
};
}
if (typeof payload.error === "string") {
const { error, ...rest } = payload;
return {
...rest,
error: {
code: errorCodeFromStatus(statusCode),
message: error,
},
};
}
if (typeof payload.message === "string") {
const { message, ...rest } = payload;
return {
...rest,
error: {
code: errorCodeFromStatus(statusCode),
message,
},
};
}
return {
...payload,
error: {
code: errorCodeFromStatus(statusCode),
message: "Request failed",
},
};
}
function sendError(res, statusCode, message, code, extra = {}) {
return res.status(statusCode).json({
...extra,
error: {
code: code || errorCodeFromStatus(statusCode),
message,
},
});
}
module.exports = {
errorCodeFromStatus,
normalizeErrorPayload,
sendError,
};
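A quick sketch of the response shape `sendError` produces, using a stub in place of Express's chainable `res`. The stub, the `request_id` value, and the trimmed-down `errorCodeFromStatus` are illustrative only:

```javascript
// errorCodeFromStatus trimmed to the cases used in this sketch.
function errorCodeFromStatus(statusCode) {
  switch (statusCode) {
    case 400: return "bad_request";
    case 404: return "not_found";
    case 429: return "rate_limited";
    default: return statusCode >= 500 ? "internal_error" : "request_error";
  }
}

// sendError as defined above: extra fields spread at the top level,
// error code/message nested under "error".
function sendError(res, statusCode, message, code, extra = {}) {
  return res.status(statusCode).json({
    ...extra,
    error: {
      code: code || errorCodeFromStatus(statusCode),
      message,
    },
  });
}

// Minimal stub mimicking Express's chainable response API.
function createStubRes() {
  const res = { statusCode: null, body: null };
  res.status = (code) => { res.statusCode = code; return res; };
  res.json = (body) => { res.body = body; return res; };
  return res;
}

const res = createStubRes();
sendError(res, 404, "Item not found", null, { request_id: "req-1" });
console.log(res.statusCode, res.body);
// → 404 { request_id: 'req-1', error: { code: 'not_found', message: 'Item not found' } }
```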

backend/utils/logger.js Normal file

@@ -0,0 +1,20 @@
const { safeErrorMessage } = require("./redaction");
function formatExtra(extra = {}) {
return Object.entries(extra)
.filter(([, value]) => value !== undefined && value !== null && value !== "")
.map(([key, value]) => `${key}=${String(value)}`)
.join(" ");
}
function logError(req, context, error, extra = {}) {
const requestId = req?.request_id || "unknown";
const message = safeErrorMessage(error);
const extraText = formatExtra(extra);
const suffix = extraText ? ` ${extraText}` : "";
console.error(`[${context}] request_id=${requestId} message=${message}${suffix}`);
}
module.exports = {
logError,
};


@@ -0,0 +1,20 @@
function inviteCodeLast4(inviteCode) {
if (!inviteCode || typeof inviteCode !== "string") return "none";
const trimmed = inviteCode.trim();
if (!trimmed) return "none";
return trimmed.slice(-4);
}
function safeErrorMessage(error) {
if (!error) return "unknown_error";
if (typeof error === "string") return error;
if (typeof error.message === "string" && error.message.trim()) {
return error.message;
}
return "unknown_error";
}
module.exports = {
inviteCodeLast4,
safeErrorMessage,
};
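Example behavior of the last4 redaction policy (the invite code shown is a made-up value):

```javascript
// inviteCodeLast4 as defined above, restated so the example runs on its own.
function inviteCodeLast4(inviteCode) {
  if (!inviteCode || typeof inviteCode !== "string") return "none";
  const trimmed = inviteCode.trim();
  if (!trimmed) return "none";
  return trimmed.slice(-4);
}

console.log(inviteCodeLast4("HH-7F3K-29QX")); // returns "29QX"
console.log(inviteCodeLast4("   "));          // returns "none"
console.log(inviteCodeLast4(null));           // returns "none"
```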


@@ -0,0 +1,36 @@
const SESSION_COOKIE_NAME = process.env.SESSION_COOKIE_NAME || "sid";
const SESSION_TTL_DAYS = Number(process.env.SESSION_TTL_DAYS || 30);
function sessionMaxAgeMs() {
return SESSION_TTL_DAYS * 24 * 60 * 60 * 1000;
}
function cookieName() {
return SESSION_COOKIE_NAME;
}
function setSessionCookie(res, sessionId) {
res.cookie(cookieName(), sessionId, {
httpOnly: true,
secure: process.env.NODE_ENV === "production",
sameSite: "lax",
path: "/",
maxAge: sessionMaxAgeMs(),
});
}
function clearSessionCookie(res) {
res.clearCookie(cookieName(), {
httpOnly: true,
secure: process.env.NODE_ENV === "production",
sameSite: "lax",
path: "/",
});
}
module.exports = {
SESSION_TTL_DAYS,
clearSessionCookie,
cookieName,
setSessionCookie,
};
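One subtlety in the TTL parsing above: `Number(process.env.SESSION_TTL_DAYS || 30)` only falls back for unset or empty values; a non-numeric value still yields NaN. A minimal sketch (`resolveTtlDays` is an illustrative helper, not part of the module):

```javascript
// Mirrors the fallback expression used for SESSION_TTL_DAYS above.
function resolveTtlDays(envValue) {
  return Number(envValue || 30);
}

console.log(resolveTtlDays(undefined));            // returns 30
console.log(resolveTtlDays("7"));                  // returns 7
console.log(Number.isNaN(resolveTtlDays("soon"))); // returns true (worth validating at startup)
```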


@@ -9,7 +9,7 @@ services:
       - ./frontend:/app
       - frontend_node_modules:/app/node_modules
     ports:
-      - "3000:5173"
+      - "3010:5173"
     depends_on:
       - backend
     restart: always


@@ -0,0 +1,49 @@
# Agentic Contract Map (Current Stack)
This file maps `PROJECT_INSTRUCTIONS.md` architecture intent to the current repository stack.
## Current stack
- Backend: Express (`backend/`)
- Frontend: React + Vite (`frontend/`)
## Contract mapping
### API Route Handlers (`app/api/**/route.ts` intent)
Current equivalent:
- `backend/routes/*.js`
- `backend/controllers/*.js`
Expectation:
- Keep these thin for parsing/validation and response shape.
- Delegate DB and authorization-heavy logic to model/service layers.
### Server Services (`lib/server/*` intent)
Current equivalent:
- `backend/models/*.js`
- `backend/middleware/*.js`
- `backend/db/*`
Expectation:
- Concentrate DB access and authorization logic in these backend layers.
- Avoid raw DB usage directly in route files unless no service/model exists.
### Client Wrappers (`lib/client/*` intent)
Current equivalent:
- `frontend/src/api/*.js`
Expectation:
- Centralize fetch/axios calls and error normalization here.
- Always send credentials/authorization headers as required.
### Hooks (`hooks/use-*.ts` intent)
Current equivalent:
- `frontend/src/context/*`
- `frontend/src/utils/*` for route guards
Expectation:
- Keep components free of direct raw network calls where possible.
- Favor one canonical state propagation mechanism per concern.
## Notes
- This map does not force a framework migration.
- It defines how to apply the contract consistently in the existing codebase.


@@ -0,0 +1,51 @@
# DB Migration Workflow (External Postgres)
This project uses an external on-prem Postgres database. Migration files are canonical in:
- `packages/db/migrations`
## Preconditions
- `DATABASE_URL` is set and points to the on-prem Postgres instance.
- `psql` is installed and available in PATH.
- You are in repo root.
## Commands
- Apply pending migrations:
- `npm run db:migrate`
- Show migration status:
- `npm run db:migrate:status`
- Fail if pending migrations exist:
- `npm run db:migrate:verify`
## Active migration set
Migration files are applied in lexicographic filename order from `packages/db/migrations`.
Current baseline files:
- `add_display_name_column.sql`
- `add_image_columns.sql`
- `add_modified_on_column.sql`
- `add_notes_column.sql`
- `create_item_classification_table.sql`
- `create_sessions_table.sql`
- `multi_household_architecture.sql`
## Tracking table
Applied migrations are recorded in:
- `schema_migrations(filename text unique, applied_at timestamptz)`
## Expected operator flow
1. Check status:
- `npm run db:migrate:status`
2. Apply pending:
- `npm run db:migrate`
3. Verify clean state:
- `npm run db:migrate:verify`
## Troubleshooting
- `DATABASE_URL is required`:
- Export/set `DATABASE_URL` in your environment.
- `psql executable was not found in PATH`:
- Install PostgreSQL client tools and retry.
- SQL failure:
- Fix migration SQL and rerun; only successful files are recorded in `schema_migrations`.
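The ordering and tracking rules above reduce to a simple selection step: sort lexicographically, drop anything already in `schema_migrations`. A hedged sketch of that logic (not the actual runner script; `pendingMigrations` is an illustrative helper):

```javascript
// Given all migration filenames and the set already recorded in
// schema_migrations, return the pending ones in apply order.
function pendingMigrations(allFiles, appliedFilenames) {
  const applied = new Set(appliedFilenames);
  return [...allFiles].sort().filter((file) => !applied.has(file));
}

// Filenames mirror the baseline list above.
const files = [
  "multi_household_architecture.sql",
  "add_image_columns.sql",
  "create_sessions_table.sql",
];
console.log(pendingMigrations(files, ["add_image_columns.sql"]));
// → [ 'create_sessions_table.sql', 'multi_household_architecture.sql' ]
```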


@@ -0,0 +1,57 @@
# Project State Audit - Fiddy
Snapshot date: 2026-02-16
## 1) Confirmed stack and structure
- Backend: Express API in `backend/` with `routes/`, `controllers/`, `models/`, `middleware/`, `utils/`.
- Frontend: React + Vite in `frontend/` with API wrappers in `frontend/src/api`, auth/state in `frontend/src/context`, pages in `frontend/src/pages`.
- DB migrations: canonical folder is `packages/db/migrations`.
## 2) Governance and agentic setup status
- Present and aligned:
- `PROJECT_INSTRUCTIONS.md`
- `AGENTS.md`
- `DEBUGGING_INSTRUCTIONS.md`
- `docs/DB_MIGRATION_WORKFLOW.md`
- `docs/AGENTIC_CONTRACT_MAP.md`
- Commit discipline added in `PROJECT_INSTRUCTIONS.md` section 12 and being followed with small conventional commits.
## 3) Current implementation status vs vertical-slice goals
1. DB migrate command + schema:
- Implemented: root scripts `db:migrate`, `db:migrate:status`, `db:migrate:verify`.
- Implemented: migration tracking + runbook.
2. Register/Login/Logout (custom sessions):
- Implemented: DB sessions table migration (`create_sessions_table.sql`).
- Implemented: session model, HttpOnly cookie set/clear, `/auth/logout`, auth middleware fallback to DB session cookie.
- Implemented: frontend credentialed API (`withCredentials`), logout route call.
3. Protected dashboard page:
- Partially implemented via existing `PrivateRoute` token gate.
4. Group create/join + switcher:
- Existing household create/join/switch flow exists but does not yet match all group-policy requirements.
5. Entries CRUD:
- Existing list CRUD exists in legacy and multi-household paths.
6. Receipt upload/download endpoints:
- Not implemented as dedicated receipt domain/endpoints.
7. Settings + Reports:
- Settings page exists; reporting is not fully formalized.
## 4) Contract gaps and risks
- `DATABASE_URL` is now supported in runtime pool config, but local operator environment still needs this variable configured.
- No automated test suite currently exercises the new auth/session behavior; API behavior is mostly validated by static/lint checks.
- Group policy requirements (owner role, join policy states, invite lifecycle constraints, revive semantics) are not fully implemented.
- No explicit audit log persistence layer verified for invite events/request IDs.
- Encoding cleanliness needs ongoing watch; historical mojibake appears in some UI text/log strings.
## 5) Recommended next implementation order
1. Finalize auth session contract:
- Add authenticated session introspection endpoint (`/users/me` already exists) to support cookie-only bootstrapping if token absent.
- Update frontend auth bootstrap so protected routes work with DB session cookie as canonical auth.
2. Add explicit API tests (auth + households/list negative cases):
- unauthorized
- not-a-member
- invalid input
3. Implement group-policy requirements incrementally:
- owner role migration + policy enums
- invite policy and immutable settings
- approval-required flow + revive/single-use semantics
4. Add dedicated receipt domain endpoints (metadata list vs byte retrieval split) if the product scope requires the receipt contract verbatim.


@@ -39,7 +39,10 @@ Historical documentation of completed features. Useful for reference but not act
 These files remain at the project root for easy access:

 - **[../README.md](../README.md)** - Project overview and quick start
-- **[../.github/copilot-instructions.md](../.github/copilot-instructions.md)** - AI assistant instructions (architecture, RBAC, conventions)
+- **[../PROJECT_INSTRUCTIONS.md](../PROJECT_INSTRUCTIONS.md)** - Canonical project constraints and delivery contract
+- **[../AGENTS.md](../AGENTS.md)** - Agent behavior and guardrails
+- **[../DEBUGGING_INSTRUCTIONS.md](../DEBUGGING_INSTRUCTIONS.md)** - Required bugfix workflow
+- **[../.github/copilot-instructions.md](../.github/copilot-instructions.md)** - Copilot compatibility shim to root instructions

 ---
@@ -51,9 +54,9 @@ These files remain at the project root for easy access:
 **Working on mobile UI?** → Check [MOBILE_RESPONSIVE_AUDIT.md](guides/MOBILE_RESPONSIVE_AUDIT.md)

-**Need architecture context?** → Read [../.github/copilot-instructions.md](../.github/copilot-instructions.md)
+**Need architecture context?** → Read [AGENTIC_CONTRACT_MAP.md](AGENTIC_CONTRACT_MAP.md) and [../PROJECT_INSTRUCTIONS.md](../PROJECT_INSTRUCTIONS.md)

-**Running migrations?** → Follow [MIGRATION_GUIDE.md](migration/MIGRATION_GUIDE.md)
+**Running migrations?** → Follow [DB_MIGRATION_WORKFLOW.md](DB_MIGRATION_WORKFLOW.md)

 ---


@@ -9,3 +9,8 @@ export const registerRequest = async (username, password, name) => {
   const res = await api.post("/auth/register", { username, password, name });
   return res.data;
 };
+
+export const logoutRequest = async () => {
+  const res = await api.post("/auth/logout");
+  return res.data;
+};


@@ -3,6 +3,7 @@ import { API_BASE_URL } from "../config";

 const api = axios.create({
   baseURL: API_BASE_URL,
+  withCredentials: true,
   headers: {
     "Content-Type": "application/json",
   },
@@ -17,10 +18,39 @@ api.interceptors.request.use((config => {
 }));

 api.interceptors.response.use(
-  response => response,
+  response => {
+    const payload = response.data;
+    if (
+      payload &&
+      typeof payload === "object" &&
+      !Array.isArray(payload) &&
+      Object.keys(payload).length === 2 &&
+      Object.prototype.hasOwnProperty.call(payload, "data") &&
+      Object.prototype.hasOwnProperty.call(payload, "request_id")
+    ) {
+      response.request_id = payload.request_id;
+      response.data = payload.data;
+    }
+    return response;
+  },
   error => {
-    if (error.response?.status === 401 &&
-      error.response?.data?.message === "Invalid or expired token") {
+    const payload = error.response?.data;
+    const normalizedMessage = payload?.error?.message || payload?.message;
+    if (payload?.error?.message && payload.message === undefined) {
+      payload.message = payload.error.message;
+    }
+    if (
+      error.response?.status === 401 &&
+      window.location.pathname !== "/login" &&
+      window.location.pathname !== "/register" &&
+      [
+        "Invalid or expired token",
+        "Invalid or expired session",
+        "Missing authentication",
+      ].includes(normalizedMessage)
+    ) {
       localStorage.removeItem("token");
       window.location.href = "/login";
       alert("Your session has expired. Please log in again.");


@@ -17,13 +17,24 @@ export const getItemByName = (householdId, storeId, itemName) =>
 /**
  * Add item to list
  */
-export const addItem = (householdId, storeId, itemName, quantity, imageFile = null, notes = null) => {
+export const addItem = (
+  householdId,
+  storeId,
+  itemName,
+  quantity,
+  imageFile = null,
+  notes = null,
+  addedForUserId = null
+) => {
   const formData = new FormData();
   formData.append("item_name", itemName);
   formData.append("quantity", quantity);
   if (notes) {
     formData.append("notes", notes);
   }
+  if (addedForUserId != null) {
+    formData.append("added_for_user_id", addedForUserId);
+  }
   if (imageFile) {
     formData.append("image", imageFile);
   }
@@ -108,7 +119,14 @@ export const getRecentlyBought = (householdId, storeId) =>
 /**
  * Update item image
  */
-export const updateItemImage = (householdId, storeId, itemName, quantity, imageFile) => {
+export const updateItemImage = (
+  householdId,
+  storeId,
+  itemName,
+  quantity,
+  imageFile,
+  options = {}
+) => {
   const formData = new FormData();
   formData.append("item_name", itemName);
   formData.append("quantity", quantity);
@@ -118,5 +136,8 @@ export const updateItemImage = (householdId, storeId, itemName, quantity, imageF
     headers: {
       "Content-Type": "multipart/form-data",
     },
+    onUploadProgress: options.onUploadProgress,
+    signal: options.signal,
+    timeout: options.timeoutMs,
   });
 };


@@ -0,0 +1,67 @@
import "../../styles/components/ToggleButtonGroup.css";
function joinClasses(parts) {
return parts.filter(Boolean).join(" ");
}
export default function ToggleButtonGroup({
value,
options,
onChange,
ariaLabel,
role = "group",
className = "tbg-group",
buttonBaseClassName = "tbg-button",
buttonClassName,
activeClassName = "is-active",
inactiveClassName = "is-inactive",
sizeClassName = "tbg-size-default"
}) {
const optionCount = Math.max(options.length, 1);
const activeIndex =
value == null ? -1 : options.findIndex((option) => option.value === value);
const groupStyle = {
"--tbg-option-count": optionCount,
"--tbg-active-index": activeIndex >= 0 ? activeIndex : 0
};
return (
<div
className={joinClasses([className, activeIndex >= 0 && "has-active"])}
role={role}
aria-label={ariaLabel}
style={groupStyle}
>
<span className="tbg-indicator" aria-hidden="true" />
{options.map((option) => {
const isActive = value != null && option.value === value;
const handleClick = option.onClick
? option.onClick
: onChange
? () => onChange(option.value)
: undefined;
return (
<button
key={option.value}
type="button"
className={joinClasses([
buttonBaseClassName,
sizeClassName,
buttonClassName,
isActive ? (option.activeClassName || activeClassName) : (option.inactiveClassName || inactiveClassName),
option.className
])}
onClick={handleClick}
disabled={option.disabled}
aria-pressed={value != null ? isActive : undefined}
aria-label={option.ariaLabel}
>
{option.label}
</button>
);
})}
</div>
);
}


@@ -3,5 +3,6 @@ export { default as ErrorMessage } from './ErrorMessage.jsx';
 export { default as FloatingActionButton } from './FloatingActionButton.jsx';
 export { default as FormInput } from './FormInput.jsx';
 export { default as SortDropdown } from './SortDropdown.jsx';
+export { default as ToggleButtonGroup } from './ToggleButtonGroup.jsx';
 export { default as UserRoleCard } from './UserRoleCard.jsx';


@@ -1,19 +1,49 @@
-import { useState } from "react";
+import { useMemo, useState } from "react";
+import { ToggleButtonGroup } from "../common";
+import AssignItemForModal from "../modals/AssignItemForModal";
 import "../../styles/components/AddItemForm.css";
 import SuggestionList from "../items/SuggestionList";

-export default function AddItemForm({ onAdd, onSuggest, suggestions, buttonText = "Add" }) {
+export default function AddItemForm({
+  onAdd,
+  onSuggest,
+  suggestions,
+  buttonText = "Add",
+  householdMembers = [],
+  currentUserId = null
+}) {
   const [itemName, setItemName] = useState("");
   const [quantity, setQuantity] = useState(1);
   const [showSuggestions, setShowSuggestions] = useState(false);
+  const [assignmentMode, setAssignmentMode] = useState("me");
+  const [assignedUserId, setAssignedUserId] = useState(null);
+  const [showAssignModal, setShowAssignModal] = useState(false);
+
+  const numericCurrentUserId =
+    currentUserId == null ? null : Number.parseInt(String(currentUserId), 10);
+  const otherMembers = useMemo(
+    () => householdMembers.filter((member) => Number(member.id) !== numericCurrentUserId),
+    [householdMembers, numericCurrentUserId]
+  );
+  const assignedMemberLabel = useMemo(() => {
+    if (assignmentMode !== "others" || assignedUserId == null) return "";
+    const member = otherMembers.find((item) => Number(item.id) === Number(assignedUserId));
+    return member ? (member.display_name || member.name || member.username || `User ${member.id}`) : "";
+  }, [assignmentMode, assignedUserId, otherMembers]);

   const handleSubmit = (e) => {
     e.preventDefault();
     if (!itemName.trim()) return;
-    onAdd(itemName, quantity);
+    const targetUserId = assignmentMode === "others" ? assignedUserId : null;
+    onAdd(itemName, quantity, targetUserId);
     setItemName("");
     setQuantity(1);
+    setAssignmentMode("me");
+    setAssignedUserId(null);
+    setShowAssignModal(false);
   };

   const handleInputChange = (text) => {
@@ -35,30 +65,78 @@ export default function AddItemForm({ onAdd, onSuggest, suggestions, buttonText
     setQuantity(prev => Math.max(1, prev - 1));
   };

+  const handleAssignmentModeChange = (mode) => {
+    if (mode === "me") {
+      setAssignmentMode("me");
+      setAssignedUserId(null);
+      setShowAssignModal(false);
+      return;
+    }
+    if (otherMembers.length === 0) {
+      setAssignmentMode("me");
+      setAssignedUserId(null);
+      return;
+    }
+    setAssignmentMode("others");
+    setShowAssignModal(true);
+  };
+
+  const handleAssignCancel = () => {
+    setShowAssignModal(false);
+    setAssignmentMode("me");
+    setAssignedUserId(null);
+  };
+
+  const handleAssignConfirm = (memberId) => {
+    setShowAssignModal(false);
+    setAssignmentMode("others");
+    setAssignedUserId(Number(memberId));
+  };
+
   const isDisabled = !itemName.trim();

   return (
     <div className="add-item-form-container">
       <form onSubmit={handleSubmit} className="add-item-form">
-        <div className="add-item-form-field">
-          <input
-            type="text"
-            className="add-item-form-input"
-            placeholder="Enter item name"
-            value={itemName}
-            onChange={(e) => handleInputChange(e.target.value)}
-            onBlur={() => setTimeout(() => setShowSuggestions(false), 150)}
-            onClick={() => setShowSuggestions(true)}
-          />
-          {showSuggestions && suggestions.length > 0 && (
-            <SuggestionList
-              suggestions={suggestions}
-              onSelect={handleSuggestionSelect}
-            />
-          )}
+        <div className="add-item-form-input-row">
+          <div className="add-item-form-field">
+            <input
+              type="text"
+              className="add-item-form-input"
+              placeholder="Enter item name"
+              value={itemName}
+              onChange={(e) => handleInputChange(e.target.value)}
+              onBlur={() => setTimeout(() => setShowSuggestions(false), 150)}
+              onClick={() => setShowSuggestions(true)}
+            />
+            {showSuggestions && suggestions.length > 0 && (
+              <SuggestionList
+                suggestions={suggestions}
+                onSelect={handleSuggestionSelect}
+              />
+            )}
+          </div>
+          <ToggleButtonGroup
+            value={assignmentMode}
+            ariaLabel="Item assignment mode"
+            className="tbg-group add-item-form-assignee-toggle"
+            sizeClassName="tbg-size-xs"
+            options={[
+              { value: "me", label: "Me" },
+              { value: "others", label: "Others", disabled: otherMembers.length === 0 }
+            ]}
+            onChange={handleAssignmentModeChange}
+          />
         </div>
+        {assignmentMode === "others" && assignedMemberLabel ? (
+          <p className="add-item-form-assignee-hint">Adding for: {assignedMemberLabel}</p>
+        ) : null}
         <div className="add-item-form-actions">
           <div className="add-item-form-quantity-control">
             <button
@@ -94,6 +172,13 @@ export default function AddItemForm({ onAdd, onSuggest, suggestions, buttonText
           </button>
         </div>
       </form>
+      <AssignItemForModal
+        isOpen={showAssignModal}
+        members={otherMembers}
+        onCancel={handleAssignCancel}
+        onConfirm={handleAssignConfirm}
+      />
     </div>
   );
 }


@@ -97,6 +97,11 @@ function GroceryListItem({ item, onClick, onImageAdded, onLongPress, allItems =
   const imageUrl = item.item_image && item.image_mime_type
     ? `data:${item.image_mime_type};base64,${item.item_image}`
     : null;
+  const addedByUsers = Array.isArray(item.added_by_users)
+    ? item.added_by_users.filter(
+        (name) => typeof name === "string" && name.trim().length > 0
+      )
+    : [];

   const getTimeAgo = (dateString) => {
     if (!dateString) return null;
@@ -146,10 +151,10 @@
         <div className="glist-item-header">
           <span className="glist-item-name">{item.item_name}</span>
         </div>
-        {item.added_by_users && item.added_by_users.length > 0 && (
+        {addedByUsers.length > 0 && (
           <div className="glist-item-users">
             {item.last_added_on && `${getTimeAgo(item.last_added_on)} -- `}
-            {item.added_by_users.join(" • ")}
+            {addedByUsers.join(" | ")}
           </div>
         )}
       </div>


@@ -2,54 +2,61 @@ import "../../styles/components/Navbar.css";
 import { useContext, useState } from "react";
 import { Link } from "react-router-dom";
+import { logoutRequest } from "../../api/auth";
 import { AuthContext } from "../../context/AuthContext";
 import HouseholdSwitcher from "../household/HouseholdSwitcher";

 export default function Navbar() {
   const { role, logout, username } = useContext(AuthContext);
-  const [showNavMenu, setShowNavMenu] = useState(false);
   const [showUserMenu, setShowUserMenu] = useState(false);

   const closeMenus = () => {
-    setShowNavMenu(false);
     setShowUserMenu(false);
   };

+  const handleLogout = async () => {
+    try {
+      await logoutRequest();
+    } catch (_) {
+      // Clear local auth state even if server logout fails.
+    } finally {
+      logout();
+      closeMenus();
+      window.location.href = "/login";
+    }
+  };

   return (
     <nav className="navbar">
-      {/* Left: Navigation Menu */}
-      <div className="navbar-section navbar-left">
-        <button
-          className="navbar-menu-btn"
-          onClick={() => {
-            setShowNavMenu(!showNavMenu);
-            setShowUserMenu(false);
-          }}
-          aria-label="Navigation menu"
-        >
-          <span className="hamburger-icon">
-            <span></span>
-            <span></span>
-            <span></span>
-          </span>
-        </button>
-        {showNavMenu && (
-          <>
-            <div className="menu-overlay" onClick={closeMenus}></div>
-            <div className="navbar-dropdown nav-dropdown">
-              <Link to="/" onClick={closeMenus}>Home</Link>
-              <Link to="/manage" onClick={closeMenus}>Manage</Link>
-              <Link to="/settings" onClick={closeMenus}>Settings</Link>
-              {role === "system_admin" && <Link to="/admin" onClick={closeMenus}>Admin</Link>}
-            </div>
-          </>
-        )}
-      </div>
+      <div className="navbar-section navbar-spacer" aria-hidden="true"></div>

-      {/* Center: Household Switcher */}
+      {/* Center: Home + Household + Manage */}
       <div className="navbar-section navbar-center">
-        <HouseholdSwitcher />
+        <div className="navbar-center-nav">
+          <Link
+            to="/"
+            className="navbar-icon-link navbar-icon-left"
+            title="Home"
+            aria-label="Home"
+            onClick={closeMenus}
+          >
+            <span aria-hidden="true">&#8962;</span>
+          </Link>
+          <div className="navbar-household-wrap">
+            <HouseholdSwitcher />
+          </div>
+          <Link
+            to="/manage"
+            className="navbar-icon-link navbar-icon-right"
+            title="Manage"
+            aria-label="Manage"
+            onClick={closeMenus}
+          >
+            <span aria-hidden="true">&#9881;</span>
+          </Link>
+        </div>
       </div>

       {/* Right: User Menu */}
@@ -58,10 +65,15 @@ export default function Navbar() {
           className="navbar-user-btn"
           onClick={() => {
             setShowUserMenu(!showUserMenu);
-            setShowNavMenu(false);
           }}
+          aria-label="User menu"
         >
-          {username}
+          <span className="navbar-user-icon" aria-hidden="true">
+            <svg viewBox="0 0 24 24" focusable="false" aria-hidden="true">
+              <path d="M12 12a5 5 0 1 0-5-5 5 5 0 0 0 5 5Zm0 2c-4.14 0-7.5 2.69-7.5 6v1h15v-1c0-3.31-3.36-6-7.5-6Z" />
+            </svg>
+          </span>
+          <span className="navbar-user-name">{username}</span>
         </button>

         {showUserMenu && (
@@ -72,7 +84,15 @@ export default function Navbar() {
             <span className="user-dropdown-username">{username}</span>
             <span className="user-dropdown-role">{role}</span>
           </div>
-          <button className="user-dropdown-logout" onClick={() => { logout(); closeMenus(); }}>
+          <Link to="/settings" className="user-dropdown-link" onClick={closeMenus}>
+            User Settings
+          </Link>
+          {role === "system_admin" && (
+            <Link to="/admin" className="user-dropdown-link" onClick={closeMenus}>
+              Admin Settings
+            </Link>
+          )}
+          <button className="user-dropdown-logout" onClick={handleLogout}>
             Logout
           </button>
         </div>


@@ -38,7 +38,6 @@ export default function CreateJoinHousehold({ onClose }) {
     setError("");
     try {
-      console.log("Joining household with invite code:", inviteCode);
       await joinHousehold(inviteCode);
       await refreshHouseholds();
       onClose();


@@ -64,9 +64,19 @@ export default function ManageHousehold() {
     try {
       const response = await refreshInviteCode(activeHousehold.id);
       await refreshHouseholds();
-      alert(`New invite code: ${response.data.inviteCode}`);
+      const refreshedInviteCode = response.data?.household?.invite_code;
+      if (refreshedInviteCode) {
+        alert(`New invite code: ${refreshedInviteCode}`);
+      } else {
+        alert("Invite code refreshed successfully");
+      }
     } catch (error) {
-      console.error("Failed to refresh invite code:", error);
+      console.error(
+        "Failed to refresh invite code:",
+        error?.response?.data?.error?.message ||
+          error?.response?.data?.message ||
+          error?.message
+      );
       alert("Failed to refresh invite code");
     }
   };
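The refresh handler above now reads the new code defensively from `response.data?.household?.invite_code` instead of assuming a flat `inviteCode` field. A small sketch of that fallback, factored into a pure function for clarity (the function name `inviteCodeMessage` is illustrative, not from the diff):

```javascript
// Illustrative: prefer the refreshed code from the nested response shape;
// fall back to a generic success message if the shape is missing or changes.
function inviteCodeMessage(responseData) {
  const refreshedInviteCode = responseData?.household?.invite_code;
  return refreshedInviteCode
    ? `New invite code: ${refreshedInviteCode}`
    : "Invite code refreshed successfully";
}

console.log(inviteCodeMessage({ household: { invite_code: "ABC123" } }));
console.log(inviteCodeMessage({}));
```

Optional chaining makes the missing-shape case a silent fallback rather than a `TypeError`, which is presumably why the old `response.data.inviteCode` read was replaced.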


@@ -86,7 +86,6 @@ export default function ManageStores() {
       <div className="store-info">
         <h3>{store.name}</h3>
         {store.location && <p className="store-location">{store.location}</p>}
-        {store.is_default && <span className="default-badge">Default</span>}
       </div>
       {isAdmin && (
         <div className="store-actions">


@@ -0,0 +1,149 @@
+import { useEffect, useMemo, useRef, useState } from "react";
+
+import "../../styles/components/AssignItemForModal.css";
+
+function getMemberLabel(member) {
+  return member.display_name || member.name || member.username || `User ${member.id}`;
+}
+
+function getMemberOptionLabel(member, maxLength = 28) {
+  const label = getMemberLabel(member);
+  if (label.length <= maxLength) return label;
+  return `${label.slice(0, maxLength - 3)}...`;
+}
+
+export default function AssignItemForModal({
+  isOpen,
+  members,
+  onCancel,
+  onConfirm
+}) {
+  const [selectedUserId, setSelectedUserId] = useState("");
+  const [isDropdownOpen, setIsDropdownOpen] = useState(false);
+  const dropdownRef = useRef(null);
+  const hasMembers = members.length > 0;
+
+  const selectedMember = useMemo(
+    () => members.find((member) => String(member.id) === String(selectedUserId)) || null,
+    [members, selectedUserId]
+  );
+
+  useEffect(() => {
+    if (!isOpen) return;
+    setSelectedUserId(members[0] ? String(members[0].id) : "");
+    setIsDropdownOpen(false);
+  }, [isOpen, members]);
+
+  useEffect(() => {
+    if (!isOpen) return undefined;
+    const handleEscape = (event) => {
+      if (event.key === "Escape") {
+        if (isDropdownOpen) {
+          setIsDropdownOpen(false);
+        } else {
+          onCancel();
+        }
+      }
+    };
+    window.addEventListener("keydown", handleEscape);
+    return () => window.removeEventListener("keydown", handleEscape);
+  }, [isDropdownOpen, isOpen, onCancel]);
+
+  useEffect(() => {
+    if (!isOpen || !isDropdownOpen) return undefined;
+    const handlePointerDown = (event) => {
+      if (!dropdownRef.current) return;
+      if (!dropdownRef.current.contains(event.target)) {
+        setIsDropdownOpen(false);
+      }
+    };
+    window.addEventListener("pointerdown", handlePointerDown);
+    return () => window.removeEventListener("pointerdown", handlePointerDown);
+  }, [isDropdownOpen, isOpen]);
+
+  if (!isOpen) return null;
+
+  const handleConfirm = () => {
+    if (!selectedMember) return;
+    onConfirm(selectedMember.id);
+  };
+
+  return (
+    <div className="modal-overlay" onClick={onCancel}>
+      <div className="modal assign-item-for-modal" onClick={(event) => event.stopPropagation()}>
+        <h2 className="modal-title">Add Item For Someone Else</h2>
+        {hasMembers ? (
+          <div className="assign-item-for-modal-field">
+            <label className="form-label">
+              Household member
+            </label>
+            <div className="assign-item-for-dropdown" ref={dropdownRef}>
+              <button
+                type="button"
+                className={`assign-item-for-dropdown-trigger ${isDropdownOpen ? "is-open" : ""}`}
+                aria-haspopup="listbox"
+                aria-expanded={isDropdownOpen}
+                onClick={() => setIsDropdownOpen((prev) => !prev)}
+              >
+                <span className="assign-item-for-dropdown-label">
+                  {selectedMember ? getMemberOptionLabel(selectedMember) : "Select member"}
+                </span>
+                <span className="assign-item-for-dropdown-caret" aria-hidden="true">
+                  {isDropdownOpen ? "▲" : "▼"}
+                </span>
+              </button>
+              {isDropdownOpen ? (
+                <div className="assign-item-for-dropdown-menu" role="listbox" aria-label="Household member">
+                  {members.map((member) => {
+                    const memberId = String(member.id);
+                    const isSelected = memberId === String(selectedUserId);
+                    return (
+                      <button
+                        key={member.id}
+                        type="button"
+                        className={`assign-item-for-dropdown-option ${isSelected ? "is-selected" : ""}`}
+                        role="option"
+                        aria-selected={isSelected}
+                        onClick={() => {
+                          setSelectedUserId(memberId);
+                          setIsDropdownOpen(false);
+                        }}
+                        title={getMemberLabel(member)}
+                      >
+                        {getMemberOptionLabel(member)}
+                      </button>
+                    );
+                  })}
+                </div>
+              ) : null}
+            </div>
+          </div>
+        ) : (
+          <p className="assign-item-for-modal-empty">
+            No other household members are available.
+          </p>
+        )}
+        <div className="modal-actions">
+          <button type="button" className="btn btn-outline flex-1" onClick={onCancel}>
+            Cancel
+          </button>
+          <button
+            type="button"
+            className="btn btn-primary flex-1"
+            onClick={handleConfirm}
+            disabled={!selectedMember}
+          >
+            Confirm
+          </button>
+        </div>
+      </div>
+    </div>
+  );
+}
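The new modal's two label helpers are pure functions, so they can be exercised outside React. Reproduced standalone from the diff:

```javascript
// Label fallback chain: display_name -> name -> username -> "User <id>".
function getMemberLabel(member) {
  return member.display_name || member.name || member.username || `User ${member.id}`;
}

// Ellipsize long labels so the bounded dropdown trigger never overflows;
// the result is never longer than maxLength (slice keeps maxLength - 3
// characters, then "..." brings it back to exactly maxLength).
function getMemberOptionLabel(member, maxLength = 28) {
  const label = getMemberLabel(member);
  if (label.length <= maxLength) return label;
  return `${label.slice(0, maxLength - 3)}...`;
}

console.log(getMemberLabel({ id: 7 })); // "User 7"
```

Note the full, untruncated label is still surfaced via the option's `title` attribute, so nothing is lost to the ellipsis.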


@@ -1,7 +1,9 @@
 // Barrel export for modal components
 export { default as AddImageModal } from './AddImageModal.jsx';
 export { default as AddItemWithDetailsModal } from './AddItemWithDetailsModal.jsx';
+export { default as AssignItemForModal } from './AssignItemForModal.jsx';
 export { default as ConfirmBuyModal } from './ConfirmBuyModal.jsx';
+export { default as ConfirmSlideModal } from './ConfirmSlideModal.jsx';
 export { default as EditItemModal } from './EditItemModal.jsx';
 export { default as ImageModal } from './ImageModal.jsx';
 export { default as ImageUploadModal } from './ImageUploadModal.jsx';


@@ -26,7 +26,6 @@ export default function StoreTabs() {
         disabled={loading}
       >
         <span className="store-name">{store.name}</span>
-        {store.is_default && <span className="default-badge">Default</span>}
       </button>
     ))}
   </div>


@@ -15,6 +15,13 @@ export const AuthProvider = ({ children }) => {
   const [role, setRole] = useState(localStorage.getItem('role') || null);
   const [username, setUsername] = useState(localStorage.getItem('username') || null);

+  const clearAuthStorage = () => {
+    localStorage.removeItem("token");
+    localStorage.removeItem("userId");
+    localStorage.removeItem("role");
+    localStorage.removeItem("username");
+  };
+
   const login = (data) => {
     localStorage.setItem('token', data.token);
     localStorage.setItem('userId', data.userId);
@@ -27,7 +34,7 @@ export const AuthProvider = ({ children }) => {
   };

   const logout = () => {
-    localStorage.clear();
+    clearAuthStorage();
     setToken(null);
     setUserId(null);
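The switch from `localStorage.clear()` to a targeted `clearAuthStorage()` means logout removes only the four auth keys, so unrelated persisted state survives. A sketch of the idea against a minimal storage stub (the stub and the `AUTH_KEYS` constant are illustrative; the component calls `localStorage.removeItem` directly):

```javascript
// Illustrative: remove only auth-related keys, leaving other persisted
// state (e.g. user settings) intact across logout.
const AUTH_KEYS = ["token", "userId", "role", "username"];

function clearAuthStorage(storage) {
  for (const key of AUTH_KEYS) storage.removeItem(key);
}

// Minimal Map-backed stand-in for the Web Storage API surface used here.
const backing = new Map([
  ["token", "jwt"],
  ["userId", "1"],
  ["role", "member"],
  ["username", "nico"],
  ["settings", "{\"theme\":\"dark\"}"],
]);
const storageStub = { removeItem: (key) => backing.delete(key) };

clearAuthStorage(storageStub);
console.log(backing.has("settings")); // non-auth state survives
```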


@@ -8,10 +8,9 @@ import {
   getRecentlyBought,
   getSuggestions,
   markBought,
-  updateItemImage,
   updateItemWithClassification
 } from "../api/list";
-import FloatingActionButton from "../components/common/FloatingActionButton";
+import { getHouseholdMembers } from "../api/households";
 import SortDropdown from "../components/common/SortDropdown";
 import AddItemForm from "../components/forms/AddItemForm";
 import GroceryListItem from "../components/items/GroceryListItem";
@@ -24,31 +23,34 @@ import { ZONE_FLOW } from "../constants/classifications";
 import { ROLES } from "../constants/roles";
 import { AuthContext } from "../context/AuthContext";
 import { HouseholdContext } from "../context/HouseholdContext";
+import { IMAGE_UPLOAD_SUCCESS_EVENT } from "../context/UploadQueueContext";
 import { SettingsContext } from "../context/SettingsContext";
 import { StoreContext } from "../context/StoreContext";
+import useUploadQueue from "../hooks/useUploadQueue";
 import "../styles/pages/GroceryList.css";
 import { findSimilarItems } from "../utils/stringSimilarity";

 export default function GroceryList() {
   const pageTitle = "Grocery List";
-  const { role: systemRole } = useContext(AuthContext);
+  const { userId } = useContext(AuthContext);
   const { activeHousehold } = useContext(HouseholdContext);
   const { activeStore, stores, loading: storeLoading } = useContext(StoreContext);
   const { settings } = useContext(SettingsContext);
+  const { enqueueImageUpload } = useUploadQueue();
   const navigate = useNavigate();

   // Get household role for permissions
   const householdRole = activeHousehold?.role;
-  const isHouseholdAdmin = householdRole === "admin";
+  const isHouseholdAdmin = ["owner", "admin"].includes(householdRole);

   // === State === //
   const [items, setItems] = useState([]);
   const [recentlyBoughtItems, setRecentlyBoughtItems] = useState([]);
+  const [householdMembers, setHouseholdMembers] = useState([]);
   const [recentlyBoughtDisplayCount, setRecentlyBoughtDisplayCount] = useState(settings.recentlyBoughtCount);
   const [sortMode, setSortMode] = useState(settings.defaultSortMode);
   const [suggestions, setSuggestions] = useState([]);
-  const [showAddForm, setShowAddForm] = useState(true);
   const [loading, setLoading] = useState(true);
   const [buttonText, setButtonText] = useState("Add Item");
   const [pendingItem, setPendingItem] = useState(null);
@@ -101,6 +103,73 @@ export default function GroceryList() {
     loadRecentlyBought();
   }, [activeHousehold?.id, activeStore?.id]);

+  useEffect(() => {
+    const loadHouseholdMembers = async () => {
+      if (!activeHousehold?.id) {
+        setHouseholdMembers([]);
+        return;
+      }
+      try {
+        const response = await getHouseholdMembers(activeHousehold.id);
+        setHouseholdMembers(response.data || []);
+      } catch (error) {
+        console.error("Failed to load household members:", error);
+        setHouseholdMembers([]);
+      }
+    };
+    loadHouseholdMembers();
+  }, [activeHousehold?.id]);
+
+  useEffect(() => {
+    const handleUploadSuccess = async (event) => {
+      const detail = event?.detail || {};
+      if (!activeHousehold?.id || !activeStore?.id) return;
+      if (String(detail.householdId) !== String(activeHousehold.id)) return;
+      if (String(detail.storeId) !== String(activeStore.id)) return;
+      if (!detail.itemName) return;
+      try {
+        const response = await getItemByName(activeHousehold.id, activeStore.id, detail.itemName);
+        const refreshedItem = response.data;
+        setItems((prev) =>
+          prev.map((item) => {
+            const byId =
+              detail.localItemId !== null &&
+              detail.localItemId !== undefined &&
+              item.id === detail.localItemId;
+            const byName =
+              String(item.item_name || "").toLowerCase() ===
+              String(detail.itemName || "").toLowerCase();
+            return byId || byName ? { ...item, ...refreshedItem } : item;
+          })
+        );
+        setRecentlyBoughtItems((prev) =>
+          prev.map((item) => {
+            const byId =
+              detail.localItemId !== null &&
+              detail.localItemId !== undefined &&
+              item.id === detail.localItemId;
+            const byName =
+              String(item.item_name || "").toLowerCase() ===
+              String(detail.itemName || "").toLowerCase();
+            return byId || byName ? { ...item, ...refreshedItem } : item;
+          })
+        );
+      } catch (error) {
+        console.error("Failed to refresh item after upload success:", error);
+      }
+    };
+    window.addEventListener(IMAGE_UPLOAD_SUCCESS_EVENT, handleUploadSuccess);
+    return () => {
+      window.removeEventListener(IMAGE_UPLOAD_SUCCESS_EVENT, handleUploadSuccess);
+    };
+  }, [activeHousehold?.id, activeStore?.id]);

   // === Zone Collapse Handler ===
   const toggleZoneCollapse = (zone) => {
@@ -185,49 +254,60 @@
   // === Item Addition Handlers ===
-  const handleAdd = useCallback(async (itemName, quantity) => {
-    if (!itemName.trim()) return;
+  const handleAdd = useCallback(async (itemName, quantity, addedForUserId = null) => {
+    const normalizedItemName = itemName.trim().toLowerCase();
+    if (!normalizedItemName) return;
     if (!activeHousehold?.id || !activeStore?.id) return;

-    // Check if item already exists
-    let existingItem = null;
-    try {
-      const response = await getItemByName(activeHousehold.id, activeStore.id, itemName);
-      existingItem = response.data;
-    } catch {
-      // Item doesn't exist, continue
-    }
+    const allItems = [...items, ...recentlyBoughtItems];
+    const existingLocalItem = allItems.find(
+      (item) => String(item.item_name || "").toLowerCase() === normalizedItemName
+    );

-    if (existingItem) {
-      await processItemAddition(itemName, quantity);
+    if (existingLocalItem) {
+      await processItemAddition(itemName, quantity, {
+        existingItem: existingLocalItem,
+        addedForUserId
+      });
       return;
     }

-    setItems(prevItems => {
-      const allItems = [...prevItems, ...recentlyBoughtItems];
-      const similar = findSimilarItems(itemName, allItems, 70);
-      if (similar.length > 0) {
-        setSimilarItemSuggestion({ originalName: itemName, suggestedItem: similar[0], quantity });
-        setShowSimilarModal(true);
-        return prevItems;
-      }
-      processItemAddition(itemName, quantity);
-      return prevItems;
-    });
-  }, [activeHousehold?.id, activeStore?.id, recentlyBoughtItems]);
+    const similar = findSimilarItems(itemName, allItems, 70);
+    if (similar.length > 0) {
+      setSimilarItemSuggestion({
+        originalName: itemName,
+        suggestedItem: similar[0],
+        quantity,
+        addedForUserId
+      });
+      setShowSimilarModal(true);
+      return;
+    }
+
+    const shouldSkipLookup = buttonText === "Create + Add";
+    await processItemAddition(itemName, quantity, {
+      skipLookup: shouldSkipLookup,
+      addedForUserId
+    });
+  }, [activeHousehold?.id, activeStore?.id, items, recentlyBoughtItems, buttonText]);

-  const processItemAddition = useCallback(async (itemName, quantity) => {
+  const processItemAddition = useCallback(async (itemName, quantity, options = {}) => {
     if (!activeHousehold?.id || !activeStore?.id) return;

+    const {
+      existingItem: providedItem = null,
+      skipLookup = false,
+      addedForUserId = null
+    } = options;

-    // Fetch current item state from backend
-    let existingItem = null;
-    try {
-      const response = await getItemByName(activeHousehold.id, activeStore.id, itemName);
-      existingItem = response.data;
-    } catch {
-      // Item doesn't exist, continue with add
-    }
+    let existingItem = providedItem;
+    if (!existingItem && !skipLookup) {
+      try {
+        const response = await getItemByName(activeHousehold.id, activeStore.id, itemName);
+        existingItem = response.data;
+      } catch {
+        // Item doesn't exist, continue with add
+      }
+    }

     if (existingItem?.bought === false) {
@@ -240,11 +320,20 @@
         currentQuantity,
         addingQuantity: quantity,
         newQuantity,
-        existingItem
+        existingItem,
+        addedForUserId
       });
       setShowConfirmAddExisting(true);
     } else if (existingItem) {
-      await addItem(activeHousehold.id, activeStore.id, itemName, quantity, null);
+      await addItem(
+        activeHousehold.id,
+        activeStore.id,
+        itemName,
+        quantity,
+        null,
+        null,
+        addedForUserId
+      );

       setSuggestions([]);
       setButtonText("Add Item");
@@ -252,10 +341,10 @@
       await loadItems();
       await loadRecentlyBought();
     } else {
-      setPendingItem({ itemName, quantity });
+      setPendingItem({ itemName, quantity, addedForUserId });
       setShowAddDetailsModal(true);
     }
-  }, [activeHousehold?.id, activeStore?.id, items, loadItems]);
+  }, [activeHousehold?.id, activeStore?.id, loadItems]);
   // === Similar Item Modal Handlers ===
@@ -268,7 +357,10 @@
   const handleSimilarNo = useCallback(async () => {
     if (!similarItemSuggestion) return;
     setShowSimilarModal(false);
-    await processItemAddition(similarItemSuggestion.originalName, similarItemSuggestion.quantity);
+    await processItemAddition(similarItemSuggestion.originalName, similarItemSuggestion.quantity, {
+      skipLookup: true,
+      addedForUserId: similarItemSuggestion.addedForUserId || null
+    });
     setSimilarItemSuggestion(null);
   }, [similarItemSuggestion, processItemAddition]);
@@ -276,7 +368,9 @@
   const handleSimilarYes = useCallback(async () => {
     if (!similarItemSuggestion) return;
     setShowSimilarModal(false);
-    await processItemAddition(similarItemSuggestion.suggestedItem.item_name, similarItemSuggestion.quantity);
+    await processItemAddition(similarItemSuggestion.suggestedItem.item_name, similarItemSuggestion.quantity, {
+      addedForUserId: similarItemSuggestion.addedForUserId || null
+    });
     setSimilarItemSuggestion(null);
   }, [similarItemSuggestion, processItemAddition]);
@@ -286,13 +380,21 @@
     if (!confirmAddExistingData) return;
     if (!activeHousehold?.id || !activeStore?.id) return;

-    const { itemName, newQuantity, existingItem } = confirmAddExistingData;
+    const { itemName, newQuantity, existingItem, addedForUserId } = confirmAddExistingData;
     setShowConfirmAddExisting(false);
     setConfirmAddExistingData(null);

     try {
-      await addItem(activeHousehold.id, activeStore.id, itemName, newQuantity, null);
+      await addItem(
+        activeHousehold.id,
+        activeStore.id,
+        itemName,
+        newQuantity,
+        null,
+        null,
+        addedForUserId || null
+      );

       const response = await getItemByName(activeHousehold.id, activeStore.id, itemName);
       const updatedItem = response.data;
@@ -318,7 +420,16 @@
     if (!activeHousehold?.id || !activeStore?.id) return;

     try {
-      await addItem(activeHousehold.id, activeStore.id, pendingItem.itemName, pendingItem.quantity, imageFile);
+      // Create the list item first, upload image separately in background.
+      await addItem(
+        activeHousehold.id,
+        activeStore.id,
+        pendingItem.itemName,
+        pendingItem.quantity,
+        null,
+        null,
+        pendingItem.addedForUserId || null
+      );

       if (classification) {
         // Apply classification if provided
@@ -337,19 +448,42 @@
       // Add to state
       if (newItem) {
         setItems(prevItems => [...prevItems, newItem]);
+        if (imageFile) {
+          enqueueImageUpload({
+            householdId: activeHousehold.id,
+            storeId: activeStore.id,
+            itemName: newItem.item_name || pendingItem.itemName,
+            quantity: newItem.quantity || pendingItem.quantity,
+            fileBlob: imageFile,
+            fileName: imageFile.name || "upload.jpg",
+            fileType: imageFile.type || "image/jpeg",
+            fileSize: imageFile.size || 0,
+            source: "add_details",
+            localItemId: newItem.id,
+          });
+        }
       }
     } catch (error) {
       console.error("Failed to add item:", error);
       alert("Failed to add item. Please try again.");
     }
-  }, [activeHousehold?.id, activeStore?.id, pendingItem]);
+  }, [activeHousehold?.id, activeStore?.id, pendingItem, enqueueImageUpload]);

   const handleAddDetailsSkip = useCallback(async () => {
     if (!pendingItem) return;
     if (!activeHousehold?.id || !activeStore?.id) return;

     try {
-      await addItem(activeHousehold.id, activeStore.id, pendingItem.itemName, pendingItem.quantity, null);
+      await addItem(
+        activeHousehold.id,
+        activeStore.id,
+        pendingItem.itemName,
+        pendingItem.quantity,
+        null,
+        null,
+        pendingItem.addedForUserId || null
+      );

       // Fetch the newly added item
       const itemResponse = await getItemByName(activeHousehold.id, activeStore.id, pendingItem.itemName);
@@ -403,28 +537,28 @@
     loadRecentlyBought();
   }, [activeHousehold?.id, activeStore?.id, items]);

-  const handleImageAdded = useCallback(async (id, itemName, quantity, imageFile) => {
+  const handleImageAdded = useCallback(async (id, itemName, quantity, imageFile, source = "add_image_modal") => {
     if (!activeHousehold?.id || !activeStore?.id) return;
+    if (!imageFile) return;

     try {
-      const response = await updateItemImage(activeHousehold.id, activeStore.id, id, itemName, quantity, imageFile);
-
-      setItems(prevItems =>
-        prevItems.map(item =>
-          item.id === id ? { ...item, ...response.data } : item
-        )
-      );
-
-      setRecentlyBoughtItems(prevItems =>
-        prevItems.map(item =>
-          item.id === id ? { ...item, ...response.data } : item
-        )
-      );
+      enqueueImageUpload({
+        householdId: activeHousehold.id,
+        storeId: activeStore.id,
+        itemName,
+        quantity,
+        fileBlob: imageFile,
+        fileName: imageFile.name || "upload.jpg",
+        fileType: imageFile.type || "image/jpeg",
+        fileSize: imageFile.size || 0,
+        source,
+        localItemId: id,
+      });
     } catch (error) {
       console.error("Failed to add image:", error);
       alert("Failed to add image. Please try again.");
     }
-  }, [activeHousehold?.id, activeStore?.id]);
+  }, [activeHousehold?.id, activeStore?.id, enqueueImageUpload]);

   const handleLongPress = useCallback(async (item) => {
@@ -586,12 +720,14 @@
     <StoreTabs />

-    {householdRole && householdRole !== 'viewer' && showAddForm && (
+    {householdRole && householdRole !== 'viewer' && (
       <AddItemForm
         onAdd={handleAdd}
         onSuggest={handleSuggest}
         suggestions={suggestions}
         buttonText={buttonText}
+        householdMembers={householdMembers}
+        currentUserId={userId}
       />
     )}
@@ -711,13 +847,6 @@
       )}
     </div>

-    {householdRole && householdRole !== 'viewer' && (
-      <FloatingActionButton
-        isOpen={showAddForm}
-        onClick={() => setShowAddForm(!showAddForm)}
-      />
-    )}
-
     {showAddDetailsModal && pendingItem && (
       <AddItemWithDetailsModal
         itemName={pendingItem.itemName}
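The reworked `handleAdd` above replaces a round-trip `getItemByName` probe with a case-insensitive search of the lists already in memory. A standalone sketch of that duplicate check (the wrapper name `findExistingLocalItem` is illustrative; the component inlines this logic):

```javascript
// Illustrative: normalize the typed name and search the in-memory active
// and recently-bought lists before deciding whether to hit the API.
function findExistingLocalItem(itemName, items, recentlyBoughtItems) {
  const normalizedItemName = itemName.trim().toLowerCase();
  if (!normalizedItemName) return null; // blank input, nothing to add

  const allItems = [...items, ...recentlyBoughtItems];
  return (
    allItems.find(
      (item) => String(item.item_name || "").toLowerCase() === normalizedItemName
    ) || null
  );
}

const items = [{ id: 1, item_name: "Milk" }];
const recent = [{ id: 2, item_name: "Eggs" }];
console.log(findExistingLocalItem(" milk ", items, recent)); // matches id 1
```

The `String(item.item_name || "")` guard also makes the comparison safe for items whose name is missing, which matters when optimistic local entries and server rows are mixed in the same array.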


@@ -1,4 +1,4 @@
-import { useContext, useEffect, useState } from "react";
+import { useContext, useEffect, useRef, useState } from "react";
 import { changePassword, getCurrentUser, updateCurrentUser } from "../api/users";
 import { SettingsContext } from "../context/SettingsContext";
 import "../styles/pages/Settings.css";
@@ -7,6 +7,9 @@ import "../styles/pages/Settings.css";
 export default function Settings() {
   const { settings, updateSettings, resetSettings } = useContext(SettingsContext);
   const [activeTab, setActiveTab] = useState("appearance");
+  const tabsRef = useRef(null);
+  const [showLeftArrow, setShowLeftArrow] = useState(false);
+  const [showRightArrow, setShowRightArrow] = useState(false);

   // Account management state
   const [displayName, setDisplayName] = useState("");
@@ -30,6 +33,35 @@
     loadProfile();
   }, []);

+  useEffect(() => {
+    const tabsElement = tabsRef.current;
+    if (!tabsElement) return;
+
+    const updateArrowVisibility = () => {
+      const hasOverflow = tabsElement.scrollWidth > tabsElement.clientWidth + 1;
+      if (!hasOverflow) {
+        setShowLeftArrow(false);
+        setShowRightArrow(false);
+        return;
+      }
+      setShowLeftArrow(tabsElement.scrollLeft > 4);
+      setShowRightArrow(
+        tabsElement.scrollLeft + tabsElement.clientWidth < tabsElement.scrollWidth - 4
+      );
+    };
+
+    updateArrowVisibility();
+    tabsElement.addEventListener("scroll", updateArrowVisibility, { passive: true });
+    window.addEventListener("resize", updateArrowVisibility);
+
+    return () => {
+      tabsElement.removeEventListener("scroll", updateArrowVisibility);
+      window.removeEventListener("resize", updateArrowVisibility);
+    };
+  }, []);

   const handleThemeChange = (theme) => {
     updateSettings({ theme });
@@ -114,31 +146,47 @@
   <div className="card" style={{ maxWidth: '800px', margin: '0 auto' }}>
     <h1 className="text-2xl font-semibold mb-4">Settings</h1>

-    <div className="settings-tabs">
-      <button
-        className={`settings-tab ${activeTab === "appearance" ? "active" : ""}`}
-        onClick={() => setActiveTab("appearance")}
-      >
-        Appearance
-      </button>
-      <button
-        className={`settings-tab ${activeTab === "list" ? "active" : ""}`}
-        onClick={() => setActiveTab("list")}
-      >
-        List Display
-      </button>
-      <button
-        className={`settings-tab ${activeTab === "behavior" ? "active" : ""}`}
-        onClick={() => setActiveTab("behavior")}
-      >
-        Behavior
-      </button>
-      <button
-        className={`settings-tab ${activeTab === "account" ? "active" : ""}`}
-        onClick={() => setActiveTab("account")}
-      >
-        Account
-      </button>
-    </div>
+    <div className="settings-tabs-wrapper">
+      <div
+        className={`settings-tabs-arrow settings-tabs-arrow-left ${showLeftArrow ? "visible" : ""}`}
+        aria-hidden="true"
+      >
+        &#8249;
+      </div>
+      <div className="settings-tabs" ref={tabsRef}>
+        <button
+          className={`settings-tab ${activeTab === "appearance" ? "active" : ""}`}
+          onClick={() => setActiveTab("appearance")}
+        >
+          Appearance
+        </button>
+        <button
+          className={`settings-tab ${activeTab === "list" ? "active" : ""}`}
+          onClick={() => setActiveTab("list")}
+        >
+          List Display
+        </button>
+        <button
+          className={`settings-tab ${activeTab === "behavior" ? "active" : ""}`}
+          onClick={() => setActiveTab("behavior")}
+        >
+          Behavior
+        </button>
+        <button
+          className={`settings-tab ${activeTab === "account" ? "active" : ""}`}
+          onClick={() => setActiveTab("account")}
+        >
+          Account
+        </button>
+      </div>
+      <div
+        className={`settings-tabs-arrow settings-tabs-arrow-right ${showRightArrow ? "visible" : ""}`}
+        aria-hidden="true"
+      >
+        &#8250;
+      </div>
+    </div>

     <div className="settings-content">

View File

@@ -1,7 +1,7 @@
/* Add Item Form Container */
.add-item-form-container {
  background: var(--color-bg-surface);
  padding: var(--spacing-md);
  border-radius: var(--border-radius-lg);
  box-shadow: var(--shadow-md);
  margin-bottom: var(--spacing-xs);
@@ -11,7 +11,7 @@
.add-item-form {
  display: flex;
  flex-direction: column;
  gap: var(--spacing-xs);
}

/* Form Fields */
@@ -21,6 +21,28 @@
  position: relative;
}

.add-item-form-input-row {
  display: flex;
  align-items: stretch;
  gap: var(--spacing-xs);
}

.add-item-form-input-row .add-item-form-field {
  flex: 1;
}

.add-item-form-assignee-toggle {
  flex: 0 0 auto;
  width: 134px;
  margin: 0;
}

.add-item-form-assignee-hint {
  margin: 0;
  font-size: var(--font-size-xs);
  color: var(--color-text-secondary);
}

.add-item-form-input {
  padding: var(--input-padding-y) var(--input-padding-x);
  border: var(--border-width-thin) solid var(--input-border-color);
@@ -58,7 +80,8 @@
  display: flex;
  align-items: center;
  justify-content: space-between;
  gap: var(--spacing-sm);
  min-height: 40px;
}

/* Quantity Control */
@@ -66,11 +89,12 @@
  display: flex;
  align-items: center;
  gap: var(--spacing-xs);
  height: 40px;
}

.quantity-btn {
  width: 40px;
  height: 100%;
  border: var(--border-width-thin) solid var(--color-border-medium);
  background: var(--color-bg-surface);
  color: var(--color-text-primary);
@@ -106,6 +130,8 @@
.add-item-form-quantity-input {
  width: 40px;
  max-width: 40px;
  height: 100%;
  box-sizing: border-box;
  padding: var(--input-padding-y) var(--input-padding-x);
  border: var(--border-width-thin) solid var(--input-border-color);
  border-radius: var(--input-border-radius);
@@ -142,9 +168,9 @@
  font-size: var(--font-size-base);
  font-weight: var(--button-font-weight);
  flex: 1;
  min-width: 120px;
  transition: var(--transition-base);
  margin-top: 0;
}

.add-item-form-submit:hover:not(:disabled) {
@@ -174,9 +200,22 @@
    padding: var(--spacing-md);
  }

  .add-item-form-assignee-toggle {
    width: 120px;
  }

  .add-item-form-quantity-control {
    height: 36px;
  }

  .quantity-btn {
    width: 36px;
    height: 100%;
    font-size: var(--font-size-lg);
  }

  .add-item-form-quantity-input,
  .add-item-form-submit {
    height: 36px;
  }
}

View File

@@ -0,0 +1,95 @@
.assign-item-for-modal {
width: min(420px, calc(100vw - (2 * var(--spacing-md))));
max-width: 420px;
overflow-x: hidden;
}
.assign-item-for-modal-field {
margin-bottom: var(--spacing-sm);
width: 100%;
min-width: 0;
}
.assign-item-for-dropdown {
position: relative;
width: 100%;
min-width: 0;
}
.assign-item-for-dropdown-trigger {
width: 100%;
min-width: 0;
max-width: 100%;
display: flex;
align-items: center;
justify-content: space-between;
gap: var(--spacing-xs);
padding: var(--input-padding-y) var(--input-padding-x);
border: var(--border-width-thin) solid var(--input-border-color);
border-radius: var(--input-border-radius);
background: var(--color-bg-surface);
color: var(--color-text-primary);
font-size: var(--font-size-base);
text-align: left;
cursor: pointer;
}
.assign-item-for-dropdown-trigger.is-open,
.assign-item-for-dropdown-trigger:focus-visible {
outline: none;
border-color: var(--input-focus-border-color);
box-shadow: var(--input-focus-shadow);
}
.assign-item-for-dropdown-label {
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.assign-item-for-dropdown-caret {
flex-shrink: 0;
font-size: 0.75rem;
color: var(--color-text-secondary);
}
.assign-item-for-dropdown-menu {
position: absolute;
top: calc(100% + 6px);
left: 0;
right: 0;
z-index: 3;
max-height: 180px;
overflow-y: auto;
background: var(--color-bg-surface);
border: var(--border-width-thin) solid var(--input-border-color);
border-radius: var(--border-radius-md);
box-shadow: var(--shadow-lg);
}
.assign-item-for-dropdown-option {
width: 100%;
display: block;
text-align: left;
margin: 0;
border: 0;
border-radius: 0;
padding: 10px var(--input-padding-x);
background: transparent;
color: var(--color-text-primary);
cursor: pointer;
}
.assign-item-for-dropdown-option:hover {
background: var(--color-bg-hover);
}
.assign-item-for-dropdown-option.is-selected {
background: var(--color-primary-light);
}
.assign-item-for-modal-empty {
margin: 0 0 var(--spacing-sm) 0;
color: var(--color-text-secondary);
font-size: var(--font-size-sm);
}

View File

@@ -6,8 +6,8 @@
  background: #343a40;
  color: white;
  padding: 0.75rem 1rem;
  display: grid;
  grid-template-columns: 1fr auto 1fr;
  align-items: center;
  gap: 1rem;
  box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
@@ -19,52 +19,86 @@
  align-items: center;
}

.navbar-left {
  flex: 0 0 auto;
}

.navbar-center {
  grid-column: 2;
  display: flex;
  justify-content: center;
  min-width: 0;
  justify-self: center;
}

.navbar-right {
  grid-column: 3;
  justify-self: end;
  flex: 0 0 auto;
  position: relative;
}

.navbar-spacer {
  grid-column: 1;
}

.navbar-center-nav {
  display: flex;
  align-items: center;
  gap: 0;
  width: 100%;
  justify-content: center;
}

.navbar-household-wrap {
  position: relative;
  width: 17rem;
  flex: 0 0 17rem;
  margin: 0;
}

.navbar .household-switcher {
  display: block;
  width: 100%;
}

.navbar-icon-link {
  width: 48px;
  height: 40px;
  display: inline-flex;
  align-items: center;
  justify-content: center;
  text-decoration: none;
  color: #ffffff;
  background: #495057;
  border: 1px solid #5a6268;
  border-radius: 0;
  font-size: 1.2rem;
  line-height: 1;
  transition: background 0.2s;
}

.navbar-icon-link:hover {
  background: #5a6268;
}

.navbar-icon-link:focus-visible {
  outline: 2px solid #9ec5fe;
  outline-offset: 2px;
}

.navbar-icon-left {
  margin-right: 0;
  border-radius: 8px 0 0 8px;
  border-right: none;
}

.navbar-icon-right {
  margin-left: 0;
  border-radius: 0 8px 8px 0;
  border-left: none;
}

.navbar .household-switcher-toggle {
  border-radius: 0;
  height: 40px;
  padding: 0 0.75rem;
}

/* User Button */
@@ -79,12 +113,33 @@
  font-weight: 500;
  white-space: nowrap;
  transition: background 0.2s;
  display: inline-flex;
  align-items: center;
  gap: 0.5rem;
}

.navbar-user-btn:hover {
  background: #5a6268;
}

.navbar-user-icon {
  width: 18px;
  height: 18px;
  display: none;
  color: #ffffff;
}

.navbar-user-icon svg {
  width: 100%;
  height: 100%;
  fill: currentColor;
  display: block;
}

.navbar-user-name {
  display: inline-block;
}

/* Dropdown Overlay */
.menu-overlay {
  position: fixed;
@@ -110,31 +165,6 @@
}

/* Navigation Dropdown */
.nav-dropdown {
  left: 0;
  display: flex;
  flex-direction: column;
}

.nav-dropdown a {
  color: #343a40;
  text-decoration: none;
  padding: 0.75rem 1.25rem;
  font-size: 1rem;
  transition: background 0.2s;
  border-bottom: 1px solid #f0f0f0;
}

.nav-dropdown a:last-child {
  border-bottom: none;
}

.nav-dropdown a:hover {
  background: #f8f9fa;
  color: var(--color-primary);
}

/* User Dropdown */
.user-dropdown {
  right: 0;
  min-width: 200px;
@@ -161,6 +191,20 @@
  text-transform: capitalize;
}

.user-dropdown-link {
  display: block;
  width: 100%;
  padding: 0.75rem 1.25rem;
  color: #343a40;
  text-decoration: none;
  border-bottom: 1px solid #f0f0f0;
  transition: background 0.2s;
}

.user-dropdown-link:hover {
  background: #f8f9fa;
}

.user-dropdown-logout {
  width: 100%;
  background: #dc3545;
@@ -177,12 +221,6 @@
  background: #c82333;
}

/* Household Switcher - Centered with max width */
.navbar-center > * {
  width: 100%;
  max-width: 24ch; /* 24 characters max width */
}

/* Mobile Responsive */
@media (max-width: 768px) {
  .navbar {
@@ -190,8 +228,23 @@
    gap: 0.5rem;
  }

  .navbar-household-wrap {
    width: 14rem;
    flex: 0 0 14rem;
  }

  .navbar-icon-link {
    width: 42px;
    height: 40px;
    font-size: 1rem;
  }

  .navbar-icon-left {
    margin-right: 0;
  }

  .navbar-icon-right {
    margin-left: 0;
  }

  .navbar-user-btn {
@@ -199,10 +252,6 @@
    font-size: 0.9rem;
  }

  .nav-dropdown {
    min-width: 160px;
  }

  .user-dropdown {
    min-width: 180px;
  }
@@ -211,22 +260,43 @@
@media (max-width: 480px) {
  .navbar {
    padding: 0.5rem;
    grid-template-columns: auto 1fr auto;
  }

  .navbar-household-wrap {
    width: 11rem;
    flex: 0 0 11rem;
  }

  .navbar-icon-left {
    margin-right: 0;
  }

  .navbar-icon-right {
    margin-left: 0;
  }
}

@media (max-width: 360px) {
  .navbar-household-wrap {
    width: 10rem;
    flex: 0 0 10rem;
  }
}

@media (max-width: 900px) {
  .navbar-user-btn {
    width: 40px;
    height: 40px;
    padding: 0;
    justify-content: center;
  }

  .navbar-user-icon {
    display: inline-flex;
  }

  .navbar-user-name {
    display: none;
  }
}

View File

@@ -0,0 +1,81 @@
.tbg-group {
position: relative;
display: grid;
grid-template-columns: repeat(var(--tbg-option-count, 1), minmax(0, 1fr));
align-items: stretch;
gap: 0;
padding: 2px;
border: 1px solid var(--border);
border-radius: 999px;
background: var(--background);
overflow: hidden;
isolation: isolate;
}
.tbg-indicator {
position: absolute;
top: 2px;
bottom: 2px;
left: 2px;
width: calc((100% - 4px) / var(--tbg-option-count, 1));
border-radius: 999px;
background: var(--primary);
transform: translateX(calc(var(--tbg-active-index, 0) * 100%));
transition: transform 0.22s ease, opacity 0.2s ease;
opacity: 0;
z-index: 0;
}
.tbg-group.has-active .tbg-indicator {
opacity: 1;
}
.tbg-button {
position: relative;
z-index: 1;
margin: 0;
width: 100%;
border: none;
border-radius: 999px;
background: transparent;
color: var(--text-secondary);
cursor: pointer;
transition: color var(--transition-fast), background-color var(--transition-fast);
white-space: nowrap;
}
.tbg-button.tbg-size-default {
padding: 0.5rem 0.8rem;
font-size: 0.9rem;
font-weight: 500;
}
.tbg-button.tbg-size-xs {
padding: 0.35rem 0.5rem;
font-size: var(--font-size-xs);
font-weight: var(--font-weight-semibold);
}
.tbg-button.is-active {
color: var(--color-text-inverse);
background: transparent;
}
.tbg-button.is-inactive:hover:not(:disabled) {
color: var(--text-primary);
background: rgba(0, 0, 0, 0.04);
}
[data-theme="dark"] .tbg-button.is-inactive:hover:not(:disabled) {
background: rgba(255, 255, 255, 0.08);
}
.tbg-button:focus-visible {
outline: 2px solid var(--primary);
outline-offset: -2px;
}
.tbg-button:disabled {
opacity: 0.6;
cursor: not-allowed;
}
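The indicator above slides by composing two custom properties: its width is `(100% - 4px) / N` of the padded track, and `translateX(index * 100%)` moves it in steps of its own width. A small sketch of the inline style object a toggle-button-group component might set to drive those properties (the function and prop names are assumptions):

```javascript
// Compute the CSS custom properties the .tbg-group/.tbg-indicator
// styles consume. With N options, the indicator occupies 1/N of the
// track, so translateX(activeIndex * 100%) lands it on option
// `activeIndex` exactly.
function tbgGroupStyle(optionCount, activeIndex) {
  return {
    "--tbg-option-count": String(optionCount),
    "--tbg-active-index": String(activeIndex),
  };
}
```

In React this would be spread onto the group element, e.g. `<div className="tbg-group has-active" style={tbgGroupStyle(options.length, activeIndex)}>`.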

View File

@@ -7,10 +7,19 @@
}

/* Tabs */
.settings-tabs-wrapper {
  position: relative;
  margin-bottom: var(--spacing-xl);
  padding: 0 0.8rem;
}

.settings-tabs {
  display: flex;
  gap: 0rem;
  flex-wrap: nowrap;
  overflow-x: auto;
  overflow-y: hidden;
  -webkit-overflow-scrolling: touch;
  border-bottom: 2px solid var(--color-border-light);
  touch-action: pan-x; /* Lock Y-axis, allow only horizontal scrolling */
  scrollbar-width: none; /* Firefox */
@@ -22,7 +31,10 @@
}

.settings-tab {
  flex: 0 0 max-content;
  white-space: nowrap;
  width: max-content;
  padding: 0rem 1.4rem;
  background: none;
  border: none;
  border-bottom: 3px solid transparent;
@@ -34,6 +46,41 @@
  margin-bottom: -2px;
}

.settings-tabs-arrow {
  position: absolute;
  top: calc(50% - 0.2rem);
  transform: translateY(-50%);
  width: 2.6rem;
  height: 2.6rem;
  border-radius: 999px;
  border: none;
  background: transparent;
  color: var(--color-primary);
  font-size: 2rem;
  font-weight: 700;
  line-height: 1;
  display: inline-flex;
  align-items: center;
  justify-content: center;
  z-index: 2;
  opacity: 0;
  pointer-events: none;
  user-select: none;
  transition: opacity 0.2s ease;
}

.settings-tabs-arrow.visible {
  opacity: 1;
}

.settings-tabs-arrow-left {
  left: -1.6rem;
}

.settings-tabs-arrow-right {
  right: -1.6rem;
}

.settings-tab:hover {
  color: var(--color-primary);
  background: var(--color-bg-hover);
@@ -180,14 +227,30 @@
  }

  .settings-tabs {
    padding: 0 0.1rem;
  }

  .settings-tab {
    padding: 0.4rem 0.35rem;
  }

  .settings-tabs-wrapper {
    padding: 0 0.55rem;
  }

  .settings-tabs-arrow {
    top: calc(50% - 0.15rem);
    width: 2.2rem;
    height: 2.2rem;
    font-size: 1.65rem;
  }

  .settings-tabs-arrow-left {
    left: -1.2rem;
  }

  .settings-tabs-arrow-right {
    right: -1.2rem;
  }
}

.settings-theme-options {

View File

@@ -1,4 +1,9 @@
{
  "scripts": {
    "db:migrate": "node scripts/db-migrate.js",
    "db:migrate:status": "node scripts/db-migrate-status.js",
    "db:migrate:verify": "node scripts/db-migrate-verify.js"
  },
  "devDependencies": {
    "cross-env": "^10.1.0",
    "jest": "^30.2.0",

View File

@@ -0,0 +1,9 @@
# Migration Directory
This directory is the canonical location for SQL migrations.
- Use `npm run db:migrate` to apply pending migrations.
- Use `npm run db:migrate:status` to view applied/pending migrations.
- Use `npm run db:migrate:verify` to fail when pending migrations exist.
Do not place new canonical migrations under `backend/migrations`.
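A runner like the `db:migrate` script referenced above typically diffs the SQL files on disk against the names already recorded in an applied-migrations table, then executes the remainder in order. A minimal sketch of that pending-set computation — the actual `scripts/db-migrate.js` may differ:

```javascript
// Given the .sql files found in the migrations directory and the names
// already recorded as applied, return the migrations still to run, in
// lexicographic order (i.e. by their timestamp/sequence prefix).
function pendingMigrations(filesOnDisk, appliedNames) {
  const applied = new Set(appliedNames);
  return filesOnDisk
    .filter((f) => f.endsWith(".sql") && !applied.has(f))
    .sort();
}
```

`db:migrate:status` would print both sets, and `db:migrate:verify` would exit non-zero whenever this list is non-empty.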

View File

@@ -0,0 +1,10 @@
-- Add display_name column to users table
-- This allows users to have a friendly name separate from their username
ALTER TABLE users
ADD COLUMN IF NOT EXISTS display_name VARCHAR(100);
-- Set display_name to name for existing users (as default)
UPDATE users
SET display_name = name
WHERE display_name IS NULL;

View File

@@ -0,0 +1,20 @@
# Database Migration: Add Image Support
Run these SQL commands on your PostgreSQL database:
```sql
-- Add image columns to grocery_list table
ALTER TABLE grocery_list
ADD COLUMN item_image BYTEA,
ADD COLUMN image_mime_type VARCHAR(50);
-- Optional: Add index for faster queries when filtering by items with images
CREATE INDEX idx_grocery_list_has_image ON grocery_list ((item_image IS NOT NULL));
```
## To Verify:
```sql
\d grocery_list
```
You should see the new columns `item_image` and `image_mime_type`.

View File

@@ -0,0 +1,8 @@
-- Add modified_on column to grocery_list table
ALTER TABLE grocery_list
ADD COLUMN modified_on TIMESTAMP DEFAULT NOW();
-- Set modified_on to NOW() for existing records
UPDATE grocery_list
SET modified_on = NOW()
WHERE modified_on IS NULL;

View File

@@ -0,0 +1,7 @@
-- Add notes column to household_lists table
-- This allows users to add custom notes/descriptions to list items
ALTER TABLE household_lists
ADD COLUMN IF NOT EXISTS notes TEXT;
COMMENT ON COLUMN household_lists.notes IS 'Optional user notes/description for the item';

View File

@@ -0,0 +1,29 @@
-- Migration: Create item_classification table
-- This table stores classification data for items in the grocery_list table
-- Each row in grocery_list can have ONE corresponding classification row
CREATE TABLE IF NOT EXISTS item_classification (
id INTEGER PRIMARY KEY REFERENCES grocery_list(id) ON DELETE CASCADE,
item_type VARCHAR(50) NOT NULL,
item_group VARCHAR(100) NOT NULL,
zone VARCHAR(100),
confidence DECIMAL(3,2) DEFAULT 1.0 CHECK (confidence >= 0 AND confidence <= 1),
source VARCHAR(20) DEFAULT 'user' CHECK (source IN ('user', 'ml', 'default')),
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW()
);
-- Index for faster lookups by type
CREATE INDEX IF NOT EXISTS idx_item_classification_type ON item_classification(item_type);
-- Index for zone-based queries
CREATE INDEX IF NOT EXISTS idx_item_classification_zone ON item_classification(zone);
-- Comments
COMMENT ON TABLE item_classification IS 'Stores classification metadata for grocery list items';
COMMENT ON COLUMN item_classification.id IS 'Foreign key to grocery_list.id (one-to-one relationship)';
COMMENT ON COLUMN item_classification.item_type IS 'High-level category (produce, meat, dairy, etc.)';
COMMENT ON COLUMN item_classification.item_group IS 'Subcategory within item_type (filtered by type)';
COMMENT ON COLUMN item_classification.zone IS 'Store zone/location (optional)';
COMMENT ON COLUMN item_classification.confidence IS 'Confidence score 0-1 (1.0 for user-provided, lower for ML-predicted)';
COMMENT ON COLUMN item_classification.source IS 'Source of classification: user, ml, or default';

View File

@@ -0,0 +1,14 @@
CREATE TABLE IF NOT EXISTS sessions (
id VARCHAR(128) PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
expires_at TIMESTAMPTZ NOT NULL,
last_seen_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
user_agent TEXT
);
CREATE INDEX IF NOT EXISTS idx_sessions_user_id ON sessions(user_id);
CREATE INDEX IF NOT EXISTS idx_sessions_expires_at ON sessions(expires_at);
COMMENT ON TABLE sessions IS 'DB-backed application sessions';
COMMENT ON COLUMN sessions.id IS 'Opaque session identifier stored in HttpOnly cookie';
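A lookup against this table has to treat an expired row as a miss, since `expires_at` is enforced in application code rather than by the database. A hedged sketch of the check a session store would apply after selecting a row (the function name and `now` handling are assumptions beyond the schema):

```javascript
// Decide whether a sessions row (as selected from the table above) is
// still usable at time `now`. Missing or expired rows read as invalid;
// expired rows are typically also deleted opportunistically.
function isSessionValid(row, now = new Date()) {
  if (!row) return false;
  return new Date(row.expires_at).getTime() > now.getTime();
}
```

On a valid hit the store would then bump `last_seen_at` and attach the `user_id` to the request.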

View File

@@ -0,0 +1,397 @@
-- ============================================================================
-- Multi-Household & Multi-Store Architecture Migration
-- ============================================================================
-- This migration transforms the single-list app into a multi-tenant system
-- supporting multiple households, each with multiple stores.
--
-- IMPORTANT: Backup your database before running this migration!
-- pg_dump grocery_list > backup_$(date +%Y%m%d).sql
--
-- Migration Strategy:
-- 1. Create new tables
-- 2. Create "Main Household" for existing users
-- 3. Migrate existing data to new structure
-- 4. Update roles (keep users.role for system admin)
-- 5. Verify data integrity
-- 6. (Manual step) Drop old tables after verification
-- ============================================================================
BEGIN;
-- ============================================================================
-- STEP 1: CREATE NEW TABLES
-- ============================================================================
-- Households table
CREATE TABLE IF NOT EXISTS households (
id SERIAL PRIMARY KEY,
name VARCHAR(100) NOT NULL,
created_at TIMESTAMP DEFAULT NOW(),
created_by INTEGER REFERENCES users(id) ON DELETE SET NULL,
invite_code VARCHAR(20) UNIQUE NOT NULL,
code_expires_at TIMESTAMP
);
CREATE INDEX idx_households_invite_code ON households(invite_code);
COMMENT ON TABLE households IS 'Household groups (families, roommates, etc.)';
COMMENT ON COLUMN households.invite_code IS 'Unique code for inviting users to join household';
-- Store types table
CREATE TABLE IF NOT EXISTS stores (
id SERIAL PRIMARY KEY,
name VARCHAR(50) NOT NULL UNIQUE,
default_zones JSONB,
created_at TIMESTAMP DEFAULT NOW()
);
COMMENT ON TABLE stores IS 'Store types/chains (Costco, Target, Walmart, etc.)';
COMMENT ON COLUMN stores.default_zones IS 'JSON array of default zone names for this store type';
-- User-Household membership with per-household roles
CREATE TABLE IF NOT EXISTS household_members (
id SERIAL PRIMARY KEY,
household_id INTEGER REFERENCES households(id) ON DELETE CASCADE,
user_id INTEGER REFERENCES users(id) ON DELETE CASCADE,
role VARCHAR(20) NOT NULL CHECK (role IN ('admin', 'user')),
joined_at TIMESTAMP DEFAULT NOW(),
UNIQUE(household_id, user_id)
);
CREATE INDEX idx_household_members_user ON household_members(user_id);
CREATE INDEX idx_household_members_household ON household_members(household_id);
COMMENT ON TABLE household_members IS 'User membership in households with per-household roles';
COMMENT ON COLUMN household_members.role IS 'admin: full control, user: standard member';
-- Household-Store relationship
CREATE TABLE IF NOT EXISTS household_stores (
id SERIAL PRIMARY KEY,
household_id INTEGER REFERENCES households(id) ON DELETE CASCADE,
store_id INTEGER REFERENCES stores(id) ON DELETE CASCADE,
is_default BOOLEAN DEFAULT FALSE,
added_at TIMESTAMP DEFAULT NOW(),
UNIQUE(household_id, store_id)
);
CREATE INDEX idx_household_stores_household ON household_stores(household_id);
COMMENT ON TABLE household_stores IS 'Which stores each household shops at';
-- Master item catalog (shared across all households)
CREATE TABLE IF NOT EXISTS items (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL UNIQUE,
default_image BYTEA,
default_image_mime_type VARCHAR(50),
created_at TIMESTAMP DEFAULT NOW(),
usage_count INTEGER DEFAULT 0
);
CREATE INDEX idx_items_name ON items(name);
CREATE INDEX idx_items_usage_count ON items(usage_count DESC);
COMMENT ON TABLE items IS 'Master item catalog shared across all households';
COMMENT ON COLUMN items.usage_count IS 'Popularity metric for suggestions';
-- Household-specific grocery lists (per store)
CREATE TABLE IF NOT EXISTS household_lists (
id SERIAL PRIMARY KEY,
household_id INTEGER REFERENCES households(id) ON DELETE CASCADE,
store_id INTEGER REFERENCES stores(id) ON DELETE CASCADE,
item_id INTEGER REFERENCES items(id) ON DELETE CASCADE,
quantity INTEGER NOT NULL DEFAULT 1,
bought BOOLEAN DEFAULT FALSE,
custom_image BYTEA,
custom_image_mime_type VARCHAR(50),
added_by INTEGER REFERENCES users(id) ON DELETE SET NULL,
modified_on TIMESTAMP DEFAULT NOW(),
UNIQUE(household_id, store_id, item_id)
);
CREATE INDEX idx_household_lists_household_store ON household_lists(household_id, store_id);
CREATE INDEX idx_household_lists_bought ON household_lists(household_id, store_id, bought);
CREATE INDEX idx_household_lists_modified ON household_lists(modified_on DESC);
COMMENT ON TABLE household_lists IS 'Grocery lists scoped to household + store combination';
-- Household-specific item classifications (per store)
CREATE TABLE IF NOT EXISTS household_item_classifications (
id SERIAL PRIMARY KEY,
household_id INTEGER REFERENCES households(id) ON DELETE CASCADE,
store_id INTEGER REFERENCES stores(id) ON DELETE CASCADE,
item_id INTEGER REFERENCES items(id) ON DELETE CASCADE,
item_type VARCHAR(50),
item_group VARCHAR(100),
zone VARCHAR(100),
confidence DECIMAL(3,2) DEFAULT 1.0 CHECK (confidence >= 0 AND confidence <= 1),
source VARCHAR(20) DEFAULT 'user' CHECK (source IN ('user', 'ml', 'default')),
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW(),
UNIQUE(household_id, store_id, item_id)
);
CREATE INDEX idx_household_classifications ON household_item_classifications(household_id, store_id);
CREATE INDEX idx_household_classifications_type ON household_item_classifications(item_type);
CREATE INDEX idx_household_classifications_zone ON household_item_classifications(zone);
COMMENT ON TABLE household_item_classifications IS 'Item classifications scoped to household + store';
-- History tracking
CREATE TABLE IF NOT EXISTS household_list_history (
id SERIAL PRIMARY KEY,
household_list_id INTEGER REFERENCES household_lists(id) ON DELETE CASCADE,
quantity INTEGER NOT NULL,
added_by INTEGER REFERENCES users(id) ON DELETE SET NULL,
added_on TIMESTAMP DEFAULT NOW()
);
CREATE INDEX idx_household_history_list ON household_list_history(household_list_id);
CREATE INDEX idx_household_history_user ON household_list_history(added_by);
CREATE INDEX idx_household_history_date ON household_list_history(added_on DESC);
COMMENT ON TABLE household_list_history IS 'Tracks who added items and when';
-- ============================================================================
-- STEP 2: CREATE DEFAULT HOUSEHOLD AND STORE
-- ============================================================================
-- Create default household for existing users
INSERT INTO households (name, created_by, invite_code)
SELECT
'Main Household',
(SELECT id FROM users WHERE role = 'admin' LIMIT 1), -- First admin as creator
'MAIN' || LPAD(FLOOR(RANDOM() * 1000000)::TEXT, 6, '0') -- Random 6-digit code
WHERE NOT EXISTS (SELECT 1 FROM households WHERE name = 'Main Household');
-- Create default Costco store
INSERT INTO stores (name, default_zones)
VALUES (
'Costco',
'{
"zones": [
"Entrance & Seasonal",
"Fresh Produce",
"Meat & Seafood",
"Dairy & Refrigerated",
"Deli & Prepared Foods",
"Bakery & Bread",
"Frozen Foods",
"Beverages",
"Snacks & Candy",
"Pantry & Dry Goods",
"Health & Beauty",
"Household & Cleaning",
"Other"
]
}'::jsonb
)
ON CONFLICT (name) DO NOTHING;
-- Link default household to default store
INSERT INTO household_stores (household_id, store_id, is_default)
SELECT
(SELECT id FROM households WHERE name = 'Main Household'),
(SELECT id FROM stores WHERE name = 'Costco'),
TRUE
WHERE NOT EXISTS (
SELECT 1 FROM household_stores
WHERE household_id = (SELECT id FROM households WHERE name = 'Main Household')
);
-- ============================================================================
-- STEP 3: MIGRATE USERS TO HOUSEHOLD MEMBERS
-- ============================================================================
-- Add all existing users to Main Household
-- Old admins become household admins, others become standard users
INSERT INTO household_members (household_id, user_id, role)
SELECT
(SELECT id FROM households WHERE name = 'Main Household'),
id,
CASE
WHEN role = 'admin' THEN 'admin'
ELSE 'user'
END
FROM users
WHERE NOT EXISTS (
SELECT 1 FROM household_members hm
WHERE hm.user_id = users.id
AND hm.household_id = (SELECT id FROM households WHERE name = 'Main Household')
);
-- ============================================================================
-- STEP 4: MIGRATE ITEMS TO MASTER CATALOG
-- ============================================================================
-- Extract unique items from grocery_list into master items table
INSERT INTO items (name, default_image, default_image_mime_type, created_at, usage_count)
SELECT
LOWER(TRIM(item_name)) as name,
item_image,
image_mime_type,
MIN(modified_on) as created_at,
COUNT(*) as usage_count
FROM grocery_list
WHERE NOT EXISTS (
SELECT 1 FROM items WHERE LOWER(items.name) = LOWER(TRIM(grocery_list.item_name))
)
GROUP BY LOWER(TRIM(item_name)), item_image, image_mime_type
ON CONFLICT (name) DO NOTHING;
-- ============================================================================
-- STEP 5: MIGRATE GROCERY_LIST TO HOUSEHOLD_LISTS
-- ============================================================================
-- Migrate current list to household_lists
INSERT INTO household_lists (
household_id,
store_id,
item_id,
quantity,
bought,
custom_image,
custom_image_mime_type,
added_by,
modified_on
)
SELECT
(SELECT id FROM households WHERE name = 'Main Household'),
(SELECT id FROM stores WHERE name = 'Costco'),
i.id,
gl.quantity,
gl.bought,
CASE WHEN gl.item_image != i.default_image THEN gl.item_image ELSE NULL END, -- Only store if different
CASE WHEN gl.item_image != i.default_image THEN gl.image_mime_type ELSE NULL END,
gl.added_by,
gl.modified_on
FROM grocery_list gl
JOIN items i ON LOWER(i.name) = LOWER(TRIM(gl.item_name))
WHERE NOT EXISTS (
SELECT 1 FROM household_lists hl
WHERE hl.household_id = (SELECT id FROM households WHERE name = 'Main Household')
AND hl.store_id = (SELECT id FROM stores WHERE name = 'Costco')
AND hl.item_id = i.id
)
ON CONFLICT (household_id, store_id, item_id) DO NOTHING;
-- ============================================================================
-- STEP 6: MIGRATE ITEM_CLASSIFICATION TO HOUSEHOLD_ITEM_CLASSIFICATIONS
-- ============================================================================
-- Migrate classifications
INSERT INTO household_item_classifications (
household_id,
store_id,
item_id,
item_type,
item_group,
zone,
confidence,
source,
created_at,
updated_at
)
SELECT
(SELECT id FROM households WHERE name = 'Main Household'),
(SELECT id FROM stores WHERE name = 'Costco'),
i.id,
ic.item_type,
ic.item_group,
ic.zone,
ic.confidence,
ic.source,
ic.created_at,
ic.updated_at
FROM item_classification ic
JOIN grocery_list gl ON ic.id = gl.id
JOIN items i ON LOWER(i.name) = LOWER(TRIM(gl.item_name))
WHERE NOT EXISTS (
SELECT 1 FROM household_item_classifications hic
WHERE hic.household_id = (SELECT id FROM households WHERE name = 'Main Household')
AND hic.store_id = (SELECT id FROM stores WHERE name = 'Costco')
AND hic.item_id = i.id
)
ON CONFLICT (household_id, store_id, item_id) DO NOTHING;
-- ============================================================================
-- STEP 7: MIGRATE GROCERY_HISTORY TO HOUSEHOLD_LIST_HISTORY
-- ============================================================================
-- Migrate history records
INSERT INTO household_list_history (household_list_id, quantity, added_by, added_on)
SELECT
hl.id,
gh.quantity,
gh.added_by,
gh.added_on
FROM grocery_history gh
JOIN grocery_list gl ON gh.list_item_id = gl.id
JOIN items i ON LOWER(i.name) = LOWER(TRIM(gl.item_name))
JOIN household_lists hl ON hl.item_id = i.id
AND hl.household_id = (SELECT id FROM households WHERE name = 'Main Household')
AND hl.store_id = (SELECT id FROM stores WHERE name = 'Costco')
WHERE NOT EXISTS (
SELECT 1 FROM household_list_history hlh
WHERE hlh.household_list_id = hl.id
AND hlh.added_by = gh.added_by
AND hlh.added_on = gh.added_on
);
-- ============================================================================
-- STEP 8: UPDATE USER ROLES (SYSTEM-WIDE)
-- ============================================================================
-- Update system roles: admin → system_admin, others → user
UPDATE users
SET role = 'system_admin'
WHERE role = 'admin';
UPDATE users
SET role = 'user'
WHERE role IN ('editor', 'viewer');
-- ============================================================================
-- VERIFICATION QUERIES
-- ============================================================================
-- Run these to verify migration success:
-- Check household created
-- SELECT * FROM households;
-- Check all users added to household
-- SELECT u.username, u.role as system_role, hm.role as household_role
-- FROM users u
-- JOIN household_members hm ON u.id = hm.user_id
-- ORDER BY u.id;
-- Check items migrated
-- SELECT COUNT(*) as total_items FROM items;
-- SELECT COUNT(*) as original_items FROM (SELECT DISTINCT item_name FROM grocery_list) sub;
-- Check lists migrated
-- SELECT COUNT(*) as new_lists FROM household_lists;
-- SELECT COUNT(*) as old_lists FROM grocery_list;
-- Check classifications migrated
-- SELECT COUNT(*) as new_classifications FROM household_item_classifications;
-- SELECT COUNT(*) as old_classifications FROM item_classification;
-- Check history migrated
-- SELECT COUNT(*) as new_history FROM household_list_history;
-- SELECT COUNT(*) as old_history FROM grocery_history;
-- ============================================================================
-- MANUAL STEPS AFTER VERIFICATION
-- ============================================================================
-- After verifying data integrity, uncomment and run these to clean up:
-- DROP TABLE IF EXISTS grocery_history CASCADE;
-- DROP TABLE IF EXISTS item_classification CASCADE;
-- DROP TABLE IF EXISTS grocery_list CASCADE;
COMMIT;
-- ============================================================================
-- ROLLBACK (if something goes wrong)
-- ============================================================================
-- ROLLBACK;
-- Then restore from backup:
-- psql -U your_user -d grocery_list < backup_YYYYMMDD.sql


@@ -1,80 +1,21 @@
 @echo off
-REM Multi-Household Migration Runner (Windows)
-REM This script handles the complete migration process with safety checks
-setlocal enabledelayedexpansion
-
-REM Database configuration
-set DB_USER=postgres
-set DB_HOST=192.168.7.112
-set DB_NAME=grocery
-set PGPASSWORD=Asdwed123A.
-set BACKUP_DIR=backend\migrations\backups
-set TIMESTAMP=%date:~-4%%date:~-10,2%%date:~-7,2%_%time:~0,2%%time:~3,2%%time:~6,2%
-set TIMESTAMP=%TIMESTAMP: =0%
-set BACKUP_FILE=%BACKUP_DIR%\backup_%TIMESTAMP%.sql
-
-echo ================================================
-echo Multi-Household Architecture Migration
-echo ================================================
-echo.
-
-REM Create backup directory
-if not exist "%BACKUP_DIR%" mkdir "%BACKUP_DIR%"
-
-REM Step 1: Backup (SKIPPED - using database template copy)
-echo [1/5] Backup: SKIPPED (using 'grocery' database copy)
-echo.
-
-REM Step 2: Show current stats
-echo [2/5] Current database statistics:
-psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% -c "SELECT 'Users' as table_name, COUNT(*) as count FROM users UNION ALL SELECT 'Grocery Items', COUNT(*) FROM grocery_list UNION ALL SELECT 'Classifications', COUNT(*) FROM item_classification UNION ALL SELECT 'History Records', COUNT(*) FROM grocery_history;"
-echo.
-
-REM Step 3: Confirm
-echo [3/5] Ready to run migration
-echo Database: %DB_NAME% on %DB_HOST%
-echo Backup: %BACKUP_FILE%
-echo.
-set /p CONFIRM="Continue with migration? (yes/no): "
-if /i not "%CONFIRM%"=="yes" (
-    echo Migration cancelled.
-    exit /b 0
-)
-echo.
-
-REM Step 4: Run migration
-echo [4/5] Running migration script...
-psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% -f backend\migrations\multi_household_architecture.sql
-if %errorlevel% neq 0 (
-    echo [ERROR] Migration failed! Rolling back...
-    echo Restoring from backup: %BACKUP_FILE%
-    psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% < "%BACKUP_FILE%"
-    exit /b 1
-)
-echo [OK] Migration completed successfully
-echo.
-
-REM Step 5: Verification
-echo [5/5] Verifying migration...
-psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% -c "SELECT id, name, invite_code FROM households;"
-psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% -c "SELECT u.id, u.username, u.role as system_role, hm.role as household_role FROM users u LEFT JOIN household_members hm ON u.id = hm.user_id ORDER BY u.id LIMIT 10;"
-psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% -c "SELECT 'Items' as metric, COUNT(*)::text as count FROM items UNION ALL SELECT 'Household Lists', COUNT(*)::text FROM household_lists UNION ALL SELECT 'Classifications', COUNT(*)::text FROM household_item_classifications UNION ALL SELECT 'History Records', COUNT(*)::text FROM household_list_history;"
-echo.
-
-echo ================================================
-echo Migration Complete!
-echo ================================================
-echo.
-echo Next Steps:
-echo 1. Review verification results above
-echo 2. Test the application
-echo 3. If issues found, rollback with:
-echo    psql -h %DB_HOST% -U %DB_USER% -d %DB_NAME% ^< %BACKUP_FILE%
-echo 4. If successful, proceed to Sprint 2 (Backend API)
-echo.
-echo Backup location: %BACKUP_FILE%
-echo.
-pause
+setlocal
+
+if "%DATABASE_URL%"=="" (
+    echo DATABASE_URL is required. Aborting.
+    exit /b 1
+)
+
+echo Checking migration status...
+call npm run db:migrate:status
+if errorlevel 1 exit /b 1
+
+echo Applying pending migrations...
+call npm run db:migrate
+if errorlevel 1 exit /b 1
+
+echo Final migration status...
+call npm run db:migrate:status
+if errorlevel 1 exit /b 1
+
+echo Done.


@@ -1,146 +1,24 @@
 #!/bin/bash
-# Multi-Household Migration Runner
-# This script handles the complete migration process with safety checks
-
-set -e # Exit on error
-
-# Colors for output
-RED='\033[0;31m'
-GREEN='\033[0;32m'
-YELLOW='\033[1;33m'
-BLUE='\033[0;34m'
-NC='\033[0m' # No Color
-
-# Database configuration (from .env)
-DB_USER="postgres"
-DB_HOST="192.168.7.112"
-DB_NAME="grocery"
-export PGPASSWORD="Asdwed123A."
-BACKUP_DIR="./backend/migrations/backups"
-TIMESTAMP=$(date +%Y%m%d_%H%M%S)
-BACKUP_FILE="${BACKUP_DIR}/backup_${TIMESTAMP}.sql"
-
-echo -e "${BLUE}╔════════════════════════════════════════════════╗${NC}"
-echo -e "${BLUE}║ Multi-Household Architecture Migration ║${NC}"
-echo -e "${BLUE}╚════════════════════════════════════════════════╝${NC}"
-echo ""
-
-# Create backup directory if it doesn't exist
-mkdir -p "$BACKUP_DIR"
-
-# Step 1: Backup
-echo -e "${YELLOW}[1/5] Creating database backup...${NC}"
-pg_dump -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" > "$BACKUP_FILE"
-if [ $? -eq 0 ]; then
-    echo -e "${GREEN}✓ Backup created: $BACKUP_FILE${NC}"
-else
-    echo -e "${RED}✗ Backup failed!${NC}"
-    exit 1
-fi
-echo ""
-
-# Step 2: Show current stats
-echo -e "${YELLOW}[2/5] Current database statistics:${NC}"
-psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" -c "
-SELECT
-    'Users' as table_name, COUNT(*) as count FROM users
-UNION ALL
-SELECT 'Grocery Items', COUNT(*) FROM grocery_list
-UNION ALL
-SELECT 'Classifications', COUNT(*) FROM item_classification
-UNION ALL
-SELECT 'History Records', COUNT(*) FROM grocery_history;
-"
-echo ""
-
-# Step 3: Confirm
-echo -e "${YELLOW}[3/5] Ready to run migration${NC}"
-echo -e "Database: ${BLUE}$DB_NAME${NC} on ${BLUE}$DB_HOST${NC}"
-echo -e "Backup: ${GREEN}$BACKUP_FILE${NC}"
-echo ""
-read -p "Continue with migration? (yes/no): " -r
-echo ""
-if [[ ! $REPLY =~ ^[Yy]es$ ]]; then
-    echo -e "${RED}Migration cancelled.${NC}"
-    exit 1
-fi
-
-# Step 4: Run migration
-echo -e "${YELLOW}[4/5] Running migration script...${NC}"
-psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" -f backend/migrations/multi_household_architecture.sql
-if [ $? -eq 0 ]; then
-    echo -e "${GREEN}✓ Migration completed successfully${NC}"
-else
-    echo -e "${RED}✗ Migration failed! Rolling back...${NC}"
-    echo -e "${YELLOW}Restoring from backup: $BACKUP_FILE${NC}"
-    psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" < "$BACKUP_FILE"
-    exit 1
-fi
-echo ""
-
-# Step 5: Verification
-echo -e "${YELLOW}[5/5] Verifying migration...${NC}"
-psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" << 'EOF'
-\echo ''
-\echo '=== Household Created ==='
-SELECT id, name, invite_code FROM households;
-\echo ''
-\echo '=== User Roles ==='
-SELECT u.id, u.username, u.role as system_role, hm.role as household_role
-FROM users u
-LEFT JOIN household_members hm ON u.id = hm.user_id
-ORDER BY u.id
-LIMIT 10;
-\echo ''
-\echo '=== Migration Counts ==='
-SELECT
-    'Items (Master Catalog)' as metric, COUNT(*)::text as count FROM items
-UNION ALL
-SELECT 'Household Lists', COUNT(*)::text FROM household_lists
-UNION ALL
-SELECT 'Classifications', COUNT(*)::text FROM household_item_classifications
-UNION ALL
-SELECT 'History Records', COUNT(*)::text FROM household_list_history
-UNION ALL
-SELECT 'Household Members', COUNT(*)::text FROM household_members
-UNION ALL
-SELECT 'Stores', COUNT(*)::text FROM stores;
-\echo ''
-\echo '=== Data Integrity Checks ==='
-\echo 'Users without household membership (should be 0):'
-SELECT COUNT(*) FROM users u
-LEFT JOIN household_members hm ON u.id = hm.user_id
-WHERE hm.id IS NULL;
-\echo ''
-\echo 'Lists without valid items (should be 0):'
-SELECT COUNT(*) FROM household_lists hl
-LEFT JOIN items i ON hl.item_id = i.id
-WHERE i.id IS NULL;
-\echo ''
-\echo 'History without valid lists (should be 0):'
-SELECT COUNT(*) FROM household_list_history hlh
-LEFT JOIN household_lists hl ON hlh.household_list_id = hl.id
-WHERE hl.id IS NULL;
-EOF
-echo ""
-
-echo -e "${GREEN}╔════════════════════════════════════════════════╗${NC}"
-echo -e "${GREEN}║ Migration Complete! ║${NC}"
-echo -e "${GREEN}╚════════════════════════════════════════════════╝${NC}"
-echo ""
-echo -e "${BLUE}Next Steps:${NC}"
-echo -e "1. Review verification results above"
-echo -e "2. Test the application"
-echo -e "3. If issues found, rollback with:"
-echo -e "   ${YELLOW}psql -h $DB_HOST -U $DB_USER -d $DB_NAME < $BACKUP_FILE${NC}"
-echo -e "4. If successful, proceed to Sprint 2 (Backend API)"
-echo ""
-echo -e "${YELLOW}Backup location: $BACKUP_FILE${NC}"
-echo ""
+set -euo pipefail
+
+if ! command -v node >/dev/null 2>&1; then
+    echo "node is required."
+    exit 1
+fi
+
+if [ -z "${DATABASE_URL:-}" ]; then
+    echo "DATABASE_URL is required. Aborting."
+    exit 1
+fi
+
+echo "Checking migration status..."
+npm run db:migrate:status
+
+echo "Applying pending migrations..."
+npm run db:migrate
+
+echo "Final migration status..."
+npm run db:migrate:status
+
+echo "Done."


@@ -0,0 +1,108 @@
"use strict";
const fs = require("fs");
const path = require("path");
const { spawnSync } = require("child_process");
const migrationsDir = path.resolve(
__dirname,
"..",
"packages",
"db",
"migrations"
);
function ensureDatabaseUrl() {
const databaseUrl = process.env.DATABASE_URL;
if (!databaseUrl) {
throw new Error("DATABASE_URL is required.");
}
return databaseUrl;
}
function ensurePsql() {
const result = spawnSync("psql", ["--version"], { stdio: "pipe" });
if (result.error || result.status !== 0) {
throw new Error("psql executable was not found in PATH.");
}
}
function ensureMigrationsDir() {
if (!fs.existsSync(migrationsDir)) {
throw new Error(`Migrations directory not found: ${migrationsDir}`);
}
}
function getMigrationFiles() {
ensureMigrationsDir();
return fs
.readdirSync(migrationsDir)
.filter((file) => file.endsWith(".sql"))
.sort((a, b) => a.localeCompare(b));
}
function runPsql(databaseUrl, args) {
const result = spawnSync("psql", [databaseUrl, ...args], {
stdio: "pipe",
encoding: "utf8",
});
if (result.status !== 0) {
const stderr = (result.stderr || "").trim();
const stdout = (result.stdout || "").trim();
const details = [stderr, stdout].filter(Boolean).join("\n");
throw new Error(details || "psql command failed");
}
return result.stdout || "";
}
function escapeSqlLiteral(value) {
return value.replace(/'/g, "''");
}
function ensureSchemaMigrationsTable(databaseUrl) {
runPsql(databaseUrl, [
"-v",
"ON_ERROR_STOP=1",
"-c",
"CREATE TABLE IF NOT EXISTS schema_migrations (filename TEXT PRIMARY KEY, applied_at TIMESTAMPTZ NOT NULL DEFAULT NOW());",
]);
}
function getAppliedMigrations(databaseUrl) {
const output = runPsql(databaseUrl, [
"-At",
"-v",
"ON_ERROR_STOP=1",
"-c",
"SELECT filename FROM schema_migrations ORDER BY filename ASC;",
]);
return new Set(
output
.split(/\r?\n/)
.map((line) => line.trim())
.filter(Boolean)
);
}
function applyMigration(databaseUrl, filename) {
const fullPath = path.join(migrationsDir, filename);
runPsql(databaseUrl, ["-v", "ON_ERROR_STOP=1", "-f", fullPath]);
runPsql(databaseUrl, [
"-v",
"ON_ERROR_STOP=1",
"-c",
`INSERT INTO schema_migrations (filename) VALUES ('${escapeSqlLiteral(
filename
)}') ON CONFLICT DO NOTHING;`,
]);
}
module.exports = {
applyMigration,
ensureDatabaseUrl,
ensurePsql,
ensureSchemaMigrationsTable,
getAppliedMigrations,
getMigrationFiles,
migrationsDir,
};


@@ -0,0 +1,42 @@
"use strict";
const {
ensureDatabaseUrl,
ensurePsql,
ensureSchemaMigrationsTable,
getAppliedMigrations,
getMigrationFiles,
} = require("./db-migrate-common");
function main() {
if (process.argv.includes("--help")) {
console.log("Usage: npm run db:migrate:status");
process.exit(0);
}
const databaseUrl = ensureDatabaseUrl();
ensurePsql();
ensureSchemaMigrationsTable(databaseUrl);
const files = getMigrationFiles();
const applied = getAppliedMigrations(databaseUrl);
let pendingCount = 0;
for (const file of files) {
const status = applied.has(file) ? "APPLIED" : "PENDING";
if (status === "PENDING") pendingCount += 1;
console.log(`${status} ${file}`);
}
console.log("");
console.log(`Total: ${files.length}`);
console.log(`Applied: ${files.length - pendingCount}`);
console.log(`Pending: ${pendingCount}`);
}
try {
main();
} catch (error) {
console.error(error.message);
process.exit(1);
}


@@ -0,0 +1,41 @@
"use strict";
const {
ensureDatabaseUrl,
ensurePsql,
ensureSchemaMigrationsTable,
getAppliedMigrations,
getMigrationFiles,
} = require("./db-migrate-common");
function main() {
if (process.argv.includes("--help")) {
console.log("Usage: npm run db:migrate:verify");
process.exit(0);
}
const databaseUrl = ensureDatabaseUrl();
ensurePsql();
ensureSchemaMigrationsTable(databaseUrl);
const files = getMigrationFiles();
const applied = getAppliedMigrations(databaseUrl);
const pending = files.filter((file) => !applied.has(file));
if (pending.length > 0) {
console.error("Pending migrations detected:");
for (const file of pending) {
console.error(`- ${file}`);
}
process.exit(1);
}
console.log("Migration verification passed. No pending migrations.");
}
try {
main();
} catch (error) {
console.error(error.message);
process.exit(1);
}

scripts/db-migrate.js

@@ -0,0 +1,44 @@
"use strict";
const {
applyMigration,
ensureDatabaseUrl,
ensurePsql,
ensureSchemaMigrationsTable,
getAppliedMigrations,
getMigrationFiles,
} = require("./db-migrate-common");
function main() {
if (process.argv.includes("--help")) {
console.log("Usage: npm run db:migrate");
process.exit(0);
}
const databaseUrl = ensureDatabaseUrl();
ensurePsql();
ensureSchemaMigrationsTable(databaseUrl);
const files = getMigrationFiles();
const applied = getAppliedMigrations(databaseUrl);
const pending = files.filter((file) => !applied.has(file));
if (pending.length === 0) {
console.log("No pending migrations.");
return;
}
for (const file of pending) {
console.log(`Applying: ${file}`);
applyMigration(databaseUrl, file);
}
console.log(`Applied ${pending.length} migration(s).`);
}
try {
main();
} catch (error) {
console.error(error.message);
process.exit(1);
}