┌─────────────────────────────────────────────────────────────────┐
│                     PRICE MONITORING SYSTEM                     │
└─────────────────────────────────────────────────────────────────┘

┌──────────────┐      ┌──────────────┐      ┌──────────────────┐
│   Supabase   │      │     Edge     │      │      Python      │
│   Cron Job   │─────▶│   Function   │─────▶│     Backend      │
│   (Hourly)   │      │    (Cron)    │      │    (FastAPI)     │
└──────────────┘      └──────────────┘      └──────────────────┘
                                                     │
         ┌───────────────────────────────────────────┼───────────────────────────┐
         │                                           │                           │
         ▼                                           ▼                           ▼
┌──────────────────┐                    ┌──────────────────┐          ┌──────────────────┐
│  Firecrawl API   │                    │ Credits Service  │          │  AI Call Logger  │
│  (Web Scraping)  │                    │ (Debit Credits)  │          │   (Usage Logs)   │
└──────────────────┘                    └──────────────────┘          └──────────────────┘
         │                                           │                           │
         └───────────────────────────────────────────┼───────────────────────────┘
                                                     ▼
                                       ┌──────────────────────┐
                                       │  Supabase Database   │
                                       │  - price_history     │
                                       │  - competitor_sources│
                                       │  - price_alerts      │
                                       │  - ai_usage_logs     │
                                       └──────────────────────┘
Run supabase db push from the project root to apply all migrations. Verify the schema with supabase db diff.
Check that these tables exist:
- price_monitoring_products
- price_history
- competitor_sources
- price_monitoring_jobs
- price_alerts
- price_alert_history

Check that these functions exist:
- get_products_due_for_monitoring()
- update_next_check_time(p_monitoring_id, p_frequency)
- should_trigger_alert(p_alert_id, p_old_price, p_new_price)

Add the following to mivaa-pdf-extractor/.env:
- FIRECRAWL_API_KEY - Your Firecrawl API key
- SUPABASE_URL - Supabase project URL (should already exist)
- SUPABASE_SERVICE_ROLE_KEY - Service role key (should already exist)
- SUPABASE_JWT_SECRET - JWT secret (should already exist)
- MATERIAL_KAI_API_KEY - Material Kai API key (should already exist)
- MATERIAL_KAI_WORKSPACE_ID - Workspace ID (should already exist)

Start the backend with python -m uvicorn app.main:app --reload --port 8000. Verify the health endpoint responds at http://localhost:8000/health. The price monitoring endpoint at http://localhost:8000/api/v1/price-monitoring/status/test-id should return 401 without authentication.
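The variable names above can be sanity-checked at startup. The sketch below is illustrative, not part of the backend; the helper name missing_env_vars is invented here:

```python
import os

# Required variables for mivaa-pdf-extractor/.env (names from the list above).
REQUIRED_ENV_VARS = [
    "FIRECRAWL_API_KEY",
    "SUPABASE_URL",
    "SUPABASE_SERVICE_ROLE_KEY",
    "SUPABASE_JWT_SECRET",
    "MATERIAL_KAI_API_KEY",
    "MATERIAL_KAI_WORKSPACE_ID",
]

def missing_env_vars(env: dict) -> list:
    """Return the required variable names that are absent or empty."""
    return [name for name in REQUIRED_ENV_VARS if not env.get(name)]

# Example: report gaps in the real environment without failing hard.
gaps = missing_env_vars(dict(os.environ))
if gaps:
    print("Missing:", ", ".join(gaps))
```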
Use supabase secrets set to configure the following secrets: SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, PYTHON_BACKEND_URL (localhost for dev or the production URL), and CRON_SECRET (generate with openssl rand -hex 32). Verify with supabase secrets list.
Run supabase functions deploy price-monitoring-cron and verify with supabase functions list.
Send a POST request to https://your-project.supabase.co/functions/v1/price-monitoring-cron with the x-cron-secret header set to your CRON_SECRET value. With no products configured yet, the expected response will show success with all stats at zero.
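A curl call works fine here; as a minimal stdlib sketch, the request can also be built in Python. The helper name build_cron_trigger and the placeholder values are illustrative:

```python
import urllib.request

def build_cron_trigger(project_url: str, cron_secret: str) -> urllib.request.Request:
    """Build the manual-trigger POST described above; nothing is sent here."""
    return urllib.request.Request(
        url=f"{project_url}/functions/v1/price-monitoring-cron",
        data=b"{}",  # empty JSON body
        method="POST",
        headers={
            "Content-Type": "application/json",
            "x-cron-secret": cron_secret,  # must match the CRON_SECRET secret
        },
    )

req = build_cron_trigger("https://your-project.supabase.co", "replace-with-your-secret")
# To actually send it:  urllib.request.urlopen(req)
```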
Schedule the Edge Function with pg_cron:

- Job name: price-monitoring-hourly
- Schedule: 0 * * * * (every hour)
- Action: net.http_post to call the price-monitoring-cron edge function URL with the Content-Type and x-cron-secret headers.

Execute a SELECT cron.schedule(...) statement with the job name 'price-monitoring-hourly', schedule '0 * * * *', and a dollar-quoted SQL block that calls net.http_post with the edge function URL and the required headers including the cron secret.
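The statement described above can be templated before running it in the SQL editor. This sketch only renders the SQL as a string; the exact net.http_post named-parameter style may differ slightly across pg_net versions, so treat it as a starting point:

```python
def render_cron_schedule_sql(function_url: str, cron_secret: str) -> str:
    """Render the cron.schedule statement described above as a string."""
    return (
        "SELECT cron.schedule(\n"
        "  'price-monitoring-hourly',\n"
        "  '0 * * * *',\n"
        "  $$\n"
        "  SELECT net.http_post(\n"
        f"    url := '{function_url}',\n"
        "    headers := jsonb_build_object(\n"
        "      'Content-Type', 'application/json',\n"
        f"      'x-cron-secret', '{cron_secret}'\n"
        "    ),\n"
        "    body := '{}'::jsonb\n"
        "  );\n"
        "  $$\n"
        ");"
    )

print(render_cron_schedule_sql(
    "https://your-project.supabase.co/functions/v1/price-monitoring-cron",
    "your-cron-secret",
))
```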
Query SELECT * FROM cron.job; to list all scheduled cron jobs. Query SELECT * FROM cron.job_run_details ORDER BY start_time DESC LIMIT 10; to check recent execution history.
Send a POST request to http://localhost:8000/api/v1/price-monitoring/start with your JWT token in the Authorization header and a body specifying product_id, frequency: 'hourly', and enabled: true.
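The body and header for this call can be sketched as plain dicts. The Bearer scheme is an assumption, and both helper names are invented for illustration:

```python
def auth_headers(jwt_token: str) -> dict:
    """Authorization header carrying the user's JWT (Bearer scheme assumed)."""
    return {"Authorization": f"Bearer {jwt_token}", "Content-Type": "application/json"}

def build_start_monitoring_payload(product_id: str) -> dict:
    """Body for POST /api/v1/price-monitoring/start, per the fields above."""
    return {"product_id": product_id, "frequency": "hourly", "enabled": True}
```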
Send POST requests to http://localhost:8000/api/v1/price-monitoring/sources with your JWT token. Each request body specifies product_id, source_name, source_url, and optionally scraping_config with settings like waitFor (milliseconds) and timeout.
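A sketch of the request body for a source, using the fields listed above. The helper is illustrative; the text gives waitFor in milliseconds, and timeout is assumed to use the same unit:

```python
def build_source_payload(product_id: str, source_name: str, source_url: str,
                         wait_for_ms=None, timeout_ms=None) -> dict:
    """Body for POST /api/v1/price-monitoring/sources, per the fields above.

    scraping_config is only included when at least one setting is given.
    """
    payload = {
        "product_id": product_id,
        "source_name": source_name,
        "source_url": source_url,
    }
    config = {}
    if wait_for_ms is not None:
        config["waitFor"] = wait_for_ms
    if timeout_ms is not None:
        config["timeout"] = timeout_ms
    if config:
        payload["scraping_config"] = config
    return payload
```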
Send a POST request to http://localhost:8000/api/v1/price-monitoring/check-now with your JWT token and a body specifying product_id and product_name. The expected response includes success, message, job_id, sources_checked, prices_found, and credits_consumed.
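The response shape can be checked against the field names listed above. This validation helper is invented for illustration; the sample values are made up:

```python
import json

# Field names from the expected check-now response described above.
EXPECTED_RESPONSE_KEYS = {
    "success", "message", "job_id",
    "sources_checked", "prices_found", "credits_consumed",
}

def missing_response_fields(raw_json: str) -> list:
    """Return expected fields absent from a check-now response body."""
    return sorted(EXPECTED_RESPONSE_KEYS - json.loads(raw_json).keys())

sample = json.dumps({
    "success": True, "message": "Price check completed", "job_id": "job-1",
    "sources_checked": 2, "prices_found": 2, "credits_consumed": 1,
})
print(missing_response_fields(sample))  # → []
```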
Query the price_history table filtering by product_id and ordering by scraped_at descending to confirm price records were created. Query price_monitoring_jobs similarly to confirm job records. Query ai_usage_logs filtering by provider 'firecrawl' to confirm credit usage was logged.
Manually trigger the cron function by sending a POST request with the cron secret header. Check Edge Function logs with supabase functions logs price-monitoring-cron --tail. Check Python backend logs in the backend terminal.
Query SELECT * FROM cron.job WHERE jobname = 'price-monitoring-hourly'; to confirm the job is scheduled. Query cron.job_run_details filtering by the jobid to see fields including jobid, runid, job_pid, database, username, command, status, return_message, start_time, and end_time.
Use supabase functions logs price-monitoring-cron --tail for real-time logs or supabase functions logs price-monitoring-cron --limit 100 for the last 100 entries.
View logs via systemd (sudo journalctl -u mivaa-backend -f), Docker (docker logs -f mivaa-backend), or directly in the terminal if running locally.
Symptoms: No logs in Edge Function, no database updates
Solutions:
- Verify the cron job is scheduled: SELECT * FROM cron.job;
- Verify the Edge Function is deployed: supabase functions list
- Check recent runs: SELECT * FROM cron.job_run_details;

Symptoms: "Unauthorized" error in logs
Solutions:
- Verify CRON_SECRET is set correctly
- Regenerate the secret with openssl rand -hex 32 and set it via supabase secrets set CRON_SECRET=...

Symptoms: Edge Function logs show connection errors
Solutions:
- Verify PYTHON_BACKEND_URL is correct and the backend is reachable

Symptoms: "Found 0 products due for monitoring"
Solutions:
- Verify the price_monitoring_products table has records
- Verify monitoring_enabled = true
- Verify next_check_at <= NOW()
- Query the table filtering by monitoring_enabled = true to see product_id, monitoring_frequency, next_check_at, and status for each record.

Symptoms: "Failed to scrape" errors in logs
Solutions:
- Verify FIRECRAWL_API_KEY is set in the Python backend

Symptoms: Price checks succeed but credits unchanged
Solutions:
- Verify CreditsIntegrationService is configured
- Check the ai_usage_logs table for entries

Set the production Python backend URL using supabase secrets set PYTHON_BACKEND_URL=https://api.yourdomain.com. Verify all secrets with supabase secrets list.
Deploy to production with supabase functions deploy price-monitoring-cron --project-ref your-project-ref. Verify with supabase functions list --project-ref your-project-ref.
Delete the old cron job with SELECT cron.unschedule('price-monitoring-hourly');. Create a new cron job using SELECT cron.schedule(...) pointing to the production URL with the production cron secret.
- Test end-to-end with /api/v1/price-monitoring/check-now
- Check the price_monitoring_jobs table for failures
- Check the price_history table for data gaps
- Check ai_usage_logs for anomalies
- Review recent cron runs: SELECT * FROM cron.job_run_details ORDER BY start_time DESC LIMIT 10;

For issues or questions:
Check the Edge Function logs first: supabase functions logs price-monitoring-cron