Track 3D printer listings across Kijiji and retailer sites, then automatically detect price drops and deals.
- Automated Scraping: Periodically scrape Kijiji plus selected retailer pages
- Price Tracking: Monitor price changes over time with historical snapshots
- Multi-Currency Storage: Store one canonical price plus currency per listing
- Sale Detection: Detect sale pricing and keep the nominal (non-sale) price when available
- Deal Detection: Automatically identify listings with:
- Price drops from original listing price
- Prices significantly below MSRP
- Prices below current retail (via Aurora Tech Channel integration)
- Brand/Model Detection: Automatically identify printer brands and models
- Retail Price Integration: Compare against live retail prices from Aurora Tech Channel
- Web Dashboard: View listings, deals, and price history in a clean interface
- CLI Tools: Command-line interface for scraping, viewing deals, and stats
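The three deal-detection criteria above can be sketched as a single predicate. This is illustrative only: the 0.8 "significantly below MSRP" cutoff is an assumed threshold, not the app's actual value.

```python
def is_deal(price, original=None, msrp=None, retail=None):
    """Return True if a listing matches any of the three deal criteria.

    The 0.8 "significantly below MSRP" cutoff is an assumed threshold
    for illustration; the real app defines its own.
    """
    dropped = original is not None and price < original
    below_msrp = msrp is not None and price < 0.8 * msrp
    below_retail = retail is not None and price < retail
    return dropped or below_msrp or below_retail
```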
1. Clone the repository:

   ```bash
   git clone https://github.com/justinh-rahb/3dp-listing-scrape.git
   cd 3dp-listing-scrape
   ```

2. Create a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Initialize the database:

   ```bash
   python migrate_db.py
   ```
Run the server (default behavior):

```bash
python cli.py
```

Then visit http://127.0.0.1:5000

Run the server (explicit command):

```bash
python cli.py serve --host 0.0.0.0 --port 5000 --workers 1
```

Run a scrape:

```bash
python cli.py scrape
```

View top deals:

```bash
python cli.py deals --limit 20
```

Update retail prices from Aurora Tech Channel:

```bash
python cli.py update-retail-prices
```

View database statistics:

```bash
python cli.py stats
```

Start the server using the production entrypoint:

```bash
python server.py
```

Build and run with Docker Compose:

```bash
docker compose up --build -d
```

View logs:

```bash
docker compose logs -f tracker
```

Stop:

```bash
docker compose down
```

The web dashboard provides:
- Listings: Browse all active listings with filtering
- Deals: View listings sorted by deal quality
- Price History: See price changes over time
- Settings: Configure scraping behavior and search queries
To protect the Settings page and settings-management API endpoints, set:

```bash
export SETTINGS_PASSWORD="your-strong-password"
```

When SETTINGS_PASSWORD is set, opening /settings will require HTTP Basic auth. If SETTINGS_PASSWORD is unset or empty, settings auth is disabled.
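A minimal sketch of how such a Basic-auth gate can work. The username handling here is an assumption (any username is accepted and only the password part is compared); the app's actual check may differ.

```python
import base64
import os

def settings_auth_ok(auth_header):
    """Check an incoming Authorization header against SETTINGS_PASSWORD.

    Illustrative only: accepts any username and compares only the
    password part of the Basic credentials.
    """
    password = os.environ.get("SETTINGS_PASSWORD", "")
    if not password:
        return True  # auth disabled when unset or empty
    if not auth_header or not auth_header.startswith("Basic "):
        return False
    try:
        decoded = base64.b64decode(auth_header[6:]).decode()
    except (ValueError, UnicodeDecodeError):
        return False
    _, _, supplied = decoded.partition(":")
    return supplied == password
```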
By default, the scraper searches Hamilton (ON) Kijiji for:
- 3d printer
- bambu lab
- prusa
- creality
- ender 3
- anycubic
- voron
It also includes:
- Sovol Zero product page (sovol3d.com)
- Formbot Voron collection page (formbot3d.com)
You can modify search queries in the Settings page of the web dashboard or by editing the database directly.
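For direct database edits, a sketch with sqlite3. The `search_queries` table name appears in the schema section of this README, but the column names, the database path, and the search URL here are all assumptions; inspect the real schema before writing to it.

```python
import sqlite3

# Illustrative only: column names, the :memory: path, and the URL are
# assumptions; open the app's real database file and check its schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS search_queries "
    "(id INTEGER PRIMARY KEY, query TEXT, url TEXT)"
)
conn.execute(
    "INSERT INTO search_queries (query, url) VALUES (?, ?)",
    ("qidi", "https://www.kijiji.ca/..."),  # hypothetical search URL
)
conn.commit()
print(conn.execute("SELECT query FROM search_queries").fetchall())
```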
Configure scraping behavior in Settings:
- Scrape Interval: Hours between automated scrapes (if using scheduler)
- Max Pages: Maximum pages to scrape per search query
- Request Delay: Random delay between requests (2-5 seconds)
- Inactive Threshold: Missed runs before marking a listing inactive
- FX Rates to USD: Used for USD-equivalent price change detection (`fx_rates_to_usd`)
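How the USD normalization fits together, as a sketch. The rates shown are made-up examples, not live data, and the real app reads them from Settings.

```python
# Example fx_rates_to_usd mapping; real rates come from Settings.
FX_RATES_TO_USD = {"USD": 1.0, "CAD": 0.73}

def to_usd(price, currency, rates=FX_RATES_TO_USD):
    """Convert a (price, currency) pair to its USD equivalent."""
    return price * rates[currency]

def usd_change_pct(old_price, old_ccy, new_price, new_ccy):
    """Percent change between two snapshots, compared in USD terms."""
    old_usd = to_usd(old_price, old_ccy)
    new_usd = to_usd(new_price, new_ccy)
    return 100.0 * (new_usd - old_usd) / old_usd
```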
Webhook notifications can be configured in Settings > General > Webhook Notifications.
Supported providers:
- `generic` (canonical JSON payload)
- `discord` (Discord webhook payload)
- `google_chat` (Google Chat incoming webhook payload)

Supported events:
- `scrape_completed`
- `new_deal_detected`
- `scrape_failed`
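A sketch of how an event might be formatted per provider. The Discord and Google Chat shapes follow those services' public incoming-webhook formats; the `generic` canonical shape shown here is an assumption, not the app's documented payload.

```python
import json

def format_webhook_body(provider, event, payload):
    """Build the JSON body to POST for a given provider and event."""
    if provider == "discord":
        # Discord incoming webhooks accept a "content" text field.
        return {"content": f"[{event}] {json.dumps(payload)}"}
    if provider == "google_chat":
        # Google Chat incoming webhooks accept a "text" field.
        return {"text": f"[{event}] {json.dumps(payload)}"}
    # generic: canonical JSON payload (field names assumed here)
    return {"event": event, "data": payload}
```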
The Aurora Tech Channel integration has been temporarily disabled due to HTML parsing complexity. The website structure changes frequently, making reliable scraping difficult.
Current workaround: Manually update msrp_data.json with current prices.
Future enhancement: Consider using an API or more robust scraping approach when available.
Running python cli.py update-retail-prices will show a message that this feature is disabled.
Deals are scored based on multiple factors:
- Savings vs Retail (highest priority): Amount saved compared to current retail price
- Price Drop Percentage: Discount from original listing price
- Newness: Bonus points for recently listed items
- Below Retail Ratio: Extra points for listings significantly below retail
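The weighting could look something like this sketch. The weights and thresholds are illustrative, not the app's actual values; they only encode the priority ordering described above.

```python
def deal_score(price, retail=None, original=None, days_listed=30):
    """Score a listing; higher is a better deal. Illustrative weights only."""
    score = 0.0
    if retail is not None and price < retail:
        score += retail - price              # savings vs retail (highest priority)
        if price < 0.7 * retail:
            score += 25                      # significantly-below-retail bonus
    if original is not None and original > price:
        score += 100.0 * (original - price) / original  # price-drop percentage
    if days_listed <= 3:
        score += 10                          # newness bonus
    return score
```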
The database tracks:
- listings: All scraped listings with current and original prices
- price_snapshots: Historical price data for each listing
- scrape_runs: Log of all scraping operations
- msrp_entries: Brand/model MSRP and retail price data
- brand_keywords: Keywords for brand detection
- search_queries: Configured search URLs
- settings: Application configuration
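A minimal illustrative version of two of those tables, showing how snapshots relate back to listings. The real columns are defined by migrate_db.py and will differ.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Illustrative columns only; see migrate_db.py for the real schema.
CREATE TABLE listings (
    id INTEGER PRIMARY KEY,
    title TEXT,
    price REAL,
    currency TEXT,
    original_price REAL,
    active INTEGER DEFAULT 1
);
CREATE TABLE price_snapshots (
    id INTEGER PRIMARY KEY,
    listing_id INTEGER REFERENCES listings(id),
    price REAL,
    taken_at TEXT DEFAULT CURRENT_TIMESTAMP
);
""")
conn.execute(
    "INSERT INTO listings (title, price, currency, original_price) VALUES (?, ?, ?, ?)",
    ("Ender 3", 150.0, "CAD", 200.0),
)
conn.execute("INSERT INTO price_snapshots (listing_id, price) VALUES (1, 150.0)")
print(conn.execute(
    "SELECT l.title, s.price FROM price_snapshots s "
    "JOIN listings l ON l.id = s.listing_id"
).fetchall())
```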
Run the test suite:

```bash
pytest
```

After updating the schema, run:

```bash
python migrate_db.py
```

Add brand keywords in Settings or edit config.py:

```python
DEFAULT_BRAND_KEYWORDS = {
    "brand_name": ["keyword1", "keyword2", "model1", "model2"],
}
```

The built-in scheduler now starts automatically with the server by default.
Configure scrape interval in the Settings page (scrape_interval_hours).
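Conceptually, the scheduler is just a background loop keyed off scrape_interval_hours. A minimal sketch, not the app's actual implementation:

```python
import threading
import time

def start_scheduler(scrape_fn, interval_hours):
    """Run scrape_fn immediately, then every interval_hours, in a daemon thread."""
    def loop():
        while True:
            scrape_fn()
            time.sleep(interval_hours * 3600)
    thread = threading.Thread(target=loop, daemon=True)
    thread.start()
    return thread
```

Running it as a daemon thread means the loop dies with the server process instead of blocking shutdown.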
- Be Respectful: The scraper includes delays to avoid overloading Kijiji's servers
- Hamilton Only: Currently configured for Hamilton, ON listings
- Price Format: Listings store both `price` and `currency`
- Aurora Data: Retail prices are in USD but tracked as CAD for comparison
Pull requests welcome! Please ensure code follows existing style and includes tests.
MIT License - see LICENSE file for details
- Aurora Tech Channel for providing comprehensive 3D printer pricing data
- Kijiji for providing a platform for local buying/selling