A Python console app for job-vacancy search and local cataloging. It talks to the public HeadHunter (hh.ru) API via `requests`, normalizes vacancy fields (id, title, salary, requirements, URL), stores them in `vacancies.json`, and drives everything through an interactive menu (search via API or file, filter by ID or keyword, top-N by salary with an optional range, manual add, clear file). The codebase uses small OOP layers (API, FileWork, Vacancy), and tests mock HTTP so CI stays deterministic without calling hh.ru.
- HeadHunter search — `HeadHunterAPI.get_vacancies(keyword, per_page)` calls `https://api.hh.ru/vacancies`, maps salary (`from`/`to` → a single number where applicable), and returns a list of dicts ready for JSON or display.
- JSON storage — `WorkingWithJSON` appends, reads, deletes selected rows, and clears `vacancies.json` (created on first use if missing).
- Interactive menu — `user_interaction()` in `src/main.py` guides search-in-API, search-in-file (by 9-digit ID or tokenized title match), top vacancies by salary, manual vacancy entry, and file reset.
- Display helpers — `src/utils.py` formats counts, ranges, sorted "top" lists, and optional delete prompts after a file search.
- Tests — the `pytest` suite mocks `requests` and `api_vacancies.get_vacancies`, so flows do not depend on network or live vacancy IDs.
| Layer | Technology |
|---|---|
| HTTP | Requests |
| Data on disk | JSON (vacancies.json, UTF-8) |
| Packaging | Poetry (pyproject.toml / poetry.lock) |
| Tests | Pytest (+ pytest-cov in dev group) |
| Lint / types | Black, Flake8, isort, Mypy (lint group) |
- `src/main.py` — menu loop; wires API + `WorkingWithJSON` + `Vacancy`.
- `src/api.py` — abstract `API`, concrete `HeadHunterAPI` (`_connect_api`, `get_vacancies`).
- `src/vacancies.py` — `Vacancy` model (`to_dict`, validation-style helpers used in tests).
- `src/file_work.py` — abstract `FileWork`, `WorkingWithJSON` implementation.
- `src/utils.py` — terminal output and user-facing helpers (`display_*`, `deleted_option`).
- `tests/` — unit tests for API (mocked), file I/O, main flows (mocked API), utils, vacancies.
- `vacancies.json` — runtime data file (created when you save vacancies; safe to delete for a clean state).
- Clone the repo:

  ```bash
  git clone https://github.com/AJLbN0H/job-parser-professional.git
  cd job-parser-professional
  ```

- Build and run with Docker:

  ```bash
  docker build -t job-parser .
  docker run -it -v ./data:/app/data job-parser
  ```

  Or use Docker Compose:

  ```bash
  docker-compose up job-parser
  ```
Network: the live menu option "search via API" needs outbound HTTPS to hh.ru. Offline development is covered by the mocked tests.
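The offline-testing idea is that the HTTP call gets replaced with a `unittest.mock.Mock` before any flow runs. The sketch below shows the pattern with a hypothetical `fetch_vacancies` helper that takes its HTTP function as a parameter; the project's actual suite instead patches `requests` and `api_vacancies.get_vacancies` directly, but the mechanics are the same.

```python
from unittest import mock

def fetch_vacancies(http_get, keyword: str) -> list[dict]:
    """Hypothetical client: the HTTP dependency is injected so tests can fake it."""
    resp = http_get(
        "https://api.hh.ru/vacancies",
        params={"text": keyword, "per_page": 5},
    )
    return resp.json()["items"]

def test_fetch_vacancies_offline():
    # Fake response object standing in for requests.Response.
    fake_resp = mock.Mock()
    fake_resp.json.return_value = {
        "items": [{"id": "123456789", "name": "Python Dev"}]
    }
    fake_get = mock.Mock(return_value=fake_resp)

    items = fetch_vacancies(fake_get, "python")

    assert items == [{"id": "123456789", "name": "Python Dev"}]
    fake_get.assert_called_once()  # exactly one "HTTP call", zero real network
```

Because the fake is deterministic, CI never depends on hh.ru being reachable or on any particular vacancy ID still existing.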
Run the tests in Docker:

```bash
docker run --rm job-parser python -m pytest -v
```

Or with Docker Compose:

```bash
docker-compose up job-parser-tests
```

For local development, install dependencies (the dev group includes Pytest):

```bash
poetry install --no-interaction --with dev
```

Run the console app:

```bash
poetry run python -m src.main
```

Poetry + lockfile is the supported path and matches CI.
GitHub Actions runs on pushes to main / develop and on pull requests targeting main: Python 3.13, poetry install --no-interaction --no-root --with dev, then poetry run pytest.
Workflow: .github/workflows/tests.yml.
- Optional CLI flags or a thin Typer wrapper instead of a pure input-driven menu.
- More boards (SuperJob, Rabota.ru, etc.) behind the same `API` abstraction.
- Exported `requirements.txt` (or `pip-tools`) for environments without Poetry.
- Optional marked integration tests that hit hh.ru behind an env flag (skipped by default).
- GUI or small web UI (e.g. PyQt / FastAPI) on top of the same core modules.