
Commit 2299bc5 — 2 parents 22654e4 + 21b4907

Merge pull request #12 from CyberRoute/anthropicapi

integration with Anthropic API

4 files changed — 180 additions & 22 deletions

README.md

Lines changed: 29 additions & 10 deletions
@@ -8,7 +8,7 @@
 
 ## Overview
 
-Phantom is a **network reconnaissance and security auditing tool** designed for directly connected networks. It discovers devices via ARP scanning, tracks their history, detects ARP spoofing attacks, and can perform MITM interception with live packet analysis powered by a local LLM.
+Phantom is a **network reconnaissance and security auditing tool** designed for directly connected networks. It discovers devices via ARP scanning, tracks their history, detects ARP spoofing attacks, and can perform MITM interception with live packet analysis powered by a local or cloud LLM.
 
 The GUI is built with **PySide6** (Qt framework) and uses **Scapy** for all packet-level operations.
 
@@ -21,7 +21,7 @@ The GUI is built with **PySide6** (Qt framework) and uses **Scapy** for all pack
 - **New Device & MAC Change Detection**: Highlights new devices (green) and IP-to-MAC binding changes (red) — a classic ARP spoofing indicator.
 - **ARP Spoof Detection**: Passive background sniffer that alerts on conflicting ARP bindings and gateway MAC changes.
 - **MITM Interception**: ARP-spoof a target to intercept its traffic; captured packets are displayed in real time with a full layer-by-layer breakdown.
-- **LLM Packet Analysis**: Send any captured packet to a local [Ollama](https://ollama.com) instance for AI-assisted analysis (protocol identification, risk assessment, credential spotting).
+- **LLM Packet Analysis**: Send any captured packet to a local [Ollama](https://ollama.com) instance or the [Anthropic API](https://www.anthropic.com) for AI-assisted analysis (protocol identification, risk assessment, credential spotting).
 - **PCAP Export**: Save captured packets from a MITM session as a `.pcap` file for offline analysis in Wireshark.
 - **Scan Export**: Export scan results to JSON or CSV.
 - **Progress Bar**: Live progress feedback during scanning.
@@ -38,7 +38,9 @@ The GUI is built with **PySide6** (Qt framework) and uses **Scapy** for all pack
 - **PySide6** — graphical user interface
 - **netifaces** — network interface introspection
 - **requests** — Ollama API streaming
+- **anthropic** — Anthropic API client (installed via `requirements.txt`)
 - **Ollama** (optional) — local LLM for packet analysis (`ollama serve`)
+- **Anthropic API key** (optional) — set via `ANTHROPIC_API_KEY` env var or entered in the UI
 
 ---

@@ -132,18 +134,35 @@ Click **Save PCAP** to write the captured session to a `.pcap` file.
 
 > **Note:** MITM requires root/sudo. IP forwarding is restored automatically when MITM is stopped.
 
-### 4. LLM packet analysis (Ollama)
+### 4. LLM packet analysis
 
-With [Ollama](https://ollama.com) running locally (`ollama serve`) and at least one model pulled:
+Select a captured packet in the MITM window, then choose a **Provider**:
 
-1. Select a captured packet in the MITM window.
-2. Choose a model from the **Model** drop-down (populated automatically from the running Ollama instance). Click **↻** to refresh the list after pulling a new model.
+#### Ollama (local)
+
+Requires [Ollama](https://ollama.com) running locally (`ollama serve`) with at least one model pulled.
+
+1. Set **Provider** to **Ollama (local)**.
+2. Choose a model from the **Model** drop-down (populated automatically). Click **↻** to refresh after pulling a new model.
 3. Optionally add context in the **Context** field (e.g. `"this is a smart TV"`).
-4. Click **Analyse with LLM** — the analysis opens in a dedicated window and streams in token by token. Use **Copy analysis** to copy the result to the clipboard.
+4. Click **Analyse with LLM**.
+
+> **Tip:** Any model available via `ollama list` can be used. Smaller models respond faster; larger ones give more detailed analysis.
+
+#### Anthropic API (cloud)
+
+Requires an [Anthropic API key](https://console.anthropic.com).
+
+1. Set **Provider** to **Anthropic**.
+2. Choose a model (`claude-opus-4-6`, `claude-sonnet-4-6`, or `claude-haiku-4-5`).
+3. Enter your API key in the **API key** field (or set `ANTHROPIC_API_KEY` in the environment and it will pre-fill automatically).
+4. Optionally add context, then click **Analyse with LLM**.
+
+> **Tip:** `claude-haiku-4-5` is fastest and cheapest for quick checks; `claude-opus-4-6` gives the most thorough analysis.
 
-The LLM identifies protocol/service, describes what the endpoints are doing, flags security-relevant observations, and provides a risk rating.
+The analysis opens in a dedicated window and streams token by token. Use **Copy analysis** to copy the result to the clipboard.
 
-> **Tip:** Any model available via `ollama list` can be used. Smaller models (e.g. `llama3.2:1b`) respond faster; larger ones (e.g. `llama3.1:8b`) give more detailed analysis.
+The LLM identifies protocol/service, flags security-relevant observations (plaintext credentials, CVE patterns, suspicious beaconing), and provides a risk rating.
 
 ---

@@ -156,7 +175,7 @@ core/
   arp_spoofer.py — low-level ARP spoof / restore primitives
   mitm.py — MitmThread (spoof loop + sniffer), IP forwarding management
   spoof_detector.py — passive ARP sniff-based spoof detection
-  ollama_analyst.py — OllamaThread for streaming LLM packet analysis
+  llm_analyst.py — OllamaThread and AnthropicThread for streaming LLM packet analysis
   db.py — SQLite persistence (device history, MAC audit trail)
   networking.py — CIDR calculation, hostname resolution helpers
   vendor.py — OUI/MAC vendor lookup
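The renamed `llm_analyst.py` exports a `fetch_ollama_models` helper for the model drop-down. As a rough sketch of how such a helper can work: Ollama's `GET /api/tags` endpoint returns the locally pulled models as JSON. The project itself uses `requests`; this standalone version sticks to the standard library, and the function names beyond `fetch_ollama_models` are ours, not necessarily the project's.

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA_BASE = "http://localhost:11434"


def parse_model_names(tags_json):
    """Extract model names from an Ollama /api/tags response body."""
    return [m.get("name", "") for m in tags_json.get("models", [])]


def fetch_ollama_models(base=OLLAMA_BASE, timeout=3.0):
    """Return names of locally pulled models, or [] if the server is unreachable."""
    try:
        with urlopen(f"{base}/api/tags", timeout=timeout) as resp:
            return parse_model_names(json.load(resp))
    except (URLError, OSError, ValueError):
        return []
```

Returning an empty list on failure lets the UI fall back to a "no models" state instead of raising inside a Qt slot.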

core/arp_scanner.py

Lines changed: 57 additions & 11 deletions
@@ -22,7 +22,9 @@
 import core.networking as net
 from core import vendor
 from core.mitm import MitmThread
-from core.ollama_analyst import OllamaThread, fetch_ollama_models
+from core.llm_analyst import (  # pylint: disable=E0611
+    ANTHROPIC_MODELS, AnthropicThread, OllamaThread, fetch_ollama_models
+)
 from core.platform import get_os
 from ui.ui_arpscan import Ui_DeviceDiscovery
 

@@ -191,7 +193,7 @@ def __init__(  # pylint: disable=too-many-arguments,too-many-positional-argument
         self.setCentralWidget(central_widget)
         self.resize(680, 850)
 
-        self._load_ollama_models()
+        self._on_provider_changed()
 
     @Slot(bool)
     def _toggle_mitm(self, checked):
@@ -247,8 +249,17 @@ def _build_packet_panel(self, layout: QVBoxLayout):
         )
         self._user_context.returnPressed.connect(self._analyse_packet)
 
+        # Provider selector
+        self._provider_combo = QComboBox()
+        self._provider_combo.addItems(["Ollama (local)", "Anthropic"])
+        self._provider_combo.currentIndexChanged.connect(self._on_provider_changed)
+        provider_row = QHBoxLayout()
+        provider_row.addWidget(QLabel("Provider:"))
+        provider_row.addWidget(self._provider_combo, stretch=1)
+
+        # Model selector — contents change with provider
         self._model_combo = QComboBox()
-        self._model_combo.setPlaceholderText("Select Ollama model…")
+        self._model_combo.setPlaceholderText("Select model…")
         self._refresh_models_button = QPushButton("↻")
         self._refresh_models_button.setFixedWidth(28)
         self._refresh_models_button.setToolTip("Refresh available Ollama models")
@@ -258,13 +269,37 @@ def _build_packet_panel(self, layout: QVBoxLayout):
         model_row.addWidget(self._model_combo, stretch=1)
         model_row.addWidget(self._refresh_models_button)
 
+        # Anthropic API key field (hidden when Ollama is selected)
+        import os as _os  # pylint: disable=import-outside-toplevel
+        self._api_key_edit = QLineEdit()
+        self._api_key_edit.setPlaceholderText(
+            "Anthropic API key (or set ANTHROPIC_API_KEY env var)"
+        )
+        self._api_key_edit.setEchoMode(QLineEdit.EchoMode.Password)
+        self._api_key_edit.setText(_os.environ.get("ANTHROPIC_API_KEY", ""))
+        self._api_key_edit.setVisible(False)
+
         layout.addWidget(QLabel("Captured packets:"))
         layout.addWidget(pkt_splitter)
         layout.addWidget(QLabel("Context:"))
         layout.addWidget(self._user_context)
+        layout.addLayout(provider_row)
         layout.addLayout(model_row)
+        layout.addWidget(self._api_key_edit)
         layout.addWidget(self._analyse_button)
 
+    def _on_provider_changed(self):
+        """Switch model list and API key visibility when the provider changes."""
+        is_anthropic = self._provider_combo.currentText() == "Anthropic"
+        self._api_key_edit.setVisible(is_anthropic)
+        self._refresh_models_button.setVisible(not is_anthropic)
+        self._model_combo.clear()
+        if is_anthropic:
+            self._model_combo.addItems(ANTHROPIC_MODELS)
+            self._model_combo.setEnabled(True)
+        else:
+            self._load_ollama_models()
+
     def _load_ollama_models(self):
         """Populate the model combo box with models available on the local Ollama server."""
         models = fetch_ollama_models()
@@ -314,24 +349,35 @@ def _analyse_packet(self):
 
         model = self._model_combo.currentText()
         if not model:
-            QMessageBox.warning(self, "No model", "No Ollama model selected — click ↻ to refresh.")
+            QMessageBox.warning(self, "No model", "No model selected.")
             return
 
         pkt = self._captured_packets[row]
         pkt_text = _format_packet(pkt)
         user_context = self._user_context.text().strip()
+        is_anthropic = self._provider_combo.currentText() == "Anthropic"
 
         self._llm_window = LlmAnalysisWindow(pkt.summary(), pkt_text, parent=self)
         self._llm_window.set_analysing()
         self._llm_window.show()
 
-        self._ollama_thread = OllamaThread(
-            pkt_text,
-            model,
-            user_context=user_context,
-            device_vendor=self._device_vendor,
-            hostname=self._hostname,
-        )
+        if is_anthropic:
+            self._ollama_thread = AnthropicThread(
+                pkt_text,
+                model,
+                api_key=self._api_key_edit.text().strip(),
+                user_context=user_context,
+                device_vendor=self._device_vendor,
+                hostname=self._hostname,
+            )
+        else:
+            self._ollama_thread = OllamaThread(
+                pkt_text,
+                model,
+                user_context=user_context,
+                device_vendor=self._device_vendor,
+                hostname=self._hostname,
+            )
         self._ollama_thread.token.connect(self._on_llm_token)
         self._ollama_thread.error.connect(self._on_llm_error)
         self._ollama_thread.finished.connect(self._on_llm_finished)
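The `if is_anthropic` branch above selects the worker class from the UI's provider string. The decision itself is a small pure mapping, sketched here as a hypothetical helper (`select_llm_worker` is our name for illustration; it does not appear in the diff):

```python
def select_llm_worker(provider: str, api_key: str = ""):
    """Map the Provider combo value to a worker class name and its extra kwargs.

    Mirrors the dispatch in _analyse_packet: Anthropic needs an API key,
    Ollama needs nothing beyond the shared arguments.
    """
    if provider == "Anthropic":
        return "AnthropicThread", {"api_key": api_key.strip()}
    return "OllamaThread", {}
```

Keeping the dispatch separate from the Qt slot makes it trivial to unit-test without instantiating widgets.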
core/llm_analyst.py

Lines changed: 93 additions & 1 deletion
@@ -1,12 +1,19 @@
 """
-Ollama integration — streams LLM analysis of a captured packet.
+LLM integration — streams packet analysis via Ollama or the Anthropic API.
 """
 
 import json
+import os
 
 import requests
 from PySide6.QtCore import QThread, Signal  # pylint: disable=E0611
 
+ANTHROPIC_MODELS = [
+    "claude-opus-4-6",
+    "claude-sonnet-4-6",
+    "claude-haiku-4-5",
+]
+
 OLLAMA_BASE = "http://localhost:11434"
 OLLAMA_URL = f"{OLLAMA_BASE}/api/generate"
 

@@ -105,3 +112,88 @@ def run(self):
             self.error.emit(str(e))
         finally:
             self.finished.emit()
+
+
+class AnthropicThread(QThread):
+    """Streams packet analysis via the Anthropic API. Emits token by token."""
+
+    token = Signal(str)
+    finished = Signal()
+    error = Signal(str)
+
+    def __init__(
+        self,
+        packet_text: str,
+        model: str,
+        api_key: str = "",
+        user_context: str = "",
+        device_vendor: str = "",
+        hostname: str = "",
+        parent=None,
+    ):
+        super().__init__(parent)
+        self.packet_text = packet_text
+        self.model = model
+        self.api_key = api_key or os.environ.get("ANTHROPIC_API_KEY", "")
+        self.user_context = user_context
+        self.device_vendor = device_vendor
+        self.hostname = hostname
+
+    def run(self):
+        """Stream analysis from the Anthropic API to the token signal."""
+        try:
+            import anthropic  # pylint: disable=import-outside-toplevel
+        except ImportError:
+            self.error.emit(
+                "anthropic package not installed — run: pip install anthropic"
+            )
+            self.finished.emit()
+            return
+
+        if not self.api_key:
+            self.error.emit(
+                "No Anthropic API key — set ANTHROPIC_API_KEY or enter it in the UI."
+            )
+            self.finished.emit()
+            return
+
+        device_section = ""
+        if self.device_vendor or self.hostname:
+            device_section = "\nDevice under analysis:"
+            if self.hostname:
+                device_section += f"\n  Hostname : {self.hostname}"
+            if self.device_vendor:
+                device_section += f"\n  Vendor   : {self.device_vendor}"
+            device_section += "\n"
+
+        context_section = (
+            f"\nAdditional context from analyst:\n{self.user_context}\n"
+            if self.user_context
+            else ""
+        )
+        user_message = (
+            f"{device_section}{context_section}\nPacket:\n{self.packet_text}"
+        )
+
+        try:
+            client = anthropic.Anthropic(api_key=self.api_key)
+            with client.messages.stream(
+                model=self.model,
+                max_tokens=4096,
+                system=SYSTEM_PROMPT,
+                messages=[{"role": "user", "content": user_message}],
+            ) as stream:
+                for text in stream.text_stream:
+                    self.token.emit(text)
+        except anthropic.AuthenticationError:
+            self.error.emit("Invalid Anthropic API key.")
+        except anthropic.RateLimitError:
+            self.error.emit("Anthropic rate limit reached — try again shortly.")
+        except anthropic.APIConnectionError:
+            self.error.emit("Cannot reach Anthropic API — check your network.")
+        except anthropic.APIStatusError as e:
+            self.error.emit(f"Anthropic API error {e.status_code}: {e.message}")
+        except Exception as e:  # pylint: disable=broad-exception-caught
+            self.error.emit(str(e))
+        finally:
+            self.finished.emit()
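The prompt assembly inside `AnthropicThread.run` is plain string building, so it can be lifted into a standalone function and tested without Qt or an API key. A sketch of the same logic (the function name is ours, not the diff's):

```python
def build_user_message(packet_text, user_context="", device_vendor="", hostname=""):
    """Assemble the analysis prompt the same way AnthropicThread.run does:
    optional device metadata, optional analyst context, then the packet dump."""
    device_section = ""
    if device_vendor or hostname:
        device_section = "\nDevice under analysis:"
        if hostname:
            device_section += f"\n  Hostname : {hostname}"
        if device_vendor:
            device_section += f"\n  Vendor   : {device_vendor}"
        device_section += "\n"

    context_section = (
        f"\nAdditional context from analyst:\n{user_context}\n"
        if user_context
        else ""
    )
    return f"{device_section}{context_section}\nPacket:\n{packet_text}"
```

Factoring the prompt out this way would also let `OllamaThread` and `AnthropicThread` share one message builder instead of duplicating it per provider.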

requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@ pylint==3.3.5
 PySide6==6.10.1
 PySide6_Addons==6.10.1
 PySide6_Essentials==6.10.1
+anthropic>=0.40.0
 requests==2.33.0
 scapy==2.6.1
 setuptools==78.1.1
