File: InvestorX.json
This workflow acts as an autonomous financial analyst, aggregating market data, analyzing social sentiment, and generating formatted investment briefs.
- Trigger: Scheduled cron job running at 6:00 AM on weekdays (Mon-Fri).
- Process:
- Data Ingestion: Fetches real-time market news and financial data via REST APIs (NewsAPI, Marketaux) and scrapes trending discussions from targeted Reddit finance communities.
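A minimal sketch of the ingestion step, assuming a NewsAPI-style REST endpoint. The endpoint, query parameters, and normalized field names are illustrative placeholders, not the workflow's actual configuration:

```javascript
// Illustrative ingestion helper; endpoint and params are assumptions,
// not the workflow's real config.
const NEWS_ENDPOINT = "https://newsapi.org/v2/everything";

function buildNewsUrl(query, apiKey, from) {
  const params = new URLSearchParams({
    q: query,
    from,                  // ISO date: only stories newer than this
    sortBy: "publishedAt",
    language: "en",
    apiKey,
  });
  return `${NEWS_ENDPOINT}?${params.toString()}`;
}

// Fetch and flatten articles into the shape downstream nodes expect.
async function fetchNews(query, apiKey, from) {
  const res = await fetch(buildNewsUrl(query, apiKey, from));
  if (!res.ok) throw new Error(`News request failed: ${res.status}`);
  const body = await res.json();
  return body.articles.map((a) => ({
    title: a.title,
    source: a.source?.name,
    url: a.url,
    publishedAt: a.publishedAt,
  }));
}
```

Normalizing every source into one flat shape at the edge keeps the later scoring and prompt-building nodes source-agnostic.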
- Data Transformation & Scoring: Utilizes custom JavaScript nodes to normalize JSON payloads from disparate sources. Implements a proprietary scoring algorithm to filter out noise (e.g., meme stocks) and rank stories based on investment keywords, source authority, recency, and Reddit sentiment overlap.
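The scoring step above could be sketched as follows. The weights, keyword list, meme-ticker blocklist, and authority table are invented for illustration; the actual proprietary algorithm is not shown in the source:

```javascript
// Illustrative ranking function; all weights and lists below are
// placeholder assumptions, not the proprietary algorithm.
const INVESTMENT_KEYWORDS = ["earnings", "guidance", "acquisition", "rate cut", "buyback"];
const MEME_TICKERS = new Set(["GME", "AMC", "BBBY"]);
const SOURCE_AUTHORITY = { Reuters: 1.0, Bloomberg: 1.0, "Seeking Alpha": 0.6 };

function scoreStory(story, now = Date.now()) {
  // Filter out meme-stock noise entirely.
  if (story.tickers?.some((t) => MEME_TICKERS.has(t))) return 0;

  const text = `${story.title} ${story.summary ?? ""}`.toLowerCase();
  const keywordHits = INVESTMENT_KEYWORDS.filter((k) => text.includes(k)).length;

  // Linear recency decay over 24 hours.
  const ageHours = (now - new Date(story.publishedAt).getTime()) / 36e5;
  const recency = Math.max(0, 1 - ageHours / 24);

  const authority = SOURCE_AUTHORITY[story.source] ?? 0.3;
  // Reddit sentiment overlap, capped at 1.
  const redditOverlap = story.redditMentions ? Math.min(1, story.redditMentions / 10) : 0;

  return keywordHits * 2 + recency * 1.5 + authority + redditOverlap;
}
```

A weighted-sum score like this is easy to tune per signal and gives a single sortable number for ranking stories before they reach the LLM.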
- AI Processing: Feeds the highest-scoring, validated stories to Anthropic's Claude (Claude 4.6 Sonnet) via a LangChain integration. The LLM is prompted to act as an expert analyst, generating contrarian investment angles, key insights, and draft X (Twitter) posts with predicted engagement scores and compliance risk flags.
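The analyst prompt might be assembled like this. The wording and requested output schema are illustrative assumptions; the workflow's actual prompt is not included in the source:

```javascript
// Illustrative prompt builder; instructions and output schema are assumed,
// not the workflow's real prompt.
function buildAnalystPrompt(stories) {
  const digest = stories
    .map((s, i) => `${i + 1}. [${s.source}] ${s.title} (score: ${s.score.toFixed(1)})`)
    .join("\n");

  return [
    "You are an expert investment analyst.",
    "For each story below, provide:",
    "- a contrarian investment angle",
    "- 2-3 key insights",
    "- a draft X (Twitter) post with a predicted engagement score (0-100)",
    "- a compliance risk flag (low / medium / high)",
    "Respond in JSON.",
    "",
    "Stories:",
    digest,
  ].join("\n");
}
```

Asking for a fixed JSON schema up front is what makes the downstream boolean gates possible: the engagement score and compliance flag come back as machine-readable fields rather than free text.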
- Output:
- Distribution: Formats the AI output into HTML and sends a daily intelligence brief via Gmail.
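A minimal sketch of the HTML brief handed to the Gmail node. The markup and field names are illustrative, not the workflow's actual template:

```javascript
// Illustrative brief formatter; markup and item fields are assumptions.
function formatBrief(date, items) {
  const rows = items
    .map((it) => `<li><strong>${it.title}</strong> (${it.source}) - ${it.insight}</li>`)
    .join("\n");
  return [
    `<h2>Daily Intelligence Brief - ${date}</h2>`,
    "<ul>",
    rows,
    "</ul>",
  ].join("\n");
}
```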
- Data Logging: Archives the processed stories, scores, and AI drafts into a Notion database for historical tracking.
- Conditional Automation: Uses boolean logic gates to evaluate the LLM's engagement prediction and compliance risk flag for each draft. If the predicted engagement clears the threshold and the compliance check passes, the workflow automatically publishes the draft to X (Twitter).
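The publish gate reduces to a small predicate. The threshold value and field names are illustrative placeholders:

```javascript
// Illustrative publish gate; the threshold is an assumed placeholder.
const ENGAGEMENT_THRESHOLD = 70;

function shouldAutoPublish(draft) {
  const engaging = draft.predictedEngagement >= ENGAGEMENT_THRESHOLD;
  const compliant = draft.complianceRisk === "low";
  // Both gates must pass before the draft is posted to X.
  return engaging && compliant;
}
```

Keeping the gate a pure function of the LLM's structured output makes the auto-publish decision auditable: every posted draft can be traced back to the exact scores that let it through.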
Key Skills Highlighted: Multi-API orchestration, custom JavaScript data manipulation, LLM prompt engineering, sentiment analysis, and conditional logic routing.