Show HN: I built a personalized AI news curation tool with n8n and OpenAI to filter RSS feeds
A developer built a self-hosted automation workflow using n8n and OpenAI's GPT-4o-mini to filter RSS feeds, acting as a personal editor that identifies relevant tech news and delivers a concise email digest.
I found myself wasting too much time doomscrolling through tech news and RSS feeds, scanning hundreds of headlines just to find the 3-4 items that actually mattered to my work.
To fix this, I built a self-hosted automation workflow using n8n that acts as a personal editor.
The Architecture:
Ingest: Pulls RSS feeds (TechCrunch, Hacker News, etc.) every morning.
Filter (The Agent): Passes headlines to GPT-4o-mini with a system prompt to "act as a senior editor." It scores each article 0-10 based on specific interests (e.g., "High interest in Local LLMs," "Low interest in crypto gossip").
Logic: Discards anything with a score < 7.
Research: Uses Tavily API to scrape and summarize the full content of the high-scoring articles.
Delivery: Sends a single, clean email digest via SMTP.
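The Filter and Logic steps above can be sketched in plain Python. This is a minimal illustration, not the actual n8n workflow (which lives in the linked JSON); the prompt wording, `INTERESTS` string, and helper names are hypothetical, and the model call is mocked with hard-coded scores so the example runs offline.

```python
# Sketch of the "senior editor" scoring + threshold filter.
# The real pipeline asks GPT-4o-mini to score each headline 0-10;
# here the scores are mocked so no API call is made.
INTERESTS = "High interest in Local LLMs; low interest in crypto gossip."

def build_prompt(headline: str) -> str:
    """Hypothetical system-prompt builder for one headline."""
    return (
        "Act as a senior editor. Score this headline 0-10 for relevance "
        f"to these interests: {INTERESTS}\n"
        f"Headline: {headline}\n"
        'Reply with JSON like {"score": 8}.'
    )

def filter_headlines(scored: dict, threshold: int = 7) -> list:
    """Logic step: keep articles scoring >= threshold, discard the rest."""
    return [headline for headline, score in scored.items() if score >= threshold]

# Mocked model output (headline -> score):
scores = {
    "Local LLM inference on a Raspberry Pi": 9,
    "Celebrity crypto token drama": 2,
    "New open-weights model release": 8,
}
print(filter_headlines(scores))  # only the two high-scoring items survive
```

The surviving headlines would then feed the Research step (full-content summarization) before the single email digest is assembled.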
The Hardest Part (SSE & Timeouts): The biggest technical hurdle was handling timeouts. Since the AI research step takes time, the HTTP requests would often drop. I had to configure Server-Sent Events (SSE) and raise the execution-timeout environment variables in the Node.js environment to keep the connection alive during the deep-dive research phase.
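For reference, the timeout tuning looks roughly like this. The variable names below are taken from n8n's environment-variable documentation, not from the post itself, so verify them against your installed n8n version; the values are illustrative.

```shell
# Raise n8n's workflow execution timeout so the long AI research step
# isn't killed mid-run (values in seconds; -1 disables the timeout).
export EXECUTIONS_TIMEOUT=600        # per-workflow timeout
export EXECUTIONS_TIMEOUT_MAX=3600   # upper bound a workflow may set
```

With the timeout lifted, the SSE connection can stay open while the research phase streams progress back instead of the request being dropped.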
Resources:
Workflow/Source (JSON): https://github.com/sojojp-hue/NewsSummarizer/tree/main
Video Walkthrough & Demo: https://youtu.be/mOnbK6DuFhc
I’d love to hear how others are handling information overload or if there are better ways to handle the long-polling for the AI agents.
