
Opportunity Aggregator

26 devlogs
36h 28m 22s


An AI-powered matchmaking engine that centralizes academic and professional opportunities from 6 global sources. Built during bus commutes to solve fragmented information retrieval, it uses a multi-tier Gemini/OpenRouter architecture to rank hackathons and jobs based on personalized Markdown profiles. Fully integrated with Discord and running 24/7.

This project uses AI

I used the Gemini CLI as my Senior Mentor, specifically to handle the boilerplate for the discord.py slash commands, the initial scraping pipelines (BeautifulSoup), and SQL query optimization. For the core AI scoring engine (Google SDK and OpenRouter), I relied on it for architectural guidance and debugging.
I also used Perplexity for deep research and to keep the documentation up to date with the latest tech standards and API changes, and specifically to help translate, structure, format, and polish the English in this README.

Throughout the project I used Google Translate for quick translations; for things like that inside the code and docs in the repo, I almost exclusively used the CLI.

Demo Repository


ChefThi

Shipped this project!

Hours: 36.47
Cookies: 🍪 241
Multiplier: 6.61 cookies/hr

I built this Discord bot and background radar for jobs and events (TabNews, Devpost, GitHub). It was originally for Telegram, but I switched after discovering the Converge bot sidequest, which was only available on Discord and Slack. Developed in small part during bus commutes, it solves fragmented info retrieval. I overcame Discord’s 3-second interaction limit using async deferring and hardened persistence. Solo project by EngThi. Ready for production with Docker.

ChefThi

Final Ship & Interactive UI: Finalized the interactive UI with Discord buttons. Implemented the “Copy Match Info” and “Export Config” features using discord.ui.View. Completed the README with the project’s badges and assets.

ChefThi

Implemented the “Strict Language Rule” in the AI Scorer. The engine now detects the User Profile language and responds accordingly, even when the source data (TabNews/MLH) is in a different language.
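The rule above boils down to injecting an explicit language instruction into the prompt. A minimal sketch, assuming a hypothetical `detect_profile_language` helper with a naive stopword check (the real detection logic isn't shown in the post):

```python
# Hypothetical sketch of the "Strict Language Rule": prepend an explicit
# language instruction to the scoring prompt so the model answers in the
# user's language even when the source data is in another one.
# The naive detector and function names are illustrative, not the real code.

def detect_profile_language(profile_md: str) -> str:
    """Very naive language guess: look for common Portuguese markers."""
    pt_markers = (" de ", " que ", "ção", " não ", " para ")
    text = profile_md.lower()
    hits = sum(1 for m in pt_markers if m in text)
    return "Portuguese" if hits >= 2 else "English"

def build_scoring_prompt(profile_md: str, opportunity_text: str) -> str:
    lang = detect_profile_language(profile_md)
    return (
        f"STRICT LANGUAGE RULE: reply ONLY in {lang}, even if the "
        f"opportunity below is written in another language.\n\n"
        f"User profile:\n{profile_md}\n\n"
        f"Opportunity:\n{opportunity_text}\n\n"
        "Return a match score (0-100) and a short rationale."
    )
```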

ChefThi

Finished the main.py scheduler. The system now runs as a persistent service, syncing every 6 hours and at 09:00 AM. Added automated webhook alerts for opportunities with match scores above 90%.
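The timing logic described above (sync every 6 hours plus a fixed 09:00 run) could be computed like this; `next_sync` is an illustrative helper, not the actual main.py code:

```python
# Sketch of the scheduler timing: the service wakes up at whichever comes
# first, the 6-hour interval or the fixed daily 09:00 run.
from datetime import datetime, timedelta

SYNC_INTERVAL = timedelta(hours=6)
DAILY_RUN = (9, 0)  # 09:00 AM

def next_sync(now: datetime, last_sync: datetime) -> datetime:
    """Return the earlier of: last sync + 6h, or the next 09:00."""
    interval_run = last_sync + SYNC_INTERVAL
    daily = now.replace(hour=DAILY_RUN[0], minute=DAILY_RUN[1],
                        second=0, microsecond=0)
    if daily <= now:  # today's 09:00 already passed
        daily += timedelta(days=1)
    return min(interval_run, daily)
```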

I was doing some tests with the bot after changing it, but the messages weren’t reaching the Discord channel. They did arrive a few times, but they were signed by the default Webhook bot. I discovered that the system wasn’t re-reading the .env file and had somehow cached the old URL, causing the problem…

Man, I spent ages trying to figure out what it was. I dug through the code, but that was all it was.

ChefThi

BYOK Architecture: Completed the Bring Your Own Key (BYOK) system. Users can now securely store their own Gemini/OpenRouter keys and Markdown profiles in the SQLite database via ephemeral Discord commands.
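A minimal sketch of what the BYOK storage layer could look like, assuming a simple `user_keys` table; the schema and helper names are my assumptions, and in the bot these writes happen inside ephemeral slash commands so keys never appear in chat:

```python
# Illustrative BYOK storage layer: one row per Discord user holding their
# own API key and Markdown profile. Schema and names are assumptions.
import sqlite3

def init_db(conn: sqlite3.Connection) -> None:
    conn.execute(
        """CREATE TABLE IF NOT EXISTS user_keys (
               user_id INTEGER PRIMARY KEY,
               gemini_key TEXT,
               profile_md TEXT
           )"""
    )

def save_key(conn, user_id: int, gemini_key: str, profile_md: str) -> None:
    # UPSERT keeps exactly one row per Discord user
    conn.execute(
        """INSERT INTO user_keys (user_id, gemini_key, profile_md)
           VALUES (?, ?, ?)
           ON CONFLICT(user_id) DO UPDATE SET
               gemini_key = excluded.gemini_key,
               profile_md = excluded.profile_md""",
        (user_id, gemini_key, profile_md),
    )

def load_key(conn, user_id: int):
    return conn.execute(
        "SELECT gemini_key, profile_md FROM user_keys WHERE user_id = ?",
        (user_id,),
    ).fetchone()
```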

ChefThi

Integrated the final data source (GitHub Jobs) and refactored all scrapers into a unified ingestion pipeline. Fixed environment variable caching issues using load_dotenv(override=True)
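To illustrate the caching issue: values already in `os.environ` win over a re-read `.env` unless you override them. This tiny stdlib parser mimics python-dotenv's `override` flag; the real fix was simply `load_dotenv(override=True)`:

```python
# Mimics python-dotenv's override behavior with a tiny stdlib parser, to
# show why a stale value (e.g. an old webhook URL) survives a plain reload.
import os

def load_env_text(text: str, override: bool = False) -> None:
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        key = key.strip()
        # Without override, existing environment values are kept as-is
        if override or key not in os.environ:
            os.environ[key] = value.strip()
```

Without `override`, the stale URL stays; with `override=True`, the freshly edited `.env` value wins.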

ChefThi

This session was focused on high-level organization and production readiness.

  • Documentation: Rewrote the technical README from scratch. Added a dedicated section for the project’s origin story (bus commutes) and created placeholders for architecture diagrams and Discord showcases.
  • Refactoring: Cleaned up the project structure, moving core logic to the /src directory and ensuring all imports follow the package structure.
  • Badges & Branding: Integrated professional GitHub badges (Python, Discord, AI, Docker) to standardize the repository.
  • Deployment Prep: Finalized the Dockerfile and docker-compose.yml to ensure the bot and the background radar can run smoothly on a VM or cloud environment without manual intervention.

ChefThi

Prepping for the cloud move. Spent this session building the Dockerfile and docker-compose.yml

ChefThi

Added the /models command to keep track of which “brain” is active. Since I’m using a multi-tier fallback (Gemini vs OpenRouter), I need to know exactly which API is eating my quota in real-time. Also refined the ephemeral logic so my configuration tweaks don’t spam the entire Discord channel. Professionalism is in the details.

ChefThi

Locked in for nearly 5 hours to restructure the entire project. I finalized the “Hot-Reload” system for profiles—now, if I update my skills in the markdown file from my phone while I’m on the bus, the bot adapts its scoring logic instantly. No restarts, no downtime. This was the session where the aggregator stopped being a tool and started being a real system.

ChefThi

The bot is great, but the AI rationale is where the real value is. I refined the strategic engine to explain why an opportunity fits my specific profile in Portuguese. It’s not just listing links anymore; it’s providing career intel while I’m literally on the bus ride to college.

ChefThi

Quick session to squash some OpenRouter naming bugs and prepare for new features. I’m making sure the backend is ready for the /models command so I can switch the bot’s “brain” on the fly. The bot is getting smarter and much more stable with every push

ChefThi

Moving away from plain text. I’m shifting the focus to high-end Discord Embeds with color-coded matchmaking. High matches (>75%) now pop in green, while lower scores drop to amber or red. It’s all about glanceability: I need to know if a job is worth my time in less than a second.
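The color hierarchy can be expressed as a tiny pure function; the >75% green threshold comes from the post, while the amber/red boundary and the hex values are my assumptions:

```python
# Sketch of the color-coded matchmaking hierarchy. Hex values and the
# amber/red cutoff are illustrative assumptions, not the real constants.
GREEN, AMBER, RED = 0x2ECC71, 0xF1C40F, 0xE74C3C

def embed_color(score: float) -> int:
    """Pick a Discord embed color from a 0-100 match score."""
    if score > 75:
        return GREEN   # high match: worth a look
    if score >= 50:
        return AMBER   # middling: maybe
    return RED         # low match: skip at a glance
```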

ChefThi

Fighting 429 errors on a free tier is a total nightmare. To fix this, I built a Multi-tier Fallback system today. If Gemini 3.1 hits a wall, the bot automatically shifts to Gemini 2.0 or OpenRouter/Gemma. It’s about making the aggregator unbreakable so I don’t get left in the dark.

ChefThi

The holiday grind was intense, and now I’m cleaning up the aftermath. I spent this session hardening the new OpportunityBot class. It’s no longer just a script; it’s a modular engine that handles state without leaking. The spaghetti is gone; now we have a real factory foundation.

ChefThi

Just got back from the long holiday grind (thanks to Tiradentes Day and no classes until Wednesday). I don’t even have anything fancy to report because I’m 100% locked into the bot’s core: scrapping the old procedural scripts and moving everything to a real engine. The holiday acceleration was real; now it’s time to harden the bot.

ChefThi

Opportunities result

The loop is closed! The project now has a fully functional aggregator that scrapes 4 sources (the Devpost API, MLH, TabNews, and GitHub Jobs) and runs them through the AI brain.

Seeing the bot return a Top 5 list with a customized “Rationale” in Portuguese is incredible. It’s not just listing jobs; it’s telling me why I should care about them based on my specific interests in AI and automation. 🤖

Architecture highlight: everything is backed by a local SQLite instance (opportunities.db). This makes searches instant and data persistent.

Looking at the last run’s data, there was an attribute error at the end of the AI processing function, but it was easy to fix. Fixed.

ChefThi

Bot is now better

Total refactor of bot.py! 📦 I moved away from a procedural script and encapsulated everything into an OpportunityBot class. This makes the code modular and ready for unit testing.

I also added Pre-flight Validation. The bot now checks the .env for the Discord Token and Gemini Keys before even trying to connect. 🛠️ If something is missing, it fails gracefully with a clear error message.
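The pre-flight validation could look roughly like this; the exact variable names are assumptions based on the post:

```python
# Sketch of the pre-flight check: fail fast with a clear message if a
# required secret is missing, before ever connecting to Discord.
# REQUIRED_VARS names are assumptions, not the real configuration.
import os

REQUIRED_VARS = ("DISCORD_TOKEN", "GEMINI_API_KEY")

def preflight_check(env=os.environ) -> None:
    missing = [v for v in REQUIRED_VARS if not env.get(v)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}. "
            "Check your .env file before starting the bot."
        )
```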

Also added a dynamic presence: the bot now shows /opportunities as its activity status, signaling to the server exactly what it’s built for.
And guys, I really think I’m improving and making the project more complete. The API integration isn’t 100% yet, but it’s progressing. I had some naming and module import errors, but they were pretty straightforward to fix.

ChefThi

more things here my man

Dealing with 429 errors (Quota Exceeded) while building a “Super MVP” on a free tier is a nightmare. 😤 Instead of crying about it, I built a Multi-tier Fallback System to make the aggregator “unbreakable.”

  • I built a central config.py that manages the intelligence flow:
  • Primary: Gemini 3.1 Flash (it’s the cheapest).
  • Secondary: automatic fallback to Gemini 2.5/2.0 if the primary quota hits a ceiling.
  • Last Resort: OpenRouter integration using Gemma/free models as a safety net.
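The tier list above maps naturally to a loop that falls through on quota errors. This is a sketch with a stand-in call interface, not the real config.py; the model names come from the post:

```python
# Sketch of the multi-tier fallback: try each model in priority order and
# fall through to the next tier when the quota (HTTP 429) is exhausted.
MODEL_TIERS = [
    "gemini-3.1-flash",       # primary (cheapest)
    "gemini-2.5-flash",       # secondary
    "gemini-2.0-flash",
    "openrouter/gemma-free",  # last resort safety net
]

class QuotaExceeded(Exception):
    """Stand-in for a 429 Quota Exceeded error."""

def score_with_fallback(prompt: str, call_model):
    """call_model(model, prompt) -> str; returns (model_used, result)."""
    last_error = None
    for model in MODEL_TIERS:
        try:
            return model, call_model(model, prompt)
        except QuotaExceeded as exc:
            last_error = exc  # quota hit: fall through to the next tier
    raise RuntimeError("All model tiers exhausted") from last_error
```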


ChefThi
  • refactor: improve Discord UI with embeds and add profile hot-reload (324662c)

With the new UI I transformed the bot from basic text output to high-end Discord Embeds. It’s not just for looks; it’s about readability. I implemented a color-coded hierarchy: High Match (>75%) gets a vibrant green, while lower scores drop to amber and red. 🟢🟡🔴

The biggest technical win this session was the Profile Hot-Reloading in src/scorer.py. I refactored the AIScorer to re-read user_profile.md on every single request.

Now, when I’m on the bus and realize I forgot to add FastAPI to my skills, I can just edit the markdown file on my phone and the bot adapts instantly. No restarts, no downtime.

ChefThi

It’s been a natural next step after bringing the aggregator to Discord. With the /opportunities command already working, the bot can now go further: users can paste any opportunity title or description and get an instant AI analysis on demand.
Making the bot more interactive
The new /analyze slash command was added to bot.py. It accepts free text input, creates a lightweight mock opportunity, runs it through the existing AIScorer class, and returns a clean Discord embed.
The embed shows:

A Match Score (0-100%, with orange highlight when >70%)
A clear Rationale explaining why the opportunity fits (or doesn’t) the user profile
Title and shortened description for context

To keep the bot responsive, the heavy AI scoring is offloaded with loop.run_in_executor and the interaction is deferred, following the same pattern used in the previous Discord integration.

ChefThi
  • feat: Add Discord Bot integration for Converge sidequest 🤖 (c5a681e)

Discord Integration for Converge Sidequest

It’s been a busy few days. While the Telegram bot was working fine, I realized that to truly close the loop for the Converge sidequest, I needed to bring the aggregator to where the community actually hangs out: Discord.

Shifting to Discord

I implemented the Discord integration using discord.py. Instead of traditional prefix commands, I went with Slash Commands (/opportunities) using the CommandTree. It makes the UX much cleaner for the end user.

The biggest challenge wasn’t the UI, but how the bot handles the workload.

Since the bot has to scrape multiple sources (Devpost, MLH, TabNews) and then wait for the Gemini API to score them, these tasks are “blocking.” In a Discord bot, if you run a heavy scraping function directly inside an async command, the entire bot freezes until it’s done.

To fix this, I used loop.run_in_executor. This allows the bot to:
  • Receive the command.
  • Trigger the “thinking…” state (interaction.response.defer).
  • Offload the heavy scraping/scoring to a separate thread.
  • Post the results back once they are ready, without ever disconnecting from Discord’s gateway.
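The steps above can be sketched with the Discord interaction stubbed out; the real handler uses discord.py's interaction object instead of the `FakeInteraction` stand-in:

```python
# Generic shape of the defer + run_in_executor pattern. FakeInteraction is
# a stand-in for discord.py's Interaction so the flow can run anywhere.
import asyncio
import time

def heavy_scrape_and_score() -> str:
    time.sleep(0.1)  # stands in for blocking scraping + Gemini scoring
    return "Top 5 opportunities ready"

class FakeInteraction:
    def __init__(self):
        self.deferred = False
        self.sent = None
    async def defer(self):
        self.deferred = True   # "thinking..." ack within the 3s window
    async def send(self, msg):
        self.sent = msg

async def handle_opportunities(interaction):
    await interaction.defer()                  # acknowledge immediately
    loop = asyncio.get_running_loop()
    # Offload the blocking work so the event loop (and bot) stays alive
    result = await loop.run_in_executor(None, heavy_scrape_and_score)
    await interaction.send(result)             # follow-up when done
```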

On Telegram, I was mostly using plain text. For Discord, I took advantage of Embeds. Now, each opportunity shows:

  • A clear Match Score (0-100%).
  • The Rationale (Why the AI thinks this fits my profile).
  • Links and tags in a structured format.

I’ve also updated the environment handling to support both Telegram and Discord simultaneously. The project is now reaching a stable milestone.

I’m happy to have completed this part of the sidequest, and because I picked up a new dev skill along the way :)

ChefThi
  • feat: Integrated Devpost API and AI Daily Strategy engine (a108ed8)

The aggregator took another solid step forward with a focused update that strengthens data collection and adds intelligent daily guidance.
News

Full integration of the official Devpost API (https://devpost.com/api/hackathons) replacing previous placeholder logic in main.py.
New src/sources/devpost.py module with fetch_devpost() that pulls up to 20 upcoming hackathons, including title, URL, cleaned prize amount, submission deadline, and structured opportunity data.
HTML sanitization via clean_html() to properly handle prize fields.
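A stdlib sketch of what `clean_html()` might do for the prize fields; the real module may use BeautifulSoup instead:

```python
# Illustrative HTML sanitization for Devpost prize fields: strip tags and
# decode entities so markup becomes plain text. Not the real clean_html().
import html
import re

def clean_html(raw: str) -> str:
    text = re.sub(r"<[^>]+>", "", raw)   # drop all tags
    return html.unescape(text).strip()   # decode &amp; etc., trim spaces
```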
Addition of generate_daily_strategy() method in src/scorer.py that uses Gemini to analyze the top scored opportunities and produce a concise 3–4 sentence strategic recommendation in Portuguese, taking the user profile into account.
Updated src/config.py to include newer Gemini model variants for better fallback behavior.
main.py now calls the Devpost fetcher and prints the AI-generated daily strategy after scoring.

ChefThi
  • feat: Evolve Aggregator to AI-Powered Matchmaker with Gemini 3.1 🚀 (64b172f)

The project evolved from a simple opportunity aggregator to an intelligent recommendation system. The main change was the integration of the Gemini 3.1 model as the matching engine, transforming the flow into a true personalized matchmaker.

News

  • Complete refactoring of the code structure for greater modularity: creation of the src/ folder with clear separation between sources, scorer, notifier, and bot.

  • Implementation of the src/scorer.py module responsible for calculating the compatibility score between the user’s profile and each collected opportunity, using Gemini 3.1 as the main model.

  • Addition of a multi-tier fallback strategy and dynamic model selection to increase robustness in case of quota limits or temporary failures.

  • Expansion of data sources: the system now collects opportunities from Devpost (via API), MLH, TabNews, and GitHub Jobs in parallel.

  • Development of Telegram bot commands:

  • /today — displays opportunities collected on the day

  • /match — returns the 3 best personalized recommendations with an explanation of the score

  • /search <term> — local search in the database

  • Creation of the daily scheduler in main.py + digest.py for automatic updating of opportunities and preparation for sending digests.
    I used the Gemini CLI quite a bit here.

  • Documentation update

Lessons learned and technical process

The transition required careful separation of responsibilities to facilitate future expansions. The choice of Gemini allowed for context-rich scoring, considering skills, experience level, and preferences described in the user profile. The fallback strategy ensures that the system remains functional even under adverse API conditions.

The result is a functional Super MVP that already delivers real value.

ChefThi

From simple scraper to AI‑assisted matchmaker (while commuting between classes) 🚏💼

Opportunity Aggregator started in January as a very simple experiment: a Python bot, a SQLite file, and a basic TabNews parser. Then college kicked in, Blueprint deadlines appeared, and I found myself coding on short windows between bus rides and homework instead of doing long, focused sprints. That’s why the commit history shows an initial burst in January and then a big jump only now.

During that gap I spent more time thinking than committing: what makes this different from a fancy RSS reader? The interesting piece is the match score — using AI to tell you how well each opportunity fits your profile, instead of just dumping links. I used Perplexity a lot in this phase to explore architecture ideas: how many sources, how to model user profiles, how aggressive the AI usage should be, and how to keep things cheap and robust.

The latest commit is where all that background thinking finally lands in code: I implemented a Super MVP with a proper SQLite persistence layer and a multi‑tier AI fallback strategy (Gemini as the primary brain, with dynamic model discovery as a fallback) to score opportunities. The scraper stack now has a cleaner structure and is ready for more sources.

It’s still early, but now the project feels like an actual assistant instead of a script. Next steps: Telegram commands for /match and /today, plus a daily digest flow so it can ping me with the top 3 fits while I’m literally on the bus to college.

ChefThi

Title: Initial Setup, Git Cleanup, and TabNews Integration
Date: 2026-01-25

Commits:

  • b966db7 — feat: implement a parser to fetch TabNews posts
  • 89f8288 — chore: stop tracking venv and internal config files
  • d41b700 — Created the bot.py file, getting ready for the first devlog :)
  • af2779b — Starting the project with the first files
  • a98014e — Initial commit

Summary:
I started the Opportunity Aggregator to centralize academic and tech opportunities. The focus was structuring the environment, creating the base Telegram bot, and implementing a parser to collect real data via RSS. This start was a hands-on dive into new libraries and versioning concepts.

What was done:

  • Structure: set up .gitignore and requirements.txt for an organized environment.
  • Base bot: created bot.py with /start and /ping commands using ‘python-telegram-bot’.
  • RSS parser: used the ‘feedparser’ lib to extract the 5 most recent TabNews posts.
  • Security: used ‘python-dotenv’ to manage the bot token safely.
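The RSS step can be illustrated with the stdlib alone; the project uses feedparser, and this inline sample feed just shows the "grab the 5 newest posts" idea:

```python
# Stdlib illustration of the RSS parsing step: pull the first 5 <item>
# entries (title + link) from a feed. The sample feed is made up.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<rss><channel>
  <item><title>Post 1</title><link>https://example.com/1</link></item>
  <item><title>Post 2</title><link>https://example.com/2</link></item>
  <item><title>Post 3</title><link>https://example.com/3</link></item>
  <item><title>Post 4</title><link>https://example.com/4</link></item>
  <item><title>Post 5</title><link>https://example.com/5</link></item>
  <item><title>Post 6</title><link>https://example.com/6</link></item>
</channel></rss>"""

def latest_posts(rss_xml: str, limit: int = 5):
    root = ET.fromstring(rss_xml)
    items = root.findall(".//item")[:limit]
    return [(i.findtext("title"), i.findtext("link")) for i in items]
```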

Difficulties and active learning:
I hit challenges right away. A common mistake was pushing the ‘venv’ folder to GitHub. That forced me to learn more advanced Git commands (not that I’ll remember them all, but I used them 🫡😁), like ‘git rm --cached’, to clean the repository without losing the local files. It was a practical lesson on what should not be versioned.

I’m dealing with complex libs like ‘python-telegram-bot’. Instead of just copying code, I’m reading the documentation to understand the “why” behind things, like the logic of asynchronous functions (async/await). AI has been a “Mentor”: it explains the gears behind the snippets, but I apply and edit the code myself to make sure I actually learn.

Results:
Bot operational and the parser successfully extracting real data. Next step: integrate the parser into the bot’s /vagas command and start studying Supabase.
