Right now we're in a phase where everyone is creating their own "AI SaaS". LinkedIn is flooded with dumb "AI will replace all devs" posts, and I can't wait for it all to fade away.
In the meantime, I've decided to take AI and use it as it should be used: as a tool. However, this time I wanted to go all-in and see how good the code written by the best AI models could be if I guided them from the architectural side and reviewed what they were doing.
The majority of the code was written by Claude Opus 4.6, while minor tasks were handled by Claude Sonnet 4.5, and the super simple ones by GPT-5 mini.
But first, let's start from the beginning.
The problem origin
It's super hard to keep up with all the newsletters, YouTube channels, and documentation, and I simply don't have enough time to go through it all. So what I usually did was: if I ran into an interesting YouTube video but it was too long, I'd use Perplexity or another AI platform to summarize it for me.
Manual workload
Even though that process saves time, it is still manual, and I'd need to browse through YouTube to find what I want to summarize in the first place.
I knew there was room for further improvement.
The solution
The first idea that came to my mind: what if I had a scheduled job that summarized, each day, new videos from a curated list of YouTube channels I handpick? Sure, I'd need to spend some time initially to choose which channels to put there, but that's still better than the previous process.
It's a one-hour total investment vs. a one-hour daily investment.
That's when the idea was born.
I wanted to have some kind of a "dashboard", hosted on a server and available 24/7, which I could access from multiple devices. The design had to be:
- Responsive
- Simple
- Intuitive
The channels would be added through an "Admin panel" where I could add or remove those I want summarized. The trick is, I didn't want to summarize all available videos from each channel, as some of them might have more than a thousand videos.
That's when I decided to set up a scheduled (cron) job to fetch videos from the channels on the list every day at a certain time. If a channel had any new videos published in the last 24 hours, those would get summarized and shown on the dashboard.
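To illustrate, here's a minimal sketch of what that scheduled job could look like with node-cron. The helper names (getTrackedChannels, fetchNewVideosForChannel, summarizeAndStore) are placeholders for this post, not the project's actual functions:

```ts
import cron from "node-cron";
// Hypothetical helpers; the real project structures this differently.
import { getTrackedChannels, fetchNewVideosForChannel, summarizeAndStore } from "./pipeline";

// Run every day at 06:00 server time.
cron.schedule("0 6 * * *", async () => {
  const channels = await getTrackedChannels();

  for (const channel of channels) {
    // Only videos published in the last 24 hours are considered.
    const newVideos = await fetchNewVideosForChannel(channel.id);

    for (const video of newVideos) {
      await summarizeAndStore(video);
    }
  }
});
```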
Since the page would be publicly available, I had to protect the "admin panel" with some kind of a login, as I didn't want other people messing with it (a rough sketch of that check follows the stack list below). The stack I chose was:
- VueJS for the frontend
- ExpressJS for the backend
- PostgreSQL for DB
- ... and a few more external services visible in the diagram
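As for the login mentioned above, here's a rough sketch of the kind of check I mean: an Express middleware that verifies a JWT before any admin route runs. This is just an illustration of the approach, not the project's actual auth code:

```ts
import express from "express";
import jwt from "jsonwebtoken";

const app = express();

// Illustrative middleware: only requests with a valid token reach admin routes.
function requireAdmin(req: express.Request, res: express.Response, next: express.NextFunction) {
  const token = req.headers.authorization?.replace("Bearer ", "");
  if (!token) return res.status(401).json({ error: "Not logged in" });

  try {
    jwt.verify(token, process.env.JWT_SECRET as string);
    next();
  } catch {
    res.status(401).json({ error: "Invalid or expired token" });
  }
}

// Example of an admin-only route for managing the channel list.
app.post("/api/admin/channels", requireAdmin, (_req, res) => {
  // ...persist the new channel here
  res.sendStatus(201);
});
```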
Fetching the videos
The first piece of the puzzle was to fetch the YouTube videos of a given channel. To do that, we can use the YouTube Data API, which allows us to fetch videos sorted by date.
We only fetch videos published in the last 24 hours, so we pass the "publishedAfter" parameter to get those.
After fetching, we need to double-check the database to see if a video was already processed. If it wasn't, we proceed to fetch the transcript for that video.
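A rough sketch of that step is below. It assumes the search endpoint of the YouTube Data API v3 and a plain pg query for the dedup check; the videos table and helper names are my own illustration, not the project's schema:

```ts
import { Pool } from "pg";

const db = new Pool(); // connection settings come from the standard PG* env vars

// Fetch a channel's videos published in the last 24 hours.
async function fetchRecentVideos(channelId: string) {
  const publishedAfter = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();

  const url = new URL("https://www.googleapis.com/youtube/v3/search");
  url.search = new URLSearchParams({
    key: process.env.YOUTUBE_API_KEY ?? "",
    channelId,
    part: "snippet",
    order: "date",
    type: "video",
    maxResults: "10",
    publishedAfter,
  }).toString();

  const response = await fetch(url);
  const data = (await response.json()) as { items?: any[] };
  return data.items ?? [];
}

// Skip videos we've already processed (assumes a "videos" table with a video_id column).
async function isAlreadyProcessed(videoId: string): Promise<boolean> {
  const result = await db.query("SELECT 1 FROM videos WHERE video_id = $1", [videoId]);
  return (result.rowCount ?? 0) > 0;
}
```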
Fetching the transcription
Fetching the video transcript can be done through many different 3rd-party libraries, but I wanted to do it for free. To do that, we need to use the native YouTube transcript API, which is what the NPM package called "youtube-transcript-plus" uses under the hood. The downside of this approach is that YouTube can change its API at any time and break the app, but since I'm building this for myself, it was a tradeoff I could live with.
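In code, that step boils down to something like the sketch below. I'm going from memory on the package's exports, so treat the import and the segment shape as an approximation and check the youtube-transcript-plus docs:

```ts
// NOTE: the exported function name is assumed here; verify it against the package README.
import { fetchTranscript } from "youtube-transcript-plus";

async function getTranscriptText(videoId: string): Promise<string> {
  // Each segment carries a small chunk of text plus timing info;
  // for summarization we only need the concatenated text.
  const segments = await fetchTranscript(videoId);
  return segments.map((segment) => segment.text).join(" ");
}
```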
Summarize with AI
Now comes the fun part. How do you choose an AI model when there are so many to choose from? Well, my goals were super simple:
- Spend the least amount of money
- Doesn't have to be the best model on the market
- Gets the job done
I decided that it doesn't have to be the "best model" because the task is super simple and generally all AI models are good enough with such simple tasks. After all, it's taking a bunch of text and summarizing it.
So which one did I choose? Random free models from OpenRouter. What do I mean by "random"? Well, OpenRouter can be seen as a platform aggregating different AI model providers, and those providers can update their models so they are no longer free, so I let OpenRouter choose a model for me instead.
How do we do it in code? With one simple line. Basically, we just set "openrouter/free" as the model and let the platform choose an appropriate and available one for the task. We can also pass instructions for the summarization along with the transcript.
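Here's a hedged sketch of that call against OpenRouter's OpenAI-compatible chat completions endpoint. The prompt wording is my placeholder, not the exact instructions used in the project:

```ts
async function summarize(transcript: string): Promise<string> {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      // Let OpenRouter pick an available free model, as described above.
      model: "openrouter/free",
      messages: [
        {
          role: "system",
          // Placeholder instructions; the project's actual prompt differs.
          content: "Summarize the following YouTube transcript into a few concise bullet points.",
        },
        { role: "user", content: transcript },
      ],
    }),
  });

  const data = (await response.json()) as { choices: { message: { content: string } }[] };
  return data.choices[0].message.content;
}
```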
Finally, we get the summary and store it in the database. At that point we can access it via the dashboard, and it looks something like this.
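For completeness, the storage step could look roughly like this; the summaries table and its columns are my assumption, not the project's actual schema:

```ts
import { Pool } from "pg";

const db = new Pool();

// Persist the summary so the dashboard can read it later.
async function storeSummary(videoId: string, title: string, summary: string) {
  await db.query(
    `INSERT INTO summaries (video_id, title, summary, created_at)
     VALUES ($1, $2, $3, NOW())
     ON CONFLICT (video_id) DO NOTHING`,
    [videoId, title, summary]
  );
}
```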
Key notes
The project is hosted on an Oracle Cloud VPS. It is dockerized and uses nginx as the reverse proxy. The total amount of money I spent on this project is less than $10. That's because I'm using GitHub Copilot Pro and carefully craft every prompt.
Room for improvement
There is always room for improvement, depending on the end goal. I started this project as a simple way of keeping up with the news and of seeing which videos are interesting enough for me to watch in full. It serves its purpose at the moment.
However, other people might benefit from some additional features like:
- Configurable interval when the videos are fetched
- Configurable models to process transcripts and summarize them
- Multi user support
- Simpler way to add channels (at the moment there is no automated mechanism to fill in the channel ID)
Everything considered, it was a fun little project, and AI tools are definitely a great addition to programming when properly used. I've made myself a tool that I can use each morning while I sip my coffee, and that's enough for me.
Most of the issues I had while vibecoding this project were related to nginx and Docker. I really wanted to force the AI to sort those out without me manually intervening.
P.S. I didn't waste time optimizing anything in the code; the goal was just to get it working. This is definitely not a production-ready project.
GitHub repo available at: https://github.com/kresohr/youtube-summary




