Last month, our marketing team asked me to update meta descriptions across our entire HubSpot site. All 300+ pages. They wanted better SEO, which is fair. What was not fair was expecting me to click through 300 pages manually like some kind of digital monk copying manuscripts.
So naturally, I did what any developer would do. I decided to automate it using the HubSpot API. Peak efficiency, right?
Except I spent six hours building the automation for a task that would have taken eight hours manually. Then debugging for another four hours. By hour ten, I started questioning every decision that led me to this moment.
The problem that started this mess
HubSpot CMS does not have bulk editing for content pages. You can bulk edit CRM records (contacts, deals, companies) just fine. But pages, blog posts, meta descriptions, alt text? Nope. One. At. A. Time.
This is genuinely baffling. HubSpot is a sophisticated platform. They have APIs for everything. They have built automation tools, complex CRM workflows, and machine learning features. But somehow, editing meta descriptions across multiple pages requires you to open each page individually, scroll to settings, update the field, save, and move to the next one.
For 300 pages, that is soul-crushing work.
I checked the HubSpot Ideas Forum before diving into automation. Turns out people have been requesting this feature for over three years. Hundreds of upvotes. Lots of "+1" comments from desperate content managers. Zero movement from HubSpot.
So I decided to build a solution myself using their API.
What I thought would happen
The plan was simple in my head. Use the HubSpot API to pull all the pages, update the metadata programmatically, and push everything back. Quick and clean.
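In my head, the whole thing looked roughly like this. The endpoint path and cursor-paging shape below are my assumptions about the v3 CMS Pages API, and `fetch` stands in for a real HTTP client like `requests.get`:

```python
BASE = "https://api.hubapi.com/cms/v3/pages"  # assumed CMS Pages endpoint

def list_pages(token, fetch):
    """Pull every page via cursor paging.

    `fetch` is any HTTP GET callable (e.g. requests.get). The paging
    shape (results + paging.next.after) is assumed, not verified.
    """
    pages, after = [], None
    while True:
        params = {"limit": 100}
        if after:
            params["after"] = after
        resp = fetch(BASE, headers={"Authorization": f"Bearer {token}"},
                     params=params)
        data = resp.json()
        pages.extend(data["results"])
        after = data.get("paging", {}).get("next", {}).get("after")
        if after is None:
            return pages
```

Pull everything, rewrite the metadata in memory, PATCH it all back. What could go wrong?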
I figured it would take maybe two hours. Write some API calls, run a batch update, done. Then I could feel smug about automating away tedious work while our marketing team marveled at my technical prowess.
That is not what happened.
What actually happened
First problem: Rate limits. HubSpot APIs have request limits (100 requests per 10 seconds for most endpoints). When you are pulling data for 300 pages and then pushing updates back, you hit those limits immediately. I had to add wait timers between requests, which turned my "quick automation" into a 20-minute process.
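The wait timers ended up looking something like this sliding-window throttle. A minimal sketch; the 100-per-10-seconds numbers are the limits I kept hitting, not a guarantee that applies to every plan:

```python
import time
from collections import deque

class RateLimiter:
    """Sleep just long enough to stay under N requests per window."""

    def __init__(self, max_requests=100, window_seconds=10):
        self.max_requests = max_requests
        self.window = window_seconds
        self.timestamps = deque()

    def wait(self):
        now = time.monotonic()
        # Drop request timestamps that have aged out of the window
        while self.timestamps and now - self.timestamps[0] >= self.window:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            # Oldest request pins the window; sleep until it expires
            time.sleep(self.window - (now - self.timestamps[0]))
        self.timestamps.append(time.monotonic())
```

Call `limiter.wait()` before every API request and the batch slows itself down instead of getting 429s back.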
Second problem: Data consistency. Not all pages have the same fields. Some pages had custom meta fields. Some had structured data that I did not account for. My automation kept throwing errors because I assumed all pages had identical schemas.
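What finally stopped the errors was building each update payload from the fields a page actually has, instead of assuming one schema. The field names here are illustrative, not HubSpot's exact property names:

```python
def build_patch(page: dict, new_meta: str) -> dict:
    """Return an update payload containing only fields this page has.

    Field names are illustrative; real pages carried a mix of
    standard and custom meta fields.
    """
    patch = {}
    if "metaDescription" in page:
        patch["metaDescription"] = new_meta
    # Some pages kept custom meta fields under a nested dict
    custom = page.get("customMeta")
    if isinstance(custom, dict) and "description" in custom:
        patch["customMeta"] = {**custom, "description": new_meta}
    return patch
```

An empty patch means "skip this page," which beats sending a PATCH that the API rejects.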
Third problem: Testing safely. How do you test bulk updates on 300 pages without accidentally breaking your live site? I ended up creating a staging environment and duplicating pages just to test the automation. That alone took over an hour.
Fourth problem: The actual updates were not simple find-and-replace operations. Our marketing team wanted conditional updates based on page type, traffic data, and content categories. What started as a straightforward automation turned into building business logic and decision trees.
Fifth problem: Error handling. When you are updating hundreds of pages, a single failure can leave your data in an inconsistent state. I needed logging, rollback strategies, and a dry-run mode that showed what would change without actually changing anything.
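The dry-run mode boiled down to computing the diff first and only applying it when explicitly told to. A simplified sketch, with `apply` standing in for the real PATCH call:

```python
def plan_updates(pages, new_meta_by_id):
    """Compute (page_id, old, new) tuples without touching anything."""
    changes = []
    for page in pages:
        pid = page["id"]
        new = new_meta_by_id.get(pid)
        old = page.get("metaDescription")
        if new is not None and new != old:
            changes.append((pid, old, new))
    return changes

def run(pages, new_meta_by_id, apply, dry_run=True, log=print):
    """Log every planned change; only mutate when dry_run is off."""
    changes = plan_updates(pages, new_meta_by_id)
    for pid, old, new in changes:
        log(f"{pid}: {old!r} -> {new!r}" + (" (dry run)" if dry_run else ""))
        if not dry_run:
            apply(pid, new)  # the real PATCH request goes here
    return changes
```

Running with `dry_run=True` first gives you a reviewable change log; the same log doubles as the audit trail once you flip the flag.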
By the time I got everything working, I had spent about 12 hours total (spread over two days). The automation ran successfully and updated 287 pages. Thirteen pages failed because of edge cases I had not accounted for, which I then had to update manually anyway.
The moment I realized I was being dumb
About halfway through debugging, someone on our team casually mentioned they found a tool that does exactly what I was trying to build.
I was skeptical. I had already invested six hours. I was committed. Also, I generally prefer building things myself because then I understand how they work.
But I checked it out anyway. The tool is called Smuves, and it does bulk HubSpot CMS editing through Google Sheets. You connect your HubSpot account, export whatever you want to edit (pages, posts, redirects, metadata), make your changes in a spreadsheet, and push everything back.
No API calls. No rate limits to manage. No custom error handling. Just spreadsheets.
I felt extremely dumb.
Why the spreadsheet approach is actually better
Here is what I did not consider when I jumped straight to automation: who else needs to do this task?
My solution required someone who understands APIs, can write and maintain automation scripts, and knows enough about HubSpot's data structure to handle edge cases. In our company, that is me. Maybe one other person.
The spreadsheet approach? Anyone on the marketing team can use it. They already know how to use Google Sheets. They can filter, sort, use formulas, and make bulk edits without pinging engineering.
That is a massive difference. My automation made the task faster for me specifically. The spreadsheet tool made the task accessible for the entire team.
Also, spreadsheets are way better for this kind of work than automation scripts. You can see all your data at once. You can sort and filter to find patterns. You can use formulas to generate content based on other fields. You can review changes before pushing them live.
When I was writing automation, I was essentially building a worse version of Excel functionality from scratch.
What the tool actually does
The workflow with Smuves is dead simple. You connect your HubSpot account (OAuth, takes 30 seconds), select what content you want to work with, and it pulls everything into Google Sheets.
Then you edit directly in the spreadsheet. Want to update all meta descriptions for blog posts about a specific topic? Filter the rows, update the column, done. Want to add alt text to images across 50 pages? Same thing.
When you are finished, you push changes back to HubSpot with one click. The tool has activity logs so you can see exactly what changed and when. That is useful for audit trails and for catching mistakes before they go live.
What I really appreciate is that it handles all the API complexity in the background. Rate limiting, batch operations, error handling, retries. All the stuff I spent hours building myself, the tool just handles automatically.
When automation is still the right choice
I am not saying automation is always bad. There are definitely cases where writing a custom integration makes sense.
You should build custom automation if you need to integrate with multiple systems beyond HubSpot, if you have highly specific business logic that requires custom code, or if you want to run bulk updates as part of a larger automated workflow.
But for straightforward bulk editing of content fields and metadata? Using a purpose-built tool is almost always better than building something yourself.
The real kicker is that tools like this often have features you would not think to build yourself. Activity tracking, rollback capabilities, collaboration features, validation before pushing changes. Building all of that into custom automation would take days.
The lesson I keep learning and forgetting
Just because you can build something does not mean you should.
I have learned this lesson multiple times in my career, and somehow I keep forgetting it. A problem appears, my brain immediately jumps to "I can automate that," and I start building before evaluating whether building is actually the best solution.
Sometimes the better solution is to use a tool that already exists, especially when that tool solves the problem more elegantly and enables people beyond just developers.
The real question is not "can I automate this?" The real question is "what is the best way to solve this problem for everyone who needs to use the solution?"
For bulk editing HubSpot content, the answer is not custom automation. It is using a tool designed specifically for that workflow.
What I would do differently
If I could redo this project, I would spend 15 minutes researching existing solutions before spending 12 hours building my own.
Not because building things is bad. I love building things. But because I was solving the wrong problem. The problem was not "how do I programmatically update HubSpot pages." The problem was "how do we make bulk content updates efficient for our team."
My automation solved the first problem. A spreadsheet-based tool solves the second problem way better because it empowers the whole team, not just the person who wrote the automation.
That said, I did learn a lot about the HubSpot API in the process. And my automation is sitting in our repo as a backup option if we ever need highly customized bulk operations that go beyond what existing tools can handle.
So it was not a total waste. Just a very inefficient use of time.
The actual takeaway
Save your custom development energy for problems that actually require custom solutions.
Before you start writing automation, spend ten minutes researching whether someone has already built a tool that solves your problem. Check the HubSpot marketplace. Search GitHub. Ask in communities.
If a tool exists and solves 80% of your use case, use the tool. You can always build custom solutions for the remaining 20% if you actually need them.
For bulk editing HubSpot content specifically, there are tools built for exactly this problem. They handle the complexity, they work for non-technical team members, and they save you from debugging API rate limits at midnight.
Your future self will thank you when you are not maintaining custom automation that does the same thing a 20-dollar-per-month tool does better.