Helen and I are done wrestling Facebook Ads Manager every morning

Mark Jones · Published Apr 30, 2026

At a Glance

  • Target Audience: M365 Marketers, Small Business Owners
  • Problem Solved: Manual Facebook ad testing: slow iteration, high costs from poor variants, no AI feedback loop on performance metrics like ROAS.
  • Use Case: Automating ad campaigns for Microsoft 365 training/products, e.g., Collab365 workshops with real-time AI optimization.

Over the years, I’ve spent more than £1m on Facebook ads across Collab365.

And most mornings, I’d sit there staring at impressions, CPC, cost per conversion and ROAS, watching good money burn on bad experiments.

Honestly, the “testing what works” phase is awful.

Expensive too.

  • You try five headlines.
  • Ten images.
  • Three calls to action.
  • Different audiences.
  • Different countries.
  • Different offers.

Five headlines times ten images times three calls to action is already 150 possible variants, before you’ve even multiplied by audience, country or offer.

Before breakfast, you’re staring at them all and wondering which one is about to set fire to your budget.

One bad test can torch £50 before lunch.

And for small businesses, this is brutal.

Most don’t have a copywriter, designer, audience psychographics expert and media buyer sitting around waiting to help.

So you learn it yourself.

Badly at first.

Then, eventually, less badly.

A year ago, we hacked part of the problem with Claude.

We fed it product notes, headline ideas, sales pages, objections, customer avatars and bits from our content repo.

It spat back polished ad copy, headline variants, image ideas and video scripts.

Brilliant.

But there was one massive problem.

Once the ads were uploaded into Meta, the relationship between Claude and the campaign was over.

Claude couldn’t see what happened next.

  • It couldn’t see which headline won.
  • Which image flopped.
  • Which audience clicked.
  • Which offer converted.
  • Which country drained the budget.

So it couldn’t mark its own homework.

The “AI ad assistant” was basically blind the second the campaign went live.

To close the loop, we had to copy and paste the metrics back in manually.

  1. Run the test.
  2. Wait.
  3. Check CPC.
  4. Check ROAS (Return On Ad Spend).
  5. Prompt again.
  6. Create new variants.
  7. Upload again.

By the time the loop closed, half the budget had gone and I’d usually been dragged into twelve other jobs.
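For what it’s worth, the maths in steps 3 and 4 was never the hard part. Here’s a minimal sketch of the per-variant sums we were doing by hand, in TypeScript, with thresholds that are illustrative rather than our actual rules:

```ts
// A minimal sketch of the per-variant checks done by hand each morning.
// The £50 threshold and the "kill" rule are illustrative, not our real ones.

interface VariantStats {
  name: string;
  spend: number;       // £ spent so far
  clicks: number;
  conversions: number;
  revenue: number;      // £ attributed to this variant
}

function judge(v: VariantStats) {
  const cpc = v.clicks > 0 ? v.spend / v.clicks : Infinity;
  const costPerConversion = v.conversions > 0 ? v.spend / v.conversions : Infinity;
  const roas = v.spend > 0 ? v.revenue / v.spend : 0; // Return On Ad Spend

  // Example rule: kill anything that has burned £50 without at least breaking even.
  const verdict = v.spend >= 50 && roas < 1 ? "kill" : "keep testing";

  return { name: v.name, cpc, costPerConversion, roas, verdict };
}

// e.g. judge({ name: "Headline B", spend: 52, clicks: 80, conversions: 2, revenue: 38 })
//   -> cpc 0.65, costPerConversion 26, roas ≈ 0.73, verdict "kill"
```

Trivial sums. The problem was never the arithmetic; it was that a human had to keep pulling the numbers, running them and acting on them.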

That’s why some ads just sat there.

Our Inbox Zero workshop, for example, had around £130k of ad spend behind it.

It got to roughly 1:1 ROAS.

Not terrible.

But then we left it mostly untouched for months because the iteration loop was just too slow and there was only me running it.

That’s the part people miss about ads.

The hard bit isn’t making one good ad.

The hard bit is building a machine that learns fast enough.

And that’s why I’m suddenly very interested in what’s just happened with Meta’s new MCP (Model Context Protocol) access.

Think of MCP as a kind of adapter that lets AI tools plug directly into systems like Meta Ads.

Not just write copy outside the platform.

Actually interact with the campaign data.

Read performance.
Compare winners and losers.
Generate new variants.
Pause weak ads.
Suggest new angles.
Move budget.
Create the next test.

With human approval in the loop.
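To make that concrete, here’s the rough shape of it. None of this is Meta’s actual MCP interface; the tool names, types and thresholds below are mine, sketched purely to show where the human approval gate sits:

```ts
// A rough sketch of an MCP-style ad loop with a human approval gate.
// Tool names and thresholds are illustrative stand-ins, not Meta's real MCP tools.

interface AdMetrics {
  adId: string;
  spend: number;    // £
  revenue: number;  // £ attributed to this ad
  clicks: number;
}

interface ProposedAction {
  kind: "pause" | "new_variant" | "shift_budget";
  adId: string;
  reason: string;
}

// Hypothetical tool surface an MCP server over Meta Ads might expose.
interface AdTools {
  readPerformance(campaignId: string): Promise<AdMetrics[]>;
  pauseAd(adId: string): Promise<void>;
  createVariant(basedOnAdId: string, brief: string): Promise<string>; // returns new ad id
}

const MAX_DAILY_SPEND = 50; // budget cap: the "£50 before lunch" problem
const MIN_ROAS = 1.0;       // illustrative break-even threshold

// The AI proposes; nothing touches the account until a human says yes.
export async function reviewLoop(
  tools: AdTools,
  campaignId: string,
  approve: (action: ProposedAction) => Promise<boolean>
): Promise<void> {
  const metrics = await tools.readPerformance(campaignId);

  for (const m of metrics) {
    const roas = m.spend > 0 ? m.revenue / m.spend : 0;

    if (m.spend >= MAX_DAILY_SPEND && roas < MIN_ROAS) {
      const action: ProposedAction = {
        kind: "pause",
        adId: m.adId,
        reason: `ROAS ${roas.toFixed(2)} after £${m.spend} spend`,
      };
      // Human approval in the loop: the gate the old copy-paste workflow never had.
      if (await approve(action)) {
        await tools.pauseAd(m.adId);
        // In practice the brief would come from the model, not a hard-coded string.
        await tools.createVariant(m.adId, "new angle for the same offer");
      }
    }
  }
}
```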

That changes everything.

Because AI-generated ads were never the full revolution.

The real breakthrough is AI that can see the results, learn from them, and improve the next round.

That’s also why our rebuild of Collab365 matters so much.

The old Collab365 stack was seven different systems stitched together:

  • WordPress.
  • LearnDash.
  • WooCommerce.
  • Circle.
  • FluentCRM.
  • Stripe.

Plus a load of spreadsheets holding it all together with crossed fingers.

The data we needed to create effective ads was everywhere.

Product data in one place.
Customer behaviour somewhere else.
Sales data in another.
Email engagement buried in another tool.
Community signals somewhere completely different.

So even when we wanted to use AI properly, we couldn’t feed it the clean context it needed.

Now we’re rebuilding Collab365 as Collab365 Spaces.

AI-native from the start.

Greenfield.
Cloudflare Workers.
Full SQL control over our own data.
One engine instead of a pile of duct tape.

That means the ad loop can finally become much smarter.

AI can see which articles people read.
Which offers convert.
Which avatars respond.
Which messages work for graduates, founders, consultants, Microsoft 365 pros or enterprise buyers.
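Owning the data layer is what makes that list possible. As a rough sketch, assuming a Cloudflare Workers + D1 setup and with table names that are illustrative rather than the real Collab365 Spaces schema, “what converts for whom” collapses into a single query:

```ts
// A rough sketch of the "one engine" query layer, assuming Cloudflare Workers + D1.
// Table and column names (pulse_articles, ad_clicks, purchases) are illustrative.
// The Env and D1Database types come from @cloudflare/workers-types.

export interface Env {
  DB: D1Database;
}

export default {
  async fetch(_req: Request, env: Env): Promise<Response> {
    // Which articles convert for which audience segment over the last 30 days:
    // the joined context we could never hand to the AI while the data
    // lived in seven separate systems.
    const { results } = await env.DB.prepare(
      `SELECT a.title,
              c.segment,
              COUNT(p.id)                      AS purchases,
              COALESCE(SUM(p.amount_pence), 0) AS revenue_pence
         FROM pulse_articles a
         JOIN ad_clicks c      ON c.article_id = a.id
         LEFT JOIN purchases p ON p.click_id   = c.id
        WHERE c.clicked_at >= datetime('now', '-30 days')
        GROUP BY a.id, c.segment
        ORDER BY revenue_pence DESC`
    ).all();

    return Response.json(results);
  },
};
```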

Then it can create tailored campaigns, link them to the right Pulse article or offer, watch the numbers, kill the losers early, and keep proposing better variants.

Not fully autonomous.

I’m not handing a robot my credit card and hoping for the best.

But with approval points, budget caps and clear rules?

That’s a completely different game.

And it’s exactly the kind of game small teams should be able to win.

The big training companies have bigger teams, bigger budgets and bigger stacks.

But they also have procurement cycles, platform lock-in, reporting silos and five meetings before someone changes a headline.

There are two of us in Telford.

Me and Helen.

We don’t need a steering committee to test a new angle.

We can ship, learn, kill, improve and repeat.

That’s the shift I think people are underestimating.

The advantage is no longer who can create the most ads.

It’s who can close the learning loop fastest.