
AI for Meeting Minutes: How AI Replaces the Worst Job in Every Organization

Ibby Syed, Founder, Cotera
8 min read · March 6, 2026


In June 2024, something happened that convinced me meeting minutes were fundamentally broken. We had a product planning session — about 47 minutes, seven people in the room. Our PM Nadia was taking notes. So was our engineering lead Rafael, independently. Neither knew the other was writing.

Afterward I compared the two documents. Nadia's minutes focused on customer impact and launch timelines. Rafael's focused on technical trade-offs, infrastructure dependencies, and the risk of our current caching approach under load. Same meeting. Same conversation. The two sets of minutes barely overlapped. It was like they'd attended different meetings.

This is the core flaw of human-generated minutes: they're filtered through whatever the note-taker cares about before they even hit the page. Nadia heard "customer retention might drop 2% if we delay." Rafael heard "the migration could take 14 days under best-case scenario." Both things were said. Each person captured only the half that matched their mental model. The other half evaporated.

Nobody wants the job in the first place. In 14 years of running teams, I've never had someone volunteer to take minutes. Not once. It's the organizational equivalent of cleaning the office kitchen. Everyone knows it matters, nobody wants to do it, and the person who gets stuck with it does a mediocre job while quietly resenting everyone else for not volunteering.

Why Meeting Minutes Have Been Bad for 50 Years

The basic meeting minutes template hasn't changed since the 1970s. Date, attendees, agenda items discussed, decisions made, action items, next meeting. I've seen this format in board meetings at companies doing $200M in revenue and in Monday standups at 4-person startups. The template is fine. What goes inside it is reliably terrible.

There are two failure modes and I've lived through both. Too detailed: someone writes a near-verbatim account that runs to four pages for a 30-minute meeting. Nobody reads it. I once sat in a cross-functional meeting at a previous company where the admin assistant produced a 6-page document. The meeting was 35 minutes. I asked five people whether they'd read it. Zero had. Six pages of effort for zero readers.

Too brief: "Discussed Q3 roadmap. Decided to prioritize feature X. Next meeting Thursday." This tells you what happened at the altitude of a satellite photo. Landmasses visible, roads invisible. Our COO Diana used to produce minutes like this. She'd joke that her style was "minimalist." It was also useless if you needed any detail about why we'd chosen feature X over feature Y or who was supposed to build it.

The useful middle ground — enough detail to reconstruct context but structured enough to scan in 60 seconds — is extraordinarily difficult to produce in real time. You're asking the note-taker to simultaneously listen, evaluate importance, compose written summaries, and keep up with conversation pace. Good executive assistants can do this. Most engineers, PMs, and managers cannot. Yet those are the people we assign to the task.

What AI Minutes Actually Look Like

When we switched to Fireflies for automatic meeting capture, the output quality was immediately better than what anyone on the team produced by hand. Not because the AI is smarter than Nadia or Rafael. Because it has two structural advantages they don't: it captures everything first and summarizes second (humans do both simultaneously), and it applies the same template every single time (humans vary wildly in how they organize information).

Our AI-generated minutes have five sections. Overview: 2-3 sentences on the meeting's purpose and outcome. Decisions: every explicit decision, who made it, and the reasoning discussed. Action items: every commitment with a person attached and any mentioned deadline. Key discussion points: the substantive topics with enough context to understand the reasoning. Open questions: things raised but not resolved.
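
That five-section structure is easy to pin down as a schema. Here's a minimal sketch in Python; the class and field names are my own illustration, not the actual Fireflies output format.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative schema for the five-section minutes format described above.
# Names are hypothetical -- not Fireflies' real data model.

@dataclass
class Decision:
    summary: str      # what was decided
    made_by: str      # who made the call
    reasoning: str    # the reasoning discussed in the meeting

@dataclass
class ActionItem:
    task: str
    owner: str                        # every commitment gets a person attached
    deadline: Optional[str] = None    # only if a deadline was mentioned

@dataclass
class MeetingMinutes:
    overview: str                     # 2-3 sentences: purpose and outcome
    decisions: list[Decision] = field(default_factory=list)
    action_items: list[ActionItem] = field(default_factory=list)
    discussion_points: list[str] = field(default_factory=list)
    open_questions: list[str] = field(default_factory=list)
```

The point of a fixed schema is the second structural advantage mentioned above: the same template, every single time, regardless of who was in the room.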

The soundbite library agent adds something traditional minutes never could: direct links to the exact moment in the recording where a decision was made. When someone disputes what was agreed to — and I promise you this happens at least monthly in any organization — you play the 30-second clip. In our first quarter using this system, we had zero "that's not what I said" disputes. The quarter before, with human-generated minutes, we had four. Each one consumed at least 30 minutes of three people's time while they argued over whose recollection was correct.

The Follow-Through Problem Minutes Never Solved

Traditional minutes list action items. That's where their job ends. "Rafael to send technical spec by Friday." Whether Rafael actually sends the spec is a completely separate problem that the minutes document doesn't address and never has.

This is why meeting minutes have always felt incomplete to me. They capture what people said they'd do, not whether they did it. You'd need a separate project management system to track follow-through, and in practice, the bridge between minutes and task tracker never gets built. The minutes get filed somewhere. The action items get forgotten. Two weeks later, in the next meeting, someone asks "Did we ever send that spec?" and seven people glance around the table.

The meeting action tracker closes this gap. It pulls commitments from the full transcript — not the summary, the complete transcript, which catches commitments the summary sometimes misses — and tracks them. When Friday arrives and Rafael hasn't sent the spec, the tracker flags it.
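
The flagging step is simple once commitments exist as structured data. A hypothetical sketch of that logic (the field names and the example items are mine, not the tracker's actual internals):

```python
from datetime import date

def flag_overdue(action_items, today):
    """Return commitments that are past their deadline and still incomplete.

    action_items: list of dicts with 'task', 'owner', 'due' (date), 'done' (bool).
    """
    return [
        item for item in action_items
        if not item["done"] and item["due"] < today
    ]

# Hypothetical commitments extracted from a transcript.
items = [
    {"task": "Send technical spec", "owner": "Rafael",
     "due": date(2024, 6, 14), "done": False},
    {"task": "Update pricing doc", "owner": "Nadia",
     "due": date(2024, 6, 20), "done": True},
]

# When Friday has passed and the spec hasn't been sent, it gets flagged.
overdue = flag_overdue(items, today=date(2024, 6, 15))
```

Everything before this step (pulling commitments out of the raw transcript) is the hard, model-driven part; the tracking itself is just a date comparison that nobody was doing.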

First month we used it: 143 action items identified across all team meetings. 89 completed on time (62%). 31 completed late (22%). 23 never completed (16%). That last number shocked me. One in six commitments was falling into a hole. We thought we were disciplined. We weren't. By month three, the uncompleted rate had dropped to 7%, mostly because people knew their commitments were being tracked. Rafael told me he started setting calendar reminders for every commitment he made in meetings — not because the tracker told him to, but because he didn't want to be flagged.

Search Changes Everything About Minutes

Paper minutes — or their modern equivalent, a Google Doc linked from a calendar invite — are write-once-read-never documents. An Atlassian study found that 73% of meeting notes are never referenced after the day they're written. The retrieval mechanism is the bottleneck: you'd need to remember which meeting covered the topic, find that meeting on your calendar, click the notes link, and then Ctrl-F within the doc. If the topic spanned multiple meetings, you're doing this three to five times. Most people give up and just Slack a colleague who was there.

AI minutes stored in a searchable system flip this completely. You search by topic, not by meeting. "What did we decide about the pricing model?" returns every meeting where pricing was discussed, with the specific decisions highlighted and timestamped. Search takes seconds. The accuracy is higher because you're searching against a full transcript, not against Nadia's filtered version of events.
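
The shape of topic-first retrieval is worth making concrete. This sketch uses plain keyword matching over an assumed transcript store; a real system would use embeddings or full-text indexing, but the inversion is the same: query by topic, get back meetings.

```python
def search_transcripts(transcripts, query):
    """Return every meeting whose transcript mentions the query, newest first.

    transcripts: list of dicts with 'meeting', 'date' (ISO string), 'text'.
    (Hypothetical store format -- illustrative only.)
    """
    hits = [t for t in transcripts if query.lower() in t["text"].lower()]
    return sorted(hits, key=lambda t: t["date"], reverse=True)

store = [
    {"meeting": "Product planning", "date": "2024-06-03",
     "text": "We agreed the pricing model moves to usage-based tiers."},
    {"meeting": "Sprint review", "date": "2024-06-10",
     "text": "Caching migration is on track for next week."},
]

# One query across every meeting, instead of calendar archaeology.
results = search_transcripts(store, "pricing model")
```

Note what's being searched: the full transcript text, not anyone's summary of it. That's where the accuracy advantage comes from.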

Our product team told me this saves them about 3 hours per week collectively. They used to spend that time reconstructing decision history from fragments of memory and half-finished docs. Now they search, find the meeting, and have the full context in under a minute.

The meeting summary digest takes it further by proactively surfacing patterns. Instead of waiting for someone to search, it generates a weekly summary: major decisions, action items, recurring topics, open questions. Diana gets this every Monday morning. It replaced our Monday status meeting entirely. The information that meeting was supposed to produce now arrives in Slack before anyone makes coffee. Diana estimated we got back nearly five hours of collective meeting time per week: seven people times a 40-minute status meeting that nobody missed once it was gone.
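
Mechanically, a weekly digest is a rollup across the week's minutes. A minimal sketch, assuming each meeting's minutes expose decisions, action items, and open questions as lists (this is my illustration, not the Meeting Summary Digest agent's actual implementation):

```python
def weekly_digest(week_of_minutes):
    """Collapse a week of per-meeting minutes into one leadership summary.

    week_of_minutes: list of dicts, each with 'decisions', 'action_items',
    and 'open_questions' lists. (Hypothetical format.)
    """
    digest = {"decisions": [], "action_items": [], "open_questions": []}
    for minutes in week_of_minutes:
        for section in digest:
            digest[section].extend(minutes.get(section, []))
    return digest

# Two meetings from one week, rolled into a single Monday-morning summary.
week = [
    {"decisions": ["Move to usage-based pricing"],
     "action_items": ["Nadia: update pricing doc"],
     "open_questions": []},
    {"decisions": [],
     "action_items": ["Rafael: send technical spec"],
     "open_questions": ["Does the caching approach hold under load?"]},
]
digest = weekly_digest(week)
```

Recurring-topic detection would sit on top of a rollup like this; the basic move is that the digest reads every meeting so the leadership team doesn't have to.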

The Culture Shift I Didn't Expect

Switching from human minutes to AI isn't just a tooling change. It changes how people behave in meetings.

When minutes are manual, meetings are ephemeral. Whatever the note-taker doesn't capture is gone. People act accordingly. Important decisions get re-discussed "just to make sure everyone's aligned" — which really means "because nobody trusts the notes from last time." Meetings multiply to compensate for institutional amnesia.

When minutes are automated, meetings become permanent records. People speak more precisely because they know a transcript exists. Decisions stick because they're documented with context. And the number of meetings drops. We cut recurring meetings by about 15% in the first quarter after implementing AI minutes. Teams realized they could search for the answer instead of booking a room to ask the question.

The person who used to take minutes now participates. That matters more than it sounds. If you've been assigning your most junior team member to take notes in your most important meetings, you've been simultaneously excluding them from contributing and producing the lowest-quality record possible. AI fixes both. Our newest analyst, Kenji, told me he felt like he'd finally been "promoted from secretary to participant" in sprint planning. He was joking. Mostly.


Try These Agents

  • Soundbite Library -- Link meeting minutes to the exact recorded moments where decisions were made for indisputable reference
  • Meeting Action Tracker -- Extract and track every commitment from meeting transcripts to ensure nothing falls through the cracks
  • Meeting Summary Digest -- Generate weekly leadership digests of all meetings, replacing Monday status meetings with searchable summaries
