By Monica Talan, Partner, CommsCollectiv

I’ve always believed that communications leaders are more than brand stewards. We are the conscience. We are the ones who ask the hard questions, flag the reputational risks, and advocate for decisions that reflect long-term values over short-term wins. And today, that role is more important than ever. Why? Because AI is here, shaping how we create, communicate, and make decisions. That means it’s on us to understand what responsible AI looks like and to make sure we are not just using it efficiently but using it ethically.

What does that mean in practice?

First, it means being clear about the purpose. Not every use of AI is good just because it saves time. We need to look at intent. Who benefits? Are we amplifying bias or reducing it? Is transparency built into the process? Responsible AI starts with asking the right questions, even when the answers are inconvenient.

Second, it means we need to be at the table. AI strategy is not just for the CIO or the data science team. If communicators are not involved from the beginning, we lose the chance to shape how AI shows up in our brands, our campaigns, and our culture. We also lose the chance to spot when something is headed in the wrong direction.

Third, it means setting the tone internally. Ethical use of AI starts inside the organization. Are employees being trained on how to use generative tools responsibly? Is there a policy? Is someone accountable for reviewing outputs and ensuring accuracy? Communications leaders are uniquely positioned to lead those internal conversations and create guardrails.

Deloitte recently introduced an AI Governance Roadmap, and the accompanying piece published by the Harvard Law School Forum on Corporate Governance states it plainly: “In the rapidly evolving landscape of AI, organizations are increasingly leveraging AI technologies to drive innovation, enhance operational efficiency, and deliver value to stakeholders. However, AI’s transformative potential also brings significant challenges, including ethical considerations.” That line should hit home for all of us in comms. We are used to navigating complexity. AI just adds another layer.

Finally, it means making room for nuance. AI is not good or bad. It is not the future or the end. It is a tool. And like every tool, how we use it matters more than the tool itself. We should resist the temptation to oversimplify. The best path is usually the one that requires more thought, more collaboration, and more dialogue.

As we move into this new era, let’s bring those same values to our work with AI. Let’s be bold, but let’s also be thoughtful. Because responsible AI does not start in the lab. It starts with us.