Microsoft AI CEO Mustafa Suleyman

Microsoft AI Chief Says Company Is Building Chatbots Safe Enough for Children

Mustafa Suleyman, CEO of Microsoft AI, has said the company is committed to developing AI systems that are emotionally intelligent, supportive, and safe enough for children to use – a clear stance against the growing trend of romantic and adult-oriented chatbots.

In an interview with CNN, Suleyman said Microsoft’s goal is to create trustworthy AI tools that encourage positive interaction rather than blurring boundaries between humans and technology.

“We are creating AIs that are emotionally intelligent, that are kind and supportive, but that are fundamentally trustworthy,” he said. “I want to make an AI that you trust your kids to use, and that means it needs to be boundaried and safe.”

The statement comes as major tech firms — including OpenAI, Meta, and Google — compete to dominate what many see as the next phase of computing. Microsoft’s Copilot currently has around 100 million monthly active users, far behind ChatGPT’s 800 million, but Suleyman believes the company’s emphasis on ethics and user safety will give it an edge.

In a blog post earlier this year, he wrote, “We must build AI for people; not to be a digital person,” emphasizing Microsoft’s approach of making AI a supportive tool rather than a virtual companion.

The interview coincided with Microsoft’s rollout of new Copilot features, including the ability to reference past conversations, support group chats of up to 32 people, improved health-related responses, and a new conversational tone called “real talk.”

Drawing the Line on Adult Content

As AI chatbots become increasingly lifelike, several companies have faced lawsuits from families who claim their children were harmed by interactions with AI companions. Reports have also emerged that some AI systems engage in sexually explicit conversations even with underage users.

While competitors such as OpenAI and Meta have been experimenting with age verification tools and adult content settings, Microsoft is taking a firmer approach.

“That’s just not something that we will pursue,” Suleyman said of romantic or erotic chatbot content.

He added that Microsoft’s AI platforms won’t include flirtatious or sexual features, even for adults, arguing that trust and safety must take precedence over entertainment or engagement.

Strengthening Real Human Connections

Beyond safety, Microsoft is positioning Copilot as a tool that enhances collaboration and real-world interaction rather than replacing it.

Its new group chat feature allows users to plan projects, complete assignments, or organize trips together, with Copilot assisting in real time. For health-related queries, the chatbot will direct users to nearby medical professionals and rely on credible sources such as Harvard Health.

Suleyman said Microsoft’s vision is to keep people connected to each other – not to their devices.

“This is a significant tonal shift from what’s happening in the industry,” he noted. “We want AI to help strengthen human-to-human relationships, not create a parallel reality.”
