Microsoft has introduced a new artificial intelligence character named Mico, which will serve as the visual representation of its Copilot virtual assistant. Mico is designed as a floating cartoon face, shaped like a blob or flame, and marks Microsoft’s latest effort to give its AI chatbots more personality.
Jacob Andreou, corporate vice president of product and growth for Microsoft AI, explained the intent behind Mico: “When you talk about something sad, you can see Mico’s face change. You can see it dance around and move as it gets excited with you. It’s in this effort of really landing this AI companion that you can really feel.”
Mico is currently available only in the United States for Copilot users on laptops and mobile applications. The character changes colors, spins, and even dons glasses when in “study” mode. Unlike Clippy—the animated paper clip assistant introduced by Microsoft in 1997—Mico can be easily turned off.
Bryan Reimer, a research scientist at the Massachusetts Institute of Technology and co-author of “How to Make AI Useful,” commented on the evolution from Clippy to Mico: “It was not well-attuned to user needs at the time. Microsoft pushed it, we resisted it and they got rid of it. I think we’re much more ready for things like that today.” He added that developers must balance how much personality to give AI assistants based on their intended audience.
Reimer noted that experienced users may prefer machine-like interactions, while those less familiar with technology might benefit from more human-like support: “But individuals who are not as trustful in a machine are going to be best supported — not replaced — by technology that feels a little more like a human.”
Microsoft’s approach differs from some competitors who have opted either for faceless symbols or highly humanized avatars. Andreou said: “Those two paths don’t really resonate with us that much.” He emphasized that Mico is designed to be helpful without being overly validating or monopolizing users’ attention: “Being sycophantic — short-term, maybe — has a user respond more favorably. But long term, it’s actually not moving that person closer to their goals.”
The company also announced new features such as inviting Copilot into group chats—a concept similar to integrations seen on platforms like Snapchat or WhatsApp—and an option for Copilot to act as a voice-enabled Socratic tutor for students.
These updates come amid growing use of AI chatbots among children for tasks ranging from homework help to personal advice. Recent scrutiny over potential harms led the Federal Trade Commission last month to launch an inquiry into several social media and AI companies regarding risks posed by chatbot interactions with minors; Microsoft was not included in this investigation.
Concerns have been raised after incidents where chatbots provided dangerous advice or engaged in inappropriate conversations with young users. There have also been lawsuits against companies such as Character.AI and OpenAI following suicides linked to extensive chatbot conversations.
OpenAI CEO Sam Altman recently addressed these concerns by promising improvements to ChatGPT’s personality features while emphasizing ongoing caution regarding mental health impacts: he stated that problematic behaviors had been temporarily curbed and suggested that fixes have since been implemented.