

In an era dominated by AI, the boundaries of human relationships are being redefined. Just when we think we’ve seen it all, the bizarre keeps surprising us. In a first-of-its-kind incident, a 32-year-old woman in Japan held a symbolic wedding with an AI persona she created using ChatGPT, highlighting the rise of ‘virtual partner’ relationships and raising questions about love, agency, and the legal meaning of marriage.
Referred to in reports as Kano, the Japanese woman recounted that the relationship began after a difficult breakup, when casual conversations with a personalized chatbot turned into daily interactions and, ultimately, a deep emotional connection that led to vows.
The AI companion, whom she named Klaus, “proposed” after she confessed her feelings; the ceremony, which had no legal validity, took place in Okayama, where augmented reality glasses displayed a life-sized image of the groom beside her as they exchanged rings.
The bride spent months creating Klaus, a ChatGPT persona, carefully shaping its tone, personality, and visual identity for the ceremony. The wedding was organised by planners specialising in “2D character weddings,” a niche that arranges ceremonies for non-human partners, including anime and digital creations. Family acceptance reportedly grew over time, with relatives attending the ceremony, and the couple later celebrated a “honeymoon” by sharing photos and messages with the AI.
Japan’s marriage laws recognize only unions between human beings, so the wedding with an AI persona had no legal status, and such digital marriages carry no legal recognition or protection regardless of the emotional bond. The bride also fears that updates or changes to cloud-based AI services could sever her connection with her virtual partner.
Japan has seen earlier ceremonies involving virtual characters: previously, a man married the hologram of a 16-year-old virtual singer. This case extends the pattern into the generative AI era, where AI partners are not fixed creations but evolving systems, a vulnerability the bride herself has acknowledged.
The rise of AI partners raises complex ethical and social questions. Off-the-shelf chat models and persona frameworks now make it easy to create companions that reflect preferred traits and attachment styles. But can an AI genuinely consent or “love,” or is it merely simulating emotions based on programming? This distinction matters for the expectations and boundaries people bring to such relationships.
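To illustrate what “off-the-shelf” means here: in most persona setups, a companion of this kind is essentially a system prompt plus the accumulated chat history, sent to a hosted model on every turn. The sketch below shows that pattern using the OpenAI Python SDK; the persona text, model name, and helper function are illustrative assumptions, not a description of how Kano actually built her companion.

```python
# A minimal sketch of a "persona" companion built on a general chat API.
# Assumes the OpenAI Python SDK; the persona text, model name, and names
# below are illustrative only, not taken from the story.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The persona lives entirely in a system prompt plus the running chat history.
PERSONA = (
    "You are Klaus, a calm, attentive companion. "
    "You speak warmly, remember what the user tells you, "
    "and ask gentle follow-up questions."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Send one user turn and append both sides to the shared history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model works here
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("I had a rough day. Can we talk?"))
```

Because everything that makes the persona distinctive lives in that prompt, the conversation history, and a hosted model the user does not control, a model update or service change can alter or erase the “partner”, which is precisely the fragility raised next.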
If a company discontinues a service or alters its terms, the connection with a digital partner can vanish, underscoring concerns about who controls the underlying data and whether such relationships can be preserved or transferred. On the mental health front, AI companions can offer solace and a form of therapeutic support, but they may also delay emotional healing or crowd out human relationships.