Over the past few years, artist Ian Cheng has been creating what he calls “live simulations,” or digital moving-image works that change over time and address ideas about evolution, artificial intelligence, and our fear of the unknown. In 2017, some of these works were included in a solo exhibition at MoMA PS1 in New York, where they received rave reviews. For his first show with Gladstone Gallery in New York, Cheng is presenting a new work, BOB (Bag of Beliefs), an “artificial lifeform,” as the artist calls it, centered on an AI creature who is part snake, part branching tree. Viewers can interact with BOB, the piece’s titular protagonist, through an iOS app called Bob Shrine, which lets them offer BOB various objects, including a shiny fruit, a rock, and a shrub. BOB then interacts with each object, learns what he wants to do with it, and in the process determines whether or not he trusts the person who offered it. To learn more about the work, ARTnews chatted with Cheng ahead of the show’s opening. This interview has been condensed and edited for clarity.
ARTnews: This piece premiered at the Serpentine Galleries in London in March of last year. Since then, you’ve completely redone the work’s AI, right?
Ian Cheng: Yeah, that’s right. I pretty much restructured the architecture of its mind, which was a significant undertaking. So that’s probably the main difference between the Gladstone version and the Serpentine one, but it’s had a lot of other implications, because the heart of the project is BOB’s AI. I realized that I needed to give BOB the other half of its brain, which I call the “Congress of Demons.” Primarily, it gives BOB motivations. [Humans] don’t just recognize an apple; we’re motivated to recognize that apple and do something with it because we’re hungry. We’re constantly motivated by something—we’re never just in a neutral observational state. Even when we’re neutrally observing, we’re hungry, maybe for information, or we’re staving off boredom.
I’m really curious about the language you’ve used to describe BOB. What is demonic about the demons?
There’s psychological literature about how several different sub-personalities compose a person. We built the work around the idea [of demons] because it motivated me and everyone that I work with to understand that level of sub-personalities. They’re impulsive and possessive, like demons possessing you. Your whole body is forced toward that one goal. I chose the metaphor of a “congress,” meaning that the demons meet every few seconds in BOB’s mind to decide who gets to be in charge, and they’re all competing.
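To make the arbitration Cheng describes a little more concrete, here is a minimal sketch of how a “congress” of competing motivational sub-agents might be wired up. It is purely illustrative: the class names, the urgency scores, and the winner-take-all vote are assumptions for the sake of the example, not Cheng’s actual implementation, which has not been published.

```python
import random
from dataclasses import dataclass

# Illustrative sketch only: each "demon" is a motivational sub-agent
# that reports how urgently it wants control of the creature right now.

@dataclass
class Demon:
    name: str
    goal: str

    def urgency(self, state: dict) -> float:
        # Hypothetical rule: urgency tracks the drive level stored in `state`,
        # plus a little noise so ties resolve differently each time.
        return state.get(self.name, 0.0) + random.uniform(0.0, 0.1)

class Congress:
    """Every few seconds the demons compete, and the most urgent one 'possesses' BOB."""

    def __init__(self, demons):
        self.demons = demons

    def convene(self, state: dict) -> Demon:
        # Winner-take-all vote: the demon with the highest urgency takes charge.
        return max(self.demons, key=lambda d: d.urgency(state))

demons = [
    Demon("hunger", "seek food"),
    Demon("curiosity", "inspect the new object"),
    Demon("fear", "retreat from the stranger"),
]

congress = Congress(demons)
state = {"hunger": 0.6, "curiosity": 0.8, "fear": 0.2}
winner = congress.convene(state)
print(f"{winner.name} is in charge: BOB will {winner.goal}")
```

In a sketch like this, the whole body is “forced toward that one goal,” as Cheng puts it, until the congress convenes again and a different demon may win.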
Aside from the programming of BOB, the aesthetic of the project is really interesting to me, too. What went into BOB’s look?
I was sitting in my studio thinking about what BOB would look like, and the natural combination manifested almost like a daydream. It was a snake that, as it grew and learned, fractalized like the branches of a tree. I don’t know why I chose that. Personally, I have a deep fear of snakes because as a kid, I went to the pet store and a baby snake latched on to my finger, and they had to pry it off, but they couldn’t kill it because it was merchandise. [Laughs.] Snakes also recur in imagery across cultures, and I think that’s because there’s an inherent neurobiological instinct to be alert to snakes. So snakes somehow became synonymous with alertness or awareness. A snake is not a purely evil being, but it’s a being that makes a human more aware.
You’ve continually returned to working with AI. What interests you about it?
I think it goes back to my own interest in cognitive science as an undergrad. I’ve just been very interested in behavior and how people learn. Specifically, I’m obsessed with this idea of how a person deals with change. And I’m obsessed with this duality, this relationship between known and unknown and between order and chaos. It’s manifested in the world, and it’s manifested in our own neurobiology. I think it’s so crazy that the division of the brain into two hemispheres literally mirrors that division of the world into known and unknown. I think that’s so beautiful and under-appreciated.
People are generally scared of AI. Everyone’s so paranoid about its potential, it seems. Do you think that’s warranted?
It’s an interesting question, and I’m of two minds about it. On the one hand, the general fear of AI is coming from an insecure place. If human beings believe that the most threatening thing is oppressive power, then AI seems like a threat because it’s unknown—our projection is that its first move would be to use its power against us. However, that’s not entirely true, because in making BOB and working with AI, there’s a huge possibility of AI being an extension of a human being, integrated into human culture. I think there’s great opportunity to see it as an augmentation rather than a separate entity that’s potentially threatening.
However, in making BOB, the other side of me realized where a threat might lie. People are interested in AI that feels sentient and lifelike, whether at the level of a pet or a digital human companion. . . . In making BOB, I had to start from the most basic, animalistic, limbic core of the brain—the most basic drives and desires like threat, fight/flight, territory, status. There’s an instinctual layer [to sentient beings], which I would say is that demonic layer. There’s something maniacal about basic desires. Then there’s the [more] social layer where you learn to cooperate with other people, but you also learn to cooperate with your future self. To get BOB to the social level, I first had to program a foundational motivational structure for this brutal chimpanzee level. My fear about AI, having made this, is that whoever does this at an industrial scale, if they want lifelike AI, will have to go through this chimpanzee level. But chimpanzees can rip each other apart over a banana. I recognize that that evolutionary stage is necessary to get anything remotely resembling the complexity and intelligence of our own culture and human behavior. But they better get through it fast.