Okay, now we’re getting somewhere.
I took the code that visualizes the Kinect depth data, and the code that knocks out the background, and combined them. So now, as things get further from the camera, they fade into transparency. I can insert any background here, and it doesn’t have to be the same colour, but for now I’ve put the blue recycling bin to work again.
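In case it helps to see the idea in miniature: the fade amounts to turning each pixel's depth reading into an opacity value and blending the live image over the background with it. Here's a minimal per-pixel sketch in Python; the `near` and `far` thresholds (in millimetres, roughly the Kinect's working range) are my own assumed values, not the ones used here.

```python
def depth_to_alpha(depth_mm, near=800.0, far=2000.0):
    """Map a depth reading to opacity: fully opaque at `near`,
    fully transparent at `far`. The thresholds are illustrative."""
    alpha = (far - depth_mm) / (far - near)
    return max(0.0, min(1.0, alpha))  # clamp to [0, 1]

def blend(fg, bg, depth_mm, near=800.0, far=2000.0):
    """Blend a foreground pixel over a background pixel
    (each an (r, g, b) tuple) using depth-derived alpha."""
    a = depth_to_alpha(depth_mm, near, far)
    return tuple(a * f + (1.0 - a) * b for f, b in zip(fg, bg))

# A pixel at 1400 mm sits halfway between the thresholds,
# so it comes out half faded into the background.
white = (1.0, 1.0, 1.0)
blue = (0.0, 0.0, 1.0)  # stand-in for the recycling-bin backdrop
halfway = blend(white, blue, 1400.0)
```

Run over every pixel of a depth frame, this is the whole fog effect: anything past `far` vanishes into whatever background you supply.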
There was a lazy susan in the studio, luckily enough, and it came in handy for spinning the chair (you can see the edge of it showing up in the clip above). Here’s the setup with the Kinect sensor in the background.
There’s something eerie about seeing the chair slip out of view, as if the scene is lit by candlelight. Fading the object into the background introduces a kind of fog, and the chair starts to seem larger, a building looming out of mist. Its four legs begin to read as the corners of a house. I want to use this effect to make a small landscape model feel like an island.
I’m also wondering if I could connect this to a higher-quality camera. The Kinect depth data wouldn’t line up perfectly with another camera’s viewpoint and frame timing, but it might work well enough. I’d love to try some photography or video outdoors using this effect. Funny to think about introducing virtual fog to the streets of St. John’s.
Here’s a part of a wonderful poem about embracing fog and flaws.
I tell you it has taken me all my life
to arrive at the vision of gas lamps as angels,
to soften and blur and finally banish
the edges you regret I don’t see,
to learn that the line I called the horizon
does not exist and sky and water,
so long apart, are the same state of being.
—Lisel Mueller, from “Monet Refuses the Operation”