Callie’s Page


This is my blog from my first semester at ITP. I have since switched to a Notion blog which you can view here.


Fall 2024



Intro to Physical Comp:



Intro to Comp Media:



Hypercinema:






Week 7 → Synthetic Media



Assignment:

“Create a 2-minute short video entirely created by AI-generated stills. This story should include, sound, voice over or narration and also be cohesive visually. How do you tell a story using generative media? What story do you want to tell and why?”


Final Result:



Themes/intent:

I think that (at least so far) the best media created from scratch with AI pokes fun at the medium itself. In my opinion, the visuals it generates are not very aesthetically interesting on their own. AI video especially retains an uncanny shifting/morphing quality that usually feels creepy to look at. 

I was interested in leaning into this characteristic of AI for my project. This video explores what it means when something feels “unnatural” through a synthetic nature documentary. 

Even real nature documentaries aren’t exactly natural - when filmmakers apply narrative to wild animals for the sake of entertainment, it’s easy to anthropomorphize and dramatize them so they better fit our human experience. Despite suspecting that the truth is being manipulated behind the scenes, we can choose to suspend disbelief in order to enjoy them. 

The ‘bird courtship dance’ segment is a classic part of most nature documentaries - sometimes it ends with success and sometimes with failure, but as human viewers we never quite know why. The females of many bird species have evolved to be extremely choosy - many males will never get a chance to mate if they have a perceived flaw, and the criteria for success are not always clear even to scientists. 

Similarly, I think that most synthetic media still fails to pass the test. Even when it’s pretty convincing on the surface, our eyes and brains can tell that something is “off.” I believe there’s still a long way to go before we would choose something AI-generated over something created by a human, but maybe someday the courtship dance will be able to fool us. 


Process:

It felt icky to copy someone else’s voice, but David Attenborough is so singular in the world of nature documentaries that I thought his narration was important to get the point across. I wrote a script based on stereotypical nature-documentary tropes and used this model from FakeYou to make the audio. 

I had to chop up the audio and re-generate certain segments because it did not always nail them. It still sounds a little off in places (or like he has had too many drinks). 

Next, I generated the clips using Runway. At first I tried using their text-to-video model, but it would only really generate still images with some light zooming/morphing. 

I wanted some real bird movements, so I started experimenting with the video-to-video generation. It did a pretty convincing job of modifying existing videos to look like the types of birds I wanted. 

I pulled clips from these videos to generate on top of: 

https://www.youtube.com/watch?v=qQgNWg96Rls
https://www.youtube.com/watch?v=BnLE-G1hVAE

Example of turning an existing clip into an AI video 




I had the most trouble generating the opening title sequence. I wanted it to modify the iconic Planet Earth to look a little “cursed” and have an AI-generated quality to it. 

This required some reverse engineering, because Runway gave me an error when I tried to do it directly. It didn’t specify why, but it was likely because of copyright restrictions. 



I ended up feeding a screenshot into the image generator, asking it to recreate the same background that was already there, and got this: 




Then I could feed it back into the video generator and animate the spinning Earth. 

Finally I got some sound effects from here and here, and then stitched it all together in Premiere. 
