The following is a guest post by the Library’s 2023 Innovator in Residence Jeffrey Yoo Warren (link to press release). As part of his residency, Warren will publish a toolkit to empower communities to create relational reconstructions of destroyed neighborhoods of color using 3D modeling methods and historic photographs. In the following post, Warren discusses creating atmospheric techniques in collaboration with multidisciplinary artist and designer Alicia Renee Ball (link to artist’s page).
When I first began virtually reconstructing Providence Chinatown from 1914, at the start of Seeing Lost Enclaves, I quickly understood that the evenly lit renderings of these historical scenes would not be enough to really give us the feeling of being there. The level of detail of the facades was pretty good, but the flat, fluorescent-like illumination just didn’t bring me back to the feeling of a place. My own strongest memories are… richer, more colorful, and honestly, less precise; memory blurs, it distorts, it’s of too-hot days and dimly lit streets. I was going to have to get away from a more architectural style and learn how to do lighting and atmospheric effects. To become a cinematographer–and to build some 3D rendering skills.
My first attempts were all right–I was able to add colored light and some nice underlighting, and to create convincing simulations of snow and fog. But it had been a while since I'd used 3D programs, and the technology has grown immensely. Blender is the free, open source software I have used to create these environments. Although not all of these effects will import into an interactive environment like Mozilla Hubs, as I've done in the past, there is still a lot that can be achieved with some simple lighting–night-time effects in particular are very evocative.
One artist whose work I have followed with admiration for some time is Alicia Renee Ball, and incidentally, much of her 3D visual design work is done in Blender. I was also very interested in Alicia's approach to understanding historic spaces through a combination of deep research, detailed 3D scanning and modeling, and a personal relationship with and passion for those stories. And I drew inspiration from her focus on texture, light, and mood to evoke a sense of emotional space in her virtual environments.
I was so excited when Alicia expressed interest in collaborating, and we ended up co-authoring both a written guide to atmospheric effects in relational reconstruction–spanning both of our work–and a step-by-step video tutorial, which will be released with the toolkit in the fall.
Alicia’s portion of the guide took as a case study her prior work on a cabin at the site New Echota. In her words:
“This model is a reconstruction of a big timber log cabin found at the historic site, New Echota. New Echota was the capital of the Cherokee Nation in the early 19th century in the state of Georgia before forced removal by the U.S. government. This area of Georgia is the home of the Koasati (Coushatta, Coosawattee, Coosa), Miccosukee, Yuchi and Cherokee Indigenous peoples.
I have been studying the histories of Black people who lived under enslavement in this area, and who later would be known as Cherokee Freedmen following the Civil War. Their stories are part of the complex history of this place and this cabin shares some similarities with what their homes may have looked like. For me, this provides a partial glimpse of how they might have lived during this era, and is part of my journey to learn more about their lives.”
She had previously produced a model of the cabin using a technique called photogrammetry, and took that model as a starting point for the guide.
As her scene came together, we began brainstorming and prototyping ways to create atmospheric effects in Blender, and we both researched and built new skills to bring the scene to life. Alicia used a technique called volumetric fog, which creates layers and clumps of fog or mist that hang onto parts of the scene and drift slowly, and which catch and scatter the various sources of light in the scene.
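Under the hood, the way volumetric fog holds light rests on a simple physical idea: light traveling through a participating medium is dimmed exponentially with the fog's density and the distance traveled (the Beer–Lambert law). Here is a toy, pure-Python illustration of that falloff–not Alicia's actual Blender shader setup, just the principle that makes distant lights seem to hang in the mist:

```python
import math

def transmittance(density, distance):
    """Fraction of light surviving a path through uniform fog
    (Beer-Lambert law: T = exp(-density * distance))."""
    return math.exp(-density * distance)

# Two lanterns in the same fog: the farther one is dimmed far more,
# and the dimming compounds exponentially rather than linearly.
near = transmittance(density=0.2, distance=2.0)   # a nearby light
far = transmittance(density=0.2, distance=10.0)   # a distant light

# Thicker clumps of fog (higher density) dim lights faster with
# distance, which is what gives the scene its layered, hazy depth.
```

In Blender itself this attenuation (plus the scattering that makes the fog glow) is handled by a volume material; the sketch above only shows why density and distance trade off against each other.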
Finally, we wanted to try something special–we decided to develop a simulation of fireflies, to create a sense of mystery and beauty in the scene. Alicia drew on her prior experience with a “particle system” technique for creating flocks of birds–and working together, we adapted this motion system, called “Boids,” to move tiny points of light. Alicia developed a set of behavioral rules to give them random wandering paths and to make them avoid objects in the scene.
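The two behavioral rules described above–random wandering plus obstacle avoidance–can be sketched in a few lines of pure Python. This is only an illustration of the Boids-style idea; in practice Blender's particle system handles this physics for you, and the class and parameter names here are hypothetical:

```python
import math
import random

class Firefly:
    """A single point of light wandering in 2D."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.heading = random.uniform(0, 2 * math.pi)

    def step(self, obstacles, speed=0.1, wander=0.3, avoid_radius=1.0):
        # Rule 1: random wandering -- jitter the heading a little each frame.
        self.heading += random.uniform(-wander, wander)
        # Rule 2: obstacle avoidance -- steer directly away from any
        # circular obstacle (cx, cy, r) whose buffer zone we have entered.
        for cx, cy, r in obstacles:
            dx, dy = self.x - cx, self.y - cy
            if math.hypot(dx, dy) < r + avoid_radius:
                self.heading = math.atan2(dy, dx)
        self.x += speed * math.cos(self.heading)
        self.y += speed * math.sin(self.heading)

random.seed(42)
swarm = [Firefly(random.uniform(-2, 2), random.uniform(-2, 2))
         for _ in range(20)]
obstacles = [(0.0, 0.0, 0.5)]  # one circular obstacle at the origin

# Advance the simulation; each firefly drifts randomly but steers
# clear of the obstacle, producing the meandering paths in the scene.
for _ in range(200):
    for f in swarm:
        f.step(obstacles)
```

In the actual scene, each such point would drive a small emissive light, and the volumetric fog would catch its glow as it moves.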
The result was even better than we had hoped: the scene came alive with a range of dynamic lights, and the fireflies gave it a sense of depth and life we were really excited by.
I’ve continued to learn Blender animation and to work on ways to bring these effects and techniques into the ongoing series of relational reconstruction workshops I’ve begun hosting. And I’ve been excited to see Alicia take her own modeling to new levels, including creating real-world objects from them through 3D printing and handcrafting, as part of a series of new projects. I’m grateful for our time working together, and look forward to collaborating again in the coming year!
The relational reconstruction toolkit will be made available on the “Seeing Lost Enclaves” experiment page in the fall of 2023. See Jeff’s previous blog post on recreating a model of Portland’s historic Chinese Vegetable Gardens neighborhood with Dri Tattersfield.