More selected projects

Night Forest


produced by: Dongyuan Liu

Concept and Background research

Night Forest is an interactive immersive environment; its goal is to build a forest scene with a fluorescent effect. I was inspired by TeamLab's projects, which create dreamlike environments. The project is also part of my research into human relations. In discussions of interactive immersive environments, researchers usually focus on the relations between humans and the visual, the auditory, and the space. I, however, want to explore the role of technology and how it can adjust the relationships between humans themselves. As David Rokeby explained, "Rather than creating finished works, the interactive artist creates relationships". A shared space contains many relationships, and technology, acting as a mediator, builds connections among people.

In this project I also want to address the relation between humans and nature: people constantly influence the natural environment, but they are also influenced by it, and they are themselves part of it. In daily life it is hard for people to realise that they are affecting the natural environment. What I want to highlight is this unconscious, passive state. The same passive state exists among the audience: the connections are built by digital tools rather than by deliberate action. In contrast with TeamLab's Future Park, this state is more prominent here.

Technical

This project is mainly composed of two parts. One is the trees, which are influenced by the people present. The other is the particle systems, which are drawn along the human boundary. In addition, I use the Kinect to adjust the range of the detection area, and ofxCv to count the audience and detect the human boundaries.
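To make that pipeline concrete, here is a minimal sketch of the detection setup: the Kinect's depth image is clipped to the detection area and ofxCv's ContourFinder treats each blob as one person. The member names, clipping distances, and area thresholds below are illustrative placeholders, not the exact values used in the installation.

// Minimal detection sketch: ofxKinect supplies the depth image,
// ofxCv's ContourFinder finds human boundaries and counts people.
#include "ofMain.h"
#include "ofxKinect.h"
#include "ofxCv.h"

class ofApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofxCv::ContourFinder contourFinder;
    int audienceCount = 0;

    void setup(){
        kinect.setRegistration(true);        // align depth and colour images
        kinect.init();
        kinect.open();
        kinect.setDepthClipping(500, 2500);  // restrict the detection area (mm)
        contourFinder.setMinAreaRadius(30);  // ignore small blobs (noise)
        contourFinder.setMaxAreaRadius(300);
        contourFinder.setThreshold(20);
    }

    void update(){
        kinect.update();
        if(kinect.isFrameNew()){
            // each contour in the clipped depth image is treated as one person
            contourFinder.findContours(kinect.getDepthPixels());
            audienceCount = contourFinder.size();
        }
    }
};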

Tree class

I took the template from an OpenProcessing sketch and translated it into openFrameworks. The class is mainly composed of branches and roots, but I did not use the root part in this project; that is to say, the structure is branch → tree → forest. By modifying the source code, the number of trees and the length and number of forks change as the total number of audience members in the environment increases. As I mentioned before, I do not want the audience to interact with the project actively, so I avoided obvious interaction cues such as movement, speed, or gestures. I want to highlight people's passiveness and the role of technology.
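As a rough illustration of that branch → tree → forest structure (not the original OpenProcessing code), a branch recursively forks into child branches, a tree owns a trunk branch, and the forest is simply a vector of trees. The angles, fork depth, and length scaling below are placeholders.

// A condensed sketch of the branch → tree → forest structure.
#include "ofMain.h"

class Branch {
public:
    ofVec2f start, end;
    vector<Branch> children;

    void grow(int depth, float length){
        if(depth <= 0) return;
        // each branch forks into two children, rotated left and right
        for(float angle : {-25.0f, 25.0f}){
            Branch child;
            ofVec2f dir = (end - start).getNormalized().getRotated(angle);
            child.start = end;
            child.end   = end + dir * length;
            child.grow(depth - 1, length * 0.7f);
            children.push_back(child);
        }
    }

    void draw(){
        ofDrawLine(start.x, start.y, end.x, end.y);
        for(auto& c : children) c.draw();
    }
};

class Tree {
public:
    Branch trunk;

    void setup(ofVec2f base, int forks, float length){
        trunk.start = base;
        trunk.end   = base - ofVec2f(0, length);   // grow upwards from the base
        trunk.grow(forks, length * 0.7f);
    }

    void draw(){ trunk.draw(); }
};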

When the total number of people increases, a new Tree object is generated, pushed back into the vector, and begins to grow. I also map the total number of people to a capped number of trees, which ensures the computer does not crash under a huge number of branches. For the colour system, I use the ADD blend mode, which produces the fluorescent effect.
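A minimal sketch of that update and drawing logic, assuming the Tree class above and the audienceCount from the detection sketch; the mapping ranges and the fork rule are placeholders rather than the project's actual values.

// Map the audience count to a capped number of trees and draw the forest
// with additive blending for the fluorescent glow.
vector<Tree> forest;   // normally an ofApp member

void updateForest(int audienceCount){
    // cap the tree count so the number of branches stays manageable
    int targetTrees = (int) ofMap(audienceCount, 0, 20, 0, 12, true);
    while((int)forest.size() < targetTrees){
        Tree t;
        t.setup(ofVec2f(ofRandomWidth(), ofGetHeight()),
                3 + audienceCount % 4,        // more people -> more forks (placeholder rule)
                ofRandom(80, 140));
        forest.push_back(t);                  // the new tree starts growing
    }
}

void drawForest(){
    ofEnableBlendMode(OF_BLENDMODE_ADD);      // additive blending gives the fluorescent effect
    for(auto& t : forest) t.draw();
    ofDisableBlendMode();
}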

Particle system

The particle system is generated from the contourFinder of ofxCv, whose results are ofPolylines; this gives me access to the points along each polyline, which I add to the particle systems. The emergence of the particle systems represents an extension of the human boundary. I also want to enhance the audience's sense of immersion and presence, the feeling that they and the technology co-exist. As for the dynamics, the particles scatter around the centre of the screen, which expresses that the audience is being influenced. Finally, the colour of each particle is extracted from the boundary path on the Kinect's colour image, to echo the colours of the real world.

When the Kinect finds a contour path, a for loop runs over all the points that make up the polyline, and I skip ahead (g += 50) so that the total number of particles does not become too large.
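Putting the two paragraphs above together, a sketch of the contour-to-particle step might look like the following. The Particle struct and the particles vector are simplified stand-ins for the project's own particle system; the contour finder and Kinect are the ones set up earlier.

// Turn contour points into particles, skipping ahead by 50 points so the
// particle count stays low, and sample the Kinect colour image at each point.
struct Particle {
    ofVec2f pos;
    ofColor color;
};

vector<Particle> particles;   // normally an ofApp member

void spawnBoundaryParticles(ofxCv::ContourFinder& contourFinder, ofxKinect& kinect){
    for(size_t i = 0; i < contourFinder.size(); i++){
        ofPolyline& boundary = contourFinder.getPolyline(i);
        // step through the polyline points in strides of 50 (the g += 50 skip)
        for(size_t g = 0; g < boundary.size(); g += 50){
            auto pt = boundary[g];
            Particle p;
            p.pos   = ofVec2f(pt.x, pt.y);
            // sample the colour image at the boundary point so the particle
            // echoes the real-world colour (depth and colour are registered)
            p.color = kinect.getPixels().getColor((size_t)pt.x, (size_t)pt.y);
            particles.push_back(p);
        }
    }
}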

Self-evaluation and future work

During the exhibition, many people passed through my work inadvertently. When they noticed they had entered it, they started making different gestures to try to change the content of the image: they switched from a passive state to an active state. Additionally, one viewer asked me how the audience could discover the human element in the work. I suddenly realised that audience participation and the passive state I hoped for seemed to be contradictory. I cannot control the behaviour of the audience; for now, my answer is that after they stay in the space for a while, they will find it. This question will be considered in future work.

Addons:

ofxKinect

ofxCv

ofxOpenCv

 

References:

1. Smoke Particle System by Daniel Shiffman. https://processing.org/examples/smokeparticlesystem.html

2. The Nature of Code by Daniel Shiffman, Chapter 4: Particle Systems. https://natureofcode.com/book/chapter-4-particle-systems

3. Star Tree by Karutt. https://www.openprocessing.org/sketch/408742

4. Gaussian function code in openFrameworks. https://github.com/andyr0id/ofxGaussian/blob/master/src/ofxGaussian.cpp

5. ContourFinder.cpp from ofxCv

6. webCam Piano example from week 12

7. kinectGrid example from week 16