(A)I Feel
Is it possible for machines to feel emotions?
As far as we know, machines have no emotions. But if we humans teach them, will they be able to learn?
(A)I FEEL is a project dedicated to finding answers to these questions by creating a teaching and learning process between humans and a machine. The whole process is visualized as an interactive installation that invites people to participate in the project.
To teach the machine, the installation asks each user to draw a picture that represents a specific emotion. The drawing is recognized and memorized as one representative of that emotion. The machine then fills the drawing with a color representing the emotion and projects it back onto a transparent screen.
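The teaching step can be pictured as a small data-collection loop. The sketch below is purely illustrative Python: the `memory` store, the `teach` function, and the 64×64 canvas are hypothetical stand-ins, since the project text does not specify how drawings are actually captured or modeled; only the five emotion labels come from the description.

```python
import numpy as np

# Emotion labels taken from the project description.
EMOTIONS = ["joy", "sadness", "anger", "fear", "disgust"]

# The machine's "memory": drawings (flattened grayscale arrays) stored
# under the emotion each user was asked to represent. Hypothetical structure.
memory = {emotion: [] for emotion in EMOTIONS}

def teach(drawing: np.ndarray, emotion: str) -> None:
    """Memorize a user's drawing as one representative of the given emotion."""
    if emotion not in memory:
        raise ValueError(f"unknown emotion: {emotion}")
    memory[emotion].append(drawing.astype(np.float32).ravel())

# Example: a user is asked to draw "joy" on a 64x64 canvas (values in [0, 1]).
canvas = np.random.rand(64, 64)  # placeholder for a real pen-input drawing
teach(canvas, "joy")
```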
To demonstrate what the machine has learned, the user can draw anything to portray his or her feelings at that moment. The machine analyzes the drawing, interprets the user's expression, and substitutes colors for the emotions it finds (Joy = Yellow, Sadness = Blue, Anger = Red, Fear = Green, Disgust = Purple). The result, a graphical visualization of the user's emotions, is poured into a sea of emotions visualized as a colorful fluid ball pit.
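Continuing the sketch above, the demonstration step might compare a free drawing against the memorized representatives and swap the strongest emotion for its color. The nearest-mean scoring and the `interpret`/`to_color` helpers are assumptions made only for illustration; the emotion-to-color mapping itself is the one stated in the project description.

```python
# Color mapping stated in the project description.
EMOTION_COLORS = {
    "joy": "yellow",
    "sadness": "blue",
    "anger": "red",
    "fear": "green",
    "disgust": "purple",
}

def interpret(drawing: np.ndarray) -> dict[str, float]:
    """Score a free drawing against the memorized representatives.

    Nearest-mean comparison is a stand-in for whatever model the
    installation actually trains on the collected drawings.
    """
    x = drawing.astype(np.float32).ravel()
    scores = {}
    for emotion, examples in memory.items():
        if not examples:
            continue
        mean_drawing = np.mean(examples, axis=0)
        # Smaller distance means a closer match; invert so higher is stronger.
        scores[emotion] = 1.0 / (1.0 + np.linalg.norm(x - mean_drawing))
    return scores

def to_color(scores: dict[str, float]) -> str:
    """Pick the color of the strongest emotion for the projected visualization."""
    strongest = max(scores, key=scores.get)
    return EMOTION_COLORS[strongest]

# Example: interpret a new drawing made to portray the user's current feeling.
feeling = np.random.rand(64, 64)
print(to_color(interpret(feeling)))
```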
It is possible that, after machines have learned enough about emotions, they will be able to analyze the emotions expressed in artists' paintings. We would then understand more about how artists felt while they were painting and about the real meaning of their works.
In the end, after we have taught machines to the point that they can feel emotions, will they be able to create something that responds to or satisfies human emotions?
Or will they create something else to express their own emotions?