The goal of this project was to track the motion of a human face and to display a projection of a cube on a VGA monitor that changes according to the motion of the user's face. We wanted it to seem as if the user were actually looking at a 3D cube: if the user moved their head to the right, for example, the projection should be drawn as if the cube were being viewed from the right, and if the user moved closer to the camera, the cube should grow larger. The cube was chosen for simplicity and as a proof of concept, but this project could potentially be extended to more complicated objects or virtual reality environments.
In this project, we connected a video camera to the FPGA and located the user's face in each frame by examining the color content of each pixel and determining which ones could represent human skin. The face's offset from the center of the camera's field of view, together with its size, was then used to draw the appropriate projection.
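The per-frame tracking step described above can be sketched in software. This is an illustrative Python sketch, not the project's actual FPGA logic; the skin-color rule (a common RGB heuristic) and the synthetic frame data are assumptions, since the report does not state its exact thresholds.

```python
def is_skin(r, g, b):
    """A classic RGB skin-color heuristic, standing in for the
    per-pixel color test described in the text (assumed thresholds)."""
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15)

def locate_face(frame, width, height):
    """Return (dx, dy, size): the skin region's offset from the frame
    center and its pixel count. These drive the cube projection: the
    offset shifts the viewpoint, the size scales the cube."""
    xs, ys = [], []
    for y in range(height):
        for x in range(width):
            r, g, b = frame[y][x]
            if is_skin(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return 0.0, 0.0, 0       # no face found: draw the default view
    cx = sum(xs) / len(xs)       # centroid of the skin-colored pixels
    cy = sum(ys) / len(ys)
    return cx - width / 2, cy - height / 2, len(xs)

# Tiny synthetic frame: a 2x2 skin-colored patch in the top-left corner.
W, H = 8, 8
frame = [[(30, 30, 30)] * W for _ in range(H)]
for y in range(2):
    for x in range(2):
        frame[y][x] = (200, 120, 90)   # skin-like RGB value

dx, dy, size = locate_face(frame, W, H)
print(dx, dy, size)   # patch lies above and left of center, size 4
```

On the FPGA the same test is applied to the pixel stream as it arrives, accumulating the coordinate sums and the pixel count in registers so that the centroid and size are available at the end of each frame without buffering it.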