An ordinary projection screen is not sensitive to interaction, so it cannot meet the demands of teaching, virtual reality, and other applications. Because people routinely use their hands for a wide variety of human–computer interactions, finger-based interactive projection technology is worth researching. In this paper, an ordinary monocular camera acquires video frames of the projection screen, and the touch signal of a finger in a frame serves as the input to the interactive projection system. Because the spatial frequency of a common digital camera differs only slightly from that of the projection screen, the frames captured by the camera contain moiré fringes, which must be filtered out in the image frequency domain. The difference between the edges of the current frame and those of the previous frame is then computed to obtain moving-object edge clues. Guided by these clues, the most likely contour curve is searched for among the current frame's edges, and the curve is fitted by polynomial approximation. Its curvature integral is matched against the curvature integral of a finger template curve, after which the fingers on the curve are recognized. Because no color information is needed, this method can also recognize gloved fingers. Finally, the finger's shadow is used to judge whether the finger touches the projection screen, completing the interaction. Experiments with writing and collaboratively rotating a picture on the projection screen show that this method can effectively perform interactive operations with the projection screen and supports multi-user operation.
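The frequency-domain filtering and frame-difference steps described above can be sketched as follows. This is a rough illustration, not the authors' implementation: it assumes grayscale frames stored as NumPy arrays, substitutes a simple circular low-pass mask for whatever notch filter the paper actually uses, and the `cutoff` parameter is an invented illustrative value.

```python
import numpy as np

def suppress_moire(frame, cutoff=0.25):
    """Attenuate high-frequency moire fringes with a low-pass filter in the
    frequency domain. `cutoff` is the kept fraction of the half-band and is
    an illustrative parameter, not a value from the paper."""
    F = np.fft.fftshift(np.fft.fft2(frame))
    h, w = frame.shape
    yy, xx = np.ogrid[:h, :w]
    # Circular low-pass mask centred on the DC component
    dist = np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
    mask = dist <= cutoff * min(h, w) / 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def moving_edge_cue(prev_edges, curr_edges):
    """Edge pixels present in the current frame but absent from the previous
    one -- a simple stand-in for the paper's moving-object edge clues."""
    return curr_edges & ~prev_edges
```

In practice the edge maps would come from an edge detector (e.g. Canny) run on the moiré-filtered frames; the boolean difference above then isolates edges that moved between frames.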
In this paper, a steerable, interactive projection display in the shape of a disk is presented. Interactivity is provided through sensitivity to the contact of multiple fingertips and is achieved through the use of an RGB-D camera. The surface is mounted on two gimbals which, in turn, provide two rotational degrees of freedom. Modulating the surface's posture supports the ergonomics of the device but can alternatively be used as a means of user-interface input. The geometry for mapping visual content and localizing fingertip contacts upon this steerable display is provided, along with pertinent calibration methods for the proposed system. An accurate technique for touch detection is proposed, and touch-detection and projection accuracy are studied and evaluated through extensive experimentation. Most importantly, the system's usability is thoroughly evaluated through a pilot application developed for this purpose. We show that the outcome meets the real-time performance, accuracy, and usability requirements for employing the approach in human–computer interaction.
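The core of RGB-D touch detection on a calibrated surface can be sketched as below. This is a minimal illustration under assumed conventions, not the paper's method: depths are taken to be in millimetres, the touch threshold is an invented placeholder rather than the system's calibrated value, and the rigid transform `(R, t)` stands in for whatever extrinsics the described calibration recovers.

```python
import numpy as np

def is_touching(fingertip_depth_mm, surface_depth_mm, threshold_mm=10.0):
    """Declare a touch when the fingertip's measured depth is within a small
    threshold of the surface's expected depth at that pixel. The 10 mm
    threshold is illustrative, not the system's calibrated value."""
    return abs(fingertip_depth_mm - surface_depth_mm) <= threshold_mm

def camera_to_surface(point, R, t):
    """Map a 3D point from camera coordinates into the display-surface frame,
    given a rotation R and translation t from an assumed calibration step."""
    return R.T @ (np.asarray(point, float) - np.asarray(t, float))
```

Because the disk is steerable, the surface's expected depth map (and hence `surface_depth_mm`) would have to be recomputed whenever the gimbal angles change, using the current surface pose.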