The player stands in the middle of a room waving his hands. Each arm has a thruster on it, and the body moves in the resultant direction (diagonally up and to our left here).
Win: After a downward motion, the astronaut lands on the platform to win the game.
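The thruster mechanic above amounts to summing a force vector per arm; a minimal sketch follows, where the function name and the angle convention (degrees, counter-clockwise from the positive x-axis) are illustrative assumptions, not details of the original game:

```python
import math

def resultant_direction(arm_angles_deg):
    """Sum a unit thrust vector per arm; return the normalized resultant
    direction the body moves in (or (0, 0) if the thrusters cancel)."""
    x = sum(math.cos(math.radians(a)) for a in arm_angles_deg)
    y = sum(math.sin(math.radians(a)) for a in arm_angles_deg)
    mag = math.hypot(x, y)
    if mag == 0:
        return (0.0, 0.0)  # opposing arms cancel out
    return (x / mag, y / mag)

# One arm thrusting up (90 deg) and one thrusting left (180 deg):
# the body moves diagonally up and to the left, as in the figure.
dx, dy = resultant_direction([90, 180])
```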
Unlike other gesture systems such as the dataglove or the bodysuit, these applications span the space from coarse to fine motions and from static to dynamic gestures. The applications are also of great interest in their own right, since gestures are likely to become a key component of spatial interaction with the devices of tomorrow.
1. ges.ture \'jes(h)-ch*r\ n [ML gestura mode of action, fr. L gestus, pp.] 2: the use of motions of the limbs or body as a means of expression 3: a movement usu. of the body or limbs that expresses or emphasizes an idea, sentiment, or attitude.

In this work, we not only interpret gestures, but react to them by passing the gesture on to a task. As recent results in Artificial Intelligence show, this is actually easier than trying to simply "understand" the gesture; the task domain helps constrain the possible interpretations of the gesture.
With gestures, we can use our fingers directly, with as many control points as we wish. The prototype GesturesCAD program works with an overhead camera looking at the user's hands as they move on a desktop.
Drawing a spline with a tabletop gesture. Each finger controls a position (anchor point) and an orientation (tangent). The eight control points can be used to flex the closed contour interactively into many shapes.

We have built an AutoCAD-like basic 2D CAD tool in which a menu gives the user options for drawing lines, circles, rectangles, polygons, splines, etc. These can be combined to form geometric figures. For example, the heart shape of the figure above has been combined with two lines and a circle to form the figure below.
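A closed contour driven by per-finger positions and tangents can be sketched with cubic Hermite segments, one between each pair of adjacent fingers. This is a minimal illustration of the technique, not the actual GesturesCAD code; the function names and sampling density are assumptions:

```python
def hermite_segment(p0, t0, p1, t1, u):
    """Cubic Hermite interpolation between two 2D control points,
    each carrying a position (p) and a tangent (t), at parameter u in [0, 1)."""
    h00 = 2*u**3 - 3*u**2 + 1
    h10 = u**3 - 2*u**2 + u
    h01 = -2*u**3 + 3*u**2
    h11 = u**3 - u**2
    return tuple(h00*p0[i] + h10*t0[i] + h01*p1[i] + h11*t1[i] for i in range(2))

def closed_spline(points, tangents, samples_per_seg=16):
    """Trace a closed contour through the control points (one per finger).
    Moving a finger changes its anchor point or tangent and reflexes the curve."""
    n = len(points)
    curve = []
    for k in range(n):
        p0, t0 = points[k], tangents[k]
        p1, t1 = points[(k + 1) % n], tangents[(k + 1) % n]
        for s in range(samples_per_seg):
            curve.append(hermite_segment(p0, t0, p1, t1, s / samples_per_seg))
    return curve

# Four fingers on a unit square, tangents swirling counter-clockwise:
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
tans = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
curve = closed_spline(pts, tans, samples_per_seg=4)
```

Because each segment starts exactly at its first control point, the sampled contour passes through every fingertip position; with eight fingers this gives the eight interactive control points described above.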
A composite 2D drawing.
Simulating a gantry crane. The scrap ladle in the middle screen is being moved anti-clockwise under gestural commands. The left screen shows the input image with a calibration box. This is the set of gestures actually used in the steel shop at TISCO.

With its major expansion (currently in the Cold-Rolling Mill), Tata Steel is one of several companies in India looking to use the detailed interactive simulation that Virtual Reality can provide to test out alternatives before investing money in very costly ones.
One of the applications IIT Kanpur is currently exploring in this area is a vastly improved plant-control interface, in which an operator can monitor the plant by virtually going to control points inside the actual plant instead of pressing F-keys on a console. This is possible due to the level 1 automation already in place in the plant, which provides on-line sensors at several points. These sensors can all be integrated, and the user can be shown sensory data from anywhere in the plant.
For example, the operator may pick up a virtual pressure gauge and then "fly through" a fully realistic plant model to, say, the oxygen mains. Touching the gauge anywhere on the body of the main would then make it indicate the pressure of the oxygen inside. This is possible by integrating the existing sensors, which are actually available only at a small number of points; their data can be interpolated using a model of the oxygen flow which computes pressure distributions at any point on the pipe wall. The vast improvement this provides is that it allows people with some experience of the plant, but no training in the control interface, to have immediate access to plant data. More importantly, it significantly reduces the possibility of human error (pressing the wrong F-key) through a much more direct interface. This is only one of the many possible industrial applications of Virtual Reality.
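The interpolation step can be sketched as follows. A real system would use a physics-based flow model, so this piecewise-linear stand-in, along with the function name and the sensor layout, is purely illustrative:

```python
import bisect

def pressure_at(sensor_pos, sensor_val, x):
    """Estimate pressure at position x (metres along the pipe) from sparse
    on-line sensor readings, by linear interpolation between neighbours.
    sensor_pos must be sorted ascending; positions outside the sensed
    range are clamped to the nearest sensor."""
    if x <= sensor_pos[0]:
        return sensor_val[0]
    if x >= sensor_pos[-1]:
        return sensor_val[-1]
    i = bisect.bisect_right(sensor_pos, x)   # first sensor past x
    x0, x1 = sensor_pos[i - 1], sensor_pos[i]
    v0, v1 = sensor_val[i - 1], sensor_val[i]
    return v0 + (v1 - v0) * (x - x0) / (x1 - x0)

# Sensors at 0 m and 100 m along the main read 5.0 and 4.0 bar;
# touching the virtual gauge at 50 m shows the interpolated value.
p = pressure_at([0.0, 100.0], [5.0, 4.0], 50.0)
```

In the virtual-gauge interface, the touch point on the pipe model supplies `x`, and the returned value is what the gauge displays.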