n and hence, force computation. Volumetric data is used extensively in medical imaging and scientific visualization. Currently the GHOST SDK, the development toolkit for the PHANToM, construes the haptic environment as scenes composed of geometric primitives. Huang, Qu, and Kaufman of SUNY-Stony Brook have developed a new interface that supports haptic interaction with volume-rendered volumetric objects; their APSIL library (Huang, Qu, and Kaufman, 1998) is an extension of GHOST. To date the Stony Brook group has developed successful demonstrations of volume rendering with haptic interaction from CT data of a lobster, a human brain, and a human head, simulating stiffness, friction, and texture solely from the voxel densities of the volume. The new interface may facilitate working directly with the volumetric representations of the teapots obtained through the view synthesis methods.

The surface texture of an object can be displacement mapped with thousands of tiny polygons (Srinivasan and Basdogan, 1997), although the computational demand is such that force discontinuities can occur. More commonly, a "texture field" is constructed from 2-D image data. For example, Ikei, Wakamatsu, and Fukuda (1997) created textures from images converted to greyscale and then enhanced to heighten brightness and contrast, so that the level and distribution of intensity correspond to variation in the height of texture protrusions and retractions. They then employed an array of vibrating pins to communicate tactile sensations to the user's fingertip, with the amplitude of each pin's vibration driven by the intensity level of the underlying portion of the image. Surface texture may also be rendered haptically through techniques such as force perturbation, in which the direction and magnitude of the force vector are altered using the local gradient of the texture field to simulate effects such a...
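The idea of deriving haptic response solely from voxel density can be sketched as follows. This is a minimal illustration, not the APSIL implementation: the function names (`trilinear`, `haptic_force`) and the penalty-style force law (force proportional to local density, directed down the density gradient) are assumptions chosen for clarity.

```python
import numpy as np

def trilinear(volume, p):
    """Trilinearly interpolate voxel density at a continuous point p = (x, y, z)."""
    i, j, k = (int(np.floor(c)) for c in p)
    fx, fy, fz = p[0] - i, p[1] - j, p[2] - k
    c = 0.0
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((fx if di else 1 - fx) *
                     (fy if dj else 1 - fy) *
                     (fz if dk else 1 - fz))
                c += w * volume[i + di, j + dj, k + dk]
    return c

def haptic_force(volume, p, k_stiff=1.0, eps=0.5):
    """Penalty-style reaction force: push the probe down the density
    gradient, scaled by local density (denser voxels feel stiffer)."""
    d = trilinear(volume, p)
    # Central-difference estimate of the density gradient
    grad = np.array([
        (trilinear(volume, p + eps * e) - trilinear(volume, p - eps * e)) / (2 * eps)
        for e in np.eye(3)])
    n = np.linalg.norm(grad)
    if n < 1e-9:
        return np.zeros(3)          # no surface cue in uniform regions
    return -k_stiff * d * grad / n
```

In a real haptic loop this evaluation would run at roughly 1 kHz at the probe tip position; friction and texture terms would be added to the normal term shown here.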
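The image-based texture-field approach described above can be sketched in a few lines. The names (`texture_field`, `pin_amplitudes`) and the linear contrast-stretch mapping are illustrative assumptions, not the published method's exact processing chain.

```python
import numpy as np

def texture_field(image, height_scale=1.0):
    """Build a height field from a 2-D greyscale image: normalize to
    [0, 1], then recenter so bright pixels become protrusions and
    dark pixels retractions (mid-grey maps to a flat surface)."""
    img = np.asarray(image, dtype=float)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)
    return height_scale * (norm - 0.5)

def pin_amplitudes(height, rows, cols):
    """Map the height-field patch under the fingertip to vibration
    amplitudes in [0, 1], one pin per sampled texel."""
    patch = height[rows, cols]
    span = patch.max() - patch.min()
    return (patch - patch.min()) / span if span > 0 else np.zeros_like(patch)
```

Each pin of the array is then driven at the amplitude of the texel beneath it, so sliding the fingertip across the image produces a spatially varying vibrotactile pattern.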
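Force perturbation itself can be illustrated with a short sketch: the surface-normal force is tilted by the local gradient of the texture height field, so the probe feels bumps without any extra geometry. The function name and the gradient-to-tilt scaling are assumptions for illustration.

```python
import numpy as np

def perturbed_force(normal, magnitude, height, x, y, bump_scale=1.0):
    """Tilt the surface-normal force by the local texture gradient
    (force perturbation) at integer texel coordinates (x, y)."""
    # Central-difference gradient of the height field
    gx = (height[x + 1, y] - height[x - 1, y]) / 2.0
    gy = (height[x, y + 1] - height[x, y - 1]) / 2.0
    # Perturb the normal direction against the gradient, renormalize,
    # and keep the commanded force magnitude
    n = np.asarray(normal, dtype=float) + bump_scale * np.array([-gx, -gy, 0.0])
    n /= np.linalg.norm(n)
    return magnitude * n
```

On a flat height field the force reduces to the unperturbed normal force; near a bump the vector tilts away from the rising slope, which is what creates the sensation of sliding over a protrusion.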