This research demonstrates a uniform functional-composition framework for modeling and synthesizing complex textures. The appearance of a wide range of natural phenomena can be expressed and efficiently synthesized in this framework, and animation of texture is readily incorporated. Emphasis will be on explaining the properties that lead to generality, expressivity, and efficiency. A system is described in which an image is approximated by a finite collection of samples representing neighborhoods in the image. The user designs visual simulations of surface textures by constructing an algorithm that is computed independently at each image sample. Primitive functions are provided that give the texture algorithm control over visually important texture properties, such as frequency and first-order spatial statistics; the user proceeds by building from these functions. Feedback is provided by images indicating the state of any computed quantity over all samples. The system includes primitive functions for manipulating such visually discriminable qualities as brightness, contrast, coherent discontinuities, orientation, and features possessing restricted ranges of frequency. These are used to build composite functions that manipulate more sophisticated visual qualities. The system is applied to build the appearance of many textures, such as water, star fields, flame, smoke, marble, clouds, stucco, rock, and soap films. Major results are twofold. First, it will be shown that a wide range of naturalistic visual textures can be constructed with this approach. Second, a number of particular functions will be demonstrated that encode the visual elements common to disparate visual textures.
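The system's own primitives are not listed in this abstract, so the following is only an illustrative sketch of the functional-composition idea it describes: a band-limited noise primitive (here a simple hash-based value noise, an assumed stand-in), a frequency-controlling combinator (`turbulence`), and a composite function (`marble`) evaluated independently at each image sample. All function names and constants are hypothetical choices, not the system's actual API.

```python
import math

def hash01(ix, iy):
    # Deterministic pseudo-random value in [0, 1) from integer lattice
    # coordinates; stands in for the system's stochastic primitive.
    n = (ix * 374761393 + iy * 668265263) & 0xFFFFFFFF
    n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 2**32

def smoothstep(t):
    # Cubic easing for visually smooth interpolation between lattice values.
    return t * t * (3.0 - 2.0 * t)

def value_noise(x, y):
    # Bilinear interpolation of lattice hashes: a band-limited primitive
    # whose features sit near one spatial unit in size.
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    sx, sy = smoothstep(fx), smoothstep(fy)
    a = hash01(ix, iy)
    b = hash01(ix + 1, iy)
    c = hash01(ix, iy + 1)
    d = hash01(ix + 1, iy + 1)
    top = a + (b - a) * sx
    bot = c + (d - c) * sx
    return top + (bot - top) * sy

def turbulence(x, y, octaves=4):
    # Sum noise at doubling frequencies and halving amplitudes: one way
    # the frequency content of a texture can be controlled.
    total, freq, amp = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        freq *= 2.0
        amp *= 0.5
    return total / (2.0 - 2.0 ** (1 - octaves))  # normalize to roughly [0, 1]

def marble(x, y):
    # Composite function: a sinusoidal stripe pattern perturbed by
    # turbulence, computed independently at each sample (x, y).
    return 0.5 + 0.5 * math.sin((x + 4.0 * turbulence(x, y)) * math.pi)
```

Evaluating `marble(x, y)` over a grid of samples yields a grayscale marble-like image; swapping the composite function while reusing the same primitives is the kind of functional composition the framework emphasizes.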