Goal-Oriented Texture Synthesis


[Header images: evolved texture 20091217a, evolved texture gots20101204c, and camouflaged prey on "flowers" background]


This page presents an overview of several experiments related to goal-oriented texture synthesis. In this technique we generate textures (images) based on a description of a desired goal or property of the texture. One well-known technique that falls in this category is example-based texture synthesis, in which arbitrarily large textures are created from a small “exemplar” texture. (See, for example: State of the Art in Example-based Texture Synthesis.) In that technique, the goal is to generate new textures that look like plausible extensions of the exemplar texture. In the work described on this page, we look at other kinds of goals. These include desired visual or artistic properties of the synthesized images, and desired functional properties such as camouflage.

The basic idea of goal-oriented texture synthesis is that we specify what we want, not how to achieve it. It is then up to some kind of automatic solver or optimizer to find synthetic textures that meet the stated goal. In these experiments, optimization is provided by evolutionary computation, specifically genetic programming. We express the goal as a fitness function (a program that rates the quality of a texture), then evolve a population of textures to find those that best meet the goal. Textures are represented procedurally, as nested expressions of functions drawn from a library of generators and operators. For more details see: Texture Synthesis Diary.
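
To make the representation concrete, here is a minimal sketch of the idea of textures as nested expressions. It is not the texture synthesis library used in these experiments: the generators (stripes, gradient) and operators (multiply, warp) are invented for illustration, and each texture is simply a function from unit-square coordinates to an RGB triple.

```python
# Minimal sketch of a procedural texture as a nested expression of
# generators and operators (illustrative only, not the actual library).
import math

def stripes(freq):                       # generator: vertical sine stripes
    return lambda x, y: (0.5 + 0.5 * math.sin(freq * x),) * 3

def gradient():                          # generator: left-to-right gray ramp
    return lambda x, y: (x, x, x)

def multiply(a, b):                      # operator: pointwise product of two textures
    return lambda x, y: tuple(pa * pb for pa, pb in zip(a(x, y), b(x, y)))

def warp(tex, amount):                   # operator: sinusoidal domain warp
    return lambda x, y: tex(x + amount * math.sin(6.28 * y), y)

# A "genome" is just a nested expression built from these primitives:
texture = warp(multiply(stripes(40.0), gradient()), 0.05)

def render(tex, size=64):
    """Sample the texture program on a size x size grid of pixels."""
    return [[tex(i / size, j / size) for i in range(size)] for j in range(size)]

pixels = render(texture)                 # nested list of RGB tuples in [0, 1]
```

Genetic programming then searches over such nested expressions, using the fitness function to decide which expressions survive and breed.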

I gave an invited talk on goal-oriented texture synthesis in the closing plenary session of evo* (evoStar) on April 29, 2011, in Torino, Italy: slides (37MB PDF)


Evolving Textures from High Level Descriptions

In these experiments, the texture synthesis goal is described in terms of properties of the desired image. These properties include those that can be measured from individual color pixels, such as brightness, saturation, and hue. We might look for maximum, minimum, average, or median values of these properties. We might construct histograms of their distributions and look for certain properties of those. We may also analyze regions of the image larger than single pixels, to measure spatial frequencies and orientations. We could look at how the distribution of (for example) hue varies from one neighborhood to another.
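
As a rough sketch of the kinds of measurements involved, the code below computes per-pixel brightness and saturation, a discretized hue histogram, and a simple measure of how brightness varies between neighborhoods. It assumes the texture is a NumPy array of shape (H, W, 3) with RGB values in [0, 1]; these are generic image statistics, not the exact metrics used in the experiments.

```python
# Generic per-pixel and neighborhood statistics for an RGB texture in [0, 1].
import numpy as np

def brightness(img):
    return img.mean(axis=2)                          # per-pixel intensity

def saturation(img):
    mx, mn = img.max(axis=2), img.min(axis=2)
    return np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)

def hue_histogram(img, buckets=12):
    # crude hue estimate from RGB, discretized into a fixed number of buckets
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    hue = (np.arctan2(np.sqrt(3.0) * (g - b), 2 * r - g - b) / (2 * np.pi)) % 1.0
    counts, _ = np.histogram(hue, bins=buckets, range=(0.0, 1.0))
    return counts / counts.sum()

def neighborhood_variation(img, tile=16):
    # compare brightness statistics from one neighborhood (tile) to another
    bright = brightness(img)
    h, w = bright.shape
    tiles = [bright[i:i + tile, j:j + tile].mean()
             for i in range(0, h, tile) for j in range(0, w, tile)]
    return float(np.std(tiles))
```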

A procedural description of the desired type of texture can serve as the fitness function for evolutionary texture synthesis, generating a “random” texture that meets the description. This process can be repeated many times to build a catalog of images from which an artist or designer can select suitable, interesting textures.


Evolving Textures from High Level Descriptions: Harmonious Colors

This example evolves textures based on the idea of a “harmonious color triplet” from color theory. Fitness is based only on the colors in the texture. The geometric aspects of the texture (shapes, patterns, frequencies) have no effect on fitness, except to the extent that they influence the distribution of hues and saturations in the texture. For the purpose of identifying harmonious color schemes, hue is discretized into a histogram with 12 buckets. This image category is described by four image metrics (a rough fitness sketch follows the diagram below):
  1. colorful: most pixels must have saturation above a given threshold
  2. pixel colors should fall primarily into three hue buckets, and the counts in those buckets should be roughly in a ratio of 3:1:1
  3. the other nine histogram buckets should be nearly empty
  4. the three dominant hues should form a “harmonious color triplet”
Diagram of a harmonious color triplet for a 12-step discretization of hue:

[Diagram: harmonious color triplet]
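
As a rough sketch (not the fitness function actually used), the four metrics might be scored as below. The split-complementary reading of a “harmonious color triplet” (two secondary hues on or next to the bucket opposite the dominant hue) and all of the thresholds are placeholder assumptions.

```python
# Rough sketch of the four harmonious-triplet metrics for an (H, W, 3)
# RGB array in [0, 1]; thresholds and the harmony rule are assumptions.
import numpy as np

def harmonious_triplet_fitness(img, buckets=12, sat_threshold=0.3):
    mx, mn = img.max(axis=2), img.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    colorful = float((sat > sat_threshold).mean())        # metric 1: mostly saturated

    rr, gg, bb = img[..., 0], img[..., 1], img[..., 2]
    hue = (np.arctan2(np.sqrt(3.0) * (gg - bb), 2 * rr - gg - bb) / (2 * np.pi)) % 1.0
    counts, _ = np.histogram(hue, bins=buckets, range=(0.0, 1.0))
    frac = counts / max(counts.sum(), 1)

    top3 = np.argsort(frac)[-3:][::-1]                    # three dominant hue buckets
    dominance = float(frac[top3].sum())                   # metric 3: other buckets near empty
    a, b, c = frac[top3]
    ratio = max(1.0 - abs(b / max(a, 1e-6) - 1 / 3)       # metric 2: counts roughly 3:1:1
                    - abs(c / max(a, 1e-6) - 1 / 3), 0.0)

    # metric 4 (assumed rule): the two secondary hues should sit on or next to
    # the bucket opposite the dominant hue on the 12-step hue wheel
    comp = (top3[0] + buckets // 2) % buckets
    gaps = [min((h - comp) % buckets, (comp - h) % buckets) for h in top3[1:]]
    harmony = 1.0 if max(gaps) <= 1 else 0.0

    return colorful * dominance * ratio * harmony
```
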
Below: samples of textures evolved to meet these criteria:


[Image gallery: evolved textures gots20110414i, gots20110414k, gots20110415c, gots20110416c, gots20110416d, gots20110417b, gots20110417g, gots20110418d, gots20110422a]



Evolving Textures from High Level Descriptions: High frequency top, low frequency bottom (“hftlfb”)

These textures are evolved to have high frequencies at the top of the image and low (but not zero) frequencies at the bottom. Frequency measurements are made solely on the brightness (intensity) of the pixels, so hue and saturation are ignored. The examples below that contain color do so only incidentally, since color has no impact on fitness. This image category is described by three image metrics (a rough sketch follows the list):
  1. high frequency in top 1/4 of image
  2. low frequency in bottom 1/4 of image
  3. prefer some variation at bottom
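
Here is a rough sketch of those three metrics, not the actual “hftlfb” fitness function: frequency content is approximated by the mean absolute difference between horizontally adjacent brightness values, and the 0.2 and 0.02 thresholds are arbitrary placeholders.

```python
# Sketch of the "hftlfb" metrics: busy top quarter, calm but not flat bottom quarter.
import numpy as np

def hftlfb_fitness(img):
    bright = img.mean(axis=2)                 # hue and saturation are ignored
    h = bright.shape[0]
    top, bottom = bright[: h // 4], bright[3 * h // 4 :]

    def roughness(region):
        # mean absolute difference between horizontally adjacent pixels
        return float(np.abs(np.diff(region, axis=1)).mean())

    r_top, r_bot = roughness(top), roughness(bottom)
    high_top = min(r_top / 0.2, 1.0)          # 1: reward high frequency in top 1/4
    low_bottom = 1.0 - min(r_bot / 0.2, 1.0)  # 2: reward low frequency in bottom 1/4
    some_variation = min(r_bot / 0.02, 1.0)   # 3: but prefer some variation at bottom
    return high_top * low_bottom * some_variation
```
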
Below: samples of textures evolved to meet these criteria:


[Image gallery: evolved textures gots20110411c, gots20110411b, gots20110408c, gots20110412h, gots20110409d, gots20110411d, gots20110411f (low fitness), gots20110413a, gots20110409e]



Evolving Textures from High Level Descriptions: Gray with an Accent Color

This experiment involved choosing a single image type (“a gray texture with a small amount of saturated accent color”) then exploring how best to represent the fitness function used to guide evolutionary texture synthesis to automatically produce many examples of it. This fitness function was programmed by hand. The eventual goal of this work is to find a way to allow a visual designer to specify their own goals. This experiment is described in a paper in the proceedings of evoMusArt 2011:


Craig Reynolds. 2011. Evolving Textures from High Level Descriptions: Gray with an Accent Color. In C. Di Chio et al. (Eds.), evoApplications 2011 (Applications of Evolutionary Computation), LNCS, Springer.

Abstract: This paper describes a prototype evolutionary texture synthesis tool meant to assist a designer or artist by automatically discovering many candidate textures that fit a given stylistic description. The textures used here are small color images, created by procedural texture synthesis. This prototype uses a single stylistic description: a textured gray image with a small amount of color accent. A hand-written prototype fitness function rates how well an image meets this description. Genetic programming uses the fitness function to evolve programs written in a texture synthesis language. A tool like this can automatically generate a catalog of variations on the given theme. A designer could then scan through these to pick out those that seem aesthetically interesting. Their procedural “genetic” representation would allow them to be further adjusted by interactive evolution. It also allows re-rendering them at arbitrary resolutions and provides a way to store them in a highly compressed form allowing lossless reconstruction.
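
For concreteness, a much-simplified sketch of this kind of fitness measure is shown below. It is not the fitness function from the paper: the saturation cutoffs, the 5% accent-area target, and the crude “textured” measure are all placeholder assumptions.

```python
# Simplified sketch of "textured gray image with a small amount of color accent".
import numpy as np

def gray_with_accent_fitness(img, accent_target=0.05):
    mx, mn = img.max(axis=2), img.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)

    gray_fraction = float((sat < 0.15).mean())       # bulk of the image is near-gray
    accent_fraction = float((sat > 0.6).mean())      # a little strongly saturated color
    accent_score = 1.0 - min(abs(accent_fraction - accent_target) / accent_target, 1.0)

    bright = img.mean(axis=2)                        # "textured": some local brightness change
    textured = min(float(np.abs(np.diff(bright, axis=1)).mean()) / 0.05, 1.0)
    return gray_fraction * accent_score * textured
```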

Below: samples of textures evolved to meet the criteria “a textured gray image with a small amount of color accent”:


[Image gallery: evolved textures gots20101204c, gots20101124g, gots20101205b, gots20101203h, gots20101123b, gots20110102c, gots20101121k, gots20110125a, gots20101114c, gots20101221a]



Testing evolutionary texture synthesis

My first version of “evolving textures from high level descriptions” was written to debug the evolutionary texture synthesis component of the camouflage project. I wanted to test the interface between my texture synthesis library and the evolutionary computation performed by Open BEAGLE. It didn't matter much what the fitness function was, as long as it drove evolution hard enough to stress-test the whole system. I tried some simple criteria, but evolutionary computation is extremely good at finding simple solutions to simple criteria. The fitness function that resulted after several experiments combined four criteria or metrics:
  1. how close to midrange is the average intensity?
  2. how rare are regions of flat, unchanging color?
  3. are lights and darks well represented in the brightness histogram?
  4. is the average saturation above a certain threshold?
These are very broad criteria; most well-exposed color photographs would score highly on these metrics. Beyond these goals, the fitness function was agnostic about other aspects of the texture. As a result, the fitness landscape was fairly “flat”, allowing a large variety of texture types to evolve (below). For more description and other examples, see the Texture Synthesis Diary.
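
A rough sketch of these four broad metrics is shown below; combining them by multiplication, and the particular thresholds, are assumptions for illustration rather than the original code.

```python
# Sketch of the four broad "stress test" metrics for an RGB texture in [0, 1].
import numpy as np

def stress_test_fitness(img, sat_threshold=0.2):
    bright = img.mean(axis=2)

    # 1. how close to midrange is the average intensity?
    midrange = 1.0 - min(abs(float(bright.mean()) - 0.5) / 0.5, 1.0)

    # 2. how rare are regions of flat, unchanging color?
    flat = float((np.abs(np.diff(bright, axis=1)) < 0.005).mean())
    not_flat = 1.0 - flat

    # 3. are lights and darks both well represented in the brightness histogram?
    counts, _ = np.histogram(bright, bins=10, range=(0.0, 1.0))
    frac = counts / max(counts.sum(), 1)
    full_range = min(float(frac[:2].sum()) / 0.1, 1.0) * min(float(frac[-2:].sum()) / 0.1, 1.0)

    # 4. is the average saturation above a certain threshold?
    mx, mn = img.max(axis=2), img.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    saturated = 1.0 if float(sat.mean()) > sat_threshold else 0.0

    return midrange * not_flat * full_range * saturated
```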


[Image gallery: evolved textures 20091217a, 20091129h, 20091124e, 20091123f, 20091129e, 20091120c]





Interactive Evolution of Camouflage


Using goal-oriented texture synthesis to evolve camouflage patterns, with a human observer playing the role of the predator, as described in this paper:


Craig Reynolds. 2010. Interactive Evolution of Camouflage. In the proceedings of the 12th International Conference on the Synthesis and Simulation of Living Systems (ALife XII), August 2010. URL: http://www.red3d.com/cwr/iec/
 
(A revised version of this paper will appear in the journal Artificial Life 17(2) around April 2011.)

Abstract: This paper presents an abstract computation model of the evolution of camouflage in nature. The 2d model uses evolved textures for prey, a background texture representing the environment and a visual predator. In these experiments, the predator’s role is played by a human observer. They are shown a cohort of ten evolved textures overlaid on the background texture. They click on the five most conspicuous prey to remove (“eat”) them. These lower fitness textures are removed from the population and replaced with newly bred textures. Biological morphogenesis is represented in this model by procedural texture synthesis. Nested expressions of generators and operators form a texture description language. Natural evolution is represented by genetic programming, a variant of the genetic algorithm. GP searches the space of texture description programs for those which appear least conspicuous to the predator.
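
A minimal sketch of the interactive predation loop described in the abstract appears below. The cohort size of ten and the five prey eaten per round come from the paper; the population bookkeeping, the pick_conspicuous callback standing in for the human observer's clicks, and the breed placeholder are assumptions, not the actual implementation (which uses Open BEAGLE's genetic programming).

```python
# Sketch of one predation round in the interactive camouflage model.
import random

def breed(parent_a, parent_b):
    """Placeholder for GP crossover/mutation of two texture programs."""
    return random.choice([parent_a, parent_b])      # stand-in, not real genetic programming

def predation_round(population, pick_conspicuous, cohort_size=10, eaten=5):
    """Show a cohort of prey textures; the human 'predator' picks the most
    conspicuous ones, which are removed and replaced with newly bred textures."""
    cohort = random.sample(population, cohort_size)
    clicked = pick_conspicuous(cohort)              # indices of the prey clicked on
    victims = [cohort[i] for i in clicked[:eaten]]
    survivors = [t for t in cohort if t not in victims]
    for v in victims:
        population.remove(v)                        # "eat" the conspicuous, low-fitness prey
    for _ in range(len(victims)):                   # refill the population with offspring
        population.append(breed(*random.sample(survivors, 2)))
    return population
```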

Below: camouflaged circular “prey” overlaid on the background images for which they were evolved (polished serpentine stone, yellow lichen, lantana flowers and leaves, green hedge, twisty wire against sky, tree bark, orange lentils, and Yosemite granite):


[Images: camouflaged prey on the “serpentine”, “lichen”, “flowers”, “hedge”, “twisty wire”, “bark”, “lentils”, and “white granite” backgrounds]



Related links:

Interactive Evolution of Camouflage—simulating the evolution of camouflage in nature. Uses goal-oriented texture synthesis.

Texture Synthesis Diary—blog about the design and implementation of the evolutionary texture synthesis underlying this work.

Open BEAGLE—a general-purpose engine for evolutionary computation. Its genetic programming facility is used to implement the evolutionary texture synthesis used here.



Last update: May 7, 2011, Craig Reynolds