Monday, September 2, 2013

Texture Classification

In many applications, texture is closely related to the semantics of the image. For example, Fig. 1(a) shows a CT scan of a lung; here, healthy and unhealthy tissues exhibit different textures. In satellite images -- Fig. 1(b) -- texture can be employed to differentiate terrains such as forests, buildings and sea.

Figure 1

The goal of this post is to provide a brief overview of how texture information can be used to classify images using Matlab. Basically, what we want to do is to learn a texture classifier from a set of labeled images depicting textures. Then, we will use the learned classifier to provide a class label for an unlabeled image.

Texture classification can be divided into three phases, which are discussed in the following sections:
  1. Extracting texture features;
  2. Training a classifier;
  3. Classifying an unlabeled texture image.

Downloading the image set

In order to classify texture images, we will use as an example the Textured Surfaces Dataset from Prof. Jean Ponce's research group. Just follow the link and search (ctrl+f) for "Texture Database". Then download the 5 provided zip files (~40MB each).

Then unzip all the images into a single 'imgs' directory. We will also need a text file (image_class.txt) where each line contains the file name of an image and its corresponding class. Organize the images and image_class.txt under the following directory structure:

|--image_class.txt
|--imgs
|    |--img1.jpg
|    |--img2.jpg
|    ...
|    |--img1000.jpg

Texture feature extraction

There are many feature extraction algorithms available for texture analysis. In this post I will use the SFTA Texture Extractor (Matlab code available here), which I developed as part of my PhD research. Download the SFTA code and unzip it into the working directory:

|--image_class.txt
|--sfta.m
|--imgs
|    |--img1.jpg
|    |--img2.jpg
|    ...
|    |--img1000.jpg

To extract features we will use the sfta(I, nt) function, where I corresponds to the input image depicting a texture and nt is a parameter that defines the size of the feature vector.
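As a quick sketch (the file name below is illustrative; any image from the 'imgs' directory works), extracting the features of a single image looks like this:

```matlab
% Read one texture image from the dataset directory.
I = imread(fullfile('imgs', 'img1.jpg'));

% Extract the SFTA feature vector. With nt = 4 the vector has
% 24 elements, matching F = zeros(1000, 24) in the training
% code further below.
f = sfta(I, 4);
```
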

Training the classifier

There are many good libraries and software that provide implementation of classification algorithms such as Weka for Java or LIBSVM. In this example, however, I will use the classifiers provided by Matlab. More specifically, I will use the Naive Bayes classifier.

In order to train the Naive Bayes classifier we need to organize the images' feature vectors into a single NxM matrix, where N is the number of images and M is the size of the feature vector. We also need an Nx1 cell array containing the images' labels. The following code reads the 'image_class.txt' file and generates the F matrix with the feature vectors and the L cell array with the images' labels:

imgListFile = fopen('image_class.txt', 'r');
F = zeros(1000, 24); L = cell(1000, 1);
tline = fgetl(imgListFile); currentLine = 1;
while ischar(tline)
    % Each line has the format: <image file name>, <class label>
    splittedLine = regexp(tline, ',[ ]*', 'split');
    % Extract the SFTA feature vector from the image.
    imagePath = fullfile('imgs', splittedLine{1});
    I = imread(imagePath);
    f = sfta(I, 4);
    F(currentLine, :) = f;
    % Store the image label.
    L{currentLine} = splittedLine{2};
    tline = fgetl(imgListFile);
    currentLine = currentLine + 1;
end
fclose(imgListFile);

Now that we have the F matrix and L cell array, we can train a Naive Bayes classifier nb using the following command:

>> nb = NaiveBayes.fit(F, L);


The nb classifier can now be used to classify an unlabeled image Itest. Since the classifier was trained on SFTA feature vectors, we first extract the features of Itest and pass them to predict:

>> predict(nb, sfta(Itest, 4))
ans = 
    'knit'
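To sanity-check the trained model, one option (a sketch, not part of the workflow above) is to compute the resubstitution error, i.e. the fraction of training images that the classifier itself mislabels:

```matlab
% Predict labels for the training set itself using the
% F matrix and L cell array built earlier.
P = predict(nb, F);

% Resubstitution error: fraction of mismatches between the
% predicted labels P and the true labels L.
err = sum(~strcmp(P, L)) / numel(L)
```

Keep in mind that the resubstitution error is optimistic; a held-out test split or cross-validation gives a more honest estimate of classification accuracy.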