Lecture 12 Texture


Introduction to Computer Vision: Image Texture Analysis

A few examples

• Morphological processing for background illumination estimation
• Optical character recognition

Image with nonlinear illumination

[Figure: original image and the result of thresholding with graythresh]

Obtain Estimate of Background

background = imopen(I, strel('disk',15));   % GRAYSCALE
figure, imshow(background, [])
figure, surf(double(background(1:8:end,1:8:end))), zlim([0 1]);

% Subtract background estimate from original image
I2 = I - background;
figure, imshow(I2), title('Image with background removed')
level = graythresh(I2);
bw = im2bw(I2, level);
figure, imshow(bw), title('threshold')

Comparison

[Figure: original image, direct threshold, and threshold after background removal]

Optical Character Recognition

• After segmenting a character we still need to recognize the character.
• How do we determine if a matrix of pixels represents an 'A', 'B', etc.?


Approach

• Select a line of text
• Segment each letter
• Recognize each letter as 'A', 'B', 'C', etc.


Select line 3. Samples of segmented individual letters from line 3:

• We need labeled samples of each potential letter to compare to the unknown
• Take the product of the unknown character and each labeled character and determine which labeled character is the closest match

% Load database of characters (samples of known characters)
load charDB08182009.mat

whos char08182009
  Name              Size         Bytes    Class     Attributes
  char08182009      26x1050      218400   double

Each row is a vectorized character bitmap.

BasicOCR.m

% Code something like this:
cc = ['A' 'B' 'C' 'D' 'E' 'F' 'G' 'H' 'I' 'J' 'K' 'L' 'M' 'N' 'O' ...
      'P' 'Q' 'R' 'S' 'T' 'U' 'V' 'W' 'X' 'Y' 'Z'];

% First, convert the matrix of the text character to a row vector t
for j = 1:26
    score(j) = sum(t .* char08182009R(j,:));
end
ind = find(score == max(score));
fprintf('Recognized Text %s\n', cc(ind))
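A minimal sketch of how this matching might be wrapped for one segmented character; the variable names charImg and charDB, and the vectorization step, are illustrative assumptions rather than code from the lecture:

% Sketch only: charImg is a segmented binary character image and charDB is
% the 26x1050 database (one vectorized bitmap per letter); both names are
% hypothetical placeholders.
cc = 'A':'Z';
t = reshape(double(charImg), 1, []);     % vectorize the unknown character
score = zeros(1, 26);
for j = 1:26
    score(j) = sum(t .* charDB(j,:));    % inner product with each template
end
[~, ind] = max(score);                   % index of the closest match
fprintf('Recognized character: %s\n', cc(ind));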

How can I segment this image?

Assumption: uniformity of intensities in a local image region. (Image: University of Bonn)

What is Texture?

[Figure: texture examples (image: University of Bonn)]


• Edge detection
• Histogram
• Threshold – graythresh


lev = graythresh(I)
lev = 0.5647

>> figure, imshow(I

What is Texture

• No formal definition
  – There is significant variation in intensity levels between nearby pixels
  – Variations of intensities form certain repetitive patterns (homogeneous at some spatial scale)
  – The local image statistics are constant or slowly varying
• Human visual system: textures are perceived as homogeneous regions, even though textures do not have uniform intensity

Texture

• Apparently homogeneous regions: sand on a beach, a brick wall
  – In both cases the HVS will interpret areas of sand or bricks as a 'region' in an image
  – But close inspection will reveal strong variations in pixel intensity

Texture

• Is a property of a 'group of pixels'/area; a single pixel does not have texture
• Is scale dependent – at different scales texture will take on different properties
• Involves a large number of (if not countless) primitive objects
  – If the objects are few, then a group of countable objects is perceived instead of texture
• Involves the spatial distribution of intensities
  – 2D histograms
  – Co-occurrence matrices

Scale Dependency

• Scale is important – consider sand
• Close up
  – "small rocks, sharp edges"
  – "rough looking surface"
• Far away
  – "smoother"
  – "one object – brown/tan color"

Terms (Properties) Used to Describe Texture

• Coarseness
• Roughness
• Direction
• Frequency
• Uniformity
• Density

How would you describe dog fur, cat fur, grass, wood grain, pebbles, cloth, steel?


"The object has a fine grain and a smooth surface."

• Can we define these terms precisely in order to develop a computer vision recognition algorithm?


Features

• Tone – based on pixel intensity in the texture primitive
• Structure – spatial relationships between primitives
• A pixel can be characterized by the tonal/structural properties of the group of pixels it belongs to

• Tonal:
  – Average intensity
  – Maximum intensity
  – Minimum intensity
  – Size, shape
• Spatial relationship of primitives:
  – Random
  – Pair-wise dependent

Artificial Texture

[Figure: artificial texture composed of repeated primitive symbols]

Artificial Texture

[Figure: artificial texture composed of repeated primitive symbols]

Segmenting into regions based on texture.

Color Can Play an Important role in Texture

[Figure: artificial texture in which color distinguishes the regions]

Color Can Play an Important Role in Texture

[Figure: artificial texture in which color distinguishes the regions]

Statistical and Structural Texture

Consider a brick wall:
• Statistical pattern – the close-up pattern within the bricks
• Structural (syntactic) pattern – the brick pattern on the previous slides can be represented by a grammar, such as (ababab)

Most current research focuses on statistical texture.

Edge density is a simple texture measure – edges per unit distance. Segment objects based on edge density.

How do we estimate edge density?


Segment objects based on edge density: move a window across the image and count the number of edges in the window.

Issue – window size:
• How large should the window be?
• What are the tradeoffs?
• How does window size affect the accuracy of segmentation?


Segment objects based on edge density: move a window across the image and count the number of edges in the window.

Issue – window size:
• How large should the window be? Large enough to get a good estimate of edge density.
• What are the tradeoffs? Larger windows result in larger overlap between textures.
• How does window size affect the accuracy of segmentation? Smaller windows result in better region segmentation accuracy, but a poorer estimate of edge density.

Average Edge Density Algorithm

• Smooth the image to remove noise
• Detect edges by thresholding the image
• Count the edges in an n x n window
• Assign the count to the window
• Feature vector: [gray level value, edge density]
• Segment the image using the feature vector
(A sketch of this computation follows below.)
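A minimal MATLAB sketch of this algorithm, assuming the Image Processing Toolbox; the file name, the Canny detector, the Gaussian smoothing, and the 15x15 window are illustrative choices not specified on the slide:

% Sketch of the average edge density feature.
Im = imread('texture.jpg');              % hypothetical input image
if size(Im,3) == 3, Im = rgb2gray(Im); end
gray = im2double(Im);
smoothed = imgaussfilt(gray, 1);         % smooth to remove noise
edges = edge(smoothed, 'canny');         % binary edge map

n = 15;                                  % n x n counting window
edgeDensity = conv2(double(edges), ones(n)/(n*n), 'same');

% Two-element feature vector per pixel: [gray level value, edge density]
features = cat(3, gray, edgeDensity);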

Run Length Coding Statistics

• Runs of 'similar' gray level pixels
• Measure runs in the directions 0, 45, 90, 135 degrees

Example 4x4 image:
0 0 2 3
1 2 1 0
2 3 1 3
3 3 1 0

Y(L, LEV, d) is the number of runs of length L of gray level LEV in direction d.

[Table: empty run-length matrices Y for the 4x4 example image at 0 and 45 degrees; rows are gray levels 0-3, columns are run lengths 1-4]

[Table: the filled-in run-length matrices for the 4x4 example image at 0 and 45 degrees; rows are gray levels 0-3, columns are run lengths 1-4]

Run Length Coding

• For gray-level images with 8 bits → 256 shades of gray → 256 rows
• A 1024x1024 image → 1024 columns
• Reduce the size of the matrix by quantizing:
  – Instead of 256 shades of gray, quantize each 8 levels into one, resulting in 256/8 = 32 rows
  – Quantize runs into ranges: runs 1-8 → first column, 9-16 → second column, and so on, resulting in 128 columns
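A minimal sketch of horizontal (0 degree) run-length counting for the 4x4 example image from the earlier slide; this is an illustrative implementation, not code from the lecture:

I = [0 0 2 3; 1 2 1 0; 2 3 1 3; 3 3 1 0];   % example image, gray levels 0..3
numLevels = 4;  maxRun = size(I,2);
Y = zeros(numLevels, maxRun);               % Y(level+1, L) = number of runs of length L
for r = 1:size(I,1)
    c = 1;
    while c <= size(I,2)
        lev = I(r,c);  len = 1;
        while c+len <= size(I,2) && I(r,c+len) == lev
            len = len + 1;                  % extend the current run
        end
        Y(lev+1, len) = Y(lev+1, len) + 1;  % record one run of this length
        c = c + len;
    end
end
disp(Y)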


Gray Level Co-occurrence Matrix, P[i,j]

• Specify a displacement vector d = (dx, dy)
• Count all pairs of pixels separated by d having gray level values i and j.

Formally:

P(i, j) = | { ((x1, y1), (x2, y2)) : I(x1, y1) = i, I(x2, y2) = j } |

where (x2, y2) = (x1 + dx, y1 + dy).

Gray Level Co-occurrence Matrix

• Consider a simple image with gray level values 0, 1, 2
• Let d = (1, 1): one pixel right and one pixel down

2 0 0 1 2
1 2 1 2 0
2 1 2 2 1
0 1 2 0 0
1 2 0 1 1

2 0 0 1 2
1 2 1 2 0
2 1 2 2 1
0 1 2 0 0
1 2 0 1 1

Count all pairs of pixels in which the first pixel has value i and the second pixel, displaced by d, has value j: P(1,0), P(2,1), etc.

Co-occurrence Matrix, P[i,j]

2 0 0 1 2
1 2 1 2 0
2 1 2 2 1
0 1 2 0 0
1 2 0 1 1

P(i, j):

         j=0   j=1   j=2
  i=0     0     2     2
  i=1     2     1     3
  i=2     2     2     2

There are 16 pairs, so normalize by 16.
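A minimal sketch that rebuilds this co-occurrence matrix with MATLAB's graycomatrix; the offset [1 1] means one row down and one column to the right. Depending on which pixel of each pair is treated as the 'first' one, the result is either the matrix above or its transpose:

A = [2 0 0 1 2;
     1 2 1 2 0;
     2 1 2 2 1;
     0 1 2 0 0;
     1 2 0 1 1];
glcm = graycomatrix(A, 'NumLevels', 3, 'GrayLimits', [0 2], ...
                    'Offset', [1 1], 'Symmetric', false);
P = glcm / sum(glcm(:));   % normalize by the number of pairs (16)
disp(glcm), disp(P)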

Uniform Texture

d = (1,1)

Let black = 1, white = 0

[Figure: checkerboard pattern of alternating black and white pixels]

P[i,j]:
P(0,0) = ?
P(0,1) = ?
P(1,0) = ?
P(1,1) = ?

Uniform Texture

d = (1,1)

Let black = 1, white = 0

P[i,j]:
P(0,0) = 24
P(0,1) = 0
P(1,0) = 0
P(1,1) = 25

Uniform Texture

d = (1,0)

Let black = 1, white = 0

P[i,j]:
P(0,0) = ?
P(0,1) = ?
P(1,0) = ?
P(1,1) = ?

Uniform Texture

d = (1,0)

Let black = 1, white = 0

P[i,j]:
P(0,0) = 0
P(0,1) = 28
P(1,0) = 28
P(1,1) = 0

Randomly Distributed Texture

What if the black and white pixels were randomly distributed? What will matrix P look like?

Example 8x8 random binary image:
1 1 1 0 0 1 0 0
0 0 1 0 1 0 0 1
1 1 0 0 0 1 0 1
0 1 1 1 0 0 1 1
1 1 0 0 1 1 0 0
1 1 0 0 1 1 1 1
0 0 1 0 0 1 0 1
0 0 0 1 1 0 1 1

With no preferred set of gray level pairs, matrix P will have an approximately uniform population.

Co-occurrence Features

• Gray Level Co-occurrence Matrices (GLCM)
  – Typically GLCMs are calculated at four different angles: 0, 45, 90 and 135 degrees
  – For each angle, different distances can be used: d = 1, 2, 3, etc.
  – Size of the GLCM of an 8-bit image: 256x256 (2^8). Quantizing the image will result in smaller matrices; a 6-bit image will result in 64x64 matrices
  – 14 features can be calculated from each GLCM. The features are used for texture calculations

Co-occurrence Features

P(ga, gb, d, t):
  – ga → gray level of pixel 'a'
  – gb → gray level of pixel 'b'
  – d  → distance d
  – t  → angle t (0, 45, 90, 135)

In many applications the transitions ga to gb and gb to ga are both counted. This results in symmetric GLCMs: for P(0, 0, 1, 0), a '0 0' pair results in an entry of 2 for the '0 0' entry.

Co-occurrence Features

• The data in the GLCM are used to derive the features, not the original image data

Contrast = Σ_{i,j} (i - j)^2 P(i, j)

• How do we interpret the contrast equation?

Co-occurrence Features

• The data in the GLCM are used to derive the features, not the original image data. Contrast measures the local variations in the gray-level co-occurrence matrix.

Contrast = Σ_{i,j} (i - j)^2 P(i, j)

• How do we interpret the contrast equation? The term (i - j)^2 is a weighting factor (a squared term):
  – Values along the diagonal (i = j) are multiplied by zero. These values represent adjacent image pixels that do not have a gray level difference.
  – Entries further away from the diagonal represent pixels that have a greater gray level difference, that is, more contrast, and are multiplied by a larger weighting factor.
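A minimal sketch that makes the (i - j)^2 weighting explicit, computed on the 5x5 example image from the earlier slides and checked against graycoprops:

A = [2 0 0 1 2; 1 2 1 2 0; 2 1 2 2 1; 0 1 2 0 0; 1 2 0 1 1];
glcm = graycomatrix(A, 'NumLevels', 3, 'GrayLimits', [0 2], 'Offset', [1 1]);
P = glcm / sum(glcm(:));                    % normalized co-occurrence matrix

[i, j] = ndgrid(1:size(P,1), 1:size(P,2));  % index grids for the weights
contrast = sum(sum((i - j).^2 .* P));       % sum over all (i,j) pairs

stats = graycoprops(glcm, {'Contrast'});    % MATLAB's implementation
fprintf('hand-computed: %.4f, graycoprops: %.4f\n', contrast, stats.Contrast);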

Co-occurrence Features

• Dissimilarity:

Dissimilarity = Σ_{i,j} |i - j| P(i, j)

  – Dissimilarity is similar to contrast, except the weights increase linearly

Co-occurrence Features

• Inverse Difference Moment (IDM):

IDM = Σ_{i,j} P(i, j) / (1 + (i - j)^2)

  – IDM gives smaller values for images with high contrast and larger values for images with low contrast

Co-occurrence Features

• Angular Second Moment (ASM) measures orderliness: how regular or orderly the pixel values are in the window

  ASM = Σ_{i,j} P(i, j)^2

• Energy is the square root of ASM:

  Energy = sqrt( Σ_{i,j} P(i, j)^2 )

• Entropy:

  Entropy = -Σ_{i,j} P(i, j) ln P(i, j),  where ln(0) is taken as 0
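A minimal sketch of these three features computed from a normalized GLCM; the built-in cameraman.tif image is used only as an example input:

glcm = graycomatrix(imread('cameraman.tif'));   % default: 8 levels, offset [0 1]
P = glcm / sum(glcm(:));

asm = sum(P(:).^2);                  % angular second moment
energyVal = sqrt(asm);               % energy = sqrt(ASM)
nz = P(P > 0);                       % skip zeros so that ln(0) contributes 0
entropyVal = -sum(nz .* log(nz));

fprintf('ASM %.4f, energy %.4f, entropy %.4f\n', asm, energyVal, entropyVal);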

Matlab Texture Filter Functions

Function      Description
rangefilt     Calculates the local range of an image.
stdfilt       Calculates the local standard deviation of an image.
entropyfilt   Calculates the local entropy of a grayscale image. Entropy is a statistical measure of randomness.

rangefilt

A =
1 3 5 5 2
4 3 4 2 6
8 7 3 5 4
6 2 7 2 2
1 8 9 6 7

Symmetric padding gives:
1 1 3 5 5 2 2
1 1 3 5 5 2 2
4 4 3 4 2 6 6
8 8 7 3 5 4 4
6 6 2 7 2 2 2
1 1 8 9 6 7 7
1 1 8 9 6 7 7

For the 3x3 window centered on the first pixel: max = 4, min = 1, range = 3.

rangefilt Results (3x3)

A =
1 3 5 5 2
4 3 4 2 6
8 7 3 5 4
6 2 7 2 2
1 8 9 6 7

>> R = rangefilt(A)
R =
3 4 3 4 4
7 7 5 4 4
6 6 5 5 4
7 8 7 7 5
7 8 7 7 5

rangefilt Results (5x5)

A =
1 3 5 5 2
4 3 4 2 6
8 7 3 5 4
6 2 7 2 2
1 8 9 6 7

>> R = rangefilt(A, ones(5))
R =
7 7 7 5 4
7 7 7 5 5
8 8 8 7 7
8 8 8 7 7
8 8 8 7 7

Original image


Imfilt = rangefilt(Im);
figure, imshow(Imfilt, []), title('Image by rangefilt')

Imfilt = stdfilt(Im);
figure, imshow(Imfilt, []), title('Image by stdfilt')

Imfilt = entropyfilt(Im);
figure, imshow(Imfilt, []), title('Image by entropyfilt')

Matlab function: graycomatrix

• Computes the GLCM of an image
  – glcm = graycomatrix(I) analyzes pairs of horizontally adjacent pixels in a scaled version of I. If I is a binary image, it is scaled to 2 levels. If I is an intensity image, it is scaled to 8 levels.
  – [glcm, SI] = graycomatrix(...) returns the scaled image used to calculate the GLCM. The values in SI are between 1 and 'NumLevels'.

Parameters

• 'Offset' determines the number of co-occurrence matrices generated
• offsets is a q x 2 matrix
  – Each row of the matrix has the form [row_offset, col_offset]
  – row_offset specifies the number of rows between the pixel of interest and its neighbor
  – col_offset specifies the number of columns between the pixel of interest and its neighbor

Offset

• [0, 1] specifies the neighbor one column to the right of the pixel of interest

Angle    Offset
0        [0  D]
45       [-D  D]
90       [-D  0]
135      [-D -D]

Orientation of offset

• The figure illustrates the array: offset = [0 1; -1 1; -1 0; -1 -1]

[Figure: offset directions around the pixel of interest – 0 degrees [0 1], 45 degrees [-1 1], 90 degrees [-1 0], 135 degrees [-1 -1]]
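A minimal sketch of building GLCMs at the four standard angles for a distance D = 1; the example image is an illustrative choice:

I = imread('cameraman.tif');           % example built-in image
D = 1;
offsets = [0 D; -D D; -D 0; -D -D];    % 0, 45, 90, 135 degrees
glcms = graycomatrix(I, 'Offset', offsets);   % returns an 8x8x4 array
size(glcms)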

Intensity Image

• mat2gray – convert a matrix to an intensity image.

  I = mat2gray(A, [AMIN AMAX]) converts the matrix A to the intensity image I.
  The returned matrix I contains values in the range 0.0 (black) to 1.0 (white).

graycomatrix Example

From textbook, p. 649

>> f = [1 1 7 5 3 2; 5 1 6 1 2 5; 8 8 6 8 1 2; 4 3 4 5 5 1; 8 7 8 7 6 2; 7 8 6 2 6 2]
f =
     1     1     7     5     3     2
     5     1     6     1     2     5
     8     8     6     8     1     2
     4     3     4     5     5     1
     8     7     8     7     6     2
     7     8     6     2     6     2

Need to convert to an Intensity image [0,1]


>> fm = mat2gray(f)
fm =
         0         0    0.8571    0.5714    0.2857    0.1429
    0.5714         0    0.7143         0    0.1429    0.5714
    1.0000    1.0000    0.7143    1.0000         0    0.1429
    0.4286    0.2857    0.4286    0.5714    0.5714         0
    1.0000    0.8571    1.0000    0.8571    0.7143    0.1429
    0.8571    1.0000    0.7143    0.1429    0.7143    0.1429

Quantize to 8 Levels

IS =
     1     1     7     5     3     2
     5     1     6     1     2     5
     8     8     6     8     1     2
     4     3     4     5     5     1
     8     7     8     7     6     2
     7     8     6     2     6     2

>> offsets = [0 1];
>> [GS, IS] = graycomatrix(fm, 'NumLevels', 8, 'Offset', offsets)
GS =
     1     2     0     0     0     1     1     0
     0     0     0     0     1     1     0     0
     0     1     0     1     0     0     0     0
     0     0     1     0     1     0     0     0
     2     0     1     0     1     0     0     0
     1     3     0     0     0     0     0     1
     0     0     0     0     1     1     0     2
     1     0     0     0     0     2     2     1

See next page for IS.

GS =
     1     2     0     0     0     1     1     0
     0     0     0     0     1     1     0     0
     0     1     0     1     0     0     0     0
     0     0     1     0     1     0     0     0
     2     0     1     0     1     0     0     0
     1     3     0     0     0     0     0     1
     0     0     0     0     1     1     0     2
     1     0     0     0     0     2     2     1

IS =
     1     1     7     5     3     2
     5     1     6     1     2     5
     8     8     6     8     1     2
     4     3     4     5     5     1
     8     7     8     7     6     2
     7     8     6     2     6     2

'GrayLimits' Two-element vector, [low high], that specifies how the grayscale values in I are linearly scaled into gray levels. Grayscale values less than or equal to low are scaled to 1. Grayscale values greater than or equal to high are scaled to NumLevels. If graylimits is set to [], graycomatrix uses the minimum and maximum grayscale values in the image as limits, [min(I(:)) max(I(:))].

>> [GS, IS] = graycomatrix(f, 'NumLevels', 8, 'Offset', offsets, 'G', [])

>> [GS, IS] = graycomatrix(f, 'NumLevels', 8, 'Offset', offsets, 'G', [])

>> I = rand(5)
I =
    0.0085    0.8452    0.2026    0.1901    0.6818
    0.6311    0.1183    0.1947    0.1580    0.5397
    0.2303    0.8539    0.6766    0.8251    0.9968
    0.4624    0.7807    0.7231    0.5540    0.1104
    0.3995    0.4229    0.7560    0.3559    0.6204

>> [GS, IS] = graycomatrix(f, 'NumLevels', 8, 'Offset', offsets, 'G', [])
GS =
     1     2     0     0     0     1     1     0
     0     0     0     0     1     1     0     0
     0     1     0     1     0     0     0     0
     0     0     1     0     1     0     0     0
     2     0     1     0     1     0     0     0
     1     3     0     0     0     0     0     1
     0     0     0     0     1     1     0     2
     1     0     0     0     0     2     2     1

IS =
     1     1     7     5     3     2
     5     1     6     1     2     5
     8     8     6     8     1     2
     4     3     4     5     5     1
     8     7     8     7     6     2
     7     8     6     2     6     2

>> [GS, IS] = graycomatrix(f, 'NumLevels', 4, 'Offset', offsets, 'G', [])
GS =
     3     0     3     1
     1     2     1     0
     6     1     1     1
     1     0     4     5

IS (the original image quantized to 4 levels) =
     1     1     4     3     2     1
     3     1     3     1     1     3
     4     4     3     4     1     1
     2     2     2     3     3     1
     4     4     4     4     3     1
     4     4     3     1     3     1

Texture feature formula

Feature        Formula                              Description
Energy         sqrt( Σ_{i,j} P(i,j)^2 )             Provides the sum of squared elements in the GLCM (square root of ASM).
Entropy        -Σ_{i,j} P(i,j) ln P(i,j)            Measures the uncertainty (variations) of the image.
Contrast       Σ_{i,j} (i-j)^2 P(i,j)               Measures the local variations in the gray-level co-occurrence matrix.
Homogeneity    Σ_{i,j} P(i,j) / (1 + |i-j|)         Measures the closeness of the distribution of elements in the GLCM to the GLCM diagonal.

glcms = graycomatrix(Im, 'NumLevels', 256, 'G', []);
stats = graycoprops(glcms, {'Contrast', 'Correlation', 'Homogeneity'});
figure, plot([stats.Correlation]);
title('Texture Correlation as a function of offset');
xlabel('Horizontal Offset'); ylabel('Correlation')
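For the plot of correlation versus offset to be meaningful, graycomatrix needs one GLCM per offset. A minimal sketch under that assumption; the 1:8 offset range and the example image are illustrative choices:

Im = imread('cameraman.tif');                  % example built-in image
offsets = [zeros(8,1) (1:8)'];                 % [0 1; 0 2; ... ; 0 8]
glcms = graycomatrix(Im, 'Offset', offsets);   % one GLCM per horizontal offset
stats = graycoprops(glcms, {'Contrast', 'Correlation', 'Homogeneity'});
figure, plot([stats.Correlation]);
title('Texture Correlation as a function of offset');
xlabel('Horizontal Offset'); ylabel('Correlation')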

Texture Measurement

• Quantize 256 gray levels to 32
• Data window: 31x31 or 15x15
• Compute GLCM0, GLCM45, GLCM90, GLCM135
• Compute a feature for each matrix: energy, entropy, contrast, etc.
• Generate a feature matrix for each feature

[Figure: input image and ideal class map]

[Figure: class maps generated using the 3 best co-occurrence feature images]

[Figure: class maps generated using the 7 best co-occurrence feature images]

The 31x31 window produces the best results, but large errors at the borders.

Laws Texture Energy Features

• Use texture energy for segmentation
• General idea: energy measured within textured regions of an image will produce different values for each texture, providing a means for segmentation
• Two-part process:
  – Generate 2D kernels from 5 basis vectors
  – Convolve images with the kernels

Laws Kernel Generation

Level  L5 = [  1  4  6  4  1 ]
Ripple R5 = [  1 -4  6 -4  1 ]
Edge   E5 = [ -1 -2  0  2  1 ]
Spot   S5 = [ -1  0  2  0 -1 ]
Wave   W5 = [ -1  2  0 -2  1 ]

To generate kernels, multiply the transpose of one vector by itself or by another vector:

L5E5 = [ 1 4 6 4 1 ]' * [ -1 -2 0 2 1 ]
     = -1   -2    0    2    1
       -4   -8    0    8    4
       -6  -12    0   12    6
       -4   -8    0    8    4
       -1   -2    0    2    1

• 25 2D kernels are possible, but only 24 are used
• L5L5 is sensitive to mean brightness values and is not used
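A minimal sketch of generating the 24 Laws kernels and producing texture energy images, assuming a grayscale double image; the 15x15 energy-averaging window and the use of the absolute filter response are illustrative choices:

L5 = [ 1  4  6  4  1];  E5 = [-1 -2  0  2  1];  S5 = [-1  0  2  0 -1];
W5 = [-1  2  0 -2  1];  R5 = [ 1 -4  6 -4  1];
basis = {L5, E5, S5, W5, R5};

img = imread('stone_building.jpg');              % image name from a later slide
if size(img,3) == 3, img = rgb2gray(img); end
imGray = im2double(img);

energy = [];
for a = 1:5
    for b = 1:5
        if a == 1 && b == 1, continue; end       % skip L5L5 (mean brightness)
        kernel = basis{a}' * basis{b};           % 5x5 2D kernel
        filtered = conv2(imGray, kernel, 'same');
        e = conv2(abs(filtered), ones(15)/225, 'same');   % local texture energy
        energy = cat(3, energy, e);              % stack into a data 'cube'
    end
end
size(energy)                                     % rows x cols x 24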


textureExample.m

• Reads in the image
• Converts it to double and grayscale
• Creates the energy kernels
• Convolves the kernels with the image
• Creates a data 'cube'

stone_building.jpg


Test 2


Scale

• How will scale affect energy measurements?
• Reduce the image to one quarter size:

imGraySm = imresize(imGray, 0.25, 'bicubic');

Data 'cube'

>> data = cat(3, im(:,:,1), im(:,:,2), im(:,:,3), imL5R5, imR5E5);
>> figure, imshow(data(:,:,1:3))
>> data_value = data(7,12,:)
data_value(:,:,1) = 142
data_value(:,:,2) = 166
data_value(:,:,3) = 194
data_value(:,:,4) = 22
data_value(:,:,5) = 10

Fractal Dimension

• The Hurst coefficient can be used to calculate the fractal dimension of a surface
• The fractal dimension can be interpreted as a measure of texture

Consider the 5-pixel-wide neighborhood (13 pixels):

        d
    c   b   c
d   b   a   b   d
    c   b   c
        d

Pixel class    Number    Distance from center
a              1         0
b              4         1
c              4         1.414
d              4         2

Fractal Dimension Algorithm

• Lay the mask over the original image
• Examine the pixels in each of the classes
• Record the brightest and darkest pixel for each class
• The pixel brightness difference (range) for each pixel class is used to generate the Hurst plot
• Use a least squares fit to construct a ln(distance) vs. ln(range) plot
• The slope of this line is the Hurst coefficient for the specific pixel
(A sketch of this per-pixel computation follows below.)
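A minimal sketch of the per-pixel Hurst computation using the 13-pixel neighborhood from the previous slide; the image, the pixel location, and the use of per-class ranges (rather than cumulative ranges over all pixels within each distance) are illustrative assumptions:

I = im2double(imread('cameraman.tif'));
r = 100;  c = 100;                         % interior pixel of interest

% Offsets for classes b (distance 1), c (sqrt(2)) and d (2)
classes = { [0 1; 0 -1; 1 0; -1 0], ...
            [1 1; 1 -1; -1 1; -1 -1], ...
            [0 2; 0 -2; 2 0; -2 0] };
dist = [1, sqrt(2), 2];

center = I(r, c);
ranges = zeros(1, 3);
for k = 1:3
    offs = classes{k};
    idx = sub2ind(size(I), r + offs(:,1), c + offs(:,2));
    vals = [center; I(idx)];
    ranges(k) = max(vals) - min(vals);     % brightness range for this class
end

% Least squares fit of ln(range) vs ln(distance); slope = Hurst coefficient.
% Assumes the ranges are nonzero so the logs are finite.
p = polyfit(log(dist), log(ranges), 1);
H = p(1);
fprintf('Hurst coefficient at (%d,%d): %.3f\n', r, c, H);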