Region-Based Feature Extraction of Prostate Ultrasound


Region-Based Feature Extraction of Prostate
Ultrasound Images:
A Knowledge-Based Approach
Using Fuzzy Inferencing
Eric K. T. Hui
University of Waterloo, M.A.Sc. Seminar
Wednesday, November 12, 2003
4:30 PM in DC 2584
Outline

- Introduction
- Medical Background
- Related Research
- Problem Formulation
- Proposed Feature Extraction
- Analysis
- Conclusions
- Future Work
- Questions and Comments
2
Introduction
- Prostate Cancer

Prostate cancer is the most frequently diagnosed cancer in Canadian men:
- 18,800 will be newly diagnosed.
- 4,200 will die of it.

- The exact cause remains unknown.
- Early detection is key to controlling and localizing cancerous cells.
3
Introduction
- TRUS

Digital transrectal ultrasonography (TRUS):
- One of the early detection techniques.
- Low cost, high availability, high safety, immediate results.
- TRUS can be used to plan and guide prostate biopsy.

This thesis aims to automate the detection of cancerous regions.
4
Introduction
- Features

Feature:
- Measurement of some characteristic (e.g. darkness, texture).
- A good feature should be discriminative so that, ideally, the cancerous regions are mapped to a different range of feature values in the feature space than the non-cancerous regions.

[Figure: benign vs. cancerous distributions along the feature-value axis]
5
Introduction
- This Thesis

This thesis proposes a new feature extraction method:
- Spatial location, symmetry, and other geometric measurements of the regions-of-interest, in addition to greylevel and texture.
- Uses a semi-automatic fuzzy inferencing system (FIS) to relate all the features and mimic radiologists' knowledge.

Outline
6
Medical Background
- Male Reproductive System

[Anatomical diagram: vas deferens, ureter, bladder, seminal vesicles, ejaculatory ducts, penis, prostate gland, urethra, rectum, bulbourethral gland, testis]
7
Medical Background
- Prostate Zonal Anatomy

[Anatomical diagram: vas deferens, bladder, seminal vesicles, anterior fibromuscular stroma (AFMS), central zone (CZ), ejaculatory duct, transition zone (TZ), verumontanum, peripheral zone (PZ), urethra, rectum]
8
Medical Background
- BPH

- Young and healthy prostate vs. prostate with benign prostatic hyperplasia (BPH).

[Diagrams comparing the zonal anatomy (AFMS, CZ, TZ, PZ, urethra, ejaculatory ducts) of a young, healthy prostate and a prostate with BPH]

Back
9
Medical Background
- Prostate Cancer

- Prostate cancer involves the growth of malignant prostate tumours and can be life threatening.
- Uneven statistical distribution:
  - 70% originates in the PZ.
  - 10% originates in the CZ.
  - 20% originates in the TZ.
- Cancer tends to be localized in the early stage, so any asymmetry on the axial view might suggest cancer development.

[Diagram: zonal anatomy (urethra, TZ, CZ, PZ, ejaculatory ducts)]
10
Medical Background
- TRUS Imaging

Echoicities:
- hypoechoic
- anechoic
- isoechoic
- hyperechoic
11
Medical Background
- TRUS Imaging

TRUS imaging:
- About 80% of prostate cancer tissues consist of hypoechoic tissues (mixed with other echoicities).
- Different probes (e.g. end-fire, side-fire) give different shapes of the captured image of the prostate.
12
Medical Background
- Summary

- Uneven cancer statistical distribution.
- Asymmetry of regions-of-interest.
- TRUS echoicities.
- Different probes give different prostate shapes.

Outline
13
Related Research

- Transform-Based
  - Fourier Transform
  - Gabor Transform
  - Wavelet Transform
- Statistic-Based
  - First-Order Statistics
  - Second-Order Statistics
14
Related Research
- Fourier Transform

Fourier Transform:
- Decomposes the signal into pure frequencies:
  $F(u) = \int f(x)\, e^{-j 2\pi u x}\, dx$
- Not localized in the spatial domain.
- A global operator.

Chapter Outline
15
Related Research
- Gabor Transform

Gabor Transform:
- A "windowed Fourier Transform":
  $F_{GT}(k, u) = \int f(x)\, w^{*}(x - k)\, e^{-j 2\pi u x}\, dx$
- Trade-off between spatial and frequency resolutions.

[Figure: example signals f(x) in the spatial domain and their spectra F(u) in the frequency domain]
16
Related Research
- Gabor Transform

Gabor Filter:
- A variation of the Gabor Transform.
- Translates the window in the frequency domain to capture different frequency components (see the sketch below).
17
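To make the Gabor filter idea concrete, here is a minimal sketch (not from the thesis) that builds a 2-D Gabor kernel with NumPy and convolves it with an image; the frequency, orientation, and envelope-width parameters are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(frequency, theta, sigma, size=31):
    """Complex 2-D Gabor kernel: Gaussian envelope times a complex sinusoid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates so the sinusoid runs along orientation `theta`.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    carrier = np.exp(2j * np.pi * frequency * x_t)
    return envelope * carrier

def gabor_response(image, frequency=0.1, theta=0.0, sigma=4.0):
    """Magnitude of the Gabor filter response; orientation-dependent (anisotropic)."""
    kernel = gabor_kernel(frequency, theta, sigma)
    real = convolve(image, kernel.real, mode="nearest")
    imag = convolve(image, kernel.imag, mode="nearest")
    return np.hypot(real, imag)

# Example: respond to roughly vertical texture in a toy image.
img = np.random.rand(64, 64)
resp = gabor_response(img, frequency=0.15, theta=np.pi / 2)
```

The orientation dependence of the kernel (through theta) is exactly the anisotropy that the next slide points out as a drawback for ultrasound texture.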
Related Research
- Gabor Transform

Gabor Filter:
- It is anisotropic (i.e. orientation dependent).

[Figure: texture orientation vs. path of the ultrasound wave]

Chapter Outline
18
Related Research
- Wavelet Transform

Wavelet Transform:
- Multiresolution Analysis (MRA).
- Different dilations of basis functions to analyze different scales.
19
Related Research
- Transform-Based Limitations

Limitations of transform-based methods:
- Similar frequency spectrum.

[Figure: a sample signal (vtrOrig) in the spatial domain and its Gabor-transform spectrum frqGT(u) in the frequency domain]

Chapter Outline
20
Related Research
- First-Order Statistics

First-Order Statistics:
- Greylevel of each pixel.
- One of the most discriminative features.

[Images: TRUS image and its cancerous region]

Chapter Outline
21
Related Research
- Second-Order Statistics

Second-Order Statistics:
- Statistics on pairs of neighbouring pixels.
- Requires a window defining the neighbourhood.
- Greylevel Difference Matrix (GLDM), where $p(i \mid d)$ is the probability of a greylevel difference $i$ at displacement $d$ (see the sketch below):
  - Contrast (CON): $f_{con} = \sum_i i^2\, p(i \mid d)$
  - Mean (MEAN): $f_{mean} = \sum_i i\, p(i \mid d)$
  - Entropy (ENT): $f_{ent} = -\sum_i p(i \mid d)\, \log p(i \mid d)$
  - Inverse Difference Moment (IDM): $f_{idm} = \sum_i \dfrac{p(i \mid d)}{i^2 + 1}$
  - Angular Second Moment (ASM): $f_{asm} = \sum_i p(i \mid d)^2$

Back
22
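As an illustration (not code from the thesis), the following sketch computes a greylevel difference histogram p(i|d) over a small window and the five GLDM features above; the displacement, the window size, and the 64-level quantization are arbitrary assumptions.

```python
import numpy as np

def gldm_features(window, d=(0, 1), levels=64):
    """Compute GLDM features (CON, MEAN, ENT, IDM, ASM) of a greylevel
    window for a single displacement d = (dy, dx)."""
    dy, dx = d
    h, w = window.shape
    # Pair each pixel with its neighbour at displacement d and take |difference|.
    a = window[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
    b = window[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
    diff = np.abs(a.astype(int) - b.astype(int))
    # p(i | d): normalized histogram of the greylevel differences.
    p = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p /= p.sum()
    i = np.arange(len(p))
    nz = p > 0  # avoid log(0) in the entropy term
    return {
        "CON": np.sum(i ** 2 * p),
        "MEAN": np.sum(i * p),
        "ENT": -np.sum(p[nz] * np.log(p[nz])),
        "IDM": np.sum(p / (i ** 2 + 1)),
        "ASM": np.sum(p ** 2),
    }

# Example on a random 15x15 window quantized to 64 greylevels.
win = np.random.randint(0, 64, (15, 15))
print(gldm_features(win, d=(0, 1)))
```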
Related Research
- Summary

All these methods were successfully applied to extract features from:
- Modalities with good resolution and image quality, such as CT and MRI.
- High-level structures such as the overall prostate or large regions (at least 64×64 pixels).

However, …
23
Related Research
- Summary

However, they are not suitable for extracting features of low-level structures in ultrasound images.
Any size of window or wavelet basis is either:
- Too large to preserve region boundary integrity, or
- Too small for reliable statistics.

Outline
24
Problem Formulation
- Resources

- Average image size: 188.6×346.3 pixels.
- Average cancerous region size: 2920.3 pixels; that is smaller than a circle with a radius of 30.5 pixels!

[Images: original TRUS image, prostate outline, TZ outline, cancerous region outline]
25
Problem Formulation
- Objectives

To come up with a new set of features that can help differentiate cancerous regions in a TRUS image from the rest of the prostate.

Desirable criteria:
- The features can be applied to analyze low-level structures, such as the cancerous regions (~30-pixel-radius circle).
- The boundary integrity of each region-of-interest should be well preserved.
- The features should be isotropic.
- The features should be discriminative enough to differentiate cancerous regions from benign regions.

Outline
26
Proposed Feature Extraction Method
- Overview

[Block diagram: input → Region Segmentation → Image Registration → raw-based feature extraction (Greylevel, Texture) and model-based feature extraction (Region Geometry, Symmetry, Spatial Location) → Feature Evaluation, design only (Feature Design Parameters, PDF Estimation, MI Evaluation, Feature Selection) → FIS (Membership Functions, Fuzzy Rules) → output]

Outline
27
Proposed Feature Extraction Method
- Region Segmentation

Some region segmentation methods that I have tried:
- Graph-theory-based method, by constructing a Minimum Spanning Tree (MST).
- Thresholding on the histogram.

[Images: graph-theory-based segmentation vs. thresholding-based segmentation]
28
Proposed Feature Extraction Method
- Region Segmentation

Thresholding-based method (see the sketch below):

[Pipeline: original image → Gaussian blurred → histogram → greylevel segmentation → morphological operators ("open" and fill "holes") → zonal segmentation → resulting segmentation]

Overview
29
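A minimal sketch of this kind of thresholding-based pipeline (my reconstruction, not the thesis code): Gaussian blur, a histogram-derived threshold, then morphological opening and hole filling. The Otsu threshold, the "keep the darker side" rule, and the structuring-element size are my assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu
from skimage.morphology import binary_opening, disk

def segment_hypoechoic(image, sigma=2.0):
    """Rough segmentation of dark (hypoechoic) regions in a greyscale image."""
    blurred = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    # Histogram-based threshold (Otsu is one choice); keep the darker side.
    t = threshold_otsu(blurred)
    mask = blurred < t
    # Morphological "open" removes small speckle, then fill interior holes.
    mask = binary_opening(mask, disk(3))
    mask = ndimage.binary_fill_holes(mask)
    return mask

# Example on a synthetic image with a dark blob.
img = np.full((100, 100), 200.0)
img[30:60, 40:70] = 50.0
img += np.random.normal(0, 10, img.shape)
mask = segment_hypoechoic(img)
```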
Proposed Feature Extraction Method
- Image Registration

Prostates have different shapes on TRUS images due to:
- Different physical shapes.
- Different probes (e.g. side-fire, end-fire).

Prostates may not be located at the centre of the image.

[Image: original TRUS image]
30
Proposed Feature Extraction Method
- Image Registration

The idea is to deform all the prostates into a common model shape:
- The model shape should make it easy to specify the relative spatial location of a given point with respect to the whole prostate.
- The model shape should be similar to an average prostate outline.
- The model shape should be reflectionally symmetric about the vertical axis located at the centre of the image.
31
Proposed Feature Extraction Method
- Image Registration

A compromise:

[Image: model binary mask]
32
Proposed Feature Extraction Method
- Image Registration

[Flowchart: original image + model binary mask → affine transformation (outline-based, texture-based) → fluid-landmark-based transformation (model-based): define landmarks → estimate optimal trajectories → calculate velocity vectors → interpolate missing pixels]
33
Proposed Feature Extraction Method
- Image Registration

Define landmarks:
- 16 equally spaced landmarks on the prostate outline.
- 2 equally spaced landmarks on the vertical axis.

No medical knowledge of the anatomical structure is required.

[Images: subject landmarks and model landmarks]
34
Proposed Feature Extraction Method
- Image Registration

Lagrangian trajectory:
- The initial, intermediate, and final positions of a landmark:
  $\phi(x_n, t) = x_n + \int_{t_0}^{t} v(x_n, \tau)\, d\tau$

Velocity vectors:
- $v(x_n, t) = \phi(x_n, t) - \phi(x_n, t - 1)$
- Displacement of the position of a landmark (in a unit of time).

[Images: model landmarks and subject landmarks]
35
Proposed Feature Extraction Method
- Image Registration

Estimate optimal trajectories:
- Minimize a distance error plus a quadratic energy term:
  $\hat{\phi}(x_n, t) = \arg\min_{\phi(x_n, t)} \big[ D(\phi(x_n, t)) + P(\phi(x_n, t)) \big]$
- Iterative gradient descent:
  $\phi_{k+1}(x_n, t) = \phi_k(x_n, t) - \Delta \left[ \dfrac{\partial D(\phi_k(x_n, t))}{\partial \phi_k(x_n, t)} + \dfrac{\partial P(\phi_k(x_n, t))}{\partial \phi_k(x_n, t)} \right]$
- The distance term only acts at the final time $t = T$, pulling each landmark toward its model position:
  $\dfrac{\partial D(\phi_k(x_n, t))}{\partial \phi_k(x_n, t)} = \begin{cases} 2\,[\phi_k(x_n, T) - x_{n,\text{model}}], & t = T \\ 0, & t \neq T \end{cases}$
- The gradient of the quadratic energy $P$ is expressed through the kernel $K(\phi_k(x_n, t), \phi_k(x_m, t))$ and the landmark velocities $\phi_k(x_m, t+1) - \phi_k(x_m, t)$, summed over the $N$ landmarks (see Joshi and Miller, 2000).
36
Proposed Feature Extraction Method
- Image Registration

Interpolate the optimal velocity vectors over the whole image space (see the sketch below):
- Optimal velocity vectors at the landmarks:
  $\hat{v}(x_n, t) = \hat{\phi}(x_n, t) - \hat{\phi}(x_n, t - 1)$
- Optimal velocity vectors over the whole image space:
  $\hat{v}(x, t) = \sum_{n=1}^{N} K(\hat{\phi}(x_n, t), x) \sum_{m=1}^{N} K^{-1}(\hat{\phi}(x_n, t), \hat{\phi}(x_m, t))\, \hat{v}(x_m, t)$
37
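As a rough illustration of this kernel-interpolation step (not the thesis code), the sketch below solves for kernel weights from the landmark velocities and evaluates the velocity field at arbitrary pixels; the Gaussian kernel and its width are my assumptions, and $K^{-1}(\cdot,\cdot)$ is read as the entries of the inverse of the landmark Gram matrix.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=20.0):
    """K(a, b) for two sets of 2-D points; result has shape (len(a), len(b))."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def interpolate_velocity(landmarks, landmark_vel, points, sigma=20.0):
    """Kernel interpolation: v(x) = sum_n K(x, x_n) w_n, with weights w solving
    K(x_n, x_m) w = v(x_m), so the field is exact at the landmarks."""
    G = gaussian_kernel(landmarks, landmarks, sigma)      # N x N Gram matrix
    w = np.linalg.solve(G, landmark_vel)                  # N x 2 weights
    Kx = gaussian_kernel(points, landmarks, sigma)        # P x N
    return Kx @ w                                         # P x 2 velocities

# Example: 4 landmarks with known displacements, evaluated on a tiny grid.
lm = np.array([[10.0, 10.0], [10.0, 50.0], [50.0, 10.0], [50.0, 50.0]])
vel = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
grid = np.array([[x, y] for x in range(0, 60, 20) for y in range(0, 60, 20)], float)
print(interpolate_velocity(lm, vel, grid))
```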
Proposed Feature Extraction Method
- Image Registration

Optimal velocity vectors:

[Figure: velocity fields $\hat{v}(x, t=2)$ and $\hat{v}(x, t=3)$]

Interpolate the optimal Lagrangian trajectories for the whole image:
$\phi(x, T) = x + \sum_{t=2}^{T} \hat{v}(x, t)$
38
Proposed Feature Extraction Method
- Image Registration

Missing pixels in the resulting image are filled using linear interpolation.

[Images: marked subject image (before deformation) and marked deformed image (after deformation)]
39
Proposed Feature Extraction Method
- Image Registration

Now we can easily measure spatial location and symmetry!

[Images: original images (prostate5a088, prostate6a088, prostate8ba088, prostate14a088) and their registered versions]

Overview
40
Proposed Feature Extraction Method
- Greylevel

- Blur with a Gaussian filter. Design parameter: σ.
- Take the average over each region-of-interest (see the sketch below).

[Images: TRUS image, pixel-based greylevel (GL), region-based greylevel (GL)]

Overview
41
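A minimal sketch of this region-based greylevel feature (my reconstruction, not the thesis code): Gaussian blur followed by averaging over each labelled region; σ = 1 follows the chosen design parameter reported later in the talk.

```python
import numpy as np
from scipy import ndimage

def region_greylevel(image, labels, sigma=1.0):
    """Region-based greylevel (GL): mean of the Gaussian-blurred image
    over each labelled region-of-interest."""
    blurred = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    region_ids = [r for r in np.unique(labels) if r != 0]  # 0 = background
    means = ndimage.mean(blurred, labels=labels, index=region_ids)
    return dict(zip(region_ids, means))

# Example: two labelled regions in a toy image.
img = np.random.rand(64, 64) * 255
lab = np.zeros((64, 64), int)
lab[10:30, 10:30] = 1
lab[40:60, 35:55] = 2
print(region_greylevel(img, lab))
```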
Proposed Feature Extraction Method
- Texture

- GLDM with different window sizes. Design parameter: wROI.

[Images: pixel-based and region-based texture feature maps (CON, MEA, ENT, IDM, ASM)]

Overview
42
Proposed Feature Extraction Method
- Symmetry

- Difference from the flipped feature images (see the sketch below).
- Design parameter: none.

[Images: Greylevel Symmetry (GS) and Texture Symmetry (TS) maps, pixel-based before inverse-deformation, pixel-based, and region-based]

Overview
43
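A minimal sketch of such a symmetry feature (my reconstruction): since the registered model shape is reflectionally symmetric about the vertical image axis, the feature image is compared with its left-right mirror; the absolute difference and the per-region averaging are assumptions.

```python
import numpy as np
from scipy import ndimage

def symmetry_feature(feature_img, labels):
    """Symmetry map: absolute difference between a registered feature image
    and its mirror about the vertical (left-right) axis, averaged per region."""
    mirrored = feature_img[:, ::-1]            # flip about the vertical axis
    asym = np.abs(feature_img - mirrored)      # 0 where perfectly symmetric
    region_ids = [r for r in np.unique(labels) if r != 0]
    means = ndimage.mean(asym, labels=labels, index=region_ids)
    return dict(zip(region_ids, means))

# Example: greylevel symmetry (GS) on a toy registered greylevel image.
gl = np.random.rand(64, 64)
lab = np.zeros((64, 64), int)
lab[20:40, 10:25] = 1          # a region left of centre
print(symmetry_feature(gl, lab))
```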
Proposed Feature Extraction Method
- Spatial Location

- Define a coordinate system $(r, \theta)$ using a "cone". Design parameter: $y_{centre}$.

[Figure: cone-based coordinate system over the model shape]
44
Proposed Feature Extraction Method
- Spatial Location

- Spatial Radius (SR): 0 at the origin, 1 at the perimeter.
- Spatial Angle (SA): 0 at the top, 1 at the bottom (see the sketch below).

[Images: Spatial Radius (SR) and Spatial Angle (SA) maps, pixel-based before inverse-deformation, pixel-based, and region-based]

Overview
45
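A sketch of one possible reading of these two features (not the thesis code): the cone apex is placed at (y_centre, horizontal image centre) on the registered image, SA is the angle from the upward vertical normalized to [0, 1], and SR is the radius normalized by the largest radius inside the prostate mask at a similar angle. The apex location, angle convention, and per-angle normalization are all my assumptions.

```python
import numpy as np

def spatial_location(mask, y_centre=0.0, n_bins=90):
    """Spatial Radius (SR) and Spatial Angle (SA) maps for a registered
    prostate mask, under the assumptions stated above."""
    h, w = mask.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    dy, dx = yy - y_centre, xx - w / 2.0
    r = np.hypot(dx, dy)
    # Angle from the upward vertical: SA = 0 toward the top, 1 toward the bottom.
    sa = np.arctan2(np.abs(dx), -dy) / np.pi
    # SR: radius normalized by the largest in-mask radius at a similar angle,
    # so SR = 0 at the apex and SR = 1 on the prostate outline.
    bins = np.minimum((sa * n_bins).astype(int), n_bins - 1)
    r_max = np.ones(n_bins)
    for b in range(n_bins):
        inside = mask & (bins == b)
        if inside.any():
            r_max[b] = r[inside].max()
    sr = np.clip(r / r_max[bins], 0.0, 1.0)
    return sr, sa

# Example on a toy elliptical "prostate" mask.
yy, xx = np.mgrid[0:100, 0:100]
mask = ((xx - 50) / 35.0) ** 2 + ((yy - 55) / 30.0) ** 2 <= 1.0
sr, sa = spatial_location(mask, y_centre=0.0)
```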
Proposed Feature Extraction Method
- Region Geometry

- Region Area (RA) = number of pixels.
- Region Roundness (RR) = $\dfrac{2\sqrt{\pi \cdot \text{RegionArea}}}{\text{RegionPerimeter}}$, i.e. "perimeter of a circle with the same area" divided by "perimeter of the region" (see the sketch below).

[Images: Region Area (RA) and Region Roundness (RR) maps]

Overview
46
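A minimal sketch of these two geometric features (my reconstruction); the perimeter estimate uses a simple boundary-pixel count, which is only one of several possible choices.

```python
import numpy as np
from scipy import ndimage

def region_geometry(region_mask):
    """Region Area (RA) and Region Roundness (RR) of a binary region.
    RR = perimeter of the equal-area circle / perimeter of the region."""
    area = int(region_mask.sum())
    # Perimeter estimate: count region pixels that touch the background
    # (a crude choice; other perimeter estimators would also work).
    eroded = ndimage.binary_erosion(region_mask)
    perimeter = int((region_mask & ~eroded).sum())
    roundness = 2.0 * np.sqrt(np.pi * area) / perimeter
    return area, roundness

# Example: a filled disc should have roundness close to 1.
yy, xx = np.mgrid[0:64, 0:64]
disc = (xx - 32) ** 2 + (yy - 32) ** 2 <= 20 ** 2
print(region_geometry(disc))
```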
Proposed Feature Extraction Method
- Feature Evaluation

- How to fine-tune the design parameters?
- How to evaluate each feature?
- How to compare the features?

[Images: original TRUS, expected cancerous region, SR feature map, ASM feature map]
47
Proposed Feature Extraction Method
- PDF Estimation

- For each feature we can analyze its probability density function (PDF).
- Parzen estimation is used (see the sketch below).

[Plot: pixel-based PDFs of the greylevel feature, P(x|Cancerous), P(x|Benign), and P(x)]
48
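A minimal sketch of Parzen (kernel density) estimation for the class-conditional PDFs (my reconstruction; the Gaussian kernel, bandwidth, and toy class priors are assumptions).

```python
import numpy as np

def parzen_pdf(samples, grid, h=0.05):
    """Parzen estimate of a 1-D PDF with a Gaussian kernel of bandwidth h."""
    samples = np.asarray(samples, float)[None, :]          # (1, N)
    grid = np.asarray(grid, float)[:, None]                # (G, 1)
    k = np.exp(-0.5 * ((grid - samples) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return k.mean(axis=1)                                  # average over samples

# Example: class-conditional PDFs of a feature on [0, 1].
x = np.linspace(0, 1, 200)
cancerous = np.random.beta(2, 6, 300)     # toy "cancerous" feature values
benign = np.random.beta(6, 2, 700)        # toy "benign" feature values
p_c = parzen_pdf(cancerous, x)            # P(x | Cancerous)
p_b = parzen_pdf(benign, x)               # P(x | Benign)
p = 0.3 * p_c + 0.7 * p_b                 # P(x) under toy priors
```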
Proposed Feature Extraction Method
- MI Evaluation

Entropy:
- $H(C) = -\sum_{c \in C} p(c) \log p(c)$
- Measures the degree of uncertainty.

Mutual information between a feature and the class (see the sketch below):
- $MI(F; C) = H(C) - H(C \mid F)$
- Measures the decrease in entropy when a feature F is introduced.
- Measures the interdependence between class and feature.
- Bounds: $0 \le H(C \mid F) \le H(C)$, so $0 \le MI(F; C) \le H(C)$.
49
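A minimal sketch of MI(F;C) for a discretized feature (my reconstruction; the histogram-based estimate and the number of bins are assumptions).

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability vector (0 log 0 treated as 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(feature, labels, bins=32):
    """MI(F;C) = H(C) - H(C|F), estimated from a joint histogram of a
    discretized feature F and a binary class label C."""
    edges = np.linspace(feature.min(), feature.max(), bins)
    f_bins = np.digitize(feature, edges)
    joint = np.zeros((bins + 2, 2))
    for fb, c in zip(f_bins, labels):
        joint[fb, int(c)] += 1
    joint /= joint.sum()
    p_c = joint.sum(axis=0)                      # P(C)
    p_f = joint.sum(axis=1)                      # P(F)
    h_c = entropy(p_c)
    # H(C|F) = sum over f of P(f) * H(C | F = f)
    h_c_given_f = sum(p_f[i] * entropy(joint[i] / p_f[i])
                      for i in range(len(p_f)) if p_f[i] > 0)
    return h_c - h_c_given_f

# Example: a feature that partly separates the two classes.
labels = np.r_[np.zeros(500), np.ones(500)]
feature = np.r_[np.random.normal(0.3, 0.1, 500), np.random.normal(0.6, 0.1, 500)]
print(mutual_information(feature, labels))
```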
Proposed Feature Extraction Method
- Feature Design Parameters

Using MI(F;C), the optimal design parameter for each feature can be selected more objectively.

Feature   Design Parameters                            Chosen Design Parameter
GL        σ ∈ {0.25, 0.5, 0.75, 1, 2, 4}               σ = 1
CON       wROI ∈ {3, 7, 11, 15, 19}                    wROI = 9
MEA       wROI ∈ {3, 7, 11, 15, 19}                    wROI = 9
ENT       wROI ∈ {3, 7, 11, 15, 19}                    wROI = 7
IDM       wROI ∈ {3, 7, 11, 15, 19}                    wROI = 7
ASM       wROI ∈ {3, 7, 11, 15, 19}                    wROI = 7
SR, SA    ycentre ∈ {0, 10, 15, 20, 25, 30, 50, 75}    ycentre = 0
GS, TS    n/a                                          n/a
RA, RR    n/a                                          n/a
50
Proposed Feature Extraction Method
- Feature Selection

Select only a subset of the features:
- For efficiency, and sometimes accuracy.
- Need to eliminate:
  - uninformative features … low MI(F;C).
  - redundant features … high MI(F1;F2).
51
Proposed Feature Extraction Method
- Feature Selection

Use MI(F;C) to eliminate uninformative features.

          Pixel-Based                        Region-Based
Feature   MI(F;C)   H(F)     %               MI(F;C)   H(F)     %
GL        0.0488    0.5402    9.0%           0.0524    0.5402    9.7%
CON       0.0430    0.5402    8.0%           0.1079    0.5402   20.0%
MEA       0.0575    0.5402   10.6%           0.1091    0.5402   20.2%
ENT       0.0922    0.5402   17.1%           0.1118    0.5402   20.7%
IDM       0.0700    0.5402   13.0%           0.0757    0.5402   14.0%
ASM       0.0745    0.5402   13.8%           0.0824    0.5402   15.3%
SR        0.0590    0.5028   11.7%           0.1184    0.5028   23.5%
SA        0.0228    0.5028    4.5%           0.0688    0.5028   13.7%
GS        0.0045    0.5028    0.9%           0.0342    0.5028    6.8%
TS        0.0350    0.5028    7.0%           0.0627    0.5028   12.5%
RA        n/a       n/a       n/a            0.1120    0.5402   20.7%
RR        n/a       n/a       n/a            0.0734    0.5402   13.6%

Back
52
Proposed Feature Extraction Method
- Feature Selection

Use MI(F1;F2) to eliminate redundant features.

MI(F1;F2) (%):
       GL    CON   MEA   ENT   IDM   ASM   SR    SA    GS    TS    RA    RR
GL   100.0  58.3  56.4  55.7  57.8  55.1  49.9  51.7  48.5  45.7  64.0  59.7
CON   58.3 100.0  74.4  58.7  58.5  54.4  56.5  60.4  48.2  36.6  64.2  67.4
MEA   56.4  74.4 100.0  53.2  53.9  52.2  54.7  58.4  46.3  37.0  62.8  66.8
ENT   55.7  58.7  53.2 100.0  51.2  83.2  52.4  57.1  43.6  36.5  63.8  63.9
IDM   57.8  58.5  53.9  51.2 100.0  44.4  48.1  52.9  41.3  35.6  62.1  58.5
ASM   55.1  54.4  52.2  83.2  44.4 100.0  46.9  54.5  40.6  33.6  62.9  59.4
SR    49.9  56.5  54.7  52.4  48.1  46.9 100.0  69.2  65.8  61.5  60.3  58.7
SA    51.7  60.4  58.4  57.1  52.9  54.5  69.2 100.0  69.8  65.3  63.5  61.5
GS    48.5  48.2  46.3  43.6  41.3  40.6  65.8  69.8 100.0  52.2  52.6  49.2
TS    45.7  36.6  37.0  36.5  35.6  33.6  61.5  65.3  52.2 100.0  42.7  37.5
RA    64.0  64.2  62.8  63.8  62.1  62.9  60.3  63.5  52.6  42.7 100.0  67.8
RR    59.7  67.4  66.8  63.9  58.5  59.4  58.7  61.5  49.2  37.5  67.8 100.0

Feature   GL    CON    MEA    ENT    IDM    ASM    SR     SA     GS     TS     RA     RR
MI(F;C)   9.7%  20.0%  20.2%  20.7%  14.0%  15.3%  23.5%  13.7%  6.8%   12.5%  20.7%  13.6%
53
Proposed Feature Extraction Method
- Feature Selection

Checking the feature selection visually:

[Images: TRUS, expected cancerous region, and the feature maps GL, CON, MEA, ENT, IDM, ASM, SR, SA, GS, TS, RA, RR]

Overview
54
Proposed Feature Extraction Method
- Fuzzy Inferencing System

- Each feature by itself is not discriminative enough.
- Need to find the relationship between the selected features by analyzing them jointly (collectively).
- This thesis proposes to use a Fuzzy Inferencing System (FIS).
- The idea is to come up with a set of fuzzy rules that relate all the selected features.
55
Proposed Feature Extraction Method
- Fuzzy Inferencing System
56
Proposed Feature Extraction Method
- Membership Functions

Design the breakpoints of the membership functions using the PDFs:
- Inspect local minima.
- Inspect intersections.
- Semi-automatic.

[Plot: P(x|Cancerous), P(x|Benign), and P(x) with candidate breakpoints annotated]
57
Proposed Feature Extraction Method
- Membership Functions

Chosen breakpoints and fuzziness (see the sketch below):

Feature   Breakpoints                                                   Fuzziness
GL        [0, 0.778, 0.9225, 1]                                         0.05
CON       [0.03, 0.0462, 0.0659, 0.0722, 0.0829, 0.12]                  0.001
MEA       [0, 0.1097, 0.1493, 0.1788, 0.1919, 0.2028, 0.2188, 0.35]     0.005
ENT       [0.3, 0.4182, 0.4696, 0.5136, 0.5430, 0.5650, 0.6603, 0.8]    0.01
IDM       [0, 0.0448, 0.0704, 0.1153, 0.4546, 0.7]                      0.01
ASM       [0, 0.0422, 0.0548, 0.0928, 0.1392, 0.2741, 0.45]             0.005
SR        [0, 0.4584, 0.5051, 0.5891, 0.7105, 0.7572, 0.8319, 1]        0.01
SA        [0, 0.3700, 0.7600, 1]                                        0.05
GS        [0, 0.0464, 0.0635, 0.4]                                      0.005
TS        [0, 0.0373, 0.0625, 0.1042, 0.1319, 0.1736, 0.3]              0.001
RA        [0, 0.0116, 0.0345, 0.0567, 0.0682, 0.08]                     0.001
RR        [0, 0.1207, 0.1637, 0.1837, 0.2267, 0.4]                      0.005
58
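A minimal sketch of how breakpoints plus a fuzziness value could define overlapping trapezoidal membership functions (my interpretation, not necessarily the thesis implementation): each pair of consecutive breakpoints defines one MF, and the fuzziness widens the transitions around the interior breakpoints.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising on [a, b], flat on [b, c],
    falling on [c, d]."""
    left = np.clip((x - a) / max(b - a, 1e-12), 0, 1)
    right = np.clip((d - x) / max(d - c, 1e-12), 0, 1)
    return np.minimum(left, right)

def membership_functions(breakpoints, fuzziness):
    """One MF per interval between consecutive breakpoints; neighbouring MFs
    overlap by `fuzziness` around each interior breakpoint."""
    mfs = []
    for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
        mfs.append(lambda x, lo=lo, hi=hi, f=fuzziness:
                   trapezoid(x, lo - f, lo + f, hi - f, hi + f))
    return mfs

# Example: the SA feature with breakpoints [0, 0.37, 0.76, 1] and fuzziness 0.05.
sa_mfs = membership_functions([0, 0.37, 0.76, 1], 0.05)
x = np.linspace(0, 1, 5)
print([np.round(mf(x), 2) for mf in sa_mfs])
```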
Proposed Feature Extraction Method
- Fuzzy Rules

Generate fuzzy rules for each image (see the sketch below):

[Figure: feature PDFs p1(x), p2(x), … overlaid with membership functions MF1–MF5 and the cancerous/benign ratio per MF (e.g. Ratio3 = 0.6, i.e. 60% vs. 40%)]

Rule 1: if (FEATURE1 is MF2) and (FEATURE2 is MF3) then (CANCEROUS)
Rule 2: if (FEATURE1 is MF3) and (FEATURE2 is MF3) then (CANCEROUS)
Rule 3: if (FEATURE1 is MF3) and (FEATURE2 is MF3) then (LIKELY-CANCEROUS)
Rule 4: if (FEATURE1 is MF1) and (FEATURE2 is MF2) then (BENIGN)

Overview
59
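A minimal sketch of how such rules could be evaluated with min/max fuzzy inference (my assumption about the connectives; the thesis FIS may be configured differently): each rule's firing strength is the minimum of its antecedent memberships, and strengths are aggregated per output class with a maximum. The rules and membership functions below are illustrative only.

```python
import numpy as np

def evaluate_rules(rules, feature_values):
    """Mamdani-style evaluation: AND = min over antecedents, aggregation = max
    over rules with the same consequent. Returns class -> firing strength."""
    strengths = {}
    for antecedents, consequent in rules:
        firing = min(mf(feature_values[name]) for name, mf in antecedents.items())
        strengths[consequent] = max(strengths.get(consequent, 0.0), firing)
    return strengths

# Toy membership functions for two features on [0, 1].
low = lambda x: float(np.clip((0.5 - x) / 0.5, 0, 1))
high = lambda x: float(np.clip((x - 0.5) / 0.5, 0, 1))

rules = [
    ({"FEATURE1": high, "FEATURE2": high}, "CANCEROUS"),
    ({"FEATURE1": high, "FEATURE2": low}, "LIKELY-CANCEROUS"),
    ({"FEATURE1": low, "FEATURE2": low}, "BENIGN"),
]
print(evaluate_rules(rules, {"FEATURE1": 0.8, "FEATURE2": 0.3}))
```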
Analysis

Some successful sample results:

[Images: original TRUS, expected cancerous region, and the proposed feature image]
60
Analysis

Some less successful sample results:

[Images: original TRUS, expected cancerous region, and the proposed feature image]
61
Analysis

Comparison of the proposed feature extraction method with other methods.

Individual region-based features:

Feature   GL    CON    MEA    ENT    IDM    ASM    SR     SA     GS     TS     RA     RR
MI(F;C)   9.7%  20.0%  20.2%  20.7%  14.0%  15.3%  23.5%  13.7%  6.8%   12.5%  20.7%  13.6%

Combined feature:

          GL + GLDM   Proposed Feature Set
Image 1   36.75%      40.30%
Image 2   53.72%      53.48%
Image 3    5.76%      13.28%
Image 4   11.32%      32.13%
Image 5   34.57%      47.97%
Image 6   16.28%      20.81%
Image 7   18.44%      28.30%
Image 8   12.22%      10.24%
Image 9    1.38%      12.53%
Overall   16.91%      26.48%

13% improvement due to FIS!
57% improvement due to new features!

Outline
62
Conclusions

- Large-Fluid-Landmark Deformation was used to deform prostates into a common model shape.
- PDFs were used to:
  - Evaluate each feature individually using MI(F;C).
  - Eliminate redundant features using MI(F1;F2).
  - Design membership functions semi-automatically.
  - Generate fuzzy rules automatically.
- The fuzzy rules mimic radiologists' medical knowledge.
- 13% improvement due to the FIS.
- 57% improvement due to the new features, especially the Spatial Location features.

Outline
63
Future Work

- Investigate region segmentation methods that can best serve the proposed feature extraction method.
- Fully automate the membership function design using PDFs.
- Define optimal thresholds for classifying the new feature.
64
Questions and Comments?

…
65
References

Medical Basics:
- M. D. Rifkin, "Ultrasound of the Prostate: Imaging in the Diagnosis and Therapy of Prostatic Disease", 2nd Edition, Lippincott Williams and Wilkins, 1996.

Texture Analysis:
- A. H. Mir, M. Hanmandlu, S. N. Tandon, "Texture Analysis of CT Images", IEEE Engineering in Medicine and Biology, November/December 1995.
- K. N. B. Prakash, A. G. Ramakrishnan, S. Suresh, T. W. P. Chow, "Fetal Lung Maturity Analysis Using Ultrasound Image Features", IEEE Transactions on Information Technology in Biomedicine, Vol. 6, No. 1, March 2002.
- O. Basset, Z. Sun, J. L. Mestas, G. Gimenez, "Texture Analysis of Ultrasound Images of the Prostate by Means of Co-occurrence Matrices", Ultrasonic Imaging 15, 218-237 (1993).

Image Registration:
- S. C. Joshi and M. I. Miller, "Landmark Matching via Large Deformation Diffeomorphisms", IEEE Transactions on Image Processing, Vol. 9, No. 8, August 2000.

Symmetry:
- Q. Li, S. Katsuragawa, K. Doi, "Improved contralateral subtraction images by use of elastic matching technique", Medical Physics, 27 (8), August 2000.

Feature Selection:
- R. Battiti, "Using Mutual Information for Selecting Features in Supervised Neural Net Learning", IEEE Transactions on Neural Networks, Vol. 5, No. 4, July 1994.

Please see my thesis for all other references.
66