Transcript Slide 1

Homework A
Let us use the log-likelihood function to derive an on-line adaptation rule
analogous to LMS. Our goal is to update our estimate of the weights and our
estimate of the noise variance σ² with each new data point.
l(w, \sigma; D) = -\frac{1}{2\sigma^2}\,(y - Xw)^T (y - Xw) - \sum_{i=1}^{n} \left[ \log(2\pi)^{1/2} + \log \sigma \right]
1. Find the first and second derivatives of the log-likelihood with respect to the
weights and, using Newton-Raphson, write an update rule for w (a generic
Newton-Raphson step is sketched after this list).
2. Use the same technique to derive an update rule for the estimate of σ²
(hint: take derivatives with respect to σ²).
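For reference, a generic Newton-Raphson step for a parameter θ of the log-likelihood l can be written as below; the exercise amounts to working out the corresponding first and second derivatives for w and for σ² (this is only the general template, not the worked answer):

\theta^{(t+1)} = \theta^{(t)} - \left[ \frac{\partial^2 l}{\partial \theta \, \partial \theta^T} \right]^{-1} \frac{\partial l}{\partial \theta} \Bigg|_{\theta = \theta^{(t)}}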
Homework B
Using simulations, determine whether the ML estimate of σ² is unbiased or not.
x = \begin{bmatrix} 1 & x & x^2 \end{bmatrix}^T, \qquad w^* = \begin{bmatrix} 0.2 & 0.5 & 0.1 \end{bmatrix}^T

The "true" underlying process:
y^{*(i)} = w^{*T} x^{(i)}

What you measure:
y^{(i)} = y^{*(i)} + \epsilon, \qquad \epsilon \sim N(0, \sigma^2), \qquad \sigma^2 = 1

D = \left\{ (x^{(1)}, y^{(1)}), (x^{(2)}, y^{(2)}), \ldots, (x^{(n)}, y^{(n)}) \right\}

Your model of the process:
y^{(i)} = w^T x^{(i)} + \epsilon
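A minimal sketch of how one batch of data could be drawn from this process. The function name generate_batch, the use of NumPy, and the uniform sampling range for the scalar input x are assumptions; the slide does not specify how x is sampled.

import numpy as np

def generate_batch(n, sigma2=1.0, rng=None):
    """Draw one batch of n (x, y) pairs from the 'true' process.

    Each row of X is [1, x, x^2]; y = X w* + noise, noise ~ N(0, sigma2).
    The uniform range for the scalar x is an assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    w_star = np.array([0.2, 0.5, 0.1])                  # true weights w*
    x = rng.uniform(-1.0, 1.0, size=n)                  # scalar inputs (assumed distribution)
    X = np.column_stack([np.ones(n), x, x**2])          # feature vectors [1, x, x^2]
    y = X @ w_star + rng.normal(0.0, np.sqrt(sigma2), size=n)  # measured y(i)
    return X, y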
Start with n=5 data points.
Compute σ²_ML for a given batch of data and then repeat for another
batch. Do this a number of times to get an average value for σ²_ML.
Now increase n and repeat the procedure.
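A possible sketch of this procedure, reusing the generate_batch function above. It assumes the usual ML estimator for this Gaussian model, σ²_ML = (1/n)·||y − X w_ML||² with w_ML obtained by least squares; the batch count and the list of n values are arbitrary choices.

def sigma2_ml(X, y):
    """ML estimate of the noise variance: (1/n) * sum of squared residuals."""
    w_ml, *_ = np.linalg.lstsq(X, y, rcond=None)        # maximum-likelihood weights
    return np.mean((y - X @ w_ml) ** 2)

def average_sigma2_ml(n, n_batches=5000, rng=None):
    """Average sigma^2_ML over many independent batches of size n."""
    rng = np.random.default_rng(0) if rng is None else rng
    return np.mean([sigma2_ml(*generate_batch(n, rng=rng)) for _ in range(n_batches)])

# Start with n = 5 and then increase n; compare against the true sigma^2 = 1.
for n in (5, 10, 20, 50, 100):
    print(n, average_sigma2_ml(n))

Comparing the averaged estimate against the true σ² = 1 as n grows addresses the question posed in Homework B.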
n5