Predicting Solar Generation from Weather Forecasts
Advisor: Professor Arye Nehorai
Chenlin Wu, Yuhan Lou
Department of Electrical and Systems Engineering
Background
• Smart grid: increasing the contribution of renewable energy in the grid
• Solar generation: intermittent and non-dispatchable

Goals
• Creating automatic prediction models
• Predicting future solar power intensity given weather forecasts

Kernel Trick for SVR
The kernel trick is a way of mapping observations from a general set S (input space) into an inner product space V (a high-dimensional feature space), Φ: ℝ^p → ℝ^q. The SVR prediction then becomes
  f(x) = Σ_{i=1}^n (α_i − α_i*) k(x_i, x) + b, where k(x_i, x) = ⟨Φ(x_i), Φ(x)⟩.
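As a small illustration of the kernel identity above (the code and names are ours, not from the poster): for the homogeneous polynomial kernel k(x, z) = ⟨x, z⟩², the feature map Φ lists the degree-2 monomials, so evaluating the kernel in input space equals an inner product in the higher-dimensional feature space without ever forming Φ explicitly.

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for a 2-D input:
    Phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2), so <Phi(x), Phi(z)> = <x, z>^2."""
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

def k(x, z):
    """Polynomial kernel k(x, z) = <x, z>^2, computed in input space."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

lhs = k(x, z)                 # kernel in input space
rhs = np.dot(phi(x), phi(z))  # inner product in feature space
print(lhs, rhs)               # both equal <x, z>^2 = 4.0^2 = 16.0
```

The same identity is what lets SVR learn a nonlinear f(x) while only ever computing k(x_i, x).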
Data Source
• NREL National Solar Radiation Database, 1991–2010
• Hourly weather and solar intensity data for 20 years
• Station: ST LOUIS LAMBERT INT'L ARPT, MO
• Input (a combination of 9 weather metrics): Date, Time, Opaque Sky Cover, Dry-bulb Temperature, Dew-point Temperature, Relative Humidity, Station Pressure, Wind Speed, Liquid Precipitation Depth
• Output: amount of solar radiation (Wh/m²) received in a collimated beam on a surface normal to the sun
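A minimal sketch of how such hourly records could be assembled into a feature matrix for regression (the values and column ordering below are invented for illustration; only the list of 9 metrics comes from the poster):

```python
import numpy as np

# Hypothetical hourly records: each row holds the 9 weather metrics
# [month, hour, opaque_sky_cover, dry_bulb_C, dew_point_C,
#  rel_humidity, station_pressure_mbar, wind_speed_ms, precip_depth_mm]
records = np.array([
    [6, 11, 2.0, 28.5, 16.0, 47.0, 990.0, 3.1, 0.0],
    [6, 12, 5.0, 29.8, 16.4, 45.0, 989.0, 3.4, 0.0],
    [6, 13, 8.0, 30.2, 17.0, 44.0, 988.0, 4.0, 0.2],
])
# Target: solar radiation in Wh/m^2 (also invented)
dni = np.array([820.0, 610.0, 240.0])

# Standardize, so metrics on very different scales are comparable
std = records.std(axis=0)
std[std == 0] = 1.0          # guard against constant columns
X = (records - records.mean(axis=0)) / std
print(X.shape)               # (3, 9): n samples, 9 weather metrics
```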
Methods
• In our research, regression is used to learn a mapping from an input space of n-dimensional vectors to an output space of real-valued targets.
• We apply different regression methods, including:
  – Linear least squares regression
  – Support vector regression (SVR) using multiple kernel functions
  – Gaussian processes

Linear Model
  y = f(X) = Xβ + ε
where y ∈ ℝ^n: measurements (solar intensity)
  X ∈ ℝ^{n×(p+1)}: each row is a p-dimensional input (plus a constant term)
  β ∈ ℝ^{p+1}: unknown coefficients
  ε ∈ ℝ^n: random noise
Loss function (squared error): ‖y − ŷ‖² = ‖y − Xβ‖²
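The linear model and its squared-error fit can be sketched in a few lines of numpy (synthetic data standing in for the poster's features):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 9
A = rng.normal(size=(n, p))                    # p-dimensional inputs
X = np.hstack([np.ones((n, 1)), A])            # prepend constant column -> n x (p+1)
beta_true = rng.normal(size=p + 1)
y = X @ beta_true + 0.01 * rng.normal(size=n)  # y = X beta + eps

# Least squares estimate: minimize ||y - X beta||^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

mse = np.mean((y - X @ beta_hat) ** 2)
print(mse)   # on the order of the noise variance
```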
Support Vector Regression (SVR)
Given training data {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)}, the linear SVR model is
  f(x) = ⟨w, x⟩ + b
obtained by solving
  minimize   (1/2)‖w‖² + C Σ_{i=1}^n (ξ_i + ξ_i*)
  subject to y_i − f(x_i) ≤ ε + ξ_i
             f(x_i) − y_i ≤ ε + ξ_i*
             ξ_i, ξ_i* ≥ 0
Loss function (ε-insensitive):
  L(ξ) = 0 if |ξ| ≤ ε, and |ξ| − ε otherwise
The solution has the form w = Σ_{i=1}^n (α_i − α_i*) Φ(x_i).

Principal Component Analysis (PCA)
• Some weather metrics correlate strongly, such as temperature and time of day.
• Applying PCA removes this redundant information.
• [Figure] The graph shows the MSE for different input dimensions. The feature set with 8 dimensions performs best, with the lowest test error; as long as we keep more than 5 principal components, the errors are lower than those of linear regression.
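The PCA step can be sketched with numpy's SVD. Here two synthetic columns are made strongly correlated (time of day and temperature, echoing the example above); the remaining columns stand in for the other weather metrics:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = rng.uniform(0, 24, size=n)                 # time of day
temp = 15 + 8 * np.sin(np.pi * t / 24) + rng.normal(scale=0.5, size=n)
other = rng.normal(size=(n, 7))                # 7 further metrics
X = np.column_stack([t, temp, other])          # n x 9, with redundancy

Xc = X - X.mean(axis=0)                        # center before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 8                       # keep the best-performing 8 components
Z = Xc @ Vt[:k].T           # projected features, n x 8
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, round(explained, 4))
```

Because temperature largely duplicates time of day, dropping the smallest principal component loses very little variance.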
Gaussian Processes (GP)
GP regression model:
  y_i = f(x_i) + ε_i, where the noise ε_i ~ N(0, σ²)
• Assume a zero-mean GP prior distribution over inference functions f(·). In particular,
  (f(x_1), …, f(x_n)) ~ N(0, K), with K_{i,j} = Cov(f(x_i), f(x_j)) = K(x_i, x_j)
• To make predictions y* at test points X*, where y* = f(X*) + ε:
  [f; f*] ~ N(0, [K(X, X), K(X, X*); K(X*, X), K(X*, X*)]), with ε ~ N(0, σ²I)
• It follows that p(y* | D, X*) = N(μ, Σ), where
  μ = K(X*, X)[K(X, X) + σ²I]⁻¹ y
  Σ = K(X*, X*) − K(X*, X)[K(X, X) + σ²I]⁻¹ K(X, X*)

Experiments
• Predictions are made by the proposed methods.
• 20% of the data is used to train and 10% of the data is used to test.
• MSE is used to evaluate the regression results. The prediction errors of the 3 different methods:

  Method             MSE
  Linear Regression  215.7884
  SVM regression     130.1537
  SPGP               122.9167
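The GP predictive equations above translate directly to numpy. A minimal sketch, with a squared-exponential kernel and noise level chosen only for illustration:

```python
import numpy as np

def sqexp(A, B, ell=1.0):
    """Squared-exponential covariance K(A, B) between 1-D input arrays."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(2)
X = np.linspace(0, 5, 20)                 # training inputs
y = np.sin(X) + 0.05 * rng.normal(size=20)
Xs = np.array([1.3, 2.7, 4.1])            # test points X*
sigma2 = 0.05**2

K = sqexp(X, X)
Ks = sqexp(Xs, X)                         # K(X*, X)
Kss = sqexp(Xs, Xs)                       # K(X*, X*)

# mu = K(X*,X) [K(X,X) + sigma^2 I]^-1 y
alpha = np.linalg.solve(K + sigma2 * np.eye(20), y)
mu = Ks @ alpha
# Sigma = K(X*,X*) - K(X*,X) [K(X,X) + sigma^2 I]^-1 K(X,X*)
Sigma = Kss - Ks @ np.linalg.solve(K + sigma2 * np.eye(20), Ks.T)

print(mu)   # close to sin at the test points
```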
Sparse Pseudo-input GP (SPGP)
• GPs are prohibitive for large data sets due to the inversion of the covariance matrix.
• Consider a model parameterized by a pseudo data set D̄ of size m ≪ n, where n is the number of real data points.
• This reduces the training cost from O(n³) to O(m²n), and the prediction cost from O(n²) to O(m²).
• Pseudo data set D̄: X̄ = {x̄_m}_{m=1…M}, f̄ = {f̄_m}_{m=1…M}

SPGP regression
• Prior on pseudo targets: p(f̄ | X̄) = N(0, K_M)
• Likelihood:
  p(y | X, X̄, f̄) = N(K_NM K_M⁻¹ f̄, Λ + σ²I), where Λ = diag(K_N − K_NM K_M⁻¹ K_MN)
• Posterior distribution over f̄:
  p(f̄ | D, X̄) = N(K_M Q_M⁻¹ K_MN (Λ + σ²I)⁻¹ y, K_M Q_M⁻¹ K_M)
  where Q_M = K_M + K_MN (Λ + σ²I)⁻¹ K_NM
• Given a new input x*, the predictive distribution is
  p(y* | D, x*) = ∫ p(y* | x*, X̄, f̄) p(f̄ | D, X̄) df̄ = N(μ*, Σ*)
  where μ* = K_{*M} Q_M⁻¹ K_MN (Λ + σ²I)⁻¹ y
        Σ* = K_{**} − K_{*M} (K_M⁻¹ − Q_M⁻¹) K_{M*} + σ²

[Figure] Following are 24-hour predictions by the three methods, with their prediction errors:

  Method  Predicting Error
  LR      191.5258
  SVR     93.2988
  GP      90.2835
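A numpy sketch of the SPGP predictive mean from the equations above (squared-exponential kernel assumed). In the limit where the pseudo-inputs equal all training inputs, Λ = 0 and the SPGP mean reduces to the full GP mean, which gives a handy sanity check:

```python
import numpy as np

def sqexp(A, B, ell=1.0):
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ell**2)

def spgp_mean(X, y, Xbar, Xs, sigma2, jitter=1e-8):
    """Predictive mean mu* = K_{*M} Q_M^{-1} K_MN (Lam + sigma^2 I)^{-1} y."""
    KM = sqexp(Xbar, Xbar) + jitter * np.eye(len(Xbar))   # K_M
    KNM = sqexp(X, Xbar)                                   # K_NM
    KsM = sqexp(Xs, Xbar)                                  # K_{*M}
    # Lam = diag(K_N - K_NM K_M^{-1} K_MN); diag(K_N) = 1 for this unit-variance kernel
    Lam = 1.0 - np.sum(KNM * np.linalg.solve(KM, KNM.T).T, axis=1)
    Lam = np.clip(Lam, 0.0, None)
    D = Lam + sigma2                                       # diagonal of (Lam + sigma^2 I)
    QM = KM + KNM.T @ (KNM / D[:, None])                   # Q_M = K_M + K_MN D^-1 K_NM
    return KsM @ np.linalg.solve(QM, KNM.T @ (y / D))      # mu*

rng = np.random.default_rng(3)
X = np.linspace(0, 4, 9)
y = np.sin(X) + 0.05 * rng.normal(size=9)
Xs = np.array([1.1, 3.3])
sigma2 = 0.05**2

full_gp = sqexp(Xs, X) @ np.linalg.solve(sqexp(X, X) + sigma2 * np.eye(9), y)
sparse = spgp_mean(X, y, X.copy(), Xs, sigma2)   # pseudo-inputs = training inputs
print(full_gp, sparse)                           # the two means agree
```

In practice the pseudo-inputs X̄ would be far fewer than n and chosen (or optimized) to summarize the data, which is where the O(m²n) saving comes from.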
Conclusions
• Using machine learning to automatically model the function that predicts solar generation from weather forecasts leads to acceptable results.
• Gaussian processes achieved the lowest error among all the methods.