ASAC : Automatic Sensitivity Analysis for Approximate Computing
Pooja ROY, Rajarshi RAY, Chundong WANG, Weng Fai WONG
National University of Singapore
LCTES 2014
Why Approximate?
[Figure: Quality of Service scale — from high QoS down through a QoS band of relaxed accuracy to the lowest acceptable QoS]
Exploring Previous Works
[Figure: prior work spanning the stack — the Algorithm, Programming (API), Compilation, Architecture, and Circuit levels: Ansel et al. (CGO'11); Sampson et al. (PLDI'11); Carbin et al. (ISSTA'10); Baek et al. (PLDI'10); Carbin et al. (OOPSLA'13); Misailovic et al. (TECS'13); Zhu et al. (POPL'12); Sidiroglou-Douskos et al. (FSE'11); Esmaeilzadeh et al. (ASPLOS'12); Hoffmann et al. (ASPLOS'11); Gupta et al. (ISLPED'11); Sampson et al. (MICRO'13); Venkataramani et al. (MICRO'13); Chippa et al. (DAC'10); Kahng et al. (DAC'12)]
Approximation-based Programming Paradigm
New programming paradigm: explicit classification of program data (variables, methods, etc.)
[Figure: code enters a compilation framework supporting approximation, which separates approximable data from non-approximable data]
Need of Automation
[Figure: the original code must be rewritten using new language constructs before it can enter the approximation-aware compilation flow]
This requires:
• Programmer's annotations
• Provision of multiple versions
Need of Automation
Writing 'binutils' from scratch?
Expecting app developers to provide many versions?
Recompiling and testing 'Picasa' or 'VLC' with multiple QoS requirements?
Providing for entire Android/iOS kernels?
Our Approach : ASAC
Automatic Sensitivity Analysis
Statistical perturbation-based framework
Scalable
Specifically, considers internal program
data for approximation
Key Idea
[Figure: code → perturb each variable → perturbed output → compare with acceptable QoS → sensitivity]
Sensitivity: a particular variable's contribution towards the output
The variables are ranked based on sensitivity:
• Low-ranked variables can be approximated
• Higher-ranked variables are critical
Key Idea
[Figure: code → perturb each variable → perturbed output → compare with acceptable QoS → sensitivity]
How do we systematically perturb the variables?
How do we translate the perturbed output into a sensitivity ranking?
Hyperbox Sampling
double sum(){
    int i;
    double a = 0.1, sum = 0.0;
    for(i = 0; i < 10; i++){
        sum += a/10;
    }
    return sum;
}
[Figure: 3-D hyperbox with axes i, a, and sum]
Creating hyperbox with value range of each variable
Discretizing each dimension into 'k' intervals
Choosing samples based on Latin Hypercube Sampling
[Figure: a perturbed sample point in the hyperbox, with example values 0.2, 3, and 0.7]
Controlled perturbation
Perturbed Outputs
Rule 1
For a program with 'n' variables, discretization constant 'k', and 'm' randomly chosen points, the number of perturbed outputs is m · ∏_{i=0}^{n−1} (k − i) = m · k · (k−1) · … · (k−n+1)
Not trivial!
Perturbed Outputs
A 'good' sample falls within the QoS band; a 'bad' sample falls outside it.
A cumulative distribution function (CDF) is built for each variable over the good and bad samples.
Hypothesis Testing
The Kolmogorov-Smirnov test calculates the maximum distance between the two CDF curves.
Rule 2
The maximum distance between the curves is the sensitivity score for the variable. The higher the score, the more the variable contributes towards the program output.
Approximable vs. Critical
A variable with a sensitivity score > 0.5 is considered critical.
For evaluation:
• Mild error injection: 1/3 (or 1/2) of the approximable variables
• Medium error injection: 1/6 of the approximable variables
• Aggressive error injection: all of the approximable variables
Programs:
• SciMark2
• MiBench (JPEG)
• SPEC2006 (464.h264ref)
ASAC Correctness
ASAC Correctness
Bench
marks
True
Positive
False
Positive
False
Negative
True
Negative
Precision
Recall
Accuracy
SOR
5
0
1
2
0.83
1
0.88
SMM
1
0
1
6
0.50
1
0.88
Monte
2
0
1
2
0.67
1
0.80
FFT
15
2
2
12
0.88
0.88
0.87
LU
7
1
1
5
0.88
0.88
0.86
Average 0.75
0.95
0.86
*as compared to ‘manually annotated baseline’ (EnerJ, PLDI’11)
ASAC : JPEG
[Figure: JPEG images — input, encode (mild), decode (mild), encode (aggressive), decode (aggressive)]
ASAC : H264
Error Rate   SNR_Y  SNR_U  SNR_V  BitRate
No Error     36.67  40.74  42.32   149.62
Mild         36.69  37.64  37.65   146.60
Medium       34.05  36.92  36.79   147.12
Aggressive   29.78  32.89  32.99   146.03
ASAC Runtime
ASAC Sanity Check
• JPEG: encode and decode with errors injected into variables marked as 'non-approximable'
• H264: application crash
Concluding
ASAC
• Automatic classification of approximable and non-approximable data
• Scalable
• No profiling
• Can be applied to programs without available source code
Approximation
• Saves energy without performance loss
Thank you