
CS193H: High Performance Web Sites

Lecture 25:

2008 State of Performance

Steve Souders Google [email protected]

announcements

Final exam:
• not cumulative – only includes material since the midterm
• about the same size as the midterm
• 2-hour time limit (but should only take 1 hour)
• two time slots, pick the one you want:
  - Tues Dec 9, 12:15-2:15 – Gates B03
  - Fri Dec 12, 12:15-2:15 – Gates B01

• State of Performance
• Web 100
• Future of Performance
• Steve's Little Red Book

Web 100

the data was very noisy; issues due to an unclear assignment and variability of test conditions:
• HTML, JS, CSS compressed or uncompressed?
• logged in or not logged in?
• web site content changes
• ads
• for timing, variability of testers' setup

Web 100 sites

160 web sites from Alexa Top 500:

4shared.com

about.com

aim.com

alibaba.com

alice.it

allegro.pl

amazon.com

anonym.to

answers.com

aol.com

apple.com

ask.com

att.com

att.net

badoo.com

baidu.com

bankofamerica.com

bbc.co.uk

bebo.com

bestbuy.com

blogger.com

break.com

brothersoft.com

careerbuilder.com

chase.com

circuitcity.com

cnn.com

comcast.net

conduit.com

craigslist.org

dailymotion.com

dell.com

deviantart.com

digg.com

disney.go.com

download.com

easy-share.com

easybizchina.com

ebay.com

en.netlog.com

expedia.com

facebook.com

fastclick.com

filefactory.com

flickr.com

foxsports.com

gamefaqs.com

gamespot.com

geocities.com

globo.com

go.com

google.com

googlesyndication.com

hi5.com

hp.com

hulu.com

icq.com

ideo.com

ig.com.br

ign.com

imageshack.us

imagevenue.com

imdb.com

imeem.com

indiatimes.com

isohunt.com

jayisgames.com

last.fm

latimes.com

linkbucks.com

linkedin.com

live.com

livejournal.com

mapquest.com

mediafire.com

metrolyrics.com

microsoft.com

miniclip.com

mininova.org

mixi.jp

mlb.com

monster.com

mozilla.com

msn.com

multiply.com

myspace.com

nba.com

nbcolympics.com

ndtv.com

neopets.com

netflix.com

newegg.com

newgrounds.com

nfl.com

noaa.gov

nytimes.com

onemanga.com

onet.pl

opendns.com

orange.fr

orbitz.com

partypoker.com

people.com

people.com

perfspot.com

pogo.com

qq.com

quizrocket.com

rediff.com

reference.com

saatchi-gallery.co.uk

sfgate.com

shopping.com

skype.com

skyrock.com

slickdeals.net

slide.com

smileycentral.com

softonic.com

sonico.com

sourceforge.net

sportsillustrated.cnn.com

sportsline.com

sweetim.com

tagged.com

target.com

telegraph.co.uk

terra.com.br

thefreedictionary.com

thepiratebay.org

theplanet.com

tinypic.com

tribalfusion.com

tv.com

typepad.com

univision.com

uol.com.br

ups.com

usps.com

veoh.com

verizon.net

verizonwireless.com

vmn.net

wachovia.com

walmart.com

wamu.com

washingtonpost.com

weather.com

wikia.com

wikipedia.org

worldofwarcraft.com

wowarmory.com

wunderground.com

xanga.com

yahoo.com

yelp.com

youtube.com

zedo.com

ziddu.com

zshare.net

Web 100 stats

average size: 466K
average # of requests: 70
average response time: 4.75 seconds
average backend: 337 ms
average frontend: 4431 ms
average ratio: 11%
average YSlow grade: 54

corr(size, time) = 0.43
corr(requests, time) = 0.52
corr(yslow, time) = -0.43
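
The correlation values are presumably plain Pearson coefficients. As an illustration only (not the class's actual analysis script, and the array names are hypothetical), a minimal JavaScript sketch of how corr(size, time) could be computed from per-site measurements:

```javascript
// Minimal sketch: Pearson correlation over two equal-length arrays,
// e.g. page size (KB) and response time (ms) for each of the Web 100 sites.
// pageSizesKB and responseTimesMs are hypothetical placeholder arrays.
function pearson(x, y) {
  var n = x.length;
  var sumX = 0, sumY = 0, sumXY = 0, sumX2 = 0, sumY2 = 0;
  for (var i = 0; i < n; i++) {
    sumX  += x[i];
    sumY  += y[i];
    sumXY += x[i] * y[i];
    sumX2 += x[i] * x[i];
    sumY2 += y[i] * y[i];
  }
  var num = n * sumXY - sumX * sumY;
  var den = Math.sqrt((n * sumX2 - sumX * sumX) * (n * sumY2 - sumY * sumY));
  return den === 0 ? 0 : num / den;
}

// e.g. pearson(pageSizesKB, responseTimesMs) would be expected to land near 0.43
```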

Future of Performance

• developers think "Web 2.0"
• visibility into the browser
• deferred JavaScript
• prefetch services
• speed as a distinguishing feature
• standards, benchmarks
• user-driven transparency
• performance off the desktop

web devs think "Web 2.0"

the days of Web 1.0 are fading away...

but web developers still think in terms of the page reloading on every user action
Web 2.0 pages may persist for hours
need to evolve the way we program to keep our eyes on the long run, for example (see the sketch after this list):
• watch for memory leaks
• # of DOM elements
• optimize JS and CSS for ongoing DHTML
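
As one hedged illustration of that long-run hygiene (my own sketch, not from the lecture), a long-lived page can periodically sample its own DOM element count; an unbounded climb usually means ongoing DHTML code is creating nodes without releasing them:

```javascript
// Minimal sketch: watch DOM growth in a long-lived Web 2.0 page.
// The threshold and interval are arbitrary assumptions for illustration.
var domWatcher = {
  limit: 5000,        // rough ceiling before we start worrying
  intervalMs: 60000,  // sample once a minute

  start: function () {
    var self = this;
    setInterval(function () {
      var count = document.getElementsByTagName('*').length;
      if (window.console && count > self.limit) {
        console.warn('DOM element count is ' + count +
                     ' - possible leak from ongoing DHTML updates');
      }
    }, self.intervalMs);
  }
};

domWatcher.start();
```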

visibility into the browser

hard to measure the exact things we're trying to optimize:
• HTML parsing
• CSS parsing
• JS parsing and execution (as the page loads)
• DOM manipulation
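
Lacking that visibility, about the best a developer can do today is bracket suspect work with timestamps. A minimal sketch (an illustration of the general technique, with hypothetical names, not a tool from the lecture):

```javascript
// Minimal sketch: crude manual instrumentation around a chunk of DOM work,
// since the browser exposes no built-in timings for parsing or execution.
function timeDomWork(label, work) {
  var start = new Date().getTime();
  work();                                  // e.g. build and insert a large fragment
  var elapsed = new Date().getTime() - start;
  if (window.console) {
    console.log(label + ': ' + elapsed + ' ms');
  }
}

// hypothetical usage
timeDomWork('render results table', function () {
  var div = document.createElement('div');
  div.innerHTML = '<table>...</table>';    // placeholder markup
  document.body.appendChild(div);
});
```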

web page profiler (concept)

• paint events
• memory
• CPU
• JavaScript
• CSS

deferred JavaScript

tools to automatically split a (huge) Web 2.0 JavaScript payload into a smaller initial module and larger later module(s), a la Doloto

http://research.microsoft.com/research/pubs/view.aspx?tr_id=1402

ability to specify defer using HTML
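
To make the idea concrete (a hedged sketch of the general technique, not Doloto's actual output or a proposed standard), the "later" module can be pulled in on demand by injecting a script element; the HTML-level counterpart is the script element's defer attribute, e.g. <script src="app.js" defer></script>:

```javascript
// Minimal sketch: load the large, non-critical module only when it is first needed.
// The URL, element id, and function names are hypothetical placeholders.
function loadLaterModule(src, onload) {
  var script = document.createElement('script');
  script.src = src;
  script.onload = onload;   // note: older IE would need onreadystatechange instead
  document.getElementsByTagName('head')[0].appendChild(script);
}

// hypothetical usage: defer the bulk of the app until the user first acts
document.getElementById('composeButton').onclick = function () {
  loadLaterModule('compose-module.js', function () {
    // the lazily loaded module would define startCompose()
    startCompose();
  });
};
```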