Transcript Slide 1

Managing and Measuring Intranet Content

Julie Poroznuk, ABC, CEBS, JP Communication, May 2007

Intranet Content

What is an intranet?

An intranet is an online presence secured behind the company’s firewall.

2

Intranet Content

Developing content and applications

The focus must be on business needs if the content is to have long-term value.

3

Intranet Content

Effective Content

is connected to the key objectives of the organization
is up to date
provides timely information
is meaningful to the people who will use it
is from a trusted source

4

Intranet Content

Developing content

Identify key work groups you want the intranet to serve:

Functional teams
Business units
Key projects

Ask the users:

What are their content needs?
What will have most impact on their performance?

5

Intranet Content

Sample questions

What are the most important things your team has to do over the next couple of years?

How could you personally be twice as effective?

If you could have anything you wanted to help you do your job better, what would it be?

What are the most frustrating time-wasters in your life?

6

Intranet Content

Classify the ideas according to:

how they will impact the team

how they will support the strategic goals

the cost and other required resources

7

Intranet Content

build a list of content opportunities and options that range from quick wins to high-impact but high-cost applications

plot the options on a chart where one axis is the estimated impact or value, and the other is the cost or difficulty (a simple plotting sketch follows this slide)

[Chart axes: Value, Cost]

8
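To make the value/cost chart above concrete, here is a minimal sketch using Python and matplotlib; the opportunity names and scores are invented placeholders, not data from the presentation.

```python
# A minimal sketch of the value/cost prioritization chart described above.
# The opportunities and their scores are made-up examples.
import matplotlib.pyplot as plt

opportunities = {
    "Phone directory":       {"value": 7, "cost": 2},  # quick win
    "Policy library":        {"value": 5, "cost": 4},
    "Team news pages":       {"value": 4, "cost": 3},
    "Benefits self-service": {"value": 9, "cost": 8},  # high impact, high cost
}

fig, ax = plt.subplots()
for name, score in opportunities.items():
    ax.scatter(score["cost"], score["value"])
    ax.annotate(name, (score["cost"], score["value"]),
                textcoords="offset points", xytext=(5, 5))

ax.set_xlabel("Cost / difficulty")
ax.set_ylabel("Value / impact")
ax.set_title("Content opportunities")
plt.show()
```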

Intranet Content

The oddities of Web space

no sense of scale
no sense of direction
no sense of location

9

Intranet Content

Finding Information

"no more than three clicks"
what really counts is how hard each click is
card sorting (a simple analysis sketch follows this slide)

people read computer text 28% slower than printed text

79% of users only scan web pages

10
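Card sorting results are often summarized by how frequently participants group the same items together. The sketch below shows one simple way to do that in Python; the card labels and group names are invented for illustration.

```python
# A minimal sketch of one way to summarize card-sorting results: count how
# often each pair of cards was placed in the same group. Data is invented.
from collections import Counter
from itertools import combinations

sorts = [  # one dict per participant: {group name: [cards in that group]}
    {"Benefits": ["dental plan", "pension"], "News": ["CEO update"]},
    {"My pay":   ["pension", "dental plan", "CEO update"]},
    {"HR":       ["dental plan", "pension"], "Updates": ["CEO update"]},
]

pair_counts = Counter()
for sort in sorts:
    for cards in sort.values():
        for pair in combinations(sorted(cards), 2):
            pair_counts[pair] += 1

for (a, b), n in pair_counts.most_common():
    print(f"{a} / {b}: grouped together by {n} of {len(sorts)} participants")
```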

Intranet Content

re-purpose content
chunking
shorter pages = less scrolling
get rid of half the words, and then do that again
avoid happy talk (intro, welcome, etc.)

11

Intranet Content

keep the important stuff "above the fold"
use headings, sub-headings, point form
avoid instructions; pages should be self-explanatory
simple graphics

12

Intranet Content

Avoid PDF for On-Screen Reading

“Forcing users to browse PDF files makes usability approximately 300% worse compared to HTML pages. Only use PDF for documents that users are likely to print.”

Jakob Nielsen’s Alertbox

13

Managing the Content


content should be created, owned and managed by the people who own the knowledge

central team members should help content contributors improve the service they provide

identify areas not being served by the intranet

work as advisors for new and special projects

14

Managing the Content

employees should be able to update content independently
IT bottlenecks reduce efficiency
empower non-technical contributors
establish work flow for content creation

15

Managing the Content

  – – – –

content management software should make it easy for you to set up the: page templates approval processes user roles rules about who should be responsible each content area this should not require extensive database development or specialized programming

16
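As a purely illustrative example of the kind of setup meant here, the sketch below models templates, roles, an approval process and content-area ownership as plain Python configuration; the names are hypothetical and not taken from any particular content management product.

```python
# Hypothetical sketch of CMS-style configuration: templates, roles,
# an approval workflow, and ownership rules for each content area.
cms_config = {
    "page_templates": ["news item", "policy page", "team home page"],
    "user_roles": ["contributor", "editor", "publisher"],
    "approval_process": [
        {"step": 1, "role": "editor", "action": "review draft"},
        {"step": 2, "role": "publisher", "action": "approve and publish"},
    ],
    "content_areas": {
        "HR / Benefits": {"owner": "HR team"},
        "Company news": {"owner": "Communications team"},
    },
}

def can_publish(role: str) -> bool:
    """Only the role named in the final approval step may publish."""
    return role == cms_config["approval_process"][-1]["role"]

print(can_publish("publisher"))    # True
print(can_publish("contributor"))  # False
```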

Measuring Content

Measurement Guidelines

simple metrics related to business objectives can be very powerful

don't measure because you can - but because it is meaningful

use a mixture of quantitative and qualitative metrics

stories are more powerful than statistics

17

Measuring Content

Using metrics allows:

targets to be set
success to be assessed
ROI to be estimated
problems to be corrected

18

Measuring Content

Implementation metrics

System usage

web usage statistics (a log-analysis sketch follows this slide)
search engine usage
messages sent/posted
other knowledge creation measures
knowledge use

19
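As an illustration of basic web usage statistics, the sketch below counts page requests per URL from a server access log in the common Apache/Nginx format; the log file name is an assumption made for the example.

```python
# Minimal sketch: count page requests per URL from a web server access log.
# Assumes the common/combined log format; the file name is illustrative.
import re
from collections import Counter

request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

page_requests = Counter()
with open("intranet_access.log") as log:
    for line in log:
        match = request_re.search(line)
        if match:
            page_requests[match.group(1)] += 1

print("Most requested pages:")
for url, count in page_requests.most_common(10):
    print(f"{count:6d}  {url}")
```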

Measuring Content

20

Measuring Content

21

Measuring Content

Number of users

hits
page requests
single page visits
visits
unique visits
user sessions (a sessionization sketch follows this slide)
clickstream

22
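To make the distinction between page views and user sessions concrete, here is a minimal sketch that groups timestamped page views into sessions using a 30-minute inactivity rule; the events and the timeout value are assumptions made for illustration.

```python
# Minimal sketch: derive user sessions from a clickstream of (user, time)
# events, starting a new session after 30 minutes of inactivity.
from collections import defaultdict
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)   # assumed inactivity cutoff

events = [  # invented sample data
    ("alice", datetime(2007, 5, 1, 9, 0)),
    ("alice", datetime(2007, 5, 1, 9, 10)),
    ("alice", datetime(2007, 5, 1, 11, 0)),  # gap > 30 min: new session
    ("bob",   datetime(2007, 5, 1, 9, 5)),
]

sessions = defaultdict(int)  # user -> session count
last_seen = {}               # user -> time of last page view

for user, ts in sorted(events, key=lambda e: e[1]):
    if user not in last_seen or ts - last_seen[user] > TIMEOUT:
        sessions[user] += 1
    last_seen[user] = ts

print(dict(sessions))  # {'alice': 2, 'bob': 1}
```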

Measuring Content

information quality
information currency
user feedback
maintenance costs
staff efficiency
printing costs
distributed authoring
process efficiency, reduced time
transaction costs

23

Measuring Content

Customer Service Metrics

product sales
lead conversion
customer satisfaction
consistency of advice
call handling time
transactions processed
support requests

product development cycle

24

Measuring Content

Cultural Metrics

success stories, anecdotes
staff morale, satisfaction
cultural change
staff learning

25

Measuring Content

Guidelines and Tips

be specific
determine a baseline
automate measures
measure the right things
fewer measures, not more
account for the effect of other activities
re-evaluate metrics

26

Measuring Content

analyze log files regularly and act on findings
site statistics can be very misleading
focus on the user
combine methods
remember the big picture

27

http://www.usability.gov/process.html

28

Usability Testing

What you don't need

a Ph.D. in Psychology
a high-tech lab with lots of test apparatus
an eye-tracking device
a multimillion-dollar budget

29

Usability Testing

The six steps of testing

develop a test plan
select participants
prepare test materials
conduct the test
debrief the participant
transform data into findings and recommendations

30

Usability Testing

Test Objectives Examples:

How easily can users locate the benefits information they need to determine their level of coverage?

How easily can users make changes to personal information?

31

Usability Testing

Match tasks to test objectives

Your child needs braces. Find out how much your dental plan will pay for orthodontist services.

You just got married. Register your new spouse as an eligible dependent for benefits coverage.

32

Usability Testing

Examples of conditions for success:


The user should be able to find the correct information in less than three minutes.

No more than four clicks should be needed to find the information.

33

Usability Testing

Examples of Measures

Excellent: completed the task easily with time and clicks to spare.

Acceptable or OK: completed the task within or close to requirements

Unacceptable: did not complete the task, or took much longer and more clicks to find the information than is acceptable.

34
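The measures above can be expressed as a simple scoring rule against the example conditions for success on the previous slide (under three minutes, no more than four clicks). The sketch below does so in Python; the margins used for "with time to spare" and "close to requirements" are assumptions, not values from the presentation.

```python
# Minimal sketch: score one task attempt against the example conditions
# for success (3 minutes, 4 clicks). The 75%/125% margins are assumed.
TIME_LIMIT_SEC = 3 * 60
CLICK_LIMIT = 4

def score_task(completed: bool, seconds: float, clicks: int) -> str:
    if not completed:
        return "Unacceptable"
    if seconds <= TIME_LIMIT_SEC * 0.75 and clicks < CLICK_LIMIT:
        return "Excellent"    # completed with time and clicks to spare
    if seconds <= TIME_LIMIT_SEC * 1.25 and clicks <= CLICK_LIMIT + 1:
        return "Acceptable"   # within or close to the requirements
    return "Unacceptable"     # took much longer or many more clicks

print(score_task(True, 95, 3))   # Excellent
print(score_task(True, 200, 5))  # Acceptable
print(score_task(False, 60, 2))  # Unacceptable
```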

Usability Testing

How many participants?

typical test has 6 to 12 participants

three is an absolute minimum for a simple test (enough to rule out individual idiosyncrasies)

three to five: enough information to be comfortable with your conclusions

35

Usability Testing

How do I select participants?

actual users, if known (average employees)
don't use developers of the site
make sure the participants show up
provide some reward for participation
include at least a few LCUs (least competent users)

beware of highly seasoned users

36

Usability Testing

Who should conduct the tests?

Test Monitor

most critical role

needs to be objective
sometimes this person is the whole testing team

sometimes an external party is the best choice

37

Usability Testing

Data Logger

takes down information as participant performs tasks

usually logs several types of data:

time elapsed
number of clicks
path of clicks
success or failure

38
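A minimal sketch of how a data logger's record could be structured, covering the four types of data listed above; the field names and the CSV output are illustrative assumptions.

```python
# Minimal sketch of one logged task attempt: elapsed time, click count,
# click path, and success/failure. Field names are illustrative.
import csv
from dataclasses import dataclass, field

@dataclass
class TaskLog:
    participant: str
    task: str
    time_elapsed_sec: float
    clicks: int
    click_path: list = field(default_factory=list)
    success: bool = False

record = TaskLog(
    participant="P1",
    task="Find dental plan coverage",
    time_elapsed_sec=142.0,
    clicks=5,
    click_path=["Home", "Benefits", "Dental", "Coverage"],
    success=True,
)

# Append the record to a CSV file for later analysis.
with open("usability_log.csv", "a", newline="") as f:
    csv.writer(f).writerow([record.participant, record.task,
                            record.time_elapsed_sec, record.clicks,
                            " > ".join(record.click_path), record.success])
```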

Usability Testing

Conduct a Pilot Test

test drive the tasks
makes sure everything works properly
don't do this at the last minute
preferably with someone who would qualify as an actual participant

39

Usability Testing

Analyzing the Results

Levels of severity

prevents completion of task
creates significant delay and frustration
has a minor effect
indicates possible future enhancements

40

Usability Testing

Recommending Changes

consider the complexity of the problems

consult with designers/developers, technical communicators, usability specialists

balance effort for benefit

41

Usability Testing

Communicating the Results

informal memo

verbal report

formal report

introduction
methodology
user profile
task list
results
discussion

42

Usability Testing

Usability Testing

moves the discussion from what’s right and what’s wrong to what works and what doesn’t

helps us realize that all users are not like us

The point is not to prove or disprove something, but to inform your judgement.

43

Usability Testing

What is a usability review?

a usability expert reviews your site and provides a report
sometimes used before a re-design
can be used before testing to identify problem areas

44

New Media

blogs
wikis
podcasts
videocasts
enterprise chat
interactive screensavers
VoIP (voice over Internet protocol)

45

Making Intranets Meaningful

put at least one new item on the global home page every day
give prominence to strategic information and "new" news
facilitate content contributions from everyone
integrate services and applications

46

Global Intranet Strategies Survey

Executive Summary

the intranet has entered maturity as a primary information tool

senior management perception of the intranet is out of sync with reality on the ground

intranets lack sufficient funding and resources

decision-making is an issue
customer-facing functions are largely missing from the intranet
primary strategy drivers are "building a common culture"

47

Global Intranet Strategies Survey

the primary obstacle to achieving full potential is that the intranet is too communication-oriented and lacks integrated applications
intranet evaluation is irregular and inconsistent
only 1 out of 4 organizations is obliged to demonstrate ROI for intranet investments
information flows are strongest in the top-down direction
3 out of 4 have an employee directory, but only 1 out of 5 contain information about people's skills and expertise

48

Global Intranet Strategies Survey

Web 2.0 tools (blogs, wikis) are making their way onto the intranet
organizations that consider the intranet to be "business critical" are more likely to adopt Web 2.0 technologies and have stronger communication flows than average

49

Global Intranet Strategies Survey

What evaluation techniques do you use?

Analysis of usage statistics: 89
Informal feedback from users: 79
Online surveys: 65
Analysis of search logs: 42
Focus groups: 42
Emails to the intranet managers: 37
Informal benchmarking with other organizations: 35
Expert analysis: 35
Formal benchmarking studies: 21
Other: 17
Online polls: 15
Telephone interviews: 9

50

Global Intranet Strategies Survey

What is the frequency of your formal evaluations?

Not regular (48%)
Once a year (29%)
At least twice a year (14%)
Have not done any yet (9%)

51

Global Intranet Strategies Survey

Conclusions

the intranet is still in its infancy

the intranet is moving towards the individual

senior management has a stronger role to play in the intranet

52

Making Intranets Meaningful

“Usability rules the Web... He or she who clicks the mouse gets to decide everything.” - Jakob Nielsen

53

References

Articles

Metrics for knowledge and content management by James Robertson http://www.steptwo.com.au/papers/kmc_metrics/index.html

Employing Strategic Content Management for Successful Intranets by Hank Barnes http://www.intranetjournal.com/articles/200106/cm_06_06_01a.html

Developing business focused content and applications from Melcrum

Practitioner’s Guide to Managing Intranets and Portals http://www.vigorat.com/killerappsvigorat.htm#contentappdev#

54

References

Articles

Tools for Assessing Website Usage by Scott Anderson, Terri Willard, Heather Creech and Deborah Bakker http://www.iisd.org/pdf/2001/Web_evaluation.pdf

Global Intranet Strategies Today & Tomorrow Survey, Summary of Results by Jane McConnell http://netjmc.com/engl/survey06summary.html

Looking through the Portal by Philip Weiss, Communication World, May-June 2007

Making Intranets Meaningful by Jane McConnell, Communication World, May-June 2007

55

References


Usability Web Sites

www.usability.gov

www.useit.com

www.usableweb.com

www.intranetinsider.com

www.humanfactors.com

www.upassoc.org

www.userdox.com

56