Transcript Slide 1

Using the WOSP Model to Improve
End-user Productivity of Information
Systems
Edward Mahinda - NJIT (USA)
Brian Whitworth - Massey University (NZ)
Presented at the International Conference on Business IT 2006 (BIZIT 2006),
August 8 – 10, Kuala Lumpur, Malaysia
Significance of IT/IS
• A primary organizational survival factor
– organizations cannot afford weak information systems (Davenport et
al, 1994)
• IT/IS benefits fall into four purpose categories:
– Increasing productivity and performance
– Better management support
– Gaining competitive advantage
– A framework for business restructuring
• Some quantitative indications:
– In the last two decades, approximately 50% of all new capital
investments in organizations have been in IT (Westland et al, 2000)
– Total worldwide expenditure on IT exceeded USD 1 trillion per
annum in 2001, with a 10% annual compounded growth rate
(Seddon et al, 2000)
2
Need for System Evaluation
• However, organizations today have fewer
financial resources available for IT (Rivard et al,
1997).
– Increasing desire to control IT-related
spending by better information system
evaluation, i.e. “buying smarter”.
• Improves overall performance (Taylor et al, 1995)
• Gives senior executives the information needed to
justify huge IT investments (Hitt et al, 1996; Brynjolfsson,
1993).
3
Need for User Involvement
• Many system development projects are abandoned
before or after completion, and most fail to meet user
expectations
– Annual costs of organizational IS/IT project failure estimated to
exceed $100 billion in the US alone (Ewusi-Mensah, 1997; Standish, 1996)
• Main reason: Lack of end user involvement in
development and purchasing processes (Vassey et al, 1994)
– Customers who pay for the system are not those who actually
work with it (Gause et al, 1993)
– Requires IS/IT performance evaluation by the non-specialist
primary users of IT-related products and services (Chang et al, 2000)
4
Evaluation Requirements
• End-user evaluation would let end-users influence IS
development and purchase processes (Isomaki et al, 2005)
• IS/IT end-user evaluation should be:
– Valid: Its dimensions predict IS/IT performance
– Comprehensive: Includes all relevant IS/IT performance factors
– Consistent: Constructs do not overlap or contradict
– Understandable: Usable by non-expert IT/IS users
– IS/IT Applicable: Applies in many IS/IT contexts
• The Web of System Performance (WOSP) model seems
to satisfy these requirements
• It is a broad yet simple performance model, based on
well-known IS/IT constructs that are carefully defined so as
not to overlap conceptually, and it applies to any system
5
Technology Acceptance Model (TAM)
• Dominant user acceptance model
• Perceived Usefulness (PU) + Perceived Ease of Use (PEOU)
==> Attitude ==> Intention to adopt (see the regression sketch below)
• Made usability a key IS/IT quality requirement
• TAM advantages (Hu, Chau, Sheng and Tam, 1999):
– Valid: Good theory base, significant empirical support
– IS/IT Applicable: Applicable to diverse technologies,
users, organizational contexts
– Understandable: Parsimonious
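As an illustration of the TAM causal chain above, here is a minimal sketch (not taken from the cited studies) of the two regressions it implies; the column names pu, peou, attitude, and intention are assumptions.

```python
# Hedged sketch of the TAM chain PU + PEOU -> Attitude -> Intention,
# fitted as two ordinary least squares regressions. Column names are
# illustrative assumptions, not taken from the cited studies.
import pandas as pd
import statsmodels.formula.api as smf

def fit_tam(df: pd.DataFrame):
    """df holds per-respondent scores: pu, peou, attitude, intention."""
    attitude_model = smf.ols("attitude ~ pu + peou", data=df).fit()
    intention_model = smf.ols("intention ~ pu + attitude", data=df).fit()
    # .rsquared corresponds to the "variance explained" figures quoted
    # for the telemedicine study on the next slide (37% and 44%).
    return attitude_model, intention_model
```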
6
TAM (cont’d)
• TAM weakness:
– Not comprehensive: Ignores IS/IT criteria like:
• Flexibility (Knoll & Jarvenpaa, 1994), Security (OECD,1996), Reliability (Jonsson, 1998)
• Privacy (Benassi,1999), Scalability (Berners-Lee, 2000) and standards (Alter, 1999)
• In a study of telemedicine acceptance (Hu, Chau, Sheng and Tam, 1999):
– PU+PEOU explained only 37% of attitude variance
– PU+Attitude explained only 44% of intention variance
– Attempts to make Usefulness include, say, Security
make the model inconsistent
– The UTAUT model adds non-system factors like
facility infrastructure and normative influence
• TAM validly describes IS/IT performance, but
seems incomplete
7
WOSP Model
• Based on Systems theory
• Information systems are like any other natural system
• Performance is how well a system interacts with its
environment
• Involves 4 system elements, each with a dual role:
– Effectors: change external environment
• Functionality: to act on environment
• Usability: to reduce action costs
– Boundary: determines what enters system
• Security: to prevent entry
• Extendibility: to use outside objects
– Structure: Manages and supports system
• Reliability: to perform the same despite internal change
• Flexibility: to perform differently given external change
– Receptors: Enable communication
• Connectivity: to exchange social meaning
• Privacy: to limit social meaning exchange
8
WOSP cont’d
• Performance = Fu + Se + Fl + Ex + Re + Us + Co + Pr (see the sketch below)
• All dimensions in natural tension
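A minimal sketch of the WOSP performance sum, assuming each factor has already been scored on a common scale; the equal weighting follows the formula above, and the example browser profile is invented.

```python
# The WOSP performance formula as a plain sum of the eight factor scores.
# Scores are assumed to lie on a common scale (e.g. 0-1); example is invented.
WOSP_FACTORS = ["functionality", "security", "flexibility", "extendibility",
                "reliability", "usability", "connectivity", "privacy"]

def wosp_performance(scores: dict) -> float:
    """Performance = Fu + Se + Fl + Ex + Re + Us + Co + Pr."""
    return sum(scores[factor] for factor in WOSP_FACTORS)

example_browser = {"functionality": 0.8, "security": 0.6, "flexibility": 0.5,
                   "extendibility": 0.7, "reliability": 0.9, "usability": 0.8,
                   "connectivity": 0.7, "privacy": 0.6}
print(wosp_performance(example_browser))  # 5.6
```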
9
Research Question
• WOSP particularly applies to socio-technical systems (STS)
(Whitworth and Whitworth, 2004)
– That have a social performance level, e.g.
email, browsers, bulletin boards, chat, e-bay
• Do users take account of the WOSP
factors when comparing the performance
of alternative socio-technical systems?
10
Application assessed
• Browser (increasingly):
– An important universal platform for information
searches; email; discussion groups; internet; intranet;
and extranet applications
– A socio-technical system
– Many different browser versions
– Organizations may choose/recommend one for
compatibility reasons
11
Analysis Method
• Multivariate dependence analysis
– Dependent variable - Perceived performance
– Independent (predictor) variables - WOSP factors
• The predictor variables are known
• Method of choice: CONJOINT ANALYSIS (Hair, Anderson,
Tatham and Black, 1995)
– People evaluate by adding up part-worth utilities (see the
sketch below)
– Widely used in marketing and agriculture
• New to IS research
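A hedged sketch of ratings-based conjoint analysis as applied here, assuming each subject rates a set of browser profiles described by levels on the eight WOSP factors; part-worths are estimated by dummy-coded regression, and a factor's relative importance is its part-worth range as a share of the total range. Profile data and level names are illustrative.

```python
# Hedged sketch of ratings-based conjoint analysis: estimate part-worth
# utilities per factor level by regression, then derive each factor's
# relative importance from its part-worth range. Illustrative only.
import pandas as pd
import statsmodels.api as sm

def conjoint_importance(profiles: pd.DataFrame, ratings: pd.Series) -> pd.Series:
    """profiles: one column per WOSP factor, with categorical levels such as
    'weak'/'adequate'/'strong'; ratings: one subject's scores per profile."""
    X = sm.add_constant(pd.get_dummies(profiles, drop_first=True).astype(float))
    model = sm.OLS(ratings.astype(float), X).fit()
    ranges = {}
    for factor in profiles.columns:
        # part-worths: 0 for the dropped baseline level, plus the estimated
        # coefficients of that factor's dummy columns
        worths = [0.0] + [model.params[c] for c in X.columns
                          if c.startswith(factor + "_")]
        ranges[factor] = max(worths) - min(worths)
    total = sum(ranges.values())
    return pd.Series({f: 100 * r / total for f, r in ranges.items()})
```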
12
Subjects
• Conjoint analysis gives higher quality data than surveys
– over four hours per person
• 28 grad students: 43% female, 53% male
• Diverse cultural background
• Experienced browser users: average 8 years total, with
23hrs/week in last 6 months
• Reasons for use: e.g. information search; online
banking; online purchasing; email; taking courses
13
Experimental Method
• Preliminary priming phase (questionnaire):
• Subjects asked to rate illustrative factor statements on a
1-5 scale for clarity, validity, and importance
• Second phase: to evaluate each browser:
• Grade as strong, good, adequate, limited, weak
• Score each browser 1-100
• Rank each browser 1-33 (no two with same rank)
• Explain reasoning behind decisions
• Whole procedure carried out via email
14
Results I
• Accuracy of results:
– Internal consistency of subjects for all but 3:
• Kendall’s tau (holdout vs. actual responses) > 0.4
(p < 0.01); see the sketch below
– Extreme outliers of part-worths:
• One outlier (for usability)
– 4 data sets excluded from further analysis
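A minimal sketch of the holdout consistency screen described above; the 0.4 and p < 0.01 thresholds follow the slide, and the rank variables are illustrative.

```python
# Screen out subjects whose holdout ranks do not agree with the ranks
# predicted from their estimated part-worths (tau > 0.4, p < 0.01).
from scipy.stats import kendalltau

def passes_holdout_check(actual_ranks, predicted_ranks,
                         min_tau=0.4, alpha=0.01) -> bool:
    tau, p_value = kendalltau(actual_ranks, predicted_ranks)
    return tau > min_tau and p_value < alpha
```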
• Interpreting results (see the sketch below):
– If average importance >= 12.5%, the factor is significant
– The percentage of subjects who gave a factor an average
importance >= 12.5% is also reported
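A sketch of the two summary statistics reported in Results II, assuming `importance` is a subjects-by-factors table of average importance percentages; the variable names are assumptions.

```python
# Per-factor average importance, the share of subjects above the 12.5%
# threshold, and the correlation between the two columns (reported as 0.95).
import pandas as pd

def summarize_importance(importance: pd.DataFrame, threshold: float = 12.5):
    avg_importance = importance.mean()                    # mean over subjects
    pct_above = 100 * (importance >= threshold).mean()    # % of subjects
    correlation = avg_importance.corr(pct_above)
    summary = pd.DataFrame({"avg_importance": avg_importance,
                            "pct_subjects_above": pct_above})
    return summary, correlation
```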
15
Results II
Performance Factor | Avg. Importance (%) | Std Dev. | 99% Confidence Interval | Subjects with importance > 12.5% (%)
Security           | 22.78               | 12.78    | 16.07 - 29.50           | 70.83
Privacy            | 15.47               |  9.19    | 10.64 - 20.30           | 58.33
Usability          | 14.16               |  9.88    |  8.97 - 19.36           | 50.00
Functionality      | 12.02               |  8.21    |  7.70 - 16.33           | 29.17
Reliability        | 11.64               |  8.15    |  7.36 - 15.93           | 33.33
Connectivity       |  9.24               |  6.54    |  5.80 - 12.68           | 33.33
Extendibility      |  7.69               |  4.56    |  5.30 - 10.09           | 16.67
Flexibility        |  6.99               |  6.46    |  3.59 - 10.39           | 16.67

Correlation of the subjects column with Avg. Importance: 0.95
16
Results III
(Graphical representation)
17
Conclusion
• Not all factors are of equal significance
• Security, privacy, usability, functionality, reliability,
and connectivity are more significant
– Extendibility and flexibility are not as significant, but
still important
• A high correlation (0.95) between the percentage of
subjects giving a factor importance >= 12.5% and
the average importance of the factors
18
Discussion I
• These results are only for browsers
• Other software may have different criterion weights; software types may have distinct performance profiles
• WOSP dimensions outside TAM were used in the
evaluation, e.g. security and privacy
• The WOSP model seems more inclusive
– It adds to TAM factors well recognized in the system
requirements literature
• The WOSP model lets users better indicate their
software preferences to system designers.
– Helps tighten relationship between developers and customers,
and foster collective creation and sharing of knowledge (Fuller et al,
2004; Franz et al, 2003)
19
Discussion II
• Using Conjoint Analysis, the WOSP model can facilitate
the following product development functions (Hair et al, 1995):
• Segmentation: segment users according to the
importance they attach to each of the eight factors.
– Match users with systems of their preference to reduce
resistance
• Marketing information: get information on the relative
importance of the factors, plus the cost of providing them
– Provides insight into the profitability of providing applications
• Simulation, involving 3 steps (see the sketch below):
– Estimation and validation of conjoint models for sample subjects
– Selection of stimuli for testing, based on an issue of interest
– Simulation of subjects’ choices for the selected stimuli to predict
application evaluations
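A hedged sketch of the simulation step: score hypothetical profiles with a subject's estimated part-worths and take the highest total utility as the predicted choice. The part-worth table and profile layout are illustrative assumptions.

```python
# Predict a subject's evaluation of candidate systems by summing the
# part-worth utilities of each profile's factor levels. Illustrative only.
import pandas as pd

def simulate_choices(part_worths: dict, profiles: pd.DataFrame) -> pd.Series:
    """part_worths: {factor: {level: utility}} for one subject;
    profiles: one row per candidate system, one column per WOSP factor."""
    totals = profiles.apply(
        lambda row: sum(part_worths[f][row[f]] for f in profiles.columns),
        axis=1)
    return totals  # the profile with the highest total is the predicted choice
```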
20
Questions?
• See brianwhitworth.com, “Papers”,
for more papers
21