User Training vs. System Training
Dr. Mark Hepworth
Department of Information Science
People
 Zipf’s law: people will choose the path of least resistance or "effort" … seeking behaviour stops as soon as minimally acceptable results are found
(http://en.wikipedia.org/wiki/Principle_of_least_effort [Accessed 15.10.08])

But how does “acceptable” get defined?
Young learners
 Are no different! And perform to expectation!
 Schooling has encouraged pragmatic, passive, surface learning
… fit for assessment
 Schooling provides little opportunity to develop independent, self-directed learning skills and appropriate attitudes
 Educators do not seem to be aware of the complexity of
information seeking, management and use processes
 Employers …
 Search engines have fostered:
 a shallow attitude to information retrieval
 a naïve belief in the power of the machine
 an uncritical approach to information retrieval.
Learners (beginning semester 1)
 “I don't really understand what 'database' means. Is there any difference between the information that we search by using the normal search engine and the databases that we found via 'Metalib'?”
Learners (mid semester 1)
 “the information retrieved from Google was not as specific as a
database” [compared to Emerald]
 “the Boolean AND function allowed me to select a subject
‘information science’ AND (in the title) ‘security system’ … to
narrow my search I used Computer & Information systems
database”
 “the information returned was reliable”
 “I used ABI … you could be more restrictive … there are a lot
of variables you can adjust …this makes it easier to locate
material that you are looking for”
 “Emerald has many features that allow the search to be
refined … the search results can be broadened or limited
through using Boolean searching … phrase … combining
terms made the information more specific”
The penny drops!

I used the Web of Science advanced search facility to research children’s information seeking behaviour in libraries. Data has to be entered in the search box in a precise way, using commands to search for words in topics, titles etc. You can also search for authors, year published etc. in the initial search. You can use the Boolean operators – AND, NOT, OR and SAME.
I searched for ‘information seeking’, which brought up 7344 articles on many different subjects and user groups, in medicine as well as in libraries and other organisations. I then searched for articles containing the words ‘children’ and ‘library’ in the initial search and this brought up 919 document types. There was the facility on the database to combine these two searches, which brought up 27 records. This was an easier number to deal with and there were several document types which looked at children’s use of libraries and developing library services for younger users.
In order to refine the search more I could have selected publication years, document types, subject areas, e.g. information science and library science, authors and many more. In each case the number of documents relating to that area was stated.
I found using this database easy once you know how to use the commands. It is also extremely powerful and user friendly and actually very satisfying as you can often find whole documents. I found some quite interesting ones I might use in my assignment. A disadvantage of using this database, however, was its American bias. Out of the 27 document types I found there were 19 American ones and only one from the UK.
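As a concrete illustration of the commands the student describes, the sketch below builds the same kind of field-tagged Boolean queries and combines them. It is only a sketch: the field and combine helpers are made up for illustration, and the TS=/TI= tags stand in for the topic and title commands the quote mentions; nothing here is a real Web of Science client.

# Illustrative sketch only: building field-tagged Boolean queries of the
# kind the student describes (topic/title commands, AND/NOT/OR, combining
# searches). The helpers below are hypothetical; no real API is called.

def field(tag: str, terms: str) -> str:
    """Wrap search terms in a field command, e.g. TS=(children AND library)."""
    return f"{tag}=({terms})"

def combine(*clauses: str, op: str = "AND") -> str:
    """Join previously built clauses with a Boolean operator."""
    return f" {op} ".join(f"({c})" for c in clauses)

# Search 1: topic search for the phrase "information seeking"
search_1 = field("TS", '"information seeking"')

# Search 2: topic search for records about children and libraries
search_2 = field("TS", "children AND library")

# Combining the two searches – the step that cut thousands of hits to 27
combined = combine(search_1, search_2)
print(combined)
# (TS=("information seeking")) AND (TS=(children AND library))

# A further refinement might add a title restriction or exclude a topic with NOT
refined = combine(combined, field("TI", "libraries"), op="AND")
print(refined)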
Experience the benefits – see the value
 Students are NOT opposed to effort when it
pays off (look at the range of variables
embraced in game playing)
 Students CAN use and DO see the VALUE of
more complex IR systems
 BUT they need to be taught in a problem-based, contextually relevant, interesting way.
There are common problems associated
with the search process
[Diagram: common stages in the search process – define the topic you want to search on; identify useful words/concepts; determine what kind of information; identify sources; select sources; understand functionality; choose type of search; choose search technique; develop search strategy; refine search; view/browse results; select items; capture information]
Common user experiences
 Too much
 Too little
 Irrelevant
Caused by:
 Not knowing where to go
 Not knowing how the systems work
 Wrong terms – unfamiliar with the domain
20 years for the penny to drop!
 Search engines seemed to ignore the online and CD-ROM experience
 Naïve belief in their algorithms (eventually
had help systems and an advanced search!)
 BUT they have changed
 Podcast of a marketing manager of an IR firm
who knew of Marcia Bates … had read
information behaviour research!
Better times
 Federated systems (less need to know where
to go)
 Faceted classification (different views,
enables narrowing down, prompts the user)
 Tag clouds (helps with limited domain
knowledge, prompts search terms, help to
narrow and broaden the search)
Possibly less IR training, quicker converts
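The facet and tag-cloud ideas above can be made concrete with a small sketch. It is illustrative only, not the interface of any real federated system: the record fields and the facet_counts and tag_cloud helpers are hypothetical, but they show how counts over result metadata give the user prompts for narrowing and broadening.

# Minimal sketch (not any particular product's API): deriving a faceted
# view and a tag cloud from result metadata. All record fields are invented.
from collections import Counter

results = [  # hypothetical search results
    {"title": "Children's information seeking", "subject": "Library Science",
     "year": 2007, "keywords": ["children", "libraries", "information seeking"]},
    {"title": "Query reformulation in web search", "subject": "Information Science",
     "year": 2008, "keywords": ["search engines", "information seeking"]},
    {"title": "School library services", "subject": "Library Science",
     "year": 2006, "keywords": ["children", "libraries"]},
]

def facet_counts(records, field):
    """Count values of one metadata field – the basis of a faceted view."""
    return Counter(r[field] for r in records)

def tag_cloud(records, top_n=10):
    """Rank keywords by frequency – candidate terms to narrow or broaden with."""
    return Counter(kw for r in records for kw in r["keywords"]).most_common(top_n)

print(facet_counts(results, "subject"))   # a different view of the same results
print(facet_counts(results, "year"))      # lets the user narrow by date
print(tag_cloud(results))                 # prompts terms the user may not have known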
Room for improvement
 It’s not just about the retrieval process
 It doesn’t have to be all done in the background – give people control
 However, within the IR remit we need to allow:
 better exploration of the subject domain
 selection of multiple terms
 more control – narrowing and broadening
 more prompting – narrowing and broadening
 Dependent on the ‘richness’ of the record, i.e. the metadata (the richer the descriptions of the information objects, the more that can be extracted)
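A small illustration of the ‘richness’ point, using purely hypothetical records and a made-up derive_refinements helper: every descriptive field present in the metadata is something the system can turn into a prompt, so a sparse record leaves little to extract.

# Minimal sketch: the more metadata fields a record carries, the more
# refinement options a system can offer. Records and helper are invented.

rich_record = {
    "title": "Children's information seeking behaviour in public libraries",
    "authors": ["Smith, J.", "Jones, K."],
    "year": 2007,
    "subject_areas": ["Information Science", "Library Science"],
    "document_type": "Article",
    "keywords": ["children", "information seeking", "public libraries"],
}

poor_record = {"title": "Children and libraries"}

def derive_refinements(record):
    """Offer a narrowing/broadening option for every descriptive field present."""
    options = []
    for field in ("authors", "year", "subject_areas", "document_type", "keywords"):
        if field in record:
            options.append((field, record[field]))
    return options

print(len(derive_refinements(rich_record)))  # 5 – many ways to narrow or broaden
print(len(derive_refinements(poor_record)))  # 0 – little the system can prompt with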
Connecting with information in more depth
[Diagram: stages – topic investigation; identifying sources & systems; connecting with information; constructing new knowledge – with associated thinking and behaviour: map the domain; identify key words; see connections; knowing enough?; mind maps, concept tables; source orientation; understanding functionality; using more subject-specific sources; constructing more specific searches using the full functionality of the systems; evaluating sources; scanning, browsing, chaining; organising, storing information; analysis; synthesis; evaluation; refining & interpreting; reflection (on process, your thinking, knowledge, feelings, learning style); questioning & challenging]
Still a need for training
 To effectively use the full range of information resources, higher-level skills are required
 To be an effective, empowered ‘learner’
 Need to understand the information landscape
 Need to understand the information and knowledge creation and storage process
 Need to understand the value of information
 Need to understand that IR is part of learning and it’s messy
 For some tasks precision and power are required
 and so on … use, communication … information literacy
Joining forces – it is happening!
Information & Computer Scientists – people’s information behaviour, information literacy and information retrieval
Information retrieval & electronic publishing industry professionals – the products, the needs, the technology
Information professionals (librarians, information specialists) – the environment, the needs, information literacy
MORE HOLISTIC: systems that support the wider information literacy experience
User training or system training?
Questions
Comments
Observations