IN SERVICE SAFETY TESTING AND INSPECTION OF ELECTRICAL EQUIPMENT – OUR EXPERIENCE
By Charles Stace, Sue Briggs, Lynda Kutek
AGENDA
 Outline the process undergone to carry out our legal obligations regarding in service safety inspection and testing of electrical equipment.
 The system we have developed.
 Some of the problems we have encountered.
 Solutions.
 I'm not the expert! Mr Charles Stace was responsible for developing the system.
WHO WE ARE
 School of Pharmacy and Medical Sciences, University of South Australia.
 Undergraduate and postgraduate teaching laboratories and classrooms, as well as research laboratories, administration and support areas.
 Spread across 5 buildings located at three different campuses.
 110 staff composed of academic, research, technical and administration personnel.
OUR PROBLEM
 Need to comply with OH&S legislation, which refers to AS/NZS 3760:2001 – In-service Safety Inspection and Testing of Electrical Equipment.
 This states in its foreword: "equipment… needs to be subjected to routine inspection and testing to detect obvious damage, wear or other condition which might render it unsafe".
OUR PROBLEM (cont)
 The School has approximately 4000 pieces of electrical equipment, much of it moveable laboratory equipment which needs to be tested annually, spread across 3 campuses.
 Tightly squeezed monetary and staffing resources.
PROBLEM SOLVING
Initially
 At City East, most of the testing was carried out by our in-house Workshop Manager, with 2 lab assistants doing data entry.
 At Mawson Lakes, testing was carried out by 7 technical officers in their individual areas.
PROBLEM SOLVING (cont)
Problems
 Slow and laborious work.
 At City East Campus, "Superman" was stretched too far! Too much work for 1 person.
PROBLEM SOLVING (cont)
Problems
 Technicians at Mawson Lakes were not properly certified.
 Records for each campus were kept separately. No uniformity. The data entry system was slow and unwieldy.
Progress to Solution
The cost to outsource testing was researched:
 The company with the preferred option used a data system that would enable us to have direct access to our records.
 Their proposed charge was $4.00 per item, which meant an initial outlay of $16,000 and an annual outlay on average of $8,000.
 The option was dismissed due to excessive cost.
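As a rough, hypothetical check of those figures in Python: the $4.00 per-item rate and the approximately 4000-item count come from the slides above, while the number of items tested in an average year is only inferred from the quoted $8,000 annual outlay, not stated in the presentation.

```python
# Back-of-the-envelope check of the outsourcing quote (illustrative only).
per_item = 4.00        # quoted charge per item ($)
total_items = 4000     # approximate equipment count from the earlier slide

initial_outlay = per_item * total_items
print(f"Initial outlay: ${initial_outlay:,.0f}")          # $16,000

avg_annual_outlay = 8000
items_per_year = avg_annual_outlay / per_item
print(f"Implied items tested per average year: {items_per_year:,.0f}")  # 2,000
```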
Progress to Solution (cont)
 Technicians from across the School attended an in-house 1 day workshop conducted by Regency College of TAFE, titled "In Service Inspection and Testing of Electrical Equipment".
 The cost was $100 per person, an outlay of $1000. Considerably cheaper than outsourcing.
Progress to Solution (cont)
 Now 14 technicians are deemed to be "competent persons" as defined by the legislation. The workload could be more evenly spread across the School.
 Our Workshop Manager centralised, further developed and refined the recording system.
WHAT HAPPENS NOW
New equipment
 New equipment arrives and goes, via the purchasing officer, to the Workshop Manager.
 Equipment is tested, data is entered into the information system and the equipment is tagged.
WHAT HAPPENS NOW (cont)
Existing equipment
 Is tested and tagged at a frequency required by law, e.g. 3 months, 6 months, 1 year or 5 years (a sketch of how a next-due date follows from these intervals appears after this slide).
 Testing is conducted by laboratory technicians with certification for electrical testing, and by the Workshop Manager.
 Technicians conduct testing in their own work areas, which covers most of the teaching and preparation laboratories and some research areas.
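A minimal sketch, in Python, of how a retest due date could be worked out from a test date and one of the intervals listed above. The interval labels and the function itself are illustrative assumptions, not part of the School's recording system.

```python
from datetime import date

# Retest intervals mentioned on the slide, expressed in months
# (illustrative only; the interval that applies to a given item
# depends on its environment and the standard's schedule).
INTERVAL_MONTHS = {"3 months": 3, "6 months": 6, "1 year": 12, "5 years": 60}

def next_due(last_test: date, interval: str) -> date:
    """Approximate date an item is next due for testing."""
    months = last_test.month - 1 + INTERVAL_MONTHS[interval]
    year = last_test.year + months // 12
    month = months % 12 + 1
    # Clamp the day so e.g. 31 January plus one month stays valid.
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return date(year, month, min(last_test.day, days_in_month[month - 1]))

print(next_due(date(2004, 3, 15), "1 year"))   # 2005-03-15
```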
WHAT HAPPENS NOW (cont)
 The advantage of this is that they are familiar with the location and working condition of the equipment.
 Equipment that fails, is complicated or needs repair is referred to the Workshop Manager.
 The Workshop Manager tests equipment not covered by technicians, e.g. common areas, non-laboratory areas and some research labs.
Testing procedure – In detail
 The testing equipment used is the Trio Electrix Safe T Check, model MKD.
 It is an Australian made, mains powered insulation resistance tester.
 It is simple to use.
Testing procedure – In detail (cont)
 It just gives a PASS or FAIL response.
 It is suitable for use by people with no testing background.
 It also has an inbuilt facility for testing extension cords.
Testing procedure – In detail (cont)
 Work in pairs or singly. Working in pairs is more efficient and breaks the monotony.
 Do one room at a time.
 All equipment is first taken out of cupboards.
Testing procedure – In detail (cont)
 Then each piece of equipment is tested and visually inspected.
 If it passes, the tag is signed and attached.
Testing procedure – In detail (cont)
 This information is then transferred to a Microsoft Excel spreadsheet which contains all relevant data for each item of equipment.
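A minimal sketch of the kind of per-item record such a spreadsheet might hold, written as Python building a CSV register. The column names and sample values are assumptions for illustration; the presentation does not specify the spreadsheet's actual layout.

```python
import csv

# Hypothetical column layout for an equipment register.
FIELDS = ["plant_no", "tag_no", "description", "location",
          "test_date", "result", "retest_interval", "next_due"]

rows = [
    {"plant_no": "P-0001", "tag_no": "T-1001", "description": "Hotplate stirrer",
     "location": "City East, Lab 3.21", "test_date": "2004-03-15",
     "result": "PASS", "retest_interval": "1 year", "next_due": "2005-03-15"},
]

with open("equipment_register.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```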
Testing procedure – In detail (cont)
 Before the testing process begins, the new sequentially pre-numbered tags are filled in with the plant numbers set out in the previous year's spreadsheet.
Testing procedure – In detail (cont)
 The number of the initial tag is handwritten against the first item in the relevant plant list of that department.
 The accuracy of the sequence of tag allocation is confirmed every 10 entries (sketched after this slide).
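A minimal Python sketch of the sequential tag allocation and the every-10-entries cross-check described above. The plant numbers, starting tag number and function name are made up for illustration and are not taken from the School's register.

```python
# Illustrative sketch of pairing pre-numbered tags with a plant list.
def allocate_tags(plant_list, first_tag, check_every=10):
    """Pair each plant number with the next sequential tag number,
    flagging a manual cross-check every `check_every` entries."""
    allocations = []
    for offset, plant_no in enumerate(plant_list):
        tag_no = first_tag + offset
        allocations.append((plant_no, tag_no))
        if (offset + 1) % check_every == 0:
            # In practice a person confirms the printout still lines
            # up with the tags; here we simply print a reminder.
            print(f"Cross-check entry {offset + 1}: plant {plant_no} -> tag {tag_no}")
    return allocations

plant_list = [f"P-{n:04d}" for n in range(1, 26)]   # 25 example items
print(allocate_tags(plant_list, first_tag=1001)[:3])
```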
Testing procedure – In detail (cont)
 As each item of equipment is tested, a mark is placed against that item on the printout. The tag is signed and attached to that item of equipment.
 The printout is further marked to show that tagging of that item is complete.
Testing procedure – In detail (cont)
 Usually conducted in teams of two. One tests and then puts a line through the old tag; the other tags and records.
 However, tagging and recording can be done shortly after the testing and need not be performed by qualified staff. This allows for flexibility of staff time allocation.
Testing procedure – In detail (cont)
 Data entry can be completed easily and quickly on the Excel system by administration staff at a later date.
 This avoids the awkwardness of using a computer on site and the slow process of recording each item as it is tested.
Testing procedure – In detail (cont)
 The recording system is simple to use. It is accessible at any time for updating, for checking the progress of the testing schedule, and for locating a particular piece of equipment.
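A minimal sketch of the kinds of look-ups just described, continuing the hypothetical register layout sketched earlier; the file name, column names and helper functions are assumptions for illustration only.

```python
import csv
from datetime import date

def load_register(path="equipment_register.csv"):
    # Reads the hypothetical register written in the earlier sketch.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def overdue(rows, today=None):
    """Items whose next_due date has passed -- a quick way to check
    progress of the testing schedule."""
    today = today or date.today()
    return [r for r in rows if date.fromisoformat(r["next_due"]) < today]

def locate(rows, tag_no):
    """Where a tagged item is recorded as being kept."""
    return [r["location"] for r in rows if r["tag_no"] == tag_no]

rows = load_register()
print(len(overdue(rows)), "items overdue")
print(locate(rows, "T-1001"))
```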
UNSOLVED / ONGOING PROBLEMS
 Workload issues for the Workshop Manager and many of the technicians.
 Some areas not covered by technicians are neglected / overlooked.
UNSOLVED / ONGOING PROBLEMS (cont)
 The recording system is subject to human error. The listing of tag numbers in sequential order next to the plant numbers sometimes goes awry, e.g.
• When a piece of equipment has been added into or deleted from the spreadsheet incorrectly and the plant number sequence is disrupted.
• When an incorrect plant number has been written on the tag. The ensuing necessary cross-checking causes time delays.
WHERE TO FROM HERE?
 Propose to train the remaining 6 technicians to spread the load even further. Possibly organise into groups responsible for whole floors so that no areas are overlooked.
 Possibly outsource testing of the most pressing overlooked areas. The School has limited scope to do this as the cost is $4.00 per item, with an estimated 800 items outstanding ($3,200).
WHERE TO FROM HERE? (cont)
 Possibly change software to Microsoft Inspect.
 Would allow both the School and an outside contractor access to the records.
 Can record a risk assessment and determine what testing is required and when (further cost of $495).
WHERE TO FROM HERE? (cont)
 Bar code reader.
At this stage the School has decided against using this method.
(a) Because the cost is prohibitive. Would need one per campus at a cost of $2000 each.
(b) Would lose some of the advantages of the present recording system, i.e. the ability to correctly locate or relocate items by reading the tag or accessing the data.
(c) Would lose flexibility of staff allocation, as only one staff member at a time would be involved in the process, resulting in peak demand for the equipment when staff are traditionally free to do testing, e.g. semester breaks.
 May be considered in the future.
CONCLUSION
 Comments?
 Suggestions?
 Share ideas to take back and further improve the process.