
Tracking for impact: lessons learned from attempts to track participants beyond a project intervention

Dr Sally Hobden and Prof Paul Hobden

SAMEA 2013

Background and context

The project intervention

The project was a school-based educational intervention (henceforth referred to as the EI programme) in which successive cohorts of promising learners from disadvantaged backgrounds received scholarships to attend independent schools from Grade 10 to Grade 12. A key project aim was to increase the pool of learners from previously disadvantaged backgrounds able to access tertiary education, particularly in science-related fields.


Background and context (contd.)

The evaluation task

Several formative evaluation reports on the implementation and progress of the EI programme had been prepared.

As part of the summative report, the funders were interested in the progress of the first two cohorts of scholarship learners after they left school.

The brief was to contact the EI programme alumni and to find out their academic or work-related progress since leaving school. This would include asking about the course of study, the institution, funding, and any legacy benefits of being part of the EI programme. This would serve as an indicator of the impact of the EI intervention.

Background and context (contd.)

South African context – the secondary/tertiary articulation gap

The investigation fully acknowledged the effects of material (especially financial) and affective factors on learning, but determined that academic factors are at the heart of the systemic obstacles to student success. It is widely accepted that student underpreparedness is the dominant learning-related cause of the poor performance patterns in higher education (CHE, 2013, p. 16). Fewer than half of the students entering university complete their degrees (CHE, 2013).

Perold (2012) notes that despite the gains made since the 1994 democratic elections in increasing access to schooling, the 'pipeline feed' from schooling into post-school education and training has presented significant challenges. "Despite being young people who display resourcefulness, courage and resilience, they are unable to build on their asset base, and lack the means to find opportunities to nurture their talent, follow their passion and support their movement from schooling into further education, training and employment" (Graham, 2010).

Evaluation context

A sharp shift in methodology

Previously

Data collection was school based, so we had a "captive group"
Participants were current beneficiaries of the project
We had multiple opportunities for data collection

Extension after school

Alumni of the project were scattered and sometimes difficult to contact
They were no longer project beneficiaries – no obligation
Limited opportunity for data collection

Contacting the alumni (N = 238)

The project co-ordinators had collected contact information from the learners when they were in Grade 12 and so we had some cell phone numbers and email addresses to start with, but the time lapse made this chiefly retrospective tracking.

The following process was followed:

All alumni with email addresses were sent an email asking them to confirm the address, to provide a cell phone number, and, where possible, contact details of any other EI programme classmates they might know.

All alumni with cell phone numbers were sent an SMS with the same request – to verify their contact details and to help with contact details for others.

Contact data was obtained for 125 alumni


Data collection

An online questionnaire was generated, with a pdf version for those without easy internet access; the form could also be filled in using a cellphone. All those with contact details were contacted and asked either to fill in the form online, to complete it electronically and email it back, or to request a phone interview to fill in the form. Once a response had been received, a R50 cell phone recharge or a R50 grocery store voucher was sent to the respondent as a token of appreciation. Reminders were sent to alumni who did not initially respond. When the responses slowed and we began to get fraudulent entries in attempts to get additional vouchers, we closed the data collection.

104 valid responses were obtained, giving a refusal rate of 27%.


Some exemplar data – Occupation 2/3 years after leaving school

Occupation in 2012        N
Studying                  83
Working                   10
"Gap" year                6
Unknown                   6
Total                     104

Fields of study: Bachelor of Arts (7), Bachelor of Computing (1), Bachelor of Education (4), Bachelor of Law (5), Bachelor of Social Science (7), Bachelor of Science (12), Bachelor of Science (Pharmacy) (1), Bachelor of Science (OT) (1), Bachelor of Science (Engineering) (7), Medicine (4), Bachelor of Commerce (15), plus Bachelor of Commerce (Accounting), technical courses, and some courses not clear from the responses.

Types of work: educational services*, office and banking work, restaurant and catering, retail and sales. Some alumni were staying at home, and for some the information for 2012 was not clear.

*These students are also studying and so are counted under B Ed.

Some exemplar data – Progress in years after leaving school

Figure: activity in each year out of school (study, work, "gap" year, or no information) for every alumnus, by reference number, across the first, second and third years after school, shown separately for Cohort One and Cohort Two. Deeper shades indicate more years in that activity.

Results – response rate

By the close of data collection, 79 valid online forms had been completed, 23 responses were emailed in, and two questionnaires were filled in over the phone, providing a total of 104 responses. This was an 87% response rate from those tracked, and represented 44% of the alumni.

In a similar tracking study conducted in the United Kingdom (Boaler, 2012), strong claims were made based on a return rate of 22% (N=63), with Boaler claiming that "[d]espite the relatively low return, the sixty three young adults who responded were an interesting and important group to consider" (p. 47). We thus considered our return rate of 44% (N=104) to be satisfactory.

Results – representativeness of sample

How close is the group of 104 alumni respondents to the whole population of alumni? Responses were obtained from 16 of the 18 host schools in the project, which indicates good representativeness across schools. The comparison of mean scores in Table 1 shows that the respondents scored higher than the non-respondents in each of the EI focus subjects (Mathematics and English) and in a Critical Reasoning Test. Independent samples t-tests showed that these differences were statistically significant, with p values not exceeding 0.001.

Table 1

Comparison of mean scores

Subject             Group           N    Mean   S.D.
English NSC         Respondent      104  63.03  9.52
                    Non-respondent  134  56.81  9.47
Mathematics NSC     Respondent      104  58.58  15.94
                    Non-respondent  134  49.85  15.14
Critical reasoning  Respondent      100  48.81  16.85
                    Non-respondent  126  41.17  16.27
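The significance tests reported above can be checked from the summary statistics in Table 1 alone. A minimal sketch, using a pooled-variance two-sample t statistic with a normal approximation for the two-sided p-value (adequate at these sample sizes); the function name is ours, not from the study:

```python
import math

def t_test_from_stats(m1, s1, n1, m2, s2, n2):
    """Two-sample t test (pooled variance) from summary statistics.

    The two-sided p-value uses a normal approximation to the t
    distribution, which is adequate for the large df here (> 200).
    """
    # Pooled variance across the two groups
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided, normal approximation
    return t, p

# Table 1 values: respondent (mean, sd, n), then non-respondent (mean, sd, n)
table1 = {
    "English NSC":        (63.03, 9.52, 104, 56.81, 9.47, 134),
    "Mathematics NSC":    (58.58, 15.94, 104, 49.85, 15.14, 134),
    "Critical reasoning": (48.81, 16.85, 100, 41.17, 16.27, 126),
}
for subject, stats in table1.items():
    t, p = t_test_from_stats(*stats)
    print(f"{subject}: t = {t:.2f}, p = {p:.2g}")
```

All three p-values come out well below 0.001, consistent with the slide's claim.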

Results – representativeness of sample

The respondents had higher median scores in each measure.

The sample can be considered biased in that the respondents form a higher-ability group than the non-respondents.

Figure 1

Distribution of marks in key project areas for respondent and non-respondent alumni


Challenges, Issues and Lessons

Challenges:

Obtaining contact data
Obtaining quality data

Issues:

Incentives and fraud
Sample bias


1. Contact data

Ongoing tracking has to be foreseen throughout the evaluation, with time and effort dedicated to the construction of a robust database of contact details and backup contact details. It became clear that young people change their cell phone numbers and email addresses very readily.

Lessons

The tracking must be fully discussed with the students while they are on the programme so that requests for information do not come as a surprise. In this case the tracking was part of an extension to the evaluation and we had not cued the learners to expect requests for further information.

Backup contact data should be collected, for example the cellphone number of a family member.


2. Assuring quality data

Many responses were ambiguous and required careful reading of the storyline across years to work out the actual course of study. This was partly because many stories were genuinely complicated, with false starts and changes of direction.

Lesson

Care needs to be taken to make the questions very specific, possibly including some narrative that might clarify responses.

The budget should include provision for follow-up phone calls.


3. The issue of incentives

“Incentives improve response rates across all modes. The effect seems to be linear, larger incentives have bigger effects on response rates. Prepaid incentives, where possible, seem more effective in increasing response rates than promised incentives that are contingent on survey participation and money is more effective than gifts” (Berry, Pevar & Zander-Cotugno, 2008).

Incentives have been found to increase participation in low-income groups and may skew the sample composition, but there is no evidence that data quality is affected (Singer, 2002).

Lesson

The incentives we offered seemed to be attractive, and if they encouraged reluctant participants (those without good-news stories), we would consider that a good thing.


4. Fraudulent responses

These took one of three forms:

Fabricated personal data (a learner who had not written Matric and had left the project filled in that he was at university)
Fabricated data on other learners (and then giving cellphone numbers of their own family members)
Double dipping (a learner fills in the online form several times using variations of their name)
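Double dipping of this kind can be caught partly by machine. A minimal sketch, assuming each response carries a free-text name field; the names, function names, and similarity threshold below are illustrative, not from the study:

```python
import difflib
import unicodedata

def normalize(name):
    """Lowercase, strip accents and extra whitespace so name variants compare equal."""
    nfkd = unicodedata.normalize("NFKD", name)
    ascii_name = nfkd.encode("ascii", "ignore").decode()
    return " ".join(ascii_name.lower().split())

def flag_possible_duplicates(names, threshold=0.85):
    """Return pairs of submitted names that are suspiciously similar.

    Uses difflib's SequenceMatcher ratio on normalized names; flagged
    pairs still need manual verification before withholding an incentive.
    """
    norm = [normalize(n) for n in names]
    flagged = []
    for i in range(len(norm)):
        for j in range(i + 1, len(norm)):
            ratio = difflib.SequenceMatcher(None, norm[i], norm[j]).ratio()
            if ratio >= threshold:
                flagged.append((names[i], names[j]))
    return flagged

# Hypothetical submissions: the first and last differ only in case/spacing
print(flag_possible_duplicates(
    ["Sipho Dlamini", "S. Dlamini", "Thandi Nkosi", "sipho dlamini "]
))
```

The threshold trades false alarms against misses; cross-checking flagged names against cellphone numbers and school records, as we did manually, remains the decisive step.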

Lesson

Be alert, and delay incentives until data is verified.

Consider the level of incentive – ours may have been too attractive.

Limitation of the study: bias of the sample

Statistically, the sample of EI programme alumni from whom we obtained data had performed better than the non-respondents in academic tests in their final year of school. This bias of the sample towards successful students seems unavoidable for two main reasons: (a) successful students are more likely to be at university, where they have regular and reliable access to email, than unsuccessful students, who may have returned to their more rural homes; and (b) successful students are more eager to share their stories and provide feedback than unsuccessful students. The results have to be interpreted with this in mind.

Thank you for your attention

Contact details:

Dr Sally Hobden

[email protected]

Prof Paul Hobden

[email protected]
