Applied Sociology in Vocational Education

Oil-style painting of two White people in paint-soaked clothes. The woman on the left holds a hammer; the man on the right holds a paint roller. Their faces are not visible. The top third of the graphic carries the heading "Applied sociology in vocational education".

This is part one of two posts showing how applied sociology is used in a multi-disciplinary behavioural science project to improve social policy and program delivery.

Our randomised controlled trial (RCT) sought to improve outcomes for apprentices and trainees through a behavioural intervention. We separately visited learners and their employers to discuss contractual responsibilities and to set goals that were meaningful to the learner. Fortnightly emails to employers and text messages (SMS) to learners then reinforced these themes for three months. At the end of this period, we made separate phone calls to employers and learners to check their progress on goals and to work through any workplace issues. We then stopped further communication and analysed completion rates 12 months later. Although our intervention did not produce a statistically significant change in learner retention, we suggest that early, behaviourally informed support in the first 12 months can help learners persevere toward apprenticeship completion.

The following is an excerpt from the Behavioural Insights Unit Update Report 2020.

Increasing employer support to boost apprenticeship completions

Completion rates for apprenticeships and traineeships have remained steady for years. In 2015, the NSW Premier set a State Priority to increase apprenticeship and traineeship completion from 50% to 65%.

The BIU (Behavioural Insights Unit) worked with Training Services NSW (TSNSW) to support employers with low apprenticeship and traineeship completion rates. This included:

  • a face-to-face meeting between TSNSW Advisors, learners and their supervisors to set goals and discuss commitment to contractual obligations
  • fortnightly messages to employers and their learners to reinforce these aims
  • a follow up phone call after three months to check on progress.

Twelve months after our intervention, we found no statistically significant results. Learners (apprentices and trainees) who received treatment were no more likely to stay in their contracts than the control group.

Nevertheless, around 20% of learners who quit their first contract will keep studying. Our methods revealed a lag in cancellation and non-completion records and identified numerous ways to improve customer service through enhanced data collection and technology.


In 2010, the cost of NSW trade apprenticeship non-completion to the state and federal governments was $91 million, and the total cost of non-completion, including forgone productivity plus budgetary impacts, was $348 million (Deloitte Access Economics, 2011). Long-term data indicates that the overall completion rate is stable and potentially hard to shift.

Since late 2015, the BIU has worked closely with TSNSW to design and deliver a range of interventions to increase the proportion of people completing apprenticeships and traineeships.

We conducted 50 fieldwork interviews with learners, employers, registered training organisations (including TAFE NSW) and other stakeholders. This showed two key barriers to completions:

  1. Lack of employer support: While some employers strongly support their learners, others lack the time, resources or skills to supervise and mentor learners effectively, leading to demotivation and lower completions. Learners spend 86% of their time in the workplace (and the other 14% at their registered training organisation, such as TAFE). Employers with low completion rates tend to be reluctant to invest time in training their learners, which limits learners' opportunities to apply the skills they have learnt at work.
  2. A significant disconnect between study at TAFE and the workplace: Learners often do not communicate what they are learning to their employers. Consequently, employers do not value the training their learners receive and do not give them a chance to practise new skills (BIU 2018).

We ran an RCT to test whether expanding the support offered by Training Advisers (TAs) to apprentices and trainees (learners) whose employers have historically low completion rates could improve completions. We hypothesised that this employer support would increase the rate at which learners completed their apprenticeships or traineeships. We focused on employers whose completion rates over the past five years were below 42% (the NSW state average at the time was 47%).

We started with a sample of 343 employers who, along with their 2,229 learners, were randomised into two groups (treatment and control). Four months later, we added 273 learners who had started working with these employers after the trial began. After excluding learners who had cancelled their contract or completed their study before the trial started (that is, we lost around 20% of our sample by the time treatment began), 1,975 learners remained in the final analysis.

  1. Treatment (n = 906): these employers and learners received an expansion of support offered by TAs, including:
    • One site visit from a TA to see all the supervisors and learners (to review contractual responsibilities and establish agreed-upon goals for learners).
    • Fortnightly communications for a period of three months encouraging them to persevere with the learners’ goals and their mutual contractual obligations (text messages from the TA to the learner, and fortnightly emails from the TA to the learner’s supervisor).
    • A final phone call three months after the site visit, to check on progress on the learners’ goals, and troubleshoot any issues.
  2. Control (n = 1,069): BAU (business-as-usual). This included help already available to learners and employers who proactively request support, but no additional site visits, texts, emails or phone calls.
Figure showing the intervention process with icons of workers and staff. There are five stages: 1) Poorly performing employers are allocated a Training Adviser (TA). 2) The TA visits the learner and employer. 3) The TA communicates with the learner via SMS. 4) The TA communicates with the employer via email. 5) The TA phones the learner and employer after three months to check progress.
Figure: Behavioural intervention (‘treatment’)
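The allocation shown above was done at the employer level, so a workplace was never split across conditions. A minimal sketch of that kind of cluster randomisation (hypothetical identifiers and seed; not the BIU's actual code):

```python
import random

# Hypothetical employer-level (cluster) randomisation: each employer is
# assigned to one arm, and every learner inherits their employer's arm,
# so workplaces are never split across treatment and control.
random.seed(2017)  # fixed seed for a reproducible allocation
employers = [f"employer_{i:03d}" for i in range(343)]
arm = {e: random.choice(["treatment", "control"]) for e in employers}

def arm_for_learner(employer_id: str) -> str:
    """Learners who join an employer mid-trial inherit its arm."""
    return arm[employer_id]
```

Because assignment lives with the employer, the 273 learners added four months into the trial could be folded into the correct arm without re-randomising.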

What we learnt

Our intervention had no impact on whether a learner stayed in a contract or completed their study during the contract. Our primary measure was the 12-month retention rate of learners. We followed learners’ outcomes by tracing the first contract they had already started at the time our trial began in November 2017.

Image showing two speech bubbles. The first says: "Hi Lucy, it was great meeting you last week. This is just a reminder that if you need any help or have any questions I'm here to support you." The second says: "Hi Lucy, just a reminder that the work goal that you have agreed to work on for the period is: 'Always arriving on-time for work.' Thanks, Levini."
Figure: Sample of texts prompting goal attainment

We found our intervention had no effect on the likelihood that a learner would complete their course or continue studying. Around 52% of learners in the control group had either completed their course or were still in their first employer contract when the trial ended (June 2018), compared with around 51% of learners in the treatment group.

Graph. On the left, Control is shown to have completed or stayed at 52.3%. On the right, Treatment is at 50.9%.
Figure: Percentage of learners who completed or stayed in their contract, by condition
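As a rough check on why this 1.4 percentage-point gap is not significant, here is a two-proportion z-test on counts reconstructed from the reported percentages (an illustrative sketch only; the BIU's actual analysis would also account for clustering by employer):

```python
import math

# Illustrative two-proportion z-test using counts reconstructed from the
# reported figures (52.3% of 1,069 control; 50.9% of 906 treatment).
n_c, n_t = 1069, 906
x_c = round(0.523 * n_c)  # control learners who completed or stayed
x_t = round(0.509 * n_t)  # treatment learners who completed or stayed

p_pool = (x_c + x_t) / (n_c + n_t)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_c + 1 / n_t))
z = (x_c / n_c - x_t / n_t) / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"z = {z:.2f}, p = {p_value:.2f}")  # well above the 0.05 threshold
```

On these reconstructed counts the test gives z of about 0.6 and a p-value above 0.5, consistent with the null result reported above.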

Our intervention had no impact on employer performance

At the employer level, 62% of control-group learners had either completed or were still studying, compared with 60.5% of learners in the treatment group.

Graph. On the left, Control is at 62.3%. On the right, Treatment is at 60.5%.
Figure: Rate of completion or staying in the trial contract by employer, by condition

Around one-fifth of learners who cancel a contract will go on to start a new contract

A cancelled contract is not the end of a learner’s journey, as 16.3% of learners in the treatment group went on to start a new contract after cancelling their first (and 15.9% of learners in the control group did the same).

Graph. On the left, Control is shown at 15.9% starting another contract after cancelling. On the right, Treatment is at 16.3%.
Figure: Percentage of learners who started another contract after cancelling, by condition

Our trial methodology identified data and service delivery issues that may have impacted our results

  1. Learners who cancel their contracts tend to leave within the first three to six months of their first year. Our intervention started in November, meaning those who would have benefited most from the intervention had already cancelled by June. Without this trial, we could not have identified this issue of timeliness of support.
  2. To achieve a sample large enough for statistical power, we included learners from first to third year. Given the trial subsequently showed that most learners who cancel do so in their first year, our sample captured behaviours from learners at very different stages.
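To illustrate the power consideration in point 2, a back-of-envelope minimum detectable effect (MDE) for two arms of roughly this size can be computed with the standard formula. This sketch uses approximate figures from the trial and ignores employer-level clustering, which would make the true detectable effect larger:

```python
import math

# Back-of-envelope MDE for a two-arm trial: smallest difference in
# retention rates detectable at 5% significance with 80% power.
z_alpha = 1.96   # two-sided 5% significance
z_beta = 0.84    # 80% power
n_per_group = 950   # roughly the learners per arm in this trial
p_baseline = 0.52   # approximate control retention rate

mde = (z_alpha + z_beta) * math.sqrt(
    2 * p_baseline * (1 - p_baseline) / n_per_group
)
print(f"MDE ≈ {mde:.3f}")  # roughly a 6 percentage-point change
```

Under these assumptions, only a shift of around 6 percentage points or more would have been reliably detectable, which helps explain why learners across all three years were pooled into the sample.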

TSNSW endorsed our trial recommendations, which included:

  • sending timely messages within the first six months of the first year
  • enhancing processes using technology, including improved data collection, analysis and auditing of cancelled contracts.

Next steps

In line with our recommendations, in February 2019 we began testing an SMS intervention that encourages learners to proactively seek information and support when they encounter issues, rather than simply dropping out. Based on the insight that learners tend to cancel in their first year without seeking help or notifying Training Services NSW, this trial includes all first-year apprentices and trainees in NSW (n = 13,100).

Interim results

For this trial, we are measuring completion rates as our primary outcome. While final data is not yet available, we have found encouraging short-term engagement results. Two intervention groups are receiving messages with links to existing online resources on the TSNSW website, as well as the option to call their local TSNSW office for further support (n = 8,500). The control group (n = 4,600) receives BAU, which means no text messages, though they have the same access to the online resources and can phone their local office for support.

In the first half of 2019, we sent the two intervention groups three messages. This led to 4,400 clicks through to the TSNSW website for information on workplace rights and financial entitlements, plus almost 400 phone calls and a further 400 inbound text messages seeking help. The texts prompted learners to tell TSNSW about a range of issues they would otherwise not have raised, such as unfair dismissal, lack of action on their learning plan, and the need for financial assistance. We continued messaging learners until the end of 2019 (a further three texts) and are now testing whether this engagement translates into completions.

Read the rest of the BIU report to learn about other trials and projects.
