Who will pay for lifelong learning?

The Australian’s Higher Education supplement this week (12 June 2019, p. 27 and p. 29) contained a pair of articles on workplace learning, both based on a Swinburne University of Technology survey of how workers say they prefer to undertake training. The survey results are intermixed with the usual speeches about the need for constant re-training to keep up with a rapidly-changing world.

The first article reports that “most workers (56 percent) also see themselves as most responsible for preparing themselves for the future of work”. When reading such stories, however, I’ve often wondered when we’re supposed to have the time and resources to do all of this re-training, given that we’re generally expected to be at our jobs for a full working week.

The second article, written by one of the report’s authors, instead states that “Overwhelmingly, all Australian workers … prefer learning on the job as the best way to prepare to work in digital environments”. The article goes on to advocate “learning-in-work”, in which learning happens in “disruptive work environments” instead of at university. The examples of “learning workers” given do not make clear what the actual learning was (“the agile team of cross-functional experts developing a new product” seems to have no concrete meaning whatsoever), but I take learning-in-work to refer to what is more usually known as on-the-job training.

Many employers do offer various sorts of on-the-job training, such as the numerous teaching and health-and-safety workshops that I’ve attended across my time at universities. These sorts of workshops are about improving (or at least sharing opinions on) workers’ existing practice, and are unlikely to transform, say, retail sales workers into university teachers, or teachers of English literature into data scientists, should the job market demand such transformations. Some more generous or patient organisations offer longer-term work/study programmes through which employees can obtain new qualifications, or allow staff to work part-time while they complete qualifications in their own time. But I just as often hear complaints that organisations are reducing training budgets, and one particularly memorable online comment, responding to an article about investing in staff, pointed out that job ads typically demand that applicants already be trained to the organisation’s exact specifications. I don’t know off-hand how many organisations fall into each category.

I’m in the fortunate position of being able to take myself out of full-time work in order to pursue projects as I see fit, some of which involve learning. I’m not expecting to transform myself into anything other than an academic and software engineer—though I suppose I have the time and resources to train for a whole new trade or profession if that were what I wanted to do—but I’ve long since stopped believing that learning yet another programming framework, or throwing ideas around at yet another teaching workshop, makes much difference to my development (though learning another programming framework might improve my employability, since that is what job ads typically ask for). I was already studying Malay while living in Singapore, and I imagine that running a consultancy will be a learning experience even if I don’t do any formal study on the subject.

More generally, I can see an argument for both on-the-job training and training leave, the former being for its current purpose of keeping workers up-to-date in their current practices and the latter allowing workers to develop themselves in other directions should they choose to do so. The hard part, in the present economic structure, might be finding someone willing to pay for it.