Will generative AI disadvantage new starters?

How will newer entrants to the public service learn and professionally develop in a world of generative AI?

Playing with ChatGPT, and now some of the GPT agents, raises some interesting questions about what’s going to happen to a lot of the work currently done. While I’m not going to say all our jobs will disappear overnight, it’s clear that generative AI is going to start changing our ways of working, and possibly alter them dramatically.

And on current trends I suspect a lot of those changes are going to be to the disadvantage of younger or more junior staff.

Traditional public service learning journey

For a lot of public servants, whether they enter via a graduate program or otherwise, the key challenge of understanding the public sector is getting across all of the ‘dark matter’ – the traditions, the values, the culture, the protocols, the precedents, and the underlying mindsets and unspoken rules. While much of the model of the policy process and other core practices is codified or made explicit, the reality is much more nuanced and complex (and sometimes disappointing).

To learn these things takes time and exposure and immersion in a range of tasks and meetings, and observing the dynamics and politics of bureaucracy and organisations. Part of the role of managers is to help ensure that these things are learnt and integrated, so that the next generation of public servants can navigate and handle the unique challenges of the public sector.

However, it’s not always easy to do so – sometimes it can feel easier for a manager to just do the task themselves, rather than work through it with junior staff and help them learn and identify what needs to be improved. Good managers don’t let themselves fall into that trap, but when things are busy or hectic, or there are added frictions and transaction costs to doing so, it can be an easy temptation to fall for.

Remote/flexible working adds potential frictions to that journey

One friction that already exists is the growth of flexible and remote working. There’s already some evidence that remote working disadvantages junior staff compared to more established or experienced ones. While there are many advantages (access to different skills and people, flexibility to suit people’s life situations, etc.), it also requires more conscious attention to some things that might otherwise be easier. Getting a read on someone remotely can be harder, the incidental and casual conversations that happen in a shared workspace are harder to replicate, and small tasks and asks can be harder or more awkward to arrange.

This isn’t to argue that the shift to remote and flexible working is bad, but it is to recognise that it adds frictions and requires compensatory efforts. These are manageable, if not always easy, but I would say most of us are still learning about how to help more junior staff who are remote learn and develop their skills.

Generative AI is good at quite a lot of small things

ChatGPT is already good at quite a lot of small tasks. It’s not brilliant or faultless, but with care it can help in a range of ways – ideation, scenario planning, drafting or editing, thinking through planning for a host of different types of work, and much more. For instance, a minute of prompting can produce something that might previously have taken half an hour or an hour of working through the different options with someone.

And with the current experimentation with AI ‘agents’, this could be expanded beyond simple tasks towards a more goal-oriented approach. For instance, asking an agent to ‘Create a consultation plan for a government agency responsible for employment policy about the future of work in a world of generative AI’ quickly produced a set of tasks and outputs.

This sort of capability (admittedly still early days) reduces the transaction costs of someone doing a job themselves rather than asking someone else to do it. It becomes much easier to just think of something and do it, rather than sitting down with someone and walking them through the steps involved, and more importantly, the logic and thinking and strategy behind it.

The reason this matters is that a lot of these tasks are Trojan horses for the embedded thinking, not just the outputs and outcomes. They are vital learning exercises that sometimes matter as much as the work itself. Building the future capability of the workforce is an investment.

But as generative AI gets better at the ‘easier’ things, it pushes the ‘human’ work to the more complex end, which, by virtue of sitting atop all these other capabilities, is likely to be less accessible to those newer to the public service or more junior. More of the dark matter risks becoming opaque to those joining, and the underlying logics and norms harder to internalise without immersion in the tasks that may become automated or pushed to generative AI. (Of course, care needs to be taken with the current generation of tools from a privacy/official information perspective, but I am assuming many of these tools will become integrated into enterprise environments.)

So what?

I’m not game to predict how these things will unfold. Nonetheless I’m going to suggest it’s already worth giving some thought about how to work with newer and more junior staff in a world where many of the previous ‘entry’ tasks might soon be done by machines. Like many things, I suspect we’re at the start of a big learning curve, and we’ll need to give extra attention to those entering the public service to help them navigate these big changes.
