Science fiction has been trying to warn us for years that we are heading down a dangerous path. 2001: A Space Odyssey, The Matrix, I, Robot, The Terminator... and the list goes on and on (and those are just the popular movies). Our culture is full of stories where tech is not working in our best interest. To be fair, it is not the technology’s fault – humans created the technology, programmed the algorithms (or AI), and set it loose on the world. They just failed to see the consequences of their actions until it was too late.
But this is all science fiction… right?
Gunpowder was invented in China around the 9th century for use in fireworks – a few centuries later, someone figured out it could be used for other, more lethal applications. I wonder what the original inventors would think of how their amazing technology ended up being used…
For that matter, what about the Wright brothers – do you think they considered dropping bombs from their amazing flying machine? Or Tesla – was he thinking fighter or spy drones when he introduced the first remote-controlled vehicle?
It’s not the technology, it’s how you use it
The truth is, most technology out there is ‘dual use’ – it can be used to create or destroy. A knife can cut a steak or stab someone, a molecular compound can be injected into the body to deliver life-saving drugs or a lethal poison, and social media platforms can keep people connected, share news, and raise awareness for amazing causes – or they can be used to manipulate the populace, steal information, and scam thousands.
What is most striking is not when bad people do bad things, but when good people, or people with good intentions, inadvertently misuse technology.
At the HR Technology Conference (#HRTechConf) earlier this month, Randi Zuckerberg gave a fantastic keynote that addressed several major issues, including her point that “we need technology to save us from technology” – a response to some concerning tech trends she has noticed. For example, virtual reality is amazing for realistic training or walking through construction sites, but cross that with modern war-simulation games, and what will be the consequences for our kids?
The road to hiring hell is paved with good intentions
The evidence is already all around us. If you are in a public place or an office, look up and look around. How many mobile devices do you see?
At the conference, the number of phones out was staggering. People were looking at them while they walked, in sessions, during meals. Here was an event that pulls experts, partners, and practitioners from around the world to explore new HR technologies, and it was being missed in favor of email, texts, and browsing. I wonder how many opportunities to meet impactful people, have great conversations, or discover new technologies were lost because of the device in front of someone’s nose (there is a movie for that too – WALL-E).
In hiring, these unintended consequences come in many forms: AI solutions that tout removing bias but merely automate it, an ‘easy button’ user experience that confuses a good experience with no effort, assessments that map out the ‘perfect hire’ in a way that stifles diversity. There are many paths that will land your company or your candidates in hiring hell.
Tech for good
Luckily, the paths to hell are avoidable. It all starts with awareness and planning. If you know what you need, know what the technology will do, validate your designs and assumptions, and plan your program accordingly, you will greatly reduce the risk of unintended consequences.
Understand your problems and goals FIRST, then select technologies and solutions to solve them. The HR tech space is full of shiny new objects, grand claims, and alluring promises. But the most important solution is the one that solves your problem – so know your problem first, and what success looks like when it is solved. Otherwise you could end up with a shiny new toy that no one uses, or one that introduces new problems you will then have to solve.
Do the due diligence. Understand what the technology is doing, how the algorithms are created, where the data comes from, and what the validity and adverse impact data look like. Do not be afraid to ask the questions, get outside help from a third party, or walk away if you aren’t getting good (or any) answers.
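To make “adverse impact data” a little more concrete, one common starting point is the four-fifths (80%) rule: compare selection rates across groups and flag any group whose rate falls below 80% of the highest group’s rate. Here is a minimal sketch in Python – the group names and numbers are made up for illustration, and a real review would go well beyond this single check:

```python
# Minimal sketch of a "four-fifths (80%) rule" adverse impact check.
# All group names and counts are purely illustrative, not real hiring data.

applicants = {"group_a": 200, "group_b": 150}  # applicants per group
hires = {"group_a": 50, "group_b": 18}         # hires per group

# Selection rate for each group = hires / applicants
rates = {g: hires[g] / applicants[g] for g in applicants}

# Impact ratio = each group's selection rate / the highest group's rate;
# a ratio below 0.80 is a common flag for potential adverse impact.
highest = max(rates.values())
for group, rate in rates.items():
    ratio = rate / highest
    status = "POTENTIAL ADVERSE IMPACT" if ratio < 0.80 else "ok"
    print(f"{group}: selection rate {rate:.1%}, impact ratio {ratio:.2f} -> {status}")
```

If a vendor cannot walk you through numbers like these for their own tool, that is a signal in itself.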
Embrace pilot programs and measure results. New solutions and technologies carry risk, and risk is not bad (without risk, there is no reward), but that doesn’t mean you shouldn’t minimize it. Design pilot programs with clearly defined metrics of success, then measure that success. Not only will it validate your approach (or save you from a bad one), it is also a great way to increase adoption and generate internal excitement.
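As a rough illustration of “define metrics, then measure,” here is a tiny sketch comparing a pilot group against a control group on a couple of hypothetical metrics (the metric names and numbers are invented for the example):

```python
# Sketch: compare a pilot group to a control group on predefined success metrics.
# Metric names and values are hypothetical, for illustration only.

control = {"avg_time_to_hire_days": 42.0, "offer_accept_rate": 0.68}
pilot = {"avg_time_to_hire_days": 35.5, "offer_accept_rate": 0.74}

for metric, baseline in control.items():
    result = pilot[metric]
    change_pct = (result - baseline) / baseline * 100
    print(f"{metric}: control={baseline}, pilot={result} ({change_pct:+.1f}%)")
```

The point is less the math than the discipline: pick the metrics before the pilot starts, not after.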
Realistic roll-out plan. One of the biggest failures of new technology programs has nothing to do with the technology or the solution – it is the roll-out plan. Not enough time, not enough resources, not enough planning. According to Forbes, 25% of new tech programs fail completely, and another 25% fail to provide any return. Why? Over half the time it is due to poor planning. So take your time, plan well, and don’t hesitate to lean on your vendor partners and peer community to help along the way. It is amazing what you can learn from others’ experience – failure is important for learning, but it doesn’t have to be your failure.
There is no need to be a victim of technology – at least not when it comes to hiring. Now, if only we could figure out how to pry phones away from our peers long enough to have a real conversation or avoid creating AI that becomes Skynet (Terminator).