AI, Algorithms, and Who Owns the Outcome (Video)

Video with full transcript

 
A talk by John Sumser at Google with a follow-up discussion panel

Presenter & Moderator: John Sumser
Panelist: Richard Rosenow, Head of People Analytics for Facebook
Panelist: Derek Zeller, Director of Recruiting Solutions for Engage Talent
Panelist: Jeff Dunn, Campus Relations Manager for Intel
Panelist: Heather Bussing, Employment Attorney, Law Offices of Heather Bussing

This is a talk I gave at Google’s Mountain View campus at the end of 2018 on AI, algorithms, and intelligent software in recruiting.

After my presentation, I moderated an expert panel to find answers to the big questions about AI and intelligent software coming into recruiting and HR.

I’d love to hear your thoughts on the talk and panel in the comments, and you can watch more of my talks and interviews on my YouTube Channel »

 

 

Talk and Panel with full transcript

 

AI, Algorithms, and Who Owns the Outcome

File Length: 00:51:01

SPEAKERS

JS – John Sumser (Presenter and Panel Facilitator)
JD – Jeff Dunn
DZ – Derek Zeller
RR – Richard Rosenow
HB – Heather Bussing
AM – Audience Member
 

Transcript

 

Important: Our transcripts at HRExaminer are AI-powered (and quite accurate) but there are still instances where the robots get confused and make errors. Please expect some inaccuracies as you read through the text of this conversation. Thank you for your understanding.

FULL TRANSCRIPT (with timecode)

00:00:00:07 – 00:02:39:21
Thank you Stephen. Thank you. Thank you everybody for turning up. Thanks. Google. Anybody else I need to thank. Oh God. Wait till you see this panel. Holy crap I just want to get to the panel because it’s four of the most cantankerous and opinionated people I know. And they don’t agree on anything. All right. And so I’m hoping to organize a food fight with them once I’m done with my little spiel here.

So I am John Sumser, and I run something called HRExaminer.com, which, if you're not familiar with it, please stop by the website. We publish two or three interesting articles every week about the edges of HR, and an annual analysis of the state of the art of intelligent tools in HR technology. I use the phrase intelligent tools advisedly. I have yet to see any A.I. I see in its place a lot of machine learning, and that's math, right, and it's very interesting math. But there's nothing intelligent about it. What machine learning does is it learns all about the past and synthesizes the past into something that you can use today. But it's only as good as the past. It's absolutely only as good as the past, and what I'm looking for in intelligence is something that's able to imagine a future and help you navigate the future.

So my deal with these sorts of talks is I always want to tell you what the takeaways are, so you don't have to stay awake for the rest of it. There are five things; if you get two or three of them, it's worth your time.

The first one is that models are simplifications, and so everything that you see in intelligent technology is some mathematical model of some reality. And it's always going to be less comprehensive than the reality itself. The theory is that all you really need to do is get a model up to the point where the results stream is good over time. But the risk in using models to understand things is that you miss factors that you don't understand right now, and being in the early stages of this technology, there's a high likelihood that we don't have the slightest clue about what we're doing and we're making some pretty big mistakes. And so the idea that models are simplifications is a reminder of something that's the third piece here, which is that machines have opinions. It was the case up until four or five years ago that you put data into the machine and what came back out of the machine was your data.

00:02:41:07 – 00:02:55:10
What happens today is you put data into the machine, or the machine collects data in some way, and what comes out is now data filtered through a data model. And so the data model is the machine's opinion. And like all opinions

00:02:56:18 – 00:03:20:00
You need more than one of them to make a decision. You can't trust a machine to have the right opinion. You can't. You shouldn't. It's a bad idea. We're at a stage where having really good questions is way better than having answers. And that's a change in management; it's a change in the way that we think about work.

00:03:20:02 – 00:03:49:09
You know, a lot of the stuff we've heard about today is based on the idea that work is something that you can precisely define, and when you get it precisely defined, you can understand exactly who would go into that precisely defined thing. Well, let me tell you: if you can precisely define it, it's going to be automated. Right. And so that work of precisely defining stuff is really the first stage in the automation of layers of work. And what happens

00:03:51:02 – 00:04:36:03
Is that people have to sit back and watch that. When you automate work that way, and that's how we're going to do it over the next 50 or so years, we're going to automate by understanding better and better what people do, you have to watch it. You have to supervise that process, and one supervises that process by having good questions, not by having answers, because the machine is in the process of getting the answers. It's always going to be the case. You know, we have a kind of an interesting example of this at the national level right now, where paying attention to the details is a complete and inherent part of decision making. And so one of the things I'll talk about is the importance of paying attention to fundamentals. And the last thing is

00:04:40:16 – 00:05:08:04
It's not really hyperbole: what we're going to be managing is as different from the industrial world that our management models are built on as the industrial world was from the agricultural world that preceded it. And so we're going to have to learn how to think about management in radically new realities fairly quickly, because the very first thing that we have to manage is these machines that are somewhat intelligent

00:05:10:05 – 00:11:31:22
So we're here talking about talent acquisition. One of the things I do every year is give an orientation talk at the 8th Circuit conference, and so when it comes to talking about talent acquisition, this is the slide I use. There are 30 observable areas in which software is being developed as a part of talent acquisition. Right. So when you hear the idea that what's going to happen is we're going to build a single system that does all of these things, or that what you're going to do in your company is link all of these things together, it requires a level of expertise that I don't think most companies have. Right. These are like spices in the spice cabinet. And you have to know whether you're cooking Mexican, Indian, or Thai before you go into the cabinet and start building stuff.

I love fruit salad. I love a big bowl of cold fruit salad in the refrigerator, and the thing that I like most about it is how all the pieces of fruit taste good together. When a machine analyzes this to see what it is, it gets this. It's fantastic. It gives you a count of all of the fruits and all of the pieces. And you'll notice that the bowl has polka dots, and there's a pile of polka dots on the side. That kind of error is the first kind of error that you see here. The other thing is that there was a decision made about what to count, and if you're like me and you really love fruit salad, what you know about fruit salad is that it's the juice in the bottom. That's the whole thing. That's the best part, particularly if it gets a little bit of whipped cream in there. Oh, they didn't count the juice. Right. And so the model was simplified. The model adequately allows you to write the next recipe, except for the messy thing about counting a bowl and a spoon and a pile of polka dots, or you could figure out how to cut this recipe by 66 percent and make a smaller batch of it. It just doesn't get at the thing that's most wonderful about fruit salad, which is the experience of it. So this is a problem that every data model has. This is a problem that every single mathematically engineered machine learning model has: it only looks at what is measured. Right, and so you make early decisions in your process about what you're managing and measuring, and those are the limits. The place where these things make errors is in the things that don't get measured, the things that you have to have to make the measured things come together. Right. And so that's the fruit salad. What this is is the machine's opinion of what the fruit salad is. That's the opinion right there. The machine has this opinion.
And if you just took all of those pieces, you'd get something close to a fruit salad. But you wouldn't get a fruit salad, because it wouldn't have the juice. And so the question that you have to ask when you're overseeing these tools is: what's missing? What is this thing doing, given that it doesn't have all of the bits of data that it needs to have a comprehensive view of the universe? Now, maybe it doesn't need a comprehensive view of the universe, but maybe it does. And that's the question that you have to ask about each data model that you encounter. This gets crazy.

You know, we were talking about GDPR earlier in the day. Does everybody know about the new California privacy law? It's kind of GDPR for California, and what it means is that every company in the United States is going to have to abide by GDPR-level criteria in their privacy practices, and that gets right at who owns the content. It's my recipe in your database. It's my resume in your database. And is it? So I've been wrestling with analogies, and I don't have any good ones yet, but it's sort of like if I fall out of an airplane into your snowbank, and I get up and walk away, who owns the imprint? Right. That's the question. Here, my data in your system has some sort of an impact, and if I want it removed, if I want to exercise my ownership rights, how far into your data does my ownership extend, because you built on the basis of me? Real problem, real question that's going to be an issue. Then, after that: who owns the insight that's inside of the thing? Do the people who wrote the algorithms own the insight? So if you buy from a vendor and their analysis of your data gives you some insight, is it their property? Is it your property? Whose property is that insight, and what's owned and ownable? And then this whole universe of who owns the data is way more in play than you'd think. And it's not like it once was. We started here in a process where employers owned everything about the employee. You know, 150 years ago slavery was outlawed, but we've continued the ownership of human beings pretty consistently; a lot of the models we use to think about management have to do with ownership. We call people assets or capital. This is all part of a change that's going to happen, where we learn how to think about people as something other than property.

00:11:35:08 – 00:11:47:20
The next piece of this is who makes the decision. So imagine that you've got a thousand resumes, and you put them into one of these machine learning hoppers, and 100 resumes pop out the bottom.

00:11:47:18 – 00:12:49:16
And you move on with the hundred. Who made the decision? Who cut those people? Was it the vendor or was it you? And if there is an error in that process that results in some sort of civil liability, for discrimination say, is that your problem or theirs? All right, so there's a really big opening question about who's got the liability in this whole thing. And I'll tell you that the software companies will all say the employers have the liability. And I'm starting to hear that software companies won't sign licenses where they accept responsibility for their data models. But if they don't accept responsibility for their data models, I can't imagine that this business is going to go very far, because you can't ask the customer to take responsibility for your thinking unless they have the ability to change it.
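[Editor's note: the thousand-in, hundred-out funnel described here can be sketched in a few lines of Python. The score field and data below are invented for illustration; real screeners use learned models rather than a single number. Keeping an audit trail of every cut is one way to make the "who made the decision" question answerable.]

```python
# Sketch of the "1,000 resumes in, 100 out" funnel described above.
# The score field and the data are invented; real systems use learned models.
def screen_resumes(resumes, score_fn, keep=100):
    """Rank resumes by score, keep the top `keep`, and record an
    audit trail of every decision so cuts can be reviewed later."""
    ranked = sorted(resumes, key=score_fn, reverse=True)
    kept_ids = {r["id"] for r in ranked[:keep]}
    audit = [{"id": r["id"], "score": score_fn(r), "kept": r["id"] in kept_ids}
             for r in resumes]
    return ranked[:keep], audit

# Toy data: 1,000 resumes, each with a made-up numeric score.
resumes = [{"id": i, "score": (i * 37) % 1000} for i in range(1000)]
kept, audit = screen_resumes(resumes, lambda r: r["score"], keep=100)

assert len(kept) == 100                      # the machine kept 100
assert sum(r["kept"] for r in audit) == 100  # and the audit trail agrees
```

If a cut is later challenged, the audit list shows exactly which records were rejected and at what score, which is the kind of record the liability question turns on.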

00:12:53:17 – 00:13:06:13
So, on this who-has-liability question, one of the panelists, Heather Bussing, has a whole fistful of opinions about who has liability in employment decision making. We'll talk about that more in the panel.

00:13:06:13 – 00:14:13:03
But this question again is who owns the result. You know, it's maybe a little easier to understand if a Tesla runs over somebody: is it Tesla's fault or the driver's fault? That same ethical question is going to percolate through all of the places where we bring intelligent tools into our systems. So the last piece that I want to leave you with is that we're headed into a time where managing intelligent tools is going to be what we do, and it's different than managing people, and it's different than anything that you've ever heard about. It's my view that within five or six years, most scaled companies will have a library of data models for each person. There might be 10 or 15: one of them might have something to do with retention and attrition, promotability might be another, another might track learning gaps

00:14:13:20 – 00:14:35:01
And each one of these models is probably going to be the product of some process that isn't inherently directly aligned with all the other processes. So the models are going to have differing opinions about the person. It will be a committee of machines with differing opinions about the person.

00:14:35:03 – 00:15:40:16
So that's the first thing: we have to figure out how to use that data that will have conflicts, and it isn't any different than managing 360 degree feedback kinds of things; you have a bunch of different opinions that you have to weigh somehow. But there is this next thing, which is that all data models wear out. They wear out. And they wear out in the following way. When you build a data model, it's set to learn something. It turns chaos into order, over and over. That's what they do. Eventually the environment has lots of order in it, and when it has lots of order in it, the model stops learning. And when it stops learning, all sorts of bias can float in from places you didn't expect. What you want to be able to do is figure out how to keep the model learning. And so when it wears out, like a pair of tires, you have to replace it. So now you're talking about 10 or 15 data models per employee, plus replacements

00:15:40:13 – 00:16:15:17
Plus some work on what it takes to make the models better. And that starts to look like, well, this thing is an incubator for stem cells. It holds like 5,000 test tubes and incubates the stem cells inside of these specialized tools. I think we're going to have things like that, and I've seen some early work like that, where every data model has

00:16:16:11 – 00:16:27:12
A set of attributes that you’re monitoring about it and when the dashboard indicates that the data model is not as useful as it might be there’s a replacement being developed underneath it.
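[Editor's note: the tire-wear idea can be made concrete with a small monitor. The window size and accuracy threshold below are arbitrary illustrative choices, not recommendations; a real dashboard would track several attributes per model, as described above.]

```python
# Sketch of the "dashboard" idea: watch a model's rolling accuracy and
# flag it for replacement once performance degrades past a threshold.
from collections import deque

class ModelMonitor:
    def __init__(self, window=100, threshold=0.80):
        self.window = deque(maxlen=window)   # recent prediction outcomes
        self.threshold = threshold           # minimum acceptable accuracy

    def record(self, correct):
        self.window.append(1 if correct else 0)

    def rolling_accuracy(self):
        return sum(self.window) / len(self.window) if self.window else 1.0

    def needs_replacement(self):
        # Only judge once the window is full, like waiting for tread wear.
        full = len(self.window) == self.window.maxlen
        return full and self.rolling_accuracy() < self.threshold

monitor = ModelMonitor(window=10, threshold=0.8)
for outcome in [True] * 8 + [False] * 2:   # 80% accurate: still acceptable
    monitor.record(outcome)
assert not monitor.needs_replacement()
monitor.record(False)                      # slips to 70% over the window
assert monitor.needs_replacement()         # time to develop the replacement
```

When `needs_replacement()` flips, the replacement model being developed underneath can be promoted, which is the automated process the talk anticipates.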

00:16:27:17 – 00:16:43:03
And these can be relatively automated processes, but we're going to have to learn how to think about them, and we'll learn how to deploy them. So, and I didn't hear this earlier in the day, the whole key here

00:16:44:00 – 00:18:01:20
Is that if your data isn't governed properly, you're shit out of luck. 'Scuse me out there. I'm sorry if that offends. The last decade or so of SaaS software wasn't something that you could alter at the core, but you could customize workflows like nobody's business and you could name fields like nobody's business. And so if you go out and look at your organization, it's not unusual to find recruiting departments with a couple of hundred different workflows, with different nomenclature for the same thing across all of those workflows, and a learning system can't learn anything when the same thing is named differently. There are some tools emerging that might help with that, but they're a little far away. If you want to get started with this stuff now, you have to do a data governance process, and the data governance process looks like getting all of the stakeholders who have names for stuff together, to the point where you can make decisions about standardizing on names for stuff, so that you can start having larger volumes of data to solve problems with.
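[Editor's note: a minimal sketch of that standardization step. The synonym table is invented for illustration; in practice the canonical names come out of the stakeholder meetings described above.]

```python
# Sketch of the data-governance step: stakeholders agree on one canonical
# name per concept, then every workflow's fields are mapped onto it.
CANONICAL = {
    "req_id": {"req_id", "requisition", "job_req", "opening_id"},
    "candidate_name": {"candidate_name", "applicant", "cand_nm"},
    "source": {"source", "src", "channel", "referral_source"},
}

# Invert the table for fast lookup: raw field name -> canonical name.
LOOKUP = {raw: canon for canon, raws in CANONICAL.items() for raw in raws}

def normalize_record(record):
    """Rename a record's fields to canonical names; keep unknowns as-is."""
    return {LOOKUP.get(k.lower(), k.lower()): v for k, v in record.items()}

rec = {"Job_Req": "R-1042", "Applicant": "Dana", "SRC": "employee referral"}
assert normalize_record(rec) == {
    "req_id": "R-1042",
    "candidate_name": "Dana",
    "source": "employee referral",
}
```

Once every workflow's records pass through the same mapping, the "same thing named differently" problem disappears and the learning system sees one pooled dataset.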

00:18:01:23 – 00:18:17:00
So the very first thing that you have to do to really get this going is take care of, manicure, your data. And the second thing is you have to have a problem to solve. This was covered pretty well during the day: having a problem to solve.

00:18:17:06 – 00:18:25:20
The way I learned how to use spreadsheets and that sort of thing is not by understanding what they do but by trying to do something with them.

00:18:25:22 – 00:18:34:14
So having an internal problem that you’re trying to solve is a much better way of getting down the road with these tools.

00:18:34:14 – 00:19:08:01
So, I was in Japan last week and I'm a little jet lagged from it. But one of the things I learned that I want to leave you with as a closing thought is that HR and recruiting are both radically different based on the problems that you're trying to solve. In Japan, annual attrition is about 4 percent, and people make two or three job changes over the course of their life. And so the volume

00:19:08:19 – 00:19:44:03
Of resumes and the volume of work is much more compact, and branding in Japan is almost always about what a great place our company is to work, because when people join an organization for life, this is a big deal. So if you were to take American views of recruiting to Japan and try to teach them how to do employment branding, they're already doing it; try to teach them how to do employment value propositions, and they don't have the problem. And so what I've learned is something I've always thought was true: recruiting is

00:19:44:19 – 00:20:07:23
At least culturally specific and probably specific to your company. And so what you have to watch as this technology rolls out is that the way vendors have to organize to make money is by believing that there's a standard set of answers to standard problems, and the way that you have to differentiate competitively is by not believing that.
 

*** Panel Begins ***

 
00:20:08:02 – 00:20:26:07 | JS
And so there is an inherent tension in the relationship that you're going to have with providers of this technology. So with that, I'm going to ask my esteemed panel to come up, and while they're coming up I'm going to introduce them. So Jeff Dunn, who is

00:20:27:15 – 00:20:40:14 | JS
Looking very Intel like happens to be the guy who runs Intel’s college recruiting operation. Heather Bussing, who I know personally, because I’m married to her.

00:21:34:03 – 00:21:37:22 | JS
Huh. Huh.

00:21:39:04 – 00:22:06:22 | JS
Yeah. Yeah. His middle name is "the man," and besides being a writer, he currently does recruiting solutions and channels for Engage Talent, which is a company that does the most interesting thing: they can predict the likelihood that you will be willing to take my call about a new job, without knowing anything about the insides of your company.

00:22:07:02 – 00:22:34:07 | JS
And so they have this theory that a company is like a microphone. You know how microphones work: you can talk into them, but you can also use them as a speaker; sound will come out of a microphone. They think that companies are like that too: their internal processes are visible externally if you just understand what data to look at. And so it's an interesting thing. So let's start. These are all the questions; you can tell if you want to go or not.

00:22:34:14 – 00:22:45:14 | JS
Oh, so Jeff's question is first. The way we're going to do this is the named person gets tossed the ball, and then it's the food fight.

00:22:45:19 – 00:22:49:19 | JS
Right. And let me give you the... actually, you can pick who starts it.

00:22:49:20 – 00:22:55:10 | JS
So salad.

00:22:55:20 – 00:23:24:23 | JS
Yeah. Exactly. Exactly, whipped cream, whipped cream. So the question is: everybody who's built a resumé in the last 25 years has been coached in how to beat the system. And there's a pretty solid argument that the reason that you have a resumé is so that you can beat the system. And now there are new evaluation tools in place that are not keyword oriented. How do you coach people to beat the system?

00:23:26:10 – 00:24:36:04 | JD
So it's a moving target. First of all, the idea of just putting in some keywords so that it comes up in your search results is going to go away as these models change. So it's more than loading up the resumé with words and phrases and all the synonyms we've been talking about. It's showing results and accomplishments and more numbers. If you don't know exactly what the system is trying to catch, you're going to dump in more, and so you have more likelihood of sticking, and that's going to come up on somebody's radar. It also involves not only putting down what you think they're looking for, but going in and doing some intelligence gathering. Go talk to the people who are putting these together and say: tell me about your company culture, tell me what you're looking for, tell me what your process is. And so networking, everyone knows, when you're looking for a job or you're looking to fill a job, networking becomes even more important than it is now, because you want to connect to those people who in some cases will completely bypass this screening evaluation process and get your resume on the decision maker's desk.

00:24:36:23 – 00:24:42:03 | JS
So what you're saying is, if you want to beat the system, you can talk to a person. Yes.

00:24:42:14 – 00:24:48:11 | JS
Add on to that.

00:24:48:07 – 00:26:01:04 | DZ
Yeah, that's exactly right. I've coached college kids, I've coached people coming out of the military. I tell them all the same thing. The resumé is yours. It's not a legal document; it's your information, so you can put anything you want on there. But then you get to the application, and that's where they're gonna get you: if they don't match up, that's bad. The number two thing, though, is a resumé is a key to open the door, to get an interview with you the recruiter or with the manager. So you really want it to be tight. You really want it to be succinct. If you have been in the business as long as I have, you may have a two page resumé or a three page resumé. I tell people not to go over three. I mean, I've had a very long, storied career. If you're coming out of college, there are just ways of putting things, like what you did to get the degree. OK, you've got a degree; tell me what you did when you were interning. What did you do on the internship? Was it all you, or were you part of a group? What part of the group was it? Information is power. The more information you can give a recruiter in a succinct way, the more likely they're going to want to talk to you. I think I really struggle with this question, and I think it's partially

00:26:03:13 – 00:26:47:18 | RR
I could see a future where we head towards more honesty in the market because of A.I. tools, because there's a bit of an arms race going on, with the kind of coaching where the company and the talent go back and forth trying to game each other's systems. But the benefit with a lot of these tools is you've got a scale that we've never had before, and you're able to look at a lot of different things that were never able to be consumed before and brought into one place. And so I think ultimately where this is going is that as it becomes so big, and the scale becomes so large, as a candidate it's going to get past what's manageable for me to game anymore. And once we get past that point, there's going to be a little more honesty in the job market, I think, where you'll get to a point where companies will be able to see the talent market and find the people, and people will be able to find the companies they're looking for.

00:26:47:19 – 00:27:11:16 | JS
So I see a brighter future with the kind of direction that's heading, a little bit less of the kind of man versus machine, and a little bit more of a democratized market. It's easy to imagine that there's going to be a vendor next week who says: oh, here's all the cultures, here's how they do the evaluations, we reverse engineered their data models, so give us your resumé and we'll get it into their systems.

00:27:11:06 – 00:27:44:18 | RR
Yeah, I think so. Right. And then I think there'll be a way to fight back on that side. I think about people who put white text in the background of their resumé. That was a big thing for a very long time: every skill you could think of, put it in white text, and you can't see it but the machines can. But eventually the machines figured it out. OK, we're gonna ignore that kind of background text. Or they're starting to use it a little bit differently. I mean, you can fool them for a little bit; there's always a window, and then the arms race continues. And I think at some point it's gonna get outside the bounds of what humans can keep up with. And I think that's gonna be a bit of a relief for a lot of candidates in the market.

00:27:48:14 – 00:27:57:00 | RR
I have no opinion. Good. Well, this should be a first.

00:27:57:00 – 00:28:06:11 | JS
So, Derek, we've got these intelligent tools running amok in the operation. When can we let them roll on their own?

00:28:06:18 – 00:28:14:03 | DZ
Oh boy. We already had this discussion, John. I'm glad you asked the question. I don't...

00:28:14:16 – 00:29:37:08 | DZ
I equate things, to bring it down to a level where everybody can kind of understand it. So I look at it like this: A.I. just was born. Machine learning has been around for a long time, but it's starting to grow, so it's more in its infancy, just like we're in the infancy with social media. We're seeing a lot of things change in the last five years. I'm seeing machine learning learning. But as you raise a child, you don't just put a 10 year old in charge of themselves for dinner. You show them how to use the oven. You show them how to use a knife. You explain the tools to them, you hold their hand, and when you're not there, when they're going to school, you say: what do we do? We look both ways, right? These are all things that we're learning, so we're teaching the machine. Eventually... I think that a machine will always need to be monitored. Because, as my grandmother once told me about my mom, when I was fighting with her when I was 16, she said: son, you have to understand, you're always going to be her little boy. OK. To this day I talk to my mom every week, and you know why? Because if I don't, she calls me: why didn't you call me? She wants to know what's going on with my life. And I still call her and ask her questions about life, about relationships. What should I do about this? What do you think about this, mom? And that's what I think we're always going to do with machine learning. If we let it go, then...

00:29:39:06 – 00:29:42:19 | JS
Cool, cool. Anybody else?

00:29:44:17 – 00:29:49:14 | HB
I do have opinions on this of course. Dad. Yeah.

00:29:49:14 – 00:30:39:01 | HB
Well, I mean, I actually agree with you. I don't think you can ever turn hiring people over to a machine, especially if people are going to be working with other people. You know, the human factors... machines are really good at things that you can quantify, and humans cannot be defined at that level, ever. We are qualitative messes, as it should be. And so I just think that if you are asking machines to give opinions about who should be hired to work with somebody else, you better make sure that those opinions are useful and actually work.

00:30:39:02 – 00:30:52:20 | HB
And as your teams change, as you change, as your company grows, as technology changes how we work and what we're doing, all of that has to evolve with human supervision.

00:30:53:03 – 00:31:08:22 | JS
So I just want to beat this around a little bit. Most of the solutions that we heard about today: you take a great big stack of resumés, you put them in, and out comes a little small stack at the bottom. Right? And it's scale, right?

00:31:08:23 – 00:31:15:12 | JS
It's a thousand to a hundred, or a thousand to ten, that you're talking about. So exactly who's going to supervise that?

00:31:15:19 – 00:31:20:23 | HB
Right. You're certainly not going to hire people to go back and double check every decision it made.

00:31:20:23 – 00:31:32:07 | HB
So they'll have to... you have legal liability to make sure that that stack of ten is not discriminating against protected classes.

00:31:32:08 – 00:31:34:09 | JS
That’s that’s an interesting assertion.

00:31:35:18 – 00:32:16:04 | RR
I think so. I think something else to think about here is that I don't see A.I. replacing exactly what recruiters do. So taking the stack of resumés and finding the candidate: they might eventually be able to do that. I think where they really succeed today is being able to take a million resumés that recruiters weren't able to look at and surface whether there are any gems in there. So it's not replacing the recruiter but augmenting, being able to serve up: hey, you passed on a lot of these people, but this one might actually be someone you may want to take another look at. I think that's where we see a lot of benefit, and maybe a little less of that kind of ethical crunch of, are we replacing somebody, are we getting humans out of the equation; it's really tackling a slightly different problem, one that humans are not able to do at that scale.

00:32:16:06 – 00:32:39:17 | JS
I think that's something that I keep running across in looking at this: the idea that the real value is not some cost savings today. The real value is that we're going to be able to do things that we weren't able to do before, and that's hard to sell in contemporary management structures.

00:32:39:20 – 00:32:57:21 | JS
But it's where we are. The idea that there is ROI in this stuff, Heather will appreciate this, the idea that there is ROI in all of this stuff, is a misplaced way of thinking about it. If you look to use this stuff as cost savings, you put yourself out of business.

00:32:57:22 – 00:33:00:01 | JS
All right. Next question.

00:33:00:19 – 00:33:08:18 | HB
Who's liable? The employer is liable for its hiring decisions. Full stop.

00:33:08:18 – 00:33:32:15 | HB
It does not matter what technology you use. So if your technology is biased, or offering discriminatory sets of ten resumes as the top choices, you are the one who's responsible for that, and you're responsible to the people who are being discriminated against in a disparate

00:33:35:05 – 00:34:23:16 | HB
impact case that can be brought by them or by a government agency. And if you discriminate against someone specifically, you could be liable to them. Now, hiring cases are very hard to prove, because most people who are discriminated against never know that they were. But when you start to see the numbers in your company change, people will notice. So even if you are not required to track your demographics under federal contracting requirements, everyone who uses technology in hiring should be tracking their

00:34:24:08 – 00:34:30:07 | HB
ratios and making sure that they’re in disparate impact compliance.
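The ratio tracking Heather describes can be sketched numerically. A minimal illustration of the "four-fifths rule" that EEOC guidance uses as a rough screen for adverse impact; the group names and counts below are entirely hypothetical, not data from any real employer:

```python
# Minimal sketch of the EEOC "four-fifths rule" screen for adverse impact.
# Group names and counts are hypothetical illustrations only.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected."""
    return selected / applicants

def four_fifths_check(groups):
    """groups: dict of name -> (selected, applicants).

    Returns (ratio, flagged): ratio is the lowest group's selection rate
    divided by the highest group's; flagged is True when that ratio falls
    below 0.8, the four-fifths threshold that suggests a closer look.
    """
    rates = {name: selection_rate(s, a) for name, (s, a) in groups.items()}
    ratio = min(rates.values()) / max(rates.values())
    return ratio, ratio < 0.8

# Hypothetical example: one group selected at half the rate of another.
ratio, flagged = four_fifths_check({
    "group_a": (50, 100),   # 50% selection rate
    "group_b": (25, 100),   # 25% selection rate
})
print(round(ratio, 2), flagged)  # 0.5 True -> below the 0.8 threshold
```

Falling below the threshold is not proof of discrimination, only a signal that the numbers deserve investigation, which is exactly the tracking discipline the panel recommends.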

00:34:30:08 – 00:34:35:16 | JS
So I’m going to skip ahead. Did you really say that you can’t hold the vendor accountable for the quality?

00:34:35:16 – 00:34:37:02 | HB
I did not say that.

00:34:37:03 – 00:35:07:13 | HB
But that depends on what the contract is, how the courts are going to enforce those kinds of contracts, and whether there’s some sort of civil workaround if the vendors have a very clear indemnity or a real release of liability as part of the sales agreement. But we’re going to see a lot more contractual litigation over these issues, and it’ll be determined based on contract law, probably.

00:35:07:15 – 00:35:17:13 | JS
So do you think contract law is going to evolve? Because what this produces is evidence, right? Yes it does. Discoverable evidence, in reams that didn’t used to be there.

00:35:17:13 – 00:35:33:07
So if I have a hunch that there’s something hinky in your hiring process, and I can get an attorney to set up a case, then I can discover the data in your reporting system and do all sorts of things.

00:35:33:07 – 00:35:45:08 | HB
Yes, yes. Evidence is much easier to acquire and analyze now. It’s not just data. It’s evidence. Yeah, I think that’s it.

00:35:46:02 – 00:36:42:19 | RR
I think that’s a really interesting point, and why AI in this instance really stands out, because I think as it stands today there may be bias, or there is bias, in most companies’ hiring systems, but it’s not discoverable. And that’s unfortunate, because even if AI may have less bias than your current system, if it’s discoverable then suddenly there’s a barrier there that didn’t exist before. And so I would love to see that kind of legal environment evolve somehow to allow for more experimentation in that space, because I think it frankly has the potential to be a lot better than what’s going on in the human-based systems today, which you cannot discover, you cannot track, you cannot understand in the same way. And again, I think we would be in a better world if we could understand the bias in a very quantitative way by using our systems. So one of the things that concerns me in terms of diversity is the fact that if I try to match a key set of attributes to my leadership or my top performers, and those leaders are all white males,

00:36:43:14 – 00:36:50:21 | JD
Maybe what comes out of the machine is all white males, and I need to account for that. Cool. OK.

00:36:51:21 – 00:37:03:01 | JS
Maybe the last question. Richard, you’re going to get the question. Sure you did. You did this to yourself. That’s fair.

00:37:03:23 – 00:37:23:21 | JS
So Amazon just canned a long project because they couldn’t get, in my view, they couldn’t get the bias implicit in the history of their company out of the data, so they couldn’t build the intelligent system that would be free from bias.

00:37:23:23 – 00:37:25:02
Your thoughts?

00:37:25:02 – 00:38:50:22 | RR
Yeah, I think that’s a really interesting story. If anyone hasn’t had a chance to take a look at the articles that swarmed around that, it came out a couple of months ago. Amazon had been running some kind of system. I think it’s a little bit unclear whether or not the system was enacted. I don’t think it was; I think it was a group of engineers that were thinking about it or working through it, and this is just me speculating, because I don’t have inside information over there. But I think it’s interesting that they shut it down, and I think it’s more to do with that discoverability than it is to do with the actual effectiveness of what they were looking at. Because I do think that Amazon, with nearly 500,000 employees now, they must get millions of resumes every year, absolutely millions, and to be able to look through all of them they have to find a way to scale recruiting. And to think it through, I know the Aleo team here, they’ve done the kind of math on how many recruiters it would take to look through a million resumes and how we could replace that with a scaled solution. I think they do need something like that. It was a bit of a step back, I think, from a media perspective and for the space in general, because it really took a hit from this: Amazon is hiring men, their history is based on men, so this is what they’re doing, and it just kind of took off from there. But I think what they were doing was the right effort in the right way, and I think they shut it down before it got too bad. But I think going forward they could have really had a lot more success there. And I don’t think this is the end of that space, and of companies looking into this to try to see how can we scale recruiting and how can we find these candidates in the rough.

00:38:53:09 – 00:38:57:06 | JS
Yeah, you know, the theory is that you can eliminate bias. Right?

00:38:57:07 – 00:39:21:01 | JS
The theory is that you can eliminate bias. And it might be the case that bias is just like a steady breeze, and you’ve got to tack against the steady breeze to go in a straight line. So you can mitigate bias, you can mitigate the impact of bias. But the idea that what you can do is prevent human beings from having a point of view and keep that out of their data, that seems

00:39:22:10 – 00:39:24:14
To me like an extravagant hope.

00:39:25:06 – 00:39:59:06 | RR
Yeah. And I think that goes back to that augmentation piece. I think if Amazon was hoping to replace and get rid of recruiters, I would hope that that’s not where their head was at with it. I think experimentation in this area should be expanded, though, and we should continue to work in that space and continue to innovate there, because the worst thing that could happen is we just say, full stop, let’s just keep recruiting with humans as we are today and just scale that, and just hire thousands and thousands of recruiters. I’m sorry to the recruiters in the audience. I know it’s something that we’re going to need forever. But at the same time, something’s got to change if we expect to get to that scale.

00:40:00:15 – 00:41:16:22 | DZ
Gonna weigh in real quick. I have a very good friend, a very dear friend, who runs a company called Aspen Advisors, and that’s exactly what he does. He’s hired by companies to come in and tell them what is good and what is bad. Andrew Gadomski, shout out to Aspen Advisors; him and I have talked many times about this. I’ve gone through the Myers-Briggs and the rest of that process. There are all these different wild things happening out there. But the thing is, at the end of the day, and Andrew will tell you this, he’ll bring you the data. He just gives you the data; he analyzes it and he gives it to you, the good and the bad. And he has been told by companies, here’s your check, thanks. And then they never did anything. The companies need to step up. I mean, I can go in and evaluate your entire recruiting team. I can come in for two weeks and tell you exactly who should be here and who shouldn’t, just by listening to the conversations. After 23 years, I’m pretty sure I can tell you that. But I’ve got to have you accept those findings. That’s where bias starts. It doesn’t start with machines. It starts at the very top. And if they’re not willing to listen and trickle it down, it’s never gonna change.

00:41:17:00 – 00:41:25:06 | JS
Awesome. So we have a little bit of time for questions. Who wants to toss a hardball at one of these panelists here?

00:41:25:09 – 00:41:51:16 | AM
A quick question. So the machine will collect data. Do you think that privacy policy and privacy law are hollow promises, as every prospective employer is going to collect your data? This is especially concerning in the U.K., where the right to be forgotten exists. How do you ensure that such privacy exists for the years to come? This question is for Heather. Thank you so much.

00:41:53:16 – 00:43:26:14 | HB
Privacy is a really important right, but it is also one of our most fragile rights, in that we can very easily give it up by clicking a box saying I agree, when you don’t know what you’re agreeing to. So there are some things that laws can do. GDPR, I think, is an interesting compromise between privacy and usability of data. And the most important thing that we’re going to figure out is who has the burden. You know, is it something that the person who has the data must inform you of directly and get your affirmative knowledge and consent for, or is it enough to have a privacy policy on a website, which is where California’s new data privacy law lands. That’s their basic approach. There are some security provisions, but basically the burden is completely on the person involved to come back and say, what data do you have, and please take it off your website. And most of us have no idea who has our data and what they’re doing with it. You know, it’s a pretty clear transaction with the Safeway card, right? They hand me their card; the deal is I get a discount in exchange for my data. I buy all my toilet paper at Costco. They’re still trying to figure out why I don’t use toilet paper.

00:43:26:17 – 00:43:30:03 | HB
You know and I’m OK with this.

00:43:30:03 – 00:43:55:04 | HB
But for most data, the subject has no idea. Anonymized data is another approach, but the truth is that there’s no way to truly anonymize data. It’s very easy, with almost a tiny bit of context, to connect it and figure out what’s going on.
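Heather’s point about re-linking "anonymized" data can be made concrete with a toy linkage attack: joining a name-stripped record set to a public list on quasi-identifiers like ZIP code, birth year, and gender. Every record and name below is invented for illustration:

```python
# Toy illustration of re-identification by linkage: an "anonymized" file
# (names removed) joined to a public directory on quasi-identifiers.
# All records and names here are invented.

anonymized = [  # names stripped, quasi-identifiers kept
    {"zip": "94043", "birth_year": 1975, "gender": "F", "salary": 185000},
    {"zip": "94043", "birth_year": 1990, "gender": "M", "salary": 92000},
]

public_directory = [  # e.g. a voter roll or a scraped profile dump
    {"name": "Alice Example", "zip": "94043", "birth_year": 1975, "gender": "F"},
    {"name": "Bob Example", "zip": "94043", "birth_year": 1990, "gender": "M"},
]

def reidentify(anon_rows, directory, keys=("zip", "birth_year", "gender")):
    """Link rows whose quasi-identifiers match exactly one directory entry."""
    matches = []
    for row in anon_rows:
        hits = [p for p in directory if all(p[k] == row[k] for k in keys)]
        if len(hits) == 1:  # unique match means the identity is recovered
            matches.append((hits[0]["name"], row["salary"]))
    return matches

print(reidentify(anonymized, public_directory))
# Both "anonymous" salary records end up tied to named individuals.
```

When a combination of quasi-identifiers is unique in the population, stripping the name column provides almost no protection, which is why a tiny bit of outside context is usually enough.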

00:43:57:01 – 00:44:43:21 | RR
You said companies will just collect data. I think that is not true in my experience, and I would not be as pessimistic on that front. I talk to a lot of teams about GDPR, because I’m on the people analytics team at Facebook. GDPR is actually part of what I do in my day job: working with it, understanding it better, figuring out how this applies, how we work with it. It is a massive conversation going on right now, and so it’s something, just from the inside, I’m thrilled to be able to see how that works and how that plays out. And employee privacy is absolutely of the utmost concern, because I think as soon as one company messes that up, the whole thing comes crashing down, how we understand this, how we can help our employees. It would be a massive step back for the entire industry. And so I think this is something that, within the people analytics space at least, is highly, highly watched. We just don’t really know yet what it could be.

00:44:45:05 – 00:45:44:07 | DZ
Not the cookies that we just had after lunch, but the cookies on the website. Does anybody else notice that all of a sudden the websites that you frequently go to are now giving a pop-up saying, hey, we use cookies? That’s GDPR. So when you click OK, then you’re saying, OK, you can follow every click I do. I worked for a company called comScore. That’s what we did. I hired data analytics people for almost three years, and all we did is track data, specifically for the movie industry. And when you’re at home on Comcast and you’re clicking the channel, and the commercial comes on and you sit through the commercial, I’ll know it. I don’t know who you are, but I’ll collect all that data and sell it to the advertising agency. That’s how comScore and Nielsen make money. That’s what ratings are all about. But you’re giving them that information. When you get that cable box, you’re telling the cable company, in that contract you sign, that they’re going to be monitoring all of your information. So that’s where it’s going to come down to.

00:45:45:11 – 00:46:00:12 | JD
And I think probably the key takeaway for corporate H.R. people is: collect the minimum amount of data you need, explain what you’re going to do with it, and only use it for that purpose. Really.

00:46:00:14 – 00:46:06:23 | HB
And watch out for the Fair Credit Reporting Act.

00:46:07:13 – 00:46:28:22 | JS
The way to understand the organization is by collecting as much as you can, so that’s going to be quite a tough balance to strike. And if you imagine being an H.R. leader going to the CEO and saying, we don’t want to understand what’s going on with our workforce, we’re only going to collect the minimum amount of data, I think that’s a ticket to a short career.

00:46:28:22 – 00:46:45:17 | JD
Well, we’re more purposeful. From a recruiting perspective, we are not going to use it for other marketing, say Intel products; we’re not going to use it for anything else; we’re not going to sell it to anybody. It’s used for recruiting. If you’re clicking in our database to apply for a job, it’s only going to be used for that purpose.

00:46:45:20 – 00:47:17:01 | JS
Last bit of this is to put each of you on the spot. There was some talk earlier in the day about actionable insights. So if you were going to tell somebody one thing to take away from this conversation, or today in general, or something you learned reading the newspaper in the hotel room, what would it be? Start with you, Jeff. I think whatever you decide to implement today, based on learnings, based on company offerings,

00:47:17:00 – 00:47:23:04 | JD
Put a pin in revisiting it in 12 months or less, because things are going to change.

00:47:25:08 – 00:47:34:07 | HB
You can’t outsource responsibility.

00:47:34:12 – 00:47:41:09 | RR
I would say don’t compare AI against the perfect. You have to compare it against what you’re doing today, and what we’re doing today is not perfect.

00:47:43:21 – 00:48:32:10 | DZ
Well, no worries. Yeah, good to see you. You all stole everything that I was going to say. I love all of you, all three of you. I guess my takeaway is this: we’re not there yet. I don’t think we’re close. I really don’t. I don’t know when we’ll get there. We were going at a really fast speed, almost breakneck, and then all of a sudden we hit a wall with GDPR. No offense to Facebook, we hit a wall with some stuff there. With Amazon we hit a wall. Once again, like I said, I can give you the information; it’s up to you if you want to accept it. And that’s what I think the next big hurdle is going to be, because there are gonna be people like John out there bringing you that data, and you’re not going to like it sometimes. So how do you deal with it?

00:48:32:17 – 00:48:37:07 | JS
So just to wrap this all up, a couple of things. One

00:48:39:05 – 00:48:45:18 | JS
Is: it would be a reasonable thing to take away from this conversation that you should stay away from this crap for as long as you can.

00:48:47:05 – 00:48:49:06 | JS
And I urge you not to do that.

00:48:49:08 – 00:48:50:13 | JS
I urge you not to do that.

00:48:50:14 – 00:50:07:07 | JS
This is the way that work is going to be from here going forward. It’s going to be less certain. It’s going to be less clear. It’s going to be more experimental. This is what the flattening of the hierarchy looks like, and it’s going to be augmented with technology. And so what you have to do is get your feet wet and try things, and then try things to the point that there is some risk to you associated with those tests, because that’s how your companies are going to survive. You’ve got to get in, you’ve got to get in now, even though it’s uncertain, even though it’s something other than perfectly clear. This is make or break for your company. So get started. Second thing, for just the tiniest bit of promotion: we wrote an incredible industry analysis at HRExaminer that looks at trends and vendors inside of HR, 70 percent of them recruiting vendors that we cover. You might want to stop at the HR Examiner site and see that, and you may want to follow us, because we keep a steady pulse of information about this stuff going through there.

Lastly, let me remind you who these people are. Derek Zeller.

00:50:14:19 – 00:50:22:09 | JS
Is the head of recruiting projects at Engage Talent. Richard Rosenow runs people analytics for Facebook.

00:50:26:00 – 00:50:37:22 | JS
And by the way, they have 50 people on the people analytics team at Facebook. 50 people. This is coming to your town. Heather Bussing, employment attorney.

00:50:41:15 – 00:50:46:17 | JS
And Jeff Dunn, the head of college recruiting for Intel. Thank you guys. Thank you very much. Thank you.

 

Watch more of John’s talks on his YouTube Channel »

 
