• Morning interviews may yield higher scores because some interviewers grade more harshly later in the day, a startup's review found.
• Time-of-day bias can skew candidate evaluations in hiring.
• AI tools could reduce this bias, offering fairer assessments than unstructured manual interviews.

If you get to choose when to schedule a job interview, you might want to grab a coffee and go for a morning slot.

That's because some people conducting interviews tend to give higher scores to candidates they meet with earlier in the day compared with the afternoon, a startup's review of thousands of interviews found.

It's not an absolute, of course, and candidates can still kill it well after lunchtime. Yet, in a job market where employers in fields like tech have been slow to hire, even a modest advantage could make a difference, Shiran Danoch, an organizational psychologist, told Business Insider.

"Specific interviewers have a consistent tendency to be harsher or more lenient in their scores depending on the time of day," she said.

It's possible that in the morning, interviewers haven't yet been beaten down by back-to-back meetings — or are perhaps still enjoying their own first coffee, she said.

Danoch and her team noticed the morning-afternoon discrepancy while reviewing datasets on thousands of job interviews. Danoch is the CEO and founder of Informed Decisions, an artificial intelligence startup focused on helping organizations reduce bias and improve their interviewing processes.

She said the inferences about time-of-day bias are drawn from the datasets of interviewers who use Informed Decisions tools to score candidates; the data covers interviewers who have conducted at least 20 interviews using the company's system. In her company's review of candidates' scores, Danoch said, those interviewed in the morning often received statistically significantly higher marks.
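Informed Decisions hasn't published its methodology, but a comparison like the one described above could, in principle, be run per interviewer with a simple two-sample test. The sketch below is a hypothetical illustration on synthetic data; the score scale, effect size, and use of Welch's t-test are all assumptions, not details from the company.

```python
# Hypothetical sketch: checking one interviewer's morning vs. afternoon
# scores for a time-of-day gap. All data here is synthetic.
import random
import statistics

random.seed(42)

# Simulate scores on a 1-5 scale with a small assumed morning bump.
morning = [min(5, max(1, random.gauss(3.6, 0.7))) for _ in range(60)]
afternoon = [min(5, max(1, random.gauss(3.3, 0.7))) for _ in range(60)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

t = welch_t(morning, afternoon)
print(f"morning mean={statistics.mean(morning):.2f}, "
      f"afternoon mean={statistics.mean(afternoon):.2f}, t={t:.2f}")
```

A real analysis would need at least the 20-interview minimum the company uses, plus a multiple-comparisons correction when screening many interviewers at once.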

The good news, she said, is that when interviewers are made aware that they might be more harsh in the afternoon, they often take steps to counteract that tendency.

"In many cases, happily, we're actually seeing that the feedback that we're providing helps to reduce the bias and eventually eliminate the bias," Danoch said.

However, she said, interviewers often don't get feedback about their hiring practices, even though finding the right talent is "such a crucial part" of what hiring managers and recruiters do.

She said other researchers have identified how the time of day — and whether someone might be a morning person or an evening person — can affect decision-making processes.

An examination of more than 1,000 parole decisions in Israel found that judges were likelier to show leniency at the start of the day and after breaks, and that this leniency declined as they handed down more rulings, according to the 2011 research.

Tech could help

It's possible that if tools like artificial intelligence take on more responsibility for hiring, job seekers won't have to worry about the time of day they interview.

For all of the concerns about biases in AI, the partiality involved in more "manual" hiring, where interviewers ask open-ended questions, often produces more bias than AI does, said Kiki Leutner, cofounder of SeeTalent.ai, a startup creating AI-run tests that simulate tasks associated with a job. She has researched the ethics of AI and of assessments in general.

Leutner told BI that it's likely that in a video interview conducted by AI, for example, a candidate might have a fairer shot at landing a job.

"You don't just have people do unstructured interviews, ask whatever questions, make whatever decisions," she said.

And, because everything is recorded, Leutner said, there is documentation of what decisions were made and on what basis. Ultimately, she said, it's then possible to take that information and correct algorithms.

"Any structured process is better in recruitment than not structuring it," Leutner said.

Humans are 'hopelessly biased'

Eric Mosley, cofounder and CEO of Workhuman, which makes tools for recognizing employee achievements, told BI that data created by humans will be biased — because humans are "hopelessly biased."

He pointed to 2016 research indicating that juvenile court judges in Louisiana doled out tougher punishments — particularly to Black youths — after the Louisiana State University football team suffered a surprise defeat.

Mosley said, however, that AI can be trained to ignore certain biases and to detect others so they can be eliminated.

Taking that approach can help humans guard against some of their natural tendencies. To get it right, however, it's important to have safeguards around the use of AI, he said. These might include ethics teams with representatives from legal departments and HR to focus on issues of data hygiene and algorithm hygiene.

Not taking those precautions and solely relying on AI can even risk scaling humans' biases, Mosley said.

"If you basically just unleash it in a very simplistic way, it'll just replicate them. But if you go in knowing that these biases exist, then you can get through it," he said.

Danoch, from Informed Decisions, said that if people conducting interviews suspect they might be less forgiving after the morning has passed, they can take steps to counteract that.

"Before you interview in the afternoons, take a little bit longer to prepare, have a cup of coffee, refresh yourself," she said.

Read the original article on Business Insider