
The AI iceberg: What your chatbot can’t see beneath the resume

Todd Raphael
Senior Writer
October 10, 2025

Chatbots are everywhere. According to Mind the Bridge, 92% of HR departments are using chatbots to gather hiring information. They can certainly screen resumes around the clock, but they’re also creating a dangerous “iceberg effect”: what you see on the surface, those neat little keywords on a resume, represents only a fraction of a candidate’s true potential. The most valuable qualities remain hidden beneath the surface.

We’ll talk more about chatbots and traditional keyword screening below, including a better way to see the whole iceberg.

The promise vs. reality of AI in recruiting

If your CEO has said, “We need to adopt AI quickly,” you’re not alone. Gallup found that 93% of Fortune 500 CHROs say their organization has begun using AI tools and technologies to improve business practices. By the time you read this, it will probably be close to 100%.

AI has raised hopes of significant efficiency gains. The problem? We’re now seeing growing evidence of limitations in resume screening that could cause headaches big enough to erase those gains.

The bias blind spot

Relying on AI to filter resumes sounds appealing, since it can save recruiters from combing through hundreds or even thousands of applicants. But the risks are real.

University of Washington researchers found “significant racial, gender, and intersectional bias in how three state-of-the-art LLMs ranked resumes.” Across 550 real-world resumes, those LLMs favored white-associated names 85% of the time, female-associated names only 11% of the time, and never favored Black male-associated names over white male ones.

Workday is now facing a collective-action lawsuit claiming its AI discriminates based on age, disability, and race. Even aside from the potential damages, just preparing a defense — and dealing with the negative publicity — can be incredibly costly.

The skills mirage

Harvard Business School found that AI systems rejected over 10 million qualified candidates in the U.S. due to rigid filtering criteria. These systems often can’t recognize transferable skills or unconventional career paths, leading to missed opportunities for both employers and job seekers.

A teacher could become an incredible corporate trainer. A meticulous accounts payable employee might make a great UX designer. A military veteran might have the leadership skills to run a business unit. But keyword filters don’t see any of that.

As Larry Cummings, co-founder of HR Tech Alliances, says: “These systems can cut people out rather than pull people in.”

The experience gap

AI-powered screening and assessment, when implemented responsibly, can reduce bias in recruiting. But these methods often focus on matching people to jobs, not spotting potential. They miss the human signals that predict success.

Take Tom Brady. One of the most prolific quarterbacks in NFL history, he didn’t exactly dazzle scouts with his early stats. But the clues of his impending greatness were always there. “I have never seen Brady rattled,” said one observer. Another noted, “The guys around him love him and believe in him.”

Those kinds of insights — leadership, composure, trust — don’t show up in a resume or a keyword search. Yet they’re often what determine who thrives.

The cost of shallow waters

When chatbots filter out qualified candidates, companies lose more than talent — they lose time, culture fit, and credibility.

One Reddit user described being screened out by a pizza chain’s chatbot, even though he had experience in both delivery and customer service: “I don’t complain when employers reject me after an interview, but using an AI chatbot as the main source for who to hire and not hire someone is just so dystopian to me.”

He’s not alone. The irony is that companies turn to chatbots thinking they’ll save money, but missing out on great talent ends up costing them more.

Diving deeper: The 3D data solution

There’s a better way to use AI, one that “screens in” potential instead of screening people out.

With 3D data, you can see far beyond resumes and job titles to find candidates based on contextual, validated career attributes — things like revenue growth experience, team leadership, union experience, or impact during a company turnaround.

SaaS company Boulevard has used this 3D data approach with attributes such as “has SaaS experience” and “has startup experience” — the kind of nuanced criteria that actually reflect success. VP of Talent Acquisition Travis Baker says, “The quality of candidates that we’ve been able to identify and engage has massively improved.”

Navigating to safer waters

Every company is racing to adopt AI, and some are moving faster than they should. Before you invest in a hiring tool, ask:

  • Is it relying on flat keyword searches, or is it finding people through a 3D view of their potential?
  • Do you know how the technology ranks candidates, or is it a mystery?
  • Where does the training data come from? From one company, or from millions of diverse data points?
  • Could it be favoring people based on irrelevant or biased signals, as in the case of an AI hiring tool favoring former baseball players over softball players?
  • Is it part of a broader, multichannel hiring strategy that considers internal employees, referrals, alumni, and past applicants?

Seeing the whole iceberg

That resume-screening tool you’re considering might look impressive on the surface. It may even seem to work. But what it’s missing — the skills, potential, and human qualities that don’t fit into a keyword box — could sink your hiring strategy.

A better approach is to see the whole iceberg: to understand candidates through 3D data and success signals that reveal who will thrive, not just who checks a box.

Why ignore what’s under the surface when you already know it’s there? If you’d like to see how our customers are using 3D data to do exactly that, let’s talk.