Interview design is one of those things most organizations figure they’ll get around to eventually—right after optimizing the coffee machine. But here’s the thing: how you talk to candidates tells you a lot about what you actually value in hiring. Some formats give you solid, comparable data. Others mostly tell you whether you clicked with someone over lunch.
This guide covers the three main interview formats—what they actually involve, when each one makes sense, and how to design them so you end up with information you can act on instead of vague impressions.
A structured interview uses the same questions, asked in the same order, with consistent rating scales for every candidate. That’s the core idea. It’s not about memorizing a script word-for-word—it’s about creating a framework that lets you compare candidates fairly.
Consistency is the whole point. If you’re interviewing five people for a software engineering role, they should all answer the same core questions, scored against the same criteria. SHRM has been saying since the early 2000s that structured interviews reduce bias and actually predict job performance better than unstructured conversations. The data on this is pretty solid.
Most structured interviews use three question types. Situational questions ask how candidates would handle specific scenarios: "If you discovered a critical bug in production at 4 PM on Friday, what would you do?" Behavioral questions dig into past performance: "Tell me about a time you had to meet a tight deadline with limited resources." Job-related questions test technical knowledge specific to the role.
The upsides are real. You get hiring decisions that hold up legally because everyone gets evaluated the same way. Bias drops significantly when interviewers use standardized scales. Candidates know what to expect. And you can train junior team members to conduct these effectively because the framework does most of the heavy lifting.
The downside is the upfront work. You need to develop and validate questions before you can use them. Some hiring managers chafe at the rigidity—they want to follow the conversation where it goes. And for roles where creative problem-solving or cultural fit matters most, a standardized format might miss things that a more flexible approach would catch.
Semi-structured interviews mix consistency with flexibility. You come in with a core set of questions—usually 5-7 key areas you need to cover—but you can deviate based on what the candidate says, what you’re curious about, or what’s most relevant to the specific role.
It’s like having a conversation with a purpose. You know where you need to end up, but you control the path based on what unfolds. If a candidate’s answer reveals something interesting about their leadership style, you can dig into that instead of rushing through to the next prepared question.
The main difference from fully structured interviews is question consistency. In a semi-structured format, you might make sure every candidate answers questions about stakeholder management, technical capabilities, and career goals—but the exact wording and follow-up questions depend on each conversation.
This format works well when you’re hiring for roles that need adaptability, cross-functional collaboration, or creative thinking. It shines in startup environments where cultural add matters as much as skill alignment, or in leadership roles where authentic conversation reveals more than scripted responses.
The risk is bias creeping in. Without standardization, you might accidentally ask easier questions to candidates you instinctively like, or give more latitude to certain applicants. The fix isn’t swinging back to rigid structure—it’s documenting your questions and rating approach while keeping the conversational flexibility.
An unstructured interview has no real format. The conversation might start with "Tell me about yourself" and go wherever interviewer curiosity, candidate background, or random tangents take it. No script, no consistent questions, often no clear evaluation criteria.
This format is still surprisingly common. A lot of managers think they're "getting the real person" through free-flowing conversation. The research tells a different story.
Unstructured interviews have almost no predictive validity for job performance. Studies consistently show they perform barely better than chance. That gut feeling guiding many unstructured conversations? It correlates strongly with demographic factors like race, gender, and attractiveness—not job-relevant competencies.
I’m not saying never have a conversational interview. But treating unstructured conversation as your main evaluation method is a choice that deserves honest scrutiny. If you use this format, recognize you’re relying on intuition and rapport rather than systematic assessment.
That said, unstructured interviews have real uses. Early exploratory conversations with candidates can use this format to assess mutual interest before investing in formal evaluation. Executive searches sometimes employ unstructured dialogue to understand candidate philosophy and communication style. And informal conversations during longer interview loops can supplement structured assessments without replacing them.
Here’s how these formats stack up against each other:
| Aspect | Structured | Semi-Structured | Unstructured |
|---|---|---|---|
| Question Consistency | Same questions, same order | Core questions consistent, follow-ups flexible | No predetermined questions |
| Evaluation Method | Standardized rating scales | Documented criteria with flexibility | Subjective, impression-based |
| Interviewer Training | Required for consistency | Recommended | Often assumed unnecessary |
| Time to Prepare | High (question development) | Moderate (core questions) | Low (no preparation needed) |
| Legal Defensibility | Strong | Moderate | Weak |
| Predictive Validity | High | Moderate to High | Low |
| Candidate Comparison | Direct and objective | Possible with documentation | Difficult and biased |
| Best For | High-volume hiring, compliance-focused roles | Most professional positions | Exploratory conversations |
The most important difference is predictive validity—how well the interview actually predicts job performance. Structured interviews consistently show correlation coefficients around .5 or higher with job performance in meta-analyses. Unstructured interviews typically stay below .2. That’s the gap between a useful hiring tool and something barely better than flipping a coin.
Structured interviews take upfront work. Here’s how to do them right:
Start with the actual competencies that predict success in the role. Skip generic “leadership” or “communication”—get specific. For a marketing manager, that might mean campaign development, budget management, cross-functional collaboration, and data-driven decision making. Use job analysis, top performer data, and ideally some form of job preview to ground your competency selection.
Develop 5-8 candidate questions per competency, then trim to the strongest. Mix situational questions (predicting future behavior: "What would you do if...?") with behavioral questions (probing past behavior: "Tell me about a time when..."). A solid structured interview usually ends up with 8-12 total questions covering 3-5 competencies.
Create a rating scale with clear behavioral anchors. SHRM recommends 5-point scales where each level has specific behavioral descriptions. For example: "3 out of 5 – Demonstrated some strategic thinking but didn't fully articulate tradeoffs or consider long-term implications." Behavioral anchors turn vague impressions into observable criteria.
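A behaviorally anchored scale is really just a lookup from score to observable behavior. Here's a minimal sketch in Python; the anchor wording and the "strategic thinking" competency are illustrative assumptions, not a standard — you'd write anchors specific to each competency you actually assess.

```python
# Hypothetical sketch of a behaviorally anchored rating scale (BARS)
# as a lookup table. Anchor text below is illustrative only.
STRATEGIC_THINKING_ANCHORS = {
    1: "Did not address the scenario or gave an irrelevant answer",
    2: "Described actions taken but showed no strategic reasoning",
    3: "Demonstrated some strategic thinking but didn't fully articulate tradeoffs",
    4: "Articulated tradeoffs and near-term implications clearly",
    5: "Weighed tradeoffs, long-term implications, and alternatives considered",
}

def describe_rating(score: int) -> str:
    """Return the score paired with its behavioral anchor."""
    if score not in STRATEGIC_THINKING_ANCHORS:
        raise ValueError(f"score must be 1-5, got {score}")
    return f"{score} out of 5 - {STRATEGIC_THINKING_ANCHORS[score]}"
```

The point of writing anchors down like this is that every interviewer scores against the same descriptions, instead of each person's private idea of what a "3" means.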
Train all interviewers. This isn’t optional. Even the best structured interview falls apart when interviewers improvise ratings, ask off-script questions, or apply different standards. Training should include practice rating responses and calibration sessions where interviewers compare their ratings on sample answers.
Document everything. Keep records showing what questions were asked, how candidates responded, what ratings were given, and the reasoning behind hiring decisions. This protects you in discrimination claims and helps you improve your question bank over time.
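If it helps to make "document everything" concrete, one answer's record might look like the sketch below. The field names and example values are assumptions for illustration — adapt them to whatever your applicant-tracking system expects.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative schema for documenting one structured-interview answer.
@dataclass
class AnswerRecord:
    candidate_id: str
    question: str
    response_summary: str
    rating: int      # 1-5 on the behaviorally anchored scale
    rationale: str   # why this rating, in observable terms

record = AnswerRecord(
    candidate_id="C-1042",
    question="Tell me about a time you met a tight deadline with limited resources.",
    response_summary="Re-scoped deliverables with stakeholders; shipped core feature on time.",
    rating=4,
    rationale="Clear tradeoff reasoning; did not discuss long-term maintenance impact.",
)

# Serialize so the record can be retained alongside the hiring decision.
print(json.dumps(asdict(record), indent=2))
```

A record like this captures the question, the response, the rating, and the reasoning in one place — exactly what you'd want to produce in a discrimination claim or a question-bank review.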
Semi-structured interviews call for a different design philosophy. You're not after complete standardization; you're building a reliable framework that still allows conversational depth.
Start with 5-7 essential topics you must cover with every candidate. These might include relevant technical experience, past project accomplishments, career development trajectory, specific scenario responses, and cultural alignment indicators. Write these as topic areas rather than specific questions.
Develop 2-3 questions per topic as starting points. You don’t need to ask every candidate the exact same question, but you should ensure consistent coverage. If you’re hiring for a product manager role, your topics might include stakeholder navigation, data analysis approach, and product strategy development—each with a prepared question, but with room to explore responses naturally.
Create a simple documentation template. After each interview, note which topics you covered, what you learned about each, and your overall impressions. This forces some consistency even without standardized scoring and becomes invaluable when comparing candidates two weeks later.
Build in accountability mechanisms. Because semi-structured interviews allow more interviewer discretion, you need checks against bias. Require interviewers to document their ratings before discussing candidates with colleagues. Conduct periodic calibration sessions where your team reviews notes and discusses rating disparities.
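The disparity review above can be sketched as a simple check: compare each candidate's ratings across interviewers and flag the ones that diverge. The sample data and the 1-point threshold here are made-up assumptions — tune the threshold to your own scale.

```python
# Hypothetical calibration check: flag candidates whose ratings diverge
# across interviewers by more than a chosen spread.
ratings = {
    "candidate_a": {"interviewer_1": 4, "interviewer_2": 4, "interviewer_3": 5},
    "candidate_b": {"interviewer_1": 2, "interviewer_2": 5, "interviewer_3": 3},
}

def rating_spread(scores: dict) -> int:
    """Max minus min rating a candidate received."""
    return max(scores.values()) - min(scores.values())

def needs_calibration(all_ratings: dict, threshold: int = 1) -> list:
    """Candidates whose rating spread exceeds the threshold."""
    return [candidate for candidate, scores in all_ratings.items()
            if rating_spread(scores) > threshold]

print(needs_calibration(ratings))  # prints ['candidate_b']
```

A flagged candidate isn't necessarily a bad hire — it's a signal that the interviewers saw different things and should talk before anyone makes a decision.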
For most organizations, this is the sweet spot—conversational feel with maintained rigor. You’re not reading from a script; you’re having a guided conversation with clear destinations and documented waypoints.
If you’re going to use unstructured interviews at all, design matters even more—because the format’s weaknesses require intentional countermeasures.
Limit unstructured interviews to specific purposes. Use them for initial screening conversations where you’re assessing candidate interest and availability rather than making evaluation judgments. Use them as supplements to structured formats in later stages. Never rely on them as your sole assessment method.
Establish topic boundaries even without standardized questions. Decide in advance what areas you want to explore—technical capabilities, leadership experience, problem-solving approach, team dynamics—and ensure you cover your priorities during the conversation.
Take detailed notes. Without a structured evaluation framework, your memory will betray you. Document specific stories, quotes, and observations immediately after the conversation. Review these notes before your next candidate to identify gaps in your assessment.
Debrief with other interviewers. Because unstructured interviews lack inherent consistency, you need human calibration. Discuss each candidate with colleagues who also conducted unstructured conversations and compare your impressions against theirs.
Recognize honestly that unstructured interviews are prone to bias. Confirmation bias leads you to seek information confirming your first impression. Similar-to-me bias makes you favor candidates who remind you of yourself. Horn bias causes negative first impressions to color everything that follows. The only defense is awareness and structured reflection.
Choosing the right format depends on your specific circumstances:
Use structured interviews when you have high-volume hiring, regulatory compliance requirements, or a need to defend hiring decisions legally. They're essential for positions where consistency matters most: entry-level roles, positions with many applicants, and any environment where hiring decisions may face legal scrutiny.
Use semi-structured interviews for most professional and management positions. This format captures the vast middle ground where you need evaluation rigor but also want to assess adaptability, authentic communication, and cultural fit. Most senior individual contributor and mid-level management roles benefit from this approach.
Use unstructured interviews sparingly and intentionally. Reserve them for early-stage conversations where mutual fit exploration matters more than evaluation, or as supplementary conversations in multi-round processes. Never make them your primary method.
One thing most articles get wrong: you can mix formats within a single hiring process. Start with a structured phone screen to assess baseline qualifications. Follow with semi-structured video interviews to evaluate fit and depth. Add an unstructured lunch conversation to assess authentic personality. Each format serves different purposes—don’t let ideological purity about “the right approach” prevent you from using the tool that fits each stage.