The "Gotcha" Questions That Expose Fake Experts
Stop hiring marketing "unicorns" who promise everything and deliver nothing. Use our battle-tested job description, killer interview questions, and homework scorecard to find marketers who actually know what they're doing.
Why Most Marketing Job Descriptions Attract the Wrong People
Generic job descriptions written by HR departments attract generic candidates. Here's why yours is probably failing:
The Unicorn Trap
"Must be expert in SEO, PPC, social media, graphic design, video editing, copywriting, analytics, and project management."
The Problem: These people don't exist. You'll get applicants who claim everything but deliver nothing.
The Specialist Reality
"Strategic thinker with deep expertise in paid acquisition and conversion optimisation. Comfortable managing specialists."
The Solution: Hire for strategic thinking and 1-2 deep skills. Build a team of specialists.
The Credential Obsession
"Must have Bachelor's degree in Marketing, Google Ads certification, HubSpot certification, and 5+ years at a leading agency."
The Problem: Certificates mean nothing. Results mean everything.
The Results Focus
"Proven track record scaling paid acquisition from $50k to $500k+ monthly spend while maintaining profitability."
The Solution: Ask for specific, measurable results in environments similar to yours.
The Three Deadly Mistakes
Mistake #1: Writing for Everyone
Vague descriptions like "manage marketing campaigns" attract hundreds of unqualified applicants. Good marketers need specifics: what channels, what scale, what outcomes?
Mistake #2: Listing Tasks, Not Outcomes
"Create social media posts" tells candidates nothing. "Scale organic social to 100,000 impressions per month while generating 500+ qualified leads" tells them exactly what success looks like.
Mistake #3: No Technical Bar
If your job description doesn't scare away beginners, you'll waste weeks interviewing people who can't do the job. Include technical requirements that only experienced marketers understand.
The Battle-Tested Job Description Template
Copy this template, customise it for your business, and start attracting serious candidates who can actually deliver results.
Customisation Tips:
- Replace [bracketed sections] with your specific details
- Adjust budget ranges and experience requirements to match your needs
- Add or remove channels based on what you actually need (don't list channels you won't use)
- Keep the results-focused language—it filters out weak candidates automatically
The "Anti-Description": What NOT to Ask For
Here's what every bad job description includes, and why you should delete it immediately:
"Must be an expert in SEO, PPC, social media, content, email, analytics, graphic design, and video editing"
Why it's wrong: Unicorns don't exist. You're asking for 8 different specialists in one person.
What to do instead: Pick 2-3 core skills you actually need (e.g., "Expert in paid acquisition and conversion optimisation") and hire specialists for everything else.
"Bachelor's degree in Marketing required"
Why it's wrong: University marketing degrees don't teach Google Ads, Facebook pixel implementation, or how to scale a paid acquisition funnel. You're screening out self-taught experts who learned through real campaign management.
What to do instead: Focus on demonstrable results. "Proven track record scaling digital campaigns from $X to $Y while maintaining Z% ROAS."
"5+ years experience at a leading agency"
Why it's wrong: Agency experience often means managing junior staff and client expectations, not hands-on execution. Plus, "leading agency" is meaningless: every agency calls itself that.
What to do instead: "3+ years hands-on experience managing $100,000+ monthly ad budgets across paid search and paid social."
"Passionate, creative self-starter with excellent communication skills"
Why it's wrong: These are the most overused, meaningless phrases in job descriptions. Every candidate will claim these traits. You learn nothing.
What to do instead: Describe the actual work environment and challenges. "You'll work independently to diagnose underperforming campaigns, run rapid testing cycles, and present data-driven recommendations to senior leadership."
"Manage all marketing activities including branding, PR, events, and partnerships"
Why it's wrong: This is 4-5 completely different jobs requiring different skill sets. Branding is not performance marketing. PR is not paid acquisition. You'll hire someone mediocre at everything.
What to do instead: Pick ONE core function (e.g., "Lead all digital acquisition") and build your team around clear specialisations.
"Google Ads and HubSpot certifications required"
Why it's wrong: These free online courses take 2-4 hours to complete and test memorisation, not practical skill. Every beginner has these. Expert marketers often don't bother getting them.
What to do instead: "During the interview process, you'll audit one of our live ad accounts and present optimisation recommendations. We want to see how you think, not what certificates you collected."
The Golden Rule
If a requirement can't be verified through tangible results (ad account performance, campaign case studies, revenue impact), delete it from your job description. Focus only on what matters: proven ability to deliver measurable business outcomes.
The 5 "Killer" Interview Questions That Weed Out Fake Experts
Most marketing interviews are useless. Candidates rehearse generic answers about "thinking outside the box" and "driving engagement." These 5 technical questions expose who actually knows what they're doing:
"Walk me through how you'd diagnose why our cost per acquisition just doubled week-over-week."
What you're testing:
Diagnostic thinking, technical platform knowledge, and systematic problem-solving approach.
What a good answer sounds like:
- "First, I'd check if there was a tracking pixel issue or attribution window change that's inflating the reported CPA."
- "Then I'd segment by campaign, ad set, and creative to isolate which specific element caused the spike."
- "I'd also check for external factors: competitor activity, seasonal shifts, audience saturation, or platform algorithm changes."
- "Finally, I'd look at conversion rate data to see if the issue is acquisition efficiency or landing page performance."
Red flag answer:
"I'd probably review the campaigns and maybe adjust the targeting or try new ad creative." (Too vague, no systematic process, no technical depth.)
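The segmentation step in a good answer is simple arithmetic once the data is exported. Here's a minimal Python sketch over hypothetical campaign data (the names and figures are made up for illustration):

```python
# Hypothetical weekly (spend, conversions) per campaign, as exported
# from an ad platform. All numbers are illustrative only.
last_week = {"brand_search": (5000, 100), "prospecting": (8000, 40)}
this_week = {"brand_search": (5000, 95), "prospecting": (9000, 20)}

def cpa(spend, conversions):
    """Cost per acquisition; None if there were no conversions."""
    return spend / conversions if conversions else None

for campaign in last_week:
    before = cpa(*last_week[campaign])
    after = cpa(*this_week[campaign])
    change = (after - before) / before * 100
    print(f"{campaign}: CPA {before:.0f} -> {after:.0f} ({change:+.0f}%)")
```

Run against real exports, this immediately shows whether the account-level CPA doubled everywhere (suggesting tracking or platform issues) or in one segment (suggesting a campaign-specific cause).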
"You have a $50,000 monthly budget and need to acquire 500 customers at $100 CPA. Walk me through your channel allocation strategy."
What you're testing:
Strategic thinking, channel knowledge, budget management, and understanding of funnel economics.
What a good answer sounds like:
- "I'd allocate 60-70% ($30,000-$35,000) to proven high-intent channels like Google Search where we can hit the $100 CPA target reliably."
- "20-25% ($10,000-$12,500) to Facebook/Instagram for prospecting and retargeting, knowing CPA might be higher initially but improves with optimisation."
- "10-15% ($5,000-$7,500) to testing new channels (YouTube, TikTok, or programmatic display) to find efficiency opportunities."
- "I'd track weekly and reallocate based on performance, shifting budget from underperformers to overperformers."
Red flag answer:
"I'd split it evenly across Google, Facebook, Instagram, LinkedIn, TikTok, and YouTube to see what works." (No strategic thinking, no understanding of channel economics, spreading budget too thin.)
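The allocation logic in the good answer can be sanity-checked with basic arithmetic. This sketch assumes a 65/25/10 split and made-up per-channel CPAs to show how the blended numbers work out against the $100 target:

```python
budget = 50_000          # monthly budget from the question
target_customers = 500   # implies a blended CPA target of $100

# Illustrative split and assumed per-channel CPAs (not real benchmarks)
plan = {
    "google_search": (0.65, 95),   # (share of budget, expected CPA)
    "paid_social":   (0.25, 120),
    "test_channels": (0.10, 150),
}

total_customers = 0
for channel, (share, channel_cpa) in plan.items():
    spend = budget * share
    customers = spend / channel_cpa
    total_customers += customers
    print(f"{channel}: ${spend:,.0f} -> ~{customers:.0f} customers at ${channel_cpa} CPA")

blended_cpa = budget / total_customers
print(f"Blended CPA: ${blended_cpa:.2f} for ~{total_customers:.0f} customers")
```

Under these assumed CPAs the plan lands just short of 500 customers, which is exactly why the candidate's weekly reallocation step matters: the blended target is only hit by shifting spend toward the channels beating their CPA.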
"Explain the difference between view-through conversions and click-through conversions, and when you'd optimise for each."
What you're testing:
Deep technical knowledge of attribution, conversion tracking, and when different optimisation strategies apply.
What a good answer sounds like:
- "Click-through conversions happen when someone clicks your ad and converts. View-through conversions happen when someone sees your ad but doesn't click, then converts later through another channel."
- "For direct-response campaigns focused on immediate action, like e-commerce or lead gen, I'd optimise for click-through conversions because they're higher intent."
- "For brand awareness or longer sales cycles, view-through conversions matter because you're influencing the customer journey even without a direct click."
- "The key is understanding your attribution model and not over-crediting view-throughs, which can inflate performance metrics."
Red flag answer:
"I'm not sure exactly, but I think it has something to do with people seeing versus clicking ads?" (Doesn't understand fundamental attribution concepts.)
"You're running Facebook ads and the frequency is 4.2 with declining CTR. What do you do?"
What you're testing:
Platform-specific expertise, understanding of ad fatigue, and practical optimisation tactics.
What a good answer sounds like:
- "Frequency above 3-4 usually signals audience fatigue. I'd immediately refresh the creative with new hooks, angles, and visuals while keeping the winning messaging framework."
- "I'd also expand the audience to reduce frequency pressure, either through lookalikes, broader targeting, or interest stacking."
- "If budget allows, I'd implement frequency capping at 2-3 impressions per week to prevent over-saturation."
- "Long-term, I'd set up a creative testing system to rotate fresh ads every 2-3 weeks before fatigue sets in."
Red flag answer:
"I'd increase the budget to reach more people." (Misunderstands the problem: more budget doesn't solve audience fatigue.)
"How would you set up conversion tracking for a business that has both online purchases and phone enquiries that convert offline?"
What you're testing:
Understanding of multi-channel attribution, offline conversion tracking, and technical implementation complexity.
What a good answer sounds like:
- "For online purchases, I'd implement standard pixel tracking with purchase events sent to Google and Facebook."
- "For phone enquiries, I'd use call tracking software like CallRail or WhatConverts that assigns dynamic numbers to track which campaigns drive calls."
- "I'd integrate the call tracking with Google Ads offline conversion import to feed call data back to the platforms for optimisation."
- "For offline conversions from phone calls, I'd work with sales to implement a CRM system that tracks which enquiries close, then upload that data via Google's offline conversion API."
- "The key is connecting online clicks → phone calls → CRM sales → revenue, so we optimise for actual business outcomes, not just lead volume."
Red flag answer:
"I'd track the online purchases with Google Analytics and ask the sales team to count the phone calls." (No understanding of proper attribution or technical implementation.)
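The click → call → CRM join the good answer describes can be illustrated with a minimal sketch. The field names and upload shape here are hypothetical; a real Google Ads offline conversion import has its own required schema and is usually handled by the call-tracking or CRM integration:

```python
# Hypothetical data: Google Click IDs captured by call-tracking software,
# and CRM records of which phone enquiries eventually closed.
call_clicks = {
    "call-001": "GCLID_abc",
    "call-002": "GCLID_def",
}
crm_closed = [
    {"call_id": "call-001", "revenue": 2400.0},
]

# Build one upload row per closed deal whose originating ad click we can
# identify, so the platform optimises toward revenue, not call volume.
upload_rows = []
for deal in crm_closed:
    gclid = call_clicks.get(deal["call_id"])
    if gclid:
        upload_rows.append({"gclid": gclid, "conversion_value": deal["revenue"]})

print(upload_rows)
```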
How to Use These Questions
Don't accept surface-level answers. If a candidate can't dive deep into technical specifics, they don't have hands-on experience. Great marketers will walk you through their thinking process, mention specific tools and tactics, and reference real scenarios they've handled.
Bonus tip: For senior roles, have candidates audit one of your live ad accounts before the final interview and present their findings. This single exercise will tell you more than 10 rounds of traditional interviews.
The Homework Scorecard: How to Evaluate Real Skills
Interviews can be faked. Homework can't. Give candidates a real-world task and use this scorecard to separate practitioners from pretenders.
The Homework Task
Give candidates access to one of your live ad accounts (Google Ads, Facebook Ads, or both) and ask them to:
- Audit the account performance over the last 30 days
- Identify 3-5 specific issues or opportunities
- Present actionable recommendations with expected impact
- Deliver a 15-minute presentation of their findings
Time limit: 2-3 hours. This filters out people who can't prioritise and those who overthink instead of executing.
The Scorecard: What to Look For (and What to Avoid)
Data Depth & Technical Understanding
Do they actually understand how to read an ad account, or are they guessing?
Good Answer Indicators
- References specific metrics: CTR, CPC, CVR, ROAS, Quality Score, Relevance Score
- Segments data by campaign, ad group, device, time of day, geography
- Identifies trends over time, not just snapshots
- Mentions platform-specific nuances (e.g., "broad match modifier changed to phrase")
- Checks for tracking issues before blaming creative or targeting
Red Flags to Watch For
- Only looks at account-level metrics without drilling down
- Focuses on vanity metrics like impressions or reach instead of conversions
- Can't explain why certain metrics matter
- Doesn't check attribution settings or conversion tracking setup
- Generic observations like "CTR is low" without context or benchmarks
Score This Section: 0-25 points
25 pts = Deep technical analysis with platform expertise. 0 pts = Surface-level observations with no data segmentation.
Prioritisation & Business Impact
Do they focus on what actually moves the needle, or get lost in irrelevant details?
Good Answer Indicators
- Ranks recommendations by potential revenue impact, not just effort required
- Focuses on fixing conversion bottlenecks before optimising top-of-funnel
- Identifies quick wins vs. long-term strategic shifts
- Quantifies expected impact: "This should reduce CPA by 15-20%"
- Understands your business model and ties recommendations to revenue/profitability
Red Flags to Watch For
- Lists 20+ micro-optimisations without prioritisation
- Focuses on brand metrics when you need direct response performance
- Recommends expensive platform changes before testing simple fixes
- Can't estimate the business impact of their recommendations
- Suggests tactics that worked at their last company without adapting to your business
Score This Section: 0-25 points
25 pts = Clear prioritisation with quantified business impact. 0 pts = Laundry list of tactics with no strategic thinking.
Actionability & Implementation Detail
Can they actually execute, or do they just talk in vague generalities?
Good Answer Indicators
- Provides step-by-step implementation: "First do X, then Y, then measure Z"
- Names specific tools, settings, and features they'd use
- Includes testing methodology and success metrics
- Anticipates potential issues: "If this doesn't work, we'd try..."
- Provides realistic timelines: "Test for 2 weeks at $500/day before scaling"
Red Flags to Watch For
- Vague recommendations: "Improve ad creative" or "Test new audiences"
- No mention of how to actually implement the changes
- Doesn't specify testing frameworks or measurement plans
- Promises results without explaining the execution path
- Recommends things they clearly haven't done before
Score This Section: 0-25 points
25 pts = Detailed, executable plan with clear next steps. 0 pts = Conceptual ideas with no implementation detail.
Communication & Presentation Quality
Can they explain complex ideas to non-technical stakeholders without jargon?
Good Answer Indicators
- Clear structure: Problem → Analysis → Recommendation → Expected Outcome
- Uses visuals (screenshots, charts) to support findings
- Explains technical concepts in business terms
- Anticipates and answers questions before being asked
- Stays on time (15 minutes) without rushing or rambling
Red Flags to Watch For
- Overwhelms with jargon without explaining the business impact
- Disorganised presentation with no clear narrative
- Gets defensive when questioned about recommendations
- Reads slides instead of presenting insights
- Can't simplify technical concepts for non-marketing stakeholders
Score This Section: 0-25 points
25 pts = Clear, confident presentation with strong storytelling. 0 pts = Confusing, jargon-heavy, or disorganised.
Your Hiring Decision Matrix
85-100 points: Hire Immediately
This candidate is a practitioner with deep technical expertise and strategic thinking. Make an offer before someone else does.
70-84 points: Strong Contender
Solid skills with some gaps. Could be great with the right support and mentoring. Proceed to final round.
50-69 points: Proceed with Caution
Has potential but significant skill gaps. Only consider if you have bandwidth to train and they're a cultural fit.
Below 50 points: Pass
Doesn't meet the technical bar or strategic thinking required. Keep looking; your ideal candidate is out there.
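The matrix above reduces to a simple threshold function. A hiring panel could total the four 0-25 section scores like this:

```python
def hiring_decision(section_scores):
    """Map the four 0-25 section scores to the decision bands above."""
    total = sum(section_scores)
    if total >= 85:
        return "Hire immediately"
    if total >= 70:
        return "Strong contender"
    if total >= 50:
        return "Proceed with caution"
    return "Pass"

# Example: data depth 22, prioritisation 20, actionability 23, communication 21
print(hiring_decision([22, 20, 23, 21]))  # 86 -> "Hire immediately"
```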
Pro Tip: Have 2-3 people independently score the homework using this rubric, then compare notes. This eliminates bias and ensures you're evaluating skills, not personality fit.
Why This Scorecard Works
Traditional interviews let candidates rehearse generic answers and fake expertise. This homework task forces them to demonstrate real skills under time pressure.
In 2-3 hours, you'll see exactly how they think, prioritise, and execute. You'll know if they can read data, spot opportunities, and communicate recommendations to stakeholders.
Agencies hire based on "vibes." You're hiring based on proven capability. That's how you build a team that actually delivers results.
Still Not Sure How to Hire the Right Marketing Talent?
We help Australian businesses build high-performing marketing teams from scratch. Our recruitment service includes role design, candidate sourcing, technical vetting (yes, we audit their ad accounts), and shortlist delivery. You only interview pre-qualified candidates who can actually do the job.
30-minute call to discuss your hiring needs and see if we're a fit