Many mentoring programmes run on gut feeling. With structured internal mentoring program survey questions, you see what really works: from matching and meetings to career impact and AI skills — and where mentors or mentees quietly drop out.
Survey questions for internal mentoring programmes
All closed questions use a 1–5 Likert scale (1 = Strongly disagree, 5 = Strongly agree) unless a different anchor is noted, e.g. for frequency items. Tags show who answers each question: [Mentee], [Mentor], [Both].
2.1 Closed questions (Likert scale)
- Q1. [Mentee] I understood the goals of the Mentoring Programme before I joined.
- Q2. [Mentee] The application and onboarding process for the Mentoring Programme was clear and simple.
- Q3. [Both] I know what is expected from me in my role as mentor/mentee.
- Q4. [Mentee] I received enough information to prepare for the first meeting with my mentor.
- Q5. [Mentor] HR explained the purpose and scope of the Mentoring Programme clearly to me.
- Q6. [Both] The time commitment for the Mentoring Programme was communicated realistically.
- Q7. [Mentee] My mentor’s experience matches the topics I want to work on.
- Q8. [Mentee] My mentor understands my role, context and challenges in this company.
- Q9. [Both] Our communication styles fit well enough to have productive conversations.
- Q10. [Mentee] I feel comfortable being open and honest with my mentor.
- Q11. [Mentor] I feel I have the right skills and experience to support my mentee’s goals.
- Q12. [Both] Language, culture or hierarchy differences do not block our collaboration.
- Q13. [Both] We meet/talk as often as agreed at the start of the Mentoring Programme. (Frequency: 1 = Never, 5 = Always)
- Q14. [Both] Our mentoring sessions start on time and use the scheduled duration well.
- Q15. [Both] Each session has a clear focus or agenda.
- Q16. [Mentee] I feel psychologically safe to ask “naive” or critical questions in our sessions.
- Q17. [Both] There is a good balance between mentee talking and mentor talking.
- Q18. [Both] We regularly review progress against the mentee’s goals.
- Q19. [Mentee] Through mentoring, I have more clarity about my next possible career steps.
- Q20. [Mentee] I discovered new internal roles, projects or career paths through this Mentoring Programme.
- Q21. [Mentee] I feel more confident to apply for internal roles or projects.
- Q22. [Mentee] My mentor helps me understand how promotion and progression work in our company.
- Q23. [Both] Our conversations connect to our company’s Career Framework or role levels.
- Q24. [Mentee] I see a clear link between mentoring and my long-term career goals here.
- Q25. [Mentee, AI/skills tracks] My mentor provides practical exercises to apply new skills during or between sessions.
- Q26. [Mentee, AI/skills tracks] I had enough hands-on practice with AI tools or new skills during mentoring.
- Q27. [Mentee, AI/skills tracks] I understand the guardrails for using AI or new tools (data, compliance, ethics).
- Q28. [Mentor, AI/skills tracks] I feel confident guiding my mentee on safe and compliant AI use.
- Q29. [Both, AI/skills tracks] The Mentoring Programme aligns with our company’s skill priorities and Skill Framework.
- Q30. [Mentee] I apply new skills from mentoring in my daily work. (Frequency: 1 = Never, 5 = Very often)
- Q31. [Both] HR is available when I have questions about the Mentoring Programme.
- Q32. [Both] HR provides useful materials or templates for mentoring (e.g. agendas, goal-setting forms).
- Q33. [Both] My Führungskraft supports my participation in the Mentoring Programme.
- Q34. [Both] My workload allows me to participate in mentoring without feeling overloaded.
- Q35. [Mentor] My Führungskraft recognises the time and effort I invest as a mentor.
- Q36. [Both] Scheduling mentoring sessions (rooms, tools, time) works smoothly.
- Q37. [Mentee] Mentoring has improved my overall development and learning at work.
- Q38. [Mentee] Mentoring has improved my performance or impact in my current role.
- Q39. [Mentee] Mentoring increased my engagement and motivation to stay with this company.
- Q40. [Both] The Mentoring Programme is worth the time I invest.
- Q41. [Both] I would recommend this Mentoring Programme to a colleague.
- Q42. [Both] I understand how the Mentoring Programme fits with other development offers (e.g. training, internal mobility).
- Q43. [Mentor] I felt well prepared for the mentor role before the first session.
- Q44. [Mentor] I have clear guidance on what a good mentoring session should look like.
- Q45. [Mentor] I know when I should escalate issues to HR or the mentee’s Führungskraft.
- Q46. [Mentor] I receive feedback from HR or mentees that helps me grow as a mentor.
- Q47. [Mentor] My contribution as a mentor is valued and recognised by the company.
- Q48. [Mentor] I would volunteer to be a mentor again in a future Mentoring Programme.
- Q49. [Both] If needed, I know how to request a change of mentoring match.
- Q50. [Both] The matching process felt transparent and fair.
- Q51. [Mentee] My mentor challenges me constructively rather than just giving advice.
- Q52. [Mentor] My mentee comes prepared to our meetings. (Frequency: 1 = Never, 5 = Always)
- Q53. [Both] Our mentoring relationship improved over time.
- Q54. [Mentee] I feel comfortable ending or pausing the mentoring relationship if it no longer helps.
- Q55. [Mentor] I can set reasonable boundaries around time and topics with my mentee.
- Q56. [Both] I know that my honest feedback about the Mentoring Programme will not be used against me.
Question mapping by dimension: Q1–Q6 = Awareness & Onboarding, Q7–Q12 = Matching Quality, Q13–Q18 = Meeting Rhythm & Quality, Q19–Q24 = Career & Internal Mobility, Q25–Q30 = Skills & AI Use, Q31–Q36 = Support from HR & Managers, Q37–Q42 = Overall Impact & Satisfaction, Q43–Q48 = Mentor-Specific, Q49–Q56 = Relationship & Process Health.
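If you score results in a spreadsheet export or a small analysis notebook, it helps to encode this block mapping once so every report uses the same groupings. The Python sketch below is one illustrative way to do that; the dictionary and function names are assumptions about your tooling, only the question-to-dimension groupings come from the mapping above.

```python
# Illustrative mapping of question IDs to the nine dimensions listed above.
# The data structure is an assumption about your tooling; the groupings are not.
DIMENSIONS = {
    "Awareness & Onboarding":        [f"Q{i}" for i in range(1, 7)],    # Q1–Q6
    "Matching Quality":              [f"Q{i}" for i in range(7, 13)],   # Q7–Q12
    "Meeting Rhythm & Quality":      [f"Q{i}" for i in range(13, 19)],  # Q13–Q18
    "Career & Internal Mobility":    [f"Q{i}" for i in range(19, 25)],  # Q19–Q24
    "Skills & AI Use":               [f"Q{i}" for i in range(25, 31)],  # Q25–Q30
    "Support from HR & Managers":    [f"Q{i}" for i in range(31, 37)],  # Q31–Q36
    "Overall Impact & Satisfaction": [f"Q{i}" for i in range(37, 43)],  # Q37–Q42
    "Mentor-Specific":               [f"Q{i}" for i in range(43, 49)],  # Q43–Q48
    "Relationship & Process Health": [f"Q{i}" for i in range(49, 57)],  # Q49–Q56
}

def dimension_of(question_id: str) -> str:
    """Look up which dimension a question belongs to, e.g. dimension_of("Q20")."""
    for dimension, questions in DIMENSIONS.items():
        if question_id in questions:
            return dimension
    raise KeyError(f"Unknown question ID: {question_id}")

print(dimension_of("Q20"))  # Career & Internal Mobility
```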
If you already run engagement surveys or manager 360s, you can align wording and scales with the templates in this engagement survey question guide to keep your survey landscape consistent.
2.2 Overall / NPS-style questions (0–10)
- NPS1. [Mentee] How likely are you to recommend this Mentoring Programme to a colleague? (0 = Not at all likely, 10 = Extremely likely)
- NPS2. [Mentee] To what extent has this Mentoring Programme improved your clarity about your career in our company? (0 = Not at all, 10 = Very strongly)
- NPS3. [Mentor] How likely are you to recommend being a mentor in this Mentoring Programme to another colleague? (0 = Not at all likely, 10 = Extremely likely)
2.3 Open-ended questions
- OE1. [Mentee] What has changed for you since you joined the Mentoring Programme (work, confidence, career clarity)?
- OE2. [Mentor] What has changed for you as a mentor (skills, visibility, workload, leadership style)?
- OE3. [Both] If you could change one thing about the matching process, what would it be and why?
- OE4. [Both] What should we change about the format of sessions (length, structure, topics)?
- OE5. [Mentee] Describe one mentoring session that was especially helpful. What made it work so well?
- OE6. [Mentee] Where did mentoring not help you as expected? What was missing?
- OE7. [Mentor] What kind of preparation, training or materials would help you support mentees better?
- OE8. [Both] Any other comments, concerns or ideas about the Mentoring Programme?
Decision & action table for mentoring survey results
| Dimension / Questions | Trigger score (average) | Recommended action | Owner | Timeline |
|---|---|---|---|---|
| Awareness & Onboarding (Q1–Q6) | Score <3.5 or ≥20 % “disagree” | Revise onboarding materials; run 30–45 min intro session for new cohorts. | HR / People Development | Within 30 days after survey |
| Matching Quality (Q7–Q12, Q50) | Score <3.5 or ≥10 % request rematch (OE3/OE8) | Refine matching criteria; introduce simple rematch process; communicate clearly. | HR + Programme Sponsor | Design within 45 days; go-live next cohort |
| Meeting Rhythm & Quality (Q13–Q18, Q51–Q53) | Score <3.5 or many comments about “no time” | Share session templates; brief Führungskräfte on time protection; suggest minimum cadence. | HR + People Managers | Guidelines within 21 days; manager briefings within 60 days |
| Career & Internal Mobility (Q19–Q24, NPS2) | Score <3.5 or NPS2 <7.0 | Link mentoring to internal mobility tools and Career Framework; run joint info session. | HR + Talent Management | Concept within 45 days; first session within 90 days |
| Skills & AI Use (Q25–Q30) | Score <3.5 in AI tracks | Create short AI/skills playbooks for mentors; offer 2–3 micro-trainings. | L&D / Digital Academy | Materials within 30 days; trainings over next 3 months |
| Support from HR & Managers (Q31–Q36) | Score <3.5 or strong gaps between units | Clarify expectations in Führungskraft briefing; add mentoring to 1:1 agendas. | HRBPs + Line Managers | Briefings within 60 days; follow-up in next quarter |
| Mentor Experience (Q43–Q48) | Score <3.8 or NPS3 <8.0 | Introduce mentor community check-ins; adjust recognition (e.g. goals, visibility). | HR / Programme Lead | Design within 45 days; first check-in within 90 days |
| Overall Impact & Safety (Q37–Q42, Q56, NPS1) | NPS1 <8.0 or Q56 <4.0 | Run focus groups; review confidentiality, GDPR info and conflict escalation paths. | HR + Works Council + DPO | Focus groups within 30 days; policy updates within 90 days |
Key takeaways
- Use clear question blocks to diagnose matching, meetings, careers and skills separately.
- Translate low scores into 30–90 day actions with named owners.
- Protect psychological safety with anonymised surveys and clear escalation options.
- Connect mentoring results to internal mobility, Career Frameworks and skill management.
- Close the loop: share changes after each survey to build trust and participation.
Definition & scope
This survey measures how mentees and mentors experience your internal Mentoring Programme across nine dimensions: onboarding, matching, meetings, careers/internal mobility, skills/AI, HR and manager support, overall impact, mentor experience, and relationship and process health. It targets all active mentors and mentees in career, peer, reverse or AI-skills mentoring. Results guide decisions on programme design, training, recognition, resource allocation and links to internal mobility and Talent Marketplace initiatives.
Blueprints: ready-made internal mentoring program survey setups
You rarely need all questions at once. Below are four plug-and-play blueprints using the internal mentoring program survey questions above. Use question numbers, not new items, to keep your data comparable over time.
a) Post-pilot survey for a new Mentoring Programme (15–20 items)
Goal: Understand whether the pilot works in principle: onboarding, matching, basic meeting quality, first impact. Run 1–2 weeks after pilot end.
Recommended closed questions: Q1–Q6 (Awareness & Onboarding), Q7–Q12 (Matching Quality), Q13–Q16 (Meeting Rhythm & Safety), Q37–Q40 (Early Impact & Value), Q56 (Psychological Safety). Add NPS1 and NPS3, plus OE1, OE3, OE7 and OE8.
- HR configures the question set above (closed items, NPS1, NPS3 and the open questions) in the survey tool.
- Send the survey to all pilot mentors and mentees within 7 days of the last planned session.
- Report by dimension (onboarding, matching, meetings, impact) with comments per block.
- Decide “Stop / Fix / Scale” and document changes before next cohort.
b) Annual pulse for an established Mentoring Programme (20–25 items)
Goal: Optimise a running programme and compare cohorts. Run once per year for all active pairs plus those who finished in the last 6–12 months.
Recommended closed questions: Q1–Q4, Q7–Q12, Q13–Q18, Q19–Q24, Q31–Q36, Q37–Q42, Q43–Q48. Add NPS1–NPS3 and OE2, OE3, OE4, OE8.
- Keep the core set stable year to year to track trends.
- Segment results by business unit, location, role level, mentoring type (career, reverse, AI).
- Link findings to your internal mobility and Talent Marketplace strategy.
- Share top three company-wide actions and unit-level insights within 45 days.
c) Short pulse after 3–4 sessions (10–12 items)
Goal: Catch early issues with matching, meetings or overload. Run 6–8 weeks after programme start.
Recommended closed questions: Q3, Q7–Q10, Q13–Q18, Q33–Q35, Q56, plus NPS1. Add OE3 and OE4.
- Use an anonymous, short survey (≤5 minutes) to protect relationships.
- Trigger one-to-one follow-up if comments hint at severe mismatch or boundary issues.
- Where many mentees flag “no time”, involve Führungskräfte to unblock calendars.
- Adjust matching rules or offer rematches before frustration builds.
d) Mentor-only survey (10–12 items)
Goal: Understand mentor motivation, workload and future participation. Run annually or after each wave.
Recommended closed questions: Q5, Q11, Q31–Q36, Q43–Q48, Q52, Q55, Q56, plus NPS3. Add OE2, OE7, OE8.
- Keep mentor survey separate from mentee survey to allow honest comments.
- Discuss results in a mentor community call, then agree on 2–3 concrete improvements.
- Link mentor contributions to your leadership development and Career Framework.
- Use insights to refine recognition (e.g. goals, promotions, visibility, learning credits).
Scoring & thresholds
Use a 1–5 scale for most questions (1 = Strongly disagree, 5 = Strongly agree). For frequency items, use 1 = Never, 5 = Always/Very often. Treat averages as follows: <3.0 = critical, 3.0–3.7 = needs improvement, 3.8–4.2 = healthy, >4.2 = strong area.
- HR defines target scores per dimension before launching the survey (e.g. matching ≥4.0).
- Analyse results per block: Awareness (Q1–Q6), Matching (Q7–Q12), Meetings (Q13–Q18), etc.
- If a block averages <3.0, schedule a deep-dive workshop with mentors and mentees within 30 days.
- If scores are 3.0–3.7, implement light-touch fixes (templates, communication, training) within 60 days.
- If scores are ≥4.0, capture good practices and share them with new mentors and cohorts.
For NPS-style questions (0–10), treat 9–10 as “promoters”, 7–8 as “passives” and 0–6 as “detractors”; the score is the share of promoters minus the share of detractors, expressed in points. Aim for NPS1 ≥30 for mentees and NPS3 ≥20 for mentors after 2–3 cycles. A talent platform such as Sprad Growth, or a similar tool, can consolidate mentoring survey data with performance and development data in one place and support follow-up tasks, without turning the survey itself into a tooling project.
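As a worked example of the bands and NPS logic above, the sketch below classifies a 1–5 block average and computes an NPS-style score in points. Only the cut-offs come from this section; the function names and sample data are assumptions for illustration.

```python
from statistics import mean

def band(avg: float) -> str:
    """Classify a 1-5 block average using the cut-offs above
    (averages are assumed to be reported to one decimal place)."""
    if avg < 3.0:
        return "critical"
    if avg < 3.8:
        return "needs improvement"
    if avg <= 4.2:
        return "healthy"
    return "strong area"

def nps(scores: list[int]) -> float:
    """NPS-style score in points (-100 to +100):
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical data: per-respondent Matching averages and NPS1 answers.
matching_averages = [3.4, 4.0, 2.8, 3.6, 4.2]
print(band(mean(matching_averages)))       # "needs improvement" (mean = 3.6)
nps1_answers = [10, 9, 8, 7, 6, 9, 10, 3]  # individual 0-10 answers to NPS1
print(nps(nps1_answers))                   # 25.0 -> below the >=30 mentee target
```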
Follow-up & responsibilities
Survey results are only useful if someone owns the next steps. Define a simple governance model: the Programme Lead owns design changes, HRBPs own local follow-up, Führungskräfte own time and workload, mentors own session quality, and the Executive Sponsor owns resources.
- Programme Lead: synthesises results by dimension and unit and drafts action proposals within 14 days.
- HRBPs: discuss local results with Führungskräfte and agree on 2–3 actions per unit within 30 days.
- Führungskräfte: ensure time for mentoring in calendars and discuss expectations in 1:1s within 45 days.
- Mentors: adapt their practice (e.g. more structure, more challenge) based on feedback within 60 days.
- Executive Sponsor: reviews progress quarterly and decides on budget, tools or scope changes.
For highly critical feedback (e.g. harassment, ethical issues), define a fast lane: HR reviews free-text comments within 24 hours using clear rules, separates programme feedback from individual complaints and activates existing whistleblowing or complaint procedures if needed. Keep mentoring feedback out of individual performance reviews to protect trust; instead, connect it indirectly via development paths and manager feedback surveys.
Fairness & bias checks
Mentoring access and quality often differ by location, job family, gender, age or working model (remote vs. office). Use your survey to make these gaps visible and handle them fairly, especially in DACH environments with strong co-determination.
- Segment scores by location, business unit, role level, gender (where lawful), employment type and mentoring type.
- Check whether certain groups systematically get weaker matching (Q7–Q12) or lower career impact (Q19–Q24).
- If reverse mentoring or AI mentoring targets only specific groups, check their workload (Q34–Q35) and recognition (Q47).
- Discuss segmentation logic and anonymisation thresholds (e.g. ≥7 respondents per cut) with the Betriebsrat; a minimal suppression check is sketched after this list.
- Share high-level patterns, not individual comments, with Führungskräfte to avoid re-identification.
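One way to enforce the anonymisation threshold in practice is to suppress any segment below the agreed minimum before results leave the analysis environment. The sketch below assumes a simple list of (segment, average, respondent count) tuples and a minimum of 7 respondents; the names and data shape are illustrative, not a prescribed implementation.

```python
MIN_RESPONDENTS = 7  # anonymisation threshold agreed with DPO and Betriebsrat

def split_reportable(rows):
    """Separate segments that are safe to report from those that must be suppressed.
    rows: iterable of (segment_name, average_score, respondent_count)."""
    reportable, suppressed = [], []
    for segment, avg, count in rows:
        (reportable if count >= MIN_RESPONDENTS else suppressed).append(segment)
    return reportable, suppressed

# Hypothetical cut: Matching Quality (Q7-Q12) averages by location.
rows = [("Berlin", 3.9, 23), ("Vienna", 3.4, 11), ("Basel", 2.9, 4)]
report, hidden = split_reportable(rows)
print(report)  # ['Berlin', 'Vienna'] - large enough to share with Führungskräfte
print(hidden)  # ['Basel'] - below the threshold, report only at a higher level
```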
Typical patterns and responses:
| Pattern | Risk | Response |
|---|---|---|
| Women rate career impact (Q19–Q24) lower than men in the same units. | Mentoring reinforces existing inequalities instead of reducing them. | Review matching criteria, increase senior sponsor involvement, and include targeted career topics. |
| Remote staff report lower meeting quality (Q13–Q18) than on-site staff. | Two-class experience; remote staff feel excluded. | Provide virtual session guidelines and tool support, and coach mentors on remote relationship-building. |
| Blue-collar mentees score manager support (Q33–Q35) lower than white-collar mentees. | Frontline staff cannot use the Mentoring Programme effectively. | Align with operations leaders, adjust shift planning, offer mobile-friendly formats. |
Examples / use cases
Use case 1: Matching feels “off” and mentees disengage
In one DACH company, mentees rated Matching Quality (Q7–Q12) at 3.1, while overall satisfaction (Q40) stayed around 3.9. Comments said “my mentor is great, but not in my field” and “too big hierarchy gap”. HR and the Programme Lead changed matching to use skills and interests instead of only hierarchy and business unit.
They also added a simple rematch option and updated communication to say rematches are normal, not a failure. In the next wave, Matching scores rose to 4.1 and mentee drop-out after 2–3 sessions fell by about half. A similar approach works well when you connect your Mentoring Programme with skill data from a skill management solution.
Use case 2: Strong mentor enthusiasm, weak career impact
Another organisation saw very high Mentor Experience scores (Q43–Q48 ≈4.5) and NPS3 ≈9, but mentee Career & Internal Mobility (Q19–Q24) sat at 3.2. Mentors enjoyed “giving back”, yet conversations stayed generic.
HR built a simple pack of career conversation guides, including how to use their internal Talent Marketplace and Career Framework in sessions, based on ideas from their internal mobility software evaluation. After two cohorts, career impact scores increased to 3.9, and internal applications by mentees rose measurably.
Use case 3: AI mentoring without clear guardrails
A tech firm launched AI-skills mentoring. Skills & AI Use scores (Q25–Q30) differed hugely by team; comments showed confusion about GDPR, tools and data handling. Some mentors refused AI topics entirely, others experimented with sensitive data.
HR, IT and Legal set up a short AI policy, a list of approved tools and sandbox environments. They also offered an “AI safety” micro-training for mentors, inspired by their own AI coaching approach for managers. Six months later, AI items averaged 4.0 and qualitative comments shifted from “I’m scared” to “I know what’s allowed and can practise safely”.
Implementation & updates
Think of your mentoring survey as part of your broader talent and skills stack, not a one-off project. Keep it light enough for mentors and mentees, but structured enough for HR, DPO and Betriebsrat.
- Pilot: Run the post-pilot blueprint with one business unit or one mentoring track (e.g. career mentoring) first.
- Rollout: Expand to all mentoring types (career, peer, reverse, AI) once questions and thresholds feel stable.
- Timing: Use a short pulse after 3–4 sessions, then a deeper survey at programme end or annually.
- Training: Brief Führungskräfte and mentors on how results will be used and not used (no individual rating).
- Review: Once per year, review items, thresholds and processes with Programme Sponsor and Betriebsrat.
Track 3–5 core metrics over time: response rate (target ≥70 %), average Matching score (Q7–Q12), average Career Impact (Q19–Q24), Mentor NPS (NPS3), and share of mentees who move internally within 12–24 months. Connecting these KPIs with your broader talent development strategy helps you argue for or against scaling the Mentoring Programme.
DACH / GDPR / Betriebsrat notes
Handle mentoring surveys like any Mitarbeiterbefragung: clarify legal basis (usually legitimate interest), purpose (programme improvement only), data minimisation (no unnecessary demographics), anonymisation thresholds (e.g. no cuts <7 respondents) and retention (e.g. delete raw comments after 24 months). Agree these points with your DPO and Betriebsrat and document them in a short FAQ for participants.
Be explicit that feedback will not be used to evaluate individual mentors or mentees, and remind people about existing channels for personal conflicts or misconduct. Tools like Sprad Growth or similar HR analytics platforms can help automate retention periods and access controls without exposing sensitive comments widely.
Conclusion
Mentoring can be a powerful engine for internal mobility, skill development and leadership culture — or a well-meant side project that silently fades. The difference is rarely intent; it is whether you listen systematically to mentors and mentees and act on what you hear. A clear set of internal mentoring program survey questions gives you that signal.
With the question bank and blueprints above, you can spot problems earlier (mis-matches, unstructured sessions, unclear career links), improve conversations between mentors, mentees and Führungskräfte, and focus your development budget on what really moves careers in your company. Start small: pick one blueprint, configure it in your survey or Talent Management tool, and agree who owns analysis and follow-up.
Next, align your Mentoring Programme with existing processes like performance reviews, Career Frameworks and internal Talent Marketplaces. Over time, use your metrics to decide whether to expand, target specific populations (e.g. women in leadership, blue-collar internal mobility) or combine mentoring with formal learning and AI-enabled coaching. The survey will not fix mentoring on its own, but it gives you the evidence to make better, fairer and faster decisions.
FAQ
How often should we run mentoring surveys?
For most organisations, two touchpoints work well: a short pulse after 3–4 sessions (10–12 questions) and a deeper survey once per year or at programme end. If your Mentoring Programme runs in cohorts, align the deep survey with cohort end. If it’s continuous, pick a fixed month for all participants. Avoid surveying people more than 2–3 times per year across all HR surveys to prevent fatigue.
What should we do if scores are very low in one area?
First, look at comments and segments to understand who is struggling and why. Then define 1–3 concrete actions with owners and deadlines, using the decision table above. For example, poor Matching scores might lead to revising criteria and adding a rematch process; weak meeting quality might trigger mentor training and session templates. Communicate changes back to participants so they see that feedback matters.
How do we protect psychological safety and honest feedback?
Use an anonymous survey tool, set minimum group sizes for any breakdown (e.g. ≥7 respondents) and avoid asking for identifiable details (exact role, age, manager name). Clarify in your info sheet that responses are used to improve the Mentoring Programme, not to rate individuals, and that serious issues still belong in existing complaint channels. According to public sector mentoring guidance from the U.S. Office of Personnel Management, clear expectations and boundaries are key to safe mentoring; your survey and communication should reinforce that.
How can we link mentoring results to internal mobility and careers?
Use the Career & Internal Mobility block (Q19–Q24) plus NPS2 as a bridge. Low scores there indicate that mentors need better information about internal roles, your Career Framework and Talent Marketplace or similar tools. Involve Talent Management colleagues to co-create career conversation guides, and track whether mentees apply internally or move roles after mentoring. Over time, aim to see both higher career clarity scores and higher internal fill rates.
How do we keep the question set up to date?
Review the survey annually with the Programme Sponsor, 2–3 experienced mentors, 2–3 mentees, HR, DPO and Betriebsrat. Keep 70–80 % of questions stable for trend analysis; swap out 3–5 items to reflect new priorities such as AI skills or new mentoring tracks. Archive old versions with dates so historical comparisons stay clear. If you introduce new mentoring formats, pilot them with a small, tailored question subset before merging data into your main time series.