Manager 1:1 Meeting Survey Questions: How Employees Experience Their One-on-Ones

By Jürgen Ulbrich

Manager 1:1 meeting survey questions help you see whether your one-on-ones really work, not just whether they happen. With this template you spot reliability issues, missing structure, weak follow-through and hidden trust problems early, and you get clear actions for HR, managers and the works council.

Survey questions for manager 1:1 meetings

Closed questions (5‑point Likert scale)

All statements use a 1–5 scale: 1 = Strongly disagree, 5 = Strongly agree.

  • Q1 – Frequency & reliability: My 1:1s with my manager happen at the agreed regular cadence.
  • Q2 – Frequency & reliability: My manager rarely cancels our 1:1s (or quickly offers a new slot).
  • Q3 – Frequency & reliability: When a 1:1 is moved, it is rescheduled within ≤7 days.
  • Q4 – Frequency & reliability: I can rely on having enough time in 1:1s to discuss my topics.
  • Q5 – Frequency & reliability: My manager treats our 1:1s as a priority, not as “nice to have”.
  • Q6 – Frequency & reliability: I know what cadence is expected for 1:1s in my team or company.
  • Q7 – Agenda & structure: We usually have a clear agenda for our 1:1s.
  • Q8 – Agenda & structure: I can easily add my own topics to the 1:1 agenda.
  • Q9 – Agenda & structure: Our 1:1s balance day‑to‑day work topics and longer‑term development.
  • Q10 – Agenda & structure: Our 1:1s start on time and stay focused.
  • Q11 – Agenda & structure: Action items from previous 1:1s are reviewed and followed up.
  • Q12 – Agenda & structure: Our 1:1 format works well for remote or hybrid collaboration.
  • Q13 – Psychological safety & trust: I feel safe to raise difficult topics in my 1:1s.
  • Q14 – Psychological safety & trust: I can talk openly about mistakes without fear of blame.
  • Q15 – Psychological safety & trust: My manager listens actively and does not interrupt me.
  • Q16 – Psychological safety & trust: I trust my manager to handle sensitive information appropriately.
  • Q17 – Psychological safety & trust: I can give upward feedback in 1:1s without negative consequences.
  • Q18 – Psychological safety & trust: After a 1:1, I usually feel heard and respected.
  • Q19 – Feedback & development: I receive clear feedback on my performance in 1:1s.
  • Q20 – Feedback & development: I understand what is expected of me in my role.
  • Q21 – Feedback & development: We talk about my strengths and how to use them more often.
  • Q22 – Feedback & development: We discuss concrete development goals or an Individual Development Plan (IDP).
  • Q23 – Feedback & development: My manager supports me with resources for learning or training.
  • Q24 – Feedback & development: Development topics come up regularly, not only before reviews.
  • Q25 – Feedback & development: Feedback in 1:1s helps me improve my performance.
  • Q26 – Support & workload: My manager helps me prioritise when my workload is high.
  • Q27 – Support & workload: I can discuss blockers in 1:1s and usually get support.
  • Q28 – Support & workload: My manager respects my working hours and boundaries in 1:1s.
  • Q29 – Support & workload: We regularly review whether my goals are realistic.
  • Q30 – Support & workload: My manager checks in on my wellbeing (stress, energy) in 1:1s.
  • Q31 – Support & workload: 1:1s help me feel less overloaded and more focused.
  • Q32 – Career & internal mobility: We speak about my longer‑term career interests in 1:1s.
  • Q33 – Career & internal mobility: I know which skills I should build for future roles.
  • Q34 – Career & internal mobility: My manager informs me about internal opportunities or projects.
  • Q35 – Career & internal mobility: We discuss how my current work supports my career goals.
  • Q36 – Career & internal mobility: I feel my manager would support an internal move if it fits.
  • Q37 – Career & internal mobility: Our 1:1s make my career options in the company clearer.
  • Q38 – Overall impact & satisfaction: Our 1:1s help me perform better in my job.
  • Q39 – Overall impact & satisfaction: Our 1:1s strengthen my relationship with my manager.
  • Q40 – Overall impact & satisfaction: Overall, I am satisfied with the quality of my 1:1s.
  • Q41 – Overall impact & satisfaction: Our 1:1s are worth the time I invest in them.
  • Q42 – Overall impact & satisfaction: Compared to previous jobs, my current 1:1s work well.

Overall 0–10 rating question

  • Q43 – Overall rating: How likely are you to recommend your current 1:1s with this manager to a colleague as a useful format? (0 = Not at all likely, 10 = Extremely likely)

Open-ended questions

  • O1 – What is one thing that would make your 1:1s with your manager more valuable for you?
  • O2 – What is one thing your manager should start doing in 1:1s?
  • O3 – What is one thing your manager should stop doing in 1:1s?
  • O4 – What is one thing your manager should continue doing in 1:1s?
  • O5 – Describe a recent 1:1 that helped you a lot. What made it effective?
  • O6 – Describe a recent 1:1 that felt unhelpful. What was missing or went wrong?
  • O7 – How could the company better support managers to run high‑quality 1:1s?
  • O8 – Are there topics you would like to discuss in 1:1s but currently avoid? Why?
  • O9 – For remote/hybrid colleagues: What would improve virtual 1:1s with your manager?
  • O10 – Any other comments about how you experience your recurring 1:1 meetings?

Decision & action table

For each area, the table lists the threshold (team average), recommended action, owner and timeline:

  • Frequency & reliability (Q1–Q6)
    Threshold (team average): <3.0, or ≥30 % "disagree"/"strongly disagree"
    Recommended action: clarify 1:1 cadence, block recurring slots, set cancellation rules; communicate to the team.
    Owner: direct manager, supported by HR
    Timeline: within 14 days after results

  • Agenda & structure (Q7–Q12)
    Threshold (team average): <3.3
    Recommended action: introduce a standard 1:1 agenda template and shared notes; short manager training or e-learning.
    Owner: HR / People Development
    Timeline: concept in 30 days; rollout next quarter

  • Psychological safety & trust (Q13–Q18)
    Threshold (team average): <3.0, or any single item <2.5
    Recommended action: confidential coaching for the manager; optional team check-in; review complaints/whistleblowing channels.
    Owner: HRBP + manager's manager
    Timeline: initial plan in ≤14 days; first coaching within 30 days

  • Feedback & development (Q19–Q25)
    Threshold (team average): 3.0–3.5 (medium)
    Recommended action: link 1:1s to goals and IDPs; provide feedback phrasing guides and examples.
    Owner: HR / L&D
    Timeline: materials in 30 days; monitor next review cycle

  • Support & workload (Q26–Q31)
    Threshold (team average): <3.3 combined with high stress in engagement surveys
    Recommended action: run a workload review; adjust priorities; train managers in workload conversations and boundaries.
    Owner: manager + functional lead
    Timeline: start within 21 days; follow-up in 60 days

  • Career & internal mobility (Q32–Q37)
    Threshold (team average): <3.0 while overall 1:1 satisfaction (Q40) is ≥3.5
    Recommended action: introduce a career framework, internal mobility communication and career-focused 1:1 templates.
    Owner: HR / Talent Management
    Timeline: pilot in 60 days; company-wide in 6–12 months

  • Overall impact & satisfaction (Q38–Q42, Q43)
    Threshold: Q40 <3.5, or Q43 average <7.0
    Recommended action: run a manager enablement program; share best-practice 1:1 guides and peer learning.
    Owner: People & Culture + leadership team
    Timeline: program concept in 45 days; first cohort in 3 months

  • Very low safety or misconduct signals (Q13–Q18 + open comments)
    Threshold: repeated comments on fear, retaliation or disrespect
    Recommended action: escalate to HR and, if applicable, the works council (Betriebsrat); follow the company process for investigation and support.
    Owner: HRBP + Compliance
    Timeline: initial assessment within ≤5 days

Key takeaways

  • Measure 1:1 quality, not just frequency, to improve engagement and performance.
  • Cluster results into clear areas: reliability, safety, feedback, workload, career.
  • Use thresholds and owners so every weak signal triggers a concrete action.
  • Protect anonymity in small teams and involve the Betriebsrat early.
  • Combine this survey with structured 1:1 agendas and manager enablement.

Definition & scope

This survey measures how employees experience recurring 1:1 meetings with their direct manager: structure, reliability, psychological safety, feedback quality, support and career impact. It targets all employees with regular 1:1s and is especially useful after 1:1 rollouts or manager trainings. Results guide decisions on manager coaching, leadership programs, career frameworks and possible adjustments to performance management.

Scoring & thresholds

Closed questions use a 1–5 agreement scale; the overall rating uses 0–10. For interpretation, treat scores <3.0 as critical, 3.0–3.9 as an improvement area, and ≥4.0 as a strength. Look at both averages and distributions: a few very low ratings can signal serious issues even when the mean looks acceptable.

Turn scores into decisions by mapping items to dimensions: Q1–Q6 (reliability), Q7–Q12 (structure), Q13–Q18 (safety), Q19–Q25 (feedback/development), Q26–Q31 (support/workload), Q32–Q37 (career), Q38–Q42 + Q43 (overall). Combine these findings with your engagement or performance data for a fuller view. For deeper 1:1 content and agendas you can align this with your existing resources on effective 1:1 meetings.
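
The dimension mapping and thresholds above can be sketched as a small scoring script. The question groupings and cut-offs follow this article; the response data shape and function names are illustrative assumptions, not a prescribed tool.

```python
# Map survey items to dimensions and classify each dimension's average
# using the article's thresholds: <3.0 critical, 3.0-3.9 improvement
# area, >=4.0 strength. Responses: {question_id: [1-5 ratings]}.

DIMENSIONS = {
    "reliability": [f"Q{i}" for i in range(1, 7)],    # Q1-Q6
    "structure":   [f"Q{i}" for i in range(7, 13)],   # Q7-Q12
    "safety":      [f"Q{i}" for i in range(13, 19)],  # Q13-Q18
    "feedback":    [f"Q{i}" for i in range(19, 26)],  # Q19-Q25
    "workload":    [f"Q{i}" for i in range(26, 32)],  # Q26-Q31
    "career":      [f"Q{i}" for i in range(32, 38)],  # Q32-Q37
    "overall":     [f"Q{i}" for i in range(38, 43)],  # Q38-Q42
}

def classify(avg):
    """Label a dimension average per the article's thresholds."""
    if avg < 3.0:
        return "critical"
    if avg < 4.0:
        return "improvement area"
    return "strength"

def dimension_report(responses):
    """Return {dimension: (average, label)} for answered items."""
    report = {}
    for name, items in DIMENSIONS.items():
        ratings = [r for q in items for r in responses.get(q, [])]
        if ratings:  # skip dimensions with no answers
            avg = sum(ratings) / len(ratings)
            report[name] = (round(avg, 2), classify(avg))
    return report
```

For example, `dimension_report({"Q1": [4, 4], "Q13": [2, 3]})` flags reliability as a strength and safety as critical, mirroring the decision table's escalation logic.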

  • HR defines standard thresholds (low, medium, high) and documents them in survey guidelines by end of quarter.
  • People Analytics prepares a dashboard showing averages and % favorable (4–5) for each dimension within 7 days after survey close.
  • Where any dimension is <3.0, HRBPs review results with line leaders within 14 days.
  • Managers whose team's overall satisfaction (Q40) is <3.5 create 1–2 concrete improvement actions within 30 days.
  • HR tracks completion of action plans and re‑checks scores in a pulse after 3–6 months.

Follow-up & responsibilities

Clear ownership prevents this survey from becoming “just another pulse”. Direct managers own local follow‑up with their teams. HR/People & Culture owns methodology, tooling and cross‑company analysis. Area leaders ensure that managers take signals seriously and invest time into improving 1:1s. In DACH, involve the Betriebsrat early and document your process.

A talent platform like Sprad Growth or an internal survey tool can automate sends, reminders and tracking of follow‑up actions, but responsibilities stay human. Tie this survey into your review processes and your broader performance management approach so 1:1s support goals, feedback and development consistently.

  • HR defines a RACI for survey design, rollout, analysis and follow‑up and shares it with leadership 4 weeks before launch.
  • HRBPs brief all managers on how to read results and discuss them with teams at least 1 week before results go live.
  • Managers present their team’s key findings and actions in a 30‑minute 1:1 or team slot within 21 days.
  • Area heads review completion of follow‑up actions (e.g. via a simple tracker) within 45 days after results.
  • HR consolidates learning across areas into manager enablement content and updates leadership once per quarter.

Fairness & bias checks

Look at results by relevant groups: location, business unit, job level, tenure, remote vs. office, full-time vs. part-time. In DACH you also want to understand patterns across different employment agreements or sites with separate Betriebsräte, without exposing individuals. Always respect minimum cell sizes (e.g. ≥5 responses per slice) to protect anonymity.
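
As a minimal sketch of the minimum-cell-size rule, the helper below suppresses any group average with fewer than five responses before it reaches a report; the group labels and data shape are illustrative assumptions.

```python
# Suppress survey breakdowns below a minimum cell size to protect
# anonymity. The >=5 threshold follows the article's example.

MIN_CELL_SIZE = 5

def safe_breakdown(ratings_by_group, min_n=MIN_CELL_SIZE):
    """Return each group's average rating, or None where the group
    has fewer than min_n respondents (i.e. the slice is suppressed)."""
    result = {}
    for group, ratings in ratings_by_group.items():
        if len(ratings) < min_n:
            result[group] = None  # too few respondents: do not report
        else:
            result[group] = round(sum(ratings) / len(ratings), 2)
    return result
```

For instance, `safe_breakdown({"remote": [4, 4, 5, 3, 4], "office": [5, 2, 3]})` reports the remote average but returns None for the three-person office slice, which a dashboard should render as "n too small" rather than a number.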

Typical patterns: remote employees often score lower on frequency and structure; new joiners may rate psychological safety differently; teams under high pressure show weak workload and wellbeing support. Combine this survey with broader employee survey templates and governance so methods, scales and GDPR rules stay consistent.

  • People Analytics sets minimum group size rules (e.g. no breakdowns <5 respondents) and configures them in the reporting tool.
  • HR reviews dimension scores by gender, working time model and remote/office at company level within 14 days.
  • Where one group scores >0.5 points lower than others, HRBPs discuss root causes with local leaders within 30 days.
  • HR and Betriebsrat agree how far manager‑level results can be shown without risking identifiability before launch.
  • Once per year, HR audits items for potentially biased wording and updates the question bank with diverse manager input.

Examples / use cases

Use case 1: Good engagement, weak 1:1 structure

A tech team has strong engagement and performance, but scores 2.9 on agenda & structure (Q7–Q12). Comments mention “ad‑hoc chats” instead of real 1:1s. HR and the Director agree on a simple 30‑minute bi‑weekly 1:1 template with three blocks: priorities, feedback, development. Managers test it for 3 months; the next pulse shows structure scores at 3.9 and employees report clearer follow‑through.

Use case 2: Low psychological safety in one department

Overall safety scores are 3.8, but one sales unit sits at 2.6 on Q13–Q18. Comments mention fear of raising mistakes. HRBPs run confidential interviews, then arrange targeted coaching for the manager and a workshop on feedback and error culture. Complaints channels are restated. Six months later, scores rise to 3.3, and turnover in that unit drops.

Use case 3: Career discussions missing for mid‑career staff

Employees with 3–7 years tenure rate career & internal mobility (Q32–Q37) at 2.8, while others are above 3.5. HR introduces a basic career framework and integrates career check‑ins into 1:1s twice per year. Manager training uses scenarios and tools like talent development guides. In the next cycle, career scores rise, and internal mobility increases without extra headcount.

  • HR documents 2–3 real internal use cases after the first survey and shares them with leadership within 90 days.
  • HRBPs identify at least one “good practice” team with strong 1:1 scores and ask them to share formats with other managers.
  • Managers use survey stories in manager communities or brown‑bags to normalise talking about 1:1 quality.
  • HR links use cases to manager training modules, including AI‑supported prep as described in AI coaching for managers.

Implementation & updates

Start small: a pilot with one or two areas gives you real data and trust before a company‑wide rollout. Clarify legal basis under GDPR (usually legitimate interest with information duty), data minimisation and retention periods (e.g. delete raw data after 24 months). Involve the Betriebsrat early: share items, scales, anonymity rules and reporting boundaries.

Pick timing that fits your rhythm: many organisations run this survey once per year, plus a short pulse after new 1:1 formats, performance cycles or manager trainings. Align with tools you already use for engagement or performance; some performance platforms, or a talent system like Sprad Growth with Atlas AI, can embed 1:1 agendas and survey nudges directly in workflows.

  • Month 1: HR drafts the question set and thresholds, discusses them with HRBPs, Legal and Betriebsrat.
  • Month 2: Run a pilot in 1–2 departments (≥50 people), test communication, anonymity rules and dashboards.
  • Month 3: Adjust wording and thresholds based on feedback, then roll out company‑wide.
  • Month 4: Train managers to use results in 1:1s and team discussions, supported by short job aids or micro‑learning.
  • Annually: Review questions, add or retire items, and adjust thresholds based on trends and links to performance data.

Suggested KPIs to track

  • Participation rate in the 1:1 survey (target ≥70 % company‑wide, ≥60 % per area).
  • Average scores per dimension and % favorable (4–5) over time.
  • Share of managers with documented 1:1 improvement actions after each survey (target ≥80 %).
  • Change in engagement or eNPS in teams that improved 1:1 scores vs. those that did not.
  • Links to hard outcomes: turnover, internal mobility, performance review quality for teams with strong 1:1s.
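
Two of the KPIs above, participation rate and % favorable, can be computed directly from raw counts and ratings. This is a minimal sketch under the article's definitions; the function names are illustrative.

```python
# Participation rate and % favorable (4-5 on the 1-5 scale),
# as defined in the KPI list above.

def participation_rate(responses_received, invited):
    """Share of invited employees who responded, as a percentage."""
    if invited == 0:
        return 0.0
    return round(100 * responses_received / invited, 1)

def pct_favorable(ratings):
    """Share of 1-5 ratings that are 4 or 5, as a percentage."""
    if not ratings:
        return 0.0
    favorable = sum(1 for r in ratings if r >= 4)
    return round(100 * favorable / len(ratings), 1)
```

With 140 responses from 200 invites, participation is 70.0 %, exactly the company-wide target; three favorable ratings out of five gives 60.0 % favorable.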

Conclusion

Recurring manager 1:1s are one of your strongest levers for performance, engagement and retention – but only if employees experience them as reliable, safe and useful. A focused survey on 1:1 quality surfaces issues you rarely see in classic engagement pulses: cancelled meetings, missing agendas, weak follow‑through or low psychological safety.

With clear manager 1:1 meeting survey questions, thresholds and ownership, you can detect problems earlier, improve the quality of conversations and sharpen priorities for manager development. Start by piloting the survey in one area, configure it in your survey or performance tool, and name owners for analysis and follow‑up. Then, link results to your 1:1 templates, manager training and performance processes so improvements stick.

If you treat this survey as a repeating management tool – not a one‑off – you will see three things: earlier detection of relationship issues, better conversations about goals and development, and clearer decisions on where to invest in leadership support, coaching and career paths.

FAQ

How often should we run this 1:1 quality survey?

For most companies, once per year is enough for a full survey with all dimensions, plus a short pulse (5–7 items) after major changes like a new 1:1 format, performance cycle or manager training. If you run many surveys already, avoid fatigue: coordinate with engagement and pulse surveys so employees don’t get multiple long questionnaires in the same month.

What should we do if a team shows very low scores?

First, check anonymity and group size to avoid exposing individuals. Then HRBP and the manager’s manager review quantitative scores and comments together. If psychological safety or workload look critical (scores <2.5 or alarming comments), prioritise a quick conversation with the manager, optional listening sessions, and targeted coaching. Document actions and re‑check with a pulse after 3–6 months. Escalate potential misconduct through your usual channels.

How do we protect anonymity, especially in small teams?

Use a minimum cell size (e.g. 5) for any breakdown, and avoid showing item-level results for teams below that threshold. For very small teams, aggregate to department level or combine several small teams. Communicate these rules clearly in your invite and privacy notice. In DACH, coordinate the concept and reporting logic with the Betriebsrat and the data protection officer before launch, and document retention periods (e.g. 24 months).

How should managers share and discuss results with employees?

Managers should first digest their own results with HR or their manager, then share a simple summary with the team: 2–3 strengths, 2–3 improvement areas, and 1–2 agreed actions. Encourage open discussion but never “hunt” for who gave which rating. Short guides or phrases help less experienced managers. According to a Gallup analysis, regular high‑quality conversations strongly correlate with engagement, so build that habit.

How often should we update the question set?

Review questions annually. Keep core items for trend analysis, but refine wording if employees or managers find them unclear. Add 3–5 rotating “focus” items if you’re testing new initiatives (e.g. AI‑supported 1:1 prep, career frameworks). Align updates with other tools like performance reviews and 360‑degree feedback, for example using similar wording to your performance review survey questions, so people see one coherent system instead of many unrelated surveys.

Jürgen Ulbrich

CEO & Co-Founder of Sprad

Jürgen Ulbrich has more than a decade of experience in developing and leading high-performing teams and companies. As an expert in employee referral programs as well as feedback and performance processes, Jürgen has helped over 100 organizations optimize their talent acquisition and development strategies.
