Manager 1:1 meeting survey questions help you see what really happens in manager–employee conversations, beyond manager self-perception. With a focused survey on one-on-ones, you spot patterns early – cancelled meetings, missing follow-ups, no time for development, low psychological safety – and can fix them with concrete changes.
Survey questions
Statements below use a 5‑point scale from “Strongly disagree” (1) to “Strongly agree” (5). Tags show whether each item fits better for a short pulse after a review cycle [Pulse] or for a deeper periodic 1:1 quality survey [Annual / deep-dive].
2.1 Closed questions (Likert-scale)
- My manager and I have 1:1 meetings on a predictable, agreed cadence. [Pulse]
- Our 1:1s are rarely cancelled or moved at the last minute. [Pulse]
- If a 1:1 is postponed, my manager suggests a new slot within 7 days. [Annual / deep-dive]
- When my workload increases, my manager protects time for our 1:1s. [Annual / deep-dive]
- I can request extra 1:1 time when needed, and my manager responds quickly. [Pulse]
- Overall, I can rely on our 1:1s actually happening as planned. [Pulse]
- Our 1:1s have a clear agenda before the meeting. [Pulse]
- Both my topics and my manager’s topics are reflected in the 1:1 agenda. [Annual / deep-dive]
- We start 1:1s on time and use the time efficiently. [Pulse]
- We review notes or action items from previous 1:1s at the start of the meeting. [Annual / deep-dive]
- My manager comes prepared for our 1:1s (updates, data, feedback). [Pulse]
- I also prepare for our 1:1s and know what I want to discuss. [Annual / deep-dive]
- After our 1:1s, I am clear about my key priorities for the next weeks. [Pulse]
- Our 1:1s give me clarity on my role goals and expectations. [Annual / deep-dive]
- We regularly link our 1:1 conversations to team or company goals (e.g. OKRs). [Annual / deep-dive]
- We spend enough time on what matters most, not just urgent operational topics. [Pulse]
- When priorities change, we re-align them in our next 1:1. [Pulse]
- I know how my work in the next weeks will be evaluated. [Annual / deep-dive]
- There is space in our 1:1s for me to give feedback to my manager. [Annual / deep-dive]
- My manager regularly gives me helpful feedback on my performance. [Pulse]
- The feedback I receive in 1:1s is specific and linked to recent examples. [Annual / deep-dive]
- I feel recognised for my work and contributions in our 1:1s. [Pulse]
- Critical feedback in our 1:1s is respectful and focused on behaviour, not personality. [Annual / deep-dive]
- Our 1:1s help me avoid surprises in formal performance reviews. [Annual / deep-dive]
- We regularly discuss my skills, strengths, and development areas in 1:1s. [Annual / deep-dive]
- Our 1:1s cover medium- to long-term career perspectives, not only short-term tasks. [Annual / deep-dive]
- My manager supports me in finding development options (trainings, projects, mentoring). [Annual / deep-dive]
- I can openly express my career ambitions in 1:1s. [Pulse]
- We follow up on development actions agreed in previous 1:1s. [Annual / deep-dive]
- Our 1:1s make me feel that my growth matters to my manager. [Pulse]
- There is space in our 1:1s to talk about my workload and stress level. [Pulse]
- I feel comfortable sharing when I’m at risk of overload or burnout. [Annual / deep-dive]
- My manager responds constructively when I talk about stress, boundaries, or private constraints. [Annual / deep-dive]
- We sometimes step back from tasks and talk about my general wellbeing. [Pulse]
- If I say “this is too much”, my manager helps to re-prioritise. [Pulse]
- Our 1:1s help me set healthy boundaries between work and private life. [Annual / deep-dive]
- I feel safe to speak openly in our 1:1s, even about mistakes or bad news. [Pulse]
- I can raise concerns about team culture, conflicts, or inclusion in our 1:1s. [Annual / deep-dive]
- My manager listens without interrupting and takes my perspective seriously. [Pulse]
- When we disagree in a 1:1, the discussion stays respectful and constructive. [Annual / deep-dive]
- I trust my manager to keep sensitive topics from our 1:1s confidential. [Annual / deep-dive]
- Because of our 1:1s, I feel more confident bringing up issues early. [Pulse]
- Overall, our 1:1s are a good use of my time. [Pulse]
- Our 1:1s help me perform better in my role. [Annual / deep-dive]
- Because of our 1:1s, I feel more engaged with my work. [Annual / deep-dive]
- Our 1:1s improve collaboration between me and my manager outside the meeting. [Annual / deep-dive]
- If our 1:1s stopped, my work would clearly suffer. [Annual / deep-dive]
- Overall, I am satisfied with the quality of my 1:1s with my manager. [Pulse]
2.2 Overall / NPS-style questions (0–10 scale)
Use a 0–10 scale where 0 = “Not at all” and 10 = “Extremely”.
- How useful are your current 1:1s with your manager for your day-to-day work? (0–10)
- How safe do you feel to speak openly in your 1:1s, including about mistakes or concerns? (0–10)
- How likely are you to recommend the way your manager runs 1:1s to a colleague? (0–10)
2.3 Open-ended questions
- What makes your 1:1s with your manager most helpful for you?
- What would need to change for your 1:1s to be more useful?
- What is one thing your manager should start doing in 1:1s?
- What is one thing your manager should stop doing in 1:1s?
- What is one thing your manager does in 1:1s that should definitely continue?
Decision & action table
| Area (related questions) | Trigger / threshold | Recommended action | Owner | Timeline |
|---|---|---|---|---|
| Frequency & reliability (items on cadence, cancellations, rescheduling) | Average score <3.0 or ≥25% “Disagree/Strongly disagree” | Agree standard 1:1 cadence, block recurring slots, set cancellation rules, communicate to team. | Manager, aligned with HR | Within 14 days after results |
| Structure & preparation (agenda, start on time, follow-up) | Average 3.0–3.4 = needs improvement; <3.0 = critical | Introduce shared agenda template, 10-minute prep rule, and simple note-taking & follow-up checklist. | Manager, with HR providing template | Template ready in 7 days; visible change in next 1–2 1:1s |
| Goals & priorities (clarity, link to OKRs/targets) | Average <3.5 or open-text mentions of “unclear priorities” | Run goal-clarity session, update targets in performance system, embed goals as fixed agenda item. | Manager, supported by HR / People Partner | Within 30 days |
| Feedback & recognition (constructive input, bi-directional feedback) | Average <3.5 or ≥20% low scores on feedback items | Provide manager coaching on feedback skills; add “two-way feedback” slot to every 1:1 agenda. | HR L&D for training, manager for practice | Training within 45 days; agenda change immediately |
| Development & career (growth talks, follow-up on actions) | Average <3.5 or low 0–10 impact ratings | Schedule separate quarterly development 1:1, link to career framework and development plans. | Manager, with HR providing frameworks | First dev 1:1 within 60 days |
| Wellbeing & workload (stress, boundaries) | Average <3.0 or strong negative comments | Clarify that workload & wellbeing are valid 1:1 topics; review workload, re-prioritise, adjust goals. | Manager; HR for escalation if health risks | Initial review within 7 days for affected teams |
| Psychological safety & trust (speaking up, confidentiality) | Any score <2.5 or 0–10 safety <6.0 | Escalate to HR for coaching; consider external coach or training on psychological safety. | HR / People Partner, manager’s manager | Start intervention within 14 days |
| Overall satisfaction & impact (usefulness, NPS-style question) | Average satisfaction <3.5 or NPS <7.0 | Run mini-retro with team on 1:1 format; agree 2–3 concrete changes and re-measure in next pulse. | Manager, with HR template support | Retro within 21 days; follow-up survey within 90 days |
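To make the table operational, the triggers can be checked automatically once results are exported. The sketch below is a minimal illustration of that logic, assuming scores arrive as plain 1–5 lists per area and team; the function name, thresholds as defaults, and data format are assumptions for illustration, not a prescribed implementation.

```python
from statistics import mean

# Illustrative check for the first trigger in the table above: an area is flagged
# if the average is below a threshold or too many respondents chose
# "Disagree" (2) or "Strongly disagree" (1). Data format and names are assumptions.

def check_area(scores: list[int], avg_threshold: float = 3.0,
               low_share_threshold: float = 0.25) -> dict:
    avg = mean(scores)
    low_share = sum(1 for s in scores if s <= 2) / len(scores)
    return {
        "average": round(avg, 2),
        "low_share": round(low_share, 2),
        "triggered": avg < avg_threshold or low_share >= low_share_threshold,
    }

# Example: one team's answers to a 'Frequency & reliability' item
print(check_area([2, 3, 4, 2, 3, 2, 5, 3]))
# {'average': 3.0, 'low_share': 0.38, 'triggered': True}
```

The same pattern works for the other rows: swap in the area-specific average threshold (e.g. <3.5) or share of low scores (e.g. ≥20%) from the table.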
Key takeaways
- Use this survey to see how one-on-ones truly feel, not how they “should” be.
- Translate low-score areas directly into 2–3 concrete team-level changes.
- Combine quick post-cycle pulses with one deeper annual 1:1 experience survey.
- Discuss aggregated results with managers, not just in HR or leadership circles.
- Re-run the survey to check if new 1:1 habits stick and improve scores.
Definition & scope
This survey measures the quality, frequency, and impact of manager–employee 1:1 meetings. It is designed for all employees who have regular 1:1s with a direct manager, regardless of level or location. Results guide decisions on manager coaching, 1:1 guidelines, performance management, and wellbeing support, and complement broader manager effectiveness or engagement surveys.
Scoring & thresholds for manager 1:1 meeting survey questions
Most items use a 1–5 agreement scale. For analysis, calculate average scores per question and per area: Frequency & reliability, Structure & preparation, Goals & priorities, Feedback & recognition, Development & career, Wellbeing & workload, Psychological safety & trust, Overall satisfaction & impact.
Define clear bands:
- Low / critical: average <3.0 → immediate action needed.
- Medium / needs improvement: 3.0–3.9 → plan targeted adjustments.
- High / strong: ≥4.0 → maintain and learn from good practices.
For 0–10 ratings, set similar thresholds (e.g. <6.0 critical, 6.0–7.9 ok, ≥8.0 strong). Turn scores into decisions by mapping each band to actions: coaching for low psychological safety, agenda templates for weak structure, or fixed cadences for unreliable 1:1s. A talent platform like Sprad Growth or the Atlas AI assistant can help automate survey sends, reminders, and action tracking from one place.
- HR defines score bands and actions per area and documents them in a short playbook within 14 days.
- People Analytics or HR exports results by team and computes averages within 3–5 days after survey close.
- HR shares a 1–2 page summary with each manager, highlighting 2–3 strongest and weakest areas.
- Managers discuss results with their team within 21 days and agree on 2–3 concrete 1:1 changes.
- HR re-measures with a short pulse survey after 90 days to see if scores improved.
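As a minimal illustration of the banding logic above, the sketch below maps area averages to the three 1–5 bands and the suggested 0–10 thresholds. The band labels and cut-offs mirror the text; the function names and example averages are illustrative assumptions.

```python
# Minimal sketch of the score bands described above. The band labels and
# cut-offs mirror the text; function names and example averages are assumptions.

def band_1_to_5(avg: float) -> str:
    """Map a 1-5 area average to the bands defined above."""
    if avg < 3.0:
        return "low / critical"
    if avg < 4.0:
        return "medium / needs improvement"
    return "high / strong"

def band_0_to_10(avg: float) -> str:
    """Map a 0-10 average rating to the suggested thresholds."""
    if avg < 6.0:
        return "critical"
    if avg < 8.0:
        return "ok"
    return "strong"

area_averages = {
    "Structure & preparation": 3.4,
    "Psychological safety & trust": 2.8,
    "Goals & priorities": 4.2,
}
for area, avg in area_averages.items():
    print(f"{area}: {avg} -> {band_1_to_5(avg)}")

print("Usefulness (0-10):", band_0_to_10(7.1))  # -> ok
```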
Survey blueprints for different cadences
You rarely want to use all questions at once. Build smaller, focused surveys from the bank above.
| Blueprint | Length | When to use | Main areas covered |
|---|---|---|---|
| Quarterly 1:1 quality pulse | 10–12 questions | Every quarter, right after performance or goal check-ins | Frequency & reliability, Structure & preparation, Goals & priorities, Overall satisfaction |
| Annual 1:1 experience survey | 18–20 items | Once per year, together with engagement or manager surveys | All eight areas, plus 2–3 open questions |
| Targeted pulse after new 1:1 guidelines or tools | 8–10 questions | 6–8 weeks after rollout of new 1:1 standards or software | Structure & preparation, Feedback & recognition, Overall satisfaction |
| 1:1 quality + manager coaching needs | 15–18 questions | Before manager training cycles | Psychological safety, Development & career, Feedback skills, 0–10 safety/usefulness ratings |
For concrete 1:1 formats and question ideas, you can combine this survey with existing resources like your company’s 1:1 playbook or external guides on effective 1:1 meetings and modern performance management.
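If your question bank is stored digitally, the blueprints above can be assembled by filtering on area and cadence tag. The sketch below shows one possible approach; the data structure, field names, and the small subset of questions included are illustrative, not a prescribed format.

```python
# Illustrative question bank: each item carries an area and a cadence tag,
# mirroring the [Pulse] / [Annual / deep-dive] labels in section 2.1.
# Only a few items are shown; the data structure itself is an assumption.
QUESTION_BANK = [
    {"text": "My manager and I have 1:1 meetings on a predictable, agreed cadence.",
     "area": "Frequency & reliability", "tag": "Pulse"},
    {"text": "Our 1:1s have a clear agenda before the meeting.",
     "area": "Structure & preparation", "tag": "Pulse"},
    {"text": "After our 1:1s, I am clear about my key priorities for the next weeks.",
     "area": "Goals & priorities", "tag": "Pulse"},
    {"text": "Overall, I am satisfied with the quality of my 1:1s with my manager.",
     "area": "Overall satisfaction & impact", "tag": "Pulse"},
    # ... extend with the remaining items from the bank above
]

def build_blueprint(areas: list[str], tag: str, max_items: int) -> list[str]:
    """Select question texts for a blueprint by area and cadence tag."""
    matches = [q["text"] for q in QUESTION_BANK
               if q["area"] in areas and q["tag"] == tag]
    return matches[:max_items]

pulse = build_blueprint(
    areas=["Frequency & reliability", "Structure & preparation",
           "Goals & priorities", "Overall satisfaction & impact"],
    tag="Pulse",
    max_items=12,
)
print(f"{len(pulse)} questions selected for the quarterly pulse")
```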
Follow-up & responsibilities
Without clear ownership, survey insights die in slide decks. Define who reacts to which signals and by when. In DACH, you also need transparent rules for managers, HR, the works council (Betriebsrat), and employees.
- HR / People team: designs the survey, defines thresholds, ensures GDPR-compliant handling, and prepares team-level reports within 7 days after survey close.
- Direct managers: review their team’s results, run a 30–60 minute discussion on findings, and agree 2–3 1:1 improvements within 21 days.
- Manager’s manager / department lead: supports managers with very low scores (e.g. psychological safety <3.0), agrees on coaching or training, and monitors progress.
- Works council (where applicable): is informed early about purpose, anonymity rules, and data retention; reviews the concept before the first rollout.
- Employees: give honest feedback, participate in follow-up discussions, and help test new 1:1 routines over at least one quarter.
Set response times: very critical feedback (e.g. harassment, health risks) must trigger an HR response within 24 hours via existing whistleblowing or escalation channels. For “normal” low scores, managers should start action planning within 7 days, and teams should see first changes within 1–2 cycles of 1:1s.
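If follow-ups are tracked in a simple tool or spreadsheet export, these deadlines can be checked programmatically. The sketch below assumes two signal types with the response times just described; the labels, dates, and data format are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Sketch of the response-time rules above: critical signals within 24 hours,
# "normal" low scores within 7 days. Signal labels and data format are assumptions.

SLA = {"critical": timedelta(hours=24), "low_score": timedelta(days=7)}

def is_overdue(signal_type: str, raised_at: datetime, now: datetime | None = None) -> bool:
    """Return True if the follow-up deadline for this signal has passed."""
    now = now or datetime.now()
    return now - raised_at > SLA[signal_type]

raised = datetime(2024, 5, 2, 9, 0)
print(is_overdue("critical", raised, now=datetime(2024, 5, 3, 12, 0)))  # True: >24 hours
print(is_overdue("low_score", raised, now=datetime(2024, 5, 6, 9, 0)))  # False: within 7 days
```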
Fairness & bias checks
Even for 1:1s, fairness matters. Look at patterns across teams, locations, roles, and demographics (where legally and ethically appropriate). The goal is not to judge individuals, but to spot structural issues.
- HR analyses results by team, location, function, and remote vs. office, but only where there are ≥5 respondents per group, to protect anonymity.
- If one function shows strong gaps in “Development & career” compared to others, HR discusses role expectations and career frameworks with leadership.
- If women, part-time employees, or remote staff systematically rate “Psychological safety & trust” lower, HR and leadership review meeting norms, language, and inclusion training.
- If one manager’s team scores extremely high, invite them to share concrete 1:1 practices in a manager community session.
- Document decisions and avoid using 1:1 survey results in isolation for performance ratings or compensation outcomes.
Combine quantitative scores with anonymous comments. Use simple text analysis or AI tools to cluster themes (e.g. “cancellations”, “no development time”, “no agenda”). Always double-check that you don’t expose individuals when sharing comment excerpts with managers or teams.
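A very simple way to cluster comment themes without exposing individuals is keyword matching combined with the ≥5 respondents rule above. The sketch below is one possible starting point; the theme keywords, group name, and comments are made up for illustration, and a dedicated text-analysis or AI tool can replace it later.

```python
from collections import Counter

# Minimal keyword-based theme tagging combined with the >=5 respondents rule.
# Theme keywords, group names, and comments are made up for illustration.

THEMES = {
    "cancellations": ["cancel", "postpon", "reschedul"],
    "no development time": ["development", "career", "growth"],
    "no agenda": ["agenda", "unstructured", "no structure"],
}
MIN_GROUP_SIZE = 5  # do not report groups with fewer respondents

def theme_counts(comments: list[str]) -> Counter:
    """Count how many comments touch each theme (one comment can hit several)."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in THEMES.items():
            if any(keyword in text for keyword in keywords):
                counts[theme] += 1
    return counts

def report(group: str, comments: list[str]) -> None:
    if len(comments) < MIN_GROUP_SIZE:
        print(f"{group}: suppressed (fewer than {MIN_GROUP_SIZE} respondents)")
        return
    print(group, dict(theme_counts(comments)))

report("Engineering", [
    "1:1s always cancelled before releases",
    "No time to talk about my development",
    "We never have an agenda",
    "Meetings get rescheduled a lot",
    "Would like more career discussions",
])
```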
Examples / use cases
Use case 1: Unreliable 1:1s in a growing tech team
A software team grew from 8 to 15 people. In the first 1:1 survey, “Frequency & reliability” averaged 2.8. Comments mentioned “1:1s always cancelled before releases” and “no time to talk about my development”.
HR and the Engineering Manager agreed on changes: fixed bi-weekly 1:1 slots protected in calendars, a simple agenda with three fixed sections (check-in, priorities, development), and a rule that cancellations must be rebooked within 7 days. After 3 months, a short pulse showed the frequency score up to 3.9 and overall satisfaction up to 4.1.
Use case 2: Strong goals talk, weak psychological safety
In a sales department, “Goals & priorities” scored 4.3, but “Psychological safety & trust” was only 3.0. Comments mentioned “I don’t dare to share bad pipeline news” and “1:1s feel like interrogations”.
HR ran a manager workshop focusing on coaching-style questions, active listening, and reacting to bad news without blame. Managers also added a fixed agenda point: “What’s not going well / where do you need help?”. Six months later, psychological safety had increased to 3.7, and fewer deals slipped at the end of the quarter because risks surfaced earlier in 1:1s.
Use case 3: Revising 1:1 guidelines after rollout of a new tool
After introducing a new 1:1 note-taking feature in their talent platform, HR launched a targeted pulse survey based on the “Structure & preparation” and “Feedback & recognition” items. Scores showed that the tool itself was fine, but managers still arrived unprepared.
HR changed the rollout strategy: instead of only showing features, they trained managers on how to use pre-filled agendas, past notes, and performance highlights (with support from an AI assistant) to prepare in 10 minutes. A follow-up pulse 8 weeks later showed significant improvements in preparation and perceived usefulness.
Implementation & updates
Introduce manager 1:1 meeting survey questions in a simple, staged way. Start with one or two pilot areas, refine wording and thresholds, then scale. In DACH, align early with data protection and the works council, especially on anonymisation, legal basis, and retention periods.
- Pilot: Choose 1–3 teams with ≥10 employees each, run the quarterly pulse blueprint, and debrief managers and employees within 30 days.
- Rollout: Extend to more departments or company-wide in the next cycle, using the refined annual 1:1 experience survey (18–20 items).
- Training: Offer short sessions for managers on 1:1 basics, psychological safety, and how to interpret survey results; integrate into existing leadership programs.
- Governance: Define data minimisation rules (e.g. no health data in comments), retention windows (e.g. delete raw data after 24 months), and access rights.
- Review: Once per year, HR reviews questions, thresholds, and blueprints, based on feedback and links to other surveys such as your employee engagement & retention program.
Track a few core KPIs: participation rate (aim for ≥70% on the annual survey and ≥50% on pulses), average scores per area, change in scores after actions, the percentage of teams that agreed on at least two improvements, and the follow-through rate on those actions. Over time, link 1:1 survey data with outcomes like performance scores, internal moves, or attrition to understand impact.
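These KPIs boil down to a few simple calculations once survey results and action logs are exported. The sketch below shows participation rate and score change per area; all inputs, names, and the example numbers are illustrative.

```python
# Sketch of two of the core KPIs above: participation rate and change in
# average score per area between two survey rounds. All inputs are illustrative.

def participation_rate(responses: int, invited: int) -> float:
    return responses / invited

def score_change(before: dict[str, float], after: dict[str, float]) -> dict[str, float]:
    """Difference in average score per area, new round minus previous round."""
    return {area: round(after[area] - before[area], 2)
            for area in before if area in after}

print(round(participation_rate(62, 80), 2))  # 0.78 -> above the 70% annual target
print(score_change({"Frequency & reliability": 2.8},
                   {"Frequency & reliability": 3.9}))
# {'Frequency & reliability': 1.1}
```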
Conclusion
1:1 meetings are where goals, feedback, wellbeing, and development come together – or fall apart. A focused 1:1 survey gives you an early-warning system: you see if 1:1s are cancelled, shallow, or unsafe long before problems show up in engagement or performance metrics. At the same time, you highlight great practices and spread them across managers.
Used with clear thresholds and responsibilities, this survey improves the quality of everyday conversations, not just the quality of dashboards. You get three benefits: earlier detection of issues (e.g. overload, unclear priorities), better conversation quality (more feedback, more listening), and sharper priorities for manager development (e.g. coaching skills, basic structure, psychological safety).
Next steps can be simple: pick one pilot area, configure the quarterly pulse blueprint in your survey or talent platform, and align with the works council and data protection on the concept and anonymisation. Then train managers to discuss results openly with their teams and agree on 2–3 practical changes, like fixed 1:1 slots, shared agendas, and follow-up notes. Once you see the first score improvements, embed the 1:1 survey into your regular performance management rhythm and review the question set once a year.
FAQ
How often should we run this 1:1 quality survey?
Most organisations combine one annual deep-dive with short pulses. Use the annual 18–20 item survey once per year to get a full picture across all eight areas. Add a 10–12 question pulse after each performance or goal review cycle, usually quarterly or half-yearly. If scores are stable and high, you can reduce pulse frequency; if you are in change or repair mode, keep it quarterly.
What should we do if scores are very low for a specific manager?
First, protect anonymity: only share data where ≥5 responses exist. Then look at patterns, not one number: which areas are weakest, and what do comments say? HR should meet the manager and their manager to interpret results and agree on concrete support, such as coaching, peer shadowing, or training. Re-run a pulse survey after 3–6 months to check progress before deciding on further steps.
How do we handle critical or very sensitive comments?
Explain up front that comments should not contain health data or names. Use your existing whistleblowing or complaint channels for acute issues like harassment or discrimination, not the 1:1 survey. If a comment still indicates risk (e.g. burnout, bullying), HR should act within 24 hours via established escalation processes. When sharing comments with managers, remove identifiers and only show themes or anonymised quotes to avoid exposing individuals.
How do we involve managers and employees so this doesn’t feel like “more surveys”?
Frame the survey as a tool for them, not about them. Before launch, explain to managers that results will guide support (training, tools, templates) rather than punishment. Tell employees how results will be used and commit to visible changes within a clear timeframe. After each survey, managers should discuss team-level results and co-create 2–3 improvements. This visible follow-up builds trust and keeps participation high.
How should we update the question bank over time?
Keep the core eight areas stable so you can compare trends year over year. Once a year, review open-text themes and remove 2–3 questions that add little insight or confuse people. Add new questions if your 1:1 guidelines or performance process change. Pilot any new items in one business unit before rolling out broadly. Document all versions and dates so you know which questions were active in each cycle and can interpret trends correctly.



