Nearly 60% of employees say their company’s performance review process feels unfair or unclear, according to recent Gallup data. That is not just a perception problem. It directly affects whether people trust their leaders, commit to their goals, and stay long term.
Well-designed performance review survey questions help you see your review cycle through employees’ eyes. They measure clarity, perceived fairness, psychological safety, and the usefulness of the conversation. They also show whether people want to take part again next cycle or would opt out if they could.
These surveys sit next to, not instead of, performance ratings. You still need ratings, calibration, and decisions on pay or promotion. But you also need a separate feedback loop about the experience of the process itself. Crucially, these questions focus on the process, not on rating individual managers publicly.
In this article you get:
- Targeted performance review survey questions across 7 themes
- Ready-made blueprints for a short pulse, a deep annual survey, and a manager pulse
- Guidance for running review experience surveys in DACH, including GDPR and Betriebsrat considerations
- Practical tips on timing, anonymity, and how to turn feedback into better training, rubrics, and calibration
So how can you design performance review survey questions that employees trust and leaders actually use? Let’s walk through the key sections and then build them into concrete survey designs.
1. Clarity & Expectations: Are Your Review Goals Crystal Clear?
A transparent review process starts with clear goals and criteria. When employees understand expectations, their trust in the performance review rises sharply.
Research shows that organizations with strong goal-setting and clear expectations see up to 30% higher satisfaction with performance reviews (Harvard Business Review). Yet many people still enter their Mitarbeitergespräch (annual performance conversation) unsure how they will be evaluated.
In one mid-size German tech company (around 250 employees), leadership rewrote performance criteria in plain language and shared sample evaluations before review season. They also ran a short explainer session on the rating scale. In the following pulse survey, perceived fairness increased by 25 percentage points.
Typical issues you want to measure:
- Did employees understand their goals?
- Did they know what each rating meant?
- Did they see a clear link between expectations, feedback, and final evaluation?
Example performance review survey questions – Clarity & expectations (5-point Likert: strongly disagree to strongly agree):
- “Before my performance review, I clearly understood the goals for my role.”
- “I knew what ‘good performance’ looked like for my position.”
- “The performance criteria used for my review were clear to me.”
- “The rating scale (e.g. 1–5) and what each level means were easy to understand.”
- “I understood how my goals would influence my final performance rating.”
- “Information about the review process was communicated early enough.”
- “I understood how my performance review connects to salary or bonus decisions.”
Open-ended items:
- “What was unclear about the goals or criteria used in your performance review?”
- “If you could change one thing about how expectations are communicated, what would it be?”
Example outcome from a simple clarity initiative:
| Clarity factor | Before change (% agree) | After change (% agree) |
|---|---|---|
| Understood criteria | 51 | 78 |
| Knew rating scale | 47 | 73 |
| Trusted process | 54 | 68 |
These clarity-focused performance review survey questions work well in both a short post-cycle pulse and a deeper annual survey. Next you want to know whether people had enough time and evidence to prepare.
2. Preparation & Evidence: Did Everyone Have What They Needed?
Even the best-designed review process fails if nobody has time to prepare. Employees need space to reflect. Managers need data and examples, not vague impressions.
SHRM reports that around 70% of employees feel unprepared for reviews because they lack ongoing feedback or documentation. In one Swiss manufacturing company with 400 employees, HR started sending a simple Selbsteinschätzung (self-assessment) template two weeks before each Mitarbeitergespräch. After two cycles, more than half of employees moved from “poorly prepared” to “well prepared” in follow-up surveys.
Key aspects to measure:
- Notice period before the review
- Access to performance data (goals, feedback, KPIs)
- Time and support for self-evaluation
- Quality of evidence used in the conversation
Example performance review survey questions – Preparation & evidence (Likert 1–5):
- “I had enough time to prepare for my performance review.”
- “I received a clear invitation and agenda for my Mitarbeitergespräch.”
- “I had access to relevant data (e.g. goals, KPIs, feedback) before the review.”
- “My manager seemed well prepared for our performance review conversation.”
- “We discussed specific examples of my work, not only general impressions.”
- “My self-evaluation (‘Selbsteinschätzung’) was considered in the review.”
- “I understood how feedback from colleagues or stakeholders was used in my review.”
0–10 rating item:
- “On a scale from 0–10, how well prepared did you feel for your performance review?”
Open-ended prompt:
- “What would have helped you feel better prepared for your review?”
Many organizations run a quick pulse with 3–4 of these items immediately after the review, then combine results with more detailed annual data.
| Prep element | Employee prepared (%) | Manager prepared (%) | Improvement after templates (% points) |
|---|---|---|---|
| Self-evaluation completed | 45 | n/a | +28 |
| Evidence brought to the meeting | 33 | 41 | +22 |
| Peer feedback used | 15 | n/a | +9 |
Once you know whether people came prepared, you can focus on the quality of the actual conversation.
3. Conversation Quality: Was the Review Constructive and Two-Way?
A performance review should feel like a real dialogue, not a verdict. Employees want specific feedback, time to ask questions, and space to disagree respectfully.
Data from platforms like Glint and CultureAmp show that reviews rated as “two-way” are usually considered twice as useful as one-way feedback sessions. Yet many employees still describe their last Mitarbeitergespräch as a monologue.
In a Berlin fintech, HR introduced a simple Gesprächsleitfaden (conversation guide) with prompts such as “What are you most proud of this year?” and “What would you like to do differently next quarter?”. Within one review cycle, the share of employees who said they “felt heard” rose by more than 15 percentage points.
Topics to measure:
- Space to speak and ask questions
- Specific, behavior-based feedback
- Balance between strengths and growth areas
- Emotional tone and respect
Example performance review survey questions – Conversation quality (Likert 1–5):
- “My manager listened carefully to my perspective during the review.”
- “I felt I could speak openly in my Mitarbeitergespräch.”
- “The feedback I received was specific and based on concrete examples.”
- “My strengths were clearly discussed.”
- “My development areas were discussed in a constructive way.”
- “There was a good balance between looking back and looking forward.”
- “I left the conversation feeling respected as a person.”
0–10 rating item:
- “How useful was this performance review conversation for your work and development?” (0–10)
Open-ended prompt:
- “What made this performance review conversation particularly helpful or frustrating for you?”
| Conversation aspect | Positive responses (%) before guides | After guides (%) |
|---|---|---|
| Felt heard | 48 | 66 |
| Received specific feedback | 34 | 59 |
| Felt able to disagree | 29 | 52 |
Even great conversations can be undermined if ratings or calibration feel biased. That is why fairness and bias deserve their own section in your performance review survey questions.
4. Fairness & Bias: Was the Process Fair for Everyone?
Perceived fairness is central. If employees believe ratings are inconsistent or biased, trust erodes, and future participation drops. Fairness also shapes psychological safety: do people feel safe challenging outcomes?
Gartner found that organizations with strong fairness and transparency in calibration enjoy engagement scores around 19% higher than peers. Yet in many DACH companies, only a minority of employees say they trust the Kalibrierungsrunde (calibration round) to be objective.
One logistics company introduced standardized calibration rubrics and required each rating change in calibration to be justified in writing. They also added a confidential comment field in the post-review survey about perceived fairness. Within 1 year, internal reports of perceived favoritism fell by around 50%.
Key dimensions to measure:
- Fairness of ratings and criteria
- Transparency of the calibration process
- Psychological safety to disagree
- Consistency across teams
Example performance review survey questions – Fairness & bias (Likert 1–5):
- “I feel my performance rating was fair given my contribution.”
- “I understand how my rating was decided.”
- “People in similar roles are evaluated using the same criteria.”
- “I trust the calibration process (‘Kalibrierungsprozess’) to be objective.”
- “If I disagreed with my rating, I could say so without negative consequences.”
- “Bias (e.g. gender, age, nationality) does not influence performance ratings in my team.”
0–10 rating item:
- “On a scale from 0–10, how fair do you feel this year’s performance review process was for you personally?”
Open-ended prompts:
- “If you disagreed with your rating or outcome, how was this handled?”
- “What would increase your sense of fairness in our performance review process?”
| Fairness indicator | Pre-rubric complaints (%) | Post-rubric complaints (%) |
|---|---|---|
| Perceived favoritism | 21 | 10 |
| “Disagreement handled well” | 35 | 61 |
| Felt safe to challenge rating | 18 | 44 |
The next question is whether the process actually supports development or just delivers scores.
5. Development & Next Steps: Is Growth Actually Supported?
The real impact of a review lies in what happens next. Employees want clarity on strengths, growth areas, and concrete next steps, not only a rating.
According to LinkedIn Learning, employees who see clear development steps after a review are about twice as likely to stay at least one more year. Yet in many DACH organizations, fewer than half of employees say their last review included specific learning actions.
An Austrian engineering company introduced a simple Entwicklungsplan (development plan) template that every manager had to complete with each team member during their Mitarbeitergespräch. Within 6 months, internal mobility applications increased by nearly 25%, and employees reported much higher clarity about their career paths.
Key angles to measure:
- Clarity of strengths and development areas
- Concrete actions and timelines
- Support for learning and growth
- Follow-up and check-ins
Example performance review survey questions – Development & next steps (Likert 1–5):
- “After my performance review, I am clear about my key strengths.”
- “I understand which skills or behaviors I should develop next.”
- “We agreed on concrete next steps (e.g. projects, trainings, mentoring).”
- “We discussed realistic timelines for my development goals.”
- “My manager and I agreed how we will follow up on my development plan.”
- “I know where to find learning resources that support my goals.”
0–10 rating item:
- “How confident are you that your development plan (‘Entwicklungsplan’) will be followed through in the next 6–12 months?” (0–10)
Open-ended prompt:
- “What would make your development plan more actionable and useful?”
| Development action | Included last cycle (%) | Next cycle target (%) |
|---|---|---|
| Written summary of strengths | 43 | 80 |
| Concrete growth steps | 39 | 75 |
| Follow-up meeting scheduled | 28 | 70 |
Of course, none of this works without solid support from managers and HR throughout the review cycle.
6. Manager & HR Support: Did You Feel Backed Up Throughout?
Support before, during, and after the Mitarbeitergespräch shapes how safe employees feel and how well they can use feedback. Support also matters for managers, who juggle heavy administrative loads alongside emotionally demanding conversations.
McKinsey has found that ongoing, high-quality manager support can increase positive sentiment after performance reviews by roughly 35%. Still, many employees in the DACH region say they are unsure whom to contact if they have concerns about their review.
A Munich SaaS company tested an “HR Office Hours” model plus a dedicated chat channel during review season. Employees could ask questions about the process, calibration, or next steps. Satisfaction with HR support jumped from “rather low” to “high” in internal surveys.
Dimensions to measure for employees:
- Availability and responsiveness of the manager
- Clarity on escalation paths (“Beschwerdeweg”)
- Access to HR for questions or concerns
- Perceived empathy and support
Example performance review survey questions – Manager & HR support (Likert 1–5):
- “My manager supported me well in preparing for the review.”
- “During the conversation, my manager handled difficult topics with empathy.”
- “I know whom to contact in HR if I have questions about my review.”
- “HR was available to answer questions during the review cycle.”
- “I know how to raise concerns if I feel my review was unfair.”
- “Overall, I felt supported by the company throughout the performance review process.”
Pulse-style yes/no or multiple choice:
- “Did you know whom to contact if something felt off during your review?” (Yes/No)
Open-ended prompt:
- “What could your manager or HR have done differently to make you feel more supported?”
| Support channel | Awareness rate (%) | Used last cycle (%) |
|---|---|---|
| HR office hours | 24 | 9 |
| Helpdesk chat | 38 | 14 |
| Formal escalation route | 31 | 6 |
Finally, you want a global view: how do employees rate the overall experience, and will they willingly participate in the process again?
7. Overall Experience & Future Intent: Would You Do It Again?
The most powerful single signal from performance review survey questions is whether people would recommend the process to others or choose to participate again. If they would opt out, you have a serious trust problem.
Many organizations use an NPS-style item to track this. Companies that monitor review NPS and act on feedback tend to see 20–35% better participation and satisfaction over time.
A Dutch retail chain simplified its review form, shortened the meeting length, and introduced an open-ended “What should we change first?” question after each cycle. Within two cycles, the share of employees who would “recommend the performance review process” increased by more than one third.
Suggested overall experience items (Likert 1–5):
- “Overall, I am satisfied with this year’s performance review process.”
- “The performance review was worth the time I invested.”
- “I understand how to use the outcomes of this review in my daily work.”
- “I would feel comfortable going through this review process again next year.”
0–10 NPS-style items:
- “How likely are you to recommend our performance review process to a colleague?” (0–10)
- “If you had the option, how likely would you be to participate in this review process again next cycle?” (0–10)
Open-ended items:
- “What should we change first about our performance review process?”
- “What is one thing we should definitely keep?”
| Experience item | Pre-change score | Post-change score |
|---|---|---|
| Overall satisfaction (% favourable) | 48 | 63 |
| Recommend process (% 9–10) | 22 | 33 |
| Plan to participate again (% 8–10) | 55 | 72 |
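The 0–10 items above can be summarized in two common ways: as a top-box share (as in the table, e.g. the percentage scoring 9–10) or as a classic NPS-style score (promoters minus detractors). A minimal Python sketch of both calculations; the sample scores are purely illustrative:

```python
def top_box_share(scores, low, high=10):
    """Share of respondents scoring in [low, high], as a percentage."""
    if not scores:
        return 0.0
    hits = sum(1 for s in scores if low <= s <= high)
    return round(100 * hits / len(scores), 1)

def review_nps(scores):
    """Classic NPS: % promoters (9-10) minus % detractors (0-6)."""
    return round(top_box_share(scores, 9) - top_box_share(scores, 0, 6), 1)

# Illustrative responses to "How likely are you to recommend our process?"
recommend = [9, 10, 7, 8, 6, 10, 9, 3, 8, 9]
print(top_box_share(recommend, 9))  # 50.0  (% scoring 9-10, as in the table)
print(review_nps(recommend))        # 30.0  (50% promoters - 20% detractors)
```

Whichever summary you pick, use it consistently across cycles so trend lines stay comparable.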
With the main themes covered, you can now combine these performance review survey questions into practical survey blueprints.
8. Ready-Made Survey Blueprints
To make this concrete, here are 3 blueprints you can adapt: a short post-review pulse, an annual deep-dive, and a manager-focused pulse. Mix and match questions from the sections above to fit your context.
8.1 Short Post-Review Pulse (10–12 items)
Purpose: Quick, anonymous check right after the Mitarbeitergespräch to capture fresh impressions. Ideal for every review cycle.
Suggested items:
- “Before my performance review, I clearly understood the goals for my role.”
- “The performance criteria used for my review were clear to me.”
- “I had enough time to prepare for my performance review.”
- “My manager seemed well prepared for our performance review conversation.”
- “The feedback I received was specific and based on concrete examples.”
- “I felt I could speak openly in my Mitarbeitergespräch.”
- “I feel my performance rating was fair given my contribution.”
- “I understand how my rating was decided.”
- “After my performance review, I am clear about my next steps.”
- “Overall, I am satisfied with this year’s performance review process.”
- 0–10: “How likely are you to recommend our performance review process to a colleague?”
- Open text: “What should we improve first about our performance review process?”
8.2 Annual In-Depth Review Experience Survey (18–20 items)
Purpose: Once per year, after the main review and Kalibrierungsrunde, take a deeper look across all 7 themes: clarity, preparation, conversation, fairness, development, support, and overall experience.
Sample structure (all Likert 1–5 unless noted):
- Clarity & expectations:
- “I knew what ‘good performance’ looked like for my position.”
- “The rating scale and criteria were easy to understand.”
- Preparation & evidence:
- “I had access to relevant data before my review.”
- “We discussed specific examples of my work.”
- Conversation quality:
- “My manager listened carefully to my perspective.”
- “The conversation felt like a two-way dialogue.”
- Fairness & bias:
- “I feel my performance rating was fair.”
- “I trust our calibration process to be objective.”
- 0–10: “How fair was this year’s performance review process for you personally?”
- Development & next steps:
- “I am clear about my key strengths.”
- “We agreed on concrete development steps for the next 6–12 months.”
- Manager & HR support:
- “I know whom to contact in HR if I have questions about my review.”
- “Overall, I felt supported by my manager throughout the review process.”
- Overall experience & intent:
- “The performance review was worth the time I invested.”
- 0–10: “How likely are you to recommend our review process to a colleague?”
- 0–10: “How likely are you to participate again next cycle if it remains similar?”
- Open text:
- “What made this year’s performance review cycle particularly helpful?”
- “What made it frustrating?”
- “If you could change one element about our performance review process, what would it be?”
8.3 Manager Pulse on the Review Process (8–10 items)
Purpose: Understand how the process works for managers. This survey focuses on workload, tools, clarity of guidelines, and support from HR.
Example manager pulse items (Likert 1–5):
- “I received clear guidelines on how to conduct performance reviews this cycle.”
- “The review forms and tools were easy to use.”
- “I had enough time to prepare fair and thoughtful reviews for my team.”
- “I had access to the data I needed (e.g. goals, feedback, KPIs) to make informed ratings.”
- “The calibration meetings (‘Kalibrierungsrunde’) helped improve consistency and fairness across ratings.”
- “I felt supported by HR during the review and calibration process.”
- “The administrative effort required for reviews was reasonable.”
- “I feel confident in handling difficult performance conversations with my team members.”
- Open text: “What should we change in our performance review process to better support you as a manager?”
- Open text: “Which resources or trainings would help you run better performance reviews?”
To decide which blueprint to use, it helps to compare them side by side.
| Blueprint type | # items | Themes covered | Best timing | Key use case |
|---|---|---|---|---|
| Short post-cycle pulse | 10–12 | Clarity, prep, conversation, fairness, overall | Immediately after Mitarbeitergespräch | Quick temperature check each cycle |
| Annual deep dive | 18–20 | All 7 themes including development & support | After calibration round | Strategic process redesign and training needs |
| Manager pulse | 8–10 | Preparation, tools, admin load, HR support | After main review cycle | Improving manager workflows and guidance |
9. Implementation Guidance for DACH: Timing, Anonymity & Routing
Good performance review survey questions are only one part of the picture. Running them well in a DACH context also requires attention to timing, anonymity, GDPR, and works council expectations.
9.1 Timing the surveys
Typical rhythm:
- Short pulse: Send 1–3 days after each Mitarbeitergespräch. Keep it short (5–10 minutes).
- Annual deep-dive: Send once per year, when all reviews and the main Kalibrierungsrunde are complete, so employees can reflect on the full process.
- Manager pulse: Send to managers right after the cycle ends, when admin workload and calibration discussions are still fresh.
Avoid survey fatigue by clearly communicating purpose and planned follow-up actions. Employees in Germany, Austria, and Switzerland usually appreciate transparency on how results feed into concrete changes.
9.2 Anonymity thresholds
To build trust and stay on solid ground with data protection and the Betriebsrat, you should:
- Analyze results only for groups with at least 7 respondents (sometimes 5, but 7 is safer).
- Avoid small cross-cuts that make individuals identifiable (e.g. team + gender + seniority in a team of 6).
- Aggregate some results at department or location level instead of by single team when numbers are small.
Always communicate these rules in your survey introduction. That reassures employees that their answers cannot be traced back to them personally.
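The threshold rule is easiest to keep when it is enforced mechanically in the reporting step rather than by hand. A minimal Python sketch, assuming responses arrive as (group, score) pairs; the group names and scores are illustrative:

```python
MIN_GROUP_SIZE = 7  # threshold discussed above; some companies accept 5

def safe_group_averages(responses, min_n=MIN_GROUP_SIZE):
    """Average Likert scores per group, suppressing groups below min_n.

    Suppressed groups are reported as None, so the report shows the
    group exists without exposing a result that could identify people.
    """
    groups = {}
    for group, score in responses:
        groups.setdefault(group, []).append(score)
    return {
        group: round(sum(scores) / len(scores), 2) if len(scores) >= min_n else None
        for group, scores in groups.items()
    }

# Illustrative data: "Sales" clears the threshold, "Legal" does not.
data = [("Sales", 4)] * 6 + [("Sales", 5)] * 2 + [("Legal", 5)] * 3
print(safe_group_averages(data))  # {'Sales': 4.25, 'Legal': None}
```

Undersized groups can then be rolled up to department or location level, as suggested above, and the same check re-run on the aggregated data.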
9.3 Routing urgent issues and complaints
Most performance review survey questions are about patterns, not individual issues. Still, sometimes a comment points to an urgent problem (e.g. bullying, discrimination).
Good practice in DACH:
- Offer separate, named channels for complaints, the “Beschwerdeweg” (e.g. HR mailbox, ombudsperson, or a Vertrauensperson as a designated trusted contact), and explain them in the survey text.
- Make clear that the survey itself is anonymous and not monitored in real time for emergencies.
- If you ask about serious topics (e.g. discrimination), add a note on other channels employees can use if they want direct support.
This balance keeps your survey focused on process improvements while still pointing people to help when needed.
10. DACH/GDPR Notes: Legal Basis, Data Minimisation & Betriebsrat
Performance review experience surveys touch on personal perceptions but usually not on sensitive data. Still, in the DACH region you should align with GDPR (DSGVO) and often involve the Betriebsrat.
10.1 GDPR basics for review experience surveys
Key principles:
- Legal basis: Many companies use “legitimate interest” under Art. 6(1)(f) DSGVO to run employee surveys aimed at improving HR processes.
- Datensparsamkeit (data minimisation): Ask only what you need to improve the review process. Avoid health data, union membership, religious views, or other special categories.
- Purpose limitation: Clearly state that the data will be used to improve the performance review process and related training or guidelines.
- Retention: Define how long you keep raw responses (for example, 12–24 months) and who has access. Store only aggregated reports longer term.
Also include a short privacy notice in the survey itself, stating the data controller’s contact details, the legal basis, the storage period, and data subjects’ rights.
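The retention rule is easier to honour when it is enforced in code rather than by calendar reminders. A minimal sketch, assuming raw responses are stored as dicts with a submission timestamp; the 24-month window and the `submitted_at` field name are assumptions, not prescriptions:

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 730  # roughly 24 months; align with your privacy notice

def purge_expired(raw_responses, now=None):
    """Return only raw responses still inside the retention window.

    Aggregated reports are stored separately and are not affected.
    `submitted_at` is an assumed datetime field on each response dict.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in raw_responses if r["submitted_at"] >= cutoff]
```

Running such a purge on a schedule also gives you a documented, auditable answer when the Betriebsrat or a data protection officer asks how retention is implemented.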
10.2 Role of the Betriebsrat / works council
In many German and Austrian companies, surveys that relate to behavior, performance, or “Mitarbeiterbeurteilung” (employee appraisal) fall under co-determination. That means the Betriebsrat expects:
- Early involvement in survey design (themes, anonymity rules, how results are used).
- Clear statement that results are not used to evaluate individual employees or identify “low performers”.
- Agreement on how and at what level results are shared with leaders.
In practice, most works councils support performance review experience surveys if you can show they are used to improve fairness, psychological safety, and manager training, not to tighten control.
Conclusion: Performance Review Surveys That Build Trust—and Drive Improvement
Performance review survey questions give you a direct line into how employees really experience your review cycle. They uncover whether people understand expectations, see ratings as fair, and leave conversations with clear next steps.
Three core insights stand out:
- The right questions move beyond ratings and expose concrete design issues in your process.
- Trust depends on transparency and psychological safety across the whole cycle, from goal-setting to calibration.
- Using survey results to adjust rubrics, training, and communication creates visible improvements without blaming individuals.
Practical next steps:
- Pick a short post-review pulse blueprint and test it after your next Mitarbeitergespräch round.
- Plan one annual deep-dive to guide bigger improvements in calibration, criteria, and manager training.
- Include a manager pulse to understand their workload and support needs.
- Involve your Betriebsrat early, agree on anonymity rules, and be transparent about how you handle data under DSGVO.
As organizations in the DACH region become more agile and distributed, performance reviews will only work if they are trusted. Systematically measuring review experience, then acting on the results, turns a compliance ritual into a real driver of development and engagement.
Frequently Asked Questions (FAQ)
What are the most important performance review survey questions every HR leader should ask?
Focus on a core set that covers clarity, fairness, development, conversation quality, and overall satisfaction. For example: “Did you understand expectations?”, “Was your rating fair?”, “Were next steps clear?”, “Did you feel heard?”, “How likely are you to recommend this process?”. Combine 5-point Likert items with at least one open-ended question per theme.
How do I ensure our post-performance review surveys stay GDPR compliant in Germany?
Limit questions to what you need to improve the process (“Datensparsamkeit”), avoid sensitive categories like health data, and keep responses anonymous by using minimum group sizes. Inform employees about purpose, storage time, and legal basis, often Art. 6(1)(f) DSGVO (legitimate interest). Involve your Betriebsrat early and document any agreement on survey use.
Why should we separate engagement surveys from performance review experience surveys?
Engagement surveys look at broad topics such as workload, leadership, and culture. Performance review experience surveys zoom in on a specific process: goals, ratings, calibration, and the Mitarbeitergespräch. Keeping them separate allows you to see which issues come from the review process itself and fix them faster, without mixing them into general engagement signals.
How often should we run a performance review experience survey?
A good pattern is a short pulse after each main review cycle plus one annual deep-dive. The pulse focuses on immediate impressions of the Mitarbeitergespräch and ratings. The annual survey takes a broader view across all 7 themes, including development and HR support. You can add a manager-focused pulse when you change tools, rubrics, or timelines.
Can we use these survey results to evaluate individual managers’ effectiveness?
Ideally, no. These surveys are designed to improve the system: process design, rubrics, training, and communication. Using anonymous answers to judge single managers breaks trust and raises legal and co-determination concerns. If you plan to give manager-level feedback, co-create the rules with your Betriebsrat and communicate clearly what will and will not be done with the data.