For Managers

Running 360 feedback cycles

Set up 360 feedback cycles for your direct reports: assign peer reviewers, track submissions, review aggregated feedback, and pull the results into summaries.

A 360 feedback cycle collects feedback about one of your direct reports from a curated set of peers, cross-functional partners, and (when applicable) their own direct reports. The "360" comes from gathering perspectives from above, alongside, and below the employee — a fuller picture than a manager can provide alone.

Performance Blocks treats 360 feedback as a structured cycle: you assign reviewers, set a deadline, send response forms, track submissions, and pull the aggregated feedback into the employee's summary.

360 feedback cycles require the feedbackAssignment feature. If the feature is not enabled in your org, the 360 tab on the employee profile will not appear. Ask your org admin to enable it.
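
If you build against the platform API, you can check the flag before surfacing any 360 tooling. The sketch below is a hedged example: only the feedbackAssignment flag name comes from this page, while the endpoint, host, and response shape are assumptions about a hypothetical REST API.

    // Hypothetical check that the feedbackAssignment feature is enabled.
    // The endpoint and response shape are assumed, not confirmed API.
    async function is360Enabled(orgId: string, token: string): Promise<boolean> {
      const res = await fetch(`https://api.example.com/orgs/${orgId}/features`, {
        headers: { Authorization: `Bearer ${token}` },
      });
      if (!res.ok) throw new Error(`Feature lookup failed: ${res.status}`);
      const features: Record<string, boolean> = await res.json();
      return features["feedbackAssignment"] === true;
    }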

When to run a 360

The most common patterns:

  • Aligned to summary cycles — run a 360 in the weeks before each major summary cycle so the feedback can be pulled into the summary.
  • Promotion cases — run a focused 360 when an employee is being considered for promotion, especially across functions.
  • New role transitions — run a 360 partway through a new role to validate fit and surface coaching needs.
  • On request — when an employee asks for one as part of their development.

Avoid running 360s constantly. The signal degrades and reviewers fatigue. Once or twice a year per employee is the right cadence for most organizations.

Setting up a cycle

To start a 360 cycle for a direct report:

  1. Open the employee profile.
  2. Go to the 360 feedback tab.
  3. Click Start a cycle.
  4. Fill in the cycle setup form (covered below).
  5. Review the reviewer list and the questions.
  6. Click Launch cycle.

When you launch, every assigned reviewer receives an email with a link to the response form and a deadline.

Cycle setup fields

Field            Description
Cycle name       A short label, e.g., "H1 review" or "Promotion to L5". Visible to reviewers.
Period           The time window the feedback should reflect on (typically the last 6–12 months).
Deadline         When responses must be submitted. Reviewers receive reminders as the date approaches.
Anonymity        See the anonymity section below.
Reviewers        The list of people you are inviting to give feedback.
Questions        The response form: either the org's default 360 form or a custom set.
Self-reflection  Whether to include a self-reflection from the employee themselves.
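
If it helps to reason about these fields programmatically, one plausible TypeScript shape for a setup payload is sketched below. Every name and type is inferred from the table above, not taken from the platform's actual API.

    // Illustrative payload shape for a cycle setup, inferred from the
    // fields above. All names and types are assumptions.
    type Relationship = "peer" | "partner" | "report" | "stakeholder";

    interface ReviewerAssignment {
      personId: string;
      relationship: Relationship;
    }

    interface CycleSetup {
      cycleName: string;             // e.g., "H1 review"; visible to reviewers
      periodStart: string;           // ISO date; start of the feedback window
      periodEnd: string;             // ISO date; end of the feedback window
      deadline: string;              // ISO date responses are due
      anonymity: "attributed" | "role-attributed" | "anonymous";
      reviewers: ReviewerAssignment[];
      questions: "default" | string; // org default form, or a custom form id
      includeSelfReflection: boolean;
    }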

Reviewer assignment

Most cycles draw from three or four reviewer types:

  • Peers — colleagues at a similar level the employee works with regularly.
  • Cross-functional partners — partners in other functions (design, product, sales, etc.) the employee collaborates with.
  • Direct reports — only if the employee manages people. This is "upward feedback."
  • Stakeholders — internal customers, executives, or external partners the employee serves.

A typical 360 has 5–8 reviewers, weighted toward the people the employee works with most. Fewer than four reviewers and the anonymity gets thin; more than ten and reviewer fatigue erodes quality.

To assign reviewers:

  1. In the cycle setup form, click Add reviewer.
  2. Search by name. The autocomplete pulls from your organization directory.
  3. For each reviewer, indicate their relationship to the employee (peer, partner, report, stakeholder). The relationship label may be visible to readers depending on anonymity settings.
  4. Optionally, ask the employee to suggest reviewers — they get an email and can submit a list you then approve or modify.
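
For illustration, a minimal in-memory model of step 3, assigning reviewers with relationship labels, might look like the sketch below. The class and method names are invented; this shows the concept, not the platform's API.

    // Minimal sketch of reviewer assignment with relationship labels.
    // Invented names; illustrative only.
    type Relationship = "peer" | "partner" | "report" | "stakeholder";

    class DraftCycle {
      private reviewers = new Map<string, Relationship>();

      addReviewer(personId: string, relationship: Relationship): void {
        if (this.reviewers.has(personId)) {
          throw new Error(`${personId} is already assigned`);
        }
        this.reviewers.set(personId, relationship);
      }

      // Count reviewers per relationship, handy for checking the mix
      // before launch (see "Mix sources" below).
      mix(): Record<Relationship, number> {
        const counts: Record<Relationship, number> = {
          peer: 0, partner: 0, report: 0, stakeholder: 0,
        };
        for (const rel of this.reviewers.values()) counts[rel] += 1;
        return counts;
      }
    }

    const cycle = new DraftCycle();
    cycle.addReviewer("ana", "peer");
    cycle.addReviewer("sam", "partner");
    console.log(cycle.mix()); // { peer: 1, partner: 1, report: 0, stakeholder: 0 }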

Letting the employee suggest reviewers

In the setup form, toggle Ask the employee for reviewer suggestions. The employee receives a request to suggest 5–10 names. You then review the list, add or remove names, and launch.

This pattern usually produces a better reviewer mix than the manager picking alone. Employees know who they work with most closely, and involving them in selection reduces the "why was X asked?" friction afterward.

The response form

Performance Blocks ships with a default 360 form your org admin can customize. The default form typically includes:

  • Strengths — open-text response about what the employee does especially well.
  • Opportunities — open-text response about where the employee could grow.
  • Attribute ratings — Likert-scale ratings against the org's attribute library, scoped to attributes that apply to the employee's role.
  • Specific examples — a prompt asking for concrete moments the reviewer is drawing on.

Org admins can swap in a custom form for any cycle. Keep custom forms short — the longer the form, the lower the response rate and the more generic the responses.

Tracking submission status

After launch, the cycle dashboard shows every reviewer with their status:

  • Not started — invited but no response opened.
  • In progress — opened the form, has a partial response saved.
  • Submitted — completed and submitted.
  • Declined — the reviewer declined to participate (they can decline from the email).

You can:

  • Send a reminder to a specific reviewer or to all not-yet-submitted.
  • Extend the deadline if needed (notifies all reviewers).
  • Add a reviewer mid-cycle if a key person was missed.
  • Remove a reviewer if they decline or become unreachable.

The cycle stays open until the deadline. Late responses can still be submitted within a grace window your org admin configures.
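
The statuses above map directly onto a reminder-targeting rule: remind anyone not yet submitted, and leave decliners alone. A minimal sketch, with the row shape invented:

    // Derive reminder targets from dashboard statuses. Status names come
    // from this page; the types around them are assumptions.
    type SubmissionStatus = "not_started" | "in_progress" | "submitted" | "declined";

    interface ReviewerRow {
      personId: string;
      status: SubmissionStatus;
    }

    function reminderTargets(rows: ReviewerRow[]): string[] {
      // "All not-yet-submitted" excludes anyone who declined.
      return rows
        .filter((r) => r.status === "not_started" || r.status === "in_progress")
        .map((r) => r.personId);
    }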

Reminder cadence

Default reminders go out:

  • Three days after launch to anyone who has not opened the form.
  • Three days before the deadline to anyone not yet submitted.
  • On the deadline day to anyone with an in-progress draft.

You can send additional manual reminders at any time. Don't overdo it — three reminders is usually the limit before reviewers tune out.
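
The default cadence is plain date arithmetic. A sketch of the three rules above:

    // Default reminder dates: launch + 3 days, deadline - 3 days, and the
    // deadline day itself. Illustrative; recipients differ per reminder.
    function defaultReminderDates(launch: Date, deadline: Date): Date[] {
      const addDays = (d: Date, n: number): Date => {
        const out = new Date(d);
        out.setDate(out.getDate() + n);
        return out;
      };
      return [addDays(launch, 3), addDays(deadline, -3), new Date(deadline)];
    }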

What happens when a peer submits

When a reviewer submits feedback:

  • The submission appears in the cycle dashboard.
  • The aggregated view updates with the new response.
  • The reviewer receives a confirmation email.

You do not see a notification per submission by default — instead, you see a single "responses ready" notification when the cycle closes or when a configurable threshold of submissions is reached (e.g., 80 percent of invited reviewers).

You cannot see individual responses until the cycle closes (or until the threshold is met, depending on org settings). This prevents you from anchoring on the first response or chasing reviewers based on what you have read so far.
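
The threshold itself is a ceiling comparison. A sketch, with the 80 percent figure taken from the example above and everything else assumed:

    // "Responses ready" once submissions reach the configured share of
    // invited reviewers. The 0.8 default mirrors this page's example.
    function thresholdReached(submitted: number, invited: number, ratio = 0.8): boolean {
      if (invited === 0) return false;
      return submitted >= Math.ceil(invited * ratio);
    }

    // thresholdReached(5, 6) === true   (ceil(6 * 0.8) = 5)
    // thresholdReached(4, 6) === false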

Reviewing aggregated feedback

When the cycle closes, open the cycle from the employee profile to see the aggregated view. The view is organized into:

Themes

A synthesis of patterns across responses. Themes are surfaced based on:

  • Recurring phrases or concepts in open-text responses.
  • Attribute rating clusters (multiple reviewers giving the same attribute high or low scores).
  • Overlap across the strengths and opportunities sections (multiple reviewers naming the same area).

Themes are a starting point, not a substitute for reading the full responses. Read the themes first to orient, then read the underlying responses to understand the nuance.
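
As a toy illustration of the recurring-concept signal only (the platform's actual synthesis is more sophisticated), counting terms that show up across several reviewers' responses looks roughly like this:

    // Toy sketch: terms mentioned by at least minReviewers distinct
    // reviewers. No stop-word handling; purely illustrative.
    function recurringTerms(responses: string[], minReviewers = 2): Map<string, number> {
      const counts = new Map<string, number>();
      for (const response of responses) {
        // Count each term once per reviewer so a single wordy response
        // cannot dominate the tally.
        const terms = new Set(response.toLowerCase().match(/[a-z]+/g) ?? []);
        for (const term of terms) {
          counts.set(term, (counts.get(term) ?? 0) + 1);
        }
      }
      for (const [term, n] of counts) {
        if (n < minReviewers) counts.delete(term);
      }
      return counts;
    }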

Open-text responses

Strengths and opportunities responses, grouped by question. Depending on anonymity settings, responses may be:

  • Attributed — each response shows the reviewer's name.
  • Role-attributed — each response shows the reviewer's role or relationship (e.g., "Peer," "Direct report") but not their name.
  • Anonymous — responses are not attributed.

Attribute ratings

A summary of Likert ratings across attributes:

  • Mean rating per attribute.
  • Distribution per attribute (so you can see when reviewers disagreed).
  • Comparison to your manager rating, if you have completed one.
  • Comparison to the self-reflection, if included.

Disagreement among reviewers is information. An attribute where ratings range from "Below expectations" to "Significantly exceeds" suggests the employee performs differently in different contexts — worth digging into.
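
Mean plus spread is enough to surface that kind of disagreement. A sketch, assuming a 1 to 5 Likert scale:

    // Per-attribute summary: mean and spread, where a wide spread flags
    // reviewer disagreement. Assumes ratings on a 1-5 scale.
    function summarize(ratings: number[]): { mean: number; min: number; max: number; spread: number } {
      if (ratings.length === 0) throw new Error("no ratings");
      const mean = ratings.reduce((a, b) => a + b, 0) / ratings.length;
      const min = Math.min(...ratings);
      const max = Math.max(...ratings);
      return { mean, min, max, spread: max - min };
    }

    // summarize([2, 3, 5]) -> spread of 3 on a 5-point scale, the
    // "performs differently in different contexts" signal described above.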

Self-reflection and gap analysis

If you included a self-reflection in the cycle, the aggregated view includes a gap analysis showing where the employee's view of themselves diverges from the reviewer view. Gaps in either direction are coaching material:

  • Higher self-rating than reviewer rating — the employee may be over-estimating in this area; explore why.
  • Lower self-rating than reviewer rating — the employee may be under-confident or under-claiming; surface the strengths they are not seeing.
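
Per attribute, the gap is simply the self-rating minus the reviewer mean. A sketch with invented type names:

    // Gap analysis sketch: positive gaps mean the employee rated
    // themselves higher than reviewers did. Type names are invented.
    interface AttributeScores {
      attribute: string;
      selfRating: number;        // from the self-reflection
      reviewerRatings: number[]; // assumes at least one reviewer rating
    }

    function gaps(scores: AttributeScores[]): { attribute: string; gap: number }[] {
      return scores.map(({ attribute, selfRating, reviewerRatings }) => {
        const reviewerMean =
          reviewerRatings.reduce((a, b) => a + b, 0) / reviewerRatings.length;
        return { attribute, gap: selfRating - reviewerMean };
      });
    }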

Pulling 360 feedback into summaries

When you draft a summary for the employee, the evidence panel includes 360 responses from cycles whose period overlaps the summary period. You can:

  • Filter to "360 feedback only" to focus on it.
  • Drag specific responses into the strengths or opportunities sections.
  • Reference themes in your synthesis.

How you cite 360 feedback in the summary depends on the cycle's anonymity setting:

  • Attributed cycles — you can name reviewers in the summary if helpful.
  • Role-attributed or anonymous cycles — refer to feedback in aggregate ("Multiple peers cited..." or "Across the cross-functional partners..."). Do not name individuals or characterize responses in ways that would let the employee infer who said what.

If a 360 cycle was run with strict anonymity, the platform will warn you if you attempt to attribute a response by name in the summary editor.

Sharing results with the employee

Sharing 360 results with the employee is a separate action from closing the cycle. To share:

  1. Open the closed cycle.
  2. Click Share with employee.
  3. Confirm the anonymity settings for the shared view.

The employee then sees the same aggregated view you do, with attribution per the cycle's settings.

Best practice: share results in the context of a live conversation, not asynchronously. 360 feedback is information-dense and can land hard out of context. Walk through the themes together, leave space for reaction, and end with a small number of focus areas.

Anonymity options

Anonymity is set per cycle. Three modes are available:

Attributed

Each response is shown with the reviewer's name. Suitable for high-trust environments where directness is the norm and reviewers are senior enough to own their feedback.

Best for: senior leaders running 360s on each other; small teams; cycles where the goal is dialogue, not just signal.

Risks: can reduce candor, especially in upward feedback; can chill cross-functional feedback if reviewers fear retaliation.

Role-attributed

Each response is shown with the reviewer's relationship label (Peer, Direct report, Partner, Stakeholder) but not their name. The default for most cycles.

Best for: most situations. Provides enough context to interpret feedback without identifying individuals.

Risks: in small reviewer pools (especially direct reports), the role label can effectively identify the reviewer. The platform warns when this is the case.
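
That warning reduces to a pool-size check per relationship label: a label carried by exactly one reviewer identifies that reviewer. A sketch of the idea:

    // Relationship labels that identify a single reviewer. Illustrative;
    // not the platform's actual check.
    type Relationship = "peer" | "partner" | "report" | "stakeholder";

    function identifiableRoles(assignments: Relationship[]): Relationship[] {
      const counts = new Map<Relationship, number>();
      for (const rel of assignments) counts.set(rel, (counts.get(rel) ?? 0) + 1);
      return [...counts.entries()]
        .filter(([, n]) => n === 1)
        .map(([rel]) => rel);
    }

    // identifiableRoles(["peer", "peer", "report"]) -> ["report"]
    // A lone direct report is identified by the "Direct report" label alone.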

Anonymous

Responses are not attributed at all. Used when a small reviewer pool would otherwise let the employee infer who said what, or when the org's culture requires it for upward feedback.

Best for: upward feedback with one or two direct reports; cycles where reviewers have explicitly asked for anonymity.

Risks: hardest to interpret — you cannot ask a clarifying question, and ambiguous responses can be hard to act on.

Choosing the right setting

The cycle setup defaults to role-attributed; your org admin can change the default. Some orgs require certain modes for certain cycle types (for example, all upward feedback must be anonymous). The setup form will show any policies that constrain your choice.

Best practices for selecting reviewers

Mix sources

A 360 with five peers and no one else is a peer review, not a 360. Aim for a mix:

  • 2–3 peers
  • 1–2 cross-functional partners
  • 1–2 direct reports (if applicable)
  • 1 stakeholder (if applicable)

The exact mix depends on the role. A senior IC needs more cross-functional partners; a manager needs direct reports.

Pick reviewers who have observed the employee in real situations

A reviewer who has only worked with the employee in passing will give shallow feedback. Reviewers should have at least three months of regular collaboration with the employee, and ideally six or more.

Avoid stacking the deck

Don't pick only people you know will give the employee high marks, and don't pick people you know have a grudge. Pick the people whose perspectives are most useful — usually the ones with the most direct working relationship.

Refresh reviewers across cycles

If you run a 360 every six months, swap out at least half the reviewer pool each time. Fresh perspectives keep the feedback honest; the same reviewers giving feedback every cycle drift toward consensus.

Communicate the purpose

When you launch a cycle, the email to reviewers explains who the feedback is for and how it will be used. If you are running a cycle for an unusual purpose (a promotion case, a role transition), add a custom note in the setup form so reviewers can tailor their responses.

Notifications

You receive notifications when:

  • A cycle reaches its response threshold (typically 80 percent submitted).
  • A cycle's deadline passes.
  • A reviewer declines participation.
  • The employee submits their self-reflection (if included).

Reviewers receive notifications on launch, on reminder schedule, and on the deadline. Adjust delivery channels under Settings → Notifications.

Privacy and retention

  • 360 responses are visible only to you (the cycle owner) and, if shared, the employee. Org admins see metadata (response counts, completion rate) but not content unless they are the cycle owner or you explicitly share.
  • For cycles set to anonymous mode, responses are anonymized at storage time; the platform does not retain any link from a response back to its reviewer.
  • Closed cycles are retained per your org's retention policy. Most orgs keep them at least until the next summary cycle is complete.
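
In practice, anonymization at storage time means the reviewer link is dropped before the record is written, so no stored row maps a response back to its author. A schematic sketch with invented field names:

    // Strip attribution according to the cycle's anonymity mode before
    // persisting. Field names are invented; the behavior is schematic.
    type Anonymity = "attributed" | "role-attributed" | "anonymous";

    interface IncomingResponse {
      cycleId: string;
      reviewerId: string;
      relationship: string;
      body: string;
    }

    function toStoredResponse(r: IncomingResponse, anonymity: Anonymity) {
      switch (anonymity) {
        case "attributed":
          return { cycleId: r.cycleId, reviewerId: r.reviewerId,
                   relationship: r.relationship, body: r.body };
        case "role-attributed":
          return { cycleId: r.cycleId, relationship: r.relationship, body: r.body };
        case "anonymous":
          return { cycleId: r.cycleId, body: r.body };
      }
    }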

Troubleshooting

A reviewer says they did not get the email

Check the cycle dashboard for their status. If their status is "Not started," resend the invite from the dashboard. If the email is going to spam, ask your org admin to verify the sending domain configuration.

I cannot launch a cycle

Common causes:

  • Fewer than the minimum number of reviewers (the platform enforces a minimum, usually 3).
  • Anonymity policy violations (e.g., trying to launch an attributed upward cycle when policy requires anonymous).
  • Missing required fields (deadline, reviewers, period).

Address the issues flagged in the setup form and try again.
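
Those checks amount to a short validation pass. A schematic sketch, with the minimum of three reviewers taken from the list above and the policy check simplified:

    // Pre-launch validation sketch. The minimum of 3 comes from this page;
    // the policy flag and field names are invented.
    interface LaunchDraft {
      deadline?: string;
      periodStart?: string;
      periodEnd?: string;
      reviewers: { relationship: string }[];
      anonymity: "attributed" | "role-attributed" | "anonymous";
      upwardOnly: boolean;                 // only direct reports invited
      orgRequiresAnonymousUpward: boolean; // stand-in for org policy
    }

    function launchErrors(d: LaunchDraft, minReviewers = 3): string[] {
      const errors: string[] = [];
      if (d.reviewers.length < minReviewers) {
        errors.push(`At least ${minReviewers} reviewers are required.`);
      }
      if (d.upwardOnly && d.orgRequiresAnonymousUpward && d.anonymity !== "anonymous") {
        errors.push("Org policy requires anonymous upward feedback.");
      }
      if (!d.deadline) errors.push("Deadline is required.");
      if (!d.periodStart || !d.periodEnd) errors.push("Period is required.");
      return errors;
    }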

The aggregated view is missing responses

Aggregation runs after the cycle closes (or hits the response threshold). If you see partial data:

  • Confirm the cycle is closed or has reached the configured threshold.
  • Wait a few minutes; aggregation runs asynchronously and takes longer on cycles with many responses.
  • If responses are still missing after 10 minutes, contact support.

© 2026 Performance Blocks. All rights reserved.