
How do I run a weighted decision matrix?

A structured method for comparing options across multiple criteria. When to use it — and when the numbers are lying to you.

List your options as columns and your criteria as rows. Assign each criterion a weight, with the weights totalling 100. Score each option on each criterion from 1 to 10. Multiply each score by its criterion's weight, sum the results for each option, and compare totals. The option with the highest weighted score is the analytical winner. If that result surprises you, the surprise is worth examining — it means something important is either missing from the matrix or weighted incorrectly.
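In code, the arithmetic above reduces to a few lines. A minimal Python sketch; the criteria names, weights, and scores are invented for illustration:

```python
# Weights per criterion; the article's convention is that they sum to 100.
weights = {"cost": 40, "quality": 35, "speed": 25}
assert sum(weights.values()) == 100

# Each option gets a 1-10 score on every criterion (illustrative values).
scores = {
    "Option A": {"cost": 7, "quality": 5, "speed": 8},
    "Option B": {"cost": 4, "quality": 9, "speed": 6},
}

# Weighted total per option: sum of score * weight across criteria.
totals = {
    option: sum(scores[option][c] * w for c, w in weights.items())
    for option in scores
}
winner = max(totals, key=totals.get)
print(totals)   # Option A: 7*40 + 5*35 + 8*25 = 655; Option B: 625
print(winner)   # Option A
```

The same computation works with weights summing to 1.0 instead of 100; only the scale of the totals changes, not the ranking.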

Step-by-step: building the matrix

Start by writing your options across the top of a table. If you have more than five options, trim to the most viable three or four before you begin. A matrix with eight options becomes unwieldy and the scoring exercise loses discipline.

Down the left side, list your decision criteria. These are the dimensions that genuinely matter for this specific decision. For a job offer comparison, criteria might include compensation, role scope, growth potential, commute, and manager quality. Assign each criterion a weight that reflects its relative importance to you. The weights should sum to 100. This step forces the prioritisation that most informal decisions skip.

Now score each option on each criterion from 1 (very poor fit) to 10 (excellent fit). Multiply each score by its criterion weight, then sum the column for each option. The totals give you a ranked comparison. Most decision matrix tools present this in a spreadsheet; a simple table on paper works equally well.
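Putting the steps together with the job-offer criteria mentioned earlier. A sketch only; every weight and score here is an invented example, not a recommendation:

```python
# Criteria from the job-offer example, with illustrative weights summing to 100.
criteria = ["compensation", "role scope", "growth", "commute", "manager"]
weights = [30, 20, 25, 10, 15]
assert sum(weights) == 100

# 1-10 scores per option, in the same order as the criteria (invented values).
offers = {
    "Offer X": [8, 6, 7, 4, 9],
    "Offer Y": [6, 8, 9, 7, 6],
}

# Weighted total per offer.
totals = {name: sum(s * w for s, w in zip(scores, weights))
          for name, scores in offers.items()}

# Print as a ranked comparison, highest total first.
for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total}")
```

Note that setting the weights on their own line, before any scores are entered, mirrors the discipline the article recommends: commit to what matters before you see how each option performs.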

When the numbers are lying to you

A weighted decision matrix is only as honest as the weights and scores you put into it. The most common failure mode is reverse-engineering: setting weights and scores so that a preferred option wins, then treating the matrix total as objective confirmation. This feels like rigour but is rationalisation. The tell is that you feel relief when a particular option scores highest, rather than genuine surprise or curiosity.

The gut-check test is the most useful corrective. When the matrix produces its winner, notice your immediate emotional reaction. If you feel pleased, consider whether the weights were honest. If you feel disappointed, the matrix may be more useful than you wanted it to be. Either reaction carries signal. A strong negative reaction to the matrix result often means there is a criterion you left out, or a criterion you underweighted because naming it honestly felt uncomfortable.

Post-decision rationalisation

Post-decision rationalisation is the tendency to work backward from a preferred conclusion and construct reasoning that supports it. In a decision matrix, this manifests as unconsciously setting weights to favour the option you already want. The fix is to set your weights before you score the options, and ideally before you know which option will benefit from which weight distribution.

Getting more from the tool

Two practices significantly improve decision matrix quality. First, set your criterion weights before you start scoring options. Committing to what matters most before you see how each option performs removes the temptation to adjust weights to suit a preferred outcome.

Second, run the matrix once, look at the result, then check whether any criterion is conspicuously absent. Decisions often have one factor that people are reluctant to put on the table explicitly, usually because it makes their actual preferences visible. If you notice you have avoided a criterion, add it and re-run. A matrix that includes the things you find awkward to name is far more useful than a tidy one that avoids them.

Don't just read about it — run your actual decision through our AI Decision Assistant.

DecisionsMatter.ai is an AI decision assistant that walks you through a structured analysis: framing, bias check, pre-mortem, and decision record. Your first analysis is free.



Common questions

How many criteria should I include in the matrix?
Between four and seven criteria is the practical range. Fewer than four often means you are oversimplifying a real trade-off. More than seven and the cognitive load of scoring honestly becomes difficult, and the weights get spread thin enough that they lose meaningful distinction. If you have a long list, group related criteria into categories and weight the categories instead.
How do I decide the weights for each criterion?
Ask yourself: if I could only optimise this decision for one thing, what would it be? Assign that criterion the highest weight. Then rank the rest relative to it. The weights should reflect your genuine priorities, not what you think they should be. A useful check is to show your weights to someone who knows you well and ask if they ring true. If they seem surprised, revisit them.
What if two options score very close together?
A close score is itself a signal: the two options are genuinely comparable on the criteria you chose. In that case, look for any unquantified factor you left out of the matrix. A near-tie often resolves when you surface the one thing you were reluctant to include as a criterion, usually because it felt too subjective. Name it, add it, and see what happens to the scores.
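If you want a mechanical flag for "too close to call", one option is to compare the top two totals against a small relative margin. This is an illustrative extension, not a rule from the article, and the 5% threshold is an arbitrary assumption:

```python
def near_tie(totals: dict, margin_pct: float = 5.0) -> bool:
    """Return True when the top two weighted totals are within margin_pct of each other."""
    top, second = sorted(totals.values(), reverse=True)[:2]
    return (top - second) / top * 100 < margin_pct

# Illustrative totals: a ~2.3% gap falls under the 5% threshold.
print(near_tie({"Option A": 655, "Option B": 640}))   # True: surface the missing criterion
print(near_tie({"Option A": 700, "Option B": 600}))   # False: a clear winner
```

A True result is the cue the answer above describes: look for the criterion you were reluctant to include, add it, and re-run.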
Should I include criteria I cannot quantify?
Yes, and this is where most people underuse the tool. Gut feel, cultural fit, alignment with values, and energy are all legitimate criteria. You can score them 1 to 10 even without hard data. The act of forcing a number onto something qualitative is useful because it requires you to take a position rather than leave it as a vague sentiment. Just be honest that the score is a judgment, not a measurement.

