In quantum theory, general quantum measurements are described by so-called positive operator-valued measures (POVMs). In finite dimensions, a POVM is a set of positive semidefinite operators $E_i \ge 0$ that sum to the identity, $\sum_{i} E_{i} = \mathbb{I}$.
POVMs are useful because they allow us to calculate the probability of obtaining a particular outcome when measuring a quantum state. Namely, if we perform a measurement described by POVM $\{E_{i}\}_{i}$ on a quantum system in state $|\psi\rangle$, then the probability of getting outcome $i$ is given by $\mathrm{Pr}[i] = \mathrm{tr}\left( E_{i} |\psi\rangle\langle \psi|\right) = \langle \psi| E_{i} |\psi\rangle$.
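For readers who prefer to see this numerically, here is a minimal numpy sketch of the probability rule (the two-outcome POVM below is an arbitrary example chosen for illustration):

```python
import numpy as np

# A two-outcome qubit POVM: E1, E2 >= 0 and E1 + E2 = I.
E1 = np.array([[0.75, 0.0], [0.0, 0.25]])
E2 = np.eye(2) - E1

# The state |psi> = (|0> + |1>)/sqrt(2).
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Pr[i] = <psi|E_i|psi>; the outcome probabilities sum to 1.
for i, E in enumerate([E1, E2], start=1):
    print(f"Pr[{i}] = {psi.conj() @ E @ psi:.4f}")
```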
The simplest example of a POVM is one where each $E_i$ is a projection onto a vector $|e_i\rangle$ belonging to an orthonormal basis, that is, $E_{i} = |e_i\rangle\langle e_i|$. This is called a projective measurement. It turns out that having outcomes assigned to mutually orthogonal projections is quite convenient for analyzing quantum experiments.
While not all POVMs are projective, a result called Naimark's dilation theorem (named after Mark Aronovich Naimark) demonstrates how any POVM can be obtained from a projective measurement on a larger system. What this shows is that if we perform a projective measurement on a system but have access to only part of it, then the result of that measurement on the accessible subsystem is properly described by a POVM.
Here we wish to describe one explicit procedure for computing the projective measurement given a non-projective POVM. Without loss of generality, we can consider POVMs whose elements are all of the form $E_i = \alpha_i |e_i\rangle\langle e_i |$ with $0 < \alpha_i \le 1$. This is because if some $E_i$ has rank greater than one, we can always fine-grain the POVM into one with more outcomes, find the projections for that POVM, and then coarse-grain back to fewer outcomes by combining projections.
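As a sketch of this fine-graining step (assuming numpy; `fine_grain` is an illustrative helper of our own, not a library function), each element is diagonalized and its nonzero eigenvalue-eigenvector pairs become the rank-one outcomes:

```python
import numpy as np

def fine_grain(povm, tol=1e-12):
    """Split each POVM element into rank-one pieces alpha |e><e|.

    Returns (alpha, e, parent) triples; `parent` records which original
    outcome each piece came from, so we can coarse-grain back later.
    """
    pieces = []
    for parent, E in enumerate(povm):
        vals, vecs = np.linalg.eigh(E)   # E = sum_k vals[k] |vecs[:,k]><vecs[:,k]|
        for alpha, e in zip(vals, vecs.T):
            if alpha > tol:              # keep only nonzero eigenvalues
                pieces.append((alpha, e, parent))
    return pieces
```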
Suppose $E_i = \alpha_i |e_i\rangle\langle e_i |$ for $i = 1, 2, \ldots, n$. Then we can form a matrix whose columns are the vectors $|u_i\rangle := \sqrt{\alpha_i} |e_i\rangle$:
$W = \begin{pmatrix} U \\ V \end{pmatrix} = \left( |w_1\rangle |w_2\rangle \cdots |w_n\rangle\right)$, where $U = \left( |u_1\rangle |u_2\rangle \cdots |u_n\rangle\right)$.
If the $|e_i\rangle$ are $d$-dimensional vectors, then $U$ is a $d \times n$ matrix and $V$ is an $(n-d) \times n$ matrix, so that $W$ is an $n \times n$ square matrix. If we can find $V$ such that $W$ is a unitary matrix, then the projections we want are given by the columns of $W$; that is, for the POVM element $E_i$, its corresponding orthogonal projection in the larger space is $P_i = |w_i\rangle\langle w_i|$.
The solution is not unique, so any $V$ that makes $W$ unitary is valid. But how does one find such a matrix $V$? One simple way is to realize that if $W$ is unitary, then so is $W^\dagger$, which means we want to construct a matrix $W$ whose rows and columns are both orthonormal.
In fact, all we need is any matrix $V$ such that the rows of $W$ are linearly independent. If one chooses the matrix elements of $V$ uniformly at random, this is almost surely the case, but one can also make a simple analytic choice, as long as $W$ is full-rank, which can be checked, say, by reducing $W$ to row echelon form.
Because the POVM elements sum to the identity, the rows of $U$ are already orthonormal: $U U^\dagger = \sum_i \alpha_i |e_i\rangle\langle e_i| = \sum_i E_i = \mathbb{I}$. To make the rows of $V$ orthonormal to the rows of $U$ and to each other, we can apply the Gram-Schmidt process, leaving the rows of $U$ untouched. Suppose after that $V$ becomes $\widetilde{V}$ (in particular, all rows and columns are properly normalized) and we get $\widetilde{W} = \begin{pmatrix} U \\ \widetilde{V} \end{pmatrix}$. This is the desired unitary matrix: the orthogonal projections of the Naimark dilation can be constructed from the columns of $\widetilde{W}$.
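Putting the whole procedure together, here is one possible numpy sketch (the function name `naimark_unitary` and the random choice of $V$ are our own illustrative choices):

```python
import numpy as np

def naimark_unitary(alphas, vecs, seed=0):
    """Complete U, whose columns are sqrt(alpha_i)|e_i>, to an n x n unitary."""
    U = np.column_stack([np.sqrt(a) * np.asarray(e) for a, e in zip(alphas, vecs)])
    d, n = U.shape
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((n - d, n))      # random rows: full rank almost surely
    W = np.vstack([U.astype(complex), V])
    # Gram-Schmidt on the rows of V only: the rows of U are already
    # orthonormal because sum_i alpha_i |e_i><e_i| = I.
    for i in range(d, n):
        for j in range(i):
            W[i] -= (W[j].conj() @ W[i]) * W[j]
        W[i] /= np.linalg.norm(W[i])
    return W
```

Since the random rows are almost surely linearly independent of the rows of $U$ and of each other, the Gram-Schmidt sweep succeeds, and one can confirm the output with a check like `np.allclose(W.conj().T @ W, np.eye(n))`.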
To see how this all works, we provide a simple worked example. Consider the following three-outcome qubit POVM:
$E_1 = \frac{1}{4}|0\rangle\langle 0|, \quad E_2 = \frac{1}{2}|+\rangle\langle +| + \frac{1}{2}|1\rangle\langle 1|, \quad E_3 = \frac{1}{4}|0\rangle\langle 0| + \frac{1}{2}|-\rangle\langle -|$,
where $|0\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, |1\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, |\pm\rangle = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ \pm 1 \end{pmatrix}$.
The first thing we do is split $E_2$ and $E_3$ into their rank-one parts, so that when we construct the matrix $U$, it will have 5 columns: the first corresponds to $E_1$, the second and third to $E_2$, and the fourth and fifth to $E_3$. With $|u_i\rangle = \sqrt{\alpha_i}|e_i\rangle$ as columns, we have
$U = \begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2} & 0 & \tfrac{1}{2} & \tfrac{1}{2} \\ 0 & \tfrac{1}{2} & \tfrac{1}{\sqrt{2}} & 0 & -\tfrac{1}{2} \end{pmatrix}$.
Observe that the rows of $U$ are orthonormal, as they must be.
It can be verified that one way to choose $V$ so that its rows are orthogonal to the rows of $U$ (and to each other) is
$V = \begin{pmatrix} \tfrac{1}{2} & 0 & 0 & -\tfrac{1}{2} & 0 \\ 0 & -1 & \sqrt{2} & 0 & 1 \\ 1 & -1 & 0 & 1 & -1 \end{pmatrix}$.
Finally, normalizing the rows of $V$ (the rows of $U$ are already normalized), we take
$\widetilde{W} = \frac{1}{2} \begin{pmatrix} 1 & 1 & 0 & 1 & 1 \\ 0 & 1 & \sqrt{2} & 0 & -1 \\ \sqrt{2} & 0 & 0 & -\sqrt{2} & 0 \\ 0 & -1 & \sqrt{2} & 0 & 1 \\ 1 & -1 & 0 & 1 & -1 \end{pmatrix} = \left(|w_1\rangle \, |w_2\rangle \, |w_3\rangle \, |w_4\rangle \, |w_5\rangle \right)$.
The projections $P_i$ corresponding to each POVM element $E_i$ are
$P_1 = |w_1\rangle\langle w_1|, \, P_2 = |w_2\rangle\langle w_2| + |w_3\rangle\langle w_3|, \, P_3 = |w_4\rangle\langle w_4| + |w_5\rangle\langle w_5|$.
To show that we get the same probabilities, observe that when the POVM $\{ E_i \}_{i}$ is measured on the state $|\psi \rangle = \begin{pmatrix} a \\ b \end{pmatrix}$, the projections of its Naimark dilation are measured on the embedded state $|\Psi\rangle = \begin{pmatrix} a \\ b \\ 0 \\ 0 \\ 0 \end{pmatrix}$. For instance, $\langle \Psi | P_1 | \Psi \rangle = |\langle w_1 | \Psi \rangle|^2 = \tfrac{1}{4}|a|^2 = \langle \psi | E_1 | \psi \rangle$, and the other two outcomes check out similarly.
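As a final numerical check of the worked example (a numpy sketch; the amplitudes $a = 0.6$, $b = 0.8$ are an arbitrary normalized choice), we compare the dilated projective probabilities with the direct POVM probabilities:

```python
import numpy as np

s2 = np.sqrt(2)
# The unitary derived above; its first two rows are exactly U.
W = 0.5 * np.array([[1,   1,  0,   1,   1],
                    [0,   1,  s2,  0,  -1],
                    [s2,  0,  0,  -s2,  0],
                    [0,  -1,  s2,  0,   1],
                    [1,  -1,  0,   1,  -1]])
assert np.allclose(W.T @ W, np.eye(5))           # W is unitary

a, b = 0.6, 0.8                                  # arbitrary qubit state, |a|^2 + |b|^2 = 1
Psi = np.array([a, b, 0, 0, 0])                  # |psi> embedded in the larger space
pr = np.abs(W.conj().T @ Psi) ** 2               # |<w_i|Psi>|^2 for each column
print(pr[0], pr[1] + pr[2], pr[3] + pr[4])       # coarse-grain back: Pr[1], Pr[2], Pr[3]

# Compare with measuring the POVM directly on |psi> = (a, b).
psi = np.array([a, b])
E1 = np.array([[0.25, 0.0], [0.0, 0.0]])
E2 = np.array([[0.25, 0.25], [0.25, 0.75]])
E3 = np.array([[0.50, -0.25], [-0.25, 0.25]])
print([psi @ E @ psi for E in (E1, E2, E3)])     # same three probabilities
```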