```dataviewjs
const fieldName = "theme"; // Your field with links
const oldPrefix = "Thoughts/01 Themes/";
const newPrefix = "Digital Garden/Topics/";
const relatedLinks = dv.current()[fieldName];
if (Array.isArray(relatedLinks)) {
    // Map over the links, replace the path, and output only clickable links
    dv.el("span", relatedLinks
        .map(link => {
            if (link && link.path) {
                let newPath = link.path.startsWith(oldPrefix)
                    ? link.path.replace(oldPrefix, newPrefix)
                    : link.path;
                return dv.fileLink(newPath);
            }
        })
        .filter(Boolean) // Remove any undefined/null items
        .join(", ")
    );
} else {
    dv.el("span", dv.current()[fieldName]);
}
```
Proposition
Let $G \sim N(0,1)^{\otimes d \times m}$ be a random matrix and $y \in \mathbb{R}^m$. Then
$$\operatorname{Law}(Gy) = N\left( 0, \lVert y \rVert^2 I_{d} \right)$$
Proof
$Gy$ is linear in $G$. The entries of $G$ form a Gaussian random vector, so $Gy$, being a linear image of that vector, must be Gaussian as well. Thus, it suffices to calculate the mean and covariance to find its law.
Since the entries of $G$, call them $G_{i\alpha}$, are i.i.d. $N(0,1)$, and $(Gy)_i = \sum_{\alpha=1}^m G_{i\alpha} y_{\alpha}$, we get
$$\begin{align}
\mathbb{E}\left[ (Gy)_i \right] &= \sum_{\alpha=1}^m y_{\alpha} \cdot \mathbb{E}\left[ G_{i\alpha} \right] = 0 \\
\operatorname{Cov}\left( (Gy)_i, (Gy)_j \right) &= \sum_{\alpha,\beta=1}^m y_{\alpha} y_{\beta} \cdot \mathbb{E}\left[ G_{i\alpha} G_{j\beta} \right] = \delta_{ij} \sum_{\alpha=1}^m y_{\alpha}^2 = \delta_{ij} \, \lVert y \rVert^2
\end{align}$$
Above, we found the expectation and covariance in terms of the entry-wise products and sums. We can also do the calculation in terms of matrices.
Proof (matrices)
Expectation
If there is only one random matrix involved, we can write
$$\mathbb{E}[Gy] = \mathbb{E}[G] \, y \overset{(*)}{=} 0$$
Where $(*)$ holds in the setting above, since each entry of $G$ has mean zero. This is a simple consequence of the linearity of expectation.
Covariance
If $g_1, \ldots, g_m \in \mathbb{R}^d$ are the columns of $G$, so that the $g_{\alpha} \sim N(0, I_{d})$ are i.i.d., then we see that
$$\begin{align}
\text{Cov}(Gy) &= \mathbb{E}\left[ (Gy)(Gy)^{\intercal} \right] \\
&= \mathbb{E}\left[ \left( \sum_{\alpha=1}^m y_{\alpha} g_{\alpha} \right)\left( \sum_{\beta=1}^m y_{\beta} g_{\beta}^{\intercal} \right) \right] \\
&= \sum_{\alpha,\beta=1}^m y_{\alpha} y_{\beta} \cdot \mathbb{E}\left[ g_{\alpha}g_{\beta}^{\intercal} \right] \\
&\overset{(*)}{=} \sum_{\alpha=1}^m y_{\alpha}^2 \cdot I_{d} \\
&= \lVert y \rVert^2 I_{d}
\end{align}$$
Where again $(*)$ refers to the setting above: in this line we use the law of the $g_{\alpha}$, namely $\mathbb{E}\left[ g_{\alpha} g_{\beta}^{\intercal} \right] = \delta_{\alpha\beta} I_{d}$, since distinct columns are independent with mean zero.
$$\tag*{$\blacksquare$}$$
^proof-1
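
The proposition can also be sanity-checked numerically. The sketch below (using NumPy; the dimensions, seed, and sample count are arbitrary choices, not from the note) draws many independent copies of $G$, forms $Gy$, and compares the sample covariance against $\lVert y \rVert^2 I_d$:

```python
import numpy as np

# Empirical check of Law(Gy) = N(0, ||y||^2 I_d).
rng = np.random.default_rng(0)
d, m, n_samples = 3, 5, 200_000  # arbitrary illustrative sizes

y = rng.standard_normal(m)               # a fixed vector y in R^m
G = rng.standard_normal((n_samples, d, m))  # n_samples iid copies of G
samples = G @ y                          # shape (n_samples, d): draws of Gy

sample_cov = np.cov(samples, rowvar=False)   # d x d sample covariance
target_cov = np.dot(y, y) * np.eye(d)        # ||y||^2 I_d

print(np.abs(samples.mean(axis=0)).max())      # near 0
print(np.abs(sample_cov - target_cov).max())   # shrinks as n_samples grows
```

The entrywise deviation decays like $1/\sqrt{n}$, so with $2 \times 10^5$ samples both printed values are small.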