content/posts/tensors-signals-kernels/index.md
for each $\hat T^\prime$ we could construct. This finalizes the definition of $\hat \Gamma : T \mapsto \hat T$. Each step above is bijective, so $\hat \Gamma$ is itself a bijection.
{{% /hint %}}
We have shown that vector-valued multilinear maps defined on a single vector space (such as operators) identify tensors uniquely, despite not being of tensor form. We have also shown how tensors uniquely identify elements of tensor product spaces defined on a single vector space. This is why it is normal to refer to all of these objects as tensors.
{{% hint title="3.30. Examples" %}}
#### Tensor Contractions
The statements $(9)$ and $(11)$ may initially seem like a cryptic justification of our choice of vocabulary; they explain why we use the word "tensor" so liberally, with the most general use being in reference to an element of a heterogeneous tensor product space (up to isomorphism).
But beyond justifying use of language, $(9)$ and $(11)$ also provide a clear perspective on computation with tensors. They imply that all tensors can be "used" both as vectors and as multilinear maps -- they are both multi-argument functions and possible inputs to other multi-argument functions. To better understand this, we will take a look at [partial application](https://en.wikipedia.org/wiki/Partial_application) in this context.
{{% hint title="3.33. Example" %}}
Consider the bilinear form $q : (v, w) \mapsto v^\top A w$, which by 3.24 is a (homogeneous) tensor of type $(0, 2)$. It is a multilinear map of the form $q : V \times V \to \mathbb{F}$. If we fix the argument $v$, we obtain $\hat q : w \mapsto v^\top A w$, which is a $1$-linear map of the form $\hat q : V \to \mathbb{F}$ and hence a tensor of type $(0, 1)$.
{{% /hint %}}
In this example, we combined a multilinear map and a vector to obtain another multilinear map via partial application. Taking note that all the objects involved in this process are tensors, we can study how partial application is related to the type of the tensors involved.
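The partial application in example 3.33 can be sketched numerically. Below is a minimal NumPy illustration; the matrix `A` and the vectors are hypothetical choices, not taken from the post. Fixing the first argument of the bilinear map $q$ leaves behind a covector, i.e. a type-$(0, 1)$ tensor.

```python
import numpy as np

# Hypothetical data for the sketch: A defines the bilinear map q(v, w) = v^T A w,
# a type-(0, 2) tensor on R^3.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 3.0]])

def q(v, w):
    return v @ A @ w

v = np.array([1.0, 2.0, 0.0])

# Partially applying the first argument yields q_hat(w) = v^T A w, a 1-linear
# map V -> F. Its representation is the covector v^T A, a type-(0, 1) tensor.
q_hat_vector = v @ A

def q_hat(w):
    return q_hat_vector @ w

w = np.array([0.0, 1.0, 1.0])
assert np.isclose(q(v, w), q_hat(w))  # the two views of q agree
```

Note that the partially applied map is itself just data (a plain array), which is the computational face of "tensors are both functions and inputs to functions."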
{{% hint title="3.34. Note" %}}
Let $T$ be a homogeneous tensor of type $(m, n)$ on a vector space $V$. Partial application of $k$ of its arguments in $V$ and $h$ of its arguments in $V^\*$ will result in a new tensor $\hat T$ of type $(m - h, \\, n - k)$. Further, observe that by 3.25 one can construct a unique bilinear form $\tilde T$ from $T$ where an equivalent partial application can be done in a single argument, such that for a unique $z \in (\otimes^h \\, V^*) \otimes (\otimes^k \\, V)$,

$$\hat T = \tilde T(z, \\, \cdot \\,).$$

Above, $z$ is exactly the tensor product of the vectors that were used as arguments during partial application on $T$ in order to obtain $\hat T$. Note that $z$, by statement $(9)$, identifies another tensor of type $(h, k)$.
{{% /hint %}}
The note above explains why (in the homogeneous case) partial application of multiple tensor arguments is in fact partial application of another tensor as an argument of a uniquely associated bilinear map. This view shows how natural it is to think of partial application as a process that transforms two tensors into a third.
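As a concrete sketch of note 3.34 (with an assumed random type-$(0, 3)$ tensor on $\mathbb{R}^2$; nothing here is taken from the post itself), partially applying two vector arguments one at a time agrees with a single contraction against their tensor product $z = v \otimes w$:

```python
import numpy as np

# Assumed example data: T is a type-(0, 3) tensor on R^2 stored as a 3-way
# array; all three arguments live in V, so here h = 0 and k = 2.
rng = np.random.default_rng(0)
T = rng.standard_normal((2, 2, 2))

v = np.array([1.0, -1.0])
w = np.array([0.5, 2.0])

# Partial application one vector at a time gives a type-(0, 1) tensor:
T_hat = np.einsum('ijk,i,j->k', T, v, w)

# Equivalently, a single application of z = v ⊗ w against the associated
# bilinear view of T:
z = np.einsum('i,j->ij', v, w)        # tensor product of the applied arguments
T_hat2 = np.einsum('ijk,ij->k', T, z)

assert np.allclose(T_hat, T_hat2)     # both routes give the same tensor
```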
0. Tensor contraction
1. Einstein notation (mention einsum)
2. Penrose diagrams
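As a hedged preview of the first two items above, `numpy.einsum` expresses tensor contractions directly in Einstein notation; the arrays below are illustrative placeholders.

```python
import numpy as np

# Contracting the shared index j of A_{ij} and B_{jk} is the matrix product:
A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)

# Contracting a type-(1, 1) tensor over its only pair of indices is the trace:
M = np.eye(3) * 2.0
assert np.isclose(np.einsum('ii->', M), np.trace(M))
```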
{{< hcenter >}}
{{< figure src="roger-penrose.png" width="256" caption="Sir Roger Penrose (born August 8, 1931)" >}}