Sovi.AI - AI Math Tutor


Question

Nanyang Technological University
AY2025-2026, Semester 2, MH1201 (Linear Algebra II)
Tutorial Problem Set 4 (Study Week 5)

Problem 1
Let $b,c \in \mathbb{R}$ and define a function $T: \mathbb{R}^3 \to \mathbb{R}^2$ as follows:
$\forall x,y,z \in \mathbb{R}: \quad T(x,y,z) \stackrel{\text{df}}{=} (2x - 4y + 3z + b,\; 6x + cxyz).$
Show that $T \in \mathcal{L}(\mathbb{R}^3,\mathbb{R}^2)$ if and only if $b = c = 0$.

Problem 2
Let (i) $V$ be a one-dimensional vector space over a field $\mathbb{F}$, and (ii) $T \in \mathcal{L}(V)$.
Show that there exists a unique scalar $\lambda \in \mathbb{F}$ such that, for all $\mathbf{v} \in V$, we have $T(\mathbf{v}) = \lambda \cdot \mathbf{v}$.
Remark: A linear transformation on a one-dimensional vector space is therefore just a scaling.

Problem 3
Let
(i) $V$ and $W$ be vector spaces over the same field,
(ii) $T \in \mathcal{L}(V,W)$, and
(iii) $S$ be a linearly dependent subset of $V$.
Is it necessarily true that the image $TS$ of $S$ under $T$ is a linearly dependent subset of $W$?

Problem 4
Let
(i) $m,n \in \mathbb{N}$,
(ii) $\mathbb{F}$ be a field,
(iii) $V$ be an $m$-dimensional vector space over $\mathbb{F}$, with basis $\{\mathbf{v}_1,\dots,\mathbf{v}_m\}$, and
(iv) $W$ be an $n$-dimensional vector space over $\mathbb{F}$, with basis $\{\mathbf{w}_1,\dots,\mathbf{w}_n\}$.
Given $(i,j) \in \{1,\dots,m\} \times \{1,\dots,n\}$, let $T_{i,j} \in \mathcal{L}(V,W)$ be uniquely specified by the rule

$$\forall k \in \{1,\dots,m\}: \quad T_{i,j}(\mathbf{v}_k) = \begin{cases} \mathbf{w}_j, & \text{if } k = i, \\ \mathbf{0}_W, & \text{if } k \neq i. \end{cases}$$

Show that $\{T_{i,j}\}_{i=1,j=1}^{i=m,j=n}$ is a basis for $\mathcal{L}(V,W)$ (hence, $\dim(\mathcal{L}(V,W)) = mn$).

Problem 5
Let $V$ be a finite-dimensional vector space, with $\dim(V) > 1$.
Show that there exist $S,T \in \mathcal{L}(V)$ such that $S \circ T \neq T \circ S$.

Explanation:

Response
Problem 1

Step 1: Recall the linear transformation axioms

A function $T: \mathbb{R}^3 \to \mathbb{R}^2$ is linear iff:

  1. $T(\mathbf{u}+\mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$ for all $\mathbf{u},\mathbf{v}\in\mathbb{R}^3$
  2. $T(k\mathbf{v}) = kT(\mathbf{v})$ for all $k\in\mathbb{R}, \mathbf{v}\in\mathbb{R}^3$

Or equivalently, $T(a\mathbf{u}+b\mathbf{v})=aT(\mathbf{u})+bT(\mathbf{v})$ for all $a,b\in\mathbb{R}, \mathbf{u},\mathbf{v}\in\mathbb{R}^3$, and $T(\mathbf{0})=\mathbf{0}$.

Step 2: Check the condition $T(\mathbf{0})=\mathbf{0}$

Compute $T(0,0,0) = (2\cdot 0 - 4\cdot 0 + 3\cdot 0 + b,\; 6\cdot 0 + c\cdot 0\cdot 0\cdot 0) = (b, 0)$.
If $T$ is linear, then $T(\mathbf{0})=(0,0)$, so $b=0$.

Step 3: Check additivity with $b=0$

Let $\mathbf{u}=(x_1,y_1,z_1), \mathbf{v}=(x_2,y_2,z_2)$.
Compute $T(\mathbf{u}+\mathbf{v}) = (2(x_1+x_2)-4(y_1+y_2)+3(z_1+z_2), 6(x_1+x_2)+c(x_1+x_2)(y_1+y_2)(z_1+z_2))$
$T(\mathbf{u})+T(\mathbf{v}) = (2x_1-4y_1+3z_1 + 2x_2-4y_2+3z_2, 6x_1+cx_1y_1z_1 + 6x_2+cx_2y_2z_2)$
Equate the second components:
$6(x_1+x_2)+c(x_1+x_2)(y_1+y_2)(z_1+z_2) = 6(x_1+x_2) + c(x_1y_1z_1 + x_2y_2z_2)$
Simplify: $c[(x_1+x_2)(y_1+y_2)(z_1+z_2) - x_1y_1z_1 - x_2y_2z_2] = 0$
The term in brackets is non-zero for some $x_1,y_1,z_1,x_2,y_2,z_2$ (e.g., $x_1=y_1=z_1=x_2=y_2=z_2=1$ gives $2\cdot 2\cdot 2 - 1 - 1 = 6$), so $c=0$.
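As a quick sanity check (a Python sketch, not part of the proof), one can evaluate both sides of the additivity condition at $\mathbf{u}=\mathbf{v}=(1,1,1)$ with a non-zero $c$ and confirm the second components disagree:

```python
# Numeric check of the additivity argument in Step 3 (b = 0, c = 1 here).
def T(x, y, z, b=0, c=1):
    """The map from Problem 1: T(x,y,z) = (2x - 4y + 3z + b, 6x + c*x*y*z)."""
    return (2*x - 4*y + 3*z + b, 6*x + c*x*y*z)

u = (1, 1, 1)
v = (1, 1, 1)

lhs = T(u[0] + v[0], u[1] + v[1], u[2] + v[2])   # T(u + v)
rhs = tuple(p + q for p, q in zip(T(*u), T(*v))) # T(u) + T(v)
print(lhs, rhs)  # second components differ by c * 6 when c = 1
```

The first components agree (that part of $T$ is genuinely linear); only the $cxyz$ term spoils additivity.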

Step 4: Verify that $b=c=0$ gives linearity

If $b=c=0$, then $T(x,y,z)=(2x-4y+3z,\,6x)$. Both components are linear combinations of $x,y,z$ with no constant or nonlinear terms, so additivity and homogeneity follow by direct expansion; hence $T$ is linear.
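A randomized spot-check (my own Python sketch; it does not replace the algebraic verification) confirms both axioms for the $b=c=0$ map on integer inputs:

```python
import random

# Spot-check additivity and homogeneity for T(x,y,z) = (2x - 4y + 3z, 6x),
# i.e. the b = c = 0 case from Step 4.
def T(v):
    x, y, z = v
    return (2*x - 4*y + 3*z, 6*x)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(k, v):
    return tuple(k * a for a in v)

random.seed(0)
for _ in range(100):
    u = tuple(random.randint(-5, 5) for _ in range(3))
    v = tuple(random.randint(-5, 5) for _ in range(3))
    k = random.randint(-5, 5)
    assert T(add(u, v)) == add(T(u), T(v))   # additivity
    assert T(scale(k, u)) == scale(k, T(u))  # homogeneity
print("linearity checks passed")
```

Integer inputs are used deliberately so the equality checks are exact.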

Problem 2

Step 1: Fix a basis for $V$

Since $V$ is 1-dimensional over $\mathbb{F}$, let $\{\mathbf{v}\}$ be a basis for $V$. Then any $\mathbf{w}\in V$ can be written as $\mathbf{w}=k\mathbf{v}$ for some unique $k\in\mathbb{F}$.

Step 2: Define $\lambda$ from $T(\mathbf{v})$

Since $T\in\mathcal{L}(V)$, we have $T(\mathbf{v})\in V=\operatorname{span}\{\mathbf{v}\}$, so there exists $\lambda\in\mathbb{F}$ such that $T(\mathbf{v})=\lambda\mathbf{v}$.

Step 3: Show $T(\mathbf{w})=\lambda\mathbf{w}$ for all $\mathbf{w}\in V$

Take any $\mathbf{w}=k\mathbf{v}\in V$. By linearity:
$T(\mathbf{w})=T(k\mathbf{v})=kT(\mathbf{v})=k(\lambda\mathbf{v})=\lambda(k\mathbf{v})=\lambda\mathbf{w}$.

Step 4: Prove uniqueness of $\lambda$

Suppose there exists $\lambda'\in\mathbb{F}$ such that $T(\mathbf{w})=\lambda'\mathbf{w}$ for all $\mathbf{w}\in V$. Then $T(\mathbf{v})=\lambda\mathbf{v}=\lambda'\mathbf{v}$, so $(\lambda-\lambda')\mathbf{v}=\mathbf{0}_V$. Since $\mathbf{v}\neq\mathbf{0}_V$ (it is a basis vector), $\lambda-\lambda'=0$, so $\lambda=\lambda'$.
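To illustrate the scaling claim concretely (a sketch, not a proof: the field, space, and sample points are my own choices), take $\mathbb{F}=\mathbb{R}$ and $V=\mathbb{R}$ viewed as a one-dimensional space with basis $\{1\}$. Any linear map is then determined by its value on the basis vector:

```python
# For V = R (one-dimensional, basis {1}), extract lambda = T(1) and confirm
# that T acts as scaling by lambda on a few sample vectors.
def extract_lambda(T):
    lam = T(1.0)                      # lambda is defined by T on the basis vector
    for w in (-2.0, 0.0, 0.5, 3.0):   # arbitrary sample points
        assert T(w) == lam * w        # T(w) = lambda * w, as Problem 2 asserts
    return lam

print(extract_lambda(lambda x: 3.0 * x))  # 3.0
```

Any genuinely linear $T:\mathbb{R}\to\mathbb{R}$ passes this check; a map with a constant term, such as $x \mapsto 3x+1$, fails it.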

Problem 3

Step 1: Recall the definition of linear dependence

A subset $S\subseteq V$ is linearly dependent if there exist distinct $\mathbf{v}_1,...,\mathbf{v}_k\in S$ and scalars $a_1,...,a_k\in\mathbb{F}$, not all zero, such that $\sum_{i=1}^k a_i\mathbf{v}_i = \mathbf{0}_V$.

Step 2: Try the natural argument

If $S$ is dependent, there are distinct $\mathbf{v}_1,\dots,\mathbf{v}_k\in S$ and scalars $a_1,\dots,a_k$, not all zero, with $\sum_{i=1}^k a_i\mathbf{v}_i=\mathbf{0}_V$. Applying $T$ gives $\sum_{i=1}^k a_iT(\mathbf{v}_i)=T(\mathbf{0}_V)=\mathbf{0}_W$. If the images $T(\mathbf{v}_1),\dots,T(\mathbf{v}_k)$ are all distinct, this is a non-trivial dependence relation among elements of $TS$, and $TS$ is dependent.

Step 3: The argument fails when images coincide

If $T$ identifies some elements of $S$, the coefficients attached to a common image must be combined, and they may cancel. This points to a counterexample. Let $V=\mathbb{R}^2$, $W=\mathbb{R}$, $T(x,y)=x$, and $S=\{(1,0),(1,1),(1,2)\}$. Then $S$ is linearly dependent, since $(1,0)-2(1,1)+(1,2)=(0,0)$, yet the image is the single-element set $TS=\{1\}$, which is linearly independent (a lone non-zero vector). So the answer is no: the image of a linearly dependent set need not be linearly dependent.
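The projection idea can be checked concretely. The following Python sketch (the specific set $S$ is my own choice) verifies that $S$ is dependent while its image under $T(x,y)=x$ collapses, as a set, to a single non-zero vector:

```python
# Counterexample check: V = R^2, W = R, T(x,y) = x, S = {(1,0), (1,1), (1,2)}.
def T(v):
    x, y = v
    return x

S = [(1, 0), (1, 1), (1, 2)]

# S is linearly dependent: 1*(1,0) - 2*(1,1) + 1*(1,2) = (0,0),
# a non-trivial relation (not all coefficients are zero).
coeffs = [1, -2, 1]
combo = tuple(sum(a * v[i] for a, v in zip(coeffs, S)) for i in range(2))
assert combo == (0, 0) and any(c != 0 for c in coeffs)

# As a set, the image collapses to one non-zero vector of W = R,
# which is a linearly independent subset.
image = {T(v) for v in S}
print(image)  # {1}
```

The collapse is exactly where the coefficients $1, -2, 1$ cancel: they sum to $0$, so the induced relation on the image is the trivial one.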

Answer:

$T \in \mathcal{L}(\mathbb{R}^3, \mathbb{R}^2)$ if and only if $b=c=0$, as shown by enforcing linear transformation axioms and verifying the resulting function is linear.

---

Problem 2