Conjugate gradient method - direct method

Conjugate gradient method (CG method) outline:

  1. CG method - direct method

  2. CG method - iterative method


For solving $Ax=b$, where $A$ is a square symmetric positive definite matrix.

Assumptions:

$A\in M_{n\times n}$ is a symmetric positive definite matrix.

Definition:

  • A-orthogonal (A-conjugate): Let $u_1$ and $u_2$ be two nonzero vectors with $u_1 \neq u_2$. If they satisfy $$ \langle{u_1},A{u_2}\rangle = {u_1}^TA{u_2} = 0, $$ then $u_1$ and $u_2$ are called A-orthogonal (or A-conjugate).
  • A-orthonormal: If $u_1$ and $u_2$ are A-orthogonal and, in addition, $\langle{u_i}, {u_i}\rangle_A = 1$ for $1\le i\le 2$, then they are called A-orthonormal.

Recall : ${u_1}$ and ${u_2}$ are orthogonal if ${u_1}^T{u_2} = 0$.

Note : We can define $$ \langle{u_1}, {u_2}\rangle_A = \langle{u_1},A{u_2}\rangle= \langle A{u_1},{u_2}\rangle, $$ then $\langle \cdot,\cdot\rangle_A$ is an inner product.
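A quick numerical check of the definition. The $2\times 2$ symmetric positive definite matrix and the two vectors below are made up purely for illustration:

```python
import numpy as np

# A hypothetical symmetric positive definite matrix (illustration only)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

def inner_A(u, v, A):
    """The A-inner product <u, v>_A = u^T A v."""
    return u @ A @ v

u1 = np.array([1.0, 0.0])
# Choose u2 so that <u1, u2>_A = u1^T A u2 = 4*x + 1*y = 0, e.g. (1, -4)
u2 = np.array([1.0, -4.0])

print(inner_A(u1, u2, A))  # 0.0: u1 and u2 are A-orthogonal
print(inner_A(u2, u1, A))  # 0.0: symmetric in u and v, since A is symmetric
```

Note that $u_1^T u_2 = 1 \neq 0$, so A-orthogonal vectors need not be orthogonal in the usual sense.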


Lemma

If $\{u_0, \cdots, u_k\}$ is an A-orthogonal set, then $\{u_0, \cdots, u_k\}$ is also a linearly independent set.

pf:

Suppose $c_0 u_0 + \cdots + c_ku_k = 0$.

Multiply both sides of this equation by $u^T_j A$. Since $u_j^TAu_i=0$ for $i\ne j$, this gives $c_j u^T_j Au_j = 0$. Moreover, since $A$ is positive definite, $u^T_j Au_j > 0$, and therefore $c_j=0$.

Since the argument above holds for every $j$, we get $c_i=0$ $\forall i$, and hence $\{u_0, \cdots, u_k\}$ is linearly independent.

Note : If $\{u_0, \cdots, u_{n-1}\}\subset\mathbb{R}^n$ is an A-orthogonal set, then it is also a basis of $\mathbb{R}^n$.
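The lemma can be checked numerically. As a sketch, the following builds an A-orthogonal set by Gram-Schmidt in the A-inner product applied to the standard basis (one of several possible constructions, chosen here for illustration), then verifies A-orthogonality and linear independence:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # a random symmetric positive definite matrix

# Gram-Schmidt in the A-inner product: turn the standard basis e_0, ..., e_{n-1}
# into an A-orthogonal set u_0, ..., u_{n-1}
us = []
for j in range(n):
    u = np.eye(n)[:, j].copy()
    for v in us:
        u -= (v @ A @ u) / (v @ A @ v) * v
    us.append(u)
U = np.column_stack(us)

# A-orthogonality: off-diagonal entries of U^T A U vanish (up to round-off)
G = U.T @ A @ U
offdiag = G - np.diag(np.diag(G))
print(np.max(np.abs(offdiag)) < 1e-10)   # True

# Linear independence (the lemma): the columns of U have full rank,
# so they also form a basis of R^n
print(np.linalg.matrix_rank(U) == n)     # True
```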


CG as a direct method

Theorem:

Suppose that for a matrix $A$ we can find an A-orthogonal set $\{u_0, \ldots, u_{n-1}\}$. Then the solution of the linear system $Ax=b$ is $$ x = \sum^{n-1}_{j=0}\frac{u^T_j b}{u^T_jAu_j} u_j. $$

pf:

Write $x = \sum^{n-1}_{i=0} c_i u_i$; then

$$ Ax = \sum^{n-1}_{i=0} c_i Au_i.
$$

Substituting this into $b = Ax$ and multiplying both sides by $u^T_j$ gives $$ u^T_j b = \sum^{n-1}_{i=0} c_i u^T_jAu_i = c_ju^T_jAu_j. $$ Therefore $$ c_j = \frac{u^T_j b}{u^T_jAu_j}. $$


In practice, we find the $A$-orthogonal set iteratively, and the linear system is solved with an iterative method.
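The theorem's formula can be exercised end to end. The sketch below again uses Gram-Schmidt in the A-inner product to produce an A-orthogonal set (the full CG method instead generates such directions on the fly during the iteration), then assembles $x$ from the formula and checks it against $Ax=b$; the matrix and right-hand side are random illustrative data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # symmetric positive definite (illustrative)
b = rng.standard_normal(n)

# Build an A-orthogonal set u_0, ..., u_{n-1} from the standard basis
us = []
for j in range(n):
    u = np.eye(n)[:, j].copy()
    for v in us:
        u -= (v @ A @ u) / (v @ A @ v) * v
    us.append(u)

# Theorem: x = sum_j (u_j^T b) / (u_j^T A u_j) u_j solves Ax = b
x = sum((u @ b) / (u @ A @ u) * u for u in us)

print(np.allclose(A @ x, b))   # True
```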


Te-Sheng Lin (林得勝)
Professor