Block matrix

In the mathematical discipline of matrix theory, a block matrix or partitioned matrix is a matrix interpreted as being broken into sections called blocks: smaller rectangular matrices written side by side. A block matrix must conform to a consistent way of splitting up the rows and the columns: the rows are grouped into adjacent 'bunches', and the columns likewise. The partition consists of the rectangles formed where one bunch of adjacent rows crosses one bunch of adjacent columns. In other words, the matrix is split up by horizontal and vertical lines that run all the way across it.

Example

The matrix

\[
P = \begin{bmatrix}
1 & 1 & 2 & 2 \\
1 & 1 & 2 & 2 \\
3 & 3 & 4 & 4 \\
3 & 3 & 4 & 4
\end{bmatrix}
\]

can be partitioned into four 2×2 blocks:

\[
P_{11} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}, \quad
P_{12} = \begin{bmatrix} 2 & 2 \\ 2 & 2 \end{bmatrix}, \quad
P_{21} = \begin{bmatrix} 3 & 3 \\ 3 & 3 \end{bmatrix}, \quad
P_{22} = \begin{bmatrix} 4 & 4 \\ 4 & 4 \end{bmatrix}.
\]

The partitioned matrix can then be written as

\[
P_{\mathrm{partitioned}} = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}.
\]
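This partition-and-reassemble step can be sketched in NumPy, using slicing to extract the blocks and `np.block` to put them back together (the variable names mirror the example above):

```python
import numpy as np

# The 4x4 matrix P from the example.
P = np.array([[1, 1, 2, 2],
              [1, 1, 2, 2],
              [3, 3, 4, 4],
              [3, 3, 4, 4]])

# Partition P into four 2x2 blocks by slicing rows and columns.
P11, P12 = P[:2, :2], P[:2, 2:]
P21, P22 = P[2:, :2], P[2:, 2:]

# Reassemble the partitioned matrix from its blocks.
P_partitioned = np.block([[P11, P12],
                          [P21, P22]])

# The reassembled matrix equals the original.
assert np.array_equal(P, P_partitioned)
```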

Block diagonal matrices

A block diagonal matrix is a square block matrix whose main-diagonal blocks are square matrices and whose off-diagonal blocks are zero matrices. A block diagonal matrix A has the form

\[
\mathbf{A} = \begin{bmatrix} A_{1} & 0 & \cdots & 0 \\ 0 & A_{2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_{n} \end{bmatrix}
\]

where each Ak is a square matrix. Any square matrix can trivially be regarded as a block diagonal matrix by taking n = 1.
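Constructing a block diagonal matrix amounts to placing the given square blocks along the diagonal of a zero matrix. A minimal sketch in NumPy (the helper `block_diagonal` is illustrative, not a library function; SciPy users would reach for `scipy.linalg.block_diag`):

```python
import numpy as np

def block_diagonal(*blocks):
    """Assemble square blocks A_1, ..., A_n into a block diagonal matrix."""
    n = sum(b.shape[0] for b in blocks)
    m = sum(b.shape[1] for b in blocks)
    A = np.zeros((n, m))
    r = c = 0
    for b in blocks:
        # Place each block on the diagonal; everything else stays zero.
        A[r:r + b.shape[0], c:c + b.shape[1]] = b
        r += b.shape[0]
        c += b.shape[1]
    return A

A1 = np.array([[1.0, 2.0], [3.0, 4.0]])
A2 = np.array([[5.0]])
A3 = np.eye(3)

A = block_diagonal(A1, A2, A3)  # a 6x6 block diagonal matrix
```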

Application

In linear algebra terms, the use of a block matrix corresponds to thinking of a linear mapping in terms of corresponding 'bunches' of basis vectors. That in turn matches the idea of having distinguished direct sum decompositions of the domain and range. It is particularly significant when a block is the zero matrix; that carries the information that a summand maps into a sub-sum.

Given the interpretation via linear mappings and direct sums, there is a special type of block matrix that occurs for square matrices (the case m = n). For those we can assume an interpretation as an endomorphism of an n-dimensional space V; the block structure in which the bunching of rows and columns is the same is of importance because it corresponds to having a single direct sum decomposition on V (rather than two). In that case, for example, the diagonal blocks in the obvious sense are all square. This type of structure is required to describe the Jordan normal form.

This technique is used to cut down matrix calculations, in column-row expansions, and in many computer science applications, including VLSI chip design. An example is the Strassen algorithm for fast matrix multiplication.
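The key fact underlying such algorithms is that matrix multiplication can be carried out blockwise, with blocks multiplied and added exactly as if they were scalar entries. A small sketch verifying this for a 2×2 block partition of 4×4 matrices (ordinary block multiplication is shown here, not Strassen's rearrangement):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

def blocks(M):
    """Split a 4x4 matrix into four 2x2 blocks."""
    return M[:2, :2], M[:2, 2:], M[2:, :2], M[2:, 2:]

A11, A12, A21, A22 = blocks(A)
B11, B12, B21, B22 = blocks(B)

# Each block of the product is a sum of block products, mirroring
# the scalar formula for matrix multiplication.
C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])

# The blockwise product agrees with the ordinary product.
assert np.allclose(C, A @ B)
```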
