Question 25

Let $$A$$ be a $$3 \times 3$$ matrix such that $$A + A^{T} = 0$$. If $$A\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix}$$, $$A^{2}\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} -3 \\ 19 \\ -24 \end{bmatrix}$$, and $$\det(\operatorname{adj}(2\operatorname{adj}(A+I))) = 2^{\alpha} \cdot 3^{\beta} \cdot 11^{\gamma}$$, where $$\alpha, \beta, \gamma$$ are non-negative integers, then $$\alpha + \beta + \gamma$$ is equal to _____


Correct Answer: 18

Since $$A + A^T = 0$$, the matrix $$A$$ is skew-symmetric, so it has the form:

$$ A = \begin{bmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{bmatrix} $$

Using the given condition $$A \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix}$$:

$$ \begin{bmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} -a \\ -a \\ -b + c \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix} $$

This gives (the first two rows yield the same equation):

$$ -a = 3 \implies a = -3 $$

$$ -b + c = 2 \implies c - b = 2 \quad \text{(1)} $$

Using the second condition $$A^2 \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} -3 \\ 19 \\ -24 \end{bmatrix}$$:

First, $$A \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix}$$, so:

$$ A^2 \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} = A \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{bmatrix} \begin{bmatrix} 3 \\ 3 \\ 2 \end{bmatrix} = \begin{bmatrix} 3a + 2b \\ -3a + 2c \\ -3b - 3c \end{bmatrix} = \begin{bmatrix} -3 \\ 19 \\ -24 \end{bmatrix} $$

Substituting $$a = -3$$:

$$ 3(-3) + 2b = -3 \implies -9 + 2b = -3 \implies 2b = 6 \implies b = 3 $$

$$ -3(-3) + 2c = 19 \implies 9 + 2c = 19 \implies 2c = 10 \implies c = 5 $$

$$ -3(3) - 3(5) = -9 - 15 = -24 \quad \text{(verified)} $$

From equation (1): $$c - b = 5 - 3 = 2$$, which holds. Thus,

$$ A = \begin{bmatrix} 0 & -3 & 3 \\ 3 & 0 & 5 \\ -3 & -5 & 0 \end{bmatrix} $$
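As a quick sanity check (not part of the exam solution), the recovered matrix can be tested against both given conditions with a short pure-Python sketch; the helper `matvec` is just an illustrative name for a 3×3 matrix-vector product.

```python
# Verify the recovered skew-symmetric matrix A against both given conditions,
# using exact integer arithmetic (no external libraries).

def matvec(M, v):
    """Multiply a 3x3 matrix (list of rows) by a length-3 vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

A = [[0, -3, 3],
     [3,  0, 5],
     [-3, -5, 0]]

v = [1, -1, 0]
Av = matvec(A, v)    # should equal [3, 3, 2]
A2v = matvec(A, Av)  # should equal [-3, 19, -24]
print(Av, A2v)       # [3, 3, 2] [-3, 19, -24]
```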

Now compute $$A + I$$:

$$ A + I = \begin{bmatrix} 0 & -3 & 3 \\ 3 & 0 & 5 \\ -3 & -5 & 0 \end{bmatrix} + \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & -3 & 3 \\ 3 & 1 & 5 \\ -3 & -5 & 1 \end{bmatrix} $$

Set $$B = A + I$$. The determinant of $$B$$ is:

$$ \det(B) = \begin{vmatrix} 1 & -3 & 3 \\ 3 & 1 & 5 \\ -3 & -5 & 1 \end{vmatrix} = 1 \cdot \begin{vmatrix} 1 & 5 \\ -5 & 1 \end{vmatrix} - (-3) \cdot \begin{vmatrix} 3 & 5 \\ -3 & 1 \end{vmatrix} + 3 \cdot \begin{vmatrix} 3 & 1 \\ -3 & -5 \end{vmatrix} $$

$$ = 1 \cdot (1 \cdot 1 - 5 \cdot (-5)) + 3 \cdot (3 \cdot 1 - 5 \cdot (-3)) + 3 \cdot (3 \cdot (-5) - 1 \cdot (-3)) $$

$$ = 1 \cdot (1 + 25) + 3 \cdot (3 + 15) + 3 \cdot (-15 + 3) = 1 \cdot 26 + 3 \cdot 18 + 3 \cdot (-12) = 26 + 54 - 36 = 44 $$
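The same cofactor expansion can be checked numerically; `det3` below is an illustrative helper implementing expansion along the first row.

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

B = [[1, -3, 3],
     [3,  1, 5],
     [-3, -5, 1]]
print(det3(B))  # 44
```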

Set $$C = \operatorname{adj}(B)$$. For a $$3 \times 3$$ matrix, $$\operatorname{adj}(\operatorname{adj}(B)) = \det(B)^{3-2} B = \det(B) B$$, since $$\det(B) = 44 \neq 0$$. Thus,

$$ \operatorname{adj}(C) = \operatorname{adj}(\operatorname{adj}(B)) = 44 B $$

Now compute $$\operatorname{adj}(2C)$$. For any scalar $$k$$ and $$n \times n$$ matrix $$M$$, $$\operatorname{adj}(kM) = k^{n-1} \operatorname{adj}(M)$$; here $$n = 3$$. So,

$$ \operatorname{adj}(2C) = 2^{3-1} \operatorname{adj}(C) = 2^2 \cdot 44 B = 4 \cdot 44 B = 176 B $$
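Both adjugate identities used above, $$\operatorname{adj}(\operatorname{adj}(B)) = \det(B)\,B$$ and $$\operatorname{adj}(2C) = 2^{2}\operatorname{adj}(C)$$, can be confirmed for this particular $$B$$ with exact integer arithmetic; `adj3` is an illustrative cofactor-transpose implementation for the 3×3 case.

```python
def adj3(M):
    """Adjugate of a 3x3 matrix: transpose of the cofactor matrix."""
    C = [[0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            # 2x2 minor obtained by deleting row i and column j
            rows = [r for r in range(3) if r != i]
            cols = [c for c in range(3) if c != j]
            minor = (M[rows[0]][cols[0]] * M[rows[1]][cols[1]]
                     - M[rows[0]][cols[1]] * M[rows[1]][cols[0]])
            C[i][j] = (-1) ** (i + j) * minor
    # transpose the cofactor matrix to get the adjugate
    return [[C[j][i] for j in range(3)] for i in range(3)]

B = [[1, -3, 3], [3, 1, 5], [-3, -5, 1]]
C = adj3(B)
twoC = [[2 * x for x in row] for row in C]
assert adj3(C) == [[44 * x for x in row] for row in B]      # adj(adj B) = det(B)·B
assert adj3(twoC) == [[176 * x for x in row] for row in B]  # adj(2C) = 4·44·B
```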

The determinant is:

$$ \det(\operatorname{adj}(2C)) = \det(176 B) $$

For any scalar $$k$$ and $$n \times n$$ matrix $$M$$, $$\det(kM) = k^n \det(M)$$. Here $$n = 3$$, so:

$$ \det(176 B) = 176^3 \det(B) = 176^3 \cdot 44 $$

Factorize:

$$ 176 = 2^4 \cdot 11, \quad 44 = 2^2 \cdot 11 $$

Thus,

$$ 176^3 = (2^4 \cdot 11)^3 = 2^{12} \cdot 11^3 $$

$$ 176^3 \cdot 44 = (2^{12} \cdot 11^3) \cdot (2^2 \cdot 11) = 2^{14} \cdot 11^4 $$

So, $$\det(\operatorname{adj}(2 \operatorname{adj}(A + I))) = 2^{14} \cdot 3^{0} \cdot 11^{4}$$, giving $$\alpha = 14$$, $$\beta = 0$$, $$\gamma = 4$$.

Therefore,

$$ \alpha + \beta + \gamma = 14 + 0 + 4 = 18 $$
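The final factorization can also be verified directly by stripping prime factors from $$176^3 \cdot 44$$; `exponent_of` is an illustrative helper computing the multiplicity of a prime in an integer.

```python
# Confirm 176^3 * 44 = 2^14 * 3^0 * 11^4 and alpha + beta + gamma = 18.

def exponent_of(p, n):
    """Largest e with p**e dividing n."""
    e = 0
    while n % p == 0:
        n //= p
        e += 1
    return e

value = 176 ** 3 * 44
alpha = exponent_of(2, value)
beta = exponent_of(3, value)
gamma = exponent_of(11, value)
assert value == 2 ** alpha * 3 ** beta * 11 ** gamma
print(alpha + beta + gamma)  # 18
```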
