
Questions tagged [diagonalization]

For questions about matrix diagonalization. Diagonalization is the process of finding a corresponding diagonal matrix for a diagonalizable matrix or linear map. This tag is NOT for diagonalization arguments common to logic and set theory.

4 votes
2 answers
256 views

I can't understand exactly why a real symmetric matrix is diagonalizable from the symmetry alone. I can prove the diagonalizability of this kind of matrix by mathematical induction, as in Artin's ...
Rei Rockey
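A minimal numerical sketch of the fact this snippet asks about, assuming NumPy is available: `numpy.linalg.eigh` diagonalizes any real symmetric matrix with an orthogonal eigenvector matrix. This illustrates, but of course does not prove, the real spectral theorem.

```python
import numpy as np

# Hypothetical example matrix: any real symmetric A (built as B + B^T).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T

# eigh is specialized to symmetric/Hermitian input: it returns real
# eigenvalues w (ascending) and an orthogonal eigenvector matrix V.
w, V = np.linalg.eigh(A)

assert np.allclose(V.T @ V, np.eye(4))          # V is orthogonal
assert np.allclose(V @ np.diag(w) @ V.T, A)     # A = V diag(w) V^T
```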
1 vote
2 answers
141 views

Isn't it possible to form the liar's sentence by using the diagonal lemma to form the sentence "This sentence proves all statements"? I.e. For some classical first order theory $T$ ...
William Oliver
3 votes
1 answer
238 views

I am very confused by the following highlighted lines in a proof of necessary and sufficient conditions for diagonalisability in Linear Algebra Done Right (4th ed.), Axler S. (2024). Questions. How ...
microhaus
0 votes
0 answers
35 views

In a lecture on exotic $\mathbb R^4$s, Robert Gompf claims that the following bilinear forms are "equivalent" (I presume, over the integers): $$ \left(\begin{array}{c|c} 1 & 0\\\hline 0 &...
ruza-net
2 votes
2 answers
228 views

I want to prove that the following $2\times 2$ matrices are similar over $\mathbb R$: $$ A=\begin{pmatrix} 0 & -1\\ 1 & 0\\ \end{pmatrix} \qquad B=\begin{pmatrix} -3 & -5\\ 2 & 3\\ \...
boaz
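For this snippet, a quick NumPy sanity check (not a proof) confirms that $A$ and $B$ share trace $0$ and determinant $1$, hence the characteristic polynomial $x^2+1$; since that polynomial has no real roots, each matrix is similar over $\mathbb R$ to its companion matrix, and therefore to the other.

```python
import numpy as np

A = np.array([[0., -1.],
              [1., 0.]])
B = np.array([[-3., -5.],
              [2., 3.]])

# Same trace (0) and determinant (1), hence the same characteristic
# polynomial x^2 + 1, whose roots are the non-real pair +/- i.
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))
```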
2 votes
1 answer
151 views

For which $M∈\text{Mat}(n×n, ℂ)$ is $M+𝛼M^*$ diagonalizable for arbitrary $𝛼∈ℂ \setminus \{0\}$? Alternatively just choose $𝛼∈ℝ \setminus \{0\}$ and analyse, when $M+𝛼M^\mathrm{T}$ is ...
Sinthoras
2 votes
2 answers
356 views

I'd like to show that, given a non-upper-triangular matrix $A$, $\exp(A)$ cannot be upper triangular either. Equivalently, I could show that the logarithm of an upper triangular matrix is always ...
CallieWallie
0 votes
3 answers
82 views

I've recently practiced some old linear algebra exams and came across this question. Given $\alpha \in \mathbb{R}$ and $$ A_\alpha=\left[\begin{array}{lll}0 & 0 & \alpha \\ 0 & 1 & 0 \\...
Johann Kleindopf
0 votes
1 answer
79 views

I am considering a certain Lie algebra, in particular a complex upper-triangular Lie algebra. Furthermore, I wish to find a nice way to write the exponential of an arbitrary element in this upper-...
CallieWallie
1 vote
1 answer
129 views

Note: After posting I realized that it may be difficult to distinguish between $\mathfrak{x}_{\bar{i}}$ and $\mathfrak{x}_{\tilde{i}}$ when viewed with a web browser. I suggest using ...
Steven Thomas Hatton
0 votes
1 answer
75 views

Suppose that I have two matrices $A$ and $B$ which are both symmetric: $A=A^T, B=B^T$. Moreover, I know how to diagonalize both $A$ and $B$. Now I would like to define $T=A^{1/2}BA^{1/2}$, which is ...
miggle
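For this snippet, assuming $A$ is also positive semidefinite (so that $A^{1/2}$ exists), a hedged NumPy sketch builds the symmetric square root from the eigendecomposition and checks that $T = A^{1/2} B A^{1/2}$ is again symmetric:

```python
import numpy as np

# Hypothetical example data: A symmetric positive semidefinite, B symmetric.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T
B = M + M.T

# Symmetric square root of A via its eigendecomposition A = V diag(w) V^T;
# clip guards against tiny negative eigenvalues from roundoff.
w, V = np.linalg.eigh(A)
A_half = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

T = A_half @ B @ A_half
assert np.allclose(A_half @ A_half, A)   # it really is a square root of A
assert np.allclose(T, T.T)               # T is symmetric, hence diagonalizable
```

Since $T$ inherits symmetry, it can itself be diagonalized with another call to `eigh`.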
1 vote
1 answer
70 views

I've read the standard proof of the self-adjoint spectral theorem, which uses induction, but I thought of another idea and am attempting to make it work. I'd like to know if I'm on to something or if ...
Zacharie Etienne
1 vote
0 answers
76 views

This question in the sample final for my linear algebra class has me, for the first time in this class, stumped (and the prof hasn't released a solution set): Suppose $M$ is a matrix of the form $\...
Ryder Mendelson
0 votes
2 answers
82 views

When a matrix is diagonalizable, it is possible to find left and right eigenvectors $u_i$ and $v_j$ such that $u_i^T v_j = \delta_{ij}$ (Kronecker delta), $u_i^T A = \lambda_i u_i^T$, and $A v_i = \...
Heraklit
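The biorthogonality in this snippet can be exhibited numerically: for a diagonalizable $A$, the rows of $V^{-1}$ are left eigenvectors already normalized against the columns of $V$. A sketch assuming NumPy (a generic random matrix has distinct eigenvalues with probability one, hence is diagonalizable, possibly over $\mathbb C$):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

w, V = np.linalg.eig(A)        # columns of V: right eigenvectors v_j
U = np.linalg.inv(V).T         # columns of U: left eigenvectors u_i

assert np.allclose(U.T @ V, np.eye(4))                # u_i^T v_j = delta_ij
for i in range(4):
    assert np.allclose(U[:, i] @ A, w[i] * U[:, i])   # u_i^T A = lambda_i u_i^T
    assert np.allclose(A @ V[:, i], w[i] * V[:, i])   # A v_i = lambda_i v_i
```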
0 votes
0 answers
33 views

"Only the zero matrix admits two coprime annihilating polynomials." Is this true or false? I think it's true, but the notebook I use claims it's false without elaborating. My approach is the following: ...
Z23_0
0 votes
0 answers
22 views

If you have a collection of n (nonzero and unique) eigenvectors, is there a way to find a general form of an n-by-n matrix that corresponds to them in such a way that 'rules out' alternative forms? ...
Sciencemaster
1 vote
0 answers
41 views

I'm studying the shape operator of Hopf hypersurfaces, and in this setting, all of its eigenvalues are supposed to be constant. I'm wondering whether there is a theorem that guarantees the following: ...
Hypatia
0 votes
0 answers
43 views

It's a bit hard to phrase this question precisely, so let me make an analogy with similarity transformations. From the passive point of view, two matrices connected by a similarity transformation can be viewed as ...
MakiseKurisu
1 vote
1 answer
64 views

We know that a symmetric matrix $P$ on an $n$-dimensional real linear space $V$ can always be diagonalized. The reason lies in the proposition that $V$ can be expressed as the sum of all ...
user1405622
1 vote
0 answers
79 views

The following discussions are all restricted to finite-dimensional linear spaces and eigenvalues are by default discussed in the field of complex numbers. We know that for a symmetric transformation $...
user1551346
0 votes
1 answer
139 views

This is an exercise given in an entrance exam to a highly regarded French engineering school: Let $A$ be a square matrix of size $n$ over a certain field (the real field or the complex field, for instance) ...
Serge Belhassen
6 votes
1 answer
222 views

The matrix $\begin{pmatrix}1&1\\0&1 \end{pmatrix}$ is "special". For example, it is perhaps the first example a student comes across of a non-diagonalisable matrix. Moreover, since $\...
Dylan
2 votes
3 answers
168 views

I am presented with a matrix $A$ \begin{equation*} A = \begin{bmatrix} -8 & -30 & 10 \\ 5 & 20 & -7 \\ 10 & 33 & -10 \end{bmatrix} \end{equation*} and have been asked to find ...
Ang Ming Wen
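A numerical check for this snippet, assuming NumPy: the characteristic polynomial of the given $A$ works out to $\lambda^3-2\lambda^2+\lambda-2=(\lambda-2)(\lambda^2+1)$, so the eigenvalues are $2$ and $\pm i$; in particular $A$ diagonalizes over $\mathbb C$ but not over $\mathbb R$.

```python
import numpy as np

A = np.array([[-8., -30., 10.],
              [5., 20., -7.],
              [10., 33., -10.]])

# Characteristic polynomial coefficients, highest degree first:
# lambda^3 - 2 lambda^2 + lambda - 2 = (lambda - 2)(lambda^2 + 1).
assert np.allclose(np.poly(A), [1., -2., 1., -2.])

# One real eigenvalue 2 and the non-real conjugate pair +/- i.
w = sorted(np.linalg.eigvals(A), key=lambda z: z.imag)
assert np.allclose(w, [-1j, 2.0, 1j])
```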
1 vote
0 answers
79 views

Consider a $C^1$ function $\Sigma:\mathbb{R}^d \to \mathbb{R}^{d\times d}$ such that $\Sigma(\theta)$ is positive semidefinite for all $\theta \in \mathbb{R}^d$. Is it possible to find matrix-valued ...
FT5
4 votes
1 answer
169 views

The Question: What does it mean for an element of $\operatorname{SL}_2(q)$ to be semisimple but not diagonalisable? That is, what does it look like in general? The Details: Recall that $\...
Shaun
3 votes
2 answers
107 views

Let $A,C$ be matrices such that $CA = I$. Then $AC$ is defined. I conjecture the following: $AC$ is orthogonally diagonalizable; its eigenvalues are either $1$ or $0$; it is Hermitian. Informally, ...
SRobertJames
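Part of this snippet's conjecture is immediate: $(AC)^2 = A(CA)C = AC$, so $AC$ is idempotent and its eigenvalues are $0$ or $1$. The Hermitian part can fail, though, since $AC$ is in general an oblique rather than orthogonal projection; a small NumPy counterexample:

```python
import numpy as np

# Hypothetical example: a one-sided inverse with CA = I_1.
A = np.array([[1.],
              [1.]])          # 2x1
C = np.array([[1., 0.]])      # 1x2
assert np.allclose(C @ A, np.eye(1))

P = A @ C                     # [[1, 0], [1, 0]]
assert np.allclose(P @ P, P)        # idempotent, since A(CA)C = AC
assert not np.allclose(P, P.T)      # but not symmetric/Hermitian
```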
0 votes
2 answers
144 views

I think this question is more of a mathematics question than a programming question; read below for further details. Goal: I have a system of somewhat big ($64 \times 64$) matrix equations of ...
clebbf
0 votes
0 answers
59 views

Let $A$ be a real matrix such that its characteristic polynomial has only one real eigenvalue $\lambda$; necessarily, then, it has two conjugate complex eigenvalues $\beta$ and $\bar{\beta}$. By the ...
Donnie Darko
3 votes
1 answer
91 views

I'm writing some notes on Linear Algebra and thought of the following question: what is the relation between the diagonalizability of a linear operator and that of its dual? Here are my definitions: A linear ...
Luiz Cordeiro
0 votes
0 answers
35 views

Let $V$ be an $N$-dimensional real inner product space with an orthonormal basis $\beta$. Let $G$ act on $\beta$ by permuting its elements and identify $\text{End}_G(V)$ as a subspace of $\text{Mat}_N(...
khashayar
1 vote
1 answer
87 views

I wonder if we can always parameterize the eigenvectors $V$ in an eigendecomposition of real symmetric matrices with Cayley's parameterization, under which any orthogonal matrix with −1 as an ...
nalzok
3 votes
1 answer
92 views

Let $\mathbb{V}$ be a finite-dimensional vector space over a field $\mathbb{k}$ and let $f,g:\mathbb{V}\to\mathbb{V}$ be endomorphisms such that $f\circ g=g\circ f$. Prove that if $f$ is diagonalizable ...
Aaron
0 votes
0 answers
33 views

Let $\mathbb{V}$ be a finite-dimensional inner product vector space over $\mathbb{C}$ and $f:\mathbb{V} \to \mathbb{V}$ an endomorphism such that $f+f^*=0$ with $f^*$ being the adjoint function of $f$....
Aaron
0 votes
0 answers
74 views

I was taught that any orthogonal matrix must have eigenvalues of magnitude 1, i.e. $\lambda_i=\pm1,\forall i$. However, I am given the following matrix $$ M = \frac{1}{2}\begin{pmatrix} -1 & -1 &...
Kalo
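The matrix in this snippet is truncated, but the general statement can be checked numerically: the eigenvalues of an orthogonal matrix have modulus $1$, yet they need not equal $\pm 1$, since non-real conjugate pairs $e^{\pm i\theta}$ occur (any rotation angle $\theta$). A NumPy sketch using a random orthogonal factor from a QR decomposition:

```python
import numpy as np

# The Q factor of a QR decomposition of a generic matrix is orthogonal.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
assert np.allclose(Q.T @ Q, np.eye(4))

# Every eigenvalue lies on the unit circle, but in general as a complex
# conjugate pair e^{+/- i theta}, not as +/- 1.
w = np.linalg.eigvals(Q)
assert np.allclose(np.abs(w), 1.0)
```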
1 vote
1 answer
94 views

A standard result in linear algebra is that a real, square matrix $A \in \mathbb{R}^{n \times n}$ can be diagonalised as follows, $$A = U \Lambda U^T = U \Lambda U^{-1},$$ where $\Lambda \in \...
microhaus
1 vote
0 answers
67 views

I am trying to find all eigenvectors and eigenvalues of a Hermitian antilinear operator because the problem appears in the computation of complex rational minimax approximations. Antilinear operators ...
Rasmus
2 votes
1 answer
98 views

Using a little trickery, one can prove the real spectral theorem from the real SVD: Can the spectral theorem from linear algebra be proved easily using the SVD? Can one do this for complex instead of ...
D.R.
0 votes
0 answers
56 views

If $H$ is a positive definite matrix, it is well known by Williamson's theorem that one can bring $H$ into the block diagonal form $S^T H S = \begin{pmatrix} D & 0 \\ 0 & D \end{pmatrix}$ ...
Wild_Axolott
4 votes
3 answers
185 views

In $\Bbb C$, how does one prove that $A=\left(\begin{array}{ccc}a^2&ac&c^2\\ 2ab&ad+bc&2cd\\ b^2&bd&d^2\end{array}\right) $ is diagonalizable if and only if $ \...
b-box
2 votes
1 answer
116 views

If $A,B \in \mathbb{R}^{4\times4}$ are diagonalisable matrices such that $\operatorname{rank}(A)+\operatorname{rank}(B)=5$, is it possible that $(AB)^2 = 0$? That is, can the matrix $AB$ be nilpotent? My work: I think it may be impossible. ...
FoolAlex
1 vote
1 answer
107 views

We have a $n\times n$ matrix $A$ , which is also block diagonalized into $$ \left(\begin{matrix} A_1 & & &\\ & A_2 & &\\ & & \ddots & \\ & & & A_s \...
Rodri
1 vote
1 answer
46 views

Given $ A_1, \dots, A_M \in \mathbb{R}^{n \times n}$, I am wondering whether there exist matrices $X, Y \in \mathbb{R}^{r \times n} $ (with $ r $ potentially larger than $ n $) such that $ X A_k Y^T $ ...
Yunfei
1 vote
2 answers
75 views

This is exercise 7.13 from Chapter 6 of Aluffi's Algebra: Chapter 0 textbook. It says that if $R$ is an integral domain, $A \in M_{n}(R)$ is diagonalizable with distinct eigenvalues and $AB = BA$ (...
user594756
2 votes
0 answers
95 views

I need help finding the source or an idea to solve the following problem: Suppose that $V$ is a finite-dimensional inner product space over $\mathbb{C}$, and $L:V\to V$ is an operator. Assume that the ...
Ángela Flores
0 votes
0 answers
38 views

I'm working on a linear algebra problem involving JNF and similarity transformations. Here’s the setup: Given a matrix M = \begin{pmatrix} 7 & 0 & 0 \\ 0 & 7 & 0 \\ -1 & 4 & 7 ...
iwueh
0 votes
1 answer
372 views

Is this question true: Prove or give a counterexample: If $T \in \mathcal{L}(V)$ and $T^2$ has an upper-triangular matrix with respect to some basis of $V,$ then $T$ has an upper-triangular matrix ...
Intuition
1 vote
0 answers
37 views

I've searched around and it seems that little can be said in general about the spectrum of Hadamard (elementwise) products. Thankfully, my problem has structure: Given two mutually-diagonalizable PSD ...
dkarkada
0 votes
0 answers
62 views

I am trying to understand how I can compute the eigenvalues of $A^{-1}B$ where $A$ and $B$ are diagonalizable matrices (in particular, they are symmetric tridiagonal matrices). I know that $A$ can be ...
Math Undergrad Student
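This snippet's question is the generalized symmetric eigenproblem $Bv = \lambda Av$. Assuming $A$ is positive definite, a Cholesky factorization $A = LL^T$ shows that $A^{-1}B$ is similar to the symmetric matrix $L^{-1}BL^{-T}$, so its eigenvalues are real; a NumPy sketch with hypothetical example data:

```python
import numpy as np

# Hypothetical example data: A symmetric positive definite, B symmetric.
rng = np.random.default_rng(4)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5.0 * np.eye(5)
B = rng.standard_normal((5, 5))
B = B + B.T

# With A = L L^T, A^{-1}B is similar to the symmetric S = L^{-1} B L^{-T},
# so its eigenvalues are real and a symmetric eigensolver applies.
L = np.linalg.cholesky(A)
S = np.linalg.solve(L, np.linalg.solve(L, B).T)   # = L^{-1} B L^{-T}
assert np.allclose(S, S.T)

w_direct = np.sort(np.linalg.eigvals(np.linalg.inv(A) @ B).real)
w_sym = np.sort(np.linalg.eigvalsh(S))
assert np.allclose(w_direct, w_sym)
```

With SciPy available, `scipy.linalg.eigh(B, A)` solves the same generalized problem directly.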
0 votes
1 answer
68 views

I was practising exercises on Jordan canonical form in Section 7.1 of Friedberg et al., Linear Algebra; in part (f) of exercise 7 it is asked to prove that, given a diagonalizable linear operator ...
Iacopo
3 votes
1 answer
120 views

For the following question I've gotten a matrix $P =\begin{pmatrix} 1 & 1 & 1\\ 1 & 0 & 1\\ 0 & 1 & 1 \end{pmatrix}$ that works for $P^{-1}AP=J$ but doesn't work for the ...
MathematikZauber1
