I am reading Spacetime and Geometry: An Introduction to General Relativity by Sean M. Carroll. This blog contains answers to his exercises, commentaries, questions and more.
Saturday 14 April 2018
Exercise 1.07 Tensors and Vectors
The question was:

To see my answer, Click Here. The formulae are too complicated to put into this simple editor.
One thing that Sean Carroll had omitted to tell us was that an index can be raised or lowered by applying the metric tensor. So for example,
$$X_{\mu\nu}=\eta_{\nu\sigma}X_{\mu}^{\ \ \sigma}$$
This is pretty vital for this exercise! The Wikipedia article is here.
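If it helps to see the index bookkeeping explicitly, here is a minimal Python sketch of lowering an index with the metric (the components of X below are placeholders, not the ones from the exercise; substitute Carroll's matrix):

```python
import numpy as np

# Minkowski metric with signature (-, +, +, +), as Carroll uses.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# Placeholder components for X^{mu nu}; substitute the matrix
# given in the exercise.
X_upup = np.arange(16.0).reshape(4, 4)

# Lower the first index:  X_mu^nu = eta_{mu sigma} X^{sigma nu}
X_downup = np.einsum('ms,sn->mn', eta, X_upup)

# Lower the second index: X^mu_nu = eta_{nu sigma} X^{mu sigma}
X_updown = np.einsum('ns,ms->mn', eta, X_upup)

# Because eta is diagonal, lowering an index just flips the sign of
# the time (0) components along that index.
print(X_downup)
print(X_updown)
```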
I think point (e) is a 4-vector: each lambda is a component.
No, no, no! Lambda is repeated, so it is summed over: ##X_{\ \ \lambda}^\lambda=X_{\ \ 0}^0+X_{\ \ 1}^1+X_{\ \ 2}^2+X_{\ \ 3}^3##. It is just the sum of the diagonal components of ##X_{\ \ \nu}^\mu## from question (a), and ##X_{\ \ \lambda}^\lambda## is a scalar.
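In matrix terms that contraction is just the trace of the mixed-index matrix. A quick sketch, again with placeholder components:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
X_upup = np.arange(16.0).reshape(4, 4)         # placeholder X^{mu nu}

# Mixed tensor X^mu_nu = eta_{nu sigma} X^{mu sigma}
X_updown = np.einsum('ns,ms->mn', eta, X_upup)

# X^lambda_lambda: the repeated index is summed over, leaving a scalar.
scalar = np.trace(X_updown)                    # sum of the diagonal
print(scalar)
```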
I don't get why in (f) we have ##V_\mu V^\mu=9## instead of 7. Do you introduce the metric implicitly?
Stefano said, "I don't get why in (f) we have ##V_\mu V^\mu=9## instead of ##7##. Do you introduce the metric implicitly?"
Sorry, I made a mistake: the answer is 7, not 9. The metric does come into it; the answer would be 9 with a Euclidean metric. Thanks, Stefano.
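As a concrete check (I am quoting the exercise's vector from memory, so treat ##V^\mu=(-1,2,0,-2)## as an assumption), a few lines of Python reproduce both numbers:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])      # Minkowski, signature (-, +, +, +)
V_up = np.array([-1.0, 2.0, 0.0, -2.0])   # V^mu as I recall it from the exercise

V_down = eta @ V_up                       # V_mu = eta_{mu nu} V^nu = (1, 2, 0, -2)
print(V_down @ V_up)                      # Minkowski: -1 + 4 + 0 + 4 = 7
print(V_up @ V_up)                        # Euclidean metric would give 1 + 4 + 0 + 4 = 9
```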
Why did you take ##V_\mu## as a row matrix and not a column matrix? Usually covariant vectors are represented by a column matrix.
I am referring to (g).
I spent a long time worrying about whether covariant vectors were column or row vectors, with contravariant vectors being the opposite. It is in fact completely irrelevant: both can be written as either, depending on the particular equation. The Einstein summation convention means that $$V_\mu X^{\mu\nu}=V_0X^{0\nu}+V_1X^{1\nu}+V_2X^{2\nu}+V_3X^{3\nu}$$ To make the matrix multiplication work, you have to write ##V_\mu## as a row matrix.
What happens if you want to calculate the components of ##V_\mu X^{\nu\mu}##? I have made a small addendum to the pdf in case you get stuck.
Thank you very much. So basically it doesn't really matter whether it is a row or a column matrix, as long as we take into account the index we are summing over: in the case of ##V_\mu X^{\mu\nu}##, μ labels the rows of the tensor matrix, so for the multiplication to work ##V_\mu## has to be a row vector on the left. The same logic applies to ##V_\mu X^{\nu\mu}##, but now μ labels the columns, so we take ##V_\mu## as a column vector on the right. Thanks once again. I got stuck trying to solve this exercise for a problem set, and it has helped me understand vectors better.
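To make the row-versus-column point concrete, here is a short sketch (placeholder components for X, and the same assumed V as above) showing that ##V_\mu X^{\mu\nu}## is a row vector times a matrix while ##V_\mu X^{\nu\mu}## is a matrix times a column vector:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
X = np.arange(16.0).reshape(4, 4)         # placeholder X^{mu nu}
V_up = np.array([-1.0, 2.0, 0.0, -2.0])   # assumed V^mu
V_down = eta @ V_up                       # V_mu

# V_mu X^{mu nu}: sum over the FIRST (row) index of X,
# so V_down acts as a row vector on the left.
print(V_down @ X)                         # np.einsum('m,mn->n', V_down, X)

# V_mu X^{nu mu}: sum over the SECOND (column) index of X,
# so V_down acts as a column vector on the right.
print(X @ V_down)                         # np.einsum('m,nm->n', V_down, X)
```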