Comments on Spacetime and Geometry: Exercise 1.07 Tensors and Vectors

Anonymous (2023-10-15 14:19):
Thank you very much. So basically it doesn't really matter whether it is a row or a column matrix, as long as we take into account which index we are summing over: in ##V_\mu X^{\mu\nu}##, ##\mu## labels the row of the tensor matrix, so for the multiplication to work ##V_\mu## has to be a row vector. The same logic applies to ##V_\mu X^{\nu\mu}##, but there ##\mu## labels the column of the tensor matrix, so we take ##V_\mu## as a column matrix.
Thanks once again. I got stuck trying to solve this exercise for a problem set, and this has helped me understand vectors better.

George (2023-10-15 13:46):
I spent a long time worrying about whether covariant vectors were column or row vectors, with contravariant vectors then being the opposite. It is in fact completely irrelevant: both can be written as either. It just depends on the particular equation. The Einstein summation convention means that
$$V_\mu X^{\mu\nu}=V_0X^{0\nu}+V_1X^{1\nu}+V_2X^{2\nu}+V_3X^{3\nu}$$
To make the matrix multiplication work you have to write ##V_\mu## as a row matrix.

What happens if you want to calculate the components of ##V_\mu X^{\nu\mu}##? I have made a small addendum to the pdf in case you get stuck.
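To make the two contractions concrete, here is a minimal numpy sketch; the components of ##X## and ##V## below are illustrative stand-ins, not the exercise's actual values.

```python
import numpy as np

# Illustrative stand-in components only, not the exercise's values.
X = np.arange(16.0).reshape(4, 4)    # X[mu][nu]: first index = row, second = column
V = np.array([1.0, 2.0, 3.0, 4.0])   # components V_mu

# V_mu X^{mu nu}: the summed index mu labels the ROWS of X,
# so V acts as a row vector multiplying from the left.
left = np.einsum('m,mn->n', V, X)
assert np.allclose(left, V @ X)

# V_mu X^{nu mu}: the summed index mu now labels the COLUMNS of X,
# so V acts as a column vector multiplying from the right.
right = np.einsum('m,nm->n', V, X)
assert np.allclose(right, X @ V)

print(left, right)
```

The einsum form shows that only the index placement matters; writing ##V_\mu## as a row or a column is just bookkeeping for expressing the same sum as a matrix product.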
Anonymous (2023-10-14 16:49):
I am referring to g).

Anonymous (2023-10-14 16:48):
Why did you take ##V_\mu## as a row matrix and not a column matrix? Usually covariant vectors are represented by a column matrix.

George (2023-01-15 12:01):
Stefano said "I don't get why in f we have ##V_{\mu} V^{\mu}=9## instead of ##7##. Do you introduce the metric implicitly?"

Sorry, I made a mistake and the answer is 7, not 9. The metric does come into it: the answer would be 9 with a Euclidean metric. Thanks Stefano.

Stefano (2023-01-15 01:49):
I don't get why in f we have ##V_{\mu} V^{\mu}=9## instead of 7. Do you introduce the metric implicitly?

George (2022-03-21 16:31):
No, no, no! Lambda is repeated, so it is summed over: ##X_{\ \ \lambda}^\lambda=X_{\ \ 0}^0+X_{\ \ 1}^1+X_{\ \ 2}^2+X_{\ \ 3}^3##, so it is just the sum of the diagonal components of ##X_{\ \ \nu}^\mu##, which was question (a). ##X_{\ \ \lambda}^\lambda## is a scalar.

Anonymous (2022-03-20 20:19):
I think point e is a 4-vector: each lambda is a component.
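Two points in the exchanges above lend themselves to a quick numerical check: the role of the metric in ##V_\mu V^\mu## and the repeated ##\lambda## in ##X_{\ \ \lambda}^\lambda##. In this sketch the components ##V^\mu=(-1,2,0,-2)## are an assumption inferred from the exercise (they reproduce both the 7 quoted above and the 9 a Euclidean metric would give), and ##X## is again a stand-in.

```python
import numpy as np

# V^mu = (-1, 2, 0, -2) is assumed from the exercise: it reproduces both the 7
# quoted above (Minkowski metric) and the 9 a Euclidean metric would give.
V = np.array([-1.0, 2.0, 0.0, -2.0])

eta = np.diag([-1.0, 1.0, 1.0, 1.0])           # Minkowski metric, signature (-,+,+,+)
print(np.einsum('mn,m,n->', eta, V, V))        # V_mu V^mu = eta_{mu nu} V^mu V^nu -> 7.0
print(np.einsum('mn,m,n->', np.eye(4), V, V))  # same contraction, Euclidean metric -> 9.0

# X^lambda_lambda contracts the repeated index: it is the trace of the mixed
# tensor's matrix, a single scalar, not a 4-vector.
X_mixed = np.arange(16.0).reshape(4, 4)        # stand-in for X^mu_nu
print(np.trace(X_mixed))                       # X^0_0 + X^1_1 + X^2_2 + X^3_3
```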