What are the necessary and sufficient conditions for a two-form to be an exterior product of two one-forms?


























Consider a two-form $\gamma \in \Lambda^2(V)$, where $V$ is a real vector space. I would like to know the necessary and sufficient conditions for $\gamma$ to be expressible as an exterior product of two one-forms, $\gamma = \alpha \wedge \beta$, with $\alpha, \beta \in \Lambda^1(V)$.



Obviously, a necessary condition for the decomposition to hold is that every exterior power of $\gamma$ with at least two factors vanishes, $\gamma \wedge \dots \wedge \gamma = 0$. But it is not clear to me whether this condition is sufficient.










exterior-algebra

asked Dec 17 at 11:02 by Void


















  • I recommend that you search for "Plucker quadratic relations". – Jason Starr, Dec 17 at 11:09
















2 Answers






While the Plücker relations give the general theory, a direct answer to your question is the following: a 2-form $\gamma$ is decomposable (is a product of two 1-forms) iff $(\iota_v \gamma) \wedge \gamma = 0$ for every vector $v$, where $\iota_v \gamma$ is the usual contraction of a vector with a 2-form, that is, $(\iota_v \gamma)(u) = \gamma(v,u)$ for any other vector $u$.



The proof is straightforward. Pick some vector $v$ such that $\alpha = \iota_v \gamma \ne 0$. If no such $v$ exists, then $\gamma = 0$, which is the trivial case. Then $\alpha \wedge \gamma = 0$ (by assumption) implies* that there exists a one-form $\beta$ such that $\gamma = \alpha \wedge \beta$, and you are done.



* If the implication is not obvious, extend $e_1 = \alpha$ to a basis $(e_i)$ of $\Lambda^1(V)$ and expand $\gamma$ in the induced basis $e_i \wedge e_j$. Clearly, $e_1 \wedge \gamma = 0$ iff the only non-zero coefficients in the expansion are those in front of the terms $e_1 \wedge e_j$.
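
For concreteness, here is a small numerical sketch of this criterion (my own illustration, not part of the original answer): represent $\gamma$ by its skew-symmetric coefficient matrix $C$, so that $\gamma(u,w) = u^T C w$, and test $(\iota_v \gamma) \wedge \gamma = 0$ with $v$ running over a basis, which suffices by linearity. The function names and tolerance are invented for this sketch.

    import numpy as np
    from itertools import combinations

    def contraction_criterion(C, tol=1e-10):
        # gamma = sum_{i<j} C[i,j] e^i ^ e^j, with C skew-symmetric.
        # Returns True iff (iota_v gamma) ^ gamma = 0 for every basis vector v,
        # which by linearity means for every vector v.
        n = C.shape[0]
        for m in range(n):                       # take v = e_m
            a = C[m, :]                          # (iota_v gamma)_j = gamma(e_m, e_j) = C[m, j]
            for i, j, k in combinations(range(n), 3):
                # coefficient of e^i ^ e^j ^ e^k in (iota_v gamma) ^ gamma
                if abs(a[i]*C[j, k] - a[j]*C[i, k] + a[k]*C[i, j]) > tol:
                    return False
        return True

    def skew(n, entries):
        # build a skew-symmetric n x n matrix from {(i, j): c_ij} with i < j
        C = np.zeros((n, n))
        for (i, j), c in entries.items():
            C[i, j], C[j, i] = c, -c
        return C

    print(contraction_criterion(skew(4, {(0, 1): 1.0})))                # True:  e^1 ^ e^2
    print(contraction_criterion(skew(4, {(0, 1): 1.0, (2, 3): 1.0})))   # False: e^1 ^ e^2 + e^3 ^ e^4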






answered Dec 17 at 11:39 by Igor Khavkine

Let $e_1, \dots, e_n$ be a basis of $V$. Let $\gamma = \sum c_{ij}\, e_i \otimes e_j$, so $C = (c_{ij})$ is a skew-symmetric matrix. Then $\gamma$ factors as $\alpha \wedge \beta$ if and only if $C$ has rank $\leq 2$.



Proof: Let $C$ be a skew-symmetric matrix; we must show that $C$ can be written as $a b^T - b a^T$ for vectors $a$ and $b$ if and only if $\mathrm{rank}(C) \leq 2$. The condition is clearly necessary, since $\mathrm{rank}(a b^T) = \mathrm{rank}(b a^T) \leq 1$.



Choose a $2$-dimensional subspace $\mathrm{Span}(v_1, v_2)$ containing the image of $C$. Since $C^T = -C$, this space also contains the image of $C^T$. Replacing $C$ by $S C S^T$, we may assume that $v_1$ and $v_2$ are the first two basis vectors.



Then the conditions on the images of $C$ and $C^T$ say that $C$ is $0$ outside the intersection of the first two rows and the first two columns, and skew-symmetry further says that the upper left $2 \times 2$ block is of the form $\left( \begin{smallmatrix} 0 & c \\ -c & 0 \end{smallmatrix} \right)$. For such matrices, the claim is obvious. $\square$



The OP asks whether it is enough to ask that $\gamma \wedge \gamma$ vanish (well, the OP asks about all higher wedge powers vanishing as well, but that clearly follows from $\gamma \wedge \gamma = 0$). The answer is yes. Any $2$-form can be written as $u_1 \wedge v_1 + u_2 \wedge v_2 + \cdots + u_r \wedge v_r$, where $2r$ is the rank of the matrix $C$ and $u_1, \dots, u_r, v_1, \dots, v_r$ are linearly independent; this is more or less the classification of skew-symmetric bilinear forms. Then
$$\gamma^{\wedge k} = k! \sum_{1 \leq i_1 < i_2 < \cdots < i_k \leq r} u_{i_1} \wedge v_{i_1} \wedge u_{i_2} \wedge v_{i_2} \wedge \cdots \wedge u_{i_k} \wedge v_{i_k}.$$
This is clearly nonzero if and only if $k \leq r$. In particular, $r \leq 1$ if and only if $\gamma \wedge \gamma = 0$.



The condition that $\gamma \wedge \gamma = 0$, expanded in coordinates, states that the $4 \times 4$ principal Pfaffians of $C$ vanish. It is a nice high-school algebra challenge to show that this implies the $3 \times 3$ minors of $C$ vanish, which is the more standard algebraic test for a matrix to have rank $\leq 2$.
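
As a quick sanity check of these two equivalent tests (my own sketch, not part of the original answer), one can compare the vanishing of the $4 \times 4$ principal Pfaffians of $C$ with the rank bound directly; the function names and tolerance below are invented for the illustration.

    import numpy as np
    from itertools import combinations

    def pfaffian_condition(C, tol=1e-10):
        # gamma ^ gamma = 0 iff every 4x4 principal Pfaffian of C vanishes:
        #   C[i,j]*C[k,l] - C[i,k]*C[j,l] + C[i,l]*C[j,k] = 0 for all i<j<k<l.
        n = C.shape[0]
        return all(abs(C[i, j]*C[k, l] - C[i, k]*C[j, l] + C[i, l]*C[j, k]) < tol
                   for i, j, k, l in combinations(range(n), 4))

    def rank_condition(C, tol=1e-10):
        # the criterion stated in the answer: gamma factors iff rank(C) <= 2
        return np.linalg.matrix_rank(C, tol=tol) <= 2

    # e^1 ^ e^2 + e^3 ^ e^4: not decomposable, and both tests agree.
    C = np.array([[ 0.,  1.,  0.,  0.],
                  [-1.,  0.,  0.,  0.],
                  [ 0.,  0.,  0.,  1.],
                  [ 0.,  0., -1.,  0.]])
    print(pfaffian_condition(C), rank_condition(C))   # False False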






answered Dec 17 at 17:24 (edited Dec 17 at 17:53) by David E Speyer