What problems might you encounter if you use a block size much greater than AES's?

What problems might you encounter if you use a block size much greater than AES's (1000 bits, say)?

  • 1




    Are there any such cryptographic algorithms? If you had a secure one: 1) small messages would become very big; 2) the key size would be big; 3) big, big, big.
    – kelalaka
    Dec 2 at 21:57






  • 2




    Theoretically, the MD6 compression function (described in Chapters 2.5 and 9 of the document “The MD6 hash function: A proposal to NIST for SHA-3”) is defined for any block size, assuming the block size is a multiple of the word size. Since 1000 is divisible by 8, it would be possible to define this function for 1000-bit blocks. There are two serious problems, though: 1) the number of rounds required to ensure security is very large; 2) the algorithm relies on heuristics to choose optimal parameters for each block/word size. The second problem is what really annoys me.
    – lyrically wicked
    Dec 4 at 6:48

aes

asked Dec 2 at 21:49

user63954

2 Answers

It may be very tricky to get a good distribution / diffusion of the input bits in each round. That means you would either need a very intricate design for each round, or a lot of rounds to make up for it.





Furthermore, it also means that you need to perform an operation on at least 1000 bits for any kind of calculation. This is not a desirable property in most circumstances. For some modes of operation, such as CBC, the minimum ciphertext size would be one full 1000-bit block, and the IV would take up another 1000 bits. For counter (CTR) mode your message will not grow, as only part of the key-stream output needs to be used. However, that still means performing 1000-bit calculations for possibly much smaller messages.
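To make the overhead concrete, here is a minimal sketch comparing CBC ciphertext sizes for a 128-bit and a hypothetical 1000-bit block cipher. The padding model (always pad up to the next full block, adding a whole block if the message is already aligned) is an illustrative assumption, not a specific standard:

```python
def cbc_ciphertext_bits(msg_bits: int, block_bits: int) -> int:
    # Pad the plaintext up to the next full block (a full extra block
    # is added when the message is already block-aligned), then
    # prepend one block-sized IV.
    padded = (msg_bits // block_bits + 1) * block_bits
    return block_bits + padded

# A 320-bit (40-byte) message:
assert cbc_ciphertext_bits(320, 128) == 512    # 128-bit IV + 384 bits padded
assert cbc_ciphertext_bits(320, 1000) == 2000  # 1000-bit IV + 1000 bits padded
```

The same 40-byte message costs almost 4x more on the wire with the 1000-bit block, entirely due to IV and padding overhead.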



Most modes of operation function quite well for block sizes of 128 bits, although 256 bits would be beneficial for modes such as CTR (because the IV is split between a nonce and the counter).
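As an illustration of the nonce/counter split, one common layout for a 128-bit CTR counter block is a 96-bit nonce followed by a 32-bit block counter; the exact split varies by protocol, so this layout is an assumption for the sketch:

```python
NONCE_BITS = 96    # per-message nonce (assumed layout)
COUNTER_BITS = 32  # per-block counter (assumed layout)

def counter_block(nonce: int, block_index: int) -> bytes:
    # Build the 128-bit block-cipher input for key-stream block `block_index`.
    assert 0 <= nonce < 2**NONCE_BITS and 0 <= block_index < 2**COUNTER_BITS
    return ((nonce << COUNTER_BITS) | block_index).to_bytes(16, "big")

# With a 32-bit counter, a single nonce covers at most 2**32 blocks;
# a 256-bit block would leave far more room for both fields.
assert counter_block(1, 2) == b"\x00" * 11 + b"\x01\x00\x00\x00\x02"
```

With 128-bit blocks the two fields compete for space, which is why a 256-bit block would be convenient for CTR-like modes.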





Although a large block size is desirable for use in the design of hash functions, the avalanche effect does require good diffusion of the bits. Threefish, the block cipher used for Skein (one of the SHA-3 finalists), does support a block size of 1024 bits, so ~1000 bits is certainly not unheard of (quite often the internal hash state is at least double the intermediate / output size). Threefish-1024 however requires 80 (!) rounds, so Skein certainly wasn't the fastest algorithm out there.





A relatively fast large-block cipher may also be beneficial for some format-preserving encryption, by the way. In principle a single-block encryption of a unique value is secure, so having a large block size may help avoid repetition of blocks. You could build a cipher with a larger block size from a block cipher with a small block size, but those kinds of constructs are generally not all that efficient.
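As a toy illustration of building a wide block from smaller primitives, here is a balanced Feistel network over a 1008-bit block, in the Luby-Rackoff style that designs such as BEAR/LION refine. HMAC-SHA256 is used as a stand-in round function; real wide-block constructions (e.g. CMC or EME built from AES) are considerably more careful, and this sketch is NOT a secure cipher:

```python
import hashlib
import hmac

BLOCK_BYTES = 126  # 1008-bit block, close to the 1000 bits in the question

def _round(key: bytes, rnd: int, data: bytes, out_len: int) -> bytes:
    # Stand-in round function: HMAC-SHA256 output expanded to out_len bytes.
    out, ctr = b"", 0
    while len(out) < out_len:
        out += hmac.new(key, bytes([rnd, ctr]) + data, hashlib.sha256).digest()
        ctr += 1
    return out[:out_len]

def encrypt(key: bytes, block: bytes, rounds: int = 4) -> bytes:
    # Each round: (L, R) -> (R, L xor F(R)), mixing one half into the other.
    assert len(block) == BLOCK_BYTES
    half = BLOCK_BYTES // 2
    left, right = block[:half], block[half:]
    for rnd in range(rounds):
        left, right = right, bytes(
            a ^ b for a, b in zip(left, _round(key, rnd, right, half))
        )
    return left + right

def decrypt(key: bytes, block: bytes, rounds: int = 4) -> bytes:
    # Run the rounds in reverse to undo the swaps and XORs.
    half = BLOCK_BYTES // 2
    left, right = block[:half], block[half:]
    for rnd in reversed(range(rounds)):
        left, right = bytes(
            a ^ b for a, b in zip(right, _round(key, rnd, left, half))
        ), left
    return left + right

key = b"\x01" * 16
message = bytes(range(BLOCK_BYTES))
assert decrypt(key, encrypt(key, message)) == message
```

Note how every round already costs a full pass over half the block; that per-block cost, multiplied by the round count needed for diffusion, is exactly why such constructs tend to be slow.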

answered Dec 2 at 22:23 by Maarten Bodewes (edited Dec 2 at 22:30)

  • Happy to upvote any other answers, especially from those who have actually designed a block cipher... Please do not accept yet.
    – Maarten Bodewes
    Dec 3 at 9:07










  • It is very interesting how the Keccak algorithm manages to deal with 1600-bit blocks using only 24 (or even twelve!) rounds...
    – lyrically wicked
    Dec 4 at 6:49










  • Yeah, but if you ever try to implement it, Keccak's f is not exactly a peach. Especially if you're a nut like me and try to do it "from spec".
    – Maarten Bodewes
    Dec 4 at 14:49

This problem has been studied under the heading of wide-block ciphers; examples are the Mercy block cipher, BEAR and LION, and Simpira.
My answer focuses on the Simpira case.
Simpira has been developed recently to work on 64-bit processors (a v2 is also available); it uses a block size of 128 × b bits, where b is a positive integer.

Simpira is a cryptographic permutation that combines AES with a Feistel structure.
In Simpira, the number of rounds required to achieve full diffusion increases with b (12, 15, and 21 rounds for b = 1, 2, 3). This means that more rounds are needed to achieve full diffusion over a wider block, so performance degrades.



For further information: Simpira


answered Dec 4 at 9:51 by hardyrama (edited Dec 4 at 11:04)