Post History
Answer
#7: Post edited
It has 2 parameters for weird historical reasons that nobody seems to remember any longer. Like most functions in the C standard library, the function API was not well designed. Keep in mind that many of these functions were designed in the 1960s(!) and early 1970s, some 20 years before good programming practices were invented. By the time C got ISO standardized, the committee chose to pick up the existing functions at a whim rather than encouraging better functions to be designed.
I just checked K&R 1st edition and it has two functions, `alloc` and `calloc`, with a corresponding `cfree` for `calloc` specifically. `calloc` has 2 parameters even there, and K&R 1st edition recommends a cast of the result (I believe `void*` had not yet been invented).
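As a side note, here is a minimal sketch (not the K&R code itself, which predates standard C) contrasting that old cast-the-result style with the modern idiom, where `malloc`/`calloc` return `void*` and no cast is needed; the variable names are purely illustrative:

```c
#include <stdlib.h>

int main(void)
{
    /* Pre-ANSI style: allocation functions returned char*, so a cast was
       required, and old books show it. */
    int *old_style = (int *)calloc(100, sizeof(int));

    /* Modern C: calloc returns void*, which converts implicitly to any
       object pointer type, so no cast is needed. */
    int *new_style = calloc(100, sizeof *new_style);

    free(old_style);
    free(new_style);
    return 0;
}
```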
The C99 rationale document points out that `calloc(0, sizeof(OBJ));` is apparently in widespread use, supposedly as a way to distinguish between "nothing allocated" and "zero bytes allocated". The Committee did not recommend it - this is implementation-defined and non-portable (and soon to be explicitly undefined behavior in C23).
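A minimal sketch of why that idiom is non-portable: whether a zero-size request yields a null pointer or a unique non-null pointer is implementation-defined, so a program cannot portably read any meaning into the result.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Implementation-defined: may be NULL, or a unique non-null pointer
       that must not be dereferenced but must still be freed. */
    int *p = calloc(0, sizeof *p);

    if (p == NULL)
        puts("this implementation returns NULL for zero-size requests");
    else
        puts("this implementation returns a non-null pointer for zero-size requests");

    free(p); /* free(NULL) is a no-op, so this is safe either way */
    return 0;
}
```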
That `man` page is confused and has some technical problems:
- It is not possible for the unsigned `size_t` arguments of `malloc`/`calloc` to _overflow_, although _wrap-around_ is possible. These two terms have very different meanings in C: overflow means signed overflow, which is undefined behavior, whereas wrap-around means an unsigned integer going past its maximum value and starting over at zero, which is well-defined and portable.
- Similarly, due to "the usual arithmetic conversions", `nmemb * size` cannot overflow, given that at least one operand is `size_t` or another large unsigned type; it can only wrap around (see the sketch after this list). In case neither operand is a large unsigned integer type, that's a caller-side arithmetic bug completely unrelated to the `malloc` function - like the common beginner mistake of typing `int` all over the place.
- `malloc` can at most allocate 2<sup>CHAR_BIT * sizeof(size_t)</sup> - 1 bytes, likely 2<sup>32</sup> - 1 or 2<sup>64</sup> - 1 on mainstream systems. `calloc` opens up the possibility to request even more than that through its dysfunctional 2-parameter API.
- So the only "increased" error checking `calloc` can do internally is to ensure that no over-allocation caused by `calloc`'s own bad API is taking place.
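To illustrate the last three points, here is a minimal sketch, with illustrative names (`nmemb`, `size`, `a`, `b`, `c`) and concrete values that assume a typical `size_t` without padding bits. It shows the silent wrap-around when the caller multiplies the factors themselves, the one extra check `calloc` can do because it receives the factors separately, and the equivalent caller-side guard:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t nmemb = SIZE_MAX / 2 + 2; /* deliberately huge element count */
    size_t size  = 2;                /* nmemb * size exceeds SIZE_MAX   */

    /* The size_t multiplication wraps around (here to 2), so malloc is
       asked for a tiny buffer and may well succeed - silently wrong. */
    void *a = malloc(nmemb * size);
    printf("malloc(nmemb * size) asked for %zu bytes: %p\n", nmemb * size, a);

    /* calloc receives nmemb and size separately, so it can detect that
       the product does not fit in size_t and is expected to return NULL
       instead of over-allocating. */
    void *b = calloc(nmemb, size);
    printf("calloc(nmemb, size): %p\n", b);

    /* The same guard, written by the caller before using malloc: */
    void *c = NULL;
    if (size != 0 && nmemb > SIZE_MAX / size)
        puts("caller-side check: nmemb * size would wrap, not calling malloc");
    else
        c = malloc(nmemb * size);

    free(a);
    free(b);
    free(c);
    return 0;
}
```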