Question:

Why does this program, which takes a number in decimal notation, converts it to binary notation, and prints the number in the correct sequence (from the most significant to the least significant bit), use malloc twice?
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
int main(int argc, char *argv[]) {
    // Stack
    int decimal, q, r;
    int counter, i;
    char *binary = NULL;
    char *aux;
    printf("Digite um número em base decimal: ");
    scanf("%d", &decimal);
    counter = 1;
    while (decimal >= 2) {
        q = decimal / 2;
        r = decimal - (q * 2);
        // Heap
        aux = (char *) malloc(counter * sizeof(char));
        if (binary != NULL) {
            memcpy(aux, binary, counter - 1);
            free(binary);
        }
        binary = aux;
        if (r == 0) {
            binary[counter - 1] = 48; //'0';
        } else {
            binary[counter - 1] = 49; //'1';
        }
        //printf("resto %d = %d\n", counter, r);
        counter++;
        decimal = q;
    }
    //printf("ultimo quociente = %d\n", q);
    // Heap
    aux = (char *) malloc(counter * sizeof(char));
    if (binary != NULL) {
        memcpy(aux, binary, counter - 1);
        free(binary);
    }
    binary = aux;
    if (decimal == 0) {
        binary[counter - 1] = 48; //'0';
    } else {
        binary[counter - 1] = 49; //'1';
    }
    printf("Resultado em binário = ");
    for (i = counter - 1; i >= 0; i--) {
        printf("%c", binary[i]);
    }
    printf("\n");
    free(binary);
    return 0;
}
Answer:

If the problem is just avoiding having to redo the computation of one of the remainders after the loop ends, it is easy to solve. The real issue is that the loop stops as soon as the value drops below 2, when the correct thing is to keep dividing until the value drops below 1; that is what forces the extra work after the loop. So a simple change in the while condition solves it.
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
int main(int argc, char *argv[]) {
    // Stack
    int decimal, q, r;
    int counter, i;
    char *binary = NULL;
    char *aux;
    printf("Digite um número em base decimal: ");
    scanf("%d", &decimal);
    counter = 1;
    while (decimal >= 1) { // <=============================== I changed this from 2 to 1
        q = decimal / 2;
        r = decimal - (q * 2);
        aux = (char *) malloc(counter * sizeof(char));
        if (binary != NULL) {
            memcpy(aux, binary, counter - 1);
            free(binary);
        }
        binary = aux;
        if (r == 0) {
            binary[counter - 1] = 48; //'0';
        } else {
            binary[counter - 1] = 49; //'1';
        }
        counter++;
        decimal = q;
    }
    counter--; // the loop's final counter++ leaves it one past the number of bits written
    printf("Resultado em binário = ");
    for (i = counter - 1; i >= 0; i--) {
        printf("%c", binary[i]);
    }
    printf("\n");
    free(binary);
    return 0;
}
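With that change, a run should look like this (entering 10, for example):

Digite um número em base decimal: 10
Resultado em binário = 1010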
See it running on ideone.
I did not analyze other aspects.
Answer:

Why does this program […] use malloc twice?
For numbers with N bits, the program's while loop runs N - 1 times. The loop does not run at all when decimal is 0 or 1.
while (decimal >= 2) { /* ... */ }
Therefore, the allocations made inside the loop are not enough to store the entire binary representation: when the loop exits, the most significant bit has not yet been stored. So, at the end of the loop, the program makes one more allocation for that last bit. For example, for decimal = 5 (101 in binary, N = 3 bits), the loop runs twice and stores the two low bits; the second malloc grows the buffer to 3 bytes and receives the leading 1.
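As a side illustration (my own sketch, not part of either answer): the malloc + memcpy + free sequence that grows the buffer by one byte per bit is exactly the job realloc was made for, so the same grow-as-you-go approach can keep a single allocation site:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int decimal;
    int counter = 0;
    char *binary = NULL;
    printf("Digite um número em base decimal: ");
    scanf("%d", &decimal);
    do {
        // grow the buffer by one byte for each bit produced
        char *aux = realloc(binary, counter + 1);
        if (aux == NULL) {
            free(binary);
            return 1;
        }
        binary = aux;
        binary[counter++] = (decimal % 2) + '0'; // least significant bit first
        decimal /= 2;
    } while (decimal > 0);
    printf("Resultado em binário = ");
    while (counter--) {
        putchar(binary[counter]); // print back to front: most significant first
    }
    printf("\n");
    free(binary);
    return 0;
}

As a bonus, the do-while form also handles input 0, printing a single 0.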
The exercise they gave you is an interesting one: understanding someone else's code and eventually modifying it. The code in the statement has the virtue of introducing several concepts (kinds of memory, malloc, memcpy, etc.) while still leaving visibly "improvable" things in it.
Beyond answering your question, if it were me (keeping the spirit of the code), on top of the improvements above I would:

(1) do away with dynamic memory:

char binary[32];

or at least allocate everything at once (required size = 1 + log2(decimal)):

binary = malloc((int)(1 + log2(decimal)));
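(Checking the size formula with a concrete value: for decimal = 1000, log2(1000) ≈ 9.97, so the call allocates (int)(1 + 9.97) = 10 bytes, and 1000 in binary is 1111101000, exactly 10 bits. This assumes decimal > 0, since log2(0) is undefined.)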
(2) replace

if (r == 0) {
    binary[counter-1] = 48; //'0';
} else {
    binary[counter-1] = 49; //'1';
}

with

binary[counter-1] = r + '0';

(Since '0' is 48 in ASCII, r + '0' yields '0' when r is 0 and '1' when r is 1.)
In summary …
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main() {
    int decimal, q, r, counter;
    char *binary;
    printf("Digite um número em base decimal: ");
    scanf("%d", &decimal);
    // single allocation, sized by the number of bits (assumes decimal > 0)
    binary = malloc((int)(1 + log2(decimal)));
    counter = 0; // from @pmg
    while (decimal > 0) { // from @bigown
        q = decimal / 2;
        r = decimal % 2;
        binary[counter] = r + '0';
        counter++;
        decimal = q;
    }
    printf("Resultado em binário = ");
    while (counter--) { putchar(binary[counter]); }
    printf("\n");
    free(binary);
    return 0;
}
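A caveat worth noting (not covered above): this version assumes decimal > 0. For input 0 the loop body never runs, so no digit is printed, and log2(0) is undefined; a small guard, for example if (decimal == 0) { putchar('0'); }, would restore the original program's behavior for that case.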