Need Help Converting Binary to Decimal in C

Solace_

In a nutshell I have a school project that wants me to convert an entered binary number to a decimal.

 

Program should go something like this:

User enters binary number -> binary number is converted one digit at a time -> decimal number is displayed.

 

I am very new to C and programming in general. If you have any tips or snippets of code you could share that would be great.

 

Thanks very much!

33 minutes ago, Solace_ said:

In a nutshell I have a school project that wants me to convert an entered binary number to a decimal.

Do you know how to read binary by hand?

ENCRYPTION IS NOT A CRIME

4 minutes ago, straight_stewie said:

Do you know how to read binary by hand?

yes I do

10 minutes ago, Solace_ said:

yes I do

Great! This is actually a pretty simple thing to do.

Here's the algorithm:

  1. Get the user's input
  2. Initialize an integer to 0
  3. Put the user's input into a char array
  4. Reverse the char array
  5. Iterate through the char array from smallest index to largest, doing steps 6 and 7
  6. If the current char in the array is the character '1', add 2 to the power of the current index to the integer (integer += 2^index)
  7. If the current char in the array is the character '0', do nothing
  8. Output the integer.

Can you post your code when you're done?
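If you get stuck, here's a rough sketch of those steps in C. The fgets input, the 64-character buffer, the unsigned long long result, and the lack of input checking are just my own shortcuts here, so treat it as a starting point rather than a finished answer:

#include <stdio.h>
#include <string.h>

int main(void)
{
	char input[64];                        //Steps 1 and 3: read the user's input into a char array.
	printf("Enter binary number: \n");
	if (fgets(input, sizeof input, stdin) == NULL)
		return 1;
	input[strcspn(input, "\n")] = '\0';    //Strip the trailing newline, if any.

	int len = strlen(input);

	//Step 4: reverse the char array in place.
	for (int lo = 0, hi = len - 1; lo < hi; lo++, hi--)
	{
		char tmp = input[lo];
		input[lo] = input[hi];
		input[hi] = tmp;
	}

	//Step 2: initialize an integer to 0.
	unsigned long long value = 0;

	//Steps 5-7: walk from the smallest index to the largest; a '1' adds 2^index, a '0' adds nothing.
	for (int i = 0; i < len; i++)
	{
		if (input[i] == '1')
			value += 1ull << i;
	}

	//Step 8: output the integer.
	printf("%llu\n", value);
	return 0;
}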

ENCRYPTION IS NOT A CRIME

8 minutes ago, straight_stewie said:

Great! This is actually a pretty simple thing to do.

Here's the algorithm:

  1. Get the user's input
  2. Initialize an integer to 0
  3. Put the user's input into a char array
  4. Reverse the char array
  5. Iterate through the char array from smallest index to largest, doing steps 6 and 7
  6. If the current char in the array is the character '1', add 2 to the power of the current index to the integer (integer += 2^index)
  7. If the current char in the array is the character '0', do nothing
  8. Output the integer.

Can you post your code when you're done?

thanks very much, I will certainly try and remember to.

50 minutes ago, Solace_ said:

In a nutshell I have a school project that wants me to convert an entered binary number to a decimal.

 

Program should go something like this:

User enters binary number -> binary number is converted one digit at a time -> decimal number is displayed.

 

I am very new to C and programming in general. If you have any tips or snippets of code you could share that would be great.

 

Thanks very much!

#include <math.h>   // for floor and log10
#include <stdlib.h> // for abs

int getDecimal (int binary){
	if (binary == 0){
		return 0;
	}
	int decimal = 0; // keep track of the total
	int j = (int)floor(log10(abs(binary))) + 1; // how many digits are in the binary number
	
	for (int i = 0; i < j; i++){ // starting with the rightmost digit, add up the value for each position.
	
			int digit = binary % 10; // grab the rightmost remaining digit (0 or 1).
			decimal += digit << i;   // adds the decimal value of that binary position to the end result.
			
			binary /= 10; // drop the digit we just handled so the next pass sees the next one.
		
	}
	return decimal;
}

I didn't test this, but it should work I think.
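If you want to give it a quick check, a minimal driver (my own addition, compiled together with the getDecimal definition above and linked with -lm for log10) would be something like:

#include <stdio.h>

int getDecimal(int binary); //The function from the snippet above.

int main(void)
{
	printf("%d\n", getDecimal(1011)); //Should print 11.
	return 0;
}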

CPU: Ryzen 5950X Ram: Corsair Vengeance 32GB DDR4 3600 CL14 | Graphics: GIGABYTE GAMING OC RTX 3090 |  Mobo: GIGABYTE B550 AORUS MASTER | Storage: SEAGATE FIRECUDA 520 2TB PSU: Be Quiet! Dark Power Pro 12 - 1500W | Monitor: Acer Predator XB271HU & LG C1

 

The way I like to do it is to read one digit at a time from right to left, compute 2^pos * digit for each digit, and then add all of them up.
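Something like this, maybe (just a sketch; the function name and the hard-coded example are mine):

#include <stdio.h>
#include <string.h>

unsigned int binaryToDecimal(const char *bits)
{
	unsigned int total = 0;
	size_t len = strlen(bits);

	//Read one digit at a time from right to left and add 2^pos * digit.
	for (size_t pos = 0; pos < len; pos++)
	{
		int digit = bits[len - 1 - pos] - '0';
		total += digit * (1u << pos);
	}
	return total;
}

int main(void)
{
	printf("%u\n", binaryToDecimal("1011")); //Prints 11.
	return 0;
}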

Implementation of @straight_stewie 's algorithm (sort of)

#include <stdio.h>
#include <string.h>

int main(void)
{
	//Buffer size: one char per bit, plus room for the '\n' and '\0' that fgets records.
	const int  MAXBITS = (sizeof(unsigned int) * 8) + 2;	

	//Get binary number from user into InBuffer.
	char InBuffer[MAXBITS];		
	printf("Enter binary number: \n");
	fgets(InBuffer, MAXBITS, stdin);
	int Length = strlen(InBuffer);
	if (Length > 0 && InBuffer[Length - 1] == '\n')
		Length--; //Drop the trailing '\n' that fgets records.

	//Loop through each character in the input string.
	unsigned int Result = 0;
	for (int i = 0; i < Length; i++)
	{
		//Check for invalid characters.
		if (InBuffer[i] > '1' || InBuffer[i] < '0')
		{
			printf("Invalid binary number entered!\n");
			return 1;
		}

		//Multiply each digit by its bit value and add it to the result.
		Result += (InBuffer[i] - '0') * (1 << (Length - i - 1));
	}
	
	printf("Result: %d\n", Result);

	return 0;
}
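The 1 << (Length - i - 1) part is what lets this skip the "reverse the array" step: shifting by each character's distance from the last digit gives it its place value directly, so the string can be read left to right as-is. For an input of 1011 it prints Result: 11.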

 
