Convert char to int in C and C++

@Alf P. Steinbach: The original question was vague regarding which language. With the tags c and c++, I think answers addressing both languages are reasonable.

From my extensive experience on other technical forums, my intuition is that the OP really means "how do I take the textual representation of a number (in base 10) and convert it to the corresponding number?" C and C++ neophytes usually have incredibly fuzzy ideas about how text works in those languages and what char really means.

@KarlKnechtel: If that's true (I give it about 50/50, as lots of early tutorials also encourage getting ASCII values out of chars, even though ASCII doesn't cover the full range), the OP needs to clarify, but that's a dupe of stackoverflow.com/questions/439573/….

The OP had three hours to clarify this question and failed to do so. As it is, there’s no way to know what is actually asked. Voted to close.

14 Answers

Depends on what you want to do:

to read the value as an ASCII code, you can write

char a = 'a';
int ia = (int)a; /* note that the int cast is not necessary -- int ia = a; would suffice */

to convert the character '0' -> 0, '1' -> 1, etc., you can write

char a = '4';
int ia = a - '0'; /* check here that ia is between 0 and 9 */

Explanation:
a - '0' is equivalent to ((int)a) - ((int)'0'), which means the ASCII values of the characters are subtracted from each other. Since 0 comes directly before 1 in the ASCII table (and so on up to 9), the difference between the two gives the number that the character a represents.
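The comment in the snippet above suggests a range check. A minimal sketch of that validation (the code here is illustrative, not from the original answer):

#include <iostream>

int main()
{
    char a = '4';
    int ia = a - '0';
    if (ia < 0 || ia > 9) { // a was not one of '0'..'9'
        std::cerr << "not a digit\n";
        return 1;
    }
    std::cout << ia << '\n'; // prints 4
    return 0;
}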


@KshitijBanerjee That's not a good idea for two reasons: it gives you a negative number for ASCII characters before '0' (like & -> -10), and it gives you numbers larger than 10 (like x -> 26).

@kevin001 If you want to convert the char to int and the character '1' gives an ASCII value that is not 1, you need to subtract the offset '0' to realign it to count from 0-9. The digits 0-9 are consecutive in the ASCII integer encoding.

@foo-bah But I didn't understand why we have to subtract the character '0'. If we just typecast the character into an integer and store it, why would it throw an error?

Well, in ASCII, the digit characters start at code 48. All you need to do is:

int x = character - 48;

Or, since the character ‘0’ has the ASCII code of 48, you can just write:

int x = character - '0'; // The (int) cast is not necessary. 

In arithmetic expressions, C and C++ always promote types to at least int. Furthermore, character literals are of type int in C and of type char in C++.
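A quick way to observe that difference is to print the size of a character literal. This sketch is valid in both languages (the exact value of sizeof(int) is implementation-defined, commonly 4):

#include <stdio.h>

int main(void)
{
    /* Compiled as C this typically prints 4 (sizeof(int)), because 'a'
       is an int in C; compiled as C++ it prints 1, because 'a' is a
       char in C++. */
    printf("%zu\n", sizeof('a'));
    return 0;
}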

You can convert a char simply by assigning it to an int.

char c = 'a'; /* note: in C the literal 'a' is an int, so this assignment narrows */
int a = c;

-1 The answer is incorrect for the only meaningful interpretation of the question. This (the code int a = c;) will keep any negative values, which the C standard library functions can't deal with. The C standard library functions set the standard for what it means to handle char values as int.

@Matt: I’m keeping the downvote. I’d strengthen it if possible! The question interpretation you and others have assumed is not meaningful, because it’s too utterly trivial, and because for the OP’s particular combination of types there is a not-so-trivial very important practical issue. The advice you give is directly dangerous to the novice. It will most likely result in Undefined Behavior for their programs that use C standard library character classification functions. Re ref. to @Sayam’s answer, he has deleted that answer.
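To make the hazard described here concrete, the usual fix is to cast to unsigned char before calling the character classification functions. A sketch (the sample byte is illustrative):

#include <cctype>
#include <iostream>

int main()
{
    char c = '\xE9'; // e.g. Latin-1 'e with acute'; negative if char is signed
    // Passing a negative value other than EOF to isalpha() is undefined
    // behavior, so convert through unsigned char first:
    if (std::isalpha(static_cast<unsigned char>(c))) {
        std::cout << "alphabetic in this locale\n";
    }
    return 0;
}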

What do you mean by "always promote"? Values are promoted during implicit conversions, certain types of parameter passing (e.g., to a varargs function), and when an operator must make its operands compatible types. But there are certainly times when a value is not promoted (like if I pass a char to a function expecting a char), otherwise we wouldn't have any types smaller than an int.

char is just a 1-byte integer. There is nothing magic about the char type! Just as you can assign a short to an int, or an int to a long, you can assign a char to an int.

Yes, the name of the primitive data type happens to be "char", which insinuates that it should only contain characters. But in reality, "char" is just a poor name choice that confuses everyone who tries to learn the language. A better mental model is int8_t, and if your compiler supports <stdint.h> (standardized since C99) you can often use that name instead, though strictly speaking int8_t corresponds to signed char, and plain char's signedness is implementation-defined.

Though of course you should use the char type when doing string handling, because the index of the classic ASCII table fits in 1 byte. You could however do string handling with regular ints as well, although there is no practical reason in the real world why you would ever want to do that. For example, code along the following lines works perfectly:
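This sketch stores the character codes of a string in plain ints (the particular word is illustrative):

#include <stdio.h>

int main(void)
{
    /* a "string" stored as ints instead of chars */
    int str[] = { 'h', 'e', 'l', 'l', 'o', '\0' };
    for (int i = 0; str[i] != '\0'; i++) {
        putchar(str[i]); /* putchar takes an int anyway */
    }
    putchar('\n'); /* prints hello */
    return 0;
}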

You have to realize that characters and strings are just numbers, like everything else in the computer. When you write 'a' in the source code, the compiler translates it into the number 97, which is an integer constant.

So if you write an expression like

char ch = '5';
ch = ch - '0';

this is actually equivalent to

char ch = (int)53;
ch = ch - (int)48;

which then goes through the C language integer promotions

ch = (int)ch - (int)48;

and is then truncated to a char to fit the result type

ch = (char)((int)ch - (int)48);

There are a lot of subtle things like this going on between the lines, where char is implicitly treated as an int.


How to convert a single char into an int [duplicate]

I have a string of digits, e.g. «123456789», and I need to extract each one of them to use them in a calculation. I can of course access each char by index, but how do I convert it into an int? I’ve looked into atoi(), but it takes a string as argument. Hence I must convert each char into a string and then call atoi on it. Is there a better way?

The string is not really a number, but individual digits. To be exact, a social security number. I want to run a calculation validating the SSN.

11 Answers

You can utilize the fact that the character encodings for digits are all in order from 48 (for ‘0’) to 57 (for ‘9’). This holds true for ASCII, UTF-x and practically all other encodings (see comments below for more on this).

Therefore the integer value for any digit is the digit minus ‘0’ (or 48).

char c = '1';
int i = c - '0'; // i is now equal to 1, not '1'

char c = '1';
int i = c - 48; // i is now equal to 1, not '1'

However, I find the first, c - '0', far more readable.

Is there any encoding in which ‘9’-‘0’ != 9 ? I’m not even sure if such an encoding would be allowed per ISO C++.

On encodings and the order of digits, I asked this question stackoverflow.com/questions/782373/…. The short answer is "any encoding based on ASCII or EBCDIC, yes" (which means 99.9% of encodings we'll meet in everyday life and on the web). Also, interestingly, the C/C++ standards seem to state that they only support encodings where the digits are ordered.

Is there any encoding where '0' < '1' < '2' < '3' does not hold? It would be a very, very strange decision at the least.

The C++ standard guarantees that '0' through '9' occur adjacently and in the right order in the character set. So c - '0' works on all systems, whereas c - 48 wouldn't work on EBCDIC, for example.

Note that C11 §5.2.1 Character sets ¶3 says: In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous. The C++ standard will have a similar rule.
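Because both standards guarantee contiguous digits, the relation can even be verified at compile time. A one-line sketch in C++ (C11 offers _Static_assert for the same check):

#include <cstdio>

static_assert('9' - '0' == 9, "decimal digits must be contiguous");

int main()
{
    std::puts("digit ordering verified at compile time");
    return 0;
}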

Or you could use the "correct" method, similar to your original atoi approach, but with std::stringstream instead. That should work with chars as input as well as strings. (boost::lexical_cast is another option for a more convenient syntax.)

(atoi is an old C function, and it’s generally recommended to use the more flexible and typesafe C++ equivalents where possible. std::stringstream covers conversion to and from strings)
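A minimal sketch of the stringstream approach (the variable names are illustrative):

#include <iostream>
#include <sstream>

int main()
{
    char c = '7';
    std::stringstream ss;
    ss << c;     // insert the char; the stream now holds "7"
    int value = 0;
    ss >> value; // parse it back out as an int
    std::cout << value << '\n'; // prints 7
    return 0;
}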

You can make use of the atoi() function, for example along these lines (note that atoi() wants a NUL-terminated string, not a bare char, so the char is wrapped first):

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char* argv[])
{
    char digit = '7';
    char buf[2] = { digit, '\0' }; /* wrap the char in a string */
    printf("%d\n", atoi(buf));     /* prints 7 */
    return 0;
}

The answers provided are great as long as you only want to handle Arabic numerals, and are working in an encoding where those numerals are sequential, and in the same place as ASCII.

This is almost always the case.

If it isn’t then you need a proper library to help you.

  1. First convert the byte-string to a Unicode string. (Left as an exercise for the reader.)
  2. Then use uchar.h to look at each character.
  3. Check whether the character is a digit with UBool u_isdigit(UChar32 c).
  4. If it is, get its value with int32_t u_charDigitValue(UChar32 c).

Or maybe ICU has some function to do it for you — I haven’t looked at it in detail.
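Assuming ICU4C is installed (link with -licuuc), the two calls from the list above fit together roughly like this:

#include <unicode/uchar.h>
#include <stdio.h>

int main(void)
{
    UChar32 c = 0x0667; /* ARABIC-INDIC DIGIT SEVEN */
    if (u_isdigit(c)) {
        /* u_charDigitValue() returns the decimal value, or -1 */
        printf("digit value: %d\n", (int)u_charDigitValue(c)); /* prints 7 */
    }
    return 0;
}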


How To Convert Char to Int in C++


Are you looking to learn how to convert a char to an int in C++? This article will show you the different ways to convert a char to an int.

In C++, char values are represented by ASCII [1] codes. Hence if you know the ASCII code of the digit currently stored in a char, you can easily convert it into an integer.

Convert Char to Int in C++ Using ASCII Values

Let us see in the example code below how you can convert a number stored in a char to an integer value.

#include <iostream>
using namespace std;

int main()
{
    char example = '9';
    // Typecasting the char to int.
    // Note that this converts the ASCII
    // value of the char to an int.
    int convertedNumber = int(example);
    cout << convertedNumber << endl; // prints 57
    return 0;
}

As you can see in the above code, '9' was stored in a char, and the typecast reads the char's ASCII value and yields it as the integer (57 for '9'). Hence this method only works when the ASCII value itself is what you want.

Convert Char to Int Using Subtraction

You can use this method when you want the digit's numeric value rather than its ASCII code, and in particular for numbers such as 20 that cannot be represented in a single char. To convert the string "20" to an integer, we apply the subtraction method to each digit's char to get its value, and then combine those values to get the actual number.

Let us see in the example code below the usage of subtraction to get the integer from a char in C++.

#include <iostream>
using namespace std;

int main()
{
    char example = '8';
    // Using subtraction to get the numeric value:
    // we subtract the ASCII value of '0'
    // from the ASCII value of example.
    int convertedNumber = example - '0';
    // Printing the converted number
    cout << convertedNumber << endl; // prints 8
    return 0;
}

As per the above code, example - '0' is equivalent to ((int)example) - ((int)'0'), which means the ASCII values of the characters are subtracted from each other. Since 0 comes directly before 1 in the ASCII table (and so on up to 9), the difference between the two gives the number that the character example represents.


Convert Char* To Int in C++

You can convert a char* to an int in C++ when the char* represents a number. If the char* variable does not represent a number, the method below will not work.

Let us check in the example code below how you can change a char* to an int in C++.

// CPP program to convert char* to int
#include <iostream>
#include <sstream>
using namespace std;

int main()
{
    char* numberChar = "12"; // deprecated conversion from a string literal; see below
    int convertedNum;
    stringstream newStream(numberChar);
    // Converting char* to int using the stream
    newStream >> convertedNum;
    cout << convertedNum << endl; // prints 12
    return 0;
}

As you can see above, I was able to convert the char* to an int using the stringstream library. But this method is not 100% clean, and you will receive a warning in C++ that a string literal should not be converted to char*.

But if you still want to convert the char* to an int, ignoring the warning, then you can use the above code to solve the problem.
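As an aside not covered in this article: if you can use std::string, then std::stoi avoids the warning altogether (a sketch, just an alternative option):

#include <iostream>
#include <string>

int main()
{
    std::string numberStr = "12";
    // std::stoi throws std::invalid_argument if the string holds no number
    int convertedNum = std::stoi(numberStr);
    std::cout << convertedNum << '\n'; // prints 12
    return 0;
}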

Wrap Up

I hope you got the answer to how to convert a char to an int in C++. I have discussed several methods above that you can use directly to do the conversion.

Let me know in the comment section if you have found any method that is better than the ones discussed above; I will be happy to add it here.

If you liked the above tutorial then please follow us on Facebook and Twitter. Let us know the questions and answers you want us to cover in this blog.
