### rust

#### How to convert char to integer so that '1' becomes 1?

I am trying to find the sum of all the digits of a given number; for example, 134 gives 8.

My plan is to convert the number into a string using `.to_string()`, iterate over it with `.chars()`, convert every `char` into an integer, and add it to an accumulator, then read off the final value.

I tried the code below to convert a `char` into an integer (Playground):

```rust
fn main() {
    let x = "123";
    for y in x.chars() {
        let z = y.parse::<i32>().unwrap();
        println!("{}", z + 1);
    }
}
```

But it results in this error:

```
error: no method named `parse` found for type `char` in the current scope
 --> <anon>:4:19
  |
4 |         let z = y.parse::<i32>().unwrap();
  |                   ^^^^^
```

How can I convert a `char` into an integer?

The code below does exactly what I want (Playground):

```rust
fn main() {
    let mut sum = 0;
    let x = 123;
    let x = x.to_string();
    for y in x.chars() {
        // convert `y` to a string and then to an integer
        let z = (y.to_string()).parse::<i32>().unwrap();
        // increment `sum` by `z`
        sum += z;
    }
    println!("{}", sum);
}
```

but I first have to convert the `char` into a string and only then into an integer. Is there a way to convert a `char` into an integer directly?
The method you need is `char::to_digit`. It converts a `char` to the number it represents in the given radix.

You can also use `Iterator::sum` to calculate the sum of a sequence conveniently:

```rust
fn main() {
    let x = "134";
    // to_digit(10) interprets each char as a base-10 digit
    let sum: u32 = x.chars().map(|c| c.to_digit(10).unwrap()).sum();
    println!("{}", sum);
}
```
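A small sketch of how `to_digit` behaves, including the `None` case for characters that are not digits in the chosen radix:

```rust
fn main() {
    // to_digit takes a radix (2..=36) and parses the char in that base
    assert_eq!('7'.to_digit(10), Some(7));
    assert_eq!('f'.to_digit(16), Some(15));
    // characters outside the radix yield None instead of panicking
    assert_eq!('x'.to_digit(10), None);
    assert_eq!('9'.to_digit(8), None);
}
```

Because the invalid case is an `Option`, you can decide explicitly whether to `unwrap`, skip, or report bad input.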
```rust
my_char as u32 - '0' as u32
```

It works because the ASCII (and thus UTF-8) encodings have the Arabic numerals 0-9 ordered in ascending order. You can get the scalar values and subtract them.

However, what should it do for values outside this range? What happens if you provide `'p'`? It returns 64. What about `'.'`? The subtraction underflows, which panics in a debug build (and wraps around in a release build). And `'♥'` returns 9781.

Strings are not just bags of bytes. They are UTF-8 encoded, and you cannot just ignore that fact. Every `char` can hold any Unicode scalar value.

That's why strings are the wrong abstraction for this problem.

From an efficiency perspective, allocating a string is also wasteful. Rosetta Code has an example of an iterator that only performs numeric operations:

```rust
struct DigitIter(usize, usize);

impl Iterator for DigitIter {
    type Item = usize;

    fn next(&mut self) -> Option<Self::Item> {
        if self.0 == 0 {
            None
        } else {
            let ret = self.0 % self.1;
            self.0 /= self.1;
            Some(ret)
        }
    }
}

fn main() {
    println!("{}", DigitIter(1234, 10).sum::<usize>());
}
```
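The same allocation-free idea can also be written as a plain loop; this is a sketch (the function name `digit_sum` is mine), assuming a non-negative input:

```rust
// Peel off the least-significant digit with % 10, then drop it with / 10.
fn digit_sum(mut n: usize) -> usize {
    let mut sum = 0;
    while n > 0 {
        sum += n % 10;
        n /= 10;
    }
    sum
}

fn main() {
    assert_eq!(digit_sum(1234), 10);
    assert_eq!(digit_sum(134), 8); // the example from the question
}
```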
The method `parse()` is defined on `str`, not on `char`. A `char` is a Unicode code point, which is 32 bits wide, so if you cast it to an integer, `u32` is preferred over `i32`.

You can cast it via `as` or `into()`:

```rust
let a = '♥' as u32;
let b: u32 = '♥'.into();
```
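A quick sketch of the cast, and of the reverse direction, which is fallible because not every `u32` is a valid Unicode scalar value:

```rust
fn main() {
    // Widening a char to u32 always succeeds
    let a = '♥' as u32;
    let b: u32 = '♥'.into();
    assert_eq!(a, 9829);
    assert_eq!(a, b);

    // Going back is checked: the standard library returns an Option
    assert_eq!(char::from_u32(9829), Some('♥'));
    // Surrogate code points are not valid chars
    assert_eq!(char::from_u32(0xD800), None);
}
```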
Another way is to iterate over the characters of your string and convert and add them using `fold`:

```rust
fn sum_of_string(s: &str) -> u32 {
    s.chars().fold(0, |acc, c| c.to_digit(10).unwrap_or(0) + acc)
}

fn main() {
    let x = "123";
    println!("{}", sum_of_string(x));
}
```
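Note that `unwrap_or(0)` silently treats non-digit characters as zero. If skipping them is the intent, `filter_map` states that more directly; a sketch (the name `sum_of_digits` is mine):

```rust
// filter_map drops the None results from to_digit, then sum adds the rest
fn sum_of_digits(s: &str) -> u32 {
    s.chars().filter_map(|c| c.to_digit(10)).sum()
}

fn main() {
    assert_eq!(sum_of_digits("123"), 6);
    assert_eq!(sum_of_digits("1a2"), 3); // 'a' is skipped, not treated as 0
}
```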
