**Problem 80:**

It is well known that if the square root of a natural number is not an integer, then it is irrational. The decimal expansion of such square roots is infinite without any repeating pattern at all.

The square root of two is 1.41421356237309504880…, and the digital sum of the first one hundred decimal digits is 475.

For the first one hundred natural numbers, find the total of the digital sums of the first one hundred decimal digits for all the irrational square roots.

**Idea:**

Since I need a lot of digits of precision, I need to use the sister class of BigInteger: BigDecimal. I saw that BigDecimal had no square root method, so I was going to use its pow() method, but that only accepts integer exponents (so I can't raise a decimal to the 1/2 power, i.e. take a square root). So, after a little research, I had to write a square root method myself.

With that completed, I just needed to check whether the resulting square root was an integer or a decimal. From there, I add the first 100 digits together for each of the non-integer square roots.

```java
int answer = 0;
MathContext precision = new MathContext(105);
for (int i = 1; i <= 100; i++) {
    BigDecimal temp = EulerUtils.sqrt(BigDecimal.valueOf(i), precision);
    String number = temp.toString();
    if (number.indexOf(".") == -1) {
        continue; // perfect square: skip integer roots
    }
    // Keep the first 101 characters (the leading digit, the decimal point,
    // and 99 decimals), drop the point, and sum the resulting 100 digits.
    number = number.substring(0, 101);
    number = number.replace(".", "");
    for (char c : number.toCharArray()) {
        answer += Integer.parseInt("" + c);
    }
}
```

Note: sqrt() is a method I wrote to calculate the square root of a BigDecimal up to a given precision.
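For reference, here is a minimal sketch of what such a square root method might look like, using Newton's iteration x' = (x + n/x) / 2. The class name, the initial guess via Math.sqrt, and the iteration cap are my assumptions; the actual EulerUtils.sqrt may differ.

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class SqrtSketch {
    // Newton's iteration for sqrt(n) at the precision given by mc.
    // Sketch only: the real EulerUtils.sqrt may be implemented differently.
    public static BigDecimal sqrt(BigDecimal n, MathContext mc) {
        BigDecimal two = BigDecimal.valueOf(2);
        // Seed with the double approximation to speed up convergence (assumption).
        BigDecimal x = new BigDecimal(Math.sqrt(n.doubleValue()), mc);
        BigDecimal prev;
        // Iterate until the value stabilizes at this precision,
        // with a cap to guarantee termination.
        for (int i = 0; i < 200; i++) {
            prev = x;
            x = x.add(n.divide(x, mc)).divide(two, mc);
            if (x.compareTo(prev) == 0) {
                break;
            }
        }
        return x;
    }

    public static void main(String[] args) {
        MathContext mc = new MathContext(105);
        // First digits should match 1.41421356237309504880...
        System.out.println(sqrt(BigDecimal.valueOf(2), mc));
    }
}
```

Each iteration roughly doubles the number of correct digits, so even at 105 digits of precision the loop finishes in a handful of steps.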