And, strangely enough, the value that 1 - 1/2 + 1/3 - 1/4 + 1/5 - ... converges to when evaluated left to right is the natural log of 2. It doesn't converge very quickly, though: the error after n terms is roughly 1/(2n), so it takes on the order of a million terms to reach just six decimal places of precision.
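You can watch this slow convergence numerically. A minimal sketch in Python (the helper name `alt_harmonic` is just for illustration): it sums the series left to right and compares against math.log(2).

```python
import math

def alt_harmonic(n):
    # Partial sum of 1 - 1/2 + 1/3 - 1/4 + ... with n terms,
    # evaluated left to right.
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    err = abs(alt_harmonic(n) - math.log(2))
    print(n, err)  # the error shrinks only like ~1/(2n)
```

Each tenfold increase in the number of terms buys you roughly one more decimal place, which is why so many terms are needed.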
But start from the geometric series 1/(1+x) = 1 - x + x^2 - x^3 + ..., valid for |x| < 1. If you integrate the infinite sum term by term, you'll get:
\int_0^x (1 - t + t^2 - t^3 + ...) \, dt = x - x^2/2 + x^3/3 - ...
On the other hand, if you integrate 1/(1+t) from 0 to x, you'll get precisely log(1+x). Now, one would like to argue that:
log(1+x) = x - x^2/2 + x^3/3 - ...
and here this is actually true, but in general it isn't: the integral of the sum of an infinite series need not equal the sum of the integrals of its terms. What you need here is the notion of uniform convergence, and fortunately 1 - x + x^2 - ... converges uniformly on every interval [-r, r] with r < 1, which is enough to integrate term by term inside (-1, 1).
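For any |x| < 1 you can check the integrated series against the closed form directly. A small Python sketch (the helper name `mercator` is mine, after the usual name for this series): partial sums of x - x^2/2 + x^3/3 - ... compared with math.log(1 + x).

```python
import math

def mercator(x, n):
    # Partial sum of x - x^2/2 + x^3/3 - ... with n terms.
    return sum((-1) ** (k + 1) * x ** k / k for k in range(1, n + 1))

x = 0.5
# For |x| < 1 the terms shrink geometrically, so convergence is fast:
print(mercator(x, 30), math.log(1 + x))
```

At x = 0.5 thirty terms already agree with log(1.5) to better than nine decimal places; the trouble only starts at the endpoint x = 1.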
But, since Leonhard Euler wouldn't really be pedantic about stuff like this as long as the result is correct, neither should the occasional math hacker be. Thus, we put x = 1 on both sides of the equality and get 1 - 1/2 + 1/3 - ... = log(2).
Apart from integrating term by term, there's one more problem with the above reasoning, and there are bonus points for people who notice it (hint: uggc://ra.jvxvcrqvn.bet/jvxv/Nory'f_gurberz )
The natural logarithm of 2 is described so succinctly only because that number (and related ones) turned out to be so common and useful that we introduced a shorthand for the limiting process that tends to it. I think you'd find that, were the definition expanded completely, its description would not be so succinct after all. :)