Actually, from what is shown, this is not a violation of the principle of least surprise; the result is simply wrong:
the sequence f is defined by f[0] = 1 and f[n] = n for n > 0, so summing over n = 0, …, m gives
Σ f[n] = ½ m(m+1) + 1
It appears that the symbolic summation ignores the special definition f[0] = 1
and only uses the general definition f[n_] = n. It therefore gives a wrong result in the symbolic case,
and the right one for a numeric upper limit, where (I suppose) f[n] is evaluated explicitly for each n.
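A minimal sketch of the behavior described, using the definitions from above (the symbolic summand f[n] only matches the general rule f[n_], since n is not the literal 0):

```mathematica
f[0] = 1;    (* special case *)
f[n_] := n;  (* general case *)

Sum[f[n], {n, 0, m}]   (* symbolic limit: only f[n_] matches, giving m (1 + m)/2 *)
Sum[f[n], {n, 0, 10}]  (* numeric limit: f[0] = 1 is picked up, giving 55 + 1 = 56 *)
```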
Nevertheless, Mathematica gets it wrong.