When the matrix is real and symmetric, ARPACK does resort to Lanczos, or at least the implicitly restarted version thereof. Straight from the homepage:
> When the matrix A is symmetric it reduces to a variant of the Lanczos process called the Implicitly Restarted Lanczos Method (IRLM).
I always wondered why the ARPACK creators didn't bother to do the same for complex Hermitian problems. Even in that case, the Lanczos / Arnoldi orthogonalization automatically yields a reduced matrix which is real and tridiagonal (instead of upper Hessenberg, as in the generic case), so I would think there is some benefit.
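This is easy to check numerically. The sketch below (plain Lanczos with full reorthogonalization, written in numpy for illustration; the specific matrix and sizes are arbitrary) projects a complex Hermitian matrix onto a Krylov basis and confirms that the projected matrix comes out real and tridiagonal up to roundoff:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 10
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2  # make A Hermitian

# Plain Lanczos: orthonormal Krylov basis Q, recurrence coefficients alpha, beta
Q = np.zeros((n, k + 1), dtype=complex)
alpha = np.zeros(k)
beta = np.zeros(k)
q0 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
Q[:, 0] = q0 / np.linalg.norm(q0)
for j in range(k):
    w = A @ Q[:, j]
    # Diagonal entry <q_j, A q_j> is real for Hermitian A (imaginary part is roundoff)
    alpha[j] = np.real(np.vdot(Q[:, j], w))
    w -= alpha[j] * Q[:, j]
    if j > 0:
        w -= beta[j - 1] * Q[:, j - 1]
    # Full reorthogonalization against all previous basis vectors, for stability
    w -= Q[:, : j + 1] @ (Q[:, : j + 1].conj().T @ w)
    beta[j] = np.linalg.norm(w)
    Q[:, j + 1] = w / beta[j]

# T = Q* A Q is real symmetric tridiagonal even though A is complex
T = Q[:, :k].conj().T @ A @ Q[:, :k]
print(np.max(np.abs(T.imag)))                           # ~ machine epsilon
print(np.max(np.abs(T - np.triu(np.tril(T, 1), -1))))   # off-tridiagonal part ~ 0
```

So the three-term recurrence, and the real tridiagonal reduced matrix, survive intact in the complex Hermitian case; only the basis vectors themselves are complex.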
On a different note: with a Krylov method that only multiplies with A, you typically cannot find the smallest eigenvalues of a matrix, if smallest is supposed to mean closest to zero in absolute value (smallest magnitude). Such a method only finds extremal eigenvalues, i.e. those at the outer edges of the spectrum. For the smallest-magnitude eigenvalues, in the generic case, one has to build the Krylov subspace using the inverse of A instead (shift-invert).
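To illustrate with scipy's ARPACK wrapper (same library underneath; the diagonal test matrix is just a made-up example with a known spectrum): passing `sigma=0` makes ARPACK iterate with (A - 0·I)⁻¹, so the eigenvalues nearest zero become the extremal ones of the transformed operator and converge quickly.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Hermitian (here diagonal) matrix with known eigenvalues on both sides of zero
vals = np.array([-5.0, -3.0, -0.5, 0.2, 1.0, 4.0, 7.0, 9.0])
A = diags(vals).tocsc()

# Shift-invert about 0: Krylov subspace is built with (A - 0*I)^{-1},
# so the smallest-magnitude eigenvalues of A are found reliably
w, _ = eigsh(A, k=2, sigma=0)
print(np.sort(w))  # → [-0.5  0.2]
```

The price is a linear solve with A at every iteration, which is why one avoids shift-invert unless interior (or smallest-magnitude) eigenvalues are really what is needed.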
Since your matrix is Hermitian, you know the spectrum will be real, but it might still have many positive and negative eigenvalues, so the same comments apply if you are looking for the eigenvalues with smallest magnitude. If by smallest you mean most negative, then there is no problem (you should use :SR in eigs). If you know more, e.g. that your matrix is not only Hermitian but also positive definite, then all eigenvalues are positive and smallest magnitude also means smallest real part.
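Both situations can be sketched with scipy's `eigsh` (its `which='SA'`, smallest algebraic, plays the role of Julia's `:SR`; the diagonal matrices are made-up examples):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Indefinite Hermitian case: "smallest" = most negative is an extremal
# eigenvalue, so plain iteration with which='SA' finds it without shift-invert
A = diags(np.array([-4.0, -1.0, 0.5, 2.0, 6.0])).tocsc()
w_neg, _ = eigsh(A, k=1, which='SA')
print(w_neg)  # → [-4.]

# Positive definite case: smallest algebraic and smallest magnitude coincide,
# so which='SA' and shift-invert about zero return the same eigenvalue
B = diags(np.array([0.5, 2.0, 3.0, 6.0, 9.0])).tocsc()
w_sa, _ = eigsh(B, k=1, which='SA')
w_sm, _ = eigsh(B, k=1, sigma=0)
print(w_sa, w_sm)  # both ≈ [0.5]
```

So the extra structure (definiteness) is exactly what lets you avoid the inverse: the eigenvalue you want is already at an edge of the spectrum.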