1. You are correct that in general there is a variance associated with each feature and a covariance between each pair of features. However, the spherical Gaussian distribution assumes the following:
a. All covariances between different features are zero
b. The variances of all features are equal; this single common value is the sigma estimated by sge().
2. The naive Bayes classifier is designed in two stages: first use sge() to estimate the parameters for each of the two classes, and then design a Bayesian classifier from those estimates (this part is simpler than the Bayesian classifier for the general multivariate normal case because of the spherical Gaussian assumptions above).
Think of this in two steps:
1. Estimate the parameters for classes 1 and 2. This gives you the class-conditional probability distributions for the two classes.
2. Design the naive Bayes classifier from first principles using the probability distributions estimated in step 1.
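To make the two steps concrete, here is a minimal NumPy sketch of what they could look like. Note that the function and variable names (spherical_estimate, classify, X1, X2) are illustrative placeholders and not the actual sge() interface from the assignment; the sketch only assumes each class's training data is an array of shape (N, d).

```python
import numpy as np

def spherical_estimate(X):
    """Estimate the mean vector and the single shared variance sigma^2
    of a spherical Gaussian from data X of shape (N, d)."""
    mu = X.mean(axis=0)
    # One variance for all features: average squared deviation over
    # every sample and every feature.
    sigma2 = np.mean((X - mu) ** 2)
    return mu, sigma2

def classify(x, params, priors):
    """Assign x to the class with the largest log-posterior.
    params is a list of (mu, sigma2) pairs, priors a list of P(class)."""
    scores = []
    for (mu, sigma2), prior in zip(params, priors):
        d = mu.shape[0]
        # Log of the spherical Gaussian density plus log prior;
        # the constant -d/2*log(2*pi) is dropped since it is common to all classes.
        log_post = (-0.5 * np.sum((x - mu) ** 2) / sigma2
                    - 0.5 * d * np.log(sigma2)
                    + np.log(prior))
        scores.append(log_post)
    return int(np.argmax(scores))

# Step 1: estimate parameters for each class (X1, X2 are the training sets).
# params = [spherical_estimate(X1), spherical_estimate(X2)]
# Step 2: classify a test point x, here assuming equal priors.
# label = classify(x, params, priors=[0.5, 0.5])
```

Because the covariance matrix reduces to sigma^2 times the identity, the discriminant only needs the squared Euclidean distance to each class mean, which is why this case is simpler than the general multivariate normal classifier.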
Regards,
Vinay