Hi,
For astrometry.net, you usually need 12-15 of the brightest stars to be detected.
Fundamentally, detecting stars is all about signal-to-noise. In an idealized CCD image, you have a "sky" background, which varies from pixel to pixel, giving you noise, plus readout noise from the detector. The signal from each star gets spread over multiple pixels by the Point Spread Function (PSF), so a blurrier image means less signal per pixel. Uniform clouds would just reduce the amount of signal, and therefore the signal-to-noise.
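As a back-of-the-envelope illustration (this is just the standard CCD noise formula, not astrometry.net's actual detection code, and the numbers are made up):

    import math

    def star_snr(star_flux, sky_per_pixel, read_noise, n_pixels):
        # Rough CCD signal-to-noise for a star whose light is spread
        # over n_pixels pixels (all quantities in electrons).
        # Noise terms: star shot noise, sky shot noise, readout noise.
        noise = math.sqrt(star_flux
                          + n_pixels * sky_per_pixel      # sky shot noise
                          + n_pixels * read_noise ** 2)   # readout noise
        return star_flux / noise

    # The same star flux spread over more pixels (a wider PSF) picks up
    # more sky and readout noise, so the SNR drops:
    print(star_snr(1000, 50, 5, n_pixels=9))    # tight PSF: SNR ~ 24
    print(star_snr(1000, 50, 5, n_pixels=36))   # wide PSF:  SNR ~ 16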
In the web version, if you have an image with a wide (blurry) PSF, you can increase the "Downsample" option (the default is 2). With the "solve-field" command, the equivalent is "--downsample"; there is also the "--nsigma" option, which sets how much signal-to-noise is required to detect a star - useful when clouds or noise are making the stars faint.
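For example, on a blurry image taken through thin clouds, an invocation might look something like this (the filename is just a placeholder; the flags are the ones mentioned above):

    solve-field --downsample 4 --nsigma 5 myimage.fits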
cheers,
dustin