Ameer --
Yes, you will have to estimate many parameters about Ponds (and/or Wetlands), for which there will likely not be much real data. The areas you can get from land-cover data (satellite or aerial photos), or open-water GIS data for your watershed, if you're lucky. You'll likely have to guess about average water depths (I often use 1.5 m for small water bodies with no data). This would give you principal areas and volumes.
"Emergency" areas and volumes are somewhat conceptual. In the model, all flow above an emergency volume is spilled downstream during the same time increment it occurs. I.e., there is no available storage to reduce stormflows above the emergency volume. This doesn't really happen in reality, but it keeps the model from flooding large parts of subbasins (which might actually occur). I don't have good advice here -- I usually pick some multiple of the principal area (2-3x?) and principal volume (3-5x?) to estimate the emergency area and volume.
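Those rules of thumb can be wired into a small helper like the one below. To be clear, this is just my guesswork encoded, not anything official: the 1.5 m depth and the 2-3x / 3-5x multipliers are the guesses described above, and the parameter names (PND_PSA, PND_PVOL, PND_ESA, PND_EVOL) are the pond-file inputs as I remember them, with areas in ha and volumes in 10^4 m^3.

```python
def estimate_pond_params(principal_area_ha, mean_depth_m=1.5,
                         em_area_mult=2.5, em_vol_mult=4.0):
    """Rough first-cut estimates for SWAT pond parameters.

    principal_area_ha : open-water area from land-cover/GIS data (ha)
    mean_depth_m      : guessed average depth (1.5 m when no data exist)
    em_area_mult      : emergency area as a multiple of principal area
    em_vol_mult       : emergency volume as a multiple of principal volume
    """
    # 1 ha flooded 1 m deep = 10,000 m^3, so area (ha) * depth (m)
    # gives volume directly in units of 10^4 m^3
    principal_vol = principal_area_ha * mean_depth_m
    return {
        "PND_PSA": principal_area_ha,           # principal surface area (ha)
        "PND_PVOL": principal_vol,              # principal volume (10^4 m^3)
        "PND_ESA": em_area_mult * principal_area_ha,
        "PND_EVOL": em_vol_mult * principal_vol,
    }
```

Check the multipliers against whatever local knowledge you have before trusting them.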
"Natural" drawdown of the Pond levels occurs at the rate determined by NDTARG, and only for water volumes above principal and below emergency -- and only for the designated "flood" months. During "non-flood" months, the water level is allowed to rise up to the emergency volume, below which nothing spills and above which everything spills. This is unrealistic, so I avoid using "non-flood" months as much as possible. The parameters IFLOD1 & 2 set the beginning and end of the non-flood period -- which I don't want -- so I set IFLOD1=12 (December) and IFLOD2=1 (January). This shrinks the "non-flood" season to just December through January, but that's OK in my part of the world, where little is flowing in the winter (the same would be true enough for most of Canada).
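The flood/non-flood behavior above amounts to a small piece of per-day bookkeeping. Here's a sketch of my reading of it -- the real SWAT code also handles seepage, evaporation, and inflow within the step, so treat this only as a mental model, with ndtarg being the target number of days to draw the pond down to principal.

```python
def daily_release(volume, principal_vol, emergency_vol, ndtarg, flood_season):
    """Volume released downstream today (same units as the volumes).

    flood_season : True if the current month is OUTSIDE the IFLOD1..IFLOD2
                   "non-flood" window, i.e. drawdown toward principal is active.
    """
    if volume > emergency_vol:
        # everything above the emergency volume spills in the same time step
        return volume - emergency_vol
    if flood_season and volume > principal_vol:
        # draw the pond down toward principal over roughly ndtarg days
        return (volume - principal_vol) / ndtarg
    # non-flood season, below emergency: nothing spills
    return 0.0
```

Note how in the non-flood season the function returns zero right up to the emergency volume, then dumps everything above it -- that cliff is exactly the unrealistic behavior I try to minimize.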
Finally, setting PND_FR (or WET_FR) is even more of a guessing game at this point. A simple approximation could be to assume it's a simple multiple of the principal area. In one of my watersheds, where I did more work with the DEM to try to actually measure it, the pond drainage area was about 3.3x the principal area (with the entire subbasin area as the upper limit, which is very possible in some settings). BUT -- this is highly variable and could be off by a very large amount. Next year I hope to have a student or colleague help construct a GIS tool that will calculate cumulative depression area, volume, and drainage area in a subbasin, which could then objectively be used to parameterize SWAT. Until that time -- and with no promises -- you'll have to estimate on your own as objectively as possible.
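Until a proper depression-analysis tool exists, a placeholder calculation like this is about the best I can suggest. The 3.3 multiplier is just the value from that one watershed and may be wildly wrong elsewhere; this assumes PND_FR is the fraction of the subbasin area draining into ponds.

```python
def estimate_pnd_fr(principal_area_ha, subbasin_area_ha, drainage_mult=3.3):
    """Guess the fraction of the subbasin draining into ponds (PND_FR).

    drainage_mult : pond drainage area as a multiple of principal area
                    (3.3 came from DEM work in one of my watersheds)
    """
    drainage_area_ha = drainage_mult * principal_area_ha
    # the ponds can't drain more than the whole subbasin
    return min(drainage_area_ha / subbasin_area_ha, 1.0)
```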
I usually start out with the K value = 0 to stop all seepage, and only increase it as needed. If you have a non-contributing basin (i.e., no surface outlet), you can increase K (to 5-10? I forget...) to be large enough so that the Pond always seeps out and never spills, thereby trapping all sediment (and all phosphorus?).
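As a sanity check on whether a given K is big enough to keep the Pond from spilling, the daily seepage loss is roughly the following -- assuming K is in mm/hr and surface area in ha, which is how I remember the pond water balance working (verify against the documentation):

```python
def daily_seepage_m3(k_mm_per_hr, surface_area_ha):
    """Approximate daily seepage volume (m^3) through the pond bottom.

    mm/hr * 24 hr = mm/day, and 1 mm of water over 1 ha = 10 m^3,
    hence the combined factor of 240.
    """
    return 240.0 * k_mm_per_hr * surface_area_ha
```

With K = 0 this term vanishes entirely, which is why zeroing it is a clean starting point: compare this volume to your typical daily inflow to judge whether the Pond will always seep out before it spills.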
Wetlands function identically to Ponds, except that the NDTARG value is hard-coded to 10 days. This may not be appropriate, so lately I've just been using Ponds. Or, in some cases, I've used Wetlands to represent closed depressions (high seepage, no spilling) and Ponds to represent open depressions (spilling to a surface-water outlet), since they (should) impact water quality very differently.
The good thing about Ponds is that you can alter the parameters easily. The bad things -- well, they never quite seem to have the hydraulic and water-quality impacts that I expect. The whole flood vs. non-flood month stuff should be simplified and removed in the default case, so all volumes between principal and emergency flow out based on NDTARG or an input stage-discharge relation. Seepage must be allowed to contribute to baseflow (if it doesn't already).
Remember -- when you add Ponds, they will alter your HRU-level output. To see the "raw" HRU water, sediment, & nutrient yields before they are altered by draining into Ponds, turn off Ponds by temporarily setting PND_FR=0 for all subbasins.
That should get you started,
-- Jim