Hey Karen,
From having played with the API a lot my gut feeling is that it's not even close to 80GB, sadly.
A single request returns up to 200 articles (the maximum allowed per request, or "page"), and the body text for those 200 articles comes to roughly 1.8MB saved as a plain text file. The API reports there are 9,258 "pages" (requests) of data in total. Back-of-napkin calculation: 9,258 pages x 1.8MB per page = 16,664MB, or about 16.6GB.
With the developer level of access you're allowed to make 5,000 calls per day, so on the upside it would only take you two days to fetch all the articles — somewhere in the region of 1.85 million of them (9,258 pages x 200 articles).
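If it helps, here's the back-of-napkin math as a quick sketch. The constants are just the numbers I observed while poking at the API, not documented limits, so treat them as assumptions:

```python
import math

# Observed/assumed numbers from my own testing -- not official API limits.
PAGE_SIZE = 200           # max articles returned per request ("page")
TOTAL_PAGES = 9_258       # total "pages" the API reports
MB_PER_PAGE = 1.8         # approx. size of one page's body text saved as plain text
DAILY_CALL_LIMIT = 5_000  # developer-tier request limit per day

total_articles = TOTAL_PAGES * PAGE_SIZE            # ~1.85 million articles
total_mb = TOTAL_PAGES * MB_PER_PAGE                # ~16,664 MB, i.e. ~16.6 GB
days_needed = math.ceil(TOTAL_PAGES / DAILY_CALL_LIMIT)  # 2 days at the rate limit

print(f"{total_articles:,} articles, ~{total_mb:,.0f} MB, {days_needed} days")
```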
Hope this helps.
-D