This has been fixed recently (the result couldn't be larger than 64 KB on most platforms). Right now there is no hard limit, although it's surely finite.
Please do keep in mind that, on top of your task being forced to pass around big chunks of data, the result is always a JSON string.
Whenever you save and load a result, it is converted to and from JSON.
If your "thing to do" produces a series of zillions of rows that need to be stored properly, you'd waste far less energy by having your scheduled task save the rows directly into the final table, rather than returning them as a result and having another piece of code fetch that result from the scheduled task and translate it into the table.
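A minimal sketch of the idea, using SQLite as a stand-in for whatever database holds your final table (the table name `final` and the task functions are hypothetical, not part of any scheduler API):

```python
import json
import sqlite3

def task_returning_result(rows):
    # Anti-pattern: the whole row set is serialized to one big JSON
    # string, which a second piece of code must later parse and insert.
    return json.dumps(rows)

def task_writing_table(conn, rows):
    # Preferred: the scheduled task inserts straight into the final
    # table, so the data never round-trips through a JSON result.
    conn.executemany("INSERT INTO final (id, value) VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE final (id INTEGER, value TEXT)")
rows = [(i, f"row-{i}") for i in range(1000)]
task_writing_table(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM final").fetchone()[0]
print(count)  # 1000
```

With the second approach the task's stored result can stay tiny (a row count or a status flag), while the bulk data goes where it was headed anyway.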