Yes, you could just run `scrapy crawl` from cron; Scrapy is not coupled in any way with Scrapyd.
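For example, a crontab entry like the following would work (the project path and spider name here are placeholders):

```shell
# m h dom mon dow  command — run the spider every day at 03:00
0 3 * * *  cd /path/to/project && scrapy crawl myspider >> /var/log/myspider.log 2>&1
```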
Scrapyd is an application (not a programming framework like Scrapy) that lets you manage your running spiders more conveniently. It provides an HTTP API to schedule jobs and download data, a web UI (still basic, but improving over time) for visualizing running, pending and completed spiders, as well as a way to control/limit how many spiders can run in parallel (so as not to overload the machine).
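As a sketch of that HTTP API, assuming Scrapyd is running on its default port 6800 and using placeholder project/spider names:

```shell
# Schedule a spider run via Scrapyd's HTTP API
curl http://localhost:6800/schedule.json -d project=myproject -d spider=myspider

# List pending, running and finished jobs for the project
curl "http://localhost:6800/listjobs.json?project=myproject"
```

You would deploy the project to Scrapyd first (e.g. with `scrapyd-deploy`), and the web UI at `http://localhost:6800/` shows the same job states.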