$> scrapy crawl detail -s LOG_LEVEL=WARNING
2014-09-05 16:40:46-0400 [scrapy] INFO: Scrapy 0.24.4 started (bot: detail)
2014-09-05 16:40:46-0400 [scrapy] INFO: Optional features available: ssl, http11
2014-09-05 16:40:46-0400 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'crawler.spiders', 'LOG_LEVEL': 'WARNING', 'SPIDER_MODULES': ['crawler.spiders'], 'BOT_NAME': 'chrome_store_crawler', 'USER_AGENT': '...', 'DOWNLOAD_DELAY': 0.3}
2014-09-05 16:40:47-0400 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2014-09-05 16:40:48-0400 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2014-09-05 16:40:48-0400 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2014-09-05 16:40:48-0400 [scrapy] INFO: Enabled item pipelines: CsvExporterPipeline
2014-09-05 16:40:48-0400 [detail] INFO: Spider opened
......
Even though the log reports LOG_LEVEL as an overridden setting, these INFO messages are still printed. Are there any settings that could conflict with LOG_LEVEL and cause this? I'm running Scrapy v0.24.4.
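For comparison, I would expect the same behavior whether the level is passed on the command line or set project-wide in settings.py; a minimal sketch of the settings-file variant (assuming a standard project layout) would be:

```python
# settings.py (sketch) -- project-wide equivalent of -s LOG_LEVEL=WARNING
BOT_NAME = 'chrome_store_crawler'
SPIDER_MODULES = ['crawler.spiders']
NEWSPIDER_MODULE = 'crawler.spiders'

# Suppress INFO and DEBUG messages; only WARNING and above should be logged
LOG_LEVEL = 'WARNING'
```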