Hi Dhruv, your help is welcome. Here are a few pointers that come to mind:
Get familiar with the command line; that's where you'll spend most of your time iterating on tests.
Set up a virtual working environment:
$ mkvirtualenv dhruv-scrapy
Create a directory to work in:
$ mkdir ~/Scrapy
Set up Scrapy so you have the dependencies installed:
$ pip install scrapy
Fork Scrapy on GitHub (click the "Fork" button; a GitHub account is required).
Install & configure Git (use your own name and email here):
$ git config --global user.name "Your Name"
$ git config --global user.email you@example.com
Clone your fork into the working directory:
$ cd ~/Scrapy
$ git clone git@github.com:dhruv/scrapy.git
Install your fork:
$ pip install -e ~/Scrapy/scrapy
Run the tests:
$ tox
If the tox run comes back all green, you're ready to contribute.
See the contribution page:
And select some easy tickets to start with:
Start writing some spiders and play with the code.
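To get a feel for the kind of logic that goes inside a spider's parse callback, here is a stdlib-only sketch (no Scrapy needed) that pulls the title and links out of an HTML page; the class and function names are my own illustration, not Scrapy APIs — in a real spider you'd use Scrapy's response.css/xpath selectors instead.

```python
from html.parser import HTMLParser

class LinkAndTitleParser(HTMLParser):
    """Collects href attributes and the <title> text from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.title = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def parse(html):
    """Roughly what a spider's parse() callback would extract from a page."""
    parser = LinkAndTitleParser()
    parser.feed(html)
    return {"title": parser.title, "links": parser.links}

page = """<html><head><title>Scrapy</title></head>
<body><a href="/docs">Docs</a> <a href="/community">Community</a></body></html>"""
result = parse(page)
print(result["title"])   # Scrapy
print(result["links"])   # ['/docs', '/community']
```

Once this mental model clicks, the Scrapy tutorial's real spiders (with Requests, Items, and selectors) will feel familiar.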
Good luck and have fun. +steven