Javascript support in the crawler/spider


Andres Riancho

Nov 17, 2009, 12:32:01 AM11/17/09
to Websecurify
pdp,

Does Websecurify support JavaScript in the crawling/spidering
process? I haven't had time to verify it against a tool like wivet.

Cheers,

--
Andrés Riancho
http://w3af.sf.net/
http://www.bonsai-sec.com/

pdp

Nov 19, 2009, 9:34:44 AM11/19/09
to Websecurify
Yes and no. The spider uses a pattern-matching strategy to extract
URLs and forms (destinations in the internal lingo). Everything works
in a generic fashion, and support for other patterns can be easily
added. The next version will let you browse the application before
starting a test, which will significantly improve the process of
identifying AJAX problems.
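In rough terms, that kind of pattern-matching extraction could look like the sketch below. The function name and the regex here are illustrative assumptions, not Websecurify's actual internal patterns:

```javascript
// Hypothetical sketch of regex-based destination extraction.
// The real Websecurify patterns are internal; this single regex is an
// assumption to illustrate the general approach.
function extractDestinations(html) {
  // pull URLs out of href/src/action attributes
  const pattern = /(?:href|src|action)\s*=\s*["']([^"']+)["']/gi;
  const found = [];
  let match;
  while ((match = pattern.exec(html)) !== null) {
    found.push(match[1]);
  }
  return found;
}
```

The upside of this style is that it works on raw markup without ever executing the page, which is what keeps the engine generic.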

pdp

P.S. the current trunk supports python so there are opportunities for
future integration between both testing platforms. :)

Andres Riancho

Nov 20, 2009, 3:56:08 PM11/20/09
to webse...@googlegroups.com
pdp,

On Thu, Nov 19, 2009 at 12:34 PM, pdp <pdp.gnu...@googlemail.com> wrote:
>
> Yes and no. The spider uses a pattern matching strategy to extract
> urls and forms (destinations in the internal lingo).

Why this approach, and not JS code that does something like:

"""
for tag in document.getTagByName('*'):
tag.click()
tag.onMouseOver()
tag.onChange()
"""

I think that something like that would be TRIVIAL to implement in your
environment, and would provide:

- Crawling of normal websites, without depending on URL regular
expressions, which always suck.
- Crawling of any website, with any combination of javascript.

The only problem I see is that you're going to be running JavaScript
code that you don't control, which could potentially harm the scanner
somehow (add an infinite loop in JS, and when the scanner clicks on
it, the scanner will hang too) or the client box (though I don't
really know how it would harm the box).
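A rough sketch of that crawl idea, with a try/catch guard so a handler that throws doesn't abort the whole walk (an infinite loop would still need an external timeout). All names here are illustrative, not from any existing tool; `doc` stands in for the page document so the idea can be exercised outside a browser:

```javascript
// Illustrative sketch: walk every element and fire the handlers it
// defines, guarding against handlers that throw.
function crawlByEvents(doc) {
  let fired = 0;
  for (const el of doc.getElementsByTagName('*')) {
    for (const name of ['click', 'onmouseover', 'onchange']) {
      const handler = el[name];
      if (typeof handler !== 'function') continue;
      try {
        handler.call(el); // run the page's own handler
        fired += 1;
      } catch (err) {
        // a broken handler should not stop the crawl
      }
    }
  }
  return fired; // how many handlers actually ran
}
```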

> Everything works
> in a generic fashion.

Sounds nice.

> Support for other patterns can be easily added.

Patterns == regular expressions?
Patterns are bad and you should avoid them. I can't avoid them in
w3af... but you should. Your environment is more friendly.

> The next version will allow you to perform some browsing of the
> application before starting a test which will significantly improve
> the process of identifying ajax problems.

Great.

> pdp
>
> P.S. the current trunk supports python so there are opportunities for
> future integration between both testing platforms. :)

hehe, nice. We'll see how that works. I've been trying to find a GPL
tool that will "click over javascript" for me and hand me the results
somehow... maybe Websecurify is the tool I'm looking for, but only if
the JS support is added.

Cheers,

> On Nov 17, 5:32 am, Andres Riancho <andres.rian...@gmail.com> wrote:
>> pdp,
>>
>>     Does Websecurify support JavaScript in the crawling/spidering
>> process? I haven't had time to verify it against a tool like wivet.
>>
>> Cheers,
>>
>> --
>> Andrés Riancho
>> http://w3af.sf.net/
>> http://www.bonsai-sec.com/

--
Andrés Riancho
Founder, Bonsai - Information Security
http://www.bonsai-sec.com/
http://w3af.sf.net/

pdp

Nov 21, 2009, 8:44:58 AM11/21/09
to Websecurify
Heh :) One thing I've learned over years of coding is to keep things
simple.

However, you're right: the current platform does allow full
introspection and automation of AJAX. I am thinking, though, that
this should be implemented as an extension on top of the current
engine. The reason is that the core engine can run from the command
line as well, and adding a Mozilla-specific feature would break that.

The rest of my comments are inline.

On Nov 20, 8:56 pm, Andres Riancho <andres.rian...@gmail.com> wrote:
> pdp,
>
> On Thu, Nov 19, 2009 at 12:34 PM, pdp <pdp.gnuciti...@googlemail.com> wrote:
>
> > Yes and no. The spider uses a pattern matching strategy to extract
> > urls and forms (destinations in the internal lingo).
>
> why this approach and not a js code that does something like:
>
> """
> for tag in document.getTagByName('*'):
>     tag.click()
>     tag.onMouseOver()
>     tag.onChange()
> """
>

I am not sure this is any better. If we used this kind of AJAX-style
feature, we would need to write a rather complicated model to store
and restore states.
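A naive version of that bookkeeping, just to show why it gets complicated: snapshot the markup before firing an event, then restore it afterwards. This markup-only approach (the helper name is made up for illustration) misses everything that lives in the JS heap, timers, cookies, and server-side sessions, which is exactly where the hard part is:

```javascript
// Naive state bookkeeping sketch: snapshot the markup before firing
// an event, detect whether it changed, and restore it so the next
// event starts from a known state. Ignores JS heap, timers, and
// server-side state entirely.
function withSnapshot(doc, fireEvent) {
  const before = doc.body.innerHTML; // markup-only snapshot
  fireEvent();
  const changed = doc.body.innerHTML !== before;
  doc.body.innerHTML = before;       // restore for the next event
  return changed;                    // did the event mutate the page?
}
```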

>
> I think that something like that would be TRIVIAL to implement in your
> environment, and would provide:
>
> - Crawling of normal websites, without depending on URL regular
> expressions, which always suck.

Well, the current crawler is based on regex pattern matching and does
spider normal and AJAX sites quite well. Of course, there are some
exceptions and limitations.

>
> - Crawling of any website, with any combination of javascript.
>
> The only problem I see, is that you're going to be running javascript
> code that you don't control, which could potentially harm (somehow)
> the scanner (add an infinite loop in js, when the scanner clicks on
> it, it will go into infinite loop too) or the client box (don't really
> know how to harm the box).
>

The same SOP rules will apply, which means that unless the site
contains a 0-day for Mozilla, it shouldn't be able to harm your PC.

>
> > Everything works
> > in a generic fashion.
>
> Sounds nice.
>
> > Support for other patterns can be easily added.
>
> Patterns == regular expressions?
> Patterns are bad and you should avoid them. I can't avoid them in
> w3af... but you should. Your environment is more friendly.
>
> > The next version will allow you to perform some browsing of the
> > application before starting a test which will significantly improve
> > the process of identifying ajax problems.
>
> Great.
>
> > pdp
>
> > P.S. the current trunk supports python so there are opportunities for
> > future integration between both testing platforms. :)
>
> hehe, nice. We'll see how that works. I've been trying to find a GPL
> tool that will "click over javascript" for me and hand me the results
> somehow... maybe Websecurify is the tool I'm looking for, but only if
> the JS support is added.
>

I think that Websecurify is too complicated for this purpose. However,
if we write an AJAX spider on top of Websecurify, it could easily be
ported into a small standalone application (still big in terms of
disk space, but fast and flexible).