Version 2.08b: Many changes including dir refactor
  - Added Host header XSS testing.
  - Added HTML encoding XSS tests to detect scenarios where our
    injection string ends up in an attribute that executes HTML-encoded
    JavaScript. For example: onclick.
  - Bruteforcing is now disabled for URLs that gave a directory listing.
  - Added subject alternate name checking for SSL certificates (cheers
    to Matt Caroll for his feedback).
  - Added signature matching (see doc/signatures.txt), which means a lot
    of the content-based issues are no longer hardcoded.
  - Added active XSSI test. The passive XSSI test stays (for now) but this
    active check is more accurate and will remove issues detected by the
    passive one if they cannot be confirmed. This reduces false
    positives.
  - Added HTML tag XSS test which triggers when our payload is used
    as a tag attribute value but without quotes (courtesy of wavsep).
  - Added javascript: scheme XSS testing (courtesy of wavsep).
  - Added form-based authentication. During these authenticated
    scans, skipfish will check if the session has ended and re-authenticate
    if necessary.
  - Fixed a bug where, in slow scans, the console output could get
    garbled due to the high(er) refresh rate.
  - Fixed a bug where a missed response during the injection tests could
    result in a crash. (courtesy of Sebastian Roschke)
  - Restructured the source package a bit by adding src/, doc/ and
    tools/ directories.
spinkham committed Sep 12, 2012
1 parent a655d58 commit c9d5b74
Showing 39 changed files with 2,551 additions and 544 deletions.
38 changes: 38 additions & 0 deletions ChangeLog
@@ -1,3 +1,41 @@
Version 2.08b:

- Added Host header XSS testing.

- Added HTML encoding XSS tests to detect scenarios where our
injection string ends up in an attribute that executes HTML-encoded
JavaScript. For example: onclick.

- Bruteforcing is now disabled for URLs that gave a directory listing.

- Added subject alternate name checking for SSL certificates (cheers
to Matt Caroll for his feedback).

- Added signature matching (see doc/signatures.txt), which means a lot
of the content-based issues are no longer hardcoded.

- Added active XSSI test. The passive XSSI test stays (for now) but this
active check is more accurate and will remove issues detected by the
passive one if they cannot be confirmed. This reduces false positives.

- Added HTML tag XSS test which triggers when our payload is used
as a tag attribute value but without quotes (courtesy of wavsep).

- Added javascript: scheme XSS testing (courtesy of wavsep).

- Added form-based authentication. During these authenticated
scans, skipfish will check if the session has ended and re-authenticate
if necessary.

- Fixed a bug where, in slow scans, the console output could get
garbled due to the high(er) refresh rate.

- Fixed a bug where a missed response during the injection tests could
result in a crash. (courtesy of Sebastian Roschke)

- Restructured the source package a bit by adding src/, doc/ and
tools/ directories.

Version 2.07b:
--------------

36 changes: 20 additions & 16 deletions Makefile
@@ -20,45 +20,49 @@
#

PROGNAME = skipfish
VERSION = 2.07b
VERSION = 2.08b

OBJFILES = http_client.c database.c crawler.c analysis.c report.c \
checks.c
INCFILES = alloc-inl.h string-inl.h debug.h types.h http_client.h \
SRCDIR = src
SFILES = http_client.c database.c crawler.c analysis.c report.c \
checks.c signatures.c auth.c
IFILES = alloc-inl.h string-inl.h debug.h types.h http_client.h \
database.h crawler.h analysis.h config.h report.h \
checks.h
checks.h signatures.h auth.h

OBJFILES = $(patsubst %,$(SRCDIR)/%,$(SFILES))
INCFILES = $(patsubst %,$(SRCDIR)/%,$(IFILES))

CFLAGS_GEN = -Wall -funsigned-char -g -ggdb -I/usr/local/include/ \
-I/opt/local/include/ $(CFLAGS) -DVERSION=\"$(VERSION)\"
CFLAGS_DBG = -DLOG_STDERR=1 -DDEBUG_ALLOCATOR=1 $(CFLAGS_GEN)
CFLAGS_OPT = -O3 -Wno-format $(CFLAGS_GEN)
CFLAGS_OPT = -O3 -Wno-format $(CFLAGS_GEN)

LDFLAGS += -L/usr/local/lib/ -L/opt/local/lib
LIBS += -lcrypto -lssl -lidn -lz
LIBS += -lcrypto -lssl -lidn -lz -lpcre

all: $(PROGNAME)

$(PROGNAME): $(PROGNAME).c $(OBJFILES) $(INCFILES)
$(CC) $(LDFLAGS) $(PROGNAME).c -o $(PROGNAME) $(CFLAGS_OPT) \
$(OBJFILES) $(LIBS)
$(PROGNAME): $(SRCDIR)/$(PROGNAME).c $(OBJFILES) $(INCFILES)
$(CC) $(LDFLAGS) $(SRCDIR)/$(PROGNAME).c -o $(PROGNAME) \
$(CFLAGS_OPT) $(OBJFILES) $(LIBS)
@echo
@echo "See dictionaries/README-FIRST to pick a dictionary for the tool."
@echo "See doc/dictionaries.txt to pick a dictionary for the tool."
@echo
@echo "Having problems with your scans? Be sure to visit:"
@echo "http://code.google.com/p/skipfish/wiki/KnownIssues"
@echo

debug: $(PROGNAME).c $(OBJFILES) $(INCFILES)
$(CC) $(LDFLAGS) $(PROGNAME).c -o $(PROGNAME) $(CFLAGS_DBG) \
$(OBJFILES) $(LIBS)
debug: $(SRCDIR)/$(PROGNAME).c $(OBJFILES) $(INCFILES)
$(CC) $(LDFLAGS) $(SRCDIR)/$(PROGNAME).c -o $(PROGNAME) \
$(CFLAGS_DBG) $(OBJFILES) $(LIBS)

clean:
rm -f $(PROGNAME) *.exe *.o *~ a.out core core.[1-9][0-9]* *.stackdump \
LOG same_test
rm -rf tmpdir

same_test: same_test.c $(OBJFILES) $(INCFILES)
$(CC) same_test.c -o same_test $(CFLAGS_DBG) $(OBJFILES) $(LDFLAGS) \
same_test: $(SRCDIR)/same_test.c $(OBJFILES) $(INCFILES)
$(CC) $(SRCDIR)/same_test.c -o same_test $(CFLAGS_DBG) $(OBJFILES) $(LDFLAGS) \
$(LIBS)

publish: clean
29 changes: 11 additions & 18 deletions README
@@ -85,6 +85,9 @@ associated with web security scanners. Specific advantages include:
stored XSS (path, parameters, headers), blind SQL or XML injection,
or blind shell injection.

* Snort style content signatures which will highlight server errors,
information leaks or potentially dangerous web applications.

* Report post-processing drastically reduces the noise caused by any
remaining false positives or server gimmicks by identifying repetitive
patterns.
@@ -274,23 +277,15 @@ report will be non-destructively annotated by adding red background to all
new or changed nodes; and blue background to all new or changed issues
found.

Some sites may require authentication; for simple HTTP credentials, you can
try:

$ ./skipfish -A user:pass ...other parameters...

Alternatively, if the site relies on HTTP cookies instead, log in in your
browser or using a simple curl script, and then provide skipfish with a
session cookie:
Some sites may require authentication; the supported methods are described
in doc/authentication.txt. In most cases, you'll want to use the form
authentication method, which can detect broken sessions in order to
re-authenticate.

$ ./skipfish -C name=val ...other parameters...

Other session cookies may be passed the same way, one per each -C option.

Certain URLs on the site may log out your session; you can combat this in two
ways: by using the -N option, which causes the scanner to reject attempts to
set or delete cookies; or with the -X parameter, which prevents matching URLs
from being fetched:
Once authenticated, certain URLs on the site may log out your session;
you can combat this in two ways: by using the -N option, which causes
the scanner to reject attempts to set or delete cookies; or with the -X
parameter, which prevents matching URLs from being fetched:

$ ./skipfish -X /logout/logout.aspx ...other parameters...

@@ -544,8 +539,6 @@ know:

* Scheduling and management web UI.

* A database for banner / version checks or other configurable rules?

-------------------------------------
9. Oy! Something went horribly wrong!
-------------------------------------
8 changes: 7 additions & 1 deletion assets/index.html
@@ -278,6 +278,7 @@
"10804": "Conflicting MIME / charset info (low risk)",
"10901": "Numerical filename - consider enumerating",
"10902": "OGNL-like parameter behavior",
"10909": "Signature match (informational)",

"20101": "Resource fetch failed",
"20102": "Limits exceeded, fetch suppressed",
@@ -294,6 +295,7 @@
"30203": "SSL certificate host name mismatch",
"30204": "No SSL certificate data found",
"30205": "Weak SSL cipher negotiated",
"30206": "Host name length mismatch (name string has null byte)",
"30301": "Directory listing restrictions bypassed",
"30401": "Redirection to attacker-supplied URLs",
"30402": "Attacker-supplied URLs in embedded content (lower risk)",
@@ -305,11 +307,13 @@
"30701": "Incorrect caching directives (lower risk)",
"30801": "User-controlled response prefix (BOM / plugin attacks)",
"30901": "HTTP header injection vector",
"30909": "Signature match detected",

"40101": "XSS vector in document body",
"40102": "XSS vector via arbitrary URLs",
"40103": "HTTP response header splitting",
"40104": "Attacker-supplied URLs in embedded content (higher risk)",
"40105": "XSS vector via injected HTML tag attribute",
"40201": "External content embedded on a page (higher risk)",
"40202": "Mixed content embedded on a page (higher risk)",
"40301": "Incorrect or missing MIME type (higher risk)",
@@ -321,6 +325,7 @@
"40501": "Directory traversal / file inclusion possible",
"40601": "Incorrect caching directives (higher risk)",
"40701": "Password form submits from or to non-HTTPS page",
"40909": "Signature match detected (higher risk)",

"50101": "Server-side XML injection vector",
"50102": "Shell injection vector",
@@ -329,7 +334,8 @@
"50105": "Integer overflow vector",
"50106": "File inclusion",
"50201": "SQL query or similar syntax in parameters",
"50301": "PUT request accepted"
"50301": "PUT request accepted",
"50909": "Signature match detected (high risk)"

};

2 changes: 0 additions & 2 deletions dictionaries/extensions-only.wl
@@ -86,7 +86,6 @@ e 1 1 1 sql
e 1 1 1 stackdump
e 1 1 1 svn-base
e 1 1 1 swf
e 1 1 1 swp
e 1 1 1 tar
e 1 1 1 tar.bz2
e 1 1 1 tar.gz
@@ -107,4 +106,3 @@ e 1 1 1 xsl
e 1 1 1 xslt
e 1 1 1 yml
e 1 1 1 zip
e 1 1 1 ~
98 changes: 98 additions & 0 deletions doc/authentication.txt
@@ -0,0 +1,98 @@


This document describes 3 different methods you can use to run
authenticated skipfish scans.

1) Form authentication
2) Cookie authentication
3) Basic HTTP authentication



-----------------------
1. Form authentication
-----------------------

With form authentication, skipfish will submit credentials using the
given login form. The server is expected to reply with authentication
cookies, which will then be used during the rest of the scan.

An example of logging in using this feature:

$ ./skipfish --auth-form http://example.org/login \
--auth-user myuser \
--auth-pass mypass \
--auth-verify-url http://example.org/profile \
[...other options...]

This is how it works:

1. Upon start of the scan, the authentication form at /login will be
fetched by skipfish, which will try to fill in the username and password
fields and submit the form.

2. Once a server response is obtained, skipfish will fetch the
verification URL twice: once with the new session cookies and once
without any cookies. Both responses are expected to be different.

3. During the scan, the verification URL will be used many times to
test whether we are authenticated. If at some point our session has
been terminated server-side, skipfish will re-authenticate using the
--auth-form URL (/login in our example).

Verifying whether the session is still active requires a good verification
URL: one where an authenticated request gets a different response than
an anonymous request. For example, a 'profile' or 'my account' page.

Troubleshooting:
----------------

1. Login field names not recognized

If the username and password form fields are not recognized, skipfish
will complain. In this case, you should specify the field names using
the --auth-user-field and --auth-pass-field flags.
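
For example, if the form uses custom field names (the 'email' and
'passwd' names below are purely illustrative), a sketch of the
invocation could look like:

$ ./skipfish --auth-form http://example.org/login \
--auth-user-field email \
--auth-pass-field passwd \
--auth-user myuser \
--auth-pass mypass \
--auth-verify-url http://example.org/profile \
[...other options...]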

2. The form is not submitted to the right location

If the login form doesn't specify an action="" location, skipfish
will submit the form's content to the form URL. This will fail in some
cases, for example when the login page uses JavaScript to submit
the form to a different location.

Use the --auth-form-target flag to specify the URL to which skipfish
should submit the form.
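
For example, assuming the form is actually posted to /dologin (a
hypothetical path used only for illustration), you could run:

$ ./skipfish --auth-form http://example.org/login \
--auth-form-target http://example.org/dologin \
--auth-user myuser \
--auth-pass mypass \
--auth-verify-url http://example.org/profile \
[...other options...]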

3. Skipfish keeps getting logged out

Make sure you blacklist any URLs that will log you out, for example
by using "-X /logout".


-------------------------
2. Cookie authentication
-------------------------

Alternatively, if the site relies on HTTP cookies, you can feed these
to skipfish manually. To do this, log in using your browser or a
simple curl script, and then provide skipfish with a session cookie:

$ ./skipfish -C name=val [...other options...]

Other session cookies may be passed the same way, one per -C option.

The -N option, which causes new cookies to be rejected by skipfish,
is almost always a good choice when running cookie-authenticated scans
(e.g. to prevent your precious cookies from being overwritten).

$ ./skipfish -N -C name=val [...other options...]

-----------------------------
3. Basic HTTP authentication
-----------------------------

For simple HTTP credentials, you can pass the username and password
with the -A option:

$ ./skipfish -A user:pass [...other options...]

File renamed without changes.