4. Introducing edgescan™
• edgescan™ is a sophisticated, enterprise-grade vulnerability assessment and management solution that gives you the tools you need to control and manage IT security risk.
• edgescan™ helps organisations, from small and medium-sized businesses to large enterprises, identify and remediate known vulnerabilities in any platform or web application.
• edgescan™ is a cloud-based SaaS that provides a unique combination of technology and human expertise to help you maintain a strong security posture.
5. How we get the statistical model
• Thousands of vulnerability assessments delivered globally.
• Full-stack view of security.
• False-positive free (99%).
• Industries: Media, Energy, Government, Pharma, Finance, Software, etc.
8. Risk Dispersion
73% of all vulnerabilities are not in the application layer.
Which carries more risk/weakness: the network or the web application?
9. What does this mean?
• We are finding more vulnerabilities in the non-web layer, but here's where it gets interesting…
10. Web Application Layer (Layer 7)
Lots of high- or critical-risk issues!
Easily exploitable.
Very damaging.
Very bad.
11. Infrastructure Layer (Non-Web App)
Lots of vulnerabilities!
Not many high- or critical-risk issues.
More problems, but less severe.
12. What does this mean?
• Even though we are finding MORE issues in the NETWORK layer, MOST serious / high-risk issues are in the web application layer.
• The web application layer has a higher "risk density", as illustrated below.
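One way to read "risk density" is the share of a layer's findings that are high or critical risk. The counts in this sketch are made up purely to show the arithmetic; they are not edgescan statistics.

```python
# Illustrative counts only - not edgescan data.
layers = {
    "web application": {"total": 270, "high_or_critical": 60},
    "infrastructure":  {"total": 730, "high_or_critical": 15},
}

for name, counts in layers.items():
    # Risk density = serious findings as a fraction of all findings in the layer.
    density = counts["high_or_critical"] / counts["total"]
    print(f"{name}: risk density = {density:.1%}")
```

Even though the infrastructure layer has far more findings in this toy example, its risk density is much lower, which is the point this slide is making.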
13. More Detail
SSL is dead: per the Payment Card Industry (PCI) requirements, SSL and TLS 1.0 had to be killed off by June 30th, 2018 (a quick version check is sketched below).
Patching is still a big issue.
Some CVEs are common and easily exploitable.
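As a rough companion to the PCI point above, a handshake restricted to TLS 1.0 shows whether a server still accepts the deprecated protocol. This is a minimal sketch using Python's standard ssl module; the host name is a placeholder, and on newer OpenSSL builds the client itself may refuse TLS 1.0, in which case a failed handshake only means "not confirmed".

```python
import socket
import ssl

def accepts_tls10(host: str, port: int = 443) -> bool:
    """Return True if the server completes a handshake pinned to TLS 1.0."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    # Pin both ends of the allowed range to TLS 1.0 only.
    ctx.minimum_version = ssl.TLSVersion.TLSv1
    ctx.maximum_version = ssl.TLSVersion.TLSv1
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

print(accepts_tls10("example.com"))  # placeholder host
```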
14. More Detail
System configuration and secure deployment is a big issue.
Client-side security: XSS, HTML injection and browser-based issues are still very common.
Insecure JavaScript libraries are common.
Version control of software components is required (a simple check is sketched below).
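As one way to picture tracking component versions, the sketch below checks the dependencies declared in a package.json against a small hard-coded advisory list. The library names, versions and the advisory list itself are hypothetical; in practice this data would come from an advisory feed or a software composition analysis tool.

```python
import json

# Hypothetical advisory data - real checks would use a vulnerability feed.
KNOWN_BAD = {
    "jquery": {"1.6.0", "1.7.2"},
    "angular": {"1.2.0"},
}

def flag_vulnerable_components(manifest_path: str = "package.json") -> None:
    with open(manifest_path) as fh:
        deps = json.load(fh).get("dependencies", {})
    for lib, version in deps.items():
        pinned = version.lstrip("^~")  # strip semver range prefixes
        if pinned in KNOWN_BAD.get(lib, set()):
            print(f"{lib} {pinned} has known vulnerabilities - upgrade it")

flag_vulnerable_components()
```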
15. Known Vulnerabilities - Age
Patching and version maintenance are still a key part of maintaining a secure posture.
17. How do we improve?
• Automation?
• Visibility?
19. Coverage is King!
• Automation is very useful.
• It works with DevSecOps (a simple pipeline gate is sketched after this list).
• It helps us scale and "move quickly".
• It also has many challenges and can result in poor assessment coverage and poor security!
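One common way automation plugs into a DevSecOps pipeline is as a gate that fails the build when serious findings appear. This is only a sketch under assumed conventions: the scan-results.json file name and the risk/name fields are hypothetical, not any particular scanner's output format.

```python
import json
import sys

RESULTS_FILE = "scan-results.json"   # hypothetical output of an earlier scan stage
BLOCKING_RISKS = {"high", "critical"}

with open(RESULTS_FILE) as fh:
    findings = json.load(fh)         # assumed: a list of {"name": ..., "risk": ...}

serious = [f for f in findings if f.get("risk", "").lower() in BLOCKING_RISKS]

if serious:
    for finding in serious:
        print(f"[{finding['risk'].upper()}] {finding['name']}")
    sys.exit(1)  # non-zero exit fails the pipeline stage
print("No blocking findings - pipeline may continue")
```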
20. Pitfall / Explanation / Solution

Pitfall: CSRF tokens preventing crawling
Explanation: Cross-Site Request Forgery tokens need to be resent with every request. If the token is not valid, the application may invalidate the session. Tokens can be embedded in the HTML and not automatically reused by the scanner, which results in the scanner not crawling or testing the site adequately.
Solution: Use tools that can be configured to "replay" the appropriate token with each request. Not all tools are capable of this; in some cases multiple tools need to be "chained" to satisfy this restriction, macros need to be written, or tools running a virtual browser are used. (A simple token-replay sketch follows this table.)

Pitfall: DOM security vulnerabilities
Explanation: Client-side security issues that do not generate HTTP requests may go undiscovered when tools only test the application by sending and receiving HTTP requests. DOM (Document Object Model) vulnerabilities may be missed because the tool does not process client-side scripts.
Solution: Tools that provide virtual-browser capability solve this issue, as dynamic scripts in the browser are processed and tested by the security tool. This is also important for systems built using client-side frameworks (Angular, Node.js, etc.) and detects issues such as DOM XSS. Taint analysis of JavaScript code also helps discover client-side security issues.
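To make the CSRF-token pitfall concrete, here is a minimal token-replay sketch using Python's requests library. The base URL, the path and the csrf_token field name are assumptions for illustration; a scanner would extract whatever hidden field the target application actually embeds, on every page it tests.

```python
import re
import requests

BASE = "https://app.example.com"   # hypothetical target
TOKEN_FIELD = "csrf_token"         # hypothetical hidden-field name

session = requests.Session()

def fetch_csrf_token(path: str) -> str | None:
    """Pull the hidden CSRF token out of the page HTML so it can be replayed."""
    html = session.get(BASE + path).text
    match = re.search(rf'name="{TOKEN_FIELD}"\s+value="([^"]+)"', html)
    return match.group(1) if match else None

def post_with_token(path: str, data: dict) -> requests.Response:
    """Re-fetch the form, attach a fresh token, then submit the request."""
    token = fetch_csrf_token(path)
    if token:
        data = {**data, TOKEN_FIELD: token}
    return session.post(BASE + path, data=data)

# Example: submit a form without a stale or missing token killing the session.
response = post_with_token("/profile", {"display_name": "test"})
print(response.status_code)
```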
21. Pitfall / Explanation / Solution

Pitfall: Dynamically generated requests
Explanation: Contemporary applications may dynamically generate HTTP requests via JavaScript functions, and tools that crawl applications to establish site maps may not detect such dynamic links and requests.
Solution: Tools that leverage virtual browsers solve this problem, as the JavaScript is executed just as it would be during a regular user's usage of the application. This results in adequate coverage and detection of dynamic page elements.

Pitfall: Recursive links - limiting repetitive functionality
Explanation: Applications with recursive links may result in thousands of unnecessary requests; a calendar control or a search-result function is a typical example. Thousands of extra requests may be sent to the application with little value yielded.
Example:
/Item/5/view
/Item/6/view
Solution: Some tools can limit recursion and request depth, so that if the crawler hits a link with thousands of permutations of the same page it stops, saving unnecessary resources and time for both the assessment and the hosting environment. (A simple pattern-based limiter is sketched after this table.)

Pitfall: SSL/TLS vulnerabilities
Explanation: Many tools designed to detect cryptographic issues simply do it incorrectly. We have worked with some major tool vendors to assist them with bug fixes in this area.
Solution: Using multiple tools to detect the same issue provides clarity on whether the issue is present or is a false positive.

Pitfall: Non-standard protocols
Explanation: Some protocols are simply not handled by certain tools. If protocols such as WebSockets, CORS, AMT or GWTK are not supported, they will not get adequately tested.
Solution: Using multiple tools helps with coverage in this case. The tools chosen to deliver the assessment are based on initial manual enumeration of the target system.

Pitfall: Insufficient testing vectors used
Explanation: All tools test for defined vulnerabilities using a defined set of vectors, and some also include tests for "known" vulnerabilities. Using a single scanning engine may result in security vulnerabilities not being tested adequately due to a restricted list of testing vectors.
Solution: Leveraging multiple tools to test for particular vulnerabilities results in more test cases and a larger set of vectors being used to test for the vulnerability.
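To illustrate the recursive-links point, the sketch below collapses numeric path segments into a single pattern and caps how many URLs per pattern get crawled. The cap of 20, the /Item/{id}/view URL shape and app.example.com are assumptions for illustration only.

```python
import re
from collections import Counter
from urllib.parse import urlparse

MAX_PER_PATTERN = 20        # assumed cap on permutations of the same page
seen_patterns = Counter()

def url_pattern(url: str) -> str:
    """Collapse numeric segments so /Item/5/view and /Item/6/view match."""
    return re.sub(r"/\d+", "/{id}", urlparse(url).path)

def should_crawl(url: str) -> bool:
    pattern = url_pattern(url)
    if seen_patterns[pattern] >= MAX_PER_PATTERN:
        return False            # skip yet another permutation of the same page
    seen_patterns[pattern] += 1
    return True

# Example: only the first MAX_PER_PATTERN item pages would be queued.
candidates = [f"https://app.example.com/Item/{i}/view" for i in range(1000)]
to_scan = [u for u in candidates if should_crawl(u)]
print(len(to_scan))  # 20
```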
22. Pitfall / Explanation / Solution

Pitfall: Non-standard 404
Explanation: Some sites use the standard 404 handler, but many have started to customise it to offer a better user experience. A custom 404 page that responds with a 200 status is the simplest case, yet many scanners get caught by it.
Solution: Using tools that can be configured to recognise custom error pages is important in order to avoid false positives. (A simple fingerprinting sketch follows this table.)

Pitfall: Session management
Explanation: It is a challenge for any tool to stay logged into an application. The scanner must avoid logout functions, must properly pass along session tokens wherever they happen to be at the moment (sometimes cookies, sometimes on the URL, sometimes in hidden form fields), and must adjust to multiple possibilities taking place on a single app. The scanner must also properly identify when it has lost its session and then be able to log in again (which requires the automated login process mentioned above) to continue its scan.
Solution: Using multiple tools assists with this, as not all tools can be configured reliably to maintain session state. Not having reliable session state, or locking out accounts, results in poor coverage and disruption to the engagement.

Pitfall: Ability to test Web 2.0 (AJAX), web services and mobile
Explanation: Related to a number of the pitfalls above; applications with dynamic API calls via JavaScript, RESTful requests, etc. can go undiscovered and never get invoked at all.
Solution: Using multiple tools configured with REST awareness avoids missing areas of the application that would otherwise be left untested or would require that entire section to be tested by hand.
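To make the non-standard-404 pitfall concrete, the sketch below fingerprints how the application answers a request for a path that almost certainly does not exist, then compares later responses against that fingerprint. The base URL, the probed path and the 50-byte size tolerance are assumptions for illustration.

```python
import uuid
import requests

BASE = "https://app.example.com"   # hypothetical target

def not_found_fingerprint() -> tuple[int, int]:
    """Request a random, non-existent path and record status code and body size."""
    probe = "/" + uuid.uuid4().hex
    resp = requests.get(BASE + probe, allow_redirects=False)
    return resp.status_code, len(resp.text)

def looks_like_not_found(resp: requests.Response, fingerprint: tuple[int, int]) -> bool:
    """Treat responses matching the 'missing page' fingerprint as 404s, even if they say 200."""
    status, size = fingerprint
    return resp.status_code == status and abs(len(resp.text) - size) < 50

fingerprint = not_found_fingerprint()
resp = requests.get(BASE + "/admin/backup.zip", allow_redirects=False)
if looks_like_not_found(resp, fingerprint):
    print("Probably the custom 'not found' page, not a real finding")
```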