
Test pages/domains... actually slowing down your site?

Blog Post created by B-3-UK082V Employee on Jun 17, 2015

Test pages, test scripts, test servers, and test infrastructure are a security and performance issue, and the problem can be surprisingly easy to avoid. It can also be hard to detect, because sometimes a third party is the source of the problem.


Performance

To make a long story short, I was looking at some WebPageTest performance numbers and noticed something strange: there were references to test servers in my waterfalls, and more importantly, those requests were not being served from production-level infrastructure. This got me thinking about single points of failure, also known as SPOFs. In a test on that site, I was able to push out doc complete by about 20 seconds by blackholing that domain. To put it another way, a script was implemented in a way that its response time affects the loading of the rest of the page, and that script sat on infrastructure that was never meant to be performant or to handle large loads. It's also worth noting it was served on every page of the site. It's not unreasonable to assume this could affect page performance whenever the test environment is slow to respond or, worse, depending on the robustness of the network, goes down under the load.
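To see this kind of SPOF for yourself, blackhole the suspect domain (for example, with a hosts-file entry pointing at a non-routable address) and re-run the test. As a cruder illustration, here's a minimal Python sketch that times each script fetch serially, which roughly approximates how long a synchronous script would block the page. The URLs are placeholders, not the site I tested.

```python
# Rough illustration (placeholder URLs): time each script fetch serially to
# approximate how long a synchronous <script> would block the page. A slow
# or blackholed host adds roughly its full timeout to doc complete.
import time
import urllib.request

SCRIPT_URLS = [
    "https://www.example.com/main.js",     # production asset (placeholder)
    "https://test.example.com/widget.js",  # suspicious test host (placeholder)
]

for url in SCRIPT_URLS:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=20) as resp:
            status = f"HTTP {resp.status}"
    except Exception as exc:  # DNS failure, timeout, connection refused, ...
        status = f"failed ({exc})"
    print(f"{url}: {time.monotonic() - start:.1f}s, {status}")
```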


Thinking about the performance impact of test infrastructure spiraled out into musing about the other aspects as well: the security, performance, and availability implications of test infrastructure and test code.


Security

For many years I was involved in preliminary security assessments of applications, and I was always amazed at how often a "test.html" or "test.aspx" would show up when one of our tools profiled someone's webapp or when we were looking at the source code. Typically, the things you worry about are:

  • Vulnerabilities that can be exploited
  • Code that can affect the availability, performance, and data integrity of your site

 

A not-so-hypothetical example: if you have a test page that performs heavy operations, what happens when someone scripts thousands of requests against it? You could grind the system to a halt.
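To make that concrete, here's a minimal sketch of how little effort such a flood takes. The target URL is hypothetical, and this should only ever be pointed at infrastructure you own and have permission to load-test.

```python
# Placeholder target -- only ever point this at infrastructure you own and
# have permission to load-test. Shows how trivially a heavy, unauthenticated
# test page can be flooded.
import concurrent.futures
import urllib.request

TARGET = "https://staging.example.com/test.aspx"  # hypothetical test page

def hit(_):
    try:
        with urllib.request.urlopen(TARGET, timeout=10) as resp:
            return f"HTTP {resp.status}"
    except Exception as exc:
        return type(exc).__name__

# 50 concurrent workers, 1,000 requests total -- enough to expose an
# endpoint that does heavy work per request.
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(hit, range(1000)))

for outcome in sorted(set(results)):
    print(outcome, results.count(outcome))
```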

 

A lot of the time they were innocuous pages, static HTML used to try out scripts; on rare occasions one would do something scary, like writing a pile of test data to a prod database, or turn out to be a utility for something in a test bed or local to a dev machine. In general these pages were never tested for security, and because they were never meant to be public they would often spew out database errors and other details that could be used to profile the server. While I can't speak to how often these are exploited, it makes logical sense that they'd be a launching point for an attack, because they typically weren't developed with security in mind.

 

This got me thinking: how do you address this? It will probably take a mixture of strategies, and to be honest, which ones actually get implemented will depend on the level of effort each requires. Having worked with a few of these systems, most of these checks seem relatively trivial to put in place, and I know the security tools often have these checks already; they just need to be turned on.


It's worth noting that I speak from the assessment perspective, based on what I've seen, and I'm trying to suggest some possible tactics to address the issue. I encourage those in development/security to provide feedback on approaches that would benefit others.

 

What you can do about it

Most of these suggestions are vendor/tool agnostic and offer a variety of options for validating that these test scripts/servers aren't present. As I mentioned before, in a perfect world the best approach would be to apply all of these strategies, but realistically the easy-to-implement solutions are usually the ones that get implemented.

    Code base

  1. Train developers not to check in test code, or at the very least to put it in a different branch.
  2. The build system should check for files with "test" in the name and flag them. This obviously wouldn't catch everything, but it would at least take the obvious cases out of the equation (a minimal sketch follows this list).
  3. Source code scanners (SAST) should check for test pages as part of the assessment process.
    • Usually called Static Analysis Tools or Code Review Tools, they are extensible and allow you to add your own checks if they don't already contain them.
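As a sketch of point 2, here's roughly what such a build-time gate might look like. The deploy directory is a placeholder, and a real check would want a smarter pattern to cut down on false positives.

```python
# Hypothetical build-time gate: fail the build if anything with "test" in the
# file name lands in the deployable web root. The path is a placeholder and
# the pattern is naive on purpose -- tune it to avoid false positives
# (e.g. "latest.js").
import pathlib
import re
import sys

WEB_ROOT = pathlib.Path("build/webroot")      # placeholder deploy directory
PATTERN = re.compile(r"test", re.IGNORECASE)

offenders = [p for p in WEB_ROOT.rglob("*")
             if p.is_file() and PATTERN.search(p.name)]

if offenders:
    print("Files that look like test pages -- refusing to ship:")
    for path in offenders:
        print(f"  {path}")
    sys.exit(1)  # non-zero exit fails the build
```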

 

    Application Infrastructure

  1. Review infrastructure settings/accounts
  2. Web Application Firewalls (WAF) can be configured to block these requests. It's probably too simplistic to just look for "test.html" or "*test*", but it's worth looking into; the first sketch after this list shows the idea at the application level.
    • e.g. block "test.html", block "test.php", and so on
  3. Dynamic application scanners (DAST) should be used to verify that the "easy" test-file URLs aren't accessible. They have a dictionary of tests they run against every detected folder; the second sketch after this list is a minimal version of that check.
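A real WAF rule would live in the WAF's own configuration, but as a stand-in for point 2, here's a minimal Python (WSGI) sketch that rejects test-looking paths before they ever reach the application. The pattern is a deliberately simplistic placeholder.

```python
# Not a real WAF rule -- an app-level stand-in (WSGI) showing the idea:
# reject requests whose path looks like a test page before they reach the
# application. The pattern is a deliberately simplistic placeholder.
import re
from wsgiref.simple_server import make_server

BLOCKED = re.compile(r"/test[^/]*\.(html|php|aspx|jsp)$", re.IGNORECASE)

def app(environ, start_response):
    if BLOCKED.search(environ.get("PATH_INFO", "")):
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"Forbidden\n"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello\n"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```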

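And for point 3, a minimal version of that dictionary-style check: probe a handful of common test-file names under each known folder and flag anything that answers. Host, folder list, and dictionary are placeholders; a real scanner's dictionary is far larger.

```python
# Minimal dictionary-style probe: request common test-file names under each
# known folder and flag anything that answers. Host, folders, and dictionary
# are placeholders; a real scanner's dictionary is far larger.
import urllib.error
import urllib.request

BASE = "https://www.example.com"                   # placeholder host
FOLDERS = ["/", "/admin/", "/scripts/", "/app/"]   # folders found by crawling
DICTIONARY = ["test.html", "test.php", "test.aspx", "test.jsp"]

for folder in FOLDERS:
    for name in DICTIONARY:
        url = f"{BASE}{folder}{name}"
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                print(f"FLAG  {resp.status} {url}")   # reachable test page
        except urllib.error.HTTPError as err:
            if err.code != 404:                       # 403/500 still worth a look
                print(f"CHECK {err.code} {url}")
        except Exception:
            pass                                      # unreachable; ignore
```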
 

    Third Party Components

    Some of the code that may be a SPOF, affect performance, or perhaps even be malicious may not be on your infrastructure at all. Here are some things you can do.

  • At some point someone should do a regular, possibly scripted, analysis of the domains served from their production site. It could be as simple as running a WebPageTest, pulling down the list of third-party domains, and doing an NS lookup on each one, which is a trivial amount of script to write (a sketch follows this list). Once you have the list, validate that it's what you're expecting. With third-party ads this can sometimes be difficult, but try asking the following:
    • Is the domain expected? Do I know what it is?
    • Are they on a CDN? I have a somewhat more robust article I'll be publishing on this soon.
    • Are they reputable? Sometimes sites are hacked and used as a malware distribution point. Are you sure your site isn't serving something up that shouldn't be there?
      • This is always a fun topic in the case of forums, where the site doesn't even need to be hacked to do this.
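Here's a minimal version of that scripted analysis, assuming you've already pulled the third-party hostnames out of a WebPageTest run or a HAR export. It does a simple forward resolve via the standard library rather than a true NS lookup, and the hostnames and allowlist are placeholders.

```python
# Minimal version of the scripted domain review: resolve each third-party
# hostname pulled from a WebPageTest run or HAR export and compare it against
# an allowlist. Hostnames and allowlist are placeholders.
import socket

THIRD_PARTY_HOSTS = [             # paste these from your waterfall/HAR
    "cdn.examplepartner.com",
    "test.examplevendor.net",
]
EXPECTED = {"cdn.examplepartner.com"}  # domains you know and expect

for host in sorted(THIRD_PARTY_HOSTS):
    note = "" if host in EXPECTED else "  <-- unexpected, investigate"
    try:
        # Returns (canonical name, aliases, addresses); a canonical name on a
        # well-known CDN is usually a good sign.
        canonical, _aliases, addrs = socket.gethostbyname_ex(host)
        print(f"{host} -> {canonical} {addrs}{note}")
    except socket.gaierror:
        print(f"{host} -> does not resolve{note}")
```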


    Change Management And Review

    Have a change management and review process for all of the above.


    Assessments

    Security companies can and do test for these things. If you use a reputable security auditing company, they will have tests for various default configuration settings and obvious test files. If you're an Akamai customer, reach out to your PS contact and have a performance assessment done, which can include analysis of first- and third-party SPOFs so you can understand the impact of these third parties on your site's performance.


Last word

These issues are found in companies large and small; no one is immune. Sure, you can argue "it's just a test page" or "it's just a test script", but these things have real-world implications for performance and security. It seems like common sense to never do this, yet it's still surprising when and where this type of thing comes up.

