Many Apache Web servers, including those hosting some popular websites, expose information about the internal structure of the sites they host, the IP (Internet Protocol) addresses of their visitors, the resources users access and other potentially sensitive details because their status pages are left unprotected.
The Apache mod_status module generates a "server status" page that contains information about the server's CPU and memory load, as well as details about active user requests, including paths to various internal files and IP addresses.
While this page can be a valuable resource for server administrators, the information it exposes can help hackers better plan their attacks, Daniel Cid, chief technology officer of Web security firm Sucuri, said Tuesday in a blog post.
Sucuri researchers ran a test that involved crawling over 10 million websites and found hundreds of them that expose their server status pages to the whole world. The list of affected websites includes php.net, metacafe.com, disney.go.com, staples.com, nba.com, cisco.com, ford.com, apache.org and many others. Some of them have fixed the problem since Sucuri's report, but many haven't.
"Is that a big deal that I can go to staples.com/server-status/ and see all those orders/connections being made and their IPs?" Cid said. "Or go to one of them and search for 'admin-p' and find a mostly unprotected admin panel (I won't disclose the site). Or find all the internal URLs and vhost mapping for nba.com or ford.com?"
"Probably not a big deal by itself (well, if you don't have an unprotected admin panel), but that can help attackers easily find more information about these environments and use them for more complex attacks," the security researcher said.
At first glance, the solution is simple: add access control directives to the server configuration file so that the /server-status path can only be reached from the IP addresses that actually need it.
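For Apache 2.4, such a restriction might look like the following sketch. The address 192.0.2.10 is a placeholder for an administrator's workstation, not a value from Sucuri's report:

```apache
# Minimal sketch (Apache 2.4): serve the mod_status page only to
# requests from the local machine or one trusted admin address.
<Location "/server-status">
    SetHandler server-status
    # Multiple Require lines default to "RequireAny": either may match.
    Require local
    Require ip 192.0.2.10
</Location>
```

On older Apache 2.2 installations the equivalent uses the `Order`, `Deny` and `Allow` directives from mod_authz_host instead of `Require`.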
However, server administrators need to consider the configuration of their whole Web infrastructure, because there are some scenarios in which the Apache access control directives could be inadvertently bypassed.
For example, if a Squid Web caching service is running in reverse proxy mode on the same machine as a Web server that allows /server-status access only from the local IP address, the restriction will be bypassed, Marcus Povey, an independent IT consultant and software developer from the U.K., said Thursday via email.
This happens because requests from users are received by the Squid proxy first and then passed to the Web server, causing the server to see the requests as coming from the proxy's local IP address.
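A typical single-machine setup of this kind might look like the following squid.conf sketch, with Squid answering on port 80 and Apache moved to 8080 on the loopback interface (www.example.com and the port numbers are placeholders):

```squid
# Sketch: Squid as a reverse proxy ("accelerator") in front of Apache.
# Every request Apache receives now originates from 127.0.0.1, so an
# Apache rule like "Require local" on /server-status matches ALL visitors.
http_port 80 accel defaultsite=www.example.com
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apacheOrigin
acl our_site dstdomain www.example.com
http_access allow our_site
cache_peer_access apacheOrigin allow our_site
```

In this configuration the client's real address only survives in the X-Forwarded-For request header, which a plain Apache IP restriction never consults.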
When running in this configuration, Squid stores static versions of pages generated by the Web server for a limited period of time and serves them to users, which prevents the server from overloading when dealing with a lot of traffic. Without caching services like Squid, the Apache server would have to regenerate PHP or other dynamic pages for every visitor, which can quickly consume the machine's available resources during a traffic spike.
Running Squid and Apache on the same machine is particularly popular with owners of smaller websites, who can't afford to run these services on separate machines, Povey said. "Many, myself included, just have a rented box in a data center somewhere."
Larger companies might run dedicated Web caching servers. However, even in those cases, if the Apache access control rules for /server-status allow access to the whole IP range of the internal network, which includes the caching servers, the same problem would occur.
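The pitfall can be sketched with a rule that admits an entire internal range (10.0.0.0/8 here stands in for whatever private range the organization uses):

```apache
# Pitfall sketch: allowing the whole internal network also admits the
# caching tier, so any request relayed by a proxy in 10.0.0.0/8 passes.
<Location "/server-status">
    SetHandler server-status
    Require ip 10.0.0.0/8
</Location>
```

One of the "extra steps" in such environments is to narrow the rule to specific admin hosts, or to use Apache's mod_remoteip (`RemoteIPHeader X-Forwarded-For` plus `RemoteIPInternalProxy` entries for the trusted caching hosts) so that `Require ip` is evaluated against the real client address rather than the proxy's.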
"You'd have to take extra steps to ensure you didn't expose this," Povey said. "Server-status et al [other server info pages like the one generated by mod_info] is something that is easy to overlook."