Customers should be wary of what details they give out. At our second company, an online trader, customers' credit card details would be at risk.
Mali found a range of “horrors” through the firm’s website.
The web server had “thousands” of vulnerabilities: there was no intrusion detection or monitoring and no access controls. The firm would know if the system went down but would have no way of telling if any data had been stolen. “It’s wide open,” he says.
In addition, the firm's technical person knew so little that he referred Mali to the firm's ISP, which claimed to have blocked holes Mali found open.
“Customer information kept by this company is in danger. They [also] did not have a clue how unsafe they were,” he says.
The firm is negotiating partnerships with large corporates. Mali's advice: if the big players learn of the security risks they should not bother, or the company should get its act together "big time".
Mali performed a test similar to the DataGlobal one, looking for vulnerabilities and exploits to test, and gathering information about the website from Domainz.
He found the name server information pointed to the ISP hosting the site. A "ping sweep", which he likens to feeling the pulse of a system, revealed multiple servers within the same network segment hosting multiple sites.
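A ping sweep simply probes every address in a network range to see which hosts answer. A minimal Python sketch of the idea (the address range is a documentation placeholder, not the ISP's real network, and the TCP "ping" stands in for the ICMP echo a real sweep would use, since ICMP needs raw-socket privileges):

```python
import ipaddress
import socket


def hosts_in_range(cidr: str) -> list[str]:
    """Enumerate the usable host addresses in a network range."""
    return [str(h) for h in ipaddress.ip_network(cidr).hosts()]


def tcp_ping(host: str, port: int = 80, timeout: float = 0.5) -> bool:
    """TCP 'ping': treat a host as alive if it completes a TCP
    connection on the given port within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # 192.0.2.0/28 is a reserved documentation range -- illustrative only.
    for host in hosts_in_range("192.0.2.0/28"):
        if tcp_ping(host):
            print(f"{host} is up")
```

Dedicated tools do the same job faster and over ICMP, but the principle — iterate the segment, note who responds — is all a ping sweep is.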
"In a real scenario, the other systems may be possible hosts to consider for gaining access to the target system, which makes the target host very vulnerable," Mali says.
He then performed a port scan and identified a server running unwanted services, including NetBIOS (port 139), MS-SQL and pcAnywhere. He then "fingerprinted" the web server (checking the type of back-end system) and identified it as Microsoft IIS 4.0. "This gave me a very good starting point to narrow my vulnerability search," he says.
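The two steps Mali describes — a connect scan of interesting ports, then fingerprinting the server from its HTTP banner — can be sketched as follows. The flagged-port table uses the services' standard port numbers, and the sample response is illustrative; Mali used dedicated scanning tools rather than hand-rolled code.

```python
import socket

# Services flagged as unwanted on an internet-facing web server,
# keyed by their standard port numbers.
RISKY_PORTS = {
    139: "NetBIOS session service",
    1433: "MS-SQL",
    5631: "pcAnywhere",
}


def scan(host: str, ports, timeout: float = 0.5) -> list[int]:
    """Return the subset of ports that accept a TCP connection."""
    open_ports = []
    for port in ports:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        if sock.connect_ex((host, port)) == 0:
            open_ports.append(port)
        sock.close()
    return open_ports


def server_banner(http_response: str) -> str:
    """Pull the Server: header out of a raw HTTP response --
    the simplest form of web server fingerprinting."""
    for line in http_response.splitlines():
        if line.lower().startswith("server:"):
            return line.split(":", 1)[1].strip()
    return "unknown"


# An IIS 4.0 response header identifies the platform immediately.
SAMPLE = "HTTP/1.1 200 OK\r\nServer: Microsoft-IIS/4.0\r\nContent-Length: 0\r\n"
```

Knowing the exact server version is what lets an attacker narrow a search to exploits known to work against it — which is why production servers often suppress or falsify the banner.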
Performing a vulnerability scan using Nessus and an NT security scanner, he found the system was "poorly configured with default sample server and files".
"There are a number of exploits to gain access and launch DoS attacks," he says.
The NT vulnerability scanner was then used to extract users and groups. "The system was slightly hardened, as I couldn't find any disk shares on the server," he says.
Mali explains that the test was similar to the DataGlobal test in reconnaissance mode, but this server was somewhat hardened at the operating system level, so he didn't focus on operating system exploits. The server was also patched to a reasonable level.
"The web server was running IIS 4.0 with an MS-SQL back end, and the default sample website files were accessible. This was my target instead of the operating system; in DataGlobal's case the operating system was the target," he says.
Mali found a “couple of big holes and some minor ones” and gave the Wellington firm the same score as DataGlobal.
"There is no firewall at the [Wellington] site; the server sits unprotected with wide-open ports. This is a dream-come-true scenario, as you can see the SQL server open. Port 139 is a good starting point to suck out users' details, which opens the way to a possible brute-force attack," he says.
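Enumerated usernames matter because they reduce password guessing to a per-account dictionary attack. A sketch of that logic, with a stub authenticator standing in for a real login attempt over the SMB service on port 139 (the account names, passwords and `try_login` callback are all hypothetical):

```python
# Once usernames are known (e.g. pulled over NetBIOS port 139),
# an attacker only has to guess the matching passwords.
COMMON_PASSWORDS = ["password", "letmein", "admin", "welcome1"]


def brute_force(usernames, try_login):
    """Try each common password against each known account;
    return the credentials that worked."""
    found = {}
    for user in usernames:
        for pw in COMMON_PASSWORDS:
            if try_login(user, pw):
                found[user] = pw
                break
    return found


if __name__ == "__main__":
    # Stub account directory standing in for the target's real one.
    accounts = {"jsmith": "welcome1", "admin": "Xy!9#qRs"}

    def try_login(user, pw):
        return accounts.get(user) == pw

    # Only the weak password falls; the strong one survives the list.
    print(brute_force(accounts.keys(), try_login))
```

This is also why the recommended fixes below start with closing port 139: without the username list, the attack space grows enormously.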
“The SQL port was open and gave a good understanding of the web server application architecture. The user/customer information is stored on the SQL server. With a couple of known exploits I could possibly access the customer information. On some sites this might include sensitive information such as credit card information or personal addresses,” he says.
Mali says there were a couple of other web servers in the same network as the target host. “If the target host is hardened a bit and few vulnerabilities are found for an easy hack, these other systems may be used to attack the target host. Since this is a web hosting scenario, the target host is unnecessarily exposed.
"In addition, the default sample website and its related files were accessible. I tracked down a number of vulnerabilities related to it, including uploading a Trojan (a back-door program) that could be used to gain direct access to the server," he says.
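Checking for a default sample site is straightforward: the paths IIS 4.0 ships with either answer or they don't. A sketch of such a probe (the path list covers well-known IIS 4.0 defaults; the status-code interpretation is a simplifying assumption):

```python
import http.client

# Well-known default paths shipped with IIS 4.0; if they respond,
# the sample site was never removed.
DEFAULT_PATHS = [
    "/iissamples/",
    "/iisadmpwd/",
    "/scripts/",
    "/msadc/msadcs.dll",
]


def classify(status: int) -> str:
    """Interpret an HTTP status code for a probed path."""
    if status in (200, 302):
        return "accessible"
    if status in (401, 403):
        return "present but restricted"
    return "absent"


def probe(host: str) -> dict:
    """HEAD each default path and report what the server says."""
    report = {}
    for path in DEFAULT_PATHS:
        conn = http.client.HTTPConnection(host, timeout=3)
        try:
            conn.request("HEAD", path)
            report[path] = classify(conn.getresponse().status)
        except OSError:
            report[path] = "unreachable"
        finally:
            conn.close()
    return report
```

Called as `probe("www.example.com")` (placeholder host), it returns a path-by-path report. Removing the sample files, as the ISP later did, makes every one of these probes come back "absent".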
Mali says the ISP immediately carried out the changes he recommended, such as locking down unwanted ports and deleting the sample website files. Longer term, however, more work is needed, such as properly hardening the Windows NT operating system and tightening system access with a commercial access control product.
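"Locking down unwanted ports" amounts to comparing what is open against what the server actually needs. A small sketch of that audit — the allowlist is an assumption, since a public web server typically needs only HTTP (and HTTPS if used), and the open-port list echoes the services found in the scan:

```python
# Ports a public web server legitimately needs (assumed allowlist).
ALLOWED = {80, 443}


def ports_to_close(open_ports):
    """Everything open that isn't on the allowlist should be shut
    on the host or, better, blocked by a firewall in front of it."""
    return sorted(set(open_ports) - ALLOWED)


# NetBIOS (139), MS-SQL (1433) and pcAnywhere (5631/5632) were
# exposed alongside the web ports.
print(ports_to_close([80, 139, 443, 1433, 5631, 5632]))
# → [139, 1433, 5631, 5632]
```

The same comparison, run regularly, doubles as a change-control check: any port that appears on the list after a clean audit is a change somebody made — which speaks to the confusion described next.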
“This would protect files [such as HTML and scripts] from various hacks like defacement and exploitation through various vulnerabilities,” he says.
Mali says the test also uncovered confusion over who was responsible for security breaches: the firm or the ISP.
"When I spoke with the ISP's technical contact, he told me the server was hardened and that whatever information I found shouldn't be there. When I told him I had proof, he couldn't believe it and suggested someone else might have opened it up. That could mean there is no proper change control at this site, or that poor security practices are in place," he says.
Mali concludes: "An independent third-party audit is key to testing clients' beliefs about the security of their environment."