Backdoor Trojan Attack on Manufacturing Lab

Event Year: 2004
Reliability: Confirmed
Country: Unknown
Industry Type: Electronic Manufacturing

This incident describes a complex, wide-reaching malware attack against the manufacturing lab systems of a major electronics manufacturer. The lab was a large integrated test and development facility with a significant number of Windows servers and development machines spread across several building sites. The attack used a backdoor trojan that was, at the time, a new and unknown variant. Whether the attack was targeted, and what its intent was, is not known.

Initially, it appeared that only one server had been infected and then cleaned automatically by its anti-virus software. Inspection of the anti-virus logs on this server indicated that the virus had been deleted. Unfortunately, later investigation proved this not to be the case. The virus had created a file named administrator.txt containing a list of IP addresses for all the lab machines, along with every account name recorded on each machine and the password for each account. Many of the recorded accounts were local administrator accounts with blank passwords or passwords consisting of the word “password”. The virus had also configured an FTP server and was sending this information to an unknown location. The server was disconnected and the administrator.txt file was printed.
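The exact layout of administrator.txt is not documented in the incident record; assuming a simple hypothetical per-line format of `ip,account,password`, a short audit script along these lines could have flagged the blank and dictionary-word passwords that the trojan exploited (all names and sample data below are illustrative, not from the incident):

```python
# Hypothetical sketch: scan a harvested credential listing for the weak
# passwords (blank, or the literal word "password") reported in this incident.
# The "ip,account,password" line format is an assumption for illustration.

WEAK_PASSWORDS = {"", "password"}

def flag_weak_credentials(lines):
    """Return (ip, account) pairs whose password is blank or 'password'."""
    flagged = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip empty lines
        ip, account, password = line.split(",", 2)
        if password.lower() in WEAK_PASSWORDS:
            flagged.append((ip, account))
    return flagged

sample = [
    "10.0.0.5,Administrator,password",   # weak: dictionary word
    "10.0.0.6,Administrator,",           # weak: blank password
    "10.0.0.7,buildsvc,S7r0ng!pass",     # acceptable
]
print(flag_weak_credentials(sample))
# [('10.0.0.5', 'Administrator'), ('10.0.0.6', 'Administrator')]
```

Running such a check proactively, rather than after an attacker has already compiled the list, is precisely the kind of procedural change the incident response later introduced.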

Another server was experiencing similar problems, and a decision was made to disconnect it from the network, as it most likely had a virus. However, the users refused because they could not spare the downtime, so management was asked to authorize disconnecting the infected lab machines, even though this would reduce production and therefore cost money. Within a few hours, at least half of the lab machines were found to be infected and were disconnected from the network, with the resulting production stoppages.

From here, the issue was escalated and corporate entities were contacted to share information. The corporate network and desktop support vendors were informed of the situation, and a call was made to the organization’s network security team. A representative at the anti-virus software vendor was also contacted. By the end of the day the problem was considered contained, but not solved.

Almost a week went by, and there was a desperate need for an immediate solution. The engineers decided to stage the equivalent of a mutiny, reconfiguring the test beds so the machines connected through standalone hubs and switches. There was no access to DNS servers, no communication process, and no documentation for changing the many embedded passwords. No official fix was yet available, and some valuable resources had not been properly backed up.

Ultimately, users were helped with workarounds until the network and all related resources were up and running again.

All in all, about three weeks of development time, plus countless other related hours, were lost, although the exact figure is not known.

Action Description: Significant procedural changes were made to minimize the likelihood and impact of a similar incident.