A Stuxnet Malware FAQ and How to Avoid It
Stuxnet is a new piece of malware that spreads widely through USB flash drives. It is becoming a serious danger, especially in industrial plants, and many in the security industry are getting nervous. What follows is an FAQ about the Stuxnet malware.
How does Stuxnet spread?
Stuxnet spreads through USB devices. A recently discovered Microsoft Windows vulnerability allows a program to run when a user merely browses to a folder containing a shortcut, or ".lnk" file, that points to it. Once the worm runs, it installs itself on the victim computer and scans to see whether the machine is running industrial software created by Siemens, which is very popular in certain industries; the presence of that software determines whether the worm's real payload activates. The worm then infects any removable media later connected to the computer and installs a rootkit, a sophisticated type of software that hides all traces of the worm's presence on the machine. After that, the worm steals as much data as possible from the computer and transmits it back to a remote location.
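The nasty part of the .lnk flaw is that simply displaying the shortcut's icon is enough to trigger code execution, so the safest place to examine a suspect USB stick is from a script rather than from Explorer. Below is a minimal, hypothetical Python sketch that just inventories shortcut files on a removable drive so they can be reviewed or deleted before the folder is ever browsed. It is a crude precaution, not a Stuxnet detector, and it says nothing about whether a given shortcut is actually malicious; the drive letter E:\ is only a placeholder.

```python
import os
import sys

def find_lnk_files(drive_root):
    """Walk a (presumably removable) drive and list every Windows
    shortcut (.lnk) file on it. Shortcuts sitting on a USB stick are
    exactly the bait the .lnk exploit uses, so they deserve a look
    before the folder is ever opened in Explorer."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(drive_root):
        for name in filenames:
            if name.lower().endswith(".lnk"):
                found.append(os.path.join(dirpath, name))
    return found

if __name__ == "__main__":
    # Drive letter is a placeholder; pass the real mount point instead.
    root = sys.argv[1] if len(sys.argv) > 1 else "E:\\"
    for path in find_lnk_files(root):
        print("Found shortcut:", path)
```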
What does Stuxnet do?
Stuxnet is considered by many to be the first-ever "control system" malware, meaning it is capable of infecting the control systems of large companies and factories that use software created by Siemens. Unfortunately, this software is very widely used, especially in large industrial manufacturing organizations, small and large utilities, and even defense systems. It has even been reported that such software runs aboard nuclear-powered aircraft carriers, putting those systems at risk as well.
How widespread is Stuxnet and where is it most common?
At the moment Stuxnet is not that widespread. It is most common in India, Indonesia, Iran, Pakistan, Afghanistan, the United States, and Malaysia, in that order. However, it has the potential to spread very rapidly. Its payload only targets computers running Siemens software, but computers without that software can still act as "carriers," infecting any removable media inserted into them. Other countries have seen infections, but these have mostly been localized and have not caused any damage so far.
How dangerous is Stuxnet?
This is always the big question with a large malware outbreak. Right now Stuxnet is not that dangerous. Unfortunately, it is targeted at control systems. Siemens is best known for making software for sophisticated systems used in areas such as the military, large industrial plants, and utilities. If any of these were infected, the damage could be irreparable. The infection is clearly tailored to steal confidential information and possibly shut down "smart grids." Therefore, while it is not a danger to consumers, any large corporation or plant must be very careful to avoid this infection.
File Processing Systems
Even the earliest business computer systems were used to process business records and produce information. They were generally faster and more accurate than equivalent manual systems. These systems stored groups of records in separate files, and so they were called file processing systems. Although file processing systems are a great improvement over manual systems, they do have the following limitations:
Data is separated and isolated.
Data is often duplicated.
Application programs are dependent on file formats.
It is difficult to represent complex objects using file processing systems.

Data is separated and isolated. Recall that as the marketing manager you needed to relate sales data to customer data. Somehow you need to extract data from both the CUSTOMER and ORDER files and combine it into a single file for processing. To do this, computer programmers determine which parts of each of the files are needed. Then they determine how the files are related to one another, and finally they coordinate the processing of the files so the correct data is extracted. This data is then used to produce the information. Imagine the problems of extracting data from ten or fifteen files instead of just two! (A sketch of this kind of manual merge appears after this discussion.)

Data is often duplicated. In the record club example, a member's name, address, and membership number are stored in both files. Although this duplicate data wastes a small amount of file space, that is not the most serious problem with duplicate data. The major problem concerns data integrity. A collection of data has integrity if the data is logically consistent. This means, in part, that duplicated data items agree with one another. Poor data integrity often develops in file processing systems. If a member were to change his or her name or address, then all files containing that data would need to be updated. The danger lies in the risk that not all files will be updated, causing discrepancies between them. Data integrity problems are serious. If data items differ, inconsistent results will be produced. A report from one application might disagree with a report from another application. At least one of them will be incorrect, but who can tell which one? When this occurs, the credibility of the stored data comes into question.

Application programs are dependent on file formats. In file processing systems, the physical formats of files and records are embedded in the application programs that process the files. In COBOL, for example, file formats are written in the DATA DIVISION. The problem with this arrangement is that changes in file formats force program updates. For example, if the Customer record were modified to expand the ZIP Code field from five to nine digits, all programs that use the Customer record would need to be modified, even if they do not use the ZIP Code field. There might be twenty programs that process the CUSTOMER file. A change like this one means that a programmer needs to identify all the affected programs, then modify and retest them. This is both time consuming and error-prone. It is also very frustrating to have to modify programs that do not even use the field whose format changed. (The second sketch below makes this concrete.)

It is difficult to represent complex objects using file processing systems. This last weakness of file processing systems may seem a bit theoretical, but it is an important shortcoming.
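To make the "separated and isolated" problem concrete, here is a minimal Python sketch of the kind of merge programmers had to write by hand: read two flat files, know the shared key in advance, and match records yourself. The file names and field layouts (cust_id, name, city, order_id, amount) are invented for illustration, not taken from the text.

```python
import csv

# Hypothetical layouts, invented for illustration:
#   CUSTOMER.csv: cust_id, name, city
#   ORDER.csv:    order_id, cust_id, amount
# Relating the two files means this program must know the shared key
# and perform the matching itself -- there is no query language to ask.
def join_customers_orders(customer_path, order_path):
    customers = {}
    with open(customer_path, newline="") as f:
        for row in csv.DictReader(f):
            customers[row["cust_id"]] = row

    joined = []
    with open(order_path, newline="") as f:
        for row in csv.DictReader(f):
            cust = customers.get(row["cust_id"])
            if cust is not None:  # orders with no matching customer are dropped
                joined.append({**cust, **row})
    return joined
```

Every new question ("sales by city?", "orders per member?") means another hand-written merge like this one; a database system moves that bookkeeping into a declarative query.

The format-dependence problem is just as easy to demonstrate. In a fixed-format record, every field is located by hard-coded offsets, much as a COBOL DATA DIVISION pins them down. The offsets below are invented; the point is that widening the ZIP field from five to nine digits shifts every field after it, so every program holding a copy of these numbers must change, even programs that never read the ZIP code.

```python
# Invented fixed-width CUSTOMER record layout:
#   columns  0-19: name    (20 chars)
#   columns 20-24: zip     (5 chars)
#   columns 25-32: balance (8 chars)
# The slice offsets below ARE the file format, baked into the program.
def parse_customer(record: str):
    name = record[0:20].rstrip()
    zip_code = record[20:25]
    balance = float(record[25:33])
    return name, zip_code, balance

# Expanding ZIP to nine digits shifts balance to columns 29-36, so this
# function -- and every other program carrying copies of these offsets --
# must be found, edited, and retested, even programs that never use ZIP.
```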