Static code analysis usually refers to the use of tools that attempt to detect possible vulnerabilities in "static", i.e. non-running, source code. It is a form of white-box testing.
A basic distinction is made between dynamic and static test procedures. Dynamic procedures, such as Dynamic Application Security Testing (DAST), test functionality by executing the code; unit, penetration, and functional tests fall into this category. Static tests, in contrast, include manual reviews and static code analysis. "Static" means that the program is not executed: the check is performed only on the source code.
Ideally, such tools would automatically find security flaws with a high degree of confidence that what is found is indeed a flaw. However, this is beyond the state of the art for many types of application security vulnerabilities. Such tools therefore frequently serve as aids for an analyst, helping them zero in on security-relevant portions of code so that they can find flaws more efficiently, rather than as tools that simply find flaws automatically. Some tools are starting to integrate into the Integrated Development Environment (IDE). For the types of problems that can be detected during the software development phase itself, this is a powerful point in the development lifecycle to deploy such tools, as the developer receives immediate feedback on problems they may be introducing into the code as they write it. This immediate feedback is very useful compared to finding vulnerabilities much later in the development cycle.
There are several techniques for statically analyzing source code for potential vulnerabilities, which may be combined into one solution. These techniques are often derived from compiler technology.
Data flow analysis is used to collect runtime (dynamic) information about data in the software while it is in a static state. Three terms are common in data flow analysis: the basic block (the code), control flow analysis (the flow of data), and the control flow path (the path the data takes). Data flow analysis considers the state of a variable along a path. First, a graph of the program (or of a function) containing all paths is created. Then the actions performed on the variable along each path are recorded; four different actions can be distinguished. Patterns found this way are not necessarily errors; they may also have been intentional.
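The idea can be illustrated with a minimal sketch: a toy control-flow graph whose basic blocks record "define" and "use" actions on variables, walked path by path to flag a variable that is used before any definition. All names (`blocks`, `edges`, `find_use_before_def`) are illustrative, not from any real tool.

```python
# Toy control-flow graph: each basic block is a list of (action, variable)
# pairs, where "def" means the variable is assigned and "use" means it is read.
blocks = {
    "entry": [("def", "x")],
    "then":  [("use", "x"), ("def", "y")],
    "else":  [("use", "y")],   # on this path, y is used before any definition
    "exit":  [("use", "y")],
}
edges = {"entry": ["then", "else"], "then": ["exit"], "else": ["exit"], "exit": []}

def find_use_before_def(blocks, edges, start="entry"):
    """Walk every path from `start`, tracking which variables are defined."""
    findings = []

    def walk(node, defined, path):
        defined = set(defined)  # copy, so sibling paths stay independent
        for action, var in blocks[node]:
            if action == "use" and var not in defined:
                findings.append((var, path + [node]))
            elif action == "def":
                defined.add(var)
        for succ in edges[node]:
            walk(succ, defined, path + [node])

    walk(start, set(), [])
    return findings

for var, path in find_use_before_def(blocks, edges):
    print(f"variable {var!r} used before definition along {' -> '.join(path)}")
```

Note that the analysis is path-sensitive: `y` is flagged on the `entry -> else` path (and at `exit` when reached via `else`), but not on the `entry -> then` path, where it has been defined.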
Control flow analysis examines the program flow and finds, for example, code fragments that can never be reached during program execution. Compilers often detect such errors themselves (for example, the Java compiler javac). Such code does not necessarily lead to faults, but it very likely indicates that the programmer intended a different program flow for this fragment.
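A minimal sketch of such a check, using Python's standard `ast` module: any statement that follows a `return` (or other terminating statement) in the same block can never execute and is reported by line number. The example source and the function name `find_unreachable` are made up for illustration.

```python
import ast

SOURCE = """
def discount(price):
    return price * 0.9
    print("never reached")   # unreachable: follows the return
"""

def find_unreachable(source):
    """Report line numbers of statements that follow a terminating statement."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        body = getattr(node, "body", None)
        if not isinstance(body, list):
            continue
        terminated = False
        for stmt in body:
            if terminated:
                findings.append(stmt.lineno)
            if isinstance(stmt, (ast.Return, ast.Raise, ast.Break, ast.Continue)):
                terminated = True
    return findings

print(find_unreachable(SOURCE))  # → [4]
```

This is exactly the class of finding javac reports as an "unreachable statement" compile error.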
Taint analysis attempts to identify variables that are "tainted" by user-controllable input and traces them to potentially vulnerable functions, also known as "sinks". If a tainted variable is passed to a sink without first being "cleaned" (sanitized), it is flagged as a vulnerability.
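The mechanics can be sketched over a toy straight-line program: taint enters at a source, propagates through assignments, is removed by a sanitizer, and is reported when a still-tainted value reaches a sink. The function names (`read_user_input`, `escape_sql`, `run_sql_query`) are hypothetical placeholders, not a real API.

```python
SOURCES = {"read_user_input"}   # taint enters the program here
SINKS = {"run_sql_query"}       # tainted data must not reach these
SANITIZERS = {"escape_sql"}     # these remove taint

# Each statement models `target = function(argument)`.
program = [
    ("name", "read_user_input", None),   # name becomes tainted
    ("safe", "escape_sql", "name"),      # safe holds the sanitized value
    ("_",    "run_sql_query", "safe"),   # OK: sanitized value reaches the sink
    ("_",    "run_sql_query", "name"),   # finding: tainted value reaches the sink
]

def check_taint(program):
    tainted, findings = set(), []
    for i, (target, func, arg) in enumerate(program):
        if func in SOURCES:
            tainted.add(target)
        elif func in SANITIZERS:
            tainted.discard(target)          # the sanitizer's result is clean
        elif func in SINKS and arg in tainted:
            findings.append((i, func, arg))  # tainted data reached a sink
        elif arg in tainted:
            tainted.add(target)              # taint propagates via assignment
    return findings

print(check_taint(program))  # → [(3, 'run_sql_query', 'name')]
```

Only the fourth statement is reported: the third passes the same user input through the sanitizer first, so the sink receives clean data.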
In syntax analysis, often also called lexical analysis, the source code is checked against syntax and grammar rules. Tools that perform syntax analysis include compilers and interpreters. Syntax analysis takes place during every compiler run; if an error is detected, an error message is produced and the compilation process is aborted.

Style analysis uses sets of rules that concern programming style. On the one hand, these can be company-internal coding style guides, which keep the code uniform within a company and make it easier to read and maintain. On the other hand, they can be programming-language-specific guidelines whose aim is to avoid unsafe programming constructs. In many safety-critical areas of embedded-systems software, compliance with such standards is even mandatory.
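A minimal style checker along these lines can be built on Python's standard `tokenize` module: one made-up rule flags over-long lines, another flags use of the unsafe construct `eval`. The rule names and the 79-character limit are illustrative assumptions, not taken from any real guideline.

```python
import io
import tokenize

MAX_LINE = 79  # assumed limit, in the spirit of common style guides

def style_check(source):
    """Return (line number, message) findings for two toy style rules."""
    findings = []
    # Rule 1: purely lexical line-length check.
    for lineno, line in enumerate(source.splitlines(), start=1):
        if len(line) > MAX_LINE:
            findings.append((lineno, "line too long"))
    # Rule 2: token-level check for the unsafe construct eval().
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.NAME and tok.string == "eval":
            findings.append((tok.start[0], "use of eval is forbidden"))
    return findings

print(style_check("x = eval(input())\n"))  # → [(1, 'use of eval is forbidden')]
```

Real-world counterparts of such rule sets range from in-house guides to language-specific standards whose use is mandated in safety-critical embedded development.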