Data integrity
A property whereby data has not been altered in an unauthorized manner since it was created, transmitted, or stored. Data integrity covers data in storage, during data processing, and while in transit.
Data key
A cryptographic key used to cryptographically process data (e.g., encrypt, decrypt, and authenticate).
Data latency
A measure of the currency of security-related data or information. Data latency refers to the time between when information is collected and when it is used. It allows an organization to respond to “where the threat or vulnerability is and where it is headed,” instead of “where it was.” When responding to threats and/or vulnerabilities, this is an important data point that shortens a risk decision cycle.
Data level
Three levels of data are possible: Level 1 is classified data. Level 2 is unclassified data requiring special protection, for example, Privacy Act data, For Official Use Only material, and technical documents restricted to limited distribution. Level 3 is all other unclassified data.
Data link layer
The layer of the ISO/OSI reference model that handles communications on the physical network components; security services at this layer protect data as it crosses each physical link.
Data link layer protocols
Data link layer protocols provide (1) error control to retransmit damaged or lost frames and (2) flow control to prevent a fast sender from overwhelming a slow receiver. The sliding window mechanism is used to integrate error control and flow control. At the data link layer, various framing methods are used, including character count, byte stuffing, and bit stuffing (sketched below). Examples of data link layer protocols that use sliding windows include bit-oriented protocols such as SDLC, HDLC, ADCCP, and LAPB (Tanenbaum).
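As an illustration of the framing idea, the following minimal Python sketch shows bit stuffing: after five consecutive 1 bits the sender inserts a 0 so an HDLC-style flag pattern (01111110) cannot appear inside a frame, and the receiver removes the inserted bits. It is a toy example, not taken from any particular protocol implementation.

    # Bit stuffing as used by bit-oriented data link protocols (e.g., HDLC):
    # insert a 0 after five consecutive 1 bits; the receiver removes them.
    def bit_stuff(bits):
        out, run = [], 0
        for b in bits:
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 5:              # five 1s in a row: stuff a 0
                out.append(0)
                run = 0
        return out

    def bit_unstuff(bits):
        out, run, skip = [], 0, False
        for b in bits:
            if skip:                  # drop the stuffed 0
                skip, run = False, 0
                continue
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 5:
                skip = True
        return out

    payload = [0, 1, 1, 1, 1, 1, 1, 0, 1]
    assert bit_unstuff(bit_stuff(payload)) == payload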
Data management
Providing or controlling access to data stored in a computer and the use of input/output devices.
Data mart
A data mart is a subset of a data warehouse; its goal is to make data available to more decision makers.
Data minimization
A generalization of the principle of variable minimization, in which the standardized parts of a message or data are replaced by a much shorter code, thereby reducing the risk of erroneous actions or improper use.
Data mining
Data mining is the process of posing a series of queries to extract information from a database or data warehouse.
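A minimal illustration of the idea, using Python's built-in sqlite3 module; the orders table, its columns, and the queries are invented for the example and stand in for a much larger database or warehouse.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [("acme", "east", 120.0), ("acme", "east", 80.0),
                      ("zenith", "west", 300.0)])

    # Query 1: total spend per customer.
    for row in conn.execute(
            "SELECT customer, SUM(amount) FROM orders GROUP BY customer"):
        print(row)

    # Query 2, refining the first result: high-value customers by region.
    for row in conn.execute(
            "SELECT region, customer, SUM(amount) AS total FROM orders "
            "GROUP BY region, customer HAVING total > 100"):
        print(row)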
Data origin authentication
The corroboration that the source of data received is as claimed.
Data owner
The authority, individual, or organization who has original responsibility for the data by management directive or other mandate. Compare with data custodian.
Data path
The physical or logical route over which data passes. Note that a physical data path may be shared by multiple logical data paths.
Data privacy
Restrictions designed to prevent unauthorized people from gaining access to sensitive data and information stored on paper or magnetic media, and to prevent the interception of data.
Data/record retention
Retention periods for all data records, paper, and electronic files should be defined to facilitate routine backup, periodic purging (deletion), and archiving of records. This practice protects the organization from failure to comply with external requirements and management guidelines and helps it maximize the use of storage space and magnetic media (e.g., tapes, disks, cassettes, and cartridges).
Data reengineering
(1) A system-level process that purifies data definitions and values. This process establishes meaningful and nonredundant data definitions and valid and consistent data values. (2) It is used to improve the quality of data within a computer system. It examines and alters the data definitions, values, and the use of data. Data definitions and flows are tracked through the system. This process reveals hidden data models. Data names and definitions are examined and made consistent. Hard-coded parameters that are subject to change may be removed. This process is important because data problems are often deeply rooted within computer systems.
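The following hypothetical sketch shows one small data reengineering step: inconsistent field names and free-form values for the same data element are consolidated into a single, consistent definition. The field names and code table are invented for illustration.

    # Map inconsistent free-form values to one standard code.
    STATE_CODES = {"calif.": "CA", "california": "CA", "ca": "CA",
                   "n.y.": "NY", "new york": "NY"}

    def clean_state(value):
        """Normalize a free-form state value to a standard two-letter code."""
        return STATE_CODES.get(value.strip().lower(), value.upper())

    records = [{"cust_state": "Calif."}, {"state_cd": "new york"}]
    for r in records:
        # Consolidate two inconsistent field names into one definition.
        raw = r.pop("cust_state", None) or r.pop("state_cd", None)
        r["state"] = clean_state(raw)
    print(records)   # [{'state': 'CA'}, {'state': 'NY'}]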
Data regrade
Data is regraded when information is transferred from a high-level network to a lower-level network and its users. Automated techniques such as processing, filtering, and blocking are used during data regrading.
Data release
The process of returning all unused disk space to the system when a data set is closed at the end of processing.
Data remanence
The residual data that may be left over on a storage medium after it has been erased.
Data sanitization
The process of removing information from media such that information recovery is not possible; this includes removing all labels, markings, and activity logs. Sanitization also refers to changing content so that it meets the sensitivity-level requirements of the network to which the information is being sent, using automated techniques such as processing, filtering, and blocking (a filtering sketch follows).
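A hypothetical sketch of the filtering and blocking idea: each field carries a sensitivity label, and fields above the target network's level are dropped before release. The labels and record layout are invented for illustration.

    SENSITIVITY = {"public": 0, "internal": 1, "secret": 2}

    def sanitize(record, target_level):
        """Return a copy of record with fields above target_level removed."""
        limit = SENSITIVITY[target_level]
        return {field: value
                for field, (value, label) in record.items()
                if SENSITIVITY[label] <= limit}

    record = {"title":   ("Quarterly report", "public"),
              "author":  ("J. Smith", "internal"),
              "figures": ("detailed numbers", "secret")}
    print(sanitize(record, "internal"))   # the "secret" field is blocked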
Data security
The protection of data from unauthorized (accidental or intentional) modification, destruction, or disclosure.
Data storage methods
(1) Primary storage is the main general-purpose storage region directly accessed by the microprocessor. This storage is called random access memory (RAM), a semiconductor-based memory that can be read from and written to by the CPU or other hardware devices; its storage locations can be accessed in any order. The term RAM generally indicates volatile memory that can be written to as well as read and that loses its contents when the power is turned off. (2) Secondary storage is the space available on disks and tapes, sometimes called backup storage. (3) Real storage is the amount of RAM in a system, as distinguished from virtual memory; it is also called physical memory or physical storage. (4) Virtual memory appears to an application program to be larger and more uniform than the physical memory actually present; it may be partially simulated by secondary storage such as a hard disk. Application programs access memory through virtual addresses, which are translated by special hardware and software into physical addresses (see the sketch below). Virtual memory is also called disk memory.
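The address translation mentioned in (4) can be sketched as a toy page-table lookup; the page size, table contents, and addresses below are assumptions chosen only to show the mechanism, not a model of any real memory management unit.

    PAGE_SIZE = 4096                      # assumed 4 KiB pages

    page_table = {0: 7, 1: 3, 2: 9}       # virtual page -> physical frame

    def translate(virtual_address):
        page, offset = divmod(virtual_address, PAGE_SIZE)
        if page not in page_table:        # page not in RAM: a real system
            raise LookupError("page fault")   # would fault and read from disk
        return page_table[page] * PAGE_SIZE + offset

    print(hex(translate(0x1ABC)))         # virtual page 1 -> frame 3 -> 0x3abc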
Data warehouse
A data warehouse facilitates information retrieval and data analysis as it stores pre-computed, historical, descriptive, and numerical data.
Database administrator (DBA)
A person responsible for the day-to-day control and monitoring of the databases, including data warehouse and data mining activities. The DBA deals with the physical design of the database while the data administrator deals with the logical design.
Database server
A repository for event information recorded by sensors, agents, or management servers.
Deactivated state
The cryptographic key life cycle state in which a key is not to be used to apply cryptographic protection to data. Under certain circumstances, the key may be used to process already protected data.
Decision tables
Tabular representation of the conditions, actions, and rules involved in making a decision. Decision tables provide a clear and coherent analysis of complex logical combinations and relationships and help detect logic errors. They are used in decision-intensive and computational application systems (see the sketch below).
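A minimal sketch of a decision table in code: each rule maps a combination of condition values to an action, and covering every combination is what exposes missing or contradictory rules. The conditions and actions are hypothetical.

    # Rule:      (order_ok, credit_ok) -> action
    RULES = {
        (True,  True):  "ship order",
        (True,  False): "request prepayment",
        (False, True):  "return for correction",
        (False, False): "reject order",
    }

    def decide(order_ok, credit_ok):
        return RULES[(order_ok, credit_ok)]

    # Every combination of the two conditions is covered, which is how
    # decision tables expose missing or contradictory rules (logic errors).
    assert len(RULES) == 2 ** 2
    print(decide(True, False))            # -> "request prepayment"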
Decision trees
Graphic representation of the conditions, actions, and rules of decision making. Decision trees are used in application systems to develop plans that reduce risks and exposures; they use probabilities to calculate expected outcomes (see the sketch below).
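A small sketch of evaluating a decision tree by expected value: chance nodes weight each outcome by its probability, and the decision node selects the alternative with the best expected result. All probabilities and amounts are hypothetical.

    # Each alternative lists its chance branches as (probability, payoff).
    alternatives = {
        "buy insurance": [(0.95, -1_000), (0.05, -2_000)],   # premium, plus deductible if a loss occurs
        "self-insure":   [(0.95, 0),      (0.05, -15_000)],  # no premium, full loss if it occurs
    }

    def expected_value(outcomes):
        return sum(p * value for p, value in outcomes)

    best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
    for name, outcomes in alternatives.items():
        print(name, expected_value(outcomes))
    print("choose:", best)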