
ISC2 CISSP® (EN) Export

  1. Access : Opportunity to make use of an information system (IS) resource.
  2. Access control : Limiting access to information system resources only to authorized users, programs, processes, or other systems.
  3. Access control list (ACL) : Mechanism implementing discretionary and/or mandatory access control between subjects and objects.
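As an illustrative aside (not part of the official glossary), the sketch below shows a minimal discretionary ACL check in Python; the object names, subjects, and permission strings are hypothetical.

    # Minimal ACL sketch: each protected object maps subjects to the access types they hold.
    acl = {
        "payroll.db": {"alice": {"read", "write"}, "bob": {"read"}},
    }

    def is_authorized(subject, obj, access_type):
        """Return True only if the subject appears on the object's ACL with that access type."""
        return access_type in acl.get(obj, {}).get(subject, set())

    print(is_authorized("bob", "payroll.db", "write"))    # False: bob holds read only
    print(is_authorized("alice", "payroll.db", "write"))  # True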
  4. Access control mechanism : Security safeguard designed to detect and deny unauthorized access and permit authorized access in an information system.
  5. Access level : Hierarchical portion of the security level used to identify the sensitivity of information system data and the clearance or authorization of users. Access level, in conjunction with the nonhierarchical categories, forms the sensitivity label of an object. (See category.)
  6. Access list : (IS) Compilation of users, programs, or processes and the access levels and types to which each is authorized. (COMSEC) Roster of individuals authorized admittance to a controlled area.
  9. Access profile : Associates each user with a list of protected objects the user may access.
  10. Access type : Privilege to perform action on an object. Read, write, execute, append, modify, delete, and create are examples of access types. (See write.)
  11. Accountability : (IS) Process of tracing information system activities to a responsible source. (COMSEC) Principle that an individual is entrusted to safeguard and control equipment, keying material, and information and is answerable to proper authority for the loss or misuse of that equipment or information.
  14. Accreditation : Formal declaration by a Designated Accrediting Authority (DAA) that an information system is approved to operate at an acceptable level of risk, based on the implementation of an approved set of technical, managerial, and procedural safeguards. (See security safeguards.)
  15. Accrediting authority : Synonymous with Designated Accrediting Authority (DAA).
  16. Adequate security : Security commensurate with the risk and magnitude of harm resulting from the loss, misuse, or unauthorized access to or modification of information. This includes assuring that information systems operate effectively and provide appropriate confidentiality, integrity, and availability, through the use of cost-effective management, personnel, operational, and technical controls. (OMB Circular A-130)
  17. Advanced Encryption Standard (AES) : FIPS approved cryptographic algorithm that is a symmetric block cipher using cryptographic key sizes of 128, 192, and 256 bits to encrypt and decrypt data in blocks of 128 bits.
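A minimal encrypt/decrypt sketch with a 128-bit AES key in an authenticated mode (AES-GCM), assuming the third-party Python "cryptography" package; this is an illustration, not the FIPS reference implementation.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)   # 128-bit AES key (192 and 256 are also allowed)
    nonce = os.urandom(12)                      # must be unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, b"classified payload", None)
    plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
    assert plaintext == b"classified payload"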
  18. Advisory : Notification of significant new trends or developments regarding the threat to the information system of an organization. This notification may include analytical insights into trends, intentions, technologies, or tactics of an adversary targeting information systems.
  19. Alert : Notification that a specific attack has been directed at the information system of an organization.
  20. Application : Software program that performs a specific function directly for a user and can be executed without access to system control, monitoring, or administrative privileges.
  21. Assurance : Measure of confidence that the security features, practices, procedures, and architecture of an information system accurately mediate and enforce the security policy.
  22. Attack : Attempt to gain unauthorized access to an information system's services, resources, or information, or the attempt to compromise an information system's integrity, availability, or confidentiality.
  23. Audit : Independent review and examination of records and activities to assess the adequacy of system controls, to ensure compliance with established policies and operational procedures, and to recommend necessary changes in controls, policies, or procedures.
  24. Audit trail : Chronological record of system activities to enable the reconstruction and examination of the sequence of events and/or changes in an event.
  25. Authenticate : To verify the identity of a user, user device, or other entity, or the integrity of data stored, transmitted, or otherwise exposed to unauthorized modification in an information system, or to establish the validity of a transmission.
  26. Authentication : Security measure designed to establish the validity of a transmission, message, or originator, or a means of verifying an individual's authorization to receive specific categories of information.
  27. Authentication system : Cryptosystem or process used for authentication.
  28. Authenticator : Means used to confirm the identity of a station, originator, or individual.
  29. Authorization : Access privileges granted to a user, program, or process.
  30. Authorized vendor : Manufacturer of INFOSEC equipment authorized to produce quantities in excess of contractual requirements for direct sale to eligible buyers. Eligible buyers are typically U.S. Government organizations or U.S. Government contractors.
  31. Authorized Vendor Program (AVP) : Program in which a vendor, producing an INFOSEC product under contract to NSA, is authorized to produce that product in numbers exceeding the contracted requirements for direct marketing and sale to eligible buyers. Eligible buyers are typically U.S. Government organizations or U.S. Government contractors. Products approved for marketing and sale through the AVP are placed on the Endorsed Cryptographic Products List (ECPL).
  32. Availability : "Ensuring timely and reliable access to and use of information." (44 USC Sec. 3542)
  33. Back door : Hidden software or hardware mechanism used to circumvent security controls. Synonymous with trap door.
  34. Backup : Copy of files and programs made to facilitate recovery, if necessary.
  35. Banner : Display on an information system that sets parameters for system or data use.
  36. Bell-LaPadula : A formal state transition model of computer security policy that describes a set of access control rules using security labels on objects and clearances for subjects. It was developed by David E. Bell and Leonard J. LaPadula and addresses only the confidentiality security objective.
  37. Benign : Condition of cryptographic data that cannot be compromised by human access.
  38. Benign environment : Non-hostile environment that may be protected from external hostile elements by physical, personnel, and procedural security countermeasures.
  39. Biba : A formal state transition access control security model that focuses on data integrity in an information system. The Biba integrity model has three goals: prevent data modification by unauthorized subjects, prevent unauthorized data modification by authorized subjects, and maintain internal and external consistency. It was defined by Kenneth J. Biba (a MITRE alumnus).
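A minimal Python sketch (illustrative only, with hypothetical labels) contrasting the Bell-LaPadula confidentiality rules with the Biba integrity rules; higher numbers mean more sensitive or more trusted.

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def blp_allows(subject_level, object_level, op):
        # Simple security property: no read up.  *-property: no write down.
        if op == "read":
            return LEVELS[subject_level] >= LEVELS[object_level]
        if op == "write":
            return LEVELS[subject_level] <= LEVELS[object_level]
        return False

    def biba_allows(subject_level, object_level, op):
        # Biba inverts the rules: no read down, no write up.
        if op == "read":
            return LEVELS[subject_level] <= LEVELS[object_level]
        if op == "write":
            return LEVELS[subject_level] >= LEVELS[object_level]
        return False

    print(blp_allows("SECRET", "TOP SECRET", "read"))   # False: reading up is denied
    print(biba_allows("SECRET", "TOP SECRET", "read"))  # True: reading up preserves integrity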
  40. Binding : Process of associating a specific communications terminal with a specific cryptographic key or associating two related elements of information.
  41. Biometrics : Automated methods of authenticating or verifying an individual based upon a physical or behavioral characteristic.
  42. Bit error rate : Ratio between the number of bits incorrectly received and the total number of bits transmitted in a telecommunications system.
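For example, the ratio can be computed directly from transmission counts (the numbers below are hypothetical):

    bits_in_error = 27
    bits_transmitted = 1_000_000
    ber = bits_in_error / bits_transmitted
    print(f"BER = {ber:.1e}")   # BER = 2.7e-05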
  43. BLACK : Designation applied to information systems, and to associated areas, circuits, components, and equipment, in which national security information is encrypted or is not processed.
  44. Boundary : Software, hardware, or physical barrier that limits access to a system or part of a system.
  45. Browsing : Act of searching through information system storage to locate or acquire information, without necessarily knowing the existence or format of information being sought.
  46. Bulk encryption : Simultaneous encryption of all channels of a multichannel telecommunications link.
  47. Call back : Procedure for identifying and authenticating a remote information system terminal, whereby the host system disconnects the terminal and reestablishes contact. Synonymous with dial back.
  48. Central office : The physical building used to house inside plant equipment, including the telephone switches that connect telephone calls and relay the speech information.
  49. Certificate : Digitally signed document that binds a public key with an identity. The certificate contains, at a minimum, the identity of the issuing Certification Authority, the user identification information, and the user's public key.
  50. Certificate management : Process whereby certificates (as defined above) are generated, stored, protected, transferred, loaded, used, and destroyed.
  51. Certificate revocation list (CRL) : List of invalid certificates (as defined above) that have been revoked by the issuer.
  52. Certification : Comprehensive evaluation of the technical and nontechnical security safeguards of an information system to support the accreditation process that establishes the extent to which a particular design and implementation meets a set of specified security requirements.
  53. Certification authority (CA) : (C&A) Official responsible for performing the comprehensive evaluation of the security features of an information system and determining the degree to which it meets its security requirements. (PKI) Trusted entity authorized to create, sign, and issue public key certificates. By digitally signing each certificate issued, the user's identity is certified, and the association of the certified identity with a public key is validated.
  56. Certification package : Product of the certification effort documenting the detailed results of the certification activities.
  57. Certification test and evaluation (CT&E) : Software and hardware security tests conducted during development of an information system.
  58. Certified TEMPEST technical authority (CTTA) : An experienced, technically qualified U.S. Government employee who has met established certification requirements in accordance with CNSS (NSTISSC)-approved criteria and has been appointed by a U.S. Government Department or Agency to fulfill CTTA responsibilities.
  59. Certifier : Individual responsible for making a technical judgment of the system's compliance with stated requirements, identifying and assessing the risks associated with operating the system, coordinating the certification activities, and consolidating the final certification and accreditation packages.
  60. Challenge and reply authentication : Prearranged procedure in which a subject requests authentication of another and the latter establishes validity with a correct reply.
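A standard-library Python sketch of one common way to realize challenge and reply authentication, using a prearranged shared key and an HMAC; the names and flow are illustrative, not a prescribed protocol.

    import hmac, hashlib, secrets

    shared_key = secrets.token_bytes(32)          # prearranged between both parties

    challenge = secrets.token_bytes(16)           # verifier sends a fresh random challenge
    reply = hmac.new(shared_key, challenge, hashlib.sha256).digest()  # claimant computes the reply

    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    print(hmac.compare_digest(reply, expected))   # True: the correct reply establishes validity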
  61. Checksum : Value computed on data to detect error or manipulation during transmission. (See hash total.)
  62. Check word : Cipher text generated by cryptographic logic to detect failures in cryptography.
  63. Cipher : Any cryptographic system in which arbitrary symbols or groups of symbols represent units of plain text, or in which units of plain text are rearranged, or both.
  64. Cipher text : Enciphered information.
  65. Clark-Wilson : A formal security model to preserve information integrity in an information system. The model focuses on "well-formed" transactions using a set of enforcement and certification rules. It was developed by David D. Clark and David R. Wilson.
  66. Classified information : Information that has been determined pursuant to Executive Order 12958 or any predecessor Order, or by the Atomic Energy Act of 1954, as amended, to require protection against unauthorized disclosure and is marked to indicate its classified status.
  67. Classified information spillage : Security incident that occurs whenever classified data is spilled either onto an unclassified information system or to an information system with a lower level of classification.
  68. Clearance : Formal security determination by an authorized adjudicative office that an individual is authorized access, on a need to know basis, to a specific level of collateral classified information (TOP SECRET, SECRET, CONFIDENTIAL).
  69. Client : Individual or process acting on behalf of an individual who makes requests of a guard or dedicated server. The client's requests to the guard or dedicated server can involve data transfer to, from, or through the guard or dedicated server.
  70. Closed security environment : Environment providing sufficient assurance that applications and equipment are protected against the introduction of malicious logic during an information system life cycle. Closed security is based upon a system's developers, operators, and maintenance personnel having sufficient clearances, authorization, and configuration control.
  71. Confidentiality : "Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information." (44 USC Sec. 3542)
  72. Cold site : An inexpensive type of backup site with no IT infrastructure (e.g., computing and network hardware) in place.
  73. Cold start : Procedure for initially keying crypto-equipment.
  74. Collaborative computing : Applications and technology (e.g., whiteboarding, group conferencing) that allow two or more individuals to share information in real time in an inter- or intra-enterprise environment.
  75. Commercial COMSEC Evaluation Program (CCEP) : Relationship between NSA and industry in which NSA provides the COMSEC expertise (i.e., standards, algorithms, evaluations, and guidance) and industry provides design, development, and production capabilities to produce a type 1 or type 2 product. Products developed under the CCEP may include modules, subsystems, equipment, systems, and ancillary devices.
  76. Common Criteria : Provides a comprehensive, rigorous method for specifying security function and assurance requirements for products and systems. (International Standard ISO/IEC 15408, Common Criteria for Information Technology Security Evaluation)
  77. Communications deception : Deliberate transmission, retransmission, or alteration of communications to mislead an adversary's interpretation of the communications. (See imitative communications deception and manipulative communications deception.)
  78. Communications profile : Analytic model of communications associated with an organization or activity. The model is prepared from a systematic examination of communications content and patterns, the functions they reflect, and the communications security measures applied.
  79. Communications security (COMSEC) : (COMSEC) Measures and controls taken to deny unauthorized individuals information derived from telecommunications and to ensure the authenticity of such telecommunications. Communications security includes cryptosecurity, transmission security, emission security, and physical security of COMSEC material.
  80. Community risk : Probability that a particular vulnerability will be exploited within an interacting population and adversely impact some members of that population.
  81. Compartmentalization : A nonhierarchical grouping of sensitive information used to control access to data more finely than with hierarchical security classification alone.
  82. Compartmented mode : Mode of operation wherein each user with direct or indirect access to a system, its peripherals, remote terminals, or remote hosts has all of the following: (a) valid security clearance for the most restricted information processed in the system; (b) formal access approval and signed nondisclosure agreements for that information to which a user is to have access; and (c) valid need-to-know for information to which a user is to have access.
  83. Compromise : Type of incident where information is disclosed to unauthorized individuals or a violation of the security policy of a system in which unauthorized intentional or unintentional disclosure, modification, destruction, or loss of an object may have occurred.
  84. Compromising emanations : Unintentional signals that, if intercepted and analyzed, would disclose the information transmitted, received, handled, or otherwise processed by information systems equipment. (See TEMPEST.)
  85. Computer abuse : Intentional or reckless misuse, alteration, disruption, or destruction of information processing resources.
  86. Computer cryptography : Use of a crypto-algorithm program by a computer to authenticate or encrypt/decrypt information.
  87. Computer security : Measures and controls that ensure confidentiality, integrity, and availability of information system assets including hardware, software, firmware, and information being processed, stored, and communicated.
  88. Computer security incident : See incident.
  89. Computer security subsystem : Hardware/software designed to provide computer security features in a larger system environment.
  Computing environment : Workstation or server (host) and its operating system, peripherals, and applications.
  90. COMSEC account : Administrative entity, identified by an account number, used to maintain accountability, custody, and control of COMSEC material.
  91. COMSEC assembly : Group of parts, elements, subassemblies, or circuits that are removable items of COMSEC equipment.
  92. COMSEC boundary : Definable perimeter encompassing all hardware, firmware, and software components performing critical COMSEC functions, such as key generation, handling, and storage.
  93. COMSEC control program : Computer instructions or routines controlling or affecting the externally performed functions of key generation, key distribution, message encryption/decryption, or authentication.
  94. COMSEC custodian : Individual designated by proper authority to be responsible for the receipt, transfer, accounting, safeguarding, and destruction of COMSEC material assigned to a COMSEC account.
  95. COMSEC element : Removable item of COMSEC equipment, assembly, or subassembly; normally consisting of a single piece or group of replaceable parts.
  96. COMSEC equipment : Equipment designed to provide security to telecommunications by converting information to a form unintelligible to an unauthorized interceptor and, subsequently, by reconverting such information to its original form for authorized recipients; also, equipment designed specifically to aid in, or as an essential element of, the conversion process. COMSEC equipment includes crypto-equipment, crypto-ancillary equipment, crypto-production equipment, and authentication equipment.
  97. COMSEC facility : Authorized and approved space used for generating, storing, repairing, or using COMSEC material.
  98. COMSEC incident : See incident.
  99. COMSEC manager : Individual who manages the COMSEC resources of an organization.
  100. COMSEC material : Item designed to secure or authenticate telecommunications. COMSEC material includes, but is not limited to key, equipment, devices, documents, firmware, or software that embodies or describes cryptographic logic and other items that perform COMSEC functions.
  101. COMSEC Material Control System (CMCS) : Logistics and accounting system through which COMSEC material marked "CRYPTO" is distributed, controlled, and safeguarded. Included are the COMSEC central offices of record, crypto-logistic depots, and COMSEC accounts. COMSEC material other than key may be handled through the CMCS.
  102. COMSEC module : Removable component that performs COMSEC functions in a telecommunications equipment or system.
  103. COMSEC monitoring : Act of listening to, copying, or recording transmissions of one's own official telecommunications to analyze the degree of security.
  104. COMSEC training : Teaching of skills relating to COMSEC accounting, use of COMSEC aids, or installation, use, maintenance, and repair of COMSEC equipment.
  105. Concept of operations (CONOP) : Document detailing the method, act, process, or effect of using an information system.
  106. Confidentiality : Assurance that information is not disclosed to unauthorized individuals, processes, or devices.
  107. Configuration control : Process of controlling modifications to hardware, firmware, software, and documentation to ensure the information system is protected against improper modifications prior to, during, and after system implementation.
  108. Configuration management : Management of security features and assurances through control of changes made to hardware, software, firmware, documentation, test, test fixtures, and test documentation throughout the life cycle of an information system.
  109. Contamination : Type of incident involving the introduction of data of one security classification or security category into data of a lower security classification or different security category.
  110. Contingency key : Key held for use under specific operational conditions or in support of specific contingency plans. (See reserve keying material.)
  111. Continuity of operations plan (COOP) : Plan for continuing an organization's (usually a headquarters element's) essential functions at an alternate site and performing those functions for the duration of an event with little or no loss of continuity before returning to normal operations.
  112. Controlled access area : Physical area (e.g., building, room, etc.) to which only authorized personnel are granted unrestricted access. All other personnel are either escorted by authorized personnel or are under continuous surveillance.
  113. Controlled access protection : Minimum set of security functionality that enforces access control on individual users and makes them accountable for their actions through login procedures, auditing of security-relevant events, and resource isolation.
  114. Controlled cryptographic item (CCI) : Secure telecommunications or information handling equipment, or associated cryptographic component, that is unclassified but governed by a special set of control requirements. Such items are marked "CONTROLLED CRYPTOGRAPHIC ITEM" or, where space is limited, "CCI."
  115. Controlled interface : Mechanism that facilitates the adjudication of different interconnected system security policies (e.g., controlling the flow of information into or out of an interconnected system).
  116. Controlled space : Three-dimensional space surrounding information system equipment, within which unauthorized individuals are denied unrestricted access and are either escorted by authorized individuals or are under continuous physical or electronic surveillance.
  117. Controlling authority : Official responsible for directing the operation of a cryptonet and for managing the operational use and control of keying material assigned to the cryptonet.
  118. Countermeasure : Action, device, procedure, technique, or other measure that reduces the vulnerability of an information system.
  119. Covert channel : Unintended and/or unauthorized communications path that can be used to transfer information in a manner that violates an information system security policy. (See overt channel and exploitable channel.)
  120. Covert channel analysis : Determination of the extent to which the security policy model and subsequent lower-level program descriptions may allow unauthorized access to information.
  121. Covert storage channel : Covert channel involving the direct or indirect writing to a storage location by one process and the direct or indirect reading of the storage location by another process. Covert storage channels typically involve a finite resource (e.g., sectors on a disk) that is shared by two subjects at different security levels.
  122. Covert timing channel : Covert channel in which one process signals information to another process by modulating its own use of system resources (e.g., central processing unit time) in such a way that this manipulation affects the real response time observed by the second process.
  123. Credentials : Information, passed from one entity to another, used to establish the sending entity's access rights.
  124. Critical infrastructures : Systems and assets, whether physical or virtual, so vital to the U.S. that the incapacity or destruction of such systems and assets would have a debilitating impact on security, national economic security, national public health or safety, or any combination of those matters. [Critical Infrastructures Protection Act of 2001, 42 U.S.C. 5195c(e)]
  125. Cross domain solution : Information assurance solution that provides the ability to access or transfer information between two or more security domains. (See multilevel security.)
  126. Cryptanalysis : Operations performed in converting encrypted messages to plain text without initial knowledge of the crypto-algorithm and/or key employed in the encryption.
  127. CRYPTO : Marking or designator identifying COMSEC keying material used to secure or authenticate telecommunications carrying classified or sensitive U.S. Government or U.S. Government-derived information.
  128. Crypto-alarm : Circuit or device that detects failures or aberrations in the logic or operation of crypto-equipment. Crypto-alarm may inhibit transmission or may provide a visible and/or audible alarm.
  129. Crypto-algorithm : Well-defined procedure or sequence of rules or steps, or a series of mathematical equations used to describe cryptographic processes such as encryption/decryption, key generation, authentication, signatures, etc.
  130. Crypto-ancillary equipment : Equipment designed specifically to facilitate efficient or reliable operation of crypto-equipment, without performing cryptographic functions itself.
  131. Crypto-equipment : Equipment that embodies a cryptographic logic.
  132. Cryptographic : Pertaining to, or concerned with, cryptography.
  133. Cryptographic component : Hardware or firmware embodiment of the cryptographic logic. A cryptographic component may be a modular assembly, a printed wiring assembly, a microcircuit, or a combination of these items.
  134. Cryptographic initialization : Function used to set the state of a cryptographic logic prior to key generation, encryption, or other operating mode.
  135. Cryptographic logic : The embodiment of one (or more) cryptoalgorithm(s) along with alarms, checks, and other processes essential to effective and secure performance of the cryptographic process(es).
  136. Cryptographic randomization : Function that randomly determines the transmit state of a cryptographic logic.
  137. Cryptography : Art or science concerning the principles, means, and methods for rendering plain information unintelligible and for restoring encrypted information to intelligible form.
  138. Crypto-ignition key (CIK) : Device or electronic key used to unlock the secure mode of crypto-equipment.
  139. Cryptology : Field encompassing both cryptography and cryptanalysis.
  140. Crypto-period : Time span during which each key setting remains in effect.
  141. Crypto-security : Component of COMSEC resulting from the provision of technically sound cryptosystems and their proper use.
  142. Crypto-synchronization : Process by which a receiving decrypting cryptographic logic attains the same internal state as the transmitting encrypting logic.
  143. Cryptosystem : Associated INFOSEC items interacting to provide a single means of encryption or decryption.
  144. Cryptosystem analysis : Process of establishing the exploitability of a cryptosystem, normally by reviewing transmitted traffic protected or secured by the system under study.
  145. Cryptosystem evaluation : Process of determining vulnerabilities of a cryptosystem.
  146. Cryptosystem review : Examination of a cryptosystem by the controlling authority ensuring its adequacy of design and content, continued need, and proper distribution.
  147. Cryptosystem survey : Management technique in which actual holders of a cryptosystem express opinions on the system's suitability and provide usage information for technical evaluations.
  148. Cyclic redundancy check : Error checking mechanism that checks data integrity by computing a polynomial algorithm based checksum.
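An illustrative Python sketch using the standard library's CRC-32, a common polynomial-based checksum; the record contents are hypothetical.

    import zlib

    data = b"audit record 0001"
    crc = zlib.crc32(data)
    print(hex(crc))

    tampered = b"audit record 0002"
    print(zlib.crc32(tampered) == crc)   # False: the single changed byte alters the CRC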
  149. Data aggregation : Compilation of unclassified individual data systems and data elements that could result in the totality of the information being classified or of beneficial use to an adversary.
  150. Data Encryption Standard (DES) : Cryptographic algorithm, designed for the protection of unclassified data and published by the National Institute of Standards and Technology (NIST) in Federal Information Processing Standard (FIPS) Publication 46. (FIPS 46-3 was withdrawn 19 May 2005; see Triple DES and CNSS Advisory IA/02-04, revised March 2005.)
  151. Data flow control : Synonymous with information flow control.
  152. Data integrity : Condition existing when data is unchanged from its source and has not been accidentally or maliciously modified, altered, or destroyed.
  153. Data origin authentication : Corroborating the source of data is as claimed.
  154. Data security : Protection of data from unauthorized (accidental or intentional) modification, destruction, or disclosure.
  155. Data transfer device (DTD) : Fill device designed to securely store, transport, and transfer electronically both COMSEC and TRANSEC key, designed to be backward compatible with the previous generation of COMSEC common fill devices, and programmable to support modern mission systems.
  156. Decertification : Revocation of the certification of an information system item or equipment for cause.
  157. Decipher : Convert enciphered text to plain text by means of a cryptographic system.
  158. Decode : Convert encoded text to plain text by means of a code.
  159. Decrypt : Generic term encompassing decode and decipher.
  160. Dedicated mode : information system security mode of operation wherein each user, with direct or indirect access to the system, its peripherals, remote terminals, or remote hosts, has all of the following: a. valid security clearance for all information within the system; b. formal access approval and signed nondisclosure agreements for all the information stored and/or processed (including all compartments, sub-compartments, and/or special access programs); and c. valid need-to-know for all information contained within the information system. When in the dedicated security mode, a system is specifically and exclusively dedicated to and controlled for the processing of one particular type or classification of information, either for full-time operation or for a specified period of time.
  161. Default classification : Temporary classification reflecting the highest classification being processed in an information system. Default classification is included in the caution statement affixed to an object.
  162. Defense-in-depth : IA strategy integrating people, technology, and operations capabilities to establish variable barriers across multiple layers and dimensions of networks. Synonymous with security-in-depth.
  163. Degaussing : Procedure that reduces the magnetic flux to virtual zero by applying a reverse magnetizing field. Also called demagnetizing.
  164. Delegated development program : INFOSEC program in which the Director, NSA, delegates, on a case by case basis, the development and/or production of an entire telecommunications product, including the INFOSEC portion, to a lead department or agency.
  165. Denial of service : Any action or series of actions that prevents any part of an information system from functioning.
  166. Descriptive top-level specification : Top-level specification written in a natural language (e.g., English), an informal design notation, or a combination of the two. Descriptive top-level specification, required for a class B2 and B3 (as defined in the Orange Book, Department of Defense Trusted Computer System Evaluation Criteria, DoD 5200.28-STD) information system, completely and accurately describes a trusted computing base. (See formal top-level specification.)
  167. Designated approval authority (DAA) : Official with the authority to formally assume responsibility for operating a system at an acceptable level of risk. This term is synonymous with authorizing official, designated accrediting authority, and delegated accrediting authority.
  168. Dial back : Synonymous with call back.
  169. Digital signature : Cryptographic process used to assure message originator authenticity, integrity, and non-repudiation. Synonymous with electronic signature.
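A minimal sign/verify sketch with Ed25519, assuming the third-party Python "cryptography" package; the message and key handling are illustrative only.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"release order #42"
    signature = private_key.sign(message)        # originator signs

    try:
        public_key.verify(signature, message)    # recipient checks authenticity and integrity
        print("signature valid")
    except InvalidSignature:
        print("signature invalid")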
  170. Digital signature algorithm : Procedure that appends data to, or performs a cryptographic transformation of, a data unit. The appended data or cryptographic transformation allows a recipient of the data unit to prove its source and integrity and protects against forgery, e.g., by the recipient.
  171. Direct shipment : Shipment of COMSEC material directly from NSA to user COMSEC accounts.
  172. Disaster recovery plan : Provides for the continuity of system operations after a disaster.
  173. Discretionary access control (DAC) : Means of restricting access to objects based on the identity and need-to-know of users and/or groups to which they belong. Controls are discretionary in the sense that a subject with a certain access permission is capable of passing that permission (directly or indirectly) to any other subject. (See mandatory access control.)
  174. Distinguished name : Globally unique identifier representing an individual's identity.
  175. DMZ (Demilitarized Zone) : Perimeter network segment that is logically between internal and external networks. Its purpose is to enforce the internal network's IA policy for external information exchange and to provide external, un-trusted sources with restricted access to releasable information while shielding the internal networks from outside attacks. A DMZ is also called a "screened subnet."
  176. Domain : System or group of systems operating under a common security policy.
  177. Electronically generated key : Key generated in a COMSEC device by introducing (either mechanically or electronically) a seed key into the device and then using the seed, together with a software algorithm stored in the device, to produce the desired key.
  178. Electronic Key Management System (EKMS) : Interoperable collection of systems being developed by services and agencies of the U.S. Government to automate the planning, ordering, generating, distributing, storing, filling, using, and destroying of electronic key and management of other types of COMSEC material.
  179. Electronic Messaging Services : Services providing interpersonal messaging capability; meeting specific functional, management, and technical requirements; and yielding a business-quality electronic mail service suitable for the conduct of official government business.
  180. Electronic security (ELSEC) : Protection resulting from measures designed to deny unauthorized individuals information derived from the interception and analysis of noncommunications electromagnetic radiations.
  181. Electronic signature : See digital signature.
  182. Embedded computer : Computer system that is an integral part of a larger system.
  183. Embedded cryptography : Cryptography engineered into an equipment or system whose basic function is not cryptographic.
  184. Embedded cryptographic system : Cryptosystem performing or controlling a function as an integral element of a larger system or subsystem.
  185. Emissions security (EMSEC) : Protection resulting from measures taken to deny unauthorized individuals information derived from intercept and analysis of compromising emanations from crypto-equipment or an information system. (See TEMPEST.)
  186. Encipher : Convert plain text to cipher text by means of a cryptographic system.
  187. Enclave : Collection of computing environments connected by one or more internal networks under the control of a single authority and security policy, including personnel and physical security.
  188. Enclave boundary : Point at which an enclave's internal network service layer connects to an external network's service layer, i.e., to another enclave or to a Wide Area Network (WAN).
  189. Encode : Convert plain text to cipher text by means of a code.
  190. Encrypt : Generic term encompassing encipher and encode.
  191. Encryption algorithm : Set of mathematically expressed rules for rendering data unintelligible by executing a series of conversions controlled by a key.
  192. End-item accounting : Accounting for all the accountable components of a COMSEC equipment configuration by a single short title.
  193. End-to-end encryption : Encryption of information at its origin and decryption at its intended destination without intermediate decryption.
  194. End-to-end security : Safeguarding information in an information system from point of origin to point of destination.
  195. Endorsed for unclassified cryptographic item (EUCI) : Unclassified cryptographic equipment that embodies a U.S. Government classified cryptographic logic and is endorsed by NSA for the protection of national security information. (See type 2 product.)
  196. Endorsement : NSA approval of a commercially developed product for safeguarding national security information.
  197. Entrapment : Deliberate planting of apparent flaws in an information system for the purpose of detecting attempted penetrations.
  198. Environment : Aggregate of external procedures, conditions, and objects affecting the development, operation, and maintenance of an information system.
  199. Erasure : Process intended to render magnetically stored information irretrievable by normal means.
  200. Evaluation Assurance Level (EAL) : Set of assurance requirements that represent a point on the Common Criteria predefined assurance scale.
  201. Event : Occurrence, not yet assessed, that may affect the performance of an information system.
  202. Executive state : One of several states in which an information system may operate, and the only one in which certain privileged instructions may be executed. Such privileged instructions cannot be executed when the system is operating in other states. Synonymous with supervisor state.
  203. Exercise key : Key used exclusively to safeguard communications transmitted over-the-air during military or organized civil training exercises.
  204. Exploitable channel : Channel that allows the violation of the security policy governing an information system and is usable or detectable by subjects external to the trusted computing base. (See covert channel.)
  205. Exposure : An information security "exposure" is a system configuration issue or a mistake in software that allows access to information or capabilities that can be used by a hacker as a stepping-stone into a system or network.
  206. Extraction resistance : Capability of crypto-equipment or secure telecommunications equipment to resist efforts to extract key.
  207. Extranet : Extension to the intranet allowing selected outside users access to portions of an organization's intranet.
  208. Fail safe : Automatic protection of programs and/or processing systems when hardware or software failure is detected.
  209. Fail soft : Selective termination of affected nonessential processing when hardware or software failure is determined to be imminent.
  210. Failure access : Type of incident in which unauthorized access to data results from hardware or software failure.
  211. Failure control : Methodology used to detect imminent hardware or software failure and provide fail safe or fail soft recovery.
  212. File protection : Aggregate of processes and procedures designed to inhibit unauthorized access, contamination, elimination, modification, or destruction of a file or any of its contents.
  213. File security : Means by which access to computer files is limited to authorized users only.
  214. Fill device : COMSEC item used to transfer or store key in electronic form or to insert key into a crypto-equipment.
  215. FIREFLY : Key management protocol based on public key cryptography.
  216. Firewall : System designed to defend against unauthorized access to or from a private network.
  217. Firmware : Program recorded in permanent or semi-permanent computer memory.
  218. Fixed COMSEC facility : COMSEC facility located in an immobile structure or aboard a ship.
  219. Flaw : Error of commission, omission, or oversight in an information system that may allow protection mechanisms to be bypassed.
  220. Flaw hypothesis methodology : System analysis and penetration technique in which the specification and documentation for an information system are analyzed to produce a list of hypothetical flaws. This list is prioritized on the basis of the estimated probability that a flaw exists, on the ease of exploiting it, and on the extent of control or compromise it would provide. The prioritized list is used to perform penetration testing of a system.
  221. Flooding : Type of incident involving insertion of a large volume of data resulting in denial of service.
  222. Formal access approval : Process for authorizing access to classified or sensitive information with specified access requirements, such as Sensitive Compartmented Information (SCI) or Privacy Data, based on the specified access requirements and a determination of the individual's security eligibility and need-to-know.
  223. Formal development methodology : Software development strategy that proves security design specifications.
  224. Formal method : Mathematical argument which verifies that the system satisfies a mathematically described security policy.
  225. Formal proof : Complete and convincing mathematical argument presenting the full logical justification for each proof step and for the truth of a theorem or set of theorems.
  226. Formal security policy : Mathematically precise statement of a security policy.
  227. Formal top-level specification : Top-level specification written in a formal mathematical language to allow theorems, showing the correspondence of the system specification to its formal requirements, to be hypothesized and formally proven.
  228. Formal verification : Process of using formal proofs to demonstrate the consistency between formal specification of a system and formal security policy model (design verification) or between formal specification and its high-level program implementation (implementation verification).
  229. Frequency hopping : Repeated switching of frequencies during radio transmission according to a specified algorithm, to minimize unauthorized interception or jamming of telecommunications.
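A simple Python sketch of the idea: both ends derive the same pseudorandom channel order from a prearranged seed, so an interceptor without the seed cannot predict the hops. The channel plan and seed are hypothetical, and a real system would use a cryptographic sequence rather than a general-purpose PRNG.

    import random

    CHANNELS_MHZ = [902.0 + 0.5 * i for i in range(16)]   # 16 hypothetical channels

    def hop_sequence(seed, hops):
        rng = random.Random(seed)                          # same seed -> same sequence at both ends
        return [rng.choice(CHANNELS_MHZ) for _ in range(hops)]

    print(hop_sequence(seed=1234, hops=8))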
  230. Front-end security filter : Security filter logically separated from the remainder of an information system to protect system integrity. Synonymous with firewall.
  231. Full maintenance : Complete diagnostic repair, modification, and overhaul of COMSEC equipment, including repair of defective assemblies by piece part replacement. (See limited maintenance.)
  232. Functional proponent : See network sponsor.
  233. Functional testing : Segment of security testing in which advertised security mechanisms of an information system are tested under operational conditions.
  234. Gateway : Interface providing compatibility between networks by converting transmission speeds, protocols, codes, or security measures.
  235. Global Information Grid : The globally interconnected, end-to-end set of information capabilities, associated processes, and personnel for collecting, processing, storing, disseminating, and managing information on demand to war fighters, policy makers, and support personnel. (DoD Directive 8100.1, 19 Sept. 2002)
  236. Guard : Mechanism limiting the exchange of information between systems.
  237. Hacker : Unauthorized user who attempts to or gains access to an information system.
  238. Handshaking procedures : Dialogue between two information systems for synchronizing, identifying, and authenticating themselves to one another.
  239. Hard copy key : Physical keying material, such as printed key lists, punched or printed key tapes, or programmable, read-only memories (PROM).
  240. Hardwired key : Permanently installed key.
  241. Hash total : Value computed on data to detect error or manipulation. (See checksum.)
  242. Hashing : Computation of a hash total.
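An illustrative standard-library Python sketch of computing a hash total over a record and recomputing it to detect manipulation; the record contents are hypothetical.

    import hashlib

    record = b"SSN=000-00-0000;grade=GS-12"
    hash_total = hashlib.sha256(record).hexdigest()

    received = b"SSN=000-00-0000;grade=GS-15"                    # manipulated in transit
    print(hashlib.sha256(received).hexdigest() == hash_total)   # False: change detected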
  243. Hash word : Memory address containing hash total.
  244. High assurance guard (HAG) : Device comprised of both hardware and software that is designed to enforce security rules during the transmission of X.400 message and X.500 directory traffic between enclaves of different classification levels (e.g., UNCLASSIFIED and SECRET).
  245. Hot site : A backup site that is a duplicate of the original data center, with full IT computing infrastructure and replicated data. It is the most expensive business continuity solution.
  246. IA architecture : Activity that aggregates the functions of developing IA operational, system, and technical architecture products for the purpose of specifying and implementing new or modified IA capabilities within the IT environment. (DoD Directive 8100.1, 19 Sept 2002)
  247. IA-enabled information technology product : Product or technology whose primary role is not security, but which provides security services as an associated feature of its intended operating capabilities. Examples include such products as security-enabled web browsers, screening routers, trusted operating systems, and security-enabled messaging systems.
  248. Identification : Process an information system uses to recognize an entity.
  249. Identity token : Smart card, metal key, or other physical object used to authenticate identity.
  250. Identity validation : Tests enabling an information system to authenticate users or resources.
  251. Imitative communications deception : Introduction of deceptive messages or signals into an adversary's telecommunications signals. (See communications deception and manipulative communications deception.)
  252. Impersonating : Form of spoofing.
  253. Implant : Electronic device or electronic equipment modification designed to gain unauthorized interception of information-bearing emanations.
  254. Inadvertent disclosure : Type of incident involving accidental exposure of information to an individual not authorized access.
  255. Incident : (IS) Assessed occurrence having actual or potentially adverse effects on an information system. (COMSEC) Occurrence that potentially jeopardizes the security of COMSEC material or the secure electrical transmission of national security information.
  256. Incomplete parameter checking : System flaw that exists when the operating system does not check all parameters fully for accuracy and consistency, thus making the system vulnerable to penetration.
  257. Indicator : Recognized action, specific, generalized, or theoretical, that an adversary might be expected to take in preparation for an attack.
  258. Individual accountability : Ability to associate positively the identity of a user with the time, method, and degree of access to an information system.
  259. Informal security policy : Natural language description, possibly supplemented by mathematical arguments, demonstrating the correspondence of the functional specification to the high-level design.
  260. Information assurance (IA) : Measures that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. These measures include providing for restoration of information systems by incorporating protection, detection, and reaction capabilities.
  261. Information assurance manager (IAM) : See information systems security manager.
  262. Information assurance officer (IAO) : See information systems security officer.
  263. Information assurance product : Product or technology whose primary purpose is to provide security services (e.g., confidentiality, authentication, integrity, access control, non-repudiation of data); correct known vulnerabilities; and/or provide layered defense against various categories of non-authorized or malicious penetrations of information systems or networks. Examples include such products as data/network encryptors, firewalls, and intrusion detection devices.
  264. Information environment : Aggregate of individuals, organizations, or systems that collect, process, or disseminate information, including the information itself.
  265. Information flow control : Procedure to ensure that information transfers within an information system are not made from a higher security level object to an object of a lower security level.
  266. Information operations (IO) : Actions taken to affect adversary information and information systems while defending one's own information and information systems.
  267. Information owner : Official with statutory or operational authority for specified information and responsibility for establishing the controls for its generation, collection, processing, dissemination, and disposal.
  268. Information security policy : Aggregate of directives, regulations, rules, and practices that prescribe how an organization manages, protects, and distributes information.
  269. Information system (IS) : Set of information resources organized for the collection, storage, processing, maintenance, use, sharing, dissemination, disposition, display, or transmission of information.
  270. Information systems security (INFOSEC) : Protection of information systems against unauthorized access to or modification of information, whether in storage, processing or transit, and against the denial of service to authorized users, including those measures necessary to detect, document, and counter such threats.
  271. Information systems security engineering (ISSE) : Process that captures and refines information protection requirements and ensures their integration into IT acquisition processes through purposeful security design or configuration.
  272. Information systems security equipment modification : Modification of any fielded hardware, firmware, software, or portion thereof, under NSA configuration control. There are three classes of modifications: mandatory (to include human safety); optional/special mission modifications; and repair actions. These classes apply to elements, subassemblies, equipment, systems, and software packages performing functions such as key generation, key distribution, message encryption, decryption, authentication, or those mechanisms necessary to satisfy security policy, labeling, identification, or accountability.
  273. Information systems security manager (ISSM) : Individual responsible for a program, organization, system, or enclave's information assurance program.
  274. Information systems security officer (ISSO) : Individual responsible to the ISSM for ensuring the appropriate operational IA posture is maintained for a system, program, or enclave.
  275. Information systems security product : Item (chip, module, assembly, or equipment), technique, or service that performs or relates to information systems security.
  276. Initialize : Setting the state of a cryptographic logic prior to key generation, encryption, or other operating mode.
  277. Inspectable space : Three-dimensional space surrounding equipment that processes classified and/or sensitive information, within which TEMPEST exploitation is not considered practical or where legal authority to identify and remove a potential TEMPEST exploitation exists. Synonymous with zone of control.
  278. Integrity : "Guarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity." (44 USC Sec. 3542)
  279. Integrity check value : Checksum capable of detecting modification of an information system.
  280. Interconnection security agreement : Written management authorization to interconnect information systems based upon acceptance of risk and implementation of established controls.
  281. Interface : Common boundary between independent systems or modules where interactions take place.
  282. Interface control document : Technical document describing interface controls and identifying the authorities and responsibilities for ensuring the operation of such controls. This document is baselined during the preliminary design review and is maintained throughout the information system lifecycle.
  283. Interim Approval To Operate (IATO) : Temporary authorization granted by a DAA for an information system to process information based on preliminary results of a security evaluation of the system.
  284. Interim Approval To Test (IATT) : Temporary authorization to test an information system in a specified operational information environment within the timeframe and under the conditions or constraints enumerated in the written authorization.
  285. Internal security controls : Hardware, firmware, or software features within an information system that restrict access to resources only to authorized subjects.
  286. Internet Protocol (IP) : Standard protocol for transmission of data from source to destinations in packet-switched communications networks and interconnected systems of such networks.
  287. IP broadcast methods : There are three delivery methods (a socket-level sketch follows the list):
  288. - Unicast: Packet is sent from a single source to a single destination.
  289. - Broadcast: Source packet is copied and sent to all the nodes on a network.
  290. - Multicast: Source packet is copied and then sent to multiple destinations on a network.
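An illustrative Python sketch of the first two delivery methods over UDP; the addresses and port are hypothetical, and it should only be run on a network you control. Multicast would additionally require joining a group address (e.g., via IP_ADD_MEMBERSHIP), which is omitted here.

    import socket

    PORT = 50007

    # Unicast: one source to one destination address.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(b"unicast hello", ("192.0.2.10", PORT))

    # Broadcast: one datagram copied to every node on the local subnet.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(b"broadcast hello", ("255.255.255.255", PORT))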
  291. Intrusion : Unauthorized act of bypassing the security mechanisms of a system.
  292. Key : Usually a sequence of random or pseudorandom bits used initially to set up and periodically change the operations performed in crypto-equipment for the purpose of encrypting or decrypting electronic signals, or for determining electronic counter countermeasures patterns, or for producing other key.
  293. Key-auto-key (KAK) : Cryptographic logic using previous key to produce key.
  294. Key distribution center (KDC) : COMSEC facility generating and distributing key in electrical form.
  295. Key-encryption-key (KEK) : Key that encrypts or decrypts other key for transmission or storage.
  296. Key exchange : Process of exchanging public keys (and other information) in order to establish secure communications.
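A minimal key exchange sketch using X25519 Diffie-Hellman, assuming the third-party Python "cryptography" package; in practice only the public keys would travel over the open channel, and the shared secret would then feed a key derivation function.

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_private = X25519PrivateKey.generate()
    bob_private = X25519PrivateKey.generate()

    # Each side sends only its public key to the other.
    alice_shared = alice_private.exchange(bob_private.public_key())
    bob_shared = bob_private.exchange(alice_private.public_key())

    assert alice_shared == bob_shared   # both derive the same secret, which is never transmitted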
  297. Key list : Printed series of key settings for a specific cryptonet. Key lists may be produced in list, pad, or printed tape format.
  298. Key management infrastructure (KMI) : Framework and services that provide the generation, production, storage, protection, distribution, control, tracking, and destruction for all cryptographic key material, symmetric keys as well as public keys and public key certificates.
  299. Key pair : Public key and its corresponding private key as used in public key cryptography.
  300. Key production key (KPK) : Key used to initialize a keystream generator for the production of other electronically generated key.
  301. Key recovery : Mechanisms and processes that allow authorized parties to retrieve the cryptographic key used for data confidentiality.
  302. Key stream : Sequence of symbols (or their electrical or mechanical equivalents) produced in a machine or auto-manual cryptosystem to combine with plain text to produce cipher text, control transmission security processes, or produce key.
  303. Key tag : Identification information associated with certain types of electronic key.
  304. Key tape : Punched or magnetic tape containing key. Printed key in tape form is referred to as a key list.
  305. Key updating : Irreversible cryptographic process for modifying key.
  306. Keying material : Key, code, or authentication information in physical or magnetic form.
  307. Label : See security label.
  308. Labeled security protections : Elementary-level mandatory access control protection features and intermediate-level discretionary access control features in a TCB that uses sensitivity labels to make access control decisions.
  309. Laboratory attack : Use of sophisticated signal recovery equipment in a laboratory environment to recover information from data storage media.
  310. Least privilege : Principle requiring that each subject be granted the most restrictive set of privileges needed for the performance of authorized tasks. Application of this principle limits the damage that can result from accident, error, or unauthorized use of an information system.
  311. Level of concern : Rating assigned to an information system indicating the extent to which protection measures, techniques, and procedures must be applied. High, Medium, and Basic are identified levels of concern. A separate Level-of-Concern is assigned to each information system for confidentiality, integrity, and availability.
  312. Level of protection : Extent to which protective measures, techniques, and procedures must be applied to information systems and networks based on risk, threat, vulnerability, system interconnectivity considerations, and information assurance needs. Levels of protection are: 1. Basic: information system and networks requiring implementation of standard minimum security countermeasures. 2. Medium: information system and networks requiring layering of additional safeguards above the standard minimum security countermeasures. 3. High: information system and networks requiring the most stringent protection and rigorous security countermeasures.
  313. Limited maintenance : COMSEC maintenance restricted to fault isolation, removal, and replacement of plug-in assemblies. Soldering or unsoldering usually is prohibited in limited maintenance. (See full maintenance.)
  314. Line conditioning : Elimination of unintentional signals or noise induced or conducted on a telecommunications or information system signal, power, control, indicator, or other external interface line.
  315. Line conduction : Unintentional signals or noise induced or conducted on a telecommunications or information system signal, power, control, indicator, or other external interface line.
  316. Link encryption : Encryption of information between nodes of a communications system.
  317. List-oriented : Information system protection in which each protected object has a list of all subjects authorized to access it.
  318. Local authority : Organization responsible for generating and signing user certificates.
  319. Local Management Device/Key Processor (LMD/KP) : EKMS platform providing automated management of COMSEC material and generating key for designated users.
  Lock and key protection system : Protection system that involves matching a key or password with a specific access requirement.
  Logic bomb : Resident computer program triggering an unauthorized act when particular states of an information system are realized.
  320. Logical completeness measure : Means for assessing the effectiveness and degree to which a set of security and access control mechanisms meets security specifications.
  321. Long title : Descriptive title of a COMSEC item.
  322. Low probability of detection : Result of measures used to hide or disguise intentional electromagnetic transmissions.
  323. Low probability of intercept : Result of measures to prevent the intercept of intentional electromagnetic transmissions.
  324. Magnetic remanence : Magnetic representation of residual information remaining on a magnetic medium after the medium has been cleared. (See clearing.)
  325. Maintenance hook : Special instructions (trapdoors) in software allowing easy maintenance and additional feature development. Since maintenance hooks frequently allow entry into the code without the usual checks, they are a serious security risk if they are not removed prior to live implementation.
  326. Maintenance key : Key intended only for in-shop use.
  327. Malicious applets : Small application programs automatically downloaded and executed that perform an unauthorized function on an information system.
  328. Malicious code : Software or firmware intended to perform an unauthorized process that will have adverse impact on the confidentiality, integrity, or availability of an information system. (See Trojan horse.)
  329. Malicious logic : Hardware, software, or firmware capable of performing an unauthorized function on an information system.
  330. Mandatory access control (MAC) : Means of restricting access to objects based on the sensitivity of the information contained in the objects and the formal authorization (i.e., clearance, formal access approvals, and need-to-know) of subjects to access information of such sensitivity. (See discretionary access control.)
  331. Mandatory modification : Change to a COMSEC end-item that NSA requires to be completed and reported by a specified date. (See optional modification.)
  332. Manipulative communications deception : Alteration or simulation of friendly telecommunications for the purpose of deception. (See communications deception and imitative communications deception.)
  333. Manual cryptosystem : Cryptosystem in which the cryptographic processes are performed without the use of crypto-equipment or auto-manual devices.
  334. Manual remote rekeying : Procedure by which a distant crypto-equipment is rekeyed electrically, with specific actions required by the receiving terminal operator. Synonymous with cooperative remote rekeying. (Also see automatic remote rekeying.)
  335. Masquerading : See spoofing.
  336. Master crypto-ignition key : Key device with electronic logic and circuits providing the capability for adding more operational CIKs to a keyset.
  337. Memory scavenging : The collection of residual information from data storage.
  338. Message authentication code : Data associated with an authenticated message allowing a receiver to verify the integrity of the message.
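  For example, an HMAC-SHA-256 tag computed and verified with Python's standard hmac and hashlib modules (the key and message shown are illustrative):
      import hashlib, hmac

      shared_key = b"pre-shared secret"
      message = b"transfer 100 units to account 42"

      tag = hmac.new(shared_key, message, hashlib.sha256).digest()       # sender attaches this MAC
      expected = hmac.new(shared_key, message, hashlib.sha256).digest()  # receiver recomputes it
      assert hmac.compare_digest(tag, expected)   # constant-time check; fails if message or tag was altered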
  339. Message externals : Information outside of the message text, such as the header, trailer, etc.
  340. Message indicator : Sequence of bits transmitted over a communications system for synchronizing crypto-equipment. Some off-line cryptosystems, such as the KL-51 and one-time pad systems, employ message indicators to establish decryption starting points.
  341. Mimicking : See spoofing.
  342. Mobile code : Software modules obtained from remote systems, transferred across a network, and then downloaded and executed on local systems without explicit installation or execution by the recipient.
  343. Mode of operation : Description of the conditions under which an information system operates based on the sensitivity of information processed and the clearance levels, formal access approvals, and need-to-know of its users. Four modes of operation are authorized for processing or transmitting information: dedicated mode, system-high mode, compartmented/partitioned mode, and multilevel mode.
  344. Multilevel device : Equipment trusted to properly maintain and separate data of different security categories.
  345. Multilevel mode : INFOSEC mode of operation wherein all the following statements are satisfied concerning the users who have direct or indirect access to the system, its peripherals, remote terminals, or remote hosts: a. some users do not have a valid security clearance for all the information processed in the information system; b. all users have the proper security clearance and appropriate formal access approval for that information to which they have access; and c. all users have a valid need-to-know only for information to which they have access.
  346. Multilevel security (MLS) : Concept of processing information with different classifications and categories that simultaneously permits access by users with different security clearances and denies access to users who lack authorization. (See cross domain solution.)
  347. Multi-security level (MSL) : Capability to process information of different security classifications or categories by using periods processing or peripheral sharing.
  348. Mutual suspicion : Condition in which two information systems need to rely upon each other to perform a service, yet neither trusts the other to properly protect shared data.
  349. National Information Assurance Partnership (NIAP) : Joint initiative between NSA and NIST responsible for security testing needs of both IT consumers and producers and promoting the development of technically sound security requirements for IT products and systems and appropriate measures for evaluating those products and systems.
  350. National Information Infrastructure (NII) : Nationwide interconnection of communications networks, computers, databases, and consumer electronics that make vast amounts of information available to users. It includes both public and private networks, the internet, the public switched network, and cable, wireless, and satellite communications.
  351. National security information (NSI) : Information that has been determined, pursuant to Executive Order 12958 (as amended) (Ref b.) or any predecessor order, to require protection against unauthorized disclosure.
  352. National security system : Any information system (including any telecommunications system) used or operated by an agency or by a contractor of any agency, or other organization on behalf of an agency, the function, operation, or use of which: I. involves intelligence activities; II. involves cryptologic activities related to national security; III. involves command and control of military forces; IV. involves equipment that is an integral part of a weapon or weapon system; or V. subject to subparagraph (B), is critical to the direct fulfillment of military or intelligence missions; or is protected at all times by procedures established for information that have been specifically authorized under criteria established by an Executive Order or an Act of Congress to be kept classified in the interest of national defense or foreign policy. (B) Does not include a system that is to be used for routine administrative and business applications (including payroll, finance, logistics, and personnel management applications). (Title 44 U.S. Code Section 3542, Federal Information Security Management Act of 2002.)
  353. Need-to-know : Necessity for access to, or knowledge or possession of, specific official information required to carry out official duties.
  354. Need-to-know determination : Decision made by an authorized holder of official information that a prospective recipient requires access to specific official information to carry out official duties.
  355. Network : Information system implemented with a collection of interconnected nodes.
  356. Network front-end : Device implementing protocols that allow attachment of a computer system to a network.
  357. Network reference monitor : See reference monitor.
  358. Network security : See information systems security.
  359. Network security officer : See information systems security officer.
  360. Network sponsor : Individual or organization responsible for stating the security policy enforced by the network, designing the network security architecture to properly enforce that policy, and ensuring the network is implemented in such a way that the policy is enforced.
  361. Network system : System implemented with a collection of interconnected components. A network system is based on a coherent security architecture and design.
  362. Network weaving : Penetration technique in which different communication networks are linked to access an information system to avoid detection and trace-back.
  363. No-lone zone : Area, room, or space that, when staffed, must be occupied by two or more appropriately cleared individuals who remain within sight of each other. (See two-person integrity.)
  364. Non-repudiation : Assurance the sender of data is provided with proof of delivery and the recipient is provided with proof of the sender's identity, so neither can later deny having processed the data.
  365. Null : Dummy letter, letter symbol, or code group inserted into an encrypted message to delay or prevent its decryption or to complete encrypted groups for transmission or transmission security purposes.
  366. Object : Passive entity containing or receiving information. Access to an object implies access to the information it contains.
  367. Object reuse : Reassignment and re-use of a storage medium containing one or more objects after ensuring no residual data remains on the storage medium.
  368. Official information : All information in the custody and control of a U.S. Government department or agency that was acquired by U.S. Government employees as a part of their official duties or because of their official status and has not been cleared for public release.
  369. One-time cryptosystem : Cryptosystem employing key used only once.
  370. One-time pad : Manual one-time cryptosystem produced in pad form.
  371. One-time tape : Punched paper tape used to provide key streams on a one-time basis in certain machine cryptosystems.
  372. On-line cryptosystem : Cryptosystem in which encryption and decryption are performed in association with the transmitting and receiving functions.
  373. Open storage : Storage of classified information within an accredited facility, but not in General Services Administration approved secure containers, while the facility is unoccupied by authorized personnel.
  374. Operational key : Key intended for use over-the-air for protection of operational information or for the production or secure electrical transmission of key streams.
  375. Operational vulnerability : Information that describes the presence of a vulnerability within a specific operational setting or network.
  376. Operational waiver : Authority for continued use of unmodified COMSEC end-items pending the completion of a mandatory modification.
  377. Operations code : Code composed largely of words and phrases suitable for general communications use.
  378. Operations security (OPSEC) : Systematic and proven process by which potential adversaries can be denied information about capabilities and intentions by identifying, controlling, and protecting generally unclassified evidence of the planning and execution of sensitive activities. The process involves five steps: identification of critical information, analysis of threats, analysis of vulnerabilities, assessment of risks, and application of appropriate countermeasures.
  379. Optional modification : NSA-approved modification not required for universal implementation by all holders of a COMSEC end-item. This class of modification requires all of the engineering/doctrinal control of mandatory modification but is usually not related to security, safety, TEMPEST, or reliability. (See mandatory modification.)
  380. Organizational maintenance : Limited maintenance performed by a user organization.
  381. Organizational registration authority (ORA) : Entity within the PKI that authenticates the identity and the organizational affiliation of the users.
  382. Over-the-air key distribution : Providing electronic key via over-the-air rekeying, over-the-air key transfer, or cooperative key generation.
  383. Over-the-air key transfer : Electronically distributing key without changing traffic encryption key used on the secured communications path over which the transfer is accomplished.
  384. Over-the-air rekeying (OTAR) : Changing traffic encryption key or transmission security key in remote crypto-equipment by sending new key directly to the remote crypto-equipment over the communications path it secures.
  385. Overt channel : Communications path within a computer system or network designed for the authorized transfer of data. (See covert channel.)
  386. Overwrite procedure : Process of writing patterns of data on top of the data stored on a magnetic medium.
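  A simplified multi-pass overwrite sketch using only the Python standard library (an illustration of the concept only; real media sanitization follows approved procedures and tools, and flash media in particular may retain data despite overwriting):
      import os

      def overwrite_file(path, patterns=(b"\x00", b"\xff", b"\xaa")):
          """Write fixed byte patterns over a file's current contents, one pass per pattern."""
          size = os.path.getsize(path)
          with open(path, "r+b") as f:
              for pattern in patterns:
                  f.seek(0)
                  f.write(pattern * size)
                  f.flush()
                  os.fsync(f.fileno())   # push each pass to the device before starting the next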
  387. Parity : Bit(s) used to determine whether a block of data has been altered.
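  For instance, a single even-parity bit over a block of bytes can be computed as follows (a toy error-detection check, not a cryptographic integrity mechanism):
      def even_parity_bit(block: bytes) -> int:
          """Return 0 if the count of 1 bits in the block is even, otherwise 1."""
          ones = sum(bin(byte).count("1") for byte in block)
          return ones % 2

      assert even_parity_bit(b"\x0f\x01") == 1   # 4 + 1 = 5 one-bits, so the parity bit is 1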
  388. Partitioned security mode : Information system security mode of operation wherein all personnel have the clearance, but not necessarily formal access approval and need-to-know, for all information handled by an information system.
  389. Password : Protected/private string of letters, numbers, and special characters used to authenticate an identity or to authorize access to data.
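  A hedged sketch of storing and verifying a password as a salted, iterated hash with Python's standard hashlib (function names and parameter choices are illustrative):
      import hashlib, hmac, os

      def hash_password(password, iterations=600_000):
          salt = os.urandom(16)
          digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
          return salt, iterations, digest

      def verify_password(password, salt, iterations, expected):
          candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
          return hmac.compare_digest(candidate, expected)

      salt, iters, stored = hash_password("correct horse battery staple")
      assert verify_password("correct horse battery staple", salt, iters, stored)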
  390. Penetration : See intrusion.
  391. Penetration testing : Security testing in which evaluators attempt to circumvent the security features of a system based on their understanding of the system design and implementation.
  392. Per-call key : Unique traffic encryption key generated automatically by certain secure telecommunications systems to secure single voice or data transmissions. (See cooperative key generation.)
  393. Periods processing : Processing of various levels of classified and unclassified information at distinctly different times. Under the concept of periods processing, the system must be purged of all information from one processing period before transitioning to the next.
  394. Perimeter : Encompasses all those components of the system that are to be accredited by the DAA, and excludes separately accredited systems to which the system is connected.
  395. Permuter : Device used in crypto-equipment to change the order in which the contents of a shift register are used in various nonlinear combining circuits.
  396. Plain text : Unencrypted information.
  397. Policy approving authority (PAA) : First level of the PKI Certification Management Authority that approves the security policy of each PCA.
  398. Policy certification authority (PCA) : Second level of the PKI Certification Management Authority that formulates the security policy under which it and its subordinate CAs will issue public key certificates.
  399. Positive control material : Generic term referring to a sealed authenticator system, permissive action link, coded switch system, positive enable system, or nuclear command and control documents, material, or devices.
  400. Pre-production model : Version of INFOSEC equipment employing standard parts and suitable for complete evaluation of form, design, and performance. Preproduction models are often referred to as beta models.
  401. Principal accrediting authority (PAA) : Senior official with authority and responsibility for all intelligence systems within an agency.
  402. Print suppression : Eliminating the display of characters in order to preserve their secrecy.
  403. Privacy system : Commercial encryption system that affords telecommunications limited protection to deter a casual listener, but cannot withstand a technically competent cryptanalytic attack.
  404. Privileged user : Individual who has access to system control, monitoring, or administration functions (e.g., system administrator, system ISSO, maintainers, system programmers, etc.)
  405. Probe : Type of incident involving an attempt to gather information about an information system for the apparent purpose of circumventing its security controls.
  406. Production model : INFOSEC equipment in its final mechanical and electrical form.
  407. Proprietary information : Material and information relating to or associated with a company's products, business, or activities, including but not limited to financial information; data or statements; trade secrets; product research and development; existing and future product designs and performance specifications; marketing plans or techniques; schematics; client lists; computer programs; processes; and know-how that has been clearly identified and properly marked by the company as proprietary information, trade secrets, or company confidential information. The information must have been developed by the company and not be available to the Government or to the public without restriction from another source.
  408. Protected distribution systems (PDS) : Wire line or fiber optic distribution system used to transmit unencrypted classified national security information through an area of lesser classification or control.
  409. Protection philosophy : Informal description of the overall design of an information system delineating each of the protection mechanisms employed. Combination of formal and informal techniques, appropriate to the evaluation class, used to show the mechanisms are adequate to enforce the security policy.
  410. Protection profile : Common Criteria specification that represents an implementation-independent set of security requirements for a category of Target of Evaluations (TOE) that meets specific consumer needs.
  411. Protection ring : One of a hierarchy of privileged modes of an information system that gives certain access rights to user programs and processes that are authorized to operate in a given mode.
  412. Protective packaging : Packaging techniques for COMSEC material that discourage penetration, reveal a penetration has occurred or was attempted, or inhibit viewing or copying of keying material prior to the time it is exposed for use.
  413. Protective technologies : Special tamper-evident features and materials employed for the purpose of detecting tampering and deterring attempts to compromise, modify, penetrate, extract, or substitute information processing equipment and keying material.
  414. Protocol : Set of rules and formats, semantic and syntactic, permitting information systems to exchange information.
  415. Proxy : Software agent that performs a function or operation on behalf of another application or system while hiding the details involved.
  416. Public domain software : Software not protected by copyright laws of any nation that may be freely used without permission of, or payment to, the creator, and that carries no warranties from, or liabilities to the creator.
  417. Public key certificate : Contains the name of a user, the public key component of the user, and the name of the issuer who vouches that the public key component is bound to the named user.
  418. Public key cryptography (PKC) : Encryption system using a linked pair of keys. What one key encrypts, the other key decrypts.
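  The linked-pair relationship ("what one key encrypts, the other key decrypts") can be seen in a toy textbook-RSA calculation (tiny numbers, no padding; purely illustrative and not secure):
      p, q = 61, 53
      n = p * q                  # public modulus (3233)
      phi = (p - 1) * (q - 1)    # 3120
      e = 17                     # public exponent
      d = pow(e, -1, phi)        # private exponent (2753); modular inverse needs Python 3.8+

      m = 65
      c = pow(m, e, n)                      # encrypt with the public key
      assert pow(c, d, n) == m              # decrypt with the private key
      assert pow(pow(m, d, n), e, n) == m   # the reverse direction underlies digital signatures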
  419. Public key infrastructure (PKI) : Framework established to issue, maintain, and revoke public key certificates accommodating a variety of security technologies, including the use of software.
  420. Purging : Rendering stored information unrecoverable. (See sanitize.)
  421. QUADRANT : Short name referring to technology that provides tamper-resistant protection to crypto-equipment.
  422. Randomizer : Analog or digital source of unpredictable, unbiased, and usually independent bits. Randomizers can be used for several different functions, including key generation or to provide a starting state for a key generator.
  423. Read : Fundamental operation in an information system that results only in the flow of information from an object to a subject.
  424. Read access : Permission to read information in an information system.
  425. Real-time reaction : Immediate response to a penetration attempt that is detected and diagnosed in time to prevent access.
  426. Recovery procedures : Actions necessary to restore data files of an information system and computational capability after a system failure.
  427. RED : Designation applied to an information system, and associated areas, circuits, components, and equipment in which unencrypted national security information is being processed.
  428. RED/BLACK concept : Separation of electrical and electronic circuits, components, equipment, and systems that handle national security information (RED), in electrical form, from those that handle non-national security information (BLACK) in the same form.
  429. Red team : Interdisciplinary group of individuals authorized to conduct an independent and focused threat-based effort as a simulated adversary to expose and exploit system vulnerabilities for the purpose of improving the security posture of information systems.
  430. RED signal : Any electronic emission (e.g., plain text, key, key stream, subkey stream, initial fill, or control signal) that would divulge national security information if recovered.
  431. Reference monitor : Concept of an abstract machine that enforces Target of Evaluation (TOE) access control policies.
  432. Release prefix : Prefix appended to the short title of U.S.-produced keying material to indicate its foreign releasability. "A" designates material that is releasable to specific allied nations and "U.S." designates material intended exclusively for U.S. use.
  433. Remanence : Residual information remaining on storage media after clearing. (See magnetic remanence and clearing.)
  434. Remote access : Access for authorized users external to an enclave established through a controlled access point at the enclave boundary.
  435. Remote rekeying : Procedure by which a distant crypto-equipment is rekeyed electrically. (See automatic remote rekeying and manual remote rekeying.)
  436. Repair action : NSA-approved change to a COMSEC end-item that does not affect the original characteristics of the end-item and is provided for optional application by holders. Repair actions are limited to minor electrical and/or mechanical improvements to enhance operation, maintenance, or reliability. They do not require an identification label, marking, or control but must be fully documented by changes to the maintenance manual.
  437. Reserve keying material : Key held to satisfy unplanned needs. (See contingency key.)
  438. Residual risk : Portion of risk remaining after security measures have been applied.
  439. Residue : Data left in storage after information processing operations are complete, but before degaussing or overwriting has taken place.
  440. Resource encapsulation : Method by which the reference monitor mediates accesses to an information system resource. Resource is protected and not directly accessible by a subject. Satisfies requirement for accurate auditing of resource usage.
  441. Risk : Possibility that a particular threat will adversely impact an information system by exploiting a particular vulnerability.
  442. Risk analysis : Examination of information to identify the risk to an information system.
  443. Risk assessment : Process of analyzing threats to and vulnerabilities of an information system, and the potential impact resulting from the loss of information or capabilities of a system. This analysis is used as a basis for identifying appropriate and cost-effective security countermeasures.
  444. Risk index : Difference between the minimum clearance or authorization of information system users and the maximum sensitivity (e.g., classification and categories) of data processed by the system.
  445. Risk management : Process of managing risks to agency operations (including mission, functions, image, or reputation), agency assets, or individuals resulting from the operation of an information system. It includes risk assessment; cost-benefit analysis; the selection, implementation, and assessment of security controls; and the formal authorization to operate the system. The process considers effectiveness, efficiency, and constraints due to laws, directives, policies, or regulations. (NIST Special Pub 800-53)
  446. Safeguard : 1. Protection included to counteract a known or expected condition.
  447. 2. Incorporated countermeasure or set of countermeasures within a base release.
  448. Safeguarding statement : Statement affixed to a computer output or printout that states the highest classification being processed at the time the product was produced and requires control of the product, at that level, until determination of the true classification by an authorized individual. Synonymous with banner.
  449. Sanitize : Process to remove information from media such that data recovery is not possible. It includes removing all classified labels, markings, and activity logs. (See purging.)
  450. Scavenging : Searching through object residue to acquire data.
  451. Secure communications : Telecommunications deriving security through use of type 1 products and/or PDSs.
  452. Secure hash standard : Specification for a secure hash algorithm that can generate a condensed message representation called a message digest.
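  For example, a SHA-256 message digest (one of the Secure Hash Standard algorithms) computed with Python's standard hashlib:
      import hashlib

      digest = hashlib.sha256(b"The quick brown fox jumps over the lazy dog").hexdigest()
      # 64 hexadecimal characters; changing even one input bit yields an unrelated digest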
  453. Secure state : Condition in which no subject can access any object in an unauthorized manner.
  454. Secure subsystem : Subsystem containing its own implementation of the reference monitor concept for those resources it controls. Secure subsystem must depend on other controls and the base operating system for the control of subjects and the more primitive system objects.
  455. Security controls : Management, operational, and technical controls (i.e., safeguards or countermeasures) prescribed for an information system to protect the confidentiality, integrity, and availability of the system and its information. (NIST Special Pub 800-53)
  456. Security fault analysis (SFA) : Assessment, usually performed on information system hardware, to determine the security properties of a device when a hardware fault is encountered.
  457. Security features users guide (SFUG) : Guide or manual explaining how the security mechanisms in a specific system work.
  458. Security filter : Information system trusted subsystem that enforces security policy on the data passing through it.
  459. Security in depth : Synonymous with defense in depth.
  460. Security inspection : Examination of an information system to determine compliance with security policy, procedures, and practices.
  461. Security kernel : Hardware, firmware, and software elements of a trusted computing base implementing the reference monitor concept. Security kernel must mediate all accesses, be protected from modification, and be verifiable as correct.
  462. Security label : Information representing the sensitivity of a subject or object, such as UNCLASSIFIED or its hierarchical classification (CONFIDENTIAL, SECRET, TOP SECRET) together with any applicable nonhierarchical security categories (e.g., sensitive compartmented information, critical nuclear weapon design information).
  463. Security net control station : Management system overseeing and controlling implementation of network security policy.
  464. Security perimeter : Boundary where security controls are in effect to protect assets.
  465. Security range : Highest and lowest security levels that are permitted in or on an information system, system component, subsystem, or network.
  466. Security requirements : Types and levels of protection necessary for equipment, data, information, applications, and facilities to meet information system security policy.
  467. Security requirements baseline : Description of the minimum requirements necessary for an information system to maintain an acceptable level of security.
  468. Security safeguards : Protective measures and controls prescribed to meet the security requirements specified for an information system. Safeguards may include security features, management constraints, personnel security, and security of physical structures, areas, and devices. (See accreditation.)
  469. Security specification : Detailed description of the safeguards required to protect an information system.
  470. Security target : Common Criteria specification that represents a set of security requirements to be used as the basis of an evaluation of an identified Target of Evaluation (TOE).
  471. Security test and evaluation (ST&E) : Examination and analysis of the safeguards required to protect an information system, as they have been applied in an operational environment, to determine the security posture of that system.
  472. Security testing : Process to determine that an information system protects data and maintains functionality as intended.
  473. Seed key : Initial key used to start an updating or key generation process.
  474. Sensitive compartmented information (SCI) : Classified information concerning or derived from intelligence sources, methods, or analytical processes, which is required to be handled within formal access control systems established by the Director of Central Intelligence.
  475. Sensitive compartmented information facility (SCIF) : Accredited area, room, or group of rooms, buildings, or installation where SCI may be stored, used, discussed, and/or processed.
  476. Sensitive information : Information, the loss, misuse, or unauthorized access to or modification of, that could adversely affect the national interest or the conduct of federal programs, or the privacy to which individuals are entitled under 5 U.S.C. Section 552a (the Privacy Act), but that has not been specifically authorized under criteria established by an Executive Order or an Act of Congress to be kept classified in the interest of national defense or foreign policy. (Systems that are not national security systems, but contain sensitive information, are to be protected in accordance with the requirements of the Computer Security Act of 1987 (P.L.100-235).)
  477. Sensitivity label : Information representing elements of the security label(s) of a subject and an object. Sensitivity labels are used by the trusted computing base (TCB) as the basis for mandatory access control decisions.
  478. Separation of duties : Concept of having more than one person required to complete a task. (Source: Wikipedia)
  479. Shielded enclosure : Room or container designed to attenuate electromagnetic radiation, acoustic signals, or emanations.
  480. Simple security property : Bell-LaPadula security model rule allowing a subject read access to an object, only if the security level of the subject dominates the security level of the object.
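  As a sketch, the dominance check behind the simple security property ("no read up") might look like the following (a hypothetical level ordering; nonhierarchical categories are omitted for brevity):
      LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

      def may_read(subject_level, object_level):
          """Permit read access only if the subject's level dominates the object's level."""
          return LEVELS[subject_level] >= LEVELS[object_level]

      assert may_read("SECRET", "CONFIDENTIAL")        # read down: permitted
      assert not may_read("CONFIDENTIAL", "SECRET")    # read up: denied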
  481. Sniffer : Software tool for auditing and identifying network traffic packets.
  482. Software assurance : Level of confidence that software is free from vulnerabilities, either intentionally designed into the software or accidentally inserted at any time during its lifecycle, and that the software functions in the intended manner.
  483. Software system test and evaluation process : Process that plans, develops, and documents the quantitative demonstration of the fulfillment of all baseline functional performance, operational, and interface requirements.
  484. Special access program (SAP) : Sensitive program, approved in writing by a head of agency with original top secret classification authority, that imposes need-to-know and access controls beyond those normally provided for access to Confidential, Secret, or Top Secret information. The level of controls is based on the criticality of the program and the assessed hostile intelligence threat. The program may be an acquisition program, an intelligence program, or an operations and support program. (Joint Pub 1-02, 12 Apr 2001)
  485. Spillage : See classified information spillage.
  486. Split knowledge : Separation of data or information into two or more parts, each part constantly kept under control of separate authorized individuals or teams so that no one individual or team will know the whole data.
  487. Spoofing : Unauthorized use of legitimate Identification and Authentication (I&A) data, however it was obtained, to mimic a subject different from the attacker. Impersonating, masquerading, piggybacking, and mimicking are forms of spoofing.
  488. Spread spectrum : Telecommunications techniques in which a signal is transmitted in a bandwidth considerably greater than the frequency content of the original information. Frequency hopping, direct sequence spreading, time scrambling, and combinations of these techniques are forms of spread spectrum.
  489. Storage object : Object supporting both read and write accesses to an information system.
  490. Strong authentication : Layered authentication approach relying on two or more authenticators to establish the identity of an originator or receiver of information.
  491. Subject : Generally an individual, process, or device causing information to flow among objects or change to the system state.
  492. Subject security level : Sensitivity label(s) of the objects to which the subject has both read and write access. Security level of a subject must always be dominated by the clearance level of the user associated with the subject.
  493. Suppression measure : Action, procedure, modification, or device that reduces the level of, or inhibits the generation of, compromising emanations in an information system.
  494. Symmetric key : Encryption methodology in which the encryptor and decryptor use the same key, which must be kept secret.
  495. Synchronous crypto-operation : Method of on-line crypto-operation in which crypto-equipment and associated terminals have timing systems to keep them in step.
  496. System administrator (SA) : Individual responsible for the installation and maintenance of an information system, providing effective information system utilization, adequate security parameters, and sound implementation of established IA policy and procedures.
  497. System assets : Any software, hardware, data, administrative, physical, communications, or personnel resource within an information system.
  498. System development methodologies : Methodologies developed through software engineering to manage the complexity of system development. Development methodologies include software engineering aids and high-level design analysis tools.
  499. System high : Highest security level supported by an information system.
  500. System high mode : Information system security mode of operation wherein each user, with direct or indirect access to the information system, its peripherals, remote terminals, or remote hosts, has all of the following: a. valid security clearance for all information within an information system; b. formal access approval and signed nondisclosure agreements for all the information stored and/or processed (including all compartments, sub-compartments and/or special access programs); and c. valid need-to-know for some of the information contained within the information system.
  501. System indicator : Symbol or group of symbols in an off-line encrypted message identifying the specific cryptosystem or key used in the encryption.
  502. System integrity : Attribute of an information system when it performs its intended function in an unimpaired manner, free from deliberate or inadvertent unauthorized manipulation of the system.
  503. System low : Lowest security level supported by an information system.
  504. System profile : Detailed security description of the physical structure, equipment component, location, relationships, and general operating environment of an information system.
  505. System security : See information systems security.
  506. System security engineering : See information systems security engineering.
  507. System security officer : See information system security officer.
  508. System security plan : Document fully describing the planned security tasks and controls required to meet system security requirements.
  509. Tampering : Unauthorized modification altering the proper functioning of INFOSEC equipment.
  510. Target of evaluation (TOE) : IT product or system and its associated administrator and user guidance documentation that is the subject of an evaluation.
  511. Technical controls : Security controls (i.e., safeguards or countermeasures) for an information system that are primarily implemented and executed by the information system through mechanisms contained in the hardware, software, or firmware components of the system. (NIST Special Pub 800-53.)
  512. Technical vulnerability information : Detailed description of a vulnerability to include the implementable steps (such as code) necessary to exploit that vulnerability.
  513. Telecommunications : Preparation, transmission, communication, or related processing of information (writing, images, sounds, or other data) by electrical, electromagnetic, electromechanical, electro-optical, or electronic means.
  514. TEMPEST : Short name referring to investigation, study, and control of compromising emanations from information system equipment.
  515. TEMPEST test : Laboratory or on-site test to determine the nature of compromising emanations associated with an information system.
  516. TEMPEST zone : Designated area within a facility where equipment with appropriate TEMPEST characteristics (TEMPEST zone assignment) may be operated.
  517. Test key : Key intended for testing of COMSEC equipment or systems.
  518. Threat : Any circumstance or event with the potential to adversely impact an information system through unauthorized access, destruction, disclosure, modification of data, and/or denial of service.
  519. Threat analysis : Examination of information to identify the elements comprising a threat.
  520. Threat assessment : Formal description and evaluation of threat to an information system.
  521. Threat monitoring : Analysis, assessment, and review of audit trails and other information collected for the purpose of searching out system events that may constitute violations of system security.
  522. Time bomb : Resident computer program that triggers an unauthorized act at a predefined time.
  523. TOE Security Functions (TSF) : Set consisting of all hardware, software, and firmware of the TOE that must be relied upon for the correct enforcement of the TSP.
  524. TOE Security Policy (TSP) : Set of rules that regulate how assets are managed, protected, and distributed within the TOE.
  525. Traditional INFOSEC program : Program in which NSA acts as the central procurement agency for the development and, in some cases, the production of INFOSEC items. This includes the Authorized Vendor Program. Modifications to the INFOSEC end-items used in products developed and/or produced under these programs must be approved by NSA.
  526. Traffic analysis (TA) : Study of communications patterns.
  527. Traffic-flow security (TFS) : Measure used to conceal the presence of valid messages in an on-line cryptosystem or secure communications system.
  528. Transmission security (TRANSEC) : Component of COMSEC resulting from the application of measures designed to protect transmissions from interception and exploitation by means other than cryptanalysis.
  529. Trap door : Synonymous with back door.
  530. Triple DES : Product cipher that, like DES, operates on 64-bit data blocks. There are several forms, each of which uses the DES cipher 3 times. Some forms use two 56-bit keys, some use three. (See NIST FIPS 46-3 and CNSSAM IA/02-04)
  531. Trojan horse : Program containing hidden code allowing the unauthorized collection, falsification, or destruction of information. (See malicious code.)
  532. Trusted channel : Means by which a TOE Security Function (TSF) and a remote trusted IT product can communicate with necessary confidence to support the TOE Security Policy (TSP).
  533. Trusted computer system : Information system employing sufficient hardware and software assurance measures to allow simultaneous processing of a range of classified or sensitive information.
  534. Trusted computing base (TCB) : Totality of protection mechanisms within a computer system, including hardware, firmware, and software, the combination responsible for enforcing a security policy.
  535. Trusted distribution : Method for distributing trusted computing base (TCB) hardware, software, and firmware components that protects the TCB from modification during distribution.
  536. Trusted path : Means by which a user and a TOE Security Function (TSF) can communicate with necessary confidence to support the TOE Security Policy (TSP).
  537. Trusted recovery : Ability to ensure recovery without compromise after a system failure.
  538. Tunneling : Technology enabling one network to send its data via another network's connections. Tunneling works by encapsulating a network protocol within packets carried by the second network.
  539. Two-person control : Continuous surveillance and control of positive control material at all times by a minimum of two authorized individuals, each capable of detecting incorrect and unauthorized procedures with respect to the task being performed, and each familiar with established security and safety requirements.
  540. Type certification : The certification acceptance of replica information systems based on the comprehensive evaluation of the technical and non-technical security features of an information system and other safeguards, made as part of and in support of the accreditation process, to establish the extent to which a particular design and implementation meet a specified set of security requirements.
  541. Type 1 key : Generated and distributed under the auspices of NSA for use in a cryptographic device for the protection of classified and sensitive national security information.
  542. Type 1 product : Cryptographic equipment, assembly or component classified or certified by NSA for encrypting and decrypting classified and sensitive national security information when appropriately keyed. Developed using established NSA business processes and containing NSA approved algorithms. Used to protect systems requiring the most stringent protection mechanisms.
  543. Type 2 key : Generated and distributed under the auspices of NSA for use in a cryptographic device for the protection of unclassified national security information.
  544. Type 2 product : Cryptographic equipment, assembly, or component certified by NSA for encrypting or decrypting sensitive national security information when appropriately keyed. Developed using established NSA business processes and containing NSA approved algorithms. Used to protect systems requiring protection mechanisms exceeding best commercial practices including systems used for the protection of unclassified national security information.
  545. Type 3 key : Used in a cryptographic device for the protection of unclassified sensitive information, even if used in a Type 1 or Type 2 product.
  546. Type 3 product : Unclassified cryptographic equipment, assembly, or component used, when appropriately keyed, for encrypting or decrypting unclassified sensitive U.S. Government or commercial information, and to protect systems requiring protection mechanisms consistent with standard commercial practices. Developed using established commercial standards and containing NIST approved cryptographic algorithms/modules or successfully evaluated by the National Information Assurance Partnership (NIAP).
  547. Type 4 key : Used by a cryptographic device in support of its Type 4 functionality; i.e., any provision of key that lacks U.S. Government endorsement or oversight.
  548. Type 4 product : Unevaluated commercial cryptographic equipment, assemblies, or components that neither NSA nor NIST certify for any Government usage. These products are typically delivered as part of commercial offerings and are commensurate with the vendor's commercial practices. These products may contain either vendor proprietary algorithms, algorithms registered by NIST, or algorithms registered by NIST and published in a FIPS.
  549. Unauthorized disclosure : Type of event involving exposure of information to individuals not authorized to receive it.
  550. Unclassified : Information that has not been determined pursuant to E.O. 12958 or any predecessor order to require protection against unauthorized disclosure and that is not designated as classified.
  551. Updating : Automatic or manual cryptographic process that irreversibly modifies the state of a COMSEC key, equipment, device, or system.
  552. User : Individual or process authorized to access an information system.
  553. (PKI) Individual defined, registered, and bound to a public key structure by a certification authority (CA).
  554. User ID : Unique symbol or character string used by an information system to identify a specific user.
  555. User Partnership Program (UPP) : Partnership between the NSA and a U.S. Government agency to facilitate development of secure information system equipment incorporating NSA-approved cryptography. The result of this program is the authorization of the product or system to safeguard national security information in the user's specific application.
  556. User representative : Individual authorized by an organization to order COMSEC keying material and interface with the keying system, provide information to key users, and ensure the correct type of key is ordered.
  557. U.S.-controlled facility : Base or building to which access is physically controlled by U.S. individuals who are authorized U.S. Government or U.S. Government contractor employees.
  558. Validated products list : List of validated products that have been successfully evaluated under the National Information Assurance Partnership (NIAP) Common Criteria Evaluation and Validation Scheme (CCEVS).
  559. Validation : Process of applying specialized security test and evaluation procedures, tools, and equipment needed to establish acceptance for joint usage of an information system by one or more departments or agencies and their contractors.
  560. Variant : One of two or more code symbols having the same plain text equivalent.
  561. Verification : Process of comparing two levels of an information system specification for proper correspondence (e.g., security policy model with top-level specification, top-level specification with source code, or source code with object code).
  562. Virtual private network (VPN) : Protected information system link utilizing tunneling, security controls (see information assurance), and endpoint address translation giving the impression of a dedicated line.
  563. Virus : Self-replicating, malicious code that attaches itself to an application program or other executable system component and leaves no obvious signs of its presence.
  564. Vulnerability : Weakness in an information system, system security procedures, internal controls, or implementation that could be exploited.
  565. Vulnerability analysis : Examination of information to identify the elements comprising a vulnerability.
  566. Vulnerability assessment : Formal description and evaluation of vulnerabilities of an information system.
  567. Warm site : A backup site that already contains the supporting IT infrastructure (hardware and, in some cases, applications) but not the organization's data, which must be restored before operations can resume there.
  568. Web risk assessment : Process for ensuring websites are in compliance with applicable policies.
  569. Wireless technology : Permits the active or passive transfer of information between separated points without physical connection. Active information transfer may entail a transmit and/or receive emanation of energy, whereas passive information transfer entails a receive-only capability. Current wireless technologies use IR, acoustic, RF, and optical transmission, but as technology evolves, wireless could include other methods of transmission.
  570. Work factor : Estimate of the effort or time needed by a potential perpetrator, with specified expertise and resources, to overcome a protective measure.
  571. Worm : See malicious code.
  572. Write : Fundamental operation in an information system that results only in the flow of information from a subject to an object. (See access type.)
  573. Write access : Permission to write to an object in an information system.
  574. Zero fill : To fill unused storage locations in an information system with the representation of the character denoting "0."
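  A small illustration of zero filling the unused positions of a fixed-length record (the record length and content are hypothetical):
      RECORD_LENGTH = 16
      data = b"ACME42"
      record = data.ljust(RECORD_LENGTH, b"0")   # unused storage positions filled with the character "0"
      assert record == b"ACME42" + b"0" * 10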
  575. Zeroize : To remove or eliminate the key from a crypto-equipment or fill device.