Classifying data
The day-to-day management of access control requires managing labels, clearances, formal access approval & need to know. These formal mechanisms are typically used to protect highly sensitive data, such as government or military data.
Labels
- Objects have labels and subjects have clearances
- The object labels used by many world governments are confidential, secret & top-secret
- According to Executive Order 12356, National Security Information:
- Top Secret shall be applied to information, the unauthorised disclosure of which could reasonably be expected to cause exceptionally grave damage to national security
- Secret shall be applied to information, the unauthorised disclosure of which could reasonably be expected to cause serious damage to national security
- Confidential shall be applied to information, the unauthorised disclosure of which could reasonably be expected to cause damage to national security
- Private sector companies use labels such as “Internal Use Only” and “Company Proprietary” to categorise information.
Clearance
- A clearance is a formal determination of whether a user can be trusted with a specific level of information
- Clearances must determine the subject’s current and potential future trustworthiness; the latter is harder (and more expensive) to assess
- Some higher-level clearances include access to compartmented information
- Compartmentalisation is a technical method for enforcing need to know
Formal access approval
- Formal access approval is documented approval from the data owner for a subject to access certain objects. It requires the subject to understand all of the rules & requirements for accessing the data, as well as the consequences should the data be lost, destroyed or compromised
Need to know
- Need to know refers to answering the question “does the user ‘need to know’ the specific data they may attempt to access?”
- Most computer systems rely on least privilege and require users to police themselves by following the set policy, only attempting to access information for which they have a need to know
- Need to know is more granular than least privilege: unlike least privilege, which typically groups objects together, need to know access decisions are made for each individual object (a minimal sketch of a combined clearance and need-to-know check follows this list)
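As an illustration of how labels, clearances & need to know combine into an access decision, the Python sketch below checks that a subject's clearance dominates an object's label and that the subject holds every compartment attached to the object. The level names and compartments are illustrative only, not any particular system's implementation.

```python
# Minimal sketch of a mandatory access control style check: a subject may read an
# object only if their clearance dominates the object's label AND they hold every
# compartment the object is tagged with (need to know). Names are illustrative.

LEVELS = {"confidential": 1, "secret": 2, "top-secret": 3}

def can_read(clearance_level, clearance_compartments, object_level, object_compartments):
    """Return True if the clearance dominates the label and need to know is satisfied."""
    dominates = LEVELS[clearance_level] >= LEVELS[object_level]
    need_to_know = set(object_compartments).issubset(set(clearance_compartments))
    return dominates and need_to_know

# Example: a secret-cleared analyst holding a hypothetical "CRYPTO" compartment
print(can_read("secret", {"CRYPTO"}, "confidential", set()))      # True
print(can_read("secret", {"CRYPTO"}, "top-secret", {"CRYPTO"}))   # False: label not dominated
print(can_read("top-secret", set(), "secret", {"CRYPTO"}))        # False: lacks the compartment
```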
Sensitive information & media security
- Though security & controls related to the people within an enterprise are vitally important, so is having a regimented process for handling sensitive information, including media security. Key concepts that are an important component of a strong overall info sec posture include:
- Sensitive information: All organisations have sensitive information that requires protection, and it physically resides on some form of media. In addition to primary storage, backup storage must also be considered. Wherever data exists, there must be processes in place to ensure that the data is not destroyed or made inaccessible (breach of availability), disclosed (breach of confidentiality) or altered (breach of integrity)
- Handling: People handling sensitive media should be trusted & vetted individuals. They must understand their role within the organisation's overall info sec posture. There should be strict policies regarding the handling of sensitive media, including written logs detailing the person responsible for the media. Historically, handling backup media has proved a significant problem for organisations.
- Retention: Media & information have a limited period of usefulness. Retention of sensitive information should not persist beyond this period (or legal requirements, whichever is greater), as it needlessly exposes the data to threats of disclosure when it is no longer needed by the organisation.
Ownership
There are a number of primary info sec roles, each with a different set of responsibilities in securing an organisation’s assets.
Business or mission owners
- Business owners or mission owners (senior management) create the info sec program and ensure that it is properly staffed and funded, as well as given appropriate organisational priority.
- They are ultimately responsible for ensuring all organisational assets are protected
Data owners
- The data owner (also called information owner) is a manager responsible for ensuring that specific data is protected
- Data owners determine data sensitivity labels & the frequency of data backup
- They focus on the data itself, whether in electronic or paper format
- A company with multiple lines of business may have multiple data owners
- The data owner performs management duties, while custodians perform the hands-on protection of data
System owners
- The system owner is a manager who is responsible for the actual computers that house data, including the hardware/software config
- System owners ensure that the hardware is physically secure, operating systems are patched and up to date, the system is hardened etc.
- Technical, hands-on responsibilities are delegated to custodians
Custodian
- A custodian provides hands-on protection of assets such as data
- They perform backups & restores, patch systems, configure AV software etc.
- Custodians follow detailed orders and do not make critical decisions about the protection of data – e.g. the data owner may dictate that all data must be backed up every 24 hours, then the custodians would deploy and operate a backup solution that meets these requirements
Users
- Users must follow the rules: complying with mandatory policies, procedures, standards etc.
- For example, they must not write their passwords down or share accounts
- Users must be made aware of these risks and requirements, and of the penalty for failing to comply with mandatory directives and policies.
Data controllers & processors
- Data controllers create & manage sensitive data within an organisation
- HR employees are often data controllers, as they create and manage sensitive data such as salary/benefit data and disciplinary reports
- Data processors manage data on behalf of data controllers
- An outsourced payroll company is an example of a data processor, managing payroll data on behalf of a data controller (such as an HR department)
Data collection limitation
- Organisations should collect the minimum amount of sensitive information that is required
- The Organisation for Economic Co-operation & Development (OECD) Collection Limitation Principle states that “There should be limits to the collection of personal data, and any such data should be obtained by lawful & fair means and, where appropriate, with the knowledge/consent of the data subject.”
Memory & remanence
Data remanence
- Data remanence is data that persists beyond non-invasive means to delete it
- Though data remanence is sometimes used specifically to refer to residual data that persists on magnetic storage, remanence concerns extend to other media as well, e.g. optical media and solid-state drives
Memory
- Memory is a series of on/off switches representing bits: 0s (off) and 1s (on)
- May be chip-based, disk-based or tape-based
- RAM is random-access memory, meaning that the CPU may jump to any desired location in memory
- Sequential memory, such as tape, must be read (or fast-forwarded past) sequentially, beginning at offset zero, until the desired portion of memory is reached
- Real, or primary memory (such as RAM) is directly accessible by the CPU and is used to hold instructions & data for currently executing processes. Secondary memory, such as disk-based memory, is not directly accessible.
- Some common types of memory include:
- Cache memory is the fastest system memory, required to keep up with the CPU as it fetches & executes instructions. The data most frequently used by the CPU is stored in cache memory. The fastest portion of the CPU is made up of multiple registers, small storage locations used by the CPU to hold instructions and data. The next fastest form of cache memory is Level 1 cache, located on the CPU itself. Finally, Level 2 cache is connected to (but outside of) the CPU. Static random-access memory (SRAM) is used for cache memory.
- RAM is volatile memory used to hold instructions & data of currently running programs. It loses integrity after loss of power.
- SRAM (static RAM) is fast, expensive memory that uses small latches called “flip-flops” to store bits.
- DRAM (dynamic RAM) stores bits in small capacitors, and is slower and cheaper than SRAM. The capacitors used by DRAM leak charge, and so they must be continually refreshed (typically every few to few hundred milliseconds) to maintain integrity. Refreshing reads the bits and writes them back to memory.
- SRAM does not require refreshing, and maintains integrity as long as power is supplied
- ROM is non-volatile: it maintains integrity after loss of power. A computer's BIOS firmware is stored in ROM. While ROM is nominally “read only”, some types of ROM may be written to via flashing.
- Firmware stores programs that do not change frequently, such as a computer’s BIOS or a router’s OS and saved config
- Various types of ROM chips may store firmware, including:
- PROM (programmable ROM) which can be written to only once, typically at the factory
- EPROM (erasable PROM) can be “flashed”, i.e. erased and written to multiple times (erasure requires exposure to UV light), while EEPROM is the same but can be erased electronically
- EPROMs, EEPROMs and flash memory are examples of programmable logic devices (PLDs): field-programmable devices, which means they are programmed after leaving the factory
- Flash memory, such as that used in USB thumb drives, is a specific type of EEPROM used for storage. The difference is that any byte of an EEPROM may be written, while flash drives are written by larger sectors
- Solid-state drives (SSDs) are a combination of flash memory and DRAM.
- Degaussing has no effect on SSDs
- While physical disks have physical blocks (e.g. Block 1 is a specific physical location on a magnetic disk), blocks on SSDs are logical and are mapped to physical blocks.
- Also, SSDs do not overwrite blocks that contain data; the device instead writes the data to an unused block and marks the previous block as unallocated (see the sketch after this list)
- A process called garbage collection later takes care of these old blocks, working in the background to identify which memory cells contain unneeded data and clearing them during off-peak times to maintain optimal write speeds during normal operations.
- The TRIM command, an attribute of the ATA Data Set Management Command, improves garbage collection by more efficiently marking data as “invalid” (requiring garbage collection) and skipping data that can be ignored. It improves compatibility, endurance and performance, but does not reliably destroy data.
- A sector-by-sector overwrite behaves very differently on an SSD versus a magnetic drive, and it does not reliably destroy all data. Electronically shredding a file (i.e. overwriting the file’s data before deleting it) is not effective either.
- Data on SSDs that are not physically damaged may be securely removed via the ATA Secure Erase command. For damaged SSDs, the best option is physical destruction.
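The following Python sketch (a toy flash translation layer, not any real controller's design) illustrates why overwriting in place does not destroy data on an SSD: the logical block is simply remapped to a fresh physical block, and the old block retains its contents until garbage collection erases it.

```python
# Toy SSD: logical blocks are remapped rather than overwritten in place, so an
# "overwrite" leaves the original bytes in flash until garbage collection runs.
# Structure and names are illustrative only.

class ToySSD:
    def __init__(self, physical_blocks=8):
        self.physical = [None] * physical_blocks   # raw flash cells
        self.mapping = {}                          # logical block -> physical block
        self.invalid = set()                       # stale blocks awaiting garbage collection

    def write(self, logical, data):
        free = next(i for i, b in enumerate(self.physical) if b is None)
        if logical in self.mapping:                # "overwrite": old data is NOT erased
            self.invalid.add(self.mapping[logical])
        self.physical[free] = data
        self.mapping[logical] = free

    def garbage_collect(self):
        for i in self.invalid:                     # only now is the stale data erased
            self.physical[i] = None
        self.invalid.clear()

ssd = ToySSD()
ssd.write(0, b"SECRET")
ssd.write(0, b"\x00" * 6)      # attempt to overwrite logical block 0 with zeroes
print(ssd.physical)            # b"SECRET" is still present in flash
ssd.garbage_collect()
print(ssd.physical)            # gone only after garbage collection
```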
Data destruction
- All forms of media should be securely cleaned or destroyed prior to disposal to prevent object reuse: the act of recovering information from previously-used objects
- Objects can be physical, such as paper files, or electronic, such as data & files on a hard drive
- Object reuse attacks range from non-technical, such as dumpster diving, to technical, such as recovering information from unallocated blocks on a hard drive
- Simply “deleting” a file removes the entry from the file allocation table (FAT) and marks the data blocks as “unallocated”. Reformatting a disk destroys the old FAT and replaces it with a new one. In both cases, data usually remains and can be recovered through the use of forensic tools. This issue is called data remanence, referring to “remnants” of data left behind
- The act of overwriting actually writes over every character of a file or entire disk, so it is far more secure than deleting or formatting. Common methods include writing all zeroes or random characters. Electronic shredding or wiping overwrites the file's data and then removes the FAT entry (a minimal sketch follows this list).
- Degaussing destroys the integrity of magnetic media, such as tapes or disk drives, by exposing them to a strong magnetic field, destroying the data they contain. As a side effect, the magnetic field is usually strong enough to damage the sensitive electronics of modern hard drives as well as wipe the platters, rendering the drive unsuitable for reuse.
- Destruction physically destroys the integrity of media by damaging or destroying the media itself, such as the platters of a disk drive. Methods include incineration, pulverising, shredding or bathing metal components in acid. Destroying objects is more secure than overwriting them, and is suitable for damaged media that may not be possible to overwrite but could still allow someone with the right tools to recover data. Highly sensitive data should be degaussed or destroyed, perhaps in addition to overwriting for a belt-and-braces approach.
- A simple form of media sanitisation is shredding, a type of physical destruction rather than the electronic wiping technique mentioned above. Here “shredding” refers to the process of making unrecoverable any data printed on paper or on smaller objects such as floppy or optical disks.
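As a minimal illustration of electronic shredding on traditional magnetic media, the Python sketch below overwrites a file's bytes in place, flushes the write to the device, then removes the directory entry. As noted above, this is not reliable on SSDs or copy-on-write filesystems; the file name and pass count are illustrative.

```python
# Minimal sketch of "electronic shredding": overwrite every byte of the file,
# sync the write to the device, then delete the directory entry.

import os

def shred(path, passes=1):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)     # overwrite every byte with zeroes
            f.flush()
            os.fsync(f.fileno())        # push the overwrite to the device
    os.remove(path)                     # finally remove the directory/FAT entry

# Example usage (hypothetical file name):
# shred("old_backup_list.txt", passes=3)
```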
Determining data security controls
Determining which data security controls to apply is a critical skill. Standards, scoping & tailoring are used to choose & customise controls. The determination of controls will also be dictated by whether the data is at rest or in motion (transit).
Certification & accreditation
- Certification means a system has been certified to meet the security requirements of the data owner. You can be CERTain that it meets its requirements!
- Certification considers the system, the security measures taken to protect it, and the residual risk represented by it.
- Accreditation is the data owner’s formal acceptance of the certification and of the residual risk, which is required before the system is put into production. The data owner believes the system is CREDible!
Standards & control frameworks
A number of standards are available to determine security controls:
PCI DSS
- Industry specific: applies to vendors who store, process and/or transmit payment card data
- Created by the Payment Card Industry Security Standards Council, comprised of AmEx, Discover, MasterCard, Visa and others.
- Seeks to protect credit card data by requiring vendors to take specific precautions
- Based on a set of core principles:
- Build & maintain a secure network and systems
- Protect cardholder data
- Maintain a vulnerability management program
- Implement strong access control measures
- Regularly monitor and test networks
- Maintain an information security policy
- Vendors must either carry out regular web vulnerability scans, or place their applications behind a web application firewall
The remaining standards are more general:
OCTAVE
- Stands for Operationally Critical Threat, Asset & Vulnerability Evaluation
- A risk management framework from Carnegie Mellon University
- Describes a three-phase process for managing risk:
- Phase 1 identifies staff knowledge, assets & threats
- Phase 2 identifies vulnerabilities and evaluates safeguards
- Phase 3 conducts the risk analysis & develops the risk mitigation strategy
Common Criteria
- The International Common Criteria is a standard for describing and testing the security of IT products
- Presents a hierarchy of requirements for a range of classifications & systems
- Uses specific terms when defining certain portions of the testing process:
- Target of evaluation (ToE): The system or product that is being evaluated
- Security target (ST): The documentation describing the ToE, including the security requirements and operational environment
- Protection profile (PP): An independent set of security requirements & objectives for a specific category of products/systems, such as firewalls or IDSs
- Evaluation assurance level (EAL): The evaluation score of the tested product or system. There are seven EALs, each building upon the previous level (for example, EAL3 products can be expected to meet or exceed the requirements of products rated EAL1 or EAL2):
- EAL1: Functionally tested
- EAL2: Structurally tested
- EAL3: Methodically tested & checked
- EAL4: Methodically designed, tested & reviewed
- EAL5: Semi-formally designed & tested
- EAL6: Semi-formally verified, designed & tested
- EAL7: Formally verified, designed & tested
The ISO 27000 series
- ISO 27002 is a set of optional guidelines for an information security code of practice. It was based on BS 7799 Part 1 and was renumbered from ISO 17799 in 2005 for consistency with other ISO security standards. It has 11 areas, each focusing on specific info sec controls:
- Policy
- Organisation of info sec
- Asset management
- HR security
- Physical & environmental security
- Comms & operations management
- Access control
- Information systems acquisition, development & maintenance
- Info sec incident management
- Business continuity management
- Compliance
- ISO 27001 is a related standard and comprises mandatory requirements for organisations wishing to be certified against it
COBIT
- A control framework for employing info sec governance best practices within an organisation
- Developed by ISACA (Information Systems Audit & Control Association)
- Made up of four domains:
- Plan & Organise
- Acquire & Implement
- Deliver & Support
- Monitor & Evaluate
- There are a total of 34 IT processes split across the four domains.
ITIL
- Information Technology Infrastructure Library
- A framework for providing best practice in IT Service Management
- Contains five core publications providing guidance on various service management practices:
- Service Strategy: helps IT provide services
- Service Design: details the infrastructure & architecture required to deliver IT services
- Service Transition: describes taking new projects and making them operational
- Service Operation: covers IT operations controls
- Continual Service Improvement: describes ways to improve existing IT services
Scoping & tailoring
- Scoping is the process of determining which parts of a standard will be employed by an organisation. For example, an organisation that does not employ wireless equipment may declare the wireless provisions of a particular standard are out of scope and therefore do not apply.
- Tailoring is the process of customising a standard for an organisation. It begins with controls selection, continues with scoping & finishes with the application of compensating controls.
Protecting data in motion & at rest
- Data at rest is stored data that resides on a disk and/or in a file
- Data in motion is data that is being transferred across a network
- Each form of data requires different controls for protection
Drive & tape encryption
- Drive & tape encryption protects data at rest, and is one of the few controls that will protect data after physical security has been breached (a minimal encryption-at-rest sketch follows this list)
- Controls to encrypt data at rest are recommended for all mobile devices and any media containing sensitive information that may physically leave a site or security zone
- Whole-disk (or full-disk) encryption of mobile device hard drives is recommended, since partially encrypted solutions, such as encrypted folders or partitions, often risk exposing sensitive data stored in temporary files, unallocated space, swap space etc.
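As a minimal illustration of encrypting data at rest at the file level (whole-disk products operate at the block-device layer instead), the sketch below uses the cryptography package's Fernet recipe, assuming that package is installed. The file names are illustrative.

```python
# Minimal sketch of file-level encryption at rest using the "cryptography"
# package's Fernet recipe (authenticated symmetric encryption). Illustrative only.

from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, protect this key (e.g. TPM, key vault)
cipher = Fernet(key)

with open("payroll.csv", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("payroll.csv.enc", "wb") as f:
    f.write(ciphertext)

# Decryption later, given the same key:
# plaintext = Fernet(key).decrypt(ciphertext)
```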
Media storage & transportation
- All sensitive backup data should be stored offsite, whether transmitted electronically over networks or physically moved as backup media
- Sites using backup media should follow strict procedures for rotating media offsite
- Always use a bonded & insured company for offsite media storage, who should use secure vehicles and store media at a secure site.
- It’s important to ensure that the storage site is unlikely to be impacted by the same disaster that may strike the primary site (e.g. flood, earthquake or fire)
- Never use informal practices, such as storing backup media at employees' houses
Protecting data in motion
- Data in motion is best protected via standards-based end-to-end encryption, such as an IPsec VPN (a minimal sketch of encrypting traffic in transit follows this list)
- This includes data sent over untrusted networks such as the Internet, but VPNs may also be used as an additional defence-in-depth measure on internal networks such as a corporate WAN or private circuits such as T1 lines leased from a service provider.
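The notes above recommend IPsec VPNs; as a simpler illustration of standards-based encryption of data in motion, the sketch below opens a TLS-protected connection using Python's built-in ssl module, with certificate & hostname verification against the system trust store. The host name is illustrative.

```python
# Minimal sketch of protecting data in motion with TLS (an alternative to IPsec
# for illustration purposes only).

import socket
import ssl

context = ssl.create_default_context()   # enables certificate & hostname verification

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Negotiated:", tls.version(), tls.cipher())
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls.recv(200))              # response arrives over the encrypted channel
```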
Summary of exam objectives
- Concept of data classification, and roles required to protect data
- An understanding of the remanence properties of volatile and non-volatile memory & storage media is critical to master, along with knowledge of effective secure destruction methods
- Industry-specific and more general standards/guidelines, and processes including scoping & tailoring