The Computer Forensics and Cybersecurity Governance Model (figure 1), as described in this article, provides a logical and compartmentalized approach for assessing risk. Specifically, technology assurance procedures were developed as a major component of the computer forensics and cybersecurity examination methodology to provide structure to the risk assessment process. Corporate governance over computer forensics and cybersecurity is a partnership among employees, management, the audit committee, the board of directors, regulators and consultants. Establishing a partnership with each individual within a corporation or government agency is crucial to the initial success of establishing and maintaining a workable governance model. The execution and enforcement of the governance model typically is performed by a technology assurance professional. The communication reporting chain of command starts from the field auditor or examiner, who reports findings to a supervisor, then to the CIO and CEO of the company or government agency. Conclusions reached and recommendations made by the technology assurance personnel then are communicated to the audit committee and finally to the board of directors. Any material high-risk issues then are reported to the regulators and law enforcement as needs warrant.

Figure 1. The Computer Forensics and Cybersecurity Governance Model. Components: a macro processes overview (cross-reference figure 5); the Homeland Security composite rating for computer forensics and cybersecurity preparedness (cross-reference figure 2); validation of IDS, vulnerability assessment and other network testing; validation of infosec policies and procedures; and management risk assessment of computer forensics and cybersecurity preparedness.
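The reporting chain of command described above can be sketched as an ordered escalation list. A minimal Python illustration (the role names follow the text; the function and constant names are illustrative):

```python
# Sketch of the communication reporting chain described in the governance model.
# Role names come from the article text; everything else is illustrative.
ESCALATION_CHAIN = [
    "field auditor/examiner",
    "supervisor",
    "CIO",
    "CEO",
    "audit committee",
    "board of directors",
]

def escalation_path(material_high_risk: bool) -> list:
    """Return the ordered chain a finding travels through.

    Material high-risk issues are additionally reported to regulators
    and law enforcement, per the governance model."""
    path = list(ESCALATION_CHAIN)
    if material_high_risk:
        path.append("regulators/law enforcement")
    return path

print(escalation_path(True)[-1])  # regulators/law enforcement
```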
Step 1. Complete the Computer Forensics and Cybersecurity Governance Model exam procedures.
Step 2. Complete the Process and Practice Risk Scorecards.
Step 3. Complete the Heatmap (a consolidation of all color-coded risk scorecards).
Step 4. Evaluate the Process Risk Scorecard for completeness and accuracy against the computer forensic and cybersecurity policy and procedures (processes).
Step 5. Complete the Computer Forensics Examination Procedures Cybersecurity Risk Assessment (process review of policies and procedures).
Step 6. Complete a Process Conclusion Memo describing strengths and weaknesses of computer forensic and cybersecurity processes (policy and procedures).
Step 7. Complete the Computer Forensics and Cybersecurity Examination Procedures (practices).
Step 8. Evaluate the management-prepared Practice Risk Scorecard for completeness and accuracy against the results of the recently completed Computer Forensics and Cybersecurity Examination Procedures.
Step 9. Complete a Practice Conclusion Memo describing strengths and weaknesses of computer forensic and cybersecurity practices.
Step 11. Develop an Executive Summary Report that summarizes the final conclusion on computer forensic and cybersecurity risks.
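Steps 2 and 3 lend themselves to a small sketch. Assuming, for illustration only, that the heatmap composite takes the worst (highest-risk) individual rating on the A-E scale of figure 3, a minimal Python version might look like:

```python
# Illustrative sketch of steps 2-3: consolidating color-coded risk scorecards
# into a heatmap-style composite. The worst-rating rule is an assumption made
# for this example; the article defers the FINAL rating criteria to figure 4.
RATING_ORDER = "ABCDE"  # A = strong ... E = critically deficient (figure 3)

def composite_rating(scorecards: dict) -> str:
    """Return the worst (highest-risk) rating across all scorecards."""
    return max(scorecards.values(), key=RATING_ORDER.index)

# Hypothetical process/practice ratings feeding the heatmap:
heatmap = {
    "infosec policies (process)": "B",
    "network testing (practice)": "D",
    "CIRT (practice)": "A",
}
print(composite_rating(heatmap))  # D
```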
Figure 3. The Computer Forensics and Cybersecurity Governance Model: Risk Assessment Scorecard Criteria
A. Confidentiality: Infosec policies and procedures provide strong documented controls to protect data confidentiality.
B. Confidentiality: Infosec policies and procedures provide adequate documented controls to protect data confidentiality.
C. Confidentiality: Infosec policies and procedures need strengthening to ensure the existence of strong documented controls to protect data confidentiality.
D. Confidentiality: Infosec policies and procedures are inadequate to ensure the existence of strong documented controls to protect data confidentiality.
E. Confidentiality: Infosec policies and procedures are critically deficient to ensure the existence of strong documented controls to protect data confidentiality.
A. Integrity: Infosec policies and procedures strongly protect data integrity through a high level of controls governing authenticity, nonrepudiation and accountability.
B. Integrity: Infosec policies and procedures adequately protect data integrity through a high level of controls governing authenticity, nonrepudiation and accountability.
C. Integrity: Infosec policies and procedures need strengthening to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
D. Integrity: Infosec policies and procedures are inadequate to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
E. Integrity: Infosec policies and procedures are critically deficient to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
A. Availability: Infosec policies and procedures provide a strong internal control standard to protect the timely availability of information technology resources (system and data).
B. Availability: Infosec policies and procedures provide reasonable internal control standards to protect the timely availability of information technology resources (system and data).
C. Availability: Infosec policies and procedures need strengthening to protect the timely availability of information technology resources (system and data).
D. Availability: Infosec policies and procedures are inadequate to protect the timely availability of information technology resources (system and data).
E. Availability: Infosec policies and procedures are critically deficient and require fundamental improvement to protect the timely availability of information technology resources (system and data).
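The A-E scale above can be condensed into a lookup table. A minimal Python sketch (the short labels condense the criteria text; the names and structure are illustrative):

```python
# Illustrative lookup for the figure 3 scorecard scale. The descriptions are
# condensed summaries of the criteria text, not verbatim quotations.
SCORECARD_CRITERIA = {
    "A": "controls are strong",
    "B": "controls are adequate",
    "C": "controls need strengthening",
    "D": "controls are inadequate",
    "E": "controls are critically deficient",
}

def describe(category: str, rating: str) -> str:
    """Render one scorecard cell, e.g. describe('Integrity', 'C')."""
    return f"{category}: {SCORECARD_CRITERIA[rating]}"

print(describe("Integrity", "C"))  # Integrity: controls need strengthening
```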
An intersection exists when the prevention and detection color scorecard ratings match, which validates management's original risk attestation (prevention) based on the completed computer forensic analysis performed in step 2. Conversely, an intersection does not exist when there is a disparity between the color risk ratings assigned during the infosec (prevention) and computer forensic (detection) processes. When such a disparity occurs, further investigation and root cause analysis must be completed and the problem(s) resolved to prevent future occurrences of the infosec vulnerabilities. Figure 5 includes the procedures that should be followed by assurance, infosec and regulatory professionals when implementing the Computer Forensics and Cybersecurity Governance Model and the Homeland Security Advisory System; additional supporting details can be found throughout the article.

There is a need for comprehensive common criteria that allow public and private sector entities to understand the risks and controls governing computer forensics and cybersecurity. Recent federal legislation, as noted within Homeland Security PDD 3, signed by US President George W. Bush on 11 March 2002, provides a comprehensive and effective means to disseminate information regarding the risk of terrorist acts to federal, state and local authorities and to the American people. However, it is focused exclusively on protecting the nation's physical infrastructure against terrorist threats and does not establish common criteria for measuring and understanding cybersecurity risks. The key objective of the Computer Forensics and Cybersecurity Governance Model is to establish an initial governance framework that clarifies the role of technology assurance in understanding the risks and controls governing computer forensics and preventive information security defenses. Raising awareness and understanding of these risks and controls, and of their interrelationship with preventive information security defenses, is critical to developing the preventive controls needed to safeguard information systems against cyberattacks and reduce the need for computer forensics.

The Advisory System for Cybersecurity Preparedness rating criteria (figure 4):
- Infosec policies and practices are CRITICALLY DEFICIENT and need immediate remedial corrective action. As a result of woefully inadequate infosec controls, the potential for computer crime and the need for computer forensics are extremely high.
- Infosec policies and practices are INADEQUATE to reduce cybersecurity risk. As a result of inadequate infosec controls, the potential for computer crime and the need for computer forensics are high.
- Infosec policies and practices NEED STRENGTHENING to ensure adequate controls exist to safeguard against cybercrime, so that the need for a computer forensics examination is reduced.
- Infosec policies and practices are ADEQUATE to reduce the risk of unauthorized access into mission-critical systems. As a result of the adequate controls, the likelihood of cybercrime and the need for a computer forensics examination are reduced.
- Infosec policies and practices are STRONG, greatly reducing the risk of unauthorized access into mission-critical systems. As a result of the strong controls, the likelihood of cybercrime and the need for a computer forensics examination are reduced.

A large risk exists in the general lack of a clear understanding of computer forensics and cybersecurity among assurance professionals and other IT and security personnel. To address this deficiency, a significant amount of research was conducted on this elusive topic. A secondary source of information originated from discussions with practitioners of computer forensics in US federal law enforcement and academia.
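The intersection test described earlier, matching management's prevention (infosec) rating against the detection (computer forensic) rating, can be sketched in a few lines of Python (function names are illustrative):

```python
# Sketch of the intersection check: the prevention rating is validated only
# when it matches the detection rating; a disparity triggers root cause
# analysis, per the governance model. Names are illustrative.
def intersection_exists(prevention: str, detection: str) -> bool:
    """True when the prevention and detection scorecard ratings match."""
    return prevention == detection

def next_action(prevention: str, detection: str) -> str:
    """Map the comparison outcome to the follow-up the model prescribes."""
    if intersection_exists(prevention, detection):
        return "management's risk attestation is validated"
    return "perform further investigation and root cause analysis"

print(next_action("B", "D"))  # ratings diverge, so root cause analysis follows
```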
Develop Common Criteria to Identify, Measure, Monitor and Control Cybersecurity Risks and Computer Forensics Evidentiary Data
There is a need for comprehensive common criteria that allow public and private sector entities and assurance professionals to understand the risks and controls governing
computer forensics and cybersecurity. The US federal government has made strong efforts to train special agents, criminal investigators and other law enforcement personnel in the science of computer forensics; however, limited information on this important subject is available in the public domain. Assurance, infosec and regulatory professionals play an important role in contributing to the evaluation of the risks in computer forensics and cybersecurity. Paradoxically, these professionals, who should be on the front line in assessing computer forensics and cybersecurity risks, may not be able to conveniently obtain training of a quality comparable to that available to the law enforcement community. With only a few universities and private corporations providing specialized computer forensic training for non-law enforcement individuals, the nation's critical infrastructure (e.g., financial institutions) runs the risk of being inaccurately audited or examined by assurance, infosec and regulatory personnel. As a negative consequence, there are islands of intelligence (dots), including unauthorized computer intrusions or hacks into systems, that may go undetected, go unreported to law enforcement, or not be analyzed internally by the entity in a complete and accurate manner.
effectively implement the plan as a matter of routine. Trigger points should be established for each intrusion scenario, including when to contact local and/or federal law enforcement computer crime units. As with continuity of business (COB), a contact list needs to be developed as part of a comprehensive CIRT plan to ensure timely and accurate communication with the appropriate people and organizations.
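The trigger points and contact list described above can be sketched as a simple lookup. A minimal Python illustration (the scenario names and contacts are hypothetical examples, not prescriptions from the article):

```python
# Illustrative sketch of CIRT trigger points mapped to notification contacts.
# Scenario names and contact roles are hypothetical examples.
TRIGGER_POINTS = {
    "unauthorized root access": ["CIRT lead", "CIO", "federal law enforcement"],
    "web defacement": ["CIRT lead", "public relations"],
    "virus outbreak": ["CIRT lead", "desktop support"],
}

def contacts_for(scenario: str) -> list:
    """Return who to notify for an intrusion scenario.

    Unknown scenarios still route to the CIRT lead so that no incident
    goes unreported."""
    return TRIGGER_POINTS.get(scenario, ["CIRT lead"])

print(contacts_for("unauthorized root access")[-1])  # federal law enforcement
```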
Each process (1) and practice (2-6) receives a risk scorecard rating based on the risk assessment scorecard criteria (figure 3).
Input: General and Application Controls 14. Encryption 15. Employee surveillance 16. E-mail 17. Firewalls 18. Internet 19. Intranet 20. Extranet 21. LAN/WAN/WAP 22. Logging controls/audit trails 23. Microcomputers 24. Outsourcing 25. Passwords 26. Virus prevention, detection and removal 27. Contingency planning 28. Data classification 29. Digital signatures 30. E-commerce 31. Laptops 32. Data privacy 33. Telecommunications 34. Telephone systems 35. VPN 36. PKI 37. Steganography 38. Physical security 39. Operating systems 40. Ethical hack 41. IDS 42. Vulnerability assessment 43. Social engineering 44. Audit reports
Input: General and Application Controls 45. Identification and authentication 46. Software import control 47. Controlling interactive software 48. Software licensing 49. Remote access 50. Access to internal databases 51. Data integrity 52. Intellectual legal 53. Regulatory and legal requirements
Input: 1. Purpose, mission and scope 2. Code of conduct 3. Hiring and maintaining CIRT staff 4. Communications (internal and external) 5. Critical information to collect 6. Tools for incident response 7. Handling major events 8. Physical issues 9. Sharing information with other agencies
Trap-and-trace: The most prudent strategy is to consult legal counsel and local, state and federal law enforcement representatives for appropriate guidance on this matter. Generally speaking, a court order apparently is not required for a non-law enforcement person conducting a trap-and-trace on a black hat who is trying to gain access to a company's network from the Internet; however, such activities may be frowned upon by certain courts. Specifically, the courts may view a trap-and-trace activity as a violation of wiretap laws. For a network trap-and-trace, a UNIX SA or other qualified person who has root access can type the command who, which provides the login name of the user, the terminal being used, the location of the terminal, the date and time the user logged in, and the IP address. The SA and other authorized personnel should carefully document this information. Second, assuming a UNIX operating system, the SA can issue a finger command to receive more detailed information on the user (e.g., the full name of the user and the device numbers of the user's terminals).
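As an illustration of documenting the who output described above, a minimal Python sketch that parses one line into its fields and attaches an examiner timestamp (the sample line format is illustrative; actual who output varies by system):

```python
# Sketch: record the fields the text describes (login name, terminal,
# login time, originating host/IP) with an examiner timestamp to support
# later analysis. The sample line layout is illustrative only.
import datetime

def parse_who_line(line: str) -> dict:
    """Parse one line of `who` output: user, tty, date, time, (host-or-IP)."""
    parts = line.split()
    return {
        "user": parts[0],
        "terminal": parts[1],
        "login_time": " ".join(parts[2:4]),
        # The origin host/IP appears in parentheses when present.
        "origin": parts[4].strip("()") if len(parts) > 4 else None,
    }

def document_session(line: str) -> dict:
    """Attach a UTC timestamp recording when the examiner captured the line."""
    record = parse_who_line(line)
    record["documented_at"] = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return record

sample = "jdoe pts/2 2003-04-01 09:15 (203.0.113.7)"  # hypothetical entry
print(document_session(sample)["origin"])  # the remote IP to trace
```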
*Final scorecard rating: B. Prior to assigning a composite risk assessment rating, refer to figure 4, Advisory System for Cybersecurity Preparedness, for the FINAL rating criteria.
Conclusion

The single greatest impediment that this author envisions in successfully formulating the standardization and common criteria for computer forensics, digital evidence and cybersecurity is the continual reticence on the part of private sector entities and public sector agencies to reveal embarrassing details about unauthorized attempts to break into their systems. Ostensibly, for management of an entity to admit actual or suspected unauthorized computer intrusions is to plead mea culpa to security shortcomings, which may result in reputation risk, negative public perception and reduced profitability for the victimized private sector entities, attributable to the loss of the public's confidence in the organization.

Future Steps

To ensure the law enforcement community, private sector entities and government agencies are all in lockstep with consistent and universal standards and procedures governing the collection, analysis, preservation and reporting of evidentiary digital data in the US, a task force needs to be established and directed by the US Office of Homeland Security. Specifically, the composition of the cybersecurity and computer forensics preparedness task force needs to include private and public sector industry technology and legal experts, along with representation from the federal, state and local law enforcement community, to provide insight in developing standardization and common criteria over the science of digital evidence and required information security defenses.
Disclaimer: The contents of this article represent the opinions of the author only and are not necessarily shared by his current or previous employers.

Kenneth C. Brancik, CISA, has worked in technology assurance for approximately 17 years. He is a senior bank examiner for the Federal Reserve Bank of New York, where he is involved in understanding the risks and controls over emerging technologies as they impact the banking industry. Brancik's prior employers include Citigroup, where he was a vice president/manager within the audit and risk review department; PricewaterhouseCoopers LLP Assurance & Business Advisory Service, as a manager; and Merrill Lynch & Company, as a corporate technology auditor. He currently is a doctoral student in Pace University's Computer Science Department, where he is studying software engineering and the impact of emerging technologies on technology and software development. He has a master's degree in management systems from New York University.
Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.

Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit and Control Association and/or the IT Governance Institute and their committees, and from opinions endorsed by authors' employers or the editors of this Journal. Information Systems Control Journal does not attest to the originality of authors' content.

Copyright © 2003 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™

Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association, without express permission of the association or the copyright owner is expressly prohibited.

www.isaca.org