
Information Security
Program Management
Ramachandra Kulkarni
Agenda

Information security program
• Organization and budgeting
• Program components
• Staffing and career development
• Metrics and measurement
Organization and budgeting
Information security organization structure
(How will I organize myself)
Business Continuity Planning
• Business Unit Resilience
• Technology Resilience
• Crisis Response
• Supports BCP Coordinators

Application Risk
• Application Security
• Risk Monitoring
• Incident Response
• Supports IT and LSCs

Infrastructure Risk
• Infrastructure Security
• Risk Monitoring
• Incident Response
• Supports IT Infra and LSCs

Business Information Risk
• BU Information Protection
• Risk Monitoring
• Incident/Investigations Response

Investigations
• E-Discovery investigations
• Forensic investigations

Policy and Risk Practices
• Policies and Standards
• Training and Awareness
• Risk Measurement Systems
• Incident Learning

Program Office
• Strategy and Planning
• Macro Risk Profiles
• Threat Horizon
• Solutions Catalog

Architecture
• Security Architecture

Information Security Operational Services
• Operational support to the security streams
• Acts as a service engine
Information security – Key activity streams
Risk Consulting
• Consult with business groups on their projects with regard to information risk
• Develop / suggest and help implement controls around new systems

Risk Monitoring
• Ensure operational monitoring of events
• Ensure implementation of defined policies
• Ensure effectiveness of the incident response mechanism
• Ensure control framework deployment
• Ensure proper segregation of duties

Risk Reporting
• Report external and internal threats and organization readiness
• Provide analytics for prioritized investment in the protection program
• Provide insight into resource allocation
• Provide inputs to the risk analysis process
• Monitor and measure trends in the effectiveness of the security program
Planning and Funding
• Define scope and alignment with business groups
• Yearly planner with quarterly reviews
• Determine the information security model (centralized / decentralized)
• Determine the method of funding (stakeholder / IT budgets / independent)

Major cost components
• Implementation of new security technologies
• Reporting and tools
• Staffing
• Audits and assessments (internal / external)
• Compliance (internal policies / regulatory / contractual)
• Typically 4 – 7 % of the total IT budget (up to 7 – 10 % in specific cases); a quick worked example follows below
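As a rough illustration of that rule of thumb, the sketch below applies the percentages to a hypothetical IT budget; the figures are illustrative only and do not come from the deck.

```python
# Hypothetical illustration of the 4-7 % (up to 7-10 %) budgeting rule of thumb.
it_budget = 10_000_000  # assumed total annual IT budget

typical = (it_budget * 0.04, it_budget * 0.07)
specific_cases = (it_budget * 0.07, it_budget * 0.10)

print(f"Typical security budget: {typical[0]:,.0f} - {typical[1]:,.0f}")
# -> Typical security budget: 400,000 - 700,000
print(f"Specific cases: {specific_cases[0]:,.0f} - {specific_cases[1]:,.0f}")
# -> Specific cases: 700,000 - 1,000,000
```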
Working the interfaces
Information security works across the following divisional interfaces:
• IT operations – Buy-in, technology design, IT services, BCP, capacity & performance management
• Business units – BCP, buy-in for policies, customer requirements, new systems and applications, awareness
• Legal / compliance – Compliance reporting, legal requirements, policy considerations
• Physical security – Physical controls, visitor handling, differentiated access setup in the buildings
Building credibility – the infosec brand.

• Be the “go-to” person for any issue involving security (build domain expertise).
• Admit the existence of issues when they exist (do not get defensive).
• Communicate appropriately (multiple levels of communication).
• Accept and acknowledge solutions from any source (the most common issue).
• Build programs for effective communication (dashboards / scorecards).
• Bring in risk transparency.
• Do not encourage a “differentiated security” culture (very difficult and sensitive).

Ultimately, different groups should feel that information security acts as an enabler, not as a department that puts the brakes on every initiative. The answer is: communicate, communicate, communicate!
Program components
Program component framework
Asset classification
User classification
Threat classification
Risk Analysis
Education
Strategy identification
Metrics
Change management
Security plan development
Security architecture
Source: Burton research.
Program input components

Asset classification
• Inventory of all information assets
• Classification according to value to the organization
• Feeds into risk analysis

User classification
• Categories of users
• Not every user will require every piece of information or every facility
• Key input to segregation of duties within the user community

Threat classification
• Value of consequences
• Location of the operational elements
Program core components

Risk analysis
• The most important component of information security strategy
• Output will be key facts such as the priority of resources to be protected, the impact of threats, and the resources to be made available
• Need not always be highly analytical (tool-driven, etc.)

Strategy identification
• Key inputs come from the risk analysis
• Strategy covers resources, technology, reporting and the program

Security plan development
• Details of the projects to be taken up
• Details of the organization chart and responsibilities
• Details of the control environment

Security architecture
• Technology architecture (technology biases, tool standardization)
• Policy framework (adopted standard, policies and procedures)
• Control framework (baseline, project-based controls)
Program support components

Change management
• Structured approach to changes in the information environment
• Analysis of security impact before a change is made

Education
• Education during induction
• Ongoing education initiatives
• Educating users of new systems
• Delivery channels – classroom, email, videos and articles

Metrics
• Important barometer of the protection standard
• Key input on improvement / decline in security levels
• Operational and quantitative metrics
Staffing and career development
Information security – Staffing parameters

How big your team should be depends on:
• Number of systems (IT infrastructure)
• Percentage of critical business running on IT
• Overlapping functions (BCP, compliance)
• Number of locations
• Number of employees

Some pointers for staffing (a rough sizing sketch follows below):
• 5 – 7 % of IT staff
• In large organizations (> 4,000 employees), one information security staff member for every 700 employees (where > 70 % of data resides on systems)
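The sizing sketch referenced above simply applies the two rules of thumb; the helper name and all input figures are hypothetical, not from the deck.

```python
# Rough team-sizing sketch from the two staffing pointers above:
# 5-7 % of IT staff, and ~1 security FTE per 700 employees in large,
# system-heavy organizations. Input figures are hypothetical examples.

def security_headcount(it_staff: int, employees: int) -> dict:
    estimate = {"5-7% of IT staff": (round(it_staff * 0.05), round(it_staff * 0.07))}
    if employees > 4000:  # the per-700 rule is quoted only for large organizations
        estimate["1 per 700 employees"] = employees // 700
    return estimate

print(security_headcount(it_staff=400, employees=7000))
# -> {'5-7% of IT staff': (20, 28), '1 per 700 employees': 10}
```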
Information Security Organization – Skills mapping
Role levels (senior to junior): CISO; Program Manager – Info. Security; Senior Manager – Info. Security; Team Leaders / Managers; Team Members / Security Specialists.

Skills mapped across these levels (the emphasis shifts from business and influence skills at senior levels to hands-on technology skills and certifications at specialist levels):
• Business understanding
• Ability to influence
• Metrics management
• Assurance mechanisms
• Program management
• Processes expertise
• Regulatory frameworks
• Technology
• Defining and managing processes
• Verification and validation
• Technology (specific areas)
• Business process awareness
• Process appreciation
• Communication skills
• Technology certification
• Assessments and reviews
• Certifications
Information Security Staff – Skills composition
• Consulting – 25%
• Monitoring and reporting – 30%
• Technology – 20%
• Audit and assessment – 15%
• Processes – 10%
Information Security Career Paths (broad level)

Security engineering track
• Security administrator (O/S, Network, DB)
• Security specialist (Firewall, VPN, IDS)
• Security manager (design, planning, implementation and roll-out)
• Engineering Head

Risk and assurance track
• Controls analyst / Process consultant
• Risk and compliance analyst
• Risk and assurance manager (budgets, design validation, stakeholder interface)
• Risk Management and Assurance

Both tracks lead to the Information Security Program Management Office (CISO).
Staff challenges and strategies

Challenges
• Security being equated with technologies (firewall, IDS, anti-virus, etc.)
• Lack of business understanding
• Lack of appreciation for processes
• Inability to convince / influence stakeholders

Some possible solutions
• Develop the ability to interact in “business language”
• Educate information security staff on the big picture (go beyond firewalls, IDS, etc.)
• Penalize process failures through the appraisal system

Staff rotation
• Within the security team across different areas
• Agreement with the IT team for rotation
Metrics and Measurement
Why measuring security is difficult

What do dictionaries say?
• Security – “freedom from risk or danger”
• Security – “keeping from harm”
• Inherently, these definitions point to a relative term

Some things that are difficult to measure
• Employee morale
• Opportunity cost of an outage caused by infosec

Statistics need to be blended with enterprise knowledge
• A trader unable to access the application for 30 minutes from 19:00 to 19:30 is in a totally different situation from a trader unable to access it from 11:00 to 11:30

As a result, what gets measured is what can be visualized, but unfortunately a large percentage of that is not meaningful.
Measurement strategy

Possible options for measurement
• Gap assessment (prioritization is a challenge)
• Against previous performance (aligning with organization goals is not easy)
• Against business criticality (ideal, but very difficult to measure and requires extensive enterprise knowledge)
• ROI metrics (there is hardly any robust method of computing ROI on security)
• Comparison against other organizations (does not align with organization goals, and sharing of information is a limitation)
• Program management (top-down) metrics (blending the details is a big challenge)

None of these methods gives the required assurance on its own; however, a combination of two or more of them is a reasonable plan for metrics.
What information risk metrics should offer
• Should provide fact-based decision making
• Should provide resource allocation insight
• Should serve as a communication tool to influence the organization
• Should essentially surface risk elements
• Should provide pointers on when to:
  • Accept the risk
  • Avoid the risk
  • Mitigate the risk
  • Transfer the risk
• Should provide trends of improvement, or the lack of it
• Should also provide pointers to investment areas
Design Principles for Information Risk Scorecard
Performance Measurement and Communication: In their efforts to create reporting processes that truly resonate with diverse audiences,
Members cite a litany of obstacles, including a dearth of reliable key performance indicators, lack of consensus around what should be measured,
and the perennial challenge of quantifying risk.
Each Metric Should…
…Enable Decision Making: Metrics should translate
technical security data into business risk implications
that executives can leverage to drive mitigation trade-off
decisions.
• Start with the key decisions that need to be made and derive
metrics that support those decisions, instead of a bottom-up
aggregation of available metrics.
• Include CISO risk assertions and concise summaries.
…Articulate Future Readiness: Metrics should not only
provide a view of current performance but also provide
directional guidance on readiness to meet future threat
scenarios.
• Include a CISO explanation of estimated future direction.
…Be Comparable Over Time: Each metric should capture
historical trends to outline performance.
• Develop metrics with longevity in mind, considering the life of various systems and anticipated business changes that may diminish effectiveness and/or complicate collection in the future.
…Require Minimal Resource Consumption: The data for each
metric should be easily and cost-effectively captured.
• Identify both the upfront burden of putting the collection process in place and the ongoing burden of regular data collection.
The Overall Scorecard Should…
Be Simple: Performance measurement reports should be
concise and quickly highlight areas of concern.
Present Nontechnical Data: The scorecard should speak to an
executive audience and focus on nontechnical data.
• To draw audience attention to highest-priority issues, adopt an
exception-based reporting approach instead of inundating the
audience with reams of data.
• Involve the audience in the design by soliciting input prior to and during scorecard creation.
Source: IREC research.
Sample approach

Steps
• Define categories
• Define risk indicators
• Define tolerance levels
• Input the current reporting period results
• Measure against tolerance
• Obtain trends
Define categories
Sample metrics categories
• Infrastructure
• Applications
• Management
• Physical security
• Data protection
• Awareness
• Change management
Sample – Key Risk Indicator definition (quarterly data analysis)

Process: Unauthorized access via external connections
Control objective: To ensure that all external connections have security violation detection mechanisms

• Area: Infrastructure
• Event type: one event type per metric (see the list below)
• Key Risk Indicator and description: percentage of external connections under security monitoring (all external connections should be under security monitoring)
• Tolerance levels: G = 100, Y >= 80, R < 80
• Reporting periods: Q3 FY 2008 (previous status) and Q4 FY 2008 (current value)
• Reported values across recent periods: 95, 78, 74 – a declining trend
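To make the “measure against tolerance” and “obtain trends” steps concrete, here is a minimal sketch. It is not part of the deck: it assumes the tolerance bands reconstructed above (G = 100, Y >= 80, R < 80) and the illustrative quarterly values, and the class and field names are invented for the example.

```python
# Minimal sketch: evaluate a key risk indicator against G/Y/R tolerance bands
# and derive a trend from recent reporting periods. Thresholds and values are
# taken from the sample slide above and are illustrative only.

from dataclasses import dataclass

@dataclass
class KeyRiskIndicator:
    name: str
    area: str                 # e.g. "Infrastructure"
    green_at: float = 100.0   # fully compliant: all connections monitored
    yellow_at: float = 80.0   # tolerable floor before the indicator turns red

    def status(self, value: float) -> str:
        """Map a reporting-period value to a G/Y/R status."""
        if value >= self.green_at:
            return "G"
        if value >= self.yellow_at:
            return "Y"
        return "R"

    def trend(self, history: list) -> str:
        """Compare the two most recent reporting-period values."""
        if len(history) < 2:
            return "insufficient data"
        delta = history[-1] - history[-2]
        return "improving" if delta > 0 else "declining" if delta < 0 else "flat"


kri = KeyRiskIndicator(
    name="Percentage of external connections under security monitoring",
    area="Infrastructure",
)
history = [95, 78, 74]  # illustrative quarterly values
print(kri.status(history[-1]), kri.trend(history))  # -> R declining
```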
Event types (one event per metric)
1. Control failure
2. External threats
3. Internal threats
4. Processing failure
5. Unauthorized activity
Areas
1. Infrastructure
2. Applications
3. Management
4. Physical security
5. Data protection
6. Awareness
7. Change management
Questions