
DevOps Maturity Model

IBM Rational

03/10/2014

1
What Are Our Goals?
• The ultimate goal is to continuously improve tools and processes to produce a quality product with faster delivery to our customers.
• Measuring the continuous improvement is important to show we are focusing on the areas that provide the best ROI.
• How do we plan to measure?
  – DevOps Maturity Model Self-Assessment
  – DevOps Outcome-Based Self-Measurement
• How often do we plan on doing the self-assessments?
  – The current plan is once per quarter.
• What will be done with the measurements?
  – Identify tools and processes that can help improve DevOps maturity.
  – Measure continuous improvement through the metrics collected.
  – Share results with the executive teams.
2 © 2013 IBM Corporation
DevOps Principles and Values
 Develop and test against a production-like system
 Iterative and frequent deployments using repeatable and reliable processes
 Continuously monitor and validate operational quality characteristics
 Amplify feedback loops

[Slide montage: thumbnails of the DevOps Maturity Model Self-Assessment ("Plan & Measure" questions, 1 through 63), the practice-based maturity model matrix, the STG DevOps Proof of Concept Investigation diagram, and the Outcome Based Metrics questionnaire. Each appears in full on a later slide.]
Where do you start? DevOps improvements adoption
Assess and define outcomes & supporting practices to drive strategy and roll-out

Step 1: What am I trying to achieve? (Business Goal Determination)
• Think through business-level drivers for improvement
• Define measurable goals for your organizational investment
• Look across silos and include key Dev and Ops stakeholders

Step 2: Where am I currently? (Current Practice Assessment)
• What do you measure and currently achieve?
• What don't you measure, but should in order to improve?
• Which practices are difficult, incubating, or well scaled?
• Do your team members agree with these findings?

Step 3: What are my priorities? (Objective & Prioritized Capabilities)
• Start from where you are today and from your improvement goals
• Consider changes to People, Practices, Technology
• Prioritize change using goals, complexities and dependencies

Step 4: How should my practices improve? (Roadmap)
• Understand your appetite for cross-functional change
• Target improvements that get the best bang for the buck
• Build a roadmap and agree on an actionable plan
• Use measurable milestones that include early wins

[Thumbnail: the practice-based maturity model matrix, marked with fully achieved, partially achieved, and goal practices; shown in full on a later slide.]
Outcome Based Metrics
1) Which of the following best describes the state of your code at the end of each iteration?
* Ready for testing
* Partially tested, ready for additional integration, performance, security, and/or other testing
* Fully tested, documented, and ready for production delivery or GA release (modulo translation work or legal approvals)
2) How quickly can your team pivot (complete a feature in progress and start working on a newly-arrived, high-priority feature)?
* 3 months or longer
* less than 3 months
* less than one month
* less than one week
3) How quickly are you able to change a line of code and deliver to customers as part of a fully-tested, non-fixpack release?
* 12 months or longer
* less than 6 months
* less than 3 months
* less than one month
* less than one week
* less than one day
4) What is the cost (in person-hours) of executing a full functional regression test?
<enter value>
5) How long does it take for developers to find out that they have committed a source code change that breaks a critical function?
* One week or longer
* 1 to 7 days
* 12 to 24 hours
* 3 to 12 hours
* 1 to 3 hours
* less than one hour
6) Which of the following best describes the state of your deployment automation for the environments used for testing?
* We have no deployment automation. Setting up our test environments is entirely manual.
* We have some deployment automation, but manual intervention is typically required (for example, to provision machines, set up dependencies, or to complete the process).
* We create fully-configured test environments from scratch and we reliably deploy into those environments without manual
intervention.
* We create fully-configured production-congruent environments from scratch and we reliably deploy into those environments
without manual intervention.
Outcome Based Metrics
7) (If your product is SaaS) Which of the following best describes the state of your deployment automation for
your staging and production environments?
* We have no deployment automation. Setting up our staging and production environments is entirely
manual.
* We have some deployment automation, but manual intervention is typically required (for example, to provision machines, set up dependencies, or to complete the process).
* We create fully configured staging and production environments from scratch and we reliably deploy into
those environments without manual intervention.

8) (If your product is SaaS) Are you able to make business decisions based on data provided by
infrastructure, application, and customer experience monitoring?
* yes
* no

9) (If your product is SaaS) How much downtime is generally required to deploy a new version into production?
* 4 hours or longer
* 1-4 hours
* less than 1 hour
* No downtime is needed

10) (If your product is SaaS) How often do problems occur when deploying a new version into production?
* Problems always occur
* Problems occur about 50% of the time
* Problems occur about 25% of the time
* Problems occur about 10% of the time
* Problems are rare
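
Because most of the questionnaire's answers are ordinal, quarterly self-assessments can be compared numerically to show improvement over time. The sketch below is illustrative only; the numeric scale (0 = least mature option) is an assumption, not part of the deck:

```python
# Illustrative sketch only: scores the ordinal answers of the outcome-based
# questionnaire so quarterly self-assessments can be compared. The numeric
# scale (0 = least mature option) is an assumption, not part of the deck.
PIVOT_SCALE = ["3 months or longer", "less than 3 months",
               "less than one month", "less than one week"]

def score(answer: str, scale: list[str]) -> int:
    """Position of the chosen option on its ordinal scale."""
    return scale.index(answer)

q1_2014 = score("less than 3 months", PIVOT_SCALE)   # 1
q2_2014 = score("less than one month", PIVOT_SCALE)  # 2
improved = q2_2014 > q1_2014
print(f"Q2 pivot score {q2_2014} vs Q1 {q1_2014}: improved={improved}")
```

The same pattern applies to questions 1, 3, 5, 6, and 9; question 4 is already numeric (person-hours) and can be tracked directly.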
Initial Rollout

Project | Contacts | DevOps Overview & Maturity Model | Self Assessment | Analyze Results / Roadmap for Goals
Ice Castle | Marla Berg, John Beveridge, Lafonda Richburg | 01/30/2014 | 03/07/2014 | 03/21/2014
Platform Resource Scheduler | Anumpa, Jason Adelman, Shen Wu | 01/30/2014 | 03/06/2014 | 03/21/2014
IBM Cloud OpenStack Platform | Christine Grev, Gary Palmersheim | 01/30/2014 | 03/06/2014 | 03/21/2014
GPFS | Bonnie Pulver, Lyle Gayne, Steve Duersch, Yuri L Volobuev | 02/19/2014 | 02/21/2014 | 03/07/2014
Pinehurst | Sue Townsend | 01/30/2014 | |
Power FW | Atit | 02/19/2014 | |
AIX | Atit | 01/30/2014 | |
PowerVC | (Thomas) | | |
Platform LSF | (Akthar) | | |
Platform Symphony | (Akthar) | | |
MCP | (Frye) | | |
PowerKVM | (Frye) | | |
Power Linux? | (Frye) | | |


Sample Results
DevOps Maturity Model Self Assessment Results
Outcome Based Metrics



Sample Details

Plan & Measure


Reliable



Sample Details

Develop & Test


Practiced



Sample Details

Release & Deploy


Practiced



Sample Details

Monitor & Optimize


Practiced



DevOps Outcome Metrics

DevOps Outcome Metric | Sample1 | Sample2
1. State of code at end of iteration | Partially tested | Partially tested
2. How quickly can the team pivot to a new high priority | Less than 1 month | Less than 3 months
3. How long from a line-of-code change to a tested release | Less than 1 month | Less than 3 months
4. Person-hours for a full functional regression | 280 hours (7 PW) | (if 1-2 weeks, what PW)?
5. Time for a developer to learn a critical function is broken | 1-3 hours | 12-24 hours
6. State of deployment automation | We create fully configured test environments from scratch and we reliably deploy into those environments without manual intervention | We have some deployment automation, but manual intervention is typically required
7. If SaaS: deployment automation for staging | N/A | N/A
8. If SaaS: decisions based on monitoring? | N/A | N/A
9. If SaaS: downtime required to deploy to production | N/A | N/A
10. If SaaS: problems on production deployments | N/A | N/A



Sample Maturity Model Assessment

Level | Plan / Measure | Development / Test | Release / Deploy | Monitor / Optimize
Scaled | Define release with business objectives; Measure to customer value | Improve continuously with development intelligence; Test continuously | Manage environments through automation; Provide self-service build, provision and deploy | Automate problem isolation and issue resolution; Optimize to customer KPIs continuously
Reliable | Plan and source strategically; Dashboard portfolio measures | Manage data and virtualize services for test; Deliver and integrate continuously | Standardize and automate cross-enterprise; Automate patterns-based provision and deploy | Optimize applications; Use enterprise issue resolution procedures
Repeatable | Link objectives to releases; Centralize Requirements Management; Measure to project metrics | Automated test environment deployment; Run unattended test automation / regression | Plan departmental releases and automate status; Automated deployment with standard topologies | Monitor using business and end user context; Centralize event notification and incident resolution
Practiced | Document objectives locally; Manage department resources | Schedule SCM integrations and automated builds; Test following construction | Plan and manage releases consistently; Standardize deployments | Monitor resources; Collaborate Dev/Ops informally

Legend: Fully Achieved | Partially Achieved | Goal



Sample Maturity Model Assessment

GOALS: focus up (where is the best result?) and focus across.

Level | Plan / Measure | Development / Test | Release / Deploy | Monitor / Optimize
Scaled | Define release with business objectives; Measure to customer value | Improve continuously with development intelligence; Test continuously | Manage environments through automation; Provide self-service build, provision and deploy | Automate problem isolation and issue resolution; Optimize to customer KPIs continuously
Reliable | Plan and source strategically; Dashboard portfolio measures | Manage data and virtualize services for test; Deliver and integrate continuously | Standardize and automate cross-enterprise; Automate patterns-based provision and deploy | Optimize applications; Use enterprise issue resolution procedures
Repeatable | Link objectives to releases; Centralize Requirements Management; Measure to project metrics | Automated test environment deployment; Run unattended test automation / regression | Plan departmental releases and automate status; Automated deployment with standard topologies | Monitor using business and end user context; Centralize event notification and incident resolution
Practiced | Document objectives locally; Manage department resources | Schedule SCM integrations and automated builds; Test following construction | Plan and manage releases consistently; Standardize deployments | Monitor resources; Collaborate Dev/Ops informally

Legend: Fully Achieved | Partially Achieved | Goal



Goal Discussion: Planning for Initiative #1

Pain Point | Improvement Required | Value | Effort | Priority | Next Step



DevOps Proof of Concept Investigation

[Architecture diagram; recoverable detail below.]
• Customer interaction via Service Management Connect: content, feedback, download
• Hosted environment (SCE) with driver VMs: RFE driver VM, continuous feedback, continuous test
• Practices in scope: agile development, continuous integration, continuous build, continuous deployment
• Tooling: Rational Focal Point; Rational UrbanCode; Rational Team Concert (Task WI, Change Record WI); security scanning with AppScan; Jazz SCM; Jazz Build Engine (JBE)
• Image catalog VMs (Development VM, Builder VM, Build Resources, Test Environment) containing, variously: compilers, AppScan, web browser, RTC Eclipse client, RTC web client, RTC build engine client/agent, RTC build client (JBE), Focal Point client, driver images, compile pool resource, test resources, debug environment, UrbanCode Deploy?
Introduction to Practice Based Maturity Model

Level | Plan / Measure | Development / Test | Release / Deploy | Monitor / Optimize
Scaled | Define release with business objectives; Measure to customer value | Improve continuously with development intelligence; Test continuously | Manage environments through automation; Provide self-service build, provision and deploy | Automate problem isolation and issue resolution; Optimize to customer KPIs continuously
Reliable | Plan and source strategically; Dashboard portfolio measures | Manage data and virtualize services for test; Deliver and integrate continuously | Standardize and automate cross-enterprise; Automate patterns-based provision and deploy | Optimize applications; Use enterprise issue resolution procedures
Repeatable | Link objectives to releases; Centralize Requirements Management; Measure to project metrics | Link lifecycle information; Deliver and build with test; Centralize management and automate test | Plan departmental releases and automate status; Automated deployment with standard topologies | Monitor using business and end user context; Centralize event notification and incident resolution
Practiced | Document objectives locally; Manage department resources | Schedule SCM integrations and automated builds; Test following construction; Manage Lifecycle artifacts | Plan and manage releases consistently; Standardize deployments | Monitor resources; Collaborate Dev/Ops informally



Maturity Levels Defined
Maturity levels are defined by how well an organization performs its practices. The levels consider consistency, standardization, usage models, defined practices, a mentor team or center of excellence (COE), automation, continuous improvement, and organizational or technical change management.

Practiced: Some teams exercise activities associated with the practice, inconsistently. No enterprise standards are defined. Automation may be in place, but without consistent usage models.

Repeatable (Consistent): Enterprise standards for the practice are defined. Some teams exercise activities associated with the practice and follow the standards. No core team or COE exists to assist with practice adoption. Automation, if used, follows enterprise standards.

Reliable: Mechanisms exist to assist adoption and ensure that standards are being followed. A core team of mentors is available to assist in adoption.

Scaled: Adoption is institutionalized across the enterprise. The COE is a mature and integral part of continuous improvement and enablement. Practices are mainstreamed across the enterprise. A feedback process is in place to improve the standards.
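
The four levels form an ordered scale, with each level building on the capabilities of the one below. As an illustrative sketch only (the attribute names are assumptions, not part of the model), the definitions can be encoded so that yes/no answers about a single practice map to a level:

```python
# Illustrative sketch only: encodes the four maturity levels as an ordered
# scale and derives a level from yes/no answers about a single practice.
# The attribute names are assumptions, not part of the IBM model.
from enum import IntEnum

class Maturity(IntEnum):
    PRACTICED = 1   # inconsistent activity, no enterprise standards
    REPEATABLE = 2  # enterprise standards defined and followed by some teams
    RELIABLE = 3    # adoption mechanisms and a core mentor team exist
    SCALED = 4      # institutionalized, COE-driven continuous improvement

def assess(standards_defined: bool, mentors_available: bool,
           institutionalized: bool) -> Maturity:
    """Map cumulative capabilities to the highest level they satisfy."""
    if institutionalized and mentors_available and standards_defined:
        return Maturity.SCALED
    if mentors_available and standards_defined:
        return Maturity.RELIABLE
    if standards_defined:
        return Maturity.REPEATABLE
    return Maturity.PRACTICED

print(assess(standards_defined=True, mentors_available=False,
             institutionalized=False).name)  # REPEATABLE
```

Because `IntEnum` members compare as integers, assessments across quarters or across practices can be compared directly (for example, `Maturity.SCALED > Maturity.RELIABLE`).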



Plan/Measure
At the practiced level, organizations capture business cases or goals in documents for each project to define scope within the strategy, but resourcing for projects is managed at the department level. Once projects are executed, change decisions and scope are managed within the context of the project or program to achieve goals within budget and time. As organizations mature, business needs are documented within the context of the enterprise and measured against customer value metrics. Those needs are then prioritized, aligned to releases, and linked to program or project requirements. Project change decisions and scope are managed at the portfolio level.



Development/Test
At the practiced level, project and program teams produce multiple software development lifecycle products in the form of documents and spreadsheets to explain their requirements, designs, and test plans. Code changes and application-level builds are performed on a formal, periodic schedule to ensure sufficient resources are available to overcome challenges. Testing, except at the unit level, is performed following a formal delivery of the application build to the QA team, after most if not all construction is completed. As organizations mature, software development lifecycle information is linked at the object level to improve collaboration within the context of specific tasks and information. This provides the basis for development intelligence used to continuously assess the impact of process or technology improvements. A centralized testing organization and service provides support across applications and projects, continuously running regression and higher-level automated tests, provided that infrastructure and application deployment can support them. Software delivery, integration, and builds with code scans and unit testing are performed routinely and continuously for individual developers, teams, applications and products.
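
The continuous integration described above can be reduced to a simple gate: every delivered change triggers a build, unit tests, and a code scan, and the developer hears about a failure in the same run rather than after a hand-off to QA. A minimal sketch, with hypothetical stand-in functions (not RTC/JBE APIs):

```python
# Illustrative sketch only: a minimal continuous-integration gate. Every
# delivery runs build, unit tests and a code scan before the change is
# integrated. The callables are hypothetical stand-ins, not RTC/JBE APIs.
from typing import Callable

def integrate(change: str, build: Callable[[str], bool],
              unit_tests: Callable[[str], bool],
              code_scan: Callable[[str], bool]) -> str:
    """Gate a change: integrate only if build, tests and scan all pass."""
    for stage, check in (("build", build), ("unit tests", unit_tests),
                         ("code scan", code_scan)):
        if not check(change):
            return f"rejected at {stage}"  # developer learns immediately
    return "integrated"

# A change that fails its unit tests is reported within the same run,
# rather than after a formal delivery of the build to the QA team.
result = integrate("fix-123", build=lambda c: True,
                   unit_tests=lambda c: False, code_scan=lambda c: True)
print(result)  # rejected at unit tests
```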



Release/Deploy
At the practiced level, releases are planned annually for new features and maintenance. Critical repairs and off-cycle releases emerge as needed. All are managed in a spreadsheet updated through face-to-face meetings. Impact analysis of change is performed manually as events occur. Application deployments and middleware configurations are performed consistently across departments using manual, or manually staged and initiated, scripts. Infrastructure and middleware are provisioned similarly. As organizations mature, releases are managed centrally in a collaborative environment that leverages automation to maintain the status of individual applications. Deployments and middleware configurations are automated, then move to self-service, providing individual developers, teams, testers and deployment managers with the capability to build, provision, deploy, test, and promote continuously. Infrastructure and middleware provisioning evolves to an automated, then self-service, capability similar to application deployment. Operations engineers move to changing automation code and redeploying, rather than making manual or scripted changes to existing environments.
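
The "change the automation code, then redeploy" pattern can be sketched as rebuilding the environment from a declarative description instead of patching it in place. A minimal, hypothetical sketch (the state keys and values are illustrative, not tied to any product):

```python
# Illustrative sketch only: environments are rebuilt from a declarative
# desired-state description instead of being patched in place. The keys
# and values are hypothetical examples.
def provision(desired: dict) -> dict:
    """Build an environment from scratch to match the desired state."""
    return {"middleware": desired["middleware"],
            "app_version": desired["app_version"]}

# Operations edits the automation input, not the running environment:
desired_state = {"middleware": "WAS 8.5", "app_version": "2.1"}
env = provision(desired_state)

desired_state["app_version"] = "2.2"      # change the automation code...
env = provision(desired_state)            # ...and redeploy
print(env["app_version"])  # 2.2
```

Because every deployment starts from scratch, test and production environments stay congruent with the automation, which is the end state questions 6 and 7 of the self-assessment describe.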



Monitor/Optimize
At the practiced level, deployed resources are monitored, and events or issues are addressed as they occur, without the context of the affected business application. Dev and Ops coordination is usually informal and event-driven. Feedback on user experience with business applications is gathered through formalized defect programs. As organizations mature, monitoring is performed within the context of business applications, and optimization begins in QA environments to improve stability, availability and overall performance. Customer experience is monitored to optimize experiences within business applications. Optimization to customer KPIs is part of the continuous improvement program.



Sample: Practice based maturity model: Maturity Goals for an Initiative

Level | Plan / Measure | Development / Test | Release / Deploy | Monitor / Optimize
Scaled | Define release with business objectives; Measure to customer value | Improve continuously with development intelligence; Test continuously | Manage environments through automation; Provide self-service build, provision and deploy | Automate problem isolation and issue resolution; Optimize to customer KPIs continuously
Reliable | Plan and source strategically; Dashboard portfolio measures | Manage data and virtualize services for test; Deliver and integrate continuously | Standardize and automate cross-enterprise; Automate patterns-based provision and deploy | Optimize applications; Use enterprise issue resolution procedures
Repeatable | Link objectives to releases; Centralize Requirements Management; Measure to project metrics | Link lifecycle information; Deliver and build with test; Centralize management and automate test | Plan departmental releases and automate status; Automated deployment with standard topologies | Monitor using business and end user context; Centralize event notification and incident resolution
Practiced | Document objectives locally; Manage department resources | Schedule SCM integrations and automated builds; Test following construction; Manage Lifecycle artifacts | Plan and manage releases consistently; Standardize deployments | Monitor resources; Collaborate Dev/Ops informally

Legend: Fully Achieved | Partially Achieved | Goals

