
A Performance-Based Health Department Approach to Aligning Agency Plans

State: GA Type: Promising Practice Year: 2020

Cobb & Douglas Public Health (CDPH), a two-county health district located in the northwest suburbs of Metro Atlanta, is one of 18 local health districts in Georgia. With its partners, CDPH promotes and protects the health and safety of approximately 902,196 residents of Cobb and Douglas counties. Specifically, in 2018, Cobb County was the second most populous county in the state with 756,865 residents, of which 63% are White, 29% are Black, and 6% are Asian by race, and 13% are of Hispanic ethnicity. Douglas County is more rural with 145,331 residents, of which 47% are White, 48% are Black, and 2% are Asian by race, and 10% are of Hispanic ethnicity. More information about CDPH can be found here: http://www.cobbanddouglaspublichealth.com/


In May 2015, CDPH became the first health district in Georgia to achieve initial accreditation from the Public Health Accreditation Board (PHAB). This accomplishment was only possible after completing an action plan to address gaps identified in the performance management (PM) system and alignment of critical plans. Since then, and in preparation for reaccreditation, CDPH has developed an advanced integrated PM system that aligns processes towards accomplishing organizational goals.


CDPH adopted the Balanced Scorecard (BSC) framework in 2009, which was managed manually through Excel spreadsheets. There were many inefficiencies associated with PM system management, including heavy demands on staff time, duplication of effort, and a lack of real-time data. A robust strategy was developed in 2016 to facilitate collection, reporting, and use of the PM system to drive decision-making and results.


The following goals and objectives for this project were integrated as a priority within CDPH's strategic plan and completed between 2016 and 2019.

Goal 1: Identify a cloud-based PM software solution by mid-2016.

  • Objective 1.1: Conduct a needs assessment of PM stakeholders to understand desired features in a PM tool. Needs identified include: low total and annual cost, cloud hosting, automated archiving (keeping a record of previous data/multiple data points for trend use), ability to create reports, support for data drill-downs, interoperability with other IT systems, user friendliness, ability to link PHAB documents/the reaccreditation process, level of IT support, ability to send automated reporting reminders, data exporting capabilities, ability to store documents, and security-based access to control edit/view rights.
  • Objective 1.2: Research desired PM solutions currently available to adapt to CDPH's needs identified in Objective 1.1.
  • Objective 1.3: Issue a request for proposal (RFP) to the top two potential PM vendors and sign a three-year contract (InsightVision was selected based on its expertise with the BSC framework).


Goal 2: Implement PM tool to align agency plans by 2018.

  • Objective 2.1:  Develop 30 program-level scorecards to align with agency-level scorecard by mid-2017, pilot for one year, and finalize in mid-2018.
  • Objective 2.2: Integrate QI projects, workforce development goals, stakeholder feedback, and community partnership goals into each program scorecard.
  • Objective 2.3: Create CHIP scorecards for two county coalitions and link to program scorecards, if applicable.
  • Objective 2.4: Integrate strategic plan initiatives into agency-level scorecard.
  • Objective 2.5: Integrate population health outcome measures into each program scorecard.


Goal 3: Evaluate performance of programs with PM tool by 2019.

  • Objective 3.1: Transfer program evaluations from 2009 to 2016 into new PM tool.
  • Objective 3.2: Transition from paper-based program evaluation to electronic in new PM tool.
  • Objective 3.3: Integrate strategic planning concepts into each program evaluation, including program mission, vision, SWOT analysis, analysis of scorecard measures, QI project, goals for next three years, budget for upcoming fiscal year, organizational chart, and program in action story.


Key factors that facilitated implementation of CDPH's integrated PM system include leadership support, dedicated staffing, and shared ownership of PM. First, leadership support was essential throughout the process to emphasize the importance of PM, ensure sufficient funding for activities, and allocate staff time to further develop the agency-level and program scorecards with a consultant. Second, CDPH dedicated full-time staff to manage PM, QI, and accreditation by creating the Office of Quality Management (OQM) to lead efforts and ensure alignment. Most importantly, cascading the agency BSC to program scorecards and individual employee performance (annual performance reviews) facilitated shared ownership of the PM system and use of the tool among scorecard owners who contribute to PM processes, without overreliance on any single staff member.


CDPH has developed into a performance-based, data-driven health department. The public health impact of this practice is tremendous because CDPH has found a dynamic and sustainable way to measure, track, and demonstrate its impact on community health outcomes through the alignment of the agency's PM efforts and plans. This process has increased the use of data for decision making and shifted CDPH towards furthering a culture of quality.

CDPH's integrated PM system is responsive to the entire target population, listed below, in the following ways:

  1. 350 CDPH staff – Each employee's performance is evaluated annually through an individual employee scorecard that has measures linked to the applicable program or agency scorecard measure in InsightVision. These individual scorecards are updated annually to adapt to changes in InsightVision and employee needs. In addition, staff satisfaction is included in the agency scorecard (E3a, E3b) to ensure that leadership is responsive to employee needs and informs workforce development plan strategies.
  2. 30 Programs – Each program scorecard, managed by a program manager, is linked to the agency scorecard, managed by a Leadership Team member. When programs undergo changes in their deliverables, program managers update their scorecards to reflect measures that accurately evaluate the program's performance. For example, the Adolescent Health & Youth Development program changed the way it delivers sex education from solely providing training classes outside of the school setting to integrating it into the school curriculum. Therefore, the scorecard measure was updated from “# of training classes” to “# of students completing sex education course.”
  3. Customers/Patients of CDPH Services – Each customer or patient that receives services from CDPH also receives a satisfaction survey assessing their experience, in hopes of identifying areas for improvement. The scores from the satisfaction surveys are entered into each program's scorecard measures for C1a: “Overall Satisfaction” and C1b: “Timeliness of Service Delivery” (wait time). This ensures that leadership and program managers are actively monitoring customer feedback for process improvement.
  4. Partners – Similar to customers/patients receiving satisfaction surveys, CDPH partners also receive partner satisfaction surveys annually to assure that relationships are effectively managed (Measure B3b). Partner satisfaction data is used to inform CDPH's strategic plan and community health assessment and improvement planning processes.


This PM solution is a burst of innovation which builds on previous model practices. In 2009, CDPH became the first health department to adopt the evidence-based BSC framework and adapt the for-profit model for governmental sectors through the creation of its initial agency-level scorecard. In 2012, CDPH received a NACCHO Model Practice Award for CDPH's approach to building infrastructure to support strategic performance management through the development of its Office of Quality Management. In 2014, CDPH's program evaluation process (formerly called The Program Summary Tool) earned a NACCHO Model Practice Award. Since then, CDPH has taken a deliberate and thoughtful approach to enhance its PM systems by developing an integrated framework that can be sustained by CDPH and replicated within other health departments.


The BSC was created by Robert Kaplan and David Norton in the 1990s at the Harvard Business School in an effort to help organizations manage their performance on four different criteria, called perspectives, rather than just one (financial). These four perspectives are customer, business process, employee learning & growth, and financial. Based on these, measures are strategically selected and categorized accordingly. The system connects the dots between big-picture strategy elements, such as mission/vision/values and strategic focus areas, and the more operational elements, such as objectives (continuous improvement activities), measures (or KPIs, which track strategic performance), targets, and initiatives. This methodology shifts the organization towards a strategic management system rather than just a measurement system. CDPH has successfully adapted this business framework to public health. A brief sketch of this structure follows; after it, CDPH's agency-level scorecard is listed with notes describing how agency plans are linked.
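To make the perspective → objective → measure hierarchy concrete, here is a minimal, hypothetical data-model sketch in Python. None of the class or field names come from InsightVision or CDPH's system; they simply mirror the BSC vocabulary above.

```python
from dataclasses import dataclass, field

# Hypothetical BSC data model: a scorecard holds perspectives, perspectives
# hold objectives, and objectives hold measures (KPIs) with targets.

@dataclass
class Measure:
    code: str                    # e.g., "C1a"
    name: str                    # e.g., "Customer Satisfaction Rating"
    target: float
    values: list[float] = field(default_factory=list)  # one per reporting period

    def on_target(self) -> bool:
        """True if the most recent data point meets or exceeds the target."""
        return bool(self.values) and self.values[-1] >= self.target

@dataclass
class Objective:
    code: str                    # e.g., "C1"
    name: str
    measures: list[Measure] = field(default_factory=list)

@dataclass
class Perspective:
    name: str                    # Customer, Business Process, Employee L&G, Financial
    objectives: list[Objective] = field(default_factory=list)

@dataclass
class Scorecard:
    owner: str                   # "Agency" or a program name
    perspectives: list[Perspective] = field(default_factory=list)

# Example: an agency-style C1a measure with one quarterly data point.
c1a = Measure("C1a", "Customer Satisfaction Rating", target=3.55, values=[3.6])
assert c1a.on_target()
```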

Example: Agency-Level Scorecard 

Customer (C) Perspective

Objective C1. Provide High Quality Services to our Customers     

  • Measure C1a. Customer Satisfaction Rating – all programs list their customer satisfaction ratings here.
    • Customer feedback linkage (external)
  • Measure C1b. Timeliness of Service Delivery – also known as wait time, collected from customer survey.

Objective C2. Promote Health and Prevent Injury and Disease to Achieve Healthy Outcomes        

  • Measure C2a. Programs Meeting Activity Targets – All programs list their most mission-related activity here.
  • Measure C2b. Programs Meeting Outcome Targets – The outcomes of C2a activity counts.
  • Measure C2c. Community Health Metrics Meeting Targets – Population health outcomes for each program are listed here.
    • CHA, CHIP linkage


Business Process (B) Perspective

Objective B1. Improve Operational Effectiveness and Efficiency  

  • Measure B1a. Culture of Quality Rating
    • Customer feedback linkage (internal)
  • Measure B1b. # of Service Encounters – also known as patients served by each program.
  • Measure B1c. On Track with Process Improvements – All programs' QI projects are linked here, as are strategic plan initiatives, since those are agency-wide improvement efforts.
    • QI Plan and Strategic Plan linkage

Objective B2. Promote Effective Communication and Collaboration          

  • Measure B2a. # of External Media Mentions
  • Measure B2b. Internal Communication Rating
    • Customer feedback linkage (internal)
  • Measure B2c. # of website visits on cobbanddouglaspublichealth.org

Objective B3. Promote, Develop, and Evaluate Community Partnerships

  • Measure B3a. # of Partnerships Developed to Fill Strategic Needs
    • CHIP linkage
  • Measure B3b. % of Partnerships Achieving a Positive Rating on the Partnership Evaluation
    • Customer feedback linkage (external)


Employee Learning and Growth (E) Perspective

Objective E1. Utilize Technology to Improve Service Delivery and Management Decisions

  • Measure E1a. Technology Work Orders Completed
  • Measure E1b. Employee Technology Satisfaction Rating
    • Customer feedback linkage (internal)

Objective E2. Attract, Develop, Retain Effective Performers         

  • Measure E2a. % of Employees Accomplishing Individual Scorecard Goals
  • Measure E2b. Employee Retention Rate
  • Measure E2c. Employees Meeting Development Goals
    • Workforce Development Plan linkage

Objective E3. Build a Safe and Healthy Environment Where People Feel Valued and We Celebrate Success             

  • Measure E3a. Employee Satisfaction Rating
    • Customer feedback linkage (internal)
  • Measure E3b. Employee Values and Success Rating


Financial (F) Perspective

Objective F1. Allocate Resources Based on Priorities and Results

  • Measure F1a. Total Discretionary $ Allocated to Agency Mission Goals
    • Strategic Plan linkage

Objective F2. Diversify, Grow, and Sustain Funding Sources          

  • Measure F2a. Funding from New, Non-Traditional Funding Sources
  • Measure F2b. Fund Balance Reserves as % of Annual Budget
  • Measure F2c. % Billing and Collection Rate – all programs that bill for services, track their collection rate here.

Objective F3. Excel in Stewardship and Financial Accountability  

  • Measure F3a. Agency Total Budget Dollars in Green Category – all programmatic budgets receive a monthly red, green, yellow evaluation.


Listed below is one of 30 program-level scorecards; it mirrors the agency-level scorecard but is adapted to fit programmatic needs. Performance measures for all 31 scorecards (30 program scorecards plus the agency scorecard) are developed/selected based on three criteria:

  1. Strategic versus operational – All measures are categorized as either strategic, if they evaluate progress of a desired change towards the vision of the agency/program, or operational, if they track a static measure for decision-making or reporting purposes.     
  2. BSC Framework – All measures must also fit within the four BSC perspectives (customer, business process, employee learning & growth, financial) and under one of the 12 objectives, listed above. 
  3. Mandatory cascade – Some agency-level measures are required to be on the program-level scorecards, providing the opportunity to drill down and see where bright/trouble spots originate. Examples of mandatory cascaded measures, indicated by a (perspective)(objective)(measure) prefix, include C1a: Customer Satisfaction Rating; C2a: Program Activity Count; B1b: Agency Service Encounters; E2b: Employee Retention Rate; and F3a: Budget Status. Measures that do not have a prefix are not cascaded to the agency level; they are tracked for program decision-making purposes (see the drill-down sketch after this list).
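The mandatory cascade is what makes drill-down possible: the agency value for a cascaded measure can be decomposed into the program values beneath it. A minimal, hypothetical Python sketch of that lookup follows; the program names, values, and targets are invented for illustration and are not CDPH data.

```python
# Hypothetical sketch of drilling down a cascaded measure (e.g., C1a) from
# the agency level to the program scorecards that feed it. All data invented.

agency_targets = {"C1a": 3.55, "E2b": 0.90}          # cascaded measures

program_scorecards = {
    "Immunizations":        {"C1a": 3.70, "E2b": 0.95},
    "Adolescent Health":    {"C1a": 3.40, "E2b": 0.92},
    "Environmental Health": {"C1a": 3.60, "E2b": 0.85},
}

def drill_down(measure_code: str) -> dict[str, str]:
    """Flag each program as on/below the agency target for one measure."""
    target = agency_targets[measure_code]
    results = {}
    for program, measures in program_scorecards.items():
        value = measures.get(measure_code)
        if value is not None:                         # program carries this measure
            status = "below target" if value < target else "on target"
            results[program] = f"{value} ({status})"
    return results

print(drill_down("C1a"))
# {'Immunizations': '3.7 (on target)', 'Adolescent Health': '3.4 (below target)', ...}
```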


Example: Immunizations Program Scorecard (Abbreviated with “Imm” prefix)

Customer (C) Perspective

Objective Imm: C1. Provide High Quality Immunization Services to our Customers             

  • Measure Imm: C1a. Customer Satisfaction Rating
  • Measure Imm: C1b. Timeliness of Service Delivery

Objective Imm: C2. Promote Health and Prevent Vaccine-Preventable Diseases to Achieve Healthy Outcomes      

  • Measure Imm: C2a. # of Immunizations Provided
  • Measure Imm: C2b. % of 19-35-Month-Old Children Seen by CDPH Who Are Up to Date on Their Vaccines
  • Measure Imm: C2c. % of Children Fully Immunized in Cobb & Douglas Counties
  • Measure Imm: # of Vaccine-Preventable Cases in Cobb & Douglas Counties


Business Process (B) Perspective

Objective Imm: B1. Improve Operational Effectiveness and Efficiency of Immunization Registration and Check-In

  • Measure Imm: B1b. # of Clients Served
  • Measure Imm: B1c. On Track with Process Improvement

Objective Imm: B3. Work with Community Partners to Ensure High Immunization Rates  

  • Measure Imm: % of Daycare Center Immunization Audits in Compliance with Georgia Law
  • Measure Imm: % of School Audits in Compliance with Georgia Immunization Laws


Employee Learning and Growth (E) Perspective

Objective Imm: E1. Improve the Use of Technology to Streamline and Enhance Immunization Services     

  • Measure Imm: % of Immunization Patients Who Agreed or Strongly Agreed that the Wait Time Was Acceptable

Objective Imm: E2. Attract, Develop, Retain High Quality Performers       

  • Measure Imm: E2a. % of Employees Accomplishing Individual Scorecard Goals
  • Measure Imm: E2b. Employee Retention Rate
  • Measure Imm: E2c. Employees Meeting Development Goals

Objective Imm: E3. Foster an Environment Where Employees Feel Valued and We Celebrate Their Successes       

  • Measure Imm: E3a. Employee Satisfaction Rating (for immunizations program staff)
  • Measure Imm: E3b. Employee Values and Success Rating (for immunizations program staff)


Financial (F) Perspective

Objective Imm: F2. Increase Billing and Collections for Immunizations     

  • Measure Imm: F2c. % Billing and Collection Rate

Objective Imm: F3. Excel in Stewardship and Financial Accountability       

  • Measure Imm: F3a. Program Budget Status

While this practice is primarily internal to CDPH, its reach extends to impact community health through external partners, particularly collaboration within both counties' CHIP coalitions. This collaboration is described below under each goal and objective, with emphasis on implementation details.


Goal 1: Identify a cloud-based PM software solution by mid-2016.

  • Objective 1.1: Conduct a needs assessment of PM stakeholders to understand desired features in PM tool in February 2016.
    • OQM created the needs assessment in January 2016.
    • Stakeholders being assessed included 12 CDPH Leadership members and 24 Program Managers.
    • Needs identified include: low total and annual cost, cloud hosting, automated archiving (keeping a record of previous data/multiple data points for trend use), ability to create reports, support for data drill-downs, interoperability with other IT systems, user friendliness, ability to link PHAB documents/the reaccreditation process, level of IT support, ability to send automated reporting reminders, data exporting capabilities, ability to store documents, and security-based access to control edit/view rights
  • Objective 1.2: Research desired PM tools currently available to adapt to CDPH's needs identified in objective 1.1 in March 2016.
    • Various formal and informal methods of investigation were used to compile a list of PM tools that other health departments were using, along with generic PM tools that could be adapted for public health use. Formally, researching various public health websites, such as the Public Health Quality Improvement Exchange's Community Forum, helped identify a few PM tools. Informally, networking at national conferences, such as the Quality Improvement Open Forum and the Public Health Improvement Training, helped with research. CDPH quickly discovered that most health departments doing PM used Microsoft Excel spreadsheets. Only a few health departments used more advanced PM tools, and among these, no single standard tool was used. Health departments appeared to pick PM tools based on their specific needs, further supporting the importance of the initial needs assessment conducted.
    • The OQM narrowed the research down to the following PM tools: Microsoft Excel (the current method), CorVu, DEHC Dashboard, SharePoint, AchieveIT, Klipfolio, Active Strategy, VMSG, and InsightVision.
  • Objective 1.3: Issue a request for proposal (RFP) to the top two potential PM vendors in April 2016, review proposals in May 2016, and sign a three-year contract in June 2016.
    • The top two vendors were sent an RFP, which was adapted from the Jefferson County Health Department in Alabama. The top candidate, InsightVision, was selected based on its expertise with the BSC framework, which was already a big part of CDPH's culture.


Goal 2: Implement PM tool to align agency plans by 2018.

  • Objective 2.1:  Develop 30 program-level scorecards to align with agency-level scorecard by mid-2017, pilot for one year, and finalize in mid-2018.
    • First, in August 2016, a multi-day BSC and PM refresher training was conducted for 24 program managers and 12 Leadership Team members on the history and meaning behind the two concepts. This truly helped set the stage for how PM related to the agency's mission and vision, and to each CDPH program's and employee's role.
    • Second, from September 2016 to February 2017, one-on-one coaching sessions were held between CDPH's 24 program managers and InsightVision consultants to develop 30 program scorecard drafts. Some program managers lead multiple programs.
    • Third, in March 2017, program managers presented their scorecards to the Health Director and their Leadership Team member for approval. Any recommendations for improvement were incorporated before the July 2017 implementation launch date.
    • Fourth, from July 2017 to June 2018, scorecard implementation used the Plan-Do-Study-Act (PDSA) cycle framework to ensure continuous quality improvement for successful implementation. A final evaluation of users (qualitative) and data collected (quantitative) was utilized to ensure responsiveness to user needs.
  • Objective 2.2: Integrate QI projects, workforce development goals, stakeholder feedback, and community partnership goals into each program scorecard.
    • In March 2017, the InsightVision CEO facilitated CDPH's Leadership Team retreat to ensure the PM tool was meeting CDPH's needs and PHAB requirements, and to provide training on strategy management to sustain the PM infrastructure that had been built.
    • By July 2017, most of the elements listed above were incorporated into all scorecards to begin trial implementation. This effort included all 12 Leadership Team members and 24 program managers.
  • Objective 2.3: Create CHIP scorecards for two county coalitions and link to program scorecards, if applicable.
    • By December 2018, the CHIP coalition scorecards were transitioned from Microsoft Excel documents into InsightVision. By this time, the program scorecards were also developed, which allowed linkages to be made between measures from program scorecards and CHIP scorecards. This effort included the CHIP Director, coalition members, program managers, and Leadership Team members.
  • Objective 2.4: Integrate strategic plan initiatives into agency-level scorecard.
    • By July 2018, CDPH had finalized the next iteration of the strategic plan, and the strategic initiatives, with their goals and objectives, were incorporated into InsightVision for tracking. This was led by the OQM, strategic initiative owners (Leadership Team members), and strategic initiative multi-disciplinary teams with staff from various levels.
  • Objective 2.5: Integrate population health outcome measures into each program scorecard.
    • By December 2018, all program scorecards had identified a population health outcome measure that most closely related to program activities. This effort was led by the OQM, Epidemiology, and CHIP Director, working with all 24 program managers and 12 Leadership Team members.


Goal 3: Evaluate performance of programs with PM tool by 2019.

  • Objective 3.1: Transfer program evaluations from 2009 to 2016 into new PM Tool.
    • CDPH's 30 programs undergo an annual program evaluation that began in 2009. To incorporate historical knowledge and support information sharing, these old evaluation documents were uploaded into InsightVision by the OQM and InsightVision consultants.
  • Objective 3.2: Transition from paper-based program evaluation to electronic in new PM Tool.
    • The OQM, InsightVision consultants, and 24 program managers helped transition the Microsoft Word document version of the program evaluation to an electronic format in InsightVision during the 2017 and 2018 evaluations.
  • Objective 3.3: Integrate strategic planning concepts into each program evaluation, including program mission, vision, SWOT analysis, analysis of scorecard measures, QI project, goals for next three years, budget proposal for upcoming fiscal year, organizational chart, and program in action story.
    • During the 2019 evaluations, the program evaluation format was changed to focus on programmatic strategy. The heavy lifting was done by the OQM, the 24 program managers, and their front-line staff.


To foster ongoing collaboration, the OQM hosts an optional quarterly PM Coffee Hour to support programs and staff in updating the PM system, reviewing data against targets, and/or reporting; meetings are held in conjunction with data deadlines. In addition, an annual Leadership Team Retreat is held in March that focuses on discussing low-performing measures, reviewing performance patterns/trends/barriers/successes from the previous year(s), and proposing targets for the next fiscal year starting in July.


An example of how these collaborative efforts work across the agency can be seen in the agency-level measure for Customer Satisfaction Results. In a single screenshot view, the program-level data is listed below the agency-level summary so that programs not meeting targets can be easily identified. This supports collaboration across the health department, since multiple areas are involved in the customer satisfaction process; for example, IT ensures that the text survey is being sent out, the OQM ensures survey data is downloaded from Qualtrics, and program managers share survey results with the staff directly related to the program/process being evaluated so that performance can reach and maintain desired targets.
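As an illustration of that roll-up, here is a minimal, hypothetical Python sketch that averages per-response survey scores into program-level C1a values and an agency-level summary. The file name and column names ("program", "c1a_score") are invented; this practice does not describe CDPH's actual Qualtrics export format.

```python
import csv
from collections import defaultdict

# Hypothetical roll-up: per-response survey scores -> program C1a means ->
# agency-level C1a mean, with a flag for programs not meeting the target,
# mirroring the one-screenshot drill-down view described above.

def satisfaction_rollup(path: str, target: float = 3.55):
    by_program = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            by_program[row["program"]].append(float(row["c1a_score"]))
    program_means = {p: sum(v) / len(v) for p, v in by_program.items()}
    agency_mean = sum(program_means.values()) / len(program_means)
    below_target = [p for p, m in program_means.items() if m < target]
    return agency_mean, program_means, below_target
```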


Costs associated with this project include:

Year 1: $91,454 (Consulting) + $20,192 (Licensing) + $160,000 (in-kind, 38 employees' staff time) = $271,646

Year 2: $19,550 (Consulting) + $16,387.50 (Licensing) + $160,000 (in-kind, 38 employees' staff time) = $195,937.50

Year 3: $0 (Consulting) + $5,637.50 (Licensing) + $80,000 (in-kind, 38 employees' staff time) = $85,637.50


Total 3-Year Cost: $111,004 (Consulting) + $42,217 (Licensing) + $400,000 (in-kind, 38 employees' staff time) = $553,221


Ongoing Cost: $5,637.50 (Licensing for 36 licenses) + $80,000 (in-kind, 38 employees' staff time) = $85,637.50
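For readers checking the math, a quick Python sketch verifying the totals above (the dollar amounts are copied directly from the lines they summarize):

```python
# Verify the year-by-year and three-year cost totals reported above.
years = [
    {"consulting": 91_454.00, "licensing": 20_192.00, "in_kind": 160_000.00},
    {"consulting": 19_550.00, "licensing": 16_387.50, "in_kind": 160_000.00},
    {"consulting":      0.00, "licensing":  5_637.50, "in_kind":  80_000.00},
]
for i, y in enumerate(years, start=1):
    print(f"Year {i}: ${sum(y.values()):,.2f}")      # 271,646.00 / 195,937.50 / 85,637.50
totals = {k: sum(y[k] for y in years) for k in years[0]}
print({k: f"${v:,.2f}" for k, v in totals.items()})  # consulting 111,004; licensing 42,217; in-kind 400,000
print(f"3-year total: ${sum(totals.values()):,.2f}") # $553,221.00
```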


While CDPH invested substantial funds to build its integrated system and does not endorse a specific software tool, CDPH has been fortunate to share the system/scorecards, approach, and lessons learned informally with other health departments seeking to further develop and align their PM systems and agency plans. The infrastructure and scorecards CDPH created can be replicated by other health departments without having to recreate them from scratch as CDPH did.

Throughout the trial implementation period from July 2017 to June 2018, several PDSA cycles were conducted to evaluate the effectiveness and efficiency of the PM process. Some of the process evaluation findings included:

  • Ongoing training for new program managers was needed to ensure sustainable use of the PM tool. To address this, InsightVision training was added to the onboarding requirements for all new supervisors and program managers in the Workforce Development Plan. Additionally, a QI/PM training was added to the QI training series (in the QI Plan) to address how QI and PM are related. This training is required for all supervisors and program managers and optional for all other staff.
  • Some agency-level measures could not be cascaded to program-level scorecards because of varying program requirements. Therefore, the OQM worked with the Leadership Team and program managers to accommodate this customization while still meeting the goals of the project. For example, all programs have different customers. Programs with individual customers were required to utilize the agency Qualtrics survey but could administer it in any way they felt adequate (e.g., text, paper-based, on a tablet). Programs that mainly work with partners were required to have a different partner satisfaction measure that communicated partner survey results. This survey could be anything the program wanted to use, but programs were encouraged to ask questions similar to the Qualtrics survey for comparability. Additionally, if programs were measuring activities related to agency-level measures, these program measures were cascaded and linked to the agency-level scorecard even though they were not required for all programs. This highlights the strategic efforts certain programs make to further CDPH's mission and strategic plan beyond their mandated requirements.
  • The process of program evaluations needed to change in three ways to adapt to the new PM tool for sustainability and strategic impact. First, the timeframe for program evaluations was shifted from the fall (October/November) to the winter (January/February) to allow evaluation data to be communicated to leadership before the budget process for the next fiscal year begins in March. Second, the entire chain of command for a program was invited to attend the program evaluation meeting so that all voices could be heard in the discussion of decisions made based on the data from the PM tool. Third, the content of the program evaluation was simplified to be more strategic and less operational, allowing program managers more time to think about ways to lead their programs rather than day-to-day operations.


Since the full launch of this latest PM refresh in July 2018, following multiple PDSA rounds, outcomes have been evaluated as follows:

  • This practice has shifted PM ownership from being one person's job to being everyone's job. The OQM now facilitates PM rather than doing PM for the entire agency.
  • Program managers have shifted their thinking from operational to strategic, from just measuring everything that is easy to measure and creating useless reports, to channeling energy towards measuring what is most meaningful for the program/agency and creating reports that tell a story about the change CDPH is striving to achieve.
  • Program managers and staff are more empowered to make changes because the BSC emphasis on the four perspectives (customer, business process, employee learning & growth, financial) guides them to think about their customers, how to make process improvements, how to build/strengthen partnerships in the community, and how to generate more funds for their program.
  • PM data was retrospectively added for at least 2016 through 2019, allowing for three years of data trends. The impact of this effort was tremendous because it improved data accuracy and availability. Each measure incorporated a description of what was being measured, the data source, target value, measure owner, related documents that could provide more information on data values, and comments about each data point for further explanation. Now, when there is turnover, the new PM user understands the reasoning behind the measure and knows exactly how to pull the data value without recreating a new measure.
  • This practice allows more information to be shared across the agency. Specifically, staff can use the PM tool to pull data for their needs rather than asking the program manager for this information.
    • For example, when applying for grants, the Development Office is able to look in InsightVision to pull data about program measures (volume, health outcomes, etc.) and strategies (program evaluation content such as mission, SWOT, goals, and organizational charts).
    • A second example is the Center for Administration being able to pull activity counts to demonstrate how budgets are being spent and to submit this data for the Government Finance Officers Association Award in 2018 and 2019.
  • Program scorecards were assessed during 2019 program evaluations and any requests for modification of measures were finalized by July 1, 2019 for the next fiscal year.
  • Agency-level measure targets continue to be revised during annual Leadership Team Retreats. In March 2019, measure owners proposed new targets for each measure for fiscal year 2020 based on trends from the previous year. These new targets began July 1, 2019.
    • For example, the target for C1a: Customer Satisfaction Rating was raised from 3.47 in fiscal year 2019 to 3.55 in fiscal year 2020 based on a consistently higher data-value trend.
    • One example of a data-driven decision involved the measure B1b: # of Patient Encounters for the East Cobb clinic location. Patient volume had been decreasing significantly over the previous three years. This information, paired with CHA data concluding that this geographic area of Cobb County had a lower need for public health services, led CDPH's Leadership Team to close this clinic location and shift efforts towards the South Cobb region, where health inequities exist.
  • This PM practice was listed as a strength in the PHAB annual report, whereas it had been a weakness in the 2014/2015 initial accreditation site visit report and action plan.


Goals and objectives listed below were all accomplished from 2016 to 2019.

Goal 1: Identify a cloud-based PM software solution by mid-2016.

  • Objective 1.1: Conduct a needs assessment of PM stakeholders to understand desired features in PM tool.
  • Objective 1.2: Research desired PM tools currently available to adapt to CDPH's needs identified in objective 1.1.
  • Objective 1.3: Issue a request for proposal (RFP) to the top two potential PM vendors and sign a three-year contract.


Goal 2: Implement PM tool to align agency plans by 2018.

  • Objective 2.1:  Develop 30 program-level scorecards to align with agency-level scorecard by mid-2017, pilot for one year, and finalize in mid-2018.
  • Objective 2.2: Integrate QI projects, workforce development goals, stakeholder feedback, and community partnership goals into each program scorecard.
  • Objective 2.3: Create CHIP scorecards for two county coalitions and link to program scorecards, if applicable.
  • Objective 2.4: Integrate strategic plan initiatives into agency-level scorecard.
  • Objective 2.5: Integrate population health outcome measures into each program scorecard.


Goal 3: Evaluate performance of programs with PM tool by 2019.

  • Objective 3.1: Transfer program evaluations from 2009 to 2016 into new PM Tool.
  • Objective 3.2: Transition from paper-based program evaluation to electronic in new PM Tool.
  • Objective 3.3: Integrate strategic planning concepts into each program evaluation, including program mission, vision, SWOT analysis, analysis of scorecard measures, QI project, goals for next three years, budget proposal for upcoming fiscal year, organizational chart, and program in action story.

Sustainability of this practice was the most important need identified at the beginning of this project and has been at the forefront of all project decisions. Some key sustainability factors are listed below.

  1. The first sustainability tactic was to shift ownership of PM from the Office of Quality Management to all staff. To help support staff ownership of the PM tool, the agency holds 36 licenses for InsightVision so that all 12 Leadership Team members and all 24 Program Managers have their own logins. A few Program Managers manage multiple program scorecards because there are 30 total programs. Each Program Manager is responsible for updating the program scorecard measures quarterly/annually and sharing this with their staff at least as frequently. The performance measures are cascaded from the agency level to the program level (30 programs), and then to the individual/employee level (350 employees), which is how CDPH has aligned its activities with achieving its mission and vision. This cascaded approach also allows CDPH to drill down to identify targeted areas for improvement at specific levels of the organization. To support effective assignment of responsibilities, each Leadership Team member is assigned as the owner of an agency-level measure. Measure owners track their measures quarterly or annually, depending on the measure. Specifically, measure owners ensure that programs entering data and working towards meeting targets have updated their information. Each measure owner presents their data to the Leadership Team on a quarterly/annual basis to support data-driven decision-making. This team approach to ownership helps spread the workload beyond the Office of Quality Management, improves communication across the agency, and reduces silos between programs.
  2. Secondly, a plan for managing this practice has been widely communicated and ingrained in the CDPH culture. Implementation of the PM system is done monthly, quarterly, semi-annually, or annually based on measure timeframes. All data is due 30 days after the time period ends and is uploaded by program managers for program scorecards and by the Leadership Team for the agency scorecard (see the cadence sketch after this list). On a quarterly basis, Leadership Team members comprehensively review the agency-level scorecard, focusing on the cascaded measures. Also quarterly, program-level scorecards are reviewed by Program Managers and their staff. Fourth-quarter reporting also includes annual data. During these quarterly multi-level discussions, progress and trends are analyzed, plans are revised if necessary, and resources are reallocated as needed.
  3. Third and most importantly, in addition to quarterly and annual measure reporting, each program presents a thorough program evaluation to its chain of command, including the Health Director and Chief Financial Officer. This program evaluation truly ties all the pieces of PM, strategy, and all agency plans together. It consists of a summary of each program's scorecard measures from the previous year, a SWOT analysis, mission, vision, goals for the next year based on results from the SWOT, a logic model, and a QI project. The purpose of this program evaluation is to ensure all levels of staff related to each program are aware of how program and performance information is being utilized for decision-making and planning for the upcoming fiscal year. This also helps ensure leadership is aware of program needs for consideration in the budget approval process for the next fiscal year.
  4. Fourth, the Leadership Team reviews the agency-level scorecard and conducts an agency evaluation at the annual Leadership Team Retreat in March, which is strategically scheduled to occur after program evaluations are complete and prior to starting the budget approval process. This allows transparent discussions surrounding all aspects of the agency and assists in planning for the upcoming year(s). After the Leadership Retreat, decisions are shared with staff through multiple channels, including a strategic plan progress report, quarterly Board of Health reports, the agency newsletter, and supervisors'/staff meetings.
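As a small illustration of the reporting cadence in item 2, here is a hypothetical Python sketch that computes the next quarterly data deadline (period end plus 30 days) under the July–June fiscal year described above; the function and constant names are invented, not part of CDPH's system.

```python
from datetime import date, timedelta

# Quarter ends under a July-June fiscal year: Sep 30, Dec 31, Mar 31, Jun 30.
# Per item 2 above, data is due 30 days after the reporting period ends.
QUARTER_ENDS = [(9, 30), (12, 31), (3, 31), (6, 30)]  # (month, day)

def next_data_deadline(today: date) -> date:
    """Return the next quarterly data due date on or after `today`."""
    deadlines = [
        date(year, month, day) + timedelta(days=30)
        for year in (today.year - 1, today.year, today.year + 1)
        for month, day in QUARTER_ENDS
    ]
    return min(d for d in deadlines if d >= today)

print(next_data_deadline(date(2019, 10, 15)))  # 2019-10-30 (Q1 ended Sep 30)
```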