Posted By Kelly Gerard,
Friday, January 29, 2016
Updated: Friday, January 29, 2016
Denver Public Health, the host-site agency for my fellowship, is driven by an internal desire to improve its processes and outcomes. This desire is realized through Continuous Quality Improvement (QI), an ongoing effort to strengthen an agency’s approach to managing performance, motivating improvement, and capturing lessons learned in areas that may or may not be measured as part of the public health department accreditation process. Denver Public Health’s approach to QI is guided by Lean, a systematic approach based on the Toyota Production System’s principles and tools, with a defined improvement process to identify and eliminate waste such as inefficiency, error, and redundancy.1 Through a Lean-driven QI process, Denver Public Health improves the efficiency, effectiveness, quality, or performance of services, processes, capacities, and outcomes.2
Awarding QI Efforts
Denver Public Health established a Quality Committee to oversee efforts related to QI projects, staff QI training, customer satisfaction, and related communications. It is a multi-disciplinary committee with representation from all divisions.3 The Quality Committee is also responsible for selecting recipients for the Oppy Award. The Oppy Award provides recognition to teams and programs for exceptional QI projects. Projects are voted on by the Quality Committee each month, and Oppy travels to each winner to be proudly displayed. Winners are encouraged to decorate and add “flair” to Oppy. A picture of Oppy with the award winners is taken and shared on the agency’s intranet site and on bulletin boards throughout the department. The mascot represents the core values of continuous quality improvement and encourages and promotes a QI culture in public health.
Example QI Projects that have won the Oppy Award:
Screening, Brief Intervention, Referral to Treatment (SBIRT) process improvement in the STD Clinic to identify, reduce, and prevent problematic use of, abuse of, and dependence on alcohol and illicit drugs 4
Text message reminders for travel patients in the Immunization and Travel Clinic
Personnel Grant Management (PGM) system in the Public Health Administration team
Travel authorizations process improvement by a cross-departmental team to simplify and standardize the travel authorization process
Vaccine preventable disease response by Epidemiology and Preparedness and Immunization and Travel Clinic teams
Email best practices by a cross-departmental team to reduce the burden of email and improve the consistency of emails originating from our department
Optimizing the Grant Tracking Database by the Public Health Administration team and Kelly Gerard
Call to Action
QI in public health is a continuous and ongoing effort to focus on improvement activities that respond to community needs and improve population health. 5 Public health departments wanting to achieve measurable improvements should consider adopting Lean principles and tools and promoting a QI culture. Successful implementation of QI requires a commitment throughout all levels of the organization. An effective way to engage and motivate employees is through internal awards, such as the Oppy Award, to acknowledge QI efforts throughout the department.
1. Lean Systems Improvement, Lean at Denver Health: Saving Lives, Saving Money, Saving Jobs, Denver Health, 2012.
2. Public Health Foundation and the National Public Health Performance Standards Program, Acronyms, Glossary, and Reference Terms, CDC, 2007.
3. Denver Public Health, Performance Improvement Plan, 2015.
4. SAMHSA-HRSA Center for Integrated Health Solutions, accessed from web: http://www.integration.samhsa.gov/resource/sbirt-resource-page
5. Riley, Moran, Corso, Beitsch, Bialek, and Cofsky, Defining Quality Improvement in Public Health, Journal of Public Health Management and Practice, January/February 2010.
Kelly Gerard, MSHI, RHIA is an Applied Public Health Informatics Fellow at Denver Public Health, Denver Health in Denver, Colorado. APHIF applications are due Monday. To apply for APHIF, HSIP, and I-TIPP fellowships, please visit the Project SHINE website.
Posted By Lauren Thie,
Friday, January 22, 2016
Updated: Tuesday, January 12, 2016
CSTE provides international consultancies for its members to support CDC programs. You may have a colleague who went to West Africa to help support Ebola efforts through CSTE. When I joined CSTE in 2011 as a new North Carolina health department employee, I had recently returned home from some international work. I let CSTE know I would be interested in international consultancy, and what my skills were. Through the CSTE international consultancy program, I was able to assist in an influenza epidemiological surveillance review in Bamako, Mali in September 2015.
I received training for the influenza surveillance review in advance of my departure. CSTE colleagues experienced in surveillance reviews offered their expertise and documents from their previous reviews in West Africa. CDC-CSTE calls were held to offer me training for the consultancy. CDC shared background documents on Mali’s influenza work. North Carolina public health has a strong history of international consultancy and influenza work, and colleagues shared their wealth of experience. I left for Bamako in early September feeling prepared.
Mali is located in West Africa and shares borders with seven countries: it lies south of Algeria, west of Niger, north of Burkina Faso and Cote d’Ivoire, northeast of Guinea, and east of Mauritania and Senegal. Mali is twice the size of Texas and is home to 15 million people. Bamako, the capital city, is located in western Mali; it has a population of two million and is the sixth fastest growing city in the world.
The CDC Influenza program and the Center for Vaccine Development Mali planned the surveillance review itinerary and most of the logistics. During my week in Bamako, I used the CDC surveillance tool to document Mali’s influenza surveillance program in laboratory, Influenza-Like Illness (ILI), and Severe Acute Respiratory Illness (SARI) surveillance sites, and nationwide epidemiological surveillance. I recorded what Mali’s influenza epidemiologists, clinicians, and laboratory scientists shared with me about their budding program. With the help of CDC colleagues in Atlanta and Accra, I reported to CDC on the overall system, SARI, ILI, laboratory, data, a SARI sentinel site visit (Gabriel Toure Hospital Pediatric Department, Bamako), and an ILI site visit (Commune I, Bamako).
My international consultancy work with CSTE and CDC on influenza surveillance was inspiring. Professionally, I was impressed by my public health colleagues in Bamako. I have done several public health projects in lower resource settings and this was by far the most impressive I have seen. With time, I believe the Mali influenza program will be an example in West Africa. I am grateful to CSTE, CDC, and the Center for Vaccine Development Mali for an outstanding epidemiology experience during my September 2015 visit.
Touring the Center for Vaccine Development Mali’s (CVD-Mali) laboratory, which performs influenza testing. Left to right: Dr. Boubou Tamboura (CVD-Mali laboratory director, Bamako, Mali), Dr. Talla Nzussouo (CDC epidemiology and laboratory regional advisor based in Accra, Ghana), me (Lauren Thie, NC Division of Public Health, CSTE member), Thelma Williams (CDC project officer, based in Atlanta, USA).
Lauren Thie, MSPH is an Environmental Program Consultant in Occupational and Environmental Epidemiology at the North Carolina Division of Public Health. For more information on international consultancies, please contact CSTE. CSTE is seeking epidemiologists for rapid Ebola deployment in West Africa, including Portuguese and French speakers.
Posted By Rachel Linz,
Friday, January 15, 2016
Updated: Tuesday, January 12, 2016
The Reproductive Health Program in Oregon is a little different than in most other states. Not only do we administer a Title X grant (Title X of the Public Health Services Act, signed into law by President Richard Nixon in 1970, is the only federal funding dedicated solely to family planning services), but we also administer a Section 1115 family planning demonstration waiver through the Centers for Medicare and Medicaid Services (CMS), despite being within our state’s Public Health Division rather than our state’s Medicaid office. Our waiver is called Oregon ContraceptiveCare, or CCare, and covers family planning and contraceptive management services for individuals who are U.S. citizens or lawful permanent residents with household incomes up to 250% of the federal poverty level and who are not enrolled in the state’s Medicaid program. The RH Program’s provider network includes all local public health departments in the state as well as Planned Parenthood health centers, university health centers, community health centers and School-Based Health Centers, totaling 150 clinics statewide. Through our entire provider network, we serve over 80,000 clients annually.
One area of focus for the Oregon RH Program has been to increase access to long-acting reversible contraceptives, or LARC methods. These methods, which include contraceptive implants and intrauterine devices, are effective for 3 to 10 years depending on type and have failure rates similar to sterilization methods (see Figure 1). In fact, LARC methods are about 20 times more effective at preventing pregnancy than birth control pills! We provide technical assistance and training for clinicians and billing staff regarding insertion and removal of LARC devices, billing, reimbursement and maintenance of device stock on site, and best practices regarding client counseling techniques to increase client success with their methods, regardless of which method a client chooses.
As all readers of CSTE Features no doubt know, the United States transitioned to the 10th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD-10) on October 1, 2015. For the Oregon RH Program, we are fortunate that our clinical data collection is narrowly focused and we only require diagnosis codes for visits under CCare, not for Title X (which covers a broader scope of services than CCare). Because of CMS requirements, CCare visits must include a primary diagnosis code indicating that contraceptive management was the primary purpose of visit (V25 codes under ICD-9, Z30 codes under ICD-10). To assist our provider network in managing the transition, we created a crosswalk that includes the ICD-9 codes for each contraceptive method alongside the appropriate ICD-10 code, as well as the Healthcare Common Procedure Coding System (HCPCS) supply codes associated with each method type (see Figure 2).
The biggest challenge with coding for the Oregon RH Program under ICD-9 has continued under ICD-10: several contraceptive methods do not have their own unique codes. We’ve all heard about new ICD-10 codes created to document very specific types of injuries in specific locations, but what has not been in the news is that the most effective LARC method, the hormonal implant (<0.5% failure rate), actually lost the unique codes it had under ICD-9! Of the 18 different contraceptive methods available in the U.S., only four have their own specific diagnosis codes: intrauterine devices, oral contraceptives, injectable contraceptives, and natural family planning. Both female and male sterilization methods use the same diagnosis codes.
Our solution, which aligns with recommendations from national family planning and coding experts, is as follows: for hormonal methods that do not have their own specific codes (the contraceptive implant, patch, and ring), we use the codes for “unspecified” contraceptives (Z30.019 for initial encounters, Z30.40 for follow-up or surveillance encounters). For less effective methods that do not have their own specific codes (cervical cap, diaphragm, sponge, female and male condoms, and spermicide), we recommend using the codes for “other” contraceptives (Z30.018 and Z30.49). This way, although we cannot determine the specific contraceptive method from diagnosis codes alone, we can determine the approximate level of effectiveness. The bottom line is that other information, such as HCPCS codes and National Drug Code (NDC) numbers, is required to determine exactly which contraceptive methods are dispensed. Additional ICD-10 codes may become available in the future, but for now, tracking ongoing use of certain long-acting methods remains a challenge.
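For programs that automate claims checks, the recommendation above reduces to a simple lookup table. The sketch below is illustrative only: it covers just the codes named in this post, and the method groupings are assumptions for the example, not an official crosswalk.

```python
# Illustrative ICD-10 lookup based on the coding recommendations described
# above. Covers only the codes named in this post; a real crosswalk would
# also carry method-specific codes, HCPCS supply codes, and NDC numbers.
ICD10_CROSSWALK = {
    # Hormonal methods without unique codes -> "unspecified" contraceptive codes
    "implant": {"initial": "Z30.019", "surveillance": "Z30.40"},
    "patch":   {"initial": "Z30.019", "surveillance": "Z30.40"},
    "ring":    {"initial": "Z30.019", "surveillance": "Z30.40"},
    # Less effective methods without unique codes -> "other" contraceptive codes
    "cervical cap": {"initial": "Z30.018", "surveillance": "Z30.49"},
    "diaphragm":    {"initial": "Z30.018", "surveillance": "Z30.49"},
    "sponge":       {"initial": "Z30.018", "surveillance": "Z30.49"},
    "condom":       {"initial": "Z30.018", "surveillance": "Z30.49"},
    "spermicide":   {"initial": "Z30.018", "surveillance": "Z30.49"},
}

def lookup_code(method: str, encounter: str) -> str:
    """Return the recommended ICD-10 diagnosis code for a method/encounter pair."""
    return ICD10_CROSSWALK[method][encounter]

print(lookup_code("implant", "initial"))        # Z30.019
print(lookup_code("diaphragm", "surveillance"))  # Z30.49
```

Because the same code pairs are shared across several methods, the lookup illustrates the post's point directly: the diagnosis code alone recovers the effectiveness tier, not the specific method.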
Figure 1. Contraceptive method effectiveness. Most effective methods include the contraceptive implant, intrauterine devices, and sterilization methods. Moderately effective methods include injectables, pills, patches, rings and diaphragms.
Figure 2. A portion of the Oregon RH Program’s ICD-9/ICD-10 crosswalk. Under ICD-10, the hormonal implant lost its unique diagnosis codes while injectable contraceptives gained unique codes. Other methods such as the diaphragm have never had their own unique codes.
Rachel Linz, MPH is an Informatics Training in Place Program (I-TIPP) fellow and senior research analyst with the Reproductive Health Program at the Oregon Health Authority. To learn more about ICD-9 and ICD-10, join subcommittees in the Surveillance/Informatics Steering Committee.
Posted By Emily J. Holubowich ,
Monday, January 4, 2016
Updated: Monday, January 4, 2016
Emily Holubowich, Senior Vice President at CRD Associates, is CSTE’s Washington representative and leads our advocacy efforts in the nation’s capital.
After a couple of fiscal “close calls” this fall—shutdown threats, last-minute budget negotiations, and a couple of stopgap spending measures to keep the government running—Congress ultimately passed and the President quickly signed the Consolidated Appropriations Act, 2016 before heading home for the holidays. This trillion-dollar spending measure provided appropriations for all “discretionary” government functions, including those administered by the Department of Health and Human Services.
In the end, public health fared well, all things considered. The Centers for Disease Control and Prevention (CDC) received nearly $7.2 billion in the “omnibus” spending bill for fiscal year (FY) 2016. That’s a $277.7 million (four percent) increase over FY 2015 levels. This funding includes nearly $6.3 billion in discretionary budget authority, as well as more than $892 million in mandatory Prevention and Public Health Fund (PPHF) dollars and $15 million from the Public Health and Social Services Emergency Fund.
The overall increase in funding should translate into good news for state and territorial epidemiologists. The National Center for Emerging and Zoonotic Infectious Diseases (NCEZID) received nearly $580 million, including $52 million from PPHF. This funding level represents a $175 million (43 percent) increase over FY 2015. Within NCEZID, the antibiotic resistance (AR) initiative received $160 million in new funding, less than the President’s requested $264 million for CDC. Based on the President’s budget request submitted to Congress in early 2015, we would expect much of the NCEZID funding, including AR, to support core infectious disease surveillance capacity at state and local health departments through Epidemiology and Laboratory Capacity (ELC) grants. This funding would be in addition to $40 million from the mandatory PPHF provided to ELC grants for the fifth consecutive year in the omnibus. The final spending measure requires CDC to submit to Congress a detailed spend plan for AR within 60 days of the legislation’s enactment, so more specific information about ELC funding and the bill’s impact on states and territories will be available soon.
Among our other NCEZID appropriations priorities, food safety received increased funding ($52 million) and advanced molecular detection was flat funded ($30 million).
The Public Health Workforce program, through which the CDC/CSTE Applied Epidemiology Fellowship receives funding, also received flat funding of $52.2 million and no supplemental PPHF funding. The appropriations bills do not specify how much funding is dedicated to the Applied Epidemiology Fellowship program per se, but with flat funding of the program we might expect flat funding for our fellows. The President had requested a $15.2-million increase in budget authority for Public Health Workforce, as well as $36.2 million in PPHF. Three years ago, Congress eliminated $15 million in PPHF dollars for Public Health Workforce in the wake of sequestration.
Some other notable items related to public health:
The National Center for Injury Prevention and Control received $70 million to combat the opioid epidemic, a $50 million increase over FY 2015 levels.
The National Center for Environmental Health’s budget increased by $2.9 million over FY 2015, and most of the cuts proposed in earlier spending legislation were restored, with the exception of the $2.8-million “Built Environment and Health Initiative,” which was eliminated.
Funding for CDC’s tobacco programs sustained a $6.5 million cut compared to FY 2015, a largely restored level after earlier proposed spending bills had cut the programs by $100 million.
With the enactment of FY 2016 spending legislation, legislators will begin work in earnest on FY 2017 spending legislation when they return to Washington in January. CSTE will once again partner with the Association of Public Health Laboratories and other colleagues in the public health community to advocate for our key priorities—strong support for disease monitoring and for training the next generation of epidemiologists. In addition, our executive leaders are travelling to Washington in early February to discuss our funding requests with key decision-makers in Congress and the administration. Until then, we anxiously await the release of the President’s final budget request of the administration, expected to be released the first week in February, to see what the White House has in store for disease surveillance.
For more information about funding levels for your specific priorities, please click here for a copy of the omnibus spending legislation, and click here for a copy of the accompanying report that provides more detailed instructions about public health funding levels and intended purposes.
Posted By D. Rebecca Prevots,
Tuesday, December 22, 2015
Updated: Friday, December 18, 2015
Editor’s note: CSTE Executive Board Secretary-Treasurer and Hawaii State Epidemiologist, Sarah Park, recently met up with D. Rebecca Prevots, Ph.D., Chief of the Epidemiology Unit of the Laboratory of Clinical Infectious Diseases, Division of Intramural Research at the National Institute of Allergy and Infectious Diseases (NIAID) at NIH. Sarah recommended that Rebecca write this blog for CSTE describing the epidemiology research and support at NIH and how she might help CSTE members with applied epi questions not addressed by other federal partners.
I will give an overview of epidemiology and epidemiologic capacity at NIH based on my experience here. Since I spent 12 years at CDC and now 12 years at NIH, I do have some perspective on how epi fits into these two different cultures (the more academic and the more applied public health). NIH is a very big place, with 27 different institutes and centers comprising the National Institutes of Health (note the “s”). Most institutes have extra- and intramural groups, with the extramural groups funding research outside of NIH and the intramural groups conducting research on the NIH campus. Approximately 90% of the NIH budget goes to extramural research, usually in the form of grants to academic institutions, and the remainder is for intramural research.
In addition to the intramural/extramural distinction, each institute or center varies in its mission, structure, and function, so it is difficult to provide one picture of epidemiology across NIH. Most large institutes (NCI, NIAID, NIEHS, NHLBI, and NICHD) have intra- and extramural epidemiology groups. Descriptions of the various epidemiology groups can be found on the NIH webpage under their respective institutes. The epidemiology groups in each institute have expertise related to their specific areas of study. The main mission of the NIH is biomedical research in support of human health, which the epidemiology units complement in a variety of ways that include:
design and analysis of NIH clinical studies,
involvement in design and analysis of field studies, and
analysis of large datasets to look at population patterns
The nature of the ongoing research varies widely, but intramural epi researchers at those institutes typically do original research, usually in populations outside of NIH. The overarching mission of intramural epi groups is to add value to the mission by focusing on rare diseases or high-risk research that otherwise wouldn’t get funded extramurally.
I can speak to what I know best: in my epi group at NIAID, we seek to lead and support research within the Division of Intramural Research. This includes research on rare lung diseases from nontuberculous mycobacteria and fungi, design and analysis of clinical research data, analysis of population-based data (e.g., datasets from the Centers for Medicare and Medicaid Services and the Agency for Healthcare Research and Quality), and using approaches such as spatial analysis to better understand risk factors for some conditions. We also provide epidemiologic expertise for international field studies conducted by NIH, such as the intramural-conducted field studies of malaria in Mali.
In summary, there is epi expertise at NIH, and that expertise varies widely across groups. Certainly if there is a topic of interest (such as NTM), there are often experts here who can help. And certainly I would be interested in fostering ties with CSTE!
Posted By Virginia Dick,
Friday, December 18, 2015
Updated: Friday, December 18, 2015
Evaluation is a crucial activity for many state and local public health agencies. In addition to the evaluation requirements in many federal funding programs, more and more states are recognizing the need and value of conducting thorough process and outcome evaluations of local, state, and regional efforts. All funders, governmental and non-governmental, are placing an increasing emphasis on demonstrating the impact of the program, policy or system change.
It is only through a comprehensive evaluation that jurisdictions can gain a better understanding of the impact of programs or policy changes as well as the key components involved in implementation of those programs and policies. Many evaluators, myself included, advocate that there is a critical need for process evaluation as well as outcome evaluation. A solid process evaluation provides the foundation for understanding why a program, system change, or policy had the anticipated impact (or not) as well as what components were most critical and how replication can occur in other settings.
Unfortunately, evaluation responsibilities often fall to epidemiologists or data managers who have little direct evaluation training and experience. In ideal circumstances, epidemiologists and evaluators work hand in hand during evaluation to provide the most thorough review of the process as well as aligning related data elements and issues. However, due to staffing and funding challenges, many jurisdictions are not able to have both of these positions engaged on the same effort at the same time, or the staff are spread across many efforts. Building evaluation capacity within the current epidemiology workforce is one way to help build the overall evaluation capacity within jurisdictions.
In 2015, CSTE conducted a four-part webinar series to discuss and examine public health evaluation. The series was designed to provide a high-level overview discussion of evaluation and how evaluation broadly should be approached. While the series was developed in conjunction with the Chronic Disease Subcommittee, the presentations were done in a manner to allow generalizability across all areas of public health. All of the webinars are available in the CSTE webinar library and can be viewed at any time. Below is a brief description of each of the webinars as well as the link to the webinar itself if you would like to learn more about any of the topics.
Some “think about it” questions that were generated included: What is the first step in determining your evaluation design? What are some of the primary reasons for conducting evaluations? What is the biggest challenge you face in determining evaluation design, engaging stakeholders, and determining what tools to use?
Lesson 2: Approaches to Evaluation / Evaluation Types (March 19, 2015) - Webinar Slides
This webinar discussed the strengths, weaknesses, and primary focuses of four primary approaches to evaluation. The four approaches that were discussed in depth include Utilization-focused evaluation, Developmental evaluation, Theory Driven, and the Kirkpatrick model. In addition, time was spent discussing the difference between summative and formative evaluations and outcome and process evaluations. While discussing the differences, it is also important to consider the interconnectedness between the different types.
“Think about it” questions to consider from this week included: examining which evaluation approach you are most comfortable with, identifying what types of evaluation are typically conducted within your agency, and whether your approach is consistent with the types of evaluation that you have typically been involved with in your agency.
Lesson 3: Outcome/Process Evaluations (April 16, 2015) - Webinar Slides
This webinar spent significantly more time delving into the distinction between and relationship among formative/summative and outcome/process evaluation. Items considered and discussed included the appropriate types of questions for each type of evaluation, how different evaluation theories or approaches examine each type, and the most appropriate data, analysis, and reporting mechanisms for each type of evaluation. Significant time was spent in this session discussing the importance of mixed methods designs in robust evaluations.
“Think about it” questions that were generated from this session included considering the most appropriate evaluation questions for each evaluation type, examining how summative evaluations can be used to inform programmatic practice, and thinking about the pitfalls of relying on only summative evaluation reports.
Lesson 4: Data Visualization and Reporting (June 14, 2015) - Webinar Slides
The final webinar in this series discussed data visualization and reporting. In particular, how to determine the best reporting mechanisms for various stakeholder audiences, different methods for visualizing data, and the critical importance of different reporting tools were all discussed. This session tied together several underlying themes from the other sessions, including consideration of all stakeholders and ensuring the usability and relevance of the evaluation by all stakeholders.
Some of the “think about it” questions that were generated at this session included how to handle challenging responses from stakeholders to evaluation findings, how to identify and engage with key stakeholders to discuss the data, and how stakeholders can support evaluation efforts by providing more detailed understanding of key findings.
Dr. Virginia Dick is director of research and evaluation at CSTE. If you are interested in discussing evaluation in more detail, please feel free to contact her at firstname.lastname@example.org. For other interesting webinars on data and applied public health epidemiology issues, visit the webinar library.
Posted By Brittni Frederiksen,
Friday, December 11, 2015
Updated: Friday, December 11, 2015
The buzzword in the Maternal and Child Health and Reproductive Health communities is LARC – long-acting reversible contraceptives (e.g. intrauterine devices (IUDs) and implants). LARCs are game-changing, highly effective contraceptive devices that can decrease unintended pregnancies and increase birth spacing. They are also cost-effective and require little effort on the part of the user, making them an appealing contraceptive method to both teens and adult women.
As a CDC/CSTE Applied Epidemiology fellow in Maternal and Child Health (MCH) in the Bureau of Family Health at the Iowa Department of Public Health (IDPH), I have had the opportunity to work on a number of projects related to LARCs. Iowa’s involvement in LARC-related initiatives started when Iowa and Colorado were funded by a private donor to lead Initiatives to Reduce Unintended Pregnancies by promoting LARC use and removing barriers to uptake. The Iowa Initiative grant allowed clinics to expand hours and locations, train clinical nurse practitioners and physicians on the benefits of LARCs and how to use them, and most importantly, purchase LARCs so clinics could offer them at low cost or no cost to their patients. LARCs are expensive upfront, and prior to the Iowa Initiative it was difficult for clinics to afford to offer LARCs to their patients because of the cost. During the Iowa Initiative period, use of LARCs as a primary method of contraception increased substantially while the percent of unintended pregnancies declined by 11% and pregnancies terminated by abortion declined by 25%. Even though the Iowa Initiative ended in 2012, LARCs have continued to remain a popular contraceptive choice for women in Iowa.
Over the past year Iowa has participated in a multi-state LARC Learning Community led by the Association of State and Territorial Health Officials (ASTHO). This initiative is designed to assist states in implementing immediate post-partum LARC insertion: in other words, strategies to provide a woman with a LARC before her hospital discharge, post-delivery. Immediate post-partum insertion of LARCs allows women to prevent unintended pregnancies and effectively space pregnancies, which in turn can decrease poor health outcomes for mothers and babies. One of the barriers to immediate post-partum LARC insertion has been the bundling of the post-partum insertion into the delivery payment; bundled reimbursement prevented providers from getting reimbursed for the LARC device and insertion. In February 2014, Iowa Medicaid Enterprise released informational letter no. 1349 to unbundle LARCs from the payment for the inpatient admission associated with the delivery. This was a significant step in promoting immediate post-partum LARC insertion. We were fortunate to work with two Harvard students who visited IDPH for a week in January. Together we created an evaluation plan to assess the effectiveness of the LARC unbundling, as well as a proposed outreach and training program to educate providers, billing staff, and Medicaid recipients about insertion and billing of LARCs in the immediate post-partum period.
To ensure women have access to quality family planning services and to encourage use of the more effective contraceptive methods, the Office of Population Affairs has proposed two performance measures: the percentage of women aged 15-44 years at risk of unintended pregnancy who adopt or continue use of:
The most effective (i.e., male or female sterilization, implants, IUDs) or moderately effective (i.e., injectables, oral pills, patch, ring, or diaphragm) FDA-approved methods of contraception
An FDA-approved LARC.
The first measure is an intermediate outcome measure, and it is desirable to have a high percentage of women using the most effective or moderately effective contraceptive methods. The second measure is an access measure, and the focus is on making sure that women have access to LARC methods. Versions of these two measures applied to women who have had a live birth are also in development to measure postpartum contraceptive use among women ages 15-44.
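At its core, the first (intermediate outcome) measure is a proportion over eligible women. The sketch below illustrates that calculation only; the record fields and method groupings are assumptions for the example, not the official OPA measure specification (and the post's actual implementation was written in SAS against Medicaid claims).

```python
# Hypothetical sketch of the intermediate outcome measure: the percentage of
# women aged 15-44 at risk of unintended pregnancy who use a most effective
# or moderately effective contraceptive method. Field names are assumptions.
MOST_EFFECTIVE = {"sterilization", "implant", "IUD"}
MODERATELY_EFFECTIVE = {"injectable", "pill", "patch", "ring", "diaphragm"}

def outcome_measure(clients):
    """clients: iterable of dicts with 'age', 'at_risk', and 'method' keys."""
    # Denominator: women aged 15-44 at risk of unintended pregnancy.
    eligible = [c for c in clients if 15 <= c["age"] <= 44 and c["at_risk"]]
    if not eligible:
        return 0.0
    # Numerator: eligible women using a most or moderately effective method.
    using = [c for c in eligible
             if c["method"] in MOST_EFFECTIVE | MODERATELY_EFFECTIVE]
    return 100.0 * len(using) / len(eligible)

clients = [
    {"age": 24, "at_risk": True,  "method": "IUD"},
    {"age": 31, "at_risk": True,  "method": "condom"},   # less effective
    {"age": 19, "at_risk": True,  "method": "pill"},
    {"age": 50, "at_risk": True,  "method": "pill"},     # outside age range
    {"age": 28, "at_risk": False, "method": None},       # not at risk
]
print(round(outcome_measure(clients), 1))  # 66.7
```

In claims-based implementations the "at risk" and method determinations come from diagnosis, procedure, and drug codes rather than a flag, which is exactly why the coding issues discussed elsewhere in this series matter for measurement.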
Iowa has had the unique opportunity to collaborate with the Office of Population Affairs and the Centers for Disease Control and Prevention to pilot the two performance measures using Title X data and Medicaid paid claims data. As a CDC/CSTE Applied Epidemiology fellow, I was able to apply my SAS programming skills to develop SAS code that calculates the performance measures using Medicaid claims data and that other states can also use to calculate the measures among their Medicaid populations. The Center for Medicaid and CHIP Services (CMCS) recently awarded funding to 13 states, including Iowa, and one US territory to support state efforts to collect and report data to CMCS on the new developmental quality measures. These data will assess progress on the Maternal and Infant Health Initiative goal to increase the use of effective methods of contraception among all women in Medicaid and CHIP. This has been an incredible opportunity to apply my epidemiological skills to a national initiative that will ensure women have access to the contraceptive method of their choice while reducing unintended pregnancies and improving birth outcomes.
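To make the two measures concrete, here is a minimal sketch of how they could be computed from a simplified cohort. This is an illustrative assumption, not the actual SAS code or the CMCS/OPA measure specification: the field names (`age`, `method`) and the method categories are simplified stand-ins.

```python
# Illustrative categories; the official specification uses coded claims data.
MOST_EFFECTIVE = {"sterilization", "implant", "IUD"}
MODERATELY_EFFECTIVE = {"injectable", "oral_pill", "patch", "ring", "diaphragm"}
LARC = {"implant", "IUD"}

def measure_rates(women):
    """women: records with 'age' and 'method' ('method' is None if no method).

    Returns the two proposed measures as percentages of women aged 15-44
    at risk of unintended pregnancy (here, everyone in the age range).
    """
    at_risk = [w for w in women if 15 <= w["age"] <= 44]
    if not at_risk:
        return None
    effective = sum(1 for w in at_risk
                    if w["method"] in MOST_EFFECTIVE | MODERATELY_EFFECTIVE)
    larc = sum(1 for w in at_risk if w["method"] in LARC)
    n = len(at_risk)
    return {"most_or_moderately_effective_pct": 100 * effective / n,
            "larc_pct": 100 * larc / n}

cohort = [
    {"age": 24, "method": "IUD"},
    {"age": 31, "method": "oral_pill"},
    {"age": 19, "method": None},
    {"age": 40, "method": "implant"},
]
print(measure_rates(cohort))
# -> {'most_or_moderately_effective_pct': 75.0, 'larc_pct': 50.0}
```

In the real measures, the at-risk denominator also excludes, for example, women who are pregnant or infecund; the sketch omits those exclusions for brevity.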
Brittni Frederiksen is a CDC/CSTE Applied Epidemiology fellow at the Bureau of Family Health in the Iowa Department of Public Health. For more information on related issues, join the CSTE Maternal and Child Health Subcommittee to follow updates on current activities.
Posted By Christine Dobson,
Friday, December 4, 2015
Updated: Friday, December 4, 2015
The event that would largely shape my introduction to the applied epidemiology field came as I was en route to the CSTE Applied Epidemiology Fellowship Orientation Week. Early in the morning on August 24, 2014, a 6.0-magnitude earthquake struck the San Francisco Bay Area where I live and work. I learned of the earthquake from the East Coast while traveling to orientation in Atlanta. Had I been home, the earthquake would have jolted me awake at 2 in the morning and rocked me for about 20 seconds. The earthquake, which was epicentered approximately 30 miles northeast of San Francisco, caused five residential fires, downed power lines, and resulted in damage to buildings, homes, roadways, and water mains. Suspecting that the California Department of Public Health (CDPH), my host agency, might play a role in its aftermath, I contacted colleagues offering to help with any response effort upon my return to California.
That response effort was a community rapid needs assessment, called a Community Assessment for Preparedness and Emergency Response (CASPER). The Emergency Preparedness Team (EPT) within the Division of Environmental and Occupational Disease Control at CDPH had been asked by the Public Health Division in Napa County, one of the hardest-hit counties, for assistance assessing earthquake-related physical injuries, chronic disease exacerbation, mental health issues, and general emergency preparedness among households in Napa County. A team composed of epidemiologists, health educators, and physicians from the county and state health departments and the CDC was formed to plan the CASPER event. Using guidance from CDC’s CASPER toolkit, the team worked together to design a 48-question questionnaire for household-based interviews, to select the geographic areas for household sampling, to recruit volunteers to canvass the selected households, and to coordinate myriad logistics for a three-day field sampling event scheduled three weeks after the earthquake. Over those three days, 15 two-person teams of volunteers approached 488 households and successfully completed an interview with a member of 201 of those households.
Although we assessed many different health impacts related to the earthquake, impacts to mental health were a major focus of the CASPER. Just over a quarter of households had a member who experienced a traumatic psychological exposure, which is known to be associated with posttraumatic stress disorder, in the immediate aftermath of the earthquake. Feelings of psychological distress, such as feeling anxious, fearful, or distracted, were experienced in 75% of households. Among all households in which a traumatic or distressing psychological experience occurred, professional help or other emotional support was sought in 41% of households. In response to the data generated by the CASPER, Napa County was able to more effectively reallocate its mental health resources in the immediate aftermath of the earthquake. In addition, the county conducted public training sessions and education campaigns to support those with mental health risks, conducted community social events on the earthquake anniversary date to promote community resilience and earthquake preparedness, and provided psychological first aid training to outreach workers.
The CASPER findings were summarized in a report drafted by the EPT and submitted to Napa County Public Health. We also communicated our findings to a wider public health audience through poster and oral presentations at a number of different meetings: a webinar sponsored by the Safe States Alliance; a poster presentation at the 10th Annual Preparedness Summit (Atlanta, GA); an invited talk at the 2015 CSTE Disaster Epidemiology Subcommittee Workgroup Meeting (Atlanta, GA); and presentations at the 2015 EIS Conference (Atlanta, GA) and the 2015 CSTE Annual Conference (Boston, MA). In addition, our findings specific to mental health needs resulting from the earthquake were presented in an MMWR report.
In addition to the actionable data that were generated, the CASPER, the first large-scale project to which I contributed as a CDC/CSTE fellow, also provided a comprehensive and intensive introduction to the work and workings of local and state health departments. I was part of the collaboration and discussions that occurred among local, state, and federal health officials. I was involved in the development of survey materials, in the training of volunteers in sampling and interviewing methods, in the analysis of our collected data, in the drafting of the report, and in presenting our findings across many forums. From start to finish, the CASPER provided opportunities both for learning and for developing my epidemiologic skills in the applied public health space.
Moreover, the CASPER afforded me the opportunity to meet community members affected by the earthquake. Many wanted to share their stories of the shaking and disorienting chaos that gave way to weeks and months of rebuilding their homes and regaining their physical and emotional health. Yet when asked at the conclusion of the interview what their household was most in need of now, the residents I spoke to most often replied, “nothing.” They were simply grateful that their plight had not been more severe. And so my experience on the frontlines of public health taught me another important lesson: the resiliency of those impacted by disaster.
Christine Dobson, ScD is a CDC/CSTE Applied Epidemiology fellow in the Environmental Health Investigations Branch and Occupational Health Branch at the California Department of Public Health. AEF fellow applications are currently open for prospective fellows until January 13, 2016. If you work at a health agency and would like to host a fellow, HSIP host-site applications must be completed by December 16.
Posted By John Satre,
Friday, November 20, 2015
Updated: Friday, November 20, 2015
The Public Health ELR Network: A working model that increases interstate communication and reduces the connections required between laboratories and public health
The concept of a nationwide electronic laboratory reporting (ELR) network among state public health agencies organically sprang out of a field that had already been planted with seeds requiring time to develop and grow. Over the course of the last decade, these seeds have developed and are now producing fruit across the nation. It is appropriate to briefly outline some of the relevant components that make it possible to realize a nationwide, functioning public health ELR network.
Planting the Seeds
The following developments have been necessary for a public health ELR network to blossom:
Health Level 7 (HL7) Standardization – The ELR 2.5.1 HL7 Version 2.5.1 Implementation Guide: ELR to Public Health, Release 1 (US Realm) provided a target for partners involved in public health disease surveillance to develop independent interoperable messaging.
NEDSS and ELC – the National Electronic Disease Surveillance System (NEDSS) initiative provided the overarching vision and the Epidemiology and Laboratory Capacity (ELC) cooperative agreements provided the support to adopt or develop new surveillance systems capable of handling data in a much more sophisticated manner.
LOINC, SNOMED, and HL7 – these coding systems provide a common language for data sharing partners to communicate key concepts.
HITECH – The Health Information Technology for Economic and Clinical Health (HITECH) Act provided education, valuable work products, and technical assistance to public health jurisdictions related to Meaningful Use and ELR.
Meaningful Use Public Health Objectives – These incentives provide impetus for hospitals to approach public health with the necessary resources to plan, develop, and implement electronic connections in each state.
Many activities in public health take only minutes at a time, yet they recur constantly, and summed by day or week they accumulate into a large amount of staff time. One such activity is referring a laboratory result received by one public health jurisdiction to a different state public health jurisdiction for investigation. It is not uncommon for a public health jurisdiction to receive a reportable laboratory result that should have been reported to a different jurisdiction. In addition, there are times when one jurisdiction, upon receipt of a laboratory report, sends a request for assistance to another state (for example, to collect treatment information from a healthcare provider in the second state). This interaction generally occurs through mail, fax, or phone, and amid busy schedules the referral or response is often delayed.

Iowa’s electronic laboratory reporting team, in collaboration with multiple disease surveillance teams in Iowa and Nebraska, has developed program-specific rules and technology to evaluate electronic lab reports upon receipt for possible referral to another jurisdiction; the process is called ELR Redirect. Every laboratory result received by ELR passes through a component where the reportable condition is identified so it can be associated with program rules. The appropriate state is then calculated from a combination of up to three addresses (patient state, ordering physician state, and ordering facility state) for potential referral or an automated request for assistance. Based on the combination of condition and addresses, the laboratory report is either redirected to another state public health jurisdiction or passed on to the original recipient’s surveillance system. Sometimes it is appropriate for the lab result to be both kept by the original recipient and redirected to another state.
This innovative design accomplishes several things: it saves time for the original recipient and the secondary recipient, provides the lab result to the appropriate public health jurisdiction in near real-time, and allows the surveillance system to consume a standard message automatically thereby eliminating manual data entry of this record into the final destination system(s).
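The routing decision described above can be sketched as a small rules engine. This is a hedged illustration only: the rule structure, condition names, and address fields below are assumptions for the example, not Iowa's actual ELR Redirect implementation.

```python
# Hypothetical home jurisdiction that originally received the report.
HOME_STATE = "IA"

# Hypothetical program-specific rules: for each reportable condition,
# which address fields determine the responsible jurisdiction(s).
PROGRAM_RULES = {
    "chlamydia": ["patient_state", "ordering_physician_state"],
    "salmonellosis": ["patient_state", "ordering_facility_state"],
}

def route_report(report):
    """Return the set of state jurisdictions that should receive this report.

    The condition selects which of up to three addresses matter; the report
    may be redirected, kept, or both (when addresses span two states).
    """
    address_fields = PROGRAM_RULES.get(report["condition"], ["patient_state"])
    destinations = {report[f] for f in address_fields if report.get(f)}
    # With no usable address, keep the report in the original recipient's system.
    return destinations or {HOME_STATE}

report = {"condition": "chlamydia",
          "patient_state": "NE",
          "ordering_physician_state": "IA",
          "ordering_facility_state": "IA"}
print(sorted(route_report(report)))  # -> ['IA', 'NE']
```

In this example the patient lives in Nebraska but the ordering physician is in Iowa, so the report is both kept by Iowa and redirected to Nebraska, matching the "kept and redirected" case described above.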
Taking the Next Step
With standards implemented over the past 10 years and this innovative design in place, the next step is a nationwide electronic surveillance network. If a state public health agency can receive a laboratory report electronically and pass that report electronically to another state public health agency, the potential exists for every state public health agency to redirect a laboratory result to every other state public health agency. Costly electronic connections between laboratories and public health agencies that have yet to be established to achieve ELR may now be rendered unnecessary. The possibility of a disease surveillance community that is 100 percent electronic is within sight.
Iowa is currently working with United Clinical Labs (UCL) which serves healthcare providers and their patients in Iowa, Illinois, and Wisconsin. Multiple projects are underway to establish an electronic connection between UCL, which is located in Dubuque, IA, and the public health agencies in Iowa, Illinois, and Wisconsin through the UCL-to-Iowa ELR connection.
John Satre is an Informatics–Training in Place (I-TIPP) fellow at the Iowa Department of Public Health. For more information or to request the customizable software component, e-mail John Satre. If you would like to become an I-TIPP fellow, apply before April 1, 2016. For more information about ELR, join one of CSTE’s three Surveillance/Informatics subcommittees.
Posted By Katrina Hansen,
Friday, November 13, 2015
Updated: Friday, November 13, 2015
We, the New Hampshire Healthcare-Associated Infections (HAI) Program, have published nine annual reports on the occurrence of HAIs in hospitals and ambulatory surgery centers. These reports summarize HAI data routinely reported in accordance with state law and can be used to identify and monitor improvement over time. The original intent of the NH HAI law was to increase transparency for healthcare consumers and assist with making healthcare decisions; experience shows, however, that these reports are primarily used by healthcare facilities. With no standardized template available, our reports have evolved over time as we created our own and adapted various practices from other states to accommodate healthcare facilities and consumers. The NH HAI Program has struggled, and continues to struggle, with balancing the needs of both technical and consumer audiences while preserving the important nuance and detail needed to understand the data.
The picture above is a snapshot of the NH HAI 2014 Hospital Report. NH uses a ‘red, green, and yellow light’ scheme to indicate whether measures are higher than, lower than, or similar to national data, respectively. Example 1: the central line-associated bloodstream infection (CLABSI) standardized infection ratio (SIR) was 0.61, or 39% fewer infections than predicted based on national data (green). Example 2: the catheter-associated urinary tract infection (CAUTI) SIR was 1.26, or 26% more infections than predicted based on national data. However, this difference was not statistically significant, which means the overall number of CAUTIs was similar to national data (yellow).
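The traffic-light logic above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the SIR is observed infections divided by the number predicted from national baseline data, and significance is judged here by a supplied 95% confidence interval. The observed/predicted counts and intervals below are invented to approximately reproduce the SIR values quoted above; they are not the actual NH report data.

```python
def sir_color(observed, predicted, ci_low, ci_high):
    """Return (SIR, color) under the red/green/yellow scheme.

    A confidence interval that includes 1.0 means the facility is not
    statistically different from the national baseline (yellow); otherwise
    green for significantly fewer infections, red for significantly more.
    """
    sir = observed / predicted
    if ci_low <= 1.0 <= ci_high:
        return sir, "yellow"
    return sir, ("green" if sir < 1.0 else "red")

# Example 1: SIR ~0.61 with a CI entirely below 1 -> green.
print(sir_color(11, 18.0, 0.31, 0.99))
# Example 2: SIR 1.26, but the CI crosses 1 -> yellow.
print(sir_color(63, 50.0, 0.97, 1.61))
```

This is why a facility with more infections than predicted (SIR above 1) can still show as "similar": with small numbers, the difference may not be statistically distinguishable from the national experience, a point the post returns to below regarding less populated states.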
With the leadership of Lindsey Weiner (epidemiologist at the Division of Healthcare Quality Promotion, CDC) and Andrea Alvarez (HAI/influenza program coordinator at the Virginia Department of Health), a multidisciplinary group was formed to collaborate and establish best practices in the analysis and display of HAI data, which resulted in the HAI Data Analysis and Presentation Standardization (DAPS) toolkit. I was privileged to be a part of this workgroup, learn from other HAI colleagues, and participate in discussions to develop a tool that can be used by other programs. Throughout my involvement, common themes arose, including:
We are all healthcare consumers and have competing priorities in our day-to-day lives. We need to remember that short and simple messages are best when trying to reach a general audience.
HAI metrics [e.g., the standardized infection ratio (SIR)] and data limitations are not easy to explain to a consumer audience, let alone some technical audiences.
Symbols and colors can quickly and effectively convey a message. The challenge is picking the right symbols and colors applicable to all settings.
HAI data are continuously evolving, state and federal mandates change, and we need to be flexible enough to adapt to changing needs.
While on this workgroup, I learned a lot from the discussions about the challenges other jurisdictions encounter when publishing these important data. Coming from a rural, less populated state, we struggle with having data robust enough to present information for certain measures and facility types. I know we are not the only state working within this context, and it was helpful to learn from others with similar experiences. I kept this in mind when providing feedback on the toolkit and the need to provide options for states with comparable restrictions. Similarly, I used our own experience in NH to help contribute to the healthcare personnel vaccination component of the template. Several approaches are addressed and made available within the toolkit in order to provide flexibility to states.
Going forward, I plan to incorporate as much of the toolkit as possible into future reports. I anticipate that we will reach a wider audience and ultimately increase HAI knowledge by making these data more meaningful and accessible to all. Participating in the CSTE workgroup was a great opportunity not only to improve our own work in New Hampshire, but also to help shape the way healthcare-associated infections programs move forward together across jurisdictions in a complementary and consistent way.
Katrina Hansen, MPH is healthcare-associated infections program manager at the New Hampshire HAI Program. Click the link above to learn more about the HAI DAPS Toolkit and join the HAI Subcommittee for the opportunity to engage in initiatives such as this.