


H1N1 Data: What Does It All Really Mean?

February 23, 2010

Monitoring Actions for Outcomes

At a town hall session at last week's Public Health Preparedness Summit in Atlanta, presenters focused on how best to collect, evaluate, and use data and statistics on the recent H1N1 pandemic. At the Thursday session, Tamar Klaiman, PhD, MPH, and Michael Stoto, professor of Health Systems Administration and Population at Georgetown University, presented their evaluations of H1N1 data to the audience.

Klaiman looked at the effects of the school closures that had been implemented across the country last fall and winter. According to the School Dismissal Monitoring System from the Centers for Disease Control and Prevention (CDC) and the Department of Education, 1,905 schools closed in the U.S. between Aug. 3 and Dec. 3.

Klaiman said the closures were not effective in preventing the spread of H1N1 because young people still found ways to congregate—at private gatherings and at public places like malls. This point emphasized the importance of monitoring the outcomes of response activities. "It's important to clarify the goal of a response action before taking it," said Klaiman.

Stoto, a fellow of the American Statistical Association, questioned the CDC's interpretation of death and hospitalization rates by age group, arguing that children may not in fact have been disproportionately affected. According to his findings, the differences may be accounted for by discrepancies in rates of testing and reporting.

After Action Reports Point to Variety of Responses

The presentation also focused on local variation in the public health response, based on findings from after action reviews (AARs). Mary Davis of the North Carolina Preparedness and Emergency Response Research Center (NCPERRC) spoke of H1N1 as an opportunity for collecting data to lead to quality improvement (QI) initiatives.

NCPERRC followed methods for collecting, evaluating, and reporting data according to the Homeland Security Exercise and Evaluation Program (HSEEP). The center created AARs on H1N1 with the intention of identifying the nature of response activities, testing for differences in local response between accredited and non-accredited health agencies, and using the findings to identify areas for improvement. An event evaluation guide (EvEG) was developed to determine how to evaluate the data collected through a series of on-site visits and structured interviews at health agencies.

According to the findings, there were wide variations in the scope and timing of local health responses, and there were differences between accredited and non-accredited agencies in the speed, breadth, and capacity of their response activities.

Davis offered recommendations for conducting AARs:

  • Use a central location and format (that can be updated electronically) for documenting response activities;
  • Use a standardized data tool;
  • Recruit outside facilitators; and
  • Determine the most important points for partners to understand, and collect only the essential data.
