How we choose data

  1. Thousands of people have shared their expertise in the development of our initiative. In each topic area, dozens of advisory groups made up of experts in the field have shared their recommendations about what is most important for measuring progress in our state and communities.

  2. Wilder Research staff assemble and review advisory group recommendations to identify 3-4 key measures to be maintained in each topic area on Minnesota Compass. Advisory group suggestions are evaluated on several criteria: (a) the amount of discussion and interest they generated among advisors, (b) the availability of existing data, and (c) alignment with a set of criteria set forth by a technical advisory panel.

  3. Wilder Research staff compile data from credible sources, such as the American Community Survey, an ongoing year-round survey of households in the United States conducted by the U.S. Census Bureau. Sources can be found on each graph under "Data & notes."

  4. Wilder Research staff create and maintain a set of easy-to-understand charts and tables in each topic area. Where data are available, charts are created for the state and smaller geographies, as well as for several demographic subgroups. Charts and tables are updated when new data are released.

  5. Topic area advisory groups are intermittently reconvened to ensure that we are measuring what matters for understanding and making progress toward better quality of life for all Minnesotans.

Frequently asked questions (FAQs)

Key measures are indicators that help judge how we are doing in a given area. For example, the high school graduation rate is a measure of how well we are preparing our students for post-secondary education and the workforce. When indicators are measured over time, we can trace trends—are we improving, staying the same, or getting worse? By measuring these kinds of data, we can:

  • Learn where we are today
  • Identify where there are disparities in outcomes
  • Measure progress over time
  • Inspire action to improve the quality of life in the region

In each of the topics that measure quality of life, advisory groups made up of experts in the field, along with academic and business leaders, convened to choose the indicators. They used a set of criteria and guidelines set forth by a Technical Advisory Committee.

Criteria

RELEVANT
Data relate to stated quality of life topic goals.

VALID
Data truly measure what they are intended to measure.

CONSISTENT OVER TIME
Data are regularly collected the same way.

LEADING
Indicators signal broader changes to come, allowing the community to respond proactively.

POLICY-RESPONSIVE
Indicators can be impacted by policy changes within a relatively short time period.

AFFORDABLE
Data can be collected easily within the project budget.

SECONDARY CRITERIA
Understandable
Comparable
Standardized
Outcome-oriented

Most of the key measure data have been compiled by Wilder Research staff using data from one or more credible sources. The source citation for each key measure is provided below its data graph. To find more detailed information, use the drop-down menu found above each key measure graph, go to View, and choose "Data & notes."

Additional source information:

Sources used to compile the geographic profile pages

Sources used to compile demographic data

Whenever possible, Minnesota Compass relies on a single data source within a key measure to maintain consistency across charts. Occasionally, we have to use several data sources, usually so that we can provide data for smaller geographies (e.g., counties, small cities) or for smaller subgroups in the population (e.g., foreign born, older adults).

For example, our voter turnout key measure uses data from the Secretary of State of Minnesota for the overall turnout figure for the state. But we use data from the Current Population Survey in order to look at turnout by race and other demographic characteristics.

In rare instances, this means that an estimate for the same geography (e.g., Minnesota) in the same year is slightly different across charts.

One of our main data sources for key measures is the American Community Survey (ACS), an ongoing, year-round survey of households in the United States conducted by the U.S. Census Bureau. The ACS produces estimates that describe average characteristics of an area over a specific time period. Depending on the size of a population, we need to use 1, 3, or 5 years of pooled data to publish reliable estimates. ACS 1-year estimates describe average characteristics over a single year, while ACS 3- or 5-year estimates describe average characteristics over that three- or five-year period.

Most estimates on Compass are based on information collected from a sample of the total population. Relying on a sample introduces possible error, because estimates would likely vary if the same survey were conducted with a different sample of the population.

The margin of error (+/-) gives a measure of statistical uncertainty. Adding and subtracting the margin of error from an estimate gives a range; we can say with a certain level of confidence that the true population value falls within that range. When possible, Compass publishes margins of error at a 90% or 95% confidence level, depending on the standard used by the original data source.

For example, the Census Bureau uses a 90% confidence level. An estimate from American Community Survey data of 49% with a margin of error of +/-1% means we can be 90% confident that the true population percentage is between 48% and 50%.

In general, margins of error are larger for smaller groups or smaller levels of geography. For example, estimated characteristics of American Indian residents in Minneapolis will tend to have larger margins of error than estimated characteristics of all Minneapolis residents. And estimated characteristics of Minneapolis residents will tend to have larger margins of error than estimated characteristics of all Minnesota residents.

When using the same data source to compare two points in time or two groups, margins of error can help determine whether there is evidence of change over time or of differences between groups. If estimates differ but their margins of error overlap, one cannot be certain that there are differences in the population. For example, suppose the uninsured rate in your community was 10% (+/-3%) in 2000 and 11% (+/-3%) in 2001. The ranges created by adding and subtracting the margins of error from their respective estimates overlap: an estimated 7-13% uninsured in 2000, and an estimated 8-14% uninsured in 2001. Therefore, we cannot know whether the "increase" from 10% to 11% was due to chance or reflects an actual increase in the uninsured rate in the community.
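The arithmetic behind this overlap check can be sketched in a few lines of Python. This is an illustrative example only, not code used by Minnesota Compass; the function names are our own.

```python
# Illustrative sketch (not Compass code): turning an estimate and its
# margin of error into a range, and checking whether two ranges overlap.

def moe_range(estimate, moe):
    """Return the (low, high) range implied by an estimate and its margin of error."""
    return (estimate - moe, estimate + moe)

def ranges_overlap(a, b):
    """True if two (low, high) ranges overlap, meaning the apparent
    difference between the two estimates may be due to chance."""
    return a[0] <= b[1] and b[0] <= a[1]

# The uninsured-rate example: 10% (+/-3%) in 2000 vs. 11% (+/-3%) in 2001.
r2000 = moe_range(10, 3)  # (7, 13)
r2001 = moe_range(11, 3)  # (8, 14)
print(ranges_overlap(r2000, r2001))  # True: cannot rule out chance
```

Note that comparing ranges this way is a conservative rule of thumb; a formal statistical test can sometimes detect a real difference even when the two ranges overlap slightly.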

Margins of error are listed on the "Data & notes" page for all Compass graphs. In some instances, Minnesota Compass staff have chosen not to publish data when margins of error are too large to draw reliable conclusions.

The results of surveys are subject to other types of error besides sampling error. Consult the original data source for additional details about methodology and potential sources of error. In addition, not all data found on Minnesota Compass are survey data; therefore, some data do not have posted margins of error. Data about educational test scores, low birth weight babies, and crime, for example, come from actual reported counts, so there is no sampling error involved. However, other types of error may be present, such as crimes that occurred but were not reported to police. In addition, some data for small numbers may be suppressed to protect privacy. Where this is known, Minnesota Compass staff have indicated it on the "Data & notes" page. Finally, as with all human endeavors, it is possible that Minnesota Compass staff have introduced an error.

If you think you have found an error or have a question about the data, please let us know. We appreciate all feedback that can help us improve our website!