
Reporting Cluster Research Tables

The reporting cluster research tables contain aggregate cluster results from the administration of the California Standardized Testing and Reporting (STAR) Program. These research tables contain the average percent correct for all reporting clusters on each of the California Standards Tests (CSTs), the California Modified Assessment (CMA), and the Standards-based Tests in Spanish (STS) for which the State Board of Education has adopted performance levels. These percentages are reported for the state and for each county, district, and school in California. The results are provided for "All Students" only; no demographic data are included. The reporting cluster research tables are compressed, self-extracting files to allow easier downloading and table import.

To protect student confidentiality, test scores are not reported on this site (or included in the research tables) for any group when ten or fewer students had valid test scores.


Statewide Reporting Cluster Data Research Tables

The statewide reporting cluster data contain codes to identify schools and districts as well as the individual reporting cluster names. To make full use of these data, the user is provided a data layout document and tables with school names, district names, and reporting cluster names. For ease of use, the data files and the data format document have been compressed into a single self-extracting archive file. Compressed files containing the data tables are available in three formats: comma delimited (CSV), fixed-length (TXT), and XML. Specifically, each "zipped" archive file contains the reporting cluster data table, the entities table, the test cluster name table, and the table format document described under Tables and Documents below.


For detailed directions regarding the use and manipulation of these tables, see the Data Table Contents and Formats section below.


Downloadable Tables:

2012

2012 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2012 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2012 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2011

2011 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2011 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2011 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2010

2010 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2010 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2010 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2009

2009 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2009 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2009 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2008

2008 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2008 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2008 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2007

2007 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2007 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2007 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2006

2006 California Statewide Reporting Cluster Research Data, comma delimited (2.0 MB)

2006 California Statewide Reporting Cluster Research Data, fixed-length (2.0 MB)

2006 California Statewide Reporting Cluster Research Data, XML (2.7 MB)

2005

2005 California Statewide Reporting Cluster Research Data, comma delimited (2.28 MB)

2005 California Statewide Reporting Cluster Research Data, fixed-length (2.16 MB)

2005 California Statewide Reporting Cluster Research Data, XML (3.04 MB)


Data Table Contents and Formats

The reporting cluster research tables provided at this site are large and require expertise in the management of large data tables. The reporting cluster data table may be too large for spreadsheet applications such as MS Excel, and the "relational" structure of these data can prevent spreadsheet applications from fully managing them. Producing accurate and meaningful reports from the reporting cluster tables generally requires "linking" the statewide reporting cluster data table, the entities table, and the test cluster name table. Database or statistical applications such as MS Access, SAS, or SPSS are generally required to fully manage these research tables.
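
As an illustration, the sketch below shows one way the comma-delimited tables might be loaded and linked using Python and pandas. The file names and column names are placeholders only; the actual names and layouts are defined in the table format document included in each archive.

    # Sketch only: load and link the three comma-delimited tables with pandas.
    # File names and column names are placeholders; consult the table format
    # document in the downloaded archive for the actual layouts.
    import pandas as pd

    clusters = pd.read_csv("cluster_data.csv")           # average percent correct per cluster
    entities = pd.read_csv("entities.csv")               # state, county, district, and school records
    test_names = pd.read_csv("test_cluster_names.csv")   # cluster names for each test

    # Link the cluster data to the entity records through the county, district,
    # and school codes, and to the cluster names through the "ID Code" field
    # described under Tables and Documents below.
    merged = (clusters
              .merge(entities, on=["County Code", "District Code", "School Code"], how="left")
              .merge(test_names, on="ID Code", how="left"))

    print(merged.head())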

Working with these reporting cluster tables to achieve accurate results requires an understanding of the structure and content of the two primary tables: the entities table and the reporting cluster data table. The reporting cluster data table has many rows for each entity (school, district, county, or state); each record represents a different combination of entity, test, and reporting cluster. In addition, the test cluster name table includes numerous individual tests, each a unique combination of grade and test, and each individual test includes, on average, five reporting clusters. With so many records, it is critical that the desired combination of characteristics is selected accurately and that the appropriate constraints are applied to limit the reported data. Failure to apply the appropriate constraints may result in "double counting" or other inaccurate results. These constraints are discussed below.
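
Continuing the sketch above, the constraints might be applied as follows; the "Type Id" and "Test Id" values and the column names are again placeholders rather than the actual codes.

    # Sketch only: restrict the linked data to a single entity type and a single
    # test so that school, district, county, and state summary rows are not mixed.
    # The values and column names below are placeholders.
    schools_only = merged[merged["Type Id"] == "S"]         # keep school-level records only
    one_test = schools_only[schools_only["Test Id"] == 7]   # keep one grade/test combination

    # Each remaining row is one reporting cluster for one school on one test, so
    # the cluster averages can be reported without double counting.
    report = one_test[["School Name", "Cluster Name", "Avg Pct Correct"]]
    print(report)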

Tables and Documents

Reporting Cluster Data Table - This table contains the average percent correct for every reporting cluster of each CST, CMA, and STS at the state, county, district, and school levels. These data are reported for "All Students" and contain no demographic data. The cluster averages are reported as the percent of test items answered correctly within a cluster. The number of clusters for a test can vary from four to six.

Entities Table - This table contains records for the state and for every county, district, and school in California. Because the table includes school, district, county, and state summary records, it is critical that any analysis select a single entity "type" so that the analysis is confined to one entity level. This avoids the double or triple counting that occurs when a school's results are also counted in the associated district, county, and state records.

Test Cluster Name Table - This table lists the cluster names for each test and is linked (or joined) to the cluster data table through the "ID Code."

Table Format Document - This document contains the table layouts for each of the three table formats listed above. For each of these tables, the data elements are named along with additional data characteristics.


Instructions for Downloading Data Tables

Downloading Instructions for PC Users

  1. Select the link for the file type that you will be using (CSV, TXT, or XML).
  2. Save the compressed cluster score data file to your PC.
  3. Uncompress the file. Each file has been compressed and requires compression software to open. An evaluation copy of WinZip is available at no cost at http://www.winzip.com (Outside source).

Downloading Instructions for Mac Users

  1. Select the link for the file type that you will be using (CSV, TXT, or XML).
  2. Save the compressed cluster score data file to your Mac.
  3. Uncompress the file. Each file has been compressed and requires compression software to open. An evaluation copy of Stuffit Expander is available at no cost at http://www.stuffit.com/mac/expander/ (Outside source).
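
For users who prefer to script these steps, the sketch below downloads one archive and extracts it with the Python standard library. The URL is a placeholder for whichever year and format is selected from the Downloadable Tables list above, and the sketch assumes the archive can be read as a standard ZIP file; if the file is a self-extracting archive, a tool such as WinZip or Stuffit Expander may be needed instead.

    # Sketch only: download and extract one compressed data file.
    # Replace ARCHIVE_URL with the link for the desired year and format from the
    # Downloadable Tables list above. Assumes the archive is a standard ZIP file.
    import urllib.request
    import zipfile

    ARCHIVE_URL = "https://example.invalid/cluster_data.zip"  # placeholder URL
    LOCAL_FILE = "cluster_data.zip"

    urllib.request.urlretrieve(ARCHIVE_URL, LOCAL_FILE)
    with zipfile.ZipFile(LOCAL_FILE) as archive:
        archive.extractall("cluster_data")  # unpacks the data tables and the format document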

Usage: Interpreting Reporting Clusters

The STAR Student Reports and a number of reports provided to school districts include reporting cluster information. Depending on the report, the reporting cluster results are shown as percent correct, average percent correct, or diamonds placed relative to a band that shows the range of average percent correct scores for students statewide who scored proficient on the total subject area test: English-language arts, reading/language arts (STS), mathematics, science, or history-social science. Regardless of the cluster data source, the results should be interpreted cautiously with two important limitations always in mind:

  1. Reporting clusters are based on different numbers of questions, and in some cases, the number of questions that make up a reporting cluster may be quite small. Because of the smaller number of questions comprising any individual reporting cluster, these scores are less accurate than the overall test scores.
  2. Reporting cluster scores may vary from year to year because the difficulty of the questions in the reporting clusters may vary. While the overall test scores are equated to adjust for differences in difficulty from year to year, this is not done for the reporting clusters.

Two useful reference points for interpreting reporting clusters are the cluster performance of students statewide who scored at the lowest score for proficient and of students statewide who scored at the lowest score for advanced on the total test. Educational Testing Service has calculated the average percent correct for students statewide who scored at these reference points on each test. The score range is available in Appendix A of each year's post-test guide. These average percent correct values provide information about the relative difficulty of different clusters, which is important to take into account when considering the performance of students in a school or district.
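
As a simple illustration of this comparison, the sketch below checks a school's cluster averages against the two statewide reference points. All of the numbers shown are invented placeholders; the actual reference values must be taken from Appendix A of the relevant post-test guide.

    # Sketch only: compare a school's cluster averages with the statewide
    # reference points (average percent correct for students at the lowest
    # proficient score and at the lowest advanced score). All values below are
    # placeholders; the real reference values come from the post-test guide.
    school_averages = {"Word Analysis": 72.0, "Reading Comprehension": 58.5}
    proficient_reference = {"Word Analysis": 68.0, "Reading Comprehension": 63.0}
    advanced_reference = {"Word Analysis": 85.0, "Reading Comprehension": 80.0}

    for cluster, avg in school_averages.items():
        if avg >= advanced_reference[cluster]:
            band = "at or above the advanced reference point"
        elif avg >= proficient_reference[cluster]:
            band = "at or above the proficient reference point"
        else:
            band = "below the proficient reference point"
        print(f"{cluster}: {avg:.1f}% correct, {band}")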

