An Official Website of the Commonwealth of Kentucky



Kentucky Summative Assessment

Kentucky students take the Kentucky Summative Assessments (KSA) to meet federal and state testing requirements. Previously, these tests were called the Kentucky Performance Rating for Educational Progress (K-PREP). They are developed by Kentucky teachers and align with the Kentucky Academic Standards in each content area.

KSA are the annual summative assessments given to Kentucky public school students in grades 3 through 8, 10, and 11. KSA provides content area assessments in reading and mathematics at all tested grades, and in science, social studies, writing, and editing and mechanics once per grade band (i.e., elementary, middle, and high school).

These assessments are Kentucky’s measure of student proficiency and progress on the state content standards. These standards establish goals for what all students should know and be able to do in each grade. KSA are online assessments; only a small percentage of students with accommodations take them on paper. The assessments go beyond multiple-choice questions to include extended-response and technology-enhanced items that let students demonstrate critical thinking and problem-solving skills.

Assessments provide critical information about student learning, but no single assessment should ever be the sole factor in an educational decision. Assessments provide only one measure of student learning; when combined with grades, classroom activities, unit quizzes and tests, and district-level assessments, end-of-year assessments help provide a more complete picture of a child’s abilities over the course of the school year and progress toward academic success on these content standards.

Performance Level Descriptors (PLDs)

For KSA, students receive a scale score and a performance level (distinguished, proficient, apprentice, or novice) for each content area. Writing scores include editing and mechanics and on-demand writing, along with an overall combined writing score. Scale scores are reported as numbers, while performance levels are descriptive and differ by content area. These levels indicate performance on groups of items that measure similar skills. PLDs for each grade and content area are available through the links below.

KSA PLD Grade 3

KSA PLD Grade 4

KSA PLD Grade 5

KSA PLD Grade 6

KSA PLD Grade 7

KSA PLD Grade 8

KSA PLD Grade 10

KSA PLD Grade 11

AKSA PLD Grade 3

AKSA PLD Grade 4

AKSA PLD Grade 5

AKSA PLD Grade 6

AKSA PLD Grade 7

AKSA PLD Grade 8

AKSA PLD Grade 10

AKSA PLD Grade 11

Kentucky Academic Standards

Please visit the Kentucky Academic Standards page to find the adopted Kentucky Academic Standards for reading, mathematics, science, social studies, editing and mechanics, and writing.

Important Resources for Spring 2022-2023 Testing

Please visit Assessment Blueprint - Kentucky Department of Education to find the blueprints for reading, editing and mechanics, writing, and social studies. Please download the Science Assessment Blueprint.

Test Administration Manuals

KSA Math Reference Sheets

Mathematics reference sheets are no longer provided. Instead, the formulas are embedded within specific assessment items.

Short Answer Response Space

Extended Response Answer Space

On-Demand Writing (ODW) Rubrics and Writer's Reference Sheets

KSA Writer's Reference Sheets

KSA Grade 5 Writer's Reference Sheet

KSA Grade 8 Writer's Reference Sheet

KSA Grade 11 Writer's Reference Sheet

KSA Rubrics 

KSA Opinion Rubric Grade 5

KSA Argumentation Rubric Grade 8

KSA Argumentation Rubric Grade 11

2022-2023 Testing Window

Online testing information.

2022-2023 resources will be posted as they are developed. Check this page frequently for updates.

Ky Portal Resources (KSA Field Test and Operational Manuals, Scripts, etc.)


Practice Tests/Rubrics

Online Testing Toolbox 

The Online Testing Toolbox contains links to resources, tools, practice tests, tutorials, technical information, and accommodations information.

To help reduce testing allegations, OAA has provided districts and schools with the Accommodated Bookmarks, a quick reference sheet of do's and don'ts for providing testing accommodations. OAA also provides districts with English Learner (EL) Accommodated Bookmarks, a quick reference sheet of do's and don'ts for providing accommodations to EL students.

PearsonAccessnext and TestNav trainings can be found on the Meetings and Trainings webpage. Guidance for specific topics can be found below.

Certification of Proper Test Administration forms are due within two weeks after the close of the test window.

Test Administration Forms

DAC Certification of Proper Test Administration

BAC Certification of Proper Test Administration

Non-Standard Response Templates

Do not change the margins or font size on the templates.

Extended Response

Short Answer

On-Demand Writing

Science Grades 4, 7 & 11

Google Docs Guidelines ​

KSA Sample/Released Items (Coming Soon)

School Report Card Suite

Assessment and Accountability Resources

DAC Monday Emails

DAC Trainings and Webcasts

Alternate Assessment Information

Subscribe to News and Alerts

Connect With Us

Questions for summative assessment - 11th grade

Verified questions

The following sentence may contain an error in agreement between a subject and a verb. First, circle the simple subject of each sentence. Then, draw a line through each incorrect verb, and above it write the form that agrees with the subject. If a sentence is correct, write C at the end.

Most of the incident was captured on videotape.

Recognizing Forms of Be as Linking Verbs. Underline the form of be in each sentence below. My assistant will be happy to help you.

Recognizing Demonstrative Pronouns. Underline the demonstrative pronouns. Holding out a bunch of daffodils, Daisy said, "I brought you these."

How does Beowulf prove his victory over Grendel? Why might he do this?



WVASA in Grades 3-8 and 11

WVDE is charged with determining and approving rigorous yet reasonable cut scores to ensure an accurate representation of student performance in English language arts (ELA), math, and science as mandated by state and federal law.

Standard Setting Process

To set cut scores and establish achievement levels for both the West Virginia General Summative Assessment in Grades 3-8 and the SAT School Day in Grade 11, the West Virginia Department of Education (WVDE), in collaboration with its testing contractors, conducted standard setting meetings during the summer of 2018. Standard setting is the technical process in which panels of content experts participate in a federally approved process to determine cut scores per grade level, per content area. The panelists, which included West Virginia educators, business representatives, parents, and higher education faculty, set cut scores to determine four achievement levels – Exceeds Standard, Meets Standard, Partially Meets Standard, and Does Not Meet Standard.
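Mechanically, cut scores partition the scale-score range into the four achievement levels named above. The sketch below illustrates that lookup; the three cut-score values are hypothetical placeholders for illustration only, not actual WVDE values, which vary by grade level and content area.

```python
# Illustrative sketch: three cut scores divide a scale-score range into four
# achievement levels. The cut scores below are HYPOTHETICAL placeholders, not
# the actual WVDE values, which differ by grade level and content area.
from bisect import bisect_right

LEVELS = [
    "Does Not Meet Standard",
    "Partially Meets Standard",
    "Meets Standard",
    "Exceeds Standard",
]

def achievement_level(scale_score, cuts=(2500, 2550, 2600)):
    """Return the achievement level for a scale score.

    Each cut score is treated as the inclusive lower bound of the next level.
    """
    return LEVELS[bisect_right(cuts, scale_score)]

print(achievement_level(2499))  # Does Not Meet Standard
print(achievement_level(2550))  # Meets Standard
print(achievement_level(2625))  # Exceeds Standard
```

Because each panel sets cut scores per grade level and per content area, a real implementation would look up the `cuts` tuple by (grade, subject) rather than hard-coding one set of values.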

SAT School Day

WVGSA in Grades 3-8

The West Virginia Alternate Summative Assessment is administered to students with significant cognitive disabilities who are taught using the West Virginia Alternate Academic Achievement Standards in Grades 3-8 and 11. Results for the alternate assessment are not based on raw or scaled scores, so no scaled score ranges are presented. The four achievement levels across all grade levels and content areas for the alternate assessment are Advanced, At Target, Approaching Target, and Emerging.

Achievement level descriptors provide a general description of student performance at each level.

Grades 3-8 and 11

Emerging
The student demonstrates an emerging understanding of and ability to apply content knowledge and skills represented by the Essential Elements.

Approaching Target
The student’s understanding of and ability to apply targeted content knowledge and skills represented by the Essential Elements is approaching the target.

At Target
The student’s understanding of and ability to apply content knowledge and skills represented by the Essential Elements is at target.

Advanced
The student demonstrates an advanced understanding of and ability to apply targeted content knowledge and skills represented by the Essential Elements.





Assessing Writing

Using the Smarter Balanced grade 11 summative assessment in college writing placement

The Smarter Balanced grade 11 summative assessment is a career and college readiness assessment aligned with the Common Core State Standards. In addition to its use as a measure in the high school, over 200 colleges and universities in 10 states use the results of this assessment as part of a multiple measures approach for placement in writing and mathematics. Our focus in this review is on the assessment’s use in college and university writing placement. We discuss the writing constructs made evident in the assessment in addition to what is not measured in order to identify the possibilities and limitations of its use in college writing placement. We offer suggestions for future research into the consequential validity of the assessment.

The Smarter Balanced grade 11 summative assessment (SB11) is a career and college readiness assessment aligned with the Common Core State Standards (CCSS). It is administered to high school juniors in Smarter Balanced Assessment Consortium (SBAC) states, which include California, Connecticut, Delaware, Hawaii, Idaho, Maine, Michigan, Montana, Nevada, New Hampshire, North Dakota, Oregon, South Dakota, Vermont, Washington, West Virginia and Wisconsin. In addition to its use as a measure in the high school, over 200 colleges and universities in 10 of these states use the results of this assessment as a means of placement in writing and mathematics. Our primary focus in this review is on the assessment’s use in college and university writing placement.

The score of the English language arts and literacy section of SB11 is determined by a performance task that measures writing and a computer adaptive portion that measures reading and listening (Smarter Balanced Assessment Consortium, 2015, 2018a). The writing section is a combination of indirect and direct measures. Student writing is indirectly measured through multiple choice questions about organization, purpose, and evidence. For example, a student might be asked to identify the sentences that best support the conclusion within an argumentative sample. The direct measure of writing, or performance task, requires students to examine given sources in order to craft an argument for an explicitly-identified audience and is assessed on three criteria: organization, evidence, and conventions (Washington Comprehensive Assessment Program, n.d.). Organization includes matters related to clarity of the claim, presence of opposing claims, coherence, and effectiveness of introductions and conclusions. The evidence category evaluates relevance of source information, clarity of paraphrasing, and accuracy of documentation. The conventions category measures compliance with “the rules of grammar, usage, punctuation, capitalization, and spelling.”

These assessment criteria make evident certain writing constructs, which rely on ideologies that warrant critical attention. For instance, SB11 suggests that writing should paraphrase relevant evidence from the given research, address multiple perspectives, sufficiently document sources, and be coherent. Though coherence is not directly defined, essay scoring materials consider “how well … ideas thoughtfully flow from beginning to end using effective transitions” (Washington Comprehensive Assessment Program, n.d.). This understanding of coherence emphasizes one aspect of it—namely, the meaning making that results from connections between parts of a text—but neglects the intertextual aspect that makes meaning by connecting the text to the world. Fairclough (2015) notes that this intertextual aspect relies on notions of “common sense” that are ideological. These ideologies privilege dominant understandings of the world and offer no place for diverse experiences. The assessment criteria also make evident writing constructs that presuppose that writing must conform to certain rules governing language. While SBAC does not specify which language rules, these are likely the rules of standardized English because standard language ideologies present standardized English both as the default variety and the sole rule-governed variety (Lippi-Green, 2012; Milroy & Milroy, 2012). Taken together, these writing constructs leave out certain genres including those that are not argument-focused, have broader views of audience, are situated within authentic rhetorical situations, and involve non-computer-mediated forms of writing.

Though high schools treat SB11 as a single measure of college readiness, Smarter Balanced Assessment Consortium (2018b) and the colleges and universities that use the assessment treat it as one measure among many for college writing placement. In doing so, SBAC endorses an approach to college writing placement that uses multiple measures, such as SB11 and similar assessment scores, high school GPA, Learning and Study-Strategies Inventory results, writing samples, and related measures. The use of multiple measures in college writing placement provides a fuller account of students as writers and potentially minimizes the risk of misplacement that comes with single measure approaches. Though the research reviewed here suggests that this minimization enhances the consequential validity of the multiple measures approach, it does not necessarily improve the construct validity. That is, the research reviewed shows that the use of multiple measures minimizes the negative consequences that come with misplacement, but the individual measures themselves are no more valid merely because of their placement in conjunction with other measures.

SB11 and other models of college writing placement

SB11 as part of a multiple measures approach is an alternative to single-measure, high-stakes placement tests, such as ACCUPLACER or COMPASS, which create substantial risk for misplacing—especially under-placing—students. In one analysis of the predictive validity of placement tests like ACCUPLACER and COMPASS, Scott-Clayton (2012) found that such tests were relatively good predictors of who would do well in college-level courses but relatively bad predictors of who would fail. In other words,

Limitations: ideological assumptions and inequity

As a measurement of college readiness, what is measured and assessed in SB11 demonstrates the kinds of skills SBAC and school officials value and believe a first-year college student should have. In this way, the material covered, the number of questions in certain portions, and the weighting of portions to determine a score operate from “assumptions and resources of particular (historical, disciplinary, technological, cultural, social, linguistic) contexts” (Aull, 2017, p. A3) that privilege

Future considerations for Smarter Balanced and writing placement

The SBAC offers little specific detail about the future of SB11. Its Strategic Plan 2017–2022 (Smarter Balanced Assessment Consortium, 2017), however, offers some clues about the assessment’s future developments. For instance, SBAC plans to provide additional accessibility features, such as illustration glossaries, a more accessible calculator, and further developments in its Braille

Conflict of interest

Nothing declared.

Kendon Smith is a graduate student in the Joint Program in English and Education at the University of Michigan. He holds an MA in English Studies from Western Washington University and has taught first-year composition, basic writing, and professional and technical communication at Skagit Valley College.

References

Awaiting a new wave: The status of state writing assessment in the United States
Tools and tech: A new forum
The changing nature of educational assessment (Review of Research in Education)
Predicting success in college: The importance of placement tests and high school transcripts
Southern Illinois University Carbondale as an institutional model: The English 100/101 stretch and directed self-placement program
Introducing directed self-placement to Kutztown University
Automated scoring technologies and the rising influence of error (English Journal)
Frequently asked questions: Smarter Balanced interim automated scoring
The case of a small liberal arts university: Directed self-placement at DePauw
Language and power
Learning to ride the waves: Making decisions about placement testing
Improving developmental education assessment and placement: Lessons from community colleges across the country
Assessing developmental assessment in community colleges
Potential scoring and predictive bias in interim and summative writing assessments


Examining the validity of an analytic rating scale for a Spanish test for academic purposes using the argument-based approach to validation

Rating scales are used to assess the performance of examinees presented with open-ended tasks. Drawing on an argument-based approach to validation, this study reports on the development of an analytic rating scale designed for a Spanish test for academic purposes. The study is one of the first that sets out the detailed scale development and validation activities for a rating scale for Spanish as a second language. The rating scale was grounded in a communicative competence model and developed and validated over two phases. The first version was trialed by five raters, and its quality was analyzed by means of many-facet Rasch measurement. Based on the raters’ experience and on the statistical results, the rating scale was modified and a second version was trialed by six raters. After the rating process, raters were sent an online questionnaire in order to collect their opinions and perceptions of the rating scale, the training and the feedback provided during the rating process. The results suggest the rating scale was of good quality and raters’ comments were generally positive, although they mentioned that more samples and training were needed. The study has implications for rating scale development and validation for languages other than English.

Editorial: 25 Years of Assessing Writing

Affordances of TOEFL writing tasks beyond university admissions

This review describes in brief the writing sub-sections of the internet-based Test of English as a Foreign Language (TOEFL) and discusses its use in university admissions decisions and potential use as a tool for course placement. The primary purpose of the TOEFL is to measure the academic English proficiency of non-native English speakers seeking admission to English-medium universities. The writing portion of the TOEFL consists of an independent writing task and a second task that requires test-takers to integrate reading, listening, and writing skills to compose a written text. Currently there is only limited evidence supporting the use of TOEFL scores for placement (either in ESL programs or in university composition courses), but the review offers guidelines for further investigation into the TOEFL’s potential as an assessment tool for placement.

Large-scale state assessment is in a time of flux in the United States. The Common Core State Standards have been widely adopted, resulting in most states developing or adopting new writing assessments. This article presents results from document analysis of websites across all 50 states conducted in 2015 to determine writing assessment formats and scoring practices. Drawing on the dichotomy of psychometric and sociocultural assessment approaches, three major classifications for writing assessments are used to categorize assessments: indirect psychometric, direct psychometric, and direct sociocultural, with these aligning with multiple choice tests, traditional “direct writing assessment” or on-demand essay assessment, and portfolio assessment, respectively. Findings indicated that 46 out of 50 states (92%) were primarily using on-demand essay assessment, often in conjunction with multiple choice and short answer items, and no state was utilizing portfolios for writing assessment. Regarding scoring, 98% of state writing assessment was scored externally with no involvement of the classroom teacher. Overall, there was no evidence that forms of direct sociocultural assessment were occurring at the state level. The current study offers a snapshot of recent state writing assessment in order to inform the next wave of writing assessment in the United States.


Kelly L. Wheeler is currently in the Joint Program in English and Education at the University of Michigan. She taught for 21 years at the secondary level and has an MAT from the University of Puget Sound and an MA in English Composition and Rhetoric from the University of South Carolina.


  1. Summative assessment ideas and strategies to use to determine final grades for reporting of

  2. GRADE 1 SUMMATIVE TEST with Answer Key (Modules 7-8) 2ND QUARTER

  3. Grade 11 pe summative test

  4. Summative Assessment Ideas

  5. Formative Assessment Example

  6. Summative Assessment Ideas


  1. science sa2 questions 7th class//summative assessment 2 7th class science//annual exam questions

  2. WEEK-11 SUMMATIVE -standard of Police Professionalism

  3. 10th class SA-1 Life Science answer key 2022 //summative assessment1 SCL answer key #sreducation

  4. Grade 2 Quarter 1 Summative Test and Performance Task #4

  5. Fungi Summative presentation

  6. How we conduct the summative test for grade 1 pupils🤗


  1. SB Summative Assessments Parent Guide, Grade 11

    California Assessment of Student Performance and Progress. Parent Guide to the Smarter Balanced Summative Assessments. Grades. 11. Contents. Introduction.

  2. Summative Assessment (Grade 11 Term 1)

    grade. Summative Assessment (Grade 11 Term 1). user. Жанар Джуманкулова. 9. plays. 10 questions. Copy and Edit. Save. INSTRUCTOR-LED SESSION Start a live

  3. Kentucky Summative Assessment

    KSA are the annual summative assessments given in grades 3 through 8, 10 and 11 to Kentucky public school students.

  4. Questions for summative assessment

    Study with Quizlet and memorize flashcards containing terms like In what condition did Sado find the American Soldier ?, Why was Dr. Sadao not sent to the

  5. WVASA in Grades 3-8 and 11

    To set cut scores and establish achievement levels for both the West Virginia General Summative Assessment in Grades 3-8 and the SAT School Day in Grade 11

  6. Grade 11 English Language Arts/Literacy Literary Analysis Task

    English Language Arts/Literacy Assessment: General Scoring Rules for the 2015 Summative Assessment. Note: This item set contains items with embedded

  7. Smarter Balanced, ELA Teacher Guide, Grade 11

    The Smarter Balanced Summative Assessments are part of the California Assessment of. Student Performance and Progress (CAASPP) System. The new

  8. 2020-21 ELA & Math Summative Assessments

    In both ELA and Mathematics, the summative assessment window for. all grades (3 – 8, HS) will be April 13, 2021 – June 11, 2021.

  9. Using the Smarter Balanced grade 11 summative assessment in

    The Smarter Balanced grade 11 summative assessment (SB11) is a career and college readiness assessment aligned with the Common Core State Standards (CCSS). It

  10. Grade 11 Science Released Items-2022

    Kentucky Summative Assessments. Grade 11 Science. Released Items. 2022. Grade 11 Science. Released Items. 2022. Page 2. Science. GO ON. Page 2. BI1701_00.