Leslie Gale's Blog

Just another WordPress.com weblog

Instructional Strategies and Their Emphasis in Schools July 2, 2013

Module 1 of EDU 6526, Survey of Instructional Strategies, encouraged us to consider what we as educators need to keep in mind when implementing strategies aimed at supporting all learners. In Classroom Instruction that Works, Dean et al. (2012) describe nine teaching strategies that support high achievement for all students. These strategies are meant to promote a wider view of how “different influences work together to help all students realize their learning goals” (p. vii). This philosophy goes beyond the usual bandwagon strategies that claim to be the best and most effective and are often used in isolation. Dean et al.’s nine strategies provide a framework through which all students can reach their full potential, rather than one that simply brings students up to standard.

We were asked to describe which of the nine strategies are most emphasized in schools today, and I noticed a diversity of responses from my peers. My response stated that setting objectives and learning goals, assigning homework, and identifying similarities and differences are three of the nine strategies that I believe are emphasized in schools today. One classmate stated that no homework is assigned until middle school and believed that the most emphasized strategies were identifying similarities and differences, grouping, and positive rewards (Van Loon, 2013). Another classmate reported that she thought the emphasized strategies were cooperative learning and setting objectives and providing feedback (Washington, 2013). Yet another classmate wrote that she believes the most used strategies are summarizing and note taking and cooperative learning (Hamilton, 2013). While several of us agree that grouping and cooperative learning are widely used in schools, a wide variety of other strategies are also considered important. This makes sense to me: when we consider the different contexts in which we teach, such as grade level, community characteristics, and subject area, the use of and emphasis on a variety of strategies aligns with best practices. Through this course I aim to learn to apply appropriate instructional strategies based on the context of my teaching, which will involve making decisions prior to, during, and following instruction.

Resources:

Dean, C. B., Hubbell, E. R., Pitler, H., & Stone, B. (2012). Classroom instruction that works: Research-based strategies for increasing student achievement. Denver, CO: Mid-continent Research for Education and Learning.

Hamilton, K. (2013, July 1). Week 1 discussion. Message posted to https://bbweb-prod.spu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_74199_1%26url%3D

Mutal, S. (2013, June 27). Module 1 discussion. Message posted to https://bbweb-prod.spu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_74199_1%26url%3D

Van Loon, M. (2013, June 28). Strategies used in a Waldorf school. Message posted to https://bbweb-prod.spu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_74199_1%26url%3D

Washington, S. (2013, June 28). Strategies used in schools. Message posted to https://bbweb-prod.spu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_74199_1%26url%3D


Standard 5 Meta Reflection: Assessment December 1, 2012

Introduction: Meta Reflection and Assessment Philosophy

My perspective and philosophy of classroom assessment have transformed in many ways throughout my eleven years as an educator. Much of this transformation has occurred just within the past two and a half months as I process what I have learned in the Standards Based Assessment graduate course at Seattle Pacific University. Where previously I had considered assessment mainly as an endpoint to a unit or an academic year, I now understand that assessment is much more than that. Effective assessment practices are critical to the design of differentiated and ongoing instruction (formative assessment) as well as to communicating student achievement at a particular point in time (summative assessment). In other words, formative assessment is assessment for student learning, and summative assessment is assessment of student learning (Chappuis, Stiggins, Chappuis, & Arter, 2012).

Two of the most profound concepts I have learned in this course are connecting assessments directly to learning targets and utilizing the assessment methods that best support those targets. Understanding the type of target is imperative to ensuring that the assessment will accurately measure achievement of that target. The five types of targets are knowledge, reasoning, skill, product, and disposition targets (Chappuis et al., 2012). Knowledge targets, which relate to the factual information that supports each content area, and reasoning targets, which relate to the thought processes required to make sense of and apply new information, embody most of the learning targets in place at the elementary level. It is on these types of targets that most of my assessment portfolio is centered.

Chappuis et al. (2012) describe the best matches between assessment methods and the different types of learning targets. As I craft assessments for kindergarten writing learning targets, I am inclined to use both personal communication and performance assessment methods because these match up well with knowledge and reasoning targets. Personal communication works especially well with formative assessment in writing because, while more time consuming, it lends itself to differentiation among different learning styles and informs educators in real time so that instruction can be adjusted (Chappuis et al., 2012, p. 96). Performance assessment is described as a partial match for knowledge and reasoning targets because, if a student is not successful on this type of assessment, it can be difficult to determine the cause. It is for this reason that I often combine personal communication with performance assessment; personal communication allows me to better understand a young learner’s thought processes along the way.

Both Chappuis et al. (2012) and O’Connor (2009) stress that the communication of assessment data to all invested parties (students, parents, community) is a critical component of effective assessment. While I have always agreed that this is true, I have not always been aware of the extent to which students themselves play a role in this communication. O’Connor (2009, p. 231) illustrates this point clearly with a continuum that reflects the level of student involvement during scheduled formal conferences, ranging from the student not being present at all to the student leading the conference. Educators are encouraged to move toward such student-led conferences.

Chappuis et al. (2012, p. 32) describe effective practices that support student self-assessment and goal setting and that facilitate two-way communication among students, teachers, and parents. Students can identify their own strengths and weaknesses prior to teacher feedback, keep a response log that states what was learned and what questions they still have, select pieces for a portfolio, offer feedback to fellow students, and set goals for future learning. I have included several of these recommendations in my portfolio artifacts. These practices also help students answer the “Where am I now?” question that is part of Chappuis et al.’s Seven Strategies of Assessment for Learning (2012, p. 28).

The artifacts included in my Assessment Portfolio are designed to provide data that illustrates student progress towards specific writing learning goals in kindergarten. Several artifacts will demonstrate how this data is analyzed by the teacher, grade level team, and the students themselves, as well as reflect reporting systems that have been developed to track and report student achievement.

In my experience, technology has served as both a blessing and a source of frustration in my understanding of sound assessment practices. Currently, I use our district online grading system to record student grades for the report card. This system has provided consistency across the district with regard to summative reporting at the end of the semester. It has also allowed some data entry to be pre-populated at the district level, thus reducing teacher workload. However, at first use this system seemed inconsistent with what I have learned in this class about linking assessments and grades directly to learning targets. Content areas in this system are graded according to “strands” such as Content, Organization, and Spelling for writing, and Number and Operations, Algebra, and Geometry for math. It seemed difficult, if not impossible, to record and communicate at the standards level using this system. However, after reflecting on Chappuis et al.’s practice of deconstructing complex content standards (2012, pp. 60-66), I am better able to connect the strands listed in our online grade book with the content standards that give direction to our classroom instruction. The deconstructing process they recommend uncovers and unpacks the standards and reveals common targets and categories of targets that the online system and the standards share.

The following section includes a compilation of assessment artifacts that reflect my growing understanding of assessment practices. Excellence in the craft of assessing students for learning is an ongoing endeavor and will continue to be a high priority for me as I plan for and engage in professional development and teacher leadership opportunities.

Artifacts and Analysis of Multiple Assessment Approaches

For the purposes of this portfolio, I will focus on the assessment and record keeping of learning targets that inform the writing curriculum in kindergarten. For my first artifact I am presenting a Writing Goals Checklist that includes student names and writing learning targets. This checklist allows me to use a date-and-check-off procedure to track when students have mastered a target. Likewise, I will be alerted to students who have difficulty with one or more targets and be able to offer supportive instruction.

In September, students, parents, and families work together to set writing goals during our fall Goal Setting Conferences. After a couple of weeks of exploring the craft of conveying a message through drawing and writing, students are guided to select one or more goals from a goal sheet that supports the writing learning targets. Through this activity students are introduced to the task of self-evaluation and goal setting and become more familiar with writing expectations and learning targets presented in kid-friendly language.

To provide evidence of multiple assessment approaches, I am including the artifacts below, each with a description of the assessment context, the targets it addresses, an analysis of student achievement, a reference to how the piece is supported by research and current professional literature, possible next steps for instruction, and reflections on assessment revision (if applicable).

District Summative Writing Assessment (student sample)

Assessment Method: Performance Assessment

Assessment Context:

This District Summative Writing Assessment is a narrative writing activity.  Students are to write to the prompt “Write about Kindergarten. Tell about something fun or special you have done at school.” This assessment is given in the fall.

Resources: District CDSA and script, 45 minutes of instruction time

Assessment Rubric: Writing Summative Assessment Rubric

Targets Addressed:  # 1-8 on Writing Goals Checklist

Analysis of Student Achievement:

As my teammates and I discussed the scoring of this kindergarten writing sample, it first appeared that the student, while seeming to have a specific message to convey, did not have a clear idea of how to organize his writing. There was no consistency in his retell, and he did not seem to understand where on the paper the text of his story should go. (The shaded-out text on the lines and in his illustration is where he put his name.) After further discussion, we concluded that the student did place text in several speech bubbles, which indicates a certain understanding of text versus pictures in his writing. His organization score will therefore be a 2.

Literature Support:

Collaborating within my professional learning community on the scoring of our district writing assessments addresses Key to Quality Assessment #4: Effective Communication (Chappuis et al., 2012, p. 8).

Next Instructional Steps:

Student will receive instructional support in including text in a story and letter-sound correspondence.

Assessment Revision/Reflection:

In order to preserve the integrity and consistency of district assessments, this assessment itself will not be revised. I will, however, incorporate professional collaboration in the scoring process on a more regular basis.

Competence Portfolio (student sample)

Assessment Method:  Portfolio

Assessment Context:

This assessment portfolio would provide summative evidence of mastery of the identified targets. This assessment method accommodates learners of all levels in that it recognizes that students will reach learning targets at different times of the year. To promote student investment in his/her own progress, students will take part in identifying which writing pieces best represent evidence of the target.

Resources:

Classroom instruction of volume 1 of Lucy Calkins’ Units of Study (2003)

Student writing samples reflecting lessons from volume 1 of Lucy Calkins’ Units of Study (2003)

Targets Addressed: #9 and #15 on Writing Goals Checklist

Analysis of Student Achievement:

In order to consider a student’s writing as having a “developed topic”, our team decided that it must answer two of the following five “wh” questions: Who, What, Where, When, Why. Further, I decided that if I found at least 8 pieces reflecting competence in the two targets, then we could consider these goals achieved and check them off on the Writing Goals Checklist; it would be unlikely that 8 demonstrations of these writing behaviors would reflect chance rather than mastery. The writing samples are numbered in the bottom right-hand corner in order of date, earliest to most recent.

Student A consistently demonstrates the use of beginning and ending sounds, and his pieces answer “Who” and “What” questions. I have therefore checked off the two learning targets on the Writing Goals Checklist.

Literature Support:

Competence portfolios demonstrate evidence of acceptable or exemplary achievement of one or more learning targets.  In order to be certain that enough samples are collected to rule out “chance” rather than mastery, an adequate number of samples needs to be collected (Chappuis et al., 2012, p. 367).   I determined that a sample of at least 8 pieces showing mastery of the above two targets would demonstrate enough evidence to check these targets off on the Writing Goals Checklist.

Next Instructional Steps:

This student is an above standard writer and has achieved many of the writing goals for kindergarten. I will work with him on adding details to his picture as well as to his writing. He will be encouraged to consider the reader and to provide enough details to “make a mind movie”.

Assessment Revision/Reflection:

As I reflect on this assessment type, I am aware that these same ten pieces of writing could actually serve as a competence portfolio for several other learning goals. I imagine that such a portfolio could have a checklist in the front that designates which targets are covered in this collection.

Missing Puppy Posters (Links to student samples are found below in “Student Sample Scores” section)

Assessment Method:  Performance Assessment

Assessment Rubric: Puppy Poster Performance Assessment Rubric

Assessment Context/Description:

I told students that a very special member of our classroom, our stuffed dog “Pip”, is missing and I asked them what we could do to get help in finding him.  The students were guided to come up with the idea of making posters to put around the school to see if anyone has seen him. We searched online for samples of other posters for missing items and determined that our posters needed a picture, words that described what our problem was, a description of what the missing puppy looks like, and who is looking for him. Students were given several different paper choices, one that had space for words above and below the picture, one that had space for words above the picture only, and another that had space for words below the picture. They were then sent on their way to create their posters.

Resources: Classroom instruction on lessons 1-10 in Lucy Calkins’ Units of Study (2003)

Targets Addressed: #6 and #10 on Writing Goals Checklist

Analysis of Student Achievement:


Student A consistently demonstrates the use of beginning and ending sounds. I have therefore checked off learning targets #6 and #10.

Literature Support:

Chappuis, et al. (2012, pp. 211 – 218) describe three critical dimensions pertaining to a well-constructed performance assessment.  These dimensions are task content, task structure, and sampling. As I considered task content I defined the learning targets (listed above) that would be measured through this assessment. I also wanted to allow for authenticity. Students were motivated to write and create these posters in order to locate our beloved Pip. Their writing had a specific purpose. Additionally, I incorporated student choice as they decided which layout they wanted to use for their poster.

This performance assessment addressed task structure (Chappuis, et al., 2012, pp. 213-216) as well. Students were aware of what they needed to accomplish – that is to provide other members of the school enough information to enable them to help find Pip the Dog. The timeline for completion was reasonable and accounted for the wide range of abilities in the class. Students who finished early knew to go to a quiet reading activity. Additional time was given later in the day in order to accommodate students who needed more time.

This writing assessment was formative and is intended to give information that will guide further instruction. Adequate sampling (Chappuis, et al., 2012, pp. 216 – 218) is accounted for as it allows students to demonstrate evidence that covers the breadth of the two learning targets. This assessment lends itself well to display sound-letter correspondence as well as adherence to a specific purpose.

Student Sample Scores and Next Instructional Steps:

Student #1

  • Scored a 3 for Writes for a specific purpose because she included 3 of the poster components: a picture, a statement of the problem (“Pip is lost”), and who is looking for him (Miss Gale’s class).
  • Scored a 4 for Demonstrates sound-letter correspondence – Beginning, ending and middle sounds are represented and it can be easily read by the teacher.

Next instructional steps: Explore other purposes for writing such as “how to”.

Student #2

  • Scored a 2 for Writes for a specific purpose because two of the poster components are included: a picture and a statement of the problem (“Pip is lost”).
  • Scored a 2 for Demonstrates sound-letter correspondence because much support was needed to get “Pip is lost.” on his paper.  Also, he happily admitted that he copied his neighbor for the bottom (illegible) portion of the text and didn’t know what it said.

Next instructional steps – Add details to picture/words, understand and use beginning and ending sounds.

Student #3

  • Scored a 4 for Writes for a specific purpose because he included all four poster components and added one of his own, the date the dog went missing.
  • Scored a 4 for Demonstrates sound-letter correspondence.

Next instructional steps: Compare and contrast various purposes for writing.

Assessment Revision/Reflection:

I was pleased with the information that this assessment offered as it allows me to plan instruction accordingly. To improve on this assessment for next time I will clearly tell the students the criteria, in student friendly language, which I will be using to assign a score to their writing. I plan to create a related activity that incorporates student self-assessment.

Writing Conference Notes (teacher designed)

Assessment Method:  Personal Communication

Assessment Context/Description:

I conduct this formative assessment during writing conferences as part of Writing Workshop. On one side of the sheet I record writing behaviors that are already evident. In the right column I make notes regarding what I have taught the student during the conference. In order to track and keep records of who is to receive a conference and when, I schedule conferences on a Writing Conference Schedule.

Targets Addressed: #1–20 on Writing Goals Checklist

Resources:

Instruction and practice with independent writing, Carl Anderson’s (2005) Conference Notes template

Analysis of Student Achievement/ Next Instructional Steps:

On the dates specified on the conference notes I saw evidence that this student likes to be organized (disposition target), uses spaces consistently, and capitalizes the pronoun “I” (learning target #19). I will support this student in using lower case and capitalization appropriately, the planning process of writing (to include the purpose of a quick sketch), the use of blends and digraphs, and word deconstruction (phonemic awareness).

Literature Support:

This form of personal communication allows me to gain better insight into the student’s strengths and misconceptions. It also allows the student to feel “heard” and to take more ownership of the direction of his/her learning, and I am better able to tailor lessons to the student’s particular needs (Chappuis et al., 2012, pp. 279-280).

Assessment Revision/Reflection:

I would like to link my writing conferences more specifically to learning targets. I will add a section to the conference sheet that indicates which learning target is being addressed. I also think that this would be a good opportunity to incorporate student self-evaluation. I will add smiley, straight, and frown faces for students to indicate where the student believes he/she is in regards to the target.

Resources

Anderson, C. (2005). Assessing writers. Portsmouth, NH: Heinemann.

Calkins, L. (2003). Units of study (Vols. 1-7). Portsmouth, NH: Heinemann.

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment: Every student a learner. In Classroom assessment for student learning: Doing it right, using it well (pp. 1-18). Boston, MA: Pearson.

O’Connor, K. (2009). How to grade for learning. Thousand Oaks, CA: Corwin.


Tracking Student Learning in Reading November 20, 2012

Filed under: Standard 5. Assessment — lkgale @ 4:40 am

There were several gems in module 8 that will be very useful as I revamp my record keeping and how I track student learning. One point Chappuis, Stiggins, Chappuis and Arter (2012, pp. 302-303) made clear was how beneficial it is to students when they have the opportunity to track their own learning. Being involved in this way keeps students in touch with their own progress and provides motivation to learn.

I believe that I can readily apply this strategy to our reading program. Students are placed in reading groups that correlate with a particular book level, represented by a color. Each day, students bring home “just right” books from the classroom in a folder that includes a reading log, and they are responsible for reading (or being read to) for 15 minutes each night. Students document the title of the book they have read, record three interesting words they found, and have parents sign the log. It seems to me that this folder would be the ideal place for students to keep and maintain a chart that documents their progression through the reading levels. I also use a Reading Log Tracking Sheet to keep track of student reading levels as well as how regularly they read at home and return their daily reading folders.

One of the benefits I have found with the Reading Log Tracking Sheet is that it allows me to see current student reading levels as well as patterns of responsibility with reading homework, all in one location. I am aware that Chappuis et al. state that work habits should be tracked separately from academic achievement so that grades are not raised or lowered for reasons unrelated to learning (2012, p. 314). I am comfortable recording the reading folders in this way because the regularity with which students read and bring back their logs will not affect their reading grades; it will simply serve as a discussion point.

Chappuis et al. also recommend that educators plan for assessment events to be used formatively during learning. Such events can include practice work, evidence collected for grouping purposes, work turned in for feedback, and evidence students use for self-assessment (2012, pp. 299-300). While conducting reading group instruction I gather formative data and record it on a Guided Reading Recording Sheet that is specifically designed for a particular reading level. It includes a list of reading learning targets that can be checked off as they are observed on a regular basis. Once a student has each target checked off, he or she is ready to take a summative reading test that confirms (or does not confirm) readiness to move on, and the reading groups are changed accordingly.

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment: Every student a learner. In Classroom assessment for student learning: Doing it right, using it well (pp. 1-18). Boston, MA: Pearson.


Competence Portfolios in Kindergarten November 13, 2012

Filed under: Standard 5. Assessment — lkgale @ 12:57 am

Our module this week included in-depth information on the use of portfolios as a component of assessment. I have to say that the concept of portfolios has always intimidated me. I have collected students’ work over time in order to show growth, but I have always known that I was not using these collections to their full potential. It was a revelation to learn that there are five different types of portfolios, each with its own set of uses. Chappuis et al. (2012, pp. 366-368) describe these types as growth portfolios, project portfolios, achievement portfolios, competence portfolios, and celebration portfolios. Like many of my colleagues in our discussions this week, I have come to realize that I already incorporate many of the suggested portfolio practices, and with a little tweaking here and there I can implement one or more effective and comprehensive portfolio assessments.

Chappuis et al. (2012) describe celebration portfolios as collections of work that students choose to reflect what they are most proud of. I am reminded of a practice I currently employ in writing that could easily transition into celebration portfolios as Chappuis et al. describe them. I engage students in daily writing workshops according to Lucy Calkins’ Units of Study (2003). As part of this curriculum, students select the piece of writing they like most at the end of each unit, and they revise, edit, and “fancy it up”. As a celebration, we invite other classes, teachers, or parents to come and hear these prized stories. If I save these chosen stories throughout the year, students will have compiled the pieces for a celebration portfolio that is complete by the end of spring.

For my artifacts this week I wanted to showcase a writing competence portfolio for “Student A”. Competence portfolios demonstrate evidence of achievement of one or more learning targets. The most challenging part of putting one together is determining how many writing samples are needed to show competence in a particular target. According to Chappuis et al. (2012), this is also one of the most crucial considerations: we want to be certain enough samples are collected that the evidence of competence is not the result of chance. I decided to focus on two writing targets:

  • Selects topic easily and develops it throughout writing.
  • Uses beginning and ending sounds.

I selected ten writing pieces from Student A’s writing folder, completed at various points over the past two months. In order to consider a student’s writing as having a “developed topic”, our team decided that it must answer two of the following five “wh” questions: Who, What, Where, When, Why. Further, I decided that if I found at least 8 pieces reflecting competence in the above two targets, then we could consider these goals achieved and check them off on the Writing Goal Checklist; it would be unlikely that 8 demonstrations of these writing behaviors would reflect chance rather than mastery. The writing samples are numbered in the bottom right-hand corner in order of date, earliest to most recent.

As you can see, each of the writing samples in this portfolio demonstrates the use of beginning and ending sounds. Also, each one reflects answers to “Who” and “What” questions. I have therefore checked off the two learning targets on the Writing Goal Checklist (link provided above). You will notice that several other goals are checked off for this student, however this portfolio reflects only the two goals referenced above.

As I reflect on this assessment type, I am aware that these same ten pieces of writing could actually serve as a competence portfolio for several other learning goals. I imagine that such a portfolio could have a checklist in the front that designates which targets are covered in this collection.

Calkins, L. (2003). Units of study (Vols. 1-7). Portsmouth, NH: Heinemann.

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment for student learning: Doing it right, using it well. Upper Saddle River, NJ: Pearson Education Inc.


The Kindergarten Missing Puppy Caper! November 6, 2012

Filed under: Standard 5. Assessment — lkgale @ 7:21 am

My artifact for this week relates to the development and delivery of a performance assessment that supports two of our kindergarten writing targets:

  • Demonstrates sound-letter correspondence
  • Writes for a specific purpose

I told students that a very special member of our classroom, our stuffed dog “Pip”, is missing and I asked them what we could do to get help in finding him. The students were guided to come up with the idea of making posters to put around the school to see if anyone has seen him. We searched online for samples of other posters for missing items and determined that our posters needed a picture, words that described what our problem was, a description of what he looks like, and who is looking for him. Students were given several different paper choices: one that had space for words above and below the picture, one that had space for words above the picture only, and another that had space for words below the picture. They were then sent on their way to create their posters.

Chappuis et al. (2012, pp. 211–218) describe three critical dimensions of a well-constructed performance assessment: task content, task structure, and sampling. As I considered task content, I defined the learning targets (listed above) that would be measured through this assessment. I also wanted to allow for authenticity: students were motivated to write and create these posters in order to locate our beloved Pip, so their writing had a specific purpose. Additionally, I incorporated student choice, as students decided which layout they wanted to use for their poster.

This performance assessment addressed task structure (Chappuis et al., 2012, pp. 213–216) as well. Students were aware of what they needed to accomplish: provide other members of the school with enough information to enable them to help find Pip the Dog. The timeline for completion was reasonable and accounted for the wide range of abilities in the class. Students who finished early knew to go to a quiet reading activity, and additional time was given later in the day to accommodate students who needed more time.

This writing assessment was formative and is intended to yield information that will guide further instruction. Adequate sampling (Chappuis et al., 2012, pp. 216–218) is accounted for because the task allows students to demonstrate evidence covering the breadth of the two learning targets: it is well suited to revealing sound-letter correspondence as well as writing for a specific purpose.

A Performance Assessment Rubric was developed to evaluate whether students met the learning targets. Each of the targets listed above is scored individually according to the rubric descriptors, with a score of 1 through 4 assigned and 4 indicating the highest level of mastery.

Below are samples of student work as well as the score assigned.

Student #1

  • Scored a 3 for Writes for a specific purpose because she included three of the poster components: a picture, a statement of the problem (“Pip is lost”), and who is looking for him (Miss Gale’s class).
  • Scored a 4 for Demonstrates sound-letter correspondence because beginning, ending, and middle sounds are represented and the writing can be easily read by the teacher.

Student #2

  • Scored a 2 for Writes for a specific purpose because only two of the poster components are included: a picture and a statement of the problem (“Pip is lost”).
  • Scored a 2 for Demonstrates sound-letter correspondence because much support was needed to get “Pip is lost.” on his paper. Also, he happily admitted that he copied the bottom (illegible) portion of the text and didn’t know what it said.

Student #3

  • Scored a 4 for Writes for a specific purpose because he included all four poster components and added one of his own: the date the dog went missing.
  • Scored a 4 for Demonstrates sound-letter correspondence.

I was pleased with the information this assessment offered, as it allows me to plan instruction accordingly. To improve on this assessment next time, I will clearly explain to students, in student-friendly language, the criteria I will use to score their writing. Ideally, I imagine this could even be incorporated into a student self-assessment activity… Food for thought.

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment for student learning: Doing it right, using it well. Upper Saddle River, NJ: Pearson Education Inc.


Test Blueprints and Propositions – Tools for Planning Assessments October 23, 2012

Module 4 focused on creating test blueprints and propositions for selected response tests. These two test-preparation activities were particularly unclear to me when they were first presented in the readings. Such structured mapping of test items was foreign to me, as I have tended to rely on the integrity of the assessments provided in the adopted curriculum or those passed on to teachers by our district. Having recently analyzed several of our district summative assessments for kindergarten, I found that they often do not reflect the relative importance of each learning target (Chappuis et al., 2012, p. 126), nor do they take advantage of the efficiency of selected response test items.

I reviewed the kindergarten summative assessment our district uses to evaluate the civics strand of our Social Studies Power Standards. All of the learning targets were knowledge targets. While test integrity prevents me from attaching the assessment to this post, I noticed several inadequacies. First, much of it uses the personal communication method of assessment. While I understand that personal communication is a strong match for knowledge targets (Chappuis et al., 2012, p. 94), it is not always feasible to rely heavily on this method in kindergarten. Early primary students are not yet at a developmental stage that easily supports written communication, so personal communication usually has to occur in one-on-one conferences and takes a significant amount of class time. Second, there were no test items that used the selected response method. Chappuis et al. (2012, p. 94) state that selected response test items are a good match for knowledge targets. This method is also more time efficient and allows for wide-ranging content coverage, thereby obtaining an adequate sampling of student achievement. Selected response testing is also well suited to administration via ActivVotes, a handheld polling device teachers can use to capture student responses as they are entered; data are recorded and processed instantly. Lastly, the district assessment did not reflect the greater importance of some targets over others; it did not separate questions according to targets at all.

Having determined that the selected response method of testing would fit nicely with our Social Studies civics assessment, I decided to create a Test Blueprint and List of Propositions for my Module 4 artifact. Using the Lake Washington School District Civics Standards, I wrote learning targets and then created propositions that supported each target. This planning allowed me to adequately account for each target and essential understanding, as well as give extra weight to those targets that needed it.

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment for student learning: Doing it right, using it well. Upper Saddle River, NJ: Pearson Education Inc.


Writing Target Checklist October 16, 2012

This week we spent time focusing on appropriate assessment methods for each of the target types discussed in Module 2 (Chappuis, 2012, pp. 93–96). Such methods include selected response, written response, performance assessment, and personal communication. Chappuis (2012, pp. 91–92) indicates that personal communication – such as questioning during instruction, observing student participation, or student conferencing – is a strong assessment method when measuring knowledge and reasoning targets. This information was of particular interest to me because, as I analyze our district writing targets, I see that they are primarily knowledge targets. Oftentimes in kindergarten, it is only through conferencing that a teacher can truly assess where a student is in relation to writing targets. If a student is not yet writing sentences, then progress toward a writing target such as “Connects story content to a prompt” can be difficult to determine. This module has made me aware that anecdotal note taking and journaling are powerful formative assessment tools.

Chappuis (2012, pp. 105–106) states that if assessment data are to be used formatively, then certain conditions must be present:

  1. The assessment is aligned with content standards.
  2. Assessment tasks match what has been or will be taught.
  3. The assessment brings forth specific misunderstandings or problems so that they can be addressed.
  4. Results are available in a timely manner so that teachers can act on them.
  5. Actions are, indeed, taken.

I recently had an opportunity to participate in a collaborative team effort to improve our formative writing assessment practices in kindergarten. My reading and discussions for this module were fresh in my mind, so I was able to apply this new information, particularly the formative assessment conditions mentioned in the previous paragraph, directly to the task. In our team meeting we discussed student-teacher writing conferences and decided we needed a system by which we could track each student’s progress toward winter and spring writing targets. We reviewed a writing goal checklist already in use by another teacher: Checklist #1. We discussed how this would be a simple and effective way to track our assessment observations, but we wanted the writing goals to be more closely aligned with our district writing standards and to better support our district summative writing assessment rubric (formative assessment conditions #1 and #2). We also wanted to track learning targets that reflect higher-level goals that will be in place later in the year. We came up with a Writing Goals Checklist that fits our needs. This checklist will track which students are mastering the learning targets as well as specify which targets are giving students trouble (formative assessment condition #3). Our writing conferences occur on a daily basis, so assessment results are available in a timely manner, allowing us to take action and offer support as each student needs it (formative assessment conditions #4 and #5).

Chappuis, J., Stiggins, R., Chappuis, S., & Arter, J. (2012). Classroom assessment for student learning: Doing it right, using it well. Upper Saddle River, NJ: Pearson Education Inc.