Welcome to today's live webinar for the release of The Nation's Report Card: Writing 2011, Grades 8 and 12.
I am Mary Crovo, deputy executive director of the National Assessment Governing Board
and moderator for today's event. The Governing Board is an independent,
bipartisan board that sets policy for the National Assessment of Educational Progress, also called NAEP
or The Nation's Report Card.
Today you will hear results from the first-ever national computer-based NAEP writing assessment.
This special release represents much more than writing scores.
It represents a new era for NAEP that takes into account the increasing role of computer technology.
In fact, this state-of-the-art assessment used computers
and word-processing tools comparable to those used in our nation's colleges and work places.
So what does this mean for all of you listening today?
It means the context has changed dramatically for the way in which we now test writing.
It also means we have new insights and information on how students write. Not only will you find out how 8th-
and 12th-grade students performed when they were asked to persuade, explain
or convey experiences to a specified audience such as a college admission committee;
you will learn about students' use of common word-processing tools.
Today we have a distinguished panel of experts who will share their thoughts
and reactions to the 2011 Writing Report Card. I will briefly introduce each of our speakers,
then our webinar producer will review the meeting logistics. Our first speaker will be Jack Buckley,
commissioner of the National Center for Education Statistics. He will present the NAEP Writing 2011, Grade 8
and 12 results. Our next three speakers have been involved with NAEP, and NAEP Writing specifically, for many years.
Susan Pimentel is an educational consultant and vice chair-elect of the National Assessment Governing Board.
Sue has served on the Governing Board since 2007.
Today she will take an in-depth look at some issues behind the results, including the gender gap.
Sue will also discuss the parallels of this writing assessment
to the assessments being planned for the Common Core State Standards,
which she helped to develop as leader of the Common Core's English language arts and literacy team.
Next we have Arthur Applebee, distinguished professor and director of the Center on English Language Learning
and Achievement at the University at Albany-State University of New York.
Arthur served as co-chair of the committee that developed the writing framework
and specifications for today's assessment. He will address the realities of computer use that changed NAEP.
Arthur will also explore student readiness for the assessment, data about how high-scoring
and low-scoring writers approach drafting and revision, the digital divide,
and integrating computer tasks into curriculum and instruction.
Finally we will hear from Beverly Chin,
director of the English teaching program in the department of English at the University of Montana, Missoula,
who also served as senior project consultant on the NAEP 2011 Writing Framework.
She will discuss the definition of writing that guided this assessment
and possible correlations between classroom computer use and scores.
Beverly will also address the benefits of using tools to facilitate writing,
the importance of providing computer access to all students, and how educators can apply today's results.
Following Beverly's remarks we will have a brief question-and-answer session with all attendees and speakers.
Before we begin, Jennifer, our webinar producer, will address logistics for using the WebEx system. Jennifer?
Thank you, Cornelia. Our speakers will address questions during a Q&A session later in the event,
but attendees are welcome to submit their questions throughout the entire presentation.
Simply type your question into the Q&A window at the lower right side of your WebEx screen and submit to all panelists.
Please be sure to include your name and organization with all questions. If you have technical issues,
please refer to your confirmation email or call 866.229.3239.
Please note that live closed-captioning is also available in the bottom right corner of your screen.
Please click the "x" on the blue bar at the top of the media viewer if you would like to close the captioning panel.
Back to you, Cornelia.
Thank you, Jennifer. This is Mary; I'm substituting for Cornelia today.
Now it is my pleasure to welcome our first speaker.
Dr. Jack Buckley is the commissioner of the National Center for Education Statistics.
On leave from his position as a professor of applied statistics at NYU,
he is well known for his research on school choice, particularly related to charter schools,
and on statistical methods for public policy. Jack served as deputy commissioner of NCES from 2006 to 2008.
He spent five years in the U.S. Navy as a surface warfare officer and nuclear reactor engineer,
and also worked in the intelligence community as an analytic methodologist. Jack, we look forward to your remarks.
Thanks very much, Mary. I'm very happy to be here today to share with you the results of the 2011 Writing Report Card,
our first computer-based writing assessment. The assessment was administered in early 2011 at grades 8 and 12 only.
The samples included about 24,000 8th graders and 28,000 12th graders.
The assessment was conducted at the national level only and includes public and private school students.
As always, we report results in two ways: as average scale scores on the zero to 300-point scale,
in this case with a separate scale for each grade,
and as percentages of students at each of the three achievement levels, Basic, Proficient, and Advanced.
These achievement levels were developed by the National Assessment Governing Board,
which sets the standards for what students should know and be able to do.
For each subject in each grade the Governing Board has established standards for Basic, Proficient,
and Advanced performance. Ultimately their policy goal is to have all students performing at
or above the Proficient level.
Also, because the 2011 assessment uses new technology that differs significantly from what we did in previous
assessments, we can't make comparisons to past results. As the first assessment in the new series,
we set the average score for both grade 8 and grade 12 at 150.
In this assessment, students were presented with tasks that reflect grade appropriate, real-world issues
and are designed to measure one of three communicative purposes: to persuade in order to change the reader's point of
view or affect the reader's action; to explain in order to expand the reader's understanding;
or to convey experience (real or imagined) in order to communicate individual and imagined experience to others.
This table shows the three purposes and gives the overall time allotted to each purpose by grade.
As you can see the percentage of students' time devoted to the three purposes was different for the two grades,
with the third purpose, conveying experience, receiving somewhat greater emphasis at grade 8.
Each writing task students received fell into one of these categories and in addition each task specified
or implied the particular audience that corresponded in some way to the task. To be effective,
writers must have an awareness of their intended readers' needs and level of knowledge about the writer's topic.
When we score student responses we have to take into account the fact that what they are providing us is essentially a
first draft.
Our scorers received special training to ensure consistency in scoring for all responses.
A certain percentage of these responses were scored twice, by different scorers,
to ensure that they were indeed producing consistent results.
Each student response was given one of six possible skill ratings, running from "Effective" down to "Little
or no skill."
Students were evaluated using holistic scoring rubrics whose criteria were based on three broad features of writing:
development of ideas, organization of ideas, and language facility and conventions.
There were many innovations in the 2011 Writing Assessment.
First, writing tasks were presented to students in multimedia formats, including both audio and video presentations,
and we will have examples later on today.
Second, students used word-processing software to compose and edit responses.
In addition, we collected extensive information on 24 separate student "actions" -- including keystrokes, backspacing,
deletions and the use of spell checking.
And also, because the 2011 Writing Assessment was computer-based,
we were able to take advantage of what is called "universal design,"
which allows us to build into the software program used by all students a variety of features to accommodate
special needs students, both those with disabilities and English language learners.
All students can take advantage of these features if they wish
and we will give some information on how they used them later on.
In addition to these accommodations that were incorporated into universal design, however,
some accommodations were still available only to special needs students.
To give you a quick idea of what students were working with,
this is a sample of a grade 8 writing task that featured an audio prompt,
and we will return to this particular task later.
It asked students to immerse themselves in an imaginative situation
and to write about it as if from personal experience.
Students listened to an audio recording of atmospheric sounds while reading a few sentences from an imaginary journal.
The first sentence of the text that students read was as follows: "When we first arrived on the island,
we saw mountains and fields with lots of colorful flowers in large, strange-looking trees."
When students read the passage, the audio provided the sound of waves lapping on the shore, the squawking of birds,
as well as the sound of footsteps in the sand to create a sense of the island world that the students were to imagine
exploring. Let's listen to a sample of what they heard.
Students then wrote their responses in the right half of the window, allowing them to refer to the prompt at all times.
Each student completed two prompts during the course of their participation in the assessment.
Let's turn to the results for grade 8, but before we do so let me remind you that NAEP results are based on samples,
which means there is a margin of error associated with each score or percentage.
We will only identify those differences in scores or percentages that meet our standard for statistical significance.
Since many of our results refer to the NAEP achievement levels, let me remind you what they are.
The Basic achievement level denotes partial mastery of prerequisite knowledge
and skills that are fundamental for proficient work at each grade.
NAEP's Proficient represents solid academic performance.
Students reaching this level have demonstrated competency over challenging subject matter.
And Advanced represents superior performance.
This horizontal bar chart shows achievement-level results for grade 8 students: the percentages who were below Basic,
at Basic, at Proficient, and at Advanced.
As you can see, 27% of students were at
or above the Proficient level when we combined the percentages for the top two categories,
which is shown by the portion of the bar to the right of the vertical line dividing Basic from Proficient.
Fifty-four percent of students were at Basic, while 20% were below Basic.
As always in NAEP, we also disaggregate by race and ethnicity among other factors.
This figure shows the scores in 2011 for the seven racial/ethnic groups for which NAEP collects separate data.
Scores for each group can be compared to the national average of 150 as well as to one another. Asian students,
with an average score of 165, scored higher in 8th grade than all other groups.
To the left you can see the percentage of students in each category. In 2011, 58 percent of 8th graders were white,
while 14 percent were black, and 20 percent were Hispanic.
Here we show differences in average score by gender.
The average score for female 8th graders in 2011 was 160,
19 points higher than the average score for male 8th graders when we calculate using unrounded numbers.
While we can't compare 2011 scores with those in prior assessments of writing by NAEP,
we can say that female students have had consistently higher scores than male students
at all grades on prior writing assessments.
NAEP uses student eligibility for the National School Lunch Program as a proxy for family income
or socioeconomic status.
Students whose families have an income less than 185 percent of the federal poverty level are eligible for free
or reduced-price lunches, while those whose families are above 185 percent are not eligible.
In 2011 at grade 8, students who were eligible scored lower than students who were not by 27 points.
Eligible students constituted 42 percent of all 8th graders in 2011.
In addition to assessing students' writing ability,
the 2011 assessment included a questionnaire filled out by teachers of the participating students,
which contained questions about classroom practices and other topics.
Teachers were asked how often they ask their students to use computers to draft and revise their writing.
As this chart shows, students whose teachers more frequently asked them to use the computer to draft
and revise their writing on average scored higher than those whose teachers did so less frequently.
So for example, students whose teachers said they never
or hardly ever asked their students to make such use of their computers had an average score of 141,
at least five points below any of the other categories. Of the 8th graders,
about 44 percent had teachers who said they asked their students to use computers to draft
and revise their writing either very often or almost always or always.
It may be that asking students to use computers for writing actually improves their writing skills,
but there are other factors that could affect student performance,
and as always NAEP is not designed to identify the causes of that performance.
For example, it could be that computer-based instruction is more prevalent in higher-income areas,
or that teachers with high-performing students are more likely to ask those students to use computers.
We can look into this a little bit by trying to control or disaggregate by an additional factor.
So this slide shows that regardless of income,
students' performance does tend to increase with the frequency with which the teachers ask them to use computers to
draft and revise their writing. In the lower left hand corner,
we see the average score of lower-income students whose teachers reported never
or hardly ever asking them to use computers for writing was 130. At the lower right,
we see the average score of lower-income students whose teachers reported always or almost always asking them was 141.
The shaded area represents a 95 percent confidence range for the estimated scores.
The line near the top of the graph shows a similar increase for students from higher-income families with the scores
running from 155 up to 167.
Here we see the grade 8 writing task that we showed you earlier.
Fifty-six percent of students provided responses whose ratings fell into the top three categories: effective,
competent, or adequate. Furthermore, 76 percent selected the spell check icon at least once,
13 percent used the cut icon at least once, and 71 percent used a text-to-speech function.
This is a universal design feature that allows them to hear a spoken version of the written text.
The grade 8 writing assessment allows students to listen to the writing prompt instead of reading it via this
text-to-speech tool. This bar graph shows the scores for students who did not use the tool at all,
compared to those who used it once, twice, or three or more times. As you can see,
increasing use of the text-to-speech tool is correlated with, on average, lower scores.
Let's look at the results for grade 12.
Here are the achievement levels for grade 12. Twenty-seven percent of students were at
or above Proficient (24 percent at Proficient and 3 percent at Advanced). Twenty-one percent were below Basic
and 52 percent were at Basic. Here are the scores in 2011 at the 12th-grade level for the seven racial/ethnic groups.
In the 12th grade case, White, Asian,
and multiracial students had average scores that were comparable with each other
and higher than the remaining groups except for native Hawaiian and other Pacific Islander students.
Looking at the gender gap at grade 12, female 12th graders outscored males:
girls scored 157 on average,
14 points higher than the average score for boys.
NAEP assessments also asked students to select the highest educational level completed by their parents,
using these five categories supplied by NAEP. In our analysis,
we group students by the highest educational level attained by either parent. As this bar graph shows, and as in
other assessments, higher educational attainment by a parent is associated with higher scores on NAEP.
Nine percent of students said neither parent graduated from high school; their average score was 129.
Forty-nine percent said at least one parent graduated from college, and their average score was 160.
We also asked grade 12 students how many pages they wrote in a typical week for homework in their English/language
arts class.
Those who said they wrote four to five pages had a higher average score than those who wrote fewer than four
pages. Thirty-nine percent said they wrote none or up to one page in a typical week.
The NAEP results can't tell us whether requiring students to write four or five pages would improve their scores,
however, because it could be that high-performing students are simply more likely to
write more pages on their own.
We also asked students how often during this school year they used a computer to make changes to a paper or report.
In 2011, 12th graders who reported more frequent use of the computer had higher average writing scores
than those who reported less frequent use.
Students who said they always
or almost always use a computer to edit their writing had a higher average score than students who did so less
frequently. Fifty-six percent of students were in this top category.
This slide shows a grade 12 writing task.
This task asked students to watch a short video about young people's use of technology.
They were asked to write an essay for a college admissions committee
about one kind of information or communications technology that they used.
They were asked to describe the technology
and explain why it was important to them. Now we will watch the video.
Well, close your eyes and imagine a video that provided data through charts and graphics
and then asked students, using those data, to describe their personal use of technology
in the context of a college admissions essay.
Twenty-six percent of students provided responses on this task that were rated either effective or competent.
Looking at the use of the technology or interface,
74 percent accessed spell check by using the right-click option on the computer mouse.
Thirty-one percent used the thesaurus and 52 percent used the text-to-speech function.
The grade 12 writing assessment also gave students access to a thesaurus.
This bar graph shows the average scores for students who did not access the tool, those who accessed it once,
and those who accessed it more than once. Sixty-nine percent of students never used the tool.
Those who did scored higher and those who used it at least twice had higher scores than those who used it only once.
In conclusion, the 2011 Writing Report Card provides all this information and more.
Additionally, the initial release website gives extensive information on the performance of students
and access to released assessment questions and student responses
at all six rating levels through NAEP's question center.
The NAEP Data Explorer, our online data analysis tool,
allows extensive further analysis of student performance as well.
As always, I would like to offer my sincere thanks to all the students, teachers,
and schools who made this possible through their participation in the 2011 Writing Assessment.
Thank you, Jack. Our next speaker, Susan Pimentel, is an education analyst and standards and curriculum specialist.
For nearly three decades, she has helped numerous states and districts -- including Arizona, California, Indiana,
Louisiana, Ohio, Massachusetts, and Washington, D.C. -- work together to advance education reform
and champion tools for standards setting, curriculum building, assessment alignment, and teacher development.
Sue also works with Partnership for Assessment of Readiness for College
and Careers, or PARCC, as well as the SMARTER Balanced Assessment Consortium
to ensure that assessment claims, emphases,
and item development reflect the Common Core State Standards.
As a member of the National Assessment Governing Board,
Sue serves on the Assessment Development and Nominations Committees
and will become the Board's vice chair on October 1st.
Since 2001, she has served as chief architect of the American Diploma Project,
which was designed to close the gap between high school demands and postsecondary expectations.
Sue, we look forward to your insights.
Thanks so much, Mary, and good morning to all.
As Mary stated, the place I'd like to start my comments is to draw the parallels
between the Common Core State Standards and the NAEP writing framework.
When we were writing the Common Core State Standards, we drew heavily from the NAEP reading
and writing frameworks.
The first thing you'll notice is that the Common Core stresses development of the same three mutually reinforcing
writing capacities: argumentative writing, or what NAEP calls persuasive writing; explanatory writing;
and writing to convey experiences, or what the Common Core calls narrative writing.
You'll also notice that the Common Core reflects the overwhelming emphasis that NAEP places on writing arguments
and informative and explanatory text at the high school level. We use the same percentages -- 40 percent for arguments,
40 percent for writing to inform and explain,
and 20 percent for writing to convey experiences. I asked for a special analysis of the data
to see whether students were doing better at writing to convey experiences, writing to explain,
or writing arguments.
And in fact, the special NAEP analysis shows that more students are able to write a story or convey an experience
competently than are able to write a sound explanatory or persuasive text.
I underscore this because NAEP's emphasis on argument writing
and writing to inform and explain matters: those really are college-
and career-ready skills because they require students to use evidence and logic.
I also want to underscore what an historic event this was in 2011.
NAEP really led the nation in assessing writing on computers.
And how important it is to have NAEP take the lead.
As I've worked with PARCC and Smarter Balanced, there have been some concerns about whether assessing students
on computers would be unfair or would mask deeper issues.
While we can't directly compare results from this computer-based assessment
with earlier paper-and-pencil results,
what is interesting is that the 2011 data show the same patterns
of strengths and weaknesses as the paper-and-pencil tests.
That is promising. In other words, no new or unexpected patterns came up
that would make us worried.
Obviously, the data show that students who are more familiar with computers and keyboarding
are doing better, and that is to be expected.
Next slide, please.
Just to recap really quickly, at both Grades 8 and 12,
we have a little more than half of the students scoring at the basic achievement level.
And only about a quarter of our nation's students are scoring where we need them to score, at that solid academic
performance at the Proficient level.
Gaps persist, as we've seen here, by race, ethnicity, poverty, and parent education.
Next slide, please.
What I'd like to do really though is focus on the gender gap because it is stark.
The gap between female and male students in writing is greater than in any other subject.
In math and science, males do better, but only slightly.
In reading, girls do better; but the gap is not as wide.
I want to draw your attention to grade 8 here, where 36% of girls are scoring at or above the Proficient level;
only 17% of boys are.
Next slide, please.
Again, NAEP can't say why, and we can't do cause and effect.
But there are some noteworthy background variables.
According to the questionnaires, and I found this really interesting, 53% of female students in Grade 12 agree
or strongly agree with the statement that "Writing is one of my favorite activities." For males, just 35% agree.
Now, it isn't clear whether writing well comes first or enjoying writing comes first.
It's sort of the chicken and the egg.
Do I enjoy writing because I write well or vice versa?
But it was interesting that the percentage was so stark between the two.
Girls also report writing more than boys, both for homework and on their own.
And like most everything else students do, like most everything else we do as human beings,
writing really does improve with practice.
And so I think this is an important background variable to watch as we move forward.
Next slide, please.
I just want to bring up two other noteworthy background variables.
As Jack said,
the NAEP data shows that students who write four to five pages a week for English language arts homework scored higher
than those who write fewer pages.
And one of the statistics that I found troubling was that 39% of 12th graders overall report writing just one
page of homework or less each week for English language arts.
In my mind, that's just not enough because we know that practice makes perfect -- or in this case,
perhaps practice makes proficient.
And when we look at only a quarter of our students reaching proficient,
and knowing that so much of the success in our economy is increasingly based on knowledge and information,
the ability to communicate that effectively in writing becomes all the more important.
So as we move forward here, I know as an educator and someone who works with schools and districts
and states across the nation, I'm going to be keeping in mind that boys are lagging behind girls substantially.
What's that about?
And really thinking too about the amount of writing, the regularity of writing,
that goes on in classrooms as we implement the Common Core.
And with that, Mary, I'll send it back to you.
Thank you, Sue.
Our third speaker, Dr. Arthur Applebee,
has worked in institutional settings with public school children with severe learning problems,
as a staff member of the National Council of Teachers of English, and in research and professional education.
His work focuses on how children and adults learn the many specialized forms of language required for success.
And his books and articles address issues in curriculum and instruction in reading, writing,
and the English language arts.
Since the 1970s,
he has also worked extensively with the National Assessment of Educational Progress as a report author
and as a consultant for framework and item development.
Thank you for joining us today, Arthur.
Thank you, Mary.
It's really exciting to be here today
and see so many years of work come to such a successful fruition in this assessment.
There are several points I'd like to highlight about where we are today.
First, as others have said, this assessment really is a milestone in large-scale writing assessment.
Over the past decades, two things have become increasingly clear.
First, that students were making use of computers
and other kinds of digital devices in greatly increased numbers over the years.
The common metaphor is that they have become digital natives, and the rest of us are still digital immigrants.
That increasing experience was combined with the fact that both higher education
and workplaces with better paying positions take computer use for granted.
Students that leave high school and move into higher education
or the workplace without the ability to write on a computer are severely disadvantaged.
These two things came together to make the format for this assessment both necessary and possible.
The second point is that the students were clearly ready.
At both Grades 8 and 12, they use the keyboard effectively.
They produced interesting writing in response to a wide range of prompts addressed to a wide range of audiences.
Whether or not they could touch type, they typed out their responses;
and many made good use of the various tools built into the word processors -- spellcheck and backspace,
for example -- to modify their writing.
As Sue said, people now worry that students won't be ready for the assessments for the Common Core.
But those worries were wrong. Third, this new computer-based assessment opens up a whole wealth of
new information for us, adding to our understanding of how better and poorer writers approach drafting and revision.
For example, at Grade 12, better writers made more use of the backspace key,
suggesting perhaps that they were self-monitoring
and making more immediate corrections than their lower-performing peers.
Similarly, higher performers were more likely to right click to access spellcheck,
again suggesting more self-monitoring.
What these behaviors mean for curriculum instruction for the teaching of writing is something that we'll have
to work out over the coming years.
But it gives us a wealth of new information to begin to build from.
The next point I want to look at is the question of a digital divide.
An assessment like this raises the question, "Does NAEP penalize students whose families can't afford a computer?"
The short answer seems to be, "No."
NAEP's own preliminary studies before they moved to this assessment showed that if you give an assessment with paper
and pencil or the computer, the patterns of differences between subgroups remain the same.
The computer format doesn't seem to penalize or advantage any one group in comparison with another.
Individual students, of course, might do better or worse.
And other research has shown that computer use in writing is particularly helpful for low-achieving students
and special education students who benefit from the word-processing platform.
As you might expect, students who report more experience using computers did better on the assessment.
The right-hand graph on the slide in front of you is one you saw before from the 12th Grade
results in this assessment.
That could be interpreted as saying, "Well, it's a very predictable result.
If you use the computer more often, you're going to do better on a computer-based assessment. "
But it's actually more complicated and interesting than that.
The graph on the left-hand side shows a roughly parallel set of results from the 2007 assessment, where the students
were writing with paper and pencil, and the same pattern holds.
Those who reported that in their everyday school experiences they were more likely to use a computer for writing
did better on the assessment in a very parallel way.
What seems to be happening is that higher achieving students tend to use computers more often,
and they do better on writing assessments whether they're handwritten or computer-based.
The last point I'd like to make is that this assessment sends a really important message to teachers and schools.
As computers have become widespread in the out-of-school lives of our students,
they've stayed somewhat limited in their use in most schools.
This assessment of how well students write when they have a Word Processing platform available makes clear that such
tasks need to be integrated into curriculum and instruction.
In contrast, the paper and pencil assessments that dominate most of the State assessment frameworks at the moment
send exactly the opposite message and lead many schools
and teachers to limit computer use because they're worried that their students will be penalized when
the high-stakes tests are handwritten.
As writing on a computer becomes more widespread, it opens up endless possibilities for enriching instruction,
ranging from the wealth of materials available to students on the Internet to the exploration of emerging genres in
digital media that students are already accessing on their own outside of school.
NAEP in the past has often been at the leading edge of assessment development,
providing models which states have adopted in their own testing.
With this assessment, NAEP has again put itself at the leading edge,
both in the redefinition of writing achievement as writing with word-processing tools,
in the options for accommodations such as the text-to-speech tool that we've talked about before,
and the enrichment of writing prompts with audio and video effects that are difficult or impossible on a paper
and pencil assessment.
Mary, back to you.
Thank you very much, Arthur.
Our final speaker, Dr. Beverly Chin, has more than 35 years of experience as an English language arts teacher,
adult education instructor, and teacher educator.
A past President of the National Council of Teachers of English
and former Board Member of the National Board for Professional Teaching Standards, she consults nationally
and internationally on English language arts standards, curriculum, instruction, and assessment,
and provided invaluable insight for our writing framework.
Beverly, we are happy you are here with us today.
Thank you so much, Mary.
It's a pleasure to be with you.
As a teacher educator and professional development consultant,
I am dedicated to improving literacy education for all students.
The Nation's Report Card: Writing 2011
provides important information about the writing proficiency of students in 8th and 12th Grades.
The report card results also offer educators the opportunity to discuss this research
and to consider ways to improve the writing performance of all students.
When I served as the Senior Project Consultant to the NAEP 2011 Writing Framework,
I was excited by the definition of writing that guided this assessment.
This is the definition: "Writing is a complex, multifaceted,
and purposeful act of communication that is accomplished in a variety of environments under various constraints
of time and with a variety of language resources and technological tools."
This definition presents writing as a flexible, generative, decision-making process
and reflects the current practices and perspectives of writing instructors from elementary through graduate school.
Rather than focusing on traditional genres or modes of writing,
NAEP looked at how well students accomplish writing tasks for communicative purposes to specific audiences.
With NAEP's administration of its first computer-based writing assessment,
we gained insight into the ways that 8th Grade and 12th Grade students use technology to compose their writing.
For example,
we learned that 8th Grade students whose teachers more frequently asked them to use the computer to draft
and revise their writing scored higher than those whose teachers did so less frequently.
Students whose teachers never asked them to draft or revise their writing on the computer scored the lowest.
We also learned that 12th Grade students who always
or almost always use a computer to edit their writing scored higher on average than students who reported doing so
very often, sometimes, hardly ever, or never.
These findings support the importance of integrating computers into writing instruction.
Teachers need to show students how to use Word Processing features effectively and efficiently to plan, draft, revise,
and edit their writing.
These findings also support the need for schools to ensure access and availability to technology for students.
If we want to provide high quality writing instruction for all students,
we need to ensure that students have these technology tools
and learn how to use them so that they can be more successful in school, the workplace, and our society.
The Nation's Report Card: Writing 2011 also offers educators resources for professional development.
Educators can use the NAEP writing rubrics to discuss how they create their own writing rubrics when they're assessing
students' work at various grade levels and in various subject areas.
Also by examining the NAEP tasks, educators can see how the writing purpose and audience are presented to students.
Through NAEP's computer-based assessment, writing tasks are now presented in multiple ways.
You saw earlier the sample of the 8th Grade writing task to convey experience, real or imagined.
And you also saw at the 12th Grade level the use of animation to enable students to receive information
and then the prompt asking them to explain to an Admissions Committee how technology is used in their own lives.
With the use of the computer, the writing tasks are presented now in multiple ways: through the printed text,
photographs, audio, video, and/or animation.
And with the tasks being delivered in this variety of modes, students may find the prompts more engaging
and may be inspired to write more interesting ideas and make more effective language choices.
Educators who participate in the development of writing assessments in their school, district, and/or state,
can use the NAEP Writing Assessment as a model to inform their own decision-making processes.
For example, in Montana we have collaborated with parents and teachers, teacher educators,
State education agencies, and colleges and universities to develop a holistic rubric
and writing prompts for the Montana University System Writing Assessment.
From 2001 to 2012,
this voluntary program provided workshops across the state that trained educators -- including K-12 classroom teachers,
pre-service teachers,
and college composition instructors -- to apply this rubric to 11th Grade students' persuasive writing.
Over time, much of the writing has become word processed and computer-based.
The quantitative and qualitative results of this project
attest to the value of collaborative professional development that directly links
writing assessment to writing instruction.
The Nation's Report Card: Writing 2011
and its related websites are essential resources for educators committed to the improvement
of students' writing proficiencies.
As we discuss this report,
we can reflect on the instructional strategies that effectively teach students to become competent, confident writers.
We can also make informed, research-based decisions about educational policies
and advocate for increased technological resources in our schools.
By engaging educators, parents, policymakers, and the public in conversations about literacy,
we share responsibility and support education for all students for today and for the future.
Thank you, Beverly, for those comments.
Now we will respond to attendee questions during a brief Question & Answer session.
Please submit your questions online.
And our facilitator for this segment, Valerie Marrapodi, will direct your questions to the appropriate speaker.
Valerie?
Thank you so much, Mary.
For those of you who have questions about today's report card result, please submit them now.
As Jennifer mentioned, we ask that you direct your questions to all panelists.
Please remember to include your name and organization when typing in your questions.
If we're not able to respond to your question during the event today, please know that we will respond via email.
Our first question is from Doug Wren.
He is with the Virginia Beach City Public Schools.
He poses the question, "With the advent of online writing assessments,
many school districts are realizing that students will perform better on these tests with improved keyboarding skills.
What are your thoughts on teaching keyboarding at the elementary school level?
When should keyboarding instruction begin, and what should be taught at each grade level?"
Susan, would you like to take this?
Thanks for the question.
We grappled with this when we were putting the Common Core together.
And actually we put keyboarding skills in as early as 3rd Grade.
Now, that's with the guidance and support from adults.
And by 4th Grade,
the expectation is that students would demonstrate sufficient command of keyboarding skills so that they would be able
to type a minimum of one page.
And then in Grade 5, that gets expanded to two pages.
And I just want to say a little aside here.
When we were developing the Common Core, we actually weren't sure where to put keyboarding skills
and didn't know -- Do we put it late in elementary school?
Do we put it in middle school?
And we heard very strongly from the early childhood community to really begin students early on the Internet
and early with keyboarding.
And you'll actually see some of that reflected in the writing standards as early as kindergarten and first
and second grades.
But keyboarding skills don't actually appear until Grade 3.
Excellent, thank you.
Our next question is from Christian Ogle.
He did not provide his affiliation, but he is curious -- "As a result of the emphasis on testing,
the curriculum has been narrowed to only those areas being tested, such as writing.
Do you feel this is a contributing factor to some of the problems we are seeing with student preparation?"
Arthur, would you like to start that out for us?
Sure,
I think the question is certainly right in its assumption that schools are focusing on the things that are being
tested.
What I like about national assessment throughout its history in assessing writing is that unlike many State
assessments, it actually has students write at length in response to prompts.
In many states, because of costs and because the technologies have been evolving slowly,
writing is assessed somewhat indirectly with multiple choice items of various sorts that are easily scored.
But what happens in that case is even though it's called an assessment of writing,
instruction inevitably narrows.
Teachers are worried about how their students will do, and they focus on the kinds of things that are on the tests.
So I think this assessment in particular, with its emphasis both on extended writing and on the uses of computers,
may help work against those narrowing tendencies.
And this is Sue.
Can I just add one point just to underscore what Arthur has said about how NAEP has really been a leader here?
And again, using NAEP's experience here as a model for the nation, the Common Core is about reading
and writing across the disciplines.
So it's not just English language arts.
And the hope is that in the assessment consortia -- and my sense of it is that they are proceeding in this direction --
that both PARCC and Smarter Balanced are going to ask students to write in a variety of ways,
even including research and writing there -- extended writing, as Arthur has suggested.
So I think this will be a very important turn for our country as classrooms around the nation that are part of the
Common Core and part of the consortia really include writing in science and in social studies
and also of course in English language arts classes and in technical subjects.
Thank you so much, Arthur and Sue.
Our next question is from Karen King.
She is curious, "Were you able to look at relationships between the use of the text to speech tool
and the ELL status of the students?" Jack, would you like to start us off?
Sure, so we are able to disaggregate the use of the text to speech tool by English language learner status.
And the relationship that I discussed earlier -- that students who used the tool more often scored lower on average --
is actually something we see across both groups.
So if we disaggregate the overall student population to English language learners and non-English language learners,
that trend is still the same.
I mean we can also talk further a little bit about just the absolute breakdown of students by whether
or not they use the tool, if that's more to your question.
So for example, 40% of English language learner students in Grade 8 used the text to speech tool three
or more times compared to 31% of non-English language learners.
Great, thank you so much, Jack.
Our next question is from Dan Shaw.
He is curious as to -- "Why was Grade 4 not assessed
or results presented when it was part of the 2011 NAEP writing framework?" Mary, could you go ahead
and address this question?
Yes, certainly, thank you, Valerie.
The questioner is correct; Grade 4 is part of the NAEP Writing Framework.
The Assessment Framework Committee, on which Arthur and Beverly served, grappled long and hard with the question,
as did the Board.
At the time,
we determined that we were not quite sure whether fourth graders could engage in a computer-based writing assessment.
There was not enough research evidence, nor was there solid State data.
It was only being done in a very few states.
The framework called for a special study of Grade 4 writing on the computer to determine whether they had sufficient
keyboarding skills, whether they could write sufficient amounts,
whether they were comfortable in a computer-based environment.
NCES did conduct that study in 2012 with a very large sample in the pilot study.
The results are being analyzed now, but things look very promising.
For those of us who observed the fourth graders in the assessment -- both in suburban, rural
and urban settings -- our anecdotal information looks like this.
Students knew exactly what to do with the keyboard, with the mouse, with the Word Processing tools.
And we're very excited about the findings.
We hope to work with NCES to release some lessons learned from this 4th Grade writing pilot in the very near future.
And the Board is looking forward also to assessing 4th Grade students
using the computer-based assessment in the very near future.
Thank you very much, Mary.
Our next question is from Emily Hanford.
She is wondering, "How did public students compare to private school students?
I assume not statistically significant because it wasn't noted, but curious."
Jack, any comments on that?
Sure, so first, actually it was a statistically significant difference.
Any difference that I did note is statistically significant;
but there are many other statistically significant differences that I didn't have time to note,
and this is one of them.
So at the 8th Grade, for example,
public school students on average had a score of 149 -- so right around the national average --
and private school students -- about 8% of students -- had an average score of 164.
If we further disaggregate down to Catholic private school students, which is about 4%, their average score is 167.
Great, thank you so much, Jack.
Our next question is from Veronica Brinson.
She is wondering, "How early should teachers start integrating Word Processing features in their writing instruction?"
Beverly, would you like to take a stab at this first?
I would be delighted.
When teachers have computers in the classroom,
teachers can actually model ways in which students can use the technology tools to do prewriting,
where they're planning their compositions.
Teachers can model for students and give them practice in how to not only draft their writing,
but also to use the Word Processing features to do revision.
Students can look at how often they use the same sentence beginning.
Students can look at sentence length.
They can do a Search and Replace function if they're looking for passive voice,
such as the use of "was" instead of other strong verbs perhaps.
And of course editing features can be used as students discover the use of capital letters
and periods at the end of a sentence or question marks at the end of a question.
So I think if teachers have the technology in their classrooms
and then can project what's happening when they're using different features of the Word Processing technology tools,
students can then learn those strategies.
They can actually apply them themselves and work collaboratively to improve not only their writing processes,
but to improve their final product.
I think it's very important that teachers model it and give students opportunities to practice, practice, practice,
so that the students actually internalize not only the writing processes,
but the uses of technology to enhance their own thinking/composing process.
Great, thank you so much, Beverly.
Our next question is from Rhoda Borombozin, and I hope I said that right. She is wondering,
"Any specific addresses for these resources -- the NAEP rubrics or the tasks?"
Jack, do you have a comment on where people can go to find more information?
Absolutely, so the general overall web address is nces.ed.gov.
If you go there and click on Data Tools, then you'll see the NAEP Questions Tool.
And then you can explore the released items from this assessment and interactive items and even test yourself.
Specifically, for the rubrics and tasks, if you go to nces.ed.gov/nationsreportcard/itmrlsx/search.aspx,
that will actually take you right there.
But you know what?
We can email that to you after this conference, and that would probably be easier than me trying to read the URL.
Great, it's good to know that all those tools are available.
We'll follow up with you directly, Rhoda.
So my next question is from Gail Flanagan.
She is wondering, "How do we address early writing engagement with boys?"
My panel, would anyone like to take that for me to start?
Well, this is Sue.
I'll start, and I'm sure that Beverly and Art and others may have other things to say.
I think there may be two answers I have to this.
One is, just knowing that boys seem to be lagging -- certainly when we get to Grade 8
and Grade 12 -- in terms of their writing, as a teacher I would want to really focus on that as a school.
I would really want to see if our data shows that as well, as we move forward.
I would also say, we've heard this in reading as well, that with more focus on informational text
and expository writing, that there may be new and expanded ways to engage boys in writing.
And I'd love to hear if Beverly and Arthur have anything to add on that.
I would love to contribute.
I believe that when students -- boys and girls -- have authentic purposes for writing,
there are real reasons to write, to communicate,
and they see audiences who care about what the message is going to be, that we will engage both our boys and our girls.
Students have to see that writing is functional.
Communication is functional; it's purposeful.
And when we help students discover their reasons for writing and the audiences for their writing,
and weave them into our days and our curriculum and our educational processes,
I believe we will help all students improve their writing and write more frequently and write more meaningfully.
Yes,
I'd like to just add onto that the other thing that helps a lot going forward is just the use of computers themselves
for writing.
We know from other research that lower achieving students --
and the majority of those tend to be boys -- do much better when they have access to a Word Processor.
And they actually are often engaged online in environments where they write and post messages that teachers
and schools can begin to exploit once computers are integrated more fully into the activities.
Great, thank you all, so much.
I know you all would want to weigh in on that question.
Our last question of the day, unless I get a few more coming in through the Q&A window here,
would be from Rosalyn King.
She is inquiring, "What are the implications of the NAEP writing results for English writing
and developmental education programs and instruction in higher education?" Jack?
I'll try and take a stab at it -- and, Mary, you may wish to speak to this.
The achievement levels when they were set for writing were not designed explicitly with something like
college readiness or preparation in mind.
But this is the sort of thing that's in the back of the minds of folks who are setting achievement levels
on a 12th Grade assessment.
If you recall the distribution of the achievement levels in 12th Grade,
21% of our nation's 12th graders were below NAEP's basic level -- which means writing in college
almost certainly is going to be a challenge for these students.
And 52% were at the basic level.
For those students, it probably depends on the college and the nature of the program whether they're ready
or whether they require remediation.
As I noted in my remarks, of course the policy goal here, as set by the Governing Board, is "proficient."
And I would be willing to hazard a guess that folks are pretty certain that if students are proficient
on NAEP's 12th Grade writing assessment -- which is not an easy task for a lot of students --
they would be ready for the writing demands of most colleges.
And so I think it probably depends a lot on the particular post-secondary institution.
But certainly the fact that so many students are below basic
or just at basic suggests that there are a lot of students who arrive in post-secondary education not necessarily
ready to write at the post-secondary level.
And certainly the statistics that we have on remedial course work
and on the kinds of college course placement tests that students take when they arrive backs that up.
I'll add to that comment that Jack just made.
Referring our audience to the Board's 2011 Writing Framework that is on the Governing Board's website --
and I commend that document to you -- you'll notice the unique composition of the panel members
who were part of that project.
We had folks from various industry sectors, from the military, and from all levels of higher education,
as well as high school teachers and English language arts specialists.
We really did look at the writing demands in higher education and the workplace,
and built those into the assessment design, into the scoring rubrics and, as Jack said, into the achievement levels.
Now, while we're not at this point making projections about whether students are prepared for college level writing,
I think that will be something the Board will be examining in the very near future.
And this framework is a solid one to use in order to do that.
I'd also open this question up perhaps to Beverly
and Arthur since you have considerable experience working in higher education.
This is Beverly.
Thank you so much, Mary.
The project that I referred to here in Montana -- the Montana University System Writing Assessment -- really was
addressing this connection, this concern, this bridge between our elementary, middle school
and high schools into college and university writing.
We want to make sure that we have really clear communication about our expectations
and our shared teaching strategies, as well as helping our students move through our educational system
and have smooth bridges so that we really assist our students to becoming the most effective writers they can be.
So it's very important for all of us to have those communications -- collaborative,
positive communications -- where we actually look at samples of students' writing,
look at the rubrics that we use to give students feedback, look at the instructional strategies
and processes we use to help our students at those varying points in their writing development to make sure that we
really are helping every student to become the most proficient, accomplished writer he or she can be.
So those professional conversations and looking at students' writing -- those are just very essential,
and NAEP plays a very large part in this process.
I would just echo that last point that too often in higher education,
there is simply an assumption that students are coming to higher education already as proficient writers.
But if you sit down with a group of college instructors and lead them through the NAEP results
and the experiences the students report as responses to the background variables,
it can shift expectations in a way that helps the higher education community deal more realistically with their
students and realize that in fact,
writing is a long developmental process that has to continue in higher education as well.
Great, thank you very much.
Well, that seems to be all the time we have left for questions today.
I'm going to go ahead and turn it back over to Mary (inaudible).
Thank you, Valerie, and thanks to all who submitted a question.
Please don't let the end of the webinar today be the end of what you know about the NAEP 2011 Writing Assessment.
I invite all of you to visit the websites of the Governing Board and The Nation's Report Card,
where you will find the speakers' comments, the writing framework,
and examples of the computer-based writing prompts to which students responded.
You'll also find, via the NAEP data explorer, additional information on the released writing prompts,
the background questions, and keystroke data collected via this groundbreaking assessment.
Remember, you can become a follower of the Governing Board and NAEP on Facebook and Twitter,
and stay informed about our latest news and reports.
Finally, I would like to thank Jack, Sue, Arthur, and Beverly for being with us today.
And we appreciate all of you for joining us for this release of the Nation's Report Card in Writing.
Thank you.