Hello and welcome to today's webinar exploring the findings
from the Nation's Report Card: 2015 Mathematics and Reading
at Grade 12. I'm Bill Bushaw, Executive Director of the
National Assessment Governing Board, and moderator of today's
event. The Governing Board is a nonpartisan organization created
by Congress to set policy for the National Assessment of
Educational Progress, NAEP, also known as the Nation's Report
Card. The results we'll discuss today detail 12th graders'
performance in math and reading nationwide. We'll also discuss
findings based on the Governing Board's groundbreaking research
about the percentage of 12th graders who are academically
prepared for college. We have with us today a panel of experts
who will share their thoughts on the findings and the
implications. And we will respond to questions from our
audience. So, next, let me introduce our expert panel.
First, Dr. Peggy Carr, Acting Commissioner of the National
Center for Education Statistics, NCES, will present the report
card results. Next is Dale Nowlin, Mathematics Chair at
Columbus North High School in Indiana, and a 12th grade
teacher representative on the National Assessment Governing
Board. We also have Margretta Browne, an English teacher at
Richard Montgomery High School in Rockville, Maryland, and a
member of the NAEP Standing Writing Committee and the
Standing Reading Committee. And finally, we have Julie Evans,
CEO of Project Tomorrow, a national education nonprofit
whose mission is to ensure that today's students are
well-prepared to become tomorrow's leaders and
innovators. So, welcome everyone, and thank you for
helping us today. Before we begin, Jennifer, our webinar
producer, will review logistics for the webinar platform.
Jennifer. Thank you, Bill. As you mentioned, our panelists
will respond to questions from the audience. To submit a
question, simply type it into the chat box in the lower left
corner of your screen and then be sure to click on "Send."
Please include your full name and the name of your
organization with your question. You can submit your questions at
any time during the webinar. If you experience technical
difficulties, please
send us a note using the chat box, or you can also email
nagb@commpartners.com and we will assist you as soon as
possible. Bill. Thank you, Jennifer. Now it's my pleasure
to welcome Peggy Carr to present the results of the Nation's
Report Card: 2015 Mathematics and Reading at Grade 12. Peggy.
Thank you, Bill. I'm here today to release results for the 2015
12th Grade Mathematics and Reading Assessment from the
National Assessment of Educational Progress, also known
as our Nation's Report Card. The assessments were administered
from January through March of 2015, and about 32,000 12th
graders participated in the two assessments. This includes over
13,000 12th graders who were assessed in mathematics, and
almost 19,000 who were assessed in reading. The national results
are for both private and public school students in 2015. They
are compared to the last assessment in 2013 and to the
first assessment year, which was 1992 for reading and 2005 for
mathematics. Student performance on NAEP is reported in two ways:
as average scale scores and as percentages of students at the
various NAEP achievement levels. Mathematics scores are reported
on a scale of zero to 300, while reading scores are reported on a
scale of zero to 500. Now we'll take a closer look at student
results, and we're going to start with mathematics. As a
reminder, when reporting scores and percentages, we only discuss
differences that are statistically significant. In
2015, the average scale score of all students who took the NAEP
Mathematics Assessment was 152. This was a decrease from 2013,
when the average scale score for all students was 153, as
indicated by the asterisk here. The 2015 score is not
significantly different from the average score of 150 in 2005.
Scores are also reported using percentiles to show trends and
results for students performing at the lower, middle, and higher
levels of the distribution. At the 10th, 25th, and 50th
percentiles, average scale scores in 2015 were lower than in 2013.
Scores in 2015 at the 75th and 90th percentiles showed no
significant change compared to 2013. Next, we will take a look
at how the percentages of students at various achievement
levels have changed over time. 12th grade students performing
at the proficient level should be able to recognize when
particular concepts, procedures and strategies are appropriate,
and to select, integrate, and apply them to solve problems. On
the far right we see the percentage of students
performing at or above proficient has not changed
significantly compared to 2013 or 2005. On the left, we see the
percentage of students performing below basic has
increased since 2013, from 35% to 38%. Here, again, we see a
larger proportion of students at the bottom of the distribution
than in 2013, while the top of the distribution has remained
relatively unchanged. Since 2013, the last assessment year,
scores for all racial/ethnic groups showed no significant
changes, as indicated here by the diamonds. For both male and
female students, scores decreased compared to 2013 as
indicated by the downward arrows. In the rightmost columns we
see that the demographic composition of the NAEP sample has not
changed significantly from 2013. Compared to 2005, scores
increased for all racial/ethnic groups except American
Indian/Alaska Native students, while the overall average remained flat.
Looking at the far right columns, you see that the
proportions of students who were white, Hispanic, or two or more
races have significantly changed. Neither male nor female
students showed a significant increase in performance compared
to 2005. To continue our examination of the results, we
now turn our attention to some additional student groups. At
the top of the slide we see students' reported parental
education. As a reminder, for the 12th grade sample we asked
students to report parental education as our SES variable,
while in the fourth and eighth grades we used eligibility for
the National School Lunch Program. In 2015, scores for
students who had a parent whose highest level of education was
"Did not finish high school" decreased by four points
compared to 2013. Students who had parents whose highest level
of education was "Some education after high school" decreased by
three points. Since 2013, scores for students without
disabilities decreased by two points.
Compared to 2013, scores for English language learners
increased by six points. They were the only student group
reported here that had a significant increase compared to
2013. Scores for students who are not English language
learners decreased by two points in the same period. The grade 12
Mathematics Framework contains four content areas: number
properties and operations; measurement and geometry; data
analysis, statistics, and probability; and algebra. In
2015, scores decreased in all four of these mathematics
content areas or sub-skills by two points compared to 2013. Now
we'll look at the results for reading. Looking at the average
reading scores over time, grade 12 students had an average score
of 287 in 2015, which was not significantly different from
their score in 2013 but it was lower than in 1992. By
percentile, we see the lower performing students at the 10th
and 25th percentiles scored lower in 2015 compared to both
2013 and 1992. Scores for students at the 90th percentile
were higher in 2015 compared to 2013 and 1992. Scores for
students at the 50th percentile in 2015 were not significantly
different compared to 2013, but they were lower than in 1992. We
often talk about gaps relative to student subgroups like gender
or race/ethnicity. Here we see a different kind of gap where
student performance at the top of the distribution is
increasing and performance for those at the lower end is
decreasing. This is a widening of the performance distribution
across all students. This slide shows percentage changes across
the NAEP achievement levels. 12th-grade students performing
at the proficient level should be able to locate and integrate
information using sophisticated analysis of the text. 37% of
students scored at or above proficient in 2015, which was
not significantly different from 2013. 6% scored at the advanced
level, an increase compared to 2013. 31% scored at the
proficient level, which was not significantly different from
2013, while 35% scored at the basic level, which was a
decrease. The percentage of students below basic, 28%,
increased compared to 2013. As I mentioned before, you can see a
widening gap, with more students at the top and the bottom of the
performance distribution than in 2013 or 1992. Compared to 1992,
the percentage of students at or above proficient was lower in
2015, while the percentage at the advanced level increased.
Percentages at the proficient and basic levels decreased,
while the percentage showing below basic increased compared
to 1992. Looking at score changes from various student
groups in 2015 compared to 2013, we see no significant
differences in the average scale scores. Comparing the
demographic composition of the NAEP sample in the upper rightmost
column, we see that the percentage of students in these
groups did not change significantly compared to 2013,
except for an increase in the percentage of students who are
two or more races, using unrounded numbers. Let's take a
look at score changes for various student groups in 2015
compared to 1992. There were decreases for black students,
male students and female students. The rightmost columns
show the demographic makeup of the 1992 sample. There was a
decrease in the percentage of white students while there were
increases in the percentages of students who are Hispanic,
Asian/Pacific Islander, American Indian/Alaska Native,
and two or more races over the same period. This slide looks at
score changes for additional student groups. Scores for
students across the range of parental education levels were
not significantly different in 2015 compared to 2013, but were
lower compared to 1992. We do not see any significant
differences in 2015 compared to 2013 based on disability status
or status as English language learner. Now I will summarize.
In 2015, the national average score in mathematics was lower than in 2013.
In reading, the average score was about the same in comparison
to 2013, but we did see that the performance distribution was
wider in 2015 compared to 2013 and to 1992. In both subjects,
comparing the 2015 results to 2013, the scores of students
performing at the 10th and 25th percentiles were lower. The
percentage of 12th-grade students performing at or above
proficient was not significantly different, and the percentage of
students scoring below basic was higher. In 2015, in mathematics,
male students scored higher, on average, than female
students. Looking at this gender gap by race/ethnicity, we see
that white and Hispanic males scored higher than their female
counterparts. In comparison, the average score for female
students in reading was higher overall than for male students.
Female students who were white, black, Hispanic, Asian, and two
or more races scored higher than their male counterparts. Since
2008, the National Assessment Governing Board has been
conducting research on the use of grade 12 NAEP results as an
indicator of students' academic preparedness for college. These
studies indicate that students scoring at or above 163
on the NAEP mathematics scale of 0 to 300 are likely to possess
the knowledge, skills, and abilities that would make
them academically prepared for college. For reading, students
scoring at or above 302 on the NAEP reading scale, which,
again, was zero to 500, are likely to possess these skills.
These last slides I will present are data for 2015. In 2015
mathematics, 37% of 12th-grade students scored at or above 163
on the NAEP scale of zero to 300. Again, this indicates
academic preparedness for college. 39% scored at or above
163 in 2013. In 2015 in reading, 37% of 12th-grade students
scored at or above 302 on the NAEP scale of 0 to 500, which,
again, indicates academic preparedness for college. 38%
scored at or above 302 in 2013. In conclusion, you can view the
report card in its entirety online, and you can explore more
data than I've presented here today. As always, I would like
to offer my sincere thanks to the students, teachers and
schools who participated in the 2015 NAEP 12th Grade Reading and
Mathematics Assessment. Thank you. Bill. Thanks, Peggy.
Excellent comprehensive report. We really appreciate it. So our
audience has submitted some questions about the report card,
and there's one here that I'd love to have your insight on,
Peggy. It's from Trisha Crain with the Alabama School
Connection. And Trisha asks, "Please explain how the sample
size is relevant and how sampling is a valid statistical
tool." Yes, thank you. As with most of the surveys that we
collect at the National Center for Education Statistics, this
is a sample survey, a comprehensive probability sample
which accurately represents the population that it wishes to make
inferences about. Sample size is very important in our ability to
have confidence in these samples and in making inferences about the
larger population. So the larger the sample, the greater our
confidence, the greater our reliability. And the smaller the
confidence interval around our estimate of what the true score
would be, whether we're talking about a score or you're talking
about a percentage of students from the sample, the estimation
would be the same, it would rely on the sample size. There are
other factors that are important, such as how
homogenous the sample is. We want the sample to be as
heterogeneous as possible, and to reflect the true nature of
the population for which you're going to make an inference. But
clearly, most important here for most of our samples at the
center, we want a sample size that is large enough to give us
reliability and validity and confidence in our estimates.
Hopefully that helps, Bill. Perfect. Thanks, Peggy.
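[Editor's note: the relationship Peggy describes, where a larger sample gives a smaller confidence interval around the estimate, can be sketched numerically. This is a hypothetical illustration assuming a simple random sample and an invented standard deviation of 30 points; NAEP's actual estimation uses a complex sample design with weights.]

```python
import math

def ci_half_width(std_dev, n, z=1.96):
    """Half-width of an approximate 95% confidence interval for a
    mean, assuming a simple random sample of size n."""
    return z * std_dev / math.sqrt(n)

# Hypothetical score standard deviation of 30 points: the interval
# narrows as the sample grows toward NAEP-scale sample sizes.
for n in (100, 1_000, 13_000):
    print(f"n={n:>6}: +/- {ci_half_width(30, n):.2f} points")
```

With 100 students the estimate is only good to about plus or minus 6 points; with 13,000 it is good to about half a point, which is why differences of one or two scale points can be statistically significant.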
Appreciate it. We'd like to turn the conversation over to other
members of our expert panel. First, let's hear the thoughts
on the results from Dale Nowlin. Dale. Thanks, Bill. First of
all, it's, of course, disappointing that math scores
are down. It's also disappointing that they're down
in all four areas of mathematics that students are tested on,
and specifically that the scores are significantly down when you
look at percentiles, the lower performing students, which is
actually widening the gap between high-performing students
and low-performing students. I feel like I need to give a plug
here. I know we're always bound by budgets, but it would be
really wonderful if we had state data to go with this because I
think too often when we see national results, it's easy for
us in different states to think surely that's because of the
other states and not because of us. But I think that it's
important that we reflect on what does this mean in terms of
our own policies in our own states and in our practices in
school districts and in schools. I want to kind of -- we don't
really know what caused the decline, and maybe it's a
one-year anomaly, but I do want to conjecture at some possible
causes and what that might mean for us in schools and in school
districts. One thing is that, over the years, graduation rates
have increased. I know they have increased for us locally, and
that's true across the nation. And any part of the lower scores
that are due to increased graduation rates are actually a
good thing. I'm sure that does not account for the entire drop
in scores, though. The second area of concern, I guess, for me
is to think about curriculum. There's been a lot of talk about
curriculum across the nation lately. A lot of that talk has
been around Common Core. So one of the concerns under curriculum
is how well does it align to what we test on NAEP. And I know
that NCES looked into that, to some extent, and what they found
is that it's not a perfect alignment, and that might be a
cause for part of the decline, but it certainly does not
account for the whole drop in scores. But there are a couple
other issues that I think we should be concerned about in
terms of curriculum. One is the fact that any time you change
curriculum, it takes some time for teachers and students to
adjust to the change in what's taught and how it's being
taught, and that might be something that researchers could
look at and measure. A lot of states have been changing
curriculum, either going to Common Core or changing their
own state standards. In our state, in Indiana, we were one
of the first states to pull out of Common Core, and what that
has meant in terms of curriculum, we're on a six-year
textbook adoption cycle. And over this last six years of our
current cycle we have actually changed state standards three
times. And anytime we're making that frequent and as many
changes in state standards as that, teachers are really trying
to aim at a moving target, which is a challenge in many respects.
And I know we're not the only state that has changed standards
multiple times over the past few years. And finally, in terms of
curriculum -- and these are all areas that I think would be
great for researchers to dig into the data and see what they
can see. But as states have changed curriculum, I think
there's probably a wide variation on how much
professional development has gone along with that. In the
most recent NAEP release -- NAEP and TUDA -- of the fourth grade
and eighth grade results, we saw some urban districts
specifically -- I'm thinking Miami-Dade, Chicago, and
Washington, D.C. -- that had targeted professional
development with their teachers, and those districts saw
significant gains. So, as many states and districts are
changing curriculum to match new standards, I think it's really
important for us to look at the importance of professional
development, to go along with those changes. Then the third
thing to consider, I know there weren't drastic changes in
demographics from 2013, but I never want to lose sight of the
fact that our demographics are changing. At my district, the
Hispanic population, over the past few years, has increased
dramatically. And I know that we're not alone in that. And
other minorities, too, have increased significantly. And
we're a suburban school in the Midwest. And our free and
reduced lunch population has changed dramatically over the
past few years, too. And those populations have specific needs
that we might need to make sure that we're meeting. That does
bring us to one of the few positives that came out of the
data. The English language learners, or ELL students, had a
significant increase from 2013 to 2015. With increasing numbers
of students that are English language learners in my school
-- and I know this is true across the nation -- there has
been a lot of attention, a lot of resources, a lot of effort
put towards how can we help those students. Really it's a
couple things specifically that we have done. Those students
have resources where they can go and get content help while
they're learning English. And also the fact that we have gone
to digital textbooks, our students can access their
textbook not just in English, but if they are more comfortable
in Spanish, they can access the Spanish digital textbook. And I
think the effort that people are putting into the needs of the
English language learners, hopefully it's paying off. The
final thing I want to highlight is something we have seen
before, and that's that what math courses students take makes
a difference in how they do on NAEP. Students who are enrolled
in calculus and precalculus score better than students who
never take those courses. The easy answer to that is let's put
all of our students in calculus and precalculus, but, obviously,
that's not what we want to do. What we need to do is look at
the students when they're in eighth grade and ninth grade and
tenth grade, and look at what can we do to take down the
barriers for those kids reaching the higher level math courses,
and what things can we put in place to help them with that.
Anything we can do to move students into higher math
courses in a successful way is going to be a win for the
schools and a win for the students. So that's what I have
to share. Thank you, Dale. A thoughtful presentation on the
math results in particular. Again, I have a question from
our audience that I'd like to pose to you, if you don't mind.
This is from Theresa DeRiso from North Providence Public Schools
in Rhode Island. And Theresa asks, "How do we address math,
and literacy also? How do we address the gaps at the high
school level?" I think that's one of the toughest challenges
for classroom teachers. I have, myself, taught, we call it an
algebra block class, and it's really for students who are
taking -- all of our students are required to take algebra by
ninth grade. Not all of our students are ready for algebra
in ninth grade. And so you run into many gaps. A good example
would be students who are learning algebra but they
haven't mastered fractions. What would be the worst thing to do
is to not teach them algebra and just go back and try to focus
just on the gaps. And when you do that, then those students --
that's what we did in the past, and we have students who never
got to algebra. So I think the challenge is to go back and try
to fill in the gaps in the context of the new material, and
do the best you can and fill in the gaps while you are moving
the students forward, which is a huge challenge. It sounds like
it's something that's not just a responsibility at the high
school level, but also middle and elementary level.
Absolutely. Thank you. Now, our next speaker is Margretta
Browne. Margretta, we'll turn to you. Okay. Thank you and good
afternoon. I'd like to start off by addressing that from 2013 to
2015 there have been no significant changes in scores on
the NAEP reading test, but there is still a gap between white and
black students of 30 points, and between white and Hispanic
students of 20 points. This data is particularly significant at my
school, which has a very diverse makeup and shows similar results
when we're measuring performance in the classroom by GPA among
groups. So our literacy goal is to decrease the gap between the
percentage of African-Americans, Hispanics, and white and Asian
students earning a C or higher in English, and we try to start
at grade nine. So the NAEP data shows the benefits of academic
discourse, which is becoming an important literacy goal at our
high school where I teach. Students talking about texts and
their interpretation of it does make a difference academically.
We want students to be able to move through various ways of
thinking about the text from a simple locate and recall to
integrate and interpret, and, finally, critique and evaluate.
So we try to make students cognizant of how they talk about
and how they work with the text. So, for example, in my own
classroom, the previous night's homework may have included a
range of questions from all three areas, locate and recall
through to critique and evaluate. And so they may come
into class the next day and they're given a chance to pair
or share their findings, talk with their neighbors, to just
sort of get them warmed up in thinking before we engage in a
larger class conversation. And I try to include different methods
of random calling to make sure that each student does actually
get an opportunity to talk about their reading the night before.
So, one day it might be boy, girl, boy, girl. The next day I
might have a bowl with names in it, and pull the students' names
out randomly. Or a classroom might look like they come in,
they immediately move into small groups, and then those groups
are given an opportunity to talk about the readings from the
night before and their questions, and then the groups
report out, or perhaps even annotate a small part of the
text and think more deeply about it. All of these different ways
of addressing academic discourse give them a way to talk about
texts. And eventually the expectation is that they will be
able to work with those questions that require higher
levels of critical thinking. And it's also useful to me as an
instructor, because it gives me another way to really assess
where student strengths and weaknesses are, which is very
important in the classroom. So this investment that they take
in their own thinking promotes engagement of the text, and we
want them to read. And sometimes just doing that can be a
challenge, just getting them interested in reading. The NAEP
data shows that the more we get them to read in school and for
homework, the higher their scores are. So it's very
important that we use engaging yet challenging texts, from both
literary and informational genres, and that these texts
be diverse and reflect the global world we live in. So
an example of something I might use in my own classroom might be
a classic text, like Mary Shelley's "Frankenstein," or a
more modern text like Tim O'Brien's "The Things They
Carried," maybe something midcentury like Chinua Achebe's
"Things Fall Apart," or perhaps even a graphic novel such as
Marjane Satrapi's "Persepolis." I often like to work in my
informational pieces with the anchor text; for example, while
reading "The Things They Carried" I may have students
read a newspaper article that discusses PTSD
and its effects on soldiers today. So if students are
invested, they are going to be curious and they are going to
read more. As seen on the NAEP reading test, students who read
more in school and for homework do tend to have higher scores.
So, really, it's up to us as the instructor to make the classes
as impactful as possible for our students. Educators, I think,
can really use the NAEP data to make the most of their time in
class. We want them to think more deeply about text, which
requires that exposure to rich and engaging text. We want them
to reach the point of critique and evaluate, which requires
that practice, practice, practice in class with the text.
And then, you know, finally, we want them to know how to think
critically about any text they might encounter. The NAEP test
has been able to validate a lot of what we, as teachers, see in the
classrooms every day. And that's pretty much it. Thanks,
Margretta. Thank you. Excellent practical, pragmatic
suggestions. Again, we have a question from the audience. You
may have actually answered this partially, but let me present it
to you. It's from Cindy Benge, she is from the Aldine
Independent School District in Texas. And Cindy asks, "What
specific trends in reading do you see with 12th grade that can
be addressed with changes in instruction in high school
language arts?" Yes. Well, actually, I can touch on
something that I did not cover in the first part of my presentation,
and that is that, you know, we do have fast-moving developments
in technology, and so I do think that it is important for
districts to invest in good professional development. We
need to be up to speed so that we can best integrate the latest
in technology in our classrooms. So these advances in technology,
they really are changing the landscape of reading and how we
process information. So, Dale mentioned it in his
presentation, and I think investing in the teachers is
investing in and making sure that the students are getting
the instruction they need to be ready for college and for work.
Well, again, thank you. And we have a very large audience out
listening in today. I think it's safe to assume that several
members of our audience like to tweet. And so we just want to
remind you, if you are so inclined, it's #NAEP. And now
our final panelist is Julie Evans. Julie, would you share
your perspective? Thanks so much, Bill. I am thrilled to
have this opportunity to answer and reflect on the results and
give maybe a slightly different perspective here. We can advance
the slide. Thank you. I'm very excited about the particular
aspect of using NAEP as an indicator for academic
preparedness. So what I'd like to do is bring into the
discussion some research findings that my organization,
Project Tomorrow, has about students' attitudes about their
learning experiences and about their academic preparation. So,
as we heard from Peggy just a few moments ago, based on this
new research, a little more than a third of students are
academically prepared for
college. But we also know, from a practical standpoint, that
preparation is not just about knowledge itself, it's also
about the application of that knowledge. And so it's
interesting to take a look at students' perceptions of
learning, the value they place on learning, what their goals
are, what they're thinking about for their future as playing a
part in developing that academic preparation. So we asked 107,000
high school students this past fall to talk to us in a survey
that Project Tomorrow conducts about their thoughts about
school and about learning and about the experience that they
were having as part of their academic preparation. And so
it's concerning to us when we look at these particular
statistics that less than 50% of our high school students value
what they are learning in school as having a connection to their
future, and that included both subjects that they're learning
as well as skills. So that should be a wake-up moment for
us as well in relationship to the scores today. But there
actually is another piece of this puzzle, and I think this is
the more positive part of the story. So, in contrast to the
way students are valuing what they're learning in school,
82% of high school students told us that doing well in school is
important to them. I think that's a really strong statement
not only about the students' attitudes about learning, but
the fact that they do want to be proficient, they do want to be
prepared, they do want to do a good job, and they like
learning. So we can see here that over two-thirds of students
said that learning how to do things and learning about new
ideas was something that was important to them. We can move
on to the next slide. This disconnect between what students
are learning and the value they place on that and their greater
views around learning gets even messier when we start looking at
what students are doing outside of school to prepare themselves
for the future. I think we can probably all agree that school
no longer has that exclusivity on being the place of learning.
And, in fact, for many students, it is not their primary source
of learning, particularly as it pertains to their future. So
here we have, from that 107,000 high school students, 50% of
them saying that they're learning important things for
their future on their own, outside of school. And that may
be a wake-up moment for many educators and school leaders. So
why are students believing or doing these types of things
outside of school? I think this is where we get to some really
strong connections here into our NAEP scores, and also in
thinking about schools of the future. The students told us on
this survey that the primary reason that they are using
digital tools outside of school to self-direct their learning is
for self-remediation, and it's particularly in math. So the
students have a very cognizant understanding that they are
deficient or have deficiencies in their mathematical
understanding. And so they are tapping into some online and
digital tools that might even just be through a smartphone or
some type of mobile device to get that remediation. So that
might mean watching online videos from something like the
Khan Academy to self-remediate, or to be able to participate in
an online class -- in addition to their brick-and-mortar class,
not as a replacement for that class -- to, once again,
self-direct that learning to get themselves up to speed. In
addition to that, we see that the students are using a wide
range of different digital tools to also help with skill
development. It was particularly noteworthy to us in our research
findings that students were using online writing tools
outside of school, not as assigned homework, not for an assignment or a project, but on their own to develop their writing skills. In
many cases, the students told us that the use of those tools gave
them an audience, and they particularly appreciated getting
peer review on their writing as a means to help develop better
writing skills. And I think we all know that the more writing students do, the better their writing skills become. Students are
also tapping into online games and even social media to get
ideas and to think about what other people are doing and to
make some critical assessments of that environment that they're
living in. So the development of critical thinking skills is
equally important. Go ahead to the next slide. I think we're at
a really interesting point here where these new realities around
the use of technology and, more importantly, students'
perceptions of the value of their school-based learning
present sort of a challenge and an opportunity. Now, we know
that many factors influence how prepared students are for
college, for career, for the military, for vocational
training. I think it's particularly promising that students have a high interest in their academic success. But what concerns me is the disconnect between the value students place on what's happening during their school day, the types of learning experiences they're having there, and what they themselves perceive they need in order to be prepared to compete and contribute in a global society. Now, all of that
has resulted in the emergence, or increase, of this new
phenomenon where students are self-directing learning around
academic interests, using these digital tools that are available
to them, but, again, that is beyond school sponsorship. So we
have really an interesting golden opportunity here, to
rethink the way we are both approaching education during the
school hours and also to think about this concept around
academic preparation. So if we take into account what students
are doing on their own, outside of school to prepare themselves
for the future, it really could be a very interesting
conversation to start thinking about increasing the relevancy
of their formalized education process. But also, from a new
research standpoint, to think about what students are doing
outside of school, self-directing their own learning, and what that means as a reflection of the value that they place on their school-based academics.
So, in this big conversation about academic preparedness, I
applaud the new research that has been done to be able to give
us some strong foundational elements around students'
academic preparation, but also call for a greater interest or a
greater exposure to what students are doing outside of
school as a way to maybe understand some of these
statistics. Really, thank you so much. Under your leadership, Project Tomorrow has done such a great job really collecting and broadcasting the student voice around school. And this is so helpful, what you shared with
us. I also have another question that I would like to direct
towards you. It's from Lucinda Mulzac of Everybody Wins! in
Washington, D.C. And here's the question: "Would having students
assessed upon entering high school be of value to determine
how they could be better prepared for college?" That's a
great question, Bill, and a really thoughtful one. So,
congratulations to Lucinda for asking that. I think my
bottom-line answer, from our perspective, is yes. I think
that the more we know about either students' current academic preparation for college, career, and the military, or about how they value academic preparedness, the better we can understand what we need to do to make sure that these students
are on the right path. We know that preparing for college
doesn't start in 12th grade, in fact, it doesn't even start in
ninth grade. But to be able to have some more landmark data and
to understand where students are on that pathway, what their
attitudes and valuations are around academic preparedness,
and what they need to be able to feel that they are on the right
path would be so valuable not only for the nation at large,
but also for teachers themselves, and the schools that
they are part of. You know, there is a huge conversation out
there about the value of personalizing learning and
addressing student needs, and being able to understand what
students are thinking about in terms of academic preparedness,
and to be able to have data such as this to take a look at that I
think would be incredibly valuable. Wonderful. So we've
had a great program to this point. Peggy did a terrific job
providing this year's results on the Nation's Report Card. Dale
and Margretta shared from the classroom perspective, both some
very practical and pragmatic ideas, along with important
policy issues. And then Julie summed it all up with
information from the student point of view, which is so
critical and so important. Thank you all for your insights. Now
let's take some additional questions. And my co-facilitator
on this segment is Greg Orrison, and he'll direct audience
questions to one of our speakers, but we also encourage
other speakers to comment if they're so inclined. Greg, I'm
going to turn it over to you. Thank you, Bill. Again, to
submit a question for the panelists, type it into the chat
box in the lower left corner of your screen and click "Send." We
also have questions that were submitted by attendees before
the event and we will try to answer as many questions as time
allows. Our first question is from Beth LaDuca with the Oregon
Department of Education. She asks, "Is the information from
NAEP regarding academic preparedness for college
consistent with data from other sources such as Smarter Balanced or PARCC?" I know that Dale mentioned alignment with the
Assessment Consortia a bit in his remarks, but, Peggy, I
wonder if you could -- based on your awareness, if you could
speak to NAEP scores as they may relate to the consortia test?
Well, it's a good question, but, unfortunately, it's a
complicated one. The answer is fairly complicated because PARCC
doesn't have all of the states. It has a small proportion of the
states. That would be true for SBAC, Smarter Balanced as well.
But we did take a look at the results for those entities, and
what we see, on average, for example, in math, PARCC
[inaudible] small number of states, about 21% of their
students are prepared if you consider their highest levels,
which I think might be comparable to what they would
report. In reading, about 40% at their highest levels for, again,
a small number of states that they included. Smarter Balanced
is even more difficult, so I won't even try to venture into
that comparison. So I think it's a bit of apples and oranges
without more representation and knowledge of how the states that
are included in their samples really reflect the nation. But I
can say that for the ACT -- which is something that we've been thinking about as a point of comparison -- and for
the SAT, we see numbers that are slightly higher for ACT. For
example, in 2014, for their readiness in math, it was about 42%; 46% would be the comparable number for reading for the ACT.
And for the SAT -- and they combine theirs -- it's about
41%, or maybe closer to 42% if you round it up. So I think the
caution here would be that students select, for the most
part, into the ACT and SAT, so it's a slightly different sample
than what would be the case for the NAEP study, which, you know,
is really representative of all students that have taken this
assessment. So if we take these two factors into account, we can
make the comparison with some caution. Hopefully that's
helpful. Terrific. Thank you, Peggy. Several of our
participants, including Ellis Ott with the Fairbanks North
School District and Susan Fitzpatrick with the James S.
McDonnell Foundation, have asked about what the data can tell us
about preparedness for careers. So I know the governing board
has done some research about preparedness for job training. I
wonder if you could start us off in response there. Sure. So,
initially, we were researching and exploring the relationship
between the NAEP scores and academic preparedness first for
the transition to college, and then also the transition to
careers, and, finally, the transition to opportunities in
the military. We were able, through the research, to
establish, through linkage studies, a pretty solid
relationship to that transition from high school to college.
Unfortunately, because of the many different career
opportunities, we just were never able to establish that. We
desperately wanted to know and to be able to answer that
question, but we couldn't do it reliably in a way that we were
going to share that with the public. So we backed away from
that. And I guess the same thing happened with the military.
Julie, I know you have an interest in this topic as well,
would you care to add anything about preparedness for careers?
I think it is a really interesting conversation, and
it's obviously one that employers are very interested
in. Are our students being well prepared, not just college
success but to be able to come into a job and to be able to
function. So when we look at what employers are saying are
the types of skills that they are most interested in, I'm
always drawn to employers' conversations around the needs
for employees to be good critical thinkers and problem
solvers. And so, while I agree with Bill that it's hard to make some of these leapfrog jumps, it is, of course, interesting to
look at and say that if students are doing critical reading and building strong math aptitude, those should be the types of things that prepare them to develop their muscles for critical thinking and problem solving, and to be
able to develop that sort of competency. And, of course, that
is obviously what some of the goals are around some of the new
standards around career and college preparation. So I think
it's interesting to keep it in perspective and to maybe not be
able to draw the direct links from A to B to C, but to
understand that the ultimate goal of what we are trying to do
is obviously to prepare the students to be fully functional
and economically self-sufficient. And the fact
that the employers are calling for these type of skills means
that we do need to be paying attention to this. Great. Thanks
so much. We have a question from Jessica Williams with Microsoft
-- and others have posed similar questions. She'd like us to
address the trend over time as it relates to contributing
factors, such as testing approach and methodology or
standard curricula that may have changed over time. I know that
several participants have also expressed an interest in how
Common Core may have played a role in the results. Dale, I
know you addressed that part in your remarks, but I wonder if
our other panelists would like to weigh in on that question,
Margretta or Julie, for example, how contributing factors, such
as testing approach or standard curricula, may have lent
themselves to the NAEP results, and maybe your perspectives in
the classroom or what you're seeing on a day-to-day basis.
This is Julie. I can give a little bit of a perspective
there, and I would love to hear from others. I think that our
situation in our schools today -- and, again, we sort of float
above a 50,000-foot level in terms of the research that we do
-- is very much in flux. Dale mentioned the change in
curriculum standards, but, in addition to that, there are
changing expectations for students, there are changing roles of teachers, and the instructional model is changing in many
schools. And so I do think that that may -- when I read through
the results, that was what struck me, that when we're
looking at the flux or the changes or the dynamics in our
school environment today, we would probably be shortsighted
if we did not realize that that would have to be a contributing
factor to some of these results. This is Dale. And I'd like to
weigh in on that in a slightly different perspective, I think,
if that's okay. One of the things that we have been working
a lot on the governing board is our transition to
digitally-based testing. And this obviously doesn't impact
the results we're looking at right now, but I think more and
more students are taking high stakes tests online or on
computers. And as NAEP heads in that direction, I actually see
NAEP as a leader in that because some of the things we're looking
at in terms of the digitally-based testing that's
coming down very soon will give us much richer results and I
think that they will engage the students a lot more in the
testing. So, in terms of the technology, it's going to have a
huge impact in the next few rounds of testing. And I know
Peggy and her people are working really hard to connect it in terms of trends so we don't lose the trend when we make that transition.
Hi, this is Margretta. And I'm commenting on the changes that
I've seen in my own school for teachers. The implementation of
the PARCC and Common Core has meant not only changing to meet
the new curriculum standards but also a lot of extra
responsibilities for the teacher, which include doing all kinds of reports and keeping track of student data in the classroom, which isn't a bad thing, but we do seem to, because of requirements, spend a lot of time tending to needs outside of preparation for our classes, which could cause a problem. And we also spend a lot of time in repetitive meetings about making sure that we're meeting all the new requirements. So, just sort
of, you know, getting ourselves situated to handle all the new
standards that come along with the Common Core and the kids
being successful on the PARCC in the beginning can disrupt some
of what goes on in the classroom. This is Peggy. Go
ahead, Peggy. Yeah, I wanted to add to some of this, the
thinking that I have heard thus far. I want to particularly
comment on the number of factors that I have heard at the
beginning of our dialogue around dropout rates and graduation
rates and course taking because I think those are some really
important factors to consider in the trends that we're seeing
today. And I can add to it some actual numbers that might put it
into a stronger context. The dropout rate, for example, has
improved from 12% to 7% from 1990 to 2012. That's a really
good picture. The graduation rate has improved from 74% in
1992 to 82% in 2012. Now, we've seen similar growth patterns or
improvement patterns for all of the racial/ethnic groups,
although they're not always this good, but it does mean that we
have more students in schools who would normally not be there.
And what we know from the data today is that students are more
likely to be taking a course in their 12th-grade year -- a math
course in their 12th-grade year than they did years before. And
that's a good thing because they score higher in all of these
assessments that we've been tracking for 12th-graders if, in
fact, they take a course, regardless of what the course
is, in the 12th grade. The other point that I think was mentioned
earlier that I think bears repeating is that there has been
a change in the pattern of course-taking over the last few
years. Since 2005, for example, there has been an uptick in
students taking calculus. And a similar pattern for students
taking precalculus as their last course taken. Conversely, which is what you would expect to see, fewer students are reporting that their last course at the end of their 12th-grade year is algebra or geometry. I just looked at these results by race
and it's pretty consistent. So we have students that we
normally would not have in these comparisons. The population is
shifting for the better. They are also taking more advanced
courses. Greg, this is Bill. If you don't mind, if I can add
into this, I know I'm technically the moderator, but I
think this is a great question from Jessica. And there are
people out there who really study change in environment.
Michael Fullan comes to mind as one who's an expert on this.
There's no question that educators across America have
decided we want to do everything we can to improve student skills
and knowledge in a variety of areas. This has resulted in
multifaceted change that's affecting classrooms and schools.
While we would like to see those changes result in immediate and
impactful positive results, the experts that study change often
have documented that sometimes there's a slight dip in the
performance that we measure before we actually see the
increases that we're hoping for. I don't know if that's happening
here. I do applaud the educators across America who are doing
everything they can to strengthen the skills that our
high school students have as they exit high school, though.
Great. Thanks so much to all of our panelists. Really
interesting. Ze'ev Wurman with IEEE, and several others, have
asked about state results, why those are not available, and how
many states may have participated in the assessment.
Bill, are you able -- I know that Peggy touched on the
sampling -- are you able to speak a bit more to the role of
states in grade 12 assessment? Yes, very much so. And Peggy can
also chime in here. The governing board has a set of
priorities, and included in one of those priorities is to have
state-level data in as many different assessments as we
possibly can. And, again, Peggy can speak to this. In this case,
it's a resource issue. It just -- to do the larger sample for
the state-level data simply costs more, and we just weren't
in a position to do it. That's not to say that it isn't a
priority for the governing board and for NCES, not only here in
the 12th grade but also in some of our other assessments that we
do to have that state-level data. Bill, I think you're
exactly right. I can't add much to that. I think that answer is
the most appropriate one. I would add that it is a pilot
project. It is a pilot program in the sense that it's not
required for states to participate in 12th grade as it
is for fourth and eighth grades for the reading and math assessments. In 2011 -- no, 2013 I think -- we had 13 states volunteer to participate; and the time before that, we had 11. And so it is
something that some states want to do, though not all states want to do it. I think there is a growing desire to be part of our
12th grade assessment. But, keep in mind, it is a pilot project.
We do it when we have funds to do it. And when we do implement
the program, states volunteer to participate. Thanks, Peggy. One
other thing, Greg, that I see hasn't come up as a question
here today, but it's the issue of do 12th graders really engage
in this assessment, do they take it seriously? And, Peggy, you
have some pretty good data on that question about 12th-grade
engagement. Yes, Bill, that's a really good point. It's a
question we get all the time when we release these 12th grade
data. And so we've been tracking some indicators that give us
some level of comfort that students are engaged with the
assessment. You know, motivation is a difficult construct to
measure. But we do have some indicators that tell us a little
bit about how students are interacting with the assessment
when we look at omit rates, meaning they just skip over questions, or they stop and don't reach the end of the block. We've been looking at these indicators over
time and, quite honestly, they have been holding steady. The
response rates to our questions are in the high 90s. That hasn't changed measurably over a ten-year period. And so we
believe that students, whatever results we're looking at today,
are not differentially influenced in any way by student
engagement with the items themselves. Great. Thanks so
much for bringing that up, Bill. I think we have time for one or
two more questions. Susan Fitzpatrick with the McDonnell
Foundation has a pretty interesting question I think
best addressed to Julie. She asks, "Is there a risk that
self-directed learning will widen educational gaps?" Julie,
do you have thoughts about that? Yes, absolutely. And, in fact,
we do a lot of research around students' access to technology
as an opportunity to do that sort of self-directed learning.
And so I would say, Susan, yes, there is definitely a risk that
self-directed learning could widen educational gaps. And so
when we look at the types of ways that students are
self-directing learning, we do see some differentiation. But,
to be quite honest with you, the differentiation is often about
student familiarity with technology and their comfort
with using technology for that sort of self-directed learning
more than their access. So we pay a lot of attention to the
equity conversation as part of this, and to make sure that when
we're talking about students' self-directing learning,
particularly using digital tools, that it's an equitable type of access so that all students have the opportunity to do that.
But, of course, without that sort of even playing field, it
does run a risk that certain students would have that sort of
access more than others. Thanks so much. And I think we'll try
to sneak in one last question. Barbara Hollingsworth with CNS
News asks, "How do the latest NAEP scores compare to
international scores in reading and mathematics?" Peggy, are you
able to speak to -- are we able to make those international
comparisons with some of the grade 12 results? I'm glad that
you refined your question and asked whether we're able to make
those comparisons because I think it is a little difficult
because the time period for which we have data are not
aligned very well. The TIMSS results, for example, were last collected in 2011. In comparison to 2007, we've seen an improvement for grade four math, and for grade eight math we've
seen no improvement, so it's holding steady. And I guess my
point is that we would rather make comparisons to a comparable
year, which does happen quite often with TIMSS because of how
it's aligned. I would say to our readers, callers, listeners
today that the TIMSS Advanced for 12th grade -- we haven't done 12th grade, or population three, in a while -- will be released this year in November. Actually, I have the date, November 29th. We have a TIMSS Advanced for physics and a TIMSS Advanced for calculus. So I think we should wait with bated breath to see what goes on there. PISA, which is the other
one that people like to know about our results for our
15-year-olds, that will be released this year as well,
December 6th. So write that on your calendars. We'll have
results. The results in the past for PISA, the most recent ones
were in '09 to '12, have been flat for reading, math, and
science. So I think we're looking forward to what these
results will say for our 15-year-olds in December. Got
it. Thank you so much. And that's actually all the time we
have for questions today. So, really appreciate all of our
participants who did submit a question. Now I'll turn it back
over to Bill for concluding thoughts. Thanks, Greg. I hope
today's discussion about 12th grade achievement and college
preparedness inspires you to learn more and to continue the
conversation. Remember to join others on Twitter to talk about
the findings. So that's #NAEP. Also, you can visit nationsreportcard.gov to see the full report and to examine the
data in far greater detail. Additional materials include a
news release and information from the governing board's
preparedness research, and that's available right now at
our website, nagb.org. And an archived version of today's
webinar will be available early next month. Members of the media
may contact the Governing Board's Public Affairs Specialist,
Stephaan Harris, with questions. And, in closing, I want to thank
Peggy and Dale and Margretta and Julie for providing their
valuable insights. I also want to thank Jennifer and Greg for
helping bring this program to you. And I know, Peggy, you and
I are going to join in thanking our staffs. We couldn't do
anything that we're doing without their dedication and
their support, the staff at NCES and the staff at the National
Assessment Governing Board. And finally, of course, we thank all
of you for joining us for today's discussion. Have a great
rest of the day.