Good afternoon.
The microphone works, great start.
Thank you, everyone, it's great to be
in this beautiful ballroom.
Welcome to the National Assessment Governing Board
25th Anniversary Symposium.
I'm David Driscoll, chair of the Governing Board,
and I'm proud to be one of the hosts of this event.
And we're going to look at what we've accomplished in the past
and also look forward.
At one of these events, one of the people in my department
when I was commissioner in Massachusetts
said, so, Commissioner, what are you going to talk about?
I said, I'm going to talk about where we've been, where we are,
and where we're going, and he said, oh, that makes sense.
So that's a little bit about what
we're going to talk about today.
Now, whether you're here in-- I have to remember this,
now-- whether you're here in Washington or watching
via video online and joining us in that way,
please join the conversation on Twitter using the hashtag
NAGB25th-- N, A, G, B, I think everybody knows that.
So that's the hashtag, #NAGB25th.
It was in 1988, as Ray Fields often reminds us--
after he gets through the history of the United States
starting with Thomas Jefferson, we finally get to 1988,
as board members know from the orientation-- that
Congress established the National Assessment Governing
Board.
And the board has been expanding the reach and impact
of the National Assessment of Educational Progress,
more commonly known as NAEP, the Nation's Report
Card.
We're here today to talk about that.
We're also here to look ahead to the next 25 years
and to consider new ways to keep the Nation's Report
Card the gold standard for measuring student achievement.
However, to look ahead, we need to look back and appreciate
the foundation that has been built.
The Governing Board was created by Congress
to oversee, set policy for, and ensure
the integrity of the great asset this country has
in the National Assessment of Educational Progress.
Since then, the non-partisan and highly diverse
group of board members has worked
with hundreds of other Americans to make public these data
in the interest of improving achievement and closing
achievement gaps for all of our students.
We are glad that you will be a part of today's discussion
to review what we've done, delve into what else we need to do,
and issue challenges for the future of the Governing Board
and the Nation's Report Card.
We also want to ensure that NAEP will continue to provide
valuable information to parents, policymakers, educators, and others.
I'm joined today by Alan Friedman, Chair
of the 25th Anniversary Planning Committee,
and Cornelia Orr, the Governing Board's Executive Director,
both of whom you'll be hearing from momentarily.
After our introduction, we will go right
into our panel discussions, and here's
a look at what you can expect at today's symposium.
Here in the Palladium Room, and for our live stream audience,
we will discuss Innovation and NAEP: A Special Role.
The panel will talk about the ways
in which technology has changed.
Do you think technology has changed in 25 years?
That includes the impact, obviously, of computer-based assessments,
innovation in reporting results,
and a look at new questions designed
to measure problem solving and critical thinking.
At that same time in the Blue Room-- not named
for any particular political bent,
I guess Ol' Blue Eyes sang there,
so that's why it's the Blue Room-- another panel will
convene for a discussion we're calling
TUDA: Its Impact and Future.
This panel will examine how a strategic and focused
assessment of 21 of America's largest urban school systems
has provided key insights over the years, and again,
the challenges that lie ahead.
And those panels will begin at 1:25.
Then at 2:30, following a break, we
will continue with two more sessions.
First, should NAEP go to college or should it get a career?
Should NAEP go to college?
From Preparation to Possible Expansion Into Postsecondary
Education, wouldn't that be interesting?
That will take place in the Palladium Room,
and again, for our live stream audience.
The discussion will focus on changes
in the way the Governing Board plans
to report on academic preparedness
and the possibility of NAEP expanding
to include postsecondary assessments.
In the Blue Room the panel topic will
be Assessing Special Populations:
Changes and Challenges, which looks at the innovative ways
the Nation's Report Card assesses
and reports on students with special needs and related
policy issues.
We invite you to attend the panel discussions
you find most relevant to the work you are doing.
We know the sessions will prove to be productive in shaping
the future of the Governing Board and NAEP.
Audio or video of all panel discussions
will be available online later this month.
We will reconvene here in the Palladium Room
and hear from our plenary speaker, Jim Shelton, Acting
Deputy Secretary at the US Department of Education.
Our final panel of the day is called
"A Quarter Century of State NAEP-- And Its
Future in the Time of the Common Core."
We'll address how we learn from the past quarter century
and adapt for the future.
What are our challenges in differentiating NAEP
from other standardized tests?
The symposium wraps up with remarks
from our concluding speaker, Roberto Rodriguez,
Special Assistant to the President for Educational
Policy-- and by the way, a tremendous staff
member for Senator Kennedy for many years
before his current position.
Now it's my privilege to introduce
Alan Friedman, consultant in museum development
and science communication, former member of the National
Assessment Governing Board, and Chair of the 25th Anniversary
Planning Committee.
For 22 years, Alan was the director of the New York
Hall of Science, New York City's public science and technology
Center, and we had the great pleasure of visiting it
when we had our meeting in New York,
and it was absolutely fabulous.
Under his leadership, the Hall of Science
won special recognition for encouraging new technologies,
creating new models for teacher training,
and serving an extraordinarily diverse audience--
as you can imagine-- in New York City.
Alan is the recipient of the American Association
for the Advancement of Science's Award for Public Understanding
of Science and Technology, The Association
of Science Technology Center's Fellows Award,
and the American Institute of Physics' Andrew Gemant Award.
He's also, as far as I know, the only person in the room
ever to fly a plane as a member of the Governing Board.
We're glad to have him join us today. Alan?
[applause]
Well, thank you, David.
And welcome, everyone, to the 25th Anniversary Symposium
of the National Assessment Governing Board.
Last fall, I finished my eighth and final year
as a member of NAGB, so I've been involved with NAGB
for almost a third of its entire history.
And this symposium is the result of a lot of thought and planning
by the 25th Anniversary Committee,
current and former board members, our colleagues
and contractors, and as I think any of you who
have been involved know, an incredibly
hard-working and ingenious staff.
And what a remarkable process NAGB is.
It's a policy-setting and oversight
board for a federal program, but it's
a board appointed by the head of that federal program
and dependent on the program to get things done.
Yet for 25 years, this has been a generally smooth and mutually
respectful process for the board,
and for the Department of Education program it oversees.
Given all the complexities and tensions in government,
in education, in anything that involves
rigorous critical measurement, how
has NAGB managed to survive and even thrive?
So today we have a chance to explore
something of that story.
But for me, I got a summing up of what
NAGB and NAEP have accomplished
just a few months ago, when I spent a lot of time
in a part of the country that I visit regularly
where the local newspaper and the publisher
believe that anything touched by the federal government
is instantly polluted and probably destroyed.
And so I'm reading a column by one of their regular columnists
who's raving on about how awful the federal government is,
and he's explaining that nothing the Feds say
is ever to be trusted.
He then took on education: look how the federal government is
destroying American education.
Why, he said-- and I quote-- the Nation's Report Card clearly
shows that after 20 years of federal meddling,
his state was making painfully slow progress in education.
Now, did that columnist realize that the Nation's Report Card,
which he cited, which was the only actual piece of evidence
he offered in defense of his views,
is in fact a product of that federal Department of Education--
the extraordinarily important, reliable, and trusted program
called NAEP, overseen by the National Assessment Governing Board?
So today we're going to have a chance to explore mysteries
like this, how have we done it, and how can we
keep doing it for the next 25 years?
Thank you.
[applause]
Thank you, Alan.
Next, we welcome our steady Executive Director,
Cornelia Orr, who is Executive Director of the National
Assessment Governing Board.
Prior to joining the board, she served
at the Florida Department of Education
as the Assistant Deputy Commissioner for Accountability
from 2003 to 2009.
She provided great leadership and direction for Florida's K
to 12 and postsecondary assessment programs.
Her career has involved assessments
at both the local and state level as Director of K
to 12 assessment, as Director of Testing and Evaluation
with the Leon County schools, and as a Program Specialist
with the Department of Education on a variety
of other assessment projects.
Please welcome Cornelia Orr.
[applause]
Thank you, David.
As David mentioned, I am from Florida.
I'm one of those strange people who retired from Florida
and moved north.
But I'm very happy to be working with NAEP and the Governing
Board.
I am also very proud of Florida's testing
program and its support of NAEP.
A great big thank you goes out to Tom Fisher, who
was my mentor in that program and also a Governing Board
member; many of you may remember him.
Under his leadership, Florida was one of the first states
to make it a requirement for Florida school districts
to participate in NAEP, and many states
have subsequently followed that pattern.
Using a pattern that one of our Governing Board members
has put forward, Two Stars and a Wish,
I'm going to offer Two Comments and a Challenge today.
So, my first thought
is that NAEP has been very good for the United States.
Can you even imagine what it would be like living in a world
without information about how students all over this country
are performing in various subjects that NAEP assesses?
Some really great thinking and engineering
has made NAEP what it is today.
However, we shouldn't just rest on our laurels;
we should keep problem solving and redesigning to best meet
the needs of today and the next generation of NAEP.
My second thought is that the Governing Board
has been good for NAEP.
Of all the ways I could name at this point,
I will focus only on how the board has impacted NAEP
reporting-- and by the way, this work is still ongoing.
These improvements have included
more accessible reports, faster turnaround time for reporting
on assessments, reporting using achievement levels,
and identifying new target audiences for NAEP.
Of course, not everyone here will agree that all of these changes
have been good or successful.
However, that's something that can be discussed in more depth
today.
Now here's the challenge: I want you to get some exercise today.
You'll mostly be sitting, but you
can exercise your 21st century skills.
I want you to think critically about NAEP and the Governing
Board, their contributions of the past
and their potential for the future.
I want you to use collaboration to openly discuss
the pros and cons of the past decisions of the board
and possibilities for the future.
I want you to compare different parts of today's narrative,
find similarities and differences,
and bring clarity to the future decisions that
might improve NAEP and the Governing Board's work.
Use your problem solving and brainstorming
to identify more possibilities for the future.
So just in closing, I would like to ask: what does the United
States need to know about the achievement of its students
over the next 25 years?
How can the Governing Board and NAEP
continue to improve how this information
is provided in a consumable manner?
To help you recall the trajectory of where
the Governing Board has been, we
have prepared a video for you to see.
It should help you reflect and start your thinking
about the possibilities for the next 25 years.
[music playing]
In 1988, nearly 20 years after its inception,
Congress transformed the respected National Assessment
of Educational Progress, NAEP, the Nation's Report Card,
by providing for state-level assessments,
standards-based reporting, and an independent, bipartisan
National Assessment Governing Board
to oversee NAEP, set policy, and ensure its integrity.
A Nation at Risk, published in 1983,
highlighted academic underachievement
and underscored the importance of measuring
what students know.
In 1986, the publication of Time for Results by the National
Governors Association argued for education reform, rigor,
and measures of student achievement
that are comparable across states.
The 1987 publication of the Alexander/James Report
recommended NAEP as the vehicle for providing
cross-state comparisons.
These recommendations were the underpinnings
for the 1988 NAEP legislation.
Over the next few years, the Governing Board
developed new challenging NAEP frameworks,
and in 1990 and 1992, those frameworks
were used as the basis for the first trial
state-level NAEP assessments in reading and mathematics.
These groundbreaking assessments moved NAEP's common yardstick
to the state level.
During the 1990s, following its congressional mandate,
the Governing Board developed NAEP basic, proficient,
and advanced achievement levels used
to report results and help answer
the question, how good is good enough on NAEP?
Starting in the mid-1990s, the Governing Board
and the National Center for Education Statistics, NCES,
worked collaboratively to include more students
with disabilities and English language learners in NAEP.
In 2002, NAEP launched the voluntary Trial Urban District
Assessment, or TUDA component, a partnership of the Governing
Board, NCES, and the Council of the Great City Schools.
In 2002, TUDA included six districts; by 2009,
18 districts had volunteered to participate in TUDA;
and by 2011, 21 districts were participating.
Following the recommendation of a blue-ribbon panel,
the Governing Board embarked in 2004
on a long-term, rigorous effort to determine
whether NAEP could report on the academic preparedness
of 12th graders for college and job training.
The Governing Board completed the first phase
of a comprehensive program of research in 2013.
These findings will be included with results from the 2013
12th grade reading and mathematics assessments.
In 2005, the Governing Board began work
on an exciting new writing assessment designed
to mirror advances in technology.
That led to NAEP's first fully computer-based assessment
in 2011 in writing.
More innovation followed in 2008 when the Governing Board
focused on the T and E in STEM education,
developing the Technology and Engineering Literacy
Framework with a dynamic computer-based design.
Today's students will have to make decisions
about how best to use what technology and engineering have
to offer.
They're also the next generation of inventors and thinkers.
When the Governing Board turned 20 in 2008,
Senator Lamar Alexander recognized the board,
calling it the key factor in cementing
the standards-based reform movement across this nation.
NAGB has set high standards, developed
high-quality assessments aligned to those standards,
and reported data in a clear and understandable manner.
In 2010, the Governing Board and NCES
embraced digital communications as a way
to share NAEP results and resources with more people.
The NAEP partners continue to pursue innovative ways
to engage parents, teachers, and policymakers alike
through social media, video, and enhanced multimedia.
The Science in Action assessment, released in 2011,
was just what it sounds like.
The assessment included hands-on science tasks
and new interactive computer tasks.
The release event was held in conjunction
with an interactive hands-on science exhibit.
With the 2013 release, NAEP presented findings
for the first time through a highly
searchable and interactive website.
Combined with online media and stakeholder outreach,
the 2013 reading and mathematics report card
saw record-breaking attendance and media coverage.
In 2014, the Governing Board hosted the first
ever Education Summit for Parents with an opening address
by US Secretary of Education Arne Duncan.
Raising your voice and encouraging
parents who aren't as engaged as you to stand up and speak out.
Parents absolutely have the power
to challenge educational complacency here at home.
The past 25 years have laid a solid foundation
for the future.
The Governing Board is dedicated to promoting
ways educators and policymakers can use NAEP data and resources
to make a difference in improving achievement
and closing achievement gaps for all of America's students.
The board is working on a fully computer-based NAEP,
embarking on a broader parent outreach initiative,
staying at the forefront of inclusion policies,
pursuing NAEP's role in academic preparedness,
and taking other steps to make NAEP results and resources
accessible and useful for its many audiences.
As the Governing Board enters its 26th year,
it remains committed to its duty to the public:
safeguarding NAEP as the gold standard,
remaining responsive to the challenges and opportunities
to come, ensuring NAEP's integrity, credibility,
and rigor as a trusted measure of student achievement,
and making a difference for education in America.
[music playing]
[applause]
A couple of things to note: you can
see Alan Friedman has become our movie star in all of these.
He's in every video we have now; it used to be just Alan.
And I can't help but comment, as I look out at this audience,
on all the tremendous leaders and people
who have been involved in the board.
One sentence in that video talks about state
NAEP like it just happened so easily; some of you
still have the scars to prove the battle that went on.
But obviously tremendous progress has been made,
and we have a very proud history.
We're running a little bit behind, but we'll make it up.
We're going to take about a five-minute break,
and again, the panels will begin:
the one in the Blue Room on TUDA and the one here on innovation.
Thank you very much.
Well good afternoon, and welcome to the session
on innovation and NAEP.
When a program like NAEP aspires to be the gold standard
and to maintain things like long-term trends,
can it also aspire to be innovative,
on the cutting edge, risk-taking?
There's a tension there between wanting
to be reliable and consistent versus wanting
to be innovative, coming up with new measures of rigor.
And that's been the NAEP story for 25 years,
and perhaps even longer.
So this panel will explore how NAEP and NAGB have
been involved in assessment innovation
in the past, how they've balanced that tension,
and what opportunities for further innovation lie ahead.
Joining me on the panel this afternoon
will be Randy Bennett, who holds the Frederiksen
Chair of Assessment Innovation at the Educational Testing
Service and is an expert on performance assessment; Peggy
Carr, Associate Commissioner for Assessment at NCES;
and Richard Rothstein, the former New York Times
education columnist who is now a research
associate at the Economic Policy Institute
and a senior fellow at the Earl Warren Institute of Law
and Social Policy at the UC Berkeley School of Law.
And then I will make some remarks, too.
But you can never get enough of me,
so now you get to see a brief clip in which I
do appear-- don't blink-- about some
of the innovation that has taken place recently at NAEP.
So if we could roll the videotape.
[music playing]
Technology has transformed all aspects of education,
giving rise to innovations both in and outside the classroom.
From chalkboards to white boards,
pencils to tablets, and classrooms to online learning.
In many ways, the National Assessment
of Educational Progress, or NAEP,
has made innovation its own using technology
to improve the development, delivery, analysis,
and reporting of NAEP data.
The National Assessment Governing Board
and the National Center for Education Statistics, NCES,
are partners in breaking new ground
in all stages of the assessments.
For the first time, NAEP made computers part
of the assessment itself with the Science Interactive
Computer Tasks component in 2009.
The 2011 writing assessment then marked the first fully
computer-based assessment, which featured the use of word
processing tools to reflect how students write in their everyday lives.
The 2014 NAEP assessments in science and technology
and engineering literacy are the next
of the many computer-based assessments to come.
The Governing Board championed the development
of the technology and engineering literacy framework,
designed with computer-based interactive tasks
and real-world scenarios to test the next generation
of inventors and thinkers.
I'm passionate about technology and engineering.
Why?
Well, look around you.
Technology is cell phones, CAT scans, bridges,
and so much more.
With the rapid shift toward online media consumption
and the proliferation of mobile devices,
the Governing Board and NCES have innovated NAEP's reporting
and communications.
The release of the Science Report Card in 2011
marked a shift from in-person media events
to online presentations.
The new webinar format invites people from across the country
to tune in and expand the discussion of results
beyond Washington, DC.
Increases in attendance and coverage were immediate,
and the 2013 Reading and Math Release event
had more than 400 webinar participants
and generated nearly 250 original news
articles and 1,200 mentions on social media.
Those Facebook and Twitter audiences
continue to propel conversation about NAEP.
NAEP and TUDA even topped the DC Twitter charts.
The Governing Board and NCES remain
focused on increasing NAEP's impact by expanding
the audience and improving the online experience.
With these new outreach efforts have come new products.
Together, the Governing Board and NCES
are redefining how NAEP is packaged and distributed
through new online interactive NAEP report
cards, visual infographics, enhanced online data
tools, and more.
The Governing Board and NCES will
continue to innovate on all elements of NAEP
to preserve the gold standard in a rapidly changing world.
So where will innovation take us next?
It's about unlocking human ingenuity.
Don't just donate a computer to my school.
Teach me how to understand that computer.
Because I'm connected to everyone on the globe.
Will I be prepared?
What will my future look like?
[music playing]
And so now it's my pleasure to introduce Randy Bennett, who'll
be our next speaker.
Thank you very much.
It's a pleasure to be here.
As I believe Peggy Carr will show in her presentation
to follow, NAEP has a long history of innovation.
In addition to that history and in part because of it,
NAEP has also been a touchstone, a gold standard for information
about educational achievement.
So what happens to NAEP in the era of the Common Core State
assessments?
Will NAEP's roles as innovator and as touchstone
be eroded, perhaps to the point of irrelevance?
I have high hopes for the Common Core State assessments,
especially as a mechanism for bringing much-needed change
to K-12 testing.
By virtue of being state assessments,
those measures will necessarily serve a function fundamentally
different from NAEP's.
The most obvious distinction is that it will simply not
be possible for the Common Core State assessments
to provide meaningful national results
since not all states will participate.
And even for those states that do participate,
subjects outside of English language arts and mathematics
won't be assessed.
But there are other critical distinctions.
NAEP is a touchstone because it has
a track record built over a half century.
The Common Core State assessments
are understandably unproven.
And it will take significant time for them to become proven.
Once fully implemented, the Common Core State assessments
will, by examinee volume, be the largest consequential
technology-based testing program ever undertaken.
Being new, huge, highly ambitious,
and very consequential, these assessments
could benefit greatly from the independent verification
that NAEP might provide.
Because NAEP is a sample survey and not a census,
it can more easily address technical challenges
that may prove very troublesome for the Common Core State
assessments.
As one example, NAEP brings its technology into schools
so it can assess every student on computer, something
the Common Core State assessments will not
be able to do in their initial years,
posing for them the technical challenge of scores that
might not necessarily be comparable between paper
and computer versions.
Given NAEP's population representativeness,
extensive track record, and a level of technical quality
that won't be feasible for the Common Core State assessments,
NAEP will continue to be the touchstone for measuring
and verifying educational achievement for years to come.
What of NAEP's role as innovator?
For several reasons, that role will be even more essential
in the era of the Common Core.
Again the distinction between a sample survey and a census
is key.
In contrast to the Common Core State assessments,
each student takes only a small part of a NAEP assessment,
and only a relatively small sample of students is tested.
That difference allows NAEP to administer
experimental measures, measures that could not easily
be fit into the Common Core State assessments.
If all students are already taking an eight-hour test,
asking for more time is hardly workable.
And in a census, there are no additional students to test.
For NAEP, getting more students to try out new measures
is workable and has been done repeatedly
over the program's history.
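The sampling design just described, in which each student takes only a small, randomly assigned portion of the full assessment, is often called matrix sampling. A minimal sketch of the idea in Python (the block names, counts, and sample size here are purely illustrative assumptions, not NAEP's actual booklet design):

```python
import random

# Illustrative sketch of matrix sampling (NOT NAEP's actual design):
# each sampled student receives only a random subset of the item
# blocks, yet the sample as a whole covers the entire item pool.

ITEM_BLOCKS = ["B1", "B2", "B3", "B4", "B5", "B6"]  # hypothetical blocks
BLOCKS_PER_STUDENT = 2

def assign_blocks(num_students, seed=0):
    """Give each sampled student a random booklet of BLOCKS_PER_STUDENT blocks."""
    rng = random.Random(seed)
    return [rng.sample(ITEM_BLOCKS, BLOCKS_PER_STUDENT)
            for _ in range(num_students)]

booklets = assign_blocks(300)

# Each student sees only a third of the pool...
assert all(len(b) == BLOCKS_PER_STUDENT for b in booklets)
# ...but collectively the sample covers every block, so
# population-level estimates remain possible for all of them.
assert {blk for b in booklets for blk in b} == set(ITEM_BLOCKS)
```

This is why a sample survey can afford experimental or lengthy tasks: no individual student carries the whole burden, while the population estimates still draw on every block.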
Because NAEP does not need to simultaneously offer a paper
option, its technology-based assessments
can be highly innovative, measuring new skills called
for by the frameworks and using new task types that would not
be possible to even contemplate in the paper versions.
The Common Core State assessments
will have no such luxury because they must test every student.
If their technology-based test is too innovative,
it might be measuring aspects of the Common Core
not included on the simultaneously available paper
version, raising issues of both comparability and fairness.
Perhaps the most visible innovation
of the Common Core State assessments
will be in moving the infrastructure and member
states from paper-based assessment of state standards
to technology-based assessment of common standards.
That highly significant innovation
will take several years.
While that move is occurring, NAEP
will have the opportunity, indeed the obligation,
to continue its own innovation efforts, much of which
the Common Core State assessments can build on
in the future.
What might NAEP explore that the Common Core State assessments
might not have the bandwidth to do more immediately?
Most obviously, NAEP must build upon advances in the learning
sciences, in technology, and in measurement
to put into operation more engaging, more relevant,
and more informative measures.
NAEP started down this path with its exploration
of interactive performance tasks and the analysis of students'
solution processes, first seen in the Technology-Rich
Environments study of 2003 and continued in the 2009 Science
Interactive Computer Tasks and the 2014 Technology
and Engineering Literacy Assessment.
But imagine not NAEP assessments but NAEP learning challenges.
Imagine a scenario-based argumentative writing task
where students are faced with a meaningful and engaging
problem, perhaps about whether junk food should be sold
in their school or whether advertisements to children
should be banned.
Imagine providing students with a simulated internet from which
they could gather and read information
and make a policy recommendation to
their congressional representative.
Imagine being able to report not just
on the quality of the writing or the extent to which it made
claims backed by evidence but on the quality of the strategies
used for information search.
Imagine a NAEP scenario-based science test in which students
had to gather information from their simulated internet
and then run a series of virtual experiments
to find out why a particular species of bird was dying.
Imagine being able to report not just
that one population group differed
from another in its end result but how that end result was
associated with group differences
in problem-solving approach.
Imagine a NAEP assessment that had many, many projects
like these, including technology-based collaborative
ones.
Imagine a NAEP assessment that measured learning deeply
because the tasks called for sustained work
and that measured broadly because there were enough
randomly assigned tasks to make population-based inferences
about student processes, strategies, knowledge,
and skill.
Imagine NAEP tasks from which students learn something
significant by participating in the assessment.
Imagine a NAEP assessment that didn't feel like a test at all
but felt like a learning challenge.
NAEP is the vehicle to explore potential innovations
like these.
NAEP is the vehicle within which to perfect them
to the level of technical rigor worthy of the nation's report
card.
In the era of the Common Core, NAEP's finest days
are not behind it.
Both as touchstone and as innovator, NAEP's finest days
lie ahead.
Thanks very much.
Thank you, Randy.
Peggy?
I'm going to start with a NAEP fact.
I started with the program back in 1993,
and there were a lot of firsts in the program.
NAGB had been in existence for only a little while then;
they had really just gotten started.
And I had the pleasure of dealing with a lot of firsts.
And there was Emerson Elliott, who
was the commissioner at the time.
There was Chester Finn.
That was kind of scary.
He was the chair.
There was Roy Truby, who became one of my best buddies
as time went along.
And then there was Gary Phillips,
who mentored me as the associate commissioner for the program.
So these were a lot of strong leaders,
and a lot of interesting, very innovative things
happened during this time.
And they really started the ball rolling.
So what I'm going to do with my little time
is to take you on a journey.
I'm going to take you back to the future.
I'm going to show you what happens
behind the scenes, a little bit more of what you've seen today.
You've seen the reports.
You've seen the NAEP Data Explorer.
And these are the things that you see up front.
And they, my golly, are really innovative
components of the program
that have pushed us forward.
But there's a lot that goes on behind the scenes.
So I'm going to take you on yet again a journey
with another video.
But there are no words.
Just watch.
[music playing]
2050? Well, NAEP will look a lot different than it looks today.
That's a long time and a lot of time to be innovative.
It's amazing how we've been able to make
this many changes in the program
and yet maintain our trends, which
is what we are about, right?
We've done a lot.
We've made a lot of innovations in the program.
We are continuing to be the gold standard,
and yet we have not changed the measure.
In the future, we're going to be seeing
additional innovative, very, very creative activities
for NAEP.
We'll see our HOTs, our hands-on tasks, and our ICTs, our interactive computer tasks, merge into hybrids.
That will be fascinating.
We will see matrix sampling for the first time coming
to the background questions.
Imagine that.
And we'll see the products of our SAIL, our Innovations Lab.
Just like Bell Laboratories had
a sort of skunkworks group in the back figuring out
what to do in the future, we have an Innovations Lab
as well, SAIL-- the Survey Assessment Innovations
Laboratory.
As we've heard already, it will leverage
what we can do with technology in the future.
It will depend on and align itself
with the learning sciences.
It will be creative.
It will be our bridge to the future.
There are many ways to maintain our gold standard.
And many of them happen behind the scenes.
We'll continue to be transparent.
We'll continue to be relevant.
And we'll continue to be the gold standard.
Thank you.
Thank you, Peggy, and now Richard.
Thank you, Alan.
I'm going to summarize a paper I prepared for this meeting
that NAGB has put on its conference website.
Contemporary education policy has narrowed the school
curriculum by holding schools accountable primarily
for their students' scores in math and reading.
This narrowing is incontrovertible,
revealed in surveys and confirmed by logic.
Unless you think that teachers and administrators,
unlike all other actors, behave irrationally,
accountability for only some of the many goals of education
must inevitably cause schools to pay more attention to goals
for which they're held accountable and less attention
to those for which they're not.
Social scientists have documented this kind of problem
in virtually every field of human endeavor: health care,
labor market policy, criminal justice, transportation,
and so on.
And it was most famously summarized in 1979
by Donald Campbell's law of performance measurement,
in which he said the more any quantitative social indicator
is used for social decision making, the more subject
it will be to corruption pressures and the more apt
it will be to distort and corrupt
the social processes it's intended to monitor.
Examples of Campbell's law are widespread.
When Medicare held cardiac surgery practices
accountable for the survival rates of their patients,
physicians turned away the sickest patients
who needed surgery the most but whose survival
rates were lower.
When the Department of Labor held employment offices
accountable for the proportions of job seekers placed
in employment, the offices turned attention away
from training programs for highly skilled,
longer-lasting positions and towards short-term, poorly paid,
unskilled jobs.
And when schools are held accountable
for math and reading scores, less attention
is paid to history, science, citizenship,
physical education, oral presentations,
cooperative learning, the arts, and music.
Designing accountability tools that
require satisfactory performance across a balanced set
of outcome indicators requires a significant federal research
and development effort, which could
build on NAEP's prior experience.
In recent years, NAEP has contributed
to the ongoing distortion in American education
by its heavy emphasis on math and reading
and by its promotion of state-level tests in math
and reading alone.
But when NAEP was developed in the 1960s,
it measured a much broader range of cognitive and non-cognitive
knowledge and skills.
NAEP abandoned that breadth when its budget was slashed
in the 1970s and never restored it.
NAEP was first designed in the 1960s
by a team led by Francis Keppel, John Gardner, and Ralph Tyler.
Tyler proposed that evaluation of education
should not rely exclusively on standardized test scores
but must appraise the behavioral outcomes of students,
since that's what is sought in education.
He recommended that assessments be administered after students
leave school, because material not well taught
may be rapidly dissipated or forgotten.
While agreeing that some skills can be assessed with paper
and pencil tests, he insisted that other objectives
of education, such as social skills,
are more easily and validly appraised through observations
of children under conditions in which social relations are
involved.
Evaluation should include collection
of actual products made by students,
such as paintings or samples of writing.
He suggested that if a school's reading program aimed
to develop students who had increasingly broad and mature
interests, evaluators should assess this
by seeing what books students checked out of the library.
In a 1963 memo to Commissioner of Education Keppel,
Tyler laid out this program.
And early NAEP pretty much followed his recommendations.
To see whether students learned to cooperate, for example,
early NAEP sent trained observers to sampled schools.
In teams of four, nine-year-olds were
offered prizes such as yo-yos for guessing
what object was hidden in a box.
Students could ask yes or no questions,
but all team members had to agree on each question asked.
NAEP then rated the students, and reported,
on whether they suggested new questions,
gave reasons for viewpoints, or otherwise demonstrated
cooperative problem-solving skills.
It reported to the nation on the percentage of students
at the various age levels that NAEP
sampled who were capable of cooperative problem solving.
For teenagers, NAEP assessors provided
lists of issues about which young people typically
had strong opinions.
Students had to collaborate in writing recommendations
to resolve them.
For 13-year-olds, the lists included
topics such as whether they should have curfews
for getting home, and for 17-year-olds, the age of
eligibility for voting, drinking, and smoking.
NAEP rated students on whether they took clear positions,
gave reasons for their viewpoints,
helped organize internal procedures,
and defended another's right to disagree.
Early NAEP understood that teaching civic responsibility
involved more than having students
memorize historical facts.
So in 1969, during the Civil Rights era,
the assessment asked teenagers what
they felt they should do if they saw black children being barred
from a public park.
NAEP reported that 82% of 13-year-olds
and 90% of 17-year-olds knew that they should
do something constructive, such as tell their parents,
report it to a civil rights or civil liberties organization,
write letters to the newspaper, or take
social action such as picketing or leafleting.
The early versions of NAEP also assessed 17-year-olds' ability
to consider alternative viewpoints by asking them
to state arguments both for and against a heated public issue
of the time, such as whether college students should
be drafted.
It asked 9- and 13-year-olds whether something reported
in the newspaper could be untrue.
To assess commitment to civil liberties,
NAEP asked teenagers if someone should
be permitted to say on television that Russia
is better than the United States,
that some races of people are better than others,
or that it's not necessary to believe in God.
The assessment reported the discouraging result
that only a small minority of the teenagers
thought all three statements should
be permitted to be said on television.
The early NAEP program also assessed
personal responsibility.
17-year-olds were asked what to do if, when visiting a friend,
they noticed her six-month-old baby was bruised.
The correct answer was, suggest that your friend
call her baby's doctor.
Incorrect choices included, ignore it
because it's none of your business.
A follow-up prompt said that at a later visit, bruises remain,
and you are now suspicious that your friend may
have hurt the baby.
Students were asked what to do now.
The correct choice was, call the local child health agency
and report your suspicions.
Certainly if school systems were evaluated by such results,
not simply by math and reading scores, incentives would shift.
National reporting of low scores on the civil liberties
questions could spur demands that schools do a better
job on citizenship education.
Then the incentive to drop cooperative learning
in favor of test prep in math and reading would diminish.
Returning NAEP to its mission of being the nation's true report
card will take time and care, because the problems
are daunting.
Observations of student behavior are certainly
less reliable than standardized tests of basic skills,
so we'll have to accept that it's better
to imperfectly measure a broad set of outcomes
than to perfectly measure a narrow set.
We'll have to resolve contradictory national
convictions that schools should teach citizenship and character
but not inquire about students' and parents' personal opinions.
To avoid new distortions, we'll need
to make tough decisions about how
to weight the measurement of the many goals of education.
The time to start on these difficult tasks is now,
but looking back at the early National Assessment
of Educational Progress can start us on a better path.
Thank you.
Wow, we've gotten enlightening introductions to innovation
at NAEP.
All of this early work is new to me.
And it is interesting how I think
it overlaps with what Randy was saying about what he hopes
NAEP will be doing in the future.
Because many of those characteristics--
looking beyond purely cognitive rote memory
and other straightforward facts
toward more behavioral activity-- do take place
in the newer assessments.
And I want to close by giving one example of that.
And that is the TEL-- the Technology and Engineering
Literacy innovation.
Just as an example of the things that all three
of the prior speakers have described
about the opportunity and the possibility
for NAEP to innovate.
Again, from my point of view, it's
an interesting constant creative tension
between sticking to your knitting,
and maintaining the gold standard
of educational assessment that way, and innovation,
looking for new ways to improve what we assess
and how we assess it.
NAEP has long included science, and within science has always
been some helping of technology, usually
as applied science, perhaps a little hint
of engineering in there.
But there have been continuing calls, for at least the last
30 years, to assess what American students know and can
do more broadly in STEM, technology and engineering
in particular.
The rise of the Information Age and the knowledge economy
brought many factors that depend on both a workforce
and a society comfortable with information
and communication technologies, with a constant
and rapid pace of change, with an expansion
of the digital environment, and with the ability to create
and use new devices, materials, and engineering designs.
The existing NAEP science assessment
just barely scratched the surface
of this entire realm of experience and learning.
The National Academy of Engineering
produced a series of reports starting
with Technically Speaking about the need for engineering
literacy.
And then Tech Tally about the need
to assess engineering literacy in the general public,
in school kids, in teachers.
So, should NAEP undertake a new assessment
in this area, which would undoubtedly be expensive,
would take time, money, and attention away
from our knitting?
Would this push NAEP into existing fierce debates,
such as whether technology really means computers
or whether technology is much broader
and includes metallurgy, and civil engineering,
and structural engineering?
How could we assess what students know about this?
Because the field of technology has
to do with things, not just words and ideas.
How much technology would we have
to use to assess technology literacy?
And would the uneven availability
of technology across the country bias the test results?
Finally, suppose we created such an assessment.
Who gets blamed if people don't do well?
Because very few schools have full time,
dedicated engineering teachers in the K through 12 system, so
who's responsible?
Well, I think everyone in this room
probably knows that NAGB decided that this was one innovation it
just had to do, with all of those risks
duly noted and cautioned.
And it was a risk.
In fact, it's still a risk.
In my last meetings of the board we
argued about whether maybe we just had
to postpone the TEL assessment.
Because there was the budget sequester, and the next year's
budget is looking tight, and maybe we
should just push it off into the future.
We didn't, at least as of my last meeting I attended.
So NAGB began work, before actually I joined the board,
on a framework for assessing Technology and Engineering
Literacy.
And about that time the National Research Council
came out with its new framework for STEM education, K
through 12, which called for new standards, new curricula,
and new assessments.
And to nearly universal applause launched
a project called the Next Generation Science Standards.
Somewhat more controversial, but being adopted by states now
and generally agreed by at least science teachers
to be a huge advance over anything we'd had before.
And now there's a movement to create assessments.
Can it be done?
Yes.
NAEP has just done it.
It's rolling out this year at the eighth grade level
and it will certainly be the guide
for all the new assessments and new curricula
that are developed to deal with Technology and Engineering
Literacy.
So I think we have actually a pretty good balance
between innovation and sticking to our knitting.
We are sticking to our knitting, but those new carbon fiber
knitting needles and that wonderful new nano
coated yarn, that's waterproof, it's
improving every single sweater we knit.
I'd like to now open the floor for comments and discussion.
And we can start with members of our panel,
if you have questions you would like
to ask other panel members?
I have a question.
Please.
Richard, I found your presentation
on the beginnings of the NAEP reporting very interesting.
Often we are asked about some of those reports out
of those assessments from years ago,
because they're really quite popular.
They were more diagnostic, and they were more thoughtful,
deeper in their meaning, in many ways.
But we didn't have a lot of skill reporting back then.
It was more item reporting.
And you did a really good job of giving us
a hint of what was in those assessments.
Do you think they were just as effective?
I know people are still asking for them,
but what do you think about their impact?
Well, as I hope I suggested,
I think it's a shame that it was abandoned.
It was abandoned first for budgetary reasons.
But then when the budgets were restored,
NAEP's focus shifted to a more statistical treatment
of the basic skills, not to spending money
on the kind of field observation of these other skills.
The early reports are fascinating.
I saved copies of them, which is the only way
I was able to write this analysis.
But it did include skills.
They also had tests.
They assessed students-- as you know, the early NAEP assessed
9, 13 and 17-year-olds, no matter what grade they were.
It wasn't 4th, 8th, and 12th.
No, we still did that.
Well, yes, in the trend NAEP, yes.
The assessors would go out to the community
and find 17-year-olds who weren't in school.
Well, that's very true.
Give them the assessment, as well.
As Ralph Tyler recommended, he was
concerned about the decay of learning in schools
and so early NAEP also had a 26-year-old assessment,
in which they gave the same assessment that they gave
to 17-year-olds to 26-year-olds out in the community,
to see if those students retained
what they had learned in school.
So the early NAEP didn't exclude skills.
It did have math and reading tests, paper tests.
But it also encouraged a much broader look
at what the schools should do by assessing
all these other areas.
And the way it assessed them, the way it reported them--
I think this is important.
It thought of itself not as a standardized test, so much,
but as something like the census.
I mean, nobody passes the census, or fails the census,
or is proficient in the census.
The way NAEP reported these things was simply
the percentage of students who were able to cooperate,
or the percentage of students who recognized
that unpopular things should be able to be
said in the newspaper.
It didn't give anybody an overall score.
And the public was then able to determine whether schools were
doing a good or poor job in these areas.
I think it would be-- obviously the reason I
gave this presentation, I would hope
that NAEP would return to that kind of thing
because one of the problems we have in the current K
through 12 system, in my view, is a narrowing of focus
towards basic skills and the diminution of attention
to these other important things.
Now, NAEP, of course, is a sample test.
And it doesn't have the kinds of corrupting consequences
that high stakes tests do.
But by putting so much of a focus on math and reading,
it's setting a tone for the national discussion which
I think is not helpful.
OK, I'd like to open the floor for questions or comments.
Yes, in the back, please.
Hi, I'm Arnie Goldstein from NCES.
And I work with Peggy.
And I was also very interested in what
you had to say, Dr. Rothstein.
My question is, how do you address these issues
in a questionnaire that goes to students
in the face of the feeling that government questionnaires
or survey questionnaires are intrusive into values,
that somehow these instruments try
to affect the values that students hold?
That is a big obstacle, I feel, to our
asking those kinds of questions.
How do you feel about that?
I absolutely agree with you.
That is a big obstacle.
It's a big thing that would have to be fought if NAEP went back
in this direction.
NAEP, as everybody here knows, is a sample assessment.
So there's not the possibility of influencing students.
But clearly, if we want to teach values in schools--
and we do teach values in schools.
We want to teach citizenship.
We want to teach the value of the First Amendment.
How can you teach the value of the First Amendment
without asking students if they understand that you should
be able to say something unpopular in the newspaper?
And I think that the notion, which you're
absolutely correct is abroad in the land,
that somehow schools should not teach values
needs to be fought against.
It doesn't exist so much at local school boards,
but in the national conversation,
which is much more political and much more partisan,
you get this kind of thing.
And it's got to be taken, I think, head on and confronted.
Because we cannot have a school system,
and we don't want a school system--
nobody in this country wants a school system in which we
don't teach these values.
Controversial values, perhaps-- but the First Amendment,
clearly everybody thinks we should teach that.
I found your comments about the early days of NAEP something
I want to know more about and give more thought to.
Peggy, you have a statement, if I got it down correctly,
that maintaining trends-- you spoke
about maintaining trends, which is
what we are all about, right?
And you ended your statement--
With a question.
--with a question.
And I've come to believe that while on one
hand, our long term trends-- I guess
we have three NAEPs, in a way, but that long term trend
information, I've supported that for a long time
because frankly, it was-- I thought
it was good to be able to look parents in the eye
and say, you know, on average your eighth grader knows more
math than you did when you were in the eighth grade.
Or reads better than you did when
you were in the eighth grade.
And there was some value to that in terms
of people who thought education in America
was just going down the tube.
But in the day in which we live now,
maintaining trends for more than a few years
seems to me to be increasingly irrelevant.
When major motion pictures are only out for four or five
weeks, or they come out on Netflix the week
after they're released, the American public
has a different psyche.
And I'm beginning to think that maintaining trends, which
is what we're all about, is not right, not for more than a much
more limited period than we have thought before.
And that it's OK if innovation causes us to break those trend
lines much more frequently than we had thought in the past.
So I guess that's both a statement and maybe
a question, right?
May I?
Well, I can say that it's certainly
getting harder to do that.
When I joined the program, maybe a little
after I joined the program, we started
looking at those science items on the long term trend.
And there was a question about, well,
what if man goes to the moon?
Well, man had already gone to the moon.
What about party lines?
Well, we don't even have party line phones anymore.
But those items were still on the assessment.
When we found them, of course, we
dropped the science assessment, because it
became really not possible to maintain
that trend.
But you make a good point.
Even now, as we start to think about our TBAs,
our Technology-Based Assessments--
how are we going to go into the future with technology,
with the new learning sciences, the new instructional
sciences, really, and think that we're maintaining a trend?
That's a hard question to answer, I think.
I think it's an empirical one.
Do we still have the same construct?
We're convincing ourselves psychometrically
that we do-- but do we really?
So I'll end my response to you with another question.
I'm not sure.
Can I add something to that?
I still hear people who should know better talk about how
the schools have deteriorated, students
know less than they used to.
And I find it unbelievably valuable
to be able to cite NAEP trends.
When I tell people, for example, that the black fourth graders
now do math better than white fourth graders did just 20
years ago, they're astounded.
And unless I can show them the data from NAEP,
I can't refute this notion that there's
been some kind of deterioration.
So I think it's very important to maintain the ability
to show trends over time.
Peggy raises a reasonable point as to what
those trends over time mean.
Because the math that they measured was the math of--
Yesteryear.
However long ago you said, yesteryear.
And mathematics, like all fields of knowledge,
has changed fairly significantly over that period of time.
Societal expectations for what kids and adults need
to know and be able to do have dramatically changed.
But we have the main NAEP as well.
Every-- we have this problem in every field of endeavor,
not just education. The Dow Jones average substitutes
items over time, but we still know
over time what the Dow Jones average is.
So there's no perfect way of balancing both the change
in content area and the trend.
But I think that NAEP has done a pretty good job of balancing
those two concerns, like the Dow Jones average does.
I think one of the things I heard in Mark's question
was that we have the long term trend,
we have the main NAEP, which is really becoming a long term
trend-- so do we need both?
I mean, I think that's the big question.
I'm certainly not going to try to answer that one.
But I think that's the real question.
At what point do we sort of abandon the longest long term
trend and then move into the future with the long term
trend, the current trend that we have now?
My point, I think, is more of an empirical one.
One around the validity of the construct itself.
I mean, are we truly maintaining trend
with the construct as we purport to be measuring it?
I'm just-- I'm not sure.
I would like to thank our panel members and thank all of you
for sticking with us.
Just when the conversation is really getting interesting,
of course, that's when it's time for a break.
We have a 10 minute break, and then you
have a great challenge.
You must decide whether to come back to this room at 2:30
for "Should NAEP Go To College?" or should you
head for the blue room and learn about opportunities,
challenges, and changes in assessing special populations.
So before we get started with the panel,
because I think there are some-- I think the panelists feel
a little differently about whether NAEP should go to college--
we're going to show a very short video.
Because the whole issue of academic preparedness
is one that NAGB has been interested in,
has been researching, actually, since 2004.
And I think we heard earlier that by 2014,
this spring, we're actually going to report out on that.
So with that, I thought we'd show the video
and then we'll get started with our panel.
Thanks.
[music playing]
Our nation is at a crossroads.
Today's global marketplace demands
that our nation maintain its competitive edge, especially
in innovation and technology.
Our students must have the right skills to compete,
and post secondary education and training
are central to this goal.
But we have lacked a national measure
to tell whether our 12th graders are academically
prepared for that education and training.
Everything depends on education.
I mean, it's the keystone to our communities.
Our economy now is one of global proportion.
We compete with countries across the world.
That's why the National Assessment Governing
Board undertook a comprehensive program of rigorous research
to determine whether the National
Assessment of Educational Progress, NAEP,
our only source of nationally representative
12th grade achievement data, can also tell us
whether high school seniors have the knowledge and skills needed
for that next step.
NAEP is uniquely positioned to report
on the current status of all 12th graders
across the country.
We need to know (a) what we expect of them
and (b) where they are relative to that expectation.
And that's what I think this work will help us do.
We're really talking about whether or not
kids know enough academically to move forward in life.
We've got huge achievement gaps across the country.
And we know that even with the best of situations,
not enough kids are getting through K
through 12 able to go on in life.
The Governing Board's journey to measure how prepared America's
12th graders are for post-secondary education
and training began in 2004.
That's when a blue ribbon panel affirmed
NAEP's unique potential to report on preparedness.
In 2005 and 2006 the 12th grade reading and mathematics test
content was reviewed for its match
with college level expectations.
A high degree of match was found, with only minor changes
made for the assessments planned for 2009.
During 2007 and 2008, a board-established technical
panel developed recommendations for a program of research
to test whether NAEP could support claims about 12th grade
academic preparedness.
Contracts for the research began in 2008
and were carried out in connection
with the 2009 12th grade reading and mathematics assessments.
In 2009, the Governing Board established the NAEP 12th grade
preparedness commission to help spread the word
and seek counsel from key stakeholders.
The commission hosted a series of symposia
across the country with leaders in education, business,
civil rights, and government to highlight the urgent need
to prepare students for higher education and job training
and to share the Governing Board's research.
Substantial media coverage, including op-ed pieces,
resulted.
Now the results of the research are in,
confirming that NAEP is a valid measure
of 12th graders' academic preparedness for college.
In spring 2014, for the first time,
NAEP will report 12th graders' academic preparedness
for college based on the national results
of the NAEP 2013 assessments in mathematics and reading.
Now, the governing board is conducting a second phase
of preparedness research to confirm the initial results,
to expand reporting to the state level,
to study the potential for reporting
on track to preparedness at the eighth grade,
and to continue to review the feasibility of NAEP reporting
on academic preparedness for job training.
Ensuring that all students graduate from high school
academically prepared for college and job
training for the 21st century workplace
is imperative to our country's national security
and economic well-being.
In an increasingly competitive world,
the Governing Board will continue
to focus on academic preparedness
and carry out additional research
to refine our knowledge about where
12th grade students stand.
This will help our country's leaders, educators,
and policymakers better provide the opportunities
to develop the needed skills for these rising
generations of citizens, surely our nation's greatest resource.
Terrific, OK, we're going to start our panels.
Each panelist is going to have five to seven minutes.
And if you see me sort of do this, then it's sort of time.
And then we're going to make sure that no matter what,
that by 2:10 we'll open it up for questions.
Maybe even before that, but certainly by 2:10,
so get your questions ready.
So Gary, you're going to lead us off.
Thank you.
Well, thank you for the opportunity
to talk on this important topic.
I heard Cornelia Orr say earlier to think what it would be like
if we did not have reliable information about what students
are learning in school.
Well, she's referring to elementary and secondary.
Well, that is the state of the art in post-secondary.
We don't have reliable information
about what students are learning at the post-secondary level.
And there is a lot that is not known.
There was, in fact, a previous attempt for NAEP
to go to college.
This was about 20 years ago and was an NCES initiative.
There in fact is a report on this
titled "National Assessment of College Student Learning:
Identifying College Graduates' Essential Skills in Writing,
Speech, Listening and Critical Thinking."
What we did with that report-- it actually
came out of the goals panel.
In the winter of 1990, President George H. W. Bush and the state
governors announced six education goals.
Every child will start school ready to learn.
Skip on down-- the third one was,
American students will leave grades 4, 8, and 12
having demonstrated competency over challenging subject
matter.
That language morphed into some of the language
used by NAGB for the achievement levels.
If you go down to number six, every adult American
will be literate and will possess
the knowledge and skills necessary to compete
in a global economy.
One of the objectives of goal six
was to substantially increase the proportion of college
graduates who demonstrate an advanced ability
to think critically, communicate effectively, and solve
problems.
That became the essence of what we
tried to do 20 years ago to get NAEP to go to college.
In the fall of 1991 and '92, NCES sponsored two study design
workshops with national experts.
The workshop participants were assessment experts,
institutional researchers, faculty,
and administrators and policymakers.
We also sponsored a number of position papers related
to general skills: critical thinking, problem
solving, and communication skills.
We also conducted four public hearings
and a survey of over 600 participants, mostly from
various institutions, trying to see if we could come up
with a specific set of skills that could be assessed.
The bottom line is there were a lot of behind-the-scenes
objections from the higher education community.
Ultimately, the project did not go anywhere and was killed.
But I do have a nice report on it if you'd like to see it.
Why is this a good time for NAEP to go to college,
to develop a collegiate assessment?
Well, there is this issue of the validity
of the college and career ready standards
that are being used in high schools.
Also, it's a natural extension of NAEP's preparedness
initiative that you're already working on.
But there is some data that's coming out
that's partially related to whether or not
everything is as rosy as we believe
it is at the post-secondary level of what students know
and can do.
PISA, for example, reported in 2011
that the US has slipped from rank 2
to rank 13 in college graduation rates.
This is not due to the fact that we're
graduating fewer students.
It's because the other countries involved in the study
are graduating more.
Also, the recent PIAAC results, which
is an assessment of adult literacy,
conducted in 2012 in 23 countries with 16-
to-- that should be 24-year-olds, not 14-year-olds,
in the US-- it assessed literacy, numeracy, and problem
solving.
For this age group, which is more or less your college-age students,
literacy was below the international average
in the US.
Numeracy was below the international average.
And problem solving technology was
below the international average.
So in other words, because we are relatively
ignorant about what students know and can
do at the post-secondary level, there's
a lot of beliefs floating around.
But empirical data don't necessarily
support those beliefs.
So what would a NAEP collegiate assessment look like?
Well, it obviously would be called something like a trial assessment.
It would be administered probably
at the end of the second and fourth year of college.
Hopefully it would be computer adaptive,
certainly by the time this would get off
the ground you would want to have it computer adaptive.
It would probably contain generic content,
such as what you're assessing in grade 12.
It should be vertically linked, of course, to grade 12,
so it would naturally go right into the post-secondary world.
It would have a longitudinal component.
And you would not identify individual institutions,
just as you don't currently identify individual schools,
but you would compare groups of institutions-- community
colleges versus four year colleges, Ivy League
versus non-Ivy League, et cetera.
How could you go about bringing this about?
This obviously would be something
that would require Congressional authority.
There have been, in my view, two huge changes in NAEP
over the years since it started in 1969.
The first one was in 1983, which came about
through the procurement process, and a document
called "A New Design for a New Era."
And this is where we went from basically reporting on items,
and NAEP was in kind of a research mode,
to reporting on scales and reporting
on the condition of education which we're still doing today.
The second big redesign came about as a result
of the Alexander-James panel that
was used as leverage in the congressional authorization
to create a whole new design of NAEP.
And that was how we came about with the trial state
assessment, assessing private schools, and the National
Assessment Governing Board.
I would think that probably something like that
could also be used in the future to bring about a NAEP
collegiate assessment.
Thank you.
[applause]
Ben, you're up.
Thank you very much.
And thank you for being here today.
I'm really happy to be on this panel.
It's a really important subject, and I
think a very interesting one.
I was worried about my time, but I
think having listened to Gary, I can probably cut my remarks
in half, since there were a number of commonalities.
I'll just start by reiterating I think something that is really
a simple but profound thing, which
is, when we think about the question of taking NAEP
to college, I think the background is that we just know
very little about the core activities
that college is supposed to be about, which is learning.
Beyond some imperfect measures, we
don't know how much students are learning in college.
And I think the context is very important,
because this is something I think
that is hugely important at a time
when we're seeing a really big push for understanding what
kind of bang for the buck we're getting for the huge sums
that we're spending on higher education as individuals,
as families, as taxpayers, as policymakers.
And there's a pretty widespread view
I think that we need to improve outcomes at a time
when there's huge demand, when we have limited resources,
and we have a lot of concerns, of course, about costs.
In other words, we have to raise productivity.
And that means there's greater attention
than ever to all kinds of metrics about colleges
and universities, persistence rates, graduation rates, labor
market outcomes, cost per degree, loan default rates,
and the list goes on and on.
And while of course there's a lot of challenges surrounding
preparing high quality data around these measures,
I think it's particularly hard for us
to know in a broad-based comparative way
what knowledge students gain in college,
whether that's in generic skills or in specific subjects.
And this has real consequence.
Just to take one example, but a very big example,
all the experimentation that's taking
place right now in the world of higher education as in K-12,
with using educational technology as a means
to widen access and to improve return on investment.
None of that can be successful unless we
have some kind of a common yardstick by which we
can measure success or failure.
And I do think we've seen some promising efforts,
imperfect, as I said.
One is the collegiate learning assessment, the CLA,
which many of you know, it's been around
for more than a decade.
Initially, it was measuring writing and analytical skills
for samples of students at hundreds
of colleges around the country.
Now, they have a new instrument that
is tracking the progress of individual students called
the CLA Plus.
And as you I'm sure know, the CLA
was the basis for a very well publicized and controversial
study that came out a couple of years ago, Academically Adrift,
that found very disappointing levels of student learning
among the students that were tested
in a range of four-year colleges and universities
around the country.
They found that 45% of undergraduates
experienced no meaningful improvement
in their critical thinking, reasoning, and writing skills
during their first two years of college.
And by the end of college, more than 1/3
didn't see any statistically significant gains.
And of course, inevitably, the study's methodology
has been very controversial.
I don't know if Cliff will have time to get into it.
There's a lot to be said.
But I think that in some sense, I
view these kinds of things as a work in progress.
What's particularly important is that somebody's
trying to do this in a way that has
a lot of sophisticated people behind it, even as they're
inevitably going to take some time to learn to get it right.
But they're making the effort.
And I think a larger problem is something
that Gary just alluded to in talking
about the failure of the first effort
to create a NAEP for college.
A real problem with these kinds of measures of learning
is the very strong resistance from institutions
of higher education to having scrutiny
from the outside world.
I think we're seeing more and more of this
in this period of ferment and change
as people are trying to do things differently
in higher education.
And I think there are some places-- I can certainly
see my own institution, SUNY, is doing
some great things around online learning and so forth.
But there are a lot of places that are very resistant.
And the schools that do administer
the CLA, for example, have generally
been unwilling to make the results public.
And they also often don't report them in ways--
if they do make them public, they
don't report them in ways that are easy to understand.
And the same is true for other efforts
to measure student learning.
There was a new study recently from the National Institute
for Learning Outcomes Assessment.
The study was led by George Kuh, who is sort of known
as the father of NSSE, the National Survey of Student
Engagement.
And this new study found that there's
a growing use of surveys like NSSE, also
things like learning breaks, portfolios,
classroom-based assessments.
And all that shows a desire of faculty members,
according to the study, to capture student performance
in the context where teaching and learning occur.
Well, that all sounds pretty good, right?
But colleges say they use these assessments really primarily
to measure internal improvements,
to try to help basically do self-study.
But very few make them public.
Less than 1/3 post the results on their websites.
And I would also note that almost none
of these initiatives really measured learning directly.
They measure certain kinds of practices
that are believed to be associated with learning.
So NSSE will survey students and say, how many papers did you
write that were more than 10 pages?
Well, that sounds good.
But that doesn't tell you whether the students
know how to write.
I think that's an important factor
to always take into account.
I'll just say briefly, there are also of course
many states that are doing things around assessment.
Missouri, Pennsylvania, South Carolina, for example,
are all requiring students take exit exams
before they graduate from college.
And often, this is not designed to measure individual students,
but to measure the performance of the colleges themselves.
But so far, the benchmarks are being set pretty low.
And we still do not have a standardized measure
that would tell us how state post-secondary systems stack up
against one another.
And of course, we don't have anything
that will tell us how all the nation's college students are
progressing over time, as we do with earlier generations.
So in short, I think there's a lot of need for something
like NAEP at the college level.
Inevitably, of course, this would raise umpteen questions
and challenges.
Some of it's around methodologies.
Some of it, as Gary said, is around,
do you measure generic skills?
Do you try to measure subject-specific skills?
Who do you compare?
State flagships, comprehensive, historically black colleges
and universities, for profits, et cetera.
So there's a lot of questions.
But the fact that there are questions just to me
makes it a very interesting and provocative challenge.
It's not a reason not to go forward.
And you've already talked about the earlier effort, which
I have looked into a little bit.
One thing I came across was Emerson Elliott,
who was then the acting commissioner of NCES.
In the transcript in one of these reports,
he was quoted as saying, "NAEP Goes to College
is a very catchy title.
It's worth a book at least.
But I think we're not quite ready to use it."
And of course, it didn't happen.
Now, what has been happening, and I
think Cliff really knows a huge amount
about the international scene and will probably tell us
about it, but there has been some interest, particularly
at the OECD, around doing learning comparisons
across countries.
There's been an initiative for the last six or eight
years called AHELO--
I'm going to talk about it.
I'll be quick.
The Assessment of Higher Education Learning Outcomes.
And it sounds as though Cliff is eager to tell to you more
about it.
But I think the key thing here is
to understand that there really is some appetite
to create benchmarks and to figure out maybe not how
individual students are doing, though that would be desirable,
maybe not even how individual institutions are doing-- I also
think that would be desirable-- but how nations are doing,
vis a vis one another, vis a vis their previous performance
in reading, writing, and analytical skills,
and also perhaps in something like economics and engineering,
which are the two subjects that were chosen for the AHELO
initiative, precisely because they were viewed as not having
perhaps as much cultural variability,
so they thought it would be more reasonable to do it
across countries.
That initiative, it's not clear where it's going.
I think we'll hear more in a minute.
But all that I think tells us that there
is a real appetite for doing more
with this hugely important sector, where we've
done a tremendous amount with access.
I think we can be proud of this country, and around the world,
there's more to do.
We've come a very long way.
What we have not done a good job with
is creating good benchmarks for quality and for improvement.
I think that's the next step.
With that, I'll turn it over.
Cliff, you look like you're ready to roll.
Let us have it.
I'm delighted to be with Gary and Ben this afternoon.
Gary, first, with all due respect,
graduation rates are not an indicator of student learning.
I think you would agree with that.
And in fact, when you start looking
at comparative international graduation rates,
OECD has a Swedish graduation rate of 69% at six years.
The Swedish ministry in Stockholm
has it at 56% over nine years.
And that's for the university sector only.
My question to all of you always is,
who do you believe about Swedish graduation rates?
Paris or Stockholm?
That's a no-brainer.
So I've written a study about this.
It's called the Spaces Between Numbers.
You can get it online.
To date-- let's start from the top-- NAEP
has conducted studies keyed to 12th grade preparation.
You've done that, and with occasional probes
of first-year college performance.
And as I understand it, you now are thinking seriously
about extending NAEP all the way to the end of the bachelor's
degree.
And to make quantitative statements
about the summative cognitive attainments of undergraduates,
no matter how many combinations of institutions
they attended, and more than 50% of our students
attend two schools, 30% attend three,
no matter what age they started at, whether it was traditional,
right out of high school, 29, 37,
as your brother-in-law does, whether they came out
of the military, and/or experienced other life
events that would be intervening variables in terms of where
they wound up at the end of the line and what they learned
and which of over 4,000 major fields
they studied at either the associate's degree level
or the bachelor's degree level.
And you should remember that we give
many kinds of associate's degrees
and many kinds of bachelor's degrees.
So you want to do all of that.
And I hope you understand just how complex a territory this is,
a territory of variables
and intervening experiences you'd have to account for.
And it's much more complex than pre-collegiate schooling.
Bachelor's degrees are a very complicated environment.
High school diplomas are not.
So you just have to remember that, even if I supported it.
As for NAGB's intention to offer yet
another set of metrics demonstrating
how dumb our college graduates are,
which is what lots of self-flagellating pundits
like to hear, even if I supported
that, in terms of the myriad kinds of degrees and subjects
that are awarded,
I would not enter this swamp under any circumstances
whatsoever, but for very different reasons than that.
Number one-- the world has developed
qualification frameworks that spell out
degree-culminating proficiencies that students must demonstrate.
And there's a range of specificity
of those statements.
Some of them are very generalized.
Some of them, like the one in the United States
I'm going to describe to you, which I'll put money
on the table very few people in this room are aware of,
are highly specific.
And it's unclear whether these proficiencies are a guide
to instruction, requirements for whether degrees
will be awarded, or wish lists.
If you remember your English grammar,
we have declarative statements, we have subjunctive statements,
and we have imperative statements.
And it's not clear, when you read
these around the world, what they are,
what kind of statements they are.
We've got them in the 47 countries in the Bologna
process in Europe, from Cork to Vladivostok.
We've got them in Australia.
We've got them in South Africa.
You've got them in Ontario.
And you have them here, in a document called the Degree
Qualifications Profile, which was first
issued in January 2011.
An iterative process, following exploratory work by 400 institutions
of higher education in this country,
gave us a second edition, 2.0, which
was posted online at the Lumina Foundation website last month.
I'm giving you the web address for it, the URL.
And I'd like you all to go online and read it slowly,
everything in it, including the appendices.
That's for those of you who know nothing about it,
because what the degree qualifications profile did,
inspired by work in the European arena and other places,
was something very different, appropriate to the US,
and we did it better.
We did it better than they did.
So I'm urging you, read it slowly.
And for this particular reason, because
in none of the other qualification
frameworks in any other country does assessment and testing
play a role.
But in the degree qualifications profile, it does.
It's central, because there are in there--
and I think I have this right-- I'm
one of the four authors of the degree qualifications profile,
so I think I know it.
There are 20 proficiencies.
We don't call them competencies.
We dropped that word, because not too many people
understand what they are.
And people objected that competence
was a minimum threshold.
We are making statements about what
you have learned at the point at which I'm about to give you
a piece of paper for a degree.
For every one of those proficiencies,
20 at the associate's level, 26 at the bachelor's level,
and 22 at the master's level,
every proficiency is ratcheted up in challenge level
from stage to stage.
For every one of those, there is a direct logical line
to an assignment, not only to one assignment
but to many assignments in the possible universe.
These assignments are not ones that come in from the outside,
in a NAGB exam.
They are generated by faculty, who then own them, bring them
into the classroom, where they're part of instruction.
And they make a difference in the lives and learning
of students, something that NAGB doesn't do, something
that the Collegiate Learning Assessment certainly
doesn't do, particularly because it's based on samples.
Going along with the generic degree qualifications profile,
internationally is another movement called tuning.
Tuning is the disciplinary version.
Tuning started in Europe just after the Bologna Declaration
was signed in 1999, with nine disciplines and about
15 countries.
What tuning does, it puts together faculty members
from the universities who teach in chemistry,
who teach in nursing, who teach in history,
put them down at the table, and say, OK, first,
provide a series of reference points
that are key to the mapping and delivery of your curriculum.
And then from that, generate sample student
learning outcomes statements from those reference points.
All of these statements-- we do it better than they did--
begin with active verbs that
generate a logical line that goes through the assignments.
So that's what tuning does.
Now, in Europe, it's in 37 countries in 25 disciplines.
It came to Latin America in 2009,
with 12 disciplines, 18 countries, 188 universities.
It came to the US in 2009.
The first three state systems to do it
were Indiana, Minnesota, and Utah.
They each chose two or three disciplines.
Since then, we've had Kentucky with five, Texas,
with four engineering fields.
Montana has just come in.
The American Historical Association is running a national tuning
project to do the same thing, and the Midwest Higher
Education Consortium.
I will also give you the URLs where
you can read what these people produced
and learn how to write student learning outcomes that
lead directly to assignments and assessments.
We've got samples of those assignments and assessments
in the second version of the degree qualifications profile.
My point about all of this, it's not merely
that people are interested, that they've put assessment
at the center of this operation, and that they
don't need an outside operation to come in.
When OECD started AHELO-- Ben, as long as you brought it up--
it started in 2003, with planning at the OECD.
It had a tortured history in there.
Half of it was the Collegiate Learning Assessment,
which also ate up 80% of its budget.
And the other pieces were engineering and economics.
The technical advisory committee has met twice in the past
six months, since the first version of the AHELO results
came out.
And their judgment is that it's a big disappointment,
and particularly the CLA portion of it,
which turned out not to be translatable across borders
and not to issue the same kind of prompts
across borders that would make sense to people.
So it's very, very difficult.
Where it will go is anybody's guess, as Ben said.
He's right about that.
But I can tell that the US is not going to participate.
That decision's been made.
And part of it is the technical aspect of the test,
the same problems with the CLA
that we see here, the reason you said
that schools wouldn't release the data.
But some of them do, in the following form:
my effect size can beat up your effect size.
It's bigger than your effect size.
And nobody understands what that means.
They certainly don't.
It's based on samples of students.
They claim that 200 kids out of 51,000 at Arizona State
can represent the school.
I mean, that's nonsense.
Anybody knows that.
So I'll put money on the table that a lot of people
don't know about these developments.
That's not a problem.
That's not your fault.
It's that they don't get out there.
If Ben doesn't know about some of them,
that means we're not getting the information out.
You didn't mention DQP.
Because I don't think it's as valuable as you do.
Whatever it is, we can be sure that if some of us
are not fluent in it, then the larger policymaking community
is not fluent in it either.
My conclusion is this-- what we've
seen of standardized testing in higher education has nothing
to do with improving the enterprise, and results
in information that could best be described as waste.
To say that my effect size is better than your effect size,
particularly when you pay student volunteers,
which is what the CLA does.
And anybody who knows anything about testing
can tell you about the validity of that
and reliability of those scores.
And no use except in producing numbers
for the pundits who like to use these things.
I guarantee you that the minute a government assessment, which
is what NAEP is, the minute a government
assessment, particularly such as this, walks through the door,
higher education's attempts to improve student learning
outcomes and to validate them and to provide information
about them will disappear.
I promise.
We have been very fortunate so far
as to have the Lumina Foundation and not
one agency of government supporting all this activity.
And they've even kept hands off on it.
But the minute you come through the door
with an external assessment, you can kiss the rest of it
goodbye.
On that note, thank you.
Sorry to be blunt.
That's it.
[applause]
Be thinking of your questions, because we're
going to turn to those shortly.
But what I wanted to do is give Ben and Gary just a moment
to respond to anything Cliff said.
I think you were talking about the Degree Qualifications Profile.
And you said you didn't think they were as good-- just
a moment or two.
And then be thinking of your questions.
And there's a microphone back there.
I think Cliff's passionate objection to this proposal
exemplifies the higher education point of view.
It is what we ran into, except we had, like, 100
Cliff Adelmans when we tried to do this once before.
But it also reminds me of when I went
to a conference in Boulder, Colorado,
sponsored by the chiefs, just before we released
the first trial state assessment.
There was about 100 people in the audience.
We had two presenters and one discussant.
When I talked about the trial state assessment,
I was booed by the audience, almost couldn't get my words
out, because there were so many boos.
I was attacked relentlessly by the presenters
and the discussant.
In fact, the presenters turned into two more discussants.
That was the state at the time.
And now we see it is recognized that there
is a benefit to the state assessments.
It did not destroy local control.
It did not completely eradicate efforts in the states
to do a better job.
And I think that if we did the same
in the post-secondary world, we'd
find the same thing, over time, not immediately, but over time.
Ben, when you responded-- and I'd
love to have people start to get up to ask questions--
internationally, it sounds like no one is doing this.
And no colleges or higher-eds are putting out
a standardized test.
Do I have that right?
I just want to make sure I'm right about that.
That is correct.
So you know a lot about international things as well.
So why then would we do it if no one else is doing it,
not that that's a good reason not to do it?
Without getting into the weeds, I
think on AHELO, that is an effort to try to do this.
That's an international initiative by the OECD,
by reputable people.
Yes, it has not had a great experience.
But I think maybe the broader picture to make about many
of the things that we've talked about-- I'll
mention the degree qualifications framework
just briefly-- I view these, and I
think it's important to view these.
It's like the MOOCs.
It's like so many things that come along.
Yes, there's MOOC hype, but that should not
mean we all turn to MOOC hate.
And I think the same is true of many
of the things that are out there.
I think it's important to have-- I used to work for the Kauffman
Foundation, and I learned an entrepreneurial mindset.
A lot of it has to do with tolerance
for trying things and failing, tolerance
for mid-course corrections, understanding
that not getting it right the first time doesn't mean
that it's a complete failure.
I would say that about AHELO.
It may not be revived.
It may be revived in a different form.
And I would just say very briefly on the degree--
so I think that's point number one.
On the degree qualifications framework,
I'm a big fan of the Lumina Foundation, too.
They're my friends.
And also on the tuning stuff, I view those
as actually very-- Cliff always makes me think,
and I'm going to think even more.
But I view those more as structural changes
around how you create a sequence.
It's a little bit like the standards movement of the '90s,
leading to the testing and assessment
movement in the states, going on to No Child Left Behind.
I think that if you create better pathways
with clear objectives within an institution, whether it's
subject-specific or whether it's more general,
I think that's useful in helping you be a better institution.
That is not the same.
And yeah, you measure student learning
within the institution using testing.
I think that's great.
However, that is not mutually exclusive with having
somebody external come in and do a benchmark test that
helps you compare groups of institutions,
state against state.
Yes, we have high transfer rates.
We also have a lot of mobility in K through 12.
I don't think it's either/or.
I think it should be both/and.
And I think we have to be open-minded.
And I do think fundamentally, there is a real need here
to do something.
And the fact that it's not ubiquitous around the world
doesn't mean we don't need it here.
So let's open it up.
Jim, you're on.
I'm Jim Popham, a board member.
A question for Gary-- I've known you for many years now,
Gary, and during that period, you've
made presentations to a wide variety of organizations.
In fairness, isn't it true that you've
been booed in most of them?
[applause]
That was the question.
Oh my goodness.
Hector Ibarra, I'm also a board member.
It's been years since I watched the 1997 video,
A Private Universe.
It dealt with science questions that
were asked of Harvard graduates.
And if you get a chance, Google it and look it up.
And it would be somewhere on the same lines, NAEP
going to college.
These Harvard graduates just did not
know how to answer the simple questions, such as, what
does the moon revolve around, or the Earth revolving around?
And obviously, they're not science majors.
So they probably didn't really-- I shouldn't say didn't really
care, but somewhere along the line, they never got it.
Cliff, go ahead.
Can I give you examples of a couple of assessment
prompts from the collection we put together
to go with the degree qualifications profile?
Every one of those proficiency statements,
beginning with an active operational verb,
leads to something like this.
Here's a couple of them, one of my favorites.
Suppose a new form of energy had been developed,
which when turned on, would slow the rotation of the Earth
from 24 to 26 hours a day.
Before we can flip the switch, an environmental impact statement
must be filed.
You have a blank piece of paper and 30 minutes.
I want the chapter headings and subchapter headings
of the environmental impact statement.
So what do you do when you get a response?
What is this related to?
It's synthesizing and integrating activity and knowledge
from a variety of sources.
You can't provide a complete answer without that.
That is a constructed response, like PROD,
like the kinds of things that the CLA tries to lay out.
I'm not saying it's perfect, but it's
drawn from an ETS experimental examination that
was given in the late 1970s.
And we can describe where that came from.
That's almost a precise rendition of it.
Here's another one.
Quantitative literacy, you're given
a map of the United Kingdom.
And here's a spot where your aircraft is.
Here are three airfields.
Here's your refueling tanker.
Here are the statistics involving how much fuel
you will use in different maneuvers.
An alien aircraft is approaching a radar station
off the Scottish coast.
It's 3 o'clock in the afternoon.
The weather is closing in.
You are to engage this alien aircraft.
You are to destroy it.
Now, at which air base will you land?
At what time?
How much fuel will you have left?
And you have to have at least 500 kilograms in order to land.
And how long did it take you to solve this problem?
Hang on.
We're not done.
That was a real recruiting ad for the Royal Air
Force in the London Sunday Times in 1985.
What we added to it for the question
was give us an algorithmic statement
for every step in solving that problem.
And that's a bachelor's-level quantitative literacy problem,
and we asked for an algorithmic, mathematical, symbolic statement
to accompany every stage of solving that problem.
That's two of them.
I'll give you a third.
This is in the arts.
It is now January.
We have an exhibit coming up the first week of April.
You are to create five works of visual art.
You don't need to keep it in one medium.
You can do 2D, 3D mixture.
This exhibit of five pieces is to illustrate
the phenomenon of chaos in color.
You are to write an exhibit catalog of no more than five
pages that goes with it that draws on the major color
theorists, Goethe, Kandinsky, Chevreul, et cetera.
That's an Associate's degree--
But Cliff--
Level problem.
Those kinds of things feed into a system
which is validated and put in a series of constructs.
They belong to faculty.
They have an impact on students.
And they improve whatever's going
on in the schools and colleges involved.
Well, that's very nice, but how do
you compare Wisconsin and Missouri?
I don't give a damn.
And how do you compare California 10 years ago--
I don't give a damn.
Well, so then you're saying that it's not a legitimate goal.
As long as you're saying--
This is like the K through 12 people who say,
this is my hurricane unit.
I do my hurricane unit.
Don't mess with me with any state test.
Don't mess with me with any NAEP tests.
I don't know if Richard is still here.
In the good old days, we would just
test people on their understanding
of civil liberties.
Now we do reading and math for god's sake.
I think this is a huge problem conceptually.
I don't.
As I said--
I don't at all.
I think you can do both.
I like the stuff you're describing.
It sounds like the CLA.
I like it.
But that is not the same as saying it is useless.
There is no point in trying to measure the nation's
college students' level of ability.
I definitely disagree with that.
If I can put out the assessment prompt and a sample of student
responses, and say, this is what all my graduating students
have reached in terms of this proficiency.
And I've got 20 other proficiencies I'll show it for,
what the devil is the problem with that?
Why do I need to compare that to your school
and your set of proficiencies?
You can put yours out too.
As long as you show--
My kid's in high school up five miles away.
And he gets nice grades.
And I'm very happy.
But I can't compare those grades to a kid in Alexandria.
So we have standardized tests.
And that's one of the things we do.
I'm sorry.
OK, Gary, you're on because you've
been wanting to say something.
I just want to say this is deja vu all over again.
[laughter] That's it.
That's it?
Deja vu all over again.
Well, how about some other questions?
We must have stirred it up out there.
Have we not?
By the way, at Bachelor's level we add to the catalog
an essay on the way in which technology has changed both
our evaluation of the chemistry and the digital nature of color
palettes.
OK, here's a question.
Yes, good afternoon.
I'm Jennifer Presley, who's been in Higher Ed
all her adult life as a researcher.
What I'm hearing is-- what troubles
me about this whole discussion is
we keep trying to test quality in
rather than worrying about what the inputs are.
And if we produced high school graduates who are actually
going to be able to function well in college,
we wouldn't need to be worrying about what
our college graduates can do.
And so aren't we just extending our willingness
to accept low levels of learning, still promoting
people, giving them more chances,
instead of saying at some point you have to know something
to move to the next step.
Gary or Ben?
I want to be sure that I understood the question.
Yeah, I'm not sure either.
My question is that in other countries--
I didn't go to college in England, although that's
where I grew up-- you knew that someone who graduated
from college at a particular point in time
had the attributes of a critically thinking adult.
We have expanded Higher Ed to the point
that we're concerned about what college graduates know.
I was the leader of the research for the University of Wisconsin
Board of Regents for several years.
I remember being at a board meeting
25 years ago where the Board of Regents
were asking about what do our college graduates know.
And they wanted to test quality in at the end.
And we're still having that same conversation.
Shouldn't we--
And so is your point-- just to get it.
Are you saying that we shouldn't have
to assess because-- what I thought I heard you saying
was that by high school, if we do our job right, that students
should come in knowing a certain amount.
That is what I'm saying.
And then when they go to college there shouldn't be a need--
That is what I'm saying.
There shouldn't be a need to measure it on the other end.
And to the extent that we keep buying into a system that
pushes lack of performance further and further down
the education path, we're not creating real pressures
to succeed at earlier levels.
So what I heard-- see if I'm right.
What I heard her saying is that we wouldn't
need to measure what students are learning in college if we
could be certain that, when students got to college,
they had a certain minimum of proficiency.
So we wouldn't have to measure it.
So that's-- I gather was the premise.
Yes, that's right.
Well, I think because of NAEP we know
that students, when they go to college, are not prepared.
We also know 40% of the students have
to have remedial courses when they go to college.
In the past, when you look at states-- the states,
I think of them as institutions.
And the analog in higher education
is the institutions of higher education.
There is a lot of misinformation in the past in the states.
For example, in '87 when John Cannell did his report,
we found that all 50 states were above the national average.
Now, that was a reality.
All 50 states were reporting they
were above the national average.
Under No Child Left Behind, you have
states that have 98% proficiency by setting the lowest
standards in the country.
So you need an external benchmark.
The states and policymakers need an external benchmark
to help navigate through this information
in order to set good policy.
I say you need something like that
in institutions of higher education.
What if you find out that students leaving college
don't read any better than when they first entered college?
Or what if you find out that students at Harvard
don't read as well as students at the University of Maryland?
That's pretty important information
you'd like to know, which you will never
know unless you do a collegiate assessment
in a standardized way by a reputable group, like NAEP.
So Cliff, you're going to get the last word for 15 seconds.
And then we're going to close this out.
It's very important what all this movement means.
It means that people are trying--
to use the voices of the English language-- in a non-imperative
way to get all students to meet competencies.
It's a shame we don't use the imperative.
As a writer, I fought for it and I lost.
In other words, you don't meet the competencies,
you don't get a degree, no matter what your GPA is
and how many credits you have.
I mean, that's one thing.
We're doing this, though, in this country; nobody else is.
Thank you.
Well, was that lively?
Or was that lively?
Let's give a round of applause for our panel.
[applause]
So we're going to take a five minute
break, five, seven minute break.
But don't go away because the finale is coming.
And as we heard, Acting Deputy Secretary Shelton's
going to be here.
We're going to hear from a panel about the last 25 years.
And then Roberto Rodriguez, the President's Education person
is going to be here to talk to us as well.
So don't go away.
Take five to seven minutes and then be back
in your seats for the finale.
Thank you.
Good afternoon.
Oh, there he is.
Here's the man.
Not your fault.
They switched the schedule on us.
What are you going to do?
Hey, you're flexible.
Wait until that kid is born and you
have to get up at two in the morning.
You'll understand.
Good afternoon.
If nothing else, we're flexible.
Jim Shelton from the Department is running late.
They have a few things they're dealing with over there.
So kindly the panel has agreed to switch places.
So we're going to begin with the panel.
And then we'll have Jim Shelton address us when he arrives.
None of these gentlemen need much introduction.
And none of them need much prompting to get started.
And they're all, obviously, very accomplished.
Pat, with all his background, former chief in Delaware
and now Executive Director of the K to 12 Center at ETS.
And I think those of you that know
Pat know he's been a very proactive leader
for a long, long time.
And if you want to know what he's thinking, just ask him.
And the same is true of our other members.
Pat's going to go first.
And then Gene Wilhoit, Gene was the Executive Director
of the Chief State School Officers for a long time.
He was also Executive Director of NASBE.
He was commissioner in both Kentucky and Arkansas.
I don't know what he did in his spare time,
but he obviously has a great perspective.
And then Terry Holliday, who's the current Commissioner
of Education in the Commonwealth of Kentucky and President
of the Chief State School Officers.
And finally, Rick Hess, who is never
shy about expressing his opinions.
And of course, he's the resident scholar
and-- I'm just reading this, Rick.
Resident scholar and Director of Educational Policy Studies
at the American Enterprise Institute.
So obviously tremendous experience, great leadership,
and also opinionated people.
So we're going to begin with Pat.
And really, it's just a matter of this: it's
been 25 years, you've been intimately involved in NAGB, so
take it away with where we've been, where we are,
and where we should be going.
Thank you, Dave.
And thank you for this kind invitation.
I will focus my comments today on a set
of four promising areas where NAGB and NAEP need
to continue demonstrating an essential leadership
role, especially in this, the time of the assessment
consortium.
But first, let's pull back and reflect
on the rich history of NAEP and the nation's report
card from a set of diverse perspectives.
These represent my personal views
based on a set of professional assignments
that I've held over the years.
So first, let's take a long view of NAEP.
We need to recognize that NAEP, in its DNA, is about change.
Certainly, this moment in time of education reform
may be particularly challenging to NAGB
and to the public education community,
but when you look back at NAEP, it
previously, and successfully, re-thought and re-engineered
itself.
In the late 1980s, a consensus emerged,
based on the Alexander-James panel, which was commissioned
by then Assistant Secretary of Education Checker Finn,
and on which I served, and subsequently
endorsed by the collective voice and action of the state chiefs
through CCSSO, that NAEP should become the nation's report
card.
This introduced the realities of benchmarking individual state
performances and trends against national,
peer state, and international comparisons.
A second perspective that we could take comes from my tenure
as US Commissioner of Education Statistics.
From my point of view, NAEP is a rich and bountiful resource
of scientific data determining the condition of education
in the United States.
And that's part of the NAEP message.
To report on the condition of education in the states
and internationally on student achievement.
But as you know, the commissioner
is bounded by a pretty strict code.
We might call it, just the facts, ma'am.
It's a high standard of what you can say and can't
say about the data.
But I must tell you that I found NAGB
to be a particularly valuable partner vetting
the range of topics to be studied, which at the core
is a political determination.
And NAGB is an appropriate policy body
to take on that responsibility, unlike NCES, which should not
get involved in those kinds of decision making.
But let's now turn to the current situation
where NAEP, from my point of view,
is becoming part of the evolution of a national system
of assessment.
But what will be its value added contribution?
The goal of NAEP should always be about leading edge
R&D clarifying our understandings of content
and associated student achievement.
But NAGB is challenged because at this moment
we have a set of next generation assessments emerging
with a set of goals as part of the state consortia
to be forward thinking and to be innovative
in their measurement.
So what does this mean for NAEP in the future?
And with this, I'd like to focus on a core of four topics
that I feel would help NAEP continue its image as best
in class and the gold standard for the assessment of student
achievement.
Leadership can be shown in the areas that follow.
And this could lead us over the next quarter century.
First, NAEP should lead innovations in assessment.
NAEP should guide and confirm the work of the consortia that
will be coming forward in 2015 and beyond.
It should ensure K-12 assessments
measure the right stuff in a rapidly changing world.
And it should catalyze the marketplace
to enhance and propel greater innovations
in the work of the states because the states will
be leading this, either collectively as
part of the consortia, or independently.
So let's turn now to the first item.
And I'm just going to offer you a few comments on each.
In leading assessment innovation NAEP
should provide timely and useful R&D. Your Future of Assessment
report called for the creation of innovation laboratories
that will generate the advanced R&D needed
to address two aspects in particular.
The first one has to do with measuring
the new and complex constructs embedded in the Common Core
State Standards and the Next Generation Science Standards
that states and consortia are currently struggling with.
The second is the use of new technologies
within complex performance items and tasks that
can reduce testing time and provide instructionally useful
information.
Regarding guiding and confirming
the work of the consortia, here NAEP
should develop formal processes, Dave,
for sharing with the consortia the R&D underway.
NAEP should help incubate and shorten the innovation cycle
so the consortia aren't waiting for the delivery,
but they're part of it so they can accelerate their own work.
So that is getting the expert advice as it's developed.
Regarding the confirmation, NAEP would be valuable here
to help us in three ways.
First, we obviously need regular external checks
on the consortia's performance and progress and trends.
Someone's got to confirm that this is real progress.
Second, we're going to need an alternate comparability
yardstick.
If I move from Maryland in PARCC to North Carolina in Smarter
Balanced, how do I know the comparability
of my performance?
But certainly, NAEP must continue
to provide the national benchmarks and trends
on student achievement.
Because remember, neither consortium, nor both together,
represents the whole nation.
And so we don't want to lose our national benchmarks
and indicators.
The third area has to do with identifying the right stuff.
And here I see NAGB working not just with NAEP, but with IES.
Because this is going to require R&D over time
to identify those critical skills that'll be emerging.
What are they?
How do you measure them?
And how do you conduct the needed R&D
to build those learning progressions,
the series of keystone topics in math
that will lead you to success from one grade to the next.
That is serious R&D work.
And IES and NCES must be part of that.
The next thing is that this can then be translated
into a series of recommendations for modifying the Common Core
Standards.
Because we know that they're in their infancy.
And they can be improved.
And finally, from this, we can identify the best practices
that our districts, in Perry and Kentucky
and others are waiting to use in their R&D
work and their implementation work.
The fourth area has to do with catalyzing the marketplace.
By actively leveraging the present transformational
opportunities, we can move from 50 isolated silos
of cost, effort, and expertise to shared platforms.
There may be two, PARCC, Smarter Balanced.
There could be others that emerge for collaboration.
NAGB and NAEP can stimulate that marketplace investment
to improve and expand on the consortia's infrastructure
and tools.
Think of plug and play, new apps, new databases,
new software, being developed to meet
the needs of individual states and the consortia.
This is indeed a brave new world of possibilities
with NAGB's leadership for the future of NAEP.
And in conclusion, I'm drawing upon an excellent report
that NAGB commissioned on the future of NAEP that
says, "we envisioned a new, more nimble NAEP that
can serve as the backbone for an evolving assessment
infrastructure."
Thank you.
And may the force be with us.
Thank you, Pat.
Gene?
When I look at this panel and I try
to figure out what my contribution would be,
I think someone on the Planning Committee
said, let's find somebody who's been around for 25 years.
I have lived this from the very beginning.
And what a difference 25 years makes.
The story behind this is-- and I think
in that first visual presentation
there was a statement to the effect
that states were given an opportunity
to participate in NAEP.
Roy would remember this fateful meeting,
but you may not know that the chiefs were gathered in a room
after multiple conversations and we knew it would be close.
Proxies were given for those who couldn't make it.
And the vote came down to 20-19 that we would participate.
And inside that conversation was intense debate
about whether this is the right thing for the nation or not.
Whether we remained as separate states doing our own thing,
or whether we moved ahead as a group,
was that central decision.
It's hard to imagine today that we
don't have a common system across the country,
that we have accepted NAEP as our national benchmark,
and that it would be virtually impossible for someone
to withdraw from this participation this moment.
But I think progress would not have
been made without several factors.
And I think these are lessons that we
ought to keep in mind as we move forward.
First is I participated in those original framework bodies.
All I can remember is Checker Finn sitting next to me
in that conversation.
Everyone else pales from that body of 20 folks.
But we debated on those frameworks.
And what we found at that point, and I
think a lesson for all of us, is that the states are not
going to accept the continued work of NAGB
unless it's credible, that the work is of the highest level,
that we can look to you for serious consideration,
that there is a common commitment to excellence
in the work that you've done historically.
You have earned a reputation of being the benchmark assessment
for the United States.
That's why the states are supporting it
and that's why the states came on board.
Secondly, NAEP was probably the most influential factor
in advancing the work of assessment within the states.
Many of you know that this phrase NAEP-like
became the marching order of many of the reform states.
I know in North Carolina and Massachusetts and Kentucky
and these states that were engaged
in the early years of reform, this was the goal.
Take NAEP.
Digest it.
Tear it apart.
Put it back together.
Try to make your state standards align.
And in that process we moved from what
were some pretty poor standards documents in the states to some
that were exemplars for the rest of the country.
And in addition to that, it began
to shape the work of the publishers who
were developing those assessments.
Now, that's not to say that we were in good stead
from all of this, but we did have,
out of those early efforts, at least
a few states who were modeling the good work that
was going on.
I think the other major contribution
for me is this idea of transparency.
It was obvious that we had inconsistent practices
across the states, and that we had
many issues that were going unaddressed.
The public was unaware of these practices.
We knew that there were major issues that caused states
to be reporting data that caused confusion.
A couple that I'm reminded of, first of all, exclusion rates.
Some states had the policy of excluding
large numbers of students from their samples.
This came from state documents.
When this became apparent to the public,
then, obviously, it caused the states
to have more consistent policies.
I know that the writing tests were interpreted
very differently in those early years,
and the influence of NAEP over those
made a tremendous difference.
Particularly in the interpretation of reading,
that debate goes on today, but it literally
was because of NAEP that those changes were made.
Ironically, I think the reporting mechanism
to expose state accountability flaws really
prompted the states to emerge as leaders in standards,
assessment and accountability reform.
The famous wall chart was something
that we did not receive well.
There was tremendous backlash from the states
about that release.
I do think there were some legitimate issues
about comparing the word proficiency.
I do think that states did have some legitimate concerns
in those early years about engagement.
But I'm not sure that we would have
gotten to where we are today without that slap in the face.
And so, from that point on, this revelation
of too many inconsistencies, gross inconsistencies,
in state reports and NAEP reports
caused us to move very dramatically
as a group of states, and eventually
come together around common core standards.
I'm not sure we would have done this without NAEP prodding us
in those early years.
It is a major contribution to the development
of education standards in this country.
There were lots of heated reactions, negative reactions,
but I think the conscientious states ultimately
made that decision to move in the right direction.
So, I think your greatest contribution
has been the highest standards you set, the quality
work, pushing yourself constantly, setting a model,
putting up challenges, acting on those challenges,
giving this country a forum for this kind of high quality work.
So for me, we have many issues before us around preparedness.
Your agenda on preparedness, how is
that going to fit within the state context of the consortia
work, and more importantly, the emerging plethora of new exams
that are going to be developed.
We know that our dream of having one or two consortia
is not going to play out in this country.
We're going to have many, many ways
of assessing student progress.
Maybe that's good.
Maybe it's not.
I don't know.
We're just emerging.
But I do know this.
That makes NAEP even more critical in these next two
to three fragile years that we have before us.
I think, too, in these next few months
and years, NAEP is going to have to carve out and maintain
a role of cutting edge assessment policy work.
You can't afford not to be that benchmark.
And how do you keep pace with the rapid changes that
have been unleashed? And believe me,
we're not going to slow down.
This is a major explosion of innovation,
and you are going to have to make sure
that you're a critical part of it.
So for me, these are the things I would put on my list.
One, what's the scope of work of NAEP?
Do you begin to pull back on math and English language arts
and do them less frequently?
Do you bring the rest of the curriculum
to bear on the nation?
How do you, in this kind of environment,
begin to look at the relationship between NAEP
and the Common Core State Standards?
We have learned a lot out of the work of the Common Core
Standards.
I think there are areas where NAEP needs to catch up with
some of the things that came out of the research, that came out
of the Common Core work.
I think the rapidity of change, as I mentioned,
is going to put some challenges to you about maintaining
stability and thoughtful analysis and yet moving ahead.
We're going to get, as I said, into a crowded
field of assessments, and I think
that it's going to require your stability as we move forward.
And then finally, I would just say
that this is, at this point of revolution in our system,
there is a lot of promising work going on out there
that you need to be conscious of,
that aligns with the mandate of cutting edge and leading edge.
The schools, and the districts, and the teachers
in this country are figuring out that they
have a new challenge in front of them.
That we are going to graduate all students.
That we're going to reach these high levels that we have set,
and in order to do that, we're going
to have to turn this system upside-down.
And that means more performance-based learning,
competency-based learning coming forward.
It means more personalized learning for students.
It means a redesign of the teaching and learning
process in our schools.
And without this kind of cutting edge work on the part of NAEP,
we are not going to be able to move
this exciting set of developments forward.
So I'll turn it over to Terry.
[applause]
We're having half time.
As I mentioned, if nothing else, we're flexible.
We ask you to be the same.
Jim Shelton has joined us and is on a very tight schedule.
So with your permission we're going to hear from Jim,
and then we'll go back to the panel.
I think everybody knows Jim Shelton and his career,
but just for the record, early in his career
he was a Program Director for Education
at the Bill and Melinda Gates Foundation.
He managed nonprofit investment portfolios
at organizations like NewSchools Venture Fund and LearnNow,
aimed at increasing high school and college graduation rates.
More recently, he served as head of the Office of Innovation
and Improvement at the US Department of Education,
managing a portfolio that included
most of the department's competitive programs,
such as the Investing in Innovation Fund,
Promise Neighborhoods, and other efforts focused
on teacher and leader quality, school choice,
and learning technology.
And now, it's my pleasure to introduce
Jim Shelton, the Acting Deputy Secretary of the US
Department of Education.
[applause]
Good afternoon, everybody.
Oh, OK.
I know I broke up the flow.
Let's try again.
Good afternoon.
Good afternoon.
First, let me apologize to the people here on the panel.
Thanks for filling in for me when I was late,
and thanks for letting me interrupt you
in the middle of your panel.
I can assure you I'll be short, because everything
they have to say is much more interesting than what
I'm going to say.
But I do want to just take a few minutes to highlight
a few points that I think are really important.
The first is, what I want to do is just
thank everyone in this room for your leadership.
This is an important time in our country
when it comes to education.
NAEP has played an important role
in our history and our ability to start
to move the field forward.
Whether it was, as you have heard many times today
I'm sure, setting the pace for people
to be able to actually craft the first round of the standards
movement, or being able to call out folks, frankly,
on the gaps between what NAEP was saying
and what state assessments were saying.
These things, the role that NAEP has played historically
has been very important to guiding education improvement
in this country for some time now.
The evolution of NAEP to also include
TUDA, also pretty important.
Allowing those districts to start to see how they compare
and the role that they play because
of the disproportionate number of students that they serve.
Being able to benchmark their progress,
not only against the nation, but against each other.
Understanding what is actually acceptable and not acceptable
in terms of performance, and giving some people accolades
they didn't know that they deserved,
and giving other people-- well, taking away excuses
from people who had been using them for too long.
So I want to thank you for all of that.
And I want to acknowledge that this
is something that needs to continue.
It needs to continue for a number of reasons,
despite the changing context of education around you.
The good news is that as a country,
we are trying to move in the right direction.
NAEP had the mission of moving us forward
as a country to be able to understand whether or not
we were preparing our young people to be competitive,
not only across the country, but across the globe.
We've taken a huge step forward to try and do that as a nation.
Every state, almost, taking steps towards higher standards
and ensuring that those standards mean
that kids are actually ready for college, and hopefully, career.
That transition is going to happen in fits and starts.
Some states are moving quickly to try and get ready
and understand what it means to put together
a full implementation plan, to implement new standards,
implement new assessments, give teachers
the support they need to actually understand
what they mean and how they should respond
to them until they actually acquire the curricular
resources to do so, as well.
Some states know how to do that, many don't.
They need help.
They need guidance.
And in that context, while the new consortia are standing up
their work, there's going to be confusion.
That is what happens every time you do something big.
Whenever you try and put in place a new large change
to a system that needs to improve,
there is a period of disruption, and then things get better.
What is important during those times of confusion
is to have those things you can lean on that are familiar.
Those things that give you a sense
of whether you are making progress
in the right direction.
Those things that allow you to talk
to others in an informed way about the
struggles that you're going through.
I think that that is one of the most important roles
that NAEP will play over the next several years.
The second thing is that NAEP started off
as an innovation in the space of assessment,
and has continued to be a standard-bearer in quality assessment.
It actually has, as we have become even more proud of
over time, really, really tight security.
So it actually is inoculated against some
of the questions about integrity,
and testing, and assessment.
All of these things are things that we
need to continue to evolve, that we
need to continue to model for the field.
They aren't sometimes the most sexy things,
but they are the things that will allow assessment systems,
whether it be state or otherwise, to be
meaningful to the students, the teachers,
and the communities that rely upon them.
When I came in, I did have the good fortune,
as noted here, to think about the future, about the opportunity
for us to think about new methodologies in assessment.
Not only new mechanisms for actually
doing assessment, but actually starting
to get better at understanding those things that
are harder to assess today.
We talk a lot about really understanding
how to test critical thinking skills.
Really understanding whether people know
how to collaborate and leverage the information
that's available to them.
These are things that we're going
to continue to need to understand how to do well
as a country, because they are the skills that young people
and old will need to demonstrate, not
only to actually be good students,
but to be good professionals, and to be good citizens.
I'm not telling you anything that you don't know.
I'm not telling you anything that most of this panel
hasn't reiterated.
So why am I here?
I'm here because it would be easy in this world,
with the Assessment Consortia coming on board,
with the different testing companies coming forward,
with offerings that make it seem like they can fill
the void that has been created by some uncertainty
around the new assessment systems.
It could seem in that new world that NAEP is not
that important.
And I just want to say to you that that is just not the case.
We could decide to back off in our investment
in the space because the gap has been filled.
We should lean forward.
We should lean forward about being more innovative,
being able to build new capabilities, being
able to lead the field.
And at the same time, be that buttress of support
during this time of transition.
Maybe you didn't need to hear that.
Maybe the country didn't need to hear that we still need NAEP.
But I thought it was important to say.
So with that, I'm going to actually end and let
you get back to the interesting part of the panel.
I'm happy to take a question or two before I go.
But I think if you're wise, you won't ask any.
[laughter]
Thank you.
[applause]
As I was saying, Terry.
[laughter]
Well, it's a real honor for me to serve on the NAGB board.
I've never seen such a dedicated group
of professionals that serve on the board,
and a tremendous staff with NAGB, NCES, and IES
that work with us.
I'm going to try to give you a little bit of a state chief's
perspective.
I've been in education 42 years, and I've
been extremely lucky to survive 42 years,
but mainly because I've always been
the right place at the right time.
I was in South Carolina when Governor Dick Riley was
in charge and really pushing reform.
And I was a high school principal that got a lot of the benefits
from that reform.
And I was in North Carolina through part of Governor Hunt's
terms, and the great things that happened there.
And then I came to Kentucky, and one of the first calls I got,
the first couple of months I was there,
was from Secretary Duncan, and he was applauding Kentucky
on being the only state with statistically significant NAEP
advancements.
Since I had taken all of Gene's problems and fixed them,
I took credit for that one.
[laughter]
But the luckiest that I've ever been
was I was running summer music camps one time,
and this wonderful, gorgeous young lady
came in to help me assign rooms.
And she didn't know it then, but she would later
become my wife of 33 years.
So I've been really lucky.
I was in Kentucky and they passed this state law,
and that was kind of one of the reasons
I said Kentucky's probably a good place to go,
because they passed a state law unanimously.
And you tell me any General Assembly
that does that these days.
And they said Doctor Holliday, you and your department
have to implement new standards, new assessments,
new accountability.
And you got to do it with no money.
As a matter of fact, we're going to cut your budget about 20%.
So I began looking for solutions.
And at the time, Common Core was certainly a wonderful solution
for us, because it was an effort of all the states
together, and a lot of the best minds in the nation coming
together, and it was cheap.
So I thought that was a good solution.
[laughter]
We set about implementing things.
You know Mark Twain always said, when
he died he wanted to be in Kentucky because everything
happened two years later in Kentucky.
[laughter]
So I've been trying to refute that claim ever since.
So we've been kind of on the bleeding
edge of a lot of things.
Some people call us trail blazers.
But now as I attend all the NAGB meetings,
and I'm looking at the 12th grade proficiency cut scores,
I'm real conflicted, and my committee members
will tell you that I'm always thinking
how's this going to play out politically.
Because the job of a state chief is politics, after all.
Our average tenure is down to 2.7 years,
so most of the chiefs don't even last one NAEP cycle.
So it's a problem.
And too often, judgments are made about policy decisions
and they say oh, look at the NAEP scores.
They went up a half point, two points.
And it was because of these policy decisions,
when most of those policy decisions
probably happened three, four, 10
years before the current chief was there.
And you may not be able to track them back
to specific policy decisions.
It could be demographic changes.
Everybody seemed like they were looking at Kentucky,
because in 2013 we had been administering Common Core
state assessments for two years.
We first administered those assessments
in '12 and then again in '13.
So everybody said, well, our Kentucky scores
are going to drop.
Well, at the same time we were implementing a policy
to ensure that our comparative data was equal to other states.
In other words, we wanted to make sure all of our kids
were tested.
We wanted to make sure that our special needs
kids were represented in the sample.
So our fourth grade reading went down a point.
Our eighth grade reading went up a point.
Our fourth grade math stayed at or above national average.
And our eighth grade math sucks, for all
you statisticians out there.
It's not good.
We've got problems in middle school math.
I'm banking on teachers changing instructional habits
in middle school, which will be reflected in kids doing
a much better job in the future because
of our implementation of Common Core.
But as we were debating in NAGB the whole 12th grade
proficiency thing, that raises the big ugly head
of what proficiency means.
Too many pundits and too many writers
say NAEP proficiency and grade level
equate-- read Ravitch, read somebody else.
You get all kinds of different things.
But I think the people who were there
set proficiency a little higher, and I
think our 12th grade studies, especially in math, are
beginning to show that the college readiness may
be a little bit below proficiency,
and all of our ACT equating with Explore and Plan
is showing that, too.
But as a chief, you've got this big problem
coming at you right now.
Which test are you going to use?
Are you going to use Smarter Balanced?
Are you going to use Common Core?
Are you going to use off the shelf
that some vendor has developed?
Are you going to use a customized version
that your teachers spent a lot of time developing?
We went with the customized version
because we felt like the best training around the Common Core
was teachers developing assessment items
and really understanding the Common Core.
But eventually, chiefs are going to make these decisions
about procurement of assessments based
on do the assessments measure these new standards
and the revisions to the standards made at the state
level, like Florida recently did.
Do the items show quality,
or are they just regurgitated items, allegedly aligned
to the Common Core?
What about the technology?
Can we deliver this assessment with the technology we have?
And then the mother of them all: how much do they cost?
Because you only get so much in a state
budget for your assessments.
But eventually, we're going to have
a couple of problems on the NAGB board.
We've been in the middle of these problems
and we'll continue to be in the middle of these problems,
I would think, and Pat laid it out much more eloquently
than I could.
Alignment.
NAEP is the gold standard in my book.
When we developed our grades three through eight
assessments, we made sure that we had good equating,
not only with NAEP, but also with our Explore test
in eighth grade.
We now have a vertical alignment from grades three to 11,
so we can tell mom and dad every year
whether their child is on a trajectory to be college and career
ready at a level Kentucky colleges will accept.
So that's a problem we've got to think about:
what does NAEP measure versus what
do the state assessments measure?
And hopefully, we don't get back to a No Child
Left Behind problem, where we're 90% proficient on state tests
but only 20% proficient on NAEP.
And also, parents have got to understand this.
They get SAT and ACT.
And I'm so proud as I work with NAGB board
to get parents more knowledgeable about NAEP.
We've got to get parents and communities much more
aware of what's happening.
And the politics are not going to go away, with the privacy
issues, with the-- when we started this work
in '09, '08, '07, even before, Common Core was a good thing.
Now it's a four-letter word in a lot of places.
So the politics are not going away.
And it's interesting to watch how that develops.
So the NAGB board has to walk a very delicate balance
in the next few years, and I'm just honored
to be able to be serving on that and working
with brilliant people, even Jim Popham.
[laughter]
[applause]
All right. Yeah, I can do it from here.
America's listening in.
Yeah.
I like me better that way, too.
It's great to be here with you guys today.
Great to be up here with such a terrific panel.
It's not often nowadays I get to truly feel
like a novice rookie who still has
no business saying anything.
So when I'm up here with four guys who've actually
all led districts or states, and have been doing this stuff,
and remember making these decisions.
In 1990, I was a first year teacher down in Baton Rouge,
so I didn't live any of this.
So for me, it's entirely kind of after the fact.
So for that reason, let me try to be relatively brief.
Let me try to do three, four minutes
of a couple of thoughts.
And then we can have a little bit of a conversation.
One, I think it's useful to keep in mind just
how significant state NAEP has been in its own right.
It has, I think, fundamentally changed
the way a lot of these conversations
have played out over the last 20 years about how states are
doing, about how do we think whether or not
certain performance strategies are working.
But like almost all policy successes in particular,
we tend never to be happy with success.
We almost always tend to push success
until it becomes overreach and failure.
And I think, for instance, just some of the conversation
the last couple of months about the way folks
were trying to wield the findings out of state
NAEP to offer, I think, ludicrously overdrawn interpretations
of what we should make of why states are doing well
or poorly, and what that portended.
I think shows the problem when we
take good, useful, important tools
and push them a little further than they can usefully go.
Second point.
In light of that, I think particularly, you know,
when you're inside the Beltway and talking about the Common
Core Assessment nowadays, I think it's very easy for folks
to wind up talking in a closed circle.
And imagining that only yahoos-- or to quote the secretary,
fringe elements-- might have any questions or concerns
about Common Core or how it's unfolding.
I don't think I'm a yahoo or fringe element.
I mean, folks can disagree.
I have no problem with the Common Core standards per se.
When Dave Coleman walked me through them several years
back-- when they were first released in draft form--
I thought they were perfectly reasonable.
But I also think standards are mostly
just a bunch of words on paper.
They're a useful fiction.
And I think what's much more significant about things
like standards is how they're actually translated
into the real world of policy and practice.
And I think a reasonable person could
have enormous concerns-- about how Common Core has been rolled
out, about the engagement of the federal government,
about the way in which states and decision makers signed up
without necessarily understanding
what they were signing up for.
And for that reason, I think it's entirely appropriate
to encourage folks not to count their chickens.
If we get to 2024, and 40 or 45 states are using Common Core
and pursuing the standards seriously
and are using one of a couple of consortia--
whether it's two or three or five,
we'll see-- then that will be one thing.
But I think over the next several years,
it's important to keep in mind that we continue to see
the handful of governors and state chiefs who are still
in place who signed up for the Common Core in '09 and 2010 are
steadily departing.
Legislators who were vaguely supportive-- some of them
are departing and being replaced by folks who've
run partly on a promise to stop the Common Core.
So I think the idea that this is some bubble of turbulence which
will soon pass is wishful thinking.
I think we'll see how this unfolds,
and I would encourage people not to make too many assumptions
about what the Common Core will be or will
look like when it's 2018.
For that reason, I think it's particularly
useful and important to keep in mind
that we should not screw with State NAEP.
I think it provides a credible and established metric, which
we can rely on, even as states are going back and forth
and making different decisions.
I think it provides-- I think it was
Pat who mentioned-- an external check on what we actually
see out of new assessments, whether consortium derived
or otherwise.
And it does provide a stable, long term trend.
So that we can have sensible and reasonable conversations
about what's happening over time in the aggregate
and the subgroups.
Third, I think it's also, in this context,
useful to keep in mind why it is that the State
NAEP can do those things for us.
Why it is credible and stable and reliable.
And for a couple things to keep in mind.
One, it has provided for a quarter century enormously
useful transparency and comparability,
but it has done so without involving the federal
government even loosely in any effort to prescribe
interventions, remedies, or changes that ought to be
recommended to states as a consequence.
Now if you think about it for a second,
you cannot make that same claim in good faith right now about
the Common Core standards or either of the assessment
consortia.
Through nobody's particular fault, but because of the way
things work nowadays, these are very much becoming linchpins
both in conversations around the ESEA,
and in waiver derived improvement strategies.
So that kind of firewall, which keeps State NAEP isolated
from these larger political debates
does not apply to these new standards.
Fourth, let me just say real quick that I think--
as my fellow panelists have mentioned,
an appropriate division of labor is useful.
But I would disagree a little bit with my fellow panelists
on what the right division of labor is.
It is absolutely true that NAEP, especially state NAEP,
but NAEP as an enterprise, has sought to be innovative
and to push the envelope when it came to assessment.
I think that is an inappropriate-- not
inappropriate, I think that's the mistaken mission
at this point.
I think NAEP's power is that it is anchored
in content in a way that is broadly acceptable;
that it provides us a long-term, reliable, and stable way
to track what kids know and can do;
that we have a series of protocols governing how this
is administered in a way that has broad bipartisan buy-in.
None of that actually makes NAEP at this point
particularly agile or particularly suited
for being innovative.
I would argue the division of labor, at this point,
is to safeguard the State NAEP brand.
Make sure it provides us that external content-driven,
reliable, long-term check, and leave it
to the more state-centered, the less national new entrants
to provide the kind of agility and innovation
that we would like to see when it comes to assessment.
And fifth and finally, those of you
who play the stock markets at all
will remember that Warren Buffett took a lot of heat
back in the 1990s.
He kept investing in things like Coca Cola
when everybody else was making a fortune in the stock bubble.
And there was a long stretch there,
from about '94 to '99, when Warren Buffett massively
underperformed the market.
Because he wasn't investing in tech.
And when asked why, he said, look
I don't understand what's going on.
I don't know how to invest in those guys.
I don't know how to value them.
We'll see.
And if you remember, there was a series of articles
questioning his judgment and whether he had lost his magic.
Well, it's been 15 years since the popping of the tech bubble,
and it has become clear that Warren Buffett is not
a particularly dumb guy.
That he valued the fact that he knew things that worked,
that he stuck to his knitting, and that he waited out the fact
that new trends and new fads come and go,
and that we shouldn't be too eager to forgo our judgment
and jump on them.
To me, that's kind of how I think about State NAEP.
That I have no firm predictions about how the Common Core
effort or the New Assessments are going to shake out.
But I think it would be foolhardy and mistaken to screw
with something that has served us well
and has provided real value, while the rest of this stuff
is still very much in the air.
Thank you.
Thank you, Rick.
And thank you, as an audience, for being flexible.
We're going to take a quick break,
and we're going to come back at 4:40
to hear from Roberto Rodriguez.
But I'm going to try to do something pretty dangerous.
And that's to, in a few sentences, summarize what I heard,
and what I've been hearing the last couple of days and during
today.
I often talk about sticking to our knitting.
And so, when Rick talks about State NAEP,
that's the given that we ought to hold on to, and so forth.
For years, we've been known as the truth teller, the nation's
report card, the gold standard.
We cannot abandon that.
That has to be what we're about and what we'll always be about.
On the other hand, we didn't start with State NAEP.
State NAEP wasn't a given.
In fact, Gene talked about the battle.
Twenty to 19.
We almost lost that.
So what I'd like to add to Rick's challenge is,
what are the new State NAEPs?
What are the next things we need to stretch to?
And we've begun.
And maybe a lot of people are critical.
And thank goodness for NCES that says, just the facts, ma'am.
Because that is where our strength is.
But we've wandered into making a difference.
We've wandered into parent connections
and whether we can get parents to understand NAEP.
We're wandering into assessment literacy across the country
to try and see if the general public could understand.
And maybe we'll make some mistakes.
But I don't think we'll wander too far away from the truth.
So we have established the new Technology and Engineering
Literacy Assessment.
I think it'll be terrific.
We have taken on the issue of exclusions.
We did extend into TUDA, etc.
So that really is the challenge.
How do we stick to our knitting to make
sure we're the independent verifier, no matter what
the consortia do or other states or whatever.
We have to be the one.
They're always going to compare back to us,
and that has to be the role we play.
But, so too, we have to stretch-- certainly
in the area of technology and computer-based assessment as we
go forward; the whole issue of critical thinking
and those kinds of skills that we're
being asked to look at; the whole issue of computer
scoring, writing; there's a whole world out there
of things we need to pay attention to.
So it's that balance, isn't it, of staying
who we are-- wandering, but being
sure when we leave the nest that we're
providing good data and quality.
So, that's the challenge before all of us,
and I want to thank our panel.
And please thank our panel for a great discussion.
And I just have to say, we'll take a quick-- oh.
Sue's up there, and she's my Vice Chair,
so I do what she tells me.
So Gene, you said one little comment
that I wondered if you can expound on a little bit.
Where I think you said something like,
you thought we had some-- this is for NAEP
now-- that you thought we had some updating we needed to do.
Could you just say a little more about that?
Yeah.
In a couple ways.
Let's just use-- well, first of all,
in the preparedness conversation.
There's more to preparedness than reading and math.
Writing has appeared, out of the work we've done on Common Core,
as a critical tool for success in college and university.
I think NAEP needs to take a hard look at the writing
as being a critical component of preparedness.
Secondly, as you look at the items in reading,
they don't reflect some of the solid learning
that we have accomplished through the Common Core
efforts.
And that is, reading complex texts.
I think you ought to look very carefully at how complex
those texts are and whether they align
with college expectations.
And writing from text and based on evidence-- those are all,
I think, some of the contributions
that we will find the Common Core
has added to the field of Assessment
Policy in this country.
I think there ought to be some part of NAGB that
takes a close look at those learnings
and begins to think about how those might be incorporated
into future NAEP policy.
Thank you for that.
Very important.
We're now down to a five minute break, but that's OK.
It was well worth it.
Uh oh!
No, now we really have to go because Roberto's
on a very tight schedule.
So, anyway, sorry about that.
Enjoy your five minute break.
Thank you.
Probably the greatest thing I can say about Roberto
is we're getting started late because so many people want
to talk to him.
It shows just how active he's been
with all of us all these years.
They gave me a script, but I said
I don't need a script for Roberto.
I think you all know that in addition
to being a Special Assistant to the President for Education
and our education liaison at the White House,
Roberto spent many years working for Ted Kennedy-- Senator Ted
Kennedy.
And I can only say that we love the Senator, of course,
but there probably wasn't a Senator more demanding,
in a good way.
Whereas President Bush said he never--
he used to point to the No Child Left
Behind-- and how many pages was it?
Over 1100 pages.
1187.
And he'd say, there they are, and I haven't read them.
And people used to laugh.
I don't know if it was funny, but they used to laugh.
Well, I can tell you that Ted Kennedy read all 1,187 pages,
right?
And gave edits back to Roberto.
And was right on things that they had included incorrectly.
I always tried to be nice because if the Senator said
to me, now what can I do for you, David?
I had to be careful because if I said anything, he'd go back
and say, Roberto, you got to get that done.
But he's been a great friend, as you know,
and a great help to all of us.
You know how important the staffs are.
And he's just been tremendous through all of it.
He always keeps in mind states, uses his common sense--
probably because of his young family,
he knows what it's like.
It's just a great pleasure to introduce
one of the unsung heroes in education policy in America,
frankly.
And a good friend, Roberto Rodriguez.
Hey everyone.
It's great to join you this afternoon.
David, thank you for the generous introduction,
but more importantly, for your great friendship
over the years.
It's been my pleasure to work with you, learn from you,
and be part of this bigger conversation
around federal education policy.
You blink and 15 years fly by doing some of this work.
But it's a pleasure to be here for this exciting occasion.
I want to thank Cornelia Orr and the entire NAGB
staff and board, past and present, for welcoming me here.
I know you've had a long day of conversation,
so I will try to keep my remarks brief
and look forward to some discussion here with you all.
As we reflect on this anniversary, this celebration,
we think about what's next for the role
that NAGB plays in federal education policy.
It's a really important role.
I know you all share
my excitement and my fulfillment in continuing
to work on these issues, and in really helping
to think about what we need to do as a country
to make a difference and provide a great, world-class
education for each and every one of our young people today.
This is not a Republican agenda.
This is not a Democratic agenda.
This is an American agenda about how
we can support our young people in reaching
their full potential and keeping them on track to graduate fully
prepared for college and for career and for life.
So the arc of our efforts in our administration
over the past several years-- and I know
you've heard a bit on some of this from our acting Deputy
Secretary Jim Shelton-- has really been around fulfilling
the President's broader goal of leading the world
with the highest proportion of college graduates by 2020.
Right.
That's our North Star, if you will.
The Secretary refers to it as our North Star.
It really challenges us to increase and strengthen
education at every level, from cradle through career.
We are engaged right now, in a really exciting
time, in an effort to raise the bar for our nation's learners:
to ensure higher standards, to strive
for an effective teacher in every classroom,
to seize innovation and do more to build in innovation
at every level in our education system,
think about how we use data in new ways
to drive improvement across our system,
and to take on some of these challenges
around turning around our low performing schools,
and increasing our graduation rates so
that all of our students have the opportunity
to be successful.
As you all know, education-- we love this work because it's
our great moral imperative.
It's the measure of our ability to live up to our highest
fundamentals of opportunity, of justice, of equity, in all that
we do.
But even more so, today, it's our strategy
for how we equip our learners to be successful
in the twenty-first century and in this economy.
Our students compete now with the rest of the world in a way
that they didn't a generation ago, even in the generation
that I came up in.
And our long term economic security
really is directly tied to the quality of our public education
system.
And with this global consciousness that we have,
we really believe this begins in our classrooms each
and every day.
So I know you're all familiar with the challenges
that we still have and that we still face.
We're trailing more than a dozen countries
in math and in science achievement.
About a quarter of our students are
failing to graduate on time.
We're losing a million students a year still
to this dropout crisis.
We have tens of millions of our adults,
as some of the more recent PIAAC data has shown,
that lack even basic literacy skills.
And we still have too many of our young people
that are not fully engaged, motivated, and focused
on high school completion
and on staying on track toward that end.
So we have to do more to provide a deeper set of knowledge
and skills in the STEM areas, and make sure
that we're aligning our system, particularly from K
all the way through 12, more effectively
to postsecondary needs and success.
So all of this really begins with high expectations,
with setting our bar high and our standards high for teaching
and learning in our schools.
And in the past, we've had an education policy
that has protected the civil rights of children, made sure
that we're providing funding and dedicating funding
to disadvantaged children, to students with disabilities,
to other special populations of learners.
These are still important functions, obviously,
of our federal role.
But the challenge we're facing today
is principally around making sure
we're preparing our young people for a competitive knowledge
based economy.
And that's why it's so important that we support and work
with our states to help their students reach college
and career ready standards.
So as you know, we've set forth early
to encourage and support that work.
But the real revelation around why
we're having this conversation around college and career
readiness can be brought back to the NAEP.
Because it was the NAEP, functioning
as it was intended to function as a national benchmark,
particularly in reading and math,
across all 50 states, that really began
to reveal the disparities around where states set
their expectations for proficiency,
and subsequently for teaching and learning.
Between those states that were setting
high marks, a high-water threshold,
and others that set it low.
So when you connected these dots,
we found even more revelation around gauging
the success of federal education policy,
and found that we had 18 states that lowered standards
for proficiency during some of these early years of the No
Child Left Behind Act.
So it's that type of functioning that can provide
really important data and a really important reminder
for us as policymakers moving forward about the need
to come together.
And we've seen our governors and our state leaders,
our state superintendents come together--
Republican and Democratic-- to answer this call around raising
the bar for students and to support the adoption
and implementation of new College and Career Ready
standards.
Some on a common basis, others on their own terms.
Regardless, we're seeing a massive transition,
and I think an exciting time for our system.
And as we focus on these standards,
we also have this dual imperative
of making sure that we focus on the "career" in college
and career ready.
Our dialogue often focuses on the college ready.
We know that there's a level of expectation that's
necessary to be successful that is comparable for our learners
both to enter college without the need for remediation,
as well as to enter a successful career.
So we are really interested in promoting
a more effective alignment between our career
and technical education system and the demands of our labor
market.
Stronger collaborations between those systems and our post
secondary system, more accountability
for outcomes for all students, and an increased emphasis
in our CTE system around innovation and reform.
And we know that effective career and technical education
programs can make a real difference
in terms of helping to support the needs of employers
as well as to engage our learners
and provide opportunities for our learners to connect what
they're learning to real life situations and real life
application of knowledge.
So here, too, we need better measures of student success.
We need better measures in nontraditional settings of how
our students are learning, are progressing, are growing.
And beyond, even career and technical education,
we want to try and encourage our districts and their partners
to use federal, state, and local resources to rethink
the broader experience around high schools-- you heard
the President talk about this recently
in the State of the Union-- and challenge us to really embrace
more innovative strategies and pedagogy that are designed
around students' interests, that incorporate technology,
that personalize learning, that will help our students
graduate with a new goal of being ready for success
and also graduating with some level of college
coursework and college credit and with the knowledge
and experiences in learning that will really equip them
for success later in their career.
So there are many examples.
From the more recent McGavock High School
that the President visited in Nashville to his visit
last year down in the Manor New Tech, where we're
seeing exciting and new ways of redesigning what teaching
and learning can look like aligned to college and career
readiness in a way that, hopefully,
will help us move forward as a country.
Modernize our system, and look ahead.
We also, in this broader context-- and this is,
I think, an important imperative,
particularly for NAGB-- have a real need
to attend to the bookends of our system beyond the K-12 work.
To think about the earliest years of our children's
learning and development, where the brain research shows us
what a tremendous difference the first two to three years
of life make, and where we are seeing early vocabulary
gaps between low income children and their more affluent peers
upwards of 30 million words, and a school readiness
gap upward of 60 points.
Gaps that we then all collectively wring our hands around
in fourth and eighth grade and beyond,
as we think about how we can do more
to close the achievement gap.
If we were meeting our imperative to provide
a high quality early childhood education-- beginning at birth
all the way through infant and toddler
and into high quality preschool--
we would do more to really bridge these opportunity gaps
that begin too early for our children,
and where their zip code is predetermining
their level of educational opportunity moving forward.
So we want to remain focused on that effort.
We want to remain focused on the need to harness innovation
and seize college attainment in the post secondary space.
We have a lot of work to do-- just
as we do in modernizing and seizing innovation
in the K-12 space-- do so in our post secondary space,
to think about how we redesign, for instance, our early courses
for our freshman in subjects like math and science.
So that we can better engage them in their learning
and in their success.
Boost that success, help boost college completion,
think about redesigning developmental education
and remediation.
I was with hundreds of community college presidents
just earlier today-- on the heels of a college opportunity
summit that we convened-- thinking
about how we can redesign developmental education
and pathways, so that as we're making this transition
to bring college and career readiness online
in high school, we're also making sure
that we are providing-- in an effective and seamless manner--
a high quality developmental and remedial education that
is leading to credit-bearing coursework,
and ultimately to the credentials and degrees
that we know students need to be successful.