Nish Sonwalkar: My name is Dr. Nish Sonwalkar. I came to this country as a graduate student
and joined MIT for the Ph.D. program. After finishing, I joined the faculty in mechanical
engineering. But my love for online education got me in trouble, because I was given a grant
to start a hypermedia teaching facility, which later on became the Hypermedia Lab, and I ended
up doing almost 20 to 30 pedagogical experiments that led to some of the developments at MIT;
the Stellar, the... came out of that, as well as some of the experiments that we did with the
Singapore MIT program, which remains a very successful program, where I had to work
with 30 faculty at MIT to develop five Master's programs delivered in three universities in
Singapore. That was quite a challenge. It was done on time, with $110 million in funding
from Singapore. So when we collected all the data and created the online system, I
ended up with the innovation of what we now call, and what has since become popular as, the
Adaptive Learning System, and we have added some of the newer nuances of a brain-based Adaptive
Learning System. That requires a lot of personalization as well as data analytics in real time, so
that you can do Markovian statistical analysis as a student goes through different learning
strategies, providing them real-time feedback so that they can reach their learning goals.
So I'll be focusing, from my vantage point, on data analytics that leads to the learning
goals in the shortest amount of time. Thank you.
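A minimal sketch of the kind of Markovian analysis described here: model a student's movement between learning strategies as a Markov chain whose transition probabilities are estimated from interaction logs. The strategy names and data are illustrative assumptions, not the actual system's.

```python
# Sketch only: strategy names and logs are invented for illustration.
from collections import defaultdict

STRATEGIES = ["apprentice", "incidental", "inductive", "deductive", "discovery"]

def estimate_transitions(sequences):
    """Count strategy-to-strategy transitions and normalize rows to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    matrix = {}
    for s in STRATEGIES:
        total = sum(counts[s].values())
        matrix[s] = {t: (counts[s][t] / total if total else 0.0) for t in STRATEGIES}
    return matrix

# Two invented student sessions, each a sequence of observed strategies.
logs = [
    ["inductive", "inductive", "discovery", "deductive"],
    ["apprentice", "inductive", "discovery", "discovery"],
]
P = estimate_transitions(logs)
# Row P["inductive"] says where students tend to go next; a tutor could
# use it to recommend the most promising next strategy in real time.
print(P["inductive"])
```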
Matthew Harris: Hi, my name is Matthew Harris. I'm the co-founder of a company called College
Miner. College Miner actually started as a data mining project at Boston University.
I was a graduate student and I found out that we had to give an hour-long presentation in
the class. I wanted to come up with something beneficial to the students, so I started
studying student outcomes. Why are people going back to grad school, and do they get
jobs related to their major? That's kind of the basis behind it. After we created
a tool where students can analyze what people in certain programs were doing after they
graduated, I had BU approach me, director-level type guys, to develop an internal tool for
them to analyze more of their programs, to analyze the quality of certain programs,
certain majors, departments. So what we do now is we aggregate all their evaluation data
and survey data into this tool so that the tool can analyze it. One example is what we do with
professors. You get all these professor evaluations. A particular professor may teach three
or four courses each semester. He may be a really good professor, but one course he doesn't
teach as well. So we analyze that and show that maybe that great professor's
resources could be utilized elsewhere. So we work with different schools
now, collaborating and improving our product.
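A hedged sketch of the per-professor, per-course analysis described here: aggregate evaluation ratings so one weak course stands out against a professor's otherwise strong record. The column names and numbers are invented for illustration.

```python
# Illustrative only: invented ratings; column names are assumptions.
import pandas as pd

evals = pd.DataFrame({
    "professor": ["Smith", "Smith", "Smith", "Jones"],
    "course":    ["MA101", "MA201", "MA301", "CS101"],
    "rating":    [4.6, 4.5, 2.9, 4.1],
})

# Mean rating per (professor, course) and per professor overall.
by_course = evals.groupby(["professor", "course"])["rating"].mean()
overall = evals.groupby("professor")["rating"].mean()

# Flag courses that fall well below the professor's own average:
# a strong professor with one weak course shows up here.
deltas = by_course - overall.reindex(by_course.index.get_level_values("professor")).values
print(deltas[deltas < -0.5])  # Smith's MA301 stands out
```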
Betzi Bateman: Hi, I am Betzi Bateman. I am the new Program Director in the Instructional
Design Program here at UMass Boston. Literally new. I started on Monday. I'm also new to
learning analytics. I used to teach online at Kent State University and Cleveland State
University, where I was also an instructional designer. I also worked at Case Western Reserve
University in Cleveland, Ohio. But when it comes to learning analytics, one of the things
that I find exciting is that instructional designers have been focusing on analyzing
learners and contexts and trying to apply that data in educational situations for a
while and then also on the other end, evaluating instructional products and techniques for
their effectiveness. In practice -- I mean, it's always there in theory -- students learn
the whole process, the ADDIE model, analysis, design, development, implementation and evaluation.
And then if you get a job in higher ed and you start working with a professor as a new
instructional designer, and you think, oh great, well, how are we gonna survey your students,
and then the professor looks at you and says, this is what I'm teaching and this is how I'm going
to teach it, how do I get it online? So analysis has kind of taken a step back in higher ed,
and something like learning analytics, which is a buzzword, can open this dialog for us
to see that we need to use this data to affect what we actually do in the classroom. So I'm
very excited about that. I'm excited and interested in what this means for the training of new
instructional designers. What new research techniques do they need to learn? How does
this affect the process? How I see it is that instructional design tends to be
heavy on the front end. There's a lot of planning, use of templates. It can take
months to design a class. And then you have a product and it's done. And yes, you evaluate
it, and it has some quality improvement built into the system through that evaluation, but you're
really evaluating a final product. So it's sort of like, here's instructional design.
And what learning analytics might do is make it more rapid. So instead of working with the
professor up front until you have a final product, you're working with them throughout the course,
in many small instructional design cycles. So I got this data from the students,
and some of them didn't get it and some of them did and some of them sort of did, so
what interventions can I do? So it's more just-in-time, rapid development. These
are things that will need to be taught to our new instructional design students, and
I'm excited about coming here and helping to implement these new courses. Thanks.
Roger Blake: Hi, good morning. My name is Roger Blake. I'm in the College of Management.
I think this has been great so far. I've learned a lot and appreciate being here. I just wanted
to talk a little bit about how we're using analytics in the College of Management and
talk about a program that's really the core of analytics for us. But that program didn't
begin with any vision of analytics, or intention to use analytics, or really much knowledge about
it, I don't think. It started with a problem that we noticed: five or
six years ago, we established learning objectives, as probably every university and every
college has. And some of those objectives were relatively easy to put into the curriculum
and relatively easy to assess. Quantitative skills, for example. Very easy to have a test
or some kind of standardized... Although maybe not with all the variability. But relatively
easy. But some of our learning objectives were soft skills, which
are very important for us: professionalism, an ethical sense, the
ability to recognize global issues and contribute something back to the community. We hadn't
really built those into the curriculum, let alone tried to assess them. So five or six
years ago, we decided to embark on a program to bring in extracurricular activities or
co-curricular activities where we would have students attend events, attend seminars, do
community work, have internships, a real spectrum of events, to bring
that into the curriculum. And we did that, and I have to say it started in the fall of
2006 and personally I was trepidatious about how students would accept this. Certainly
here, we have students that commute. They have family obligations, work obligations.
And here we were about to put on something else that they would have to do.
One of the things that we did was build it into our curriculum as a requirement, and
the way we did that was to offer a certain number of miles or points, if you will, for
certain activities and events, and also to establish a threshold that students had
to reach in order to take the Capstone course, which means in order to graduate.
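As a hedged illustration of that requirement, the points-and-threshold rule might look like this in code; the point values and the threshold are invented, not the college's actual numbers.

```python
# Illustrative sketch: point values and threshold are invented.
CAPSTONE_THRESHOLD = 100

EVENT_POINTS = {"seminar": 5, "community_work": 15, "internship": 40}

def capstone_eligible(attended):
    """True if a student's co-curricular points meet the Capstone threshold."""
    return sum(EVENT_POINTS[event] for event in attended) >= CAPSTONE_THRESHOLD

# 4 seminars + 1 internship + 2 community projects = 20 + 40 + 30 = 90 points.
print(capstone_eligible(["seminar"] * 4 + ["internship"] + ["community_work"] * 2))  # False
```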
And I have to say, I expected an awful lot of resistance, grumbling, unhappy people. I was very surprised
-- in fact I recognize at least one student here who could tell me whether that was the case
or not, but I wouldn't want to put anybody on the spot -- but what happened was that students,
sure, they're pressed for time, but they recognized the value of this program for building professionalism,
and they really have participated well. Since the fall of 2006 when we started, we've run about
1,400 events so it's about 100 a semester. We made sure that there were enough events
of all kinds that students could choose from. I believe last semester there was
about one event every night of the week, Monday through Thursday, or pretty close to it. And
we've had 4,063 students participating in the program during that time. They've accounted for,
I think it was 29,700 attendances; I checked this morning. Attendees, if you will, people who've attended
one of these events. So it's been successful, I have to say. Again, as I said, we had no
idea that analytics was gonna be used for this. We were simply trying to put this into
our curriculum, but that idea was certainly something that occurred to us.
I have a background... Before I came to academia, I was in industry for 20 years doing this
kind of work, except we didn't call it analytics and we didn't call it business intelligence.
We said... support systems, but same concepts. The first thing we did was ask ourselves how
we could assess some of these soft skills. Great, it's in the curriculum but how do we
actually assess this and report it to our accreditation agency? Very critical. So we
came up with a method of looking at not just the quantitative data that was from attendance
or how fast students completed their requirements for these co-curricular activities, but also
to look at the text that students wrote, because certain events require a written response.
Essentially: what did I learn from this event? So we started evaluating that with text analysis,
establishing criteria for professionalism, what would represent good professionalism,
and evaluating those responses on a scale. One of the nice parts about that, in terms of
addressing variability, is that with text analysis there's not a single mold or model that all of
the responses have to fit. The text analysis can look at many dimensions to assess
professionalism and gauge how well students are doing with regard to that goal.
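A minimal sketch of that kind of multi-dimensional text scoring; the dimensions and term lists here are illustrative assumptions, not the college's actual criteria.

```python
# Illustrative dimensions and term lists; not the college's actual criteria.
DIMENSIONS = {
    "ethics":        {"ethical", "integrity", "honest", "responsibility"},
    "communication": {"presented", "listened", "explained", "feedback"},
    "global":        {"global", "international", "culture", "community"},
}

def score_response(text):
    """Score a written reflection on each dimension as term hits per 100 words."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    n = max(len(words), 1)
    return {dim: 100.0 * sum(w in terms for w in words) / n
            for dim, terms in DIMENSIONS.items()}

response = ("The speaker stressed integrity and ethical decision making, "
            "and I learned how global supply chains affect our community.")
print(score_response(response))
# No single mold: each response can score well on different dimensions.
```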
The second use of that program, which we're embarking on now, is to evaluate the program itself. How
are students doing with certain kinds of events? Should we be doing more internships? Should
we be doing more seminars, or outside experiences at companies, or whatever kinds of events we
should adapt our program toward? And then recently we've started to look at the program through
the eyes of maybe more traditional data mining analytics, which is to look at whether
or not we can predict retention. That's some of the research I'm doing now, and I think
it's quite exciting research, because I'm finding that through this program we can predict
retention not just a couple of semesters into a student's career, and in fact not just at the
end of the first semester, which is normally what's done by researchers looking
at GPA or some other performance measure, but actually during
a student's first semester, from participation in the program. So early on, we're able to
take a look at the students who have a higher likelihood of attrition and
address that issue, which is pretty much where we are at the moment.
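A hedged sketch of that retention prediction, assuming scikit-learn and invented first-semester participation data; this is not the actual model.

```python
# Invented data: features are [events attended, points earned] in semester 1.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0, 0], [2, 20], [5, 60], [1, 10], [7, 90], [4, 45]])
y = np.array([0, 0, 1, 0, 1, 1])  # 1 = student was retained

model = LogisticRegression().fit(X, y)

# Flag students with high predicted attrition risk during the first
# semester, early enough to intervene.
risk = 1 - model.predict_proba(X)[:, 1]
for student_id, r in enumerate(risk):
    if r > 0.5:
        print(f"student {student_id}: attrition risk {r:.2f}")
```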
So that's really how we're using analytics in the College of Management right now, largely through
this program. Thanks.
Neil Heffernan: Hi, my name is Neil Heffernan. I'm a professor, and I can talk too long, so I'm
timing myself. I'm very excited for you here at UMass Boston and the endeavor
you're working on. But maybe that will make more sense if I give a little bit
of background. So I'm a professor at Worcester Polytechnic Institute and I'm Co-director
of the Learning Sciences and Technologies program at Worcester Polytechnic Institute.
I guess I'm best known, and probably why I'm on this panel, is because I helped
to create the Educational Data Mining Conference and Journal. Educational data mining and learning
analytics are really the same thing: using big sources of data. The big
source of data that we use in my work is something we create ourselves,
called the ASSISTment System. This is a project I've been doing for a decade,
used by 17,000 children this year in middle school, high school, and elementary
school, in probably about 80 districts here in Massachusetts and about 150 across the
United States. But let me back up a second. I was a teacher in the Teach For America program
in Baltimore City. When the provost and the dean gave their opening remarks, they were talking about
access. Access is it, right? And we were talking about access as if putting things online
is gonna help. Well, one of the problems we see is that this technology
we've built, which is now catching fire and being used across our suburban
districts, has a much harder time catching fire in the cities. When so many
of our schools have technology in such a poor state that they can't,
say, send homework home online, where our suburban districts are just
doing this now, every time I speak I try to remind us and our policy leaders about the
importance of closing the digital divide. We just got a $3.5 million grant to
do a huge study in the state of Maine with 54 schools. We did it in Maine
because the governor bought laptops for every 7th and 8th grader in the state and has been
doing so for a decade. Math scores haven't really changed much there, but it's really
important to give students the sorts of technologies that might cause
that to happen. So anyway, you guys are in the city and totally understand some of the
access issues. But I get sad at night when I think about the thing
that my wife and I have created, because it's making our digital divide worse:
we're helping the suburban districts get the kind of feedback that millions of
kids are being denied right now, which is immediate feedback as they go through
their nightly homework. We're gonna have 10 million kids tonight who will get no feedback
on their homework until tomorrow. We know as cognitive scientists -- and you can read
some of our papers that have proved it -- that that's not a good thing. So anyway, access,
I think, is really important.
I guess to give a little context on how I got into this: after teaching in
Teach For America, I was diagnosed with a cancerous brain tumor and told I had
two to three years to live. My wife and I decided we had to do something important,
and since I figured I couldn't cure cancer, we decided, all right, why don't we
make sure we do something big, which is why we give away this thing, the ASSISTment System,
to these schools. I want to make a point. The dean was talking about
applied research and the integration between research and application. You can't do learning
analytics unless you've got some data. And you can't do research unless you have
a platform that you control and can run things on. Google experimented: they
ran 12,000 experiments in 2009, according to the David Brooks article that was just published
in The New York Times last week. It's really important to be applying randomized controlled
trials to find out what works. While I love learning analytics and data mining, they
give you some ideas, but they don't tell you about causality,
and it's really important to be able to run randomized controlled trials.
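A minimal sketch of such a platform-level randomized controlled trial: deterministically randomize students to feedback conditions and compare outcomes. The conditions and data here are invented; a real trial would read outcomes from the platform's own logs.

```python
# Invented conditions and simulated outcomes, for illustration only.
import random
from statistics import mean

def assign_condition(student_id, seed=42):
    """Deterministically randomize each student to a feedback condition."""
    rng = random.Random(seed * 1_000_003 + student_id)
    return rng.choice(["immediate", "delayed"])

random.seed(0)
scores = {}
for s in range(1000):
    lift = 5 if assign_condition(s) == "immediate" else 0  # simulated effect
    scores[s] = random.gauss(75 + lift, 10)

# Compare mean outcomes by condition; the random assignment is what
# licenses a causal reading of the difference.
for cond in ("immediate", "delayed"):
    group = [v for s, v in scores.items() if assign_condition(s) == cond]
    print(cond, len(group), round(mean(group), 1))
```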
So I think you should think of the e-learning effort that you want to undertake as having a useful
platform that you guys own: you own the data and you own the control over it.
If you don't do that, you're at the whims of the Blackboards of the world, which
aren't gonna allow you to do some of the types of research that you might
otherwise want to do. But maybe that's enough about me. I'm excited to see you guys
taking these big steps. I'll be competing with you, because we're probably gonna hire
two or three faculty this year in Learning Sciences and Technologies. But it's clearly
a big, huge growth area and I applaud the University for stepping up to this. Thanks.
End