BRENT SHAW: A little bit about UCF.
We're a big, honking school.
Big school.
And I'll point out that classes started
in October of 1968.
And my grandfather was in the receiving department at UCF in
October of '68.
He worked at the loading dock in the
basement of the library.
And I work in the basement of the library just right around
the corner from the loading dock.
[LAUGHTER]
BRENT SHAW: So for me, that's--
it really feels like I belong there.
Oh and also, please laugh today.
[LAUGHTER]
BRENT SHAW: If you're not laughing with me, laugh at me.
I can't tell the difference.
We'll all have much more fun that way.
A little bit about me--
yes, that's really my email address.
[LAUGHTER]
BRENT SHAW: B-S are my initials.
And that's my story and I'm sticking to it.
[LAUGHTER]
BRENT SHAW: This is my third time doing this particular
Scantron upload work.
I did it for WebCT.
I did it for WebCT Vista, Blackboard Vista.
And now I'm doing it for Instructure Canvas.
And let me say, the easiest of the three has been Canvas.
Hands down.
Hands down.
Let me digress here for a minute.
Ooh, that got loud.
Sorry about that.
This is Tom Cavanagh.
He's our associate VP of Distributed Learning at UCF.
By all rights, he should be here.
It was a major adoption for our school, a major ERP project.
But he declined to come so that he could use that money
and send one of his employees.
So he's doing the right thing.
He's a very busy guy.
He has an assistant.
And I said, Tom, I need some help with this presentation.
I know you're not going to be there.
So can you have your assistant help me out?
And Tom being the good guy he is, he said sure.
No problem.
He can help you out all you need.
So this is Flat Tom.
He's going to be helping me out a little bit later.
You notice [INAUDIBLE]?
Tom [INAUDIBLE]?
Like I said, if you're not laughing with me, laugh at me.
We had a quick adoption cycle, which meant I only had a little
bit of time to get the Scantron portion working.
So basically within about two semesters, once we decided on
our platform, I had to have it done.
I'll give you a little bit of background.
Our faculty can go into PeopleSoft, which is our SIS,
and they can choose to create a Canvas shell for themselves.
Any classes they're scheduled to teach, two semesters out,
we handle the creation of the account, the enrollment of the
students into the account.
And they can also crosslist any courses that they're
teaching as well.
I'm going to talk about a course today--
Physical Geography of [INAUDIBLE].
It's important to know that we use an SIS ID.
And that's what it looks like.
1470 is a term code for us.
This is the PeopleSoft database it comes out of.
And this is the course and section.
And I'll also be talking about a group called Test Scoring.
They're the folks that actually run the Scantrons
through the Scantron machine.
Why bother to do this at all?
It's just better.
It's quicker.
It's faster.
It's safer.
It's more secure.
One of the original reasons I was asked to do this was
because before we were doing this, they were handing out
information-- grades--
on floppy disks.
Those floppy disks would end up all over.
People would reuse them.
And it got to the point where this was a vector for viruses.
The computer that controls the Scantron scoring machine got
infected by a virus.
And it was down for a week and a half.
So somebody said, OK, this has got to stop.
And also, faculty were posting stuff in
hallways, and on boards.
And that's just not FERPA compliant.
So I'm not saying faculty don't still do that.
But we've done all we can to make it easy for
them not to do that.
Build your own awesome.
Tom did that.
When we started this, I found out there were a couple of
things missing in some of the APIs that I needed.
I had to be able to mute an assignment.
Faculty would've hunted me down, me specifically, with
pitchforks and torches if I returned grades to students
directly without them having the ability to make sure their
Scantron key was OK and things of that nature.
Dr. Cavanagh funded their development.
So if you use the API and you mute an
assignment, think of him.
Not Flat Tom, Dr. Tom.
Four phases of the project.
My immediate goal was to get something up and running where
I could get grades back to faculty before the API work
was concluded, so I could fully automate it.
So I had a kind of approach where I processed
as much as I could.
I created--
I translated the CSV file from Test Scoring into a Canvas
gradebook CSV file.
I emailed that to a person.
And she uploaded those to the courses.
I called that my human API.
And then I found out that person who was going to be
hired was actually Dr. Cavanagh's wife.
And I started calling her Pam instead of human API.
At this point, we've got it fairly
well hands off, automated.
Instructors drop off their grades.
Depending on schedule and-- what if they go out to lunch?
We can get their grades back to them in Canvas before they
get back to their computer.
Eventually, Test Scoring also generates metadata--
standard deviation item analysis, discrimination on
the test questions, and what not.
And I wanted to [INAUDIBLE] return to the instructors
through Canvas.
At some point I'd also like to be able to return that
information to the students via Canvas as well.
Like I said, we had the human API approach.
That got us through until the API work was concluded.
Basically, the process runs something like this.
Test Scoring scans the forms, they create a CSV file with
grades and the student UCF identifiers-- we call them a
NID, Network ID.
They transfer that file to me, and I process it, and get the
grades uploaded.
And of course the errors are handled.
Programming is easy.
It's error-handling that's [? really ?]
[INAUDIBLE].
And then, of course, you let the instructors
know when it's done.
So let's go ahead and take a walk through this.
Flat Tom is going to be our host for this.
Physical Geography.
Here's Flat Tom telling the students to
use the right form.
It's got to be the pink one, not blue, not brown,
not the green one.
Use the pink.
Here he is administering the exam.
He's checked the IDs-- you can't tell from this picture--
but he's checked the IDs of all the students.
So it's a nice, safe, secure environment.
Here he is dropping off the test scores.
The dropoff point in our operations center is open 24
hours a day, 7 days a week.
So that's where they can drop them off and pick them up.
And here he is actually putting the information in our
highly sophisticated and technically
complex inbox system.
I think Walmart won the RFP for that.
[LAUGHTER]
But do you know what?
It works.
If it's not broken, don't fix it.
So at this point, the guy comes over a couple of times a
day, picks up the stuff in the inbox, and runs them over.
And that's me hanging out with Flat Tom in front of the Test
Scoring machine.
Because that's how we like to hang out together.
The machine handles 250 pages per minute on
fast and 125 on slow.
And it has a fan that sits on top of it.
So that's our technology at work.
They generate a CSV file.
And I had them tag some metadata at the top of that
and then include the data down below.
That's the information I needed to get
the grades into Canvas.
That's what an actual file looks like.
You'll notice a couple of things there.
WebCT ID--
that's left over from when I did it the first time.
But I never bothered to change that, [INAUDIBLE] the
translation on my side.
You'll notice also a negative one score.
That's unique to our test scoring office.
That's how they differentiate between the student
who scores a zero--
actually hands in a Scantron and manages to score a zero--
AUDIENCE: [INAUDIBLE] someone?
BRENT SHAW: Yeah.
Versus--
[LAUGHTER]
BRENT SHAW: I know.
Nobody in here, I'm sure.
Present company excepted.
Versus a student who did not show and did
not submit a Scantron.
Those students get a negative one.
Again, what the instructor does with that negative one is
up to them.
Another reason why we had to have this muted--
students don't like to see a negative one.
It's bad enough seeing a zero.
We don't want to damage any psyches more than we have to.
So at this point, they hand the file off to us.
And this is what you [INAUDIBLE].
Six easy steps.
I've got to make sure the class exists.
I've got to get a list of the students out of Canvas so I
can do an ID match-up, so I can get the
right grade in place.
I've got to have an assignment to put the grade in, actually
assign the grade, and then let the instructor know it's done.
Thank you.
[LAUGHTER]
BRENT SHAW: Like I said, laugh at me.
It's useful.
What's the easiest way to make sure a Canvas course exists?
See if it's there.
This is the format I'm going to use.
I'm going to talk about what I'm doing, why I'm doing it.
I'll give you the link to the API documentation Canvas has.
And then I'll show you what the request looks like.
And I'll also show you what the response looks like.
A couple of things here--
I'm using that SIS ID.
That's what makes it possible.
Because instructors barely know what course and section
they're teaching.
They even get that wrong.
There's no way we can expect them to know what their Canvas
Course ID number is.
That's just asking too much.
I do pass the auth token through the header.
And also, don't create that auth token
with your own account.
Because it tags that as you having uploaded that grade,
you having input that grade.
Go ahead and create a service account that's got a generic
name-- in our case, Test Scoring--
and that way the instructor can tell the grades
came from this form.
This is what the response looks like when I'm checking
for the existence of a course.
I've highlighted the nonxlist_course_id.
In this case, it's null.
If this section were crosslisted with another
section, instead of being null, you'd have the course
number, the Canvas course ID number.
And at that point, you'd pick that up and start using that
as the place, the final repository for the grades.
If it doesn't exist, you get a 404.
And at that point, I'd send an email to somebody that says,
go create this course.
And we go into the PeopleSoft system, check it off for them,
and reprocess the file twice a day until the course
actually shows up.
And we carry on our way.
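The existence check he's describing could be sketched roughly like this in Python (his actual implementation was in Perl). The host, token value, and SIS ID here are placeholders; the sis_section_id: lookup prefix, the 404 on a missing course, and the crosslist handling follow what's described above.

```python
import json
import urllib.error
import urllib.request

BASE = "https://canvas.example.edu/api/v1"   # hypothetical Canvas host
TOKEN = "service-account-token"              # from a generic service account, not your own

def section_url(sis_id):
    # Canvas lets you address a section by SIS ID with the sis_section_id:
    # prefix, so you never need to know the numeric Canvas course ID.
    return f"{BASE}/sections/sis_section_id:{sis_id}"

def resolve_course(sis_id):
    """Return the Canvas course ID the grades should land in, or None on a 404."""
    req = urllib.request.Request(section_url(sis_id),
                                 headers={"Authorization": f"Bearer {TOKEN}"})
    try:
        with urllib.request.urlopen(req) as resp:
            section = json.load(resp)
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return None   # course shell not created yet; queue it to reprocess later
        raise
    # If nonxlist_course_id is non-null, the section is crosslisted, and
    # course_id points at the course that is the final repository for grades.
    return section["course_id"]
```

A None result here is what triggers the "go create this course" email and the twice-a-day reprocessing described above.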
Next we've got to get a list of students so we can do that
ID match-up.
This is one of the benefits of being late in the day--
I haven't seen anyone talk about pagination yet.
Are you guys familiar with that?
Basically Canvas will chunk the data into small pieces so
it eases the load on the server and things don't time
out quite as much and a bunch of other reasons.
Here I'm using the course SIS ID.
And back here, I used the section.
Because the section will tell me if it's crosslisted.
And at this point, I want to use the course, because that's
where the grades are eventually going to show up.
When the information is paginated, the HTTP header
looks like this.
And it's easy enough to tell what's going on.
In this case, we've got 180 students or so.
And basically I just scroll through each one of those
URLs, picking up all 10 students at a time
until I get them all.
The response looks like that.
And of course, there's 10 of them within that array object.
And the nice thing about this is, at this point I've got the
Canvas ID number, our SIS ID number, and our SIS login
ID; which is all the information I need to
[INAUDIBLE]
get the grades in.
At this point, this was enough information for me to create
the CSV file for Pam to upload them.
So that's what it looked like.
Easy peasy.
Hopefully you guys have seen that before.
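The pagination walk he just described-- following each URL in the Link header, picking up 10 students at a time until you have them all-- could look something like this Python sketch (again, his real code was Perl, and the header-parsing details are an assumption based on the standard Canvas Link-header format).

```python
import json
import re
import urllib.request

def next_link(link_header):
    """Pull the rel="next" URL out of a Canvas pagination Link header, or None."""
    for part in link_header.split(","):
        m = re.search(r'<([^>]+)>;\s*rel="next"', part)
        if m:
            return m.group(1)
    return None

def fetch_all(url, token):
    """Keep following rel="next" until every page of results is collected."""
    results = []
    while url:
        req = urllib.request.Request(url,
                                     headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            results.extend(json.load(resp))          # one page of the array
            url = next_link(resp.headers.get("Link", ""))
    return results
```

As the audience discussion later points out, this style keeps working even if you request a larger per_page and the server caps it: you just follow whatever next link comes back.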
Now, I need someplace to put the grades.
So we have to have an assignment.
And the question then becomes, do I need to create the
assignment, or is the assignment already existing?
Sometimes faculty members like to put in the assignment as a
placeholder.
That way it shows up in the syllabus tool,
things of that nature.
Usually, in my experiences with Scantrons, I end up
having to create them from scratch.
But we pull a list of the assignments.
And I scroll through them and I just do a straight
comparison on the name of the assignment.
AUDIENCE: Do you have [? a ?] tolerance level or
[INAUDIBLE]?
BRENT SHAW: I do an identical match.
It's just if A equals B type of thing.
That's the response.
I get the name and I get the description.
The description is important for me, because if I have a
crosslisted set of grades coming in, it
could be three sections.
And I'll have already created this on the first section.
And I tag the description with an MD5 hash.
It's not necessarily for security.
But just it lets me know that I created it and it's
safe for me to use.
I don't have to do any more processing.
For me, that's a good column to use.
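The lookup he's describing-- a straight name comparison over the pulled assignment list, plus the MD5 tag in the description that marks an assignment as his-- might look like this. The exact hash recipe is an assumption; he says later it's roughly the assignment name plus the SIS ID.

```python
import hashlib

def upload_tag(assignment_name, sis_id):
    # Hypothetical recipe: hash the assignment name plus the SIS ID.
    # Not for security -- it just marks the assignment as safe to reuse.
    return hashlib.md5(f"{assignment_name}:{sis_id}".encode()).hexdigest()

def find_assignment(assignments, name):
    """Straight A-equals-B comparison on the assignment name, as in the talk."""
    for a in assignments:
        if a["name"] == name:
            return a
    return None
```

Because the hash includes the SIS ID, a copied course (new SIS ID) produces a different tag, which is how he knows a lookalike column isn't one he created.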
So assuming the assignment doesn't exist, we create it.
I create it muted.
Test Scoring just provides the raw score.
25 questions, 25 points.
If the instructor wants that to be 100 points in their
class, they've got to download that, [INAUDIBLE] sort of
manipulation, multiply by four, and go
ahead and upload it.
Again, another reason for it to be muted.
You don't want a student expecting a 96
and seeing a 24.
That's bad.
It blows up their [? conversations. ?]
Submission types is none.
That way the students can't try and submit
something to this.
There's no way to submit anything to this.
And I put the position at negative one.
That forces it to be on the far left of the gradebook, so
that it kind of hits the instructor over the head when
they go in to see it.
And then they can relocate it wherever they want.
And again, I tagged the description with an upload ID.
And again, it's a POST.
And that's what the response looks like.
It should be the same information you gave it--
negative one, muted equals true.
And you can see there, "This assignment was automatically
created by Test Scoring for a Scantron upload," blah, blah, blah.
That's what it looks like in the software.
We've run an entire semester--
probably 150,000 to 200,000 scores for students.
And I've gotten not one question about this verbiage.
So either the instructors aren't looking, or it's clear
enough that they understand what's going on.
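The POST he's describing-- muted, position negative one, no submission types, the tagged description-- could be sketched as a form-encoded payload like this. The field names follow the Canvas Assignments API as described in the talk (the standalone muted flag was the API work Dr. Cavanagh funded); treat the exact parameter spellings as assumptions.

```python
def scantron_assignment_payload(name, points, tag):
    """Form fields for POST /api/v1/courses/:course_id/assignments."""
    return {
        "assignment[name]": name,
        "assignment[points_possible]": points,     # raw score: 25 questions, 25 points
        "assignment[submission_types][]": "none",  # students can't submit anything to it
        "assignment[position]": -1,                # far left of the gradebook
        "assignment[muted]": "true",               # instructor checks the key before release
        "assignment[description]": (
            "This assignment was automatically created by Test Scoring "
            f"for a Scantron upload. {tag}"),      # the MD5 upload tag
    }
```

The response should echo the same values back-- position negative one, muted equals true-- which is an easy sanity check before moving on to grading.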
AUDIENCE: Is it-- sorry-- that upload ID is exposed in that
description.
Is that problematic in any way?
BRENT SHAW: Not really.
Like I said, it's not for security.
It's just an MD5 hash on, I think, the quiz name, or the
assignment name, and the SIS ID.
So it's not anything special.
It just lets me know that I was there.
And if they copied this over for some reason from class to
class-- or from semester to semester, I should say--
that SIS ID would change and it would change the hash.
So I'd know that's not a column I created.
AUDIENCE: One question.
BRENT SHAW: Sure.
AUDIENCE: Did you just [INAUDIBLE] script [INAUDIBLE]
this?
BRENT SHAW: Yeah.
AUDIENCE: OK.
BRENT SHAW: Yeah, I used Perl.
Yeah, I could try to convert everyone in the room and tell
you that Perl's the best language and all
that kind of stuff.
That's not what we're here for.
Besides that, I've only got 30 minutes total.
Now this is what we've been waiting for--
just assign the grades.
Now this is one point where I'm going to share with you--
I probably shouldn't say this.
Is Kerlene here?
No, she's not.
Kerlene's my boss.
Man, I ran into a brick wall, headlong,
full-speed on this one.
I just couldn't figure out what was going on.
This is the way the documentation reads.
And you've got courses, course ID, assignment, assignment ID,
submissions, submission ID--
no, no, no.
That's not logical.
It's not submission ID, it's just ID.
And that's a student ID.
So a student doesn't even actually have to have a
submission in order for you to grade it.
And at that point, things could be much more [INAUDIBLE]
because then you have to do it on there.
[INAUDIBLE]
Don't tell anyone else.
I'll deny it if anybody says it.
That's what the request looks like-- a JSON.
It's real simple.
Give the person a 19.
The response is a big, whole, honking thing of JSON.
But basically as long as the grade comes back as a 19, and
the score is a 19, that's what you're looking for.
And I'll point out here, one's a string and one is a number.
I think that is because it'll take the score, and if there's
a grading rubric or a grading scheme applied to this-- like
a 19 is an A plus--
for a grade you'd see an A plus as a string instead of a
number, I think.
AUDIENCE: Yeah.
BRENT SHAW: Oh, there you go.
All right.
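Step five, the grade assignment itself, might look like this-- including the gotcha he just hit: the last URL segment is the student's ID, not a submission ID. The URL builder and the posted_grade field follow the Submissions API as he describes it; the host and IDs in the test are placeholders.

```python
import json
import urllib.parse
import urllib.request

def grade_url(base, course_id, assignment_id, student_id):
    # The final segment is the *student's* Canvas ID, not a submission ID --
    # no submission has to exist before you can grade it.
    return (f"{base}/courses/{course_id}/assignments/{assignment_id}"
            f"/submissions/{student_id}")

def post_grade(base, token, course_id, assignment_id, student_id, score):
    """PUT a raw score; returns (grade, score) echoed back by Canvas."""
    data = urllib.parse.urlencode({"submission[posted_grade]": score}).encode()
    req = urllib.request.Request(
        grade_url(base, course_id, assignment_id, student_id),
        data=data, method="PUT",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # grade comes back as a string (it could be a letter under a grading
    # scheme); score comes back as a number. Check both match what you sent.
    return body["grade"], body["score"]
```

So giving a person a 19 is one small PUT, and the big honking JSON response only needs two fields checked.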
The last step, step number six, notify an instructor.
We send them an email notification.
We don't send grades.
As a matter of fact, when the security officer at UCF found
out I was going to email grades, you would have thought
I was trying to smuggle uranium.
[LAUGHTER]
BRENT SHAW: He said, no, no, no, you can't do that.
I said, well, wait a minute, I'm using Canvas IDs which no
one really knows.
There's no names, and it's just the score.
I think we're OK.
And we're sending it-- it doesn't leave the data center,
it's going to a CDL employee, and at that point they kind of
said, well, OK.
Not anymore, just this one time.
Thank you.
So anyhow, we don't send the grades.
We do send the notification.
It's a generic email.
We include instructions on how to unmute
the assignment and publish the course.
I will tell you--
there was a while I was doing this by hand myself--
some instructors never publish their course.
So I don't know how they were getting their
grades to the students.
Maybe they weren't.
Maybe they had a FERPA-compliant thing
somewhere else they were using, I hope.
But it makes you wonder.
But I never would have known if I weren't
doing that by hand.
Also, we include some general information about Test
Scoring-- pickup hours, location, [INAUDIBLE].
And I also go ahead and do a bit of a Venn diagram.
You've got a list of students on the Scantron file and a list
of students in the LMS.
We match them and you get this Venn diagram thing going on.
Room for improvement.
Always room for improvement.
For the pagination, if I were to grab larger chunks of data,
I'd have fewer calls.
Retry on a failed grade assignment.
I had one HTTP timeout in about 10,000 grade submissions
I did during the course of testing.
And I was [INAUDIBLE], I actually
account for that as well.
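That kind of rare, transient failure-- roughly one timeout per 10,000 calls-- is the classic case for a small retry wrapper. This is a generic sketch, not his code; the attempt count and backoff are arbitrary.

```python
import time

def with_retries(call, attempts=3, delay=1.0):
    """Retry a flaky HTTP call a few times before giving up."""
    for i in range(attempts):
        try:
            return call()
        except Exception:
            if i == attempts - 1:
                raise                      # out of attempts; surface the error
            time.sleep(delay * (2 ** i))   # simple exponential backoff
```

Wrapping each grade PUT in something like this turns a once-in-10,000 timeout into a non-event instead of a hole in the gradebook.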
Questions?
Comments?
Suggestions?
[INAUDIBLE AUDIENCE QUESTION]
BRENT SHAW: Yeah.
That's where I'm going.
That's how I want to return the--
AUDIENCE: Can you repeat the comment?
BRENT SHAW: Oh, I'm sorry.
The question was, keep using the APIs and create a
conversation instead of sending an email.
And that's kind of what I want to do with returning the
test's metadata.
So yeah, that's [INAUDIBLE] for sure.
AUDIENCE: Have you considered migrating people off Scantron
to [? assess ?] better [INAUDIBLE]?
BRENT SHAW: Have we ever considered migrating folks off
of Scantrons and--
[INTERPOSING VOICES]
BRENT SHAW: --Into Canvas directly?
Yeah.
Yeah, I'm up for it too.
But there are some issues with that.
Faculty are faculty.
And faculty like to do things the way
faculty like to do things.
For anybody that wants to, we'll help them do it.
I think there's a lot of advantages to do it that way.
But sometimes it's just easier,
better, to do it in Scantron.
I know that sounds ridiculous.
But sometimes it is.
You do also have a proctored environment that way.
Great question, thank you.
Yes, sir.
AUDIENCE: Is this the only option for faculty, or can they
opt out if they 100% don't want those
grades sent into Canvas?
BRENT SHAW: Yeah, sure.
They just don't give Scantron.
[LAUGHTER]
BRENT SHAW: No, let me rephrase that.
I see where you're coming from.
And in that first slide, I said, where Flat Tom was
showing you-- use the pink one, don't use the blue one,
brown one, whatever.
Some of the departments have a little Scantron thing--
I don't know, I first saw one in the teacher break room in
my high school-- a little Scantron machine about the
size of a shoe box.
And yeah, those are out there in the departments.
They can run them through there if their department's
got one and do whatever with the grades.
This is quicker, easier, automated.
Yeah, and just do something else.
Questions.
AUDIENCE: I just want to thank you for bringing up
pagination, for mentioning that.
BRENT SHAW: No, that's the only advantage of being late
in the day.
I know like four of these slides I saw earlier.
I went, oh jeez.
Yeah, no one's talked about that yet.
AUDIENCE: And also, you mentioned in your room for
improvement increasing the number of results per page.
You should be able to just do that with the per_page
parameter.
BRENT SHAW: Yes, absolutely.
That's what I was talking about, yeah, pulling fewer
pages and more per page.
AUDIENCE: I think that caps at 50 for that [INAUDIBLE].
BRENT SHAW: Yeah, I think so.
AUDIENCE: [INAUDIBLE]
BRENT SHAW: Yeah.
AUDIENCE: The maximum.
[INTERPOSING VOICES]
AUDIENCE: Yeah.
And that's intentionally undocumented actually, in case
we need to tune it for performance reasons.
That's a good [INAUDIBLE].
BRENT SHAW: It'd be easy enough to try it, and if it
fails, back off down to 40.
AUDIENCE: Yeah, actually.
Since we're talking about pagination-- as long as you're
just following the next link that comes in the pagination
header, if you ask for 50 and it caps
at 30 instead, you can still just keep using it.
BRENT SHAW: You'll get the correct links in the header.
AUDIENCE: Yup.
BRENT SHAW: There you go.
[INAUDIBLE]
AUDIENCE: So one thing you could do with
pagination that we did--
it doesn't help with [? developer ?] requests,
[INAUDIBLE].
We built a little API client in Canvas and grabbed
something like [INAUDIBLE] or something.
[INAUDIBLE]
That thing just checks the pagination header and returns
a list to us.
So that we never know we're carrying the load.
BRENT SHAW: My subroutine does the same thing.
It checks and it just returns the full dataset [INAUDIBLE].
Let me say one thing, I said I was going to
give away my code.
I got it through the Office of Commercialization.
They realized nobody was going to pay for the stuff I wrote.
[LAUGHTER]
BRENT SHAW: Good enough for a paycheck, but beyond
that, not so much.
But it got hung up in the general counsel's office.
So as soon as I can get a final stamp from them to use
an open source license, I'll get it up at that address.
If you liked what you saw here today, folks, tell
them you saw me.
And if you didn't like it, tell them you saw Flat Tom.
[LAUGHTER]
BRENT SHAW: Other than that, we've got some more time.
More questions?
AUDIENCE: I just want to make a comment on the pagination.
We run the open source version ourselves, and we set a
[INAUDIBLE]--
BRENT SHAW: Oh, OK.
AUDIENCE: --To accept the maximum--
BRENT SHAW: Oh I didn't know that.
AUDIENCE: --Number of results that we determined it is.
And we upped it to 5,000 [INAUDIBLE].
[INAUDIBLE] all 90,000 of our accounts took about half
an hour using 50.
And using 5,000 it takes three minutes.
BRENT SHAW: Yeah, if you're running your own instance, you
can definitely do that.
For us, our database is bigger.
So to not kill the database, we've got to [? do ?]
it over--
yes, sir?
AUDIENCE: Have you tried convincing your [INAUDIBLE] to
use [INAUDIBLE]
even more open?
BRENT SHAW: Yeah.
Yeah, we do.
[LAUGHTER]
AUDIENCE: [INAUDIBLE]
I just want to see how you [INAUDIBLE].
BRENT SHAW: It kind of went nowhere.
Yeah.
AUDIENCE: Can you repeat the question?
BRENT SHAW: I'm sorry.
The question was moving--
trying to move faculty and change the mindset to move
towards an LMS quizzing solution rather than a
paper-based and un-environmentally friendly,
[INAUDIBLE], non-green process.
AUDIENCE: What are some of their objections to that?
Right?
So you just saw what happened in Indiana [INAUDIBLE] where
they lost all those K-12 high-stakes assessments.
BRENT SHAW: [INAUDIBLE].
It's the way they've been doing it.
It works for them.
To a certain extent, if it's not broke then don't fix it.
They might be doing other things.
They might, like I said, they might be checking IDs as they
come through the door, they might be putting up
information on the board that now everybody has to use, or
[INAUDIBLE].
It boggles the mind.
They want to [INAUDIBLE].
It's easier for them, whatever that--
they're faculty.
[LAUGHTER]
BRENT SHAW: Yes, sir.
AUDIENCE: You mentioned about [? in ?]
future wanting to upload stats that you're
generating as well to Canvas.
BRENT SHAW: Yeah, through a conversation.
AUDIENCE: Through a conversation?
[INAUDIBLE]
files to there?
BRENT SHAW: Yes, go ahead, and you can upload a file and
attach it to a conversation and send it to [INAUDIBLE].
AUDIENCE: But you're not doing that yet?
BRENT SHAW: No, not yet.
Yes, sir.
AUDIENCE: It says, [INAUDIBLE].
BRENT SHAW: Oh, that was just a joke.
I was hoping if anybody would catch it--
[LAUGHTER]
BRENT SHAW: Did anybody catch it besides him?
Yeah?
Did you get scared?
Yeah.
[LAUGHTER]
BRENT SHAW: Like I said, if you're not laughing with me,
laugh at me.
AUDIENCE: [INAUDIBLE]
BRENT SHAW: Anything else, [? guys? ?]
I think we're out of time.
People are leaving.
Nobody has [INAUDIBLE].
I'll keep talking until [? then. ?]
Any more questions?
No?
OK, thanks a lot.
[APPLAUSE]