>> Well, thanks again for having me here.
I will give a very quick introduction
of what yet2.com is, and then hopefully pose something
igniting, as Jeff asked.
So yet2.com is one of the survivors
of the internet bubble.
And because of that, and because of the fact
that our website basically says nothing about what we do,
my colleagues spend a lot of time explaining what we're not
and what we're not doing.
So I joined the company
after they stopped being an internet company.
And so, I can skip that and just say what we do.
So we claim that we are the technology
and innovation brokers and facilitators.
And that comes with a lot of, again,
pressure and responsibility.
So we have different sides of business right now.
We do have patent transaction business.
We help companies
and individuals buy and sell patents.
On a side note, we do not work for the non-practicing entities,
trolls, whatever you call them;
we only work with real companies.
We have a little bit of a venture arm recently.
We have raised a fund where we're able to invest
in the small companies
where we've seen interesting opportunities.
The core business, which I'm mostly responsible for,
we still call by the old-fashioned term, technology transfer.
And it started mostly on the out-licensing, sell side,
when we were helping companies, large companies mostly,
and then small and midsize companies,
to monetize their intellectual property and technology.
But in the last several years,
most of our business has become related
to what is now called open innovation, and I am--
one of the challenges I kind of bring to the discussion:
I've stopped liking the term because I think it's been abused
and miscommunicated.
So I've started using the old-fashioned term.
What we do is technology scouting, not open innovation.
And we do it a little bit differently;
we really believe it's a full-contact sport.
We don't think that the internet is the solution.
We still have probably the largest online marketplace
but it's just one of the many tools that we use.
Where we're different is that we focus really on the sort
of 80 percent solution, as we call it.
So we almost never deal with ideas.
We never run challenges.
We never run competition.
We're really looking for, as Craig said,
that somebody already solved this problem.
And we have tools and we have experience and we have
a process that allows us to find those potential solutions
and bring them to you.
So the challenge, and kind of the ignite point for me, is
that first, we see big problems in this process and try
to pass them back to our clients.
We see that the biggest--
one of the biggest problems is selection
of the right problem to tackle.
And the second, probably even bigger, problem:
I can tell you we never had a challenge or a need
that we couldn't find a solution for.
We've had a lot of clients that could not act on that.
So one of the big problems is that people and companies
and organizations need to be ready for success,
which is a little bit of a strange notion
when you haven't even discussed your problem.
But unfortunately, finding a solution,
I think, is the easy part.
What you do with a solution, that's something
that we're happy to discuss and kind of--
or leave it on the table for you to think about.
>> Okay, thank you.
Steve.
>> Yeah, great.
One of the things that I've been kind of mulling
over is I've been observing the title here of why we're together
as a panel is, you know, how to run effective challenges
through open innovation platforms.
And I think a lot of the things we've heard from Craig
and a few others is really more along the lines
of maybe changing the question of how do we run challenges
as an enterprise, and how do we make it a part of what we do.
The chemistry that Craig talked about there, to me,
is pretty important because, you know, let's face it,
being open is not the same thing as being disorganized.
And I think that's one of the things
that we've certainly learned through this process.
You know, companies tend to come to InnoCentive
to solve these problems.
We have, you know, a very large network of a quarter
of a million solvers from around the world.
And access to, you know, millions of eyes
through partnerships with SAP and Nature, and The Economist.
But what we've tended to learn over the years, you know,
we've run over 1,200 challenges,
is that there really is a methodology--
a methodology to how you get from point A to point B. And so,
that's, I think, a lot of what hopefully today
we can learn from the panelists here,
and how to use that methodology effectively.
>> Okay, thank you.
Jennifer.
>> Good morning. I'm representing the Space
and Life Sciences Directorate, and also NASA at this point,
because Jeff Davis a couple of years ago really had a vision
for bringing these tools to NASA
and trying to understand their role in our community
and our culture, and whether they could be applied
and could be sustainable.
So last year, we were able to secure some contracts
to do a pilot study with several different types
of open innovation platforms.
And they've been very instructive to us
about how it does fit into our world
and how we can leverage off of the community
at large and their creativity.
And it's a very positive thing to bring the voices
of the larger community back into NASA
and show people how we can implement their ideas
and their creativity and be open to their voices.
I think it's been very helpful to be that inclusive
because I think we have, in the past,
had kind of an arm's length perspective, you know,
people could look at it and idolize it
but never thought they could be part of it or touch it.
And it's made it just more tangible to folks.
And the different platforms we used ranged from crowd sourcing
to technology scouting, internal collaboration tools,
and even software coding.
You know, there's gonna be a lot of great questions today
that will help draw out discussion of how you go
about developing the material that goes
into any one of those platforms.
In the decision process, you have to start to evolve
for yourself a sense of which platform is appropriate
for what material.
You know, just like there are no stupid questions,
there are no stupid challenges.
But you can be not fully prepared
for the challenge platform you've chosen.
So we've really tried to, at this point, do some analysis
and say, if we didn't get an answer, is it really a zero?
Is there really nothing out there?
And we have to go, you know, develop this from scratch
and create a project around it.
Or did we not get an answer 'cause we didn't appropriately
ask the question or pose the challenge.
So we're at that level of analysis.
There's also some understanding
of how open we are to the solutions.
We actually held an internal workshop yesterday and part
of the discussion was, you know,
we've tried to avoid biasing the solution
by doing a good job of developing the challenge.
But if you inadvertently limited those
who are selecting the answers, if they're not open
to something new, then you're not gonna get a solution that's
out of the box 'cause you're not gonna be accepting of it.
So we're very aware of those potential traps
that could make these tools unsuccessful.
Another point that was brought up earlier was the intangibles,
or the role of non-monetary awards.
And, you know, as Karim pointed out earlier
and Craig did just a moment ago, how strong that is for people,
and never to be underestimated.
You know, we have a lot of discussion
about how much should the award be
and did we have enough funding for that
and was that appropriate.
But really, it's not about the money, and you have to start
to really come to an understanding
of those non-monetary drivers and motivators.
It's important for our community to understand that,
because then you understand the community at large better
and how they're responding to you.
So we consider all of our pilots to have been very successful,
but not necessarily just in generating solutions
as much as in what we've learned from them.
We have every intention of continuing our work through all
of the platforms in a variety of different ways.
It's just a matter of working within,
as Bev [phonetic] mentioned earlier,
the infrastructure we have and the procurement that we do.
And once we leverage off of that,
I think the best we had recently was what Robynn brought
to the table in terms of the America COMPETES Act,
and that really threw open the doors for a lot
of different ways that we can do business.
We have to understand a little bit more.
There's gonna be a lot of dialogue, I'm sure, with legal
and procurement about how we go do those things.
But it's definitely opened it up for us.
So we're looking forward to the future
and implementing more in the near term.
And we're already developing more challenges.
[ Inaudible Remarks ]
>> Okay.
>> Should I just start?
>> Yeah.
>> Okay. I guess I don't need to pile
on after the great speeches we've had so far.
I think, from a serious perspective, we've got a couple
of things we can talk
about which are actually quite interesting.
So first of all, we did quite a few challenges
and crowdsourcing and all these kinds of things,
and failed miserably more often than not.
So I'm happy to talk about these things.
I'm also happy to talk about the point--
and I'm really excited that Jennifer brought this up,
the whole non-monetary component.
So we have run a whole bunch of challenges that are very,
very successful and, as I said, lots of failures as well.
And none of our challenges has ever had any material prize.
So, I think the best I've paid so far was to fly someone
out to Mozilla to meet the staff.
And that's, like, you know, whatever, a 2,000 dollar airfare
and a week in Mountain View, and it works tremendously well.
So one of the things which I find really interesting,
especially in the case of NASA: I bet you'll find tons
of people who would die
to do the tour I had yesterday as a prize.
I had a discussion yesterday with Karim
about like the monetary value and I was at a panel
with OpenIDEO where the guys from IDEO talked
about the transaction economy and like people are, you know,
driven by the transaction.
To a large extent, I don't buy that, at least not
for the challenges we run.
So I'm happy to talk about that.
>> Thanks, Pascal.
As we get lots of questions from agencies at GSA,
the first question I was asked was,
what do you mean by "a challenge?"
And I think it means different things to different people.
And so, I wanted to ask each of you,
in the world you're operating in now:
what do you mean by an innovation challenge?
And the second part of that question:
how do you decide what makes a good challenge?
What do you say, we'll start with you.
>> Sure. So we don't use the term "challenge."
We call them innovation opportunities.
And I think a good challenge is a problem statement,
externally presented and clearly articulated.
And then we have a variety of vehicles we use
to disseminate the challenges, some online, some paper,
some person to person.
And what makes a good challenge? I think there are maybe
about four things that need to be true.
One is it has to be a critical, in our world,
a critical business need.
And that means that there is the will to implement the solution.
A couple of people have alluded to that,
that sometimes you can find a solution
and find the people aren't ready to implement it.
So it has to be a critical business need
that there is the will to implement the solution.
The second is it has to be--
it should be something that you haven't already solved.
I mean, that sounds kind of basic.
But the basics are knowing what you know.
So, it should be something that you haven't already solved.
It needs to be very clearly stated.
And that goes towards bringing it back
up into the abstraction a little bit
and making sure it's general enough.
And then the last one is that it needs to be really specific.
Of the briefs I've posted out on our site,
the ones that get the most hits are the ones
that are very specific challenges 'cause those are the
ones that people with the right expertise can read and they say,
"Yes, I know that one.
I can submit that."
So those are the four criteria I have in my organization.
>> Craig.
>> Well, I would say that somewhere at the root
of every breakthrough is what you might call a breakthrough
problem that's been solved.
And what we find seems to be a shared trait
of what we call breakthrough problems,
is they almost always involve some fundamental tradeoff,
right?
And so, for instance, we made plenty of products
that are strong, bleach, let's say, but they sure aren't mild.
So a breakthrough problem would be strong
and mild, not strong or mild.
So I think that's the first thing, is again, these things
that masquerade as facts, the givens of the world, I think,
is one place to look for these problems.
And to start with a fundamental challenge
to something that says, this is the way we fundamentally thought
the world was.
But we're open to the idea that it's different.
And I think that's-- the second thing, I think,
and I've already alluded to this, is not to pick on your posting
for reinventing the light bulb.
But when you phrased it that way,
you've almost automatically limited the potential universe
of innovators to people who know something,
people for whom, when you say light bulb,
the light bulb literally goes off in their head.
And what I'm saying is that, in all likelihood,
given that a breakthrough idea has
to be non-obvious to the light bulb people,
otherwise it would be an incremental and linear idea,
what we find is that the only way
to be systematic about the search,
the search for the somebody somewhere
who has already solved the problem,
is the way that you define the problem:
in generic, not specific, terms.
Because when you do that, you make it easier
for the people the solution is gonna come from,
people who, prior to seeing your post,
would have said they know nothing about light bulbs.
It needs to be written to those people.
That's where innovation comes from.
>> That's great.
Eugene, I think you used the term "technology scouting."
And so, how do you determine what makes a good thing
to do a technology scout on?
>> Yeah. Well, I'll add a little bit more to the confusion
of terminology, right?
So we also don't call it "challenges."
So we call them "needs," and I can tell you
that there are companies who call them "wants,"
and there are a few other terms out there in the world.
I think, again, just not to repeat what's already been said,
but we spend a lot of time with the clients and we found
that the more time we spend on selecting
and formulating these questions, these needs, the better we are
at finding the solution.
I think that's because we're searching.
We're not trying to invent something.
We're not really challenging people.
So, several people in this room have been through our kind of one-
or two-day process of kind of digesting
and breaking their problems apart and trying
to find what's really the core behind a problem.
And I can tell you, in our experience, a few times I left the room
with a client having convinced them
that this is not a problem they have and we cannot help them,
which is kind of painful for a consultant.
But we think we do have a process of picking
and kind of formulating the right need.
And I think there are a few criteria to keep an eye on. One, again,
we're talking about a search.
So the search should be conducive, right?
There should be some data suggesting there's really a chance
that something like that exists somewhere.
I mean, we hate it when, once in a while, we run into the problem
where our clients ask us
to bend the laws of physics to find the Holy Grail
or silver bullet, and they already know the answer.
So that's kind of disappointing at the end for everybody.
Then, obviously, the readiness to accept and to deal
with solutions at different levels,
because sometimes the solution might not be in the right form;
it might not be packaged right for you to deal with.
So organizations should be ready
to deal with whatever we find because, yeah,
we might not find exactly what seems
like the best-case scenario.
And in the end, really, as Craig mentioned,
I think we always try to strip the problem
of application specificity if we can.
I mean obviously if we formulate the problem for NASA
and we had those examples that say that we need
to have something specifically to send it to space,
I mean there are probably--
you probably know better than anybody else, all the people
who work in the space exploration.
So the best chance for us to find something
and bring value to you is if we can formulate it
in an absolutely application-neutral way, so that people
who have never thought about space
or NASA could actually look at this and think
that they have a solution.
So I think there is no definition,
but there is a process for how you can rank and select
and find the better versus not-such-a-good problem.
And of course, the old saying
that the right question is half the answer.
I think it works very well.
And sometimes you really have 2 or 3 problems
within the one problem that you come up with.
>> Great. Steve, how do you define it in InnoCentive?
>> Yeah. I think what's interesting
about what we're talking about here is really the fact
that you have problems, you have needs,
and what a challenge does is really just take
that and articulate it in a way, and call it a challenge,
so that you can get a broad audience to participate.
And so, there is a difference between problems and challenges.
Challenges may be the things that you're looking
for that ultimately solve your problem.
And I think it's somewhat broad
and maybe a little bit undefined.
So I thought maybe a couple of examples would kind
of help us think about it.
Because I think challenge is, when people think
of InnoCentive, a lot of times they think of technical
or scientific challenges.
Maybe when they think of yet2, they think of a certain type.
But talking about broadly challenges,
you could really run them as technical or business oriented,
and you can really run them as either ideas, looking for ideas,
or you can be asking for discrete
or definitive solutions.
And so, I think when you think about an idea,
you might be asking the public
or asking a solver, you know:
What are some marketing messages that I could use
to help get young men to shave?
I mean, these are some real examples.
You know, what are some different ways
that I might be able to affix a medical device
to the inside of an artery wall?
And so, these are looking for ideas both business side
and more technical side.
When you are looking
for something maybe a little bit more of a discrete solution,
certainly from a technical side, you could think of a lot
of examples there but one might be, you know,
looking for a specific compound that has this certain structure
that we're looking for.
Or perhaps you want somebody to develop an algorithm based
on a certain data set that you have.
And then, of course, from the business side,
you can ask for maybe a marketing plan,
that we have a current product and we're looking for people
to give us a business plan for a new market that we might be able
to enter with that product.
So, hopefully, giving a few examples is kind
of helpful to the audience.
>> Thanks.
Jennifer, I know NASA has really been a pioneer in this field,
and NASA has done challenges with InnoCentive
and yet2.com and others.
And how do you decide what would make a good challenge at NASA
and whether you do it externally or internally?
>> So we knew that we needed a process
to decide how the material was gonna come forward
and then be worked on to get into shape
to be able to put on a platform,
whether it be internal or external.
And really, we tried to structure our internal work
in terms of articulating risks.
And then there are elements of those risks,
things we either can't solve or don't know.
And we refer to them as gaps.
So I'm gonna add to the language confusion
for a second before it gets more clear.
And we decided on what we call gaps; it's good to use a word
as long as you define it and you all agree on the definition.
So we've internally agreed on these things.
And this is specifically for my directorate.
I wouldn't say it's NASA [inaudible]
but Jeff is actually working
on a group that's gonna take it a level higher,
so maybe process wise it could have a broader implementation.
So these gaps are really the raw material
that could become challenges.
And we asked each group to really evaluate with their peers
and their colleagues what it was about this gap
that they needed solved or didn't know
or couldn't do internally, you know.
Where was the need or what was the challenge.
And it was at that point, through a lot of discussion and kind
of iteration, that we started
to formulate what material was gonna go forward
and had the higher likelihood of fitting into one
of the platforms, based on some other evidence that's been
brought forward by, you know, some professors
at Harvard and, you know, the academic work that's out there
to help instruct us and be intelligent about our choices.
And we were able to kind of divide them up,
but we also were a little conflicted, because we said
what would really be cool is to take, say, the same six challenges
and run them on the platforms side by side
to understand the differences, the costs
and benefits of each one.
And we didn't have enough, I guess, resources or time
to really run that study.
But we've attempted it in such a way that we thought, you know,
it's pretty clear we had challenges that fit
in the InnoCentive crowdsourcing, more on the
technical side.
We thought these were either gonna give us a rock solid no,
nothing is out there, or we're gonna potentially get a
good hit.
And then we had other ones where, well, you know,
we really needed someone to scout out some collaborators;
it was more of a technical need
that we needed some partnership on to even be able
to articulate the problem better and do more homework.
So that is something that would have gone
into the yet2.com realm.
We also had an internal platform that was customized for us
by InnoCentive where we said, you know, we know as an agency,
it's very difficult to communicate across 10 centers,
around 80,000 people, you know,
to know what any one person is doing.
And we said, should we start in a phased way?
You know, should it start internal first,
and if we get no answer, then you start
to decide what's the right external platform?
So there is a little bit of process overlay,
but we didn't wanna be strict in terms of, this is the recipe
and that's how you follow it.
Because it's not about being cookie-cutter;
that kind of is, you know, in contradiction
to what the idea of these tools is.
So there is some process flavor, there is some art to it
and creativity, and there is some intuition to it.
And then there is the freedom to be wrong, you know.
There was no negative consequence for the person
for inadvertently not making maybe the best choice.
You just-- we really enforced the idea that we needed
to discuss it and learn some lessons from it
and then maybe develop some process or guidance,
advice for people in the future.