OK, thank you Manfred. As the slide says, I'm Keith Bauwise. I hold the role of a bada evangelist
within the Samsung Mobile Innovator group, based at the SERI offices in Staines, and
my main role as a bada evangelist is primarily platform promotion and application prototyping.
What I'm going to do now is get into what you are really here to see, which is what
we're doing with Augmented Reality, so let me present to you AiRaid: Rise of the Undead.
An augmented reality first person shooter, or what we like to refer to as a vacuum blaster
game. What we've done is take the live camera preview,
and we're using that as a viewport in which we then augment your immediate vicinity with
ghosts. These ghosts come at you. Your goal is to
simply survive two minutes, and suck up as many as you can. Now this is a full, 360-degree augmented
reality game, and in order to play it you typically engage in motions such as this.
So, what technology did we use? Obviously the camera; we used OpenGL ES 1.1
graphics; we used the compass, to give direction; we used the accelerometer, to measure the user
as they pan around; we used the POWERVR SDK from Imagination Technologies; we used active and
passive audio, which is really quite engaging; and we used Location.
Now within bada we have something called the SensorManager class, and this provides support
for acceleration; the magnetic sensor, for the compass; a proximity sensor, which is simply
an on-off type sensor; and the tilt sensor, which gives you pitch and roll. Typical,
as you would expect. In addition to that, we actually have a WeatherSensor, which quite
surprises people. What it does is provide a mechanism by which a bada application can
make an HTTP request to a weather service provider, and this weather service provider
will populate the bada device with weather information. And of course there is GPS location,
which is provided by the LocationProvider class.
Ok, so how do sensors work within the bada platform? To start making use of sensors within
bada, we first of all need the SensorManager class: we need to construct a SensorManager
object. Having constructed a SensorManager object, we then need to create a listener.
Now listeners are the mechanism by which, in the bada platform, we can receive asynchronous
responses. So here, the acceleration sensor is going to report back acceleration data
every 30 milliseconds, and the tilt sensor is going to report back every 40 milliseconds.
What we're doing here is creating a class, called SensorSample, and it's inheriting from
an interface class, ISensorEventListener. All interface classes in bada are abstract classes,
which means there is a pure virtual method contained within the class which has to be
overridden. That pure virtual method is OnDataReceived. So by inheriting from the
ISensorEventListener class, once we construct the sensor that we wish to have report to our
application, it will report back to our application through this OnDataReceived method.
The application framework will call that method in your application when data is available
for it. So let's see how we do this. We have a constructor and a destructor, and in the
constructor we are simply constructing the SensorManager object; this is a prerequisite,
we have to construct the SensorManager object. In the CreateSensor method here, we are using
the constructed SensorManager object to determine whether or not acceleration sensor data
is available on the platform. You'll want to check that data is available, at runtime,
before you start using it. If it is available, what we're going to do is add a listener
to this particular sensor. So what we're doing here, in the AddSensorListener call, is
passing the this pointer, which is a reference to this application, and which tells the
application framework that the SensorSample class overrides the OnDataReceived method.
The second argument simply specifies that we wish to make use of the acceleration sensor.
The third argument, 50, is the interval in milliseconds at which we wish to receive
information back from the acceleration sensor. And the last argument, set to true, is a
data-changed flag, which allows you to optimise your event reporting. With the argument set
to true, the application framework will inform your application only when the data has
changed. So, if you've set an interval of 50 milliseconds, and after a 50 millisecond
interval has passed the data hasn't changed, OnDataReceived will not be called. If you set
it to false, it will report back every 50 milliseconds regardless.
So again you have that added flexibility to optimise your application. So now, we have
our SensorManager constructed, and we’ve added an acceleration sensor, which is going
to report back. So every 50 milliseconds the OnDataReceived method, in our application,
will be called, with the necessary acceleration data. Ok, so what we’re going to do now,
we're going to look at a bit of our AiRaid application, what's actually happening
internally, and how we actually perform the initialisation. Now, initialisation within
AiRaid is, how can I describe it, split in two: we have internal initialisation and external initialisation.
Everything on the left is pretty much internal to the AiRaid application. We do the necessary
EGL initialisation (EGL is the interface between the platform graphics and the OpenGL
standard library), so we actually set up our native window and do the necessary binding,
which I have code extracts of to follow. We then have to start the camera: the camera
has to be started and configured, but we don't actually start the live preview until we
actually start the game. Now this was something which we had to tweak, because in the
first incarnations of the game we actually started the camera and started the live preview,
but kept it hidden, to allow the transition from the game menu to the live camera preview
to be quite smooth. We were finding that the device was getting quite warm and it was causing
battery drain, so the camera is actually started and configured, but the preview doesn't
actually begin until the game begins. We also initialise the registry; in the registry we
have user settings such as sound on and off, vibration on and off, and we also store the
high-score table. For external initialisation we have the UI
Form Manager. The UI Form Manager is a manager which manages all the UI forms, all the visuals,
all the screens that you see in the application. Its job is to know which screen is up and
which screen is to come next, so it's like an event scheduler for all the various screen
forms that you see in our application. We have an Audio Manager; its job is to manage
both active and passive audio. By passive audio we mean the background audio that plays
whilst in a game, and also while you are navigating the menus. The active audio is the
sounds which you trigger while actually playing the game: so that could be enabling the
switch on the vacuum to suck up the ghosts, the ghosts when they are actually being sucked
down the vacuum, and also the sound which is made when the ghosts actually attack you.
The game level controller: this is where all the graphics are actually contained. The game
level controller works with a number of other objects which actually manage the gameplay,
so this is constructing the level, setting the gameplay clock; it actually manages the
sensors, and it manages the touch events which you need when you are playing the game.