Thursday, May 22, 2014

Using the Dolby Audio API in Xamarin.Android

This is a continuation of my previous post on creating a Java Bindings Library for the Dolby Audio API.  If you haven’t already, you can read it here:

Xamarin Java Bindings Library Example using the Dolby Audio API for Android

Now I’d like to take a look at leveraging the Dolby Audio API from within a Xamarin.Android App.

About the Dolby Audio API 

The Dolby Audio API enables mobile developers to access, enable, and benefit from the Dolby technology incorporated in licensed mobile devices.  The enhanced capability is royalty free to an application developer.

A developer can set one of four predefined profiles: Movie, Music, Game, and Voice. Each of these profiles is tuned to achieve the best audio quality in its particular use case.

*From the Dolby Audio API documentation



Dolby Audio API for Android:

Visual Studio 2012/2013 (optional):

And finally, you’ll need to download the binding project from GitHub:

You’ll very likely also want an Android device with Dolby hardware for testing; you won’t be able to enable Dolby Audio Processing without one.  The new Kindle Fire HD & HDX devices are Dolby enabled.

Getting Started

To get started, download the binding project from GitHub, and open the solution in either Visual Studio or Xamarin Studio. 

You’ll need a business license of Xamarin.Android to use Visual Studio.  Officially, VS 2012 & VS 2013 are supported, but this project will also work in Visual Studio 2010.

Next, extract the contents of the Dolby Audio API zip, open the “Library” folder, and drag/copy “dolby_audio_processing.jar” into the Jars folder of the “DolbyAudioAPI” project, which is the binding project itself.


You should now be able to build the binding project (DolbyAudioAPI).  If it builds successfully, you can either reference the project in a new Xamarin.Android project, or find the compiled “DolbyAudioAPI.dll” in the “bin” folder and reference it directly.

Using the Dolby Audio API for Android in C#

Fortunately, the binding project does a pretty good job of encapsulating the Java properties and methods, so we can read Dolby’s own Java documentation and follow along.  However, there are some subtle differences.

In order to use the Dolby Audio API, you really just need two things:

an instance of DolbyAudioProcessing

and a class that implements IOnDolbyAudioProcessingEventListener

Let’s set up a simple Android Activity that also implements IOnDolbyAudioProcessingEventListener:

using Com.Dolby.Dap;

public class Activity1 : Activity, IOnDolbyAudioProcessingEventListener
{
    static DolbyAudioProcessing mDolbyAudioProcessing;
    ...
}

Next, we’ll instantiate mDolbyAudioProcessing in OnCreate:

protected override void OnCreate(Bundle bundle)
{
    base.OnCreate(bundle);
    mDolbyAudioProcessing = DolbyAudioProcessing.GetDolbyAudioProcessing(this, DolbyAudioProcessing.PROFILE.Music, this);
}


To get a DolbyAudioProcessing instance, we call the static method GetDolbyAudioProcessing(..) on the DolbyAudioProcessing class which takes three arguments:

Context p0 – the application context, or simply, the current Activity (this).
DolbyAudioProcessing.PROFILE p1 – the profile used to initialize the Dolby audio processing instance; the DolbyAudioProcessing.PROFILE.Movie profile will be used by default if it is null.
IOnDolbyAudioProcessingEventListener p2 – a class that implements IOnDolbyAudioProcessingEventListener for receiving events from the Dolby audio processing background service.  In our case, again, it’s our current activity (this).

DolbyAudioProcessing should not be instantiated directly, and you should only call GetDolbyAudioProcessing(..) once during the lifetime of your application.  If you call it a second time without first releasing the instance, an exception will be thrown.

Implementing IOnDolbyAudioProcessingEventListener

You’ll need to implement four methods that allow you to monitor the state of the Dolby processing hardware:

void OnDolbyAudioProcessingClientConnected();
Will be called when a connection has been made to the Dolby audio processing background service.

void OnDolbyAudioProcessingClientDisconnected();
Will be called when an abnormal disconnection from the Dolby audio processing service occurs.

void OnDolbyAudioProcessingEnabled(bool p0);
Will be called when an external application enables or disables Dolby Audio processing.  The bool p0 indicates on (true) or off (false).

void OnDolbyAudioProcessingProfileSelected(DolbyAudioProcessing.PROFILE p0);
Will be called when an external application has selected one of the Dolby Audio processing profiles.
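Putting the four callbacks together, a minimal implementation inside our Activity might look like the sketch below. The logging tag and the body of each callback are just illustrative; all your app really has to do is react to these state changes.

```csharp
public void OnDolbyAudioProcessingClientConnected()
{
    // The background service is now connected; it's safe to query or change state.
    Log.Debug("DolbyDemo", "Connected to the Dolby audio processing service");
}

public void OnDolbyAudioProcessingClientDisconnected()
{
    // Abnormal disconnection; avoid calling into the instance until reconnected.
    Log.Debug("DolbyDemo", "Lost connection to the Dolby audio processing service");
}

public void OnDolbyAudioProcessingEnabled(bool on)
{
    // Another app (or the system) toggled device-wide Dolby processing.
    Log.Debug("DolbyDemo", "Dolby audio processing is now " + (on ? "on" : "off"));
}

public void OnDolbyAudioProcessingProfileSelected(DolbyAudioProcessing.PROFILE profile)
{
    // Another app switched the active profile out from under us.
    Log.Debug("DolbyDemo", "Profile changed externally to: " + profile);
}
```

Because Dolby processing is a device-wide setting, these callbacks are your only way to learn about changes made by other applications, so it’s worth handling all four even if some bodies stay empty.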

Changing Dolby Audio Profiles

After you’ve called GetDolbyAudioProcessing(…), you can change the processing mode by calling SetProfile().  SetProfile takes a PROFILE enum, nested in the DolbyAudioProcessing class, as an argument, which can be one of: Movie, Music, Game, or Voice.


if (mDolbyAudioProcessing != null)
    mDolbyAudioProcessing.SetProfile(DolbyAudioProcessing.PROFILE.Game);

Enabling and Disabling Dolby Audio Processing:

Dolby Audio processing is a hardware feature of your mobile device, and when enabled, it will process all audio playback on the device.

Because of this, you should check and save the state of the Dolby Audio Processing when your application starts, and restore this state when your app or game is pushed to the background.

To enable or disable Dolby Audio Processing, you can directly set the public property Enabled:

mDolbyAudioProcessing.Enabled = false;
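Putting the save-and-restore advice above into practice, one approach is to capture the device-wide state when the service connects and put it back in the activity lifecycle. This is a sketch; the member names Enabled and Release() are how the generated binding should surface the Java getters/setters and release() method, but verify them against your compiled binding assembly.

```csharp
bool mSystemWideEnabled;    // the state the user/system had before we changed anything

public void OnDolbyAudioProcessingClientConnected()
{
    // Remember the device-wide setting so we can restore it later.
    mSystemWideEnabled = mDolbyAudioProcessing.Enabled;
}

protected override void OnPause()
{
    base.OnPause();
    // Our app is leaving the foreground; put the device back the way we found it.
    if (mDolbyAudioProcessing != null)
        mDolbyAudioProcessing.Enabled = mSystemWideEnabled;
}

protected override void OnDestroy()
{
    base.OnDestroy();
    // Release the single instance so a later GetDolbyAudioProcessing(..) call won't throw.
    if (mDolbyAudioProcessing != null)
        mDolbyAudioProcessing.Release();
}
```

Restoring in OnPause (rather than only OnDestroy) matters because another app’s audio plays through the same hardware the moment yours is backgrounded.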

Further Information

Take a look at the two sample projects included with the Java Bindings project.

DolbyTest1 is an example of implementing IOnDolbyAudioProcessingEventListener in the activity.

DolbyTest2 uses a separate class for the IOnDolbyAudioProcessingEventListener implementation.

Also, be sure to take a look at Dolby’s Java documentation included with the API for further details and some suggested best practices.

Dolby Audio API Component in the Xamarin Store

If you don’t want to download and build the binding project yourself, there will be a Xamarin component coming soon!

Sunday, May 4, 2014

Xamarin Java Bindings Library Example using the Dolby Audio API for Android

After discovering the Dolby Audio developer program at a conference recently, I really wanted to try their new Dolby Audio Java Android API, but I didn’t want to learn Java programming to do it.

This seemed like a perfect opportunity to try out Xamarin’s Java Bindings Library project for Xamarin.Android applications.



Dolby, as an audio company, needs no introduction, but you may not know that they are focusing on the mobile space and have a developer program for app developers.  What this means is, if you have a mobile device that includes Dolby hardware, you can make use of that hardware to greatly enhance the audio in your apps and games.  There are quite a few Dolby-enabled devices on the market, most notably the new Kindle Fire HD & HDX Amazon tablets.

Check out Dolby’s free developer program here and download their API:

If you’re a C# developer and you haven’t heard much about Xamarin yet, you will.  The quick summary is: they make really amazing tools for developers to create native iOS, Mac & Android applications, from either a Mac or PC, written entirely in C#.

* Full disclosure – when I started this blog I was simply a mobile developer that preferred to write as much code as possible in C#.  This naturally led me to Xamarin – and I love their products.  However, at the time of writing, I am working as an instructor for Xamarin’s training & certification program: Xamarin University.  But definitely download the free trial and check it out:


Installation / Setup:

I’m not going to walk through the Xamarin setup, simply because they already have great docs on how to do it:

For the Dolby Audio API, you’ll just need to register as a developer, login, and download the Android API (bottom of the page):

The Dolby Audio API is at version 1.1 at the time of writing.

I’m using the Xamarin.Android Business edition with Visual Studio 2013 on Windows 8.1, but this should all work perfectly in Xamarin Studio on either a Mac or PC.


Creating the Project:

In either Visual Studio 2012/2013 or Xamarin Studio, start to create a new project, browse to the Android templates, choose the “Java Bindings Library” project type, pick your location, and press the OK button.



Add the Dolby Audio API Jar:

If you haven’t already, unzip the Dolby API package.  In the Library folder you should see “dolby_audio_processing.jar”.  Simply drag this file into the “Jars” folder of your newly created DolbyAudioAPI binding project.


Then change the build action of the Jar file to “Embedded Jar”. 


* Note: if you’re binding an API that contains additional reference jars, you would also add them to the “Jars” folder in your project and set their build action to “EmbeddedReferenceJar”.
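For reference, those build actions end up as item groups in the binding project’s .csproj file.  A sketch of what the entries look like is below; the second jar name is only a hypothetical example of a supporting library:

```xml
<ItemGroup>
  <!-- The primary jar, bound and embedded into the binding assembly -->
  <EmbeddedJar Include="Jars\dolby_audio_processing.jar" />
  <!-- A supporting jar the API depends on but that you don't need C# bindings for -->
  <EmbeddedReferenceJar Include="Jars\some_support_library.jar" />
</ItemGroup>
```

Embedding the jars this way means consumers of your binding dll don’t need to ship the jar files separately.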


Build the Project:

Now we just need to build the project, which will create a Xamarin.Android compatible dll that we’ll be able to reference in our Xamarin.Android projects.

In some cases, when you create a binding project you may get build issues.  You would then need to edit the mappings used to create the binding.  Fortunately for us, in this case the Dolby Audio API is well formed and well structured, making binding a straightforward process.

If you run into difficulties, I definitely recommend checking out Xamarin’s article on Binding a Java Library here:


Reference the new Binding Assembly:

Our last step is to simply reference our new library in a Xamarin.Android project.

We can do this in two ways:

If we create a Xamarin.Android project in the same solution as the binding project, we can simply add a reference in our new project to the binding project.


Or we can directly reference the created dll, which you can find in either the bin\release or bin\debug folders of your binding project.  If you’re planning on copying the dll to another location you just need the library - “DolbyAudioAPI.dll”.



Using the Dolby Audio API:

We can now start interacting with the Dolby Audio API from within our Xamarin.Android project(s).  Dolby has done a great job of documenting the API in the downloadable API package, check out the Quick Start guide in the “Documents” folder first: “QSGuide-DolbyAudioAndroidPlug-in.pdf”.

Additionally there is sample Java code on their developer portal:

I’ll be showing how to use the Dolby Audio API in C# with Xamarin.Android in a following post.  But in the meantime, you can download the binding project along with two Xamarin.Android sample apps from GitHub here:

Saturday, May 3, 2014

P3P - The Precise 3 minute Presentation

This is a guest post by Nathan Roarty, an experienced systems engineer and accomplished presenter in the technology space.


In discussing how to encourage the members of the community to share with the group, we began to consider what the format should look like.  Professionally, people are often confronted with topics that they don’t feel a passion for, making it difficult to impart a compelling story and leaving both the presenter and the audience feeling indifferent to what was shared.  On the other end of the spectrum, often we are so excited by a story that we don’t impart the details coherently, missing an opportunity to share something interesting with our audience.

Many meet-ups use popular formats: elevator pitches, lightning talks, or the longer TED Talk format.  Each of these formats has its strengths and weaknesses, but none of them really seemed to fit.  It was time to come up with our own format, and so the P3P (pronounced “pep”), the Precise 3 minute (and 33 second) Presentation, was born.

The idea of a P3P talk is very simple: encourage members of the group to stand up and impart a short story.  The length of time is intentionally short, three minutes and thirty-three seconds, to make it less intimidating and to encourage the storyteller to focus on a few salient points.  Key to the P3P talk is giving the audience enough information to get them interested, enough to make them want to follow up with the presenter.

The P3P format is easy to follow: within three minutes and thirty-three seconds, the presenter will impart a few pieces of information.  The following are guidelines on what we believe the presenter would want to cover:

1. What's the need?  For the first part of the P3P, the presenter will provide the background on the need that they identified. 

Most fairy tales identify a need as something like ‘rescue the princess from the dragon’.  In technology, it might be a gap in the market, such as ‘the ability to send messages cross-platform using data services, not SMS’ (WhatsApp).  This is the opportunity for the storyteller to grab the audience’s attention and set the context for their P3P.

2. What's the solution? This is where the presenter has the opportunity to tell the audience how they met the need. 

Using the fairy-tale metaphor: “I used a silver-tipped arrow to kill the dragon and rescue the princess.”  This part of the P3P is optional; sometimes the storyteller might not know what the solution is.  They may understand the need but haven’t quite figured out how to address it yet, so they may be asking the audience for ideas and input on how to go about meeting their identified need.

3. How was it solved?  Sometimes the solution isn’t as interesting as the journey; this is the opportunity for the storyteller to describe how they managed to go from the need to the solution.

Using a different metaphor, The Lord of the Rings wouldn’t have been nearly as interesting if the journey had been left out.  This section, like the solution, is also optional.  It’s possible that the storyteller has an identified need and a solution but doesn’t know how to get there, so it’s a chance to pose the question to the participants and canvass ideas for getting from need to solution.

That's it, we are currently trying to decide what happens after the P3P, do we give an opportunity for questions?  It would appear convenient to offer a P3P presenter a slot of five minutes, they can cover their P3P with time for one or two short questions.  Regarding time keeping, we haven't yet been keeping strict time for those presenting at the meet-up, we've had the luxury of time to allow people to run over, but moving forward the intention is to tighten things up.  We're also looking at developing some tools to help P3P presenters prepare, more on that to follow.

written by Nathan Roarty @njr_itarchitecht