Bala Piano

My piano application is finally done and available on the Google Play Store for Android phones/tablets:

A web version is up on my website at

Feel free to play around with it. In the middle are the regular keys (you can slide the mini keyboard on top to change which range of the piano is visible), and in chord mode you can press the notes on the bottom and the chord variations on the right side. For the famous four-chord songs, try clicking C, G, A, F (with pow mode, or if you are fast enough: C maj, G maj, A min, F maj).

The full source code can be found on github:


It was developed in Android Studio 2.2 preview 3 using the libGDX java game development framework:

For people unfamiliar with it, libGDX is a fairly small and basic, but useful, library. It has most of the usual things needed for games:

  • lifecycle management: create, pause, resume, dispose, (resize)
  • game loop: render, Stage.act (for simulating actors in the Stage)
  • hierarchy of objects: the base class is Actor, a Group is a container for Actors, and a Stage handles events and has a root Group that holds all the Actors in a hierarchy.
  • textures/sprites: libGDX can be a good choice for simple 2D games; for this purpose it handles textures, sprites, and animation, although it can be used for basic 3D as well
  • math operations: functions like clamp, and trigonometric functions (sin/cos)
  • file handling: a simple text file can be read like this:
    FileHandle file = Gdx.files.internal("file.txt");
    String text = file.readString();
  • sounds: loading and playing back mp3 files:
    Sound s = Gdx.audio.newSound(Gdx.files.internal("something.mp3"));;

User Interface:

When I was using an older design for my buttons, I noticed how unfairly small the top of the D key was compared to the E, and I started wondering what the actual sizes were.
After some googling I got to John Savard’s page on what the sizes of a piano keyboard should be, and that was my cue when I designed my new layout. According to his post, the black keys are not at all centered between the white keys, and when I checked on a real piano I saw he was right. A surprising number of apps get this detail wrong (including mine on the first try).

Some of the defining aspects of the piano key sizes are:

  • the tops of all the keys should have the same width
  • the bottoms of the white keys should have the same width
  • the layout should be symmetric

The first and the second requirements are already impossible to satisfy at the same time: within an octave there are 12 keys on top but only 7 on the bottom, and they are not spread evenly. From C to E there are 5 keys on top over 3 white keys on the bottom, so each of those white-key bottoms would have to be 5/3 ≈ 1.667 top-widths wide, but the right group (from F to B) has 7 keys on top over 4 white keys, which dictates that this same value should be 7/4 = 1.75. In the end I used the spacing at the bottom left of his page, which sets almost the same width for keys that should match, has good symmetric dimensions, and only uses whole millimetres.
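The arithmetic above can be checked directly with a few lines (a quick sketch; the unit is the width of one top key, and the class name is mine):

```java
public class KeyWidths {
    public static void main(String[] args) {
        // Suppose every one of the 12 top segments in an octave had equal width (1 unit).
        // C..E: 5 top keys sit over 3 white-key bottoms.
        double leftBottom = 5.0 / 3.0;   // each white-key bottom would be ~1.667 units
        // F..B: 7 top keys sit over 4 white-key bottoms.
        double rightBottom = 7.0 / 4.0;  // each would be 1.75 units
        System.out.printf("C-E bottom: %.3f, F-B bottom: %.3f%n", leftBottom, rightBottom);
        // The two values differ, so equal tops and equal bottoms cannot both hold.
    }
}
```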

It did take some time for me to familiarize myself with how to put together the UI I wanted in libGDX; it would have been better if I could have done that through an editor (maybe I should have used one), but thankfully I found the debug() method in the source of Actor, which draws colored lines around my objects’ borders and helped a great deal.

Programming and midi:

What I really liked in libGDX was that the different platforms are in separate folders, and platform-specific code can be written there and used in the core project through an interface. The same approach is described on the wiki:
Let me show you how this is done:

For my app, I needed to play piano notes according to the pressed buttons, thus I needed to have a structure to send when a piano key is pressed/released:

public class NoteEvent {

	public Type type;
	public int pitch;
	public int channel;

	public NoteEvent(Type type, int pitch, int channel) {
		this.type = type;
		this.pitch = pitch; = channel;
	}

	public enum Type {
		NOTE_ON, NOTE_OFF // assumed values; the original listing was truncated here
	}
}

These NoteEvents are then passed through the following interface to the platform specific implementations:

public interface SoundPlayer {
	void processNoteEvent(NoteEvent ne);
}

I decided to go this way because at first I tried to use real recorded sounds (mp3) for playing back piano notes, though that changed later. I only downloaded a couple of C notes, and for the notes in between I played back the existing C sounds at faster/slower playback rates to shift their pitch.
The playback rate for this pitch shifting is given by

float rate = (float) Math.pow(2.0, pitchTransform / 12.0);

given that pitchTransform is the relative difference in semitones between the tone you wish to play and the tone you have the mp3 sound for (e.g. if you have a sound for E3 and you want to play an A3, then pitchTransform is 5; see the picture at the top).

This is because there are 12 semitones in an octave, a note one octave higher has 2x the frequency, and tones lie on a geometric scale. That means the ratio between adjacent semitones is 2^(1/12), and it follows that a difference of n semitones corresponds to a frequency (playback rate) multiplication of 2^(n/12).
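A quick sanity check of the formula (the method and class names are mine):

```java
public class PitchRate {
    // rate = 2^(n/12): shifting a sample by n semitones multiplies the playback rate by this.
    static double rate(int semitones) {
        return Math.pow(2.0, semitones / 12.0);
    }

    public static void main(String[] args) {
        System.out.println(rate(12));  // a full octave doubles the rate: 2.0
        System.out.println(rate(-12)); // an octave down halves it: 0.5
        System.out.println(rate(5));   // E3 sample played as A3: ~1.335
    }
}
```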

I implemented this in RealSoundPlayer, but the problem was that it didn’t work on my phone, although it worked fine on another phone. This kind of distortion comes at another cost too: when you play notes close to each other that come from different mp3 files, the difference can be noticed.
I figured out that having no distortion at all fixed both problems, but this meant I had to have a sound file for every piano key I wanted. Fortunately I found the files I needed, and after some editing and converting they sounded good and didn’t take up much space (~1.5 MB).

What bothered me was that there were apps in the Play Store using MIDI to play back notes, and I really liked the idea of MIDI: when you release a note the sound fades out nicely, and when you hold a note the sound lasts longer too. This was of course difficult/impossible to do with mp3 files, which were played as one-shot sounds that didn’t react to the button being released.

This is when the interface came into play (this was actually when I made the interface, up until this point I didn’t need it).


Desktop Midi:

I created a DesktopMidiPlayer that plays notes using the javax.sound.midi library. The good thing about it is that it’s surprisingly easy to use. Here’s how you can create midi sounds in desktop Java:


	Synthesizer syn;
	MidiChannel[] mc;

	try {
		syn = MidiSystem.getSynthesizer();;
		mc = syn.getChannels();
		Instrument[] instr = syn.getDefaultSoundbank().getInstruments();
		syn.loadInstrument(instr[0]);
	} catch (MidiUnavailableException e) {
		e.printStackTrace();
	}

The synthesizer can be used to get the channels and to load instruments. In the initialization I just get the synthesizer, open it, get the channels, and then load the first instrument from those available, which is a piano.

After the initialization, making a middle C sound (pitch = 60; pitch can go from 0 to 127) on channel 0 (0-15) with a velocity value of 64 (0-127) is one simple line:

mc[0].noteOn(60, 64);

Silencing a note is similar:

mc[0].noteOff(60);
I only needed to add a couple of extra lines and modifications to make my SoundPlayer implementation able to play any pitch I needed. When it was done, I just went into the desktop part of the project and passed a new DesktopMidiPlayer() reference to my core libGDX application through its constructor, and from then on I could use it like any other SoundPlayer.
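For illustration, the whole desktop side can be sketched roughly like this (my reconstruction, not the exact source; NoteEvent and SoundPlayer mirror the core-project classes shown earlier and are repeated only to keep the sketch self-contained, and the fixed velocity of 64 is an assumed default):

```java
import javax.sound.midi.*;

class NoteEvent { // mirrors the core-project class above
    enum Type { NOTE_ON, NOTE_OFF } // assumed enum values
    Type type; int pitch; int channel;
    NoteEvent(Type type, int pitch, int channel) {
        this.type = type; this.pitch = pitch; = channel;
    }
}

interface SoundPlayer { // the platform interface from the core project
    void processNoteEvent(NoteEvent ne);
}

class DesktopMidiPlayer implements SoundPlayer {
    private MidiChannel[] mc;

    DesktopMidiPlayer() {
        try {
            Synthesizer syn = MidiSystem.getSynthesizer();
            mc = syn.getChannels();
            Soundbank sb = syn.getDefaultSoundbank();
            if (sb != null) {
                syn.loadInstrument(sb.getInstruments()[0]); // first instrument: piano
            }
        } catch (MidiUnavailableException e) {
            e.printStackTrace(); // no synthesizer: the events below become no-ops
        }
    }

    @Override
    public void processNoteEvent(NoteEvent ne) {
        if (mc == null) return;
        if (ne.type == NoteEvent.Type.NOTE_ON) {
            mc[].noteOn(ne.pitch, 64); // 64 = assumed default velocity
        } else {
            mc[].noteOff(ne.pitch);
        }
    }
}
```

The desktop launcher can then pass a new DesktopMidiPlayer() into the core application’s constructor, exactly as described above.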


Android Midi:

The more difficult part was Android. I don’t know why, but Android does not support the javax.sound.midi library. After some searching, I found out that Android supports MIDI from Marshmallow (6.0) on, but my phone, and I guess a lot of other people’s phones, do not run Marshmallow. There had to be another way, and so I found a GitHub project that accesses a midi driver through JNI (C++) and also has Java bindings:

I did not know how to use the project; I saw that I could send byte arrays, but I didn’t know what the magic numbers in the example Activity were. Oddly, it was the new Marshmallow API that shed light on it ( :

byte[] buffer = new byte[32];
int numBytes = 0;
int channel = 3; // MIDI channels 1-16 are encoded as 0-15.
buffer[numBytes++] = (byte)(0x90 + (channel - 1)); // note on
buffer[numBytes++] = (byte)60; // pitch is middle C
buffer[numBytes++] = (byte)127; // max velocity
int offset = 0;
// post is non-blocking
inputPort.send(buffer, offset, numBytes);

So as it turns out, the first byte of a message is the message type, and the parameter bytes follow.
For a note on/off message the type is 0x90/0x80 + channel, and 2 more bytes carry the pitch (middle C = 60, same as in javax.sound.midi) and the velocity with which the note is pressed/released, so these messages take up 3 bytes.
For an instrument change message, only the instrument’s byte identifier is needed as a parameter, so it’s only 2 bytes.
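To make the byte layout concrete, here is a minimal sketch (the helper names are mine, not from billthefarmer’s project):

```java
public class MidiMessages {
    // First byte = status: high nibble is the message type, low nibble the channel (0-15).
    static byte[] noteOn(int channel, int pitch, int velocity) {
        return new byte[] { (byte) (0x90 | channel), (byte) pitch, (byte) velocity };
    }

    static byte[] noteOff(int channel, int pitch, int velocity) {
        return new byte[] { (byte) (0x80 | channel), (byte) pitch, (byte) velocity };
    }

    // Program change has a single data byte, so only 2 bytes in total.
    static byte[] programChange(int channel, int instrument) {
        return new byte[] { (byte) (0xC0 | channel), (byte) instrument };
    }

    public static void main(String[] args) {
        byte[] on = noteOn(0, 60, 127); // middle C, max velocity, channel 0
        System.out.printf("%02X %02X %02X%n", on[0] & 0xFF, on[1], on[2]); // 90 3C 7F
    }
}
```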
Changing the instrument to piano was then done with this line (using billthefarmer’s sendMidi method, which creates the byte array and sends it through JNI):

sendMidi(MidiConstants.PROGRAM_CHANGE, GeneralMidiConstants.ACOUSTIC_GRAND_PIANO);

(Where MidiConstants.PROGRAM_CHANGE = 0xC0, and GeneralMidiConstants.ACOUSTIC_GRAND_PIANO = 0, other instruments are listed in the GeneralMidiConstants class)

After implementing my AndroidMidiPlayer, all that was left was to pass a new AndroidMidiPlayer() reference in the AndroidLauncher class to my constructor and the core project could use it without any troubles.

After all this, and after generating the signed apk (which was only about 4 MB), the Play console said my apk was not zipaligned (meaning the zipalign tool that’s supposed to run automatically from Android Studio somehow failed to run successfully). After locating the zipalign tool I had to zipalign my apk manually, and I even had to run zipalign on the output of the first zipalign before the apk was accepted (probably a bug in the Android SDK).

What’s missing from this app is a midi player for the HTML version (GWT). I use the RealSoundPlayer there, because I did not find any simple web midi library with which I could synthesize piano notes. The Web MIDI API is a relatively new standard for web midi access, but I think it’s meant to be used with an external music device (e.g. a digital piano), not for actually synthesizing piano sound on your computer’s hardware.

In conclusion, libGDX is certainly a viable option for simple interactive apps/games, especially if you need some platform-dependent code or library (like midi in my case), although for more complex projects I definitely think Unity or Unreal Engine is the way to go.