Saturday, February 18, 2012

Instrutorials - Part 2 - A Hard-coded Prototype

Hi all,

Like I said in my previous post, it's now time to provide a bit of information about the current structure and look of my application.

The first question was how to create the visualization of the instrument. The best approach, in my opinion, was to use a background image as a picture of the instrument, with an overlay so annotations over the image would be possible. By calculating the locations of the objects on the overlay relative to the screen width and height, I can place them so their positions are always correct, whatever the screen size.
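A minimal sketch of that relative-positioning idea (the class and method names here are my own illustration, not the actual app code): store each overlay dot as fractions of the image size, and multiply by the measured view dimensions when drawing.

```java
// Sketch of relative overlay positioning (hypothetical names, not the real app code).
// A dot stores its coordinates as fractions of the instrument image; the absolute
// pixel position is computed against whatever size the view ends up with.
public class RelativeDot {
    private final double relX; // 0.0 .. 1.0, fraction of image width
    private final double relY; // 0.0 .. 1.0, fraction of image height

    public RelativeDot(double relX, double relY) {
        this.relX = relX;
        this.relY = relY;
    }

    /** Absolute x position for a view of the given width. */
    public int absoluteX(int viewWidth) {
        return (int) Math.round(relX * viewWidth);
    }

    /** Absolute y position for a view of the given height. */
    public int absoluteY(int viewHeight) {
        return (int) Math.round(relY * viewHeight);
    }

    public static void main(String[] args) {
        RelativeDot dot = new RelativeDot(0.5, 0.25);
        // The same dot lands in the same relative spot on any screen:
        System.out.println(dot.absoluteX(480) + "," + dot.absoluteY(800));  // 240,200
        System.out.println(dot.absoluteX(1280) + "," + dot.absoluteY(720)); // 640,180
    }
}
```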

There you go. My first official screenshot. Now what do we see here? (You get points if you shouted out "E"!)
  • Action bar: New in Ice Cream Sandwich is the Action Bar. It contains the title of the application, but can also be used as a menu! This means I can use this to constantly show buttons on the screen (like replay or pause) while not losing too much space. You might ask "Why don't you put those buttons on the right or the left of the recorder?", but imagine the instrument is a piano. This will probably use two or three rows to show the entire keyboard. Conclusion: I need the screen space. Putting everything in the Action Bar makes the application consistent on all instruments.
  • Replay: Yup, my application already supports a (very small) number of MIDI files. Currently the notes from C to Fis (F sharp) are supported.
  • An image of a recorder: I've decided to use a recorder as the starting instrument because of its limited number of possible notes, and because almost everybody has a basic understanding of this instrument. That also makes it one of the best use cases for my application: everyone knows the basic notes on a recorder (learned in primary or secondary school), but who remembers notes like a Bes or a Fis?
  • Red dots: These are painted as an overlay on the background image. They show which holes of the recorder must be covered to produce the current note.
I needed this image to be able to explain the structure of my application. Like I said, currently the notes from C to Fis are supported, because I first hard-coded these in the source code to test whether everything works. I'll explain in more detail in a second.

  • InstrutorialsActivity: Currently my app has only one Activity. On creation, it stores the width and height of the ImageView so the overlay locations are always correct. It opens streams to the provided MIDI and MP3 files (hard-coded at the moment). It then builds an Instrument. When the Instrument is finished, it starts the parsing of the MIDI file and plays the MP3 file concurrently, so the user gets the impression that the instrument is performing the score. It also contains a handler which handles all incoming messages from the MidiParser and updates the views so the correct notes are displayed.
  • InstrumentBuilder: This class creates an Instrument by parsing an XML file. This XML file contains a definition of the Instrument. For example: it contains the range of the instrument, the size of the overlay dots in relative pixels (a piano example would probably need smaller dots), and, for every possible note the instrument can play, a list of dot locations. This XML parsing makes it possible to provide an endless number of instruments. I can simply create a Piano instrument by declaring a background image of a keyboard, a range of notes the instrument can play, a radius for the dot size, and just a single dot at the correct location for each note. The recorder, on the other hand, has a much smaller range but more dots per note (because one needs to cover many holes to play a C).
  • Instrument: Contains a Hashtable of Notes (each Note has its MIDI note value as key), the range of the instrument, and the radius of the dots.
  • Note: Contains a List of Dots and a function to draw them.
  • Dot: Each dot has an x and y coordinate which is relative to the instrument image and independent of screen width and height.
  • MidiParser: Contains the fancy part of the application. It parses the MIDI file and throws events in real time. When an event is thrown, a corresponding message is sent to the Activity. For example, on a NoteOn event, it will send a message containing the note value, so the Activity can tell the Instrument to display a Note, which will draw all Dots.
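The "real time" part of the MidiParser boils down to converting MIDI delta ticks into wall-clock delays before each message goes to the Activity. A rough illustration of that conversion (my own names and simplification, not the actual MidiParser code):

```java
// Sketch of the tick-to-wall-clock conversion a real-time MIDI parser needs
// (my own illustration, not the actual MidiParser). MIDI stores event times
// in ticks; the file's resolution (ticks per quarter note) and the tempo
// (microseconds per quarter note) turn ticks into milliseconds.
public class MidiTiming {
    private final int ticksPerQuarterNote;  // from the MIDI header
    private final int microsPerQuarterNote; // from the tempo meta event

    public MidiTiming(int ticksPerQuarterNote, int microsPerQuarterNote) {
        this.ticksPerQuarterNote = ticksPerQuarterNote;
        this.microsPerQuarterNote = microsPerQuarterNote;
    }

    /** Milliseconds to wait before dispatching an event deltaTicks away. */
    public long deltaMillis(long deltaTicks) {
        return deltaTicks * microsPerQuarterNote / ticksPerQuarterNote / 1000;
    }

    public static void main(String[] args) {
        // 480 ticks per quarter note at 120 bpm (500,000 microseconds per quarter):
        MidiTiming timing = new MidiTiming(480, 500000);
        System.out.println(timing.deltaMillis(480)); // one quarter note = 500 ms
        System.out.println(timing.deltaMillis(240)); // one eighth note  = 250 ms
    }
}
```

After sleeping for that delay, the parser can send its NoteOn message to the Activity's handler.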
Current state:
Currently my application contains a MIDI file made with MuseScore, as well as its corresponding MP3 file. On startup, an Instrument is created by parsing an XML file defining a recorder. When the Instrument is created, the MIDI file is parsed. It throws events in real time, on which the view is updated. The result is an application which displays an instrument playing a MIDI file. One can easily add new instruments by just providing a background image and an XML file defining the instrument.
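To give an idea of what such a definition could look like, here is a made-up sketch (the element and attribute names are illustrative, not my actual schema):

```xml
<!-- Hypothetical sketch of an instrument definition file; element names are
     illustrative, not the actual schema parsed by InstrumentBuilder. -->
<instrument name="recorder" image="recorder.png" dotRadius="0.02">
  <range lowest="60" highest="66"/> <!-- MIDI note values: C to Fis -->
  <note value="60"> <!-- C: cover (almost) all the holes -->
    <dot x="0.50" y="0.15"/>
    <dot x="0.50" y="0.30"/>
    <dot x="0.50" y="0.45"/>
  </note>
  <note value="62"> <!-- D: one hole fewer -->
    <dot x="0.50" y="0.15"/>
    <dot x="0.50" y="0.30"/>
  </note>
</instrument>
```

A piano file would look the same, just with a bigger range, a smaller dotRadius, and one dot per note.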

That was all for this weekend I guess.



Instrutorials - Part 1 - Getting Started

Hi all,

At the request of my promotor Prof. Dr. Wim Lamotte, I've decided to blog about the progress of my Bachelor's thesis. The goal is to write a mobile application (I've chosen the Android platform) which is tied to the API, and will allow users to learn how to play songs on a specific musical instrument by providing a visualization of the instrument and of how to play the notes.

I've already made quite some progress, so chances are this post is going to be a bit lengthy. Anyway, let's get started.

The first question I needed to ask myself was which platform I'd choose for development. Personally, this choice wasn't hard to make. Blackberry? LOL. iOS? Well, one needs a Mac to start coding, an iPhone, and a developer license. Since I don't have any of those, the choice for Android was obvious. No costs at all, open source, and development is possible on all PC platforms. The fact that I own a Samsung Galaxy S II was also a big reason to choose Android. Coding against an emulator isn't that fancy.

Secondly, I needed to decide which API level to use. I must admit I lost some sleep over it. The latest APIs provide an enormous amount of new functionality, both in front of and behind the curtains. However, Ice Cream Sandwich isn't widespread at all. But given that my application wouldn't see any daylight until the end of the semester, and given the new functionality of the APIs, I decided to take the plunge into the unknown and develop my app for Ice Cream Sandwich. This way, both tablets and smartphones would be well supported, and I could follow Google's new official design guidelines as much as I can. Since fragmentation is one of the worst enemies of Android, I chose to be consistent and follow those design guidelines, with hopefully a clean and great app at the finish.

Getting started
The second thing on my mind was obvious. I was supposed to use the API to provide my users with a huge database of music scores. The number of file types offered by the API was quite substantial, so I needed to find one which allowed me to create a visualization of an instrument based on the score.

MusicXML: This was an obvious first choice for me. MusicXML provides everything I need in a clean XML format. However, I found the amount of data in the files so substantial that writing my own parser would take a lot of time. And we've got open-source libraries for that, don't we?

The first library I found was JFugue. It provides a MusicXML parser and has a great structure for implementing my own Renderers based on events thrown by the parser. At first sight, this was the perfect solution. But after importing the library into my project, I soon found out that it needs the javax.sound.midi library. And guess what? That library is NOT included in the Android Java SDK. Oh noes! I tried lots of different things to fix this problem:

  1. Hey. I don't even need MIDI support from that library. I just need MusicXML! So instead of downloading the .jar library, I downloaded the source code and deleted everything which had nothing to do with MusicXML or the structure of the library itself. But even then, it still couldn't build, because it made use of "EventListenerList". And guess what, that also isn't supported on Android.
  2. Secondly, I downloaded the Java files of EventListenerList to put them in my own package, so even though Android didn't have them, my application could make use of them. But this broke JFugue even more, and it started to fail at parsing the XML files correctly. Even including another library (the error message stated I was missing the nu.xom library) didn't fix the problem.
  3. Back to square one.
  4. Perhaps another library than JFugue? I tried lots of different libraries, but they all had the same problem: dependencies on javax.sound.midi. Rats.
MIDI: After all the failures with MusicXML due to missing MIDI libraries, I turned to searching for custom MIDI libraries for Android. Soon, my search turned up android-midi-lib. A quick check assured me that it provided everything I needed for my project:
This code provides an interface to read, manipulate, and write MIDI files. "Playback" is supported as a real-time event dispatch system. This library does NOT include actual audio playback or device interfacing.
Sounds promising, right? I imported the library into my project and was finally ready to start coding. However, when parsing a quickly made MIDI file created with MuseScore, the library crashed with NullPointerExceptions. Great! The exception took place in code written by the developer of the library and had nothing to do with my own code, so I sent a mail to the developer. He quickly replied that the MIDI file didn't conform to the MIDI specifications, but that I could work around the problem with some dirty fixes. After contacting the MuseScore developers, I learned that the MIDI file DID conform to the specifications, but that the library crashed on unknown MIDI events instead of ignoring them. After applying the dirty fixes mentioned by the library's developer, I was finally ready to start coding my application.
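The underlying robustness issue is a general one: a MIDI parser should skip events it doesn't recognize rather than crash. Meta events, for instance, carry their payload length as a variable-length quantity, so an unknown one can simply be skipped. A sketch of that technique (my own illustration, not the actual fix applied in android-midi-lib):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of skipping an unrecognized MIDI meta event instead of crashing
// (a general technique, not the library's actual fix). Meta events encode
// their payload length as a variable-length quantity: 7 data bits per byte,
// with the high bit set on every byte except the last.
public class MidiSkip {

    /** Read a MIDI variable-length quantity from the stream. */
    public static int readVariableLength(InputStream in) throws IOException {
        int value = 0;
        int b;
        do {
            b = in.read();
            if (b < 0) throw new IOException("unexpected end of stream");
            value = (value << 7) | (b & 0x7F); // append the 7 data bits
        } while ((b & 0x80) != 0);             // high bit set means "more bytes"
        return value;
    }

    /** Skip an unknown event body: read its length, then discard that many bytes. */
    public static void skipUnknownEvent(InputStream in) throws IOException {
        int length = readVariableLength(in);
        for (int i = 0; i < length; i++) in.read();
    }

    public static void main(String[] args) throws IOException {
        // 0x81 0x48 encodes 200 (1 << 7 | 0x48 = 128 + 72):
        InputStream in = new ByteArrayInputStream(new byte[]{(byte) 0x81, 0x48});
        System.out.println(readVariableLength(in)); // 200
    }
}
```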

In my next blog post I'll talk about the current structure of my app, and show some screenshots.

Until next time,


Tuesday, February 7, 2012

How to start developing Android apps on Fedora 16 (Linux)

Hello folks!

Damn, it's been a while since my last post. And after reading it again, I realised I didn't finish any of those things I said I would. Aaah. Procrastination. One does not simply make lists of what one wants done and then do those things.

Anyway, back to the topic at hand, the one you're probably here for: getting Android development working on a Linux distribution isn't as easy as it is on Windows or OS X. But I can give you a step-by-step tutorial on how to get started. Let's begin, shall we?
For the record, I have done all this on a fresh Fedora 16 (64bit - gnome3.2) installation.

Downloading some thangs
And here it is, the first and foremost advantage of development on Linux. apt-get and yum!
We need a few things to get started:
  1. Eclipse (+ any plugins you want if you want to use it for more than Android development.)
    It's a great IDE for lots of languages, even has LaTeX support and more!
  2. Get Java. Chances are you need OpenJDK instead of Oracle's JDK. I used the stock Java packages from Fedora's repositories and they do the trick.
  3. Download the Android SDK.
    Find it on Google, bro.
  4. Since my installation is 64-bit, and the Android tools aren't, you need some 32-bit stuff:
    sudo yum install glibc.i686 glibc-devel.i686 libstdc++.i686 zlib-devel.i686 ncurses-devel.i686 libX11-devel.i686
    If you have a Debian-based Linux distribution, just use apt-get instead of yum.
Start installing like a boss
So, you've got the needed tools. Now it's just a question of getting them to play along:
  1. Extract the Android SDK and put it some place where you got permissions.
    (I put it in my home directory)
  2. Install the SDK. You need to do this by running: android-sdk-linux/tools/android
    It will ask you which API levels you want; just make sure to get at least 2.3.3. With that version you currently (as of February 2nd, 2012) support about 60% of the Android population.
  3. Start Eclipse, go to Help->Install New Software
    Just to be sure, press Available Software Sites and make sure that the update site for the latest Eclipse release is checked.
    Add a new source, with the ADT update site as its location.
    Select it, and install the Development Tools it offers.
  4. It will also ask where your Android SDK is located. Just point it to the android-sdk-linux folder.
Done. Start coding.

Extra for being really pro
So, you've dabbled a bit in Android development, but the emulator is rather slow. Fear not, let's start debugging and running your apps on your own phone instead of that emulator!
  1. Make sure your application is flagged as debuggable. This is done in your manifest file.
    Add android:debuggable="true" to the <application> element.
  2. Set your device to allow the installation of Non-Market applications. (Unknown Sources)
  3. Set your device to enable USB Debugging.
  4. Set up your system to enable the debugging. (With Windows, you would need drivers, LOL)
    Log in as root and create the file /etc/udev/rules.d/51-android.rules.
    Fill it with the following:
    SUBSYSTEM=="usb", ATTR{idVendor}=="04E8", MODE="0666", GROUP="plugdev"
  5. Notice the "04E8". I used this specific string because it stands for a Samsung Device.
    I have a Galaxy S2 so that would obviously be the right thing to do.
    However, chances are you got a phone of a different brand. Consult the following table:

    Company          USB Vendor ID
    Fujitsu Toshiba  04C5
    KT Tech          2116
    Samsung          04E8
    SK Telesys       1F53
    Sony Ericsson    0FCE
  6. Finish with a chmod:
    chmod a+r /etc/udev/rules.d/51-android.rules
Have Fun.
That's all there is to it. Just create a new application using the wizard now installed in Eclipse. You'll get a Hello World app, which is perfect for testing whether your IDE is working.

And now, my Padawan, start coding some awesome Android apps.


