Sunday, May 27, 2012

Instrutorials: Now on Google Play

Hi all,

Just a small, quick update: Instrutorials has been released on Google Play.
You can download it at

Android app on Google Play

The rest of this post consists of some of the latest screenshots:

Visualization of a piano score.

Showing search results.

Showing detailed information about a score.

Downloads integrated in Android OS.

Use Instrutorials to open and visualize locally stored MIDI files.

Set the playback speed of the visualization (however only 100% has audio).

Two-pane layout on a tablet.


See you later,


Monday, April 23, 2012

Torus 1.02 Beta

Hi all,

For those who don't remember (or never played/downloaded Torus: Tower Rush), here's a screenshot:

I've recently noticed that my small Java game called Torus: Tower Rush was no longer available for download.
The reason for this was the closure of Megaupload, since the file was hosted there.

I didn't have version 1.01 Beta lying around, so I cleaned up the work-in-progress I had and reuploaded the game.

The result has no new features, but the code now provides a framework for recording games and playing back those games. You could record awesome games and share them with your friends this way.

The function isn't implemented yet, but expect it in version 1.02 Beta (whenever it comes).

Download: DROPBOX



Sunday, April 22, 2012

Instrutorials - Part 6 - Piano and more

Hi all,

Been a while since my last post; I've had a rather busy schedule over the past few weeks, but now I'm here to show some more progress on the implementation of my project.

I'll do this by starting with a video I recorded for a presentation last week, so you can see my application in action:

First off, the audio and video aren't really 100% synced; this is because the screen recorder could not record audio, so I added the audio manually afterwards (well, I actually let a friend do it, since I'm a newbie at video editing).
You can see the user searching for scores containing "Pokemon". Once a score is selected, the score is played back to the user in real-time, by both audio and video. The user then searches for a score for the recorder, and then that score is played back to the user. It's the basics of the application, and it works. Yay!
Adding piano support to the app was very easy due to my current structure. I just needed to add an XML file, an image, and a few lines of code. If someone provides me with an XML file and an image for a different instrument, I can add it in less than a minute.

I have also gone back to MIDI for both my audio and visual feedback. The cause of all the MIDI bugs turned out to be a problem with file extensions. Android's MediaPlayer class can play most file types by just reading the header, but that fails for MIDI files. Since I first wrote my temporary files as .dat files to the internal file system, the MediaPlayer failed. Now that I'm back to using MIDI, loading times have dropped dramatically: MIDI files are so much smaller than MP3s, and I don't need those anymore!
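The workaround is essentially to make sure the temporary file carries a proper .mid extension before handing its path to the player. A minimal sketch of that idea in plain Java (the helper class and names below are my own invention; the Android MediaPlayer call itself is omitted):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class MidiTempFile {

    // Copy a downloaded score to a temp file that keeps the .mid extension,
    // so a player that relies on the extension (rather than the header)
    // accepts it. Writing it as ".dat" is exactly what broke playback.
    public static File writeTemp(InputStream in, File dir) throws IOException {
        File out = File.createTempFile("score", ".mid", dir); // not ".dat"!
        try (FileOutputStream fos = new FileOutputStream(out)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                fos.write(buf, 0, n);
            }
        }
        return out;
    }
}
```

On Android one would then pass the resulting path to MediaPlayer's setDataSource.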

Anyway, since the video I've added some things.
Firstly, I changed the search interface. I now presume that the user knows what score he's looking for, so I removed the browsing part of the GUI (it wasn't implemented anyway). The removal of those tabs gives me some more vertical space, which has a nice result:


Next, I've also added an overlay when the app is searching for scores:

This solution is clean and makes clear to the user that the application is working.

I've also added another screen between the score list and the actual playback of the score. When a score is selected, the user now gets to see a screen like the one below, with more details about the score.

Now the user has the option to play the selected score, or download some files. Of course, the "Play score" option loads the activity as shown in the video. The download button allows the user to choose a file-type:

If the user selects one of these five options, the application will download the corresponding file using Android's DownloadManager. This means the files will be stored in the default download folder, as chosen by the user. It also shows a notification and allows the user to open the file directly once it's downloaded. Using the DownloadManager also allows the user to keep using Instrutorials while the files are being downloaded, since that happens in the background. If one pulls down the notification bar whilst downloading, he/she gets to see this:

The downloaded MIDI and MP3 files can be played back by the standard Media Player, while the user can use Adobe Reader to check out the PDF file.

The "Details" window, as I call it, doesn't make the core of my application (the actual playback) any more potent, but I personally feel that it makes the application much more complete. Even someone who doesn't need the playback functionality can still use the app just to search for and download scores on a tablet and/or smartphone. If I didn't know better, I'd call it the official MuseScore app :P

To close this blogpost a quick screenshot of the playback Activity:

As you can see, I added functionality to the Activity to pause and resume playback. The "NEXT" button on the ActionBar is supposed to let the application play just the next note and then pause, so by pressing it repeatedly one could step through the score note by note. However, the library I'm using for MIDI files on Android isn't really cooperating. Between calling the pause function on the library and the actual pausing there is a gap of at least 90 milliseconds. That's acceptable for manual pause and play, but for real-time pausing on every single note it's unworkable: when the library encounters a note it should pause automatically, but before it actually pauses, at least a few more notes have already played. I'll try to find a solution for this, but I can't guarantee finding one, since I'm really limited by the library in that respect.

Anyway, that was a quick peek into the latest updates of my implementation.

See you next time,


Sunday, April 1, 2012

Instrutorials - Part 5 - Working prototype for Smartphones and Tablets

Hi all,

Last week I finally reached the point where I have a functional prototype of my application. The code definitely needs some clean-up before going public, but hey, it works. One can browse the API for scores written for piano as well as for recorder. Scores are then played back with a visualization of the instrument (currently only for the recorder).

Since my last blog post, I bumped into some issues which stalled my progress. I'm going to talk about those below.

Android MediaPlayer
Sigh. I already had a hard-coded prototype of the visualization working in my second post about this topic. It used a MIDI file included in the application's resources. However, when trying to connect this Activity to my search-and-browse Activity from my previous post, the MediaPlayer class kept failing. I could not, in any way, get it to play a MIDI file. I tried creating a MediaPlayer from the URI, and downloading the file to a temporary location and loading it from there, but each and every time the MediaPlayer failed with the same error: "Unable to locate the file". When I used the URI of an MP3 file, everything worked fine. Because of these ridiculous errors, I decided to use both MP3 and MIDI files again: MP3 for audio, MIDI for the visualization. My own code could use a locally stored MIDI file, but Android's MediaPlayer class could not.

Visualisation on Tablets
For development and testing, I always used my Samsung Galaxy S II with CyanogenMod 9. However, my application should also run perfectly on a tablet, since it's using the Ice Cream Sandwich SDK. When I demoed my current progress to my promotor, we noticed that the visualization of the red dots was quite off. I found this strange, because I already place the dots based on percentages, not on actual pixels. After some digging around, the cause was rather obvious. Android only allows three possible definitions for the width or height of an ImageView: "MATCH_PARENT", "WRAP_CONTENT", or a fixed size. I had defined the height as "MATCH_PARENT" (filling the whole screen in this case) and the width as "WRAP_CONTENT", thinking it would scale nicely. However, this was not the case, as the width of my resulting ImageView was the width of the original image, independent of its scaled height.

To solve this, I defined my own ImageView subclass, "AspectRatioImageView", and overrode the onMeasure(int, int) method to let it calculate its own height and then use that height to calculate the corresponding width. In code, it looks like this:

@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
    if (getBackground() != null) {
        // Take the height we are offered, then derive the width from the
        // background image's intrinsic aspect ratio.
        int height = MeasureSpec.getSize(heightMeasureSpec);
        int width = height * getBackground().getIntrinsicWidth() / getBackground().getIntrinsicHeight();
        setMeasuredDimension(width, height);
    } else {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
    }
}

No syntax highlighting here, but I guess you'll get the point.

Next up, I'll finally implement piano visualization, clean up my code (that can take some time :p), and add some more functionality.


Saturday, March 17, 2012

Instrutorials - Part 4 - API calls and UI problems

Hi all,

Once again another update on my Bachelor's Thesis. I tried finishing the search implementation, consisting of both API calls and a decent interface.


Like I stated in my previous post, I'm using @pjv_'s library for managing the API calls to MuseScore. I already had a good idea about how I'd build the interface, so I first started getting those API calls to work.

@pjv_'s library was a bit difficult to get started with. It uses a MuseScore object which contains everything JSON-related, but has a huge constructor and not that much documentation (since it was originally developed just for personal use). Therefore I downloaded and checked the source code of Collectionista, the app for which this library was created. After digging through numerous source files I finally found what I was looking for and based my code for initiating an API call on some code I found in Collectionista.

So, now I had a way to contact the API. I decided to implement search before browsing, since search is basically sending a customized query to the API and parsing the server's reply. Since the library puts all resulting data in Score objects, it was quite easy to get the results of a query, meaning the title, composer, number of pages of a score, etc. However, all that feedback was textual.

I wanted to give the user a quick peek at the scores. As they say, an image says more than a thousand words. The MuseScore API provides static links to small generated thumbnail images of the scores. However, this functionality is not supported in @pjv_'s library. Therefore, I extended his library so that Score objects can also contain a bitmap, meaning the thumbnail can be stored in the Score object. The developer himself is responsible for filling this variable, since the image is requested in a second query. Also, it should still be possible to use the library without thumbnail images: if fetching these images were part of a normal query, the application would use a lot more data, even when it doesn't need the images.

Each score in the server's response is shown in the application, along with its title, composer, number of pages, and a thumbnail. These thumbnails are loaded in the background, so at first sight they might show blank, but once loaded they appear instantly. This thumbnail loading is another reason I decided to store the images in the Score objects. Android deallocates every item in a ListView when it is not visible, so images would be deleted on scrolling while others had to be loaded; upon scrolling back to the top, the bottom ones would be deleted and the first ones downloaded again. This caused lag because of the constant background threads fetching those thumbnails. By storing the images in the Score objects, each image only needs to be fetched once: when fetched, it is stored in the Score object, so while scrolling these images can be shown instantly without spawning numerous new threads to fetch them all over again.
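The caching idea boils down to memoizing the fetch result on the Score object itself. A stripped-down sketch in plain Java; note that Score and Fetcher here are my own simplified stand-ins (with a byte[] instead of an Android Bitmap), not the actual classes from @pjv_'s library:

```java
public class ScoreThumbCache {

    // Simplified stand-in for the library's Score object, extended with a
    // cached thumbnail field.
    public static class Score {
        final String thumbUrl;
        byte[] thumb; // filled on first fetch, reused afterwards
        public Score(String thumbUrl) { this.thumbUrl = thumbUrl; }
    }

    // Hypothetical fetcher abstraction (in the app, a background thread
    // doing an HTTP request for the thumbnail).
    public interface Fetcher { byte[] fetch(String url); }

    // Return the cached thumbnail, fetching it only on the very first call.
    public static byte[] thumbOf(Score s, Fetcher fetcher) {
        if (s.thumb == null) {
            s.thumb = fetcher.fetch(s.thumbUrl);
        }
        return s.thumb;
    }
}
```

Because the bytes live on the Score object, scrolling back and forth through the ListView never triggers a second download.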

The results of a search query are shown in a ListView. As you can see, one can add search parameters: a keyword or query to search for, as well as the instrument the score should be written for. (About the first screenshot: a ListView in Android doesn't show a scrollbar unless interacted with; the list of piano scores matching the keyword "Bach" is substantially longer than 5.) What remains is creating a link between this Activity and the actual meat of the app, the Activity which plays back the files. This can be done by clicking on an item in the ListView, since each item is connected to a Score object which can be used to fetch the corresponding MIDI file from the API.


Sunday, March 4, 2012

Instrutorials - Part 3 - Starting with the UI, API

Hi all,

After a few busy weeks finally another update!

I've started work on the main interface of the app. You can see a screenshot below, but it's still rather empty:

Just so you know, the keyboard has nothing to do with the application, it's just Swype :)

Another screenshot in landscape:

This beautifully shows how awesome and clean the new Action Bar is. In landscape the tabs are included in the Action Bar, so no space is wasted. However, the implementation is a lot more difficult than Google admits, so it was a bit of a hassle.

One of the reasons is that TabActivity is now deprecated. If one wants to use tabs, the Action Bar is the only valid way to go. Couple that with Fragments, TabListeners and LayoutInflaters, and it's a whole lot more work than a simple TabActivity. At least it looks nice these days.
One of the things I've learned in the last weeks is that Google is a troll. Read the following:

Google: One of the things we learned during the TTUI course at University Hasselt: if you want developers to use specific design choices, make those choices the easy option. Don't make it impossibly difficult. Using gestures to switch between tabs is one of the most difficult UI implementations that exists for Android. For starters: if you want developers to use it that badly, just add it to the SDK, instead of using difficult code in your own apps and not sharing it. Thanks.

Anyway back to the subject:

I also started to work on the API connection. For this I use @pjv_'s library; you can find it on GitHub. Getting it to work had a few complications.

  1. I have some experience with Subversion. Git, however, is a whole new world for me. Getting to know all the commands and when to use them takes time, but I guess I'll get the hang of it by using it.
  2. The API wrapper didn't really include an example. It was used in Collectionista (same developer), which is a huge app already, so it was a bit difficult getting it to work. But after clearing some things up with the developer, I finally queried the API successfully and got some valid results. Yay!
I have no screenshots to show off the API, unless you're interested in pure text-based command-line prints. But being able to see those prints after a few weeks was a big leap forward.

Another busy week ahead so it might be one or two weeks until my next post.
Progress is slow but steady. And that's what counts.


Saturday, February 18, 2012

Instrutorials - Part 2 - A Hard-coded Prototype

Hi all,

Like I said in my previous post, it's now time to provide a bit of information about the current structure and looks of my application.

A first question was how to create the visualization of the instrument. The best way to go, in my opinion, was to use a background image as a picture of the instrument, with an overlay so annotations over the image are possible. By calculating the locations of the objects on the overlay relative to the screen height and width, I can place those objects so their positions are always correct, whatever the screen size.
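Concretely, the idea is to store each overlay position as a fraction of the image size, and only convert it to pixels once the real view size is known. A minimal sketch in plain Java (the class names are mine, not the app's actual classes):

```java
public class RelativeOverlay {

    // An overlay position stored as fractions of the instrument image
    // (0.0 .. 1.0), so it is independent of the device's screen size.
    public static class RelDot {
        final double relX, relY;

        public RelDot(double relX, double relY) {
            this.relX = relX;
            this.relY = relY;
        }

        // Convert to concrete pixel coordinates for a view of the given size.
        public int pixelX(int viewWidth)  { return (int) Math.round(relX * viewWidth); }
        public int pixelY(int viewHeight) { return (int) Math.round(relY * viewHeight); }
    }
}
```

The same dot definition then renders correctly on a 480-pixel-wide phone and a 1280-pixel-wide tablet alike.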

There you go. My first official screenshot. Now what do we see here? (You get points if you shouted out "E"!)
  • Action bar: New in Ice Cream Sandwich is the Action Bar. It contains the title of the application, but can also be used as a menu! This means I can use this to constantly show buttons on the screen (like replay or pause) while not losing too much space. You might ask "Why don't you put those buttons on the right or the left of the recorder?", but imagine the instrument is a piano. This will probably use two or three rows to show the entire keyboard. Conclusion: I need the screen space. Putting everything in the Action Bar makes the application consistent on all instruments.
  • Replay: Yup, my application currently already supports a (very small) number of MIDI files. At the moment the notes from C to Fis are supported.
  • An image of a recorder: I've decided to use the recorder as the starting instrument because of its limited number of possible notes, and because chances are that almost everybody has a basic understanding of this instrument (which means this might be one of the best use cases for my application: everyone knows the basic notes on a recorder, learned in primary or secondary school, but who remembers notes like a Bes or a Fis?).
  • Red dots: These are painted as an overlay on the background image. They show which holes of the recorder must be covered to produce the current note.
I needed this image to be able to explain the structure of my application. Like I said, the notes from C to Fis are currently supported; I first hard-coded these in the source code to be able to test whether everything works. I'll explain it in more detail in a second.

  • InstrutorialsActivity: Currently my app only has one Activity. On creation it stores the width and height of the ImageView so the overlay locations are always correct. It opens streams to the provided MIDI and MP3 files (hard-coded at the moment). It then builds up an Instrument. When the Instrument is finished, it starts the parsing of the MIDI file and plays the MP3 file concurrently, so the user gets the impression that the instrument is performing the score. It contains a handler which handles all incoming messages from the MidiParser and updates the views so the correct notes are displayed.
  • InstrumentBuilder: This class creates an Instrument by parsing an XML file. This XML file contains a definition of the Instrument: for example, it contains the range of the instrument, the size of the overlay dots in relative pixels (a piano example would probably need smaller dots), and, for every possible note the instrument can play, a list of dot locations. This XML parsing makes it possible to provide an endless number of instruments. I can simply create a Piano instrument by declaring a background image of a keyboard, a range of notes the instrument can play, a radius for the dot size, and just a single dot for each note at the correct location. The recorder, on the other hand, has a much smaller range but more dots per note (because one needs to cover many holes to play a C).
  • Instrument: Contains a Hashtable of Notes (each Note has its MIDI note value as key), the range of the instrument, and the radius of the dots.
  • Note: Contains a List of Dots and a function to draw them.
  • Dot: Each dot has an x and y coordinate which is relative to the instrument image and independent of screen width and height.
  • MidiParser: Contains the fancy part of the application. It parses the MIDI file and throws events in real time. When an event is thrown, a corresponding message is sent to the Activity. For example, on a NoteOn event, it will send a message containing the note value, so the Activity can tell the Instrument to display a Note, which will draw all Dots.
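The real-time dispatching in the MidiParser hinges on converting MIDI ticks into wall-clock delays. The conversion itself is standard MIDI arithmetic; here it is as a hedged plain-Java sketch (the class and method names are mine, not android-midi-lib's actual API):

```java
public class MidiTiming {

    // Convert a delta in MIDI ticks to milliseconds, given the file's
    // resolution (pulses per quarter note, PPQ) and the current tempo
    // (microseconds per quarter note; 500000 corresponds to 120 BPM).
    public static long ticksToMillis(long ticks, int ppq, int tempoMicrosPerQuarter) {
        return ticks * tempoMicrosPerQuarter / (ppq * 1000L);
    }
}
```

At 120 BPM and a resolution of 480 PPQ, a quarter note (480 ticks) comes out at 500 ms, which is the delay the parser would sleep before firing the next NoteOn event.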
Current state:
Currently my application contains a MIDI file made with MuseScore, as well as its corresponding MP3 file. On running, an Instrument is created by parsing an XML file defining a recorder. When the Instrument is created, the MIDI file is parsed. It throws events in real time, on which the view is updated. The result is an application which displays an instrument playing a MIDI file. One can easily add new instruments by just providing a background image and an XML file defining the instrument.
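To make the XML-driven approach concrete, here is a minimal sketch of parsing such an instrument definition with the standard DOM parser. Note that the schema shown (instrument/note/dot elements and their attributes) is my own invention for illustration, not the app's actual format:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class InstrumentXml {

    // Parse a (hypothetical) instrument definition and return the number of
    // notes it declares. A full InstrumentBuilder would also read the dot
    // radius, the note range, and the per-note dot coordinates.
    public static int countNotes(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        NodeList notes = doc.getDocumentElement().getElementsByTagName("note");
        return notes.getLength();
    }
}
```

Adding a new instrument then really is just authoring another such XML file plus a background image; no Java code changes are needed.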

That was all for this weekend I guess.



Instrutorials - Part 1 - Getting Started

Hi all,

By request of my promotor, Prof. Dr. Wim Lamotte, I've decided to blog about the progress of my Bachelor's thesis. The goal is to write a mobile application (I've chosen the Android platform) which is tied to the MuseScore API, and which will allow users to learn how to play songs on a specific musical instrument by providing a visualization of the instrument and of how to play the notes.

I've already made quite some progress, so chances are this post is going to be a bit lengthy. Anyway, let's get started.

The first question I needed to ask myself was which platform to develop for. Personally, this choice wasn't hard to make. Blackberry? LOL. iOS? Well, one needs a Mac to start coding, an iPhone, and a license; since I don't have any of those, the choice for Android was obvious: no costs at all, open source, and development possible on all PC platforms. The fact that I own a Samsung Galaxy S II was also a big reason to choose Android; coding on an emulator isn't that fancy.

Secondly, I needed to decide which API level to use. I must admit I lost some sleep over it. The latest APIs provide an enormous amount of new functionality, both in front of and behind the curtains. However, Ice Cream Sandwich isn't widespread at all. But given that my application wouldn't see daylight until the end of the semester, as well as the new functionality of the APIs, I decided to take a plunge into the unknown and develop my app for Ice Cream Sandwich. This way, both tablets and smartphones are well supported, and I can follow Google's new official design guidelines as much as possible. Since fragmentation is one of Android's worst enemies, I chose to be consistent and follow those design guidelines, hopefully ending up with a clean and great app at the finish.

Getting started
The second thing on my mind was obvious. I was supposed to use the MuseScore API to provide my users with a huge database of music scores. The number of file types offered by the API is quite substantial, so I needed to find a file type which would allow me to create a visualization of an instrument based on the score.

 MusicXML: This was an obvious first choice for me. MusicXML provides everything I need in a clean XML format. However, I found the amount of data in the files so substantial that writing my own parser would take a lot of time. And we've got open-source libraries for that, don't we?

The first library I found was JFugue. It provides a MusicXML parser and has a great structure for implementing my own Renderers based on events thrown by the parser. At first sight, this was the perfect solution. But after importing the library into my project, I soon found out that it needs the javax.sound.midi library. And guess what? That library is NOT included in Android's Java SDK. Oh noes! I've tried lots of different things to fix this problem:

  1. Hey, I don't even need MIDI support from that library, I just need MusicXML! So instead of downloading the .jar library, I downloaded the source code and deleted everything which had nothing to do with MusicXML or the structure of the library itself. But even then it still couldn't build, because it made use of "EventListenerList". And guess what, that isn't supported on Android either.
  2. Secondly, I downloaded the Java files of EventListenerList to put them in my own package, so even if Android didn't have them, my application could make use of them. But this broke JFugue even more, and it started failing to parse the XML files correctly. Even including another library (the error message stated I was missing the nu.xom library) didn't fix the problem.
  3. Back to square one.
  4. Another library than JFugue, perhaps? I tried lots of different libraries, but they all had the same problem: dependencies on javax.sound.midi. Rats.
MIDI: After all the MusicXML failures due to missing MIDI libraries, I turned to searching for custom MIDI libraries for Android. Soon my search turned up android-midi-lib. A quick check assured me that it provides everything I need for my project:
This code provides an interface to read, manipulate, and write MIDI files. "Playback" is supported as a real-time event dispatch system. This library does NOT include actual audio playback or device interfacing.
Sounds promising, right? I imported the library into my project and was finally ready to start coding. However, when parsing a quickly made MIDI file created with MuseScore, the library crashed with NullPointerExceptions. Great! The exception occurred in code written by the developer of the library and had nothing to do with my own code, so I sent a mail to the developer. He quickly replied that the MIDI file didn't conform to the MIDI specification, but that I could work around the problem with some dirty fixes. After contacting the MuseScore developers, I learned that the MIDI file DID conform to the specification, but that the library crashed on unknown MIDI events instead of ignoring them. After applying the dirty fixes mentioned by the library's developer, I was finally ready to start coding my application.

In my next blog post I'll talk about the current structure of my app, and show some screenshots.

Until next time,


Tuesday, February 7, 2012

How to start developing Android apps on Fedora 16 (Linux)

Hello folks!

Damn, it's been a while since my last post. And after reading it again, I realised I didn't finish any of the things I said I would. Aaah, procrastination. One does not simply make lists of what one wants done and then actually do those things.

Anyway, back to the topic at hand, the one you're probably here for: getting Android development working on Linux distributions isn't as easy as it is on Windows or OS X. But I can give you a step-by-step tutorial on how to get started. Let's begin, shall we?
For the record, I have done all this on a fresh Fedora 16 (64-bit, GNOME 3.2) installation.

Downloading some thangs
And here it is, the first and foremost advantage of development on Linux: apt-get and yum!
We need a few things to get started:
  1. Eclipse (+ any plugins you want if you want to use it for more than Android development.)
    It's a great IDE for lots of languages, even has LaTeX support and more!
  2. Get Java. Chances are you'll need OpenJDK instead of Oracle's. I used the stock Java packages from Fedora's repositories and they do the trick.
  3. Download the Android SDK.
    Find it on Google, bro.
  4. Since my installation is 64-bit, and the Android thingies aren't, you need some 32bit stuff:
    sudo yum install glibc.i686 glibc-devel.i686 libstdc++.i686 zlib-devel.i686 ncurses-devel.i686 libX11-devel.i686
    If you have a Debian-based Linux distribution, just use apt-get instead of yum.
Start installing like a boss
So, you've got the needed tools. Now it's just a question of getting them to play along:
  1. Extract the Android SDK and put it some place where you got permissions.
    (I put it in my home directory)
  2. Install the SDK. You need to do this by running: android-sdk-linux/tools/android
    It will ask you which API levels you want; just make sure to get at least 2.3.3. With that version you currently (as of February 2nd, 2012) support about 60% of the Android population.
  3. Start Eclipse, go to Help->Install New Software
    Just to be sure, press the Available Software Sites and make sure that the latest release of Eclipse is checked. At this time, it's
    Add a new source, with the ADT plugin update site as its location.
    Select it, and install the Development Tools it offers.
  4. It will also ask where your Android SDK is located. Just point it to the android-sdk-linux folder.
Done. Start coding.

Extra for being really pro
So, you've dabbled a bit in Android development, but the emulator is rather slow. Fear not, let's start debugging and running your apps on your own phone instead of that emulator!
  1. Make sure your application is flagged as debuggable. This is done in your manifest file.
    Add android:debuggable="true" to the <application> element.
  2. Set your device to allow the installation of Non-Market applications. (Unknown Sources)
  3. Set your device to enable USB Debugging.
  4. Set up your system to enable the debugging. (On Windows, you would need drivers, LOL.)
    Log in as root and create the file /etc/udev/rules.d/51-android.rules.
    Fill it with the following:
    SUBSYSTEM=="usb", ATTR{idVendor}=="04E8", MODE="0666", GROUP="plugdev"
  5. Notice the "04E8". I used this specific string because it stands for a Samsung Device.
    I have a Galaxy S2 so that would obviously be the right thing to do.
    However, chances are you got a phone of a different brand. Consult the following table:

    Company          USB Vendor ID
    Fujitsu Toshiba  04C5
    KT Tech          2116
    SK Telesys       1F53
    Sony Ericsson    0FCE
  6. Finish with a chmod:
    chmod a+r /etc/udev/rules.d/51-android.rules
Have Fun.
That's all there is to it. Just create a new application using the wizard now installed in Eclipse. You'll get a Hello World app which is perfect for testing whether your IDE is working.

And now, my Padawan, start coding some awesome Android apps.



Inspired by and got some help from: