Sunday, May 27, 2012

Instrutorials: Now on Google Play

Hi all,

Just a quick update: Instrutorials has been released on Google Play.
You can download it at

Android app on Google Play

The rest of this post is some recent screenshots:

Visualization of a piano score.

Showing search results.

Showing detailed information about a score.

Downloads integrated in Android OS.

Use Instrutorials to open and visualize locally stored MIDI files.

Set the playback speed of the visualization (however only 100% has audio).

Two-pane layout on a tablet.

Android app on Google Play

See you later,


Monday, April 23, 2012

Torus 1.02 Beta

Hi all,

For those who don't remember (or never played/downloaded Torus: Tower Rush), here's a screenshot:

I've recently noticed that my small Java game called Torus: Tower Rush was no longer available for download.
The reason for this was the closure of Megaupload, since the file was hosted there.

I didn't have version 1.01 Beta lying around, so I cleaned up the work-in-progress I had and reuploaded the game.

The result has no new features, but the code now provides a framework for recording games and playing those recordings back. This way you could record awesome games and share them with your friends.

The function isn't implemented yet, but expect it in version 1.02 Beta (whenever it comes).

Download: DROPBOX



Sunday, April 22, 2012

Instrutorials - Part 6 - Piano and more

Hi all,

It's been a while since my last post; I've had a rather busy schedule over the past few weeks, but now I'm here to show some more progress on the implementation of my project.

I'll do this by starting with a video I recorded for a presentation last week, so you can see my application in action:

Firstly, the audio and video aren't really 100% synced. This is because the screen recorder could not record audio; I added the audio manually afterwards (well, I actually let a friend do it, since I'm a newbie at video editing).
You can see the user searching for scores containing "Pokemon". Once a score is selected, it is played back to the user in real time, with both audio and video. The user then searches for a recorder score, and that score is played back as well. It's the basics of the application, and it works. Yay!
Adding piano support to the app was very easy thanks to my current structure. I just needed to add an XML file, an image, and a few lines of code. If someone provides me with an XML file and an image for a different instrument, I can add it in less than a minute.

I have also gone back to MIDI for both my audio and visual feedback. The cause of all the MIDI bugs turned out to be file extensions. Android's MediaPlayer class can play most file types by just reading the header, but that fails for MIDI files. Since I first wrote my temporary files to the internal file system as .dat files, the MediaPlayer failed. Now that I'm back to using MIDI, loading times have dropped dramatically: MIDI files are so much smaller than MP3s, and I don't need those anymore!
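The fix, in short: write the temporary file with a .mid extension before handing it to MediaPlayer. A minimal sketch of that idea; the helper name and cache location are my own illustration, not my actual code:

```java
import android.content.Context;
import android.media.MediaPlayer;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class MidiPlaybackHelper {
    // Writes the downloaded bytes to a file with a .mid extension;
    // MediaPlayer seems to rely on the extension for MIDI files,
    // so a .dat name makes it fail.
    public static MediaPlayer playMidi(Context context, byte[] midiBytes)
            throws IOException {
        File temp = new File(context.getCacheDir(), "score.mid"); // not .dat!
        FileOutputStream out = new FileOutputStream(temp);
        out.write(midiBytes);
        out.close();

        MediaPlayer player = new MediaPlayer();
        player.setDataSource(temp.getAbsolutePath());
        player.prepare();
        player.start();
        return player;
    }
}
```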

Anyway, since the video I've added some things.
Firstly, I changed the search interface. I now assume that the user knows what score he's looking for, so I removed the browsing part of the GUI (it wasn't implemented anyway). The removal of those tabs gives me some more vertical space, with a nice result:


Next, I've also added an overlay when the app is searching for scores:

This solution is clean and makes it clear to the user that the application is working.

I've also added another screen between the score list and the actual playback of the score. When a score is selected, the user now sees a screen like the one below, with more details about the score.

Now the user has the option to play the selected score, or download some files. Of course, the "Play score" option loads the activity as shown in the video. The download button allows the user to choose a file type:

If the user selects one of these five options, the application downloads the corresponding file using Android's DownloadManager. This means the files are stored in the default download folder chosen by the user. It also shows a notification and lets the user open the file directly once it's downloaded. Using the DownloadManager also lets the user keep using Instrutorials while the files are downloading, since that happens in the background. If one pulls down the notification bar while downloading, he/she sees this:
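For reference, the DownloadManager call is roughly this (a sketch; the URL and title parameters are placeholders, not the actual MuseScore endpoints):

```java
import android.app.DownloadManager;
import android.content.Context;
import android.net.Uri;
import android.os.Environment;

public class ScoreDownloader {
    // Hands the download off to Android's DownloadManager, which runs it in
    // the background, shows a notification, and stores the file in the
    // public Downloads folder.
    public static long download(Context context, String fileUrl, String fileName) {
        DownloadManager.Request request =
                new DownloadManager.Request(Uri.parse(fileUrl));
        request.setTitle(fileName);
        request.setDestinationInExternalPublicDir(
                Environment.DIRECTORY_DOWNLOADS, fileName);
        request.setNotificationVisibility(
                DownloadManager.Request.VISIBILITY_VISIBLE_NOTIFY_COMPLETED);

        DownloadManager manager =
                (DownloadManager) context.getSystemService(Context.DOWNLOAD_SERVICE);
        return manager.enqueue(request); // returns an id to track the download
    }
}
```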

The downloaded MIDI and MP3 files can be played back by the standard Media Player, while the user can use Adobe Reader to check out the PDF file.

All this functionality of the "Details" window, as I call it, doesn't make the core of my application (the actual playback) more potent, but I personally feel it makes the application much more complete. Even a person who doesn't need the playback functionality can still use the app just to search for and download scores on his/her tablet and/or smartphone. If I didn't know better, I'd call it the official MuseScore app :P

To close this blog post, a quick screenshot of the playback Activity:

As you can see, I added functionality to the Activity to pause and resume playing. The "NEXT" button on the ActionBar is supposed to make the application play just the next note and then pause, so by pressing next one could step through the score note by note. However, the library I'm using for MIDI files on Android isn't really helping much. Between calling the pause function on the library and the actual pausing there is a gap of at least 90 milliseconds. That's acceptable for manual pause and play, but for automatically pausing after every note it's unworkable: by the time the library actually pauses on a note, a few more notes have already played. I'll try to find a solution, but I can't guarantee one, since I'm really limited by the library in that respect.

Anyway, that was a quick peek into the latest updates of my implementation.

See you next time,


Sunday, April 1, 2012

Instrutorials - Part 5 - Working prototype for Smartphones and Tablets

Hi all,

Last week I finally got to the point where I have a functional prototype of my application. The code definitely needs some clean-up before going public, but hey, it works. One can browse the API for scores written for piano as well as for recorder. Scores are then played back with a visualization of the instrument (currently only for the recorder).

Since my last blog post, I bumped into some issues that stalled my progress. I'm going to talk about those below.

Android MediaPlayer
Sigh. I already had a hard-coded prototype of the visualization working in my second post about this topic. It used a MIDI file included in the application's resources. However, when trying to connect this Activity to the search and browse Activity from my previous post, the MediaPlayer class kept failing. I could not, in any way, get it to play a MIDI file. I tried creating a MediaPlayer from the URI, and downloading the file to a temporary location and loading it from there, but every time the MediaPlayer failed with the same error: "Unable to locate the file". When I used a URI of an MP3 file, everything worked fine. Because of these ridiculous errors, I decided to use both MP3 and MIDI files again: MP3 for audio, MIDI for the visualization. My own code could use a locally stored MIDI file, but Android's MediaPlayer class could not.

Visualization on Tablets
For development and testing, I always used my Samsung Galaxy S II with CyanogenMod 9. However, my application should also run perfectly on a tablet, since it's using the Ice Cream Sandwich SDK. When I demoed my current progress to my promotor (thesis supervisor), we noticed that the placement of the red dots was quite off. I found this strange, because I already place the dots based on percentages, not on actual pixels. After some digging around, the cause was rather obvious. Android only allows three possible definitions for the width or height of an ImageView: "MATCH_PARENT", "WRAP_CONTENT", or a fixed size. I had defined the height as "MATCH_PARENT" (filling the whole screen in this case) and the width as "WRAP_CONTENT", thinking it would scale nicely. However, this was not the case: the width of the resulting ImageView was the width of the original image, independent of its scaled height.

To solve this, I defined my own ImageView subclass, AspectRatioImageView, and overrode the onMeasure(int, int) method to let it calculate its own height and then use that height to calculate the corresponding width. In code, it looks like this:

@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
    if (getBackground() != null) {
        // Take the height we are given, then derive the width from the
        // background image's intrinsic aspect ratio.
        int height = MeasureSpec.getSize(heightMeasureSpec);
        int width = height * getBackground().getIntrinsicWidth()
                / getBackground().getIntrinsicHeight();
        setMeasuredDimension(width, height);
    } else {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
    }
}

No syntax highlighting here, but I guess you'll get the point.

Next up, I'll finally implement the piano visualization, clean up my code (that can take some time :p), and add some more functionality.


Saturday, March 17, 2012

Instrutorials - Part 4 - API calls and UI problems

Hi all,

Once again an update on my Bachelor's Thesis. I tried to finish the search implementation, consisting of both the API calls and a decent interface.


Like I stated in my previous post, I'm using @pjv_'s library for managing the API calls to MuseScore. I already had a good idea about how I'd build the interface, so I first started getting those API calls to work.

@pjv_'s library was a bit difficult to get started with. It uses a MuseScore object which contains everything JSON-related, but that object has a huge constructor and not much documentation (since it was originally developed just for personal use). Therefore I downloaded and checked the source code of Collectionista, the app for which this library was created. After digging through numerous source files I finally found what I was looking for, and based my code for initiating an API call on code I found in Collectionista.

So now I had a way to contact the API. I decided to implement the search function before browsing, since search is basically sending a customized query to the API and parsing the server's reply. Since the library contains all resulting data in Score objects, it was quite easy to get the results of a query - meaning the title, composer, number of pages of a score, etc. However, all feedback was textual.

I wanted to give the user a quick peek at the scores. As they say, a picture is worth a thousand words. The MuseScore API provides static links to small generated thumbnail images of the scores, but this functionality is not supported in @pjv_'s library. Therefore, I extended the library so that Score objects can also contain a bitmap, meaning the thumbnail can be stored in the Score object. The developer is responsible for filling this field, since the image is requested in a second query. Also, it should still be possible to use the library without thumbnail images: if fetching these images were part of a normal query, the application would use a lot more data, even when it doesn't need the images.

Each score contained in the server's response is shown in the application, along with its title, composer, number of pages, and a thumbnail. These thumbnails are loaded in the background, so at first sight they might show blank, but once loaded they appear instantly. This thumbnail loading is another reason I decided to store the images in the Score objects. Android recycles every item view in a ListView when it is not visible, so images would be discarded on scrolling while others had to be loaded. Upon scrolling back to the top, the bottom ones would be discarded and the first ones downloaded again. This created lag because of the constant background threads fetching those thumbnails. By storing the images in the Score objects, each image only needs to be fetched once: after fetching, it lives in the Score object, so while scrolling the images can be shown instantly without spawning yet more threads to fetch them again.
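The "fetch once, keep it in the Score" idea can be shown in plain Java (a sketch: Bitmap is replaced by a plain byte array, and ThumbFetcher is my own stand-in for the background download, not part of @pjv_'s library):

```java
// Stand-in for the network call that fetches a thumbnail.
interface ThumbFetcher {
    byte[] fetch(String scoreId);
}

class Score {
    private final String id;
    private byte[] thumb; // cached after the first fetch

    Score(String id) {
        this.id = id;
    }

    // Fetches the thumbnail at most once; later calls return the cached
    // copy, so scrolling back to a row never triggers a new download.
    byte[] getThumb(ThumbFetcher fetcher) {
        if (thumb == null) {
            thumb = fetcher.fetch(id);
        }
        return thumb;
    }
}
```

In the real app the fetch runs on a background thread; the list adapter only checks whether the Score already carries its thumbnail.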

The results of a search query are shown in a ListView. As you can see, one can add search parameters: a keyword or query to search for, as well as the instrument the score should be written for. (About the first screenshot: a ListView in Android doesn't show a scrollbar unless interacted with - the list of piano scores matching the keyword Bach is substantially longer than 5.) What remains is creating a link between this Activity and the actual meat of the app (the Activity which plays back the files). This can be done by clicking on an item in the ListView, since each item is connected to a Score object which can be used to fetch the corresponding MIDI file from the API.
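That link is just a click listener on the ListView. A sketch of the idea; PlaybackActivity, the "midi_url" extra, and getMidiUrl() are hypothetical names for illustration:

```java
import android.content.Intent;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ListView;

// In the search Activity, once the adapter is set up.
private void connectListToPlayback(ListView listView) {
    listView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
        @Override
        public void onItemClick(AdapterView<?> parent, View view,
                                int position, long id) {
            // Each row is backed by a Score object, so we can hand its
            // MIDI location straight to the playback Activity.
            Score score = (Score) parent.getItemAtPosition(position);
            Intent intent = new Intent(view.getContext(), PlaybackActivity.class);
            intent.putExtra("midi_url", score.getMidiUrl());
            view.getContext().startActivity(intent);
        }
    });
}
```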


Sunday, March 4, 2012

Instrutorials - Part 3 - Starting with the UI, API

Hi all,

After a few busy weeks finally another update!

I've started work on the main interface of the app. You can see a screenshot below, but it's still rather empty:

Just so you know, the keyboard has nothing to do with the application, it's just Swype :)

Another screenshot in landscape:

This beautifully shows how awesome and clean the new Action Bar is. In landscape, the tabs are included in the Action Bar so no space is wasted. However, the implementation is a lot more difficult than Google admits, so it was a bit of a hassle.

One of the reasons is that the old tab layout is now deprecated. If one wants to use tabs, the Action Bar is the only valid way to go. Couple that with Fragments, TabListeners and LayoutInflaters, and it's a whole lot more work than a simple tab layout used to be. At least it looks nice these days.
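To give you an idea of the extra work, the tab boilerplate looks roughly like this (a sketch of the ICS ActionBar API; SearchFragment is a placeholder class):

```java
import android.app.ActionBar;
import android.app.Activity;
import android.app.Fragment;
import android.app.FragmentTransaction;
import android.os.Bundle;

public class TabbedActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ActionBar bar = getActionBar();
        bar.setNavigationMode(ActionBar.NAVIGATION_MODE_TABS);

        // One listener per tab; it has to swap the Fragment in and out
        // itself, which is the work the old tab widgets hid from you.
        final Fragment searchFragment = new SearchFragment(); // placeholder
        bar.addTab(bar.newTab().setText("Search")
                .setTabListener(new ActionBar.TabListener() {
                    @Override
                    public void onTabSelected(ActionBar.Tab tab, FragmentTransaction ft) {
                        ft.add(android.R.id.content, searchFragment);
                    }
                    @Override
                    public void onTabUnselected(ActionBar.Tab tab, FragmentTransaction ft) {
                        ft.remove(searchFragment);
                    }
                    @Override
                    public void onTabReselected(ActionBar.Tab tab, FragmentTransaction ft) {
                        // nothing to do
                    }
                }));
    }
}
```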
One of the things I've learned in the last weeks is that Google is a troll. Read the following:

Google: one of the things we learned during the TTUI course at Hasselt University is that if you want developers to use specific design choices, you should make those choices the easy option, not impossibly difficult. Using gestures to switch between tabs is one of the most difficult UI implementations that exists for Android. For starters: if you want developers to use it that badly, just add it to the SDK, instead of using difficult code in your own apps and not sharing it. Thanks.

Anyway back to the subject:

I also started to work on the API connection. For this I use @pjv_'s library, which you can find on GitHub.
This had a few complications.

  1. I have some experience with Subversion. Git, however, is a whole new world for me: getting to know all the commands, and when to use which... I guess I'll get the hang of it by using it.
  2. The API wrapper didn't really include an example. It was used in Collectionista (by the same developer), which is a huge app already, so it was a bit difficult getting it to work. But after clearing some things up with the developer, I finally queried the API successfully and got some valid results. Yay!
I have no screenshots to show off about the API, unless you're interested in pure text-based command-line prints. But, being able to see those prints after a few weeks was a big leap forward.

Another busy week ahead so it might be one or two weeks until my next post.
Progress is slow but steady. And that's what counts.


Saturday, February 18, 2012

Instrutorials - Part 2 - A Hard-coded Prototype

Hi all,

Like I said in my previous post, it's now time to provide a bit of information about the current structure and looks of my application.

A first question was how to create the visualization of the instrument. The best way to go, in my opinion, was to use a background image as the picture of the instrument, with an overlay so annotations over the image are possible. By calculating the location of the objects on the overlay relative to the screen width and height, I can place those objects so their position is always correct, whatever the screen size.
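The relative placement itself is just a little arithmetic (a self-contained sketch; the class name is my own):

```java
// A dot position stored as fractions of the instrument image (0.0 to 1.0),
// converted to absolute pixels for whatever view size we end up with.
class RelativeDot {
    final double relX, relY;

    RelativeDot(double relX, double relY) {
        this.relX = relX;
        this.relY = relY;
    }

    // The same relative coordinates land on the same spot of the image,
    // regardless of screen resolution.
    int pixelX(int viewWidth)  { return (int) Math.round(relX * viewWidth); }
    int pixelY(int viewHeight) { return (int) Math.round(relY * viewHeight); }
}
```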

There you go. My first official screenshot. Now what do we see here? (You get points if you shouted out "E"!)
  • Action bar: New in Ice Cream Sandwich is the Action Bar. It contains the title of the application, but can also be used as a menu! This means I can use this to constantly show buttons on the screen (like replay or pause) while not losing too much space. You might ask "Why don't you put those buttons on the right or the left of the recorder?", but imagine the instrument is a piano. This will probably use two or three rows to show the entire keyboard. Conclusion: I need the screen space. Putting everything in the Action Bar makes the application consistent on all instruments.
  • Replay: Yup, my application currently already supports a (albeit very small) number of MIDI files. Currently the notes from C to F♯ are supported.
  • An image of a recorder: I've decided to use the recorder as my starting instrument because of its limited number of possible notes, and because chances are that almost everybody has a basic understanding of this instrument. That also makes it one of the best use cases for my application: everyone knows the basic notes on a recorder (learned in primary or secondary school), but who remembers notes like a B♭ or an F♯?
  • Red dots: These are painted as an overlay on the background image. They show which holes of the recorder must be covered to produce the current note.
I needed this image to be able to explain the structure of my application. Like I said, currently the notes from C to F♯ are supported; I first hard-coded these in the source code to be able to test whether it works. I'll explain more in a second.

  • InstrutorialsActivity: Currently my app has only one Activity. On creation it stores the width and height of the ImageView so the overlay locations are always correct. It opens streams to the provided MIDI and MP3 files (hard-coded a.t.m.). It then builds an Instrument. When the Instrument is finished, it starts parsing the MIDI file and plays the MP3 file concurrently, so the user gets the impression that the instrument is performing the score. It contains a handler which handles all incoming messages from the MidiParser and updates the views so the correct notes are displayed.
  • InstrumentBuilder: This class creates an Instrument by parsing an XML file that defines the Instrument. For example, it contains the range of the instrument, the size of the overlay dots in relative pixels (a piano would probably need smaller dots), and, for every possible note the instrument can play, a list of dot locations. This XML parsing makes it possible to provide an endless number of instruments. I can simply create a Piano instrument by declaring a background image of a keyboard, a range of notes the instrument can play, a radius for the dot size, and a single dot for each note at the correct location. The recorder, on the other hand, has a much smaller range but more dots per note (because one needs to cover many holes to play a C).
  • Instrument: Contains a Hashtable of Notes (each Note has its MIDI note value as key), the range of the instrument, and the radius of the dots.
  • Note: Contains a List of Dots and a function to draw them.
  • Dot: Each dot has an x and y coordinate which is relative to the instrument image and independent of screen width and height.
  • MidiParser: Contains the fancy part of the application. It parses the MIDI file and throws events in real time. When an event is thrown, a corresponding message is sent to the Activity. For example, on a NoteOn event, it will send a message containing the note value, so the Activity can tell the Instrument to display a Note, which will draw all Dots.
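To make the InstrumentBuilder part concrete, an instrument definition could look something like this (a made-up sketch of such an XML format, not my actual file; element names, note values and coordinates are invented for illustration):

```xml
<instrument name="recorder">
    <range lowest="60" highest="78"/>   <!-- MIDI note values -->
    <dotRadius relative="0.02"/>        <!-- relative to the image size -->
    <note midi="60">                    <!-- C: cover several holes -->
        <dot x="0.50" y="0.10"/>
        <dot x="0.50" y="0.20"/>
        <dot x="0.50" y="0.30"/>
    </note>
    <note midi="62">                    <!-- D: one hole fewer -->
        <dot x="0.50" y="0.20"/>
        <dot x="0.50" y="0.30"/>
    </note>
</instrument>
```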
Current state:
Currently my application contains a MIDI file made with MuseScore, as well as its corresponding MP3 file. On running, an Instrument is created by parsing an XML file defining a recorder. Once the Instrument is created, the MIDI file is parsed; it throws events in real time, on which the view is updated. The result is an application which displays an instrument playing a MIDI file. One can easily add new instruments by providing just a background image and an XML file defining the instrument.

That was all for this weekend I guess.