As I said in my previous post, it's now time to provide some information about the current structure and look of my application.
The first question was how to visualize the instrument. The best way to go, in my opinion, was to use a background image as a picture of the instrument, with an overlay on top so annotations over the image become possible. By calculating the location of the objects on the overlay relative to the screen height and width, I can place those objects so their position is always correct, whatever the screen size.
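The idea of resolution-independent placement can be sketched as follows. This is a minimal sketch with hypothetical names, not the app's actual code: every overlay coordinate is stored as a fraction of the image size and multiplied by the current view dimensions at draw time.

```java
// Sketch of relative overlay positioning (hypothetical names, not the actual source).
// Coordinates are stored as fractions (0.0 .. 1.0) of the instrument image,
// so the same instrument definition works on any screen size.
public class OverlayMath {

    // Convert a relative coordinate to an absolute pixel position
    // for the current size of the instrument ImageView.
    public static int toPixels(double relative, int viewSizePx) {
        return (int) Math.round(relative * viewSizePx);
    }

    public static void main(String[] args) {
        // A dot at 25% of the width lands at a different pixel position
        // depending on the actual view width, but visually the same spot.
        System.out.println(toPixels(0.25, 480));  // small phone: prints 120
        System.out.println(toPixels(0.25, 1280)); // tablet: prints 320
    }
}
```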
There you go. My first official screenshot. Now what do we see here? (You get points if you shouted out "E"!)
- Action bar: New in Ice Cream Sandwich is the Action Bar. It contains the title of the application, but it can also be used as a menu! This means I can use it to constantly show buttons on the screen (like replay or pause) without losing too much space. You might ask, "Why don't you put those buttons to the right or left of the recorder?", but imagine the instrument is a piano: it will probably need two or three rows to show the entire keyboard. Conclusion: I need the screen space. Putting everything in the Action Bar keeps the application consistent across all instruments.
- Replay: Yup, my application already supports a (very small) number of MIDI files. Currently the notes from C to Fis (F♯) are supported.
- An image of a recorder: I've decided to use a recorder as the starting instrument because of its limited number of possible notes, and because almost everybody has a basic understanding of this instrument, which makes it one of the best use cases for my application. Everyone knows the basic notes on a recorder (learned in primary or secondary school), but who remembers notes like a Bes (B♭) or a Fis (F♯)?
- Red dots: These are painted as an overlay on the background image. They show which holes of the recorder must be covered to produce the current note.
I needed this image to be able to explain the structure of my application. As I said, currently the notes from C to Fis are supported, because I first hard-coded these in the source code to test whether everything works. I'll explain this in more detail in a second.
- InstrutorialsActivity: Currently my app has only one Activity. On creation, it stores the width and height of the ImageView so the overlay locations are always correct, and it opens streams to the provided MIDI and MP3 files (hard-coded at the moment). It then builds up an Instrument. When the Instrument is finished, it starts parsing the MIDI file while playing the MP3 file concurrently, so the user gets the impression that the instrument is performing the score. It also contains a handler which handles all incoming messages from the MidiParser and updates the views so the correct notes are displayed.
- InstrumentBuilder: This class creates an Instrument by parsing an XML file containing a definition of the Instrument. For example, it contains the range of the instrument, the size of the overlay dots in relative pixels (a piano would probably need smaller dots), and, for every possible note the instrument can play, a list of dot locations. This XML parsing makes it possible to provide an endless number of instruments. I can simply create a Piano instrument by declaring a background image of a keyboard, a range of notes the instrument can play, a radius for the dot size, and just a single dot at the correct location for each note. The recorder, on the other hand, has a much smaller range, but more dots per note (because one needs to cover many holes to play a C).
- Instrument: Contains a Hashtable of Notes (keyed by their MIDI note values), the range of the instrument, and the radius of the dots.
- Note: Contains a List of Dots and a function to draw them.
- Dot: Each dot has an x and y coordinate which is relative to the instrument image and independent of screen width and height.
- MidiParser: Contains the fancy part of the application. It parses the MIDI file and throws events in real time. When an event is thrown, a corresponding message is sent to the Activity. For example, on a NoteOn event, it sends a message containing the note value, so the Activity can tell the Instrument to display a Note, which will draw all of its Dots.
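The Instrument/Note/Dot structure above could be sketched roughly like this. This is hypothetical, simplified code; any field or class member beyond what the post mentions is my own assumption.

```java
import java.util.ArrayList;
import java.util.Hashtable;
import java.util.List;

// Simplified sketch of the data model described above
// (hypothetical code, not the actual source).
public class InstrumentModel {

    // A dot location relative to the instrument image (0.0 .. 1.0),
    // independent of the screen width and height.
    public static class Dot {
        public final double x, y;
        public Dot(double x, double y) { this.x = x; this.y = y; }
    }

    // A playable note: the list of dots that must be drawn for it.
    public static class Note {
        public final List<Dot> dots = new ArrayList<>();
    }

    // An instrument: Notes keyed by their MIDI note value,
    // plus the range and the dot radius from the XML definition.
    public static class Instrument {
        public final Hashtable<Integer, Note> notes = new Hashtable<>();
        public final int lowestMidiNote, highestMidiNote;
        public final double dotRadius;
        public Instrument(int low, int high, double radius) {
            lowestMidiNote = low;
            highestMidiNote = high;
            dotRadius = radius;
        }
    }

    public static void main(String[] args) {
        // A recorder note needs several covered holes, so several dots;
        // a piano key would need just one. (MIDI 72 = C5, assumed here.)
        Instrument recorder = new Instrument(72, 78, 0.02);
        Note c = new Note();
        c.dots.add(new Dot(0.5, 0.30));
        c.dots.add(new Dot(0.5, 0.40));
        recorder.notes.put(72, c);
        System.out.println(recorder.notes.get(72).dots.size()); // prints 2
    }
}
```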
Currently my application contains a MIDI file made with MuseScore, as well as its corresponding MP3 file. On running, an Instrument is created by parsing an XML file defining a recorder. When the Instrument is created, the MIDI file is parsed. It throws events in real time, on which the view is updated. The result is an application which displays an instrument playing a MIDI file. One can easily add new instruments by providing just a background image and an XML file defining the instrument.
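The real-time event dispatch could look roughly like this. This is a simplified, hypothetical sketch: real MIDI parsing involves variable-length delta times and tempo meta events, which are omitted here, and all names are my own.

```java
import java.util.List;

// Simplified sketch of real-time MIDI event dispatch
// (hypothetical names; not the actual MidiParser).
public class RealtimeDispatch {

    // A parsed event: when to fire (ms from start), whether it is a
    // NoteOn or NoteOff, and which note it concerns.
    public static class MidiEvent {
        public final long timestampMs;
        public final boolean noteOn;
        public final int noteValue;
        public MidiEvent(long timestampMs, boolean noteOn, int noteValue) {
            this.timestampMs = timestampMs;
            this.noteOn = noteOn;
            this.noteValue = noteValue;
        }
    }

    // Callback the Activity would register to update the overlay views.
    public interface NoteListener {
        void onEvent(MidiEvent e);
    }

    // Walk the event list in order, sleeping until each event is due,
    // then notify the listener (which would show or hide the dots).
    public static void play(List<MidiEvent> events, NoteListener listener) {
        long start = System.currentTimeMillis();
        for (MidiEvent e : events) {
            long wait = e.timestampMs - (System.currentTimeMillis() - start);
            if (wait > 0) {
                try {
                    Thread.sleep(wait);
                } catch (InterruptedException ex) {
                    return; // playback cancelled
                }
            }
            listener.onEvent(e);
        }
    }

    public static void main(String[] args) {
        List<MidiEvent> score = List.of(
                new MidiEvent(0, true, 72),    // NoteOn C
                new MidiEvent(200, false, 72)  // NoteOff C, 200 ms later
        );
        play(score, e -> System.out.println(
                (e.noteOn ? "show" : "hide") + " dots for note " + e.noteValue));
    }
}
```

In the real app the listener side would post a message to the Activity's handler instead of printing, since only the UI thread may touch the views.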
That was all for this weekend I guess.