Lomse library. API documentation
0.30.0
Lomse is platform-independent code and cannot generate sounds for your specific platform. Therefore, Lomse implements playback by generating real-time events and sending them to your application. It is the responsibility of your application to handle these events and do whatever is needed, i.e. transform sound events into real sounds, or do anything else with them.
From the Lomse internal point of view, playing back a score basically involves two steps: generating sound events for producing the sounds, and generating visual tracking events for displaying visual effects on the score as playback advances.
It is the responsibility of your application to handle these events and do whatever is necessary. But for visual effects, Lomse provides an implementation for generating them so, if desired, your application can delegate this task to Lomse.
Alternatively, your application could implement playback by other mechanisms (e.g., using an external player) and provide visual feedback by synchronizing the performance with the displayed score, using the Lomse methods provided for this. See Using an external player.
To play back a score, your application has to:

1. Define a class derived from MidiServerBase. This class will receive the sound events and will have the responsibility for generating the sounds. See How to handle sound events.

2. Create an instance of class ScorePlayer. This class takes care of most of the work to do. By using it, playing a score is just two tasks:

   a. Load the score to play in the ScorePlayer instance.

   b. Ask the ScorePlayer to play it, specifying the desired options (e.g. visual tracking, metronome settings, count-off, etc.). See How to play a score.
As you can see, implementing score playback in an application is not complex, and the only burden for your application is coding a MidiServer class for generating the sounds.
To play back a score, your application just has to define a class derived from MidiServerBase. This class defines the interface for processing sound events. Your application has to define a derived class (e.g. MyMidiServer) and implement its virtual methods (e.g. note_on() and note_off()):
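As an illustration, here is a minimal sketch of such a class. It assumes that note_on() and note_off() have the (channel, pitch, volume) signatures used in the Lomse tutorials, and MySynthesizer is a hypothetical wrapper around your platform's MIDI API; check the MidiServerBase declaration in your Lomse version for the full list of virtual methods (e.g. program_change(), all_sounds_off()):

    #include <lomse_score_player.h>     // declares MidiServerBase (check your Lomse setup)
    using namespace lomse;

    // hypothetical wrapper around your platform's MIDI API
    class MySynthesizer
    {
    public:
        void note_on(int channel, int pitch, int volume);
        void note_off(int channel, int pitch, int volume);
    };

    class MyMidiServer : public MidiServerBase
    {
    public:
        MyMidiServer(MySynthesizer* synth) : m_synth(synth) {}

        // invoked by Lomse, in real time, when a note must start sounding
        void note_on(int channel, int pitch, int volume) override
        {
            m_synth->note_on(channel, pitch, volume);
        }

        // invoked by Lomse, in real time, when a note must stop sounding
        void note_off(int channel, int pitch, int volume) override
        {
            m_synth->note_off(channel, pitch, volume);
        }

    protected:
        MySynthesizer* m_synth;
    };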
Sound events will be sent directly to your MyMidiServer class just by invoking any of the virtual methods. Invocation of these methods is done in real time; that is, Lomse determines the exact time at which a note on / note off has to take place and invokes the respective method, note_on() or note_off(), at the appropriate time. This implies that your MIDI server implementation is only responsible for generating or stopping sounds when requested, and no time computations are needed.
Lomse does not impose any restriction on how to generate the sounds, other than low latency. Perhaps the simplest method to generate sounds is to rely on the MIDI synthesizer of the PC sound card.
Class ScorePlayer provides the necessary methods for controlling playback (start, stop, pause, etc.). Your application has to request the ScorePlayer instance from Lomse by invoking the LomseDoorway::create_score_player() method and passing the MidiServer to use:
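A minimal sketch, assuming m_lomse is your LomseDoorway instance, MyAppMidiServer is your MidiServerBase-derived class, and m_pMidi / m_pPlayer are illustrative member variables:

    // create the sound server and ask Lomse for a ScorePlayer that will use it
    m_pMidi = new MyAppMidiServer();
    m_pPlayer = m_lomse.create_score_player(m_pMidi);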
This needs to be done only once, provided that your application saves the ScorePlayer instance (e.g. in a member or global variable) and ensures the appropriate lifetime for your MyAppMidiServer object.
Once you have the ScorePlayer instance, playback is just a matter of loading the score to play (by invoking the ScorePlayer::load_score() method) and invoking the appropriate methods, such as ScorePlayer::play(), ScorePlayer::stop() or ScorePlayer::pause().
Method ScorePlayer::load_score() requires the score to play. How to get it depends on your application, but a simple way of doing it is to use the Document methods for traversing the document and accessing its components. For instance:
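A sketch under these assumptions: m_pDoc is the Document being displayed, the score is its first content item, m_pPlayer and pInteractor are your ScorePlayer and Interactor, and the play() arguments (k_do_visual_tracking, default tempo, interactor) follow the Lomse tutorials and may need adjusting to your version:

    // locate the first score in the document being displayed
    ImoDocument* pImoDoc = m_pDoc->get_im_root();
    ImoScore* pScore = dynamic_cast<ImoScore*>( pImoDoc->get_content_item(0) );
    if (pScore)
    {
        m_pPlayer->load_score(pScore, nullptr);    // nullptr: no PlayerGui (see below)
        m_pPlayer->play(k_do_visual_tracking, 0, pInteractor);
    }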
Apart from generating sound events, Lomse also generates visual tracking events, that is, events to add visual tracking effects, synchronized with the sound, on the displayed score.
For generating visual effects you have two options: either do it yourself, by modifying the Lomse graphical model as desired, or, simpler, delegate to Lomse for generating standard visual effects. Lomse offers two types of visual effects: highlighting the notes and rests as they are played back, and displaying a tempo line (or tempo block) that marks the current position.
The type of visual tracking event to generate is controlled by method Interactor::set_visual_tracking_mode(). Valid values for visual tracking effects are defined in enum EVisualTrackingMode. By default, if method Interactor::set_visual_tracking_mode() is not invoked, Lomse will highlight notes and rests as they are played back. Several visual effects can be used simultaneously by combining values with the OR ('|') operator. For example:
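A minimal sketch, assuming spInteractor points to the Interactor for the displayed document and that k_tracking_tempo_line and k_tracking_highlight_notes are among the EVisualTrackingMode values in your Lomse version:

    // show a tempo line and also highlight the notes/rests being played
    spInteractor->set_visual_tracking_mode(k_tracking_tempo_line | k_tracking_highlight_notes);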
Visual tracking events are sent to your application via the event handling callback that you set up at Lomse initialization. When handling a visual tracking event, if your application wants to delegate to Lomse for generating the visual effects, the only thing to do is to pass the event to the Interactor:
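A sketch of such a handler. It assumes the callback signature used in the Lomse tutorials (a SpEventInfo parameter), that m_pInteractor is a weak pointer (WpInteractor) to the Interactor, and that the event type constant and delegation method are k_tracking_event and Interactor::on_visual_tracking(); verify these names against your Lomse version:

    void MyApp::on_lomse_event(SpEventInfo pEvent)
    {
        if (pEvent->get_event_type() == k_tracking_event)
        {
            // delegate to Lomse: it will update the visual effects and then
            // send an update-window event so the display gets refreshed
            SpEventVisualTracking pEv(
                std::static_pointer_cast<EventVisualTracking>(pEvent) );
            if (SpInteractor spInteractor = m_pInteractor.lock())
                spInteractor->on_visual_tracking(pEv);
        }
    }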
Lomse will handle the event and will send an update window event to your application, for updating the display.
Some properties of the tracking effects, such as their colour, can be customized. See method Interactor::get_tracking_effect(). For the customizable properties, see the documentation of each specific visual effect. Example:
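For instance, a sketch for customizing the tempo line. It assumes that get_tracking_effect() takes an EVisualTrackingMode value and returns a VisualEffect* that can be downcast to TempoLine, and that TempoLine exposes setters such as set_color() and set_width(); check the visual-effects documentation for the exact names:

    // customize the tempo line: semi-transparent red colour, wider line
    VisualEffect* pEffect = spInteractor->get_tracking_effect(k_tracking_tempo_line);
    if (pEffect)
    {
        TempoLine* pLine = static_cast<TempoLine*>(pEffect);
        pLine->set_color(Color(255, 0, 0, 128));
        pLine->set_width(200);
    }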
For controlling playback, some GUI controls are normally required (buttons, menu items, etc. to trigger start, stop and pause actions; a slider or other control for setting the tempo speed; etc.).
The simplest way of passing the value of a playback option (e.g. the tempo speed) is to pass the value directly to Lomse when asking to play the score, in method ScorePlayer::play(). But this approach has a drawback: if the user later changes, for instance, the tempo slider, Lomse will not be informed of the change.
To solve this kind of problem, the solution is to link your application playback controls to Lomse, so that Lomse can be informed when changes take place or can collect the current settings when needed. To do this linking, your application has to define a class derived from PlayerGui (it can be your main window) and implement a few virtual methods that allow Lomse to access the current values of your application playback controls:
The implementation of these methods is usually as simple as linking each method to the appropriate control in your application. For instance:
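A sketch, assuming that PlayerGui declares virtual methods such as get_metronome_mm() and countoff_status() (check the PlayerGui declaration in your Lomse version for the exact set) and that MainFrame is your main window with a tempo slider and a count-off checkbox (wxWidgets-style calls, purely illustrative):

    // MainFrame derives from PlayerGui; each method just returns the current
    // value of the corresponding GUI control (control names are illustrative)
    int MainFrame::get_metronome_mm()
    {
        return m_pTempoSlider->GetValue();      // tempo chosen by the user (bpm)
    }

    bool MainFrame::countoff_status()
    {
        return m_pCountOffCheck->IsChecked();   // is count-off enabled?
    }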
PlayerGui is also the object that will receive end-of-playback events. These are events generated by Lomse for signalling the end of playback, so that your application can restore the GUI controls related to playback or do other things. This event is sent simultaneously to your application by two mechanisms: as a direct notification to your PlayerGui object, and as a regular Lomse event delivered through the event handling callback.
If you look at the example code in section How to play a score (relevant lines duplicated here):
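Using the same illustrative names as in the earlier sketch:

    m_pPlayer->load_score(pScore, nullptr);    // nullptr: no PlayerGui
    m_pPlayer->play(k_do_visual_tracking, 0, pInteractor);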
The second parameter for load_score() is nullptr, meaning that no PlayerGui is used. As you can deduce, the way of informing Lomse of the GUI proxy to use is by passing a pointer to the PlayerGui in this method:
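For instance, if this code runs in your main window and the main window derives from PlayerGui, a minimal sketch would be:

    // pass the PlayerGui object (here, the main window itself) instead of nullptr
    m_pPlayer->load_score(pScore, this);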
If your application would like to use an external player and provide visual feedback by synchronizing the performance with the displayed score, Lomse cannot do this automatically, as it does not control the playback. But Lomse provides some methods that can help your application to achieve the sound/display synchronization.
For synchronizing the performance with the displayed score, your application has to manage the visual tracking effects and, when needed, the scrolling of the view as playback advances.
Visual tracking effects can be managed by your application by invoking the Interactor methods for displaying, hiding and positioning the desired visual effect. Thus, the only requirement is that your application can provide the necessary information for positioning the visual effect as playback advances.
Probably, the easiest approach for visual tracking effects is to use the tempo line or the tempo block, as using them only requires that your application can identify the location (current measure and beat, or current time position) being played back. If your application can get that information, the procedure for synchronizing the tempo line with the playback would be as follows:
1. Select the tempo line as the visual tracking effect to use (see Interactor::set_visual_tracking_mode() above), for instance when the Play button is clicked.

2. As playback advances, inform Lomse of the current playback position by invoking Interactor::move_tempo_line_and_scroll_if_necessary(), so that the tempo line is repositioned and the score is scrolled when needed.

The above method will scroll only when Lomse determines it is necessary. If you would like to use your own algorithm for scrolling then, instead of using move_tempo_line_and_scroll_if_necessary(), you should use move_tempo_line() and invoke scroll_to_measure() when your application considers it convenient:
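A sketch of this alternative; scoreId, measure, beat and must_scroll come from your external player and your own scrolling logic, and the (scoreId, measure, beat) argument lists are assumptions inferred from the method names, so verify them against the Interactor documentation:

    // reposition the tempo line yourself...
    spInteractor->move_tempo_line(scoreId, measure, beat);

    // ...and scroll only when your own algorithm decides it is needed
    if (must_scroll)
        spInteractor->scroll_to_measure(scoreId, measure, beat);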
And that's all. You can see a working example in examples/samples/extplayer.
Trying to highlight/unhighlight the notes and rests as playback advances would be more complex, as it would require that your application could provide a pointer to the note/rest being played back and detect when it has to be unhighlighted. But the approach would be similar.