A device is configured to display images of sheet music or electronic sheet music. The device may generate and display an index including markers, locations of markers and series of notes associated with the markers. The device may link directly to a particular image of the sheet music or a particular location in the electronic sheet music using the index. The device may identify musical symbols displayed in the sheet music and generate electronic musical symbols associated with the musical symbols. The device may modify the electronic musical symbols and display the modified electronic musical symbols superimposed over the images of sheet music. The device may generate and output audio using the electronic musical symbols. The device may detect a series of pitches and display the sheet music or electronic sheet music based on a most recent pitch in the series of pitches.
5. A computer-implemented method of displaying sheet music on a device, the method comprising:
identifying, by a device, a first marker indicating a first location of a plurality of reference locations within sheet music data, the sheet music data being an electronic representation of sheet music;
determining the first location in the sheet music data;
acquiring first music data from the sheet music data at the first location, the first music data including a note and a note value for each musical symbol in a first series of musical symbols;
updating an index to include the first marker, the first location and the first music data, the index being a navigable index on the device including a portion of the plurality of reference locations in the sheet music data; and
displaying the first marker and at least a portion of the first music data on the device.
13. A computing device, comprising:
at least one processor;
a memory device including instructions operable to be executed by the at least one processor to cause the device to:
identify a first marker indicating a first location of a plurality of reference locations within sheet music data, the sheet music data being an electronic representation of sheet music;
determine the first location in the sheet music data;
acquire first music data from the sheet music data at the first location, the first music data including a note and a note value for each musical symbol in a first series of musical symbols;
update an index to include the first marker, the first location and the first music data, the index being a navigable index on the device including a portion of the plurality of reference locations in the sheet music data; and
display the first marker and at least a portion of the first music data on the device.
1. A computer-implemented method of displaying sheet music on an electronic reader, the method comprising:
acquiring sheet music data;
identifying, using bookmark data embedded in the sheet music data, a marker indicating a reference location of a plurality of reference locations within the sheet music data;
identifying a type for the marker based on the bookmark data, the type indicating how the marker was created;
determining the reference location in the sheet music data;
identifying one or more instruments included in the sheet music data at the reference location;
acquiring first music data from the sheet music data at the reference location, the first music data including a note and a note value for each musical symbol in a series of musical symbols;
determining second music data from the first music data for each of the one or more instruments, the second music data including a portion of the first music data associated with each of the one or more instruments;
updating an index to include the marker, the type of marker, the reference location, the one or more instruments associated with the reference location and the second music data, the index being a navigable index on the electronic reader including a portion of the plurality of reference locations in the sheet music data;
filtering the index based on the type of marker;
filtering the index based on the one or more instruments; and
displaying the filtered index on a display of the electronic reader, wherein the marker is displayed proximate to the second music data.
2. The computer-implemented method of
detecting a touch input at first coordinates of the display;
identifying the first coordinates to be proximate to the marker on the display;
determining that the touch input exceeds a time threshold; and
displaying a first staff and a second staff, the first staff having five evenly spaced lines including the first visual representation of the second music data for the first instrument and the second staff having five evenly spaced lines including the second visual representation of the second music data for the second instrument.
3. The computer-implemented method of
detecting a touch input at first coordinates of the display;
identifying the first coordinates to be proximate to the marker on the display;
generating audio using the second music data associated with the marker; and
outputting the audio.
4. The computer-implemented method of
detecting contact at coordinates on the display;
identifying the coordinates to be proximate to the marker on the display;
determining the reference location associated with the marker;
determining a portion of the sheet music data including the first music data using the reference location; and
displaying a visual representation of the portion of the sheet music data.
6. The computer-implemented method of
detecting an input selecting the first marker, the input exceeding a time threshold; and
displaying a first staff and a second staff, the first staff having five evenly spaced lines including the first visual representation of the first portion of the first music data and the second staff having five evenly spaced lines including the second visual representation of the second portion of the first music data.
7. The computer-implemented method of
detecting an input selecting the first marker;
generating audio using the first music data in response to the input; and
outputting the audio.
8. The computer-implemented method of
displaying the index, the index including the first marker and a second marker;
detecting an input selecting the second marker;
determining a second location associated with the second marker using the index;
determining a portion of the sheet music data including the second location; and
displaying a visual representation of the portion of the sheet music data in response to the input.
9. The computer-implemented method of
10. The computer-implemented method of
identifying the first location in the sheet music data;
determining a series of measures within the sheet music data beginning with the first location, the series of measures including a predetermined number of measures;
identifying musical symbols within the series of measures as the first series of musical symbols; and
generating a visual representation of the first series of musical symbols.
11. The computer-implemented method of
identifying a staff associated with the first location in the sheet music data, the staff having five horizontal staff lines, the sheet music data including an image of the sheet music;
determining a series of measures within the sheet music data beginning with the first location, the series of measures including a predetermined number of measures;
identifying the first musical symbols on the staff within the series of measures;
determining a pitch indicated by each of the first musical symbols based on the staff and a position of each of the musical symbols relative to the five horizontal staff lines;
determining a duration indicated by each of the first musical symbols based on a shape of each of the first musical symbols; and
generating the first music data, the first music data including the pitch and the duration indicated by each of the first musical symbols.
12. The computer-implemented method of
identifying a type for the first marker, the type indicating how the first marker was created;
identifying one or more instruments included in the sheet music data at the first location;
determining second music data from the first music data for each of the one or more instruments, the second music data including a portion of the first music data associated with each of the one or more instruments,
updating the index to include the type of marker, the one or more instruments and the second music data;
filtering the index based on the type of marker;
filtering the index based on the one or more instruments; and
displaying the filtered index, wherein the first marker is displayed proximate to the second music data.
14. The computing device of
detect an input selecting the first marker, the input exceeding a time threshold; and
display a first staff and a second staff, the first staff having five evenly spaced lines including the first visual representation of the first portion of the first music data and the second staff having five evenly spaced lines including the second visual representation of the second portion of the first music data.
15. The computing device of
detect an input selecting the first marker;
generate audio using the first music data in response to the input; and
output the audio.
16. The computing device of
display the index, the index including the first marker and a second marker;
detect an input selecting the second marker;
determine a second location associated with the second marker using the index;
determine a portion of the sheet music data including the second location; and
display a visual representation of the portion of the sheet music data in response to the input.
17. The computing device of
18. The computing device of
identify the first location in the sheet music data;
determine a series of measures within the sheet music data beginning with the first location, the series of measures including a predetermined number of measures;
identify musical symbols within the series of measures as the first series of musical symbols; and
generate a visual representation of the first series of musical symbols.
19. The computing device of
identify a staff associated with the first location in the sheet music data, the staff having five horizontal staff lines, the sheet music data including an image of the sheet music;
determine a series of measures within the sheet music data beginning with the first location, the series of measures including a predetermined number of measures;
identify the first musical symbols on the staff within the series of measures;
determine a pitch indicated by each of the first musical symbols based on the staff and a position of each of the musical symbols relative to the five horizontal staff lines;
determine a duration indicated by each of the first musical symbols based on a shape of each of the first musical symbols; and
generate the first music data, the first music data including the pitch and the duration indicated by each of the first musical symbols.
20. The computing device of
identify a type for the first marker, the type indicating how the first marker was created;
identify one or more instruments included in the sheet music data at the first location;
determine second music data from the first music data for each of the one or more instruments, the second music data including a portion of the first music data associated with each of the one or more instruments,
update the index to include the type of marker, the one or more instruments and the second music data;
filter the index based on the type of marker;
filter the index based on the one or more instruments; and
display the filtered index, wherein the first marker is displayed proximate to the second music data.
With the advancement of technology, the use and popularity of electronic devices, such as mobile devices, has increased considerably. Mobile devices, such as smart phones and tablet computers, typically have touchscreens that enable a user to operate the devices by touching the screen with a finger or stylus type device.
For a more complete understanding of the present disclosure, reference is now made to the following description taken in conjunction with the accompanying drawings.
A computing device, such as a computer, tablet, smartphone, electronic reader, etc., may be used to display electronic books, newspapers, magazines, blogs and other digital media or electronic publications, including sheet music. For a text based publication, navigation may be improved by a table of contents or an index that links to particular sections within the publication. A device may generate a table of contents or index based on formatting inherent in the text based publication, such as chapters, sections and headings, and text may easily identify the particular section being referenced. Sheet music, however, is not text based and typically does not include formatting to easily allow the device to generate a table of contents or index. In addition, it may be difficult to identify a particular section of the sheet music being referenced using text. Further, sheet music is typically displayed using a fixed layout, such as a Portable Document Format (PDF) or an image of printed sheet music. As the sheet music has a fixed layout, a size of the sheet music is fixed based on the fixed layout. Thus, if the device magnifies a view of the sheet music, portions of the sheet music may be outside a display of the device.
To provide a more complete set of features, described herein is a system for improved navigation, magnified display and additional functionality associated with sheet music. To improve navigation, the device may identify markers in sheet music and generate an index displaying the markers, marker locations and musical symbols associated with the markers. Using the musical symbols, it may be easier to identify the particular section being referenced by the index. The index may also enable improved navigation functionality, such as allowing a user to select a marker and displaying the sheet music at a location associated with the marker. To enable a magnified display, the device may identify musical symbols displayed in the sheet music and acquire electronic musical symbols corresponding to the displayed musical symbols. The device may use the electronic musical symbols to display electronic sheet music that can be magnified and displayed in sequence by reflowing the electronic musical symbols between pages. The electronic musical symbols may also enable improved functionality, as they may be used to modify musical symbols in the sheet music, to generate and output audio corresponding to the sheet music, and to monitor an audio input from a microphone and display a current location in the sheet music determined from the audio input.
As described herein, “sheet music” may refer to sheet music data, which is an electronic representation of sheet music that the device may visually represent on the display using a fixed layout, whereas “electronic sheet music” may refer to electronic sheet music data, which is an electronic representation of electronic sheet music that the device may visually represent on the display using a variety of layouts based on a zoom magnification or other settings. For example, the electronic sheet music data may include a series or sequence of electronic musical symbols, and the device may visually represent the series of electronic musical symbols as first electronic sheet music using a first magnification and as second electronic sheet music using a second magnification, with the series of electronic musical symbols reflowed in the second electronic sheet music so that they are displayed in order using additional pages relative to the first electronic sheet music. Similarly, “musical symbols” may refer to musical symbols included in the fixed layout sheet music, whereas “electronic musical symbols” may refer to musical symbols visually represented in the electronic sheet music that also include associated data, such as a pitch (note) and duration (note value) indicated by the electronic musical symbols. For example, musical symbols may be an image or visual representation of the musical symbols included in the sheet music, and the device may display the musical symbols. In contrast, an electronic musical symbol may include a visual representation of the electronic musical symbol to display, along with a pitch (note) and duration (note value) indicated by the electronic musical symbol. Therefore, the device may display the electronic musical symbol, generate audio based on the electronic musical symbol or search the electronic musical symbols based on a series of pitches.
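For illustration, the following is a minimal sketch of how an electronic musical symbol might pair a displayable glyph with its pitch and duration, in contrast to a fixed-layout image of a musical symbol; the class and field names are assumptions for this example, not a data model specified by the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ElectronicMusicalSymbol:
    """One symbol of electronic sheet music data: displayable and playable."""
    glyph: str                 # visual representation to draw, e.g. "quarter_note"
    pitch: Optional[str]       # note, e.g. "C4"; None for a rest
    duration: float            # note value in beats, e.g. 1.0 for a quarter note

# A musical symbol in fixed-layout sheet music, by contrast, is only marks at a
# position within an image; it carries no machine-readable pitch or duration.
```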
Content (e.g., electronic documents, sheet music, electronic sheet music or the like) may be stored using a variety of file formats, and some of the file formats may use markup languages such as Extensible Markup Language (XML). For example, data including sheet music may be stored in fixed layout file formats, such as image files or Portable Document Format (PDF) files, while electronic sheet music data may be stored in music notation file formats, such as MusicXML files, or the electronic musical symbols may be saved as a sequence of commands, for example using a Musical Instrument Digital Interface (MIDI) protocol to generate a Standard MIDI File (SMF). The device may interpret electronic sheet music data, such as a MusicXML file or a SMF, and visually represent the electronic musical symbols on a display of the device as the electronic sheet music. In some examples, the MusicXML and/or SMF may be associated with sheet music, such as a PDF, but may be stored separately from the sheet music.
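As an example of reading such a music notation file, the sketch below extracts the pitch (note) and duration (note value) of each note element from an uncompressed MusicXML file using the standard library XML parser; it is deliberately simplified and ignores details such as voices, chords and ties:

```python
import xml.etree.ElementTree as ET

def parse_musicxml_notes(path):
    """Return a list of (pitch, duration) pairs for each <note> in a MusicXML file."""
    tree = ET.parse(path)
    notes = []
    for note in tree.getroot().iter("note"):
        pitch_el = note.find("pitch")
        if pitch_el is None:                     # rests have no <pitch> element
            pitch = None
        else:
            step = pitch_el.findtext("step")     # letter name, e.g. "C"
            octave = pitch_el.findtext("octave") # octave number, e.g. "4"
            pitch = f"{step}{octave}"
        duration = int(note.findtext("duration", default="0"))  # in divisions of a beat
        notes.append((pitch, duration))
    return notes
```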
A single file may include sheet music and/or electronic sheet music associated with a single song or multiple songs. For example, a first file may include sheet music and/or electronic sheet music associated with a first song, while a second file may include sheet music and/or electronic sheet music associated with a second song and a third song. In addition, a single file may include sheet music and/or electronic sheet music for a single instrument or multiple instruments. For example, the first file may only include sheet music and/or electronic sheet music for a violin, while the second file may include sheet music and/or electronic sheet music for both a violin and a cello. Therefore, one file may include multiple songs for multiple instruments (e.g., set list for an entire orchestra), multiple songs for a single instrument (e.g., set list for a violin in the orchestra), a single song for multiple instruments (e.g., single song for the entire orchestra) or a single song for a single instrument (e.g., single song for the violin).
A table of contents or an index (hereinafter, both a table of contents and an index may be referred to as an “index” for ease of explanation) may be used to navigate between multiple files and/or multiple songs within a file. The index may include multiple levels, such as a first-level header (e.g., heading) and a second-level header (e.g., subheading), and a user may add index entries to or delete index entries from the index. The index may be stored separately from the sheet music and/or the electronic sheet music using a format such as a Navigation Control file for XML applications (NCX). A single index may reference locations within multiple files, such as multiple MusicXML, SMF and/or PDF documents. Therefore, the NCX may allow a user of the device to navigate between sheet music and/or electronic sheet music associated with multiple songs and to reach reference locations in the multiple songs indicated by index entries in the index.
To assist in organizing and accessing the sheet music, the device 102 may generate and display an index 120 associated with the sheet music. The index 120 may include index entries referencing different measures within the sheet music. The index entries may each be associated with a marker, such as rehearsal letter A, rehearsal letter B and rehearsal letter C, which refer to the respective measures within the sheet music. For example, the first index entry 122-1 may reference rehearsal letter A in measure 4, the second index entry 122-2 may reference rehearsal letter B in measure 24, and the third index entry 122-3 may reference rehearsal letter C in measure 36. The index entries may include snippets of music, such as a sequence of musical notes, associated with the respective marker. The snippets of music may be musically representative of respective sections of the sheet music, such as a series of notes in the sheet music associated with the respective marker. For example, the first index entry 122-1 includes a first snippet 124-1, the second index entry 122-2 includes a second snippet 124-2 and the third index entry 122-3 includes a third snippet 124-3. Thus, the index 120 may provide a list of markers along with associated information, such as a location in the sheet music and some musical notes from that location in the sheet music. Each index entry may also be associated with an index reference, for example the numbers “1.”, “2.” and “3.” preceding the index entries.
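A sketch of the kind of record each index entry might carry, using the example entries above, follows; the field names are assumptions for illustration, and the note values shown are arbitrary example snippets rather than notes from any particular piece:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class IndexEntry:
    marker: str                    # e.g., rehearsal letter "A"
    measure: int                   # location in the sheet music, e.g., measure 4
    snippet: List[str] = field(default_factory=list)   # series of notes shown next to the marker

index_120 = [
    IndexEntry("A", 4,  ["G4", "A4", "B4", "C5"]),
    IndexEntry("B", 24, ["E5", "D5", "C5"]),
    IndexEntry("C", 36, ["C4", "E4", "G4"]),
]
```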
Examples of markers may include rehearsal letters or numbers, a beginning of a particular section within the music, a beginning of a movement or theme, a beginning of a new piece of music, a bookmark saved by a user or other points of reference within the sheet music. The markers may be visible in the sheet music and/or electronic sheet music, although some markers may not be visually indicated. The markers may be associated with marker data (e.g., rehearsal letter data, rehearsal number data, section data, bookmark data or the like) embedded in or associated with the sheet music data and/or the electronic sheet music data. For example, some marker data may indicate a type of marker and the device may determine a location of the marker based on a location of the marker data. Other marker data may indicate a type of marker and include a reference location associated with the marker. Some marker data may be semantic elements included in the sheet music data and/or electronic sheet music data. For example, the device 102 may identify a first semantic element (e.g., a first type of tag) and associate first semantic elements with a particular type of marker.
To generate the index 120, the device 102 may acquire (130) sheet music, such as by accessing electronic sheet music data or file(s)/document(s)/image(s) including sheet music. The device 102 may identify (132) marker(s) in the sheet music and may determine (134) location(s) of the marker(s) in the sheet music. For example, the device 102 may identify markers or other notations in the sheet music, or metadata associated with the sheet music, that designate particular regions as important. For instance, in the sheet music the device 102 may identify rehearsal letters or numbers based on a position of the text relative to a staff in the sheet music, a beginning of a movement or theme based on notations or annotations, a beginning of a section based on a double bar line and/or a beginning of a new piece of music based on a bold double bar line. In addition, the device 102 may identify markers based on metadata associated with the sheet music, the metadata including the markers listed above along with annotations or bookmarks saved by a user. Thus, the sheet music may include a first rehearsal letter in the published sheet music and a second rehearsal letter added by the user that is saved in the metadata.
The device 102 may acquire (136) a series of notes from location(s) associated with marker(s). For example, the device 102 may acquire image(s) of the series of notes, may obtain the series of notes directly from electronic sheet music or may convert image(s) of sheet music into the series of notes. The device 102 may update (138) an index to include marker(s), location(s) and series of notes associated with marker(s). For example, the device 102 may save the information related to the marker(s) to an electronic file separate from the sheet music. The device 102 may display (140) the index, including the series of notes associated with the markers. In some embodiments, the device 102 may display the index individually, whereas in other embodiments the device 102 may display the sheet music and the index simultaneously. In addition, various elements of the index may be displayed, and user preferences may be used to filter markers or modify what information associated with the markers is displayed.
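Taken together, steps 130-140 might be sketched as the following loop; identify_markers, determine_location and acquire_notes are hypothetical placeholders for the marker identification and note acquisition described above:

```python
def build_index(sheet_music_data, snippet_measures=2):
    """Sketch of steps 130-140: markers -> locations -> note snippets -> index."""
    index = []
    for marker in identify_markers(sheet_music_data):             # step 132 (hypothetical helper)
        location = determine_location(sheet_music_data, marker)   # step 134 (hypothetical helper)
        notes = acquire_notes(sheet_music_data, location,         # step 136 (hypothetical helper)
                              measures=snippet_measures)
        index.append({"marker": marker,                           # step 138: update the index
                      "location": location,
                      "notes": notes})
    return index                                                   # step 140 would then display it
```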
To generate the electronic sheet music, the device 102 may acquire (150) sheet music, such as by accessing file(s)/document(s)/image(s) of sheet music. The device 102 may identify (152) a staff on the sheet music, the staff being a typical musical notation encompassing five evenly spaced lines. The device 102 may identify (154) musical symbols on the staff, determine (156) notes from the musical symbols and associate (158) notes with the corresponding musical symbols. For example, the device 102 may analyze a musical symbol to identify a note (pitch) and a note value (duration) associated with the musical symbol and may save the note and note value as an electronic symbol. Thus, the electronic symbol may include the note and the note value along with a visual representation of the electronic symbol, allowing the device 102 to generate audio using the electronic symbol.
The device 102 may display (160) sheet music, such as the original image(s) of sheet music or electronic sheet music generated from the musical symbols. By generating the electronic sheet music, the device 102 may provide additional functionality when viewing the sheet music. For example, the device 102 may play audio based on the sheet music or may modify individual musical symbols based on a user input.
As an example of a marker, rehearsal letter 216 may be used to identify a particular section of the sheet music 210. To identify locations within the sheet music, measure numbers, such as measure number 217, may indicate a number for a particular measure. Thus, the rehearsal letter 216 may have a location of 4, indicating that it begins at the fourth measure in the sheet music 210 of Title 218.
The device 102 may identify (312) a marker in the sheet music, such as by identifying a notation in the sheet music corresponding to a marker or by identifying an electronic marker associated with the sheet music. For example, in the sheet music the device 102 may identify rehearsal letters or numbers based on a position of the text relative to a staff in the sheet music, a beginning of a movement or theme based on notations or annotations, a beginning of a section based on a double bar line and/or a beginning of a new piece of music based on a bold double bar line. In addition, the device 102 may identify markers based on metadata associated with the sheet music, the metadata including the markers listed above along with annotations or bookmarks saved by a user. Thus, the sheet music may include a first rehearsal letter in the published sheet music and a second rehearsal letter added by the user that is saved in the metadata.
The device 102 may identify (314) a type of marker for the marker. Examples of marker types may include rehearsal letters or numbers, a particular section of the music, a beginning of a movement or theme, a beginning of a new piece of music, a bookmark saved by a user or other points of reference within the sheet music.
The device 102 may determine (316) a location of the marker based on a staff, measure number, bar line, or the like, associated with the marker. For example, the device 102 may identify a staff associated with the marker, may identify a measure number located at the beginning of the staff and determine a number of bar lines between the measure number and the marker.
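One way step 316 could be realized, assuming the device has already located (in image coordinates) the measure number printed at the beginning of the staff, the bar lines on the staff and the marker, is sketched below; the function and parameter names are hypothetical:

```python
def measure_for_marker(staff_start_measure, bar_line_xs, marker_x):
    """Measure number at which a marker begins: the measure number at the start of
    the staff plus the number of bar lines between that measure number and the
    marker's horizontal position."""
    bars_before_marker = sum(1 for x in bar_line_xs if x < marker_x)
    return staff_start_measure + bars_before_marker

# e.g., a staff starting at measure 21 with bar lines at x = 140, 260 and 380 and
# a marker drawn at x = 300 would yield measure 21 + 2 = 23.
```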
The device 102 may identify (318) instrument(s) associated with the location. For example, a marker may be associated with a system that includes multiple staves. Individual staves may be associated with an individual instrument, although some instruments may be associated with multiple staves, such as a piano being associated with a grand staff (including a staff for a treble clef and a staff for a bass clef). The sheet music may identify the instrument(s) in notation(s) preceding the staves.
The device 102 may acquire (320) a series of notes from the location. For example, the device 102 may identify musical symbols beginning at the measure number associated with the marker and may determine a pitch (note) and duration (note value) (collectively, “notes”) for individual musical symbols. The device 102 may acquire a desired number of musical symbols, a desired number of measures or a desired duration of time associated with the musical symbols.
The device 102 may associate (322) portions of the series of notes with corresponding instrument(s). For example, the notes on a first staff may be associated with a flute while the notes on a second staff may be associated with a clarinet. The device 102 may determine (324) notes to display for the corresponding marker. For example, if a single instrument is identified, the device 102 may display the series of notes for the marker. In contrast, if multiple instruments are identified, the device 102 may display notes associated with a single instrument for the marker, based on user preference, or may display notes associated with multiple instruments. In one example, the device 102 may display notes associated with a flute for a first measure but the flute may have a rest for a second measure and the device 102 may display notes associated with a clarinet for the second measure. In another example, the device 102 may determine that a melody is associated with the flute in a first measure and with a clarinet in a second measure and may display notes associated with the flute for the first measure and notes associated with the clarinet for the second measure. In another example, the device 102 may display notes from multiple instruments on a single staff to display additional information.
The device 102 may update (326) an index to include the marker, location, instrument(s) and notes to display. For example, the device 102 may save the information to an electronic file separate from the sheet music. The device 102 may determine (328) if additional markers are present in the sheet music. If an additional marker is present, the device 102 may loop (330) to step 312 and repeat steps 312-326 for the additional marker.
If an additional marker is not present, the device may filter (332) the index based on a type of marker and/or instruments and may display (334) the filtered index, including notes to display associated with the markers. For example, the index may be filtered to show a rehearsal letter associated with a flute, allowing a flutist to easily navigate the sheet music using the notes associated with the rehearsal letters. While the filtering is illustrated based on a type of marker and/or instruments, the disclosure is not limited thereto. Instead, the device 102 may filter based on a melody, a harmony, a theme, a range of measures in the sheet music or any other filtering system known to one of skill in the art.
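The filtering of steps 332-334 might be sketched as follows, treating index entries as dictionaries with hypothetical keys for the marker type and instruments:

```python
def filter_index(index, marker_type=None, instrument=None):
    """Steps 332-334: keep only entries matching the requested marker type and/or instrument."""
    entries = index
    if marker_type is not None:
        entries = [e for e in entries if e["type"] == marker_type]
    if instrument is not None:
        entries = [e for e in entries if instrument in e["instruments"]]
    return entries

# e.g., show only rehearsal letters that include a flute part:
# flute_view = filter_index(index, marker_type="rehearsal_letter", instrument="flute")
```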
The index may be dynamic and operable, and associated with various functionality executable by the device.
The device 102 may determine (814) a marker to insert and may insert (816) the marker at the location in the sheet music. As an example, the device 102 may provide the user with a list of available markers and the user may select a particular marker to insert. As another example, the device 102 may use a particular marker for insert commands and may not need input from the user. The device 102 may insert a visual representation of the marker at a desired location, such as above the bar line preceding the measure, or the device 102 may insert the visual representation of the marker at the first coordinates.
The device 102 may identify (818) instrument(s) associated with the location. For example, the device 102 may determine that the location is associated with a single staff and therefore a single instrument. Alternatively, the device 102 may determine that the location is associated with a system including multiple instruments. The device 102 may acquire (820) a series of notes beginning at the location for the instrument(s) and may associate (822) portions of series of notes with corresponding instrument(s). For example, if a system is present including three instruments, the device 102 may acquire a series of notes for the three instruments and associate portions of the series of notes with the corresponding instrument.
The device 102 may determine (824) notes to display for the marker, such as the notes to display in an index. The device 102 may then update (826) an index to include the marker, location, instrument(s) and notes to display. The disclosure is not limited thereto, and the device 102 may update the index to include additional information not listed.
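A sketch of how steps 814-826 might record a user-inserted marker in the index follows; identify_instruments and acquire_notes are hypothetical placeholders, and the entry fields are assumptions for illustration:

```python
def insert_user_marker(index, sheet_music_data, location, label="bookmark"):
    """Sketch of steps 814-826: insert a user-created marker and record it in the index."""
    instruments = identify_instruments(sheet_music_data, location)  # step 818 (hypothetical helper)
    notes = acquire_notes(sheet_music_data, location)                # step 820 (hypothetical helper)
    entry = {
        "marker": label,
        "type": "user_bookmark",     # distinguishes user markers from published rehearsal letters
        "location": location,
        "instruments": instruments,
        "notes": notes,              # step 824: notes to display for the marker
    }
    index.append(entry)              # step 826: update the index
    return entry
```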
The device 102 may identify (1012) a staff on the sheet music, the staff being a typical musical notation encompassing five evenly spaced staff lines. The device 102 may identify (1014) an instrument associated with the staff and identify (1016) musical symbols positioned on the staff. For example, the sheet music may include a system having multiple instruments and may have notations identifying the multiple instruments. The device 102 may determine (1018) notes (pitch) from the musical symbols. For example, the staff may include a clef indicating a pitch range and each musical symbol may be associated with a particular pitch in the pitch range. The device 102 may determine (1020) note values (duration) from the musical symbols. The note values may be a duration of the musical symbol based on a timing system identified in the sheet music. The device 102 may generate (1022) electronic symbols using the notes and the note values and may associate (1024) the electronic symbols with corresponding musical symbols.
The device 102 may store (1026) the notes, the note values and the electronic symbols. For example, the device 102 may store the notes, the note values and the electronic symbols separate from the sheet music, such as in an index, a metadata file or another file associated with the sheet music.
The device 102 may determine (1028) if an additional staff is present in the sheet music. If an additional staff is present, the device 102 may loop (1030) to step 1012 and repeat steps 1012-1026 for the additional staff.
If the additional staff is not present, the device 102 may display (1032) musical symbols and/or electronic symbols. In a first example, the device 102 may display the sheet music without the associated electronic symbols, using the notes and note values obtained to provide additional functionality to the sheet music. In a second example, the device 102 may display the sheet music with the associated electronic symbols, such as overlaid above the sheet music. In this example, the user may see the electronic symbols and may modify the electronic symbols to correct mistakes or to modify the underlying sheet music. In a third example, the device 102 may display electronic sheet music using the electronic symbols. In this example, the device 102 does not display the sheet music and may provide the electronic symbols in various formats and/or magnification not available using the sheet music.
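For steps 1018-1022, the sketch below maps a note head's vertical position on a treble-clef staff to a pitch, and a symbol's shape to a note value; the mapping is deliberately simplified (it assumes a quarter-note beat and ignores accidentals, key signatures and other clefs):

```python
TREBLE_STEPS = ["E", "F", "G", "A", "B", "C", "D"]   # diatonic cycle starting at the bottom line

def treble_pitch(staff_step):
    """Pitch for a note head 'staff_step' diatonic steps above the bottom staff line.

    Step 0 is the bottom line (E4), step 1 the first space (F4), and so on;
    negative steps correspond to ledger lines and spaces below the staff.
    """
    name = TREBLE_STEPS[staff_step % 7]
    octave = 4 + (staff_step + 2) // 7   # +2 because the octave turns over at C, five steps above E4
    return f"{name}{octave}"

# Note value implied by the shape of the symbol, in beats (assuming a quarter-note beat).
NOTE_VALUES = {"whole": 4.0, "half": 2.0, "quarter": 1.0, "eighth": 0.5, "sixteenth": 0.25}

# e.g., treble_pitch(5) == "C5" (third space) and NOTE_VALUES["half"] == 2.0
```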
The device 102 may detect (1310) a shift command at coordinates on a display, identify (1312) a musical symbol associated with the coordinates and determine (1314) an electronic symbol corresponding to the musical symbol. In a first example, the device 102 may be displaying an image of sheet music and the device 102 may identify the musical symbol on the sheet music and determine the electronic symbol corresponding to the musical symbol. In a second example, the device 102 may be displaying an image of sheet music with electronic symbols superimposed on the sheet music. In this example, the device 102 may identify the electronic symbol directly based on the coordinates. In a third example, the device 102 may be displaying electronic sheet music and the device 102 may identify the electronic symbol based on the coordinates.
The device 102 may modify (1316) a note (pitch) of the electronic symbol based on user input. For example, the user may click and drag the electronic symbol to a new staff line on the staff corresponding to a separate note. Alternatively, the user may select an electronic symbol, insert a shift command and click on the new staff line to assign the selected electronic symbol to the separate note. The disclosure is not limited thereto and may include any methods known to one of skill in the art for changing a note of the electronic symbol.
The device 102 may store (1318) the modified note for the electronic symbol. For example, the device 102 may store the modified note with the electronic symbols, which may be stored separate from the sheet music, such as in an index, a metadata file or another file associated with the sheet music.
The device 102 may display (1320) an electronic symbol at the modified note. In a first example, the device 102 may display an image of sheet music with the electronic symbol at the modified note, so that a user may see the modified note and the original note. In a second example, the device 102 may display electronic sheet music with the electronic symbol at the modified note along with an indicator of the original note. In a third example, the device 102 may display electronic sheet music with the electronic symbol at the modified note without any indicator of the original note, although the electronic symbol may include a visual representation alerting the user that the electronic symbol is a modified note, such as by using a different color or other methods known to one of skill in the art.
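Steps 1316-1320 might be sketched as the drag handler below, which converts a vertical drag into a change of staff position and keeps the original pitch so that it can still be indicated; the symbol fields and the reuse of the treble_pitch mapping sketched earlier are assumptions for illustration:

```python
def shift_note(symbol, drag_dy, staff_line_spacing):
    """Sketch of steps 1316-1320: change a symbol's pitch based on a vertical drag.

    Half of the staff-line spacing is one diatonic step; dragging up (negative dy
    in screen coordinates) raises the pitch.
    """
    steps = round(-drag_dy / (staff_line_spacing / 2.0))
    symbol.setdefault("original_pitch", symbol["pitch"])   # keep the original note (step 1318)
    symbol["staff_step"] += steps
    symbol["pitch"] = treble_pitch(symbol["staff_step"])   # mapping sketched above
    symbol["modified"] = True                               # step 1320: draw differently, e.g. another color
    return symbol
```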
The device 102 may generate (1716) a first series of electronic symbols based on notes and note values and may compare (1718) the first series of electronic symbols to electronic symbols associated with the sheet music. The device 102 may then identify (1720) a second series of electronic symbols associated with the sheet music that match the first series of electronic symbols. The device 102 may determine (1722) musical symbols corresponding to the second series of electronic symbols and determine (1724) a location of the musical symbols. The device 102 may then display (1726) the sheet music at the location.
For example, the device 102 may identify location(s) in the sheet music having a particular sequence of musical symbols or notes based on the first series of electronic symbols (generated based on input from the user) and electronic symbols associated with the sheet music. In a first example, the device 102 displays image(s) of sheet music and the device 102 compares the first series of electronic symbols to electronic symbols previously generated that correspond to musical symbols included in the sheet music. In a second example, the device 102 displays electronic sheet music and the device 102 compares the first series of electronic symbols to electronic symbols included in the electronic sheet music.
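Steps 1716-1726 amount to a subsequence search over the pitches of the electronic symbols associated with the sheet music; a minimal sketch, ignoring note values and transposition, follows:

```python
def find_pitch_sequence(symbols, query_pitches):
    """Indices in the sheet music's electronic symbols where the queried pitches occur.

    'symbols' is the ordered list of electronic symbols associated with the sheet
    music; 'query_pitches' is the first series generated from the user's input.
    """
    pitches = [s["pitch"] for s in symbols]
    n, m = len(pitches), len(query_pitches)
    return [i for i in range(n - m + 1) if pitches[i:i + m] == list(query_pitches)]

# A matched index can then be mapped back to a measure or page (step 1724) so the
# device can display the sheet music at that location (step 1726).
```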
The device 102 may detect (1810) an input using a microphone. The device 102 may determine (1812) notes associated with the input and optionally determine (1814) note values associated with the input. In a first example, the device 102 may determine the note values to limit a number of potential matches within the sheet music. In a second example, the device 102 may exclude the note values to include potential matches despite the input being slightly incorrect. In a third example, the device 102 may determine a range of note values to compensate for variations in the input.
The device 102 may generate (1816) a first series of electronic symbols based on the notes and note values. The device 102 may compare (1818) the first series of electronic symbols to electronic symbols associated with the sheet music and may identify (1820) a second series of electronic symbols associated with the sheet music matching the first series of electronic symbols. The device 102 may determine (1822) a location of the second series of electronic symbols and display (1824) the sheet music at the location.
In a first example, the device 102 may display image(s) of sheet music and may change an image displayed by the device 102 based on the location of the second series of electronic symbols. Thus, a user playing an instrument may see a corresponding page of the sheet music displayed based on the notes being played by the user. In a second example, the device 102 may display electronic sheet music and may update the display of measures based on the location of the second series of electronic symbols. Thus, a user playing an instrument may see a corresponding measure of the sheet music displayed based on the notes being played by the user, or the user may see a continuous advancement of the electronic sheet music in time with the notes being played by the user.
In some embodiments, the device 102 may update the location being displayed based on timing included in the sheet music. For example, if the user is playing an instrument having a rest, the device 102 may continue to update the location based on timing associated with the rest or subsequent measures. In other embodiments, the device 102 may update the location being displayed based on a speed of the input. For example, if the user is playing faster than the sheet music indicates, the device 102 may continue to update the location based on the speed of the user. The disclosure is not limited thereto and the device 102 may update the location based on any methods known to one of skill in the art to assist the user in viewing a corresponding portion of the sheet music/electronic sheet music.
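Steps 1810-1824 might be sketched as the follow-along loop below, which matches the last few detected pitches against the sheet music and jumps to the most recent match; detect_pitch and display_location are hypothetical placeholders (a real implementation might use autocorrelation or a pitch-tracking library for detection), and find_pitch_sequence is the matching sketch above:

```python
def follow_performance(symbols, audio_frames, window=4):
    """Sketch of steps 1810-1824: track the player's location as pitches are detected."""
    heard = []                                     # pitches detected so far (steps 1810-1812)
    location = 0
    for frame in audio_frames:
        pitch = detect_pitch(frame)                # hypothetical pitch detector
        if pitch is None:                          # no stable pitch in this frame
            continue
        heard.append(pitch)
        query = heard[-window:]                    # compare the last few notes (steps 1816-1818)
        matches = find_pitch_sequence(symbols, query)
        if matches:
            location = matches[-1] + len(query)    # most recent match (steps 1820-1822)
            display_location(location)             # hypothetical: step 1824, turn/scroll the page
    return location
```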
While the disclosure has illustrated examples using modern staff notation, the disclosure is not limited thereto. Instead, the device 102 may display any musical notation, such as percussion notation, figured bass notation, lead sheet notation, chord chart notation, shape note notation, tablature or the like.
The computing device 102 may include one or more microcontrollers/controllers/processors 2104 that may each include a central processing unit (CPU) for processing data and computer-readable instructions, and a memory 2106 for storing data and instructions. The memory 2106 may include volatile random access memory (RAM), non-volatile read only memory (ROM), non-volatile magnetoresistive (MRAM) and/or other types of memory. The computing device 102 may also include a data storage component 2108, for storing data and microcontroller/controller/processor-executable instructions (e.g., instructions to perform one or more steps of the methods described above).
Computer instructions for operating the computing device 102 and its various components may be executed by the microcontroller(s)/controller(s)/processor(s) 2104, using the memory 2106 as temporary “working” storage at runtime. The computer instructions may be stored in a non-transitory manner in non-volatile memory 2106, storage 2108, or an external device. Alternatively, some or all of the executable instructions may be embedded in hardware or firmware in addition to or instead of software.
The computing device 102 includes input/output device interfaces 2110. A variety of components may be connected through the input/output device interfaces 2110, such as the display or display screen 104 having a touch surface or touchscreen; an audio output device for producing sound, such as speaker(s) 2112; one or more audio capture device(s), such as a microphone or an array of microphones 2114; one or more image and/or video capture devices, such as camera(s) 2116; one or more haptic units 2118; and other components. The display 104, speaker(s) 2112, microphone(s) 2114, camera(s) 2116, haptic unit(s) 2118, and other components may be integrated into the computing device 102 or may be separate.
The display 104 may be a video output device for displaying images. The display 104 may be a display of any suitable technology, such as a liquid crystal display, an organic light emitting diode display, electronic paper, an electrochromic display, a cathode ray tube display, a pico projector or other suitable component(s). The display 104 may also be implemented as a touchscreen and may include components such as electrodes and/or antennae for use in detecting stylus input events or detecting when a stylus is hovering above, but not touching, the display 104, as described above.
The input/output device interfaces 2110 may also include an interface for an external peripheral device connection such as universal serial bus (USB), FireWire, Thunderbolt, Ethernet port or other connection protocol that may connect to networks 2020. The input/output device interfaces 2110 may also include a connection to antenna 2122 to connect to one or more networks 2020 via a wireless local area network (WLAN) (such as WiFi) radio, Bluetooth, and/or wireless network radio, such as a radio capable of communication with a wireless communication network such as a Long Term Evolution (LTE) network, WiMAX network, 3G network, etc. The stylus 106 may connect to the computing device 102 via one of these connections. The touchscreen of the display 104 and the stylus 106 may also communicate data or operating information to one another to enable the computing device 102 to determine a position of the stylus 106 relative to the touchscreen. The stylus 106 may also communicate to the device 102 (either through the display 104 or otherwise) information about the stylus, such as a stylus identifier, user identifier, or other information. Additionally, in some embodiments, the computing device 102 (for example, the touchscreen) and the stylus 106 may communicate using electromagnetic communications (for example, electric fields generated by each device to transmit data on a carrier frequency), and/or haptic communications.
The computing device 102 further includes a sheet music module 2124 and an audio module 2126 that may perform the steps described above.
The above embodiments of the present disclosure are meant to be illustrative. They were chosen to explain the principles and application of the disclosure and are not intended to be exhaustive or to limit the disclosure. Many modifications and variations of the disclosed embodiments may be apparent to those of skill in the art. Persons having ordinary skill in the field of computers and/or digital imaging should recognize that components and process steps described herein may be interchangeable with other components or steps, or combinations of components or steps, and still achieve the benefits and advantages of the present disclosure. Moreover, it should be apparent to one skilled in the art, that the disclosure may be practiced without some or all of the specific details and steps disclosed herein.
The concepts disclosed herein may be applied within a number of different devices and computer systems, including, for example, general-purpose computing systems, televisions, stereos, radios, server-client computing systems, mainframe computing systems, telephone computing systems, laptop computers, cellular phones, personal digital assistants (PDAs), tablet computers, wearable computing devices (watches, glasses, etc.), other mobile devices, etc. that can operate with a touchscreen.
Embodiments of the disclosed system may be implemented as a computer method or as an article of manufacture such as a memory device or non-transitory computer readable storage medium. The computer readable storage medium may be readable by a computer and may comprise instructions for causing a computer or other device to perform processes described in the present disclosure. The computer readable storage medium may be implemented by a volatile computer memory, non-volatile computer memory, hard drive, solid-state memory, flash drive, removable disk and/or other media.
Embodiments of the present disclosure may be performed in different forms of software, firmware, and/or hardware. Further, the teachings of the disclosure may be performed by an application specific integrated circuit (ASIC), field programmable gate array (FPGA), or other component, for example.
As used in this disclosure, the term “a” or “one” may include one or more items unless specifically stated otherwise. Further, the phrase “based on” is intended to mean “based at least in part on” unless specifically stated otherwise.