An information processing device that includes a display, a touch panel that detects a gesture operation, a memory that stores a correlation between each of a plurality of effects that can be applied to an object displayed on the display and one of a plurality of gesture operations, and a processor that applies one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display, and controls the display to display the object to which the one of the plurality of effects is applied.
|
1. An information processing device comprising:
a display;
a touch panel that detects a gesture operation;
a memory that stores a correlation between each of a plurality of effects that can be applied to an object displayed on the display and one of a plurality of gesture operations; and
a processor that applies one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display, and controls the display to display the object to which the one of the plurality of effects is applied, wherein
the memory stores a correlation between an effect of copying, consecutively pasting and changing a case of a character and a gesture operation defined by a movement of a touch input on the touch panel in the horizontal and vertical directions, and
the effect of copying, consecutively pasting and changing the case of the character includes
copying and consecutively pasting the character according to the movement of the touch input in the horizontal direction;
controlling the consecutively pasted characters to be capital case while the movement distance vertically upward is equal to or greater than a predetermined value; and
controlling the consecutively pasted characters to be lower case when the movement distance vertically downward is equal to or greater than a predetermined value.
18. A method performed by an information processing apparatus, the method comprising:
storing, in a memory of the information processing apparatus, a correlation between each of a plurality of effects applied to an object displayed on a display and one of a plurality of gesture operations;
detecting, by a touch panel of the information processing apparatus, a gesture operation;
applying, by a processor of the information processing apparatus, one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display; and
controlling, by the processor, the display to display the object to which the one of the plurality of effects is applied, wherein
the memory stores a correlation between an effect of copying, consecutively pasting and changing a case of a character and a gesture operation defined by a movement of a touch input on the touch panel in the horizontal and vertical directions, and
the effect of copying, consecutively pasting and changing the case of the character includes
copying and consecutively pasting the character according to the movement of the touch input in the horizontal direction;
controlling the consecutively pasted characters to be capital case while the movement distance vertically upward is equal to or greater than a predetermined value; and
controlling the consecutively pasted characters to be lower case when the movement distance vertically downward is equal to or greater than a predetermined value.
19. A non-transitory computer-readable medium including computer-program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising:
storing, in a memory, a correlation between each of a plurality of effects applied to an object displayed on a display and one of a plurality of gesture operations;
detecting a gesture operation at a touch panel of the information processing apparatus;
applying one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display; and
controlling the display to display the object to which the one of the plurality of effects is applied, wherein
the memory stores a correlation between an effect of copying, consecutively pasting and changing a case of a character and a gesture operation defined by a movement of a touch input on the touch panel in the horizontal and vertical directions, and
the effect of copying, consecutively pasting and changing the case of the character includes
copying and consecutively pasting the character according to the movement of the touch input in the horizontal direction;
controlling the consecutively pasted characters to be capital case while the movement distance vertically upward is equal to or greater than a predetermined value; and
controlling the consecutively pasted characters to be lower case when the movement distance vertically downward is equal to or greater than a predetermined value.
2. The information processing device of
3. The information processing device of
4. The information processing device of
5. The information processing device of
6. The information processing device of
7. The information processing device of
8. The information processing device of
9. The information processing device of
10. The information processing device of
11. The information processing device of
12. The information processing device of
13. The information processing device of
14. The information processing device of
15. The information processing device of
16. The information processing device of
17. The information processing device of
|
The present application claims the benefit of the earlier filing date of U.S. Provisional Patent Application Ser. No. 61/474,806 filed on Apr. 13, 2011, the entire contents of which are incorporated herein by reference.
1. Technical Field
The present disclosure relates to an information processing control device including a touch panel, whereby instructions for subjecting a selected object, such as a desired character string or the like, to desired decoration, editing, or the like can be given by the user's operations on the touch panel.
2. Description of Related Art
In recent years, information processing devices in which a touch panel including a transparent touch detection surface (touch screen) is disposed, for example, so as to cover generally the entire display screen have been commercialized. With this touch panel, the contact position, the number of contact points, the contact duration, the movement direction and movement speed of the finger or the like in a contact state, the movement path, and so forth can be detected at the time the user's finger or the like comes into contact with the touch detection surface. Note that, in the following description, an operation in which the user brings a finger or the like into contact with the touch detection surface of the touch panel, an operation of moving a finger or the like while in contact, and so forth will be referred to collectively as gesture operations.
With a conventional information processing device including such a touch panel, in the event that a desired character string displayed within the display screen is subjected to decoration, for example, such as changing its color, size, style, or the like, the user operates the information processing device in accordance with the following procedures.
First, the user operates the information processing device to activate, for example, a text editor (including HTML (Hyper Text Markup Language) editors and mailers), and then to specify a desired character string using range selection or the like. Here, the range selection is performed by a gesture operation or the like, for example, such that a finger or the like is slid in a state in contact with the touch panel corresponding to the display position of the desired character string. That is to say, upon detecting a slide gesture operation on the touch panel, the information processing device determines that the character string displayed on the position corresponding to the slide path on the touch panel by the gesture operation thereof has been selected by the user.
Next, the user operates the information processing device to specify the decoration content to be applied to the character string specified by the range selection or the like. Here, specification of the decoration content is performed by the user selecting a desired menu item or icon out of a list of multiple menu items corresponding to various types of decoration content, a list of multiple icons corresponding to various types of decoration content, or the like. Specifically, the information processing device in this case displays a list of menu items and icons on the display screen, and upon detecting a gesture operation in which the position on the touch panel corresponding to one of these display positions is touched by a finger or the like for a short period of time, determines that the decoration content corresponding to the menu item or icon displayed at the touched position has been selected by the user.
Thus, the information processing device subjects the desired character string specified by the range selection to decoration according to the decoration content specified by selection of the menu item or icon, and displays this on the display screen.
Incidentally, with the conventional information processing device including a touch panel, in the event of subjecting a desired character string within the display screen to decoration, the user has to specify, as described above, a desired decoration content by an operation such as selecting a desired menu item or the like from a list such as menu items corresponding to various decoration contents.
In the event of specifying a desired decoration content using such a list of menu items, the user has to visually confirm which decoration content can be specified by each menu item, confirm the display position of the menu item or the like corresponding to the desired decoration content, and accurately touch that display position. Specifically, the user has to perform various tasks, such as a task of confirming the decoration content, a task of confirming the display position of the menu item corresponding to that decoration content, and a task of accurately touching that display position, and accordingly, the burden on the user is large.
Accordingly, regarding an information processing device including a touch panel on a display screen, at the time of subjecting a selected object such as a desired character string displayed on the display screen to desired decoration, the inventor recognizes the necessity of greatly reducing the user's burden by reducing the user's tasks, and further the necessity of making improvements so that a character string or the like can be subjected to decoration in a more intuitive manner by taking advantage of the properties of a device such as a touch panel.
According to a first exemplary embodiment, the disclosure is directed to an information processing device that includes a display, a touch panel that detects a gesture operation, a memory that stores a correlation between each of a plurality of effects that can be applied to an object displayed on the display and one of a plurality of gesture operations, and a processor that applies one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display, and controls the display to display the object to which the one of the plurality of effects is applied.
According to another exemplary embodiment, the disclosure is directed to a method performed by an information processing apparatus. The method includes storing, in a memory of the information processing apparatus, a correlation between each of a plurality of effects applied to an object displayed on a display and one of a plurality of gesture operations, detecting, by a touch panel of the information processing apparatus, a gesture operation, applying, by a processor of the information processing apparatus, one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display, and controlling, by the processor, the display to display the object to which the one of the plurality of effects is applied.
According to another exemplary embodiment, the disclosure is directed to a non-transitory computer-readable medium including computer-program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method. The method includes storing a correlation between each of a plurality of effects applied to an object displayed on a display and one of a plurality of gesture operations, detecting a gesture operation at a touch panel of the information processing apparatus, applying one of the plurality of effects that corresponds to the detected gesture operation to an object displayed on the display, and controlling the display to display the object to which the one of the plurality of effects is applied.
Thus, according to an embodiment of the present invention, in the event of desiring to subject the user's desired selected object on the screen to signal processing such as the user's desired decoration or the like, it is sufficient for the user to perform the gesture operation correlated with the desired decoration, and accordingly, the user does not have to perform a conventionally troublesome task such as selection of a menu or icon, or the like. Thus, according to an embodiment of the present invention, the user can subject a desired object such as a character string or the like to desired decoration or the like with a very small task and a correspondingly small burden, equivalent to simply performing the gesture operation correlated with the desired decoration or the like. Also, according to an embodiment of the present invention, the properties of the device called a touch panel can be taken advantage of, and accordingly, the user can subject an object such as a character string or the like to decoration or the like through a more intuitive operation.
An embodiment of the present disclosure will be described below with reference to the appended drawings.
[General Configuration of Personal Digital Assistant]
In
A speaker 24 is a speaker provided to the personal digital assistant according to the present embodiment, and is used for playback of music, output of reception audio, output of ringer tone (ringtone), and so forth. A microphone 25 is used for collection of external audio, collection of transmission audio, and so forth. An audio signal processing unit 23 is configured of an amplifier circuit for the speaker 24, an amplifier circuit for the microphone 25, a decompression decoding circuit for subjecting compressed encoded audio data supplied from a control and computing unit 13 to decompression decoding, a digital/analog conversion circuit for converting this digital audio data after decompression decoding into an analog audio signal, an analog/digital conversion circuit for converting the analog audio signal input from the microphone 25 into digital audio data, a compression coding circuit for subjecting this digital audio data to compression coding, and so forth.
A video signal processing unit 20 is configured of a decompression decoding circuit for subjecting the compressed encoded video data supplied from the control and computing unit 13 to decompression decoding, a display panel driving circuit for displaying this digital video after decompression decoding, the digital broadcast video received at a digital broadcast reception module 17, and so forth on a display panel 21, and so forth. Also, in the event of the present embodiment, this video signal processing unit 20 also generates video signals for displaying the desktop image, various types of menu images, character input image, photo image, synthesized image, virtual key or virtual button image, or the like supplied from the control and computing unit 13, and displays these images on the display panel 21.
A key operating unit 28 is configured of hard keys provided onto the casing of the personal digital assistant according to the present embodiment, and peripheral circuits thereof. This key operating unit 28 converts operation input of a hard key by the user into an electrical signal, amplifies and subjects the operation input signal to analog/digital conversion, and transmits the operation input data after analog/digital conversion thereof to the control and computing unit 13.
An external memory I/F unit 18 is configured of a slot for external memory onto/from which external memory made up of a semiconductor storage medium or the like is mounted/detached, an interface circuit for communication of data with the external memory, and so forth. The information processing control device according to the present embodiment is configured so as to obtain various types of data and various application programs via a storage medium such as external memory inserted into this external memory I/F unit 18. Note that, with the personal digital assistant according to the present embodiment, various application programs to be obtained via this external memory include an information processing control program according to the present embodiment for this personal digital assistant executing information processing for subjecting a selected object such as the user's desired character string displayed on the display screen to desired decoration, which will be described later, and so forth.
An external input/output terminal unit 27 is configured of a connector for cable connection and an interface circuit for external data communication at the time of performing data communication via a cable, for example, or a charging terminal at the time of charging an internal battery via a power cable or the like and an interface circuit for that charging, or the like. The information processing control device according to the present embodiment is configured so as to obtain various types of data and various application programs from an external device connected to this external input/output terminal unit 27. Note that, with the present embodiment, various application programs to be obtained via this external input/output terminal unit 27 include the information processing control program according to the present embodiment, and so forth. Also, the information processing control program according to the present embodiment may be recorded in a disc-form recording medium, a recording medium other than this, or the like, for example. In this case, for example, according to a recording media playback device included in a personal computer or the like, the information processing control program read out from this recording medium may be supplied to the external input/output terminal unit 27. It goes without saying that an arrangement may be made wherein the recording media playback device is directly connected to the external input/output terminal unit 27, and the information processing control program read out at the playback device thereof is supplied to the personal digital assistant according to the present embodiment.
A short-distance radio communication module 16 is configured of a communication antenna for short-distance radio waves such as a wireless LAN, Bluetooth (registered trademark) or the like, and a short-distance radio communication circuit. Various application programs including the information processing control program according to the present embodiment may be obtained via this short-distance radio communication module 16.
The digital broadcast reception module 17 is configured of an antenna for receiving so-called digital television broadcasts, digital radio broadcasts, and so forth, and a tuner thereof. This digital broadcast reception module 17 is configured so as to receive not only a digital broadcast of one channel but also digital broadcasts of multiple channels at the same time. Also, this digital broadcast reception module 17 is configured so as to also receive data multiplexed into a digital broadcast. Note that an arrangement may be made wherein the digital broadcast data received at this digital broadcast reception module 17 is compressed by the control and computing unit 13, and then stored (i.e., recorded) in the memory unit 14 or the like. Also, various application programs including the information processing control program according to the present embodiment may be broadcast as one piece of this digital broadcast data. In this case, the information processing control program is extracted from the digital broadcast data received at the digital broadcast reception module 17, and taken into the personal digital assistant according to the present embodiment.
A non-contact communication module 19 performs non-contact communication used for so-called RFID (Radio Frequency Identification), a non-contact IC card, and so forth, by way of a non-contact communication antenna. Various application programs including the information processing control program according to the present embodiment may be obtained via this non-contact communication module 19.
A GPS (Global Positioning System) module 15 includes a GPS antenna, and uses the GPS signals from GPS geodetic satellites to obtain the latitude and longitude of the current position of the own terminal. The GPS data (information representing latitude and longitude) obtained by this GPS module 15 is transmitted to the control and computing unit 13. Thus, the control and computing unit 13 can know the current position, movement, and so forth of the own terminal.
A camera unit 22 is configured of an imaging device, an optical system, and so forth for taking a still image or moving image, and peripheral circuits thereof, a light driving circuit for emitting fill light for taking an image, and so forth. Still image data and moving image data at the time of taking an image by this camera unit 22 are transmitted to the video signal processing unit 20 as preview image data. Thus, a preview video is displayed on the display panel 21 at the time of this camera shooting. Also, in the event of recording the still image data or moving image data taken at the camera unit 22, this taken still image data or moving image data is transmitted to the control and computing unit 13, compressed, and then stored in the memory unit 14 or external memory connected to the external memory I/F unit 18.
Various sensor units 26 are configured of sensors for various detections, such as a terminal state detection sensor for detecting the state of the personal digital assistant according to the present embodiment, and peripheral circuits thereof. Examples of the various sensor units 26 include an inclination sensor, an acceleration sensor, an azimuth sensor, a temperature sensor, a humidity sensor, and an illuminance sensor. The detection signals from the various sensor units 26 are transmitted to the control and computing unit 13. Thus, the control and computing unit 13 can know the state of this personal digital assistant (inclination, acceleration, azimuth, temperature, humidity, illuminance, etc.).
A touch panel 30 is an input operation device having a detection surface on which operation input by the user can be detected, and is configured of a transparent touch sensor screen which is disposed over generally the entire screen of the display panel 21. A touch panel signal processing unit 29 calculates, based on the coordinate data supplied from the touch panel 30, the contact position at the time of the user's finger or the like coming into contact with the touch panel 30, the contact duration, the contact time interval, and the movement direction, movement speed, and movement path of the finger or the like in the contact state, and transmits the calculated data thereof to the control and computing unit 13 as touch panel detection data. Note that the touch panel 30 and touch panel signal processing unit 29 can handle so-called multi-touch, and are configured so as to detect not only the number of multiple contact points but also the contact duration, contact time interval, movement direction, movement speed, movement path, and so forth for each contact point.
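By way of illustration, the following is a rough sketch, in Python, of the kind of calculation the touch panel signal processing unit 29 may perform when deriving movement distance, movement direction, and movement speed from two timestamped contact samples; the function name, coordinate units, and sample-based interface are assumptions for illustration only and are not part of the embodiment.

    import math

    def movement_metrics(x0, y0, t0, x1, y1, t1):
        # Derive movement distance, direction, and speed from two timestamped
        # contact samples (coordinates in pixels, time in seconds).
        # Illustrative helper only; not the actual signal processing firmware.
        dx = x1 - x0
        dy = y1 - y0
        distance = math.hypot(dx, dy)
        direction = math.degrees(math.atan2(dy, dx))   # 0 degrees = rightward on the screen
        speed = distance / (t1 - t0) if t1 > t0 else 0.0
        return distance, direction, speed

    # Example: a finger moves 30 px right and 40 px up (y decreases upward) over 0.1 s.
    print(movement_metrics(100, 200, 0.0, 130, 160, 0.1))   # (50.0, -53.13..., 500.0)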
The memory unit 14 is configured of built-in memory provided to the inside of this terminal, a detachable card-shaped memory, and so forth. As for the detachable card-shaped memory, a card in which so-called SIM (Subscriber Identity Module) information and so forth are stored can be taken as an example. The built-in memory is made up of ROM (Read Only Memory) and RAM (Random Access Memory). The ROM stores various application programs including a text editor, HTML editor, mailer, image editor, and the information processing control program according to the present embodiment, and so forth in addition to an OS (Operating System), a control program for the control and computing unit 13 controlling each unit, various types of initial setting values, dictionary data, character prediction conversion dictionary data, and various types of sound data. This ROM includes NAND-type flash memory or rewritable ROM such as EEPROM (Electrically Erasable Programmable Read-Only Memory), whereby e-mail data, phone book and mail-address book data, still image and moving image content data, and additionally various types of the user's setting values can be stored. The RAM stores data according to need as a work region or buffer region when the control and computing unit 13 performs various types of data processing.
The control and computing unit 13 is configured of a CPU (Central Processing Unit), and controls each unit such as the transmission/reception circuit unit 12, video signal processing unit 20, audio signal processing unit 23, GPS module 15, non-contact communication module 19, short distance radio communication module 16, digital broadcast reception module 17, external memory I/F unit 18, camera unit 22, various sensor units 26, external input/output terminal unit 27, key operating unit 28, touch panel signal processing unit 29, and so forth, and performs various types of computation according to need. Also, the control and computing unit 13 executes various application programs including a control program stored in the memory unit 14, the text editor, HTML editor, mailer, image editor, and the information processing control program according to the present embodiment. Also, with the present embodiment, the control and computing unit 13 executes the information processing control program according to the present embodiment, thereby serving as a decoration editing control unit for executing the user's desired decoration or editing or the like as to the user's desired selected object out of character strings and images displayed on the screen of the display 21 while cooperating with an editing program, for example, such as the text editor, HTML editor, mailer, image editor, or the like in response to the user's operation as to the touch panel 30, which will be described later. Note that the flow of operation control of this personal digital assistant when the control and computing unit 13, i.e., the information processing control program according to the present embodiment executes decoration, editing, or the like as to the selected object while cooperating with the editing program will be described later.
Additionally, the personal digital assistant according to the present embodiment naturally also includes components provided to a common personal digital assistant, such as a clock unit configured to measure time and point-in-time, and a power management IC configured to control a battery that supplies power to each unit.
[Decoration Editing Control Operation as to Selected Object According to Gesture Operation]
With the personal digital assistant according to the present embodiment, the control and computing unit 13 executes the information processing control program stored in the memory unit 14, thereby enabling decoration or the like according to the user's operation on the touch panel to be applied to the user's desired selected object on the display screen in cooperation with an editing program.
Specifically, the control and computing unit 13 of the personal digital assistant executes the information processing control program according to the present embodiment, thereby serving as a correlation table control unit which generates or stores a later-described correlation table in which multiple information processes (e.g., decoration processes and the like) that can be applied to an object displayed on the screen of the display panel 21 are each correlated with one of multiple gesture operations. Also, the control and computing unit 13 executes the information processing control program according to the present embodiment, thereby serving as a detection area control unit which, at the time of a desired object being selected by the user on the screen of the display panel 21, sets the multiple gesture operations correlated by the correlation table with the information processes that can be applied to the selected object as gesture operations that can be detected on a specified detection area on the touch screen surface of the touch panel 30. Further, the control and computing unit 13 executes the information processing control program according to the present embodiment, thereby serving as a processing control unit which, at the time of any of the multiple gesture operations being detected on the specified detection area, subjects the selected object to the information process correlated with the detected gesture operation, and displays the result on the screen.
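A minimal sketch of how such a correlation table could be represented and consulted is given below; the gesture identifiers and the dictionary-of-callables layout are assumptions made for illustration, while the HTML tags themselves are the ones named later in this description.

    # Hypothetical correlation table: gesture identifier -> information process (effect).
    correlation_table = {
        "pinch_in_vertical":  lambda s: '<font size="-1">' + s + '</font>',   # reduce characters
        "pinch_out_vertical": lambda s: '<font size="+1">' + s + '</font>',   # enlarge characters
        "drag_to_left_edge":  lambda s: '<div align="left">' + s + '</div>',  # left justification
        "double_click":       lambda s: '<blink>' + s + '</blink>',           # blinking display
    }

    def apply_effect(detected_gesture, selected_object):
        # Look up the effect correlated with the detected gesture and apply it
        # to the selected object (here, a character string decorated with HTML tags).
        effect = correlation_table.get(detected_gesture)
        return effect(selected_object) if effect is not None else selected_object

    print(apply_effect("drag_to_left_edge", "ABC"))   # <div align="left">ABC</div>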
[Decoration Editing Control Operation Example as to Selected Object at Time of Sentence Creation and Editing]
Hereafter, description will be made regarding the operation of the personal digital assistant according to the present embodiment at the time of taking a character or character string that the user has selected out of character strings displayed on the display as the selected object, and subjecting the selected object thereof to the user's desired decoration in the event that the operation mode of the personal digital assistant is in the sentence creation and editing mode, with reference to
In the event that the personal digital assistant according to the present embodiment is in the sentence creation and editing mode, as shown in
At this time, the personal digital assistant according to the present embodiment recognizes each character of a character string 52 of which the range selection has been performed by this slide gesture operation (hereafter, referred to as selected character string 52 as appropriate) as the selected object, and also changes the operation mode of the own terminal to the object decoration editing mode according to the present embodiment in the sentence creation and editing mode.
The personal digital assistant which has changed to the object decoration editing mode sets the area of the selected character string 52 selected by the slide gesture operation and an adjacent area thereof as a specified area. Note that, of this specified area, the adjacent area may be an area determined beforehand with the area of the selected character string 52 as the center, an area at a position and with a size determined by this terminal, or an area at a position and with a size arbitrarily set by the user. Also, the personal digital assistant at this time correlates a gesture operation as to the touch panel on the specified area with decoration content such as shown in the correlation table in
Specifically, the correlation table shown in
With this correlation table shown in
Thus, the sentence displayed within the character display area 51 as shown in
Note that the movement distance of the two fingers by the pinch-in operation in the vertical direction can be obtained by measuring the distance between the two contact points according to the two fingers on the touch panel before and after the pinch-in operation in the vertical direction, and calculating the difference between the distances measured before and after the pinch-in operation, for example. More specifically, the movement distance of the two fingers according to the pinch-in operation in the vertical direction can be obtained by subtracting the distance between the two contact points on the touch panel before the pinch-in operation in the vertical direction from the distance between the two contact points on the touch panel after the pinch-in operation in the vertical direction. Also, the font size after reduction of the selected character string 52 can be obtained by subtracting a value proportional to the movement distance of the two fingers according to the pinch-in operation in the vertical direction from the font size before reduction of each character of the selected character string 52. In the event that an HTML editor is taken as an example, the reduced display of the selected character string 52 can be performed by decorating the selected character string 52 using a <font> tag for specifying the font size after reduction.
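As a concrete, hedged sketch of this arithmetic (the proportionality factor and the numeric font size are assumptions; the same formula also covers the enlargement case described next, since the movement distance is negative for a pinch-in and positive for a pinch-out):

    def pinch_font_size(dist_before, dist_after, font_size_before, scale=0.05):
        # Movement distance of the two fingers: distance between the contact points
        # after the pinch minus the distance before it (negative for a pinch-in).
        movement = dist_after - dist_before
        # New font size: old size plus a value proportional to the movement distance.
        # The scale factor 0.05 is an illustrative assumption.
        return max(1, round(font_size_before + scale * movement))

    def decorate_font_size(text, size):
        # HTML-editor example from the description: wrap the selection in a <font> tag.
        # (A real <font size> attribute expects values 1-7; the number here is illustrative.)
        return '<font size="' + str(size) + '">' + text + '</font>'

    # Pinch-in in the vertical direction: contact points move from 200 px apart to 120 px apart.
    new_size = pinch_font_size(200, 120, 16)            # 16 + 0.05 * (-80) = 12
    print(decorate_font_size("selected string", new_size))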
With the correlation table in
Thus, the sentence displayed within the character display area 51 as shown in
Note that the movement distance of the two fingers by the pinch-out operation in the vertical direction can be obtained by measuring the distance between the two contact points according to the two fingers on the touch panel before and after the pinch-out operation in the vertical direction, and calculating the difference between the distances measured before and after the pinch-out operation, as an example. More specifically, the movement distance of the two fingers according to the pinch-out operation in the vertical direction can be obtained by subtracting the distance between the two contact points on the touch panel before the pinch-out operation in the vertical direction from the distance between the two contact points on the touch panel after the pinch-out operation in the vertical direction. Also, the font size after enlargement of the selected character string 52 can be obtained by adding a value proportional to the movement distance of the two fingers according to the pinch-out operation in the vertical direction to the font size before enlargement of each character of the selected character string 52. In the event that an HTML editor is taken as an example, the enlarged display of the selected character string 52 can be performed by decorating the selected character string 52 using a <font> tag for specifying the font size after enlargement.
With the correlation table shown in
Also, at this time, in the event that other character strings or the like are displayed on the left side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts a line break after a character displayed the nearest to the left side of the selected character string 52 (i.e., before this selected character string) to move the selected character string 52 to the next row, and then justifies the selected character string 52 to the left edge within the character display area 51. Also, in the event that other characters and so forth are displayed on the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts a line break before a character displayed the nearest to the right side of the selected character string 52 (i.e., after this selected character string) to move the character strings and so forth displayed on the right side of the selected character string 52 to the next row.
Thus, the sentence displayed within the character display area 51 as shown in
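A rough sketch of this left-justification step, using the <div align="left"> tag named later in the flowchart, is shown below; the explicit <br> line breaks and the three-argument interface are assumptions for illustration only.

    def left_justify_selection(before, selected, after):
        # Left-justify the selected string on its own row. 'before' and 'after' are
        # the characters adjacent to the selection, if any; line breaks are inserted
        # so the selection is moved to its own row, as described above. Sketch only.
        parts = []
        if before:
            parts.append(before + "<br>")                      # break after the preceding text
        parts.append('<div align="left">' + selected + '</div>')
        if after:
            parts.append("<br>" + after)                       # break before the following text
        return "".join(parts)

    print(left_justify_selection("abc", "SELECTED", "def"))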
With the correlation table in
Note that, at this time as well, in the same way as with the case of the drag operation to the left edge, in the event that other characters and so forth are displayed on the left side as to the selected character string 52, or in the event that other characters and so forth are displayed on the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts, for example, a line break after a character displayed the nearest to the left side of the selected character string 52 (i.e., before this selected character string) or before a character displayed the nearest to the right side of the selected character string 52 (i.e., after this selected character string).
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Note that, at this time as well, in the same way as with the case of the drag operation to the left edge or right edge, in the event that other characters and so forth are displayed on the left side as to the selected character string 52, or in the event that other characters and so forth are displayed on the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts, for example, a line break after a character displayed the nearest to the left side of the selected character string 52 (i.e., before this selected character string) or before a character displayed the nearest to the right side of the selected character string 52 (i.e., after this selected character string).
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Note that, at this time as well, in the same way as with the case of the drag operation to the left edge, in the event that other characters and so forth are displayed on the left side or the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts, for example, a line break after a character displayed the nearest to the left side of the selected character string 52 or before a character displayed the nearest to the right side of the selected character string 52.
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Note that, at this time as well, in the same way as with the case of the flick operation to the left direction, in the event that other characters and so forth are displayed on the left side or the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts, for example, a line break after a character displayed the nearest to the left side of the selected character string 52 or before a character displayed the nearest to the right side of the selected character string 52.
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Note that, at this time as well, in the same way as with the case of the flick operation to the left direction or the right direction, in the event that other characters and so forth are displayed on the left side or the right side as to the selected character string 52, the personal digital assistant according to the present embodiment inserts, for example, a line break after a character displayed the nearest to the left side of the selected character string 52 or before a character displayed the nearest to the right side of the selected character string 52.
Thus, the sentence displayed within the character display area 51 as shown in
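The three flick-based scrolling decorations can be summarized by the following sketch, which uses the <marquee> tags named in the HTML-editor examples of the flowchart below; the gesture identifiers are assumptions for illustration.

    def scroll_decoration(text, flick):
        # Map the detected flick gesture to the corresponding scrolling decoration.
        if flick == "flick_left":
            return '<marquee direction="left">' + text + '</marquee>'      # left scroll
        if flick == "flick_right":
            return '<marquee direction="right">' + text + '</marquee>'     # right scroll
        if flick == "z_shaped_flick_horizontal":
            return '<marquee behavior="alternate">' + text + '</marquee>'  # both-way scroll
        return text

    print(scroll_decoration("selected string", "z_shaped_flick_horizontal"))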
With the correlation table in
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Thus, the sentence displayed within the character display area 51 as shown in
Also, according to “PINCH-IN OPERATION IN HORIZONTAL DIRECTION”, the personal digital assistant according to the present embodiment can also change each character of the selected character string to a thin character. The pinch-in operation in the horizontal direction is taken as a gesture operation such that, in a state in which two fingers are in contact with the touch panel generally in parallel with the alignment direction of the characters within the character string within the character display area 51, the two fingers are moved so that one finger approaches from the right direction of the display screen 50 and the other finger approaches from the left direction, thereby moving the two fingers in a direction where the interval between the two fingers is narrowed. In the event that the pinch-in operation in the horizontal direction has been detected on the specified area after transition has been made to the object decoration editing mode, the personal digital assistant according to the present embodiment changes each character of the selected character string to a thin character.
With the correlation table in
Thus, the sentence displayed within the character display area 51 as shown in
With the correlation table in
Thus, the sentence displayed within the character display area 51 as shown in
Now, with the present embodiment, hue according to the rotation angle, and saturation according to the movement distance of the two fingers can be obtained using a circular region 60 representing hue and saturation in so-called HSV (Hue Saturation Value) space shown in
Therefore, at the time of detecting that the rotating operation has been performed by multi-touch on the specified area, the personal digital assistant according to the present embodiment sets, as shown in
Then, the personal digital assistant according to the present embodiment calculates, as shown in
Also, in the event that the pinch-in operation or pinch-out operation has been detected along with the detection of the rotating operation, the personal digital assistant according to the present embodiment calculates, as shown in
Also, the finger serving as the rotation center may not necessarily be fixed to one location, and even in the event that the two fingers are mutually moved and relatively rotated while the distance between the two fingers is changed, the hue and saturation can be obtained. In this case, when the two fingers come into contact with the touch panel by the multi-touch, of the two contact points by the two fingers, the contact point of which the coordinate value in the Y-axial direction is smaller is taken as the center of the circular region, and then, when the two fingers are relatively rotated, the hue can be obtained based on the angle made up of the line segment connecting the two contact points and the Y axis, and also the saturation can be obtained based on the change in the length of the line segment connecting the two contact points.
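The following is a hedged sketch of this hue and saturation calculation; the mapping constants (a full turn mapped to the full hue circle, and a saturation offset proportional to the change in finger distance) are assumptions for illustration.

    import colorsys
    import math

    def color_from_rotation(center, before, after, dist_before, dist_after):
        # Hue from the rotation angle of one contact point around the other,
        # saturation from the change in distance between the two contact points.
        a0 = math.atan2(before[1] - center[1], before[0] - center[0])
        a1 = math.atan2(after[1] - center[1], after[0] - center[0])
        hue = ((a1 - a0) % (2 * math.pi)) / (2 * math.pi)               # 0..1 around the circle
        saturation = min(1.0, max(0.0, 0.5 + (dist_after - dist_before) / 200.0))
        r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)             # value fixed at 1.0
        return round(r * 255), round(g * 255), round(b * 255)

    # One finger fixed at (100, 100); the other rotates a quarter turn while the
    # finger distance grows from 60 px to 90 px (a pinch-out during the rotation).
    print(color_from_rotation((100, 100), (160, 100), (100, 160), 60, 90))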
With the correlation table in
With the present embodiment, at the time of the object decoration editing mode, in addition to various types of decoration based on the correlation table in
The editing or the like based on this correlation table in
The correlation table in
Also, the personal digital assistant according to the present embodiment is configured, as shown in the correlation table in
Hereafter, the correlation table in
In the event that the character type of the selected character string 52 is characters or symbols, “LATERAL DIRECTION (POSITIVE) OPERATION” of the correlation table shown in
Thus, the sentence displayed within the character display area 51 of the display screen 50 as shown in
In the event that the character type of the selected character string is characters or symbols, “LATERAL DIRECTION (NEGATIVE) OPERATION” of the correlation table shown in
In the event that the character type of the selected character string is alphabetic, “VERTICAL DIRECTION OPERATION” of the correlation table shown in
In the event that the character type of the selected character string is characters or symbols, “UPPER-RIGHT DIRECTION OPERATION” of the correlation table shown in
In the event that the character type of the selected character string is symbolic, “VERTICAL DIRECTION OPERATION” of the correlation table shown in
In the event that the character type of the selected character string is alphabetic, “UPPER-RIGHT DIRECTION OPERATION” of the correlation table shown in
In the event that the character type of the selected character string is numeric, “UPPER-RIGHT DIRECTION OPERATION” of the correlation table shown in
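A minimal sketch of the consecutive-paste-and-case-change effect recited in the claims and described for the alphabetic case above is shown below; the character width used to convert horizontal movement into a number of copies and the predetermined vertical threshold are assumptions for illustration.

    def consecutive_paste(selected, dx, dy, char_width=20, threshold=40):
        # Copy and consecutively paste the selection according to the horizontal
        # movement dx, and change its case according to the vertical movement dy
        # (screen coordinates, y increasing downward). Thresholds are illustrative.
        copies = max(1, 1 + int(dx / char_width))      # farther rightward movement -> more copies
        pasted = selected * copies
        if dy <= -threshold:                           # moved upward by at least the threshold
            pasted = pasted.upper()                    # capital case
        elif dy >= threshold:                          # moved downward by at least the threshold
            pasted = pasted.lower()                    # lower case
        return pasted

    print(consecutive_paste("Ab", dx=85, dy=-50))      # 'ABABABABAB'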
[Processing Flow at Time of Execution of Character Decoration and Editing by Information Processing Control Program According to Present Embodiment]
Hereafter, description will be made regarding a flowchart at the time of the personal digital assistant according to the present embodiment executing the information processing control program to perform processing such as decoration of characters, consecutive input editing, or the like as described above. Note that the information processing control program according to the present embodiment may be prepared at the time of shipping of the personal digital assistant from the factory, or may be separately acquired via the aforementioned wireless communication, external input/output terminal, or various types of storage media such as external memory, disc-form storage media, or the like.
With the flowchart in
In the event that the sentence creation and editing mode has been set, as step S2 upon detecting that a user's desired character string within a sentence on the character display area 51 of the display screen 50 has been selected as shown in the above-described
Upon proceeding to the processing in step S3, the control and computing unit 13 changes the operation mode of this terminal to the object decoration editing mode in the sentence creation and editing mode.
Upon proceeding to the object decoration editing mode, the control and computing unit 13 sets the selected character string 52 according to the range selection, and an adjacent area thereof on the touch panel 30 as the above-described specified area. The control and computing unit 13 then sets the specified area on the touch panel 30 as an area for detecting each gesture operation described in the correlation table in the above-described
Next, upon detecting a gesture operation on the specified area, as processing in step S5 the control and computing unit 13 determines whether or not the gesture operation thereof is the pinch-in operation in the vertical direction set to the correlation table in the above-described
Upon proceeding to step S6 after determination is made in step S5 that the gesture operation is the pinch-in operation in the vertical direction, the control and computing unit 13 subtracts distance between two contact points on the touch panel by two fingers before this pinch-in operation in the vertical direction from distance between the two contact points on the touch panel by the two fingers after this pinch-in operation in the vertical direction, thereby calculating the movement distance of the two fingers by this pinch-in operation in the vertical direction.
Next, the control and computing unit 13 advances the processing to step S7, and subtracts a value proportional to the movement distance of the two fingers by the pinch-in operation in the vertical direction from the font size before reduction of each character of the selected character string 52, thereby calculating the font size after reduction of the selected character string.
As processing in step S8, the control and computing unit 13 inputs the font size after reduction of the selected character string 52 to the editor. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <font> which specifies the font size after reduction.
Subsequently, as processing in step S9 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 having the font size after reduction on the screen of the display 21 as shown in the above-described
After the processing in step S9, the control and computing unit 13 returns the processing to step S4.
Also, upon proceeding to processing in step S11 in
Upon proceeding to step S12 after determination is made that the detected gesture is the pinch-out operation in the vertical direction, the control and computing unit 13 subtracts distance between two contact points on the touch panel by two fingers before this pinch-out operation in the vertical direction from distance between the two contact points on the touch panel by the two fingers after this pinch-out operation in the vertical direction, thereby calculating the movement distance of the two fingers by this pinch-out operation in the vertical direction.
Next, the control and computing unit 13 advances the processing to step S13, and adds a value proportional to the movement distance of the two fingers by the pinch-out operation in the vertical direction to the font size before enlargement of each character of the selected character string 52, thereby calculating the font size after enlargement of the selected character string 52.
As processing in step S14, the control and computing unit 13 inputs the font size after enlargement of the selected character string 52 to the editor. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <font> which specifies the font size after enlargement.
Subsequently, as processing in step S15 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 having the font size after enlargement on the screen of the display 21 as shown in the above-described
After the processing in step S15, the control and computing unit 13 returns the processing to step S4 in
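The branches that follow all repeat the same pattern: detect a gesture on the specified area, apply the correlated decoration through the editor, display the result, and return to step S4. A compact sketch of that dispatch structure is given below; the list-driven loop and handler names are illustrative stand-ins for the event-driven control flow of the control and computing unit 13.

    def object_decoration_dispatch(detected_gestures, handlers, selected):
        # Stand-in for the step S4 loop: route each detected gesture to the handler
        # correlated with it, and collect the decorated string that would be passed
        # to the editor and displayed (steps S9, S15, S23, ...), then wait again.
        results = []
        for gesture in detected_gestures:
            handler = handlers.get(gesture)            # branch points: steps S5, S11, S21, ...
            if handler is not None:
                results.append(handler(selected))
        return results

    handlers = {
        "pinch_out_vertical": lambda s: '<font size="+1">' + s + '</font>',
        "drag_to_left_edge":  lambda s: '<div align="left">' + s + '</div>',
    }
    print(object_decoration_dispatch(["pinch_out_vertical", "drag_to_left_edge"], handlers, "ABC"))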
Also, upon proceeding to processing in step S21 in
Upon proceeding to step S22 after determination is made in step S21 that the gesture operation is the drag operation to the left edge, the control and computing unit 13 inputs it to the editor that this selected character string 52 is left-justified. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <div align=“left”>. Note that at the time of this left-justified display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 displayed with left justification, and the other characters, to the editor.
Subsequently, as processing in step S23 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the left-justified selected character string 52 on the screen of the display 21 as shown in the above-described
After the processing in step S23, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S31 in
Upon proceeding to step S32 after determination is made in step S31 that the gesture operation is the drag operation to the right edge, the control and computing unit 13 inputs it to the editor that this selected character string is right-justified. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <div align=“right”>. Note that at the time of this right-justified display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 displayed with right justification, and the other characters, to the editor.
Subsequently, as processing in step S33 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the right-justified selected character string 52 on the screen of the display 21 as shown in the above-described
After the processing in step S33, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S41 in
Upon proceeding to step S42 after determination is made in step S41 that the gesture operation is the drag operation to the center, the control and computing unit 13 inputs it to the editor that this selected character string 52 is subjected to centering. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <div align=“center”>. Note that at the time of this centering display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 displayed with centering, and the other characters, to the editor.
Subsequently, as processing in step S43 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 subjected to centering on the screen of the display 21 as shown in the above-described
After the processing in step S43, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S51 in
Upon proceeding to step S52 after determination is made in step S51 that the gesture operation is the flick operation to the left direction, the control and computing unit 13 inputs it to the editor that this selected character string 52 is left-scrolled. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <marquee direction=“left”>. Note that at the time of this left-scrolled display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 to be displayed with left scroll, and the other characters, to the editor, as described above.
Subsequently, as processing in step S53, the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 to be left-scrolled on the screen of the display 21 as shown in the above-described
After the processing in step S53, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S61 in
Upon proceeding to step S62 after determination is made in step S61 that the gesture operation is the flick operation to the right direction, the control and computing unit 13 inputs it to the editor that this selected character string 52 is right-scrolled. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <marquee direction=“right”>. Note that at the time of this right-scrolled display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 to be displayed with right scroll, and the other characters, to the editor.
Subsequently, as processing in step S63, the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 to be right-scrolled on the screen of the display 21 as shown in the above-described
After the processing in step S63, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S71 in
Upon proceeding to step S72 after determination is made in step S71 that the gesture operation is the Z-shaped flick operation in the horizontal direction, the control and computing unit 13 inputs it to the editor that this selected character string 52 is both-way-scrolled. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <marquee behavior="alternate">. Note that at the time of this both-way-scrolled display, in the event that other characters and so forth are displayed on the left side of the selected character string 52, or other characters and so forth are displayed on the right side of the selected character string 52, the control and computing unit 13 inputs a line break for dividing the row of the selected character string 52 to be displayed with both-way scroll, and the other characters, to the editor.
Subsequently, as processing in step S73 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 to be both-way-scrolled on the screen of the display 21 as shown in the above-described
After the processing in step S73, the control and computing unit 13 returns the processing to step S4 in
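By way of a non-authoritative illustration of the three scroll decorations described above, the following Python sketch wraps a selected character string in the corresponding marquee tag; the gesture identifiers and the helper name apply_marquee are assumptions introduced only for this example and do not appear in the embodiment.

```python
# Minimal sketch: wrap a selected string in an HTML <marquee> tag whose
# direction/behavior attribute follows the detected flick gesture.
# Gesture identifiers and the helper name are illustrative assumptions.

MARQUEE_ATTRS = {
    "flick_left": 'direction="left"',              # left-scroll display (step S52)
    "flick_right": 'direction="right"',            # right-scroll display (step S62)
    "z_flick_horizontal": 'behavior="alternate"',  # both-way scroll (step S72)
}

def apply_marquee(selected: str, gesture: str) -> str:
    """Return the selected character string decorated with a marquee tag."""
    attr = MARQUEE_ATTRS[gesture]
    return f"<marquee {attr}>{selected}</marquee>"

# A leftward flick on the selection "SALE" yields
# '<marquee direction="left">SALE</marquee>'.
print(apply_marquee("SALE", "flick_left"))
```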
Also, upon proceeding to processing in step S81 in
Upon proceeding to step S82 after determination is made in step S81 that the gesture operation is the double click operation, the control and computing unit 13 inputs it to the editor that this selected character string 52 is displayed with blinking. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <blink>.
Subsequently, as processing in step S83 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 to be displayed with blinking on the screen of the display 21 as shown in the above-described
After the processing in step S83, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S91 in
Upon proceeding to step S92 after determination is made in step S91 that the gesture operation is the pinch-out operation in the horizontal direction, the control and computing unit 13 inputs it to the editor that this selected character string 52 is changed to bold characters. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <b>.
Subsequently, as processing in step S93 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 displayed with bold characters on the screen of the display 21 as shown in the above-described
After the processing in step S93, the control and computing unit 13 returns the processing to step S4 in
Also, upon proceeding to processing in step S101 in
Upon proceeding to step S102 after determination is made in step S101 that the gesture operation is the parallel movement operation in the right direction by multi-touch, the control and computing unit 13 inputs it to the editor that this selected character string 52 is changed to italics. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <i>.
Subsequently, as processing in step S103 the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 displayed with italics on the screen of the display 21 as shown in the above-described
After the processing in step S103, the control and computing unit 13 returns the processing to step S4 in
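Similarly, the blinking, bold, and italic decorations of steps S82, S92, and S102 can be pictured as a simple tag lookup. The sketch below is illustrative only; the gesture identifiers and the decorate helper are assumed names, not part of the embodiment.

```python
# Minimal sketch of the remaining tag decorations described above.
TAG_FOR_GESTURE = {
    "double_click": "blink",          # blinking display (step S82)
    "pinch_out_horizontal": "b",      # bold characters (step S92)
    "multitouch_move_right": "i",     # italics (step S102)
}

def decorate(selected: str, gesture: str) -> str:
    """Wrap the selected character string in the tag correlated with the gesture."""
    tag = TAG_FOR_GESTURE[gesture]
    return f"<{tag}>{selected}</{tag}>"

print(decorate("NEW", "pinch_out_horizontal"))  # -> <b>NEW</b>
```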
Also, upon proceeding to processing in step S111 in
Upon proceeding to processing in step S112 after determination is made in step S111 that the gesture operation is the rotating operation by multi-touch, with a circular region 60 representing the hue and saturation of the HSV space as described above, the control and computing unit 13 changes the hue of the selected character string 52 according to a rotation angle at the time of rotating the other finger with the one finger serving as the rotation center, and calculates the R, G, and B values after change thereof.
Next, as processing in step S113 the control and computing unit 13 determines whether or not the pinch-in operation or pinch-out operation has been detected on the specified area, advances the processing to step S114 in the event that the pinch-in operation or pinch-out operation has been detected, and advances the processing to step S116 in the event that neither of these has been detected.
Upon proceeding to processing in step S114 after determination is made in step S113 that the pinch-in operation or pinch-out operation has been detected, the control and computing unit 13 subtracts, from distance between two contact points on the touch panel by the two fingers after this pinch-in operation or pinch-out operation, distance between two contact points on the touch panel by the two fingers before the pinch-in operation or pinch-out operation, thereby calculating the movement distance of the two fingers by this pinch-in operation or pinch-out operation.
Next, the control and computing unit 13 advances the processing to step S115, changes the saturation of the selected character string 52 according to the movement distance of the two fingers, and calculates the R, G, and B values after change thereof. After the processing in step S115, the control and computing unit 13 advances the processing to step S116.
Upon proceeding to step S116, the control and computing unit 13 inputs the R, G, and B values (character color) of the selected character string 52 calculated in step S112 and step S115 to the editor. As an example, in the event of an HTML editor, the control and computing unit 13 decorates the selected character string 52 using the tag of <font> wherein the character color after change is specified.
Subsequently, as processing in step S117, the control and computing unit 13 controls the video signal processing unit 20 via the editor to display the selected character string 52 after the character color change on the screen of the display 21 as shown in the above-described
After the processing in step S117, the control and computing unit 13 returns the processing to step S4 in
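The hue and saturation handling of steps S112 through S116 can be sketched as follows, assuming the standard colorsys module for the HSV-to-RGB conversion; the scaling constants and the clamping are illustrative assumptions rather than values taken from the embodiment.

```python
import colorsys

# Sketch of steps S112-S116: map a multi-touch rotation angle to hue and a
# pinch movement distance to saturation, then derive the R, G, and B values.
# The divisor constants and the clamping range are assumptions.

def character_color(rotation_deg: float, pinch_distance_px: float,
                    base_value: float = 1.0) -> tuple[int, int, int]:
    hue = (rotation_deg % 360.0) / 360.0                        # rotation angle -> hue
    saturation = max(0.0, min(1.0, 0.5 + pinch_distance_px / 400.0))  # pinch -> saturation
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, base_value)
    return round(r * 255), round(g * 255), round(b * 255)

# A 120-degree rotation with a 100-pixel pinch-out gives a fairly saturated green.
print(character_color(120.0, 100.0))
```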
Also, upon proceeding to processing in step S121 in
Upon proceeding to processing in step S122 after detecting the slide gesture operation in the right direction on the dedicated area 54 in step S121, the control and computing unit 13 changes the operation mode to the object decoration editing mode whereby various types of editing and so forth based on the correlation table in the above-described
Upon proceeding to the processing in step S123, the control and computing unit 13 calculates, at the time of a gesture operation wherein the user slides and moves a finger on the dedicated area 54, the movement distance of the finger from the starting point of the slide movement thereof, and advances the processing to the next step S124.
Upon proceeding to the processing in step S124, the control and computing unit 13 determines whether or not the movement distance of the slide movement of the finger on the dedicated area 54 has reached a certain value regarding the X-axial direction (lateral direction). In the event that determination is made in step S124 that the movement distance has not reached the certain value in the X-axial direction (lateral direction), the control and computing unit 13 advances the processing to step S133 in
Upon proceeding to the processing in step S125, the control and computing unit 13 determines whether or not the movement direction of the slide movement of the finger on the dedicated area 54 is a positive direction (right direction). In the event that determination is made in this step S125 that the movement direction is not the positive direction (right direction), the control and computing unit 13 advances the processing to step S141 in
Upon proceeding to the processing in step S126, the control and computing unit 13 determines whether or not the movement distance of the slide movement of the finger on the dedicated area 54 has reached a certain value regarding the Y-axial direction (longitudinal direction). In the event that determination is made in this step S126 that the movement distance has not reached the certain value in the Y-axial direction (longitudinal direction), the control and computing unit 13 advances the processing to step S131 in
Upon proceeding to the processing in step S131 in
As processing in step S132, the control and computing unit 13 resets the starting point to the current finger position, and then advances the processing to step S133.
Next, upon proceeding to the processing in step S133, the control and computing unit 13 determines whether or not the finger is separated from the dedicated area 54, and in the event that the finger is not separated, returns the processing to step S123 in
In this way, in the event that the processing proceeds from step S126 to step S131, determination is made in step S133 that the finger is not separated, the processing returns to step S123, determination is further made in step S124 that the movement distance in the X-axial direction has reached the certain value, determination is made in step S125 that the movement direction is the positive direction, and determination is made in step S126 that the movement distance in the Y-axial direction has not reached the certain value, the control and computing unit 13 further pastes a character string copied from the selected character string behind the previously copied and pasted character string.
That is to say, the processing in step S123, step S124, step S125, step S126, step S131, and step S132 is arranged to be repeatable. In other words, in the event that a gesture operation has been performed such that the finger is slid and moved in the X-axial direction and also in the right direction on the dedicated area 54, and the movement distance at this time is equal to or greater than the certain value in the X-axial direction, and also less than the certain value in the Y-axial direction, the control and computing unit 13 copies and consecutively pastes the selected character string 52 behind the selected character string 52 by a number according to the movement distance of the finger. Thus, as shown in the above-described
Also, in the event of determining in step S125 that the movement direction is not the positive direction (right direction), and proceeding to the processing in step S141 in
Subsequently, the control and computing unit 13 advances the processing to step S132 in
In this way, in the event that the processing proceeds from step S125 to step S141, determination is made in step S133 that the finger is not separated, the processing returns to step S123, determination is further made in step S124 that the movement distance in the X-axial direction has reached the certain value, determination is made in step S125 that the movement direction is not the positive direction (determined to be the negative direction), and the processing proceeds to step S141, the control and computing unit 13 further deletes the copied and pasted character string.
That is to say, the processing in step S123, step S124, step S125, step S141, and step S132 is arranged to be repeatable. In other words, in the event that a gesture operation has been performed such that the finger is slid and moved in the X-axial direction and also in the left direction on the dedicated area 54, and there are copied and consecutively input selected character strings 52 behind the selected character string 52, the control and computing unit 13 deletes the copied and consecutively input selected character strings 52 by a number according to the movement distance of the finger. Thus, the previously consecutively input selected character strings 52 are successively deleted within the character display area 51 of the display screen 50.
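A compact sketch of the repeatable loop described in the last two paragraphs follows; each step of the slide that reaches the certain value in the X-axial direction either appends one more copy of the selection (rightward movement) or deletes the most recently pasted copy (leftward movement). The threshold value and the function name are assumptions for illustration.

```python
# Sketch of the repeatable paste/delete loop (steps S123-S133 and S141).
STEP_PX = 40  # "certain value" in the X-axial direction (assumed)

def apply_slide(selection: str, pasted: list[str], dx: int) -> None:
    """Update the consecutively pasted copies according to one slide step."""
    if abs(dx) < STEP_PX:
        return                      # movement has not reached the certain value
    if dx > 0:
        pasted.append(selection)    # rightward slide: paste one more copy
    elif pasted:
        pasted.pop()                # leftward slide: delete the last pasted copy

copies: list[str] = []
for dx in (45, 50, 60, -55):        # three rightward steps, then one leftward step
    apply_slide("Congrats!", copies, dx)
print("".join(copies))              # -> "Congrats!Congrats!"
```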
Also, in the event of determining in step S126 that the movement distance has reached the certain value in the Y-axial direction, and proceeding to the processing in step S151 in
Next, as processing in step S152 the control and computing unit 13 pastes the character string or the like changed according to the character type or the movement distance and direction behind the selected character string 52 using the editor.
Subsequently, the control and computing unit 13 advances the processing to step S132 in
In this way, in the event that the processing proceeds from step S126 to step S151, determination is made in step S133 that the finger is not separated, the processing returns to step S123, determination is further made in step S124 that the movement distance in the X-axial direction has reached the certain value, determination is made in step S125 that the movement direction is the positive direction, and determination is made in step S126 that the movement distance in the Y-axial direction has reached the certain value, the control and computing unit 13 further pastes the character string or the like changed according to the character type, or the movement distance and direction, behind the previously changed and pasted character string or the like.
That is to say, the processing in step S123, step S124, step S125, step S126, step S151, step S152, and step S132 is arranged to be repeatable. In other words, in the event that a gesture operation has been performed such that the finger is slid and moved in the X-axial direction and also in the right direction on the dedicated area 54, and the movement distance at this time is equal to or greater than the certain value in the X-axial direction, and also equal to or greater than the certain value in the Y-axial direction, and further, the character type at this time and the gesture operation are set to the correlation table in
Note that, describing in a more specific manner with reference to the example in
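One concrete instance of the change according to the character type and the vertical movement can be sketched as below, where an upward travel that reaches the certain value makes the pasted copy upper case and a downward travel makes it lower case; the threshold values, the upward-positive sign convention, and the helper name are assumptions made only for this sketch.

```python
# Sketch of steps S151/S152 for one character type: when the vertical travel of
# the slide also reaches the certain value, the copy pasted for that step is
# changed, here by switching the character case with the Y direction.
X_STEP_PX = 40   # assumed "certain value" in the X-axial direction
Y_STEP_PX = 30   # assumed "certain value" in the Y-axial direction

def pasted_copy(selection: str, dx: int, dy: int) -> str | None:
    """Return the string pasted for one slide step; dy > 0 is treated as upward."""
    if dx < X_STEP_PX:
        return None                          # no paste for this step
    if abs(dy) < Y_STEP_PX:
        return selection                     # plain consecutive paste
    return selection.upper() if dy > 0 else selection.lower()

print(pasted_copy("abc", 50, 35))    # upward movement  -> "ABC"
print(pasted_copy("ABC", 50, -35))   # downward movement -> "abc"
```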
Also, in step S121 in
Upon proceeding to the processing in step S162, the control and computing unit 13 releases the range selection state of the selected character string 52, and also returns the operation mode of this terminal from the object decoration editing mode to the normal sentence creation and editing mode.
After the processing in step S162, the control and computing unit 13 returns the processing to step S2 in
[Decoration Editing Control Operation Example as to Selected Object at Time of Image Editing]
The personal digital assistant according to the present embodiment enables the user's desired selected object on the display screen to be subjected to decoration, editing, or the like according to the user's operations as to the touch panel, for example, by cooperating with an image editor.
Hereafter, description will be made regarding operation at the time of taking an image portion that the user has selected out of images displayed on the display as the selected object, and subjecting the selected object thereof to the user's desired decoration, editing, or the like in the event that the personal digital assistant according to the present embodiment is in the image editing mode by an image editor being activated, with reference to
In the event that the personal digital assistant according to the present embodiment is in the image editing mode, as shown in
With the present embodiment, in a state in which an image such as
At this time, the personal digital assistant according to the present embodiment takes an area surrounded by the movement path of the user's finger as a range selection area 72, and recognizes the image part 71 (hereafter referred to as selected image part 71 as appropriate) within the range selection area 72 as the selected object. Also, the personal digital assistant at this time changes the operation mode of this terminal to the object decoration editing mode according to the present embodiment in the image editing mode.
The personal digital assistant which has changed to the object decoration editing mode at the time of this image editing mode sets the area of the selected image part 71 and an adjacent area thereof as a specified area. Note that, of this specified area, the adjacent area may be a predetermined area having a fixed size with the area of the selected image part 71 as the center, an area in the place and with the size determined by this terminal, or an area in the place and with the size arbitrarily determined by the user. Also, the personal digital assistant at this time correlates a gesture operation as to the touch panel on the specified area with decoration content such as shown in the correlation table in
Specifically, the correlation table shown in
With this correlation table shown in
Note that the movement distance of the two fingers by the pinch-in operation can be obtained by subtracting the distance between the two contact points on the touch panel before the pinch-in operation from the distance between the two contact points on the touch panel after the pinch-in operation. Also, the reduction ratio of the selected image part 71 can be calculated as a value for decreasing the size of the original selected image part 71 by a percentage proportional to the movement distance of the two fingers according to the pinch-in operation.
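The same distance subtraction and proportional scaling apply to the pinch-out case described further below, so both can be sketched with one helper; the proportionality constant k is an illustrative assumption rather than a value taken from the embodiment.

```python
import math

# Sketch: the movement distance of the two fingers is the distance between the
# contact points after the pinch minus the distance before it; the scale factor
# changes in proportion to that distance.
def contact_distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def scale_factor(before: tuple, after: tuple, k: float = 0.002) -> float:
    movement = contact_distance(*after) - contact_distance(*before)
    return max(0.05, 1.0 + k * movement)   # <1 for a pinch-in, >1 for a pinch-out

# Two fingers move from 300 px apart to 200 px apart: the movement is -100 px,
# giving a reduction ratio of roughly 0.8.
print(scale_factor(((0, 0), (300, 0)), ((50, 0), (250, 0))))
```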
Subsequently, the personal digital assistant according to the present embodiment causes the image editor to reduce the size of the selected image part 71 using the reduction ratio, and to paste it on the image display area 70 so that the center position of the reduced selected image part 71a matches the center of the position where the original selected image part 71 was displayed. Thus, the image displayed within the image display area 70 as shown in
Note that in the event that the display image on the image display area 70 is an image actually taken by a digital camera, for example, instead of an image wherein a background image and each image object are configured as layers and synthesized, such as a synthetic image created by computer graphics software, at the time of pasting the reduced selected image part 71a on the image display area 70, it is desirable to interpolate the background image around the selected image part 71a. Specifically, the original selected image part 71 and the reduced selected image part 71a differ in size, and accordingly, upon the reduced selected image part 71a being pasted on the position where the original selected image part 71 is displayed without change, there may be generated an area including no image around the reduced selected image part 71a due to the size difference between these. Accordingly, at the time of pasting the reduced selected image part 71a on the image display area 70, in the event that an area including no image occurs around the reduced selected image part 71a, it is desirable to dispose an image interpolated based on a peripheral image regarding the area thereof.
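One possible way to perform such interpolation, assuming an environment where OpenCV is available, is to inpaint the uncovered ring from the surrounding pixels, as in the sketch below; the region coordinates, the uint8 BGR image format, and the choice of inpainting algorithm are assumptions, not part of the embodiment.

```python
import numpy as np
import cv2  # assumption: OpenCV is available; inpainting is one possible interpolation

def paste_with_interpolation(image: np.ndarray,
                             original_box: tuple[int, int, int, int],
                             reduced_part: np.ndarray) -> np.ndarray:
    """Paste the reduced part centered on the original position and fill the
    uncovered ring around it from the peripheral image (image is uint8 BGR)."""
    x, y, w, h = original_box
    out = image.copy()
    rh, rw = reduced_part.shape[:2]
    cx, cy = x + (w - rw) // 2, y + (h - rh) // 2      # center the reduced part

    # Mark the whole original area as missing, then keep the pasted reduced part.
    mask = np.zeros(image.shape[:2], dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255
    mask[cy:cy + rh, cx:cx + rw] = 0
    out[cy:cy + rh, cx:cx + rw] = reduced_part

    # Interpolate the ring that no longer contains any image.
    return cv2.inpaint(out, mask, 3, cv2.INPAINT_TELEA)
```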
Also, with this correlation table shown in
Note that the movement distance of the two fingers by the pinch-out operation can be obtained by subtracting the distance between the two contact points on the touch panel before the pinch-out operation from the distance between the two contact points on the touch panel after the pinch-out operation. Also, the scale of enlargement of the selected image part 71 can be calculated as a value for increasing the size of the original selected image part 71 by a percentage proportional to the movement distance of the two fingers according to the pinch-out operation.
Subsequently, the personal digital assistant according to the present embodiment causes the image editor to enlarge the size of the selected image part 71 using the scale of enlargement, and to paste it on the image display area 70 so that the center position of the enlarged selected image part 71b matches the center of the position where the original selected image part 71 was displayed. Thus, the image displayed within the image display area 70 as shown in
Also, with this correlation table shown in
Thus, the image displayed within the image display area 70 as shown in
Also, with this correlation table shown in
Thus, the image displayed within the image display area 70 as shown in
With the correlation table in
[Processing Flow at Time of Image Editing Execution by Information Processing Control Program According to Present Embodiment]
Hereafter, description will be made regarding a flowchart at the time of the personal digital assistant according to the present embodiment executing the information processing control program to perform processing such as decoration or editing or the like of a selected image part, as described above. The information processing control program according to the present embodiment may be prepared at the time of factory shipment of the personal digital assistant, or may be separately obtained via the radio communication, external input/output terminal, or various types of storage medium such as external memory, disc-form recording medium, and so forth.
With the flowchart in
In the event that the image editing mode has been set, as step S202 upon detecting that a user's desired area within an image on the image display area 70 has been selected as shown in the above-described
Upon proceeding to the processing in step S203, the control and computing unit 13 changes the operation mode of this terminal to the object decoration editing mode in the image editing mode.
Upon proceeding to the object decoration editing mode, the control and computing unit 13 sets the selected image part 71 according to the range selection, and an adjacent area thereof on the touch panel 30 as the above-described specified area. The control and computing unit 13 then sets the specified area on the touch panel 30 as an area for detecting each gesture operation described in the correlation table in the above-described
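The specified area itself can be pictured as the bounding box of the selected image part expanded by the adjacent margin, with gesture detection limited to touches falling inside it; the margin value and the Rect helper in the sketch below are assumptions made only for illustration.

```python
from dataclasses import dataclass

# Sketch of the "specified area" used from step S204 onward: the bounding box
# of the range-selected image part expanded by an adjacent margin, inside which
# the gesture operations of the correlation table are detected.
@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def expanded(self, margin: int) -> "Rect":
        return Rect(self.x - margin, self.y - margin,
                    self.w + 2 * margin, self.h + 2 * margin)

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

specified_area = Rect(120, 80, 200, 150).expanded(margin=32)
print(specified_area.contains(100, 90))  # a touch just outside the selection, inside the margin
```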
Next, upon detecting a gesture operation on the specified area, as processing in step S205 the control and computing unit 13 determines whether or not the gesture operation thereof is the pinch-in operation set to the correlation table in the above-described
Upon proceeding to step S206 after determination is made in step S205 that the gesture operation is the pinch-in operation, the control and computing unit 13 subtracts distance between two contact points on the touch panel by two fingers before this pinch-in operation from distance between the two contact points on the touch panel by the two fingers after this pinch-in operation, thereby calculating the movement distance of the two fingers by this pinch-in operation.
Next, the control and computing unit 13 advances the processing to step S207, and calculates a reduction ratio proportional to the movement distance of the two fingers by the pinch-in operation.
Further, as processing in step S208, the control and computing unit 13 generates a selected image part 71a obtained by reducing the selected image part 71 according to the reduction ratio through the image editor.
Subsequently, as processing in step S209 the control and computing unit 13 displays the selected image part 71a after the reduction on the image display area 70 through the image editor. Note that, at this time, in the event that interpolation of a background image and so forth is necessary as described above, the control and computing unit 13 also performs this interpolation processing.
After the processing in step S209, the control and computing unit 13 returns the processing to step S204.
Also, upon proceeding to processing in step S211 in
Upon proceeding to step S212 after determination is made in step S211 that the detected gesture is the pinch-out operation, the control and computing unit 13 subtracts distance between two contact points on the touch panel by two fingers before this pinch-out operation from distance between the two contact points on the touch panel by the two fingers after this pinch-out operation, thereby calculating the movement distance of the two fingers by this pinch-out operation.
Next, the control and computing unit 13 advances the processing to step S213, and calculates a scale of enlargement proportional to the movement distance of the two fingers by the pinch-out operation.
Further, as processing in step S214, the control and computing unit 13 generates a selected image part 71b obtained by enlarging the selected image part 71 according to the scale of enlargement through the image editor.
Subsequently, as processing in step S215 the control and computing unit 13 displays the selected image part 71b after the enlargement on the image display area 70 through the image editor.
After the processing in step S215, the control and computing unit 13 returns the processing to step S204 in
Also, upon proceeding to processing in step S221 in
Upon proceeding to processing in step S222 after determination is made in step S221 that the gesture operation is the drag operation, the control and computing unit 13 copies (or moves) the selected image part 71, as a selected image part 71c, to the position where the finger has been stopped or separated after this drag operation.
After this processing in step S222, the control and computing unit 13 returns the processing to step S204 in
Also, upon proceeding to processing in step S231 in
Upon proceeding to processing in step S232 after determination is made in step S231 that the gesture operation is the rotating operation by multi-touch, with the circular region 60 representing the hue and saturation of the HSV space as described above, the control and computing unit 13 changes the hue of the selected image part 71 according to a rotation angle at the time of rotating the other finger with the one finger serving as the rotation center, and calculates the R, G, and B values after change thereof.
Next, as processing in step S233 the control and computing unit 13 determines whether or not the pinch-in operation or pinch-out operation has been detected on the specified area, advances the processing to step S234 in the event that the pinch-in operation or pinch-out operation has been detected, and advances the processing to step S236 in the event that neither of these has been detected.
Upon proceeding to processing in step S234 after determination is made in step S233 that the pinch-in operation or pinch-out operation has been detected, the control and computing unit 13 subtracts, from distance between two contact points on the touch panel by the two fingers after this pinch-in operation or pinch-out operation, distance between two contact points on the touch panel by the two fingers before the pinch-in operation or pinch-out operation, thereby calculating the movement distance of the two fingers by this pinch-in operation or pinch-out operation.
Next, the control and computing unit 13 advances the processing to step S235, changes the saturation of the selected image part 71 according to the movement distance of the two fingers, and calculates the R, G, and B values after change thereof. After the processing in step S235, the control and computing unit 13 advances the processing to step S236.
Upon proceeding to step S236, the control and computing unit 13 changes the hue and saturation of the selected image part 71 by the R, G, and B values calculated in step S232 and step S235, and displays a selected image part 71d after the change thereof on the image display area 70, through the image editor.
After the processing in step S236, the control and computing unit 13 returns the processing to step S204 in
Also, upon proceeding to processing in step S241 in
Upon proceeding to the processing in step S242, the control and computing unit 13 releases the range selection state of the selected image part 71, and also returns the operation mode of this terminal from the object decoration editing mode to the normal image editing mode.
After the processing in step S242, the control and computing unit 13 returns the processing to step S202 in
[Another Example of Correlation Table Between Gesture Operations and Decoration Editing Contents at Time of Image Editing]
With the personal digital assistant according to the present embodiment, a correlation table between gesture operations and decoration editing contents used in the object decoration editing mode at the time of the image editing mode may be a table as shown in
Specifically, the example in
With the correlation table shown in
With the correlation table shown in
With the correlation table shown in
With the correlation table in
With the correlation table in
With the correlation table in
With the correlation table in
With the correlation table shown in
With the correlation table shown in
With the correlation table shown in
With the correlation table shown in
With the correlation table shown in
Alternatively, with the personal digital assistant according to the present embodiment, in accordance with the above-described example in
[General Overview]
As described above, the personal digital assistant according to the present embodiment includes a display unit having a screen; a touch panel unit capable of detecting a gesture operation by a user as to a touch detection surface; a correlation table control unit configured to generate or store a correlation table in which a plurality of information processes that can be subjected as to an object displayed on the screen of the display unit, and a plurality of gesture operations are correlated respectively; and a processing control unit configured to subject, at the time of a desired object being selected by a user on the screen of the display unit, and one of the plurality of gesture operations being detected, the selected object to an information process correlated with the detected gesture operation thereof, and to display the selected object on the screen.
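A minimal sketch of such a correlation table, assuming a simple mapping from gesture identifiers to effect functions (names chosen only for this example), is shown below.

```python
from typing import Callable

# Sketch: each gesture operation is keyed to the information process applied to
# the selected object; identifiers and the dispatch helper are assumed names.
Effect = Callable[[str], str]

correlation_table: dict[str, Effect] = {
    "flick_left": lambda s: f'<marquee direction="left">{s}</marquee>',
    "double_click": lambda s: f"<blink>{s}</blink>",
    "pinch_out_horizontal": lambda s: f"<b>{s}</b>",
}

def apply_effect(selected_object: str, gesture: str) -> str:
    """Apply the information process correlated with the detected gesture."""
    return correlation_table[gesture](selected_object)

print(apply_effect("Hello", "double_click"))  # -> <blink>Hello</blink>
```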
The personal digital assistant according to the present embodiment includes a detection area control unit configured to set, at the time of a desired object being selected by a user on the screen of the display unit, a plurality of gesture operations correlated with the correlation table regarding each information process that can be subjected as to this selected object to a gesture operation that can be detected in a predetermined detection area on the touch detection surface of the touch panel unit, and the processing control unit subjects, at the time of one of the plurality of gesture operations being detected at the touch detection surface, the selected object to an information process correlated with the detected gesture operation thereof, and displays the selected object on the screen.
Here, with the personal digital assistant according to the present embodiment, the touch detection surface of the touch panel unit is made up of a transparent touch screen which is disposed so as to cover generally the entire surface of the screen of the display unit.
Also, with the personal digital assistant according to the present embodiment, the detection area control unit may set the predetermined detection area to an area generally correlated with the display area of the selected object on the screen of a display unit.
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the correlation table control unit generates or stores a correlation table in which a size change process for changing the size of the object, and a gesture operation for changing distance between two touch points on the touch detection surface are correlated, and the processing control unit changes, at the time of a gesture operation for changing distance between the two touch points on the touch detection surface, the size of the selected object according to change of distance between the two touch points, and displays the selected object on the screen.
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the correlation table control unit generates or stores a correlation table in which a process for moving the object on the screen, and a gesture operation for moving a touch point on the touch detection surface are correlated, and the processing control unit moves and displays, at the time of a gesture operation for moving the touch point being detected on the touch detection surface, the selected object on the screen according to the movement of the touch point.
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the correlation table control unit generates or stores a correlation table in which a process for blinking the object on the screen, and a gesture operation for repeating a touch on the touch detection surface over a short period of time are correlated, and the processing control unit blinks and displays, at the time of a gesture operation for repeating a touch over the short period of time being detected on the touch detection surface, the selected object on the screen.
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the correlation table control unit generates or stores a correlation table in which a process for obliquely inclining the object on the screen, and a gesture operation for moving two touch points on the touch detection surface in parallel are correlated, and the processing control unit obliquely inclines and displays, at the time of a gesture operation for moving the touch points in parallel on the touch detection surface, the selected object on the screen.
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the correlation table control unit generates or stores a correlation table in which a hue changing process for changing the hue of the object, and a gesture operation for rotating two touch points on the touch detection surface are correlated, and also a saturation changing process for changing the saturation of the object, and a gesture operation for changing distance between two points on the touch detection surface are correlated, and the processing control unit changes, at the time of a gesture for rotating the two touch points being detected on the touch detection surface, the hue of the selected object according to the rotations of the two touch points, and changes, at the time of a gesture operation for changing distance between the two points being detected on the touch detection surface, the saturation of the selected object according to the change of the distance between the two points.
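The rotation of the two touch points can be reduced to a signed angle around the resting finger, for example with atan2, as in the sketch below; the degree convention and the function name are assumptions made only for this illustration.

```python
import math

# Sketch: one touch point serves as the rotation center, and the signed angle
# swept by the other point is measured with atan2; that angle drives the hue change.
def rotation_angle(center: tuple[float, float],
                   start: tuple[float, float],
                   end: tuple[float, float]) -> float:
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    return math.degrees(a1 - a0)

# The moving finger sweeps a quarter turn around the resting finger.
print(rotation_angle((0, 0), (100, 0), (0, 100)))  # -> 90.0
```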
Also, with the personal digital assistant according to the present embodiment, an arrangement may be made wherein the detection area control unit sets, at the time of a predetermined gesture operation being detected on the predetermined detection area, a plurality of gesture operations separately correlated with the correlation table as each information process that can be subjected as to the selected object to a gesture operation that can be detected on another detection area different from a predetermined detection area on the touch detection surface of the touch panel unit, and the processing control unit subjects, at the time of one of the gesture operations being detected on the other detection area, the selected object to an information process correlated with the detected gesture operation thereof, and displays the selected object on the screen.
Also, with the personal digital assistant according to the present embodiment, the correlation table control unit may generate or store a correlation table in which a decoration process that can be subjected as to a character serving as the object, and the plurality of gesture operations are correlated.
Also, with the personal digital assistant according to the present embodiment, the correlation table control unit may generate or store a correlation table in which a decoration process that can be subjected as to an image part serving as the object, and the plurality of gesture operations are correlated.
Further, embodiments of the present disclosure include an information processing control method. Specifically, an information processing control method according to the present embodiment is an information processing control method in a device including a display unit having a screen, and a touch panel unit capable of detecting a gesture operation by a user as to a touch detection surface, the method including a process in which a correlation table control unit generates or stores a correlation table in which a plurality of information processes that can be subjected as to an object displayed on the screen of the display unit, and a plurality of gesture operations are correlated respectively; and a process in which a processing control unit subjects, at the time of a desired object being selected by a user on the screen of the display unit, and any of the plurality of gesture operations being detected, the selected object to an information process correlated with the detected gesture operation thereof, and displays the selected object on the screen.
Also, embodiments of the present disclosure include an information processing control program. Specifically, an information processing control program according to the present embodiment is an information processing control program that can be executed at an information terminal including a display unit having a screen, and a touch panel unit capable of detecting a gesture operation by a user as to a touch detection surface, the program causing a computer of the information terminal to serve as a correlation table control unit configured to generate or store a correlation table in which a plurality of information processes that can be subjected as to an object displayed on the screen of the display unit, and a plurality of gesture operations are correlated respectively, and a processing control unit configured to subject, at the time of a desired object being selected by a user on the screen of the display unit, and any of the plurality of gesture operations being detected, the selected object to an information process correlated with the detected gesture operation thereof, and to display the selected object on the screen.
Also, the present embodiment also includes a recording medium. Specifically, a recording medium according to the present embodiment is configured to record an information processing control program that can be executed at an information terminal including a display unit having a screen, and a touch panel unit capable of detecting a gesture operation by a user as to a touch detection surface. Specifically, a recording medium according to the present embodiment records an information processing control program causing a computer of the information terminal to serve as a correlation table control unit configured to generate or store a correlation table in which a plurality of information processes that can be subjected as to an object displayed on the screen of the display unit, and a plurality of gesture operations are correlated respectively, and a processing control unit configured to subject, at the time of a desired object being selected by a user on the screen of the display unit, and any of the plurality of gesture operations being detected, the selected object to an information process correlated with the detected gesture operation thereof, and to display the selected object on the screen.
According to the present embodiment, an information processing control device including a touch panel on a display screen uses the correlation table in which a gesture operation and each decoration content or editing content are correlated to subject a user's desired selected object on the screen to decoration or editing or the like according to the user's gesture operation, thereby enabling the user's work to be reduced, and enabling the user's burden to be markedly reduced. In other words, with the present embodiment, parameters relating to decoration or parameters relating to editing to be subjected as to the selected object can be changed according to the movement direction, movement speed, movement amount (movement distance), movement path, or the like of a finger at the time of a gesture operation as appropriate, whereby the selected object can be subjected to more intuitive decoration or editing in accordance with the user's intention, and the user's effort and time at the time of performing such decoration or editing or the like can be markedly reduced as compared to the related art.
Note that the personal digital assistant according to the present embodiment may be applied not only to portable terminals, for example, high-performance portable telephone terminals, tablet terminals, slate PCs, so-called PDAs (Personal Digital Assistants), notebook-sized personal computers, portable game machines, portable navigation terminals, and so forth, but also to various stationary electronic devices including a touch panel.
Also, the description of the above embodiment is an example of the present disclosure. Accordingly, the present disclosure is not restricted to the above-described embodiment, and various modifications can be made according to a design or the like without departing from the technical idea relating to the present disclosure.
Further, it is apparent that one skilled in the art can conceive various modifications, combinations, and other embodiments due to a design or other elements within the scope of the Claims of the present disclosure or equivalent to the Claims.