Abstract
A digital system that may be used by children two years old and older. The digital system is contained in a child-proof case and has an upward-facing display with a touch-sensitive screen that is within easy reach of a child. Other I/O devices include proximity and motion sensors and a microphone, and there is also a loudspeaker. When a proximity sensor senses someone near the system, it displays images on the display. A child may manipulate the images by touching them on the touch screen. Manipulations include selecting an image by touching it; “dragging” the selected image by moving the touching finger across the screen; “dropping” the image by lifting the finger from it; moving a selected image by touching another location on the screen, thereby causing the selected image to move to the touched location; removing an image from the screen by “throwing” it, i.e., moving it above a threshold speed; and modifying the image by tapping it twice and then moving the finger in a horizontal or vertical direction on the screen. The direction in which an image is thrown may further determine what the thrown image is replaced with. The manipulations are used in activities such as shape matching, puzzle assembly, assembly of a face out of parts, and hide-and-go-seek.
Claims
1. A system for manipulating images comprising:
a screen upon which an image is displayed; and
a computer coupled to the screen, the computer causing the images to be manipulated in response to location inputs from a pointing device, the system being characterized in that:
when the image is being dragged in response to the location inputs and the system detects that the velocity with which the image is being dragged exceeds a threshold velocity, the system responds by removing the image from the display without leaving any representative thereof in the display.
2. The system set forth in
the removed image is automatically replaced by another image.
3. The system set forth in
there is a plurality of images, each image belonging to a class of a plurality thereof according to the image's content; and
when the image that is being removed is dragged in a first direction, the removed image is replaced with a different image of the same class; and
when the image that is being removed is dragged in a second direction, the removed image is replaced with an image of a different class.
4. The system set forth in
the classes belong to a hierarchy; and
the second direction includes a third direction specifying a class from a higher level in the hierarchy and a fourth direction specifying a class from a lower level in the hierarchy.
6. The system set forth in
the touch panel is transparent and is mounted on the screen.
7. A system for manipulating a movable image comprising:
a touch-sensitive screen upon which an image is displayed;
a computer coupled to the screen, the computer causing the image to be manipulated when the touch screen is touched,
the system being characterized in that:
touching the movable image at a point within the image selects the image for moving and
when the point being touched is being continually moved and the system detects that the velocity at which the point is moving exceeds a predetermined threshold velocity, the image being continually moved is removed from the screen without leaving any representative thereof on the screen.
8. The system set forth in
the removed image is automatically replaced by another image.
9. The system set forth in
there is a plurality of images, each image belonging to a class of a plurality thereof according to the image's content; and
when the image that has been removed was moving in a first direction, the removed image is replaced with a different image of the same class; and
when the image that has been removed was moving in a second direction, the removed image is replaced with an image of a different class.
10. The system set forth in
the classes belong to a hierarchy; and
the second direction includes a third direction specifying a class from a higher level in the hierarchy and a fourth direction specifying a class from a lower level in the hierarchy.
11. The system set forth in
touching the movable image and touching the movable image again within a predetermined period selects the movable image for modification.
12. The system set forth in
continually moving the point being touched after the movable image is again touched causes the image to be modified.
13. The system set forth in
continually moving the point being touched in a vertical direction changes the height of the image.
14. The system set forth in
continually moving the point being touched in a horizontal direction changes the width of the image.
15. Apparatus for displaying images, the apparatus comprising:
a touch screen for displaying the images and
a computer coupled to the touch screen,
the computer responding to a continuing touch that moves the image across the touch screen such that when the computer detects that the velocity of the touch exceeds a predetermined threshold, the computer responds by removing the image from the screen, and by replacing the image with a replacement image that is not a representative of the removed image.
16. The apparatus for displaying an image set forth in
there is a plurality of images, each image belonging to a class of a plurality thereof according to the image's content; and
when the continuing touch moves in a first direction, the replaced image is replaced with a replacement image of the same class; and
when the continuing touch moves in a second direction, the replaced image is replaced with a replacement image of a different class.
17. The apparatus set forth in
the classes belong to a hierarchy; and
the second direction includes a third direction specifying a class from a higher level in the hierarchy and a fourth direction specifying a class from a lower level in the hierarchy.
18. The system set forth in claim 2 wherein:
the removed image and the other image belong to an ordered set thereof;
if the removed image was dragged in a first direction, the other image precedes the removed image in the ordered set; and
if the removed image was dragged in a second direction, the other image follows the removed image in the ordered set.
19. The system set forth in claim 8 wherein:
the removed image and the other image belong to an ordered set thereof;
if the removed image was moved in a first direction, the other image precedes the removed image in the ordered set; and
if the removed image was moved in a second direction, the other image follows the removed image in the ordered set.
20. The system set forth in claim 15 wherein:
the removed image and the replacement image belong to an ordered set thereof;
if the removed image was moved in a first direction, the replacement image precedes the removed image in the ordered set; and
if the removed image was moved in a second direction, the replacement image follows the removed image in the ordered set.
Description
The present patent application claims priority from U.S. Provisional Application 60/057,117, Slavoljub Milekic, Child-friendly Digital Environment, filed Aug. 28, 1997.
1. Field of the Invention
The invention relates generally to digital systems and more particularly to digital systems that are adapted for use by children two years old and older. One aspect of such a digital system is a graphical user interface that requires neither typing skills nor fine visual-motor coordination.
2. Description of the Prior Art
Ever since interactive computer systems became available in the 1960s, they have been used to educate and entertain children. Entire industries dedicated to interactive games and educational software have arisen, and the Internet has made the whole world available to a child with access to a personal computer that is connected to it. Educational uses of the computer have ranged from employing it as a page turner and exercise-checking machine to exploiting its programmability to teach analytical thinking and problem-solving skills. For an example of the last kind of application, see Seymour Papert, Mindstorms, Basic Books, 1980.
Until now, the computer as used for the education or entertainment of children has had the same basic form as the computer used in the workplace: the display sits on a desk and has a vertically-mounted screen, and input to the computer is by way of a keyboard and a pointing device such as a mouse that sit on the desk with the display. The graphical user interfaces have generally been based on at least two out of three assumptions: the user can type, the user can read, and the user has the fine motor coordination necessary to manipulate the buttons, sliders, and icons typical of modern graphical user interfaces.
The orientation of the screen, the input devices, and the graphical user interfaces together render a standard computer unusable by children of pre-school age. Such children are too short to see the display or reach the keyboard and mouse, they cannot read, they cannot type, and even if they could reach the keyboard and mouse, they do not have the fine motor coordination necessary to use the graphical user interface. Moreover, a standard computer is not child-safe: it has an exposed power cord and other cords connecting components such as the keyboard and mouse to the CPU.
What is needed if small children are to be able to take advantage of the educational and entertainment opportunities offered by the computer is a digital device which has been rendered child-safe and which has a user interface that permits direct manipulation of objects in the display and requires neither literacy nor typing skills nor fine motor coordination. It is an object of the present invention to provide such a digital device.
The child-friendly digital system of the invention differs both in its physical aspect and in its graphical user interface from standard digital systems. In its physical aspect, the child-friendly computer system is contained in a toddler-proof case that rests on the floor and has a touch-sensitive screen that is within easy reach of a toddler. Images are displayed on the touch-sensitive screen and the child-friendly digital system responds to touches on the display by altering the display. Other features of the physical aspect include an upward-facing display, sensors for sensing the presence of the child and motion above the display, a microphone for receiving voice inputs, and a loudspeaker. The child-friendly digital system has no cords or other appendages. In one embodiment, the digital system is portable; in another, it is a fixed unit.
The graphical user interface for the child-friendly digital system is based on manipulating an image on the touch-sensitive screen by touching the image directly. If an image is movable, touching the screen at the image selects the image for moving; moving the touched point within a selected image causes the image to move with the touched point, thus permitting the image to be dragged. An image that has been selected for moving may also be caused to move to another location by touching a point elsewhere on the screen. The touch causes the selected image to move to the touched point. If the screen is touched at two or more points simultaneously, the image moves to a point between the touched points. If the image is dragged at a speed above a threshold velocity, the image is “thrown” off the display and may be automatically replaced by another image. Depending on the direction in which the image is dragged, the image may be replaced by one of the same kind or one of a different kind. If an image is tapped twice in short succession, with the finger remaining down after the second tap, the image is selected for modification. Moving the finger on the screen when an object has been selected for modification causes the object to change size in directions that depend on the direction of motion of the finger on the screen. The actions permitted by the graphical user interface are used to implement activities including shape sorting, puzzle assembly, hide-and-go-seek, and what might be called a digital picture book.
Other objects and advantages of the invention will be apparent to those skilled in the arts to which the invention pertains upon perusing the following Detailed Description and Drawing.
The reference numbers in the drawings have at least three digits. The two rightmost digits are reference numbers within a figure; the digits to the left of those digits are the number of the figure in which the item identified by the reference number first appears. For example, an item with reference number 203 first appears in FIG. 2.
The Detailed Description begins with an analysis of the kinds of changes that must be made in a digital system if it is to be usable by pre-school children, continues with a description of the physical construction of such a system, and then describes the graphical user interface for such a system. Finally, the Detailed Description provides a detailed disclosure of the implementation of important aspects of the graphical user interface.
What Needs to be Changed to Make a Digital System Child Friendly
There are three major areas which need to be addressed in making digital systems child-friendly. They can be loosely defined as changes in a) location, b) mode of interaction and c) content structure. Each of these changes will be briefly described in the following paragraphs.
Change in location. Although it seems trivial at first, a change in the location of objects is the first indicator of a psychological change in domain perception. Just moving the computer from the desk to the floor makes it more accessible to children and also indicates to them that the computer is a legitimate part of their environment. Of course, modern computers would hardly survive this change, because they were not built with children in mind. Moving computers to the floor would also mean making them at least as child-resistant as any good toy. As simple as it is, the change in location also implies a host of other changes in the design of child-friendly digital devices. First, a child-friendly digital system should lose all of its appendages and the cords that connect them. This means getting rid of the power cord, the keyboard, the mouse, and their cables, and making the image-displaying part self-standing (battery operated or with a concealed electric cord). Putting the display on the floor also means a change in the orientation of the viewing surface from perpendicular (where the child had to look up) to a more physiological upward-facing angle.
Change in mode of interaction. The change in the mode of interaction is not only dictated by the change in location, but is also necessitated by the inadequacy of the keyboard and the mouse as input devices for children. Both devices depend on possession of special kinds of knowledge and skills, not readily available to children. The keyboard requires not only competent writing (and typing) skills but also knowledge of specific vocabulary and its use (for example, that typing “exit” will end the current game). This does not mean that in a child-friendly interface environment keys would be banned from existence, but only that their number, size and function would dramatically change.
The mouse suffers from similar shortcomings. Not only is it inherently abstract (moving the mouse moves something in the display, but not always . . . ), but it also demands fine visual-motor coordination. The ‘folders’ displayed in a typical modern graphical user interface are approximately ¼ inch square, and it is within this range that the child has to coordinate the movement with the ‘click’ (sometimes even ‘double-click’) in order to make something happen. The size of the interface elements is not the only problem. One could easily increase the size of typical ‘buttons’ on the screen and use another input device (like the touch-sensitive screen), but the problem of interaction still remains. The adult-designed, ubiquitous ‘desktop’ metaphor, with its files and folders, subfolders, and ‘windows’ (on the desktop?!), is hardly a handy metaphor for a typical child. The necessity for changing not only the input devices but also the way the digital information is rendered accessible is the topic of the next section.
Change in content structure. The change in content structure does not mean change in content per se, but rather change in the way the content is organized and presented to the child. To an illiterate person (or a child) all the ‘folders’ on a computer display look pretty much the same. Thus, in a child-friendly digital environment the indicators of content should be clearly distinct visually and represent familiar aspects of the child's experience. However, this is the most superficial change necessary. There are other aspects of children's activity that call for more radical changes. These are a) making the information (content) manipulable, and b) making the content structure compatible with the child's social environment.
As described above, moving the digital system from the desktop to the floor leads to dramatic changes in design. The system loses separate input devices such as the keyboard and mouse and is reduced to the display unit. Ideally, the display itself should be compact, mobile, and with an upward-facing touch-sensitive viewing surface. It should also be rugged, scratch-resistant, and use a built-in power source. The unit should also have ample storage capacity and a way to quickly access, modify, and update stored information. Digital systems with some of the above characteristics are already available. They may be found in ATM machines and in kiosks for finding locations in superstores, malls, airports, or even museums. None of them, however, is designed to sit on the floor or has a graphical user interface that a preschooler can use.
The stationary version 201 of the digital system is better suited for museums and day-care centers. It has the same technical characteristics as portable system 101 but is based on a desktop computer with a touch-sensitive monitor 205. The system is configured as shown in FIG. 2.
Child-Friendly Graphical User Interface
The graphical user interface in system 101 or 201 is based on the touch-sensitive display, the motion and proximity sensors, and a voice recognition system that is made using microphone 107 and custom-made or commercially available voice recognition software such as Dragon Naturally Speaking from Dragon Systems, Inc. or Via Voice from IBM Corporation. The touch screen is employed as the primary user input device for a number of reasons. Pointing to and touching an item are the most natural ways of indicating its selection, and require no training even in very young children. Touch screens are very durable, have no moving parts, and require almost no maintenance. Since they are superimposed over the viewing surface, they demand no additional space. A host of studies on adults (summarized in Sears and Shneiderman, “High precision touch screens: design strategies and comparisons with a mouse”, International Journal of Man-Machine Studies 34, pp. 598-613, 1991) indicate that touch screens are the fastest pointing devices. However, when used for the selection of very small targets (less than 10 mm in diameter), they are also the ones with the highest error rates. These results were partially caused by the low resolution of older touch screens and by touch screen hardware that returned multiple pixel locations. In the past several years both the increased resolution of touch screens and software-implemented strategies for stabilizing the touch location have reduced touch screen error rates and brought them in line with those of the mouse. It is worth noting that even with the older touch screens there was no difference in error rates between the touch screen and the mouse in conditions where larger selection targets were used. A pilot study conducted at the Hampshire College Cognitive Development Lab has shown that even children as young as 2 years find the use of a touch-sensitive screen intuitive and easy. Furthermore, their performance on a simple visual mapping task was quite good, possibly as the result of the decrease in cognitive load associated with the interface.
While the touch screen makes a child-friendly graphical user interface possible, it is not sufficient by itself. Since preschoolers cannot read and do not have the fine motor coordination necessary for standard GUIs, the GUI had to be redesigned to employ interactions that were easy for the preschoolers. The interactions included the following:
Selection: Selection of an object such as circle 115 in display 111 is carried out by touching it. There is no traditional highlighting of the selected object (necessary for mouse input) because of the existing haptic feedback. However, since all of the objects have defined ‘anchor’ points (119 in object 115) which are used for the ‘dragging’ action, there is often a small movement of the selected object as it aligns its anchor point with the touch point. Selection may be further indicated by a visual “lifting” of the selected object (i.e., a discrete shadow is added to the object when it is selected) and/or by a discrete auditory signal when the object is selected or released.
Moving objects: Selected objects can be dragged by moving the finger across the screen, and ‘dropped’ by lifting the finger. If an object is dropped over the appropriate slot (117 in FIG. 1), it aligns itself with the slot.
In addition to the traditional mouse-supported actions, there are three more types of interaction supported in the graphical user interface: pointing to a location, throwing the object, and pushing the object.
Pointing to a location: Pointing to a location consists of simply touching screen 111 at the desired location. If what was last selected was an object, the touch may cause the object to move to the location that was touched. If what was last touched was a slot, the object that fits the slot will move to the location that was touched.
Throwing an object: The throwing action is executed when the speed at which an object is dragged across display 111 exceeds a threshold speed which corresponds more or less to the speed of the natural throwing motion. When the threshold is exceeded, the ‘thrown’ object will continue to move in the same direction even when the finger is lifted off the screen. One use of throwing is to remove an object from the display.
Pushing: When the child's finger is moved along display 111 and touches the side of an object, the object starts moving in the same direction in front of the finger. The motion is terminated when the finger is lifted from the screen or when it stops moving.
Object modification: Some objects may be modified by the child. When the child taps the object twice and does not raise its finger after the second tap, the object is surrounded by a red outline. If the child then moves its finger up or down on the screen, the object changes size in the vertical direction; if the child moves its finger sideways on the screen, the object changes size in the horizontal direction.
Voice recognition: The voice recognition mode of interaction allows the child to perform simple actions and navigation using oral commands like “go”, “no”, “more”, “new”, etc.
Hand gestures: Simple hand gestures (without touching the screen) can be used for interaction in certain contexts. For example, moving a hand in a horizontal direction in front of the screen may cause depicted objects, images, or pages to be ‘flipped’ in the appropriate direction. Another example is ‘zooming’ into the picture by just moving the palm of the hand closer to the screen (that is, to the proximity sensor).
Using the Interactions to Make Activities
The following portion of the description shows how the interactions described above may be used to make activities for preschoolers.
Matching:
As shown in FIG. 1, a shape such as circle 115 is matched by dragging it to the slot 117 that has the same shape.
Assembling a jigsaw puzzle is also a shape-matching activity, and it is implemented in the same fashion: each puzzle piece is dragged to the slot that corresponds to it.
Making Faces:
At the start of the activity, the face parts shown at 601 are randomly selected and shown on display 111. In the beginning, facial elements that occur in pairs are matching and symmetrical. To change the face, the child uses its finger to move each part to a desired location. The child may also use the throwing interaction described above to replace a face part with a new face part. For example, an ear of the type shown at 607 may be replaced by one of the type 617. The exchange is carried out by “throwing” the selected part toward the bottom part of the screen. When this is done, a different part of the same class appears, for example, ear 617. In order to allow creation both of symmetrical, natural-looking faces and of ‘strange’ faces, ‘throwing’ of the symmetrical parts is regulated in the following fashion: if both parts are of the same kind (for example, two blue eyes), throwing away one part results in random replacement by a different kind (for example, a green eye). This allows for the making of ‘hybrid’ faces. However, if the user wants to create a different symmetrical face, ‘throwing’ away an element in a mismatched pair will result in replacement with the appropriate element (that is, throwing away a blue eye in a blue-green pair will result in the appearance of another green eye). Double-tapping (and holding the finger down) on any face part puts the face part into a ‘modifiable’ mode indicated by a red outline around the part, as shown at 619. While the part is in this mode, moving the finger up or down (623) across the screen changes the part's height, while moving the finger left or right changes its width.
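The regulation of paired parts just described can be summarized in a short sketch. The original prototype was scripted in SuperCard's SuperTalk; the following Python fragment merely illustrates the replacement rule, and its names (replacement_kind, NUM_KINDS) are invented for the example.

```python
import random

NUM_KINDS = 10  # assumed number of variants per class of face part

def replacement_kind(thrown_kind: int, partner_kind: int | None) -> int:
    """Pick the kind of part that replaces a thrown face part.

    Unpaired parts and matched pairs get a randomly chosen different
    kind, which allows 'hybrid' faces; in a mismatched pair, the
    replacement matches the surviving partner, restoring symmetry.
    """
    if partner_kind is not None and partner_kind != thrown_kind:
        return partner_kind                     # mismatched pair: restore symmetry
    new_kind = thrown_kind
    while new_kind == thrown_kind:              # matched pair or unpaired part:
        new_kind = random.randrange(NUM_KINDS)  # pick a random different kind
    return new_kind
```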
Hide-and-Go-Seek:
Using Throwing to Navigate within and Between Categories:
As is clear from the activities already discussed, a child-friendly digital system can provide a child with many different versions of the same activity as well as with many different kinds of activities. The child-friendly digital system must thus provide the child with a simple technique that permits it both to navigate among classes of activities and versions of activities within a class and to select one of the activities. In a preferred embodiment, this is done with throwing. To get another version of the present object or activity, the child throws the present one to the left or the right; to get a different class of object or activity, the child throws the present one up or down.
The same mechanism may be used with games. For example, throwing the shape-sorting display up or down replaces it with a game of a different kind, while throwing it to the left or right produces another version of the shape-sorting game.
Collaborative Activities:
In what follows, details of an implementation of the invention will be discussed. The discussion is based for the most part on an implementation made for use in an art museum.
Implementation of Display 111
Display 111 is implemented in a preferred embodiment using a resistive, capacitive, or surface-acoustic wave touch screen. Such touch screens are available from vendors such as MicroTouch Systems, Inc., ELO, PixelTouch, or Keytec. The type of touch screen will of course determine what kinds of interactions are possible. For example, the previously-described behavior of a selected object when display 111 is being touched at more than one point simultaneously makes use of the fact that when a resistive touch screen is touched at more than one point simultaneously, the position that the touch screen driver reports to the CPU is the average of all of the positions. Some touch screens, for example those that use surface acoustic waves, can detect pressure as well as position, and this fact can be used in the interactions. An area of current experimentation is using a pressure sensitive screen to implement “3-D” interactions.
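As a minimal illustration of why the two-finger behavior described earlier works, the averaged position a resistive screen reports can be computed as below. This is a sketch with invented names, not the actual driver code.

```python
def reported_position(touches):
    """A resistive touch screen touched at several points reports roughly
    the average of the touched positions, so moving the selected image to
    the reported position places it between simultaneous touches."""
    xs = [x for x, _ in touches]
    ys = [y for _, y in touches]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Example: two simultaneous touches yield the midpoint between them.
# reported_position([(100, 200), (300, 400)])  ->  (200.0, 300.0)
```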
Implementation of Motion and Proximity Sensors 109:
Proximity sensor 1103 detects the presence of an object near the child-friendly digital system. One way of using the proximity sensor is to detect the presence of a child near the system. When the child is detected, the system may make objects appear on display 111 or even go into an “attract mode” specifically designed to attract the child's attention. Another way of using it is to have objects in the display respond to the degree of closeness of an object detected by the proximity sensor. For example, a figure on the display might respond to the approach of a child's hand to the display by moving out from under the hand and thereby appearing to “run away” from it. Another example would be the “zoom” effect mentioned earlier.
Motion sensors 1105 together detect the direction and velocity of motion above display 111. Motion sensors 1105 can be used in the same fashion as proximity sensor 1103 to detect the presence of a child. They can also be used as a source of inputs that influence the behavior of the display. For example, throwing could be implemented as a response to a rapid horizontal or vertical motion of the hand above display 111. Another use of motion sensors 1105 is to control a figure on display 111 that responds to the motion sensed by the motion sensor. For example, if the child moves its hand from left to right, the figure does the same with its hand.
Programming the Child-Friendly Digital System
A prototype of a child-friendly digital system has been implemented in a Macintosh® computer manufactured by Apple Computer, Inc. The prototype has been programmed using the Supercard® programming environment manufactured by Allegiant Technologies, Inc., 9740 Scranton Rd., Suite 300, San Diego, Calif. The Supercard programming environment was developed to create interactive displays. An interactive display is called a project in the Supercard environment; each project is a sequence of one or more windows. Within a window, a sequence of one or more cards may be displayed; a card may have associated with it a background design, and a number of objects such as text, graphics, or elements such as buttons and menus. Also associated with each project is a script written in the Supertalk™ scripting language. The code in the script is executed in response to events such as a touch on display 111. Details concerning the Supercard programming environment and the Supertalk scripting language may be found in Sean Baird, et al., SUPERCARD User Guide, Allegiant Technologies, Inc., 1996, and Ken Ray, et al., SUPERCARD Script Language Guide, Allegiant Technologies, Inc., 1996. Both of these documents are hereby incorporated by reference into this patent application. In the following, the script language code for certain of the interactions will be explained in detail.
In the Supercard programs used to implement the activities described above, a project consists of a number of different kinds of activities, for example, shape matching, hide-and-go-seek, jigsaw puzzles, and face assembly. There is a window corresponding to each class of activity and a card corresponding to each version of the activity. Graphical objects are used for the shapes, slots, puzzle pieces, face parts, and the like. Behavior of the graphical objects is controlled by scripts.
Details of the Scripts for the Activities:
The scripts in the following are for an implementation of the face activity shown in FIG. 6.
The code that is executed when moveMe occurs is shown at on MoveMe 1215(A) and (B) in FIG. 12.
The code of 1215A then saves the current time (provided at regular intervals by the SuperCard system) in StartTime at 1219, sets the location of the target (that is, the component currently being touched) to the present location of the mouse at 1221, puts the target's short name into CurrPict (1223), and puts the target's location into OrigPosition. It is the line of code identified at 1221 that moves the center of the target to the point touched by the user, thereby providing feedback indicating that the touch has selected the target for moving.
Continuing with the moveMe code shown in FIG. 13:
Portions 1311 and 1317 implement throwing. The position variables OldPosition and CurrPosition each contain two values: item 1, which is the x coordinate, and item 2, which is the y coordinate. In loop 1301, OldPosition is set a little more than two clock ticks before CurrPosition; consequently, the velocity with which a component is moved can be determined from the distance between the position specified in OldPosition and the position specified in CurrPosition (1312). In this embodiment, the velocity threshold is a distance of 2 of the distance units established by SuperCard. If the distance between the position variables is greater than that, a throw has occurred. In 1311, the throwing motion is to the left; in 1317, the cases where the throwing motion is to the right, up, or down are dealt with. Here, we need only describe 1311 in detail. As shown at 1313, first the user-defined throwLeft condition 1313 is raised; this gets rid of the part being thrown. Then the user-defined condition GetNewPart 1315 is raised; this gets the replacement part. Finally, MoveMe code 1215 is exited.
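In outline, the throw test of loop 1301 works as sketched below. This is a Python paraphrase of the SuperTalk logic with invented names; the roughly two-tick sampling interval and the threshold of 2 distance units are taken from the description above, while the tick length is an assumption.

```python
import math
import time

TICK = 1 / 60       # one SuperCard clock tick, assumed to be 1/60 second
THRESHOLD = 2.0     # throw threshold: 2 distance units between samples

def detect_throw(get_touch_position):
    """Sample the touched position about two ticks apart and report a
    throw direction when the distance between samples exceeds THRESHOLD."""
    old_x, old_y = get_touch_position()   # OldPosition
    time.sleep(2 * TICK)
    cur_x, cur_y = get_touch_position()   # CurrPosition
    dx, dy = cur_x - old_x, cur_y - old_y
    if math.hypot(dx, dy) <= THRESHOLD:
        return None                       # ordinary drag, not a throw
    # The dominant component of the motion determines the throw direction.
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"     # screen y assumed to grow downward
```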
The code that is executed when these conditions are raised is shown in FIG. 15.
The first thing GetNewPart does is rename the current target “thrashedOne”. Then at loop 1509, GetNewPart 1507 deals with the problem of pairs of parts. As mentioned above, the parts in a pair may either match or not match. Conceptually, each part belongs to a class of parts, such as eyes, noses, etc. Within the class, the part has an ID number, and if it is a part that comes in pairs, it has an indication whether it is the left or right member of the pair. In the preferred embodiment, this information about the part is encoded in the part's name. For instance, in the part name Leye5, eye indicates the class name, 5 the kind of eye, and L that the eye is a left eye.
In loop 1509, GetNewPart examines the name of each part in the display in turn. It keeps going until it finds a part that belongs to the same class as the current target or until it has examined all of the parts. As shown at 1511, if it finds a part that belongs to the same class and has the same ID as the current target, it exits loop 1509. This deals both with parts that do not belong to pairs and with parts that belong to matched pairs. On the other hand, as shown at 1513, if it finds a member that belongs to the same class as the current target but is not the other member of the pair, it puts the name of the other member of the pair into the variable NewPict and exits loop 1509.
At loop 1515, the code deals with the situations where the part is not a member of a pair and where the part is a member of a matching pair. In both cases, the local variable NewPict is empty and loop 1515 randomly generates a number for a new face part. If the number is not that of the current face part, loop 1515 exits, the name for the new face part is made by adding the number to the current face part's name, and the result is stored in NewPict (1516). In code portion 1517, finally, the graphic that was thrown is replaced by the one specified in NewPict, with the replacement graphic being placed at the position on the display indicated by OrigPosition.
Returning to the modification interaction: the code that modifies a part after it has been double-tapped is shown in FIG. 14.
The code begins at 1404 by making a red outline around the part visible, indicating to the user that it has been selected for modification. The body of the code is a loop 1402 which is repeated while the user continues to touch the part. On each iteration of the loop, the current position being touched is saved in Position1, there is a pause of 8 ticks, and the current position is then saved in Position2. In the section of the code labeled 1405, the positions saved in the two variables are compared to determine whether the motion of the touched point had a vertical component. If the comparison indicates that the vertical component was in the upward direction, the size of the part is increased in the vertical direction (1407); if the vertical component of the motion was in the downward direction, the size of the part is decreased in the vertical direction. Section 1411 works in the same fashion with any horizontal component of the motion. When the loop terminates, the outline showing that the part has been selected for modification is removed.
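A rough Python rendering of loop 1402 follows. The part object, its attributes, and the helper callbacks are invented for the sketch; the 8-tick pause and the up/down resize behavior come from the description, while the resize step, tick length, and left/right direction mapping are assumptions.

```python
import time

TICK = 1 / 60   # one clock tick, assumed to be 1/60 second
STEP = 2        # resize step per iteration; the actual value is an assumption

def modify_part(part, is_touched, get_touch_position):
    """While the part stays touched, compare positions sampled 8 ticks
    apart and resize the part along whichever axes the finger moved."""
    part.outline_visible = True           # red outline: modifiable mode
    while is_touched():
        x1, y1 = get_touch_position()     # Position1
        time.sleep(8 * TICK)              # pause of 8 ticks
        x2, y2 = get_touch_position()     # Position2
        if y2 < y1:                       # upward motion (y grows downward)
            part.height += STEP
        elif y2 > y1:                     # downward motion
            part.height -= STEP
        if x2 > x1:                       # rightward motion: wider (assumed)
            part.width += STEP
        elif x2 < x1:                     # leftward motion: narrower (assumed)
            part.width -= STEP
    part.outline_visible = False          # outline removed when touch ends
```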
Implementation of Other Interactions
It will be immediately apparent from the foregoing how the other interactions between the user and the child-friendly digital system are implemented. In the case of the shape matching and puzzle assembling activities, the shapes are implemented as named foreground graphics and the slots are implemented as named background graphics that have the same names as the foreground graphics that correspond to the slots. The on MoveMe code that is executed when one of the named foreground graphics is touched includes dragging code similar to that just explained, and while the dragging is going on, the code constantly checks whether the present location of the foreground graphic is within a predetermined distance of the corresponding background graphic. If it is, the foreground graphic aligns itself with the background graphic. Moving a shape to its slot when the slot is touched is done with on MouseDown code. When the slot is touched, its name is used to construct the name of the foreground graphic to be moved and the code then moves the foreground graphic to the slot that has been touched.
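The proximity check just described might look like this in Python. SNAP_DISTANCE, the object attributes, and the callbacks are invented; the shared-name lookup and the alignment behavior come from the text.

```python
import math

SNAP_DISTANCE = 20   # the 'predetermined distance', in pixels (assumed value)

def drag_shape(shape, slots_by_name, is_touched, get_touch_position):
    """Drag a foreground graphic with the finger; whenever it comes within
    SNAP_DISTANCE of the background graphic of the same name, align it."""
    while is_touched():
        shape.location = get_touch_position()
        slot = slots_by_name[shape.name]         # slot shares the shape's name
        if math.dist(shape.location, slot.location) < SNAP_DISTANCE:
            shape.location = slot.location       # shape aligns with its slot
            break
```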
In the display of pictures, the pictures are divided into categories, with the same number of pictures in each category. An array is created for the category names, and each category is thereby mapped to an index. Associated with each category name is an array for the pictures in the category, thereby mapping each picture to an index. A variable called currCategory contains the index of the category of the picture currently being displayed and another called currPic contains the index of the picture itself. A throw is detected from the motion of the touched point in the display in the same manner as described above. When the throw is to the left, the current picture in the display is removed in that direction, the index in currPic is incremented by one, wrapping around if necessary, and the picture of the current category that corresponds to the new value of currPic appears. When the throw is to the right, the same thing occurs, except that currPic is decremented.
When the throw is up, the current picture is removed, currCategory is incremented, wrapping around if necessary, and the picture corresponding to the value of currPic in the category indicated by the new value of currCategory is displayed. When the throw is down, the same thing occurs, except that currCategory is decremented.
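The wraparound navigation just described reduces to modular index arithmetic. Below is a self-contained Python sketch; the category and picture names are invented for illustration.

```python
categories = ["animals", "vehicles", "faces"]              # illustrative names
pictures = {c: [f"{c}_{i}" for i in range(8)] for c in categories}

curr_category = 0    # index of the category currently displayed
curr_pic = 0         # index of the picture within that category

def throw(direction):
    """Left/right steps through pictures within the current category;
    up/down steps between categories.  All indices wrap around, and the
    categories hold the same number of pictures, so curr_pic stays valid."""
    global curr_category, curr_pic
    n_pics = len(pictures[categories[curr_category]])
    if direction == "left":
        curr_pic = (curr_pic + 1) % n_pics
    elif direction == "right":
        curr_pic = (curr_pic - 1) % n_pics
    elif direction == "up":
        curr_category = (curr_category + 1) % len(categories)
    elif direction == "down":
        curr_category = (curr_category - 1) % len(categories)
    return pictures[categories[curr_category]][curr_pic]
```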
In the foregoing Detailed Description, the inventor has described the best mode presently known to him of adapting a digital system for use by children as young as two years in such fashion that those skilled in the arts to which the disclosure pertains may make and use such a digital system. The child-friendly digital system disclosed herein has been constructed according to the general principles discussed above.
One consequence of these principles is the physical form of the child-friendly digital system: a system with no cords or other appendages that sits on the floor and that uses a touch panel over an upward-facing display, a microphone, and proximity and motion sensors for inputs, and the display and a loudspeaker for outputs. Another consequence is a graphical user interface which permits the child to manipulate objects on the screen by touching them and moving its finger on the screen. The interface is built from actions including touching an object to select it for moving, moving the selected object by moving the finger across the screen, moving the object at a speed faster than a threshold velocity to get rid of it, and modifying the object by touching it twice and then moving the finger within the object to change its size. The Detailed Description has further disclosed how these actions can be used to create a number of activities, including shape sorting, assembling a puzzle, making a face out of component parts, and making what amounts to a digital picture book.
It will be immediately apparent to those skilled in the arts to which the disclosure pertains that there are many ways of realizing the principles demonstrated by the child-friendly digital system other than the one disclosed herein; it will also be apparent that many of the principles and techniques demonstrated in the child-friendly digital system are useful in other situations where control of a computer system by means of a keyboard and pointing device is difficult. For example, adults who have suffered a stroke or who are wearing heavy gloves have fine motor coordination problems comparable to those of small children. Moreover, interactions in the GUI such as throwing may be useful even in systems that employ standard pointing devices.
For all these reasons, the Detailed Description is to be regarded as being in all respects exemplary and not restrictive, and the breadth of the invention disclosed herein is to be determined not from the Detailed Description, but rather from the claims as interpreted with the full breadth permitted by the patent laws.