There is provided a method of representing and animating a 2D (two-dimensional) character in a 3D (three-dimensional) space for a character animation. The method includes performing a pre-processing operation, in which data required to represent and animate the 2D character like a 3D character are prepared and stored, and producing the character animation using the stored data.
1. A method of representing and animating a 2D (two-dimensional) humanoid character according to views of the humanoid character in a 3D (three-dimensional) space, the method comprising:
performing a pre-processing operation including a character setup operation and a motion data setup operation that prepare and store data of a 3D humanoid character; and
producing a 2D character animation of the 3D humanoid character using the stored data;
wherein the character setup operation includes obtaining, from a camera in the 3D space, camera views of the 3D humanoid character and storing the views in a character database; and
wherein the motion data setup operation includes analyzing a 3D character animation of the 3D humanoid character, extracting mapping information that maps the 3D character animation into a 2D character animation, and storing the mapping information in a motion database.
2. The method of claim 1, wherein the producing of the 2D character animation comprises:
searching and extracting key frame data from the motion database, the key frame data corresponding to location information of a camera and a posture of the humanoid character in the 3D space determined by an animation producer;
converting coordinates using the location information of the camera and a motion value obtained at each key frame extracted from the motion database;
searching the character database for key drawings of the character obtained from the coordinate conversion;
converting the key drawings corresponding to the frames after searching the motion database for a drawing order of body regions corresponding to the searched key drawings; and
performing an in-betweening operation in which frames are interpolated between a current key frame and a next key frame.
3. The method of claim 1, wherein the character setup operation comprises:
setting a plurality of locations according to virtual camera locations in the 3D space;
preparing a key drawing of the 2D character from an appearance of the character obtainable from the virtual camera locations;
dividing the key drawing of the 2D character into appearances of the character obtainable at the 26 virtual camera locations; and
storing the divided key drawings in the character database.
4. The method of
5. The method of
6. The method of claim 1, wherein the motion data setup operation comprises:
extracting key frames from 3D character animation data;
extracting, when the extracted key frame data is applied to the 3D humanoid character, the mapping information after searching the character database for camera information of the 2D humanoid character in consideration of a camera angle and a character orientation;
determining a drawing order for body regions of the 2D humanoid character after analyzing a posture of the 3D character corresponding to the extracted key frame motion; and
storing the mapping information and the drawing order in the motion database.
7. The method of
1. Field of the Invention
The present invention relates to a method of representing and animating a two-dimensional (2D) humanoid character in a three-dimensional (3D) space and, more particularly, to a method that can represent a 2D humanoid character according to views of the humanoid character taken by a camera in a 3D space and animate the 2D humanoid character according to a procedure of a 3D animation.
2. Description of the Related Art
Prior art methods of making an animation are classified into a 2D animation method and a 3D animation method.
The 2D animation method is based on 2D cells, whereas the 3D animation method makes an animation by preparing it three-dimensionally and rendering it.
In the 2D animation method, after a key frame is first set, an original draft is drawn on paper and then transferred to a cell to complete the key frame. Frames are then interpolated between the key frames through an in-betweening operation, thereby completing the overall animation.
In the 3D animation method, data are three-dimensionally modeled, and an animation is applied to the modeled 3D data and rendered, thereby completing the animation.
A character animation is an animation having a virtual character moving on a screen. Recently, as computer graphics technology has advanced, it has become possible to represent a humanoid character that moves like a human being. Character animations are classified into 2D and 3D character animations.
The 2D character animation is made through the 2D animation method and the 3D character animation is made through the 3D animation method.
The 2D character animation is simple, like a conventional cartoon film or a flash animation, and its motion is exaggerated, whereas the motion of the 3D character animation is represented realistically, like that of a human being. Therefore, the 2D character animation is produced through a key frame method suited to cell animation, while the 3D character animation is produced through a combination of a key frame method and a motion capture method.
In the key frame method, key frames are first set and the animation is completed by interpolating frames between the key frames through the in-betweening operation. However, since this process must be performed for each joint of the human being that is the original model of the animation character, much effort and time are needed to produce the animation data, and the data quality depends on the skill of the worker.
In the motion capture method for producing the 3D character animation, the data are produced by capturing the motion of a real human being using an optical, mechanical, or magnetic motion capture device, and these data are used for the character animation. The motion capture method provides a very natural motion, as it uses the motion of the human being as it is. However, since the data are bulky, much effort and time are required to post-process, amend, and change them.
Recently, even when an animation is three-dimensionally produced, it is sometimes still required to represent the texture of a 2D cell animation using cartoon shading or exaggerated motion. Therefore, combined use of the two methods is strongly desired. However, due to their different production processes, it is difficult to use them together, and even when they are used together, it is difficult to retain their respective advantages.
Accordingly, the present invention is directed to a method of representing and animating a two-dimensional (2D) humanoid character in a three-dimensional (3D) space, which substantially obviates one or more problems due to limitations and disadvantages of the related art.
It is an object of the present invention to provide a method that can represent and animate a 2D character together with a 3D character in a 3D space, thereby effectively producing an animation in which a character moves in the 3D space but provides the non-realistic texture of a 2D character.
It is another object of the present invention to provide a method that can represent a 2D character in a 3D space in response to a 3D character animation data by solving drawbacks of the 2D and 3D characters when the 2D character is used together with the 3D animation.
That is, the 2D character is difficult to amend and its reuse frequency is low. The 3D character, by contrast, has a high reuse frequency, since it is not affected by, for example, a camera change as long as the scene setting is unchanged; however, its initial production is difficult. Furthermore, when data such as motion capture data are used, the motion is realistic and thus cannot be used for the 2D cell animation. Therefore, it is another object of the present invention to solve the disadvantages of the 2D and 3D characters by combining the 2D character with the 3D animation.
In the prior art, character animations have been produced either two- or three-dimensionally, and the 2D and 3D production methods have been combined only for objects or camera animation, not for characters. Therefore, it is still another object of the present invention to provide a method of combining the 2D and 3D production methods for characters. That is, it is an object of the present invention to provide a method that can represent a 2D character according to views of the character taken by a camera in a 3D space and animate the 2D character according to a procedure of a 3D animation.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, there is provided a method of representing and animating a 2D (Two-Dimensional) character in a 3D (Three-Dimensional) space for a character animation, including: performing a pre-processing operation in which data of a character that is required to represent and animate the 2D character like a 3D character is prepared and stored; and producing the character animation using the stored data. The performing the pre-processing operation includes performing a character setup operation in which an appearance obtainable from a camera in the 3D space is prepared as the 2D character according to camera views and stored in a character database; and performing a motion data setup operation in which a 3D character animation is analyzed and mapped into a 2D character animation and stored in a motion database.
The producing of the character animation includes: searching and extracting key frame data from the motion database, the key frame data corresponding to location information of a camera and a posture of the character in the 3D space determined by an animation producer; converting coordinates using the location information of the camera and a motion value obtained at each key frame extracted from the motion database; searching the character database for key drawings of the character obtained from the coordinate conversion; converting the key drawings corresponding to the frames after searching the motion database for a drawing order of body regions corresponding to the searched key drawings; and performing an in-betweening operation in which frames are interpolated between a current key frame and a next key frame.
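The two databases at the core of this pipeline can be pictured as simple keyed stores. The following is a minimal sketch with illustrative names and fields not taken from the patent: the character database maps a (camera view, body region) pair to a pre-drawn key drawing, and the motion database maps a frame index to the mapped key frame data.

```python
from dataclasses import dataclass, field

@dataclass
class KeyDrawing:
    view_id: int       # one of the 26 preset camera orientations
    region: str        # body region, e.g. "head" or "left_hand"
    image_path: str    # pre-drawn 2D artwork for this region and view

@dataclass
class KeyFrame:
    frame_index: int
    motion_value: dict[str, float]          # per-channel motion from the 3D data
    view_id: int                            # mapped 2D camera view
    draw_order: list[str] = field(default_factory=list)  # regions, far to near

# character database: (view_id, region) -> key drawing
character_db: dict[tuple[int, str], KeyDrawing] = {}
# motion database: frame_index -> key frame with mapping information
motion_db: dict[int, KeyFrame] = {}
```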
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the principle of the invention.
Reference will now be made in detail to the preferred embodiments of the present invention.
A method of the present invention is realized by a system including a pre-processing unit for preparing data in advance and storing it in a database, and an animation producing unit for producing a character animation using the data stored by the pre-processing unit.
Accordingly, an animation producing method of the present invention includes pre-processing predetermined data into a database and producing a character animation using the data stored in the database.
In the pre-processing operation, the data that will be used in producing the character animation is first prepared; the animation is then produced through the process described below.
When the data has been prepared through the pre-processing operation, an animation producer determines the location of the camera and the posture of the character in the 3D space, and the corresponding key frame information is read from the motion database 36 (S13).
A motion value of each frame is calculated based on the key frames obtained from the motion database 36, and a coordinate conversion is performed using the motion values and camera information (S14). The orientation and size of the character can be identified from the coordinate conversion result of the motion data, and a key drawing of the 2D character corresponding to the identified orientation and size is extracted from the character database (S15).
Next, the drawing order with respect to the body regions stored in the motion database is detected, the drawing is converted according to that order, and the character drawing corresponding to the frame is completed (S16). Then, frames are interpolated between the first and second key frames through the in-betweening operation (S17), so that each frame between the first and second key frames also has a character. The above operation is repeated for the next key frames so that all of the frames can have the character (S18). When all of the frames have been processed (YES), all of the frames are interconnected to complete an animation (S19). When the frames are not fully processed (NO), the key frame information is read from the motion database 36 (S13) and the operations (S14-S18) are repeated.
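As one hedged illustration of steps S13 through S18, the loop below walks consecutive key frames and linearly interpolates each scalar motion channel to fill in the in-between frames. The dictionary structure and channel names are assumptions made for this sketch, not structures defined by the patent.

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation, the core of the in-betweening operation (S17)."""
    return a + (b - a) * t

def produce_frames(key_frames: list[dict]) -> list[dict]:
    """Sketch of S13-S18: emit each key frame, then fill the gap to the
    next key frame with interpolated in-between frames."""
    frames = []
    for current, nxt in zip(key_frames, key_frames[1:]):
        frames.append(current)                       # the key frame itself
        gap = nxt["frame_index"] - current["frame_index"]
        for i in range(1, gap):                      # in-betweening (S17)
            t = i / gap
            frames.append({
                "frame_index": current["frame_index"] + i,
                # interpolate every scalar motion channel between key frames
                "motion": {ch: lerp(current["motion"][ch], nxt["motion"][ch], t)
                           for ch in current["motion"]},
            })
    frames.append(key_frames[-1])
    return frames

# Example: two key frames four frames apart produce three in-betweens.
clip = produce_frames([
    {"frame_index": 0, "motion": {"arm_angle": 0.0}},
    {"frame_index": 4, "motion": {"arm_angle": 90.0}},
])
```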
In the pre-processing operation, to represent a 2D character in a 3D space, the character appearances that can be obtained from the camera in the 3D space are two-dimensionally produced in advance according to camera views. At this point, a 3D animation is analyzed and mapped into a 2D animation, and the result is stored in the databases. The pre-processing operation includes a character setup process and a motion data setup process.
In the character setup process, 26 virtual locations are preset according to camera locations 104 that may be set in the 3D space 102, and a key drawing of the 2D character corresponding to the character appearance obtainable at each camera location 104 is prepared as drawing data (S20). Since the 3D character is formed in a hierarchical structure, in which body regions such as the hands and feet may move, the key drawing of the 2D character is divided to correspond to the body regions obtainable at the 26 virtual locations and is drawn and processed accordingly (S21). The division of the 2D character by body regions is described below.
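The patent does not spell out how the 26 virtual camera locations are laid out; one common reading, assumed here, is the 26 unit directions of a 3x3x3 grid around the character (every axis combination of -1, 0, 1 except the all-zero center). The sketch below enumerates them and snaps an arbitrary camera direction to the nearest preset view so the matching key drawing can be fetched.

```python
import math
from itertools import product

# Assumed layout (not defined in the patent): the 26 preset views are the
# directions of a 3x3x3 grid around the character, i.e. every (x, y, z)
# in {-1, 0, 1}^3 except the origin.
VIEW_DIRECTIONS = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]
assert len(VIEW_DIRECTIONS) == 26

def nearest_view(cam_dir: tuple[float, float, float]) -> int:
    """Snap an arbitrary (non-zero) camera direction to the closest preset
    view so the matching pre-drawn key drawing can be fetched."""
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.dist(u, (0, 0, 0)) * math.dist(v, (0, 0, 0)))
    return max(range(len(VIEW_DIRECTIONS)),
               key=lambda i: cosine(cam_dir, VIEW_DIRECTIONS[i]))

# Example: a camera slightly above and in front maps to view (0, 1, 1).
print(VIEW_DIRECTIONS[nearest_view((0.1, 0.9, 1.0))])
```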
In the motion data setup process, the character animation data produced as 3D character data are collected (S30). The collected data are motion capture data or character animation data produced through the 3D key frame method. Then, key frames are extracted from the 3D character animation data (S31). Key frames are preset in data produced through the 3D key frame method; however, they are not preset in motion capture data, in which case the user sets the key frames. When the extracted key frame data is applied to the 3D character, mapping information is extracted by searching the character database 23 for the corresponding camera location information of the 2D character after identifying the appearance obtained at the camera angle and in the orientation of the character (S32). Information relating to this mapping is then stored in the motion database 36. After the motion of the 3D character corresponding to the extracted key frames is analyzed (S33), a drawing order for the body regions of the 2D character is determined (S34) and stored in the motion database 36 (S35). At this point, the drawing order is calculated using the distance between the end-effector of each body region and the character body with reference to the camera view.
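A hedged sketch of steps S31 through S35 follows. For simplicity it quantizes only the horizontal camera-to-character yaw into eight views rather than the full 26, and it assumes each key frame carries an orientation and per-region end-effector positions; all names are illustrative rather than taken from the patent.

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def setup_motion_data(key_frames, camera):
    """Sketch of S31-S35: map each extracted key frame to a preset 2D view
    and determine the drawing order of the body regions."""
    motion_db = []
    for kf in key_frames:
        # S32: the camera/character relative yaw selects a preset view
        # (eight horizontal views assumed here instead of the full 26)
        relative_yaw = (camera["yaw"] - kf["character_yaw"]) % 360.0
        view_id = int(relative_yaw // 45.0)
        # S33-S34: regions sorted far-to-near from the camera, using each
        # region's end-effector position stored in the posture
        draw_order = sorted(
            kf["posture"],
            key=lambda region: distance(camera["position"], kf["posture"][region]),
            reverse=True,
        )
        # S35: store the mapping information and the drawing order
        motion_db.append({"frame_index": kf["frame_index"],
                          "view_id": view_id,
                          "draw_order": draw_order})
    return motion_db
```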
A virtual object 105 to be photographed is shown in a virtual space. A diagram 102 schematically shows a track, at the center of which the virtual object 105 is located, and camera locations are set at 26 orientations on the track. Therefore, the views of a virtual character that can be taken by the virtual camera may be prepared in advance and stored in the character database 23.
That is, a location of the camera is set on the track 102, which is formed by dividing the camera location into the 26 orientations in the track 101 of the virtual space. An appearance of the 2D character 130 taken at the camera location 103 in the virtual space and an appearance of the character taken at the camera location 104 on the track 102 are mapped (a). Similarly, an appearance of the 2D character 130 taken at the camera location 103′ and an appearance taken at the camera location 104′ on the track 102 are mapped (b), and an appearance taken at the camera location 103″ and an appearance taken at the camera location 104″ on the track 102 are mapped (c).
When the character is in the world coordinate system, as shown in coordinate (c), the character is located at a point P (a, b, c) away from the point of origin of the world coordinate system 107, and the coordinate axes differ from each other. Therefore, the character coordinate system 106 must be converted into the world coordinate system 107. To perform this conversion, the representation of the character in the 3D space and the point P that serves as the reference of the animation must be identified; to identify the point P, the movement value T and the rotation value R must be identified. Since the location P of the point of origin of the character coordinate system 106 within the world coordinate system 107 is known, the movement value T can be calculated by forming P as a vector. The rotation value R is calculated from equation 1 using the orientation angle of the character and the movement value T.
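Equation 1 itself is not reproduced in this text. A standard character-to-world transform consistent with the surrounding description (an assumed reconstruction, not necessarily the patent's actual formula) would combine a rotation by the character's orientation angle about the vertical axis with the translation to P:

```latex
% Assumed reconstruction: a point x_c in the character coordinate system 106
% maps to the world coordinate system 107 by a rotation R (orientation angle
% \theta about the vertical axis) followed by the translation T = P.
\[
  \mathbf{x}_w \;=\; R(\theta)\,\mathbf{x}_c + T,
  \qquad
  R(\theta) =
  \begin{pmatrix}
    \cos\theta & 0 & \sin\theta\\
    0 & 1 & 0\\
    -\sin\theta & 0 & \cos\theta
  \end{pmatrix},
  \qquad
  T = \overrightarrow{OP} = (a,\, b,\, c)^{\top}.
\]
```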
In the present invention, the human body is divided into six regions: a center body 131, a head 132, a right hand 133, a left hand 134, a right foot 135, and a left foot 136. When the 2D character is drawn, it is drawn with reference to the center body 131, and the drawing order of the regions is determined by checking whether the end-effector of each region is closer to the camera than the center body with reference to the camera view. The drawing starts from the region farthest from the camera and ends at the region closest to the camera.
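The drawing-order rule just described is a painter's algorithm over the six regions. Below is a minimal sketch, with hypothetical region coordinates, that sorts the regions from farthest to nearest end-effector relative to the camera:

```python
import math

def compute_draw_order(end_effectors: dict, camera_pos: tuple) -> list:
    """Return region names sorted farthest-to-nearest from the camera, so
    drawing starts with the farthest region and ends with the closest."""
    def dist(p):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, camera_pos)))
    return sorted(end_effectors, key=lambda r: dist(end_effectors[r]), reverse=True)

# Example with the six regions named in the text (coordinates illustrative):
order = compute_draw_order(
    {"center_body": (0, 1, 0), "head": (0, 1.7, 0),
     "right_hand": (-0.6, 1.2, 0.3), "left_hand": (0.6, 1.2, -0.2),
     "right_foot": (-0.2, 0, 0.1), "left_foot": (0.2, 0, -0.1)},
    camera_pos=(0, 1, 5),
)
```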
The 2D character can thus be exaggerated like a cartoon, as illustrated in the accompanying figures.
The pre-processing operation has been described above together with the animation producing operation.
Therefore, the character animation producing method of the present invention can be applied to internet-based applications, virtual reality systems and computer games.
In addition, the method can be programmed so that it can be read by a computer. This program can be stored in a variety of recording media such as CD-ROMs, ROMs, RAMs, floppy disks, hard disks, and magneto-optical disks.
As described above, the present invention is directed to a method of representing a 2D character in a 3D space like a 3D character, and to a method of animating the 2D character by mapping the 3D motion data used in the 3D character animation to the 2D motion of the 2D character. That is, the present invention is directed to a method of producing a 2D animation by mapping motion capture data to the 2D motion of the character, thereby providing a 2D animation having cartoon-like exaggeration. In addition, the reuse of motions, which was difficult in the conventional 2D animation, becomes possible by constructing a database containing the motion information.
The character animation producing method of the present invention can be applied to internet-based applications, virtual reality systems and computer games as well as the animation production. Furthermore, when the character animation producing method is associated with a chroma-key technology, advantages of both 2D and 3D animations can be obtained.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention. Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Inventors: Kim, Bo Youn; Koo, Bon Ki; Lee, Ji Hyung; Kim, Hee Jeong; Kim, Sung Ye