First or second clothing information corresponding to a first or second shape is obtained, and first or second pattern information corresponding to a first or second pattern is obtained. First or second clothing data corresponding to clothes with the first or second shape associated with the first or second clothing information is combined with first or second pattern data corresponding to the first or second pattern associated with the first or second pattern information to create composite data corresponding to a composite image of clothes with the first or second pattern and the first or second shape. The composite image corresponding to the composite data is output to and displayed in a display device.
13. A clothing design output method comprising an obtaining step, a composition step, and an output step,
the obtaining step being a step of obtaining any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape, the obtaining step obtaining any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern,
the composition step being:
when the clothing information and the pattern information obtained by the obtaining step are the first clothing information and the first pattern information, a step of combining first clothing data and first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the first pattern data corresponding to the first pattern associated with the first pattern information;
when the clothing information and the pattern information obtained by the obtaining step are the first clothing information and the second pattern information, a step of combining the first clothing data and second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape, the second pattern data corresponding to the second pattern associated with the second pattern information;
when the clothing information and the pattern information obtained by the obtaining step are the second clothing information and the first pattern information, a step of combining second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape, the second clothing data corresponding to clothes with the second shape associated with the second clothing information; and
when the clothing information and the pattern information obtained by the obtaining step are the second clothing information and the second pattern information, a step of combining the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape, and
the output step being a step of outputting the composite image corresponding to the composite data created by the composition step to a display device, the display device being configured to display the composite image,
wherein the method further comprises:
a display step of displaying the composite image in the display device and displaying the composite image in a display unit different from the display device; and
a reselection step of obtaining a reselection instruction to reselect the shape and the pattern of the clothes in a state in which the composite image is displayed in the display device and the composite image is displayed in the display unit,
the display step comprising:
when the reselection instruction is obtained in the reselection step, ending the display of the composite image being displayed in the display unit and then displaying a clothing selection screen or a pattern selection screen in the display unit, the clothing selection screen including a first clothing image associated with the first clothing information and a second clothing image associated with the second clothing information, the pattern selection screen including a first pattern image associated with the first pattern information and a second pattern image associated with the second pattern information; and
continuing to display the composite image being displayed in the display device while the clothing selection screen or the pattern selection screen is displayed in the display unit when the reselection instruction is obtained in the reselection step.
12. A clothing design output system comprising:
a controller configured to obtain any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape, wherein the controller obtains any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern;
a storage unit configured to store first clothing data and second clothing data, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the second clothing data corresponding to clothes with the second shape associated with the second clothing information, wherein the storage unit stores first pattern data and second pattern data, the first pattern data corresponding to the first pattern associated with the first pattern information, the second pattern data corresponding to the second pattern associated with the second pattern information;
an output unit configured to output a composite image corresponding to composite data to a display device configured to display the composite image; and
a display unit configured to display the composite image, the display unit being different from the display device, wherein
the controller is configured such that:
when the clothing information and the pattern information obtained by the controller are the first clothing information and the first pattern information, the controller combines the first clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape;
when the clothing information and the pattern information obtained by the controller are the first clothing information and the second pattern information, the controller combines the first clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape;
when the clothing information and the pattern information obtained by the controller are the second clothing information and the first pattern information, the controller combines the second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape; and
when the clothing information and the pattern information obtained by the controller are the second clothing information and the second pattern information, the controller combines the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape,
the output unit is configured to output the composite image corresponding to the composite data created by the controller to the display device,
the display unit is configured to display the composite image corresponding to the composite data created by the controller,
the controller is configured to obtain a reselection instruction to reselect the shape and the pattern of the clothes in a state in which the composite image is displayed in the display device and the composite image is displayed in the display unit,
the display unit is configured to end the display of the composite image being displayed in the display unit when the controller obtains the reselection instruction, and then to display a clothing selection screen or a pattern selection screen in the display unit, the clothing selection screen including a first clothing image associated with the first clothing information and a second clothing image associated with the second clothing information, the pattern selection screen including a first pattern image associated with the first pattern information and a second pattern image associated with the second pattern information, and
the display device is configured to continue displaying the composite image being displayed in the display device while the clothing selection screen or the pattern selection screen is displayed in the display unit, when the controller obtains the reselection instruction.
1. A clothing design display system comprising a terminal, an image processing device, and a display device, wherein:
the terminal includes a first controller and a first communication unit,
the first controller being configured to obtain any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape,
the first controller being configured to obtain any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern,
the first communication unit being configured to transmit the clothing information and the pattern information obtained by the first controller to the image processing device;
the image processing device includes a second communication unit, a storage unit, a second controller, and an output unit,
the second communication unit being configured to receive the clothing information and the pattern information transmitted from the terminal,
the storage unit storing first clothing data and second clothing data, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the second clothing data corresponding to clothes with the second shape associated with the second clothing information,
the storage unit storing first pattern data and second pattern data, the first pattern data corresponding to the first pattern associated with the first pattern information, the second pattern data corresponding to the second pattern associated with the second pattern information,
the second controller being configured such that:
when the clothing information and the pattern information received by the second communication unit are the first clothing information and the first pattern information, the second controller combines the first clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape;
when the clothing information and the pattern information received by the second communication unit are the first clothing information and the second pattern information, the second controller combines the first clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape;
when the clothing information and the pattern information received by the second communication unit are the second clothing information and the first pattern information, the second controller combines the second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape; and
when the clothing information and the pattern information received by the second communication unit are the second clothing information and the second pattern information, the second controller combines the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape,
the output unit being configured to output the composite image corresponding to the composite data created by the second controller to the display device;
the display device is configured to display the composite image output from the image processing device,
the second communication unit is configured to transmit the composite data created by the second controller to the terminal,
the first communication unit is configured to receive the composite data transmitted from the image processing device,
the terminal includes a display unit configured to display the composite image corresponding to the composite data received by the first communication unit,
the first controller is configured to obtain a reselection instruction to reselect the shape and the pattern of the clothes in a state in which the composite image is displayed in the display device and the composite image is displayed in the display unit,
the display unit is configured to end the display of the composite image being displayed in the display unit when the first controller obtains the reselection instruction, and then to display a clothing selection screen or a pattern selection screen in the display unit, the clothing selection screen including a first clothing image associated with the first clothing information and a second clothing image associated with the second clothing information, the pattern selection screen including a first pattern image associated with the first pattern information and a second pattern image associated with the second pattern information,
the first communication unit is configured to transmit the reselection instruction obtained by the first controller to the image processing device when the first controller obtains the reselection instruction,
the second communication unit is configured to receive the reselection instruction transmitted from the terminal, and
the display device is configured to continue displaying the composite image being displayed in the display device while the clothing selection screen or the pattern selection screen is displayed in the display unit, when the second communication unit receives the reselection instruction.
2. The clothing design display system according to
the first controller is configured to obtain any of first figure information and second figure information as figure information corresponding to a figure of a human, the first figure information corresponding to a first figure, the second figure information corresponding to a second figure different from the first figure,
the first communication unit is configured to transmit the clothing information, the pattern information, and the figure information obtained by the first controller to the image processing device,
the second communication unit is configured to receive the clothing information, the pattern information, and the figure information transmitted from the terminal,
the storage unit stores first doll data and second doll data, the first doll data corresponding to a doll with the first figure associated with the first figure information, the second doll data corresponding to a doll with the second figure associated with the second figure information, and
the second controller is configured such that:
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the first figure information, the second controller combines the first clothing data, the first pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the first pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the second figure information, the second controller combines the first clothing data, the first pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the first pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the first figure information, the second controller combines the first clothing data, the second pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the second pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the second figure information, the second controller combines the first clothing data, the second pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the second pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the first figure information, the second controller combines the second clothing data, the first pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the first pattern and the second shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the second figure information, the second controller combines the second clothing data, the first pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the first pattern and the second shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the first figure information, the second controller combines the second clothing data, the second pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the second pattern and the second shape; and
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the second figure information, the second controller combines the second clothing data, the second pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the second pattern and the second shape.
3. The clothing design display system according to
the second controller is configured to obtain shot data corresponding to a shot image including a head of the human shot by a shooting device,
the second controller is configured to extract head data corresponding to a head image of the head of the human from the shot data obtained by the second controller, and
the second controller is configured such that:
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the first figure information, the second controller combines the first clothing data, the first pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the second figure information, the second controller combines the first clothing data, the first pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the first figure information, the second controller combines the first clothing data, the second pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the second figure information, the second controller combines the first clothing data, the second pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the first shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the first figure information, the second controller combines the second clothing data, the first pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the second shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the second figure information, the second controller combines the second clothing data, the first pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the second shape;
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the first figure information, the second controller combines the second clothing data, the second pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the second shape; and
when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the second figure information, the second controller combines the second clothing data, the second pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the second shape.
4. The clothing design display system according to
the second controller is configured to specify color information of a skin of the human included in the head image from the head data extracted by the second controller, and
the second controller is configured to create the composite data corresponding to the composite image of a doll, the doll having a head set as the head image corresponding to the head data, a part of the doll, other than the head, that is not covered by the worn clothes being set to a color corresponding to the color information specified by the second controller.
5. The clothing design display system according to
the second controller is configured to create first composite data and second composite data as the composite data corresponding to the composite image, the first composite data corresponding to a first composite image of a doll having a head not set as the head image corresponding to the head data, the second composite data corresponding to a second composite image of a doll having a head set as the head image corresponding to the head data,
the output unit is configured to output the second composite image corresponding to the second composite data created by the second controller to the display device,
the second communication unit is configured to transmit the first composite data created by the second controller to the terminal,
the first communication unit is configured to receive the first composite data transmitted from the image processing device, and
the display unit is configured to display the first composite image, the first composite image corresponding to the first composite data received by the first communication unit.
6. A clothing production system comprising:
the clothing design display system according to
a clothing production apparatus configured to produce clothes, wherein
the clothing production apparatus is configured to produce the clothes shown in the composite image corresponding to the composite data created by the second controller.
7. The clothing design display system according to
the second controller is configured to create first composite data and second composite data as the composite data corresponding to the composite image, the first composite data corresponding to a first composite image of a doll having a head not set as the head image corresponding to the head data, the second composite data corresponding to a second composite image of a doll having a head set as the head image corresponding to the head data,
the output unit is configured to output the second composite image corresponding to the second composite data created by the second controller to the display device,
the second communication unit is configured to transmit the first composite data created by the second controller to the terminal,
the first communication unit is configured to receive the first composite data transmitted from the image processing device, and
the display unit is configured to display the first composite image, the first composite image corresponding to the first composite data received by the first communication unit.
8. A clothing production system comprising:
the clothing design display system according to
a clothing production apparatus configured to produce clothes, wherein
the clothing production apparatus is configured to produce the clothes shown in the composite image corresponding to the composite data created by the second controller.
9. A clothing production system comprising:
the clothing design display system according to
a clothing production apparatus configured to produce clothes, wherein
the clothing production apparatus is configured to produce the clothes shown in the composite image corresponding to the composite data created by the second controller.
10. A clothing production system comprising:
the clothing design display system according to
a clothing production apparatus configured to produce clothes, wherein
the clothing production apparatus is configured to produce the clothes shown in the composite image corresponding to the composite data created by the second controller.
11. A clothing production system comprising:
the clothing design display system according to
a clothing production apparatus configured to produce clothes, wherein
the clothing production apparatus is configured to produce the clothes shown in the composite image corresponding to the composite data created by the second controller.
The present invention relates to a clothing design display system that displays a composite image of clothes, a clothing production system that produces the clothes, a clothing design output system that outputs the composite image of the clothes to a display device, and a clothing design output method.
A technique for confirming, in advance of manufacture, the impression that clothes will give once completed has been proposed. For example, Patent Literature 1 discloses a production method for garments and accessories. The production method includes a step of selecting a cloth and a design. In this selection step, a shop assistant or the customer himself/herself operates an apparatus installed in a retail store to display a selection screen and a clothed image. The customer's favorite design and cloth are selected and determined. Color and pattern settings are applied to the selected design using a three-dimensional design component, and the finished image is displayed through computer graphics.
There is a sales form in which clothes designed to meet the demands of a customer are customized and sold. The inventors considered that a system with which the impression given by the actually completed clothes can be confirmed before the clothes are manufactured is effective for such a sales form: if the coloring of the completed clothes differs from the coloring that impressed the customer, the customer cannot be satisfied. The inventors have therefore studied a technique to express, in the design phase, the impression given by the clothes in their actually completed state. The inventors have also studied a technique that allows clothes meeting the demands of the customer to be designed smoothly in the design phase.
An object of the present invention is to provide a clothing design display system, a clothing production system, a clothing design output system, and a clothing design output method that allow a display device to display, in the design phase, a composite image giving an impression close to the impression felt from the custom-made clothes after completion.
An aspect of the present invention is a clothing design display system including a terminal, an image processing device, and a display device, wherein: the terminal includes a first controller and a first communication unit, the first controller being configured to obtain any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape, the first controller being configured to obtain any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern, the first communication unit being configured to transmit the clothing information and the pattern information obtained by the first controller to the image processing device; the image processing device includes a second communication unit, a storage unit, a second controller, and an output unit, the second communication unit being configured to receive the clothing information and the pattern information transmitted from the terminal, the storage unit storing first clothing data and second clothing data, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the second clothing data corresponding to clothes with the second shape associated with the second clothing information, the storage unit storing first pattern data and second pattern data, the first pattern data corresponding to the first pattern associated with the first pattern information, the second pattern data corresponding to the second pattern associated with the second pattern information, the second controller being configured such that: when the clothing 
information and the pattern information received by the second communication unit are the first clothing information and the first pattern information, the second controller combines the first clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape; when the clothing information and the pattern information received by the second communication unit are the first clothing information and the second pattern information, the second controller combines the first clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape; when the clothing information and the pattern information received by the second communication unit are the second clothing information and the first pattern information, the second controller combines the second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape; and when the clothing information and the pattern information received by the second communication unit are the second clothing information and the second pattern information, the second controller combines the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape, the output unit being configured to output the composite image corresponding to the composite data created by the second controller to the display device; and the display device is configured to display the composite image output from the image processing device.
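The four cases handled by the second controller follow one selection rule: pick stored clothing data by the received clothing information, pick stored pattern data by the received pattern information, and combine the two. The following is an illustrative sketch of that rule, not part of the claimed configuration; all names (`CLOTHING_DATA`, `compose`, and the string stand-ins for image data) are hypothetical.

```python
# Hypothetical stand-ins for the first/second clothing data and
# first/second pattern data held in the storage unit.
CLOTHING_DATA = {"first": "clothes_shape_1", "second": "clothes_shape_2"}
PATTERN_DATA = {"first": "pattern_1", "second": "pattern_2"}

def compose(clothing_info: str, pattern_info: str) -> str:
    """Select stored data according to the received clothing information
    and pattern information, and combine them into composite data."""
    clothing_data = CLOTHING_DATA[clothing_info]  # first or second clothing data
    pattern_data = PATTERN_DATA[pattern_info]     # first or second pattern data
    # In a real system this would be an image-compositing operation;
    # here a string stands in for the composite data.
    return f"composite({clothing_data}, {pattern_data})"

# E.g. first clothing information with second pattern information yields
# composite data for clothes with the second pattern and the first shape.
print(compose("first", "second"))
```

The same function covers all four enumerated combinations, which is why the claim language repeats the same sentence with only the selected data changed.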
This clothing design display system allows the display device to display the composite image of clothes with a predetermined pattern and a predetermined shape according to the clothing information and the pattern information obtained by the terminal. Making the display device large allows for displaying the clothes with the predetermined pattern and the predetermined shape at a large scale. By displaying the clothes with the predetermined pattern and the predetermined shape at a large scale, the size of the clothes in the composite image becomes close to the size of the actual clothes, and failures caused by differences in perceived color due to the area effect are prevented beforehand.
The clothing design display system may be configured as follows: the first controller is configured to obtain any of first figure information and second figure information as figure information corresponding to a figure of a human, the first figure information corresponding to a first figure, the second figure information corresponding to a second figure different from the first figure, the first communication unit is configured to transmit the clothing information, the pattern information, and the figure information obtained by the first controller to the image processing device, the second communication unit is configured to receive the clothing information, the pattern information, and the figure information transmitted from the terminal, the storage unit stores first doll data and second doll data, the first doll data corresponding to a doll with the first figure associated with the first figure information, the second doll data corresponding to a doll with the second figure associated with the second figure information, and the second controller is configured such that: when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the first figure information, the second controller combines the first clothing data, the first pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the first pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the second figure information, the second controller combines the first clothing data, the first pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second 
figure who wears the clothes with the first pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the first figure information, the second controller combines the first clothing data, the second pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the second pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the second figure information, the second controller combines the first clothing data, the second pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the second pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the first figure information, the second controller combines the second clothing data, the first pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the first pattern and the second shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the second figure information, the second controller combines the second clothing data, the first pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with 
the first pattern and the second shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the first figure information, the second controller combines the second clothing data, the second pattern data, and the first doll data to create composite data corresponding to a composite image of a doll with the first figure who wears the clothes with the second pattern and the second shape; and when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the second figure information, the second controller combines the second clothing data, the second pattern data, and the second doll data to create composite data corresponding to a composite image of a doll with the second figure who wears the clothes with the second pattern and the second shape.
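The eight enumerated cases above are, again, one parameterized selection: clothing data, pattern data, and doll data are each chosen by their corresponding received information and then combined. The sketch below is illustrative only and extends the hypothetical naming used earlier; none of these identifiers come from the specification.

```python
# Hypothetical stand-ins for the data held in the storage unit.
CLOTHING_DATA = {"first": "clothes_shape_1", "second": "clothes_shape_2"}
PATTERN_DATA = {"first": "pattern_1", "second": "pattern_2"}
DOLL_DATA = {"first": "doll_figure_1", "second": "doll_figure_2"}

def compose_doll(clothing_info: str, pattern_info: str, figure_info: str):
    """Combine the clothing data, pattern data, and doll data selected
    by the three pieces of received information."""
    return (CLOTHING_DATA[clothing_info],
            PATTERN_DATA[pattern_info],
            DOLL_DATA[figure_info])

# All eight combinations in the enumeration are covered by the same lookup:
for c in ("first", "second"):
    for p in ("first", "second"):
        for f in ("first", "second"):
            print(compose_doll(c, p, f))
```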
This configuration allows the display device to display the composite image of the doll with a predetermined figure who wears the clothes with the predetermined pattern and the predetermined shape according to the clothing information, the pattern information, and the figure information obtained by the terminal. Since the composite image shows the doll with the predetermined figure wearing the clothes with the predetermined pattern and the predetermined shape, a user can recognize an impression of newly-designed clothes. For example, suppose that the newly-designed clothes are sleeveless or short-sleeved, while the clothes that the user actually wears are long-sleeved. As the composite data, composite data may also be created that corresponds to a composite image in which a full-length shot of the user wearing the long-sleeved clothes is further overlaid with the previously mentioned sleeveless or short-sleeved clothes. Note that with a composite image in such a state, it is sometimes difficult to accurately recognize the impression of the newly-designed clothes because of the influence of the long-sleeved part and of the pattern of the clothes that the user actually wears. With the present configuration, the influence of the clothes that the user actually wears can be restrained when the impression of the newly-designed clothes is recognized.
The clothing design display system may be configured as follows: the second controller is configured to obtain shot data corresponding to a shot image including a head of the human shot by a shooting device, the second controller is configured to extract head data corresponding to a head image of the head of the human from the shot data obtained by the second controller, and the second controller is configured such that: when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the first figure information, the second controller combines the first clothing data, the first pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the first pattern information, and the second figure information, the second controller combines the first clothing data, the first pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the first figure information, the second controller combines the first clothing data, the second pattern data, the first 
doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the first clothing information, the second pattern information, and the second figure information, the second controller combines the first clothing data, the second pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the first shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the first figure information, the second controller combines the second clothing data, the first pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the second shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the first pattern information, and the second figure information, the second controller combines the second clothing data, the first pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding 
to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the first pattern and the second shape; when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the first figure information, the second controller combines the second clothing data, the second pattern data, the first doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the first figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the second shape; and when the clothing information, the pattern information, and the figure information received by the second communication unit are the second clothing information, the second pattern information, and the second figure information, the second controller combines the second clothing data, the second pattern data, the second doll data, and the head data that is extracted by the second controller to create composite data corresponding to a composite image of a doll with the second figure and further having a head set as the head image corresponding to the head data who wears the clothes with the second pattern and the second shape. 
In this case, the clothing design display system may be configured as follows: the second controller is configured to specify color information of a skin of the human included in the head image from the head data extracted by the second controller, and the second controller is configured to create the composite data corresponding to the composite image of a doll, the doll having a head set as the head image corresponding to the head data, a part of the doll excluding the head not covered with worn clothes being set as a color corresponding to the color information specified by the second controller.
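The skin-color specification above amounts to sampling skin regions of the head image and deriving a representative color that is then applied to the uncovered parts of the doll. The following is a minimal illustrative sketch under the assumption that a simple per-channel average is used; the specification does not prescribe any particular color-estimation method, and the function name and pixel format are hypothetical.

```python
def mean_skin_color(pixels):
    """pixels: iterable of (r, g, b) samples taken from skin areas of
    the head image. Returns the per-channel average as the specified
    color information (integer RGB)."""
    n = 0
    r = g = b = 0
    for pr, pg, pb in pixels:
        r, g, b, n = r + pr, g + pg, b + pb, n + 1
    return (r // n, g // n, b // n)

# Two sample skin pixels averaged into one representative color.
print(mean_skin_color([(220, 180, 160), (210, 170, 150)]))  # -> (215, 175, 155)
```

The returned color would then be used to fill the parts of the doll, excluding the head, that are not covered by the worn clothes.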
This configuration allows the user to recognize the impression of an appearance of himself/herself who actually wears the newly-designed clothes.
The clothing design display system may be configured as follows: the second controller is configured to create first composite data and second composite data as the composite data corresponding to the composite image, the first composite data corresponding to a first composite image of a doll having a head not set as the head image corresponding to the head data, the second composite data corresponding to a second composite image of a doll having a head set as the head image corresponding to the head data, the output unit is configured to output the second composite image corresponding to the second composite data created by the second controller to the display device, the second communication unit is configured to transmit the first composite data created by the second controller to the terminal, the first communication unit is configured to receive the first composite data transmitted from the image processing device, and the terminal includes a display unit configured to display the first composite image, the first composite image corresponding to the first composite data received by the first communication unit.
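The two-output routing described above sends the second composite image (head set as the head image) to the display device through the output unit, while the first composite data (head not set) is transmitted back to the terminal. A minimal sketch of that routing follows; the `Display` and `Terminal` classes are hypothetical stand-ins for the output unit's destination and the terminal's communication unit, not structures named in the specification.

```python
class Display:
    """Stand-in for the display device receiving the second composite image."""
    def __init__(self):
        self.shown = None
    def show(self, image):
        self.shown = image

class Terminal:
    """Stand-in for the terminal receiving the first composite data."""
    def __init__(self):
        self.received = None
    def receive(self, data):
        self.received = data

def route_composites(first_composite, second_composite, terminal, display):
    # Output unit -> display device: the composite with the head image.
    display.show(second_composite)
    # Second communication unit -> terminal: the composite without the head.
    terminal.receive(first_composite)

display, terminal = Display(), Terminal()
route_composites("composite_without_head", "composite_with_head", terminal, display)
print(display.shown)      # shown on the display device
print(terminal.received)  # displayed on the terminal's display unit
```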
This configuration allows the display unit on the terminal to display the first composite image and the display device to display the second composite image. The first composite image that does not include the head of the user displayed in the display unit on the terminal allows the user to recognize the impression of the clothes themselves. The second composite image displayed in the display device allows the user to recognize the impression of an appearance of himself/herself who actually wears the newly-designed clothes.
The clothing design display system may be configured as follows: the second communication unit is configured to transmit the composite data created by the second controller to the terminal, the first communication unit is configured to receive the composite data transmitted from the image processing device, and the terminal includes a display unit configured to display the composite image corresponding to the composite data received by the first communication unit. This allows the display device and the display unit on the terminal to display the composite images.
Another aspect of the present invention is a clothing production system including any one of the above-mentioned clothing design display systems and a clothing production apparatus configured to produce clothes. The clothing production apparatus is configured to produce clothes in the composite image corresponding to the composite data created by the second controller. This clothing production system allows for producing newly-designed clothes with the clothing design display system. The clothes meeting the demands from the user can be provided to this user.
Still another aspect of the present invention is a clothing design output system including: a controller configured to obtain any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape, wherein the controller obtains any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern; a storage unit configured to store first clothing data and second clothing data, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the second clothing data corresponding to clothes with the second shape associated with the second clothing information, wherein the storage unit stores first pattern data and second pattern data, the first pattern data corresponding to the first pattern associated with the first pattern information, the second pattern data corresponding to the second pattern associated with the second pattern information; and an output unit configured to output a composite image corresponding to composite data to a display device configured to display the composite image, wherein the controller is configured such that: when the clothing information and the pattern information obtained by the controller are the first clothing information and the first pattern information, the controller combines the first clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape; when the clothing information and the pattern information obtained by the controller are the first clothing 
information and the second pattern information, the controller combines the first clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape; when the clothing information and the pattern information obtained by the controller are the second clothing information and the first pattern information, the controller combines the second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape; and when the clothing information and the pattern information obtained by the controller are the second clothing information and the second pattern information, the controller combines the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape, and the output unit is configured to output the composite image corresponding to the composite data created by the controller to the display device.
Still another aspect of the present invention is a clothing design output method including an obtaining step, a composition step, and an output step, the obtaining step being a step of obtaining any of first clothing information and second clothing information as clothing information corresponding to a shape of clothes, the first clothing information corresponding to a first shape, the second clothing information corresponding to a second shape different from the first shape, the obtaining step obtaining any of first pattern information and second pattern information as pattern information corresponding to a pattern of clothes, the first pattern information corresponding to a first pattern, the second pattern information corresponding to a second pattern different from the first pattern, the composition step being: when the clothing information and the pattern information obtained by the obtaining step are the first clothing information and the first pattern information, a step of combining first clothing data and first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the first shape, the first clothing data corresponding to clothes with the first shape associated with the first clothing information, the first pattern data corresponding to the first pattern associated with the first pattern information; when the clothing information and the pattern information obtained by the obtaining step are the first clothing information and the second pattern information, a step of combining the first clothing data and second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the first shape, the second pattern data corresponding to the second pattern associated with the second pattern information; when the clothing information and the pattern information obtained by the obtaining step are the second clothing information and the first pattern information, 
a step of combining second clothing data and the first pattern data to create composite data corresponding to a composite image of clothes with the first pattern and the second shape, the second clothing data corresponding to clothes with the second shape associated with the second clothing information; and when the clothing information and the pattern information obtained by the obtaining step are the second clothing information and the second pattern information, a step of combining the second clothing data and the second pattern data to create composite data corresponding to a composite image of clothes with the second pattern and the second shape, and the output step being a step of outputting the composite image corresponding to the composite data created by the composition step to a display device, the display device being configured to display the composite image.
The clothing design output system and the clothing design output method described above allow a display device to display the composite image of clothes with a predetermined pattern and a predetermined shape according to obtained clothing information and pattern information. Making the display device large allows for displaying the clothes with the predetermined pattern and the predetermined shape at a large scale. By displaying the clothes with the predetermined pattern and the predetermined shape at a large scale, the size of the clothes in the composite image becomes close to the size of the actual clothes, and failures caused by differences in perceived color due to the area effect are prevented beforehand.
The present invention can obtain a clothing design display system, a clothing production system, a clothing design output system, and a clothing design output method. The systems and the method of the present invention allow a display device to display, in the design phase, a composite image with an impression close to an impression felt from custom-made clothes after completion.
Embodiments for carrying out the present invention will be described with reference to the drawings. The present invention is not limited to the configurations described below, and various configurations can be employed based on the same technical idea. For example, a part of the configurations shown below may be omitted or may be replaced by another configuration or the like. Another configuration may be included.
<Clothing Design Display System>
A clothing design display system 1 is described with reference to
The clothing design display system 1 includes a terminal 10, an image processing device 30, and a display device 40. The terminal 10 and the image processing device 30 are connected to allow for data communications. Predetermined data is communicated between the terminal 10 and the image processing device 30. In the embodiment, the terminal 10 communicates the data with the image processing device 30 via a network 70. An access point 71 is disposed at the network 70. The terminal 10 is connected to the access point 71 by wireless connection. The image processing device 30 is connected to the network 70 by wired connection. In the embodiment, such connection forms are described as the examples. Note that the connection form of the terminal 10 to the network 70 may also be a wired connection. The connection form of the image processing device 30 to the network 70 may also be a wireless connection via the access point 71.
The display device 40 is connected to the image processing device 30. Further, a shooting device 50 and an operating device 60 are connected to the image processing device 30. The embodiment employs a wired connection with a connection method described below, as the connection form of the image processing device 30 with the display device 40, as the connection form of the image processing device 30 with the shooting device 50, and as the connection form of the image processing device 30 with the operating device 60. Note that the connection form of the image processing device 30 with a part of or all of the display device 40, the shooting device 50, and the operating device 60 may also be the wireless connection.
The terminal 10 is an information processing apparatus. For example, a portable information processing apparatus is employed as the terminal 10. For example, a tablet terminal or a notebook personal computer is employed as the terminal 10. The embodiment employs a tablet terminal as the terminal 10 (see
The image processing device 30 is an information processing apparatus. For example, a desktop personal computer is employed as the image processing device 30. The image processing device 30 is an information processing apparatus that executes a composition process on predetermined target image data. This composition process is executed in accordance with each piece of information from the terminal 10 (see S85 in
The display device 40 is, for example, a liquid crystal display. Note that the display device 40 may also be a display of a type different from the liquid crystal display. For example, the display device 40 may also be a projector. The display device 40 displays a second composite image output from the image processing device 30. The second composite image will be described below. The screen size of the display device 40 may be set so that the display target second composite image is displayed with a size approximately corresponding to the height of the wearer when the display device 40 is installed with its longitudinal direction in the vertical direction (see
The shooting device 50 is a camera. The shooting device 50 shoots the wearer. For example, the shooting device 50 shoots a front, a back, and sides of the wearer. In the embodiment, the shot target by the shooting device 50 is the front of the wearer, the back of the wearer, and any one of the right and the left sides of the wearer. When any one of the right and the left sides of the wearer is the shot target, which side is set as the shot target is determined as appropriate taking various circumstances into consideration. Both the right and the left sides of the wearer may also be the shot target. The embodiment sets the right side of the wearer as the shot target. Shot data corresponding to the shot image shot by the shooting device 50 is input to the image processing device 30 via a signal cable compatible with a connection method described below. In the embodiment, the shot image shot by the shooting device 50 is a still image. Therefore, the shot data is image data of a still image. As the shooting device 50, a known camera can be employed. Therefore, other descriptions related to the shooting device 50 are omitted. The operating device 60 will be described below.
The terminal 10 and the image processing device 30 constitute a clothing design output system 2 as a subsystem in the clothing design display system 1. That is, the clothing design display system 1, as described below, designs the clothes meeting the demands from the customer with the terminal 10 and the image processing device 30. The image including the designed clothes (the second composite image) is output to the display device 40.
<Terminal>
The terminal 10 is described with reference to
The RAM 13 serves as a storage area when the CPU 11 executes the OS and the various programs stored in the storage device 12. The RAM 13 stores predetermined information and data in a predetermined storage area in the middle of the execution of the process. In the terminal 10, the CPU 11 as the controller executes the OS and the various programs stored in the storage device 12. In association with this, the terminal 10 executes various processes, and functions corresponding to the executed processes are achieved.
The operating unit 14 accepts various instructions input to the terminal 10. The operating unit 14, like known tablet terminals, is configured to have a touchpad in the embodiment in which the terminal 10 is a tablet terminal. An operator of the terminal 10 executes an operation such as a tap, a pinch, a flick, and a swipe on the operating unit 14 by the touchpad. The operating unit 14 accepts the instruction according to each operation. In the embodiment, the operator of the terminal 10 is one or both of a shop assistant of the above-mentioned business operator and the wearer. The operating unit 14 may also include a predetermined hardware key. Further, the operating unit 14 may also be a keyboard and a computer mouse. The CPU 11 obtains the predetermined instruction accepted by the operating unit 14 via the operating unit 14.
The display unit 15 displays predetermined information. For example, the display unit 15 displays a figure selection screen (see
The communication unit 16 is a network interface compatible with the wireless LAN communication method. The communication unit 16 connects the terminal 10 to the network 70 via wireless communication with the access point 71. The communication unit 16 executes data communications with the image processing device 30. In the terminal 10, the communication unit 16 transmits predetermined information and instructions to the image processing device 30 and receives the predetermined data transmitted from the image processing device 30.
The terminal 10 differs from known information processing apparatuses in that the storage device 12 stores the program for the acceptance process
<Image Processing Device>
The image processing device 30 is described with reference to
The doll data is image data corresponding to a doll with a predetermined figure. The storage device 32 stores a plurality of doll data corresponding to dolls with different figures. In the embodiment, described are three kinds of examples, a large size, a medium size, and a small size, as figure information corresponding to the figure of the wearer. A magnitude relationship among the respective sizes, the large size, the medium size, and the small size, is: “large size > medium size > small size.” In the embodiment, the figure information corresponding to the large size is referred to as “figure information L,” the figure information corresponding to the medium size is referred to as “figure information M,” and the figure information corresponding to the small size is referred to as “figure information S.” When the figure information L, M, and S are not discriminated or these are collectively referred to, the figure information L, M, and S are referred to as “figure information.” Further, the storage device 32 stores doll data corresponding to the front of a doll, doll data corresponding to the back of a doll, and doll data corresponding to the side of a doll for each size. That is, in the embodiment based on the respective sizes, the large size, the medium size, and the small size, nine pieces of doll data in total are stored in the storage device 32.
When the front, the back, and the side regarding large size doll data are not discriminated or these are collectively referred to, the large size doll data is referred to as “doll data L.” The doll data L is associated with the figure information L. When the front, the back, and the side regarding medium size doll data are not discriminated or these are collectively referred to, the medium size doll data is referred to as “doll data M.” The doll data M is associated with the figure information M. When the front, the back, and the side regarding small size doll data are not discriminated or these are collectively referred to, the small size doll data is referred to as “doll data S.” The doll data S is associated with the figure information S. When the front, the back, and the side are not discriminated and the doll data L, M, and S are not discriminated or these are collectively referred to, these are referred to as “doll data.” When the front, the back, and the side of the doll data with the respective sizes, the large size, the medium size, and the small size, are discriminated, for example, using the doll data M as the example, the doll data is referred to as “front doll data M,” “back doll data M,” and “side doll data M.”
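The storage scheme described above amounts to a lookup keyed by figure size and view. The following is a minimal illustrative sketch only; the file names and the function name are hypothetical and not part of the embodiment, and a real store would hold bitmap data rather than strings.

```python
# Hypothetical sketch of the doll-data store: nine pieces of doll data,
# keyed by figure information (L/M/S) and view (front/back/side).
SIZES = ("L", "M", "S")
VIEWS = ("front", "back", "side")

# Placeholder entries; an actual storage device 32 would hold image data.
doll_data = {(size, view): f"doll_{size}_{view}.png"
             for size in SIZES for view in VIEWS}

def doll_image(figure_info: str, view: str) -> str:
    """Return the doll data associated with the given figure information and view."""
    return doll_data[(figure_info, view)]
```

For example, the “front doll data M” of the text corresponds here to the entry under the key `("M", "front")`.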
In the embodiment, the horizontal direction (see
The clothing data is image data corresponding to clothes with a predetermined shape. The storage device 32 stores a plurality of clothing data corresponding to clothes with different shapes. Examples of the shapes of clothes are: sleeveless, short sleeve, cut and sewn, shirt, blouse, one-piece, cardigan, vest, jacket, coat, skirt, and trousers. Further, these shapes of clothes include the respective shapes of subdivided clothes of an identical type. For example, among one-pieces, one-pieces with respective different shapes are regarded as clothes (one-pieces) with different shapes. In the embodiment, described are examples of clothes (one-pieces) with five kinds of shapes as illustrated in
When the front, the back, and the side regarding respective clothing data corresponding to clothes (one-pieces) with the respective shapes are not discriminated or these are collectively referred to, the clothing data corresponding to the clothes (one-pieces) with the respective shapes are each referred to as “clothing data A1,” “clothing data A2,” “clothing data A3,” “clothing data A4,” and “clothing data A5.” The clothing data A1 is associated with clothing information A1. The clothing data A2 is associated with clothing information A2. The clothing data A3 is associated with clothing information A3. The clothing data A4 is associated with clothing information A4. The clothing data A5 is associated with clothing information A5. The clothing information is information to identify the clothes with different shapes. When the front, the back, and the side are not discriminated and the clothing data A1, A2, A3, A4, and A5 are not discriminated or these are collectively referred to, these are referred to as “clothing data.” When the front, the back, and the side of the clothing data with the respective shapes are discriminated, for example, using the clothing data A3 as the example, these are referred to as “front clothing data A3,” “back clothing data A3,” and “side clothing data A3.” When the clothing information A1, A2, A3, A4, and A5 are not discriminated or these are collectively referred to, these are referred to as “clothing information.”
The pattern data is image data corresponding to a predetermined pattern. The storage device 32 stores a plurality of pattern data corresponding to different patterns. For example, a pattern can be defined by coloring and patterning. In this case, the previously mentioned different patterns mean that any one of the coloring and the patterning is different. Plain is one patterning. In the embodiment, described are examples of six kinds of patterns as illustrated in
The RAM 33 serves as a storage area when the CPU 31 executes the OS and the various programs stored in the storage device 32. The RAM 33 stores predetermined information and data in a predetermined storage area in the middle of the execution of the process. The RAM 33 may also store a plurality of doll data, a plurality of clothing data, and a plurality of pattern data. Note that, in the embodiment, the storage device 32 stores the plurality of doll data, the plurality of clothing data, and the plurality of pattern data as mentioned above. In the image processing device 30, the CPU 31 as the controller executes the OS and the various programs stored in the storage device 32. In association with this, the image processing device 30 executes various processes, and functions corresponding to the executed processes are achieved.
The communication unit 34 is a network interface compatible with the wired LAN communication method. The communication unit 34 connects the image processing device 30 to the network 70. The communication unit 34 executes data communications with the terminal 10. In the image processing device 30, the communication unit 34 receives the predetermined information and instruction transmitted from the terminal 10. In the image processing device 30, the communication unit 34 transmits the predetermined data to the terminal 10.
The output unit 35 and the input unit 36 are connection interfaces to connect a predetermined external device to the image processing device 30. The display device 40 is connected to the output unit 35. The shooting device 50 and the operating device 60 are connected to the input unit 36. In the image processing device 30, the output unit 35 outputs an image displayed in the display device 40 to the display device 40. The shot data from the shooting device 50 is input to the input unit 36. The operating device 60 connected to the input unit 36 accepts various instructions input to the image processing device 30. The operating device 60 includes a known keyboard and a known computer mouse. The output unit 35 and the input unit 36 are, for example, connection interfaces compatible with one or both of the connection methods of High-Definition Multimedia Interface (registered trademark) (HDMI) and Universal Serial Bus (USB). With the output unit 35 and the input unit 36 compatible with both HDMI (registered trademark) and USB, the output unit 35 and the input unit 36 include a connection interface compatible with HDMI (registered trademark) and a connection interface compatible with USB. The respective connections of the image processing device 30 with the display device 40, the shooting device 50, and the operating device 60 may also use different connection methods. For example, the image processing device 30 may connect with the respective display device 40 and shooting device 50 with the HDMI (registered trademark) connection method, and the image processing device 30 may connect with the operating device 60 with the USB connection method. The previously mentioned connection methods of the respective connections of the image processing device 30 with the display device 40, the shooting device 50, and the operating device 60 are merely examples.
For example, the connection between the image processing device 30 and the shooting device 50 may also have a USB connection method. When the connection of the image processing device 30 with the display device 40 is the wireless connection, the output unit 35 includes a predetermined interface for the wireless connection. When the connections of the image processing device 30 with the shooting device 50 and the operating device 60 are wireless connections, or, the connection of the image processing device 30 with the shooting device 50 or the operating device 60 is a wireless connection, the input unit 36 includes predetermined interfaces for the wireless connections.
The image processing device 30 differs from known information processing apparatuses in that the storage device 32 stores, as described below, the program for the output process (see
<Acceptance Process>
The acceptance process executed by the terminal 10 is described with reference to
The figure selection screen is a screen to select the figure of the wearer. The embodiment is based on the respective sizes, the large size, the medium size, and the small size. Therefore, as illustrated in
In the embodiment, the doll images L, M, and S are at the front of the doll. The doll image L is a reduced image corresponding to the front doll data L. A thumbnail image is exemplified as the reduced image. The doll image L is associated with the figure information L. The doll image M is a reduced image corresponding to the front doll data M. The doll image M is associated with the figure information M. The doll image S is a reduced image corresponding to the front doll data S. The doll image S is associated with the figure information S. When the doll images L, M, and S are not discriminated or these are collectively referred to, these are referred to as a “doll image.”
The doll image is selected, for example, by a tap operation to the operating unit 14 on the selected doll image. The shop assistant or the wearer executes the tap operation to the operating unit 14 on the doll image matching the figure of or having the closest figure to the wearer. When the operating unit 14 accepts the tap operation selecting the doll image, the CPU 11 obtains the figure information associated with the selection-target doll image. For example, suppose that the tap operation is executed to the operating unit 14 on the doll image M. In this case, the CPU 11 obtains the figure information M associated with the doll image M. When the figure information has been obtained (S13: Yes), the CPU 11 controls the transmission of the obtained figure information (S15). The CPU 11 outputs a transmission command for the figure information to the communication unit 16. In association with this, the communication unit 16 transmits the figure information to the image processing device 30.
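The flow of S13 and S15 above can be sketched as an event handler: a tap on a doll image yields the figure information associated with that image, which is then handed to the communication unit for transmission. All names here (`FIGURE_INFO`, `send_to_image_processor`, `on_doll_tap`) are illustrative stand-ins, not terms from the embodiment.

```python
# Hedged sketch of S13/S15: tapping a doll image obtains the associated
# figure information and transmits it to the image processing device.
FIGURE_INFO = {"doll_L": "L", "doll_M": "M", "doll_S": "S"}

sent = []  # records what the stand-in communication unit has transmitted

def send_to_image_processor(info):
    # Stand-in for the communication unit 16 transmitting over the network 70.
    sent.append(info)

def on_doll_tap(doll_image_id):
    figure_info = FIGURE_INFO[doll_image_id]  # S13: obtain figure information
    send_to_image_processor(figure_info)      # S15: control its transmission
    return figure_info
```

In the example from the text, a tap on the doll image M would call `on_doll_tap("doll_M")` and transmit the figure information M.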
After the execution of S15, the CPU 11 controls the display of the clothing selection screen (S17). The CPU 11 outputs a display command for the clothing selection screen to the display unit 15. In association with this, the display unit 15 displays the clothing selection screen (see
The clothing selection screen is a screen to select the shape of the clothes that the wearer attempts to purchase. The embodiment is based on the clothes with five kinds of shapes. Therefore, as illustrated in
In the embodiment, the clothing images A1, A2, A3, A4, and A5 are at the front of the clothes. The clothing image A1 is a reduced image corresponding to the clothing data A1. The clothing image A1 is associated with the clothing information A1. The clothing image A2 is a reduced image corresponding to the clothing data A2. The clothing image A2 is associated with the clothing information A2. The clothing image A3 is a reduced image corresponding to the clothing data A3. The clothing image A3 is associated with the clothing information A3. The clothing image A4 is a reduced image corresponding to the clothing data A4. The clothing image A4 is associated with the clothing information A4. The clothing image A5 is a reduced image corresponding to the clothing data A5. The clothing image A5 is associated with the clothing information A5. When the clothing images A1, A2, A3, A4, and A5 are not discriminated or these are collectively referred to, these are referred to as a clothing image.
The clothing image is selected, for example, by the tap operation to the operating unit 14 on the selected clothing image. The shop assistant or the wearer executes the tap operation to the operating unit 14 on the clothing image of the clothes with the shape that the wearer attempts to purchase. The clothing selection screen includes a return button 18 (see
When the operating unit 14 accepts the tap operation to select the clothing image, the CPU 11 obtains the clothing information associated with the selection-target clothing image. For example, suppose that the tap operation is executed to the operating unit 14 on the clothing image A3. In this case, the CPU 11 obtains the clothing information A3 associated with the clothing image A3. When the clothing information has been obtained (S19: Yes), the CPU 11 controls the transmission of the obtained clothing information (S21). The CPU 11 outputs a transmission command for the clothing information to the communication unit 16. In association with this, the communication unit 16 transmits the clothing information to the image processing device 30.
After the execution of S21, the CPU 11 controls the display of the pattern selection screen (S23). The CPU 11 outputs a display command for the pattern selection screen to the display unit 15. In association with this, the display unit 15 displays the pattern selection screen (see
The pattern selection screen is a screen to select the pattern of the clothes that the wearer attempts to purchase. The embodiment is based on six kinds of patterns. Therefore, as illustrated in
The pattern image B1 is a reduced image corresponding to the pattern data B1. The pattern image B1 is associated with the pattern information B1. The pattern image B2 is a reduced image corresponding to the pattern data B2. The pattern image B2 is associated with the pattern information B2. The pattern image B3 is a reduced image corresponding to the pattern data B3. The pattern image B3 is associated with the pattern information B3. The pattern image B4 is a reduced image corresponding to the pattern data B4. The pattern image B4 is associated with the pattern information B4. The pattern image B5 is a reduced image corresponding to the pattern data B5. The pattern image B5 is associated with the pattern information B5. The pattern image B6 is a reduced image corresponding to the pattern data B6. The pattern image B6 is associated with the pattern information B6. When the pattern images B1, B2, B3, B4, B5, and B6 are not discriminated or these are collectively referred to, these are referred to as a “pattern image.”
The pattern image is selected, for example, by the tap operation to the operating unit 14 on the selected pattern image. To display the pattern images, like the pattern images B3 and B6 illustrated in
The shop assistant or the wearer executes the tap operation to the operating unit 14 on the pattern image with the pattern that the wearer likes. The pattern selection screen includes the return button 18 (see
When the operating unit 14 accepts the tap operation selecting the pattern image, the CPU 11 obtains the pattern information associated with the selection-target pattern image. For example, suppose that the tap operation is executed to the operating unit 14 on the pattern image B1. In this case, the CPU 11 obtains the pattern information B1 associated with the pattern image B1. When the pattern information has been obtained (S25: Yes), the CPU 11 controls the transmission of the obtained pattern information (S27). The CPU 11 outputs a transmission command for the pattern information to the communication unit 16. In association with this, the communication unit 16 transmits the pattern information to the image processing device 30.
After the execution of S27, the CPU 11 determines whether the first composite data has been obtained (S29). The image processing device 30 transmits the first composite data obtained at S29 at S123 in
The first front image is a composite image corresponding to the front of a doll who wears clothes with a pattern described as follows. The first back image is a composite image corresponding to the back of a doll who wears clothes with a pattern described as follows. The first side image is a composite image corresponding to the side of a doll who wears clothes with a pattern described as follows. The side of the doll is the side on the side identical to the side of the wearer shot by the shooting device 50. In the embodiment, the shot target by the shooting device 50 is a right side of the wearer. Therefore, the first side image is a composite image corresponding to the right side of the doll. The previously mentioned pattern is a pattern corresponding to the pattern data associated with the pattern information transmitted at S27. The previously mentioned clothes are clothes with the predetermined shape corresponding to the clothing data associated with the clothing information transmitted at S21. The previously mentioned doll is a doll with the predetermined figure corresponding to the doll data associated with the figure information transmitted at S15. In the embodiment, when the first front image, the first back image, and the first side image are not discriminated or these are collectively referred to, these are referred to as a “first composite image.” It can also be said that the first composite data is composite data corresponding to the first composite image. After the execution of S29, the CPU 11 transitions the process to S31 in
At S31, the CPU 11 controls the display of the confirmation screen. An initial display target in the display of the confirmation screen is made the first front image. That is, the CPU 11 controls the display of the confirmation screen including the first front image in a display area R. The CPU 11 creates the first front image from the first front data stored in the RAM 13 and outputs a display command for the confirmation screen including this to the display unit 15. In association with this, the display unit 15 displays the previously mentioned confirmation screen (see
In addition to the display area R, the confirmation screen includes a front button 19, a back button 20, a side button 21, a reselect button 22, a confirm button 23, and an end button 24. The front button 19 is made to correspond to a front switching instruction that switches the first back image or the first side image displayed in the display area R to the first front image. The back button 20 is made to correspond to a back switching instruction that switches the first front image or the first side image displayed in the display area R to the first back image. The side button 21 is made to correspond to a side switching instruction that switches the first front image or the first back image displayed in the display area R to the first side image. Regarding the front button 19, the back button 20, and the side button 21, the button corresponding to the first composite image currently in display may be hidden or grayed out so as to enter a state in which it does not become the target for the tap operation. For example, when the first front image is displayed in the display area R, the front button 19 is made grayed-out. In
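The gray-out rule described above, that the button for the view currently shown in the display area R is disabled while the other two remain tappable, can be sketched as follows. The function and state names are hypothetical.

```python
# Illustrative sketch of the gray-out rule on the confirmation screen:
# the button matching the first composite image in display is disabled.
BUTTONS = ("front", "back", "side")

def button_states(current_view):
    """Return the enabled/grayed-out state of each switching button."""
    return {b: ("grayed_out" if b == current_view else "enabled")
            for b in BUTTONS}
```

For example, while the first front image is displayed, `button_states("front")` grays out the front button and leaves the back and side buttons enabled.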
The reselect button 22 is made to correspond to a reselection instruction that reselects the shape and the pattern of the clothes. The confirm button 23 is made to correspond to a confirmation instruction that confirms the design of the clothes according to each piece of information (see S15, S21, and S27 in
Suppose that the tap operation is executed to the operating unit 14 on the front button 19 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the front switching instruction. Suppose that the tap operation is executed to the operating unit 14 on the back button 20 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the back switching instruction. Suppose that the tap operation is executed to the operating unit 14 on the side button 21 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the side switching instruction. Suppose that the tap operation is executed to the operating unit 14 on the reselect button 22 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the reselection instruction. Suppose that the tap operation is executed to the operating unit 14 on the confirm button 23 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the confirmation instruction. Suppose that the tap operation is executed to the operating unit 14 on the end button 24 and the operating unit 14 accepts this tap operation. In this case, the CPU 11 obtains the end instruction.
After the execution of S31, the CPU 11 determines whether the front switching instruction has been obtained (S33). As illustrated in
When the front switching instruction has been obtained (S33: Yes), the CPU 11 controls the transmission of the front switching instruction (S35). The CPU 11 outputs a transmission command for the front switching instruction to the communication unit 16. In association with this, the communication unit 16 transmits the front switching instruction to the image processing device 30. Subsequently, the CPU 11 controls the display of the first front image (S37). The CPU 11 creates the first front image from the first front data stored in the RAM 13. Subsequently, the CPU 11 outputs, to the display unit 15, a display command that switches the display in the display area R on the confirmation screen to the first front image. In association with this, the display unit 15 displays the first front image in the display area R (see
When the front switching instruction has not been obtained (S33: No), the CPU 11 determines whether the back switching instruction has been obtained (S39). As illustrated in
When the back switching instruction has been obtained (S39: Yes), the CPU 11 controls the transmission of the back switching instruction (S41). The CPU 11 outputs a transmission command for the back switching instruction to the communication unit 16. In association with this, the communication unit 16 transmits the back switching instruction to the image processing device 30. Subsequently, the CPU 11 controls the display of the first back image (S43). The CPU 11 creates the first back image from the first back data stored in the RAM 13. Subsequently, the CPU 11 outputs, to the display unit 15, a display command that switches the display in the display area R on the confirmation screen to the first back image. In association with this, the display unit 15 displays the first back image in the display area R (see
When the back switching instruction has not been obtained (S39: No), the CPU 11 determines whether the side switching instruction has been obtained (S45). As illustrated in
When the side switching instruction has been obtained (S45: Yes), the CPU 11 controls the transmission of the side switching instruction (S47). The CPU 11 outputs a transmission command for the side switching instruction to the communication unit 16. In association with this, the communication unit 16 transmits the side switching instruction to the image processing device 30. Subsequently, the CPU 11 controls the display of the first side image (S49). The CPU 11 creates the first side image from the first side data stored in the RAM 13. Subsequently, the CPU 11 outputs, to the display unit 15, a display command that switches the display in the display area R on the confirmation screen to the first side image. In association with this, the display unit 15 displays the first side image in the display area R (see
When the side switching instruction has not been obtained (S45: No), the CPU 11 transitions the process to S51 in
When the reselection instruction has not been obtained (S51: No), the CPU 11 determines whether the confirmation instruction has been obtained (S55). When the confirmation instruction has been obtained (S55: Yes), the CPU 11 controls the transmission of the confirmation instruction (S57). The CPU 11 outputs a transmission command for the confirmation instruction to the communication unit 16. In association with this, the communication unit 16 transmits the confirmation instruction to the image processing device 30. Afterwards, the CPU 11 ends the acceptance process. Note that after the confirmation instruction has been obtained (see S55: Yes) and the confirmation instruction has been transmitted at S57, the CPU 11 may also return the process to S17 in
When the confirmation instruction has not been obtained (S55: No), the CPU 11 determines whether the end instruction has been obtained (S59). When the end instruction has not been obtained (S59: No), the CPU 11 returns the process to S33 in
<Output Process>
The output process executed by the image processing device 30 is described with reference to
The CPU 31 that has started the output process obtains the shot data (S71). For example, the shop assistant instructs the wearer to stand before the shooting device 50 in a state of facing the front with respect to the shooting device 50. The shooting device 50 shoots the front of the wearer. The shooting device 50 outputs shot data corresponding to a shot image including the front of the wearer. Next, the shop assistant instructs the wearer to stand before the shooting device 50 in a state of facing the back with respect to the shooting device 50. The shooting device 50 shoots the back of the wearer. The shooting device 50 outputs shot data corresponding to a shot image including the back of the wearer. Finally, the shop assistant instructs the wearer to stand before the shooting device 50 in a state of facing the side with respect to the shooting device 50. The shooting device 50 shoots the side of the wearer. The shooting device 50 outputs shot data corresponding to a shot image including the side of the wearer. The CPU 31 obtains the respective front, back, and side shot data output from the shooting device 50 via the input unit 36. The obtained respective front, back, and side shot data are stored in the RAM 33.
Next, the CPU 31 determines whether the figure information has been obtained (S73). The terminal 10 transmits the figure information at S15 in
After the execution of S75, the CPU 31 determines whether the clothing information has been obtained (S77). The terminal 10 transmits the clothing information at S21 in
After the execution of S79, the CPU 31 determines whether the pattern information has been obtained (S81). The terminal 10 transmits the pattern information at S27 in
After the execution of S83, the CPU 31 combines the clothing data and the pattern data to create intermediate data (S85). For example, suppose that the respective front, back, and side clothing data A3 are obtained and stored in the RAM 33 at S79. Suppose that the pattern data B1 has been obtained and stored in the RAM 33 at S83. In this case, the CPU 31 combines the front clothing data A3 and the pattern data B1 to create front intermediate data. The front intermediate data is composite data corresponding to the composite image illustrated as “front” in
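One plausible reading of the combination at S85 is mask-based compositing: inside the silhouette given by the clothing data, the pattern pixel is taken; outside it, the pixel stays transparent. The sketch below assumes this interpretation (the embodiment does not specify the compositing algorithm), and uses nested lists in place of real bitmap data.

```python
# A minimal sketch of S85, assuming the clothing data acts as a mask for
# the pattern data. Real clothing/pattern data would be image bitmaps.
TRANSPARENT = None

def combine(clothing_mask, pattern):
    """clothing_mask: 2D booleans (True inside the clothes' shape);
    pattern: 2D pixels of the same dimensions (e.g. a tiled pattern).
    Returns intermediate data: the pattern clipped to the clothes' shape."""
    height = len(clothing_mask)
    width = len(clothing_mask[0])
    return [
        [pattern[y][x] if clothing_mask[y][x] else TRANSPARENT
         for x in range(width)]
        for y in range(height)
    ]
```

Under this reading, combining the front clothing data A3 with the pattern data B1 would clip the B1 pattern to the silhouette of the A3 one-piece, yielding the front intermediate data.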
Next, the CPU 31 starts the first composition process (S87). The first composition process creates the first composite data and transmits the first composite data to the terminal 10. The first composite data created by the first composition process is first front data, first back data, and first side data. The first composition process will be described below. Subsequently, the CPU 31 starts the second composition process (S89). The second composition process creates the second composite data and stores the second composite data in the RAM 33. The second composite data created by the second composition process is second front data, second back data, and second side data. In the embodiment, the second composite data is a name of the composite data used when the second front data, the second back data, and the second side data are not discriminated or these are collectively referred to. The second front data is composite data corresponding to the second front image. The second back data is composite data corresponding to the second back image. The second side data is composite data corresponding to the second side image.
The second front image is a composite image corresponding to the front of the following doll who wears clothes with a pattern corresponding to the front intermediate data created at S85 in
At S91, the CPU 31 determines whether both the first composition process started at S87 and the second composition process started at S89 have ended. The end of the first composition process is determined by, for example, ON or OFF of a first composite flag. The CPU 31 turns ON the first composite flag at S87 and turns OFF the first composite flag in association with the end of the first composition process. The end of the second composition process is determined by, for example, ON or OFF of a second composite flag. The CPU 31 turns ON the second composite flag at S89 and turns OFF the second composite flag in association with the end of the second composition process. Suppose that at least one of the first composite flag and the second composite flag is ON. In this case, at least one of the first composition process and the second composition process is in execution, and S91 is denied. When S91 is denied (S91: No), the CPU 31 repeatedly executes this determination.
In contrast to this, suppose that both the first composite flag and the second composite flag are OFF. In this case, both the first composition process and the second composition process are ended, and S91 is affirmed. When S91 is affirmed (S91: Yes), the CPU 31 controls the output of the second composite image (S93). The initial output target in the output of the second composite image is made the second front image. The CPU 31 creates the second front image from the second front data stored in the RAM 33 at S135 in
After the execution of S93, the CPU 31 determines whether the front switching instruction has been obtained (S95). The terminal 10 transmits the front switching instruction at S35 in
When the front switching instruction has been obtained (S95: Yes), the CPU 31 controls the output of the second front image (S97). The CPU 31 creates the second front image from the second front data stored in the RAM 33 at S135 in
When the front switching instruction has not been obtained (S95: No), the CPU 31 determines whether the back switching instruction has been obtained (S99). The terminal 10 transmits the back switching instruction at S41 in
When the back switching instruction has been obtained (S99: Yes), the CPU 31 controls the output of the second back image (S101). The CPU 31 creates the second back image from the second back data stored in the RAM 33 at S135 in
When the back switching instruction has not been obtained (S99: No), the CPU 31 determines whether the side switching instruction has been obtained (S103). The terminal 10 transmits the side switching instruction at S47 in
When the side switching instruction has been obtained (S103: Yes), the CPU 31 controls the output of the second side image (S105). The CPU 31 creates the second side image from the second side data stored in the RAM 33 at S135 in
When the side switching instruction has not been obtained (S103: No), the CPU 31 determines whether the reselection instruction has been obtained (S107). The terminal 10 transmits the reselection instruction at S53 in
When the reselection instruction has not been obtained (S107: No), the CPU 31 transitions the process to S109 in
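The chain of determinations at S95, S99, S103, and S107 amounts to dispatching on whichever switching instruction was received. A minimal sketch, with illustrative instruction names and return strings (the actual instructions are transmitted by the terminal 10 as binary messages not specified here):

```python
def handle_instruction(instruction):
    """Dispatch a received instruction to the corresponding output control.

    Mirrors the determinations at S95 (front), S99 (back), S103 (side),
    and S107 (reselection); any other instruction falls through to S109.
    """
    dispatch = {
        "front": "output second front image",    # S97
        "back": "output second back image",      # S101
        "side": "output second side image",      # S105
        "reselect": "return to selection",       # S107: Yes
    }
    # S107: No -> transition the process to S109
    return dispatch.get(instruction, "proceed to S109")

print(handle_instruction("back"))    # -> output second back image
```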
For example, suppose that the confirmation instruction obtained this time is a confirmation instruction transmitted from the terminal 10 based on a tap operation on the confirm button 23 via the operating unit 14 while the display unit 15 and the display device 40 are in the following display states. The previously mentioned states are, for example, the states in which the display unit 15 displays the confirmation screen including the first front image illustrated in
The clothing production apparatus 81, which constitutes the clothing production system 80 together with the clothing design display system 1, produces the clothes. That is, the clothing production apparatus 81 produces the clothes represented in the first composite image and the second composite image. The clothing production apparatus 81 includes, for example, a printing device 82, a cutting device 83, and a sewing device 84 (see
When the confirmation instruction has not been obtained (S109: No), the CPU 31 determines whether the end instruction has been obtained (S113). The terminal 10 transmits the end instruction at S61 in
<First Composition Process>
The first composition process started at S87 (see
In this case, the CPU 31 combines the front doll data M and the above-mentioned front intermediate data to create the first front data corresponding to the first front image illustrated in
Next, the CPU 31 controls the transmission of the first composite data created at S121 (S123). The CPU 31 outputs a transmission command for the first composite data to the communication unit 34. In association with this, the communication unit 34 transmits the first composite data to the terminal 10. That is, at S123, the CPU 31 outputs the transmission commands for the first front data, first back data, and first side data to the communication unit 34. In association with this, the communication unit 34 transmits the first front data, the first back data, and the first side data to the terminal 10. Afterwards, the CPU 31 ends the first composition process.
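The creation of the first composite data at S121 can be sketched as layering the clothing intermediate data over the doll data M for each of the three views. The pixel lists below are toy stand-ins for the actual image data, and `None` marks transparency; this is an assumption for illustration, not the device's composition technique.

```python
def combine(doll_pixels, clothes_pixels):
    """Overlay clothing pixels on doll pixels; None means transparent."""
    return [c if c is not None else d
            for d, c in zip(doll_pixels, clothes_pixels)]

def first_composition(doll, intermediate):
    """S121: create the first front/back/side data from the doll data M
    and the intermediate data for each surface."""
    return {view: combine(doll[view], intermediate[view])
            for view in ("front", "back", "side")}

# Toy data: 4-pixel strips per view.
doll = {v: ["skin"] * 4 for v in ("front", "back", "side")}
intermediate = {v: [None, "pattern", "pattern", None]
                for v in ("front", "back", "side")}

first_composite = first_composition(doll, intermediate)
# S123: each view's data would then be transmitted to the terminal 10.
for view, data in first_composite.items():
    print(view, data)
```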
<Second Composition Process>
The second composition process started at S89 (see
In the embodiment, the head data extracted from the front shot data is referred to as “front head data,” and the head image corresponding to the front head data is referred to as a “front head image.” The head data extracted from the back shot data is referred to as “back head data,” and the head image corresponding to the back head data is referred to as a “back head image.” The head data extracted from the side shot data is referred to as “side head data,” and the head image corresponding to the side head data is referred to as a “side head image.” When the respective front, back, and side head data are not discriminated, or when they are collectively referred to, they are referred to as “head data.” When the respective front, back, and side head images are not discriminated, or when they are collectively referred to, they are referred to as a “head image.”
Next, the CPU 31 specifies color information of the skin of the wearer included in the head image from the head data (S133). The head data to be the process target is a part of or all of the respective front, back, and side head data stored in the RAM 33. For example, the front head data, or the respective front and side head data, is the process target. The color information is specified targeting a predetermined region in the head image. For example, the color information is specified targeting the face or a specific region of the face; as the specific region of the face, a cheek is exemplified. To specify the color of the skin, the CPU 31 executes image analysis on the head data to detect the region to be the specification target. Subsequently, the CPU 31 specifies the color information of the skin from the color information at the detected region. When there are multiple specification targets for the color information, an average value of the color information specified from the respective regions may also be specified as the color information of the skin. For example, suppose that the color information is specified as an RGB value. In this case, the average value of the color information is obtained by, for example, averaging the respective R, G, and B values at each region. Besides, the color information may also be individually specified from the respective front, back, and side head data. Regarding the specification of the color information from the back head data, for example, when an ear is included in the back head image, the color information is specified using the ear as the specification target. The specified color information is stored in the RAM 33.
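The averaging of RGB values over the detected regions at S133 can be sketched as follows. The region values are illustrative; the actual regions come from image analysis of the head data, which is not reproduced here.

```python
def average_skin_color(regions):
    """S133: average the RGB color information specified from each
    detected region (e.g., left and right cheeks) to obtain the
    color information of the skin."""
    n = len(regions)
    r = sum(c[0] for c in regions) // n
    g = sum(c[1] for c in regions) // n
    b = sum(c[2] for c in regions) // n
    return (r, g, b)

# Hypothetical color information specified from two cheek regions.
cheeks = [(228, 182, 160), (222, 178, 152)]
print(average_skin_color(cheeks))  # -> (225, 180, 156)
```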
After the execution of S133, the CPU 31 creates the second composite data (S135). The created second composite data is stored in the RAM 33. For example, suppose that the front doll data M, the back doll data M, and the side doll data M are obtained and stored in the RAM 33 at S75 in
In this case, the CPU 31 combines the front doll data M, the above-mentioned front intermediate data, and the front head data. Then, the CPU 31 sets the head as the front head image corresponding to the front head data. Additionally, the CPU 31 sets the parts of the doll, excluding the head, that are not covered by the worn clothes to the color of the color information stored in the RAM 33. That is, the CPU 31 creates the second front data corresponding to the second front image illustrated in
Suppose that the color information has been specified for each of the front, back, and side. In this case, the front color information is used to combine the second front data, the back color information is used to combine the second back data, and the side color information is used to combine the second side data. When the back color information is not specified, the front or side color information, or an average value of the respective front and side color information, is used instead. Prior to the combination of the doll data, the intermediate data, and the shot data, the doll data may also be made to have the color of the color information stored in the RAM 33. In this case, the above-mentioned composition process is executed using the doll data that has been given that color, and the second composite data corresponding to the second composite image can likewise be created. In the second composite image, the parts of the doll, excluding the head, that are not covered by the worn clothes have the color of the color information stored in the RAM 33. A known composition technique is employed at S135, so descriptions of the composition process executed at S135 are omitted. After the execution of S135, the CPU 31 ends the second composition process.
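The per-view selection of color information, including the fallback used when no back color could be specified, can be sketched as below. The dictionary keys and the channel-wise averaging are assumptions for illustration; the patent only states that front or side color information, or their average, substitutes for a missing back color.

```python
def select_color(view, colors):
    """Choose the color information used to combine each view's data.

    colors maps 'front'/'back'/'side' to an (R, G, B) tuple or None.
    For the back view, when no back color was specified, fall back to
    the average of whichever front and side colors are available.
    """
    c = colors.get(view)
    if c is not None:
        return c
    if view == "back":
        available = [colors[v] for v in ("front", "side")
                     if colors.get(v) is not None]
        if available:
            return tuple(sum(channel) // len(available)
                         for channel in zip(*available))
    return None

colors = {"front": (225, 180, 156), "back": None, "side": (221, 176, 150)}
print(select_color("back", colors))  # -> (223, 178, 153)
```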
<Advantageous Effects of Embodiment>
According to the embodiment, the following advantageous effects can be obtained.
(1) The clothing design display system 1 includes the terminal 10, the image processing device 30, and the display device 40. The terminal 10 executes the acceptance process illustrated in
Therefore, the display device 40 can display the second composite image (see
By configuring the second composite image in the state in which the doll with the predetermined figure wears the clothes with the predetermined pattern and the predetermined shape, the wearer can recognize an impression of the newly-designed clothes. For example, suppose the newly-designed clothes are sleeveless or have short sleeves with the shape like the clothing information A1, A2, A3, and A5 illustrated in
(2) The clothing design display system 1 includes the shooting device 50. The shooting device 50 shoots the wearer. In the output process, the shot data is obtained (see S71 in
(3) In the output process, after the intermediate data is created at S85 in
<Modifications>
The embodiment can also be configured as follows. Some configurations of modifications illustrated below may also be employed in combination as appropriate. In the following description, points different from the above description are described, and the description of similar points is omitted as appropriate.
(1) The surfaces to be the process target are the front, the back, and the side, and the right side is used as the side. Both the right and left sides may also be the process target as the side. In this case, the above-described processes targeting the side are executed for both the targeted right side and left side. The storage device 32 stores the doll data and the clothing data corresponding to the respective right and left sides. The confirmation screens (see
(2) The acceptance process illustrated in
(3) In the figure selection screen illustrated in
(4) In the acceptance process illustrated in
(5) In the output process illustrated in
When the first composition process is omitted, after S135 in
Besides, both the first composition process and the second composition process may also be omitted. In this case, the doll data may also be omitted. In the output process (see
(6) In the second composition process illustrated in
(7) The clothing design display system 1 includes the terminal 10, the image processing device 30, and the display device 40 (see
The CPU 91 corresponds to the CPUs 11 and 31. The CPU 91 is a controller that executes the arithmetic processes and controls the clothing design output system 2, which is constituted of one information processing apparatus. The storage device 92 corresponds to the storage devices 12 and 32. The storage device 92 stores the OS and various programs. The programs stored in the storage device 92 include the program for the acceptance process (see
The RAM 93 corresponds to the RAMs 13 and 33. The RAM 93 serves as a storage area when the CPU 91 executes the OS and the various programs stored in the storage device 92. The RAM 93 stores predetermined information and data in a predetermined storage area in the middle of the execution of the process. The RAM 93 may also store the plurality of doll data, the plurality of clothing data, and the plurality of pattern data. In the clothing design output system 2, the CPU 91 as the control unit executes the OS and the various programs stored in the storage device 92. In association with this, the clothing design output system 2 executes various processes, and functions corresponding to the executed processes are achieved.
The operating unit 94 corresponds to the operating unit 14. The display unit 95 corresponds to the display unit 15. In the clothing design output system 2 illustrated in
The output unit 97 corresponds to the output unit 35. The input unit 98 corresponds to the input unit 36. In the clothing design output system 2 illustrated in
In the clothing design output system 2 illustrated in
Inventors: Hashimoto, Junichi; Nagata, Kozo; Ogata, Norihiro; Kawabata, Toshiro
Assignment: filed Mar 29, 2016 by Seiren Co., Ltd. (assignment on the face of the patent). On Sep 19, 2017, Ogata, Norihiro; Hashimoto, Junichi; Kawabata, Toshiro; and Nagata, Kozo each executed an assignment of assignors' interest to SEIREN CO., LTD. (Reel 043718, Frame 0107).