Protective headgear includes, for example, a headgear body that is wearable on a head of a baseball umpire. At least one sensor is configured to generate imaging data that includes a batter in proximity to home plate. A processor analyzes the imaging data to determine a strike zone of the batter and to generate augmented reality data in response thereto. A display device generates a heads up display of the strike zone that is viewable to the baseball umpire during a pitch to the batter, based on the augmented reality data. Other embodiments are disclosed.
1. Protective headgear comprising:
a headgear body that is wearable on a head of a baseball umpire;
at least one sensor, coupled to the headgear body, configured to generate imaging data that includes a visual image of a batter in proximity to home plate, wherein a uniform worn by the batter includes a plurality of visible targets;
a processor, coupled to the sensor, that operates in conjunction with a memory and is configured to analyze the imaging data via a pattern recognition application to determine a strike zone of the batter by determining corresponding positions of the plurality of visible targets of the uniform worn by the batter and to generate augmented reality data in response thereto, wherein the uniform worn by the batter has pants that cover a kneecap of the batter and a jersey that covers shoulders of the batter, wherein the corresponding positions of the plurality of visible targets of the uniform are located on a bottom of the kneecap, a top of the shoulders and a top of the pants, and wherein the processor determines the strike zone of the batter further by determining a region above the home plate between the bottom of the kneecap and the midpoint between the top of the pants and the top of the shoulders; and
a display device, coupled to the headgear body and the processor, configured to generate a heads up display of the strike zone that is viewable to the baseball umpire during a pitch to the batter, based on the augmented reality data.
2. The protective headgear of
3. The protective headgear of
4. The protective headgear of
5. The protective headgear of
6. The protective headgear of
7. The protective headgear of
8. The protective headgear of
9. The protective headgear of
10. The protective headgear of
The present disclosure relates to protective headgear used in baseball, such as an umpire's mask or helmet.
One of the duties of the umpire during a baseball game is to call pitches as either balls or strikes, in accordance with a strike zone. Major League Baseball has established rules that define the strike zone for this purpose. While rules for the strike zone have changed many times over the years, current Rule 2.00 of Major League Baseball defines the strike zone as that area over home plate 110, the upper limit of which is a horizontal line at the midpoint 118 between the top of the shoulders 116 and the top of the uniform pants 114, and the lower level of which is a line at the hollow beneath the kneecap 112. As defined, the strike zone is not only dependent on the particular player that is batting, but also on the batting stance employed by that player.
In various embodiments, protective headgear 102 such as a mask or helmet, is wearable on the head of the umpire 104 to not only protect the umpire from injury from a pitch that may not be caught by the catcher 106, but also to provide a heads up display of the strike zone that is viewable to the umpire during the pitch. In particular, the heads up display of the strike zone can present a virtual strike zone. This virtual strike zone eliminates the need for the umpire to determine the correct strike zone boundaries for each batter. This enables the umpire to make better calls by allowing the umpire to focus his or her attention on the relationship of the pitch to the strike zone.
Further details regarding the protective headgear 102, including many optional functions and features, are presented in conjunction with
While a particular architecture is shown that includes bus 208, other architectures including two or more buses and/or direct connectivity between elements can likewise be implemented. Further, the protective headgear can include one or more additional elements, including a power source, user interface and/or other elements not expressly shown.
In various embodiments the sensor(s) 204 include visual imaging sensors such as a charge coupled device, digital video camera or other digital imaging sensor that captures visual images of the batter and of home plate. While a single visual imaging sensor can be employed, two or more sensors 204 can be employed in order to generate a stereoscopic image, a three-dimensional image or other image that provides a depth of field and allows for more accurate strike zone determination.
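The depth of field obtained from two such sensors follows the standard stereo triangulation relation; the sketch below is illustrative only (the focal length and baseline values are assumptions, not taken from the disclosure):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (meters) of a point from its horizontal disparity between two
    rectified cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Example: 800 px focal length, 10 cm baseline, 40 px disparity -> 2 m depth
print(stereo_depth(800.0, 0.10, 40.0))  # 2.0
```

A larger baseline between the two sensors 204 increases disparity for a given depth, which is why two spatially separated sensors allow a more accurate strike zone determination than a single image.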
The processor 200 can execute a pattern recognition application that identifies points in the image, such as the boundaries of home plate, the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter that can be used to generate the boundaries of the strike zone. While this analysis can be performed directly on an image that contains a batter and home plate, in some embodiments, the pattern recognition application can be aided by modifications to the uniform of the batter. For example, the uniform of the batter can include a plurality of visible targets of a known configuration that are placed at key points on the uniform, such as the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter that can aid in the determination of these key points by the pattern recognition application. In particular, the processor 200 can analyze the imaging data from the sensor(s) 204 to determine the strike zone of the batter, based on the corresponding positions of these visible targets.
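Once the key points are located, the vertical limits of the strike zone reduce to simple arithmetic on the detected heights. A minimal sketch, assuming the heights are expressed in meters above the plate (the function name and example values are illustrative):

```python
def strike_zone_limits(shoulder_top: float, pants_top: float, kneecap_bottom: float):
    """Return (lower, upper) vertical limits of the strike zone:
    upper limit = midpoint between top of the pants and top of the shoulders,
    lower limit = bottom (hollow) of the kneecap."""
    upper = (shoulder_top + pants_top) / 2.0
    lower = kneecap_bottom
    return lower, upper

# Example: shoulders at 1.40 m, top of pants at 1.00 m, kneecap at 0.45 m
print(strike_zone_limits(1.40, 1.00, 0.45))  # (0.45, 1.2)
```

Because the limits depend only on the detected target positions, the zone automatically adapts to each batter and to each batting stance, as the rule requires.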
The use of multiple sensors 204 allows the processor 200 to construct a three-dimensional region corresponding to the strike zone and to track the trajectory of the ball, in order to determine whether the ball intersects the three-dimensional region at any point and thereby call a ball or strike more accurately.
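Treating the strike zone as an axis-aligned three-dimensional box above the plate, the ball/strike determination can be sketched as a check of whether any sampled point of the pitch trajectory falls inside that box. The plate dimensions and coordinate frame below are illustrative assumptions, not taken from the disclosure:

```python
from typing import Iterable, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in meters

def is_strike(trajectory: Iterable[Point],
              x_lim=(-0.22, 0.22),   # assumed half-width of home plate
              y_lim=(0.0, 0.43),     # assumed front-to-back depth of the plate
              z_lim=(0.45, 1.20)) -> bool:
    """True if any sampled trajectory point lies inside the 3-D strike zone box."""
    return any(x_lim[0] <= x <= x_lim[1] and
               y_lim[0] <= y <= y_lim[1] and
               z_lim[0] <= z <= z_lim[1]
               for (x, y, z) in trajectory)

# A pitch whose middle sample crosses the heart of the zone:
print(is_strike([(0.3, -0.5, 1.5), (0.05, 0.2, 0.9), (0.0, 1.0, 0.7)]))  # True
```

In practice the trajectory samples would come from the stereoscopic image data, and the box limits from the per-batter strike zone determination described above.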
It should be noted that the image data is generated and analyzed on a real-time basis to provide a real-time display of a virtual strike zone that is constantly updated based on the position of the umpire, the orientation of his or her head, and the stance of the batter. In various embodiments, the pattern recognition application further operates to recognize the presence or absence of a batter in a batting stance and only generates the virtual strike zone display in circumstances where the batter is in a batting stance in preparation for a pitch. In this fashion, the heads-up display of the strike zone only appears when needed and does not encumber the vision of the umpire during other calls, such as during a tag or other play at the plate or other call by the home plate umpire.
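The gating behavior described above can be expressed as a simple predicate over the pattern recognition output; the names below are illustrative assumptions about how such a classifier might report its result:

```python
def hud_overlay(batter_detected: bool, in_batting_stance: bool, zone_outline):
    """Return the strike-zone outline only when a batter is set for a pitch;
    otherwise return None so the umpire's view stays unencumbered."""
    if batter_detected and in_batting_stance:
        return zone_outline
    return None

print(hud_overlay(True, True, "outline"))   # outline
print(hud_overlay(True, False, "outline"))  # None (batter not set)
```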
While the foregoing description has focused on the use of visual imaging sensors, other imaging technologies can be used as well. In various other embodiments, the sensors 204 can include microwave or millimeter wave devices that transmit a microwave or millimeter wave signal and capture image data in the form of a reflected millimeter wave or microwave image of the batter and home plate. While this analysis can also be performed directly on an image that contains a batter and home plate, in some embodiments, the pattern recognition application can also be aided by modifications to the uniform of the batter. For example, the uniform of the batter can include a plurality of microwave or millimeter wave reflectors that are placed at key points on the uniform, such as the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter, that can aid in the determination of these key points by the pattern recognition application. In a similar fashion, home plate and the baseball can also be equipped with one or more millimeter wave or microwave reflectors or surfaces that aid the pattern recognition application in determining the strike zone of the batter, based on the corresponding positions of these microwave or millimeter wave reflectors.
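Because the reflectors appear as distinct points of high reflected intensity, locating them can be sketched as a simple threshold scan over the image; the toy intensity grid and threshold value below are illustrative assumptions:

```python
def find_reflectors(image, threshold=0.8):
    """Return (row, col) coordinates of pixels whose normalized reflected
    intensity exceeds the threshold -- candidate reflector locations."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, val in enumerate(row)
            if val > threshold]

toy = [[0.1, 0.20, 0.1],
       [0.2, 0.95, 0.1],   # bright reflector at (1, 1)
       [0.1, 0.10, 0.9]]   # bright reflector at (2, 2)
print(find_reflectors(toy))  # [(1, 1), (2, 2)]
```

A production implementation would additionally cluster neighboring bright pixels and match the detected points against the known reflector configuration on the uniform and plate.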
In further embodiments, the sensors 204 can include an infrared sensor or sensors that capture infrared image data. While this analysis can also be performed directly on an image that contains a batter and home plate, in some embodiments, the pattern recognition application can also be aided by modifications to the uniform of the batter. For example, the uniform of the batter can include a plurality of infrared reflectors that are placed at key points on the uniform, such as the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter and/or the ball, that reflect infrared energy from the sun and can aid in the determination of these key points by the pattern recognition application. In particular, the processor 200 can analyze the imaging data from the sensor(s) 204 to determine the strike zone of the batter, based on the corresponding positions of these infrared reflectors.
The display device 206 can be a helmet mounted display or other heads up display that provides an augmented reality image of the umpire's field of view. In particular, the augmented reality image can include the virtual strike zone determined by the processor in response to image data from the sensors 204. For example, the display device 206 includes a transparent display element that presents the virtual strike zone without requiring the umpire to look away from their usual viewpoints. The display device 206 can include a video generator that generates rendered video based on the augmented reality data representing the strike zone, and a projector unit that projects the video on a combiner, such as a piece of safety glass with an appropriate coating or other transparent screen that reflects the light from the projector while passing other wavelengths of light. The projector unit and combiner can operate based on a light emitting diode or liquid crystal display that produces an image of the strike zone (a virtual strike zone) within the normal field of view of the umpire.
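Rendering the virtual strike zone on the combiner requires projecting its three-dimensional corners into the umpire's current view. A minimal pinhole-camera sketch follows; the intrinsic parameters (focal length and principal point) are illustrative assumptions, not values from the disclosure:

```python
def project(point3d, focal_px=800.0, cx=640.0, cy=360.0):
    """Project a 3-D point (camera coordinates, z forward) onto the display
    plane with a pinhole model: u = f*x/z + cx, v = f*y/z + cy."""
    x, y, z = point3d
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (focal_px * x / z + cx, focal_px * y / z + cy)

# A zone corner 2 m ahead, 0.2 m right, 0.5 m below eye level:
print(project((0.2, 0.5, 2.0)))  # (720.0, 560.0)
```

Re-running this projection for each frame, as the umpire's head position and orientation change, is what keeps the rendered outline registered to the real scene.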
As previously discussed, the image data is generated and analyzed by the protective headgear on a real-time basis to provide a real-time display of a virtual strike zone that is constantly updated based on the position of the umpire, the orientation of his or her head, and the stance of the batter. In the configuration shown, the pattern recognition application has recognized the presence of the batter in a batting stance and has generated the virtual strike zone display because the batter is determined to be in a batting stance in preparation for a pitch. In other circumstances, prior to and after the pitch, the outline 302 is not presented.
While augmented reality data in the form of a region outline 302 is presented, in other examples, the virtual strike zone can be indicated by a colored region, shading or other graphical imagery. In addition, the color of the strike zone outline or shading can be changed based on whether a ball or strike was detected—to red for a strike and green for a ball, for example. In addition to presenting the virtual strike zone on the heads up display, other game data received via the communication interface can also be displayed, such as the ball count, strike count and number of outs as shown. It should be noted that other game data such as inning number, score, etc. can be received and displayed in a similar fashion.
In addition or in the alternative, while the foregoing has focused on the use of the protective headgear to produce augmented reality data merely for calling balls and strikes, image data can be analyzed and augmented reality data can be presented on a heads-up display that aids the umpire in making other calls such as whether a ball is fair or foul, whether a tag has been made, whether a pitcher has balked and/or other calls. In addition or in the alternative, in embodiments where sensors 204 operate based on visual imaging, video of a particular play viewed by the umpire can be recorded. The protective headgear can include an optional user interface that allows the umpire to retrieve the recorded video for playback via the display device 206 to review a call using the protective headgear to check its accuracy and/or overturn a previous call.
It should also be noted that while the foregoing has described the use of the protective headgear to generate an augmented reality display for a baseball umpire, the use of similar augmented reality gear that is either incorporated in or separate from protective headgear, can likewise be used by other baseball umpires, other umpires and referees to assist in making game calls in other sports including football, soccer, hockey, lawn tennis, table tennis, basketball, volleyball, track and field and/or other sports.
The pattern recognition application identifies points in the image 310, such as the boundaries of home plate, the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter, that can be used to generate the boundaries of the strike zone 312. As discussed, this analysis can be performed directly on an image 310. In the embodiment shown, however, the uniform of the batter includes a plurality of visible targets 314 of a known configuration that are placed at key points on the uniform, such as the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter, that can aid in the determination of these key points by the pattern recognition application. In particular, the pattern recognition application can analyze the image 310 to determine the strike zone 312 of the batter based on corresponding positions of the plurality of visible targets. While particular targets are shown with a particular shape and a particular pattern, other shapes and patterns can likewise be employed that either include a team's logo or otherwise are customized to a particular team, or that are standardized to a group of teams, such as all the teams in a league. In the example shown, the strike zone 312 is determined by the pattern recognition application as the region above home plate between the bottom of the kneecap and the midpoint between the top of the batter's pants and the batter's shoulders.
As in the previous example, the pattern recognition application identifies points in the image 320, such as the boundaries of home plate, the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter, that can be used to generate the boundaries of the strike zone 322. As discussed, this analysis can be performed directly on an image 320. In the embodiment shown, however, the uniform of the batter includes a plurality of millimeter wave reflectors 324 that are placed at key points on the uniform, such as the bottom of the kneecap, top of the shoulders and top of the uniform pants of the batter, that can aid in the determination of these key points by the pattern recognition application by providing distinct points of high reflection that show up as points of high “brightness” in the image 320. While particular point reflectors 324 are shown with a square shape, other geometrical shapes, lines, etc. can likewise be employed. Home plate is similarly outlined with reflective material 326. The pattern recognition application can analyze the image 320 to determine the strike zone 322 of the batter based on corresponding positions of these reflections.
In the example shown, the strike zone 322 is determined by the pattern recognition application as the region above home plate between the bottom of the kneecap and the midpoint between the top of the batter's pants and the batter's shoulders.
In various embodiments, the imaging data includes a stereoscopic image of the batter. The imaging data can include at least one visual image, infrared image, millimeter wave image and/or microwave image of the batter and of home plate. The uniform of the batter can include a plurality of visible targets, and analyzing the imaging data to determine the strike zone of the batter can include determining corresponding positions of the plurality of visible targets. The uniform of the batter can include a plurality of millimeter wave reflectors, and analyzing the imaging data to determine the strike zone of the batter can include determining corresponding positions of the plurality of millimeter wave reflectors. The uniform of the batter can include a plurality of infrared reflectors, and analyzing the imaging data to determine the strike zone of the batter can include determining corresponding positions of the plurality of infrared reflectors.
The augmented reality data can include a region outline that outlines the strike zone in the heads up display and/or a colored region corresponding to the strike zone in the heads up display.
While the description above has set forth several different modes of operation, the devices described here may simultaneously be in two or more of these modes unless, by their nature, these modes necessarily cannot be implemented simultaneously. While the foregoing description includes the description of many different embodiments and implementations, the functions and features of these implementations and embodiments can be combined in additional embodiments of the present disclosure not expressly disclosed by any single implementation or embodiment, yet nevertheless understood by one skilled in the art when presented this disclosure.
It is noted that terminologies as may be used herein such as bit stream, stream, signal sequence, etc. (or their equivalents) have been used interchangeably to describe digital information whose content corresponds to any of a number of desired types (e.g., data, video, speech, audio, etc. any of which may generally be referred to as ‘data’).
As may be used herein, the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences. As may also be used herein, the term(s) “configured to”, “operably coupled to”, “coupled to”, and/or “coupling” includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for an example of indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as “coupled to”. As may even further be used herein, the term “configured to”, “operable to”, “coupled to”, or “operably coupled to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform, when activated, one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term “associated with” includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
As may be used herein, the term “compares favorably”, indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1. As may be used herein, the term “compares unfavorably”, indicates that a comparison between two or more items, signals, etc., fails to provide the desired relationship.
As may also be used herein, the terms “processing module”, “processing circuit”, “processor”, and/or “processing unit” may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module, module, processing circuit, and/or processing unit may be, or further include, memory and/or an integrated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of another processing module, module, processing circuit, and/or processing unit. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that if the processing module, module, processing circuit, and/or processing unit includes more than one processing device, the processing devices may be centrally located (e.g., directly coupled together via a wired and/or wireless bus structure) or may be distributedly located (e.g., cloud computing via indirect coupling via a local area network and/or a wide area network). Further note that if the processing module, module, processing circuit, and/or processing unit implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. 
Still further note that, the memory element may store, and the processing module, module, processing circuit, and/or processing unit executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in one or more of the Figures. Such a memory device or memory element can be included in an article of manufacture.
One or more embodiments have been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claims. Further, the boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality.
To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claims. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
In addition, a flow diagram may include a “start” and/or “continue” indication. The “start” and “continue” indications reflect that the steps presented can optionally be incorporated in or otherwise used in conjunction with other routines. In this context, “start” indicates the beginning of the first step presented and may be preceded by other activities not specifically shown. Further, the “continue” indication reflects that the steps presented may be performed multiple times and/or may be succeeded by other activities not specifically shown. Further, while a flow diagram indicates a particular ordering of steps, other orderings are likewise possible provided that the principles of causality are maintained.
The one or more embodiments are used herein to illustrate one or more aspects, one or more features, one or more concepts, and/or one or more examples. A physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein. Further, from figure to figure, the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
Unless specifically stated to the contrary, signals to, from, and/or between elements in a figure of any of the figures presented herein may be analog or digital, continuous time or discrete time, and single-ended or differential. For instance, if a signal path is shown as a single-ended path, it also represents a differential signal path. Similarly, if a signal path is shown as a differential path, it also represents a single-ended signal path. While one or more particular architectures are described herein, other architectures can likewise be implemented that use one or more data buses not expressly shown, direct connectivity between elements, and/or indirect coupling between other elements as recognized by one of average skill in the art.
The term “module” may be used in the description of one or more of the embodiments to implement one or more functions via a device such as a processor or other processing device or other hardware that may include or operate in association with a memory that stores operational instructions. A module may operate independently and/or in conjunction with software and/or firmware. As also used herein, a module may contain one or more sub-modules, each of which may be one or more modules.
While particular combinations of various functions and features of the one or more embodiments have been expressly described herein, other combinations of these features and functions are likewise possible. The present disclosure is not limited by the particular examples disclosed herein and expressly incorporates these other combinations.