Embodiments are directed to an electronic device having an illuminated body that defines a virtual or dynamic trackpad. The electronic device includes a translucent layer defining a keyboard region and a dynamic input region along an external surface. A keyboard may be positioned within the keyboard region and may include a key surface and a switch element (e.g., to detect a keypress). A light control layer positioned below the translucent layer and within the dynamic input region may have a group of illuminable features. An optical diffuser may be positioned below the light control layer. The electronic device may also include a group of light-emitting elements positioned below the optical diffuser. One or more of the light control layer or the group of light-emitting elements may be configured to illuminate the dynamic input region to display a visible boundary of an active input area. At least one of a size or a position of the visible boundary may be dynamically variable.
9. An electronic device comprising:
an upper portion comprising a display; and
a lower portion pivotally coupled with the upper portion and comprising:
a first translucent layer defining an external surface of the electronic device and comprising first light-extraction features defining a first input area;
a second translucent layer positioned below the first translucent layer and comprising second light-extraction features defining a second input area, the second input area smaller than the first input area;
a first light-emitting element optically coupled with the first translucent layer along a first side of the first translucent layer and configured to direct light through the first translucent layer to illuminate the first input area;
a second light-emitting element optically coupled with the second translucent layer along a second side of the second translucent layer and configured to direct light through the second translucent layer to illuminate the second input area; and
a processing unit configured to:
in a first mode of operation, illuminate the first input area; and
in a second mode of operation, illuminate the second input area, wherein:
a portion of the first input area extends around a periphery of the second input area; and
the second input area is illuminated both by the first light-emitting element and the second light-emitting element.
1. A notebook computer comprising:
an upper portion comprising a display;
a lower portion pivotally coupled with the upper portion and comprising:
a keyboard region configured to receive keypress inputs; and
a dynamic input region along a side of the keyboard region, the dynamic input region defined at least in part by:
a stack of translucent layers comprising:
a first translucent layer defining an exterior surface and comprising first light-extraction features; and
a second translucent layer positioned below the first translucent layer and comprising second light-extraction features;
a first light-emitting element configured to propagate light through the first translucent layer to the first light-extraction features to illuminate a first active input area; and
a second light-emitting element configured to propagate light through the second translucent layer to the second light-extraction features to illuminate a second active input area; and
a processor configured to:
in a first mode of operation, control the first light-emitting element to illuminate the first active input area via the first light-extraction features; and
in a second mode of operation, control the second light-emitting element to illuminate the second active input area via the second light-extraction features, wherein:
the first active input area has a first size; and
the second active input area has a second size, the second size smaller than the first size.
16. An electronic device comprising:
an enclosure;
a display positioned at least partially within the enclosure;
a keyboard coupled to the enclosure and along a side of the display;
a plurality of translucent layers defining a dynamic input region positioned along a side of the keyboard and comprising:
a stack of translucent layers comprising:
a first translucent layer comprising first light-extraction features;
a second translucent layer comprising second light-extraction features; and
a third translucent layer comprising third light-extraction features;
a first light-emitting element configured to illuminate the first light-extraction features;
a second light-emitting element configured to illuminate the second light-extraction features; and
a third light-emitting element configured to illuminate the third light-extraction features; and
a processor configured to:
in a first mode of operation, control the first light-emitting element to illuminate the first light-extraction features;
in a second mode of operation, control the second light-emitting element to illuminate the second light-extraction features; and
in a third mode of operation, control the third light-emitting element to illuminate the third light-extraction features, wherein:
the dynamic input region has a first size in the first mode of operation;
the dynamic input region has a second size in the second mode of operation, the second size larger than the first size; and
the dynamic input region has a third size in the third mode of operation, the third size larger than the second size.
2. The notebook computer of
3. The notebook computer of
the first active input area overlaps with a portion of the second active input area, thereby defining:
an overlapping portion of the dynamic input region in which the first active input area overlaps with the second active input area; and
a non-overlapping portion of the dynamic input region in which the first active input area does not overlap with the second active input area;
the overlapping portion of the dynamic input region defines a center region of the dynamic input region; and
the non-overlapping portion of the dynamic input region defines a periphery of the dynamic input region.
4. The notebook computer of
the overlapping portion is illuminated by:
the first light-emitting element via the first light-extraction features; and
the second light-emitting element via the second light-extraction features;
the non-overlapping portion is illuminated by the first light-emitting element via the first light-extraction features; and
the first light-emitting element and the second light-emitting element are selectively illuminable.
5. The notebook computer of
6. The notebook computer of
7. The notebook computer of
the stack of translucent layers further comprises a third translucent layer positioned below the second translucent layer and comprising third light-extraction features; and
the lower portion further comprises a third light-emitting element configured to propagate light through the third translucent layer to the third light-extraction features to illuminate a third active input area, the third active input area different from the first active input area and from the second active input area.
8. The notebook computer of
a portion of the second active input area extends around a periphery of the third active input area; and
the first active input area, the second active input area, and the third active input area are selectively illuminable.
10. The electronic device of
the first translucent layer comprises a first internal reflection region positioned between the first light-emitting element and the first light-extraction features and configured to channel light from the first light-emitting element to the first light-extraction features; and
the second translucent layer comprises a second internal reflection region positioned between the second light-emitting element and the second light-extraction features and configured to channel light from the second light-emitting element to the second light-extraction features.
11. The electronic device of
in the first mode of operation, the second input area is unilluminated; and
in the second mode of operation, both the second input area and the first input area are illuminated.
12. The electronic device of
a third translucent layer positioned below the second translucent layer and comprising third light-extraction features defining a third input area; and
a third light-emitting element optically coupled with the third translucent layer along a third side of the third translucent layer and configured to direct light through the third translucent layer to illuminate the third input area.
13. The electronic device of
a portion of the first input area extends around a periphery of the second input area;
a portion of the second input area extends around a periphery of the third input area; and
the third input area is illuminated by the first light-emitting element, the second light-emitting element, and the third light-emitting element.
14. The electronic device of
15. The electronic device of
17. The electronic device of
18. The electronic device of
This application is a continuation of U.S. patent application Ser. No. 15/965,840, filed Apr. 27, 2018, which is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/555,027, filed Sep. 6, 2017 and titled “Illuminated Device Enclosure with Dynamic Trackpad,” the contents of which are incorporated herein by reference as if fully disclosed herein.
The described embodiments relate generally to input surfaces of an electronic device. More particularly, the present embodiments relate to an illuminated electronic device enclosure or body that defines an input surface.
In computing systems, an input device may be employed to receive input from a user. Some traditional input devices include large buttons, keys, or other mechanically-actuated structures. However, these types of input devices may lack flexibility or adaptability and may permanently indicate the presence of the input device within the computing system.
Embodiments of the present invention are directed to an electronic device having an illuminated enclosure that defines a dynamic input surface.
In a first aspect, the present disclosure includes an electronic device. The electronic device includes an upper portion. The upper portion includes an upper enclosure defining an opening. The upper portion further includes a display positioned at least partially within the opening and configured to depict a graphical output. The electronic device further includes a lower portion pivotally coupled with the upper portion. The lower portion includes a lower enclosure having a translucent layer that defines an active input area and an array of light-extraction features positioned within the active input area. The lower portion further includes a keyboard positioned along an upper surface of the lower enclosure and configured to receive a keypress. The lower portion further includes a light-emitting element positioned along a side of the translucent layer and configured to propagate light through the translucent layer to the light-extraction features to illuminate the active input area. The lower portion further includes a processing unit configured to modify the graphical output in response to the keypress and modify the graphical output in response to an input received along the active input area when the active input area is illuminated.
In a second aspect, the present disclosure includes an electronic device. The electronic device includes a translucent layer defining a keyboard region and a dynamic input region along an external surface of the electronic device. The electronic device further includes a keyboard positioned within the keyboard region. The keyboard includes a key surface and a switch element configured to detect a keypress. The electronic device further includes a light control layer positioned below the translucent layer within the dynamic input region and having a group of illuminable features. The electronic device further includes an optical diffuser positioned below the light control layer. The electronic device further includes a group of light-emitting elements positioned below the optical diffuser and configured to propagate light through the optical diffuser, the light control layer, and the translucent layer. One or more of the light control layer or the group of light-emitting elements may be configured to illuminate the dynamic input region to display a visible boundary of an active input area. At least one of a size or a position of the visible boundary may be dynamically variable.
In a third aspect, the present disclosure includes an electronic device. The electronic device includes an enclosure. The electronic device further includes a display positioned at least partially within the enclosure. The electronic device further includes a keyboard positioned along an upper surface of the enclosure. The electronic device further includes a translucent layer defining a dynamic input region along a side of the keyboard. The dynamic input region may have an active input area that is designated by a visual boundary. The electronic device further includes a group of light-emitting elements optically coupled with the translucent layer and configured to depict a visual output within the dynamic input region. The electronic device further includes a sensing element positioned within an interior volume of the enclosure and configured to detect an input along the active input area when the visual boundary is illuminated.
In addition to the exemplary aspect and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following description.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like elements.
The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.
The description that follows includes sample systems, methods, and apparatuses that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.
The present disclosure describes systems, devices, and techniques related to an electronic device having an illuminated enclosure or body that defines a dynamic input region. The dynamic input region may be defined along an exterior or upper surface of the enclosure formed from a translucent layer or structure. The translucent layer may be selectively illuminated to reveal a customizable active input area (e.g., a virtual track pad) and/or display various visual outputs along the dynamic input region.
Visual output may be produced along the translucent layer using a number of different techniques. In a first example, the translucent layer may be illuminated from the side by a light-emitting element. Light is directed toward light-extraction features in the translucent layer that may illuminate a visible boundary of the active input area (e.g., a virtual trackpad). In another example, the translucent layer is illuminated from below by an array of light-emitting elements (or a single light-emitting element) to create a configurable or customizable boundary of the active input area. For example, the active input area may be moved, resized, rearranged, functionally reassigned, or the like along the dynamic input region. The dynamic input region may also depict various other visual outputs or other optical effects, including illuminating a boundary of a touch input, or conveying other information, including dynamic or updateable information of the electronic device. As described herein, the virtual trackpad may be positioned along a side of a keyboard or other structure coupled to the translucent layer configured to receive a keypress or other input.
The dynamic input region may be defined along virtually any exterior surface of the electronic device formed by the translucent layer that is configured to receive an input, including a force input, a touch input, and/or a proximity input. The translucent layer may be formed from one or more translucent materials including, for example, glass, ceramic, plastic, or a combination thereof. As used herein, the term translucent or translucent layer may be used to refer to a material or layer that allows the passage of light and does not require that the material or layer be transparent, clear, or otherwise free from features that scatter or absorb some amount of light. In this regard, the term translucent may generally refer to a material or layer that is optically transparent, partially transparent, or otherwise able to transmit light. The translucent layer may be coupled with, or otherwise positioned along, one or more sensing layers or structures within the enclosure, as described herein, including a capacitive-based sensing layer, which may allow the dynamic input region to detect input along the translucent exterior surface of the electronic device.
A light-emitting element, or group of light-emitting elements may be positioned within the enclosure and configured to illuminate an active input area within the dynamic input region. The active input area may be a virtual trackpad positioned on a particular area or portion of the dynamic input region configured to control a function of the electronic device in response to an input. For example, while the entire dynamic input region may be configured to detect input, in certain configurations, a specified area or portion of the dynamic input region may define an active input area (e.g., a virtual or dynamic track pad) that is used to control the electronic device in response to detected input; other areas or portions of the dynamic input region may be temporarily unresponsive to input. This may allow the active input area to be customizable within the dynamic input region based on user preferences, such as arranging the active input area in various positions and sizes within the dynamic input region.
To facilitate the foregoing, the light-emitting element may thus illuminate a visible boundary of the active input area within the dynamic input region. The visible boundary may indicate to a user the presence and location of input functionality on the device enclosure. For example, in some cases, the dynamic input region and/or the active input area may be concealed from a user. The translucent layer may appear to substantially resemble an exterior surface of an electronic device having no apparent input functionality; however, this is not required. In other cases, the dynamic input region and/or active input area may be optically distinguishable on the exterior surface in an unilluminated state. Once illuminated, the active input area may be manipulated within the dynamic input region and the visible boundary may be updated accordingly, thereby indicating to a user a new or updated location of input functionality on the exterior surface.
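The behavior described above can be illustrated with a brief sketch. All names and coordinates here are hypothetical and for illustration only, not from the disclosure: input landing inside the illuminated visible boundary is accepted, while input outside the boundary, or any input while the area is concealed, is ignored.

```python
# Hypothetical sketch: the device responds only to input that falls inside
# the currently illuminated active input area. Names, coordinates, and the
# rectangular boundary model are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ActiveInputArea:
    x: float        # position of the visible boundary within the dynamic input region
    y: float
    width: float
    height: float
    illuminated: bool = False

    def accepts(self, touch_x: float, touch_y: float) -> bool:
        """Return True only for touches inside the illuminated boundary."""
        if not self.illuminated:
            return False  # concealed: the area is temporarily unresponsive
        return (self.x <= touch_x <= self.x + self.width and
                self.y <= touch_y <= self.y + self.height)

area = ActiveInputArea(x=40, y=10, width=60, height=40)
assert not area.accepts(50, 20)   # unilluminated: no input accepted
area.illuminated = True
assert area.accepts(50, 20)       # inside the visible boundary
assert not area.accepts(10, 20)   # outside the boundary: ignored
```

Relocating the active input area then amounts to updating `x` and `y` and re-illuminating the boundary at its new position.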
In a “side-illuminated” embodiment, where the translucent layer is illuminated from the side, the translucent layer may define a light guide that guides or directs light from the light-emitting element and defines the active input area within the dynamic input region. To facilitate the foregoing, the light-emitting element may be a side-firing light-emitting diode (LED) or other light-emitting element positioned along a side of the translucent layer. An internally reflective region of the translucent layer may propagate light received from the light-emitting element along a length of the enclosure toward light-extraction features. The light-extraction features may include textured features that extract light from the translucent layer and illuminate a visible boundary of the active input area. For example, the light-extraction features may have a distinct index of refraction or other optical property that redirects light toward the external surface of the enclosure when the light traverses a boundary between the internally reflective region and the light-extraction features. The electronic device may be responsive to input received within the active input area when the visible boundary is illuminated. For example, when the visible boundary is illuminated, a sensing layer (including a capacitive-based sensor) may detect input within the active input area and a display of the electronic device may be responsive to the detected input.
In a “bottom-illuminated” embodiment, where the translucent layer is illuminated from beneath the external surface, a light control layer may be used to control the propagation of light through the translucent layer and define the active input area within the dynamic input region. The light control layer may be positioned along an internal surface of the translucent layer and control the propagation of light through the translucent layer using a group of illuminable features. The light control layer may generally impede or impair the propagation of light. The group of illuminable features may be regions of the light control layer where light may propagate from an interior of the electronic device to the translucent layer.
The group of illuminable features may be illuminated to define a visible boundary (or other characteristic) of the active input area within the dynamic input region. In a particular embodiment, the light control layer may be an ink, coating, substrate, deposition, or other structure that blocks light and the group of illuminable features may include an array of microperforations forming a channel or passage through the light control layer that defines a boundary of the active input area when illuminated. The array of microperforations may also be selectively illuminated (using an LED matrix or the like) in order to vary a size, position, shape, and so on of the active input area within the dynamic input region. In other embodiments, the light control layer may be a polarizing layer and/or other components that operate to control the passage of light through individual indicators. For example, the group of illuminable features may be individually selectable windows or regions, which may allow light to propagate therethrough in response to a signal (e.g., which may be controlled by a processing unit of the electronic device).
The light control layer may be used to define a variety of user-customizable shapes, sizes, positions, configurations, and so forth of the active input area within the dynamic input region. For example, multiple, distinct visible boundaries of the active input area (first visible boundary, second visible boundary, and so on) may be illuminated within the dynamic input region successively. The visible boundaries may correspond to distinct sizes, shapes, positions, and so on of an active input area, which may have a user-customizable shape, size, or the like on the dynamic input region. The electronic device may be responsive to input received within the dynamic input region based on the illuminated boundary of the active input area. Accordingly, the active input area may not be limited to a particular configuration or position within the dynamic input region, but rather may be manipulated by a user into a variety of customizable configurations. This may be beneficial, for example, where the active input area defines a trackpad of a notebook computer; the trackpad can be resized or repositioned along a translucent device enclosure based on user-specified preferences and thus enhance the functionality and/or adaptability on the electronic device.
The electronic device may be configured to depict a visual output within the dynamic input region. As described herein, the visual output may be visual cues to prompt an input and, in other cases, may be responsive to, or produced, when the electronic device receives an input. For example, the visual output may cue the user to enter input and/or confirm or provide additional information to a user in response to a detected input. The light-emitting elements described herein may be configured to propagate light through the translucent layer and depict the visual output within the dynamic input region. In some embodiments, the light control layer may be used to control the propagation of light through the translucent layer. The light-emitting element may illuminate the visual output on the translucent exterior surface in response to an input received along the dynamic input region. Sample visual outputs in this regard may include an illumination of a location of a touch input, a path of travel of a touch input along the dynamic input region, the entire active input area, and so on. The visual output may also be used to visually emphasize aesthetic characteristics of the active input area, including emphasizing an interior periphery (corresponding to a gradient brightness, contrast, and/or other optical periphery or visual effect), a boundary thickness or edge fade, and/or adjusting a color or brightness of the active input area, among other possibilities. The visual output may also be used to identify a function of the electronic device that may be controlled or manipulated in response to an input.
In some embodiments, the dynamic input region may be used to convey information to the user corresponding to a notification, status, configuration, and/or other information related to the electronic device. For example, the visual output depicted within the dynamic input region, described above, may include or correspond to information related to the electronic device. In some embodiments, the visual output may correspond to a location of a component or assembly of the electronic device, such as a location and/or function related to an inductive charging coil, power button, wireless antenna, and so on. The visual output may also be updateable or dynamic. For example, the visual output may correspond to a power level (battery charge or battery depletion level) of the electronic device, a location and/or strength of a wireless internet connection, and/or other updateable or dynamic information. Accordingly, as described herein, the dynamic input region need not receive an input or be used to control a function of an electronic device. However, it will be appreciated that in some cases the visual output depicted within the dynamic input region may be used to both convey information related to the electronic device and receive an input that controls a function of the electronic device. To illustrate, the dynamic input region may have a visual output corresponding to a strength of a wireless internet signal that may also be displayed on a portion of the translucent layer used to control one or more properties of a wireless internet connection in response to an input (such as selecting a wireless network depicted at a display of the electronic device).
Broadly, the dynamic input region may include any appropriate sensing element configured to detect a touch input, force input, and/or proximity input along the exterior translucent surface of the electronic device. The sensing element may be positioned within the enclosure and configured to detect input received within the active input area or more generally along the dynamic input region or other region of the translucent layer (such as a keyboard region, described herein). A user may manipulate the active input area within the dynamic input region. As such, the sensing element may be adaptable to generate an input signal (that controls a function of the electronic device) in response to input received within the input area for a given configuration (e.g., based on a position of the active input area within the dynamic input region). For example, the sensing element may generate an input signal in response to input received within a visible boundary of the input area, but remain substantially unresponsive to input received outside of the visible boundary.
In one embodiment, the sensing element may be a non-contact-based sensor that measures various electrical parameters to detect a touch and/or force input, including optical, magnetic, and capacitance-based sensors, among other non-contact-based sensors. In other cases, the sensing element may be, or form a component of, a contact-based sensor, including a tactile dome switch, strain gauge, piezoelectric or electroactive polymer (EAP) stack, or the like, among other contact-based sensors. The sensing element may include multiple combinations of sensors, including contact-based and non-contact-based sensors, that cooperate to measure the force and/or touch input received at the translucent external surface. In some cases, the sensing element may measure localized or generalized deflection or bending of the translucent layer inward and trigger a corresponding switch event. The sensing element may also be configured to produce various haptic effects, as described herein, and/or be coupled with a separate haptic structure that produces a haptic or tactile output along the input region.
The sensing element may be configured to detect inputs at distinct regions or areas of the translucent layer. This may allow the dynamic input region to define multiple active input areas on the translucent layer that may each correspond to distinct functions of the electronic device. For example, individual active input areas may correspond to keyboard keys, buttons of a video game controller, or other virtual keys or buttons that may be operated in succession or are otherwise desired to be defined on the translucent layer adjacent to one another. In some cases, the sensing layer may also detect inputs received at the distinct input areas concurrently. This may be beneficial, for example, where the translucent layer is used to form a trackpad and one or more buttons within the dynamic input region. For example, the sensing element may detect a scrolling input or motion along an active input area defining the trackpad and also detect an input at an active input area defining a button or keyboard key.
In a particular embodiment, the electronic device may be a notebook computer and the dynamic input region may be defined on one or more pivotally coupled portions of the enclosure. For example, the enclosure may include an upper portion pivotally coupled to a lower portion. The upper portion may include an upper enclosure and house or partially contain a touch-sensitive display that depicts a graphical output of the electronic device. The lower portion may include a lower enclosure and may have a translucent layer that defines an exterior or upper surface of the electronic device along which the dynamic input region may be arranged.
In addition to the dynamic input region, the translucent layer may also have a keyboard region defined along the exterior or upper surface. The keyboard region may include one or more tactile switch assemblies coupled with the translucent layer that are configured to detect input at a key surface of a key cap or other input structure and generate a tactile response. The dynamic input region may be arranged on the translucent layer proximate the keyboard region (and/or adjoin or partially overlap) and define an active input area that forms a trackpad for the electronic device. The trackpad may be manipulated into a variety of configurations within the dynamic input region, as described herein, and be used to control a function of the electronic device that is distinct from the keypress. For example, the graphical output depicted at the display may be modified in a first manner in response to the keypress and in a second manner in response to input received within the active input area.
As described herein, the dynamic input region may be configured to depict a visual output of the trackpad at the translucent layer that is separate and distinct from the graphical output of the display. For example, as described above, the dynamic input region may visually emphasize the active input area, illuminate a boundary of a touch contact, and/or provide updateable information (battery level, wireless signal strength, and so on) corresponding to the electronic device that is separate and distinct from the graphical output depicted at the display.
In certain embodiments, the visual output of the dynamic input region may be visually perceptible when the upper portion and the lower portion are in a closed position. As described herein, the upper and lower portions may pivot relative to one another such that major surfaces of the upper portion and the lower portion are positioned proximate to one another or otherwise aligned to define a closed configuration of the electronic device. The dynamic input region may depict a visual output along a periphery of the major surface of the lower portion that may be at least partially visible between the closed upper and lower portions. For example, a light-emitting element may be configured to propagate light through the translucent layer and/or between a gap separating the closed upper and lower portions (which may extend along a direction substantially away from the enclosure). The light may be indicative of a status or other information relating to the electronic device, such as a power or battery depletion level. This may allow a user to receive updatable information from the electronic device even when the enclosure is closed.
It will be appreciated that while the foregoing describes the dynamic input region defined on a notebook computer enclosure, the dynamic input region may be defined on substantially any electronic device enclosure having a translucent enclosure or device body. Sample devices include smart watches, styluses, portable media players, other portable or wearable electronic devices, and so on. Broadly, the electronic device may be defined by any enclosure (at least a portion of which is translucent) having one or more openings that surround or contain a display and/or a button. In other embodiments, the electronic device may be a component or segment of a substantially mechanical structure, such as a wall of a building, a panel, dashboard, door, doorframe, or the like. For example, a wall of a building may be used to define the dynamic input region and control various functions of the building, such as climate controls, lighting, and so on. As such, the discussion of any electronic device is meant as illustrative only.
Reference will now be made to the accompanying drawings, which assist in illustrating various features of the present disclosure. The following description is presented for purposes of illustration and description. Furthermore, the description is not intended to limit the inventive aspects to the forms disclosed herein. Consequently, variations and modifications commensurate with the following teachings, and skill and knowledge of the relevant art, are within the scope of the present inventive aspects.
In a non-limiting example, as shown in
An exterior surface of the electronic device 104 may be defined by an enclosure 108. For example, the enclosure 108 may define sidewalls, and top and bottom surfaces of the electronic device 104 that enclose or encompass internal components of the electronic device 104, including various electrical and structural components described herein. The enclosure 108 may include multiple layers or assemblies that may be positioned within one or both of the upper portion 109a and/or the lower portion 109b.
For purposes of illustration,
The electronic device 104 may include a touch-sensitive display 114 at least partially positioned within the enclosure 108. For example, the upper enclosure 108a may define an opening 112 and the touch-sensitive display 114 may be at least partially positioned within the opening 112. The touch-sensitive display 114 may be configured to depict a graphical output of the electronic device 104. The graphical output depicted on the touch-sensitive display 114 may be responsive to various types of detected input, described herein, including a touch input, force input, and/or a proximity input detected at a keyboard or keyboard region of the electronic device 104 and/or a dynamic contact region of the trackpad.
The enclosure 108 may have, or be partially formed from, a translucent layer. For example, as shown in
In the embodiment of
In the embodiment of
An exterior or upper surface of the translucent layer 116 adjacent to (or partially overlapping) the keyboard region 118 may resemble an exterior surface of a device enclosure free of markings and/or having a substantially uniform appearance. Despite appearances, in an activated state, the electronic device 104 may define a dynamic input region 128 along the exterior surface of the translucent layer 116. The dynamic input region 128 may be a region of the translucent layer 116 configured to receive an input, including a touch input, a force input, and/or a proximity input that is used to control a function of the electronic device 104. For example, the dynamic input region 128 may be a region of the translucent layer 116 coupled with or positioned along a sensing element, described herein, that may detect input along the exterior surface of the enclosure 108.
The translucent layer 116 may be illuminated to reveal the input functionality of the dynamic input region 128. For example, as described in greater detail below, in a side-illuminated embodiment, the translucent layer 116 may be a light guide configured to channel or redirect light along a length of the enclosure 108. The translucent layer 116 may include internal reflective properties or otherwise have an internal reflection region that allows light optically coupled with the translucent layer 116 to propagate within the translucent layer 116 without substantially escaping. One or more light-extraction features, such as a textured region, formed into the translucent layer 116 may expel light from the translucent layer 116 and reveal the input functionality of the dynamic input region 128.
Additionally or alternatively, in a bottom-illuminated embodiment, the dynamic input region 128 may be a concealable or hidden input region of the electronic device 104. For example, a light control layer having a group of illuminable features may be positioned along an underside surface of the translucent layer 116. The light control layer may generally conceal an interior of the enclosure 108, and the illuminable features may be visually imperceptible or invisible when not illuminated. When activated, the electronic device 104 reveals the dynamic input region 128 by propagating light through the group of illuminable features to display a boundary of an active input area, a symbol, a glyph, a marking, or other visual output of the dynamic input region 128. An input assembly, sensing element, or the like, including various haptic elements or structures, may be positioned below the dynamic input region 128 and configured to trigger a switch event and/or deliver a haptic output in response to an input received along the dynamic input region 128.
As shown in
The electronic device 104 may include various other input/output components that support one or more functions of the electronic device 104. For purposes of illustration,
To facilitate the foregoing, the substrate may include a switch element 140. The switch element 140 may be a contact-based and/or non-contact-based sensor that detects a keypress received at the key cap 134. For example, the switch element 140 may define one or more electrical traces that may trigger a switch event in response to a contact from the tactile element 132 or other component of the keyboard 120 when the key cap 134 (having a key surface) is depressed. In other cases, the switch element 140 may be, or form a component of, a capacitive, magnetic, and/or optical based sensor configured to detect movements of the key cap 134 caused by the keypress and trigger a corresponding switch event. As shown in the embodiment of
The substrate 136 may also be configured to facilitate illumination of the key cap 134. For example, the substrate 136 may be a portion of the translucent layer 116 or otherwise be formed or constructed from a translucent material that allows light to propagate therethrough. As such, a light-emitting element of the electronic device 104 (not shown in
The tactile element 132 may be any appropriate structure that delivers a tactile effect to the key cap 134 in response to a keypress. In the embodiment depicted in
In this regard, in certain embodiments, the tactile element 132 may be formed from any appropriate material (e.g., including metal, rubber, or the like) that exhibits sufficiently elastic characteristics. For example, the tactile element 132 may be sufficiently elastic or resilient such that it does not permanently deform from applied force (e.g., the tactile element 132 may substantially return to an original or undeformed shape after the force ceases). The key cap 134 may deform the tactile element 132 upon the depression of the key cap 134. In turn, the tactile element 132 may return to an undeformed shape when the key cap 134 returns to a neutral or undepressed condition. The tactile element 132 is not limited to the above example materials and may also include any other appropriate materials consistent with the various embodiments presented herein, including silicone, plastic, or other flexible and resilient materials.
As shown in
As described above with respect to
With reference to
The active input area 150 may be manipulated within the dynamic input region 128. As sample possibilities, a size, shape, position, or the like of the active input area 150 may be manipulated or otherwise changed or updated within the dynamic input region 128. This may occur in response to an input received within the dynamic input region 128 (e.g., such as a gesture or other input) and/or in response to a signal from a processing unit and/or other component or assembly of the electronic device 104. To facilitate the foregoing, the light-emitting elements of the electronic device 104 may be selectively operable to illuminate different (or overlapping) areas or portions of the translucent layer 116 to define different boundaries of the active input area 150. Analogously, the sensing elements of the electronic device 104 may also be selectively operable to detect input received within a new or updated boundary of the active input area 150.
With reference to
The resized and repositioned active input area 150 may accommodate a user-specified preference. For example, where the active input area 150 defines a trackpad of the electronic device 104, a user may desire to resize and/or reposition the trackpad along the device enclosure to facilitate use of the electronic device 104, including various applications, programs or functions performed or depicted on the electronic device 104. The resizing, repositioning, or other manipulation of the active input area 150 may occur in response to, for example, a gesture or other user input received along the dynamic input region 128, the keyboard region 118, or other input device or structure of the electronic device 104, including the touch-sensitive display 114. Once resized, repositioned, or otherwise changed, the sensing elements of the electronic device 104 may be operable to detect input received within the updated boundary of the active input area 150. In this regard, the electronic device 104 may be configured to distinguish between input received along the dynamic input region 128 that is within and/or outside of the active input area 150. This may allow the electronic device 104 to be controlled by input received within the illuminated boundary of the active input area; however, in other cases it may be desirable to detect input received outside of the illuminated boundary as well (e.g., as described in greater detail below with respect to
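A minimal sketch of the resize/reposition behavior described above, under the assumption that the visible boundary is a rectangle and that the sensing logic simply re-evaluates contacts against the updated boundary; the class and method names are hypothetical:

```python
# Hedged sketch: the active input area as a mutable rectangular boundary.
# After a resize or reposition, the same contact coordinates may fall
# inside or outside the updated boundary. Names/values are illustrative.
class ActiveInputArea:
    def __init__(self, x, y, width, height):
        self.x, self.y, self.width, self.height = x, y, width, height

    def contains(self, px, py):
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

    def reposition(self, x, y):
        """Move the visible boundary within the dynamic input region."""
        self.x, self.y = x, y

    def resize(self, width, height):
        """Grow or shrink the visible boundary, e.g., per a user gesture."""
        self.width, self.height = width, height

area = ActiveInputArea(0, 0, 80, 50)
contact = (100, 20)                  # a touch outside the current boundary
before = area.contains(*contact)     # False: outside the illuminated boundary
area.resize(120, 50)                 # user enlarges the trackpad
after = area.contains(*contact)      # True: same touch now controls the device
print(before, after)                 # False True
```

This also illustrates why the device may still track contacts outside the boundary: a contact ignored before a resize can become an in-boundary input immediately afterward.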
As described herein, the electronic device 104 may be configured to depict various visual outputs within the dynamic input region 128. Broadly, visual outputs of the dynamic input region 128 may be substantially any optical or visual effect depicted on the translucent layer 116. In some embodiments, visual outputs of the dynamic input region 128 may visually emphasize a boundary or other feature of an active input region (e.g., as described in greater detail below with respect to
Visual outputs of the dynamic input region 128 may also be used to convey information relating to the electronic device 104. In this regard, the dynamic input region 128 may be configured to depict output of the electronic device 104 along the translucent layer that defines an exterior surface of the enclosure 108. With reference to
With reference to
The translucent layer 116 may be optically coupled with a light-emitting element 160, as shown in
Various sensing elements, haptic structures, and/or other components may be positioned below the translucent layer 116. For example, as depicted in
The sensing element 164 may be any appropriate component or assembly that detects a touch input, force input, and/or proximity input received within the active input area 150, or more generally along the dynamic input region 128 and/or other regions of the translucent layer 116. In this regard, the sensing element 164 may be a wide variety of components, sensors, assemblies, or the like that are positioned below and/or coupled with the translucent layer 116. In one embodiment, the sensing element 164 may be a non-contact-based sensing element that detects input received along the exterior surface 119a of the translucent layer 116. This may include a capacitive or magnetic-based sensor that is configured to detect a proximity of a user to the translucent layer 116, including a contact between the user and the exterior surface 119a. The non-contact-based sensing element may also detect localized or generalized bending or deflection of the translucent layer 116 which may be used to determine a corresponding force input associated with the deflection. Additionally or alternatively, the sensing element 164 may be a contact-based sensor, such as a tactile dome switch, that detects a force input on the external surface in response to localized or generalized bending or deflection of the translucent layer 116.
The haptic structure 168 may be any appropriate structure or component that produces a haptic or tactile output. In particular, the haptic structure 168 may be any appropriate component that delivers a movement or vibration along the external surface defined by the translucent layer 116 that is perceptible to human touch. The electronic device 104 may use the haptic structure 168 to deliver the haptic or tactile output in response to an input received within the active input area 150, or more generally along the dynamic input region 128. For example, the haptic structure 168 may be a mechanical structure, such as a collapsible dome, spring, or the like that produces movement or vibration in response to a force input received along the dynamic input region 128. In other cases, the haptic structure 168 may be an electrically actuated assembly that delivers a haptic or tactile output in response to a detection of a touch and/or force input by the sensing element 164. For example, the haptic structure 168 may be an electromagnet, an electroactive polymer (EAP) stack, a piezoelectric stack, or the like. In an embodiment, the haptic structure 168 may be at least partially included within, or directly coupled to, the sensing element 164, for example, as may be the case where the haptic structure 168 is a collapsible dome of a tactile dome switch used to detect a force input received within the active input area 150.
With reference to
As shown in the embodiment of
In a particular embodiment, the light control layer 170 may be an ink, coating, resin, or other structure that blocks the passage of light, and the group of illuminable features 172 may be a group of microperforations or holes extending through the light control layer 170. For example, the light control layer 170 may be formed directly on the underside surface 119b, for example, through a printing, deposition, sputtering, plating, or other appropriate process. In other cases, the light control layer 170 may be a separate substrate, film, or other layer applied to the underside surface 119b or on an intermediate (PET) layer connected to the underside surface 119b, described below. The group of microperforations may be openings, holes, through portions, cuts, grooves, recesses, or other features that extend through a complete thickness of the light control layer 170. In this regard, the group of microperforations may allow light to travel through the light control layer 170 and subsequently through the translucent layer 116.
At the exterior surface 119a, the group of microperforations may be substantially visually imperceptible when not illuminated. For example, the group of microperforations may have a size, shape, or other characteristic that renders the group of microperforations invisible or not visually perceptible to the unaided human eye. In one embodiment, the group of microperforations may have a width or other cross-dimension within a range of 30 microns to 70 microns. For example, the group of microperforations may be defined by a pattern of circles that each have a diameter within a range of 30 microns to 70 microns. The group of microperforations may be arranged across the light control layer 170 so that each perforation is separated by a distance within a range of 80 microns to 500 microns. For example, where each microperforation is defined by a circle, each circle may be separated across the light control layer 170 by a distance within a range of 80 microns to 500 microns. It will be appreciated that other dimensions and geometries are contemplated and described in greater detail below, including configurations in which a width of each microperforation is less than 30 microns or greater than 70 microns and where the separation distance is less than 80 microns or greater than 500 microns. Further, each microperforation need not have identical or uniform widths or separations; in some cases, various subsets of the group of microperforations may have distinct widths or separations, which may be used to produce a desired optical effect, among other considerations.
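The dimensional ranges above (widths of 30 to 70 microns, separations of 80 to 500 microns) can be illustrated with a simple layout sketch; the grid generator below is a hypothetical tool for visualizing a uniform dot-matrix pattern, not a disclosed manufacturing process, and treating the separation as a center-to-center pitch is an assumption:

```python
# Illustrative layout of microperforation centers in the light control
# layer, constrained to the ranges described in the text. The uniform
# square grid and center-to-center pitch interpretation are assumptions.
def perforation_grid(rows, cols, diameter_um=50, pitch_um=200):
    """Return center coordinates (in microns) for a dot-matrix pattern."""
    assert 30 <= diameter_um <= 70, "width outside 30-70 micron range"
    assert 80 <= pitch_um <= 500, "separation outside 80-500 micron range"
    assert pitch_um > diameter_um, "perforations must not overlap"
    return [(c * pitch_um, r * pitch_um)
            for r in range(rows) for c in range(cols)]

centers = perforation_grid(2, 3)
print(centers)
# [(0, 0), (200, 0), (400, 0), (0, 200), (200, 200), (400, 200)]
```

As the text notes, subsets of the perforations may instead use distinct widths or separations to produce a desired optical effect; the uniform grid is only the simplest case.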
The group of microperforations may define a boundary, for example, of the active input area 150. As such, when illuminated, the input functionality on the translucent layer 116 may be revealed. The group of microperforations may also define various other symbols, glyphs, indicia, and/or other visual outputs at the dynamic input region 128. In other cases, the group of microperforations may define an array or a grid (such as a dot matrix) along the light control layer 170. Subsets of the group of microperforations may be illuminated to illuminate various different boundaries of the active input area 150. For example, as described in greater detail below with respect to
In other embodiments, the light control layer 170 may be a polarizing layer and/or component or assembly configured to control the passage of light through the group of illuminable features. For example, the group of illuminable features 172 may include individually selectable windows or regions, which may allow light to propagate therethrough in response to a signal (e.g., which may be controlled by a processing unit of the electronic device 104). In this regard, the light control layer 170 may include various polarizing filters that may only allow light exhibiting a certain frequency, polarity, or other property to pass through an illuminable feature. A liquid crystal element or other appropriate structure may be arranged within the group of illuminable features 172 and used to rotate or otherwise alter the received light such that it may pass through the illuminable feature and associated polarizing filter. Individual liquid crystal elements may be individually actuated such that light may pass through selective ones of the group of illuminable features 172. This may allow the dynamic input region 128 to be illuminated with various customizable and updateable visual outputs in addition to illuminating the active input area 150 in various locations and configurations within the dynamic input region 128.
To facilitate the foregoing, the light-emitting element 160 may be positioned below the light control layer 170 and illuminate the group of illuminable features 172. In the embodiment of
As shown in
It will be appreciated that while the light path L3 is shown extending through a particular one of the group of illuminable features 172, the light path L3 may extend through substantially any (or all) of the group of illuminable features 172. This may allow the dynamic input region 128 to define the active input area 150 at various positions along the exterior surface 119a of the translucent layer 116. For example, the light-emitting element 160 may substantially illuminate the entire light control layer 170, and the group of illuminable features 172 may be selectively actuated such that light propagates through particular ones of the group of illuminable features 172 and defines a desired symbol or visual output on the exterior surface 119a of the translucent layer 116. In other embodiments, the active input area 150 and/or various visual outputs may be defined by the group of illuminable features 172, for example, as may be the case when the group of illuminable features 172 are microperforations and the light-emitting element is, or forms a component of, an LED backlight.
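A sketch of the selective-actuation scheme just described: the backlight floods the whole light control layer, and only the "opened" illuminable features pass light, so the lit subset draws the desired boundary. The grid model and rectangle-outline helper below are assumptions for illustration:

```python
# Hypothetical model: the light control layer as a grid of illuminable
# features. Opening only the cells on a rectangle's outline makes the
# flood illumination trace a visible boundary of the active input area.
def boundary_features(x0, y0, x1, y1):
    """Return the set of grid cells to open so passed light traces the
    outline of a rectangular active input area spanning (x0,y0)-(x1,y1)."""
    return {(x, y)
            for x in range(x0, x1 + 1)
            for y in range(y0, y1 + 1)
            if x in (x0, x1) or y in (y0, y1)}

def render(grid_w, grid_h, open_cells):
    """ASCII view: '#' where light escapes, '.' where the layer blocks it."""
    return "\n".join(
        "".join("#" if (x, y) in open_cells else "." for x in range(grid_w))
        for y in range(grid_h))

cells = boundary_features(1, 1, 6, 3)
print(render(8, 5, cells))
```

Selecting a different rectangle of cells relocates the visible boundary without moving any light source, which is the point of actuating the features rather than the backlight.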
The light path L3 may also traverse multiple other translucent layers. For example, the electronic device 104 may include an optical diffuser 174. The optical diffuser 174 may be positioned between the light-emitting element 160 and the light control layer 170 such that the light path L3 traverses the optical diffuser 174. The optical diffuser 174 may, in some embodiments, spread or scatter light received from the light-emitting element 160. This may allow the dynamic input region 128 to be illuminated with soft or diffuse light. This may be beneficial for generating various optical effects on the exterior surface 119a defined by the translucent layer 116. For example, soft light may be preferred in various settings having low or dim ambient lighting conditions.
Further, as shown in the embodiment of
The sensing element 164 and the haptic structure 168, described with respect to
With reference to
As shown in the embodiment of
Each of the translucent layers 116a-116c may be coupled with a distinct light-emitting element and be used to illuminate distinct visible boundaries of an active input area within the dynamic input region 128. For example, the electronic device 104 may include a light-emitting element 160a optically coupled along an end of the translucent layer 116a. The light-emitting element 160a may be configured to emit light into a body or thickness of the translucent layer 116a such that light is propagated substantially along a light path L1a. The translucent layer 116a may include light-extraction features 121a that redirect light of the light path L1a toward the exterior surface 119a. Further, the electronic device 104 may include a light-emitting element 160b optically coupled along an end of the translucent layer 116b. The light-emitting element 160b may be configured to emit light into a body or thickness of the translucent layer 116b such that light is propagated substantially along a light path L1b. The translucent layer 116b may include light-extraction features 121b that redirect light of the light path L1b toward the exterior surface 119a. And further, the electronic device 104 may include a light-emitting element 160c optically coupled along an end of the translucent layer 116c. The light-emitting element 160c may be configured to emit light into a body or thickness of the translucent layer 116c such that light is propagated substantially along a light path L1c. The translucent layer 116c may include light-extraction features 121c that redirect light of the light path L1c toward the exterior surface 119a. It will be appreciated that the three distinct translucent layers are shown in
Broadly, each of the translucent layers 116a-116c may be used to illuminate the exterior surface 119a within the dynamic input region 128. The illumination of the exterior surface 119a may correspond to distinct active input areas within the dynamic input region 128. For example, when illuminated, the first translucent layer 116a may define a first active input area 150a on the exterior surface 119a using the light-extraction features 121a, the second translucent layer 116b may define a second active input area 150b on the exterior surface 119a using the light-extraction features 121b, and the third translucent layer 116c may define a third active input area 150c on the exterior surface 119a using the light-extraction features 121c. The active input areas 150a-150c may correspond to a specified region or area of the dynamic input region 128 configured to receive an input and control a function of the electronic device 104. In this regard, various sensing elements (not shown in
Additionally or alternatively, the translucent layers 116a-116c may be used to produce various optical effects or visual outputs within the dynamic input region 128. For example, multiple ones of the translucent layers 116a-116c may be illuminated concurrently or in a particular pattern or sequence in order to visually emphasize selective portions of an active input area or otherwise produce a visual output along the exterior surface 119a. To illustrate, the light-emitting element 160a may be illuminated and define a boundary of the active input area 150a within the dynamic input region 128. While the active input area 150a is illuminated, the light-emitting elements 160b, 160c may be subsequently illuminated. This may create various optical effects within the active input area 150a, for example, such as illuminating a center or a periphery of the active input area 150a in distinct manners. For example, the active input area 150a may be illuminated more intensely (or with different colors) in a center portion as opposed to a periphery. In a center portion, the active input area 150a may be illuminated from light propagated by each of the translucent layers 116a-116c, whereas the periphery of the active input area 150a may be illuminated by light from only the translucent layer 116a (defining a boundary of the active input area 150a). The translucent layers 116a-116c may also be illuminated in a sequence to show an active input area expanding or contracting within the dynamic input region 128 (e.g., as described in greater detail below with respect to
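The expanding/contracting effect described above can be sketched as a simple animation schedule over the three stacked layers; the layer ordering, nesting, and step semantics below are illustrative assumptions:

```python
# Hedged sketch: three stacked translucent layers, each with its own
# light-emitting element, switched on in sequence so the lit footprint
# appears to grow (or shrink when played in reverse). Names are
# illustrative assumptions.
LAYERS = ["116a", "116b", "116c"]   # outermost boundary to innermost

def expansion_sequence(expanding=True):
    """Yield, per animation step, the set of layers whose light-emitting
    elements are driven. Expanding lights inner layers first; contracting
    plays the sequence in the opposite order."""
    order = LAYERS[::-1] if expanding else LAYERS
    lit = []
    for layer in order:
        lit.append(layer)
        yield list(lit)

for step in expansion_sequence():
    print(step)
# ['116c']
# ['116c', '116b']
# ['116c', '116b', '116a']
```

Driving all three concurrently instead yields the center-emphasis effect described above, since the center receives light from every layer while the periphery receives light from only the outermost one.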
With reference to
In this regard, the electronic device 104 may include a group of light-emitting elements 162. The group of light-emitting elements 162 may be a matrix of LEDs or other structures having individually actuatable light sources. The group of light-emitting elements 162 may be used to illuminate particular ones of the group of illuminable features 172, which, in turn, may illuminate the active input area 150 or visual output of the dynamic input region 128. For example, a first subset of the group of light-emitting elements 162 may be used to illuminate a first subset of the group of illuminable features 172. The first subset of the group of illuminable features 172, when illuminated, may define a first visible boundary of the active input area 150 on the exterior surface 119a. Further, a second subset of the group of light-emitting elements 162 may be used to illuminate a second subset of the group of illuminable features 172. The second subset of the group of illuminable features 172, when illuminated, may define a second visible boundary of the active input area 150 on the exterior surface 119a. The particular area or portion of the exterior surface 119a that defines the active input area 150 within the dynamic input region 128 may therefore change, and the group of light-emitting elements 162 may visually represent this change to the user by illuminating the translucent layer 116 accordingly. The first and second visible boundaries need not be distinct or separate portions of the translucent layer 116, however. In some cases, at least some of the illuminable features may be included in both the first and the second subsets of the group of illuminable features 172.
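The overlapping-subset behavior above can be sketched with set arithmetic: when the visible boundary moves, only the non-shared LEDs need to change state. The one-dimensional indexing of the matrix is an assumption for illustration:

```python
# Hypothetical sketch: two visible boundaries of the active input area
# map to two LED subsets that may share members, so updating the boundary
# reuses the overlapping elements. Indexing scheme is an assumption.
def leds_for_boundary(start, end):
    """LEDs (by index) backing the illuminable features along a boundary
    spanning feature columns start..end-1."""
    return set(range(start, end))

first = leds_for_boundary(0, 6)      # first visible boundary
second = leds_for_boundary(4, 10)    # second, shifted boundary

shared = first & second              # LEDs lit in both configurations
to_switch_off = first - second      # only these change when updating
to_switch_on = second - first
print(sorted(shared), sorted(to_switch_off), sorted(to_switch_on))
# [4, 5] [0, 1, 2, 3] [6, 7, 8, 9]
```

This mirrors the statement that the first and second visible boundaries need not be distinct portions of the translucent layer: elements 4 and 5 here remain lit across both configurations.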
With reference to
The group of light-emitting elements 162, arranged below the translucent layer 116 and the light control layer 170, may be configured to further illuminate the exterior surface 119a. As described above with respect to
The group of light-emitting elements 162 may augment the illumination of the active input area 150, which may be defined by the illuminated light-extraction features 117b. For example, the group of light-emitting elements 162 may be used to produce a visual output within the active input area 150 when the light-extraction features 117b are illuminated by the light-emitting element 160. The visual output may correspond to a visual indication of a position or a magnitude of a force or touch input received within the active input area 150. The visual output may also correspond to an output of the electronic device 104, such as a symbol corresponding to a battery depletion level of the electronic device 104. In other cases, the visual output may be illuminated within the dynamic input region 128 and outside of the active input area 150. This may be beneficial in order to illuminate a visual output of the electronic device 104 within the dynamic input region adjacent or otherwise proximate to the active input area 150 on the exterior surface 119a. In other embodiments, other visual outputs are contemplated and described in greater detail below.
With reference to
To facilitate the foregoing, a periphery of the exterior surface 119a may be illuminated. For example, the electronic device 104 may include the group of light-emitting elements 162 arranged below the exterior surface 119a and along a periphery of the translucent layer 116. The group of light-emitting elements 162 may be a strip of LEDs or micro-LEDs that extend along a side of the enclosure 108; however, in other cases, other illumination structures are possible. The group of light-emitting elements 162 may be selectively controllable to propagate light along light paths L4, L5, described herein. Additionally or alternatively, the periphery of the exterior surface 119a may be illuminated by a light source 160 positioned along a side of the translucent layer 116. The light source 160 may be configured to propagate light along light path L1, through the translucent layer 116, for example, substantially analogous to the light path L1 described with respect to
In a particular embodiment, an optical diffuser 174 may be arranged within the enclosure 108 substantially along the group of light-emitting elements 162. The group of light-emitting elements 162 may propagate light through the optical diffuser 174 and toward the translucent layer 116. The group of light-emitting elements 162 may selectively propagate light along multiple directions into the diffuser, such as along a light path L4 and a light path L5, depicted in
It will be appreciated that the various light paths described herein are depicted for purposes of illustration only. Rather than suggesting that all light travels exclusively along a particular light path, the illustrated light paths are representations of diffuse light. Accordingly, light may propagate from the light-emitting element 160 and/or the group of light-emitting elements 162 along directions other than those depicted in the figures.
The electronic device 104 may optionally include an intermediate optical layer 176. The intermediate optical layer 176 may be another optical diffuser configured to further diffuse or scatter light. This may help condition or otherwise soften light emitted from the exterior surface 119a. The intermediate optical layer 176 may also be an optical filter or other layer that selectively transmits light at various different wavelengths. This may help illuminate the exterior surface 119a with a specified color, for example.
In one embodiment, light propagated along the light path L4 may extend through the light control layer 170 and illuminate, for example, the active input area 150 within the dynamic input region 128. Light propagated along the light path L5 need not necessarily pass through the light control layer 170. In some cases, the light propagated along the light path L5 may extend angularly through the translucent layer 116 and away from the electronic device 104. It will be appreciated that light from the group of light-emitting elements 162 may be configured to selectively propagate light along one or both of the light path L4 and/or L5, as may be appropriate for a given configuration.
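The selection between paths L4 and L5 could be modeled as a simple mode-to-path table. This is purely an illustrative sketch; the mode names and the tuple representation are assumptions, not part of the embodiment.

```python
# Hypothetical mapping of device states to the light paths described above:
# L4 passes through the light control layer to illuminate the active input
# area; L5 exits angularly through the translucent layer.
LIGHT_PATHS = {
    "active_input": ("L4",),
    "edge_glow": ("L5",),
    "combined": ("L4", "L5"),
}

def paths_for_mode(mode: str) -> tuple:
    """Return the light paths to drive for a given mode (sketch only)."""
    # Unknown modes drive no paths rather than raising an error.
    return LIGHT_PATHS.get(mode, ())
```

A controller could iterate over the returned tuple and enable only the light-emitting elements aimed along those paths.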
The group of light-emitting elements 162 may be configured to illuminate the active input area 150 and/or a visual output within the dynamic input region 128 in a manner that is visually perceptible when the electronic device 104 is in a closed configuration. In particular, at least a portion of the active input area 150 and/or visual output illuminated on the translucent layer 116 may be visible through the gap 130 and/or a side or end of the translucent layer. For example, light propagated along the light path L4 may be at least partially visually perceptible to a user through the gap 130 (along light path L6). Additionally or alternatively, light propagated along the light path L5 may propagate through a side or end of the translucent layer 116 (along light path L7) and thus may be visually perceptible to a user when the electronic device 104 is in a closed configuration. This may be beneficial in order to convey information to a user regarding a status of the electronic device 104. As a non-limiting example, light propagated along one or both of the light paths L4 and/or L5 may correspond to a battery depletion level of the electronic device 104 (or another updateable or dynamic status of the electronic device 104). As such, the user may receive such information without needing to open the electronic device 104 and/or cause the electronic device 104 to enter a full power mode.
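The battery-depletion example above could be realized by lighting a proportional number of segments along the LED strip. The sketch below is an assumption for illustration (the segment count and ceiling behavior are not specified by the embodiment).

```python
def lit_segments(battery_pct: int, total_segments: int = 10) -> int:
    """Choose how many LED segments to light so the battery level is
    readable through the gap while the device is closed. Hypothetical
    sketch: total_segments is an assumed strip length."""
    battery_pct = max(0, min(100, battery_pct))
    # Ceiling division keeps any nonzero charge visible as at least one segment.
    return (battery_pct * total_segments + 99) // 100
```

Because the computation is a pure function of the battery percentage, it could run in a low-power state without waking the device into a full power mode.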
The visual output may be depicted on the dynamic input region 128 in response to a gesture. The gesture may include a user motion pattern, signs, finger or hand positions, or the like that are performed relative to the dynamic input region 128. The gesture may be used to manipulate the active input area 150 along the dynamic input region 128, and the visual output may depict the manipulation accordingly.
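The gesture-driven manipulation of the active input area could be sketched as transformations of the boundary rectangle. The gesture dictionary format, field names, and class below are all hypothetical assumptions introduced only for illustration.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ActiveArea:
    """Visible boundary of the active input area (units arbitrary)."""
    x: float
    y: float
    width: float
    height: float

def apply_gesture(area: ActiveArea, gesture: dict) -> ActiveArea:
    """Move or resize the boundary in response to a gesture.
    The gesture format is an assumption, not from the embodiment."""
    if gesture.get("kind") == "drag":
        # Translate the boundary; the visual output follows the new position.
        return replace(area, x=area.x + gesture["dx"], y=area.y + gesture["dy"])
    if gesture.get("kind") == "pinch":
        # Scale the boundary about its origin.
        s = gesture["scale"]
        return replace(area, width=area.width * s, height=area.height * s)
    return area  # unrecognized gestures leave the area unchanged
```

The resulting rectangle would then determine which light-extraction features to illuminate, so the displayed boundary tracks the manipulation.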
The electronic device 1004 may include a translucent layer 1016 that forms an exterior surface of the enclosure 1008.
The electronic device 1104 may include a translucent layer 1116 that forms an exterior surface of the enclosure 1108.
The memory 1212 may include a variety of types of non-transitory computer-readable storage media, including, for example, random-access memory (RAM), read-only memory (ROM), erasable programmable memory (e.g., EPROM and EEPROM), or flash memory. The memory 1212 is configured to store computer-readable instructions, sensor values, and other persistent software elements. Computer-readable media 1216 may also include a variety of types of non-transitory computer-readable storage media, including, for example, a hard-drive storage device, a solid state storage device, a portable magnetic storage device, or other similar device. The computer-readable media 1216 may also be configured to store computer-readable instructions, sensor values, and other persistent software elements.
In this example, the processing unit 1208 is operable to read computer-readable instructions stored on the memory 1212 and/or computer-readable media 1216. The computer-readable instructions may adapt the processing unit 1208 to perform the operations or functions described above.
The electronic device 104 may also include a battery 1224 that is configured to provide electrical power to the components of the electronic device 104. The battery 1224 may include one or more power storage cells that are linked together to provide an internal supply of electrical power. In this regard, the battery 1224 may be a component of a power source 1228 (e.g., including a charging system or other circuitry that supplies electrical power to components of the electronic device 104). The battery 1224 may be operatively coupled to power management circuitry that is configured to provide appropriate voltage and power levels for individual components or groups of components within the electronic device 104. The battery 1224, via power management circuitry, may be configured to receive power from an external source, such as an AC power outlet or interconnected computing device. The battery 1224 may store received power so that the electronic device 104 may operate without connection to an external power source for an extended period of time, which may range from several hours to several days.
The electronic device 104 may also include one or more sensors 1240 that may be used to detect a touch and/or force input, environmental condition, orientation, position, or some other aspect of the electronic device 104. For example, sensors 1240 that may be included in the electronic device 104 may include, without limitation, one or more accelerometers, gyrometers, inclinometers, goniometers, or magnetometers. The sensors 1240 may also include one or more proximity sensors, such as a magnetic hall-effect sensor, inductive sensor, capacitive sensor, continuity sensor, or the like.
The sensors 1240 may also be broadly defined to include wireless positioning devices, including, without limitation, global positioning system (GPS) circuitry, Wi-Fi circuitry, cellular communication circuitry, and the like. The electronic device 104 may also include one or more optical sensors, including, without limitation, photodetectors, photo sensors, image sensors, infrared sensors, or the like. In one example, the sensor 1240 may be an image sensor that detects a degree to which an ambient image matches a stored image. As such, the sensor 1240 may be used to identify a user of the electronic device 104. The sensors 1240 may also include one or more acoustic elements, such as a microphone used alone or in combination with a speaker element. The sensors 1240 may also include a temperature sensor, barometer, pressure sensor, altimeter, moisture sensor or other similar environmental sensor. The sensors 1240 may also include a light sensor that detects an ambient light condition of the electronic device 104.
The sensor 1240, either alone or in combination, may generally be a motion sensor that is configured to estimate an orientation, position, and/or movement of the electronic device 104. For example, the sensor 1240 may include one or more motion sensors, including, for example, one or more accelerometers, gyrometers, magnetometers, optical sensors, or the like to detect motion. The sensors 1240 may also be configured to estimate one or more environmental conditions, such as temperature, air pressure, humidity, and so on. The sensors 1240, either alone or in combination with other input, may be configured to estimate a property of a supporting surface, including, without limitation, a material property, surface property, friction property, or the like.
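One common motion-sensing computation alluded to above is estimating device tilt from raw accelerometer axes. The sketch below shows this standard calculation for illustration only; it is not asserted to be the device's actual algorithm.

```python
import math

def tilt_degrees(ax: float, ay: float, az: float) -> float:
    """Estimate tilt as the angle between the device's z-axis and gravity
    from raw accelerometer readings (illustrative sketch only)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return 0.0  # no usable reading; report level
    # Clamp guards against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
```

A device lying flat reads roughly (0, 0, 1g) and yields 0 degrees, while a device standing on its side yields about 90 degrees.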
The electronic device 104 may also include a camera 1232 that is configured to capture a digital image or other optical data. The camera 1232 may include a charge-coupled device, complementary metal-oxide-semiconductor (CMOS) device, or other device configured to convert light into electrical signals. The camera 1232 may also include one or more light sources, such as a strobe, flash, or other light-emitting device. As discussed above, the camera 1232 may be generally categorized as a sensor for detecting optical conditions and/or objects in the proximity of the electronic device 104. However, the camera 1232 may also be used to create photorealistic images that may be stored in an electronic format, such as JPG, GIF, TIFF, PNG, raw image file, or other similar file types.
The electronic device 104 may also include a communication port 1244 that is configured to transmit and/or receive signals or electrical communications from an external or separate device. The communication port 1244 may be configured to couple to an external device via a cable, adaptor, or other type of electrical connector. In some embodiments, the communication port 1244 may be used to couple the electronic device 104 with a computing device and/or other appropriate accessories configured to send and/or receive electrical signals. The communication port 1244 may be configured to receive identifying information from an external accessory, which may be used to determine a mounting or support configuration. For example, the communication port 1244 may be used to determine that the electronic device 104 is coupled to a mounting accessory, such as a particular type of stand or support structure.
Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For example, features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Also, as used herein, including in the claims, “or” as used in a list of items prefaced by “at least one of” indicates a disjunctive list such that, for example, a list of “at least one of A, B, or C” means A or B or C or AB or AC or BC or ABC (i.e., A and B and C). Further, the term “exemplary” does not mean that the described example is preferred or better than other examples.
The foregoing description, for purposes of explanation, uses specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
Cao, Robert Y., Mathew, Dinesh C., Xu, Qiliang, Lockwood, Robert J., Garelli, Adam T., Kim, Genie, Huizar, Richard G.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jul 02 2020 | Apple Inc | (assignment on the face of the patent)