Apparatus and methods relating to persistent network resource and virtual area associations for realtime collaboration include managing and displaying an object associated with a virtual area that supports establishment of respective presences of communicants operating respective client network nodes. Examples of the object include an object that has one or more user-modifiable property fields and an object that is associated with screen sharing functionality of the client network node.
1. A method, comprising:
by a client network node:
displaying a graphical user interface comprising a graphical representation of a virtual area that supports establishment of respective presences of a communicant operating the client network node and one or more other communicants respectively operating one or more other client network nodes and realtime communications between co-present communicants, a graphical representation of each of the communicants who is present in the virtual area, and a graphical representation of a viewscreen object in the graphical representation of the virtual area, wherein the viewscreen object is associated with screen sharing functionality that allows a client network node of a communicant who is present in the virtual area to display visual content produced by an application on the client network node and transmit representations of the visual content for display on one or more other client network nodes of respective communicants who are present in the virtual area and have activated the viewscreen object;
responsive to receipt of communicant input, transmitting a request to associate a uniform resource identifier (URI) of a network resource with the viewscreen object to a network infrastructure service supporting communicant interactions in the virtual area;
receiving from the network infrastructure service one or more values defining respective properties of the viewscreen object including an association of the URI with the viewscreen object that allows a client network node of a communicant who is present in the virtual area to display data retrieved from the URI in response to activation of the viewscreen object by the communicant;
retrieving data from the URI based on communicant input received in connection with the graphical representation of the viewscreen object; and
showing visual content in the graphical user interface based on the data retrieved from the URI.
12. At least one non-transitory computer-readable medium having processor-readable program code embodied therein, the processor-readable program code adapted to be executed by a processor to perform operations comprising:
displaying a graphical user interface comprising a graphical representation of a virtual area that supports establishment of respective presences of a communicant operating the client network node and one or more other communicants respectively operating one or more other client network nodes and realtime communications between co-present communicants, a graphical representation of each of the communicants who is present in the virtual area, and a graphical representation of a viewscreen object in the graphical representation of the virtual area, wherein the viewscreen object is associated with screen sharing functionality that allows a client network node of a communicant who is present in the virtual area to display visual content produced by an application on the client network node and transmit representations of the visual content for display on one or more other client network nodes of respective communicants who are present in the virtual area and have activated the viewscreen object;
responsive to receipt of communicant input, transmitting a request to associate a uniform resource identifier (URI) of a network resource with the viewscreen object to a network infrastructure service supporting communicant interactions in the virtual area;
receiving from the network infrastructure service one or more values defining respective properties of the viewscreen object including an association of the URI with the viewscreen object that allows a client network node of a communicant who is present in the virtual area to display data retrieved from the URI in response to activation of the viewscreen object by the communicant;
retrieving data from the URI based on communicant input received in connection with the graphical representation of the viewscreen object; and
showing visual content in the graphical user interface based on the data retrieved from the URI.
11. Apparatus, comprising:
a memory storing processor-readable instructions; and
a processor coupled to the memory, operable to execute the instructions, and based at least in part on the execution of the instructions operable to perform operations comprising
displaying a graphical user interface comprising a graphical representation of a virtual area that supports establishment of respective presences of a communicant operating the client network node and one or more other communicants respectively operating one or more other client network nodes and realtime communications between co-present communicants, a graphical representation of each of the communicants who is present in the virtual area, and a graphical representation of a viewscreen object in the graphical representation of the virtual area, wherein the viewscreen object is associated with screen sharing functionality that allows a client network node of a communicant who is present in the virtual area to display visual content produced by an application on the client network node and transmit representations of the visual content for display on one or more other client network nodes of respective communicants who are present in the virtual area and have activated the viewscreen object;
responsive to receipt of communicant input, transmitting a request to associate a uniform resource identifier (URI) of a network resource with the viewscreen object to a network infrastructure service supporting communicant interactions in the virtual area;
receiving from the network infrastructure service one or more values defining respective properties of the viewscreen object including an association of the URI with the viewscreen object that allows a client network node of a communicant who is present in the virtual area to display data retrieved from the URI in response to activation of the viewscreen object by the communicant;
retrieving data from the URI based on communicant input received in connection with the graphical representation of the viewscreen object; and
showing visual content in the graphical user interface based on the data retrieved from the URI.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
8. The method of
9. The method of
10. The method of
Under 35 U.S.C. §119(e), this application claims the benefit of U.S. Provisional Application No. 61/444,989, filed Feb. 21, 2011, the entirety of which is incorporated herein by reference.
This application relates to the following co-pending patent applications, the entirety of each of which is incorporated herein by reference: U.S. patent application Ser. No. 12/630,973, filed Dec. 4, 2009; U.S. patent application Ser. No. 12/418,243, filed Apr. 3, 2009; U.S. patent application Ser. No. 12/631,008, filed Dec. 4, 2009; U.S. patent application Ser. No. 12/631,026, filed Dec. 4, 2009; U.S. patent application Ser. No. 12/418,270, filed Apr. 3, 2009; U.S. patent application Ser. No. 12/825,512, filed Jun. 29, 2010; U.S. patent application Ser. No. 12/354,709, filed Jan. 15, 2009; U.S. patent application Ser. No. 12/509,658, filed Jul. 27, 2010; U.S. patent application Ser. No. 12/694,126, filed Jan. 26, 2010; U.S. Provisional patent application Ser. No. 61/373,914, filed Aug. 16, 2010; and U.S. Provisional patent application Ser. No. 61/381,956, filed Sep. 11, 2010.
When face-to-face communications are not practical, people often rely on one or more technological solutions to meet their communications needs. Traditional telephony systems enable voice communications between callers. Instant messaging (also referred to as “chat”) communications systems enable users to communicate text messages in real time through instant message computer clients that are interconnected by an instant message server. Some instant messaging systems and interactive virtual reality communications systems allow users to be represented by user-controllable graphical objects (referred to as “avatars”). What are needed are improved systems and methods for realtime network communications.
In the following description, like reference numbers are used to identify like elements. Furthermore, the drawings are intended to illustrate major features of exemplary embodiments in a diagrammatic manner. The drawings are not intended to depict every feature of actual embodiments nor relative dimensions of the depicted elements, and are not drawn to scale.
A “communicant” is a person who communicates or otherwise interacts with other persons over one or more network connections, where the communication or interaction may or may not occur in the context of a virtual area. A “user” is a communicant who is operating a particular network node that defines a particular perspective for descriptive purposes.
A “computer” is any machine, device, or apparatus that processes data according to computer-readable instructions that are stored on a computer-readable medium either temporarily or permanently. A “computer operating system” is a software component of a computer system that manages and coordinates the performance of tasks and the sharing of computing and hardware resources. A “software application” (also referred to as software, an application, computer software, a computer application, a program, and a computer program) is a set of instructions that a computer can interpret and execute to perform one or more specific tasks. A “data file” is a block of information that durably stores data for use by a software application.
The term “computer-readable medium” refers to any tangible, non-transitory medium capable of storing information (e.g., instructions and data) that is readable by a machine (e.g., a computer). Storage devices suitable for tangibly embodying such information include, but are not limited to, all forms of physical, non-transitory computer-readable memory, including, for example, semiconductor memory devices, such as random access memory (RAM), EPROM, EEPROM, and Flash memory devices, magnetic disks such as internal hard disks and removable hard disks, magneto-optical disks, DVD-ROM/RAM, and CD-ROM/RAM.
A “window” is a visual area of a display that typically includes a user interface. A window typically displays the output of a software process and typically enables a user to input commands or data for the software process. A window that has a parent is called a “child window.” A window that has no parent, or whose parent is the desktop window, is called a “top-level window.” A “desktop” is a system-defined window that paints the background of a graphical user interface (GUI) and serves as the base for all windows displayed by all software processes.
A “data sink” (referred to herein simply as a “sink”) is any of a device (e.g., a computer), part of a device, or software that receives data.
A “data source” (referred to herein simply as a “source”) is any of a device (e.g., a computer), part of a device, or software that originates data.
A “network node” (also referred to simply as a “node”) is a junction or connection point in a communications network. Examples of network nodes include, but are not limited to, a terminal, a computer, and a network switch. A “server” network node is a host computer on a network that responds to requests for information or service. A “client network node” is a computer on a network that requests information or service from a server.
A Uniform Resource Identifier (URI) is a string of characters that identifies a network resource.
A “network resource” is anything that can be identified by a uniform resource identifier (URI) and accessed over a network, including an electronic document, an image, a source of information, a service, operators and operands of a mathematical equation, classes, properties, numeric values, and a collection of other resources.
A “network connection” is a link between two communicating network nodes. A “connection handle” is a pointer or identifier (e.g., a uniform resource identifier (URI)) that can be used to establish a network connection with a network resource. A “network communication” can include any type of information (e.g., text, voice, audio, video, electronic mail message, data file, motion data stream, and data packet) that is transmitted or otherwise conveyed from one network node to another network node over a network connection.
A “communicant interaction” is any type of direct or indirect action or influence between a communicant and another network entity, which may include for example another communicant, a virtual area, or a network service. Examples of types of communicant interactions include communicants communicating with each other in realtime, a communicant entering a virtual area, and a communicant requesting access to a resource from a network service.
“Presence” refers to the ability and willingness of a networked entity (e.g., a communicant, service, or device) to communicate, where such willingness affects the ability to detect and obtain information about the state of the entity on a network and the ability to connect to the entity.
A “realtime data stream” is data that is structured and processed in a continuous flow and is designed to be received with no delay or only imperceptible delay. Realtime data streams include digital representations of voice, video, user movements, facial expressions and other physical phenomena, as well as data within the computing environment that may benefit from rapid transmission, rapid execution, or both rapid transmission and rapid execution, including for example, avatar movement instructions, text chat, realtime data feeds (e.g., sensor data, machine control instructions, transaction streams and stock quote information feeds), screen shares, and file transfers.
A “virtual area” (also referred to as an “area” or a “place”) is a representation of a computer-managed space or scene. Virtual areas typically are one-dimensional, two-dimensional, or three-dimensional representations; although in some examples a virtual area may correspond to a single point. Oftentimes, a virtual area is designed to simulate a physical, real-world space. For example, using a traditional computer monitor, a virtual area may be visualized as a two-dimensional graphic of a three-dimensional computer-generated space. However, virtual areas do not require an associated visualization. A virtual area typically refers to an instance of a virtual area schema, where the schema defines the structure and contents of a virtual area in terms of variables and the instance defines the structure and contents of a virtual area in terms of values that have been resolved from a particular context.
A “virtual area application” (also referred to as a “virtual area specification”) is a description of a virtual area that is used in creating a virtual environment. The virtual area application typically includes definitions of geometry, physics, and realtime switching rules that are associated with one or more zones of the virtual area.
A “virtual environment” is a representation of a computer-managed space that includes at least one virtual area and supports realtime communications between communicants.
A “zone” is a region of a virtual area that is associated with at least one switching rule or governance rule. A “switching rule” is an instruction that specifies a connection or disconnection of one or more realtime data sources and one or more realtime data sinks subject to one or more conditions precedent. A switching rule controls switching (e.g., routing, connecting, and disconnecting) of realtime data streams between network nodes communicating in the context of a virtual area. A governance rule controls a communicant's access to a resource (e.g., an area, a region of an area, or the contents of that area or region), the scope of that access, and follow-on consequences of that access (e.g., a requirement that audit records relating to that access must be recorded). A “renderable zone” is a zone that is associated with a respective visualization.
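For illustration only, the following minimal Python sketch shows one way that a switching rule of the kind defined above could be represented and applied. The field names, presence records, and node identifiers are assumptions for the example, not part of any actual virtual area schema.

    # Hypothetical sketch: a switching rule scoped to a zone, and a helper that
    # yields the realtime stream connections the rule implies for co-present
    # communicants. Field and variable names are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class SwitchingRule:
        stream_type: str                                 # e.g. "audio", "chat", "screen-share"
        zone: str                                        # zone the rule applies to
        conditions: list = field(default_factory=list)   # conditions precedent (unused here)

    def streams_to_connect(rule, presences):
        # Connect each communicant's source to the sinks of every other
        # communicant who is present in the same zone.
        in_zone = [p for p in presences if p["zone"] == rule.zone]
        for src in in_zone:
            for snk in in_zone:
                if src["node"] != snk["node"]:
                    yield (src["node"], snk["node"], rule.stream_type)

    rule = SwitchingRule(stream_type="audio", zone="Zone 1")
    presences = [
        {"node": "art-pc", "zone": "Zone 1"},
        {"node": "beth-pc", "zone": "Zone 1"},
        {"node": "carl-pc", "zone": "Zone 2"},
    ]
    print(list(streams_to_connect(rule, presences)))     # carl-pc is excluded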
A “position” in a virtual area refers to a location of a point or an area or a volume in the virtual area. A point typically is represented by a single set of one-dimensional, two-dimensional, or three-dimensional coordinates (e.g., x, y, z) that define a spot in the virtual area. An area typically is represented by the three-dimensional coordinates of three or more coplanar vertices that define a boundary of a closed two-dimensional shape in the virtual area. A volume typically is represented by the three-dimensional coordinates of four or more non-coplanar vertices that define a closed boundary of a three-dimensional shape in the virtual area.
As used herein, the term “includes” means includes but is not limited to, and the term “including” means including but is not limited to. The term “based on” means based at least in part on.
The first client network node 12 includes a computer-readable medium 22 (or “memory”), a processor 24, and input/output (I/O) hardware 26 (including a display). The processor 24 executes at least one communications application 28 that is stored in the memory 22. The second client network node 14 typically is configured in substantially the same general way as the first client network node 12, with a computer-readable medium 30 storing at least one communications application 32, a processor 34, and input/output (I/O) hardware 36 (including a display).
Each of the network nodes 12, 14 has a respective set of one or more sources and a respective set of one or more sinks. Exemplary sources include an audio source (e.g., an audio capture device, such as a microphone), a video source (e.g., a video capture device, such as a video camera), a chat source (e.g., a text capture device, such as a keyboard), a motion data source (e.g., a pointing device, such as a computer mouse), and other sources (e.g., file sharing source or a source of a customized real-time data stream). Exemplary sinks include an audio sink (e.g., an audio rendering device, such as a speaker or headphones), a video sink (e.g., a video rendering device, such as a display monitor), a chat sink (e.g., a text rendering device, such as a display monitor), a motion data sink (e.g., a movement rendering device, such as a display monitor), and other sinks (e.g., a printer for printing shared files, a device for rendering real-time data streams different from those already described, or software that processes real-time streams for analysis or customized display).
The virtual area platform 18 includes at least one server network node 40 that provides a network infrastructure service environment 42 that manages sessions of the first and second client nodes 12, 14 in one or more virtual areas 44 in accordance with respective virtual area applications 46. One or more of the virtual area applications 46 typically are synchronous conferencing applications that support one or more types of communications between the client nodes 12, 14 (e.g., text chat, audio conferencing, video conferencing, application sharing, and file sharing). The network infrastructure service environment 42 typically includes one or more network infrastructure services that cooperate with the communications applications 28, 32 in the process of establishing and administering network connections between the client nodes 12, 14 and other network nodes. Among the network infrastructure services that are included in the example of the network infrastructure service environment 42 are an account service, a security service, an area service, a rendezvous service, an interaction service, and a capabilities engine. The area service administers a virtual area 44 by managing sessions of the first and second client nodes 12, 14 in the virtual area 44 in accordance with the virtual area application 46. Examples of the virtual area platform 18 and the virtual area applications 46 are described in U.S. Provisional patent application Ser. No. 61/563,088, filed Nov. 23, 2011. Examples of an account service, a security service, an area service, a rendezvous service, and an interaction service are described in U.S. patent application Ser. No. 12/630,973, filed Dec. 4, 2009. Examples of a capabilities engine are described in U.S. Provisional patent application Ser. No. 61/535,910, filed Sep. 16, 2011.
The network infrastructure service environment 42 maintains a relationship database 47 that contains the records 48 of interactions between communicants and social network profiles 50 that are associated with respective communicants. Each interaction record describes the context of an interaction between a pair of communicants. In some examples, an interaction record contains one or more of an identifier for each of the communicants, an identifier for the place of interaction (e.g., a virtual area instance), a description of the hierarchy of the interaction place (e.g., a description of how the interaction room relates to a larger area), start and end times of the interaction, and a list of all files and other data streams that are shared or recorded during the interaction. In some examples, each interaction is tracked independently such that, for a given pair of communicants, there is a list of relationship event records, each of which records a single respective interaction (e.g., sent a chat message, streamed audio for 93 seconds, shared file X, etc.). Thus, for each realtime interaction, the network infrastructure service environment 42 tracks when it occurred, where it occurred, and what happened during the interaction in terms of communicants involved (e.g., entering and exiting), objects that are activated/deactivated, and the files that were shared. Each social network profile 50 typically includes: identity characteristics (e.g., name, age, gender, and geographic location information such as postal mailing address) that describe a respective communicant or a persona that is assumed by the communicant; explicit relationship information that is declared by the communicant; and relationship information that is inferred from the communicant's interactions in the network communication environment 10.
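As a purely illustrative sketch of the records described above, the following Python fragment shows one possible shape for an interaction record and for the per-pair relationship event list; every field name and value here is an assumption rather than the platform's actual schema.

    # Hypothetical interaction record, following the fields named in the text.
    interaction_record = {
        "communicants": ["art", "beth"],
        "place": "virtual-area-instance:acme/zone:main",
        "place_hierarchy": ["acme", "main"],        # how the room relates to the larger area
        "start": "2011-02-21T17:00:00Z",
        "end": "2011-02-21T17:45:00Z",
        "shared_data": ["budget.xls", "audio-recording-0193"],
    }

    # Hypothetical per-pair relationship event records, one per interaction.
    relationship_events = [
        {"pair": ("art", "beth"), "event": "sent a chat message"},
        {"pair": ("art", "beth"), "event": "streamed audio for 93 seconds"},
        {"pair": ("art", "beth"), "event": "shared file budget.xls"},
    ]

    print(len(relationship_events), "events recorded for", interaction_record["communicants"])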
The communications applications 28, 32, the area applications 46, and the network infrastructure service environment 42 together provide a platform (referred to herein as “the platform”) that administers the realtime connections with network nodes in a communication context that is defined by an instance of a virtual area subject to a set of constraints 43 that control access to the virtual area instance.
The communications applications 28, 32 present respective views of the virtual areas 44 in accordance with data received from the network infrastructure service environment 42 and provide respective interfaces for receiving commands from the communicants and providing a spatial interface that enhances the realtime communications between the communicants. The communicants typically are represented in the virtual areas 44 by respective avatars (e.g., sprites), which typically move about the virtual areas 44 in response to commands that are input by the communicants at their respective network nodes. In some examples, the communications applications 28, 32 establish realtime data stream connections between the first and second client network nodes 12, 14 and other network nodes sharing the virtual area 44 based on the positions of the communicants' avatars in the virtual areas 44 as described in U.S. Pat. Nos. 7,769,806 and 7,844,724.
Among the software components executing on the client network nodes 12, 14 are a user interface component and a browser component. The browser component provides a set of web browsing functions, including browser functions, document viewing functions, and data downloading functions. The user interface component generates a graphical user interface that interfaces the user to the realtime communications and network browsing functionalities of the browser component. The browser component may be integrated into the communications applications 28, 32 or it may be implemented by a separate browser component (e.g., a plug-in) that exposes an API through which the communications applications 28, 32 may call methods that are available from the browser component, including browsing methods, document viewing methods, and data downloading methods.
The network connections between network nodes may be arranged in a variety of different stream handling topologies, including a peer-to-peer architecture, a server-mediated architecture, and hybrid architectures that combine aspects of peer-to-peer and server-mediated architectures.
In some embodiments, the server network node 40 remotely manages client communication sessions and remotely configures audio and graphic rendering engines on the client network nodes 12, 14, as well as switching of data streams by sending instructions (also referred to as definitions) from the remotely hosted area applications 46 to the client network nodes in accordance with the stream transport protocol described in U.S. patent application Ser. No. 12/825,512, filed Jun. 29, 2010, the entirety of which is incorporated herein by reference. In some of these embodiments, the server node(s) 40 send to each of the client nodes 12, 14 provisioning messages that configure the client nodes 12, 14 to interconnect respective data streams between active ones of their complementary sources and sinks in accordance with switching rules specified in the server applications 46.
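The following sketch illustrates, under assumed message and function names (it is not the stream transport protocol of the incorporated application), the kind of provisioning message a server node might send so that a client interconnects its active sources with the complementary sinks of its peers.

    # Hypothetical provisioning message and a helper that applies it.
    provisioning_message = {
        "type": "provision",
        "area": "acme",
        "zone": "main",
        "connect": [
            {"local_source": "microphone", "peer": "beth-pc", "remote_sink": "speaker"},
            {"local_source": "screen-share", "peer": "beth-pc", "remote_sink": "display"},
        ],
        "disconnect": [],
    }

    def apply_provisioning(message, open_stream):
        # open_stream stands for whatever client call actually opens a realtime stream.
        for link in message["connect"]:
            open_stream(link["local_source"], link["peer"], link["remote_sink"])

    apply_provisioning(
        provisioning_message,
        open_stream=lambda source, peer, sink: print(f"connect {source} -> {peer}:{sink}"),
    )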
The platform tracks communicants' realtime availabilities and activities across the different communication contexts that are defined by the area applications 46. This information is presented to the communicants in the form of realtime visualizations that enable the communicants to make more informed network interaction decisions (e.g., when to interact with a contact) and encourages the communicants to initiate interactions with other communicants and to join contexts (e.g., an ongoing conversation between communicants) of which the communicants otherwise would not have been aware. In some embodiments, the realtime visualization includes visual cues as to the presence and activities of the communicants in the contexts of the server applications. The presentation of these visual cues typically depends on one or more of governance rules associated with the virtual areas 44, administrative policies, and user preferences (including preferences regarding the exportation of the user's presence and the connection of the user to areas and other communicants), which may define tiered relationship based predicates that control access to presence information and/or resources on a zone-by-zone basis.
The people interaction toolbar 67 includes a Chat button 98 and an Invite button 102. Selection of the Chat button 98 opens a chat panel 140 (see
The audio interaction toolbar 68 includes a headphone control 84 that enables Art to toggle on and off the local speakers of the client network node, and a microphone control 86 that enables Art to toggle on and off the local microphone of the client network node.
The panel view controls 69 include a people panel button 71 for opening and closing the people panel 65, a chat panel button 73 for opening and closing a chat panel (see
The people panel 65 depicts the realtime availabilities and activities of some or all of Art's contacts across different communication contexts. In the example shown in
In the example shown in
Each communicant is represented graphically by a respective circular sprite that is labeled with a respective user name of the communicant (i.e., “Art,” “Beth,” “Carl,” “Dan,” “Ed,” “Fran,” and “Garth”). Each sprite also may be associated with a respective status line that includes additional information about the communicant. In some embodiments, each status line can include one or more of the following: location of presence (e.g., a server application or a zone of that server application); availability (e.g., busy, idle); a status message (e.g., “Out of the office next Wednesday”); and the name of the client node from which the communicant is operating (e.g., “workstation 1” or “mobile phone”). In some embodiments, the ordering of the spatial positions (e.g., from top to bottom) of the communicant avatars in each of the sections 78, 80, 82 is alphabetical by user name. In other embodiments, the spatial positions of the communicant avatars in each of the server application sections 78, 80 are ordered in accordance with the temporal ordering of the communicants in terms of the times when the communicants established their respective presences with the server applications. The spatial positions of the communicant avatars in the contacts section 82 may be sorted alphabetically by user name, according to frequency of contact, according to recentness of contact, or according to other sorting or filtering criteria.
The activities of the communicants in the contexts of the area applications 46 may be inferred from the activities on the communication channels over which the respective communicants are configured to communicate. The activities on the communication channels are shown in the graphical user interface 70 by visual cues that are depicted in association with the graphical representations of the communicants in the sections 78, 80, 82. For example, the “on” or “off” state of a communicant's local speaker channel is depicted by the presence or absence of a headphones graphic 90 on the communicant's sprite. When the speakers of the communicant who is represented by the sprite are on, the headphones graphic 90 is present (see sprites Art, Carl, and Dan) and, when the communicant's speakers are off, the headphones graphic 90 is absent (see sprites Beth and Ed). The “on” or “off” state of the communicant's microphone is depicted by the presence or absence of a microphone graphic 92 on the communicant's sprite. When the communicant's microphone is on, the microphone graphic 92 is present (see sprite Dan); and, when the communicant's microphone is off, the microphone graphic 92 is absent (see sprites Art, Beth, Carl, and Ed). The headphones graphic 90 and the microphone graphic 92 provide visual cues of the activity states of the communicant's sound playback and microphone devices. In addition, the current activity on a communicant's microphone channel is indicated by a dynamic visualization that lightens and darkens the communicant's avatar in realtime to reflect the presence or absence of audio data on the microphone channel. Thus, whether or not their local speakers are turned on, communicants can determine when another communicant is speaking by the “blinking” of the coloration of that communicant's avatar. The activity on a communicant's text chat channel is depicted by the presence or absence of the hand graphic 94 adjacent the communicant's sprite (see sprite Beth). Thus, when a communicant is transmitting text chat data to another network node, the hand graphic 94 is present, and when a communicant is not transmitting text chat data, the hand graphic 94 is not present. In some embodiments, text chat data is transmitted only when keyboard keys are depressed, in which case the visualization of the communicant's text channel appears as a flashing on and off of the hand graphic 94.
In the example shown in
Additional details regarding embodiments of the people panel 65 are described in U.S. Provisional patent application Ser. No. 61/373,914, filed Aug. 16, 2010, and U.S. patent application Ser. No. 12/354,709, filed Jan. 15, 2009.
The viewer panel 66 includes a navigation area 110 and a display area 112.
The navigation area 110 includes forward and back buttons 114, a location bar 116, a Go button 118, and a reload button 120. The forward and back buttons 114 enable a user to traverse a navigation stack of uniform resource identifier (URI) addresses (e.g., a linked list of previously visited URLs). The location bar 116 allows a user to specify a URI address of a network resource, and the Go button 118 invokes one or more browser functions on the client network node to navigate to the specified URI address and render the network resource at the specified URI address in the display area 112. The reload button 120 invokes one or more browser functions on the client network node to reload the graphic representation of the network resource currently displayed in the display area 112.
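A minimal sketch of the back/forward behavior of the navigation stack described above follows; the class and method names are illustrative and do not correspond to the client application's actual API.

    # Hypothetical navigation stack backing the forward and back buttons.
    class NavigationStack:
        def __init__(self):
            self._history = []    # previously visited URIs
            self._index = -1      # position of the currently displayed URI

        def go(self, uri):
            # Navigating to a new URI discards any "forward" entries.
            self._history = self._history[: self._index + 1]
            self._history.append(uri)
            self._index += 1
            return uri

        def back(self):
            if self._index > 0:
                self._index -= 1
            return self._history[self._index]

        def forward(self):
            if self._index < len(self._history) - 1:
                self._index += 1
            return self._history[self._index]

    nav = NavigationStack()
    nav.go("https://example.com/a")
    nav.go("https://example.com/b")
    assert nav.back() == "https://example.com/a"
    assert nav.forward() == "https://example.com/b"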
The display area 112 contains the rendered depictions of network resources located at the URI address specified in the navigation area 110. In the example shown in
In addition to the control and panel elements of the graphical user interface 70 (e.g., the people panel 65, the viewer panel 66, the people interaction toolbar 67, the audio interaction toolbar 68, and the panel view controls 71, 73, 75), the graphical user interface 70 includes a Share button 375 and a set 373 of Viewer Panel control buttons, including a Map button 376, a Browse button 378, and four View Screen buttons 380-386. The Share button 375 initiates a screen share of the contents of the display area 112 of the viewer panel 66 in connection with a view screen object in a virtual area. These contents include, for example, renderings of any information that is received by the browser component in connection with the network resource identified in the location bar 116, and a document or application that is being shared by the user in connection with a view screen object in a virtual area. The Map button 376 sets the view presented in the viewer panel 66 to a map view of the virtual area. The Browse button 378 sets the view presented in the viewer panel 66 to a browser view. Each of the four View Screen buttons 380-386 sets the viewer panel 66 to display the content being shared in connection with a corresponding one of the view screen objects in the virtual area.
Each of the communicants who is present in the virtual area is represented graphically by a respective avatar that corresponds to the communicant's avatar that is shown in the people panel 65. The virtual area is represented graphically by a two-dimensional top view of a rectangular space. In some examples, the communicants' sprites automatically are positioned in predetermined locations (or “seats”) in the virtual area when the communicants initially enter the virtual area.
The virtual area includes four view screen objects 388, 390, 392, 394 and a table object 396. Communicants interact with the objects by selecting them with an input device (e.g., by single-clicking on the objects with a computer mouse, touch pad, touch screen, or the like). The view screen objects 388-394 are associated with application sharing functionality of the platform that enables communicants to share applications operating on their respective client network nodes. The application sharing functionality is invoked by activating a view screen object (e.g., by single-clicking the view screen object with an input device). In some embodiments, the platform provides visual cues that indicate whether or not a communicant is sharing an application over an application sharing channel. In response to a communicant's selection of the view screen object, the communicant's sprite automatically is moved to a position in the graphical representation of the virtual area that is adjacent the view screen object. The position of a communicant's sprite adjacent the view screen object indicates that the communicant currently is sharing or is about to share an application with the other communicants in the virtual area. In addition, the avatar of each communicant who is viewing a shared application (including the sharing communicant) is depicted with a pair of “eyes” to indicate that the represented communicants are viewing the content being shared in connection with the view screen objects (see, e.g., the avatars of Alex and Dan in
The table object 396 is associated with file sharing functionality of the platform that enables communicants to upload computer data files to server storage in association with the virtual area and to download data files that are associated with the virtual area from the server storage to the respective client network nodes. In the example shown in
In the Map view mode, the navigational controls of the graphical user interface 70 allow the user to traverse a path through the virtual environment in accordance with a navigational model that is tied to the underlying spatial hierarchy of virtual area locations and objects within the locations. The network infrastructure service environment records the path traversed by the user. In some embodiments, the network infrastructure service environment records a history that includes a temporally ordered list of views of the virtual area locations that are presented to the user as the user navigates through the virtual area. Each view typically corresponds to a view of a respective renderable zone of the virtual area. In these embodiments, the navigation controls enable the user to move to selected ones of the zones in the history. The navigation controls also include a graphical representation of a depth path that shows the location in the spatial hierarchy that corresponds to the user's current view of the virtual area. In some embodiments, the graphical representation of the depth path includes a respective user-selectable link to a respective view of each of the preceding levels in the spatial hierarchical model of the virtual area above the current view. The back button 369 corresponds to a backward control that enables the user to incrementally move backward to preceding ones of the zones in the history of the zones that were traversed by the user. The forward button 371 corresponds to a forward control that enables the user to incrementally move forward to successive ones of the zones in the history of the zones that were traversed by the user. Some examples additionally include a placemarks button that activates a placemarking control for storing links to zones and a placemark navigation control for viewing a list of links to previously placemarked zones. In response to user selection of the placemarking control, a placemark is created by storing an image of the location shown in the current view in association with a hyperlink to the corresponding location in the virtual area. In response to a user selection of the placemark navigation control, a placemarks window is presented to the user. The placemarks window includes live visualizations (showing, e.g., where communicants are located and visual cues of their realtime activities) of all locations that have been placemarked by the user. Each of the images in the placemarks window is associated with a respective user-selectable hyperlink. In response to user selection of one of the hyperlinks in the placemarks window, a view of the virtual area corresponding to the location associated with the selected hyperlink is automatically displayed in the browsing area of the graphical user interface 70. Some examples also include a home button that corresponds to a control that returns the user to a view of a designated “home” location in the virtual environment. Additional details regarding the structure, function, and operation of examples of the navigation controls are described in U.S. patent application Ser. No. 12/354,709, filed Jan. 15, 2009.
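The placemarking behavior described above can be sketched as follows; the helper names are assumptions, and the essential idea is that a placemark pairs a snapshot image of the current view with a hyperlink to the corresponding location so that selecting it later navigates back to that location.

    # Hypothetical placemark store and navigation helpers.
    placemarks = []

    def create_placemark(current_location_uri, snapshot_image_bytes):
        # Store an image of the current view together with a link to the location.
        placemarks.append({"link": current_location_uri, "image": snapshot_image_bytes})

    def open_placemark(index, navigate):
        # `navigate` stands for whatever client function displays the linked location.
        navigate(placemarks[index]["link"])

    create_placemark("area:acme/zone:main", b"\x89PNG...")
    open_placemark(0, navigate=lambda uri: print("navigating to", uri))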
Activating the chat panel button 73 or the chat button 98 opens the chat panel 140. When the chat panel button 73 is activated, the chat panel 140 opens to show a chat interface for a persistent virtual chat area for interactions occurring in connection with a respective virtual area. In the example shown in
The chat interface of the chat panel 140 includes a chat log area 142, a text box 144, and a Send button 146. The chat panel 140 also includes a minimap view of the current zone (Zone 1) in which the user is present. In this example, the user may enter text messages in the text box 144 and activate the Send button 146 to transmit the text messages to the other communicants who are present in the zone.
The chat log area 142 displays a log of current and optionally prior events that are associated with the current zone. An exemplary set of events that may be displayed in the chat log area 142 includes: text messages that the user has exchanged with other communicants in the current zone; changes in the presence status of communicants in the current zone; changes in the speaker and microphone settings of the communicants in the current zone; and the status of the objects in the zone (discussed below), including references to any applications and data files that are shared in connection with the objects. In the illustrated embodiments, the events are labeled by the communicant's name followed by content associated with the event (e.g., a text message) or a description of the event.
The chat panel 140 provides a context for organizing the presentation of the events that are displayed in the chat log area 142. For example, in the illustrated embodiment, each of the displayed events is labeled with a respective tag that visually correlates with the appearance of the sprite of the communicant that sourced the displayed event. In particular, each of the events that is sourced by a particular one of the communicants is labeled with a respective icon 148, 150, 152, 154 having a visual appearance (e.g., color code or design pattern) that matches the visual appearance of that communicant's sprite. In this example, the color of the icons 148, 152 matches the color of the body of Art's sprite, and the color of the icons 150, 154 matches the color of the body of Beth's sprite.
In the examples described below, the platform enables a communicant to associate objects in zones of a virtual area with network resources, and maintains those associations across sessions to provide zones with persistent network resource associations that can be accessed immediately upon entry into the zones. In these examples, an object (e.g., a view screen object) in a zone of a virtual area has a configurable uniform resource identifier (URI) property that a communicant can configure to associate a network resource with the object and thereby create “spatial bookmarks” for the network resources at the respective object locations in the zones of the virtual area. In this way, a communicant can customize a zone of a persistent virtual area with any type of network accessible resources to suit any particular purpose and then share the network resources with other communicants in the zone. For example, communicants can associate view screen objects in a zone of a virtual area with respective cloud-based services that relate to a particular project or business function (e.g., finance, accounting, software development, project management). The platform stores persistent records of the state of each zone of the virtual area, including the service associations with objects and the communicant interactions (e.g., chat, recordings, shared documents) that occurred in the zone so that each time the communicants enter the zone they can continue where they left off with single-click access to the services that are relevant to the particular project or business function associated with the zone. Being able to place and keep services running in a zone of a virtual area means that meetings start with live application information (e.g., network resource information, stored documents, prior chat conversations, and recorded audio conversations) already in the zone, and can restart where communicants left a discussion at the end of the previous meeting.
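As an illustration of the configurable URI property described above, the Python sketch below shows the basic client-side flow with a stub standing in for the network infrastructure service: persist a URI on a view screen object, receive the updated property values, and fetch and display the bookmarked resource when the object is activated. Every class, method, and URL here is an assumption, not the platform's actual API.

    # Stand-in for the network infrastructure service (assumption).
    class StubAreaService:
        def __init__(self):
            self.objects = {"viewscreen-1": {}}

        def set_property(self, object_id, name, value):
            # Persist the property so the association survives across sessions.
            self.objects[object_id][name] = value
            return dict(self.objects[object_id])   # updated property values

    def bookmark_network_resource(service, object_id, uri):
        # Create the "spatial bookmark" by binding the URI to the object.
        return service.set_property(object_id, "uri", uri)

    def activate_viewscreen(properties, fetch, show):
        # fetch/show stand for the browser component's retrieval and rendering calls.
        show(fetch(properties["uri"]))

    service = StubAreaService()
    props = bookmark_network_resource(service, "viewscreen-1", "https://docs.example.com/project-plan")
    activate_viewscreen(props, fetch=lambda uri: f"<contents of {uri}>", show=print)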
Referring to
Examples of the graphical user interface elements relating to the bookmarking (or pinning) functionality of the platform are shown in
After the configuration has been saved, the bookmark button 434 is highlighted (e.g., by filling the bookmark button 434 with a brighter color) when the current page shown in the viewer panel 412 is pointing to the URI associated with the view screen object 420 as shown in
In some examples, instead of or in addition to screen sharing the rendered contents of a network resource being rendered in the viewer panel of a moderator's graphical user interface, the moderator's client network node automatically shares the URI of that network resource. These examples have particular utility when used to share a network resource (e.g., Google Docs™) that is configured to synchronize in real time the views of all the communicants who are concurrently sharing the network resource.
In addition to moderating the sharing of network resource contents with other communicants in the same virtual area, examples of the platform also enable communicants in the virtual area to “take control” of the sharing to become the new moderator and to render a “private view” of the contents of the URI being shared by the moderator in which the communicant can control the rendering and navigation independently of the moderator.
As an example of the “take control” functionality, if Alex currently is sharing a network resource identified by a URI associated with a view screen object, Don can click the view screen object to view the contents of Alex's share, and then click a “take control” button in the graphical user interface to take control of the share. Don's client network node now renders the URI contents locally and sends out a scraped image to the other communicants in the area who subscribed to the shared content by selecting the view screen object. In this way, communicants can alternately take control of the sharing session.
In some examples, the “take control” functionality is implemented without a separate “take control” button as follows. The moderator's client network node transmits the URI of a network resource being shared to each of the viewer client network nodes. Each of the viewer client network nodes populates the location bar 116 in the viewer panel of the client application graphical user interface with the transmitted URI. In response to a particular viewer communicant's selection of the Go button 118 in the navigation area 110 of the client application graphical user interface, the particular viewer communicant's client application notifies the platform that the Go button 118 was selected. The platform configures each of the network nodes involved in the sharing session so that the moderator function now is performed by the network node of the particular communicant who has taken control and the other client network nodes function as viewer network nodes. The client network node of the new moderator passes the URI to the local browser component, which renders the network resource identified by the URI in the viewer panel of the graphical user interface. The client network node of the new moderator also shares images of the output rendered by the browser component with the other client network nodes involved in the screen sharing session. The new moderator now can navigate to different network resources (e.g., web pages) or take control of a document editing session in Google Docs™.
In some examples, a viewer communicant may not be able to take control of a sharing session unless certain conditions are satisfied. For example, the platform may require the viewer to have a particular capability or it may require the current moderator to be in a particular state (e.g., inactive for a particular period).
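The hand-off described in the preceding paragraphs can be summarized with the simplified sketch below; the session object and its method names are assumptions. The essential point is the role swap: the viewer who takes control becomes the moderator that renders the URI locally and shares the rendered output, and the former moderator becomes a viewer.

    # Hypothetical sharing session that tracks the moderator and viewer roles.
    class SharingSession:
        def __init__(self, moderator, viewers, uri):
            self.moderator, self.viewers, self.uri = moderator, list(viewers), uri

        def broadcast_uri(self):
            # Moderator sends the shared URI so each viewer can populate its location bar.
            return {node: self.uri for node in self.viewers}

        def take_control(self, requester):
            # Requester becomes moderator; the former moderator becomes a viewer.
            if requester not in self.viewers:
                raise ValueError("only a current viewer may take control")
            self.viewers.remove(requester)
            self.viewers.append(self.moderator)
            self.moderator = requester

    session = SharingSession("alex-pc", ["don-pc", "beth-pc"], "https://docs.example.com/spec")
    session.broadcast_uri()                 # viewers learn the shared URI
    session.take_control("don-pc")
    assert session.moderator == "don-pc" and "alex-pc" in session.viewers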
As an example of the “private view” functionality, if Alex currently is sharing a network resource identified by a URI associated with a view screen object, Don can click a “private view” button in the graphical user interface to open a separate view of the network resource identified by the URI in a local browser application (e.g., the Firefox browser application) that is separate from the browser functionality of the communications application 28. This is useful for communicants who want to open up a second copy of the URI contents and change the view (e.g., zoom in, zoom out, scroll to a different part of the rendered contents), navigate to a different URI, make edits, etc. while still being able to watch the shared view in the graphical user interface of the communications application 28. The Private View button is in
If a particular view screen object is associated with content that currently is being shared in the zone, the associated preview panel shows a video sequence of images of the shared content. For example, Beth is sharing a local application (e.g., a Microsoft® Word document) in connection with the view screen object 388, and the associated preview panel 452 shows a stream of thumbnail images 460 of the shared local application content in the iconographic representation 464 of the view screen object 388 in the preview panel 452. In addition, the preview panel 452 shows an iconographic representation 466 of the sharing communicant (i.e., Beth). Similarly, Art is sharing a network resource (e.g., a network service application) in connection with the view screen object 390, and the associated preview panel 454 shows a stream of thumbnail images 464 of the shared network resource content in the iconographic representation 466 of the view screen object 390 in the preview panel 454, along with an iconographic representation 467 of Art.
If a particular view screen object is associated with a network resource that is not currently being shared, the associated preview panel shows a preview of a local rendering of the network resource. For example, the view screen object 392 is associated with a network resource corresponding to a video file (e.g., a YouTube™ video), and the associated preview panel 456 shows a browser image 468 of the page associated with the video file in the iconographic representation 470 of the view screen object 392 in the preview panel 456. In this process, the local client application 28 passes the URI for the video file to the local browser component, which renders a browser image 468 of the page identified by the URI in the associated preview panel 456.
The virtual environment creator 10 may communicate with the client network nodes 12, 14 in a variety of different ways in the process of enabling communicants to associate network resources with objects in a virtual area to create the visual spatial bookmarks described herein.
The web server 502 communicates with the browser component on the client network node 12. The web server 502 sends to the web browser component preformatted web documents (e.g., a hypertext markup language (HTML) document) that the browser component displays as interface components (e.g., dialog boxes) in the viewer panel. The use of web pages to create interface components on the client network node allows a system administrator to rapidly change the content and appearance of the interface components on the fly without having to change the communications application running on the client network node (e.g., by having to rewrite the client application code and distribute new executable code to the communicants in the virtual area).
The browser component sends to the web server 502 request forms that contain communicant inputs (e.g., inputs specifying object property changes) in connection with dialog boxes and objects in the virtual area. For example, when a communicant in the virtual area interacts with an object in the virtual area (e.g., the communicant positions a pointer over the graphic representation of the object), the user interface component on the client network node 12 passes the input and an identifier for the object (Object_ID) to the browser component, which sends to the web server 502 a request form that includes the Object_ID and describes the user input. In response, the web server 502 sends to the browser component a web page that is associated with the type of user interaction with the object. In some examples, the web page specifies a specific display size (e.g., 600 pixels wide by 300 pixels high) for the dialog box. The client browser component displays the web page as a dialog box of the specific size in the viewer panel.
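The request/response exchange described above might look roughly like the following sketch, in which the browser component posts a form carrying the Object_ID and the user input, and the web server answers with a page plus a declared dialog size. The endpoint behavior, field names, and size values are illustrative assumptions.

    # Hypothetical request form and server-side handler for an object interaction.
    import json
    import urllib.parse

    def build_request_form(object_id, user_input):
        # Form body the browser component would post to the web server.
        return urllib.parse.urlencode({"Object_ID": object_id, "input": user_input})

    def handle_request(form_body):
        fields = dict(urllib.parse.parse_qsl(form_body))
        # The server chooses a page for this type of interaction and declares the
        # size of the dialog box in which the client should display it.
        page = f"<html><body>Properties of {fields['Object_ID']}</body></html>"
        return {"width": 600, "height": 300, "html": page}

    form = build_request_form("viewscreen-1", "pointer-hover")
    print(json.dumps(handle_request(form), indent=2))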
The web server 502 passes object property changes to the state manager 504. The state manager 504 writes the object property changes to the object properties database 506, which contains records of all the objects in the virtual area and the properties of those objects. The state manager 504 also writes the object property changes to messages that are stored in the message queue 508.
The message queue 508 is a message broker that provides an asynchronous communications protocol between the web server 502 and the area server 510. In some examples, the message queue 508 is implemented by the HornetQ message queue available from Red Hat, Inc., N.C., U.S.A. The area server 510 registers its URI with the message queue 508, and the message queue 508 pushes the object change messages to the area server 510.
The area service 512 administers the area application and cooperates with the rendezvous service to create a virtual area for realtime communications between communicants. The area service 512 receives object property change messages from the message queue 508. The area service 512 updates the document object models (DOMs) for the changed objects based on the object property change messages, and distributes the updated DOMs to all the client network nodes that are connected to the virtual area.
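The pipeline described in the last few paragraphs can be sketched end to end as follows, using an in-memory queue as a stand-in for the message broker (this is not HornetQ's API) and assumed function names: the state manager persists an object property change and enqueues it, and the area service drains the queue and distributes the update to every client connected to the virtual area.

    # Hypothetical stand-ins for the object properties database, the message
    # queue, and the set of connected clients.
    from collections import deque

    object_properties_db = {"viewscreen-1": {"uri": None}}
    message_queue = deque()
    connected_clients = ["art-pc", "beth-pc"]

    def state_manager_apply(object_id, prop, value):
        object_properties_db[object_id][prop] = value                                  # persist the change
        message_queue.append({"object": object_id, "property": prop, "value": value})  # enqueue it

    def area_service_drain(distribute):
        # Consume queued changes and push the updated object model to each client.
        while message_queue:
            change = message_queue.popleft()
            for client in connected_clients:
                distribute(client, change)

    state_manager_apply("viewscreen-1", "uri", "https://docs.example.com/finance")
    area_service_drain(distribute=lambda client, change: print(client, "<-", change))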
Other embodiments are within the scope of the claims.
Butler, Robert J., Leacock, Matthew, Van Wie, David, Brody, Paul J., Moyers, Josh