A system is described for improving decision making for individuals, groups, and organizations. The system enables participants to collaboratively contribute informational statements as well as supporting and opposing arguments, share them, and rate them. The system visually helps participants prioritize arguments, and allows participants to rate arguments on clarity, agreement, and relevance. The system includes multiple levels of authority, both inherent and topic-assignable, which enable posts to be made visible or invisible based on authority level and/or “need to know”. The system allows decision makers to broaden participation in decisions while keeping the input and process manageable, yielding better decisions. All posts, ratings, and edits of posts and ratings within the system are archived along with any reasons given, and the archives are reviewable by users.
1. A system for editing and displaying a structured argument, having a plurality of associated parameters, the system comprising:
a processor operative to execute computer executable instructions; and
a computer readable medium that stores the computer executable instructions, the computer executable instructions comprising: instructions to display a user interface that displays the plurality of parameters at a user accessible display and receives input from a user defining the value of a selected parameter, wherein the plurality of parameters comprise an over-arching assertion and a plurality of sub-assertions with respective subjective user-defined ratings for said sub-assertions, and descriptions for said sub-assertions;
a computational engine that alters the selected parameter to the defined value, updates the plurality of parameters according to the defined value of the selected parameter, and displays the altered parameters on the user interface, such that the display is updated in real time to reflect the user input; said user interface further comprising means for displaying on one screen at least one sub-assertion, ratings for all displayed sub-assertions, and descriptions for all displayed sub-assertions.
32. A system for editing and displaying a structured argument, having a plurality of associated parameters, the system comprising:
a processor operative to execute computer executable instructions; and
a computer readable medium that stores the computer executable instructions, the computer executable instructions comprising: instructions to display a user interface that displays the plurality of parameters at a user accessible display and receives input from a user defining the value of a selected parameter, wherein the plurality of parameters comprise an over-arching assertion and a plurality of sub-assertions with respective subjective user-defined ratings for said sub-assertions, and descriptions for said sub-assertions;
a computational engine that alters the selected parameter to the defined value, updates the plurality of parameters according to the defined value of the selected parameter, and displays the altered parameters on the user interface, such that the display is updated in real time to reflect the user input; said user interface further comprising means for displaying on one screen at least one sub-assertion, and ratings for all displayed sub-assertions, where each displayed sub-assertion has displayed at least one rating, including at least one user rating.
33. A system for editing and displaying a structured argument, having a plurality of associated parameters, the system comprising:
a processor operative to execute computer executable instructions; and
a computer readable medium that stores the computer executable instructions, the computer executable instructions comprising: instructions to display a user interface that displays the plurality of parameters at a user accessible display and receives input from a user defining the value of a selected parameter, wherein the plurality of parameters comprise an over-arching assertion and a plurality of sub-assertions with respective subjective user-defined ratings for said sub-assertions, and descriptions for said sub-assertions;
a computational engine that alters the selected parameter to the defined value, updates the plurality of parameters according to the defined value of the selected parameter, and displays the altered parameters on the user interface, such that the display is updated in real time to reflect the user input; said user interface further comprising means for displaying on one screen at least one sub-assertion, and ratings for all displayed sub-assertions, where the assertions are displayed sorted such that first appear assertions that have not yet been rated, and next appear rated assertions in descending order from highest rated to lowest rated.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
7. The system of
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. The system of
14. The system of
15. The system of
16. The system of
17. The system of
18. The system of
19. The system of
20. The system of
21. The system of
22. The system of
23. The system of
24. The system of
second user's personal ratings,
ratings mathematically derived from a group of users,
ratings mathematically derived from a plurality of said second user's ratings,
first user's personal ratings.
25. The system of
26. The system of
27. The system of
28. The system of
29. The system of
30. The system of
31. The system of
The field of the invention relates to software-assisted decision making, public debate, structured arguments, and more specifically to multi-user software for engaging in debate and evaluation of assertions.
Individuals, groups, and organizations make decisions on a daily basis. Good decisions require an evaluation of the pros and cons supporting and opposing a decision. The implementation of decisions is typically more successful when parties responsible for implementation are involved in the decision-making process early and can raise pros and cons as well. Unfortunately, it can be difficult to coordinate groups of people, truly understand their positions on issues, and arrive at an optimal decision.
There are many tools that provide information for decision making, such as search engines and wikis; many tools for gathering feedback, such as surveys and blogs; and even decision-analysis tools that help with a number of these issues. However, human beings do not always make decisions based on logic, and these tools often do not help identify and distinguish emotional from logical reasoning.
There is a need for technologies which assist individuals to better understand their own positions on issues, and technologies that help them better explain those positions to others. There is also a need for innovative tools that help group or organization decision makers to better understand their constituents' preferences and reasons for those preferences.
Within any debate or topic analysis, people may attempt to bias decision-making by hiding information, presenting information in a biased way, etc. There is a need for innovative technologies which allow participants to have more confidence in decision-making processes by structuring how arguments are presented and vetted, and by openly showing changes which occur over time to neutral, supporting, and opposing arguments. All overarching assertions are considered by definition to be neutral arguments, because they are not sub-assertions below other assertions. Another example of a neutral argument would be a restatement of an overarching assertion or sub-assertion, placed as a sub-assertion below the assertion it is restating. Such a restatement might be done, for instance, to suggest a clearer way of stating the assertion.
It is an object of the present invention to provide a technology which assists individuals to better understand their own positions, explain them to others, and for group or organization decision makers to better understand participants' preferences and reasons for those preferences. It is a further object of the present invention to provide a debate and decision-making framework which allows participants to have more confidence in decision-making processes by structuring how arguments are presented and vetted, and by openly showing changes to context and neutral, supporting, and opposing arguments.
The present invention provides a framework for assisting individuals in decision making and managing decision making for groups and organizations. In one aspect, the present invention aggregates neutral and informational statements as well as supporting and opposing arguments relating to an assertion into an electronic archive accessible through a network, such that all submitted arguments may be reviewed by the public or appropriate members of the group or organization. In a preferred embodiment, one aspect of the present invention allows those reading an argument (assertion) to rate it for clarity. In a preferred embodiment, once an assertion has been sufficiently rated as clear, users may rate the assertion on an agreement/disagreement scale and on a scale of how relevant/irrelevant they perceive the assertion to be to an overarching topic or assertion, and users may add neutral, supporting or opposing arguments. Within this document, the terms “argument” and “assertion” are used interchangeably.
In a preferred embodiment, arguments are by default sorted to visually help participants prioritize and more accurately rate each argument relative to others. One aspect of the present invention allows participants to enter a reason for each rating they make, both so they can easily be reminded of their thinking later and so they can make their reasoning visible to others. In one embodiment, users may check a checkbox to keep their reasons private, or may elect to make their reasoning visible to group decision makers but not to other participants. Since group decision makers benefit from knowing participants' reasoning, one aspect of the present invention allows a requirement that participants include a reason (visible to group decision makers) for each rating they give.
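As an illustrative, non-authoritative sketch of how such a rating with an attached reason and a visibility choice might be represented, the following Python fragment uses hypothetical names (Rating, ReasonVisibility, validate_rating) that do not appear in the specification:

```python
from dataclasses import dataclass
from enum import Enum

class ReasonVisibility(Enum):
    PUBLIC = "public"                      # visible to all participants
    DECISION_MAKERS = "decision_makers"    # visible only to group decision makers
    PRIVATE = "private"                    # visible only to the rating's author

@dataclass
class Rating:
    user_id: str
    assertion_id: str
    scale: str          # e.g. "clarity", "agreement", or "relevance"
    value: int          # e.g. a value on a 1-10 scale
    reason: str = ""    # optional free-text justification
    reason_visibility: ReasonVisibility = ReasonVisibility.PUBLIC

def validate_rating(rating: Rating, reason_required: bool) -> None:
    """Reject a rating without a reason when the topic owner requires one."""
    if reason_required and not rating.reason.strip():
        raise ValueError("A reason must accompany this rating.")
```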
After participants have rated an assertion using the present invention, a project manager or group decision maker can see the relative preferences of participants as a whole and individually, but most importantly, can also understand why participants chose the ratings values they did. By collecting information from all participants, decision makers can learn whether particular participants have information that may not be commonly known to the rest of the group. Decision makers can also learn which participants may have misinformation. Such knowledge facilitates better management of the decision-making process, and enables better decisions.
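A minimal sketch of how a decision maker's summary view might be computed from such ratings, assuming the hypothetical Rating records sketched above (the grouping and averaging are illustrative, not prescribed by the specification):

```python
from collections import defaultdict
from statistics import mean

def summarize_ratings(ratings):
    """Group ratings by assertion: the group mean plus each participant's value and
    reason, so a decision maker sees both the overall preference and why individual
    participants chose the ratings they did."""
    by_assertion = defaultdict(list)
    for r in ratings:
        by_assertion[r.assertion_id].append(r)
    summary = {}
    for assertion_id, rs in by_assertion.items():
        summary[assertion_id] = {
            "mean": mean(r.value for r in rs),
            "individual": [(r.user_id, r.value, r.reason) for r in rs],
        }
    return summary
```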
In a preferred embodiment, each user is assigned one of seven global levels of authority, and different levels of authority grant access to different features. Features available at a given authority level include enabling or disabling ratings features and sub-argument creation features, and limiting access to content, for those of lesser authority. In a preferred embodiment, the authority structure of the present invention is a tree structure. This allows, for example, a manager at a given level to control the system features available to his or her direct reports, without having such control over the features available to the direct reports of another manager at the same level.
In a preferred embodiment, users at a given level of authority over a specific topic may assign that level of authority over that topic to someone who has a lesser level of authority over that topic. Such assigned authority is conditional on the continued approval of the person who assigned it, and the person who assigned it may remove that assigned level of authority at any time, even if that assigned level of authority is equivalent to their own level of authority. A preferred embodiment also allows someone at a given level of authority to make a topic or arguments pertaining to that topic (or reasons for arguments) visible or invisible to others of lesser authority, for instance based on “need to know”. In a preferred embodiment, users may log into the system using another credentialing system to identify themselves (for instance Microsoft Active Directory, Google account, or Facebook account).
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The hardware of a preferred embodiment of the present invention is depicted in
In a web-hosted preferred embodiment, host computer 1700 is connected to a network (such as the Internet 1706), and a remote computer 1707 with remote display 1709, remote keyboard 1708, and remote pointing device 1710 may be used to access the features of the present invention. Remote computer 1707 may be a smart phone, tablet, laptop PC, desktop PC, or any other computing device as may become available, and pointing device 1710 may be a mouse, track ball, touch screen, or any other pointing device as may become available.
Several aspects of the present invention take the form of a tree structure.
In a preferred embodiment, the next level of the argument tree is a set of overarching assertions. In this example, overarching assertions (arguments) 1403 and 1404 each concern the environment. For instance, overarching assertion 1404 may be an assertion about global warming, and overarching assertion 1403 may be about the environmental effects of mercury released into the atmosphere through the burning of coal.
The next level of this tree in a preferred embodiment of the present invention is a set of sub-arguments (sub-assertions) below each overarching assertion. A sub-assertion directly below any given assertion may herein be referred to as a child assertion of that given assertion. Likewise, a sub-assertion two levels below a given assertion may herein be referred to as a grandchild assertion of that given assertion. Likewise, an assertion which has a sub-assertion directly beneath it may herein be referred to as a parent assertion of the assertion which is directly beneath it, and an assertion one level above a parent assertion may herein be referred to as a grandparent assertion.
In a preferred embodiment, some of these sub-assertions may support the overarching assertion and some may oppose the overarching assertion. For instance, if overarching assertion 1404 is that global warming is predominantly man-made, sub-argument 1405 may be a supporting sub-argument that shows that average atmospheric temperatures have been rising as atmospheric carbon dioxide has been rising, and opposing sub-argument 1406 might be an assertion that glacial ice records show such rise and fall of global temperatures in the past before mankind existed. In a preferred embodiment, each sub-assertion such as sub-assertion 1405 may become its own overarching assertion, and have supporting and opposing sub-assertions (also herein referred to as “child” arguments) such as child arguments 1407 and 1408, that may support or oppose it.
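A minimal sketch of the argument tree described above, using hypothetical Python names (Assertion, add_child, stance) not drawn from the specification; the example text mirrors the global-warming illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Assertion:
    text: str
    stance: str = "neutral"                  # "supporting", "opposing", or "neutral" relative to its parent
    parent: Optional["Assertion"] = None
    children: List["Assertion"] = field(default_factory=list)

    def add_child(self, text: str, stance: str) -> "Assertion":
        """Attach a sub-assertion (child); the child may later acquire children of its own."""
        child = Assertion(text=text, stance=stance, parent=self)
        self.children.append(child)
        return child

# Illustrative tree mirroring the discussion above:
warming = Assertion("Global warming is predominantly man-made")
support = warming.add_child("Average temperatures have risen as atmospheric CO2 has risen", "supporting")
oppose = warming.add_child("Ice records show comparable temperature swings before mankind existed", "opposing")
```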
In a preferred embodiment, levels of authority within the present invention are also organized in a tree structure.
Within this document, any first person to whom a second person does not report directly or indirectly, and who does not report to that second person directly or indirectly, is said to hold a parallel level of authority. Thus persons 1502, 1505, 1507, and 1508 all hold parallel levels of authority with respect to person 1503. In a preferred embodiment, when a user has authority over an assertion or topic within the present invention, he or she may revocably assign that level of authority over that topic or assertion to anyone of subordinate or parallel authority.
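The “parallel authority” relationship described above can be sketched as follows, assuming a hypothetical parent_of mapping from each person to his or her direct superior (the names and structure are illustrative only):

```python
def ancestors(person, parent_of):
    """Yield each person above the given person in the reporting tree."""
    p = parent_of.get(person)
    while p is not None:
        yield p
        p = parent_of.get(p)

def holds_parallel_authority(a, b, parent_of):
    """Two people hold parallel authority when neither reports to the other,
    directly or indirectly."""
    return a != b and a not in ancestors(b, parent_of) and b not in ancestors(a, parent_of)

# Example: parent_of maps each person to their direct superior,
# e.g. {"1502": "1501", "1503": "1501", "1505": "1502", ...}
```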
In a preferred embodiment, any person at a given level of authority in the authority tree of
In a preferred embodiment, persons of parallel authority (for instance persons at authority levels 1502 and 1503) do not have authority over topics created by each other. In a preferred embodiment, when an overarching topic is created within the present invention by a person at authority level 1502, that person may confer a subordinate level of authority over that topic to a person of parallel authority (for instance the person at authority level 1503). Such a conferred subordinate authority level is topic-specific or assertion-specific. This aspect of the present invention enables peers to each have authority over topics and/or overarching assertions they create, and the arguments that are entered by others under those topics.
In a preferred embodiment, a person at (for example) authority level 1503 may set parameters regarding topics and overarching assertions he/she creates within the system (for instance a parameter requiring others to rate an overarching assertion for clarity prior to any sub-arguments being presented), and for persons of lesser authority such limitation is binding, while for persons of higher authority the limitation appears as a request that can be overridden.
In a preferred embodiment, anyone at a given level of authority may confer that level of authority to someone who is not his superior if he so desires, and may later revoke that authority. In a preferred embodiment, all aspects of a person's authority may be conferred, or only some aspects may be conferred. For instance, if a vice president at level of authority 1502 is charged with making global-warming-related recommendations to the CEO, and is utilizing the present invention within his division of the company to derive and vet a global warming policy, but will be in the hospital for the next two weeks undergoing heart surgery, he might choose to confer his authority over only that one topic to the person at authority level 1504, while he might choose to confer authority over another topic within the present invention to his subordinate at level of authority 1505.
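A minimal sketch of revocable, topic-specific authority grants as described above, using hypothetical names (TopicAuthority, confer, revoke) that are not taken from the specification:

```python
class TopicAuthority:
    """Tracks revocable, topic-specific grants of authority.

    A grant remains conditional on the grantor's continued approval and may be
    revoked at any time, even when the granted level equals the grantor's own."""

    def __init__(self):
        self._grants = {}  # (topic_id, grantee) -> (grantor, level)

    def confer(self, grantor, grantee, topic_id, level):
        self._grants[(topic_id, grantee)] = (grantor, level)

    def revoke(self, grantor, grantee, topic_id):
        key = (topic_id, grantee)
        if key in self._grants and self._grants[key][0] == grantor:
            del self._grants[key]

    def level_over(self, person, topic_id, inherent_level):
        """A person's effective level over a topic: any conferred level, else the inherent one."""
        grant = self._grants.get((topic_id, person))
        return grant[1] if grant else inherent_level
```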
A preferred embodiment of the present invention allows someone at a given level of authority to make topics, overarching assertions, and any sub-arguments at levels below an overarching assertion visible or invisible to persons lower on the authority tree. This aspect of the present invention facilitates involving people in a project based on their “need to know”. For instance, the CEO (at level of authority 1501) may conceive a strategic decision-making project in which he wants only eight other people in the corporation to participate. He may set up the “need to know” tree depicted in
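The “need to know” visibility described above might be sketched as a simple filter, assuming each assertion carries a hypothetical topic_id attribute and an allowed_users mapping lists the participants permitted on a restricted topic (illustrative only):

```python
def visible_assertions(assertions, user, allowed_users):
    """Filter assertions by 'need to know': an assertion whose topic has an explicit
    participant list is shown only to users on that list; unrestricted topics remain
    visible to everyone."""
    result = []
    for a in assertions:
        participants = allowed_users.get(a.topic_id)   # None means no restriction
        if participants is None or user in participants:
            result.append(a)
    return result
```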
Now that we have described the levels of authority within a preferred embodiment of the present invention, we will describe how the argument-vetting features of a preferred embodiment of the present invention are used.
Sub-assertions 106, 108, etc. are shown below overarching assertion 100 in order of user-rated relevance. The user assigns agreement ratings 110, 111, etc. after reading sub-assertions 106, 108, etc. (and perhaps explanations/detailed descriptions 107, 109). In the embodiment shown, a higher agreement number indicates “more agreement”, and a lower agreement rating indicates “less agreement”. In a preferred embodiment, when list 114 of sub-arguments cannot be displayed on one screen, list 114 automatically becomes a scrolling list, and a vertical scrolling bar appears to the right of list 114. In a preferred embodiment, list 114 takes up at least 50% of the width of the display on which the interface of
In a preferred embodiment, assertions are displayed sorted such that first appear assertions that have not yet been rated, and next appear rated assertions in descending order from highest rated to lowest rated. This naturally leads a user to first rate all assertions, and subsequently by default to first have visible the assertions that were felt to be the most useful in evaluating an overarching assertion or sub-assertion.
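A minimal sketch of this default sort order, assuming each assertion has a hypothetical id attribute and user_ratings maps assertion ids to the viewing user's ratings (illustrative only):

```python
def sort_for_display(assertions, user_ratings):
    """Unrated assertions first (in their original order), then rated assertions
    from highest rated to lowest rated."""
    unrated = [a for a in assertions if a.id not in user_ratings]
    rated = [a for a in assertions if a.id in user_ratings]
    rated.sort(key=lambda a: user_ratings[a.id], reverse=True)
    return unrated + rated
```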
Button 101 selects the “Full” display mode shown in
Mode button 104 selects the display mode shown in
In a preferred embodiment, more than one column of user-assigned ratings (such as ratings column 115) may be visible in the user interface depicted in
Requiring ratings of different types at different times may serve different purposes. When an overarching assertion is first posted, for instance, it may be desirable to make sure the assertion appears clear and unambiguous to everyone who will be involved in a decision-making process concerning that assertion before debate about the assertion begins. It may therefore be desirable for a person of sufficient authority to require that everyone rate the clarity of the assertion (for instance on a scale from 1 (not clear at all) to 10 (perfectly unambiguously clear)) before any agreement/disagreement ratings or supporting or opposing arguments may be posted. For instance, if a manager posted the overarching assertion “customers who have bought product X extremely often on Tuesday afternoons call our call center with complaints”, a user might rate that assertion very unclear, because someone who 70% agrees might be agreeing totally with the “extremely often”, disagreeing with the “Tuesday afternoons”, and partly agreeing with the “with complaints”. It might also be pointed out that it is unclear whether the “extremely often” refers to how often the customer has bought the product or to how often the customer calls.
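The clarity-gating requirement described above might be sketched as follows; the threshold and minimum count are assumed example values, not values given in the specification:

```python
CLARITY_THRESHOLD = 7.0    # assumed example threshold on a 1-10 clarity scale
MIN_CLARITY_RATINGS = 3    # assumed minimum number of clarity ratings

def agreement_rating_open(clarity_ratings):
    """Agreement/disagreement ratings and sub-arguments stay locked until the
    assertion has been rated sufficiently clear by enough participants."""
    if len(clarity_ratings) < MIN_CLARITY_RATINGS:
        return False
    return sum(clarity_ratings) / len(clarity_ratings) >= CLARITY_THRESHOLD
```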
In a preferred embodiment, a user of sufficient authority may open an interface similar to that shown in
Within this document, the term “hover-over event” describes a condition where a user has positioned a graphical pointer within a defined space for longer than some pre-determined amount of time. Such an event may indicate curiosity about a graphical construct over which the user has hovered the graphical pointer. Such events are commonly detected in HTML-based web interfaces and used to pop up information boxes or dialog boxes.
The foregoing discussion should be understood as illustrative and should not be considered to be limiting in any sense. While this invention has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the claims.
Inventors: Weinstein, Lee; McDonald, Alex