MATHMEASURE FOR DECISIONMAKING UNDER UNCERTAINTY: ASSERBILITY

REPEATING THE DEAL: As a DECISION-GUIDE, you will be given a FORMULA involving only FRACTIONS to be added and subtracted. If you've WORKED any PROBABILITY problem, you've done the same thing. So the TACTICS haven't changed -- only the STRATEGY, since the PROBABILITY MEASURES refer to EVENTS, while the ASSERBILITY MEASURES refer to ASSERTIONS. (ASSERBILITY relates to ASSERTIONS in the way that PROBABILITY relates to EVENTS. Dig?)

And this FORMULA will show that WILDCATTING (TAKING RISKS) CAN PAY OFF BIG! (Something experience can tell you only for limited cases.)

To use it, we'll relabel its ASSERTIONS. No longer "A, B", but, respectively, "H, P" -- "H" for "HYPOTHESIS", "P" for "PREDICTION". So it runs: ((H -> P) & P) -> H.

I relabel the TABLES for this:
H  P  H -> P  (H -> P) & P  ((H -> P) & P) -> H
0  0    1          0                  1
0  1    1          1                  0
1  0    0          0                  1
1  1    1          1                  1
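
If you care to check that TABLE by machine, here is a minimal sketch in Python (the helper names implies and fac are mine, for illustration only -- this is not the ASSERBILITY CALCULATOR mentioned later). It enumerates the four cases and counts the FAILURES:

```python
# Minimal sketch: enumerate the truth table for ((H -> P) & P) -> H
# and count the rows where the whole form comes out false (0).

def implies(a, b):
    # material conditional: a -> b
    return (not a) or b

def fac(h, p):
    # ((H -> P) & P) -> H
    return implies(implies(h, p) and p, h)

rows = [(h, p, fac(h, p)) for h in (0, 1) for p in (0, 1)]
failures = sum(1 for _, _, value in rows if not value)

for h, p, value in rows:
    print(h, p, int(value))
print("failures:", failures, "out of", len(rows))  # expect 1 out of 4
```
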
However, I admitted that this uses something the Logician labels the "FALLACY OF ASSERTING THE CONSEQUENT" (FAC). But note:
  1. FAC FAILS BY JUST ONE CASE OUT OF FOUR.
  2. That single FAILURE QUALIFIES IT TO DEAL WITH REALITY, whereas NO FAILURE (as with MP, MODUS PONENS) TAKES IT OUT OF REALITY AND PUTS IT INTO LANGUAGE!

So what's so great about this FORM, otherwise?

Here's a clue to what you'll learn: the LAST COLUMN OF THE TABLE ABOVE DISPLAYS ONLY ONE POSSIBILITY OUT OF FOUR OF THE CONCLUSION BEING UNTRUE.

The "great stuff" is the "positive part" can get better, with "the negative" remaining CONSTANT.

  1. Suppose that, besides the CONFIRMED PREDICTION P (now labeled P1), Hypothesis H makes another prediction P2 WHICH IS ALSO CONFIRMED.

  2. The FORM becomes ((H -> (P1 & P2)) & (P1 & P2)) -> H. Here is the TABLE for this:

    H  P1  P2  P1 & P2  H -> (P1 & P2)  (H -> (P1 & P2)) & (P1 & P2)  ((H -> (P1 & P2)) & (P1 & P2)) -> H
    0  0   0      0            1                      0                                 1
    0  0   1      0            1                      0                                 1
    0  1   0      0            1                      0                                 1
    0  1   1      1            1                      1                                 0
    1  0   0      0            0                      0                                 1
    1  0   1      0            0                      0                                 1
    1  1   0      0            0                      0                                 1
    1  1   1      1            1                      1                                 1

    • THE NUMBER OF ROWS (POSSIBILITIES) DOUBLED, from 4 ROWS to 8 ROWS.

    • THE NUMBER OF POSSIBLE FAILURES REMAINED THE SAME, 1 -- NOW 1 OUT OF 8!

    Continuing, if H MADE ONE MORE PREDICTION THAT WAS ALSO CONFIRMED,

    • THE POSSIBILITIES WOULD DOUBLE TO 16,

    • with still only 1 POSSIBILITY OUT OF THE 16 OF FAILURE!
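
If you'd like to confirm this doubling pattern for any number of CONFIRMED PREDICTIONS without drawing TABLES, here is a minimal Python sketch along the same lines (again, my own helper names -- not the ASSERBILITY FORMULA or CALCULATOR promised below):

```python
from itertools import product

def implies(a, b):
    # material conditional: a -> b
    return (not a) or b

def fac_counts(n):
    """Count rows and failures for ((H -> (P1 & ... & Pn)) & (P1 & ... & Pn)) -> H."""
    rows = 0
    failures = 0
    for h, *ps in product((0, 1), repeat=n + 1):
        conj = all(ps)                              # P1 & ... & Pn
        form = implies(implies(h, conj) and conj, h)
        rows += 1
        if not form:
            failures += 1
    return rows, failures

for n in (1, 2, 3, 4):
    rows, failures = fac_counts(n)
    print(n, "prediction(s):", failures, "failure out of", rows, "rows")
# Expected: 1 out of 4, 1 out of 8, 1 out of 16, 1 out of 32.
```

Counting ROWS this way, the failing fraction runs 1/4, 1/8, 1/16, ... -- it shrinks every time another CONFIRMED PREDICTION is added, while the count of FAILURES stays fixed at 1.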

    "SUCCESS BREEDS SUCCESS!" As long as THE NUMBER OF CONFIRMED PREDICTIONS FOLLOW FROM THE SINGLE HYPOTHESIS, the "SCORE" gets BETTER and BETTER. That's the "great thing" about FAC.

    What? You're crying over the work involved in TABLES and READING THEM? Not to worry! I'll give you an ASSERBILITY FORMULA, which will calculate ALL IN A MINUTE. And, in another file, I'll give you an ASSERBILITY CALCULATOR.

    First, THE ASSERBILITY FORMULA, PROVIDING A MATHMEASURE FOR DECISIONMAKING UNDER UNCERTAINTY.