This function performs the QCA minimization of an input truth table or, if the input is a dataset, it minimizes a set of causal conditions with respect to an outcome. Three minimization methods are available: the classical Quine-McCluskey, the enhanced Quine-McCluskey, and the latest Consistency Cubes algorithm, which is built for performance.
All three algorithms return exactly the same solutions; see Dusa (2018) and Dusa and Thiem (2015).
minimize(input, include = "", exclude = NULL, dir.exp = NULL, details = FALSE, all.sol = FALSE, pi.cons = 0, pi.depth = 0, sol.cons = 0, sol.cov = 1, sol.depth = 0, row.dom = FALSE, min.pin = FALSE, max.comb = 0, first.min = FALSE, method = "CCubes", ...)
input: A truth table object (preferred) or a data frame containing calibrated causal conditions and an outcome.
include: A vector of other output values to include in the minimization process.
exclude: A vector of truth table row numbers, or a matrix of causal combinations, to exclude from the minimization process.
dir.exp: Character, a vector of directional expectations to derive intermediate solutions.
details: Logical, print more details about the solution.
all.sol: Logical, search for all possible solutions.
pi.cons: Numerical fuzzy value between 0 and 1, minimal consistency threshold for a prime implicant to be declared as sufficient.
pi.depth: Integer, the maximum number of causal conditions to be used when searching for conjunctive prime implicants.
sol.cons: Numerical fuzzy value between 0 and 1, minimal consistency threshold for a model to be declared as sufficient.
sol.cov: Numerical fuzzy value between 0 and 1, minimal coverage threshold for a model to be declared as necessary.
sol.depth: Integer, the maximum number of prime implicants to be used when searching for disjunctive solutions.
row.dom: Logical, perform row dominance in the prime implicants' chart to eliminate redundant prime implicants.
min.pin: Logical, terminate the search at the depth where newly found prime implicants do not contribute to minimally solving the PI chart.
max.comb: Numeric, to limit the size of the PI chart, counted in (fractions of) billions of combinations (see Details).
first.min: Logical, return only the very first minimal solution (see Details).
method: Minimization method, one of "CCubes" (default), "QMC" (the classical Quine-McCluskey) or "eQMC" (the enhanced Quine-McCluskey).
...: Other arguments to be passed to the function truthTable().
Most of the time, this function takes a truth table object as the input for the minimization procedure, but the same argument can refer to a data frame containing calibrated columns.
For the latter case, the function minimize() originally had some additional formal arguments which were sent to the function truthTable(): outcome, conditions, n.cut, incl.cut, show.cases, use.letters and inf.test.
All of these arguments are still accepted by the function minimize(), but since they are passed to the truthTable() function anyway, it is unnecessary to duplicate their explanation here. The only situation which does need an additional description relates to the argument outcome: unlike truthTable(), which accepts a single outcome, the function minimize() accepts multiple outcomes and performs a minimization for each of them (a situation when all columns are considered causal conditions).
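For instance, a calibrated data frame can be passed directly, with the truth table arguments supplied through the ... argument; a minimal sketch, using the LC (Lipset crisp-set) data shipped with the package and an illustrative inclusion cut-off:

library(QCA)

# data frame input: outcome and incl.cut are passed on to truthTable()
minimize(LC, outcome = "SURV", incl.cut = 0.9, include = "?")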
The argument include specifies which other truth table rows are included in the minimization process. Most often, the remainders are included, but any value accepted by the argument explain is also accepted by the argument include.
The argument exclude
is used to exclude truth table rows from the
minimization process, from the positive configurations and/or from the remainders.
It can be specified as a vector of truth table line numbers, or as a matrix of
causal combinations.
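For example, the remainders can be brought in through include while specific rows are kept out through exclude; a minimal sketch (the excluded row number below is purely illustrative):

ttLC <- truthTable(LC, outcome = "SURV", sort.by = "incl, n")

# include the remainders, but keep truth table row 30 out of the minimization
minimize(ttLC, include = "?", exclude = 30)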
The argument dir.exp
is used to specify directional expectations,
as described by Ragin (2003). They can be specified using SOP expressions, which opens
up the possibility to experiment with conjunctural directional expectations. "Don't care"
conditions are simply left unspecified.
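Both styles are illustrated in the Examples section; in short, using the fuzzy Lipset data LF shipped with the package:

ttLF <- truthTable(LF, outcome = "SURV", incl.cut = 0.8)

# atomic directional expectations for all five conditions
minimize(ttLF, include = "?", dir.exp = "DEV, URB, LIT, IND, STB")

# conjunctural expectation, with URB left unspecified ("don't care")
minimize(ttLF, include = "?", dir.exp = "DEV, STB, ~LIT*IND")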
Activating the details
argument has the effect of printing
parameters of fit for each prime implicant and each overall solution, the essential prime
implicants being listed in the top part of the table. It also prints the truth
table, in case the argument input
has been provided as a data frame
instead of a truth table object.
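A one-line illustration, reusing the ttLC truth table from the sketch above:

# print parameters of fit and the cases covered by each prime implicant
minimize(ttLC, details = TRUE, show.cases = TRUE)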
By default, the package QCA employs a different search algorithm based on Consistency Cubes (Dusa, 2018), analysing all possible combinations of causal conditions and all possible combinations of their respective levels. The structure of the input dataset (number of causal conditions, number of levels, number of unique rows in the truth table) has a direct implication on the search time, as all of those characteristics become entry parameters when calculating all possible combinations.
Consequently, two kinds of depth arguments are provided:
pi.depth: the maximum number of causal conditions needed to construct a prime implicant, the complexity level where the search can be stopped, as long as the PI chart can be solved.
sol.depth: the maximum number of prime implicants needed to find a solution (to cover all initial positive output configurations).
These arguments introduce a possible new way of deriving prime implicants and solutions
that can lead to different results (i.e. even more parsimonious) compared to the classical
Quine-McCluskey. When either of them is modified from the default value of 0, the minimization
method is automatically set to "CCubes"
and the remainders are
automatically included in the minimization.
The higher these depths, the longer the search time. Conversely, the search time can be significantly shorter if these depths are smaller. Irrespective of how large pi.depth is, the algorithm will always stop at a maximum complexity level where no new, non-redundant prime implicants are found. The argument sol.depth is relevant only when activating the argument all.sol to solve the PI chart.
The default method (when all.sol = FALSE) is to find the minimal number (k) of prime implicants needed to cover all initial positive output configurations (minterms), then exhaustively search through all possible disjunctions of k prime implicants which do cover those configurations.
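A sketch of the exhaustive search with a capped solution depth (the cap below is purely illustrative), using the Cebotari & Vink data CVF shipped with the package:

ttCVF <- truthTable(CVF, outcome = "PROTEST", incl.cut = 0.8)

# search all irredundant models, using at most 4 prime implicants
# per disjunctive solution
minimize(ttCVF, include = "?", all.sol = TRUE, sol.depth = 4)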
The argument min.pin introduces an additional parameter to control when to stop the search for prime implicants. It is especially useful for very large truth tables, and draws on an observation by Dusa (2018) that, out of the entire set of non-redundant prime implicants, only a subset actually contribute to solving the PI chart. The search depth can be shortened when the PIs found at the next complexity level do not decrease the minimum k, thus producing absolutely minimal models in both the number of disjunctive prime implicants and their depth level.
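A minimal sketch, reusing the ttCVF truth table from the sketch above:

# stop the search once deeper prime implicants no longer reduce
# the minimum number of PIs needed to solve the chart
minimize(ttCVF, include = "?", min.pin = TRUE)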
Once the PI chart is constructed using the prime implicants found in the previous stages, the argument row.dom can be used to further eliminate irrelevant prime implicants when solving the PI chart, applying the principle of row dominance: if a prime implicant A covers the same (initial) positive output configurations as another prime implicant B and at the same time covers other configurations which B does not cover, then B is irrelevant and is eliminated.
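A short sketch mirroring the Examples section below:

# solve the PI chart keeping only the dominant prime implicants
minimize(ttCVF, include = "?", row.dom = TRUE, details = TRUE)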
Depending on the complexity of the PI chart, it may sometimes take a very long time to identify all possible irredundant disjunctive solutions (disjunctions that are not subsets of previously found ones). In such a situation, the number of combinations of all possible numbers of prime implicants is potentially too large to be solved in polynomial time and, if not otherwise specified, the depth for the disjunctive solutions is automatically bounded to 5 prime implicants.
If minimizing a dataset instead of a truth table, unless otherwise specified the argument incl.cut is automatically set to the minimum value between pi.cons and sol.cons, then passed to the function truthTable().
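A minimal sketch with illustrative thresholds; here incl.cut is implicitly set to min(0.85, 0.8) = 0.8 before the truth table is constructed:

# dataset input, with consistency thresholds for PIs and for models
minimize(CVF, outcome = "PROTEST", include = "?",
         pi.cons = 0.85, sol.cons = 0.8)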
The argument sol.cons introduces another possibility to change the method of solving the PI chart. Normally, once the solutions are found among all possible combinations of k prime implicants, consistencies and coverages are subsequently calculated. When sol.cons is lower than 1, then solutions are searched based on their consistencies, which should be at least equal to this threshold.
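A minimal sketch with an illustrative threshold, reusing the ttLF truth table from the directional expectations sketch above:

# retain only models with a consistency of at least 0.85
minimize(ttLF, include = "?", sol.cons = 0.85)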
A large number of causal conditions (i.e. over 15), combined with a large number of cases (i.e. hundreds) usually produce a very large number of prime implicants, resulting in a huge and extremely complex PI chart with sometimes thousands of rows and hundreds of columns.
For such a complex PI chart, even finding a minimum is a formidable task, and exhaustively
solving it is very likely impossible in polynomial time. For this reason, after each level
of complexity the CCubes algorithm determines if the PI chart is too difficult, by
calculating the total number of combinations of minimum k
PIs necessary to
cover all columns.
The argument max.comb controls this maximum number of combinations. It is a rational number counted in (fractions of) billions, defaulted at zero to signal searching to the maximum possible extent. If the total number of combinations exceeds a positive value of max.comb, the PI chart is deemed too complex, the search is stopped, and CCubes attempts to return all possible models using the PIs from the previous levels of complexity, when the PI chart was not yet too complex.
In the extreme situation where even this is not feasible, the argument first.min controls returning only one (the very first found) minimal model, if at all possible.
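A sketch of both safeguards together (the values are purely illustrative; the ttCVF truth table from the sketches above is small enough that neither would actually bind):

# cap the PI chart search at half a billion combinations and
# return only the first minimal model found, if any
minimize(ttCVF, include = "?", max.comb = 0.5, first.min = TRUE)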
"qca"
is the class of the object returned when using a single outcome, while an object of class "mqca" is returned when using multiple outcomes. These objects are lists having the following components:
tt: The truth table object.
options: Values for the various options used in the function (including defaults).
negatives: The line number(s) of the negative configuration(s).
initials: The initial positive configuration(s).
PIchart: A list containing the PI chart(s).
primes: The prime implicant(s).
solution: A list of solution(s).
essential: A list of essential PI(s).
pims: A list of PI membership scores.
IC: The matrix containing the inclusion and coverage scores for the solution(s).
SA: A list of simplifying assumptions.
i.sol: A list of components specific to intermediate solution(s), each having a PI chart, prime implicant membership scores, (non-simplifying) easy counterfactuals and difficult counterfactuals.
complex: Flag for solutions coming from a too complex PI chart.
call: The user's command which produced all these objects and result(s).
Cebotari, V.; Vink, M.P. (2013) A Configurational Analysis of Ethnic Protest in Europe. International Journal of Comparative Sociology vol.54, no.4, pp.298-324, DOI: 10.1177/0020715213508567.
Cebotari, V.; Vink, M.P. (2015) Replication Data for: A configurational analysis of ethnic protest in Europe, Harvard Dataverse, V2, DOI: 10.7910/DVN/PT2IB9
Cronqvist, L.; Berg-Schlosser, D. (2009) Multi-Value QCA (mvQCA), in Rihoux, B.; Ragin, C. (eds.) Configurational Comparative Methods. Qualitative Comparative Analysis (QCA) and Related Techniques, SAGE.
Dusa, A.; Thiem, A. (2015) Enhancing the Minimization of Boolean and Multivalue Output Functions With eQMC. Journal of Mathematical Sociology vol.39, no.2, pp.92-108, DOI: 10.1080/0022250X.2014.897949.
Dusa, A. (2018) Consistency Cubes: A Fast, Efficient Method for Boolean Minimization R Journal vol.10, no.2, pp.357-370, DOI: 10.32614/RJ-2018-080.
Dusa, A. (2019) QCA with R. A Comprehensive Resource. Springer International Publishing.
Ragin, C.C. (2003) Recent Advances in Fuzzy-Set Methods and Their Application to Policy Questions. WP 2003-9, COMPASSS.
Ragin, C.C. (2009) Qualitative Comparative Analysis Using Fuzzy-Sets (fsQCA), in Rihoux, B.; Ragin, C. (eds.) Configurational Comparative Methods. Qualitative Comparative Analysis (QCA) and Related Techniques, SAGE.
Ragin, C.C.; Strand, S.I. (2008) Using Qualitative Comparative Analysis to Study Causal Order: Comment on Caren and Panofsky (2005). Sociological Methods & Research vol.36, no.4, pp.431-441, DOI: 10.1177/0049124107313903.
Rihoux, B.; De Meur, G. (2009) Crisp-Set Qualitative Comparative Analysis (csQCA), in Rihoux, B.; Ragin, C. (eds.) Configurational Comparative Methods. Qualitative Comparative Analysis (QCA) and Related Techniques, SAGE.
# -----
# Lipset binary crisp data
# the associated truth table
ttLC <- truthTable(LC, "SURV", sort.by = "incl, n")
ttLC

##   OUT: outcome value
##     n: number of cases in configuration
##  incl: sufficiency inclusion score
##
##      DEV URB LIT IND STB   OUT    n  incl  PRI
##  32    1   1   1   1   1     1    4  1.000 1.000
##  22    1   0   1   0   1     1    2  1.000 1.000
##  24    1   0   1   1   1     1    2  1.000 1.000
##   1    0   0   0   0   0     0    3  0.000 0.000
##   2    0   0   0   0   1     0    2  0.000 0.000
##   5    0   0   1   0   0     0    2  0.000 0.000
##   6    0   0   1   0   1     0    1  0.000 0.000
##  23    1   0   1   1   0     0    1  0.000 0.000
##  31    1   1   1   1   0     0    1  0.000 0.000

# conservative solution (Rihoux & De Meur 2009, p.57)
cLC <- minimize(ttLC)
cLC

## M1: DEV*urb*LIT*STB + DEV*LIT*IND*STB <=> SURV

# view the Venn diagram for the associated truth table
library(venn)
venn(cLC)
# add details and case names
minimize(ttLC, details = TRUE, show.cases = TRUE)

## M1: DEV*~URB*LIT*STB + DEV*LIT*IND*STB <=> SURV
##
##                        inclS   PRI   covS   covU   cases
## 1  DEV*~URB*LIT*STB    1.000  1.000  0.500  0.250  FI,IE; FR,SE
## 2  DEV*LIT*IND*STB     1.000  1.000  0.750  0.500  FR,SE; BE,CZ,NL,UK
## M1                     1.000  1.000  1.000

# negating the outcome
ttLCn <- truthTable(LC, "~SURV", sort.by = "incl, n")
minimize(ttLCn)

## M1: ~DEV*~URB*~IND + DEV*LIT*IND*~STB <=> ~SURV

# parsimonious solution, positive output
pLC <- minimize(ttLC, include = "?", details = TRUE, show.cases = TRUE)
pLC

## M1: DEV*STB <=> SURV

# the associated simplifying assumptions
pLC$SA$M1

##    DEV URB LIT IND STB
## 18   1   0   0   0   1
## 20   1   0   0   1   1
## 26   1   1   0   0   1
## 28   1   1   0   1   1
## 30   1   1   1   0   1

# parsimonious solution, negative output
pLCn <- minimize(ttLCn, include = "?", details = TRUE, show.cases = TRUE)
pLCn

## M1: ~DEV + ~STB <=> ~SURV

# -----
# Lipset multi-value crisp data (Cronqvist & Berg-Schlosser 2009, p.80)
# truth table
ttLM <- truthTable(LM, "SURV", conditions = "DEV, URB, LIT, IND",
                   sort.by = "incl", show.cases = TRUE)

# conservative solution, positive output
minimize(ttLM, details = TRUE, show.cases = TRUE)

## M1: DEV{2}*LIT{1}*IND{1} + DEV{1}*URB{0}*LIT{1}*IND{0} => SURV{1}

# parsimonious solution, positive output
minimize(ttLM, include = "?", details = TRUE, show.cases = TRUE)

## M1: DEV{2} + DEV{1}*IND{0} => SURV{1}

# negate the outcome
ttLMn <- truthTable(LM, "~SURV", conditions = "DEV, URB, LIT, IND",
                    sort.by = "incl", show.cases = TRUE)

# conservative solution, negative output
minimize(ttLMn, details = TRUE, show.cases = TRUE)

## M1: DEV{0}*URB{0}*IND{0} + DEV{1}*URB{0}*LIT{1}*IND{1} => ~SURV{1}
# parsimonious solution, negative output
minimize(ttLMn, include = "?", details = TRUE, show.cases = TRUE)

## M1: DEV{0} + DEV{1}*URB{0}*IND{1} => ~SURV{1}

# -----
# Lipset fuzzy sets data (Ragin 2009, p.112)
# truth table using a very low inclusion cutoff
ttLF <- truthTable(LF, "SURV", incl.cut = 0.8, show.cases = TRUE, sort.by = "incl")

# conservative solution
minimize(ttLF, details = TRUE, show.cases = TRUE)

## M1: DEV*URB*LIT*IND*STB + DEV*~URB*LIT*~IND*STB => SURV

# parsimonious solution
minimize(ttLF, include = "?", details = TRUE)

## M1: DEV*~IND + URB*STB => SURV

# intermediate solution using directional expectations
minimize(ttLF, include = "?", details = TRUE, show.cases = TRUE,
         dir.exp = "DEV, URB, LIT, IND, STB")

## M1: DEV*URB*LIT*STB + DEV*LIT*~IND*STB => SURV

# URB as a don't care condition (left unspecified) and,
# experimentally, conjunctural directional expectations
minimize(ttLF, include = "?", details = TRUE, show.cases = TRUE,
         dir.exp = "DEV, STB, ~LIT*IND")

## M1: DEV*URB*STB + DEV*~IND*STB => SURV

# -----
# Cebotari & Vink (2013, 2015)
ttCVF <- truthTable(CVF, outcome = "PROTEST", incl.cut = 0.8,
                    show.cases = TRUE, sort.by = "incl, n")

pCVF <- minimize(ttCVF, include = "?", details = TRUE, show.cases = TRUE)
pCVF

## M1: ~NATPRIDE + DEMOC*GEOCON*POLDIS +
##     (~DEMOC*ETHFRACT*POLDIS + DEMOC*ETHFRACT*GEOCON) => PROTEST
## M2: ~NATPRIDE + DEMOC*GEOCON*POLDIS +
##     (~DEMOC*ETHFRACT*POLDIS + DEMOC*ETHFRACT*~POLDIS) => PROTEST
## M3: ~NATPRIDE + DEMOC*GEOCON*POLDIS +
##     (DEMOC*ETHFRACT*GEOCON + ETHFRACT*GEOCON*POLDIS) => PROTEST
## M4: ~NATPRIDE + DEMOC*GEOCON*POLDIS +
##     (DEMOC*ETHFRACT*~POLDIS + ETHFRACT*GEOCON*POLDIS) => PROTEST
# inspect the PI chart
pCVF$PIchart

##                          5 15 16 24 27 29 30 31 32
## ~NATPRIDE                x  x  -  -  x  x  -  x  -
## ~DEMOC*ETHFRACT*POLDIS   -  x  x  -  -  -  -  -  -
## DEMOC*ETHFRACT*GEOCON    -  -  -  -  -  x  x  x  x
## DEMOC*ETHFRACT*~POLDIS   -  -  -  -  -  x  x  -  -
## DEMOC*GEOCON*POLDIS      -  -  -  x  -  -  -  x  x
## ETHFRACT*GEOCON*POLDIS   -  x  x  -  -  -  -  x  x

# DEMOC*ETHFRACT*~POLDIS is dominated by DEMOC*ETHFRACT*GEOCON
# using row dominance to solve the PI chart
pCVFrd <- minimize(ttCVF, include = "?", row.dom = TRUE,
                   details = TRUE, show.cases = TRUE)

# plot the prime implicants on the outcome
pims <- pCVFrd$pims
par(mfrow = c(2, 2))
for (i in 1:4) {
    XYplot(pims[, i], CVF$PROTEST, cex.axis = 0.6)
}
# -----
# temporal QCA (Ragin & Strand 2008), serving the input as a dataset,
# which will automatically be passed to truthTable() as an intermediary
# step before the minimization
minimize(RS, outcome = "REC", details = TRUE)

##   OUT: outcome value
##     n: number of cases in configuration
##  incl: sufficiency inclusion score
##
##      P  E  A  EBA  S    OUT    n  incl  PRI
##   5  0  0  0   -   0     0     3  0.000 0.000
##  17  0  1  0   -   0     0     1  0.000 0.000
##  20  0  1  1   0   1     1     1  1.000 1.000
##  29  1  0  0   -   0     0     1  0.000 0.000
##  30  1  0  0   -   1     0     3  0.000 0.000
##  36  1  0  1   -   1     0     2  0.000 0.000
##  42  1  1  0   -   1     1     1  1.000 1.000
##  44  1  1  1   0   1     1     1  1.000 1.000
##  45  1  1  1   1   0     1     2  1.000 1.000
##  46  1  1  1   1   1     1     2  1.000 1.000
##
## M1: P*E*S + P*E*A*EBA + E*A*~EBA*S <=> REC
##
##                   inclS   PRI   covS   covU
## 1  P*E*S          1.000  1.000  0.571  0.143
## 2  P*E*A*EBA      1.000  1.000  0.571  0.286
## 3  E*A*~EBA*S     1.000  1.000  0.286  0.143
## M1                1.000  1.000  1.000