Comparison to SAS’ PROC MIXED #538

@W-APrince

Description

We’re currently evaluating the suitability of the {mmrm} package for use in our biostatistics department, but we are experiencing some issues: we get different estimates and p-values from the two software packages.
The issue seems to be related to the covariance matrix calculations. The unstructured, compound symmetry, and heterogeneous compound symmetry structures appear to match, but we are seeing differences in the others.
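For context, the AR(1) structure asks both tools to fit correlations that decay geometrically with the visit lag, corr(visit i, visit j) = rho^|i − j|. A minimal numeric sketch of that matrix (illustration only; `ar1_corr` is a hypothetical helper, not part of either package):

```python
import numpy as np

def ar1_corr(n_visits: int, rho: float) -> np.ndarray:
    """AR(1) correlation matrix: corr(t_i, t_j) = rho ** |i - j|."""
    idx = np.arange(n_visits)
    return rho ** np.abs(idx[:, None] - idx[None, :])

# Example: 3 visits with lag-1 correlation 0.5
R = ar1_corr(3, 0.5)
print(R)
```

Both R and SAS document this same parameterization for TYPE=AR(1)/`ar1()`, so a mismatch here would more likely come from the estimation or adjustment steps than from the structure itself.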
For example, when we run the following model on the bcva_data dataset included in the mmrm package:
• R Code:

# mmrm for the model fit, haven for the XPT export
library(mmrm)
library(haven)

data("bcva_data", package = "mmrm")

# Set the control arm as the reference level
bcva_data$ARMCD <- as.factor(bcva_data$ARMCD)
levels(bcva_data$ARMCD)
bcva_data$ARMCD <- relevel(bcva_data$ARMCD, ref = "CTL")

# Treatment contrasts matching the ref= options used in SAS below
contrasts(bcva_data$ARMCD) <- contr.treatment(levels(bcva_data$ARMCD), base = which(levels(bcva_data$ARMCD) == "CTL"))
contrasts(bcva_data$AVISIT) <- contr.treatment(levels(bcva_data$AVISIT))

levels(bcva_data$AVISIT)
levels(bcva_data$ARMCD)

# Export the prepared data for SAS
write_xpt(bcva_data, "./bcva.XPT", version = 5)

mmrm_model <- mmrm(
  BCVA_CHG ~ ARMCD + AVISIT + ARMCD * AVISIT + BCVA_BL + ar1(AVISIT | USUBJID),
  data = bcva_data,
  reml = TRUE,
  method = "Kenward-Roger",
  vcov = "Kenward-Roger-Linear"
)

• SAS PROC MIXED Code:

<read in XPT file written via R with references corresponding to factor levels>

proc mixed data=bcva  method = reml;
   class ARMCD(ref="1") AVISIT(ref="1") USUBJID;
   model BCVA_CHG = ARMCD AVISIT BCVA_BL ARMCD*AVISIT / solution ddfm=KR;
   repeated AVISIT / subject=USUBJID type=AR(1) r rcorr; /* Auto-regressive (AR1) covariance matrix */
   lsmeans ARMCD*AVISIT/ diff cl alpha=.05;

   /* Output the fixed effects estimates and related statistics */
   ods output SolutionF=fixed_effects; 
run;

We are seeing the following differences at a 1e-3 tolerance (note: there are more, but these are some of the more extreme differences with regard to the p-value):

| Type       | Effect               | Estimate | StdErr   | DF       | tValue   | Probt    |
|------------|----------------------|----------|----------|----------|----------|----------|
| R Result   | ARMCDTRT:AVISITVIS02 | 0.191755 | 0.142338 | 6073.801 | 1.347177 | 0.177973 |
| SAS Result | ARMCDTRT:AVISITVIS02 | 0.186703 | 0.173244 | 7581.078 | 1.077687 | 0.281208 |
| Difference | ARMCDTRT:AVISITVIS02 | -0.00505 | 0.030906 | 1507.278 | -0.26949 | 0.103234 |
| R Result   | ARMCDTRT:AVISITVIS03 | 0.475896 | 0.164868 | 8482.299 | 2.886522 | 0.003905 |
| SAS Result | ARMCDTRT:AVISITVIS03 | 0.465548 | 0.173933 | 7633.822 | 2.676596 | 0.007453 |
| Difference | ARMCDTRT:AVISITVIS03 | -0.01035 | 0.009064 | -848.477 | -0.20993 | 0.003548 |
| R Result   | AVISITVIS02          | 0.275391 | 0.101673 | 6088.213 | 2.708602 | 0.006776 |
| SAS Result | AVISITVIS02          | 0.276339 | 0.123635 | 7587.288 | 2.235119 | 0.025439 |
| Difference | AVISITVIS02          | 0.000948 | 0.021962 | 1499.075 | -0.47348 | 0.018663 |
| R Result   | AVISITVIS03          | 0.46004  | 0.117666 | 8480.155 | 3.90971  | 9.31E-05 |
| SAS Result | AVISITVIS03          | 0.464771 | 0.12412  | 7645.261 | 3.744529 | 0.000182 |
| Difference | AVISITVIS03          | 0.004731 | 0.006454 | -834.894 | -0.16518 | 8.89E-05 |
| R Result   | BCVA_BL              | -0.00071 | 0.002757 | 2227.454 | -0.25665 | 0.797473 |
| SAS Result | BCVA_BL              | -0.00073 | 0.002635 | 2632.496 | -0.27637 | 0.782282 |
| Difference | BCVA_BL              | -2.1E-05 | -0.00012 | 405.0417 | -0.01972 | -0.01519 |
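As a sanity check on the table, each tValue is just Estimate / StdErr, so the p-value gaps trace back to the StdErr and df mismatches rather than to the point estimates themselves. For example, with the numbers copied from the R-result row for ARMCDTRT:AVISITVIS02:

```python
# tValue = estimate / stderr; agrees with the tabulated 1.347177 up to the
# rounding of the tabulated estimate and standard error.
estimate = 0.191755
stderr = 0.142338
t_value = estimate / stderr
print(t_value)
```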
Do you know what is causing these mismatches? Is there something obvious we’re missing, or is there a difference in how these software packages calculate the results?
