high ASC-coefficients in model in wtp-space
Posted: 30 Jun 2022, 13:03
Hi
I estimated this model in wtp-space:
V = list()
V[['alt1']] = Certain * (cost_B * (asc_B + torsk_B * KT1 + Cost1 + laks_B * VL1 + bunn_B * HB1 + land_B * KL1)) +
              (1 - Certain) * (cost_T * (asc_T + torsk_T * KT1 + Cost1 + laks_T * VL1 + bunn_T * HB1 + land_T * KL1))
V[['alt2']] = Certain * (cost_B * (torsk_B * KT2 + Cost2 + laks_B * VL2 + bunn_B * HB2 + land_B * KL2)) +
              (1 - Certain) * (cost_T * (torsk_T * KT2 + Cost2 + laks_T * VL2 + bunn_T * HB2 + land_T * KL2))
V[['alt3']] = Certain * (cost_B * (torsk_B * KT3 + Cost3 + laks_B * VL3 + bunn_B * HB3 + land_B * KL3)) +
              (1 - Certain) * (cost_T * (torsk_T * KT3 + Cost3 + laks_T * VL3 + bunn_T * HB3 + land_T * KL3))
"Certain" is a dummy taking the value 1 for respondents in one information set of a split-sample choice experiment, and the value 0 for respondents in the other information set. KT, VL, HB and KL are the attributes, and torsk, laks, bunn and land the parameters to be estimated. The _B suffix marks parameters for the baseline (information set 1) and _T those for the treatment (information set 2).
Estimating the model works fine, and the model converges. However, I get very high values for the ASC parameters. This result is interpretable from an empirical point of view (strong aversion against alternative 1, which is the status quo). I just wanted to ask whether there may be other, model-technical reasons for this result that I should take into consideration?
Model name : MXL_aqua_exp
Model description : MXL model with dummy coding
Model run at : 2022-06-30 11:24:53
Estimation method : bfgs
Model diagnosis : successful convergence
Number of individuals : 293
Number of rows in database : 2599
Number of modelled outcomes : 2599
Number of cores used : 3
Number of inter-individual draws : 1000 (SobolOwenFaureTezuka)
LL(start) : -2855.8
LL(0) : -2855.29
LL(C) : -2667.63
LL(final) : -1611.64
Rho-square (0) : 0.4356
Adj.Rho-square (0) : 0.4272
Rho-square (C) : 0.3959
Adj.Rho-square (C) : 0.3869
AIC : 3271.27
BIC : 3411.98
Estimated parameters : 24
Time taken (hh:mm:ss)           : 01:22:06.62
pre-estimation : 00:05:55.97
estimation : 00:28:00.47
post-estimation : 00:48:10.18
Iterations : 118
Min abs eigenvalue of Hessian : 0.046956
Unconstrained optimisation.
Estimates:
Estimate s.e. t.rat.(0) Rob.s.e. Rob.t.rat.(0)
asc_B_mu 17.44502 3.09223 5.6416 8.277305 2.1076
asc_B_sig 19.82318 3.47555 5.7036 9.472019 2.0928
asc_T_mu 6.86215 1.26935 5.4060 1.851939 3.7054
asc_T_sig -14.42643 1.78324 -8.0900 2.934400 -4.9163
cost_B_mu -0.36412 0.16356 -2.2262 0.281769 -1.2923
cost_B_sig 1.10744 0.18507 5.9837 0.219688 5.0410
torsk_B_mu 0.38443 0.07701 4.9921 0.126627 3.0359
torsk_B_sig -0.55603 0.07488 -7.4259 0.148787 -3.7371
laks_B_mu 0.23840 0.11780 2.0238 0.244438 0.9753
laks_B_sig 0.74951 0.13155 5.6975 0.356692 2.1013
bunn_B_mu 0.72419 0.27200 2.6625 0.256366 2.8248
bunn_B_sig -3.6574e-04 8.5917e-04 -0.4257 0.001136 -0.3221
land_B_mu -0.01622 0.01423 -1.1402 0.013146 -1.2339
land_B_sig -0.06131 0.04961 -1.2358 0.152943 -0.4009
cost_T_mu -0.15527 0.16434 -0.9448 0.230286 -0.6743
cost_T_sig -0.91270 0.15905 -5.7385 0.163097 -5.5960
torsk_T_mu 0.25055 0.06783 3.6939 0.089304 2.8056
torsk_T_sig 0.57420 0.08993 6.3850 0.150160 3.8239
laks_T_mu 0.28932 0.08334 3.4714 0.073636 3.9291
laks_T_sig 0.60285 0.09112 6.6160 0.105155 5.7329
bunn_T_mu 0.66006 0.26217 2.5177 0.273440 2.4139
bunn_T_sig -0.09652 0.24403 -0.3955 0.114511 -0.8429
land_T_mu 0.02216 0.01593 1.3910 0.017167 1.2907
land_T_sig -0.09742 0.02474 -3.9381 0.048570 -2.0059
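As a rough sanity check on how extreme the ASC estimates are, the mu/sigma pair can be translated into an implied population share. This is only a hedged sketch, assuming asc_B is specified as normally distributed across respondents (which the mu/sigma output format suggests); the model itself is Apollo R code, but the arithmetic is shown here in plain Python:

```python
import math

# Hedged back-of-envelope check (not from the post): if asc_B is normally
# distributed across respondents, the reported mu and sigma imply the share
# of the sample with a positive status-quo ASC in WTP space.
# Values are asc_B_mu and asc_B_sig from the output above.
mu, sigma = 17.44502, 19.82318

def norm_cdf(x, mean, sd):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# abs(sigma): the sign of an estimated sigma is not identified, since only
# its magnitude (the spread of the draws) enters the distribution.
share_positive = 1.0 - norm_cdf(0.0, mu, abs(sigma))
print(round(share_positive, 3))  # roughly 0.81
```

If the implied share and spread look economically implausible, that can hint at scale or identification issues rather than genuine preferences, so a check like this is a cheap first diagnostic.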
Thank you in advance for your reply.
best regards,
Margrethe