[phenixbb] phenix and weak data

Randy Read rjr27 at cam.ac.uk
Wed Dec 12 09:24:40 PST 2012


To be honest, when I say "relatively easy" I'm thinking in terms of the single sigmaA parameter.  Once you've got an estimate of sigmaA (e.g. by rescaling the alpha parameter so that it applies to Es instead of Fs), you can refine it using the likelihood target that has an explicit contribution from the sigf values.  Alternatively, you could refine alpha and beta using the likelihood target expressed in terms of alpha and beta, but with the sigf contribution added to beta.  (Though I believe my paper on SIGMAA showed that alpha and beta are not really independent anyway.)
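
For concreteness, here is a minimal sketch (my own illustration, not Phenix code) of what such a per-reflection term could look like for an acentric reflection, assuming the common 2*sigf^2 inflation of the variance (an assumption on my part; alpha, beta, fo, fc and sigf are hypothetical values):

    import numpy as np
    from scipy.special import i0e

    def nll_acentric(fo, sigf, fc, alpha, beta):
        """Rice -log(likelihood) for one acentric reflection, with the
        measurement error sigf absorbed into the variance."""
        beta_eff = beta + 2.0 * sigf**2        # inflate model variance
        x = 2.0 * alpha * fo * fc / beta_eff   # Bessel argument
        ln_i0 = np.log(i0e(x)) + x             # stable ln I0(x), x >= 0
        return ((fo**2 + (alpha * fc)**2) / beta_eff - ln_i0
                - np.log(2.0 * fo / beta_eff))

Refining alpha and beta then just means minimizing the sum of such terms over all reflections, with sigf entering only through beta_eff.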

Randy

On 12 Dec 2012, at 15:56, Ed Pozharski wrote:

> Dear Randy,
> 
> On Wed, 2012-12-12 at 09:00 +0000, Randy Read wrote:
> 
> 
>> In the statistics you give below, the key statistic is probably the
>> standard deviation of sigf/sqrt(beta), which is actually quite small. 
>> So after absorbing the average effect of measurement error into the
>> beta values, the residual variation is even less important to the total
>> variance than you would think from the total value of sigf.
> 
> You are absolutely right - this is one of the possible reasons
> (perhaps the main one) why the effect on the model of incorporating
> sigf is not obvious.  Indeed, in the example dataset that I used, the
> relative standard deviation of sigf within resolution shells ranges
> from 0.3 at high resolution to 0.6 at low resolution.  The
> incorporation of the shell-average sigf into beta is further obscured
> by the fact that the two anti-correlate.
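>
> For reference, the statistic you mention is easy to tabulate; here is
> a minimal sketch (sigf, beta and the shell assignments are
> hypothetical numpy arrays, nothing Phenix-specific):
>
>     import numpy as np
>
>     def shell_spread(sigf, beta, shell_id):
>         """Relative spread of sigf/sqrt(beta) per resolution shell."""
>         r = sigf / np.sqrt(beta)  # measurement error vs total variance
>         return {s: r[shell_id == s].std() / r[shell_id == s].mean()
>                 for s in np.unique(shell_id)}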
> 
> Another issue is that the average value of beta does not matter that
> much and is only weakly controlled by the data.  This is obvious (to
> me, though I may be wrong) from eq. (6-7) in Lunin (2002): near the
> minimum (and we are talking here about effects on the *final model*)
> the target may be approximated quadratically, and the applied weight
> is essentially inversely proportional to beta (not exactly, of
> course, but that is the major effect beta has on the target).  Thus,
> inflating beta over what it is expected to be (the model variance in
> reciprocal space) does very little to the target other than rescale
> it, as the toy example below shows.
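>
> To illustrate with a toy example (my own, not the actual Lunin (2002)
> expressions): if near the minimum the target behaves like
> sum((fo - fs)^2 / beta), then inflating every beta by a constant
> factor divides the target and its gradient by that factor and leaves
> the minimizer untouched:
>
>     import numpy as np
>
>     rng = np.random.default_rng(0)
>     fo = rng.uniform(1.0, 10.0, 100)      # hypothetical amplitudes
>     fs = fo + rng.normal(0.0, 0.5, 100)   # hypothetical target values
>     beta = rng.uniform(0.5, 2.0, 100)
>
>     def target(b):
>         return np.sum((fo - fs)**2 / b)   # weight ~ 1/beta
>
>     print(target(beta) / target(4.0 * beta))  # exactly 4.0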
> 
> I think my main issue is with the idea that beta may be used as an
> estimate of model variance.  Mathematically it probably does not
> matter, but we all tend to attach "physical" interpretations to model
> parameters, and here that interpretation fails: taken at face value,
> it seems to suggest that crystallographic models are grossly
> overfitted.
> 
>> I would still argue that it's relatively easy to incorporate the
>> experimental error into the likelihood variances, so it's worth
>> doing even if we haven't found the circumstances where it turns
>> out to matter!
> 
> I am not sure it is that easy.  As I mentioned previously, the
> analytical expressions for alpha/beta break down when sigf is
> incorporated.  Also, it is possible that this has already been done
> by Kevin Cowtan in his 2005 paper; my observation was that his spline
> coefficients return more reasonable estimates of the model variance.
> 
> There are also some very strange consequences of the alpha/beta
> approach.  Equations (6-7) in Lunin (2002) essentially set up the
> quadratic approximation.  I already mentioned that the target value
> Fs* oddly reduces to exactly zero for "weak" reflections, and if beta
> is overestimated those reflections are not so weak anymore.  In the
> example dataset that I used, the average I/sigma of such zeroed
> reflections in the low resolution shells is as high as ~5-6, and I
> can identify a reflection that according to eq. (6-7) will have a
> target value of zero yet has I/sigma = 22.5!  I understand that this
> still has only a minor effect on the final model, because only ~12%
> of reflections are affected (and nobody minds that 5-10% of the
> experimental data is routinely tossed for the sake of Rfree
> calculation).  Still, it is puzzling and unexpected.
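>
> This collapse is easy to reproduce numerically.  Here is a sketch
> that reconstructs the acentric Rice target directly (alpha = beta = 1
> for simplicity) rather than quoting eq. (6-7); the best-fit fc drops
> to zero as soon as fo falls below sqrt(beta):
>
>     import numpy as np
>     from scipy.special import i0e
>     from scipy.optimize import minimize_scalar
>
>     alpha, beta = 1.0, 1.0
>
>     def nll(fc, fo):
>         """Acentric Rice -log L in fc (fo-only terms kept as-is)."""
>         x = 2.0 * alpha * fo * abs(fc) / beta
>         return (fo**2 + (alpha * fc)**2) / beta - (np.log(i0e(x)) + x)
>
>     for fo in (2.0, 1.2, 1.0, 0.8, 0.5):
>         best = minimize_scalar(nll, args=(fo,), bounds=(0.0, 10.0),
>                                method="bounded")
>         print(f"fo/sqrt(beta) = {fo:.1f} -> best-fit fc = {best.x:.3f}")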
> 
> Also, the weights applied to individual reflections are approximated
> by ws*, which has some peculiar properties, namely that it dips to
> zero around f/sqrt(beta)~1.  Curiously, it recovers back to full
> weight for reflections that are even weaker (Fig. 3 in Lunin (2002)).
> Again, I see no flaw in the math, but it is rather counterintuitive
> that reflections roughly matching the model error are weighted down
> while even weaker reflections are not.  Given that these weaker
> reflections also have their respective targets Fs* reset to zero,
> there will be a tendency during minimization simply to run weak
> reflections down to zero across the board.
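>
> The dip is visible in the curvature of the same per-reflection target
> at its minimum, which plays the role of ws* in the quadratic
> approximation.  Again a sketch reconstructed from the Rice form, not
> a quotation of Fig. 3 (alpha = beta = 1, so the full weight in these
> units is 2*alpha^2/beta = 2):
>
>     import numpy as np
>     from scipy.special import i0e
>     from scipy.optimize import minimize_scalar
>
>     alpha, beta = 1.0, 1.0
>
>     def nll(fc, fo):
>         # fo-only terms dropped; they affect neither the minimum nor
>         # the curvature
>         x = 2.0 * alpha * fo * abs(fc) / beta
>         return (alpha * fc)**2 / beta - (np.log(i0e(x)) + x)
>
>     h = 1e-3
>     for fo in (0.2, 0.7, 1.0, 1.3, 2.0, 4.0):
>         fc0 = minimize_scalar(nll, args=(fo,), bounds=(0.0, 20.0),
>                               method="bounded").x
>         w = (nll(fc0 + h, fo) - 2.0 * nll(fc0, fo)
>              + nll(fc0 - h, fo)) / h**2
>         print(f"fo/sqrt(beta) = {fo:.1f} -> effective weight {w:.2f}")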
> 
> Oh, and by the way, phaser rocks :)
> 
> Ed.
> 
> 
> -- 
> Edwin Pozharski, PhD, Assistant Professor
> University of Maryland, Baltimore
> ----------------------------------------------
> When the Way is forgotten duty and justice appear;
> Then knowledge and wisdom are born along with hypocrisy.
> When harmonious relationships dissolve then respect and devotion arise;
> When a nation falls to chaos then loyalty and patriotism are born.
> ------------------------------   / Lao Tse /
> 

------
Randy J. Read
Department of Haematology, University of Cambridge
Cambridge Institute for Medical Research      Tel: + 44 1223 336500
Wellcome Trust/MRC Building                   Fax: + 44 1223 336827
Hills Road                                    E-mail: rjr27 at cam.ac.uk
Cambridge CB2 0XY, U.K.                       www-structmed.cimr.cam.ac.uk


