We are developing assays where we coat the plates fresh with peptide each time we run the assay, overnight (18 hours at 2-8C in a humidity box, same shelf of the same fridge every time).
Next day we block, add rabbit anti-peptide antibody, then anti-rabbit HRP, and measure signal. All steps are optimised and the controls are good (inter-assay precision 10% or less, intra-assay precision 4%).
But the OD of the standard curve shifts a lot: the %CV of the top standard's OD is 20% (excluding three extreme assays), with ODs ranging from 1.3 to 2.3 (read at 450 nm with correction at 620 nm).
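For reference, this is how we compute the %CV quoted above. The OD values below are hypothetical, just chosen to span the 1.3-2.3 range we actually see; a minimal sketch:

```python
import statistics

# Hypothetical top-standard ODs from successive assay runs
# (illustrative only, spanning the 1.3-2.3 range described above)
top_standard_ods = [1.3, 1.6, 1.8, 2.0, 2.1, 2.3]

mean_od = statistics.mean(top_standard_ods)
cv_percent = 100 * statistics.stdev(top_standard_ods) / mean_od
print(f"mean OD = {mean_od:.2f}, inter-assay %CV = {cv_percent:.1f}%")
```

A spread like this across the 1.3-2.3 range gives a %CV of roughly 20%, which is why the raw ODs look so unstable even though ratios to controls hold up.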
Why? Is coating just an inherently variable step, so this will always be the case unless we coat a large batch of plates at once and set control specs on each batch of coated plates?
Or do we need to go back and look at the coating concentration more closely? Generally we aim for just below saturation of peptide on the well, increasing the coating concentration until the signal rises only slightly with additional peptide.
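The "just below saturation" rule above can be sketched as a simple titration readout. The concentrations and ODs here are hypothetical, and the 10% cutoff is an assumed threshold, not part of our actual protocol:

```python
# Hypothetical coating titration: (peptide concentration in ug/mL, mean signal OD)
titration = [(0.25, 0.6), (0.5, 1.1), (1.0, 1.7), (2.0, 2.0), (4.0, 2.1), (8.0, 2.15)]

# Pick the lowest concentration at which doubling the coat adds <10% more signal,
# i.e. "just below saturation" as described above (10% is an assumed cutoff).
for (conc, od), (_, next_od) in zip(titration, titration[1:]):
    gain = (next_od - od) / od
    if gain < 0.10:
        print(f"choose ~{conc} ug/mL (next doubling adds only {gain:.0%})")
        break
```

One thing worth noting is that operating near the shoulder of the saturation curve means small run-to-run differences in effective coating still translate into visible OD shifts, which may be part of what we are seeing.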
The controls shift in OD along with the curve, so they read back okay; it's just that we get a lot of variation in absolute OD from assay to assay.
Thanks in advance for any advice.