I forgot to mention in my earlier reply: The AGP spec also uses a
transmission line as its Data Sheet Test Load for measuring Tval (Tco) at 4x
mode. (Section 4.2.3.3: "The timing margins for 4x mode are smaller and a
simple capacitive load is not sufficient to accurately model the load on a
buffer.") So even with BIRD71, IBIS would be unable to handle Intel's 1998
AGP spec.
> I agree that testing a model with something other than a simple RC is
> necessary to test the goodness of a model. I wish more folks would
> validate their models with realistic transmission line type loads
> before releasing them. However, the purpose of the timing test load is
> not to test a buffer's response in an environment with reflections,
> etc. -- it's to measure Tco. For that, the timing test load should be
> something that produces a clean, full-switching edge at the nominal
> impedance of the intended application.
Agreed, you need clean edges. But I also assert that you ideally want the
Data Sheet Test Load (for measuring Tco) to be reasonably similar to the
actual in-use load. To the extent that the two are similar, you get better
accuracy when you come to use the Data Sheet delays.
(I don't understand what you mean by "at the nominal impedance of the
intended application." A 50 pF capacitor isn't anything like a 50 ohm
board trace impedance, except at 63.7 MHz, where its impedance magnitude
happens to equal 50 ohms; and even then, because the load is purely
reactive, the magnitude of the reflection coefficient is still unity.)
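Just to put numbers on that parenthetical, here is a quick back-of-the-
envelope check (plain textbook formulas, nothing vendor-specific):

    import math

    C  = 50e-12   # the 50 pF Data Sheet Test Load
    Z0 = 50.0     # nominal board trace impedance, ohms

    # |Zc| = 1/(2*pi*f*C) equals 50 ohms at f = 1/(2*pi*C*Z0)
    f = 1.0 / (2 * math.pi * C * Z0)
    print("|Zc| = 50 ohms at %.1f MHz" % (f / 1e6))   # ~63.7 MHz

    # Even there the load is purely reactive (Zc = -j50 ohms), so the
    # reflection coefficient into a 50 ohm line has unit magnitude:
    Zc = 1.0 / (1j * 2 * math.pi * f * C)
    gamma = (Zc - Z0) / (Zc + Z0)
    print("|gamma| = %.3f" % abs(gamma))              # 1.000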
What is Tco anyway? Some people interpret it as simply the delay to the
output pin, so that when they plunk the part down on their board, that is
the delay they will get. They want to "add up the numbers" and get the
total path delay. (This might have been the purpose of the P4 test load.) For
this to work, you need the Test Load (for determining Tco) to be similar to
the actual load, don't you? At one time, 50 pF was close enough to a
typical load (given slow risetimes, a small fanout, and some inches of
trace); now it may not be.
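For the "add up the numbers" camp, the budget looks something like this
(every value below is invented purely for illustration):

    tco_datasheet = 2.0   # ns, measured into the Data Sheet Test Load
    flight_time   = 1.2   # ns, trace propagation delay
    setup_time    = 2.5   # ns, receiver setup requirement
    clock_period  = 8.0   # ns

    slack = clock_period - (tco_datasheet + flight_time + setup_time)
    print("setup slack = %.1f ns" % slack)
    # This arithmetic is only as good as the match between the Test
    # Load and the real load: if they differ, the true driver delay
    # is not tco_datasheet at all.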
Other people use Tco as the delay in the specific case with the Test Load
attached; they run simulations with both the Test Load and the actual load
and take the difference. IN THEORY, this should always give the correct
results. However, the further the Test Load is from the actual load, the
greater the simulation errors may be.
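In code form, the correction these folks apply is just a difference term
(again invented numbers; the sim_* values would come from board-level
simulations with the vendor's model):

    tco_datasheet   = 2.0   # ns, vendor number into the Test Load
    sim_test_load   = 1.8   # ns, simulated delay into the Test Load
    sim_actual_load = 2.6   # ns, simulated delay into the real load

    # In theory the model's absolute error cancels in the difference:
    tco_on_board = tco_datasheet + (sim_actual_load - sim_test_load)
    print("estimated on-board delay = %.1f ns" % tco_on_board)
    # In practice the cancellation is only as good as the model is
    # over the range spanned by the two loads.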
Then there is the case of a Standard, such as PCI, which specifies
acceptable Tco (Tval) ranges that a part is allowed to have, from any vendor
and with any IC design or process.
Now I'm not a chip designer, so I am taking some liberties here. But I
assume there are some combinations of IC characteristics (drive strengths,
edge rates, etc.) that are good for driving a 50 pF load but not a
transmission line, and vice-versa. So you could take two different ICs
that have exactly the same measured Tco with the 50 pF test load, and they
would have distinctly different delays in your system!
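Since I can't measure real silicon here, the following is only a toy
numeric experiment: two idealized drivers (a linear ramp source behind a
fixed output resistance -- a gross simplification of a real buffer),
tuned so their delays into 50 pF nearly match, then compared driving an
open-ended 50 ohm line (first incident wave only, flight time ignored):

    VCC, Z0, C = 3.3, 50.0, 50e-12

    def cap_delay(r_out, t_rise, dt=1e-12):
        # Time for the 50 pF cap to cross VCC/2 (Euler integration).
        v, t = 0.0, 0.0
        while v < VCC / 2:
            vsrc = VCC * min(t / t_rise, 1.0)
            v += (vsrc - v) / (r_out * C) * dt
            t += dt
        return t

    def line_delay(r_out, t_rise, dt=1e-12):
        # Time for the far end of the open line to cross VCC/2: the
        # source divides across (r_out, Z0), then doubles at the open end.
        t = 0.0
        while (2 * (Z0 / (r_out + Z0)) * VCC
               * min(t / t_rise, 1.0) < VCC / 2):
            t += dt
        return t

    # Driver A: strong but slow-edged.  Driver B: weak but fast-edged.
    for name, r, tr in (("A", 20.0, 3.0e-9), ("B", 60.0, 0.73e-9)):
        print("%s: cap %.2f ns, line %.2f ns"
              % (name, cap_delay(r, tr) * 1e9, line_delay(r, tr) * 1e9))

With these made-up drivers, both cross the cap at about 2.4 ns, yet into
the line they differ by more than half a nanosecond. Crude as it is,
that's the effect I mean.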
That's the kind of thing you can avoid with the right test load. With a
50 pF Data Sheet Test Load, chips that pass as acceptable may behave
poorly (well, variably) in the real system, so you need margins to account
for that. As speeds go up, you may not be able to afford those margins.
You need to test the chips for how they will perform in the transmission
line environment, and evaluate them for Pass/Fail based on their Tco with
the transmission line load.
Regards,
Andy