Bob,
Thank you again for your response, it is very clear.
Another question.
If the user should simulate with the IBIS model using an input stimulus
with the same trise and tfall as was used with the input stimulus in S2I
translation, how does the user know what these values are, since they are
not parameters included in the IBIS datasheet? I assume the user should
refer to the vendor's datasheet for input tr and tf, but that assumes the
IBIS datasheet generator used the same tr and tf. Should input tr and tf
be parameters included in an IBIS datasheet?
Respectfully,
Adam Tambone
Bob Ross <bob_ross@mentorg.com>@relay1.mentorg.com on 06/13/2001 03:16:48
PM
Sent by: bobr@relay1.mentorg.com
To: Adam Tambone/SouthPortland/Fairchild@Fairchild
cc: ibis@eda.org
Subject: Re: BIRD 68.1
Adam:
Some responses are in your text.
Bob Ross
Mentor Graphics
> Adam.Tambone@fairchildsemi.com wrote:
>
> Bob and David,
>
> Thank you both for your responses. They have led me to more questions,
> please excuse their simplicity.
>
> Bob, I am not clear about your statement, "they have to be offset by
> the same amount of time".
>
> Can you provide more description?
>
I actually meant that the captured waveforms
must be delayed relative to the same reference
input or clock.
Suppose the rising waveform and
falling waveforms are both exact 1 ns ramps
into the same test load, which also happens
to be the actual load. However, both of
these ramps start at 1 ns after the clock
edge. So the actual waveforms that
are stored in the IBIS model cover 2 ns.
The minimum pulse width is then 2 ns, and the
maximum clock frequency is 250 MHz, without
causing distorted simulation results. However,
the real device can operate at a higher frequency.
The model developer might choose to
truncate the leading edge delay. An
equal amount of delay should be removed
from both the rising and falling waveform
tables. (The EDA tool may also do this.)
Then the pulse width and clock frequency
can be set to 1 ns and 500 MHz without
distortion.
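To illustrate the truncation Bob describes, here is a minimal Python sketch. The list-of-(time, voltage) table format and the function name are my own assumptions for illustration, not IBIS file syntax; the point is only that the same delay must be removed from both the rising and falling tables.

```python
# Sketch: remove an equal leading-edge delay from both IBIS V-T tables.
# Tables are hypothetical lists of (time_ns, voltage) pairs.

def remove_common_delay(rising, falling, delay):
    """Shift both waveform tables earlier by the same delay.

    Removing unequal amounts from the two tables would introduce
    duty-cycle distortion, so the identical delay is applied to both.
    Points before the removed delay are dropped.
    """
    def shift(table):
        return [(t - delay, v) for (t, v) in table if t >= delay]
    return shift(rising), shift(falling)

# Example from the discussion: 1 ns ramps that start 1 ns after the
# clock edge, so each stored waveform spans 2 ns (times in ns).
rising  = [(0.0, 0.0), (1.0, 0.0), (2.0, 3.3)]
falling = [(0.0, 3.3), (1.0, 3.3), (2.0, 0.0)]

rise, fall = remove_common_delay(rising, falling, 1.0)
# After truncation each table spans 1 ns, so the minimum pulse width
# drops from 2 ns to 1 ns and the maximum clock rate of a 50% duty
# cycle clock doubles from 250 MHz to 500 MHz.
print(rise)   # [(0.0, 0.0), (1.0, 3.3)]
print(fall)   # [(0.0, 3.3), (1.0, 0.0)]
```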
> David, the netlists we use in S2I translation include only the buffers
> (we are currently considering using netlists that include full data
> paths instead), and so my question is: should additional delay be
> added to the rising and falling waveforms to account for the delay
> through circuitry not included in these netlists?
>
> Another question.
>
> Since the rising and falling edges represented in the rising and
> falling waveforms were generated with an input stimulus having a
> specific trise and tfall, is it necessary for the simulations to be
> run with the same trise and tfall on the input stimulus? In other
> words, if simulations are run with an input stimulus that has a
> different trise and tfall than that of the input stimulus used to
> produce the rising and falling waveforms, will not the edge rates in
> simulation be invalid?
>
I would recommend using the same trise and tfall for the
input stimulus. Using an input that corresponds to an
actual input is probably a good strategy. The
effect of using a different edge rate is probably
to change the simulation delay.
> Thank You Again,
> Adam Tambone
>
> Adam:
>
> The rising and falling V-T tables do not have to
> start at the same time as the input stimulus, but
> they have to be offset by the same amount of time.
>
> Then the buffers should simulate in an undistorted
> manner - provided that the pulse width is wide
> enough to capture the whole rising and falling
> waveforms. The pulse width can be reduced if
> a leading edge delay time removal algorithm is
> used to remove equal delays in both rising and
> falling waveform sets.
>
> Bob Ross
> Mentor Graphics
>
> Hi Adam,
>
> Well, you are pretty close. The BIRD clarifies IBIS in the hope
> that--if everything is done correctly--you will not see duty cycle
> distortion in simulation. Hence, as you say, the V-T tables should
> begin at the same time with respect to some edge stimulus inside the
> device you are modeling. Any difference in delay between rising and
> falling edges should be represented by where the actual edge occurs
> in the respective tables. You should not need to manually add any
> additional delay; the data from your transistor-level simulator
> should do that for you if it is modeling the differences in internal
> delays already.
>
> If you are taking bench data on real silicon, then you will need to
> build the tables to show the differences in delays. Your digital
> scope might do that for you if you are triggering from a common
> clock, for example.
>
> But in addition to all of the above, the simulator you are using must
> also handle the waveforms correctly in order to avoid the duty cycle
> distortion. Some may, others may not. It is always a good idea to run
> some correlation simulations to compare with the bench test data or
> transistor-level simulator to make sure it is all working the way it
> should be.
>
> Best regards,
> David Lorang
>
> > Adam.Tambone@fairchildsemi.com wrote:
> >
> > Hello All,
> >
> > I have a question regarding BIRD 68.1. Does the BIRD state that if
> > the recommendations within it are followed (i.e., if the V-T tables
> > for rising and falling begin at the same time as the rising and
> > falling edges of the input stimulus, and additional delay is
> > introduced to account for delay not within the buffers), then
> > undistorted duty cycles will be represented in simulation?
> >
> > Thanks,
> > Adam Tambone
Received on Wed Jun 13 12:31:01 2001
This archive was generated by hypermail 2.1.8 : Fri Jun 03 2011 - 09:52:30 PDT