Discussion:
[DC] Determine parameter in set_input_delay?
Davy
2007-02-26 08:32:42 UTC
Hi all,

When using set_input_delay/set_output_delay, how do you determine the -max/-min
parameters? Are they calculated by hand, calculated by a tool, or given by some
standard specification?

Code:
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]

set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]
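
For an external (chip-level) port, the -max/-min values normally come from the
datasheet of the device on the other end of the trace plus the board delay, not
from the synthesis tool itself. Below is a minimal sketch of that arithmetic;
the datasheet and board numbers are hypothetical (picked only so the sums
reproduce the 498/0 above) and must be replaced with your system's real figures.

Code:
# Hypothetical external-device timing and board trace delays (ns).
set tco_ext_max 450.0   ;# external device clock-to-output, max
set tco_ext_min   0.0   ;# external device clock-to-output, min
set tsu_ext     450.0   ;# external device setup requirement
set th_ext        0.0   ;# external device hold requirement
set board_max    48.0   ;# longest trace delay
set board_min     0.0   ;# shortest trace delay

# Input: data arrives Tco(ext) + trace delay after the external clock edge.
set_input_delay  -max [expr $tco_ext_max + $board_max] -clock EXTSCL [find port ddc_sda_i]
set_input_delay  -min [expr $tco_ext_min + $board_min] -clock EXTSCL [find port ddc_sda_i]

# Output: leave room for the external device's setup/hold plus trace delay.
set_output_delay -max [expr $tsu_ext + $board_max] -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min [expr $board_min - $th_ext]  -clock CLK1MHZ [find port ddc_sda_o]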


Best regards,
Davy
Alvin Andries
2007-02-26 22:46:27 UTC
Post by Davy
Hi all,
When using set_input_delay/set_output_delay, how do you determine the -max/-min
parameters? Are they calculated by hand, calculated by a tool, or given by some
standard specification?
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]
set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]
Best regards,
Davy
It depends. If you're doing hierarchical synthesis, a tool may help you budget
your internal I/O timing constraints. When going off chip, either you decide
what the user gets, or the other way around: the user tells you what they want.
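
As a rough illustration of what such a budget could look like when done by hand,
here is a sketch assuming a hypothetical 50/50 split of the 1000 ns CLK1MHZ
period between this block and its neighbour; the split is an assumption, not a
rule.

Code:
# Hypothetical manual budget: give the neighbouring block half of the
# 1000 ns CLK1MHZ period, keep the other half for paths inside this block.
set clk_period 1000.0
set ext_share     0.5
set_output_delay -max [expr $clk_period * $ext_share] -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]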

Alvin.
