Davy
2007-02-26 08:32:42 UTC
Hi all,
When using set_input_delay/set_output_delay, how do I determine the -max/-min
values? Are they calculated by hand, calculated by a tool, or taken from some
standard specification? My rough attempt at a budget is sketched after the
commands below.
Code:
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]
set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]
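For reference, here is my rough understanding of how these numbers are usually
budgeted, using made-up external timing figures (the variable names and values
below are placeholders, not from any datasheet; units follow whatever time unit
the rest of the script uses). Please correct me if this is wrong:
Code:
# Assumed external timing -- would really come from the external device's
# datasheet and from board-level trace analysis.
set ext_tco_max  400  ;# slowest clock-to-output of the device driving ddc_sda_i
set ext_tco_min    0  ;# fastest clock-to-output of that device
set ext_tsu       90  ;# setup time required by the device capturing ddc_sda_o
set ext_th         5  ;# hold time required by that device
set board_max     98  ;# longest board trace delay
set board_min      0  ;# shortest board trace delay

# Input side: external clock-to-output plus board delay.
set_input_delay -max [expr {$ext_tco_max + $board_max}] -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min [expr {$ext_tco_min + $board_min}] -clock EXTSCL [find port ddc_sda_i]

# Output side: setup of the external receiver plus board delay for -max,
# minus its hold requirement for -min.
set_output_delay -max [expr {$ext_tsu + $board_max}] -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min [expr {$board_min - $ext_th}] -clock CLK1MHZ [find port ddc_sda_o]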
Best regards,
Davy