EmbDev.net

Forum: FPGA, VHDL & Verilog delay time in adc code


von angelo (Guest)


Hi all!

I am modeling the LTC2255 in VHDL. It's a 14-bit ADC whose output behaves like this:

data_out <= v_in after 5*Tclk + Tp;

In my situation the delay is 5*8ns+5.4ns = 45.4ns, so I write:

D <= std_logic_vector(to_signed(integer(IN_ADC*8192.0), D'length)) after 45.4 ns;

The simulation shows only 'U' values on the data_out waveform. I tested with a smaller delay, 10 ns for instance, and that works.

Do you know why this happens and how I can fix it, please?

Thank you

von Duke Scarring (Guest)


angelo wrote:
> Do you know why and how I can fix this please?
You could try the transport model:
D <= transport std_logic_vector(to_signed(integer(IN_ADC*8192.0),D'length)) after 45.4 ns;
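The reason, in short: a plain `after` uses VHDL's default *inertial* delay, which cancels any pending transaction when a new assignment with a different value is scheduled before the old one matures. Here the input changes every 8 ns, but the delay is 45.4 ns, so every transaction gets preempted and the output never leaves 'U'. `transport` keeps all pending transactions in the driver queue, modeling a pure pipeline latency. A minimal sketch of such a behavioral model (entity, generic, and port names are illustrative, not taken from the LTC2255 datasheet):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Behavioral sketch of an ADC output with pipeline latency.
entity adc_model is
  generic (
    Tclk : time := 8 ns;    -- sample clock period
    Tp   : time := 5.4 ns   -- output propagation delay
  );
  port (
    v_in : in  real;                          -- normalized analog input
    d    : out std_logic_vector(13 downto 0)  -- 14-bit digital output
  );
end entity adc_model;

architecture behav of adc_model is
  constant LATENCY : time := 5 * Tclk + Tp;  -- 45.4 ns with the defaults above
begin
  -- Transport delay: every input sample reaches the output LATENCY later,
  -- even though new samples arrive every Tclk, which is shorter than LATENCY.
  -- With the default inertial delay, each new sample would delete the
  -- previous pending transaction and nothing would ever appear on d.
  d <= transport std_logic_vector(to_signed(integer(v_in * 8192.0), d'length))
       after LATENCY;
end architecture behav;
```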

Duke

von angelo (Guest)


Oh it's magic! Thank you very much!
