EmbDev.net

Forum: FPGA, VHDL & Verilog
Thread: delay time in adc code


Author: angelo (Guest)
Hi all!

I am modelling the LTC2255 in VHDL. It is a 14-bit ADC whose output
behaves like this:

data_out <= v_in after 5*Tclk+Tp

In my situation the delay is 5*8 ns + 5.4 ns = 45.4 ns, so I write:

D <= std_logic_vector(to_signed(integer(IN_ADC*8192.0), D'length)) after 45.4 ns;

The simulation shows only 'U' on the data_out waveform. If I test with a
smaller delay, 10 ns for instance, it works fine.

Do you know why this happens and how I can fix it, please?

Thank you

Author: Duke Scarring (Guest)
angelo wrote:
> Do you know why and how I can fix this please?
You could try the transport delay model:
D <= transport std_logic_vector(to_signed(integer(IN_ADC*8192.0),D'length)) after 45.4 ns;
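The reason this helps: VHDL signal assignments use inertial delay by default, which rejects any pulse shorter than the stated delay. Since the ADC data changes every clock period (8 ns), each new value is cancelled before the 45.4 ns delay expires, so the output never leaves 'U'. Here is a minimal sketch contrasting the two models (entity and signal names are illustrative, not from the original code):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity delay_demo is
end entity;

architecture sim of delay_demo is
  signal stim        : std_logic := '0';
  signal inertial_d  : std_logic;
  signal transport_d : std_logic;
begin
  -- Stimulus toggles every 8 ns, like data updating on each ADC clock.
  stim <= not stim after 8 ns;

  -- Inertial (the default): each 8 ns pulse is shorter than 45.4 ns,
  -- so the scheduled transaction is cancelled by the next input change
  -- and inertial_d never resolves.
  inertial_d <= stim after 45.4 ns;

  -- Transport: every transition propagates, simply shifted by 45.4 ns,
  -- which is what a pipeline latency through an ADC looks like.
  transport_d <= transport stim after 45.4 ns;
end architecture;
```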

Duke

Author: angelo (Guest)
Oh it's magic! Thank you very much!
