Valued Contributor III

Setting bits in a std_logic_vector based on a run time pattern

I am trying to set bits in a std_logic_vector according to a number taken from another signal, hit_vector.

The hit_vector is a combination of three detector hits, and I am trying to decode it back into the hit patterns of the individual detectors.

 

The decode works fine on clk'event, but when nothing changes in hit_vector, the LSB of all decoded values toggles to 1 on every rising clock edge. I am probably doing something stupid, but I cannot find the problem.

At 100 ns the hit vector changes and quadrant gets decoded correctly, i.e. "11" -> bit 3, but after this quadrant switches back and forth between 1 and 0.
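For reference, the run-time bit-setting the question describes can be written with numeric_std, indexing the output vector with an integer computed from hit_vector. This is only a sketch; the entity, port names, and widths are assumptions, not the original code:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Hypothetical decoder: a 2-bit hit_vector selects one bit
-- of a 4-bit one-hot output (e.g. "11" -> bit 3).
entity hit_decoder is
  port (
    clk        : in  std_logic;
    hit_vector : in  std_logic_vector(1 downto 0);
    quadrant   : out std_logic_vector(3 downto 0)
  );
end entity hit_decoder;

architecture rtl of hit_decoder is
begin
  process (clk)
    variable idx : integer range 0 to 3;
  begin
    if rising_edge(clk) then
      -- index computed at run time from the hit pattern
      idx := to_integer(unsigned(hit_vector));
      quadrant      <= (others => '0');  -- default inside the clocked branch
      quadrant(idx) <= '1';              -- last assignment wins for bit idx
    end if;
  end process;
end architecture rtl;
```

Note that both the default and the bit assignment sit inside the rising_edge branch, which matters for the bug discussed in the replies.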

 

Thanks for any help. 

andi
2 Replies
Valued Contributor III

Why do you have quadrant and sta1/2/3 all set to 0 at the beginning of the process? Because of the way VHDL works, this means that whenever you get a falling edge of clk they will be set to 0, and the code will not synthesise cleanly. Remove these assignments and try again.
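To illustrate the point above, here is a minimal sketch (signal names assumed) of the problematic shape and a fix. A process sensitive to clk wakes on both edges, so any assignment placed before the rising-edge check executes on the falling edge too, which matches the observed toggling:

```vhdl
-- Buggy shape: the default executes on BOTH edges of clk,
-- so on every falling edge quadrant is cleared to zero:
--
--   process (clk)
--   begin
--     quadrant <= (others => '0');  -- runs on falling edges as well
--     if clk'event and clk = '1' then
--       quadrant(to_integer(unsigned(hit_vector))) <= '1';
--     end if;
--   end process;

-- Fixed shape: all assignments live inside the rising-edge
-- branch, so the register simply holds its value otherwise:
process (clk)
begin
  if rising_edge(clk) then
    quadrant <= (others => '0');
    quadrant(to_integer(unsigned(hit_vector))) <= '1';
  end if;
end process;
```

Alternatively, the defaults can be dropped entirely, as the reply suggests, if the decoded bits should be sticky rather than re-evaluated each cycle.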

Valued Contributor III

Thanks, andi
