
Setting bits in a std_logic_vector based on a run-time pattern

Altera_Forum

I am trying to set bits in a std_logic_vector according to a value taken from another vector, hit_vector.

The hit vector is a combination of three detector hits, and I am trying to decode it back into the hit patterns of the individual detectors.

 

The decode goes fine on clk'event, but then, when nothing changes in hit_vector, the LSB of all decoded values changes to 1 on every rising clock edge. I am probably doing something stupid, but I cannot find the problem.

At 100 ns the hit vector changes, and quadrant gets decoded correctly, i.e. "11" -> bit 3, but after this quadrant switches back and forth between 1 and 0.
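The code listing is not preserved in this archived thread; a minimal sketch consistent with the description here and with the reply below (entity name, port names, and widths are all guesses) would be:

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;

    entity hit_decoder is
        port (
            clk        : in  std_logic;
            hit_vector : in  std_logic_vector(1 downto 0);
            quadrant   : out std_logic_vector(3 downto 0)
        );
    end entity hit_decoder;

    architecture buggy of hit_decoder is
    begin
        process (clk)
        begin
            -- This default executes on EVERY event of clk, falling
            -- edges included, because it sits outside the clk'event
            -- branch, so each falling edge clears the register again.
            quadrant <= (others => '0');
            if clk'event and clk = '1' then
                -- decode the two-bit hit code into a one-hot bit,
                -- e.g. "11" (3) -> bit 3
                quadrant(to_integer(unsigned(hit_vector))) <= '1';
            end if;
        end process;
    end architecture buggy;

In simulation this alternates: each rising edge decodes the hit, each falling edge clears it, which matches the back-and-forth seen after 100 ns.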

 

Thanks for any help. 

andi

Altera_Forum

Why do you have quadrant and sta1/2/3 all set to 0 at the beginning of the process? Because of the way VHDL works, those assignments execute on every event of clk, so whenever you get a falling edge of clk they will be set back to 0, and the code will not synthesise. Remove these assignments and try again.
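A sketch of the repaired process along these lines, reusing the guessed hit_decoder entity from the sketch in the question. Moving the defaults inside the edge check, rather than deleting them outright, keeps the clear-then-set behaviour while touching quadrant only on rising edges:

    architecture fixed of hit_decoder is
    begin
        process (clk)
        begin
            if rising_edge(clk) then
                -- synchronous default: cleared and re-decoded only on
                -- the rising edge, so the value holds between edges
                quadrant <= (others => '0');
                quadrant(to_integer(unsigned(hit_vector))) <= '1';
            end if;
        end process;
    end architecture fixed;

With every assignment inside rising_edge(clk), quadrant infers a plain register and the falling-edge clearing disappears.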

Altera_Forum

Thanks, andi
