rob
Hi to all.
I am hoping someone can give me some advice here.
I am working on (designing) a tag-reader system using the
Atmel U2270B IC and the TK5530hm232 tag chip. I have built a system
that works, but I would like some ideas.
The code sent out from the 5530 tag chip is 64 bits (8 bytes),
in the form: header (E6 hex) followed by 7 bytes of data (a unique code).
There is no "space" or "delay" between the last bit of the last byte
and the start of the next "header" byte. The coding used is
Manchester. The system works at a carrier frequency of 125 kHz, and
the data rate out of the tag chip is 125 kHz / 32.
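To make the numbers concrete, here is the bit timing those figures imply, as a small C sketch (the macro names are mine, just for illustration):

```c
/* Carrier frequency and divider, as given above. */
#define CARRIER_HZ   125000UL
#define DIVIDER      32UL

/* Derived Manchester bit timing. */
#define BIT_RATE_HZ  (CARRIER_HZ / DIVIDER)        /* 3906 bit/s (3906.25 exactly)  */
#define BIT_PERIOD_US (1000000UL / BIT_RATE_HZ)    /* ~256 us per data bit          */
#define HALF_BIT_US   (BIT_PERIOD_US / 2)          /* ~128 us per Manchester half-bit */
```

So one data bit spans 32 carrier cycles, roughly 256 µs, with the Manchester mid-bit transition about 128 µs in.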
I would like to know how to decode this "stream" of bits using only
software. At the moment I am using a hardware/software approach.
I divide the 125 kHz by 32 to get a clock. On the falling edge of
the clock (which drives an interrupt on the 89C4051) I sample the value
of the data line. 8 bits are shifted into a register and checked to see
if it is "E6". If not, the MSB is shifted out (left) and the next bit
shifted in (right), and the process repeats until the header is found.
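The shift-and-compare header hunt described above can be sketched in C like this (the function name is mine; on the 89C4051 this would live in the clock interrupt routine):

```c
#include <stdint.h>

#define HEADER 0xE6  /* header byte, as in the post */

static uint8_t shift_reg = 0;

/* Shift one sampled data bit into the search register (MSB falls out on
   the left, new bit enters on the right). Returns 1 when the last 8
   sampled bits match the header. Call once per sampled bit. */
int push_bit(int bit)
{
    shift_reg = (uint8_t)((shift_reg << 1) | (bit & 1));
    return shift_reg == HEADER;
}
```

Once `push_bit` returns 1, the next 56 sampled bits are the unique code.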
Once I have the header, the decoding of the next 56 bits is done. The
problem that I am having is that sometimes the falling edge of the clock
and the mid-bit transition of the data happen at "about" the same
time, and I am getting errors in the data: sometimes I sample a 1 and
sometimes a 0.
Hope you are still with me!!!
98% of the time things work great, but I would like 100% :0)
I would like to do all the decoding in software and exclude the clock,
but am not sure how to start, because there is no "dead" time
between the last bit of the code and the start of the next header. How
do I get the timing correct? Transitions from low to high (a 0) and vice
versa (a 1) can happen in the middle of a bit (giving the value of the
bit) or at bit boundaries.
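One common software-only approach (an assumption on my part, not from the post) is to timestamp the edges themselves rather than sample on a clock: mid-bit edges of consecutive equal bits are a full bit period T apart, while a boundary edge splits that into two half-bit intervals, so every measured interval should cluster near T or T/2 and can be classified with a threshold at 3T/4. A minimal sketch, with T taken as the ~256 µs bit period from 125 kHz / 32:

```c
#define T_US 256  /* nominal bit period: 32 carrier cycles at 125 kHz */

typedef enum { HALF_BIT, FULL_BIT, BAD_INTERVAL } interval_t;

/* Classify the measured time (in microseconds) between two edges of the
   Manchester signal. Anything outside +/-50% of the two nominal
   intervals is treated as noise or lost sync. */
interval_t classify(unsigned us)
{
    if (us > T_US / 4 && us <= (3 * T_US) / 4)
        return HALF_BIT;   /* ~128 us: edge at a bit boundary */
    if (us > (3 * T_US) / 4 && us <= (5 * T_US) / 4)
        return FULL_BIT;   /* ~256 us: consecutive mid-bit edges */
    return BAD_INTERVAL;
}
```

With intervals classified this way, a small state machine can keep lock on the mid-bit edges (half-bit intervals always arrive in pairs), which sidesteps the clock-versus-data race entirely.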
I am fairly new to the "micro" stuff, so any help would be appreciated.
Cheers
Rob