Maker Pro

Hardware Neural Network

Daniele

Hi all,

I'm a university student, and I'm doing research on artificial neural
networks. The aim of my research is to assess the feasibility of putting
an artificial neural network on a microcontroller or a DSP. Because of
the sigmoid function, I think a 32-bit microcontroller (or DSP) would be
necessary for floating-point operations. I searched online, but I only
found exhaustive information on software realizations of ANNs, which is
not what I was looking for. Does anyone have hints or previous
experience with the hardware realization of an ANN?
I think the better solution is a DSP, due to its floating-point
performance. Is that right?

Here are some details of the network:

5 or 6 inputs
about 10 neurons in the hidden layer
2 outputs

Thanks in advance,

Daniele.
 

rwmoekoe

Feb 23, 2007

Hi,

I vaguely remember reading about this somewhere, but I can't seem to remember where exactly it was. Maybe it's in the "Datasheet 4" databook or so. If it is, then you can get it.

It is about exactly what you are looking for: a hardware neural network consisting of a matrix of presettable resistors or some such components, which are adjusted during the training period. What I remember is that the sigmoid transfer function and so forth happen naturally: the analog signals passing through the components acting as neurons follow the sigmoid on their own.

It is very interesting!

Well, just wanted to ask: how many hidden layers do you plan on having? Are there recursive layers in them?
If you don't mind sharing, what is the purpose of the NN you're building?

Thanks a lot!
 
EdV

Daniele said:
I'm a university student doing research on the feasibility of putting an
artificial neural network on a microcontroller or a DSP. [snip] Does
anyone have hints or previous experience with the hardware realization
of an ANN?

If you want to do a HW realization of a neural network your best bet
is to do it in Field Programmable Gate Arrays. Try googling FPGA and
Neural network:

FPGA Implementations of Neural Networks - a Survey of a Decade of ...
http://www.itee.uq.edu.au/~peters/papers/zhu_sutton_fpl2003.pdf

and about 17k more hits.

Have fun,
Ed V.
 
Robert Lacoste

Daniele said:
I'm a university student doing research on the feasibility of putting an
artificial neural network on a microcontroller or a DSP. [snip] I think
the better solution is a DSP, due to its floating-point performance. Is
that right?

Hi Daniele,

You may have a look at the "Neural Stamp" project I published in Circuit
Cellar magazine some time ago (January 2000, issue #114): it provided 8
analog inputs, a hidden layer of 16 neurons, and an output layer of 8
neurons driving 8 analog outputs, with a refresh rate of 50 ms... all with
only the internal resources of a MC68HC908GP20 low-cost 8-bit
microcontroller. Floating point is absolutely unnecessary for neural
networks, as a 1-bit quantization error doesn't change anything even with
8-bit words (at least for 2-layer networks), and the sigmoid can easily be
done with a table-driven approach. This project won the 3rd prize in the
Design'99 contest, see http://www.circuitcellar.com/d99winners/
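
For example, the table can be generated once on a PC with a few lines of C
and pasted into the firmware as a const array (a rough sketch assuming a
256-entry, 8-bit table covering roughly -8..+8, not the actual Neural Stamp
code):

/* gen_sigmoid_table.c -- run on the PC, paste the output into the firmware. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    printf("static const unsigned char sig_table[256] = {\n");
    for (int i = 0; i < 256; i++) {
        double x = (i - 128) / 16.0;          /* table covers roughly -8..+8 */
        double s = 1.0 / (1.0 + exp(-x));     /* sigmoid, 0..1               */
        int    q = (int)(s * 255.0 + 0.5);    /* quantize to 8 bits          */
        printf("%4d,%s", q, (i % 12 == 11) ? "\n" : "");
    }
    printf("};\n");
    return 0;
}

On the target, the neuron's fixed-point accumulator is simply scaled and
clamped to a 0..255 index into that table, so there is no exp() and no
floating point at run time.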

Friendly,
Robert Lacoste
www.alciom.com
The Mixed Signal Experts
 
It is definitely true that floating-point capabilities are not needed
for the real-time implementation of a network which has already been
trained. They can be an advantage in the training process, however.

I developed a real-time multi-layer perceptron implementation around
1989 which extracted the voice fundamental frequency (voice pitch) for
use in specialised hearing aids. This used a TMS320C25 16-bit fixed-point
DSP.

The training was done on Sun workstations and took many days.

A DSP is exceptionally well suited to the task, because each "neuron"
can be implemented as a repeated multiply-accumulate-with-data-move
instruction followed by a table lookup for the sigmoid function.
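
In C, the equivalent of that inner loop is roughly the following (a sketch
only; the Q15 format, the 1024-entry table and the array names are my
assumptions, not the original TMS320C25 code):

#include <stdint.h>

#define N_IN   16                        /* inputs to this layer (assumed)  */
#define N_OUT   8                        /* neurons in this layer (assumed) */

/* Q15 weights and sigmoid table, both computed offline during training. */
static const int16_t weights[N_OUT][N_IN] = { { 0 /* trained offline */ } };
static const int16_t sigmoid_lut[1024]    = { 0 /* precomputed offline */ };

/* One layer: for each neuron, a repeated multiply-accumulate over the
 * inputs, then a single table lookup for the sigmoid, which is the
 * pattern a fixed-point DSP runs at one multiply-accumulate per cycle. */
void layer(const int16_t in[N_IN], int16_t out[N_OUT])
{
    for (int n = 0; n < N_OUT; n++) {
        int32_t acc = 0;                           /* 32-bit accumulator     */
        for (int i = 0; i < N_IN; i++)
            acc += (int32_t)weights[n][i] * in[i]; /* Q15 x Q15 -> Q30       */

        int32_t idx = (acc >> 22) + 512;      /* map about [-2, 2) to 0..1023 */
        if (idx < 0)    idx = 0;
        if (idx > 1023) idx = 1023;
        out[n] = sigmoid_lut[idx];                 /* table-driven sigmoid   */
    }
}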

The following publication describes the work:

Real-Time Portable Multi-Layer Perceptron Voice Fundamental-Period
Extractor for Hearing Aids and Cochlear Implants. JR Walliker & I
Howard. 1989

http://www.ianhoward.de/ScannedPubs/WalHow90.pdf

John Walliker

www.walliker.com
 
Daniele said:
I'm a university student doing research on the feasibility of putting an
artificial neural network on a microcontroller or a DSP. [snip] Does
anyone have hints or previous experience with the hardware realization
of an ANN?

At some point, someone in the university will ask "how is your work
new," particularly if you are a graduate student. So I wonder how
you can distinguish this as something beyond the articles by James
Albus in the summer 1977 Byte magazine on a CMAC? (That summer, I
implemented his algorithm on a TI handheld calculator.)
 
Paul Burke

Daniele said:
Thanks for the hints and the link. I had already thought about using
an 8-bit table to map the sigmoid, so this is a good confirmation.

I half-recall seeing a reference to work on inhibitors in real neurones
which said the untreated neurone (from what? can't remember) had a 1-2%
chance of triggering with no input, so 8 bits should be at least
comparable to real life. But a 10- or 12-bit lookup shouldn't break the
bank on most processors or FPGAs either (a 12-bit table of 8-bit entries
is only 4 KB).

Paul Burke
 
Daniele

"Daniele" <[email protected]> a écrit dans le message de [email protected]...











Hi Danielle,

You may have a look at the "Neural Stamp" project I've published in the
Circuit Cellar magazine some time ago (January 2000, issue #114) : it
provided 8 analog inputs, an hidden layer of 16 neurons, an output layer of
8 neurons driving 8 analog outputs, with a refresh rate of 50ms... all with
only the internal resources of a MC68HC908GP20 low cost 8 bit
microcontroller. Floating point is absolutly unnecessary for neural
networks, as a 1 bit quantization error doesn't change anything even with
8-bit words (at least for 2-layers networks), and sigmoid can be done easily
with a table-driven approach. This project won the 3rd prize in the
Design'99 contest, seehttp://www.circuitcellar.com/d99winners/

Friendly,
Robert Lacostewww.alciom.com
The Mixed Signal Experts- Nascondi testo tra virgolette -

- Mostra testo tra virgolette -

Thanks for the hints and the link. I had already thought about using
an 8-bit table to map the sigmoid, so this is a good confirmation.
 
Daniele

At some point, someone in the university will ask "how is your work
new," particularly if you are a graduate student. So I wonder how
you can distinguish this as something beyond the articles by James
Albus in the summer 1977 Byte magazine on a CMAC? (That summer, I
implemented his algorithm on a TI handheld calculator.)

I have to train the network and set the weights without using a
calculator, writing them directly into the memory of the MC/DSP.
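
For example, something along these lines (only a sketch; the Q8.8 format
and all the names are placeholders of mine): train on the PC, then dump the
quantized weights as a const C array that the toolchain places in the
target's flash.

/* dump_weights.c -- run on the PC after training, paste the output
 * into the MC/DSP source so the weights live in program memory.     */
#include <stdint.h>
#include <stdio.h>
#include <math.h>

/* Convert one trained floating-point weight to Q8.8 fixed point. */
static int16_t to_q8_8(double w)
{
    long q = lround(w * 256.0);
    if (q >  32767) q =  32767;      /* saturate rather than wrap */
    if (q < -32768) q = -32768;
    return (int16_t)q;
}

/* Print a weight matrix as a const array; 'const' lets most toolchains
 * keep it in flash/ROM instead of RAM.                                 */
static void dump_weights(const double *w, int rows, int cols, const char *name)
{
    printf("static const int16_t %s[%d][%d] = {\n", name, rows, cols);
    for (int r = 0; r < rows; r++) {
        printf("    { ");
        for (int c = 0; c < cols; c++)
            printf("%6d%s", to_q8_8(w[r * cols + c]), c < cols - 1 ? ", " : " ");
        printf("},\n");
    }
    printf("};\n");
}

int main(void)
{
    /* tiny placeholder matrix just to show the output format */
    double w_hidden[2][3] = { { 0.5, -0.25, 1.0 }, { -1.5, 0.75, 0.0 } };
    dump_weights(&w_hidden[0][0], 2, 3, "w_hidden");
    return 0;
}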

Friendly,

Daniele
 
John Larkin

Daniele said:
I'm a university student doing research on the feasibility of putting an
artificial neural network on a microcontroller or a DSP. [snip] Does
anyone have hints or previous experience with the hardware realization
of an ANN?

I know it's an academic darling, but has anyone ever done anything
genuinely useful with nn technology?

John
 
Jan Panteltje

I know it's an academic darling, but has anyone ever done anything
genuinely useful with nn technology?

John

Speech recognition: look up Dragon NaturallySpeaking.
Also look up Liaw and Berger.
Now who is asking cryptic questions?
Can't you type "neural net" into Google?
 
Phil Hobbs

Jan said:
Speech recognition: look up Dragon NaturallySpeaking.
Also look up Liaw and Berger.
Now who is asking cryptic questions?
Can't you type "neural net" into Google?

The problem with NNs is that you can't see why they work. Thus although
they can provide neat results, you have to verify them afterwards, e.g.
speech recognition. You can't prove that they're going to work in any
given case without trying it. As Deming said, "You can't test quality
into a product."

Cheers,

Phil Hobbs
 
Jan Panteltje

The problem with NNs is that you can't see why they work. Thus although
they can provide neat results, you have to verify them afterwards, e.g.
speech recognition. You can't prove that they're going to work in any
given case without trying it. As Deming said, "You can't test quality
into a product."

Cheers,

Phil Hobbs
This is not completely correct.
I suggest you look up Berger & Liaw.
For now it is also a mathematical question, so you can calculate what comes out.
Anyway, Berger & Liaw came up with a better neuron model.
I think it is now used to find snipers and identify gun types (??), but data is
hard to get; more likely submarine detection, as it is Navy-financed.
Better models are what we need.
 
Phil Hobbs

Jan said:
This is not completely correct.
I suggest you look up Berger & Liaw.
For now it is also a mathematical question, so you can calculate what comes out.
Anyway, Berger & Liaw came up with a better neuron model.
I think it is now used to find snipers and identify gun types (??), but data is
hard to get; more likely submarine detection, as it is Navy-financed.
Better models are what we need.

Hmm. So if you have one of these B&L gizmos (about which opinion seems
to be seriously divided), and it's been trained to recognize my speech,
how are you going to show that it'll recognize yours without trying it?

Cheers,

Phil Hobbs
 
Dirk Bruere at NeoPax

Phil said:
Hmm. So if you have one of these B&L gizmos (about which opinion seems
to be seriously divided), and it's been trained to recognize my speech,
how are you going to show that it'll recognize yours without trying it?

The same problem exists for all complex pieces of s/w.
There is no general way to prove that they are bug free or will do what
they are supposed to do.

--
Dirk

http://www.onetribe.me.uk - The UK's only occult talk show
Presented by Dirk Bruere and Marc Power on ResonanceFM 104.4
http://www.resonancefm.com
 
Phil Hobbs

Dirk said:
The same problem exists for all complex pieces of s/w.
There is no general way to prove that they are bug free or will do what
they are supposed to do.

Well, no, that's not true. There are unit tests and so forth, and you
can trawl through the code and see how it's organized. Try doing that
with a neural net. NNs are cool, don't get me wrong, but I _hate_ 3 AM
phone calls.

Cheers,

Phil Hobbs
 
Jan Panteltje

Hmm. So if you have one of these B&L gizmos (about which opinion seems
to be seriously divided), and it's been trained to recognize my speech,
how are you going to show that it'll recognize yours without trying it?

Cheers,

Phil Hobbs

Sing a song into it?
 
John Larkin

Speech recognition: look up Dragon NaturallySpeaking.
Also look up Liaw and Berger.
Now who is asking cryptic questions?
Can't you type "neural net" into Google?

Oh, there are lots of hits, too many in fact. I was just wondering if
any practical products have resulted. The cited applications seem to
be stuff like spam detection, language translation, and pattern
recognition, processes that really don't expect consistent accuracy.

I'd be reluctant to trust anything serious, like a control system that
mattered, to an algorithm whose corner cases are undefined and
probably not testable.

I've worked with a few academics who would shout "neural network!"
(in one situation, two did it in precise unison) in response to nearly
any problem they didn't have an analytic approach to. The suggestion
was often beyond absurd.


John
 
Frank Miles

Oh, there are lots of hits, too many in fact. I was just wondering if
any practical products have resulted. The cited applications seem to
be stuff like spam detection, language translation, and pattern
recognition, processes that really don't expect consistent accuracy.

I'd be reluctant to trust anything serious, like a control system that
mattered, to an algorithm whose corner cases are undefined and
probably not testable.

I've worked with a few academics who would shout "neural network!"
(in one situation, two did it in precise unison) in response to nearly
any problem they didn't have an analytic approach to. The suggestion
was often beyond absurd.

Is there _any_ pattern recognition technology that doesn't suffer from
the same problems, at least for complex signals? This is not to say
that a truly refined technology shouldn't (ideally) be able to deliver
consistent, provable, testable performance, or be open to inspection.
But then the finest pattern recognizers we have for discriminating the
most complex signals in significant noise - trained people - don't meet
this test either. We have a long way to go in developing a technology
that has the fine attributes that you seek.

-f
--
 