word | definition |
neural net (encz) | neural net, n: |
neural net (wn) | neural net
n 1: computer architecture in which processors are connected in a manner suggestive of connections between neurons; can learn by trial and error [syn: neural network, neural net]
2: any network of neurons or nuclei that function together to perform some function in the body [syn: neural network, neural net] |
| similar word | definition |
neural network (encz) | neural network, n: |
neural network (wn) | neural network
n 1: computer architecture in which processors are connected in a manner suggestive of connections between neurons; can learn by trial and error [syn: neural network, neural net]
2: any network of neurons or nuclei that function together to perform some function in the body [syn: neural network, neural net] |
artificial neural network (foldoc) | artificial neural network
neural nets
neural network
neuron
NN
(ANN, commonly just "neural network"
or "neural net") A network of many very simple processors
("units" or "neurons"), each possibly having a (small amount
of) local memory. The units are connected by unidirectional
communication channels ("connections"), which carry numeric
(as opposed to symbolic) data. The units operate only on
their local data and on the inputs they receive via the
connections.
A neural network is a processing device, either an
algorithm, or actual hardware, whose design was inspired by
the design and functioning of animal brains and components
thereof.
Most neural networks have some sort of "training" rule whereby
the weights of connections are adjusted on the basis of
presented patterns. In other words, neural networks "learn"
from examples, just as children learn to recognise dogs from
examples of dogs, and they exhibit some structural capability for
generalisation.
Neurons are often elementary non-linear signal processors (in
the limit they are simple threshold discriminators). Another
feature of NNs which distinguishes them from other computing
devices is a high degree of interconnection which allows a
high degree of parallelism. Further, there is no idle memory
containing data and programs, but rather each neuron is
pre-programmed and continuously active.
The term "neural net" should logically, but in common usage
never does, also include biological neural networks, whose
elementary structures are far more complicated than the
mathematical models used for ANNs.
See Aspirin, Hopfield network, McCulloch-Pitts neuron.
Usenet newsgroup: news:comp.ai.neural-nets.
(1997-10-13)
|
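The FOLDOC entry above describes units that form a weighted combination
of numeric inputs, pass it through a non-linear (in the limit, threshold)
function, and adjust their connection weights on the basis of presented
patterns. The following minimal sketch, which is not part of any of the
dictionary sources, illustrates that idea with a single perceptron-style
unit in Python; the step function, learning rate and toy patterns are
invented purely for the illustration.

def step(x):
    """Threshold discriminator: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train(patterns, epochs=20, lr=0.1):
    """Training rule: adjust connection weights from presented patterns."""
    n_inputs = len(patterns[0][0])
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in patterns:
            output = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
            error = target - output                    # learn from the example
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

if __name__ == "__main__":
    # Toy patterns: the logical AND of two inputs, invented for the sketch.
    examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    w, b = train(examples)
    for inputs, target in examples:
        got = step(sum(wi * xi for wi, xi in zip(w, inputs)) + b)
        print(inputs, "->", got, "expected", target)

Run as a script, the unit "learns" the AND function from the four example
patterns, in exactly the sense of the training rule described in the entry
above: weights are nudged whenever a presented pattern is misclassified.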
cellular neural network (foldoc) | Cellular Neural Network
(CNN) The CNN Universal Machine is a low-cost,
low-power, extremely high-speed supercomputer on a chip. It
is at least 1000 times faster than equivalent DSP solutions
for many complex image processing tasks. It is a
stored-program supercomputer: a complex sequence of image
processing algorithms is programmed and downloaded into the
chip, just as with any digital computer. Because the entire
computer is integrated into a chip, no signal leaves the chip
until the image processing task is completed.
Although the CNN universal chip is based on analogue and logic
operating principles, it has an on-chip analogue-to-digital
input-output interface, so from the system design and
application perspective it can be used as a digital
component, just like a DSP. In particular, a development
system is available for rapid design and prototyping.
Moreover, a compiler, an operating system, and a
user-friendly CNN high-level language, similar to the C
language, have been developed, which make it easy to implement
any image processing algorithm.
[Professor Leon Chua, University of California at Berkeley].
(1995-04-27)
|
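The entry above describes the chip but not the cell model it evaluates. As
background, a cellular neural network consists of a grid of cells, each
coupled only to its immediate neighbours through a feedback template A and
a control (input) template B. The Python sketch below simulates these
standard cell dynamics in software; it is purely illustrative, not taken
from the dictionary sources, and the templates, bias and toy input image
are invented for the example.

import numpy as np
from scipy.signal import convolve2d

def cnn_output(x):
    """Standard piecewise-linear output function y = 0.5*(|x+1| - |x-1|)."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def simulate_cnn(u, A, B, z, steps=200, dt=0.05):
    """Euler integration of dx/dt = -x + A*y + B*u + z over the cell grid."""
    x = np.zeros_like(u)                    # initial cell states
    Bu = convolve2d(u, B, mode="same")      # input coupling, constant in time
    for _ in range(steps):
        y = cnn_output(x)
        x = x + dt * (-x + convolve2d(y, A, mode="same") + Bu + z)
    return cnn_output(x)

if __name__ == "__main__":
    # Toy binary input image (+1 inside a square, -1 elsewhere).
    u = -np.ones((8, 8))
    u[2:6, 2:6] = 1.0
    A = np.array([[0.0, 0.0, 0.0],          # feedback from the cell itself only
                  [0.0, 2.0, 0.0],
                  [0.0, 0.0, 0.0]])
    B = np.array([[-1.0, -1.0, -1.0],       # Laplacian-like control template
                  [-1.0,  8.0, -1.0],
                  [-1.0, -1.0, -1.0]])
    print(simulate_cnn(u, A, B, z=-0.5))

With these invented templates the settled outputs roughly mark the border
of the input square, but the point of the sketch is only the locally
coupled, neighbourhood-limited update rule; on the chip described above
the same dynamics run in analogue hardware rather than in a loop.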
neural nets (foldoc) | see: artificial neural network |
neural network (foldoc) | see: artificial neural network |