The neural network interface receives requests from users, who upload files and input parameters. It uses these to generate data files and then sends a command to the training software on the server machine. When all data requirements are set up properly, the training process starts; otherwise, the training tool sends error warnings back to the client. If the training process is successful, a training result file is generated. This file stores the training information and results, such as the training algorithm, training pattern file, topology, parameters, initial weights, and resultant weights (Figure 64.2).
FIGURE 64.1 User interface of NBN 2.0.
FIGURE 64.2 Training results (error-versus-iteration plot, iteration axis scaled ×50, with panels for parameters, data file, topology, initial weights, resultant weights, and the training summary).