陈逸凡 2024-04-25 17:06:23 +08:00
commit c4b421e13a
21 changed files with 1717 additions and 0 deletions

7
.travis.yml Normal file

@@ -0,0 +1,7 @@
language: c
compiler:
- clang
- gcc
script: make

BIN
1.xlsx Normal file

Binary file not shown.

20
LICENSE Normal file

@@ -0,0 +1,20 @@
zlib License
Copyright (C) 2015-2018 Lewis Van Winkle
This software is provided 'as-is', without any express or implied
warranty. In no event will the authors be held liable for any damages
arising from the use of this software.
Permission is granted to anyone to use this software for any purpose,
including commercial applications, and to alter it and redistribute it
freely, subject to the following restrictions:
1. The origin of this software must not be misrepresented; you must not
claim that you wrote the original software. If you use this software
in a product, an acknowledgement in the product documentation would be
appreciated but is not required.
2. Altered source versions must be plainly marked as such, and must not be
misrepresented as being the original software.
3. This notice may not be removed or altered from any source distribution.

35
Makefile Normal file

@@ -0,0 +1,35 @@
CFLAGS = -Wall -Wshadow -O3 -g -march=native
LDLIBS = -lm

all: check example1 example2 example3 example4 mytest

sigmoid: CFLAGS += -Dgenann_act=genann_act_sigmoid_cached
sigmoid: all

threshold: CFLAGS += -Dgenann_act=genann_act_threshold
threshold: all

linear: CFLAGS += -Dgenann_act=genann_act_linear
linear: all

test: test.o genann.o

check: test
	./$^

example1: example1.o genann.o
example2: example2.o genann.o
example3: example3.o genann.o
example4: example4.o genann.o

# The source file is my_test.c, so the implicit %: %.o rule can't link
# a target named mytest; give it an explicit recipe.
mytest: my_test.o genann.o
	$(CC) $(CFLAGS) -o $@ $^ $(LDLIBS)

clean:
	$(RM) *.o
	$(RM) test example1 example2 example3 example4 mytest *.exe
	$(RM) persist.txt

.PHONY: all check sigmoid threshold linear clean

154
README.md Normal file

@@ -0,0 +1,154 @@
[![Build Status](https://travis-ci.org/codeplea/genann.svg?branch=master)](https://travis-ci.org/codeplea/genann)
<img alt="Genann logo" src="https://codeplea.com/public/content/genann_logo.png" align="right" />
# Genann
Genann is a minimal, well-tested library for training and using feedforward
artificial neural networks (ANN) in C. Its primary focus is on being simple,
fast, reliable, and hackable. It achieves this by providing only the necessary
functions and little extra.
## Features
- **C99 with no dependencies**.
- Contained in a single source code and header file.
- Simple.
- Fast and thread-safe.
- Easily extendible.
- Implements backpropagation training.
- *Compatible with alternative training methods* (classic optimization, genetic algorithms, etc.)
- Includes examples and test suite.
- Released under the zlib license - free for nearly any use.
## Building
Genann is self-contained in two files: `genann.c` and `genann.h`. To use Genann, simply add those two files to your project.
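For example, an application whose entry point lives in `main.c` (a name assumed here for illustration) can be built with a single compile line; the only link-time dependency is the math library:
```
cc -std=c99 -Wall -O2 main.c genann.c -lm -o main
```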
## Example Code
Four example programs are included with the source code.
- [`example1.c`](./example1.c) - Trains an ANN on the XOR function using backpropagation.
- [`example2.c`](./example2.c) - Trains an ANN on the XOR function using random search.
- [`example3.c`](./example3.c) - Loads and runs an ANN from a file.
- [`example4.c`](./example4.c) - Trains an ANN on the [IRIS data-set](https://archive.ics.uci.edu/ml/datasets/Iris) using backpropagation.
## Quick Example
We create an ANN taking 2 inputs, having 1 layer of 3 hidden neurons, and
providing 2 outputs. It has the following structure:
![NN Example Structure](./doc/e1.png)
We then train it on a set of labeled data using backpropagation and ask it to
predict on a test data point:
```C
#include <stdio.h>
#include "genann.h"

/* Not shown, loading your training and test data. */
double **training_data_input, **training_data_output, **test_data_input;

/* New network with 2 inputs,
 * 1 hidden layer of 3 neurons,
 * and 2 outputs. */
genann *ann = genann_init(2, 1, 3, 2);

/* Learn on the training set. */
int i, j;
for (i = 0; i < 300; ++i) {
    for (j = 0; j < 100; ++j)
        genann_train(ann, training_data_input[j], training_data_output[j], 0.1);
}

/* Run the network and see what it predicts. */
double const *prediction = genann_run(ann, test_data_input[0]);
printf("Output for the first test data point is: %f, %f\n", prediction[0], prediction[1]);

genann_free(ann);
```
This example is meant to show API usage; it is not a demonstration of good
machine learning technique. In a real application you would likely want to
train on the training data in a random order. You would also want to monitor
the learning to prevent over-fitting.
## Usage
### Creating and Freeing ANNs
```C
genann *genann_init(int inputs, int hidden_layers, int hidden, int outputs);
genann *genann_copy(genann const *ann);
void genann_free(genann *ann);
```
Creating a new ANN is done with the `genann_init()` function. Its arguments
are the number of inputs, the number of hidden layers, the number of neurons in
each hidden layer, and the number of outputs. It returns a `genann` struct pointer.
Calling `genann_copy()` will create a deep-copy of an existing `genann` struct.
Call `genann_free()` when you're finished with an ANN returned by `genann_init()`.
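For instance, a minimal sketch (the layer sizes here are arbitrary):
```C
#include "genann.h"

int main(void) {
    /* 4 inputs, 2 hidden layers of 8 neurons each, 3 outputs. */
    genann *ann = genann_init(4, 2, 8, 3);
    if (!ann) return 1; /* genann_init returns 0 on bad arguments or allocation failure */

    genann *twin = genann_copy(ann); /* deep copy; must be freed separately */

    genann_free(twin);
    genann_free(ann);
    return 0;
}
```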
### Training ANNs
```C
void genann_train(genann const *ann, double const *inputs,
double const *desired_outputs, double learning_rate);
```
`genann_train()` will perform one update using standard backpropagation. It
should be called by passing in an array of inputs, an array of expected outputs,
and a learning rate. See *example1.c* for an example of learning with
backpropagation.
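As a minimal sketch (the input values and target here are made up), a single
update nudges the network's output toward the target:
```C
#include <stdio.h>
#include "genann.h"

int main(void) {
    genann *ann = genann_init(2, 1, 3, 1);
    const double in[2] = {0.25, 0.75}; /* hypothetical input */
    const double target = 1.0;         /* hypothetical desired output */

    double before = *genann_run(ann, in);
    genann_train(ann, in, &target, 0.1); /* one backpropagation update */
    double after = *genann_run(ann, in);

    printf("before: %f  after: %f  (target %f)\n", before, after, target);

    genann_free(ann);
    return 0;
}
```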
A primary design goal of Genann was to store all the network weights in one
contiguous block of memory. This makes it easy and efficient to train the
network weights using direct-search numeric optimization algorithms,
such as [Hill Climbing](https://en.wikipedia.org/wiki/Hill_climbing),
[the Genetic Algorithm](https://en.wikipedia.org/wiki/Genetic_algorithm), [Simulated
Annealing](https://en.wikipedia.org/wiki/Simulated_annealing), etc.
These methods can be used by searching on the ANN's weights directly.
Every `genann` struct contains the members `int total_weights;` and
`double *weight;`. `weight` points to an array of `total_weights` doubles
holding all the weights used by the ANN. See *example2.c* for
an example of training using random hill climbing search.
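As a rough sketch of that idea (the perturbation size, step count, and
single-point error function below are arbitrary choices, not part of the
Genann API):
```C
#include <stdlib.h>
#include "genann.h"

/* Squared error on one labeled point; a real application would sum
 * the error over an entire training set. */
static double error_on(genann const *ann, double const *in, double target) {
    double out = *genann_run(ann, in);
    return (out - target) * (out - target);
}

/* Random hill climbing over the contiguous weight array. */
void hill_climb(genann **ann, double const *in, double target, int steps) {
    double best = error_on(*ann, in, target);
    int s, i;
    for (s = 0; s < steps; ++s) {
        genann *trial = genann_copy(*ann);
        /* Perturb every weight by a small uniform step (about +/-0.05). */
        for (i = 0; i < trial->total_weights; ++i)
            trial->weight[i] += ((double)rand() / RAND_MAX - 0.5) * 0.1;
        if (error_on(trial, in, target) < best) {
            best = error_on(trial, in, target);
            genann_free(*ann);
            *ann = trial; /* keep the improvement */
        } else {
            genann_free(trial); /* discard the regression */
        }
    }
}
```
The double pointer is needed because an improvement replaces the network
wholesale rather than mutating it in place.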
### Saving and Loading ANNs
```C
genann *genann_read(FILE *in);
void genann_write(genann const *ann, FILE *out);
```
Genann provides the `genann_read()` and `genann_write()` functions for loading or saving an ANN in a text-based format.
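For example, round-tripping a network through a file might look like this
(error handling elided; the file name is arbitrary):
```C
#include <stdio.h>
#include "genann.h"

int main(void) {
    genann *ann = genann_init(2, 1, 3, 1);

    FILE *out = fopen("net.txt", "w");
    genann_write(ann, out);
    fclose(out);
    genann_free(ann);

    FILE *in = fopen("net.txt", "r");
    genann *restored = genann_read(in); /* NULL if the file is malformed */
    fclose(in);

    genann_free(restored);
    return 0;
}
```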
### Evaluating
```C
double const *genann_run(genann const *ann, double const *inputs);
```
Call `genann_run()` on a trained ANN to run a feedforward pass on a given set of inputs. `genann_run()`
returns a pointer to the array of predicted outputs (of length `ann->outputs`).
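The returned pointer refers to a buffer inside the `genann` struct: don't free
it, and copy the values out if you need them after the next call to
`genann_run()` or `genann_train()`. A minimal sketch, assuming `ann` and
`inputs` were set up as above:
```C
double const *out = genann_run(ann, inputs);
int i;
for (i = 0; i < ann->outputs; ++i)
    printf("output[%d] = %f\n", i, out[i]);
```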
## Hints
- All functions start with `genann_`.
- The code is simple. Dig in and change things.
## Extra Resources
The [comp.ai.neural-nets
FAQ](http://www.faqs.org/faqs/ai-faq/neural-nets/part1/) is an excellent
resource for an introduction to artificial neural networks.
If you need an even smaller neural network library, check out the excellent single-hidden-layer library [tinn](https://github.com/glouw/tinn).
If you're looking for a heavier, more opinionated neural network library in C,
I recommend the [FANN library](http://leenissen.dk/fann/wp/). Another
good library is Peter van Rossum's [Lightweight Neural
Network](http://lwneuralnet.sourceforge.net/), which despite its name, is
heavier and has more features than Genann.

10
build.sh Normal file

@@ -0,0 +1,10 @@
#!/bin/sh
###
# @Author: 陈逸凡 1343619937@qq.com
# @Date: 2024-04-25 16:02:09
# @LastEditors: 陈逸凡 1343619937@qq.com
# @LastEditTime: 2024-04-25 16:02:21
# @FilePath: \genann-master\build.sh
# @Description: Default koroFileHeader template. Set `customMade` to customize it; see the koroFileHeader configuration guide: https://github.com/OBKoro1/koro1FileHeader/wiki/%E9%85%8D%E7%BD%AE
###
gcc genann.c my_test.c -o mytest -lm
./mytest

9
doc/e1.dot Normal file

@@ -0,0 +1,9 @@
digraph G {
rankdir=LR;
{i1 i2} -> {h1 h2 h3} -> {o1 o2};
i1, i2, h1, h2, h3, o1, o2 [shape=circle; label="";];
input -> hidden -> output [style=invis;];
input, hidden, output [shape=plaintext;];
}

BIN
doc/e1.png Normal file

Binary file not shown.


150
example/iris.data Normal file

@@ -0,0 +1,150 @@
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3.0,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6.0,3.0,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3.0,5.2,2.3,Iris-virginica
6.3,2.5,5.0,1.9,Iris-virginica
6.5,3.0,5.2,2.0,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3.0,5.1,1.8,Iris-virginica

69
example/iris.names Normal file

@@ -0,0 +1,69 @@
1. Title: Iris Plants Database
Updated Sept 21 by C.Blake - Added discrepency information
2. Sources:
(a) Creator: R.A. Fisher
(b) Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)
(c) Date: July, 1988
3. Past Usage:
- Publications: too many to mention!!! Here are a few.
1. Fisher,R.A. "The use of multiple measurements in taxonomic problems"
Annual Eugenics, 7, Part II, 179-188 (1936); also in "Contributions
to Mathematical Statistics" (John Wiley, NY, 1950).
2. Duda,R.O., & Hart,P.E. (1973) Pattern Classification and Scene Analysis.
(Q327.D83) John Wiley & Sons. ISBN 0-471-22361-1. See page 218.
3. Dasarathy, B.V. (1980) "Nosing Around the Neighborhood: A New System
Structure and Classification Rule for Recognition in Partially Exposed
Environments". IEEE Transactions on Pattern Analysis and Machine
Intelligence, Vol. PAMI-2, No. 1, 67-71.
-- Results:
-- very low misclassification rates (0% for the setosa class)
4. Gates, G.W. (1972) "The Reduced Nearest Neighbor Rule". IEEE
Transactions on Information Theory, May 1972, 431-433.
-- Results:
-- very low misclassification rates again
5. See also: 1988 MLC Proceedings, 54-64. Cheeseman et al's AUTOCLASS II
conceptual clustering system finds 3 classes in the data.
4. Relevant Information:
--- This is perhaps the best known database to be found in the pattern
recognition literature. Fisher's paper is a classic in the field
and is referenced frequently to this day. (See Duda & Hart, for
example.) The data set contains 3 classes of 50 instances each,
where each class refers to a type of iris plant. One class is
linearly separable from the other 2; the latter are NOT linearly
separable from each other.
--- Predicted attribute: class of iris plant.
--- This is an exceedingly simple domain.
--- This data differs from the data presented in Fishers article
(identified by Steve Chadwick, spchadwick@espeedaz.net )
The 35th sample should be: 4.9,3.1,1.5,0.2,"Iris-setosa"
where the error is in the fourth feature.
The 38th sample: 4.9,3.6,1.4,0.1,"Iris-setosa"
where the errors are in the second and third features.
5. Number of Instances: 150 (50 in each of three classes)
6. Number of Attributes: 4 numeric, predictive attributes and the class
7. Attribute Information:
1. sepal length in cm
2. sepal width in cm
3. petal length in cm
4. petal width in cm
5. class:
-- Iris Setosa
-- Iris Versicolour
-- Iris Virginica
8. Missing Attribute Values: None
Summary Statistics:
Min Max Mean SD Class Correlation
sepal length: 4.3 7.9 5.84 0.83 0.7826
sepal width: 2.0 4.4 3.05 0.43 -0.4194
petal length: 1.0 6.9 3.76 1.76 0.9490 (high!)
petal width: 0.1 2.5 1.20 0.76 0.9565 (high!)
9. Class Distribution: 33.3% for each of 3 classes.

1
example/xor.ann Normal file

@@ -0,0 +1 @@
2 1 2 1 -1.777 -5.734 -6.029 -4.460 -3.261 -3.172 2.444 -6.581 5.826

41
example1.c Normal file

@@ -0,0 +1,41 @@
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include "genann.h"
int main(int argc, char *argv[])
{
printf("GENANN example 1.\n");
printf("Train a small ANN to the XOR function using backpropagation.\n");
/* This will make the neural network initialize differently each run. */
/* If you don't get a good result, try again for a different result. */
srand(time(0));
/* Input and expected output data for the XOR function. */
const double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
const double output[4] = {0, 1, 1, 0};
int i;
/* New network with 2 inputs,
* 1 hidden layer of 2 neurons,
* and 1 output. */
genann *ann = genann_init(2, 1, 2, 1);
/* Train on the four labeled data points many times. */
for (i = 0; i < 500; ++i) {
genann_train(ann, input[0], output + 0, 3);
genann_train(ann, input[1], output + 1, 3);
genann_train(ann, input[2], output + 2, 3);
genann_train(ann, input[3], output + 3, 3);
}
/* Run the network and see what it predicts. */
printf("Output for [%1.f, %1.f] is %1.f.\n", input[0][0], input[0][1], *genann_run(ann, input[0]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[1][0], input[1][1], *genann_run(ann, input[1]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[2][0], input[2][1], *genann_run(ann, input[2]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[3][0], input[3][1], *genann_run(ann, input[3]));
genann_free(ann);
return 0;
}

71
example2.c Normal file

@@ -0,0 +1,71 @@
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <math.h>
#include "genann.h"
int main(int argc, char *argv[])
{
printf("GENANN example 2.\n");
printf("Train a small ANN to the XOR function using random search.\n");
srand(time(0));
/* Input and expected output data for the XOR function. */
const double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
const double output[4] = {0, 1, 1, 0};
int i;
/* New network with 2 inputs,
* 1 hidden layer of 2 neurons,
* and 1 output. */
genann *ann = genann_init(2, 1, 2, 1);
double err;
double last_err = 1000;
int count = 0;
do {
++count;
if (count % 1000 == 0) {
/* We're stuck, start over. */
genann_randomize(ann);
last_err = 1000;
}
genann *save = genann_copy(ann);
/* Take a random guess at the ANN weights. */
for (i = 0; i < ann->total_weights; ++i) {
ann->weight[i] += ((double)rand())/RAND_MAX-0.5;
}
/* See how we did. */
err = 0;
err += pow(*genann_run(ann, input[0]) - output[0], 2.0);
err += pow(*genann_run(ann, input[1]) - output[1], 2.0);
err += pow(*genann_run(ann, input[2]) - output[2], 2.0);
err += pow(*genann_run(ann, input[3]) - output[3], 2.0);
/* Keep these weights if they're an improvement. */
if (err < last_err) {
genann_free(save);
last_err = err;
} else {
genann_free(ann);
ann = save;
}
} while (err > 0.01);
printf("Finished in %d loops.\n", count);
/* Run the network and see what it predicts. */
printf("Output for [%1.f, %1.f] is %1.f.\n", input[0][0], input[0][1], *genann_run(ann, input[0]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[1][0], input[1][1], *genann_run(ann, input[1]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[2][0], input[2][1], *genann_run(ann, input[2]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[3][0], input[3][1], *genann_run(ann, input[3]));
genann_free(ann);
return 0;
}

39
example3.c Normal file

@@ -0,0 +1,39 @@
#include <stdio.h>
#include <stdlib.h>
#include "genann.h"
const char *save_name = "example/xor.ann";
int main(int argc, char *argv[])
{
printf("GENANN example 3.\n");
printf("Load a saved ANN to solve the XOR function.\n");
FILE *saved = fopen(save_name, "r");
if (!saved) {
printf("Couldn't open file: %s\n", save_name);
exit(1);
}
genann *ann = genann_read(saved);
fclose(saved);
if (!ann) {
printf("Error loading ANN from file: %s.", save_name);
exit(1);
}
/* Input data for the XOR function. */
const double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
/* Run the network and see what it predicts. */
printf("Output for [%1.f, %1.f] is %1.f.\n", input[0][0], input[0][1], *genann_run(ann, input[0]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[1][0], input[1][1], *genann_run(ann, input[1]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[2][0], input[2][1], *genann_run(ann, input[2]));
printf("Output for [%1.f, %1.f] is %1.f.\n", input[3][0], input[3][1], *genann_run(ann, input[3]));
genann_free(ann);
return 0;
}

119
example4.c Normal file

@@ -0,0 +1,119 @@
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <string.h>
#include <math.h>
#include "genann.h"
/* This example is to illustrate how to use GENANN.
* It is NOT an example of good machine learning techniques.
*/
const char *iris_data = "example/iris.data";
double *input, *class;
int samples;
const char *class_names[] = {"Iris-setosa", "Iris-versicolor", "Iris-virginica"};
void load_data() {
/* Load the iris data-set. */
FILE *in = fopen(iris_data, "r");
if (!in) {
printf("Could not open file: %s\n", iris_data);
exit(1);
}
/* Loop through the data to get a count. */
char line[1024];
while (!feof(in) && fgets(line, 1024, in)) {
++samples;
}
fseek(in, 0, SEEK_SET);
printf("Loading %d data points from %s\n", samples, iris_data);
/* Allocate memory for input and output data. */
input = malloc(sizeof(double) * samples * 4);
class = malloc(sizeof(double) * samples * 3);
/* Read the file into our arrays. */
int i, j;
for (i = 0; i < samples; ++i) {
double *p = input + i * 4;
double *c = class + i * 3;
c[0] = c[1] = c[2] = 0.0;
if (fgets(line, 1024, in) == NULL) {
perror("fgets");
exit(1);
}
char *split = strtok(line, ",");
for (j = 0; j < 4; ++j) {
p[j] = atof(split);
split = strtok(0, ",");
}
split[strlen(split)-1] = 0;
if (strcmp(split, class_names[0]) == 0) {c[0] = 1.0;}
else if (strcmp(split, class_names[1]) == 0) {c[1] = 1.0;}
else if (strcmp(split, class_names[2]) == 0) {c[2] = 1.0;}
else {
printf("Unknown class %s.\n", split);
exit(1);
}
/* printf("Data point %d is %f %f %f %f -> %f %f %f\n", i, p[0], p[1], p[2], p[3], c[0], c[1], c[2]); */
}
fclose(in);
}
int main(int argc, char *argv[])
{
printf("GENANN example 4.\n");
printf("Train an ANN on the IRIS dataset using backpropagation.\n");
srand(time(0));
/* Load the data from file. */
load_data();
/* 4 inputs.
* 1 hidden layer(s) of 4 neurons.
* 3 outputs (1 per class)
*/
genann *ann = genann_init(4, 1, 4, 3);
int i, j;
int loops = 5000;
/* Train the network with backpropagation. */
printf("Training for %d loops over data.\n", loops);
for (i = 0; i < loops; ++i) {
for (j = 0; j < samples; ++j) {
genann_train(ann, input + j*4, class + j*3, .01);
}
/* printf("%1.2f ", xor_score(ann)); */
}
int correct = 0;
for (j = 0; j < samples; ++j) {
const double *guess = genann_run(ann, input + j*4);
if (class[j*3+0] == 1.0) {if (guess[0] > guess[1] && guess[0] > guess[2]) ++correct;}
else if (class[j*3+1] == 1.0) {if (guess[1] > guess[0] && guess[1] > guess[2]) ++correct;}
else if (class[j*3+2] == 1.0) {if (guess[2] > guess[0] && guess[2] > guess[1]) ++correct;}
else {printf("Logic error.\n"); exit(1);}
}
printf("%d/%d correct (%0.1f%%).\n", correct, samples, (double)correct / samples * 100.0);
genann_free(ann);
free(input);
free(class);
return 0;
}

405
genann.c Normal file

@@ -0,0 +1,405 @@
/*
* GENANN - Minimal C Artificial Neural Network
*
* Copyright (c) 2015-2018 Lewis Van Winkle
*
* http://CodePlea.com
*
* This software is provided 'as-is', without any express or implied
* warranty. In no event will the authors be held liable for any damages
* arising from the use of this software.
*
* Permission is granted to anyone to use this software for any purpose,
* including commercial applications, and to alter it and redistribute it
* freely, subject to the following restrictions:
*
* 1. The origin of this software must not be misrepresented; you must not
* claim that you wrote the original software. If you use this software
* in a product, an acknowledgement in the product documentation would be
* appreciated but is not required.
* 2. Altered source versions must be plainly marked as such, and must not be
* misrepresented as being the original software.
* 3. This notice may not be removed or altered from any source distribution.
*
*/
#include "genann.h"
#include <assert.h>
#include <errno.h>
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#ifndef genann_act
#define genann_act_hidden genann_act_hidden_indirect
#define genann_act_output genann_act_output_indirect
#else
#define genann_act_hidden genann_act
#define genann_act_output genann_act
#endif
#define LOOKUP_SIZE 4096
double genann_act_hidden_indirect(const struct genann *ann, double a) {
return ann->activation_hidden(ann, a);
}
double genann_act_output_indirect(const struct genann *ann, double a) {
return ann->activation_output(ann, a);
}
const double sigmoid_dom_min = -15.0;
const double sigmoid_dom_max = 15.0;
double interval;
double lookup[LOOKUP_SIZE];
#ifdef __GNUC__
#define likely(x) __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)
#define unused __attribute__((unused))
#else
#define likely(x) x
#define unlikely(x) x
#define unused
#pragma warning(disable : 4996) /* For fscanf */
#endif
double genann_act_sigmoid(const genann *ann unused, double a) {
if (a < -45.0) return 0;
if (a > 45.0) return 1;
return 1.0 / (1 + exp(-a));
}
void genann_init_sigmoid_lookup(const genann *ann) {
const double f = (sigmoid_dom_max - sigmoid_dom_min) / LOOKUP_SIZE;
int i;
interval = LOOKUP_SIZE / (sigmoid_dom_max - sigmoid_dom_min);
for (i = 0; i < LOOKUP_SIZE; ++i) {
lookup[i] = genann_act_sigmoid(ann, sigmoid_dom_min + f * i);
}
}
double genann_act_sigmoid_cached(const genann *ann unused, double a) {
assert(!isnan(a));
if (a < sigmoid_dom_min) return lookup[0];
if (a >= sigmoid_dom_max) return lookup[LOOKUP_SIZE - 1];
size_t j = (size_t)((a-sigmoid_dom_min)*interval+0.5);
/* Because floating point... */
if (unlikely(j >= LOOKUP_SIZE)) return lookup[LOOKUP_SIZE - 1];
return lookup[j];
}
double genann_act_linear(const struct genann *ann unused, double a) {
return a;
}
double genann_act_threshold(const struct genann *ann unused, double a) {
return a > 0;
}
genann *genann_init(int inputs, int hidden_layers, int hidden, int outputs) {
if (hidden_layers < 0) return 0;
if (inputs < 1) return 0;
if (outputs < 1) return 0;
if (hidden_layers > 0 && hidden < 1) return 0;
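/* Weight counting: each neuron carries one weight per incoming connection
 * plus a bias, which is where the +1 terms below come from. */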
const int hidden_weights = hidden_layers ? (inputs+1) * hidden + (hidden_layers-1) * (hidden+1) * hidden : 0;
const int output_weights = (hidden_layers ? (hidden+1) : (inputs+1)) * outputs;
const int total_weights = (hidden_weights + output_weights);
const int total_neurons = (inputs + hidden * hidden_layers + outputs);
/* Allocate extra size for weights, outputs, and deltas. */
const int size = sizeof(genann) + sizeof(double) * (total_weights + total_neurons + (total_neurons - inputs));
genann *ret = malloc(size);
if (!ret) return 0;
ret->inputs = inputs;
ret->hidden_layers = hidden_layers;
ret->hidden = hidden;
ret->outputs = outputs;
ret->total_weights = total_weights;
ret->total_neurons = total_neurons;
/* Set pointers. */
ret->weight = (double*)((char*)ret + sizeof(genann));
ret->output = ret->weight + ret->total_weights;
ret->delta = ret->output + ret->total_neurons;
genann_randomize(ret);
ret->activation_hidden = genann_act_sigmoid_cached;
ret->activation_output = genann_act_sigmoid_cached;
genann_init_sigmoid_lookup(ret);
return ret;
}
genann *genann_read(FILE *in) {
int inputs, hidden_layers, hidden, outputs;
int rc;
errno = 0;
rc = fscanf(in, "%d %d %d %d", &inputs, &hidden_layers, &hidden, &outputs);
if (rc < 4 || errno != 0) {
perror("fscanf");
return NULL;
}
genann *ann = genann_init(inputs, hidden_layers, hidden, outputs);
if (!ann) return NULL;
int i;
for (i = 0; i < ann->total_weights; ++i) {
errno = 0;
rc = fscanf(in, " %le", ann->weight + i);
if (rc < 1 || errno != 0) {
perror("fscanf");
genann_free(ann);
return NULL;
}
}
return ann;
}
genann *genann_copy(genann const *ann) {
const int size = sizeof(genann) + sizeof(double) * (ann->total_weights + ann->total_neurons + (ann->total_neurons - ann->inputs));
genann *ret = malloc(size);
if (!ret) return 0;
memcpy(ret, ann, size);
/* Set pointers. */
ret->weight = (double*)((char*)ret + sizeof(genann));
ret->output = ret->weight + ret->total_weights;
ret->delta = ret->output + ret->total_neurons;
return ret;
}
void genann_randomize(genann *ann) {
int i;
for (i = 0; i < ann->total_weights; ++i) {
double r = GENANN_RANDOM();
/* Sets weights from -0.5 to 0.5. */
ann->weight[i] = r - 0.5;
}
}
void genann_free(genann *ann) {
/* The weight, output, and delta pointers go to the same buffer. */
free(ann);
}
double const *genann_run(genann const *ann, double const *inputs) {
double const *w = ann->weight;
double *o = ann->output + ann->inputs;
double const *i = ann->output;
/* Copy the inputs to the scratch area, where we also store each neuron's
* output, for consistency. This way the first layer isn't a special case. */
memcpy(ann->output, inputs, sizeof(double) * ann->inputs);
int h, j, k;
if (!ann->hidden_layers) {
double *ret = o;
for (j = 0; j < ann->outputs; ++j) {
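/* The first weight of each neuron is its bias, applied to a constant -1 input. */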
double sum = *w++ * -1.0;
for (k = 0; k < ann->inputs; ++k) {
sum += *w++ * i[k];
}
*o++ = genann_act_output(ann, sum);
}
return ret;
}
/* Figure input layer */
for (j = 0; j < ann->hidden; ++j) {
double sum = *w++ * -1.0;
for (k = 0; k < ann->inputs; ++k) {
sum += *w++ * i[k];
}
*o++ = genann_act_hidden(ann, sum);
}
i += ann->inputs;
/* Figure hidden layers, if any. */
for (h = 1; h < ann->hidden_layers; ++h) {
for (j = 0; j < ann->hidden; ++j) {
double sum = *w++ * -1.0;
for (k = 0; k < ann->hidden; ++k) {
sum += *w++ * i[k];
}
*o++ = genann_act_hidden(ann, sum);
}
i += ann->hidden;
}
double const *ret = o;
/* Figure output layer. */
for (j = 0; j < ann->outputs; ++j) {
double sum = *w++ * -1.0;
for (k = 0; k < ann->hidden; ++k) {
sum += *w++ * i[k];
}
*o++ = genann_act_output(ann, sum);
}
/* Sanity check that we used all weights and wrote all outputs. */
assert(w - ann->weight == ann->total_weights);
assert(o - ann->output == ann->total_neurons);
return ret;
}
void genann_train(genann const *ann, double const *inputs, double const *desired_outputs, double learning_rate) {
/* To begin with, we must run the network forward. */
genann_run(ann, inputs);
int h, j, k;
/* First set the output layer deltas. */
{
double const *o = ann->output + ann->inputs + ann->hidden * ann->hidden_layers; /* First output. */
double *d = ann->delta + ann->hidden * ann->hidden_layers; /* First delta. */
double const *t = desired_outputs; /* First desired output. */
/* Set output layer deltas. */
if (genann_act_output == genann_act_linear ||
ann->activation_output == genann_act_linear) {
for (j = 0; j < ann->outputs; ++j) {
*d++ = *t++ - *o++;
}
} else {
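/* This delta assumes the sigmoid activation, whose derivative is o * (1 - o). */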
for (j = 0; j < ann->outputs; ++j) {
*d++ = (*t - *o) * *o * (1.0 - *o);
++o; ++t;
}
}
}
/* Set hidden layer deltas, start on last layer and work backwards. */
/* Note that loop is skipped in the case of hidden_layers == 0. */
for (h = ann->hidden_layers - 1; h >= 0; --h) {
/* Find first output and delta in this layer. */
double const *o = ann->output + ann->inputs + (h * ann->hidden);
double *d = ann->delta + (h * ann->hidden);
/* Find first delta in following layer (which may be hidden or output). */
double const * const dd = ann->delta + ((h+1) * ann->hidden);
/* Find first weight in following layer (which may be hidden or output). */
double const * const ww = ann->weight + ((ann->inputs+1) * ann->hidden) + ((ann->hidden+1) * ann->hidden * (h));
for (j = 0; j < ann->hidden; ++j) {
double delta = 0;
for (k = 0; k < (h == ann->hidden_layers-1 ? ann->outputs : ann->hidden); ++k) {
const double forward_delta = dd[k];
const int windex = k * (ann->hidden + 1) + (j + 1);
const double forward_weight = ww[windex];
delta += forward_delta * forward_weight;
}
*d = *o * (1.0-*o) * delta;
++d; ++o;
}
}
/* Train the outputs. */
{
/* Find first output delta. */
double const *d = ann->delta + ann->hidden * ann->hidden_layers; /* First output delta. */
/* Find first weight to first output delta. */
double *w = ann->weight + (ann->hidden_layers
? ((ann->inputs+1) * ann->hidden + (ann->hidden+1) * ann->hidden * (ann->hidden_layers-1))
: (0));
/* Find first output in previous layer. */
double const * const i = ann->output + (ann->hidden_layers
? (ann->inputs + (ann->hidden) * (ann->hidden_layers-1))
: 0);
/* Set output layer weights. */
for (j = 0; j < ann->outputs; ++j) {
*w++ += *d * learning_rate * -1.0;
for (k = 1; k < (ann->hidden_layers ? ann->hidden : ann->inputs) + 1; ++k) {
*w++ += *d * learning_rate * i[k-1];
}
++d;
}
assert(w - ann->weight == ann->total_weights);
}
/* Train the hidden layers. */
for (h = ann->hidden_layers - 1; h >= 0; --h) {
/* Find first delta in this layer. */
double const *d = ann->delta + (h * ann->hidden);
/* Find first input to this layer. */
double const *i = ann->output + (h
? (ann->inputs + ann->hidden * (h-1))
: 0);
/* Find first weight to this layer. */
double *w = ann->weight + (h
? ((ann->inputs+1) * ann->hidden + (ann->hidden+1) * (ann->hidden) * (h-1))
: 0);
for (j = 0; j < ann->hidden; ++j) {
*w++ += *d * learning_rate * -1.0;
for (k = 1; k < (h == 0 ? ann->inputs : ann->hidden) + 1; ++k) {
*w++ += *d * learning_rate * i[k-1];
}
++d;
}
}
}
void genann_write(genann const *ann, FILE *out) {
fprintf(out, "%d %d %d %d", ann->inputs, ann->hidden_layers, ann->hidden, ann->outputs);
int i;
for (i = 0; i < ann->total_weights; ++i) {
fprintf(out, " %.20e", ann->weight[i]);
}
}

108
genann.h Normal file

@@ -0,0 +1,108 @@
/*
* GENANN - Minimal C Artificial Neural Network
*
* Copyright (c) 2015-2018 Lewis Van Winkle
*
* http://CodePlea.com
*
* This software is provided 'as-is', without any express or implied
* warranty. In no event will the authors be held liable for any damages
* arising from the use of this software.
*
* Permission is granted to anyone to use this software for any purpose,
* including commercial applications, and to alter it and redistribute it
* freely, subject to the following restrictions:
*
* 1. The origin of this software must not be misrepresented; you must not
* claim that you wrote the original software. If you use this software
* in a product, an acknowledgement in the product documentation would be
* appreciated but is not required.
* 2. Altered source versions must be plainly marked as such, and must not be
* misrepresented as being the original software.
* 3. This notice may not be removed or altered from any source distribution.
*
*/
#ifndef GENANN_H
#define GENANN_H
#include <stdio.h>
#ifdef __cplusplus
extern "C" {
#endif
#ifndef GENANN_RANDOM
/* We use the following for uniform random numbers between 0 and 1.
* If you have a better function, redefine this macro. */
#define GENANN_RANDOM() (((double)rand())/RAND_MAX)
#endif
struct genann;
typedef double (*genann_actfun)(const struct genann *ann, double a);
typedef struct genann {
/* How many inputs, outputs, and hidden neurons. */
int inputs, hidden_layers, hidden, outputs;
/* Which activation function to use for hidden neurons. Default: genann_act_sigmoid_cached. */
genann_actfun activation_hidden;
/* Which activation function to use for output. Default: genann_act_sigmoid_cached. */
genann_actfun activation_output;
/* Total number of weights, and size of weights buffer. */
int total_weights;
/* Total number of neurons + inputs and size of output buffer. */
int total_neurons;
/* All weights (total_weights long). */
double *weight;
/* Stores input array and output of each neuron (total_neurons long). */
double *output;
/* Stores delta of each hidden and output neuron (total_neurons - inputs long). */
double *delta;
} genann;
/* Creates and returns a new ann. */
genann *genann_init(int inputs, int hidden_layers, int hidden, int outputs);
/* Creates ANN from file saved with genann_write. */
genann *genann_read(FILE *in);
/* Sets weights randomly. Called by init. */
void genann_randomize(genann *ann);
/* Returns a new copy of ann. */
genann *genann_copy(genann const *ann);
/* Frees the memory used by an ann. */
void genann_free(genann *ann);
/* Runs the feedforward algorithm to calculate the ann's output. */
double const *genann_run(genann const *ann, double const *inputs);
/* Does a single backprop update. */
void genann_train(genann const *ann, double const *inputs, double const *desired_outputs, double learning_rate);
/* Saves the ann. */
void genann_write(genann const *ann, FILE *out);
void genann_init_sigmoid_lookup(const genann *ann);
double genann_act_sigmoid(const genann *ann, double a);
double genann_act_sigmoid_cached(const genann *ann, double a);
double genann_act_threshold(const genann *ann, double a);
double genann_act_linear(const genann *ann, double a);
#ifdef __cplusplus
}
#endif
#endif /*GENANN_H*/

127
minctest.h Normal file

@@ -0,0 +1,127 @@
/*
*
* MINCTEST - Minimal C Test Library - 0.1
*
* Copyright (c) 2014, 2015, 2016 Lewis Van Winkle
*
* http://CodePlea.com
*
* This software is provided 'as-is', without any express or implied
* warranty. In no event will the authors be held liable for any damages
* arising from the use of this software.
*
* Permission is granted to anyone to use this software for any purpose,
* including commercial applications, and to alter it and redistribute it
* freely, subject to the following restrictions:
*
* 1. The origin of this software must not be misrepresented; you must not
* claim that you wrote the original software. If you use this software
* in a product, an acknowledgement in the product documentation would be
* appreciated but is not required.
* 2. Altered source versions must be plainly marked as such, and must not be
* misrepresented as being the original software.
* 3. This notice may not be removed or altered from any source distribution.
*
*/
/*
* MINCTEST - Minimal testing library for C
*
*
* Example:
*
* void test1() {
* lok('a' == 'a');
* }
*
* void test2() {
* lequal(5, 6);
* lfequal(5.5, 5.6);
* }
*
* int main() {
* lrun("test1", test1);
* lrun("test2", test2);
* lresults();
* return lfails != 0;
* }
*
*
*
* Hints:
* All functions/variables start with the letter 'l'.
*
*/
#ifndef __MINCTEST_H__
#define __MINCTEST_H__
#include <stdio.h>
#include <math.h>
#include <time.h>
/* How far apart can floats be before we consider them unequal. */
#define LTEST_FLOAT_TOLERANCE 0.001
/* Track the number of passes, fails. */
/* NB this is made for all tests to be in one file. */
static int ltests = 0;
static int lfails = 0;
/* Display the test results. */
#define lresults() do {\
if (lfails == 0) {\
printf("ALL TESTS PASSED (%d/%d)\n", ltests, ltests);\
} else {\
printf("SOME TESTS FAILED (%d/%d)\n", ltests-lfails, ltests);\
}\
} while (0)
/* Run a test. Name can be any string to print out, test is the function name to call. */
#define lrun(name, test) do {\
const int ts = ltests;\
const int fs = lfails;\
const clock_t start = clock();\
printf("\t%-14s", name);\
test();\
printf("pass:%2d fail:%2d %4dms\n",\
(ltests-ts)-(lfails-fs), lfails-fs,\
(int)((clock() - start) * 1000 / CLOCKS_PER_SEC));\
} while (0)
/* Assert a true statement. */
#define lok(test) do {\
++ltests;\
if (!(test)) {\
++lfails;\
printf("%s:%d error \n", __FILE__, __LINE__);\
}} while (0)
/* Assert two integers are equal. */
#define lequal(a, b) do {\
++ltests;\
if ((a) != (b)) {\
++lfails;\
printf("%s:%d (%d != %d)\n", __FILE__, __LINE__, (a), (b));\
}} while (0)
/* Assert two floats are equal (Within LTEST_FLOAT_TOLERANCE). */
#define lfequal(a, b) do {\
++ltests;\
if (fabs((double)(a)-(double)(b)) > LTEST_FLOAT_TOLERANCE) {\
++lfails;\
printf("%s:%d (%f != %f)\n", __FILE__, __LINE__, (double)(a), (double)(b));\
}} while (0)
#endif /*__MINCTEST_H__*/

76
my_test.c Normal file

@@ -0,0 +1,76 @@
/** file header (koroFileHeader template)
 * @Author: 1343619937@qq.com
 * @Date: 2024-04-25 15:21:22
 * @LastEditors: 1343619937@qq.com
 * @LastEditTime: 2024-04-25 16:06:42
 * @FilePath: \genann-master\my_test.c
 * @Description: Default koroFileHeader template. Set `customMade` to customize it; see the configuration guide: https://github.com/OBKoro1/koro1FileHeader/wiki/%E9%85%8D%E7%BD%AE
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include "genann.h"
/*
 * 3 positive samples, 3 negative samples.
 */
const double input[][300]={
{2304,2288,2305,2337,2361,2354,2302,2220,2146,2095,2067,2068,2097,2141,2161,2113,1986,1821,1696,1626,1587,1570,1571,1547,1462,1358,1296,1257,1183,1093,1045,1024,1008,1019,1027,949,775,609,508,467,479,525,587,641,652,585,504,490,530,553,573,650,737,765,772,790,742,622,546,555,594,672,774,823,786,731,701,693,703,729,769,842,947,1035,1064,1042,1005,976,966,985,1045,1148,1266,1339,1332,1282,1242,1243,1274,1320,1373,1460,1577,1664,1677,1634,1576,1536,1514,1518,1570,1665,1765,1812,1795,1726,1661,1635,1631,1630,1643,1693,1765,1806,1782,1692,1583,1488,1415,1379,1385,1442,1508,1530,1489,1406,1327,1290,1283,1280,1293,1347,1429,1489,1488,1416,1312,1219,1160,1135,1158,1230,1324,1387,1379,1317,1249,1216,1214,1232,1269,1348,1458,1551,1574,1520,1422,1342,1296,1285,1324,1416,1526,1586,1576,1505,1435,1409,1416,1435,1474,1547,1643,1706,1695,1611,1510,1437,1398,1395,1429,1504,1579,1601,1552,1464,1389,1363,1369,1375,1392,1455,1550,1623,1626,1550,1439,1336,1264,1234,1262,1350,1468,1559,1574,1536,1502,1519,1583,1674,1773,1876,1970,2052,2102,2123,2148,2198,2267,2359,2467,2552,2605,2660,2701,2706,2750,2854,2920,2932,3000,3103,3127,3089,3116,3199,3235,3240,3271,3316,3356,3383,3374,3336,3300,3281,3285,3318,3328,3290,3269,3277,3236,3157,3129,3156,3171,3160,3136,3104,3080,3062,3028,2995,},
{2317,2296,2305,2343,2379,2390,2361,2286,2196,2123,2084,2082,2114,2162,2191,2169,2063,1909,1775,1695,1657,1629,1613,1608,1571,1470,1360,1293,1248,1172,1096,1067,1066,1061,1063,1037,917,727,564,494,492,538,597,648,684,673,595,518,522,573,600,635,732,823,834,821,815,741,621,575,616,676,767,875,905,848,779,745,737,753,795,850,932,1038,1103,1097,1034,973,941,943,984,1064,1177,1281,1325,1290,1216,1169,1175,1218,1275,1348,1455,1570,1636,1627,1564,1495,1455,1456,1494,1572,1676,1767,1800,1764,1687,1626,1610,1618,1631,1661,1726,1799,1825,1772,1661,1542,1461,1419,1413,1442,1504,1563,1565,1507,1417,1339,1312,1316,1326,1349,1412,1492,1530,1499,1397,1282,1204,1171,1182,1233,1321,1411,1455,1428,1351,1289,1275,1297,1333,1390,1490,1605,1678,1667,1579,1470,1396,1379,1401,1466,1570,1670,1711,1669,1580,1512,1499,1519,1552,1607,1691,1780,1822,1777,1666,1552,1491,1477,1500,1553,1631,1697,1694,1622,1519,1440,1423,1440,1462,1501,1583,1686,1735,1694,1571,1440,1345,1298,1306,1367,1477,1596,1666,1656,1597,1561,1585,1662,1759,1867,1977,2069,2141,2185,2201,2225,2280,2356,2455,2575,2657,2698,2744,2783,2790,2835,2943,3010,3031,3106,3199,3208,3163,3188,3255,3278,3274,3290,3322,3351,3360,3330,3271,3222,3193,3194,3223,3222,3175,3157,3165,3115,3037,3023,3061,3075,3063,3041,3014,2995,2982,2952,2928,},
{2323,2303,2309,2339,2378,2397,2375,2310,2216,2141,2110,2126,2174,2227,2251,2224,2138,1998,1857,1778,1752,1733,1702,1664,1615,1530,1410,1308,1265,1231,1164,1120,1124,1123,1089,1036,931,757,586,508,530,588,647,681,681,653,590,507,475,531,608,658,730,829,870,813,745,688,596,522,564,670,768,857,909,860,754,682,674,706,764,838,914,999,1061,1047,965,875,843,876,947,1045,1150,1246,1298,1280,1213,1152,1160,1230,1321,1401,1479,1572,1634,1621,1549,1470,1445,1486,1571,1661,1744,1814,1838,1807,1744,1694,1695,1743,1793,1832,1877,1923,1936,1881,1768,1652,1587,1585,1613,1644,1666,1678,1662,1600,1520,1459,1446,1472,1497,1509,1532,1574,1596,1560,1470,1362,1294,1295,1331,1373,1406,1430,1434,1400,1334,1275,1271,1313,1364,1411,1467,1541,1590,1574,1491,1380,1314,1323,1387,1461,1533,1601,1626,1590,1513,1447,1439,1488,1546,1601,1673,1754,1802,1773,1662,1531,1452,1453,1508,1581,1651,1703,1710,1649,1550,1468,1452,1496,1555,1606,1658,1715,1737,1682,1569,1451,1392,1409,1461,1524,1578,1619,1638,1614,1563,1536,1588,1700,1819,1909,1976,2025,2067,2119,2170,2217,2278,2353,2433,2521,2612,2660,2681,2728,2768,2791,2870,2989,3037,3050,3119,3179,3155,3129,3197,3279,3302,3303,3319,3335,3343,3338,3305,3256,3220,3205,3222,3248,3232,3185,3175,3169,3108,3046,3056,3093,3100,3083,3055,3026,3009,2990,2960,2952,},
{1861,1879,1921,1981,2054,2120,2154,2135,2060,1963,1906,1916,1991,2086,2156,2185,2188,2168,2129,2081,2035,1981,1891,1755,1609,1498,1419,1335,1252,1205,1170,1108,1045,1015,979,862,706,591,525,488,497,532,516,434,356,336,340,340,330,318,324,350,363,370,417,481,501,506,560,616,607,595,643,681,659,667,727,758,772,816,849,858,895,957,977,929,828,719,673,731,826,889,913,924,941,954,958,967,1002,1060,1122,1189,1259,1330,1383,1378,1317,1255,1260,1341,1436,1499,1523,1533,1550,1574,1596,1617,1652,1701,1745,1784,1827,1864,1867,1801,1668,1531,1477,1509,1572,1612,1605,1565,1525,1483,1436,1398,1387,1408,1433,1451,1464,1475,1471,1426,1335,1245,1207,1233,1274,1291,1270,1236,1224,1233,1240,1243,1255,1278,1307,1352,1407,1462,1492,1457,1365,1269,1236,1288,1376,1442,1458,1450,1444,1438,1422,1408,1423,1470,1526,1578,1626,1665,1678,1628,1516,1398,1354,1394,1473,1526,1525,1497,1475,1454,1429,1412,1418,1450,1484,1508,1524,1539,1539,1495,1407,1322,1293,1336,1393,1416,1394,1363,1355,1357,1352,1353,1381,1434,1486,1517,1534,1569,1648,1749,1831,1898,1956,2009,2076,2150,2204,2250,2311,2383,2471,2572,2642,2682,2739,2807,2844,2901,3012,3080,3073,3097,3161,3166,3121,3136,3194,3202,3168,3134,3100,3076,3061,3031,2985,2943,2909,2891,2903,2897,2841,2805,2817,2794,2722,2685,2702,2707,2692,2678,2660,2646,2642,2627,2611,},
{1817,1797,1827,1894,1993,2101,2179,2199,2139,2026,1922,1873,1900,1980,2075,2155,2222,2274,2297,2276,2209,2121,2027,1926,1801,1659,1537,1460,1403,1337,1290,1288,1287,1227,1112,976,830,677,558,541,586,611,616,603,529,407,313,290,300,312,319,337,388,444,452,412,400,427,439,457,542,627,619,595,635,674,671,703,774,806,821,861,879,852,831,843,855,854,848,832,832,860,876,862,850,874,926,974,989,982,1013,1098,1195,1274,1307,1291,1244,1197,1169,1192,1284,1407,1493,1519,1509,1507,1534,1567,1591,1616,1664,1741,1818,1853,1835,1773,1695,1622,1570,1559,1598,1653,1659,1610,1540,1494,1491,1500,1486,1458,1452,1488,1538,1570,1548,1475,1380,1280,1207,1185,1226,1299,1346,1335,1287,1249,1241,1246,1250,1257,1289,1348,1412,1450,1437,1387,1339,1307,1285,1296,1356,1424,1450,1428,1389,1372,1402,1445,1465,1466,1486,1547,1623,1664,1651,1590,1516,1448,1402,1399,1448,1522,1558,1537,1482,1436,1433,1445,1437,1424,1448,1516,1586,1618,1584,1500,1404,1318,1269,1273,1333,1417,1460,1438,1373,1327,1325,1347,1365,1378,1405,1448,1479,1492,1514,1572,1678,1783,1852,1897,1931,1966,2028,2113,2182,2246,2330,2413,2491,2576,2635,2668,2723,2787,2826,2887,2993,3060,3052,3073,3141,3157,3115,3127,3184,3186,3140,3104,3079,3065,3064,3048,3007,2966,2927,2907,2913,2899,2844,2818,2837,2818,2751,2716,2730,2736,2722,2703,2682,2665,2658,2641,2623,},
{1822,1808,1846,1923,2033,2137,2203,2202,2125,2003,1893,1839,1841,1865,1873,1863,1855,1842,1794,1714,1632,1565,1477,1348,1230,1146,1064,967,891,881,914,926,917,903,826,668,505,396,318,284,329,422,496,510,472,420,398,396,381,378,437,527,569,586,625,627,554,519,585,649,661,681,701,678,659,690,725,741,770,814,837,826,789,733,693,693,695,681,681,713,763,800,812,817,859,937,1006,1040,1060,1095,1162,1257,1357,1448,1515,1542,1511,1458,1438,1491,1596,1690,1743,1756,1763,1779,1784,1776,1771,1792,1837,1881,1903,1891,1853,1783,1684,1588,1532,1541,1591,1618,1581,1504,1435,1394,1363,1329,1291,1270,1282,1307,1323,1318,1283,1216,1115,1012,955,972,1054,1134,1165,1155,1147,1157,1173,1189,1209,1250,1324,1401,1449,1469,1476,1464,1424,1385,1378,1424,1505,1564,1567,1538,1526,1552,1584,1597,1597,1614,1661,1720,1765,1776,1757,1698,1600,1490,1412,1407,1459,1505,1493,1441,1389,1352,1318,1277,1241,1243,1299,1374,1424,1439,1417,1356,1258,1157,1116,1162,1271,1361,1390,1369,1354,1369,1397,1435,1497,1590,1708,1802,1852,1892,1948,2030,2151,2279,2370,2425,2465,2498,2548,2622,2687,2747,2835,2911,2937,2978,3058,3084,3049,3070,3134,3125,3061,3056,3092,3076,3032,3008,2996,2987,2972,2927,2864,2808,2764,2748,2770,2769,2723,2707,2725,2692,2619,2594,2625,2646,2647,2647,2645,2645,2651,2650,2658,},
//{},
};
// Extra sample found elsewhere; the "not picked up" case.
const double test[300]={
1949,1920,1926,1984,2069,2140,2169,2146,2081,2008,1970,1973,2007,2042,2042,2009,1966,1926,1887,1835,1760,1681,1616,1543,1435,1313,1219,1141,1046,962,944,990,1007,951,860,743,602,492,457,456,437,423,442,453,429,394,368,372,402,418,407,431,490,504,465,466,512,529,539,609,679,663,625,623,614,610,662,713,707,691,726,789,841,850,796,712,649,614,591,591,631,703,771,802,813,835,885,940,979,1011,1075,1187,1308,1380,1389,1362,1352,1368,1400,1452,1530,1621,1680,1692,1668,1658,1687,1726,1746,1761,1801,1879,1949,1952,1872,1758,1654,1578,1538,1529,1550,1588,1596,1548,1467,1398,1370,1364,1353,1341,1349,1386,1418,1394,1310,1216,1160,1139,1139,1155,1181,1209,1213,1177,1123,1095,1117,1162,1196,1209,1241,1319,1405,1436,1401,1330,1276,1251,1245,1263,1311,1382,1433,1441,1413,1392,1400,1415,1413,1401,1419,1496,1575,1586,1510,1401,1309,1250,1225,1226,1266,1333,1375,1357,1303,1259,1258,1284,1310,1333,1379,1459,1528,1526,1451,1358,1297,1270,1266,1284,1322,1372,1398,1375,1327,1307,1349,1426,1497,1542,1587,1651,1722,1777,1828,1893,1973,2072,2167,2220,2246,2271,2297,2334,2408,2491,2550,2610,2677,2715,2748,2833,2907,2910,2927,3004,3041,2994,2970,3014,3042,3019,2991,2974,2959,2943,2907,2850,2794,2756,2739,2752,2767,2740,2710,2722,2712,2644,2596,2611,2628,2618,2599,2577,2559,2552,2537,2521,
};
// Source data: a slightly trimmed copy of input[0].
const double test2[300]= {1821,1696,1626,1587,1570,1571,1547,1462,1358,1296,1257,1183,1093,1045,1024,1008,1019,1027,949,775,609,508,467,479,525,587,641,652,585,504,490,530,553,573,650,737,765,772,790,742,622,546,555,594,672,774,823,786,731,701,693,703,729,769,842,947,1035,1064,1042,1005,976,966,985,1045,1148,1266,1339,1332,1282,1242,1243,1274,1320,1373,1460,1577,1664,1677,1634,1576,1536,1514,1518,1570,1665,1765,1812,1795,1726,1661,1635,1631,1630,1643,1693,1765,1806,1782,1692,1583,1488,1415,1379,1385,1442,1508,1530,1489,1406,1327,1290,1283,1280,1293,1347,1429,1489,1488,1416,1312,1219,1160,1135,1158,1230,1324,1387,1379,1317,1249,1216,1214,1232,1269,1348,1458,1551,1574,1520,1422,1342,1296,1285,1324,1416,1526,1586,1576,1505,1435,1409,1416,1435,1474,1547,1643,1706,1695,1611,1510,1437,1398,1395,1429,1504,1579,1601,1552,1464,1389,1363,1369,1375,1392,1455,1550,1623,1626,1550,1439,1336,1264,1234,1262,1350,1468,1559,1574,1536,1502,1519,1583,1674,1773,1876,1970,2052,2102,2123,2148,2198,2267,2359,2467,2552,2605,2660,2701,2706,2750,2854,2920,2932,3000,3103,3127,3089,3116,3199,3235,3240,3271,3316,3356,3383,3374,3336,3300,3281,3285,3318,3328,3290,3269,3277,3236,3157,3129,3156,3171,3160,3136,3104,3080,3062,3028,2995,
};
// Target outputs.
const double output[]={
1,1,1,
0,0,0,
};
int main(int argc, char *argv[])
{
printf("GENANN example 1.\n");
printf("Train a small ANN to the XOR function using backpropagation.\n");
/* This will make the neural network initialize differently each run. */
/* If you don't get a good result, try again for a different result. */
// srand(time(0));
/* Input and expected out data for the XOR function. */
// const double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
// const double output[4] = {0, 1, 1, 0};
int i;
/* New network with 300 inputs,
 * 1 hidden layer of 20 neurons,
 * and 1 output. */
genann *ann = genann_init(300, 1, 20, 1);
/* Train on the six labeled data points many times. */
for (i = 0; i < 500; ++i) {
genann_train(ann, input[0], output + 0, 3);
genann_train(ann, input[1], output + 1, 3);
genann_train(ann, input[2], output + 2, 3);
genann_train(ann, input[3], output + 3, 3);
genann_train(ann, input[4], output + 4, 3);
genann_train(ann, input[5], output + 5, 3);
}
/* Run the network and see what it predicts. */
printf("%f\n",*genann_run(ann, input[0]));
printf("%f\n",*genann_run(ann, input[4]));
printf("%f\n",*genann_run(ann, test));
printf("%f\n",*genann_run(ann, test2));
genann_free(ann);
return 0;
}

BIN
mytest Normal file

Binary file not shown.

276
test.c Normal file

@@ -0,0 +1,276 @@
/*
* GENANN - Minimal C Artificial Neural Network
*
* Copyright (c) 2015-2018 Lewis Van Winkle
*
* http://CodePlea.com
*
* This software is provided 'as-is', without any express or implied
* warranty. In no event will the authors be held liable for any damages
* arising from the use of this software.
*
* Permission is granted to anyone to use this software for any purpose,
* including commercial applications, and to alter it and redistribute it
* freely, subject to the following restrictions:
*
* 1. The origin of this software must not be misrepresented; you must not
* claim that you wrote the original software. If you use this software
* in a product, an acknowledgement in the product documentation would be
* appreciated but is not required.
* 2. Altered source versions must be plainly marked as such, and must not be
* misrepresented as being the original software.
* 3. This notice may not be removed or altered from any source distribution.
*
*/
#include "genann.h"
#include "minctest.h"
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
void basic() {
genann *ann = genann_init(1, 0, 0, 1);
lequal(ann->total_weights, 2);
double a;
a = 0;
ann->weight[0] = 0;
ann->weight[1] = 0;
lfequal(0.5, *genann_run(ann, &a));
a = 1;
lfequal(0.5, *genann_run(ann, &a));
a = 11;
lfequal(0.5, *genann_run(ann, &a));
a = 1;
ann->weight[0] = 1;
ann->weight[1] = 1;
lfequal(0.5, *genann_run(ann, &a));
a = 10;
ann->weight[0] = 1;
ann->weight[1] = 1;
lfequal(1.0, *genann_run(ann, &a));
a = -10;
lfequal(0.0, *genann_run(ann, &a));
genann_free(ann);
}
void xor() {
genann *ann = genann_init(2, 1, 2, 1);
ann->activation_hidden = genann_act_threshold;
ann->activation_output = genann_act_threshold;
lequal(ann->total_weights, 9);
/* First hidden. */
ann->weight[0] = .5;
ann->weight[1] = 1;
ann->weight[2] = 1;
/* Second hidden. */
ann->weight[3] = 1;
ann->weight[4] = 1;
ann->weight[5] = 1;
/* Output. */
ann->weight[6] = .5;
ann->weight[7] = 1;
ann->weight[8] = -1;
double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
double output[4] = {0, 1, 1, 0};
lfequal(output[0], *genann_run(ann, input[0]));
lfequal(output[1], *genann_run(ann, input[1]));
lfequal(output[2], *genann_run(ann, input[2]));
lfequal(output[3], *genann_run(ann, input[3]));
genann_free(ann);
}
void backprop() {
genann *ann = genann_init(1, 0, 0, 1);
double input, output;
input = .5;
output = 1;
double first_try = *genann_run(ann, &input);
genann_train(ann, &input, &output, .5);
double second_try = *genann_run(ann, &input);
lok(fabs(first_try - output) > fabs(second_try - output));
genann_free(ann);
}
void train_and() {
double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
double output[4] = {0, 0, 0, 1};
genann *ann = genann_init(2, 0, 0, 1);
int i, j;
for (i = 0; i < 50; ++i) {
for (j = 0; j < 4; ++j) {
genann_train(ann, input[j], output + j, .8);
}
}
ann->activation_output = genann_act_threshold;
lfequal(output[0], *genann_run(ann, input[0]));
lfequal(output[1], *genann_run(ann, input[1]));
lfequal(output[2], *genann_run(ann, input[2]));
lfequal(output[3], *genann_run(ann, input[3]));
genann_free(ann);
}
void train_or() {
double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
double output[4] = {0, 1, 1, 1};
genann *ann = genann_init(2, 0, 0, 1);
genann_randomize(ann);
int i, j;
for (i = 0; i < 50; ++i) {
for (j = 0; j < 4; ++j) {
genann_train(ann, input[j], output + j, .8);
}
}
ann->activation_output = genann_act_threshold;
lfequal(output[0], *genann_run(ann, input[0]));
lfequal(output[1], *genann_run(ann, input[1]));
lfequal(output[2], *genann_run(ann, input[2]));
lfequal(output[3], *genann_run(ann, input[3]));
genann_free(ann);
}
void train_xor() {
double input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
double output[4] = {0, 1, 1, 0};
genann *ann = genann_init(2, 1, 2, 1);
int i, j;
for (i = 0; i < 500; ++i) {
for (j = 0; j < 4; ++j) {
genann_train(ann, input[j], output + j, 3);
}
/* printf("%1.2f ", xor_score(ann)); */
}
ann->activation_output = genann_act_threshold;
lfequal(output[0], *genann_run(ann, input[0]));
lfequal(output[1], *genann_run(ann, input[1]));
lfequal(output[2], *genann_run(ann, input[2]));
lfequal(output[3], *genann_run(ann, input[3]));
genann_free(ann);
}
void persist() {
genann *first = genann_init(1000, 5, 50, 10);
FILE *out = fopen("persist.txt", "w");
genann_write(first, out);
fclose(out);
FILE *in = fopen("persist.txt", "r");
genann *second = genann_read(in);
fclose(in);
lequal(first->inputs, second->inputs);
lequal(first->hidden_layers, second->hidden_layers);
lequal(first->hidden, second->hidden);
lequal(first->outputs, second->outputs);
lequal(first->total_weights, second->total_weights);
int i;
for (i = 0; i < first->total_weights; ++i) {
lok(first->weight[i] == second->weight[i]);
}
genann_free(first);
genann_free(second);
}
void copy() {
genann *first = genann_init(1000, 5, 50, 10);
genann *second = genann_copy(first);
lequal(first->inputs, second->inputs);
lequal(first->hidden_layers, second->hidden_layers);
lequal(first->hidden, second->hidden);
lequal(first->outputs, second->outputs);
lequal(first->total_weights, second->total_weights);
int i;
for (i = 0; i < first->total_weights; ++i) {
lfequal(first->weight[i], second->weight[i]);
}
genann_free(first);
genann_free(second);
}
void sigmoid() {
double i = -20;
const double max = 20;
const double d = .0001;
while (i < max) {
lfequal(genann_act_sigmoid(NULL, i), genann_act_sigmoid_cached(NULL, i));
i += d;
}
}
int main(int argc, char *argv[])
{
printf("GENANN TEST SUITE\n");
srand(100); //Repeatable test results.
lrun("basic", basic);
lrun("xor", xor);
lrun("backprop", backprop);
lrun("train and", train_and);
lrun("train or", train_or);
lrun("train xor", train_xor);
lrun("persist", persist);
lrun("copy", copy);
lrun("sigmoid", sigmoid);
lresults();
return lfails != 0;
}