
AI with Java Netbeans 16, Ant


Dietmar


The point is: they (MS, Google, Meta and others) will never rebuild the human brain in total, only small parts, and with all the rest (with needs, fears etc.) they have nothing to do... hopefully we'll have (political) powers in the future that take an ethical view of this development...

Currently stuck integrating javax.visrec.ml.data.DataSet; thank you @Dietmar for the idea to change to Maven... I'll be back.

Edit: it's such a mess, I'm struggling to import the libs you mentioned: it stores visrec in a new folder javax, but the error persists...

Edit2: OK! Now I've learned how to integrate visrec-api correctly <_<



@Dietmar, thank you very much again for sharing this truly interesting stuff here! I must say the Neuroph framework is an impressive tool (especially with regard to the small size of the package!), but the results of "tiere" - tbh - aren't overwhelming either.

I'm very curious about the neuroph-imgrec libraries... does it really mean "image recognition"? I have to read the docs... a system that could learn to separate women (portraits) from men (portraits) would be sensational!
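From a first look at the Neuroph image recognition guide, a trained network seems to be used roughly like this sketch (MyImageNet.nnet and portrait.jpg are just placeholder names, and the labels are whatever the net was trained with):

import java.io.File;
import java.util.HashMap;
import org.neuroph.core.NeuralNetwork;
import org.neuroph.contrib.imgrec.ImageRecognitionPlugin;

public class PortraitDemo {
    public static void main(String[] args) {
        // load a network previously trained on labeled portrait images (placeholder file name)
        NeuralNetwork nnet = NeuralNetwork.createFromFile("MyImageNet.nnet");

        // the image recognition plugin handles image loading and scaling
        ImageRecognitionPlugin imageRecognition =
                (ImageRecognitionPlugin) nnet.getPlugin(ImageRecognitionPlugin.class);

        // returns a label -> confidence map, e.g. {man=0.12, woman=0.91}
        HashMap<String, Double> output = imageRecognition.recognizeImage(new File("portrait.jpg"));
        System.out.println(output);
    }
}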

For the moment, unfortunately, I still have to struggle hard with the Eclipse IDE. Please look at this ridiculous behavior of the help window when I hover the mouse over the class Scanner to get some info: it shows me black text on a black background :wacko::

[Image: Classes-black.jpg]


@Mark-XP

Does the program "tiere" work in the Eclipse IDE?

It is an amazing program. For example, after training it with names for "Hund", "Katze" or "nix", it can decide for inputs like

HUUUNDDD, hundhundhund, hundkatze, katzehund,

which were never trained before.

And you can look for primes. Just map the letter "a" to "0", "b" to "1", "c" to "2", "d" to "3" etc., which encodes the prime 37 as "dh".

This overcomes the problem with BIG numbers and normalization caused by the sigmoid f(x) = 1/(1+e^(-x)), which saturates for large inputs.
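For example, the letter encoding could look like this little sketch (the class and method names are only for illustration):

public class PrimeEncoding {

    // map each decimal digit to a letter: 0 -> 'a', 1 -> 'b', ..., 9 -> 'j'
    static String encodeDigits(long n) {
        StringBuilder sb = new StringBuilder();
        for (char digit : Long.toString(n).toCharArray()) {
            sb.append((char) ('a' + (digit - '0')));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // the net then only ever sees short letter strings instead of
        // large magnitudes that would saturate the sigmoid
        System.out.println(encodeDigits(37)); // prints "dh"
    }
}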

Dietmar

Edited by Dietmar
Link to comment
Share on other sites

@Dietmar oh yes, it runs nicely! (After having added the 3 neuroph-core libs safely while sleepwalking (at night), I struggled for hours in Eclipse the next morning to add the other 3 libs... something went very wrong.)

OK, I roughly understand the concept of the sigmoid function (as a threshold function that decides when a neuron should "fire") in a neural net, and that it is nicely bounded, but tbh I haven't understood its meaning for the search for primes. Maybe you have a good example or link?

Huundd and Mietzie seem too 'fuzzy' to me. I'd like to find an application of this in rather binary situations; like I wrote before: decide whether a portrait shows a man or a woman. (Like it's already done in medicine: AI decides whether a tissue texture is cancerous or benign.)


Now this is interesting @Dietmar: I tried to 'teach' it to separate odd from even: starting with '2' it guesses Hund and I inform it: that's correct. (Hund = even; if the first answer is Katze, we define Katze = even.) After training with 3 additional values (7, 55, 88, see picture) the Java runtime is running hot, but the program doesn't work anymore.

That happens in both environments, XP and Win7 (with slightly different Java versions)!

Are you able to reproduce?

[Image: Odd-Even.jpg]

[Image: Odd-Java-stressed.jpg]


@Mark-XP

Always, when you hit "n",

ALL the weights have to be recalculated.

There is a limit in Neuroph. For me the same thing happens after about 40 training numbers,

always after an "n" answer.

ChatGPT said that this is a limit in Neuroph, but I think it is a bug.

Of course this program needs a lot of resources, but it should keep working longer.
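Until then, one idea is to bound each training pass with Neuroph's standard learning parameters, so that a pass at least cannot run forever. A sketch (the network size and data set are placeholders):

import org.neuroph.core.data.DataSet;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.BackPropagation;

public class BoundedTraining {
    public static void main(String[] args) {
        // placeholder topology: 26 inputs (letters), 10 hidden, 2 outputs (Hund/Katze)
        MultiLayerPerceptron net = new MultiLayerPerceptron(26, 10, 2);

        DataSet data = new DataSet(26, 2);
        // data.addRow(new DataSetRow(inputs, desiredOutputs)); // add the training rows here

        BackPropagation rule = net.getLearningRule();
        rule.setMaxError(0.01);       // stop once the network is good enough...
        rule.setMaxIterations(10000); // ...or after a fixed number of epochs

        net.learn(data);
    }
}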

Because of this I am trying to implement DL4J, but under Java with Ant, without success so far.

In 4 days I have holidays, and then I will make a try to port the "Hund", "Katze", "nix" program.

Dietmar

 


6 hours ago, Dietmar said:

ChatGPT said that this is a limit in Neuroph, but I think it is a bug.

Do I understand you correctly, @Dietmar: you spoke with one AI about another? I find this so... amazingly sci-fi, really extraordinary!

And of course that is a bug in Neuroph, and we should imo report it, here. It could be done easily by sending them the link to your Java source (e.g. this) and the link to my post above.

OK, that said, now I'll finally begin to study the Neuroph documentation... ;)

Edit: Nice, page 13 of "Getting Started with Neuroph 2.98.pdf" (inside the neuroph 2.98 zip) describes the methods save and createFromFile for saving and loading previously trained neural networks...


Seems to work:

String eingabe = "";

// load a previously saved (trained) NN, if one exists:
File f = new File("or_perceptron.nnet");
if (f.exists() && !f.isDirectory()) {
    System.out.println("Soll or_perceptron.nnet geladen werden: "); // "Should or_perceptron.nnet be loaded?"
    eingabe = scanner.nextLine().toLowerCase();
    System.out.println("");
    if (eingabe.equals("y") || eingabe.equals("j")) {
        System.out.println("or_perceptron.nnet laden... "); // "loading or_perceptron.nnet..."
        neuralNetwork = NeuralNetwork.createFromFile("or_perceptron.nnet");
    }
}
...
// was: while (true) {
while (!eingabe.equals("save")) {
    // ... training/query loop elided ...
}

// save the trained network so it can be reloaded next time:
neuralNetwork.save("or_perceptron.nnet");

 


I just installed on Win10 64-bit the Alpaca ChatGPT from Stanford University, under the name

https://github.com/BenHerbst/Dalaix

It works. After install, you have your own ChatGPT; no Internet is needed anymore.

But until now it is stupid, and I have no idea how I can train it.

Example:

Which year is now? 2019

No, 2023.

Which year is today? 2021

No. 2023.

Which year is today? 2021.

No. 2023.

Which year is today?

20

No. 2023.

Which year is today?

This year

Dietmar


@Mark-XP

Today I made a new try with the "tiere" program.

I installed NetBeans 17 64-bit on Win10 64-bit with the latest Java.

And voilà, no crash at all now with the Neuroph neural network library from Belgrade.

So I was right that it is a resource problem.

The program uses more than 1 GB of RAM in a session when training with 100 different names for dogs and cats.

I run it on the ASRock Z690 Extreme board with a 12900K CPU and 32 GB of RAM.

Now this nice program can also be used for looking for primes. Now it is fast.

It is a fantastic program; I think nearly everything that has to do with AI can be shown with it.

Dietmar


Hello @Dietmar, so you're obviously enjoying your holidays, very nice :)! Just a couple of remarks/questions:

- What 'crash' do you mean? Java running hot (infinitely) after entering a new trainingElement?

- What is the "Neuroph neural network library from Belgrade"? I can't find anything about it.

- I have no resource problem at all with tiere on a Z77E machine with only 4 GB of RAM: it just uses 438 MB (-> pic. 1).

- However, I have problems including the Neuroph docs in the Eclipse IDE: no info in the popup when hovering the mouse over the NeuralNetwork class (-> pic 2).

- I'm trying to make an image recognition program that can learn to separate men's portraits from women's. But that will take some time...

have a nice day!

[Image: Tiere-mem.jpg]

[Image: neuroph-doc.jpg]


@Mark-XP

During the search for prime numbers,

the Hund/Katze program crashes again, meaning it hangs when recalculating all the weights after an "n".

So there is a bug in Neuroph 2.98, which does not depend on whether it is 32-bit or 64-bit, Win10 or XP SP3, or the Java version.

Dietmar


40 minutes ago, Dietmar said:

...the Hund/Katze program crashes again, meaning it hangs when recalculating all the weights after an "n".

So there is a bug in Neuroph 2.98, which does not depend on whether it is 32-bit or 64-bit, Win10 or XP SP3, or the Java version.

Yes @Dietmar, I agree to call it 'hanging', and I would like to report it to the developer(s) as I posted above... (if you're OK with it).

What's the Belgrade library? I can't find that with Qwant.

Edit: @Dietmar HA!! Your ASRock Z690 Extreme has an Intel I219-V on it; maybe you'd like to have a look at the topic I created for it... ;)


@Mark-XP

I made a try with Neuroph 2.96.

It hangs at exactly the same place. So it is a long-standing bug in Neuroph.

Dietmar

PS: I am trying to use only the original Java library.

It is a hard job to implement everything by hand for a working neural network,

but I am trying.


@Mark-XP

Hi,

here is the neural network in Java, written from scratch. What a crazy hard job. It works using only the Java standard library.

 

This program is written in Java. It's a neural network program, which means it's designed to learn and recognize patterns in data. Neural networks are used in many applications such as image recognition, natural language processing, and recommendation systems.

The program starts by defining the NeuralNetwork class, which contains the following variables:

numInputNodes: the number of input nodes in the neural network.

numHiddenNodes: the number of hidden nodes in the neural network.

numOutputNodes: the number of output nodes in the neural network.

numHiddenLayers: the number of hidden layers in the neural network.

inputHiddenWeights: a matrix of weights between the input layer and the first hidden layer.

hiddenHiddenWeights: a matrix of weights between consecutive hidden layers (shared by all hidden-layer pairs).

hiddenOutputWeights: a matrix of weights between the last hidden layer and the output layer.

hiddenBias: an array of biases for the hidden nodes.

outputBias: an array of biases for the output nodes.

The constructor of the NeuralNetwork class takes four arguments: numInputNodes, numHiddenNodes, numOutputNodes, and numHiddenLayers. These arguments define the structure of the neural network. The constructor initializes the inputHiddenWeights, hiddenHiddenWeights, hiddenOutputWeights, hiddenBias, and outputBias variables with random values between 0 and 1.

The program then defines a sigmoid function, which is a mathematical function used in neural networks to convert any input value into a value between 0 and 1. The sigmoid function is used to calculate the activation level of each node in the neural network.

The forwardPropagation method takes an input array and returns an output array. The input array represents the input data that the neural network is trying to recognize. The forwardPropagation method calculates the activation level of each node in the neural network and returns the activation level of the output nodes as the result.

The forwardPropagation method starts by calculating the activation level of the nodes in the first hidden layer. It does this by multiplying the input values by the weights between the input layer and the first hidden layer, adding the biases for each hidden node, and then applying the sigmoid function to the result. This gives the activation level of each node in the first hidden layer.

The forwardPropagation method then calculates the activation level of the nodes in the subsequent hidden layers in the same way as the first hidden layer. It does this by multiplying the activation levels of the nodes in the previous hidden layer by the weights between the previous hidden layer and the current hidden layer, adding the biases for each hidden node, and then applying the sigmoid function to the result.

Finally, the forwardPropagation method calculates the activation level of the output nodes by multiplying the activation levels of the nodes in the last hidden layer by the weights between the last hidden layer and the output layer, adding the biases for each output node, and then applying the sigmoid function to the result.

The main method of the program creates an instance of the NeuralNetwork class with 2 input nodes, 4 hidden nodes, 1 output node, and 1 hidden layer. It then tests the forwardPropagation method with an input array of [0.5, 0.7]. The output of the forwardPropagation method is printed to the console using the Arrays.toString method. The output represents the activation level of the output node(s) for the given input.

I hope this explanation helps! Let me know if you have any more questions.

Dietmar

package neuralnetwork;


import java.util.Arrays;

public class NeuralNetwork {
    private int numInputNodes;
    private int numHiddenNodes;
    private int numOutputNodes;
    private int numHiddenLayers;

    private double[][] inputHiddenWeights;   // input layer -> first hidden layer
    private double[][] hiddenHiddenWeights;  // between consecutive hidden layers (shared)
    private double[][] hiddenOutputWeights;  // last hidden layer -> output layer
    private double[] hiddenBias;
    private double[] outputBias;

    public NeuralNetwork(int numInputNodes, int numHiddenNodes, int numOutputNodes, int numHiddenLayers) {
        this.numInputNodes = numInputNodes;
        this.numHiddenNodes = numHiddenNodes;
        this.numOutputNodes = numOutputNodes;
        this.numHiddenLayers = numHiddenLayers;

        // initialize weights and biases randomly
        inputHiddenWeights = new double[numInputNodes][numHiddenNodes];
        hiddenHiddenWeights = new double[numHiddenNodes][numHiddenNodes];
        hiddenOutputWeights = new double[numHiddenNodes][numOutputNodes];
        hiddenBias = new double[numHiddenNodes];
        outputBias = new double[numOutputNodes];

        randomize(inputHiddenWeights);
        randomize(hiddenHiddenWeights);
        randomize(hiddenOutputWeights);

        for (int i = 0; i < numHiddenNodes; i++) {
            hiddenBias[i] = Math.random();
        }

        for (int i = 0; i < numOutputNodes; i++) {
            outputBias[i] = Math.random();
        }
    }

    // fill a matrix with an independent random value per weight;
    // Arrays.fill(row, Math.random()) would give every weight in a row the same value
    private static void randomize(double[][] matrix) {
        for (double[] row : matrix) {
            for (int j = 0; j < row.length; j++) {
                row[j] = Math.random();
            }
        }
    }

    public double sigmoid(double x) {
        return 1 / (1 + Math.exp(-x));
    }

    public double[] forwardPropagation(double[] input) {
        // calculate activations for first hidden layer
        double[] hiddenActivations = new double[numHiddenNodes];

        for (int j = 0; j < numHiddenNodes; j++) {
            double sum = 0;

            for (int i = 0; i < numInputNodes; i++) {
                sum += input[i] * inputHiddenWeights[i][j];
            }

            hiddenActivations[j] = sigmoid(sum + hiddenBias[j]);
        }

        // calculate activations for subsequent hidden layers
        for (int layer = 1; layer < numHiddenLayers; layer++) {
            double[] nextHiddenActivations = new double[numHiddenNodes];

            for (int j = 0; j < numHiddenNodes; j++) {
                double sum = 0;

                for (int i = 0; i < numHiddenNodes; i++) {
                    sum += hiddenActivations[i] * hiddenHiddenWeights[i][j]; // hidden-to-hidden weights
                }

                nextHiddenActivations[j] = sigmoid(sum + hiddenBias[j]);
            }

            hiddenActivations = nextHiddenActivations;
        }

        // calculate output layer activations
        double[] outputActivations = new double[numOutputNodes];

        for (int j = 0; j < numOutputNodes; j++) {
            double sum = 0;

            for (int i = 0; i < numHiddenNodes; i++) {
                sum += hiddenActivations[i] * hiddenOutputWeights[i][j];
            }

            outputActivations[j] = sigmoid(sum + outputBias[j]);
        }

        return outputActivations;
    }

    public static void main(String[] args) {
        // create a neural network with 2 input nodes, 4 hidden nodes, 1 output node, and 1 hidden layer
        NeuralNetwork nn = new NeuralNetwork(2, 4, 1, 1);

        // test forward propagation with input [0.5, 0.7]
        double[] input = {0.5, 0.7};
        double[] output = nn.forwardPropagation(input);

        System.out.println(Arrays.toString(output)); // print output
    }
}
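Note that this from-scratch version implements only the forward pass; there is no learning step (e.g. backpropagation of errors) yet, so the printed output for [0.5, 0.7] just reflects the random initial weights.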

 

