
The main classifier exposed by this package is named Dl4jMlpClassifier. If you are new to Weka, a good resource to get started is the Weka manual. As with most of Weka, WekaDeeplearning4j's functionality is accessible in three ways: via the GUI, via the commandline, and via the Java API. All three ways are explained in the following. Simple examples are given in the examples section for the Iris dataset and the MNIST dataset.
In the GUI, make sure your WEKA_HOME environment variable is set. The Dl4jMlpClassifier can then be configured through its options dialog. The network configuration option exposes further hyperparameter tuning, and the layer specification option lets the user specify the sequence of layers that build the neural network architecture; the user can choose from the available layers, and each layer can be further configured (e.g. the ConvolutionLayer's options). As explained further in the data section, depending on the dataset a certain InstanceIterator has to be loaded that handles the parsing of certain data types (text/image). The iterator can be selected from the Dl4jMlpClassifier window via the instance iterator option; the ImageInstanceIterator, for example, exposes its own options for loading image data.
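Setting the variable and starting the GUI from a terminal might look like the following sketch (the WEKA_HOME path is a placeholder; weka.gui.GUIChooser is Weka's standard GUI entry point):

```
$ export WEKA_HOME=/home/user/wekafiles
$ java -cp weka.jar weka.gui.GUIChooser
```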


On the commandline, assuming weka.jar is on the CLASSPATH, a first look at the available commandline options of the Dl4jMlpClassifier is shown with:

```
$ java weka.Run .Dl4jMlpClassifier -h
```

Below the general options, the specific ones are listed. Options specific to weka.classifiers.functions.Dl4jMlpClassifier include:

- the name of the log file to write loss information to (default = no log file)
- the queue size for asynchronous data transfer (default: 0, synchronous transfer)
- if set, the classifier is run in debug mode and may output additional info to the console
- if set, classifier capabilities are not checked before the classifier is built
- the number of decimal places for the output of numbers in the model (default 2)
- the desired batch size for batch prediction (default 100)

The most interesting option may be the -layer specification. This option can be used multiple times and defines the architecture of the network layer-wise, as sketched below.
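The following invocation is a sketch only: the layer class names mirror the Java API shown later, but the per-layer option names (-nOut, -activation, -lossFn) and the quoting are assumptions that should be verified against the -h output and the Javadoc:

```
$ java weka.Run .Dl4jMlpClassifier \
    -layer "weka.dl4j.layers.DenseLayer -nOut 10 -activation weka.dl4j.activations.ActivationReLU" \
    -layer "weka.dl4j.layers.OutputLayer -activation weka.dl4j.activations.ActivationSoftmax -lossFn weka.dl4j.lossfunctions.LossMCXENT" \
    -t /some/where/data.arff
```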
The above setup builds a network with one hidden layer, having 10 output units using the ReLU activation function, followed by an output layer with the softmax activation function, using a multi-class cross-entropy loss function (MCXENT) as optimization objective. Another important option is the neural network configuration -conf, in which you can set up hyperparameters for the network.
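Again as a sketch only (the -updater option name and the weka.dl4j.updater.Adam class are assumptions to verify against -h):

```
$ java weka.Run .Dl4jMlpClassifier \
    -conf "weka.dl4j.NeuralNetConfiguration -updater weka.dl4j.updater.Adam" \
    -t /some/where/data.arff
```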
Using the Dl4jMlPClassifier your code should usually start with // Create a new Multi-Layer-Perceptron classifierĭl4jMlpClassifier clf = new Dl4jMlpClassifier() The Java API is a straight forward wrapper for the official DeepLearning4j API. Available options can be found in the Java documentation (the field commandLineParamSynopsis indicates the commandline parameter name for each available method). The above setup builds a network with one hidden layer, having 10 output units using the ReLU activation function, followed by an output layer with the softmax activation function, using a multi-class cross-entropy loss function (MCXENT) as optimization objective.Īnother important option is the neural network configuration -conf in which you can setup hyperparameters for the network. This option can be used multiple times and defines the architecture of the network layer-wise. The most interesting option may be the -layer specification. The desired batch size for batch prediction (default 100). The number of decimal places for the output of numbers in the model (default 2). If set, classifier capabilities are not checked before classifier is built May output additional info to the console

The same mechanisms apply when using Weka's classifiers and filters in your own code; the following explains the basics, and a link to an example class can be found at the end of this page, under the Links section. A comprehensive source of information is the chapter Using the API of the Weka manual. The classifiers and filters always list their options in the Javadoc API (stable, developer version) specification.

Data can be loaded via the DataSource class, which is not limited to ARFF files: it can also read CSV files and other formats (basically all file formats that Weka can import via its converters; it uses the file extension to determine the associated loader):

```java
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

DataSource source = new DataSource("/some/where/data.arff");
Instances data = source.getDataSet();
// setting class attribute if the data format does not provide this information
// For example, the XRFF format saves the class attribute information as well
if (data.classIndex() == -1)
  data.setClassIndex(data.numAttributes() - 1);
```

Reading from databases is slightly more complicated, but still very easy.
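A minimal sketch using weka.experiment.InstanceQuery (the JDBC URL, credentials, and query are placeholders; the matching JDBC driver must be on the classpath and configured in Weka's DatabaseUtils.props):

```java
import weka.core.Instances;
import weka.experiment.InstanceQuery;

InstanceQuery query = new InstanceQuery();
query.setDatabaseURL("jdbc:mysql://localhost:3306/some_database"); // placeholder
query.setUsername("user");     // placeholder
query.setPassword("password"); // placeholder
query.setQuery("SELECT * FROM some_table");
// retrieveInstances executes the query and converts the result into Instances
Instances data = query.retrieveInstances();
```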
