Title: | Kernel Fisher Discriminant Analysis |
---|---|
Description: | Kernel Fisher Discriminant Analysis (KFDA) is performed by combining Kernel Principal Component Analysis (KPCA) and Fisher Discriminant Analysis (FDA). Some similar packages exist. First, 'lfda' performs Local Fisher Discriminant Analysis (LFDA) among other functions, but it requires the label information of the data as a function argument, which makes it unsuitable for predicting unlabeled test data. Also, the 'ks' package is limited in the dimension it supports, which makes proper analysis difficult. This package is a simple and practical package for KFDA based on the paper of Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004) <DOI:10.1016/j.patcog.2003.10.015>. |
Authors: | Donghwan Kim |
Maintainer: | Donghwan Kim <[email protected]> |
License: | GPL-3 |
Version: | 1.0.0 |
Built: | 2025-02-14 04:25:28 UTC |
Source: | https://github.com/cran/kfda |
Trains a KFDA model on trainData. By default, KFDA is run with a Gaussian (RBF) kernel. Returns the trained KFDA object.
kfda(trainData = data, kernel.name = "rbfdot", kpar.sigma = 0.001, threshold = 1e-05)
trainData |
an optional data frame or matrix containing the training data; the last column must be the class label column. |
kernel.name |
the kernel function used in training and predicting. This parameter is fixed to the Gaussian kernel ("rbfdot"). |
kpar.sigma |
the sigma hyper-parameter of the selected kernel (default: 0.001). |
threshold |
the eigenvalue threshold below which principal components are ignored (only valid when features = 0) (default: 1e-05). |
Since this function performs KFDA as a combination of kpca and lda, the following values show the result of each function.
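As a rough sketch of the combination described above, a similar pipeline can be reproduced directly with kernlab::kpca and MASS::lda. This is an illustrative approximation, not the exact package internals; the sigma value mirrors the kpar.sigma default.

```r
library(kernlab)  # for kpca() and rotated()
library(MASS)     # for lda()

data(iris)
x <- as.matrix(iris[, -5])  # features
y <- iris[, 5]              # class labels

# Step 1: kernel PCA with a Gaussian (RBF) kernel
kp <- kpca(x, kernel = "rbfdot", kpar = list(sigma = 0.001))

# Step 2: LDA on the kernel principal component scores
fit <- lda(rotated(kp), grouping = y)
```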
An object of class kfda.
kpca.train |
An object of class "kpca", containing the results of KPCA performed on trainData. |
lda.rotation.train |
The result of applying LDA after KPCA is performed on trainData. |
LDs |
A data frame of the linear discriminants from LDA. |
label |
A vector of the class labels of trainData. |
This package is an early version and will be updated in the future.
Donghwan Kim
[email protected]
Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004). Essence of kernel Fisher discriminant: KPCA plus LDA. Pattern Recognition, 37(10), 2097-2100. <DOI:10.1016/j.patcog.2003.10.015>
kpca
(in package kernlab)
lda
(in package MASS)
kfda.predict
# data input
data(iris)

# data separation
idx <- sample(1:dim(iris)[1], round(dim(iris)[1] * 0.7))
trainData <- iris[idx, ]

# training KFDA model
kfda.model <- kfda(trainData = trainData, kernel.name = "rbfdot")

# structure of kfda.model
str(kfda.model)
Predicts class labels for testData using a trained KFDA model. This function is used after the training phase has been performed with the kfda function.
kfda.predict(object = obj, testData = data)
object |
An object of class kfda, the result of a call to the kfda function. |
testData |
an optional data frame or matrix of test data, without the label column. |
Since this function builds on kpca and lda, various results can be obtained by adjusting the hyper-parameters of each function.
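Conceptually, prediction projects testData through the trained kernel PCA and then calls predict on the LDA fit. A self-contained sketch with kernlab and MASS (illustrative, not the exact package internals):

```r
library(kernlab)  # for kpca(), rotated(), and the kpca predict method
library(MASS)     # for lda() and the lda predict method

data(iris)
idx   <- sample(1:nrow(iris), round(nrow(iris) * 0.7))
train <- as.matrix(iris[idx, -5])
test  <- as.matrix(iris[-idx, -5])

# fit the pipeline on the training data
kp  <- kpca(train, kernel = "rbfdot", kpar = list(sigma = 0.001))
fit <- lda(rotated(kp), grouping = iris[idx, 5])

# project test data into the kernel PC space, then classify
pred <- predict(fit, predict(kp, test))
# pred$class, pred$posterior, and pred$x mirror the values described below
```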
The result of applying the trained KFDA model to testData.
class |
The predicted class labels of testData. |
posterior |
The posterior probabilities for the classes. |
x |
The scores of testData on the linear discriminants. |
Donghwan Kim
[email protected]
Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004). Essence of kernel Fisher discriminant: KPCA plus LDA. Pattern Recognition, 37(10), 2097-2100. <DOI:10.1016/j.patcog.2003.10.015>
# data input
data(iris)

# data separation
idx <- sample(1:dim(iris)[1], round(dim(iris)[1] * 0.7))
trainData <- iris[idx, ]
testData <- iris[-idx, -dim(iris)[2]]
testData.Label <- iris[-idx, dim(iris)[2]]

# training KFDA model
kfda.model <- kfda(trainData = trainData, kernel.name = "rbfdot")

# predicting new (test) data with the KFDA model
pre <- kfda.predict(object = kfda.model, testData = testData)

# plotting
plot(kfda.model$LDs, col = kfda.model$label, pch = 19, main = "Plot for KFDA")
points(pre$x, col = pre$class, cex = 2)
legend("topleft", legend = c("trainData", "testData"), pch = c(19, 1))

# prediction result
table(pre$class, testData.Label)