Tuesday, October 29, 2013

Video lectures on different topics



1. Linear Programming / Linear Optimization

Fundamentals of Operations Research
Lec-3: Linear Programming Solutions, IIT Madras

http://www.youtube.com/watch?v=XEA1pOtyrfo



http://nptel.iitm.ac.in



2. 

Wednesday, October 23, 2013

Call MATLAB from C/C++/Java, or call C/C++/Java from MATLAB (MATLAB binary calling)



Call MATLAB from C/C++, Java, etc.



http://www.mathworks.com/help/matlab/matlab_external/calling-matlab-software-from-a-c-application.html

http://www.mathworks.com/help/matlab/matlab_external/compiling-engine-applications-with-the-mex-command.html#bsq78dr-9


Set the environment variable first (adjust the paths to your MATLAB installation):


export LD_LIBRARY_PATH=/mnt/kaustapps/MATLAB-faculty/R2011b.app/bin/glnxa64/:/mnt/kaustapps/MATLAB-faculty/R2011b.app/sys/os/glnxa64/:$LD_LIBRARY_PATH

UNIX Engine Example: engdemo

To verify the build process on your computer, use the C example engdemo.c or the C++ example engdemo.cpp.
  1. Copy one of the programs, for example, engdemo.c, to your current working folder:
    copyfile(fullfile(matlabroot,...
      'extern','examples','eng_mat','engdemo.c'),...
      '.', 'f');
  2. Build the executable file:
    mex('-v', '-f', fullfile(matlabroot,...
      'bin','engopts.sh'),...
      'engdemo.c');
  3. Verify that the build worked by looking in your current working folder for the engdemo application:
    dir engdemo
  4. Run the example in MATLAB:
    !engdemo
For more information about the engdemo applications, see Call MATLAB Functions from C Applications.
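
If you just want the skeleton, a minimal engine client looks roughly like the sketch below. This is not the shipped engdemo; the file name (minimal_engine.c), the evaluated expression, and the variable names are made up for illustration, but engOpen, engEvalString, engGetVariable, mxGetScalar, mxDestroyArray, and engClose are the documented Engine/MX API calls from engine.h.

/* minimal_engine.c - a minimal sketch of calling MATLAB from C via the Engine API.
   Build it the same way as engdemo.c (step 2 above), just with this file name. */
#include <stdio.h>
#include "engine.h"

int main(void)
{
    Engine  *ep;
    mxArray *s;

    ep = engOpen("");                 /* start MATLAB on the local machine */
    if (ep == NULL) {
        fprintf(stderr, "Can't start MATLAB engine\n");
        return 1;
    }

    engEvalString(ep, "x = 1:10; s = sum(x);");   /* run some MATLAB code */

    s = engGetVariable(ep, "s");                  /* copy a variable back into C */
    if (s != NULL) {
        printf("sum = %g\n", mxGetScalar(s));
        mxDestroyArray(s);
    }

    engClose(ep);                                 /* shut the engine down */
    return 0;
}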
 
 


Call C/C++/Java from MATLAB




http://www.mathworks.com/help/matlab/ref/mex.html

http://www.mathworks.com/help/matlab/create-mex-files.html
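
For orientation (a sketch, not taken from the links above): a C MEX-file is just a C file that defines mexFunction and is compiled with the mex command. The file name times_two.c and its behavior are made up for illustration; the mex.h calls used are the standard ones.

/* times_two.c - a minimal sketch of a C MEX-file.
   Build in MATLAB with:  mex times_two.c
   Call in MATLAB with:   y = times_two([1 2 3]); */
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    double *in, *out;
    mwSize n, i;

    if (nrhs != 1 || !mxIsDouble(prhs[0]))
        mexErrMsgTxt("Expected one double input.");

    n  = mxGetNumberOfElements(prhs[0]);
    in = mxGetPr(prhs[0]);

    plhs[0] = mxCreateDoubleMatrix(1, n, mxREAL);   /* output row vector */
    out = mxGetPr(plhs[0]);

    for (i = 0; i < n; i++)
        out[i] = 2.0 * in[i];                       /* double every element */
}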


Sunday, October 20, 2013

libsvm usage


FAQ
=====
http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#/Q4:_Training_and_prediction


DOWNLOAD
========

You just need to download one zip file from the main page. That's all.

http://www.csie.ntu.edu.tw/~cjlin/libsvm/

INSTALL
==========
make

If you want to use the parameter-estimation (evaluation) tool, you need to change the code a bit (see the Parameter Searching section below) and then do the following:

make clean;
make install;


DATAFORMAT
==============
label  1:feat#1  2:feat#2  3:feat#3  ...  N:feat#N
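
For example, a made-up sample with label +1 and three features would be the line:

+1 1:0.25 2:-1 3:0.5

Feature indices must be in ascending order, and zero-valued features may be omitted (the format is sparse).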


Some available data
===============
http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/

Use the heart_scale data. It works well for all of the examples below: plotting, cross-validation, and parameter estimation.

2-class CLASSIFICATION with RBF kernel and 5-fold CV
=================================

train (support vectors are generated):
./svm-train -s 0 -t 2   -g  0.03125 -c 0.25   train.dat  train.model

train with CV (no support vectors or model are produced; it just prints your score, e.g. accuracy, or AUC/F-score with the eval patch):

./svm-train -s 0 -t 2 -v 5  train.dat  > train.cv

testing:
./svm-predict test.dat train.model test.output

Now parse the output file (containing the predicted labels) to calculate sensitivity, specificity, and accuracy; a small sketch is given below.
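
A rough sketch of such a parser (in C, not part of libsvm) is below. It assumes the true labels are +1/-1 and appear as the first token of each line of test.dat, and that test.output contains one predicted label per line, which is what svm-predict writes for classification.

/* eval_binary.c - compute accuracy, sensitivity and specificity from a
   libsvm-format test file and an svm-predict output file.
   Build:  gcc -o eval_binary eval_binary.c
   Run:    ./eval_binary test.dat test.output */
#include <stdio.h>

int main(int argc, char *argv[])
{
    FILE *ftrue, *fpred;
    int yt, yp, tp = 0, tn = 0, fp = 0, fn = 0;

    if (argc != 3) {
        fprintf(stderr, "usage: %s test_file prediction_file\n", argv[0]);
        return 1;
    }
    ftrue = fopen(argv[1], "r");
    fpred = fopen(argv[2], "r");
    if (ftrue == NULL || fpred == NULL) { perror("fopen"); return 1; }

    /* first token of each test line = true label; one prediction per output line */
    while (fscanf(ftrue, "%d", &yt) == 1 && fscanf(fpred, "%d", &yp) == 1) {
        int ch;
        while ((ch = fgetc(ftrue)) != '\n' && ch != EOF)
            ;                          /* skip the feature part of the line */
        if (yt ==  1 && yp ==  1) tp++;
        if (yt == -1 && yp == -1) tn++;
        if (yt == -1 && yp ==  1) fp++;
        if (yt ==  1 && yp == -1) fn++;
    }
    if (tp + tn + fp + fn > 0)
        printf("accuracy    = %g\n", (double)(tp + tn) / (tp + tn + fp + fn));
    if (tp + fn > 0)
        printf("sensitivity = %g\n", (double)tp / (tp + fn));
    if (tn + fp > 0)
        printf("specificity = %g\n", (double)tn / (tn + fp));

    fclose(ftrue);
    fclose(fpred);
    return 0;
}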

Regression
================

train:
./svm-train -s 3  engine.train  engine.train.model
testing (svm-predict takes the test file, then the model file, then the output file; assuming a held-out set engine.test):
./svm-predict  engine.test  engine.train.model  engine.output

Now parse the output file (containing the predicted values) to calculate the RMS error, etc.; a small sketch is given below.
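
In the same spirit as the classification parser above, a sketch for the RMS error, again assuming the true target is the first token of each line of the test file and the predictions are one value per line:

/* rmse.c - compute the root-mean-square error of an epsilon-SVR run.
   Build:  gcc -o rmse rmse.c -lm
   Run:    ./rmse engine.test engine.output */
#include <stdio.h>
#include <math.h>

int main(int argc, char *argv[])
{
    FILE *ftrue, *fpred;
    double yt, yp, sse = 0.0;
    long n = 0;

    if (argc != 3) {
        fprintf(stderr, "usage: %s test_file prediction_file\n", argv[0]);
        return 1;
    }
    ftrue = fopen(argv[1], "r");
    fpred = fopen(argv[2], "r");
    if (ftrue == NULL || fpred == NULL) { perror("fopen"); return 1; }

    while (fscanf(ftrue, "%lf", &yt) == 1 && fscanf(fpred, "%lf", &yp) == 1) {
        int ch;
        while ((ch = fgetc(ftrue)) != '\n' && ch != EOF)
            ;                          /* skip the feature part of the line */
        sse += (yt - yp) * (yt - yp);
        n++;
    }
    if (n > 0)
        printf("RMSE = %g over %ld points\n", sqrt(sse / n), n);

    fclose(ftrue);
    fclose(fpred);
    return 0;
}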

Parameter Searching for the RBF kernel (the only supported kernel)
====================================

http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/eval/index.html

a. Make the change in the source file as mentioned in the above link.
b. Use grid.py in the tools folder.
c. Read the README file inside tools to select the range of parameters.
d. Use the following command:

 python grid.py -log2c -5,5,1 -log2g -4,0,1 -v 10  ../../data/heart_scale

- it searches log2(c) of the "c" parameter over the range [-5,5] with step 1, and log2(g) of the "g" parameter over the range [-4,0] with step 1, using 10-fold CV on the heart_scale data.

e. Select the maximum score (i.e. AUC, F-score) from the output file; the corresponding row gives log2 of each parameter, so the final parameters are 2 raised to those values. For example, if the best row has log2(c) = 3 and log2(g) = -2, use c = 2^3 = 8 and g = 2^-2 = 0.25.

f. If you want to use a criterion other than AUC (the default), change

 double (*validation_function)(const dvec_t&, const ivec_t&) = auc;

in eval.cpp to the evaluation function you prefer. You can also assign "precision", "recall", "fscore", or "bac" here.
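
For example, to optimize the F-score instead, that line becomes:

 double (*validation_function)(const dvec_t&, const ivec_t&) = fscore;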

FEATURE SELECTION
===================



Windows
=============

svm-train -s 3  D:\matlabWorkspace\fuelPerfHeavyNapthaSVM\engine.train.svm D:\matlabWorkspace\fuelPerfHeavyNapthaSVM\engine.train.svm.model
svm-predict    D:\matlabWorkspace\fuelPerfHeavyNapthaSVM\engine.test.svm  D:\matlabWorkspace\fuelPerfHeavyNapthaSVM\engine.train.svm.model D:\matlabWorkspace\fuelPerfHeavyNapthaSVM\predicted

Wednesday, October 2, 2013

MATLAB cross-validation with SVM [draft, not final]



function test

clc;
matPos = csvread('pos.dat');
noPos= size(matPos,1);
noFeature = size(matPos,2);
labelPos= ones(noPos,1);
% matPos= [matPos labelPos ];

matNeg = csvread('neg.dat');
noNeg= size(matNeg,1);
labelNeg= -1*ones(noNeg,1);
% matNeg= [matNeg labelNeg ];

% svmStruct = svmtrain(featureInTrain,featureOutTrain,'kernel_function','linear'  , 'options' ,smo_opts); %,'rbf_sigma',100,'boxconstraint',25

noFold=5;
c = cvpartition([labelPos ; labelNeg],'kfold', noFold);

% initial (log-scale) parameters: rbf_sigma = exp(z(1)), boxconstraint = exp(z(2))
z = [0; 0];

% confusion matrix accumulated over the folds, with the classes ordered as [1 -1]
f = @(xtr,ytr,xte,yte) confusionmat(yte, crossfun(xtr,ytr,xte, exp(z(1)), exp(z(2))), 'order', [1 -1]);
cfMat = crossval(f,[matPos; matNeg], [ labelPos ; labelNeg],'partition',c);
cfMat = reshape(sum(cfMat),2,2)   % two classes, so 2x2


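% Search for good (rbf_sigma, boxconstraint) by minimizing the cross-validated
% misclassification rate ('mcr'); z = [log(rbf_sigma); log(boxconstraint)].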
minfn = @(z)crossval('mcr',[matPos; matNeg], [ labelPos ; labelNeg],...
    'Predfun', @(xtrain,ytrain,xtest)crossfun(xtrain,ytrain,xtest, exp(z(1)),exp(z(2))), ...
    'partition',c   );
 

opts = optimset('TolX',5e-4,'TolFun',5e-4);
[searchmin fval] = fminsearch(minfn,randn(2,1),opts)
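% The optimized parameters are rbf_sigma = exp(searchmin(1)) and
% boxconstraint = exp(searchmin(2)).

% Separate sanity check: the standard Fisher iris crossval/confusionmat
% example (3 classes, hence the 3x3 reshape).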

load('fisheriris');
y = species;
X = meas;
order = unique(y); % Order of the group labels
cp = cvpartition(y,'k',10); % Stratified cross-validation

f = @(xtr,ytr,xte,yte)confusionmat(yte,classify(xte,xtr,ytr),'order',order);

cfMat = crossval(f,X,y,'partition',cp);
cfMat = reshape(sum(cfMat),3,3)



display('done')
end





function yfit = crossfun(xtrain,ytrain,xtest, rbf_sigma,boxconstraint)

% Train the model on xtrain, ytrain,
% and get predictions of class of xtest
svmStruct = svmtrain(xtrain, ytrain, 'boxconstraint', boxconstraint, ...
    'Kernel_Function', 'rbf', 'rbf_sigma', rbf_sigma);

yfit = svmclassify(svmStruct, xtest);

end