Using an SVM with Non-Linearly Separable Data in OpenCV


Author: 一航jason | Published 2020-08-12 13:21

Why extend the SVM optimization problem to handle training data that is not linearly separable? Most computer-vision applications of SVMs need a more powerful tool than a simple linear classifier, because in these tasks the training data can rarely be separated by a hyperplane.

Consider one such task: face detection. Here the training data consists of one set of images that are faces and another set that are not faces (anything in the world other than a face). This data is too complex for any representation of the samples (feature vectors) to make the entire set of faces linearly separable from the entire set of non-faces.

With an SVM we obtain a separating hyperplane. Since the training data is now non-linearly separable, we must accept that the hyperplane will misclassify some samples, and this misclassification becomes a new variable in the optimization. The new model must combine the old requirement of finding the hyperplane with the largest margin with a new one: generalize over the training data correctly by not allowing too many classification errors.
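In standard notation, the relaxed problem described above is the soft-margin formulation (a sketch, not part of the original article), where each slack variable ξ_i measures how far sample i violates the margin:

```latex
\min_{w,\,b,\,\xi}\;\frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{N}\xi_{i}
\qquad\text{subject to}\qquad
y_{i}\,(w^{\top}x_{i}+b)\;\ge\;1-\xi_{i},\quad \xi_{i}\ge 0
```

The constant C trades margin width against margin violations: a small C tolerates more misclassified samples in exchange for a wider margin. This is the value set via `svm->setC(0.1)` in the code below.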

Usage:
        // Requires #include <opencv2/opencv.hpp>; using namespace cv; using namespace cv::ml;
        // Data for visual representation
        const int WIDTH = 512, HEIGHT = 512;
        const int NTRAINING_SAMPLES = 100;   // Number of training samples per class
        const float FRAC_LINEAR_SEP = 0.9f;  // Fraction of samples in the linearly separable part
        Mat I = Mat::zeros(HEIGHT, WIDTH, CV_8UC3);
        //--------------------- 1. Set up training data randomly ---------------------------------------
        Mat trainData(2*NTRAINING_SAMPLES, 2, CV_32FC1);
        Mat labels   (2*NTRAINING_SAMPLES, 1, CV_32SC1);
        RNG rng(100); // Random value generation class
        // Set up the linearly separable part of the training data
        int nLinearSamples = (int) (FRAC_LINEAR_SEP * NTRAINING_SAMPLES);
        // Generate random points for the class 1
        Mat trainClass = trainData.rowRange(0, nLinearSamples);
        // The x coordinate of the points is in [0, 0.4)
        Mat c = trainClass.colRange(0, 1);
        rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(0.4 * WIDTH));
        // The y coordinate of the points is in [0, 1)
        c = trainClass.colRange(1,2);
        rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));
        // Generate random points for the class 2
        trainClass = trainData.rowRange(2*NTRAINING_SAMPLES-nLinearSamples, 2*NTRAINING_SAMPLES);
        // The x coordinate of the points is in [0.6, 1]
        c = trainClass.colRange(0 , 1);
        rng.fill(c, RNG::UNIFORM, Scalar(0.6*WIDTH), Scalar(WIDTH));
        // The y coordinate of the points is in [0, 1)
        c = trainClass.colRange(1,2);
        rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));
        //------------------ Set up the non-linearly separable part of the training data ---------------
        // Generate random points for the classes 1 and 2
        trainClass = trainData.rowRange(  nLinearSamples, 2*NTRAINING_SAMPLES-nLinearSamples);
        // The x coordinate of the points is in [0.4, 0.6)
        c = trainClass.colRange(0,1);
        rng.fill(c, RNG::UNIFORM, Scalar(0.4*WIDTH), Scalar(0.6*WIDTH));
        // The y coordinate of the points is in [0, 1)
        c = trainClass.colRange(1,2);
        rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));
        //------------------------- Set up the labels for the classes ---------------------------------
        labels.rowRange(                0,   NTRAINING_SAMPLES).setTo(1);  // Class 1
        labels.rowRange(NTRAINING_SAMPLES, 2*NTRAINING_SAMPLES).setTo(2);  // Class 2
        //------------------------ 2. Set up the support vector machines parameters --------------------
        //------------------------ 3. Train the svm ----------------------------------------------------
        Ptr<SVM> svm = SVM::create();
        svm->setType(SVM::C_SVC);
        svm->setC(0.1);
        svm->setKernel(SVM::LINEAR);
        svm->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER, (int)1e7, 1e-6));
        svm->train(trainData, ROW_SAMPLE, labels);
        //------------------------ 4. Show the decision regions ----------------------------------------
        Vec3b green(0, 100, 0), blue(100, 0, 0); // BGR: dark green and dark blue
        for (int i = 0; i < I.rows; ++i)
            for (int j = 0; j < I.cols; ++j)
            {
                Mat sampleMat = (Mat_<float>(1, 2) << j, i); // (x, y) = (col, row)
                float response = svm->predict(sampleMat);
                if      (response == 1)    I.at<Vec3b>(i, j) = green;
                else if (response == 2)    I.at<Vec3b>(i, j) = blue;
            }
        //----------------------- 5. Show the training data --------------------------------------------
        int thick = -1;
        int lineType = 8;
        float px, py;
        // Class 1
        for (int i = 0; i < NTRAINING_SAMPLES; ++i)
        {
            px = trainData.at<float>(i,0);
            py = trainData.at<float>(i,1);
            circle(I, Point( (int) px,  (int) py ), 3, Scalar(0, 255, 0), thick, lineType);
        }
        // Class 2
        for (int i = NTRAINING_SAMPLES; i <2*NTRAINING_SAMPLES; ++i)
        {
            px = trainData.at<float>(i,0);
            py = trainData.at<float>(i,1);
            circle(I, Point( (int) px, (int) py ), 3, Scalar(255, 0, 0), thick, lineType);
        }
        //------------------------- 6. Show support vectors --------------------------------------------
        thick = 2;
        lineType  = 8;
        Mat sv = svm->getUncompressedSupportVectors();
        for (int i = 0; i < sv.rows; ++i)
        {
            const float* v = sv.ptr<float>(i);
            circle( I,  Point( (int) v[0], (int) v[1]), 6, Scalar(128, 128, 128), thick, lineType);
        }
        imshow("SVM for Non-Linear Training Data", I); // Display the result
        waitKey();
Result (outs.jpg): the decision regions with the training points and support vectors overlaid.

Original link: https://www.haomeiwen.com/subject/vnipdktx.html