C# and Matlab differences in reading video frame RGB values
I am trying to convert a Matlab script to C#. The Matlab script reads an .avi video file and, for each frame in the video, calculates the mean value of the red, green, and blue pixels and then puts them in a matrix. So I'm left with a red-channel means, green-channel means, and blue-channel means matrix, with one value for each frame in the video.
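The per-frame computation amounts to averaging each color channel over all pixels. A minimal language-neutral sketch in Python (the frame representation and function name are my own illustration, not from Matlab or any C# library):

```python
def channel_means(frame):
    """Return (mean_R, mean_G, mean_B) for a frame given as (r, g, b) tuples."""
    n = len(frame)
    r = sum(p[0] for p in frame) / n
    g = sum(p[1] for p in frame) / n
    b = sum(p[2] for p in frame) / n
    return (r, g, b)

# Tiny 2x2 "frame" of four pixels:
frame = [(255, 100, 0), (0, 100, 100), (255, 100, 0), (0, 100, 100)]
print(channel_means(frame))  # -> (127.5, 100.0, 50.0)
```

Running this over every frame and stacking the three results row by row gives the same per-frame means matrix the Matlab script builds.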
To do this in C# I'm using the AForge framework, which has an AVIReader class as well as an ImageStatistics class that provides RGB mean values for a given image. The problem I've run into is that the mean values Matlab calculates for a frame and the C# calculations for the same frame don't match. They are close, like within 5 of each other, but I think they should be identical.
C# was using 15 significant figures, so I told Matlab to do the same to rule out rounding error, and it didn't improve the results.
So here's the kicker .....
I decided to take a random jpeg image from Google and load it into Matlab, then separate its RGB pixel values into three different matrices. I selected the R matrix and ran the histc() function on it to count how many R values fell at each intensity in the range 0-255.
I did the same thing in C# using the same image, the only difference being that in C# I converted the image to a Bitmap and then used the ImageStatistics class to give me a histogram of the R values in the image.
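Both the histc() call and the ImageStatistics histogram boil down to a 256-bin count of the red channel. A rough Python equivalent of that idea (not the actual AForge or Matlab API, just the counting logic):

```python
def red_histogram(red_values):
    """Count occurrences of each intensity 0..255, like histc(r(:), 0:255)."""
    counts = [0] * 256
    for v in red_values:
        counts[v] += 1
    return counts

hist = red_histogram([0, 0, 255, 128, 128, 128])
# hist[0] == 2, hist[128] == 3, hist[255] == 1, everything else 0
```

If the two toolkits decode the image to identical pixel values, histograms built this way must match bin for bin, which is what makes this a useful sanity check.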
When I compared the histograms given by Matlab and C#, they were very similar, but for some values between 0-255 one histogram would have a higher or lower count than the other. So for some reason C# and Matlab interpret the pixel values in the image differently, and I would like to know why.
Have you checked the data types of the RGB matrices in Matlab? It is possible that the values are stored as integers.
>> a = magic(25);
>> b = uint8(a);
>> mean(mean(a))
ans = 313
>> mean(mean(b))
ans = 203.1840
>>
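To see why the uint8 cast matters in that session: magic(25) contains the integers 1 through 625 exactly once each, and Matlab's uint8 conversion saturates everything above 255 at 255, which drags the mean from 313 down to 203.184. A Python sketch of the same arithmetic (the saturation is modeled by hand here, since that clamping is Matlab-specific behavior, not a Python default):

```python
# magic(25) holds each integer 1..625 exactly once.
values = list(range(1, 626))

mean_double = sum(values) / len(values)   # like mean(mean(a)) -> 313.0

# uint8(a) in Matlab saturates: anything above 255 becomes 255.
clamped = [min(v, 255) for v in values]
mean_uint8 = sum(clamped) / len(clamped)  # like mean(mean(b)) -> 203.184
```

The same kind of silent clamping or integer rounding in one toolchain but not the other would explain small but consistent differences between the two mean calculations.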