Weekly Post #6 - De Luca Fuzzy Entropy is a-go!

I think...

With mid-project demos looming, this week has been focused on getting De Luca & Termini's fuzzy entropy working.

This week:

  • Tuesday was my birthday 🎉
  • Finished the poster, which was due Wednesday
  • Created a toy example to run De Luca on (sort of)
  • Fixed the De Luca fuzzy entropy
  • Fixed the initial membership test


Toy example

Last week I wrote that I didn't think the De Luca algorithm was actually working, as the output mean images never changed. Neil made a good point on Monday: MNIST data is binary, so there are very few grey-level values even in the mean image, and this could be contributing to the lack of change.

He suggested I create a 'Toy Example': a small set of input data that will clearly show whether improvements are being made or not.

MNIST grey

This is a small subset of the MNIST data, in which I have tweaked the grey-level values of most of the 0s.

This has raised a new issue: reading .pgm files into the Congealing code, and ensuring that the header of the .pgm file is constructed appropriately.

MNIST header

This is the header of train_30_shuf.pbm as supplied in the demo Congealing code. It's worth noting that while line 2 is a comment (denoted by the # symbol), this line is actually read in and used in the load function.

pgm header

Now let's compare the header of my toy example.
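Reconstructed from the notes below (the GIMP comment text here is a placeholder - the real one is whatever GIMP wrote), it looks something like this:

```
P5
# <comment added by GIMP>
87 83
255
```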

P5 denotes that it's a .pgm file (P4 = .pbm)

Comment: GIMP is a pain and adds its own little comment on the second line - need to be able to delete / work around this

87 83 is the number of columns (image width) and the number of rows (image height)

255 is the maximum grey-level value in the image

So, let's add that special comment from the demo pbm, tweak the code to skip the GIMP comment line, and hey presto, right?

Added line

Broken toy example

Nope.

To do: how do I add this line without breaking the file? And how do I generate it automatically?!
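One idea for the generation side: a minimal sketch (untested) of writing the .pgm straight from MATLAB, so the header, including a fixed comment on line 2, is built exactly the way the loader wants and GIMP never gets involved. writeToyPgm is a hypothetical name, and the comment text is a placeholder:

```matlab
function writeToyPgm(filename, img)
% Write a grey-level image as a binary (P5) .pgm, with the comment line
% the Congealing loader expects placed on line 2.
% img: uint8 matrix of grey-levels (max 255).
fid = fopen(filename, 'w');
fprintf(fid, 'P5\n');
fprintf(fid, '# <comment line the loader expects>\n');  % placeholder text
fprintf(fid, '%d %d\n', size(img, 2), size(img, 1));    % columns, then rows
fprintf(fid, '255\n');                                  % maximum grey-level
fwrite(fid, img', 'uint8');   % transpose: the pgm raster is row-major
fclose(fid);
end
```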


De Luca & Termini Fuzzy entropy

Now onto the big one of this week.

I fed the 'Toy Example' (from above) into my code. And still no luck.

SO... after 2 days sat staring at the MATLAB debugger and printing numerous figures out to file, I finally tracked down all the little issues.

First of all, I removed the original implementation of a lookup table - whilst this might be quicker, I could not get it working, and a simple cell array will do for now. Some error checking also needs to be thrown in: anything with a membership of '1' (which is quite a lot of grey-levels) returns NaN, so the entropy for these is just returned as 0.
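For reference, the De Luca & Termini measure sums S(mu) = -mu*ln(mu) - (1-mu)*ln(1-mu) over every membership value, so the per-pixel term now looks roughly like this (a sketch - deLucaPixelEnt is just an illustrative name, not my actual function):

```matlab
function pixelEnt = deLucaPixelEnt(mu)
% De Luca & Termini entropy term for a single membership value mu in [0,1]:
%   S(mu) = -mu*ln(mu) - (1 - mu)*ln(1 - mu)
% At mu == 0 or mu == 1, MATLAB evaluates 0*log(0) as NaN even though the
% limit is 0, so return 0 explicitly for those cases.
if mu <= 0 || mu >= 1
    pixelEnt = 0;
else
    pixelEnt = -mu*log(mu) - (1 - mu)*log(1 - mu);
end
end
```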

The next issue was adding each individual entropy value to an 'imageEnt' array to store them all.

imageEnt as cell

After initialisation, imageEnt is a 14x14 cell (hardcoded size, but we can change that later), the same as imgMu.

imageEnt error

Add something to the array and it switches to being a double, without warning.

And so the debugging continued...

Then I realised I had made a rookie mistake setting:

imageEnt = pixelEnt;  

so of course it was a double - I was just setting imageEnt to each individual pixel's entropy, overwriting the previous one.

imageEnt{i} = pixelEnt;  

Showing imageEnt with values in

That's better.
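In miniature, the gotcha looks like this:

```matlab
pixelEnt = 0.5;             % some per-pixel entropy value
imageEnt = cell(14, 14);    % initialised as a cell array

imageEnt = pixelEnt;        % WRONG: silently replaces the whole cell with a double

imageEnt = cell(14, 14);    % start again
imageEnt{1} = pixelEnt;     % RIGHT: stores the value inside the cell, stays a cell
```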

Now to get the mean entropy for the image and also introduce 1/n (for the constant K in the De Luca equation).

If you sum the imageEnt array, you end up with a smaller array of summed columns, so a second sum is needed to combine the columns into a single value.

ent array

Entropy sum
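Putting those two sums together (a sketch, continuing from the imageEnt cell above; entMat and n are my names for illustration):

```matlab
entMat = cell2mat(imageEnt);           % 14x14 cell -> 14x14 double matrix
n = numel(entMat);                     % number of pixels, so K = 1/n
colSums = sum(entMat);                 % first sum collapses columns -> 1x14 row
imageEntropy = (1/n) * sum(colSums);   % second sum gives the single mean entropy
```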

The naming leaves a lot to be desired; however, I'm currently working to the same standard as Learned-Miller in his original standard-entropy code.

Maybe a different naming standard would make it explicitly clear which code is my own?

So going back to running this code on binary MNIST data, I think this represents it quite well:

Entropy not changing

MNIST output

As you can see, the mean image is changing, but really very minimally.

Now take the toy example:
Toy entropies

Toy example output

The output leaves a lot to be desired; however, it is changing and attempting to reduce the entropy. I wonder, though: is it normal for it to increase on some iterations? Something to ask Neil on Monday.

Upon checking the toy example file I fed in, it appears that it had actually reverted back to the original MNIST data input. However, this is a good sign that it is working on binary data. Fingers crossed for the non-binary!

Why the odd output on the right? This I will dedicate a blog post to!


Next week

(Well this week now...)

  • Post this blog post (done!)
  • Write blog post on over-congealing and local optima
  • Look into the saveSeries function (could be clues here for pgm header formation)
  • Start GUI ready for mid-project demo
  • Create 3 or 4 slides on my progress so far for mid-project demo