Here is the Ph.D. thesis on Physics and Computation. It looks really interesting!

# Author: Abhishek Behera

# Must Read

This is a paper by E. T. Jaynes called: Information Theory and Statistical Mechanics. It is one of the most influential pioneering papers of this field, and I have to read it! I suspect much of the content of this paper might already be familiar to me, having been transmitted to me through folklore, but still, this would be fun!

And this is a paper by I. Csiszár called: I-divergence Geometry of Probability Distributions and Minimization Problems, which is another classic. This is one of those papers that I have to read very carefully.

p.s.: I recently read this paper: Learning Markov Structure by Maximum Entropy Relaxation. It is very exciting work. I am presenting the paper on Monday to my guide.
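A minimal sketch of the central idea in Jaynes' paper, using his classic dice example (all numbers here are the toy setup, not anything from the papers above): among all distributions on faces 1..6 with a prescribed mean of 4.5, the maximum-entropy one is an exponential tilt, and the Lagrange multiplier can be found by bisection.

```python
import math

# Jaynes' dice illustration: among all distributions on faces 1..6 whose
# mean is 4.5, the maximum-entropy one has p_i proportional to exp(lam * i).
faces = list(range(1, 7))
target_mean = 4.5

def mean_for(lam):
    """Mean of the tilted distribution p_i proportional to exp(lam * i)."""
    w = [math.exp(lam * i) for i in faces]
    Z = sum(w)
    return sum(i * wi for i, wi in zip(faces, w)) / Z

# mean_for is increasing in lam (3.5 at lam = 0), so bisect for the multiplier.
lo, hi = 0.0, 5.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

w = [math.exp(lam * i) for i in faces]
Z = sum(w)
p = [wi / Z for wi in w]
print("lambda =", round(lam, 4))
print("p =", [round(pi, 4) for pi in p])
```

The resulting distribution tilts probability toward the high faces, exactly as Jaynes' principle prescribes when only the mean is constrained.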

# Gathering Thoughts

**Introduction:**

I was upset when I learnt that even at the rate of seeing one person a second, one lifetime is very inadequate to meet all the people on earth.

Somewhere along the way, I have been thinking about the choices that hypothetical people would make in a myriad of contexts and was thinking about some kind of ‘a grand narrative’ that would subsume all.

Well, no one can see the grand narrative; everyone has access to only a limited, abridged edition of it. Then how could all people possibly ever *conspire* to make everyone *happier*?

**Progress:**

If one were hoping that the above question has an easy answer that could be written down in a casual blog post like this, then they are only fooling themselves.

I first thought about the above problem several years back, as an undergraduate student, and clearly, back then, I was more in a state of stupor or awe, with absolutely no dreams of ever solving it.

All I am saying now is that from that day I think I have made progress. So let's put together all that I now know about how-to-attempt-a-solution and curate all things useful (the map to El Dorado, if you wish to believe):

- In differential geometry, one learns about doing patchwork, i.e., how to put together parts in a coherent manner to make a whole. Walking this way leads to Information Geometry.
- People have solved optimization problems via message passing. This Ph.D. thesis could be a starting point to know more. Also there is a connection between Information Geometry and Message Passing, for example this Ph.D. thesis.
- One might be interested in ideas as employed in Analog Logic or Statistical Computing, where people solve problems with message-passing-like ideas.
- This post on azimuth starts with an intriguing and similar premise. But I don’t know enough about it to say much.
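To make the second point concrete, here is a minimal sketch of message passing solving an optimization problem, on a made-up chain of three variables (all costs below are hypothetical, chosen just for illustration): min-sum messages flow along the chain, and on a tree-structured problem they recover the exact optimum.

```python
from itertools import product

# Min-sum message passing on a toy chain x0 - x1 - x2, each variable
# taking states {0, 1, 2}; all pairwise costs are made up for illustration.
K = 3
c01 = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]  # c01[x0][x1]
c12 = [[3, 5, 8], [9, 7, 9], [3, 2, 3]]  # c12[x1][x2]

# Forward messages: m1[x1] is the cheapest cost over x0 consistent with x1,
# and m2[x2] folds in the best choice of x1 as well.
m1 = [min(c01[x0][x1] for x0 in range(K)) for x1 in range(K)]
m2 = [min(m1[x1] + c12[x1][x2] for x1 in range(K)) for x2 in range(K)]
best = min(m2)

# Brute-force check over all 27 configurations: on a tree (here, a chain)
# min-sum message passing solves the optimization exactly.
brute = min(c01[x0][x1] + c12[x1][x2]
            for x0, x1, x2 in product(range(K), repeat=3))
print(best, brute)  # prints: 4 4
```

The same local-to-global flavor (messages assembling a global answer from local pieces) is what the patchwork analogy above is gesturing at.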

# Textbook

“Heaven is a place on earth with you

Tell me all the things you wanna do”

I found this book and I find it very interesting. I want to read it. I read chapter 3, and my impression is that, unlike Huang or Pathria, this book doesn't fuss as much about the physics and is more mathematical, which is to my liking. There could of course be even more mathematical books (which would take me a thousand years to parse). This book seems right for my level of maturity. And first impressions could be wrong.

p.s.: I apologize for the unrelated quote at the top and I am not very proud of the source!

# For salvation's sake!

I want to read this Ph.D. thesis and understand it. How much of it will be possible? I don't know. But certainly, reading something like this always offers one an escape from the banalities of everyday life! It is as if I am trapped, limited by my own capabilities. I want to transcend them.

# Evolution and Game Theory

Here is a paper co-authored by Umesh Vazirani: Algorithms, Games and Evolution. It is certainly a paper worth reading. At this moment, I wouldn't read it fussing over all the detailed nuances, but rather read it to get the big ideas that I may use later. I would spend some time thinking on my own about what I learnt before plunging into the travails of finicky notation and detail.

# A paper by Gromov

I haven't started reading information geometry, and now it feels it will take a while (maybe after February, or maybe after July) before I start. I am kind of busy with a few other things.

In the meanwhile, I found this paper which I would certainly want to take a look at (because: 1. It is Gromov 2. writing about things I am interested in 3. and because it is Gromov! 4. Why is Gromov here? 5. What else is Gromov into? 6. You get the idea!). So let's put it on the calendar, lest I forget: tentatively somewhere in December I will take a superficial view of this paper. I will look for mentors who can help me read this kind of work.

p.s.: Sorry no more chess analysis. No time for it for years to come.

# Interesting Position

Here is a position that arose after move 12 in a game against Stockfish:

1.d4 Nf6 2.Nf3 e6 3.c4 c5 4.e3 cxd4 5.exd4 Be7 6.d5 exd5 7.cxd5 d6 8.Be2 O-O 9.O-O b6 10.Nc3 Bb7 11.Re1 Nbd7 12.Bf4 Rc8

The continuation 13. Nd4! by Stockfish is brilliant and I wanted to make a note of it. (I would like to look at this position more carefully later, and hopefully I will!)

And here is the rest of the game:

13.Nd4 Nxd5 14.Nxd5 Bxd5 15.Nf5 Nf6 16.Nxe7+ Qxe7 17.Ba6 Qd7 18.Bxc8 Rxc8 19.Bg5 1-0

(I am at the moment struggling to find a widget/plugin for WordPress that could show a board. The whole process is a bit more involved than I have time for right now. So, let's see if we can fix that!)

# Preview

Over the next couple of months I am going to learn Information Geometry! And here's the exciting part: I am going to make notes and make them available on this blog. Also, I will be learning some differential geometry, category theory and other things as and when required.

# Let's start!

I just want to LaTeX anything right now! So let's do this:
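One standard way to write it, for a factor graph with factor potentials $f_a$, factor beliefs $b_a$, variable beliefs $b_i$, and variable degrees $d_i$ (this is the textbook form; the notation here is my own choice):

$$F_{\mathrm{Bethe}} \;=\; \sum_{a}\sum_{x_a} b_a(x_a)\,\ln\frac{b_a(x_a)}{f_a(x_a)} \;-\; \sum_{i}\,(d_i-1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i)$$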

The above is called Bethe Free Energy. For a graphical model, the above is an approximation to the actual Free Energy.
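A tiny sanity check of that statement, on a made-up two-factor binary chain (all potential values are arbitrary positive numbers, chosen just for illustration): when the beliefs are the exact marginals, the Bethe free energy on a tree is not merely an approximation but coincides with the true free energy $-\ln Z$.

```python
import math
from itertools import product

# A toy binary chain x1 - x2 - x3 with two pairwise factors; all potential
# values below are arbitrary positive numbers, made up for illustration.
f12 = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 0.5, (1, 1): 1.5}
f23 = {(0, 0): 0.7, (0, 1): 1.2, (1, 0): 2.0, (1, 1): 0.3}
states = [0, 1]

# Exact partition function and joint distribution.
Z = sum(f12[x1, x2] * f23[x2, x3]
        for x1, x2, x3 in product(states, repeat=3))
p = {(x1, x2, x3): f12[x1, x2] * f23[x2, x3] / Z
     for x1, x2, x3 in product(states, repeat=3)}

# Use the exact marginals as beliefs (on a tree, belief propagation
# converges to exactly these).
b12 = {(x1, x2): sum(p[x1, x2, x3] for x3 in states)
       for x1, x2 in product(states, repeat=2)}
b23 = {(x2, x3): sum(p[x1, x2, x3] for x1 in states)
       for x2, x3 in product(states, repeat=2)}
b2 = {x2: sum(p[x1, x2, x3] for x1 in states for x3 in states)
      for x2 in states}

# Bethe free energy: factor terms, minus the entropy of x2 counted once
# too often (x2 is the only variable of degree 2 on this chain).
F_bethe = sum(b12[c] * math.log(b12[c] / f12[c]) for c in b12)
F_bethe += sum(b23[c] * math.log(b23[c] / f23[c]) for c in b23)
F_bethe -= sum(b2[v] * math.log(b2[v]) for v in b2)

# On a tree the Bethe free energy equals the exact free energy -ln Z.
print(abs(F_bethe - (-math.log(Z))))  # ~0
```

On loopy graphs this identity fails, which is exactly where the "approximation" in the statement above starts to bite.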

p.s.: I haven't thought ahead much about how to use this blog. I am thinking of reviewing papers and writing about things while I learn. But let's see.