A Computational Diary, A Hacking Diary, and the Dead End Philosophical Diary

Irony or Paradox or Contradiction?  You decide.  I am posting a “blog” at a WordPress site in which I declare that I am not blogging, but simply keeping track of non-bloggable projects.

  1. My sketch pad has turned into a math diary.  I call this “A Computational Diary” since most of the mathematical ideas get transformed into magic recipes that I type at the command line, recipes in Python and C.  I like to be able to turn functions from Python modules into C or C++ programs that can be run from the command line with parameters (see the sketch just after this list).  So, that stuff is non-bloggable unless I were to get my own site where I uploaded scanned copies of my notes.  That might work if many people were interested in what I was computing, but, for me, it’s just not necessary.
  2.  My “philosophical diary” has turned into a “hacking diary”.  I call it The Book of Nonsense.  It has many notes about what I am currently studying … it has become a kind of supplemental text to the math diary.
  3. So, this morning, as I changed the Books of Nonsense from “philosophical diaries” to “hacking diaries”, I wondered where I will write when I want to philosophize.  Eureka, I already have Dead End: A Philosophical Diary.  I’ll just add chapters as needed.
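
For instance, a command-line recipe can be as simple as this minimal Python sketch (the gcd function is just a stand-in example of mine, not one of my actual recipes):

```python
import sys

def gcd(a, b):
    """Plain Euclidean algorithm: the kind of function I pull out of a module."""
    while b != 0:
        a, b = b, a % b
    return a

if __name__ == "__main__":
    # Run from the command line with parameters, e.g.:  python gcd.py 240 46
    a, b = int(sys.argv[1]), int(sys.argv[2])
    print(gcd(a, b))
```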

It will come down to this:  Whatever can be most easily communicated with alphabetic symbols will end up being typed into forthcoming chapters of Dead End.  Like this blog, it’s not really a project so much as a bucket waiting to catch rainwater.

Maybe that’s why I called it Dead End.  It’s all I will need for such “meditations” …

That’s my story and I’m sticking to it.  Now, back to the Extended Euclidean Algorithm … I want to alter some code so that I can display the entire process of applying it.  It’s a pedagogical exercise.
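
The shape of what I am after is something like this minimal sketch (off-the-cuff Python of my own, not the actual code I will be altering):

```python
def extended_euclid_verbose(a, b):
    """Extended Euclidean Algorithm that shows its work at every step."""
    old_r, r = a, b      # remainders
    old_s, s = 1, 0      # coefficients of a
    old_t, t = 0, 1      # coefficients of b
    while r != 0:
        q = old_r // r
        print(f"{old_r} = {q} * {r} + {old_r - q * r}")
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
        old_t, t = t, old_t - q * t
    print(f"gcd({a}, {b}) = {old_r} = {a}*({old_s}) + {b}*({old_t})")
    return old_r, old_s, old_t
```

Calling extended_euclid_verbose(240, 46), for instance, prints each division down to gcd(240, 46) = 2 = 240*(-9) + 46*(47).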

7 thoughts on “A Computational Diary, A Hacking Diary, and the Dead End Philosophical Diary”

    • No, not a computer genius … More like a “sub-genius” …

      No, I’m not at all like Mitnick. I’m more interested in code that carries out basic numerical operations which are mechanical and tedious, the kind where we are prone to make many arithmetic errors: the extended Euclidean algorithm, say, or row-reduction on matrices. I hunt down math code and study it. I then manipulate the code in crucial areas so that it will print out meaningful information about what’s going on with its computations.

      If nothing else, by the time I am through forcing the code to be instructive rather than mysterious, I have a better understanding of the math.

      It might even be a kind of “art form” where one takes the time to extract the mathematical knowledge that is hidden in the while loops.

      It’s nothing too sophisticated.

      I guess I might be involved in a primitive form of developing educational software. It’s just a hobby, but the process is stimulating in that I have to get into textbooks that go over the elementary and fundamental ideas motivating the “algorithms” which become “recipes” or “scripts”.

      I always start with print statements placed where the action is taking place in some loop. It starts off like hard-coding an onboard debugger directly into the program, showing when the values of variables change; then, slowly but surely, I go about BEAUTIFYING the output until the “magic wizardry” taking place at the speed of light is ultimately demystified.

      You might say I take snapshots of the arithmetic operations so as to display them in the manner human beings have been computing since ancient times … more like we would do with pencil and paper.
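
      To make that concrete, here is a toy sketch of my own (not code from any particular source) showing the two stages on a plain Euclidean gcd loop:

```python
def gcd_raw(a, b):
    # Stage 1: hard-code the "onboard debugger" -- dump the variables
    # every time around the loop.
    while b != 0:
        print("DEBUG: a =", a, " b =", b, " a % b =", a % b)
        a, b = b, a % b
    return a

def gcd_pretty(a, b):
    # Stage 2: the same loop, output beautified into the divisions
    # we would write out with pencil and paper.
    while b != 0:
        print(f"{a} = {a // b} * {b} + {a % b}")
        a, b = b, a % b
    return a
```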

      So, again, I am no genius. The more I study, the more aware I am of just how little I understand, so I stick to what some whiz kid might pretentiously call “trivial” cases.

      The cool thing though is that, once I have the code beefed up to explain itself, even though I am working with trivial cases, I can then feed it “more difficult (for human beings) input” and watch it spit out the necessary computations flawlessly. With a determined attitude I use brute force to make the code “show its work” with cases I can keep track of on paper, and once I am confident that the logic is correct, the code shows its work with more complicated cases. So, there is a kind of method to my madness after all.

      That’s the underlying unconscious motivation, I guess.

      I must have a quasi-mystical relation to mathematical computation, since I clearly remember two specific instructors I was fortunate enough to encounter at a community college at the very end of the 20th century who enhanced my appreciation of electro-digital computers and mathematics. There was a time I was anti-computer, and I even rebelled vehemently against the forced instruction of Computer Science when I was in high school (1980s).

      One of the instructors who got me over this prejudice was a Calculus professor from India who not only reawakened my love of Algebra but also showed me (how to say this?) that f(x) = y is like a computer function: it takes the input in through the parameter x and shoots the output out as the result y. Ever since then, whenever I create a function on the fly, I just name it f, not foo or anything more descriptive … Yes, I am also partial to the independent variable x.

      The other instructor was a woman (a woman in Computer Science???!!!???) who taught a C++ class and made us create an “object-oriented” program around a class called Fraction. She showed us how to instantiate “instances” of Fraction objects with attributes like numerator, denominator, and whole, and that is how she taught us “operator overloading,” since our program had to add, subtract, multiply, and divide fractions (something like the sketch below).
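
      The assignment was in C++, but the same idea in Python would look something like this sketch (the names here are my reconstruction of the assignment, not her actual code):

```python
class Fraction:
    def __init__(self, numerator, denominator):
        if denominator == 0:
            raise ValueError("denominator cannot be zero")
        self.numerator = numerator
        self.denominator = denominator

    @property
    def whole(self):
        # The whole-number part of the mixed-number form.
        return self.numerator // self.denominator

    # Operator overloading: these methods are what make f1 + f2,
    # f1 - f2, f1 * f2, and f1 / f2 work on Fraction objects.
    def __add__(self, other):
        return Fraction(self.numerator * other.denominator
                        + other.numerator * self.denominator,
                        self.denominator * other.denominator)

    def __sub__(self, other):
        return Fraction(self.numerator * other.denominator
                        - other.numerator * self.denominator,
                        self.denominator * other.denominator)

    def __mul__(self, other):
        return Fraction(self.numerator * other.numerator,
                        self.denominator * other.denominator)

    def __truediv__(self, other):
        return Fraction(self.numerator * other.denominator,
                        self.denominator * other.numerator)

    def __str__(self):
        return f"{self.numerator}/{self.denominator}"

print(Fraction(1, 2) + Fraction(1, 3))   # prints 5/6
```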

      It was very intense for a 32-year-old beginner, and I was shocked at how COOL it was. How could this be FUN? I don’t know. It was a tremendous amount of fun. It made me resent riding around on a tractor all day and cleaning toilets for a living. I was missing out. Studying had become something “forbidden” and “useless” in a workaday world where our major problems involved machinery that backfired or wouldn’t fire up.

      My foreman at the time was a brainiac in a monkey suit who loved to collect junked computers and get them to operate. He also had some kind of mystical influence on me. I wanted to understand what was going on underneath the hood beyond the technical difficulties and parts and wires …

      Anyway, a year or so after leaving that job due to some “psychotic episode disaster”, I was still stuck with this intellectual fever, and in the summer of 1999, using what I had learned at the community college, I created a very complicated program that stored prime factors into arrays (in C, then C++) … I still have that original code. It gets segmentation faults when the numbers are too large.

      Now I use code that is only about 20 lines. Very elegant and SIMPLE. It gets the job done without creating arrays and passing them around as parameters. I was dumbfounded when I realized how simple it could be. Back in 1999 I had the Knuth texts out and made something extremely complicated. It turns out I could have followed the same logic that my young nephew and I had used in our heads (computers made of meat) when we would play a game while cleaning carpets at an apartment complex. We played this game to pass the time, giving each other all the time we needed on the other’s turn as the numbers got larger and larger. The game involved saying whether a number was prime or not. If not, we had to name the prime factors of that number.

      We did just what this later discovered (elegant) code does:

      Is it divisible by 2? If so, divide by 2. Keep doing that until it is an odd number.

      From then on, is it divisible by 3? 5? 7? etc …

      The code is only a couple loops!
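
      In Python, my reconstruction of that elegant little program (a sketch, not the actual code I use) comes to roughly this:

```python
def prime_factors(n):
    """Trial division: the same game we played in our heads."""
    factors = []
    while n % 2 == 0:        # divisible by 2? divide until it is odd
        factors.append(2)
        n //= 2
    d = 3
    while d <= n:            # from then on: 3? 5? 7? ... (odd candidates only)
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 2
    return factors
```

      prime_factors(1998) gives [2, 3, 3, 3, 37], for example. As written, the loop keeps testing candidates all the way up to n itself; the square-root shortcut I get to below trims that dramatically.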

      Of course, as a fellow human being you can imagine how, for us, it got out of hand quickly … or, I should say, we would only get so far before we wanted to unplug the vacuum cleaners and grab a fucking pencil.

      What we didn’t know back then is that you only have to check for divisibility until the trial divisor i reaches the square root of n, a fact that made itself clear to me one day when I was stuck in a jail cell “doing prime factorization” for kicks.

      I filled up page after page of scrap paper with the long divisions … and it did dawn on me that when the divisor and quotient started to approach the same size, I ought to think about roughly where the square root of the dividend lay … so that I could say I was done.

      A friendly arithmetic reminder: dividend ÷ divisor = quotient. (It helps to use these words. I only use them to force myself to communicate as clearly as possible.)

      In other words, if the number was close to 160000, I knew I could stop when the divisor was over 400, since 400 × 400 = 160000.

      That’s a big difference from just grinding out senseless divisions. I mean, even in a jail cell, you don’t want to do any more work than you have to, especially when you are doing this for “kicks” — some kind of mental stimulation.
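
      In code, that jail-cell insight is a two-line change to the sketch above:

```python
def prime_factors_fast(n):
    """Trial division with the square-root cutoff."""
    factors = []
    while n % 2 == 0:
        factors.append(2)
        n //= 2
    d = 3
    while d * d <= n:        # stop once d passes sqrt(n); near 160000, stop past 400
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 2
    if n > 1:                # whatever remains at the end is itself a prime factor
        factors.append(n)
    return factors
```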

      And here I have reached an answer to your question in a roundabout way. You don’t have to be a “genius” in order to be mentally stimulated by ideas!

      By the way, thank you so much for your comment. It actually inspired me to type something that may be worthy of “the great experiment,” which I have decided will just become the N+1 chapter of Dead End … a work in progress that ends when I croak.

  1. Señor Mike,
    My God, you are really a computer wizard. I am sure a man with your knowledge is going to be well paid by any company. But I hope you keep writing your thoughts in this blog. Be safe. Raul.

    • I don’t think any company will hire me since I don’t apply for any jobs. I’m not really “company” material. I’m just a lone intellectual … useless to the captains of industry.

  2. Yes, Señor Mike, the captains of industry are slave owners, and we see them in history books or in the newspapers like heroes. It is useful to be useless to them. Be safe. Raul
