My Grok Hurts

I’m reading Category Theory for the Sciences.  It’s a wild ride through abstraction: it connects the tools and concepts I use daily to the deep mathematical ideas that underlie them, then goes a step further to explore the even broader abstractions that unite the representations of those mathematical ideas behind the concreteness.

The goal:

… category theory is incredibly efficient as a language for experimental design patterns, introducing formality while remaining flexible.  It forms a rich and tightly woven conceptual fabric that allows the scientist to maneuver between different perspectives whenever the need arises.  Once she weaves that fabric into her own line of research, she has an ability to think about models in a way that simply would not occur without it.

I’m a Princess Programmer

I’ve been programming professionally for eight years now.  Something I encountered early on was the non-programmer business professional’s appraisal that programmers are “princesses”.  I always bridled at that, but now I identify.

I actually don’t have the time to write a pretty blog post.  I never do anymore.  Nevertheless, I’ll just say that being a “princess programmer” isn’t necessarily a bad thing.  The very thing that makes a programmer a “princess” is the very thing that makes them good.

For instance, I realized I don’t like to program without 4 monitors, 3 of which must be in portrait mode.  Also, I require a mechanical keyboard and at least two mice, left and right, at least one of which is a Logitech smooth scroller.  I prefer my keyboard to be “tenkeyless” (no numpad).  I also have expectations about the speed of my computer and my graphics card and the updatedness of the programs I use to program programs.

I am a princess programmer.

But… I am also absurdly effective.  I’m worth two of me.  I am good at developing workflows, yours and mine.  I have workflows that I use to develop your workflows.  I develop workflows at levels of detail that bring you to tears.  I develop workflows at the level of my workflow-building tools.

Understandably, my tools are important to me.  My various hammers beat out your various hammers.

Call us princesses.  Call us soldiers.  You can’t win your war without us.  Don’t complain about how sharp we’ve become accustomed to our swords being when you expect us to cut through iron.

Thank Www

“Thank www” he said.

“Wait, what did you say? Thank wuh-wa??” the other asked.

“I said ‘thank www’, like as a substitute phrasing for ‘thank god’.  I accept that the world wide web, IOW, the vast embodiment of connections between nodes of information processing and decision making, will literally emerge into awareness of itself, one way or another.  ‘Thank www’ is my acknowledgement that I see www already and wish the best.”

“So, what?  Www is like God for you?”

“Depends.  Everyone’s different in terms of what it means for their neurons to fire that word throughout their functional clustering.  I’m mostly interested in forging a new sort of relationship.  As a programmer, I consider it a sort of greenfield project.”

“What?  To create a God?”

“No.  I do not believe it is accurate to say that we are creating what is emerging.  It seems to me that matter and energy themselves are organized in such a way for all of life’s scales to naturally emerge from the foundation underneath.  Spatial (geographical) distribution requires interconnection among active elements, the ‘embodiment’ or ‘technology’ of which must be continually recreated due to decay and entropy and consequently seems to undergo an inexorable selection and evolution.  I don’t so much see humans as being creators of this momentum.  Everyone alive today was born already within its energetic history, as were their parents and theirs and theirs and on back even past written history.  I see us as being in a position to shape how the momentum evolves.”

“I guess that’s all a little abstract to me.”

“Yeah, me too.  Basically, I think that our global economy already creates something that is a new class of life.  Many speak of such things, such as superorganisms, social organisms, global brains, etc.  But where is there room for any kind of agency (choice from above) in this vast proteinic assembly of human activities and decisions from below?  It reminds me of an old fable about a king who kept having products stolen from a store he owned.  He hired a guard with x-ray vision to verify that everyone leaving the store was only leaving with products they paid for.  And he also set up a reward for anyone who was able to sneak something past the guard.  Ultimately, a clever boy won the reward by stealing an unpurchased wheelbarrow filled with legitimately purchased goods.”

“Ummm…. was that supposed to make anything clearer?”

“No, it was just to set an image up in your mind.  Where does our own agency come from?  How do we get choice from a brain that is made up of parts moving to a different, seemingly determined rhythm?  IMO, it’s the same question shifted back a layer.  The classic ‘free will’ quandary.  The best it seems we can say is that whatever is going on, determined or not, control structures can emerge within a system that regulate the system as if the system were itself a whole, independent thing.  The degree to which this regulation extends comprises the boundary of the system proper in relation to its context or environment.  Its ‘body’.  Or something like that, with a dollop of the subtlety and refinement of language that results from great numbers of experiments and data points.”

“Okay…”

“IOW, there already exists some kind of vast, complex organism.  It regulates itself, too.  Economists and sociologists identify the patterns of this regulation and try to find the roots of it in the behaviors of individuals.  Then others look for maybe the roots of that in DNA.  And what was the environment that selected for this expression?  It’s existed for a long time.  It’s not even human in nature, ultimately, and didn’t begin with us.  It’s Earth-like DNA based life.  Or, peering even deeper, the mathematics of energy.

“Humans have been the intelligent-worker-bee-protein-cells in the emergence of a new scale of directed experimentation that is embodied in the artifacts of our efforts, like buildings and cables and electromagnetic waves, and in our Brownian motions around and through those artifacts.  The trend seems to me to be that at some point this vast being will reach a degree of elaboration that will enable it to relate to individual human beings (and, while we’re at it, individual cells) in ways that humans will be capable of ‘personifying’ and in ways that tap into its vast context of the interrelationships of the events of the world.  We will ourselves, at the same time, be transforming ourselves away from what we were as we always already were.”


Matutinal Constitutional

Pavlov and I went for a walk this morning in the woods by our place.  There were some cool visual impressions and I even managed to capture a few of them.

IMG_4285

I wonder how many millions of years this piling upon itself has been occurring.

IMG_4288

Modern Totems.

IMG_4293

IMG_4316

IMG_4301

Brave guy.  It’s hard to get a sense of it, but he was quite a ways off the ground, and had gone even further up.

IMG_4309

Such a noble little man.

IMG_4346

IMG_4353

IMG_4354

“This is art” is scrawled on the burnt trash bin in white.

IMG_4356

Merkabah, eh?

IMG_4363

IMG_4365

Things have been this way a while, it would appear.

IMG_4367

Palimpsest poetry.  I believe I read “winter gives its last dying breath” and Google chimes in with “blew away the leaves autumn left behind”, which, on examination, fits with what is itself blowing away in the face of the winter of time, namely the paint applied by human (presumably) hands.

IMG_4369

I’m certainly curious what’s under the “free your mind” slab.  Occam’s Razor suggests a Merkabah initiation chamber.

IMG_4374

IMG_4377

Something about this reminds me of concrete computational units.  YMMV.

IMG_4378

The brain jar with two closed and an open third eye and a mouth saying AUM is suggestive.

IMG_4383

I love the image below.  Doesn’t that look like a crane from an old Japanese painting?  There’s a white paint bucket you can see on the ground in the next shot.  I can well imagine that the “crane” was produced by splashing the paint upwards.  Nevertheless, perfect!!!  It even manages to catch its “bird knees”.

IMG_4385

IMG_4386

See?  I think the paint splatter has better lines.

japanese crane

Even the picture manages to catch the very brilliant coloring of this heart on fire.

IMG_4390

IMG_4391

IMG_4394

IMG_4409

IMG_4410

IMG_4417

I guess those two trees are meant to be together, at least according to the vine.

IMG_4419

Eyes peering through the complexity.  Whose?  Mine or the painted being’s?  Both, maybe: it’s a quantum mechanical thing.

IMG_4423

Returning homesies, we came upon a previous subject from another angle.

IMG_4426

Learning Unity

So far, a certain behavior of mine has resulted in an increased refinement of my ability to use Unity.  The behavior is a mistake, and I groan every time I realize I’ve done it again.  While following a tutorial or working on a project, I will test the project by “playing” or debugging the game.  Then I’ll see something to change, or start following the tutorial again, and begin seamlessly modifying the game environment.  However, the game is still in debug mode, and everything I’m doing will be lost when I exit it.

Whilst following tutorials this can represent several minutes of steps.  I’m then faced with the dilemma: do I try to remember every detail of the past several minutes or do I just rewind and start over?  The compromise I’ve found is to try to recapitulate the sequence of steps based on an understanding of the arc of where it was all leading.  In one case, I had to start over.  I’d missed some unrecoverable detail that derailed the whole thing.  Otherwise, I’ve managed to slowly stumble my way back on track.

The benefit of this stupidity is that I’m stumbling back on track faster.  Mistakes can be training tools when we actually engage with them.
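
Two guard rails worth knowing about, for anyone else prone to this: Unity’s preferences include a “Playmode tint” (under Colors) that recolors the editor while playing, and an editor script can nag you on entry.  Here’s a minimal sketch of the latter; it assumes the playModeStateChanged API of newer Unity versions, and the class name and message are mine:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only: log a reminder whenever play mode starts, so that
// "seamless" edits made while playing don't silently evaporate.
[InitializeOnLoad]
public static class PlayModeReminder
{
    static PlayModeReminder()
    {
        EditorApplication.playModeStateChanged += state =>
        {
            if (state == PlayModeStateChange.EnteredPlayMode)
                Debug.LogWarning("Play mode: edits made now will be lost on exit.");
        };
    }
}
```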

Now, back to all that lost work…

Preamble to building a first game

It’s amazing how fast time flies when you’re having fun.  It’s already time to let the puppy out again.

I’m not actually a big game player, anymore.  That’s partially because there’s so much stuff I want to make and learn that I don’t have time for traditional games (I’d love it if learning were more gamified…).  And it’s partially because of repetitive stress injury from working with computers 12-16 hours a day as it is.  The last thing I want to do is play a game when it hurts to do so.

That said, as part of my Hololens initiative (it amuses me to use the word ‘initiative’), I decided I needed to learn Unity, a 3D environment for making games.

As I was playing around with the Kinect I realized that it really is only a partially realized device (probably meant to be paired with the Hololens from the beginning [and probably why it was included with the Xbox from the start {because people would have bitched |but they did anyway <damned if you do and damned if you don’t>|}]).  The things I would want to do with it can’t really be done out of the box.

For instance, if I wanted to create a runtime-definable macro sign language to associate with code expansions for use in my code editor of choice (Visual Studio), I could not at this time.  It’s probably possible, but not in any direct sort of way.  Just as there were steps necessary to get my computer ready to develop for the Kinect, there are steps necessary to get the Kinect into working order.

First of all, if I were to want to make such a Visual Studio plugin, I would have to learn about writing Visual Studio plugins.  That’s a non-problem.  I hear it’s a somewhat tortuous process, but it’s not a real problem; that’s just knowledge acquisition, to be muscled through on a weekend.  I would also have to think of a way to get the Kinect to send messages to this plugin.  One way or another, that data pipeline would have to be established – ideally I could send very succinct messages back and forth between the editor and the Kinect code.
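
One plausible shape for that pipeline (just a sketch; the pipe name and message format are invented for illustration) is a local named pipe: the Kinect process pushes one short token per recognized gesture, and the plugin listens on the other end.

```csharp
using System.IO;
using System.IO.Pipes;

// Hypothetical Kinect-side sender: one short token per line over a
// local named pipe that the editor plugin would read.
class GesturePipeSender
{
    public static void Send(string gestureToken)
    {
        // "kinect-macros" is an illustrative pipe name, not a convention.
        using (var pipe = new NamedPipeClientStream(".", "kinect-macros", PipeDirection.Out))
        using (var writer = new StreamWriter(pipe))
        {
            pipe.Connect(500); // wait up to 500 ms for the plugin to be listening
            writer.WriteLine(gestureToken); // e.g. "expand:foreach"
            writer.Flush();
        }
    }
}
```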

The Kinect code is what I’m really interested in (actually, that’s quite subordinate to the real goal and the coolness of a kinetic macro language), and specifically, the gesture recognition stuff.  But the fact is, out of the box, the Kinect is not good enough for what I want.  It tracks three joints in the hand, four if you include the wrist.  Furthermore, it tracks them very poorly, IMO, and they jump around like Mexican jumping beans.  I could make something work over the top of that, but it probably wouldn’t help with RSI.  As far as I can see, any reliable complex gesture recognition from the Kinect with the provided SDKs would require larger motions than are available from the fingers.  Larger motions translate into elbow and shoulder, and that gets tiring quickly.
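
For reference, pulling those four joints out of the official v2 SDK looks roughly like this (a minimal sketch, minus error handling and teardown):

```csharp
using Microsoft.Kinect;

// Minimal sketch: read the four right-hand joints the Kinect v2 exposes.
class HandJointReader
{
    private readonly Body[] bodies = new Body[6]; // v2 tracks up to six bodies

    public void Start()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnFrameArrived;
    }

    private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);
            foreach (Body body in bodies)
            {
                if (body == null || !body.IsTracked) continue;
                // All the hand detail the stock SDK gives you:
                CameraSpacePoint wrist = body.Joints[JointType.WristRight].Position;
                CameraSpacePoint hand  = body.Joints[JointType.HandRight].Position;
                CameraSpacePoint thumb = body.Joints[JointType.ThumbRight].Position;
                CameraSpacePoint tip   = body.Joints[JointType.HandTipRight].Position;
            }
        }
    }
}
```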

Here’s an interesting article from Microsoft researchers in China titled: Realtime and Robust Hand Tracking from Depth.  Apparently, good hand tracking along the generally recognized 26 degrees of freedom of the human hand is a hard problem.  Nevertheless, they demonstrate that it has been done, including seemingly very accurate finger positioning.  And that is using hardware inferior to the Kinect and my computer.

I have some interesting intuitions about how to improve existing body tracking through a sort of geometric space of possible conformations as well as transitions between them (think finger-position states and the changes available to any particular state based on the reality of joint articulation, etc.).  Ultimately, a body state would be maintained by a controller that understood likely body states and managed the transitions between states as indicated by the datastream, keeping in mind the likelihood, for instance, that the user’s leg is above their neck line.  I use that as an example because Kinect’s gesture recognizer very commonly puts my knee joints above the crown of my head when I’m sitting in a chair and moving my hand.  A body would be an integration of various body-state controllers.  It would all have to be fleshed out (pun entirely intended).
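
To make the idea concrete, here’s a toy version of such a controller; every type, field, and threshold is hypothetical, just the shape of the thing:

```csharp
using System;

// Hypothetical sketch of the "body state controller" idea: a frame is
// only accepted if it is anatomically plausible and reachable from the
// previous state. Types, fields, and thresholds are illustrative.
struct BodyState
{
    public float HeadY, KneeY;   // joint heights in meters (camera space)
    public float HandX, HandY;   // one hand only, for brevity
}

class BodyStateController
{
    const float MaxPerFrameTravel = 0.15f; // meters per frame at ~30 fps; a guess
    BodyState current;
    bool initialized;

    public BodyState Update(BodyState observed)
    {
        if (initialized && (!Plausible(observed) || !Reachable(current, observed)))
            return current; // reject the jitter; keep the last good state

        current = observed;
        initialized = true;
        return current;
    }

    static bool Plausible(BodyState s)
    {
        // e.g. rule out the knee-above-the-crown readings mentioned above
        return s.KneeY < s.HeadY;
    }

    static bool Reachable(BodyState from, BodyState to)
    {
        // Joints can only move so far between consecutive frames.
        float dx = to.HandX - from.HandX, dy = to.HandY - from.HandY;
        return Math.Sqrt(dx * dx + dy * dy) < MaxPerFrameTravel;
    }
}
```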

Watching the demo in the linked article above got me into 3D modeling, which led me to Unity.

Now that I’ve gone through the Unity tutorials, I feel quite prepared to begin making a game.  I have to say that I am taking to Unity’s model of development like a fish to water.  GameObjects and Components are very intuitive to work with.  Seeing how easy it really is, I decided I’d make a game, even if game development in these traditional terms isn’t something I intend to do a great deal of.  I’ve got some catching up to do in terms of 3D math and geometric reasoning, but that stuff is fun when it’s learned in relation to problems you encounter and need to solve to get where you want to go.  That’s how math is best learned, IMHO.  YMMV.
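
If you haven’t seen it, the model really is that simple: a Component is a class you attach to a GameObject, and its public fields show up in the editor’s Inspector.  A minimal example (my own toy, not from the tutorials):

```csharp
using UnityEngine;

// Attach this Component to any GameObject and it spins in place.
// The public field is editable in the Inspector, per GameObject.
public class Spinner : MonoBehaviour
{
    public float degreesPerSecond = 90f;

    void Update()
    {
        // Rotate around the local Y axis, framerate-independently.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```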

So, with all that, in my next post I’ll describe the initial plans for my first game.

My Holodeck

So, I got pretty pumped about Microsoft’s Hololens.  So much so, in fact, that I managed to register for Build 2015.  I’ve known I’d jump on the Virtual Reality/Augmented Reality bandwagon some day (the tune’s pretty impressive if you listen closely), and really, reflecting on the matter, I was waiting for it to mature to the point where I was willing to engage with it.

I’m confident that my impulse towards such things is not unconnected to the nature of my Grandfather (it would be really cool to make his paintings immersive and navigable).  I’ve got lots of very intriguing artworks that I’d like to make, but I could never reconcile myself with paint and canvas.  Too… much… ancient.

Anyway, obviously I can’t get my hands on a Hololens quite yet but I wanted to get ready for when I can.  How?  Start programming for the Kinect, I figured.  I kind of assume that Microsoft is going to use a similar design philosophy between the two since they stated that the Kinect was their road to the Hololens.

In any case, it took some doing.  First of all, I didn’t have a Kinect.  Secondly, the Kinect V2 actually requires Windows 8 and I was running 7.

Blah, it’s a long story of boring tech challenges that included having to literally rip my laptop screen apart (plastic flew and blood flowed and you can see what I’m talking about in the image [this is the sole non-boring detail]) so that I could replace some parts so that I could install Windows 8 so that I could install the SDK so that I could play.

But none of that is the point of this post, which is to create a sort of monument to the newest iteration of my workspace/holodeck lab.  Some people take pictures of their face each day for years.  That’s really interesting and I’ve thought of doing it myself.  On the same note, I’ve been taking pictures of my workspaces for years (not every day, although that would likely be revealing).  It’s interesting to see them/it evolve.  Who even knows if the sky’s the limit for such a pregnant space/concept/role.

Now, mind you, I’ve watched some YouTube videos of people showing their workspaces, practically jerking their electronics off onto their furniture as they went (“And over here you can see my Gold Exclusive Version 15 Flippetywidget, and over there my Platinum Spectral Wank-Wonk…”), and it literally depressed me and threatened to ruin my mood of an evening.  All I’d wanted were good layout ideas.  I felt like I’d made a horrible mistake in a Google image search.

I just want to be clear that although I like my monitors, for instance, it is because they create walls of text in front of me.  I like electronics and stuff-in-general just to the degree that they manage to serve as ice to the figure skates of my creativity.

You can see the Kinect up in the top right corner, peering down on where I sit, waiting for me to tell it how to interpret my gestures.

holodeck

Pavlov’s fretting in the background, concerned about a squirrel that’s one layer too deep for this depiction.  So many layers, foregrounds, backgrounds, Magrittian grounds…

Here’s the (IR depth) view from the other side:

KinectScreenshot-Depth-12-40-41