Friday, February 4, 2022

Pattern musings

A new breed of patterns

I like to code up little snippets in Haskell to play around with interesting concepts. Sometimes I invent notation that must appear forced to most, but let's introduce it anyway...

How I use ViewPatterns, for fun and profit

When writing evaluators, I like to use view patterns to evaluate subtrees already on the LHS (pattern side). Let's introduce a mini-language (also called Hutton's razor) that can only add:

data AST = Lit Integer | Plus AST AST deriving Show

The evaluator is maximally simple, as it only needs to care about two cases:

eval :: AST -> Integer
eval (Lit res) = res
eval (Plus l r) = eval l + eval r

There is a school of thought that cares about termination and logical consistency, and thus requires that in the definition of eval (Plus l r) the function being defined may only be applied to structurally strictly smaller arguments (these are the hypotheses). We adhere to this rule above, but for more complicated functions it is easy to introduce bugs when not being super careful.
So why not resort to view patterns and ban eval from the RHS altogether?

eval (Plus (eval -> l) (eval -> r)) = l + r

This way of putting it makes it totally clear that eval is always applied to parts of the syntax tree.
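This clause is not just notation, by the way: it is legal Haskell as it stands. A minimal self-contained version, assuming only the ViewPatterns extension:

```haskell
{-# LANGUAGE ViewPatterns #-}

-- Hutton's razor, with the recursive calls moved to the pattern side
data AST = Lit Integer | Plus AST AST deriving Show

eval :: AST -> Integer
eval (Lit res) = res
eval (Plus (eval -> l) (eval -> r)) = l + r
```

GHCi then happily reports eval (Plus (Lit 1) (Plus (Lit 2) (Lit 3))) as 6.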

Result patterns?

But let's return to the base case (Lit). We are forced to come up with a new binding (res) just to have something in our hands that we can return? Sounds like redundancy. So for the purpose of this post I'll invent a result pattern (=), which may appear on the LHS only once, and whose presence there rules out the RHS:

eval (Lit =)

No more arbitrary names!

View lambdas, anyone?

Okay, so let's turn up the heat a bit more. A continuation-passing interpreter is a tail recursive way to evaluate the AST:

cps :: forall k. AST -> (Integer -> k) -> k
cps (Lit n) k = k n
cps (Plus l r) k = cps l (\l -> cps r (\r -> k $ l + r))

For the purpose of this exercise you can consider the type k to be the unit type (), and the continuation (the value k) a consumer of intermediate results. We recognise the same usage pattern as in eval: subtrees are immediately (and linearly) passed to the function being defined. So let's rewrite it with view patterns:

cps (Plus (cps -> l) (cps -> r)) k = l (\l -> r (\r -> k $ l + r))

But now we face the same redundancy issue: the view patterns bind identifiers that are somewhat repetitive, as they are immediately applied to lambdas. Can we come up with a notation for a view pattern that applies to a lambda and extends its scope to the rest of the patterns and to the RHS? Consider

cps (Plus (cps -> \l) (cps -> \r)) k = k (l + r)

It takes some time to see the charm here, as it is wormholing a lambda-bound identifier to the RHS, but it is unbeatably concise. And we can combine it with the result pattern idea too:

cps (Plus (cps -> \l) (cps -> \r)) (($ l + r) -> =)

At this point however the readability is suffering.
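For contrast, the part of the cps rewrite that needs no invented notation already compiles today. A minimal self-contained sketch (using primed names instead of shadowing, for readability):

```haskell
{-# LANGUAGE ViewPatterns #-}

data AST = Lit Integer | Plus AST AST

-- continuation-passing evaluator; subtrees are consumed on the pattern side
cps :: AST -> (Integer -> k) -> k
cps (Lit n) k = k n
cps (Plus (cps -> l) (cps -> r)) k = l (\l' -> r (\r' -> k (l' + r')))
```

Passing id as the initial continuation recovers the plain evaluator: cps (Plus (Lit 1) (Lit 2)) id is 3.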

Outro

Note that view lambdas are strange beasts, not only because they wrap the result (RHS or = pattern), but also because the identity \p ≡ (id -> \p) holds; and since the ids don't change anything, they can be stripped anyway. I wonder where I shall encounter more such view lambdas in the wild...

Thursday, November 6, 2014

Unembedded associations

Every time I start understanding the gist of a paper that appeared unfathomable to me a few months before, a strange thing happens to me. My brain often wanders off and creates interesting new bridges, on which my thoughts begin to run and reach previously uncharted land. The same thing happened to me when reading Atkey et al.'s "Unembedding Domain-Specific Languages". Here I encountered my old friend, the HOAS lambda vocabulary:

class LC expr where
  lam :: (expr → expr) → expr
  app :: expr → expr → expr


Freely associating, I came up with an idea for simulating a limited form of duck typing for conditionals:

class Condition expr where
  toBool :: expr → Bool
  cond :: expr → a → a → a
  cond c th el = if toBool c then th else el


This would allow retrofitting many condition-like data types with this vocabulary. Nothing, 0 and (Left _) could all serve as false.
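A minimal sketch of what such retrofitting could look like; the instance choices below (0 and Nothing acting as false) are my own assumptions, not fixed by the vocabulary:

```haskell
-- duck-typed conditionals: anything with a Condition instance can steer an if
class Condition expr where
  toBool :: expr -> Bool
  cond :: expr -> a -> a -> a
  cond c th el = if toBool c then th else el

instance Condition Bool where
  toBool = id                        -- the degenerate case

instance Condition Integer where
  toBool = (/= 0)                    -- 0 serves as false, C-style

instance Condition (Maybe b) where
  toBool = maybe False (const True)  -- Nothing serves as false
```

Now cond (Just 'x') "yes" "no" gives "yes", while cond (0 :: Integer) "yes" "no" gives "no".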

Maybe I could even follow Conor McBride's advice and make the then and else arms of the conditional differently typed. Though I would need associated types for that.

Duck typing may turn out to be a good idea in a statically typed language, when implemented this way. Another use case would be function application via the built-in juxtaposition syntax. It already means different things in different syntactic contexts, like function application or type (family) application. Idiom brackets come to mind. Edward Kmett's Apply class looks like a good candidate for a related vocabulary. The typing rule would be implemented by an associated (injective) type family. Hopefully someday we'll see -XRebindableSyntax for value-level application that is type-directed.

Sunday, October 26, 2014

gdiff – Polymorphically

In the last few months I've been busy coming up with techniques for mastering the gdiff Haskell library, even for use cases it was not originally designed for.
This is mostly a brain dump of some approaches I have learnt while tinkering and trying to solve certain problems.

1) Polymorphic diff: Eelco Lempsink writes in the conclusion of his 2010 thesis «Furthermore, we can not encode polymorphic datatypes such as lists, but need to write specific encodings for each type we want to include». In the library documentation he says «It might require another master thesis project to solve this». However in the last days I developed a trick, which does allow me to describe polymorphic data types, such as [a] or Maybe a. I just needed to add these lines:

in the family GADT:
  ListNil :: List Fam as → Fam a as → Fam [a] Nil
  ListCons :: List Fam as → Fam a as → Fam [a] (a `Cons` [a] `Cons` Nil)

then define a Type instance:
instance Type Fam a ⇒ Type Fam [a] where
  constructors = head [Concr (ListNil cc) | Concr cc ← constructors] : [head [Concr (ListCons cc) | Concr cc ← constructors]]


What is this doing? It picks two random constructors from the underlying data type and wraps them with ListNil and ListCons. We usually ignore what is wrapped, with one exception: in decEq one needs to ensure that the constructors' arguments are compared too, otherwise one cannot finish the type equality proof.

2) Non-moving pairs: In the usual textual diff algorithm lines may move around a bit to make room for insertions, etc. This is normally the case for gdiff too. I have seen gdiff reusing values (with Cpy) when comparing (True, False) with (False, True). But sometimes this is not desired at all. I figured out that "inlining" the leaves into the pair's description (i.e. instead of (a `Cons` b `Cons` Nil) writing appendList on the field decompositions of a and b) does the trick.

3) Location types: I found a way to equip Abstr constructor descriptors with type class dictionaries. It is similar to the lifting approach shown under 1), but much more involved.
The idea is to wrap a location-aware descriptor with a descriptor that is location-blind, i.e. that hides the location parameter: Loc' :: KnownNat n ⇒ Fam (Loc n Bool) …
Then we need the wrapper:
Hidden' :: KnownNat n ⇒ Fam (Loc n Bool) ts → Fam (Hidden Loc) ts
Giving the Type instance for the latter is tricky, as Abstr alone has no provision for smuggling the class dictionary into Hidden', so I had to write a locAware smart constructor to siphon it through the Hidden' wrapper.
This also stressed the compiler to its limits in the advanced PolyKind-ed approach I needed, so I filed GHC bug #9725. There is also some code showing how the Hidden Loc is unwrapped and the dictionary-passing function is installed into the Abstr.

4) Monadic actions coming out of patch: Here another wrapping approach is needed, but this time we need a sum type for diff and patch so that we can enter with a pure value at the Left and obtain a Right action back. This is the most involved approach. I can probably blog another time about it.

Thursday, August 7, 2014

First Programming with my Daughter

Today I had a short programming session with my 7-year-old. I had shown her earlier what a list is, and she had already come back several times to be shown "more lists".
So this time I thought we could do something more interesting. A letter-changing game! I fired up GHCi and entered:

Prelude⟩ let satz = "Leleka hat Kacke in der Hose"
Prelude⟩ print satz
"Leleka hat Kacke in der Hose"


She loved it so far. Fun is the strongest motivator for kids to learn…

Then came the changer for letters:

Prelude⟩ let tausch = \x → case x of {'L' → 'K'; 'e' → 'i'; 'i' → 'u'; 'u' → 'e'; x → x }


Then I applied it on the phrase defined earlier:

Prelude⟩ map tausch satz
"Kilika hat Kacki un dir Hosi"


This did it. She kept repeating the transformed sentence for an hour, giggling. I tried to explain what we had just done, but I guess I'll wait till next time to have her repeat this exercise.

Thursday, July 31, 2014

Rank-2 PHOAS

Lately I've been experimenting with finally-tagless (typed) representations and made an attempt to model the (implicitly typed) lambda calculus. However, I wanted to use the parametric higher-order abstract syntax (PHOAS) technique to obtain variable bindings that are automatically well-scoped.
I arrived at this formulation:

class LC rep where
  var :: p → rep
  lam :: (forall p . p → rep) → rep


(I am ignoring applications as they are not important for making my point.)
As Andres Löh pointed out to me, this is not the finally-tagless rendering of the regular PHOAS form, as that would add the p type variable as a second class parameter, and the rank-2 forall would appear from "outside".
But I liked my simple formulation and the obviously and explicitly parametric lambda body. So I started implementing a rudimentary visualisation instance given a name supply:

instance LC ([String] → String) where
  var = ??? -- const "VAR"
  lam f = \supply → ...


It turns out that implementing var is only possible by giving a constant result, and for lam I am in trouble because I cannot call f, as it expects a polymorphic argument. Both problems are due to the fact that p is too polymorphic. Can we make it a bit less polymorphic in order to make some progress?
Thinking about it, I came up with the idea of giving each instance a way to constrain the p variable as it sees fit. So I changed class LC thus:

class LC rep where
  type Restricted rep p :: Constraint
  var :: Restricted rep p ⇒ p → rep
  lam :: (forall p . Restricted rep p ⇒ p → rep) → rep


Now my instance can pick the restriction as it sees fit:

instance LC ([String] → String) where
  type Restricted ([String] → String) p = ([String] → String) ~ p
  var = id
  lam f = \supply → ...


As you see I chose the restriction to be type equality which essentially cancels parametricity in this instance and gives me simple HOAS. Filling in p becomes easy now.

  lam f = \(n:ns) → "\\" ++ n ++ "." ++ (f $ const n) ns

Let's try it out! But in order to do that we need a Show instance for name supplies. This could be one:

instance Show ([String] → String) where
  show f = f $ map (('v':) . show) [0..]


Now we can interpret a lambda term as a name supply.

*Main› lam (\x → lam $ \y → var x) :: [String] → String
\v0.\v1.v0


It works™ :-)

But I can go further. After implementing several other instances I observed that I always wanted to destroy parametricity completely and implement var as the identity. So why not make these the defaults and reduce the boilerplate by a nice margin? Here is the final class definition that I arrived at:

class LC rep where
  type Restricted rep p :: Constraint
  type Restricted rep p = rep ~ p
  var :: Restricted rep p ⇒ p → rep
  default var :: rep ~ p ⇒ p → rep
  var = id
  lam :: (forall p . Restricted rep p ⇒ p → rep) → rep


I like it so far.

PS: Here are the prerequisites if you want to compile the above yourself:

{-# LANGUAGE UnicodeSyntax, ConstraintKinds, TypeFamilies, RankNTypes, FlexibleInstances, DefaultSignatures #-}

import GHC.Exts
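Putting the pieces together, the whole development condenses into this self-contained module (UnicodeSyntax is not needed when sticking to ASCII arrows):

```haskell
{-# LANGUAGE ConstraintKinds, TypeFamilies, RankNTypes,
             FlexibleInstances, DefaultSignatures #-}
import GHC.Exts (Constraint)

class LC rep where
  type Restricted rep p :: Constraint
  type Restricted rep p = rep ~ p
  var :: Restricted rep p => p -> rep
  default var :: rep ~ p => p -> rep
  var = id
  lam :: (forall p . Restricted rep p => p -> rep) -> rep

-- the name-supply interpretation; var comes in via the defaults
instance LC ([String] -> String) where
  lam f = \(n:ns) -> "\\" ++ n ++ "." ++ f (const n) ns

instance Show ([String] -> String) where
  show f = f (map (('v':) . show) [0 :: Integer ..])
```

With this, show (lam (\x -> lam (\y -> var x)) :: [String] -> String) yields \v0.\v1.v0 as in the session above.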

Sunday, January 12, 2014

Testing LaTeX with MathJax

$3_{5}$ $$42^{25}$$
extensions: ["tex2jax.js"], jax: ["input/TeX", "output/HTML-CSS"],
Does not work

But this:
<script src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>
<script type='math/tex; mode=display'>   \{ 0, 1, 2 \} </script>
Results in something pretty:

Monday, September 2, 2013

Now it's evident — totally!

In one of my posts about logics I have shown that a proof of a true proposition consists of constructing a function from ⊤ (truth) to an inhabitant of that proposition (interpreted as a type).

Dually, one would hope, we must construct a function between false propositions and ⊥ (bottom, the empty set).

The big question is: how? There appear to be many functions into the empty set. Surely these cannot all count as proofs!

There is a catch: totality. For every possible input we are obliged to provide a well-defined, deterministic result. A hard job when the target set is empty!

On the other hand it is easy enough for positive proofs (considering the finite case): say we seek a proof of the Bool proposition. Bool has two inhabitants, so the function space from ⊤ (a single inhabitant) to Bool must have 2¹ = 2 inhabitants. Here is one:

provebool () = True

(Can you find the other one?)

But for negative proofs, it isn't really obvious how to do it. Since refutable propositions (allegedly) have no inhabitants, how do we write a pattern-matching function between them?

But remember, in the finite proof case our arrows had nᵐ inhabitants, and picking any one of these constituted a valid proof.

For the uninhabited case this way of counting gives us a clue: 0⁰ can be interpreted as 1! And this is the key: we need to do pattern matching with no patterns to get that inhabitant:

refutation = \case {}

When this function is total, we have our sought-for unique proof. And for actual negative proofs it evidently is!
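In GHC this is enabled by the LambdaCase and EmptyCase extensions; a minimal sketch, with Void standing in for ⊥:

```haskell
{-# LANGUAGE LambdaCase, EmptyCase #-}
import Data.Void (Void)

-- the unique total function from the empty type to itself: 0^0 = 1 proof
refutation :: Void -> Void
refutation = \case {}

-- ex falso quodlibet: given the impossible, conclude anything
exFalso :: Void -> a
exFalso = \case {}
```

GHC's coverage checker is what certifies totality here: since Void has no constructors, an empty set of alternatives is exhaustive.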

Tuesday, August 20, 2013

Miscellaneous

My father would have turned 72 tomorrow. I got used to his absence a long time ago, yet I still have quiet moments in which I wish I could tell him what I am up to.

He would understand it.

He once told me, "I would have managed to steer you into medicine if I had seen a future there." I believe he saw no future in the whole medical complex, certainly also with respect to the financial aspect. Or he simply saw no prospects in it, for me. In any case he was very open-minded about my mathematical ambitions. He bought me the book I still revere, "A fizika kultúrtörténete" (The Cultural History of Physics). I recently bought it for my son in English.

That not much really came of my ambitions was probably due to my going into computer science ;-)

But no matter, I enjoy it a lot, and the two fields are interlocking ever more visibly. There is a significant trend to place the foundations of mathematics on the footing of type theory, and thus practically dethrone set theory after more than 100 years of reign.

In a small corner of this battlefield I do my own thinking. Yesterday evening I drove to Ansbach to visit my mother, and took the hamster Daniela along. I had become really tired after dinner, read a few more pages of category theory and fell asleep. But between 2 and 3 I woke up, and I think in that moment I understood that my idea of types as surfaces can be reconciled with the opetopes if one only passes to the codimension. The keyword Poincaré duality comes up here, which could explain everything.
I took the hamster downstairs, we sat down in front of the terrace door, and the light of the full moon fell upon us. It was mystical.

Besides that, I am working on an encoding so that something equivalent to the lambda calculus can be encoded with opetopes. Let's see how it all comes together.

I think my father would have been the only one around me with a truly open ear for such thoughts, even if he would not have understood most of it. But he would have been proud of me at times.

Monday, August 19, 2013

Compiling GHC on RHEL6

Compared to building GHC on RHEL5 this is a breeze.

git is in the installation — check
perl boot — check
./configure — breaks, the linker does not find libgmp.so
I built gmp from sources, and installed it into my home directory.

But... it still did not build :-(

As always, Stack Overflow for the win!

Specifying
setenv LIBRARY_PATH $HOME/lib
did the trick.

For good measure I also added
--with-gmp-libraries=$HOME/lib --with-gmp-includes=$HOME/include
to the configure line, but this might be redundant?

(Another thought: find seems to reveal some libgmp.so under /usr; I might point it there.)

So, finally,

./configure ... — check

The part that involves building was the pleasure part:

nice make -j14 — check

Then I dared running the test suite:

make test — check

This step takes its time, as it is not parallelizable.

There is only one drop of bitterness, namely the stage2 compiler does not run on RHEL5. But that is something I did not expect, actually.

make install — check

nothing unexpected here.

Have fun with GHC on RHEL6!

Saturday, August 10, 2013

Proudly presenting the »nopetope«!

(This is a jotted-down, raw posting that may never get finished. I am publishing it anyway, as it pretty much reflects my current mood.)

I am about to do some research with coverings of trees, and it was only natural to look at Baez-Dolan metatrees and the corresponding notion of opetopes. The paper contains a famous 5-minute definition and it is a brain teaser worth reading. They introduce trees and nestings (actually two sides of the same coin) and their superposition, called a constellation, which has to follow some rules, but drawing spheres is a creative process.

Then there are zooms which connect constellations as long as the left nesting and the right tree are morally identical.

My lambda graphs are basically search trees and a nesting would add the operational notion of evaluation. Since we can freely choose our evaluation strategy (confluence?), the latter corresponds to a nesting. It remains to find out what the degenerate zooms are under this aspect.

I am just reading the "Polynomial functors and opetopes" paper (http://arxiv.org/pdf/0706.1033.pdf) and it asserts that its starting constellation is a one-leafed tree. But I wonder whether this is fundamental; since leaves are stripped by zooms, any number will do.
So I'll suggest starting with zero leaves, and calling the resulting zoom complex(es) the nopetopes. This might be the first mathematical term I have coined :-)

For the mathematically-challenged, a nopetope is just a lollipop wrapped in cellophane, while an opetope is the two-stick version thereof.

It is a funny coincidence that "one" and "ope-" contain the same vowels. Going on, we could also have twopetopes and thropetopes, fouretopes etc. But I doubt these are significant in any way.

And now back to the paper and then to an Ωmega implementation of nopetopes...

PS: while writing this my imagination ran wild... Are trees and nestings compatible with the famous energy–mass and wave–particle correspondences of physics? Looks like I have to start some more research.

Wednesday, February 27, 2013

Old Cabal Subhell

Sometimes I have a cabal that is too old to generate up-to-date boilerplate files for package installs, so I have to install a new cabal-install. But unfortunately I run into the same problem when installing the HTTP package, which is needed by cabal itself. So I am in a catch-22 (pun totally intended, as you'll see later on), and each time I feel completely lost.

But there is a cure to this particular problem:

create a modified copy of the generated file:

mv dist/build/autogen/Paths_HTTP.hs dist/build

and patch it with

$ diff -u dist/build/autogen/Paths_HTTP.hs dist/build
--- dist/build/autogen/Paths_HTTP.hs 2013-02-27 20:07:01.437225000 +0100
+++ dist/build/Paths_HTTP.hs 2013-02-27 20:20:36.735526000 +0100
@@ -6,6 +6,7 @@

import Data.Version (Version(..))
import System.Environment (getEnv)
+import Control.Exception

version :: Version
version = Version {versionBranch = [4000,2,8], versionTags = []}
@@ -17,11 +18,14 @@
datadir = "/home/ggreif/share/HTTP-4000.2.8"
libexecdir = "/home/ggreif/libexec"

+hardCoded :: FilePath -> IOException -> IO FilePath
+hardCoded dir = const $ return dir
+
getBinDir, getLibDir, getDataDir, getLibexecDir :: IO FilePath
-getBinDir = catch (getEnv "HTTP_bindir") (\_ -> return bindir)
-getLibDir = catch (getEnv "HTTP_libdir") (\_ -> return libdir)
-getDataDir = catch (getEnv "HTTP_datadir") (\_ -> return datadir)
-getLibexecDir = catch (getEnv "HTTP_libexecdir") (\_ -> return libexecdir)
+getBinDir = catch (getEnv "HTTP_bindir") (hardCoded bindir)
+getLibDir = catch (getEnv "HTTP_libdir") (hardCoded libdir)
+getDataDir = catch (getEnv "HTTP_datadir") (hardCoded datadir)
+getLibexecDir = catch (getEnv "HTTP_libexecdir") (hardCoded libexecdir)

getDataFileName :: FilePath -> IO FilePath
getDataFileName name = do


This file is then found by GHC in preference to the autogen one, and all is okay. Incidentally, I already employed this trick in the past but forgot the details, so I had to reinvent it.

After escaping this particular subhell of cabal I wrote this blog post in order not to lose my way in the future. Hopefully it helps you too.

Wednesday, December 19, 2012

Decidable equality

Trailing Richard Eisenberg's blog post, I've been triggered to recollect what I have read about the matter so far. It turns out that I was about 17 when I first came into contact with the concept. After my brother hinted to me a month ago that he was reading "Gödel, Escher, Bach", I grabbed my own copy from my bookshelf and quickly found the page about the fractal nature of provable propositions and the true-false divide. I have scanned it and am showing it below.


What is the connection to decidable equality? I believe this illustration shows it:

I have depicted ⊤ (truth, top, any nullary constructor, e.g. ()) at the north pole, and bottom (false, ⊥) at the south pole. The way of reasoning on the northern hemisphere is to transport truth into a type, i.e. to construct a function of type () → P, where P is the proposition we want to prove in Curry–Howard encoding. As the figure shows, we may have a finite number of stops on our way. Note that a function of type () → P is isomorphic to P, so simply constructing an inhabitant of type P suffices.

Dually, to prove that a proposition Q is false, we have to construct a path from Q to bottom, again possibly via a finite number of stops. Since (Q → ⊥) cannot be simplified, we really have to construct functions here.

The equator divides true from false and thus separates the duals.

Since each proof (when computable in finite time) will result in something from either the blue or the red island, the type for decidable equality must be Either (a :~: b) ((a :~: b) -> Void).
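A minimal sketch of such a decision procedure, written for hand-rolled natural-number singletons (the Nat and SNat types are my own illustration, not from the post):

```haskell
{-# LANGUAGE GADTs, DataKinds, TypeOperators, EmptyCase #-}
import Data.Type.Equality ((:~:)(..))
import Data.Void (Void)

data Nat = Z | S Nat

-- singletons: one value-level witness per type-level natural
data SNat n where
  SZ :: SNat 'Z
  SS :: SNat n -> SNat ('S n)

-- either a proof of equality or a refutation ending in Void
decEq :: SNat a -> SNat b -> Either (a :~: b) ((a :~: b) -> Void)
decEq SZ     SZ     = Left Refl
decEq (SS m) (SS n) = case decEq m n of
  Left Refl    -> Left Refl
  Right contra -> Right (\refl -> case refl of Refl -> contra Refl)
decEq SZ     (SS _) = Right (\refl -> case refl of {})
decEq (SS _) SZ     = Right (\refl -> case refl of {})
```

The two mismatch cases land on the red island: GHC's EmptyCase accepts the empty match because 'Z :~: 'S n has no inhabitants.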

Tuesday, September 18, 2012

Thrist sightings

It's been a bit quiet recently, but not inactive. Edward Kmett has adopted the name and the concept in two of his libraries. Of course the concept is much older than the name, being a path in some category.

If you look carefully, in the middle of his ICFP '12 keynote Conor McBride derives thrists (he keeps the name List in his development), and they may also appear in the JFP article, where they are named Path.

Only with the advent of GADTs and user-defined kinds in systems like Ωmega, SHE and Agda did the concept of threading types begin to blossom. Since the release of GHC v7.6.1, the most popular Haskell compiler also supports kind polymorphism. This prompted me to re-spin the thrist package to take advantage of GHC's new feature.

Have fun,

    Gabor

Monday, December 26, 2011

Showing Thrists Revisited

More than three years ago I lamented that it's impossible to define a show function on thrists in general, even when the element type is a member of the Show class.

In the meantime I have succeeded in declaring certain thrist parametrizations and convincing GHC to accept Show instances for them. The Appli thrist is such an example.

With the upcoming GHC 7.4 things may improve considerably, as it will bring constraint kinds. Remember, thrists are (currently) parameterized like this: (* → * → *) → * → * → *, i.e. over types, and with the new kind-variable feature we can probably generalize to user-defined kinds. Then the parameters may match Ωmega's: (a → a → *) → a → a → *. So we still end up with a type, but we can choose our indices from a vastly bigger domain. Enter constraint kinds! These may be part of our user-defined kinds, so we can form stuff like (Show, *) and supply it for the parameter a. With some luck, deriving Show can be more successful and not tied to particular parametrizations.

I regrettably still haven't gotten around to building a GHC v7.4.1 candidate, so I cannot verify the above, but my gut feeling is that this'll work out...

Tuesday, November 22, 2011

Pondering about the Foundations

As I am thinking about how a possible Ωmega 2.0 could be implemented, I am trying to come up with some interesting concepts w.r.t. parsing and type checking. Since not a single line has been written yet, it is easy and cheap to ponder.

Some key elements are already taking shape in my mind. These are:
  • parsed entities carry their start and end coordinates in their types
  • there is no difference between patterns and expressions
  • each expression carries a typing thrist.
The second item has been elaborated by me in an earlier post. The third must wait a bit until I have something coherent (but the curious reader might visit my sketch of ideas).

I'd like to talk about item one here. The essence is that every grammar element obtains a type that describes where this element is located. For example a function declaration may occupy the text portion between start and end, where these are type-level triples (file, line, column). In order to consider parsed input consistent, the start coordinates must be aligned with the prior element's end coordinates, i.e. we have another thrist here.

Things get interesting when we model naming and scope. Name definition and referral must somehow match in the types; once the match is made, the semantics can be read off. To give an example:
Ref definer :: Exp (`foo, 42, 7) (`foo, 42, 11)
  where definer :: Exp (`foo, 41, 1) (`foo, 45, 37) = ...
Both the reference and the definition live in the same file "foo.prg". The former is on line 42, from column 7 to 11 (exclusive). The definition it refers to is in fact enclosing it (a recursive call, maybe?), extending from line 41 to 45. With this representation the name of the identifier becomes secondary; all that counts is the evidence in definer, which contributes to its type! We have created a globally nameless representation.

We can expand this world view to also include the level of the grammatical entity (value, type, kind, etc.) as a fourth coordinate. So a reference to ›25‹ somewhere in the text at floor 0 would point to an integer literal, the same place at floor 1 to the type Int, and at floor 2 it would be the kind *.

My hope is that all the proof obligations arising in such a scheme can be cleverly extracted from a parsec-like engine.

Wednesday, October 5, 2011

Macs and me

I am profoundly saddened since I woke up at 4:00 this morning and saw the news headline about the passing of Steve Jobs. I had seen this coming for a long time, as my father died in a very similar way back in 1993, at only 52. Looking at the thin appearance of Jobs over the last months or even years, I was constantly reminded of him. Basically the same story of suffering, loss of weight, liver transplant (in vain), death. RIP, Dad. RIP, Steve.

I am writing this on a rusty vintage 2000 PowerBook G4 Titanium that I bought on eBay last year, because the video of my own 2001 TiBook went black. Completely inadequate by today's web standards, it serves me well for news reading, terminal logins, etc. My son Pedro got his MacBook Pro 15'' delivered just today. An awesome piece of technology.

My father bought the first Mac in 1986, just after opening his practice as a neurologist. This was two years after cutting all strings in Hungary and fleeing to Germany in a pretty bold move. It must have been a moment of total self-overestimation when I promised my dad: "if you buy that Mac Plus, I'll write you the best software for it for your doctor's office". A crazy time began. During the day the Mac was used to keep patients' data in the DTP program "RagTime"; at 5pm I hauled the Mac home (in a big black bag) and worked on the program, sometimes deep into the night. I used Turbo Pascal (and later MPW) after figuring out that the Lisp environment I preferred simply did not cut it, due to insufficient support of the Toolbox. In the morning my father carried the Mac back and powered it up.

Less than a year later the program was ready for productive work. A Mac SE joined the party, and we had a networked doctor's application with a really neat windowed user interface that would put even today's programs to shame in this regard.

There was even a time when we fancied marketing this product, but my university duties and the early death of my father simply negated all plans to this end.

When I had my diploma in my hands I picked up the phone and called the guy who had sold us the Mac Plus and a copy of "Inside Macintosh" back in '86. In the meantime he had founded a pretty successful company around a networked admin solution called 'netOctopus', which was his baby. We occasionally met at Apple developer events, and I knew that he was a pretty damn good coder. He hired me, and I was earning money by programming Macs!

So yes, I love Macs and there is no reason that this will change in the foreseeable future.

I kept telling myself that should Jobs die one day, I'd put that Mac Plus (now in my basement and still functional) up for sale on eBay. My thought today: "screw it – too many fond memories attached".

Friday, September 9, 2011

GHC 7.0.4 on CentOS

This is mainly a post for myself so that I can look back any time how I did it.

Basically I fell into every possible hole while wandering around in the dark, but I could get out of them :-)
  • after downloading I configured the binary installation for x86-64. configure starts ghc-pwd, which is built against the wrong glibc version. Bummer.
  • found out that I had to get the source install. Downloaded and configured, it died in the middle because base-4.2 was not available. Bummer.
  • so I had to install ghc-6.12.3 first. Fortunately this went smoothly (binary install). With this new friend I reconfigured v7.0.4 and started building. When linking the stage1 compiler I got an error from binutils that some relocation went awry. Bummer.
  • the internets are full of hints that the system compiler (gcc-4.1) uses the bad linker. To my luck I found a new gcc-4.4 in an amicable colleague's home directory, so I reconfigured thus:
    ./configure --prefix=/home/gabor --with-gcc=/home/msichel/local_x86/bin/gcc
  • but configure told me that the shiny new gcc cannot build executables. Bummer.
  • it turned out (I had to write a hello-world to figure that out) that some gcc-internal shared libraries were not found. Luckily they came into scope with
    LD_LIBRARY_PATH=/home/msichel/local_x86/lib:/opt/lsf/6.2/linux2.6-glibc2.3-x86_64/lib
  • at this point I could smell victory, but it was still out of reach. A blog post suggested to cp mk/build.mk.sample mk/build.mk and edit it in two ways:
    • uncomment BuildFlavour = quick
    • and change to GhcLibWays = v p dyn, i.e. append "p dyn"
  • I did both, reconfigured and typed make install.
It went through without a hitch :-)
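For future reference, the whole dance condenses to roughly this shell session (all paths are specific to my machine and a colleague's home directory, so adjust accordingly; the mk/build.mk edits are spelled out as comments because I did them by hand):

```shell
# prerequisite: a binary ghc-6.12.3 install already on the PATH

# make gcc-4.4's internal shared libraries findable
export LD_LIBRARY_PATH=/home/msichel/local_x86/lib:/opt/lsf/6.2/linux2.6-glibc2.3-x86_64/lib

# configure the ghc-7.0.4 source tree against the newer gcc
./configure --prefix=/home/gabor --with-gcc=/home/msichel/local_x86/bin/gcc

# pick the quick build flavour and add the extra library ways
cp mk/build.mk.sample mk/build.mk
#   in mk/build.mk: uncomment   BuildFlavour = quick
#   and change to               GhcLibWays = v p dyn

make install
```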

    Sunday, August 21, 2011

    Poor Man's Haskell Kinds

    After finalizing some of my idea prototypes in Ωmega, I intend to port a part of the code to Haskell, to enjoy the speed advantages of a compiled language.

    Unfortunately Haskell does not support user-defined kinds, so I was wondering how to simulate them at least, in order to get some confidence that my type-level constructs do not drift into no-no-land.

    This literate Haskell file demonstrates how I put instance constraints to work, thereby defining the inference rules for well-kinded phantom type indexes. I am sure somebody has already done this; I would like to hear about it. Anyway, this almost allows me to introduce Ωmega's checked singleton types in Haskell.

    Sadly GHC does not yet allow instance definitions for type families, but it would be cool if somebody could provide me a workaround.
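The core of the trick looks roughly like this (Nat, Z, S and the Vec example are invented for this post as a stand-in for my actual code): an empty type class plays the role of the user-defined kind, its instances are the kinding rules, and a constraint on the phantom index rejects ill-kinded instantiations at compile time.

```haskell
{-# LANGUAGE EmptyDataDecls, FlexibleInstances #-}

-- type-level "values" of the simulated kind Nat
data Z
data S n

-- the class NatKind plays the role of the kind Nat;
-- its instances are the kinding (inference) rules
class NatKind n
instance NatKind Z
instance NatKind n => NatKind (S n)

-- a length-indexed list; the NatKind constraint keeps the
-- phantom index from drifting into no-no-land
data Vec a n = Vec [a]

nil :: Vec a Z
nil = Vec []

cons :: NatKind n => a -> Vec a n -> Vec a (S n)
cons x (Vec xs) = Vec (x : xs)
```

Something like Vec Char Bool is still a legal type, but any function that demands NatKind on the index will refuse to touch it.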

    Thursday, February 10, 2011

    My Thoughts about Inertia

    Big caveat: I am very much a layman when it comes to physics, and what follows is a consequence of having heard my last elementary physics lectures more than 15 years ago.

    But I like the occasional book on the subject (The Elegant Universe comes to mind) and am fascinated by the theoretical possibility of a grand unified theory of Nature's forces.

    There is a small niggling idea in my head that wants to get out from time to time, and this post is the evidence that it finally made it into the wild. (I will probably appear as a bloody idiot because of this...)

    Being Hungarian-born has made me somewhat receptive to advances in physics made by the pretty smart folks coming from this small country, and already as a pupil I was taught that the "equivalence principle" (gravitational mass and inertial mass coinciding to an amazing degree) was demonstrated by Eötvös. This cannot be a simple coincidence but must be a fundamental thing.

    A completely un-understood fundamental thing, it appears to me. A quick googling for "inertia quantum gravity" brings up some outright lunatics and some rather exotic theory of "zero-point field fluctuations" causing inertia:

    http://www.calphysics.org/articles/PRA94.pdf
    http://www.calphysics.org/articles/gravity_arxiv.pdf

    But, electromagnetism causing the same mass as gravity (i.e. linking two of the fundamental forces with an unbelievable precision) appears very far-fetched to me.

    Wikipedia seems to be no help when inquiring about the source of inertia. Mach's principle referenced therein, namely that the existence of the rest of the universe leads to the phenomenon of inertia, does not sound right: if all the other stars (galaxies, etc.) were removed except the Sun, would you expect the inertia of your body to suddenly fall to almost zero?

    I cannot believe that modern theories of gravity (such as quantum gravity approaches) do not try to explain the equivalence of gravitational mass and inertial mass. My intuition says that this should be the first step!

    Now I'll begin a steep descent into the land of speculation and irreverently mix stuff told by others with vague ideas and metaphors of mine. I won't even make an effort to cite or link, but at some point I'll tell you what my main idea is. Be warned: at that point probably every self-respecting physicist (that is, a person who has successfully solved a differential equation about the movement of a body under an external force) will turn away in disgust, but okay, let it be...

    There is this idea that gravity is so weak in comparison to electromagnetism because the circular (closed) strings corresponding to the graviton do not permanently bind to our three-dimensional space (or four-dimensional spacetime). They are just passing through, and there is little time for them to establish the force between bodies. So the gravitational force two glasses of red wine exert on each other is very small. But when they are accelerated (say, by clinking them) perceptible forces appear. So I conjecture that both forces are caused by the same mass, and by the same gravitons, but they differ in magnitude because the gravitons stay around longer in the second case, so as messenger particles they establish a force many orders of magnitude stronger.

    How can they stay around longer to become relevant? Perhaps by traveling orthogonally to the gravitational gravitons. Okay, inertial gravitons travel (more or less) orthogonally to our space, that is, along the time axis and thus following the trajectories of the particles the matter is built of.

    I think it was Feynman who mentioned that an anti-particle traveling 'backwards' in time can be equated with its particle traveling forward in time. Now, what is the anti-graviton? I believe I read somewhere that it is the same as the graviton, just as a photon does not differ from an anti-photon. So time inversion does not change any property of the graviton. Good. We have gravitons which make a short visit to our spacetime and vanish just as fast as they appeared, and there are those which accompany mass-like particles (even when at rest) along their trajectory in time, possibly bouncing back and forth between earlier and later. This constitutes (my IDEA!) a force between a body now and its earlier appearance in spacetime. Of course this is the same as a force between now and the future of the body. Actually there is a force between me and (me one second later). Or ten seconds, or even a year!

    Two things become clear now. First we have to perform an integration (over all time destinations) to get the resulting force. And second, the force toward the past will bear a negative sign, so for stationary bodies all inertial forces will cancel out. I think this can be asserted for uniform movement too (i.e. constant velocity vector) but I did not waste any thoughts on this yet. Anyway, this is Newton's first law.

    Time to get to the second one. Here I want to make some further remarks. First, the inverse-square(-like) law should hold for inertial gravitons too, so that a longer time-span enters the integral with a smaller effect, kind of like how the gravitational force between objects a light-second apart also decreases substantially. Second, causality is a bit smeared at the very short intervals where the forces are really relevant (Planck time?), so the reality a bit earlier pulls hard backwards and the reality a bit later pulls hard forward. Both look pretty much the same, differing only in the direction of the force. I do not know whether this is essential, though.

    Back to Newton's second law. Imagine a stroboscope lighting a scene where a body moves, accelerated by a constant force along its direction of movement. When you take a photo (with a reasonable exposure time) you'll obtain a picture of several bodies, the earlier ones closer together and the later ones successively more distant. Now let's declare the middle one to be now. The prior one is then less distant than the next one, so the force between prior and now is bigger than the force between now and next. These do not cancel out any more; this is a consequence of the inverse-square law. Of course the stroboscope should flash with Planck frequency etc. (so that the spacetime distances are small enough), but the principle is clear: we get a resultant force that points backwards, namely the difference between the (bigger) half-integral towards the past and the (smaller) half-integral towards the future.

    All along I was using the wrong word 'force' for something spacetime-like (a vector with 4 components). What we perceive or measure as accelerating force is the projection of that 4-component vector onto our 3-dimensional space. So I conjecture that F = ma is the projection onto our 3 dimensions of the resultant 4-force which I explained in the last paragraph.

    If this little idea holds any water then it should be possible to calculate the inverse law for the 4-force. I'd bet it is inverse-cubic. If this succeeds, then the next test would be to figure out what the 4-force looks like when a relativistic (say, half-lightspeed) particle is accelerated by a 3-force. Clearly the 4-velocity rotates out of time, but does the mass of the particle increase?
    I do not dare to think about whether we can ever get to the point where this idea gets tested in an even more comprehensive setting.

    But it sure was a lovely night two days ago when I was drinking that red wine with a good friend from Brazil, who dared to ask a quantum-mechanical question, received a long story as an answer, and unknowingly liberated this small niggling idea from the confines of my head!

    Sunday, January 9, 2011

    Quantifiers: Dark Matter

    Modern cosmology routinely deals with dark matter, a source of gravity necessary to explain certain things. In the last days I discovered a kind of dark matter in Ωmega too, and here is the related story.

    As we know, type constructors arise when a data definition is equipped with parameters, like below:
    data Tree a = Tip a | Fork (Tree a) (Tree a)
    a is a type variable and gets instantiated (to a type) when we want the tree to contain actual information.

    Actually I could have written the second recursive occurrence of Tree a in the Fork branch as Tree Char, and the type checker would have accepted it. Why? Because at each instantiation site the type variable a is created fresh, and can be bound to a new expression.
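Indeed, GHC accepts the following (slightly silly) variant, where the right subtree is fixed to carry characters while the left one still follows the parameter:

```haskell
-- non-uniform recursion: the second occurrence is Tree Char, not Tree a
data Tree a = Tip a | Fork (Tree a) (Tree Char)

t :: Tree Bool
t = Fork (Tip True) (Tip 'x')
```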

    But what is the kind of a? In Haskell it can only be *, which classifies all types, as the other possible kinds (* → *, etc.) do not classify types. But this kinding relation is not spelled out directly, so let's choose a more explicit definition:
    data Tree :: * → * where ...
    When talking about Tree Integer, then it gets pretty clear that Integer's kind must be * and that the resulting Tree Integer is itself kinded by *.
    But there is a fine distinction between explicit and implicit kinding in Ωmega: in the latter case a is kinded by a unification variable instead of by *. The reason is that Ωmega allows us to define custom kinds, which can classify other type-like things. These are usually used for indexing of type constructors.

    The unification variables that are present but hidden from the user only become manifest when we have multiple levels of data definitions, with the higher levels indexing the lower ones, and some constructor function tries to bind the unification variable to different kinds. Of course that won't work, and all you'll see is a very cryptic error message. The smallest example I have encountered is issue 70, which I will present here.

    data Pat :: *3 where
      Q :: Pat ~> Pat

    data Pat' :: Pat ~> *2 where
      Q' :: Pat' n ~> Pat' (Q n)

    data Pat'' :: Pat' p ~> *1 where
      Q'' :: Pat'' n ~> Pat'' (Q' n)

    data Pat''' :: Pat'' p ~> *0 where
      Q''' :: Pat''' n → Pat''' (Q'' n)
    Look at the last line, it induces these two kind equations:
    n = Pat'' p1
    Q'' n = Pat'' p2
    Each equality at some level induces an equality one level up. But p1 and p2 both get unified with the same hidden unification variable, k. Let's chain the two resulting equations with k in the middle:
    m = k = Q' m
    where m stands for the index in the kind of n.
    You can see now that m is forced to be equal to Q' m, which clearly fails the occurs check.
    The solution is clearly to spell out kinding quantifiers explicitly with type variables. Since these get instantiated to fresh copies on the fly, the above problem does not occur.

    So we arrive at (only showing the last definition for brevity's sake)
    data Pat''' :: forall (o::Pat) (p::Pat' o) . Pat'' p ~> *0 where
      Q''' :: Pat''' n → Pat''' (Q'' n)
    Black magic? Only when you let the kinds live in the dark!
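For comparison, GHC's DataKinds extension (together with GADTs) can express the explicitly-kinded style directly, at least for one promotion level. Here is a minimal sketch of the two lowest levels of the hierarchy above, with a base constructor P added for this post so that finite values exist (the original hierarchy has none):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}
import Data.Kind (Type)

-- Pat is promoted to a kind; P is a base case added for this sketch
data Pat = P | Q Pat

-- the index is explicitly kinded by Pat: no hidden unification variable
data Pat' :: Pat -> Type where
  P' :: Pat' 'P
  Q' :: Pat' n -> Pat' ('Q n)

-- the structure of the index can be recovered from the value
depth :: Pat' n -> Int
depth P'     = 0
depth (Q' p) = 1 + depth p
```

Here the quantification over the index and its kind is fully visible in the data head, which is exactly what the forall above achieves in Ωmega.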