16 Jul 12

## The Fresh Prince of Bell Labs

Inspired by:

In Petoskey, Michigan born and raised
Tinkering with gadgets I spent most of my days

Chillin out, maxin, relaxing with Boole,
And working on my master’s outside of the school

When a couple of Nazis who were up to no good,
Started making trouble in my neighborhood

I got one Alfred Nobel and Uncle Sam had confabs

21 Jun 12

## Program Configuration Space Complexity

Taking the Coursera Probabilistic Graphical Models course has had me thinking a lot about complexity; lately, about program state complexity in the context of testing in particular.

How many discrete tests would it take to ensure every combination of inputs is correct? Let’s do some back-of-the-napkin math and see what we come up with. Please excuse my amateurish LaTeX.

First, let’s start with a thought experiment. Imagine a program (in any language) in which all of the configuration options are expressed as enumerations. Now imagine that each configuration option has some impact on the behavior of every other.

We’ll call this our worst case testing complexity as it completely ignores how everything is wired up. This effectively treats the entire program as a giant black box.

The terrifying thing to consider here is that the addition of a simple Boolean configuration option can double your testing complexity. Take a moment for that to sink in. Just adding a simple flag to a console application potentially doubles the test space.
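As a sketch in my own notation (not the original formulas): if the configuration options are \(C_1, \dots, C_n\), each an enumeration, and every option can interact with every other, then covering every combination requires

```latex
T_{\text{worst}} = \prod_{i=1}^{n} |C_i|
```

Adding a Boolean flag appends a factor of \(|\{\mathrm{true}, \mathrm{false}\}| = 2\) to the product, which is exactly the doubling just described.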

This previous description only applies to a program where every configuration option is global. What about nice, well-separated classes?

If we consider each class as its own little program, a non-static class’s testing complexity depends only on the cardinality of its internal state space times the sum of the cardinalities of its methods’ input spaces.
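In symbols (again my own notation, matching the description above): for each class \(c\) with internal state space \(S_c\), and with \(I(m)\) the input space of method \(m\), the program-wide test count becomes

```latex
T_{\text{OO}} = \sum_{c \,\in\, \text{classes}} |S_c| \cdot \sum_{m \,\in\, \text{methods}(c)} |I(m)|
```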

For comparison, how might pure functional programs look?

It’s interesting that the testing complexity of a pure functional program is equivalent to that of object-oriented programming done only with immutable classes. Each class has only a single state, so the whole state-space factor drops away and we are left summing over all of the methods in our program. Now, how many object-oriented languages give nice semantics for immutability? That’s another question entirely.
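Sketched in symbols (my notation, not the original formulas): with every class reduced to a single state, only the sum over function input spaces remains, where each \(|I(f)|\) is itself a product over the cardinalities of \(f\)’s argument types:

```latex
T_{\text{pure}} = \sum_{f \,\in\, \text{functions}} |I(f)|, \qquad |I(f)| = \prod_{a \,\in\, \text{args}(f)} |a|
```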

Admittedly, it seems as though those very states, when taken out, could potentially turn into a roughly equal number of new classes, thus giving no benefit in terms of test count. However, in my experience this is not the case. Will a given class state change the behavior of every single method? Also consider that if those states can change mid-computation, things become much more complex.

We seem to be stuck with the cardinality of our method or function inputs as the primary driver of testing complexity. Essentially this puts us back at our original thought experiment, just in manageable chunks that are useful for pinpoint debugging. We’ll never be able to test every state. The best you might do is use a random testing tool like QuickCheck.
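As a concrete sketch of that idea using FsCheck (the F# descendant of QuickCheck; this property is my own illustration, not from the original post):

```fsharp
open FsCheck

// A property: reversing a list twice yields the original list.
// Rather than enumerating the entire input space, FsCheck samples
// random inputs (100 by default) and reports any counterexample found.
let revTwiceIsIdentity (xs: int list) =
    List.rev (List.rev xs) = xs

Check.Quick revTwiceIsIdentity
```

Random sampling doesn’t prove correctness, but it covers far more of the input space than hand-picked cases ever could.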

As for languages which do not restrict inputs by type at all, they explode the testing space by forcing consideration of how every construct might interact with a given function. This could be seen as a new very large term in both the object-oriented and functional models above.

Finally, this makes me lament the lack of any kind of range restriction on values in popular typed languages. Once we’ve done away with statefulness, it’s the most glaringly obvious area where we can reduce complexity. The good news is we might start seeing some help from academia along these lines within the next few years.

15 Apr 12

## What Microsoft MVP means to me

It wasn’t long after college that I found myself blogging about the technology I was using on a regular basis. I have pretty good writing skills and am damn good with code, so soon after I was easily breaking 10K hits per post. Having a platform to share my ideas and knowledge was exhilarating and fun, but it really didn’t mean much career-wise. I wasn’t particularly passionate about C# or the CLR and would have been just as happy blogging about Java.

But after about two years out of college everything changed. I went to a talk by Rich Hickey on Clojure. Rich walked us through a completely functional and massively parallel ant colony simulation which repeatedly blew my mind. Four hours after walking into that talk I came out a different person. I knew then that I had been going about this whole programming business wrong. I knew then that everything was much harder than necessary and I was wasting huge amounts of time grooming my object oriented garden.

Now, I worked for a .NET shop, so Clojure at work was out of the question, but I had seen posts around about a new language from Microsoft Research. It was also a functional language, but it had some different properties. Properties that could be used to get stronger guarantees on code correctness.

Over the next week I feverishly built my own ant colony simulation in F#. While I struggled with the new type system at first, I found the ML syntax to be a joy to read after the fact. The code was also remarkably robust; it was a simple matter to inject and swap between many different thread communication models. I soon became convinced that I had found something even better than Clojure. A passion grew inside me like I had never felt before. Other people had to know that there was a better way.

Soon I found myself giving talks at every Code Camp and user group meeting that would have me. Most others viewed my enthusiasm skeptically, but I was pushed on by the open-minded few who watched my presentations with enthralled attention. Of course, some meetings were better than others. After one particularly great night at a meeting in central Connecticut I had a line of about ten people who were just dying to know more.

I also worked with Michael de la Maza and Talbot Crowell to start the first F# User Group. Getting speakers locally was a challenge, so in most cases we resorted to having people speak over Live Meeting. I worked on this as much for myself as to help spread the word about F#. It was fantastic to hear from others using the language and to learn about things I had never even considered before. Even after moving on to NYC, I still reminisce about the early days and I’m still very proud of our loyal group members.

Now all of this, from learning F# to starting the group, had taken only about a year. What followed was an even more overwhelming whirlwind of life-changing events.

I’m not sure how it came about, but someone had noticed my passion. I was given a free trip to Microsoft PDC where I had dinner with the Microsoft language teams. Chris Smith carted me around and introduced me to everyone (he’s a very popular fellow). Conversation after conversation was loaded with interesting ideas and fresh perspectives. I had one of the best times of my life.

Then came the MVP, shortly followed by Professional F# 2.0, then the MVP Summit where I was able to spend a day with the F# team right in their offices! To spend so much time talking with the people I admired so much for building this wonderful language was a dream come true. And still, it continues with more conferences and events, meeting very smart people and hearing fantastic new ideas. The whirlwind hasn’t stopped, even three years later, and it’s been a fantastic ride.

I wouldn’t give it up for anything in the world.

21 Jan 12

## Musicians, Mechanics, and Mathematicians

Thank you all for your comments on my previous post; I very much appreciate the time you took in sharing your perspectives. Many of you brought up great analogies to demonstrate how you feel, and in reading these responses I realized I must not have been very clear.

There are some musical geniuses who have composed great works without having been taught even the basics of music theory. However, this doesn’t mean they’re not doing math. The human brain excels at building approximate mathematical models, and a rare few minds are capable of building exceedingly complex ones. Still, formal knowledge of the patterns of music allows a musician both to play the same song in new and interesting ways and to see the underlying connections between different pieces. As a composer it informs how rhythms and melody can be juxtaposed or fitted together to create a desired effect in a way much more profound than trial and error. It expands the musician’s mind and makes them better at what they do.

Another great example is that of the wrench wielding mechanic. There are a great many mechanics who went to trade school and learned the high level basics of engines and a long list of procedures. They might not understand the details of combustion or material science but they can replace brake pads or swap in a new timing belt without too much difficulty. After many years of experience some may have even built a mental model so superb that they can take you for a spin around the block and tell you exactly what’s wrong.

And still, as the mechanic reaches for their bolts and wrench they might not think of the underlying mathematics of what they are about to perform. Yet, you can be sure the people who made those tools worried over it greatly. If they didn’t, the wrench would not fit the head or, worse, the bolt might shear under the stress applied. While they surely tested many bolts before shipping their product, they certainly didn’t do so before creating a formal model of how the tools were shaped and how they would perform. Even if the tools might happen to work without testing, they probably wouldn’t work very well, and to sell tools made in this way would be grossly negligent.

Yet, I can’t be the only one who has suffered many near catastrophes at the hands of inept mechanics over the years. From the time an air bubble left in my brake line after a brake change made my car roll out into traffic, to the punctured gas tank that almost left me stranded at the side of the road. One might wonder if they even bothered testing their work.

Some might think programmers shouldn’t be beholden to the same strictness, as our creations aren’t usually capable of quite so much damage. Instead, the worst things most are liable to do are destroying important information, providing incorrect data to critical systems, leaking private information, sharing unsalted (or worse, unencrypted) passwords, or causing people to become unable to access their bank or medical records. No big deal really, just shoveling data.

I’d love to see every programmer taking the time to learn deeply about the mathematical modeling of data and programs, but I know that’s not reasonable. However, it takes just a little bit of learning to leverage tools with very complex underlying mathematics made by others. You don’t need to be an expert in category theory to use Haskell any more than you need to be an expert in set theory to use SQL. F# and Scala are even more accessible as they have access to all of the libraries and patterns you would be familiar with as a programmer who works in .NET or Java.

So, I’m not asking that you go out and spend years studying your way to the equivalent of a PhD. Instead what I ask is that you just take a little time to understand what’s possible and then use tools made by people who do have that kind of deep understanding.

I know I wouldn’t want to drive a car with parts made by someone who didn’t use models, would you?

Huge thanks to @danfinch and @TheColonial for proof reading.

13 Jan 12

## Why do most programmers work so hard at pretending that they’re not doing math?

In the early days programming was considered a subdiscipline of mathematics. In fact, the very first person to write an algorithm was renowned as a mathematical genius. However, somewhere along the way we forgot. We began to think of ourselves as something different, a profession not beholden to rigor or deep understanding of the models we create.

It’s easy to see how this would happen within an industry in which so much more weight is put on knowing the API of the year over understanding base principles or expressing the desire to dig deep. People can make huge amounts of money pushing methodologies when they have little to no evidence of effectiveness. We work in an environment where hearsay and taste drive change instead of studies and models. We are stumbling in the dark.

I have come to attribute our sorry state to a combination of factors. The first is the lack of formal training for most programmers. This isn’t in itself a bad thing, but when combined with a lack of exposure to mathematics beyond arithmetic, due primarily to an inadequate school system, we are left with a huge number of people who think programming and math are unrelated. They see every day how their world is filled with repeating patterns but they miss the beautiful truth that math is really about modeling patterns and that numbers are just a small part of that.

The relationship between math and programming extends far beyond SQL’s foundation in set theory or bits being governed by information theory. Math is intertwined within the code you write every day. The relationships between the different constructs you create and the patterns you use to create them are math too. This is why typed functional programming is so important: it’s only when you formalize these relationships into a model that their inconsistencies become apparent.

Most recently it seems to have become a trend to think of testing as a reasonable substitute for models. Imagine if physics was done this way. What if the only way we knew how to predict how an event would turn out was to actually measure it each time? We wouldn’t be able to generalize our findings to even a slightly different set of circumstances. But it gets even worse: How would you even know what your measurement is of without a model to measure it within? To move back into the context of programming: how useful is a test that doesn’t capture all of the necessary input state?

This is exactly why dynamic programs with mutation become such a mess. Object-oriented structure only makes things worse by hiding information and state. Implicit relationships, complex hidden structure, and mutation each impair reasoning on their own, but when combined they create an explosion of complexity, conspiring to produce a system which defies comprehensive modeling by even the brightest minds.

This is also why programmers who use typed functional languages frequently brag about their low bug count yet eschew methodologies like test driven development: tests pale in comparison to the power of actual models.

Many thanks to my friends on twitter for the discussion leading up to this post: @seanschade, @craigstuntz, @sswistun, @joakinen, @copumpkin, @dibblego, @taylodl, @TheColonial, @tomasekeli, @tim_g_robinson and @CarstKoenig. Special thanks to @danfinch for taking the time to proof read.

12 Dec 11

## 2011 In Retrospect: A Year of Writing F# Professionally

For the past year I’ve been working almost entirely in F# and have found the experience to be everything I hoped it would be and more. In just six months I was able to bring a research project to fruition which has since made our company millions of dollars. F#’s terseness made algorithms a joy to implement, while F#’s type system made changes easy and regression errors non-existent. In fact, the few bugs we have had were due either to unexpected behavior in the GUI API or to external C# libraries. Never before have I been so happy with the programming language I use on a daily basis. Never before have I felt so confident that when I make a change new bugs won’t be introduced. And no, we’re not using strict TDD or any other strict development methodology. Simply leveraging the type system well, in addition to algorithmic testing, has proven to be extremely robust within F#.

It’s also been a great year for making new friends and meeting interesting people. The NYC F# community is thriving with an average of about fifty people attending any given NYC F# User Group meeting, many of them using F# at work for at least some subset of their projects. I hear the same experiences from many: Yes, learning F# isn’t simple, but the benefits in speed of implementation, LoC reduction, and correctness are huge. The excitement over type providers is also palpable. Finally, we’ll be able to extend our types out into data sources and eliminate new classes of error which currently occur at the border between our data and our applications. If you’re interested in following what we’ve been up to at the NYC F# User Group, check out our Vimeo stream.

I’ve also had the pleasure of speaking at quite a few interesting events this year. One I haven’t yet written about is Monospace 2011.

Click For Video

In retrospect, it shouldn’t have been surprising that many in the Mono community hadn’t seen much F#.  They tend to go to different conferences and events than your standard .NET user and so hadn’t yet heard much about its benefits.  I will say though that it was a very bright group and I think many who speak on F# would find them  just as receptive as I have.  I do hope to see (and participate in) more outreach into the Mono world as I think we will find that there’s quite a few folks who would be happy to embrace a better way to program, if only they were shown the way.

28 Sep 11

## For whom the proteins fold

I know this post isn’t of my usual technical type, but I hope you’ll bear with me while I share an idea I’ve been thinking about.

Starting way back with SETI@Home, scientists have been borrowing our computer time in exchange for awesomely nerdy screen savers for years. However, it’s only fairly recently that they discovered the promise of crowdsourcing. You may have seen the recent headlines for the two largest successes.

Crowd Sourced Protein Folding

Crowd Sourced Planet Finding

Most interestingly, scientists have found that gamers are actually better at figuring these things out than they are! That’s right: all those years you thought you were wasting time, you were actually honing your science-generating skills.

But would it be possible to build on these discoveries in order to turn some proportion of the 150+ billion hours of time spent gaming a year into something good for society?

If you’re like me you didn’t have a very big allowance when you were young, maybe enough to buy a few video games a year. I’m guessing that in the current recession it’s even worse for most kids. Also, have you noticed all of those games downloadable directly to a console these days? That must be torture for the modern day low-allowance kid.

So here’s the idea: we turn these kids into an army of protein folding, planet finding scientists by paying them in virtual currency to do these tasks. They can then save up and use the virtual currency to buy the games their little hearts are burning for.

Even better, think of the secondary effects:

1. More exposure to science for kids, some might grow up and become scientists.
2. It’s an opportunity for great corporate publicity (especially if they give away some virtual currency for free or let scientists turn grants into it at a discounted rate).
3. I expect that children’s very plastic brains will learn the tasks quickly and be better at them than adults.
4. Even if it’s done for-profit with drug companies we will still get new and better medicines out of the deal.

All that’s left is for someone to lock some scientists and game console manufacturers in a room so they can figure the details out. Any takers?

19 Sep 11

## Imperative Pseudocode to Pure Functional Algorithm with Gale-Shapley and F#

Continuing from last time, let’s look at how one goes from imperative pseudocode to pure functional using Gale-Shapley as an example.

Overall, to convert an algorithm from imperative to functional is a fairly simple process once you understand how to convert from a while loop to recursion with accumulators. This post is just a more advanced version of what I talked about in my From Imperative to Computation Expressions post. In fact, if you haven’t already I suggest you read it first. I’ll be assuming you understand how to perform that conversion in this article.  If you’re still having trouble, another good article which is a bit easier is The Ted Neward F# Folding Challenge.

The conversion of an algorithm is a two-stage process: first identify the algorithmic cases, the implied mutable data structures, and alternate immutable data structures; then convert the given loops into recursion while leveraging pattern matching. I know it may seem difficult at first, but after reading this I think you’ll have a good idea of how to get there.

So, let’s identify the cases.

```
function stableMatching {
    Initialize all m ∈ M and w ∈ W to free

    // Loop Termination Case:
    // check for "free m with unproposed w" and
    // implicit ranking of W for each m in M
    while ∃ free man m who still has a woman w to propose to {
        w = m's highest ranked such woman
            who he has not proposed to yet

        // She's Single Case
        if w is free
            (m, w) become engaged
        // She's Engaged Case
        else some pair (m', w) already exists
            // She Picks m Sub-Case
            if w prefers m to m'
                (m, w) become engaged
                m' becomes free
            // She Picks m' Sub-Case
            else
                (m', w) remain engaged
    }
}
```

Starting from the top, we can just ignore the imperative initialization.  However, you’ll notice that a great deal is going on in that while clause.

First, it defines our loop termination by looking for a “free m who still has a w to propose to”. This is interesting because it seems to be begging for some kind of data structure lookup. Second, it defines an implicit ranking of each member of W by each member of M. Note that this is not mentioned in the initialization phase.

The rest of this algorithm reads like a simple list of cases.  To make this plain, let’s list them out along with state changes.

1. No single m in M with an unproposed w
   - execution terminates
2. m proposes to unengaged w
   - w is no longer in m’s proposals
   - m is no longer single
   - m and w are engaged
3. m proposes to w engaged to m’
   - a. She switches to m
     - w is no longer in m’s proposals
     - m is no longer single
     - m and w are engaged
     - m’ is single
   - b. She stays with m’
     - w is no longer in m’s proposals
     - m’ and w stay engaged
     - m remains single

The mechanics of what exactly we do in these cases depend on our data structures. So we can’t get much further without figuring that out. To identify what data structures are needed, we’ll need to identify exactly what it is that needs to be kept track of. Only once we know that can we pick our immutable data structures appropriately.

Looking above, it appears that we’ll need to keep track of:

1. Given a woman, is she engaged and, if so, who is she engaged to? (need to look it up)
2. Which men are single? (just need one at a time, order doesn’t matter)
3. For each man, which is his next most desirable match that he has not yet proposed to? (just need one at a time, order matters)

Note that we don’t have to keep track of which men are engaged to which women; the question is never asked. This is a minimal set of requirements. No extra stuff allowed. I’m serious.

Moving on, in an imperative mutable version you might use these data structures:

1. An array indexed by w to the m index engaged to (-1 if single)
2. An array indexed by m to the w index engaged to (-1 if single)
3. An array indexed by m where each element is an ordered linked list of unproposed w indices.

However, what would satisfy these same requirements but be immutable?  Well, if we step out of the mutation mindset it becomes obvious pretty quickly:

1. An engagement map of women to men
2. An immutable linked list of single men
3. Define a man to include his index as well as an ordered unproposed woman immutable linked list

Instead of scanning arrays or querying mutable structures, we simply decompose our data structures as much as needed and recompose them into the form we need on the next call. This may sound difficult, but it’s really not. The fact that we have pattern matching, and that our cases can be defined in terms of the shape of our data structures, actually makes this very simple to do. That is, once you get the hang of it.

To see this in action, let’s break up the code in the last article and discuss each chunk.

```fsharp
open System

// a Bachelor is an identity index and an
// ordered list of women indices to approach.
type Bachelor = int * int list
```

Here we see the definition of a bachelor: it’s a tuple of an integer and a list of integers. This is effectively a pair containing a man’s index and a list of woman indices. Note that in this example all men are of type Bachelor.

```fsharp
// Some notation:
// wi = woman index (int)
// mi = man index (int)
// mi' = woman's current partner index (int)
// m = man with index and unapproached women indices (Bachelor)
// mSingle = men that are single (Bachelor list)
// wEngaged = engagements from women to men (Map<int, Bachelor>)
```

We’ll keep this notation around just so you have easy reference.

```fsharp
let funGS (comp: _ -> _ -> float) (M: _ array) (W: _ array) =
  let Windices = [ 0 .. W.Length - 1 ]
  // List of men with women in order of desire
  let Munproposed =
    List.init M.Length
      (fun mi ->
           let sortFun wi = 1.0 - (comp M.[mi] W.[wi])
           mi, Windices |> List.sortBy sortFun)
```

Windices is just a list of numbers from 0 to the number of woman tokens minus one. Computing it first makes the calculation of Munproposed less expensive.

Munproposed is our initial set of all single men. Here List.init is used to call the given lambda to generate each element. As you can see by the last line within that lambda, each element of the list is a tuple containing the index of that man and a list of women indices sorted by the given desirability function, sortFun.

```fsharp
  // Recursively solve stable marriages
  let rec findMarriages mSingle wEngaged =
    match mSingle with
    // No single guys left with desired women, we're done
    | [] -> wEngaged
    // Guy is out of luck, remove from singles
    | (mi, []) :: bachelors -> findMarriages bachelors wEngaged
```

These are our first two cases: if the mSingle list pattern matches with [], we know it’s empty and so we are done.

In the second case we are pattern matching on the structure of the singles list as well as its first element. The syntax (mi, []) matches on a tuple of an index and an empty inner list. This inner list is the list of yet-unproposed-to women, so if it’s empty we know that this guy is out of options and it’s ok to drop him from the list of bachelors. We do this by simply recursing on bachelors, which is the tail of the mSingle list.

```fsharp
    // He's got options!
    | (mi, wi :: rest) :: bachelors ->
      let m = mi, rest
```

This is our general-purpose match, and is the structure we expect to see in most cases. Here the head is fully decomposed into a man index (mi), his next first-choice woman index (wi), the rest of the women (rest), and the rest of the single men (bachelors).

Immediately afterward we define m to be that given man index (mi) and the rest of the woman tokens (rest). As two tokens are only ever compared once, m should no longer contain wi in his ordered list of proposals. This is the form he will take until the algorithm ends or he is evaluated again after being placed back in the mSingle pool.

```fsharp
      match wEngaged |> Map.tryFind wi with
      // She's single! m is now engaged!
      | None -> findMarriages bachelors (wEngaged |> Map.add wi m)
      // She's already engaged, let the best man win!
      | Some m' ->
        let mi', _ = m'
```

Now that we have fully matched out everything masculine, it’s time to turn to the feminine.

Here we look in the wEngaged map to find out if wi is single. This is done by using Map.tryFind, which returns None if the given key is not in the map and Some(value) if it is.

If the Map.tryFind call returns None, we know she is single, and so we recurse with the rest of our single men (bachelors) while our wEngaged map is extended via Map.add to contain a new mapping from the woman index (wi) to the given man (m).

If she’s engaged our pattern match automatically decomposes the given option type and we know exactly who she is engaged to (m’).

```fsharp
        if comp W.[wi] M.[mi] > comp W.[wi] M.[mi'] then
          // Congrats mi, he is now engaged to wi
          // The previous suitor (mi') is bested
          findMarriages
            (m' :: bachelors)
            (wEngaged |> Map.add wi m)
        else
          // The current bachelor (mi) lost, better luck next time
          findMarriages
            (m :: bachelors)
            wEngaged
```

This is the core comparison logic.  Here we determine which bachelor gets her hand.  This is pretty straightforward from an imperative perspective as we’re using the oh-so familiar if statement.

In the first case the new contender (m) wins. We append the loser (m’) back onto the head of what will be mSingle next time around with the syntax (m’ :: bachelors). Similar to what happens when she’s single, we add a mapping from the woman index (wi) to the man instance (m) via the call to Map.add. Note that this will override the former mapping from wi to m’.

If the current bachelor loses, we simply append him back to the head of the bachelors list and try again next time around. You’ll get her next time, bro.

```fsharp
  findMarriages Munproposed Map.empty
  // Before returning, remove unproposed lists from man instances
  |> Map.map (fun wi m -> let mi, _ = m in mi)
```

The final section is just how we set up our recursion.  We start with the full list of single men and their ranked lady lists (Munproposed) and an empty map of relationships (Map.empty).  When the algorithm exits we make one last pass through our returned wEngaged map in order to strip out all of the lists of unproposed to women.  It’s just baggage now anyway.
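To see the whole thing in motion, here is a hypothetical usage sketch (the data and the comp function are invented for illustration; funGS is the function defined above). Desirability here is simply how close two floats are:

```fsharp
// Hypothetical example data: each person is represented by a single float.
let men   = [| 0.1; 0.5; 0.9 |]
let women = [| 0.2; 0.6; 0.8 |]

// Desirability: closer values score higher (comp returns a float in [0, 1]).
let comp a b = 1.0 - abs (a - b)

// Returns a Map from each woman's index to her matched man's index.
let matches = funGS comp men women
```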

Well, that’s how you get from imperative pseudocode to a pure functional implementation. All of the code here, as well as a really nasty imperative version to compare it to, is available on my GitHub page.

If you liked this post, have any questions, or feel like I missed something important, I hope you’ll comment here. We’ll all benefit from the knowledge shared.


14
Dec 10

## I got 99 problems but dynamic ain’t one

If you got runtime errors I feel bad for you son
I got 99 problems but dynamic ain’t one

I got the cube patrol on the code patrol
Foes that wanna try and keep my source out of control
Ruby writers that say he’s “Science Strict Not-Bold”
I’m from university stupid what type of facts are those

If you grew up with segfaults in your joes
You’d celebrate the minute you was havin’ stable codes
I’m like avoid Python it can’t handle my whole payload
If you don’t trust my types you can take my tests off hold

Got beef with podcasts if Iron’s on they show
When those breakpoints hit they don’t know what’s missed
Soft’ mags try an’ ignore my syntax

I don’t know how you be writin’ that
without understanding the power that type systems have
I’m from calls to lambdas dude, I ain’t dumb
I got 99 problems but dynamic ain’t one
Hit me

99 problems but dynamic ain’t one
If you got runtime errors I feel bad for you son
I got 99 problems but dynamic ain’t one
Hit me

10
Apr 10

## The Repeating History of Closed Platforms

I was one of the many who loved the iPhone AppStore platform.  Consumers marveled at how easily we could buy any application we might want, vetted by a nice rating system.  Developers were shocked at how easy it seemed to monetize simple applications.  The only downside seemed to be the occasional political or ambiguous AppStore rejection.

Looking back now, the recent changes to Apple’s developer terms of service are not so much of a shock.  They are just the continuation of Apple’s past policies.  It’s the same pattern of gradual erosion of Consumer and Maker rights we see on any platform over time.  If you look beyond the scope of software and include any marketable good, you see a long history of this pattern repeated over and over.  Consider cable television and the music industry.

At first, closed platforms are great.  They get both the Consumer and the Maker what they need as fast and easily as possible, because that’s the platform provider’s job.  The platform providers are competing against each other and so must cater to both the Consumer and the Maker.  However, this soon changes.

Over time, one of two things happens, both bad: either a single provider wins out and gains a monopoly, or the platform providers conspire together.  The platform then becomes a weapon in the corporate war chest.  The ultimate end is control over both the Makers and the Consumers, until a new revolutionary platform emerges, catches the old players off guard, and the process begins again.

The only escape from this pattern seems to be the open source community, a community in which the Makers have taken complete control by keeping out profit-seeking Providers.  However, this seems to have the unfortunate side effect of alienating Consumers.  Perhaps this can be remedied by a benevolent Platform Provider, but I’m not holding my breath.