Friday, April 24, 2009

What is this, the developer's version of The Onion?

I've started a new venture: Patently False. Tech headlines and gossip from the cutting edge of satire.

Here is an RSS feed and here is a Twitter feed.

The Story

CodeToJoy's model has been, vaguely, Calvin and Hobbes, with posts that range from the earnest to the absurd. Long-time readers can stay with me, and adjust easily as the sincerity scale changes.

However, I realize that it's difficult for newbies to pin down the vibe. The title of this post is a comment that was left on Reddit or DZone, months ago. For CodeToJoy, the answer is "it depends on the post".

Though my writing certainly doesn't compare to the genius of C&H or The Onion, I've decided to branch out. Patently False will be all absurdity, all the time -- a resounding yes to this post's title. To combat ambiguity, I put a disclaimer right in the title.

The Organization

Also, I've decided to go ultra-lightweight: headlines only, on Twitter. On Google Reader, I've noticed that The Onion can make me laugh with a good headline. Sometimes, the stories seem like a forced, obligatory exercise.

There is a blog site, with an official introduction and explanation of the interrobang, but for now that is mostly a holding area.

The Upshot

Subscribe today.... I reserve the right to do longer, full spoof pieces on CodeToJoy: this blog won't change. After all, the nation of The Joyous must be served!

Monday, April 20, 2009

Top Ten Reasons Java Developers Fear The Oracle Buyout

It is huge news indeed, and so without further ado, here are the Top Ten Reasons why Java developers fear the Oracle buyout:

#10 Any Java class with the name Table in it will have a 30 character limit

#9 Your favorite open-source IDE will become a Toad plugin

#8 Applications will become known as Enterprise Stored Procedures

#7 Ignoring silly hexadecimal, the CEO will insist the venerable CAFEBABE header be changed to IAMLARRY

#6 Prospective JavaOne speakers will need to win a yacht race as part of the acceptance process. JavaOne itself will be held on a desert island.

#5 The beloved Java Posse will suddenly resurface as the PL/SQL Posse

#4 The classloader will become a fork of MySQL and require a series of certification tests before modification

#3 The next version of Java will be known as JDK 7g

#2 A new Swing API will replace EventListener with TNSListener, requiring annoying server configuration.

and the top reason Java developers fear the Oracle buyout:

#1 DBAs will park in reserved slots, acquire window offices, and gleefully demand that everything go into Fourth Normal Form.

If you like this kind of humor, come on over to a brand-new Twitter feed...

Introducing @patentlyfalse (read more here).

Even if you aren't on Twitter, here is an RSS feed.

Sunday, April 19, 2009

Historical Footnote on Design Patterns

When it rains, it pours. I recently attended an excellent talk on "Design Patterns Reconsidered" by Alex Miller. Around the same time, I was listening to a Software Engineering radio podcast (on Adrenalin Junkies) and heard a comment that merits amplification.

Many people know (or would know, if they attended Alex's talk) that the seminal book, Design Patterns, was heavily influenced by books on architecture by Christopher Alexander. In Design Patterns, the now-famous Gang of Four certainly discuss Alexander, and list patterns-based literature of the era, vis-a-vis software architecture -- but there isn't much on the semantic gap between architecture and computer science. How did we discover Alexander in the first place?

On the podcast, a woman points out that Peopleware is one of the first known books on software to reference Alexander's works (though note that the context is organizing office space).

Tom DeMarco acknowledges the comment, but states that Edward Yourdon was a major factor in bringing the book into the consciousness of IT (in the early 1970s). Though he can only comment for himself (and not the Gang of Four), DeMarco goes on to say that he owes "a personal debt" to Yourdon.

Perhaps, we all owe thanks?

Saturday, April 18, 2009

Trimming a string column in PostgreSQL

In case this helps anyone... I had to figure this out recently, and though it isn't hard, it isn't obvious either.

Here's how to truncate a string column in Postgres. Assume the column has length M, where M > N, and we want to shorten it to N with a simple trim:

alter table T alter column C type varchar(N)
using substring(C from 1 for N);

The key is that C references both the column name (in the first instance) and the value (in the second instance).

Thursday, April 16, 2009

Monads 102: using Haskell online (or 'Finding your inner Klingon')

The Goal

This post is a long-delayed follow-up to a talk I gave at the Lambda Lounge. It will use a simple code example to show some intermediate ideas of monads in Haskell.

The best part is that the code can be pasted into Codepad (no affiliation) and tinkered with. No download necessary!

(Note that a recent review of RWH calls Haskell "like Klingon, but with math". Hence the subtitle.)

Where is Monads 101?

There is no Monads 101 post on this blog. This is part of the zen of monads, as explained in Monads are Burritos. I did my best at the Lambda Lounge, and have some fun ideas for the future, but for now, this post is intended for readers who have some background in Haskell.

Ok, ok, as a super brief recap, check out this photo. Recall that

1. m is a type constructor (i.e. it is a generic type that wraps another type)

2. a -> m a is a function that injects/wraps something of type a into something of type m a. In Haskell, this is called return

3. m a -> (a -> m b) -> m b is a function that takes a monad m a and another function, a -> m b. It returns m b. Loosely speaking, it breaks the inner type out of m and applies the function, resulting in a computed value back inside another m. The 'breaking out of' part is unspecified and unique to each monad. This whole doozie is called bind and uses this symbol: >>=

It is vital to understand that a and b above are types (e.g. Integer, String), while m is a type constructor. Haskell has a gorgeous fluidity between data variables and type variables that can be confusing at first.

A monad is a type that supports all three of the above. I warned you that this was Monads 102. If you aren't comfortable at this point, that's fine. This is non-trivial stuff.
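To make the recap concrete, here is a minimal sketch that uses Maybe as the m in question, with the real Prelude return and >>=. The names wrapped and stepped are mine, purely for illustration:

```haskell
-- 'return' injects a plain value into the monad:  a -> m a
wrapped :: Maybe Integer
wrapped = return 10

-- '>>=' (bind) breaks the value out, applies the function,
-- and the result lands back inside the monad:
--   m a -> (a -> m b) -> m b
stepped :: Maybe Integer
stepped = wrapped >>= \x -> return (x * 2)
```

Loaded into GHCi (or Codepad), stepped evaluates to Just 20.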

The Goals

We'll take a code example that defines a trivial function called tuple. This example won't change but we'll send in some different monadic values and see what happens.

Example: Maybe (with Integer)

Here is the full example... Paste this into Codepad:

-- see comments below
tuple mx = mx >>= \x -> return (x, x + 1)

mx = Just 10

main = (print (tuple mx))

Here is a version with comments:

-- In English, tuple accepts mx, a monadic value.
-- The monad pulls x out, and builds a simple tuple
-- which is returned in a monad of the same type.
-- Here mx is a variable name. It is of type 'm a'
-- where m and a are types
-- 1. tuple takes m a and returns m (a, a)
-- 2. note that \x -> ... is a lambda expression
-- with a parameter 'x'. This expression is
-- the function a -> m b that is passed to bind.
-- 3. >>= is called 'bind' because it binds the
-- value of x
-- NOTE: mx defines the way >>= behaves!!!

tuple mx = mx >>= \x -> return (x, x + 1)

-- mx is of type Maybe Integer

mx = Just 10

-- main just prints the result

main = (print (tuple mx))

The output should be:

Just (10,11)
Try some other values for mx. Experimentation here will be worth a zillion words and comments.

Example: Maybe (with Double)

In the Codepad editor, change the value of mx to:

mx = Just 3.14

and run again. Since Haskell is Klingon-esque about types, this is a big deal. The tuple function works with Maybe Integer as well as Maybe Double. In fact, it should work with any m a where a supports addition.

Example: Maybe (with Nothing)

Now, again in Codepad, change the value of mx to:

mx = Nothing

The output should simply be: Nothing. What is happening? Recall that the monad supplies the bind function by 'breaking out' the inner type, but each monad can define that behaviour.

In the case of Maybe, that behaviour is defined in part as: if the value is Nothing, then don't even bother calling the supplied function! Hence, the result is Nothing.
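A home-rolled version of Maybe's bind makes the short-circuit explicit. This is an illustrative sketch (bindMaybe is my own name, not the library source), but it behaves the same way as the Prelude's:

```haskell
-- Roughly how bind behaves for Maybe:
-- Nothing short-circuits; Just breaks out x and applies f.
bindMaybe :: Maybe a -> (a -> Maybe b) -> Maybe b
bindMaybe Nothing  _ = Nothing   -- don't even bother calling f
bindMaybe (Just x) f = f x
```

With this definition, bindMaybe Nothing (\x -> Just (x + 1)) is Nothing, while bindMaybe (Just 10) (\x -> Just (x + 1)) is Just 11.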

Example: List

Here's where things get fun... Let's wish Maybe a fond farewell and use another monad: List. Remember that tuple isn't going to change here.

In Codepad, do this:

mx = [1,2,3,4]

Try and guess what the output should be, then run it.

You should see:

[(1,2),(2,3),(3,4),(4,5)]

The reason for this is that the List monad uses a different definition for 'breaking out' when applying >>= / bind. Clearly, the List definition is to apply the provided function to each element in the list (and concatenate the results).


The upshot here is that tuple isn't changing. The monads are changing. (Or for you Zen types, your mind is changing. For Klingons, the semantics of the syntax is bending to your will.)

It is important to note that tuple is indeed a lame function with no utility. The types Maybe and List are useful; as monads, they are very basic. If you were to describe the 'breaking out' in pseudocode, the definitions seem trivial:

Given mx >>= f, where mx has type m a and f has type a -> m b:
  • When m is Maybe: if mx holds something, apply f; else do nothing.
  • When m is List: apply f to each element in the list.
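The List bullet above translates directly into code. Here is a home-rolled sketch (the Prelude's definition is effectively concatMap; bindList and tupleL are my own names for illustration):

```haskell
-- List's 'breaking out': apply f to each element
-- and flatten the resulting lists into one.
bindList :: [a] -> (a -> [b]) -> [b]
bindList xs f = concatMap f xs

-- the running tuple example, written against this bind
tupleL :: Num a => [a] -> [(a, a)]
tupleL mx = mx `bindList` \x -> [(x, x + 1)]
```

So tupleL [1,2,3,4] yields [(1,2),(2,3),(3,4),(4,5)], matching the behaviour described above.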
Don't be fooled! There are other monads in Haskell that are much more sophisticated (an intense example is the STM monad for software transactional memory).

The important thing is to understand that the power is in the 'breaking out', which is individual to the monad. Yet against that flexibility, we have seen with tuple that monadic code remains constant.

That's monads in a nutshell: rigidity and flexibility in a powerful combination.

Wednesday, April 15, 2009

I love the Deep Fried Bytes, man!

(The title of this post comes from a quote by Dick Wall, professing his love for Groovy in the face of charges to the contrary.)

A while back (on Twitter), I made a comment about the podcast Deep Fried Bytes. There was a brief exchange with the guys at DFB: it was friendly, but I felt like we weren't communicating. I allow that to happen 2-3 times before I bail and use email or the blog.

(Random tangent: It is mystifying why others try to debate religion or politics in 140 chars. It comes across as an intellectual boxing match, except with Nerf gloves. Or an obscure debate between two Zen masters.)

I first heard of the podcast during a talk by Ken Sipe (on F#). I loaded up on episodes for F#, C#, and some other Microsoft technologies. I'm not proud of this, but I'm not familiar with the .NET space; I haven't used Visual Studio in years.

I loved the podcasts because they were geeky, but also because the topics are a new world to me. I felt like a spy listening in on a secure line. This is not a criticism: the 'casts are a great way to catch up on what is happening over there.

I have since learned that the podcast isn't solely Microsoft: e.g. they have a great episode on architecture with a guy from Digg, one with the Rails Rumble champs, and so on. Also, the April 1 podcast is simply brilliant.

So, I invite you to join me and sit on the porch with these guys and have some iced tea. They have asked me about some topics of interest but I refuse to offer any: I want them to surprise me and stretch my boundaries.

Keep up the good work, mates.

Monday, April 13, 2009

Dear Speaker: 10 Thoughts Beyond 'Make Eye-Contact'

The Motivation

Recently, I read a post by James Duncan Davidson called Dear Speakers. He tweeted criticisms about speakers (no names used) and later blogged his thoughts. The tweets were not mean-spirited but also not inside jokes among friends.

I believe that James is offering earnest advice, but the post really irritates me. Here's why:

  • Critiquing a speaker during a presentation, even without using names, is both gutless and rude. I wonder if James offered any advice, in person, to the speakers afterwards. New technology doesn't excuse us from acting like civilized adults. True, I'm the guy that does this (yes, juvenile). But I asked first and looked people straight in the eye.
  • James provides some random, tactical details as advice. They are fine tips but they strike me as being mere trees in the forest. I have wanted to write about the forest for some time now, so here I am.
  • However, the post really irks me because I attended a technical talk by JDD on the No Fluff Just Stuff tour, circa 2002. I won't comment on it here, but: if we had had the technology back then, how would JDD feel if I had tweeted criticisms, even without mentioning his name?
What are my qualifications to talk on this?

Frankly, I'm no more qualified to talk about this than anyone else.

FWIW, I have emceed a couple of weddings, and have given some technical talks, all with widely mixed results. In my team's war room, I'm not at all shy about launching into an impromptu lecture on whatever I find interesting. I have taken the venerable Dale Carnegie course on public speaking (highly recommended). All of this may or may not impress you.

[Editor's note: I have since spoken at the Gateway Software Symposium 2010 and the Lambda Lounge. This still may or may not impress you.]

I fully concede I have broken many of the following rules. Sometimes, it has haunted me for weeks afterwards.

So I'm a modest presenter. However, I have seen dozens of talks: tech talks, conference sessions, keynote addresses, etc. I've attended my local JUG and NFJS for years, and am lucky to see terrific speakers on a regular basis via my employer.

It is very hard to describe what works, but I know it when I see it. The best analogy is music: I can't tell you why I admire certain guitar players. There is no formula, and it is highly subjective, yet there seem to be common elements across my favourites.

Writing about this is like describing a dream: it's impossible to articulate the elements of my favourite guitar players, or my favourite speakers. But spurred to action by James' post, here are 10 things to consider.

1. Take a class

Before you can give a tech presentation, you should be able to give a presentation. JDD's post, and the comments, concentrate on things like pacing, pause words ("um", "so"), eye contact, etc. A lot of advice is written as "just keep these 1000 things in mind the next time you are feeling the adrenalin rush of the fight-or-flight syndrome while in front of a crowd".

Gee, thanks. Here's some real advice: if you want to become a better speaker, with a chance to receive genuinely constructive criticism, take a class. There are classes at your local college. There are higher-end options like Dale Carnegie and Toastmasters. Or take an improv class.

It doesn't matter: just pick one and get out there.

(I have no affiliation with Dale Carnegie, but a quick plug. When I was 13, I was so shy that I had to steel my resolve to call a store and ask about their hours of operation. I took Dale C at age 22 and have never looked back. No one describes me as shy now.)

2. Know your audience

I learned this one the hard way. Above, I mentioned several types of talks: tech talks, keynotes, etc. Be sure to think about your gig, and match your preparation to it.

For example, at a brown bag tech lunch, you have about 3 minutes to show some code. These people are voyeurs, and code is their porn. If you show up with 10 slides about cargo cults and the history of computing, they aren't going to be happy. Similarly, if you are up for a keynote, and don't have some kind of polish, things are going to be rough as well.

Also, you need to understand the technical level of your audience. This should be fairly obvious, as I'm sure you suffered through mismatches as an attendee.

3. Know your audience, seriously

I'm repeating this one because of the hidden audiences.

At NFJS, Scott Davis recently joked about those long, gorgeous Flash intros on artsy/marketing websites. He said, "who are those for? Everyone clicks Skip Intro".

My first thought was: they are for other people who write Flash intros. It is an arms race among a small elite to impress each other. This is an example of a hidden audience.

As an example, a fancy Keynote presentation can be very slick and alluring, but if you have 3D dancing slide transitions that emit pyro-lasers onto the ceiling, are you trying to impress the audience, or are you trying to impress other speakers? or other Keynote users? or your own ego?

That is to say, who is your real audience?

Always, always, always keep the real audience as priority #1. Be slick, be funny, be wacky, but only insofar as it advances your message.

4. Steal

If you don't play guitar, you might think that each solo, each lick, is its own creative snowflake, a sonic fingerprint that is unique in the universe.

This is just one of many lies you've been led to believe.

Guitarists copy, steal, and nick from each other all the time, and always have. The reason you may not be able to tell is that the good ones are clever about it: they take the essence of an idea, and make it their own.

With respect to speaking, I'm not talking about stealing content. I'm talking about style. Once you've identified your style (see below) think about who you like as a speaker, and why. Then, pattern your talk using similar elements.

A great technique for keynote addresses is referencing a topic far removed from the ostensible subject, and then tying it in. A fantastic example is Dave Thomas' talks and writings on cargo cults.

5. Be true to yourself

This item is in a delicious tension with the previous one.

Some speakers are animated and theatric. Some are dry and yet genuinely funny. Some are no-nonsense and try to maximize the amount of content provided to you. This is all fine and well.

No one should label or box themselves in, but it is wise to think about the speaker you want to be. If you have a naturally dry sense of humor, then it may be futile to try and speak as a different character. Public speaking is inherently outside our comfort zone, so there is no need to double that by pretending to be someone you're not.

That said, it can be electrifying to go on stage. Many entertainers have alter egos that appear out of nowhere when the lights go up. If that happens, great, but it isn't necessary.

The upshot: take risks but follow your intuition.

6. Have a message

Everyone knows the old saw, "tell 'em what you're gonna tell 'em, then tell 'em, then tell 'em what you told 'em". That's good stuff.

The key point: have something to tell them.

I'm old-fashioned but I was taught that an essay should have a thesis statement. A movie should have a story. A novel should have a narrative, and so on.

In the same way, I think that a talk should have an essential message that can be condensed into a short outline or a simple phrase. If someone asks "what was your talk about?", you should be able to answer, coherently, in 30 seconds.

This may seem obvious for an expansive keynote address, but I think it applies even to the humble brown-bag tech lunch. My goal for such a lunch is to present a topic to the audience so that they can decide if they want to pursue it further. Consequently, the message is invariably along the lines of "This tool offers A, B, and C, but suffers from X. If you value X, then you may want to wait but if, like me, you value A above the others, then check this out".

The good news is that thinking about this up front will focus your preparation. As well, a creative challenge is to express your message without actually saying it, but this can be tricky (see the last item).

7. Prepare

This one is cheap and easy, but I am compelled to write it.

Prepare your talk. Practice, rehearse, check your time. Remember that time can evaporate on stage, especially if there are questions.

More than this, though, take every opportunity to prepare the equipment. If possible, go to the venue days beforehand. On the day of the talk, get there very early, and remember to test your equipment! Just showing up isn't enough!

I once had a golden opportunity to rehearse with some equipment, on the day before an event, and passed up the chance. It was a major error. The mic was hard to use and I didn't find out until "go time", despite having ample opportunity to prepare. Shameful.

8. Respect questioners, but keep it moving

Assume that a questioner is at the right technical level and earnestly trying to advance the cause of the talk on behalf of you and the audience.

If they are, no problem: be polite and answer the question.

However, this might not be true, or it may only become apparent after a couple of questions. E.g.
  • The person might not be at the right technical level for the group (e.g. if someone asks 'does CSS support aspect-oriented monads?' or 'what is a database?').
  • The person might have their own hidden audience and start to grand-stand to impress others or themselves.
  • Everyone is a comic. Often, this sets a warm atmosphere, but a comic can spin out of control after scoring some laughs.
(The unvarnished truth is that I'm guilty of all of these, as an audience member. Hopefully not too often!)

I defer to your intuition on how to be graceful, but it is important in these instances to acknowledge the person, be respectful, and then move on. The goal is to convey your message to the group.

Squash the impasse with the venerable "let's go offline". If you really follow up later in an earnest manner, it is better for everyone.

9. Learn from criticism

I'm paraphrasing the master, Dale Carnegie, on this one, as he said it best.

There are two ways to handle criticism: if it's accurate, learn and adapt from it; if it isn't accurate, be a duck and let that water just roll off your back.

The trick is to identify accuracy. This is difficult but clearly it requires objective reflection. And a keen sense for the difference between fact and opinion.

I think there is an asymptotic effect here: if you give a sufficient number of talks (see item #1!), and adapt, earnestly and honestly, to enough criticism, the curve will invariably tend towards you being an excellent speaker.

10. Break the damn rules

Just like in music, the most creative and wonderful things come when we break the rules. (Note that here I mean tactical rules like "use slides", and not themes, e.g. "respect your audience".)

However, it isn't just a matter of ignoring the rules like a bull in a china shop. To truly break the rules, one should first understand them.

In this way, one becomes a master. Note that the road is not easy: for every brilliant, rule-breaking, game-changing creation, there are countless disasters lying in the ditch. Talk is cheap: you must be prepared for failure if you try something crazy.

Again, it comes back to intuition: if you feel you're ready and can accept the consequences, go for it.

Rule-breaker? A Masterful (and high risk) Example

I'm bummed that someone, in the comments on JDD's post, already pointed out the video below. A friend and I saw Clifford Stoll at SD West circa 2000. It was the single best talk I have ever seen, and represents my own Platonic ideal as a speaker. I've searched for it but no luck. (I tried to capture its spirit in Beethoven didn't use Powerpoint).

Talking about it is like trying to describe a dream. Or for someone to describe seeing Stevie Ray Vaughan play live.

The version below is similar in nature. It doesn't (can't!) compare to the dream I saw, but it is great stuff.

I include it also for the duality: on one hand, this breaks all of the little tactical rules ('make eye-contact'); on the other hand, it preserves -- even illustrates -- the core principles behind a great presentation.

Though, again, you'd better be careful about running around your conference room like a mad scientist, I conclude with these questions:
  • Did he think about his thesis?
  • Did he know his audience? (a slide-rule!?)
  • Do you think he was earnest in conveying his message?
(Hint: yes).

Wednesday, April 1, 2009

Cult attacks OpenJDK, demands deified generics in Java

CodeToJoy Newswire
Parody City, CA
April 1, 2009

The OpenJDK team announced today that its source-control system received a "prolonged, sustained attack". The attack came from a small sect of developers who are strangely zealous about strong, runtime typing. The group's attempted upload to the site repeatedly stated "Give us deified generics in Java!".

One insider reported: "Apparently these people are not capable of submitting a reasonable proposal, nor are they satisfied with mere reified generics. They are continually attempting to inject a syntax such as:

class MyCollection<we_kneel_before_thee T> extends Collection<T>;

which we think is both unpleasant and inappropriate."

There is no word on the affiliation behind the deification of generics, but sources say they suspect the misfits have ties to the ML-family of languages.

Conficker inspires F# team to release immutable browser

CodeToJoy Newswire
Parody City, CA
April 1, 2009

Hot on the heels of the worldwide Conficker scare, insiders reveal that Microsoft has asked its nascent F# team to improve web security by writing an immutable browser.

Code-named Atom, the browser will allow users to manually type in a URL, and to scroll the results, but there is no mouse support within the HTML.

Said someone close to the team: "The aim is to bring pure functional programming to the web. Atom will have no support for Ajax or any other Javascript. In fact, the users are not allowed to click on the page, or to have plugins of any kind. This will thwart the hacking community and save untold numbers of person-years."

When pressed on the utility and security of having to manually enter URLs, the source replied, "It is definitely less convenient, but since we cannot secure the OS level, this is really the next best route. Plus, we are excited to bring the principles of functional programming to the desktop".

In early market trading, shares of URL-shortening sites went up 4109%.