Monday, March 09, 2009

Apparently I am not alone

Hey Google,

I really hate that as soon as I sign in to write (which inevitably leads to a few searches for links) you start recording every search result I click on.

Where's the "No evil for me, thanks" checkbox?

Blog end


Sunday, March 08, 2009

What'd they say?

These are some of the quotes I was reading while I was writing this blog.
They are not all profoundly earth-shattering, but if they don't raise a few smiles then I am afraid your experience is insufficient for the current position.

Unix stores the smarts in the user; Windows stores the smarts in the OS.
If you can't traumatize a child at least once a day, what's the point of having them?
This absence of a defined way to manipulate test results for later display means that you have to take over the running of the tests if you want to do something interesting with the report
I would argue that the smartest programs -- the kind we want running on our machines at Amazon -- are programs that are self-aware enough to know when their fundamental assumptions have been violated, and to have at least some decision-making ability built in for handling such situations. I think it goes without saying that you'd rather have your program tell you when things are going wrong, rather than have your customers telling you, perhaps days later.
We learn idioms and then apply a kind of pattern-matching to recognise problems that can be solved with an idiom we already know. Some idioms are easier to express than others in each programming language.
The journey from genetic code to behavior can be circuitous and capricious, whether that code belongs to a fish or a fisherman. We have to be very careful in slicing the fat of instinct away from the bone of intelligence, lest we have nothing left to sink our teeth into.
That [study on relative brain size to body mass] puts sharks and rays in the same vicinity as birds and mammals. It makes you doubt whether sharks really 'mistake' a surfer for a seal on a crunchy cracker, doesn't it?
The carnivores have stayed ahead at each stage of evolution, as Jerison has shown using cranial fossil casts of extinct herbivores and carnivores. Interestingly, in South America, which is devoid of advanced carnivores, the herbivores have tiny brains. Unchallenged by the mind behind the hungry jaws, these grass munchers lived fruitfully and multiplied in blissful ignorance. In any contest of wits, never bet on the llama.
The Great Chain of Being is, in one sense, a generalization gradient of rights, with species more proximal to us favored with protection and species more distant flavored with sauces
Inconvenient experiences don’t have Web-scale potential, and platforms which monetize the gigantic scale of the Web is the only way to compete with the control you’ve lost, the only way to reclaim value in the music industry. If your consultants are telling you anything else, they are wrong.
every workplace is crazy; the trick is to find a place that matches your kind of crazy.
Since large programs grow from small ones, it is crucial that we develop an arsenal of standard program structures of whose correctness we have become sure -- we call them idioms -- and learn to combine them into larger structures using organizational techniques of proven value.
the Buddhist concept he’d been considering, and that the notion that anything was without context wasn’t really plausible
custom medicines per dose will head toward the free (but you’ll pay for your DNA profile)
There is already a good deal of syntax in Lisp. It's not necessarily bad to introduce more, as long as no one is forced to use it.
Programming is difficult to teach and most programmers learn their chops by looking at working code and using it as the basis for building their own programs.
Finding a bug in one’s code isn’t so much a surprise as a feeling of déjà vu.
Ohhhh yesssss, I remember thinking I should check that condition
Shorter code is more readable when it is shorter by dint of expressing the underlying relationship, without irrelevant details.
There are no metaphors at all in software and this is exactly why building software is so damn hard
That’s a people problem. It’s solved by four feet of rubber hose in the car park

"If you already know what recursion is, just remember the answer. Otherwise, find someone who is standing closer to Douglas Hofstadter than you are; then ask him or her what recursion is."

HTML is CICS with fonts

I wasn’t sure what I was going to tell an old-Europe industrial airline company but the gig was in Amsterdam so naturally I accepted


The plural of anecdote is not data

Good design is often slightly funny. This one may not always be true. But Dürer's engravings and Saarinen's womb chair and the Pantheon and the original Porsche 911 all seem to me slightly funny. Gödel's incompleteness theorem seems like a practical joke

By delaying learning VRML, I avoided having to learn it at all

When someone is working on the code to either add functionality or fix bugs, they usually have lots of feedback (sometimes involving large hammers)


Each journalist has about twenty awards on his or her desk - that's just armour plating for their egos

The Stone Age didn't end because we ran out of stone

They might as well have written the book in Klingon -- ehh, no I guess not, too many geeks can read Klingon

I sometimes think that it would be a good marketing trick to call it an improved version of Python. That sounds hipper than Lisp

Strangely enough, if you want to make something that will appeal to future generations, one way to do it is to try to appeal to past generations

According to Hackett, code scavenging is worth re-visiting because the Web makes it easier to find code and re-use it. He points to sites where massive amounts of existing code are available for potential scavenging such as Google code search, Sourceforge, Code Project, Microsoft's Codeplex, and O'Reilly's Code Search. Others include the Free Software Foundation (FSF), FreeVBcode.com, Freecountry and Freshmeat.

I think the reasoning here is flawed, in that it supposes that reflection on how we think is an accurate way of describing how we think

Consciousness is actually a fairly deep tower of self-awareness, and it's for the most part utterly absent in computer programs

I’ve been thinking a lot about content and context lately. As it turns out, there is only context

The acts of the mind, wherein it exerts its power over simple ideas, are chiefly these three: 1. Combining several simple ideas into one compound one, and thus all complex ideas are made. 2. The second is bringing two ideas, whether simple or complex, together, and setting them by one another so as to take a view of them at once, without uniting them into one, by which it gets all its ideas of relations. 3. The third is separating them from all other ideas that accompany them in their real existence: this is called abstraction, and thus all its general ideas are made.

John Locke, An Essay Concerning Human Understanding (1690)

So what does Hardy mean when he says there is no permanent place for ugly mathematics? He means the same thing Kelly Johnson did: if something is ugly, it can't be the best solution. There must be a better one, and eventually someone will discover it. ("best" means "most beautiful", which means ...)

Cognitively, "think" is just search for a solution in a high-dimension of variables, so we can consider all thought as a type of search

Unskilled users don't mind a bit of syntax. People have a natural ability to ignore the things they don't understand, that's how children learn to speak

Because each trick works in different situations, our power stems from being able to shift from one trick to another. To ask which meaning is correct - to count, or match, or group - is foolishness. Each has its uses and its ways to support the others. None has much power by itself, but together they make a versatile skill-system

In order to be shared, information is extracted from natural language, reduced to its distinct informational elements, and tagged into a database

..., it often depends on some background knowledge; once you have that knowledge, it's blindingly obvious and trivial, but without it, you just get confused

Every teacher knows that; trying to teach someone who doesn't care is a painful chore, but when you have a student who does, you have to work hard to keep information flowing to them as fast as they're eating it up. And those are the ones who make the whole process of teaching worthwhile

An article in Fortune magazine a couple of years ago compared the academic qualifications of people in business and found the qualification that correlated most highly with success was a philosophy degree

Babies are born with the desire to learn about the beings who populate their world and the ability to store information about each individual separately. They do not expect all adults to behave like their mother or all children to behave like their siblings. Children who quarrel incessantly with their brothers and sisters generally get along much better with their peers

The human brain is a marvel of associative processing, but in order to make associations, data must be loaded into memory


ASTONISHING, said Death. REALLY ASTONISHING. LET ME PUT FORWARD ANOTHER SUGGESTION: THAT YOU ARE NOTHING MORE THAN A LUCKY SPECIES OF APE THAT IS TRYING TO UNDERSTAND THE COMPLEXITIES OF CREATION VIA A LANGUAGE THAT EVOLVED IN ORDER TO TELL ONE ANOTHER WHERE THE RIPE FRUIT WAS?

Indeed, it is hard to imagine that linguistic communication could take place if our species could not mindread

Show me a switch statement as if it had been handled with a set of subclasses. There is underlying deep structure here. I should be able to view the code as if it had been done with switch or as if it had been done with polymorphism
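
(A quick sketch of my own, not the quoted author's, of the two views in question: the same dispatch written once as a switch over a type tag and once as a set of subclasses. The tool being wished for would let you flip between them.)

#include <iostream>

enum Kind { CIRCLE, SQUARE };

// View 1: the dispatch written as a switch over a type tag.
double area(Kind k, double size) {
    switch (k) {
        case CIRCLE: return 3.14159 * size * size;
        case SQUARE: return size * size;
    }
    return 0.0;
}

// View 2: the same structure written as a set of subclasses.
struct Shape  { virtual ~Shape() {} virtual double area() const = 0; };
struct Circle : Shape { double r; Circle(double radius) : r(radius) {} double area() const { return 3.14159 * r * r; } };
struct Square : Shape { double s; Square(double side)   : s(side)   {} double area() const { return s * s; } };

int main() {
    Circle c(2.0);
    std::cout << area(CIRCLE, 2.0) << " == " << c.area() << std::endl;  // same answer from either view
    return 0;
}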

Although we here in this room are computer users and we are thus stuck with the same annoyances, distractions, and addictions as all computer users, we are also developers and programmers, and for us, the computing experience should be intrinsically different. Programming is the most empowering thing we can do on a computer, and that’s what we do. We write the code that makes the whole world sing

While the Borg pattern works in Python, I do not see it as either necessary or interesting.

If I were in the dynamic language camp, I would argue that if you only want one instance of an object, make sure that you only create one instance of the object. This is clearly in the spirit of dynamic typing

Everyone agrees that one of the most important elements of writing self-documenting code is giving your variables and objects meaningful names

Global variables are basically gone in object-oriented programming, except that fields are now the new global variables, and they can be abused just as badly

Now, there are various ways to get in trouble

“Do the dumbest, simplest thing that almost works”

After a coder has spent a week or so writing a couple thousand lines of code it is not a good idea to then suddenly say to him/her that you want to do something different

Freedom of expression matters more to me than a little extra static checking

If any intelligence at all is put into the code, it's typically "protected" by compile-time or runtime flags that allow you to turn it off, so your program can run in Stupid Mode

Every computer program is a model, hatched in the mind, of a real or mental process

That's your Stockholm syndrome talking. The saving should happen automatically-- saving your work should be the default, not throwing away your work! How insane is that?

The developers who put a lot of effort into optimizing things and making them tight and fast will wake up to discover that effort was, more or less, wasted, or, at the very least, you could say that it “conferred no long term competitive advantage,” if you’re the kind of person who talks like an economist.

Unlike programs, computers must obey the laws of physics.

Without user testing, you are designing by guesswork and superstition.

A programming language is a collection of programs.

The next time that you complain that languages don't contain enough novel new features, try and remember the language designer's quandary


All programming languages are arcane and cryptic, in different ways and to varying degrees.


In the history of programming languages a surprising amount of effort has gone into preventing programmers from doing things considered to be improper.

Hackers share the surgeon's secret pleasure in poking about in gross innards, the teenager's secret pleasure in popping zits.

They will still strive fiercely -- almost instinctively -- to exercise all options open to them to change their minds.

That's how programmers read code anyway: when indentation says one thing and delimiters say another, we go by the indentation.

In practice, the way to get fast code is to have a very good profiler, rather than by, say, making the language strongly typed.

We made bloated class hierarchies for the imagined benefits of reuse.

While the choice of programming language is typically a sensitive subject, the truth is that it is not the language, but the libraries that come with it that make a difference.

If we ask 800 billion questions today and expect to get instant answers, where were those 800 billion questions 20 years ago?


Simplicity for the writer can mean work for the reader. Putting an added burden on the writer can simplify things for a learner. Simplifying for the writer is the most counter-productive form of introducing simplicity.

Simple code is code that does exactly what it appears to be doing. Simple code is not code that looks simple and then goes and does something different.

You're an amateur developer until you realize that everything you write sucks.

I forgot Pascal 15 years ago. This is not the same as never having learned it. For one thing, you mutter under your breath a lot more.

But you know there's nothing quite as permanent as a temporary stop-gap.


Actually, there are languages that do it even worse than COBOL. I remember one Pascal variant that required your keywords to be capitalized so that they would stand out. No, no, no, no, no! You don't want your functors to stand out. It's shouting the wrong words: IF! foo THEN! bar ELSE! baz END! END! END! END!

It's fairly unusual for any of our projects to have any plan more elaborate than "fix the current bugs and chase the next shiny thing we see".

You can spend years creating mountains of class hierarchies and volumes of UML in a heroic effort to tell people stories about all the great code you're going to write someday.

But in practice a good profiler may do more to improve the speed of actual programs written in the language than a compiler that generates fast code.

Consider this (real) example:

ASSERT(pFoo = NULL);

Not only does the assert not do what it was supposed to do (check that pFoo is NULL), but it accidentally fixes the problem it was meant to detect. If pFoo is wrong (not NULL), this assert will set it to NULL, obscuring the problem. If the ASSERT is compiled away in a release build, the code will start working worse than it did in the debug build.
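
(For completeness, a sketch of what the assert was presumably trying to say, using the standard assert rather than whatever ASSERT macro the original project defined: compare with ==, don't assign.)

#include <assert.h>
#include <stddef.h>

struct Foo { int value; };

void check(Foo* pFoo) {
    // == compares; = assigns. This version actually verifies that pFoo is NULL
    // instead of quietly setting it to NULL and hiding the bug.
    assert(pFoo == NULL);
}

int main() {
    Foo* pFoo = NULL;  // anything non-NULL here would trip the assert in a debug build
    check(pFoo);
    return 0;
}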

A language that wants to capture the hearts and minds of the blue-collar programmer needs to work very hard to have rules that are always simple and straightforward.

But parsing techniques are not greatly valued in the LISP community, which celebrates the Spartan denial of syntax.

I think a part of the problem is that traditional languages make us feel that we're productive because we can generate lots of code quickly in them. But is this a good thing?


Saturday, March 07, 2009

iCode

(Sorry for your trouble but (I like it, I like the links. To remind us there is something ... below us. I like that) in plain text that's merely more parenthetical.)

And I'm an artistic coder
  with sensitive moments when
    scripts surge on the slide to ssh.

No.

It was more like one of Homer's "disastertunities" - first thought that popped into my head when I saw nothing was "About time I tried a Mac".

I saw nothing because the PC's screen had just gone dead.
And the PC's a laptop, so the PC is dead.

Luckily, once it was officially dead,
it came straight back as a zombie. Not back as a headless zombie mind, because (for still unfathomable reasons) it could still feed out to the TV. Not to any other screen, and not via Linux. Only to the TV, and only from Windows.

So I spent a few weeks on The Dark Side™ of the boot,
  and I finally found a use for the TV.

Can't complain - the devil got the best tunes, and Windows has the best music software, so I got a few mash-ups out of it.

Back on the job there were a few hurdles to clear before the MacBook and the greedy claws could fulfill their foretold fête:
  • I can support it myself (the IT dept refused to even open the box when it arrived !)
  • it doesn't cost any more (although I had to drop from 15" to 13" screen to accommodate the extra hardware warranty they wanted)
  • I need one because, eh,
  • because, ...
  • because I need to view the company website with Safari !
  • and it's the only platform we can't virtualise. Well, not legally.
(fête, "Up against your will, through the thick and thin")

Hurdles were cleared, egos assuaged, and balances sheeted (or whatever it is they do), and the apple landed, on my desk (top), unopened. I opened it, smiled once or twice, and was quickly at a shell prompt with vim and ssh. I spent the day working.

Now it is a month later and it is the end of a productive week as the new dawn fades as we wend our weary ass way unto the time of the thinking and the drinking and the reflecting and the fond rememberemberinging and the cheese biscuits and the blogging and the barfing and the exaggerated claims on behalf of your client and no crumbs on the apple please.

11. Never publish when pissed

I smiled when opening it because they provide a cleaning cloth. When I lift the cloth the next box is labelled "designed by apple in cupertino". (Actually they claimed "in california", but it was in cupertino that I met the batmobile. Not The Batmobile mind you, just the one from the 60s TV series. It was hot. Round about Cupertino the grass is all brown. There's miles and miles of brown, all down 101. Except for the sprinklered strawberries, which are wow-red. (But I'm mumbling again't I)?)

I will smile when closing it because tomorrow when it opens again I can resume editing this page in Safari on Mac, and a single key will allow me to preview it in Firefox on Linux, and the same key again will take me to IE7 on XP, and back to here again. I'll smile because it just worked.

My old, and trusted, friend Gordon has a lot to do with it just working. It is indubitably easier to do two new virtual machines with 2 cores, 2GHz, 2GB and 200GB, than it was with the 2000 model. But it was also easier to find out how to do.

I do know the theory: my "desktop" at work has 8 different OSs installed, as chroots and virtuals. And I know the practice so well I wrote the intranet wiki page: the first of those guest OSs took a fortnight to set up, but we could get a student to do it now.

But I didn't know you shouldn't need a procedure, that you should just download a program that enables you to do what you want. (It's like the second, and especially third, mouse keys - from my recent stay on The Dark Side™ I'm sensitive to how much I depend on Unix's mouse copy and paste - where just a swipe gets the copy, and paste is on every middle click.

If you (young Padawannabe) find yourself on The Dark Side™, you will expect to right-click and you could hope to find "paste" somewhere on the (ever changing) menu and you might pray that it will just take text but You Will Despair of ever guessing What We Intend. On The Dark Side™ you will quickly learn that We Are Inscrutable, and you are not We. you are kusers.

Mere misspelled lusers.

Seriously though, the whole fucking paradigm is so goddamnably error-prone. Especially when I suddenly remember it was supposed to be "copy".

Whoa, back up a moment, I started middle clicking when U2 were still a Boy, and I'm moving to a platform with only one mouse button ?

Consequences, Schmonsequences. As long as I'm rich) And it should work the way you already expected it would.

That's the one that, combined with a cocktail of pharmaceutical enhancements, keeps UI designers gibbering at night:
  • it should work the way the luser expects it to
    • oops
    • sorry
    • correctamundo: UI designers
    • gotcha
    • Thanks
  • it should work the way the user expects it to
, when you don't know what the user's context is. (And context is the only true foundation of meaning, as we all know)

The other Steve seems to have a fairly good grasp of my context. Seems to have made some good guesses as to where I'm coming from. And the forgotten punchline of the parenthetical mouse story above was that after a few days (I did try one button) I plugged in my Logitech mouse to the MacBook and it just worked. Middle click does the same paste as on KDE or Gnome. Mmmmm...

Bill does not even deign to notice ext2 partitions on "his" disks, but the other Steve has middle-click waiting ready for any passing Unix heads, even though (I assume) his core users have only one button and don't know it exists.

As a hacker, occasional cracker, itinerant knacker, ex-packer, dedicated slacker, gratuitous slasher, and new macker, I just feel more invited to the party on The Bright Side.

Didn't know they were checking IDs at the door.

This post was aided and abetted by Echo & The Bunnymen. Duh.
