Wednesday, October 27, 2010

Get a list of files that have changed between revisions in Subversion

svn diff -r REVNO:HEAD --summarize

source: http://muffinresearch.co.uk/archives/2008/09/15/svn-tip-get-list-of-files-changed-between-revisions/

Thursday, October 21, 2010

Macs

You don't use a Mac if you know what you're doing. You use a Mac if you don't know what you're doing and you just want someone who does know what they're doing to configure a machine with all the basics set to the most popular settings.

Wednesday, October 20, 2010

Whole Trackpad as a scrollwheel

Update: The below is all crap. It turns out the best way to do it is simply to add a call to synclient to your bashrc, setting LeftEdge and RightEdge to 0:

synclient LeftEdge=0
synclient RightEdge=0





--------------------------------------------------------------------------------------
Using synclient -l to display all current values, find the value of LeftEdge, and then set RightEdge to that same value in your xorg.conf under the 'InputDevice' section:

Section "InputDevice"
Identifier "Synaptics Touchpad"
Driver "synaptics"
Option "SendCoreEvents" "true"
Option "Device" "/dev/psaux"
Option "Protocol" "auto-dev"
Option "SHMConfig" "on"
Option "HorizEdgeScroll" "0"
Option "RightEdge" "1752"
EndSection


Source: http://ubuntuforums.org/showthread.php?t=886449

Interruptible Programmer

Source


I’m 37, and I’ve been a (professional) developer for 16 years. You would have thought that in that time, I’d have figured out an effective work style which delivered the desired outcomes (code cut, products shipped etc) without causing detrimental knock-on effects – but, sadly, you’d be wrong. I think the style in which I practiced my craft for the first 15 years of my career was much the same as every other enthusiastic developer: you put a ton of hours in. 12-16+ hour days, evening and weekend coding marathons, pizza in the keyboard, crunch times, 3am debugging sessions where you just can’t go to bed because you can feel the source of that bug just beyond your fingertips, dammit, desperate last-minute sprints to deadlines where you manage to slot that last piece in, Jack Bauer-like, just before the world goes to hell. If you’re in the demographic I’m talking about, you’re nodding sagely, and probably grinning a little too, reminiscing on past trials and glories. This sort of crazy dedication is respected in our circles, and is pretty much expected of any developer who has claimed to earn their stripes.
But, it turns out this kind of thing is not good for your health – who knew? Those of you who know me or keep up with my blog know that I’ve been dragged kicking and screaming away from my old ways, because of back issues that I initially ignored, then tried to cope with using token accommodations, and finally succumbed to in a big way. Being self-employed, this was a major problem. Crawling out of the pit I dug for myself took a long time and a lot of frustration – I read quite a few productivity books on the subject to try to find answers on how to keep working, and in the end found that the answers you mould for yourself tend to be the best ones. I’d like to share one of the things I learned along the way.
But I’m ‘In The Zone’!!
So, I want to talk about the biggest problem I encountered: concentration periods. I can’t sit at a desk for longer than about an hour at a time now; if I don’t get up and walk around, do some gentle stretching etc, at least this often, I’ll pay for it badly once I do move, and probably over the next few days too. I also can’t realistically work more than a standard 8 hour day without pain any more. The problem with this was that, as a programmer, the style which I developed over 15+ years involved getting gradually ‘Into The Zone’ and coding for very long periods at a time, uninterrupted. This is a common theme among coders, who like to shut themselves away for hours at a time, wear headphones to avoid distractions, have ‘quiet times’ and so on – and it’s also why we tend to react really badly when interrupted. Programming requires concentration, and concentration seems to run on a valve system – it takes time to warm up, and once it’s going, you don’t want to turn it off because starting it up again is a major hassle.
I thought there was no way around this, and had begun to resign myself to just being less productive because of it. However, over the last 6  months in particular, I’ve discovered that, far from being an intractable problem, this ‘slow warm up, long uninterrupted focus time’ approach is to a large degree a learned behaviour, and it’s possible to re-train yourself to cope with things differently. It’s a little like when people learn to adopt polyphasic sleep patterns – it’s not that you can’t do it, it’s just that when you’ve become accustomed to doing things a certain way, changing that is initially very, very hard. But it’s not impossible, given the right amount of motivation and time to adjust.
So, my goal was to acclimatise myself to many shorter work chunks during the day instead of a few very large ones, while still maintaining productivity. The key to this was to learn how to get back ‘In The Zone’ in the shortest time possible – much like the way polyphasic sleepers train themselves to achieve REM sleep more quickly. I’m mostly there now, or at least way better at it than I was, so, what techniques did I use to make this transition?
  1. Embrace interruptions
    This is less of a technique and more of a deliberate psychological adjustment which cuts across all the practical approaches I’ll cover next. Instead of being the typical coder who avoids interruptions at all costs, you need to accept them, and learn to manage them better. It’s hard – you have to try to set aside years of resisting interruptions and initially, until you adjust, you’ll feel like you can’t get enough done. Many people will probably want to give up, unless there’s something specific motivating them to push through it – for me, daily pain was a great motivator. My main message here is that the transition is just a phase, and that it is possible to be an interruptible programmer who still gets things done. But you have to learn not to fight against it, which is why it’s the first point.
  2. Maintain context outside of your head at all times
    Much of the problem with interruptions is that of losing context. When you’re in that Zone, you’re juggling a whole bunch of context in your head, adjusting it on the fly, and maintaining and tweaking connections between issues constantly. Interruptions make you drop all that, and it takes time to pick it all up again. My answer to this was to externalise as much as possible, on as many levels as possible:
    1. Maintain a running commentary on your current task
      I am my very own chronicler.
      I write notes on what I’m doing all the time, whether it’s adding a comment line to a ticket, committing frequently and writing detailed commit notes (you do use a DVCS to make light commits more practical, right? ;)), or scribbling a drawing on (ordered) pieces of paper. This really isn’t that onerous, and in fact externalising your thoughts can often help you clarify them. Basically the guide is that roughly every 30 minutes, I should have generated some new piece of context which is stored somewhere other than my head. If I haven’t, then that’s context I’d have more trouble re-building mentally if I’m interrupted. It doesn’t take much time to do, and it has other benefits too such as recording your thought & decision process.
    2. Ruthlessly ignore tangential issues
      You might have noticed that in the last bullet, I used the words ‘current task’, singular. Not ‘tasks’. There is no such thing as having more than one ‘current task’ – there is only the one task you’re actually working on, and distractions.
      We probably all use bug trackers / ticket systems to track bugs and feature requests, but when you’re working on a ticket, it’s very common to spot a new bug, or identify an opportunity for improvement, or think of a cool new feature. How many of us go ahead and deal with that right away, because it’s in the area we’re already in, or it’s ‘trivial’, or it’s a cool idea that you want to try right now?  I know I did – but I don’t any more; any tangential issues not related to what I’m currently doing get dumped into the ticket system and immediately forgotten until I’m done with the current task, regardless of their size, relevance or priority. It sounds simple and obvious, and this might even be official procedure in your organisation, but I challenge most coders to say that they actually do this all the time. The point is that even the tiniest of distractions add an extra level of context that you have to maintain, which is then harder to pick up again after an interruption.
      For this to work, you need a ticket system which is fast, lightweight, and doesn’t require you to be anal about how much detail you put in initially. You need to be in & out of there in 30 seconds so you can offload that thought without getting distracted – you can flesh it out later.
    3. Always know what you’re doing next
      This is one from GTD (‘Next actions’), but it’s a good one. When you come back from a break or interruption, you should spend no time at all figuring out what you need to be doing next. Your ticket system will help you here, and so will the running commentary that hopefully you’ve been keeping on your active task. If you’ve been forced to switch gears or projects, so long as you’ve maintained this external context universally, you should have no issue knowing what the next actions on each item are. The important thing is to have one next action on each project. If you have several, you’ll have to spend time choosing between them, and that’s wasted time (see the next section on prioritisation). At any one time, you should not only have just one current task, but one unambiguous next action on that task. Half the problem of working effectively is knowing what you’re doing next.
  3. Prioritise Negatively
    I mentioned next actions in the previous section, but how do you decide what comes next? A lot of time can be frittered away agonising over priorities, and I used to struggle with it; I would plan on the assumption that I wanted to do everything on the list, and I just needed to figure out which I needed to do first. I discovered that I could cut the amount of time I spent on planning, and also get better, less ambiguous priorities by inverting the decision making process – to assume a baseline that I wouldn’t do any of the tasks, and assessing the negative outcomes of not doing each one. So instead of ‘which of feature A or B is more important to have?’, it became ‘Let’s assume we ship without feature A and B. What are the issues caused by omitting them in each case?’. It might appear to be a subtle difference, but having to justify inclusion entirely, rather than trying to establish a relative ordering assuming they all get done eventually, tends to tease out more frank evaluations in my experience.
  4. Recognise the benefits of breaks
    Much of the above is about limiting the negative aspects of taking breaks, but the fact is that they have many work-related benefits too. I’m willing to bet that all coders have stayed late at work, or late into the night, trying to fix a problem, only to find that they fix it within 15 minutes the next day, or think of the answer in some unlikely place like the shower. The reason for this is very simple – extended periods of concentration seem productive, and can be for operational / sequential thinking, but for anything else such as creative thinking or problem solving, it’s very often exactly the opposite. Not only do tired minds think less clearly, but often the answer to a problem lies not in more extensive thinking down the current path which you’ve been exploring in vain for the last few hours, but in looking at the problem from a completely different perspective. Long periods of concentration tend to ‘lock in’ current trains of thought, making inspiration and strokes of genius all too rare. Creativity always happens when you’re not trying, and it’s an often under-appreciated but vital element of the programming toolbox. Interrupting that train of thought can actually be a very good thing indeed.
There’s more I could talk about, but that’s quite enough for now, I think. I hope someone finds this interesting or useful.

Saturday, October 16, 2010

The Last of the 8s (Strange Adventures in Math)

Today I learned that I am one of the last of the 8s... I was born in 1978, and that year has a neat property: 19 + 78 = 97, the middle two digits of 1978. Nifty. What other years have this?

Well, assuming we're just looking for years between 1000 and 10,000, we can write some Python code to figure that out:
for i in range(1000, 10000):
    a = i // 100          # first two digits
    b = i % 100           # last two digits
    c = (i // 10) % 100   # middle two digits
    if a + b == c:
        print(i)
This produces something kinda cool... but that I don't really understand:
1208
1318
1428
1538
1648
1758
1868
1978
2307
2417
2527
2637
2747
2857
2967
3406
3516
3626
3736
3846
3956
4505
4615
4725
4835
4945
5604
5714
5824
5934
6703
6813
6923
7802
7912
8901
That's 8 years that end in 8, 7 that end in 7, 6 that end in 6, and so on down to one year that ends in 1 (8901). But where are the 9s? They're hiding in the sum of the first and last digits: every year on the list has first and last digits that add up to 9, which is also why none of them end in 9. There are 36 years in total. There are some other patterns in there as well. I'm sure there's a whole dissertation in mathematics that's been written on this, but I'm not even sure where to start searching for more info about it (it's the Loblaw principle... named after its founder, Bob Loblaw..)
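
A quick sketch of why the list looks like that (just my own back-of-the-envelope algebra): write a candidate year with digits w x y z, so the test is (10w + x) + (10y + z) = 10x + y, which rearranges to 10w + z = 9(x - y). The left-hand side therefore has to be a multiple of 9, and since 10w + z leaves the same remainder mod 9 as w + z, the first and last digits must sum to 9 (the w = z = 9 case would need x - y = 11, which is impossible), which is why no qualifying year ends in 9. Substituting z = 9 - w gives 9w + 9 = 9(x - y), i.e. x - y = w + 1, and there are 9 - w ways to choose middle digits with that difference: 8 for years starting with 1, 7 for years starting with 2, and so on, giving 8 + 7 + ... + 1 = 36 years in total.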

Friday, October 8, 2010

Alex Martelli on python vs ruby

From comp.lang.python

Erik Max Francis wrote:
> "Brandon J. Van Every" wrote:

>> What's better about Ruby than Python? I'm sure there's something.
>> What is it?

> Wouldn't it make much more sense to ask Ruby people this, rather than
> Python people?

Might, or might not, depending on one's purposes -- for example, if
one's purposes include a "sociological study" of the Python community,
then putting questions to that community is likely to prove more
revealing of information about it, than putting them elsewhere:-).

Personally, I gladly took the opportunity to follow Dave Thomas'
one-day Ruby tutorial at last OSCON. Below a thin veneer of syntax
differences, I find Ruby and Python amazingly similar -- if I was
computing the minimum spanning tree among just about any set of
languages, I'm pretty sure Python and Ruby would be the first two
leaves to coalesce into an intermediate node:-).

Sure, I do get weary, in Ruby, of typing the silly "end" at the end
of each block (rather than just unindenting) -- but then I do get
to avoid typing the equally-silly ':' which Python requires at the
_start_ of each block, so that's almost a wash:-). Other syntax
differences such as '@foo' versus 'self.foo', or the higher significance
of case in Ruby vs Python, are really just about as irrelevant to me.

Others no doubt base their choice of programming languages on just
such issues, and they generate the hottest debates -- but to me that's
just an example of one of Parkinson's Laws in action (the amount of
debate on an issue is inversely proportional to the issue's actual
importance).

One syntax difference that I do find important, and in Python's
favour -- but other people will no doubt think just the reverse --
is "how do you call a function which takes no parameters". In
Python (like in C), to call a function you always apply the
"call operator" -- trailing parentheses just after the object
you're calling (inside those trailing parentheses go the args
you're passing in the call -- if you're passing no args, then
the parentheses are empty). This leaves the mere mention of
any object, with no operator involved, as meaning just a
reference to the object -- in any context, without special
cases, exceptions, ad-hoc rules, and the like. In Ruby (like
in Pascal), to call a function WITH arguments you pass the
args (normally in parentheses, though that is not invariably
the case) -- BUT if the function takes no args then simply
mentioning the function implicitly calls it. This may meet
the expectations of many people (at least, no doubt, those
whose only previous experience of programming was with Pascal,
or other languages with similar "implicit calling", such as
Visual Basic) -- but to me, it means the mere mention of an
object may EITHER mean a reference to the object, OR a call
to the object, depending on the object's type -- and in those
cases where I can't get a reference to the object by merely
mentioning it I will need to use explicit "give me a reference
to this, DON'T call it!" operators that aren't needed otherwise.
I feel this impacts the "first-classness" of functions (or
methods, or other callable objects) and the possibility of
interchanging objects smoothly. Therefore, to me, this specific
syntax difference is a serious black mark against Ruby -- but
I do understand why others would think otherwise, even though
I could hardly disagree more vehemently with them:-).
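
As a tiny Python illustration of the point above (my own
sketch, not part of the quoted post):

def greet():
    return "hello"

f = greet        # a bare mention is just a reference to the function
print(f())       # the trailing () is the call operator (no args here)

callables = [greet, len]     # callables pass around like any other object
print(callables[0]())        # prints: hello
print(callables[1]("ciao"))  # prints: 4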

Below the syntax, we get into some important differences in
elementary semantics -- for example, strings in Ruby are
mutable objects (like in C++), while in Python they are not
mutable (like in Java, or I believe C#). Again, people who
judge primarily by what they're already familiar with may
think this is a plus for Ruby (unless they're familiar with
Java or C#, of course:-). Me, I think immutable strings are
an excellent idea (and I'm not surprised that Java, independently
I think, reinvented that idea which was already in Python), though
I wouldn't mind having a "mutable string buffer" type as well
(and ideally one with better ease-of-use than Java's own
"string buffers"); and I don't give this judgment because of
familiarity -- before studying Java, apart from functional
programming languages where _all_ data are immutable, all the
languages I knew had mutable strings -- yet when I first saw
the immutable-string idea in Java (which I learned well before
I learned Python), it immediately struck me as excellent, a
very good fit for the reference-semantics of a higher level
programming language (as opposed to the value-semantics that
fit best with languages closer to the machine and farther from
applications, such as C) with strings as a first-class, built-in
(and pretty crucial) data type.

Ruby does have some advantages in elementary semantics -- for
example, the removal of Python's "lists vs tuples" exceedingly
subtle distinction. But mostly the score (as I keep it, with
simplicity a big plus and subtle, clever distinctions a notable
minus) is against Ruby (e.g., having both closed and half-open
intervals, with the notations a..b and a...b [anybody wants
to claim that it's _obvious_ which is which?-)], is silly --
IMHO, of course!). Again, people who consider having a lot of
similar but subtly different things at the core of a language
a PLUS, rather than a MINUS, will of course count these "the
other way around" from how I count them:-).

Don't be misled by these comparisons into thinking the two
languages are _very_ different, mind you. They aren't. But
if I'm asked to compare "capelli d'angelo" to "spaghettini",
after pointing out that these two kinds of pasta are just
about undistinguishable to anybody and interchangeable in any
dish you might want to prepare, I would then inevitably have
to move into microscopic examination of how the lengths and
diameters imperceptibly differ, how the ends of the strands
are tapered in one case and not in the other, and so on -- to
try and explain why I, personally, would rather have capelli
d'angelo as the pasta in any kind of broth, but would prefer
spaghettini as the pastasciutta to go with suitable sauces for
such long thin pasta forms (olive oil, minced garlic, minced
red peppers, and finely ground anchovies, for example - but if
you sliced the garlic and peppers instead of mincing them, then
you should choose the sounder body of spaghetti rather than the
thinner evanescence of spaghettini, and would be well advised
to forego the anchovies and add instead some fresh spring basil
[or even -- I'm a heretic...! -- light mint...] leaves -- at
the very last moment before serving the dish). Ooops, sorry,
it shows that I'm traveling abroad and haven't had pasta for
a while, I guess. But the analogy is still pretty good!-)

So, back to Python and Ruby, we come to the two biggies (in
terms of language proper -- leaving the libraries, and other
important ancillaries such as tools and environments, how to
embed/extend each language, etc, etc, out of it for now -- they
wouldn't apply to all IMPLEMENTATIONS of each language anyway,
e.g., Jython vs Classic Python being two implementations of
the Python language!):

1. Ruby's iterators and codeblocks vs Python's iterators
and generators;

2. Ruby's TOTAL, unbridled "dynamicity", including the ability
to "reopen" any existing class, including all built-in ones,
and change its behavior at run-time -- vs Python's vast but
_bounded_ dynamicity, which never changes the behavior of
existing built-in classes and their instances.

Personally, I consider [1] a wash (the differences are so
deep that I could easily see people hating either approach
and revering the other, but on MY personal scales the pluses
and minuses just about even up); and [2] a crucial issue --
one that makes Ruby much more suitable for "tinkering", BUT
Python equally more suitable for use in large production
applications. It's funny, in a way, because both languages
are so MUCH more dynamic than most others, that in the end
the key difference between them from my POV should hinge on
that -- that Ruby "goes to eleven" in this regard (the
reference here is to "Spinal Tap", of course). In Ruby,
there are no limits to my creativity -- if I decide that
all string comparisons must become case-insensitive, _I CAN
DO THAT_! I.e., I can dynamically alter the built-in string
class so that
a = "Hello World"
b = "hello world"
if a == b
print "equal!\n"
else
print "different!\n"
end
WILL print "equal". In python, there is NO way I can do
that. For the purposes of metaprogramming, implementing
experimental frameworks, and the like, this amazing dynamic
ability of Ruby is _extremely_ appealing. BUT -- if we're
talking about large applications, developed by many people
and maintained by even more, including all kinds of libraries
from diverse sources, and needing to go into production in
client sites... well, I don't WANT a language that is QUITE
so dynamic, thank you very much. I loathe the very idea of
some library unwittingly breaking other unrelated ones that
rely on those strings being different -- that's the kind of
deep and deeply hidden "channel", between pieces of code that
LOOK separate and SHOULD BE separate, that spells d-e-a-t-h
in large-scale programming. By letting any module affect the
behavior of any other "covertly", the ability to mutate the
semantics of built-in types is just a BAD idea for production
application programming, just as it's cool for tinkering.

If I had to use Ruby for such a large application, I would
try to rely on coding-style restrictions, lots of tests (to
be rerun whenever ANYTHING changes -- even what should be
totally unrelated...), and the like, to prohibit use of this
language feature. But NOT having the feature in the first
place is even better, in my opinion -- just as Python itself
would be an even better language for application programming
if a certain number of built-ins could be "nailed down", so
I KNEW that, e.g., len("ciao") is 4 (rather than having to
worry subliminally about whether somebody's changed the
binding of name 'len' in the __builtins__ module...). I do
hope that eventually Python does "nail down" its built-ins.
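
A tiny Python sketch of exactly that worry (again my own
illustration, not part of the quoted post; "builtins" is the
modern Python 3 name for what the post calls __builtins__):

import builtins

print(len("ciao"))    # 4, as expected

# nothing stops some imported module from covertly doing this:
builtins.len = lambda obj: -1

print(len("ciao"))    # now -1, for every caller in the program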

But the problem's minor, since rebinding built-ins is quite
a deprecated as well as a rare practice in Python. In Ruby,
it strikes me as major -- just like the _too powerful_ macro
facilities of other languages (such as, say, Dylan) present
similar risks in my own opinion (I do hope that Python never
gets such a powerful macro system, no matter the allure of
"letting people define their own domain-specific little
languages embedded in the language itself" -- it would, IMHO,
impair Python's wonderful usefulness for application
programming, by presenting an "attractive nuisance" to the
would-be tinkerer who lurks in every programmer's heart...).

Alex Martelli

Wednesday, October 6, 2010

Coach Johnny D's quote..

Our deepest fear is not that we are inadequate.
Our deepest fear is that we are powerful beyond measure.
It is our light, not our darkness, that most frightens us.
Your playing small does not serve the world.
There is nothing enlightened about shrinking so that other people won't feel insecure around you.
...We are all meant to shine as children do.
It's not just in some of us; it is in everyone.
And as we let our own lights shine, we unconsciously give other people permission to do the same.
As we are liberated from our own fear,
our presence automatically liberates others.

Marianne Williamson via Coach Johnny D