wissel.net

Usability - Productivity - Business - The web - Singapore & Twins

By Date: May 2011

Public Tenders for complex (IT) projects - a cure worse than the disease?


Most public sector institutions require contracts above a certain value to be awarded by tender. In Singapore the threshold is currently SGD 70k (at today's exchange rate approximately USD 57k or EUR 40k). The tender process is intended to ensure an impartial award of the project at hand to the most (or sufficiently) capable bidder at the lowest price. This would not only make Alan Shepard nervous (he coined the famous sentence: "The fact that every part of this ship was built by the low bidder"). Tenders work well when the published need is easy to fulfil and fulfilment offers are easy to compare.
However, the very moment complexity kicks in, a tender process gets expensive. The Australian process defines eight distinct stages a tender goes through until it is awarded. This has a number of consequences:
  • No small innovative company can put up with the process, so innovation stays outside
  • Since the need needs to be defined tenders tend (pun intended) to cement waterfall approaches to software projects
  • Tender language needs to be vendor neutral, so it is fun to see how tender documents try to disguise vendor specific features in neutral language or broad requirements no one can deliver, e.g. "the messaging platform needs to run on all prevalent server operating systems (this kicks Exchange out) and support calendaring and scheduling for all prevalent desktop applications (this kicks basically ALL out, since calendar interoperability is all but a pipe dream)".
  • Since large tenders are attended by only a select few, public sector agencies will be overcharged. This is not malice on the side of the tender submitters, but simple economic logic. Let us look at a simplified example: in a small country, let us call it Morovia, there are 5 system integrators (SI) eligible to bid for contracts of 1M and more. Due to the competitive pressure all of them bid for all of the projects. Since Morovia is looking into the future, software projects are large and complex and end up with a 1M budget each.
    All SIs spend about 10% of each tender's value on running it through all the phases up to the final award, which, as the nature of the tender process stipulates, goes to one bidder only (I have prepared a lot of tenders and can assure you that a lot of SIs would be happy with 10%).
    All SIs are highly qualified, so each wins an average of 1 out of 5 tenders. So far so good. But once you run the numbers (after all they are in business, striving for profit) you realise that they had to spend 500k in acquisition cost for 1M of contract value, leaving only 500k of value to be delivered in the project. 50 cents on the dollar isn't a very desirable outcome. I wonder what other projects are plagued by under-funding too.
    Tender process eating value
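The Morovia arithmetic can be checked in a few lines of JavaScript, a quick back-of-the-envelope sketch using the numbers straight from the example:

```javascript
// Back-of-the-envelope check of the Morovia example.
const tenders = 5;              // projects on offer, 1M each
const bidders = 5;              // eligible system integrators (SI)
const tenderValue = 1000000;
const bidCostRate = 0.10;       // ~10% of tender value spent per bid

// Every SI bids on every tender...
const acquisitionCostPerSI = tenders * tenderValue * bidCostRate;
// ...and wins on average 1 in 5, i.e. one 1M project
const revenuePerSI = (tenders / bidders) * tenderValue;
const valueLeftToDeliver = revenuePerSI - acquisitionCostPerSI;

console.log(acquisitionCostPerSI);              // 500000
console.log(valueLeftToDeliver / revenuePerSI); // 0.5 — 50 cents on the dollar
```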
Unfortunately I don't have a better alternative for these processes. There is a tension between the need for effective and efficient procurement, transparency, accountability and fairness. A few thoughts: put the definition phases into the public domain (like IdeaJam), so coming to an ideal solution is less dependent on vendor contribution (read: cheaper for the vendors) and more transparent. Trade ideas and concepts in a kind of stock exchange, so winners can emerge before awarding.

Posted by on 30 May 2011 | Comments (1) | categories: Business

What does an IBM Collaboration & Productivity Advisor do?


My uncle, being 81 and a professor emeritus (thus very busy), asked me what I actually do. While the voice-over is missing, the following Prezi gives you an overview of what I told him.

Posted by on 30 May 2011 | Comments (0) | categories: IBM

eMail migration - what do you do with the legacy?


Where do you put your old eMails? Setting up an eMail server or signing up for a cloud based offering is very straightforward. Mastering the trade takes a little longer. However, moving toward a new platform from wherever you are is not so crystal clear. There have been various studies about migration cost. One of them puts the budget per user for eMail migration in the range of USD 200 (plus opportunity cost for training and lost productivity). A big slice of that cost is for moving historical eMails across to the new platform. There are a number of approaches to deal with the old eMails:
  • Go for amnesia and leave the past behind. It worked for the White House, it could work for you. Biggest drawback: you are in breach of several binding regulations and others can take you for a ride (an eMail always has at least two ends). Advantage: clear mind
  • Retain the previous eMail client to look up historic records. It is only temporary, since most jurisdictions allow you to delete business records after 5-7 years (and a VM image can preserve that old OS too)
  • Print all the records you want to keep. "Print" would include the creation of PDF files. Be clear about the fact that you will most likely be in breach of various electronic transaction acts if you only take the default printout that omits the header (transmission) information. So you might print items you want to keep only after a retention mandate has expired
  • Export eMails into a vendor neutral format (that would be MIME). You need a good way to put these files (one per eMail) into a useful storage (the same problem applies to the PDF files). A file system might not qualify as such
  • Use an eMail platform neutral Enterprise archive to keep all your messages (works well with smart eMail life cycle management). The clear advantage here: the enterprise archive is a necessity regardless of your eMail strategy (stay where you are or depart to greener pastures) and can archive files as well. Usually it is a big ticket item and your storage vendor will love you
  • Finally: migrate your data to the new email platform
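For the vendor neutral export option, here is a minimal sketch of what one-file-per-eMail MIME output could look like. This is not Domino API code: `toMime` and the `mail` object are made-up names standing in for a message already extracted from the old platform.

```javascript
// Hedged sketch: serialise one already-extracted eMail into
// RFC 822 style MIME text ("mail" is a hypothetical message object).
function toMime(mail) {
  const headers = [
    "From: " + mail.from,
    "To: " + mail.to,
    "Date: " + mail.date,
    "Subject: " + mail.subject,
    "MIME-Version: 1.0",
    "Content-Type: text/plain; charset=utf-8"
  ];
  // A blank line separates headers from the body per RFC 822
  return headers.join("\r\n") + "\r\n\r\n" + mail.body;
}

const mime = toMime({
  from: "alice@example.com",
  to: "bob@example.com",
  date: "Mon, 30 May 2011 10:00:00 +0800",
  subject: "Migration test",
  body: "Hello Bob"
});
console.log(mime);
```

One such string per eMail could then be written out as an `.eml` file; the open question from the list above, where to store thousands of such files usefully, remains.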
The big question here: is archival a user or an admin responsibility? And what does your legal counsel say about eMail retention laws (keep in mind: based on its content an eMail can also qualify as a business record with its own set of legal constraints)?
As usual YMMV

Posted by on 30 May 2011 | Comments (0) | categories: Software

Saying NO needs a wrapper


Everywhere, saying NO to a request is loaded with difficulties (unless you are an admin of course, since NO would be your only word, at least towards developers). Depending on context and culture these difficulties vary. One of the core reasons is that every NO to the content of a request always carries the risk of being perceived as a NO to the relationship between sender and recipient. To successfully say NO you need to separate relationship and request (easier said than done) and wrap it up nicely. After all, a NO feels like a raw meat patty thrown at you (vegetarians need to stop reading now).
This is how a raw NO feels
The solution for the hamburger is to grill the meat, add garnish and wrap it into a bun. A similar approach is needed for a NO.
Proper treatment for a meat patty; vegetarians please imagine it is tofu
The general pattern for NO is: YES - NO - YES. So you wrap your NO into 2 YESes:
  1. YES, I'm interested in continuing this relationship and work towards your success
  2. NO, this is not how it is going to work / not what I will do for you
  3. YES, this is my suggestion to deliver the success we are both interested in
Wrapped in 2 YESes, the NO is much more bearable. Of course the challenge is to find the common success criteria in the eternal quest for a win-win scenario. Entire books have been written about how exactly that works. Absolutely recommended reading.

Posted by on 23 May 2011 | Comments (0) | categories: Business Intercultural

Focus to sell to your existing customers - should you?


Sales is a tricky business. It is the archetype of a money driven reward system where your quota and your ability to achieve it determine your reward. Selling software is complex and goes beyond a simple repetitive task, and science has shown that for such work financial rewards might not work the way expected. Anyway, a lot of sales people struggle to meet their quota. A recommendation they get to hear over and over from their sales leaders is: "Focus on selling to your existing customers. It is so much easier (it is) to sell to an existing customer than to acquire a new one". Sound advice, isn't it? Not so fast. When looking at the problem with recently refreshed thinking, the core pattern of Shifting the Burden jumped out at me. And that's not good!
Shifting the sales burden
Let's have a look:
  • The challenge is to meet the sales quota. If the sales person struggles, the problem is most likely a too narrow customer base. However, the symptom is a lack of sales orders. It is like toothache: the problem is a hole in your tooth, but the symptom is the pain
  • Focusing on selling more licenses or additional products to your existing customers is like taking a Panadol: the pain goes away but the problem remains. The quota is met for one quarter, but the struggle will resurface in the next
  • The fundamental solution is to widen the customer base. The snag here: it takes more time and effort to get there, and with the myopic focus on the quarter it feels rather scary to go there. However, one can't sell indefinitely to the same customers - after all, they have been promised to save money with all the stuff sold to them. Catch 22
  • With the total focus on existing customers, more and more effort is required, since selling additional stuff is subject to the law of diminishing returns. Furthermore, an "existing customer" is something very different from "an entry in the list of corporations that once bought stuff from us". The best way to turn a customer into a "former customer" is to show up only for sales. A relationship with a customer includes genuine care for their success (which unfortunately isn't a tickbox in your CRM)
  • The side effect is a consequence of focusing on the upper loop: the delay in getting the result of the lower loop increases (often dramatically), up to the level where it doesn't happen at all
So is it damned if you do and damned if you don't? The approach needs to be balanced: choose the middle way and divide attention between existing and new customers. Enlightened sales plans make use of a balanced scorecard to keep the burden on both shoulders.

Posted by on 23 May 2011 | Comments (1) | categories: Business

Deploying the extension library - client edition


Unless you are living under an XRock, your next XPages project will use the Extension Library. While installing it into your development environment is easy (just follow the instructions) and deployment to your 8.5.2 servers got a lot easier, deploying it to XPiNC applications takes a few steps more:
  1. You need an update site. Since you deployed it to the servers you would have one already
  2. You need a Widget catalog (typically called toolbox.nsf). Luckily that's just a File - Application - New
  3. You need an entry in the widget catalog for the extension library plug-in with a widget.xml describing the plug-in (details below)
  4. You need a client policy that points all Notes clients to the Widget catalog and makes the category you assigned to the Extlib Widget an automatic mandatory install
  5. Sit back and relax and see your extension library deployed
You can muse whether your update site should use HTTP or NRPC. The latter would have the advantage that you can replicate the update site to your clients (using a policy), making the update process independent from actual network connections. The tricky part is to get the XML file right.

Read more

Posted by on 19 May 2011 | Comments (a) | categories: XPages

Testing Notes and Domino applications


Chuck Norris doesn't test. For the rest of us: you need a plan when preparing to test Notes applications. Traditionally, Notes application testing has been an afterthought. After all, adding a few fields onto a form and creating a few columns in a view was all you needed for an entry level Notes application. Like the slowly boiled frog, we often don't recognise the point when applications have grown so big that a more rigorous test regime starts to make sense. While TDD is all the rage for code driven development, it hasn't taken a strong hold in the Domino world, which is very visually driven.
A lot of times the first consideration of a systematic test only arises when you want to upgrade a server: will all my applications run? Do I need to change anything?
In general Notes and Domino are fiercely backward compatible, so the odds are with you. However, the odd changes (like variable names that became class names) and subtle alterations in behaviour will require that you at least have a look.
Now, testing an existing application that has been built anyhow is a completely different animal from building an application suitable for automated testing. It is like building a house with proper heating and plumbing vs. retrofitting them into an existing building - just ask your average castle owner. So let us have a look at the two use cases:
  • Testing an existing application

    This is the trickier use case since your application most likely deeply connects your user interface with your data back-end. So a few steps are necessary to perform tests.
    • Know your code: using Teamstudio Analyzer, DXLMagic, Ytria ScanEZ or OpenNTF's Source Sniffer you can get a feel for the code base: where are your agents, what @Db* functions are at work, where are external system calls, how complex is your code and where have developers committed a crime by omitting Option Declare. Teamstudio's Upgrade filters provide a handy way to get the pay-attention-to-this list when planning to test for upgrades
    • Prepare a baseline version of your application. Baseline doesn't mean empty, but fully configured with sample data (you want to test "this value gets edited and changed" too). ZIP away the baseline, so you can repeat the test
    • Prepare an environment that reflects your use cases. The two main fallacies in testing are using a small database and a fast network (and then acting surprised when users complain about the performance of the large database on the slow network). So you might end up having a baseline with a lot of data preconfigured. Using Apache's TCPMon you can simulate a slow network connection even in your exclusive single user gigabit test network. (There are more)
    • Write a test plan. Depending on who (or what) is executing it, it can be short sentences or elaborate instructions. Make sure to include edge cases where values are close to a limit, exactly hit it or just exceed it a little.
      A Notes database is a good place to keep a test plan (I don't mean attaching a text document to a Notes database, but having a proper form with all test data). Test plans ensure that you don't miss something. Test plans are living documents. Your test plan WILL be incomplete, so be prepared to update it frequently (this is why every test deserves its own Notes document, with versioning). When you look at a test plan you will find striking similarities to use cases; they are actually natural extensions of them. While the use case describes what the user (or system) does, the test plan describes in detail how it is done
    • Pick a test tool that can replay testing sequences. Here it gets a little tricky. IBM doesn't offer a tool for Notes client testing. There is server.load, but that's mostly suitable for mail performance testing only. Luckily the Yellowsphere stepped in and SmartTouchan's Autouser fills the gap. It allows for both functional and performance testing. When you test Domino web applications your selection is wider. You need to distinguish between functional and performance testing:
      • Performance:

        Here you can test the raw Domino performance by requesting and sending HTTP packages from/to the Domino server, or the overall performance including your specific browser's JavaScript performance. Typically you go for the first case. Here you can start with JMeter or Rational Performance Tester (there are others, but my pay cheque doesn't come from there)
      • Functionality:

        Top of my list is Rational Functional Tester (same reason as above), but I also like Selenium, which nicely runs in your Firefox. There are almost infinite combinations of browsers and operating systems, so to no surprise you can find a number of offerings that do the cross browser testing chore for you. Some of them can run against your intranet application, so you don't need to open up your applications to the wild west.
      There is no magic bullet application. Testing is time consuming and comes with a learning curve (guess what a lot of interns do)
    • Run the tests in various environments (if you test for upgrades) or before and after code changes (if you test for performance or regression avoidance)
    • Be very clear: the cost of test coverage grows exponentially and you can't afford 100%. A good measurement is to multiply the likelihood of an error by the economic damage it can do. If you spend more money on testing than that, you are wasting it. (Of course that is a slippery slope if an application error can lead to physical harm; this is where I see the limit of "assess economic damage only")
    • Your test plan should start with the most critical and most used functions and then move to the less used and less critical actions. Repeat until you run out of time or budget.
  • Building applications for testability

    Your overall test cycle gets shorter and more efficient when you design an application for testability from the very beginning. The approach is called "Test Driven Development" (TDD). In a nutshell: you write your tests before you write code (which will make them fail), then you write code until the tests pass. This works well for, well, code. In a highly interactive visual environment like Domino Designer that is much harder. Very often business logic is hidden (pun intended) in hide-when formulas. You (partially) need to let go of such ideas. Some pointers:
    • Put business logic into script libraries. You can write test methods that call these functions more easily
    • Have a mechanism that generates your test data, so you can start over with a test series anytime
    • Use the XPages unit tests on OpenNTF
    • Abstract your logic from its implementation. So instead of writing @DbLookup(@DbName,"configLookup","departments",2) you write systemConfig.getDepartments() or at least systemConfig.getConfig("departments") and implement the actual data access in a library.
    • There is much more to say... some other time
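The abstraction pointer above can be sketched in plain JavaScript (close in spirit to SSJS). This is a hedged illustration, not real Domino code: `lookupView` and `makeSystemConfig` are made-up names, with `lookupView` standing in for the actual @DbLookup call so a test can inject a fake instead of hitting a database.

```javascript
// Sketch: hide the data access behind a config object, so the
// implementation (a real @DbLookup in production) can be swapped
// for a fake in a unit test. All names are illustrative.
function makeSystemConfig(lookupView) {
  return {
    getConfig: function (key) {
      // In production this would delegate to @DbLookup against
      // the configLookup view, column 2
      return lookupView("configLookup", key, 2);
    },
    getDepartments: function () {
      return this.getConfig("departments");
    }
  };
}

// A unit test injects a fake lookup and never needs a live server
function fakeLookup(view, key, column) {
  const data = { departments: ["Sales", "Engineering", "HR"] };
  return data[key];
}

const systemConfig = makeSystemConfig(fakeLookup);
console.log(systemConfig.getDepartments()); // the three fake departments
```

Because the data access is injected, the business logic becomes testable in isolation, which is exactly the point of moving it out of hide-when formulas and into libraries.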
As usual YMMV

Posted by on 18 May 2011 | Comments (1) | categories: Show-N-Tell Thursday

Where is my trusted system?


In the beginning was my Time System serving as mobile trusted system (and status symbol) and the world was good. My local storage was handled by Mappei which happened to have a 43 folder module more than 25 years ago and an ingenious way to deal with folders. The world was good.
Along came a rapid sequence of PDAs from Casio, Sharp, Psion and finally the original Palm Pilot. With all of them I spent quite some time creating my printout routine to keep my Time System updated. For the Mappei reference system I didn't find any good replacement. Scanners were either too slow or too expensive, and the databases where the documents ended up were too rigid (tagging wasn't invented yet). Then I discovered Lotus Notes and initially used Haus Weilgut's CRM and standard document libraries to keep electronic information. Since necessity is the mother of invention, I wrote tooling for Syccess Easy Office that could pull arbitrary PDFs into Lotus Notes and deal with full-text extraction and metadata (I actually wrote multiple versions; in one I used a tool created by a student which you can still download). The world was good.

eMail was replacing mail and fax and the inbox became more and more important. eProductivity now helps to keep track of references, projects and commitments. The world was good.

Along came the social net, and Facebook, Twitter and RSS/ATOM suddenly established new input/output channels. IBM Activities allow commitments to be shared. Evernote wants to be your reference system, and Toodledo and RememberTheMilk want to track your commitments. All of these on a dual screen desktop, a laptop, a tablet and a mobile phone. And the world wasn't good anymore.
When I encounter something interesting I want to capture and process it. Evernote is very good at capturing, but lacks GTD functionality (so it is suitable for the reference system). Also I might want to share that information on Connections, Delicious, Digg or my blog. Or a comment I make on a Facebook wall also triggers the recording of a new action. Currently I end up with a lot of copy and paste.
IBM tries to rectify this with project Vulcan, where a single page can stream all input channels and the sharebox provides the unified output channel, so there is hope (I wonder how they will include Facebook and the other public networks). So the brave new Social Business world poses quite a challenge to GTD practitioners. This opens again the question of how GTD needs to evolve from personal to Social Productivity. Sharing is still not easy.


Posted by on 13 May 2011 | Comments (1) | categories: GTD

The Network vs. the Tree


When I started using Lotus Notes in version 2.1 (thanks to these guys), my primary interest wasn't to learn a new technology (I consider learning new technologies the icing on the cake), but to find a suitable tool to manage semi-structured information. At that time computers mostly dealt with structured data or individual storage for pre-print artifacts (today known as Office documents). My main interests were and still are Knowledge Management (KM) and eLearning, which IMHO are just different stages of the same thing: acquisition, provision and retention of capabilities.
Corporate Hierarchy
The trickiest problem in KM, and to a large extent in eLearning, is the classification of items. Taking a hint from classical science, the first approach was to use a taxonomy to build a tree of classification. Classification trees are well established and deeply entrenched in corporate hierarchies: a human is a hominid is a primate is a mammal is a vertebrate is an animal from the realm of living beings. Tom is an engineer who works for Frank, who is a team leader who reports to Sue, who is a development manager working for Cloe, who is head of development reporting to Steve, who is CTO reporting to Annabel, who is CIO reporting to CEO Jack and the board. Somehow it didn't work. The going joke is: "If you want to get rid of job competitors internally, make sure they sit in the taxonomy committee; that will tie them up and frustrate them." Truth is: not everything fits into a hierarchy, and agreeing on a term as the single permissible label for an item is a pipe dream (and what you would have to smoke in that pipe would be illegal in most jurisdictions). Especially with the rise of "PC" a committee might come to the compromise to call something "a human muscular traditional digging device" while mentally sane people will insist to "call a spade a spade".
Network by tagging
The rise of social computing with sites like Delicious or Digg added a new quality to classification attempts: tagging. With tagging, naming something was suddenly given to all individual users rather than to a "committee of the final truth". Moreover, items can be classified in any way thinkable, and spade and classical digging device can coexist. By counting how often a tag was associated with an item, the "majority vote" or "common name" can be established without ditching the minority opinions. While it sounds messy, it works well in practice and is being rapidly adopted in corporate social software.
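A minimal sketch of that majority-vote idea (all names are made up for illustration): count how often each tag was applied to an item; the most frequent tag becomes the common name, while the rest survive as minority opinions.

```javascript
// Count tag occurrences for one item; the most used tag becomes the
// "common name" without discarding the minority labels.
function tagStats(taggings) {
  const counts = new Map();
  for (const tag of taggings) {
    counts.set(tag, (counts.get(tag) || 0) + 1);
  }
  // Sort descending by count: the first entry is the majority vote
  const ranked = [...counts.entries()].sort((a, b) => b[1] - a[1]);
  return { commonName: ranked[0][0], all: ranked };
}

const stats = tagStats(["spade", "spade", "digging-device", "spade", "shovel"]);
console.log(stats.commonName); // "spade"
console.log(stats.all);        // every tag with its count, majority first
```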
Besides classification by gravity, tagging added additional meta information: when was it added, who added it, how popular is it. Especially the "who" seems to be an important factor. With the constant overload of information it becomes increasingly hard to check all the facts, so the trustworthiness of the source, even if it is just the classification, becomes more and more important. So every tag associated with an item is in fact a linking vector with all these attributes, the tag value being just one of them. Ironically, predating the tagging breakthrough we already had a standard to do exactly that: XLink.
Unfortunately no gain comes without collateral damage, and flat tagging got rid of marking "the official term" as well as of the context covered by a taxonomy. When you see the tag "bank", what does it mean? A turning manoeuvre of a plane, the edge of a river, an expression of trust (I bank on you) or a form of financial institution?
Delicious took an interesting approach by forming hierarchies out of the tags provided, which leads to a huge number of permutations as the tag number increases - and not all make sense. Of course the question is: do the nonsensical ones matter if no one will ever follow them? Recognising that the core value of a tag lies in its links led to tools like The Brain, which allow you to link facts by simply dragging a line or pressing a button. The tag becomes a member of the information repository in its own right. Unfortunately the links don't carry the information why they exist ("is a", "contains", "runs", "owns" etc.). It will be interesting to see how The Brain will adapt to collaborative linking needs.
The concept of trust was further developed by features like the Facebook Like button or the voting capabilities of sites like StackOverflow or IdeaJam. It all reminds me of the ancient Germanic courtroom principle of proving the plaintiff's trustworthiness rather than looking at facts. There are services that want to help establish trustworthiness for URLs. All these attempts at classification have their merits; what is lacking is a unified field theory for classifications.
Spheres of Influence
How to weight expert classifications (there is usually more than one, e.g. check for that really dangerous Dihydrogen monoxide), especially when they are unpopular, vs public opinion? How to quantify trust in your social graph (you would blindly follow Joe's music recommendations, but never ever let him near a kitchen to make food)?
So KM practitioners around the world have much to muse about. The key questions are still open: how to provide accurate, current, relevant and accessible know-how.

Posted by on 08 May 2011 | Comments (1) | categories: Software

What we need here is a bigger hammer!


I'm rereading Peter M. Senge's book The Fifth Discipline. It is dedicated to the learning organisation and advocates (besides other virtues) Systems Thinking as spearheaded by the System Dynamics Group at MIT. The Laws of the Fifth Discipline (in chapter 4) highlight many of the structural issues we face in business and society at large.
In a business world obsessed with "quick fixes" (at least in this part of the world) there are valuable insights to be gained from seeing the world as a system with feedback loops. Two aspects are worth highlighting: "compensating feedback" and "cause and effect disconnection" (I'll probably cover other aspects in later posts). Every system has an inbuilt inertia. This leads to compensating feedback for any action that only addresses symptoms rather than causes. Compensating feedback will void any actions taken, so the status quo is preserved.
The laws of the 5th discipline
One of my favourite examples: a retailer has a problem with customer loyalty. The consultant (whose name and company shall not be named) suggested implementing a customer loyalty program. Clear cut: the problem is loyalty, so a loyalty program will fix it. The loyalty program cost a lot of money, gained a lot of members, took all the attention of marketing and managers, but failed to stem the loss of customers after a short while (things get better before they get worse). The root cause, which nobody wanted to touch because it was difficult to fix, was: slipping quality of products and services. So instead of beating a dead horse, looking for the system at work will be more efficient and sustainable.
When you decide to achieve true mastery of a subject, you will experience a 10,000 hour delay until you get there. And that is just one skill for one person. There is a, typically larger than expected, delay between action and result. In a time of constant striving for instant gratification, managers often neglect and ignore the real result of their actions. More often than not executives have moved on before the impact of their actions can be felt.
A typical example: a new CEO orders a radical cost-cutting program. It includes freezing IT investment, outsourcing back office functions and a massive lay-off of employees. In the short term things look really good. Revenue per employee is up (the staff now working for the outsourcers don't count on the balance sheet) and profits return for the moment. However, morale is down, so most of the employees are too scared to stick their heads out. The internal improvement program all but dies. Capable employees jump ship (getting replaced by cheap but less productive newbies). In social circles the opinion forms: Acme Inc is not a place where you want to work, so talented entrants into the workforce don't apply. Of course our "Le Costcutter" CEO moves on after 2 years; his job of "saving the company" has been achieved after all. Five years later the company is history. Everybody concludes that this is due to the constant reinvention of business.
Another example: you join the gym. You train hard, but you don't see any results after a month, so you give up. Knowing about the delay, you would have carried on for 3 months and been in awe of the personal transformation you started (get good advice on food and training to make it work).
Of course this is just a brief summary. Go read the book yourself!

Posted by on 08 May 2011 | Comments (3) | categories: After hours Business

Does a minister in Singapore have to be an *elected* member of parliament?


The 2011 general election is drawing closer and the debates are heating up. One hotly contested district this time is the Aljunied GRC, where the opposition Workers' Party wants to unseat the incumbent PAP. GRC stands for "Group Representation Constituency", where a team of politicians gets elected together. In Aljunied the winning party will gain 5 seats, which is more than 5% of the total elected seats. The PAP team in Aljunied is headed by Mr George Yeo, who happens to be Singapore's current foreign minister.
In articles, on Facebook and elsewhere assertions are made that he has to leave government if he doesn't get elected in Aljunied. Having grown up in a continental European democracy I was quite puzzled. In Germany the cabinet is appointed by the president on the suggestion of the chancellor (our equivalent of the prime minister), and the cabinet members might or might not be members of parliament (the chancellor is elected by the parliament, not appointed by the president). After all, the ministers head the executive and not the legislative. So I thought: "If in doubt, check the source". The Singapore Constitution states in Part V Chapter 2 Paragraph 25 #1: "The President shall appoint as Prime Minister a Member of Parliament who in his judgment is likely to command the confidence of the majority of the Members of Parliament, and shall, acting in accordance with the advice of the Prime Minister, appoint other Ministers from among the Members of Parliament. Provided that, if an appointment is made while Parliament is dissolved, a person who was a Member of the last Parliament may be appointed but shall not continue to hold office after the first sitting of the next Parliament unless he is a Member thereof." (emphasis mine).
So on first view the concerns " if Mr. Yeo loses he is out" seem valid. My late father, who was a lawyer, taught me to always read the surrounding paragraphs or the whole law to make sure to get the full picture. In Part VI Paragraph 39 1c we can read: " such other Members not exceeding 9 in number, who shall be known as nominated Members, as may be appointed by the President in accordance with the provisions of the Fourth Schedule". So while it wouldn't be an ideal procedure and certainly not in the spirit of the Nominated Members of Parliament (NMP) idea, it looks like an NMP can be part of the cabinet (the speaker or deputy speaker of the parliament can also be appointed from non-elected members, but is explicitly banned in Paragraph 39(4) from becoming a cabinet member). Of course I'm not an expert in Anglo-Saxon inspired law, so I might stand corrected (and learn something new in the process). Also a constitution can be changed. Narrowing the selection base for ministers to members of the parliament excludes a lot of talent.
Now what would be really embarrassing (for the PAP): If Prime Minister Lee Hsien Loong's PAP team loses in Ang Mo Kio. In 2006 the team got 66.14% of the votes, presuming the number of voters hasn't changed, 23,579 people would need to change their mind in one direction. Is that a lot or a little?
We live in interesting times

Posted by on 04 May 2011 | Comments (1) | categories: Singapore

You have come a long way


At the end of this week Singapore holds elections, where the 87 elected members of parliament face their voters. Since independence the result was always clear: the People's Action Party ( PAP) will win. Very few doubt that the result will be the same this time. However things are different this time around. In a Wikipedia article about Singapore's Internal Security Act one can read: " Political opposition is technically allowed in Singapore, however many opposition politicians fear being imprisoned, fined, or bankrupted via government-led litigation (with the effect of not only economically destroying opponents, but also disqualifying them from elections) merely for voicing critical opinions." Today that fear is gone and the political landscape is as vibrant as one can expect from a living democracy. Online media are abuzz and bloggers more daring than ever. Mr. Wang a.k.a. Gilbert Koh writes: " Abundantly clear that Lee Kuan Yew is mongering foolish fears. Don't be his sucker.", The Mr. Brown show pokes fun at PAP and government politics and performance ( If the driver falls asleep you better slap, slap, slap). Not so long ago they both would probably have ended up in hot water for this. Just watching the stream on Twitter is fun and shows no trace of fear. Even Darth Vader is summoned into campaigning (Note to Lucasfilm: satire is a work of art and free speech, and that spot is covered by fair use, so don't take it down). If this is the harbinger of public discourse in Singapore, I live in a good place. I'm not commenting on the various claims of the different parties here, there are better commentators around; I'm just enjoying the blossoming of a civil society. And where else can you get such campaign videos:


Enjoy!

Posted by on 03 May 2011 | Comments (1) | categories: Singapore

So you want to be a Domino developer - revisited


It has been a long time since part 1 and a lot of things have changed since the original graphic. XPages came along, Dojo got some decent documentation and everything became social. However the graphic hasn't changed that much:
Domino Development Skills
There's ServerSide JavaScript and XPages, which isn't hard to pick up if you had sufficient exposure to JavaScript and XML (as suggested before). For die-hard-LotusScript-Forms-Only developers the new HTML/JavaScript driven way of doing things presents a steep learning curve. There is a Domino Designer that looks more and more like an Eclipse plug-in (which IMHO is a good thing) and makes you put an SSD and RAM on the Christmas wish list. What is new (and don't tell me next year you haven't seen it coming) is the convergence of all IBM APIs towards a set of common standards. Moving forward you must make yourself familiar with them. In a future version of Notes they will become the API of choice for exceptional work experiences (I shamelessly borrow a term from IBM marketing here). Since you are already familiar with XML, HTTP and JSON, these APIs are actually easy to comprehend:
  • ATOM Publishing Protocol (AtomPub): XML based format for reading and writing information with structured meta data. Properly implemented, it makes heavy use of Dublin Core meta data descriptions. ATOM is behind a lot of data exchange and APIs, including OData (the format Microsoft and SAP have committed to). There is Apache Abdera for Java, support for Atom in Dojo and even jQuery
  • OpenSocial: Framework to build applications for integrated experiences. The title might be a bit misleading, since it is much more than adding a "share this" button to your page. It is a complete widget definition and interaction standard. Extends iWidgets. Both IBM Portal and the IBM XPages server will be OpenSocial containers and widget contributors. Already today a Domino server can serve a component in RCP Widget and iWidget format, so we can expect that the component model will support OpenSocial too
  • ActivityStreams: While OpenSocial defines interaction up to the UI level, ActivityStreams are the pipes feeding new information into your experience. The very brief definition: ActivityStreams are ATOM data feeds that have at minimum the agreed set of attributes as defined by the ActivityStreams working group. I expect ready-made ActivityStreams controls to appear first in the XPages Extension Library before they move to the product core control set
  • Last but not least the IBM Social Business Toolkit: It brings all the above components together. The really remarkable aspect: As of today there is no shipping IBM product supporting the full API, nevertheless IBM provides a working test environment, so you can test and be prepared
From my training experience I can tell: If you grew up with web standards, developing on the XPages platform is fast and fun.
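The minimum-attribute idea behind ActivityStreams can be sketched in a few lines of JavaScript. This is purely an illustration, not part of any IBM toolkit: the function name is my own invention, and the attribute set (actor, verb, object, published) follows the ActivityStreams JSON draft, so treat the exact list as an assumption rather than the normative one:

```javascript
// Minimal sketch: check that an activity entry carries the core
// ActivityStreams attributes before feeding it into a stream UI.
function isMinimalActivity(entry) {
  var required = ["actor", "verb", "object", "published"];
  return required.every(function (key) {
    return entry != null && entry[key] !== undefined;
  });
}

var sample = {
  actor: { displayName: "Jane Doe" },
  verb: "post",
  object: { objectType: "note", content: "Hello stream" },
  published: "2011-05-02T10:00:00Z"
};

console.log(isMinimalActivity(sample));           // true
console.log(isMinimalActivity({ verb: "post" })); // false
```

The same guard works for entries parsed out of an ATOM feed, once they are mapped to plain objects.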

Posted by on 02 May 2011 | Comments (4) | categories: SYWTBADD

4 Validation errors to avoid and some fun with XPages Validation


Input validation is a tricky business (on any platform). On one hand you need to ensure that your form contains sufficient and accurate data to be processed as intended. On the other hand validation needs to stay out of the way of the user's attempts to get things done. XPages offers client and server side validation. While you could continue to use onSubmit or QuerySave, the recommended approach is the use of validators. They nicely separate the validation logic from the validation trigger and from the error display. This gives you a great deal of flexibility. You can go to the extent of providing your own Java validators that your application developers can use without knowing Java, adding your own messages programmatically or letting your code raise an error. Per summarised error management nicely and Andre explains better SSJS validation rules. So we are pretty much covered. Nevertheless I see quite a few worst practices in action:
  1. Only client side validation: Client side validation is for the comfort of your user. It is there to make her life easier. Client side validation is not suitable for preserving data integrity! Using Firebug, NoScript, cURL or simply notepad/textedit/gedit (to create a simple HTML form with method="post") a user can bypass any client side validation. So it must be accompanied by server side validation code (unless of course all your users are angels)
  2. Abrasive error messages: Sound familiar: "Input error: This is not a phone number", "You must fill xxx", "Wrong character detected"? Users translate (consciously or subconsciously) those messages into "You stupid moron, you can't even fill in a form - serve me, I'm your digital overlord". Guess what happens next: an eMail, a phone call and application avoidance. The better way is to a) be more liberal with what you accept (a phone number can contain spaces, plus, minus and slash - and how hard is it to filter out spaces from a credit card number?) and b) let the computer apologise for its shortcomings: "Sorry, you need to complete all indicated fields before this application can process your request" or (if you buy into the idea that users attribute personality to computers and applications): "Sorry, I only understand credit card numbers that have no spaces"
  3. Lack of save as draft: If your form has only very few fields, you can get away without it. But any modestly complex application will get a user to a point where (s)he is stuck with a missing piece of information (or a weekend (s)he wants to start). So you need to accept input however incomplete it might be, just don't process it yet. Since every eMail system on the market has a "draft mode", your users will be both familiar with it and actually expect a save-as-draft function
  4. Validation traps (I'm not talking about a Catch-22): fields are validated on field exit and on failure the user is prompted and sent back to that field. While it might be well-meaning (make sure everything is fine), it is actually evil. It disrupts the user in her workflow and forces a certain working style on them. For a better way to use onBlur see below
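Point 2 above (be liberal in what you accept) can be sketched in a few lines of JavaScript. The function names are my own invention for illustration, and the rule itself is only a plausibility check (normalise first, then validate), not a full card number verification:

```javascript
// Sketch: strip the formatting characters users legitimately type
// before applying the actual validation rule, instead of rejecting
// harmless spaces and dashes with an abrasive error message.
function normalizeCreditCard(raw) {
  return String(raw).replace(/[\s-]/g, "");
}

function isPlausibleCreditCard(raw) {
  var digits = normalizeCreditCard(raw);
  // 13 to 19 digits covers the common card brands
  return /^[0-9]{13,19}$/.test(digits);
}

console.log(isPlausibleCreditCard("4111 1111 1111 1111")); // true
console.log(isPlausibleCreditCard("4111-1111-1111-1111")); // true
console.log(isPlausibleCreditCard("not a card"));          // false
```

The same pattern applies to phone numbers: accept spaces, plus, minus and slash, normalise them away, then validate what is left.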
To summarise:
Validation must always happen server side; it should be augmented with client side validation for the comfort of the user, and in XPages it should be implemented using Validators.
( Should in the context of this statement means: " Do this unless you have 7 good reasons to do it differently".)
Does that mean that you have to code every validation twice? The short answer: No, if you use Validators. I imagined a modification of the XPages engine where you could trigger a validation onBlur of a field that would only hint the user about the problem (so NO message box prompt) and not require duplicating the code that you use in your validator. In a chat with Máire Kehoe she enlightened me how this can be done in XPages, so I present the ValidationHelperControl

Read more

Posted by on 02 May 2011 | Comments (0) | categories: XPages