wissel.net

Usability - Productivity - Business - The web - Singapore & Twins

Designing Lightning Components for Reuse


This is a living document about a common sense approach to developing reusable Lightning components. It might change over time.

Salesforce documentation

As well as the instance-specific component library

Principles

  • Components shall serve a single purpose, designed for reusability
  • Components shall use the most feasible least-code approach
  • Components shall not contain country-specific logic in the front-end
  • Components shall be documented and tested
  • Components shall use composition over inheritance. Inheritance is NOT forbidden, use it wisely
  • Components shall observe case sensitivity even for non-case-sensitive items (e.g. field names)
  • Components shall prefer component markup over HTML markup (e.g. lightning:card over div class="slds-...")
  • Components shall use component navigation (if navigation is needed)

Naming

Related files and components need to be named so they appear close to each other. E.g. a component "VehicleInfoList" that depends on inner components: those would also start with "VehicleInfo", e.g. "VehicleInfoCard", "VehicleInfoLineItem", "VehicleInfoInterested" etc.
Files should be named like this:

  • SalesProcess.cmp
  • SalesProcessController.js
  • SalesProcessHelper.js
  • SalesProcess[WhatEvent].evt
  • SalesProcess.svg

Interfaces

  • A component shall only implement the interfaces it actually uses. Avoid interfaces the component "might use in the future".
  • A component that relies on a current record shall not use "availableForAllPageTypes", and must implement "hasRecordId" and the attribute "recordId".
  • Components that are not used on a page layout, but rather inside other components, shall not implement interfaces ("availableFor…") that make them appear in the page editor.

Data access

Components shall use the "least code" principle for data access. To be checked in this sequence:

  1. Does the component need data access or can the attributes of it provide all the input it requires?
  2. Can lightning:recordForm be used?
  3. Can lightning:recordEditForm and lightning:recordViewForm be used?
  4. Can force:recordData be used?
  5. Is a custom @AuraEnabled method in the controller needed for data provision?

This doesn't preclude fetching metadata or configuration. Make sure to use storable actions where feasible. More principles:

  • Use data change handlers where appropriate
  • Use component events

Code principles

This section will probably expand over time.

  • Code needs to be readable
  • The controllers (both client- and server-side) shall be light modules that marshal the actual work to helper classes and helper functions
  • In Apex, helper classes shall be instantiated using factory classes - this allows introducing country-specific behavior
  • All Apex helper classes shall be based on interfaces
  • Methods and functions shall be single purpose and not exceed a page in size. Break them down if too big (that makes them more testable anyway)
  • Don't copy/paste
  • Run PMD (free download) on all Apex (eventually on JavaScript too)
  • Operations that can fail need to be handled with try/catch or its equivalent
  • Use ApexDoc- and JSDoc-style comments
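The factory idea in the list above targets Apex, but the shape translates directly. Here is a minimal sketch in JavaScript (all helper names, country codes and tax rates are invented for illustration): the caller codes against a common contract and the factory picks the country-specific implementation, so the controller stays light.

```javascript
// Hypothetical country-specific helpers behind one common contract.
// Integer math keeps the example's arithmetic exact.
const taxHelpers = {
  SG: { calculateTax: (amount) => (amount * 8) / 100 },
  DE: { calculateTax: (amount) => (amount * 19) / 100 }
};

// The factory is the only place that knows about countries.
function taxHelperFactory(countryCode) {
  const helper = taxHelpers[countryCode];
  if (!helper) throw new Error(`No tax helper for ${countryCode}`);
  return helper;
}

// The "controller" merely marshals to the helper it was handed.
const tax = taxHelperFactory('SG').calculateTax(100);
console.log(tax); // → 8
```

Swapping in a new country then means adding one helper, not touching the controller.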

Testing

  • All components need test code: both for Apex (naturally) and for the client-side component.
  • A component is incomplete without a Lightning Testing Service test.
  • Use assertions generously!

Documentation

  • Lightning components have a description.
  • Each Lightning component comes with a documentation section - don't waste time documenting them outside Salesforce.
  • Use the documentation to briefly explain what the component does (no Pulitzer Prize for this writing!).
  • Include at least one example in the documentation.

Parameters

  • Components that can be dragged onto a page benefit from parameters the page maintainer can use to configure the component. This increases reusability and limits the number of components that need to show up in the palette.
  • Parameter documentation: check the documentation for details.
  • If a component is usable only on a specific object page, add that to the Design Resource.

As usual YMMV


Posted by on 26 July 2018 | Comments (0) | categories: Lightning Salesforce

Postman and the Salesforce REST API


The Salesforce API is a great way to access Salesforce data and can be used with tools like SoqlXplore or the Salesforce Workbench. The API uses OAuth with Bearer authentication, so a few steps are required to make that work in Postman.

Prepare Salesforce

You will need a Connected App. I usually create one that is pre-approved for my user profile(s), so I don't need to bother with the approval steps in Postman. However, you could opt for self-approval and access the app once to approve its use, before you continue with the command line. Note down the ClientId and ClientSecret values.

Prepare Postman

Postman has great built-in support for all sorts of authorization, interactively. However, my goal here is to fully automate things, so you can run a test suite without manual intervention. First stop: creating an environment. You can have multiple environments to cater to different Salesforce instances.

Important: Never ever store the environment in version control. It would contain credentials - a bad, bad idea!

My environment variables look like this:

{
    "CLIENT_ID": "the ClientId from Salesforce",
    "CLIENT_SECRET": "the ClientSecret from Salesforce",
    "USER_ID": "some@email.com",
    "PASSWORD": "DontTell",
    "LOGIN_URL": "https://login.salesforce.com/"
}

Providing the login URL allows you to reuse Postman collections between sandboxes, developer orgs or production orgs without the need to actually edit the Postman entries. Next on the menu: getting a token.
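As a preview of that step, the request a collection assembles from the environment above can be sketched in plain Node.js. The /services/oauth2/token path is Salesforce's standard token endpoint; the helper name and structure are mine, not part of the original post:

```javascript
// Sketch: turn the Postman environment values into the token request.
// In Postman this logic would live in a pre-request script; plain
// Node.js makes the mechanics visible.
function buildTokenRequest(env) {
  // URLSearchParams produces application/x-www-form-urlencoded output,
  // which is exactly what the token endpoint expects.
  const body = new URLSearchParams({
    grant_type: 'password',
    client_id: env.CLIENT_ID,
    client_secret: env.CLIENT_SECRET,
    username: env.USER_ID,
    password: env.PASSWORD
  });
  return {
    url: env.LOGIN_URL + 'services/oauth2/token',
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: body.toString()
  };
}

const request = buildTokenRequest({
  CLIENT_ID: 'id',
  CLIENT_SECRET: 'secret',
  USER_ID: 'some@email.com',
  PASSWORD: 'DontTell',
  LOGIN_URL: 'https://login.salesforce.com/'
});
console.log(request.body);
```

Note how the `@` in the username gets encoded as `%40` - the endpoint rejects unencoded bodies.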


Read more

Posted by on 06 July 2018 | Comments (1) | categories: Salesforce Software WebDevelopment

Mime is where Legacy Systems go to die


Your new system went live. Migration of current, active data went well. A decision was made not to move historic data and to keep the old system around in "read-only" mode, just in case some information needs to be looked up. Over time your zoo of legacy systems grows. I'll outline a way to put them to rest.

The challenges

In all recent systems (that is, anything younger than 30 years), data is stored more or less normalized. A business document, like a contract, is split over multiple tables: customer, address, header, line items, item details, product etc.

Dumping this data as-is (CSV rules supreme here) only creates a data graveyard instead of the much-coveted data lake or data warehouse.

The issue gets aggravated by the prevalence of magic numbers and abbreviations that are only resolved inside the legacy system. So looking at one piece of data tells you squat. Only an old hand would be able to make sense of Status 82 or Flags x7D3z.

Access to meaningful information is confined to the user interface of the legacy application. It provides search and assembly of business-relevant context.

The solution approach

Solving this puzzle requires a three step approach:

  • denormalize
  • transform
  • make accessible
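The first two steps can be sketched in a few lines of JavaScript. All table layouts, field names and code values below are invented for illustration; the point is the shape: rows from several tables become one self-contained document, and magic numbers are translated into meaning on the way out.

```javascript
// Hypothetical lookup that resolves the legacy system's magic numbers.
const statusLabels = { 82: 'Cancelled by customer' };

// Denormalize: fold header, customer and line-item rows into one
// business document, transforming codes into readable values.
function denormalizeContract(header, customer, lineItems) {
  return {
    contractNo: header.contractNo,
    status: statusLabels[header.status] ?? `Unknown (${header.status})`,
    customer: { name: customer.name, address: customer.address },
    lineItems: lineItems.map((li) => ({
      product: li.productName,
      quantity: li.quantity,
      price: li.price
    }))
  };
}

const doc = denormalizeContract(
  { contractNo: 'C-4711', status: 82 },
  { name: 'Acme Pte Ltd', address: '1 Fictional Rd, Singapore' },
  [{ productName: 'Widget', quantity: 3, price: 19.9 }]
);
console.log(doc.status); // → Cancelled by customer
```

A document like this can then be indexed for the "make accessible" step without the legacy application in the loop.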

Read more

Posted by on 22 June 2018 | Comments (1) | categories: Software Technology

Adventures in TDD


There are two challenges getting into TDD:

  • Why should I test upfront when I know it will fail (there's a massive aversion to failure in my part of the world)?
  • Setting up the whole thing.

I made peace with the first point using a very large monitor and a split screen, writing code and test in parallel, deviating from the "pure teachings" for the comfort of my workflow.

The second part is trickier. There are so many moving parts. This post documents some of the insights.

Testing in the IDE

TDD has the idea that you create your test first and only write code until your test passes. Then you write another failing test and start over writing code.

As a consequence you need to test in your IDE. For JavaScript and Java (the languages I use most) that's easy:

  • In JavaScript you define a test script in your package.json that you can run any time. For the connoisseur there are tools like WallabyJS or VSCode Mocha Sidebar that run your tests as you type and/or save. The tricky part: which testing libraries (more on that below) to use?
  • In Java, Maven has a default goal validate and JUnit is the gold standard for tests. For automated continuous IDE testing there is Infinitest.
  • For Salesforce you have a combination of JavaScript and Apex (and clicks-not-code), so testing is a little trickier. The commercial IDEs The Welkin Suite and Illuminated Cloud make that a lot easier. How easy is in the eye of the beholder. (Honorable mention: JetForcer - I simply haven't tested that one yet.)

Testing in your Continuous Integration

Automated testing after a commit to GitHub, GitLab or Bitbucket happens once you configure a pipeline as a hook into the repository and have tests specified that the pipeline can pick up. Luckily your Maven and npm scripts will most likely work as a starting point.

The bigger challenge is the orchestration of various services like static testing, dependency management and reporting (and good luck if your infra guys claim they could set up and run everything in-house).

Some of the selections available:


Read more

Posted by on 10 June 2018 | Comments (1) | categories: JavaScript Salesforce TDD

What really happens in OAuth


OAuth in its various versions is the gold standard for authorization (and, using OpenID Connect, for authentication as well). There are plenty of introductions around explaining OAuth. My favorite HTTP tool Postman makes it really simple to obtain access via OAuth.

Nevertheless, all those explanations are quite high level, so I wondered what happens on the wire for the getToken part and started digging. This is what I found. Nota bene: there is no inherent security in OAuth if you don't use https.

The components

  • Authorization server: the server to interact with to obtain an authorization
  • Client identifier (ClientID): the "userid" of the application
  • Client secret: the "password" of the application
  • A user

I'm not looking at the Resource Server here - it only comes into play before or after the actual token process.

The Form-Post Flow

There are several flows available to pick from. I'm looking at the Form-Post flow where user credentials are passed to the authentication server to obtain access and refresh tokens.

For this flow we need to post an HTTP form to the authorization server. The post has two parts, header and body. A request looks like this:

POST /yourOAuthEndPoint HTTP/1.1
Host: authserver.acme.com
Accept-Encoding: gzip, deflate
Accept: */*
Authorization: Basic Y2xpZW50aWQ6Y2xpZW50c2VjcmV0
Content-Type: application/x-www-form-urlencoded
Cache-Control: no-cache

grant_type=password
  &username=user%40email.com
  &password=password
  &scope=openid+email+profile
  &client_id=clientid

Some remarks:

  • The Authorization header is just the Base64 version of clientid:clientsecret - you have to replace it with your actual info
  • Content-Type must be application/x-www-form-urlencoded
  • The body is just one line with no spaces; I split it here only for readability
  • scope is a URL-encoded list; the + signs are actually spaces. Keeping that in mind, you want to keep the server-side scope names simple
  • The client_id needs to be repeated in the body, even though it is already contained in the Authorization header
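The two fiddly pieces - the Authorization header and the body encoding - can be checked with a few lines of Node.js. This is just a verification sketch of the request shown above, using the same placeholder credentials:

```javascript
// The Authorization header is base64("clientid:clientsecret"):
const clientId = 'clientid';
const clientSecret = 'clientsecret';

const basic = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
console.log(`Authorization: Basic ${basic}`);
// → Authorization: Basic Y2xpZW50aWQ6Y2xpZW50c2VjcmV0

// The body is one line of URL-encoded pairs. Note how the
// space-separated scopes come out as plus signs:
const body = new URLSearchParams({
  grant_type: 'password',
  username: 'user@email.com',
  password: 'password',
  scope: 'openid email profile',
  client_id: clientId
}).toString();
console.log(body);
```

The output matches the wire format above byte for byte, which is a handy sanity check when a token endpoint rejects your hand-built request.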

As a result you get back a JSON structure with authorization information. It can look like this:

{
    "access_token": "wildStringForAccess",
    "refresh_token": "wildStringForRefreshingAccess",
    "token_type": "Bearer",
    "expires_in": 300
}

The result is easy to understand:

  • expires_in: lifetime of the access token in seconds
  • token_type: Bearer denotes that you call your resource server with a header value of Authorization: Bearer wildStringForAccess
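What a client typically does with that response can be sketched in two lines: remember when the token expires and build the Bearer header for subsequent calls to the resource server (the token values are the placeholders from above):

```javascript
// The JSON structure returned by the authorization server, as above.
const tokenResponse = {
  access_token: 'wildStringForAccess',
  refresh_token: 'wildStringForRefreshingAccess',
  token_type: 'Bearer',
  expires_in: 300
};

// Convert the relative lifetime into an absolute timestamp so the
// client knows when to use the refresh_token instead.
const expiresAt = Date.now() + tokenResponse.expires_in * 1000;

// Every call to the resource server carries this header.
const authHeader = `${tokenResponse.token_type} ${tokenResponse.access_token}`;
console.log(authHeader); // → Bearer wildStringForAccess
```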

As usual YMMV


Posted by on 04 June 2018 | Comments (0) | categories: Software WebDevelopment

Reuse a 3rd Party Json Web Token (JWT) for Salesforce authentication


The scenario

You run an app in your domain (a native mobile app, an SPA, a PWA or just an application with JavaScript logic) that needs to incorporate data from your Salesforce instance or one of your Salesforce communities.

Users have authenticated with your website and the app uses a JWT bearer token to establish identity. You don't want to bother users with an additional authentication.

What you need

Salesforce has very specific requirements for how a JWT must be formed to qualify for authentication. For example, the token can be valid for only 5 minutes. It is very unlikely that your existing token matches these requirements.

Therefore you will need to extract the user identity from the existing token, while checking that it isn't spoofed, and create a new token that you present to Salesforce to obtain the session token. So you need:

  1. The key that can be used to verify the existing token. This could be a simple string used for a symmetric signature, or an X509 public key
  2. A private key to sign a new JWT for Salesforce (see below)
  3. A configured Connected App in Salesforce where you upload the full certificate and obtain the Consumer Key
  4. Some place to run the code, like Heroku
  4. Some place to run the code, like Heroku

Authentication Flow for 3rd party JWT


Read more

Posted by on 03 May 2018 | Comments (0) | categories: Heroku Salesforce

Function length and double byte languages


Complexity is a prime enemy of maintainability. Conventional wisdom suggests methods should be around 20 lines, with some evidence suggesting up to 100+ lines.

When I review code written by non-native English speakers, especially when their primary language is double-byte based, I find methods in the 500-1000 line range, with some special champions up to 5000 lines. So I wondered what might contribute to these function/method worms.


Read more

Posted by on 09 April 2018 | Comments (1) | categories: Java JavaScript NodeJS Software

Creative logging with $A.log()


In Lightning applications there are two ways to log: console.log(...) and $A.log(...). This has led to some confusion about which to use.

The official statement: $A.log() will eventually go away, use console.log().

This is a real pity, since $A.log() is quite powerful and closer to what a developer would expect from logging. One reason for its demise: in a production setting $A.log() outputs - nothing. There's no official documentation on how to change that, and the $A.logger.subscribe(...) method is neither documented nor guaranteed, only mentioned on Stack Exchange. So…

Enjoy it while it lasts

The simple way to activate console output in production is to add a helper function that can be triggered by a button or whatever you find necessary:

$A.logger.subscribe("INFO", function(level, message, error) {
    console.log(message);
    console.log(error);
});

Instead of sending output to the console, which could confuse users seeing all that "tech" stuff, you could redirect it into a custom component (the following snippet fits into an onInit handler):

var target = component.find("loggerlist").getElement();
$A.logger.subscribe("INFO", function(level, message, error) {
    target.innerHTML += "<li>" + message + "</li><li>" + error + "</li>";
});

The target element would be <ol aura:id="loggerlist"> so you get a running list.

Across the network

One is not limited to staying on the same machine. With a few lines of code, logging can happen at a remote location as well. The following shows logging via websockets. For a production run (e.g. permanent instrumentation) I would make it a little more robust, like keeping the connection open and checking if it is still there, or sending JSON, but for the occasional support case this is good enough:

$A.logger.subscribe("INFO", function(level, message, error) {
    var wsEndPoint = 'wss://somewebsocket.url/ws/log';
    var connection = new WebSocket(wsEndPoint);
    connection.onopen = function(event) {
        connection.send(message);
        connection.send(error);
        connection.close();
    };
});

I'll show a potential receiving end implementation in a future post.
As I said: enjoy it while it lasts, it might go away soon. YMMV


Posted by on 03 April 2018 | Comments (0) | categories: Salesforce

Salesforce one year on


A year ago I said Good bye IBM, Hello Salesforce. A lot has happened in the last 12 months. Salesforce is only my second salaried job; I've been running my own companies and been freelance before.

Coming from IBM, where Resource Actions had efficiently killed employee engagement, Salesforce's Ohana culture was a refreshing difference. It makes such a difference to work with people who are genuinely interested in your success, without exception. In summary:

  • I became a Trailblazer Ranger, completing 30 trails, 206 badges and collecting 169625 points
  • Passed five Salesforce certifications
  • Contributed to customer success in Singapore, Australia and Korea
  • Wrote 25 blog entries (way too little, more are coming)
  • Moved my blog from Domino to git (more on that below)
  • Contributed to OpenSource on github:
    • Maintainer for node-red-contrib-salesforce. The nodes that connect NodeRED to Salesforce, including the support for platform events
    • Excel2XML: Tool that converts XLSX tables into XML, so data can be extracted in command line applications. Main purpose is to make Excel data accessible in build pipelines (e.g. sample values for tests)
    • Spring Boot and Salesforce Canvas: Sample application that turns a Canvas POST into a JWT authentication, so classic multi-page applications can be integrated into Salesforce Canvas
    • Vert.x proxy: Filtering proxy implemented in Eclipse Vert.x. It allows you to front a web application and filter HTML, JSON etc. based on content and URL
    • SFDC Platform Events: Modules for Eclipse Vert.x to connect to Salesforce. It includes authentication and processing of platform events. This allows for high-performance multi-threaded interaction with Salesforce APIs, not limited to platform events
    • Blog Comments: Tool that accepts a JSON-formatted comment structure and creates a Bitbucket file, a commit and a pull request. Allows for a database-free comment engine
    • BlogEngine: The application that powers this blog. It generates static files when commits/merges happen to my master branch on Bitbucket

What a ride, onto year two!


Posted by on 01 April 2018 | Comments (2) | categories: Salesforce

Boolean to get major overhaul


George Boole didn't seem to understand his five teenage daughters (he didn't have sons, so this is about teenagers, not daughters), otherwise his boolean logic would encompass not only true and false, but also maybe and don't know. Luckily that omission will be addressed now.

Boolean to merge with Ternary

Quick recap: a boolean has the values true (usually 1) and false (usually 0). Ternary has three states, typically denoted -1, 0, 1. Not to be confused with qubits, which are true and false at the same time.

To reflect the real world, where nothing is certain, and cater to teenage level developers, the ternary and boolean data types will be merged into a new type: RealBoolean.

Proposals are under way to incorporate RealBoolean into all major programming languages ASAP. RealBoolean will have the values true, undecided and false. While it is up to the programming languages how these values are represented, consensus is that the most likely candidates are -1, 0 and 1.

New hardware

Like specialized mining hardware for crypto, RealBoolean will benefit from purpose-built ternary computers. Early models have been running since 1958. Ternary computing has also arrived in microprocessor architectures. Of course there are doubters.

Transition period

Having multiple data types to express the truth might fit the political desire for alternate facts, but it is an unsustainable confusion in programming. Therefore the classic boolean values will become illegal on April 01, 2042.
In the transition period, classic booleans will be duck-typed into RealBoolean whenever the values true, false or 1 are used. For boolean 0 or -1 (as some unfortunate languages use), compilers and runtimes are mandated to issue a warning for the first 5 years, thereafter a stern warning, before they finally become illegal.

Enforcement

All version control repositories will be scanned (the NSA does that anyway) and offending code flagged with new issues. Binary code not compiled from a repository will be treated as a virus, blocked and deleted. After the deadline, all remaining offending code will be transpiled into COBOL - good luck finding developers to make sense of that code thereafter.


Posted by on 01 April 2018 | Comments (4) | categories: After hours Software Technology