HTML5, CSS3, jQuery, JSON, Responsive Design...

NodeJS posting data to Domino

Michael Brown   August 13 2016 02:09:17 AM
So recently, I was working on a project that was not Domino-based, but rather used web tools and REST APIs.  What a breath of fresh air!  SublimeText, NodeJS, ESLint and all that other webbie-type goodness that looks great on your CV.

Moving back to working with our Domino-based CMS (Content Management System), I came down to Earth with a very rude bump.  You see, in that system, we store our web programming content in Notes Documents.  Our HTML, JavaScript and CSS are either typed/pasted directly into Notes Rich Text fields, or stored as attachments within those same Notes Rich Text fields.

Not that I'm criticising the CMS itself, which works rather well, as it happens.  It’s just the editing facilities, or lack thereof.  Typing text directly into a Rich Text field, you have no syntax checking, no linting, no colour coding: no visual feedback of any kind, in fact.  Not even of the limited kind that you get with the JavaScript Editor in the Notes Designer.

So I was faced with a choice:
  1. Go back to typing stuff directly into Notes fields, and finding my coding errors the hard way, i.e. when it fails in the browser.  Not fun.
  2. Use SublimeText/ESLint etc to get the code right on my hard drive, then copy and paste the results to the Notes field so I could test in the browser.  And kid myself that the last step isn’t a complete and utter productivity killer.

Obviously, neither option was particularly appealing.  Which got me to thinking… now, wouldn’t it be great if I could still use all those achingly trendy, webbie-type client tools, but have my code automatically synched up to my Notes Rich Text fields on the Domino server?  You know, in real time?  Then I’d have the best of both worlds.  But surely, not possible…

Actually, it is very possible (otherwise this would be a very short post!).  And I have built a system that does exactly that.  It’s based on NodeJS and npm on the client side, and a good old Notes Java agent on the server side.

Basic Approach

So here's the basic division of work between the NodeJS client and the Domino server:

Client/server Sequence diagram

(Sequence diagram created with PlantUML.)

The NodeJS client gathers up the user's source file, transpiling it if necessary, and posts it to a Domino agent as part of an encoded JSON object.  (Yes, I know JSON is actually a string, but I'll call it an object here.)  The agent works out where the target document is, based on the data passed in the JSON object.  It then posts the user's decoded data to a Rich Text field on that document (or attaches it), before sending a success or error message back to the client.  The agent runs as a Web User agent, so the user's ID and Domino HTTP password are passed from client to server (not shown in the diagram above).

The NodeJS client can even be set to run in the background and watch a file on your hard drive - multiple files, in fact - for changes.  If it detects a change, the Node system posts the changes to the Domino server immediately.  You can refresh your browser a couple of seconds later, and your changes are there, on the Domino server.

This isn't theory.  I have a working system now that does exactly what I describe above.  I will post the source code to GitHub if anybody's interested, but in the meantime here are a few tasters of how things are done.

Posting Data from the NodeJS Client: Request Package

The key to posting data from client to server is the npm Request package.  This is kind of an equivalent of jQuery's Ajax call, only running in a NodeJS terminal instead of in a browser.  The code below shows how you might call request to post data to a Domino agent:

const request = require("request");

var postConfig = {
    url: "",
    method: "POST",
    rejectUnauthorized: false,
    json: true,
    auth: {
        user: username,
        pass: password
    },
    headers: {
        "content-type": "application/text"
    },
    body: encodeURIComponent(JSON.stringify(configObj.postData))
};

request(postConfig, function(err, httpResponse, body) {
    // Handle response from the server
});

The actual data that you would post to that agent would look something like this:
"targetdbpath": "mike/dummycms.nsf",
"targetview": "cmsresources",
"targetfieldname": "contentfield",
"updatedbyfieldname": "lastupdatedby",
"attachment": false,
"devmode": true,
"data": "my URLEncoded data goes here"

Server Side Java Agent

So here's how the server-side Java agent interprets the JSON data that's been posted to it:

import lotus.domino.*;
import org.json.*;
import java.io.PrintWriter;

public class JavaAgent extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            AgentContext agentContext = session.getAgentContext();
            Document currentDocument = agentContext.getDocumentContext();

            PrintWriter pw = getAgentOutput();
            pw.println("Content-Type: text/text"); // Content type of the response

            PostedContentDecoder contentDecoder = new PostedContentDecoder(currentDocument);
            String decodedString = contentDecoder.getDecodedRequestContent();

It's a standard Domino Java agent.  I grab the context document from the agent context.

PostedContentDecoder is my own Java class, which grabs the actual content data from the request_content field of that document.  This is actually a bit more complicated than it sounds, because of the way Domino handles data greater than 64Kb in size that's posted to it.  If the data is less than 64Kb, then Domino presents it as a single field called "request_content".  If it's more than 64Kb, Domino presents a series of request_content fields, called "request_content_001", "request_content_002" and so on, up to however many fields are needed to hold the data.  The PostedContentDecoder class takes care of these differences.  The class also takes care of URL-decoding the data that was encoded by the client-side JavaScript call, encodeURIComponent() (see above), via the line below:

requestContentDecoded = URLDecoder.decode(requestContent, "UTF-8");
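The field-reassembly logic inside PostedContentDecoder boils down to something like the following - sketched here in JavaScript for brevity, though the real class is Java.  getRequestContent is an illustrative name of mine, and the real class reads fields from a Notes Document rather than a plain object.

```javascript
// Join the request_content field(s) of a posted document back into one string.
function getRequestContent(doc) {
    // Under 64Kb: Domino supplies a single request_content field.
    if (doc.request_content !== undefined) {
        return doc.request_content;
    }
    // Over 64Kb: concatenate request_content_001, request_content_002, ...
    const parts = [];
    for (let i = 1; ; i++) {
        const fieldName = "request_content_" + String(i).padStart(3, "0");
        if (doc[fieldName] === undefined) break;
        parts.push(doc[fieldName]);
    }
    return parts.join("");
}
```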

The final piece of the puzzle, in terms of interpreting the posted data on the server side, is to convert the JSON string into an actual Java object.  There's no native way of doing this in Java, but the huge advantage of Java over LotusScript server agents - and I did try LotusScript first - is that Java agents can easily import any number of third-party .jar files to do the donkey work for them.  There are a number of such .jars that will convert JSON strings to Java objects, and vice versa.  Douglas Crockford's JSON reference page lists over 20 JSON packages for Java.

I went with Crockford's own org.json library, which you can download from the Maven Repository.  This gives you a new class, called JSONObject, and this is what you should use.  Don't try to define your own Java data class and then map that to the JSON data somehow.  I tried that at first, and ran into some weird Domino Java errors.

Here's some code that turns the JSON string into a JSONObject.  It then prints the various object members to the Domino server console:
JSONObject obj = new JSONObject(decodedString);

Boolean devMode = false;
if (obj.has("devmode")) {
    devMode = obj.getBoolean("devmode");
    System.out.println("devMode (variable) = " + devMode);
}

if (devMode) {
    System.out.println("targetdbpath=" + obj.getString("targetdbpath"));
    System.out.println("targetview=" + obj.getString("targetview"));
    System.out.println("targetdockey=" + obj.getString("targetdockey"));
    System.out.println("targetfieldname=" + obj.getString("targetfieldname"));
    System.out.println("updatedbyfieldname=" + obj.getString("updatedbyfieldname"));
    System.out.println("effectiveUserName=" + agentContext.getEffectiveUserName());
}

Now that I have the data, and know where it has to go, it's pretty much standard Notes agent stuff to paste the data there.

Csaba Kiss  08/13/2016 9:48:25 AM  NodeJS posting data to Domino

Necessity is the mother of inventions. It's amazing what desperation can do to your workflow. I really like your hybrid solution.