Superpatterns Pat Patterson on the Cloud, Identity and Single Malt Scotch


Salesforce Mutual Authentication – Part 3: Java HTTP Clients

In part 1 of this short series of blog entries on Salesforce's Mutual Authentication feature, I explained how to enable, configure and test Mutual Authentication. In part 2, I documented the shortcomings of Salesforce's Web Service Connector when trying to use Mutual Authentication, and showed how to work around them. This time, I'm going to show you how to use common Java HTTP clients to call Salesforce APIs with Mutual Authentication.

Recall from part 1 that enabling Mutual Authentication on a Salesforce Profile means that users with that profile must call a separate API endpoint, connecting via TLS with a client key and certificate chain. A Java client application can load the client key and certificate as I explained in part 2:

// Make a KeyStore from the PKCS-12 file
KeyStore ks = KeyStore.getInstance("PKCS12");
try (FileInputStream fis = new FileInputStream(KEYSTORE_PATH)) {
  ks.load(fis, KEYSTORE_PASSWORD.toCharArray());
}

// Make a KeyManagerFactory from the KeyStore
KeyManagerFactory kmf = KeyManagerFactory.getInstance("SunX509");
kmf.init(ks, KEYSTORE_PASSWORD.toCharArray());

// Now make an SSL Context with our Key Manager and the default Trust Manager
SSLContext sslContext = SSLContext.getInstance("TLS");
sslContext.init(kmf.getKeyManagers(), null, null);

We'll also need to obtain a session ID. I'll just reuse the SOAP login code from last time, though you could also use any of the OAuth mechanisms.

// Login as normal to get instance URL and session token
ConnectorConfig config = new ConnectorConfig();
config.setUsername(USERNAME);
config.setPassword(PASSWORD);

// Uncomment for more detail on what's going on!
// config.setTraceMessage(true);

// This will set the session info in config
PartnerConnection connection = Connector.newConnection(config);

// Display some current settings
System.out.println("Auth EndPoint: "+config.getAuthEndpoint());
System.out.println("Service EndPoint: "+config.getServiceEndpoint());

String sessionId = config.getSessionId();
String instance = new URL(config.getServiceEndpoint()).getHost();
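
If you'd rather not use the SOAP login, here's a rough sketch of the OAuth 2.0 username-password flow as an alternative way to obtain a token. This isn't part of the sample project - CLIENT_ID and CLIENT_SECRET are assumed to be the consumer key and secret of a Connected App you've configured:

// Hypothetical alternative to the SOAP login - not in the GitHub project
URL tokenUrl = new URL("https://login.salesforce.com/services/oauth2/token");
HttpURLConnection tokenConn = (HttpURLConnection) tokenUrl.openConnection();
tokenConn.setRequestMethod("POST");
tokenConn.setDoOutput(true);
String form = "grant_type=password"
    + "&client_id=" + URLEncoder.encode(CLIENT_ID, "UTF-8")
    + "&client_secret=" + URLEncoder.encode(CLIENT_SECRET, "UTF-8")
    + "&username=" + URLEncoder.encode(USERNAME, "UTF-8")
    + "&password=" + URLEncoder.encode(PASSWORD, "UTF-8");
try (OutputStream os = tokenConn.getOutputStream()) {
  os.write(form.getBytes(StandardCharsets.UTF_8));
}
// Parse the JSON response for access_token (it can be used just like the
// session ID above) and instance_url with your favorite JSON library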

Let's look, then, at how we proceed in a few common scenarios. All of the code below is available in this GitHub project.

Java's HttpURLConnection

This is pretty much the most basic way of accessing an HTTP endpoint in Java. We create a URL object and get the HttpURLConnection as usual, then we can set the SSLSocketFactory on the connection:

// URL to get a list of REST services
// For example: https://yourInstance.salesforce.com:8443/services/data/v41.0
URL url = new URL("https://" + instance + ":" + MUTUAL_AUTHENTICATION_PORT
    + "/services/data/v" + API_VERSION);

HttpURLConnection conn = (HttpURLConnection)url.openConnection();

// Check that we did get an HttpsURLConnection before casting to it
if (conn instanceof HttpsURLConnection) {
  // Set the SSLSocketFactory created from our client key and certificate
  ((HttpsURLConnection)conn).setSSLSocketFactory(sslContext.getSocketFactory());
}

Now we set the authorization header as we normally would. I'm also using the X-PrettyPrint header to make the REST API response a bit easier to read.

// Set the Authorization header
conn.setRequestProperty("Authorization", "OAuth "+sessionId);
// Make the response pretty
conn.setRequestProperty("X-PrettyPrint", "1");

Finally, we'll read the response from the HttpURLConnection's InputStream and dump it to System.out:

// Dump the response to System.out
try (BufferedReader br =
    new BufferedReader(
      new InputStreamReader(conn.getInputStream()))) {
  String input;

  while ((input = br.readLine()) != null){
    System.out.println(input);
  }
}

The result is the expected list of Salesforce REST services:

  "tooling" : "/services/data/v41.0/tooling",
  "metadata" : "/services/data/v41.0/metadata",
  ...lots more...
  "sobjects" : "/services/data/v41.0/sobjects",
  "actions" : "/services/data/v41.0/actions",
  "support" : "/services/data/v41.0/support"

Apache HttpClient

How does the same example look with Apache HttpClient? We just need to set the SSLContext in the Apache CloseableHttpClient:

// URL to get a list of REST services
String url = "https://" + instance + ":" + MUTUAL_AUTHENTICATION_PORT
    + "/services/data/v" + API_VERSION;

// Set the SSLContext in the HttpClient
try (CloseableHttpClient httpclient = HttpClients.custom()
    .setSSLContext(sslContext)
    .build()) {
  HttpGet httpGet = new HttpGet(url);
  // Set the Authorization header
  httpGet.addHeader("Authorization", "OAuth "+sessionId);
  // Make the response pretty
  httpGet.addHeader("X-PrettyPrint", "1");

  // Execute the request
  try (CloseableHttpResponse response = httpclient.execute(httpGet);
       BufferedReader br =
         new BufferedReader(
           new InputStreamReader(response.getEntity().getContent()))) {
    // Dump the response to System.out
    String input;
    while ((input = br.readLine()) != null){
      System.out.println(input);
    }
  }
}

The output is identical to the previous example.

Eclipse Jetty

Jetty is a little more complex. We need to create a Jetty SslContextFactory, rather than a standard Java KeyManagerFactory and SSLContext. Note that we need to set the KeyStore password in the SslContextFactory:

SslContextFactory sslContextFactory = new SslContextFactory();
sslContextFactory.setKeyStorePath(KEYSTORE_PATH);
// Need to set password in the SslContextFactory even though it's set in the KeyStore
sslContextFactory.setKeyStorePassword(KEYSTORE_PASSWORD);

Now we can create a Jetty HttpClient with the SslContextFactory, and start it:

HttpClient httpClient = new HttpClient(sslContextFactory);
httpClient.start();

Executing the request proceeds as usual, and results in identical output:

String response = httpClient.newRequest(url)
    .header("Authorization", "OAuth " + sessionId)
    .header("X-PrettyPrint", "1")


Don't forget to stop the HttpClient when you're done with it:

httpClient.stop();

Salesforce Mutual Authentication offers an additional layer of security over default server-authenticated TLS - clients must possess the key corresponding to a certificate configured in the Salesforce org. As I showed in part 1 of this series of blog entries, configuring Mutual Authentication in Salesforce is straightforward, as is testing the connection with curl, although the Salesforce documentation is not totally accurate. Salesforce's Web Service Connector requires some modifications to make it compatible with Mutual Authentication, although, as I explained in part 2, it is possible to engineer around the issues. The popular Java HTTP clients all provide mechanisms for setting the client key and certificate, and using them to call the Salesforce REST APIs is straightforward. Source code showing how to use Mutual Authentication via all of the above mechanisms is available in my mutual-auth GitHub repo.

I hope you've enjoyed this exploration of Mutual Authentication, and that you've saved yourself a bit of time by reading it!


Salesforce Mutual Authentication – Part 2: Web Service Connector (WSC)

In my last blog entry I explained how to enable, configure and test Salesforce's Mutual Authentication feature. This time, I'll share my experience getting Mutual Authentication working with the Java client SDK for Salesforce's SOAP and Bulk APIs: Web Service Connector, aka WSC.

StreamSets Data Collector's Salesforce integration accesses the SOAP and Bulk APIs via WSC, so, when I was implementing Mutual Authentication in SDC, I examined WSC to see where I could configure the client key and certificate chain. Although there is no mention of SSLContext or SSLSocketFactory in the WSC code, it is possible to set a custom TransportFactory on the WSC ConnectorConfig object. The TransportFactory is used to create a Transport, which in turn is responsible for making the HTTPS connection to Salesforce.

To enable Mutual Authentication I would need to create an SSLContext with the client key and certificate chain. This is straightforward enough:

// Make a KeyStore from the PKCS-12 file
KeyStore ks = KeyStore.getInstance("PKCS12");
try (FileInputStream fis = new FileInputStream(KEYSTORE_PATH)) {
  ks.load(fis, KEYSTORE_PASSWORD.toCharArray());
}

// Make a KeyManagerFactory from the KeyStore
KeyManagerFactory kmf = KeyManagerFactory.getInstance("SunX509");
kmf.init(ks, KEYSTORE_PASSWORD.toCharArray());

// Now make an SSL Context with our Key Manager and the default Trust Manager
SSLContext sslContext = SSLContext.getInstance("TLS");
sslContext.init(kmf.getKeyManagers(), null, null);

Given the SSLContext, we can create an SSLSocketFactory and set it on the HttpsURLConnection. Here's the code we'd use if we were simply using the classes directly:

URL url = new URL(someURL);
HttpURLConnection conn = (HttpURLConnection)url.openConnection();
// Check that we did get an HttpsURLConnection before casting to it
if (conn instanceof HttpsURLConnection) {
  ((HttpsURLConnection) conn).setSSLSocketFactory(sslContext.getSocketFactory());
}

Mutual Authentication and the Salesforce SOAP API

The default Transport implementation, JdkHttpTransport, looked like a good place to start. My first thought was to extend JdkHttpTransport, overriding the relevant methods. Unfortunately, JdkHttpTransport's createConnection method, which calls url.openConnection(), is static, so it's impossible to override. The connectRaw() method also looked like a promising route, since it calls createConnection(), performs some setup on the HttpURLConnection, and then gets the OutputStream, but it's private, and once the OutputStream has been created, it's too late to set the SSLSocketFactory.

While searching for an answer, I came across this comment from Salesforce Software Engineer Steven Lawrance in a Salesforce Trailblazer Community answer:

You'll generally need to set the TransportFactory in the ConnectorConfig object that you use to create the PartnerConnection (or EnterpriseConnection, etc), though another option is to set the Transport.

It's possible to create a Transport implementation that is based off of the [JdkHttpTransport] class while having the JdkHttpTransport create the connection with its static createConnection method. Your Transport implementation can then set up the SSLSocketFactory (casting the connection to HttpsURLConnection is required to do that), and your SSLSocketFactory can be created from creating an SSLContext that is initialized to include your client certificate.

I followed Steven's advice and created ClientSSLTransport, a clone of JdkHttpTransport, and ClientSSLTransportFactory, its factory class. To minimize the amount of copied code, I changed the implementation of connectRaw() to call JdkHttpTransport.createConnection() and then set the SSLSocketFactory:

private OutputStream connectRaw(String uri, HashMap<String, String> httpHeaders, boolean enableCompression)
throws IOException {
  url = new URL(uri);

  connection = JdkHttpTransport.createConnection(config, url, 
      httpHeaders, enableCompression);
  if (connection instanceof HttpsURLConnection) {
    // Set our SSLSocketFactory, created from the client key and certificate
    ((HttpsURLConnection) connection).setSSLSocketFactory(sslContext.getSocketFactory());
  }
  if (config.useChunkedPost()) {
    connection.setChunkedStreamingMode(4096);
  }

  return connection.getOutputStream();
}

With this in place, I wrote a simple test application to call an API with Mutual Authentication. As I mentioned in the previous blog post, the Salesforce login service does not support Mutual Authentication, so the initial code to authenticate is just the same as in the default case:

// Login as normal to get instance URL and session token
ConnectorConfig config = new ConnectorConfig();
config.setUsername(USERNAME);
config.setPassword(PASSWORD);

connection = Connector.newConnection(config);

// display some current settings
System.out.println("Auth EndPoint: "+config.getAuthEndpoint());
System.out.println("Service EndPoint: "+config.getServiceEndpoint());

Running this bit of code revealed that not only does the login service not support Mutual Authentication, it also returns the default service endpoint, on the standard port 443:

Auth EndPoint:
Service EndPoint:

Before we can call an API, then, we have to override the service endpoint, changing the port from the default 443 to 8443, as well as setting the TransportFactory:

String serviceEndpoint = config.getServiceEndpoint();
// Override service endpoint port to 8443
config.setServiceEndpoint(changePort(serviceEndpoint, 8443));

// Set custom transport factory
config.setTransportFactory(new ClientSSLTransportFactory(sslContext));


private static String changePort(String url, int port) throws URISyntaxException {
  URI uri = new URI(url);
  return new URI(
      uri.getScheme(), uri.getUserInfo(), uri.getHost(),
      port, uri.getPath(), uri.getQuery(), uri.getFragment()).toString();
}

With this in place, I could call a SOAP API in the normal way:

System.out.println("Querying for the 5 newest Contacts...");

// query for the 5 newest contacts
QueryResult queryResults = connection.query("SELECT Id, FirstName, LastName, Account.Name " +
    "FROM Contact WHERE AccountId != NULL ORDER BY CreatedDate DESC LIMIT 5");
if (queryResults.getSize() > 0) {
  for (SObject s: queryResults.getRecords()) {
    System.out.println("Id: " + s.getId() + " " + s.getField("FirstName") + " " +
        s.getField("LastName") + " - " + s.getChild("Account").getField("Name"));

With output:

Querying for the 5 newest Contacts...
Id: 00336000009BusFAAS Rose Gonzalez - Edge Communications
Id: 00336000009BusGAAS Sean Forbes - Edge Communications
Id: 00336000009BusHAAS Jack Rogers - Burlington Textiles Corp of America
Id: 00336000009BusIAAS Pat Stumuller - Pyramid Construction Inc.
Id: 00336000009BusJAAS Andy Young - Dickenson plc


Mutual Authentication and the Salesforce Bulk API

Now, what about the Bulk API? Running a test app resulted in an error when I tried to create a Bulk API Job. Tracing through the WSC code revealed that when ConnectorConfig.createTransport() creates a Transport with a custom TransportFactory, it does not set the ConnectorConfig on the Transport:

public Transport createTransport() throws ConnectionException {
  if(transportFactory != null) {
    return transportFactory.createTransport();
  }

  try {
    Transport t = (Transport)getTransport().newInstance();
    t.setConfig(this);
    return t;
  } catch (InstantiationException e) {
    throw new ConnectionException("Failed to create new Transport " + getTransport());
  } catch (IllegalAccessException e) {
    throw new ConnectionException("Failed to create new Transport " + getTransport());
  }
}

ConnectorConfig.createTransport() is only used when the WSC Bulk API client is POSTing to the Bulk API, since the POST method is hardcoded into JdkHttpTransport.connectRaw() (all SOAP requests use HTTP POST). When the client wants to do a GET, it uses BulkConnection.doHttpGet(), which does not use ConnectorConfig.createTransport(), instead calling config.createConnection():

private InputStream doHttpGet(URL url) throws IOException, AsyncApiException {
  HttpURLConnection connection = config.createConnection(url, null);
  connection.setRequestProperty(SESSION_ID, config.getSessionId());

The problem here is that config.createConnection() ultimately just calls url.openConnection() directly, bypassing any custom Transport:

public HttpURLConnection createConnection(URL url,
HashMap<String, String> httpHeaders, boolean enableCompression) throws IOException {

  if (isTraceMessage()) {
    getTraceStream().println( "WSC: Creating a new connection to " + url + " Proxy = " +
        getProxy() + " username " + getProxyUsername());

  HttpURLConnection connection = (HttpURLConnection) url.openConnection(getProxy());

Luckily, config.createConnection() is public, so my solution to these problems was to subclass ConnectorConfig as MutualAuthConnectorConfig, providing an SSLContext in its constructor, and overriding createConnection():

public class MutualAuthConnectorConfig extends ConnectorConfig {
  private final SSLContext sc;

  public MutualAuthConnectorConfig(SSLContext sc) {
    this.sc = sc;
  }

  @Override
  public HttpURLConnection createConnection(URL url, HashMap<String, String> httpHeaders,
      boolean enableCompression) throws IOException {
    HttpURLConnection connection = super.createConnection(url, httpHeaders, enableCompression);
    if (connection instanceof HttpsURLConnection) {
      ((HttpsURLConnection) connection).setSSLSocketFactory(sc.getSocketFactory());
    }
    return connection;
  }
}

If you look at ClientSSLTransport and ClientSSLTransportFactory, you'll notice that the factory has a two-argument constructor that allows us to pass the ConnectorConfig. This ensures that the Transport can get the configuration it needs, despite the fact that ConnectorConfig.createTransport() neglects to set the config.
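
For reference, here's a minimal sketch of what that factory might look like. The real ClientSSLTransport and ClientSSLTransportFactory are in the GitHub project; I'm assuming here that ClientSSLTransport takes the SSLContext in its constructor:

public class ClientSSLTransportFactory implements TransportFactory {
  private final SSLContext sslContext;
  private final ConnectorConfig config;

  public ClientSSLTransportFactory(SSLContext sslContext) {
    this(sslContext, null);
  }

  public ClientSSLTransportFactory(SSLContext sslContext, ConnectorConfig config) {
    this.sslContext = sslContext;
    this.config = config;
  }

  @Override
  public Transport createTransport() {
    ClientSSLTransport transport = new ClientSSLTransport(sslContext);
    if (config != null) {
      // Work around ConnectorConfig.createTransport() not setting the config
      transport.setConfig(config);
    }
    return transport;
  }
}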

Now, when creating a BulkConnection from a Partner API ConnectorConfig, I use my subclassed ConnectorConfig class AND set the TransportFactory on it, so that the SSLSocketFactory is set for both GET and POST:

  ConnectorConfig bulkConfig = new MutualAuthConnectorConfig(sslContext);
  bulkConfig.setTransportFactory(new ClientSSLTransportFactory(sslContext, bulkConfig));

  // The endpoint for the Bulk API service is the same as for the normal 
  // SOAP uri until the /Soap/ part. From here it's '/async/versionNumber' 
  String soapEndpoint = partnerConfig.getServiceEndpoint(); 
  String restEndpoint = soapEndpoint.substring(0, soapEndpoint.indexOf("Soap/")) 
      + "async/" + conf.apiVersion; 

  // Remember to swap the port for Mutual Authentication! 
  bulkConfig.setRestEndpoint(changePort(restEndpoint, 8443));
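
To finish the setup, we reuse the session ID from the partner login and construct the BulkConnection - roughly like this (a sketch; see the sample app for the full code):

  bulkConfig.setSessionId(partnerConfig.getSessionId());
  BulkConnection bulkConnection = new BulkConnection(bulkConfig);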

Running my simple sample app showed that I was able to successfully retrieve data via the Bulk API:

Querying for the 5 newest Contacts via the Bulk API...
Created job: 7503600000KbCyMAAV
Batch state is: Queued
Sleeping for a second...
Sleeping for a second...
Sleeping for a second...
Batch state is: Completed
Result header:[Id, FirstName, LastName, Account.Name]
Id: 00336000009BusFAAS Rose Gonzalez - Edge Communications
Id: 00336000009BusGAAS Sean Forbes - Edge Communications
Id: 00336000009BusHAAS Jack Rogers - Burlington Textiles Corp of America
Id: 00336000009BusIAAS Pat Stumuller - Pyramid Construction Inc.
Id: 00336000009BusJAAS Andy Young - Dickenson plc

You can grab my sample app and all of the above-mentioned files here.

Proposed WSC Changes

With the above changes I was able to call both the SOAP and Bulk APIs while leaving the WSC JAR files unchanged. I filed issue #213 on WSC, and then fixed the problems in WSC itself (pull request) by adding an SSLContext member variable and its getter/setter to ConnectorConfig, and having JdkHttpTransport.connectRaw() and BulkConnection.doHttpGet() set the SSLSocketFactory on the HttpsURLConnection immediately after it's created. I'll update this blog entry if and when my pull request is accepted.
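
In outline, the proposed change looks something like this - a sketch to show the idea rather than the exact code in the pull request, so treat the method names as illustrative:

// ConnectorConfig gains an SSLContext property...
private SSLContext sslContext;

public void setSslContext(SSLContext sslContext) {
  this.sslContext = sslContext;
}

public SSLContext getSslContext() {
  return sslContext;
}

// ...and the transports apply it right after creating the connection:
if (connection instanceof HttpsURLConnection && config.getSslContext() != null) {
  ((HttpsURLConnection) connection).setSSLSocketFactory(
      config.getSslContext().getSocketFactory());
}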


The first blog entry in this series explained how to enable, configure and test Salesforce Mutual Authentication. This time, I showed how to work around the shortcomings in the Salesforce Web Service Connector (WSC) to allow it to work with Mutual Authentication.

In part 3, the final installment in this series, I show you how to use Mutual Authentication with common HTTP clients to access Salesforce API endpoints directly.


Salesforce Mutual Authentication – Part 1: the Basics

Mutual Authentication was introduced by Salesforce in the Winter '14 release. As the Salesforce Winter '14 release notes explain,  mutually authenticated transport layer security (TLS) allows secure server-to-server connections initiated by a client using client certificate authentication, and means that both the client and the server authenticate and verify that they are who they say they are. In this blog post, I'll show you how to enable Mutual Authentication and perform some basic tests using the curl command line tool. In a future blog post, I'll show you how to implement Mutual Authentication in your Java apps.

In the default case, without Mutual Authentication, when an API client connects to Salesforce via TLS, the client authenticates the server via its TLS certificate, but the TLS connection itself gives the server no information on the client's identity. After the TLS session is established, the client sends a login request containing its credentials over the secure channel, and the Salesforce login service responds with a session ID. The client then sends this session ID with each API request.

Mutual Authentication provides an additional layer of security. Each time you connect to a Salesforce API, the server checks that the client's certificate is valid for the client's org, as well as checking the validity of the session ID. Note that Mutual Authentication is intended for API use and not for user interface (web browser) use.

Before you can use Mutual Authentication, you need to obtain a client certificate. This certificate must be issued by a certificate authority with its root certificate in the Salesforce Outbound Messaging SSL CA Certificates list; Mutual Authentication will not work with a self-signed client certificate. More information is available in the Salesforce document, Set Up a Mutual Authentication Certificate. I bought an SSL certificate from GoDaddy - you can almost certainly find a cheaper alternative if you spend some time looking.

Enabling Mutual Authentication in Salesforce

Mutual Authentication is not enabled by default. You must open a support case with Salesforce to enable it. When it is enabled, you will see a Mutual Authentication Certificates section at Setup | Administer | Security Controls | Certificate and Key Management.

Mutual Authentication Configuration

You must upload a PEM-encoded client certificate to this list. Note that you need only upload the client certificate itself; do not upload a certificate chain.

You will also need to create a user profile with the Enforce SSL/TLS Mutual Authentication user permission enabled. Clone an existing Salesforce profile and enable Enforce SSL/TLS Mutual Authentication. Check that the profile has the Salesforce object permissions that your application will need to access data. Assign the new profile to the user which your app will use to access Salesforce.

Testing Mutual Authentication with curl

This was a stumbling block for me for some time. First, despite what the Salesforce documentation (Configure Your API Client to Use Mutual Authentication) says, the Salesforce login service does not support Mutual Authentication. You cannot connect to the login service on port 8443 as described in the docs.

You can, however, send a normal authentication request for a user with Enforce SSL/TLS Mutual Authentication enabled to the default TLS port, 443. The login service responds with a session ID as for any other login request. Mutual Authentication is enforced when you use the session ID with an API endpoint.

Let's try this out. Here's a SOAP login request - add a username/password and save it to login.xml:

<?xml version="1.0" encoding="utf-8" ?>
<env:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <n1:login xmlns:n1="urn:partner.soap.sforce.com">
      <n1:username>YOUR_USERNAME</n1:username>
      <n1:password>YOUR_PASSWORD</n1:password>
    </n1:login>
  </env:Body>
</env:Envelope>

Now you can send it to the login service with curl:

$ curl -s -k https://login.salesforce.com/services/Soap/u/41.0 \
    -H "Content-Type: text/xml; charset=UTF-8" \
    -H "SOAPAction: login" \
    -d @login.xml | xmllint --format -
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
          ...lots of user data...

We need to create a PEM file for curl with the signing key, client certificate, and all the certificates in its chain except the root. This file looks something like this:

-----BEGIN PRIVATE KEY-----
...base64-encoded private key data...
-----END PRIVATE KEY-----
-----BEGIN CERTIFICATE-----
...base64-encoded client certificate data...
-----END CERTIFICATE-----
...then a BEGIN/END CERTIFICATE block for each CA issuing cert in the chain...

We'll call the getUserInfo API. Here's the SOAP request - add the session ID returned from login and save it as getuserinfo.xml:

<?xml version="1.0" encoding="utf-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:urn="urn:partner.soap.sforce.com">
  <soapenv:Header>
    <urn:SessionHeader><urn:sessionId>SESSION_ID_FROM_LOGIN</urn:sessionId></urn:SessionHeader>
  </soapenv:Header>
  <soapenv:Body>
    <urn:getUserInfo />
  </soapenv:Body>
</soapenv:Envelope>

Now we're ready to make a mutually authenticated call to a Salesforce API! You'll need to specify the correct instance, as returned in the login response, in the URL. Note the port number is 8443:

$ curl -s -k https://yourInstance.salesforce.com:8443/services/Soap/u/41.0 \
    -H "Content-Type: text/xml; charset=UTF-8" \
    -H "SOAPAction: example" \
    -d @getuserinfo.xml \
    -E fullcert.pem | xmllint --format -
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <type>API REQUESTS</type>
        ...all the user data...

Now let's look at a couple of failure modes. What happens when we call the 8443 port, but don't pass a client certificate?

$ curl -s -k https://yourInstance.salesforce.com:8443/services/Soap/u/41.0 \
    -H "Content-Type: text/xml; charset=UTF-8" \
    -H "SOAPAction: example" \
    -d @getuserinfo.xml
<html><head><title>Certificate Error</title></head><body bgcolor=#ffffff text=#3198d8><center><img src=""><p><h3>Client certificate error:<i>No client certificate provided.</i></h3></center></body></html>

Note the HTML response, rather than XML!

What about calling the regular 443 port with this session ID?

$ curl -s -k https://yourInstance.salesforce.com/services/Soap/u/41.0 \
    -H "Content-Type: text/xml; charset=UTF-8" \
    -H "SOAPAction: example" \
    -d @getuserinfo.xml
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:sf="urn:fault.partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <faultstring>MUTUAL_AUTHENTICATION_FAILED: This session could not be mutually authenticated for use with the API</faultstring>
        <sf:UnexpectedErrorFault xsi:type="sf:UnexpectedErrorFault">
          <sf:exceptionMessage>This session could not be mutually authenticated for use with the API</sf:exceptionMessage>

This time we get a much more palatable response!

Now you know how to get the basics of Salesforce Mutual Authentication working. In part 2 of this series, I look at using Salesforce's Web Service Connector (WSC) to access the SOAP and Bulk APIs with Mutual Authentication, and in part 3, I explain how to access the Salesforce REST APIs with common Java HTTP clients such as Apache HttpClient and Eclipse Jetty.


Uploading data to the Salesforce Wave Analytics Cloud

As you might know from my last post, I moved from Salesforce to StreamSets a couple of weeks ago. It didn't take long before I was signing up for a fresh Developer Edition org, though! I'm creating a StreamSets destination to allow me to write data to Wave Analytics datasets, and it's fair to say that the documentation is... sparse. Working from the Wave Analytics External Data API Developer Guide and Wave Analytics External Data Format Reference (why are these separate docs???), and my understanding of how Salesforce works, I was able to put together a working sample Java app that creates a dataset from CSV data.

Here's the code - I explain a few idiosyncrasies below, and reveal the easiest way to get this working with Wave.

package wsc;

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

import com.sforce.soap.partner.Connector;
import com.sforce.soap.partner.Error;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.soap.partner.QueryResult;
import com.sforce.soap.partner.SaveResult;
import com.sforce.soap.partner.sobject.SObject;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;

public class Main {

	// Describes the data we'll be uploading
	static String metadataJson = 
			"{\n" +
			"    \"fileFormat\": {\n" +
			"        \"charsetName\": \"UTF-8\",\n" +
			"        \"fieldsDelimitedBy\": \",\",\n" +
			"        \"fieldsEnclosedBy\": \"\\\"\",\n" +
			"        \"numberOfLinesToIgnore\": 1\n" +
			"    },\n" +
			"    \"objects\": [\n" +
			"        {\n" +
			"            \"connector\": \"AcmeCSVConnector\",\n" +
			"            \"description\": \"\",\n" +
			"            \"fields\": [\n" +
			"                {\n" +
			"                    \"description\": \"\",\n" +
			"                    \"fullyQualifiedName\": \"SalesData.Name\",\n" +
			"                    \"isMultiValue\": false,\n" +
			"                    \"isSystemField\": false,\n" +
			"                    \"isUniqueId\": false,\n" +
			"                    \"label\": \"Account Name\",\n" +
			"                    \"name\": \"Name\",\n" +
			"                    \"type\": \"Text\"\n" +
			"                },\n" +
			"                {\n" +
			"                    \"defaultValue\": \"0\",\n" +
			"                    \"description\": \"\",\n" +
			"                    \"format\": \"$#,##0.00\",\n" +
			"                    \"fullyQualifiedName\": \"SalesData.Amount\",\n" +
			"                    \"isSystemField\": false,\n" +
			"                    \"isUniqueId\": false,\n" +
			"                    \"label\": \"Opportunity Amount\",\n" +
			"                    \"name\": \"Amount\",\n" +
			"                    \"precision\": 10,\n" +
			"                    \"scale\": 2,\n" +
			"                    \"type\": \"Numeric\"\n" +
			"                },\n" +
			"                {\n" +
			"                    \"description\": \"\",\n" +
			"                    \"fiscalMonthOffset\": 0,\n" +
			"                    \"format\": \"MM/dd/yyyy\",\n" +
			"                    \"fullyQualifiedName\": \"SalesData.CloseDate\",\n" +
			"                    \"isSystemField\": false,\n" +
			"                    \"isUniqueId\": false,\n" +
			"                    \"label\": \"Opportunity Close Date\",\n" +
			"                    \"name\": \"CloseDate\",\n" +
			"                    \"type\": \"Date\"\n" +
			"                }\n" +
			"            ],\n" +
			"            \"fullyQualifiedName\": \"SalesData\",\n" +
			"            \"label\": \"Sales Data\",\n" +
			"            \"name\": \"SalesData\"\n" +
			"        }\n" +
			"    ]\n" +

	// This is the data we'll be uploading
	static String data = 
			"Name,Amount,CloseDate\n" +
			"opportunityA,100.99,6/30/2014\n" +

	// This will be the name of the data set in Wave
	// Must be unique across the organization
	static String datasetName = "tester";

	// Change these as appropriate
	static final String USERNAME = "";
	static final String PASSWORD = "p455w0rd";

	// Status values indicating that the job is done
	static final List<String> DONE = (List<String>)Arrays.asList(
			"Completed", "CompletedWithWarnings", "Failed", "NotProcessed");

	public static void main(String[] args) {
		PartnerConnection connection;
		ConnectorConfig config = new ConnectorConfig();

		try {
			config.setUsername(USERNAME);
			config.setPassword(PASSWORD);

			connection = Connector.newConnection(config);

			System.out.println("Successfully authenticated as "+config.getUsername());

			// Wave time!
			// First, we create an InsightsExternalData job
			SObject sobj = new SObject();
			sobj.setType("InsightsExternalData");
			sobj.setField("Format", "Csv");
			sobj.setField("Operation", "Overwrite");
			sobj.setField("EdgemartAlias", datasetName);
			sobj.setField("MetadataJson", metadataJson.getBytes(StandardCharsets.UTF_8));

			String parentID = null;
			SaveResult[] results = connection.create(new SObject[] { sobj });
			for(SaveResult sv:results) {
				if(sv.isSuccess()) {
					parentID = sv.getId();
					System.out.println("Success creating InsightsExternalData: "+parentID);
				} else {
					for (Error e : sv.getErrors()) {
						System.out.println("Error: " + e.getMessage());

			// Now upload some actual data. You can do this as many times as necessary,
			// subject to the Wave External Data API Limits
			sobj = new SObject();
			sobj.setType("InsightsExternalDataPart");
			sobj.setField("DataFile", data.getBytes(StandardCharsets.UTF_8));
			sobj.setField("InsightsExternalDataId", parentID);
			sobj.setField("PartNumber", 1);
			results = connection.create(new SObject[] { sobj });
			for(SaveResult sv:results) {
				if(sv.isSuccess()) {
					String rowId = sv.getId();
					System.out.println("Success creating InsightsExternalDataPart: "+rowId);
				} else {
					for (Error e : sv.getErrors()) {
						System.out.println("Error: " + e.getMessage());

			// Instruct Wave to start processing the data
			sobj = new SObject();
			sobj.setType("InsightsExternalData");
			sobj.setField("Action", "Process");
			sobj.setId(parentID);
			results = connection.update(new SObject[] { sobj });
			for(SaveResult sv:results) {
				if(sv.isSuccess()) {
					String rowId = sv.getId();
					System.out.println("Success updating InsightsExternalData: "+rowId);
				} else {
					for (Error e : sv.getErrors()) {
						System.out.println("Error: " + e.getMessage());

			// Periodically check whether the job is done
			boolean done = false;
			int sleepTime = 1000;
			while (!done) {
				try {
					Thread.sleep(sleepTime);
					sleepTime *= 2;
				} catch(InterruptedException ex) {
					// Interrupted - just poll again
				}
				QueryResult queryResults = connection.query(
						"SELECT Status, StatusMessage FROM InsightsExternalData WHERE Id = '" + parentID + "'");
				if (queryResults.getSize() > 0) {
					for (SObject s: queryResults.getRecords()) {
						String status = (String)s.getField("Status");
						if (DONE.contains(status)) {
							done = true;
							String statusMessage = (String)s.getField("StatusMessage");
							if (statusMessage != null) {
								System.out.println(statusMessage);
							}
						}
					}
				} else {
					System.out.println("Can't find InsightsExternalData with Id " + parentID);
				}
			}
		} catch (ConnectionException e1) {
			e1.printStackTrace();
		}
	}
}

  • The imports and connection setup - I'm using WSC with the SOAP Partner API, just because I'm working in Java and that's what the bits of sample code included in the docs used.
  • The metadataJson string - this is the metadata that describes the CSV you're uploading. It's optional, but recommended.
  • The data string - CSV is the only format currently supported, though the docs reserve a binary format for Salesforce use.
  • datasetName - the dataset name must be unique across your org.
  • USERNAME and PASSWORD - change these to your login credentials.
  • The DataFile field - the API wants base64-encoded data, so you might be tempted to encode the data yourself and pass the resulting string, but that just results in an error message. Instead, pass the raw bytes of the unencoded string and let the WSC library take care of the encoding.
  • The InsightsExternalDataPart block - you can repeat this block in a loop as many times as necessary, subject to the Wave External Data API limits (see the sketch after this list).
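
Here's a rough sketch of what that loop might look like for a larger file, assuming you've split the CSV into a List<byte[]> called chunks, each within the documented part size limit. This isn't in the sample above - it's just to illustrate the pattern:

			int partNumber = 1;
			for (byte[] chunk : chunks) {
				SObject part = new SObject();
				part.setType("InsightsExternalDataPart");
				part.setField("DataFile", chunk);
				part.setField("InsightsExternalDataId", parentID);
				part.setField("PartNumber", partNumber++);
				connection.create(new SObject[] { part });
				// Check the SaveResults here just as in the single-part case above
			}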

You will need the WSC jar, and the SOAP Partner API jar - follow Jeff Douglas' excellent article Introduction to the Web Services Connector for details on setting this up - use the 'uber' JAR as it contains all the required dependencies. The sample above used Jeff's Partner API sample as a starting point - thanks, Jeff!

The fastest way to get started with Wave is, of course, Salesforce Trailhead. Follow the Wave Analytics Basics module and you'll end up with a Wave-enabled Developer Edition all ready to go.

Once you have your Wave DE org, and the sample app, you should be able to run it and see something like:

Successfully authenticated as
Success creating InsightsExternalData: 06V360000008RIlEAM
Success creating InsightsExternalDataPart: 06W36000000PDXFEA4
Success updating InsightsExternalData: 06V360000008RIlEAM

If you go look in the Wave Analytics app, you should see the 'tester' dataset:


Click on 'tester' and you'll see the 'big blue line':


Now you can drill into the data (all 2 rows of it!) by account name, close date etc.

You could, of course, extend the above code to accept a CSV filename and dataset name on the command line, and create all sorts of interesting extensions. Follow the StreamSets blog to learn where I plan to go with this!


Thank You For The Music

I joined the developer evangelism team at Salesforce in October 2010, nearly five and a half years ago. It's been a fantastic run, but it's time for me to move on, and today will be my last day with Salesforce.

Over the past few years I've worked with Apex, Visualforce, the APIs, Heroku, Salesforce Identity and, most recently, the Internet of Things, but, more than any of the technologies, it's the people that have made Salesforce special for me. I've worked with the best developer marketing team in the industry, and the most awesome community of admins and developers.

So, what next? Starting on Monday I'll be 'Community Champion' at StreamSets, a San Francisco-based startup focused on open source big data ingest. I'll be blogging at their Continuous Ingest Blog, speaking at conferences (including GlueCon, coming up in May), tweeting, and learning all about this 'big data' thing I keep hearing about.

Thank you, Salesforce, for my #dreamjob, and all the fun times over the years. It's been a blast!


Visualforce on Chromecast, as a Service!

After writing my last blog entry, on how to display any Visualforce Page on Google Chromecast, it occured to me that I could run the app on Heroku. So, if you have a Google Chromecast, and a Salesforce login with API access enabled, you can try it out right now.

Go to the app, running on Heroku; you'll see this page:

Visualforce on Chromecast

Follow the instructions, log in, authorize the app to access your data, and you'll be able to select a Visualforce Page to 'cast' to your TV.

Select a Visualforce Page

One new feature here - if you select a Visualforce Page that uses a standard controller, and is thus expecting a record ID as a parameter, you'll get the opportunity to select a record. For simplicity, I'm just showing the first 10 records returned by the database.

Select a Record

Choose a record, hit send, and you'll see the page displayed by the Chromecast, in this case, it's a Mini Hack we ran a couple of Dreamforces ago:


As always, the code is on GitHub.

Having done Raspberry Pi, Minecraft, and now Chromecast, I'm looking for new ideas for interesting Salesforce integrations. Leave a comment if you think of one!


Display ANY Visualforce Page on Google Chromecast

Last time, I described how I ran a simple 'Hello World' application, served from a Site, on the Google Chromecast, a $35 digital media player. In this blog entry, I'll show you how to show any Visualforce page, not just a public page on a Site, on the Chromecast.


A quick recap... (Skip this paragraph if you've already read the previous entry). Chromecast is actually a tiny wifi-enabled Linux computer, running the Chrome browser, connected to a TV or monitor via HDMI. A 'receiver' app, written in HTML5, runs on the device, which has no input capability (mouse/keyboard), while a 'sender' app runs on a 'second screen' such as a laptop, smartphone, or tablet, the two apps communicating across the local wifi network via a message bus. The sender app typically allows the user to navigate content and control the media stream shown on the Chromecast (the 'first screen'). The CastHelloText-chrome sample allows the user to type a message in the sender app on the second screen, and displays it on the first screen via the receiver app.

Given a working sample, the next question was, how to access data from the receiver app? The core problem is that the Chromecast can only load a public web page - it can't log in to Salesforce. The sender app runs on a desktop browser, smartphone or tablet, however, so perhaps it would be possible to log in there, and send a session ID to the receiver app via the message bus? I worked through a few alternatives before I hit on the optimal solution:

Load the Visualforce page via Frontdoor.jsp

Frontdoor.jsp, which has existed for some time, but has only been formally documented and supported since the Winter '14 Salesforce release, "gives users access to Salesforce from a custom Web interface, such as a remote access site or other API integration, using their existing session ID and the server URL".

To authenticate users with frontdoor.jsp, you pass the server URL and session ID to frontdoor.jsp in this format:

https://instance.salesforce.com/secur/frontdoor.jsp?sid=<session_id>&retURL=<optional_relative_url_to_open>

Sounds perfect! The only problem is that the session ID you pass to frontdoor.jsp must come from one of:

  • The access_token from an OAuth authentication (obtained with 'web' or 'full' scope)
  • The LoginResult returned from a SOAP API login() call
  • The Apex UserInfo.getSessionId()

The session ID from a Visualforce page or controller isn't going to cut it here. So, I reached for Kevin O'Hara's excellent nforce and built a quick Node.js sender app that has the user authorize API access via OAuth (including web scope!), runs a query for the list of Visualforce Pages in the org and presents them as a drop-down list. You can choose a Visualforce Page, hit 'Send', and the sender app constructs the frontdoor URL with the OAuth access token and relative URL for the page and sends it to the receiver via the message bus.


Note that, while you can indeed send any Visualforce page to the Chromecast for display, remember that the Chromecast doesn't have any capacity for user input, so tables and charts work best.

I tried a couple of approaches for the receiver app; first I simply redirected to the frontdoor URL, but then I realized that it would be more useful to load the frontdoor URL into a full-page iframe. That way, the receiver app could stay running in the 'top' document, ready to receive a different URL, and periodically reloading the iframe so that the session doesn't time out. Here it is in action:

All of the code is in my CastDemo project on GitHub. Feel free to fork it, extend it, and let me know in the comments how it works out.

When it came down to the code, this was a very straightforward integration; the vast majority of the work was thinking around the problem of how to have a device with no input capability authenticate and load a Visualforce page. Now that Frontdoor.jsp is documented and supported, it's an essential tool for the advanced developer.

POSTSCRIPT: Almost as soon as I hit 'publish' on this post, I realized I could push the app to Heroku, and allow anyone with a Chromecast and API access to Salesforce to see their Visualforce Pages on TV. Read the next installment here.


Getting Started with Chromecast on Visualforce

About a month ago, Google released the Google Cast SDK, allowing developers to create apps that run on the Chromecast, a $35 digital media player. The primary use case of Chromecast is to stream media - movies, TV shows, music and the like - via wifi to your HDMI TV/monitor, but, looking at the SDK docs, it became apparent that the Chromecast is actually a miniature ('system-on-chip') computer running Chrome OS (a Linux variant) and the Chrome browser. If it's running a browser, I wondered, could it load Visualforce pages from Salesforce and display, for example, a chart based on live data? If so, this would allow any HDMI-capable TV or monitor to be used as a dashboard at very low cost. When I was given a Chromecast by a colleague (thanks, Sandeep!) in return for alpha testing his app, I decided to find out!

This first blog post explains how I ran a simple 'Hello World' sample on the Chromecast, loading the app from Visualforce. Next time, I'll show you how I pulled data from Salesforce via the REST API and showed it as a chart.


Chromecast setup was pretty straightforward - a matter of connecting the device to an HDMI input on my TV and a USB power source, downloading and running the Chromecast app, and following the prompts to complete setup. The Chromecast app locates the device on the local network using the DIAL protocol. Note that, since the app is communicating directly with the device, it won't work on wifi networks that enforce AP/Client Isolation (many offices and hotels).

After installing the Cast Extension for Chrome and verifying that the Chromecast could display content from YouTube, it was time to put the device into development mode! This actually proved to be pretty tricky - you need to enter the Chromecast's serial number into the Google Cast SDK Developer Console. Sounds straightforward, but the serial number is laser etched into the Chromecast's black plastic case in very small type indeed. I entered it incorrectly the first time round, and had to take a photo of the serial number and zoom in to see that the last character was an S and not an 8!


Another gotcha I encountered is that it's necessary to go into the Chromecast settings (in the Chromecast app) and enable Send this Chromecast's serial number when checking for updates. This information is on a separate page from the device registration instructions, so it's easy to miss.

Now my Chromecast showed up in the developer console, it was time to get an app running. Since the Chromecast has no input devices (keyboard, mouse, etc), a 'receiver app' running in an HTML5 page on the device is controlled by a 'sender app' running on a 'second screen' such as a laptop, smartphone or tablet. The two apps are connected over the local network by a message bus exposed by the Google Cast SDK.


Looking through the samples, CastHelloText-chrome looked like the simplest example of a custom receiver. In the sample, the sender app, running on an HTML5 page in Chrome, allows you to enter a message ('Hello World' is traditional!) and sends it on the bus. The receiver app displays the message, and reflects it back to the sender, to demonstrate the bidirectional nature of the bus.

It was straightforward to convert the vanilla HTML pages to Visualforce - the first change was to wrap the entire page in an <apex:page> tag and remove the DOCTYPE, since Visualforce will supply this when it renders the page.

<apex:page docType="html-5.0" applyHtmlTag="false" applyBodyTag="false"
           showHeader="false" sidebar="false" standardStylesheets="false">
<!-- <!DOCTYPE html> -->
<html>
...rest of the page...

Visualforce doesn't like HTML attributes with no value, so, in chromehellotext, I changed

<input id="input" type="text" size="30" onwebkitspeechchange="transcribe(this.value)" x-webkit-speech/>


<input id="input" type="text" size="30" onwebkitspeechchange="transcribe(this.value)" x-webkit-speech="true"/>

Adding the Visualforce pages to a Site made them public on the web. This is important - the Chromecast can only load public web pages - it has no way of authenticating to a server. You'll find out in the next blog post how I was able to access the REST API to securely retrieve content.

Once I had a pair of public pages, I registered my sample app, entering the public URLs for my Visualforce pages, and pasted the resulting app ID into the chromehellotext page. Loading that page gave me a text control into which I could type a message. Hitting return to submit the message pops up the Cast device selector.


I selected my device from the list, and - 'BAM!' - my message popped up on the TV screen - success!


One very nice feature of the Chromecast is that it allows remote debugging in Chrome. You can find the device's IP address in the Chromecast app, then simply browse to port 9222 at that address.


You get the usual Chrome developer tools, right down to the ability to set breakpoints and inspect variables in JavaScript - marvelous!


I've published the sample app, so you can try it out yourself. If you have a Chromecast, go to my sender app page; you should be able to connect to your device and send a message.

At this point, I had to do some thinking. The Chromecast, as I mentioned before, loads a page from a public web server. How could I show data on the page, preferably without making the data itself publicly available? Read on to the next post!

Portions of this page are reproduced from work created and shared by Google and used according to terms described in the Creative Commons 3.0 Attribution License.


Raspberry Pi fix for HDMI to DVI cable issue

My Raspberry Pi arrived this week. After creating a boot image on an SD card I had lying around (using the excellent RasPiWrite utility), I initially booted it up on my TV, using the composite video output - all working!

Raspberry Pi in text mode

After a little exploration from the command line, startx brought up the GUI.

Raspberry Pi running X

As well as the composite video output, the Raspberry Pi supports HDMI. My monitor (a Viewsonic VX2235WM-3) has VGA and DVI inputs, so I ordered the AmazonBasics HDMI to DVI Cable. Connecting up to my monitor, I was disappointed to see no video signal whatsoever - the monitor wasn't seeing the Raspberry Pi at all.

Googling around, I discovered that you can set various configuration options that are read before the Raspberry Pi even boots. With a little experimentation, I found that setting

hdmi_force_hotplug=1
in config.txt solves the problem - I see video output from the moment I power up the Raspberry Pi! This makes sense - the description of hdmi_force_hotplug is "Use HDMI mode even if no HDMI monitor is detected" - I'm guessing the cable is not signalling the presence of a monitor to the Raspberry Pi, so it decides that it doesn't need to send HDMI output.

Watch this space for more Raspberry Pi fun!


Running Your Own Node.js Version on Heroku

UPDATE (3/3/12) - there's a much easier way of doing this now - see 'Specifying a version of Node.js / npm' in the Heroku Dev Center. The mechanism described below still works, but you should only go to all this trouble if you want something really custom.

Here's a completely unofficial, unsupported recipe for running your own Node.js version on Heroku. These instructions are based on those at the Heroku Node.js Buildpack repository, with some extra steps that I found were necessary to make the process work. Note that buildpack support at Heroku is still evolving and the process will likely change over time. Please leave a comment if you try the instructions here and they don't work - I'll do my best to keep them up to date.

Before you start, update the heroku gem, so it recognizes the --buildpack option:

gem update heroku

(Thanks to 'tester' for leaving a comment reminding me that using an out of date heroku gem can result in the error message ! Name must start with a letter and can only contain lowercase letters, numbers, and dashes.)

Note: If you just want to try out a completely unofficial, unsupported Node.js 0.6.1 on Heroku, just create your app with my buildpack repository:

$ heroku create --stack cedar --buildpack

Otherwise, read on to learn how to create your very own buildpack...

First, you'll need to fork the Heroku Node.js buildpack repository. Now, before you follow the instructions in the README to create a custom Node.js buildpack, you'll have to create a build server (running on Heroku, of course!) with vulcan and make it available to the buildpack scripts. You'll have to choose a name for your build server that's not already in use by another Heroku app. If vulcan create responds with 'Name is already taken', just pick another name.

$ gem install vulcan
$ vulcan create YOUR-BUILD-SERVER-NAME

Now you can create your buildpack. You'll need to set up environment variables for working with S3 - your AWS credentials and the name of the bucket you'll create in the next step.


Create an S3 bucket to hold your buildpack. I used the S3 console, but, if you have the command line tools installed, you can use them instead.

Next you'll need to package Node.js and NPM for use on Heroku. I used the current latest, greatest version of Node.js, 0.6.1, and NPM, 1.0.105:

$ support/package_node 0.6.1
$ support/package_npm 1.0.105

Open bin/compile in your editor, and update the lines that reference the S3 bucket and the Node.js and npm versions to match your bucket and the versions you just packaged.


Now commit your changes and push the file back to GitHub:

$ git commit -am "Update Node.js to 0.6.1, NPM to 1.0.105"
$ git push

You can now create a Heroku app using your custom buildpack. You'll also need to specify the Cedar stack:

$ heroku create --stack cedar --buildpack

When you push your app to Heroku, you should see the custom buildpack in action:

$ cd ../node-example/
$ git push heroku master
Counting objects: 11, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (8/8), done.
Writing objects: 100% (11/11), 4.02 KiB, done.
Total 11 (delta 1), reused 0 (delta 0)

-----> Heroku receiving push
-----> Fetching custom build pack... done
-----> Node.js app detected
-----> Fetching Node.js binaries
-----> Vendoring node 0.6.1
-----> Installing dependencies with npm 1.0.105

Dependencies installed
-----> Discovering process types
Procfile declares types -> web
-----> Compiled slug size is 3.3MB
-----> Launching... done, v6 deployed to Heroku

cd3c0e2..33fdd7a  master -> master
$ curl
Hello from Node.js v0.6.1


Note: Due to an incompatibility between the default BSD tar on my Mac and GNU tar on Heroku, I saw many warnings while pushing my Node.js app to Heroku, of the form

tar: Ignoring unknown extended header keyword `SCHILY.dev'
tar: Ignoring unknown extended header keyword `SCHILY.ino'
tar: Ignoring unknown extended header keyword `SCHILY.nlink'

These are annoying, but benign - the push completes successfully. If you're on a Mac and you want to get rid of them, add the line

alias tar=gnutar

just after the opening #!/bin/sh in both package scripts.
