
Microsoft recently announced Azure App Service, a new Azure offering that integrates Web Apps, Mobile Apps, Logic Apps and API Apps into a single service. Scott Guthrie’s blog has a very good article that explains it.

In this article I will show how to use Azure App Service to integrate with Salesforce. For this example I will develop a small solution based on a real-world scenario. I will assume the reader is more familiar with Microsoft’s technology, so I will spend more time explaining the Salesforce details.

The Scenario

The scenario is the following: suppose our client is an insurance company that uses Salesforce as its CRM. Insureds call the Call Center department to request assistance depending on the policy coverage they have contracted, for example roadside assistance (for automotive policies), or professional assistance at home, e.g. a plumber or locksmith (for home insurance policies). All assistance is delivered by third-party service providers (car shops, locksmiths, car towing companies, etc.). The idea is that service requests are created in the CRM, and third-party service providers can then attend to these requests and mark them as completed once the service has been delivered. Ideally, external providers would have access to the CRM to see all the pending requests and make updates directly in the system. In our case, however, it won’t be that easy: there are many service providers and we don’t want to buy Salesforce licenses for them (Salesforce licenses are not exactly cheap), and besides, we don’t want external users to have access to our CRM. You could create a Community for this and be done with it, but in this case, and for the sake of illustration, I will instead show how to develop a small web portal for service providers, which we will synchronize with the CRM using Azure App Service.

Our solution will have the following flow:

  1. Call Center users will receive assistance requests from insureds. These requests will be entered in the CRM (Salesforce)
  2. At regular intervals, the Logic App will get all pending requests from the CRM using a Salesforce connector. These requests will be sent to the database of the service provider web portal using a SQL Server connector
  3. New requests will appear in the service provider portal
  4. Service providers will take requests and deliver them. Once delivered, they will change the request’s status to completed
  5. At regular intervals the Logic App will get all completed requests and synchronize them back to the CRM using a Salesforce connector
  6. Finally, CRM users will be able to see all the services completed by the external providers

scenario

In the diagram, we can see that we will require a bidirectional synchronization between Salesforce and the web app we will create.

It is worth noting that the goal of this article is to describe how to use Azure App Service to integrate with Salesforce. The solution we will develop here is not intended to be complete, but rather a simple one that lets us see the most important aspects to consider when doing this kind of integration. A complete solution would probably have a better state workflow associated with requests, notifications (email, SMS and/or push) of new assignments to third-party providers, notifications to insureds about the status of their requests, etc. To keep things simple, and the article not too long, we will focus on the most important aspects and leave some of the details of a real solution as homework.

Requirements

In order to follow along and create the solution we will need:

  • An Azure subscription. If you don’t have one you can get a free trial from the Azure web site
  • A Salesforce development environment. You can get a free Salesforce Developer instance from the Salesforce Developer’s web site. When you register you will get an email with information about how to connect to the instance.
  • Visual Studio 2013 Community Edition (Update 4 at the time of this writing). You can download it for free from the Visual Studio web site. You will also need the Azure SDK (version 2.5.1 at the time of this writing).


Development of the Salesforce Application

The first thing we’ll do is create the necessary objects in our Salesforce developer instance. If you requested a new development instance and this is the first time you log in to Salesforce, you will land directly in the Salesforce setup area. From here we can customize the system and create new objects. In our case we want an object called Assistance.

salesforce create object

From the menu on the left, go to the “Build” section, then “Create”, and then “Objects”. You can also use the search box at the top and enter a keyword to filter the options available in the menu. On the “Custom Objects” screen click the “New Custom Object” button. On the object creation screen, fill in the required data as shown in the following screen. Leave the default values for the rest of the information.


salesforce new custom object


Notice the “Object Name” is Assistance. Every object in Salesforce has an API Name, which is the name we will use to identify the object in development. As a rule, Salesforce adds the __c suffix to the API name of all custom objects, so the real name of our newly created object is Assistance__c.

Click the “Save” button to create the object. In Salesforce, objects are available immediately. We will now create some fields to hold the data required for an assistance. On the same object screen, in the “Custom Fields & Relationships” section, click the “New” button to launch the new field wizard. We will first create the relationship between assistances and contacts: a contact can have many assistances, and an assistance belongs to a single contact. Salesforce comes out of the box with an object called Contact (here the name doesn’t have the __c suffix because it is a standard object, not a custom one; all out-of-the-box objects are standard objects). To create the relationship between assistance and contact we need a field of type “Lookup Relationship”, so choose this option on the wizard screen and click “Next”. On the next step, in the “Related To” field, choose the Contact object. On the following step specify Contact as the name of the field, and leave the default values for the rest.

salesforce new field


Click “Next”, leave the default options for the rest of the steps, and finish the wizard by clicking “Save & New” to restart it and create another field. Repeat the wizard to create all the fields described below:

  • Date (Label: Date, Name: Date). In the “Default Value” field enter the function TODAY().
  • Text Area (Long) (Label: Description, Name: Description).
  • Text (Label: Provider, Name: Provider). Enter 50 in the “Length” field.
  • Picklist (Label: Status, Name: Status). In the “Enter values for the picklist…” field enter the values New and Completed, and check “Use first value as default value”.
  • Checkbox (Label: Synced, Name: Synced).


After you finish creating the fields, they should look as follows:

salesforce assistance fields

Note that, as with the name of the custom object, the names of custom fields also have the __c suffix. Every object in Salesforce, custom or standard, has two fields called Id and Name. The first is the primary key of the object, and the second is a text field that acts as the descriptor for the record. Neither field has the __c suffix, as they are considered standard fields (even on custom objects).

Test the Application by Creating a Few Records

We now have our object for assistances. To test it we will create a new assistance. From the top menu choose the “Contacts” tab to see a list of recent contacts (if this is the first time we use this tab, we will not have any recent contacts and the list will be empty). To see all contacts, select the “All Contacts” list from the “View” dropdown and click the “Go” button. Select any of the contacts in the list to see his/her details. At the end of the form we should see the “Assistances” section. Click the “New Assistance” button to create a new assistance record. Notice how some of the fields are filled in automatically based on default values: the Contact field is filled in because we created the record from a contact, the Date field defaults to today’s date, and the Status is set to New because it is the first value entered in the picklist. Enter a name and a description and save the assistance.

salesforce new assistance

Create one or two more assistances so we have enough data for the rest of the tests that will follow.

If we go back to the contact record we should see the requests we just created. However, in the list we only see the name of each request. To see additional fields we need to modify the page layout. At the top of the page there is a link called “Edit Layout” which opens the layout editor.

salesforce edit layout

On the edit layout screen, look at the bottom for the list of assistances and hover the mouse over the title of the list until a wrench icon appears. Clicking this icon opens a screen where you can select the fields you want to display in the list. Select the Date, Status and Provider fields and click the “OK” button. To save the layout click the “Save” button at the top of the layout editor, which returns you to the contact record. You will now be able to see more details in the list of assistances:

salesforce assistances


Development of the External Service Provider’s Portal

We will now proceed with the creation of the provider’s portal. For this we will create an MVC application using Visual Studio and later publish it to Azure as a Web App with a SQL Server database.

Open Visual Studio and create a new project of type “ASP.Net Web Application”. I have called mine ProvidersPortal. Use the MVC project template and choose “Individual User Accounts” as the authentication method. Once the project is created, enable Entity Framework migrations. From the Package Manager Console enter the following commands:

enable-migrations
add-migration Initial
update-database

Next, we will create a new entity to represent assistances. In the Models directory, add a new class in a file called Assistance.cs and replace the content of the file with the following code:

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.Web;
 
namespace ProvidersPortal.Models
{
    public class Assistance
    {
        [MaxLength(20)]
        [Key]
        public string Id { get; set; }
 
        [MaxLength(80)]
        public string Name { get; set; }
 
        public string Description { get; set; }
 
        [MaxLength(50)]
        public string Provider { get; set; }
 
        [MaxLength(20)]
        public string Status { get; set; }
 
        public DateTime? Date { get; set; }
 
        public bool? Synced { get; set;}
    }
}

To expose this object to Entity Framework we need to add a DbSet property to the context class. To keep things simple we will use the same database used by ASP.Net Identity. Open the file Models\IdentityModels.cs and add the highlighted line to the ApplicationDbContext class:

public class ApplicationDbContext : IdentityDbContext<ApplicationUser>
{
    public ApplicationDbContext()
        : base("DefaultConnection", throwIfV1Schema: false)
    {
    }
 
    public static ApplicationDbContext Create()
    {
        return new ApplicationDbContext();
    }
 
    public DbSet<Assistance> Assistances { get; set; }
}

Next, add a new migration for the new object. In the Package Manager console enter the following commands:

add-migration Assistance
update-database

Now we need a controller and views for the Assistance object. In the Controllers folder, add a new controller of type “MVC 5 Controller with views, using Entity Framework”. Fill in the data as shown in the following figure:

vs add controller

Visual Studio will create (scaffold) the controller and the views for the typical CRUD operations. By default the controller allows anonymous access, and in our case we want only authenticated users (the external service providers) to have access to the assistance requests. Open the controller file, Controllers\AssistancesController.cs, and add the Authorize attribute to it:

[Authorize]
public class AssistancesController : Controller

Now all we have left is to add a menu option to access the assistances. Open the file Views\Shared\_Layout.cshtml and add a new option to the menu, as shown in the following highlighted code:

<ul class="nav navbar-nav">
    <li>@Html.ActionLink("Home", "Index", "Home")</li>
    <li>@Html.ActionLink("Assistances", "Index", "Assistances")</li>
    <li>@Html.ActionLink("About", "About", "Home")</li>
    <li>@Html.ActionLink("Contact", "Contact", "Home")</li>
</ul>

Compile and run the application. After registering you should be able to see the assistances screen (which will be empty for the moment):

providers portal
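If you would like a few local test rows before the synchronization is in place, one option is to seed them from the Migrations\Configuration.cs file that enable-migrations generated. This is optional and only a sketch; the sample values are invented, and AddOrUpdate uses Id as the key so re-running update-database will not duplicate the record:

protected override void Seed(ProvidersPortal.Models.ApplicationDbContext context)
{
    // Hypothetical test record; it plays no role in the Salesforce synchronization.
    context.Assistances.AddOrUpdate(
        a => a.Id,
        new ProvidersPortal.Models.Assistance
        {
            Id = "LOCALTEST0001",
            Name = "Test assistance",
            Description = "Local test record, not synced from Salesforce",
            Status = "New",
            Date = DateTime.Today
        });
}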


Publishing of the Portal as a Web App

Now we’ll publish the portal as a Web App in Azure. We will let Visual Studio do all the work for us. In the “Solution Explorer”, right-click the project node and choose “Publish…”. Select “Microsoft Azure Web Apps” as the target. Visual Studio will ask for credentials to connect to your Azure subscription. Click the “New…” button to create a new application. I have called mine ProvidersPortal, and to keep things organized I’ve decided to create a new App Service plan as well as a new resource group for this application. We also need a database, so select the option to create a new database server. Click the “Create” button to let Visual Studio create the app in Azure.

azure web app creation

Write down the names of the App Service plan and the resource group; you will need them when you create the rest of the applications (Logic Apps and API Apps), as they need to be on the same plan and in the same group so they can see each other. In my case, both the plan and the group are called ProvidersPortal.

Once the web application is created in Azure, the wizard will continue with the publishing. In the “Settings” section make sure you select “Execute Code First Migrations (runs on application start)” in order to provision the Azure database with the schema required by our application.

vs publish wizard

Finally, click the “Publish” button; when the wizard finishes, it will open the provider’s portal running in the cloud! Register a user and browse to the Assistances menu item so Entity Framework creates the database schema for the app.


Integration using a Logic App

We already have our two applications executing independently. Now comes the fun part: the integration of both applications using a Logic App in Azure. We will use the Azure portal to do all the steps that follow.

First, we will create the API Apps that will allow us to connect to Salesforce and SQL Server.

Creation of API Apps

We need two API Apps, one for Salesforce and the other for SQL Server. Both apps will be created from the Azure Marketplace.

Salesforce API App

salesforce connector

On the Azure portal, click “New”, select “Web + Mobile” and then “Azure Marketplace”. Select the “API Apps” category. In the search field enter “salesforce” and press Enter. From the search results click “Salesforce Connector”. A blade will open with the details of the connector and links to its documentation. Click the “Create” button. A new blade will open with the properties of the connector. From here select the “Package Settings” section. In this section you can specify the Salesforce objects you would like to access. Remove the objects that appear by default and enter the name of the object we created in Salesforce: Assistance__c (remember the __c suffix, which is part of the name):

salesforce connector entities


salesforce connector properties

In the “App Service Plan” section select the plan created earlier when publishing the web application. The service plan must be the same so all the parts can work together. In my case the service plan is called ProvidersPortal. Click “Create” and let Azure provision the API App. After approximately a minute you will see a notification telling you the app is ready.

By the way, if you want to modify the connector configuration parameters later, you need to do it from the Host of the API App (there is a link to the host on the connector details blade).


SQL Server API App

sqlserver connector

As we did before, search the Azure Marketplace for “sql” and in the search results click “Microsoft SQL Connector”. A blade will open with the connector details and a link to its documentation. Click the “Create” button to open a new blade with the configuration of the connector. From here select the “Package Settings” section. In this section you specify the connection details for the database. If you have doubts about the right values to put here, you can look at the connection string that was created in the publishing profile of the web project when we published it to Azure (in Visual Studio, in Solution Explorer, go to Properties\PublishProfiles\<projectname>.pubxml). For the “Tables” specify Assistances. Make sure to select the same service plan we’ve been using. This is how the properties look in my case:

sql connector properties

Click “Create” and let Azure provision the API App. After approximately a minute you will see a notification telling you the app is ready. As with the previous API App, if you need to modify the configuration of the connector (because the values changed or because you made a mistake) you will have to do it from the Host of the API App.



Creation of the Logic App

Let’s stop here for a moment and analyze what we want to accomplish. We want a bidirectional synchronization between Salesforce and the SQL Server database.

For the synchronization from Salesforce to SQL Server we want to do the following:

  • First we find all the assistances in Salesforce that have not been synced and have a status of “New”
  • For each one of these assistances we get its details
  • For each one of these assistances we will add a new record in the provider portal’s database
  • If the insertion is successful we mark the assistance as synced in Salesforce

For the synchronization from SQL Server to Salesforce we want to do the following:

  • We will look for all the assistance records that have been completed and have not been synced yet
  • For each one of these records we will update Salesforce
  • If the update is successful we will mark the record as synced in the portal

Since (for the moment) the logic inside Logic Apps is linear (each step depends on the previous one), we will need to create two Logic Apps, one for each direction of the synchronization.

Note: in this case we will use a boolean field to know which records need to be synced from one system to the other. In a real application this is probably not the best approach, since each record can be synced only once between systems. A better approach would be to base the synchronization on a timestamp field holding the last modification date of each record, for example syncing only the records whose last modification date is later than the time of the previous run.

Synchronization from Salesforce to SQL Server

logic app

We will now create the synchronization between both applications. For this we will create a Logic App and use the connectors created before. On the Azure portal, click “New”, then “Web + Mobile”, and select “Logic App”. A new blade will open where you can enter the name of the app and select the service plan. Again, make sure to select the same service plan specified for the connectors. I called my Logic App ProvidersPortalLogic and selected the ProvidersPortal plan. Click “Create” to provision the application. After approximately a minute you will see a notification telling you the app is ready.

Open the application and select the “triggers and actions” section to show the canvas where you edit the actions for the app. For now, and to be able to develop and test our app, check the “Run this logic manually” option, which will allow us to run the application on demand as we develop it. On the right side of the editing canvas you should see, among others, the two API Apps we created in previous steps: the Salesforce and SQL Server connectors. If you don’t see the connectors, you probably didn’t put everything on the same service plan.

Let’s start by adding the first step. On the “API Apps” panel click the Salesforce connector. This adds a step to the logic. You need to authorize the connector, i.e. provide the connection credentials, so it can connect to the Salesforce instance. Click the “Authorize” button; a screen requesting Salesforce credentials will appear. Enter your credentials and, when asked whether you want to give access to the Azure Logic App, accept. If everything went OK the connector will get the metadata from Salesforce and present you with some options:

logic app salesforce connector

It is possible that you get an error saying that the metadata has not been generated (“Error fetching swagger api definition”). You can try removing the step (by clicking the wrench icon in the upper right menu of the step and selecting “Delete this action”) and adding it again. If you still get the same error, you can try the following trick: switch the editor to the code view by clicking the “Code View” icon, remove the Salesforce subscription (the one highlighted in the following image), and then try to add the step again.

salesforce subscription in code view

If the above trick doesn’t fix the problem, then you probably have an error in the configuration of the API App (the connector).

From the step actions click the three dots “…” to get more actions. Select the “Execute Query” action. This action allows you to execute a SOQL query against Salesforce. Enter the following query:

select Id from Assistance__c where Synced__c = false and Status__c = 'New'

We want the Id of the records that have not yet been synchronized and have a status of “New”. When we validate the step we should see something like this:

salesforce execute query step

When this step gets executed we will get all the assistance records that satisfy the criteria, and they will be stored in a property called result. Next we will add the second step: get the details of each assistance.

Add a new step by clicking the Salesforce connector in the “API Apps” panel. Select “Get_Assistance__c” as the action. This action gets all the fields of a particular record. We want this step to be executed for each of the records obtained in the previous step. To achieve this, on the step menu, click the wrench icon and select “Repeat over a list”. A new “Repeat” field will appear, allowing you to specify the list over which to iterate. We will use an expression to get the list. Expressions are specified by adding the @ character as a prefix. To get the list of records from the previous step we will use the following expression:

@body('salesforceconnector').result.records

The expression means the following: get the body of the step called salesforceconnector and search for a property called result.records. The body is formatted as JSON and the result property is an object which contains an array property called records, containing all the records returned by the query.

By default, the name of each step is the name of the connector plus an index, so for example, the first step of the Salesforce connector is called salesforceconnector, the second step is called salesforceconnector0, the third is salesforceconnector1, and so on.

In the field “Record Id” we will enter the id of the current record in the loop. To get the current record inside the loop we will use the @repeatItem() expression. So, if we want the id of the record we will use:

@repeatItem().Id

The step should look like this:

salesforce connector get action

Note: we could have skipped this second step and retrieved the required data directly in the query of the first step, but I’ve done it this way to illustrate the concepts.

Now that we have the data for each record, the third step is to insert the record into the database of the web application. For this we will use the SQL Server connector. On the “API Apps” panel click “Microsoft SQL Connector”. This adds a new step to the logic. Select the action “Insert into Assistances (JSON)”. The connector will ask for a parameter for each field in the table. As with the previous step, we want this insertion to be made for each record in the list, so click the wrench icon and select “Repeat over a list”. In this case we get the list using the following expression:

@actions('salesforceconnector0').outputs.repeatItems

salesforceconnector0 is the previous step, from which we take the outputs property representing the output of the step (this output actually holds many records, since the step was also executed over a list). From this output we want the repeatItems property, which is an array with the result for each record processed in the loop. To get a specific field we will again use the @repeatItem() expression:

@repeatItem().outputs.body.Id

The above expression gets the Id field from the Salesforce record. If this looks complex to you, the best thing to do is to execute the application every time you add a step and analyze the output of each step to see the JSON it produces. This will give you an idea of how to get the values you need.

We will repeat the above expression for each field of the table. Remember once again that, because we’re dealing with custom fields, you need to add the __c suffix to the field name, except for the Id and Name fields, which are standard fields (even though they belong to a custom object). Fill in the fields using the following table:

  • Id: @repeatItem().outputs.body.Id
  • Name: @repeatItem().outputs.body.Name
  • Description: @repeatItem().outputs.body.Description__c
  • Provider: (leave blank)
  • Status: @repeatItem().outputs.body.Status__c
  • Date: ** (see note below)
  • Synced: (leave blank)

** The Date field should have the expression @repeatItem().outputs.body.Date__c; however, this value was giving me errors when inserting the record into SQL Server, something related to the date format. After trying various formats unsuccessfully, I decided to leave it blank for the moment and look for a solution later.

We will leave fields Provider and Synced blank as these will be filled in the provider’s portal. The step should look as follows:

sql connector step

The last step is to update the record in Salesforce and mark it as synced, so it will not be considered in subsequent runs of the Logic App. Add a new step to the logic by clicking the Salesforce connector in the “API Apps” panel. For this step select the action “Update Assistance__c”. Make the step execute over a list (select “Repeat over a list” from the step menu). The parameters for the “Repeat” and “Record Id” fields are the same as in the previous step. Set the Synced field to true.

salesforce connector update

Our Logic App is ready! Make sure to save your changes if you haven’t already and close the editor (you may get a message asking if you want to discard your changes even though you have saved them; say yes, your changes are already saved). Execute the application by clicking the run icon (Run Now). If everything works OK you should be able to see the results of the execution:

execution results

You should also be able to see the assistances you created in Salesforce in the provider’s portal:

providers portal with records

And records in Salesforce should be marked as synced:

salesforce synced

With this we have finished the first part of our solution.


Synchronizing from SQL Server to Salesforce

For the second part of the solution we will create a new Logic App. The steps are the same as for the previous app. I will call this app SalesforceLogic. Remember to assign the same service plan. To execute this app on demand, make sure you check the “Run this logic manually” option.

Let’s start with the first step. Add a SQL Server connector to the app and select the action “Select from Assistances (JSON)”. Leave “Fields” blank to get all the fields of the table. In the “Where” field add the following condition:

Status = 'Completed' and Synced is null

The step should look like this:

sql server select step

Now add a second step by clicking the Salesforce connector. Since this is a new Logic App, you need to authorize it against the Salesforce instance again. Select the action “Update Assistance__c”. We want to execute the update in Salesforce for each of the records retrieved in the previous step, so we need to execute this step over a list: select “Repeat over a list” from the step menu. For the list we will use the body of the previous step with the expression @body('microsoftsqlconnector'), and for the “Record Id” use the expression @repeatItem().Id. We need to update the Provider__c and Status__c fields in Salesforce using the expressions @repeatItem().Provider and @repeatItem().Status, respectively. The step should look as follows:

salesforce connector update 2

Finally, we will add a third step using the SQL Server connector to update the Synced field in the portal’s database so the record will not be considered when the Logic App runs again. We will make this step run over the same list as the first step, @body('microsoftsqlconnector'), and set the Synced field to true. This step requires a “Where” condition, which needs to be dynamic, using the following expression:

@concat('Id = %27',repeatItem().Id,'%27')

The trick here is to use the URL encoding of the ' character (single quote), %27. The reason for doing it this way is that we cannot use a single quote directly inside the text (even though I tried different escape sequences). This produces a condition such as: Id = 'xxxxxxx'.

sql connector sync

We have our logic ready! Make sure to save changes and close the editor.

On the provider’s portal select some of the assistances and fill the Provider field and change the Status to Completed.

complete an assistance

Execute the application by clicking the run icon (Run Now). If everything works OK you should see the records updated in Salesforce:

completed assistances in salesforce

Also you should see records marked as synced in the portal’s database:

synced assistances in sql server


Automatic Execution

recurrence

So far we have executed both Logic Apps manually, on demand. In a real application we would execute both applications automatically at regular intervals. For this we need to uncheck the “Run this logic manually” option and use an app with a trigger. The Salesforce connector doesn’t provide any trigger, but the SQL Server connector does. In both cases, one option is to use the “Recurrence” app, which allows us to run the application automatically at a predefined time interval. The only inconvenience is that, if you use the free plan, the minimum interval you can specify is one hour (one of the reasons why we chose to run the logic on demand for this article).


Conclusions

As we’ve seen, Azure App Service is a powerful tool. With a large (and rapidly growing) number of connectors and the scaling possibilities that Azure offers, it is very easy to create solutions such as the one built in this article. Definitely a tool that any software architect should have in her tool belt when designing integration solutions.

It is true that some areas could be improved. The documentation lacks some details, and there are some minor errors when designing the application logic, but I’m pretty sure these will be fixed (if they haven’t been already) as the service evolves. Also, the logic of a Logic App is very linear, and although you can work around some scenarios, this limits others; again, I’m pretty sure this will change as more options are added.

Things can only get better, so I will be following this service to see how it evolves from here.


Resources


You can also see this article in Spanish in the Microsoft MSDN blog for Spain:


(You can read this article in Spanish here)

A high percentage of the Salesforce projects I’ve seen require some kind of integration with backend systems, such as ERPs, databases, or even other CRMs. Salesforce offers a lot of options to make these integrations possible: workflow outbound messages, SOAP/REST callouts, batch integration with the DataLoader, etc. Of course there’s a lot of documentation on the topic, but most of it is based on Java, and there is little on using .Net. I’m a .Net developer, so I will provide a few examples of how to integrate with Salesforce using Visual Studio and C#. I can’t cover all the possible integration scenarios in just one article, so I will focus on one scenario that I consider my first option when I design the architecture of a solution that requires integration: workflow outbound messages.

The Requirements

Allow me to explain the concepts using a concrete example: one of our clients needed to integrate their Salesforce org with their ERP. Both Salesforce and the ERP manage accounts, but they required Salesforce to be the master system for accounts. The integration was based on the following rules:

  • Accounts are created in Salesforce (not in the ERP)
  • Any modification to account data in Salesforce needs to be reflected on the ERP (if the account exists in the ERP)

The above rules translate to the following:

  • We could create a trigger on Account that does a callout to a web service in the ERP, but there’s an easier way (less code): a workflow outbound message.
  • We will use a workflow outbound message on Account that fires when an account is created or modified and sends the fields we need to our ERP.

Let’s dive into code!

Workflow Outbound Message in Salesforce

In Salesforce, go to “Setup->Build->Create->Workflow & Approvals->Workflow Rules”. Create a new rule, and base it on the Account object:

Workflow account rule

Click Next. Give it a name (I will use UpdateAccountOnERP) and for the evaluation criteria specify “created, and every time it’s edited”. We want this option because we would like to update the account in the ERP every time a modification is made in Salesforce. We want this workflow to be triggered every time, so for the rule criteria choose “formula evaluates to true” and specify true as the formula. The workflow should look like this:

Workflow account rule details

Click Save & Next. On the next screen you will add a workflow action. Click the “Add Workflow Action” dropdown and select “New Outbound Message”. On the New Outbound Message screen, give it a name and a unique name (I will use SendAccountToERP), for the endpoint put any URL (since we don’t have our web service yet), and finally select all the fields you would like to send to the ERP. When finished, click Done. You will be taken back to the workflow rule screen. Make sure you activate the workflow (click the “Activate” button).

Account workflow rule summary

We have created an outbound message that will be triggered when an account is modified, but the outbound message will fail since we have not yet created a web service to handle it. For this we need the WSDL of the outbound message. Click on the outbound message description and you will be taken to the Workflow Outbound Message Detail page:

Account outbound message detail

Click the “Click for WSDL” hyperlink. A new window with the WSDL will open. Save the WSDL to disk (I named it accountWorkflowOutboundMessage.wsdl); we will need it in Visual Studio.

Web Service in Visual Studio (using WCF)

Now that we have the WSDL for the outbound message, we need to create a web service to handle the message. This article gives you an overview of how to create an asmx web service, but I’d rather work with WCF (asmx is quite old and WCF is supposed to replace it, isn’t it?). I will use Visual Studio 2013 Community edition for this. Open Visual Studio and create a new empty ASP.Net Web Application:

New Asp.Net project

Empty Asp.Net project



We will add a WCF service to the project. Right click on the project name in Solution Explorer and choose “Add->New Item”. Select “WCF Service”:

New WCF web wervice

Visual Studio will add the required assemblies to the project and create three files: IAccountNotificationService.cs, AccountNotificationService.svc and the underlying AccountNotificationService.svc.cs. We need to replace the interface definition to match the definition of the outbound message. Add the WSDL file to your project. Open a command prompt in the directory where the WSDL resides (if you have the Productivity Power Tools extension you can right-click the project in Solution Explorer and choose “Power Commands->Open Command Prompt”). At the command prompt type the following:

svcutil /noconfig /out:IAccountNotificationService.cs accountWorkflowOutboundMessage.wsdl

The svcutil command creates the interface for the web service definition. Here is the output of the command:

svcutil output

Notice that the command replaced the contents of the file IAccountNotificationService.cs with the web service definition from the WSDL. Your project should now look like this:

Solution explorer

Now, there are some tricks to make this work. Open IAccountNotificationService.cs and change the following:

  • Put the whole class inside the namespace of the project (in my case this is the name of the project, WorkflowNotificationServices)
  • Change the name of the interface from NotificationPort to IAccountNotificationService.
  • Change the ConfigurationName parameter of the ServiceContractAttribute attribute from NotificationPort to AccountNotificationService.
  • Remove the ReplyAction="*" parameter from the OperationContractAttribute attribute. This is important in order to make things work.
  • At the end of the file, remove the interface NotificationPortChannel and the class NotificationPortClient (we don’t need these since they are used by a client consuming the web service).

The interface should now look like the following (highlighted lines are the ones that changed):

namespace WorkflowNotificationServices
{
    [System.CodeDom.Compiler.GeneratedCodeAttribute("System.ServiceModel", "4.0.0.0")]
    [System.ServiceModel.ServiceContractAttribute(Namespace = "http://soap.sforce.com/2005/09/outbound", ConfigurationName = "AccountNotificationService")]
    public interface IAccountNotificationService
    {

        // CODEGEN: Generating message contract since the operation notifications is neither RPC nor document wrapped.
        [System.ServiceModel.OperationContractAttribute(Action = "")]
        [System.ServiceModel.XmlSerializerFormatAttribute()]
        [System.ServiceModel.ServiceKnownTypeAttribute(typeof(sObject))]
        notificationsResponse1 notifications(notificationsRequest request);
    }

    /// rest of the code below
}

Next, open the file AccountNotificationService.svc.cs, remove the DoWork method and implement the IAccountNotificationService interface (place the cursor on the name of the interface, press Ctrl+. and choose “Implement interface IAccountNotificationService”).

Finally, open web.config and replace it with the following:

<configuration>
    <connectionStrings>
        <add name="ERP" connectionString="Data Source=localhost;Initial Catalog=ERP;Integrated Security=True" providerName="System.Data.SqlClient"/>
    </connectionStrings>
    <system.web>
        <compilation debug="true" targetFramework="4.5" />
        <httpRuntime targetFramework="4.5" />
        <webServices>
            <protocols>
                <clear/>
                <add name="HttpSoap" />
                <add name="Documentation"/>
            </protocols>
        </webServices>
    </system.web>
    <system.serviceModel>
        <services>
            <service name="WorkflowNotificationServices.AccountNotificationService">
                <endpoint binding="basicHttpBinding" contract="AccountNotificationService"/>
            </service>
        </services>
        <behaviors>
            <serviceBehaviors>
                <behavior name="">
                    <serviceMetadata httpGetEnabled="true" httpsGetEnabled="true" />
                    <serviceDebug includeExceptionDetailInFaults="false" />
                </behavior>
            </serviceBehaviors>
        </behaviors>
        <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true" />
    </system.serviceModel>
</configuration>

Notice in line 3 I’ve created a connection string to the database for the ERP. I will use this later in the implementation of the web service. Also notice the declaration of the web service in lines 18-20.

Your project should compile at this point. We should also be able to get the WSDL for our web service. Run the project and connect to the web service using a browser. You should see something like this:

Account web service

All we have left is to write the code to update our ERP. In this example I will just call a stored procedure on a SQL Server database to update the ERP (it’s just an example; you can do whatever you need here to communicate with the backend). My service implementation looks like this:

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.Text;

namespace WorkflowNotificationServices
{
    public class AccountNotificationService : IAccountNotificationService
    {
        public notificationsResponse1 notifications(notificationsRequest request)
        {
            notifications notifications1 = request.notifications;

            AccountNotification[] accountNotifications = notifications1.Notification;
            foreach (AccountNotification accountNotification in accountNotifications)
            {
                Account account = (Account)accountNotification.sObject;

                ConnectionStringSettings connectionString = ConfigurationManager.ConnectionStrings["ERP"];
                using (SqlConnection cn = new SqlConnection(connectionString.ConnectionString))
                {
                    using (SqlCommand command = new SqlCommand("salesforce_updateAccount", cn))
                    {
                        command.CommandType = CommandType.StoredProcedure;

                        command.Parameters.Add("@idSalesforce", SqlDbType.VarChar).Value = account.Id;
                        command.Parameters.Add("@name", SqlDbType.VarChar).Value = account.Name;
                        command.Parameters.Add("@number", SqlDbType.VarChar).Value = (object)account.AccountNumber ?? DBNull.Value;
                        command.Parameters.Add("@address", SqlDbType.VarChar).Value = (object)account.BillingStreet ?? DBNull.Value;
                        command.Parameters.Add("@city", SqlDbType.VarChar).Value = (object)account.BillingCity ?? DBNull.Value;
                        command.Parameters.Add("@state", SqlDbType.VarChar).Value = (object)account.BillingState ?? DBNull.Value;
                        command.Parameters.Add("@postalCode", SqlDbType.VarChar).Value = (object)account.BillingPostalCode ?? DBNull.Value;
                        command.Parameters.Add("@country", SqlDbType.VarChar).Value = (object)account.BillingCountry ?? DBNull.Value;

                        cn.Open();
                        command.ExecuteNonQuery();
                    }
                }
            }

            notificationsResponse response = new notificationsResponse();
            response.Ack = true;

            return new notificationsResponse1() { notificationsResponse = response };
        }
    }
}

It is important to set the Ack field of the response to true; otherwise Salesforce will think there was an error, keep the outbound message in the queue, and retry sending it at regular intervals.
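You can also use this retry behavior on purpose. The following is only a hedged sketch, built from the same generated types used above: it acknowledges the message only when the processing succeeds, so a failed update makes Salesforce keep the message queued and retry it later.

public notificationsResponse1 notifications(notificationsRequest request)
{
    notificationsResponse response = new notificationsResponse();
    try
    {
        // ... process request.notifications as in the implementation shown above ...
        response.Ack = true;   // delivery handled; Salesforce removes it from the queue
    }
    catch (Exception)
    {
        response.Ack = false;  // Salesforce keeps the outbound message queued and retries
    }
    return new notificationsResponse1() { notificationsResponse = response };
}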

Testing the Outbound Message

We need to publish our web service and make it available on the internet. Publishing is outside the scope of this article. In my case I published it on an IIS server in our DMZ, and the public URL is http://http.grupolanka.com/Salesforce/WorkflowNotificationServices/AccountNotificationService.svc (don’t try it, it won’t work).

Now I need to go back to Salesforce and change the URL of the outbound message I created earlier. In Salesforce go to “Setup->Build->Create->Workflow & Approvals->Outbound Messages”. Edit the outbound message definition (SendAccountToERP in our example) and edit the “Endpoint URL” field:

Account outbound message url

Save it. Open an account and make a modification; Salesforce will now call the web service, passing the fields specified in the outbound message definition.

And that’s it! You can now have Salesforce “talk” to your backend. In a future article I will complicate things a bit and have the web service call back to Salesforce for additional data. You can get the sample project here:


(You can read this article in Spanish here)

Salesforce allows you to access external data sources using the OData protocol. This is actually one of the most useful features when designing an integration strategy with backend systems. Instead of using web services or the DataLoader to push data into Salesforce, we can get up-to-date data directly from the third-party system by using an external data source. By using an external data source, and the external object it creates, we can access information located outside Salesforce as if it were stored inside Salesforce. The only disadvantage at the moment is that an external data source is read-only, so we cannot use it to update data in the third-party system. Still, this opens up a whole lot of integration scenarios for us.

Let’s see an example of when this might come in handy. I recently worked on a project where an integration between Salesforce and the ERP was required. Sales executives needed to see account administrative alerts (e.g. “client has pending bills”, or “client has a claim on its latest bill”) in Salesforce. These alerts were generated in the ERP by the administrative users. Clearly we needed to bring those alerts into Salesforce.

One approach would be to use the DataLoader to upload the alerts to Salesforce in batch at a predefined frequency (once a day, for example). This solution is quick and easy, but requires a lot of steps: creating the Alerts object in Salesforce, setting up the DataLoader in command line mode to retrieve the data directly from the ERP and upsert it into Salesforce, and using a scheduler to execute the DataLoader job at regular intervals. But it turns out there is an easier way! Well, easier if you read this post :-)

Instead of putting the data into Salesforce, we will let Salesforce retrieve the data dynamically from the ERP. For this we will use an external data source and create an external object. You can create an external object based on OData (unfortunately only version 2.0, even though at the time of writing OData is at version 4.0). Let’s see how we can use Visual Studio and .Net to expose the data from the ERP.

Setting up the Example

I will use a pretty simple example. I will simulate the ERP with a SQL Server database. We have an Account table and an Alert table that holds the alerts for our accounts. Our ERP will look like this:

Database

Use the following script to create the database and the tables:

/****** Object:  Database [ERP]    Script Date: 12/30/2014 17:22:48 ******/
CREATE DATABASE [ERP] ON  PRIMARY 
( NAME = N'ERP', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\ERP.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
 LOG ON 
( NAME = N'ERP_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\ERP_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO

USE [ERP]

/****** Object:  Table [dbo].[Account]    Script Date: 12/30/2014 17:22:49 ******/
CREATE TABLE [dbo].[Account](
    [id] [int] IDENTITY(1,1) NOT NULL,
    [name] [varchar](50) NOT NULL,
 CONSTRAINT [PK_Account] PRIMARY KEY CLUSTERED 
(
    [id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object:  Table [dbo].[Alert]    Script Date: 12/30/2014 17:22:49 ******/
CREATE TABLE [dbo].[Alert](
    [id] [int] IDENTITY(1,1) NOT NULL,
    [accountId] [int] NOT NULL,
    [alertText] [varchar](50) NULL,
 CONSTRAINT [PK_Alert] PRIMARY KEY CLUSTERED 
(
    [id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
/****** Object:  ForeignKey [FK_Alert_Account]    Script Date: 12/30/2014 17:22:49 ******/
ALTER TABLE [dbo].[Alert]  WITH CHECK ADD  CONSTRAINT [FK_Alert_Account] FOREIGN KEY([accountId])
REFERENCES [dbo].[Account] ([id])
GO
ALTER TABLE [dbo].[Alert] CHECK CONSTRAINT [FK_Alert_Account]
GO

And populate the database with the following script:

insert into Account values('ModicaTech')
insert into Account values('Microsoft')
insert into Account values('Salesforce')
insert into Alert values(1,'Client requesting support')
insert into Alert values(2,'Client has pending receipts')

Creating the OData Web Service

Using Visual Studio 2013 Community edition, create a new empty ASP.Net Web project. We will add an Entity Framework data model to this project. Right-click on the project name and choose “Add New Item”. Select “ADO.NET Entity Data Model” and name it ErpModel.

New ADO.Net data model

On the Entity Data Model Wizard, select “EF Designer from database” and click Next. Create a new connection to the SQLServer and the ERP database:

New connection

ERP connection

Click Next and make sure you select “Entity Framework 5.0” as the version of Entity Framework to use, and click Next. Select both tables:

EF select tables

Click Finish. The Wizard will generate the code to connect to the database. We will now add the OData service to the project. Right click on the project name and choose “Add New Item…". Select “WCF Data Service 5.6” and name it ErpService.svc:

New OData service

The wizard will create the ErpService.svc.cs file and open it in the editor. Replace the text in the TODO with the name of the entity data model. The file should look like this:

using System;
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;
using System.ServiceModel.Web;
using System.Web;

namespace ErpODataService
{
    public class ErpService : DataService<ERPEntities>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(DataServiceConfiguration config)
        {
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }
}

Notice line 11, where we specify the name of our entity data model. Line 16 grants read access to all our entities. Finally, line 18 sets the OData version to 2.0 (the one required by Salesforce). If you run the project you should see something like this:

OData service

Publish the web service to a public internet URL. I have published it on an IIS server in our DMZ with the URL http://www.grupolanka.com/Salesforce/ErpODataService/ErpService.svc (don’t try it, it won’t work).
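Before wiring Salesforce to it, it is worth requesting the feed once to confirm it answers with AtomPub. Below is a minimal sketch, not part of the solution itself; the local port is an assumption, and Alerts is assumed to be the entity set name the Entity Framework wizard generated for the Alert table (you can check the exact names at the service root).

using System;
using System.Net;

class FeedCheck
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // An OData 2.0 service exposes each entity set at <service url>/<EntitySet>;
            // $filter restricts the result to the alerts of the account with id 1.
            string url = "http://localhost:55555/ErpService.svc/Alerts?$filter=accountId eq 1";
            Console.WriteLine(client.DownloadString(url));
        }
    }
}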

Creating the External Data Source in Salesforce

Go to Salesforce, then to “Setup->Build->Develop->External Data Sources” and click the “New External Data Source” button. Specify a label and a name and make sure you select “Lightning Connect: OData 2.0” as the type. In the server URL put the URL of the OData service we created, and for the format select AtomPub. Specify Anonymous as the authentication, since we haven’t set up any security on the OData service (for simplicity). The external data source should look like the following:

 External data source

When you save it, click the “Validate and Sync” button:

Validate and sync external data source

Select the Alerts object and click the Sync button. Salesforce will create the Alerts__x object. Click on it to see its properties:

Alerts object

We will now create a relationship between this external object and the Account object to link alerts to accounts. But first we need to create a custom field on Account and mark it as an external ID in order to make the relationship. Go to “Setup->Build->Customize->Accounts->Fields” and in the “Account Custom Fields & Relationships” section create a new custom field called External_ID. Make sure to mark it as “Unique” and “External ID”:

External id

Go back to the Alerts external object (“Setup->Build->Develop->External Objects”). In the “Custom Fields & Relationships” section click New. Choose “Indirect Lookup Relationship” as the field data type. Next select Account in the “Related To” field:

Related to

Click Next. In the target field select the custom field for Account we created before (External_ID__c):

Target field

Click Next. Enter a label and a name for the field. Make sure you specify accountId as the “External Column Name”. This is the field in the external data source that contains the foreign key to the account.

Indirect lookup relationship properties

Click Next on the rest of the Wizard.

We now need to create a Tab for the new object and add it to an application. Go to “Setup->Build->Create->Tabs” and create a new tab for the Alerts object. Select a theme. Then assign the tab to an application. You should now see something like this:

Alerts tab

Click on one of the records to see its details:

Alert detail


Seeing the External Data on a Salesforce Object

Now, let’s open an account (any account). Specify a value for the “External Id” field. Use 1 as the value, so we will get all the alerts associated with the account that has id 1:

accountDetail

Notice that you will need to edit the layout for the related list to show the alertText field.

And this is it! We can dynamically see the data from our ERP in Salesforce. Let’s do a test. Insert a new alert for the account with id 1:

insert into Alert values(1,'This thing works!')

Refresh the account in Salesforce, you should see the new alert:

New alert


You can find the Visual Studio project here:



(You can read this article in Spanish here)

In a previous article, Call a .Net WCF Service from Salesforce, I explained how to make an external call from a Salesforce trigger to get data from a third-party system, using a custom web service created in .Net WCF. That approach used SOAP as the messaging protocol between Salesforce and our web service. We all know the advantages and disadvantages of SOAP compared to REST, but the truth is that REST is the preferred choice when it comes to creating web services. So, in this article I will modify the example presented in the previous post to use REST instead.

Allow me to recap the problem we were trying to solve: our client wants to create sales quotes in Salesforce, but wants the price of products to come from the ERP. Our solution was to create a web service (a SOAP web service) in .Net WCF that, given a product identifier, returned the price of the product; we then called this service from a trigger defined on the QuoteLineItem object in Salesforce. We will keep the same architecture for the solution (the trigger and the web service), but this time we will use REST. I suggest you read the previous article to get a better understanding of what we are trying to accomplish.

The WebAPI Service

Let’s start by creating our REST service. We will use an MVC Web API project in Visual Studio for this. I will use Visual Studio Community Edition. Open Visual Studio and create a new “ASP.Net Web Application” named ErpApiService. Select “Web API” as the template and make sure you select “No Authentication” from the “Change Authentication” button (for simplicity we will not deal with security in this article).

New ASP.Net project

New Web API project

Visual Studio will create a sample Web API project with a sample controller. Go to the Controllers folder and rename the file ValuesController.cs to ProductController.cs. If Visual Studio asks whether to rename all references in code, select Yes. Replace the contents of the ProductController.cs file with the following:

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web.Http;

namespace ErpApiService.Controllers
{
    public class ProductController : ApiController
    {
        // GET api/Product?id={id}  (the Get prefix maps this action to HTTP GET; id binds from the query string)
        public decimal GetPriceForCustomer(string id)
        {
            try
            {
                ConnectionStringSettings connectionString = ConfigurationManager.ConnectionStrings["ERP"];
                using (SqlConnection cn = new SqlConnection(connectionString.ConnectionString))
                {
                    using (SqlCommand command = new SqlCommand("salesforce_getProductPrice", cn))
                    {
                        command.CommandType = CommandType.StoredProcedure;

                        command.Parameters.Add("@productId", SqlDbType.VarChar).Value = (object)id ?? DBNull.Value;
                        SqlParameter priceParameter = command.Parameters.Add("@price", SqlDbType.Money);
                        priceParameter.Direction = ParameterDirection.Output;

                        cn.Open();
                        command.ExecuteNonQuery();

                        return (decimal)command.Parameters["@price"].Value;
                    }
                }
            }
            catch (Exception e)
            {
                // Log the error and return 0; the Salesforce caller treats 0 as "price not available"
                Trace.WriteLine(e.Message);
                return 0;
            }
        }
    }
}

The method signature is the most important part of this example: it defines the REST resource. By convention, if a method name starts with Get (as in our example) the Web API engine will map it to the HTTP GET verb. The rest of the code is just ADO.Net boilerplate to call a stored procedure in a database (in this case a SQL Server database).
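For reference, the piece that makes a URL like api/Product?id={id} reach this action is the default route the Web API template registers in App_Start/WebApiConfig.cs. The sketch below reproduces what Visual Studio generates (your file may differ slightly): a GET to api/Product selects ProductController, and the id query string value is bound to the GetPriceForCustomer parameter.

using System.Web.Http;

namespace ErpApiService
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // Attribute routing (enabled by the template, not used in this example)
            config.MapHttpAttributeRoutes();

            // Convention-based routing: api/{controller}/{id}
            // A GET to api/Product?id=PADAP20001 selects ProductController and, because the
            // action name starts with "Get" and has a matching id parameter, GetPriceForCustomer.
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}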

Publish this web service to a public URL. In my case I published it to an IIS server in the DMZ and the URL is the following: http://www.grupolanka.com/Salesforce/ErpApiService/api/Product?id={id} (don’t try it, it won’t work). It is a good idea to verify the published endpoint with a quick client before wiring up Salesforce (see the sketch below); once it responds correctly, we are ready to consume this REST service from Salesforce.
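A minimal console sketch for that check could look like the following; the URL and the product id PADAP20001 are just placeholders taken from this article, so replace them with your own published address and a product that exists in your ERP.

using System;
using System.Net.Http;

class Program
{
    static void Main()
    {
        // Placeholder URL: use the address where you published ErpApiService and a real product id
        string url = "http://www.grupolanka.com/Salesforce/ErpApiService/api/Product?id=PADAP20001";

        using (HttpClient client = new HttpClient())
        {
            // The service returns the price as a plain numeric body (e.g. 125.5)
            string price = client.GetStringAsync(url).Result;
            Console.WriteLine("Price: " + price);
        }
    }
}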

Calling the REST Service from Salesforce

Go to Salesforce and, using the Developer Console, let’s run some anonymous Apex to test our service. Type the following Apex code:

String url = 'http://www.grupolanka.com/Salesforce/ErpApiService/api/Product?id=PADAP20001';

Http h = new Http();
HttpRequest req = new HttpRequest();
req.setEndpoint(url);
req.setMethod('GET');
HttpResponse res = h.send(req);

system.debug('Price: ' + res.getBody());

You should be able to see the price sent from the ERP in the log (remember that the endpoint must first be registered under Remote Site Settings in Setup, or Salesforce will block the callout).

Now let’s change the class from our previous article to make the REST call instead of SOAP. In the Developer Console, open the QuoteLineItemProcesses class and replace the code with the following:

global with sharing class QuoteLineItemProcesses {
    // Callouts are not allowed directly from a trigger, so this method runs asynchronously
    @future (callout = true)
    public static void updateLinePrice(List<Id> lineItemIds) {
        Boolean changed = false;
        
        QuoteLineItem[] lineItems = [select QuoteId,Product2Id,UnitPrice from QuoteLineItem where Id in :lineItemIds];
        for(QuoteLineItem lineItem : lineItems) {
            Quote quote = [select q.AccountId from Quote q where Id = :lineItem.QuoteId];
            Product2 product = [select External_Id__c from Product2 where Id = :lineItem.Product2Id];
            
            String productCode = product.External_Id__c;    
            
            Decimal price = GetPrice(productCode);
            
            if(price > 0)
            {
                lineItem.UnitPrice = price;
                changed = true;
            }    
        }
        
        if(changed) update lineItems;
    }
    
    private static Decimal GetPrice(String productCode) {
        String url = 'http://www.grupolanka.com/Salesforce/ErpApiService/api/Product?id=' + productCode;

        Http h = new Http();
        HttpRequest req = new HttpRequest();
        req.setEndpoint(url);
        req.setMethod('GET');
        HttpResponse res = h.send(req);
        
        return Decimal.valueOf(res.getBody());
    }
}

The callout to a REST service from a trigger follows the same principles we outlined in the previous article: it needs to be made from a method marked with the @future annotation (so the callout runs asynchronously), that method needs to be static and return void, and the trigger needs to be an after insert trigger.

You can now test the call by adding a new line item to an existing quote; Salesforce should call the REST service and assign the product price from the ERP to the line item.

You can get the sample code from here:


(You can read this article in Spanish here)

In a previous article, A .Net WCF Service Handler for Salesforce Workflow Outbound Messages that Calls Back to Salesforce using SOAP API, I explained how to create a .Net WCF service that handles an outbound message and gets additional data from Salesforce using the SOAP API. In this article I use the same example, but instead of the SOAP API I will use the REST API. You might wonder why you would need to change SOAP code to use REST instead, and the answer is simple: you might have the SOAP API disabled in your organization (because of the Salesforce edition you have) and only have the REST API available.

Allow me to recap what we would like to accomplish: our client wants to integrate accounts in Salesforce with accounts in their ERP, so that every time an opportunity is marked as “Closed Won” in Salesforce the account is created in the ERP. I suggest you read the previous article, which explains how to set up the workflow and the Visual Studio project for the WCF service that handles the workflow outbound message.

Salesforce exposes its data and metadata through a standard REST API. As we did in the SOAP case, we could use that API to query the account information, but it is simpler to expose an Apex class as a custom REST service, and Salesforce makes this very easy. You can follow the steps explained in the Force.com Apex Code Developer's Guide for an overview of how to do this. In this article I will expose an Apex class as a REST service and then consume that service from the message handler we created in .Net.

Exposing the REST Service in Salesforce

This is actually very simple: all you need to do is open the Salesforce Developer Console and, from the menu, select “File->New->Apex Class”. Name it AccountRestService and replace the code with the following:

@RestResource(urlMapping='/Account/*')
global with sharing class AccountRestService {
    @HttpGet
    global static Account doGet() {
        RestRequest req = RestContext.request;
        // The account id is the last segment of the request URI (…/apexrest/Account/{accountId})
        String accountId = req.requestURI.substring(req.requestURI.lastIndexOf('/')+1);
        Account result = [SELECT Id, Name, AccountNumber, BillingStreet, BillingCity, BillingState, BillingPostalCode, BillingCountry FROM Account WHERE Id = :accountId];
        return result;
    }
}

Notice that the @RestResource annotation tells Salesforce that this class is a REST service, and that the @HttpGet annotation marks doGet as the method that handles HTTP GET requests. The URL for this service will be the following:

https://{instanceName}.salesforce.com/services/apexrest/Account/{accountId}

The https://{instanceName}.salesforce.com/services/apexrest part is the base address for all Apex REST services, and /Account/{accountId} comes from the urlMapping parameter of the @RestResource annotation in our class definition. For example, you could use the following URL to get the details of a specific account:

https://eu5.salesforce.com/services/apexrest/Account/00124000002uzps

There is one thing we haven’t considered yet: security. If you put the above URL into a browser you will get an INVALID_SESSION_ID error. If you read the documentation you will learn that any HTTP request issued against the REST service needs an Authorization header. You could create a connected app and use OAuth to call the login endpoint and get a session id, but that is fairly involved (I will explain it in a future article). In our case, since we are calling this service from an outbound message handler, we already have a session id: remember from the previous article that we marked the outbound message to “Send Session ID”:

Outbound message

So, all we need to do is build the right HTTP request from our WCF message handler to call our REST service. If you first want to verify the endpoint by hand, all it takes is a valid session id and the Authorization header, as in the sketch below.
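This is just a minimal, illustrative check (it is not part of the WCF handler): the instance URL, account id and session id are placeholders, and in practice you would grab a session id from the outbound message or a debug log.

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class QuickRestCheck
{
    static void Main()
    {
        // Placeholders: use your own instance, an existing account id and a valid session id
        string url = "https://eu5.salesforce.com/services/apexrest/Account/00124000002uzps";
        string sessionId = "<a valid session id>";

        using (HttpClient client = new HttpClient())
        {
            // The Apex REST service requires the session id (or OAuth token) in the Authorization header
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", sessionId);

            string json = client.GetStringAsync(url).Result;
            Console.WriteLine(json); // JSON representation of the account
        }
    }
}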


Calling the REST Service from Visual Studio

Open the Visual Studio project you created in the previous article (you can get a sample from here). We will use RestSharp as our REST client to call the service, so add the RestSharp package to your project using NuGet. RestSharp can automatically transform the JSON text returned by a REST service into a strongly typed object. Let’s create a model to encapsulate the data returned from the REST service: in Visual Studio, create a Model folder and add a class named Account to it. Replace the code with the following:

namespace WorkflowNotificationServices.Model
{
    public class Account
    {
        public string Id { get; set; }
        public string Name { get; set; }
        public string AccountNumber { get; set; }
        public string BillingStreet { get; set; }
        public string BillingCity { get; set; }
        public string BillingState { get; set; }
        public string BillingPostalCode { get; set; }
        public string BillingCountry { get; set; }
    }
}

We have defined an Account class that encapsulates the data from the Account object in Salesforce. Notice that for simplicity we have named the properties the same as the object fields in Salesforce (you don’t have to use the same names; you could, for example, deserialize the JSON yourself with JSON.Net and map the names, as sketched below, but that is beyond the scope of this article).
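Purely as an illustration of that alternative (the class, property and method names here are hypothetical, not part of the article’s project), you could skip RestSharp’s deserializer, decorate your model with [JsonProperty] attributes and deserialize response.Content yourself:

using Newtonsoft.Json; // JSON.Net NuGet package

namespace WorkflowNotificationServices.Model
{
    // Hypothetical model whose property names differ from the Salesforce field names;
    // [JsonProperty] tells JSON.Net which JSON field feeds each property.
    public class AccountDto
    {
        [JsonProperty("Id")]
        public string SalesforceId { get; set; }

        [JsonProperty("Name")]
        public string CompanyName { get; set; }

        [JsonProperty("BillingStreet")]
        public string Street { get; set; }
    }

    public static class AccountParser
    {
        // Deserialize the raw JSON returned by the Apex REST service
        // (with RestSharp this is the response.Content string).
        public static AccountDto Parse(string json)
        {
            return JsonConvert.DeserializeObject<AccountDto>(json);
        }
    }
}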

Now, open the file OpportunityNotificationService.svc.cs and replace the CreateAccount method with the following code:

private bool CreateAccount(string url, string sessionId, WorkflowNotificationServices.Opportunity opportunity)
{
    int recordsAffected = 0;

    Uri uri = new Uri(url);
    RestClient restClient = new RestClient(new Uri(String.Format("https://{0}", uri.Host)));
    RestRequest request = new RestRequest("services/apexrest/Account/{id}");
    request.AddUrlSegment("id", opportunity.AccountId);
    request.AddHeader("Authorization", String.Format("Bearer {0}", sessionId));
    IRestResponse<Model.Account> response = restClient.Execute<Model.Account>(request);

    Model.Account account = response.Data;

    ConnectionStringSettings connectionString = ConfigurationManager.ConnectionStrings["ERP"];
    using (SqlConnection cn = new SqlConnection(connectionString.ConnectionString))
    {
        using (SqlCommand command = new SqlCommand("salesforce_crearCliente", cn))
        {
            command.CommandType = CommandType.StoredProcedure;

            command.Parameters.Add("@idSalesforce", SqlDbType.VarChar).Value = account.Id;
            command.Parameters.Add("@name", SqlDbType.VarChar).Value = account.Name;
            command.Parameters.Add("@number", SqlDbType.VarChar).Value = (object)account.AccountNumber ?? DBNull.Value;
            command.Parameters.Add("@address", SqlDbType.VarChar).Value = (object)account.BillingStreet ?? DBNull.Value;
            command.Parameters.Add("@city", SqlDbType.VarChar).Value = (object)account.BillingCity ?? DBNull.Value;
            command.Parameters.Add("@state", SqlDbType.VarChar).Value = (object)account.BillingState ?? DBNull.Value;
            command.Parameters.Add("@postalCode", SqlDbType.VarChar).Value = (object)account.BillingPostalCode ?? DBNull.Value;
            command.Parameters.Add("@country", SqlDbType.VarChar).Value = (object)account.BillingCountry ?? DBNull.Value;

            cn.Open();
            recordsAffected = command.ExecuteNonQuery();
        }
    }

    return recordsAffected > 0;
}

The REST calls at the top of the method are the important part. Notice how we take the URL we got from the SOAP message Salesforce sent to our WCF service and extract the host, and thus the instance name, we need to send the HTTP request to (for example, a URL on eu5.salesforce.com yields the base address https://eu5.salesforce.com). We then create a RestSharp request pointing at the endpoint of our REST service, as explained before, and use AddUrlSegment to pass the account Id we got from the outbound message of the opportunity. The security piece is the Authorization header, which we set to Bearer {sessionId}; the session id also comes from the outbound message sent by Salesforce (remember we marked the “Send Session ID” field in the outbound message definition). Finally, Execute<Model.Account> makes the HTTP call and tells RestSharp to convert the result into the Account class we created before. The rest of the code is the same as before: it uses the values returned by the REST service (now strongly typed) to call a stored procedure on the ERP.


Testing the Call Back

To test our web service we just follow the same steps outlined in the previous article. Notice that we only changed the API used to obtain data from Salesforce: we were using SOAP before and now we use REST.

You can get the sample project here: