SharePoint Online and Salesforce Integration

SharePoint Online and Salesforce are cloud platforms provided as “Software as a Service” (SaaS) offerings by two different vendors.

SharePoint Online is a cloud-based service with which organizations can share content with colleagues, partners, and customers. It can be accessed from anywhere, whether home, office, or wherever Internet connectivity is available, and from virtually any device, including mobile.

Salesforce, in turn, is a cloud-based customer relationship management (CRM) solution for sales, service, marketing, collaboration, analytics, and reporting.

In many business situations, users would like to seamlessly surface Salesforce data in SharePoint Online, so that information can be consumed and collaborated on from a single platform instead of two different places.

In this blog, I will explain the concept and the steps required to integrate Salesforce with a SharePoint Online site, so that information created or updated in Salesforce is created or updated in the SharePoint Online site in near real time.

The first hurdle in writing data to SharePoint Online is authenticating the user remotely, and the best way available to do that is the REST API.

You need to make a few REST calls to complete the authentication flow, as given below:

  • Get the security token
  • Get the access token
  • Get the request digest

The above REST calls have to be translated into an Apex class, which is supported by the Force.com platform.

The first task is to get the SharePoint Online security token, which can be obtained by posting XML as the request body to https://login.microsoftonline.com/extSTS.srf. In the XML you need to pass the credentials of an account that has at least Contribute access.

This request returns a security token, which is needed to get the access token.

In order to get the access token, another REST call is posted to the following URL with the security token as the request body:

https://yourdomain.sharepoint.com/_forms/default.aspx?wa=wsignin1.0

The response to this call contains authentication cookies, and these cookies must be included in all subsequent REST calls.

After the above operation, you need the request digest. The request digest is obtained by posting a REST call, along with the access token and the cookies obtained above, to the following URL:

https://yourdomain.sharepoint.com/_api/contextinfo

The REST call posted to the above URL returns a response containing the request digest. Please note that the entire contents of the “FormDigestValue” tag are required, including the date-time portion and the time zone offset.

All the above steps are carried out in a global Apex class, and the Apex class is then called from a trigger in Salesforce (a minimal trigger sketch is shown after the class).

Apex class source code is as given below: 

global class SharePointOnlineWebserviceCallout {
    @future(callout=true)
    public static void GetAuthentication(String AccountTitle) {
        string body = '';
        string cookie = '';
        string token = '';
        string username = 'account@sharepoint.com';
        string password = 'Password';
        string host = 'https://yourdomain.sharepoint.com';
        // SAML token request envelope posted to the Microsoft STS.
        string tokenRequestXml = '<s:Envelope ' +
            'xmlns:s=\'http://www.w3.org/2003/05/soap-envelope\' ' +
            'xmlns:a=\'http://www.w3.org/2005/08/addressing\' ' +
            'xmlns:u=\'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd\'>' +
            '<s:Header>' +
            '<a:Action s:mustUnderstand=\'1\'>http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</a:Action>' +
            '<a:ReplyTo>' +
            '<a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address>' +
            '</a:ReplyTo>' +
            '<a:To s:mustUnderstand=\'1\'>https://login.microsoftonline.com/extSTS.srf</a:To>' +
            '<o:Security ' +
            's:mustUnderstand=\'1\' ' +
            'xmlns:o=\'http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd\'>' +
            '<o:UsernameToken>' +
            '<o:Username>' + username + '</o:Username>' +
            '<o:Password>' + password + '</o:Password>' +
            '</o:UsernameToken>' +
            '</o:Security>' +
            '</s:Header>' +
            '<s:Body>' +
            '<t:RequestSecurityToken xmlns:t=\'http://schemas.xmlsoap.org/ws/2005/02/trust\'>' +
            '<wsp:AppliesTo xmlns:wsp=\'http://schemas.xmlsoap.org/ws/2004/09/policy\'>' +
            '<a:EndpointReference>' +
            '<a:Address>' + host + '</a:Address>' +
            '</a:EndpointReference>' +
            '</wsp:AppliesTo>' +
            '<t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType>' +
            '<t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>' +
            '<t:TokenType>urn:oasis:names:tc:SAML:1.0:assertion</t:TokenType>' +
            '</t:RequestSecurityToken>' +
            '</s:Body>' +
            '</s:Envelope>';

        // Step 1: post the token request XML to the STS to get the binary security token.
        HttpRequest reqBinaryToken = new HttpRequest();
        reqBinaryToken.setEndpoint('https://login.microsoftonline.com/extSTS.srf');
        reqBinaryToken.setMethod('POST');
        reqBinaryToken.setBody(tokenRequestXml);
        reqBinaryToken.setHeader('Content-Length', String.valueOf(tokenRequestXml.length()));
        reqBinaryToken.setTimeout(60000);

        Http httpBinaryToken = new Http();
        HttpResponse responseBinaryToken = httpBinaryToken.send(reqBinaryToken);
        Dom.Document doc = responseBinaryToken.getBodyDocument();
        string outxmlstring = String.valueOf(doc.getRootElement().getName()); // root element name, useful for debugging

        // Extract the BinarySecurityToken value from the STS response.
        XmlStreamReader reader = new XmlStreamReader(responseBinaryToken.getBody());
        while (reader.hasNext()) {
            if (reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'BinarySecurityToken') {
                reader.next();
                if (reader.hasNext() && reader.getEventType() == XmlTag.CHARACTERS) {
                    token = reader.getText();
                    token += '&p=';
                }
            }
            reader.next();
        }

        // Step 2: post the security token to the sign-in page to obtain the authentication cookies.
        HttpRequest requestCookie = new HttpRequest();
        requestCookie.setEndpoint('https://yourdomain.sharepoint.com/_forms/default.aspx?wa=wsignin1.0');
        requestCookie.setHeader('Content-Type', 'application/x-www-form-urlencoded');
        requestCookie.setHeader('User-Agent', 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)');
        requestCookie.setMethod('POST');
        requestCookie.setBody(token);
        requestCookie.setHeader('Content-Length', String.valueOf(token.length()));

        Http httpCookie = new Http();
        HttpResponse responseCookie = httpCookie.send(requestCookie);
        string location = responseCookie.getHeader('Location');

        // getStatus() returns the reason phrase ('Moved Permanently'), so check the status code instead.
        if (responseCookie.getStatusCode() == 301) {
            HttpRequest reqMovedPermanently = new HttpRequest();
            reqMovedPermanently.setHeader('Content-Type', 'application/x-www-form-urlencoded');
            reqMovedPermanently.setMethod('POST');
            reqMovedPermanently.setEndpoint('https://yourdomain.sharepoint.com/_forms/default.aspx?wa=wsignin1.0');
            reqMovedPermanently.setBody(token);
            reqMovedPermanently.setHeader('Content-Length', String.valueOf(token.length()));
            reqMovedPermanently.setHeader('Location', location);
            reqMovedPermanently.setHeader('User-Agent', 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)');
            Http httpMovedPermanently = new Http();
            HttpResponse responseMovedPermanently = httpMovedPermanently.send(reqMovedPermanently);
            cookie = responseMovedPermanently.getHeader('Set-Cookie');
        } else {
            cookie = responseCookie.getHeader('Set-Cookie');
        }

        // Step 3: post to /_api/contextinfo with the cookies to obtain the request digest.
        HttpRequest requestDigest = new HttpRequest();
        requestDigest.setEndpoint('https://yourdomain.sharepoint.com/_api/contextinfo');
        requestDigest.setMethod('POST');
        requestDigest.setBody(body);
        requestDigest.setHeader('Content-Length', String.valueOf(body.length()));
        requestDigest.setHeader('Accept', 'application/json;odata=verbose');
        requestDigest.setHeader('Content-Type', 'application/json;odata=verbose');
        requestDigest.setHeader('Cookie', cookie);

        Http httpRequestDigest = new Http();
        HttpResponse responseRequestDigest = httpRequestDigest.send(requestDigest);
        string xmlContentRequestDigest = responseRequestDigest.getBody();

        // Extract the full FormDigestValue, including the date-time portion and time zone offset.
        Integer index1 = xmlContentRequestDigest.indexOf('"FormDigestValue":"');
        Integer index2 = '"FormDigestValue":"'.length();
        string contentRequestDigest = xmlContentRequestDigest.substring(index1 + index2);
        string requestDigestXml = contentRequestDigest.split('"')[0];

        // Step 4: create the list item. The remainder of this request was truncated in the
        // original post; the completion below is a sketch. 'AccountTest' is the target list,
        // and 'SP.Data.AccountTestListItem' is an assumed ListItemEntityTypeFullName for it.
        HttpRequest reqWrite = new HttpRequest();
        Http httpWrite = new Http();
        reqWrite.setEndpoint('https://yourdomain.sharepoint.com/_api/web/lists/GetByTitle(\'AccountTest\')/items');
        reqWrite.setMethod('POST');
        reqWrite.setCompressed(false);
        reqWrite.setHeader('Accept', 'application/json;odata=verbose');
        reqWrite.setHeader('Content-Type', 'application/json;odata=verbose');
        reqWrite.setHeader('X-RequestDigest', requestDigestXml);
        reqWrite.setHeader('Cookie', cookie);
        string itemBody = '{ "__metadata": { "type": "SP.Data.AccountTestListItem" }, "Title": "' + AccountTitle + '" }';
        reqWrite.setBody(itemBody);
        reqWrite.setHeader('Content-Length', String.valueOf(itemBody.length()));
        HttpResponse resWrite = httpWrite.send(reqWrite);
    }
}
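
As mentioned above, the global Apex class is invoked from a trigger in Salesforce. Below is a minimal sketch of such a trigger on the Account object; the trigger name and the choice of passing the account name are illustrative assumptions, and the endpoints called by the class must also be registered under Remote Site Settings in Salesforce before callouts are allowed.

trigger AccountToSharePoint on Account (after insert, after update) {
    // Each @future call counts against the per-transaction limit,
    // so this simple loop is only suitable for small batches.
    for (Account acc : Trigger.new) {
        SharePointOnlineWebserviceCallout.GetAuthentication(acc.Name);
    }
}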


Search Architecture in SharePoint 2013

Search architecture in SharePoint 2013 has undergone massive changes in comparison to earlier versions of SharePoint. A primary change is that in SharePoint 2013, FAST Search has been embedded in the native search engine, whereas in SharePoint 2010 FAST Search was a separate installation.

  1. Search in SharePoint 2013 has been enhanced in several ways for different types of users. For example, end users can find relevant information more quickly and easily, while developers and administrators get new APIs and a customizable search experience.
  2. Search in SharePoint 2013 has also been re-architected into a single search platform. The search architecture has the following components:
    1. Crawl and content processing components
    2. Index component
    3. Query processing component
    4. Analytics components
    5. Search administration component

Below is the search architecture diagram, which has different components as indicated above.

Let us discuss each of the components of SharePoint 2013 search engine architecture in detail.

  1. Crawl Component: connects to different content sources using protocol handlers and crawls the content. The crawl engine extracts crawled properties and metadata from the crawled content, then passes them to the content processing component.
  2. Crawl Database: is one of the key parts of the crawl and content processing components. As the crawler crawls content, the extracted information is stored in the crawl database, which records the timestamp of crawled items, the last crawl time, the last crawl ID, and the type of update during the last crawl.
  3. Content Processing Component: processes the information sent by the crawl component and passes it to the index component.
  4. Link Database: stores the link information extracted by the crawl component and the processed link information from the content processing component.
  5. Index Component: receives processed items from the content processing component and writes them to the search index. The index component also handles incoming queries from the query processing component, fetches results from the search index, and passes them back to the query processing component.
  6. Query Processing Component: analyzes and processes search queries. The processed query is sent to the index component, which in turn submits it to the search index and returns the result set (a minimal query sketch is shown after this list).
  7. Analytics Component: contains the analytics processing component, link database, analytics reporting database, and event store. The analytics processing component is responsible for usage analytics and search analytics. The link database in this component stores search click information, the analytics reporting database stores the results of usage analytics, and the event store captures usage events recorded at the front end.
  8. Search Administration Component: helps in running system processes. Using the search administration component you can add new instances of search components, and the search administration database stores the search configuration data.
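
To make the query flow concrete, the following hypothetical sketch (anonymous Apex, in the style of the integration section above) submits a query to the search REST endpoint, /_api/search/query, and reads back the result set produced by the index component. The host name, query text, and cookie value are placeholder assumptions; an on-premises SharePoint 2013 farm may use Windows or claims authentication instead of the cookie flow shown earlier.

// Hypothetical sketch: submit a search query through the search REST endpoint.
String cookie = '...'; // authentication cookie obtained as in the integration section
HttpRequest searchReq = new HttpRequest();
searchReq.setEndpoint('https://yourdomain.sharepoint.com/_api/search/query?querytext=\'contract\'');
searchReq.setMethod('GET');
searchReq.setHeader('Accept', 'application/json;odata=verbose');
searchReq.setHeader('Cookie', cookie);
HttpResponse searchRes = new Http().send(searchReq);
// The JSON response contains the rows returned by the index component.
System.debug(searchRes.getBody());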

Search Service Application:

The search engine in SharePoint 2013 is exposed through a service application known as the Search service application, which provides an interface to configure the different components of the search architecture discussed in the above segment.

To create a search service application, follow the steps and screenshots given below.

  1. Open the SharePoint Central Administration site using an admin account.
  2. Under the Application Management section, click on Manage service applications as shown in the screenshot below.
  3. Click on the New button on the service applications page as shown in the screenshot below, and select Search Service Application from the drop-down.
  4. This opens a dialog page to capture the configuration properties. The settings you specify here can be changed later using the Properties button on the Manage service applications page, after selecting the appropriate service application.

Fill in the necessary settings and click OK.

After a few minutes the search service application will be created and a web page as shown in the screenshot below will be displayed.

Using the search administration page shown above, you can configure search in your farm.

Managed Metadata in SharePoint 2013

Metadata is defined as ‘data about data’, or ‘information about information’. For example, a book can have metadata such as title and author. Metadata can be of many types and can be associated with any type of element.

In SharePoint this information can be centrally managed, which makes it ‘managed metadata’. Managed metadata in SharePoint can be defined as a hierarchical collection of centrally managed terms, term sets, and enterprise keywords.

The managed metadata concept was introduced by Microsoft in SharePoint 2010.

In SharePoint, managed metadata is exposed through a service application known as the Managed Metadata service application.

Given below are the steps to create a Managed Metadata service application in SharePoint 2013.

  1. Open the Central Administration site using an admin account.
  2. In the Central Administration site, under the Application Management section, click on ‘Manage service applications’ as shown below.
  3. This link redirects to the Manage Service Applications page. There, under the Service Applications tab, click on the New button, which displays a drop-down menu as shown in the screenshot below.
  4. Click on the Managed Metadata Service link. This pops up a new window where you can specify the name of the service application, the name of the database that will be created to store the managed terms, the application pool details, and the content settings, as shown in the screenshot below.
  5. After a few minutes the service application will be created and the Managed Metadata service application default page will be loaded as shown in the screenshot below. In the default service application page, on the right pane, the default settings will be displayed.
  6. In the left pane of the managed metadata service application page, under the System folder (which is nothing but the System group), there will be three different term sets, as shown in the screenshot below.
    The Hashtags term set stores terms used for social collaboration tagging. For example, if you create a new entry in the news feed web part with a hashtag, it will be stored in the Hashtags term set as shown in the screenshot below.

    The Keywords term set stores terms created through managed metadata columns in SharePoint lists and libraries. Such a column allows multiple values and has the submission policy set to allow new values, so when a user adds a new value to the column, it is stored as a keyword under the Keywords term set. The Orphaned Terms term set stores deleted terms that are still in use.

  7. There are some more groups created by default as shown in the screenshot below.

The People group contains three different term sets, as shown in the screenshot above. These term sets are used in user profiles and properties to store information about users. Every new entry for a user profile property backed by these three term sets is stored here.

The Search Dictionaries group contains several term sets, as shown in the screenshot above. Let’s discuss each term set in detail.

Using the Company Name Inclusions and Company Name Exclusions term sets, you can control which company names are extracted and placed in a managed property field.

Search queries can be improved by managing “Did you mean” spelling suggestions in the term store, using the query spelling inclusion and exclusion term sets.

Important terminology

  • Taxonomy: a formal, structured way of arranging and classifying terms.
  • Folksonomy: a free-flowing, informal way of classifying data.
  • Groups: a group is a set of term sets that share common security requirements.
  • Term sets: a term set is a group of related terms. There are two types: local term sets, created at site collection scope, and global term sets, created in the Managed Metadata service application.
  • Terms: a term is a specific word or phrase that can be associated with a SharePoint item.

New Features in SharePoint 2016

Office 365 Integration

Microsoft has enabled Office 365 integration in terms of hybrid scenarios, which can be configured using the SharePoint 2016 on-premises Central Administration site.

Hybrid integration can be done in two ways:

  1. OneDrive
  2. OneDrive and sites

SMTP Connection Encryption

In SharePoint 2013, SMTP ran on port 25. In SharePoint 2016, however, you can use a custom port for the SMTP configuration, and you can also enable SSL to encrypt the connection between SharePoint 2016 and the SMTP server.

System Settings

Using the Central Administration site, you can manage the services running on a server as well as those in the farm. You can even change the server role from the Central Administration site for a particular server in the farm.

Enterprise Metadata and Keyword Settings

The Enterprise Keywords column allows users to enter one or more text values that will be shared with other users and applications, for ease of search and filtering, as well as metadata consistency and reuse. In SharePoint 2016 you can use metadata publishing in a document library or list, whereby values added to managed metadata and enterprise keyword columns can be shared as social tags on My Sites and appear in news feeds, profile pages, tag clouds, and tag profile pages.

Project Server Inclusion

SharePoint Server 2016 ships with Project Server by default. Microsoft has consolidated the Project Server databases into the SharePoint content database, and the Project Server engine is now embedded in the SharePoint 2016 on-premises version. However, this will not be available free of cost: a separate license is required to use the Project Server features, which are exposed through the Project Server service application.

New Improved Synchronization Settings

SharePoint 2016 no longer supports FIM (Forefront Identity Manager), which was used in the earlier version, SharePoint 2013. All profiles imported into SharePoint 2016 are supported using Active Directory import. To overcome the limitations of Active Directory import, Microsoft has come up with MIM (Microsoft Identity Manager), an upgraded version of FIM that overcomes those limitations and gives SharePoint 2016 improved bidirectional synchronization support.

New Site Template

SharePoint 2016 has a new site template known as “Compliance Policy Center”. It can be used as the Document Deletion Policy Center to manage policies that delete documents after a specified period of time. These policies can then be assigned to specific site collections or to site collection templates.

The other site template that has been introduced is ‘In-Place Hold Policy Center’. This site template can be used to create a site to manage policies that preserve content for a fixed period of time.

Role-based Server Farm Configuration

SharePoint 2016 is shipped with a configuration wizard that helps SharePoint administrators configure farm servers with specific roles right at installation time, which helps maintain the best performance for the desired server role. The available roles are as follows:

  • Front End
  • Application
  • Distributed Cache
  • Search
  • Custom
  • Single Server Farm

Role Conversion

In SharePoint 2016, you can change the role of a server in the farm at any time using the SharePoint Central Administration site, under System Settings, i.e. ‘Convert server roles in this farm’.

Enhanced Control for Document Library (Similar to SharePoint Online)

SharePoint 2016 document libraries have the same features as SharePoint Online document libraries, giving on-premises users the same commands available in SharePoint Online.

Five Key Boundaries and Limits – Improved

  • Content Database Size – has been increased from gigabytes to terabytes.
  • Site Collections per Content Database – has been increased to 100,000.
  • List View Threshold – is now greater than 5,000 items.
  • Maximum File Upload Size – has been increased from 2 GB to 10 GB, and character restrictions have been removed.
  • Indexed Items – have been doubled from 500 million items for search scaling.

Durable Links

When a document is renamed or moved from one document library to another, even to a separate site collection, links to it will not break.

SLAPI (SharePoint logging API)

A new API to read logging data has been introduced. This API will give better insight into how users are using SharePoint.