Thursday 10 October 2019

Topics to study for Integration Architecture Designer exam


I have recently taken my fourth designer exam (only one to go... yay!!). Overall the exam wasn't really difficult if you have relevant hands-on integration experience. The Trailmix has everything you need to study for this exam, so I have listed the key topics/points which I've found really useful:

Integration Patterns
Integration Patterns are classified into two categories:
  • Data Integration - These patterns address the requirement to synchronise data that resides in two or more systems so that both systems always contain timely and meaningful data.
  • Process Integration - The patterns in this category address the need for a business process to leverage two or more applications to complete its task.
Here's the list of patterns:


Pattern Selection Matrix 


Introduction to APIs

Here's the list of APIs with key details.

  • REST API: Its advantages include ease of integration and development, and it’s an excellent choice of technology for use with mobile applications and web projects.
  • SOAP API: Provides a powerful, convenient, and simple SOAP-based web services interface for interacting with Salesforce.
  • Chatter REST API: Use it to display Chatter feeds, users, groups, and followers, especially in mobile applications.
  • User Interface API: Build Salesforce UI for native mobile apps and custom web apps using the same API that Salesforce uses to build Lightning Experience and Salesforce for Android, iOS, and mobile web.
  • Apex REST API: Use it when you want to expose your Apex classes and methods so that external applications can access your code through REST architecture.
  • Apex SOAP API: Use it when you want to expose Apex methods as SOAP web service APIs so that external applications can access your code through SOAP.
  • Analytics REST API: Access Analytics assets—such as datasets, lenses, and dashboards—programmatically.
  • Bulk API: Based on REST principles and optimized for loading or deleting large sets of data.
  • Metadata API: Use it to retrieve, deploy, create, update, or delete customizations for your org.
  • Streaming API: Use it to receive near-real-time streams of data based on changes in Salesforce records or custom payloads.
  • Tooling API: Use it to integrate Salesforce metadata with other systems.
Salesforce APIs
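
To make the REST API entry above concrete, here's a minimal Python sketch that runs a SOQL query against the standard query endpoint and follows pagination. The instance URL and access token are placeholders you would obtain from your own OAuth flow; this is just an illustration, not a full client.

```python
import requests

# Placeholders: obtain these from your own OAuth 2.0 flow or session.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"
ACCESS_TOKEN = "<access token>"

def run_soql(soql):
    """Run a SOQL query via the REST API, following pagination until done."""
    url = f"{INSTANCE_URL}/services/data/v47.0/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    params = {"q": soql}
    records = []
    while True:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["records"])
        if body.get("done"):
            return records
        # nextRecordsUrl is returned when the result spans more than one page
        url = INSTANCE_URL + body["nextRecordsUrl"]
        params = None

print(len(run_soql("SELECT Id, Name FROM Account LIMIT 10")), "accounts retrieved")
```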



Choreography Vs Orchestration

The difference between choreography and service orchestration is:

  • Choreography (where applications are multi-participants and there is no central “controller”) can be defined as “behaviour resulting from a group of interacting individual entities with no central authority.”
  • Orchestration (where one application is the central “controller”) can be defined as “behaviour resulting from a central conductor coordinating the behaviours of individual entities performing tasks independent of each other.”
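
A purely illustrative Python sketch of the two styles follows; the in-memory event bus and the charge/dispatch participants are hypothetical stand-ins, not a Salesforce API.

```python
from collections import defaultdict

# Hypothetical in-memory event bus and toy participants, for illustration only.
class Bus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._handlers[topic]:
            handler(event)

def charge(order):
    print(f"charged {order}")

def dispatch(order):
    print(f"dispatched {order}")

# Orchestration: a central conductor calls each participant in a fixed order.
def orchestrate(order):
    charge(order)
    dispatch(order)

# Choreography: no central controller; each participant reacts to an event
# and publishes a follow-up event for the next participant.
bus = Bus()
bus.subscribe("order.placed", lambda e: (charge(e["order"]),
                                         bus.publish("payment.completed", e)))
bus.subscribe("payment.completed", lambda e: dispatch(e["order"]))

orchestrate("order-1")                              # orchestration
bus.publish("order.placed", {"order": "order-2"})   # choreography
```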

Streaming Events 

Use the type of streaming event that suits your needs.

PushTopic Event
Receive changes to Salesforce records based on a SOQL query that you define. The notifications include only the fields that you specify in the SOQL query.

Change Data Capture Event

Receive changes to Salesforce records with all changed fields. Change Data Capture supports more standard objects than PushTopic events and provides more features, such as header fields that contain information about the change. Change Data Capture is part of a pilot program. To participate in the pilot, contact Salesforce.

Platform Event

Publish and receive custom payloads with a predefined schema. The data can be anything you define, including business data, such as order information. Specify the data to send by defining a platform event. Subscribe to a platform event channel to receive notifications.

Generic Event

Publish and receive arbitrary payloads without a defined schema.
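
As a concrete example of the first option, a PushTopic is just a record you insert. Here's a minimal Python sketch that creates one via the REST API; the instance URL, access token, and the Invoice__c custom object are assumptions for illustration.

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<access token from your OAuth flow>"      # placeholder

# A PushTopic is an ordinary record: once inserted, subscribers on the
# /topic/InvoiceUpdates channel receive notifications for matching changes.
push_topic = {
    "Name": "InvoiceUpdates",
    "Query": "SELECT Id, Name, Status__c FROM Invoice__c",   # hypothetical custom object
    "ApiVersion": 47.0,
    "NotifyForOperationCreate": True,
    "NotifyForOperationUpdate": True,
    "NotifyForFields": "Referenced",   # notify only when queried fields change
}

resp = requests.post(
    f"{INSTANCE_URL}/services/data/v47.0/sobjects/PushTopic/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=push_topic,
)
resp.raise_for_status()
print("PushTopic created:", resp.json()["id"])
```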


WSDL (Web Services Description Language) 

Enterprise WSDL

  • The Enterprise WSDL is strongly typed.
  • The Enterprise WSDL is tied (bound) to a specific configuration of Salesforce (i.e. a specific organization's Salesforce configuration).
  • The Enterprise WSDL changes if modifications (e.g. custom fields or custom objects) are made to an organization's Salesforce configuration.

Partner WSDL

  • The Partner WSDL is loosely typed.
  • The Partner WSDL can be used to reflect against/interrogate any configuration of Salesforce (i.e. any organization's Salesforce configuration).
  • The Partner WSDL is static, and hence does not change if modifications are made to an organization's Salesforce configuration.
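
For illustration, here's a minimal Python sketch using the zeep SOAP library against a downloaded Partner WSDL; the file name and credentials are placeholders. Because the Partner WSDL is loosely typed and static, the same file can be used against any org.

```python
from zeep import Client

# partner.wsdl is the Partner WSDL downloaded from Setup in any org.
client = Client("partner.wsdl")

# Placeholders: append your security token to the password if your org requires it.
result = client.service.login("user@example.com", "password+securityToken")

print("Session Id :", result.sessionId)
print("Server URL :", result.serverUrl)
```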

Delegated Authentication WSDL

The delegated authentication WSDL document is for users who want to create a delegated authentication application to support single sign-on.

Metadata WSDL

The Metadata WSDL document is for users who want to use the Metadata API to retrieve or deploy customization information.

Apex WSDL

The Apex WSDL document is for developers who want to run or compile Apex scripts in another environment.

Identity Type

Determines whether you're using one set or multiple sets of credentials to access the external system.

  • Anonymous — No identity and therefore no authentication.
  • Named Principal — Your entire Salesforce org shares one login account on the external system. Salesforce manages all authentication for callouts that specify a named credential as the callout endpoint, so that you don’t have to. You can also skip remote site settings, which are otherwise required for callouts to external sites, for the site defined in the named credential.
  • Per User — Your org uses multiple login accounts on the external system. You or your users can set up personal authentication settings for the external system.

Certificates
  • The API client certificate is used by workflow outbound messages, the AJAX proxy, and delegated authentication HTTPS callouts.
  • Certificates with 2048-bit keys last one year and are faster than certificates with 4096-bit keys. Certificates with 4096-bit keys last two years. You can have a maximum of 50 certificates.
  • The expiration date of the certificate record is updated to the expiration date of the newly uploaded certificate.
  • Don't delete a key unless you're absolutely certain no data is currently encrypted using the key. After you delete a key, any data encrypted with that key can no longer be accessed.
  • Replace the Default Proxy Certificate for SAML Single Sign-On

Reason for backup

  • Recover from data corruption (unintended user error or malicious activity)
  • Prepare for a data migration rollback
  • Archive data to reduce volumes
  • Replicate data to a data warehouse/BI
  • Take snapshots of development versions
  • Vertical optimization: Backup time is, among other parameters, proportional to the number of records you are retrieving. Partial backup is also a type of vertical optimization.
  • Horizontal optimization: Backup time is, among other parameters, proportional to the number and types of columns you are retrieving. 
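
A small sketch of what the two optimizations look like as SOQL: vertical optimization trims rows, horizontal optimization trims columns. The Account object and the specific fields are just example choices.

```python
# Full extract: every row and many columns - the slowest backup.
full_backup = "SELECT Id, Name, Industry, Description, BillingCity FROM Account"

# Vertical optimization (fewer rows): a partial/incremental backup, e.g. only
# records modified since the last successful run.
incremental_backup = (
    "SELECT Id, Name, Industry, Description, BillingCity FROM Account "
    "WHERE SystemModstamp > 2019-10-01T00:00:00Z"
)

# Horizontal optimization (fewer columns): retrieve only the fields you
# actually need to restore or report on.
narrow_backup = "SELECT Id, Name, Industry FROM Account"

for label, q in [("full", full_backup), ("vertical", incremental_backup), ("horizontal", narrow_backup)]:
    print(label, "->", q)
```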

Sandboxes

  • A Full sandbox copies the production org, but using it as a DRP (Disaster Recovery Plan) is not recommended, neither as an alternative production environment (because the related infrastructure is not meant for production usage) nor as a backup (because there’s no guarantee of data integrity of the copy, and the copy is not point-in-time).
  • A full org copy should also not be used as a substitute for Salesforce Org Sync.

Salesforce Developer Limits

  • Formulas: maximum length 3,900 characters
  • Lightning pages: maximum components in a region - 25
  • Master-detail relationship: maximum child records per master record - 10,000
  • Recycle Bin: maximum records:  25 times your MB storage capacity as records. For example, an org with a storage allocation of 2,000MB (2GB) can have 50,000 records in the Recycle Bin: 25 x 2,000 = 50,000 records.
  • Each flow can have up to 50 versions and 2,000 steps 
  • Each org can have up to 500 active flows, 1,000 flows total, 30,000 waiting interviews at a given time, 1,000 events processed per hour and 20,000 defined relative alarm events across all flows and flow versions
  • If a file exceeds the maximum size, the text within the file isn't searched
  • PDF .pdf 25 MB and Word .doc, .docx, .docm 25 MB
  • The maximum number of times a file can be shared is 10
  • File storage and data storage are calculated asynchronously, so if you import or add a large number of records or files, the change in your org’s storage usage isn’t reflected immediately.
  • On-Demand Email-to-Case: Number of user licenses multiplied by 1,000; maximum 1,000,000
  • Per-transaction Apex limits: total SOQL queries issued - 100; total records retrieved by SOQL queries - 50,000; total DML statements issued - 150; total records processed as a result of DML statements - 10,000.
  • Process Builder limits: total characters in a process name - 255; total characters in a process’s API name - 77; total versions of a process - 50; total criteria nodes in a process - 200.
  • The daily limit for emails sent through email alerts is 1,000 per standard Salesforce license per org. The overall org limit is 2,000,000. 
  • These limits count for each Apex transaction. For Batch Apex, these limits are reset for each execution of a batch of records in the execute method.
  • This limit doesn’t apply to custom metadata types. In a single Apex transaction, custom metadata records can have unlimited SOQL queries
  • API Calls : Total Calls Per 24-Hour Period : up to a maximum of 1,000,000
  • You can submit up to 10,000 batches per rolling 24-hour period.
  • A batch can contain a maximum of 10,000 records (see the batching sketch after this list).
  • Batches are processed in chunks. The chunk size depends on the API version: in API version 20.0 and earlier, the chunk size is 100 records; in API version 21.0 and later, it is 200 records.
  • There’s no limit on sending individual emails to contacts, leads, person accounts, and users in your org directly from account, contact, lead, opportunity, case, campaign, or custom object pages.
  • Using the API or Apex, you can send single emails to a maximum of 5,000 external email addresses per day
  • Change sets: Inbound and outbound change sets can have up to 10,000 files of metadata.
  • In each specified relationship, no more than five levels can be specified in a child-to-parent relationship. For example, Contact.Account.Owner.FirstName (three levels).
  • Big objects: SOQL against big objects doesn’t support the !=, LIKE, NOT IN, EXCLUDES, and INCLUDES operators.
  • Visualforce Limit: Maximum rows retrieved by queries for a single Visualforce page request 50,000
  • Maximum records that can be handled by StandardSetController 10,000
  • Maximum collection items that can be iterated in an iteration component such as <apex:pageBlockTable> or <apex:repeat> - 1,000
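
Since a Bulk API batch can hold at most 10,000 records and you can submit at most 10,000 batches per rolling 24 hours, a loader has to chunk its input. Here's a minimal Python sketch of that chunking; submit_batch and the job id are hypothetical placeholders rather than real API calls.

```python
MAX_RECORDS_PER_BATCH = 10_000   # a batch can contain at most 10,000 records
MAX_BATCHES_PER_DAY = 10_000     # rolling 24-hour limit on submitted batches

def chunk(records, size=MAX_RECORDS_PER_BATCH):
    """Split the record list into batch-sized slices."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def submit_batch(job_id, batch_records):
    # Placeholder: a real loader would POST these records (e.g. as CSV) to the
    # Bulk API job's batch resource with a session id header.
    print(f"submitting {len(batch_records)} records to job {job_id}")

records = [{"Name": f"Account {i}"} for i in range(25_000)]   # fake data
batches = list(chunk(records))
assert len(batches) <= MAX_BATCHES_PER_DAY
for batch in batches:
    submit_batch("750xx0000000001AAA", batch)   # placeholder job id
```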

Useful Links:

Middleware Terms and Definitions

Integration Patterns Overview

Wednesday 11 September 2019

Salesforce Einstein Analytics Capabilities

KEY TAKEAWAYS:

Einstein Analytics Product (Wave + Prediction Builder + Discovery):


Einstein Analytics in its simplest form is a combination of Wave Analytics, Einstein Prediction Builder, and Einstein Discovery.

Wave Capability - Pull data from different sources. The only change is that now we can join different tables without using code. Earlier, we used to write SAQL (Salesforce Analytics Query Language) to do inner joins.

Prediction Builder Capability - It predicts the trend as a number (what’s the likelihood, in percentage, of something happening), same as Einstein Prediction Builder. The key difference is that Einstein Prediction Builder has a limitation that we can’t do cross-object predictions; this limitation is gone here because the source of the data is the Wave engine.

Discovery Capability - It tells the narrative (story) of the data, same as Einstein Discovery; the only difference, again, is that the data behind the scenes can be enriched as it’s coming from the Wave engine.

Einstein Analytics Flow:

·       Data finds the customer  (Data coming from different sources)
·       Salesforce adds narrative to that data (Add story around that data)
·       Predict what’s happening  (What happened in the past)
·       Predict what’s likely to happen (What’s going to happen in future, based on trends)
·       Do actions based on insights (Perform actions to change insights in your favour)


Einstein Analytics Assets:

·       App - It’s a collection of dashboards.
·       Dashboard - It’s a collection of lenses (reports).
·       Lens - It’s a report, which makes use of a target dataset.
·       Target Dataset - It’s a dataset which can be used to create lenses. It can be created using multiple recipes.
·       Recipe - It’s simply a saved set of transformations, or steps, that you want to perform on a specific source dataset.
·       Source Dataset - Raw dataset coming from a different source, e.g. Oracle ERP, NetSuite, etc.
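
These assets can also be listed programmatically via the Analytics REST API mentioned earlier. Here's a minimal Python sketch; the instance URL and access token are placeholders, and the exact response field names should be checked against the wave endpoint documentation for your API version.

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<access token from your OAuth flow>"      # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List Einstein Analytics datasets and dashboards visible to the current user.
datasets = requests.get(
    f"{INSTANCE_URL}/services/data/v47.0/wave/datasets", headers=HEADERS
).json()
dashboards = requests.get(
    f"{INSTANCE_URL}/services/data/v47.0/wave/dashboards", headers=HEADERS
).json()

for ds in datasets.get("datasets", []):
    print("dataset :", ds["name"])
for db in dashboards.get("dashboards", []):
    print("dashboard:", db["label"])
```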

Frequency of data refresh:

You can set the frequency of your data to be refreshed on the following basis:

·       Time Based (Minimum is hourly)
·       Action Based (Refresh data based on any action)

Einstein Analytics License Structure

License Type            | Storage        | Artificial Intelligence | Business Intelligence
Einstein Analytics Plus | 1 Billion Rows | Yes                     | Yes
Growth                  | 1 Million Rows | No                      | Yes
Predictions             | N/A            | Yes                     | No

Random Points:


·       The minimum amount of data to do analysis should be 5,000 records.
·       Not all use cases are achievable using Einstein Analytics. As of now, it does predictions only as numbers, e.g. the percentage likelihood of something happening.
·       As of now, we can’t change the machine learning algorithm under the hood, but in future we might be able to plug in our own ML algorithm if we need to.
·       We can launch any action from a dashboard which we’ve defined in the platform, e.g. triggers, quick actions, etc.
·       You can define data cleaning rules and reuse them to enrich data periodically.
·       Einstein Analytics runs on a separate cluster from the platform; it copies the data from different sources onto its own cluster to store it as a big flat file. We can add conditions on copying the data if we don’t want to copy over everything.

Wednesday 17 July 2019

Lightning Experience Migration tools

Here's the list of tools which you can use to facilitate the Lightning migration journey:


Key feature: Link to all the tools for all checks in one place


Key feature: We can customise the Welcome screen in Lightning Experience as per customer requirements


Key feature: Conversion of Classic-style documents into Lightning-friendly docs


Key feature: Conversion of Classic buttons/links into Lightning Actions


Key feature: Report to improve the implementation, e.g. highlights unused components.


Key feature: Track adoption and usage of Lightning Experience


Key feature: Help in change management from Classic to Lightning


Key feature: Calculate increased revenue and reduced costs from switching to Lightning

Useful Link: Lightning Experience Transition Tools here