Xactly Connect Domain Objects

Xactly Connect actions are performed using what are called “domain” objects. These are objects that encapsulate information describing your Connect environment and help you take actions such as invoking workflows.

The supported domain objects (assets) include: Schedules, Pipelines, Steps, Emails, Variables, Files, and Iterators.

Steps

A step is the smallest self-contained unit of processing that can be referenced and invoked by itself. You can think of a step as an envelope for any xCL or xSQL command. Even though you can invoke a step in isolation, a more prevalent use case is including a step as one member of a larger processing unit called a pipeline. A step has a name and includes an xCL command or xSQL query, as well as other metadata. When you invoke a step, it can change the state of the system, and it will generate detailed information about its processing as an invocation record.

In a typical development process, you would initially test some xSQL queries or xCL commands through the ODBC or JDBC driver in a sandbox or implementation environment. This enables some experimentation to make sure your data integration and manipulation queries will yield the intended results end to end. Once you have verified some of the queries, you will want to put them together in a workflow so they can be invoked in the right order and with the proper intermediate data being created. To do this in Connect, you wrap each command or query in its own step object. Unlike an ad-hoc query or command that you may type manually in a database viewer application, a step can be run in the background (asynchronous invocation), included as part of another object (e.g. a pipeline), and shared among many processes.

Besides the command or query, you need to provide a unique name for each step. Optionally, a step can have a description and labels (tags), and it can be disabled. When you invoke a step, it runs in the background (asynchronously). You will receive an invocation ID through which you can request the status of the step invocation (whether it has finished running, and whether it succeeded or failed). You can also search all steps for the ones in which you’re interested. For example, you can search for steps that have a certain description or label, or steps that were created in a certain time frame.

Variables

A variable is a referenceable container for any scalar value (text, number, boolean, or date) or an xCL command or xSQL query that can be evaluated later. The main use case for a variable is to encapsulate values or expressions that can be reused in other contexts, such as inside a step or within an xSQL query. A variable has a name, description, and value, as well as other metadata.
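The substitution behavior can be illustrated with a small sketch. Note that the `${name}` reference syntax below is an assumption chosen for illustration; check the Connect documentation for the actual variable-reference syntax used in xSQL and xCL.

```python
import re

def substitute_variables(text, variables):
    """Replace ${name} references in a query string with variable values.

    The ${name} syntax is a hypothetical placeholder convention used only
    to illustrate how a variable's value is injected where it is referenced.
    """
    def repl(match):
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"undefined variable: {name}")
        return str(variables[name])
    return re.sub(r"\$\{(\w+)\}", repl, text)

query = "SELECT * FROM payouts WHERE rate >= ${commissionRate}"
resolved = substitute_variables(query, {"commissionRate": 0.05})
print(resolved)  # SELECT * FROM payouts WHERE rate >= 0.05
```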

Pipelines

A pipeline is a multi-stage processing unit that can partially or fully describe a workflow. You can group a collection of already-created steps (and/or pipelines) into a pipeline which, upon invocation, will invoke each of its members (the steps and pipelines inside of it) in the specified sequence. A pipeline has a name, a list of members, and other metadata.

When you invoke a pipeline, it will generate detailed information about its processing, as well as that of each of its members, as an invocation record. At the time of invocation you can optionally provide a list of variables and their values. This overrides the values of those variables for the duration of the pipeline invocation: wherever those variables are referenced during the invocation, the values provided will be used. Additionally, a pipeline invocation can have optional success and failure callbacks called webhooks. Webhooks are REST APIs that are called when the pipeline invocation completes with either success or failure.
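An invocation request carrying variable overrides and webhooks could be assembled as a payload along these lines. The field names here are illustrative assumptions, not the exact Connect request schema.

```python
def build_pipeline_invocation(pipeline_name, variables=None,
                              on_success=None, on_failure=None):
    """Assemble an illustrative pipeline-invocation request body."""
    payload = {"pipeline": pipeline_name}
    if variables:
        # Overrides apply only for the duration of this invocation.
        payload["variables"] = [{"name": k, "value": v}
                                for k, v in variables.items()]
    webhooks = {}
    if on_success:
        webhooks["success"] = on_success   # REST endpoint called on success
    if on_failure:
        webhooks["failure"] = on_failure   # REST endpoint called on failure
    if webhooks:
        payload["webhooks"] = webhooks
    return payload

req = build_pipeline_invocation(
    "monthlyPayout",
    variables={"periodName": "2024-06"},
    on_failure="https://example.com/hooks/payout-failed",
)
```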

Invocations

Invocations are current and historical status records of every step, pipeline, or command that was executed or is currently running in your Connect environment. An invocation record includes details about the object that was invoked, whether the invocation is running or has finished, and whether it was successful or, in the case of a failure, why it failed. If the step or pipeline was invoked by a schedule, then information about the schedule is also included in the invocation record. In the case of a pipeline invocation, details about each individual pipeline member’s invocation are captured in an array of invocation detail objects. Invocation records include many more details, and they are by nature read-only.

When you invoke (run) a step or pipeline, instead of the final results the response contains an invocation ID. This is your handle for requesting information about the invocation, which may or may not complete quickly. To find out what happened with the invocation, use the invocation ID to retrieve the metadata about the run. You can also search all invocations (past and present) for the ones you are looking for. For example, you can search for invocations that were created as part of a specific schedule or that happened during a specific time frame.
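The search semantics can be illustrated with a small local filter over invocation records; in practice you would pass equivalent criteria (schedule, time frame) to the Connect search API rather than filter client-side.

```python
from datetime import datetime

def search_invocations(invocations, schedule=None, start=None, end=None):
    """Filter invocation records by originating schedule and creation time.

    A local illustration of the search criteria only; the record fields
    ("schedule", "created") are assumed names, not the Connect schema.
    """
    results = []
    for inv in invocations:
        if schedule is not None and inv.get("schedule") != schedule:
            continue
        created = inv["created"]
        if start is not None and created < start:
            continue
        if end is not None and created > end:
            continue
        results.append(inv)
    return results

records = [
    {"id": "inv-1", "schedule": "nightly", "created": datetime(2024, 6, 1, 2, 0)},
    {"id": "inv-2", "schedule": None,      "created": datetime(2024, 6, 1, 9, 0)},
    {"id": "inv-3", "schedule": "nightly", "created": datetime(2024, 6, 2, 2, 0)},
]
hits = search_invocations(records, schedule="nightly", start=datetime(2024, 6, 2))
print([r["id"] for r in hits])  # ['inv-3']
```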

Iterators

An iterator is a very simple looping mechanism for invoking a single step or pipeline multiple times in a row. Other than the usual name and description attributes, when creating an iterator you specify a particular step or pipeline to be invoked. In addition, you specify an xCL command that produces a result set of rows and columns. The number of rows in the result set determines exactly how many times the step or pipeline will be invoked. Furthermore, the column names in each row are interpreted as names of variables that may be used in the step or pipeline. For example, if the iterator’s command returns the following 2-row result set:

myNumber    myString
2           monday
4           wednesday

Then in the first iteration the step will be invoked with variable myNumber initialized to 2 and variable myString initialized to “monday”. In the second iteration, the step will be invoked with variable myNumber initialized to 4 and variable myString initialized to “wednesday”.
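The row-to-variable mapping above can be sketched as follows: each row becomes one invocation, with the row's columns bound as that invocation's variables. This is a local illustration of the behavior, not the Connect implementation.

```python
def run_iterator(rows, invoke):
    """Invoke a step/pipeline once per result-set row.

    rows:   list of dicts, one per row (column name -> value)
    invoke: callable standing in for the step/pipeline invocation;
            column names become variable names for each iteration.
    """
    results = []
    for row in rows:
        results.append(invoke(variables=dict(row)))
    return results

# The 2-row result set from the example above:
result_set = [
    {"myNumber": 2, "myString": "monday"},
    {"myNumber": 4, "myString": "wednesday"},
]

invocations = run_iterator(
    result_set,
    invoke=lambda variables: f"step ran with {variables['myNumber']}/{variables['myString']}",
)
print(invocations)  # ['step ran with 2/monday', 'step ran with 4/wednesday']
```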

Schedules

A schedule is a processing container for a single step or pipeline, controlled by a timer. A schedule has a name, metadata, and a cron (timer) expression that governs when and how often the step or pipeline inside of it will be invoked. A schedule has many options, including conditional invocation, where the user can specify a condition, a number of retries, a retry interval, and an on-condition-false clause. Additionally, a schedule can be suspended if you do not want it to run. Once the schedule fires, you can determine whether the step or pipeline within was successfully executed by looking at that object’s invocation record.

The level of granularity for a schedule definition is an hour, and the system helps randomize the minutes expression (in multiples of 10). If for any reason a schedule misfires, the next time it gets invoked is the next fire time per the schedule definition. One reason a schedule may misfire is an overlap, i.e. a previous invocation of the schedule is still in progress at fire time.
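The minute randomization described above could look like this sketch: given an hour-level schedule, a minute is picked in multiples of 10 to spread load across tenants. The five-field cron layout here is the common standard form; Connect's exact expression format may differ.

```python
import random

def hourly_cron(hour, rng=random):
    """Build a daily cron expression for the given hour, with the minute
    randomized in multiples of 10 (0, 10, ..., 50), as a load-spreading
    illustration of the behavior described in the text."""
    if not 0 <= hour <= 23:
        raise ValueError("hour must be 0-23")
    minute = rng.choice(range(0, 60, 10))
    # minute hour day-of-month month day-of-week
    return f"{minute} {hour} * * *"

expr = hourly_cron(3, rng=random.Random(42))
print(expr)
```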

Files

A file represents custom data files similar to the ones you might upload to or download from your Xactly Secure FTP directory. Other than importing and uploading data directly from/to Incent tables, files are the main way to get your data into and out of your Connect environment for consumption by external programs or systems. The main use case for files is to upload your custom data or to download processed data to your computer. With the file service you can see a listing of all the files in your directory hierarchy, and you can upload and download individual files.

Credentials

A credential is an object that contains the username and password (or other information) needed to run commands directly against “outside” systems like Salesforce, or when communicating directly with Incent. For example, when uploading data into Incent tables, Connect needs to provide valid Incent credentials to do so. The credential object is how you specify and update this authentication information. You also have the option of testing a credential to confirm it is valid. If your Incent password policy invalidates/expires these credentials after a specific time duration, remember to update the details in Connect. You can create a credential, give it a unique name and a type (Incent, Salesforce, etc.), and later update the credential with new information if necessary.

Emails

You can send notifications to interested parties through emails. An email object includes a list of recipients, a subject, a body, and other metadata. You use an explicit command to send an email, referring to it by name or ID. The service will then compose an email based on the object’s configuration (subject, body, etc.) and send it to everyone in the recipient list. Because emails are sent to outside servers, Connect cannot guarantee that an email will be received, but information about the sending of the email is kept in the audits.

You can also search all emails for specific ones in which you’re interested. For example, you can search for all emails that are to be sent to a certain email address, or all emails that were created in a specific time frame.

Audits

Audits are detailed records of every interaction with your Connect environment. Every time a person or program interacts with Connect services (object creation and manipulation, explicit invocation, running ad-hoc queries, etc.), an audit record is kept with all the details. Each audit record is created once and is by nature read-only. The audit is not meant to serve as a system of record for tracking system changes (e.g. who deleted this pipeline?); instead it tracks system activity (e.g. how many failed login attempts occurred in the last week).

Sessions

Every time you authenticate and obtain a new OAuth token, a session object is created to represent the context of your interactions with Connect. To provide more visibility and control, Connect provides a service to view the list of all sessions currently active for your business, as well as the ability to delete (log out) any of the active sessions. Use extreme caution before deleting a session, because it can potentially log out another user or process. The session delete API is only provided as a way to help you in case you have reached the maximum limit on the number of active sessions allowed. A session is allowed 30 minutes of inactivity before it expires.

Snapshots

Your Connect environment is a collection of “domain” objects like pipelines, steps, schedules, etc. As the name suggests, taking a snapshot is a way for you to capture and save the current state of those objects at a moment in time. It includes the current definitions of all the objects described above, but not information about invocations and audits. It also does not include any information or data from the custom tables found in your environment. Snapshots are sometimes used in conjunction with Deployments.

Deployments

In a typical development cycle, you would design and experiment with your Connect workflows in a sandbox or implementation runtime environment, making your changes and tweaks, before you are comfortable running the process in a production environment. A deployment is the process and data associated with migrating all your “domain” objects from one Connect runtime environment to another. Think of this as taking a snapshot of your whole environment (or part of your environment) and then copying that snapshot to a different environment to replace what’s there. As with snapshots, this does not include any data from custom tables or Xactly tables. Deployment objects capture information about this migration. If the deployment is unsuccessful, the target environment will remain untouched. When initiating a deployment, you will supply valid Connect credentials to prove you have access to the target environment. If you are performing partial deployments, you will use a Deployment Container object to do so (see below).

Deployment Containers

As stated above, the default behavior of a deployment is to replace all domain objects (assets) in the target system. However, in some cases, to limit the impact of a deployment, you may want to restrict it to a small set of hand-picked objects. You can use Deployment Container objects to assist with this type of “partial deployment”. You create a container and place assets into it (steps, schedules, pipelines, emails, variables, etc.). You can add or remove objects until it is configured for your environment. At that point you can use a variation of the deployment service to deploy using the container. Only the objects in the container will be copied to the target system. You can also control whether or not to replace objects that already exist in the target environment.
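Assembling a container for a partial deployment might look like the following sketch. This is an illustrative object model, not the actual Connect API; the class and method names are assumptions.

```python
class DeploymentContainer:
    """Illustrative container of hand-picked assets for a partial deployment."""

    def __init__(self, name, replace_existing=False):
        self.name = name
        self.replace_existing = replace_existing  # overwrite objects already in target?
        self._assets = set()                      # (asset_type, asset_name) pairs

    def add(self, asset_type, asset_name):
        self._assets.add((asset_type, asset_name))

    def remove(self, asset_type, asset_name):
        self._assets.discard((asset_type, asset_name))

    def manifest(self):
        """List the assets that would be copied to the target environment."""
        return sorted(self._assets)

container = DeploymentContainer("june-hotfix", replace_existing=True)
container.add("pipeline", "monthlyPayout")
container.add("step", "loadOrders")
container.add("step", "tempStep")
container.remove("step", "tempStep")   # adjust until configured as needed
print(container.manifest())  # [('pipeline', 'monthlyPayout'), ('step', 'loadOrders')]
```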