This article was written by Teja Bhutada from Exalate.
Businesses today generate a jaw-dropping amount of data every single year. This data is being stored and processed by numerous applications running in the business ecosystem, which in turn are used by various teams like marketing, sales, development, and so on.
If the data scattered across these different systems were brought together and collated, it would surface strategically vital information for everyone in the organization. But the major hindrance in this arrangement is ensuring the data remains secure, safe, and untampered with.
That is exactly why, in this article, we look at how a decentralized approach to data integration can increase its security.
But before that, let’s see what data integration is and why you need it in the first place.
Data Integration and Its Benefits
Data integration can sound scary, but it just boils down to a simple concept: bringing together data from different source systems and integrating it to give a unified view to whoever wants access to it.
As such, it is an integral part of B2B integration. The reason I’m clearing the air here is that, at the core, integrating data is about achieving common business goals by automating critical business processes to help the business grow. Ultimately, this is what every business banks on, right?
Overall, data integration is about untampered, filtered, accurate data at your disposal when you want it, displayed within your own application.
Without it, data sits in siloed environments for days on end without you even realizing it could be beneficial to other teams. And if others eventually need access to that data, it happens through spreadsheets, emails, and phone calls.
Data integration lets you replace these manual ways of working with an automated process, leading to fewer errors and less friction between teams.
In addition to the discussion above, it can benefit your business in the following ways:
- It reduces the complexity of managing data in separate applications.
- It makes the data available in a timely manner.
- It strengthens collaboration and interaction between teams, helping them build better customer relations and deliver tangible and intangible business benefits.
- You no longer have to deal with “trial and error” in your business decisions: an integrated view of data in the hands of the right person means smarter business decisions and transparent business processes.
Data integration is no longer confined within strict boundaries but is spreading across multiple organizations. In this context, we distinguish integration within a single company (intra-company integration) from integration between multiple companies (cross-company integration).
The reason behind this is simple. The intricacies and details of an integration setup involving multiple companies are slightly different from those of an integration within a single company. This is especially true regarding the security of the data being shared.
So let me help you draw a picture of how exactly security plays out in a cross-company integration scenario.
How to Ensure Security in an Integration Setup
Security needs to be ingrained into data integration. That’s pretty evident from the increase in security threats and breaches over the years. And especially in these trying times, where remote working has become the norm, it is all the more imperative to address the security landscape in a firm manner.
At a basic level, securing your data within an integration environment means preventing unwanted attacks (like man-in-the-middle attacks), blocking unauthorized access to shared data, ensuring that tampered-with or wrong data is not shared, and making the data conform to regulatory requirements.
True, data integration within a single company poses security risks, but these risks are amplified in a cross-company setup and need to be addressed delicately.
For this, the general approach integration tools take to secure data is to:
- use tokens as substitutes that map back to sensitive information. Tokens are random data strings that have no meaning on their own; unlike encryption keys, they are not calculated from the original data through a mathematical formula or algorithm.
So it is difficult to decipher or reverse them. No key or method can recover the original data by finding a relationship between the data and the token, because that relationship is stored in an external data store called a token vault, where the original data lies.
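The token-vault idea can be sketched in a few lines of Python. Everything here (the in-memory dictionary standing in for the vault, the function names) is a simplified illustration, not a production tokenization system:

```python
import secrets

# The token vault: the ONLY place where tokens map back to the original data.
# In practice this would be a hardened, access-controlled external data store.
token_vault = {}

def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with a random token that has no
    mathematical relationship to the original data."""
    token = secrets.token_hex(16)  # random string, not derived from the value
    token_vault[token] = sensitive_value
    return token

def detokenize(token: str) -> str:
    """Only someone with access to the vault can map a token back."""
    return token_vault[token]

card = "4111 1111 1111 1111"
token = tokenize(card)
assert token != card              # the token reveals nothing by itself
assert detokenize(token) == card  # the vault restores the original
```

Because the token is pure randomness, stealing tokens in transit gives an attacker nothing; compromising the data requires compromising the vault itself.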
- mask the data, which means converting it into a format that is of no use to intruders but is still useful to the authorized user.
It means that the data is obscured through either of the following ways:
- Substituting the existing data with another authentic-looking value.
- Shuffling the data, either randomly or according to some set formula.
- Deleting or varying some part of the data.
- Masking or scrambling certain fields in the data. For instance, hiding the account or credit card number behind a series of “XXXX”.
- Or some other advanced rules.
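As a concrete illustration of the field-masking technique above, here is a small Python sketch that hides all but the last four digits of a card number behind “X” characters; the function name and exact behavior are hypothetical:

```python
def mask_card_number(card_number: str, visible: int = 4) -> str:
    """Replace all but the last `visible` digits with 'X',
    preserving separators such as spaces or dashes."""
    digits = [c for c in card_number if c.isdigit()]
    keep_from = len(digits) - visible
    masked, seen = [], 0
    for c in card_number:
        if c.isdigit():
            masked.append(c if seen >= keep_from else "X")
            seen += 1
        else:
            masked.append(c)  # keep separators for readability
    return "".join(masked)

print(mask_card_number("4111 1111 1111 1234"))  # XXXX XXXX XXXX 1234
```

The masked value still lets a support agent confirm “the card ending in 1234” without ever seeing the full number.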
- encrypt the data when it is being exchanged. Encryption is the process of converting plain text into cipher text at the sender’s end; the cipher text received at the destination is decrypted back into plain text. Encryption and decryption are done with the help of secret keys: the source and the destination either use a shared secret key or have a public-private key pair. These keys are fed into cryptographic algorithms that scramble the data and unscramble it back into the desired plain-text format.
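To make the shared-secret variant concrete, here is a deliberately simplified Python sketch: it derives a keystream from the shared key and XORs it with the message. This is a toy construction for illustration only; real systems should use a vetted cipher (e.g. AES-GCM) from an established library:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the shared key.
    Toy construction for illustration only -- NOT secure for production."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt(plain: bytes, key: bytes) -> bytes:
    # XOR the plain text with the keystream -> cipher text
    return bytes(p ^ k for p, k in zip(plain, keystream(key, len(plain))))

def decrypt(cipher: bytes, key: bytes) -> bytes:
    # XOR is symmetric: the same operation recovers the plain text
    return encrypt(cipher, key)

shared_key = secrets.token_bytes(32)  # secret known to both ends
message = b"invoice #1042: amount due 300 EUR"
cipher = encrypt(message, shared_key)
assert cipher != message                       # unreadable in transit
assert decrypt(cipher, shared_key) == message  # readable at the destination
```

Without the shared key, the cipher text is meaningless to anyone intercepting the exchange.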
While these are surely correct ways to handle security issues, an additional measure on top of them is for the integration to be inherently decentralized.
We will see what it means ahead.
What Is Decentralized Integration (Autonomy)?
In simple terms, decentralized integration means that both integration parties have complete, independent control over what information is sent and received. The word “independent” (or “autonomous”) deserves stress here, since most integration tools take a centralized approach.
Decentralized integration can be implemented through specialized processors that filter and send (or receive) only required information between applications; unwanted information is simply not sent (or received).
The way decentralized integration works is for the tool to support a distributed architecture. Once the components, i.e., the integration application, are installed on both systems, every company assigns an agent who is responsible for controlling what information is sent and received from the application they are assigned to. This allows the integration to be controlled by each application independently. There is no centralized interface or controller that needs to be configured for every integration request.
As an obvious outcome of the above point, the systems within such an integration setup are loosely coupled.
So, you no longer need an integration tool sitting in the driver’s seat and controlling all operations from a single interface. This acknowledges the fact that integrations evolve over time. Tight coupling between systems leads to unnecessary dependencies and increased overheads.
To name a few:
- Generally, tight coupling is a “win-or-bust” kind of proposition. The interfaces are all related to one another in functionality or structure, increasing the dependency between them. So if one component fails, it is likely to create a ripple effect, failing the dependent ones too and eventually risking an entire system failure. This is unlike loosely coupled systems, where single points of failure are inherently avoided thanks to reduced dependencies.
- With tightly coupled systems, all business partners involved must work in tandem to design, develop, and even maintain the systems. In essence, this requires more resources and thus incurs higher costs. Loosely coupled systems, on the other hand, provide a sandbox environment for every component, so changes that don’t adversely affect the other components involved can be executed smoothly. This in turn reduces the associated costs and resources.
- Tightly coupled systems are inflexible and difficult to scale. Because of the inherent dependencies between components, new integrations or requirements are hard to implement in a timely manner: care must be taken to apply changes carefully to all the dependent components, increasing the overhead.
Loosely coupled systems, on the other hand, are easy to maintain, flexible, and scalable, because growing simply boils down to adding another component or interface to the existing setup and granting it the necessary permissions, access mechanisms, or APIs.
How Can Decentralized Integration Benefit Your Data Sync?
Decentralized integration can benefit in the following ways:
- Both systems will retain the information they don’t want to send to the other side and would choose how to deal with the information coming over from the other side. This will help avoid unauthorized access to information that should not be shared.
- Another benefit of this approach is not having to worry about the admin on the other side messing up your sync. Once the integration is set up, you alone decide what is sent and received, without even consulting the other side. So you get complete control over your sync.
- Loosely coupled systems in a distributed architecture lead to a robust and resilient integration that is less vulnerable to downtime and system failures. Such systems also ensure that the local configuration of your integration can evolve without the need to inform the other side about the change.
So how do you find a tool that supports all this and much more?
Well, we have made it simple for you.
Exalate: a Decentralized Cross-Company Integration Solution
Exalate is a bi-directional decentralized integration solution that helps you connect your work across multiple applications like Jira, GitHub, ServiceNow, Salesforce, Azure DevOps, Zendesk, GitHub Enterprise, HP ALM, and more.
For instance, if you are looking for Jira integrations to integrate your Jira with other applications, then Exalate can help you achieve it with the least amount of fuss.
Note: You can also have a look at the different integrations it supports.
It ensures security through encrypted data exchange, the HTTPS protocol, a JWT-based token mechanism, and so on.
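A JWT-based token mechanism, in essence, means exchanging signed tokens that the receiving side can verify. As a rough illustration of the idea (not Exalate’s actual implementation), here is a minimal HS256 sign-and-verify sketch using only the Python standard library:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """JWT uses unpadded base64url encoding."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build an HS256 JSON Web Token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

secret = b"shared-secret"               # hypothetical shared secret
token = sign_jwt({"node": "blue-tracker"}, secret)
assert verify_jwt(token, secret)        # valid signature accepted
assert not verify_jwt(token, b"wrong")  # forged or tampered token rejected
```

Any change to the payload invalidates the signature, so a receiver can trust that an accepted message really came from the holder of the secret.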
But its unique offering remains: security and flexibility through decentralized integration.
This is achieved in the following ways.
Exalate maintains decentralized integration through a distributed architecture, which means that both integrating applications have their own Exalate node (i.e., an instance of an Exalate application) that translates any information passed between the two systems.
This scenario is well-depicted in the diagram below.
As seen, there are two Exalate nodes, one per tracker. Trackers here are the applications on which Exalate has been installed, for example, Jira, Salesforce, ServiceNow, etc.
Assume a synchronization from the Blue tracker to the Red tracker. First, the information is sent to the Blue tracker’s Exalate app, where it is composed into a secure message. The message is sent over to the Red tracker’s Exalate app, which then applies the information in the message locally on the Red tracker. After that, an acknowledgment message is sent back to the Blue tracker.
Such a distributed architecture ensures that the agents at both ends control access to the Exalate app and also the configuration requirements of the integration independently.
Incoming and Outgoing Sync Processors
The information synced using the above process has to go through the incoming and outgoing sync processors, present at both ends.
Suppose information needs to be sent over from the Blue tracker to the Red one, then first, the Exalate app’s outgoing processor filters information to be sent over based on the Sync rules on the Blue tracker. This filtered information reaches the incoming processor on the Red tracker’s Exalate app. Here, the incoming processor applies the information locally based on the Sync rules provided on the Red tracker.
These processors can be managed on the Exalate UI with the help of Sync rules.
Exalate uses an intuitive scripting engine that has Groovy scripts present at both ends. These rules allow users at both ends to set up advanced configurations specifying how the sync should behave in different situations.
This makes Exalate flexible enough to handle complex or advanced integration cases. The Sync Rules on either side of the connection define what information should be exchanged, while the outgoing and incoming sync processors enforce them, analyzing these scripts to decide what is actually sent and received.
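The division of labor between the two processors can be sketched as follows. This is a simplified Python illustration of the filtering idea; Exalate’s real sync rules are Groovy scripts, and all field names below are hypothetical:

```python
# Each side declares its own rules, independently of the other side.
OUTGOING_FIELDS = {"summary", "status"}  # Blue tracker decides what may leave
INCOMING_FIELDS = {"summary"}            # Red tracker decides what it accepts

def outgoing_processor(issue: dict) -> dict:
    """Compose the message: only fields the sender allows are sent."""
    return {k: v for k, v in issue.items() if k in OUTGOING_FIELDS}

def incoming_processor(local_issue: dict, message: dict) -> dict:
    """Apply the message locally: only fields the receiver accepts are used."""
    updated = dict(local_issue)
    updated.update({k: v for k, v in message.items() if k in INCOMING_FIELDS})
    return updated

blue_issue = {"summary": "Fix login bug", "status": "Open",
              "internal_notes": "confidential"}
message = outgoing_processor(blue_issue)  # 'internal_notes' never leaves
red_issue = incoming_processor({"summary": "", "status": "New"}, message)

assert "internal_notes" not in message
assert red_issue == {"summary": "Fix login bug", "status": "New"}
```

Note that neither side needs the other’s permission: the sender’s rules filter what goes out, and the receiver’s rules filter what comes in.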
For the synchronization process, Exalate uses sync queues at both ends to make the whole process asynchronous. This is essential for reliable data integration. Every sync event passes through a sync queue and is applied in the same order in which it was initiated on the other side, ensuring that all changes are applied correctly even in case of downtime or system failure. In addition, an integrated retry mechanism helps recover from failures, allowing the sync to resume from the point of interruption.
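An ordered queue with a retry mechanism can be sketched like this. It is a simplified Python illustration, not Exalate’s actual implementation: a transient failure is retried, and the queue only advances after a successful apply, so ordering is preserved:

```python
from collections import deque

def process_sync_queue(queue: deque, apply_event, max_retries: int = 3) -> list:
    """Apply queued sync events strictly in order; retry a failing event
    before giving up, so the sync can resume after transient failures."""
    applied = []
    while queue:
        event = queue[0]                 # peek: never skip ahead in the order
        for attempt in range(max_retries):
            try:
                apply_event(event)
                applied.append(event)
                queue.popleft()          # dequeue only after success
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    return applied       # stop here; resume later from this event
    return applied

# Simulate a receiver that fails once on the second event (transient outage).
failures = {"update-2": 1}
def apply_event(event):
    if failures.get(event, 0) > 0:
        failures[event] -= 1
        raise ConnectionError("receiver temporarily unreachable")

events = deque(["create-1", "update-2", "update-3"])
result = process_sync_queue(events, apply_event)
assert result == ["create-1", "update-2", "update-3"]  # order preserved despite failure
```

Because the failed event stays at the head of the queue, later events can never overtake it, which is what keeps both sides consistent after an outage.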
So now is the time to explore it as an integration tool, secure and flexible enough to handle even the most complex and advanced integration cases.
To conclude, the need for security is evident in any data integration process. There are, of course, conventional approaches to deal with it. But a more robust approach is to adopt a tool that supports decentralized integration, increasing the security of your integration.
We discussed how keeping the integrating systems distributed and loosely coupled increases the security, reliability, maintainability, and scalability of your integration. And we pointed out a tool offering such a capability: Exalate. It supports decentralized integration at its core, in addition to various other security mechanisms and features.