Flexible Micro-Service Framework

The SARCI enterprise design set out to meet five critical criteria viz. responsiveness to change, cost-effective maintenance, ability to scale, ease of integration with other software and the ability to change the underlying technology seamlessly. These criteria are interrelated and together reduce the total cost of ownership.

The second design consideration was that the platform must support a variety of clients, ranging from desktop applications, machine interfaces and mobile apps to different browsers, and must be able to expose or consume third-party interfaces.

To meet these goals, the SARCI platform architecture is based on micro-services, which are easier to manage and modify than single large applications. Small, self-contained moving parts carry out specific tasks based on rules or user requests.
The platform is built on the principle of one REST call per user request or event. Each request contains the essential information, serves one specific purpose and can be completed as a single transaction. The module responsible for carrying out the request is usually a small, simple piece of code that is written and maintained independently. Token-based security provides access control and the permissions to execute the request.

This approach enables the platform to be deployed across servers, use the best technology to process a specific request and respond to changing needs by simply replacing specific code segments rather than whole modules or applications.
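As an illustration of the one-call, one-purpose principle with token-based access control, the sketch below maps each (method, path) pair to a single small handler. All names here (the token store, the handler, the endpoint path) are illustrative assumptions, not the SARCI internals.

```python
# Minimal sketch: one REST call per request, dispatched to one small,
# independently maintained handler, gated by a token's permissions.
VALID_TOKENS = {"tok-123": {"can": {"create_po"}}}   # token -> granted permissions

def create_po(payload):
    # One specific purpose, completed as one transaction.
    return {"status": "created", "po": payload["po_number"]}

HANDLERS = {("POST", "/purchase-orders"): ("create_po", create_po)}

def dispatch(method, path, token, payload):
    perms = VALID_TOKENS.get(token)
    if perms is None:
        return {"error": "unauthorized"}          # unknown token
    name, handler = HANDLERS.get((method, path), (None, None))
    if handler is None:
        return {"error": "not found"}             # no such endpoint
    if name not in perms["can"]:
        return {"error": "forbidden"}             # token lacks permission
    return handler(payload)
```

Because each handler is a self-contained function, one can be replaced without touching the rest of the dispatch table, which is the replaceability property described above.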

Rule-based Work Automation

The built-in rules-based work automation engine is extremely powerful for business process management. It has two parts: the configuration of rules and the workflow driven by those rules. Each rule encodes a set of business policies, and the workflow is their implementation.

The rules can be simple conditions, such as the approvals required to clear a purchase order above a specific amount. They can also be based on external events, time periods, number of iterations and other combinations; for example, service calls that are not resolved within n days are escalated.

The workflow is the next set of tasks to be performed by users, who are again assigned by policy. For example, purchase orders below a certain value are approved by department heads, while higher-value orders require the managing director.
The engine can be configured for rules to trigger the workflow per micro-service request. This approach essentially creates a state machine for each business component right down to the lowest level of interaction.
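The per-component state machine described above can be sketched as a transition table plus a policy rule. The states, events and threshold value here are assumptions chosen to match the purchase-order example, not the platform's actual configuration.

```python
# Sketch: a rule (approval threshold) plus a workflow state machine for one
# business component. A micro-service request raises an event that moves the
# component to its next state.
RULES = {"po_approval_threshold": 10000}

WORKFLOW = {  # current state -> {event: next state}
    "draft": {"submit": "pending_approval"},
    "pending_approval": {"approve": "approved", "reject": "draft"},
}

def next_state(state, event):
    return WORKFLOW[state][event]

def required_approver(amount):
    # Policy: department head below the threshold, managing director above.
    if amount < RULES["po_approval_threshold"]:
        return "department_head"
    return "managing_director"
```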

While the set of tasks in a workflow is usually static and rarely changes, the rules are more dynamic. To provide for live updates of the governing rules and to minimize downtime, the rules are dynamically loaded and mapped to each task. The date from which the rules apply can also be configured. The micro-service actions immediately adjust to the new set of rules from the effective date.
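Effective-dated rule loading can be sketched as selecting, for a given date, the most recent rule version already in force. The field names and threshold values are illustrative assumptions.

```python
from datetime import date

# Sketch: rule versions carry an effective-from date; tasks always consult the
# version in force on the requested date, so new rules apply without downtime.
RULE_VERSIONS = [
    {"effective_from": date(2023, 1, 1), "po_threshold": 10000},
    {"effective_from": date(2024, 1, 1), "po_threshold": 15000},
]

def active_rule(on_date):
    in_force = [r for r in RULE_VERSIONS if r["effective_from"] <= on_date]
    return max(in_force, key=lambda r: r["effective_from"])
```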

Different paths of execution such as conditional routing, parallel execution and others are pre-configured and typically loaded at run-time.

Each action, the corresponding data and the user details are recorded on the system as part of the workflow. This creates audit trails that can be traced back for any purpose.

Business Analysis & Reporting

The system addresses three kinds of reporting needs viz. operational statements, business analysis and predictive modelling. Operational statements report on transactions such as job status, stock in hand, costing and similar. These are usually used at the lower to mid management levels.

Business analysis reports give indications on the trends, ratios and key indicators of business health. These give a more in-depth view of the operations and are extremely useful for senior management.

Predictive modelling uses current trends to statistically predict probable outcomes, based on data collected both within the business as well as external events. These are usually customized per requirement.
Transaction data is stored in a NoSQL format as it occurs and immediately transformed into multi-dimensional data marts. The dimensions are decided based on the industry type; for example, a service industry would require reporting on time series, customer categories and work types.

Micro-service interfaces with SQL-like capabilities then query the data marts based on the report requirement. The data obtained is processed by the front-end for specific visualization and grouping needs. The reporting engine itself places no restrictions on the presentation.
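The transformation from raw transactions to a dimensional mart can be sketched as a roll-up keyed on the configured dimensions. The service-industry dimensions (time period, work type) follow the example above; the field names are assumptions.

```python
from collections import defaultdict

# Sketch: roll raw transactions up into a two-dimensional data mart
# (month x work type), ready for SQL-like querying by the reporting layer.
transactions = [
    {"month": "2024-01", "work_type": "repair",  "hours": 5},
    {"month": "2024-01", "work_type": "install", "hours": 3},
    {"month": "2024-01", "work_type": "repair",  "hours": 2},
]

mart = defaultdict(float)
for t in transactions:
    mart[(t["month"], t["work_type"])] += t["hours"]   # aggregate per cell
```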

Due to its independent nature, the report presentation layer gives more choice in visualization techniques, the variety of reports that can be created, the devices that can be supported and the speed at which reports can be implemented.

Data Security & Integrity

The triad of confidentiality, integrity and availability of data is the foundation of information security. All three need to be maintained at all stages of the data life-cycle viz. transient data, active storage, archived records.

Transient data must be protected during transactions, across the network and in back-end processing. This is handled by standard first-level techniques: SSL/TLS encryption, protection against SQL injection and data validation before information is processed. Network security at the server level is maintained by IP firewalls, with regular data snapshots taken for recovery.
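The SQL-injection defense mentioned above is conventionally achieved with parameterized queries, sketched here against an in-memory SQLite table. The schema and values are illustrative only.

```python
import sqlite3

# Sketch: a placeholder (?) makes the driver treat input strictly as data,
# so a classic injection string cannot alter the query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"   # a typical injection attempt
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()                       # returns nothing: the attempt fails
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", ("alice",)
).fetchall()                       # the legitimate lookup still works
```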

Transient data on the client side is kept only for the duration of the specific login session, and no cookies or other persistent methods are used. Also, no fields are auto-populated from the local cache.

Data can be stored at both client and server side, and hence its storage needs to be protected on both sides. By design, there is no data stored persistently on the client side thereby removing an entire class of security issues.

On the server side, the sensitive data that needs to be kept in encrypted form can be configured. The keys to decrypt and process the data on either side are exchanged during login. This method makes it possible to re-encrypt data continuously at periodic intervals.

Lastly, archived data is usually kept in encrypted form in storage locations such as AWS S3, which themselves require separate credentials to access. Hash keys for the archived folders, together with multiple copies of the archives, give a high level of assurance that the data is available, consistent and reliable.
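The hash-key consistency check can be sketched with SHA-256 digests: if two copies of an archive produce the same digest, they are byte-identical. Byte strings stand in for the archived files here; the algorithm choice is an assumption, since the document does not name one.

```python
import hashlib

# Sketch: compare digests of an archive and its replica to confirm the copies
# are consistent and untampered.
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

primary = b"archived-records-2023"
replica = b"archived-records-2023"
copies_match = digest(primary) == digest(replica)
```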

Identity & Role-based Access

The system identifies and grants access to regular users, machines and external software. All three are required in order to leverage trends in IoT and Web programming models.

Regular users log in with their mobile number, password and a time-based one-time password (TOTP). The TOTP process is enabled by default for all regular users, and it is recommended to make it mandatory for all users.
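The TOTP generation step follows the standard algorithm (RFC 6238: HMAC-SHA-1 over a 30-second time counter, dynamically truncated to six digits). This sketch shows the standard algorithm, not SARCI's exact parameters, which the document does not specify.

```python
import hashlib, hmac, struct

# Sketch: standard TOTP (RFC 6238). The shared secret and the current time
# produce a short code that is only valid for one 30-second window.
def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    counter = struct.pack(">Q", unix_time // step)          # time window index
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                 # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC's test secret "12345678901234567890", the code at time 59 is 287082, matching the published test vectors.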

Machines are identified using PKI, with mutual trust established by a certifying authority. Each machine number is mapped to a certificate and verified. Periodically, new certificates can be issued for stronger security.
Interfacing with external software is similar to the method used to integrate with machines. In cases where the other software is unable to use certificates for authentication, alternate measures such as passwords, limiting calls per day or plain manual export / import of data are used.
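Certificate-based machine identification of this kind is typically enforced with mutual TLS: the server demands and verifies a client certificate signed by the trusted CA. The sketch below configures such a context with Python's ssl module; the certificate file names are placeholders, and loading them is commented out since the files are deployment-specific.

```python
import ssl

# Sketch: a server-side TLS context that requires each connecting machine to
# present a certificate signed by the certifying authority (mutual TLS).
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.verify_mode = ssl.CERT_REQUIRED   # reject clients without a valid certificate
# ctx.load_cert_chain("server.pem")             # server's own identity (placeholder path)
# ctx.load_verify_locations("machine-ca.pem")   # CA that signs machine certs (placeholder path)
```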

Each entity can be configured to access specific applications and perform tasks within that application. Usually, application functionality is cloned for interfacing with machines or other software, giving an additional layer of security and limiting changes to the actual functionality.

All entities identified by the system are given a time-limited token key to access the system. The token is renewed periodically on a successful challenge passphrase.
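A time-limited token can be sketched as a payload and expiry bound together by an HMAC signature: the server re-checks the signature and the clock on every request, and issues a fresh token after a successful challenge. The signing key, token layout and lifetime here are all assumptions.

```python
import hashlib, hmac, time

# Sketch: token = "entity.expiry.signature"; expired or tampered tokens fail.
KEY = b"server-secret"   # placeholder signing key, never shipped to clients

def issue(entity, lifetime=3600, now=None):
    expiry = int((now if now is not None else time.time()) + lifetime)
    msg = f"{entity}.{expiry}"
    sig = hmac.new(KEY, msg.encode(), hashlib.sha256).hexdigest()
    return f"{msg}.{sig}"

def valid(token, now=None):
    entity, expiry, sig = token.rsplit(".", 2)
    expected = hmac.new(KEY, f"{entity}.{expiry}".encode(),
                        hashlib.sha256).hexdigest()
    now = now if now is not None else time.time()
    return hmac.compare_digest(sig, expected) and now < int(expiry)
```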

Extensions & API Interfaces

The micro-service framework can be easily extended to cater to different integration scenarios. A RESTful API is defined and can be accessed by any client software after the proper identification procedure.

How the API is used is left to the client software. However, it is worth noting that the configured business workflow or process will be activated for each call. As such, a reasonable understanding of the internal business process is essential before integration starts.

In cases where the SARCI platform needs to act as the client, custom middleware is written to meet the requirements. The architectural design allows this middleware to be deployed anywhere, not necessarily on the same system.
The preferred data exchange format is JSON, with Unicode support. Client software can use the same querying capabilities of the micro-service framework as internally developed clients. For example, a query returning purchase orders for today, this week or this month uses the same REST API with different period parameters.
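The period-parameter pattern can be sketched by building the same endpoint URL with different date ranges. The endpoint path and parameter names are illustrative assumptions, not the documented API.

```python
from urllib.parse import urlencode

# Sketch: one endpoint, varying only the period parameters per query.
def po_query_url(base, date_from, date_to):
    return f"{base}/purchase-orders?" + urlencode(
        {"from": date_from, "to": date_to}
    )

today = po_query_url("https://example.com/api", "2024-03-15", "2024-03-15")
month = po_query_url("https://example.com/api", "2024-03-01", "2024-03-31")
```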

Regular features that export data to Excel or email can also be used by other software to process the data in a semi-automated manner.

The above methods are used to integrate with most business software, including Tally, SAP and reporting tools, as well as to pull in data from other sources for higher-level reporting.
Centum Technologies