Data Integration and Operations

Data is not always centralized, nor is it always stored in the form consumed by business decisions, rules and processes.
Also, Data Quality and Validation are always a necessity!

Multiple Data Sources: Connect, Load and Transform

Moving data between different data sources and automating the data flow between systems is always a challenge. FlexRule’s Information Requirement Diagram allows you to visually model the flow while the logic is decoupled from the data sources.

Building Data Quality and Validation Pipeline

Ensuring data quality requires validating business rules. It also requires different perspectives (e.g. quality for safety, compliance, operation etc.). Using FlexRule, different Data Quality and Validation abstractions can be defined and executed against the same source of data.
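As an illustration of running several validation perspectives against the same data, here is a minimal generic sketch (not FlexRule's actual API; the rule names and fields are invented):

```python
# Sketch: several validation "perspectives" run as independent
# rule sets over the same record.
from typing import Callable

Record = dict
Rule = Callable[[Record], bool]

# Hypothetical rule sets for two perspectives on the same data.
PERSPECTIVES: dict[str, list[tuple[str, Rule]]] = {
    "safety": [
        ("max_temp under 100", lambda r: r["max_temp"] < 100),
    ],
    "compliance": [
        ("owner recorded", lambda r: bool(r.get("owner"))),
        ("retention set",  lambda r: r.get("retention_days", 0) > 0),
    ],
}

def validate(record: Record) -> dict[str, list[str]]:
    """Return the failed rule names per perspective for one record."""
    return {
        name: [label for label, rule in rules if not rule(record)]
        for name, rules in PERSPECTIVES.items()
    }

failures = validate({"max_temp": 120, "owner": "ops", "retention_days": 30})
# The record fails the safety perspective but passes compliance.
```

Each perspective stays independent, so new viewpoints can be added without touching the source data or the other rule sets.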

In-Memory Operation and Manipulation

Data operations can be modelled visually or with data operation expressions (monadic operators). Connect multiple data sources to enrich and manipulate the data. Then compile the information by applying analytics and AI, building the results entirely in-memory so privacy is not compromised.
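The enrichment idea can be sketched in plain code (a generic illustration, not FlexRule's operators; the data and field names are made up): two in-memory sources are joined by key and a derived field is computed, with no intermediate data ever written out.

```python
# Minimal in-memory enrichment: join two sources by key and
# derive a new field without persisting intermediate results.
customers = [
    {"id": 1, "name": "Ada"},
    {"id": 2, "name": "Lin"},
]
orders = [
    {"customer_id": 1, "total": 40.0},
    {"customer_id": 1, "total": 15.0},
    {"customer_id": 2, "total": 99.5},
]

# Index one side, then enrich the other in a single pass.
totals: dict[int, float] = {}
for o in orders:
    totals[o["customer_id"]] = totals.get(o["customer_id"], 0.0) + o["total"]

enriched = [{**c, "spend": totals.get(c["id"], 0.0)} for c in customers]
# enriched now carries each customer's aggregated spend.
```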

Data Virtualization – Single View of Things

Build data virtualization based on multiple data sources. Transform the results based on different abstractions by applying business rules. Put together the end results and expose them as a new list or view for your consumers, customers, users, devices and more.

Connectors and ETL

Connect, Extract, Transform and Load


Connect to Data, Model and Run ETL

Connect to all sorts of data sources: SQL databases (SQL Server, Oracle, PostgreSQL), MS Access, NoSQL stores, and even custom databases. Connect to sources in the cloud or on-premises. You can also connect locally and remotely to Excel, XML, JSON, and text files to extract data. You can even call and include data from services using REST APIs, SOAP endpoints, or websites.

Extract data from one or multiple data sources, then Transform it using the drag-and-drop visual tool or our rich data operators, and finally Load the result into any medium (e.g. a database, Excel, or other files).
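The Extract–Transform–Load sequence can be sketched generically using only the standard library (this is an illustration of the pattern, not FlexRule's tooling; the fields and values are invented):

```python
# Generic ETL sketch: extract from a CSV source, transform the
# rows, and load the result into a target medium (JSON here).
import csv
import io
import json

# Extract: a CSV source, simulated with an in-memory string.
source = io.StringIO("id,amount\n1,10.5\n2,3.0\n")
rows = list(csv.DictReader(source))

# Transform: cast types and filter out small amounts.
cleaned = [
    {"id": int(r["id"]), "amount": float(r["amount"])}
    for r in rows
    if float(r["amount"]) > 5
]

# Load: serialize to the target format.
payload = json.dumps(cleaned)
```

Swapping the source for a database cursor or a REST response, and the target for a table or a file, leaves the same three-step shape intact.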

Run, Debug and View

Run the model with either a sample or the real data. Check the results of any operator at runtime, and view the result visually for either inputs or outputs of your model.

Data Virtualization

Build a Single View of Your Data

Pull Your Data Together

Data Virtualization allows you to pull data from disparate data sources, run it through a model that produces outputs based on those inputs, and finally expose the results to data consumers.

Our Data Virtualization capability links data from multiple data sources, matches and finds relationships between them using either an exact match or a fuzzy-logic match with a confidence score, and exposes the result as a REST API endpoint to data consumers.
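One simple way to realize matching with a confidence score is sketched below (a generic illustration using Python's difflib, not FlexRule's matching engine; the threshold and names are made up):

```python
# Record-linkage sketch: try an exact match first, then fall back
# to a fuzzy match with a similarity-based confidence score.
from difflib import SequenceMatcher

def match(name: str, candidates: list[str], threshold: float = 0.8):
    """Return (best_candidate, confidence), or (None, 0.0) below threshold."""
    if name in candidates:                 # exact match
        return name, 1.0
    best, score = None, 0.0
    for c in candidates:                   # fuzzy match
        s = SequenceMatcher(None, name.lower(), c.lower()).ratio()
        if s > score:
            best, score = c, s
    return (best, score) if score >= threshold else (None, 0.0)

customers = ["Acme Corp", "Globex Inc", "Initech"]
best, confidence = match("ACME Corp.", customers)
# best is "Acme Corp" with a confidence above 0.9
```

Records linked this way can then be merged into a single view and served behind an API endpoint.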

Standardize Data Exchange

Contextually Validate Data using Rules and Constraints


Dynamic Data Validation is Crucial

With data located all over the place, and many different processes within the organization needing some part of it, Dynamic Data Validation is a vital way of establishing standardized data exchange based on scenarios.

Fact Concept is an advanced approach that allows you to define a data structure based on the individual scenarios and business processes needing the data. It lets you define different viewpoints and abstractions of data structure, constraints and rules for different cases, rather than a fixed structure and format of data for all. One size does not fit all!
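The "different structure per scenario" idea can be sketched as follows (a hypothetical illustration, not the Fact Concept API; the scenario names, fields and constraints are invented):

```python
# Sketch: each scenario declares only the fields and constraints
# it needs, and the same record is validated per scenario.
SCENARIOS = {
    "shipping": {
        "required": ["address", "weight_kg"],
        "constraints": {"weight_kg": lambda v: v > 0},
    },
    "billing": {
        "required": ["invoice_email"],
        "constraints": {},
    },
}

def validate_for(scenario: str, record: dict) -> list[str]:
    """Validate a record against one scenario's structure and rules."""
    spec = SCENARIOS[scenario]
    errors = [f"missing {f}" for f in spec["required"] if f not in record]
    for field, check in spec["constraints"].items():
        if field in record and not check(record[field]):
            errors.append(f"invalid {field}")
    return errors

order = {"address": "1 Main St", "weight_kg": 2.5}
# Valid for shipping, but billing still needs an invoice email.
```

The same record is acceptable in one context and incomplete in another, which is exactly why a single fixed schema for all consumers falls short.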

Data Integration: Connectors, ETL, Virtualization, Dynamic Validation and Analytics.