What does Non-Disruptive set-up mean?

Pivot Cloud Platform’s set-up is completely transparent and non-disruptive. Once it is complete, no further changes are required to your current set-up or product operations, which is unique.

Normally, adding, removing, or updating features in business computer systems necessitates one or more of the following:

  • Meetings with developers, business analysts, and others to discuss current features and required changes.
  • Valuable working time is lost.
  • Documentation, where it exists, tends to be out of date, adding complexity and delay.
  • Because most businesses have evolved over time, complex software and products are already in place.

Unlike conventional consulting methods, Pivot Cloud Platform involves no disruption to business continuity. AI maps business processes and data flows while your business teams work as normal.

Contact us to see a working demo.

Where is the data stored?

Data is generally anonymized (in consultation with the customer) and stored in regional cloud data centers approved by the customer.

Since Pivot Cloud Platform is Cloud provider agnostic, customers can choose alternative providers with no impact on the delivery or functionality.

If you are already using IBM, Azure, or AWS, we support them. If not, we can offer you a choice of providers to select from.

What programming languages does Pivot Cloud Platform support?

Unlike code readers or parsers, Pivot Cloud Platform (PCP) uses AI tools and extensive process-mining techniques to map business processes. Programming languages do not affect how PCP works; it is language agnostic. AI-driven PCP learns how your business computer systems are being used today. Here are a few examples:

  • AS/400 RPGLE reports generated through a console application on IBM DB2: Pivot Cloud AI needs to know which program (library and file) triggers the report generation, as well as the DB2 database connection settings.
  • A web-based application with an HTML interface accessing an Oracle database: Pivot Cloud AI requires the website URL and Oracle connection settings such as host, SID, etc.
  • External access to the mainframe only through MQ messaging: Pivot Cloud AI requires access to the MQ messages and the data file storage locations.
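
The examples above boil down to a small set of connection details per source system. As a rough sketch only (the key names, hosts, and values below are hypothetical illustrations, not the product's actual configuration format), the information Pivot Cloud AI needs could look like this:

```python
# Illustrative only: hypothetical connection details of the kind described above.
connection_settings = {
    "db2": {
        "library": "REPORTLIB",       # AS/400 library containing the trigger program
        "program": "RPTGEN",          # RPGLE program that generates the report
        "host": "as400.example.com",
        "port": 446,
    },
    "oracle": {
        "url": "https://app.example.com",  # web application entry point
        "host": "oracle.example.com",
        "sid": "ORCL",
        "port": 1521,
    },
    "mq": {
        "queue_manager": "QM1",
        "queue": "LEGACY.IN",
        "data_files": "/mnt/legacy/data",  # data file storage location
    },
}

def required_keys_present(settings: dict) -> bool:
    """Check that each source declares the minimum details listed above."""
    needed = {
        "db2": {"library", "program"},
        "oracle": {"host", "sid"},
        "mq": {"queue"},
    }
    return all(need <= settings.get(src, {}).keys() for src, need in needed.items())
```

Gathering these details up front is typically the only involvement required from your IT team.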

Specific requirements can be fulfilled quickly as PCP is designed to be non-disruptive.

How does it work?

Simple answer

Smart AI.

More elaborate answer

Pivot Cloud Platform’s (PCP) pre-trained AI analyses user interactions and data and makes sense of them. The more processes the AI has access to, the more accurate the process flows it generates.

In other words, our innovative technology generates smart AI models on the fly, tailored to your environment and targeted at your systems, without manual intervention. The result is an accurate understanding of how the processes work and how data flows within the system.

This knowledge allows your business to perform tasks that would normally take a long time to achieve. Here are a few examples of what PCP’s AI can do pre-trained, out of the box:

  • Process Mapping (Business Processes as well as transactional processes)
  • Migration of legacy programs to micro-services
  • Robotic Process Automation
  • BlackBox Legacy system
  • Single-Sign-On (SSO) for older systems without code changes
  • Moving legacy data to Salesforce (auto mapping and creating business objects)
  • … and more.

What is the cost of Pivot Cloud Platform?

A fixed license fee plus customer-specific professional services for additional implementations. You are in control of budget and billing: you can use internal staff or our professional services.

The entire platform fee plus professional services comes to less than the fee of a single consultant you might be paying. This is pay-as-you-go pricing, so you are in control of the costs.

On average, we have saved 70-80% in cost and time compared with traditional consultant-driven projects.

How does Pivot Cloud Platform incorporate Agile Development?

Pivot Cloud Platform (PCP) is designed to complement Agile development. In-house resources participating in Agile development need information quickly and efficiently so their work can be delivered reliably. PCP Process Insights provides that information automatically.

Migrating to the cloud process by process, seamlessly and with no disruption to current live processes, is a key feature of our approach to agile development.

What data security procedures do you have in place?

  • We comply with C4-level security practices.
  • We can customize to comply with regulatory requirements at no additional cost to you.
  • Data is stored after identifiable information has been anonymized.
  • Data is used only by AI, not humans (except your in-house staff, or our professional services if desired), reducing the security risk.
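
To make the anonymization point concrete, here is a minimal sketch of the general technique, not the platform’s actual algorithm: identifiable fields are replaced with salted one-way hashes before storage, so records stay linkable for analysis but no longer reveal who they belong to. The field names and salt below are hypothetical.

```python
import hashlib

PII_FIELDS = {"name", "email", "phone"}  # illustrative choice of identifiable fields
SALT = b"per-customer-secret"            # hypothetical per-customer salt

def anonymize(record: dict) -> dict:
    """Replace identifiable values with shortened pseudonymous tokens."""
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            out[key] = digest[:16]  # same input always yields the same token
        else:
            out[key] = value
    return out

record = {"name": "Jane Doe", "email": "jane@example.com", "balance": 120.50}
anon = anonymize(record)
```

Because the mapping is deterministic, joins and analytics still work on the anonymized data, while the original values are never stored.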

Here are examples of the standards and processes we comply with. Read more here.

How long does it take to convert a legacy system?

It depends on the type and complexity of the legacy system.

As a rule of thumb, converting a simple insurance product to the cloud takes less than three weeks, including analysis and code generation.

More complex processes may take a few additional weeks (never months), as long as the application can be exercised. For instance, if a batch process that needs to be documented only runs on the last day of the month, you have to wait until it runs to capture it, unless you can trigger the process manually.

Can I use Pivot Cloud Platform AI separately?

Yes. If you just want to stay a step ahead of the marketplace, without the jargon, Pivot Cloud Platform AI delivers through simple REST API calls.

There are a number of use cases for Pivot Cloud AI working separately from our already developed products.

Contact us to find out what Pivot Cloud AI can do for your business.

How many people are needed to operate it?

Typically, one or two in-house employees are enough to work with it; however, the system can be used by as many users as you like. The typical training period is between 4 and 20 hours.

Because Pivot Cloud AI is platform based, your IT team of two is needed only during the implementation phase, to connect the platform to your in-house systems. The connection details required are the same as those described under the programming-languages question above (for example, DB2 connection settings, a website URL and Oracle connection settings, or MQ message and data file storage locations).

Are analytics available for other applications to use?

The Pivot Cloud Platform provides REST API access, secured with JWT authentication, to all data stored in the system. You can use the generated data with any application you like; there is no licensing cost involved.
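
As a sketch of how a client might authenticate, the example below builds a standard HS256 JSON Web Token and attaches it as a Bearer header. Only the JWT mechanics are standard; the claim names, shared secret, and any endpoint path are assumptions, not the platform’s documented API.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(secret: bytes, claims: dict) -> str:
    """Assemble and HMAC-SHA256-sign a JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

# Hypothetical claims; "exp" gives the token a five-minute lifetime.
token = make_jwt(b"shared-secret", {"sub": "analytics-client",
                                    "exp": int(time.time()) + 300})
headers = {"Authorization": f"Bearer {token}"}
# An HTTP client would then send these headers with each analytics request.
```

Any HTTP-capable application can consume the analytics this way, which is what makes the data usable outside the platform.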

Who owns the data?

You do, of course.

Unlike with other cloud providers, you remain in charge of all the data, analytics, code, and AI models generated by the platform.

Pivot Cloud Platform simply delivers the results for you.

Do I need to pay for any other licenses?

No. Pivot Cloud Platform generates micro-services using open-source technologies, e.g. Spring Boot MVC, Bootstrap, Angular, Python, and Go.

You never need to pay a license fee. In addition, the code can be updated and modified to suit future requirements, making it a truly future-proof solution.

How much storage do you provide?

We can provide a detailed quote once we understand your basic requirements. Generally, if you are using our product for migration or process mapping, there is no additional fee.

Our storage cost is 20-40% less than mainstream providers such as Azure, Amazon, and Google, with no hidden charges.

How long does it take to migrate legacy systems?

Generally between 12 and 24 weeks, determined by process and data-storage complexity rather than by volume.

How do I know if this is working?

Pivot Cloud Platform has a built-in verification mechanism: every piece of functionality is tested through two routes (the current system and the migrated micro-services) to verify each process while constantly checking data integrity.

Until the pre-set confidence level is reached, Pivot Cloud Platform repeats the entire capture-and-code-generation cycle.
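
The dual-route idea can be sketched in a few lines. This is a conceptual illustration, not the platform’s internal code: the same inputs are fed to a legacy routine and its generated micro-service counterpart, and the share of matching outputs becomes the confidence score. The discount rule below is invented purely as a stand-in.

```python
def legacy_discount(order_total: float) -> float:
    """Stand-in for a legacy routine: 10% off orders over 100."""
    return round(order_total * 0.9, 2) if order_total > 100 else order_total

def migrated_discount(order_total: float) -> float:
    """Stand-in for the generated micro-service implementing the same rule."""
    return round(order_total * 0.9, 2) if order_total > 100 else order_total

def verify(cases, legacy, migrated, required_confidence=0.99):
    """Run every test case through both routes and compare the outputs."""
    matches = sum(legacy(c) == migrated(c) for c in cases)
    confidence = matches / len(cases)
    return confidence, confidence >= required_confidence

confidence, passed = verify([50.0, 100.0, 150.0, 999.99],
                            legacy_discount, migrated_discount)
# If `passed` were False, the capture-and-code-generation cycle would run again.
```

In practice the test cases come from captured live interactions, which is why broader process coverage yields higher-confidence verification.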

I want to automate business processes, but I don’t want to learn another system!

Conventional Robotic Process Automation (RPA) relies on IT developing automation for repeated transactional processes. This means a large investment as well as a learning curve, not to mention commitment.

Pivot Cloud Platform uses AI to learn how processes are used by regular users, then repeats them using a variety of data sources. The result: there is no need to learn a new system; the AI does the work with the data provided.

As a bonus, your IT team can apply machine learning to any RPA process with ease. If you can use the system like an end user, your RPA is ready in minutes. No learning curve, no mistakes.

How many resources do I need to commit?

During set-up, we need connectivity assistance from your IT team; once that is done, you simply retrieve the results as and when you need them.

Process Insights: Business Analysts, Developers, and DevOps can access as required.

Intelligent Migration: Developers and DevOps, for verification and development.

RPA: Business Users and Business Analysts for using RPA.

BlackBox: Business Users and Developers to create new functionality.

Why Pivot Cloud Platform?

We find that the best answer requires a farming analogy. When you have a tractor at your disposal, would you consciously choose to plow your fields by hand?

Now that AI and cheaper cloud are available, why waste resources and time on old, outdated systems?

How are 3rd party APIs dealt with in the conversion process?

3rd party or heterogeneous in-house applications accessing the legacy system can be either in-bound or out-bound.

In-bound APIs need a network update to point to the cloud micro-service interfaces (Pivot Cloud Platform has a built-in API gateway for routing APIs to the generated micro-services). Other than that, no changes are required, as the migrated micro-services comply with the exact specifications of the legacy system’s API interfaces.
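
The routing idea behind such a gateway can be sketched as a simple prefix table: legacy endpoint paths map onto the micro-service URLs, so callers keep using the same interface. The paths and service names below are made up for illustration and are not the platform’s actual route format.

```python
# Hypothetical routing table: legacy path prefix -> generated micro-service URL.
ROUTES = {
    "/legacy/orders": "http://orders-svc.internal/api/orders",
    "/legacy/customers": "http://customers-svc.internal/api/customers",
}

def route(path: str) -> str:
    """Return the micro-service URL that should serve an in-bound legacy path."""
    for prefix, target in ROUTES.items():
        if path == prefix or path.startswith(prefix + "/"):
            # Preserve the remainder of the path so the legacy API
            # contract stays byte-for-byte identical to callers.
            return target + path[len(prefix):]
    raise KeyError(f"no route for {path}")
```

Because the gateway preserves the original paths, in-bound callers need only a network-level update, exactly as described above.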

Out-bound interfaces must be declared to Pivot Cloud Platform before the migration process (see the help article on adding new data sources for more information).

Can I make changes to my legacy process during the conversion process?

Yes. Pivot Cloud Platform is the only non-disruptive system that allows you to update the legacy system while the conversion is in progress.

Pivot Cloud Platform enables your existing CI/CD platforms to plug in change notifications, which trigger the “Analysis Micro-service Generation” cycle.
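
As a rough sketch of what such a hook might look like (the payload field names here are assumptions, not the platform’s actual notification schema), a CI/CD change notification could be handled like this:

```python
# Illustrative only: hypothetical change-notification handling.
def handle_change_notification(payload: dict) -> str:
    """Decide what a CI/CD change notification should trigger."""
    changed = payload.get("changed_components", [])
    if not changed:
        return "no-op"
    # Any change to a tracked legacy component restarts analysis and
    # micro-service generation for the affected processes.
    return f"analysis+generation for {', '.join(sorted(changed))}"

action = handle_change_notification(
    {"commit": "abc123", "changed_components": ["billing", "invoicing"]}
)
```

The point is that a legacy change does not invalidate the migration; it simply re-queues the affected processes for analysis and regeneration.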

What is the difference between Pivot Cloud Platform’s conversion process and AWS or IBM moving workloads to the cloud?

Moving workloads to the cloud is like putting old wine into new (albeit bigger) bottles: it solves immediate capacity problems but not the underlying issues. If you are only looking to increase performance, it may seem like a good option, but make sure the legacy code is compatible with the target platform’s supported technologies, such as micro-COBOL or open-COBOL.

Pivot Cloud Platform uses AI to understand business and functional processes and moves the logic to micro-services, giving you a future-proof solution.

What is the advantage of automating during the conversion process?

Since Pivot Cloud Platform captures and maps the business processes, automating them is a simple step (feeding in the data needed to complete each process). By automating business processes, staff are freed from managing the legacy system and can be re-deployed to other essential tasks.