What does Non-Disruptive set-up mean?

Pivot Cloud Platform’s set-up is completely transparent and non-disruptive. Once set-up is complete, no further changes to your current systems or product operations are required, which is unique.

Normally, adding, removing, or updating features in business computer systems necessitates one or more of the following:

  • Meetings with developers, business analysts, and others to discuss current features and required changes.
  • Valuable work time is lost.
  • Documentation, when it exists, tends to be out of date, adding complexity and delays.
  • Because most businesses have evolved over time, complex software and products are already in place.

Unlike conventional consulting methods, Pivot Cloud Platform involves no disruption to business continuity. AI maps business processes and data flows while our customers' business teams work as normal.

Contact us to see a working demo.

Where is the data stored?

Data, generally anonymized in consultation with the customer, is stored in regional cloud data centers approved by the customer.

Since Pivot Cloud Platform is cloud-provider agnostic, customers can choose alternative providers with no impact on delivery or functionality.

If you are already using IBM, Azure, or AWS, we support them. If not, we can offer you a choice of providers to select from.

What programming languages does Pivot Cloud Platform support?

Unlike code readers or parsers, Pivot Cloud Platform (PCP) uses AI tools and extensive process-mining techniques to map business processes. Programming languages do not affect how PCP works; it is language-agnostic. AI-driven PCP learns how your business computer systems are being used today. Here are a few examples:

  • If AS/400 RPGLE reports are generated through a console application on IBM DB2, Pivot Cloud AI needs to know which program (library and file) triggers the report generation, as well as the DB2 database connection settings.
  • If a web-based application with an HTML interface accesses an Oracle database, Pivot Cloud AI requires the website URL and the Oracle connection settings, such as host, SID, etc.
  • If external access to the mainframe is only through MQ messaging, Pivot Cloud AI requires access to the MQ messages and the data-file storage locations.
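As a sketch of what such connection details amount to, the examples above boil down to a handful of settings per source system. All keys, hosts, and names below are illustrative placeholders, not PCP's actual configuration schema:

```python
# Hypothetical connection settings of the kind described above.
# Every name and value here is an illustrative placeholder.
SOURCE_SYSTEMS = {
    "as400_db2": {
        "library": "PRODLIB",    # library containing the report program
        "program": "RPTGEN01",   # RPGLE program that triggers the report
        "db2": {"host": "db2.internal", "port": 50000, "database": "PRODDB"},
    },
    "oracle_web_app": {
        "url": "https://app.internal/login",
        "oracle": {"host": "ora.internal", "port": 1521, "sid": "ORCL"},
    },
    "mainframe_mq": {
        "queue_manager": "QM1",
        "queue": "LEGACY.OUT",
        "data_files": "/mnt/mq-archive",  # where message data files land
    },
}
```

In practice this is the kind of information an in-house IT team would supply during set-up.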

Specific requirements can be fulfilled quickly as PCP is designed to be non-disruptive.

How does it work?

Simple answer: Smart AI.

Detailed answer: Pivot Cloud Platform’s (PCP) pre-trained AI analyses user interactions and data and makes sense of them. The more processes the AI gets access to, the more accurate the process flows it generates.

In other words, our technology generates smart AI models on the fly to suit your environment, without manual intervention. The result is an accurate understanding of how processes work and how data flows within the system.

This knowledge allows your business to perform various tasks that would normally take a long time to achieve. Here are a few examples of what PCP’s AI supports pre-trained (out of the box):

  • Process mapping (business processes as well as transactional processes)
  • Migration of legacy programs to micro-services
  • Robotic Process Automation
  • Black-box analysis of legacy systems
  • Single Sign-On (SSO) for older systems without code changes
  • Moving legacy data to Salesforce (auto-mapping and creating business objects)
  • … and more.

What is the cost of Pivot Cloud Platform?

A fixed license fee plus customer-specific professional services for additional implementations. You are in control of budget and billing: you can use internal staff or our professional services.

The entire platform fee plus professional services comes to less than the fee of a single consultant you might be paying. Pricing is pay-as-you-go, so you remain in control of costs.

How does Pivot Cloud Platform incorporate Agile Development?

Pivot Cloud Platform (PCP) is designed to complement Agile development. In-house resources participating in Agile development need information quickly and efficiently so their work can be delivered reliably. PCP Process Insights provides that information automatically.

Migrating to the cloud process by process, seamlessly and with no disruption to current live processes, is a key feature of our approach to Agile development.

What data security procedures do you have in place?

  • We comply with C4 level security practices.
  • We can customise to comply with regulatory requirements at no additional cost to you.
  • Data is stored after anonymizing identifiable information.
  • Data is used only by AI, not humans (except in-house staff, or professional services if desired), reducing the security risk.

How long does it take to convert a legacy system?

It depends on the type and complexity of the legacy system.

As a rule of thumb, converting a simple insurance product to the cloud takes less than a week, including analysis and code generation.

More complex processes may take an additional week or two, never months, as long as the application can be exercised. For instance, if a batch process that runs only on the last day of the month needs to be documented, you have to wait until it runs to capture the documentation, unless you can trigger the process manually.

Can I use Pivot Cloud Platform AI separately?

Yes. If you just want to stay a step ahead of the marketplace without the jargon, Pivot Cloud Platform AI delivers through simple REST API calls.

There are a number of use cases for Pivot Cloud AI working separately from already-developed products.

Contact us to find out what Pivot Cloud AI can do for your business.

How many people does it need to operate?

Typically, 1 to 2 in-house employees are enough to work with it; however, the system can be used by as many users as you like. The typical training period is between 4 and 20 hours.

Because Pivot Cloud AI is platform-based, your IT team is only required during the implementation phase, to connect the platform to your in-house systems, for example by supplying the DB2, Oracle, or MQ connection details described under the programming-languages question above.

Are analytics available for other applications to use?

Yes.

The Pivot Cloud Platform supports REST API access with JWT authentication to reach any data stored in the system. You can use the generated data with any application you like; there is no licensing cost involved.
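As a minimal sketch of what such access looks like, the snippet below builds an authenticated GET request using only the Python standard library. The base URL, resource path, and token are placeholders, not PCP's published API; the only thing the source confirms is REST access with a JWT:

```python
import json
import urllib.request

# Hypothetical endpoint and token; the real URL and JWT would be
# issued by your Pivot Cloud Platform deployment.
BASE_URL = "https://pcp.example.com/api/v1"
JWT_TOKEN = "eyJhbGciOi..."  # placeholder JWT, not a real token

def build_request(path: str) -> urllib.request.Request:
    """Build a GET request carrying the JWT as a Bearer token."""
    return urllib.request.Request(
        f"{BASE_URL}/{path}",
        headers={
            "Authorization": f"Bearer {JWT_TOKEN}",
            "Accept": "application/json",
        },
    )

# To actually fetch analytics data (requires a live deployment):
# with urllib.request.urlopen(build_request("process-maps")) as resp:
#     data = json.load(resp)
```

Any application that can send an HTTP request with an `Authorization: Bearer` header can consume the data the same way.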

Who owns the data?

You, of course.

Unlike other cloud providers, you are in charge of all the data, analytics, code, and AI models generated by the platform.

Pivot Cloud Platform simply delivers the results for you.

Do I have to pay for any other licenses?

No. Pivot Cloud Platform generates micro-services using open-source technologies, e.g. Spring Boot MVC, Bootstrap, Angular, Python, Go, etc.

There is never a license fee to pay. In addition, the code can be updated and modified to suit future requirements, making it a truly future-proof solution.

How much storage do you provide?

We can provide a detailed quote once we understand your basic requirements. Generally, though, if you are using our product for migration or process mapping, there is no additional fee.

Our storage cost is 20-40% less than mainstream providers such as Azure, Amazon, and Google. There are no hidden charges.

How long does it take to migrate legacy systems?

Generally between 12 and 24 weeks; the timescale is determined by data complexity, not volume.

How do I know if this is working?

Pivot Cloud Platform has a built-in verification mechanism: every piece of functionality is tested along two routes (the current system vs. the migrated micro-services) to verify each and every process while constantly checking data integrity.

Until the pre-set confidence level is reached, the Pivot Cloud Platform will redo the entire capture-code-generation cycle.
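The idea can be sketched in a few lines. This is an illustration of two-route verification against a confidence threshold, not PCP's internal implementation; the function names and the match-rate measure are assumptions:

```python
from typing import Callable, Iterable

def verify(legacy: Callable, migrated: Callable,
           inputs: Iterable, confidence: float = 0.99) -> bool:
    """Run the same inputs through both routes and pass only when
    the output match rate meets the pre-set confidence level."""
    total = matched = 0
    for x in inputs:
        total += 1
        if legacy(x) == migrated(x):
            matched += 1
    return total > 0 and matched / total >= confidence

# Illustrative stand-ins for a legacy routine and its generated
# micro-service; identical behaviour verifies at any threshold.
legacy_calc = lambda x: x * 1.05
migrated_calc = lambda x: x * 1.05
ok = verify(legacy_calc, migrated_calc, range(100))
```

If `verify` fails, the capture-and-generate cycle would be repeated, as the answer above describes.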

How many resources do I have to spare?

During set-up we need connectivity assistance from your IT team; once that is done, all you need to do is retrieve the results as and when you need them.

Why Pivot Cloud Platform?

We find that the best answer requires a farming analogy. When you have a tractor at your disposal, would you consciously choose to plow your fields by hand?

When AI and cheaper cloud are available now, why waste resources and time on old and outdated systems?