Process Plant Design – Data Life Cycle.
Introducing ROI for design systems using distributed, data-centric data structures in the Process Plant Data Life Cycle
Collaboration and design delivery are accelerated by specialized enterprise data and reporting structures; in combination with intelligent 2D/3D CAD design tools, these produce reports, graphs, charts, automatic fabrication drawings, bills of material, intelligent schematics and many more data sets. As project activity flows through pre-defined templates, completion times are reduced (made in jig time, or mass production, comes to mind), contributing to "reduced time to market".
Investing in project delivery technology from this point of view makes sense.
Diagram 1. The integrated design and data life cycle system.
If a single outsourced design system is used, project delivery becomes a matter of data access rather than a massive file transfer and retro-integration (a significant saving when outsourcing to low-cost service providers).
Most engineering consultants and contractors own some sort of design delivery system; the big limitation is in developing the tool set, since their main skill set is project delivery.
Most plants outsource design and maintenance work to third parties such as engineering consultants, maintenance companies and individual contractors, which frees the plant from providing design and data systems.
Simply buying a database and CAD system will not provide a world-class, delivery-ready design and data system. Installations require large-scale customer intervention, and a service provider that delivers projects can direct system requirements but does not have tool development skills. Software vendors, although somewhat better, still fall a long way short of world class unless there is a very large budget.
It seems counter-intuitive that everyone who wants world-class plant design and data life cycle tools must negotiate an uncharted financial gap to get a standard delivery which, in my experience, is 95% generic.
It is because systems are mostly generic that the distributed system makes sense: distribute the 95% and consult locally over the remainder.
The Data file silo
The data silo system is the single computer file; in data terms it chokes data flow, and cumbersome formats and work replication are its hallmarks.
The data silo is a symptom of its time; the number of applications that manipulate data and back it up into cul-de-sacs is staggering. All this data can now be re-formed onto a Data Platform.
Here is how the old file-centered system looks in terms of data flow, policy implementation and maintenance tasks through a data life cycle.
The islands of data are isolated within proprietary formats and require intervention to get data to flow; even where there is an XML aspect, the XML still needs to be integrated.
First, a brief recap of some new technologies
Cloud computing: Think of an internet search engine: no purchase, no maintenance, choose and use; requests are processed remotely and results delivered. Cloud applications are like this except that you pay a subscription; ROI is far better than under ownership and depreciation (ever buy, update or install a search engine?).
Cloud allows a "one-to-many" relationship to develop between service supplier and customer, and it is this that pushes the performance-to-cost return of cloud applications far beyond that of premises-configured and supported solutions.
Customers benefit from "zero infrastructure" concepts.
Virtualisation: Allows an operating system to run inside a computer file (a simulated computer environment). This is great for testing new applications and for distributing integrated applications, since a golden template can be worked on at one location and the benefits distributed over LAN/WAN. ROI rises rapidly at this juncture; customers can now benefit from "zero maintenance" concepts.
Service Integrator: Uses virtualisation and cloud computing to fully integrate applications, with many additional functions included. The service integrator, from one location, integrates applications and distributes them in virtual machine format; the client can run these on its own infrastructure or on a remote server. Client-side zero maintenance and zero infrastructure concepts are now possible.
Data Platform: A database at its core, with business processes embedded into it through what is called a "metadata layer", by which large-scale administration automation is possible, including security and data presentation features. Extended functions such as real-time reporting, electronic validation signatures and bar coding are some examples of device and regulatory integration within the Data Platform.
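As a rough illustration of the metadata-layer idea (a sketch only; the field names, roles and rules below are invented, not taken from any particular platform), the layer can be pictured as a table of rules the platform consults before presenting a record:

```python
# Hypothetical metadata layer: rules that govern which roles may see a field
# and how the value is presented. All names here are illustrative.
METADATA = {
    "tag_number":    {"roles": {"engineer", "admin", "contractor"}, "unit": None},
    "line_pressure": {"roles": {"engineer", "admin"}, "unit": "bar"},
    "supplier_cost": {"roles": {"admin"}, "unit": "EUR"},
}

def present(record, role):
    """Return only the fields the given role may see, formatted per the metadata."""
    view = {}
    for field, value in record.items():
        rule = METADATA.get(field)
        if rule is None or role not in rule["roles"]:
            continue  # field is hidden from this role
        view[field] = f"{value} {rule['unit']}" if rule["unit"] else str(value)
    return view

record = {"tag_number": "P-101", "line_pressure": 6.5, "supplier_cost": 1200}
print(present(record, "contractor"))  # contractor sees only the tag number
```

Security and presentation live in the metadata, not in each application, which is what makes large-scale administration automation possible.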
Golden template: An expertly configured and tested system that is secure, reliable, replicable and distributable as a tool set across many systems.
Benefits of using an integrated service
The integrated service makes a sharp distinction between the work tool and the work data: the work tool is supported and distributed by the service integrator, while the work itself is carried out by selected entities on a database that belongs to the client and/or service provider.
This reduces the cost of tool selection, procurement, installation, configuration, maintenance and integration normally borne by whoever delivers the project. Furthermore, project delivery becomes a simple data access change, a data ownership transfer; what could be easier?
Using a Data Platform on top of a database means all data and documentation are distributed, accessed and reviewed in a very secure environment. In combination with new devices such as tablets and digital signatures, this allows all data to remain securely contained within a data firewall, and reduces the need for wet signatures, paper printouts and the file sprawl common on a good many systems (e.g. the My Documents folder, the desktop).
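To make the digital-signature point concrete, here is a minimal sketch of an electronic validation signature that never leaves the data firewall: the platform stores a keyed hash of the approved document next to the record. The keys and document text are invented for illustration; a production platform would use proper key management or public-key signatures.

```python
import hmac
import hashlib

def sign(document: bytes, key: bytes) -> str:
    """Keyed hash (HMAC-SHA256) standing in for a wet signature."""
    return hmac.new(key, document, hashlib.sha256).hexdigest()

def verify(document: bytes, key: bytes, signature: str) -> bool:
    """True only if the document is unchanged since it was signed."""
    return hmac.compare_digest(sign(document, key), signature)

key = b"reviewer-secret-key"                      # hypothetical signer key
doc = b"Isometric drawing ISO-001 rev B approved"  # hypothetical record
sig = sign(doc, key)

print(verify(doc, key, sig))                 # untampered document verifies
print(verify(doc + b" (edited)", key, sig))  # any edit breaks the signature
```

The same check can gate review cycles and data distribution, so approvals happen on screen rather than on paper.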
Integrated service components
Here is the integrated service golden template concept adapted to integrated plant data and design software.
The golden template is the "service integrator tool set" configured for all stages of the plant life cycle; this tool is distributed to all involved, producing pre-integrated deliverables.
Here is how the golden template becomes service integration.
Part 1: the golden template is formed by the service integrator; this is an ongoing expansion of generic function.
Part 2: the golden template is put through a template multiplier, typically a virtualisation process.
Part 3: when, by a process of consultation, the template has been customised for the remaining 5%, it is distributed to the client. Larger clients will require more consultation, for the simple reason that they drive the initial process that determines how the design is done in the first instance. After that, it will be the smaller service providers that keep the plant and its data maintained; relieving them of the tool development burden will save them time and ensure the integrity of the plant data life cycle.
Stage 1 (Service integrator tool set) used for Front End Design
Stage 2 (Service integrator tool set) used for Detailed Design shown in diagrammatic form
Stage 3 (Service integrator tool set) used by Construction
The Service integrator working overview
The next stages, up to stage 6, follow a similar order. The idea is that the same outsourced system, as configured on the golden template, is the tool used in all activities, and the data produced by the golden tool set is the property of the client.
Diagram showing the relation between the Plant Data Life-cycle phases, the data platform, the service providers and the service integrator tool set.
The dark blue presents the Data Platform as the most important element: it grows through the plant life cycle and provides a secure, integral basis for task automation, firewalling and controlled data distribution. Using enterprise-class database architecture, data flows easily within pre-defined data templates, allowing a host of data view configurations that are to a large extent automated, that is, automation of delivery content.
Collaboration is made easy and secure for all stakeholders, as access to data is determined by the Data Platform.
Below is an example of a project in the design phase, showing how all data is continually accessible to all stakeholders, in real time or in step with stakeholder meeting cycles.
Who must spend how much, on what and why, when accessing design systems is summarized in the following diagram, which shows:
1. Customer service usage category
2. Cost centers and customer type
3. Relative ROI levels
4. Performance relative to cloud-based systems
This does not mean that in-house development comes to an end; it will become more specialized within the framework of the outsourced system.
Cloud and outsourced systems will become dominant where cost-to-performance ratios are considered; this is an inevitable consequence of the growing one-to-many relationship that cloud-based service providers have with many clients across many nations and many diverse corporate entities.
Return on investment, clarified by system
To conclude, this is an exploration of how data systems can be used to deliver ROI ranging from 20 to 60% over existing systems.
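The 20 to 60% range is simple arithmetic on tooling cost. The annual figures below are invented purely for illustration (they are not from the source), but they show the shape of the comparison:

```python
def roi_saving(current_cost: float, new_cost: float) -> float:
    """Percentage saving of the new system relative to the current one."""
    return 100.0 * (current_cost - new_cost) / current_cost

# Hypothetical annual tooling costs: licences, installation, configuration,
# maintenance and supporting IT. Both figures are illustrative assumptions.
in_house_system = 100_000.0   # owned, premises-configured and supported
integrated_service = 55_000.0  # cloud / service-integrator subscription

print(f"{roi_saving(in_house_system, integrated_service):.0f}% saving")
```

With these assumed figures the saving lands at 45%, inside the 20 to 60% band; the actual position in the band depends on how much of the system is generic versus locally consulted.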
The performance of such a design tool depends on the building and maintaining of data handling tools, which are becoming more and more complex, especially in the last five years. This has been driven by a number of factors:
1. The demand for efficiency and value engineering in industry
2. The rise in internet access and services
3. Need for collaboration across many continents and disciplines
4. Pressure to outsource parts of jobs.
5. Real-time information available at a moment's notice
6. Automation of tasks
7. Business process integration with other disciplines such as accounting and billing.
Data-centered systems are ideal candidates for implementation of the new breed of integrated application services, which are constantly becoming more useful and productive, as opposed to the depreciating, high-maintenance counterparts we are more familiar with.