Information Management with IQIMS
IQIMS: Information Management 2.0 for Industry 4.0
IQIMS: TCO reduction of about 90%, time to market < 1 h
The world's leading and most efficient approach to metadata-managed repository systems (MDMRS).
In use since 1995: technically mature, established, innovative, flexible, secure, and state-of-the-art technology at all times.
What this means for you:
optimized database development, simplified data management,
increased flexibility and connectivity, reduced IT complexity,
optimized processes, simple data collection & exchange,
minimized costs, increased efficiency, enhanced security
Version 5 of IQIMS (based on Oracle®) has been released, continuing the successful path of transparent, fully historicized, revision-safe (e.g. (Euro-)SOX-compliant) storage of information. The suite realizes the concept of encapsulated information management. This enables our customers to build reliable, secure, highly efficient and long-term-available solutions while reducing IT costs, relieving IT personnel and minimizing the company's external dependencies. The suite can be integrated transparently into a company's IT landscape, for example to enhance an ERP system.
Typical areas of use for IQIMS:
HQ Master data management / Reference data management (proven in the military aircraft industry)
- immediate support of corporate-wide available, reliable, revision-safe master data without programming effort
- Integration of existing logical dependencies
- Collection and administration by customers without additional tools or office applications
Data consolidation (SPOT synchronisation) (proven in the military aircraft industry)
- Harmonization of heterogeneous data landscapes
- Company-wide, problem-related view of these data management systems
- Enabling data quality, reducing interfaces, eliminating orchestration
- Highly secure, revision-safe data storage
HQ Data collection (proven in the military aircraft industry)
- Support of operational systems (up to real-time data warehousing)
- Enhancement of highest-quality data
- Revision-safe, historical data storage, highly secure
Data quality assurance
- reliable information for strategic/obligatory statements
- Reduction of data rework / efficiency increase
- Tracking of data changes / information tracking
HQ Document management and Asset control
- individual modelling of information; any historically correct overall build standard can be documented and administered
- in combination with the data consolidation results, it forms a complete production management system
HQ Revision-safe information exchange platform (proven in the military aircraft industry)
- Synchronous (e.g. DB link) or asynchronous (e.g. SOA, XML) exchange of any data, up to legally binding data
- Data acceptance and rejection workflows, data quality testing, automated history, highly secure
Building individual business applications will be one of the coming challenges, because of the inherent complexity of business workflows and decision paths on the one hand, and the simplicity of end-user devices such as mobile phones on the other. Today's approaches tend to simplify complex circumstances, with the danger of oversimplifying the decision background information (over-compactness) or, conversely, of overloading mobile applications with functionality and density until they become unusable. Due to the limitations of mobile devices, the information needs to be sized to the carrier medium: a mobile phone, a PDA, a tablet or even a notebook. Every device class has its particular strengths and can only deliver maximum value with an optimized presentation and frontend.

All this would already be a challenge, but with the trend towards multiple platforms (n mobile, m PDA, x tablet, y notebook manufacturers) the typical complexity grows as n*m*x*y (different platforms, operating systems, resolutions, user interfaces, security features, security impacts) and thus becomes impossible to handle individually. The following architecture is the result of several years of work inside large enterprises in the areas of logistics, transportation and military aircraft. The combination of fast-changing business on the one hand and reliability, trustworthiness and security on the other led to an architectural approach whose roots go back 15 years and which has proven its future readiness to this day.
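The multiplicative growth of the platform matrix described above can be illustrated with a short sketch; the manufacturer counts below are purely hypothetical examples, not figures from the text:

```python
# Illustration of the multiplicative platform complexity n*m*x*y.
# The platform lists are hypothetical; only the arithmetic matters.
from itertools import product

mobiles   = ["mobile_A", "mobile_B", "mobile_C"]   # n = 3 manufacturers
pdas      = ["pda_A", "pda_B"]                     # m = 2
tablets   = ["tablet_A", "tablet_B"]               # x = 2
notebooks = ["notebook_A", "notebook_B"]           # y = 2

# Each combination is a distinct platform mix that would have to be
# supported individually (OS, resolution, UI, security features, ...).
combinations = list(product(mobiles, pdas, tablets, notebooks))
print(len(combinations))  # 3 * 2 * 2 * 2 = 24
```

Even with these small counts, adding one more device class or manufacturer multiplies rather than adds to the support burden, which is why per-platform individual handling does not scale.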
INDIVIDUAL BUSINESS APPS
Information system paradox
Before starting to build business apps, nearly every modern approach requires requirements capturing (e.g. process definitions, use cases), a logical information (and/or database) model, a system architecture, a programming architecture, a database architecture, etc.
Most of the work is broken down into very small work pieces so that the resulting application can be provided by anyone, regardless of whether the people work nearby or realize it offshore. This approach implies that every member of the team works at nearly optimum level and that the best expertise is available. Results in past years have proven that this approach leads to a level of complexity that is not easy to handle. To reduce failures, enormous effort is spent on quality assurance and testing of these solutions. As if this were not enough, the aggregation and merging of companies produces more products with more specialized functionality and even more complexity. Once started, this approach leads into the now-familiar technology and complexity trap that freezes companies into one technological direction, regardless of the manufacturer.

A typical success story brings the project leader of such a project into the next higher position, because he has successfully implemented such a standard platform. Looking deeper, he has successfully implemented a highly complex network of usually about 20,000 tables inside a database which contain information in a non-normalized way. These tables are structured in a standard way in which every possible task of any customer is supported. As no single human is able to master a data structure this complex, the enormous efforts are more than easy to explain. In historic approaches to data management, the most efficient way of handling information was the real success. Most of today's standard platforms try to strengthen the addiction to their technology with a maximum of complexity. This antagonism between efficiency and present-day technology is what I call today's information system paradox.
Architectural approach “UniversalCore”
To enable flexible, individual solutions, the customer needs to clarify his information structures. This includes a methodology for collecting information objects and their dependencies. Identifying and collecting these information objects is a common way of building logical data models based on business requirements and processes. In my approach, this stage is the pure business requirement which the end user needs realized.
This logical model includes information objects as generalized representations of the real business objects. It consists of attributes and of relations between these attributes and objects. As this is already included in object databases, it is state of the art. The advantage of a new approach arises if it becomes possible to freeze these objects into a special layer which provides independence from the actual technology. This layer should manage the technical database system and the views onto the result sets needed by the following layers. In addition, the layer should also handle the history of the information as well as additional information, for example legal information, sources, etc. This layer "encapsulates" the information from technical aspects and stores the information and its dependencies. Simply put: it holds the logical data model and the corresponding data in one area. A great advantage would be the separation of the database development process, which can now be driven independently of the data management process, provided the environment supports the automated generation of simple standard applications such as master data management or reports. This approach needs to be "neutral", meaning it can be used by every business application inside the company. Data exchange is included via integrated import and export areas. An export area consists of a single view or a set of views which are made available to a group of users. Users can also use other systems or platforms. To complete the picture, the user is supported by requirement- and workflow-optimized applications. All this runs on a standardized infrastructure to minimize administration. These cores are categorized by specialized tasks (departments).
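The idea of deriving stable standard views from the logical model can be sketched as follows. The class/attribute metadata, the naming scheme and the generic storage table are hypothetical illustrations, not the actual IQIMS metamodel:

```python
# Sketch: generate a standard SQL view from a logical class definition,
# so that consuming layers always see the same stable structure no matter
# how the data is physically stored. All names are hypothetical examples.

def generate_view(class_name, attributes, storage_table="generic_values"):
    """Build CREATE VIEW text that pivots a generic storage table
    (class, object_id, attribute, value) into one column per attribute."""
    cols = ",\n  ".join(
        f"MAX(CASE WHEN attribute = '{a}' THEN value END) AS {a}"
        for a in attributes
    )
    return (
        f"CREATE OR REPLACE VIEW v_{class_name} AS\n"
        f"SELECT object_id,\n  {cols}\n"
        f"FROM {storage_table}\n"
        f"WHERE class = '{class_name}'\n"
        f"GROUP BY object_id"
    )

ddl = generate_view("supplier", ["name", "country", "duns_number"])
print(ddl)
```

Because the view is generated from metadata, a change to the logical model only regenerates the view text; the frontends keep consuming the same view names and column structure.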
To support this encapsulation layer, the architecture needs to be managed by metadata-based data management. The separation of technical representation and logical model provides the flexibility and abstraction needed to build individually optimized databases without repeating standard problems. Best-practice experience and user requirements in metadata management lead to the following requirements for the data management system:
- Metadata-driven, multilayer, object-oriented repository system similar to a data warehouse
- Object classes, object attributes, inheritance, object hierarchies
- The logical data layer model offers a transparent and historical bidirectional view along the time axis
- Automated, logged auditing
- Separation of logical and physical primary keys to publish normalized data without losing the original data structure
- Generalization of the physical structure and dynamic code generation reduce the effort for indexing, primary keys and foreign keys to zero
- Reduction of equal data types through efficient references and storage in generalized standard data types
- Categorization of data, i.e. different characteristics of data depending on its defined context (e.g. language, currency)
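A minimal sketch of how such generalized, fully historicized storage with a bidirectional view along the time axis might look. The schema, attribute names and dates are hypothetical illustrations, not the IQIMS implementation:

```python
# Sketch: generalized attribute/value storage with timeslices, illustrating
# revision-safe, historical data (hypothetical schema, in-memory SQLite).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
CREATE TABLE generic_values (
    object_id   INTEGER,
    attribute   TEXT,
    value       TEXT,
    valid_from  TEXT,   -- timeslice start (ISO date)
    valid_to    TEXT    -- NULL = still valid; old rows are never deleted
)""")

# Initial value, then a revision: the old row is closed, never overwritten.
db.execute("INSERT INTO generic_values VALUES (1, 'currency', 'DEM', '1995-01-01', '2002-01-01')")
db.execute("INSERT INTO generic_values VALUES (1, 'currency', 'EUR', '2002-01-01', NULL)")

def value_at(object_id, attribute, date):
    """Query any historical state along the time axis."""
    row = db.execute(
        "SELECT value FROM generic_values "
        "WHERE object_id = ? AND attribute = ? AND valid_from <= ? "
        "AND (valid_to IS NULL OR valid_to > ?)",
        (object_id, attribute, date, date)).fetchone()
    return row[0] if row else None

print(value_at(1, "currency", "1999-06-01"))  # DEM
print(value_at(1, "currency", "2010-06-01"))  # EUR
```

Because rows are only ever closed, never updated or deleted, every past state of the data remains reproducible, which is the basis of revision-safe auditing.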
The separation of layers enables the encapsulation framework to focus on the management of data. An example of an architectural framework is shown below:
Standard applications which can be generalized are typically:
- Administration applications
- User/role administration,
- Administration of the logical data model,
- Administration of user-defined views and analyses,
- Master data management application,
- Analysis functionalities, user-defined views for data analysis, basic data visualization,
- Data exchange applications.
Long-term experience has proven that 70% of basic functionality can be covered by these standard applications.
To build a flexible and reliable architecture, the database concept and model are among the most important aspects of the backbone. In order to support this, the architecture needs to be stable, easy to maintain, transparent and flexible enough to be open to future enhancements. The basic concept consists of a task-oriented structure for the areas of metamodel storage, data storage, the management area, the application definition area and the user/role area.
The metamodel storage area contains the definitions of classes, attributes, relations and class inheritance (hierarchy). The definitions are themselves managed by metadata such as domains, timeslices, etc.
Additionally, the application area contains metadata needed by the applications, such as definitions, styles, setups, etc. Last but not least, user and role management handles access privileges in addition to the database's built-in security. As the data is held separately, user rights need to be organized. With all these modules, the database architecture forms a complete engine for defining and managing reliable data to solve nearly every requirement.
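How the user/role area can gate access to the separately held data can be sketched as follows; the role model, users and view names are hypothetical examples, not the IQIMS security model:

```python
# Sketch: user/role management gating access to published views, on top of
# the database's built-in security. All names are hypothetical.

roles = {
    "md_editor": {"v_supplier", "v_material"},   # role -> granted views
    "analyst":   {"v_supplier"},
}
users = {"alice": {"md_editor"}, "bob": {"analyst"}}  # user -> roles

def may_access(user, view):
    """A user may read a view if any of their roles grants it."""
    return any(view in roles[r] for r in users.get(user, set()))

print(may_access("alice", "v_material"))  # True
print(may_access("bob", "v_material"))    # False
```

Keeping the grant information in the metamodel rather than scattering it across applications means a single place decides which exported views each user group can see.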
Most of the business intelligence is stored inside the abstraction layer, so middleware is needed to present the applications to the user, support other architectures, support services and handle the frontend applications for web presentation. This layer is also required for reliable data exchange, if needed, as figured below. The following picture shows a data exchange environment called a workbook. This module manages the secure data exchange between communication partners and includes the arrival of data packages, the examination and acceptance of data, and the integration into the data environment.
At the top level of this architecture, developers are free to decide which frontend they need to use. As the database layer provides standard database views, development is absolutely free in how to present this data. Standard frontend platforms at the moment are Oracle Forms on Oracle Middleware and Ruby on Rails on Apache middleware, both extended with Java and current frameworks. In the beginning of the development, the metamodels were managed via other tools such as MS Excel and MS Access, or development environments such as Centura, or simple HTML with a ColdFusion server. This gives a short overview of how easy it is to exchange frontends. The frontends can easily be adapted to any new technology, because the database architecture ensures that the data views always have the same structure. Against this background, development towards mobile applications becomes easier.
The data exchange architecture consists of several supported approaches. The easiest is direct online access to the views via a database link with ODBC, JDBC, ODI, etc. This gives every authorized application direct access to the information it works with. The next level is the definition of user-defined views and access via a standard application. These applications allow the user to analyse the data, export it in several formats (XML, Excel, Word, PDF, ASCII, ...) and forward it to the target system. The third level is a trigger-managed data exchange platform which enables workflow-driven data exchange with other systems. This can be a service bus or a web service and is integrated into the middleware. It enables data exchange with legacy systems or enterprise applications; SOA buses can also be addressed and managed in a simplified way to get information into and out of the system. One important aspect of the workflow-driven approach is the possibility of additional quality assurance processes for data exchange. The intelligent identification of data packets and the separation into known and unknown formats/structures enable different handling of this data. Known data is sent into an import area which gives the user pre-loading assurance to optimize data quality and direct (automated) feedback on the results of loading this data. For exporting data, a similar procedure to define the export data sets and to authorize them is provided (if needed). These different ways are just examples of how data exchange can be performed. As most companies have different infrastructures and approaches, every way of exchanging information is possible.
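The pre-loading quality assurance in the import area can be sketched like this; the packet format, field names and quality rules are hypothetical examples:

```python
# Sketch: workflow-driven import area. Incoming packets are separated into
# known/unknown formats; known data passes quality checks before loading,
# and the sender gets automated feedback. All names are hypothetical.

KNOWN_FORMATS = {"supplier_v1"}

def check_packet(packet):
    """Return (status, feedback) without touching the production data."""
    if packet.get("format") not in KNOWN_FORMATS:
        return "quarantined", "unknown format/structure, manual handling"
    errors = [r for r in packet["rows"] if not r.get("name") or "country" not in r]
    if errors:
        return "rejected", f"{len(errors)} row(s) failed quality checks"
    return "accepted", f"{len(packet['rows'])} row(s) ready for loading"

status, feedback = check_packet({
    "format": "supplier_v1",
    "rows": [{"name": "ACME", "country": "DE"}, {"name": "", "country": "FR"}],
})
print(status, feedback)  # rejected 1 row(s) failed quality checks
```

Only packets that pass the checks move on to the loading step, so data quality problems are caught and reported back before they ever reach the core storage.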
Universal Core Approach
Setting up a backbone as described sounds like a real challenge, yet enterprises usually need only three days for the setup. Another aspect: since we live in a networked environment, it should be no problem to identify the data and present it in a mobile application. If the information infrastructure is optimally organized and well structured, this is correct. But usually a company's systems infrastructure spans several generations of information technology. Even companies that claim to have a standardized application platform such as an ERP system usually have many users working with Excel, Access, etc. to compensate for the shortcomings of the central systems. A backbone approach as described supports the end user in collecting all this shadow data in a presentable way, getting free from the shadow approach and moving towards these centrally provided platforms. A mobile business app needs to prove a business enhancement over the current situation or solution environment. To enable these enhancements, a flexible frontend strategy is needed, as the apps may run on different platforms such as PDAs, tablets and notebooks. Every platform can have a different OS, different sensors, functionalities and security features. The following description is of an application in which people can communicate with internal functionalities and place sticky notes on any website for research purposes, serving as an example of a working architecture for workflow support on any device.
Connecting to the backbone
The first challenge for applications is to provide a "company private cloud" which enables the business user to access the needed applications and data. If the company can provide a backbone as described above, the user accesses only an application schema located in a DMZ. To ensure the information infrastructure stays inside the company, every service between the backbone and the web has to be divided onto separate physical hardware, if possible with different (hardened) operating systems. This supports robustness and prevents attackers from hacking into this cloud through a single leak. The main advantage is the freedom to choose any frontend development infrastructure. This enables most companies to continue with best-practice approaches and minimizes dependencies on any vendor. In addition, rapid prototyping, Scrum and all other software development methodologies are supported without limitation. In the worst case, even Access or Excel applications could access the data inside the backbone.
Resulting Universal Core Infrastructure
By implementing a Universal Core infrastructure, companies are able to support their central systems, enhance functionality quickly and interface with all necessary information partners. With a consistent realization of this infrastructure model, a company can dramatically minimize the costs of interfacing and information management, because all information is held inside one "core". Ideally the new infrastructure will look like figure 12 or figure 3.
To provide a "business-ready" application to the end user, many steps in the background have to be taken into account.
First, the security issue. Many managers like to work remotely, but the more functionality a mobile device offers, the greater the risk of losing a dangerous amount of information. The business impact can be enormous. How dangerous a leak can be was shown by the break-in at the security company HBGary and by the crack of the Sony network. Even in security-aware environments, the WikiLeaks disaster is a good example of data being collected without a corresponding security environment.
The second aspect is the provision of a reliable backbone to support the company's workflows: supporting business processes in order to raise the business potential.
The third aspect, a flexible frontend framework to minimize customization, will be one of the current challenges in providing secure business apps. Business needs to be supported without months of development. With regard to company strategy, the provision of business solutions is necessary to support the mobile work life. User acceptance depends directly on the comfort, the time to provision and the benefit.
This leads to the fourth aspect: how a business app can be measured against productivity and quality criteria.
And last but not least, the technology has to be discussed with the trade unions and the works council to gain support for the new way of working. The danger of people tracking and surveillance lies very close to the benefit.
These aspects can be solved with a Universal Core approach, because data is secured, users are managed, interfaces are transparent, and individual applications are centrally managed. These are the key success factors for providing an efficient and secure information management infrastructure.
Henning, Edward. 2011. More background on the US security firm break-in. heise-online Security UK (Feb. 2011). http://www.h-online.com/security/news/item/More-background-on-the-US-security-firm-break-in-1191797.html
Löw, Alexander. 2010. Cost Efficient Information Management am Beispiel Eurofighter. Presentation at the "SASPF Introduction Conference", Bonn.
U.S. Army CIO Office. 2011. The Universal Data Core. http://data.army.mil/datastrategy_universal_core.html
NISO Press. 2004. Understanding Metadata. ISBN 1-880124-63-9.
Löw, Alexander. 2011. Building Efficient Mobile Business Applications from Backbone to Frontend.