DataMagic Technical Column Vol.13

(Extra Edition) DataMagic Implementation Case Studies

Introduction

This time, we introduce a real-world case study that applies the tips covered in the DataMagic technical columns so far: a distribution company that implemented DataMagic and achieved master data integration.

Background

In the distribution industry, it is extremely important to understand and manage not only best-selling products but also slow-moving ones, and to build a product lineup that matches market trends. To achieve this, it is desirable to obtain accurate data from a highly accurate information system and apply it to sales strategies and sales promotion plans.

Therefore, distributor A decided to fundamentally review its product code system and build a master management system with the aim of achieving detailed individual item management.

Issues with the existing system

In the existing system, product code master data created by the sales management system, MD system, accounting system, and card company was linked to the subsystems via the mission-critical (core) system in a nightly batch process. As a result, it took up to two days for product codes to be reflected, creating a bottleneck for the required business speed. In addition to a fundamental review of the product code system, the new master data management system was therefore also required to contribute to business speed. To solve these issues, it was decided to build the new master data management system based on the following three policies.

Development policy for the new master management system

  • The core mainframe will be migrated to an open system to ensure flexibility in responding to change. However, to minimize the impact of interface changes on the subsystems, the existing mainframe data formats will continue to be used in the open environment.
  • Ensure the ability to quickly process the number of master types and items, which will increase significantly as the product code system becomes more detailed.
  • The masters scattered across the subsystems will be integrated and organized to build an efficient master integration platform.

New system configuration

In the new master management system, DataMagic was introduced together with the consolidation of the masters, and master editing, which had previously been done in each subsystem, is now performed centrally in the master management system.

Resolution Policy

  1. Supports a variety of data formats to link with multiple platforms
  2. Process large amounts of data in a short time, as required by the business
Positioning of DataMagic: an interface development tool for collecting various master data (such as product code master data) and sending it to the various systems. Data from the sales management, MD, and accounting systems and from the card company is sent to the master data management system, where DataMagic (with the DB Option (Oracle)) exchanges data between HULFT for master data management and the master data management DB (Oracle). HULFT for master data management then sends the data to the POS, MD, DWH, sales management, BI, customer management, and project management systems and to the card companies. The master management system handles:

  1. Master data reception (4 systems, 25 masters)
  2. Master data editing (generation management, differential extraction, logical deletion, and character code conversion for external characters; a sketch of differential extraction is shown below)
  3. Master data transmission (8 systems, 80 masters)
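
As a reference, here is a minimal sketch, in Python rather than DataMagic itself, of the kind of differential extraction and logical deletion performed in the master editing step. The file layouts, column names, and the delete_flag column are hypothetical.

    import csv

    def diff_master(old_path, new_path, key="product_code"):
        """Compare two generations of a product master (hypothetical CSV layout)
        and classify records as added, changed, or logically deleted."""
        def load(path):
            with open(path, newline="", encoding="utf-8") as f:
                return {row[key]: row for row in csv.DictReader(f)}

        old, new = load(old_path), load(new_path)
        added   = [new[k] for k in new.keys() - old.keys()]
        changed = [new[k] for k in new.keys() & old.keys() if new[k] != old[k]]
        # Records that disappeared are not physically removed; they are flagged
        # for logical deletion in the downstream systems.
        deleted = [dict(old[k], delete_flag="1") for k in old.keys() - new.keys()]
        return added, changed, deleted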

DataMagic selection point 1: Support for various data formats

Accommodating the diverse data formats required by the system raised three specific issues: integration with multiple platforms, data migration from the mainframe, and compatibility with mainframe data formats.

Issues

  • To link with a variety of systems across multiple platforms, a wide range of data formats must be supported.
  • Existing mainframe assets need to be migrated to the new system.
  • To minimize the impact of the system switchover, the interfaces between systems will not be changed; the master management system must therefore also be able to handle mainframe data formats.

The key point in selecting DataMagic was that it makes it easier to use mainframe-format data in a Windows environment than other ETL tools.

1) Support for various data formats: the system must be able to handle a variety of data formats in order to link with many systems on multiple platforms.
2) Data migration from mainframes: data held in the existing mainframe systems must be migrated to the new system.
3) Affinity with mainframe formats: because the policy is not to change the interfaces between systems in order to minimize the impact of the switchover, the system must also be able to handle data formats unique to mainframes (fixed length, variable length, multi-layout, EBCDIC, shift codes, and packed decimal data); a sketch of this kind of conversion is shown below.
DataMagic is used as the interface development tool in order to efficiently develop more than 100 interfaces.
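
To illustrate what handling these mainframe-specific formats involves, here is a minimal Python sketch (not DataMagic itself) that reads fixed-length EBCDIC records containing a packed-decimal (COMP-3) field and writes them out as UTF-8 CSV lines. The record layout, field positions, file name, and EBCDIC code page (cp500) are all assumptions for illustration.

    def unpack_comp3(raw, scale=0):
        """Decode an IBM packed-decimal (COMP-3) field: two BCD digits per byte,
        with the sign in the low nibble of the last byte (0xD = negative)."""
        digits = "".join(f"{b >> 4}{b & 0x0F}" for b in raw[:-1])
        digits += str(raw[-1] >> 4)
        sign = -1 if (raw[-1] & 0x0F) == 0x0D else 1
        value = sign * int(digits)
        return value / (10 ** scale) if scale else value

    RECORD_LEN = 30  # hypothetical layout: 13-byte code, 12-byte name, 5-byte packed price

    with open("product_master.dat", "rb") as f:          # hypothetical mainframe extract
        while (rec := f.read(RECORD_LEN)):
            code = rec[0:13].decode("cp500").strip()     # EBCDIC -> str (code page assumed)
            name = rec[13:25].decode("cp500").strip()
            price = unpack_comp3(rec[25:30])
            print(f"{code},{name},{price}")

In the actual system, conversions of this kind, including multi-layout records and shift codes, are handled by DataMagic's conversion functions rather than hand-written code.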

DataMagic selection point 2: Support for large amounts of data and high-speed processing

The business requirement for the new system was to integrate 40 million product code records (16 GB) within five hours, with a performance target of two hours for the same integration. In performance testing, DataMagic completed the integration in 40 minutes, which was a key factor in its selection.

Data flows from the sales management system (where the number of product code master records has increased significantly due to the improved product code system) through the master management system (layout conversion, master conversion, and merging of multiple masters) to the MD system (which requires integration within a short time for optimal product management, i.e. inventory and order management). The business requirement for this flow was 5 hours and the performance requirement 2 hours: 40 million product code master records (16 GB of product code master data) had to be integrated within 2 hours, and DataMagic was adopted as a high-speed ETL tool. A sketch of the merge logic is shown below.
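
As a reference for the transformation logic only (not the performance characteristics), here is a minimal Python sketch of merging product master records from several source systems by product code. File names and column names are hypothetical, and a production tool would process data of this volume in a streaming or sorted-merge fashion rather than holding it all in memory.

    import csv
    from collections import defaultdict

    def merge_masters(paths, out_path, key="product_code"):
        """Merge product master extracts from several source systems into one
        integrated master, keyed by product code (hypothetical CSV layouts)."""
        merged = defaultdict(dict)
        for path in paths:
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    merged[row[key]].update(row)  # later sources override earlier ones

        fieldnames = sorted({col for row in merged.values() for col in row})
        with open(out_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(merged.values())

    # Example: merge_masters(["sales_master.csv", "md_master.csv"], "integrated_master.csv")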

Summary

The benefits of using DataMagic as an interface development tool include:

Point 1: Same-day updates to product code master data. Thanks to high-speed processing of large amounts of data, product code master data that was previously reflected the next day or the day after is now updated the same day, contributing to improved business speed.

Point 2: Wide range of applications thanks to abundant data conversion functions. Support for a variety of data formats (fixed length, variable length, multi-layout, EBCDIC data) allows DataMagic to be used for interface development, as a data migration tool from the mainframe, and for importing and extracting mainframe-specific data formats, consolidating conversion functions in the new master management system.

Point 3: High cost-effectiveness. With a wide range of functions and an improved UI during development, productivity is high, and based on the results at system cutover, the cost was approximately one quarter of developing from scratch.

In closing

This time, we introduced a case study of a DataMagic implementation.
Please download the DataMagic trial version and try out what the technical columns have explained.

  • The trial version is free to use for 60 days.
  • After you sign up for the trial version, you will receive 90 days of free technical support.
