EnterpriseDB is a valuable player in the PostgreSQL world. In addition to providing support, they also deliver very useful tools to easily manage your Postgres environments. Among these we can mention EDB Enterprise Manager, EDB Backup & Recovery Tool, EDB Failover Manager, and so on.
In this post I will present one of the latest additions to the family, EDB Postgres Migration Portal, a helpful tool for migrating from Oracle to Postgres.

To access the Portal, use your EDB account or create one if you don't have one yet. By the way, with your account you can also connect to PostgresRocks, a very interesting community platform. Go take a look 🙂 .

Once connected, click on "Create project":

Fill in the fields and click on "Create". Currently it is only possible to migrate from Oracle 11 or 12 to EDB Postgres Advanced Server 10:

All your projects are displayed at the bottom of the page. Click on the "Assess" link to continue:

The migration consists of the following steps:

  1. Extracting the DDL metadata from the Oracle database using EDB's DDL Extractor script
  2. Running assessment
  3. Correcting conflicts
  4. Downloading and running the new DDL statements adapted to your EDB Postgres database
  5. Migrating data

1. Extracting the DDL metadata from the Oracle database

The DDL Extractor script is easy to use. You just need to specify the schema name for which to extract the DDLs and the path where the DDL file will be stored. As you can guess, the script runs Oracle's dbms_metadata.get_ddl function to extract the object definitions:
Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

SQL> select object_type, count(*) from dba_objects where owner='HR' and status='VALID' group by object_type order by 1;

OBJECT_TYPE               COUNT(*)
----------------------- ----------
INDEX                           19
PROCEDURE                        2
SEQUENCE                         3
TABLE                            7
TRIGGER                          2

SQL>

SQL> @edb_ddl_extractor.sql
# -- EDB DDL Extractor Version 1.2 for Oracle Database -- #
# ------------------------------------------------------- #
Enter SCHEMA NAME to extract DDLs : HR
Enter PATH to store DDL file : /home/oracle/migration


Writing HR DDLs to /home/oracle/migration_gen_hr_ddls.sql
####################################################################################################################
## DDL EXTRACT FOR EDB POSTGRES MIGRATION PORTAL CREATED ON 03-10-2018 21:41:27 BY DDL EXTRACTION SCRIPT VERSION 1.2
##
## SOURCE DATABASE VERSION: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
####################################################################################################################
Extracting SYNONYMS...
Extracting DATABASE LINKS...
Extracting TYPE/TYPE BODY...
Extracting SEQUENCES...
Extracting TABLEs...
Extracting PARTITION Tables...
Extracting CACHE Tables...
Extracting CLUSTER Tables...
Extracting KEEP Tables...
Extracting INDEX ORGANIZED Tables...
Extracting COMPRESSED Tables...
Extracting NESTED Tables...
Extracting EXTERNAL Tables..
Extracting INDEXES...
Extracting CONSTRAINTS...
Extracting VIEWs..
Extracting MATERIALIZED VIEWs...
Extracting TRIGGERs..
Extracting FUNCTIONS...
Extracting PROCEDURE...
Extracting PACKAGE/PACKAGE BODY...


DDLs for Schema HR have been stored in /home/oracle/migration_gen_hr_ddls.sql
Upload this file to the EDB Migration Portal to assess this schema for EDB Advanced Server Compatibility.


Disconnected from Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
oracle@vmrefdba01:/home/oracle/migration/ [DB1]
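
Under the hood, the script essentially issues calls like the following for each object (a minimal sketch only; the table name and the SQL*Plus settings are just illustrative examples, the real script handles many more object types and formatting options):

-- Hypothetical example: fetch the CREATE TABLE statement of a single HR object
-- with dbms_metadata.get_ddl(object_type, object_name, schema)
SET LONG 100000
SET PAGESIZE 0
SELECT dbms_metadata.get_ddl('TABLE', 'EMPLOYEES', 'HR') FROM dual;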

2. Assessment

Go back to your browser. It's time to check whether the Oracle schema can be imported into Postgres or not. Upload the output file and click on "Run assessment" to start the check.
The result is presented as follows:

3. Correcting conflicts

We can notice an issue in the report above: the bfile type is not supported by EDB PPAS. You can click on the affected table to get more details about the issue.
Tip: when you need to manage bfile columns in Postgres, you can use the external_file extension.
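If you don't want to install an extension, a simple workaround sketch (hypothetical table and column names, just to illustrate the idea) is to keep the file reference as plain text and let the application read the content from disk:

-- Hypothetical illustration: instead of an Oracle BFILE pointer, store the
-- directory and file name as text; the file content itself stays on the filesystem
CREATE TABLE hr.emp_documents (
    employee_id   integer NOT NULL,
    doc_directory text    NOT NULL,   -- e.g. '/u01/app/files/hr'
    doc_filename  text    NOT NULL
);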
Of course several other conversion issues can happen. A very good point with the Portal is that it provides a knowledge base to solve conflicts. You will find all the necessary information and workarounds by navigating to the "Repair handler" and "Knowledge base" tabs. Moreover, you can make the corrections directly from the Portal.

4. Creating the objects in the Postgres database

Once you have corrected the conflicts and the assessment report shows a 100% success ratio, click on the "Export DDL" button at the top right to download the new creation script adapted for EDB Postgres:
Then connect to your Postgres instance and run the script:
postgres=# \i Demo_HR.sql
CREATE SCHEMA
SET
CREATE SEQUENCE
CREATE SEQUENCE
CREATE SEQUENCE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
CREATE INDEX
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
CREATE PROCEDURE
CREATE PROCEDURE
CREATE TRIGGER
CREATE TRIGGER
postgres=#

Quick check:
postgres=# select object_type, count(*) from dba_objects where schema_name='HR' and status='VALID' group by object_type order by 1;
 object_type | count
-------------+-------
 INDEX       |    19
 PROCEDURE   |     2
 SEQUENCE    |     3
 TABLE       |     7
 TRIGGER     |     2
(5 rows)

Sounds good! All objects have been created successfully.

5. Migrating data

The Migration Portal doesn't provide an embedded solution to import the data, so for that you can use the EDB Migration Toolkit (MTK).
Let's see how it works…
You will find MTK in the edbmtk directory of the {PPAS_HOME}. Inside etc, the toolkit.properties file is used to store the connection parameters of the source and target databases:
postgres@ppas01:/u01/app/postgres/product/10edb/edbmtk/etc/ [PG10edb] cat toolkit.properties
SRC_DB_URL=jdbc:oracle:thin:@192.168.22.101:1521:DB1
SRC_DB_USER=system
SRC_DB_PASSWORD=manager

TARGET_DB_URL=jdbc:edb://localhost:5444/postgres
TARGET_DB_USER=postgres
TARGET_DB_PASSWORD=admin123
postgres@ppas01:/u01/app/postgres/product/10edb/edbmtk/etc/ [PG10edb]

MTK uses JDBC to connect to the Oracle database. You need to download the Oracle JDBC driver (ojdbc7.jar) and store it in the following location:
postgres@ppas01:/home/postgres/ [PG10edb] ll /etc/alternatives/jre/lib/ext/
total 11424
-rw-r--r--. 1 root root 4003800 Oct 20 2017 cldrdata.jar
-rw-r--r--. 1 root root 9445 Oct 20 2017 dnsns.jar
-rw-r--r--. 1 root root 48733 Oct 20 2017 jaccess.jar
-rw-r--r--. 1 root root 1204766 Oct 20 2017 localedata.jar
-rw-r--r--. 1 root root 617 Oct 20 2017 meta-index
-rw-r--r--. 1 root root 2032243 Oct 20 2017 nashorn.jar
-rw-r--r--. 1 root root 3699265 Jun 17 2016 ojdbc7.jar
-rw-r--r--. 1 root root 30711 Oct 20 2017 sunec.jar
-rw-r--r--. 1 root root 293981 Oct 20 2017 sunjce_provider.jar
-rw-r--r--. 1 root root 267326 Oct 20 2017 sunpkcs11.jar
-rw-r--r--. 1 root root 77962 Oct 20 2017 zipfs.jar
postgres@ppas01:/home/postgres/ [PG10edb]

As HR's objects already exist, let's start the data migration with the -dataOnly option (-truncLoad truncates the target tables before loading, and -logBadSQL keeps a log of the statements that fail):
postgres@ppas01:/u01/app/postgres/product/10edb/edbmtk/bin/ [PG10edb] ./runMTK.sh -dataOnly -truncLoad -logBadSQL HR
Running EnterpriseDB Migration Toolkit (Build 51.0.1) ...
Source database connectivity info...
conn =jdbc:oracle:thin:@192.168.22.101:1521:DB1
user =system
password=******
Target database connectivity info...
conn =jdbc:edb://localhost:5444/postgres
user =postgres
password=******
Connecting with source Oracle database server...
Connected to Oracle, version 'Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options'
Connecting with target EDB Postgres database server...
Connected to EnterpriseDB, version '10.5.12'
Importing redwood schema HR...
Loading Table Data in 8 MB batches...
Disabling FK constraints & triggers on hr.countries before truncate...
Truncating table COUNTRIES before data load...
Disabling indexes on hr.countries before data load...
Loading Table: COUNTRIES ...
[COUNTRIES] Migrated 25 rows.
[COUNTRIES] Table Data Load Summary: Total Time(s): 0.054 Total Rows: 25
Disabling FK constraints & triggers on hr.departments before truncate...
Truncating table DEPARTMENTS before data load...
Disabling indexes on hr.departments before data load...
Loading Table: DEPARTMENTS ...
[DEPARTMENTS] Migrated 27 rows.
[DEPARTMENTS] Table Data Load Summary: Total Time(s): 0.046 Total Rows: 27
Disabling FK constraints & triggers on hr.employees before truncate...
Truncating table EMPLOYEES before data load...
Disabling indexes on hr.employees before data load...
Loading Table: EMPLOYEES ...
[EMPLOYEES] Migrated 107 rows.
[EMPLOYEES] Table Data Load Summary: Total Time(s): 0.168 Total Rows: 107 Total Size(MB): 0.0087890625
Disabling FK constraints & triggers on hr.jobs before truncate...
Truncating table JOBS before data load...
Disabling indexes on hr.jobs before data load...
Loading Table: JOBS ...
[JOBS] Migrated 19 rows.
[JOBS] Table Data Load Summary: Total Time(s): 0.01 Total Rows: 19
Disabling FK constraints & triggers on hr.job_history before truncate...
Truncating table JOB_HISTORY before data load...
Disabling indexes on hr.job_history before data load...
Loading Table: JOB_HISTORY ...
[JOB_HISTORY] Migrated 10 rows.
[JOB_HISTORY] Table Data Load Summary: Total Time(s): 0.035 Total Rows: 10
Disabling FK constraints & triggers on hr.locations before truncate...
Truncating table LOCATIONS before data load...
Disabling indexes on hr.locations before data load...
Loading Table: LOCATIONS ...
[LOCATIONS] Migrated 23 rows.
[LOCATIONS] Table Data Load Summary: Total Time(s): 0.053 Total Rows: 23 Total Size(MB): 9.765625E-4
Disabling FK constraints & triggers on hr.regions before truncate...
Truncating table REGIONS before data load...
Disabling indexes on hr.regions before data load...
Loading Table: REGIONS ...
[REGIONS] Migrated 4 rows.
[REGIONS] Table Data Load Summary: Total Time(s): 0.025 Total Rows: 4
Enabling FK constraints & triggers on hr.countries...
Enabling indexes on hr.countries after data load...
Enabling FK constraints & triggers on hr.departments...
Enabling indexes on hr.departments after data load...
Enabling FK constraints & triggers on hr.employees...
Enabling indexes on hr.employees after data load...
Enabling FK constraints & triggers on hr.jobs...
Enabling indexes on hr.jobs after data load...
Enabling FK constraints & triggers on hr.job_history...
Enabling indexes on hr.job_history after data load...
Enabling FK constraints & triggers on hr.locations...
Enabling indexes on hr.locations after data load...
Enabling FK constraints & triggers on hr.regions...
Enabling indexes on hr.regions after data load...
Data Load Summary: Total Time (sec): 0.785 Total Rows: 215 Total Size(MB): 0.01

Schema HR imported successfully.
Migration process completed successfully.

Migration logs have been saved to /home/postgres/.enterprisedb/migration-toolkit/logs

******************** Migration Summary ********************
Tables: 7 out of 7

Total objects: 7
Successful count: 7
Failed count: 0
Invalid count: 0

*************************************************************
postgres@ppas01:/u01/app/postgres/product/10edb/edbmtk/bin/ [PG10edb]

Quick check:
postgres=# select * from hr.regions;
 region_id | region_name
-----------+------------------------
         1 | Europe
         2 | Americas
         3 | Asia
         4 | Middle East and Africa
(4 rows)

Conclusion

Easy, isn't it?
Once again, EnterpriseDB provides a very practical, user-friendly and easy-to-pick-up tool. In my demo the HR schema is pretty simple; migrating a more complex schema can be more challenging. Currently only migrations from Oracle are available, but SQL Server and other legacy databases should be supported in future versions. In the meantime, you can use the EDB Migration Toolkit for those.

That's it. Have fun!