Tuesday, June 10, 2025
Oracle 19c April 2025 Patch
Oracle Critical Patch Update on 19c - April 2025
Oracle Critical Patch Update (CPU)
Patch 37591516 - Combo of OJVM Component Release Update 19.27.0.0.250415 + Grid Infrastructure Apr 2025 Release Update 19.27.0.0.250415
Execute the following commands:
Oracle® Database Patch 37641958 - GI Release Update 19.27.0.0.250415
Download the latest OPatch from "https://updates.oracle.com/download/6880880.html"
Unzip the downloaded patch file.
$ORACLE_HOME/OPatch/opatch prereq CheckConflictAgainstOHWithDetail -phBaseDir $DOWNLOAD_PATCH/37641958/37643161
$ORACLE_HOME/OPatch/opatch prereq CheckConflictAgainstOHWithDetail -phBaseDir $DOWNLOAD_PATCH/37641958/37654975
$GRID_HOME/OPatch/opatchauto apply $DOWNLOAD_PATCH/37641958
Post-Installation Instructions
export PATH=$PATH:/usr/ccs/bin
#$ORACLE_HOME/OPatch/opatch apply $DOWNLOAD_PATCH/37499406
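If the database home was patched with plain opatch (as in the commented command above), the SQL portion of the patch is typically loaded afterwards with datapatch; opatchauto usually runs this step itself. A minimal sketch of the usual post-install check, assuming paths and restart requirements from the patch README:

```
# Run from the patched database home after the instances are back up
$ORACLE_HOME/OPatch/datapatch -verbose
# Verify what is installed
$ORACLE_HOME/OPatch/opatch lspatches
```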
==== DONE ====
Enjoy !
Tuesday, May 7, 2024
physical standby without duplicate command
Create a physical standby database using RMAN without the DUPLICATE command
PURPOSE: All documents on this blog are provided for educational purposes only. Please make sure you run everything in your test environment before moving on to a production environment.
Step 1: Back up the primary database and create a standby controlfile
Step 2: Move the backups to the standby server
Step 3: Make the required changes in the parameter files of both the primary and standby databases
Step 4: Restore and recover the standby database
Step 5: Put the standby database in managed recovery mode
Step 1:
=>> Primary:
Backup controlfile for standby
Step 2:
Move the backups and standby controlfile to the standby server
-- Standby steps
Standby DB
-- RMAN> set dbid=3959250272;   -- only if needed; it was not required in my case
-- Primary DB:
-- check the parameter
-- Standby DB
Wait and check the alert.log files
You can also check the last log apply date and time:
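The restore/recover and managed-recovery steps above can be sketched in RMAN. This is a hedged outline only: the backup path and controlfile name are illustrative, and it assumes the standby instance is started NOMOUNT with the backups already copied over.

```
RMAN> connect target /
RMAN> startup nomount;
RMAN> restore standby controlfile from '/backup/standby_ctl.bkp';
RMAN> alter database mount;
RMAN> catalog start with '/backup/' noprompt;
RMAN> restore database;
RMAN> recover database;
-- Then start managed recovery from SQL*Plus:
-- SQL> alter database recover managed standby database disconnect from session;
```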
Friday, December 15, 2023
Oracle Data Pump
PURPOSE: All documents on this blog are provided for educational purposes only. Please make sure you run everything in your test environment before moving on to a production environment.
ORA-31633: unable to create master table.
You can check the running job:
SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
FROM dba_datapump_jobs
WHERE job_name NOT LIKE 'BIN$%' ORDER BY 1,2;
You can see the master table:
SELECT o.status, o.object_id, o.object_type, o.owner||'.'||object_name "OWNER.OBJECT"
FROM dba_objects o, dba_datapump_jobs j
WHERE o.owner=j.owner_name AND o.object_name=j.job_name
AND j.job_name NOT LIKE 'BIN$%' ORDER BY 4,2;
If the job is orphaned, drop its master table:
drop table SYS.EXPDP_JOB;
STEPS:
1. Create or change a directory for expdp:
CREATE OR REPLACE DIRECTORY DATA_PUMP_DIR AS '/fast_recovery_area/dpump_DB';
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO system;
Check the directory location:
col OWNER format a10;
col DIRECTORY_NAME format a20;
set linesize 200;
SELECT owner, directory_name, directory_path FROM dba_directories;
2. Export a table:
expdp system/xxxxx directory=DATA_PUMP_DIR dumpfile=table_data_%U.dmp tables=KARDBTES.MIGTAB01 content=data_only logfile=export_to_fix.log
3. Import tables:
cd /oracledb/datapump
impdp system/xxxx directory=DATA_PUMP_DIR dumpfile=table_data_%U.dmp tables=KARDBTES.MIGTAB01,KARDBTES.MIGTAB02 logfile=import_to_fix.log parallel=4
Export Schema:
#!/bin/bash
. /users/oracle/set_home_env KARDBTES
expdp "'"/ as sysdba"'" schemas=KARDBTES01 directory=DATA_PUMP_DIR logfile=exp_KARDBTES01_`date +%b_%d_%H_%M_%S.`log JOB_NAME=expdp_KARDBTES01 DUMPFILE=KARDBTES01_%U_`date +%b_%d_%H_%M_%S.`dmp
Or:
4. Create a script to export multiple schemas:
vi datapump_exp_schema.sh
#!/bin/bash
. /users/oracle/set_home_env KARDBTES
expdp "'"/ as sysdba"'" schemas=KARDBTES01,KARDBTES02 directory=DATA_PUMP_DIR parallel=5 logfile=dexp_KARDBTES.log JOB_NAME=dexp_KARDBTES DUMPFILE=kardbtes_%U.dmp
Execute the script as follows:
nohup /oracledb/datapump/datapump_exp_schema.sh >/oracledb/datapump/datapump_exp_schema.log &
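The backtick date expressions in these scripts just bake a timestamp into the dump and log file names; a small stand-alone sketch of the pattern (file names are illustrative):

```shell
#!/bin/bash
# Same naming pattern as the export scripts above: a timestamp in the names.
STAMP=$(date +%b_%d_%H_%M_%S)
DUMPFILE="kardbtes_%U_${STAMP}.dmp"   # %U is expanded by expdp itself, not the shell
LOGFILE="dexp_KARDBTES_${STAMP}.log"
echo "$DUMPFILE"
echo "$LOGFILE"
```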
Import the data into another DB with the same schema name:
vi datapump_imp_schema.sh
#!/bin/bash
. /users/oracle/setdb_home KARDBTES
impdp "'"/ as sysdba"'" schemas=KARDBTES01,KARDBTES02 directory=DATA_PUMP_DIR parallel=5 logfile=dimp_KARDBTES.log JOB_NAME=dimp_KARDBTES DUMPFILE=KARDBTES_%U.dmp TRANSFORM=oid:n
# Execute the job as follows:
nohup /oracledb/datapump/datapump_imp_schema.sh >/oracledb/datapump/datapump_imp_schema.log &
Import as a different schema name:
vi datapump_imp_schema.sh
#!/bin/bash
. /users/oracle/setdb_home KARDBTES
impdp \'/ as sysdba\' schemas=KARDB_DEV REMAP_SCHEMA=KARDB_DEV:KARDB_TST \
exclude=statistics directory=data_pump_dir \
LOGFILE=SANC_DEV_imp.$(date +%d%m%y-%H%M).log \
JOB_NAME=imp_kardb_job DUMPFILE=KARDB_DEV01.210624-1109.dmp
5. At the end, execute the following command. It depends on whether you exported a table or a schema:
sqlplus /
SQL> exec dbms_stats.gather_table_stats(ownname => 'SCHEMA_NAME', tabname => 'TABLE_NAME', estimate_percent => 5, degree => 8, cascade => true);
SQL> alter system flush shared_pool;
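The call above is table-level; when a whole schema was imported, the schema-level variant of the same DBMS_STATS package is the analogous command (schema name is a placeholder):

```
SQL> exec dbms_stats.gather_schema_stats(ownname => 'SCHEMA_NAME', estimate_percent => 5, degree => 8, cascade => true);
```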
6. If you want to export and import only the schema structure, without data, create a script as follows:
more expdb_full_schema_description.sh
#!/bin/bash
. set_home_env KARDBTES
echo $ORACLE_SID
date
YYYYMMDD=`date +%Y%m%d`
expdp "'"/ as sysdba"'" content=METADATA_ONLY \
schemas=schema1,schema2,schema3 \
directory=DATA_PUMP_DIR \
reuse_dumpfiles=y \
dumpfile=kardbtes_description_$YYYYMMDD.dmp \
logfile=kardbtes_description_$YYYYMMDD.log
date
7. You can import the schema as follows. You may need to create the data and index tablespaces before you import.
Example:
Tablespace:
CREATE TABLESPACE DATA
DATAFILE '+DATA01' SIZE 1G AUTOEXTEND ON NEXT 1G MAXSIZE UNLIMITED
LOGGING
DEFAULT NO INMEMORY
ONLINE
EXTENT MANAGEMENT LOCAL AUTOALLOCATE
BLOCKSIZE 16K
SEGMENT SPACE MANAGEMENT AUTO
FLASHBACK ON;
Index tablespace:
CREATE TABLESPACE INDX
DATAFILE '+DATA01' SIZE 1G AUTOEXTEND ON NEXT 1G MAXSIZE UNLIMITED
LOGGING
DEFAULT NO INMEMORY
ONLINE
EXTENT MANAGEMENT LOCAL AUTOALLOCATE
BLOCKSIZE 16K
SEGMENT SPACE MANAGEMENT AUTO
FLASHBACK ON;
#!/bin/bash
. /users/oracle/set_home_env KARDBTES
echo $ORACLE_SID
date
YYYYMMDD=`date +%Y%m%d`
impdp "'"/ as sysdba"'" content=metadata_only \
schemas=schema1,schema2,schema3 \
exclude=statistics \
logfile=kurs1_nodata_MDV_MASTER_impdp_$YYYYMMDD.log \
JOB_NAME=impdp_nodataKARDBTES DUMPFILE=KARDBTES_description_schema_20210319.dmp
date
Exclude some tables in expdp (export the complete schema except some tables):
#!/bin/bash
. /users/oracle/set_home_env KARDBTES
expdp "'"/ as sysdba"'" schemas=schemaname directory=DATA_PUMP_DIR \
logfile=exp_schema.log PARALLEL=7 JOB_NAME=expdp_exclude \
DUMPFILE=schema_name_%U.dmp \
EXCLUDE=TABLE:\"IN\(\'table1\',\'table2\',\'table3\',\'table4\'\)\"
Exclude some tables in impdp (import the complete schema except some tables):
#!/bin/bash
. /users/oracle/set_home_env KARDBTES
impdp "'"/ as sysdba"'" schemas=schemaname directory=DATA_PUMP_DIR \
logfile=imp_schema.log PARALLEL=7 JOB_NAME=impdp_exclude \
DUMPFILE=schema_name_%U.dmp \
EXCLUDE=TABLE:\"IN\(\'table1\',\'table2\',\'table3\',\'table4\'\)\" \
TABLE_EXISTS_ACTION=REPLACE
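The layered backslash escaping in the EXCLUDE clause is fragile on the command line. One common way around it, sketched below with placeholder schema and table names, is to put the Data Pump options in a parameter file and pass parfile= instead; inside a parfile no shell escaping is needed:

```shell
#!/bin/bash
# Write the Data Pump options to a parfile; names here are placeholders.
cat > /tmp/expdp_exclude.par <<'EOF'
schemas=schemaname
directory=DATA_PUMP_DIR
parallel=7
job_name=expdp_exclude
dumpfile=schema_name_%U.dmp
logfile=exp_schema.log
exclude=TABLE:"IN ('table1','table2','table3','table4')"
EOF
# Then, on the database host:
# expdp "'"/ as sysdba"'" parfile=/tmp/expdp_exclude.par
cat /tmp/expdp_exclude.par
```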