Many databases accumulate invalid objects over time. If you monitor for objects becoming invalid, you learn to ignore the long-standing ones, and before long you tend to ignore every invalid-object warning, and that is not good.
Since these objects cannot be used, why not remove them? Getting rid of code is a bit scary: we might need to write something like it again in the future, so it would be nice to have a backup. Regular backups from before you dropped the objects would work, but will you still have them, or want to restore from them, when the time comes to get that code back? It is better to make a specific backup of just those objects and keep it around for a long time, “just in case”.
This SQL lists the invalid objects and generates both the commands to back up their code and the DDL to drop them once the backup is complete and archived somewhere safe.
-- replace() maps types like PACKAGE BODY to PACKAGE_BODY,
-- the form dbms_metadata.get_ddl expects
SELECT 'Invalid ' || object_type || ': ' || owner || '.' || object_name listing,
       'select dbms_metadata.get_ddl(''' || replace(object_type, ' ', '_')
           || ''',''' || object_name || ''',''' || owner || ''') from dual;' backup_cmd,
       'drop ' || object_type || ' ' || owner || '.' || object_name || ';' drop_cmd
  FROM dba_objects
 WHERE status = 'INVALID'
 ORDER BY owner, object_name;
The first column could be emailed to interested parties to let them know what you are about to do.
The second column can be put into a script and wrapped with spool commands to make the backup file.
The third column is the DDL to actually perform the drops.
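The nested quoting in the query is easy to misread, so here is a small Python sketch (the function name and the sample owner/type/name values are mine, not from the article) that builds the same three strings for one hypothetical invalid object:

```python
# Sketch: build the listing, backup, and drop strings the query produces,
# for a single hypothetical invalid object.

def build_commands(owner: str, object_type: str, object_name: str):
    listing = f"Invalid {object_type}: {owner}.{object_name}"
    # dbms_metadata.get_ddl expects underscores in the type, e.g. PACKAGE_BODY
    meta_type = object_type.replace(" ", "_")
    backup_cmd = (
        f"select dbms_metadata.get_ddl("
        f"'{meta_type}','{object_name}','{owner}') from dual;"
    )
    drop_cmd = f"drop {object_type} {owner}.{object_name};"
    return listing, backup_cmd, drop_cmd

listing, backup_cmd, drop_cmd = build_commands("SCOTT", "PACKAGE BODY", "OLD_PKG")
print(listing)     # Invalid PACKAGE BODY: SCOTT.OLD_PKG
print(backup_cmd)  # select dbms_metadata.get_ddl('PACKAGE_BODY','OLD_PKG','SCOTT') from dual;
print(drop_cmd)    # drop PACKAGE BODY SCOTT.OLD_PKG;
```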
I used these commands around the backup lines to make a backup into a .sql file:
set long 300000 pages 0 lines 140
set trimspool on
set head off
spool invalid_code.sql
…
spool off
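Putting the pieces together, here is a sketch of the whole flow as a two-pass SQL*Plus script (the file name gen_backup.sql is my own choice, and DBA access to dba_objects is assumed): first spool the generated get_ddl calls into a script, then run that script with spool on to capture the actual DDL as the backup file.

```sql
-- Sketch of the full flow in one SQL*Plus session; assumes DBA access.
set long 300000 pages 0 lines 140
set trimspool on
set head off
set feedback off

-- Pass 1: generate a script of dbms_metadata.get_ddl calls.
spool gen_backup.sql
SELECT 'select dbms_metadata.get_ddl(''' || replace(object_type, ' ', '_')
       || ''',''' || object_name || ''',''' || owner || ''') from dual;'
  FROM dba_objects
 WHERE status = 'INVALID'
 ORDER BY owner, object_name;
spool off

-- Pass 2: run the generated script, capturing the DDL as the backup file.
spool invalid_code.sql
@gen_backup.sql
spool off
```

Review invalid_code.sql and archive it before running any of the drop commands.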
