Wednesday, November 1, 2023

Autonomous Health Framework (AHF) Installation

Autonomous Health Framework (AHF) is a one-stop tool for diagnosing issues across your entire system. You can download the latest Autonomous Health Framework (AHF) from the Oracle site via Autonomous Health Framework (AHF) - Including TFA and ORAchk/EXAChk (Doc ID 2550798.1).

If you hit any errors related to Oracle Trace File Analyzer (TFA), see https://sajidkhadarabad.blogspot.com/2018/10/tfa-00104-tfa-00002-oracle-trace-file.html

[root@sajidserver01 tfa]# unzip AHF-LINUX_v23.9.0.zip

Archive:  AHF-LINUX_v23.9.0.zip
replace ahf_setup? [y]es, [n]o, [A]ll, [N]one, [r]ename: A
  inflating: ahf_setup
 extracting: ahf_setup.dat
  inflating: README.txt
  inflating: oracle-tfa.pub

[root@sajidserver01 tfa]# ./ahf_setup -local
AHF Installer for Platform Linux Architecture x86_64
AHF Installation Log : /tmp/ahf_install_239000.log
Starting Autonomous Health Framework (AHF) Installation
AHF Version: 23.9.0 Build Date: 202310
AHF is already installed at /usr/tfa/oracle.ahf
Installed AHF Version: 23.4.2 Build Date: 202305
Do you want to upgrade AHF [Y]|N : Y
Upgrading /usr/tfa/oracle.ahf
Shutting down AHF Services
Upgrading AHF Services
Beginning Retype Index
TFA Home: /usr/tfa/oracle.ahf/tfa
Moving existing indexes into temporary folder
Index file for index moved successfully
Index file for index_metadata moved successfully
Index file for complianceindex moved successfully
Moved indexes successfully
Starting AHF Services
No new directories were added to TFA
Directory /usr/grid/crsdata/sajidserver01/trace/chad was already added to TFA Directories.
Do you want AHF to store your My Oracle Support Credentials for Automatic Upload ? Y|[N] : N
.----------------------------------------------------------------------.
| Host          | TFA Version | TFA Build ID          | Upgrade Status |
+---------------+-------------+-----------------------+----------------+
| sajidserver01 | 23.9.0.0.0  | 23090002023           | UPGRADED       |
| sajidserver02 | 23.9.0.0.0  | 23090002023           | UPGRADED       |
'---------------+-------------+-----------------------+----------------'
Setting up AHF CLI and SDK
AHF is successfully upgraded to latest version.

[root@sajidserver01 bin]# ./tfactl status
.-------------------------------------------------------------------------------------------------------.
| Host          | Status of TFA | PID    | Port | Version    | Build ID              | Inventory Status |
+---------------+---------------+--------+------+------------+-----------------------+------------------+
| sajidserver01 | RUNNING       | 368155 | 8200 | 23.9.0.0.0 | 23090002023           | COMPLETE         |
| sajidserver02 | RUNNING       | 942953 | 8200 | 23.9.0.0.0 | 23090002023           | COMPLETE         |
'---------------+---------------+--------+------+------------+-----------------------+------------------'

[root@sajidserver01 bin]# ./tfactl toolstatus
Running command tfactltoolstatus on sajidserver01 ...
.------------------------------------------------------------------.
|               TOOLS STATUS - HOST : sajidserver01                |
+----------------------+--------------+--------------+-------------+
| Tool Type            | Tool         | Version      | Status      |
+----------------------+--------------+--------------+-------------+
| AHF Utilities        | alertsummary |       23.0.9 | DEPLOYED    |
|                      | calog        |       23.0.9 | DEPLOYED    |
|                      | dbglevel     |       23.0.9 | DEPLOYED    |
|                      | grep         |       23.0.9 | DEPLOYED    |
|                      | history      |       23.0.9 | DEPLOYED    |
|                      | ls           |       23.0.9 | DEPLOYED    |
|                      | managelogs   |       23.0.9 | DEPLOYED    |
|                      | menu         |       23.0.9 | DEPLOYED    |
|                      | param        |       23.0.9 | DEPLOYED    |
|                      | ps           |       23.0.9 | DEPLOYED    |
|                      | pstack       |       23.0.9 | DEPLOYED    |
|                      | summary      |       23.0.9 | DEPLOYED    |
|                      | tail         |       23.0.9 | DEPLOYED    |
|                      | triage       |       23.0.9 | DEPLOYED    |
|                      | vi           |       23.0.9 | DEPLOYED    |
+----------------------+--------------+--------------+-------------+
| Development Tools    | oratop       |       14.1.2 | DEPLOYED    |
+----------------------+--------------+--------------+-------------+
| Support Tools Bundle | darda        | 2.10.0.R6036 | DEPLOYED    |
|                      | oswbb        | 22.1.0AHF    | RUNNING     |
|                      | prw          | 12.1.13.11.4 | RUNNING     |
'----------------------+--------------+--------------+-------------'
Note :-
  DEPLOYED    : Installed and Available - To be configured or run interactively.
  NOT RUNNING : Configured and Available - Currently turned off interactively.
  RUNNING     : Configured and Available.

[root@sajidserver01 bin]# ./tfactl -help
Usage : /usr/19.0.0/grid/bin/tfactl <command> [options]
    commands:diagcollect|analyze|ips|run|start|stop|enable|disable|status|print|access|purge|directory|host|set|toolstatus|uninstall|diagnosetfa|syncnodes|upload|availability|rest|events|search|changes|isa|blackout|rediscover|modifyprofile|refreshconfig|get|version|floodcontrol|queryindex|index|purgeindex|purgeinventory|set-sslconfig|set-ciphersuite|collection
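After an upgrade like the one above, the status checks can be wrapped in a small sanity-check script. This is only a sketch: AHF_HOME is assumed from the install log above and may differ in your environment.

```shell
# Post-upgrade sanity check for AHF (install location assumed from the log above).
AHF_HOME=${AHF_HOME:-/usr/tfa/oracle.ahf}
TFACTL="$AHF_HOME/bin/tfactl"

if [ -x "$TFACTL" ]; then
    "$TFACTL" status       # every node should show RUNNING / COMPLETE
    "$TFACTL" toolstatus   # support tools should be DEPLOYED or RUNNING
else
    echo "tfactl not found at $TFACTL - check AHF_HOME" >&2
fi
```

Run it as root on each node after the installer reports the upgrade as successful.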

Wednesday, October 4, 2023

Mysqldump: Error 2020: Got packet bigger than 'max_allowed_packet'

When using the mysqldump utility to back up a database of considerable size, you may encounter the error shown below.

[mysql@Sajidserver ~]$mysqldump -u root -p<pwd> <DB Name> > /u03/backup.sql 
mysqldump: Error 2020: Got packet bigger than 'max_allowed_packet' bytes when dumping table `log` at row:

To achieve a successful backup, add the "--max_allowed_packet=1024M" option to the mysqldump command, as this will address the issue at hand.

[mysql@Sajidserver ~]$mysqldump -u root -p<pwd> <DB Name> --max_allowed_packet=1024M > /u03/backup.sql
[mysql@Sajidserver ~]$

Alternatively, you can add max_allowed_packet=1024M to the my.cnf file, save it, and run the backup normally.
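A minimal my.cnf fragment for this looks like the following (the [mysqldump] group applies the setting to the mysqldump client; the location of my.cnf, commonly /etc/my.cnf, depends on your installation):

```ini
[mysqldump]
max_allowed_packet=1024M
```

Note that the server has its own max_allowed_packet under the [mysqld] group, which may also need raising if the server side rejects large packets.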

Thursday, July 13, 2023

Machine Learning

Machine learning is a branch of artificial intelligence that provides computers with the ability to learn without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.

Machine learning is the study of algorithms that modify their behavior as they process new data.

Machine learning algorithms are used in many areas, including but not limited to:

1. Image Recognition
2. Natural Language Processing (NLP)
3. Robotics
4. Facial Recognition
5. Stock Market Analysis

Machine learning algorithms can be broken down into two categories: supervised and unsupervised. Supervised learning algorithms use input data that has been labeled by a human to train the machine, while unsupervised learning algorithms do not have this labeled data and instead look for patterns in the raw data itself.

Supervised learning algorithms can be further divided into regression and classification models. Regression models predict continuous values, such as a stock's price over time or a car's speed on a highway, while classification models predict discrete values, such as whether or not someone has cancer or whether a person has purchased something on a website before.


The two main approaches to machine learning are supervised and unsupervised learning. In supervised learning, the training data is labeled, that is, each example is associated with a label indicating its true value, and the algorithm learns to predict those labels for new data. In unsupervised learning, no labels are given to indicate what the correct answers are. Instead, algorithms look for structure in the raw data on their own, for example by grouping items together based on their similarity.

Machine learning algorithms can be grouped into three broad categories: linear methods, non-linear methods, and kernel methods. Linear methods include linear classification and regression; non-linear methods include clustering and anomaly detection; kernel methods include support vector machines and Gaussian processes.

Monday, June 26, 2023

Change Oracle Database Compatible Parameter in Primary and Standby Servers

To change the COMPATIBLE parameter to 19.0.0.0 on both the primary and standby servers, follow the steps below. Please note that this process requires database downtime. Begin by changing the compatibility on the standby, followed by the primary server.


SQL> SELECT value FROM v$parameter WHERE name = 'compatible';

VALUE
-------------------------------------------------------------
12.2.0

ALTER SYSTEM SET COMPATIBLE= '19.0.0.0' SCOPE=SPFILE SID='*';

Bounce the Standby database in the mounted state and restart the Managed Recovery Process.

[oracle@sajidserver01 ~]$ srvctl stop database -d sajid_texas
[oracle@sajidserver01 ~]$ srvctl start database -d sajid_texas -o mount


alter database recover managed standby database disconnect from session;

Now change the compatibility on the primary database. Make sure you take a proper RMAN backup of your database before doing it: once COMPATIBLE has been raised, it cannot be lowered again, so restoring from backup is the only way to revert.

ALTER SYSTEM SET COMPATIBLE= '19.0.0.0' SCOPE=SPFILE SID='*';

Bounce the Primary database now and make sure there is no lag in DGMGRL.

[oracle@sajidserver01 ~]$ srvctl stop database -d sajid_pittsburgh
[oracle@sajidserver01 ~]$ srvctl start database -d sajid_pittsburgh

SQL> SELECT value FROM v$parameter WHERE name = 'compatible';

VALUE
-------------------------------------------------------------
19.0.0.0

DGMGRL> show configuration;
Configuration - SAJID_CONF
  Protection Mode: MaxPerformance
  Members:
  sajid_pittsburgh - Primary database
  sajid_texas - Physical standby database
Fast-Start Failover:  Disabled
Configuration Status:
SUCCESS   (status updated 50 seconds ago)

Your database compatibility at this stage is changed to 19.0.0.0. This is one prerequisite for installing or upgrading Oracle Enterprise Manager to version 13.5.


Friday, March 17, 2023

ORA-28017: The password file is in the legacy format.

 If you've ever encountered the ORA-28017: The password file is in the legacy format error, you know how frustrating it can be to solve. This Oracle Database error occurs when the password file is still in the legacy (pre-12.2) format and you attempt an operation, such as adding a user to the password file, that requires the newer format. In this blog post, we will discuss what this error means and provide steps for resolving it.

The first step in resolving the ORA-28017: The Password File Is In Legacy Format error is understanding why it occurred in the first place. As mentioned earlier, this issue typically arises when the password file was created in the older legacy format instead of the current 12.2 secure format. Operations that need to store new entries in the password file, such as creating a new password-file user, then fail with "ORA-28017" or "Password File Is In Legacy Format".

To resolve this issue quickly and easily without having to upgrade your entire system or reinstall software packages, we suggest following these steps : 

  1. Check whether an existing legacy password file is present on your server (for example, run "ls -ltr" against the password file location reported by srvctl config asm, or against ORACLE_HOME/dbs). If one exists, back it up before removing or recreating it.
  2. Create a new password file in the secure 12.2 format using the "orapwd" utility provided by default under the ORACLE_HOME/bin directory. Please refer to Doc ID 2112456 for more details about creating secure password files and related troubleshooting tips.
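Step 2 can be sketched as follows. This is a hypothetical example: the file name, ORACLE_HOME, and ORACLE_SID are placeholders for your environment, and note that ASM manages its own shared password file through asmcmd, as shown further below.

```shell
# Sketch: build the orapwd command to recreate a password file in the 12.2 format.
# ORACLE_HOME and ORACLE_SID are placeholders; set them for your environment.
ORACLE_HOME=${ORACLE_HOME:-/u01/app/oracle/product/19c/dbhome_1}
ORACLE_SID=${ORACLE_SID:-orcl}
ORAPWD_CMD="$ORACLE_HOME/bin/orapwd file=$ORACLE_HOME/dbs/orapw$ORACLE_SID format=12.2 force=y"
echo "$ORAPWD_CMD"   # review the command, then run it on the server as the oracle owner
```

The format=12.2 argument writes the file in the newer format, and force=y overwrites the existing legacy file.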


SQL> create user asmsnmp identified by <password>;

create user asmsnmp identified by <password>

*

Error at line 1:

ORA-28017: The password file is in the legacy format.

Check the output of srvctl config ASM:

[grid@sajidahmed ~]$ srvctl config ASM

ASM home: <CRS home>
Password file: orapwASM
Backup of Password file: <Location>
ASM listener: LISTENER
ASM instance count: 2
Cluster ASM listener: ASMNETLSNR

sqlplus / as sysasm


SQL> alter system flush passwordfile_metadata_cache;
SQL> select * from v$pwfile_users;

USERNAME        SYSDB SYSOPER SYSASM SYSBACKUP SYSDG SYSKM     CON_ID
--------------- ----- ------- ------ --------- ----- ----- ----------
SYS             TRUE  TRUE    FALSE  FALSE     FALSE FALSE          0


Now create the ASMSNMP user and grant it privileges; this resolves the issue.

[grid@sajidahmed ~]$ asmcmd orapwusr --add ASMSNMP

Enter Password: *************

[grid@sajidahmed ~]$ asmcmd orapwusr --grant sysasm ASMSNMP

[grid@sajidahmed ~]$ sqlplus asmsnmp/<password> as sysasm
SQL*Plus: Release 19.0.0.0.0 - test on Fri Mar 17 13:31:06 2023
Version 19.18.0.0.0
Copyright (c) 1982, 2022, Oracle.  All rights reserved.
Connected to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.18.0.0.0
SQL> show user;
USER is "ASMSNMP"


By following the simple steps above, you should be able to resolve "ORA-28017: The Password File Is In Legacy Format" errors very quickly and without much effort!


Wednesday, February 8, 2023

Data Migration in AWS, GCP, Azure, and OCI

  1. Data migration: what is it?
  2. Types of databases and data
  3. Data migration tools and their names in AWS, GCP, Azure, and OCI
  4. Tools and strategies for migrating data from on-premises to cloud services like AWS, GCP, and OCI
  5. Advantages of both on-premises and cloud databases
  6. Conclusion


The process of moving data from one system to another, such as from an old database to a new one or from an on-premises system to a cloud-based system, is referred to as Data Migration.

Different database management systems (DBMS), including MySQL, Oracle, Postgres, DB2, and SQL Server, are used to manage various sorts of data and databases, including structured data (like that found in a relational database) and unstructured data (such as text and images).

AWS, GCP, and OCI (Oracle Cloud Infrastructure) each offer data transfer technologies: the AWS Database Migration Service, the GCP Database Migration Service, and the OCI Data Transfer Service. Data migration between different databases and/or cloud platforms is possible using these technologies.

AWS, GCP, Azure, and OCI all offer a variety of data migration tools to help users move data between their different services and platforms.

AWS:
  • AWS Data Migration Service (DMS): A fully managed service that makes it easy to migrate data to and from various databases, data warehouses, and data lakes.
  • AWS Schema Conversion Tool (SCT): A tool that helps convert database schema and stored procedures to be compatible with the target database engine.
  • AWS Database Migration Service (DMS) and AWS SCT can be used together to migrate data and schema both.

GCP:
  • Google Cloud Storage Transfer Service: A fully managed service that allows you to transfer large data sets from on-premises storage to Cloud Storage.
  • Google Cloud Storage Nearline: A storage class that stores data at a lower cost but with a slightly longer retrieval time.
  • Google Cloud SQL: A fully-managed relational database service that makes it easy to set up, maintain, manage, and administer your relational databases on Google Cloud.
  • Cloud Dataflow
  • Cloud Dataproc
  • Cloud SQL
  • Cloud Spanner

Azure:
  • Azure Database Migration Service (DMS)
  • Azure Data Factory
  • Azure Data Lake Storage Gen1
  • Azure Data Lake Storage Gen2
  • Azure Databricks
  • Azure Stream Analytics

OCI:
  • Oracle Cloud Infrastructure Data Transfer Appliance: A physical appliance that allows you to transfer large data sets from your on-premises data center to Oracle Cloud.
  • Oracle Cloud Infrastructure FastConnect: A service that provides a dedicated, private connection between your on-premises data center and Oracle Cloud.
  • Oracle Cloud Infrastructure File Transfer: A service that allows you to transfer files between your on-premises data center and Oracle Cloud.
  • Data Pump
  • Data Integrator
  • Data Migration Assistant
  • GoldenGate
  • SQL Developer



The process typically involves several steps:

  • Identification of the data that needs to be migrated
  • Planning for the migration, including assessing the data's size and complexity, determining the necessary resources, and developing a migration schedule
  • Backup of the existing data
  • Testing the migration process
  • Execution of the migration
  • Verification of the migrated data
  • Switchover to the new cloud-based system
  • Post-migration monitoring and maintenance

It's important to note that the specifics of data migration to the cloud can vary depending on the specific cloud service provider and the type of data being migrated.

When migrating data from an on-premises system to a cloud-based system such as AWS, GCP, Azure, or OCI, the process typically involves several steps, such as assessing the current data and design, planning the migration, and testing the migrated data.

Cloud migration tools and strategies can include various options, such as using pre-built templates and scripts, leveraging cloud-native services, and utilizing third-party migration tools.

The benefits of on-premises and cloud databases vary: on-premises databases provide more control and customization, while cloud-based databases often offer scalability and cost savings.

In conclusion, data migration is the process of transferring data from one system to another. It spans many types of data and databases, and AWS, GCP, Azure, and OCI all provide migration tools for the job. With the right tools and strategies, data can be moved from on-premises systems to cloud-based systems, and both on-premises and cloud databases come with their own benefits and drawbacks.



Saturday, January 14, 2023

Artificial Intelligence (AI) Stages and Progress

 Artificial intelligence is a growing field of study, and it's essential to understand the possibilities and limitations of this technology. While there are many who believe that AI will soon replace human workers, there are others who think that AI has yet to even reach its full potential.

AI can help us automate tasks that would otherwise be done by humans, and it can also improve our lives in ways we never imagined possible before. The future of AI looks bright, and we can't wait to see what comes next!

Artificial Intelligence is on the rise. It's predicted to grow by 33% by 2023 and will only continue to expand as more companies discover its potential.

AI can be used to make work easier and more efficient, saving time and money for businesses. It can also help with data analysis, which allows you to understand your customers better and make better decisions about how you run your business. AI is going to be prevalent in the near future.

Based on capability, AI systems are often grouped into four types:

  • Reactive Machines
  • Limited Memory
  • Theory of Mind
  • Self-aware

AI also has different levels, and each level of intelligence depicts a different stage in the development of AI. Three stages are commonly used to describe the progress of Artificial Intelligence.

  1. The first stage is Artificial Narrow Intelligence (ANI). This refers to AI that can only perform one task at a time.
  2. The second stage is Artificial General Intelligence(AGI). This refers to AI that can perform multiple tasks with equal proficiency.
  3. The third stage is Artificial Super Intelligence(ASI). This refers to AI that has surpassed human intelligence in all areas.

Artificial Narrow Intelligence refers to any machine capable of performing a single task. This includes assistants like Siri, which can only respond to voice commands and run internet searches. It is considered narrow because it doesn't possess intelligence in the same way humans do: ANI systems can't think abstractly, form their own opinions, or generalize skills across different tasks.

Artificial General Intelligence refers to any machine able to perform any human-like task at a level indistinguishable from human performance. Thus far, no such system has been created or tested in the wild.

Artificial Super Intelligence refers to any machine that can outperform humans in virtually every single way possible. Again, this is theoretical, as the skill sets required for superintelligence don't necessarily exist on Earth today. Take education, for example: researchers simply don't have enough information to accurately predict every scenario and solve it right now. And until they do, AI will not be able to fully replace teachers as educators.

I will come up with more interesting topics on Machine Learning in the next blog. Until then, have good fortune!