Have you ever experienced expired Oracle users? Or perhaps seen a user enter the GRACE period just before expiry? This is a common mechanism to ensure that users renew their passwords and follow the security profiles you have specified. But if a service account expires, applications may begin to fail. If you have a monitoring tool that allows you to add your own checks, it is a good idea to look for accounts that are about to expire, so you catch the situation before it becomes a problem. A service account that expires may simply have been given the wrong profile, so verify the profile as well. It does not take long to write a check that covers all your databases. Before an account expires, it enters a GRACE period, which gives you some time to react. In its simplest form, a check may look something like this. Install it in cron and run it every morning.
#!/bin/bash
script_dir="/your_scripts"
logfile="grace.log"
ORACLE_BASE="/your_oracle"
receivers="your-email-address"
running_db=`ps -ef|grep smon|grep -v grep|awk '{print $8}'|awk -F "ora_smon_" '{print $2}'`
export ORACLE_BASE
date > $script_dir/$logfile
for i in `echo $running_db`;do
ORACLE_SID=$i
ORAENV_ASK=NO
. /usr/local/bin/oraenv 1> /dev/null 2> /dev/null
$ORACLE_HOME/bin/sqlplus /nolog <$script_dir/grace.sql |grep tag: > $script_dir/$i.grace
grep "GRACE" $script_dir/$i.grace 1> /dev/null 2> /dev/null
found="$?"
if [ "$found" -eq 0 ]
then
echo "$i contains one or more accounts in grace mode" >> $script_dir/$logfile
cat $script_dir/$i.grace >> $script_dir/$logfile
else
rm -f $script_dir/$i.grace
fi
done
nbo_lines=`cat $script_dir/$logfile|wc -l`
if [ "$nbo_lines" -gt 1 ]; then
cat $script_dir/$logfile | mailx -s "Oracle warning - accounts in GRACE mode" $receivers
fi
exit 0
grace.sql:
-- OS authentication for the oracle user is assumed here
connect / as sysdba
select 'tag:'||username,account_status from dba_users where account_status like '%GRACE%';
exit;
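Since the post also suggests verifying the profile, a related check could list accounts that are about to expire together with their profile's password lifetime. This is a sketch against the standard dba_users and dba_profiles views; the 14-day window is an arbitrary choice:

```sql
-- accounts expiring within the next 14 days, with their profile's password lifetime
select u.username, u.expiry_date, u.profile, p.limit as password_life_time
from dba_users u
join dba_profiles p
  on p.profile = u.profile
 and p.resource_name = 'PASSWORD_LIFE_TIME'
where u.expiry_date < sysdate + 14
order by u.expiry_date;
```

If a service account shows up here with a short PASSWORD_LIFE_TIME, it probably has the wrong profile assigned.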

Wednesday, 7 March 2018
Search for user information
You may have reasons to search for user information in your databases. Maybe you need to know the source of all clients. Let's say you want to move one or more of your databases: then you need to find all the associated clients and alter their configuration. A query in Oracle to view logged-on users will only show you the activity right now.
If you want the complete picture, you need to collect this over a period of time, several weeks. Either run a regular job and save the output, or create a logon trigger that does it. When catching logon information at intervals, you can capture login and logout timing as well.
- select username||'.'||osuser||'.'||machine||'.'||program||'.'||process from v$session where type='USER';
- A scheduled task / cron job that runs on a regular basis, with the purpose of getting an overview of connected clients, can look something like this (you need to create the table s_accounting first)
- declare
ls_day varchar2(8);
ls_time varchar2(5);
cursor c_user is
select username,osuser,machine,count(*) sessions
from v$session
where username <> 'SYS'
and status <> 'KILLED'
and type = 'USER'
group by username,osuser,machine
order by machine,username;
- begin
ls_day := to_char(sysdate,'DDMMYYYY');
ls_time := to_char(sysdate,'HH24:MI');
for c_user_rec in c_user loop
update s_accounting set s_logout=ls_time
where s_day = ls_day and
s_username=c_user_rec.username and
s_osuser=c_user_rec.osuser and
s_machine=c_user_rec.machine;
if SQL%NOTFOUND then
insert into s_accounting (s_day,s_username,s_osuser,s_machine,s_login,s_logout)
values (ls_day, c_user_rec.username, c_user_rec.osuser,
c_user_rec.machine,ls_time, ls_time);
end if;
end loop;
commit;
exception
when others then
rollback;
raise;
end;
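The block above assumes the s_accounting table already exists. A possible definition, inferred from the columns the block inserts (the types and lengths here are assumptions):

```sql
-- inferred from the insert statement above; column lengths are assumptions
create table s_accounting (
  s_day      varchar2(8),    -- DDMMYYYY
  s_username varchar2(128),
  s_osuser   varchar2(128),
  s_machine  varchar2(64),
  s_login    varchar2(5),    -- HH24:MI
  s_logout   varchar2(5)
);
```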
If you use a logon trigger instead, you will have the actual user in variables and can make the code much more compact. Keep in mind that a logon trigger can add extra load on your system if you have a busy web server in front. To get an idea of the amount of logon/logoff activity on your system, you can check the audit log.
- select action_name,count(*) from dba_audit_session where timestamp>(sysdate-7) group by action_name;
Sometimes you need to go deeper. Let's say you need more details about the login, like name, email, phone, etc. This kind of information, if it exists, will be stored in your application data model. Sometimes it is linked to the connections you see, other times not. Most applications have tools to handle this kind of user information. If you do not know the application and really need to get hold of the stored user information, you can ask the vendor, or you can start looking for it yourself. As a start you could try the query below. Check the result and query the objects for user info. Sometimes you need to go wider: look for extra attributes, like last logon, email, short name, anything you can use to qualify the user and possibly add some extra value. Finally, you quite often see applications using a central user repository for authentication. The application's system documentation might give you an idea or two as well. Good luck.
- select 'object='||owner||'.'||object_name from dba_objects where owner not like 'SYS%' and object_type in ('TABLE','VIEW') and (object_name like '%USER%' or object_name like '%LOGIN%' or object_name like '%LOGON%') and owner in (select username from dba_users where account_status='OPEN') order by owner,object_name;
Tuesday, 27 February 2018
oracle dba account - connect through
The system user has the dba role granted. On most systems this user is regarded as an internal Oracle admin user, not used by administrators unless they have to. Administrators should have their own named users with all the necessary rights granted.
If you for some reason want to lend someone an admin user without giving them the password, you can use this trick. It is shown here with the user system, but it works for any user.
SQL> create user kpe identified by kpe;
User created.
SQL> alter user system grant connect through kpe;
User altered.
SQL> connect kpe[system]/kpe
Connected.
SQL> show user
USER is "SYSTEM"
SQL> alter user system revoke connect through kpe;
SQL> drop user kpe;
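To review which connect-through grants already exist in a database, you can query the standard proxy_users data dictionary view:

```sql
-- existing connect-through grants: which proxy may connect as which client
select proxy, client from proxy_users order by client;
```

This is also a useful periodic check, since forgotten proxy grants are easy to overlook.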
Friday, 16 February 2018
GDPR - example 1
The previous post will give you a quick intro to this topic. http://kp-dba.blogspot.no/2018/02/gdpr-practical-approach.html
Please note that this is not a product. It is my own practical approach. It takes quite a bit to dig out all the plugins you need. What you see here is just examples and meant as inspiration.
There's also lots of commercial software that can help you out.
The most important part of this job is to locate all relevant information from underlying systems. The overview will never be better than the program that presents it. Obviously, you can add plugins crawling across several data sources, to find documents, signatures, etc., but you don't start there. As mentioned earlier, I have chosen to use MySQL. It has the functionality I'm looking for. In addition, it is free and can be installed on many platforms. What am I looking for?
- Good support for character sets
- Easy handling of data in / out of db via the file system
- Generally good functionality and easily recognizable
Character sets are important. When you retrieve data from sub systems, the output encoding can vary. If possible, specify character set by retrieval. Keep in mind that the information will be loaded into a new data model and special characters must be represented correctly.
See below for two small examples of extracting data from subsystems. You might consider installing a cron job / scheduled task to refresh the lists on a regular basis.
List of inactive AD accounts:
File inactive-users.bat:
powershell.exe -ExecutionPolicy ByPass -file inactive-users.ps1
File inactive-users.ps1:
import-module activedirectory
$domain = "your-domain"
$DaysInactive = 365
$time = (Get-Date).Adddays(-($DaysInactive))
# get AD Users with LastLogonTimestamp less than 365 days
Get-ADUser -Filter {LastLogonTimeStamp -lt $time } -Properties LastLogonTimeStamp,Name,Company,Department,SamAccountName |
# output to csv
select-object Name,Company,Department,SamAccountName,@{Name="Stamp"; Expression={[DateTime]::FromFileTime($_.lastLogonTimestamp).ToString('yyyy-MM-dd_hh:mm:ss')}} | export-csv inactive-users.csv -notypeinformation -encoding utf8
List of users from one of your applications:
ORAENV_ASK=NO
ORACLE_SID=your-db
ORACLE_HOME=your-oracle-home
. $ORACLE_HOME/bin/oraenv
sqlplus /nolog <<eol|grep "tag:"|awk -F "tag:" '{ print $2 }' > app-user-list.csv
connect user/pw
select 'tag:'||user_id||','||full_name||','||company||','||days_since_last_login from users;
exit;
eol
You now have some nice comma-separated files that represent a small selection of what you're up against. Now it's time to load those files into MySQL. Be sure to set the character set and spot-check your data. If you find you have missed fields containing useful information, add the field and regenerate your csv file.
- create table ad (Name varchar(60), Company varchar(70), Department varchar(40), SamAccountName varchar(40), Stamp varchar(30));
- load data local infile 'inactive-users.csv' into table ad character set utf8 fields terminated by ',';
- show warnings;
- check the content: select * from ad;
- same procedure for app
MySQL is case insensitive by default, but if your system is case sensitive, you can always use the lower and upper functions. The tables now represent the files you have extracted from your subsystems. Depending on the where clause used in the plugin, you might find some weird rows: mistakes from the production environment, mistypings, users created for courses that were never held, and so on. But as you have loaded logon data from several subsystems, you should be able to connect them. What you really want to find is all digital traces for one specific person. You have to get familiar with the data and look for connections. If there are no connections, you can make them: create a table that links the subsystems together. To find your entries you can make your queries wider. If you intend to reload the tables representing the subsystems, be careful about altering them; create new tables or views to add the extra functionality you need. In this example I assume that the Windows short name (samaccountname) is the same as the one used in the app. If that is not the case, you have to build a mapping table and join it in a subselect on the correct id.
- To list inactive users: select * from ad;
- To list inactive users still registered in app: select * from ad where samaccountname in (select user_id from app);
- You might want to add functionality to days_since_last_login
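When the AD short name and the app id do not match, a minimal mapping table along these lines can bridge the two. All names here are hypothetical; adapt them to your own subsystems:

```sql
-- hypothetical mapping between AD short names and application user ids
create table user_map (
  samaccountname varchar(40),
  app_user_id    varchar(40)
);

-- then join via the map instead of comparing the ids directly:
-- select * from ad where samaccountname in
--   (select samaccountname from user_map
--    where app_user_id in (select user_id from app));
```

Keeping the mapping in its own table means you can reload the ad and app tables from fresh csv files without losing the links you have built.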
A simple PHP page presenting inactive users still registered in your app could look something along the lines of:
<!DOCTYPE html>
<html>
<head>
<style>
table, th, td {
border: 1px solid black;
}
</style>
</head>
<body>
<?php
set_include_path('dir ref to your php7 modules');
$servername = "localhost";
$username = "your-acc";
$password = "your-pw";
$dbname = "your-db";
// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
die("Connection failed: " . $conn->connect_error);
}
if (!$conn->set_charset("utf8")) {
printf("Error loading character set utf8: %s\n", $conn->error);
exit();
}
$sql = 'SELECT * from ad where SamAccountName in (select user_id from app) order by stamp';
$result = $conn->query($sql);
if ($result->num_rows > 0) {
echo "<table><tr><th>Name</th><th>Company</th><th>Department</th><th>SamAccountName</th><th>LastLogin</th></tr>";
// output data of each row
while($row = $result->fetch_assoc()) {
echo "<tr><td>" . $row["Name"]. "</td><td>" . $row["Company"] . "</td><td>" . $row["Department"] . "</td><td>" . $row["SamAccountName"] . "</td><td>" . $row["Stamp"] . "</td></tr>";
}
echo "</table>";
} else {
echo "0 results";
}
$conn->close();
?>
</body>
</html>
Wednesday, 14 February 2018
GDPR - a practical approach
GDPR is an EU regulation that strengthens the control we as individuals have over our own digital information. It also limits the amount of information businesses can store about us. Most businesses need to adjust to become compliant with the regulation. I do realise that GDPR is much bigger than what is discussed here; as an IT person I will focus on the applications and the personal information stored in those systems. What does this mean in real life? Well, first of all, let's start with the obvious, formed as questions:
- Do you have procedures for removing customers or former employees from all internal systems, i.e. mail, home areas, access to applications, etc.?
- Do you have obvious candidates in your systems for removal?
- Do you have applications that do not allow deletion of users / customers? If so, are there procedures for making those users anonymous?
- Can you find all digital traces of any given person registered in your systems?
- Do you perform regular checks against your central repository to capture users who are candidates for deletion?
- Are your security policies compliant with the regulation?
- Would it be natural to send mail directly to the user, with information about what type of information you have stored? And maybe, if the user has been inactive for a long time, offer to unsubscribe?
- How good are your procedures? Will they survive an audit?
How many times have you logged in to a web shop and found that they have an impressive amount of information about you? It's nice not to key in all this information every time, but at the same time it is also part of a threat. For how long can they keep this information? Customer lists and passwords are sold and shared, and sometimes also abused.
What about the dentist or the physiotherapist? Maybe you want to switch to a new one. Are they able to transfer data about you to the new physiotherapist? Do they clean up their own systems? Banks and insurance companies are better at this: they can easily move a client from one company to another. Whether they clean up afterwards is another story.
Either way, you need to get started. Maybe you want to buy something, or maybe you want to collect information to establish a status, a starting point. The idea is to make your own plugins and let the system tell you what it looks like. To do that you need someone who can extract information about users in all relevant systems, and maybe also look up the things you initially did not think of. This is typically someone in your IT department who can find the underlying systems and extract the user information you need. Once you have found the information you need, you must assemble it and load it into a separate data model. From there you can either create queries and retrieve information directly, or make some simple reports. Those can come in handy if you need some extra manpower to clean out all the old fun.
I have created a prototype with some examples of how to proceed. I chose to use Apache, MySQL and PHP on Linux, but that is not important here; choose anything you think will do the trick. I will later show bits and pieces of what I have done, and then I reckon you will get an idea of how you can do something similar. Please join in for a discussion.
Friday, 19 January 2018
oracle archive log history
Have you ever measured the amount of change you have in your database? Sometimes it can be a good idea to know. Maybe you need to set up a standby database and need details about the traffic between the two systems. Or maybe you plan to move the primary database and need to evaluate the different options. This will vary quite a bit from day to day, but one query that levels this out on a monthly basis is shown below.
select to_char(first_time, 'yyyy-mm'), count(*)
from V$log_history
group by to_char(first_time, 'yyyy-mm')
order by 1;
Multiply the count by the average size of your archive logs and you will have an idea.
Now may also be a good time to evaluate the number of log switches you have in your database. Every minute is too frequent; every 10 minutes to every hour is a more optimal configuration. Consider resizing the log files (i.e. create a new set of log files and disable the old ones) if you are far off.
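Since redo log files cannot be resized in place, resizing means adding new, larger groups and dropping the old ones. A sketch of the procedure; the paths, group numbers and size here are assumptions for illustration:

```sql
-- add larger redo log groups (paths, group numbers and size are assumptions)
alter database add logfile group 4 ('/u01/oradata/DB1/redo04.log') size 1g;
alter database add logfile group 5 ('/u01/oradata/DB1/redo05.log') size 1g;
alter database add logfile group 6 ('/u01/oradata/DB1/redo06.log') size 1g;

-- switch until the old groups show INACTIVE in v$log, then drop them
alter system switch logfile;
alter database drop logfile group 1;
```

Check the STATUS column in v$log before each drop; a group that is CURRENT or ACTIVE cannot be dropped.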