Traffic summary using iptables

Overview

Using iptables and a simple perl script to analyze data traffic.

Prereq

Perl 5.6+
iptables

Iptables setup
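
The script below expects a dedicated accounting chain named TRAFFIC_ACCT_OUT with one counting rule per monitored host. A minimal sketch of such a setup (the 192.168.0.x addresses are just examples; hook the chain into FORWARD if the box routes traffic for the hosts, or into OUTPUT for locally generated traffic):

/sbin/iptables -N TRAFFIC_ACCT_OUT
/sbin/iptables -I FORWARD -j TRAFFIC_ACCT_OUT

# One rule per host, with no target: the rule only counts packets and bytes.
# Because the target column is then empty, the source address ends up in
# column 7 of "iptables -L TRAFFIC_ACCT_OUT -n -v -x", which is what the
# script below parses.
/sbin/iptables -A TRAFFIC_ACCT_OUT -s 192.168.0.10
/sbin/iptables -A TRAFFIC_ACCT_OUT -s 192.168.0.11
/sbin/iptables -A TRAFFIC_ACCT_OUT -s 192.168.0.12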

Perl script to grab data

I'm putting the data from iptables into a local MySQL database. From there I can analyze it further.

It's a simple two-stage process: 1) get the data from the newly created chains; 2) put it into the database and reset the counters.

Mine is based on collecting data every hour, so I've made a cron entry for that.

Perl script:

#!/usr/bin/perl

use strict;
use warnings;
use DBI;

## Set up the database connection (fill in your own database name, host and credentials)
my $dsn  = 'dbi:mysql:<dbname>:<ip or localhost>:3306';
my $user = '<username>';
my $pass = '<password>';
my $dbh  = DBI->connect($dsn, $user, $pass) or die "Horrible!!\n$DBI::errstr\n";

## Time stamp for this run (localtime months are 0-based and years are offset from 1900)
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time);
$mon++;
$year += 1900;

## Only grab outgoing data: column 2 is the byte counter and column 7 the source
## address (the accounting rules have no target, so that column is empty)
my @aData = `/sbin/iptables -L TRAFFIC_ACCT_OUT -n -v -x | awk '\$1 ~ /^[0-9]+\$/ { printf \"%s, %d \\n\", \$7, \$2 }'`;

foreach (@aData)
{
        chomp;
        my @aSplitter = split(/, /, $_);
        ## If a row for this host/hour already exists, just overwrite its traffic value
        my $sExtraSQL = "ON DUPLICATE KEY UPDATE traffic = ".$dbh->quote($aSplitter[1]);
        my $sSQL = "INSERT INTO traffic (source, year, month, day, hour, traffic) VALUES (?, ?, ?, ?, ?, ?) $sExtraSQL";
        my $sth = $dbh->prepare($sSQL);
        $sth->execute($aSplitter[0], $year, $mon, $mday, $hour, $aSplitter[1]);
}
## Zero the counters so the next run only picks up the coming hour's traffic
my $bResetIptableCounter = `/sbin/iptables -Z TRAFFIC_ACCT_OUT`;

The code is extremely straightforward, with no real error checking. If I miss one hour's traffic it's no biggie, so I haven't put much work into that part.

Cron entry
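
Assuming the script above is saved as /usr/local/bin/traffic_acct.pl (a made-up path) and is executable, an hourly run from root's crontab could look like this:

# Collect and reset the iptables counters at the top of every hour
0 * * * * /usr/local/bin/traffic_acct.pl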


Database create options

For those wanting it: I've made a unique key formed by year+month+day+hour+source. It's highly inefficient, but I'm dealing with a relatively low amount of data on my end (checking 3 hosts, so for a year I'll have a maximum of 3 hosts * 24 hours * 365 days ~= 26,000 entries).
<pre>