SLUG Mailing List Archives
RE: [SLUG] Removing duplicate entries from a file
- To: "MacFarlane, Jarrod" <jmacfarlane@xxxxxxxxxxxxxxxxxx>, <slug@xxxxxxxxxxx>
- Subject: RE: [SLUG] Removing duplicate entries from a file
- From: "Dave Kempe" <david@xxxxxxxxxxxxxxxxxx>
- Date: Wed Sep 6 09:50:02 2000
I think you are best off with Calamaris, the log analyser for Squid. It
turns out really nice log reports with some good detail. It's a Perl script, so
you could modify it if it wasn't up to scratch, but I'm sure it is.
Search for it on Google.
> -----Original Message-----
> From: slug-admin@xxxxxxxxxxx [mailto:slug-admin@xxxxxxxxxxx]On Behalf Of
> MacFarlane, Jarrod
> Sent: Wednesday, 6 September 2000 9:43 AM
> To: 'slug@xxxxxxxxxxx'
> Subject: [SLUG] Removing duplicate entries from a file
> Hey sluggers,
> I need to look at a particular machine's web hits. I am currently using:
> cat /usr/local/squid/logs/access.log |grep 220.127.116.11 |cut -f4 -d"/" >
> This outputs something like:
> and so on....
> The problem is that it has many double-ups... is there a long, confusing
> string of commands that will go through my logfile and remove all but one
> instance of every domain listed?
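[A minimal sketch of the usual sort/uniq answer to the question above. The log
path, client IP, and `cut -f4 -d'/'` come from the original message; the
redirect target in the original was truncated, so `domains.txt` here is just an
illustrative name.]

```shell
# Extract the domains one client hit, keeping a single copy of each.
# cut -f4 -d'/' pulls the hostname out of a standard Squid access.log
# line (the URL's host sits in the fourth slash-delimited field);
# sort -u sorts the list and drops the duplicates in one step.
grep '220.127.116.11' /usr/local/squid/logs/access.log \
    | cut -f4 -d'/' \
    | sort -u \
    > domains.txt
```

[`sort -u` is equivalent to `sort | uniq`; `uniq` alone is not enough because it
only collapses *adjacent* duplicate lines, so the input must be sorted first.]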
> SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
> More Info: http://slug.org.au/lists/listinfo/slug