
RE: [SLUG] Removing duplicate entries from a file


I think you are best off with Calamaris, the log analyser for Squid. It
turns out really nice log reports with some good detail. It's a Perl script,
so you could modify it if it wasn't up to scratch, but I'm sure it is.
Search for it on Google.
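
From memory it just reads the Squid access log on stdin and writes the
report to stdout, so (untested, check the man page for the exact options)
something like:

    cat /usr/local/squid/logs/access.log | calamaris > report.txt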
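
That said, if all you want is the duplicates gone from the one-liner in
your mail below, tacking sort -u onto the end will do it:

    grep 1.2.3.4 /usr/local/squid/logs/access.log | cut -f4 -d"/" | sort -u > logfile.txt

(sort -u keeps one copy of each line; plain uniq only drops adjacent
duplicates, so you would have to sort first anyway.)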

dave :)



> -----Original Message-----
> From: slug-admin@xxxxxxxxxxx [mailto:slug-admin@xxxxxxxxxxx] On Behalf Of
> MacFarlane, Jarrod
> Sent: Wednesday, 6 September 2000 9:43 AM
> To: 'slug@xxxxxxxxxxx'
> Subject: [SLUG] Removing duplicate entries from a file
>
>
> Hey sluggers,
>
> I need to look at a particular machine's web hits. I am currently using:
>
> cat /usr/local/squid/logs/access.log |grep 1.2.3.4 |cut -f4 -d"/" >
> logfile.txt
>
> This outputs something like:
> www.reallynaughtysite.com
> www.smackmeimbad.com
> and so on....
>
> The problem is that it has many double-ups... is there a long, confusing
> string of commands that will go through my logfile and remove all but one
> instance of every domain listed?
>
> Thanks,
> Jarrod.
>
>
> --
> SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
> More Info: http://slug.org.au/lists/listinfo/slug
>