Wireshark mailing list archives

Re: [Wireshark-users] tshark or dumpcap ring buffer limitations


From: Sake Blok <sake () euronet nl>
Date: Fri, 21 May 2010 18:46:22 +0200

On 20 May 2010, at 23:24, Jaap Keuter wrote:
On Thu, 20 May 2010 12:05:09 -0400, Jeff Morriss
<jeff.morriss.ws () gmail com> wrote:
[Redirecting to -dev for this question.]

Jaap Keuter wrote:
On 05/19/2010 07:38 PM, Joseph Laibach wrote:
All,

I'm running a continuous capture of data. I'm trying to use a ring
buffer of 25000 files with an 8 MB file size. The problem is that the
ring buffer starts overwriting after 10000 files. I've tried it with
both dumpcap and tshark. The command uses -b files:25000 -b
filesize:8192. Is there a limit on the size of the ring buffer for
dumpcap and/or tshark?

[...]

That's a fixed limit:

jaap@host:~/src/wireshark/trunk$ grep RINGBUFFER_MAX_NUM_FILES *.h
ringbuffer.h:#define RINGBUFFER_MAX_NUM_FILES 10000
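
(For illustration only: a sketch, not the actual dumpcap option parsing, of the
kind of silent clamping that would make a request for 25000 files wrap after
10000. Only RINGBUFFER_MAX_NUM_FILES is taken from the real ringbuffer.h.)

#define RINGBUFFER_MAX_NUM_FILES 10000

static unsigned int clamp_ring_num_files(unsigned int requested)
{
    /* requests above the compile-time cap are reduced without any error,
     * so "-b files:25000" ends up behaving like "-b files:10000" */
    if (requested > RINGBUFFER_MAX_NUM_FILES)
        return RINGBUFFER_MAX_NUM_FILES;
    return requested;
}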

Hmmm, actually, it's not: if you specify a value of 0 you get 
"unlimited" files.  (I just tried it and killed dumpcap after it created
26,492 files.)

Why have an "upper limit" at all if we also allow unlimited files?

Ehm, when specifying 0 it's not a circular buffer anymore.

This limit appeared in rev 7912, and it seems it was there originally because 
Ethereal kept the old files open, so we would (prior to that commit) run 
out of fds.

Any reason not to just take this constant out and let users specify any 
number?

Any number would mean keeping an array of names of that size as well. And
it's some sort of self-protection, since not all file systems handle a
kazillion files well. But what an appropriate limit would be, who knows?
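
(Again just a sketch with a hypothetical struct, not Wireshark's actual ring
buffer state, to put the "array of names" cost in perspective: one entry per
slot, allocated up front for the whole ring.)

#include <stdlib.h>

typedef struct {
    char *name;   /* full path of this slot's capture file */
} ring_slot_t;

static ring_slot_t *alloc_ring_slots(unsigned int num_files)
{
    /* even 25000 slots is only a couple of hundred kilobytes of pointers,
     * plus the name strings themselves */
    return calloc(num_files, sizeof(ring_slot_t));
}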

Well, since the filenames in the fileset have a 5-digit counter, the upper limit would be 99999 (not 100000, as it has 
to be able to create the next file before deleting the oldest one). Why not give the user the rope to hang themselves 
with if they really need/want to? It's not as if the files are accessed repeatedly, so I wonder whether the performance 
hit of having a kazillion files in one directory is really a problem here...
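
(A sketch to make that arithmetic concrete; the prefix, timestamp format and
.pcap suffix here are assumptions, only the 5-digit counter matters. "%05u"
yields 100000 distinct counter values, and because the ring has to create file
N+1 before it can delete the oldest, the largest usable ring works out to
99999 files.)

#include <stdio.h>

static void build_capture_name(char *buf, size_t len,
                               unsigned int counter, const char *timestamp)
{
    /* e.g. capture_00001_20100521184622.pcap */
    snprintf(buf, len, "capture_%05u_%s.pcap", counter, timestamp);
}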

Just my $0.02


Sake
___________________________________________________________________________
Sent via:    Wireshark-dev mailing list <wireshark-dev () wireshark org>
Archives:    http://www.wireshark.org/lists/wireshark-dev
Unsubscribe: https://wireshark.org/mailman/options/wireshark-dev
             mailto:wireshark-dev-request () wireshark org?subject=unsubscribe

