[sr-dev] [kamailio/kamailio] hash table size issue (#2311)

fborot notifications at github.com
Fri May 1 16:51:29 CEST 2020


I have my hash table set to size = 16 
modparam("htable", "htable", "a=>size=16;autoexpire=0;dbtable=htable;")

I had around 3000 entries in that table, some keys with more than 10 duplicate values (key_type = 1), and we found that when reading the entries of a duplicated key we could not go past index 9, even though the physical table held 17 entries for that key.
I deleted some old entries, so the table is now about 2000 items, and it seems to work properly again (all values are loaded into memory). I say "seems" because I cannot dump all the values:
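To be clear about how I read the duplicated keys: entries loaded with key_type = 1 end up as array items, so I step through them by index, roughly like this (the key name "route" is just an example):
```
# array entries (key_type = 1) are exposed as name[index];
# the loop stops at the first missing index
$var(i) = 0;
while($sht(a=>route[$var(i)]) != $null) {
    xlog("L_INFO", "route[$var(i)] = $sht(a=>route[$var(i)])\n");
    $var(i) = $var(i) + 1;
}
```
A loop like this is where the read stopped at index 9 despite the 17 rows in the database.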
```
./kamcmd htable.dump a
ERROR: reply too big
```
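As a workaround, the table can also be walked from the config with the module's iterator instead of the RPC dump (the iterator name "it1" is arbitrary):
```
# iterate over all items of table "a" and log each key/value pair
sht_iterator_start("it1", "a");
while(sht_iterator_next("it1")) {
    xlog("L_INFO", "key: $shtitkey(it1) value: $shtitval(it1)\n");
}
sht_iterator_end("it1");
```
Individual entries can still be checked with `kamcmd htable.get a keyname`.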

But I checked manually all the entries that were not being loaded before, and they are present now.
Questions:
1- is the size parameter the number of buckets or the number of entries?
2- is 16 a good value for a table with 3000 entries? (2^16 is 65k, so I figured it would be OK; see the quick check after this list)
3- can the way I define my keys impose a limitation?
4- what is the max size of an htable that can be dumped with the above command?
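For question 2, my back-of-the-envelope check, assuming (per my reading of the module docs) that size is the log2 of the slot count, so size=16 gives 2^16 = 65536 slots:
```
# rough load-factor check; integer math only, so scale by 1000
$var(slots)   = 65536;  # assuming size=16 means 2^16 slots
$var(entries) = 3000;
$var(load1000) = $var(entries) * 1000 / $var(slots);
# ~45, i.e. an average of ~0.045 entries per slot
xlog("L_INFO", "load factor x1000: $var(load1000)\n");
```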


Reply to this email directly or view it on GitHub:
https://github.com/kamailio/kamailio/issues/2311

