Hello,
double check that the key_type for these items in the database
is set to 1.
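For a quick check, something like this would list any rows with a different key_type (a sketch assuming a MySQL backend, the default htable schema, and database name "kamailio" -- adjust names to your setup):

```shell
# List htable rows whose key_type is NOT 1 (array key).
# Assumes the default columns: key_name, key_type, value_type, key_value.
mysql -D kamailio -e \
  "SELECT key_name, key_type, key_value FROM htable WHERE key_type <> 1;"
```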
And instead of kamcmd, try "kamctl rpc ..." (or "kamcli rpc");
these work with the jsonrpcs module, which should not be limited by a
fixed-size buffer for the RPC response.
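Assuming the jsonrpcs module is loaded and kamctl is configured for your instance, the equivalent calls would look like:

```shell
# Same queries over JSON-RPC instead of the size-limited binrpc transport.
kamctl rpc htable.get mytable '1.2.34:5060:10.10.10[2]'
kamctl rpc htable.dump a
```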
Cheers,
Daniel
On 09.03.21 22:44, Henning Westerholt wrote:
Hi Fabian,
just some notes, as it could be different causes.
The hash table size (2^16) is just the number of hash buckets; the table can of course
hold more values than that. See, for example, the Wikipedia article on the hash table
data structure.
So the hash table should be able to hold all of your values; it is a data structure
widely used in kamailio cfg by many people.
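As a toy illustration of the bucket concept (plain shell arithmetic, nothing Kamailio-specific): with only 4 buckets, 5 keys are still all stored; several simply land in the same bucket and are chained there:

```shell
# Map 5 keys into 4 buckets with a modulo "hash": collisions are
# expected and handled by chaining, so capacity is not limited to 4.
# Here four of the five keys share bucket 2.
for key in 10 14 18 22 25; do
  bucket=$((key % 4))
  echo "key $key -> bucket $bucket"
done
```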
Have a look at the kamailio log files to see if you maybe get a memory error (the
hash table is stored in shared memory). The other error you quoted below relates to the
hash table dump command; this can happen especially for large tables. These RPC commands
are usually not meant to frequently dump really large tables; they are more for debugging etc.
Some modules provide a second, more optimized command for dumping data.
Cheers,
Henning
--
Henning Westerholt -
https://skalatan.de/blog/
Kamailio services -
https://gilawa.com
-----Original Message-----
From: sr-users <sr-users-bounces(a)lists.kamailio.org> On Behalf Of Fabian Borot
Sent: Tuesday, March 9, 2021 8:59 PM
To: Kamailio (SER) - Users Mailing List <sr-users(a)lists.kamailio.org>
Subject: [SR-Users] limits in hashtable
Hi
I have a hashtable with about 2500 elements in it. Some of the key_names are the same,
and in the config file I traverse them using the index.
The problem I am having is that not all the values are loaded into memory. Let's say
I have 5 entries with key_name and key_value like this:
1.2.34:5060:10.10.10 --> 0001232555690
1.2.34:5060:10.10.10 --> 0004562555691
1.2.34:5060:10.10.10 --> 0007892555692
1.2.34:5060:10.10.10 --> 0003212555693
1.2.34:5060:10.10.10 --> 0006542555694
when I do this
kamcmd htable.get mytable 1.2.34:5060:10.10.10[0]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[1]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[2]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[3]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[4]
I am supposed to get their values on the console, but sometimes I cannot get the 3rd or
4th value; instead I receive this error:
error: 500 - Key name doesn't exist in htable.
What I have been doing so far is deleting extra/old keys from the physical hashtable and
loading the values again; then it works for a few days. But there is a process that adds
entries to the hashtable, and at some point it breaks again and I have to delete the
old/unused ones once more.
I cannot use 'expire' here because the time it takes for an entry to become "old"
is unknown/variable.
This is how the table is defined; 2^16 is more than enough to hold 2500 values. What am I
missing here?
modparam("htable", "htable",
"a=>size=16;autoexpire=0;dbtable=htable;")
I have taken pcap captures between the kamailio server and the db, and I see all the
values being transferred over the network when kamailio is loading/refreshing the
values from the db.
Is there another limit somewhere?
Also, when I try to dump the values I get 'reply too big', even with these settings:
modparam("ctl", "binrpc_max_body_size", 10240)
modparam("ctl", "binrpc_struct_max_body_size",10240)
./kamcmd htable.dump a
ERROR: reply too big
any assistance is highly appreciated
fborot
_______________________________________________
Kamailio (SER) - Users Mailing List
sr-users(a)lists.kamailio.org
https://lists.kamailio.org/cgi-bin/mailman/listinfo/sr-users