Hi
I have a hashtable with about 2500 elements in it. Some of the key_names are the same, and
in the config file I traverse them by index.
The problem that I am having is that not all the values are loaded into memory. Let's say I
have 5 entries with key_name/key_value pairs like this:
1.2.34:5060:10.10.10 --> 0001232555690
1.2.34:5060:10.10.10 --> 0004562555691
1.2.34:5060:10.10.10 --> 0007892555692
1.2.34:5060:10.10.10 --> 0003212555693
1.2.34:5060:10.10.10 --> 0006542555694
when I do this
kamcmd htable.get mytable 1.2.34:5060:10.10.10[0]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[1]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[2]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[3]
kamcmd htable.get mytable 1.2.34:5060:10.10.10[4]
I am supposed to get their values on the console, but sometimes I cannot get the 3rd or
4th value; instead I receive this error:
error: 500 - Key name doesn't exist in htable.
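If it helps the diagnosis, I can also walk the table from the config with the htable iterator to
see what is actually loaded in memory. A rough sketch (the route and iterator names are just
for illustration):

route[DUMP_A] {
    # sketch: log every item currently held in memory for table "a"
    sht_iterator_start("it", "a");
    while (sht_iterator_next("it")) {
        xlog("L_INFO", "htable a: $shtitkey(it) => $shtitval(it)\n");
    }
    sht_iterator_end("it");
}

That should at least tell me whether the missing indexes are gone from memory or just not
reachable via the indexed RPC lookup.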
What I have been doing so far is to delete extra/old keys from the database table behind the
hashtable and load the values again, and then it works for a few days. But there is a process
that adds entries to the hashtable, and at some point it breaks again and I have to delete the
old/unused ones once more.
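For reference, the manual cleanup/reload cycle is roughly this (just a sketch: the 'kamailio'
database name is an assumption, key_name is the default htable schema column, and the real
DELETE only picks the rows I consider old):

# sketch: remove stale rows for one key from the db table behind the htable,
# then ask kamailio to re-read the table into memory
mysql kamailio -e "DELETE FROM htable WHERE key_name='1.2.34:5060:10.10.10';"
kamcmd htable.reload a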
I cannot use 'expire' here because the time it takes for an entry to become "old" is
unknown/variable.
This is how the table is defined. With size=16 (2^16 slots) it should be more than enough to
hold 2500 values. What am I missing here?
modparam("htable", "htable",
"a=>size=16;autoexpire=0;dbtable=htable;")
I have taken pcap captures between the Kamailio server and the db, and I see all the values
being transferred over the network when Kamailio is loading/refreshing the values from
the db.
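To compare that against what actually ends up in memory, I can check the per-table counters,
something like this (sketch; I am assuming these RPCs print the table sizes and slot/item
counts):

# sketch: list the defined tables, then per-table slot/item statistics
kamcmd htable.listTables
kamcmd htable.stats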
Is there another limit somewhere?
Also, when I try to dump the values I get 'reply too big', even with these ctl settings:
modparam("ctl", "binrpc_max_body_size", 10240)
modparam("ctl", "binrpc_struct_max_body_size",10240)
./kamcmd htable.dump a
ERROR: reply too big
Any assistance is highly appreciated.
fborot