[SR-Users] Diameter (CDP) crashes on startup. ERROR: cdp [peerstatemachine.c:634]: I_Snd_CER(): I_Snd_CER(): Error on finding local host address > Socket operation on non-socket

Denys Pozniak denys.pozniak at gmail.com
Fri Jan 18 20:17:39 CET 2019


I have just tested on Kamailio 5.2 and the result is the same.
Maybe the issue is with the config?

[root@test]# kamailio -v
version: kamailio 5.2.1 (x86_64/linux) 947769
flags: STATS: Off, USE_TCP, USE_TLS, USE_SCTP, TLS_HOOKS, USE_RAW_SOCKS,
DISABLE_NAGLE, USE_MCAST, DNS_IP_HACK, SHM_MEM, SHM_MMAP, PKG_MALLOC,
Q_MALLOC, F_MALLOC, TLSF_MALLOC, DBG_SR_MEMORY, USE_FUTEX,
FAST_LOCK-ADAPTIVE_WAIT, USE_DNS_CACHE, USE_DNS_FAILOVER, USE_NAPTR,
USE_DST_BLACKLIST, HAVE_RESOLV_RES
ADAPTIVE_WAIT_LOOPS=1024, MAX_RECV_BUFFER_SIZE 262144 MAX_URI_SIZE 1024,
BUF_SIZE 65535, DEFAULT PKG_SIZE 8MB
poll method support: poll, epoll_lt, epoll_et, sigio_rt, select.
id: 947769
compiled on 11:35:28 Jan 16 2019 with gcc 4.8.5

[root@test]# kamailio -f ./kamailio.cfg -DDDD -E
...
26(17359) DEBUG: cdp [peermanager.c:263]: peer_timer(): peer_timer():
taking care of peers...
26(17359) DEBUG: cdp [peermanager.c:280]: peer_timer(): peer_timer(): Peer
dev-diameter-1.ams.proxy.dev State 0
26(17359) DEBUG: cdp [peerstatemachine.c:90]: sm_process(): sm_process():
Peer dev-diameter-1.ams.proxy.dev State Closed Event Start
26(17359) INFO: cdp [peerstatemachine.c:525]: I_Snd_Conn_Req():
I_Snd_Conn_Req(): Peer dev-diameter-1.ams.proxy.dev
26(17359) INFO: cdp [receiver.c:875]: peer_connect(): peer_connect():
Trying to connect to 10.10.10.20 port 3868
23(17356) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp receiver peer unknown Serviced Peers: ---
23(17356) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
24(17357) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp_receiver_peer=dev-diameter-1.ams.proxy.dev Serviced Peers: ---
24(17357) DEBUG: cdp [receiver.c:112]: log_serviced_peers():  Peer:
dev-diameter-1.ams.proxy.dev  TCP Socket: -1  Recv.State: 0
24(17357) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
26(17359) INFO: cdp [receiver.c:954]: peer_connect(): peer_connect(): Peer
dev-diameter-1.ams.proxy.dev:3868 connected
24(17357) DEBUG: cdp [receiver.c:702]: receive_loop(): select_recv(): There
is something on the fd exchange pipe
24(17357) DEBUG: cdp [receiver.c:711]: receive_loop(): select_recv(): fd
exchange pipe says fd [10] for peer
0x7fc375dedbe8:[dev-diameter-1.ams.proxy.dev]
26(17359) DEBUG: cdp [peermanager.c:143]: log_peer_list(): --- Peer List:
---
24(17357) DEBUG: cdp [peerstatemachine.c:90]: sm_process(): sm_process():
Peer dev-diameter-1.ams.proxy.dev State Wait_Conn_Ack Event I_Rcv_Conn_Ack
24(17357) DEBUG: cdp [diameter_msg.c:184]: AAANewMessage(): AAANewMessage:
param session received null and it's a request!!
24(17357) ERROR: cdp [peerstatemachine.c:634]: I_Snd_CER(): I_Snd_CER():
Error on finding local host address > Socket operation on non-socket
24(17357) DEBUG: cdp [diameter_msg.c:81]: AAABuildMsgBuffer():
AAABuildMsgBuffer(): len=188
24(17357) DEBUG: cdp [receiver.c:1013]: peer_send_msg(): peer_send_msg():
Pipe push [0x7fc375e01da0]
24(17357) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp_receiver_peer=dev-diameter-1.ams.proxy.dev Serviced Peers: ---
24(17357) DEBUG: cdp [receiver.c:112]: log_serviced_peers():  Peer:
dev-diameter-1.ams.proxy.dev  TCP Socket: 10  Recv.State: 0
24(17357) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
24(17357) DEBUG: cdp [receiver.c:756]: receive_loop(): select_recv(): There
is something on the send pipe
24(17357) DEBUG: cdp [receiver.c:769]: receive_loop(): select_recv(): Send
pipe says [0x7fc375e01da0] 8
24(17357) DEBUG: cdp [diameter_msg.c:410]: AAAFreeMessage():
AAAFreeMessage: Freeing message (0x7fc375e01da0) 257
24(17357) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp_receiver_peer=dev-diameter-1.ams.proxy.dev Serviced Peers: ---
24(17357) DEBUG: cdp [receiver.c:112]: log_serviced_peers():  Peer:
dev-diameter-1.ams.proxy.dev  TCP Socket: 10  Recv.State: 0
24(17357) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
26(17359) DEBUG: cdp [peermanager.c:145]: log_peer_list(): State of peer:
Wait_I_CEA FQDN: dev-diameter-1.ams.proxy.dev Port: 3868 Is dynamic
26(17359) DEBUG: cdp [peermanager.c:149]: log_peer_list():
------------------
24(17357) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp_receiver_peer=dev-diameter-1.ams.proxy.dev Serviced Peers: ---
24(17357) DEBUG: cdp [receiver.c:112]: log_serviced_peers():  Peer:
dev-diameter-1.ams.proxy.dev  TCP Socket: 10  Recv.State: 1
24(17357) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
24(17357) DEBUG: cdp [receiver.c:579]: do_receive(): receive_loop():
[dev-diameter-1.ams.proxy.dev] Recv Version 1 Length 204
24(17357) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp_receiver_peer=dev-diameter-1.ams.proxy.dev Serviced Peers: ---
24(17357) DEBUG: cdp [receiver.c:112]: log_serviced_peers():  Peer:
dev-diameter-1.ams.proxy.dev  TCP Socket: 10  Recv.State: 2
24(17357) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
24(17357) DEBUG: cdp [receiver.c:1107]: receive_message():
receive_message(): [dev-diameter-1.ams.proxy.dev] Recv msg 257
24(17357) DEBUG: cdp [peerstatemachine.c:90]: sm_process(): sm_process():
Peer dev-diameter-1.ams.proxy.dev State Wait_I_CEA Event I_Rcv_CEA
24(17357) DEBUG: cdp [peerstatemachine.c:698]:
count_Supported_Vendor_Id_AVPS(): Found 0 Supported_Vendor AVPS
24(17357) DEBUG: cdp [peerstatemachine.c:681]: add_peer_application(): Application 0
of maximum 0
24(17357) DEBUG: cdp [diameter_msg.c:410]: AAAFreeMessage():
AAAFreeMessage: Freeing message (0x7fc375e01da0) 257
24(17357) CRITICAL: <core> [core/mem/q_malloc.c:149]:
qm_debug_check_frag(): BUG: qm: prev. fragm. tail overwritten(0,
abcdefed)[0x7fc375e023c8:0x7fc375e02400]! Memory allocator was called from
cdp: diameter_avp.c:365. Fragment marked by cdp: diameter_avp.c:142. Exec
from core/mem/q_malloc.c:504.
25(17358) DEBUG: cdp [tcp_accept.c:221]: accept_loop(): accept_loop(): No
connection attempts
23(17356) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp receiver peer unknown Serviced Peers: ---
23(17356) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
23(17356) DEBUG: cdp [receiver.c:106]: log_serviced_peers(): --- Receiver
cdp receiver peer unknown Serviced Peers: ---
23(17356) DEBUG: cdp [receiver.c:114]: log_serviced_peers():
--------------------------------------------------------
26(17359) DEBUG: cdp [session.c:396]: cdp_sessions_log(): ------- CDP
Sessions ----------------
26(17359) DEBUG: cdp [session.c:431]: cdp_sessions_log():
-------------------------------------
32(17365) CRITICAL: <core> [core/pass_fd.c:277]: receive_fd(): EOF on 39
32(17365) DEBUG: <core> [core/tcp_main.c:3513]: handle_ser_child(): dead
child 24, pid 17357 (shutting down?)
32(17365) DEBUG: <core> [core/io_wait.h:602]: io_watch_del(): DBG:
io_watch_del (0xa838a0, 39, -1, 0x0) fd_no=39 called
 0(17333) ALERT: <core> [main.c:756]: handle_sigs(): child process 17357
exited by a signal 6
 0(17333) ALERT: <core> [main.c:759]: handle_sigs(): core was generated
 0(17333) INFO: <core> [main.c:781]: handle_sigs(): terminating due to
SIGCHLD
 0(17333) DEBUG: <core> [main.c:783]: handle_sigs(): terminating due to
SIGCHLD
31(17364) INFO: <core> [main.c:836]: sig_usr(): signal 15 received
...
 0(17333) DEBUG: cdp_avp [cdp_avp_mod.c:225]: cdp_avp_destroy(): Destroying
module cdp_avp
 0(17333) INFO: cdp [cdp_mod.c:255]: cdp_exit(): CDiameterPeer child
stopping ...
 0(17333) INFO: cdp [diameter_peer.c:364]: diameter_peer_destroy():
destroy_diameter_peer(): Terminating all children...
 0(17333) INFO: cdp [diameter_peer.c:371]: diameter_peer_destroy():
destroy_diameter_peer(): Waiting for child [17359] to terminate...
...
 0(17333) INFO: cdp [diameter_peer.c:383]: diameter_peer_destroy():
destroy_diameter_peer(): All processes terminated. Cleaning up.
 0(17333) INFO: cdp [worker.c:139]: worker_destroy(): Unlocking workers
waiting on empty queue...
 0(17333) INFO: cdp [worker.c:142]: worker_destroy(): Unlocking workers
waiting on full queue...
 0(17333) DEBUG: cdp [peermanager.c:129]: peer_manager_destroy():
peer_manager_init(): ...Peer Manager destroyed
 0(17333) CRITICAL: cdp [diameter_peer.c:423]: diameter_peer_destroy():
destroy_diameter_peer(): Bye Bye from C Diameter Peer test
 0(17333) INFO: cdp [cdp_mod.c:257]: cdp_exit(): ... CDiameterPeer child
stopped
 0(17333) DEBUG: tm [t_funcs.c:85]: tm_shutdown(): start
 0(17333) DEBUG: tm [t_funcs.c:88]: tm_shutdown(): emptying hash table
 0(17333) DEBUG: tm [t_funcs.c:90]: tm_shutdown(): removing semaphores
 0(17333) DEBUG: tm [t_funcs.c:92]: tm_shutdown(): destroying tmcb lists
 0(17333) DEBUG: tm [t_funcs.c:95]: tm_shutdown(): done
 0(17333) INFO: <core> [core/sctp_core.c:53]: sctp_core_destroy(): SCTP API
not initialized
 0(17333) DEBUG: <core> [core/mem/shm.c:283]: shm_destroy_manager():
destroying memory manager: q_malloc
 0(17333) DEBUG: <core> [core/mem/q_malloc.c:1178]: qm_shm_lock_destroy():
destroying the shared memory lock
 0(17333) DEBUG: <core> [core/mem/pkg.c:91]: pkg_destroy_manager():
destroying memory manager: q_malloc

Configs:

[root@test]# cat diameter.xml
<?xml version="1.0" encoding="UTF-8"?>
<DiameterPeer
        FQDN="10-10-10-10.ams.proxy.dev"
        Realm="proxy.dev"
        Product_Name="Diameter Credit Control"
        Vendor_Id="10415"
        AcceptUnknownPeers="1"
        DropUnknownOnDisconnect="1"
        Workers="4">
        <Peer FQDN="dev-diameter-1.ams.proxy.dev" Realm="proxy.dev"
port="3868"/>
        <Acceptor port="3868" bind="10.10.10.10"/>
        <SupportedVendor vendor="10415"/>
        <DefaultRoute FQDN="dev-diameter-1.ams.proxy.dev" metric="10"/>
        <Auth id="16777216" vendor="10415"/>
        <Auth id="4" vendor="0"/>
        <Acct id="4" vendor="0"/>
</DiameterPeer>

[root@test]# cat kamailio.cfg
#!KAMAILIO
children=5
memdbg=5
memlog=5
debug=3
log_stderror=no
log_facility=LOG_LOCAL0
fork=yes
disable_tcp=no
auto_aliases=no
mpath="/usr/lib64/kamailio/modules/"
loadmodule "kex.so"
loadmodule "tm.so"
loadmodule "sl.so"
loadmodule "rr.so"
loadmodule "xlog.so"
loadmodule "cfg_rpc.so"
loadmodule "cdp.so"
loadmodule "cdp_avp.so"

modparam("cdp","config_file","/etc/kamailio_test/diameter.xml")
modparam("cdp", "debug_heavy", 4)

request_route {
;
}




Thu, Jan 17, 2019 at 19:01, Denys Pozniak <denys.pozniak at gmail.com>:

> Hello!
> Please help me to find the issue with CDP module as it blocks Kamailio to
> start.
>
> [root@10-10-10-10 kamailio]# cat diameter.xml
> <?xml version="1.0" encoding="UTF-8"?>
> <DiameterPeer
>         FQDN="10-10-10-10.ams.proxy.dev"
>         Realm="proxy.dev"
>         Product_Name="Diameter Credit Control"
>         Vendor_Id="10415"
>         AcceptUnknownPeers="1"
>         DropUnknownOnDisconnect="1"
>         Tc="30"
>         Workers="4"
>         QueueLength="32">
>
>         <Peer FQDN="dev-diameter-1.ams.proxy.dev" Realm="proxy.dev"
> port="3868"/>
>         <Acceptor port="3868" bind="10.10.10.10"/>
>         <Auth id="16777216" vendor="10415"/>
>         <Acct id="16777216" vendor="0" />
>         <Auth id="16777216" vendor="10415"/>
>         <Auth id="16777216" vendor="0" />
>         <SupportedVendor vendor="10415"/>
>        <Realm name="proxy.dev">
>                 <Route FQDN="dev-diameter-1.ams.proxy.dev" metric="1"/>
>         </Realm>
>         <DefaultRoute FQDN="dev-diameter-1.ams.proxy.dev" metric="10"/>
> </DiameterPeer>
>
>
> 38(7825) INFO: cdp [worker.c:332]: worker_process(): [1] Worker process
> started...
> 37(7824) INFO: cdp [worker.c:332]: worker_process(): [0] Worker process
> started...
> 39(7826) INFO: cdp [worker.c:332]: worker_process(): [2] Worker process
> started...
> 40(7827) INFO: cdp [worker.c:332]: worker_process(): [3] Worker process
> started...
> 41(7828) INFO: cdp [receiver.c:450]: receiver_process():
> receiver_process(): [] Receiver process doing init on new process...
> 41(7828) INFO: cdp [receiver.c:455]: receiver_process():
> receiver_process(): [] Receiver process starting up...
> 42(7829) INFO: cdp [receiver.c:450]: receiver_process():
> receiver_process(): [dev-diameter-1.ams.proxy.dev] Receiver process doing
> init on new process...
> 42(7829) INFO: cdp [receiver.c:184]: add_serviced_peer():
> add_serviced_peer(): Adding serviced_peer_t to receiver for peer
> [dev-diameter-1.ams.proxy.dev]
> 42(7829) INFO: cdp [receiver.c:455]: receiver_process():
> receiver_process(): [dev-diameter-1.ams.proxy.dev] Receiver process
> starting up...
> 43(7830) INFO: cdp [acceptor.c:81]: acceptor_process(): Acceptor process
> starting up...
>  0(7787) INFO: cdp [cdp_mod.c:244]: cdp_child_init(): ... CDiameterPeer
> child started
> 43(7830) WARNING: cdp [tcp_accept.c:121]: create_socket():
> create_socket(): Trying to open/bind/listen on 10.10.10.10 port 3868
> 43(7830) WARNING: cdp [tcp_accept.c:146]: create_socket():
> create_socket(): Successful socket open/bind/listen on 10.10.10.10 port 3868
> 43(7830) INFO: cdp [acceptor.c:95]: acceptor_process(): Acceptor opened
> sockets. Entering accept loop ...
> 44(7831) INFO: cdp [timer.c:205]: timer_process(): Timer process starting
> up...
> [root@10-10-10-10 kamailio]# 44(7831) INFO: cdp [peerstatemachine.c:525]:
> I_Snd_Conn_Req(): I_Snd_Conn_Req(): Peer dev-diameter-1.ams.proxy.dev
> 44(7831) INFO: cdp [receiver.c:869]: peer_connect(): peer_connect():
> Trying to connect to 10.10.10.20 port 3868
> 44(7831) INFO: cdp [receiver.c:937]: peer_connect(): peer_connect(): Peer
> dev-diameter-1.ams.proxy.dev:3868 connected
> 42(7829) ERROR: cdp [peerstatemachine.c:634]: I_Snd_CER(): I_Snd_CER():
> Error on finding local host address > Socket operation on non-socket
> 42(7829) CRITICAL: <core> [core/mem/q_malloc.c:149]:
> qm_debug_check_frag(): BUG: qm: prev. fragm. tail overwritten(0,
> abcdefed)[0x7f3ecb7ef018:0x7f3ecb7ef050]! Memory allocator was called from
> cdp: diameter_avp.c:365. Fragment marked by cdp: diameter_avp.c:142. Exec
> from core/mem/q_malloc.c:504.
> 45(7832) CRITICAL: <core> [core/pass_fd.c:277]: receive_fd(): EOF on 52
>  0(7787) ALERT: <core> [main.c:739]: handle_sigs(): child process 7829
> exited by a signal 6
>  0(7787) ALERT: <core> [main.c:742]: handle_sigs(): core was generated
>  0(7787) INFO: <core> [main.c:764]: handle_sigs(): terminating due to
> SIGCHLD
>
> DNS record dev-diameter-1.ams.proxy.dev is pointed to 10.10.10.20
>
> GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-114.el7
> Copyright (C) 2013 Free Software Foundation, Inc.
> License GPLv3+: GNU GPL version 3 or later <
> http://gnu.org/licenses/gpl.html>
> This is free software: you are free to change and redistribute it.
> There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
> and "show warranty" for details.
> This GDB was configured as "x86_64-redhat-linux-gnu".
> For bug reporting instructions, please see:
> <http://www.gnu.org/software/gdb/bugs/>...
> Reading symbols from /usr/sbin/kamailio...Reading symbols from
> /usr/lib/debug/usr/sbin/kamailio.debug...done.
> done.
> [New LWP 7829]
> [Thread debugging using libthread_db enabled]
> Using host libthread_db library "/lib64/libthread_db.so.1".
> Core was generated by `kamailio -f kamailio.cfg -DDDD -E'.
> Program terminated with signal 6, Aborted.
> #0  0x00007f3ed7dc0207 in raise () from /lib64/libc.so.6
> Missing separate debuginfos, use: debuginfo-install
> glibc-2.17-260.el7.x86_64 keyutils-libs-1.5.8-3.el7.x86_64
> krb5-libs-1.15.1-34.el7.x86_64 libcom_err-1.42.9-13.el7.x86_64
> libgcc-4.8.5-36.el7.x86_64 libselinux-2.5-14.1.el7.x86_64
> libstdc++-4.8.5-36.el7.x86_64 libxml2-2.9.1-6.el7_2.3.x86_64
> mariadb-libs-5.5.60-1.el7_5.x86_64 openssl-libs-1.0.2k-16.el7.x86_64
> pcre-8.32-17.el7.x86_64 xz-libs-5.2.2-1.el7.x86_64 zlib-1.2.7-18.el7.x86_64
> (gdb) bt
> #0  0x00007f3ed7dc0207 in raise () from /lib64/libc.so.6
> #1  0x00007f3ed7dc18f8 in abort () from /lib64/libc.so.6
> #2  0x000000000067d5f7 in qm_debug_check_frag (qm=0x7f3ecb4d0000,
> f=0x7f3ecb7ef018, file=0x7f3ed068d932 "cdp: diameter_avp.c", line=365,
> efile=0x7df137 "core/mem/q_malloc.c", eline=504)
>     at core/mem/q_malloc.c:151
> #3  0x000000000068077a in qm_free (qmp=0x7f3ecb4d0000, p=0x7f3ecb7ef050,
> file=0x7f3ed068d932 "cdp: diameter_avp.c", func=0x7f3ed068f418
> <__FUNCTION__.7016> "AAAFreeAVP", line=365, mname=0x7f3ed068d760 "cdp")
>     at core/mem/q_malloc.c:504
> #4  0x000000000068a2d6 in qm_shm_free (qmp=0x7f3ecb4d0000,
> p=0x7f3ecb7ef050, file=0x7f3ed068d932 "cdp: diameter_avp.c",
> func=0x7f3ed068f418 <__FUNCTION__.7016> "AAAFreeAVP", line=365,
>     mname=0x7f3ed068d760 "cdp") at core/mem/q_malloc.c:1268
> #5  0x00007f3ed06691c8 in AAAFreeAVP (avp=0x7ffd86bdd300) at
> diameter_avp.c:365
> #6  0x00007f3ed0636d66 in AAAFreeAVPList (avpList=0x7f3ecb7eea40) at
> diameter_msg.c:396
> #7  0x00007f3ed0637123 in AAAFreeMessage (msg=0x7ffd86bdd3b0) at
> diameter_msg.c:416
> #8  0x00007f3ed06088ec in Process_CEA (p=0x7f3ecb7a00d8,
> cea=0x7f3ecb7ee9f0) at peerstatemachine.c:804
> #9  0x00007f3ed0601ab1 in sm_process (p=0x7f3ecb7a00d8, event=I_Rcv_CEA,
> msg=0x7f3ecb7ee9f0, peer_locked=0, sock=8) at peerstatemachine.c:166
> #10 0x00007f3ed0659825 in receive_message (msg=0x7f3ecb7ee9f0,
> sp=0x7f3ed76feec8) at receiver.c:1128
> #11 0x00007f3ed064f295 in do_receive (sp=0x7f3ed76feec8) at receiver.c:593
> #12 0x00007f3ed0653068 in receive_loop (original_peer=0x7f3ecb7a00d8) at
> receiver.c:800
> #13 0x00007f3ed064c7c6 in receiver_process (p=0x7f3ecb7a00d8) at
> receiver.c:459
> #14 0x00007f3ed05fd7de in diameter_peer_start (blocking=0) at
> diameter_peer.c:289
> #15 0x00007f3ed05ef9b1 in cdp_child_init (rank=0) at cdp_mod.c:243
> #16 0x0000000000544e31 in init_mod_child (m=0x7f3ed7630170, rank=0) at
> core/sr_module.c:943
> #17 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7630e90, rank=0) at
> core/sr_module.c:939
> #18 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7631290, rank=0) at
> core/sr_module.c:939
> #19 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7631700, rank=0) at
> core/sr_module.c:939
> #20 0x0000000000545205 in init_child (rank=0) at core/sr_module.c:970
> #21 0x0000000000424f85 in main_loop () at main.c:1701
> #22 0x000000000042b87e in main (argc=5, argv=0x7ffd86bde218) at main.c:2638
>
>
> (gdb) bt full
> #0  0x00007f3ed7dc0207 in raise () from /lib64/libc.so.6
> No symbol table info available.
> #1  0x00007f3ed7dc18f8 in abort () from /lib64/libc.so.6
> No symbol table info available.
> #2  0x000000000067d5f7 in qm_debug_check_frag (qm=0x7f3ecb4d0000,
> f=0x7f3ecb7ef018, file=0x7f3ed068d932 "cdp: diameter_avp.c", line=365,
> efile=0x7df137 "core/mem/q_malloc.c", eline=504)
>     at core/mem/q_malloc.c:151
>         __FUNCTION__ = "qm_debug_check_frag"
> #3  0x000000000068077a in qm_free (qmp=0x7f3ecb4d0000, p=0x7f3ecb7ef050,
> file=0x7f3ed068d932 "cdp: diameter_avp.c", func=0x7f3ed068f418
> <__FUNCTION__.7016> "AAAFreeAVP", line=365, mname=0x7f3ed068d760 "cdp")
>     at core/mem/q_malloc.c:504
>         qm = 0x7f3ecb4d0000
>         f = 0x7f3ecb7ef018
>         size = 140726864040496
>         next = 0x7df14b
>         prev = 0x7f3ed067f329
>         __FUNCTION__ = "qm_free"
> #4  0x000000000068a2d6 in qm_shm_free (qmp=0x7f3ecb4d0000,
> p=0x7f3ecb7ef050, file=0x7f3ed068d932 "cdp: diameter_avp.c",
> func=0x7f3ed068f418 <__FUNCTION__.7016> "AAAFreeAVP", line=365,
>     mname=0x7f3ed068d760 "cdp") at core/mem/q_malloc.c:1268
> No locals.
> #5  0x00007f3ed06691c8 in AAAFreeAVP (avp=0x7ffd86bdd300) at
> diameter_avp.c:365
>         __FUNCTION__ = "AAAFreeAVP"
> #6  0x00007f3ed0636d66 in AAAFreeAVPList (avpList=0x7f3ecb7eea40) at
> diameter_msg.c:396
>         avp_t = 0x7f3ecb7ef050
>         avp = 0x7f3ecb7eef40
> #7  0x00007f3ed0637123 in AAAFreeMessage (msg=0x7ffd86bdd3b0) at
> diameter_msg.c:416
>         __FUNCTION__ = "AAAFreeMessage"
> #8  0x00007f3ed06088ec in Process_CEA (p=0x7f3ecb7a00d8,
> cea=0x7f3ecb7ee9f0) at peerstatemachine.c:804
>         avp = 0x7f3ecb7ef440
> #9  0x00007f3ed0601ab1 in sm_process (p=0x7f3ecb7a00d8, event=I_Rcv_CEA,
> msg=0x7f3ecb7ee9f0, peer_locked=0, sock=8) at peerstatemachine.c:166
>         result_code = -798486160
>         next_event = 32574
>         msg_received = 0
>         __FUNCTION__ = "sm_process"
> #10 0x00007f3ed0659825 in receive_message (msg=0x7f3ecb7ee9f0,
> sp=0x7f3ed76feec8) at receiver.c:1128
>         avp1 = 0xd0684fc0
>         avp2 = 0x7f3ecb7eed10
>         __FUNCTION__ = "receive_message"
> #11 0x00007f3ed064f295 in do_receive (sp=0x7f3ed76feec8) at receiver.c:593
>         cnt = 184
>         n = 184
>         version = 1
>         dst = 0x7f3ecb7ef4f4 ""
>         dmsg = 0x7f3ecb7ee9f0
>         __FUNCTION__ = "do_receive"
> #12 0x00007f3ed0653068 in receive_loop (original_peer=0x7f3ecb7a00d8) at
> receiver.c:800
>         rfds = {__fds_bits = {256, 0 <repeats 15 times>}}
>         efds = {__fds_bits = {0 <repeats 16 times>}}
>         tv = {tv_sec = 0, tv_usec = 999998}
>         n = 1
>         max = 49
>         cnt = 1
>         msg = 0x0
>         sp = 0x7f3ed76feec8
>         sp2 = 0x7f3ed76feec8
>         p = 0x7f3ecb7a00d8
>         fd = 8
>         fd_exchange_pipe_local = 49
>         __FUNCTION__ = "receive_loop"
> #13 0x00007f3ed064c7c6 in receiver_process (p=0x7f3ecb7a00d8) at
> receiver.c:459
> ---Type <return> to continue, or q <return> to quit---
>         __FUNCTION__ = "receiver_process"
> #14 0x00007f3ed05fd7de in diameter_peer_start (blocking=0) at
> diameter_peer.c:289
>         pid = 0
>         k = -1
>         p = 0x7f3ecb7a00d8
>         __FUNCTION__ = "diameter_peer_start"
> #15 0x00007f3ed05ef9b1 in cdp_child_init (rank=0) at cdp_mod.c:243
>         __FUNCTION__ = "cdp_child_init"
> #16 0x0000000000544e31 in init_mod_child (m=0x7f3ed7630170, rank=0) at
> core/sr_module.c:943
>         __FUNCTION__ = "init_mod_child"
> #17 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7630e90, rank=0) at
> core/sr_module.c:939
>         __FUNCTION__ = "init_mod_child"
> #18 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7631290, rank=0) at
> core/sr_module.c:939
>         __FUNCTION__ = "init_mod_child"
> #19 0x0000000000544ad3 in init_mod_child (m=0x7f3ed7631700, rank=0) at
> core/sr_module.c:939
>         __FUNCTION__ = "init_mod_child"
> #20 0x0000000000545205 in init_child (rank=0) at core/sr_module.c:970
> No locals.
> #21 0x0000000000424f85 in main_loop () at main.c:1701
>         i = 32
>         pid = 7821
>         si = 0x0
>         si_desc = "udp receiver child=31 sock=10.10.10.10:5060\000\177\000\000`\336\275\206\375\177\000\000A\307g\000\000\000\000\000`\202A\000\000\000\000\000ȖP\313>\177",
> '\000' <repeats 14 times>,
> "\001\000\000\000\260\336\275\206\375\177\000\000\234\240h\000\000\000\000\000
> \243w\000\000\000\000\000H\361m\327>\177\000"
>         nrprocs = 32
>         woneinit = 1
>         __FUNCTION__ = "main_loop"
> #22 0x000000000042b87e in main (argc=5, argv=0x7ffd86bde218) at main.c:2638
>         cfg_stream = 0x2b13020
>         c = -1
>         r = 0
>         tmp = 0x0
>         tmp_len = 0
>         port = 0
>         proto = 0
>         options = 0x758490
> ":f:cm:M:dVIhEeb:l:L:n:vKrRDTN:W:w:t:u:g:P:G:SQ:O:a:A:x:X:Y:"
>         ret = -1
>         seed = 4201223303
>         rfd = 4
>         debug_save = 0
>         debug_flag = 0
>         dont_fork_cnt = 4
>         n_lst = 0x0
>         p = 0x0
>         st = {st_dev = 19, st_ino = 82198, st_nlink = 2, st_mode = 16832,
> st_uid = 995, st_gid = 2, __pad0 = 0, st_rdev = 0, st_size = 60, st_blksize
> = 4096, st_blocks = 0, st_atim = {tv_sec = 1547739314,
>             tv_nsec = 582000000}, st_mtim = {tv_sec = 1547743303, tv_nsec
> = 194148453}, st_ctim = {tv_sec = 1547743303, tv_nsec = 194148453},
> __unused = {0, 0, 0}}
>         __FUNCTION__ = "main"
> (gdb)
>
>
>
> --
>
> BR,
> Denys Pozniak
>
>
>
>

-- 

BR,
Denys Pozniak

