2.2.0alpha3: Internal error

Christopher Odenbach odenbach at hni.uni-paderborn.de
Mon Apr 2 16:09:30 GMT 2001


Hi,

> Can you reproduce this consistently?

Of course :-(

> In order to track this down, recompile smbd with debug support
> (env CFLAGS=-g ./configure) and add
>
> 	panic action = /bin/sleep 90000
>
> to your smb.conf and watch for the error in log.smbd.
> Once you see smbd die, attach a gdb session to it
>
> 	gdb <path to smbd> <pid of dead process>
>
> and send the backtrace.

I did as you requested; here is the result:

(gdb) backtrace
#0  0xef5b924c in waitid ()
#1  0xef5d45cc in _waitpid ()
#2  0xef5e92c0 in system ()
#3  0x19a3c8 in smb_panic (why=0x1f5648 "internal error") at lib/util.c:1131
#4  0x1759f0 in fault_report (sig=11) at lib/fault.c:45
#5  0x175a70 in sig_fault (sig=11) at lib/fault.c:65
#6  <signal handler called>
#7  0xc4c5c in close_policy_hnd (p=0x2eeff0, hnd=0xeffff248)
    at rpc_server/srv_lsa_hnd.c:192
#8  0xe6e3c in close_printer_handle (p=0x2eeff0, hnd=0xeffff248)
    at rpc_server/srv_spoolss_nt.c:270
#9  0xe9794 in _spoolss_closeprinter (p=0x2eeff0, q_u=0xeffff248, 
    r_u=0xeffff230) at rpc_server/srv_spoolss_nt.c:1053
#10 0xe22c8 in api_spoolss_closeprinter (p=0x2eeff0)
    at rpc_server/srv_spoolss.c:143
#11 0xe1844 in api_rpcTNP (p=0x2eeff0, rpc_name=0x1e0f58 "api_spoolss_rpc", 
    api_rpc_cmds=0x213344) at rpc_server/srv_pipe.c:1198
#12 0xe6608 in api_spoolss_rpc (p=0x2eeff0) at rpc_server/srv_spoolss.c:1192
#13 0xe1484 in api_pipe_request (p=0x2eeff0) at rpc_server/srv_pipe.c:1149
#14 0xc9878 in process_request_pdu (p=0x2eeff0, rpc_in_p=0xeffff5d8)
    at rpc_server/srv_pipe_hnd.c:537
#15 0xc9bac in process_complete_pdu (p=0x2eeff0)
    at rpc_server/srv_pipe_hnd.c:609
#16 0xca064 in process_incoming_data (p=0x2eeff0, data=0x28d868 "\024", n=28)
    at rpc_server/srv_pipe_hnd.c:705
#17 0xca344 in write_to_pipe (p=0x2eeff0, data=0x28d868 "\024", n=44)
    at rpc_server/srv_pipe_hnd.c:734
#18 0x26fa8 in api_fd_reply (conn=0x2987e0, vuid=100, outbuf=0x2a8ef1 "", 
    setup=0x28c068, data=0x28d858 "\005", params=0x0, suwcnt=2, tdscnt=44, 
    tpscnt=0, mdrcnt=1024, mprcnt=0) at smbd/ipc.c:306
#19 0x2735c in named_pipe (conn=0x2987e0, vuid=100, outbuf=0x2a8ef1 "", 
    name=0xeffff906 "", setup=0x28c068, data=0x28d858 "\005", params=0x0, 
    suwcnt=2, tdscnt=44, tpscnt=0, msrcnt=0, mdrcnt=1024, mprcnt=0)
    at smbd/ipc.c:350
#20 0x282b0 in reply_trans (conn=0x2987e0, inbuf=0x298ae9 "", 
    outbuf=0x2a8ef1 "", size=124, bufsize=61440) at smbd/ipc.c:494
#21 0x7ec04 in switch_message (type=37, inbuf=0x298ae9 "", outbuf=0x2a8ef1 "", 
    size=124, bufsize=61440) at smbd/process.c:733
#22 0x7ed0c in construct_reply (inbuf=0x298ae9 "", outbuf=0x2a8ef1 "", 
    size=124, bufsize=61440) at smbd/process.c:762
#23 0x7f1a0 in process_smb (inbuf=0x298ae9 "", outbuf=0x2a8ef1 "")
    at smbd/process.c:850
#24 0x80510 in smbd_process () at smbd/process.c:1230
#25 0x14e28 in main (argc=2, argv=0xeffffd04) at smbd/server.c:795


Hope this helps (it doesn't tell me _anything_).

Best regards,

Christopher




More information about the samba-technical mailing list