discrepancy in tx_time

Steve J steve_newus at yahoo.com
Fri Mar 22 00:36:58 EST 2002


Hi,
I am doing some experiments, and I was trying to
figure out the actual transmission time.

So I call do_gettimeofday() just before the actual
transmission is triggered in dldwd_xmit(), i.e. before

    err = hermes_docmd_wait(hw, HERMES_CMD_TX | HERMES_CMD_RECL,
                            txfid, &resp);

and call do_gettimeofday() again when I receive the
interrupt from the card about the successful
transmission.

I measure the interval between these two calls as the
actual transmission time.
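
For concreteness, here is a minimal sketch of that instrumentation,
assuming the Linux 2.4 do_gettimeofday() API and the dldwd_xmit() /
interrupt-handler split described above. The static tx_start variable,
the delta computation, and the printk are my own illustration, not the
driver's actual code:

    #include <linux/time.h>
    #include <linux/kernel.h>

    static struct timeval tx_start;

    /* in dldwd_xmit(), just before the TX command is issued: */
    do_gettimeofday(&tx_start);
    err = hermes_docmd_wait(hw, HERMES_CMD_TX | HERMES_CMD_RECL,
                            txfid, &resp);

    /* in the interrupt handler, on the TX-complete event: */
    {
        struct timeval tx_end;
        long delta_us;

        do_gettimeofday(&tx_end);
        delta_us = (tx_end.tv_sec - tx_start.tv_sec) * 1000000L
                 + (tx_end.tv_usec - tx_start.tv_usec);
        printk(KERN_DEBUG "tx_time = %ld us\n", delta_us);
    }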

When I start a UDP application which sends a 1300-byte
packet every 10 ms, the tx_time comes out on the order
of 100 microseconds. It is beyond my understanding how
this is possible, as I set the card's rate limit to
2 Mbps. Given this setting, the tx_time should come out
as 1300*8 / (2*10^6) s = 5.2 ms.
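
As a sanity check of that arithmetic, here is a tiny userspace
program; note it counts the payload bits only and ignores the PHY
preamble and MAC/LLC overhead, which would only push the expected
figure higher:

    #include <stdio.h>

    int main(void)
    {
        double payload_bits = 1300 * 8;        /* 10400 bits */
        double rate_bps     = 2e6;             /* 2 Mbit/s   */

        /* airtime = bits / rate; prints 5.2 ms */
        printf("expected airtime: %.1f ms\n",
               payload_bits / rate_bps * 1e3);
        return 0;
    }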

Also, if this UDP application sends packets as fast as
possible, i.e. without any sleep() calls, the tx_time
comes out to be ~600 microseconds. This is still not
correct, and in any case, why should the transmission
time change with the packet generation rate of the
application?
Kindly help
-S
