determine packet delays
January 17th, 2019
I’m trying to learn about packet delays, but I have a small question, and I’m sure it’s a silly one.
This is an exercise from a website I found: Consider sending voice from host A to host B over a packet-switched network (for example, Internet phone). Host A converts analog voice to a digital 64 kbps bit stream on the fly. Host A then groups the bits into 48-byte packets. There is one link between host A and host B; its transmission rate is 1 Mbps and its propagation delay is 2 ms. As soon as host A gathers a packet, it sends it to host B. As soon as host B receives an entire packet, it converts the packet’s bits to an analog signal. How much time elapses from the time a bit is created (from the original analog signal at host A) until the bit is decoded (as part of the analog signal at host B)?
Answer:
Before the first bit of a packet can be transmitted, all of the bits belonging to that packet need to be generated. This requires: (48 × 8) / (64 × 10^3) = 6 ms
The time to transmit this packet is: (48 × 8) / 10^6 = 0.384 ms
The propagation delay is 2 ms
Therefore, the delay until decoding is: 6 ms + 0.384 ms + 2 ms = 8.384 ms
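For anyone who wants to check the numbers, here is a minimal Python sketch of the same arithmetic, using the values from the exercise; the variable names are mine, not part of the original answer.

```python
BITS_PER_BYTE = 8

packet_size_bits = 48 * BITS_PER_BYTE   # 48-byte packet -> 384 bits
voice_rate_bps = 64e3                   # 64 kbps encoding rate at host A
link_rate_bps = 1e6                     # 1 Mbps link transmission rate
propagation_delay_s = 2e-3              # 2 ms propagation delay

# Time to generate (gather) one full packet of voice bits at host A.
packetization_delay_s = packet_size_bits / voice_rate_bps   # 0.006 s

# Time to push all 384 bits onto the 1 Mbps link.
transmission_delay_s = packet_size_bits / link_rate_bps     # 0.000384 s

total_delay_s = (packetization_delay_s
                 + transmission_delay_s
                 + propagation_delay_s)

print(f"packetization: {packetization_delay_s * 1e3:.3f} ms")  # 6.000 ms
print(f"transmission:  {transmission_delay_s * 1e3:.3f} ms")   # 0.384 ms
print(f"propagation:   {propagation_delay_s * 1e3:.3f} ms")    # 2.000 ms
print(f"total:         {total_delay_s * 1e3:.3f} ms")          # 8.384 ms
```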
Can anyone tell me why they multiply by 8? I don’t get why they do that. Do all the bits generated for one packet amount to 48 bytes?
thanks in advance
8 bits in a byte. A 48-byte packet is 48 × 8 = 384 bits, and the rates (64 kbps and 1 Mbps) are in bits per second, so the byte count has to be converted to bits before dividing.