Has anyone done anything like this?

Consider the queuing delay and packet loss rate in a router buffer (preceding an outgoing link). Suppose all packets are L bits, the transmission rate is R/2 bps, and N packets simultaneously arrive at the buffer every LN/R seconds. Packets in the buffer are transmitted in FIFO order. The buffer holds LN bits (i.e., N packets), so a packet that arrives when the buffer is full is dropped immediately. Find the average queuing delay of the successfully transmitted packets and the average packet loss rate. Note: both quantities should be calculated over the long run.
I can do it assuming a buffer of infinite size. I'm also having trouble figuring out the average packet loss rate (we haven't been taught it or seen any examples).
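In case it helps, here is a minimal brute-force simulation sketch of the scenario, just to get numbers to check an answer against, not a derivation. The value N = 8 and the cycle count are placeholders I picked, time is measured in units of L/R so the arithmetic stays exact, and I'm assuming the LN-bit buffer counts the packet currently being transmitted:

N = 8                  # packets per arriving batch (placeholder value)
SERVICE = 2            # transmitting one L-bit packet at R/2 bps takes 2L/R = 2 units of L/R
INTERVAL = N           # a batch arrives every LN/R seconds = N units of L/R
CAPACITY = N           # the LN-bit buffer holds N packets (assumed to include the one in service)

departures = []        # scheduled departure times of packets still in the buffer (FIFO)
next_free = 0          # time at which the link next becomes idle
total_delay = 0
sent = dropped = 0

for k in range(100_000):           # simulate many arrival cycles
    t = k * INTERVAL
    # packets that finished transmitting by time t have left the buffer
    departures = [d for d in departures if d > t]
    for _ in range(N):
        if len(departures) >= CAPACITY:
            dropped += 1           # buffer full: this packet is lost
            continue
        start = max(t, next_free)  # FIFO: wait behind earlier packets
        next_free = start + SERVICE
        departures.append(next_free)
        total_delay += start - t   # queuing delay, excluding transmission time
        sent += 1

print("average queuing delay (units of L/R):", total_delay / sent)
print("packet loss rate:", dropped / (sent + dropped))

Varying N shows how both long-run quantities scale, which is handy for guessing the closed form.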

Here is a question similar to the first one with the answer:
Consider the queuing delay in a router buffer (preceding an outbound link). Suppose all packets are L bits, the transmission rate is R bps, and that N packets simultaneously arrive at the buffer every LN/R seconds. Find the average queuing delay of a packet.
This one is easy; you just sum up the delays of the individual packets:

L/R + 2L/R + ... + (N - 1)L/R = LN(N - 1)/2R, and dividing by the N packets gives an average delay of L(N - 1)/2R.

note: the sum stops at N - 1 because the first packet in the queue has no delay.
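A quick numeric sanity check of that formula, with placeholder values I picked for L, R, and N:

L, R, N = 1000, 1_000_000, 8        # placeholder values

# packet i waits for the i packets ahead of it, i.e. i * L/R seconds
delays = [i * L / R for i in range(N)]
average = sum(delays) / N

print(average)                      # matches the next line
print((N - 1) * L / (2 * R))        # the closed form L(N - 1)/2R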

Thanks for any help you may be able to give.