Low Earth Orbit (LEO) satellite constellations have been identified as candidates for new massive access networks, complementing traditional cellular ones, thanks to their inherently ubiquitous coverage. Despite being a feasible alternative, such networks still raise questions about their performance, particularly regarding delay and queue management under realistic channel conditions. In this work, we study the queuing delay of a single satellite-to-ground link, considering a Land Mobile Satellite (LMS) channel in LEO with finite-length buffers. We analyze the trade-off between delay and packet loss probability using a novel Markov-chain-based model, which we validate and extend through an extensive system-level simulation campaign. The developed tools accurately capture the statistical behavior of the queuing delay in the S and Ka frequency bands, where LEO communications are planned to be deployed. Our results show that short buffers can keep packet loss below 5-10% while maintaining tolerable delays in these bands.
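
As a rough intuition for the delay versus packet loss trade-off discussed above, the following minimal Python sketch simulates a finite-buffer queue over a two-state (good/bad) Markov channel, a common simplification of an LMS channel. This is not the paper's model; all parameter values (buffer size, arrival probability, per-state service probabilities, state transition probabilities) are illustrative assumptions.

```python
# Minimal, illustrative sketch (not the paper's model): a discrete-time
# finite-buffer queue over a two-state Markov ("good"/"bad") channel.
# All parameters below are assumed for illustration only.
import random

random.seed(0)

SLOTS = 200_000          # simulated time slots
BUFFER_SIZE = 10         # finite buffer length (packets)
ARRIVAL_PROB = 0.4       # packet arrival probability per slot
SERVE_PROB = {           # per-slot service success probability by channel state
    "good": 0.9,
    "bad": 0.3,
}
STAY_PROB = {            # probability of remaining in the current channel state
    "good": 0.98,
    "bad": 0.95,
}

state = "good"
queue = []               # holds the arrival slot of each buffered packet
lost = arrived = 0
delays = []

for t in range(SLOTS):
    # Channel state evolves as a two-state Markov chain.
    if random.random() > STAY_PROB[state]:
        state = "bad" if state == "good" else "good"

    # Packet arrival: dropped if the finite buffer is full.
    if random.random() < ARRIVAL_PROB:
        arrived += 1
        if len(queue) < BUFFER_SIZE:
            queue.append(t)
        else:
            lost += 1

    # Service attempt: success probability depends on the channel state.
    if queue and random.random() < SERVE_PROB[state]:
        delays.append(t - queue.pop(0))

print(f"packet loss probability ~ {lost / arrived:.3f}")
print(f"mean queuing delay ~ {sum(delays) / len(delays):.2f} slots")
```

Shrinking BUFFER_SIZE in this sketch lowers the queuing delay at the cost of a higher drop rate, which is the qualitative trade-off the paper quantifies analytically and by system-level simulation.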