I want to start by sketching an example of the workings of the stack.
Let's take a simple base case. Assume the information stream is the integers in order. Let's also assume that in the time it takes to process one item, exactly two new items arrive. We'll call this a handling period of 2 ticks against a data period of 1 tick (equivalently, a handling frequency of 1/2 against a data frequency of 1).
So the processing feed looks like this. "1" comes in, and we process it over two ticks. In the meantime, "2" and "3" arrive at one-tick intervals and are pushed onto the stack in sequence. Two ticks after first receiving "1", our processor is free, reads the stack, and retrieves "3" from the top. Over the next two ticks, our processor processes "3" while "4" and "5" are pushed onto the stack, again at one-tick intervals. The processor finishes, pops "5" for processing, and so we continue on.
This process is clearly building an ever-expanding stack of even numbers, and if the sequence continues infinitely we will never get to process "2" at the bottom of the stack. That is to say, there is no finite number of ticks after which we could guarantee that "2" has been processed.
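To make this concrete, here is a minimal sketch in Python. The function name, the tick bookkeeping, and the `lifo` flag are my own illustrative choices, not anything from your question; the flag lets us reuse the same sketch for the queue below.

```python
def simulate(total_ticks, lifo=True):
    """One integer arrives per tick; processing an item takes two ticks.
    lifo=True models the stack; lifo=False models the queue."""
    pending = []       # waiting items; the end of the list is the stack top
    done = []          # items taken up for processing, in order
    busy_until = 0     # the tick at which the processor next becomes free
    for tick in range(total_ticks):
        pending.append(tick + 1)   # the integer tick+1 arrives on this tick
        if tick >= busy_until:     # the processor is free again
            item = pending.pop() if lifo else pending.pop(0)
            done.append(item)
            busy_until = tick + 2  # it will be busy for the next two ticks
    return done, pending

done, backlog = simulate(12)
print(done)     # [1, 3, 5, 7, 9, 11]  -- only the odd numbers get handled
print(backlog)  # [2, 4, 6, 8, 10, 12] -- the evens pile up, "2" at the bottom
```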
The same is not true for your queue model. Given any integer "n", we can prove that there is a finite length of time within which we will process "n" - namely, 2n ticks! The reason: "n" arrives by tick n, the processor never sits idle (arrivals outpace processing), and the n items up to and including "n" take two ticks each, so "n" is finished by tick 2n. We cannot prove that the entire sequence will ever complete, but we can prove that for any given element of that sequence, there is a length of time within which we will process that element.
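Running the same sketch from above in FIFO mode shows this guarantee directly:

```python
done, backlog = simulate(12, lifo=False)
print(done)     # [1, 2, 3, 4, 5, 6]    -- item n is finished by tick 2n
print(backlog)  # [7, 8, 9, 10, 11, 12] -- a growing backlog, but every item's turn comes
```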
This, I think, is the philosophically interesting distinction in your question: given that the queue structure is a strategy under which we can fix a finite processing time for any given integer, why can we still not say that we will capture every single element of the sequence?
Well, to answer that question, look at both the cardinality and the values of the sets of outputs within a given time for the stack versus the queue. The outputs of the two systems stand in a perfect 1:1 correspondence, so they process exactly as many values as each other. And yet, what the top of the stack shows us is that at any time t, there is always an integer coming downstream that the queue system hasn't processed yet. If the queue is currently looking at the integer n, then the top of the stack is the n'th even number, 2n, which is still an integer and which the queue hasn't yet passed to the front!
It's all well and good to say that for any given integer n we can prove we will deal with it within 2n ticks, but we can also say that, at that same moment, there are at least n more elements left for us to process; and this is completely uncontroversial, because one of them is sitting on top of the stack as the result of a well-defined process. So at no point in time can we say we're done, because there is always, demonstrably, more to do.
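You can watch both claims at once by running the two simulations side by side and inspecting them at the last tick on which the queue is still occupied with n (again, this loop is just my own illustration on top of the earlier sketch):

```python
for n in (3, 5, 8):
    tick = 2 * n - 1   # the last tick on which the queue is still busy with n
    _, q_backlog = simulate(tick + 1, lifo=False)
    _, s_backlog = simulate(tick + 1, lifo=True)
    # the stack's top is 2n (the n'th even number), and the queue
    # still has at least n items waiting: the backlog never clears
    print(n, s_backlog[-1], len(q_backlog))   # prints 3 6 3, then 5 10 5, 8 16 8
```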
This is what the philosopher Michael Dummett called the intuition of Indefinite Extensibility. It seems totally reasonable to say that we can bound the length of time taken to process any given integer in your queue model, but also correct to say that there is still somehow more that we need to do, without ever stepping outside the bounds of our set of integers. While classical set theory recognizes the integers themselves as a perfectly legitimate set (thanks to the axiom of infinity), other philosophers of mathematics have used set-theoretic technology to apply this intuition to notions more complex than this particular sketch. For example, the class of all ordinal numbers (including the transfinite ones) seems to make sense, but cannot be treated as a set on pain of paradox (the Burali-Forti paradox). Indefinite Extensibility suggests a way to conceptualize the distinction between sets and proper classes.
Dummett wanted to use this concept to argue that certain kinds of mathematical practice are not suited to reasoning with classical logic. To do this, he pointed out that there are indeed some well-defined mathematical concepts (mostly transfinite) that nonetheless cannot be rolled out into a well-defined extensional totality suitable for classical quantification. He took this as a reason to consider alternative logical treatments, such as intuitionistic logic, whose semantics is grounded in proofs rather than abstract truth.
This position has been quite influential in the drive to develop systems like theorem provers and natural deduction/sequent calculus models for reasoning about logical statements and computer programs. If this kind of stuff interests you, there's definitely plenty to read up on!