In many such models, space becomes ‘relational’: how far one thing is from another is, roughly, a matter of how directly the corresponding events are connected. So if you visualize spacetime as a graph, in which events (‘spacetime points’) are the vertices, then the number of steps you have to take to get from one event to another determines how far apart those events are.
Some models, for instance, have an ageometric phase, in which the spacetime graph is maximally connected: there exists an edge, i.e. a link, between any two events. In this phase, no ‘space’ in the ordinary sense exists, since everything is right ‘on top’ of everything else. Then, in a sort of condensation or cooling process, links between events get removed, and space(time) emerges.
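To make the ‘distance = number of steps’ idea concrete, here’s a minimal sketch in Python (the two toy graphs are entirely made up for illustration, not taken from any particular model): on a maximally connected graph every event is one step away from every other, whereas on a sparse ring-shaped graph a notion of ‘far apart’ appears.

```python
from collections import deque

def graph_distance(adjacency, start, goal):
    """Hop-count (shortest-path) distance between two events,
    found by breadth-first search over the adjacency lists."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbour in adjacency[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None  # the two events aren't connected at all

n = 8  # toy number of events

# 'Ageometric' phase: an edge between any two events.
complete = {i: [j for j in range(n) if j != i] for i in range(n)}

# After the 'condensation': only ring neighbours stay linked.
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

print(graph_distance(complete, 0, 4))  # 1 -- everything sits 'on top of' everything else
print(graph_distance(ring, 0, 4))      # 4 -- an extended 'space' has emerged
```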
There’s thus no arrangement of quanta as such (since that would presuppose a space for them to be arranged in); rather, the quantized relations between events give rise to space: the underlying graph is a pregeometric notion. It’s therefore a bit of a misunderstanding to talk about the ‘size’ of space quanta, as in these models the notion of size is emergent, and not applicable at the fundamental level.
As for particles, it is somewhat unclear how to integrate them into such models – one proposal I find promising is that they are in fact excitations of the underlying graph, similar to how quasiparticles, such as phonons, the quanta of sound, are excitations of an underlying condensed matter lattice. This typically requires adding some extra structure to the graph. Such particles then typically have a position defined by their interactions with other particles, which in turn depend on how the parts of spacetime they occupy are connected to one another.
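For comparison, the textbook example of such a collective excitation is the phonon of a one-dimensional chain of atoms of mass $m$, coupled by springs of stiffness $K$ and spaced a distance $a$ apart; its dispersion relation is

$$\omega(k) = 2\sqrt{\frac{K}{m}}\,\left|\sin\frac{ka}{2}\right|,$$

so the ‘particle’ is really a wave spread over many lattice sites, with its properties fixed entirely by the underlying discrete structure. Something analogous is the hope in the graph picture, though there the ‘lattice’ itself is dynamical.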
In general, position in discrete models is subject to uncertainty: since the coordinates don’t commute with one another, position along one axis can only be defined precisely at the expense of a less precise position along another. So, particles would be at least a little non-local – which fits well with the picture of them being collective excitations of some underlying structure.
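One common way to formalize this (the generic noncommutative-coordinates form, not specific to any one model) is to postulate a commutator between the coordinate operators, which then yields an uncertainty relation in exact analogy to position and momentum:

$$[\hat{x},\hat{y}] = i\theta \quad\Longrightarrow\quad \Delta x\,\Delta y \ge \frac{|\theta|}{2},$$

with $\theta$ a constant with dimensions of area, often taken to be of the order of the Planck length squared. A particle then simply can’t be pinned to a point: squeezing it along one axis spreads it out along the other.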
As for the speed of light, going by a condensed matter analogy, it is indeed possible that it is anisotropic – i.e. different in different directions. But since this anisotropy would apply to all physics, we would never notice: all the probes we could use would be subject to the same anisotropy, so the measured speed of light would be the same in all directions for an ‘inside’ observer, even though it might not be for an ‘outside’ one.
And for the importance of the Planck length, consider how our ability to measure smaller distances is related to the energy of the probes we use: the higher the energy, the smaller the distance we can resolve. At some point, however, you cram so much energy into so small a region that you produce a black hole – which ‘blocks the view’ to all smaller distances. The distance at which this happens is the Planck length – so whether or not it’s the smallest actual distance, it certainly is the smallest theoretically resolvable one.
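That heuristic can be turned into a quick back-of-the-envelope estimate (order-one factors dropped): a probe of energy $E$ resolves distances down to roughly its wavelength, $\lambda \sim \hbar c/E$, while concentrating that energy within its Schwarzschild radius, $r_s \sim GE/c^4$, produces a black hole. The best possible resolution is where the two scales meet:

$$\frac{\hbar c}{E} \sim \frac{GE}{c^4} \;\Longrightarrow\; \lambda \sim \sqrt{\frac{\hbar G}{c^3}} = \ell_P \approx 1.6\times10^{-35}\,\mathrm{m}.$$

Pushing to still higher energies only makes the resulting black hole, and hence the hidden region, bigger.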