RE: stds-802-16-tg4: Call for input - Data Encoding
You are right that we need to define the environment first
before any decisions in one or another direction can be made.
From this perspective I support your view on the causes of
interference, though the exact parameters can be discussed.
By the way, there is also interference caused by primary-band
users (like radar). Some are narrowband, others wider; usually a
direct hit from such a device kills reception, or the source is
distant or low-power enough that it merely colors the noise floor.
Regardless of whether we operate in a clean or polluted area, we
still need to cope with the radio channel itself. Reasonable
performance here is the primary requirement, but it shouldn't be
that hard to meet. In any case, we should bundle fading with
interference when comparing the coding schemes.
We are hopefully close to a decent enough understanding of our
burst and frame structure to identify these interference
scenarios. Thus we should be able to agree upon the environmental
parameters at the next meeting(s), enabling real comparisons. I
can't imagine the group agreeing that a certain type of coding is
the most appropriate without the actual coding scheme(s) being
presented first. If there are several, they must be compared!
My suggestion is to continue on the road paved by Octavian and
generate a feasible, reasonable comparison environment. This
"environmental" knowledge might also help with decisions
regarding co-existence issues and some other system aspects.
> -----Original Message-----
> From: ext Octavian Sarca [mailto:firstname.lastname@example.org]
> Sent: 16 April, 2001 19:31
> To: Minfei Leng; Liebetreu, John; Stds-802-16-Tg4 (E-mail)
> Subject: RE: stds-802-16-tg4: Call for input - Data Encoding
> Dear All,
> I think the discussions about the encoding methods should be part of
> the section coordinated by Tal, which is designated to establish
> evaluation criteria. I would like to encourage everybody to submit
> their opinion to that section. I strongly believe that the data
> encoding section should accept ALL encoding schemes that fall within
> the scope outlined by the group during the previous meetings. This is
> to let the group make an informed decision.
> I also have a few comments about interference in the UNII bands.
> Potentially, I see 3 types of devices operating in these bands:
> 1. 802.11a/HyperLAN devices
> 2. 802.16.4/802.16b devices
> 3. PTP and PTMP devices using non-standard, proprietary protocols
> In the first case, the minimum length of a burst is 16us (preamble) +
> 4us (SIGNAL) + 4us (1 data symbol) = 24us. However, the maximum
> duration can be
> 16us (preamble) + 4us (SIGNAL) + 783*4us (max data frame @ 6Mb/s) =
> 3152us, i.e. about 3.2ms.
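The burst-length bounds above can be checked with a short script; the constants are taken directly from the figures quoted in this thread (16us preamble, 4us SIGNAL field, 4us per OFDM data symbol, up to 783 data symbols for a maximum-length frame at 6 Mb/s):

```python
# Bounds on 802.11a burst duration, using the figures quoted above.
PREAMBLE_US = 16   # preamble duration
SIGNAL_US = 4      # SIGNAL field duration
SYMBOL_US = 4      # one OFDM data symbol

def burst_duration_us(n_data_symbols):
    """Total on-air time of an 802.11a burst carrying n_data_symbols symbols."""
    return PREAMBLE_US + SIGNAL_US + n_data_symbols * SYMBOL_US

print(burst_duration_us(1))    # minimum burst: 24 us
print(burst_duration_us(783))  # maximum burst: 3152 us (about 3.2 ms)
```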
> In the second case, on downlink we expect bursts of 0.5ms to 8ms. On
> uplink we may expect something in the range of tens to hundreds of us.
> In the third case, we have PTP devices that operate with FDD (like
> those developed by Western Multiplex and CRC/Redline), which use
> continuous transmission, or TDD devices (like those developed at
> Malibu Networks), which will have characteristics similar or close to
> 802.16b. Interference from FDD devices with continuous transmission
> will be perceived as a raised noise floor, so burst-noise immunity
> does not help. Coding cannot do anything significant here, but we can
> use deployment strategies, directional antennas, etc.
> Concerning the other devices, we expect noise bursts with a length
> between 24us and 8ms, with the highest probability concentrated
> between 100us and 1ms. The SNR degradation during these bursts can be
> anywhere between a few dB and enough to kill any reception.
> Now, let's get real. No coding can be immune to bursts unless the
> interleaving time is much larger than the burst. In our case, what
> shall we choose? 1ms? 10ms? 100ms? And what about delay? Remember
> that the long downlink burst is actually made of small packets/bursts
> with different destinations, coding rates, and modulations. What
> about the upstream? Can we do 1ms interleaving on a 100us burst?
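The uplink question can be made concrete with a toy calculation. An interleaver can only spread errors across the symbols it actually spans, and that span is capped by the burst length; assuming a 4us OFDM symbol (as in the 802.11a figures above), a 1ms interleaving span simply does not fit into a 100us burst:

```python
# Sketch: interleaver span vs. burst length, assuming a 4 us OFDM symbol
# (the symbol duration quoted earlier in this thread).
SYMBOL_US = 4

def symbols_in(duration_us):
    """Number of whole OFDM symbols that fit in a given duration."""
    return duration_us // SYMBOL_US

needed = symbols_in(1000)   # a 1 ms interleaving span needs 250 symbols
available = symbols_in(100) # a 100 us uplink burst carries only 25 symbols

print(needed, available)            # 250 25
print(available < needed)           # True: the span cannot fit in the burst
```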
> I think that the only chance we have to improve immunity to this
> interference is to use smaller granularity of the data stream. In
> other words, if we use symbol-size interleaving, place a CRC on each
> separate data fragment/packet/burst, and use ARQ, we may obtain the
> best results. Why? Because the interfering noise bursts will affect
> the least number of data fragments.
> Suppose a very strong interfering burst destroys, say, 14 OFDM
> symbols (this is very likely to happen in the UNII band). With
> symbol-size interleaving this would kill data in 14 OFDM symbols. Any
> longer interleaving scheme will kill more data, i.e. more bursts on
> the uplink or more packets/fragments on the downlink.
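The 14-symbol example can be quantified with a rough model. The block size below (64 symbols) is a hypothetical value chosen only to illustrate the point; the worst case assumed is a hit that straddles a block boundary, so one extra block is corrupted:

```python
import math

def symbols_lost(hit_symbols, interleave_block_symbols):
    """Symbols whose data is corrupted when a noise burst hits
    hit_symbols consecutive symbols. With block interleaving, any
    block the hit touches is effectively lost, because the errors
    are spread over the whole block after de-interleaving."""
    if interleave_block_symbols <= 1:
        # Symbol-size interleaving: only the symbols actually hit are lost.
        return hit_symbols
    # Worst case: the hit straddles a block boundary, touching one
    # extra block beyond the minimum count.
    blocks_touched = math.ceil(hit_symbols / interleave_block_symbols) + 1
    return blocks_touched * interleave_block_symbols

print(symbols_lost(14, 1))   # symbol-size interleaving: 14 symbols lost
print(symbols_lost(14, 64))  # hypothetical 64-symbol blocks: 128 symbols lost
```

The longer the interleaving block, the more data a single hit destroys, which is the asymmetry the paragraph above describes.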
> I agree with you that we could have saved this data if we had been
> able to make the interleaving block size much larger than the length
> of the interfering bursts. But this is not possible. Even if we
> ignore interference from other systems and consider only 802.16b
> devices, we arrive at a vicious circle: a larger interleaving block
> implies larger bursts, which require an even larger interleaver.
> Best Regards,