
Re: XAUI jitter tolerance




Hi Rich,

In an earlier note to Mike Jenkins, I explained how the fc/1667 break
frequency was calculated.  And it was different from what Mike
described.  Tom Lindsay of Vixel is the person who wrote Annex G, which
I referenced in my explanation.  Tom, feel free to correct me if I was
wrong in my note to Mike.  Mike's calculation was not the basis for the
fc/1667 number, but rather a verification that the fc/1667 number did
not violate the +/- 100ppm constraint put on RefClk.  I would also like
to add that Annex G only shows how Fibre Channel's fc/1667 was derived
from Sonet's fc/2500.  But again, nobody knows where the Sonet number
came from.  
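For anyone who wants to check the arithmetic, the corner frequencies themselves are trivial to compute: each standard just divides its baud rate by a fixed divisor, 1667 for FC/GE/IB/XAUI and 2500 for SONET. A quick sketch (my own illustration, not taken from Annex G):

```python
# Corner ("break") frequency of the jitter tolerance mask:
# simply the baud rate divided by the standard's divisor.
def break_frequency(baud_hz: float, divisor: float) -> float:
    return baud_hz / divisor

for name, baud in [("FC 1G", 1.0625e9), ("GbE", 1.25e9), ("XAUI", 3.125e9)]:
    print(f"{name}: fc/1667 = {break_frequency(baud, 1667) / 1e6:.3f} MHz")

# SONET OC-48 for comparison, using its fc/2500 divisor:
print(f"OC-48: fc/2500 = {break_frequency(2.48832e9, 2500) / 1e6:.3f} MHz")
```

At 3.125GBd, fc/1667 works out to about 1.87MHz, which is where the "approximately 1.8MHz" figure for XAUI comes from.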

I don't disagree with you that today's systems are working perfectly
with these current specs.  However, I don't agree with your unwritten
assumption that any changes to the current spec would render these
systems broken.  Agilent's proposal is nothing more than an update to an
old spec.  It merely reflects what the industry is already
implementing.  My experience, as well as many who have written on this
thread, is that today's receivers have a bandwidth of 3MHz and above. 
Our proposal is not revolutionary or dangerous.  It merely seeks to put
in writing what people are implementing.  And in doing so, we believe
that transmitters and systems will be cheaper and easier to implement. 
If you want proof that today's receivers have a bandwidth beyond 3MHz, I
can submit data of Agilent's SerDes on this matter.  And I will
volunteer to measure other vendors' receiver bandwidths if they are
willing to give me a sample.  I am not making the argument that fc/1667
is wrong or that it doesn't apply to XAUI.  I am, however, making the
argument that we can improve this number and make it more useful for the
implementation of XAUI, which has its own differences from those other standards.

Please see additional comments below.

Rich Taborek wrote:
> 
> Allan,
> 
> I agree with Mike Jenkins' analysis of maintaining the fc/1667 break
> frequency spec for XAUI. I also agree with the current "problem free"
> usage of this spec, as proposed in the T11 MJS Technical Report, for
> Gigabit Ethernet 1000BASE-X, Fibre Channel 1G and 2G devices and
> InfiniBand 2.5G devices. My rationale for voting on the prevailing side
> of the very lopsided straw poll result to not pursue a change to the
> implicit fc/1667 break frequency spec for XAUI is that insufficient
> technical evidence has been provided to convince me that a change is
> warranted. Previously, in a note on this thread aired over the Serial PMD
> reflector, Tom Lindsay of Vixel and Mike Jenkins, among others, pointed
> to very clear and concise documentation of exactly how this value was
> chosen for Fibre Channel originally. That documentation is available in
> ftp://ftp.t11.org/t11/member/fc/jitter_meth/99-151v2.pdf. See Annex G
> entitled: "Choosing the Corner Frequency: fc/1667" on page 94 (PDF 103)
> of this document. I would be willing to change my vote if it can be
> proven that the analysis provided in Annex G is erroneous and/or not
> applicable to XAUI.
> 
> Please also see my additional comments below:
> 
> Allan Liu wrote:
> >
> > Hi,
> >
> > A jitter specification has been proposed by Ali Ghasi of Broadcom for
> > XAUI.  At the December Jitter Meeting in Austin, Texas, Agilent made a
> > proposal for an improvement.  Those presentations, as well as a summary
> > of the meeting by Anthony Sanders(facilitator of the XAUI Jitter Ad-Hoc)
> > are available at:
> > http://www.ieee802.org/3/ae/public/adhoc/serial_pmd/documents/index.html
> >
> > Our proposal is to increase the "break" frequency of the jitter
> > tolerance mask.  Currently, the proposal puts that "break" frequency at
> > 1.8MHz.  We are proposing to push it up higher to 3-5MHz.
> > There are four reasons why we believe the time is ripe for making this
> > change:
> >
> > 1) The current "break" frequency was derived from Fibre Channel which
> > derived it from Sonet; it's equal to the fundamental frequency divided
> > by 1667.  I have looked long and hard and I have not found any
> > documentation as to why this number was picked.  If anybody has an
> > explanation, there is a massive group of people waiting to hear the
> > answer, including myself.  In the meanwhile, let me just repeat what I
> > have heard from different people; this is in line with what Larry Devito
> > of Analog Devices has posted to the reflector before.  Current wisdom
> > has it that this number had to do with old SAW filter technology that
> > was used at the time Sonet was created.  In addition, Sonet had to
> > contend with the inherent problems with using regenerators in the system
> > and thus had to make their jitter specs more stringent.  And to be
> > compatible with older systems, today's Sonet systems are designed to the
> > same old spec.  Fibre Channel comes along and copies this spec from
> > Sonet.  Infiniband comes along and copies it from Fibre Channel.  And
> > now, XAUI comes along and also wants to copy it from Fibre Channel.  And
> > nobody knows why! XAUI is brand new and does not carry any old baggage.
> > We have a chance to do it right and to write the specification to
> > reflect current technologies and current implementations.
> 
> See MJS Annex G for the explanation. SONET uses fc/2500, not the same
> break frequency as FC, GE and IB.
> 
===================================

Annex G does not offer an explanation of where SONET got its fc/2500. 
It only explains how Fibre Channel got fc/1667.  Even the author of
Annex G, Tom Lindsay, does not know where the SONET number came from.

=====================================
> > 2) Today's Fibre Channel systems use receivers with a much higher
> > bandwidth.  My measurements show that they are in the 3-5MHz range.
> > During the December meeting, Jeff Cain of Cisco said their 1G Ethernet
> > systems are using SerDes with bandwidths in the same range.  And all
> > these systems are working perfectly.  So why do we continue to limit
> > ourselves to a legacy specification that everybody exceeds? We should
> > write the XAUI spec to reflect what people are implementing and what
> > makes sense.  And from my understanding of how receivers work, a higher
> > bandwidth equals better performance.
> 
> I'm confused as to what exactly is being limited by the fc/1667
> specification. Are higher bandwidth receivers somehow precluded from
> being compliant with this spec?

===========================
Higher bandwidth receivers are not precluded from being compliant with
this spec.  However, as Mike Jenkins correctly pointed out earlier, the
fc/1667 break frequency puts a constraint on the transmitter.  And this
is an unnecessary constraint.  Think about it.  The transmitter has to
limit its jitter to have frequency content below fc/1667, approximately
1.8MHz at 3.125GHz.  However, receivers are able to track jitter up to
and probably beyond 3MHz.  As I said in my response to Mike, why do we
want to put this unnecessary burden on the transmitter?  Moving the
break frequency to reflect what people are implementing will ease the
design of transmitters and ease the integration of XAUI into bigger
chips.  

You and others have indicated that the current spec already allows for
massive integration and that designers are free to make the receiver PLL
cap as small as possible.  Yes, that's true, but again, the current spec
places an unnecessary restriction on the transmitter.  Mike Jenkins at
the Austin meeting pointed out that only one transmitter is required for
every XAUI implementation.  Frankly, I think that's very short-sighted. 
We have yet to see an abundance of XAUI chips.  Even when the first
generation of XAUI chips come out, they will undoubtedly have only 1 or
2 channels.  What happens when we want to integrate 100 or more
channels? Are we so sure that one transmitter is all we need? Are we so
sure that routing a high-speed clock on a huge chip with a massive
number of gates will be easy? I am not willing to answer either of the above
questions in the affirmative.  Without experience to guide us, we can
only do our best to anticipate the future.  We should make available the
option to integrate more than one transmitter per XAUI chip.  The
simplest way to do so is to make the constraints on the transmitter
consistent with the receiver specs.  Making this change to the spec will
allow the transmitter to use a smaller PLL cap, just like the receivers
do today.  Consistency is good!  And consistency is all we are asking
for.
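To put a rough number on the "smaller PLL cap" point: for a textbook
second-order charge-pump PLL, the loop filter capacitor scales as
1/fn^2 at fixed damping, so roughly doubling the break frequency cuts
the cap by about 4x.  The component values below are purely
illustrative, chosen by me for the sketch, and are not taken from any
spec or any Agilent part:

```python
import math

def loop_cap(icp_amps, kvco_rad_per_volt, n_div, f_n_hz):
    """Loop filter cap of a 2nd-order charge-pump PLL, from the
    textbook relation wn^2 ~= Icp * Kvco / (2*pi * N * C)."""
    wn = 2 * math.pi * f_n_hz
    return icp_amps * kvco_rad_per_volt / (2 * math.pi * n_div * wn ** 2)

# Illustrative values only: 100uA pump, 500MHz/V VCO gain, /16 divider.
kvco = 2 * math.pi * 500e6                   # rad/s per volt
c_old = loop_cap(100e-6, kvco, 16, 1.87e6)   # today's ~1.8MHz corner
c_new = loop_cap(100e-6, kvco, 16, 3.75e6)   # proposed 3-5MHz range
print(f"cap ratio: {c_old / c_new:.2f}")     # ~4x smaller cap
```

The exact capacitance depends on the loop parameters, but the 1/fn^2
scaling, and hence the roughly 4x saving in capacitor area, holds
regardless of the values chosen.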

============================
> 
> > 3) Other technologies do not necessarily have the same ambitious cost
> > structure as Ethernet historically has.  XAUI is supposed to be a low
> > cost interface to connect the optics to the MAC.  At 3.125G, it is not
> > easy to build a functional system.  Increasing the "break" frequency
> > means that the receiver is able to track more jitter.  This will make it
> > easier for everybody to meet the specification and produce a functional
> > system.  And of course, this will drive down the cost of each port.  But
> > the crucial point is not cost, but that increasing the break frequency
> > will NOT impact system performance in any negative way.
> 
> It had been pointed out by Mike Jenkins and Tom Lindsay in earlier notes
> on this thread that: "If the CDR is too fast and
> tracks every little bit of noise and jitter that comes along, we're
> asking for trouble. If we think about CDRs as high-pass filters, complex
> jitter with fundamentals below the corner frequency gets
> "differentiated", which can effectively double the peak to peak jitter
> the CDR has to dissipate. (Think about sending a low frequency square
> wave through a high-pass RC filter). There has been a lot of work done
> on this, both analytically and in the lab." I'm having a hard time
> convincing myself that changing the current fc/1667 break frequency will
> drive down the cost of XAUI ports if the CDR circuitry, clearly among
> the most complex XAUI circuitry, has to dissipate more jitter.
> 
======================
I'll take Tom's note one step further.  I take it to mean that if we do
not want trouble, then we should put a maximum bandwidth requirement on
all receivers.  The fact that we don't have such a spec makes the point
moot.  People are not implementing receivers with infinite bandwidth. 
And we are not asking people to do so.  We are not asking to change the
spec so that "CDRs track every little bit of noise and jitter that comes
along."  We are merely proposing to update the spec to reflect what
people are implementing.  And in doing so, we hope to make the XAUI spec
easier and cheaper for everybody to work with.
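For what it's worth, Tom's high-pass analogy is easy to reproduce
numerically.  The toy simulation below is my own illustration, with
arbitrarily chosen frequencies, and the square wave standing in for
jitter whose fundamental sits well below the corner.  It passes a
low-frequency square wave through a first-order RC high-pass and shows
the peak-to-peak output approaching twice the input, which is exactly
the doubling effect he describes:

```python
import math

def highpass(samples, fc_hz, fs_hz):
    """First-order RC high-pass filter, discrete-time approximation."""
    rc = 1.0 / (2 * math.pi * fc_hz)
    dt = 1.0 / fs_hz
    alpha = rc / (rc + dt)
    out, y, x_prev = [], 0.0, samples[0]
    for x in samples:
        y = alpha * (y + x - x_prev)   # y responds to changes in x only
        x_prev = x
        out.append(y)
    return out

fs = 1_000_000    # sample rate, Hz
fc = 10_000       # filter corner, Hz (well above the square wave)
f0 = 100          # square-wave fundamental, Hz (well below the corner)
x = [1.0 if (i * 2 * f0 // fs) % 2 == 0 else -1.0 for i in range(4 * fs // f0)]
y = highpass(x, fc, fs)

ratio = (max(y) - min(y)) / (max(x) - min(x))
print(f"peak-to-peak grows by {ratio:.2f}x")   # close to 2x
```

So the doubling effect is real, but note that it only applies to jitter
below the corner, which is precisely why we are not asking for an
unbounded receiver bandwidth.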

=====================


> > 4) Increasing the "break" frequency will also make it easier and cheaper
> > for the integration of XAUI into bigger chips, like MACs.  Integration
> > can mean 1 channel of XAUI or 100 channels of XAUI.  Obviously the
> > current generation of XAUI will be discrete, but from the days of HARI,
> > integration has been the goal.  We must not forget this goal as we move
> > forward with the specification.  And again, this goes back to the point
> > about XAUI being a low cost interface.  Increasing the bandwidth means
> > the ability to use a smaller filtering cap in the PLL which means a
> > higher level of integration can be achieved.
> 
> I wholeheartedly agree with achieving high levels of XAUI integration
> and efficient PLL designs allowing high levels of integration. I don't
> agree that leaving the break frequency where it is prevents an increase
> in receiver bandwidth. Therefore, it seems that you are free to use a
> smaller PLL filtering cap.

================
See my note about the implications on the transmitter implementation
above.

================

> 
> > I believe the result from the straw poll during the December meeting is
> > short-sighted.  There is nothing scary or dangerous about this change.
> > It is certainly a logical and much needed change to reflect what our
> > technology can do and what the market is producing.  There is no need to
> > copy other standards, especially if those standards have no technical
> > basis for their jitter tolerance numbers.  However, there are clear
> > reasons, as listed above, as to why these numbers need to be changed.
> > Any claims that XAUI can leverage from this standard and that standard
> > are unfounded.  There are no standards out there that have the same
> > goals as XAUI or the same fundamental frequency as XAUI.  And
> > why should XAUI continue to carry the old baggage the way the other
> > standards do? XAUI was supposed to be easy.  After all, 3.125G is
> > only 625MHz from 2.5G.  Wrong! XAUI chips have yet to be abundantly
> > available.  This suggests that they are much harder to implement than
> > people previously thought.  Any thoughts of XAUI being easy because it's
> > "similar" to other standards are simply preposterous.
> 
> XAUI and its 3.125 Gbps line rate are fairly new. XAUI virtually must be
> implemented in CMOS because of the multiple lanes, embedded CODECs and
> lane coordinating logic. Many vendors are making great progress with
> XAUI and it's even showing up in FPGAs! I believe that many who voted as
> they did in the straw poll in December were informed and cognizant of
> the decision they were making. Similarity with other standards was not
> an issue in my vote. Confidence in the MJS work and the folks involved in
> that effort was a deciding factor in my vote and I stand by it.

==================
Like you, I also have great confidence in the MJS work.  It is the basis
for the XAUI jitter spec.  Agilent's proposal does not conflict with any
of the work MJS did.  It only looks to make it consistent with current
implementations of receivers.  If you look at the presentation online,
it only deviates from the Fibre Channel spec in that the "break"
frequency has been changed; everything else is the same. 

I have addressed each and every reason that has been floated with regard
to not accepting the Agilent proposal.  And with each response, I have
pointed out how the counter-argument does not hold water and how our
proposal will benefit the industry.  But from what I have heard, I think
the main issue here is not technical or economical.  The real problem is
fear.  The "If it ain't broke, don't fix it" mentality is with us all. 
But there is really nothing to be afraid of.  People are implementing
receivers the way they are, regardless of our proposal.  But if we make
the spec consistent with those implementations, we buy ourselves some
insurance for later generations of XAUI when integration will be the key
issue.  It might be the case that Mike Jenkins is right, that only one
transmitter is needed for each XAUI chip.  But if he's wrong, then we
will all pay the price of more difficult implementations and lower
levels of integration.



Regards,

-Allan



> 
> Happy Holidays,
> Rich
> 
> --
> 
> > This message is to open up this discussion to a wider audience and to
> > get people thinking about this issue as we approach the next meeting.
> > Any and all input on this matter is appreciated.
> >
> > Regards,
> >
> > -Allan
> 
> -------------------------------------------------------
> Richard Taborek Sr.                 Phone: 408-845-6102
> Chief Technology Officer             Cell: 408-832-3957
> nSerial Corporation                   Fax: 408-845-6114
> 2500-5 Augustine Dr.        mailto:rtaborek@xxxxxxxxxxx
> Santa Clara, CA 95054            http://www.nSerial.com