
COVER STORY

Big Pipe on Campus
Ohio institutions implement a 10-Gigabit Ethernet switched-fiber backbone to enable high-speed desktop applications over UTP copper.

For Tim Deaver, the challenge was finding a horizontal UTP cabling solution that would support both gigabit today and 10 Gbps in the future.

“Gigabit to the desk is not just for the engineering students,” says Tim Deaver, manager of technical services for the 300-acre campus shared by Ohio State University Newark (OSUN) and Central Ohio Technical College (COTC). “Access to video is an important part of just about every curriculum. We have students in digital media design and other fields who require access to very large files. Our goal is for students to access their applications from anywhere on the campus.”
With plans to grow the student population
from 5,000 today to more than 7,500 by 2009,
campus officials determined that brick and mortar alone would not be the differentiator that
would attract students. The IT infrastructure
supporting the campus was deemed an equally important element to meet enrollment targets.
With the proper infrastructure in place, the campus could offer students the technical amenities of advanced video applications, virtually
non-blocking Internet access and ample storage
for data and e-mail.
Yet, designing an infrastructure to support gigabit to the desktop was complicated by an imperative to future-proof the wired network. As
alternatives were considered, fiber to the desk
was evaluated, but only briefly, due to the added
costs for fiber switches, cables and NICs. “Copper is still going to be a driving force to the
desktop for the future, especially as long as the price for fiber components remains
higher than for copper,” Deaver says.
The challenge for Deaver was finding a horizontal UTP cabling solution that would support both gigabit today and 10 Gbps in the future. A major obstacle was that, at the time Deaver was considering cabling options, all UTP solutions could support 10GBASE-T only to a theoretical 55 meters, not the 100 meters that horizontal cabling standards require. At 55 meters, more telecommunications rooms and more Ethernet switches would be required in each building.
For OSUN and COTC to compete
effectively for students, the IT infrastructure would have to be designed to
support many different and bandwidth-intensive applications. Video on demand would be accessible through portal technology, presenting students with
a dashboard for storage and retrieval of
many types of files. A high-speed infrastructure would also be required to expand conferencing services, already in use for nursing and other fields, so that
full motion video could be carried across
the campus or across the state, enabling
teachers to multicast to many classrooms at once.

A BENEFICIAL ARRANGEMENT
The shared arrangement between
OSUN and COTC has proven beneficial to both colleges, Deaver says. While
admissions and advising remain separate, this combined campus affords both
organizations the cost efficiency of sharing facilities and resources, such as classrooms, computers and the IT infrastructure. In fact, sharing facilities and resources has allowed more services for
students than would have been possible
on separate campuses, even as enrollment has grown nearly 40% over the
past several years.
The creation of an advanced learning environment continues to provide
OSUN/COTC a competitive edge over
other colleges in the area. Aside from top-notch programs and staff, new facilities will continue to be required to meet enrollment projections. Already, a modern academic building, auditorium and conference center have been built. A new student center and library are also in the project plan.

In the digital media design lab, students and teachers have access to conferencing and collaboration capabilities enabled by the campus’ high-speed connectivity.
To fully support video and other technical services that would give the
schools an edge in student recruitment,
the campus network was designed with
a 10-Gigabit Ethernet switched fiber
backbone linking all buildings. This fiber ring would support not only current and planned applications but also
would allow the campus to link with
other schools in Ohio and across the
nation. In Ohio, a dark fiber initiative
has already linked the top 20 schools in
the state to facilitate research and collaboration, as well as enable distance
learning between Ohio schools.
With a 10-Gbps fiber backbone in place, design priorities for the infrastructure turned to enabling gigabit to the desktop for every student and faculty member. Standards for 10-Gigabit Ethernet over UTP, however, were not in place at that time, and are not ready today. In the absence of such standards, Deaver and his team were able to select cabling products using a common principle from the electronics industry for calculating the capacity required for communications between devices: Shannon’s Law.
Also known as Shannon’s capacity, Shannon’s Law gives the maximum rate at which a channel can carry data without error for a given bandwidth and signal-to-noise ratio, so capacity can be expressed in bits per second for various cable distances. Taking into account the additional headroom required to overcome noise produced by active hardware, such as jitter, which is especially troublesome at the higher frequencies, a Shannon’s capacity of 18 Gbps is required from the cabling infrastructure to achieve 10-Gbps transmission at 100 meters.
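The capacity calculation itself is straightforward. The following Python sketch is a minimal illustration of how a Shannon-capacity figure for a four-pair UTP channel can be estimated; the SNR curve, its constants and the 625-MHz sweep are assumptions chosen for illustration, not measured CopperTen data.

import numpy as np

PAIRS = 4             # a 10GBASE-T style link transmits on all four pairs
BANDWIDTH_HZ = 625e6  # sweep frequency discussed in the article

def snr_linear(f_hz):
    # Hypothetical SNR-vs-frequency curve for a 100-meter channel.
    # Real curves come from measured insertion loss, return loss and
    # (alien) crosstalk; this one simply decays with frequency.
    snr_db = 42.0 - 30.0 * np.sqrt(f_hz / BANDWIDTH_HZ)
    return 10.0 ** (snr_db / 10.0)

# Shannon's Law applied per narrow sub-band, dC = log2(1 + SNR(f)) df,
# summed across the usable band and over the four pairs.
freqs = np.linspace(1e6, BANDWIDTH_HZ, 10000)
df = freqs[1] - freqs[0]
per_pair_bps = np.sum(np.log2(1.0 + snr_linear(freqs))) * df
total_bps = PAIRS * per_pair_bps

print(f"Estimated Shannon capacity: {total_bps / 1e9:.1f} Gbps")
# The article's rule of thumb: roughly 18 Gbps of Shannon capacity leaves
# enough margin for coding overhead, jitter and other noise to carry a
# 10-Gbps payload over 100 meters.
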
THE 100-METER SOLUTION
When Deaver was searching for a
10-Gbps UTP solution in early 2004,
cable manufacturers could not demonstrate Shannon’s capacity of 18 Gbps at
100 meters, due primarily to alien
crosstalk, which is the amount of noise
measured on a pair within a cable that
is induced from an adjacent cable. Several vendors could meet the 18-Gbps
requirement, but only at 55 meters.
At a time when the OSUN/COTC
staff considered the prospect of designing cable routes and telecom rooms to
accommodate 55 meters, KRONE,
which was acquired by ADC in 2004,
was able to demonstrate Shannon’s capacity in excess of 18 Gbps over a
100-meter cable with its new
CopperTen cabling. Instead of using the
traditional star filler to separate pairs
within a cable, the CopperTen cable design uses an oblique star filler that achieves a higher degree of separation, creating sufficient space between the same color pairs in adjacent cables to prevent alien crosstalk.
In the absence of standards for
10-Gbps Ethernet over UTP, Shannon’s Law provided the guidance needed to find the correct solution. “Shannon’s
capacity really told us what we needed
to watch for, to make sure our network
was future-proof,” explains Deaver. “We
don’t have that future network now.
Our idea was to put in the best cabling
that we could possibly get so when those
10-gig copper cards come out, we’ll be
ready.”
Because ADC’s CopperTen cabling is
backward compatible with the TIA/EIA
standards for Category 6 cabling and
could demonstrate 18-Gbps transmission over 100 meters, Deaver was able to have the infrastructure designed for
standard 100-meter distances on each
floor. With long hallways to deal with,
all other solutions that could only
handle 10-Gbps transmission at 55
meters would have greatly increased the
cost of the project, he says.
In the Founder’s Building, 110,000
feet of CopperTen cable was installed
so that five telecom rooms could be consolidated into one. If the cabling system supported 10 Gbps at only 55 meters, the cost for the building infrastructure would have doubled, according to Deaver. “For Founder’s, a 55-meter solution would have required two
telecom rooms and two Ethernet
switches,” he says, citing the six-figure
cost of an Ethernet switch.
In addition, Deaver and his team continue to deploy voice over IP (VoIP)
where it makes sense to consolidate
voice and data traffic. While VoIP will
probably never require the bandwidth
available from a 10-Gbps cabling solution, an infrastructure with fewer
telecom rooms will require less active
equipment to support VoIP, reducing capital expenditures for the campus
network. As with any project, the cost
of active equipment dwarfs the cost of
the passive network infrastructure.
“If we were to really future-proof the
network, we would need a (cabling) system that would allow us to consolidate
our data closets to as few as possible.
Anything less than 100 meters was going to cost us a lot of money today and
in the long run,” says Deaver.
The condition of existing cable paths
in all buildings was not a surprise. Years
of growth and change had resulted in a spider web of cabling, with no apparent thought given to ongoing management of the infrastructure. Once
cable and connectors were selected,
Deaver and his team relied on
Starcomm, the cabling contractor, to
design appropriate cable routes for each
building.

200,000 FEET OF CABLE
Starcomm removed most of the old
cable and installed new cable trays and
ladder racks throughout each building
to create defined cable routes. Proper
access was provided so that moves, adds
and changes in the future would be
easier to perform. The installation team
found no real difference between working with CopperTen and Category 5e
or Category 6 cable.
Most significant, however, was the
tight installation timeframe that required the majority of the work to be
performed during the summer break.
The installers removed old cable, created new cable routes, installed about
200,000 feet of cable, and dressed and
terminated everything in less than 60
days.
With any major project, risk is a consideration, especially in the absence of
standards to guide infrastructure design
and product selection. One way risk was
reduced was by requiring that the new
10-Gbps cabling system be fully channel and component compliant with Category 6. Achieving noise and loss characteristics that satisfy transmission at
both 250 MHz for Category 6 and 625
MHz for 10 Gbps was accomplished
with ADC’s CopperTen augmented
Category 6 cable.
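For readers unfamiliar with what compliance at two sweep frequencies involves, the Python sketch below shows the shape of such a check: one measured parameter is compared against a limit curve at every frequency in the sweep. Both curves are placeholders for illustration only; they are not the TIA or ISO limit formulas and not CopperTen measurements.

import numpy as np

def limit_insertion_loss_db(f_mhz):
    # Placeholder limit line: allowed loss rising roughly with sqrt(f).
    return 2.1 * np.sqrt(f_mhz) + 0.020 * f_mhz

def measured_insertion_loss_db(f_mhz):
    # Placeholder "measured" response for a 100-meter channel.
    return 1.9 * np.sqrt(f_mhz) + 0.015 * f_mhz

def channel_passes(sweep_max_mhz, points=500):
    # Compliant only if the measurement stays at or under the limit
    # at every frequency in the sweep.
    f = np.linspace(1.0, sweep_max_mhz, points)
    return bool(np.all(measured_insertion_loss_db(f) <= limit_insertion_loss_db(f)))

print("Category 6 sweep to 250 MHz:", channel_passes(250))
print("10-Gbps sweep to 625 MHz:", channel_passes(625))
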


CopperTen is designed to work on the upper end of the scales being set by various standards working groups. Today,
TIA contends 10-Gbps signals should
be sent at 500 MHz, while ISO desires
a higher level, controlling the sweep out
to 625 MHz. ADC chose to design
CopperTen for transmission at
625 MHz, so that the cabling system
will be well within specifications once
standards are formally issued for
10-Gbps Ethernet over UTP.
What really mitigated the risk of the
project, however, was the warranty, says
Deaver. The CopperTen cabling system
was guaranteed for Category 6 channel
and component compliance. It was also
guaranteed for 18-Gbps capacity to enable 10-Gbps transmission. “The warranty made all the difference in the
world when it came to cable selection,”
says Deaver.
Investing in an infrastructure that
would support gigabit to the desktop today and someday take full advantage of the 10-Gbps campus backbone was about more than just the obvious
cost savings of not having to recable buildings when 10-Gbps over UTP became a reality. Rather, the motivation was
all about the students, Deaver offers.
“We are committed to providing an exceptional education experience. One
way we accomplish that goal is making
sure students can take full advantage of
the technologies available,” he says.



About ADC
ADC was founded in 1935 and today provides global network
infrastructure products and services that enable the delivery of
high-speed Internet, data, video and voice services. The company
has sales in more than 150 countries.
With the acquisition of the KRONE Group in 2004, ADC now
provides an integrated portfolio of products for enterprise networks with TrueNet Structured Cabling Solutions. TrueNet combines cable, connectivity and cable-management solutions for fiber, 10-Gigabit Ethernet over UTP copper and Category 6/5e from the data center to the desktop.
Industry veteran Robert E. Switz is chief executive officer and president of ADC. He has played an
instrumental role in transforming ADC in recent years, developing and implementing the strategies
that are extending the company’s leadership in network infrastructure solutions for all types of
networks. Switz joined ADC in 1984 and has served as CFO, president of the broadband access and
transport business unit, and executive vice president. Switz also serves as a director on the boards
of Hickory Tech Corp. and Broadcom Corp.
For more information from ADC:
www.adc.com

Reprinted from Communications News, March 2005
Copyright © 2005 by Nelson Publishing Inc. • www.comnews.com


