The telecommunications industry is built on standards. If each company used its own protocols for moving information and managing services, telecommunications would be a group of islands with awkward, limited connections to one another. Standardization means that users don’t have to know or care who provides services for the parties they communicate with. Connections are transparent and smooth.
The development process is an evolutionary one. Standards bodies and industry consortia create new standards and update existing ones, aiming to increase consistency and keep pace with new technological developments. The amount of logic that can fit on a chip keeps growing, making more functionality possible.
A Brief History of Telecommunication Standards
At one time, telecommunication operations were generally national operations, run by government agencies or legal monopolies. Bell Labs and AT&T dominated communications in the United States and had a strong worldwide influence. The result was a high level of standardization but a slow rate of progress, since there was no competition to spur innovation.
In 1982, a consent decree mandated the breakup of AT&T, and in 1984 its local telephone operations were divested into regional holding companies. In the 1990s, cable TV companies began to deliver digital services. People increasingly had multiple options for voice and television, and the emerging Internet promised even more.
The Internet was based from the beginning on the TCP/IP standard. This allowed the transmission of any kind of data, whether text documents, voice, or video. The early history of data formats was chaotic, but gradually a limited number of formats gained in popularity, as it became possible to view a PDF file or JPEG image on any computer.
ISO developed the OSI network model in the 1980s. It defines seven layers, from the physical to the application, around which networking standards have developed. TCP/IP doesn’t strictly follow OSI, but its layers map approximately onto the model. Today most networking standards fit the OSI model, though some of them straddle layers.
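The correspondence between the two models can be sketched in a few lines of code. This is an informal, illustrative mapping only; the TCP/IP suite was not designed around OSI, and the layer groupings below are the commonly cited approximation rather than anything normative.

```python
# Illustrative sketch: the seven OSI layers and a rough, informal mapping
# of the four TCP/IP layers onto them. The mapping is approximate.

OSI_LAYERS = {
    7: "Application",
    6: "Presentation",
    5: "Session",
    4: "Transport",
    3: "Network",
    2: "Data link",
    1: "Physical",
}

# Informal mapping: TCP/IP layer -> OSI layers it roughly spans.
TCPIP_TO_OSI = {
    "Application": [7, 6, 5],   # e.g. HTTP, SMTP, DNS
    "Transport":   [4],         # TCP, UDP
    "Internet":    [3],         # IP, ICMP
    "Link":        [2, 1],      # Ethernet, Wi-Fi
}

for tcpip_layer, osi_numbers in TCPIP_TO_OSI.items():
    names = ", ".join(OSI_LAYERS[n] for n in osi_numbers)
    print(f"{tcpip_layer:12s} -> OSI {names}")
```

The "straddling" mentioned above shows up directly in the mapping: the TCP/IP application layer alone spans three OSI layers.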
Interoperability among standards has become important, so that information can flow from one type of equipment to another without difficulty. At one time voice communication, television, and data transfer were separate realms. Today all of them may be part of the same session.
Standardization has reached into new areas. There is a growing impetus to adopt standards at the application level, which was once considered the private domain of each company. Automation of business practices has led to the emergence of standards for placing and fulfilling orders and performing maintenance and upgrades. They make products and services obtained through business partners as easy to manage as ones that don’t involve third parties.
The Standards Development Process
The first model for a standard is often a single company’s work. The company that developed it would like everyone to use it, and the advantage in the market generally outweighs the cost of relinquishing some control. Standards bodies choose the most promising candidates. The original version often has unstated assumptions and some unclear descriptions. The standards body gets input from stakeholders and works toward a specification which is clear and unambiguous.
In most cases, there are several draft releases before the final version. Often the organizations interested in a standard want additional features or changes to existing ones, and it can take years before all stakeholders are satisfied.
Sometimes progress is so slow that companies create products based on drafts that aren’t final. If the changes in the final version are minimal, there may not be any problems, though a product update may be necessary to achieve 100% compatibility. HTML5 is a notable example: over six years passed between the first public draft and the W3C Recommendation, and browsers and servers began supporting it well before its final form.
After a standard is released, organizations may call for changes to it. They may see problems in the specification or want more features. Then there is a new cycle of drafts for the next version. The new one tries to keep backward compatibility with previous versions where possible, to avoid making the existing product base obsolete.
Data and Cable
Let’s look now at some of the standards which are advancing the state of telecommunications today.
Several standards have contributed to data over cable. DOCSIS is one of the most important. It was defined by CableLabs, with the support of many leading telecommunications companies. The first release, DOCSIS 1.0, came in 1997. The latest versions, DOCSIS 3.1 (2013) and DOCSIS 3.1 Full Duplex (not yet finalized), support higher speeds, more efficient use of bandwidth, and better energy management. Version 1.0 supported 40 megabits per second downstream and 10 upstream; the full-duplex version of 3.1 supports 10 gigabits per second simultaneously in both directions.
The United States’ and Europe’s versions of DOCSIS have important differences, largely because they grew from different television standards. The differences mostly affect how signals are physically sent; higher levels are the same regardless of geography.
DOCSIS deals with the physical, data link, and network layers of the OSI model. It works with the IP protocol, including IPv6.
There has been an increasing push for standardizing not just the way bytes move but the higher levels of communication. This has led to the development of standards for defining and delivering services. The definition of “service” is very broad, covering any aspect of communication which is delivered under an agreement. A service can be built on any network layer or combination of layers. It has a business aspect as well as a technological aspect.
The MEF Forum has issued over 60 standards relating to IP services. An IP service is an aspect of providing and managing an IP network. A sampling will give an idea of their range.
- MEF 2 concerns Ethernet service protection. “Protection” here means maintaining availability of service rather than security. It provides a framework for defining protection types and levels, allowing SLA compliance to be measured objectively.
- Several standards, including MEF 40, 41, and 42, deal with SNMP (Simple Network Management Protocol), which concerns device management in a network.
- MEF 48 gives requirements and use cases for service activation testing of Carrier Ethernet.
- MEF 51.1 defines terms for Operator Ethernet services, i.e., for a service provided to an operator or a service provider.
- MEF 55 provides a framework for lifecycle service orchestration (LSO), facilitating automated deployment and management of services. It specifies a set of APIs for coordinating orchestration with partner domains as well as customers.
The TM Forum APIs provide an open specification for interfaces in the MEF 55 LSO Reference Architecture. They define REST-based calls covering the various aspects of service automation. The Forum’s Open API Manifesto stresses the need to “execute at the speed of the digital economy.” Signatories to the manifesto agree to use open APIs as much as possible in relevant products and processes.
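A REST-based service order of the kind these APIs define can be sketched as a JSON payload. The endpoint path, field names, and values below are illustrative only, loosely modeled on TM Forum conventions rather than copied from any specific specification.

```python
# Sketch of a service order in the spirit of the TM Forum Open APIs.
# All field names and values here are illustrative, not taken from a
# particular TM Forum specification.
import json

def build_service_order(service_name: str, bandwidth_mbps: int) -> dict:
    """Assemble a JSON-serializable order for a hypothetical ordering API."""
    return {
        "externalId": "order-0001",
        "orderItem": [
            {
                "action": "add",
                "service": {
                    "name": service_name,
                    "serviceCharacteristic": [
                        {"name": "bandwidth", "value": str(bandwidth_mbps)},
                    ],
                },
            }
        ],
    }

order = build_service_order("carrier-ethernet-line", bandwidth_mbps=100)
print(json.dumps(order, indent=2))
# In a real client, this body would be POSTed to the provider's ordering
# endpoint over HTTPS.
```

The point of an open specification is that any customer or partner system can assemble and interpret such a payload the same way, regardless of which vendor built it.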
The LSO architecture defines three domains:
- Customer: The user of services from a provider.
- Service Provider: The vendor of services in a network infrastructure.
- Partner: Other providers of services, delivering them through the service provider.
The names of the APIs follow a musical theme, playing on the concept of orchestration.
- Cantata and Allegro define the interface between the customer and the provider.
- Legato, Presto, and Adagio specify the connections between business applications, service orchestration, infrastructure management, and element management for a provider or partner.
- Sonata and Interlude specify the interface between a service provider and a partner.
The Benefits of Keeping Up with Standards
In the short run, locally developed techniques or a vendor’s proprietary protocols may seem like the best way to build up some aspects of a network. They may offer extra efficiency or convenience because they’re tailored for a specific situation. Sometimes they’re a necessity, because no generally accepted standards exist. When broadly accepted standards are a feasible alternative, though, they offer significant long-term advantages.
An obvious benefit is avoiding lock-in to one vendor. When vendors know that moving to an alternative would be difficult, they can charge premium prices. Following standards allows choosing hardware and software from a variety of sources.
Lock-in can also turn into dead-end technology. If the sole vendor stops making the product, it’s necessary to move to an incompatible one, with the attendant costs of re-engineering.
Standards make devices more economical and reliable. All vendors can use the same chips, resulting in economies of scale. The chips have been more widely tested and are less likely to have serious bugs. Re-inventing a solution inevitably introduces bugs which need to be found and fixed, increasing the time to market.
Employees with relevant skills are easier to find when equipment and services follow accepted standards. Finding an engineer with skills in an unusual niche is apt to take longer and require paying more.
Businesses can’t afford to lag too far behind the latest standards. It isn’t necessary for every company to be on the bleeding edge, but clinging to old standards means falling behind the competition. Others will offer more and better services.
The best approach is to devise a long-term plan for tracking new standards and adopting them at a healthy pace. The right balance between stability and progress isn’t the same for everyone, but delivering services that are both reliable and up to date is a necessity for any business.