
Wednesday, May 16, 2012

The IP Finance weblog welcomes the latest guest post from Keith Mallinson (WiseHarbor) on a subject which he has really made his own: the subtle interplay of sometimes competing and sometimes congruent private and public interest in the shaping of the dynamics of the market for the licensing of technology in the information, communication and telecom sector.  This post touches on a topic that has been the subject of all-too-little discussion in IP circles.  IP rights help to establish the existence of winners in the marketplace -- but who gets to decide who those winners should be?
The Folly of Picking Winners in ICT

Government attempts to favour and promote certain business models, companies and technologies are justifiably criticised. The UK Cabinet Office’s proposed policy to mandate the use of only pre-selected, royalty-free standards in public ICT procurement is similarly flawed. It will limit choice by foreclosing many popular open standards, the numerous products which adhere to them and the companies which depend on upstream licensing revenues. The Open Standards Board responsible for implementing this policy will face significant governance challenges in ensuring impartiality in standards selections. In contrast, free-market processes allowing competition among a much wider array of open standards and software licences maximise customer choice across many different government departments, foster innovation, reduce lifecycle costs and enable obsolete or poorly performing standards to be superseded.
Mandating particular standards and discriminating against or excluding royalty-based business models in government procurement constitutes hazardous industrial policy for the UK. The government is the largest UK ICT spender, with annual expenditures of approximately £18 billion in recent years. Under this policy, the direct and likely indirect consequences of such a large purchaser on the ICT marketplace would be significant, including explicitly or implicitly obliging citizens, as well as suppliers of other goods and services to government, to adopt the same standards.

Dirigisme versus facilitation
Governments have a history of making bad decisions in championing particular companies, technologies and business models. For example, the Inmos semiconductor company received £211 million from the UK government in the 1970s and 1980s with its strategy to produce commodity D-RAMs and develop its “transputer”, but the company foundered, did not become profitable after many years and was sold to SGS-Thomson in 1989. The UK is effectively nonexistent in semiconductor manufacturing today. The UK’s “fabless” semiconductor companies such as ARM, Picochip (acquired by Mindspeed Technologies in 2012) and Icera (acquired by NVIDIA in 2011) rely on partners including foreign “foundries” to fabricate their designs. 
State monopoly France Telecom forced adoption of the Minitel videotext online service in the 1980s by withdrawing phone books and spending billions giving away the terminals to citizens. The associated technological standards and equipment manufacturers made minimal headway with Minitel technologies abroad and were eclipsed by the advance of the Internet in the 1990s. Minitel provided consumers with their first means of online access. However, views on long-term benefits to French consumers are mixed. Resistance to replacing the entrenched home-grown standard caused France to be a laggard in Internet adoption.

In contrast, supporting entire industry sectors where a nation has strategic strength is more justifiable and attracts widespread support from various commentators. For example, clustering of complementary and competitive companies can be beneficial. In these circumstances, market forces spur competitive behaviour, including some Schumpeterian “creative destruction”, which helps eliminate the sclerosis and risks that come with monoculture. For example, Silicon Valley in California provides a fertile technical and commercial environment in which various business models and many ICT companies, standards and products have flourished while others have failed.

Better for less

A key stated objective of the proposed Cabinet Office policy is to level the “playing field” for open source and proprietary software. It is, therefore, perverse that standards based on Fair, Reasonable and Non-Discriminatory (FRAND) licensing and requiring patent fees should be the principal target for elimination under this policy. The policy will automatically also exclude many proprietary offerings that are based on those standards and which cannot practically be adapted to other, royalty-free, standards. In many cases, such standards are widely implemented by many suppliers and are used by the vast majority of business customers and consumers.

The Cabinet Office seeks to mandate specific royalty-free standards to achieve various objectives including cost reduction and avoiding vendor lock-in, as well as making ICT solutions fully interoperable. However, a report entitled Better for Less, published in 2010 by Liam Maxwell, now Deputy Government CIO and the proposed policy’s champion, identifies that most UK government ICT spending is with systems integration companies including HP/EDS, Fujitsu Services, Capgemini and IBM. The Government’s over-reliance on large contractors for its IT needs, combined with a lack of in-house skills, is also a “recipe for rip-offs” according to a report by the Public Administration Select Committee (PASC) in July 2011. These suppliers are typically deeply embedded with long-term contracts that government finds difficult to unravel.

Software represents only a relatively small playing field in comparison to others in ICT spending. According to Forrester Research figures, market segments where open source software competes or combines with proprietary software products represent just 12.4% of $2.5 trillion total global business and government ICT expenditures including operating system software (1.0%), non-custom-built applications (6.7%) and middleware (4.7%). In comparison, IT services (11.6%) and outsourcing (9.8%) combined represent 21.5% of spending. Computer equipment represents 13.9%. The $2.5 trillion total appears to exclude very significant costs for internal staffing.
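The scale of these shares can be totted up directly from the figures above (a minimal sketch in Python; all percentages and the $2.5 trillion total are as reported in the text, and the totals are subject to rounding):

```python
# Sanity-check the Forrester segment shares quoted in the article.
# Percentages are of total global business and government ICT spend;
# the $2.5 trillion total is as reported (and excludes internal staffing).

TOTAL_ICT_SPEND = 2.5e12  # USD

software_segments = {
    "operating system software": 1.0,        # percent of total
    "non-custom-built applications": 6.7,
    "middleware": 4.7,
}

software_share = sum(software_segments.values())
software_spend = TOTAL_ICT_SPEND * software_share / 100

print(f"Software segments where open source competes: {software_share:.1f}%")
print(f"Absolute spend: ${software_spend / 1e9:.0f} billion")
```

At these shares, the contested software segments amount to roughly $310 billion, small beside the combined 21.5% going to IT services and outsourcing.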

Software licensing costs are included even in modestly-priced PCs. The PASC report also indicated it was “ridiculous that some departments spend an average of £3,500 on a desktop PC”. A 2011 Cabinet Office press release stated it would “end poor value contracts such as those where Government departments and agencies paid between £350 and £2,000 for the same laptop”. The response to a government procurement freedom of information request on this matter, published by fullfact.org, shows that these prices actually represent totally different PC specifications. The proprietary operating system and office document software is identical in each case; differences in microprocessors, displays, wireless modems and functionality such as fingerprint recognition account for the very large pricing disparity.

Uncertain scope, invalid distinctions

The proposed policy states that standards selection will be limited to software interoperability, data and document formats. The scope of these terms is unclear, and in the next few years it will become even more difficult to separate standardisation in these domains meaningfully from others. The consultation’s terms of reference make the invalid assumption that software is distinct from hardware and that telecommunication is distinct from computing. Evidence weighs against these assumptions, given increasing technological convergence and other changes in ICT. Smartphones and tablets are becoming the dominant computing platforms in our personal lives and at work, just as PCs overtook mainframe computers and revolutionised ICT usage from the 1980s. Communications is intrinsic to these new mobile devices and is increasingly integrated with most desktop PCs, including web- and cloud-based usage where demarcations between software, hardware and service are submerged.

Video is becoming ever more prevalent. According to long-standing Cisco CEO John Chambers, in a recent Bloomberg Businessweek article, “Every device, five years from now, will be video. That’s how you’ll communicate with your kids, with work.” Switching video standard is nothing like the peripheral task of simply replacing or adapting the mains plug on a TV set. Interoperability standards for video compression and encoding are highly complex algorithms that are deeply and extensively embedded in the workings of core hardware and software. Around one third of Internet traffic is streaming video, and mobile video traffic already exceeds 50%. Virtually all of that conforms to FRAND-based standards requiring patent licensing, including AVC/H.264 (MPEG-4 Part 10), which has the most widespread adoption.

The customer is always right

Standards requirements change with technological innovations and shifting user needs. It is very difficult for any centralised government administration to anticipate or react to the dynamics of ICT supply and demand. Competition among standards is highly beneficial. Market forces precipitate occasional revolutionary changes, with new standards displacing old ones (e.g., HTML substituting for videotext standards such as that used by Minitel), as well as continuous, incremental improvements to existing standards (e.g., HTML5 replacing previous versions of HTML). Changes in user preference and demand can be difficult to predict. For example, within a few years of the introduction of Apple’s iOS-based iPhone in 2007 and Google’s Android in 2008, former smartphone market leaders Nokia and RIM, each with its own operating system software, were completely up-ended. The highly innovative capabilities of the new software platforms and devices succeeded because they were very different to, and much better than, what they replaced.

Different government departments have diverse needs. Whereas interoperability among UK government departments is important, so is optimising interoperability and access by end users, commercial partners and international organisations. Defence requirements can preclude the most widespread propagation of interoperability and encryption standards. Maximising functionality, security and interoperability for patient records among health authorities will be compromised by imposing standards that are chosen to accommodate requirements in education.

From a user’s perspective, functionality and interoperability with other users trump supply-side considerations, including the number of prospective ICT suppliers and lowest price.

Upstream savings, downstream costs

Although they eliminate licensing fees, open source software and royalty-free standards do not ensure lower overall costs. On the contrary, there is significant evidence that open source is no cheaper than proprietary solutions when total ICT lifecycle costs, including project implementation and support, are considered. In many cases, total costs may even be lower with the technical efficiencies and large economies of scale that arise from implementing popular royalty-charging standards. It is practically impossible to create some high-performance ICT standards without infringing any patents for which royalties might be demanded.

Patent fees on popular FRAND-based standards are typically modest. Patent pool administrator MPEG LA licenses 2,339 patents it deems essential to H.264 from 29 licensors to 1,112 licensees for a maximum per unit rate of $0.20. This covers the vast majority of patents declared as essential to the standard. With around 6 billion mobile phones in service worldwide, aggregate royalties are low enough for GSM phones to be sold at price points down to less than $20. However, these fees significantly enable technology companies with upstream business models. They also allow vertically-integrated players to recoup some of their development costs from companies with downstream business models who make products but do not invest in developing the standards-based technologies. Eliminating the possibility of royalties merely forecloses upstream business models in favour of the downstream businesses, such as those that dominate government ICT spending, including hardware manufacturing, systems integration, technical support and outsourcing.
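The scale of such per-unit fees relative to device prices can be illustrated with simple arithmetic (a sketch in Python; the $0.20 cap is the MPEG LA figure above, while the handset prices are hypothetical round numbers, not market data):

```python
# Illustrative only: the $0.20 maximum per-unit H.264 royalty is the
# MPEG LA pool cap quoted in the article; device prices are hypothetical.

H264_MAX_ROYALTY = 0.20  # USD per unit

for label, price in [("low-end handset", 20.0), ("mid-range handset", 200.0)]:
    share = H264_MAX_ROYALTY / price * 100
    print(f"{label} at ${price:.0f}: royalty cap is {share:.2f}% of price")
```

Even at the cap, the royalty comes to 1% of the price of a $20 handset, and proportionately less for dearer devices, consistent with the point that aggregate royalties on popular FRAND-based standards are modest.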

Open and competitive ICT markets allow the widest range of business models and licensing practices, including royalty-free standards and open source software. There are many examples of open source software running on FRAND-based standards requiring royalty fees; for example, various proprietary and open source codec implementations are available for the H.264 video standard. It would be nonsense to bar this standard in favour of another with only tiny adoption (the most fundamental barrier to interoperability among users) and inferior or unproven performance, including technical compliance and interoperability among implementations. In the case of video, for example, a replacement would most likely infringe some of the very same patents used by the successful standard it displaced, so there is a significant possibility that patent fees would be required despite wanting to wish them away. Developing a high-quality video codec standard is a formidable task drawing upon a great deal of intellectual property. Designing around the best technologies to avoid royalty-bearing ones will result in inferior standards and implementations.

There is generally no conflict between open source licensing and paying patent royalties to third parties. In certain cases where there is conflict, it is of the licensors’ own making. The most stringent open source licences, such as GNU GPLv3, under which “patents cannot be used to render the program non-free”, are seldom used because of such conflicts. In cases where licensing prohibits patent fees, the only legal solution is for such software to be written to ensure it does not infringe any IP that has not also been specifically declared royalty-free by its owner.

Governance with selector selection

The Open Standards Board responsible for implementing the policy will face significant governance challenges in ensuring impartiality in its members and the standards selection processes they oversee. It will be difficult to recruit board members who have the required competence in ICT standards, and who as individuals, employees, or academics, are completely free of any interests in the outcome of any standards selections. Members will be affected by their other interests in specific companies, standards groups and business models.

International harmonisation and liberalisation

The European Commission’s approved guidelines on the applicability of Article 101 of the Treaty on the Functioning of the European Union (TFEU) to horizontal co-operation agreements recognise the importance and value of standardisation agreements.
“Standards which establish technical interoperability and compatibility often encourage competition on the merits between technologies from different companies and help prevent lock-in to one particular supplier.”
These guidelines lay out a comprehensive approach for conformity of standardisation agreements with Article 101 TFEU, creating a “safe harbour” while affording standard-setting organisations significant autonomy in setting policies for disclosure of IP and its licensing terms. FRAND licensing, with and without payment of royalties, is explicitly recognised. The licensing policies of many international ICT standards-setting organisations, including IEEE, ETSI, ITU-T and CEN/CENELEC, are consistent with these guidelines and with the charging of patent fees on their standards. It would be a travesty to exclude their standards from government usage in the UK, even if only for what the Cabinet Office delineates as software interoperability, data and document formats.

Friday, February 3, 2012

It has been a little while since the IP Finance weblog has hosted a piece from regular guest contributor and ICT patents and standards expert Keith Mallinson (WiseHarbor) -- but we are pleased to welcome him back for his first post for 2012:
ICT Esperanto and Competition among Standards

Open and competitive ICT markets produce many standards, but not all will flourish or even survive. With free choice, customers and their users often overwhelmingly plump for one standard over others in pursuit of highest performance (e.g., from HSPA in 3G cellular) or widest interoperability (e.g., with SMS for mobile messaging).

Significantly different levels of compliance and interoperability among vendors, all notionally supplying to the same standard, may also cause customers to favour one vendor over all others. If product performance and interoperability are inferior to those of established standards from leading vendors, the introduction of new and open standards or alternative vendors will fail with customers. Increasing the choice of standards or suppliers is useless if implementations do not work properly. In many cases that requires full interoperability with large bases of existing users.

However, coexistence of competing standards, including proprietary extensions to these, is also vital to facilitate innovation, satisfy diverse requirements and enable the emergence of new leading standards. Strong user preferences for certain standards justifiably reward those who have had the foresight, taken the risks and made the investments to develop and promote them, and to build a user base for their fully-featured and standards-compliant products.

Evolution beats creation with rich and widespread use

The need for high levels of performance and interoperability in ICT can be illustrated with an analogy in human language. According to Wikipedia, the goal with Esperanto’s creation was “an easy-to-learn and politically neutral language that transcends nationality and would foster peace and international understanding between people with different regional and/or national languages”. However, using Esperanto has never become more than a niche activity. It is spoken by less than 0.1% of the world’s population, versus more than 10% with English as a first or second language. Evolved languages have richer vocabularies, much more extensive literature and large numbers of existing mother-tongue and second-language users. English, Spanish, French, German and other languages have remained preeminent across regions, nations and within particular international domains in business, art, ICT and engineering. Customer preference and supplier leadership for books and courses by indigenous organisations, for example Collins for English as a foreign language and the Goethe Institute for German, are a natural consequence of the development of market supply for language education.

Improving interoperability among different suppliers’ kit to eliminate so-called vendor lock-in is a common requirement of centralised procurement authorities, but it is not the only, let alone the most important, need in selecting or mandating standards. Standards get displaced when alternatives provide distinctly better functionality or end-user interoperability. For example, the introduction of the GSM standard in mobile communications from 1992 was a major technological step forward from the various different analogue technologies deployed in European nations, and no other digital standard was allowed to contend. GSM increased network efficiency in the use of scarce spectrum allocations, introduced a succession of new capabilities including text messaging and data communications, and enabled much wider geographic interoperability for users with international roaming. It also created a fresh start for European manufacturers who had been impeded by disparate national analogue standards with correspondingly fragmented handset and network equipment markets. The openness of the GSM standard also provided greater choice and created an expectation of less customer dependency on particular vendors. The needs of consumers, operators and equipment vendors were so much better satisfied with a new standard that it was quite possible to ignore the complications of backward compatibility and start afresh in European cellular with GSM.

Backward compatibility and dual-mode working

In order to preserve interoperability while adding new functionality and improved efficiency in ICT, it has sometimes been necessary to create standards with backward compatibility to older standards or to provide dual-mode working for many years. In the UK, two of only three TV channels were broadcast in monochrome with 405-line VHF transmissions until 1970, when colour was introduced with the addition of 625-line UHF transmissions. Although the colour transmissions were backward compatible with 625-line UHF monochrome receivers, a simulcast was required to serve old, single-standard TV sets until the 405-line VHF network was eventually closed down between 1982 and 1985. In the US, cellular systems including incompatible and competing digital CDMA and TDMA technologies, introduced from the mid-1990s, were designed to incorporate existing analogue capabilities to ensure national roaming. The requirement for cellular licensees to maintain capabilities for analogue roaming throughout their networks was not relinquished until 2008. Whereas 3G and 4G technologies (including WCDMA, HSPA, LTE and LTE Advanced) surpass the performance of 2G technologies (including GSM, GPRS and EDGE) in spectral efficiency, network speeds and latency, 2G technologies will most likely remain embedded in our phones for at least another decade so we can continue to roam widely worldwide. Similarly, HSPA will be included in multimode devices to maximise access to mobile broadband coverage where LTE is not present or uses incompatible frequencies. Most Blu-ray players sold for decades to come will retain the ability to play our old libraries of regular DVDs.

In fact, with the low incremental costs of retaining old standards, due to software-defined capabilities running on cheap processing and memory, manufacturers find it increasingly attractive to retain old standards and incorporate multiple standards in their products. For example, as I draft this article, Microsoft Word offers me the option to “Save As” in any of 18 different formats, including .odt (ODF OpenDocument text), .html, .pdf and .docx, as well as .doc. Apple’s iPhone 4S incorporates three distinctly different cellular standards, GSM, CDMA EV-DO and HSPA (plus several other closely-related and compatible standards in each case), across five frequency bands, plus WiFi and Bluetooth. Customer and end-user choice is maximised without excessive incremental costs. Crowds can now choose for themselves which standards they prefer to actually use.

Forbidden fruit is most succulent

Prohibiting use of popular standards and products in favour of “open” alternatives can significantly harm end users because, unsurprisingly, the former generally work best. Functionality, quality and interoperability among users must take precedence over ability to switch or mix suppliers. Document formatting standards provide a good example. It is a domain where standards selection has become a most prominent issue. While it is widely and correctly recognized there are problems with interoperability across different formats, e.g., going from ODF to OOXML, it is commonly and incorrectly assumed that all different vendor implementations of a particular document format will fully interoperate and faithfully reproduce identical documents after editing and saving.

Research by Rajiv Shah and Jay P. Kesan shows that supposedly the most open document standards do not in themselves ensure the highest, or even satisfactory, levels of interoperability in many cases when documents are transferred, edited and saved among different word-processing programs. On the contrary, compatibility among different vendors’ implementations of the same open document formats can be quite poor. In contrast, the leading proprietary standard has the greatest functionality, and this is best preserved when documents are exchanged, edited and saved only among different users of the same word-processing program or of different programs from the same vendor. This research included the three most popular word processor document formats: ODF, generally regarded as the most open format; OOXML; and .doc, seen as the most proprietary or closed. Given that open standards do not ensure interoperability among different vendors, there is no guarantee of the vendor choice and resulting price competition that authorities such as governments expect from procurement policies that insist on what are commonly regarded as the most open standards.

Interoperability case study

Will anything less than 100% standards compliance and interoperability ever be good enough? Whereas that goal is unachievable, particularly given that most standards must regularly be updated with various changes and must interoperate with other standards serving different purposes (such as presentations and spreadsheets as well as word-processing), it can be highly desirable to get as close as possible to that ideal. Personal experience illustrates how demanding conditions can be, with the risk of embarrassment or worse from seemingly slight incompatibilities or data loss. From time to time I am retained as an expert witness in litigation on temporary case teams with contributions from up to dozens of different firms (e.g., with many case co-defendants), including lawyers, economists and industry clients. Drafting and editing expert reports and other documents involves the “master” being passed around, with changes to text, graphics, footnotes and redline “track changes” implemented by many different people before being finalised and hurriedly submitted in advance of a fixed deadline. According to Shah and Kesan, as referenced above, these are precisely the types of document features that tend not to be preserved when documents are modified and re-saved in different vendors’ applications, let alone when transferring from one standard format to another. Several years ago, I was satisfied with my checks on a certain finalised word-processing document, but was subsequently horrified to discover that in a chart, created in a presentation program and faithfully reproduced in the word-processing document, the background shading had moved in front of a graph line when the document was converted into .pdf format immediately prior to submission. This obscured the key turning point that was the entire purpose of my chart. At the other extreme, many users may seek only basic functionality.
They might, quite reasonably, prefer to trade-off functionality and interoperability in order to pay the lowest price possible or obtain the document program for free.

Winner may take all – but not for ever

Standards can arise from a variety of origins and for various reasons. Communications standards including fax for document images, SMS for mobile messaging, SMTP for email, and TCP/IP and HTML on the Internet took hold rapidly and most extensively in their respective domains because the world lacked, while users desperately needed, widely adopted standards for interoperability. These characteristics were lacking, for example, in the closed environments of proprietary email systems used internally by corporations. The .doc and other office suite standards remain entrenched because they already provide the highest levels of functionality and interoperability for 95% of users. Usage also includes significant legacies with user-customised templates, including un-standardised macro programming, for particular business purposes such as order entry and monthly financial reporting.

The major fall in fax usage since the advent of interoperable and widely-adopted Internet-based email a decade ago (though most of us retain fax capability and still list fax numbers on our business cards), and a decline in SMS in recent years, show that even the most popular and seemingly enduring standards can eventually take a tumble with new technologies and alternative standards.

Mobile communications has flourished because of, not despite, extensive competition among standards. Multivendor mobile technology supply has not significantly constrained functionality and interoperability because new mobile standards were developed from inception to achieve these. The U.S. has thrived and now leads the world in network deployment of HSPA and LTE technologies, with the most rapid adoption of the most advanced smartphones. There has been competition among four different 2G technologies, several 3G technologies including CDMA (with EV-DO) and WCDMA (with HSPA), and most recently between WiMAX and LTE. The latter two are standardised by rivals IEEE and the 3rd Generation Partnership Project (3GPP) respectively; CDMA is standardised by another group called 3GPP2. This in turn has spurred operator competition in the U.S. and also accelerated technology developments worldwide. The 3G successor to GSM’s radio layer, called UMTS or WCDMA, has far more in common with CDMA than it does with GSM, and pioneering work in CDMA was of great benefit to WCDMA. Vodafone’s former CEO, Arun Sarin, issued a call to arms for LTE against WiMAX at the GSM Association’s 2007 Mobile World Congress. Later that year, Vodafone and its CDMA technology-based partner Verizon Wireless announced they would both pursue LTE as their common next-generation technology. A keynote presentation by Verizon Wireless CTO Dick Lynch at the 2009 Barcelona show announced the LTE vendor line-up and the most ambitious launch dates. The acceleration and strength of commitment to LTE, precipitated by the WiMAX challenge, has ensured that WiMAX will be kept to a minor position.

Current consolidation of cellular standards development in 3GPP has not eliminated competition and it does not preclude significant challenges from standards groups such as IEEE in the future.  3GPP cellular standards have become increasingly strong versus 3GPP2 and IEEE wireless and mobile standards, but there is internal competition within 3GPP and rival standards bodies will continue to present competitive challenges with innovations in rapidly growing and changing markets. Notwithstanding the rise of LTE, based on OFDMA protocols, CDMA-based technologies such as HSPA are also continuously being improved to closely rival the capabilities of LTE.  Some 3GPP contributors have distinct technical and commercial preferences for one standard over the other.

IP financing and just rewards

When selecting standards, customers and end-users in particular want the highest performance, the most exhaustive compliance and the widest user interoperability. It is no surprise customers and end-users tend to make the same selections. Apple’s popular iPhone, with its App Store, iOS and deep integration of software with silicon, is a very closed and proprietary system; but this provides superlative performance end-to-end. Microsoft is a leading beneficiary in word-processing applications, with its leading .doc standard and with its contribution to OOXML, because its implementations provide the richest functionality and most compliant interoperability among the widest base of users. Whereas there is no consensus on what qualifies, and what does not qualify, as an open standard, the likes of GSM, HSPA, EV-DO, LTE and WiFi are as open as anything on offer in their respective domains. Major contributors, for example Ericsson and Intel respectively, have been principal beneficiaries. Their gains are in upstream licensing income, downstream product markets, or both. Either way, the returns reward those taking significant risks and making investments in developing standards, products and markets full of standards-compliant users.