Tuesday, June 19, 2012


Bjarne Sorenson, Bang & Olufsen's Automotive Technology Director, is a very lucky guy. As we sat in the Aston Martin DBS listening to music, he described the approval process for the company's recent work with BMW. As BMW was putting the final touches on its new 6-series, Sorenson got to spend a few days driving the Autobahn at 100 mph in the car, making sure the optional Bang & Olufsen audio system produced the same perfect sound at high speeds as it did at a stop.

Sorenson was also lucky in his timing, launching Bang & Olufsen's automotive practice about eight years ago, just as automakers began a decade of scrambling to sign deals for branded audio systems in their cars. His practice has grown into a significant part of Bang & Olufsen's business, with placements in cars from Aston Martin, Audi, Mercedes-Benz, and BMW.
At the Geneva show, Sorenson gave me a tour of the systems in three different cars, providing an excellent opportunity to hear the different implementations. We started at the bottom, with Bang & Olufsen's latest work in the all-new Audi A3. I had previously been impressed by how Audi gives the A3 as much available tech as the higher-end cars in its line-up. The inclusion of a Bang & Olufsen audio option, which Sorenson said would go for 900 euros, is further evidence that Audi will leave no car behind.

For the new A3, Bang & Olufsen used 14 speakers and a 705-watt amp with 15 channels. In front, the doors each hold a woofer and a midrange driver, with tweeters mounted in the A-pillars. A subwoofer sits under the floor of the cargo area. Missing from this system are the acoustic lenses, which have become a signature for Bang & Olufsen.

With the tone controls all set flat, Sorenson played his test CD. Most of the music we listened to was jazzy, featuring a balance of guitar, percussion, and some vocals. The system reproduced the music with strong detail and timbre. On a track with an acoustic guitar, I could hear the musician's fingers slide along a nylon string. From high volume to low, the music came through cleanly, without losing background elements.
The next car we went to was the Aston Martin DBS, which Sorenson said was his favorite of the work Bang & Olufsen had done in the automotive space. This car, one of Bang & Olufsen's earlier efforts, has 13 speakers, with two being acoustic lenses that rise up from the corners of the dashboard. Two amps power these speakers, one with 250 watts going through 13 channels and the other with 750 watts and five channels.

In this car Sorenson again put in his test CD, but initially chose a track that was nearly all drums. Turning up the volume, I was blown away by the presence of the sound, the sense that each drum hit was happening right in front of me. The depth was astonishing, letting me hear the different types of drums and the full range of sound each produced. A symphonic piece we listened to showed the extreme detail of this system. I could hear the distant sound of the musicians turning pages in the music, and a woodwind player taking a breath before setting off on a solo.

On the tracks we had heard in the A3, the audio quality of the DBS, a much more expensive system, was clearly superior.
For the last part of the tour, we got into the Mercedes-Benz SL AMG. For this roadster, Bang & Olufsen used 12 speakers and a 940-watt amp. It has the acoustic lenses that rise from the corners of the dashboard, a center speaker mounted between the seats facing forward, and woofers that have been designed into the footwell space of the car.

For this part of the tour, we were hampered by the open top and the crowds of people checking out the car. Instead of using Sorenson's CD, we tried a USB drive I was carrying with MP3 tracks compressed at 320 kilobits per second.

On a track by Adele, her voice came through as clearly as I have ever heard it, the system reproducing vocal tones nicely. The production on this album, 21, is not very good, though, flattening much of the backing musicians' work. The deep bass guitar on a track by The XX was reproduced very well by this system, the low tones coming out with enough pressure to be felt, yet not causing speaker rattle.

Sorenson suggested Bang & Olufsen was being very selective about the automotive brands with which it would partner, focusing on the European luxury market. He expressed no interest in working with Lexus, for example. It also seemed the company will not cheapen the brand; the system in the A3 is about as low as it will go.
Category: articles
Automotive Technology and Powersports Program Students

Graduates of the Automotive Technology program experience an exciting period of transition as technologies continue their shift toward much higher fuel efficiency. Recruiters and employers of SUNY Canton's graduates include dealerships, service industries, automobile manufacturers, and parts suppliers. Graduates learn how to troubleshoot, diagnose and repair all aspects of the automobile power train, suspension, steering, braking and air conditioning systems.
Mission Statement:

We educate and train future Automotive Professionals. Using critical thinking skills, students will understand the history of the industry, evaluate the dynamics of the present, and envision mobility in the future.
Students in this Major:

* Utilize the latest technology in an electronics-based curriculum.
* Acquire extensive hands-on experience in well-equipped laboratories.
* Receive a world class education in automotive mechanical, technical, and services areas.
* Learn on late model cars donated by automotive manufacturers.
* Get special attention from faculty in small laboratory classes.
* Are encouraged to take the Automotive Service Excellence certification test upon completion of course work.
* Enjoy outstanding career placement.

Career Opportunities:

  • Automotive Service Technician
  • Service Manager
  • Service Advisor
  • Industrial Research/Development
  • Automotive Machine Shop
  • Auto Parts Manager/Owner
  • Technical Representative
  • Automatic Transmission Technician
  • Wheel Alignment/Suspension Technician
  • Maintenance Technician
  • Fleet Maintenance Supervisor/Technician
  • Heavy Equipment Maintenance Technician

Recent Employers of SUNY Canton Graduates:

  • Advanced Auto Parts
  • Blevins Brothers dealerships
  • Chrysler Corporation
  • Fay Motors Cadillac Dealership
  • Firestone Tire Company
  • Ford Motor Company
  • General Motors Corporation
  • Goodyear Tire Company
  • Mort Backus and Sons Chevrolet Buick and GMC
  • NAPA Auto Parts
  • Sears
  • Snap-On Tools Corporation
  • Subaru Distribution Corporation
  • Taylor Rental Corporation
  • Toyota (Lexus Division)
  • Troyer Race Car Engineering
  • Various dealerships throughout NYS
  • Many graduates own their own businesses 

    Career Outlook:

    The U.S. Department of Labor cites a strong demand for qualified automotive technicians and master technicians.


    Transfer Opportunities Include:

    * SUNY Utica/Rome, SUNY Oswego
    * Indiana State University

    National Accreditation:

    SUNY Canton’s automotive technology program has received the highest level of national certification following an extensive review and analysis. The program has been certified by the National Automotive Technicians Education Foundation (NATEF) and the National Institute for Automotive Service Excellence (ASE). The certification gives the already-renowned program instant credibility nationwide and will assist graduates in their career pursuits anywhere in the nation.


    To achieve this coveted recognition, the school’s automotive technology program underwent rigorous evaluation and met or exceeded nationally accepted standards of excellence in areas including instruction, facilities, and equipment.

    * Applicants who have completed a two year vocational-technical automotive program may qualify for advanced standing (transfer credit).

    Admission Requirements:

    * Refer to the table of high school course prerequisites for admission.
    * Students must be qualified to enter Applied College Mathematics (MATH 101)

    Category: articles

    FERRARI'S 458 Italia has taken another international prize, winning UK magazine Car's performance car of the year 2010.

    "The 458 Italia is an extraordinary Ferrari," says Car editor Phil McNamara. "That one of the fastest, most thrilling and communicative supercars of all time is also one of the easiest to drive day in, day out is a remarkable achievement.
    "Next year's new McLaren supercar will have to be extraordinarily talented to dislodge the Italia."

    Car magazine's "Performance Car of the Year" test took place over more than 4000km in Europe earlier this year, with competitors from eight different brands. The 458's Car gong follows a chestful of COTY medals including those of Fifth Gear, BBC Top Gear Magazine and Auto Express.

    The Ferrari 458 Italia was revealed at the Frankfurt Motor Show in September 2009 and has since had journalists flailing for superlatives.
    Category: articles
    The first car bought by John Lennon after he passed his driving test is being offered for sale at auction.

    The Ferrari 330 GT 2+2 Coupe was bought by the Beatle in 1965 and is expected to sell for between 120,000 euros ($A156,188) and 170,000 euros ($A221,267).
    Lennon's biographer, Philip Norman, described how car dealers descended on the singer's home when the news emerged that he had passed his test and offered him a "gleaming smorgasbord" of luxury cars.

    But it was the Ferrari that caught his eye, with Lennon paying STG2,000 for the car. It was soon joined by his trademark Rolls-Royce, complete with psychedelic paintwork, and a Mini.

    The car will be sold by auction house Bonhams at a sale in Paris on February 5.
    Category: articles

    IT'S A Ferrari, but not as we know it. This is the first 4WD Ferrari.

    Famed for its two-seater, rear-wheel-drive supercars, the Italian manufacturer has revealed its first four-wheel drive, the four-seater Ferrari FF. The latest addition to the Maranello prancing horse fleet is also a hatchback or "shooting brake", but unlike any normal hatchback.
    Its 6262cc direct-injection V12 engine delivers 485kW of power, slinging the red missile from standstill to 100km/h in just 3.7 seconds and on to a maximum speed of 335km/h.

    But the main point of difference in the "Ferrari Four" is the addition of four-wheel drive for the first time, which places it in even closer competition with all-wheel-drive Lamborghinis. Ferrari's patented 4RM four-wheel-drive system is claimed to weigh half as much as other systems and to provide a balanced weight distribution, with 53 per cent over the rear axle.

    While no details of how the drive system works have been released, it is believed Ferrari favours a part-time system. This could be a system activated by driver selection or when slip is detected in the rear wheels, or one engaged at lower speeds that then reverts to rear-wheel drive for better fuel economy and performance.

    It is integrated with the car's electronic dynamic control systems and has the latest version of Ferrari's magnetic suspension damping system and Brembo carbon-ceramic brakes.

    Carsguide has published spy photos from Carparazzi of the car heavily disguised but looking frumpy in the rear end. However, with the covers removed it appears Italian design house Pininfarina has produced a sleek supercar that looks like an aerodynamic version of the 1970s Jensen Interceptor.

    It has generous space for four passengers and even 450 litres of luggage. With the rear seats down, luggage space increases to 800 litres.
    The new four-seater gran turismo style puts it in direct competition with the four-seat GTs that have emerged over the past few years, such as the Porsche Panamera and Aston Martin Rapide.

    The FF will make its official debut at the Geneva Motor Show in March and will arrive in Australia early next year. Australian importer Ateco says it will replace the 612 Scaglietti in its four-car line-up.

    The current Scaglietti sells here for $698,000, but the FF's drive system is expected to boost that price. It would join Ferrari's current Australian line-up of California Convertible ($459,650), 458 Italia ($526,950) and 599 Fiorano ($677,250).

    The Italian manufacturer is currently enjoying record sales in the US and China. In Australia, Ferrari sold 126 cars last year, up 21.2 per cent, double the market trend.

    Ateco spokesman Edward Rowe says the FF will appeal to "people who want a Ferrari that is able to be used across a broad range of uses. What's been happening over the past 10-15 years is that the average mileage Ferrari owners drive has been increasing significantly every year, and Ferrari owners want to be able to use their cars in a much wider range of uses," he says.

    "The idea of this car is it's fully capable of going to a high-speed performance day and then take you and your family and skis in the car down to the snow for a ski weekend. This illustrates the enormous breadth and ability of this car."

    Rowe says the FF is "still a true supercar" in performance and handling. "It remains a true mid-engined Ferrari but at the same time it's like no other Ferrari that's gone before it."

    Rowe says they already have "double figures" of customers "putting their hand up" for the FF.

    Ferrari FF

    Price: TBA
    Engine: 65-degree 6262cc direct-injection V12
    Power: 485kW @ 8000 rpm
    Torque: 683Nm @ 6000 rpm
    Dimensions (mm): 4907 (l), 1953 (w), 1379 (h)
    Dry weight: 1790kg
    Weight distribution: 47% front, 53% rear
    Top speed: 335 km/h
    0-100km/h: 3.7 sec
    Economy: 15.4L/100km
    CO2: 360g/km
    Category: articles

    WITH a roar and a chest-crushing leap forward, Ferrari's FF accelerates from the dirt carpark with barely a spit of stone and dust.

    The surprise isn't the acceleration - though 3.7 seconds to 100km/h is pretty good - rather the fact that 485kW/683Nm has been instructed to put all these numbers through the wheels and the result is barely any wheelspin. This is where the Ferrari FF - the company's first all-wheel drive - excels.
    Born from customer demand for a car that can traverse slippery snow and sand tracks, it has been a difficult development that started in 2004. The FF uses an all-wheel-drive system that is breathtaking in its simplicity: a direct drive from the front of the crankshaft to a small box with three cogs, one multi-disc clutch box, and two shafts, one to each front wheel.

    The rear wheels are driven conventionally, in this case through a seven-speed dual-clutch transmission - another first for Ferrari - that's mounted with the electronically-controlled diff at the rear.

    There is no mechanical connection between the drive to the front wheels and the drive to the rear wheels. There are numerous benefits to this system. Firstly, it's very light (Ferrari claims 40kg, half that of a traditional all-wheel-drive system) and small (the power transfer unit is only 170mm long).

    The clutches are monitored and controlled by computers to allocate torque where needed. For example, when grip is lost at one front wheel, torque is reallocated to the other. This is constantly monitored so maximum grip is available to either front wheel when needed. The same use of electronics applies to the rear diff.

    Because there is a speed difference between front wheels and rear wheels, Ferrari has incorporated two gears in the PTU that alter the ratios from the engine to the wheels. To pick up any smaller differences, the clutch pack is allowed to slip.

    Ferrari says the two-cog gearbox in the PTU is good for up to 200km/h, at which point the system disengages drive to the front wheels. In extreme situations, such as when the rear wheels are on ice, up to 20 per cent of maximum torque can temporarily go to the front wheels to provide maximum traction.
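    The behaviour described in the last few paragraphs, per-wheel front clutches that reallocate torque on slip and disengage at speed, can be sketched as a toy model. The 200km/h cutoff and the 20 per cent front-torque cap come from the article; the function name, slip inputs, and the grip-proportional split are illustrative assumptions, not Ferrari's actual 4RM control logic.

```python
# Illustrative sketch of the 4RM-style front-torque allocation described
# in the article. The 200 km/h cutoff and 20% front-torque cap are from
# the text; slip thresholds and the split heuristic are assumptions.

FRONT_DRIVE_MAX_SPEED_KMH = 200   # PTU disengages above this speed
MAX_FRONT_TORQUE_SHARE = 0.20     # at most 20% of torque goes forward

def front_torque_split(speed_kmh, rear_slip, left_grip, right_grip):
    """Return (left_share, right_share) of total engine torque sent to
    the front wheels, each as a fraction of total torque."""
    if speed_kmh > FRONT_DRIVE_MAX_SPEED_KMH or rear_slip <= 0.0:
        return (0.0, 0.0)          # pure rear-wheel drive

    # Scale front torque with rear slip, capped at 20% of total.
    front_total = min(rear_slip, 1.0) * MAX_FRONT_TORQUE_SHARE

    # Allocate between the two front clutches in proportion to grip,
    # mimicking the per-wheel reallocation the article describes.
    total_grip = left_grip + right_grip
    if total_grip == 0.0:
        return (0.0, 0.0)
    return (front_total * left_grip / total_grip,
            front_total * right_grip / total_grip)

# Rear wheels on ice at low speed: the full 20% cap goes forward,
# split evenly when both front wheels have equal grip.
print(front_torque_split(30, rear_slip=1.0, left_grip=0.5, right_grip=0.5))
# Above 200 km/h the front drive is disengaged entirely.
print(front_torque_split(220, rear_slip=0.5, left_grip=0.5, right_grip=0.5))
```

    The point of the sketch is the absence of any mechanical coupling: torque shares are just numbers computed from sensor inputs, which is why the real system can be so light and compact.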
    Category: articles

    Thumbs Up: The first truly new automotive technology in decades.

    Thumbs Down: New “lower” price makes car more expensive.

    Buy This Car If: You want an electric car that has a virtually unlimited range.

    Even before the current post-crash-test fire media hype, the Chevrolet Volt was the most misunderstood and politicized car in the history of the automobile. Detractors called it “just another hybrid,” without bothering to learn what makes the Volt different from ordinary parallel hybrid automobiles like the best-selling Toyota Prius. Worse, they panned it for “only” getting 37 miles per gallon in generator mode, or “only” having a battery range of 35 miles, missing the point of the car entirely. General Motors was upfront during both the development and the launch of the car: for an admittedly narrow range of the population, the Volt is the perfect automobile. For the rest of us, it’s a very good choice for a four-seat sedan that blends battery power and range-extending generator power.


    I’ve covered the technology behind the Volt on several occasions, so I’ll just summarize here. Unlike conventional, parallel hybrids, which use a battery and electric motor to supplement a fuel-efficient gasoline motor, the Volt is what’s known as a series hybrid, or extended-range electric vehicle. Its primary propulsion system is always an electric motor, driven by either a lithium-ion battery pack or by a gasoline-powered generator, which produces electricity to power the motor when the batteries are 70 percent depleted.


    If that’s hard for you to wrap your head around, don’t think of the Volt as a car. Instead, think of it as a miniature version of a diesel-electric submarine, or a diesel-electric locomotive (albeit with a gasoline engine). While critics will point out that the Volt is driven by its gasoline motor, too, we’d argue that this is only partially true. Electric motors have the benefit of making peak torque at zero RPM; in other words, they can move an electric car from a standing start with a surprising amount of haste. The downside to electric motors is that they run out of speed at high RPMs, which makes them less than ideal for automotive applications. What good is a car that doesn’t have sufficient power for a passing move at 60 miles per hour?


    To solve this problem, GM engineers used a planetary gearbox to provide supplemental torque from the engine to the Volt’s electric motor under certain conditions, such as passing at (or above) highway speeds. Critics point out that this means the gas motor can drive the wheels, but that’s not entirely true. There is no direct linkage between the motor and the drive wheels, and should someone steal your electric motor, you can’t just start the gas engine and drive home. The Volt is an electric car, but it adds the functionality of a gas-powered range extender.
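    The drivetrain behaviour in the last few paragraphs can be reduced to a toy state model. A minimal sketch, with labelled assumptions: the 70 percent depletion trigger is from the review, while the 60 mph assist threshold, the function name, and the mode labels are invented for illustration and are not GM's actual control strategy.

```python
# Toy model of the Volt-style series-hybrid behaviour described above.
# "Generator starts when the battery is 70% depleted" comes from the
# article; the 60 mph threshold and all names here are assumptions.

BATTERY_DEPLETION_TRIGGER = 0.70   # generator kicks in at this depletion
ENGINE_ASSIST_SPEED_MPH = 60       # planetary gearset blends engine torque

def drive_mode(battery_depletion, speed_mph):
    """Return the propulsion sources for a given state.

    battery_depletion: 0.0 (full charge) .. 1.0 (empty)
    """
    if battery_depletion < BATTERY_DEPLETION_TRIGGER:
        # Pure battery-electric driving; no gasoline used.
        return "electric (battery)"
    if speed_mph >= ENGINE_ASSIST_SPEED_MPH:
        # Generator runs, and the planetary gearbox adds supplemental
        # engine torque for passing at or above highway speeds.
        return "electric (generator) + engine assist"
    # Generator produces electricity; the electric motor still drives
    # the wheels, as in a diesel-electric locomotive.
    return "electric (generator)"
```

    Calling drive_mode(0.2, 30) yields battery-only driving, while drive_mode(0.9, 70) shows the case where the engine lends torque through the planetary gearset.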


    As for the fire issue, the post-crash incident occurred in one car, with a fire risk presented in two others. To understand how minimal the danger is, first you need to understand the NHTSA crash test procedures. Following the side impact test, the car is put on a rotisserie and spun 90 degrees for five minutes to check for leaking gasoline (which presents a much greater risk of fire). In the case of the crash-tested Volts, the impact created a leak in the battery coolant system. Over a three-week period, this leaked coolant dried, crystallized, and caused a short-circuit in the electronics surrounding the battery pack. As a result, the batteries overheated and caused smoldering, or in one case, a fire.


    Why is this a non-issue for Volt owners? First, an impact severe enough to rupture the battery coolant line (in the center of the battery pack itself) will likely result in the total loss of the car. Second, GM has now issued procedures on de-energizing the Volt’s battery pack for post-crash workers. Finally, there have been no documented cases of an undamaged Volt being the cause of a fire. If you’re concerned over a fire risk, ponder this: every day you likely park a conventional car, filled with highly combustible gasoline, in a structure attached to your house. All it takes to create a very real fire risk is a leaking fuel line or electrical short, yet few people give this a second thought.


    Taken for what it is, the Volt is a very comfortable and reasonably entertaining four-seat sedan. In battery mode, it can travel roughly 35 miles on a single charge, using no gasoline in the process. In fact, Jay Leno now has over 11,000 miles on his personal Chevy Volt, and he’s yet to put gas in the car. If you travel less than 35 miles per day, the Volt’s gasoline-powered generator will likely start only to cycle fuel through the system.


    On the other hand, the Volt is one of two electric cars on the market today (the other being the $96,000 Fisker Karma) that you can jump into in New York and drive clear across the country, stopping only for gas. The Volt will run in generator mode as far as you need it to, which makes it as practical as a conventional gasoline powered automobile.

    “But,” critics will argue, “it only gets 37 mpg in generator mode. My Prius gets about 50 mpg.”

    That may be the case, but try driving a Prius for 11,000 miles without once putting gas into it. The Prius may be a more fuel-conscious choice for commutes above a certain distance, but current models can’t be driven to work in battery mode. The Prius is a parallel hybrid, not an extended-range electric vehicle.


    Therein lies the plus and minus of Volt ownership. It’s the perfect car for those who only need the ability to haul three others, drive less than 35 miles per day and have the $40,000 price of admission. In our eyes, it’s far more practical than a Nissan Leaf, which requires a lengthy charge when the batteries are depleted. Even a quick charge, which replenishes the Leaf’s batteries to 80 percent, takes 30 minutes, and that assumes you can find a quick-charge station.


    For the rest of us, the Volt is still a solid choice even if it isn’t an absolutely perfect fit. Behind the wheel, the Volt accelerates with more enthusiasm than the 0 – 60 time of 9.2 seconds would indicate, and you can really feel the car’s 273 pound-feet of torque when you mash the accelerator. Since the Volt is silent in battery mode, it takes your senses some time to adjust to the lack of noise and vibration. On long road trips, the Volt feels more like a luxury car than a mainstream sedan.

    In generator mode, you hear the 1.4-liter engine running, but the noise is minimal and vibration is never intrusive. Since the engine speed is determined by load on the electric motor and not by throttle position, the disconnect takes some getting used to. Stomp the accelerator going up a hill, and the engine may remain near idle. Lift off as you crest the hill, and the engine may suddenly go to redline, as the car attempts to replenish a charge to the batteries.

    If there’s a drawback to the 2012 Chevy Volt, it’s that the car is, on an apples-to-apples basis, more expensive than the 2011 Chevy Volt. Last year the car had a base price of $40,280, but that included the Bose audio system, Bose speakers and navigation system. This year, the 2012 Chevy Volt starts at $39,145, but adding the audio system with navigation adds $1,995, while adding the Bose speaker system adds another $495. In other words, a car that cost $40,280 in 2011 will cost you $41,635 in 2012. Is that a big deal? No, but given that the most common objection we’ve heard to the Volt is its price, we’d say the price isn’t moving in the right direction.
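    The apples-to-apples comparison is simple arithmetic on the figures above, and it checks out:

```python
# 2011: base price included the audio/navigation and speaker upgrades.
price_2011 = 40280

# 2012: the same equipment must be added as options (figures from the review).
price_2012 = 39145 + 1995 + 495

print(price_2012)               # 41635
print(price_2012 - price_2011)  # 1355: the comparably equipped car costs more
```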


    Since Chevy is making an effort to get Volt testers to dealers in all fifty states, we’d encourage you to go drive one before you make your mind up about the car. While it isn’t perfect for all drivers and commutes, it works well enough to be worth a look for most of us.


    Chevrolet supplied the 2012 Volt for my evaluation. My press fleet tester had a base price of $39,145, and came equipped with the $1,995 Audio / Navigation System, the $1,395 Premium Trim Package (perforated leather seats, heated front seats, leather-wrapped steering wheel), the $695 Rear Camera / Park Assist Package (ultrasonic park assist, rear vision camera), the $595 Polished Aluminum Wheels and the $495 Bose Premium Speaker System, for a total sticker price of $45,170.


    Since the Volt exists in a rather exclusive class of automobiles, comparisons to pure electric or parallel hybrids are somewhat irrelevant. The only directly comparable, series hybrid competitor to the Volt is the Fisker Karma luxury sedan, which has a starting price of $95,900.
    Category: articles

    In today's excerpt – the accelerating pace of change. I began my career in financial services in the late 1970s. In my first decade in that industry, there were only a handful of new products introduced, and the number of new computer products offered to support our industry by then-dominant IBM was equally small. Today, there are new financial products introduced every day and an uninterrupted explosion of new technology offerings. In fact, when graphed, the pace of change in an increasing number of industries is moving forward at an exponential rate, doubling every one to two years.

    In the view of esteemed inventor Ray Kurzweil, the pace of technological change in our world is now accelerating so rapidly that there will soon be no distinction between human and machine or between physical and virtual reality - and eventually our technology will match and then vastly exceed the refinement and suppleness of what we regard as the best of human traits:
    "Consider this parable: a lake owner wants to stay at home to tend to the lake's fish and make certain that the lake itself will not become covered with lily pads, which are said to double their number every few days. Month after month, he patiently waits, yet only tiny patches of lily pads can be discerned, and they don't seem to be expanding in any noticeable way. With the lily pads covering less than one percent of the lake, the owner figures that it's safe to take a vacation and leaves with his family. When he returns a few weeks later, he's shocked to discover that the entire lake has become covered with the pads, and his fish have perished. By doubling their number every few days, the last seven doublings were sufficient to extend the pads' coverage to the entire lake. (Seven doublings extended their reach 128-fold.) This is the nature of exponential growth.

    "Consider Garry Kasparov, who scorned the pathetic state of computer chess in 1992. Yet the relentless doubling of computer power every year enabled a computer to defeat him only five years later. The list of ways computers can now exceed human capabilities is rapidly growing. Moreover, the once narrow applications of computer intelligence are gradually broadening in one type of activity after another. For example, computers are diagnosing electrocardiograms and medical images, flying and landing airplanes, controlling the tactical decisions of automated weapons, making credit and financial decisions, and being given responsibility for many other tasks that used to require human intelligence. The performance of these systems is increasingly based on integrating multiple types of artificial intelligence (AI). But as long as there is an AI shortcoming in any such area of endeavor, skeptics will point to that area as an inherent bastion of permanent human superiority over the capabilities of our own creations.
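    The parable's arithmetic is easy to verify: seven doublings multiply coverage 128-fold, so pads covering just under one percent of the lake overrun it entirely. A quick sketch (the starting coverage is an illustrative figure, not from the text):

```python
# Exponential doubling, as in the lily-pad parable: starting from just
# under 1% coverage, seven doublings (2**7 = 128x) overrun the lake.

coverage = 0.0079   # fraction of the lake covered (illustrative start)
doublings = 0
while coverage < 1.0:
    coverage *= 2
    doublings += 1

print(doublings)    # 7 doublings to exceed full coverage
print(2 ** 7)       # 128-fold growth, matching the parable
```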
    Category: articles

    Monday, June 18, 2012

    One of the most vexing subjects in the area of innovation is the interrelationship between innovation and imitation, and its implications for intellectual property. More particularly, at least since 1986, when David Teece published his classic article, "Profiting from technological innovation: Implications for integration, collaboration, licensing and public policy", the question was, and is, who is more likely to capture value from innovation -- the innovator, the imitator, or the downstream providers of so-called "Complementary Assets", such as manufacture, distribution and marketing? Teece focused on intellectual property under the rubric of "Appropriability Regime" and asked this question: is the innovator's appropriability regime strong or weak? If the former, and if the innovator did not need to share a material portion of the value with the providers of complementary assets, it was more likely to reap the lion's share of the benefits.

    Teece's analysis came out of a different industrial era, standing, as he was, at the cusp of the digital age. While it is not stated explicitly, his analysis rests on the notion that there are sufficient incentives for the innovator: under the right circumstances, i.e., if his IP is sufficiently robust, he will be likely to capture significant value from his inventions and creations. In such a situation, with ample potential reward for the innovator for creating valuable IP, the imitator serves his classic role, designing around existing IP rights in a manner that does not infringe them, together with successful utilization of the complementary assets necessary to commercialise the development. In such a circumstance, IP is central to the Teece framework, both for the innovator and the imitator.

    How different is the role of IP and the imitator in today's world. One need look no further than an article that appeared in the June 2nd issue of The Economist. Entitled "VC Clone home: Venture capital in emerging markets", the article describes the phenomenon of "tropicalisation", which is defined as "the practice of backing start-ups that take an established business model and adapt it to an emerging market." The article refers inter alia to Peixe Urbano, described as a Brazilian clone of Groupon; Baidu, characterized as a "Chinese interpretation of Google"; and Trendyol, a Turkish version of Vente-privee.com tweaked for the local market. Perhaps the most interesting scalable attempt at imitation in this regard is Rocket Internet, which reportedly operates a " 'cloning' factory" that apes successful US and European businesses and then seeks to find entrepreneurs to export these clones to the developing world.

    One motivation for this phenomenon seems to be the diminishing track record of success for VCs in the developed world. The search for returns is driving them to seek returns further afield. What is interesting is that the focus of these investments seems to be less, indeed far less, on innovation of the kind described by Teece, and more on the imitation of successful business models in the social media and online commerce space. In considering the examples given in the article, one is hard-pressed to find even one instance in which breakthrough technology, supported by strong IP rights, is the driver of the adapted business model.

    In fact, the trade mark and brand of the businesses being imitated may be the most valuable IP asset of those companies, and it is the IP asset that is the least likely to be copied. Thus, it may be true, as Eric Archer of Monashees Capital states, "[W]ith innovation, you have a global side, but with copycat innovation you have geographical limits." However, change the name of the local copycat and adroitly implement the business model within the requirements of the local market, and the so-described "global side" of innovation, embodied in the company's trade mark and brand, may not be enough.

    Perhaps of most concern in the developments described in the article is the nature of the innovation being imitated. In comparison with the innovation contemplated by Teece, these kinds of VC-sponsored activities in the developing world appear IP-lite, resting almost entirely on successful exploitation of complementary assets in the local jurisdiction. Indeed, the closing words of the article should give pause to all those who wrestle with the challenge of engendering innovative IP in the developing world. The article concludes: "It will not be long before emerging markets spawn their own innovations that can be trotted out on a global scale. That would be closer to the spirit of venture capital, which is supposed to ferret out and fund new ideas, not imitations. Until then, however, tropicalisation is set to become an ever more popular strategy. Copy that."

    While that sounds uplifting, there is nothing in the article that supports this conclusion. It is equally plausible that tropicalisation will merely beget more tropicalisation. If Teece described the conditions for "good" imitation, the circumstances that surround tropicalisation suggest the opposite: "bad" imitation with little or no prospect for innovation, at least some of which would be supported by strong and robust IP rights.
    Category: articles

    Tuesday, June 12, 2012

    We just learned that Google has, somewhat quietly, discontinued Google Knol, a very interesting project that presumably never really took off and/or was so eclipsed by Wikipedia that Google decided to kill it. I'm a little surprised, because it seems to me folks should NOT start social media projects unless they intend to at least keep them posted online indefinitely. Otherwise, in my opinion, you have sort of broken an implied contract with your contributors that you will keep a website/project/media alive for the long haul. I can understand "no more support" for such things, because support can represent a significant expense, but we have more and more examples of big players abandoning projects and deleting content from the web, which in general is not cool at all. Google (and, in another example, Kodak's photo environment) generally allows people to download or move their content to other services, but I'd sure be upset if I'd devoted many hours to Knol only to learn all my time would soon be wasted by the big G. THAT SAID, Google deserves a lot of credit for creating a way for people to redirect Knol links to WordPress. It does appear they have anticipated the concerns expressed here and made it fairly easy for folks to preserve their work. Likewise with Kodak, which is pushing content to Shutterfly. So, in summary, maybe I'm overreacting here, given that the content will be preserved, just at different URLs:

    Frequently Asked Questions
    What is Knol?
    Knol launched in 2007 as a way for authors to work together to create authoritative articles on specific topics they know well.
    Why is Knol being discontinued?
    Google is prioritizing our product efforts so we can make things much simpler for our users and devote more resources to high impact products.
    When is Knol being discontinued?
    From now through April 30th, 2012, Knol will work as usual, but we’ve made it easy for you to download your knols to file and/or export them to WordPress.com. From May 1, 2012 through October 1, 2012, knols will no longer be viewable, but can be downloaded and exported. After that time, Knol content will no longer be accessible.
    How do I download my knols?
    From now through October 1, 2012, you can download your knols from knol.google.com. There will be a large section at the top of the page with links to either export your knols to WordPress.com or download them to a file. Clicking the latter will take you to Google Takeout where you can create your archive.
    What are the alternatives to Knol?
    Knol authors have worked together to create very informative and authoritative content. To continue fostering this, we have worked with Solvitor and Crowd Favorite to create Annotum, an open-source scholarly authoring and publishing platform based on WordPress.
    Why is WordPress.com the default destination for Knol content?
    The team at Automattic, which runs WordPress.com, has worked hard to enable Annotum and make its review and publishing functionality available to imported knols. The peer review workflow can be enabled for any Annotum site by following a few simple steps outlined in this knowledge base article.
    Can I redirect my knol’s URLs so people can find the content?
    Yes. Knols exported to WordPress.com will automatically be redirected. To manually set redirects, visit knol.google.com and click on My knols. On the Knols tab, each published knol will have a link to "Set a redirect URL". Click that link and enter your new target URL. Click Save to set the redirect. Note that users who try to visit the knol's original URL will see a page informing them that the page's author would like to send them to a new page, and they will be given the choice of whether to continue. Any co-author or co-owner of a knol has permission to set or remove a redirect URL for that knol until May 1, 2012, when redirects will be locked.
    Category: articles

    Monday, June 11, 2012

    You know, for a company that holds just a small percentage of the electronics industry market share (from 11%-17%, depending on whom you ask), Apple sure gets a lot of attention. And who can resist listening when Apple talks? Personally, I don't own a single Apple product, so I have no dog in this fight, but I can't help but pay attention when Apple starts to make announcements about future products.
    Remember this? I think the new products are a wee bit more advanced.

    At the 2012 WWDC (Worldwide Developers Conference), Apple has started to release the first post-Steve Jobs products. Undoubtedly, these products were planned before the passing of Apple's legendary leader, so this is the consumer's chance to see what's coming for the next 12 months and what they might find under the tree in December.

    The list is a little long for complete coverage, but here are some things you can expect from the conference:

    • A new MacBook Pro, updated with speedier processors and more oomph, including USB 3.0 support (lightning-quick data transfers)
    • Retina displays on some computers (as an option or standard? Not sure) 
    • A new iPhone and/or iPod? (probably not, but...) 
    Engadget has complete coverage details and posts breaking news as it happens. Go over to their site now for analysis and insight into Apple's latest offerings and, who knows? I might just ask Santa for a new Apple gizmo this Christmas. 
    Category: articles

    Monday, June 4, 2012

    The E-Bugster was specially designed for this mission: a two-seat Beetle speedster, 85 kW in power, 0 to 100 km/h in 10.8 seconds, with zero emissions yet the sharpest of proportions. The central electric module of the E-Bugster has an innovative design; it weighs just 80 kg. The energy for powering the electric motor is stored in a lithium-ion battery whose modules are housed in a
    Category: articles

    Porsche's mid-engine roadster, the two-seat Boxster, and its coupe counterpart, the Cayman, serve as the German brand's entry-level sports cars. The Cayman is covered in a separate report in the Cars.com Research section. The Boxster competes with the Audi TT roadster, BMW Z4, Chevrolet Corvette convertible and Mercedes-Benz SLK-Class.

    The Boxster is available in lightweight, more powerful
    Category: articles

    Range Rover Sport Limited Edition. Even more stylish. Even sharper. Even more contemporary. Available in Sumatra Black paint finish, with black door handles, grille and side vent mesh in black, as well as head-turning 20-inch Gloss Black alloy wheels. Inside, it's finer than ever with the Dark Lacquer veneers. Range Rover Sport Limited
    Category: articles

    On Concept 2012 Mopar Jeep J-12: The cabin of the J-12 concept has been "dressed down" in support of a basic truck theme, with the carpet replaced by rugged truck-bed liner and the bucket seats re-crafted into a modern version of a bench seat, trimmed in white with a whimsical plaid pattern.

    The J-12 concept is essentially an extended version of Mopar's recently
    Category: articles
    Windows rules the world.

    Despite the huge (and loud) Apple fanbase, the fact remains that Microsoft Windows is the predominant operating system by a factor of 10 or more. Since the 1980s, the majority of the world's computers have run on the system begun by Bill Gates and a few others in a small storefront in Albuquerque, New Mexico, decades ago.

    Windows Phone 7 with a look towards the future.
    And as each new version of Windows emerges for the consuming public, computing and the human/machine interface get a little bit better. Until recently, changes were evolutionary: a little better each time, but still recognizable as the same ol' Windows, just a bit more grown up.

    Not this time though. For 2012, Microsoft is debuting a brand new system called Windows 8 with some deeply interesting elements sure to catch the eye of Windows lovers all over the world. If you've seen the Microsoft smart phone operating system Windows Phone 7, you get the idea. Tiles on the home screen are "alive" with information like e-mails, weather, Facebook postings, sports scores, etc.

    Looks different, doesn't it?

    Much of what gives Windows 8 its unique appearance is what gives Windows Phone 7 its appearance and that's on purpose. The thinking is, if you become used to the W8 system, then you'll be more likely to see the WP7 system as comfortable, familiar and you'll be more likely to buy one or the other due to this familiarity. I don't pretend to understand this approach, but Microsoft seems to believe the concept has merit.

    No release date is available, but it's almost sure to be in the 4th quarter of this year. For those of you who still have questions, PCWorld.com has the answers to some of the more common queries users have been asking. But if we all wait just a couple of months, all of our questions will be answered when the system premieres. Click here to see if your question was answered.

    By the way, work has already begun on Windows 9. Some would call that too soon.

    I call it job security.

    Category: articles

    Thursday, May 31, 2012

    We yesterday posted our observations here about a recent talk given by Dr Kristina Johnson, a prominent figure in the world of engineering, as part of our continuing interest in the narrative that thought-leaders in the world of innovation employ with respect to the role of IP and in particular the role of patents. A reader was kind enough to share with us several additional links in connection with Dr Johnson and this important subject. We are delighted to bring these links (here, here and here) to the attention of our readers. 

    We thank our reader for these links and for contributing to the discussion on this important subject.
    Category: articles
    Today's splashdown of the SpaceX Dragon capsule off the coast of California completes the first successful visit of a private spacecraft to the International Space Station.

    This news isn't really China-related, but it illustrates an important aspect of state vs private sector investment. To over-generalize a bit, (American) conservatives would prefer that the private sector do everything, and (American) liberals would prefer that the state do everything.

    Here we have illustrated the importance of public-private partnership. Without the initial leadership of the state, the US would not have reached the moon in 1969. It took far more investment than any private sector investor would have been willing to risk for what was basically a huge experiment with no monetary payoff.

    Based on a few decades of learning, a private company has now managed to come up with a solution far less expensive than, and at least as effective as, anything the state could have devised to send supplies to the space station. And this is apparently just the beginning; the Dragon capsule will someday ferry people too.

    Bringing this lesson closer to my area of interest, perhaps it also makes sense that governments subsidize the development of alternative methods of fueling personal transportation. Right now, the economics of electric vehicles simply don't make sense. This is why it is important that governments step in to subsidize the experimental phase.

    As with the original moon shot, alternative fuels and methods of propulsion are experimental (and arguably far more important), but no private sector investor would be willing to take such a risk to build a product that doesn't yet provide a positive return on investment.

    This isn't to say that governments will always be right, but that's the nature of experimentation. Thomas Edison conducted over 3,000 experiments with different materials for filaments until he finally made a lightbulb that worked. For small experiments such as Edison's, no government help was needed, but without government regulation and assistance, we would never have moved beyond the traditional gas guzzlers we were driving back in the 1970s.

    Scorn of conservatives: The Chevrolet Volt

    When I began to research business-government relations in China's auto industry several years ago, I started with a question of why China seemed to be achieving such great success under the heavy hand of the state. What I have since learned is that, while, yes, the state can drive outstanding growth in an agrarian society with low living standards, the ability of the government to make good decisions declines as technology becomes more complex.

    China has now reached a point where its government needs to be smart enough to step back and allow the private sector to compete freely (and not just talk about it). All the government assistance in the world will not make state-owned businesses competitive. They simply lack the proper incentives.

    At the same time, the American people also need to realize that the US will never grow as fast as China has grown over the past three decades (and neither will China again). We also need to realize that there is a place for government to invest in experimentation that will result in much-needed future technology. Sometimes the government will make mistakes, but fear of making mistakes will deprive us of potential successes.

    Being counted among the most advanced countries on earth is not easy, as the Chinese are about to discover, but the notion that either the government OR the private sector is best suited to drive future innovation is a false choice. We need both. We need the competitive zeal of the private sector, and, sometimes, we also need the deep pockets of the government to take on problems too big for the private sector.

    The Chinese system of dominant state ownership is nearing the end of its usefulness, and if the Chinese don't figure that out, they are doomed to stagnation and chaos. US arguments over "socialism" vs "free markets" are also pointless. Somewhere in the middle there is an ideal system, in which ownership still lies primarily with the private sector, but the government takes the big risks that can potentially save the planet for our children and grandchildren.
    Category: articles

    In a hard-hitting report on the dire state of the economy, the European Commission called for drastic action to prevent catastrophe tearing the region apart.

    The single currency crashed to its lowest level for nearly two years last night as Brussels warned that the eurozone faces ‘financial disintegration’.

    It proposed that money set aside for keeping debt-ridden governments afloat should now be used to rescue troubled European banks.

    Read more:Daily Mail
    Category: articles

    Wednesday, May 30, 2012

    Those of you who follow me in the blogosphere know that one of my continuing interests is how thought leaders in the entrepreneurship space view the role of IP. I assume that IP practitioners are not involved in IP for its own sake (as intellectually engaging as we may find it), but rather for the services that IP may provide in enabling innovation and creativity. If that is so, the question is whether the practice of IP carries with it a certain dysfunctional tunnel vision, the far end of which yields only a narrow view that fails to take into account the broader context.

    One way that I constantly try to measure the potential dysfunctionality of my professional vision is to read or listen to thought leaders in the world of entrepreneurship. While I recognize that not every entrepreneurial idea need involve the kind of innovation that requires IP protection (particularly patents and trade secrets), it is still more likely than not that most successful entrepreneurs are acting in an area that one can expect to attract IP protection. To test this belief, I have become an avid listener of the weekly (during school term) podcasts offered by the DFJ Entrepreneurial Thought Leaders Seminar under the aegis of the Stanford Technology Ventures Program.

    Since this lecture series and the related entrepreneurship program are lodged within the world-famous Stanford Engineering program, and its cachet enables it to attract world-class figures in innovation and entrepreneurship, it has always seemed to me that the narratives set out by these lecturers should be a useful barometer for measuring how they view the role of IP. Accordingly, I was particularly eager to hear the remarks of Dr. Kristina Johnson, one of America's most distinguished engineering personalities here.

    Dr Johnson has a breathtaking resume, from Stanford Ph.D. to a university professorship, to scores of registered patents, to Deanship of the Engineering School at Duke University, to Provost at Johns Hopkins University, to a stint as an Under Secretary in the Department of Energy in the Obama administration, to her current involvement in an energy-related start-up. Given her broad engagement in just about every possible aspect of innovative activity, Dr Johnson seems to be an ideal observer about where IP stands in this world.

    Listening to her comments, I was struck once again by the pronounced sense of disjunction between her view of IP and that of the professional IP community. In commenting on her own professional trajectory, Dr Johnson mentioned patents only once, and that was in the context of an anecdotal description of an early idea of hers (which ultimately was not patented). While she mentioned in passing her numerous registered patents, we are not given any insight into how these inventions were exploited, if at all. While the Bayh-Dole Act here is briefly referred to, we are given no indication whether its enactment affected her inventive activity and/or whether it resulted in increased commercial exploitation of her inventions.

    Dr. Johnson's overall narrative was a compelling one, but IP played only a minimal role in the tale. It was the Q&A that brought out her further thoughts about IP. The first question was asked by a member of the audience who identified himself as retired from a well-known industrial (tech?) company of another era. The question was simple: how did she see the role of IP? It would seem that the question derived from the same sense that I had in listening to her comments--something seemed to be missing in her narrative.

    Dr Johnson's response was, more or less, to acknowledge trade secrets and patents (in that order as I recall), with the comment that patents served mainly to provide early stage "barrier to entry." No other potential benefits of patents were mentioned. One could claim that had she been given additional time or a heads-up on this question, she might have answered differently. That is possible, but for the record, her response was that the principal purpose of patents was as a barrier to entry. She added words to the effect that, in her view, patents are often overemphasized where it is really the overall know-how of the company and the ability to execute that ultimately determines the company's success. Stated otherwise, patents offer certain tactical benefits but little more.

    There is one more aspect to my tale. Stanford is the site of the National Center for Engineering Pathways to Innovation (Epicenter) here, a program funded by the National Science Foundation whose mission is to infuse entrepreneurship and innovation skills into undergraduate engineering programs. During the podcast, Dr Johnson was mentioned as an advisor to the Epicenter. Keeping in mind that the Epicenter has as its goal the inculcation of innovation and entrepreneurship skills in engineering students, here is the question: how does the Epicenter view the role of IP in carrying out its mission? Does Dr Johnson reflect the consensus, or are there other views about the role of IP? If so, what are they? Given the centrality of Stanford and Silicon Valley to engineering education as well as entrepreneurial activity, the answer to this question will have a material effect on the role of IP among the actors in that world.
    Category: articles
    "Key tax aspects of IP rights transfers", by Edita Ivanauskienė, Antanas Butrimas and Jurgita Randakevičiūtė (LAWIN Vilnius), provides a handy introduction to the niceties of the taxation of intellectual property right transactions in Lithuania. Published on International Law Office (here), this piece outlines the country's basic tax regime and then considers the position of tax-resident individuals, non-resident individuals, taxation of entities, Value-added tax (VAT) and the avoidance of double taxation.

    Regarding VAT,
    "The transfer of copyright and related rights, industrial property rights, franchise rights or know-how is considered a provision of services and is subject to value added tax at the standard rate of 21%".
    It always seems strange to this blogger that an outright transfer of an intellectual property right should be regarded as the provision of a service.
    Category: articles

    Tuesday, May 29, 2012

    Anti-Theft and Locks, for the Ford 2012 F-150 Platinum:
    • Locking pickup truck tailgate
    • Power door locks operated via remote, SecuriCode(TM) door keypad and internal switch, with automatic locking
    • Vehicle anti-theft via alarm, SecuriLock(R) engine immobilizer and exterior monitoring
    • Child safety door locks located on rear doors
    • Vehicle anti-lockout

    Braking and Traction.

        Electronic Transfer
    Category: articles

    Interior
    On top of its already comfortable, spacious interior, the F-150 offers a host of excellent standard features, such as the Easy Fuel™ Capless Fuel Filler and AdvanceTrac® with Roll Stability Control™ (RSC®), plus numerous optional upgrades and wheel choices, which differ per model.

    Exterior Features
    F-150 is tough and functional from the outside in. The available integrated
    Category: articles

    Friday, May 25, 2012

    To celebrate the imminent release of my book, Designated Drivers: How China Plans to Dominate the Global Auto Industry, I will be giving away three signed copies next week.

    Designated Drivers is about much more than the auto industry. It uses China's auto industry to tell a story about how China's central government manages its economy in its drive to create global industrial champions. It's an important story for anyone considering doing business in or with China as well as for policymakers who want to better understand business-government relations in what is still the world's most consistently fast-growing economy.

    If you would like a chance to win a free copy of my book, all you need to do is "like" the Designated Drivers Facebook page and share it on your page. The Facebook page is here:

    On Tuesday, May 29 at 08:00, Pacific time, I will select three "likers" at random* and send each of them an autographed copy of Designated Drivers.

    And if the number of "likes" surpasses 500, I'll throw in another two copies for a total of five possibilities to win!

    This morning a friend emailed me a picture taken at a bookstore in Hong Kong showing a few copies already on the shelves there, so it should only be another week or so before copies reach North America.

    * To ensure randomness, I will copy all names into an Excel spreadsheet, number them consecutively, and use Excel's random function to select three (or maybe five!) numbers from the list.
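    The footnoted procedure is easy to sketch outside Excel as well. Here is a minimal Python equivalent of the same drawing; the names below are hypothetical placeholders, not actual entrants:

```python
import random

# Hypothetical stand-ins for the Facebook "likers" (real names unknown).
likers = ["Liker 1", "Liker 2", "Liker 3", "Liker 4",
          "Liker 5", "Liker 6", "Liker 7", "Liker 8"]

# Draw three distinct winners at random, mirroring the Excel approach
# of numbering the rows and using the random function to pick entries.
winners = random.sample(likers, k=3)
print(winners)
```

    random.sample draws without replacement, so the same liker cannot win twice.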
    Category: articles

    Tuesday, May 22, 2012

    I will admit: I am Pareto's principal devotee. The basic idea, that in numerous phenomena a "vital few" will account for the lion's share of the whole, seems to resonate with my own anecdotal and analytical experience. The principle is often referred to as the "80/20 rule" (as in "80% of one's sales come from 20% of one's customers"), and I find myself noticing Pareto-like results even when I am not really looking for them.

    The eponymous principle derives from an observation made by Vilfredo Pareto that 80% of the land in Italy (and other countries) was owned by 20% of the population. However, the name "Pareto principle" was in fact coined not by Pareto himself but by a fascinating U.S. engineer named Joseph Juran, whose story is itself worth retelling. Suffice it to say that Juran's contributions to management, based on his adaptation of Pareto's findings, are an often under-appreciated milestone in the field here.
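    As a toy illustration of the 80/20 rule (the per-customer sales figures below are invented for the purpose), one can check what share of total sales the top 20% of customers account for:

```python
# Invented per-customer sales, sorted in descending order.
sales = [550, 250, 60, 40, 30, 25, 20, 15, 7, 3]

# The top 20% of these ten customers is the top two.
top_fifth = sales[: len(sales) // 5]
share = sum(top_fifth) / sum(sales)
print(f"{share:.0%}")  # 80% of sales from 20% of customers
```

    Real data rarely lands on exactly 80/20, of course; the point is the heavy concentration at the top.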

    I would like to use the Pareto principle to consider a question that was asked on a recent "60 Second Tech" podcast produced by Scientific American magazine: how many people download only free apps, or spend very little (a dollar or so), on smartphone apps? Citing a study by ABI Research here, the podcast reported that 70% of all users of smartphone apps download only free or virtually free apps. This, more or less, suggests a Pareto-like division, whereby only 30% of smartphone users ever pay for apps that are not free or virtually so.

    This result resonates with a finding that I remember hearing about Twitter, whereby only 20%-30% of Twitter users account for the lion's share of tweets. Once again, Pareto seems to be popping up all over. But the ABI Research-inspired podcast reported additional data regarding the app-buying habits of smartphone users. According to the release that accompanied the report, it turns out that only 3% of app users account for nearly 20% of all expenditures on apps, while the average outlay by users who have at least once paid for an app (including this small number of hard-core users) is approximately $14.00 per month. Pushing further on these results, it turns out that the median monthly outlay for app users is only approximately $7.50, roughly half the average monthly expenditure of $14.00.

    Consistent with a Pareto-like view of the world, there is certainly nothing Gaussian about the discrepancy between the mean and median monthly outlay. In a word, app developers who hope to cash in on their creative efforts are relying on a very thin layer of app users. Moreover, there appears to be a clear distinction in the types of apps that attract "high-roller" users. Most of these are what is called "a utility app", most frequently for business purposes. A second category of apps that attracts paying customers is "iOS games monetized through strings of in-app purchases." At the other end, apps about sports and the like can expect little or no commercial traction.
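    The gap between mean and median is easy to reproduce with a skewed toy distribution. The spend figures below are invented, chosen only to land near the reported $14 mean and $7.50 median:

```python
import statistics

# Invented monthly app spend (in dollars) for ten paying users:
# a few heavy spenders drag the mean well above the median.
monthly_spend = [2, 3, 5, 6, 7, 8, 9, 12, 40, 48]

mean = statistics.mean(monthly_spend)
median = statistics.median(monthly_spend)
print(f"mean={mean:g}, median={median:g}")
```

    In a symmetric (Gaussian) distribution the two figures would coincide; here the two heavy spenders alone contribute more than the other eight combined.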

    The ABI Research report suggested two ways that app developers can improve their odds of enjoying even a semblance of commercial success:
    1. Try to make sure that your app either supports or is supported by a web component.
    2. Try to find a way to convince your customers that the app merits a long-term (by app standards) engagement.
    One way to do so is the time-honoured practice of first giving the app away for free, in the hope that you will develop a small yet highly devoted band of followers who will be willing to lay out monthly sums to ensure their continued access to the app and its updates or upgrades. I have to admit, this is all very depressing. My son is nearing completion of a Computer Science degree, and I wonder what I would advise him should he announce one day that he has decided to work on developing smartphone apps. If not quite "blood, sweat and tears", should I counsel him on the dismal likelihood of success? Or should I simply wish him the best on his journey and assure him that he will always have a roof over his head, if all else fails?

    More generally, I wonder whether the business model described above, in which the promise of substantial revenues for a very few coexists with the broader challenge of developing an app that others will actually use, regardless of whether it generates revenue, is sustainable in the longer run.

    More on the Pareto principle here.
    Category: articles

    Thursday, May 17, 2012

    OUP: a great opening
    for the right candidate
    Oxford University Press has a newly-created post for a good and suitably-qualified soul who will hold responsibility for setting up, managing and administering the acquisition and licensing of intellectual property rights within OUP's English Language Teaching (ELT) Division -- described as 
    "a global leader in the provision of multimedia English language teaching and learning materials. Operating in over 100 countries, it reaches millions of teachers and students each year ..."
    The job title is Intellectual Asset Manager.  For the record, since we're talking about a body which forms part of the great and perplexingly complex legal octopus which is Oxford University, the adjective "intellectual" applies to "asset" rather than "manager".

    A "broad understanding of copyright law, including experience of publishing agreements" is among the items listed in the job specification -- but there's nothing to say that a legal qualification is required, so this might be a perfect position for a gifted and enthusiastic amateur.

    For full details just click here.  Closing date for applications is 3 June 2012.
    Category: articles

    Wednesday, May 16, 2012

    The IP Finance weblog welcomes the latest guest post from Keith Mallinson (WiseHarbor) on a subject which he has really made his own: the subtle interplay of sometimes competing and sometimes congruent private and public interest in the shaping of the dynamics of the market for the licensing of technology in the information, communication and telecom sector.  This post touches on a topic that has been the subject of all-too-little discussion in IP circles.  IP rights help to establish the existence of winners in the marketplace -- but who gets to decide who those winners should be?
    The Folly of Picking Winners in ICT

    Government attempts to favour and promote certain business models, companies and technologies are justifiably criticised. The UK Cabinet Office's proposed policy to mandate the use of only pre-selected, royalty-free standards in public ICT procurement is similarly flawed. It will limit choice by foreclosing many popular open standards, numerous products which adhere to them, and companies who depend on upstream licensing revenues. The Open Standards Board responsible for implementing this policy will face significant governance challenges in ensuring impartiality in standards selection. In contrast, free-market processes allowing competition among a much wider array of open standards and software licences maximise customer choice across many different government departments, foster innovation, reduce lifecycle costs and enable obsolete or poorly performing standards to be superseded.
    Mandating particular standards and discriminating against or excluding royalty-based business models in government procurement constitutes hazardous industrial policy for the UK. The government is the largest UK ICT spender, with annual expenditures of approximately £18 billion in recent years. Under this policy, the direct and likely indirect consequences of such a large purchaser on the ICT marketplace, including explicitly or implicitly obliging citizens, as well as government suppliers of other goods and services, to adopt the same standards, would be significant.

    Dirigisme versus facilitation
    Governments have a history of making bad decisions in championing particular companies, technologies and business models. For example, the Inmos semiconductor company received £211 million from the UK government in the 1970s and 1980s with its strategy to produce commodity D-RAMs and develop its “transputer”, but the company foundered, did not become profitable after many years and was sold to SGS-Thomson in 1989. The UK is effectively nonexistent in semiconductor manufacturing today. The UK’s “fabless” semiconductor companies such as ARM, Picochip (acquired by Mindspeed Technologies in 2012) and Icera (acquired by NVIDIA in 2011) rely on partners including foreign “foundries” to fabricate their designs. 
    State monopoly France Telecom forced adoption of the Minitel videotext online service in the 1980s by withdrawing phone books and spending billions giving away the terminals to citizens. The associated technological standards and equipment manufacturers made minimal headway with Minitel technologies abroad and were eclipsed by the advance of the Internet in the 1990s. Minitel provided consumers with their first means of online access; however, views on the long-term benefits to French consumers are mixed. Resistance to replacing the entrenched home-grown standard made France a laggard in Internet adoption.

    In contrast, supporting entire industry sectors where a nation has strategic strength is more justifiable and attracts widespread support from various commentators. For example, clustering of complementary and competitive companies can be beneficial. In these circumstances, market forces spur competitive behaviour, including some Schumpeterian “creative destruction”, which helps eliminate the sclerosis and risks that come with monoculture. For example, Silicon Valley in California provides a fertile technical and commercial environment in which various business models and many ICT companies, standards and products have flourished while others have failed.

    Better for less

    A key stated objective of the proposed Cabinet Office policy is to level the “playing field” for open source and proprietary software. It is, therefore, perverse that standards based on Fair, Reasonable and Non-Discriminatory (FRAND) licensing and requiring patent fees should be the principal target for elimination under this policy. The policy will also automatically exclude many proprietary offerings that are based on those standards and cannot practically be adapted to other, royalty-free, standards. In many cases, such standards are widely implemented by many suppliers and are used by the vast majority of business customers and consumers.

    The Cabinet Office seeks to mandate specific royalty-free standards to achieve various objectives including cost reduction and avoiding vendor lock-in, as well as making ICT solutions fully interoperable. However, a report entitled Better for Less, published in 2010 by Liam Maxwell, now Deputy Government CIO and the proposed policy’s champion, identifies that most UK government ICT spending is with systems integration companies including HP/EDS, Fujitsu Services, Capgemini and IBM. The Government's over-reliance on large contractors for its IT needs, combined with a lack of in-house skills, is also a "recipe for rip-offs" according to a report by the Public Administration Select Committee (PASC) in July 2011. These suppliers are typically deeply embedded with long-term contracts that government finds difficult to unravel.

    Software represents only a relatively small playing field in comparison to others in ICT spending. According to Forrester Research figures, market segments where open source software competes or combines with proprietary software products represent just 12.4% of $2.5 trillion total global business and government ICT expenditures including operating system software (1.0%), non-custom-built applications (6.7%) and middleware (4.7%). In comparison, IT services (11.6%) and outsourcing (9.8%) combined represent 21.5% of spending. Computer equipment represents 13.9%. The $2.5 trillion total appears to exclude very significant costs for internal staffing.
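    As a quick sketch, the cited percentages can be cross-checked. The dollar base and segment shares below are those quoted above; treating them as exact figures is an assumption made purely for illustration:

```python
# Cross-check of the Forrester-derived shares quoted above.
# Figures are as cited in the article; treating them as exact is an
# assumption made here for illustration.
total_ict_spend = 2.5e12  # total global business/government ICT spend, USD

software_segments = {
    "operating system software": 0.010,
    "non-custom-built applications": 0.067,
    "middleware": 0.047,
}
software_share = sum(software_segments.values())  # 0.124, i.e. 12.4%

services_share = 0.116 + 0.098  # IT services + outsourcing; sums to 21.4%,
                                # quoted as 21.5% after rounding in the source

print(f"software segments: {software_share:.1%}"
      f" = ${total_ict_spend * software_share / 1e9:.0f}bn")
print(f"services + outsourcing: {services_share:.1%}")
```

    Even at just 12.4% of spend, the software segments amount to roughly $310 billion worldwide, while services and outsourcing together are nearly twice that share.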

    Software licensing costs are included even in modestly-priced PCs. The PASC report also indicated it was “ridiculous that some departments spend an average of £3,500 on a desktop PC”. A 2011 Cabinet Office press release stated it would “end poor value contracts such as those where Government departments and agencies paid between £350 and £2,000 for the same laptop”. The response by fullfact.org to a freedom of information request on government procurement shows that these prices actually represent totally different PC specifications. The proprietary operating system and office document software is identical in each case; the very large pricing disparity is accounted for by differences in microprocessors, displays, wireless modems and functionality such as fingerprint recognition.

    Uncertain scope, invalid distinctions

    The proposed policy states that standards selection will be limited to software interoperability, data and document formats. The scope of these terms is unclear, and in the next few years it will become even more difficult to separate standardisation in these domains from others in any meaningful way. The consultation’s terms of reference make the invalid assumption that software is distinct from hardware and that telecommunication is distinct from computing. Increasing technological convergence and other changes in ICT weigh against these assumptions. Smartphones and tablets are becoming the dominant computing platforms in our personal lives and at work, just as PCs overtook mainframe computers and revolutionised ICT usage from the 1980s. Communications is intrinsic to these new mobile devices and is increasingly integrated with most desktop PCs, including web- and cloud-based usage where demarcations between software, hardware and service are submerged.

    Video is becoming increasingly prevalent. According to long-standing Cisco CEO John Chambers, in a recent Bloomberg Businessweek article, “Every device, five years from now, will be video. That’s how you’ll communicate with your kids, with work.” Switching video standards is nothing like the peripheral task of simply replacing or adapting the mains plug on a TV set. Interoperability standards for video compression and encoding are highly complex algorithms that are deeply and extensively embedded in the workings of core hardware and software. Around one third of Internet traffic is streaming video, and video already exceeds 50% of mobile traffic. Virtually all of it conforms to FRAND-based standards requiring patent licensing, with AVC/H.264 (MPEG-4 Part 10) the most widely adopted.

    The customer is always right

    Standards requirements change with technological innovations and shifting user needs. It is very difficult for any centralised government administration to anticipate or react to the dynamics of ICT supply and demand. Competition among standards is highly beneficial. Market forces precipitate occasional revolutionary changes, with new standards displacing old ones (e.g. HTML substituting for videotext standards such as that used by Minitel), as well as continuous, incremental improvements to existing standards (e.g. HTML5 replacing previous versions of HTML). Changes in user preference and demand can be difficult to predict. For example, within a few years of the introduction of Apple’s iOS-based iPhone in 2007 and Google’s Android in 2008, former smartphone market leaders Nokia and RIM, each with its own operating system software, were completely up-ended. The highly innovative capabilities of the new software platforms and devices have succeeded because they are very different from, and much better than, what they replaced.

    Different government departments have diverse needs. Whereas interoperability among UK government departments is important, so is optimising interoperability and access by end users, commercial partners and international organisations. Defence requirements can preclude the most widespread propagation of interoperability and encryption standards. Maximising functionality, security and interoperability for patient records among health authorities will be compromised by imposing standards that are chosen to accommodate requirements in education.

    From a user’s perspective, functionality and interoperability with other users trump supply-side considerations such as the number of prospective ICT suppliers and lowest price.

    Upstream savings, downstream costs

    While open source software and royalty-free standards eliminate licensing fees, they do not ensure lower overall costs. On the contrary, there is significant evidence that open source is no cheaper than proprietary solutions once total ICT lifecycle costs, including project implementation and support, are counted. In many cases, total costs may even be lower with popular royalty-charging standards, owing to the technical efficiencies and large economies of scale that their widespread implementation brings. It is practically impossible to create some high-performance ICT standards without infringing any patents for which royalties might be demanded.

    Patent fees on popular FRAND-based standards are typically modest. Patent pool administrator MPEG LA licenses 2,339 patents it deems essential to H.264 from 29 licensors to 1,112 licensees for a maximum per unit rate of $0.20. This covers the vast majority of patents declared as essential to the standard. With around 6 billion mobile phones in service worldwide, aggregate royalties are low enough for GSM phones to be sold at price points down to less than $20. However, these fees significantly enable technology companies with upstream business models. They also allow vertically-integrated players to recoup some of their development costs from companies with downstream business models who make products but do not invest in developing the standards-based technologies. Eliminating the possibility of royalties merely forecloses upstream business models in favour of the downstream businesses, such as those that dominate government ICT spending, including hardware manufacturing, systems integration, technical support and outsourcing.
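    The scale of these fees can be sketched with the figures quoted above. The per-unit cap and device count are from the article; the assumption that every device pays the full cap, giving an upper bound, is mine:

```python
# Back-of-envelope upper bound on H.264 patent-pool royalties, using the
# per-unit cap and device count quoted above. Assuming every device pays
# the full cap is an illustrative worst case, not a market fact.
max_per_unit_fee = 0.20        # USD, MPEG LA H.264 maximum per-unit rate
devices = 6_000_000_000        # ~6 billion mobile phones in service

upper_bound_royalties = max_per_unit_fee * devices
cap_share_of_cheap_phone = max_per_unit_fee / 20.0  # versus a $20 handset

print(f"upper-bound aggregate: ${upper_bound_royalties / 1e9:.1f}bn")
print(f"cap as share of a $20 phone: {cap_share_of_cheap_phone:.0%}")
```

    Even on this worst-case arithmetic, the per-unit cap is about 1% of the cheapest handsets on the market, which is why such fees coexist with $20 phones.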

    Open and competitive ICT markets allow the widest range of business models and licensing practices, including royalty-free standards and open source software. There are many examples of open source software running on FRAND-based standards requiring royalty fees. For example, there are various proprietary and open source software codec implementations available for the H.264 video standard. It would be nonsense to bar this standard in favour of another that has only tiny adoption (the most fundamental barrier to interoperability among users) and inferior or unproven performance, including technical compliance and interoperability among implementations. In the case of video, for example, a replacement standard would most likely infringe some of the very same patents used by the successful standard it replaced, so there is a significant possibility that patent fees would still be required, however much one wishes them away. Developing a high-quality video codec standard is a formidable task drawing on a great deal of intellectual property; designing around the best, royalty-bearing technologies will result in inferior standards and implementations.

    There is generally no conflict between open source licensing and paying patent royalties to third parties. Where conflict does arise, it is of the licensors’ own making. The most stringent open source licences, such as GNU GPLv3 (under which “patents cannot be used to render the program non-free”), are seldom used because of such conflicts. In cases where a licence prohibits patent fees, the only legal solution is to write the software so that it does not infringe any IP that has not been specifically declared royalty free by its owner.

    Governance with selector selection

    The Open Standards Board responsible for implementing the policy will face significant governance challenges in ensuring impartiality in its members and the standards selection processes they oversee. It will be difficult to recruit board members who have the required competence in ICT standards, and who as individuals, employees, or academics, are completely free of any interests in the outcome of any standards selections. Members will be affected by their other interests in specific companies, standards groups and business models.

    International harmonisation and liberalisation

    The European Commission’s approved guidelines on the applicability of Article 101 of the Treaty on the Functioning of the European Union (TFEU) for horizontal co-operation agreements recognise the importance and value of standardisation agreements.
    “Standards which establish technical interoperability and compatibility often encourage competition on the merits between technologies from different companies and help prevent lock-in to one particular supplier.”
    These guidelines lay out a comprehensive approach for conformity of standardisation agreements with Article 101 TFEU, creating a “safe harbour” while affording standard-setting organisations significant autonomy in setting policies for disclosure of IP and its licensing terms. FRAND licensing, with and without payment of royalties, is explicitly recognised. The licensing policies of many international ICT standards-setting organisations, including IEEE, ETSI, ITU-T and CEN/CENELEC, are consistent with these guidelines and with the charging of patent fees on their standards. It would be a travesty to exclude their standards from government usage in the UK, even if this was only on the basis of attempting to do so for what the Cabinet Office delineates as software interoperability, data and document formats.